Science.gov

Sample records for automated sampling assessment

  1. Automated sampling assessment for molecular simulations using the effective sample size

    PubMed Central

    Zhang, Xin; Bhatt, Divesh; Zuckerman, Daniel M.

    2010-01-01

    To quantify the progress in the development of algorithms and forcefields used in molecular simulations, a general method for the assessment of sampling quality is needed. Statistical mechanics principles suggest that the populations of physical states characterize equilibrium sampling in a fundamental way. We therefore develop an approach for analyzing the variances in state populations, which quantifies the degree of sampling in terms of the effective sample size (ESS). The ESS estimates the number of statistically independent configurations contained in a simulated ensemble. The method is applicable to traditional dynamics simulations as well as more modern (e.g., multicanonical) approaches. Our procedure is tested in a variety of systems from toy models to atomistic protein simulations. We also introduce a simple automated procedure to obtain approximate physical states from dynamic trajectories: this allows sample-size estimation in systems for which physical states are not known in advance. PMID:21221418
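
    As an illustration of the population-variance idea described above, a minimal sketch follows; it assumes several independent repeat simulations are available and uses a simple binomial-variance approximation, which is illustrative rather than the authors' exact estimator.

```python
import numpy as np

def effective_sample_size(state_runs, state):
    """Estimate ESS from the run-to-run variance of one state's population.

    state_runs: list of 1-D integer arrays, each holding the state label
                observed at every frame of an independent simulation.
    state:      the state label whose population variance is analyzed.

    For N effectively independent samples, the observed population p of a
    state fluctuates with variance ~ p*(1-p)/N (binomial), so
    N_eff ~ p*(1-p) / var(p_hat across runs).
    """
    populations = np.array([np.mean(run == state) for run in state_runs])
    p = populations.mean()
    var = populations.var(ddof=1)
    return p * (1.0 - p) / var if var > 0 else float("inf")

# Toy usage: three "trajectories" over two states
rng = np.random.default_rng(0)
runs = [rng.integers(0, 2, size=5000) for _ in range(3)]
print(effective_sample_size(runs, state=1))
```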

  2. Automated Factor Slice Sampling.

    PubMed

    Tibbits, Matthew M; Groendyke, Chris; Haran, Murali; Liechty, John C

    2014-01-01

    Markov chain Monte Carlo (MCMC) algorithms offer a very general approach for sampling from arbitrary distributions. However, designing and tuning MCMC algorithms for each new distribution can be challenging and time-consuming. It is particularly difficult to create an efficient sampler when there is strong dependence among the variables in a multivariate distribution. We describe a two-pronged approach for constructing efficient, automated MCMC algorithms: (1) we propose the "factor slice sampler", a generalization of the univariate slice sampler where we treat the selection of a coordinate basis (factors) as an additional tuning parameter, and (2) we develop an approach for automatically selecting tuning parameters in order to construct an efficient factor slice sampler. In addition to automating the factor slice sampler, our tuning approach also applies to the standard univariate slice samplers. We demonstrate the efficiency and general applicability of our automated MCMC algorithm with a number of illustrative examples. PMID:24955002
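
    The factor slice sampler generalizes the standard univariate slice sampler; for reference, a minimal sketch of that univariate building block (the common stepping-out and shrinkage procedure) is shown below. The log-density, initial width w, and step cap are illustrative, and the authors' automated tuning of these parameters is not reproduced here.

```python
import random

def slice_sample(logp, x0, w=1.0, max_steps=100):
    """One univariate slice-sampling update (stepping-out + shrinkage)."""
    # Draw the auxiliary "height" defining the slice {x : logp(x) > log_y}.
    log_y = logp(x0) - random.expovariate(1.0)
    # Step out an interval [l, r] that brackets the slice.
    l = x0 - w * random.random()
    r = l + w
    steps = max_steps
    while steps > 0 and logp(l) > log_y:
        l -= w; steps -= 1
    steps = max_steps
    while steps > 0 and logp(r) > log_y:
        r += w; steps -= 1
    # Shrink the interval until a point inside the slice is drawn.
    while True:
        x1 = random.uniform(l, r)
        if logp(x1) > log_y:
            return x1
        if x1 < x0:
            l = x1
        else:
            r = x1

# Usage: draw from a standard normal density
logp = lambda x: -0.5 * x * x
x, draws = 0.0, []
for _ in range(1000):
    x = slice_sample(logp, x)
    draws.append(x)
```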

  3. Automated Factor Slice Sampling

    PubMed Central

    Tibbits, Matthew M.; Groendyke, Chris; Haran, Murali; Liechty, John C.

    2013-01-01

    Markov chain Monte Carlo (MCMC) algorithms offer a very general approach for sampling from arbitrary distributions. However, designing and tuning MCMC algorithms for each new distribution can be challenging and time-consuming. It is particularly difficult to create an efficient sampler when there is strong dependence among the variables in a multivariate distribution. We describe a two-pronged approach for constructing efficient, automated MCMC algorithms: (1) we propose the “factor slice sampler”, a generalization of the univariate slice sampler where we treat the selection of a coordinate basis (factors) as an additional tuning parameter, and (2) we develop an approach for automatically selecting tuning parameters in order to construct an efficient factor slice sampler. In addition to automating the factor slice sampler, our tuning approach also applies to the standard univariate slice samplers. We demonstrate the efficiency and general applicability of our automated MCMC algorithm with a number of illustrative examples. PMID:24955002

  4. Automated Geospatial Watershed Assessment

    EPA Science Inventory

    The Automated Geospatial Watershed Assessment (AGWA) tool is a Geographic Information Systems (GIS) interface jointly developed by the U.S. Environmental Protection Agency, the U.S. Department of Agriculture (USDA) Agricultural Research Service, and the University of Arizona to a...

  5. AUTOMATING GROUNDWATER SAMPLING AT HANFORD

    SciTech Connect

    CONNELL CW; HILDEBRAND RD; CONLEY SF; CUNNINGHAM DE

    2009-01-16

    Until this past October, Fluor Hanford managed Hanford's integrated groundwater program for the U.S. Department of Energy (DOE). With the new contract awards at the Site, however, the CH2M HILL Plateau Remediation Company (CHPRC) has assumed responsibility for the groundwater-monitoring programs at the 586-square-mile reservation in southeastern Washington State. These programs are regulated by the Resource Conservation and Recovery Act (RCRA) and the Comprehensive Environmental Response, Compensation, and Liability Act (CERCLA). The purpose of monitoring is to track existing groundwater contamination from past practices, as well as other potential contamination that might originate from RCRA treatment, storage, and disposal (TSD) facilities. An integral part of the groundwater-monitoring program involves taking samples of the groundwater and measuring the water levels in wells scattered across the site. More than 1,200 wells are sampled each year. Historically, field personnel or 'samplers' have been issued pre-printed forms containing information about the well(s) for a particular sampling evolution. This information is taken from the Hanford Well Information System (HWIS) and the Hanford Environmental Information System (HEIS)--official electronic databases. The samplers used these hardcopy forms to document the groundwater samples and well water-levels. After recording the entries in the field, the samplers turned the forms in at the end of the day and the collected information was posted onto a spreadsheet that was then printed and included in a log book. The log book was then used to make manual entries of the new information into the software application(s) for the HEIS and HWIS databases. This pilot project automates this tedious process by providing an electronic tool for recording water-level measurements and groundwater field-sampling activities. The automation will eliminate the manual forms and associated data entry, improve the accuracy of the

  6. Automated Grammatical Tagging of Child Language Samples.

    ERIC Educational Resources Information Center

    Channell, Ron W.; Johnson, Bonnie W.

    1999-01-01

    This study evaluated the accuracy of automated methods of grammatical categorization ("tagging") of transcribed conversational language samples from 30 normally developing children. On a word-by-word basis, automated accuracy levels averaged 95.1%; accuracy of tagging whole utterances averaged 77.7%. Results suggest that further improvement of…

  7. Technology modernization assessment flexible automation

    SciTech Connect

    Bennett, D.W.; Boyd, D.R.; Hansen, N.H.; Hansen, M.A.; Yount, J.A.

    1990-12-01

    The objectives of this report are: to present technology assessment guidelines to be considered in conjunction with defense regulations before an automation project is developed; to give examples showing how assessment guidelines may be applied to a current project; and to present several potential areas where automation might be applied successfully in the depot system. Depots perform primarily repair and remanufacturing operations, with limited small batch manufacturing runs. While certain activities (such as Management Information Systems and warehousing) are directly applicable to either environment, the majority of applications will require combining existing and emerging technologies in different ways to meet the special needs of the depot remanufacturing environment. Industry generally enjoys the ability to make revisions to its product lines seasonally, followed by batch runs of thousands or more. Depot batch runs are in the tens, at best the hundreds, of parts with a potential for large variation in product mix; reconfiguration may be required on a week-to-week basis. This need for a higher degree of flexibility suggests a higher level of operator interaction, and, in turn, control systems that go beyond the state of the art for less flexible automation and industry in general. This report investigates the benefits of and barriers to automation and concludes that, while significant benefits do exist for automation, depots must be prepared to carefully investigate the technical feasibility of each opportunity and the life-cycle costs associated with implementation. Implementation is suggested in two ways: (1) develop an implementation plan for automation technologies based on results of small demonstration automation projects; (2) use phased implementation for both these and later stage automation projects to allow major technical and administrative risk issues to be addressed. 10 refs., 2 figs., 2 tabs. (JF)

  8. High throughput sample processing and automated scoring.

    PubMed

    Brunborg, Gunnar; Jackson, Petra; Shaposhnikov, Sergey; Dahl, Hildegunn; Azqueta, Amaya; Collins, Andrew R; Gutzkow, Kristine B

    2014-01-01

    The comet assay is a sensitive and versatile method for assessing DNA damage in cells. In the traditional version of the assay, there are many manual steps involved and few samples can be treated in one experiment. High throughput (HT) modifications have been developed during recent years, and they are reviewed and discussed. These modifications include accelerated scoring of comets; other important elements that have been studied and adapted to HT are cultivation and manipulation of cells or tissues before and after exposure, and freezing of treated samples until comet analysis and scoring. HT methods save time and money but they are useful also for other reasons: large-scale experiments may be performed which are otherwise not practicable (e.g., analysis of many organs from exposed animals, and human biomonitoring studies), and automation gives more uniform sample treatment and less dependence on operator performance. The HT modifications now available vary largely in their versatility, capacity, complexity, and costs. The bottleneck for further increase of throughput appears to be the scoring. PMID:25389434

  9. High throughput sample processing and automated scoring

    PubMed Central

    Brunborg, Gunnar; Jackson, Petra; Shaposhnikov, Sergey; Dahl, Hildegunn; Azqueta, Amaya; Collins, Andrew R.; Gutzkow, Kristine B.

    2014-01-01

    The comet assay is a sensitive and versatile method for assessing DNA damage in cells. In the traditional version of the assay, there are many manual steps involved and few samples can be treated in one experiment. High throughput (HT) modifications have been developed during recent years, and they are reviewed and discussed. These modifications include accelerated scoring of comets; other important elements that have been studied and adapted to HT are cultivation and manipulation of cells or tissues before and after exposure, and freezing of treated samples until comet analysis and scoring. HT methods save time and money but they are useful also for other reasons: large-scale experiments may be performed which are otherwise not practicable (e.g., analysis of many organs from exposed animals, and human biomonitoring studies), and automation gives more uniform sample treatment and less dependence on operator performance. The HT modifications now available vary largely in their versatility, capacity, complexity, and costs. The bottleneck for further increase of throughput appears to be the scoring. PMID:25389434

  10. Automated storm water sampling on small watersheds

    USGS Publications Warehouse

    Harmel, R.D.; King, K.W.; Slade, R.M.

    2003-01-01

    Few guidelines are currently available to assist in designing appropriate automated storm water sampling strategies for small watersheds. Therefore, guidance is needed to develop strategies that achieve an appropriate balance between accurate characterization of storm water quality and loads and limitations of budget, equipment, and personnel. In this article, we explore the important sampling strategy components (minimum flow threshold, sampling interval, and discrete versus composite sampling) and project-specific considerations (sampling goal, sampling and analysis resources, and watershed characteristics) based on personal experiences and pertinent field and analytical studies. These components and considerations are important in achieving the balance between sampling goals and limitations because they determine how and when samples are taken and the potential sampling error. Several general recommendations are made, including: setting low minimum flow thresholds, using flow-interval or variable time-interval sampling, and using composite sampling to limit the number of samples collected. Guidelines are presented to aid in selection of an appropriate sampling strategy based on the user's project-specific considerations. Our experiences suggest these recommendations should allow implementation of a successful sampling strategy for most small watershed sampling projects with common sampling goals.
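
    As a concrete illustration of the flow-interval strategy recommended above, the sketch below triggers a sample aliquot each time a fixed runoff volume has passed, once flow exceeds a minimum threshold; the threshold, interval volume, and hydrograph are hypothetical values, not taken from the article.

```python
def flow_interval_triggers(times_min, flows_lps, min_flow=1.0, interval_m3=50.0):
    """Return the times at which an autosampler would pull an aliquot.

    times_min   : sample clock in minutes (uniform spacing assumed)
    flows_lps   : measured discharge in liters per second
    min_flow    : minimum flow threshold (L/s) below which no sampling occurs
    interval_m3 : runoff volume (m^3) between successive aliquots
    """
    triggers, accumulated = [], 0.0
    dt_s = (times_min[1] - times_min[0]) * 60.0
    for t, q in zip(times_min, flows_lps):
        if q < min_flow:
            continue
        accumulated += q * dt_s / 1000.0   # liters -> cubic meters
        if accumulated >= interval_m3:
            triggers.append(t)
            accumulated -= interval_m3
    return triggers

# Hypothetical 2-hour storm hydrograph at 1-minute resolution
times = list(range(120))
flows = [5 + 40 * (t / 60) * (2 - t / 60) for t in times]
print(flow_interval_triggers(times, flows))
```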

  11. Precise and automated microfluidic sample preparation.

    SciTech Connect

    Crocker, Robert W.; Patel, Kamlesh D.; Mosier, Bruce P.; Harnett, Cindy K.

    2004-07-01

    Autonomous bio-chemical agent detectors require sample preparation involving multiplex fluid control. We have developed a portable microfluidic pump array for metering sub-microliter volumes at flow rates of 1-100 µL/min. Each pump is composed of an electrokinetic (EK) pump and high-voltage power supply with 15-Hz feedback from flow sensors. The combination of high pump fluid impedance and active control results in precise fluid metering with nanoliter accuracy. Automated sample preparation will be demonstrated by labeling proteins with fluorescamine and subsequent injection to a capillary gel electrophoresis (CGE) chip.
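
    A rough sketch of the kind of closed-loop metering described above, assuming a proportional-integral controller that updates the pump voltage at the 15-Hz flow-sensor rate; the first-order pump model, gains, and setpoint are invented for illustration and do not represent the actual hardware.

```python
class ToyPump:
    """Hypothetical first-order EK pump: flow relaxes toward gain * voltage."""
    def __init__(self, gain_ul_per_volt=0.02):
        self.gain, self.flow, self.voltage = gain_ul_per_volt, 0.0, 0.0
    def read_flow(self):
        self.flow += 0.3 * (self.gain * self.voltage - self.flow)
        return self.flow
    def set_voltage(self, volts):
        self.voltage = max(0.0, min(volts, 5000.0))  # clamp to HV supply range

def run_pi_loop(pump, setpoint_ul_min, kp=100.0, ki=40.0, rate_hz=15.0, steps=400):
    """Simple PI controller updating pump voltage at the feedback rate."""
    dt, integral = 1.0 / rate_hz, 0.0
    for _ in range(steps):
        error = setpoint_ul_min - pump.read_flow()
        integral += error * dt
        pump.set_voltage(kp * error + ki * integral)

pump = ToyPump()
run_pi_loop(pump, setpoint_ul_min=20.0)
print(round(pump.read_flow(), 1))  # settles near the 20 uL/min setpoint
```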

  12. National Sample Assessment Protocols

    ERIC Educational Resources Information Center

    Ministerial Council on Education, Employment, Training and Youth Affairs (NJ1), 2012

    2012-01-01

    These protocols represent a working guide for planning and implementing national sample assessments in connection with the national Key Performance Measures (KPMs). The protocols are intended for agencies involved in planning or conducting national sample assessments and personnel responsible for administering associated tenders or contracts,…

  13. Automated blood sampling systems for positron emission tomography

    SciTech Connect

    Eriksson, L.; Holte, S.; Bohm, C.; Kesselberg, M.; Hovander, B.

    1988-02-01

    An automated blood sampling system has been constructed and evaluated. Two different detector units in the blood sampling system are compared. Results from studies of blood-brain barrier transfer of a C-11 labelled receptor antagonist will be discussed.

  14. Automated Assessment in a Programming Tools Course

    ERIC Educational Resources Information Center

    Fernandez Aleman, J. L.

    2011-01-01

    Automated assessment systems can be useful for both students and instructors. Ranking and immediate feedback can have a strongly positive effect on student learning. This paper presents an experience using automatic assessment in a programming tools course. The proposal aims at extending the traditional use of an online judging system with a…

  15. Optimized Heart Sampling and Systematic Evaluation of Cardiac Therapies in Mouse Models of Ischemic Injury: Assessment of Cardiac Remodeling and Semi-Automated Quantification of Myocardial Infarct Size.

    PubMed

    Valente, Mariana; Araújo, Ana; Esteves, Tiago; Laundos, Tiago L; Freire, Ana G; Quelhas, Pedro; Pinto-do-Ó, Perpétua; Nascimento, Diana S

    2015-01-01

    Cardiac therapies are commonly tested preclinically in small-animal models of myocardial infarction. Following functional evaluation, post-mortem histological analysis is essential to assess morphological and molecular alterations underlying the effectiveness of treatment. However, non-methodical and inadequate sampling of the left ventricle often leads to misinterpretations and variability, making direct study comparisons unreliable. Protocols are provided for representative sampling of the ischemic mouse heart followed by morphometric analysis of the left ventricle. Extending the use of this sampling to other types of in situ analysis is also illustrated through the assessment of neovascularization and cellular engraftment in a cell-based therapy setting. This is of interest to the general cardiovascular research community as it details methods for standardization and simplification of histo-morphometric evaluation of emergent heart therapies. © 2015 by John Wiley & Sons, Inc. PMID:26629776

  16. AGWA: The Automated Geospatial Watershed Assessment Tool

    EPA Science Inventory

    The Automated Geospatial Watershed Assessment Tool (AGWA, see: www.tucson.ars.ag.gov/agwa or http://www.epa.gov/esd/land-sci/agwa/) is a GIS interface jointly developed by the USDA-Agricultural Research Service, the U.S. Environmental Protection Agency, the University of Arizona...

  17. Constructing Aligned Assessments Using Automated Test Construction

    ERIC Educational Resources Information Center

    Porter, Andrew; Polikoff, Morgan S.; Barghaus, Katherine M.; Yang, Rui

    2013-01-01

    We describe an innovative automated test construction algorithm for building aligned achievement tests. By incorporating the algorithm into the test construction process, along with other test construction procedures for building reliable and unbiased assessments, the result is much more valid tests than result from current test construction…

  18. Automated Geospatial Watershed Assessment Tool (AGWA)

    Technology Transfer Automated Retrieval System (TEKTRAN)

    The Automated Geospatial Watershed Assessment tool (AGWA, see: www.tucson.ars.ag.gov/agwa or http://www.epa.gov/esd/land-sci/agwa/) is a GIS interface jointly developed by the USDA Agricultural Research Service, the U.S. Environmental Protection Agency, the University of Arizona, and the University ...

  19. Automated Bone Age Assessment: Motivation, Taxonomies, and Challenges

    PubMed Central

    Ismail, Maizatul Akmar; Herawan, Tutut; Gopal Raj, Ram; Abdul Kareem, Sameem; Nasaruddin, Fariza Hanum

    2013-01-01

    Bone age assessment (BAA) of unknown people is one of the most important topics in clinical procedures for evaluating the biological maturity of children. BAA is usually performed by comparing an X-ray of the left hand and wrist with an atlas of known sample bones. Recently, BAA has gained remarkable ground in academia and medicine. Manual methods of BAA are time-consuming and prone to observer variability. This is a motivation for developing automated methods of BAA. However, while there is considerable research on automated assessment, much of it is still in the experimental stage. This survey provides a taxonomy of automated BAA approaches and discusses the challenges. Finally, we present suggestions for future research. PMID:24454534

  20. Automated collection and processing of environmental samples

    DOEpatents

    Troyer, Gary L.; McNeece, Susan G.; Brayton, Darryl D.; Panesar, Amardip K.

    1997-01-01

    For monitoring an environmental parameter such as the level of nuclear radiation, at distributed sites, bar coded sample collectors are deployed and their codes are read using a portable data entry unit that also records the time of deployment. The time and collector identity are cross referenced in memory in the portable unit. Similarly, when later recovering the collector for testing, the code is again read and the time of collection is stored as indexed to the sample collector, or to a further bar code, for example as provided on a container for the sample. The identity of the operator can also be encoded and stored. After deploying and/or recovering the sample collectors, the data is transmitted to a base processor. The samples are tested, preferably using a test unit coupled to the base processor, and again the time is recorded. The base processor computes the level of radiation at the site during exposure of the sample collector, using the detected radiation level of the sample, the delay between recovery and testing, the duration of exposure and the half life of the isotopes collected. In one embodiment, an identity code and a site code are optically read by an image grabber coupled to the portable data entry unit.
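
    A minimal sketch of the decay-correction arithmetic described above (measured sample activity, recovery-to-test delay, exposure duration, and isotope half-life), assuming collection at a constant rate with simultaneous decay; the buildup model and the numbers used are illustrative, not the patented algorithm.

```python
import math

def site_exposure_rate(measured_activity_bq, delay_h, exposure_h, half_life_h):
    """Back out the average collection rate (proportional to the site's
    radiation level) from the activity measured after recovery.

    1. Decay-correct the measured activity to the moment of recovery.
    2. Invert the buildup equation for collection at a constant rate R
       with simultaneous decay: A(recovery) = (R / lam) * (1 - exp(-lam * T)).
    """
    lam = math.log(2.0) / half_life_h
    activity_at_recovery = measured_activity_bq * math.exp(lam * delay_h)
    rate = activity_at_recovery * lam / (1.0 - math.exp(-lam * exposure_h))
    return rate  # Bq collected per hour, proportional to the ambient level

# Example: 50 Bq measured 12 h after recovery, 168 h deployment, 8 h half-life
print(round(site_exposure_rate(50.0, delay_h=12.0, exposure_h=168.0, half_life_h=8.0), 2))
```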

  1. Rapid Automated Sample Preparation for Biological Assays

    SciTech Connect

    Shusteff, M

    2011-03-04

    Our technology utilizes acoustic, thermal, and electric fields to separate out contaminants such as debris or pollen from environmental samples, lyse open cells, and extract the DNA from the lysate. The objective of the project is to optimize the system described for a forensic sample, and demonstrate its performance for integration with downstream assay platforms (e.g. MIT-LL's ANDE). We intend to increase the quantity of DNA recovered from the sample beyond the current ~80% achieved using solid phase extraction methods. Task 1: Develop and test an acoustic filter for cell extraction. Task 2: Develop and test lysis chip. Task 3: Develop and test DNA extraction chip. All chips have been fabricated based on the designs laid out in last month's report.

  2. Enhanced training effectiveness using automated student assessment.

    SciTech Connect

    Forsythe, James Chris

    2010-05-01

    Training simulators have become increasingly popular tools for instructing humans on performance in complex environments. However, the question of how to provide individualized and scenario-specific assessment and feedback to students remains largely an open question. In this work, we follow-up on previous evaluations of the Automated Expert Modeling and Automated Student Evaluation (AEMASE) system, which automatically assesses student performance based on observed examples of good and bad performance in a given domain. The current study provides an empirical evaluation of the enhanced training effectiveness achievable with this technology. In particular, we found that students given feedback via the AEMASE-based debrief tool performed significantly better than students given only instructor feedback.

  3. Automated Sample collection and Analysis unit

    SciTech Connect

    Latner, Norman; Sanderson, Colin G.; Negro, Vincent C.

    1999-03-31

    Autoramp is an atmospheric radionuclide collection and analysis unit designed for unattended operation. A large volume of air passes through one of 31 filter cartridges which is then moved from a sampling chamber and past a bar code reader, to a shielded enclosure. The collected dust-borne radionuclides are counted with a high resolution germanium gamma-ray detector. An analysis is made and the results are transmitted to a central station that can also remotely control the unit.

  4. Automated biowaste sampling system feces monitoring system

    NASA Technical Reports Server (NTRS)

    Hunt, S. R.; Glanfield, E. J.

    1979-01-01

    The Feces Monitoring System (FMS) Program designed, fabricated, assembled and tested an engineering model waste collector system (WCS) to be used in support of life science and medical experiments related to Shuttle missions. The FMS design was patterned closely after the Shuttle WCS, including: interface provisions; mounting; configuration; and operating procedures. These similarities make it possible to eventually substitute an FMS for the Shuttle WCS of Orbiter. In addition, several advanced waste collection features, including the capability of real-time inertial fecal separation and fecal mass measurement and sampling were incorporated into the FMS design.

  5. Automated data quality assessment of marine sensors.

    PubMed

    Timms, Greg P; de Souza, Paulo A; Reznik, Leon; Smith, Daniel V

    2011-01-01

    The automated collection of data (e.g., through sensor networks) has led to a massive increase in the quantity of environmental and other data available. The sheer quantity of data and growing need for real-time ingestion of sensor data (e.g., alerts and forecasts from physical models) means that automated Quality Assurance/Quality Control (QA/QC) is necessary to ensure that the data collected is fit for purpose. Current automated QA/QC approaches provide assessments based upon hard classifications of the gathered data; often as a binary decision of good or bad data that fails to quantify our confidence in the data for use in different applications. We propose a novel framework for automated data quality assessments that uses Fuzzy Logic to provide a continuous scale of data quality. This continuous quality scale is then used to compute error bars upon the data, which quantify the data uncertainty and provide a more meaningful measure of the data's fitness for purpose in a particular application compared with hard quality classifications. The design principles of the framework are presented and enable both data statistics and expert knowledge to be incorporated into the uncertainty assessment. We have implemented and tested the framework upon a real time platform of temperature and conductivity sensors that have been deployed to monitor the Derwent Estuary in Hobart, Australia. Results indicate that the error bars generated from the Fuzzy QA/QC implementation are in good agreement with the error bars manually encoded by a domain expert. PMID:22163714
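
    A minimal sketch of the continuous-quality idea, assuming a single fuzzy "in-range" membership test and a simple mapping from the quality score to an inflated error bar; the expected range, membership shape, and inflation rule are placeholders rather than the published framework.

```python
def quality_score(value, low, high, tolerance):
    """Fuzzy 'in-range' membership: 1 inside [low, high], falling linearly
    to 0 once the reading is more than `tolerance` outside the range."""
    if low <= value <= high:
        return 1.0
    distance = (low - value) if value < low else (value - high)
    return max(0.0, 1.0 - distance / tolerance)

def error_bar(value, low, high, tolerance, base_uncertainty):
    """Widen the reported uncertainty as the quality score drops."""
    q = quality_score(value, low, high, tolerance)
    return base_uncertainty / max(q, 0.05)   # cap the inflation at 20x

# Example: estuary temperature expected between 8 and 22 degC
for t in (15.0, 23.0, 30.0):
    print(t, round(quality_score(t, 8, 22, 10), 2), round(error_bar(t, 8, 22, 10, 0.1), 2))
```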

  6. Automated Data Quality Assessment of Marine Sensors

    PubMed Central

    Timms, Greg P.; de Souza, Paulo A.; Reznik, Leon; Smith, Daniel V.

    2011-01-01

    The automated collection of data (e.g., through sensor networks) has led to a massive increase in the quantity of environmental and other data available. The sheer quantity of data and growing need for real-time ingestion of sensor data (e.g., alerts and forecasts from physical models) means that automated Quality Assurance/Quality Control (QA/QC) is necessary to ensure that the data collected is fit for purpose. Current automated QA/QC approaches provide assessments based upon hard classifications of the gathered data; often as a binary decision of good or bad data that fails to quantify our confidence in the data for use in different applications. We propose a novel framework for automated data quality assessments that uses Fuzzy Logic to provide a continuous scale of data quality. This continuous quality scale is then used to compute error bars upon the data, which quantify the data uncertainty and provide a more meaningful measure of the data’s fitness for purpose in a particular application compared with hard quality classifications. The design principles of the framework are presented and enable both data statistics and expert knowledge to be incorporated into the uncertainty assessment. We have implemented and tested the framework upon a real time platform of temperature and conductivity sensors that have been deployed to monitor the Derwent Estuary in Hobart, Australia. Results indicate that the error bars generated from the Fuzzy QA/QC implementation are in good agreement with the error bars manually encoded by a domain expert. PMID:22163714

  7. Investigating Factors Affecting the Uptake of Automated Assessment Technology

    ERIC Educational Resources Information Center

    Dreher, Carl; Reiners, Torsten; Dreher, Heinz

    2011-01-01

    Automated assessment is an emerging innovation in educational praxis, however its pedagogical potential is not fully utilised in Australia, particularly regarding automated essay grading. The rationale for this research is that the usage of automated assessment currently lags behind the capacity that the technology provides, thus restricting the…

  8. Non-Contact Conductivity Measurement for Automated Sample Processing Systems

    NASA Technical Reports Server (NTRS)

    Beegle, Luther W.; Kirby, James P.

    2012-01-01

    A new method has been developed for monitoring and control of automated sample processing and preparation, especially focusing on desalting of samples before analytical analysis (described in more detail in Automated Desalting Apparatus (NPO-45428), NASA Tech Briefs, Vol. 34, No. 8 (August 2010), page 44). The use of non-contact conductivity probes, one at the inlet and one at the outlet of the solid phase sample preparation media, allows monitoring of the process, and acts as a trigger for the start of the next step in the sequence (see figure). At each step of the multi-step process, the system is flushed with low-conductivity water, which sets the system back to an overall low-conductivity state. This measurement then triggers the next stage of sample processing protocols, and greatly minimizes use of consumables. In the case of amino acid sample preparation for desalting, the conductivity measurement will define three key conditions for the sample preparation process. First, when the system is neutralized (low conductivity, by washing with excess de-ionized water); second, when the system is acidified, by washing with a strong acid (high conductivity); and third, when the system is at a basic condition of high pH (high conductivity). Taken together, this non-contact conductivity measurement for monitoring sample preparation will not only facilitate automation of the sample preparation and processing, but will also act as a way to optimize the operational time and use of consumables.

  9. Automated Power Assessment for Helicopter Turboshaft Engines

    NASA Technical Reports Server (NTRS)

    Simon, Donald L.; Litt, Jonathan S.

    2008-01-01

    An accurate indication of available power is required for helicopter mission planning purposes. Available power is currently estimated on U.S. Army Blackhawk helicopters by performing a Maximum Power Check (MPC), a manual procedure performed by maintenance pilots on a periodic basis. The MPC establishes Engine Torque Factor (ETF), an indication of available power. It is desirable to replace the current manual MPC procedure with an automated approach that will enable continuous real-time assessment of available power utilizing normal mission data. This report presents an automated power assessment approach which processes data currently collected within helicopter Health and Usage Monitoring System (HUMS) units. The overall approach consists of: 1) a steady-state data filter which identifies and extracts steady-state operating points within HUMS data sets; 2) engine performance curve trend monitoring and updating; and 3) automated ETF calculation. The algorithm is coded in MATLAB (The MathWorks, Inc.) and currently runs on a PC. Results from the application of this technique to HUMS mission data collected from UH-60L aircraft equipped with T700-GE-701C engines are presented and compared to manually calculated ETF values. Potential future enhancements are discussed.
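
    A minimal sketch of the first step in the approach above, a steady-state data filter, assuming steadiness is declared when the rolling standard deviation of a monitored signal stays below a threshold; the window length, threshold, and torque trace are illustrative, not the HUMS implementation.

```python
import numpy as np

def steady_state_mask(signal, window=30, threshold=0.5):
    """Flag samples whose trailing `window` points have std below `threshold`."""
    x = np.asarray(signal, dtype=float)
    mask = np.zeros(len(x), dtype=bool)
    for i in range(window, len(x)):
        mask[i] = x[i - window:i].std() < threshold
    return mask

# Hypothetical torque trace: a transient ramp followed by a steady hold
rng = np.random.default_rng(1)
torque = np.concatenate([np.linspace(20, 80, 120),
                         80 + 0.2 * rng.standard_normal(300)])
mask = steady_state_mask(torque)
print(mask.sum(), "of", len(torque), "points flagged as steady state")
```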

  10. Detection of carryover in automated milk sampling equipment.

    PubMed

    Løvendahl, P; Bjerring, M A

    2006-09-01

    Equipment for sampling milk in automated milking systems may cause carryover problems if residues from one sample remain and are mixed with the subsequent sample. The degree of carryover can be estimated statistically by linear regression models. This study applied various regression analyses to several real and simulated data sets. The statistical power for detecting carryover milk improved considerably when information about cow identity was included and a mixed model was applied. Carryover may affect variation between animals, including genetic variation, and thereby have an impact on management decisions and diagnostic tools based on the milk content of somatic cells. An extended procedure is needed for approval of sampling equipment for automated milking with acceptable latitudes of carryover, and this could include the regression approach taken in this study. PMID:16899700
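
    A minimal sketch of the regression idea, assuming each observed value mixes the current sample with a carryover fraction of the preceding one; an ordinary least-squares fit on simulated reference samples is shown rather than the mixed model with cow identity used in the study.

```python
import numpy as np

rng = np.random.default_rng(2)
true_carryover = 0.08

# Known reference concentrations run through the sampler in sequence
true = rng.uniform(50, 1000, size=200)
observed = np.empty_like(true)
observed[0] = true[0]
for i in range(1, len(true)):
    observed[i] = (1 - true_carryover) * true[i] + true_carryover * observed[i - 1]
observed += rng.normal(0, 5, size=len(true))

# Regress each observation on its own reference value and the previous observation
X = np.column_stack([true[1:], observed[:-1]])
coef, *_ = np.linalg.lstsq(X, observed[1:], rcond=None)
print("estimated carryover fraction:", round(coef[1], 3))  # should be near 0.08
```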

  11. Modular Automated Processing System (MAPS) for analysis of biological samples.

    SciTech Connect

    Gil, Geun-Cheol; Chirica, Gabriela S.; Fruetel, Julia A.; VanderNoot, Victoria A.; Branda, Steven S.; Schoeniger, Joseph S.; Throckmorton, Daniel J.; Brennan, James S.; Renzi, Ronald F.

    2010-10-01

    We have developed a novel modular automated processing system (MAPS) that enables reliable, high-throughput analysis as well as sample-customized processing. This system is comprised of a set of independent modules that carry out individual sample processing functions: cell lysis, protein concentration (based on hydrophobic, ion-exchange and affinity interactions), interferent depletion, buffer exchange, and enzymatic digestion of proteins of interest. Taking advantage of its unique capacity for enclosed processing of intact bioparticulates (viruses, spores) and complex serum samples, we have used MAPS for analysis of BSL1 and BSL2 samples to identify specific protein markers through integration with the portable microChemLab{trademark} and MALDI.

  12. Automated Imaging Techniques for Biosignature Detection in Geologic Samples

    NASA Astrophysics Data System (ADS)

    Williford, K. H.

    2015-12-01

    Robust biosignature detection in geologic samples typically requires the integration of morphological/textural data with biogeochemical data across a variety of scales. We present new automated imaging and coordinated biogeochemical analysis techniques developed at the JPL Astrobiogeochemistry Laboratory (abcLab) in support of biosignature detection in terrestrial samples as well as those that may eventually be returned from Mars. Automated gigapixel mosaic imaging of petrographic thin sections in transmitted and incident light (including UV epifluorescence) is supported by a microscopy platform with a digital XYZ stage. Images are acquired, processed, and co-registered using multiple software platforms at JPL and can be displayed and shared using Gigapan, a freely available, web-based toolset. Automated large-area (cm-scale) elemental mapping at sub-micrometer spatial resolution is enabled by a variable pressure scanning electron microscope (SEM) with a large (150 mm2) silicon drift energy dispersive spectroscopy (EDS) detector system. The abcLab light and electron microscopy techniques are augmented by additional elemental chemistry, mineralogy, and organic detection/classification using laboratory Micro-XRF and UV Raman/fluorescence systems, precursors to the PIXL and SHERLOC instrument platforms selected for flight on the NASA Mars 2020 rover mission. A workflow will be discussed that combines careful sample preparation with iterative gigapixel imaging, SEM/EDS, Micro-XRF, and UV fluorescence/Raman for organic, mineralogic, and elemental biosignature target identification, followed by analysis with other techniques including secondary ion mass spectrometry (SIMS).

  13. Experiences of Using Automated Assessment in Computer Science Courses

    ERIC Educational Resources Information Center

    English, John; English, Tammy

    2015-01-01

    In this paper we discuss the use of automated assessment in a variety of computer science courses that have been taught at Israel Academic College by the authors. The course assignments were assessed entirely automatically using Checkpoint, a web-based automated assessment framework. The assignments all used free-text questions (where the students…

  14. High-throughput sample processing and sample management; the functional evolution of classical cytogenetic assay towards automation.

    PubMed

    Ramakumar, Adarsh; Subramanian, Uma; Prasanna, Pataje G S

    2015-11-01

    High-throughput individual diagnostic dose assessment is essential for medical management of radiation-exposed subjects after a mass casualty. Cytogenetic assays such as the Dicentric Chromosome Assay (DCA) are recognized as the gold standard by international regulatory authorities. DCA is a multi-step and multi-day bioassay. DCA, as described in the IAEA manual, can be used to assess dose up to 4-6 weeks post-exposure quite accurately, but throughput is still a major issue and automation is essential. The throughput is limited both in terms of sample preparation and analysis of chromosome aberrations. Thus, there is a need to design and develop novel solutions that could utilize extensive laboratory automation for sample preparation, and bioinformatics approaches for chromosome-aberration analysis to overcome throughput issues. We have transitioned the bench-based cytogenetic DCA to a coherent process performing high-throughput automated biodosimetry for individual dose assessment, ensuring quality control (QC) and quality assurance (QA) aspects in accordance with international harmonized protocols. A Laboratory Information Management System (LIMS) is designed, implemented and adapted to manage increased sample processing capacity, develop and maintain standard operating procedures (SOP) for robotic instruments, avoid data transcription errors during processing, and automate analysis of chromosome aberrations using an image analysis platform. Our efforts described in this paper intend to bridge the current technological gaps and enhance the potential application of DCA for a dose-based stratification of subjects following a mass casualty. This paper describes one such potential integrated automated laboratory system and the functional evolution of the classical DCA towards increasing critically needed throughput. PMID:26520383

  15. CRITICAL ASSESSMENT OF AUTOMATED FLOW CYTOMETRY DATA ANALYSIS TECHNIQUES

    PubMed Central

    Aghaeepour, Nima; Finak, Greg; Hoos, Holger; Mosmann, Tim R.; Gottardo, Raphael; Brinkman, Ryan; Scheuermann, Richard H.

    2013-01-01

    Traditional methods for flow cytometry (FCM) data processing rely on subjective manual gating. Recently, several groups have developed computational methods for identifying cell populations in multidimensional FCM data. The Flow Cytometry: Critical Assessment of Population Identification Methods (FlowCAP) challenges were established to compare the performance of these methods on two tasks – mammalian cell population identification to determine if automated algorithms can reproduce expert manual gating, and sample classification to determine if analysis pipelines can identify characteristics that correlate with external variables (e.g., clinical outcome). This analysis presents the results of the first of these challenges. Several methods performed well compared to manual gating or external variables using statistical performance measures, suggesting that automated methods have reached a sufficient level of maturity and accuracy for reliable use in FCM data analysis. PMID:23396282

  16. Critical assessment of automated flow cytometry data analysis techniques.

    PubMed

    Aghaeepour, Nima; Finak, Greg; Hoos, Holger; Mosmann, Tim R; Brinkman, Ryan; Gottardo, Raphael; Scheuermann, Richard H

    2013-03-01

    Traditional methods for flow cytometry (FCM) data processing rely on subjective manual gating. Recently, several groups have developed computational methods for identifying cell populations in multidimensional FCM data. The Flow Cytometry: Critical Assessment of Population Identification Methods (FlowCAP) challenges were established to compare the performance of these methods on two tasks: (i) mammalian cell population identification, to determine whether automated algorithms can reproduce expert manual gating and (ii) sample classification, to determine whether analysis pipelines can identify characteristics that correlate with external variables (such as clinical outcome). This analysis presents the results of the first FlowCAP challenges. Several methods performed well as compared to manual gating or external variables using statistical performance measures, which suggests that automated methods have reached a sufficient level of maturity and accuracy for reliable use in FCM data analysis. PMID:23396282

  17. AUTOMATING GROUNDWATER SAMPLING AT HANFORD THE NEXT STEP

    SciTech Connect

    CONNELL CW; CONLEY SF; HILDEBRAND RD; CUNNINGHAM DE; R_D_Doug_Hildebrand@rl.gov; DeVon_E_Cunningham@rl.gov

    2010-01-21

    Historically, the groundwater monitoring activities at the Department of Energy's Hanford Site in southeastern Washington State have been very "people intensive." Approximately 1500 wells are sampled each year by field personnel or "samplers." These individuals have been issued pre-printed forms showing information about the well(s) for a particular sampling evolution. This information is taken from 2 official electronic databases: the Hanford Well Information System (HWIS) and the Hanford Environmental Information System (HEIS). The samplers used these hardcopy forms to document the groundwater samples and well water-levels. After recording the entries in the field, the samplers turned the forms in at the end of the day and other personnel posted the collected information onto a spreadsheet that was then printed and included in a log book. The log book was then used to make manual entries of the new information into the software application(s) for the HEIS and HWIS databases. A pilot project for automating this extremely tedious process was launched in 2008. Initially, the automation was focused on water-level measurements. Now, the effort is being extended to automate the meta-data associated with collecting groundwater samples. The project allowed electronic forms produced in the field by samplers to be used in a work flow process where the data is transferred to the database and the electronic form is filed in managed records - thus eliminating manually completed forms. Eliminating the manual forms and streamlining the data entry not only improved the accuracy of the information recorded, but also enhanced the efficiency and sampling capacity of field office personnel.

  18. DNA separations in microfabricated devices with automated capillary sample introduction.

    PubMed

    Smith, E M; Xu, H; Ewing, A G

    2001-01-01

    A novel method is presented for automated injection of DNA samples into microfabricated separation devices via capillary electrophoresis. A single capillary is used to electrokinetically inject discrete plugs of DNA into an array of separation lanes on a glass chip. A computer-controlled micromanipulator is used to automate this injection process and to repeat injections into five parallel lanes several times over the course of the experiment. After separation, labeled DNA samples are detected by laser-induced fluorescence. Five serial separations of 6-carboxyfluorescein (FAM)-labeled oligonucleotides in five parallel lanes are shown, resulting in the analysis of 25 samples in 25 min. It is estimated that approximately 550 separations of these same oligonucleotides could be performed in one hour by increasing the number of lanes to 37 and optimizing the rate of the manipulator movement. Capillary sample introduction into chips allows parallel separations to be continuously performed in serial, yielding high throughput and minimal need for operator intervention. PMID:11288906

  19. Automated assessment of mobility in bedridden patients.

    PubMed

    Bennett, Stephanie; Goubran, Rafik; Rockwood, Kenneth; Knoefel, Frank

    2013-01-01

    Immobility in older patients is a costly problem for both patients and healthcare workers. The Hierarchical Assessment of Balance and Mobility (HABAM) is a clinical tool able to assess immobile patients and predict morbidity, yet could become more reliable and informative through automation. This paper proposes an algorithm to automatically determine which of three enacted HABAM scores (associated with bedridden patients) had been performed by volunteers. A laptop was used to gather pressure data from three mats placed on a standard hospital bed frame while five volunteers performed three enactments each. A system of algorithms was created, consisting of three subsystems. The first subsystem used mattress data to calculate individual sensor sums and eliminate the weight of the mattress. The second subsystem established a baseline pressure reading for each volunteer and used percentage change to identify and distinguish between two enactments. The third subsystem used calculated weight distribution ratios to determine if the data represented the remaining enactment. The system was tested for accuracy by inputting the volunteer data and recording the assessment output (a score per data set). The system identified 13 of 15 sets of volunteer data as expected. Examination of these results indicated that the two sets of data were not misidentified; rather, the volunteers had made mistakes in performance. These results suggest that this system of algorithms is effective in distinguishing between the three HABAM score enactments examined here, and emphasizes the potential for pervasive computing to improve traditional healthcare. PMID:24110676

  20. Automated Formative Feedback and Summative Assessment Using Individualised Spreadsheet Assignments

    ERIC Educational Resources Information Center

    Blayney, Paul; Freeman, Mark

    2004-01-01

    This paper reports on the effects of automating formative feedback at the student's discretion and automating summative assessment with individualised spreadsheet assignments. Quality learning outcomes are achieved when students adopt deep approaches to learning (Ramsden, 2003). Learning environments designed to align assessment to learning…

  1. Rapid Quantification of Hepatitis B Virus DNA by Automated Sample Preparation and Real-Time PCR

    PubMed Central

    Stelzl, Evelyn; Muller, Zsofia; Marth, Egon; Kessler, Harald H.

    2004-01-01

    Monitoring of hepatitis B virus (HBV) DNA in serum by molecular methods has become the standard for assessment of the replicative activity of HBV. Several molecular assays for the detection and quantification of HBV DNA have been described. However, they usually lack automated sample preparation. Moreover, those assays, which are based on PCR, are limited by a short dynamic range (2 to 3 log units). In the present study, the use of RealArt HBV LC PCR Reagents in conjunction with automated extraction on the COBAS AMPLIPREP analyzer was evaluated. Members of an HBV proficiency program panel were tested; linearity, interassay, and intra-assay variations were determined. The performance of the assay in a routine clinical laboratory was evaluated with a total of 117 clinical specimens. When members of the HBV proficiency program panel were tested by the new molecular assay, the results were found to be within ±0.5 log unit of the results obtained by reference laboratories. Determination of linearity resulted in a quasilinear curve over more than 6 log units. The interassay variation of the RealArt HBV LC PCR Reagents by use of the automated sample preparation protocol ranged from 16 to 73%, and the intra-assay variation ranged from 9 to 40%. When clinical samples were tested by the new assay with the automated sample preparation protocol and the results were compared with those obtained by the COBAS AMPLICOR HBV MONITOR Test with manual sample preparation, the results for 76% of all samples with positive results by both tests were found to be within ±0.5 log unit and the results for another 18% were found to be between 0.5 and 1.0 log unit. In conclusion, the real-time PCR assay with automated sample preparation proved to be suitable for the routine molecular laboratory and required less hands-on time. PMID:15184417
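
    A minimal sketch of the agreement calculation reported above, i.e. the fraction of paired positive results falling within ±0.5 and between 0.5 and 1.0 log10 units; the paired viral-load values are hypothetical.

```python
import numpy as np

def log_agreement(new_assay_iu_ml, reference_iu_ml):
    """Fraction of paired results within 0.5, and within 0.5-1.0, log10 units."""
    diff = np.abs(np.log10(new_assay_iu_ml) - np.log10(reference_iu_ml))
    within_half = np.mean(diff <= 0.5)
    half_to_one = np.mean((diff > 0.5) & (diff <= 1.0))
    return within_half, half_to_one

# Hypothetical paired HBV DNA results (IU/mL) from two assays
new = np.array([3.2e3, 1.1e5, 8.0e6, 2.5e4, 6.3e2])
ref = np.array([2.9e3, 2.0e5, 6.5e6, 9.0e4, 7.1e2])
print(log_agreement(new, ref))  # e.g. (0.8, 0.2)
```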

  2. Components for automated microfluidics sample preparation and analysis

    NASA Astrophysics Data System (ADS)

    Archer, M.; Erickson, J. S.; Hilliard, L. R.; Howell, P. B., Jr.; Stenger, D. A.; Ligler, F. S.; Lin, B.

    2008-02-01

    The increasing demand for portable devices to detect and identify pathogens represents an interdisciplinary effort between engineering, materials science, and molecular biology. Automation of both sample preparation and analysis is critical for performing multiplexed analyses on real world samples. This paper selects two possible components for such automated portable analyzers: modified silicon structures for use in the isolation of nucleic acids and a sheath flow system suitable for automated microflow cytometry. Any detection platform that relies on the genetic content (RNA and DNA) present in complex matrices requires careful extraction and isolation of the nucleic acids in order to ensure their integrity throughout the process. This sample pre-treatment step is commonly performed using commercially available solid phases along with various molecular biology techniques that require multiple manual steps and dedicated laboratory space. Regardless of the detection scheme, a major challenge in the integration of total analysis systems is the development of platforms compatible with current isolation techniques that will ensure the same quality of nucleic acids. Silicon is an ideal candidate for solid phase separations since it can be tailored structurally and chemically to mimic the conditions used in the laboratory. For analytical purposes, we have developed passive structures that can be used to fully ensheath one flow stream with another. As opposed to traditional flow focusing methods, our sheath flow profile is truly two dimensional, making it an ideal candidate for integration into a microfluidic flow cytometer. Such a microflow cytometer could be used to measure targets captured on either antibody- or DNA-coated beads.

  3. Automated acoustic matrix deposition for MALDI sample preparation.

    PubMed

    Aerni, Hans-Rudolf; Cornett, Dale S; Caprioli, Richard M

    2006-02-01

    Novel high-throughput sample preparation strategies for MALDI imaging mass spectrometry (IMS) and profiling are presented. An acoustic reagent multispotter was developed to provide improved reproducibility for depositing matrix onto a sample surface, for example, such as a tissue section. The unique design of the acoustic droplet ejector and its optimization for depositing matrix solution are discussed. Since it does not contain a capillary or nozzle for fluid ejection, issues with clogging of these orifices are avoided. Automated matrix deposition provides better control of conditions affecting protein extraction and matrix crystallization with the ability to deposit matrix accurately onto small surface features. For tissue sections, matrix spots of 180-200 µm in diameter were obtained and a procedure is described for generating coordinate files readable by a mass spectrometer to permit automated profile acquisition. Mass spectral quality and reproducibility were found to be better than that obtained with manual pipet spotting. The instrument can also deposit matrix spots in a dense array pattern so that, after analysis in a mass spectrometer, two-dimensional ion images may be constructed. Example ion images from a mouse brain are presented. PMID:16448057

  4. Validation of Automated Scoring of Science Assessments

    ERIC Educational Resources Information Center

    Liu, Ou Lydia; Rios, Joseph A.; Heilman, Michael; Gerard, Libby; Linn, Marcia C.

    2016-01-01

    Constructed response items can both measure the coherence of student ideas and serve as reflective experiences to strengthen instruction. We report on new automated scoring technologies that can reduce the cost and complexity of scoring constructed-response items. This study explored the accuracy of c-rater-ML, an automated scoring engine…

  5. Six Key Topics for Automated Assessment Utilisation and Acceptance

    ERIC Educational Resources Information Center

    Reiners, Torsten; Dreher, Carl; Dreher, Heinz

    2011-01-01

    Automated assessment technologies have been used in education for decades (e.g., computerised multiple choice tests). In contrast, Automated Essay Grading (AEG) technologies: have existed for decades; are "good in theory" (e.g., as accurate as humans, temporally and financially efficient, and can enhance formative feedback), and yet; are…

  6. The ECLSS Advanced Automation Project Evolution and Technology Assessment

    NASA Technical Reports Server (NTRS)

    Dewberry, Brandon S.; Carnes, James R.; Lukefahr, Brenda D.; Rogers, John S.; Rochowiak, Daniel M.; Mckee, James W.; Benson, Brian L.

    1990-01-01

    Viewgraphs on Environmental Control and Life Support System (ECLSS) advanced automation project evolution and technology assessment are presented. Topics covered include: the ECLSS advanced automation project; automatic fault diagnosis of ECLSS subsystems descriptions; in-line, real-time chemical and microbial fluid analysis; and object-oriented, distributed chemical and microbial modeling of regenerative environmental control systems description.

  7. Automation of sample plan creation for process model calibration

    NASA Astrophysics Data System (ADS)

    Oberschmidt, James; Abdo, Amr; Desouky, Tamer; Al-Imam, Mohamed; Krasnoperova, Azalia; Viswanathan, Ramya

    2010-04-01

    The process of preparing a sample plan for optical and resist model calibration has always been tedious, not only because the plan must accurately represent full-chip designs with countless combinations of widths, spaces, and environments, but also because of the constraints imposed by metrology, which may limit the number of structures that can be measured. There are also limits on the types of these structures, mainly due to the variation in measurement accuracy across different types of geometries. For instance, pitch measurements are normally more accurate than corner rounding, so only certain geometrical shapes are typically considered when creating a sample plan. In addition, the time factor is becoming crucial as we migrate from one technology node to another, due to the increase in the number of development and production nodes, and the process becomes more complicated if process-window-aware models are to be developed in a reasonable time frame; reliable methods for choosing sample plans that also help reduce cycle time are therefore needed. In this context, an automated flow is proposed for sample plan creation. Once the illumination and film stack are defined, all the errors in the input data are fixed and sites are centered. Then, bad sites are excluded. Afterwards, the clean data are reduced based on geometrical resemblance. Also, an editable database of measurement-reliable and critical structures is provided, and their percentage in the final sample plan as well as the total number of 1D/2D samples can be predefined. This approach has the advantage of eliminating manual selection or filtering techniques, it provides powerful tools for customizing the final plan, and the time needed to generate these plans is greatly reduced.
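
    A minimal sketch of the "reduced based on geometrical resemblance" step, assuming 1D gauges are described by (width, space) pairs and considered to resemble one another when they fall in the same cell of a coarse grid; the grid size and gauge list are illustrative, not the production flow.

```python
def reduce_by_resemblance(gauges, grid_nm=5):
    """Keep one representative gauge per (width, space) cell on a coarse grid."""
    kept, seen = [], set()
    for width, space in gauges:
        key = (round(width / grid_nm), round(space / grid_nm))
        if key not in seen:
            seen.add(key)
            kept.append((width, space))
    return kept

# Hypothetical gauges in nm; near-duplicates collapse to one entry
gauges = [(45, 90), (46, 91), (45, 120), (60, 90), (61, 89), (44, 92)]
print(reduce_by_resemblance(gauges))
```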

  8. Current advances and strategies towards fully automated sample preparation for regulated LC-MS/MS bioanalysis.

    PubMed

    Zheng, Naiyu; Jiang, Hao; Zeng, Jianing

    2014-09-01

    Robotic liquid handlers (RLHs) have been widely used in automated sample preparation for liquid chromatography-tandem mass spectrometry (LC-MS/MS) bioanalysis. Automated sample preparation for regulated bioanalysis offers significantly higher assay efficiency, better data quality and potential bioanalytical cost-savings. For RLHs that are used for regulated bioanalysis, there are additional requirements, including 21 CFR Part 11 compliance, software validation, system qualification, calibration verification and proper maintenance. This article reviews recent advances in automated sample preparation for regulated bioanalysis in the last 5 years. Specifically, it covers the following aspects: regulated bioanalysis requirements, recent advances in automation hardware and software development, sample extraction workflow simplification, strategies towards fully automated sample extraction, and best practices in automated sample preparation for regulated bioanalysis. PMID:25384595

  9. Semicontinuous automated measurement of organic carbon in atmospheric aerosol samples.

    PubMed

    Lu, Chao; Rashinkar, Shilpa M; Dasgupta, Purnendu K

    2010-02-15

    A fully automated measurement system for ambient aerosol organic carbon, capable of unattended operation over extended periods, is described. Particles are collected in a cyclone with water as the collection medium. The collected sample is periodically aspirated by a syringe pump into a holding loop and then delivered to a wet oxidation reactor (WOR). Acid is added, and the WOR is purged to measure dissolved CO2 or inorganic carbonates (IC) as evolved CO2. The IC background can often be small and sufficiently constant to be corrected for, without separate measurement, by a blank subtraction. The organic material is then oxidized stepwise or in one step to CO2. The one-step oxidation involves UV-persulfate treatment in the presence of ozone. This treatment converts organic carbon (OC) to CO2, but elemental carbon is not oxidized. The CO2 is continuously purged from solution and collected by two sequential miniature diffusion scrubbers (DSs), a short DS preceding a longer one. Each DS consists of a LiOH-filled porous hydrophobic membrane tube with terminal stainless steel tubes that function as conductance-sensing electrodes. As CO2 is collected by the LiOH-filled DSs, hydroxide is converted into carbonate and the resulting decrease in conductivity is monitored. The simultaneous use of the dual short and long DS units bearing different concentrations of LiOH permits both good sensitivity and a large dynamic range. The limit of detection (LOD, S/N = 3) is approximately 140 ng of C. With a typical sampling period of 30 min at a sampling rate of 30 L/min, this corresponds to an LOD of 160 ng/m3. The approach also provides information on the ease of oxidation of the carbonaceous aerosol and hence the nature of the carbon contained therein. Ambient aerosol organic carbon data are presented. PMID:20092351
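
    A quick check of the detection-limit arithmetic quoted above: 140 ng of carbon collected over a 30-min sample at 30 L/min works out to roughly 160 ng/m3.

```python
lod_ng = 140.0          # carbon mass detection limit, ng
flow_l_per_min = 30.0   # sampling rate
duration_min = 30.0     # sampling period

sampled_m3 = flow_l_per_min * duration_min / 1000.0   # 0.9 m^3 of air
print(round(lod_ng / sampled_m3))                      # ~156 ng/m^3, i.e. ~160
```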

  10. Automated Assessment of Cognitive Health Using Smart Home Technologies

    PubMed Central

    Dawadi, Prafulla N.; Cook, Diane J.; Schmitter-Edgecombe, Maureen; Parsey, Carolyn

    2014-01-01

    BACKGROUND The goal of this work is to develop intelligent systems to monitor the well-being of individuals in their home environments. OBJECTIVE This paper introduces a machine learning-based method to automatically predict activity quality in smart homes and automatically assess cognitive health based on activity quality. METHODS This paper describes an automated framework to extract a set of features from smart home sensor data that reflects how well an individual performs, or is able to complete, an activity; these features can be input to machine learning algorithms. Outputs from learning algorithms, including principal component analysis, support vector machines, and logistic regression, are used to quantify activity quality for a complex set of smart home activities and to predict the cognitive health of participants. RESULTS Smart home activity data were gathered from volunteer participants (n=263) who performed a complex set of activities in our smart home testbed. We compare our automated activity quality prediction and cognitive health prediction with direct observation scores and health assessments obtained from neuropsychologists. With all samples included, we obtained a statistically significant correlation (r=0.54) between direct observation scores and predicted activity quality. Similarly, using a support vector machine classifier, we obtained reasonable classification accuracy (area under the ROC curve = 0.80, g-mean = 0.73) in classifying participants into two cognitive classes, dementia and cognitively healthy. CONCLUSIONS The results suggest that it is possible to automatically quantify the task quality of smart home activities and perform a limited assessment of the cognitive health of individuals, provided smart home activities are properly chosen and learning algorithms are appropriately trained. PMID:23949177
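
    As a rough illustration of the classification step (an SVM evaluated by ROC AUC, as reported above), the sketch below uses scikit-learn on synthetic data; the feature set, labels and settings are invented and are not the authors' pipeline.

```python
# Illustrative sketch only: classifying cognitive status from activity-quality features
# with an SVM and reporting ROC AUC. Feature values and labels are synthetic.
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_predict
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
X = rng.normal(size=(263, 12))       # 263 participants, 12 hypothetical activity features
y = rng.integers(0, 2, size=263)     # 0 = cognitively healthy, 1 = dementia (synthetic labels)

clf = SVC(kernel="rbf", probability=True)
scores = cross_val_predict(clf, X, y, cv=5, method="predict_proba")[:, 1]
print("ROC AUC:", roc_auc_score(y, scores))
```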

  11. Automating Assessment of Lifestyle Counseling in Electronic Health Records

    PubMed Central

    Hazlehurst, Brian L.; Lawrence, Jean M.; Donahoo, William T.; Sherwood, Nancy E; Kurtz, Stephen E; Xu, Stan; Steiner, John F

    2015-01-01

    Background Numerous population-based surveys indicate that overweight and obese patients can benefit from lifestyle counseling during routine clinical care. Purpose To determine if natural language processing (NLP) could be applied to information in the electronic health record (EHR) to automatically assess delivery of counseling related to weight management in clinical health care encounters. Methods The MediClass system with NLP capabilities was used to identify weight management counseling in EHR encounter records. Knowledge for the NLP application was derived from the 5As framework for behavior counseling: Ask (evaluate weight and related disease), Advise at-risk patients to lose weight, Assess patients’ readiness to change behavior, Assist through discussion of weight loss methods and programs, and Arrange follow-up efforts including referral. Using samples of EHR data from the 1/1/2007-3/31/2011 period from two health systems, the accuracy of the MediClass processor for identifying these counseling elements was evaluated in post-partum visits of 600 women with gestational diabetes mellitus (GDM), with manual chart review as the gold standard. Data were analyzed in 2013. Results Mean sensitivity and specificity for each of the 5As compared to the gold standard was at or above 85%, with the exception of sensitivity for Assist, which was measured at 40% and 60%, respectively, in the two health systems. The automated method identified many valid cases of Assist not identified in the gold standard. Conclusions The MediClass processor has performance capability sufficiently similar to human abstractors to permit automated assessment of counseling for weight loss in post-partum encounter records. PMID:24745635
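
    The evaluation compares the NLP output to manual chart review; a minimal sketch of the sensitivity/specificity computation, on invented labels:

```python
# Sketch of the evaluation metric: sensitivity and specificity of an automated classifier
# against a manual chart-review gold standard. Labels below are invented.
def sensitivity_specificity(gold, predicted):
    tp = sum(1 for g, p in zip(gold, predicted) if g and p)
    tn = sum(1 for g, p in zip(gold, predicted) if not g and not p)
    fp = sum(1 for g, p in zip(gold, predicted) if not g and p)
    fn = sum(1 for g, p in zip(gold, predicted) if g and not p)
    return tp / (tp + fn), tn / (tn + fp)

gold      = [1, 1, 0, 0, 1, 0, 1, 0]   # chart review: counseling element present?
predicted = [1, 0, 0, 0, 1, 0, 1, 1]   # NLP output for the same encounters
sens, spec = sensitivity_specificity(gold, predicted)
print(f"sensitivity={sens:.2f}, specificity={spec:.2f}")
```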

  12. Using Software Tools to Automate the Assessment of Student Programs.

    ERIC Educational Resources Information Center

    Jackson, David

    1991-01-01

    Argues that the advent of computer-aided instruction (CAI) systems for teaching introductory computer programming makes it imperative that software be developed to automate assessment and grading of student programs. Examples of typical student programming problems are given, and application of the Unix tools Lex and Yacc to the automatic assessment of…

  13. Ability-Training-Oriented Automated Assessment in Introductory Programming Course

    ERIC Educational Resources Information Center

    Wang, Tiantian; Su, Xiaohong; Ma, Peijun; Wang, Yuying; Wang, Kuanquan

    2011-01-01

    Learning to program is a difficult process for novice programmers. AutoLEP, an automated learning and assessment system, was developed by us to aid novice programmers in acquiring programming skills. AutoLEP is ability-training-oriented. It adopts a novel assessment mechanism, which combines static analysis with dynamic testing to analyze student…

  14. Case study: Automated utilities damage assessment (AUDA) system

    SciTech Connect

    Salavani, R.; Laventure, G.C.; Smith, M.D.

    1994-12-31

    A demonstration program of an automated utility damage assessment system (AUDA) at a United States Air Force (USAF) facility is described. The AUDA is designed to efficiently assess damage to military equipment and utilities, such as electrical equipment, potable and waste water, HVAC systems, petroleum, oil and lubricants, and natural gas.

  15. Manual versus automated blood sampling: impact of repeated blood sampling on stress parameters and behavior in male NMRI mice

    PubMed Central

    Kalliokoski, Otto; Sørensen, Dorte B; Hau, Jann; Abelson, Klas S P

    2014-01-01

    Facial vein (cheek blood) and caudal vein (tail blood) phlebotomy are two commonly used techniques for obtaining blood samples from laboratory mice, while automated blood sampling through a permanent catheter is a relatively new technique in mice. The present study compared physiological parameters, glucocorticoid dynamics as well as the behavior of mice sampled repeatedly for 24 h by cheek blood, tail blood or automated blood sampling from the carotid artery. Mice subjected to cheek blood sampling lost significantly more body weight, had elevated levels of plasma corticosterone, excreted more fecal corticosterone metabolites, and expressed more anxious behavior than did the mice of the other groups. Plasma corticosterone levels of mice subjected to tail blood sampling were also elevated, although less significantly. Mice subjected to automated blood sampling were less affected with regard to the parameters measured, and expressed less anxious behavior. We conclude that repeated blood sampling by automated blood sampling and from the tail vein is less stressful than cheek blood sampling. The choice between automated blood sampling and tail blood sampling should be based on the study requirements, the resources of the laboratory and skills of the staff. PMID:24958546

  16. Automated Mars surface sample return mission concepts for achievement of essential scientific objectives

    NASA Technical Reports Server (NTRS)

    Weaver, W. L.; Norton, H. N.; Darnell, W. L.

    1975-01-01

    Mission concepts were investigated for automated return to Earth of a Mars surface sample adequate for detailed analyses in scientific laboratories. The minimum sample mass sufficient to meet scientific requirements was determined. Types of materials and supporting measurements for essential analyses are reported. A baseline trajectory profile was selected for its low energy requirements and relatively simple implementation, and trajectory profile design data were developed for 1979 and 1981 launch opportunities. Efficient spacecraft systems were conceived by utilizing existing technology where possible. Systems concepts emphasized the 1979 launch opportunity, and the applicability of results to other opportunities was assessed. It was shown that the baseline missions (return through Mars parking orbit) and some comparison missions (return after sample transfer in Mars orbit) can be accomplished by using a single Titan III E/Centaur as the launch vehicle. All missions investigated can be accomplished by use of Space Shuttle/Centaur vehicles.

  17. Automated biowaste sampling system improved feces collection, mass measurement and sampling. [by use of a breadboard model

    NASA Technical Reports Server (NTRS)

    Fogal, G. L.; Mangialardi, J. K.; Young, R.

    1974-01-01

    The capability of the basic automated Biowaste Sampling System (ABSS) hardware was extended and improved through the design, fabrication and test of breadboard hardware. A preliminary system design effort established the feasibility of integrating the breadboard concepts into the ABSS.

  18. Automated Aqueous Sample Concentration Methods for in situ Astrobiological Instrumentation

    NASA Astrophysics Data System (ADS)

    Aubrey, A. D.; Grunthaner, F. J.

    2009-12-01

    The era of wet chemical experiments for in situ planetary science investigations is upon us, as evidenced by recent results from the surface of Mars by Phoenix’s microscopy, electrochemistry, and conductivity analyzer, MECA [1]. Studies suggest that traditional thermal volatilization methods for planetary science in situ investigations induce organic degradation during sample processing [2], an effect that is enhanced in the presence of oxidants [3]. Recent developments have trended towards adaptation of non-destructive aqueous extraction and analytical methods for future astrobiological instrumentation. Wet chemical extraction techniques under investigation include subcritical water extraction, SCWE [4], aqueous microwave assisted extraction, MAE, and organic solvent extraction [5]. Similarly, miniaturized analytical space flight instruments under development that require aqueous extracts include microfluidic capillary electrophoresis chips, μCE [6], liquid chromatography-mass spectrometers, LC-MS [7], and life marker chips, LMC [8]. If organics are present on the surface of Mars, they are expected to be present at extremely low concentrations (parts-per-billion), orders of magnitude below the sensitivities of most flight instrument technologies. Therefore, it becomes necessary to develop and integrate concentration mechanisms for in situ sample processing before delivery to analytical flight instrumentation. We present preliminary results of automated solid-phase-extraction (SPE) sample purification and concentration methods for the treatment of highly saline aqueous soil extracts. These methods take advantage of the affinity of low molecular weight organic compounds for natural and synthetic scavenger materials. These interactions allow for the separation of target organic analytes from unfavorable background species (i.e., salts) during inline treatment, and a clever method for selective desorption is utilized to obtain concentrated solutions on the order

  19. Needs Assessments for Automated Manufacturing Training Programs.

    ERIC Educational Resources Information Center

    Northampton Community Coll., Bethlehem, PA.

    This document contains needs assessments used by Northampton Community College to develop training courses for a business-industry technology resource center for firms in eastern Pennsylvania. The following needs assessments are included: (1) individual skills survey for workers at Keystone Cement Company; (2) Keystone group skills survey; (3)…

  20. Automated Geospatial Watershed Assessment (AGWA) 3.0 Software Tool

    EPA Science Inventory

    The Automated Geospatial Watershed Assessment (AGWA) tool has been developed under an interagency research agreement between the U.S. Environmental Protection Agency, Office of Research and Development, and the U.S. Department of Agriculture, Agricultural Research Service. AGWA i...

  1. Automated Geospatial Watershed Assessment (AGWA) Documentation Version 2.0

    EPA Science Inventory

    The Automated Geospatial Watershed Assessment (http://www.epa.gov/nerlesd1/landsci/agwa/introduction.htm and www.tucson.ars.ag.gov/agwa) tool is a GIS interface jointly developed by the U.S. Environmental Protection Agency, USDA-Agricultural Research Service, University of Arizon...

  2. Human and Automated Assessment of Oral Reading Fluency

    ERIC Educational Resources Information Center

    Bolaños, Daniel; Cole, Ron A.; Ward, Wayne H.; Tindal, Gerald A.; Hasbrouck, Jan; Schwanenflugel, Paula J.

    2013-01-01

    This article describes a comprehensive approach to fully automated assessment of children's oral reading fluency (ORF), one of the most informative and frequently administered measures of children's reading ability. Speech recognition and machine learning techniques are described that model the 3 components of oral reading fluency: word accuracy,…

  3. Automated Assessment of Upper Extremity Movement Impairment due to Stroke

    PubMed Central

    Olesh, Erienne V.; Yakovenko, Sergiy; Gritsenko, Valeriya

    2014-01-01

    Current diagnosis and treatment of movement impairment post-stroke are based on the subjective assessment of select movements by a trained clinical specialist. However, modern low-cost motion capture technology allows for the development of automated quantitative assessment of motor impairment. Such outcome measures are crucial for advancing post-stroke treatment methods. We sought to develop an automated method of measuring the quality of movement in clinically-relevant terms from low-cost motion capture. Unconstrained movements of the upper extremity were performed by people with chronic hemiparesis and recorded by standard and low-cost motion capture systems. Quantitative scores derived from motion capture were compared to qualitative clinical scores produced by trained human raters. A strong linear relationship was found between qualitative scores and quantitative scores derived from both standard and low-cost motion capture. Performance of the automated scoring algorithm matched that of the averaged qualitative scores of three human raters. We conclude that low-cost motion capture combined with an automated scoring algorithm is a feasible method to objectively assess upper-arm impairment post-stroke. The application of this technology may not only reduce the cost of assessment of post-stroke movement impairment, but also promote the acceptance of objective impairment measures into routine medical practice. PMID:25100036

  4. Validity Arguments for Diagnostic Assessment Using Automated Writing Evaluation

    ERIC Educational Resources Information Center

    Chapelle, Carol A.; Cotos, Elena; Lee, Jooyoung

    2015-01-01

    Two examples demonstrate an argument-based approach to validation of diagnostic assessment using automated writing evaluation (AWE). "Criterion"® was developed by Educational Testing Service to analyze students' papers grammatically, providing sentence-level error feedback. An interpretive argument was developed for its use as part of…

  5. Automated Geospatial Watershed Assessment Tool (AGWA) Poster Presentation

    EPA Science Inventory

    The Automated Geospatial Watershed Assessment tool (AGWA, see: www.tucson.ars.ag.gov/agwa or http://www.epa.gov/esd/land-sci/agwa/) is a GIS interface jointly developed by the USDA-Agricultural Research Service, the U.S. Environmental Protection Agency, the University of Arizona...

  6. SOURCE ASSESSMENT SAMPLING SYSTEM: DESIGN AND DEVELOPMENT

    EPA Science Inventory

    The report chronologically describes the design and development of the Source Assessment Sampling System (SASS). The SASS train is the principal sampling element for ducted sources when performing EPA's Level 1 environmental assessment studies. As such, it samples process streams...

  7. A modular approach for automated sample preparation and chemical analysis

    NASA Technical Reports Server (NTRS)

    Clark, Michael L.; Turner, Terry D.; Klingler, Kerry M.; Pacetti, Randolph

    1994-01-01

    Changes in international relations, especially within the past several years, have dramatically affected the programmatic thrusts of the U.S. Department of Energy (DOE). The DOE now is addressing the environmental cleanup required as a result of 50 years of nuclear arms research and production. One major obstacle in the remediation of these areas is the chemical determination of potentially contaminated material using currently acceptable practices. Process bottlenecks and exposure to hazardous conditions pose problems for the DOE. One proposed solution is the application of modular automated chemistry using Standard Laboratory Modules (SLM) to perform Standard Analysis Methods (SAM). The Contaminant Analysis Automation (CAA) Program has developed standards and prototype equipment that will accelerate the development of modular chemistry technology and is transferring this technology to private industry.

  8. Operator-based metric for nuclear operations automation assessment

    SciTech Connect

    Zacharias, G.L.; Miao, A.X.; Kalkan, A.

    1995-04-01

    Continuing advances in real-time computational capabilities will support enhanced levels of smart automation and AI-based decision-aiding systems in the nuclear power plant (NPP) control room of the future. To support development of these aids, we describe in this paper a research tool, and more specifically, a quantitative metric, to assess the impact of proposed automation/aiding concepts in a manner that can account for a number of interlinked factors in the control room environment. In particular, we describe a cognitive operator/plant model that serves as a framework for integrating the operator's information-processing capabilities with his procedural knowledge, to provide insight as to how situations are assessed by the operator, decisions made, procedures executed, and communications conducted. Our focus is on the situation assessment (SA) behavior of the operator, the development of a quantitative metric reflecting overall operator awareness, and the use of this metric in evaluating automation/aiding options. We describe the results of a model-based simulation of a selected emergency scenario, and metric-based evaluation of a range of contemplated NPP control room automation/aiding options. The results demonstrate the feasibility of model-based analysis of contemplated control room enhancements, and highlight the need for empirical validation.

  9. Systematic Nursing Assessment: A Step toward Automation.

    ERIC Educational Resources Information Center

    State Univ. of New York, Buffalo. School of Nursing.

    The project's broad objective was to improve patient care through the development of a manual or computer-assisted tool for assessing patient health/illness status and recording essential information throughout a period of care. The project sought contributions from practicing nurses in identifying the descriptive clinical information needed to…

  10. Automated Geospatial Watershed Assessment Tool for Rangelands

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Soil and water conservation is the keystone to sustainable livestock grazing and maintenance of native species on our western rangelands. Good rangeland management requires the ability to assess the potential impacts of climate and management actions on runoff and erosion at both hillslope and wate...

  11. Automated Assessment and Experiences of Teaching Programming

    ERIC Educational Resources Information Center

    Higgins, Colin A.; Gray, Geoffrey; Symeonidis, Pavlos; Tsintsifas, Athanasios

    2005-01-01

    This article reports on the design, implementation, and usage of the CourseMarker (formerly known as CourseMaster) courseware Computer Based Assessment (CBA) system at the University of Nottingham. Students use CourseMarker to solve (programming) exercises and to submit their solutions. CourseMarker returns immediate results and feedback to the…

  12. Flightdeck Automation Problems (FLAP) Model for Safety Technology Portfolio Assessment

    NASA Technical Reports Server (NTRS)

    Ancel, Ersin; Shih, Ann T.

    2014-01-01

    NASA's Aviation Safety Program (AvSP) develops and advances methodologies and technologies to improve air transportation safety. The Safety Analysis and Integration Team (SAIT) conducts a safety technology portfolio assessment (PA) to analyze the program content, to examine the benefits and risks of products with respect to program goals, and to support programmatic decision making. The PA process includes systematic identification of current and future safety risks as well as tracking several quantitative and qualitative metrics to ensure the program goals are addressing prominent safety risks accurately and effectively. One of the metrics within the PA process involves using quantitative aviation safety models to gauge the impact of the safety products. This paper demonstrates the role of aviation safety modeling by providing model outputs and evaluating a sample of portfolio elements using the Flightdeck Automation Problems (FLAP) model. The model enables not only ranking of the quantitative relative risk reduction impact of all portfolio elements, but also highlighting the areas with high potential impact via sensitivity and gap analyses in support of the program office. Although the model outputs are preliminary and products are notional, the process shown in this paper is essential to a comprehensive PA of NASA's safety products in the current program and future programs/projects.

  13. Automated FMV image quality assessment based on power spectrum statistics

    NASA Astrophysics Data System (ADS)

    Kalukin, Andrew

    2015-05-01

    Factors that degrade image quality in video and other sensor collections, such as noise, blurring, and poor resolution, also affect the spatial power spectrum of imagery. Prior research in human vision and image science from the last few decades has shown that the image power spectrum can be useful for assessing the quality of static images. The research in this article explores the possibility of using the image power spectrum to automatically evaluate full-motion video (FMV) imagery frame by frame. This procedure makes it possible to identify anomalous images and scene changes, and to keep track of gradual changes in quality as collection progresses. This article will describe a method to apply power spectral image quality metrics for images subjected to simulated blurring, blocking, and noise. As a preliminary test on videos from multiple sources, image quality measurements for image frames from 185 videos are compared to analyst ratings based on ground sampling distance. The goal of the research is to develop an automated system for tracking image quality during real-time collection, and to assign ratings to video clips for long-term storage, calibrated to standards such as the National Imagery Interpretability Rating System (NIIRS).
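
    The metric builds on the spatial power spectrum of each frame; below is a minimal sketch of a radially averaged power spectrum for a single synthetic frame, not the full FMV quality metric described above.

```python
# Minimal sketch of the underlying quantity: a radially averaged spatial power spectrum
# for one frame. The frame is synthetic; blurring or blocking would change the fall-off.
import numpy as np

def radial_power_spectrum(frame):
    f = np.fft.fftshift(np.fft.fft2(frame - frame.mean()))
    power = np.abs(f) ** 2
    ny, nx = frame.shape
    y, x = np.indices((ny, nx))
    r = np.hypot(x - nx // 2, y - ny // 2).astype(int)
    # mean power in each integer-radius bin (spatial frequency)
    sums = np.bincount(r.ravel(), weights=power.ravel())
    counts = np.bincount(r.ravel())
    return sums / np.maximum(counts, 1)

frame = np.random.default_rng(1).normal(size=(128, 128))   # stand-in for one video frame
spectrum = radial_power_spectrum(frame)
print(spectrum[:5])   # low-frequency bins
```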

  14. AUTOMATED GEOSPATIAL WATERSHED ASSESSMENT (AGWA): A GIS-BASED TOOL FOR WATERSHED ASSESSMENT AND PLANNING

    EPA Science Inventory

    The Automated Geospatial Watershed Assessment tool (AGWA) is a GIS interface jointly developed by the USDA Agricultural Research Service, the U.S. Environmental Protection Agency, the University of Arizona, and the University of Wyoming to automate the parameterization and execu...

  15. Automating risk of bias assessment for clinical trials.

    PubMed

    Marshall, Iain J; Kuiper, Joël; Wallace, Byron C

    2015-07-01

    Systematic reviews, which summarize the entirety of the evidence pertaining to a specific clinical question, have become critical for evidence-based decision making in healthcare. But such reviews have become increasingly onerous to produce due to the exponentially expanding biomedical literature base. This study proposes a step toward mitigating this problem by automating risk of bias assessment in systematic reviews, in which reviewers determine whether study results may be affected by biases (e.g., poor randomization or blinding). Conducting risk of bias assessment is an important but onerous task. We thus describe a machine learning approach to automate this assessment, using the standard Cochrane Risk of Bias Tool which assesses seven common types of bias. Training such a system would typically require a large labeled corpus, which would be prohibitively expensive to collect here. Instead, we use distant supervision, using data from the Cochrane Database of Systematic Reviews (a large repository of systematic reviews), to pseudoannotate a corpus of 2200 clinical trial reports in PDF format. We then develop a joint model which, using the full text of a clinical trial report as input, predicts the risks of bias while simultaneously extracting the text fragments supporting these assessments. This study represents a step toward automating or semiautomating extraction of data necessary for the synthesis of clinical trials. PMID:25966488
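
    A toy sketch of the core learning step is given below: a bag-of-words classifier predicting a risk-of-bias judgement for a single domain from trial text. The corpus and labels are invented; the actual system uses distant supervision from the Cochrane Database and a joint model over all seven bias domains.

```python
# Toy sketch: text classification of one risk-of-bias domain. Documents and
# pseudo-labels are invented, standing in for labels derived from Cochrane reviews.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

docs = [
    "participants were randomised using a computer generated sequence",
    "allocation was concealed with sealed opaque envelopes",
    "patients were assigned to treatment at the physician's discretion",
    "group assignment alternated by week of admission",
]
labels = ["low", "low", "high", "high"]

model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
model.fit(docs, labels)
print(model.predict(["subjects were allocated by coin toss performed by the study nurse"]))
```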

  16. Automated Assessment of Medical Training Evaluation Text

    PubMed Central

    Zhang, Rui; Pakhomov, Serguei; Gladding, Sophia; Aylward, Michael; Borman-Shoap, Emily; Melton, Genevieve B.

    2012-01-01

    Medical post-graduate residency training and medical student training increasingly utilize electronic systems to evaluate trainee performance based on defined training competencies with quantitative and qualitative data, the latter of which typically consists of text comments. Medical education is concomitantly becoming a growing area of clinical research. While electronic systems have proliferated in number, little work has been done to help manage and analyze qualitative data from these evaluations. We explored the use of text-mining techniques to assist medical education researchers in sentiment analysis and topic analysis of residency evaluations with a sample of 812 evaluation statements. While comments were predominantly positive, sentiment analysis improved the ability to discriminate statements with 93% accuracy. Similar to other domains, Latent Dirichlet Allocation and Information Gain revealed groups of core subjects and appear to be useful for identifying topics from these data. PMID:23304426
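
    As an illustration of the topic-analysis step, the sketch below fits a small LDA topic model to invented evaluation comments with scikit-learn; the settings are assumptions and are not those used in the study.

```python
# Illustrative sketch: LDA topic modelling of free-text evaluation comments.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

comments = [
    "excellent communication with patients and families",
    "needs to improve differential diagnosis and clinical reasoning",
    "strong knowledge base, presentations were well organized",
    "should read more around cases to strengthen medical knowledge",
]
vec = CountVectorizer(stop_words="english")
X = vec.fit_transform(comments)

lda = LatentDirichletAllocation(n_components=2, random_state=0).fit(X)
terms = vec.get_feature_names_out()
for k, topic in enumerate(lda.components_):
    top = [terms[i] for i in topic.argsort()[-4:]]   # four highest-weight terms per topic
    print(f"topic {k}:", top)
```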

  17. Automated sample preparation and analysis using a sequential-injection-capillary electrophoresis (SI-CE) interface.

    PubMed

    Kulka, Stephan; Quintás, Guillermo; Lendl, Bernhard

    2006-06-01

    A fully automated sequential-injection-capillary electrophoresis (SI-CE) system was developed using commercially available components such as the syringe pump, the selection and injection valves, and the high-voltage power supply. The interface connecting the SI with the CE unit consisted of two T-pieces, where the capillary was inserted in one T-piece and a Pt electrode in the other (grounded) T-piece. Pressurising the whole system with the syringe pump made hydrodynamic injection feasible. For characterisation, the system was applied to a mixture of adenosine and adenosine monophosphate at different concentrations. The calibration curve obtained gave a detection limit of 0.5 µg/g (correlation coefficient of 0.997). The reproducibility of the injection was also assessed, resulting in an RSD value (5 injections) of 5.4%. The total time of analysis, from injection, conditioning and separation to cleaning the capillary again, was 15 minutes. In another application, employing the full power of the automated SI-CE system, myoglobin was mixed directly, using the flow system, with different concentrations of sodium dodecyl sulfate (SDS), a known denaturing agent. The different conformations obtained in this way were analysed with the CE system, and a distinct shift in migration time and a decrease of the native myoglobin (Mb) peak could be observed. The protein samples prepared were also analysed with off-line infrared spectroscopy (IR), confirming these results. PMID:16732362
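
    The reproducibility figure quoted above is a relative standard deviation over replicate injections; a minimal sketch with invented peak areas:

```python
# Quick sketch of an RSD calculation over replicate injections (areas are invented).
import statistics

areas = [10250, 9980, 10510, 10120, 9890]          # five replicate injections (arbitrary units)
rsd = statistics.stdev(areas) / statistics.mean(areas) * 100
print(f"RSD = {rsd:.1f}%")                          # the abstract reports 5.4% for n = 5
```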

  18. Development of an automated data processing method for sample to sample comparison of seized methamphetamines.

    PubMed

    Choe, Sanggil; Lee, Jaesin; Choi, Hyeyoung; Park, Yujin; Lee, Heesang; Pyo, Jaesung; Jo, Jiyeong; Park, Yonghoon; Choi, Hwakyung; Kim, Suncheun

    2012-11-30

    Information about sources of supply, trafficking routes, distribution patterns and conspiracy links can be obtained from methamphetamine profiling. The precursor and synthetic method used in clandestine manufacture can be estimated from the analysis of minor impurities contained in methamphetamine, and the similarity between samples can be evaluated using the peaks that appear in their chromatograms. In South Korea, methamphetamine was the most popular drug, but the total amount of methamphetamine seized throughout the country was very small. Therefore, finding links between samples would be more important than the other uses of methamphetamine profiling. Many Asian countries, including Japan and South Korea, have been using the method developed by the National Research Institute of Police Science of Japan. The method used a gas chromatograph with a flame ionization detector (GC-FID), a DB-5 column and four internal standards, and it was developed to increase the amount of impurities and minimize the amount of methamphetamine. After GC-FID analysis, the raw data have to be processed. The data processing steps are very complex and require a lot of time and effort. In this study, Microsoft Visual Basic for Applications (VBA) modules were developed to handle these data processing steps. These modules collected the results into an Excel file and then corrected the retention-time shift and response deviation introduced by sample preparation and instrumental analysis. The developed modules were tested for their performance using 10 samples from 5 different cases. The processed results were analyzed with the Pearson correlation coefficient for similarity assessment, and the correlation coefficient of two samples from the same case was more than 0.99. When the modules were applied to 131 seized methamphetamine samples, four samples from two different cases were found to have a common origin, and the chromatograms of the four samples appeared visually identical
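
    A sketch of the similarity step described above: once retention times are aligned, two impurity profiles can be compared with a Pearson correlation coefficient. The peak tables below are invented; the original processing was implemented as Excel VBA modules.

```python
# Sketch of similarity assessment between two impurity profiles via Pearson correlation.
import numpy as np

def pearson_similarity(profile_a, profile_b):
    """profile_*: peak areas on a common, retention-time-aligned grid."""
    a, b = np.asarray(profile_a, float), np.asarray(profile_b, float)
    return np.corrcoef(a, b)[0, 1]

sample_1 = [0.0, 12.4, 3.1, 0.8, 40.2, 5.5]    # normalized impurity peak areas (invented)
sample_2 = [0.1, 12.9, 2.8, 0.7, 41.0, 5.2]    # suspected common origin
print(round(pearson_similarity(sample_1, sample_2), 3))   # close to 1 for a common origin
```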

  19. Situation Awareness and Levels of Automation: Empirical Assessment of Levels of Automation in the Commercial Cockpit

    NASA Technical Reports Server (NTRS)

    Kaber, David B.; Schutte, Paul C. (Technical Monitor)

    2000-01-01

    This report has been prepared to close out a NASA grant to Mississippi State University (MSU) for research into situation awareness (SA) and automation in the advanced commercial aircraft cockpit. The grant was divided into two obligations, including $60,000 for the period from May 11, 2000 to December 25, 2000. The information presented in this report summarizes work completed through this obligation. It also details work to be completed with the balance of the current obligation and unobligated funds amounting to $50,043, which are to be granted to North Carolina State University for completion of the research project from July 31, 2000 to May 10, 2001. This research was to involve investigation of the effects of a broad spectrum of degrees of automation of complex systems on human-machine performance and SA. The work was to empirically assess the effect of theoretical levels of automation (LOAs) described in a taxonomy developed by Endsley & Kaber (1999) on naive and experienced subject performance and SA in simulated flight tasks. The study was to be conducted in the context of a realistic simulation of aircraft flight control. The objective of this work was to identify LOAs that effectively integrate humans and machines under normal operating conditions and failure modes. In general, the work was to provide insight into the design of automation in the commercial aircraft cockpit. Both laboratory and field investigations were to be conducted. At this point in time, a high-fidelity flight simulator of the McDonnell Douglas MD-11 aircraft has been completed. The simulator integrates a reconfigurable flight simulator developed by the Georgia Institute of Technology and stand-alone simulations of MD-11 autoflight systems developed at MSU. Use of the simulator has been integrated into a study plan for the laboratory research and it is expected that the simulator will also be used in the field study with actual commercial pilots. In addition to the flight simulator, an electronic

  20. Automated Sample Exchange Robots for the Structural Biology Beam Lines at the Photon Factory

    SciTech Connect

    Hiraki, Masahiko; Watanabe, Shokei; Yamada, Yusuke; Matsugaki, Naohiro; Igarashi, Noriyuki; Gaponov, Yurii; Wakatsuki, Soichi

    2007-01-19

    We are now developing automated sample exchange robots for high-throughput protein crystallographic experiments for onsite use at synchrotron beam lines. It is part of the fully automated robotics systems being developed at the Photon Factory, for the purposes of protein crystallization, monitoring crystal growth, harvesting and freezing crystals, mounting the crystals inside a hutch and for data collection. We have already installed the sample exchange robots based on the SSRL automated mounting system at our insertion device beam lines BL-5A and AR-NW12A at the Photon Factory. In order to reduce the time required for sample exchange further, a prototype of a double-tonged system was developed. As a result of preliminary experiments with double-tonged robots, the sample exchange time was successfully reduced from 70 seconds to 10 seconds with the exception of the time required for pre-cooling and warming up the tongs.

  1. Automated Research Impact Assessment: A New Bibliometrics Approach

    PubMed Central

    Drew, Christina H.; Pettibone, Kristianna G.; Finch, Fallis Owen; Giles, Douglas; Jordan, Paul

    2016-01-01

    As federal programs are held more accountable for their research investments, The National Institute of Environmental Health Sciences (NIEHS) has developed a new method to quantify the impact of our funded research on the scientific and broader communities. In this article we review traditional bibliometric analyses, address challenges associated with them, and describe a new bibliometric analysis method, the Automated Research Impact Assessment (ARIA). ARIA taps into a resource that has only rarely been used for bibliometric analyses: references cited in “important” research artifacts, such as policies, regulations, clinical guidelines, and expert panel reports. The approach includes new statistics that science managers can use to benchmark contributions to research by funding source. This new method provides the ability to conduct automated impact analyses of federal research that can be incorporated in program evaluations. We apply this method to several case studies to examine the impact of NIEHS funded research. PMID:26989272

  2. The Stanford Automated Mounter: pushing the limits of sample exchange at the SSRL macromolecular crystallography beamlines

    PubMed Central

    Russi, Silvia; Song, Jinhu; McPhillips, Scott E.; Cohen, Aina E.

    2016-01-01

    The Stanford Automated Mounter System, a system for mounting and dismounting cryo-cooled crystals, has been upgraded to increase the throughput of samples on the macromolecular crystallography beamlines at the Stanford Synchrotron Radiation Lightsource. This upgrade speeds up robot maneuvers, reduces the heating/drying cycles, pre-fetches samples and adds an air-knife to remove frost from the gripper arms. Sample pin exchange during automated crystal quality screening now takes about 25 s, five times faster than before this upgrade. PMID:27047309

  3. The Stanford Automated Mounter: Pushing the limits of sample exchange at the SSRL macromolecular crystallography beamlines

    DOE PAGES Beta

    Russi, Silvia; Song, Jinhu; McPhillips, Scott E.; Cohen, Aina E.

    2016-02-24

    The Stanford Automated Mounter System, a system for mounting and dismounting cryo-cooled crystals, has been upgraded to increase the throughput of samples on the macromolecular crystallography beamlines at the Stanford Synchrotron Radiation Lightsource. This upgrade speeds up robot maneuvers, reduces the heating/drying cycles, pre-fetches samples and adds an air-knife to remove frost from the gripper arms. As a result, sample pin exchange during automated crystal quality screening now takes about 25 s, five times faster than before this upgrade.

  4. An automated atmospheric sampling system operating on 747 airliners

    NASA Technical Reports Server (NTRS)

    Perkins, P. J.; Gustafsson, U. R. C.

    1976-01-01

    An air sampling system that automatically measures the temporal and spatial distribution of particulate and gaseous constituents of the atmosphere is collecting data on commercial air routes covering the world. Measurements are made in the upper troposphere and lower stratosphere (6 to 12 km) of constituents related to aircraft engine emissions and other pollutants. Aircraft operated by different airlines sample air at latitudes from the Arctic to Australia. This unique system includes specialized instrumentation, a special air inlet probe for sampling outside air, a computerized automatic control, and a data acquisition system. Air constituent and related flight data are tape recorded in flight for later computer processing on the ground.

  5. Automated biowaste sampling system, solids subsystem operating model, part 2

    NASA Technical Reports Server (NTRS)

    Fogal, G. L.; Mangialardi, J. K.; Stauffer, R. E.

    1973-01-01

    The detail design and fabrication of the Solids Subsystem were implemented. The system's capacity for the collection, storage or sampling of feces and vomitus from six subjects was tested and verified.

  6. An automated atmospheric sampling system operating on 747 airliners

    NASA Technical Reports Server (NTRS)

    Perkins, P.; Gustafsson, U. R. C.

    1975-01-01

    An air sampling system that automatically measures the temporal and spatial distribution of selected particulate and gaseous constituents of the atmosphere has been installed on a number of commercial airliners and is collecting data on commercial air routes covering the world. Measurements of constituents related to aircraft engine emissions and other pollutants are made in the upper troposphere and lower stratosphere (6 to 12 km) in support of the Global Air Sampling Program (GASP). Aircraft operated by different airlines sample air at latitudes from the Arctic to Australia. This system includes specialized instrumentation for measuring carbon monoxide, ozone, water vapor, and particulates, a special air inlet probe for sampling outside air, a computerized automatic control, and a data acquisition system. Air constituents and related flight data are tape recorded in flight for later computer processing on the ground.

  7. An Automated Sample Divider for Farmers Stock Peanuts

    Technology Transfer Automated Retrieval System (TEKTRAN)

    In-shell peanuts are harvested, loaded into drying trailers, and delivered to a central facility where they are dried to a moisture content safe for long term storage, sampled, graded, then unloaded into bulk storage. Drying trailers have capacities ranging from five to twenty-five tons of dry farme...

  8. Automated biowaste sampling system urine subsystem operating model, part 1

    NASA Technical Reports Server (NTRS)

    Fogal, G. L.; Mangialardi, J. K.; Rosen, F.

    1973-01-01

    The urine subsystem automatically provides for the collection, volume sensing, and sampling of urine from six subjects during space flight. Verification of the subsystem design was a primary objective of the current effort, which was accomplished through the detail design, fabrication, and verification testing of an operating model of the subsystem.

  9. An Automated Algorithm to Screen Massive Training Samples for a Global Impervious Surface Classification

    NASA Technical Reports Server (NTRS)

    Tan, Bin; Brown de Colstoun, Eric; Wolfe, Robert E.; Tilton, James C.; Huang, Chengquan; Smith, Sarah E.

    2012-01-01

    An algorithm is developed to automatically screen outliers from the massive training samples for the Global Land Survey - Imperviousness Mapping Project (GLS-IMP). GLS-IMP will produce a global 30 m spatial resolution impervious cover data set for the years 2000 and 2010 based on the Landsat Global Land Survey (GLS) data set. This unprecedented high-resolution impervious cover data set is not only significant to urbanization studies but also desired by global carbon, hydrology, and energy balance research. A supervised classification method, the regression tree, is applied in this project, and a set of accurate training samples is the key to supervised classification. Here we develop global-scale training samples from fine resolution (about 1 m) satellite data (Quickbird and Worldview2) and then aggregate the fine resolution impervious cover map to 30 m resolution. In order to improve the classification accuracy, the training samples should be screened before being used to train the regression tree. It is impossible to manually screen 30 m resolution training samples collected globally; in Europe alone there are 174 training sites, ranging in size from 4.5 km by 4.5 km to 8.1 km by 3.6 km, and the number of training samples exceeds six million. Therefore, we developed this automated, statistics-based algorithm to screen the training samples at two levels: the site level and the scene level. At the site level, all training samples are divided into 10 groups according to the percentage of impervious surface within a sample pixel; the samples falling in each 10% interval form one group. For each group, both univariate and multivariate outliers are detected and removed, as sketched below. The screening then escalates to the scene level, where a similar screening process with a looser threshold is applied to account for possible variance due to site differences. We do not perform the screening process across scenes because the scenes might vary due to
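
    The sketch below illustrates the site-level screening idea on invented data: bin samples by impervious-cover percentage, then drop outliers within each bin. The robust z-score rule and threshold are assumptions; the actual algorithm also removes multivariate outliers and repeats the screening at the scene level with a looser threshold.

```python
# Sketch of per-bin univariate outlier screening using a robust z-score (median/MAD).
import numpy as np

def screen_training_samples(impervious_pct, feature, z_max=3.5):
    impervious_pct = np.asarray(impervious_pct, float)
    feature = np.asarray(feature, float)
    keep = np.ones(len(feature), dtype=bool)
    bins = (impervious_pct // 10).astype(int)          # 0-9%, 10-19%, ... groups
    for b in np.unique(bins):
        idx = np.where(bins == b)[0]
        med = np.median(feature[idx])
        mad = np.median(np.abs(feature[idx] - med))
        if mad > 0:
            robust_z = np.abs(feature[idx] - med) / (1.4826 * mad)
            keep[idx] = robust_z <= z_max
    return keep

pct  = np.array([5, 7, 6, 8, 4, 55, 52, 58])
refl = np.array([0.12, 0.13, 0.11, 0.14, 0.55, 0.31, 0.30, 0.32])  # 5th value is suspect in its bin
print(screen_training_samples(pct, refl))   # the suspect sample is flagged False
```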

  10. Assessing respondent-driven sampling.

    PubMed

    Goel, Sharad; Salganik, Matthew J

    2010-04-13

    Respondent-driven sampling (RDS) is a network-based technique for estimating traits in hard-to-reach populations, for example, the prevalence of HIV among drug injectors. In recent years RDS has been used in more than 120 studies in more than 20 countries and by leading public health organizations, including the Centers for Disease Control and Prevention in the United States. Despite the widespread use and growing popularity of RDS, there has been little empirical validation of the methodology. Here we investigate the performance of RDS by simulating sampling from 85 known, network populations. Across a variety of traits we find that RDS is substantially less accurate than generally acknowledged and that reported RDS confidence intervals are misleadingly narrow. Moreover, because we model a best-case scenario in which the theoretical RDS sampling assumptions hold exactly, it is unlikely that RDS performs any better in practice than in our simulations. Notably, the poor performance of RDS is driven not by the bias but by the high variance of estimates, a possibility that had been largely overlooked in the RDS literature. Given the consistency of our results across networks and our generous sampling conditions, we conclude that RDS as currently practiced may not be suitable for key aspects of public health surveillance where it is now extensively applied. PMID:20351258
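
    The diagnosis above rests on separating bias from variance across repeated simulated samples; a minimal sketch of that decomposition on synthetic estimates (not actual RDS chains over network populations):

```python
# Sketch of a bias/variance decomposition of estimator error over repeated simulated surveys.
import numpy as np

rng = np.random.default_rng(0)
true_prevalence = 0.20
estimates = true_prevalence + rng.normal(loc=0.01, scale=0.08, size=1000)  # 1000 simulated surveys

bias = estimates.mean() - true_prevalence
variance = estimates.var()
mse = np.mean((estimates - true_prevalence) ** 2)
print(f"bias={bias:.3f}, variance={variance:.4f}, mse={mse:.4f}")
# a small bias with a large variance reproduces the pattern reported for RDS
```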

  11. Assessing respondent-driven sampling

    PubMed Central

    Goel, Sharad; Salganik, Matthew J.

    2010-01-01

    Respondent-driven sampling (RDS) is a network-based technique for estimating traits in hard-to-reach populations, for example, the prevalence of HIV among drug injectors. In recent years RDS has been used in more than 120 studies in more than 20 countries and by leading public health organizations, including the Centers for Disease Control and Prevention in the United States. Despite the widespread use and growing popularity of RDS, there has been little empirical validation of the methodology. Here we investigate the performance of RDS by simulating sampling from 85 known, network populations. Across a variety of traits we find that RDS is substantially less accurate than generally acknowledged and that reported RDS confidence intervals are misleadingly narrow. Moreover, because we model a best-case scenario in which the theoretical RDS sampling assumptions hold exactly, it is unlikely that RDS performs any better in practice than in our simulations. Notably, the poor performance of RDS is driven not by the bias but by the high variance of estimates, a possibility that had been largely overlooked in the RDS literature. Given the consistency of our results across networks and our generous sampling conditions, we conclude that RDS as currently practiced may not be suitable for key aspects of public health surveillance where it is now extensively applied. PMID:20351258

  12. Automated Assessment of the Quality of Depression Websites

    PubMed Central

    Tang, Thanh Tin; Hawking, David; Christensen, Helen

    2005-01-01

    Background Since health information on the World Wide Web is of variable quality, methods are needed to assist consumers to identify health websites containing evidence-based information. Manual assessment tools may assist consumers to evaluate the quality of sites. However, these tools are poorly validated and often impractical. There is a need to develop better consumer tools, and in particular to explore the potential of automated procedures for evaluating the quality of health information on the web. Objective This study (1) describes the development of an automated quality assessment procedure (AQA) designed to automatically rank depression websites according to their evidence-based quality; (2) evaluates the validity of the AQA relative to human rated evidence-based quality scores; and (3) compares the validity of Google PageRank and the AQA as indicators of evidence-based quality. Method The AQA was developed using a quality feedback technique and a set of training websites previously rated manually according to their concordance with statements in the Oxford University Centre for Evidence-Based Mental Health’s guidelines for treating depression. The validation phase involved 30 websites compiled from the DMOZ, Yahoo! and LookSmart Depression Directories by randomly selecting six sites from each of the Google PageRank bands of 0, 1-2, 3-4, 5-6 and 7-8. Evidence-based ratings from two independent raters (based on concordance with the Oxford guidelines) were then compared with scores derived from the automated AQA and Google algorithms. There was no overlap in the websites used in the training and validation phases of the study. Results The correlation between the AQA score and the evidence-based ratings was high and significant (r=0.85, P<.001). Addition of a quadratic component improved the fit, the combined linear and quadratic model explaining 82 percent of the variance. The correlation between Google PageRank and the evidence-based score was lower than
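
    The Results compare a linear fit with a linear-plus-quadratic fit of the evidence-based score on the AQA score; a sketch of that comparison on invented data:

```python
# Sketch of comparing explained variance for linear vs. linear+quadratic fits (data invented).
import numpy as np

rng = np.random.default_rng(2)
aqa = rng.uniform(0, 1, 30)                                   # automated quality scores
evidence = 5 + 20 * aqa + 12 * aqa**2 + rng.normal(0, 2, 30)  # synthetic human ratings

def r_squared(y, y_hat):
    ss_res = np.sum((y - y_hat) ** 2)
    ss_tot = np.sum((y - y.mean()) ** 2)
    return 1 - ss_res / ss_tot

lin = np.polyval(np.polyfit(aqa, evidence, 1), aqa)
quad = np.polyval(np.polyfit(aqa, evidence, 2), aqa)
print(f"linear R^2 = {r_squared(evidence, lin):.2f}, "
      f"linear+quadratic R^2 = {r_squared(evidence, quad):.2f}")
```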

  13. Automated Portable Test System (APTS) - A performance envelope assessment tool

    NASA Technical Reports Server (NTRS)

    Kennedy, R. S.; Dunlap, W. P.; Jones, M. B.; Wilkes, R. L.; Bittner, A. C., Jr.

    1985-01-01

    The reliability and stability of microcomputer-based psychological tests are evaluated. The hardware, test programs, and system control of the Automated Portable Test System, which assesses human performance and subjective status, are described. Subjects were administered 11 pen-and-pencil and microcomputer-based tests for 10 sessions. The data reveal that nine of the 10 tests stabilized by the third administration; inertial correlations were high and consistent. It is noted that the microcomputer-based tests display good psychometric properties in terms of differential stability and reliability.

  14. Core sampling system spare parts assessment

    SciTech Connect

    Walter, E.J.

    1995-04-04

    Soon, there will be 4 independent core sampling systems obtaining samples from the underground tanks. It is desirable that these systems be available for sampling during the next 2 years. This assessment was prepared to evaluate the adequacy of the spare parts identified for the core sampling system and to provide recommendations that may remediate overages or inadequacies of spare parts.

  15. Automated Geospatial Watershed Assessment Tool (AGWA): Applications for Fire Management and Assessment.

    EPA Science Inventory

    New tools and functionality have been incorporated into the Automated Geospatial Watershed Assessment Tool (AGWA) to assess the impacts of wildland fire on runoff and erosion. AGWA (see: www.tucson.ars.ag.gov/agwa or http://www.epa.gov/esd/land-sci/agwa/) is a GIS interface joi...

  16. Assessing Working Memory in Spanish-Speaking Children: Automated Working Memory Assessment Battery Adaptation

    ERIC Educational Resources Information Center

    Injoque-Ricle, Irene; Calero, Alejandra D.; Alloway, Tracy P.; Burin, Debora I.

    2011-01-01

    The Automated Working Memory Assessment battery was designed to assess verbal and visuospatial passive and active working memory processing in children and adolescents. The aim of this paper is to present the adaptation and validation of the AWMA battery to Argentinean Spanish-speaking children aged 6 to 11 years. Verbal subtests were adapted and…

  17. Automated sample mounting and alignment system for biological crystallography at a synchrotron source.

    PubMed

    Snell, Gyorgy; Cork, Carl; Nordmeyer, Robert; Cornell, Earl; Meigs, George; Yegian, Derek; Jaklevic, Joseph; Jin, Jian; Stevens, Raymond C; Earnest, Thomas

    2004-04-01

    High-throughput data collection for macromolecular crystallography requires an automated sample mounting and alignment system for cryo-protected crystals that functions reliably when integrated into protein-crystallography beamlines at synchrotrons. Rapid mounting and dismounting of the samples increases the efficiency of the crystal screening and data collection processes, where many crystals can be tested for the quality of diffraction. The sample-mounting subsystem has random access to 112 samples, stored under liquid nitrogen. Results of extensive tests regarding the performance and reliability of the system are presented. To further increase throughput, we have also developed a sample transport/storage system based on "puck-shaped" cassettes, which can hold sixteen samples each. Seven cassettes fit into a standard dry shipping Dewar. The capabilities of a robotic crystal mounting and alignment system with instrumentation control software and a relational database allows for automated screening and data collection to be developed. PMID:15062077

  18. Automated sample mounting and alignment system for biological crystallography at a synchrotron source

    SciTech Connect

    Snell, Gyorgy; Cork, Carl; Nordmeyer, Robert; Cornell, Earl; Meigs, George; Yegian, Derek; Jaklevic, Joseph; Jin, Jian; Stevens, Raymond C.; Earnest, Thomas

    2004-01-07

    High-throughput data collection for macromolecular crystallography requires an automated sample mounting system for cryo-protected crystals that functions reliably when integrated into protein-crystallography beamlines at synchrotrons. Rapid mounting and dismounting of the samples increases the efficiency of the crystal screening and data collection processes, where many crystals can be tested for the quality of diffraction. The sample-mounting subsystem has random access to 112 samples, stored under liquid nitrogen. Results of extensive tests regarding the performance and reliability of the system are presented. To further increase throughput, we have also developed a sample transport/storage system based on "puck-shaped" cassettes, which can hold sixteen samples each. Seven cassettes fit into a standard dry shipping Dewar. The capabilities of a robotic crystal mounting and alignment system with instrumentation control software and a relational database allow for automated screening and data collection to be developed.

  19. Automated syringe sampler. [remote sampling of air and water

    NASA Technical Reports Server (NTRS)

    Purgold, G. C. (Inventor)

    1981-01-01

    A number of sampling devices are disposed in a rack which slides into a housing. In response to a signal from an antenna, the circuitry elements are activated which provide power individually, collectively, or selectively to a servomechanism, thereby moving an actuator arm and the attached jawed bracket supporting an evacuated tube towards a stationary needle. One open end of the needle extends through the side wall of a conduit to the interior and the other open end is maintained within the protective sleeve, supported by a bifurcated bracket. A septum is punctured by the end of the needle within the sleeve and a sample of the fluid medium in the conduit flows through the needle and is transferred to a tube. The signal to the servo is then reversed and the actuator arm moves the tube back to its original position, permitting the septum to expand and seal the hole made by the needle. The jawed bracket is attached by pivot to the actuator to facilitate tube replacement.

  20. Automated Sample Preparation for Radiogenic and Non-Traditional Metal Isotopes: Removing an Analytical Barrier for High Sample Throughput

    NASA Astrophysics Data System (ADS)

    Field, M. Paul; Romaniello, Stephen; Gordon, Gwyneth W.; Anbar, Ariel D.; Herrmann, Achim; Martinez-Boti, Miguel A.; Anagnostou, Eleni; Foster, Gavin L.

    2014-05-01

    MC-ICP-MS has dramatically improved the analytical throughput for high-precision radiogenic and non-traditional isotope ratio measurements, compared to TIMS. The generation of large data sets, however, remains hampered by the tedious manual drip chromatography required for sample purification. A new, automated chromatography system reduces this laboratory bottleneck and expands the utility of high-precision isotope analyses in applications where large data sets are required: geochemistry, forensic anthropology, nuclear forensics, medical research and food authentication. We have developed protocols to automate ion exchange purification for several isotopic systems (B, Ca, Fe, Cu, Zn, Sr, Cd, Pb and U) using the new prepFAST-MC™ (ESI, Omaha, Nebraska). The system is not only inert (all-fluoropolymer flow paths), but is also very flexible and can easily accommodate different resins, samples, and reagent types. When programmed, precise and accurate user-defined volumes and flow rates are implemented to automatically load samples, wash the column, condition the column and elute fractions. Unattended, the automated, low-pressure ion exchange chromatography system can process up to 60 samples overnight. Excellent reproducibility, reliability and recovery, with low blanks and carry-over, have been demonstrated for samples in a variety of different matrices, giving accurate and precise isotopic ratios within analytical error for several isotopic systems (B, Ca, Fe, Cu, Zn, Sr, Cd, Pb and U). This illustrates the potential of the new prepFAST-MC™ (ESI, Omaha, Nebraska) as a powerful tool in radiogenic and non-traditional isotope research.

  1. An automated integrated platform for rapid and sensitive multiplexed protein profiling using human saliva samples

    PubMed Central

    Nie, Shuai; Henley, W. Hampton; Miller, Scott E.; Zhang, Huaibin; Mayer, Kathryn M.; Dennis, Patty J.; Oblath, Emily A.; Alarie, Jean Pierre; Wu, Yue; Oppenheim, Frank G.; Little, Frédéric F.; Uluer, Ahmet Z.; Wang, Peidong; Ramsey, J. Michael

    2014-01-01

    During the last decade, saliva has emerged as a potentially ideal diagnostic biofluid for noninvasive testing. In this paper, we present an automated, integrated platform useable by minimally trained personnel in the field for the diagnosis of respiratory diseases using human saliva as a sample specimen. In this platform, a saliva sample is loaded onto a disposable microfluidic chip containing all the necessary reagents and components required for saliva analysis. The chip is then inserted into the automated analyzer, the SDReader, where multiple potential protein biomarkers for respiratory diseases are measured simultaneously using a microsphere-based array via fluorescence sandwich immunoassays. The results are read optically, and the images are analyzed by a custom-designed algorithm. The fully automated assay requires as little as 10 μL of saliva sample, and the results are reported in 70 min. The performance of the platform was characterized by testing protein standard solutions, and the results were comparable to those from the 3.5-h lab bench assay that we have previously reported. The device was also deployed in two clinical environments where 273 human saliva samples collected from different subjects were successfully tested, demonstrating the device’s potential to assist clinicians with the diagnosis of respiratory diseases by providing timely protein biomarker profiling information. This platform, which combines non-invasive sample collection and fully automated analysis, can also be utilized in point-of-care diagnostics. PMID:24448498

  2. Automated LSA Assessment of Summaries in Distance Education: Some Variables to Be Considered

    ERIC Educational Resources Information Center

    Jorge-Botana, Guillermo; Luzón, José M.; Gómez-Veiga, Isabel; Martín-Cordero, Jesús I.

    2015-01-01

    A latent semantic analysis-based automated summary assessment is described; this automated system is applied to a real learning-from-text task in a Distance Education context. We comment on the use of automated content, plagiarism, and text coherence measures, as well as average word weights, and their impact on predicting human judges' summary scoring. A…

  3. Investigation of Automated Sampling Techniques to Measure Total Mercury in Stream-Water During Storm-Events

    NASA Astrophysics Data System (ADS)

    Riscassi, A. L.; Scanlon, T. M.

    2008-12-01

    High-flow events (storms and snowmelt) are a dominant transport mechanism for total mercury (HgT) from the terrestrial to the aqueous environment. High-gradient headwater catchments are a primary source of downstream contamination because they store large pools of Hg in soils and sediments. Consistent, high-frequency event-sampling of headwater streams is rare, however, because of the unpredictability of high flows, remoteness of sites, and the difficulties associated with the ultra-clean sampling procedures. The use of automated sampling techniques with an ISCO® sampler has been demonstrated in several studies for trace metals, but their use for collection of HgT samples has not been systematically evaluated in the literature. Even with clean equipment at deployment, subsequent contamination and loss by evasion are possible considering the bottles, as currently designed, are open to the atmosphere before sampling and until retrieval. Field tests are conducted using an ISCO® sampler retrofitted with pre-cleaned Teflon® sampling lines and glass bottles to determine the relative errors associated with the automated sampling method for a variety of HgT concentrations and preservation techniques. Differences between quality assurance and quality control results for automated and manual sampling are also investigated. Sample containers are filled with known standards of HgT solution and left in the ISCO® containers at the field site and each day (up to 7 days) are capped and returned for analysis. During a storm event, manual samples are taken from the middle of the water column concurrently with the ISCO® at hourly intervals using "clean hands" procedures. Evaluations of results are used to establish quality assurance guidelines for future field campaigns using automated techniques for HgT sampling.

  4. Automated Tobacco Assessment and Cessation Support for Cancer Patients

    PubMed Central

    Warren, Graham W.; Marshall, James R.; Cummings, K. Michael; Zevon, Michael A.; Reed, Robert; Hysert, Pat; Mahoney, Martin C.; Hyland, Andrew J.; Nwogu, Chukwumere; Demmy, Todd; Dexter, Elisabeth; Kelly, Maureen; O’Connor, Richard J.; Houstin, Teresa; Jenkins, Dana; Germain, Pamela; Singh, Anurag K.; Epstein, Jennifer; Dobson Amato, Katharine A.; Reid, Mary E.

    2015-01-01

    BACKGROUND Tobacco assessment and cessation support are not routinely included in cancer care. An automated tobacco assessment and cessation program was developed to increase the delivery of tobacco cessation support for cancer patients. METHODS A structured tobacco assessment was incorporated into the electronic health record at Roswell Park Cancer Institute to identify tobacco use in cancer patients at diagnosis and during follow-up. All patients who reported tobacco use within the past 30 days were automatically referred to a dedicated cessation program that provided cessation counseling. Data were analyzed for referral accuracy and interest in cessation support. RESULTS Between October 2010 and December 2012, 11,868 patients were screened for tobacco use, and 2765 were identified as tobacco users and were referred to the cessation service. Of the referred patients, 1381 received only a mailed invitation to contact the cessation service, and 1384 received a mailing as well as telephone contact attempts from the cessation service. Of the 1126 (81.4%) patients contacted by telephone, 51 (4.5%) reported no tobacco use within the past 30 days, 35 (3.1%) were medically unable to participate, and 30 (2.7%) declined participation. Of the 1381 patients who received only a mailed invitation, 16 (1.2%) contacted the cessation program for assistance. Three questions at initial consult and follow-up generated over 98% of referrals. Tobacco assessment frequency every 4 weeks delayed referral in <1% of patients. CONCLUSIONS An automated electronic health record-based tobacco assessment and cessation referral program can identify substantial numbers of smokers who are receptive to enrollment in a cessation support service. PMID:24496870

  5. Automated laboratory based X-ray beamline with multi-capillary sample chamber

    NASA Astrophysics Data System (ADS)

    Purushothaman, S.; Gauthé, B. L. L. E.; Brooks, N. J.; Templer, R. H.; Ces, O.

    2013-08-01

    An automated laboratory based X-ray beamline with a multi-capillary sample chamber capable of undertaking small angle X-ray scattering measurements on a maximum of 104 samples at a time as a function of temperature between 5 and 85 °C has been developed. The modular format of the system enables the user to simultaneously equilibrate samples at eight different temperatures with an accuracy of ±0.005 °C. This system couples a rotating anode generator and 2D optoelectronic detector with Franks X-ray optics, leading to typical exposure times of less than 5 min for lyotropic liquid crystalline samples. Beamline control including sample exchange and data acquisition has been fully automated via a custom designed LabVIEW framework.

  6. Automated laboratory based X-ray beamline with multi-capillary sample chamber

    SciTech Connect

    Purushothaman, S.; Gauthé, B. L. L. E.; Brooks, N. J.; Templer, R. H.; Ces, O.

    2013-08-15

    An automated laboratory based X-ray beamline with a multi-capillary sample chamber capable of undertaking small angle X-ray scattering measurements on a maximum of 104 samples at a time as a function of temperature between 5 and 85 °C has been developed. The modular format of the system enables the user to simultaneously equilibrate samples at eight different temperatures with an accuracy of ±0.005 °C. This system couples a rotating anode generator and 2D optoelectronic detector with Franks X-ray optics, leading to typical exposure times of less than 5 min for lyotropic liquid crystalline samples. Beamline control including sample exchange and data acquisition has been fully automated via a custom designed LabVIEW framework.

  7. Automated bone age assessment of older children using the radius

    NASA Astrophysics Data System (ADS)

    Tsao, Sinchai; Gertych, Arkadiusz; Zhang, Aifeng; Liu, Brent J.; Huang, Han K.

    2008-03-01

    The Digital Hand Atlas in Assessment of Skeletal Development is a large-scale Computer Aided Diagnosis (CAD) project for automating the process of grading Skeletal Development of children from 0-18 years of age. It includes a complete collection of 1,400 normal hand X-rays of children aged 0-18 years. Bone Age Assessment is used as an index of skeletal development for detection of growth pathologies that can be related to endocrine disorders, malnutrition and other disease types. Previous work at the Image Processing and Informatics Lab (IPILab) allowed the bone age CAD algorithm to accurately assess the bone age of children from 1 to 16 (male) or 14 (female) years of age using the Phalanges as well as the Carpal Bones. At older ages (16 (male) or 14 (female) to 19 years of age) the Phalanges and Carpal Bones are fully developed and do not provide well-defined features for accurate bone age assessment. Therefore, integration of the Radius Bone as a region of interest (ROI) is greatly needed and will significantly improve the ability to accurately assess the bone age of older children. Preliminary studies show that an integrated Bone Age CAD that utilizes the Phalanges, Carpal Bones and Radius forms a robust method for automatic bone age assessment throughout the entire age range (1-19 years of age).

  8. Automated Video Quality Assessment for Deep-Sea Video

    NASA Astrophysics Data System (ADS)

    Pirenne, B.; Hoeberechts, M.; Kalmbach, A.; Sadhu, T.; Branzan Albu, A.; Glotin, H.; Jeffries, M. A.; Bui, A. O. V.

    2015-12-01

    Video provides a rich source of data for geophysical analysis, often supplying detailed information about the environment when other instruments may not. This is especially true of deep-sea environments, where direct visual observations cannot be made. As computer vision techniques improve and volumes of video data increase, automated video analysis is emerging as a practical alternative to labor-intensive manual analysis. Automated techniques can be much more sensitive to video quality than their manual counterparts, so performing quality assessment before doing full analysis is critical to producing valid results. Ocean Networks Canada (ONC), an initiative of the University of Victoria, operates cabled ocean observatories that supply continuous power and Internet connectivity to a broad suite of subsea instruments from the coast to the deep sea, including video and still cameras. This network of ocean observatories has produced almost 20,000 hours of video (about 38 hours are recorded each day) and an additional 8,000 hours of logs from remotely operated vehicle (ROV) dives. We begin by surveying some ways in which deep-sea video poses challenges for automated analysis, including: 1. Non-uniform lighting: Single, directional light sources produce uneven luminance distributions and shadows; remotely operated lighting equipment is also susceptible to technical failures. 2. Particulate noise: Turbidity and marine snow are often present in underwater video; particles in the water column can have sharper focus and higher contrast than the objects of interest due to their proximity to the light source and can also influence the camera's autofocus and auto white-balance routines. 3. Color distortion (low contrast): The rate of absorption of light in water varies by wavelength, and is higher overall than in air, altering apparent colors and lowering the contrast of objects at a distance. We also describe measures under development at ONC for detecting and mitigating
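
    To make the idea of automated quality screening concrete, the sketch below computes two simple per-frame descriptors related to the challenges listed above (uneven lighting and low contrast) and applies a hypothetical acceptance rule. It is an illustration only, not ONC's pipeline; the function names and thresholds are assumptions.

    ```python
    # Illustrative sketch (not ONC's pipeline): per-frame quality screening using
    # luminance uniformity and global contrast. Thresholds are hypothetical.
    import numpy as np

    def frame_quality_metrics(frame_gray: np.ndarray) -> dict:
        """Compute simple quality descriptors for an 8-bit grayscale video frame."""
        f = frame_gray.astype(np.float64) / 255.0
        h, w = f.shape
        # Non-uniform lighting: coefficient of variation of 8x8 block mean luminance
        blocks = f[: h // 8 * 8, : w // 8 * 8].reshape(h // 8, 8, w // 8, 8).mean(axis=(1, 3))
        lighting_cv = blocks.std() / (blocks.mean() + 1e-9)
        # Low contrast: RMS contrast of the whole frame
        rms_contrast = f.std()
        return {"lighting_cv": lighting_cv, "rms_contrast": rms_contrast}

    def is_usable(metrics: dict, max_cv: float = 0.6, min_contrast: float = 0.05) -> bool:
        """Hypothetical acceptance rule: reject very uneven lighting or very low contrast."""
        return metrics["lighting_cv"] <= max_cv and metrics["rms_contrast"] >= min_contrast

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        frame = rng.uniform(0, 255, size=(480, 640)).astype(np.uint8)  # stand-in frame
        m = frame_quality_metrics(frame)
        print(m, is_usable(m))
    ```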

  9. An automated system for global atmospheric sampling using B-747 airliners

    NASA Technical Reports Server (NTRS)

    Lew, K. Q.; Gustafsson, U. R. C.; Johnson, R. E.

    1981-01-01

    The global air sampling program utilizes commercial aircraft in scheduled service to measure atmospheric constituents. A fully automated system designed for the 747 aircraft is described. Airline operational constraints and data and control subsystems are treated. The overall program management, system monitoring, and data retrieval from four aircraft in global service are described.

  10. Automated gas sampling system for laboratory analysis of CH4 and N2O

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Analyzing the flux of CH4 and N2O from soil is labor intensive when conventional hand injection techniques are utilized in gas chromatography. An automated gas sampling system was designed and assembled from a prototype developed at the National Soil Tilth Laboratory in Ames, Iowa. The sampler was e...

  11. Fully Automated Sample Preparation for Ultrafast N-Glycosylation Analysis of Antibody Therapeutics.

    PubMed

    Szigeti, Marton; Lew, Clarence; Roby, Keith; Guttman, Andras

    2016-04-01

    There is a growing demand in the biopharmaceutical industry for high-throughput, large-scale N-glycosylation profiling of therapeutic antibodies in all phases of product development, but especially during clone selection when hundreds of samples should be analyzed in a short period of time to assure their glycosylation-based biological activity. Our group has recently developed a magnetic bead-based protocol for N-glycosylation analysis of glycoproteins to alleviate the hard-to-automate centrifugation and vacuum-centrifugation steps of the currently used protocols. Glycan release, fluorophore labeling, and cleanup were all optimized, resulting in a <4 h magnetic bead-based process with excellent yield and good repeatability. This article demonstrates the next level of this work by automating all steps of the optimized magnetic bead-based protocol from endoglycosidase digestion, through fluorophore labeling and cleanup with high-throughput sample processing in 96-well plate format, using an automated laboratory workstation. Capillary electrophoresis analysis of the fluorophore-labeled glycans was also optimized for rapid (<3 min) separation to accommodate the high-throughput processing of the automated sample preparation workflow. Ultrafast N-glycosylation analyses of several commercially relevant antibody therapeutics are also shown and compared to their biosimilar counterparts, addressing the biological significance of the differences. PMID:26429557

  12. Automated quality assessment in three-dimensional breast ultrasound images.

    PubMed

    Schwaab, Julia; Diez, Yago; Oliver, Arnau; Martí, Robert; van Zelst, Jan; Gubern-Mérida, Albert; Mourri, Ahmed Bensouda; Gregori, Johannes; Günther, Matthias

    2016-04-01

    Automated three-dimensional breast ultrasound (ABUS) is a valuable adjunct to x-ray mammography for breast cancer screening of women with dense breasts. High image quality is essential for proper diagnostics and computer-aided detection. We propose an automated image quality assessment system for ABUS images that detects artifacts at the time of acquisition. Therefore, we study three aspects that can corrupt ABUS images: the nipple position relative to the rest of the breast, the shadow caused by the nipple, and the shape of the breast contour on the image. Image processing and machine learning algorithms are combined to detect these artifacts based on 368 clinical ABUS images that have been rated manually by two experienced clinicians. At a specificity of 0.99, 55% of the images that were rated as low quality are detected by the proposed algorithms. The areas under the ROC curves of the single classifiers are 0.99 for the nipple position, 0.84 for the nipple shadow, and 0.89 for the breast contour shape. The proposed algorithms work fast and reliably, which makes them adequate for online evaluation of image quality during acquisition. The presented concept may be extended to further image modalities and quality aspects. PMID:27158633
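
    The reported performance figures (areas under ROC curves and sensitivity at a fixed specificity of 0.99) can be reproduced from classifier scores as in the brief sketch below. The scores and labels are synthetic; this is not the authors' ABUS classifier, only a hedged illustration of how such figures are computed with scikit-learn.

    ```python
    # Minimal sketch (synthetic scores): ROC AUC and sensitivity at specificity 0.99
    # for a hypothetical image-quality classifier.
    import numpy as np
    from sklearn.metrics import roc_auc_score, roc_curve

    rng = np.random.default_rng(3)
    labels = np.concatenate([np.ones(60), np.zeros(300)])             # 1 = low-quality image
    scores = np.concatenate([rng.normal(1.5, 1.0, 60), rng.normal(0.0, 1.0, 300)])

    auc = roc_auc_score(labels, scores)
    fpr, tpr, _ = roc_curve(labels, scores)
    sens_at_spec99 = tpr[fpr <= 0.01].max() if np.any(fpr <= 0.01) else 0.0
    print(f"AUC={auc:.2f}, sensitivity at specificity 0.99 = {sens_at_spec99:.2f}")
    ```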

  13. Automated navigation assessment for earth survey sensors using island targets

    NASA Technical Reports Server (NTRS)

    Patt, Frederick S.; Woodward, Robert H.; Gregg, Watson W.

    1997-01-01

    An automated method has been developed for performing navigation assessment on satellite-based Earth sensor data. The method utilizes islands as targets which can be readily located in the sensor data and identified with reference locations. The essential elements are an algorithm for classifying the sensor data according to source, a reference catalog of island locations, and a robust pattern-matching algorithm for island identification. The algorithms were developed and tested for the Sea-viewing Wide Field-of-view Sensor (SeaWiFS), an ocean color sensor. This method will allow navigation error statistics to be automatically generated for large numbers of points, supporting analysis over large spatial and temporal ranges.

  14. Automated sample-changing robot for solution scattering experiments at the EMBL Hamburg SAXS station X33.

    PubMed

    Round, A R; Franke, D; Moritz, S; Huchler, R; Fritsche, M; Malthan, D; Klaering, R; Svergun, D I; Roessle, M

    2008-10-01

    There is a rapidly increasing interest in the use of synchrotron small-angle X-ray scattering (SAXS) for large-scale studies of biological macromolecules in solution, and this requires an adequate means of automating the experiment. A prototype has been developed of an automated sample changer for solution SAXS, where the solutions are kept in thermostatically controlled well plates allowing for operation with up to 192 samples. The measuring protocol involves controlled loading of protein solutions and matching buffers, followed by cleaning and drying of the cell between measurements. The system was installed and tested at the X33 beamline of the EMBL, at the storage ring DORIS-III (DESY, Hamburg), where it was used by over 50 external groups during 2007. At X33, a throughput of approximately 12 samples per hour, with a failure rate of sample loading of less than 0.5%, was observed. The feedback from users indicates that the ease of use and reliability of the user operation at the beamline were greatly improved compared with the manual filling mode. The changer is controlled by a client-server-based network protocol, locally and remotely. During the testing phase, the changer was operated in an attended mode to assess its reliability and convenience. Full integration with the beamline control software, allowing for automated data collection of all samples loaded into the machine with remote control from the user, is presently being implemented. The approach reported is not limited to synchrotron-based SAXS but can also be used on laboratory and neutron sources. PMID:25484841

  15. Automated sample-changing robot for solution scattering experiments at the EMBL Hamburg SAXS station X33

    PubMed Central

    Round, A. R.; Franke, D.; Moritz, S.; Huchler, R.; Fritsche, M.; Malthan, D.; Klaering, R.; Svergun, D. I.; Roessle, M.

    2008-01-01

    There is a rapidly increasing interest in the use of synchrotron small-angle X-ray scattering (SAXS) for large-scale studies of biological macromolecules in solution, and this requires an adequate means of automating the experiment. A prototype has been developed of an automated sample changer for solution SAXS, where the solutions are kept in thermostatically controlled well plates allowing for operation with up to 192 samples. The measuring protocol involves controlled loading of protein solutions and matching buffers, followed by cleaning and drying of the cell between measurements. The system was installed and tested at the X33 beamline of the EMBL, at the storage ring DORIS-III (DESY, Hamburg), where it was used by over 50 external groups during 2007. At X33, a throughput of approximately 12 samples per hour, with a failure rate of sample loading of less than 0.5%, was observed. The feedback from users indicates that the ease of use and reliability of the user operation at the beamline were greatly improved compared with the manual filling mode. The changer is controlled by a client–server-based network protocol, locally and remotely. During the testing phase, the changer was operated in an attended mode to assess its reliability and convenience. Full integration with the beamline control software, allowing for automated data collection of all samples loaded into the machine with remote control from the user, is presently being implemented. The approach reported is not limited to synchrotron-based SAXS but can also be used on laboratory and neutron sources. PMID:25484841

  16. An Automated Summarization Assessment Algorithm for Identifying Summarizing Strategies

    PubMed Central

    Abdi, Asad; Idris, Norisma; Alguliyev, Rasim M.; Aliguliyev, Ramiz M.

    2016-01-01

    Background Summarization is a process to select important information from a source text. Summarizing strategies are the core cognitive processes in summarization activity. Since summarization can be important as a tool to improve comprehension, it has attracted the interest of teachers for teaching summary writing through direct instruction. To do this, they need to review and assess the students' summaries, and these tasks are very time-consuming. Thus, a computer-assisted assessment can be used to help teachers conduct this task more effectively. Design/Results This paper aims to propose an algorithm based on the combination of semantic relations between words and their syntactic composition to identify summarizing strategies employed by students in summary writing. An innovative aspect of our algorithm lies in its ability to identify summarizing strategies at the syntactic and semantic levels. The efficiency of the algorithm is measured in terms of Precision, Recall and F-measure. We then implemented the algorithm for the automated summarization assessment system that can be used to identify the summarizing strategies used by students in summary writing. PMID:26735139
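
    The efficiency metrics named above can be computed by comparing algorithm-identified strategies against human annotations, as in the minimal sketch below. The strategy labels, sentence IDs and scoring granularity are invented for illustration and are not the authors' exact protocol.

    ```python
    # Minimal sketch: scoring identified summarizing strategies against human annotations.
    # The (sentence_id, strategy) pairs below are invented for illustration.

    def precision_recall_f1(predicted: set, gold: set) -> tuple:
        tp = len(predicted & gold)
        precision = tp / len(predicted) if predicted else 0.0
        recall = tp / len(gold) if gold else 0.0
        f1 = (2 * precision * recall / (precision + recall)) if (precision + recall) else 0.0
        return precision, recall, f1

    gold = {(1, "deletion"), (2, "paraphrase"), (3, "copy-verbatim"), (4, "generalization")}
    predicted = {(1, "deletion"), (2, "paraphrase"), (3, "paraphrase"), (5, "deletion")}

    p, r, f = precision_recall_f1(predicted, gold)
    print(f"Precision={p:.2f} Recall={r:.2f} F-measure={f:.2f}")
    ```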

  17. An automated system for assessing cognitive function in any environment

    NASA Astrophysics Data System (ADS)

    Wesnes, Keith A.

    2005-05-01

    The Cognitive Drug Research (CDR) computerized assessment system has been in use in worldwide clinical trials for over 20 years. It is a computer-based system which assesses core aspects of human cognitive function, including attention, information processing, working memory and long-term memory. It has been extensively validated and can be performed by a wide range of clinical populations, including patients with various types of dementia. It is currently in worldwide use in clinical trials to evaluate new medicines, as well as in a variety of programs involving the effects of age, stressors, illnesses and trauma upon human cognitive function. Besides being highly sensitive to drugs which will impair or improve function, its utility has been maintained over the last two decades by constantly increasing the number of platforms upon which it can operate. Besides notebook versions, the system can be used on a wrist-worn device, on a PDA, via the telephone and over the internet. It is the most widely used automated cognitive function assessment system in worldwide clinical research. It has dozens of parallel forms and requires little training to use or administer. The basic development of the system will be outlined, and the huge databases (normative, patient population, drug effects) which have been built up from hundreds of clinical trials will be described. The system is available for use in virtually any environment or type of trial.

  18. Automated semiquantitative direct-current-arc spectrographic analysis of eight argonne premium coal ash samples

    USGS Publications Warehouse

    Skeen, C.J.; Libby, B.J.; Crandell, W.B.

    1990-01-01

    The automated semiquantitative direct-current-arc spectrographic method was used to analyze 62 elements in eight Argonne Premium Coal Ash samples. All eight coal ash samples were analyzed in triplicate to verify the precision and accuracy of the method. The precision for most elements was within ±10%. The accuracy of this method is limited to +50% or -33% because of the nature of the standard curves for each of the elements. Adjustments to the computer program were implemented to account for unique matrix interferences in these particular coal ash samples.

  19. Ground Truth Sampling and LANDSAT Accuracy Assessment

    NASA Technical Reports Server (NTRS)

    Robinson, J. W.; Gunther, F. J.; Campbell, W. J.

    1982-01-01

    It is noted that the key factor in any accuracy assessment of remote sensing data is the method used for determining the ground truth, independent of the remote sensing data itself. The sampling and accuracy procedures developed for a nuclear power plant siting study are described. The purpose of the sampling procedure was to provide data for developing supervised classifications for two study sites and for assessing the accuracy of that and the other procedures used. The purpose of the accuracy assessment was to allow the comparison of the cost and accuracy of various classification procedures as applied to various data types.

  20. Functional Profiling of Live Melanoma Samples Using a Novel Automated Platform

    PubMed Central

    Schayowitz, Adam; Bertenshaw, Greg; Jeffries, Emiko; Schatz, Timothy; Cotton, James; Villanueva, Jessie; Herlyn, Meenhard; Krepler, Clemens; Vultur, Adina; Xu, Wei; Yu, Gordon H.; Schuchter, Lynn; Clark, Douglas P.

    2012-01-01

    Aims This proof-of-concept study was designed to determine if functional, pharmacodynamic profiles relevant to targeted therapy could be derived from live human melanoma samples using a novel automated platform. Methods A series of 13 melanoma cell lines was briefly exposed to a BRAF inhibitor (PLX-4720) on a platform employing automated fluidics for sample processing. Levels of the phosphoprotein p-ERK in the mitogen-activated protein kinase (MAPK) pathway from treated and untreated sample aliquots were determined using a bead-based immunoassay. Comparison of these levels provided a determination of the pharmacodynamic effect of the drug on the MAPK pathway. A similar ex vivo analysis was performed on fine needle aspiration (FNA) biopsy samples from four murine xenograft models of metastatic melanoma, as well as 12 FNA samples from patients with metastatic melanoma. Results Melanoma cell lines with known sensitivity to BRAF inhibitors displayed marked suppression of the MAPK pathway in this system, while most BRAF inhibitor-resistant cell lines showed intact MAPK pathway activity despite exposure to a BRAF inhibitor (PLX-4720). FNA samples from melanoma xenografts showed comparable ex vivo MAPK activity as their respective cell lines in this system. FNA samples from patients with metastatic melanoma successfully yielded three categories of functional profiles including: MAPK pathway suppression; MAPK pathway reactivation; MAPK pathway stimulation. These profiles correlated with the anticipated MAPK activity, based on the known BRAF mutation status, as well as observed clinical responses to BRAF inhibitor therapy. Conclusion Pharmacodynamic information regarding the ex vivo effect of BRAF inhibitors on the MAPK pathway in live human melanoma samples can be reproducibly determined using a novel automated platform. Such information may be useful in preclinical and clinical drug development, as well as predicting response to targeted therapy in individual patients

  1. Non-Uniform Sampling and J-UNIO Automation for Efficient Protein NMR Structure Determination.

    PubMed

    Didenko, Tatiana; Proudfoot, Andrew; Dutta, Samit Kumar; Serrano, Pedro; Wüthrich, Kurt

    2015-08-24

    High-resolution structure determination of small proteins in solution is one of the big assets of NMR spectroscopy in structural biology. Improvements in the efficiency of NMR structure determination through advances in NMR experiments and automation of data handling therefore attract continued interest. Here, non-uniform sampling (NUS) of 3D heteronuclear-resolved [(1)H,(1)H]-NOESY data yielded two- to three-fold savings of instrument time for structure determinations of soluble proteins. With the 152-residue protein NP_372339.1 from Staphylococcus aureus and the 71-residue protein NP_346341.1 from Streptococcus pneumoniae we show that high-quality structures can be obtained with NUS NMR data, which are equally well amenable to robust automated analysis as the corresponding uniformly sampled data. PMID:26227870

  2. ALARA ASSESSMENT OF SETTLER SLUDGE SAMPLING METHODS

    SciTech Connect

    NELSEN LA

    2009-01-30

    The purpose of this assessment is to compare underwater and above-water settler sludge sampling methods to determine if the added cost for underwater sampling for the sole purpose of worker dose reductions is justified. Initial planning for sludge sampling included container, settler and knock-out-pot (KOP) sampling. Due to the significantly higher dose consequence of KOP sludge, a decision was made to sample KOP underwater to achieve worker dose reductions. Additionally, initial plans were to utilize the underwater sampling apparatus for settler sludge. Since there are no longer plans to sample KOP sludge, the decision for underwater sampling of settler sludge needs to be revisited. The present sampling plan calls for spending an estimated $2,500,000 to design and construct a new underwater sampling system (per A21 C-PL-001 RevOE). This evaluation will compare and contrast the present method of above-water sampling to the underwater method planned by the Sludge Treatment Project (STP) and determine if settler samples can be taken using the existing sampling cart (with potentially minor modifications) while maintaining doses to workers As Low As Reasonably Achievable (ALARA), eliminating the need for costly redesigns, testing and personnel retraining.

  3. Electrothermal Fluid Manipulation of High-Conductivity Samples for Laboratory Automation Applications

    PubMed Central

    Sin, Mandy L. Y.; Gau, Vincent; Liao, Joseph C.; Wong, Pak Kin

    2010-01-01

    Electrothermal flow is a promising technique in microfluidic manipulation toward laboratory automation applications, such as clinical diagnostics and high throughput drug screening. Despite the potential of electrothermal flow in biomedical applications, relatively little is known about electrothermal manipulation of highly conductive samples, such as physiological fluids and buffer solutions. In this study, the characteristics and challenges of electrothermal manipulation of fluid samples with different conductivities were investigated systematically. Electrothermal flow was shown to create fluid motion for samples with a wide range of conductivities when the driving frequency was above 100 kHz. For samples with low conductivities (below 1 S/m), the characteristics of the electrothermal fluid motions were in quantitative agreement with the theory. For samples with high conductivities (above 1 S/m), the fluid motion appeared to deviate from the model as a result of potential electrochemical reactions and other electrothermal effects. These effects should be taken into consideration for electrothermal manipulation of biological samples with high conductivities. This study will provide insights in designing microfluidic devices for electrokinetic manipulation of biological samples toward laboratory automation applications in the future. PMID:21180401

  4. Development of automated preparation system for isotopocule analysis of N2O in various air samples

    NASA Astrophysics Data System (ADS)

    Toyoda, Sakae; Yoshida, Naohiro

    2016-05-01

    Nitrous oxide (N2O), an increasingly abundant greenhouse gas in the atmosphere, is the most important stratospheric ozone-depleting gas of this century. Natural abundance ratios of isotopocules of N2O, NNO molecules substituted with stable isotopes of nitrogen and oxygen, are a promising index of various sources or production pathways of N2O and of its sink or decomposition pathways. Several automated methods have been reported to improve the analytical precision for the isotopocule ratio of atmospheric N2O and to reduce the labor necessary for complicated sample preparation procedures related to mass spectrometric analysis. However, no method accommodates flask samples with limited volume or pressure. Here we present an automated preconcentration system which offers flexibility with respect to the available gas volume, pressure, and N2O concentration. The shortest processing time for a single analysis of typical atmospheric sample is 40 min. Precision values of isotopocule ratio analysis are < 0.1 ‰ for δ15Nbulk (average abundances of 14N15N16O and 15N14N16O relative to 14N14N16O), < 0.2 ‰ for δ18O (relative abundance of 14N14N18O), and < 0.5 ‰ for site preference (SP; difference between relative abundance of 14N15N16O and 15N14N16O). This precision is comparable to that of other automated systems, but better than that of our previously reported manual measurement system.
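
    The quantities quoted above follow standard delta notation. As a minimal sketch (with placeholder ratios, not measured values), δ15Nbulk is the mean of the two site-specific deltas, the site preference SP is their difference, and δ18O is the conventional delta of the 18O-substituted isotopocule:

    ```python
    # Minimal sketch of the delta notation used for N2O isotopocule ratios.
    # All ratio values below are placeholders, not measured data.

    def delta_permil(r_sample: float, r_reference: float) -> float:
        """delta = (R_sample / R_reference - 1) * 1000, in per mil."""
        return (r_sample / r_reference - 1.0) * 1000.0

    # Hypothetical isotopocule ratios relative to 14N14N16O (sample vs. reference)
    r15_alpha_sample, r15_alpha_ref = 3.700e-3, 3.677e-3   # 14N15N16O / 14N14N16O
    r15_beta_sample,  r15_beta_ref  = 3.650e-3, 3.677e-3   # 15N14N16O / 14N14N16O
    r18_sample,       r18_ref       = 2.010e-3, 2.005e-3   # 14N14N18O / 14N14N16O

    d15_alpha = delta_permil(r15_alpha_sample, r15_alpha_ref)
    d15_beta = delta_permil(r15_beta_sample, r15_beta_ref)
    d15_bulk = 0.5 * (d15_alpha + d15_beta)        # average 15N abundance
    site_preference = d15_alpha - d15_beta          # SP, central vs. terminal N position
    d18O = delta_permil(r18_sample, r18_ref)

    print(f"d15N_bulk={d15_bulk:.2f} permil, SP={site_preference:.2f} permil, d18O={d18O:.2f} permil")
    ```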

  5. An automated system for performance assessment of airport lighting

    NASA Astrophysics Data System (ADS)

    Niblock, James; Peng, Jian-Xun; McMenemy, Karen; Irwin, George

    2008-02-01

    This paper presents work undertaken into the development of an automated air-based vision system for assessing the performance of an approach lighting system (ALS) installation in accordance with International Civil Aviation Organisation (ICAO) standards. The measuring device consists of an image sensor with associated lens system fitted to the interior of an aircraft. The vision system is capable of capturing sequences of airport lighting images during a normal approach to the airport. These images are then processed to determine the uniformity of the ALS. To assess the uniformity of the ALS, the luminaires must first be uniquely identified and tracked through an image sequence. A model-based matching technique is utilised which uses a camera projection system to match a set of template data to the extracted image data. From the matching results, the associated position and pose of the camera are estimated. Each luminaire emits an intensity which is dependent on its angular displacement from the camera. As such, it is possible to predict the intensity that each luminaire within the ALS emits during an approach. Luminaires emitting the same intensity are banded together for the uniformity analysis. Uniformity assumes that luminaires in close proximity exhibit similar luminous intensity characteristics. During a typical approach, grouping information is obtained for the various sectors of luminaires. This grouping information is used to compare luminaires against one another in terms of their extracted grey level information. The developed software is validated using data acquired during an actual approach to a UK airport.
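
    The first step of the model-based matching described above is projecting a template of known luminaire positions into the image with a camera model. The sketch below shows only a bare pinhole projection with invented geometry; the system's actual camera calibration, distortion handling and pose estimation are not reproduced.

    ```python
    # Minimal sketch (invented geometry): projecting 3D luminaire template positions
    # into pixel coordinates with a pinhole camera model (no lens distortion).
    import numpy as np

    def project_points(points_world: np.ndarray, rotation: np.ndarray,
                       translation: np.ndarray, focal_px: float,
                       principal_point: tuple) -> np.ndarray:
        """Project Nx3 world points to Nx2 pixel coordinates."""
        cam = (rotation @ points_world.T).T + translation      # world -> camera frame
        u = focal_px * cam[:, 0] / cam[:, 2] + principal_point[0]
        v = focal_px * cam[:, 1] / cam[:, 2] + principal_point[1]
        return np.column_stack([u, v])

    # Hypothetical template: a row of approach luminaires spaced 30 m apart ahead of the camera
    template = np.array([[0.0, 0.0, 300.0 + 30.0 * i] for i in range(10)])
    R = np.eye(3)                       # assumed camera orientation
    t = np.array([0.0, -50.0, 0.0])     # assumed camera 50 m above the lighting plane
    pixels = project_points(template, R, t, focal_px=1500.0, principal_point=(640.0, 512.0))
    print(np.round(pixels, 1))
    ```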

  6. Automated mango fruit assessment using fuzzy logic approach

    NASA Astrophysics Data System (ADS)

    Hasan, Suzanawati Abu; Kin, Teoh Yeong; Sauddin@Sa'duddin, Suraiya; Aziz, Azlan Abdul; Othman, Mahmod; Mansor, Ab Razak; Parnabas, Vincent

    2014-06-01

    In terms of value and volume of production, mango is the third most important fruit product after pineapple and banana. Accurate size assessment of mango fruits during harvesting is vital to ensure that they are classified into the appropriate grade. However, the current practice in the mango industry is to grade the fruit manually using human graders. This method is inconsistent, inefficient and labor intensive. In this project, a new method of automated mango size and grade assessment is developed using an RGB fiber optic sensor and a fuzzy logic approach. Maximum, minimum and mean values are calculated from the RGB fiber optic sensor readings, and a decision-making scheme based on minimum entropy formulation is used to analyse the data and classify the mango fruit. The proposed method is capable of differentiating three different grades of mango fruit automatically, with 77.78% overall accuracy compared to sorting by human graders. This method was found to be helpful for application in the current agricultural industry.

  7. Multiplexed NMR: An Automated CapNMR Dual-Sample Probe

    PubMed Central

    Norcross, James A.; Milling, Craig T.; Olson, Dean L.; Xu, Duanxiang; Audrieth, Anthony; Albrecht, Robert; Ruan, Ke; Likos, John; Jones, Claude; Peck, Timothy L.

    2010-01-01

    A new generation of micro-scale, nuclear magnetic resonance (CapNMR™) probe technology employs two independent detection elements to accommodate two samples simultaneously. Each detection element in the Dual-Sample CapNMR Probe (DSP) delivers the same spectral resolution and S/N as in a CapNMR probe configured to accommodate one sample at a time. A high degree of electrical isolation allows the DSP to be used in a variety of data acquisition modes. Both samples are shimmed simultaneously to achieve high spectral resolution for simultaneous data acquisition, or alternatively, a flowcell-specific shim set is readily called via spectrometer subroutines to enable acquisition from one sample while the other is being loaded. An automation system accommodates loading of two samples via dual injection ports on an autosampler and two completely independent flowpaths leading to dedicated flowcells in the DSP probe. PMID:20681560

  8. Automated Prediction of Catalytic Mechanism and Rate Law Using Graph-Based Reaction Path Sampling.

    PubMed

    Habershon, Scott

    2016-04-12

    In a recent article [J. Chem. Phys. 2015, 143, 094106], we introduced a novel graph-based sampling scheme which can be used to generate chemical reaction paths in many-atom systems in an efficient and highly automated manner. The main goal of this work is to demonstrate how this approach, when combined with direct kinetic modeling, can be used to determine the mechanism and phenomenological rate law of a complex catalytic cycle, namely cobalt-catalyzed hydroformylation of ethene. Our graph-based sampling scheme generates 31 unique chemical products and 32 unique chemical reaction pathways; these sampled structures and reaction paths enable automated construction of a kinetic network model of the catalytic system when combined with density functional theory (DFT) calculations of free energies and resultant transition-state theory rate constants. Direct simulations of this kinetic network across a range of initial reactant concentrations enable determination of both the reaction mechanism and the associated rate law in an automated fashion, without the need for either presupposing a mechanism or making steady-state approximations in kinetic analysis. Most importantly, we find that the reaction mechanism which emerges from these simulations is exactly that originally proposed by Heck and Breslow; furthermore, the simulated rate law is also consistent with previous experimental and computational studies, exhibiting a complex dependence on carbon monoxide pressure. While the inherent errors of using DFT simulations to model chemical reactivity limit the quantitative accuracy of our calculated rates, this work confirms that our automated simulation strategy enables direct analysis of catalytic mechanisms from first principles. PMID:26938837
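
    The kinetic network described above is parameterized by transition-state-theory rate constants obtained from DFT free-energy barriers. A minimal sketch of the Eyring expression, with invented barrier heights, is shown below; it illustrates only the rate-constant step, not the authors' full kinetic model.

    ```python
    # Minimal sketch: Eyring transition-state-theory rate constant from a free-energy barrier.
    # The barrier values are invented for illustration.
    import math

    KB = 1.380649e-23      # Boltzmann constant, J/K
    H = 6.62607015e-34     # Planck constant, J*s
    R = 8.314462618        # gas constant, J/(mol*K)

    def tst_rate_constant(delta_g_kj_mol: float, temperature_k: float = 298.15) -> float:
        """k = (kB*T/h) * exp(-dG_act / (R*T)), first-order rate constant in 1/s."""
        return (KB * temperature_k / H) * math.exp(-delta_g_kj_mol * 1e3 / (R * temperature_k))

    if __name__ == "__main__":
        for dg in (60.0, 80.0, 100.0):   # hypothetical activation free energies, kJ/mol
            print(f"dG = {dg:5.1f} kJ/mol -> k = {tst_rate_constant(dg):.3e} s^-1")
    ```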

  9. RoboDiff: combining a sample changer and goniometer for highly automated macromolecular crystallography experiments.

    PubMed

    Nurizzo, Didier; Bowler, Matthew W; Caserotto, Hugo; Dobias, Fabien; Giraud, Thierry; Surr, John; Guichard, Nicolas; Papp, Gergely; Guijarro, Matias; Mueller-Dieckmann, Christoph; Flot, David; McSweeney, Sean; Cipriani, Florent; Theveneau, Pascal; Leonard, Gordon A

    2016-08-01

    Automation of the mounting of cryocooled samples is now a feature of the majority of beamlines dedicated to macromolecular crystallography (MX). Robotic sample changers have been developed over many years, with the latest designs increasing capacity, reliability and speed. Here, the development of a new sample changer deployed at the ESRF beamline MASSIF-1 (ID30A-1), based on an industrial six-axis robot, is described. The device, named RoboDiff, includes a high-capacity dewar, acts as both a sample changer and a high-accuracy goniometer, and has been designed for completely unattended sample mounting and diffraction data collection. This aim has been achieved using a high level of diagnostics at all steps of the process from mounting and characterization to data collection. The RoboDiff has been in service on the fully automated endstation MASSIF-1 at the ESRF since September 2014 and, at the time of writing, has processed more than 20 000 samples completely automatically. PMID:27487827

  10. Electrochemical pesticide detection with AutoDip--a portable platform for automation of crude sample analyses.

    PubMed

    Drechsel, Lisa; Schulz, Martin; von Stetten, Felix; Moldovan, Carmen; Zengerle, Roland; Paust, Nils

    2015-02-01

    Lab-on-a-chip devices hold promise for automation of complex workflows from sample to answer with minimal consumption of reagents in portable devices. However, complex, inhomogeneous samples as they occur in environmental or food analysis may block microchannels and thus often cause malfunction of the system. Here we present the novel AutoDip platform which is based on the movement of a solid phase through the reagents and sample instead of transporting a sequence of reagents through a fixed solid phase. A ball-pen mechanism operated by an external actuator automates unit operations such as incubation and washing by consecutively dipping the solid phase into the corresponding liquids. The platform is applied to electrochemical detection of organophosphorus pesticides in real food samples using an acetylcholinesterase (AChE) biosensor. Minimal sample preparation and an integrated reagent pre-storage module hold promise for easy handling of the assay. Detection of the pesticide chlorpyrifos-oxon (CPO) spiked into apple samples at concentrations of 10(-7) M has been demonstrated. This concentration is below the maximum residue level for chlorpyrifos in apples defined by the European Commission. PMID:25415182

  11. RoboDiff: combining a sample changer and goniometer for highly automated macromolecular crystallography experiments

    PubMed Central

    Nurizzo, Didier; Bowler, Matthew W.; Caserotto, Hugo; Dobias, Fabien; Giraud, Thierry; Surr, John; Guichard, Nicolas; Papp, Gergely; Guijarro, Matias; Mueller-Dieckmann, Christoph; Flot, David; McSweeney, Sean; Cipriani, Florent; Theveneau, Pascal; Leonard, Gordon A.

    2016-01-01

    Automation of the mounting of cryocooled samples is now a feature of the majority of beamlines dedicated to macromolecular crystallography (MX). Robotic sample changers have been developed over many years, with the latest designs increasing capacity, reliability and speed. Here, the development of a new sample changer deployed at the ESRF beamline MASSIF-1 (ID30A-1), based on an industrial six-axis robot, is described. The device, named RoboDiff, includes a high-capacity dewar, acts as both a sample changer and a high-accuracy goniometer, and has been designed for completely unattended sample mounting and diffraction data collection. This aim has been achieved using a high level of diagnostics at all steps of the process from mounting and characterization to data collection. The RoboDiff has been in service on the fully automated endstation MASSIF-1 at the ESRF since September 2014 and, at the time of writing, has processed more than 20 000 samples completely automatically. PMID:27487827

  12. Automation of Workplace Lifting Hazard Assessment for Musculoskeletal Injury Prevention

    PubMed Central

    2014-01-01

    posture and temporal elements of tasks such as task frequency in an automated fashion, although these findings should be confirmed in a larger study. Further work is needed to incorporate force assessments and address workplace feasibility challenges. We anticipate that this approach could ultimately be used to perform large-scale musculoskeletal exposure assessment not only for research but also to provide real-time feedback to workers and employers during work method improvement activities and employee training. PMID:24987523

  13. Automated combustion accelerator mass spectrometry for the analysis of biomedical samples in the low attomole range.

    PubMed

    van Duijn, Esther; Sandman, Hugo; Grossouw, Dimitri; Mocking, Johannes A J; Coulier, Leon; Vaes, Wouter H J

    2014-08-01

    The increasing role of accelerator mass spectrometry (AMS) in biomedical research necessitates modernization of the traditional sample handling process. AMS was originally developed and used for carbon dating, therefore focusing on a very high precision but with a comparably low sample throughput. Here, we describe the combination of automated sample combustion with an elemental analyzer (EA) online coupled to an AMS via a dedicated interface. This setup allows direct radiocarbon measurements for over 70 samples daily by AMS. No sample processing is required apart from the pipetting of the sample into a tin foil cup, which is placed in the carousel of the EA. In our system, up to 200 AMS analyses are performed automatically without the need for manual interventions. We present results on the direct total (14)C count measurements in <2 μL human plasma samples. The method shows linearity over a range of 0.65-821 mBq/mL, with a lower limit of quantification of 0.65 mBq/mL (corresponding to 0.67 amol for acetaminophen). At these extremely low levels of activity, it becomes important to quantify plasma specific carbon percentages. This carbon percentage is automatically generated upon combustion of a sample on the EA. Apparent advantages of the present approach include complete omission of sample preparation (reduced hands-on time) and fully automated sample analysis. These improvements clearly stimulate the standard incorporation of microtracer research in the drug development process. In combination with the particularly low sample volumes required and extreme sensitivity, AMS strongly improves its position as a bioanalysis method. PMID:25033319

  14. Assessment of organic matter resistance to biodegradation in volcanic ash soils assisted by automated interpretation of infrared spectra from humic acid and whole soil samples by using partial least squares

    NASA Astrophysics Data System (ADS)

    Hernández, Zulimar; Pérez Trujillo, Juan Pedro; Hernández-Hernández, Sergio Alexander; Almendros, Gonzalo; Sanz, Jesús

    2014-05-01

    From a practical viewpoint, the most interesting possibilities of applying infrared (IR) spectroscopy to soil studies lie in processing IR spectra of whole soil (WS) samples [1] in order to forecast functional descriptors at high organizational levels of the soil system, such as soil C resilience. Currently, there is a discussion on whether the resistance to biodegradation of soil organic matter (SOM) depends on its molecular composition or on environmental interactions between SOM and mineral components, as could be the case with physical encapsulation of particulate SOM or organo-mineral derivatives, e.g., those formed with amorphous oxides [2]. A set of about 200 dependent variables from WS and isolated, ash-free humic acids (HA) [3] was obtained in 30 volcanic ash soils from Tenerife Island (Spain). Soil biogeochemical properties such as SOM, allophane (Alo + 1/2 Feo), total mineralization coefficient (TMC) or aggregate stability were determined in WS. In addition, structural information on SOM was obtained from the isolated HA fractions by visible spectroscopy and analytical pyrolysis (Py-GC/MS). Aiming to explore the potential of partial least squares regression (PLS) in forecasting soil dependent variables, exclusively using the information extracted from WS and HA IR spectral profiles, data were processed by using the ParLeS [4] and Unscrambler programs. Data pre-treatments should be carefully chosen: the most significant PLS models from IR spectra of HA were obtained after second derivative pre-treatment, which prevented effects of intrinsically broadband spectral profiles typical in macromolecular heterogeneous material such as HA. Conversely, when using IR spectra of WS, the best forecasting models were obtained using linear baseline correction and maximum normalization pre-treatment. With WS spectra, the most successful prediction models were obtained for SOM, magnetite, allophane, aggregate stability, clay and total aromatic compounds, whereas the PLS
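
    As a hedged illustration of the pre-treatment plus PLS workflow described above (a second-derivative transform of the spectra followed by partial least squares regression), the sketch below uses synthetic spectra, SciPy's Savitzky-Golay filter and scikit-learn's PLSRegression. It is not the ParLeS/Unscrambler pipeline used by the authors, and the target variable is invented.

    ```python
    # Minimal sketch (synthetic data): Savitzky-Golay second-derivative pre-treatment of
    # IR spectra followed by cross-validated PLS regression.
    import numpy as np
    from scipy.signal import savgol_filter
    from sklearn.cross_decomposition import PLSRegression
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(1)
    n_samples, n_wavenumbers = 30, 600
    spectra = rng.normal(size=(n_samples, n_wavenumbers)).cumsum(axis=1)      # synthetic "spectra"
    target = spectra[:, 250:260].mean(axis=1) + rng.normal(scale=0.1, size=n_samples)  # invented property

    # Pre-treatment: the second derivative suppresses broad baseline features
    spectra_d2 = savgol_filter(spectra, window_length=15, polyorder=3, deriv=2, axis=1)

    pls = PLSRegression(n_components=5)
    r2_cv = cross_val_score(pls, spectra_d2, target, cv=5, scoring="r2")
    print("cross-validated R^2:", np.round(r2_cv, 2))
    ```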

  15. Automated AFM force curve analysis for determining elastic modulus of biomaterials and biological samples.

    PubMed

    Chang, Yow-Ren; Raghunathan, Vijay Krishna; Garland, Shaun P; Morgan, Joshua T; Russell, Paul; Murphy, Christopher J

    2014-09-01

    The analysis of atomic force microscopy (AFM) force data requires the selection of a contact point (CP) and is often time consuming and subjective due to influence from intermolecular forces and low signal-to-noise ratios (SNR). In this report, we present an automated algorithm for the selection of CPs in AFM force data and the evaluation of elastic moduli. We propose that CP may be algorithmically easier to detect by identifying a linear elastic indentation region of data (high SNR) rather than the contact point itself (low SNR). Utilizing Hertzian mechanics, the data are fitted for the CP. We first detail the algorithm and then evaluate it on sample polymeric and biological materials. As a demonstration of automation, 64 × 64 force maps were analyzed to yield spatially varying topographical and mechanical information of cells. Finally, we compared manually selected CPs to automatically identified CPs and demonstrated that our automated approach is both accurate (<10 nm difference between manual and automatic) and precise for non-interacting polymeric materials. Our data show that the algorithm is useful for analysis of both biomaterials and biological samples. PMID:24951927
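
    As a hedged illustration of the Hertzian fitting step mentioned above, the sketch below fits the spherical-indenter Hertz model F = (4/3)·E/(1−ν²)·√R·δ^(3/2) to a synthetic force-indentation curve with SciPy. The tip radius, Poisson's ratio and data are assumptions; the automated contact-point detection itself is not reproduced here.

    ```python
    # Minimal sketch (synthetic data): fitting the spherical-indenter Hertz model,
    # F = (4/3) * E / (1 - nu^2) * sqrt(R) * delta^(3/2),
    # to force-indentation data past an assumed contact point.
    import numpy as np
    from scipy.optimize import curve_fit

    R_TIP = 5e-6     # indenter radius, m (assumed)
    NU = 0.5         # Poisson's ratio, incompressible sample (assumed)

    def hertz_force(delta_m: np.ndarray, e_pa: float) -> np.ndarray:
        """Force (N) on a spherical indenter at indentation depth delta (m)."""
        return (4.0 / 3.0) * e_pa / (1.0 - NU**2) * np.sqrt(R_TIP) * delta_m**1.5

    # Synthetic indentation curve for a ~5 kPa sample with measurement noise
    rng = np.random.default_rng(2)
    delta = np.linspace(0, 1e-6, 200)
    force = hertz_force(delta, 5e3) + rng.normal(scale=2e-11, size=delta.size)

    (e_fit,), _ = curve_fit(hertz_force, delta, force, p0=[1e3])
    print(f"fitted elastic modulus: {e_fit / 1e3:.2f} kPa")
    ```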

  16. Automated biphasic morphological assessment of hepatitis B-related liver fibrosis using second harmonic generation microscopy

    PubMed Central

    Wang, Tong-Hong; Chen, Tse-Ching; Teng, Xiao; Liang, Kung-Hao; Yeh, Chau-Ting

    2015-01-01

    Liver fibrosis assessment by biopsy and conventional staining scores is based on histopathological criteria. Variations in sample preparation and the use of semi-quantitative histopathological methods commonly result in discrepancies between medical centers. Thus, minor changes in liver fibrosis might be overlooked in multi-center clinical trials, leading to statistically non-significant data. Here, we developed a computer-assisted, fully automated, staining-free method for hepatitis B-related liver fibrosis assessment. In total, 175 liver biopsies were divided into training (n = 105) and verification (n = 70) cohorts. Collagen was observed using second harmonic generation (SHG) microscopy without prior staining, and hepatocyte morphology was recorded using two-photon excitation fluorescence (TPEF) microscopy. The training cohort was utilized to establish a quantification algorithm. Eleven of 19 computer-recognizable SHG/TPEF microscopic morphological features were significantly correlated with the ISHAK fibrosis stages (P < 0.001). A biphasic scoring method was applied, combining support vector machine and multivariate generalized linear models to assess the early and late stages of fibrosis, respectively, based on these parameters. The verification cohort was used to verify the scoring method, and the area under the receiver operating characteristic curve was >0.82 for liver cirrhosis detection. Since no subjective gradings are needed, interobserver discrepancies could be avoided using this fully automated method. PMID:26260921
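
    As a hedged sketch of the biphasic idea described above (one classifier separates early from late stages, a second multivariate model grades the late stages), the example below uses synthetic morphological features with scikit-learn; a plain linear model stands in for the multivariate generalized linear model, and nothing here reproduces the authors' trained system.

    ```python
    # Minimal sketch (synthetic features): a two-phase scoring scheme in the spirit of
    # the biphasic method above. All data and stage labels are invented.
    import numpy as np
    from sklearn.svm import SVC
    from sklearn.linear_model import LinearRegression

    rng = np.random.default_rng(4)
    n, n_features = 175, 11                       # 11 SHG/TPEF morphological features
    X = rng.normal(size=(n, n_features))
    stage = np.clip((X[:, 0] * 2 + rng.normal(size=n) + 3).round(), 0, 6)   # pseudo ISHAK stage

    is_late = (stage >= 3).astype(int)
    svm = SVC(kernel="rbf").fit(X, is_late)                                  # phase 1: early vs. late
    late_model = LinearRegression().fit(X[stage >= 3], stage[stage >= 3])    # phase 2: grade late stages

    x_new = X[:1]
    if svm.predict(x_new)[0] == 1:
        print("predicted late-stage score:", float(late_model.predict(x_new)[0]))
    else:
        print("classified as early-stage fibrosis")
    ```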

  17. Automated biphasic morphological assessment of hepatitis B-related liver fibrosis using second harmonic generation microscopy

    NASA Astrophysics Data System (ADS)

    Wang, Tong-Hong; Chen, Tse-Ching; Teng, Xiao; Liang, Kung-Hao; Yeh, Chau-Ting

    2015-08-01

    Liver fibrosis assessment by biopsy and conventional staining scores is based on histopathological criteria. Variations in sample preparation and the use of semi-quantitative histopathological methods commonly result in discrepancies between medical centers. Thus, minor changes in liver fibrosis might be overlooked in multi-center clinical trials, leading to statistically non-significant data. Here, we developed a computer-assisted, fully automated, staining-free method for hepatitis B-related liver fibrosis assessment. In total, 175 liver biopsies were divided into training (n = 105) and verification (n = 70) cohorts. Collagen was observed using second harmonic generation (SHG) microscopy without prior staining, and hepatocyte morphology was recorded using two-photon excitation fluorescence (TPEF) microscopy. The training cohort was utilized to establish a quantification algorithm. Eleven of 19 computer-recognizable SHG/TPEF microscopic morphological features were significantly correlated with the ISHAK fibrosis stages (P < 0.001). A biphasic scoring method was applied, combining support vector machine and multivariate generalized linear models to assess the early and late stages of fibrosis, respectively, based on these parameters. The verification cohort was used to verify the scoring method, and the area under the receiver operating characteristic curve was >0.82 for liver cirrhosis detection. Since no subjective gradings are needed, interobserver discrepancies could be avoided using this fully automated method.

  18. Automated performance assessment of ultrasound systems using a dynamic phantom

    PubMed Central

    Riedel, F; Valente, AA; Cochran, S; Corner, GA

    2014-01-01

    Quality assurance of medical ultrasound imaging systems is limited by repeatability, difficulty in quantifying results, and the time involved. A particularly interesting approach is demonstrated in the Edinburgh pipe phantom which, with an accompanying mathematical transformation, produces a single figure of merit for image quality from individual measurements of resolution over a range of depths. However, the Edinburgh pipe phantom still requires time-consuming manual scanning, mitigating against its routine use. This paper presents a means to overcome this limitation with a new device, termed the Dundee dynamic phantom, allowing rapid set-up and automated operation. The Dundee dynamic phantom is based on imaging two filamentary targets, positioned by computer control at different depths in a tank of 9.4% ethanol–water solution. The images are analysed in real time to assess if the targets are resolved, with individual measurements at different depths again used to calculate a single figure of merit, in this case for lateral resolution only. Test results are presented for a total of 18 scanners in clinical use for different applications. As a qualitative indication of viability, the figure of merit produced by the Dundee dynamic phantom is shown to differentiate between scanners operating at different frequencies and between a relatively new, higher quality system and an older, lower quality system.

  19. Automated, Ultra-Sterile Solid Sample Handling and Analysis on a Chip

    NASA Technical Reports Server (NTRS)

    Mora, Maria F.; Stockton, Amanda M.; Willis, Peter A.

    2013-01-01

    There are no existing ultra-sterile lab-on-a-chip systems that can accept solid samples and perform complete chemical analyses without human intervention. The proposed solution is to demonstrate completely automated lab-on-a-chip manipulation of powdered solid samples, followed by on-chip liquid extraction and chemical analysis. This technology utilizes a newly invented glass micro-device for solid manipulation, which mates with existing lab-on-a-chip instrumentation. Devices are fabricated in a Class 10 cleanroom at the JPL MicroDevices Lab, and are plasma-cleaned before and after assembly. Solid samples enter the device through a drilled hole in the top. Existing micro-pumping technology is used to transfer milligrams of powdered sample into an extraction chamber where it is mixed with liquids to extract organic material. Subsequent chemical analysis is performed using portable microchip capillary electrophoresis systems (CE). These instruments have been used for ultra-highly sensitive (parts-per-trillion, pptr) analysis of organic compounds including amines, amino acids, aldehydes, ketones, carboxylic acids, and thiols. Fully autonomous amino acid analyses in liquids were demonstrated; however, to date there have been no reports of completely automated analysis of solid samples on chip. This approach utilizes an existing portable instrument that houses optics, high-voltage power supplies, and solenoids for fully autonomous microfluidic sample processing and CE analysis with laser-induced fluorescence (LIF) detection. Furthermore, the entire system can be sterilized and placed in a cleanroom environment for analyzing samples returned from extraterrestrial targets, if desired. This is an entirely new capability never demonstrated before. The ability to manipulate solid samples, coupled with lab-on-a-chip analysis technology, will enable ultraclean and ultrasensitive end-to-end analysis of samples that is orders of magnitude more sensitive than the ppb goal given

  20. Neurodegenerative changes in Alzheimer's disease: a comparative study of manual, semi-automated, and fully automated assessment using MRI

    NASA Astrophysics Data System (ADS)

    Fritzsche, Klaus H.; Giesel, Frederik L.; Heimann, Tobias; Thomann, Philipp A.; Hahn, Horst K.; Pantel, Johannes; Schröder, Johannes; Essig, Marco; Meinzer, Hans-Peter

    2008-03-01

    Objective quantification of disease-specific neurodegenerative changes can facilitate diagnosis and therapeutic monitoring in several neuropsychiatric disorders. Reproducibility and easy-to-perform assessment are essential to ensure applicability in clinical environments. The aim of this comparative study is the evaluation of a fully automated approach that assesses atrophic changes in Alzheimer's disease (AD) and Mild Cognitive Impairment (MCI). 21 healthy volunteers (mean age 66.2), 21 patients with MCI (66.6), and 10 patients with AD (65.1) were enrolled. Subjects underwent extensive neuropsychological testing and MRI was conducted on a 1.5 Tesla clinical scanner. Atrophic changes were measured automatically by a series of image processing steps including state-of-the-art brain mapping techniques. Results were compared with two reference approaches: a manual segmentation of the hippocampal formation and a semi-automated estimation of temporal horn volume, which is based upon interactive selection of two to six landmarks in the ventricular system. All approaches separated controls and AD patients significantly (10^-5 < p < 10^-4) and showed a slight but not significant increase of neurodegeneration for subjects with MCI compared to volunteers. The automated approach correlated significantly with the manual (r = -0.65, p < 10^-6) and semi-automated (r = -0.83, p < 10^-13) measurements. It proved highly accurate while maximizing observer independence and reducing assessment time, and is thus well suited for clinical routine.

  1. Device and method for automated separation of a sample of whole blood into aliquots

    DOEpatents

    Burtis, Carl A.; Johnson, Wayne F.

    1989-01-01

    A device and a method for automated processing and separation of an unmeasured sample of whole blood into multiple aliquots of plasma. Capillaries are radially oriented on a rotor, with the rotor defining a sample chamber, transfer channels, overflow chamber, overflow channel, vent channel, cell chambers, and processing chambers. A sample of whole blood is placed in the sample chamber, and when the rotor is rotated, the blood moves outward through the transfer channels to the processing chambers where the blood is centrifugally separated into a solid cellular component and a liquid plasma component. When the rotor speed is decreased, the plasma component backfills the capillaries resulting in uniform aliquots of plasma which may be used for subsequent analytical procedures.

  2. AUTOMATED GEOSPATIAL WATERSHED ASSESSMENT (AGWA): A GIS-BASED HYDROLOGIC MODELING TOOL FOR WATERSHED ASSESSMENT AND ANALYSIS

    EPA Science Inventory

    The Automated Geospatial Watershed Assessment tool (AGWA) is a GIS interface jointly developed by the USDA Agricultural Research Service, the U.S. Environmental Protection Agency, the University of Arizona, and the University of Wyoming to automate the parameterization and execu...

  3. AUTOMATED GEOSPATIAL WATERSHED ASSESSMENT (AGWA): A GIS-BASED HYDROLOGIC MODELING TOOL FOR WATERSHED ASSESSMENT AND ANALYSIS

    EPA Science Inventory

    The Automated Geospatial Watershed Assessment tool (AGWA) is a GIS interface jointly developed by the USDA Agricultural Research Service, the U.S. Environmental Protection Agency, the University of Arizona, and the University of Wyoming to automate the parame...

  4. AUTOMATED GEOSPATIAL WATERSHED ASSESSMENT (AGWA): A GIS-BASED HYDROLOGICAL MODELING TOOL FOR WATERSHED ASSESSMENT AND ANALYSIS

    EPA Science Inventory

    The Automated Geospatial Watershed Assessment tool (AGWA) is a GIS interface jointly developed by the USDA Agricultural Research Service, the U.S. Environmental Protection Agency, the University of Arizona, and the University of Wyoming to automate the parameterization and execut...

  5. A Bayesian Framework for the Automated Online Assessment of Sensor Data Quality

    PubMed Central

    Smith, Daniel; Timms, Greg; De Souza, Paulo; D'Este, Claire

    2012-01-01

    Online automated quality assessment is critical to determine a sensor's fitness for purpose in real-time applications. A Dynamic Bayesian Network (DBN) framework is proposed to produce probabilistic quality assessments and represent the uncertainty of sequentially correlated sensor readings. This is a novel framework to represent the causes, quality state and observed effects of individual sensor errors without imposing any constraints upon the physical deployment or measured phenomenon. It represents the causal relationship between quality tests and combines them to generate uncertainty estimates for samples. The DBN was implemented for a particular marine deployment of temperature and conductivity sensors in Hobart, Australia. The DBN was shown to offer a substantial average improvement (34%) in replicating the error bars that were generated by experts when compared to a fuzzy logic approach. PMID:23012554
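
    A minimal sketch of the combination step (a static naive-Bayes update over quality-test outcomes; the published DBN additionally propagates the quality state across time steps, and all probabilities below are illustrative assumptions):

        def posterior_error_prob(test_results, prior=0.05,
                                 p_flag_given_error=0.8, p_flag_given_ok=0.1):
            # test_results: list of booleans, True = a quality test flagged the reading.
            p_error, p_ok = prior, 1.0 - prior
            for flagged in test_results:
                p_error *= p_flag_given_error if flagged else (1.0 - p_flag_given_error)
                p_ok *= p_flag_given_ok if flagged else (1.0 - p_flag_given_ok)
            return p_error / (p_error + p_ok)

        # Example: two of three quality tests flag a conductivity reading.
        print(posterior_error_prob([True, True, False]))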

  6. Automated sample exchange and tracking system for neutron research at cryogenic temperatures.

    PubMed

    Rix, J E; Weber, J K R; Santodonato, L J; Hill, B; Walker, L M; McPherson, R; Wenzel, J; Hammons, S E; Hodges, J; Rennich, M; Volin, K J

    2007-01-01

    An automated system for sample exchange and tracking in a cryogenic environment and under remote computer control was developed. Up to 24 sample "cans" per cycle can be inserted and retrieved in a programmed sequence. A video camera acquires a unique identification marked on the sample can to provide a record of the sequence. All operations are coordinated via a LABVIEW program that can be operated locally or over a network. The samples are contained in vanadium cans 6-10 mm in diameter, equipped with a hermetically sealed lid that interfaces with the sample handler. The system uses a closed-cycle refrigerator (CCR) for cooling. The sample was delivered to a precooling location at a temperature of approximately 25 K; after several minutes, it was moved onto a "landing pad" at approximately 10 K that locates the sample in the probe beam. After the sample was released onto the landing pad, the sample handler was retracted. Reading the sample identification and performing the exchange operation takes approximately 2 min. The time to cool the sample from ambient temperature to approximately 10 K was approximately 7 min including precooling time. The cooling time increases to approximately 12 min if precooling is not used. Small differences in cooling rate were observed between sample materials and for different sample can sizes. Filling the sample well and the sample can with low-pressure helium is essential to provide heat transfer and to achieve useful cooling rates. A resistive heating coil can be used to offset the refrigeration so that temperatures up to approximately 350 K can be accessed and controlled using a proportional-integral-derivative control loop. The time for the landing pad to cool to approximately 10 K after it has been heated to approximately 240 K was approximately 20 min. PMID:17503933

  7. Design and Practices for Use of Automated Drilling and Sample Handling in MARTE While Minimizing Terrestrial and Cross Contamination

    NASA Astrophysics Data System (ADS)

    Miller, David P.; Bonaccorsi, Rosalba; Davis, Kiel

    2008-10-01

    Mars Astrobiology Research and Technology Experiment (MARTE) investigators used an automated drill and sample processing hardware to detect and categorize life-forms found in subsurface rock at Río Tinto, Spain. For the science to be successful, it was necessary for the biomass from other sources -- whether from previously processed samples (cross contamination) or the terrestrial environment (forward contamination) -- to be insignificant. The hardware and practices used in MARTE were designed around this problem. Here, we describe some of the design issues that were faced and classify them into problems that are unique to terrestrial tests versus problems that would also exist for a system that was flown to Mars. Assessment of the biomass at various stages in the sample handling process revealed mixed results; the instrument design seemed to minimize cross contamination, but contamination from the surrounding environment sometimes made its way onto the surface of samples. Techniques used during the MARTE Río Tinto project, such as facing the sample, appear to remove this environmental contamination without introducing significant cross contamination from previous samples.

  8. Design and practices for use of automated drilling and sample handling in MARTE while minimizing terrestrial and cross contamination.

    PubMed

    Miller, David P; Bonaccorsi, Rosalba; Davis, Kiel

    2008-10-01

    Mars Astrobiology Research and Technology Experiment (MARTE) investigators used an automated drill and sample processing hardware to detect and categorize life-forms found in subsurface rock at Río Tinto, Spain. For the science to be successful, it was necessary for the biomass from other sources--whether from previously processed samples (cross contamination) or the terrestrial environment (forward contamination)-to be insignificant. The hardware and practices used in MARTE were designed around this problem. Here, we describe some of the design issues that were faced and classify them into problems that are unique to terrestrial tests versus problems that would also exist for a system that was flown to Mars. Assessment of the biomass at various stages in the sample handling process revealed mixed results; the instrument design seemed to minimize cross contamination, but contamination from the surrounding environment sometimes made its way onto the surface of samples. Techniques used during the MARTE Río Tinto project, such as facing the sample, appear to remove this environmental contamination without introducing significant cross contamination from previous samples. PMID:19105753

  9. Mechanical Alteration And Contamination Issues In Automated Subsurface Sample Acquisition And Handling

    NASA Astrophysics Data System (ADS)

    Glass, B. J.; Cannon, H.; Bonaccorsi, R.; Zacny, K.

    2006-12-01

    The Drilling Automation for Mars Exploration (DAME) project's purpose is to develop and field-test drilling automation and robotics technologies for projected use in missions in the 2011-15 period. DAME includes control of the drilling hardware, and state estimation of the hardware, the lithology being drilled, and the state of the hole. A sister drill was constructed for the Mars Analog Río Tinto Experiment (MARTE) project and demonstrated automated core handling and string changeout in 2005 drilling tests at Rio Tinto, Spain. DAME focused instead on the problem of controlling the drill while actively drilling without getting stuck. Together, the DAME and MARTE projects demonstrate a fully automated robotic drilling capability, including hands-off drilling, adjustment to different strata and downhole conditions, recovery from drilling faults (binding, choking, etc.), drill string changeouts, core acquisition and removal, and sample handling and conveyance to in-situ instruments. The top-level goal of the 2006 DAME in-situ drilling tests was to verify and demonstrate a capability for hands-off automated drilling at an Arctic Mars-analog site. There were three sets of 2006 test goals, all of which were exceeded during the July 2006 field season. The first was to demonstrate the recognition, while drilling, of at least three of the six known major fault modes for the DAME planetary-prototype drill, and to employ the correct recovery or safing procedure in response. The second set of 2006 goals was to operate for three or more hours autonomously, hands-off. The third 2006 goal was to exceed 3 m depth into the frozen breccia and permafrost with the DAME drill (it had not gone deeper than 2.2 m previously). Five of six faults were detected and corrected, there were 43 hours of hands-off drilling (including a 4-hour sequence with no human presence nearby), and the total depth reached was 3.2 m. Ground-truth drilling used small commercial drilling equipment in parallel in

  10. SAMPL4 & DOCK3.7: Lessons for automated docking procedures

    PubMed Central

    Coleman, Ryan G.; Sterling, Teague; Weiss, Dahlia R.

    2014-01-01

    The SAMPL4 challenges were used to test current automated methods for solvation energy, virtual screening, pose and affinity prediction of the molecular docking pipeline DOCK 3.7. Additionally, first-order models of binding affinity were proposed as milestones for any method predicting binding affinity. Several important discoveries about the molecular docking software were made during the challenge: (1) solvation energies of ligands were five-fold worse than those from any other method used in SAMPL4, including methods that were similarly fast; (2) HIV integrase is a challenging target, and automated docking on the correct allosteric site performed well in terms of virtual screening and pose prediction (compared to other methods), but affinity prediction, as expected, was very poor; (3) molecular docking grid sizes can be very important; serious errors were discovered with default settings, which have been adjusted for all future work. Overall, lessons from SAMPL4 suggest many changes to molecular docking tools, not just DOCK 3.7, that could improve the state of the art. Future difficulties and projects will be discussed. PMID:24515818

  11. Automated processing of forensic casework samples using robotic workstations equipped with nondisposable tips: contamination prevention.

    PubMed

    Frégeau, Chantal J; Lett, C Marc; Elliott, Jim; Yensen, Craig; Fourney, Ron M

    2008-05-01

    An automated process has been developed for the analysis of forensic casework samples using TECAN Genesis RSP 150/8 or Freedom EVO liquid handling workstations equipped exclusively with nondisposable tips. Robot tip cleaning routines have been incorporated strategically within the DNA extraction process as well as at the end of each session. Alternative options were examined for cleaning the tips and different strategies were employed to verify cross-contamination. A 2% sodium hypochlorite wash (1/5th dilution of the 10.8% commercial bleach stock) proved to be the best overall approach for preventing cross-contamination of samples processed using our automated protocol. The bleach wash steps do not adversely impact the short tandem repeat (STR) profiles developed from DNA extracted robotically and allow for major cost savings through the implementation of fixed tips. We have demonstrated that robotic workstations equipped with fixed pipette tips can be used with confidence with properly designed tip washing routines to process casework samples using an adapted magnetic bead extraction protocol. PMID:18471209

  12. Automated CBED processing: sample thickness estimation based on analysis of zone-axis CBED pattern.

    PubMed

    Klinger, M; Němec, M; Polívka, L; Gärtnerová, V; Jäger, A

    2015-03-01

    Automated processing of convergent beam electron diffraction (CBED) patterns is presented. The proposed methods are used in an automated tool for estimating the thickness of transmission electron microscopy (TEM) samples by matching an experimental zone-axis CBED pattern with a series of patterns simulated for known thicknesses. The proposed tool detects CBED disks, localizes a pattern in detected disks and unifies the coordinate system of the experimental pattern with the simulated one. The experimental pattern is then compared disk-by-disk with a series of simulated patterns each corresponding to different known thicknesses. The thickness of the most similar simulated pattern is then taken as the thickness estimate. The tool was tested on [0 1 1] Si, [0 1 0] α-Ti and [0 1 1] α-Ti samples prepared using different techniques. Results of the presented approach were compared with thickness estimates based on analysis of CBED patterns under two-beam conditions. The mean difference between these two methods was 4.1% for the FIB-prepared silicon samples, 5.2% for the electro-chemically polished titanium and 7.9% for Ar(+) ion-polished titanium. The proposed techniques can also be employed in other established CBED analyses. Apart from the thickness estimation, it can potentially be used to quantify lattice deformation, structure factors, symmetry, defects or extinction distance. PMID:25544679
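
    A minimal sketch of the disk-by-disk matching step (the similarity score and data layout are assumptions for illustration, not the published implementation):

        import numpy as np

        def ncc(a, b):
            # normalized cross-correlation between two equally sized disk images
            a = (a - a.mean()) / (a.std() + 1e-12)
            b = (b - b.mean()) / (b.std() + 1e-12)
            return float((a * b).mean())

        def estimate_thickness(experimental_disks, simulated_series):
            # experimental_disks: list of 2-D arrays, one per detected CBED disk.
            # simulated_series: dict mapping thickness (nm) -> list of matching disk arrays.
            scores = {t: np.mean([ncc(e, s) for e, s in zip(experimental_disks, disks)])
                      for t, disks in simulated_series.items()}
            return max(scores, key=scores.get)  # thickness of the most similar pattern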

  13. A proposed protocol for remote control of automated assessment devices

    SciTech Connect

    Kissock, P.S.

    1996-09-01

    Systems and devices that are controlled remotely are becoming more common in security systems in the US Air Force and other government agencies to provide protection of valuable assets. These systems reduce the number of needed personnel while still providing a high level of protection. However, each remotely controlled device usually has its own communication protocol. This limits the ability to change devices without changing the system that provides the communications control to the device. Sandia is pursuing a standard protocol that can be used to communicate with the different devices currently in use, or that may be used in the future, in the US Air Force and other government agencies throughout the security community. Devices to be controlled include intelligent pan/tilt mounts, day/night video cameras, thermal imaging cameras, and remote data processors. Important features of this protocol include the ability to send messages of varying length, identify the sender, and, more importantly, control remote data processors. As camera and digital signal processor (DSP) use expands, the DSP will begin to reside in the camera itself. The DSP can be used to provide auto-focus, frame-to-frame image registration, video motion detection (VMD), target detection, tracking, image compression, and many other functions. With the serial data control link, the actual DSP software can be updated or changed as required. Coaxial video cables may become obsolete once a compression algorithm is established in the DSP. This paper describes the proposed public domain protocol, its features, and examples of use. The authors hope to elicit comments from security technology developers regarding the format and use of remotely controlled automated assessment devices. 2 figs., 1 tab.
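
    As an illustration only (the byte layout below is hypothetical and is not the proposed Sandia protocol), a variable-length frame carrying a sender identifier and a command code can be packed with an explicit length field so that receivers can parse messages of differing size:

        import struct

        def pack_message(sender_id: int, command: int, payload: bytes) -> bytes:
            # header: 2-byte payload length, 2-byte sender ID, 1-byte command code
            return struct.pack(">HHB", len(payload), sender_id, command) + payload

        def unpack_message(frame: bytes):
            length, sender_id, command = struct.unpack(">HHB", frame[:5])
            return sender_id, command, frame[5:5 + length]

        # Example: a hypothetical pan/tilt command with a 4-byte argument block.
        frame = pack_message(sender_id=0x0101, command=0x20, payload=b"\x00\x10\x00\x05")
        print(unpack_message(frame))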

  14. Automated DNA extraction platforms offer solutions to challenges of assessing microbial biofouling in oil production facilities.

    PubMed

    Oldham, Athenia L; Drilling, Heather S; Stamps, Blake W; Stevenson, Bradley S; Duncan, Kathleen E

    2012-01-01

    The analysis of microbial assemblages in industrial, marine, and medical systems can inform decisions regarding quality control or mitigation. Modern molecular approaches to detect, characterize, and quantify microorganisms provide rapid and thorough measures unbiased by the need for cultivation. The requirement of timely extraction of high quality nucleic acids for molecular analysis is faced with specific challenges when used to study the influence of microorganisms on oil production. Production facilities are often ill equipped for nucleic acid extraction techniques, making the preservation and transportation of samples off-site a priority. As a potential solution, the possibility of extracting nucleic acids on-site using automated platforms was tested. The performance of two such platforms, the Fujifilm QuickGene-Mini80™ and Promega Maxwell®16 was compared to a widely used manual extraction kit, MOBIO PowerBiofilm™ DNA Isolation Kit, in terms of ease of operation, DNA quality, and microbial community composition. Three pipeline biofilm samples were chosen for these comparisons; two contained crude oil and corrosion products and the third transported seawater. Overall, the two more automated extraction platforms produced higher DNA yields than the manual approach. DNA quality was evaluated for amplification by quantitative PCR (qPCR) and end-point PCR to generate 454 pyrosequencing libraries for 16S rRNA microbial community analysis. Microbial community structure, as assessed by DGGE analysis and pyrosequencing, was comparable among the three extraction methods. Therefore, the use of automated extraction platforms should enhance the feasibility of rapidly evaluating microbial biofouling at remote locations or those with limited resources. PMID:23168231

  15. Automated DNA extraction platforms offer solutions to challenges of assessing microbial biofouling in oil production facilities

    PubMed Central

    2012-01-01

    The analysis of microbial assemblages in industrial, marine, and medical systems can inform decisions regarding quality control or mitigation. Modern molecular approaches to detect, characterize, and quantify microorganisms provide rapid and thorough measures unbiased by the need for cultivation. The requirement of timely extraction of high quality nucleic acids for molecular analysis is faced with specific challenges when used to study the influence of microorganisms on oil production. Production facilities are often ill equipped for nucleic acid extraction techniques, making the preservation and transportation of samples off-site a priority. As a potential solution, the possibility of extracting nucleic acids on-site using automated platforms was tested. The performance of two such platforms, the Fujifilm QuickGene-Mini80™ and Promega Maxwell®16 was compared to a widely used manual extraction kit, MOBIO PowerBiofilm™ DNA Isolation Kit, in terms of ease of operation, DNA quality, and microbial community composition. Three pipeline biofilm samples were chosen for these comparisons; two contained crude oil and corrosion products and the third transported seawater. Overall, the two more automated extraction platforms produced higher DNA yields than the manual approach. DNA quality was evaluated for amplification by quantitative PCR (qPCR) and end-point PCR to generate 454 pyrosequencing libraries for 16S rRNA microbial community analysis. Microbial community structure, as assessed by DGGE analysis and pyrosequencing, was comparable among the three extraction methods. Therefore, the use of automated extraction platforms should enhance the feasibility of rapidly evaluating microbial biofouling at remote locations or those with limited resources. PMID:23168231

  16. Collection, storage, and filtration of in vivo study samples using 96-well filter plates to facilitate automated sample preparation and LC/MS/MS analysis.

    PubMed

    Berna, M; Murphy, A T; Wilken, B; Ackermann, B

    2002-03-01

    The benefits of high-throughput bioanalysis within the pharmaceutical industry are well established. One of the most significant bottlenecks in bioanalysis is transferring in vivo-generated study samples from their collection tubes during sample preparation and extraction. In most cases, the plasma samples must be stored frozen prior to analysis, and the freeze/thaw (F/T) process introduces thrombin clots that are capable of plugging pipets and automated liquid-transfer systems. A new approach to dealing with this problem involves the use of Ansys Captiva 96-well 20-microm polypropylene filter plates to collect, store frozen, and filter plasma samples prior to bioanalysis. The samples are collected from the test subjects, and the corresponding plasma samples are placed directly into the wells of the filter plate. Two Duoseal (patent pending) covers are used to seal the top and bottom of the plate, and the plate is stored at temperatures down to -70 degrees C. Prior to sample analysis, the seals are removed and the plate is placed in a 96-well SPE manifold. As the plasma thaws, it passes (by gravity or mild vacuum) through the polypropylene filter into a 96-well collection plate. A multichannel pipet or automated liquid-transfer system is used to transfer sample aliquots without fear of plugging. A significant advantage of this approach is that, unlike other methods, issues related to incomplete pipetting are virtually eliminated. The entire process is rapid since thawing and filtering take place simultaneously, and if a second F/T cycle is required for reanalysis, it is not necessary to refilter the samples (additional clotting was not observed after three F/T cycles). This technique was tested using monkey, rat, and dog plasma and sodium heparin and EDTA anticoagulants. To assess the possibility of nonspecific binding to the polypropylene filter, a variety of drug candidates from diverse drug classes were studied. Validation data generated for two Lilly compounds from distinct

  17. Automation of high-frequency sampling of environmental waters for reactive species

    NASA Astrophysics Data System (ADS)

    Kim, H.; Bishop, J. K.; Wood, T.; Fung, I.; Fong, M.

    2011-12-01

    Trace metals, particularly iron and manganese, play a critical role in some ecosystems as limiting factors for primary productivity, in geochemistry (especially redox chemistry) as important electron donors and acceptors, and in aquatic environments as carriers of contaminant transport. Dynamics of trace metals are closely related to various hydrologic events such as rainfall. Storm flow triggers dramatic changes in both dissolved and particulate trace metal concentrations and affects other important environmental parameters linked to trace metal behavior such as dissolved organic carbon (DOC). To improve our understanding of the behavior of trace metals and the underlying processes, water chemistry information must be collected for an adequately long period of time at higher frequency than conventional manual sampling (e.g. weekly, biweekly). In this study, we developed an automated sampling system to document the dynamics of trace metals, focusing on Fe and Mn, and DOC for a multiple-year high-frequency geochemistry time series in a small catchment, called Rivendell, located at the Angelo Coast Range Reserve, California. We are sampling groundwater and streamwater with the automated system at daily frequency, and conditions at the site vary substantially from season to season. The pH ranges of groundwater and streamwater are 5-7 and 7.8-8.3, respectively. DOC is usually sub-ppm, but during rain events it increases by an order of magnitude. The automated sampling system focuses on two aspects: (1) a modified sampler design to improve sample integrity for trace metals and DOC, and (2) a remote control system to update sampling volume and timing according to hydrological conditions. To maintain sample integrity, the developed method employed gravity filtering using large-volume syringes (140 mL) and syringe filters connected to a set of polypropylene bottles and a borosilicate bottle via Teflon tubing. Without filtration, in a few days, the

  18. Robowell: An automated process for monitoring ground water quality using established sampling protocols

    USGS Publications Warehouse

    Granato, G.E.; Smith, K.P.

    1999-01-01

    Robowell is an automated process for monitoring selected ground water quality properties and constituents by pumping a well or multilevel sampler. Robowell was developed and tested to provide a cost-effective monitoring system that meets protocols expected for manual sampling. The process uses commercially available electronics, instrumentation, and hardware, so it can be configured to monitor ground water quality using the equipment, purge protocol, and monitoring well design most appropriate for the monitoring site and the contaminants of interest. A Robowell prototype was installed on a sewage treatment plant infiltration bed that overlies a well-studied unconfined sand and gravel aquifer at the Massachusetts Military Reservation, Cape Cod, Massachusetts, during a time when two distinct plumes of constituents were released. The prototype was operated from May 10 to November 13, 1996, and quality-assurance/quality-control measurements demonstrated that the data obtained by the automated method were equivalent to data obtained by manual sampling methods using the same sampling protocols. Water level, specific conductance, pH, water temperature, dissolved oxygen, and dissolved ammonium were monitored by the prototype as the wells were purged according to U.S. Geological Survey (USGS) ground water sampling protocols. Remote access to the data record, via phone modem communications, indicated the arrival of each plume over a few days and the subsequent geochemical reactions over the following weeks. Real-time availability of the monitoring record provided the information needed to initiate manual sampling efforts in response to changes in measured ground water quality, which proved the method and characterized the screened portion of the plume in detail through time. The methods and the case study described are presented to document the process for future use.

  19. Automated high-throughput in vitro screening of the acetylcholine esterase inhibiting potential of environmental samples, mixtures and single compounds.

    PubMed

    Froment, Jean; Thomas, Kevin V; Tollefsen, Knut Erik

    2016-08-01

    A high-throughput and automated assay for testing the presence of acetylcholine esterase (AChE) inhibiting compounds was developed, validated and applied to screen different types of environmental samples. Automation involved using the assay in 96-well plates and adapting it for use with an automated workstation. Validation was performed by comparing the results of the automated assay with those of a previously validated and standardised assay for two known AChE inhibitors (paraoxon and dichlorvos). The results show that the assay provides similar concentration-response curves (CRCs) when run according to the manual and automated protocols. Automation of the assay resulted in a reduction in assay run time as well as in intra- and inter-assay variations. High-quality CRCs were obtained for both of the model AChE inhibitors (dichlorvos IC50 = 120 µM and paraoxon IC50 = 0.56 µM) when tested alone. The effect of co-exposure to an equipotent binary mixture of the two chemicals was consistent with predictions of additivity and was best described by the concentration addition model for combined toxicity. Extracts of different environmental samples (landfill leachate, wastewater treatment plant effluent, and road tunnel construction run-off) were then screened for AChE inhibiting activity using the automated bioassay, with only landfill leachate shown to contain potential AChE inhibitors. Potential uses and limitations of the assay are discussed based on the present results. PMID:27085000
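
    A minimal sketch of deriving an IC50 from a concentration-response curve (the four-parameter logistic model and the data values below are illustrative assumptions, not the study's exact CRC analysis):

        import numpy as np
        from scipy.optimize import curve_fit

        def logistic4(conc, bottom, top, ic50, hill):
            # four-parameter logistic: activity falls from "top" to "bottom" around IC50
            return bottom + (top - bottom) / (1.0 + (conc / ic50) ** hill)

        # Hypothetical data: AChE activity (% of control) vs inhibitor concentration (µM).
        conc = np.array([0.01, 0.1, 1.0, 10.0, 100.0, 1000.0])
        activity = np.array([98.0, 95.0, 83.0, 55.0, 22.0, 6.0])

        params, _ = curve_fit(logistic4, conc, activity, p0=[0.0, 100.0, 10.0, 1.0],
                              bounds=([0.0, 50.0, 1e-3, 0.3], [20.0, 120.0, 1e4, 4.0]))
        print(f"estimated IC50 = {params[2]:.1f} µM")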

  20. Toward Automated Computer-Based Visualization and Assessment of Team-Based Performance

    ERIC Educational Resources Information Center

    Ifenthaler, Dirk

    2014-01-01

    A considerable amount of research has been undertaken to provide insights into the valid assessment of team performance. However, in many settings, manual and therefore labor-intensive assessment instruments for team performance have limitations. Therefore, automated assessment instruments enable more flexible and detailed insights into the…

  1. Automated Generation and Assessment of Autonomous Systems Test Cases

    NASA Technical Reports Server (NTRS)

    Barltrop, Kevin J.; Friberg, Kenneth H.; Horvath, Gregory A.

    2008-01-01

    This slide presentation reviews issues in the verification and validation testing of autonomous spacecraft, which routinely culminates in the exploration of anomalous or faulted mission-like scenarios, using the work involved in the Dawn mission's tests as examples. Prioritizing which scenarios to develop usually comes down to focusing on the most vulnerable areas and ensuring the best return on investment of test time. Rules-of-thumb strategies often come into play, such as injecting applicable anomalies prior to, during, and after system state changes, or creating cases that ensure good safety-net algorithm coverage. Although experience and judgment in test selection can lead to high levels of confidence about the majority of a system's autonomy, it is likely that important test cases are overlooked. One method to fill in potential test coverage gaps is to automatically generate and execute test cases using algorithms that ensure desirable properties about the coverage. For example, cases can be generated for all possible fault monitors and across all state change boundaries. Of course, the scope of coverage is determined by the test environment capabilities, where a faster-than-real-time, high-fidelity, software-only simulation would allow the broadest coverage. Even real-time systems that can be replicated and run in parallel, and that have reliable set-up and operations features, provide an excellent resource for automated testing. Making detailed predictions for the outcome of such tests can be difficult, and when algorithmic means are employed to produce hundreds or even thousands of cases, generating predictions individually is impractical, and generating them with tools requires executable models of the design and environment that themselves require a complete test program. Therefore, evaluating the results of a large number of mission scenario tests poses special challenges. A good approach to address this problem is to automatically score the results
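
    A toy sketch of the coverage-driven generation idea (monitor, transition, and timing names are hypothetical; this is not the Dawn test-generation tooling): enumerate cases as the cross product of fault monitors, state-change boundaries, and injection timing so that every monitor is exercised at every transition.

        from itertools import product

        fault_monitors = ["thruster_stuck", "sensor_dropout", "comm_loss"]
        state_transitions = [("launch", "cruise"), ("cruise", "approach"), ("approach", "orbit")]
        injection_timing = ["before", "during", "after"]

        # one test case per (monitor, transition, timing) combination
        test_cases = [
            {"monitor": m, "transition": t, "inject": when}
            for m, t, when in product(fault_monitors, state_transitions, injection_timing)
        ]
        print(len(test_cases), "generated cases; first:", test_cases[0])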

  2. Harmonization of automated hemolysis index assessment and use: Is it possible?

    PubMed

    Dolci, Alberto; Panteghini, Mauro

    2014-05-15

    The major source of errors producing unreliable laboratory test results is the pre-analytical phase, with hemolysis accounting for approximately half of them and being the leading cause of unsuitable blood specimens. Hemolysis may produce interference in many laboratory tests by a variety of biological and analytical mechanisms. Consequently, laboratories need to systematically detect and reliably quantify hemolysis in every collected sample by means of objective and consistent technical tools that assess sample integrity. This is currently done by automated estimation of the hemolysis index (HI), available on almost all clinical chemistry platforms, making hemolysis detection reliable and reported patient test results more accurate. Despite these advantages, a degree of variability still affects the HI estimate, and more effort should be placed on harmonization of this index. The harmonization of HI results from different analytical systems should be the immediate goal, but the scope of harmonization should go beyond analytical steps to include other aspects, such as HI decision thresholds, criteria for result interpretation and application in clinical practice, as well as report formats. In this regard, relevant issues still to be overcome are the objective definition of a maximum allowable bias for hemolysis interference, based on the clinical application of the measurements, and the management of unsuitable samples. For the latter in particular, a harmonized approach is needed for withholding numerical results from unsuitable samples with significantly increased HI and replacing the test result with a specific comment highlighting hemolysis of the sample. PMID:24513329

  3. Automated Scoring in Context: Rapid Assessment for Placed Students

    ERIC Educational Resources Information Center

    Klobucar, Andrew; Elliot, Norbert; Deess, Perry; Rudniy, Oleksandr; Joshi, Kamal

    2013-01-01

    This study investigated the use of automated essay scoring (AES) to identify at-risk students enrolled in a first-year university writing course. An application of AES, the "Criterion"[R] Online Writing Evaluation Service was evaluated through a methodology focusing on construct modelling, response processes, disaggregation, extrapolation,…

  4. Automating the assessment of citrus canker symptoms with image analysis

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Citrus canker (CC, caused by Xanthomonas citri) is a serious disease of citrus in Florida and other citrus-growing regions. Severity of symptoms can be estimated by visual rating, but there is inter- and intra-rater variation. Automated image analysis (IA) may offer a way of reducing some of ...

  5. Automated negotiation in environmental resource management: Review and assessment.

    PubMed

    Eshragh, Faezeh; Pooyandeh, Majeed; Marceau, Danielle J

    2015-10-01

    Negotiation is an integral part of our daily life and plays an important role in resolving conflicts and facilitating human interactions. Automated negotiation, which aims at capturing the human negotiation process using artificial intelligence and machine learning techniques, is well-established in e-commerce, but its application in environmental resource management remains limited. This is due to the inherent uncertainties and complexity of environmental issues, along with the diversity of stakeholders' perspectives when dealing with these issues. The objective of this paper is to describe the main components of automated negotiation, review and compare machine learning techniques in automated negotiation, and provide a guideline for the selection of suitable methods in the particular context of stakeholders' negotiation over environmental resource issues. We advocate that automated negotiation can facilitate the involvement of stakeholders in the exploration of a plurality of solutions in order to reach a mutually satisfying agreement and contribute to informed decisions in environmental management, while noting the need for further studies to consolidate the potential of this modeling approach. PMID:26241930

  6. Adapting Assessment Procedures for Delivery via an Automated Format.

    ERIC Educational Resources Information Center

    Kelly, Karen L.; And Others

    The Office of Personnel Management (OPM) decided to explore alternative examining procedures for positions covered by the Administrative Careers with America (ACWA) examination. One requirement for new procedures was that they be automated for use with OPM's recently developed Microcomputer Assisted Rating System (MARS), a highly efficient system…

  7. Automated Assessment of Speech Fluency for L2 English Learners

    ERIC Educational Resources Information Center

    Yoon, Su-Youn

    2009-01-01

    This dissertation provides an automated scoring method of speech fluency for second language learners of English (L2 learners) that uses speech recognition technology. Non-standard pronunciation, frequent disfluencies, faulty grammar, and inappropriate lexical choices are crucial characteristics of L2 learners' speech. Due to the ease of…

  8. Fully Automated Laser Ablation Liquid Capture Sample Analysis using NanoElectrospray Ionization Mass Spectrometry

    SciTech Connect

    Lorenz, Matthias; Ovchinnikova, Olga S; Van Berkel, Gary J

    2014-01-01

    RATIONALE: Laser ablation provides for the possibility of sampling a large variety of surfaces with high spatial resolution. This type of sampling, when employed in conjunction with liquid capture followed by nanoelectrospray ionization, provides the opportunity for sensitive and prolonged interrogation of samples by mass spectrometry as well as the ability to analyze surfaces not amenable to direct liquid extraction. METHODS: A fully automated, reflection geometry, laser ablation liquid capture spot sampling system was achieved by incorporating appropriate laser fiber optics and a focusing lens into a commercially available, liquid extraction surface analysis (LESA)-ready Advion TriVersa NanoMate system. RESULTS: Under optimized conditions about 10% of laser-ablated material could be captured in a droplet positioned vertically over the ablation region using the NanoMate robot-controlled pipette. The sampling spot area with this laser ablation liquid capture surface analysis (LA/LCSA) mode of operation (typically about 120 µm x 160 µm) was approximately 50 times smaller than that achievable by direct liquid extraction using LESA (ca. 1 mm diameter liquid extraction spot). The set-up was successfully applied for the analysis of ink on glass and paper as well as the endogenous components in Alstroemeria Yellow King flower petals. In a second mode of operation with a comparable sampling spot size, termed laser ablation/LESA, the laser system was used to drill through, penetrate, or otherwise expose material beneath a solvent-resistant surface. Once drilled, LESA was effective in sampling soluble material exposed at that location on the surface. CONCLUSIONS: Incorporating the capability for different laser ablation liquid capture spot sampling modes of operation into a LESA-ready Advion TriVersa NanoMate enhanced the spot sampling spatial resolution of this device and broadened the surface types amenable to analysis to include absorbent and solvent resistant

  9. Microbiological monitoring and automated event sampling at karst springs using LEO-satellites.

    PubMed

    Stadler, H; Skritek, P; Sommer, R; Mach, R L; Zerobin, W; Farnleitner, A H

    2008-01-01

    Data communication via Low-Earth-Orbit (LEO) Satellites between portable hydrometeorological measuring stations is the backbone of our system. This networking allows automated event sampling with short time increments also for E. coli field analysis. All activities of the course of the event-sampling can be observed on an internet platform based on a Linux-Server. Conventionally taken samples, compared with the auto-sampling procedure, revealed corresponding results and were in agreement with the ISO 9308-1 reference method. E. coli concentrations were individually corrected by event-specific inactivation coefficients (0.10-0.14 day^-1), compensating for losses due to sample storage at spring temperature in the auto sampler. Two large summer events in 2005/2006 at an important alpine karst spring (LKAS2) were monitored including detailed analysis of E. coli dynamics (n = 271) together with comprehensive hydrological characterisations. High-resolution time series demonstrated a sudden increase of E. coli concentrations in spring water (approximately 2 log10 units) with a specific time delay after the beginning of the event. Statistical analysis suggested the spectral absorption coefficient measured at 254 nm (SAC254) as an early warning surrogate for real-time monitoring of faecal input. Together with the LEO-satellite based system, it is a helpful tool for early-warning systems in the field of drinking water protection. PMID:18776628
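
    A minimal sketch of the storage correction (the first-order decay form is an assumption consistent with the reported event-specific inactivation coefficients of 0.10-0.14 day^-1):

        import math

        def corrected_concentration(measured_count, storage_days, k_per_day=0.12):
            # first-order inactivation: C_measured = C_true * exp(-k * t)
            # => C_true = C_measured * exp(k * t)
            return measured_count * math.exp(k_per_day * storage_days)

        # Example: an E. coli count from a sample held two days in the auto sampler.
        print(corrected_concentration(450, storage_days=2))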

  10. Automated MALDI matrix coating system for multiple tissue samples for imaging mass spectrometry.

    PubMed

    Mounfield, William P; Garrett, Timothy J

    2012-03-01

    Uniform matrix deposition on tissue samples for matrix-assisted laser desorption/ionization (MALDI) is key for reproducible analyte ion signals. Current methods often result in nonhomogenous matrix deposition, and take time and effort to produce acceptable ion signals. Here we describe a fully-automated method for matrix deposition using an enclosed spray chamber and spray nozzle for matrix solution delivery. A commercial air-atomizing spray nozzle was modified and combined with solenoid controlled valves and a Programmable Logic Controller (PLC) to control and deliver the matrix solution. A spray chamber was employed to contain the nozzle, sample, and atomized matrix solution stream, and to prevent any interference from outside conditions as well as allow complete control of the sample environment. A gravity cup was filled with MALDI matrix solutions, including DHB in chloroform/methanol (50:50) at concentrations up to 60 mg/mL. Various samples (including rat brain tissue sections) were prepared using two deposition methods (spray chamber, inkjet). A linear ion trap equipped with an intermediate-pressure MALDI source was used for analyses. Optical microscopic examination showed a uniform coating of matrix crystals across the sample. Overall, the mass spectral images gathered from tissues coated using the spray chamber system were of better quality and more reproducible than from tissue specimens prepared by the inkjet deposition method. PMID:22234508

  11. MICROBIOLOGICAL MONITORING AND AUTOMATED EVENT SAMPLING AT KARST SPRINGS USING LEO-SATELLITES

    PubMed Central

    Stadler, Hermann; Skritek, Paul; Sommer, Regina; Mach, Robert L.; Zerobin, Wolfgang; Farnleitner, Andreas H.

    2010-01-01

    Data communication via Low-Earth-Orbit Satellites between portable hydro-meteorological measuring stations is the backbone of our system. This networking allows automated event sampling with short time increments also for E. coli field analysis. All activities of the course of the event-sampling can be observed on an internet platform based on a Linux-Server. Samples taken conventionally by hand, compared with the auto-sampling procedure, revealed corresponding results and were in agreement with the ISO 9308-1 reference method. E. coli concentrations were individually corrected by event-specific die-off rates (0.10–0.14 day^-1), compensating for losses due to sample storage at spring temperature in the auto sampler. Two large summer events in 2005/2006 at a large alpine karst spring (LKAS2) were monitored including detailed analysis of E. coli dynamics (n = 271) together with comprehensive hydrological characterisations. High-resolution time series demonstrated a sudden increase of E. coli concentrations in spring water (approx. 2 log10 units) with a specific time delay after the beginning of the event. Statistical analysis suggested the spectral absorption coefficient measured at 254 nm (SAC254) as an early warning surrogate for real-time monitoring of faecal input. Together with the LEO-satellite-based system it is a helpful tool for early-warning systems in the field of drinking water protection. PMID:18776628

  12. Automated MALDI Matrix Coating System for Multiple Tissue Samples for Imaging Mass Spectrometry

    NASA Astrophysics Data System (ADS)

    Mounfield, William P.; Garrett, Timothy J.

    2012-03-01

    Uniform matrix deposition on tissue samples for matrix-assisted laser desorption/ionization (MALDI) is key for reproducible analyte ion signals. Current methods often result in nonhomogenous matrix deposition, and take time and effort to produce acceptable ion signals. Here we describe a fully-automated method for matrix deposition using an enclosed spray chamber and spray nozzle for matrix solution delivery. A commercial air-atomizing spray nozzle was modified and combined with solenoid controlled valves and a Programmable Logic Controller (PLC) to control and deliver the matrix solution. A spray chamber was employed to contain the nozzle, sample, and atomized matrix solution stream, and to prevent any interference from outside conditions as well as allow complete control of the sample environment. A gravity cup was filled with MALDI matrix solutions, including DHB in chloroform/methanol (50:50) at concentrations up to 60 mg/mL. Various samples (including rat brain tissue sections) were prepared using two deposition methods (spray chamber, inkjet). A linear ion trap equipped with an intermediate-pressure MALDI source was used for analyses. Optical microscopic examination showed a uniform coating of matrix crystals across the sample. Overall, the mass spectral images gathered from tissues coated using the spray chamber system were of better quality and more reproducible than from tissue specimens prepared by the inkjet deposition method.

  13. Automated sample preparation in a microfluidic culture device for cellular metabolomics.

    PubMed

    Filla, Laura A; Sanders, Katherine L; Filla, Robert T; Edwards, James L

    2016-06-21

    Sample pretreatment in conventional cellular metabolomics entails rigorous lysis and extraction steps which increase the duration as well as limit the consistency of these experiments. We report a biomimetic cell culture microfluidic device (MFD) which is coupled with an automated system for rapid, reproducible cell lysis using a combination of electrical and chemical mechanisms. In-channel microelectrodes were created using facile fabrication methods, enabling the application of electric fields up to 1000 V cm^-1. Using this platform, average lysing times were 7.12 s and 3.03 s for chips with no electric fields and electric fields above 200 V cm^-1, respectively. Overall, the electroporation MFDs yielded a ∼10-fold improvement in lysing time over standard chemical approaches. Detection of multiple intracellular nucleotides and energy metabolites in MFD lysates was demonstrated using two different MS platforms. This work will allow for the integrated culture, automated lysis, and metabolic analysis of cells in an MFD which doubles as a biomimetic model of the vasculature. PMID:27118418

  14. Assessing bat detectability and occupancy with multiple automated echolocation detectors

    USGS Publications Warehouse

    Gorresen, P.M.; Miles, A.C.; Todd, C.M.; Bonaccorso, F.J.; Weller, T.J.

    2008-01-01

    Occupancy analysis and its ability to account for differential detection probabilities is important for studies in which detecting echolocation calls is used as a measure of bat occurrence and activity. We examined the feasibility of remotely acquiring bat encounter histories to estimate detection probability and occupancy. We used echolocation detectors coupled to digital recorders operating at a series of proximate sites on consecutive nights in 2 trial surveys for the Hawaiian hoary bat (Lasiurus cinereus semotus). Our results confirmed that the technique is readily amenable for use in occupancy analysis. We also conducted a simulation exercise to assess the effects of sampling effort on parameter estimation. The results indicated that the precision and bias of parameter estimation were often more influenced by the number of sites sampled than the number of visits. Acceptable accuracy often was not attained until at least 15 sites or 15 visits were used to estimate detection probability and occupancy. The method has significant potential for use in monitoring trends in bat activity and in comparative studies of habitat use. © 2008 American Society of Mammalogists.
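
    A minimal sketch of the underlying single-season occupancy likelihood (the standard MacKenzie-type formulation with constant detection probability; not the authors' code, and the example encounter histories are invented):

        import numpy as np

        def site_likelihood(history, psi, p):
            # history: 0/1 detections over repeated visits to one site.
            h = np.asarray(history)
            detected = (p ** h * (1 - p) ** (1 - h)).prod()  # P(history | occupied)
            if h.any():
                return psi * detected                        # at least one detection
            return psi * detected + (1 - psi)                # all-zero: missed or truly absent

        def neg_log_likelihood(params, histories):
            psi, p = params
            return -sum(np.log(site_likelihood(h, psi, p)) for h in histories)

        # Example: three sites, four visits each; minimize over (psi, p)
        # (e.g., with scipy.optimize.minimize) to estimate occupancy and detectability.
        histories = [[0, 1, 0, 1], [0, 0, 0, 0], [1, 1, 0, 1]]
        print(neg_log_likelihood([0.6, 0.4], histories))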

  15. Development of an Automated Security Risk Assessment Methodology Tool for Critical Infrastructures.

    SciTech Connect

    Jaeger, Calvin D.; Roehrig, Nathaniel S.; Torres, Teresa M.

    2008-12-01

    This document presents the automated security Risk Assessment Methodology (RAM) prototype tool developed by Sandia National Laboratories (SNL). This work leverages SNL's capabilities and skills in security risk analysis and the development of vulnerability assessment/risk assessment methodologies to develop an automated prototype security RAM tool for critical infrastructures (RAM-CI). The prototype automated RAM tool provides a user-friendly, systematic, and comprehensive risk-based approach to assist CI sector and security professionals in assessing and managing security risk from malevolent threats. The current tool is structured on the basic RAM framework developed by SNL. It is envisioned that this prototype tool will be adapted to meet the requirements of different CI sectors and thereby provide additional capabilities.

  16. Design and Development of a Robot-Based Automation System for Cryogenic Crystal Sample Mounting at the Advanced Photon Source

    SciTech Connect

    Shu, D.; Preissner, C.; Nocher, D.; Han, Y.; Barraza, J.; Lee, P.; Lee, W.-K.; Cai, Z.; Ginell, S.; Alkire, R.; Lazarski, K.; Schuessler, R.; Joachimiak, A.

    2004-05-12

    X-ray crystallography is the primary method to determine the 3D structures of complex macromolecules at high resolution. In the years to come, the Advanced Photon Source (APS) and similar 3rd-generation synchrotron sources elsewhere will become the most powerful tools for studying atomic structures of biological molecules. One of the major bottlenecks in the x-ray data collection process is the constant need to change and realign the crystal sample. This is a very time- and manpower-consuming task. An automated sample mounting system will help to solve this bottleneck problem. We have developed a novel robot-based automation system for cryogenic crystal sample mounting at the APS. Design of the robot-based automation system, as well as its on-line test results at the Argonne Structural Biology Center (SBC) 19-BM experimental station, are presented in this paper.

  17. To the development of an automated system of assessment of radiological images of joints

    NASA Astrophysics Data System (ADS)

    Grechikhin, A. I.; Grunina, E. A.; Karetnikova, I. R.

    2008-03-01

    An algorithm developed for the adaptive automated computer processing of radiological images of hands and feet, in order to assess the degree of bone and cartilage destruction in rheumatoid arthritis, is described. A set of new numerical indicators was proposed to assess the degree of radiological progression of arthritis.

  18. Automated Formative Assessment as a Tool to Scaffold Student Documentary Writing

    ERIC Educational Resources Information Center

    Ferster, Bill; Hammond, Thomas C.; Alexander, R. Curby; Lyman, Hunt

    2012-01-01

    The hurried pace of the modern classroom does not permit formative feedback on writing assignments at the frequency or quality recommended by the research literature. One solution for increasing individual feedback to students is to incorporate some form of computer-generated assessment. This study explores the use of automated assessment of…

  19. ADDING GLOBAL SOILS DATA TO THE AUTOMATED GEOSPATIAL WATERSHED ASSESSMENT TOOL (AGWA)

    EPA Science Inventory

    The Automated Geospatial Watershed Assessment Tool (AGWA) is a GIS-based hydrologic modeling tool that is available as an extension for ArcView 3.x from the USDA-ARS Southwest Watershed Research Center (www.tucson.ars.ag.gov/agwa). AGWA is designed to facilitate the assessment of...

  20. Development and verification of an automated sample processing protocol for quantitation of human immunodeficiency virus type 1 RNA in plasma.

    PubMed

    Lee, Brenda G; Fiebelkorn, Kristin R; Caliendo, Angela M; Nolte, Frederick S

    2003-05-01

    We developed and verified an automated sample processing protocol for use with the AMPLICOR HIV-1 MONITOR test, version 1.5 (Roche Diagnostics, Indianapolis, Ind.). The automated method uses the MagNA Pure LC instrument and total nucleic acid reagents (Roche Applied Science, Indianapolis, Ind.) to extract human immunodeficiency virus type 1 (HIV-1) RNA from plasma specimens. We compared the HIV-1 load results for a dilution series (1 to 5 nominal log10 copies/ml) and 175 clinical specimens processed by the automated method to those for the same samples processed by the manual methods specified by the manufacturer. The sensitivity, dynamic range, and precision of the viral load assay obtained by automated processing of specimens were similar to those obtained by an ultrasensitive manual processing method. The results were highly correlated (R^2 = 0.95) and were in close agreement, with a mean difference of 0.09 log10 (standard deviation, 0.292). The limits of agreement were ±0.58 log10 for results from samples processed by both the manual and the automated methods. These performance characteristics were achieved with a smaller sample volume (200 versus 500 microliters) and without a high-speed centrifugation step, and required only 15 min of labor for a batch of 32 samples. In conclusion, the automated sample preparation protocol can replace both the standard and the ultrasensitive manual methods used with the AMPLICOR HIV-1 MONITOR test and can substantially reduce the labor associated with this test. PMID:12734249
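
    A minimal sketch of the agreement statistics reported (bias, standard deviation of the paired differences, and limits of agreement); the paired values below are invented and the exact limits formula used in the study is an assumption:

        import numpy as np

        def limits_of_agreement(auto_log10, manual_log10):
            d = np.asarray(auto_log10) - np.asarray(manual_log10)
            bias, sd = d.mean(), d.std(ddof=1)
            return bias, sd, (bias - 1.96 * sd, bias + 1.96 * sd)

        # Hypothetical paired viral loads (log10 copies/ml), automated vs manual processing.
        automated = [2.1, 3.4, 4.0, 4.9, 5.2]
        manual = [2.0, 3.5, 3.8, 5.0, 5.1]
        print(limits_of_agreement(automated, manual))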

  1. Automated system for sampling, counting, and biological analysis of rotifer populations

    PubMed Central

    Stelzer, Claus-Peter

    2010-01-01

    Zooplankton organisms with short generation times, such as rotifers, are ideal models to study general ecological and evolutionary questions on the population level, because meaningful experiments can often be completed within a couple of weeks. Yet biological analysis of such populations is often extremely time consuming, owing to abundance estimation by counting, measuring body size, or determining the investment into sexual versus asexual reproduction. An automated system for sampling and analyzing experimental rotifer populations is described. It relies on image analysis of digital photographs taken from subsamples of the culture. The system works completely autonomously for up to several weeks and can sample up to 12 cultures at time intervals down to a few hours. It allows quantitative analysis of female population density at a precision equivalent to that of conventional methods (i.e., manual counts of samples fixed in Lugol solution), and it can also recognize males, which allows detecting temporal variation of sexual reproduction in such cultures. Another parameter that can be automatically measured with the image analysis system is female body size. This feature may be useful for studies of population productivity and/or in competition experiments with clones of different body size. In this article, I describe the basic setup of the system and tests on the efficiency of data collection, and show some example data sets on the population dynamics of different strains of the rotifer Brachionus calyciflorus. PMID:21151824

  2. Use of an automated sampler to assess bovine adrenal hormone response to transportation

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Automated blood sampling would aid characterization of acute endocrine responses to transportation procedures. In this study, the IceSampler™ device was programmed to collect blood samples via jugular catheter from the herd's 7 calmest (C; temperament score=0.84±0.03) and 8 most temperamental (T; te...

  3. Renewable Microcolumns for Automated DNA Purification and Flow-through Amplification: From Sediment Samples through Polymerase Chain Reaction

    SciTech Connect

    Bruckner-Lea, Cindy J. ); Tsukuda, Toyoko ); Dockendorff, Brian P. ); Follansbee, James C. ); Kingsley, Mark T. ); Ocampo, Catherine O.; Stults, Jennie R.; Chandler, Darrell P.

    2001-12-01

    There is an increasing need for field-portable systems for the detection and characterization of microorganisms in the environment. Nucleic acids analysis is frequently the method of choice for discriminating between bacteria in complex systems, but standard protocols are difficult to automate and current microfluidic devices are not configured specifically for environmental sample analysis. In this report, we describe the development of an integrated DNA purification and PCR amplification system and demonstrate its use for the automated purification and amplification of Geobacter chapelli DNA (genomic DNA or plasmid targets) from sediments. The system includes renewable separation columns for the automated capture and release of microparticle purification matrices, and can be easily reprogrammed for new separation chemistries and sample types. The DNA extraction efficiency for the automated system ranged from 3 to 25 percent, depending on the length and concentration of the DNA target. The system was more efficient than batch capture methods for the recovery of dilute genomic DNA, even though the reagent volumes were smaller than required for the batch procedure. The automated DNA concentration and purification module was coupled to a flow-through, Peltier-controlled DNA amplification chamber, and used to successfully purify and amplify genomic and plasmid DNA from sediment extracts. Cleaning protocols were also developed to allow reuse of the integrated sample preparation system, including the flow-through PCR tube.

  4. Erratum to: Automated Sample Preparation Method for Suspension Arrays using Renewable Surface Separations with Multiplexed Flow Cytometry Fluorescence Detection

    SciTech Connect

    Grate, Jay W.; Bruckner-Lea, Cindy J.; Jarrell, Ann E.; Chandler, Darrell P.

    2003-04-10

    In this paper we describe a new method of automated sample preparation for multiplexed biological analysis systems that use flow cytometry fluorescence detection. In this approach, color-encoded microspheres derivatized to capture particular biomolecules are temporarily trapped in a renewable surface separation column to enable perfusion with sample and reagents prior to delivery to the detector. This method provides for separation of the biomolecules of interest from other sample matrix components as well as from labeling solutions.

  5. The Upgrade Programme for the Structural Biology beamlines at the European Synchrotron Radiation Facility - High throughput sample evaluation and automation

    NASA Astrophysics Data System (ADS)

    Theveneau, P.; Baker, R.; Barrett, R.; Beteva, A.; Bowler, M. W.; Carpentier, P.; Caserotto, H.; de Sanctis, D.; Dobias, F.; Flot, D.; Guijarro, M.; Giraud, T.; Lentini, M.; Leonard, G. A.; Mattenet, M.; McCarthy, A. A.; McSweeney, S. M.; Morawe, C.; Nanao, M.; Nurizzo, D.; Ohlsson, S.; Pernot, P.; Popov, A. N.; Round, A.; Royant, A.; Schmid, W.; Snigirev, A.; Surr, J.; Mueller-Dieckmann, C.

    2013-03-01

    Automation and advances in technology are the key elements in addressing the steadily increasing complexity of Macromolecular Crystallography (MX) experiments. Much of this complexity is due to the inter- and intra-crystal heterogeneity in diffraction quality often observed for crystals of multi-component macromolecular assemblies or membrane proteins. Such heterogeneity makes high-throughput sample evaluation an important and necessary tool for increasing the chances of a successful structure determination. The introduction at the ESRF of automatic sample changers in 2005 dramatically increased the number of samples that were tested for diffraction quality. This "first generation" of automation, coupled with advances in software aimed at optimising data collection strategies in MX, resulted in a three-fold increase in the number of crystal structures elucidated per year using data collected at the ESRF. In addition, sample evaluation can be further complemented using small angle scattering experiments on the newly constructed bioSAXS facility on BM29 and the micro-spectroscopy facility (ID29S). The construction of a second generation of automated facilities on the MASSIF (Massively Automated Sample Screening Integrated Facility) beamlines will build on these advances and should provide a paradigm shift in how MX experiments are carried out, which will benefit the entire Structural Biology community.

  6. Automation and integration of multiplexed on-line sample preparation with capillary electrophoresis for DNA sequencing

    SciTech Connect

    Tan, H.

    1999-03-31

    The purpose of this research is to develop a multiplexed sample processing system in conjunction with multiplexed capillary electrophoresis for high-throughput DNA sequencing. The concept from DNA template to called bases was first demonstrated with a manually operated single capillary system. Later, an automated microfluidic system with 8 channels based on the same principle was successfully constructed. The instrument automatically processes 8 templates through reaction, purification, denaturation, pre-concentration, injection, separation and detection in a parallel fashion. A multiplexed freeze/thaw switching principle and a distribution network were implemented to manage flow direction and sample transportation. Dye-labeled terminator cycle-sequencing reactions are performed in an 8-capillary array in a hot air thermal cycler. Subsequently, the sequencing ladders are directly loaded into a corresponding size-exclusion chromatographic column operated at ~60 °C for purification. On-line denaturation and stacking injection for capillary electrophoresis are simultaneously accomplished at a cross assembly set at ~70 °C. Not only the separation capillary array but also the reaction capillary array and purification columns can be regenerated after every run. DNA sequencing data from this system allow base calling up to 460 bases with an accuracy of 98%.

  7. Analysis of zearalenone in cereal and Swine feed samples using an automated flow-through immunosensor.

    PubMed

    Urraca, Javier L; Benito-Peña, Elena; Pérez-Conde, Concepción; Moreno-Bondi, María C; Pestka, James J

    2005-05-01

    The development of a sensitive flow-through immunosensor for the analysis of the mycotoxin zearalenone in cereal samples is described. The sensor was completely automated and was based on a direct competitive immunosorbent assay and fluorescence detection. The mycotoxin competes with a horseradish-peroxidase-labeled derivative for the binding sites of a rabbit polyclonal antibody. Controlled-pore glass covalently bound to protein A was used for the oriented immobilization of the antibody-antigen immunocomplexes. The immunosensor shows an IC50 value of 0.087 ng mL⁻¹ (RSD = 2.8%, n = 6) and a dynamic range from 0.019 to 0.422 ng mL⁻¹. The limit of detection (90% of blank signal) of 0.007 ng mL⁻¹ (RSD = 3.9%, n = 3) is lower than those of previously published methods. Corn, wheat, and swine feed samples have been analyzed with the device after extraction of the analyte using accelerated solvent extraction (ASE). The immunosensor has been validated using a corn certified reference material and HPLC with fluorescence detection. PMID:15853369
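
    The IC50 and dynamic range quoted above come from the calibration curve of a competitive assay, which is commonly modeled with a four-parameter logistic function. A minimal sketch of such a fit with made-up calibration points (not the published data):

```python
import numpy as np
from scipy.optimize import curve_fit

def four_pl(conc, top, bottom, ic50, slope):
    """Four-parameter logistic: signal falls as the analyte displaces the tracer."""
    return bottom + (top - bottom) / (1.0 + (conc / ic50) ** slope)

# Hypothetical calibration data: concentration (ng/mL) vs. normalized signal.
conc = np.array([0.01, 0.03, 0.1, 0.3, 1.0, 3.0])
signal = np.array([0.98, 0.90, 0.55, 0.25, 0.10, 0.05])

(top, bottom, ic50, slope), _ = curve_fit(four_pl, conc, signal,
                                          p0=[1.0, 0.0, 0.1, 1.0])
print(f"IC50 = {ic50:.3f} ng/mL")
```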

  8. Plasma cortisol and noradrenalin concentrations in pigs: automated sampling of freely moving pigs housed in PigTurn versus manually sampled and restrained pigs

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Minimizing the effects of restraint and human interaction on the endocrine physiology of animals is essential for collection of accurate physiological measurements. Our objective was to compare stress-induced cortisol (CORT) and noradrenalin (NorA) responses in automated versus manual blood sampling...

  9. Plasma cortisol and norepinephrine concentrations in pigs: automated sampling of freely moving pigs housed in the PigTurn versus manually sampled and restrained pigs

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Minimizing effects of restraint and human interaction on the endocrine physiology of animals is essential for collection of accurate physiological measurements. Our objective was to compare stress-induced cortisol (CORT) and norepinephrine (NE) responses in automated versus manual blood sampling. A ...

  10. An automated maze task for assessing hippocampus-sensitive memory in mice

    PubMed Central

    Pioli, Elsa Y.; Gaskill, Brianna N.; Gilmour, Gary; Tricklebank, Mark D.; Dix, Sophie L.; Bannerman, David; Garner, Joseph P.

    2014-01-01

    Memory deficits associated with hippocampal dysfunction are a key feature of a number of neurodegenerative and psychiatric disorders. The discrete-trial rewarded alternation T-maze task is highly sensitive to hippocampal dysfunction. Normal mice have spontaneously high levels of alternation, whereas hippocampal-lesioned mice are dramatically impaired. However, this is a hand-run task and handling has been shown to impact crucially on behavioural responses, as well as being labour-intensive and therefore unsuitable for high-throughput studies. To overcome this, a fully automated maze was designed. The maze was attached to the mouse's home cage and the subject earned all of its food by running through the maze. In this study the hippocampal dependence of rewarded alternation in the automated maze was assessed. Bilateral hippocampal-lesioned mice were assessed in the standard, hand-run, discrete-trial rewarded alternation paradigm and in the automated paradigm, according to a cross-over design. A similarly robust lesion effect on alternation performance was found in both mazes, confirming the sensitivity of the automated maze to hippocampal lesions. Moreover, the performance of the animals in the automated maze was not affected by their handling history whereas performance in the hand-run maze was affected by prior testing history. By having more stable performance and by decreasing human contact the automated maze may offer opportunities to reduce extraneous experimental variation and therefore increase the reproducibility within and/or between laboratories. Furthermore, automation potentially allows for greater experimental throughput and hence suitability for use in assessment of cognitive function in drug discovery. PMID:24333574

  11. Rapid magnetic bead based sample preparation for automated and high throughput N-glycan analysis of therapeutic antibodies.

    PubMed

    Váradi, Csaba; Lew, Clarence; Guttman, András

    2014-06-17

    Full automation to enable high-throughput N-glycosylation profiling and sequencing with good reproducibility is vital to fulfill the contemporary needs of the biopharmaceutical industry and the requirements of national regulatory agencies. The most prevalently used glycoanalytical methods of capillary electrophoresis and hydrophilic interaction liquid chromatography, while very efficient, both necessitate extensive sample preparation and cleanup, including glycoprotein capture, N-glycan release, fluorescent derivatization, purification, and preconcentration steps. Currently used protocols for these tasks require multiple centrifugation and vacuum-centrifugation steps, making automated sample preparation with liquid-handling robots difficult and expensive. In this paper we report on a rapid magnetic bead based sample preparation approach that enables full automation of all the process phases in just a couple of hours, without requiring any centrifugation and/or vacuum-centrifugation steps. This novel protocol has been compared to conventional glycan sample preparation strategies using standard glycoproteins (IgG, fetuin, and RNase B) and featured rapid processing time, high release and labeling efficiency, good reproducibility, and the potential for easy automation. PMID:24909945

  12. AST: An Automated Sequence-Sampling Method for Improving the Taxonomic Diversity of Gene Phylogenetic Trees

    PubMed Central

    Zhou, Chan; Mao, Fenglou; Yin, Yanbin; Huang, Jinling; Gogarten, Johann Peter; Xu, Ying

    2014-01-01

    A challenge in phylogenetic inference of gene trees is how to properly sample a large pool of homologous sequences to derive a good representative subset of sequences. Such a need arises in various applications, e.g., when (1) accuracy-oriented phylogenetic reconstruction methods may not be able to deal with a large pool of sequences due to their high demand on computing resources; (2) applications analyzing a collection of gene trees may prefer to use trees with fewer operational taxonomic units (OTUs), for instance for the detection of horizontal gene transfer events by identifying phylogenetic conflicts; and (3) the pool of available sequences is biased towards extensively studied species. In the past, the creation of subsamples often relied on manual selection. Here we present an Automated sequence-Sampling method for improving the Taxonomic diversity of gene phylogenetic trees, AST, to obtain representative sequences that maximize the taxonomic diversity of the sampled sequences. To demonstrate the effectiveness of AST, we have tested it on four problems, namely, inference of the evolutionary histories of the small ribosomal subunit protein S5 of E. coli, 16S ribosomal RNAs and glycosyltransferase gene family 8, and a study of ancient horizontal gene transfers from bacteria to plants. Our results show that the resolution of our computational results is almost as good as that of manual inference by domain experts, hence making the tool generally useful to phylogenetic studies by non-phylogeny specialists. The program is available at http://csbl.bmb.uga.edu/~zhouchan/AST.php. PMID:24892935
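
    The core idea of picking a subset of homologous sequences that maximizes taxonomic diversity can be illustrated with a greedy selection over taxonomic lineages. This is only a simplified sketch, not the published AST algorithm, and the lineage representation is assumed:

```python
def select_diverse(sequence_ids, lineages, k):
    """Greedily choose up to k sequence IDs that maximize taxonomic coverage.

    lineages maps a sequence ID to a tuple of taxonomic ranks, e.g.
    ("Bacteria", "Proteobacteria", ..., "Escherichia coli").
    """
    chosen, covered = [], set()
    remaining = set(sequence_ids)
    while remaining and len(chosen) < k:
        # Pick the sequence that adds the most not-yet-covered taxa.
        best = max(remaining, key=lambda s: len(set(lineages[s]) - covered))
        if not set(lineages[best]) - covered:
            break                        # nothing new left to add
        chosen.append(best)
        covered |= set(lineages[best])
        remaining.remove(best)
    return chosen
```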

  13. Assessment of Automated Disease Detection in Diabetic Retinopathy Screening Using Two-Field Photography

    PubMed Central

    Goatman, Keith; Charnley, Amanda; Webster, Laura; Nussey, Stephen

    2011-01-01

    Aim To assess the performance of automated disease detection in diabetic retinopathy screening using two-field mydriatic photography. Methods Images from 8,271 sequential patient screening episodes from a South London diabetic retinopathy screening service were processed by the Medalytix iGrading™ automated grading system. For each screening episode macular-centred and disc-centred images of both eyes were acquired and independently graded according to the English national grading scheme. Where discrepancies were found between the automated result and the original manual grade, internal and external arbitration was used to determine the final study grades. Two versions of the software were used: one that detected microaneurysms alone, and one that detected blot haemorrhages and exudates in addition to microaneurysms. Results for each version were calculated once using both fields and once using the macula-centred field alone. Results Of the 8,271 episodes, 346 (4.2%) were considered unassessable. Referable disease was detected in 587 episodes (7.1%). The sensitivity of the automated system for detecting unassessable images ranged from 97.4% to 99.1% depending on configuration. The sensitivity of the automated system for referable episodes ranged from 98.3% to 99.3%. All the episodes that included proliferative or pre-proliferative retinopathy were detected by the automated system regardless of configuration (192/192, 95% confidence interval 98.0% to 100%). If implemented as the first step in grading, the automated system would have reduced the manual grading effort by between 2,183 and 3,147 patient episodes (26.4% to 38.1%). Conclusion Automated grading can safely reduce the workload of manual grading using two-field mydriatic photography in a routine screening service. PMID:22174741
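
    The workload-reduction and sensitivity figures quoted above follow directly from the episode counts. A quick arithmetic check:

```python
# Workload reduction: episodes graded automatically instead of manually.
total_episodes = 8271
for reduced in (2183, 3147):
    print(f"{reduced} episodes = {100 * reduced / total_episodes:.1f}% of the workload")

# Sensitivity for the proliferative/pre-proliferative subset (all detected).
detected, total = 192, 192
print(f"sensitivity = {100 * detected / total:.1f}%")
```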

  14. Computerized Analytical Data Management System and Automated Analytical Sample Transfer System at the COGEMA Reprocessing Plants in La Hague

    SciTech Connect

    Flament, T.; Goasmat, F.; Poilane, F.

    2002-02-25

    Managing the operation of large commercial spent nuclear fuel reprocessing plants, such as UP3 and UP2-800 in La Hague, France, requires an extensive analytical program and the shortest possible analysis response times. COGEMA, together with its engineering subsidiary SGN, decided to build high-performance laboratories to support operations in its plants. These laboratories feature automated equipment, safe environments for operators, and short response times, all in centralized installations. Implementation of a computerized analytical data management system and a fully automated pneumatic system for the transfer of radioactive samples was a key factor contributing to the successful operation of the laboratories and plants.

  15. ENVIRONMENTAL ASSESSMENT SAMPLING AND ANALYTICAL STRATEGY PROGRAM

    EPA Science Inventory

    The report describes a costing methodology for environmental assessment that has been generated for industrial processes at various phases of development. The demonstrated environmental assessment strategies provide a framework for determining industry, process, and stream priori...

  16. High-Throughput Serum 25-Hydroxy Vitamin D Testing with Automated Sample Preparation.

    PubMed

    Stone, Judy

    2016-01-01

    Serum from bar-coded tubes, and then internal standard, are pipetted to 96-well plates with an 8-channel automated liquid handler (ALH). The first precipitation reagent (methanol:ZnSO4) is added and mixed with the 8-channel ALH. A second protein precipitating agent, 1% formic acid in acetonitrile, is added and mixed with a 96-channel ALH. After a 4-min delay for larger precipitates to settle to the bottom of the plate, the upper 36% of the precipitate/supernatant mix is transferred with the 96-channel ALH to a Sigma Hybrid SPE(®) plate and vacuumed through for removal of phospholipids and precipitated proteins. The filtrate is collected in a second 96-well plate (collection plate), which is foil-sealed, placed in the autosampler (ALS), and injected into a multiplexed LC-MS/MS system running AB Sciex Cliquid(®) and MPX(®) software. Two Shimadzu LC stacks, with multiplex timing controlled by MPX(®) software, inject alternately to one AB Sciex API-5000 MS/MS using positive atmospheric pressure chemical ionization (APCI) and a 1.87-min water/acetonitrile LC gradient with a 2.1 × 20 mm, 2.7 μm, C18 fused-core particle column (Sigma Ascentis Express). LC-MS/MS throughput is ~44 samples/h per LC-MS/MS system with dual-LC channel multiplexing. Plate maps are transferred electronically from the ALH and reformatted into LC-MS/MS sample table format using the Data Innovations LLC (DI) Instrument Manager middleware application. Before collection plates are loaded into the ALS, the plate bar code is manually scanned to download the sample table from the DI middleware to the LC-MS/MS. After acquisition, LC-MS/MS data are analyzed with AB Sciex Multiquant(®) software using customized queries, and results are then transferred electronically via a DI interface to the LIS. 2500 samples/day can be extracted by two analysts using four ALHs in 4-6 h. LC-MS/MS analysis of those samples on three dual-channel LC multiplexed LC-MS/MS systems requires 19-21 h and data analysis can be
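
    The instrument time quoted above can be cross-checked from the per-system throughput; a quick arithmetic sketch:

```python
# Rough cross-check of the LC-MS/MS instrument time quoted in the abstract.
samples_per_day = 2500
throughput_per_system = 44      # samples/h with dual-LC multiplexing
systems = 3

hours = samples_per_day / (throughput_per_system * systems)
print(f"~{hours:.1f} h of LC-MS/MS time")   # ~18.9 h, consistent with the 19-21 h quoted
```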

  17. A Survey of Automated Assessment Approaches for Programming Assignments

    ERIC Educational Resources Information Center

    Ala-Mutka, Kirsti M.

    2005-01-01

    Practical programming is one of the basic skills pursued in computer science education. On programming courses, the coursework consists of programming assignments that need to be assessed from different points of view. Since the submitted assignments are executable programs with a formal structure, some features can be assessed automatically. The…

  18. Automated peroperative assessment of stents apposition from OCT pullbacks.

    PubMed

    Dubuisson, Florian; Péry, Emilie; Ouchchane, Lemlih; Combaret, Nicolas; Kauffmann, Claude; Souteyrand, Géraud; Motreff, Pascal; Sarry, Laurent

    2015-04-01

    The aim of this study was to assess stent apposition by automatically analyzing endovascular optical coherence tomography (OCT) sequences. The lumen is detected using threshold, morphological and gradient operators followed by a Dijkstra algorithm. Incorrect detections, tagged by the user and caused by bifurcations, the presence of struts, thrombotic lesions or dissections, can be corrected using a morphing algorithm. Struts are also segmented by computing symmetrical and morphological operators. The Euclidean distance between detected struts and the artery wall initializes a complete distance map of the stent, and missing data are interpolated with thin-plate spline functions. Rejection of detected outliers, regularization of parameters by generalized cross-validation, and use of the one-sided cyclic property of the map further optimize accuracy. Several indices computed from the map provide quantitative values of malapposition. The algorithm was run on four in vivo OCT sequences covering different cases of incomplete stent apposition. Comparison with manual expert measurements validates the segmentation's accuracy and shows an almost perfect concordance of the automated results. PMID:25700272
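
    The interpolation of sparse strut-to-wall distances into a complete map of the stent surface can be sketched with a thin-plate-spline radial basis function. This is only an illustration of the interpolation step with made-up measurements; the authors' outlier rejection, cross-validated regularization and cyclic handling of the angular coordinate are omitted:

```python
import numpy as np
from scipy.interpolate import Rbf

# Sparse strut measurements: angle (rad), pullback position (mm), distance to wall (mm).
theta = np.array([0.2, 1.5, 2.8, 4.0, 5.5, 0.9, 3.3])
z     = np.array([0.0, 0.5, 1.0, 1.5, 2.0, 2.5, 3.0])
dist  = np.array([0.05, 0.12, 0.30, 0.22, 0.08, 0.15, 0.27])

# Thin-plate spline over the (theta, z) stent map; smooth > 0 regularizes
# (fixed here rather than chosen by generalized cross-validation).
tps = Rbf(theta, z, dist, function="thin_plate", smooth=1e-3)

# Evaluate the complete distance map on a regular grid.
tt, zz = np.meshgrid(np.linspace(0, 2 * np.pi, 90), np.linspace(0, 3, 60))
distance_map = tps(tt, zz)
print(distance_map.shape)       # (60, 90) interpolated strut-to-wall distances
```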

  19. Automated Assessment of Visual Quality of Digital Video

    NASA Technical Reports Server (NTRS)

    Watson, Andrew B.; Ellis, Stephen R. (Technical Monitor)

    1997-01-01

    The advent of widespread distribution of digital video creates a need for automated methods for evaluating visual quality of digital video. This is particularly so since most digital video is compressed using lossy methods, which involve the controlled introduction of potentially visible artifacts. Compounding the problem is the bursty nature of digital video, which requires adaptive bit allocation based on visual quality metrics. In previous work, we have developed visual quality metrics for evaluating, controlling, and optimizing the quality of compressed still images[1-4]. These metrics incorporate simplified models of human visual sensitivity to spatial and chromatic visual signals. The challenge of video quality metrics is to extend these simplified models to temporal signals as well. In this presentation I will discuss a number of the issues that must be resolved in the design of effective video quality metrics. Among these are spatial, temporal, and chromatic sensitivity and their interactions, visual masking, and implementation complexity. I will also touch on the question of how to evaluate the performance of these metrics.

  20. HAZARDOUS AIR POLLUTANTS PROGRAM (HAP-PRO): AUTOMATED HAP AND VOC CONTROL TECHNOLOGY ASSESSMENT SOFTWARE

    EPA Science Inventory

    The paper discusses the Hazardous Air Pollutant Program (HAP-PRO), version 1.0, automated hazardous air pollutant (HAP) and volatile organic compound (VOC) control technology assessment software, designed to assist permit engineers in reviewing applications for control of air tox...

  1. Improving EFL Graduate Students' Proficiency in Writing through an Online Automated Essay Assessing System

    ERIC Educational Resources Information Center

    Ma, Ke

    2013-01-01

    This study investigates the effects of using an online Automated Essay Assessing (AEA) system on EFL graduate students' writing. Eighty-four EFL graduate students, divided into a treatment group and a control group, participated in this study. The treatment group was asked to use an AEA system to assist their essay writing. Both groups were…

  2. Assessing Writing in MOOCs: Automated Essay Scoring and Calibrated Peer Review™

    ERIC Educational Resources Information Center

    Balfour, Stephen P.

    2013-01-01

    Two of the largest Massive Open Online Course (MOOC) organizations have chosen different methods for the way they will score and provide feedback on essays students submit. EdX, MIT and Harvard's non-profit MOOC federation, recently announced that they will use a machine-based Automated Essay Scoring (AES) application to assess written work in…

  3. Adult Students' Perceptions of Automated Writing Assessment Software: Does It Foster Engagement?

    ERIC Educational Resources Information Center

    LaGuerre, Joselle L.

    2013-01-01

    Generally, this descriptive study endeavored to include the voice of adult learners to the scholarly body of research regarding automated writing assessment tools (AWATs). Specifically, the study sought to determine the extent to which students perceive that the AWAT named Criterion fosters learning and if students' opinions differ depending…

  4. Correction of an input function for errors introduced with automated blood sampling

    SciTech Connect

    Schlyer, D.J.; Dewey, S.L.

    1994-05-01

    Accurate kinetic modeling of PET data requires a precise arterial plasma input function. The use of automated blood sampling machines has greatly improved the accuracy, but errors can be introduced by the dispersion of the radiotracer in the sampling tubing. This dispersion results from three effects. The first is the spreading of the radiotracer in the tube due to mass transfer. The second is due to the mechanical action of the peristaltic pump and can be determined experimentally from the width of a step function. The third is the adsorption of the radiotracer on the walls of the tubing during transport through the tube. This is a more insidious effect, since the amount recovered from the end of the tube can be significantly different from that introduced into the tubing. We have measured the simple mass transport using [18F]fluoride in water, which we have shown to be quantitatively recovered with no interaction with the tubing walls. We have also carried out experiments with several radiotracers, including [18F]haloperidol, [11C]L-deprenyl, [18F]N-methylspiroperidol ([18F]NMS) and [11C]buprenorphine. In all cases there was some retention of the radiotracer by untreated silicone tubing. The amount retained in the tubing ranged from 6% for L-deprenyl to 30% for NMS. The retention of the radiotracer was essentially eliminated after pretreatment with the relevant unlabeled compound. For example, less than 2% of the [18F]NMS was retained in tubing treated with unlabeled NMS. Similar results were obtained with baboon plasma, although the amount retained in the untreated tubing was less in all cases. From these results it is possible to apply a mathematical correction to the measured input function to account for mechanical dispersion and to apply a chemical passivation to the tubing to reduce the dispersion due to adsorption of the radiotracer on the tubing walls.
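
    One common way to apply such a correction for dispersion in the sampling line is to model the line as a single-exponential dispersion kernel, in which case the true input curve is recovered by adding a term proportional to the time derivative of the measured curve. The sketch below assumes that model; the time constant tau would be calibrated from the measured step response mentioned above, and the data are hypothetical:

```python
import numpy as np

def correct_dispersion(t, c_measured, tau):
    """Correct a measured arterial input curve for tubing dispersion.

    Assumes dispersion acts as convolution with (1/tau) * exp(-t/tau), so
    c_true(t) = c_meas(t) + tau * dc_meas/dt.  tau (s) is calibrated from
    the width of a measured step response.
    """
    return c_measured + tau * np.gradient(c_measured, t)

# Hypothetical example: 1 Hz sampling over 120 s, tau = 5 s.
t = np.arange(0.0, 120.0, 1.0)
c_meas = np.interp(t, [0, 20, 30, 120], [0, 0, 50, 10])   # toy measured curve
c_true = correct_dispersion(t, c_meas, tau=5.0)
```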

  5. Performance verification of the Maxwell 16 Instrument and DNA IQ Reference Sample Kit for automated DNA extraction of known reference samples.

    PubMed

    Krnajski, Z; Geering, S; Steadman, S

    2007-12-01

    Advances in automation have been made for a number of processes conducted in the forensic DNA laboratory. However, because most robotic systems are designed for high-throughput laboratories batching large numbers of samples, smaller laboratories are left with a limited number of cost-effective options for employing automation. The Maxwell 16 Instrument and DNA IQ Reference Sample Kit marketed by Promega are designed for rapid, automated purification of DNA extracts from sample sets consisting of sixteen or fewer samples. Because the system is based on DNA capture by paramagnetic particles with maximum binding capacity, it is designed to generate extracts with yield consistency. The studies herein enabled evaluation of STR profile concordance, consistency of yield, and cross-contamination performance for the Maxwell 16 Instrument. Results indicate that the system performs suitably for streamlining the process of extracting known reference samples generally used for forensic DNA analysis and has many advantages in a small or moderate-sized laboratory environment. PMID:25869266

  6. Automated Spacecraft Conjunction Assessment at Mars and the Moon

    NASA Technical Reports Server (NTRS)

    Berry, David; Guinn, Joseph; Tarzi, Zahi; Demcak, Stuart

    2012-01-01

    Conjunction assessment and collision avoidance are areas of current high interest in space operations. Most current conjunction assessment activity focuses on the Earth orbital environment. Several of the world's space agencies have satellites in orbit at Mars and the Moon, and avoiding collisions there is important too. There are fewer assets and fewer organizations involved than at Earth, but the consequences of a collision are similar to those in Earth scenarios. This presentation will examine conjunction assessment processes implemented at JPL for spacecraft in orbit at Mars and the Moon.

  7. AUTOMATED GEOSPATIAL WATERSHED ASSESSMENT (AGWA): A GIS-BASED HYDROLOGICAL MODELING TOOL FOR WATERSHED MANAGEMENT AND LANDSCAPE ASSESSMENT

    EPA Science Inventory

    The Automated Geospatial Watershed Assessment (http://www.epa.gov/nerlesd1/land-sci/agwa/introduction.htm and www.tucson.ars.ag.gov/agwa) tool is a GIS interface jointly developed by the U.S. Environmental Protection Agency, USDA-Agricultural Research Service, and the University ...

  8. Assessment of Automated Measurement and Verification (M&V) Methods

    SciTech Connect

    Granderson, Jessica; Touzani, Samir; Custodio, Claudine; Sohn, Michael; Fernandes, Samuel; Jump, David

    2015-07-01

    This report documents the application of a general statistical methodology to assess the accuracy of baseline energy models, focusing on its application to Measurement and Verification (M&V) of whole-building energy savings.

  9. Automated, low-temperature dielectric relaxation apparatus for measurement of air-sensitive, corrosive, hygroscopic, powdered samples

    NASA Astrophysics Data System (ADS)

    Bessonette, Paul W. R.; White, Mary Anne

    1999-07-01

    An automated apparatus for dielectric determinations on solid samples was designed to allow cryogenic measurements on air-sensitive, corrosive, hygroscopic, powdered samples, without determination of sample thickness, provided that it is uniform. A three-terminal design enabled measurements that were not affected by errors due to dimensional changes of the sample or the electrodes with changes in temperature. Meaningful dielectric data could be taken over the frequency range from 20 Hz to 1 MHz and the temperature range from 12 to 360 K. Tests with Teflon and with powdered NH4Cl gave results that were accurate within a few percent when compared with literature values.

  10. Automated high-throughput assessment of prostate biopsy tissue using infrared spectroscopic chemical imaging

    NASA Astrophysics Data System (ADS)

    Bassan, Paul; Sachdeva, Ashwin; Shanks, Jonathan H.; Brown, Mick D.; Clarke, Noel W.; Gardner, Peter

    2014-03-01

    Fourier transform infrared (FT-IR) chemical imaging has been demonstrated as a promising technique to complement histopathological assessment of biomedical tissue samples. Current histopathology practice involves preparing thin tissue sections and staining them using hematoxylin and eosin (H&E), after which a histopathologist manually assesses the tissue architecture under a visible-light microscope. Studies have shown that there is disagreement between operators viewing the same tissue, suggesting that a complementary technique for verification could improve the robustness of the evaluation and improve patient care. FT-IR chemical imaging allows the spatial distribution of chemistry to be rapidly imaged at a high (diffraction-limited) spatial resolution, where each pixel represents an area of 5.5 × 5.5 μm² and contains a full infrared spectrum providing a chemical fingerprint which studies have shown contains the diagnostic potential to discriminate between different cell types, and even the benign or malignant state of prostatic epithelial cells. We report a label-free (i.e., no chemical de-waxing or staining) method of imaging large pieces of prostate tissue (typically 1 cm × 2 cm) in tens of minutes (at a rate of 0.704 × 0.704 mm² every 14.5 s), yielding images containing millions of spectra. Due to refractive index matching between sample and surrounding paraffin, minimal signal processing is required to recover spectra with their natural profile as opposed to harsh baseline correction methods, paving the way for future quantitative analysis of biochemical signatures. The quality of the spectral information is demonstrated by building and testing an automated cell-type classifier based upon spectral features.
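
    The acquisition rate and pixel size quoted above translate directly into a tile count, imaging time and spectrum count for a 1 cm × 2 cm section; a quick order-of-magnitude check:

```python
# Order-of-magnitude check of the imaging figures quoted in the abstract.
tile_mm, tile_s = 0.704, 14.5        # imaged area per step (mm) and time per step (s)
sample_mm2 = 10.0 * 20.0             # 1 cm x 2 cm tissue section
pixel_um = 5.5

tiles = sample_mm2 / tile_mm ** 2
minutes = tiles * tile_s / 60.0
spectra = (10_000 / pixel_um) * (20_000 / pixel_um)
print(f"~{tiles:.0f} tiles, ~{minutes:.0f} min, ~{spectra / 1e6:.1f} million spectra")
```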

  11. Assessment of Automated Image Analysis of Breast Cancer Tissue Microarrays for Epidemiologic Studies

    PubMed Central

    Bolton, Kelly L.; Garcia-Closas, Montserrat; Pfeiffer, Ruth M.; Duggan, Máire A.; Howat, William J.; Hewitt, Stephen M.; Yang, Xiaohong R.; Cornelison, Robert; Anzick, Sarah L.; Meltzer, Paul; Davis, Sean; Lenz, Petra; Figueroa, Jonine D.; Pharoah, Paul D.P.; Sherman, Mark E.

    2010-01-01

    A major challenge in studies of etiologic heterogeneity in breast cancer has been the limited throughput, accuracy and reproducibility of measuring tissue markers. Computerized image analysis systems may help address these concerns but published reports of their use are limited. We assessed agreement between automated and pathologist scores of a diverse set of immunohistochemical (IHC) assays performed on breast cancer TMAs. TMAs of 440 breast cancers previously stained for ER-α, PR, HER-2, ER-β and aromatase were independently scored by two pathologists and three automated systems (TMALabII, TMAx, Ariol). Agreement between automated and pathologist scores of negative/positive was measured using the area under the receiver operator characteristics curve (AUC) and weighted kappa statistics (κ) for categorical scores. We also investigated the correlation between IHC scores and mRNA expression levels. Agreement between pathologist and automated negative/positive and categorical scores was excellent for ER-α and PR (AUC range =0.98-0.99; κ range =0.86-0.91). Lower levels of agreement were seen for ER-β categorical scores (AUC=0.99-1.0; κ=0.80-0.86) and both negative/positive and categorical scores for aromatase (AUC=0.85-0.96; κ=0.41-0.67) and HER2 (AUC=0.94-0.97; κ=0.53-0.72). For ER-α and PR, there was strong correlation between mRNA levels and automated (ρ=0.67-0.74) and pathologist IHC scores (ρ=0.67-0.77). HER2 mRNA levels were more strongly correlated with pathologist (ρ=0.63) than automated IHC scores (ρ=0.41-0.49). Automated analysis of IHC markers is a promising approach for scoring large numbers of breast cancer tissues in epidemiologic investigations. This would facilitate studies of etiologic heterogeneity which ultimately may allow improved risk prediction and better prevention approaches. PMID:20332278
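
    Agreement between categorical automated and pathologist scores of the kind quoted above is typically summarized with a weighted kappa, and negative/positive agreement with an AUC. A minimal sketch using scikit-learn; the score vectors and the choice of quadratic weights are hypothetical:

```python
from sklearn.metrics import cohen_kappa_score, roc_auc_score

# Hypothetical categorical IHC scores (0-3) from a pathologist and an automated system.
pathologist = [0, 1, 2, 3, 2, 1, 0, 3, 2, 1]
automated   = [0, 1, 2, 3, 1, 1, 0, 3, 2, 2]
kappa = cohen_kappa_score(pathologist, automated, weights="quadratic")
print(f"weighted kappa = {kappa:.2f}")

# Negative/positive agreement: continuous automated score vs. pathologist's binary call.
binary_truth = [0, 0, 1, 1, 1, 0, 0, 1, 1, 0]
auto_score   = [0.1, 0.3, 0.8, 0.9, 0.7, 0.2, 0.1, 0.95, 0.85, 0.4]
print(f"AUC = {roc_auc_score(binary_truth, auto_score):.2f}")
```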

  12. Measuring Classroom Assessment with a Work Sample

    ERIC Educational Resources Information Center

    Beesley, Andrea

    2009-01-01

    Background: To attain accurate information about student performance, teachers must apply sound classroom assessment practices. First, teachers need to be able to understand and identify the purpose of their assessments. Teachers also need to provide their students with clear learning targets, in language that students can easily understand, so…

  13. Automated analysis of contractility in the embryonic stem cell test, a novel approach to assess embryotoxicity.

    PubMed

    Peters, Annelieke K; Wouwer, Gert Van de; Weyn, Barbara; Verheyen, Geert R; Vanparys, Philippe; Gompel, Jacques Van

    2008-12-01

    The embryonic stem cell test (EST) is an ECVAM-validated assay to detect embryotoxicity. The output of the assay is the effect of test compounds on the differentiation of murine-derived embryonic stem cells (D3 cells), recorded by visual analysis of contracting cardiomyocyte-like cells. Incorporation of a system to assess the contractility in an automated manner is proposed to increase the throughput of the EST independent of observer bias. The automated system is based on image recording of each well, yielding the area (pixels) and frequency (Hz) of contractility. Four test compounds were assessed for their embryotoxic potency in the 96-well version of the EST, with both manual and automated analysis: 6-aminonicotinamide, valproic acid, boric acid, and penicillin G. There was no statistically significant difference between the two methods in the fraction of contractility (at the p < 0.05 level), resulting in the same rank order of Relative Embryotoxic Potency (REP) values: 6-aminonicotinamide (1) > valproic acid (0.007-0.013) > boric acid (0.002-0.005) > penicillin G (0.00001). The automated image recording of contractile cardiomyocyte-like cells in the EST allows for an unbiased, high-throughput method to assess the embryotoxic potency of test compounds, with an outcome comparable to manual analysis. PMID:18845236

  14. Assessing Creative Problem-Solving with Automated Text Grading

    ERIC Educational Resources Information Center

    Wang, Hao-Chuan; Chang, Chun-Yen; Li, Tsai-Yen

    2008-01-01

    The work aims to improve the assessment of creative problem-solving in science education by employing language technologies and computational-statistical machine learning methods to grade students' natural language responses automatically. To evaluate constructs like creative problem-solving with validity, open-ended questions that elicit…

  15. An Automated Image Selection Approach for Annual Assessment of US Forest Dynamics

    NASA Astrophysics Data System (ADS)

    Schleeweis, K.; Goward, S. N.; Huang, C.; Lindsey, M. A.; Masek, J. G.

    2011-12-01

    North American forests are thought to be a long-term sink for atmospheric carbon, with much of the sink attributed to either forest regrowth from past agricultural clearing or to woody encroachment. However, the magnitude of the North American forest sink is uncertain, because disturbance and regrowth dynamics are not well characterized or understood. The North American Forest Dynamics (NAFD) team from the University of Maryland, US Forest Service and NASA has been working to develop a sound understanding of forest disturbance patterns in North America as a contribution to the North American Carbon Program since 2003. We have found that spatial and temporal sampling of the Landsat data record result in substantial residual errors remaining in our US national estimates for disturbance rates. Conducting a comprehensive annual, wall-to-wall analysis of US disturbance history over the 1985-2010 (Landsat Thematic Mapper) time period will overcome these limitations. The first phase of production includes developing a successful automated image selection approach for selecting the 10,000-15,000 Landsat images (depending on cloud compositing needs) necessary for an annual assessment (1984-2010) of CONUS forest dynamics. A major goal of this approach is to minimize the negative impact of clouds, phenology, and mis-registration by selecting the images necessary for producing annual, clear views of the land surface using a compositing algorithm. This approach has the potential of being adapted for implementation outside the CONUS.

  16. Automated Cognitive Health Assessment Using Smart Home Monitoring of Complex Tasks

    PubMed Central

    Dawadi, Prafulla N.; Cook, Diane J.; Schmitter-Edgecombe, Maureen

    2014-01-01

    One of the many services that intelligent systems can provide is the automated assessment of resident well-being. We hypothesize that the functional health of individuals, or ability of individuals to perform activities independently without assistance, can be estimated by tracking their activities using smart home technologies. In this paper, we introduce a machine learning-based method for assessing activity quality in smart homes. To validate our approach we quantify activity quality for 179 volunteer participants who performed a complex, interweaved set of activities in our smart home apartment. We observed a statistically significant correlation (r=0.79) between automated assessment of task quality and direct observation scores. Using machine learning techniques to predict the cognitive health of the participants based on task quality is accomplished with an AUC value of 0.64. We believe that this capability is an important step in understanding everyday functional health of individuals in their home environments. PMID:25530925

  17. Falcon: automated optimization method for arbitrary assessment criteria

    DOEpatents

    Yang, Tser-Yuan; Moses, Edward I.; Hartmann-Siantar, Christine

    2001-01-01

    FALCON is a method for automatic multivariable optimization for arbitrary assessment criteria that can be applied to numerous fields where outcome simulation is combined with optimization and assessment criteria. A specific implementation of FALCON is for automatic radiation therapy treatment planning. In this application, FALCON implements dose calculations into the planning process and optimizes available beam delivery modifier parameters to determine the treatment plan that best meets clinical decision-making criteria. FALCON is described in the context of the optimization of external-beam radiation therapy and intensity modulated radiation therapy (IMRT), but the concepts could also be applied to internal (brachytherapy) radiotherapy. The radiation beams could consist of photons or any charged or uncharged particles. The concept of optimizing source distributions can be applied to complex radiography (e.g. flash x-ray or proton) to improve the imaging capabilities of facilities proposed for science-based stockpile stewardship.

  18. Method and Apparatus for Automated Isolation of Nucleic Acids from Small Cell Samples

    NASA Technical Reports Server (NTRS)

    Sundaram, Shivshankar; Prabhakarpandian, Balabhaskar; Pant, Kapil; Wang, Yi

    2014-01-01

    RNA isolation is a ubiquitous need, driven by current emphasis on microarrays and miniaturization. With commercial systems requiring 100,000 to 1,000,000 cells for successful isolation, there is a growing need for a small-footprint, easy-to-use device that can harvest nucleic acids from much smaller cell samples (1,000 to 10,000 cells). The process of extraction of RNA from cell cultures is a complex, multi-step one, and requires timed, asynchronous operations with multiple reagents/buffers. An added complexity is the fragility of RNA (subject to degradation) and its reactivity to surfaces. A novel, microfluidics-based, integrated cartridge has been developed that can fully automate the complex process of RNA isolation (lyse, capture, and elute RNA) from small cell culture samples. On-cartridge cell lysis is achieved using either reagents or high-strength electric fields made possible by the miniaturized format. Traditionally, silica-based, porous-membrane formats have been used for RNA capture, requiring slow perfusion for effective capture. In this design, high-efficiency capture and elution are achieved using a microsphere-based "microfluidized" format. Electrokinetic phenomena are harnessed to actively mix microspheres with the cell lysate and capture/elution buffer, providing important advantages in extraction efficiency, processing time, and operational flexibility. Successful RNA isolation was demonstrated using both suspension (HL-60) and adherent (BHK-21) cells. Novel features associated with this development are twofold. First, novel designs that execute needed processes with improved speed and efficiency were developed. These primarily encompass electric-field-driven lysis of cells. The configurations include electrode-containing constructs, or an "electrode-less" chip design, which is easy to fabricate and mitigates fouling at the electrode surface; and the "fluidized" extraction format based on electrokinetically assisted mixing and contacting of microbeads

  19. SAMPLING DESIGN FOR ASSESSING RECREATIONAL WATER QUALITY

    EPA Science Inventory

    Current U.S. EPA guidelines for monitoring recreational water quality refer to the geometric mean density of indicator organisms, enterococci and E. coli in marine and fresh water, respectively, from at least five samples collected over a four-week period. In order to expand thi...

  20. Smart Combinatorial Research Equipment (SmartCoRE) for Sample Environmental Control and Automated Analysis with Optical Methods

    NASA Astrophysics Data System (ADS)

    Church, Matthew; Ding, Xiaodong; Nantel, Norman

    2012-02-01

    Combinatorial research (CR) has revolutionized the way research is done in every major chemistry, physics and material science laboratory. We propose to bring the same success of automation and capabilities of CR to a widely used technique, small- and wide-angle x-ray scattering (SAXS/WAXS), through our development of a small, modular sample environmental chamber with embedded control electronics that can easily be used in large arrays. The device, however, is not restricted to SAXS/WAXS techniques, as it can easily be adapted to almost any kind of small-volume sample preparation or optical analysis technique requiring control of basic sample environmental parameters such as temperature, atmosphere, light and electromagnetic fields. The prototype has the following capabilities: 1. Automated switching of external electronic instrumentation between modules. 2. Thermoelectric temperature control from -50 to 200 °C. 3. Ports for gas flow-through or evacuation of the sample environment. 4. Sealed sample environment using minimally scattering window material. 5. 90-degree field of view of both sides of the sample. 6. Optional fiber-optic connections for UV-Vis spectroscopy. 7. Optional GISAXS mounting geometry. 8. Optional liquid sample flow cell.

  1. The Effects of Finite Sampling Corrections on State Assessment Sample Requirements. NAEP Validity Studies (NVS).

    ERIC Educational Resources Information Center

    Chromy, James R.

    States participating in the National Assessment of Educational Progress State Assessment program (state NAEP) are required to sample at least 2,500 students from at least 100 schools per subject assessed. In this ideal situation, 25 students are assessed for a subject in each school selected for that subject. Two problems have arisen: some states…

  2. An automated digital microradiography system for assessing tooth demineralization

    NASA Astrophysics Data System (ADS)

    Darling, Cynthia L.; Le, Charles Q.; Featherstone, John D. B.; Fried, Daniel

    2009-02-01

    Digital transverse microradiography (TMR) offers several advantages over film-based methods, including real-time image acquisition and excellent linearity with exposure, and it does not require expensive specialized film. The purpose of this work was to demonstrate that a high-resolution digital microradiography system can be used to measure the volume percent mineral loss for sound and demineralized enamel and dentin thin sections from 150 to 350 µm in thickness. A custom-fabricated digital microradiography system with ~2-µm spatial resolution, consisting of a digital x-ray imaging camera, a computerized high-speed motion control system and a high-intensity copper Kα x-ray source, was used to determine the volume percent mineral content of sound and demineralized tooth sections. The volume percent mineral loss was compared with cross-sectional microhardness measurements on sound extracted human teeth. The correlation between microhardness and microradiography was excellent (Pr = 0.99) for section thicknesses ranging from 59 to 319 µm (n = 11). The attenuation was linear with exposure times from 1 to 10 seconds. Digital TMR is an effective and rapid method for the assessment of the mineral content of enamel and dentin thin sections.

  3. Automated Peripheral Neuropathy Assessment Using Optical Imaging and Foot Anthropometry.

    PubMed

    Siddiqui, Hafeez-U R; Spruce, Michelle; Alty, Stephen R; Dudley, Sandra

    2015-08-01

    A large proportion of individuals who live with type-2 diabetes suffer from plantar sensory neuropathy. Regular testing and assessment for the condition is required to avoid ulceration or other damage to patient's feet. Currently accepted practice involves a trained clinician testing a patient's feet manually with a hand-held nylon monofilament probe. The procedure is time consuming, labor intensive, requires special training, is prone to error, and repeatability is difficult. With the vast increase in type-2 diabetes, the number of plantar sensory neuropathy sufferers has already grown to such an extent as to make a traditional manual test problematic. This paper presents the first investigation of a novel approach to automatically identify the pressure points on a given patient's foot for the examination of sensory neuropathy via optical image processing incorporating plantar anthropometry. The method automatically selects suitable test points on the plantar surface that correspond to those repeatedly chosen by a trained podiatrist. The proposed system automatically identifies the specific pressure points at different locations, namely the toe (hallux), metatarsal heads and heel (Calcaneum) areas. The approach is generic and has shown 100% reliability on the available database used. The database consists of Chinese, Asian, African, and Caucasian foot images. PMID:26186748

  4. Using after-action review based on automated performance assessment to enhance training effectiveness.

    SciTech Connect

    Stevens-Adams, Susan Marie; Gieseler, Charles J.; Basilico, Justin Derrick; Abbott, Robert G.; Forsythe, James Chris

    2010-09-01

    Training simulators have become increasingly popular tools for instructing humans on performance in complex environments. However, the question of how to provide individualized and scenario-specific assessment and feedback to students remains largely an open question. In this work, we follow-up on previous evaluations of the Automated Expert Modeling and Automated Student Evaluation (AEMASE) system, which automatically assesses student performance based on observed examples of good and bad performance in a given domain. The current study provides a rigorous empirical evaluation of the enhanced training effectiveness achievable with this technology. In particular, we found that students given feedback via the AEMASE-based debrief tool performed significantly better than students given only instructor feedback on two out of three domain-specific performance metrics.

  5. Automated Assessment of Right Ventricular Volumes and Function Using Three-Dimensional Transesophageal Echocardiography.

    PubMed

    Nillesen, Maartje M; van Dijk, Arie P J; Duijnhouwer, Anthonie L; Thijssen, Johan M; de Korte, Chris L

    2016-02-01

    Assessment of right ventricular (RV) function is known to be of diagnostic value in patients with RV dysfunction. Because of its complex anatomic shape, automated determination of the RV volume is difficult, and strong reliance on geometric assumptions is not desired. A method for automated RV assessment was developed using three-dimensional (3-D) echocardiography without relying on a priori knowledge of the cardiac anatomy. A 3-D adaptive filtering technique that optimizes the discrimination between blood and myocardium was applied to facilitate endocardial border detection. Filtered image data were incorporated in a segmentation model to automatically detect the endocardial RV border. End-systolic and end-diastolic RV volumes, as well as ejection fraction, were computed from the automatically segmented endocardial surfaces and compared against reference volumes manually delineated by two expert cardiologists. The automated results showed good correlation and agreement with the reference volumes. PMID:26633596
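
    Once end-diastolic and end-systolic volumes are obtained from the segmented endocardial surfaces, the ejection fraction follows directly. A minimal sketch with hypothetical volumes:

```python
def ejection_fraction(edv_ml, esv_ml):
    """Ejection fraction (%) from end-diastolic and end-systolic volumes (ml)."""
    return 100.0 * (edv_ml - esv_ml) / edv_ml

# Hypothetical RV volumes from an automated segmentation.
print(f"RVEF = {ejection_fraction(140.0, 75.0):.1f}%")
```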

  6. Automated sample preparation for radiogenic and non-traditional metal isotope analysis by MC-ICP-MS

    NASA Astrophysics Data System (ADS)

    Field, M. P.; Romaniello, S. J.; Gordon, G. W.; Anbar, A. D.

    2012-12-01

    High-throughput analysis is becoming increasingly important for many applications of radiogenic and non-traditional metal isotopes. While MC-ICP-MS instruments offer the potential for very high sample throughput, the requirement for labor-intensive sample preparation and purification procedures remains a substantial bottleneck. Current purification protocols require manually feeding gravity-driven separation columns, a process that is both costly and time consuming. This bottleneck is eliminated with the prepFAST-MC™, an automated, low-pressure ion exchange chromatography system that can process from 1 to 60 samples in unattended operation. The syringe-driven system allows the sample loading, multiple acid washes, column conditioning and elution cycles necessary to isolate elements of interest, and automatically collects up to 3 discrete eluent fractions at user-defined intervals (time, volume and flow rate). Newly developed protocols for automated purification of uranium illustrate high throughput (>30 samples per run), multiple samples processed per column (>30), complete (>99%) matrix removal, high recovery (>98%, n = 25), and excellent precision (2σ = 0.03‰, n = 10). The prepFAST-MC™ maximizes sample throughput and minimizes costs associated with personnel and consumables, providing an opportunity to greatly expand research horizons in fields where large isotopic data sets are required, including archeology, geochemistry, and climate/environmental science.

  7. Daily Automated Telephone Assessment and Intervention Improved 1-Month Outcome in Paroled Offenders.

    PubMed

    Andersson, Claes; Vasiljevic, Zoran; Höglund, Peter; Ojehagen, Agneta; Berglund, Mats

    2014-03-13

    This randomized trial evaluates whether automated telephony could be used to perform daily assessments in paroled offenders (N = 108) during their first 30 days after leaving prison. All subjects were called daily and answered assessment questions. Based on the content of their daily assessments, subjects in the intervention group received immediate feedback and a recommendation by automated telephony, and their probation officers also received a daily report by email. The outcome variables were analyzed using linear mixed models. The intervention group showed greater improvement than the control group in the summary scores (M = 9.6, 95% confidence interval [CI] = [0.5, 18.7], p = .038), in mental symptoms (M = 4.6, CI = [0.2, 9.0], p = .042), in alcohol drinking (M = 0.8, CI = [0.1, 1.4], p = .031), in drug use (M = 1.0, CI = [0.5, 1.6], p = .000), and in most stressful daily event (M = 1.9, CI = [1.1, 2.7], p = .000). In conclusion, automated telephony may be used to follow up and to give interventions, resulting in reduced stress and drug use, in paroled offenders. PMID:24626145

  8. Determination of rare-earth elements in geological and environmental samples using an automated batch preconcentration/matrix elimination system

    SciTech Connect

    Smith, F.G.; Wiederin, D.R.; Mortlock, R.

    1994-12-31

    Determination of the rare earth elements is important in the study of sedimentary processes. Geological and environmental samples often contain very low levels of these elements, and detection by plasma spectroscopy (ICP-AES, ICP-MS) is difficult unless a preconcentration and/or matrix elimination procedure is performed prior to analysis. An automated batch preconcentration/matrix elimination system offers rapid, off-line sample preparation for a variety of sample types. A chelating form of a solid suspended reagent is added to a pH-adjusted sample. The suspended reagent, with any bound elements, is trapped in a hollow-fiber membrane filter while unbound matrix components are washed to waste. The reagent with bound analytes is then released in a small volume. The system works in concert with an autosampler for unattended operation. Application to a variety of geological and environmental samples will be described.

  9. A conceptual model of the automated credibility assessment of the volunteered geographic information

    NASA Astrophysics Data System (ADS)

    Idris, N. H.; Jackson, M. J.; Ishak, M. H. I.

    2014-02-01

    The use of Volunteered Geographic Information (VGI) in collecting, sharing and disseminating geospatially referenced information on the Web is increasingly common. The potential of this localized and collective information has been seen to complement the maintenance process of authoritative mapping data sources and to support the development of Digital Earth. The main barrier to the use of these data in supporting this bottom-up approach is the credibility (trust), completeness, accuracy, and quality of both the data input and the outputs generated. The only feasible approach to assess these data is to rely on an automated process. This paper describes a conceptual model of indicators (parameters) and practical approaches for automatically assessing the credibility of information contributed through VGI, including map mashups, Geo-Web and crowd-sourced applications. Two main components are proposed to be assessed in the conceptual model - metadata and data. The metadata component comprises indicators for the hosting websites and the sources of data/information. The data component comprises indicators to assess absolute and relative data positioning, attribute, thematic, temporal and geometric correctness and consistency. This paper suggests approaches to assess the components. To assess the metadata component, automated text categorization using supervised machine learning is proposed. To assess correctness and consistency in the data component, we suggest a matching validation approach using current emerging technologies from Linked Data infrastructures and third-party review validation. This study contributes to the research domain that focuses on the credibility, trust and quality issues of data contributed by web citizen providers.
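    The metadata step proposed above relies on supervised text categorization. The sketch below shows one conventional way such a classifier could be built; the labels, training texts and pipeline choices are hypothetical and are not drawn from the paper.

```python
# Illustrative sketch of the metadata step only: supervised text
# categorization of a VGI host/source description. Labels and feature
# choices here are hypothetical, not taken from the conceptual model.
from sklearn.pipeline import Pipeline
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression

def build_credibility_classifier():
    """TF-IDF features plus a linear classifier over host/source text."""
    return Pipeline([
        ("tfidf", TfidfVectorizer(ngram_range=(1, 2), min_df=2)),
        ("clf", LogisticRegression(max_iter=1000)),
    ])

# Usage with hypothetical training data:
# texts  = ["government mapping portal ...", "anonymous map mashup ...", ...]
# labels = ["credible", "not_credible", ...]
# clf = build_credibility_classifier().fit(texts, labels)
# clf.predict(["crowd-sourced point of interest hosted on ..."])
```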

  10. Detection of Cytomegalovirus (CMV) DNA in EDTA Whole-Blood Samples: Evaluation of the Quantitative artus CMV LightCycler PCR Kit in Conjunction with Automated Sample Preparation

    PubMed Central

    Michelin, Birgit D. A.; Hadžisejdić, Ita; Bozic, Michael; Grahovac, Maja; Hess, Markus; Grahovac, Blaženka; Marth, Egon; Kessler, Harald H.

    2008-01-01

    Whole blood has been found to be a reliable matrix for the detection and quantitation of cytomegalovirus (CMV) DNA. In this study, the performance of the artus CMV LightCycler (LC) PCR kit in conjunction with automated sample preparation on a BioRobot EZ1 workstation was evaluated. The accuracy, linearity, analytical sensitivity, and inter- and intra-assay variations were determined. A total of 102 clinical EDTA whole-blood samples were investigated, and results were compared with those obtained with the in vitro diagnostics (IVD)/Conformité Européene (CE)-labeled CMV HHV6,7,8 R-gene quantification kit. When the accuracy of the new kit was tested, seven of eight results were found to be within ±0.5 log10 unit of the expected panel results. Determination of linearity resulted in a quasilinear curve over more than 5 log units. The lower limit of detection of the assay was determined to be 139 copies/ml in EDTA whole blood. The interassay variation ranged from 15 to 58%, and the intra-assay variation ranged from 7 to 35%. When clinical samples were tested and the results were compared with those of the routinely used IVD/CE-labeled assay, 53 samples tested positive and 13 samples tested negative by both of the assays. One sample was found to be positive with the artus CMV LC PCR kit only, and 35 samples tested positive with the routinely used assay only. The majority of discrepant results were found with low-titer samples. In conclusion, use of the artus CMV LC PCR kit in conjunction with automated sample preparation on the BioRobot EZ1 workstation may be suitable for the detection and quantitation of CMV DNA in EDTA whole blood in the routine low-throughput laboratory; however, low-positive results may be missed by this assay. PMID:18272703

  11. Automation impact study of Army training management 2: Extension of sampling and collection of installation resource data

    SciTech Connect

    Sanquist, T.F.; McCallum, M.C.; Hunt, P.S.; Slavich, A.L.; Underwood, J.A.; Toquam, J.L.; Seaver, D.A.

    1989-05-01

    This automation impact study of Army training management (TM) was performed for the Army Development and Employment Agency (ADEA) and the Combined Arms Training Activity (CATA) by the Battelle Human Affairs Research Centers and the Pacific Northwest Laboratory. The primary objective of the study was to provide the Army with information concerning the potential costs and savings associated with automating the TM process. This study expands the sample of units surveyed in Phase I of the automation impact effort (Sanquist et al., 1988), and presents data concerning installation resource management in relation to TM. The structured interview employed in Phase I was adapted to a self-administered survey. The data collected were compatible with those of Phase I, and the two data sets were combined for analysis. Three US sites, one reserve division, one National Guard division, and one unit in the active component outside the continental US (OCONUS) (referred to in this report as forward deployed) were surveyed. The total sample size was 459, of which 337 respondents contributed the most detailed data. 20 figs., 62 tabs.

  12. A Psycholinguistic Model for Simultaneous Translation, and Proficiency Assessment by Automated Acoustic Analysis of Discourse.

    NASA Astrophysics Data System (ADS)

    Yaghi, Hussein M.

    Two separate but related issues are addressed: how simultaneous translation (ST) works on a cognitive level and how such translation can be objectively assessed. Both of these issues are discussed in the light of qualitative and quantitative analyses of a large corpus of recordings of ST and shadowing. The proposed ST model utilises knowledge derived from a discourse analysis of the data, many accepted facts in the psychology tradition, and evidence from controlled experiments that are carried out here. This model has three advantages: (i) it is based on analyses of extended spontaneous speech rather than word-, syllable-, or clause -bound stimuli; (ii) it draws equally on linguistic and psychological knowledge; and (iii) it adopts a non-traditional view of language called 'the linguistic construction of reality'. The discourse-based knowledge is also used to develop three computerised systems for the assessment of simultaneous translation: one is a semi-automated system that treats the content of the translation; and two are fully automated, one of which is based on the time structure of the acoustic signals whilst the other is based on their cross-correlation. For each system, several parameters of performance are identified, and they are correlated with assessments rendered by the traditional, subjective, qualitative method. Using signal processing techniques, the acoustic analysis of discourse leads to the conclusion that quality in simultaneous translation can be assessed quantitatively with varying degrees of automation. It identifies as measures of performance (i) three content-based standards; (ii) four time management parameters that reflect the influence of the source on the target language time structure; and (iii) two types of acoustical signal coherence. Proficiency in ST is shown to be directly related to coherence and speech rate but inversely related to omission and delay. High proficiency is associated with a high degree of simultaneity and
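    The fully automated, cross-correlation-based assessment mentioned above can be illustrated with a simple envelope cross-correlation between the source speech and the interpreter's output. The sketch below is a generic illustration of that technique, not a reconstruction of the thesis's actual systems; the window length and envelope method are arbitrary choices.

```python
# Rough sketch: compare the amplitude envelopes of source and target
# speech and report the peak normalized correlation and its lag (delay).
import numpy as np

def envelope(signal: np.ndarray, fs: int, win_sec: float = 0.05) -> np.ndarray:
    """Moving-average amplitude envelope of a 1-D audio signal."""
    win = max(1, int(win_sec * fs))
    return np.convolve(np.abs(signal), np.ones(win) / win, mode="same")

def coherence_and_delay(source: np.ndarray, target: np.ndarray, fs: int):
    """Return (peak normalized correlation, delay in seconds)."""
    n = min(len(source), len(target))
    a = envelope(source[:n], fs)
    b = envelope(target[:n], fs)
    a = (a - a.mean()) / a.std()
    b = (b - b.mean()) / b.std()
    xcorr = np.correlate(b, a, mode="full") / n
    lag = int(np.argmax(xcorr)) - (n - 1)
    return float(xcorr.max()), lag / fs

# Higher peak correlation suggests closer time-structure coherence between
# source and target; the lag gives a crude estimate of the interpreter's delay.
```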

  13. Microcentrifuge or Automated Hematological Analyzer to Assess Hematocrit in Exercise? Effect on Plasma Volume Loss Calculations.

    PubMed

    Alis, Rafael; Sanchis-Gomar, Fabian; Lippi, Giuseppe; Romagnoli, Marco

    2016-06-01

    Plasma volume loss (∆PV) induced by exercise can be estimated from changes in hematocrit (Htc) and hemoglobin (Hb), and its assessment is essential when investigating the metabolic or biologic response to exercise of circulating biomarkers. We aimed to ascertain whether the estimation of ∆PV may differ when Hb and Htc are determined by automated hematological analyzer (AHA) versus manual methods. Twenty-five healthy male subjects performed a maximal running incremental exercise. Blood samples were taken before exercise, immediately after exercise, and after a 30-min recovery. Hb and Htc (Htc-AHA) were determined by an AHA. Htc was also determined by microcentrifugation (Htc-M). The ∆PV immediately after exercise and after recovery was calculated. The serum concentrations of several biomarkers were determined and corrected for ∆PV derived from Htc-AHA (∆PVAHA) and from Htc-M (∆PVM). Htc-M was found to be higher than Htc-AHA at all time points (p < 0.001). However, no differences were observed between ∆PVM and ∆PVAHA either post exercise (∆PVM -12.43% versus ∆PVAHA -12.41%, p = 0.929) or after recovery (∆PVM 1.47% versus ∆PVAHA 1.97%, p = 0.171). No significant differences were found between both ∆PV corrected concentrations of any biomarker (p ≥ 0.076). In conclusion, both AHA and the microcentrifuge may be reliably used to estimate ∆PV during exercise. PMID:25795010
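    The abstract does not state which ∆PV equation was applied, but a widely used formulation (Dill and Costill, 1974) estimates the percent plasma volume change from pre- and post-exercise Hb and Htc. The sketch below uses that formulation as an assumption for illustration.

```python
# Percent plasma volume change from pre/post hemoglobin (g/dL) and
# hematocrit (fraction), using the Dill and Costill (1974) formulation
# (an assumption; the paper does not specify its equation).
def delta_pv_percent(hb_pre: float, hb_post: float,
                     htc_pre: float, htc_post: float) -> float:
    """Negative values indicate plasma volume loss (hemoconcentration)."""
    return 100.0 * ((hb_pre / hb_post) * (1.0 - htc_post) / (1.0 - htc_pre) - 1.0)

# Example: Hb 14.5 -> 15.6 g/dL and Htc 0.43 -> 0.46 gives roughly -12%,
# similar in magnitude to the post-exercise values reported above.
print(round(delta_pv_percent(14.5, 15.6, 0.43, 0.46), 1))
```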

  14. Automated assessment of bilateral breast volume asymmetry as a breast cancer biomarker during mammographic screening

    SciTech Connect

    Williams, Alex C; Hitt, Austin N; Voisin, Sophie; Tourassi, Georgia

    2013-01-01

    The biological concept of bilateral symmetry as a marker of developmental stability and good health is well established. Although most individuals deviate slightly from perfect symmetry, humans are essentially considered bilaterally symmetrical. Consequently, increased fluctuating asymmetry of paired structures could be an indicator of disease. There are several published studies linking bilateral breast size asymmetry with increased breast cancer risk. These studies were based on radiologists' manual measurements of breast size from mammographic images. We aim to develop a computerized technique to assess fluctuating breast volume asymmetry in screening mammograms and investigate whether it correlates with the presence of breast cancer. Using a large database of screening mammograms with known ground truth we applied automated breast region segmentation and automated breast size measurements in CC and MLO views using three well established methods. All three methods confirmed that indeed patients with breast cancer have statistically significantly higher fluctuating asymmetry of their breast volumes. However, statistically significant difference between patients with cancer and benign lesions was observed only for the MLO views. The study suggests that automated assessment of global bilateral asymmetry could serve as a breast cancer risk biomarker for women undergoing mammographic screening. Such biomarker could be used to alert radiologists or computer-assisted detection (CAD) systems to exercise increased vigilance if higher than normal cancer risk is suspected.

  15. A method to establish seismic noise baselines for automated station assessment

    USGS Publications Warehouse

    McNamara, D.E.; Hutt, C.R.; Gee, L.S.; Benz, H.M.; Buland, R.P.

    2009-01-01

    We present a method for quantifying station noise baselines and characterizing the spectral shape of out-of-nominal noise sources. Our intent is to automate this method in order to ensure that only the highest-quality data are used in rapid earthquake products at NEIC. In addition, the station noise baselines provide a valuable tool to support the quality control of GSN and ANSS backbone data and metadata. The procedures addressed here are currently in development at the NEIC, and work is underway to understand how quickly changes from nominal can be observed and used within the NEIC processing framework. The spectral methods and software used to compute station baselines and described herein (PQLX) can be useful to both permanent and portable seismic stations operators. Applications include: general seismic station and data quality control (QC), evaluation of instrument responses, assessment of near real-time communication system performance, characterization of site cultural noise conditions, and evaluation of sensor vault design, as well as assessment of gross network capabilities (McNamara et al. 2005). Future PQLX development plans include incorporating station baselines for automated QC methods and automating station status report generation and notification based on user-defined QC parameters. The PQLX software is available through the USGS (http://earthquake.usgs.gov/research/software/pqlx.php) and IRIS (http://www.iris.edu/software/pqlx/).
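    The baseline idea can be illustrated by computing power spectral densities over successive data segments and summarizing them with percentile curves per frequency. The sketch below is a simplified approximation of the PSD/PDF approach implemented in PQLX, not the PQLX code itself; the segment length and percentiles are arbitrary choices.

```python
# Simplified station noise baseline: PSDs of successive segments,
# summarized as percentile curves per frequency (in dB).
import numpy as np
from scipy.signal import welch

def noise_baseline(trace: np.ndarray, fs: float,
                   segment_sec: float = 3600.0,
                   percentiles=(5, 50, 95)):
    """Return (frequencies, {percentile: PSD curve in dB}) for a 1-D trace."""
    seg = int(segment_sec * fs)
    psds = []
    for start in range(0, len(trace) - seg + 1, seg):
        f, pxx = welch(trace[start:start + seg], fs=fs, nperseg=4096)
        psds.append(10.0 * np.log10(pxx + 1e-30))   # convert to dB
    psds = np.array(psds)
    return f, {p: np.percentile(psds, p, axis=0) for p in percentiles}

# Segments whose PSD falls far outside the 5th to 95th percentile band can
# be flagged as out-of-nominal and excluded from rapid earthquake products.
```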

  16. Automated assessment of bilateral breast volume asymmetry as a breast cancer biomarker during mammographic screening

    NASA Astrophysics Data System (ADS)

    Williams, Alex C.; Hitt, Austin; Voisin, Sophie; Tourassi, Georgia

    2013-03-01

    The biological concept of bilateral symmetry as a marker of developmental stability and good health is well established. Although most individuals deviate slightly from perfect symmetry, humans are essentially considered bilaterally symmetrical. Consequently, increased fluctuating asymmetry of paired structures could be an indicator of disease. There are several published studies linking bilateral breast size asymmetry with increased breast cancer risk. These studies were based on radiologists' manual measurements of breast size from mammographic images. We aim to develop a computerized technique to assess fluctuating breast volume asymmetry in screening mammograms and investigate whether it correlates with the presence of breast cancer. Using a large database of screening mammograms with known ground truth we applied automated breast region segmentation and automated breast size measurements in CC and MLO views using three well established methods. All three methods confirmed that indeed patients with breast cancer have statistically significantly higher fluctuating asymmetry of their breast volumes. However, statistically significant difference between patients with cancer and benign lesions was observed only for the MLO views. The study suggests that automated assessment of global bilateral asymmetry could serve as a breast cancer risk biomarker for women undergoing mammographic screening. Such biomarker could be used to alert radiologists or computer-assisted detection (CAD) systems to exercise increased vigilance if higher than normal cancer risk is suspected.

  17. Automated Detection of Toxigenic Clostridium difficile in Clinical Samples: Isothermal tcdB Amplification Coupled to Array-Based Detection

    PubMed Central

    Pasko, Chris; Groves, Benjamin; Ager, Edward; Corpuz, Maylene; Frech, Georges; Munns, Denton; Smith, Wendy; Warcup, Ashley; Denys, Gerald; Ledeboer, Nathan A.; Lindsey, Wes; Owen, Charles; Rea, Larry; Jenison, Robert

    2012-01-01

    Clostridium difficile can carry a genetically variable pathogenicity locus (PaLoc), which encodes clostridial toxins A and B. In hospitals and in the community at large, this organism is increasingly identified as a pathogen. To develop a diagnostic test that combines the strengths of immunoassays (cost) and DNA amplification assays (sensitivity/specificity), we targeted a genetically stable PaLoc region, amplifying tcdB sequences and detecting them by hybridization capture. The assay employs a hot-start isothermal method coupled to a multiplexed chip-based readout, creating a manual assay that detects toxigenic C. difficile with high sensitivity and specificity within 1 h. Assay automation on an electromechanical instrument produced an analytical sensitivity of 10 CFU (95% probability of detection) of C. difficile in fecal samples, along with discrimination against other enteric bacteria. To verify automated assay function, 130 patient samples were tested: 31/32 positive samples (97% sensitive; 95% confidence interval [CI], 82 to 99%) and 98/98 negative samples (100% specific; 95% CI, 95 to 100%) were scored correctly. Large-scale clinical studies are now planned to determine clinical sensitivity and specificity. PMID:22675134

  18. Defense Automated Neurobehavioral Assessment (DANA)-psychometric properties of a new field-deployable neurocognitive assessment tool.

    PubMed

    Lathan, Corinna; Spira, James L; Bleiberg, Joseph; Vice, Jack; Tsao, Jack W

    2013-04-01

    The Defense Automated Neurobehavioral Assessment (DANA) is a new neurocognitive assessment tool that includes a library of standardized cognitive and psychological assessments, with three versions that range from a brief 5-minute screen to a 45-minute complete assessment. DANA is written using the Android open-source operating system and is suitable for multiple mobile platforms. This article presents testing of DANA by 224 active duty U.S. service members in five operationally relevant environments (desert, jungle, mountain, arctic, and shipboard). DANA was found to be a reliable instrument and compared favorably to other computer-based neurocognitive assessments. Implications for using DANA in far-forward military settings are discussed. PMID:23707818

  19. A Framework to Automate Assessment of Upper-Limb Motor Function Impairment: A Feasibility Study.

    PubMed

    Otten, Paul; Kim, Jonghyun; Son, Sang Hyuk

    2015-01-01

    Standard upper-limb motor function impairment assessments, such as the Fugl-Meyer Assessment (FMA), are a critical aspect of rehabilitation after neurological disorders. These assessments typically take a long time (about 30 min for the FMA) for a clinician to perform on a patient, which is a severe burden in a clinical environment. In this paper, we propose a framework for automating upper-limb motor assessments that uses low-cost sensors to collect movement data. The sensor data is then processed through a machine learning algorithm to determine a score for a patient's upper-limb functionality. To demonstrate the feasibility of the proposed approach, we implemented a system based on the proposed framework that can automate most of the FMA. Our experiment shows that the system provides similar FMA scores to clinician scores, and reduces the time spent evaluating each patient by 82%. Moreover, the proposed framework can be used to implement customized tests or tests specified in other existing standard assessment methods. PMID:26287206

  20. A Framework to Automate Assessment of Upper-Limb Motor Function Impairment: A Feasibility Study

    PubMed Central

    Otten, Paul; Kim, Jonghyun; Son, Sang Hyuk

    2015-01-01

    Standard upper-limb motor function impairment assessments, such as the Fugl-Meyer Assessment (FMA), are a critical aspect of rehabilitation after neurological disorders. These assessments typically take a long time (about 30 min for the FMA) for a clinician to perform on a patient, which is a severe burden in a clinical environment. In this paper, we propose a framework for automating upper-limb motor assessments that uses low-cost sensors to collect movement data. The sensor data is then processed through a machine learning algorithm to determine a score for a patient’s upper-limb functionality. To demonstrate the feasibility of the proposed approach, we implemented a system based on the proposed framework that can automate most of the FMA. Our experiment shows that the system provides similar FMA scores to clinician scores, and reduces the time spent evaluating each patient by 82%. Moreover, the proposed framework can be used to implement customized tests or tests specified in other existing standard assessment methods. PMID:26287206

  1. Towards Automating Clinical Assessments: A Survey of the Timed Up and Go (TUG)

    PubMed Central

    Sprint, Gina; Cook, Diane; Weeks, Douglas

    2016-01-01

    Older adults often suffer from functional impairments that affect their ability to perform everyday tasks. To detect the onset and changes in abilities, healthcare professionals administer standardized assessments. Recently, technology has been utilized to complement these clinical assessments to gain a more objective and detailed view of functionality. In the clinic and at home, technology is able to provide more information about patient performance and reduce subjectivity in outcome measures. The timed up and go (TUG) test is one such assessment recently instrumented with technology in several studies, yielding promising results towards the future of automating clinical assessments. Potential benefits of technological TUG implementations include additional performance parameters, generated reports, and the ability to be self-administered in the home. In this paper, we provide an overview of the TUG test and technologies utilized for TUG instrumentation. We then critically review the technological advancements and follow up with an evaluation of the benefits and limitations of each approach. Finally, we analyze the gaps in the implementations and discuss challenges for future research towards automated, self-administered assessment in the home. PMID:25594979

  2. An Automated System for Skeletal Maturity Assessment by Extreme Learning Machines.

    PubMed

    Mansourvar, Marjan; Shamshirband, Shahaboddin; Raj, Ram Gopal; Gunalan, Roshan; Mazinani, Iman

    2015-01-01

    Assessing skeletal age is a subjective and tedious examination process. Hence, automated assessment methods have been developed to replace manual evaluation in medical applications. In this study, a new fully automated method based on content-based image retrieval and using extreme learning machines (ELM) is designed and adapted to assess skeletal maturity. The main novelty of this approach is that it overcomes the segmentation problem suffered by existing systems. The estimation results of ELM models are compared with those of genetic programming (GP) and artificial neural networks (ANNs) models. The experimental results signify improvement in assessment accuracy over GP and ANN, while good generalization capability is retained with the ELM approach. Moreover, the results indicate that the ELM model developed can be used confidently in further work on formulating novel models of skeletal age assessment strategies. According to the experimental results, the new presented method has the capacity to learn many hundreds of times faster than traditional learning methods and it has sufficient overall performance in many aspects. It has conclusively been found that applying ELM is particularly promising as an alternative method for evaluating skeletal age. PMID:26402795
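    An extreme learning machine of the kind named above is simple to sketch: a random, untrained hidden layer followed by a closed-form least-squares output layer. The code below is a generic illustration; the feature extraction from hand radiographs, the hidden-layer size and the regularization are assumptions, not the paper's settings.

```python
# Compact extreme learning machine (ELM) regressor: random hidden
# projection, then output weights solved by regularized least squares.
import numpy as np

class ELMRegressor:
    def __init__(self, n_hidden: int = 200, seed: int = 0):
        self.n_hidden = n_hidden
        self.rng = np.random.default_rng(seed)

    def _hidden(self, X):
        # Random projection plus a fixed nonlinearity; never trained.
        return np.tanh(X @ self.W + self.b)

    def fit(self, X, y):
        self.W = self.rng.normal(size=(X.shape[1], self.n_hidden))
        self.b = self.rng.normal(size=self.n_hidden)
        H = self._hidden(X)
        # Output weights from a single ridge-regularized linear solve.
        self.beta = np.linalg.solve(H.T @ H + 1e-3 * np.eye(self.n_hidden), H.T @ y)
        return self

    def predict(self, X):
        return self._hidden(X) @ self.beta

# Training is one linear solve, which is why ELMs can learn "hundreds of
# times faster" than iteratively trained networks, as noted in the abstract.
```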

  3. An Automated System for Skeletal Maturity Assessment by Extreme Learning Machines

    PubMed Central

    Mansourvar, Marjan; Shamshirband, Shahaboddin; Raj, Ram Gopal; Gunalan, Roshan; Mazinani, Iman

    2015-01-01

    Assessing skeletal age is a subjective and tedious examination process. Hence, automated assessment methods have been developed to replace manual evaluation in medical applications. In this study, a new fully automated method based on content-based image retrieval and using extreme learning machines (ELM) is designed and adapted to assess skeletal maturity. The main novelty of this approach is that it overcomes the segmentation problem suffered by existing systems. The estimation results of ELM models are compared with those of genetic programming (GP) and artificial neural networks (ANNs) models. The experimental results signify improvement in assessment accuracy over GP and ANN, while good generalization capability is retained with the ELM approach. Moreover, the results indicate that the ELM model developed can be used confidently in further work on formulating novel models of skeletal age assessment strategies. According to the experimental results, the new presented method has the capacity to learn many hundreds of times faster than traditional learning methods and it has sufficient overall performance in many aspects. It has conclusively been found that applying ELM is particularly promising as an alternative method for evaluating skeletal age. PMID:26402795

  4. An Assessment of the Technology of Automated Rendezvous and Capture in Space

    NASA Technical Reports Server (NTRS)

    Polites, M. E.

    1998-01-01

    This paper presents the results of a study to assess the technology of automated rendezvous and capture (AR&C) in space. The outline of the paper is as follows. First, the history of manual and automated rendezvous and capture and rendezvous and dock is presented. Next, the need for AR&C in space is established. Then, today's technology and ongoing technology efforts related to AR&C in space are reviewed. In light of these, AR&C systems are proposed that meet NASA's future needs, but can be developed in a reasonable amount of time with a reasonable amount of money. Technology plans for developing these systems are presented; cost and schedule are included.

  5. A semi-automated micro-method for the histological assessment of fat embolism.

    PubMed

    Busuttil, A; Hanley, J J

    1994-01-01

    A method of quantitatively determining the volume of fat emboli in a tissue using an image analysis system (I.B.A.S.) was developed. This procedure is an interactive, semi-automated tool allowing the quick and accurate gathering of large quantities of data from sections of different tissue samples stained with osmium tetroxide. The development of this procedure was aimed at producing a system that is reliable, reproducible and semi-automated, thereby enabling epidemiological and serial studies of large numbers of histological sections from different tissues. The system was tested in a study of tissue sections from a series of fatalities in an aircraft crash, correlating the quantitative presence of fat emboli with the presence, extent and severity of multiple fractures and soft tissue injuries. PMID:7529546

  6. Interdisciplinary development of manual and automated product usability assessments for older adults with dementia: lessons learned.

    PubMed

    Boger, Jennifer; Taati, Babak; Mihailidis, Alex

    2016-10-01

    The changes in cognitive abilities that accompany dementia can make it difficult to use everyday products that are required to complete activities of daily living. Products that are inherently more usable for people with dementia could facilitate independent activity completion, thus reducing the need for caregiver assistance. The objectives of this research were to: (1) gain an understanding of how water tap design impacted tap usability and (2) create an automated computerized tool that could assess tap usability. 27 older adults, who ranged from cognitively intact to advanced dementia, completed 1309 trials on five tap designs. Data were manually analyzed to investigate tap usability and were also used to develop an automated usability analysis tool. Researchers collaborated to modify existing techniques and to create novel ones to accomplish both goals. This paper presents lessons learned through the course of this research, which could be applicable in the development of other usability studies, automated vision-based assessments and the development of assistive technologies for cognitively impaired older adults. Collaborative interdisciplinary teamwork, which included older adults with dementia as participants, was key to enabling innovative advances that achieved the project's research goals. Implications for Rehabilitation: Products that are implicitly familiar and usable by older adults could foster independent activity completion, potentially reducing reliance on a caregiver. The computer-based automated tool can significantly reduce the time and effort required to perform product usability analysis, making this type of analysis more feasible. Interdisciplinary collaboration can result in a more holistic understanding of assistive technology research challenges and enable innovative solutions. PMID:26135222

  7. Evaluation of a software package for automated quality assessment of contrast detail images—comparison with subjective visual assessment

    NASA Astrophysics Data System (ADS)

    Pascoal, A.; Lawinski, C. P.; Honey, I.; Blake, P.

    2005-12-01

    Contrast detail analysis is commonly used to assess image quality (IQ) associated with diagnostic imaging systems. Applications include routine assessment of equipment performance and optimization studies. Most frequently, the evaluation of contrast detail images involves human observers visually detecting the threshold contrast detail combinations in the image. However, the subjective nature of human perception and the variations in the decision threshold pose limits to the minimum image quality variations detectable with reliability. Objective methods of assessment of image quality such as automated scoring have the potential to overcome the above limitations. A software package (CDRAD analyser) developed for automated scoring of images produced with the CDRAD test object was evaluated. Its performance to assess absolute and relative IQ was compared with that of an average observer. Results show that the software does not mimic the absolute performance of the average observer. The software proved more sensitive and was able to detect smaller low-contrast variations. The observer's performance was superior to the software's in the detection of smaller details. Both scoring methods showed frequent agreement in the detection of image quality variations resulting from changes in kVp and KERMAdetector, which indicates the potential to use the software CDRAD analyser for assessment of relative IQ.

  8. Fully automated Liquid Extraction-Based Surface Sampling and Ionization Using a Chip-Based Robotic Nanoelectrospray Platform

    SciTech Connect

    Kertesz, Vilmos; Van Berkel, Gary J

    2010-01-01

    A fully automated liquid extraction-based surface sampling device utilizing an Advion NanoMate chip-based infusion nanoelectrospray ionization system is reported. Analyses were enabled for discrete spot sampling by using the Advanced User Interface of the current commercial control software. This software interface provided the parameter control necessary for the NanoMate robotic pipettor to both form and withdraw a liquid microjunction for sampling from a surface. The system was tested with three types of analytically important sample surface types, viz., spotted sample arrays on a MALDI plate, dried blood spots on paper, and whole-body thin tissue sections from drug dosed mice. The qualitative and quantitative data were consistent with previous studies employing other liquid extraction-based surface sampling techniques. The successful analyses performed here utilized the hardware and software elements already present in the NanoMate system developed to handle and analyze liquid samples. Implementation of an appropriate sample (surface) holder, a solvent reservoir, faster movement of the robotic arm, finer control over solvent flow rate when dispensing and retrieving the solution at the surface, and the ability to select any location on a surface to sample from would improve the analytical performance and utility of the platform.

  9. High-frequency, long-duration water sampling in acid mine drainage studies: a short review of current methods and recent advances in automated water samplers

    USGS Publications Warehouse

    Chapin, Thomas

    2015-01-01

    Hand-collected grab samples are the most common water sampling method but using grab sampling to monitor temporally variable aquatic processes such as diel metal cycling or episodic events is rarely feasible or cost-effective. Currently available automated samplers are a proven, widely used technology and typically collect up to 24 samples during a deployment. However, these automated samplers are not well suited for long-term sampling in remote areas or in freezing conditions. There is a critical need for low-cost, long-duration, high-frequency water sampling technology to improve our understanding of the geochemical response to temporally variable processes. This review article will examine recent developments in automated water sampler technology and utilize selected field data from acid mine drainage studies to illustrate the utility of high-frequency, long-duration water sampling.

  10. SU-E-I-94: Automated Image Quality Assessment of Radiographic Systems Using An Anthropomorphic Phantom

    SciTech Connect

    Wells, J; Wilson, J; Zhang, Y; Samei, E; Ravin, Carl E.

    2014-06-01

    Purpose: In a large, academic medical center, consistent radiographic imaging performance is difficult to routinely monitor and maintain, especially for a fleet consisting of multiple vendors, models, software versions, and numerous imaging protocols. Thus, an automated image quality control methodology has been implemented using routine image quality assessment with a physical, stylized anthropomorphic chest phantom. Methods: The “Duke” Phantom (Digital Phantom 07-646, Supertech, Elkhart, IN) was imaged twice on each of 13 radiographic units from a variety of vendors at 13 primary care clinics. The first acquisition used the clinical PA chest protocol to acquire the post-processed “FOR PRESENTATION” image. The second image was acquired without an antiscatter grid followed by collection of the “FOR PROCESSING” image. Manual CNR measurements were made from the largest and thickest contrast-detail inserts in the lung, heart, and abdominal regions of the phantom in each image. An automated image registration algorithm was used to estimate the CNR of the same insert using similar ROIs. Automated measurements were then compared to the manual measurements. Results: Automatic and manual CNR measurements obtained from “FOR PRESENTATION” images had average percent differences of 0.42%±5.18%, −3.44%±4.85%, and 1.04%±3.15% in the lung, heart, and abdominal regions, respectively; measurements obtained from “FOR PROCESSING” images had average percent differences of -0.63%±6.66%, −0.97%±3.92%, and −0.53%±4.18%, respectively. The maximum absolute difference in CNR was 15.78%, 10.89%, and 8.73% in the respective regions. In addition to CNR assessment of the largest and thickest contrast-detail inserts, the automated method also provided CNR estimates for all 75 contrast-detail inserts in each phantom image. Conclusion: Automated analysis of a radiographic phantom has been shown to be a fast, robust, and objective means for assessing radiographic
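    The per-insert measurement itself reduces to a contrast-to-noise ratio computed from two regions of interest. The sketch below illustrates that calculation for one insert; the ROI coordinates and sizes are hypothetical, and the automated phantom registration step that locates them is not shown.

```python
# Contrast-to-noise ratio for one contrast-detail insert, given registered
# ROI centers for the insert and a nearby background patch.
import numpy as np

def roi_mean_std(image: np.ndarray, row: int, col: int, half: int = 10):
    """Mean and standard deviation of a square ROI centered at (row, col)."""
    patch = image[row - half:row + half, col - half:col + half]
    return patch.mean(), patch.std(ddof=1)

def cnr(image: np.ndarray, insert_rc, background_rc, half: int = 10) -> float:
    m_ins, _ = roi_mean_std(image, *insert_rc, half)
    m_bkg, s_bkg = roi_mean_std(image, *background_rc, half)
    return abs(m_ins - m_bkg) / s_bkg

# Repeating this for all 75 contrast-detail inserts, for both acquisitions
# on each unit, yields the per-region CNR values compared in the abstract.
```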

  11. A New Automated Sample Transfer System for Instrumental Neutron Activation Analysis

    PubMed Central

    Ismail, S. S.

    2010-01-01

    A fully automated and fast pneumatic transport system for short-time activation analysis was recently developed. It is suitable for small nuclear research reactors or laboratories that use neutron generators and other neutron sources. It is equipped with a programmable logic controller, a software package, and 12 devices to facilitate optimal analytical procedures. Only 550 ms were necessary to transfer the irradiated capsule (diameter: 15 mm, length: 50 mm, weight: 4 g) to the counting chamber over a distance of 20 meters, using pressurized air (4 bar) as the transport gas. PMID:20369063

  12. An automated procedure for the simultaneous determination of specific conductance and pH in natural water samples

    USGS Publications Warehouse

    Erdmann, D.E.; Taylor, H.E.

    1978-01-01

    An automated, continuous-flow system is utilized to determine specific conductance and pH simultaneously in natural waters. A direct electrometric procedure is used to determine values in the range pH 4-9. The specific conductance measurements are made with an electronically modified, commercially available conductivity meter interfaced to a separate module containing the readout control devices and printer. The system is designed to switch ranges automatically to accommodate optimum analysis of widely varying conductances ranging from a few µmhos cm-1 to 15,000 µmho cm-1. Thirty samples per hour can be analyzed. Comparison of manual and automated procedures for 40 samples showed that the average differences were 1.3% for specific conductance and 0.07 units for pH. The relative standard deviation for 25 replicate values for each of five samples was significantly less than 1% for the specific conductance determination; the standard deviation for the pH determination was ≤0.06 pH units. © 1978.

  13. Sample registration software for process automation in the Neutron Activation Analysis (NAA) Facility in Malaysia nuclear agency

    NASA Astrophysics Data System (ADS)

    Rahman, Nur Aira Abd; Yussup, Nolida; Salim, Nazaratul Ashifa Bt. Abdullah; Ibrahim, Maslina Bt. Mohd; Mokhtar, Mukhlis B.; Soh@Shaari, Syirrazie Bin Che; Azman, Azraf B.; Ismail, Nadiah Binti

    2015-04-01

    Neutron Activation Analysis (NAA) has been established at Nuclear Malaysia since the 1980s. Most of the established procedures, including sample registration, were done manually. The samples were recorded manually in a logbook and given an ID number. Then all samples, standards, SRMs and blanks were recorded on the irradiation vials and on several forms prior to irradiation. These manual procedures carried out by the NAA laboratory personnel were time consuming and not efficient. Sample registration software is developed as part of the IAEA/CRP project on `Development of Process Automation in the Neutron Activation Analysis (NAA) Facility in Malaysia Nuclear Agency (RC17399)'. The objective of the project is to create PC-based data-entry software for the sample preparation stage. This is an effective method to replace redundant manual data entries that need to be completed by laboratory personnel. The software developed will automatically generate a sample code for each sample in a batch, create printable registration forms for administrative purposes, and store selected parameters that will be passed to the sample analysis program. The software is developed using National Instruments LabVIEW 8.6.

  14. Sample registration software for process automation in the Neutron Activation Analysis (NAA) Facility in Malaysia nuclear agency

    SciTech Connect

    Rahman, Nur Aira Abd Yussup, Nolida; Ibrahim, Maslina Bt. Mohd; Mokhtar, Mukhlis B.; Soh Shaari, Syirrazie Bin Che; Azman, Azraf B.; Salim, Nazaratul Ashifa Bt. Abdullah; Ismail, Nadiah Binti

    2015-04-29

    Neutron Activation Analysis (NAA) has been established at Nuclear Malaysia since the 1980s. Most of the established procedures, including sample registration, were done manually. The samples were recorded manually in a logbook and given an ID number. Then all samples, standards, SRMs and blanks were recorded on the irradiation vials and on several forms prior to irradiation. These manual procedures carried out by the NAA laboratory personnel were time consuming and not efficient. Sample registration software is developed as part of the IAEA/CRP project on ‘Development of Process Automation in the Neutron Activation Analysis (NAA) Facility in Malaysia Nuclear Agency (RC17399)’. The objective of the project is to create PC-based data-entry software for the sample preparation stage. This is an effective method to replace redundant manual data entries that need to be completed by laboratory personnel. The software developed will automatically generate a sample code for each sample in a batch, create printable registration forms for administrative purposes, and store selected parameters that will be passed to the sample analysis program. The software is developed using National Instruments LabVIEW 8.6.

  15. Automated retinal image quality assessment on the UK Biobank dataset for epidemiological studies.

    PubMed

    Welikala, R A; Fraz, M M; Foster, P J; Whincup, P H; Rudnicka, A R; Owen, C G; Strachan, D P; Barman, S A

    2016-04-01

    Morphological changes in the retinal vascular network are associated with future risk of many systemic and vascular diseases. However, uncertainty over the presence and nature of some of these associations exists. Analysis of data from large population based studies will help to resolve these uncertainties. The QUARTZ (QUantitative Analysis of Retinal vessel Topology and siZe) retinal image analysis system allows automated processing of large numbers of retinal images. However, an image quality assessment module is needed to achieve full automation. In this paper, we propose such an algorithm, which uses the segmented vessel map to determine the suitability of retinal images for use in the creation of vessel morphometric data suitable for epidemiological studies. This includes an effective 3-dimensional feature set and support vector machine classification. A random subset of 800 retinal images from UK Biobank (a large prospective study of 500,000 middle aged adults; where 68,151 underwent retinal imaging) was used to examine the performance of the image quality algorithm. The algorithm achieved a sensitivity of 95.33% and a specificity of 91.13% for the detection of inadequate images. The strong performance of this image quality algorithm will make rapid automated analysis of vascular morphometry feasible on the entire UK Biobank dataset (and other large retinal datasets), with minimal operator involvement, and at low cost. PMID:26894596

  16. ALVEOLAR BREATH SAMPLING AND ANALYSIS IN HUMAN EXPOSURE ASSESSMENT STUDIES

    EPA Science Inventory

    Alveolar breath sampling and analysis can be extremely useful in exposure assessment studies involving volatile organic compounds (VOCs). Over recent years scientists from the EPA's National Exposure Research Laboratory have developed and refined an alveolar breath collection ...

  17. Assessing V and V Processes for Automation with Respect to Vulnerabilities to Loss of Airplane State Awareness

    NASA Technical Reports Server (NTRS)

    Whitlow, Stephen; Wilkinson, Chris; Hamblin, Chris

    2014-01-01

    Automation has contributed substantially to the sustained improvement of aviation safety by minimizing the physical workload of the pilot and increasing operational efficiency. Nevertheless, in complex and highly automated aircraft, automation also has unintended consequences. As systems become more complex and the authority and autonomy (A&A) of the automation increases, human operators become relegated to the role of a system supervisor or administrator, a passive role not conducive to maintaining engagement and airplane state awareness (ASA). The consequence is that flight crews can often come to over-rely on the automation, become less engaged in the human-machine interaction, and lose awareness of the automation mode under which the aircraft is operating. Likewise, the complexity of the system and automation modes may lead to poor understanding of the interaction between a mode of automation and a particular system configuration or phase of flight. These and other examples of mode confusion often lead to mismanaging the aircraft's energy state or the aircraft deviating from the intended flight path. This report examines methods for assessing whether, and how, operational constructs properly assign authority and autonomy in a safe and coordinated manner, with particular emphasis on assuring adequate airplane state awareness by the flight crew and air traffic controllers in off-nominal and/or complex situations.

  18. Non-destructive automated sampling of mycotoxins in bulk food and feed - A new tool for required harmonization.

    PubMed

    Spanjer, M; Stroka, J; Patel, S; Buechler, S; Pittet, A; Barel, S

    2001-06-01

    Mycotoxin contamination is highly non-uniformly distributed, as is well recognized by the EC, which has not only set legal limits for a series of commodities but also scheduled a sampling plan that takes this heterogeneity into account. In practice, however, it turns out that it is very difficult to carry out this sampling plan in a harmonised way. Applying the sampling plan to a container filled with pallets of bags (i.e., with nuts or coffee beans) varies from very laborious to almost impossible. The presented non-destructive automated method to sample bulk food could help to overcome these practical problems and to enforce EC directives. It is derived from a tested and approved technology for detection of illicit substances in security applications. It has the capability to collect and identify ultra-trace contaminants, i.e., a fingerprint of a chemical substance in a bulk of goods, such as a cargo pallet load (~1000 kg) with boxes and commodities. The technology, patented for explosives detection, uses physical and chemical processes for excitation and remote, rapid, enhanced release of contaminant residues, vapours and particulates from the inner/outer surfaces of the inspected bulk, and collects them on selective probes. The process is automated, takes only 10 minutes, is non-destructive, and the bulk itself remains unharmed. The system design is based on applicable international regulations for shipped cargo handling and transportation by road, sea and air. After this process the pallet can be loaded on a truck, ship or plane. Analysis can be carried out before the cargo leaves the place of shipping. The potential application of this technology for mycotoxin detection has been demonstrated by preliminary feasibility experiments. Aflatoxins were detected in bulk pistachios and ochratoxin A in bulk green coffee beans. Both commodities were naturally contaminated, previously found and confirmed by the common methods used in routine inspections. Once the contaminants are extracted from a

  19. Automated assessment of noninvasive filling pressure using color Doppler M-mode echocardiography

    NASA Technical Reports Server (NTRS)

    Greenberg, N. L.; Firstenberg, M. S.; Cardon, L. A.; Zuckerman, J.; Levine, B. D.; Garcia, M. J.; Thomas, J. D.

    2001-01-01

    Assessment of left ventricular filling pressure usually requires invasive hemodynamic monitoring to follow the progression of disease or the response to therapy. Previous investigations have shown accurate estimation of wedge pressure using noninvasive Doppler information obtained from the ratio of the wave propagation slope from color M-mode (CMM) images and the peak early diastolic filling velocity from transmitral Doppler images. This study reports an automated algorithm that derives an estimate of wedge pressure based on the spatiotemporal velocity distribution available from digital CMM Doppler images of LV filling.
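    For illustration only, one earlier published regression (Garcia et al.) relates the ratio of peak early transmitral velocity (E) to the color M-mode propagation velocity (Vp) to wedge pressure. The coefficients below are taken approximately from that prior literature and are assumptions here; the automated algorithm described above derives its estimate from the full spatiotemporal velocity distribution rather than this simple ratio.

```python
# Illustrative noninvasive filling-pressure estimate from E/Vp; the
# regression coefficients are approximate values from earlier literature,
# NOT parameters reported in the abstract above.
def estimated_wedge_pressure(e_velocity_cm_s: float, vp_cm_s: float) -> float:
    """Rough estimate of pulmonary capillary wedge pressure in mmHg."""
    return 5.27 * (e_velocity_cm_s / vp_cm_s) + 4.6

# Example: E = 80 cm/s, Vp = 40 cm/s gives roughly 15 mmHg.
print(round(estimated_wedge_pressure(80.0, 40.0), 1))
```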

  20. Automated recognition and assessment of cross peaks in two-dimensional NMR spectra of macromolecules

    NASA Astrophysics Data System (ADS)

    Glaser, S.; Kalbitzer, H. R.

    A generally applicable procedure for the automated recognition of cross peaks in two-dimensional NMR spectra is presented which exploits local and global spectral properties. It is mainly based on general symmetry considerations which apply to the two-dimensional homonuclear techniques commonly used for the structural determination of macromolecules in solution. The corresponding PASCAL program has been tested on a double-quantum-filtered COSY spectrum of a small protein; the results show that the recognition of cross peaks and their assessment work effectively even on spectra with intense t1 noise and experimental artifacts as are typically obtained for biological macromolecules with relatively low solubility.

  1. IntelliCages and automated assessment of learning in group-housed mice

    NASA Astrophysics Data System (ADS)

    Puścian, Alicja; Knapska, Ewelina

    2014-11-01

    IntelliCage is a fully automated, computer controlled system, which can be used for long-term monitoring of behavior of group-housed mice. Using standardized experimental protocols we can assess cognitive abilities and behavioral flexibility in appetitively and aversively motivated tasks, as well as measure social influences on learning of the subjects. We have also identified groups of neurons specifically activated by appetitively and aversively motivated learning within the amygdala, function of which we are going to investigate optogenetically in the future.

  2. Sequential sampling: a novel method in farm animal welfare assessment.

    PubMed

    Heath, C A E; Main, D C J; Mullan, S; Haskell, M J; Browne, W J

    2016-02-01

    Lameness in dairy cows is an important welfare issue. As part of a welfare assessment, herd level lameness prevalence can be estimated from scoring a sample of animals, where higher levels of accuracy are associated with larger sample sizes. As the financial cost is related to the number of cows sampled, smaller samples are preferred. Sequential sampling schemes have been used for informing decision making in clinical trials. Sequential sampling involves taking samples in stages, where sampling can stop early depending on the estimated lameness prevalence. When welfare assessment is used for a pass/fail decision, a similar approach could be applied to reduce the overall sample size. The sampling schemes proposed here apply the principles of sequential sampling within a diagnostic testing framework. This study develops three sequential sampling schemes of increasing complexity to classify 80 fully assessed UK dairy farms, each with known lameness prevalence. Using the Welfare Quality herd-size-based sampling scheme, the first 'basic' scheme involves two sampling events. At the first sampling event half the Welfare Quality sample size is drawn, and then depending on the outcome, sampling either stops or is continued and the same number of animals is sampled again. In the second 'cautious' scheme, an adaptation is made to ensure that correctly classifying a farm as 'bad' is done with greater certainty. The third scheme is the only scheme to go beyond lameness as a binary measure and investigates the potential for increasing accuracy by incorporating the number of severely lame cows into the decision. The three schemes are evaluated with respect to accuracy and average sample size by running 100 000 simulations for each scheme, and a comparison is made with the fixed size Welfare Quality herd-size-based sampling scheme. All three schemes performed almost as well as the fixed size scheme but with much smaller average sample sizes. For the third scheme, an overall
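    A toy simulation of the 'basic' two-stage scheme helps make the idea concrete: draw half of the reference sample, stop early if the interim estimate is clearly on one side of a pass/fail threshold, and otherwise draw the second half. The stopping rule, threshold and sample sizes below are hypothetical, not the scheme's actual parameters.

```python
# Toy two-stage sequential sampling decision for one simulated herd.
import numpy as np

def two_stage_decision(herd_prevalence: float, full_n: int = 60,
                       fail_threshold: float = 0.15, margin: float = 0.05,
                       rng=np.random.default_rng(0)):
    """Return ('pass'|'fail', animals_sampled) for one simulated herd."""
    first = rng.random(full_n // 2) < herd_prevalence   # stage 1: half the sample
    p1 = first.mean()
    if p1 <= fail_threshold - margin:                   # clearly below: stop early
        return "pass", first.size
    if p1 >= fail_threshold + margin:                   # clearly above: stop early
        return "fail", first.size
    second = rng.random(full_n - first.size) < herd_prevalence   # stage 2
    p = np.concatenate([first, second]).mean()
    return ("fail" if p >= fail_threshold else "pass"), full_n

# Repeating this many times per herd of known prevalence gives the accuracy
# and average-sample-size comparisons against a fixed-size scheme.
```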

  3. Assessment of Diesse Ves-matic automated system for measuring erythrocyte sedimentation rate.

    PubMed Central

    Caswell, M; Stuart, J

    1991-01-01

    Measurement of the erythrocyte sedimentation rate (ESR) using a closed tube system reduces the biohazard risk to laboratory staff. The Diesse Ves-matic system offers manual or vacuum collection of blood into plastic tubes, automated mixing of the sample, and automated reading of the end point after 20 minutes of sedimentation. This system was compared with the 1977 Westergren ESR method of the International Council for Standardization in Haematology (ICSH) and with the 1988 ICSH undiluted ESR method. Manually collected Ves-matic samples showed good agreement with ICSH values, although there was a tendency to false low results at low ESR values which may represent dilution of plasma protein with excess citrate. Vacuum collected Ves-matic samples also showed good agreement with ICSH values, although there was a tendency to false high results which may reflect a change in the blood: citrate ratio caused by loss of anticoagulant diluent or vacuum from plastic tubes during storage. The Diesse Ves-matic system incorporates several improvements over previous technology and offers a safer, quicker, and more standardised ESR. PMID:1752986

  4. Automated method for simultaneous lead and strontium isotopic analysis applied to rainwater samples and airborne particulate filters (PM10).

    PubMed

    Beltrán, Blanca; Avivar, Jessica; Mola, Montserrat; Ferrer, Laura; Cerdà, Víctor; Leal, Luz O

    2013-09-01

    A new automated, sensitive, and fast system for the simultaneous online isolation and preconcentration of lead and strontium by sorption on a microcolumn packed with Sr-resin using an inductively coupled plasma mass spectrometry (ICP-MS) detector was developed, hyphenating lab-on-valve (LOV) and multisyringe flow injection analysis (MSFIA). Pb and Sr are directly retained on the sorbent column and eluted with a solution of 0.05 mol L(-1) ammonium oxalate. The detection limits achieved were 0.04 ng for lead and 0.03 ng for strontium. Mass calibration curves were used since the proposed system allows the use of different sample volumes for preconcentration. Mass linear working ranges were between 0.13 and 50 ng and 0.1 and 50 ng for lead and strontium, respectively. The repeatability of the method, expressed as RSD, was 2.1% and 2.7% for Pb and Sr, respectively. Environmental samples such as rainwater and airborne particulate (PM10) filters as well as a certified reference material SLRS-4 (river water) were satisfactorily analyzed obtaining recoveries between 90 and 110% for both elements. The main features of the LOV-MSFIA-ICP-MS system proposed are the capability to renew solid phase extraction at will in a fully automated way, the remarkable stability of the column which can be reused up to 160 times, and the potential to perform isotopic analysis. PMID:23883353

  5. Measurement, Sampling, and Equating Errors in Large-Scale Assessments

    ERIC Educational Resources Information Center

    Wu, Margaret

    2010-01-01

    In large-scale assessments, such as state-wide testing programs, national sample-based assessments, and international comparative studies, there are many steps involved in the measurement and reporting of student achievement. There are always sources of inaccuracies in each of the steps. It is of interest to identify the source and magnitude of…

  6. Automated Ground-Water Sampling and Analysis of Hexavalent Chromium using a “Universal” Sampling/Analytical System

    PubMed Central

    Burge, Scott R.; Hoffman, Dave A.; Hartman, Mary J.; Venedam, Richard J.

    2005-01-01

    The capabilities of a “universal platform” for the deployment of analytical sensors in the field for long-term monitoring of environmental contaminants were expanded in this investigation. The platform was previously used to monitor trichloroethene in monitoring wells and at groundwater treatment systems (1,2). The platform was interfaced with chromium (VI) and conductivity analytical systems to monitor shallow wells installed adjacent to the Columbia River at the 100-D Area of the Hanford Site, Washington. A groundwater plume of hexavalent chromium is discharging into the Columbia River through the gravel beds used by spawning salmon. The sampling/analytical platform was deployed for the purpose of collecting data on subsurface hexavalent chromium concentrations at more frequent intervals than was possible with the previous sampling and analysis methods employed at the Site.

  7. The Effects of Finite Sampling on State Assessment Sample Requirements. NAEP Validity Studies. Working Paper Series.

    ERIC Educational Resources Information Center

    Chromy, James R.

    This study addressed statistical techniques that might ameliorate some of the sampling problems currently facing states with small populations participating in State National Assessment of Educational Progress (NAEP) assessments. The study explored how the application of finite population correction factors to the between-school component of…
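    The finite population correction mentioned above can be illustrated in a few lines: when a large fraction of a small state's schools is sampled, the between-school variance contribution shrinks accordingly. The numbers below are hypothetical.

```python
# Finite population correction applied to a between-school variance
# component; values are hypothetical and purely illustrative.
def fpc_adjusted_variance(between_school_var: float,
                          n_sampled: int, n_total: int) -> float:
    """Scale the between-school component by (N - n) / N."""
    return between_school_var * (n_total - n_sampled) / n_total

# Example: sampling 90 of a state's 120 schools leaves only a quarter of
# the between-school variance component.
print(fpc_adjusted_variance(4.0, 90, 120))   # -> 1.0
```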

  8. Sequential automated fusion/extraction chromatography methodology for the dissolution of uranium in environmental samples for mass spectrometric determination.

    PubMed

    Milliard, Alex; Durand-Jézéquel, Myriam; Larivière, Dominic

    2011-01-17

    An improved methodology has been developed, based on dissolution by automated fusion followed by extraction chromatography for the detection and quantification of uranium in environmental matrices by mass spectrometry. A rapid fusion protocol (<8 min) was investigated for the complete dissolution of various samples. It could be preceded, if required, by an effective ashing procedure using the M4 fluxer and a newly designed platinum lid. Complete dissolution of the sample was observed and measured using standard reference materials (SRMs) and experimental data show no evidence of cross-contamination of crucibles when LiBO(2)/LiBr melts were used. The use of a M4 fusion unit also improved repeatability in sample preparation over muffle furnace fusion. Instrumental issues originating from the presence of high salt concentrations in the digestate after lithium metaborate fusion were also mitigated using an extraction chromatography (EXC) protocol aimed at removing lithium and interfering matrix constituents prior to the elution of uranium. The sequential methodology, which can be performed simultaneously on three samples, requires less than 20 min per sample for fusion and separation. It was successfully coupled to inductively coupled plasma mass spectrometry (ICP-MS) achieving detection limits below 100 pg kg(-1) for 5-300 mg of sample. PMID:21167982

  9. Automating Flood Hazard Mapping Methods for Near Real-time Storm Surge Inundation and Vulnerability Assessment

    NASA Astrophysics Data System (ADS)

    Weigel, A. M.; Griffin, R.; Gallagher, D.

    2015-12-01

    Storm surge has enough destructive power to damage buildings and infrastructure, erode beaches, and threaten human life across large geographic areas, hence posing the greatest threat of all the hurricane hazards. The United States Gulf of Mexico has proven vulnerable to hurricanes as it has been hit by some of the most destructive hurricanes on record. With projected rises in sea level and increases in hurricane activity, there is a need to better understand the associated risks for disaster mitigation, preparedness, and response. GIS has become a critical tool in enhancing disaster planning, risk assessment, and emergency response by communicating spatial information through a multi-layer approach. However, there is a need for a near real-time method of identifying areas with a high risk of being impacted by storm surge. Research was conducted alongside Baron, a private industry weather enterprise, to facilitate automated modeling and visualization of storm surge inundation and vulnerability on a near real-time basis. This research successfully automated current flood hazard mapping techniques using a GIS framework written in a Python programming environment, and displayed resulting data through an Application Program Interface (API). Data used for this methodology included high resolution topography, NOAA Probabilistic Surge model outputs parsed from Rich Site Summary (RSS) feeds, and the NOAA Census tract level Social Vulnerability Index (SoVI). The development process required extensive data processing and management to provide high resolution visualizations of potential flooding and population vulnerability in a timely manner. The accuracy of the developed methodology was assessed using Hurricane Isaac as a case study, which through a USGS and NOAA partnership, contained ample data for statistical analysis. This research successfully created a fully automated, near real-time method for mapping high resolution storm surge inundation and vulnerability for the
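
    The abstract describes the automated workflow only at a high level; as a minimal illustration of the core overlay step, the sketch below subtracts a terrain grid from a surge water-surface grid to obtain inundation depth ("bathtub" style). The file names, the use of the rasterio library, and the assumption that both rasters are already co-registered are ours, not details taken from the study.

      import numpy as np
      import rasterio

      def inundation_depth(surge_path, dem_path, out_path):
          # surge raster: water-surface elevation (m); dem raster: ground elevation (m)
          with rasterio.open(surge_path) as s, rasterio.open(dem_path) as d:
              surge = s.read(1).astype("float32")
              dem = d.read(1).astype("float32")
              profile = d.profile
          depth = surge - dem
          depth[depth <= 0] = np.nan          # cells above the water surface stay dry
          profile.update(dtype="float32", count=1, nodata=np.nan)
          with rasterio.open(out_path, "w", **profile) as dst:
              dst.write(depth, 1)

      # hypothetical call: inundation_depth("psurge_10pct.tif", "lidar_dem.tif", "depth.tif")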

  10. Use of automated monitoring to assess behavioral toxicology in fish: Linking behavior and physiology

    USGS Publications Warehouse

    Brewer, S.K.; DeLonay, A.J.; Beauvais, S.L.; Little, E.E.; Jones, S.B.

    1999-01-01

    We measured locomotory behaviors (distance traveled, speed, tortuosity of path, and rate of change in direction) with computer-assisted analysis in 30 day posthatch rainbow trout (Oncorhynchus mykiss) exposed to pesticides. We also examined cholinesterase inhibition as a potential endpoint linking physiology and behavior. Sublethal exposure to chemicals often causes changes in swimming behavior, reflecting alterations in sensory and motor systems. Swimming behavior also integrates functions of the nervous system. Rarely are the connections between physiology and behavior made. Although behavior is often suggested as a sensitive, early indicator of toxicity, behavioral toxicology has not been used to its full potential because conventional methods of behavioral assessment have relied on manual techniques, which are often time-consuming and difficult to quantify. This has severely limited the application and utility of behavioral procedures. Swimming behavior is particularly amenable to computerized assessment and automated monitoring. Locomotory responses are sensitive to toxicants and can be easily measured. We briefly discuss the use of behavior in toxicology and automated techniques used in behavioral toxicology. We also describe the system we used to determine locomotory behaviors of fish, and present data demonstrating the system's effectiveness in measuring alterations in response to chemical challenges. Lastly, we correlate behavioral and physiological endpoints.
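
    The four locomotory endpoints named above can all be derived from a digitized (x, y) track; the sketch below shows one common set of definitions (total path length, mean speed, tortuosity as path length over net displacement, and mean absolute turning rate). These definitions and the fixed frame rate are assumptions for illustration, not necessarily those used by the authors' video-tracking software.

      import numpy as np

      def locomotion_metrics(x, y, fps=30.0):
          # x, y: coordinates of the fish centroid, one sample per video frame
          x, y = np.asarray(x, float), np.asarray(y, float)
          dx, dy = np.diff(x), np.diff(y)
          step = np.hypot(dx, dy)
          distance = step.sum()                               # total distance traveled
          duration = (len(x) - 1) / fps                       # seconds
          speed = distance / duration                         # mean swimming speed
          net = np.hypot(x[-1] - x[0], y[-1] - y[0])
          tortuosity = distance / net if net > 0 else np.inf  # 1.0 = perfectly straight path
          heading = np.unwrap(np.arctan2(dy, dx))
          turn_rate = np.degrees(np.abs(np.diff(heading))).sum() / duration  # deg/s
          return {"distance": distance, "speed": speed,
                  "tortuosity": tortuosity, "turn_rate": turn_rate}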

  11. Development of a fully automated Flow Injection analyzer implementing bioluminescent biosensors for water toxicity assessment.

    PubMed

    Komaitis, Efstratios; Vasiliou, Efstathios; Kremmydas, Gerasimos; Georgakopoulos, Dimitrios G; Georgiou, Constantinos

    2010-01-01

    This paper describes the development of an automated Flow Injection analyzer for water toxicity assessment. The analyzer is validated by assessing the toxicity of heavy metal (Pb(2+), Hg(2+) and Cu(2+)) solutions. One hundred μL of a Vibrio fischeri suspension are injected into a carrier solution containing different heavy metal concentrations. Biosensor cells are mixed with the toxic carrier solution in the mixing coil on the way to the detector. The registered response is the % inhibition of biosensor bioluminescence due to heavy metal toxicity, relative to that obtained by injecting the Vibrio fischeri suspension into deionised water. Carrier solutions of mercury showed higher toxicity than those of the other heavy metals, and all metals showed concentration-related levels of toxicity. The biosensor's response to carrier solutions of different pHs was tested. Vibrio fischeri's bioluminescence is promoted in the pH 5-10 range. Experiments indicate that the whole cell biosensor, as applied in the automated fluidic system, responds to various toxic solutions. PMID:22163592
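
    Assuming the conventional definition (the exact formulation is not stated in the abstract), the registered response is the fractional loss of light output relative to the deionised-water control:

      \[
      \%\ \text{Inhibition} = 100 \times \frac{I_{0} - I_{\text{sample}}}{I_{0}}
      \]

    where I0 is the bioluminescence measured when the Vibrio fischeri suspension is injected into deionised water and Isample the bioluminescence measured in the heavy-metal carrier solution.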

  12. Solid recovered fuels in the cement industry--semi-automated sample preparation unit as a means for facilitated practical application.

    PubMed

    Aldrian, Alexia; Sarc, Renato; Pomberger, Roland; Lorber, Karl E; Sipple, Ernst-Michael

    2016-03-01

    One of the challenges for the cement industry is the quality assurance of alternative fuel (e.g., solid recovered fuel, SRF) in co-incineration plants--especially for inhomogeneous alternative fuels with large particle sizes (d95⩾100 mm), which will gain even more importance in the substitution of conventional fuels due to low production costs. Existing standards for sampling and sample preparation do not cover the challenges resulting from these kinds of materials. A possible approach to ensure quality monitoring is shown in the present contribution. For this, a specially manufactured, automated comminution and sample divider device was installed at a cement plant in Rohožnik. In order to prove its practical suitability with methods according to current standards, the sampling and sample preparation processes were validated for alternative fuel with a grain size >30 mm (i.e., d95=approximately 100 mm), so-called 'Hotdisc SRF'. Therefore, series of samples were taken and analysed. A comparison of the analysis results with the yearly average values obtained through a reference investigation route showed good agreement. Further investigations during the validation process also showed that segregation or enrichment of material throughout the comminution plant does not occur. The results also demonstrate that compliance with legal standards regarding the minimum sample amount is not sufficient for inhomogeneous and coarse particle size alternative fuels. Instead, higher sample amounts after the first particle size reduction step are strongly recommended in order to gain a representative laboratory sample. PMID:26759433

  13. Automation of ⁹⁹Tc extraction by LOV prior ICP-MS detection: application to environmental samples.

    PubMed

    Rodríguez, Rogelio; Leal, Luz; Miranda, Silvia; Ferrer, Laura; Avivar, Jessica; García, Ariel; Cerdà, Víctor

    2015-02-01

    A new, fast, automated and inexpensive sample pre-treatment method for (99)Tc determination by inductively coupled plasma-mass spectrometry (ICP-MS) detection is presented. The miniaturized approach is based on a lab-on-valve (LOV) system, allowing automatic separation and preconcentration of (99)Tc. Selectivity is provided by the solid phase extraction system used (TEVA resin), which selectively retains the pertechnetate ion in dilute nitric acid solution. The proposed system has some advantages such as minimization of sample handling, reduction of reagent volume, and improvement of intermediate precision and sample throughput, offering a significant decrease of both time and cost per analysis in comparison to other flow techniques and batch methods. The proposed LOV system has been successfully applied to different samples of environmental interest (water and soil) with satisfactory recoveries, between 94% and 98%. The detection limit (LOD) of the developed method is 0.005 ng. The high durability of the resin and its low amount (32 mg), its good intermediate precision (RSD 3.8%) and repeatability (RSD 2%) and its high extraction frequency (up to 5 h(-1)) make this method an inexpensive, high-precision and fast tool for monitoring (99)Tc in environmental samples. PMID:25435232

  14. Low-pressure, automated, sample packing unit for diffuse reflectance infrared spectrometry

    NASA Astrophysics Data System (ADS)

    Christy, Alfred A.; Tvedt, Jan Erik; Karstang, Terje V.; Velapoldi, Rance A.

    1988-03-01

    An automatic, low-pressure packing unit has been designed with control of packing time and pressure to prepare powder samples for diffuse reflectance infrared Fourier transform spectroscopy (DRIFTS). This unit also provides a polished packing surface that ensures constant measurement height of the sample in the spectrometer. Use of this unit, coupled with sample rotation during measurement and control of particle size and size distribution, provides excellent precision in obtaining DRIFTS spectra. For example, repackings by a single person or by several untrained people gave coefficients of variation from 0.8% to 2.3% for each digital spectral value for a coal sample and from 1.3% to 3.7% for thymol blue, an organic compound with sharp spectral features, rather than the 15%-30% normally found for repackings of the same sample. Thus, a representative DRIFTS spectrum can be obtained quickly and efficiently from a single packing of a powder sample using this low-pressure, mechanical packing device, control of particle parameters, and sample rotation, as opposed to previous efforts requiring the repacking of several samples and averaging of the spectra.

  15. Rapid habitability assessment of Mars samples by pyrolysis-FTIR

    NASA Astrophysics Data System (ADS)

    Gordon, Peter R.; Sephton, Mark A.

    2016-02-01

    Pyrolysis Fourier transform infrared spectroscopy (pyrolysis FTIR) is a potential sample selection method for Mars Sample Return missions. FTIR spectroscopy can be performed on solid and liquid samples but also on gases following preliminary thermal extraction, pyrolysis or gasification steps. The detection of hydrocarbon and non-hydrocarbon gases can reveal information on sample mineralogy and past habitability of the environment in which the sample was created. The absorption of IR radiation at specific wavenumbers by organic functional groups can indicate the presence and type of any organic matter present. Here we assess the utility of pyrolysis-FTIR to release water, carbon dioxide, sulfur dioxide and organic matter from Mars relevant materials to enable a rapid habitability assessment of target rocks for sample return. For our assessment a range of minerals were analyzed by attenuated total reflectance FTIR. Subsequently, the mineral samples were subjected to single step pyrolysis and multi step pyrolysis and the products characterised by gas phase FTIR. Data from both single step and multi step pyrolysis-FTIR provide the ability to identify minerals that reflect habitable environments through their water and carbon dioxide responses. Multi step pyrolysis-FTIR can be used to gain more detailed information on the sources of the liberated water and carbon dioxide owing to the characteristic decomposition temperatures of different mineral phases. Habitation can be suggested when pyrolysis-FTIR indicates the presence of organic matter within the sample. Pyrolysis-FTIR, therefore, represents an effective method to assess whether Mars Sample Return target rocks represent habitable conditions and potential records of habitation and can play an important role in sample triage operations.

  16. Rapid and automated sample preparation for nucleic acid extraction on a microfluidic CD (compact disk)

    NASA Astrophysics Data System (ADS)

    Kim, Jitae; Kido, Horacio; Zoval, Jim V.; Gagné, Dominic; Peytavi, Régis; Picard, François J.; Bastien, Martine; Boissinot, Maurice; Bergeron, Michel G.; Madou, Marc J.

    2006-01-01

    Rapid and automated preparation of PCR (polymerase chain reaction)-ready genomic DNA was demonstrated on a multiplexed CD (compact disk) platform by using hard-to-lyse bacterial spores. Cell disruption is carried out while bead-cell suspensions are pushed back and forth in center-tapered lysing chambers by angular oscillation of the disk (the keystone effect). During this lysis period, the cell suspensions are securely held within the lysing chambers by heat-activated wax valves. Upon application of remote heat to the disk in motion, the wax valves release lysate solutions into centrifuge chambers where cell debris are separated by an elevated rotation of the disk. Only debris-free DNA extract is then transferred to collection chambers by capillary-assisted siphon and collected for heating that inactivates PCR inhibitors. Lysing capacity was evaluated using a real-time PCR assay to monitor the efficiency of Bacillus globigii spore lysis. PCR analysis showed that a 5-minute CD lysis run gave spore lysis efficiency similar to that obtained with a popular commercial DNA extraction kit (i.e., IDI-lysis kit from GeneOhm Sciences Inc.) which is highly efficient for microbial cell and spore lysis. This work will contribute to the development of an integrated CD-based assay for rapid diagnosis of infectious diseases.

  17. Automated Sample Preparation Platform for Mass Spectrometry-Based Plasma Proteomics and Biomarker Discovery

    PubMed Central

    Guryča, Vilém; Roeder, Daniel; Piraino, Paolo; Lamerz, Jens; Ducret, Axel; Langen, Hanno; Cutler, Paul

    2014-01-01

    The identification of novel biomarkers from human plasma remains a critical need in order to develop and monitor drug therapies for nearly all disease areas. The discovery of novel plasma biomarkers is, however, significantly hampered by the complexity and dynamic range of proteins within plasma, as well as the inherent variability in composition from patient to patient. In addition, it is widely accepted that most soluble plasma biomarkers for diseases such as cancer will be represented by tissue leakage products, circulating in plasma at low levels. It is therefore necessary to find approaches with the prerequisite level of sensitivity in such a complex biological matrix. Strategies for fractionating the plasma proteome have been suggested, but improvements in sensitivity are often negated by the resultant process variability. Here we describe an approach using multidimensional chromatography and on-line protein derivatization, which allows for higher sensitivity, whilst minimizing the process variability. In order to evaluate this automated process fully, we demonstrate three levels of processing and compare sensitivity, throughput and reproducibility. We demonstrate that high sensitivity analysis of the human plasma proteome is possible down to the low ng/mL or even high pg/mL level with a high degree of technical reproducibility. PMID:24833342

  18. Computer Man Simulation of Incapacitation: An Automated Approach to Wound Ballistics and Associated Medical Care Assessments

    PubMed Central

    Clare, V.; Ashman, W.; Broome, P.; Jameson, J.; Lewis, J.; Merkler, J.; Mickiewicz, A.; Sacco, W.; Sturdivan, L.

    1981-01-01

    Wound ballistics assessments traditionally have been based on correlations between some quantification of “ballistic dose” and an empirical/subjective medical quantification of human functional degradation. Although complicated by the highly inhomogeneous nature of the human body and by the voluminous data handling requirements, these correlation values were obtained by manual methods. The procedure required a substantial commitment of time and resources, thereby restricting the data base from which incapacitation evaluations were made. The obvious advantages of automated wound ballistics analyses have been realized in the ARRADCOM Computer Man System, capable of duplicating the results of the manual system while reducing the time required for each analysis from three months to less than one day. The versatility of the system also makes it readily adaptable to other ballistic, medical, and paramedical assessment tasks.

  19. Automated radioanalytical system incorporating microwave-assisted sample preparation, chemical separation, and online radiometric detection for the monitoring of total 99Tc in nuclear waste processing streams.

    PubMed

    Egorov, Oleg B; O'Hara, Matthew J; Grate, Jay W

    2012-04-01

    An automated fluidic instrument is described that rapidly determines the total (99)Tc content of aged nuclear waste samples, where the matrix is chemically and radiologically complex and the existing speciation of the (99)Tc is variable. The monitor links microwave-assisted sample preparation with an automated anion exchange column separation and detection using a flow-through solid scintillator detector. The sample preparation steps acidify the sample, decompose organics, and convert all Tc species to the pertechnetate anion. The column-based anion exchange procedure separates the pertechnetate from the complex sample matrix, so that radiometric detection can provide accurate measurement of (99)Tc. We developed a preprogrammed spike addition procedure to automatically determine matrix-matched calibration. The overall measurement efficiency that is determined simultaneously provides a self-diagnostic parameter for the radiochemical separation and overall instrument function. Continuous, automated operation was demonstrated over the course of 54 h, which resulted in the analysis of 215 samples plus 54 hourly spike-addition samples, with consistent overall measurement efficiency for the operation of the monitor. A sample can be processed and measured automatically in just 12.5 min with a detection limit of 23.5 Bq/mL of (99)Tc in low activity waste (0.495 mL sample volume), with better than 10% RSD precision at concentrations above the quantification limit. This rapid automated analysis method was developed to support nuclear waste processing operations planned for the Hanford nuclear site. PMID:22440010

  20. Automated Radioanalytical System Incorporating Microwave-Assisted Sample Preparation, Chemical Separation, and Online Radiometric Detection for the Monitoring of Total 99Tc in Nuclear Waste Processing Streams

    SciTech Connect

    Egorov, Oleg; O'Hara, Matthew J.; Grate, Jay W.

    2012-04-03

    An automated fluidic instrument is described that rapidly determines the total 99Tc content of aged nuclear waste samples, where the matrix is chemically and radiologically complex and the existing speciation of the 99Tc is variable. The monitor links microwave-assisted sample preparation with an automated anion exchange column separation and detection using a flow-through solid scintillator detector. The sample preparation steps acidify the sample, decompose organics, and convert all Tc species to the pertechnetate anion. The column-based anion exchange procedure separates the pertechnetate from the complex sample matrix, so that radiometric detection can provide accurate measurement of 99Tc. We developed a preprogrammed spike addition procedure to automatically determine matrix-matched calibration. The overall measurement efficiency that is determined simultaneously provides a self-diagnostic parameter for the radiochemical separation and overall instrument function. Continuous, automated operation was demonstrated over the course of 54 h, which resulted in the analysis of 215 samples plus 54 hourly spike-addition samples, with consistent overall measurement efficiency for the operation of the monitor. A sample can be processed and measured automatically in just 12.5 min with a detection limit of 23.5 Bq/mL of 99Tc in low activity waste (0.495 mL sample volume), with better than 10% RSD precision at concentrations above the quantification limit. This rapid automated analysis method was developed to support nuclear waste processing operations planned for the Hanford nuclear site.
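
    The spike-addition arithmetic is not given in either copy of this record; one common way to express such a matrix-matched calibration, offered here only as an assumed illustration, is to derive the overall measurement efficiency from the difference between the spiked and unspiked count rates and then use it to convert the sample count rate to an activity concentration:

      \[
      \varepsilon = \frac{R_{\text{spiked}} - R_{\text{sample}}}{A_{\text{spike}}}, \qquad
      C_{\text{sample}} = \frac{R_{\text{sample}}}{\varepsilon\, V_{\text{sample}}}
      \]

    where R denotes net count rates, A_spike the known activity of 99Tc added, V_sample the processed sample volume (0.495 mL here), and C_sample the activity concentration in Bq/mL.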

  1. Monitoring cognitive function and need with the automated neuropsychological assessment metrics in Decompression Sickness (DCS) research

    NASA Technical Reports Server (NTRS)

    Nesthus, Thomas E.; Schiflett, Sammuel G.

    1993-01-01

    Hypobaric decompression sickness (DCS) research presents the medical monitor with the difficult task of assessing the onset and progression of DCS largely on the basis of subjective symptoms. Even with the introduction of precordial Doppler ultrasound techniques for the detection of venous gas emboli (VGE), correct prediction of DCS can be made only about 65 percent of the time according to data from the Armstrong Laboratory's (AL's) hypobaric DCS database. An AL research protocol concerned with exercise and its effects on denitrogenation efficiency includes implementation of a performance assessment test battery to evaluate cognitive functioning during a 4-h simulated 30,000 ft (9144 m) exposure. Information gained from such a test battery may assist the medical monitor in identifying early signs of DCS and subtle neurologic dysfunction related to cases of asymptomatic, but advanced, DCS. This presentation concerns the selection and integration of a test battery and the timely graphic display of subject test results for the principal investigator and medical monitor. A subset of the Automated Neuropsychological Assessment Metrics (ANAM) developed through the Office of Military Performance Assessment Technology (OMPAT) was selected. The ANAM software provides a library of simple tests designed for precise measurement of processing efficiency in a variety of cognitive domains. For our application and time constraints, two tests requiring high levels of cognitive processing and memory were chosen along with one test requiring fine psychomotor performance. Accuracy, speed, and processing throughput variables, as well as RMS error, were collected. An automated mood survey provided 'state' information on six scales including anger, happiness, fear, depression, activity, and fatigue. An integrated and interactive LOTUS 1-2-3 macro was developed to import and display past and present task performance and mood-change information.

  2. Determination of actinides in environmental samples using an automated batch preconcentration/matrix elimination system

    SciTech Connect

    Smith, F.G.; Crain, J.S.

    1995-12-31

    The determination of thorium, uranium, and uranium progeny (e.g., 226Ra) in environmental samples is of considerable interest in terms of human health. Traditional radiochemical determinations of long-lived radioisotopes often require rigorous chemical separations and long duration measurements by techniques such as α-spectrometry. Inductively coupled plasma mass spectrometry (ICP-MS) offers sub-ppt (1 ng/L) detection limits for the actinides with minimal sample preparation and high sample throughput. However, sample preconcentration and/or matrix elimination is required to achieve detection limits below 1 ppq (1 pg/L). This paper describes a batch preconcentration/matrix elimination system for off-line sample preparation. An aliquot of actinide-selective polymer beads is added to a sample and pumped through a filter. Unbound sample matrix components are washed to waste, then the beads with bound actinides are released in a small volume. The preconcentrate is then introduced to the ICP-MS by pneumatic or ultrasonic nebulization. Data for a variety of natural water matrices (well, spring, lake, river, and tap water) will be presented.

  3. Investigation of Mercury Wet Deposition Physicochemistry in the Ohio River Valley through Automated Sequential Sampling

    EPA Science Inventory

    Intra-storm variability and soluble fractionation was explored for summer-time rain events in Steubenville, Ohio to evaluate the physical processes controlling mercury (Hg) in wet deposition in this industrialized region. Comprehensive precipitation sample collection was conducte...

  4. Automated sample preparation station for studying self-diffusion in porous solids with NMR spectroscopy

    SciTech Connect

    Hedin, Niklas; DeMartin, Gregory J.; Reyes, Sebastian C.

    2006-03-15

    In studies of gas diffusion in porous solids with nuclear magnetic resonance (NMR) spectroscopy, the sample preparation procedure becomes very important. An apparatus is presented here that pretreats the sample ex situ and accurately sets the desired pressure and temperature within the NMR tube prior to its introduction in the spectrometer. The gas manifold that supplies the NMR tube is also connected to a microbalance containing another portion of the same sample, which is kept at the same temperature as the sample in the NMR tube. This arrangement permits the simultaneous measurement of the adsorption loading on the sample, which is required for the interpretation of the NMR diffusion experiments. Furthermore, to ensure a good seal of the NMR tube, a hybrid valve design composed of titanium, a Teflon® seat, and Kalrez® O-rings is utilized. A computer controlled algorithm ensures the accuracy and reproducibility of all the procedures, enabling the NMR diffusion experiments to be performed at well controlled conditions of pressure, temperature, and amount of gas adsorbed on the porous sample.

  5. An automated gas exchange tank for determining gas transfer velocities in natural seawater samples

    NASA Astrophysics Data System (ADS)

    Schneider-Zapp, K.; Salter, M. E.; Upstill-Goddard, R. C.

    2014-07-01

    In order to advance understanding of the role of seawater surfactants in the air-sea exchange of climatically active trace gases via suppression of the gas transfer velocity (kw), we constructed a fully automated, closed air-water gas exchange tank and coupled analytical system. The system allows water-side turbulence in the tank to be precisely controlled with an electronically operated baffle. Two coupled gas chromatographs and an integral equilibrator, connected to the tank in a continuous gas-tight system, allow temporal changes in the partial pressures of SF6, CH4 and N2O to be measured simultaneously in the tank water and headspace at multiple turbulence settings, during a typical experimental run of 3.25 h. PC software developed by the authors controls all operations and data acquisition, enabling the optimisation of experimental conditions with high reproducibility. The use of three gases allows three independent estimates of kw for each turbulence setting; these values are subsequently normalised to a constant Schmidt number for direct comparison. The normalised kw estimates show close agreement. Repeated experiments with Milli-Q water demonstrate a typical measurement accuracy of 4% for kw. Experiments with natural seawater show that the system clearly resolves the effects on kw of spatial and temporal trends in natural surfactant activity. The system is an effective tool with which to probe the relationships between kw, surfactant activity and biogeochemical indices of primary productivity, and should assist in providing valuable new insights into the air-sea gas exchange process.

  6. An automated gas exchange tank for determining gas transfer velocities in natural seawater samples

    NASA Astrophysics Data System (ADS)

    Schneider-Zapp, K.; Salter, M. E.; Upstill-Goddard, R. C.

    2014-02-01

    In order to advance understanding of the role of seawater surfactants in the air-sea exchange of climatically active trace gases via suppression of the gas transfer velocity (kw), we constructed a fully automated, closed air-water gas exchange tank and coupled analytical system. The system allows water-side turbulence in the tank to be precisely controlled with an electronically operated baffle. Two coupled gas chromatographs and an integral equilibrator, connected to the tank in a continuous gas-tight system, allow temporal changes in the partial pressures of SF6, CH4 and N2O to be measured simultaneously in the tank water and headspace at multiple turbulence settings, during a typical experimental run of 3.25 h. PC software developed by the authors controls all operations and data acquisition, enabling the optimisation of experimental conditions with high reproducibility. The use of three gases allows three independent estimates of kw for each turbulence setting; these values are subsequently normalised to a constant Schmidt number for direct comparison. The normalised kw estimates show close agreement. Repeated experiments with MilliQ water demonstrate a typical measurement accuracy of 4% for kw. Experiments with natural seawater show that the system clearly resolves the effects on kw of spatial and temporal trends in natural surfactant activity. The system is an effective tool with which to probe the relationships between kw, surfactant activity and biogeochemical indices of primary productivity, and should assist in providing valuable new insights into the air-sea gas exchange process.
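
    The Schmidt number normalisation mentioned in both versions of this record is usually written as follows; the reference value of 600 (CO2 in fresh water at 20 °C) and the exponent n ≈ 1/2 for a wavy water surface are the commonly adopted conventions, and the abstract does not state which values the authors used:

      \[
      k_{600} = k_w \left( \frac{600}{Sc} \right)^{n}, \qquad n \approx \tfrac{1}{2}
      \]

    where Sc is the Schmidt number of the gas (SF6, CH4 or N2O) at the water temperature and salinity of the experiment.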

  7. Development of a full automation solid phase microextraction method for investigating the partition coefficient of organic pollutant in complex sample.

    PubMed

    Jiang, Ruifen; Lin, Wei; Wen, Sijia; Zhu, Fang; Luan, Tiangang; Ouyang, Gangfeng

    2015-08-01

    A fully automated solid phase microextraction (SPME) depletion method was developed to study the partition coefficients of organic compounds between complex matrices and water samples. The SPME depletion process was conducted by pre-loading the fiber with a specific amount of organic compounds from a proposed standard gas generation vial, and then desorbing the fiber into the targeted samples. Based on the proposed method, the partition coefficients (Kmatrix) of 4 polyaromatic hydrocarbons (PAHs) between humic acid (HA)/hydroxypropyl-β-cyclodextrin (β-HPCD) and aqueous samples were determined. The results showed that the logKmatrix of 4 PAHs with HA and β-HPCD ranged from 3.19 to 4.08, and 2.45 to 3.15, respectively. In addition, the logKmatrix values decreased about 0.12-0.27 log units for different PAHs for every 10°C increase in temperature. The effect of temperature on the partition coefficient followed the van't Hoff relationship, and the partition coefficient at any temperature can be predicted from the plot. Furthermore, the proposed method was applied to real biological fluid analysis. The partition coefficients of 6 PAHs between the complex matrices in fetal bovine serum and water were determined, and compared to those obtained from the SPME extraction method. The results demonstrated that the proposed method can be applied to determine the sorption coefficients of hydrophobic compounds between complex matrices and water in a variety of samples. PMID:26118804
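
    Written out, the van't Hoff relationship referred to above has the standard form below; a linear fit of ln Kmatrix against 1/T (the van't Hoff plot) gives a slope of −ΔH°/R and an intercept of ΔS°/R, from which Kmatrix can be predicted at any temperature within the studied range (assuming ΔH° and ΔS° are approximately temperature-independent):

      \[
      \ln K_{\text{matrix}} = -\frac{\Delta H^{\circ}}{R}\,\frac{1}{T} + \frac{\Delta S^{\circ}}{R}
      \]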

  8. Detection of motile micro-organisms in biological samples by means of a fully automated image processing system

    NASA Astrophysics Data System (ADS)

    Alanis, Elvio; Romero, Graciela; Alvarez, Liliana; Martinez, Carlos C.; Hoyos, Daniel; Basombrio, Miguel A.

    2001-08-01

    A fully automated image processing system for detection of motile microorganisms in biological samples is presented. The system is specifically calibrated for determining the concentration of Trypanosoma cruzi parasites in blood samples of mice infected with Chagas disease. The method can be adapted for use in other biological samples. A thin layer of blood infected by T. cruzi parasites is examined in a common microscope, in which images of the field of view are taken by a CCD camera and temporarily stored in the computer memory. In a typical field, a few motile parasites are observable surrounded by red blood cells. The parasites have low contrast. Thus, they are difficult to detect visually, but their great motility betrays their presence by the movement of the nearest neighbor red cells. Several consecutive images of the same field are taken, decorrelated with each other where parasites are present, and digitally processed in order to measure the number of parasites present in the field. Several fields are sequentially processed in the same fashion, displacing the sample by means of step motors driven by the computer. A direct advantage of this system is that its results are more reliable and the process is less time consuming than the current subjective evaluations made visually by technicians.
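
    The abstract states the detection principle (motile parasites reveal themselves by de-correlating successive images of an otherwise static field) but not the algorithm; the sketch below is a generic frame-differencing stand-in, with arbitrary threshold and minimum-area values, to illustrate how moving regions in a stack of frames of one field can be counted.

      import numpy as np
      from scipy import ndimage

      def count_motile_regions(frames, diff_thresh=25, min_area=30):
          # frames: sequence of 2-D grayscale images of the same microscope field
          stack = np.asarray(frames, dtype=np.int16)
          motion = np.abs(np.diff(stack, axis=0)).sum(axis=0)   # static cells largely cancel out
          mask = motion > diff_thresh * (len(frames) - 1)       # pixels with sustained change
          labels, n = ndimage.label(mask)                       # connected moving regions
          sizes = ndimage.sum(mask, labels, range(1, n + 1))
          return int(np.count_nonzero(np.asarray(sizes) >= min_area))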

  9. Automated total and radioactive strontium separation and preconcentration in samples of environmental interest exploiting a lab-on-valve system.

    PubMed

    Rodríguez, Rogelio; Avivar, Jessica; Ferrer, Laura; Leal, Luz O; Cerdà, Victor

    2012-07-15

    A novel lab-on-valve system has been developed for strontium determination in environmental samples. The miniaturized lab-on-valve system can accommodate a wide range of chemical and physical processes, including fluidic and microcarrier bead control, homogeneous reaction and liquid-solid interaction. A rapid, inexpensive and fully automated method for the separation and preconcentration of total and radioactive strontium, using a solid phase extraction material (Sr-Resin), has been developed. Total strontium concentrations are determined by ICP-OES and (90)Sr activities by a low background proportional counter. The method has been successfully applied to different water samples of environmental interest. The proposed system offers minimization of sample handling, drastic reduction of reagent volume, and improvement of reproducibility and sample throughput, and attains a significant decrease of both time and cost per analysis. The LLD for total Sr is 1.8 ng and the minimum detectable activity for (90)Sr is 0.008 Bq. The repeatability of the separation procedure is 1.2% (n=10). PMID:22817934

  10. Automated Broad-Range Molecular Detection of Bacteria in Clinical Samples.

    PubMed

    Budding, Andries E; Hoogewerf, Martine; Vandenbroucke-Grauls, Christina M J E; Savelkoul, Paul H M

    2016-04-01

    Molecular detection methods, such as quantitative PCR (qPCR), have found their way into clinical microbiology laboratories for the detection of an array of pathogens. Most routinely used methods, however, are directed at specific species. Thus, anything that is not explicitly searched for will be missed. This greatly limits the flexibility and universal application of these techniques. We investigated the application of a rapid universal bacterial molecular identification method, IS-pro, to routine patient samples received in a clinical microbiology laboratory. IS-pro is a eubacterial technique based on the detection and categorization of 16S-23S rRNA gene interspace regions with lengths that are specific for each microbial species. As this is an open technique, clinicians do not need to decide in advance what to look for. We compared routine culture to IS-pro using 66 samples sent in for routine bacterial diagnostic testing. The samples were obtained from patients with infections in normally sterile sites (without a resident microbiota). The results were identical in 20 (30%) samples, IS-pro detected more bacterial species than culture in 31 (47%) samples, and five of the 10 culture-negative samples were positive with IS-pro. The case histories of the five patients from whom these culture-negative/IS-pro-positive samples were obtained suggest that the IS-pro findings are highly clinically relevant. Our findings indicate that an open molecular approach, such as IS-pro, may have a high added value for clinical practice. PMID:26763956

  11. Rolling Deck to Repository (R2R): Automated Magnetic and Gravity Quality Assessment and Data Reduction

    NASA Astrophysics Data System (ADS)

    Morton, J. J.; O'Hara, S.; Ferrini, V.; Arko, R. A.

    2010-12-01

    With its global capability and diverse array of sensors, the academic research fleet is an integral component of ocean exploration. The Rolling Deck to Repository (R2R) Program provides a central shore-side gateway for underway data from the U.S. academic research fleet. In addition to ensuring preservation and documentation of routine underway data, R2R is also developing automated quality assessment (QA) tools for a variety of underway data types. Routine post-cruise QA will enable prompt feedback to shipboard operators and provide the science community with sufficient background information for data analysis. Based on community feedback, R2R will perform data reduction to generate enhanced data products for select data types including gravity and magnetics. In the development of these tools, R2R seeks input from the scientific community, engaging specialists for each data type and requesting feedback from operators and scientists to deliver the most relevant and useful metadata. Development of data acquisition best practices that are being assembled within the community for some data types will also be important components of R2R QA development. Protocols for gravity and magnetics QA will include the development of guidelines for minimal and optimal metadata for each data type that will enable data reduction and optimize data re-use. Metadata including instrument specifications, navigational offsets, and calibration information will be important inputs for both data reduction and QA. Data reduction will include merging these geophysical data types with high-quality R2R-generated navigation data products, cleaning the data and applying instrument corrections. Automated routines that are being developed will then be used to assess data quality, ultimately producing a Quality Assessment Certificate (QAC) that will provide the science community with quality information in an easily accessible and understandable format. We present progress to date and invite

  12. Low-Cost 3D Printers Enable High-Quality and Automated Sample Preparation and Molecular Detection

    PubMed Central

    Chan, Kamfai; Coen, Mauricio; Hardick, Justin; Gaydos, Charlotte A.; Wong, Kah-Yat; Smith, Clayton; Wilson, Scott A.; Vayugundla, Siva Praneeth; Wong, Season

    2016-01-01

    Most molecular diagnostic assays require upfront sample preparation steps to isolate the target’s nucleic acids, followed by its amplification and detection using various nucleic acid amplification techniques. Because molecular diagnostic methods are generally rather difficult to perform manually without highly trained users, automated and integrated systems are highly desirable but too costly for use at point-of-care or low-resource settings. Here, we showcase the development of a low-cost and rapid nucleic acid isolation and amplification platform by modifying entry-level 3D printers that cost between $400 and $750. Our modifications consisted of replacing the extruder with a tip-comb attachment that houses magnets to conduct magnetic particle-based nucleic acid extraction. We then programmed the 3D printer to conduct motions that can perform high-quality extraction protocols. Up to 12 samples can be processed simultaneously in under 13 minutes and the efficiency of nucleic acid isolation matches well against gold-standard spin-column-based extraction technology. Additionally, we used the 3D printer’s heated bed to supply heat to perform water bath-based polymerase chain reactions (PCRs). Using another attachment to hold PCR tubes, the 3D printer was programmed to automate the process of shuttling PCR tubes between water baths. By eliminating the temperature ramping needed in most commercial thermal cyclers, the run time of a 35-cycle PCR protocol was shortened by 33%. This article demonstrates that for applications in resource-limited settings, expensive nucleic acid extraction devices and thermal cyclers that are used in many central laboratories can be potentially replaced by a device modified from inexpensive entry-level 3D printers. PMID:27362424

  13. Low-Cost 3D Printers Enable High-Quality and Automated Sample Preparation and Molecular Detection.

    PubMed

    Chan, Kamfai; Coen, Mauricio; Hardick, Justin; Gaydos, Charlotte A; Wong, Kah-Yat; Smith, Clayton; Wilson, Scott A; Vayugundla, Siva Praneeth; Wong, Season

    2016-01-01

    Most molecular diagnostic assays require upfront sample preparation steps to isolate the target's nucleic acids, followed by its amplification and detection using various nucleic acid amplification techniques. Because molecular diagnostic methods are generally rather difficult to perform manually without highly trained users, automated and integrated systems are highly desirable but too costly for use at point-of-care or low-resource settings. Here, we showcase the development of a low-cost and rapid nucleic acid isolation and amplification platform by modifying entry-level 3D printers that cost between $400 and $750. Our modifications consisted of replacing the extruder with a tip-comb attachment that houses magnets to conduct magnetic particle-based nucleic acid extraction. We then programmed the 3D printer to conduct motions that can perform high-quality extraction protocols. Up to 12 samples can be processed simultaneously in under 13 minutes and the efficiency of nucleic acid isolation matches well against gold-standard spin-column-based extraction technology. Additionally, we used the 3D printer's heated bed to supply heat to perform water bath-based polymerase chain reactions (PCRs). Using another attachment to hold PCR tubes, the 3D printer was programmed to automate the process of shuttling PCR tubes between water baths. By eliminating the temperature ramping needed in most commercial thermal cyclers, the run time of a 35-cycle PCR protocol was shortened by 33%. This article demonstrates that for applications in resource-limited settings, expensive nucleic acid extraction devices and thermal cyclers that are used in many central laboratories can be potentially replaced by a device modified from inexpensive entry-level 3D printers. PMID:27362424
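
    Neither copy of this record includes the motion scripts; purely as a hypothetical sketch of how a host computer could command the repurposed printer to shuttle PCR tubes between two water baths, the snippet below streams Marlin-style G-code over a serial link with pyserial. The port name, axis positions, feed rate, dwell times, and the omission of acknowledgement handling are all placeholder assumptions, not details from the article.

      import time
      import serial  # pyserial

      DENATURE_X = 40        # mm, hypothetical position over the ~95 C bath
      ANNEAL_EXTEND_X = 160  # mm, hypothetical position over the ~60 C bath

      def run_pcr_shuttle(port="/dev/ttyUSB0", cycles=35):
          with serial.Serial(port, 115200, timeout=2) as printer:
              time.sleep(2)                      # many controllers reset on connect
              printer.write(b"G28 X\n")          # home the X axis
              for _ in range(cycles):
                  printer.write(f"G1 X{DENATURE_X} F3000\n".encode())
                  time.sleep(15)                 # denaturation hold
                  printer.write(f"G1 X{ANNEAL_EXTEND_X} F3000\n".encode())
                  time.sleep(45)                 # annealing/extension hold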

  14. Automation of preparation of nonmetallic samples for analysis by atomic absorption and inductively coupled plasma spectrometry

    NASA Technical Reports Server (NTRS)

    Wittmann, A.; Willay, G.

    1986-01-01

    For a rapid preparation of solutions intended for analysis by inductively coupled plasma emission spectrometry or atomic absorption spectrometry, an automatic device called Plasmasol was developed. This apparatus uses the nonwettability of glassy carbon to fuse the sample in an appropriate flux. The sample-flux mixture is placed in a composite crucible, then heated at high temperature, swirled until full dissolution is achieved, and then poured into a water-filled beaker. After acid addition, dissolution of the melt, and filling to the mark, the solution is ready for analysis. The analytical results obtained, either for oxide samples or for prereduced iron ores, show that the solutions prepared with this device are indistinguishable from those obtained by manual dissolutions done by acid digestion or by high temperature fusion. Preparation reproducibility and analytical tests illustrate the performance of Plasmasol.

  15. Automated sample preparation for ICP analysis of active pharmaceutical ingredients and intermediates.

    PubMed

    Sims, Jonathan; Smith, Andrew; Patel, Dharmista; Batchelor, Richard; Carreira, Judith

    2011-10-01

    Routine testing of active pharmaceutical ingredients (APIs) for metal residues is an expectation of regulatory bodies such as the FDA (U.S. Food and Drug Administration). Sample preparation techniques are the rate-limiting step in the testing process and can be variable depending on the specific characteristics of the API under test. Simplification and standardization of the routine preparation of inductively coupled plasma spectroscopy sample solutions of organic compounds have been developed using a commercially available robotic workstation. Contamination from the metal components of the instrument and from sample tubes used in the methodology has been studied using a Design of Experiments approach. The optimized method described can be used for the measurement of trace metals in pharmaceuticals at levels compliant with European and U.S. regulatory requirements. PMID:21906564

  16. Automated Lung Segmentation and Image Quality Assessment for Clinical 3-D/4-D-Computed Tomography

    PubMed Central

    Li, Guang

    2014-01-01

    4-D-computed tomography (4DCT) provides not only a new dimension of patient-specific information for radiation therapy planning and treatment, but also a challenging scale of data volume to process and analyze. Manual analysis using existing 3-D tools is unable to keep up with the vastly increased 4-D data volume; automated processing and analysis are thus needed to process 4DCT data effectively and efficiently. In this paper, we applied ideas and algorithms from image/signal processing, computer vision, and machine learning to 4DCT lung data so that lungs can be reliably segmented in a fully automated manner, lung features can be visualized and measured on the fly via user interactions, and data quality classifications can be computed in a robust manner. Comparisons of our results with an established treatment planning system and calculation by experts demonstrated negligible discrepancies (within ±2%) for volume assessment but one to two orders of magnitude performance enhancement. An empirical Fourier-analysis-based quality measure delivered performance closely emulating that of human experts. Three machine learners are inspected to justify the viability of machine learning techniques used to robustly identify data quality of 4DCT images in a scalable manner. The resultant system provides a toolkit that speeds up 4-D tasks in the clinic and facilitates clinical research to improve current clinical practice. PMID:25621194

  17. Construction and calibration of a low cost and fully automated vibrating sample magnetometer

    NASA Astrophysics Data System (ADS)

    El-Alaily, T. M.; El-Nimr, M. K.; Saafan, S. A.; Kamel, M. M.; Meaz, T. M.; Assar, S. T.

    2015-07-01

    A low cost vibrating sample magnetometer (VSM) has been constructed by using an electromagnet and an audio loudspeaker, both controlled by a data acquisition device. The constructed VSM records the magnetic hysteresis loop up to 8.3 kG at room temperature. The apparatus has been calibrated and tested using magnetic hysteresis data of some ferrite samples measured by two calibrated commercial magnetometers (a Lake Shore 7410 and an LDJ Electronics Inc., Troy, MI, instrument). Our lab-built VSM design proved successful and reliable.

  18. AUTOMATED SYSTEM FOR COLLECTING MULTIPLE, SEQUENTIAL SAMPLES FROM SOIL WATER SAMPLERS UNDER CONTINUOUS VACUUM

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Manually collecting a series of sequential, discrete water samples from soil water percolation samplers, or similar devices, that withdraw water from unsaturated porous media under continuous vacuum is a logistical challenge, though the resulting collection can provide valuable information on the dy...

  19. A self-contained polymeric cartridge for automated biological sample preparation.

    PubMed

    Xu, Guolin; Lee, Daniel Yoke San; Xie, Hong; Chiew, Deon; Hsieh, Tseng-Ming; Ali, Emril Mohamed; Lun Looi, Xing; Li, Mo-Huang; Ying, Jackie Y

    2011-09-01

    Sample preparation is one of the most crucial processes for nucleic acid-based disease diagnosis. Several steps are required for nucleic acid extraction, impurity washes, and DNA/RNA elution. Careful sample preparation is vital to obtaining a reliable diagnosis, especially with low copies of pathogens and cells. This paper describes a low-cost, disposable lab cartridge for automatic sample preparation, which is capable of handling flexible sample volumes of 10 μl to 1 ml. This plastic cartridge contains all the necessary reagents for pathogen and cell lysis, DNA/RNA extraction, impurity washes, DNA/RNA elution and waste processing in a completely sealed cartridge. The entire sample preparation process is automatically conducted within the cartridge on a desktop unit using a pneumatic fluid manipulation approach. Reagent transportation is achieved with a combination of push and pull forces (with compressed air and vacuum, respectively), which are connected to the pneumatic inlets at the bottom of the cartridge. These pneumatic forces are regulated by a pinch valve manifold and two pneumatic syringe pumps within the desktop unit. The performance of this pneumatic reagent delivery method was examined. We have demonstrated the capability of on-cartridge RNA extraction and cancer-specific gene amplification from 10 copies of MCF-7 breast cancer cells. The on-cartridge DNA recovery efficiency was 54-63%, which was comparable to or better than the conventional manual approach using a silica spin column. The lab cartridge would be suitable for integration with lab-chip real-time polymerase chain reaction devices in providing a portable system for decentralized disease diagnosis. PMID:22662036

  20. Development of an Automated and Sensitive Microfluidic Device for Capturing and Characterizing Circulating Tumor Cells (CTCs) from Clinical Blood Samples

    PubMed Central

    Gogoi, Priya; Sepehri, Saedeh; Zhou, Yi; Gorin, Michael A.; Paolillo, Carmela; Capoluongo, Ettore; Gleason, Kyle; Payne, Austin; Boniface, Brian; Cristofanilli, Massimo; Morgan, Todd M.; Fortina, Paolo; Pienta, Kenneth J.; Handique, Kalyan; Wang, Yixin

    2016-01-01

    Current analysis of circulating tumor cells (CTCs) is hindered by sub-optimal sensitivity and specificity of devices or assays as well as lack of capability of characterization of CTCs with clinical biomarkers. Here, we validate a novel technology to enrich and characterize CTCs from blood samples of patients with metastatic breast, prostate and colorectal cancers using a microfluidic chip which is processed by using an automated staining and scanning system from sample preparation to image processing. The Celsee system allowed for the detection of CTCs with apparent high sensitivity and specificity (94% sensitivity and 100% specificity). Moreover, the system facilitated rapid capture of CTCs from blood samples and also allowed for downstream characterization of the captured cells by immunohistochemistry, DNA and mRNA fluorescence in-situ hybridization (FISH). In a subset of patients with prostate cancer we compared the technology with a FDA-approved CTC device, CellSearch and found a higher degree of sensitivity with the Celsee instrument. In conclusion, the integrated Celsee system represents a promising CTC technology for enumeration and molecular characterization. PMID:26808060

  1. Development of an Automated and Sensitive Microfluidic Device for Capturing and Characterizing Circulating Tumor Cells (CTCs) from Clinical Blood Samples.

    PubMed

    Gogoi, Priya; Sepehri, Saedeh; Zhou, Yi; Gorin, Michael A; Paolillo, Carmela; Capoluongo, Ettore; Gleason, Kyle; Payne, Austin; Boniface, Brian; Cristofanilli, Massimo; Morgan, Todd M; Fortina, Paolo; Pienta, Kenneth J; Handique, Kalyan; Wang, Yixin

    2016-01-01

    Current analysis of circulating tumor cells (CTCs) is hindered by sub-optimal sensitivity and specificity of devices or assays as well as lack of capability of characterization of CTCs with clinical biomarkers. Here, we validate a novel technology to enrich and characterize CTCs from blood samples of patients with metastatic breast, prostate and colorectal cancers using a microfluidic chip which is processed by using an automated staining and scanning system from sample preparation to image processing. The Celsee system allowed for the detection of CTCs with apparent high sensitivity and specificity (94% sensitivity and 100% specificity). Moreover, the system facilitated rapid capture of CTCs from blood samples and also allowed for downstream characterization of the captured cells by immunohistochemistry, DNA and mRNA fluorescence in-situ hybridization (FISH). In a subset of patients with prostate cancer we compared the technology with a FDA-approved CTC device, CellSearch and found a higher degree of sensitivity with the Celsee instrument. In conclusion, the integrated Celsee system represents a promising CTC technology for enumeration and molecular characterization. PMID:26808060

  2. Automated flow-through amperometric immunosensor for highly sensitive and on-line detection of okadaic acid in mussel sample.

    PubMed

    Dominguez, Rocio B; Hayat, Akhtar; Sassolas, Audrey; Alonso, Gustavo A; Munoz, Roberto; Marty, Jean-Louis

    2012-09-15

    An electrochemical immunosensor for okadaic acid (OA) detection has been developed and used in an indirect competitive immunoassay format under automated flow conditions. The biosensor was fabricated by injecting OA-modified magnetic beads onto a screen-printed carbon electrode (SPCE) in the flow system. The OA present in the sample competed with the immobilized OA to bind with the anti-okadaic acid monoclonal antibody (anti-OA-MAb). A secondary alkaline phosphatase labeled antibody was used to perform electrochemical detection. The current response obtained from the labeled alkaline phosphatase to 1-naphthyl phosphate decreased proportionally to the concentration of free OA in the sample. The calculated limit of detection (LOD) was 0.15 μg/L with a linear range of 0.19-25 μg/L. The good recovery percentages validated the immunosensor's application to real mussel samples. The developed system automatically controlled the incubation, washing and current measurement steps, showing its potential use for OA determination in field analysis. PMID:22967546

  3. Automated Geospatial Watershed Assessment Tool (AGWA): Applications for Assessing the Impact of Urban Growth and the use of Low Impact Development Practices.

    EPA Science Inventory

    New tools and functionality have been incorporated into the Automated Geospatial Watershed Assessment Tool (AGWA) to assess the impact of urban growth and evaluate the effects of low impact development (LID) practices. AGWA (see: www.tucson.ars.ag.gov/agwa or http://www.epa.gov...

  4. Automated on-line preconcentration of palladium on different sorbents and its determination in environmental samples.

    PubMed

    Sánchez Rojas, Fuensanta; Bosch Ojeda, Catalina; Cano Pavón, José Manuel

    2007-01-01

    The determination of noble metals in environmental samples is of increasing importance. Palladium is often employed as a catalyst in the chemical industry and is also used with platinum and rhodium in motor car catalytic converters, which might cause environmental pollution problems. Two different sorbents for palladium preconcentration in different samples were investigated: silica gel functionalized with 1,5-bis(di-2-pyridyl)methylene thiocarbohydrazide (DPTH-gel) and 1,5-bis(2-pyridyl)-3-sulphophenyl methylene thiocarbonohydrazide (PSTH) immobilised on an anion-exchange resin (Dowex 1x8-200). The sorbents were tested in a micro-column, placed in the auto-sampler arm, at a flow rate of 2.8 mL min(-1). Elution was performed with 4 M HCl and 4 M HNO3, respectively. Satisfactory results were obtained for both sorbents. PMID:17822233

  5. Automated microextraction sample preparation coupled on-line to FT-ICR-MS: application to desalting and concentration of river and marine dissolved organic matter.

    PubMed

    Morales-Cid, Gabriel; Gebefugi, Istvan; Kanawati, Basem; Harir, Mourad; Hertkorn, Norbert; Rosselló-Mora, Ramón; Schmitt-Kopplin, Philippe

    2009-10-01

    Sample preparation procedures are in most cases sample- and time-consuming and commonly require the use of a large amount of solvents. Automation in this regard can minimize the required injection volume, and solvent consumption is efficiently reduced. A new fully automated sample desalting and pre-concentration technique employing microextraction by packed sorbents (MEPS) cartridges is implemented and coupled to an ion cyclotron resonance Fourier-transform mass spectrometer (ICR-FT/MS). The performance of non-target mass spectrometric analysis is compared for the automated versus off-line sample preparation for several samples of aqueous natural organic matter. This approach can be generalized for any metabolite profiling or metabolome analysis of biological materials but was optimized herein using well-characterized but highly complex organic mixtures: a surface water with its well-characterized natural organic matter, and a marine sample with a high salt load, enabling validation of the presented automatic system for salty samples. The analysis of Suwannee River water showed selective C18-MEPS enrichment of chemical signatures with average H/C and O/C elemental ratios and loss of both highly polar and highly aromatic structures from the original sample. Automated on-line application to marine samples showed desalting and different chemical signatures from surface to bottom water. Relative comparison of structural footprints with the C18 concentration/desalting procedure, however, demonstrated that the surface water film was more concentrated in surface-active components of natural (fatty acids) and anthropogenic origin (sulfur-containing surfactants). Overall, the relative standard deviation distribution in terms of peak intensity was improved by automating the proposed on-line method. PMID:19685041

  6. Automated cell viability assessment using a microfluidics based portable imaging flow analyzer.

    PubMed

    Jagannadh, Veerendra Kalyan; Adhikari, Jayesh Vasudeva; Gorthi, Sai Siva

    2015-03-01

    In this work, we report a system-level integration of portable microscopy and microfluidics for the realization of an optofluidic imaging flow analyzer with a throughput of 450 cells/s. With the use of a cellphone augmented with off-the-shelf optical components and custom designed microfluidics, we demonstrate a portable optofluidic imaging flow analyzer. A multiple microfluidic channel geometry was employed to demonstrate the enhancement of throughput in the context of low frame-rate imaging systems. Using the cell-phone based digital imaging flow analyzer, we have imaged yeast cells present in a suspension. By digitally processing the recorded videos of the flow stream on the cellphone, we demonstrated an automated cell viability assessment of the yeast cell population. In addition, we also demonstrate the suitability of the system for blood cell counting. PMID:26015835

  7. An Automated Method for Navigation Assessment for Earth Survey Sensors Using Island Targets

    NASA Technical Reports Server (NTRS)

    Patt, F. S.; Woodward, R. H.; Gregg, W. W.

    1997-01-01

    An automated method has been developed for performing navigation assessment on satellite-based Earth sensor data. The method utilizes islands as targets which can be readily located in the sensor data and identified with reference locations. The essential elements are an algorithm for classifying the sensor data according to source, a reference catalogue of island locations, and a robust pattern-matching algorithm for island identification. The algorithms were developed and tested for the Sea-viewing Wide Field-of-view Sensor (SeaWiFS), an ocean colour sensor. This method will allow navigation error statistics to be automatically generated for large numbers of points, supporting analysis over large spatial and temporal ranges.

  8. Automated cell viability assessment using a microfluidics based portable imaging flow analyzer

    PubMed Central

    Jagannadh, Veerendra Kalyan; Adhikari, Jayesh Vasudeva; Gorthi, Sai Siva

    2015-01-01

    In this work, we report a system-level integration of portable microscopy and microfluidics for the realization of an optofluidic imaging flow analyzer with a throughput of 450 cells/s. With the use of a cellphone augmented with off-the-shelf optical components and custom designed microfluidics, we demonstrate a portable optofluidic imaging flow analyzer. A multiple microfluidic channel geometry was employed to demonstrate the enhancement of throughput in the context of low frame-rate imaging systems. Using the cell-phone based digital imaging flow analyzer, we have imaged yeast cells present in a suspension. By digitally processing the recorded videos of the flow stream on the cellphone, we demonstrated an automated cell viability assessment of the yeast cell population. In addition, we also demonstrate the suitability of the system for blood cell counting. PMID:26015835

  9. Quantitative Assessment of Mouse Mammary Gland Morphology Using Automated Digital Image Processing and TEB Detection.

    PubMed

    Blacher, Silvia; Gérard, Céline; Gallez, Anne; Foidart, Jean-Michel; Noël, Agnès; Péqueux, Christel

    2016-04-01

    The assessment of rodent mammary gland morphology is widely used to study the molecular mechanisms driving breast development and to analyze the impact of various endocrine disruptors with putative pathological implications. In this work, we propose a methodology relying on fully automated digital image analysis methods, including image processing and quantification of the whole ductal tree as well as the terminal end buds. It allows both growth parameters and fine morphological glandular structures to be measured accurately and objectively. Mammary gland elongation was characterized by 2 parameters: the length and the epithelial area of the ductal tree. Ductal tree fine structures were characterized by: 1) branch end-point density, 2) branching density, and 3) branch length distribution. The proposed methodology was compared with quantification methods classically used in the literature. This procedure can be implemented in several software packages and thus widely used by scientists studying rodent mammary gland morphology. PMID:26910307

  10. Steady-State Vacuum Ultraviolet Exposure Facility With Automated Lamp Calibration and Sample Positioning Fabricated

    NASA Technical Reports Server (NTRS)

    Sechkar, Edward A.; Steuber, Thomas J.; Banks, Bruce A.; Dever, Joyce A.

    2000-01-01

    The Next Generation Space Telescope (NGST) will be placed in an orbit that will subject it to constant solar radiation during its planned 10-year mission. A sunshield will be necessary to passively cool the telescope, protecting it from the Sun's energy and assuring proper operating temperatures for the telescope's instruments. This sunshield will be composed of metalized polymer multilayer insulation with an outer polymer membrane (12 to 25 mm in thickness) that will be metalized on the back to assure maximum reflectance of sunlight. The sunshield must maintain mechanical integrity and optical properties for the full 10 years. This durability requirement is most challenging for the outermost, constantly solar-facing polymer membrane of the sunshield. One of the potential threats to the membrane material's durability is from vacuum ultraviolet (VUV) radiation in wavelengths below 200 nm. Such radiation can be absorbed in the bulk of these thin polymer membrane materials and degrade the polymer's optical and mechanical properties. So that a suitable membrane material can be selected that demonstrates durability to solar VUV radiation, ground-based testing of candidate materials must be conducted to simulate the total 10-year VUV exposure expected during the Next Generation Space Telescope mission. The Steady State Vacuum Ultraviolet exposure facility was designed and fabricated at the NASA Glenn Research Center at Lewis Field to provide unattended 24-hr exposure of candidate materials to VUV radiation of 3 to 5 times the Sun's intensity in the wavelength range of 115 to 200 nm. The facility's chamber, which maintains a pressure of approximately 5×10^-6 torr, is divided into three individual exposure cells, each with a separate VUV source and sample-positioning mechanism. The three test cells are separated by a water-cooled copper shield plate assembly to minimize thermal effects from adjacent test cells. Part of the interior sample positioning mechanism of one

  11. Versatile sample environments and automation for biological solution X-ray scattering experiments at the P12 beamline (PETRA III, DESY)

    PubMed Central

    Blanchet, Clement E.; Spilotros, Alessandro; Schwemmer, Frank; Graewert, Melissa A.; Kikhney, Alexey; Jeffries, Cy M.; Franke, Daniel; Mark, Daniel; Zengerle, Roland; Cipriani, Florent; Fiedler, Stefan; Roessle, Manfred; Svergun, Dmitri I.

    2015-01-01

    A high-brilliance synchrotron P12 beamline of the EMBL located at the PETRA III storage ring (DESY, Hamburg) is dedicated to biological small-angle X-ray scattering (SAXS) and has been designed and optimized for scattering experiments on macromolecular solutions. Scatterless slits reduce the parasitic scattering, a custom-designed miniature active beamstop ensures accurate data normalization and the photon-counting PILATUS 2M detector enables the background-free detection of weak scattering signals. The high flux and small beam size allow for rapid experiments with exposure time down to 30–50 ms covering the resolution range from about 300 to 0.5 nm. P12 possesses a versatile and flexible sample environment system that caters for the diverse experimental needs required to study macromolecular solutions. These include an in-vacuum capillary mode for standard batch sample analyses with robotic sample delivery and for continuous-flow in-line sample purification and characterization, as well as an in-air capillary time-resolved stopped-flow setup. A novel microfluidic centrifugal mixing device (SAXS disc) is developed for a high-throughput screening mode using sub-microlitre sample volumes. Automation is a key feature of P12; it is controlled by a beamline meta server, which coordinates and schedules experiments from either standard or nonstandard operational setups. The integrated SASFLOW pipeline automatically checks for consistency, and processes and analyses the data, providing near real-time assessments of overall parameters and the generation of low-resolution models within minutes of data collection. These advances, combined with a remote access option, allow for rapid high-throughput analysis, as well as time-resolved and screening experiments for novice and expert biological SAXS users. PMID:25844078

  12. Improved automation of dissolved organic carbon sampling for organic-rich surface waters.

    PubMed

    Grayson, Richard P; Holden, Joseph

    2016-02-01

    In-situ UV-Vis spectrophotometers offer the potential for improved estimates of dissolved organic carbon (DOC) fluxes for organic-rich systems such as peatlands because they are able to sample and log DOC proxies automatically through time at low cost. In turn, this could enable improved total carbon budget estimates for peatlands. The ability of such instruments to accurately measure DOC depends on a number of factors, not least of which is how absorbance measurements relate to DOC and the environmental conditions. Here we test the ability of a S::can Spectro::lyser™ for measuring DOC in peatland streams with routinely high DOC concentrations. Through analysis of the spectral response data collected by the instrument we have been able to accurately measure DOC up to 66 mg L(-1), which is more than double the original upper calibration limit for this particular instrument. A linear regression modelling approach resulted in an accuracy >95%. The greatest accuracy was achieved when absorbance values for several different wavelengths were used at the same time in the model. However, an accuracy >90% was achieved using absorbance values for a single wavelength to predict DOC concentration. Our calculations indicated that, for organic-rich systems, in-situ measurement with a scanning spectrophotometer can improve fluvial DOC flux estimates by 6 to 8% compared with traditional sampling methods. Thus, our techniques pave the way for improved long-term carbon budget calculations from organic-rich systems such as peatlands. PMID:26580726
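
    To make the calibration step concrete, the following minimal sketch (not the authors' code; the wavelengths, absorbance values and DOC concentrations are invented for illustration) shows how DOC can be regressed on absorbance recorded at several wavelengths at once, the approach that gave the highest accuracy above.

```python
# Hypothetical calibration of an in-situ UV-Vis spectrophotometer against
# laboratory DOC values using a multi-wavelength linear model.
import numpy as np
from sklearn.linear_model import LinearRegression

# rows = water samples, columns = absorbance at e.g. 254, 300 and 350 nm (assumed)
absorbance = np.array([[0.40, 0.25, 0.15],
                       [0.85, 0.55, 0.32],
                       [1.30, 0.82, 0.50],
                       [1.90, 1.20, 0.74],
                       [2.60, 1.65, 1.02]])
doc_lab = np.array([10.0, 22.0, 34.0, 49.0, 66.0])   # mg/L from laboratory analysis

model = LinearRegression().fit(absorbance, doc_lab)
print("calibration R^2:", round(model.score(absorbance, doc_lab), 3))
print("predicted DOC (mg/L):", model.predict([[1.10, 0.70, 0.43]]))
```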

  13. Highly Sensitive Automated Method for DNA Damage Assessment: Gamma-H2AX Foci Counting and Cell Cycle Sorting

    PubMed Central

    Hernández, Laia; Terradas, Mariona; Martín, Marta; Tusell, Laura; Genescà, Anna

    2013-01-01

    Phosphorylation of the H2AX protein is an early step in the double strand break (DSB) repair pathway; therefore, phosphorylated histone (γH2AX) foci scoring is widely used as a measure for DSBs. Foci scoring is performed either manually or semi-automatically using hand-operated capturing and image analysis software. In general, both techniques are laborious and prone to artifacts associated with manual scoring. While a few fully automated methods have been described in the literature, none of them have been used to quantify γH2AX foci in combination with a cell cycle phase analysis. Adding this feature to a rapid automated γH2AX foci quantification method would reduce the scoring uncertainty that arises from the variations in the background level of the γH2AX signal throughout the cell cycle. The method was set up to measure DNA damage induced in human mammary epithelial cells by irradiation under a mammogram device. We adapted a FISH (fluorescent in situ hybridization) Spot-counting system, which has a slide loader with automatic scanning and cell capture system throughout the thickness of each cell (z-stack), to meet our assay requirements. While scanning the sample, the system classifies the selected nuclei according to the signal patterns previously described by the user. For our purposes, a double staining immunofluorescence was carried out with antibodies to detect γH2AX and pericentrin, an integral component of the centrosome. We could thus distinguish both the number of γH2AX foci per cell and the cell cycle phase. Furthermore, restrictive settings of the program classifier reduced the “touching nuclei” problem described in other image analysis software. The automated scoring was faster than and as sensitive as its manually performed counterpart. This system is a reliable tool for γH2AX radio-induced foci counting and provides essential information about the cell cycle stage. It thus offers a more complete and rapid assessment of DNA damage. PMID
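
    As a schematic illustration of the per-nucleus counting step (an assumption-laden sketch, not the commercial Spot-counting software), foci can be counted by thresholding the γH2AX channel within each segmented nucleus and labelling the connected spots:

```python
# Count gamma-H2AX foci inside one nucleus mask; the threshold, image and
# mask below are illustrative placeholders.
import numpy as np
from scipy import ndimage

def count_foci(foci_channel, nucleus_mask, threshold):
    spots = (foci_channel > threshold) & nucleus_mask
    _, n_foci = ndimage.label(spots)          # connected components = foci
    return n_foci

img = np.zeros((64, 64))
img[10:12, 10:12] = 5.0                       # two synthetic foci
img[30:32, 40:42] = 6.0
print(count_foci(img, np.ones((64, 64), bool), threshold=1.0))   # -> 2
```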

  14. Evaluation of the appropriate time period between sampling and analyzing for automated urinalysis

    PubMed Central

    Dolscheid-Pommerich, Ramona C.; Klarmann-Schulz, Ute; Conrad, Rupert; Stoffel-Wagner, Birgit; Zur, Berndt

    2016-01-01

    Introduction: Preanalytical specifications for urinalysis must be strictly adhered to in order to avoid false interpretations. The aim of the present study is to examine whether the preanalytical factor ‘time point of analysis’ significantly influences the stability of urine samples for urine particle and dipstick analysis. Materials and methods: In 321 pathological spontaneous urine samples, urine dipstick (Urisys™2400, Combur-10-Test™strips, Roche Diagnostics, Mannheim, Germany) and particle analysis (UF-1000 i™, Sysmex, Norderstedt, Germany) were performed within 90 min, 120 min and 240 min after urine collection. Results: For urine particle analysis, a significant increase in conductivity (120 vs. 90 min: P < 0.001, 240 vs. 90 min: P < 0.001) and a significant decrease in WBC (120 vs. 90 min: P < 0.001, 240 vs. 90 min: P < 0.001), RBC (120 vs. 90 min: P < 0.001, 240 vs. 90 min: P < 0.001), casts (120 vs. 90 min: P < 0.001, 240 vs. 90 min: P < 0.001) and epithelial cells (120 vs. 90 min: P = 0.610, 240 vs. 90 min: P = 0.041) were found. There were no significant changes for bacteria. Regarding urine dipstick analysis, misclassification rates between measurements were significant for pH (120 vs. 90 min: P < 0.001, 240 vs. 90 min: P < 0.001), leukocytes (120 vs. 90 min: P < 0.001, 240 vs. 90 min: P < 0.001), nitrite (120 vs. 90 min: P < 0.001, 240 vs. 90 min: P < 0.001), protein (120 vs. 90 min: P < 0.001, 240 vs. 90 min: P < 0.001), ketone (120 vs. 90 min: P < 0.001, 240 vs. 90 min: P < 0.001), blood (120 vs. 90 min: P < 0.001, 240 vs. 90 min: P < 0.001), specific gravity (120 vs. 90 min: P < 0.001, 240 vs. 90 min: P < 0.001) and urobilinogen (120 vs. 90 min: P = 0.031). Misclassification rates were not significant for glucose and bilirubin. Conclusion: Most parameters critically depend on the time window between sampling and analysis. Our study stresses the importance of adherence to early time points in urinalysis (within 90 min). PMID:26981022

  15. Enabling Wide-Scale Computer Science Education through Improved Automated Assessment Tools

    NASA Astrophysics Data System (ADS)

    Boe, Bryce A.

    There is a proliferating demand for newly trained computer scientists as the number of computer science related jobs continues to increase. University programs will only be able to train enough new computer scientists to meet this demand when two things happen: when there are more primary and secondary school students interested in computer science, and when university departments have the resources to handle the resulting increase in enrollment. To meet these goals, significant effort is being made to both incorporate computational thinking into existing primary school education, and to support larger university computer science class sizes. We contribute to this effort through the creation and use of improved automated assessment tools. To enable wide-scale computer science education we do two things. First, we create a framework called Hairball to support the static analysis of Scratch programs targeted for fourth, fifth, and sixth grade students. Scratch is a popular building-block language utilized to pique interest in and teach the basics of computer science. We observe that Hairball allows for rapid curriculum alterations and thus contributes to wide-scale deployment of computer science curriculum. Second, we create a real-time feedback and assessment system utilized in university computer science classes to provide better feedback to students while reducing assessment time. Insights from our analysis of student submission data show that modifications to the system configuration support the way students learn and progress through course material, making it possible for instructors to tailor assignments to optimize learning in growing computer science classes.

  16. Examples of Optical Assessment of Surface Cleanliness of Genesis Samples

    NASA Technical Reports Server (NTRS)

    Rodriquez, Melissa C.; Allton, J. H.; Burkett, P. J.; Gonzalez, C. P.

    2013-01-01

    Optical microscope assessment of Genesis solar wind collector surfaces is a coordinated part of the effort to obtain an assessed clean subset of flown wafer material for the scientific community. Microscopic survey is typically done at 50X magnification at selected areas of approximately 1 square millimeter on the fragment surface. This survey is performed each time a principal investigator (PI) returns a sample to JSC for documentation as part of the established cleaning plan. The cleaning plan encompasses sample handling and analysis by Genesis science team members, and optical survey is done at each step in the process. Sample surface cleaning is performed at JSC (ultrapure water [1] and UV ozone cleaning [2]) and experimentally by other science team members (acid etch [3], acetate replica peels [4], CO2 snow [5], etc.). The documentation of each cleaning method can potentially be assessed with optical observation utilizing Image Pro Plus software [6]. Differences in particle counts can be studied and discussed within analysis groups. Approximately 25 samples have been identified as part of the cleaning matrix effort to date.

  17. Automated Large Scale Parameter Extraction of Road-Side Trees Sampled by a Laser Mobile Mapping System

    NASA Astrophysics Data System (ADS)

    Lindenbergh, R. C.; Berthold, D.; Sirmacek, B.; Herrero-Huerta, M.; Wang, J.; Ebersbach, D.

    2015-08-01

    In urbanized Western Europe, trees are considered an important component of the built-up environment, which means there is an increasing demand for tree inventories. Laser mobile mapping systems provide an efficient and accurate way to sample the 3D road surroundings, including notable roadside trees. Indeed, at, say, 50 km/h such systems collect point clouds consisting of half a million points per 100 m. Methods exist that extract tree parameters from relatively small patches of such data, but a remaining challenge is to operationally extract roadside tree parameters at the regional level. For this purpose a workflow is presented as follows: the input point clouds are consecutively downsampled, retiled, classified, segmented into individual trees, and upsampled to enable automated extraction of tree location, tree height, canopy diameter and trunk diameter at breast height (DBH). The workflow is implemented for a laser mobile mapping data set sampling 100 km of road in Sachsen, Germany, and is tested on a 7 km stretch of road. Along this road, the method detected 315 trees that were considered well detected, plus 56 clusters of tree points where no individual trees could be identified. Using voxels, the data volume could be reduced by about 97% in a default scenario. Processing the results of this scenario took ~2500 seconds, corresponding to about 10 km/h, which approaches but remains below the acquisition rate, estimated at 50 km/h.
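
    The voxel-based reduction step mentioned above can be pictured with the following sketch (an illustrative implementation under an assumed voxel size, not the published code): points falling in the same voxel are replaced by their centroid, which is how the data volume shrinks by roughly 97%.

```python
# Voxel-grid downsampling of a mobile mapping point cloud (illustrative only).
import numpy as np

def voxel_downsample(points, voxel_size=0.5):
    """points: (N, 3) array of x, y, z coordinates in metres."""
    keys = np.floor(points / voxel_size).astype(np.int64)
    # group points by voxel index and average each group
    _, inverse = np.unique(keys, axis=0, return_inverse=True)
    sums = np.zeros((inverse.max() + 1, 3))
    counts = np.zeros(inverse.max() + 1)
    np.add.at(sums, inverse, points)
    np.add.at(counts, inverse, 1)
    return sums / counts[:, None]

cloud = np.random.rand(500000, 3) * 100.0   # stand-in for ~100 m of road
print(voxel_downsample(cloud, voxel_size=0.5).shape)
```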

  18. Dried Blood Spot Proteomics: Surface Extraction of Endogenous Proteins Coupled with Automated Sample Preparation and Mass Spectrometry Analysis

    NASA Astrophysics Data System (ADS)

    Martin, Nicholas J.; Bunch, Josephine; Cooper, Helen J.

    2013-08-01

    Dried blood spots offer many advantages as a sample format including ease and safety of transport and handling. To date, the majority of mass spectrometry analyses of dried blood spots have focused on small molecules or hemoglobin. However, dried blood spots are a potentially rich source of protein biomarkers, an area that has been overlooked. To address this issue, we have applied an untargeted bottom-up proteomics approach to the analysis of dried blood spots. We present an automated and integrated method for extraction of endogenous proteins from the surface of dried blood spots and sample preparation via trypsin digestion by use of the Advion Biosciences Triversa Nanomate robotic platform. Liquid chromatography tandem mass spectrometry of the resulting digests enabled identification of 120 proteins from a single dried blood spot. The proteins identified cross a concentration range of four orders of magnitude. The method is evaluated and the results discussed in terms of the proteins identified and their potential use as biomarkers in screening programs.

  19. An Automated Version of the BAT Syntactic Comprehension Task for Assessing Auditory L2 Proficiency in Healthy Adults

    ERIC Educational Resources Information Center

    Achim, Andre; Marquis, Alexandra

    2011-01-01

    Studies of bilingualism sometimes require healthy subjects to be assessed for proficiency at auditory sentence processing in their second language (L2). The Syntactic Comprehension task of the Bilingual Aphasia Test could satisfy this need. For ease and uniformity of application, we automated its English (Paradis, M., Libben, G., and Hummel, K.…

  20. Automated Urinalysis

    NASA Technical Reports Server (NTRS)

    1994-01-01

    Information from NASA Tech Briefs assisted DiaSys Corporation in the development of the R/S 2000 which automates urinalysis, eliminating most manual procedures. An automatic aspirator is inserted into a standard specimen tube, the "Sample" button is pressed, and within three seconds a consistent amount of urine sediment is transferred to a microscope. The instrument speeds up, standardizes, automates and makes urine analysis safer. Additional products based on the same technology are anticipated.

  1. Aerothermodynamics Feasibility Assessment of a Mars Atmospheric Sample Return Mission

    NASA Astrophysics Data System (ADS)

    Ferracina, L.; Larranaga, J.; Falkner, P.

    2011-02-01

    ESA's optional Mars Robotic Exploration Preparation (MREP) programme is based on a long-term collaboration with NASA, taking Mars exploration as the global objective and a Mars Sample Return (MSR) mission as the long-term goal to be achieved by the mid-2020s. Considering today's uncertainties, different missions are envisaged and prepared by ESA as possible alternatives to MSR in the 2020-2026 timeframe, in case the required technology readiness is not reached by 2015 or landed mass capabilities are exceeded for any of the MSR mission elements. One of the missions considered by ESA within this framework is the Mars Atmospheric Sample Return mission. This mission has recently been assessed by ESA using its Concurrent Design Facility (CDF), with the aim of entering the Martian atmosphere with a probe at low altitude (≈50 km), collecting a sample of airborne atmosphere (gas and dust) and returning the sample to Earth. This paper reports the preliminary aerothermodynamic assessment of the design of the Martian entry probe conducted within the CDF study. Special attention has been paid to the selection of aerodynamically efficient vehicle concepts compared with blunt bodies, and to the effect of the high-temperature shock on the cavity placed at the stagnation point and used in the atmospheric sampling system.

  2. Assessment of efficient sampling designs for urban stormwater monitoring.

    PubMed

    Leecaster, Molly K; Schiff, Kenneth; Tiefenthaler, Liesl L

    2002-03-01

    Monitoring programs for urban runoff have not been assessed for effectiveness or efficiency in estimating mass emissions. In order to determine appropriate designs for stormwater, total suspended solids (TSS) and flow information from the Santa Ana River was collected nearly every 15 min for every storm of the 1998 water year. All samples were used to calculate the "true load" and then three within-storm sampling designs (flow-interval, time-interval, and simple random) and five among-storm sampling designs (stratified by size, stratified by season, simple random, simple random of medium and large storms, and the first m storms of the season) were simulated. Using these designs, we evaluated three estimators for storm mass emissions (mean, volume-weighted, and ratio) and three estimators for annual mass emissions (median, ratio, and regular). Designs and estimators were evaluated with respect to accuracy and precision. The optimal strategy was used to determine the appropriate number of storms to sample annually based upon confidence interval width for estimates of annual mass emissions and concentration. The amount of detectable trend in mass emissions and concentration was determined for sample sizes 3 and 7. Single storms were most efficiently characterized (small bias and standard error) by taking 12 samples following a flow-interval schedule and using a volume-weighted estimator of mass emissions. The ratio estimator, when coupled with the simple random sample of medium and large storms within a season, most accurately estimated concentration and mass emissions; and had low bias over all of the designs. Sampling seven storms is the most efficient method for attaining small confidence interval width for annual concentration. Sampling three storms per year allows a 20% trend to be detected in mass emissions or concentration over five years. These results are decreased by 10% by sampling seven storms per year. PMID:11996344
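
    A rough illustration of the recommended within-storm strategy (flow-interval sampling combined with a volume-weighted estimator) is sketched below; the concentrations, flows and sampling interval are invented and the code is not part of the study.

```python
# Volume-weighted (event mean) concentration and storm load from 12 samples.
import numpy as np

conc = np.array([120., 95., 80., 70., 65., 60., 58., 55., 52., 50., 48., 45.])  # TSS, mg/L
flow = np.array([2.0, 3.5, 5.0, 6.0, 5.5, 4.5, 4.0, 3.5, 3.0, 2.5, 2.0, 1.5])   # m^3/s
dt = 15 * 60                              # seconds represented by each sample

volumes = flow * dt                       # volume represented by each sample, m^3
ewmc = np.sum(conc * volumes) / np.sum(volumes)   # volume-weighted mean concentration
load = ewmc * np.sum(volumes) / 1e3       # storm mass emission, kg (mg/L * m^3 = g)
print(f"volume-weighted mean concentration: {ewmc:.1f} mg/L, load: {load:.1f} kg")
```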

  3. Adjustable virtual pore-size filter for automated sample preparation using acoustic radiation force

    SciTech Connect

    Jung, B; Fisher, K; Ness, K; Rose, K; Mariella, R

    2008-05-22

    We present a rapid and robust size-based separation method for high throughput microfluidic devices using acoustic radiation force. We developed a finite element modeling tool to predict the two-dimensional acoustic radiation force field perpendicular to the flow direction in microfluidic devices. Here we compare the results from this model with experimental parametric studies including variations of the PZT driving frequencies and voltages as well as various particle sizes, compressibilities, and densities. These experimental parametric studies also provide insight into the development of an adjustable 'virtual' pore-size filter as well as optimal operating conditions for various microparticle sizes. We demonstrated the separation of Saccharomyces cerevisiae and MS2 bacteriophage using acoustic focusing. The acoustic radiation force did not affect the MS2 viruses, and their concentration profile remained unchanged. With optimized design of our microfluidic flow system we were able to achieve yields of > 90% for the MS2 with > 80% of the S. cerevisiae being removed in this continuous-flow sample preparation device.

  4. Automated contour mapping using sparse volume sampling for 4D radiation therapy

    SciTech Connect

    Chao Ming; Schreibmann, Eduard; Li Tianfang; Wink, Nicole; Xing Lei

    2007-10-15

    The purpose of this work is to develop a novel strategy to automatically map organ contours from one phase of respiration to all other phases on a four-dimensional computed tomography (4D CT). A region of interest (ROI) was manually delineated by a physician on one phase-specific image set of a 4D CT. A number of cubic control volumes of the size of ~1 cm were automatically placed along the contours. The control volumes were then collectively mapped to the next phase using a rigid transformation. To accommodate organ deformation, a model-based adaptation of the control volume positions was followed after the rigid mapping procedure. This further adjustment of control volume positions was performed by minimizing an energy function which balances the tendency for the control volumes to move to their correspondences with the desire to maintain similar image features and shape integrity of the contour. The mapped ROI surface was then constructed based on the central positions of the control volumes using a triangulated surface construction technique. The proposed technique was assessed using a digital phantom and 4D CT images of three lung patients. Our digital phantom study data indicated that a spatial accuracy better than 2.5 mm is achievable using the proposed technique. The patient study showed a similar level of accuracy. In addition, the computational speed of our algorithm was significantly improved as compared with a conventional deformable registration-based contour mapping technique. The robustness and accuracy of this approach make it a valuable tool for the efficient use of the available spatio-temporal information for 4D simulation and treatment.
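
    The energy balance described above can be pictured schematically as follows (a conceptual sketch under assumed weights and a hypothetical image-feature callback, not the published formulation):

```python
# Conceptual energy for adapting control-volume positions: stay near the
# rigidly mapped correspondences, match local image features, keep the shape.
import numpy as np

def energy(p, c, image_term, alpha=1.0, beta=0.5, gamma=0.1):
    """p, c: (N, 3) arrays of control-volume centres; image_term(p) returns a
    per-point image-feature dissimilarity (hypothetical callback)."""
    correspondence = alpha * np.sum((p - c) ** 2)
    appearance = beta * np.sum(image_term(p))
    # shape integrity: penalise changes in spacing between neighbouring volumes
    smoothness = gamma * np.sum((np.diff(p, axis=0) - np.diff(c, axis=0)) ** 2)
    return correspondence + appearance + smoothness

p = np.random.rand(12, 3)
c = p + 0.01
print(energy(p, c, image_term=lambda q: np.zeros(len(q))))
```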

  5. Development testing of the chemical analysis automation polychlorinated biphenyl standard analysis method during surface soils sampling at the David Witherspoon 1630 site

    SciTech Connect

    Hunt, M.A.; Klatt, L.N.; Thompson, D.H.

    1998-02-01

    The Chemical Analysis Automation (CAA) project is developing standardized, software-driven, site-deployable robotic laboratory systems with the objective of lowering the per-sample analysis cost, decreasing sample turnaround time, and minimizing human exposure to hazardous and radioactive materials associated with DOE remediation projects. The first integrated system developed by the CAA project is designed to determine polychlorinated biphenyls (PCB) content in soil matrices. Demonstration and development testing of this system was conducted in conjunction with surface soil characterization activities at the David Witherspoon 1630 Site in Knoxville, Tennessee. The PCB system consists of five hardware standard laboratory modules (SLMs), one software SLM, the task sequence controller (TSC), and the human-computer interface (HCI). Four of the hardware SLMs included a four-channel Soxhlet extractor, a high-volume concentrator, a column cleanup, and a gas chromatograph. These SLMs performed the sample preparation and measurement steps within the total analysis protocol. The fifth hardware module was a robot that transports samples between the SLMs and the required consumable supplies to the SLMs. The software SLM is an automated data interpretation module that receives raw data from the gas chromatograph SLM and analyzes the data to yield the analyte information. The TSC is a software system that provides the scheduling, management of system resources, and the coordination of all SLM activities. The HCI is a graphical user interface that presents the automated laboratory to the analyst in terms of the analytical procedures and methods. Human control of the automated laboratory is accomplished via the HCI. Sample information required for processing by the automated laboratory is entered through the HCI. Information related to the sample and the system status is presented to the analyst via graphical icons.

  6. Re-Emergence of Under-Selected Stimuli, after the Extinction of Over-Selected Stimuli in an Automated Match to Samples Procedure

    ERIC Educational Resources Information Center

    Broomfield, Laura; McHugh, Louise; Reed, Phil

    2008-01-01

    Stimulus over-selectivity occurs when one of potentially many aspects of the environment comes to control behaviour. In two experiments, adults with no developmental disabilities, were trained and tested in an automated match to samples (MTS) paradigm. In Experiment 1, participants completed two conditions, in one of which the over-selected…

  7. Evaluation of automated direct sample introduction with comprehensive two-dimensional gas chromatography/time-of-flight mass spectrometry for the screening analysis of dioxins of fish oil

    Technology Transfer Automated Retrieval System (TEKTRAN)

    An automated direct sample introduction technique coupled to comprehensive two-dimensional gas chromatography-time of flight mass spectrometry (DSI-GC×GC/TOF-MS) was applied for the development of a relatively fast and easy analytical screening method for 17 polychlorinated dibenzo-p-dioxins/dibenzo...

  8. Information-Theoretic Assessment of Sample Imaging Systems

    NASA Technical Reports Server (NTRS)

    Huck, Friedrich O.; Alter-Gartenberg, Rachel; Park, Stephen K.; Rahman, Zia-ur

    1999-01-01

    By rigorously extending modern communication theory to the assessment of sampled imaging systems, we develop the formulations that are required to optimize the performance of these systems within the critical constraints of image gathering, data transmission, and image display. The goal of this optimization is to produce images with the best possible visual quality for the wide range of statistical properties of the radiance field of natural scenes that one normally encounters. Extensive computational results are presented to assess the performance of sampled imaging systems in terms of information rate, theoretical minimum data rate, and fidelity. Comparisons of this assessment with perceptual and measurable performance demonstrate that (1) the information rate that a sampled imaging system conveys from the captured radiance field to the observer is closely correlated with the fidelity, sharpness and clarity with which the observed images can be restored and (2) the associated theoretical minimum data rate is closely correlated with the lowest data rate with which the acquired signal can be encoded for efficient transmission.

  9. Automated on-line liquid-liquid extraction system for temporal mass spectrometric analysis of dynamic samples.

    PubMed

    Hsieh, Kai-Ta; Liu, Pei-Han; Urban, Pawel L

    2015-09-24

    Most real samples cannot directly be infused to mass spectrometers because they could contaminate delicate parts of ion source and guides, or cause ion suppression. Conventional sample preparation procedures limit temporal resolution of analysis. We have developed an automated liquid-liquid extraction system that enables unsupervised repetitive treatment of dynamic samples and instantaneous analysis by mass spectrometry (MS). It incorporates inexpensive open-source microcontroller boards (Arduino and Netduino) to guide the extraction and analysis process. Duration of every extraction cycle is 17 min. The system enables monitoring of dynamic processes over many hours. The extracts are automatically transferred to the ion source incorporating a Venturi pump. Operation of the device has been characterized (repeatability, RSD = 15%, n = 20; concentration range for ibuprofen, 0.053-2.000 mM; LOD for ibuprofen, ∼0.005 mM; including extraction and detection). To exemplify its usefulness in real-world applications, we implemented this device in chemical profiling of pharmaceutical formulation dissolution process. Temporal dissolution profiles of commercial ibuprofen and acetaminophen tablets were recorded during 10 h. The extraction-MS datasets were fitted with exponential functions to characterize the rates of release of the main and auxiliary ingredients (e.g. ibuprofen, k = 0.43 ± 0.01 h(-1)). The electronic control unit of this system interacts with the operator via touch screen, internet, voice, and short text messages sent to the mobile phone, which is helpful when launching long-term (e.g. overnight) measurements. Due to these interactive features, the platform brings the concept of the Internet-of-Things (IoT) to the chemistry laboratory environment. PMID:26423626
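
    The exponential fitting of the dissolution profiles can be illustrated with a short sketch (the time points and signal values are invented; only the functional form and the reported order of magnitude of k follow the abstract):

```python
# Fit an exponential release profile to extraction-MS data points.
import numpy as np
from scipy.optimize import curve_fit

def release(t, a, k):
    return a * (1.0 - np.exp(-k * t))     # signal released at time t (hours)

t_hours = np.array([0.5, 1, 2, 3, 4, 6, 8, 10], dtype=float)
signal = np.array([0.20, 0.35, 0.58, 0.72, 0.81, 0.92, 0.96, 0.98])

(a_fit, k_fit), _ = curve_fit(release, t_hours, signal, p0=(1.0, 0.5))
print(f"release rate constant k = {k_fit:.2f} 1/h")
```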

  10. An automated serial Grinding, Imaging and Reconstruction Instrument (GIRI) for digital modeling of samples with weak density contrasts

    NASA Astrophysics Data System (ADS)

    Maloof, A. C.; Samuels, B.; Mehra, A.; Spatzier, A.

    2013-12-01

    We present the first results from the new Princeton University Grinder Lab dedicated to the digital reconstruction of hidden objects through serial grinding and imaging. The purpose of a destructive technique like serial grinding is to facilitate the discovery of embedded objects with weak density contrasts outside the sensitivity limits of X-ray CT-scanning devices (feature segmentation and object reconstruction are based on color and textural contrasts in the stack of images rather than density). The device we have developed is a retrofit imaging station designed for a precision CNC surface grinder. The instrument is capable of processing a sample 20x25x40 cm in size at 1 micron resolution in x, y and z axes. Directly coupled to the vertical axis of the grinder is an 80 megapixel medium format camera and specialty macro lens capable of imaging a 4x5 cm surface at 5 micron resolution in full 16 bit color. The system is automated such that after each surface grind, the sample is cleaned, travels to the opposite end of the bed from the grinder wheel, is photographed, and then moved back to the grinding position. This process establishes a comprehensive archive of the specimen that is used for digital reconstruction and quantitative analysis. For example, in one night, a 7 cm thick sample can be imaged completely at 20 micron horizontal and vertical resolution without human supervision. Some of the initial results we present here include new digital reconstructions of early animal fossils, 3D sedimentary bedforms, the size and shape distribution of chondrules in chondritic meteorites, and the porosity structure of carbonate cemented reservoir rocks.

  11. Automated Cognitive Health Assessment From Smart Home-Based Behavior Data.

    PubMed

    Dawadi, Prafulla Nath; Cook, Diane Joyce; Schmitter-Edgecombe, Maureen

    2016-07-01

    Smart home technologies offer potential benefits for assisting clinicians by automating health monitoring and well-being assessment. In this paper, we examine the actual benefits of smart home-based analysis by monitoring daily behavior in the home and predicting clinical scores of the residents. To accomplish this goal, we propose a clinical assessment using activity behavior (CAAB) approach to model a smart home resident's daily behavior and predict the corresponding clinical scores. CAAB uses statistical features that describe characteristics of a resident's daily activity performance to train machine learning algorithms that predict the clinical scores. We evaluate the performance of CAAB utilizing smart home sensor data collected from 18 smart homes over two years. We obtain a statistically significant correlation (r = 0.72) between CAAB-predicted and clinician-provided cognitive scores and a statistically significant correlation (r = 0.45) between CAAB-predicted and clinician-provided mobility scores. These prediction results suggest that it is feasible to predict clinical scores using smart home sensor data and learning-based data analysis. PMID:26292348
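
    In the spirit of CAAB (a hedged sketch, not the authors' implementation; the features, data and choice of learner are assumptions), per-period activity features can be used to train a regressor that predicts a clinical score:

```python
# Predict a clinical score from statistical activity features (illustrative data).
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
# rows = resident-periods; columns = e.g. mean sleep duration, cooking-time
# variability, mobility (sensor events/hour), number of outings (all assumed)
X = rng.normal(size=(180, 4))
clinical_score = 2.0 * X[:, 2] - 1.0 * X[:, 1] + rng.normal(scale=0.5, size=180)

model = RandomForestRegressor(n_estimators=200, random_state=0)
print("cross-validated r^2:", cross_val_score(model, X, clinical_score, cv=5).mean())
```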

  12. Performance of the Automated Neuropsychological Assessment Metrics (ANAM) in Detecting Cognitive Impairment in Heart Failure Patients

    PubMed Central

    Xie, Susan S.; Goldstein, Carly M.; Gathright, Emily C.; Gunstad, John; Dolansky, Mary A.; Redle, Joseph; Hughes, Joel W.

    2015-01-01

    Objective: To evaluate the capacity of the Automated Neuropsychological Assessment Metrics (ANAM) to detect cognitive impairment (CI) in heart failure (HF) patients. Background: CI is a key prognostic marker in HF. Although it is the most widely used cognitive screen in HF, the Mini-Mental State Examination (MMSE) is insufficiently sensitive. The ANAM has demonstrated sensitivity to cognitive domains affected by HF, but has not been assessed in this population. Methods: Investigators administered the ANAM and MMSE to 57 HF patients and compared the results against a composite model of cognitive function. Results: ANAM efficiency (p < .05) and accuracy scores (p < .001) successfully differentiated CI from non-CI. ANAM efficiency and accuracy scores correctly classified 97.7% and 93.0% of non-CI patients, and 14.3% and 21.4% of those with CI, respectively. Conclusions: The ANAM is more effective than the MMSE for detecting CI, but further research is needed to develop a more optimal cognitive screen for routine use in HF patients. PMID:26354858

  13. Automated Fast Screening Method for Cocaine Identification in Seized Drug Samples Using a Portable Fourier Transform Infrared (FT-IR) Instrument.

    PubMed

    Mainali, Dipak; Seelenbinder, John

    2016-05-01

    Quick and presumptive identification of seized drug samples without destroying evidence is necessary for law enforcement officials to control the trafficking and abuse of drugs. This work reports an automated screening method to detect the presence of cocaine in seized samples using portable Fourier transform infrared (FT-IR) spectrometers. The method is based on the identification of well-defined characteristic vibrational frequencies related to the functional group of the cocaine molecule and is fully automated through the use of an expert system. Traditionally, analysts look for key functional group bands in the infrared spectra and characterization of the molecules present is dependent on user interpretation. This implies the need for user expertise, especially in samples that likely are mixtures. As such, this approach is biased and also not suitable for non-experts. The method proposed in this work uses the well-established "center of gravity" peak picking mathematical algorithm and combines it with the conditional reporting feature in MicroLab software to provide an automated method that can be successfully employed by users with varied experience levels. The method reports the confidence level of cocaine present only when a certain number of cocaine related peaks are identified by the automated method. Unlike library search and chemometric methods that are dependent on the library database or the training set samples used to build the calibration model, the proposed method is relatively independent of adulterants and diluents present in the seized mixture. This automated method in combination with a portable FT-IR spectrometer provides law enforcement officials, criminal investigators, or forensic experts a quick field-based prescreening capability for the presence of cocaine in seized drug samples. PMID:27006022
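
    The "center of gravity" peak-position idea can be illustrated with a short sketch (an assumed implementation on a synthetic band, not the MicroLab code): the peak position is the intensity-weighted mean wavenumber over a window around a local maximum.

```python
# Intensity-weighted ("center of gravity") peak position within a window.
import numpy as np

def center_of_gravity(wavenumbers, absorbance, lo, hi):
    """Return the intensity-weighted peak position within [lo, hi] cm^-1."""
    m = (wavenumbers >= lo) & (wavenumbers <= hi)
    w, a = wavenumbers[m], absorbance[m]
    a = a - a.min()                       # simple local baseline removal
    return np.sum(w * a) / np.sum(a)

wn = np.linspace(1600, 1800, 400)
spectrum = np.exp(-((wn - 1712.0) / 6.0) ** 2)    # synthetic carbonyl-like band
print(round(center_of_gravity(wn, spectrum, 1690, 1735), 1))   # ~1712 cm^-1
```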

  14. Assessment of the 296-S-21 Stack Sampling Probe Location

    SciTech Connect

    Glissmeyer, John A.

    2006-09-08

    Tests were performed to assess the suitability of the location of the air sampling probe on the 296-S-21 stack according to the criteria of ANSI N13.1-1999, Sampling and Monitoring Releases of Airborne Radioactive Substances from the Stacks and Ducts of Nuclear Facilities. Pacific Northwest National Laboratory conducted most tests on a 3.67:1 scale model of the stack. CH2MHill also performed some limited confirmatory tests on the actual stack. The tests assessed the capability of the air-monitoring probe to extract a sample representative of the effluent stream. The tests were conducted for the practical combinations of operating fans and addressed: (1) Angular Flow--The purpose is to determine whether the velocity vector is aligned with the sampling nozzle. The average yaw angle relative to the nozzle axis should not be more than 20°. The measured values ranged from 5 to 11 degrees on the scale model and 10 to 12 degrees on the actual stack. (2) Uniform Air Velocity--The gas momentum across the stack cross section where the sample is extracted should be well mixed or uniform. The uniformity is expressed as the variability of the measurements about the mean, the coefficient of variance (COV). The lower the COV value, the more uniform the velocity. The acceptance criterion is that the COV of the air velocity must be ≤20% across the center two-thirds of the area of the stack. At the location simulating the sampling probe, the measured values ranged from 4 to 11%, which are within the criterion. To confirm the validity of the scale model results, air velocity uniformity measurements were made both on the actual stack and on the scale model at the test ports 1.5 stack diameters upstream of the sampling probe. The results ranged from 6 to 8% COV on the actual stack and 10 to 13% COV on the scale model. The average difference for the eight runs was 4.8% COV, which is within the validation criterion. The fact that the scale model results were slightly higher than the
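
    The coefficient of variance (COV) criterion used above is simply the standard deviation of the traverse readings divided by their mean; a minimal illustration with invented velocity readings:

```python
# COV acceptance check for velocity uniformity (values are placeholders).
import numpy as np

velocities = np.array([10.2, 10.8, 11.0, 9.7, 10.5, 10.1, 9.9, 10.6])  # m/s
cov_percent = 100.0 * velocities.std(ddof=1) / velocities.mean()
print(f"velocity COV = {cov_percent:.1f}% (criterion: <= 20%)")
```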

  15. Passive sampling methods for contaminated sediments: Risk assessment and management

    PubMed Central

    Greenberg, Marc S; Chapman, Peter M; Allan, Ian J; Anderson, Kim A; Apitz, Sabine E; Beegan, Chris; Bridges, Todd S; Brown, Steve S; Cargill, John G; McCulloch, Megan C; Menzie, Charles A; Shine, James P; Parkerton, Thomas F

    2014-01-01

    This paper details how activity-based passive sampling methods (PSMs), which provide information on bioavailability in terms of freely dissolved contaminant concentrations (Cfree), can be used to better inform risk management decision making at multiple points in the process of assessing and managing contaminated sediment sites. PSMs can increase certainty in site investigation and management, because Cfree is a better predictor of bioavailability than total bulk sediment concentration (Ctotal) for 4 key endpoints included in conceptual site models (benthic organism toxicity, bioaccumulation, sediment flux, and water column exposures). The use of passive sampling devices (PSDs) presents challenges with respect to representative sampling for estimating average concentrations and other metrics relevant for exposure and risk assessment. These challenges can be addressed by designing studies that account for sources of variation associated with PSMs and considering appropriate spatial scales to meet study objectives. Possible applications of PSMs include: quantifying spatial and temporal trends in bioavailable contaminants, identifying and evaluating contaminant source contributions, calibrating site-specific models, and improving weight-of-evidence-based decision frameworks. PSM data can be used to assist in delineating sediment management zones based on likelihood of exposure effects, monitor remedy effectiveness, and evaluate risk reduction after sediment treatment, disposal, or beneficial reuse after management actions. Examples are provided illustrating why PSMs and freely dissolved contaminant concentrations (Cfree) should be incorporated into contaminated sediment investigations and study designs to better focus on and understand contaminant bioavailability, more accurately estimate exposure to sediment-associated contaminants, and better inform risk management decisions. Research and communication needs for encouraging broader use are discussed. Integr

  16. Microwave-Assisted Sample Treatment in a Fully Automated Flow-Based Instrument: Oxidation of Reduced Technetium Species in the Analysis of Total Technetium-99 in Caustic Aged Nuclear Waste Samples

    SciTech Connect

    Egorov, Oleg B.; O'Hara, Matthew J.; Grate, Jay W.

    2004-07-15

    An automated flow-based instrument for microwave-assisted treatment of liquid samples has been developed and characterized. The instrument utilizes a flow-through reaction vessel design that facilitates the addition of multiple reagents during sample treatment, removal of the gaseous reaction products, and enables quantitative removal of liquids from the reaction vessel for carryover-free operations. Matrix modification and speciation control chemistries that are required for the radiochemical determination of total 99Tc in caustic aged nuclear waste samples have been investigated. A rapid and quantitative oxidation procedure using peroxydisulfate in acidic solution was developed to convert reduced technetium species to pertechnetate in samples with high content of reducing organics. The effectiveness of the automated sample treatment procedures has been validated in the radiochemical analysis of total 99Tc in caustic aged nuclear waste matrixes from the Hanford site.

  17. Comparison of Automated Scoring Methods for a Computerized Performance Assessment of Clinical Judgment

    ERIC Educational Resources Information Center

    Harik, Polina; Baldwin, Peter; Clauser, Brian

    2013-01-01

    Growing reliance on complex constructed response items has generated considerable interest in automated scoring solutions. Many of these solutions are described in the literature; however, relatively few studies have been published that "compare" automated scoring strategies. Here, comparisons are made among five strategies for…

  18. Development of Automated Scoring Algorithms for Complex Performance Assessments: A Comparison of Two Approaches.

    ERIC Educational Resources Information Center

    Clauser, Brian E.; Margolis, Melissa J.; Clyman, Stephen G.; Ross, Linette P.

    1997-01-01

    Research on automated scoring is extended by comparing alternative automated systems for scoring a computer simulation of physicians' patient management skills. A regression-based system is more highly correlated with experts' evaluations than a system that uses complex rules to map performances into score levels, but both approaches are feasible.…

  19. Assessing Library Automation and Virtual Library Development in Four Academic Libraries in Oyo, Oyo State, Nigeria

    ERIC Educational Resources Information Center

    Gbadamosi, Belau Olatunde

    2011-01-01

    The paper examines the level of library automation and virtual library development in four academic libraries. A validated questionnaire was used to capture the responses from academic librarians of the libraries under study. The paper discovers that none of the four academic libraries is fully automated. The libraries make use of librarians with…

  20. Assessing the potential value of automated body condition scoring through stochastic simulation

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Automated body condition scoring (BCS) using digital images has been shown to be feasible. The primary objective of this research was to identify factors that influence the profitability of investment in an automated BCS system. An expert opinion survey was conducted to provide estimates for potenti...

  1. Molecular assessment of disease states in kidney transplant biopsy samples.

    PubMed

    Halloran, Philip F; Famulski, Konrad S; Reeve, Jeff

    2016-09-01

    Progress in renal transplantation requires improved understanding and assessment of rejection and injury. Study of the relationship between gene expression and clinical phenotypes in kidney transplant biopsy samples has led to the development of a system that enables diagnoses of specific disease states on the basis of messenger RNA levels in the biopsy sample. Using this system, we have defined the molecular landscape of T cell-mediated rejection (TCMR), antibody-mediated rejection (ABMR), acute kidney injury (AKI), and tubular atrophy and interstitial fibrosis. TCMR and ABMR share IFNγ-mediated effects, and TCMR has emerged as a cognate T cell-antigen presenting cell process in the interstitium, whereas ABMR is a natural-killer-cell-mediated process that occurs in the microcirculation. The specific features of these different processes have led to the creation of classifiers to test for TCMR and ABMR, and revealed that ABMR is the principal cause of kidney transplant deterioration. The molecular changes associated with renal injury are often more extensive than suggested by histology and indicate that the progression to graft failure is caused by continuing nephron injury, rather than fibrogenesis. In summary, advances in the molecular assessment of disease states in biopsy samples have improved understanding of specific processes involved in kidney graft outcomes. PMID:27345248

  2. Assessment of anti-Salmonella activity of boot dip samples.

    PubMed

    Rabie, André J; McLaren, Ian M; Breslin, Mark F; Sayers, Robin; Davies, Rob H

    2015-01-01

    The introduction of pathogens from the external environment into poultry houses via the boots of farm workers and visitors presents a significant risk. The use of boot dips containing disinfectant to help prevent this from happening is common practice, but the effectiveness of these boot dips as a preventive measure can vary. The aim of this study was to assess the anti-Salmonella activity of boot dips that are being used on poultry farms. Boot dip samples were collected from commercial laying hen farms in the UK and tested within 24 hours of receipt at the laboratory to assess their anti-Salmonella activity. All boot dip samples were tested against a field strain of Salmonella enterica serovar Enteritidis using three test models: pure culture, paper disc surface matrix and yeast suspension model. Of the 112 boot dip samples tested 83.6% were effective against Salmonella in pure culture, 37.3% in paper disc surface matrix and 44.5% in yeast suspension model. Numerous factors may influence the efficacy of the disinfectants. Disinfectants used in the dips may not always be fully active against surface or organic matter contamination; they may be inaccurately measured or diluted to a concentration other than that specified or recommended; dips may not be changed regularly or may have been exposed to rain and other environmental elements. This study showed that boot dips in use on poultry farms are frequently ineffective. PMID:25650744

  3. Preliminary performance assessment of computer automated facial approximations using computed tomography scans of living individuals.

    PubMed

    Parks, Connie L; Richard, Adam H; Monson, Keith L

    2013-12-10

    ReFace (Reality Enhancement Facial Approximation by Computational Estimation) is a computer-automated facial approximation application jointly developed by the Federal Bureau of Investigation and GE Global Research. The application derives a statistically based approximation of a face from an unidentified skull using a dataset of ~400 human head computed tomography (CT) scans of living adult American individuals from four ancestry groups: African, Asian, European and Hispanic (self-identified). To date, only one unpublished subjective recognition study has been conducted using ReFace approximations. It indicated that approximations produced by ReFace were recognized above chance rates (10%). This preliminary study assesses: (i) the recognizability of five ReFace approximations; (ii) the recognizability of CT-derived skin surface replicas of the same individuals whose skulls were used to create the ReFace approximations; and (iii) the relationship between recognition performance and resemblance ratings of target individuals. All five skin surface replicas were recognized at rates statistically significantly above chance (22-50%). Four of five ReFace approximations were recognized above chance (5-18%), although with statistical significance only at the higher rate. Such results suggest reconsideration of the usefulness of the type of output format utilized in this study, particularly in regard to facial approximations employed as a means of identifying unknown individuals. PMID:24314512

  4. Automated content and quality assessment of full-motion-video for the generation of meta data

    NASA Astrophysics Data System (ADS)

    Harguess, Josh

    2015-05-01

    Virtually all of the video data (and full-motion-video (FMV)) that is currently collected and stored in support of missions has been corrupted to various extents by image acquisition and compression artifacts. Additionally, video collected by wide-area motion imagery (WAMI) surveillance systems and unmanned aerial vehicles (UAVs) and similar sources is often of low quality or in other ways corrupted so that it is not worth storing or analyzing. In order to make progress in the problem of automatic video analysis, the first problem that should be solved is deciding whether the content of the video is even worth analyzing to begin with. We present a work in progress to address three types of scenes which are typically found in real-world data stored in support of Department of Defense (DoD) missions: no or very little motion in the scene, large occlusions in the scene, and fast camera motion. Each of these produce video that is generally not usable to an analyst or automated algorithm for mission support and therefore should be removed or flagged to the user as such. We utilize recent computer vision advances in motion detection and optical flow to automatically assess FMV for the identification and generation of meta-data (or tagging) of video segments which exhibit unwanted scenarios as described above. Results are shown on representative real-world video data.
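
    One of the checks described above, flagging segments with little or no motion in the scene, can be sketched with dense optical flow (an illustrative snippet with a placeholder file name and threshold, not the system described in the paper):

```python
# Flag a video segment as "little or no motion" from mean optical-flow magnitude.
import cv2
import numpy as np

cap = cv2.VideoCapture("clip.mp4")          # hypothetical FMV clip
ok, prev = cap.read()
prev_gray = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)
motion = []

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    flow = cv2.calcOpticalFlowFarneback(prev_gray, gray, None,
                                        0.5, 3, 15, 3, 5, 1.2, 0)
    motion.append(np.linalg.norm(flow, axis=2).mean())   # mean flow magnitude
    prev_gray = gray

if np.median(motion) < 0.2:                 # placeholder threshold
    print("tag segment: little or no motion in scene")
```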

  5. Interim assessment of the VAL automated guideway transit system. Interim report

    SciTech Connect

    Anagnostopoulos, G.

    1981-11-01

    This report describes an interim assessment of the VAL (Vehicules Automatiques Legers or Light Automated Vehicle) AGT system which is currently under construction in Lille, France, and which is to become fully operational in December 1983. This report contains a technical description and performance data resulting from a demonstration test program performed concurrently in August 1980. VAL is the first driverless AGT urban system application in France. The system operates at grade, elevated, and in tunnels on an exclusive concrete dual-lane guideway that is 12.7 kilometers long. The configuration of the system is a push-pull loop operating between 17 on-line stations. The system is designed to provide scheduled operation at 60-second headways and a normal one-way capacity of 7440 passengers per hour per direction with 55 percent of the passengers seated. Two pneumatic-tired vehicles are coupled into a single vehicle capable of carrying 124 passengers at line speeds of 60 km/hr. During the course of the demonstration test program, VAL demonstrated that it could achieve high levels of dependability and availability and could perform safely under all perceivable conditions.

  6. Automated Health Alerts Using In-Home Sensor Data for Embedded Health Assessment

    PubMed Central

    Guevara, Rainer Dane; Rantz, Marilyn

    2015-01-01

    We present an example of unobtrusive, continuous monitoring in the home for the purpose of assessing early health changes. Sensors embedded in the environment capture behavior and activity patterns. Changes in patterns are detected as potential signs of changing health. We first present results of a preliminary study investigating 22 features extracted from in-home sensor data. A 1-D alert algorithm was then implemented to generate health alerts to clinicians in a senior housing facility. Clinicians analyze each alert and provide a rating on the clinical relevance. These ratings are then used as ground truth for training and testing classifiers. Here, we present the methodology for four classification approaches that fuse multisensor data. Results are shown using embedded sensor data and health alert ratings collected on 21 seniors over nine months. The best results show similar performance for two techniques, where one approach uses only domain knowledge and the second uses supervised learning for training. Finally, we propose a health change detection model based on these results and clinical expertise. The system of in-home sensors and algorithms for automated health alerts provides a method for detecting health problems very early so that early treatment is possible. This method of passive in-home sensing alleviates compliance issues. PMID:27170900

  7. Mixed species radioiodine air sampling readout and dose assessment system

    DOEpatents

    Distenfeld, Carl H.; Klemish, Jr., Joseph R.

    1978-01-01

    This invention provides a simple, reliable, inexpensive and portable means and method for determining the thyroid dose rate of mixed airborne species of solid and gaseous radioiodine without requiring highly skilled personnel, such as health physicists or electronics technicians. To this end, this invention provides a means and method for sampling a gas from a source of a mixed species of solid and gaseous radioiodine for collection of the mixed species and readout and assessment of the emissions therefrom by cylindrically, concentrically and annularly molding the respective species around a cylindrical passage for receiving a conventional probe-type Geiger-Mueller radiation detector.

  8. Automated size-specific CT dose monitoring program: Assessing variability in CT dose

    SciTech Connect

    Christianson, Olav; Li Xiang; Frush, Donald; Samei, Ehsan

    2012-11-15

    Purpose: The potential health risks associated with low levels of ionizing radiation have created a movement in the radiology community to optimize computed tomography (CT) imaging protocols to use the lowest radiation dose possible without compromising the diagnostic usefulness of the images. Despite efforts to use appropriate and consistent radiation doses, studies suggest that a great deal of variability in radiation dose exists both within and between institutions for CT imaging. In this context, the authors have developed an automated size-specific radiation dose monitoring program for CT and used this program to assess variability in size-adjusted effective dose from CT imaging. Methods: The authors' radiation dose monitoring program operates on an independent Health Insurance Portability and Accountability Act (HIPAA)-compliant dosimetry server. Digital Imaging and Communications in Medicine (DICOM) routing software is used to isolate dose report screen captures and scout images for all incoming CT studies. Effective dose conversion factors (k-factors) are determined based on the protocol and optical character recognition is used to extract the CT dose index and dose-length product. The patient's thickness is obtained by applying an adaptive thresholding algorithm to the scout images and is used to calculate the size-adjusted effective dose (ED_adj). The radiation dose monitoring program was used to collect data on 6351 CT studies from three scanner models (GE Lightspeed Pro 16, GE Lightspeed VCT, and GE Definition CT750 HD) and two institutions over a one-month period and to analyze the variability in ED_adj between scanner models and across institutions. Results: No significant difference was found between computer measurements of patient thickness and observer measurements (p = 0.17), and the average difference between the two methods was less than 4%. Applying the size correction resulted in ED_adj that differed by up to 44% from effective dose estimates
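
    The core arithmetic is easy to illustrate. The sketch below combines a protocol-specific k-factor with the dose-length product and applies a size correction normalized to a reference patient thickness; the exponential correction form, its coefficient, and the k-factor values are assumptions for illustration and are not taken from the cited program.

    import math

    # Typical published k-factors in mSv per mGy*cm (illustrative values)
    K_FACTORS = {"chest": 0.014, "abdomen_pelvis": 0.015}

    def size_adjusted_effective_dose(protocol, dlp_mGy_cm, thickness_cm,
                                     ref_thickness_cm=30.0, coeff=0.04):
        k = K_FACTORS[protocol]
        ed = k * dlp_mGy_cm                    # conventional effective dose estimate
        # Hypothetical size correction: thinner patients absorb a larger fraction
        # of the delivered dose, so ED is scaled up relative to the reference size.
        correction = math.exp(coeff * (ref_thickness_cm - thickness_cm))
        return ed * correction                 # ED_adj in mSv

    print(round(size_adjusted_effective_dose("abdomen_pelvis", 800, 24.0), 2))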

  9. Beyond crosswalks: reliability of exposure assessment following automated coding of free-text job descriptions for occupational epidemiology.

    PubMed

    Burstyn, Igor; Slutsky, Anton; Lee, Derrick G; Singer, Alison B; An, Yuan; Michael, Yvonne L

    2014-05-01

    Epidemiologists typically collect narrative descriptions of occupational histories because these are less prone than self-reported exposures to recall bias of exposure to a specific hazard. However, the task of coding these narratives can be daunting and prohibitively time-consuming in some settings. The aim of this manuscript is to evaluate the performance of a computer algorithm to translate the narrative description of occupational codes into standard classification of jobs (2010 Standard Occupational Classification) in an epidemiological context. The fundamental question we address is whether exposure assignment resulting from manual (presumed gold standard) coding of the narratives is materially different from that arising from the application of automated coding. We pursued our work through three motivating examples: assessment of physical demands in Women's Health Initiative observational study, evaluation of predictors of exposure to coal tar pitch volatiles in the US Occupational Safety and Health Administration's (OSHA) Integrated Management Information System, and assessment of exposure to agents known to cause occupational asthma in a pregnancy cohort. In these diverse settings, we demonstrate that automated coding of occupations results in assignment of exposures that are in reasonable agreement with results that can be obtained through manual coding. The correlation between physical demand scores based on manual and automated job classification schemes was reasonable (r = 0.5). The agreement between predictive probability of exceeding the OSHA's permissible exposure level for polycyclic aromatic hydrocarbons, using coal tar pitch volatiles as a surrogate, based on manual and automated coding of jobs was modest (Kendall rank correlation = 0.29). In the case of binary assignment of exposure to asthmagens, we observed that fair to excellent agreement in classifications can be reached, depending on presence of ambiguity in assigned job classification (κ
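
    A short sketch of how the agreement statistics named in this record (Pearson correlation, Kendall rank correlation, Cohen's kappa) can be computed once exposures have been assigned from manual and automated job coding. SciPy and scikit-learn are assumed to be available, and the arrays are made-up placeholders, not the study's data.

    import numpy as np
    from scipy.stats import pearsonr, kendalltau
    from sklearn.metrics import cohen_kappa_score

    manual_scores = np.array([2.1, 3.4, 1.0, 4.2, 2.8])   # e.g., physical demand scores
    auto_scores = np.array([2.4, 3.1, 1.3, 3.9, 2.5])
    r, _ = pearsonr(manual_scores, auto_scores)
    tau, _ = kendalltau(manual_scores, auto_scores)

    manual_asthmagen = [1, 0, 0, 1, 1]                    # binary exposure assignments
    auto_asthmagen = [1, 0, 1, 1, 1]
    kappa = cohen_kappa_score(manual_asthmagen, auto_asthmagen)
    print(f"r={r:.2f}  tau={tau:.2f}  kappa={kappa:.2f}")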

  10. LAVA (Los Alamos Vulnerability and Risk Assessment Methodology): A conceptual framework for automated risk analysis

    SciTech Connect

    Smith, S.T.; Lim, J.J.; Phillips, J.R.; Tisinger, R.M.; Brown, D.C.; FitzGerald, P.D.

    1986-01-01

    At Los Alamos National Laboratory, we have developed an original methodology for performing risk analyses on subject systems characterized by a general set of asset categories, a general spectrum of threats, a definable system-specific set of safeguards protecting the assets from the threats, and a general set of outcomes resulting from threats exploiting weaknesses in the safeguards system. The Los Alamos Vulnerability and Risk Assessment Methodology (LAVA) models complex systems having large amounts of "soft" information about both the system itself and occurrences related to the system. Its structure lends itself well to automation on a portable computer, making it possible to analyze numerous similar but geographically separated installations consistently and in as much depth as the subject system warrants. LAVA is based on hierarchical systems theory, event trees, fuzzy sets, natural-language processing, decision theory, and utility theory. LAVA's framework is a hierarchical set of fuzzy event trees that relate the results of several embedded (or sub-) analyses: a vulnerability assessment providing information about the presence and efficacy of system safeguards, a threat analysis providing information about static (background) and dynamic (changing) threat components coupled with an analysis of asset "attractiveness" to the dynamic threat, and a consequence analysis providing information about the outcome spectrum's severity measures and impact values. By using LAVA, we have modeled our widely used computer security application as well as LAVA/CS systems for physical protection, transborder data flow, contract awards, and property management. It is presently being applied for modeling risk management in embedded systems, survivability systems, and weapons systems security. LAVA is especially effective in modeling subject systems that include a large human component.

  11. Assessment of the application of an automated electronic milk analyzer for the enumeration of total bacteria in raw goat milk.

    PubMed

    Ramsahoi, L; Gao, A; Fabri, M; Odumeru, J A

    2011-07-01

    Automated electronic milk analyzers for rapid enumeration of total bacteria counts (TBC) are widely used for raw milk testing by many analytical laboratories worldwide. In Ontario, Canada, Bactoscan flow cytometry (BsnFC; Foss Electric, Hillerød, Denmark) is the official anchor method for TBC in raw cow milk. Penalties are levied at the BsnFC equivalent level of 50,000 cfu/mL, the standard plate count (SPC) regulatory limit. This study was conducted to assess the BsnFC for TBC in raw goat milk, to determine the mathematical relationship between the SPC and BsnFC methods, and to identify probable reasons for the difference in the SPC:BsnFC equivalents for goat and cow milks. Test procedures were conducted according to International Dairy Federation Bulletin guidelines. Approximately 115 farm bulk tank milk samples per month were tested for inhibitor residues, SPC, BsnFC, psychrotrophic bacteria count, composition (fat, protein, lactose, lactose and other solids, and freezing point), and somatic cell count from March 2009 to February 2010. Data analysis of the results for the samples tested indicated that the BsnFC method would be a good alternative to the SPC method, providing accurate and more precise results with a faster turnaround time. Although a linear regression model showed good correlation and prediction, tests for linearity indicated that the relationship was linear only beyond log 4.1 SPC. The logistic growth curve best modeled the relationship between the SPC and BsnFC for the entire sample population. The BsnFC equivalent to the SPC 50,000 cfu/mL regulatory limit was estimated to be 321,000 individual bacteria count (ibc)/mL. This estimate differs considerably from the BsnFC equivalent for cow milk (121,000 ibc/mL). Because of the low frequency of bulk tank milk pickups at goat farms, 78.5% of the samples had their oldest milking in the tank to be 6.5 to 9.0 d old when tested, compared with the cow milk samples, which had their oldest milking at 4 d

  12. Influence of commonly used primer systems on automated ribosomal intergenic spacer analysis of bacterial communities in environmental samples.

    PubMed

    Purahong, Witoon; Stempfhuber, Barbara; Lentendu, Guillaume; Francioli, Davide; Reitz, Thomas; Buscot, François; Schloter, Michael; Krüger, Dirk

    2015-01-01

    Due to the high diversity of bacteria in many ecosystems, their slow generation times, specific but mostly unknown nutrient requirements and syntrophic interactions, isolation based approaches in microbial ecology mostly fail to describe microbial community structure. Thus, cultivation independent techniques, which rely on directly extracted nucleic acids from the environment, are a well-used alternative. For example, bacterial automated ribosomal intergenic spacer analysis (B-ARISA) is one of the widely used methods for fingerprinting bacterial communities after PCR-based amplification of selected regions of the operon coding for rRNA genes using community DNA. However, B-ARISA alone does not provide any taxonomic information and the results may be severely biased in relation to the primer set selection. Furthermore, amplified DNA stemming from mitochondrial or chloroplast templates might strongly bias the obtained fingerprints. In this study, we determined the applicability of three different B-ARISA primer sets to the study of bacterial communities. The results from in silico analysis harnessing publicly available sequence databases showed that all three primer sets tested are specific to bacteria but only two primers sets assure high bacterial taxa coverage (1406f/23Sr and ITSF/ITSReub). Considering the study of bacteria in a plant interface, the primer set ITSF/ITSReub was found to amplify (in silico) sequences of some important crop species such as Sorghum bicolor and Zea mays. Bacterial genera and plant species potentially amplified by different primer sets are given. These data were confirmed when DNA extracted from soil and plant samples were analyzed. The presented information could be useful when interpreting existing B-ARISA results and planning B-ARISA experiments, especially when plant DNA can be expected. PMID:25749323

  13. Passive sampling methods for contaminated sediments: risk assessment and management.

    PubMed

    Greenberg, Marc S; Chapman, Peter M; Allan, Ian J; Anderson, Kim A; Apitz, Sabine E; Beegan, Chris; Bridges, Todd S; Brown, Steve S; Cargill, John G; McCulloch, Megan C; Menzie, Charles A; Shine, James P; Parkerton, Thomas F

    2014-04-01

    This paper details how activity-based passive sampling methods (PSMs), which provide information on bioavailability in terms of freely dissolved contaminant concentrations (Cfree ), can be used to better inform risk management decision making at multiple points in the process of assessing and managing contaminated sediment sites. PSMs can increase certainty in site investigation and management, because Cfree is a better predictor of bioavailability than total bulk sediment concentration (Ctotal ) for 4 key endpoints included in conceptual site models (benthic organism toxicity, bioaccumulation, sediment flux, and water column exposures). The use of passive sampling devices (PSDs) presents challenges with respect to representative sampling for estimating average concentrations and other metrics relevant for exposure and risk assessment. These challenges can be addressed by designing studies that account for sources of variation associated with PSMs and considering appropriate spatial scales to meet study objectives. Possible applications of PSMs include: quantifying spatial and temporal trends in bioavailable contaminants, identifying and evaluating contaminant source contributions, calibrating site-specific models, and, improving weight-of-evidence based decision frameworks. PSM data can be used to assist in delineating sediment management zones based on likelihood of exposure effects, monitor remedy effectiveness, and, evaluate risk reduction after sediment treatment, disposal, or beneficial reuse after management actions. Examples are provided illustrating why PSMs and freely dissolved contaminant concentrations (Cfree ) should be incorporated into contaminated sediment investigations and study designs to better focus on and understand contaminant bioavailability, more accurately estimate exposure to sediment-associated contaminants, and better inform risk management decisions. Research and communication needs for encouraging broader use are discussed. PMID

  14. Robotic Mars Sample Return: Risk Assessment and Analysis Report

    NASA Technical Reports Server (NTRS)

    Lalk, Thomas R.; Spence, Cliff A.

    2003-01-01

    A comparison of the risk associated with two alternative scenarios for a robotic Mars sample return mission was conducted. Two alternative mission scenarios were identified, the Jet Propulsion Lab (JPL) reference Mission and a mission proposed by Johnson Space Center (JSC). The JPL mission was characterized by two landers and an orbiter, and a Mars orbit rendezvous to retrieve the samples. The JSC mission (Direct/SEP) involves a solar electric propulsion (SEP) return to earth followed by a rendezvous with the space shuttle in earth orbit. A qualitative risk assessment to identify and characterize the risks, and a risk analysis to quantify the risks were conducted on these missions. Technical descriptions of the competing scenarios were developed in conjunction with NASA engineers and the sequence of events for each candidate mission was developed. Risk distributions associated with individual and combinations of events were consolidated using event tree analysis in conjunction with Monte Carlo techniques to develop probabilities of mission success for each of the various alternatives. The results were the probability of success of various end states for each candidate scenario. These end states ranged from complete success through various levels of partial success to complete failure. Overall probability of success for the Direct/SEP mission was determined to be 66% for the return of at least one sample and 58% for the JPL mission for the return of at least one sample cache. Values were also determined for intermediate events and end states as well as for the probability of violation of planetary protection. Overall mission planetary protection event probabilities of occurrence were determined to be 0.002% and 1.3% for the Direct/SEP and JPL Reference missions respectively.
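
    A toy Monte Carlo sketch of the event-tree idea described above: each mission phase succeeds or fails with an assumed probability, and repeated sampling yields the probability of the fully successful end state. Phase names and probabilities are invented for illustration and do not reflect the study's actual risk distributions.

    import random

    PHASES = [("launch", 0.98), ("mars_landing", 0.90), ("sample_collection", 0.95),
              ("ascent_and_rendezvous", 0.90), ("earth_return", 0.93)]

    def simulate(n=100_000):
        successes = 0
        for _ in range(n):
            # The mission reaches full success only if every phase succeeds
            if all(random.random() < p for _, p in PHASES):
                successes += 1
        return successes / n

    print(f"P(full mission success) ~ {simulate():.2f}")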

  15. Application of automated serial blood sampling and dried blood spot technique with liquid chromatography-tandem mass spectrometry for pharmacokinetic studies in mice.

    PubMed

    Wong, Philip; Pham, Roger; Whitely, Carl; Soto, Marcus; Salyers, Kevin; James, Christopher; Bruenner, Bernd A

    2011-11-01

    The goal of this work was to obtain full pharmacokinetic profiles from individual mice with the use of an automated blood sampling system and dried blood spot (DBS) technique. AMG 517, a potent and selective vanilloid receptor (VR1) antagonist, was dosed to mice (n=3) intravenously and blood samples were collected using the automated blood sampling system with the "no blood waste" method. The collected blood samples were a mixture of 25 μL blood and 50 μL of heparinized saline solution. Two 15 μL aliquots were manually spotted onto a DBS card and dried at room temperature for at least 2h before being stored in zip bags with desiccant. The remaining samples (45 μL) were stored at -70°C until analysis. Both the DBS and the whole blood samples (diluted with saline (1:2, v/v)) were extracted and analyzed by liquid chromatography-tandem mass spectrometry. The overall extraction recovery of the analyte from the dried blood spots was determined to be about 90%. The pharmacokinetic parameters calculated using the whole blood or the DBS concentration data were comparable, and were obtained from only 3 mice, whereas conventional sampling and analysis would have required up to 27 mice to achieve the same result. The analyte was shown to be stable in the diluted whole blood (blood:saline 1:2) at room temperature for at least 4h and in the DBS for at least 34 days when stored at room temperature. These results indicated that the automated blood sampling system and DBS collection are promising techniques to obtain full pharmacokinetic profiles from individual mice and reduce the use of animals. PMID:21784595

  16. Development and testing of external quality assessment samples for Salmonella detection in poultry samples.

    PubMed

    Martelli, F; Gosling, R; McLaren, I; Wales, A; Davies, R

    2014-10-01

    Salmonella-contaminated poultry house dust plus 10 g chicken faeces inoculated with Salmonella Enteritidis and then frozen for storage and transport were used as candidate external quality assurance test samples. Variations in faeces sample preparation, storage and culture were examined initially. This indicated that, within modest limits, the age of the inoculating culture and of the faeces did not affect detection, nor did swirling the pre-enrichment culture or extending its duration. Under optimal conditions of preparation and storage, Salmonella numbers of 70 colony-forming units (CFU) and above were reliably detected at the originating laboratory. A ring trial was performed, involving 13 external UK laboratories plus the originating laboratory. Faeces samples inoculated with Salmonella Enteritidis were frozen, transported on dry ice and tested by the ISO 6579:2002 (Annex D) method. Detection by the originating laboratory was consistent with the previously established lower limit for reliability of 70 CFU. However, the sensitivity of detection by the external laboratories was apparently poorer in several cases, with significant interlaboratory variation seen at the lowest inoculum level, using Fisher's exact test. Detection of Salmonella in poultry house dust appeared to be more sensitive and uniform among laboratories. Significance and impact of the study: Salmonella surveillance and control regimes in the European poultry industry and elsewhere require sensitive culture detection of Salmonella in environmental samples, including poultry faeces. A ring trial was conducted, and the results highlighted that some of the participating laboratories failed to identify Salmonella. This suggests that contaminated frozen faeces cubes could be beneficial to assess proficiency, according to the results of this preliminary study. The data obtained in this study can be used as an indication for the design of realistic external quality assurance for laboratories involved in

  17. Fully Automated Assessment of the Severity of Parkinson’s Disease from Speech

    PubMed Central

    Bayestehtashk, Alireza; Asgari, Meysam; Shafran, Izhak; McNames, James

    2014-01-01

    For several decades now, there has been sporadic interest in automatically characterizing the speech impairment due to Parkinson’s disease (PD). Most early studies were confined to quantifying a few speech features that were easy to compute. More recent studies have adopted a machine learning approach where a large number of potential features are extracted and the models are learned automatically from the data. In the same vein, here we characterize the disease using a relatively large cohort of 168 subjects, collected from multiple (three) clinics. We elicited speech using three tasks – the sustained phonation task, the diadochokinetic task and a reading task, all within a time budget of 4 minutes, prompted by a portable device. From these recordings, we extracted 1582 features for each subject using openSMILE, a standard feature extraction tool. We compared the effectiveness of three strategies for learning a regularized regression and find that ridge regression performs better than lasso and support vector regression for our task. We refine the feature extraction to capture pitch-related cues, including jitter and shimmer, more accurately using a time-varying harmonic model of speech. Our results show that the severity of the disease can be inferred from speech with a mean absolute error of about 5.5, explaining 61% of the variance and consistently well-above chance across all clinics. Of the three speech elicitation tasks, we find that the reading task is significantly better at capturing cues than diadochokinetic or sustained phonation task. In all, we have demonstrated that the data collection and inference can be fully automated, and the results show that speech-based assessment has promising practical application in PD. The techniques reported here are more widely applicable to other paralinguistic tasks in clinical domain. PMID:25382935
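
    A sketch of the modelling step described above, assuming scikit-learn: a ridge regression maps a large acoustic feature vector to a severity score, evaluated by cross-validated mean absolute error. Feature extraction (openSMILE) is taken as already done; X and y below are random placeholders, and the regularization strength would in practice be tuned.

    import numpy as np
    from sklearn.linear_model import Ridge
    from sklearn.model_selection import cross_val_predict
    from sklearn.metrics import mean_absolute_error

    rng = np.random.default_rng(0)
    X = rng.normal(size=(168, 1582))       # 168 subjects x 1582 openSMILE features
    y = rng.uniform(0, 30, size=168)       # placeholder severity scores

    model = Ridge(alpha=10.0)              # regularization strength to be tuned
    y_hat = cross_val_predict(model, X, y, cv=5)
    print("cross-validated MAE:", round(mean_absolute_error(y, y_hat), 2))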

  18. Mammographic Breast Density Evaluation in Korean Women Using Fully Automated Volumetric Assessment

    PubMed Central

    2016-01-01

    The purpose was to present mean breast density of Korean women according to age using fully automated volumetric assessment. This study included 5,967 screening normal or benign mammograms (mean age, 46.2 ± 9.7; range, 30–89 years), from cancer-screening program. We evaluated mean fibroglandular tissue volume, breast tissue volume, volumetric breast density (VBD), and the results were 53.7 ± 30.8 cm3, 383.8 ± 205.2 cm3, and 15.8% ± 7.3%. The frequency of dense breasts and mean VBD by age group were 94.3% and 19.1% ± 6.7% for the 30s (n = 1,484), 91.4% and 17.2% ± 6.8% for the 40s (n = 2,706), 72.2% and 12.4% ± 6.2% for the 50s (n = 1,138), 44.0% and 8.6% ± 4.3% for the 60s (n = 89), 39.1% and 8.0% ± 3.8% for the 70s (n = 138), and 39.1% and 8.0% ± 3.5% for the 80s (n = 12). The frequency of dense breasts was higher in younger women (n = 4,313, 92.3%) than older women (n = 1,654, 59.8%). Mean VBD decreased with aging or menopause, and was about 16% for 46-year-old-Korean women, much higher than in other countries. The proportion of dense breasts sharply decreases in Korean women between 40 and 69 years of age. PMID:26955249

  19. Fully Automated Assessment of the Severity of Parkinson's Disease from Speech.

    PubMed

    Bayestehtashk, Alireza; Asgari, Meysam; Shafran, Izhak; McNames, James

    2015-01-01

    For several decades now, there has been sporadic interest in automatically characterizing the speech impairment due to Parkinson's disease (PD). Most early studies were confined to quantifying a few speech features that were easy to compute. More recent studies have adopted a machine learning approach where a large number of potential features are extracted and the models are learned automatically from the data. In the same vein, here we characterize the disease using a relatively large cohort of 168 subjects, collected from multiple (three) clinics. We elicited speech using three tasks - the sustained phonation task, the diadochokinetic task and a reading task, all within a time budget of 4 minutes, prompted by a portable device. From these recordings, we extracted 1582 features for each subject using openSMILE, a standard feature extraction tool. We compared the effectiveness of three strategies for learning a regularized regression and find that ridge regression performs better than lasso and support vector regression for our task. We refine the feature extraction to capture pitch-related cues, including jitter and shimmer, more accurately using a time-varying harmonic model of speech. Our results show that the severity of the disease can be inferred from speech with a mean absolute error of about 5.5, explaining 61% of the variance and consistently well-above chance across all clinics. Of the three speech elicitation tasks, we find that the reading task is significantly better at capturing cues than diadochokinetic or sustained phonation task. In all, we have demonstrated that the data collection and inference can be fully automated, and the results show that speech-based assessment has promising practical application in PD. The techniques reported here are more widely applicable to other paralinguistic tasks in clinical domain. PMID:25382935

  20. Performance assessment of automated tissue characterization for prostate H and E stained histopathology

    NASA Astrophysics Data System (ADS)

    DiFranco, Matthew D.; Reynolds, Hayley M.; Mitchell, Catherine; Williams, Scott; Allan, Prue; Haworth, Annette

    2015-03-01

    Reliable automated prostate tumor detection and characterization in whole-mount histology images is sought in many applications, including post-resection tumor staging and as ground-truth data for multi-parametric MRI interpretation. In this study, an ensemble-based supervised classification algorithm for high-resolution histology images was trained on tile-based image features including histogram and gray-level co-occurrence statistics. The algorithm was assessed using different combinations of H and E prostate slides from two separate medical centers and at two different magnifications (400x and 200x), with the aim of applying tumor classification models to new data. Slides from both datasets were annotated by expert pathologists in order to identify homogeneous cancerous and non-cancerous tissue regions of interest, which were then categorized as (1) low-grade tumor (LG-PCa), including Gleason 3 and high-grade prostatic intraepithelial neoplasia (HG-PIN), (2) high-grade tumor (HG-PCa), including various Gleason 4 and 5 patterns, or (3) non-cancerous, including benign stroma and benign prostatic hyperplasia (BPH). Classification models for both LG-PCa and HG-PCa were separately trained using a support vector machine (SVM) approach, and per-tile tumor prediction maps were generated from the resulting ensembles. Results showed high sensitivity for predicting HG-PCa with an AUC up to 0.822 using training data from both medical centres, while LG-PCa showed a lower sensitivity of 0.763 with the same training data. Visual inspection of cancer probability heatmaps from 9 patients showed that 17/19 tumors were detected, and HG-PCa generally reported less false positives than LG-PCa.
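
    A sketch of the per-tile classification stage described above, assuming scikit-learn: a bagged ensemble of support vector machines is trained on tile-level texture features (random placeholders standing in for the histogram and gray-level co-occurrence statistics) and produces a per-tile tumor probability that can be rendered as a heatmap. Recent scikit-learn versions use the estimator= keyword; older versions use base_estimator=.

    import numpy as np
    from sklearn.svm import SVC
    from sklearn.ensemble import BaggingClassifier

    rng = np.random.default_rng(1)
    X_train = rng.normal(size=(500, 32))    # placeholder tile feature vectors
    y_train = rng.integers(0, 2, size=500)  # 1 = high-grade tumor tile, 0 = other

    ensemble = BaggingClassifier(
        estimator=SVC(kernel="rbf", probability=True),  # base_estimator= in older sklearn
        n_estimators=10, max_samples=0.8, random_state=0)
    ensemble.fit(X_train, y_train)

    X_tiles = rng.normal(size=(100, 32))    # tiles from a new whole-mount section
    tumor_prob = ensemble.predict_proba(X_tiles)[:, 1]  # values for a heatmap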

  1. Cardiac activity in marine invertebrates in response to pollutants: Automated interpulse duration assessment

    SciTech Connect

    Lundebye, A.K.; Curtis, T.; Depledge, M.H.

    1995-12-31

    The updated method of the Computer-Aided Physiological Monitoring (CAPMON) system was used to study the effects of copper exposure on cardiac activity in the shore crab (Carcinus maenas) and the common mussel (Mytilus edulis). This new Automated Interpulse Duration Assessment (AIDA) system measures the time interval between heart beats, and was found to be a more sensitive tool for evaluating cardiac responses to pollutant exposure than other techniques. In addition to information regarding heart rate, also obtained by the CAPMON system (as beats per minute), the new system enables frequency distribution analysis of interpulse duration. An experiment involving C. maenas examined the effects of short term (24 h) and chronic exposure (4 weeks) to copper concentrations 0, 0.2, 0.4, 0.6 and 0.8 mgl{sup {minus}1} Cu. Subsequent recovery (6 weeks) of cardiac activity was also examined. In a second experiment mussels were exposed to one of five copper concentrations (in the range of 0--0.1 mgl{sup {minus}1} Cu) and `normal` cardiac activity was compared with activity after copper exposure. A dose-response relationship was established between copper concentration and heart rate in crabs. The control group had the longest mean inter-pulse duration, and mean interpulse duration decreased in a concentration-dependent manner for the copper treatments, reflecting an increase in heart rate. Distribution of interpulse duration changed from a variable, rather wide distribution in control crabs, to a sharp-peaked normal distribution in exposed crabs. Results after 4 weeks exposure were not significantly different from those found after 24 h. Return to normal cardiac activity was evident after a 6 week `recovery` period. Results from the mussel experiment showed burst activity followed by a decline in heart rate in response to copper exposure.

  2. Automated Liquid Microjunction Surface Sampling-HPLC-MS/MS Analysis of Drugs and Metabolites in Whole-Body Thin Tissue Sections

    SciTech Connect

    Kertesz, Vilmos; Van Berkel, Gary J

    2013-01-01

    A fully automated liquid extraction-based surface sampling system utilizing a commercially available autosampler coupled to high performance liquid chromatography-tandem mass spectrometry (HPLC-MS/MS) detection is reported. Discrete spots selected for droplet-based sampling and automated sample queue generation for both the autosampler and MS were enabled by using in-house developed software. In addition, co-registration of spatially resolved sampling position and HPLC-MS information to generate heatmaps of compounds monitored for subsequent data analysis was also available in the software. The system was evaluated with whole-body thin tissue sections from propranolol dosed rat. The hands-free operation of the system was demonstrated by creating heatmaps of the parent drug and its hydroxypropranolol glucuronide metabolites with 1 mm resolution in the areas of interest. The sample throughput was approximately 5 min/sample defined by the time needed for chromatographic separation. The spatial distributions of both the drug and its metabolites were consistent with previous studies employing other liquid extraction-based surface sampling methodologies.

  3. Decadal predictive skill assessment - ensemble and hindcast sample size impact

    NASA Astrophysics Data System (ADS)

    Sienz, Frank; Müller, Wolfgang; Pohlmann, Holger

    2015-04-01

    Hindcast (retrospective prediction) experiments have to be performed to validate decadal prediction systems. These are necessarily restricted in number due to computational constraints. From weather and seasonal prediction it is known that the ensemble size is crucial. A similar dependency is likely for decadal predictions, but differences are expected due to the differing time scales of the involved processes and the longer prediction horizon. It is shown here that the ensemble and hindcast sample sizes have a large impact on the uncertainty assessment of the ensemble mean, as well as on the detection of prediction skill. For that purpose a conceptual model is developed, which enables the systematic analysis of statistical properties and their dependencies in a framework close to that of real decadal predictions. In addition, a set of extended-range hindcast experiments has been undertaken, covering the entire 20th century.
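
    A minimal version of such a conceptual model, assuming NumPy: each hindcast start date has a predictable signal plus unpredictable noise, the ensemble mean averages the noise out, and skill (here the correlation of the ensemble mean with synthetic observations) is estimated from a limited number of start dates. Variances, sizes, and the skill metric are arbitrary illustrative choices.

    import numpy as np

    rng = np.random.default_rng(42)

    def skill(n_starts, n_ens, signal_var=1.0, noise_var=3.0, trials=2000):
        """Mean and spread of correlation skill for given hindcast/ensemble sizes."""
        rs = []
        for _ in range(trials):
            signal = rng.normal(0.0, np.sqrt(signal_var), n_starts)
            obs = signal + rng.normal(0.0, np.sqrt(noise_var), n_starts)
            members = signal[:, None] + rng.normal(0.0, np.sqrt(noise_var), (n_starts, n_ens))
            rs.append(np.corrcoef(members.mean(axis=1), obs)[0, 1])
        return np.mean(rs), np.std(rs)

    for n_ens in (3, 10, 30):
        m, s = skill(n_starts=10, n_ens=n_ens)
        print(f"ensemble size {n_ens:2d}: correlation skill {m:.2f} +/- {s:.2f}")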

  4. Environmental sampling at remote sites based on radiological screening assessments

    SciTech Connect

    Ebinger, M.H.; Hansen, W.R.; Wenz, G.; Oxenberg, T.P.

    1996-06-01

    Environmental radiation monitoring (ERM) data from remote sites on the White Sands Missile Range, New Mexico, were used to estimate doses to humans and terrestrial mammals from residual radiation deposited during testing of components containing depleted uranium (DU) and thorium (Th). ERM data were used with the DOE code RESRAD and a simple steady-state pathway code to estimate the potential adverse effects from DU and Th to workers in the contaminated zones, to hunters consuming animals from the contaminated zones, and to terrestrial mammals that inhabit the contaminated zones. Assessments of zones contaminated with DU and Th and DU alone were conducted. Radiological doses from Th and DU in soils were largest, with a maximum of about 3.5 mrem y⁻¹ in humans and a maximum of about 0.1 mrad d⁻¹ in deer. Dose estimates from DU alone in soils were significantly less, with a maximum of about 1 mrem y⁻¹ in humans and about 0.04 mrad d⁻¹ in deer. The results of the dose estimates suggest strongly that environmental sampling in these affected areas can be infrequent and still provide adequate assessments of radiological doses to workers, hunters, and terrestrial mammals.

  5. Experimental Assessment of Mouse Sociability Using an Automated Image Processing Approach.

    PubMed

    Varghese, Frency; Burket, Jessica A; Benson, Andrew D; Deutsch, Stephen I; Zemlin, Christian W

    2016-01-01

    Mouse is the preferred model organism for testing drugs designed to increase sociability. We present a method to quantify mouse sociability in which the test mouse is placed in a standardized apparatus and relevant behaviors are assessed in three different sessions (called session I, II, and III). The apparatus has three compartments (see Figure 1); the left and right compartments contain an inverted cup which can house a mouse (called "stimulus mouse"). In session I, the test mouse is placed in the cage and its mobility is characterized by the number of transitions made between compartments. In session II, a stimulus mouse is placed under one of the inverted cups and the sociability of the test mouse is quantified by the amounts of time it spends near the cup containing the enclosed stimulus mouse vs. the empty inverted cup. In session III, the inverted cups are removed and both mice interact freely. The sociability of the test mouse in session III is quantified by the number of social approaches it makes toward the stimulus mouse and by the number of times it avoids a social approach by the stimulus mouse. The automated evaluation of the movie detects the nose of the test mouse, which allows the determination of all described sociability measures in session I and II (in session III, approaches are identified automatically but classified manually). To find the nose, the image of an empty cage is digitally subtracted from each frame of the movie and the resulting image is binarized to identify the mouse pixels. The mouse tail is automatically removed and the two most distant points of the remaining mouse are determined; these are close to nose and base of tail. By analyzing the motion of the mouse and using continuity arguments, the nose is identified. (Figure 1 caption: Assessment of sociability during the three sessions. Session I (top): acclimation of the test mouse to the cage. Session II (middle): test mouse moving freely in the cage while the stimulus mouse is enclosed in an inverted cup.)
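
    A sketch of the frame-processing steps described above, assuming NumPy and SciPy: subtract the empty-cage background, binarize to obtain mouse pixels, and take the two most mutually distant foreground points as nose and tail-base candidates. Tail removal and the motion-based disambiguation of the nose are omitted, and the threshold is an arbitrary placeholder.

    import numpy as np
    from scipy.spatial.distance import pdist, squareform

    def nose_tail_candidates(frame_gray, background_gray, thresh=30, max_pts=2000):
        diff = np.abs(frame_gray.astype(int) - background_gray.astype(int))
        ys, xs = np.nonzero(diff > thresh)           # binarized mouse pixels
        pts = np.column_stack([xs, ys])
        if len(pts) < 2:
            return None
        if len(pts) > max_pts:                       # subsample to keep pdist tractable
            pts = pts[np.random.default_rng(0).choice(len(pts), max_pts, replace=False)]
        # The two most distant foreground points approximate the nose and tail base
        d = squareform(pdist(pts))
        i, j = np.unravel_index(np.argmax(d), d.shape)
        return pts[i], pts[j]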

  6. Automated determination of nitrate plus nitrite in aqueous samples with flow injection analysis using vanadium (III) chloride as reductant.

    PubMed

    Wang, Shu; Lin, Kunning; Chen, Nengwang; Yuan, Dongxing; Ma, Jian

    2016-01-01

    Determination of nitrate in aqueous samples is an important analytical objective for environmental monitoring and assessment. Here we report the first automatic flow injection analysis (FIA) of nitrate (plus nitrite) using VCl3 as reductant instead of the well-known but toxic cadmium column for reducing nitrate to nitrite. The reduced nitrate plus the nitrite originally present in the sample react with the Griess reagent (sulfanilamide and N-1-naphthylethylenediamine dihydrochloride) under acidic conditions. The resulting pink azo dye can be detected at 540 nm. The Griess reagent and VCl3 are used as a single mixed reagent solution to simplify the system. The various parameters of the FIA procedure including reagent composition, temperature, volume of the injection loop, and flow rate were carefully investigated and optimized via univariate experimental design. Under the optimized conditions, the linear range and detection limit of this method are 0-100 µM (R² = 0.9995) and 0.1 µM, respectively. The targeted analytical range can be easily extended to higher concentrations by selecting alternative detection wavelengths or increasing flow rate. The FIA system provides a sample throughput of 20 h⁻¹, which is much higher than that of previously reported manual methods based on the same chemistry. National reference solutions and different kinds of aqueous samples were analyzed with our method as well as the cadmium column reduction method. The results from our method agree well with both the certified value and the results from the cadmium column reduction method (no significant difference with P=0.95). The spiked recovery varies from 89% to 108% for samples with different matrices, showing insignificant matrix interference in this method. PMID:26695325
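
    The calibration arithmetic implied above is straightforward; the sketch below, assuming NumPy, fits absorbance at 540 nm against nitrate-plus-nitrite standards over the linear range and estimates a detection limit as three times the blank noise divided by the slope. The absorbance values and blank noise figure are invented placeholders, not the paper's data.

    import numpy as np

    std_uM = np.array([0, 10, 25, 50, 75, 100])      # standard concentrations
    absorbance = np.array([0.002, 0.051, 0.125, 0.249, 0.376, 0.498])

    slope, intercept = np.polyfit(std_uM, absorbance, 1)
    r2 = np.corrcoef(std_uM, absorbance)[0, 1] ** 2
    blank_sd = 0.0002                                # assumed baseline noise (AU)
    lod_uM = 3 * blank_sd / slope

    sample_abs = 0.180
    conc_uM = (sample_abs - intercept) / slope
    print(f"slope={slope:.4f} AU/uM, R^2={r2:.4f}, LOD~{lod_uM:.2f} uM, sample~{conc_uM:.1f} uM")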

  7. Technical note: Comparison of automated ribosomal intergenic spacer analysis and denaturing gradient gel electrophoresis to assess bacterial diversity in the rumen of sheep.

    PubMed

    Saro, C; Ranilla, M J; Cifuentes, A; Rosselló-Mora, R; Carro, M D

    2014-03-01

    The aim of this study was to compare automated ribosomal intergenic spacer analysis (ARISA) and denaturing gradient gel electrophoresis (DGGE) techniques to assess bacterial diversity in the rumen of sheep. Sheep were fed 2 diets with 70% of either alfalfa hay or grass hay, and the solid (SOL) and liquid (LIQ) phases of the rumen were sampled immediately before feeding (0 h) and at 4 and 8 h postfeeding. Both techniques detected similar differences between forages, with alfalfa hay promoting greater (P < 0.05) bacterial diversity than grass hay. In contrast, whereas ARISA analysis showed a decrease (P < 0.05) of bacterial diversity in SOL at 4 h postfeeding compared with 0 and 8 h samplings, no variations (P > 0.05) over the postfeeding period were detected by DGGE. The ARISA technique showed lower (P < 0.05) bacterial diversity in SOL than in LIQ samples at 4 h postfeeding, but no differences (P > 0.05) in bacterial diversity between both rumen phases were detected by DGGE. Under the conditions of this study, the DGGE was not sensitive enough to detect some changes in ruminal bacterial communities, and therefore ARISA was considered more accurate for assessing bacterial diversity of ruminal samples. The results highlight the influence of the fingerprinting technique used to draw conclusions on factors affecting ruminal bacterial diversity. PMID:24492564

  8. Fully automated, quantitative, noninvasive assessment of collagen fiber content and organization in thick collagen gels

    NASA Astrophysics Data System (ADS)

    Bayan, Christopher; Levitt, Jonathan M.; Miller, Eric; Kaplan, David; Georgakoudi, Irene

    2009-05-01

    Collagen is the most prominent protein of human tissues. Its content and organization define to a large extent the mechanical properties of tissue as well as its function. Methods that have been used traditionally to visualize and analyze collagen are invasive, provide only qualitative or indirect information, and have limited use in studies that aim to understand the dynamic nature of collagen remodeling and its interactions with the surrounding cells and other matrix components. Second harmonic generation (SHG) imaging emerged as a promising noninvasive modality for providing high-resolution images of collagen fibers within thick specimens, such as tissues. In this article, we present a fully automated procedure to acquire quantitative information on the content, orientation, and organization of collagen fibers. We use this procedure to monitor the dynamic remodeling of collagen gels in the absence or presence of fibroblasts over periods of 12 or 14 days. We find that an adaptive thresholding and stretching approach provides great insight to the content of collagen fibers within SHG images without the need for user input. An additional feature-erosion and feature-dilation step is useful for preserving structure and noise removal in images with low signal. To quantitatively assess the orientation of collagen fibers, we extract the orientation index (OI), a parameter based on the power distribution of the spatial-frequency-averaged, two-dimensional Fourier transform of the SHG images. To measure the local organization of the collagen fibers, we access the Hough transform of small tiles of the image and compute the entropy distribution, which represents the probability of finding the direction of fibers along a dominant direction. Using these methods we observed that the presence and number of fibroblasts within the collagen gel significantly affects the remodeling of the collagen matrix. In the absence of fibroblasts, gels contract, especially during the first few
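
    A sketch of an angular power analysis along the lines described above, assuming NumPy: the 2-D Fourier power spectrum of an SHG image is summed over spatial frequency for each direction, and the fraction of angular power within a window around the dominant direction is reported. This specific index is an illustrative stand-in, not necessarily the authors' exact orientation-index definition.

    import numpy as np

    def angular_power_index(image, n_bins=180, window_deg=15):
        """Fraction of angular spectral power within +/- window_deg of the dominant direction."""
        f = np.fft.fftshift(np.fft.fft2(image - image.mean()))
        power = np.abs(f) ** 2
        h, w = power.shape
        yy, xx = np.mgrid[0:h, 0:w]
        ang = np.degrees(np.arctan2(yy - h // 2, xx - w // 2)) % 180
        bins = np.linspace(0, 180, n_bins + 1)
        idx = (np.digitize(ang.ravel(), bins) - 1).clip(0, n_bins - 1)
        angular_power = np.bincount(idx, weights=power.ravel(), minlength=n_bins)
        peak = int(np.argmax(angular_power))
        width = int(window_deg / (180 / n_bins))
        near = [(peak + k) % n_bins for k in range(-width, width + 1)]
        return angular_power[near].sum() / angular_power.sum()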

  9. The Automated Planetary Space Station

    NASA Technical Reports Server (NTRS)

    Ivie, C. V.; Friedman, L. D.

    1977-01-01

    Results are presented for a study on mission definition and design to determine broad technology directions and needs for advanced planetary spacecraft and future planetary missions. The discussion covers mission selection, system design, and technology assessment and review for a multicomponent spacecraft exploration facility provided with nuclear power propulsion. As an example, the Automated Planetary Space Station at Jupiter is examined as a generic concept which has the capability of conducting in-depth investigations of different aspects of the entire Jovian system. Mission planning is discussed relative to low-thrust trajectory control, automatic target identification and landing, roving vehicle operation, and automated sample analysis.

  10. AUTOMATED ANALYSIS OF AQUEOUS SAMPLES CONTAINING PESTICIDES, ACIDIC/BASIC/NEUTRAL SEMIVOLATILES AND VOLATILE ORGANIC COMPOUNDS BY SOLID PHASE EXTRACTION COUPLED IN-LINE TO LARGE VOLUME INJECTION GC/MS

    EPA Science Inventory

    Data is presented on the development of a new automated system combining solid phase extraction (SPE) with GC/MS spectrometry for the single-run analysis of water samples containing a broad range of organic compounds. The system uses commercially available automated in-line 10-m...

  11. An automated tool for the design and assessment of space systems

    NASA Technical Reports Server (NTRS)

    Dalcambre, Lois M. L.; Landry, Steve P.

    1990-01-01

    Space systems can be characterized as both large and complex but they often rely on reusable subcomponents. One problem in the design of such systems is the representation and validation of the system, particularly at the higher levels of management. An automated tool is described for the representation, refinement, and validation of such complex systems based on a formal design theory, the Theory of Plausible Design. In particular, the steps necessary to automate the tool and make it a competent, usable assistant, are described.

  12. Comparative Assessment of Automated Nucleic Acid Sample Extraction Equipment for Biothreat Agents

    PubMed Central

    Kalina, Warren Vincent; Douglas, Christina Elizabeth; Coyne, Susan Rajnik

    2014-01-01

    Magnetic beads offer superior impurity removal and nucleic acid selection over older extraction methods. The performances of nucleic acid extraction of biothreat agents in blood or buffer by easyMAG, MagNA Pure, EZ1 Advanced XL, and Nordiag Arrow were evaluated. All instruments showed excellent performance in blood; however, the easyMAG had the best precision and versatility. PMID:24452173

  13. Automated extraction of 11-nor-delta9-tetrahydrocannabinol carboxylic acid from urine samples using the ASPEC XL solid-phase extraction system.

    PubMed

    Langen, M C; de Bijl, G A; Egberts, A C

    2000-09-01

    The analysis of 11-nor-delta9-tetrahydrocannabinol-carboxylic acid (THCCOOH, the major metabolite of cannabis) in urine with gas chromatography and mass spectrometry (GC-MS) and solid-phase extraction (SPE) sample preparation is well documented. Automated SPE sample preparation of THCCOOH in urine, although potentially advantageous, is to our knowledge poorly investigated. The objective of the present study was to develop and validate an automated SPE sample-preparation step using ASPEC XL suited for GC-MS confirmation analysis of THCCOOH in urine drug control. The recoveries showed that it was not possible to transfer the protocol for the manual SPE procedure with the vacuum manifold to the ASPEC XL without loss of recovery. Making the sample more lipophilic by adding 1 mL 2-propanol after hydrolysis to the urine sample in order to overcome the problem of surface adsorption of THCCOOH led to an extraction efficiency (77%) comparable to that reached with the vacuum manifold (84%). The reproducibility of the automated SPE procedure was better (coefficient of variation 5%) than that of the manual procedure (coefficient of variation 12%). The limit of detection was 1 ng/mL, and the limit of quantitation was 4 ng/mL. Precision at the 12.5-ng/mL level was as follows: mean, 12.4 and coefficient of variation, 3.0%. Potential carryover was evaluated, but a carryover effect could not be detected. It was concluded that the proposed method is suited for GC-MS confirmation urinalysis of THCCOOH for prisons and detoxification centers. PMID:10999349

  14. Development of Automated Signal and Meta-data Quality Assessment at the USGS ANSS NOC

    NASA Astrophysics Data System (ADS)

    McNamara, D.; Buland, R.; Boaz, R.; Benz, H.; Gee, L.; Leith, W.

    2007-05-01

    Real-time earthquake processing systems at the Advanced National Seismic System (ANSS) National Operations Center (NOC) rely on high-quality broadband seismic data to compute accurate earthquake locations, moment-tensor solutions, finite-fault models, Shakemaps and impact assessments. The NEIC receives real-time seismic data from the ANSS backbone, the Global Seismographic Network, ANSS regional network operators, foreign regional and national networks, the tsunami warning centers and the International Monitoring System. For many contributed stations, calibration information is not well known. In addition, equipment upgrades or changes may occur, making it difficult to maintain accurate metadata. The high degree of real-time integration of seismic data necessitates the development of automated QC tools and procedures that identify changes in instrument response, quality of waveforms and other systematic changes in station performance that might affect NEIC computations and products. We present new tools and methods that will allow the NEIC and other network operators to evaluate seismic station performance and characteristics in both the time and frequency domains using probability density functions (PDF) of power spectral densities (PSD) (McNamara and Buland, 2004). The method involves determining station standard noise conditions and characterizing deviations from the standard using the probabilistic distribution of hourly PSDs. We define the standard station noise conditions to lie within the 10th and 90th percentile of the PSD distribution. The computed PSDs are stored in a database, allowing a user to access specific time periods of PSDs (PDF subsets) and time series segments through a client-interface or programmatic database calls. This allows the user to visually define the spectral characteristics of known system transients. In order to identify instrument response changes or system transients we compare short-term spectral envelopes (1 hour to 1 day) against
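
    A sketch of the PSD-PDF baseline idea, assuming NumPy and SciPy: many hourly power spectral densities are accumulated for a channel, the 10th-90th percentile envelope defines standard station noise, and hours falling largely outside that envelope are flagged. Synthetic white noise stands in for instrument-corrected seismic traces, which in practice would come from a tool such as ObsPy, and the flagging fraction is an arbitrary choice.

    import numpy as np
    from scipy.signal import welch

    rng = np.random.default_rng(0)
    fs = 40.0                                    # samples per second
    hours = [rng.normal(size=int(fs * 3600)) for _ in range(50)]   # placeholder hourly traces

    psds = []
    for trace in hours:
        freqs, pxx = welch(trace, fs=fs, nperseg=4096)
        psds.append(10 * np.log10(pxx))          # dB, as in PSD-PDF plots
    psds = np.array(psds)

    p10, p90 = np.percentile(psds, [10, 90], axis=0)   # "standard" noise envelope

    def outside_baseline(psd_db, frac=0.5):
        """Flag an hourly PSD if more than frac of its frequencies leave the envelope."""
        out = (psd_db < p10) | (psd_db > p90)
        return bool(out.mean() > frac)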

  15. Assessing tiger population dynamics using photographic capture-recapture sampling

    USGS Publications Warehouse

    Karanth, K.U.; Nichols, J.D.; Kumar, N.S.; Hines, J.E.

    2006-01-01

    Although wide-ranging, elusive, large carnivore species, such as the tiger, are of scientific and conservation interest, rigorous inferences about their population dynamics are scarce because of methodological problems of sampling populations at the required spatial and temporal scales. We report the application of a rigorous, noninvasive method for assessing tiger population dynamics to test model-based predictions about population viability. We obtained photographic capture histories for 74 individual tigers during a nine-year study involving 5725 trap-nights of effort. These data were modeled under a likelihood-based, 'robust design' capture-recapture analytic framework. We explicitly modeled and estimated ecological parameters such as time-specific abundance, density, survival, recruitment, temporary emigration, and transience, using models that incorporated effects of factors such as individual heterogeneity, trap-response, and time on probabilities of photo-capturing tigers. The model estimated a random temporary emigration parameter of 0.10 ± 0.069 (values are estimated mean ± SE). When scaled to an annual basis, tiger survival rates were estimated at S = 0.77 ± 0.051, and the estimated probability that a newly caught animal was a transient was 0.18 ± 0.11. During the period when the sampled area was of constant size, the estimated population size Nt varied from 17 ± 1.7 to 31 ± 2.1 tigers, with a geometric mean rate of annual population change estimated as 1.03 ± 0.020, representing a 3% annual increase. The estimated recruitment of new animals, Bt, varied from 0 ± 3.0 to 14 ± 2.9 tigers. Population density estimates, D, ranged from 7.33 ± 0.8 tigers/100 km² to 21.73 ± 1.7 tigers/100 km² during the study. Thus, despite substantial annual losses and temporal variation in recruitment, the tiger density remained at relatively high levels in Nagarahole. Our results are consistent with the hypothesis that protected wild tiger populations can remain

  16. Taking Advantage of Automated Assessment of Student-Constructed Graphs in Science

    ERIC Educational Resources Information Center

    Vitale, Jonathan M.; Lai, Kevin; Linn, Marcia C.

    2015-01-01

    We present a new system for automated scoring of graph construction items that address complex science concepts, feature qualitative prompts, and support a range of possible solutions. This system utilizes analysis of spatial features (e.g., slope of a line) to evaluate potential student ideas represented within graphs. Student ideas are then…

  17. The Implementation of an Automated Assessment Feedback and Quality Assurance System for ICT Courses

    ERIC Educational Resources Information Center

    Debuse, J.; Lawley, M.; Shibl, R.

    2007-01-01

    Providing detailed, constructive and helpful feedback is an important contribution to effective student learning. Quality assurance is also required to ensure consistency across all students and reduce error rates. However, with increasing workloads and student numbers these goals are becoming more difficult to achieve. An automated feedback…

  18. Assessing the Potential Value for an Automated Body Condition Scoring System through Stochastic Simulation

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Automated body condition scoring (BCS) through extraction of information from digital images has been demonstrated to be feasible; and commercial technologies are being developed. The primary objective of this research was to identify the factors that influence the potential profitability of investi...

  19. Assessing Racial Microaggression Distress in a Diverse Sample.

    PubMed

    Torres-Harding, Susan; Turner, Tasha

    2015-12-01

    Racial microaggressions are everyday subtle or ambiguous racially related insults, slights, mistreatment, or invalidations. Racial microaggressions are a type of perceived racism that may negatively impact the health and well-being of people of color in the United States. This study examined the reliability and validity of the Racial Microaggression Scale distress subscales, which measure the perceived stressfulness of six types of microaggression experiences in a racially and ethnically diverse sample. These subscales exhibited acceptable to good internal consistency. The distress subscales also evidenced good convergent validity; the distress subscales were positively correlated with additional measures of stressfulness due to experiencing microaggressions or everyday discrimination. When controlling for the frequency of one's exposure to microaggression incidents, some racial/ethnic group differences were found. Asian Americans reported comparatively lower distress and Latinos reporting comparatively higher distress in response to Foreigner, Low-Achieving, Invisibility, and Environmental microaggressions. African Americans reported higher distress than the other groups in response to Environmental microaggressions. Results suggest that the Racial Microaggressions Scale distress subscales may aid health professionals in assessing the distress elicited by different types of microaggressions. In turn, this may facilitate diagnosis and treatment planning in order to provide multiculturally competent care for African American, Latino, and Asian American clients. PMID:25237154

  20. Using Group Projects to Assess the Learning of Sampling Distributions

    ERIC Educational Resources Information Center

    Neidigh, Robert O.; Dunkelberger, Jake

    2012-01-01

    In an introductory business statistics course, student groups used sample data to compare a set of sample means to the theoretical sampling distribution. Each group was given a production measurement with a population mean and standard deviation. The groups were also provided an excel spreadsheet with 40 sample measurements per week for 52 weeks…

  1. Negative symptoms in schizophrenia: a study in a large clinical sample of patients using a novel automated method

    PubMed Central

    Patel, Rashmi; Jayatilleke, Nishamali; Broadbent, Matthew; Chang, Chin-Kuo; Foskett, Nadia; Gorrell, Genevieve; Hayes, Richard D; Jackson, Richard; Johnston, Caroline; Shetty, Hitesh; Roberts, Angus; McGuire, Philip; Stewart, Robert

    2015-01-01

    Objectives To identify negative symptoms in the clinical records of a large sample of patients with schizophrenia using natural language processing and assess their relationship with clinical outcomes. Design Observational study using an anonymised electronic health record case register. Setting South London and Maudsley NHS Trust (SLaM), a large provider of inpatient and community mental healthcare in the UK. Participants 7678 patients with schizophrenia receiving care during 2011. Main outcome measures Hospital admission, readmission and duration of admission. Results 10 different negative symptoms were ascertained with precision statistics above 0.80. 41% of patients had 2 or more negative symptoms. Negative symptoms were associated with younger age, male gender and single marital status, and with increased likelihood of hospital admission (OR 1.24, 95% CI 1.10 to 1.39), longer duration of admission (β-coefficient 20.5 days, 7.6–33.5), and increased likelihood of readmission following discharge (OR 1.58, 1.28 to 1.95). Conclusions Negative symptoms were common and associated with adverse clinical outcomes, consistent with evidence that these symptoms account for much of the disability associated with schizophrenia. Natural language processing provides a means of conducting research in large representative samples of patients, using data recorded during routine clinical practice. PMID:26346872

  2. Ecological Momentary Assessments and Automated Time Series Analysis to Promote Tailored Health Care: A Proof-of-Principle Study

    PubMed Central

    Emerencia, Ando C; Bos, Elisabeth H; Rosmalen, Judith GM; Riese, Harriëtte; Aiello, Marco; Sytema, Sjoerd; de Jonge, Peter

    2015-01-01

    Background Health promotion can be tailored by combining ecological momentary assessments (EMA) with time series analysis. This combined method allows for studying the temporal order of dynamic relationships among variables, which may provide concrete indications for intervention. However, application of this method in health care practice is hampered because analyses are conducted manually and advanced statistical expertise is required. Objective This study aims to show how this limitation can be overcome by introducing automated vector autoregressive modeling (VAR) of EMA data and to evaluate its feasibility through comparisons with results of previously published manual analyses. Methods We developed a Web-based open source application, called AutoVAR, which automates time series analyses of EMA data and provides output that is intended to be interpretable by nonexperts. The statistical technique we used was VAR. AutoVAR tests and evaluates all possible VAR models within a given combinatorial search space and summarizes their results, thereby replacing the researcher’s tasks of conducting the analysis, making an informed selection of models, and choosing the best model. We compared the output of AutoVAR to the output of a previously published manual analysis (n=4). Results An illustrative example consisting of 4 analyses was provided. Compared to the manual output, the AutoVAR output presents similar model characteristics and statistical results in terms of the Akaike information criterion, the Bayesian information criterion, and the test statistic of the Granger causality test. Conclusions Results suggest that automated analysis and interpretation of time series is feasible. Compared to a manual procedure, the automated procedure is more robust and can save days of time. These findings may pave the way for using time series analysis for health promotion on a larger scale. AutoVAR was evaluated using the results of a previously conducted manual analysis
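
    The sketch below illustrates the core idea on which AutoVAR builds: fitting candidate VAR models to EMA time series, selecting among them by information criteria, and running a Granger causality test. It uses statsmodels; the variable names and the simple search over lag orders are illustrative assumptions, not the actual AutoVAR search space.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.api import VAR

# Two hypothetical EMA variables observed over 90 days (synthetic data).
rng = np.random.default_rng(0)
ema = pd.DataFrame(rng.normal(size=(90, 2)), columns=["mood", "activity"])

# Fit candidate VAR models and keep the one with the lowest AIC.
best = None
for lags in range(1, 6):
    res = VAR(ema).fit(lags)
    if best is None or res.aic < best[1]:
        best = (lags, res.aic, res)

lags, aic, res = best
print(f"selected lag order: {lags}, AIC: {aic:.2f}, BIC: {res.bic:.2f}")

# Granger causality test between the two hypothetical variables.
print(res.test_causality("mood", ["activity"], kind="f").summary())
```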

  3. Aquarius's Instrument Science Data System (ISDS) Automated to Acquire, Process, Trend Data and Produce Radiometric System Assessment Reports

    NASA Technical Reports Server (NTRS)

    2008-01-01

    The Aquarius Radiometer, a subsystem of the Aquarius Instrument, required a data acquisition ground system to support calibration and radiometer performance assessment. To support calibration and compose performance assessments, we developed an automated system which uploaded raw data to an FTP server and saved raw and processed data to a database. This paper details the overall functionalities of the Aquarius Instrument Science Data System (ISDS) and the individual electrical ground support equipment (EGSE) which produced data files that were infused into the ISDS. Real-time EGSEs include an ICDS Simulator, Calibration GSE, a LabVIEW-controlled power supply, and a chamber data acquisition system. The ICDS Simulator serves as the test conductor's primary workstation, collecting radiometer housekeeping (HK) and science data and passing commands and HK telemetry collection requests to the radiometer. The Calibration GSE (Radiometer Active Test Source) provides source choice from multiple targets for the radiometer external calibration. The Power Supply GSE, controlled by LabVIEW, provides real-time voltage and current monitoring of the radiometer. Finally, the chamber data acquisition system produces data reflecting chamber vacuum pressure, thermistor temperatures, AVG, and watts. Each GSE system produces text-based data files every two to six minutes and automatically copies the data files to the Central Archiver PC. The Archiver PC stores the data files, schedules automated uploads of these files to an external FTP server, and accepts requests to copy all data files to the ISDS for offline data processing and analysis. The Aquarius Radiometer ISDS contains PHP and MATLAB programs to parse, process, and save all data to a MySQL database. Analysis tools (MATLAB programs) in the ISDS system are capable of displaying radiometer science, telemetry, and auxiliary data in near real time as well as performing data analysis and producing automated performance assessment reports of the Aquarius

  4. Automated ground data acquisition and processing system for calibration and performance assessment of the EO-1 Advanced Land Imager

    NASA Astrophysics Data System (ADS)

    Viggh, Herbert E. M.; Mendenhall, Jeffrey A.; Sayer, Ronald W.; Stuart, J. S.; Gibbs, Margaret D.

    1999-09-01

    The calibration and performance assessment of the Earth Observing-1 (EO-1) Advanced Land Imager (ALI) required a ground data system for acquiring and processing ALI data. In order to meet tight schedule and budget requirements, an automated system was developed that could be run by a single operator. This paper describes the overall system and the individual Electrical Ground Support Equipment (EGSE) and computer components used. The ALI Calibration Control Node (ACCN) serves as a test executive with a single graphical user interface to the system, controlling calibration equipment and issuing data acquisition and processing requests to the other EGSE and computers. EGSE1, a custom data acquisition system, collects ALI science data and also passes ALI commanding and housekeeping telemetry collection requests to EGSE2 and EGSE3, which are implemented on an ASIST workstation. The performance assessment machine stores and processes collected ALI data, automatically displaying quick-look processing results. The custom communications protocol developed to interface these various machines and to automate their interactions is described, including the various modes of operation needed to support spatial, radiometric, spectral, and functional calibration and performance assessment of the ALI.

  5. An automated method for analysis of microcirculation videos for accurate assessment of tissue perfusion

    PubMed Central

    2012-01-01

    Background Imaging of the human microcirculation in real-time has the potential to detect injuries and illnesses that disturb the microcirculation at earlier stages and may improve the efficacy of resuscitation. Despite advanced imaging techniques to monitor the microcirculation, there are currently no tools for the near real-time analysis of the videos produced by these imaging systems. An automated system tool that can extract microvasculature information and monitor changes in tissue perfusion quantitatively might be invaluable as a diagnostic and therapeutic endpoint for resuscitation. Methods The experimental algorithm automatically extracts the microvascular network and quantitatively measures changes in the microcirculation. There are two main parts in the algorithm: video processing and vessel segmentation. Microcirculatory videos are first stabilized in a video processing step to remove motion artifacts. In the vessel segmentation process, the microvascular network is extracted using multiple level thresholding and pixel verification techniques. Threshold levels are selected using histogram information of a set of training video recordings. Pixel-by-pixel differences are calculated throughout the frames to identify active blood vessels and capillaries with flow. Results Sublingual microcirculatory videos are recorded from anesthetized swine at baseline and during hemorrhage using a hand-held Side-stream Dark Field (SDF) imaging device to track changes in the microvasculature during hemorrhage. Automatically segmented vessels in the recordings are analyzed visually and the functional capillary density (FCD) values calculated by the algorithm are compared for both healthy baseline and hemorrhagic conditions. These results were compared to independently made FCD measurements using a well-known semi-automated method. Results of the fully automated algorithm demonstrated a significant decrease of FCD values. Similar, but more variable FCD values were calculated
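
    A hedged sketch of the vessel-segmentation and FCD ideas described above, using scikit-image; the threshold levels, majority-vote pixel verification, and skeleton-length FCD definition are illustrative assumptions rather than the authors' exact algorithm.

```python
import numpy as np
from skimage import morphology

def segment_vessels(frame, thresholds=(0.35, 0.45, 0.55)):
    """Combine several global thresholds and keep pixels that most of them agree on."""
    frame = (frame - frame.min()) / (frame.max() - frame.min() + 1e-9)
    masks = [frame < t for t in thresholds]          # vessels appear dark in SDF images
    vote = np.sum(masks, axis=0) >= 2                # simple pixel-verification step
    return morphology.remove_small_objects(vote, min_size=50)

def functional_capillary_density(mask, pixel_size_um=1.0):
    """Approximate FCD as skeleton length per field-of-view area (1/um)."""
    skeleton = morphology.skeletonize(mask)
    length_um = skeleton.sum() * pixel_size_um
    area_um2 = mask.size * pixel_size_um ** 2
    return length_um / area_um2
```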

  6. Automation in Clinical Microbiology

    PubMed Central

    Ledeboer, Nathan A.

    2013-01-01

    Historically, the trend toward automation in clinical pathology laboratories has largely bypassed the clinical microbiology laboratory. In this article, we review the historical impediments to automation in the microbiology laboratory and offer insight into the reasons why we believe that we are on the cusp of a dramatic change that will sweep a wave of automation into clinical microbiology laboratories. We review the currently available specimen-processing instruments as well as the total laboratory automation solutions. Lastly, we outline the types of studies that will need to be performed to fully assess the benefits of automation in microbiology laboratories. PMID:23515547

  7. Manual versus Automated Rodent Behavioral Assessment: Comparing Efficacy and Ease of Bederson and Garcia Neurological Deficit Scores to an Open Field Video-Tracking System

    PubMed Central

    Desland, Fiona A.; Afzal, Aqeela; Warraich, Zuha; Mocco, J

    2014-01-01

    Animal models of stroke have been crucial in advancing our understanding of the pathophysiology of cerebral ischemia. Currently, the standards for determining neurological deficit in rodents are the Bederson and Garcia scales, manual assessments scoring animals based on parameters ranked on a narrow scale of severity. Automated open field analysis of a live-video tracking system that analyzes animal behavior may provide a more sensitive test. Results obtained from the manual Bederson and Garcia scales did not show significant differences between pre- and post-stroke animals in a small cohort. When using the same cohort, however, post-stroke data obtained from automated open field analysis showed significant differences in several parameters. Furthermore, large cohort analysis also demonstrated increased sensitivity with automated open field analysis versus the Bederson and Garcia scales. These early data indicate use of automated open field analysis software may provide a more sensitive assessment when compared to traditional Bederson and Garcia scales. PMID:24526841

  8. Preliminary biogeochemical assessment of EPICA LGM and Holocene ice samples

    NASA Astrophysics Data System (ADS)

    Bulat, S.; Alekhina, I.; Marie, D.; Wagenbach, D.; Raynaud, D.; Petit, J. R.

    2009-04-01

    weak signals could be generated, which are now being cloned. The signals were hard to reproduce because of the rather low sample volumes. More ice volume is needed to make the biosignal stronger and reproducible. In the meantime, we are adjusting the PCR and, in addition, testing a DNA repair-enzyme cocktail in case of DNA damage. As a preliminary conclusion we would like to highlight the following: both Holocene and LGM ice samples (EDC99 and EDML) are very clean in terms of ultra-low biomass and ultra-low DOC content. The most basal ice of the EDC and EDML ice cores could help in assessing microbial biomass and diversity, if present, under the glacier at the ice-bedrock boundary. * The present-day consortium includes S. Bulat, I. Alekhina, P. Normand, D. Prieur, J-R. Petit and D. Raynaud (France) and E. Willerslev and J.P. Steffensen (Denmark)

  9. Space Station Freedom automation and robotics: An assessment of the potential for increased productivity

    NASA Technical Reports Server (NTRS)

    Weeks, David J.; Zimmerman, Wayne F.; Swietek, Gregory E.; Reid, David H.; Hoffman, Ronald B.; Stammerjohn, Lambert W., Jr.; Stoney, William; Ghovanlou, Ali H.

    1990-01-01

    This report presents the results of a study performed in support of the Space Station Freedom Advanced Development Program, under the sponsorship of the Space Station Engineering (Code MT), Office of Space Flight. The study consisted of the collection, compilation, and analysis of lessons learned, crew time requirements, and other factors influencing the application of advanced automation and robotics, with emphasis on potential improvements in productivity. The lessons learned data collected were based primarily on Skylab, Spacelab, and other Space Shuttle experiences, consisting principally of interviews with current and former crew members and other NASA personnel with relevant experience. The objectives of this report are to present a summary of this data and its analysis, and to present conclusions regarding promising areas for the application of advanced automation and robotics technology to the Space Station Freedom and the potential benefits in terms of increased productivity. In this study, primary emphasis was placed on advanced automation technology because of its fairly extensive utilization within private industry including the aerospace sector. In contrast, other than the Remote Manipulator System (RMS), there has been relatively limited experience with advanced robotics technology applicable to the Space Station. This report should be used as a guide and is not intended to be used as a substitute for official Astronaut Office crew positions on specific issues.

  10. Automated ambulatory assessment of cognitive performance, environmental conditions, and motor activity during military operations

    NASA Astrophysics Data System (ADS)

    Lieberman, Harris R.; Kramer, F. Matthew; Montain, Scott J.; Niro, Philip; Young, Andrew J.

    2005-05-01

    Until recently scientists had limited opportunities to study human cognitive performance in non-laboratory, fully ambulatory situations. Recently, advances in technology have made it possible to extend behavioral assessment to the field environment. One of the first devices to measure human behavior in the field was the wrist-worn actigraph. This device, now widely employed, can acquire minute-by-minute information on an individual's level of motor activity. Actigraphs can, with reasonable accuracy, distinguish sleep from waking, the most critical and basic aspect of human behavior. However, rapid technologic advances have provided the opportunity to collect much more information from fully ambulatory humans. Our laboratory has developed a series of wrist-worn devices, which are not much larger than a watch, which can assess simple and choice reaction time, vigilance and memory. In addition, the devices can concurrently assess motor activity with much greater temporal resolution than the standard actigraph. Furthermore, they continuously monitor multiple environmental variables including temperature, humidity, sound and light. We have employed these monitors during training and simulated military operations to collect information that would typically be unavailable under such circumstances. In this paper we will describe various versions of the vigilance monitor and how each successive version extended the capabilities of the device. Samples of data from several studies are presented, including studies conducted in harsh field environments during simulated infantry assaults, a Marine Corps Officer training course and mechanized infantry (Stryker) operations. The monitors have been useful for documenting environmental conditions experienced by wearers, studying patterns of sleep and activity and examining the effects of nutritional manipulations on warfighter performance.

  11. Parenchymal texture analysis in digital mammography: A fully automated pipeline for breast cancer risk assessment

    PubMed Central

    Zheng, Yuanjie; Keller, Brad M.; Ray, Shonket; Wang, Yan; Conant, Emily F.; Gee, James C.; Kontos, Despina

    2015-01-01

    Purpose: Mammographic percent density (PD%) is known to be a strong risk factor for breast cancer. Recent studies also suggest that parenchymal texture features, which are more granular descriptors of the parenchymal pattern, can provide additional information about breast cancer risk. To date, most studies have measured mammographic texture within selected regions of interest (ROIs) in the breast, which cannot adequately capture the complexity of the parenchymal pattern throughout the whole breast. To better characterize patterns of the parenchymal tissue, the authors have developed a fully automated software pipeline based on a novel lattice-based strategy to extract a range of parenchymal texture features from the entire breast region. Methods: Digital mammograms from 106 cases with 318 age-matched controls were retrospectively analyzed. The lattice-based approach is based on a regular grid virtually overlaid on each mammographic image. Texture features are computed from the intersection (i.e., lattice) points of the grid lines within the breast, using a local window centered at each lattice point. Using this strategy, a range of statistical (gray-level histogram, co-occurrence, and run-length) and structural (edge-enhancing, local binary pattern, and fractal dimension) features are extracted. To cover the entire breast, the size of the local window for feature extraction is set equal to the lattice grid spacing and optimized experimentally by evaluating different window sizes. The association between the lattice-based texture features and breast cancer was evaluated using logistic regression with leave-one-out cross validation and further compared to that of breast PD% and commonly used single-ROI texture features extracted from the retroareolar or the central breast region. Classification performance was evaluated using the area under the curve (AUC) of the receiver operating characteristic (ROC). DeLong’s test was used to compare the different ROCs in
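
    A hedged sketch of the lattice-based strategy described above: gray-level co-occurrence (GLCM) features are computed in a local window centred on each grid point that falls inside the breast. The window size, choice of features, and 8-bit input assumption are illustrative simplifications, and the function names follow recent scikit-image releases.

```python
import numpy as np
from skimage.feature import graycomatrix, graycoprops

def lattice_features(image, breast_mask, spacing=64):
    """image: 8-bit grayscale mammogram; breast_mask: boolean mask of the breast region."""
    feats = []
    half = spacing // 2                       # window size set equal to the grid spacing
    for r in range(half, image.shape[0] - half, spacing):
        for c in range(half, image.shape[1] - half, spacing):
            if not breast_mask[r, c]:
                continue                      # skip lattice points outside the breast
            win = image[r - half:r + half, c - half:c + half]
            glcm = graycomatrix(win, distances=[1], angles=[0],
                                levels=256, symmetric=True, normed=True)
            feats.append((graycoprops(glcm, "contrast")[0, 0],
                          graycoprops(glcm, "homogeneity")[0, 0]))
    return np.asarray(feats)                  # one feature row per lattice point
```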

  12. Method and apparatus for automated processing and aliquoting of whole blood samples for analysis in a centrifugal fast analyzer

    DOEpatents

    Burtis, Carl A.; Johnson, Wayne F.; Walker, William A.

    1988-01-01

    A rotor and disc assembly for use in a centrifugal fast analyzer. The assembly is designed to process multiple samples of whole blood followed by aliquoting of the resultant serum into precisely measured samples for subsequent chemical analysis. The assembly requires minimal operator involvement with no mechanical pipetting. The system comprises (1) a whole blood sample disc, (2) a serum sample disc, (3) a sample preparation rotor, and (4) an analytical rotor. The blood sample disc and serum sample disc are designed with a plurality of precision bore capillary tubes arranged in a spoked array. Samples of blood are loaded into the blood sample disc in capillary tubes filled by capillary action and centrifugally discharged into cavities of the sample preparation rotor where separation of serum and solids is accomplished. The serum is loaded into the capillaries of the serum sample disc by capillary action and subsequently centrifugally expelled into cuvettes of the analytical rotor for analysis by conventional methods.

  13. Method and apparatus for automated processing and aliquoting of whole blood samples for analysis in a centrifugal fast analyzer

    DOEpatents

    Burtis, C.A.; Johnson, W.F.; Walker, W.A.

    1985-08-05

    A rotor and disc assembly for use in a centrifugal fast analyzer. The assembly is designed to process multiple samples of whole blood followed by aliquoting of the resultant serum into precisely measured samples for subsequent chemical analysis. The assembly requires minimal operator involvement with no mechanical pipetting. The system comprises: (1) a whole blood sample disc; (2) a serum sample disc; (3) a sample preparation rotor; and (4) an analytical rotor. The blood sample disc and serum sample disc are designed with a plurality of precision bore capillary tubes arranged in a spoked array. Samples of blood are loaded into the blood sample disc by capillary action and centrifugally discharged into cavities of the sample preparation rotor where separation of serum and solids is accomplished. The serum is loaded into the capillaries of the serum sample disc by capillary action and subsequently centrifugally expelled into cuvettes of the analytical rotor for analysis by conventional methods. 5 figs.

  14. Laboratory automation in clinical bacteriology: what system to choose?

    PubMed

    Croxatto, A; Prod'hom, G; Faverjon, F; Rochais, Y; Greub, G

    2016-03-01

    Automation was introduced many years ago in several diagnostic disciplines such as chemistry, haematology and molecular biology. The first laboratory automation system for clinical bacteriology was released in 2006, and it rapidly proved its value by increasing productivity, allowing a continuous increase in sample volumes despite limited budgets and personnel shortages. Today, two major manufacturers, BD Kiestra and Copan, are commercializing partial or complete laboratory automation systems for bacteriology. The laboratory automation systems are rapidly evolving to provide improved hardware and software solutions to optimize laboratory efficiency. However, the complex parameters of the laboratory and automation systems must be considered to determine the best system for each given laboratory. We address several topics on laboratory automation that may help clinical bacteriologists to understand the particularities and operative modalities of the different systems. We present (a) a comparison of the engineering and technical features of the various elements composing the two different automated systems currently available, (b) the system workflows of partial and complete laboratory automation, which define the basis for laboratory reorganization required to optimize system efficiency, (c) the concept of digital imaging and telebacteriology, (d) the connectivity of laboratory automation to the laboratory information system, (e) the general advantages and disadvantages as well as the expected impacts provided by laboratory automation and (f) the laboratory data required to conduct a workflow assessment to determine the best configuration of an automated system for the laboratory activities and specificities. PMID:26806135

  15. An automated system to mount cryo-cooled protein crystals on a synchrotron beam line, using compact sample cassettes and a small-scale robot

    PubMed Central

    Cohen, Aina E.; Ellis, Paul J.; Miller, Mitchell D.; Deacon, Ashley M.; Phizackerley, R. Paul

    2014-01-01

    An automated system for mounting and dismounting pre-frozen crystals has been implemented at the Stanford Synchrotron Radiation Laboratory (SSRL). It is based on a small industrial robot and compact cylindrical cassettes, each holding up to 96 crystals mounted on Hampton Research sample pins. For easy shipping and storage, the cassette fits inside several popular dry-shippers and long-term storage Dewars. A dispensing Dewar holds up to three cassettes in liquid nitrogen adjacent to the beam line goniometer. The robot uses a permanent magnet tool to extract samples from, and insert samples into a cassette, and a cryo-tong tool to transfer them to and from the beam line goniometer. The system is simple, with few moving parts, reliable in operation and convenient to use. PMID:24899734

  16. Toxicological Assessment of ISS Air Quality: Contingency Sampling - February 2013

    NASA Technical Reports Server (NTRS)

    Meyers, Valerie

    2013-01-01

    Two grab sample containers (GSCs) were collected by crew members onboard ISS in response to a vinegar-like odor in the US Lab. On February 5, the first sample was collected approximately 1 hour after the odor was noted by the crew in the forward portion of the Lab. The second sample was collected on February 22 when a similar odor was noted and localized to the end ports of the microgravity science glovebox (MSG). The crewmember removed a glove from the MSG and collected the GSC inside the glovebox volume. Both samples were returned on SpaceX-2 for ground analysis.

  17. Can we predict habitat quality from space? A multi-indicator assessment based on an automated knowledge-driven system

    NASA Astrophysics Data System (ADS)

    Vaz, Ana Sofia; Marcos, Bruno; Gonçalves, João; Monteiro, António; Alves, Paulo; Civantos, Emilio; Lucas, Richard; Mairota, Paola; Garcia-Robles, Javier; Alonso, Joaquim; Blonda, Palma; Lomba, Angela; Honrado, João Pradinho

    2015-05-01

    There is an increasing need of effective monitoring systems for habitat quality assessment. Methods based on remote sensing (RS) features, such as vegetation indices, have been proposed as promising approaches, complementing methods based on categorical data to support decision making. Here, we evaluate the ability of Earth observation (EO) data, based on a new automated, knowledge-driven system, to predict several indicators for oak woodland habitat quality in a Portuguese Natura 2000 site. We collected in-field data on five habitat quality indicators in vegetation plots from woodland habitats of a landscape undergoing agricultural abandonment. Forty-three predictors were calculated, and a multi-model inference framework was applied to evaluate the predictive strength of each data set for the several quality indicators. Three indicators were mainly explained by predictors related to landscape and neighbourhood structure. Overall, competing models based on the products of the automated knowledge-driven system had the best performance to explain quality indicators, compared to models based on manually classified land cover data. The system outputs in terms of both land cover classes and spectral/landscape indices were considered in the study, which highlights the advantages of combining EO data with RS techniques and improved modelling based on sound ecological hypotheses. Our findings strongly suggest that some features of habitat quality, such as structure and habitat composition, can be effectively monitored from EO data combined with in-field campaigns as part of an integrative monitoring framework for habitat status assessment.
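
    As a small illustration of the multi-model inference step mentioned above, the snippet below converts the AIC values of a set of candidate habitat-quality models into Akaike weights; the AIC values themselves are hypothetical.

```python
import numpy as np

# Hypothetical AIC values for four candidate habitat-quality models.
aic = np.array([210.3, 212.1, 215.7, 209.8])

delta = aic - aic.min()                                  # AIC differences
weights = np.exp(-0.5 * delta) / np.exp(-0.5 * delta).sum()  # Akaike weights
for i, (d, w) in enumerate(zip(delta, weights), start=1):
    print(f"model {i}: dAIC = {d:.1f}, weight = {w:.2f}")
```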

  18. Homogeneous sample preparation for automated high throughput analysis with matrix-assisted laser desorption/ionisation time-of-flight mass spectrometry.

    PubMed

    Onnerfjord, P; Ekström, S; Bergquist, J; Nilsson, J; Laurell, T; Marko-Varga, G

    1999-01-01

    This work presents a simple method for obtaining homogeneous sample surfaces in matrix-assisted laser desorption/ionisation time-of-flight mass spectrometry (MALDI-TOFMS) for the automated analysis of peptides and proteins. The sample preparation method is based on applying the sample/matrix mixture onto a pre-deposited highly diluted matrix spot. The pre-deposited crystals act as seeds for the new sample containing crystals which become much smaller in size and more evenly distributed than with conventional methods. This 'seed-layer' method was developed, optimised and compared with the dried-droplet method using peptides and proteins in the 1000-20,000 Da range. The seed-layer method increases the surface homogeneity, spot to spot reproducibility and sample washability as compared with the commonly used dried-droplet method. This methodology is applicable to alpha-cyanohydroxycinnamic acid, sinapinic acid and ferulic acid, which all form homogeneous crystal surfaces. Within-spot variation and between-spot variation was investigated using statistics at a 95% confidence level (n = 36). The statistical values were generated from more than 5000 data points collected from 500 spectra. More than 90% of the sample locations results in high intensity spectra with relatively low standard deviations (RSDs). Typically obtained data showed an RSD of 19-35% within a sample spot as well as in-between spots for proteins, and an RSD of < or = 50% for peptides. Linear calibration curves were obtained within one order of magnitude using internal calibration with a point-RSD of 3% (n = 10). The sample homogeneity allows mass spectra (average of 16 laser shots) to be obtained on each individual sample within 15 sec, whereby a 100 spot target plate can be run in 25 min. High density target plates using the seed-layer method were prepared by spotting approximately 100 picoliter droplets onto the target, resulting in sample spots < or = 500 microns in diameter using a flow
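
    The within-spot and between-spot figures quoted above are relative standard deviations (RSD = 100 × SD / mean); the short example below computes both from made-up peak-intensity values.

```python
import numpy as np

def rsd(values):
    """Relative standard deviation in percent."""
    values = np.asarray(values, dtype=float)
    return 100.0 * values.std(ddof=1) / values.mean()

within_spot = [9.8e3, 1.1e4, 1.2e4, 1.0e4, 9.5e3]   # peak intensities within one spot (made up)
spot_means = [1.05e4, 9.7e3, 1.15e4, 1.0e4]         # mean intensity per spot (made up)
print(f"within-spot RSD: {rsd(within_spot):.1f}%")
print(f"between-spot RSD: {rsd(spot_means):.1f}%")
```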

  19. Automated DNA Sequencing System

    SciTech Connect

    Armstrong, G.A.; Ekkebus, C.P.; Hauser, L.J.; Kress, R.L.; Mural, R.J.

    1999-04-25

    Oak Ridge National Laboratory (ORNL) is developing a core DNA sequencing facility to support biological research endeavors at ORNL and to conduct basic sequencing automation research. This facility is novel because its development is based on existing standard biology laboratory equipment; thus, the development process is of interest to the many small laboratories trying to use automation to control costs and increase throughput. Before automation, biology laboratory personnel purified DNA, completed cycle sequencing, and prepared 96-well sample plates with commercially available hardware designed specifically for each step in the process. Following purification and thermal cycling, an automated sequencing machine was used for the sequencing. A technician handled all movement of the 96-well sample plates between machines. To automate the process, ORNL is adding a CRS Robotics A-465 arm, ABI 377 sequencing machine, automated centrifuge, automated refrigerator, and possibly an automated SpeedVac. The entire system will be integrated with one central controller that will direct each machine and the robot. The goal of this system is to completely automate the sequencing procedure from bacterial cell samples through ready-to-be-sequenced DNA and ultimately to completed sequence. The system will be flexible and will accommodate different chemistries than existing automated sequencing lines. The system will be expanded in the future to include colony picking and/or actual sequencing. This discrete-event DNA sequencing system will demonstrate that smaller sequencing labs can automate cost-effectively as the laboratory grows.

  20. Maximizing the Value of Mobile Health Monitoring by Avoiding Redundant Patient Reports: Prediction of Depression-Related Symptoms and Adherence Problems in Automated Health Assessment Services

    PubMed Central

    Sussman, Jeremy B; Pfeiffer, Paul N; Silveira, Maria J; Singh, Satinder; Lavieri, Mariel S

    2013-01-01

    Background Interactive voice response (IVR) calls enhance health systems’ ability to identify health risk factors, thereby enabling targeted clinical follow-up. However, redundant assessments may increase patient dropout and represent a lost opportunity to collect more clinically useful data. Objective We determined the extent to which previous IVR assessments predicted subsequent responses among patients with depression diagnoses, potentially obviating the need to repeatedly collect the same information. We also evaluated whether frequent (ie, weekly) IVR assessment attempts were significantly more predictive of patients’ subsequent reports than information collected biweekly or monthly. Methods Using data from 1050 IVR assessments for 208 patients with depression diagnoses, we examined the predictability of four IVR-reported outcomes: moderate/severe depressive symptoms (score ≥10 on the PHQ-9), fair/poor general health, poor antidepressant adherence, and days in bed due to poor mental health. We used logistic models with training and test samples to predict patients’ IVR responses based on their five most recent weekly, biweekly, and monthly assessment attempts. The marginal benefit of more frequent assessments was evaluated based on Receiver Operator Characteristic (ROC) curves and statistical comparisons of the area under the curves (AUC). Results Patients’ reports about their depressive symptoms and perceived health status were highly predictable based on prior assessment responses. For models predicting moderate/severe depression, the AUC was 0.91 (95% CI 0.89-0.93) when assuming weekly assessment attempts and only slightly less when assuming biweekly assessments (AUC: 0.89; CI 0.87-0.91) or monthly attempts (AUC: 0.89; CI 0.86-0.91). The AUC for models predicting reports of fair/poor health status was similar when weekly assessments were compared with those occurring biweekly (P value for the difference=.11) or monthly (P=.81). Reports of
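
    A minimal sketch of the prediction setup described above: a logistic model predicts the next IVR-reported outcome from the five most recent assessment responses and is evaluated by ROC AUC. The data are synthetic placeholders, not the study's IVR records.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

# Synthetic data: last five weekly responses (0/1) and the next reported outcome.
rng = np.random.default_rng(0)
X = rng.integers(0, 2, size=(1000, 5)).astype(float)
y = (X.mean(axis=1) + rng.normal(scale=0.3, size=1000) > 0.5).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
model = LogisticRegression().fit(X_tr, y_tr)
auc = roc_auc_score(y_te, model.predict_proba(X_te)[:, 1])
print(f"test AUC: {auc:.2f}")
```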

  1. Automated assessment of renal cortical surface roughness from computerized tomography images and its association with age

    PubMed Central

    Duan, Xinhui; Rule, Andrew D.; Elsherbiny, Hisham E.; Vrtiska, Terri J.; Avula, Ramesh T.; Alexander, Mariam P.; Lerman, Lilach O.; McCollough, Cynthia H.

    2014-01-01

    Rationale and Objectives Nephrosclerosis occurs with aging and is characterized by increased kidney sub-capsular surface irregularities at autopsy. Assessments of cortical roughness in-vivo could provide an important measure of nephrosclerosis. The purpose of this study was to develop and validate an image-processing algorithm for quantifying renal cortical surface roughness in-vivo and determine its association with age. Materials and methods Renal cortical surface roughness was measured on contrast-enhanced abdominal CT images of potential living kidney donors. A roughness index was calculated based on geometric curvature of each kidney from 3D images, and compared with visual observation scores. Cortical roughness was compared between the oldest and youngest donors, and its interaction with cortical volume and age assessed. Results The developed quantitative roughness index identified significant differences in kidneys with visual surface roughness scores of 0 (minimal), 1 (mild), and 2 (moderate) (p<0.001) in a random sample of 200 potential kidney donors. Cortical roughness was significantly higher in the 94 oldest (64–75y) versus 91 youngest (18–25y) potential kidney donors (p<0.001). Lower cortical volume was associated with older age but not with roughness (r=−0.03, p=0.75). The association of oldest age group with roughness (OR=1.8 per SD of roughness index) remained significant after adjustment for total cortex volume (OR=2.0 per SD of roughness index). Conclusion A new algorithm to measure renal cortical surface roughness from CT scans detected rougher surface in older compared to younger kidneys, independent of cortical volume loss. This novel index may allow quantitative evaluation of nephrosclerosis in vivo using contrast-enhanced CT. PMID:25086950

  2. A feasibility assessment of automated FISH image and signal analysis to assist cervical cancer detection

    NASA Astrophysics Data System (ADS)

    Wang, Xingwei; Li, Yuhua; Liu, Hong; Li, Shibo; Zhang, Roy R.; Zheng, Bin

    2012-02-01

    Fluorescence in situ hybridization (FISH) technology provides a promising molecular imaging tool to detect cervical cancer. Since manual FISH analysis is difficult, time-consuming, and inconsistent, the automated FISH image scanning systems have been developed. Due to limited focal depth of scanned microscopic image, a FISH-probed specimen needs to be scanned in multiple layers that generate huge image data. To improve diagnostic efficiency of using automated FISH image analysis, we developed a computer-aided detection (CAD) scheme. In this experiment, four pap-smear specimen slides were scanned by a dual-detector fluorescence image scanning system that acquired two spectrum images simultaneously, which represent images of interphase cells and FISH-probed chromosome X. During image scanning, once detecting a cell signal, system captured nine image slides by automatically adjusting optical focus. Based on the sharpness index and maximum intensity measurement, cells and FISH signals distributed in 3-D space were projected into a 2-D con-focal image. CAD scheme was applied to each con-focal image to detect analyzable interphase cells using an adaptive multiple-threshold algorithm and detect FISH-probed signals using a top-hat transform. The ratio of abnormal cells was calculated to detect positive cases. In four scanned specimen slides, CAD generated 1676 con-focal images that depicted analyzable cells. FISH-probed signals were independently detected by our CAD algorithm and an observer. The Kappa coefficients for agreement between CAD and observer ranged from 0.69 to 1.0 in detecting/counting FISH signal spots. The study demonstrated the feasibility of applying automated FISH image and signal analysis to assist cyto-geneticists in detecting cervical cancers.
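
    A hedged sketch of the FISH-signal detection step described above: a white top-hat transform enhances small bright spots in the probe channel, which are then thresholded and counted. The structuring-element size and the threshold rule are illustrative assumptions, not the authors' exact CAD scheme.

```python
from skimage import morphology, measure

def detect_fish_spots(channel, spot_radius=3):
    """Count bright FISH-probe spots in a single-channel fluorescence image."""
    footprint = morphology.disk(spot_radius)
    tophat = morphology.white_tophat(channel, footprint)  # keep structures smaller than the disk
    mask = tophat > tophat.mean() + 3 * tophat.std()      # simple adaptive cut-off
    labels = measure.label(mask)
    return labels.max()                                   # number of detected signal spots
```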

  3. Automated cytochrome c oxidase bioassay developed for ionic liquids' toxicity assessment.

    PubMed

    Costa, Susana P F; Martins, Bárbara S F; Pinto, Paula C A G; Saraiva, M Lúcia M F S

    2016-05-15

    A fully automated cytochrome c oxidase assay resorting to sequential injection analysis (SIA) was developed for the first time and implemented to evaluate potential toxic compounds. The bioassay was validated by evaluation of 15 ionic liquids (ILs) with distinct cationic head groups, alkyl side chains and anions. The assay was based on cytochrome c oxidase activity reduction in the presence of the tested compounds and quantification of the inhibitor concentration required to cause 50% of enzyme activity inhibition (EC50). The obtained results demonstrated that enzyme activity was considerably inhibited by the BF4 anion and ILs incorporating non-aromatic pyrrolidinium and tetrabutylphosphonium cation cores. Emim [Ac] and chol [Ac], by contrast, presented the highest EC50 values among the ILs tested. The developed automated SIA methodology is a simple and robust high-throughput screening bioassay and exhibited good repeatability in all the tested conditions (rsd<3.7%, n=10). Therefore, it is expected that due to its simplicity and low cost, the developed approach can be used as an alternative to traditional screening assays for evaluation of ILs toxicity and identification of possible toxicophore structures. Additionally, the results presented in this study provide further information about ILs toxicity. PMID:26894289
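
    The EC50 quoted above is the inhibitor concentration causing 50% loss of enzyme activity; the sketch below estimates it by fitting a four-parameter logistic (Hill) curve with SciPy. The data points and model choice are illustrative assumptions, not the paper's procedure.

```python
import numpy as np
from scipy.optimize import curve_fit

def hill(conc, bottom, top, ec50, slope):
    """Four-parameter logistic dose-response curve."""
    return bottom + (top - bottom) / (1.0 + (conc / ec50) ** slope)

conc = np.array([0.01, 0.03, 0.1, 0.3, 1.0, 3.0, 10.0])        # mM, hypothetical
activity = np.array([98, 95, 88, 70, 45, 20, 8], dtype=float)  # % enzyme activity, hypothetical

popt, _ = curve_fit(hill, conc, activity, p0=[5, 100, 0.5, 1])
print(f"estimated EC50: {popt[2]:.2f} mM")
```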

  4. Fully automated muscle quality assessment by Gabor filtering of second harmonic generation images

    NASA Astrophysics Data System (ADS)

    Paesen, Rik; Smolders, Sophie; Vega, José Manolo de Hoyos; Eijnde, Bert O.; Hansen, Dominique; Ameloot, Marcel

    2016-02-01

    Although structural changes on the sarcomere level of skeletal muscle are known to occur due to various pathologies, rigorous studies of the reduced sarcomere quality remain scarce. This can possibly be explained by the lack of an objective tool for analyzing and comparing sarcomere images across biological conditions. Recent developments in second harmonic generation (SHG) microscopy and increasing insight into the interpretation of sarcomere SHG intensity profiles have made SHG microscopy a valuable tool to study microstructural properties of sarcomeres. Typically, sarcomere integrity is analyzed by fitting a set of manually selected, one-dimensional SHG intensity profiles with a supramolecular SHG model. To circumvent this tedious manual selection step, we developed a fully automated image analysis procedure to map the sarcomere disorder for the entire image at once. The algorithm relies on a single-frequency wavelet-based Gabor approach and includes a newly developed normalization procedure allowing for unambiguous data interpretation. The method was validated by showing the correlation between the sarcomere disorder, quantified by the M-band size obtained from manually selected profiles, and the normalized Gabor value ranging from 0 to 1 for decreasing disorder. Finally, to elucidate the applicability of our newly developed protocol, Gabor analysis was used to study the effect of experimental autoimmune encephalomyelitis on the sarcomere regularity. We believe that the technique developed in this work holds great promise for high-throughput, unbiased, and automated image analysis to study sarcomere integrity by SHG microscopy.
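
    A minimal sketch of single-frequency Gabor analysis of an SHG image, assuming the sarcomere spacing in pixels is known; collapsing the filter response into one normalised value is a strong simplification of the paper's normalisation procedure.

```python
import numpy as np
from skimage.filters import gabor

def gabor_regularity(shg_image, sarcomere_period_px=10, theta=0.0):
    """Return a rough 0-1 score of how strongly the image responds at the sarcomere frequency."""
    freq = 1.0 / sarcomere_period_px                 # single analysis frequency
    real, imag = gabor(shg_image, frequency=freq, theta=theta)
    magnitude = np.hypot(real, imag)                 # local response strength
    # Normalise by the image's own mean intensity so the value lies roughly in [0, 1].
    return float(magnitude.mean() / (shg_image.mean() + 1e-9))
```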

  5. Statistical analysis to assess automated level of suspicion scoring methods in breast ultrasound

    NASA Astrophysics Data System (ADS)

    Galperin, Michael

    2003-05-01

    A well-defined rule-based system has been developed for scoring the Level of Suspicion (LOS) from 0 to 5 based on a qualitative lexicon describing the ultrasound appearance of breast lesions. The purpose of the research is to assess and select one of the automated quantitative LOS scoring methods developed during preliminary studies on benign biopsy reduction. The study used a Computer Aided Imaging System (CAIS) to improve the uniformity and accuracy of applying the LOS scheme by automatically detecting, analyzing and comparing breast masses. The overall goal is to reduce biopsies on the masses with lower levels of suspicion, rather than increasing the accuracy of diagnosis of cancers (which will require biopsy anyway). On complex cyst and fibroadenoma cases, experienced radiologists were up to 50% less certain in true negatives than CAIS. Full correlation analysis was applied to determine which of the proposed LOS quantification methods serves CAIS accuracy the best. This paper presents current results of applying statistical analysis for automated LOS scoring quantification for breast masses with known biopsy results. It was found that the First Order Ranking method yielded the most accurate results. The CAIS system (Image Companion, Data Companion software) is developed by Almen Laboratories and was used to achieve the results.

  6. A timber inventory based upon manual and automated analysis of ERTS-1 and supporting aircraft data using multistage probability sampling. [Plumas National Forest, California]

    NASA Technical Reports Server (NTRS)

    Nichols, J. D.; Gialdini, M.; Jaakkola, S.

    1974-01-01

    A quasi-operational study demonstrated that a timber inventory based on manual and automated analysis of ERTS-1 data, supporting aircraft data, and ground data could be made using multistage sampling techniques. The inventory proved to be a timely, cost-effective alternative to conventional timber inventory techniques. The timber volume on the Quincy Ranger District of the Plumas National Forest was estimated to be 2.44 billion board feet with a sampling error of 8.2 percent. The cost of the inventory procedure, at 1.1 cents/acre, compared favorably with the cost of a conventional inventory at 25 cents/acre. A point-by-point comparison of CALSCAN-classified ERTS data with human-interpreted low-altitude photo plots indicated no significant differences in the overall classification accuracies.

  7. High-throughput, Automated Extraction of DNA and RNA from Clinical Samples using TruTip Technology on Common Liquid Handling Robots

    PubMed Central

    Holmberg, Rebecca C.; Gindlesperger, Alissa; Stokes, Tinsley; Brady, Dane; Thakore, Nitu; Belgrader, Philip; Cooney, Christopher G.; Chandler, Darrell P.

    2013-01-01

    TruTip is a simple nucleic acid extraction technology whereby a porous, monolithic binding matrix is inserted into a pipette tip. The geometry of the monolith can be adapted for specific pipette tips ranging in volume from 1.0 to 5.0 ml. The large porosity of the monolith enables viscous or complex samples to readily pass through it with minimal fluidic backpressure. Bi-directional flow maximizes residence time between the monolith and sample, and enables large sample volumes to be processed within a single TruTip. The fundamental steps, irrespective of sample volume or TruTip geometry, include cell lysis, nucleic acid binding to the inner pores of the TruTip monolith, washing away unbound sample components and lysis buffers, and eluting purified and concentrated nucleic acids into an appropriate buffer. The attributes and adaptability of TruTip are demonstrated in three automated clinical sample processing protocols using an Eppendorf epMotion 5070, Hamilton STAR and STARplus liquid handling robots, including RNA isolation from nasopharyngeal aspirate, genomic DNA isolation from whole blood, and fetal DNA extraction and enrichment from large volumes of maternal plasma (respectively). PMID:23793016

  8. Influence of sample preparation and reliability of automated numerical refocusing in stain-free analysis of dissected tissues with quantitative phase digital holographic microscopy

    NASA Astrophysics Data System (ADS)

    Kemper, Björn; Lenz, Philipp; Bettenworth, Dominik; Krausewitz, Philipp; Domagk, Dirk; Ketelhut, Steffi

    2015-05-01

    Digital holographic microscopy (DHM) has been demonstrated to be a versatile tool for high resolution non-destructive quantitative phase imaging of surfaces and multi-modal minimally-invasive monitoring of living cell cultures in-vitro. DHM provides quantitative monitoring of physiological processes through functional imaging and structural analysis which, for example, gives new insight into signalling of cellular water permeability and cell morphology changes due to toxins and infections. Quantitative DHM phase contrast also opens up prospective application fields in the analysis of dissected tissues through stain-free imaging and the quantification of tissue density changes. We show that DHM allows imaging of different tissue layers with high contrast in unstained tissue sections. As the investigation of fixed samples represents a very important application field in pathology, we also analyzed the influence of the sample preparation. The retrieved data demonstrate that the quality of quantitative DHM phase images of dissected tissues depends strongly on the fixing method and common staining agents. As in DHM the reconstruction is performed numerically, multi-focus imaging is achieved from a single digital hologram. Thus, we evaluated the automated refocussing feature of DHM for application on different types of dissected tissues and revealed that on moderately stained samples highly reproducible holographic autofocussing can be achieved. Finally, it is demonstrated that alterations of the spatial refractive index distribution in murine and human tissue samples represent a reliable absolute parameter that is related to different degrees of inflammation in experimental colitis and Crohn's disease. This paves the way towards the usage of DHM in digital pathology for automated histological examinations and further studies to elucidate the translational potential of quantitative phase microscopy for the clinical management of patients, e.g., with inflammatory bowel disease.

  9. Protein Quality Assessment on Saliva Samples for Biobanking Purposes.

    PubMed

    Rosa, Nuno; Marques, Jéssica; Esteves, Eduardo; Fernandes, Mónica; Mendes, Vera M; Afonso, Ângela; Dias, Sérgio; Pereira, Joaquim Polido; Manadas, Bruno; Correia, Maria José; Barros, Marlene

    2016-08-01

    Biobank saliva sample quality depends on specific criteria applied to collection, processing, and storage. In spite of the growing interest in saliva as a diagnostic fluid, few biobanks currently store large collections of such samples. The development of a standard operating procedure (SOP) for saliva collection and quality control is fundamental for the establishment of a new saliva biobank, which stores samples to be made available to the saliva research community. Different collection methods were tested regarding total volume of protein obtained, protein content, and protein profiles, and the results were used to choose the best method for protein studies. Furthermore, the impact of the circadian variability and inter- and intraindividual differences, as well as the saliva sample stability at room temperature, were also evaluated. Considering our results, a sublingual cotton roll method for saliva collection proved to produce saliva with the best characteristics and should be applied in the morning, whenever possible. In addition, there is more variability in salivary proteins between individuals than in the same individual for a 5-month period. According to the electrophoretic protein profile, protein stability is guaranteed for 24 hours at room temperature and the protein degradation profile and protein identification were characterized. All this information was used to establish an SOP for saliva collection, processing, and storage in a biobank. We conclude that it is possible to collect saliva using an easy and inexpensive protocol, resulting in saliva samples for protein analysis with sufficient quality for biobanking purposes. PMID:26937781

  10. Assessment Study on Sensors and Automation in the Industries of the Future. Reports on Industrial Controls, Information Processing, Automation, and Robotics

    SciTech Connect

    Bennett, Bonnie; Boddy, Mark; Doyle, Frank; Jamshidi, Mo; Ogunnaike, Tunde

    2004-11-01

    This report presents the results of an expert study to identify research opportunities for Sensors & Automation, a sub-program of the U.S. Department of Energy (DOE) Industrial Technologies Program (ITP). The research opportunities are prioritized by realizable energy savings. The study encompasses the technology areas of industrial controls, information processing, automation, and robotics. These areas have been central areas of focus of many Industries of the Future (IOF) technology roadmaps. This report identifies opportunities for energy savings as a direct result of advances in these areas and also recognizes indirect means of achieving energy savings, such as product quality improvement, productivity improvement, and reduction of recycle.

  11. Exploring trait assessment of samples, persons, and cultures.

    PubMed

    McCrae, Robert R

    2013-01-01

    I present a very broad overview of what I have learned about personality trait assessment at different levels and offer some views on future directions for research and clinical practice. I review some basic principles of scale development and argue that internal consistency has been overemphasized; more attention to retest reliability is needed. Because protocol validity is crucial for individual assessment and because validity scales have limited utility, I urge combining assessments from multiple informants, and I present some statistical tools for that purpose. As culture-level traits, I discuss ethos, national character stereotypes, and aggregated personality traits, and summarize evidence for the validity of the latter. Our understanding of trait profiles of cultures is limited, but it can guide future exploration. PMID:23924211

  12. Automated quantitative cytological analysis using portable microfluidic microscopy.

    PubMed

    Jagannadh, Veerendra Kalyan; Murthy, Rashmi Sreeramachandra; Srinivasan, Rajesh; Gorthi, Sai Siva

    2016-06-01

    In this article, a portable microfluidic microscopy based approach for automated cytological investigations is presented. Inexpensive optical and electronic components have been used to construct a simple microfluidic microscopy system. In contrast to the conventional slide-based methods, the presented method employs microfluidics to enable automated sample handling and image acquisition. The approach involves the use of simple in-suspension staining and automated image acquisition to enable quantitative cytological analysis of samples. The applicability of the presented approach to research in cellular biology is shown by performing an automated cell viability assessment on a given population of yeast cells. Further, the relevance of the presented approach to clinical diagnosis and prognosis has been demonstrated by performing detection and differential assessment of malaria infection in a given sample. PMID:25990413

  13. WebMark--A Fully Automated Method of Submission, Assessment, Grading, and Commentary for Laboratory Practical Scripts

    NASA Astrophysics Data System (ADS)

    Olivier, George W. J.; Herson, Katie; Sosabowski, Michael H.

    2001-12-01

    The traditional (manual) method of checking and grading student laboratory practical scripts is time consuming and therefore can cause long script turnaround times; it is labor intensive, especially for nonuniform quantitative data; there is potential for inconsistency, and, for large student groups, a great deal of tedium for the checker. Automation of checking such scripts has the potential to alleviate these disadvantages. This paper describes a strategy adopted by the School of Pharmacy and Biomolecular Sciences, University of Brighton, UK, to automate the submission, assessment, grading, and commentary of laboratory practical scripts. Student evaluation and feedback is also reported. Students enter their results into a Web-based form via the school intranet. Their results are linked to a Filemaker Pro database, which calculates the "right" answers on the basis of the primary data used, compares them with the students' answers, and grades the scripts. The database detects where students have made errors in calculations and calculates the grade, which it sends to students with qualitative feedback. Students receive their grade and feedback by email immediately upon submission of their results, which gives them the opportunity to reflect upon and discuss their results with the instructor while the exercise is still fresh in their mind.
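
    A toy version of the automated checking idea described above: the "right" answer is recomputed from each student's own primary data and the submitted answer is accepted if it falls within a tolerance. The function names and the tolerance are illustrative, not part of the WebMark/Filemaker Pro implementation.

```python
def grade_submission(primary_data, submitted_answer, compute_answer, rel_tol=0.01):
    """Recompute the expected answer from the student's own data and compare."""
    expected = compute_answer(primary_data)
    error = abs(submitted_answer - expected) / abs(expected)
    correct = error <= rel_tol
    feedback = ("Correct." if correct
                else f"Check your calculation: expected about {expected:.3g}.")
    return correct, feedback

# Example: a student reports the mean concentration calculated from three replicates.
ok, msg = grade_submission([0.101, 0.099, 0.103], 0.1010,
                           compute_answer=lambda d: sum(d) / len(d))
print(ok, msg)
```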

  14. Automated 3D quantitative assessment and measurement of alpha angles from the femoral head-neck junction using MR imaging

    NASA Astrophysics Data System (ADS)

    Xia, Ying; Fripp, Jurgen; Chandra, Shekhar S.; Walker, Duncan; Crozier, Stuart; Engstrom, Craig

    2015-10-01

    To develop an automated approach for 3D quantitative assessment and measurement of alpha angles from the femoral head-neck (FHN) junction using bone models derived from magnetic resonance (MR) images of the hip joint. Bilateral MR images of the hip joints were acquired from 30 male volunteers (healthy active individuals and high-performance athletes, aged 18-49 years) using a water-excited 3D dual echo steady state (DESS) sequence. In a subset of these subjects (18 water-polo players), additional True Fast Imaging with Steady-state Precession (TrueFISP) images were acquired from the right hip joint. For both MR image sets, an active shape model based algorithm was used to generate automated 3D bone reconstructions of the proximal femur. Subsequently, a local coordinate system of the femur was constructed to compute a 2D shape map to project femoral head sphericity for calculation of alpha angles around the FHN junction. To evaluate automated alpha angle measures, manual analyses were performed on anterosuperior and anterior radial MR slices from the FHN junction that were automatically reformatted using the constructed coordinate system. High intra- and inter-rater reliability (intra-class correlation coefficients  >  0.95) was found for manual alpha angle measurements from the auto-extracted anterosuperior and anterior radial slices. Strong correlations were observed between manual and automatic measures of alpha angles for anterosuperior (r  =  0.84) and anterior (r  =  0.92) FHN positions. For matched DESS and TrueFISP images, there were no significant differences between automated alpha angle measures obtained from the upper anterior quadrant of the FHN junction (two-way repeated measures ANOVA, F  <  0.01, p  =  0.98). Our automatic 3D method analysed MR images of the hip joints to generate alpha angle measures around the FHN junction circumference with very good reliability and reproducibility. This work has the
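
    As a hedged geometric sketch of the alpha-angle measurement discussed above: on a single radial slice, the alpha angle can be taken as the angle between the femoral neck axis and the line from the head centre to the point where the bone contour first departs from the best-fit head circle. The coordinates below are hypothetical.

```python
import numpy as np

def alpha_angle(head_center, neck_point, deviation_point):
    """Angle (degrees) between the neck axis and the head-centre-to-deviation-point line."""
    v_neck = np.asarray(neck_point, float) - np.asarray(head_center, float)
    v_dev = np.asarray(deviation_point, float) - np.asarray(head_center, float)
    cos_a = np.dot(v_neck, v_dev) / (np.linalg.norm(v_neck) * np.linalg.norm(v_dev))
    return np.degrees(np.arccos(np.clip(cos_a, -1.0, 1.0)))

print(f"alpha angle: {alpha_angle((0, 0), (40, 0), (20, 25)):.1f} degrees")
```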

  15. Automated 3D quantitative assessment and measurement of alpha angles from the femoral head-neck junction using MR imaging.

    PubMed

    Xia, Ying; Fripp, Jurgen; Chandra, Shekhar S; Walker, Duncan; Crozier, Stuart; Engstrom, Craig

    2015-10-01

    To develop an automated approach for 3D quantitative assessment and measurement of alpha angles from the femoral head-neck (FHN) junction using bone models derived from magnetic resonance (MR) images of the hip joint. Bilateral MR images of the hip joints were acquired from 30 male volunteers (healthy active individuals and high-performance athletes, aged 18–49 years) using a water-excited 3D dual echo steady state (DESS) sequence. In a subset of these subjects (18 water-polo players), additional True Fast Imaging with Steady-state Precession (TrueFISP) images were acquired from the right hip joint. For both MR image sets, an active shape model based algorithm was used to generate automated 3D bone reconstructions of the proximal femur. Subsequently, a local coordinate system of the femur was constructed to compute a 2D shape map to project femoral head sphericity for calculation of alpha angles around the FHN junction. To evaluate automated alpha angle measures, manual analyses were performed on anterosuperior and anterior radial MR slices from the FHN junction that were automatically reformatted using the constructed coordinate system. High intra- and inter-rater reliability (intra-class correlation coefficients  >  0.95) was found for manual alpha angle measurements from the auto-extracted anterosuperior and anterior radial slices. Strong correlations were observed between manual and automatic measures of alpha angles for anterosuperior (r  =  0.84) and anterior (r  =  0.92) FHN positions. For matched DESS and TrueFISP images, there were no significant differences between automated alpha angle measures obtained from the upper anterior quadrant of the FHN junction (two-way repeated measures ANOVA, F  <  0.01, p  =  0.98). Our automatic 3D method analysed MR images of the hip joints to generate alpha angle measures around the FHN junction circumference with very good reliability and reproducibility. This work has the

  16. Adaptive Sampling Algorithms for Probabilistic Risk Assessment of Nuclear Simulations

    SciTech Connect

    Diego Mandelli; Dan Maljovec; Bei Wang; Valerio Pascucci; Peer-Timo Bremer

    2013-09-01

    Nuclear simulations are often computationally expensive, time-consuming, and high-dimensional with respect to the number of input parameters. Thus exploring the space of all possible simulation outcomes is infeasible using finite computing resources. During simulation-based probabilistic risk analysis, it is important to discover the relationship between a potentially large number of input parameters and the output of a simulation using as few simulation trials as possible. This is a typical context for performing adaptive sampling, where a few observations are obtained from the simulation, a surrogate model is built to represent the simulation space, and new samples are selected based on the model constructed. The surrogate model is then updated based on the simulation results of the sampled points. In this way, we attempt to gain the most information possible with a small number of carefully selected sampled points, limiting the number of expensive trials needed to understand features of the simulation space. We analyze the specific use case of identifying the limit surface, i.e., the boundaries in the simulation space between system failure and system success. In this study, we explore several techniques for adaptively sampling the parameter space in order to reconstruct the limit surface. We focus on several adaptive sampling schemes. First, we seek to learn a global model of the entire simulation space using prediction models or neighborhood graphs and extract the limit surface as an iso-surface of the global model. Second, we estimate the limit surface by sampling in the neighborhood of the current estimate based on topological segmentations obtained locally. Our techniques draw inspiration from a topological structure known as the Morse-Smale complex. We highlight the advantages and disadvantages of using a global prediction model versus a local topological view of the simulation space, comparing several different strategies for adaptive sampling in both
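
    The sampling loop described above can be sketched generically as follows; the toy "simulator" and the nearest-neighbour disagreement score are illustrative assumptions, not the prediction-model or Morse-Smale machinery evaluated in the paper.

        # Run a few simulations, fit a cheap surrogate, then keep sampling where the
        # surrogate is least certain about failure vs. success (i.e. near the limit surface).
        import numpy as np

        def simulator(x):                       # toy stand-in for an expensive simulation
            return 1.0 if x[0] ** 2 + x[1] ** 2 > 0.5 else 0.0   # 1 = failure, 0 = success

        def disagreement(x, X, y, k=5):         # spread of the k nearest labels
            d = np.linalg.norm(X - x, axis=1)
            return y[np.argsort(d)[:k]].std()   # high when neighbours disagree

        rng = np.random.default_rng(0)
        X = rng.uniform(-1, 1, size=(10, 2))    # initial design
        y = np.array([simulator(x) for x in X])

        for _ in range(40):                     # adaptive refinement
            candidates = rng.uniform(-1, 1, size=(200, 2))
            scores = [disagreement(c, X, y) for c in candidates]
            x_new = candidates[int(np.argmax(scores))]
            X = np.vstack([X, x_new])
            y = np.append(y, simulator(x_new))
        # new samples now cluster around the limit surface x1^2 + x2^2 = 0.5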

  17. High-performance liquid chromatographic determination of ochratoxin A in artificially contaminated cocoa beans using automated sample clean-up.

    PubMed

    Hurst, W J; Martin, R A

    1998-06-12

    An HPLC method is described for the analysis of ochratoxin A at low-ppb levels in samples of artificially contaminated cocoa beans. The samples are extracted in a mixture of methanol-water containing ascorbic acid, adjusted to pH and evaporated to dryness. Samples in this state are then placed onto a Benchmate sample preparation workstation where C18 solid-phase extraction operations are performed. The resulting materials are evaporated to dryness and analyzed by reversed-phase HPLC with fluorescence detection. The method was evaluated for accuracy and precision, with R.S.D.s for multiple injections calculated as 1.1% for sample and 2.5% for standard. Recoveries of ochratoxin A added to cocoa beans ranged from 87 to 106% over the range of the assay. PMID:9691293
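
    For reference, the relative standard deviation figures quoted above follow the usual definition; a minimal sketch with hypothetical replicate peak areas:

        # R.S.D. (%) = 100 * standard deviation / mean, here for illustrative peak areas.
        import statistics

        def rsd_percent(values):
            return 100.0 * statistics.stdev(values) / statistics.mean(values)

        standard_areas = [10210, 10180, 10090, 10350, 10270]   # hypothetical replicate injections
        print(f"R.S.D. = {rsd_percent(standard_areas):.1f}%")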

  18. Automated system for generation of soil moisture products for agricultural drought assessment

    NASA Astrophysics Data System (ADS)

    Raja Shekhar, S. S.; Chandrasekar, K.; Sesha Sai, M. V. R.; Diwakar, P. G.; Dadhwal, V. K.

    2014-11-01

    Drought is a frequently occurring disaster affecting the lives of millions of people across the world every year. Several parameters, indices and models are used globally to forecast or provide early warning of drought and to monitor its prevalence, persistence and severity. Since drought is a complex phenomenon, a large number of parameters/indices need to be evaluated to address the problem sufficiently. It is a challenge to generate input parameters from different sources such as space based data, ground data and collateral data at short intervals of time, where there may be limitations in processing power, availability of domain expertise, and specialized models and tools. In this study, an effort has been made to automate the derivation of one of the important parameters in drought studies, viz. soil moisture. The soil water balance bucket model is widely used to arrive at soil moisture products and is popular for its sensitivity to soil conditions and rainfall parameters. This model has been encoded into a "Fish-Bone" architecture using COM technologies and open source libraries for best possible automation, to fulfill the need for a standard procedure for preparing input parameters and processing routines. The main aim of the system is to provide an operational environment for generation of soil moisture products, allowing users to concentrate on further enhancements and implementation of these parameters in related areas of research without re-discovering the established models. The architecture relies mainly on available open source libraries for GIS and raster I/O operations for different file formats, to ensure that the products can be widely distributed without the burden of any commercial dependencies. Further, the system is automated to the extent of user-free operation if required, with inbuilt chain processing for daily generation of products at specified intervals. The operational software has inbuilt capabilities to automatically
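
    A minimal sketch of a daily soil-water-balance "bucket" model of the kind referred to above; the crop coefficient, bucket capacity and overflow rule are generic textbook assumptions, not the operational implementation described in the paper.

        # Daily bucket model: rainfall fills the bucket, overflow runs off, and
        # evapotranspiration (limited by available water) empties it.
        def bucket_model(rainfall_mm, pet_mm, capacity_mm=150.0, kc=0.8, initial_mm=50.0):
            storage, series = initial_mm, []
            for rain, pet in zip(rainfall_mm, pet_mm):
                storage += rain                           # add the day's rainfall
                runoff = max(0.0, storage - capacity_mm)  # bucket overflows beyond capacity
                storage -= runoff
                storage -= min(storage, kc * pet)         # actual ET limited by stored water
                series.append(storage / capacity_mm)      # relative soil moisture (0..1)
            return series

        moisture = bucket_model(rainfall_mm=[0, 12, 0, 0, 30, 5], pet_mm=[5, 4, 6, 6, 3, 5])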

  19. AUTOMATED GEOSPATIAL WATERSHED ASSESSMENT: A GIS-BASED HYDROLOGIC MODELING TOOL

    EPA Science Inventory

    Planning and assessment in land and water resource management are evolving toward complex, spatially explicit regional assessments. These problems have to be addressed with distributed models that can compute runoff and erosion at different spatial and temporal scales. The extens...

  20. SAMPLING AND ANALYSIS OF ASBESTOS FIBERS TO SUPPORT EXPOSURE ASSESSMENTS

    EPA Science Inventory

    The Agency's Asbestos Coordinating Committee (ACT) has conducted a technical review of the research needs for asbestos programs. The overall research in this task has been highlighted as a high priority need from that review. The filter comparison needs assessment was a recommen...

  1. Genesis Solar Wind Collector Cleaning Assessment: 60366 Sample Case Study

    NASA Technical Reports Server (NTRS)

    Goreva, Y. S.; Gonzalez, C. P.; Kuhlman, K. R.; Burnett, D. S.; Woolum, D.; Jurewicz, A. J.; Allton, J. H.; Rodriguez, M. C.; Burkett, P. J.

    2014-01-01

    In order to recognize, localize, characterize and remove particle and thin film surface contamination, a small subset of Genesis mission collector fragments are being subjected to extensive study via various techniques [1-5]. Here we present preliminary results for sample 60336, a Czochralski silicon (Si-CZ) based wafer from the bulk array (B/C).

  2. [Information as physical factor: problems of measurement, hygienic assessment and IT-automation].

    PubMed

    Denisov, É I; Prokopenko, L V; Eremin, A L; Kur'erov, N N; Bodiakin, V I; Stepanian, I V

    2014-01-01

    The increasing flow of information, while speeding up the progress of society, can affect health, which poses the task of its hygienic regulation. The physical aspects of information, its parameters and units of measurement, and approaches to its measurement and evaluation that account for information quantity and quality, as well as criteria for its permissible and optimal levels, are considered. The results of measurements of the quantity of text information produced per year on a computer in 17 occupations across 10 economic sectors are presented. A principle for IT-automation of operator work and dynamic monitoring is proposed. On the basis of the research performed, a glossary of terms and a computer-supported guide on the problem have been elaborated for the accumulation of experience and clarification of prospects. PMID:25069277

  3. Possibility of Using Nonmetallic Check Samples to Assess the Sensitivity of Penetrant Testing

    NASA Astrophysics Data System (ADS)

    Kalinichenko, N.; Lobanova, I.; Kalinichenko, A.; Loboda, E.; Jakubec, T.

    2016-06-01

    Versions of check sample manufacturing for penetrant inspection are considered. A statistical analysis of crack width measuring for nonmetallic samples is performed to determine the possibility of their application to assess the penetrant testing sensitivity.

  4. ON-LINE TOOLS FOR PROPER VERTICAL POSITIONING OF VERTICAL SAMPLING INTERVALS DURING SITE ASSESSMENT

    EPA Science Inventory

    This presentation describes on-line tools for proper vertical positioning of vertical sampling intervals during site assessment. Proper vertical sample interval selection is critical for generating data on the vertical distribution of contamination. Without vertical delineation, th...

  5. Development of Genesis Solar Wind Sample Cleanliness Assessment: Initial Report on Sample 60341 Optical Imagery and Elemental Mapping

    NASA Technical Reports Server (NTRS)

    Gonzalez, C. P.; Goreva, Y. S.; Burnett, D. S.; Woolum, D.; Jurewicz, A. J.; Allton, J. H.; Rodriguez, P. J.; Burkett, P. J.

    2014-01-01

    Since 2005 the Genesis science team has experimented with techniques for removing the contaminant particles and films from the collection surface of the Genesis fragments. A subset of 40 samples have been designated as "cleaning matrix" samples. These are small samples to which various cleaning approaches are applied and then cleanliness is assessed optically, by TRXRF, SEM, ToF-SIMS, XPS, ellipsometry or other means [1-9]. Most of these samples remain available for allocation, with cleanliness assessment data. This assessment allows evaluation of various cleaning techniques and handling or analytical effects. Cleaning techniques investigated by the Genesis community include acid/base etching, acetate replica peels, ion beam, and CO2 snow jet cleaning [10-16]. JSC provides surface cleaning using UV ozone exposure and ultra-pure water (UPW) [17-20]. The UPW rinse is commonly used to clean samples for handling debris between processing by different researchers. Optical microscopic images of the sample taken before and after UPW cleaning show what has been added or removed during the cleaning process.

  6. Using Attribute Sampling to Assess the Accuracy of a Library Circulation System.

    ERIC Educational Resources Information Center

    Kiger, Jack E.; Wise, Kenneth

    1995-01-01

    Discusses how to use attribute sampling to assess the accuracy of a library circulation system. Describes the nature of sampling, sampling risk, and nonsampling error. Presents nine steps for using attribute sampling to determine the maximum percentage of incorrect records in a circulation system. (AEF)
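
    One computation that typically sits inside such a procedure is the upper limit on the population error rate implied by the sample; a hedged sketch using a one-sided binomial bound, with an illustrative sample size, error count and confidence level rather than the article's own nine-step worked example:

        # Raise the assumed error rate until observing this few errors becomes implausible
        # at the chosen confidence level; the result is the maximum plausible error rate.
        from math import comb

        def binom_cdf(k, n, p):
            return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k + 1))

        def upper_error_limit(n, errors_found, confidence=0.95, step=0.0005):
            p = errors_found / n
            while binom_cdf(errors_found, n, p) > 1 - confidence and p < 1.0:
                p += step
            return p

        # e.g. 2 incorrect records in a sample of 100 -> roughly a 6.2% upper limit at 95% confidence
        print(f"maximum error rate ~ {upper_error_limit(100, 2):.1%}")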

  7. Universality of Generalized Bunching and Efficient Assessment of Boson Sampling

    NASA Astrophysics Data System (ADS)

    Shchesnovich, V. S.

    2016-03-01

    It is found that identical bosons (fermions) show a generalized bunching (antibunching) property in linear networks: the absolute maximum (minimum) of the probability that all N input particles are detected in a subset of K output modes of any nontrivial linear M-mode network is attained only by completely indistinguishable bosons (fermions). For fermions K is arbitrary; for bosons it is either (i) arbitrary for only classically correlated bosons or (ii) satisfies K ≥ N (or K = 1) for arbitrary input states of N particles. The generalized bunching allows us to certify in a polynomial in N number of runs that a physical device realizing boson sampling with an arbitrary network operates in the regime of full quantum coherence compatible only with completely indistinguishable bosons. The protocol needs only polynomial classical computations for the standard boson sampling, whereas an analytic formula is available for the scattershot version.
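
    In symbols, and with notation that is ours rather than the paper's, the bunching statement above reads: for any input state of N particles and any nontrivial choice of K output modes of a linear M-mode network,

        P(K) \;=\; \Pr\bigl[\text{all } N \text{ particles are detected in the chosen } K \text{ modes}\bigr] \;\le\; P_{\mathrm{ind}}(K),

    with equality only for completely indistinguishable bosons (for fermions the inequality is reversed, giving generalized antibunching). Loosely, certification then amounts to checking over repeated runs that the measured bunching probability attains the indistinguishable-boson value.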

  8. Universality of Generalized Bunching and Efficient Assessment of Boson Sampling.

    PubMed

    Shchesnovich, V S

    2016-03-25

    It is found that identical bosons (fermions) show a generalized bunching (antibunching) property in linear networks: the absolute maximum (minimum) of the probability that all N input particles are detected in a subset of K output modes of any nontrivial linear M-mode network is attained only by completely indistinguishable bosons (fermions). For fermions K is arbitrary; for bosons it is either (i) arbitrary for only classically correlated bosons or (ii) satisfies K≥N (or K=1) for arbitrary input states of N particles. The generalized bunching allows us to certify in a polynomial in N number of runs that a physical device realizing boson sampling with an arbitrary network operates in the regime of full quantum coherence compatible only with completely indistinguishable bosons. The protocol needs only polynomial classical computations for the standard boson sampling, whereas an analytic formula is available for the scattershot version. PMID:27058078

  9. Design and construction of a medium-scale automated direct measurement respirometric system to assess aerobic biodegradation of polymers

    NASA Astrophysics Data System (ADS)

    Castro Aguirre, Edgar

    A medium-scale automated direct measurement respirometric (DMR) system was designed and built to assess the aerobic biodegradation of up to 30 materials in triplicate simultaneously. Likewise, a computer application was developed for rapid analysis of the data generated. The developed DMR system was able to simulate different testing conditions by varying temperature and relative humidity, which are the major exposure conditions affecting biodegradation. Two complete tests for determining the aerobic biodegradation of polymers under composting conditions were performed to show the efficacy and efficiency of both the DMR system and the DMR data analyzer. In both cases, cellulose reached 70% mineralization at 139 and 45 days. The difference in time for cellulose to reach 70% mineralization was attributed to the composition of the compost and water availability, which highly affect the biodegradation rate. Finally, among the tested materials, at least 60% of the organic carbon content of the biodegradable polymers was converted into carbon dioxide by the end of the test.
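
    The mineralization percentages quoted above follow from comparing the CO2-carbon evolved with the organic carbon initially added; a minimal sketch with illustrative figures, not data from the study:

        # Percent mineralization = evolved CO2-carbon (test minus blank) / initial organic carbon.
        def percent_mineralization(co2_test_g, co2_blank_g, organic_carbon_g):
            carbon_evolved = (co2_test_g - co2_blank_g) * (12.0 / 44.0)   # carbon fraction of CO2
            return 100.0 * carbon_evolved / organic_carbon_g

        print(round(percent_mineralization(co2_test_g=28.0, co2_blank_g=5.0, organic_carbon_g=9.0), 1))
        # -> 69.7, i.e. this hypothetical material would pass a 60% mineralization criterion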

  10. Assessing total and volatile solids in municipal solid waste samples.

    PubMed

    Peces, M; Astals, S; Mata-Alvarez, J

    2014-01-01

    Municipal solid waste is broadly generated in everyday activities and its treatment is a global challenge. Total solids (TS) and volatile solids (VS) are typical control parameters measured in biological treatments. In this study, the TS and VS were determined using the standard methods, as well as introducing some variants: (i) the drying temperature for the TS assays was 105°C, 70°C and 50°C and (ii) the VS were determined using different heating ramps from room temperature to 550°C. TS could be determined at either 105°C or 70°C, but oven residence time was tripled at 70°C, increasing from 48 to 144 h. The VS could be determined by smouldering the sample (where the sample is burnt without a flame), which avoids the release of fumes and odours in the laboratory. However, smouldering can generate undesired pyrolysis products as a consequence of carbonization, which leads to VS being underestimated. Carbonization can be avoided using slow heating ramps to prevent the oxygen limitation. Furthermore, crushing the sample cores decreased the time to reach constant weight and decreased the potential to underestimate VS. PMID:25244131
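
    A minimal sketch of the standard total-solids and volatile-solids arithmetic from crucible weights; the weights below are illustrative, not measurements from the study.

        # TS is the residue after drying; VS is the fraction of that residue lost on ignition.
        def solids(crucible_g, crucible_wet_g, crucible_dry_g, crucible_ash_g):
            wet = crucible_wet_g - crucible_g
            dry = crucible_dry_g - crucible_g      # residue after drying (105 or 70 degC)
            ash = crucible_ash_g - crucible_g      # residue after ignition at 550 degC
            ts = 100.0 * dry / wet                 # total solids, % of wet weight
            vs = 100.0 * (dry - ash) / wet         # volatile solids, % of wet weight
            return ts, vs

        ts, vs = solids(crucible_g=30.000, crucible_wet_g=55.000,
                        crucible_dry_g=40.000, crucible_ash_g=33.000)
        # ts = 40.0, vs = 28.0 (% of wet weight; VS is often also reported as % of TS)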

  11. Locoregional Control of Non-Small Cell Lung Cancer in Relation to Automated Early Assessment of Tumor Regression on Cone Beam Computed Tomography

    SciTech Connect

    Brink, Carsten; Bernchou, Uffe; Bertelsen, Anders; Hansen, Olfred; Schytte, Tine; Bentzen, Soren M.

    2014-07-15

    Purpose: Large interindividual variations in volume regression of non-small cell lung cancer (NSCLC) are observable on standard cone beam computed tomography (CBCT) during fractionated radiation therapy. Here, a method for automated assessment of tumor volume regression is presented and its potential use in response adapted personalized radiation therapy is evaluated empirically. Methods and Materials: Automated deformable registration with calculation of the Jacobian determinant was applied to serial CBCT scans in a series of 99 patients with NSCLC. Tumor volume at the end of treatment was estimated on the basis of the first one third and two thirds of the scans. The concordance between estimated and actual relative volume at the end of radiation therapy was quantified by Pearson's correlation coefficient. On the basis of the estimated relative volume, the patients were stratified into 2 groups having volume regressions below or above the population median value. Kaplan-Meier plots of locoregional disease-free rate and overall survival in the 2 groups were used to evaluate the predictive value of tumor regression during treatment. Cox proportional hazards model was used to adjust for other clinical characteristics. Results: Automatic measurement of the tumor regression from standard CBCT images was feasible. Pearson's correlation coefficient between manual and automatic measurement was 0.86 in a sample of 9 patients. Most patients experienced tumor volume regression, and this could be quantified early into the treatment course. Interestingly, patients with pronounced volume regression had worse locoregional tumor control and overall survival. This was significant in patients with non-adenocarcinoma histology. Conclusions: Evaluation of routinely acquired CBCT images during radiation therapy provides biological information on the specific tumor. This could potentially form the basis for personalized response adaptive therapy.
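
    The volume estimate behind this approach can be sketched as follows, under the assumption that the deformable registration maps the baseline scan onto a later CBCT: the relative tumor volume is then the mean Jacobian determinant over the baseline tumor mask. The arrays below are illustrative stand-ins for real registration output.

        import numpy as np

        def relative_volume(jacobian_det, tumor_mask):
            # Integrating det(J) over the baseline tumor region gives the deformed volume,
            # so the mean det(J) over the mask is the relative volume (<1 means regression).
            return float(jacobian_det[tumor_mask].mean())

        jac = np.full((40, 40, 40), 1.0)
        mask = np.zeros_like(jac, dtype=bool)
        mask[15:25, 15:25, 15:25] = True
        jac[mask] = 0.7                            # 30% local shrinkage inside the tumor
        print(relative_volume(jac, mask))          # -> 0.7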

  12. A lab-on-a-chip system integrating tissue sample preparation and multiplex RT-qPCR for gene expression analysis in point-of-care hepatotoxicity assessment.

    PubMed

    Lim, Geok Soon; Chang, Joseph S; Lei, Zhang; Wu, Ruige; Wang, Zhiping; Cui, Kemi; Wong, Stephen

    2015-10-21

    A truly practical lab-on-a-chip (LOC) system for point-of-care testing (POCT) hepatotoxicity assessment necessitates the embodiment of full-automation, ease-of-use and "sample-in-answer-out" diagnostic capabilities. To date, the reported microfluidic devices for POCT hepatotoxicity assessment remain rudimentary as they largely embody only semi-quantitative or single sample/gene detection capabilities. In this paper, we describe, for the first time, an integrated LOC system that is somewhat close to a practical POCT hepatotoxicity assessment device - it embodies both tissue sample preparation and multiplex real-time RT-PCR. It features semi-automation, is relatively easy to use, and has "sample-in-answer-out" capabilities for multiplex gene expression analysis. Our tissue sample preparation module incorporating both a microhomogenizer and surface-treated paramagnetic microbeads yielded high purity mRNA extracts, considerably better than manual means of extraction. A primer preloading surface treatment procedure and the single-loading inlet on our multiplex real-time RT-PCR module simplify off-chip handling procedures for ease-of-use. To demonstrate the efficacy of our LOC system for POCT hepatotoxicity assessment, we perform a preclinical animal study with the administration of cyclophosphamide, followed by gene expression analysis of two critical protein biomarkers for liver function tests, aspartate transaminase (AST) and alanine transaminase (ALT). Our experimental results depict normalized fold changes of 1.62 and 1.31 for AST and ALT, respectively, illustrating up-regulations in their expression levels and hence validating their selection as critical genes of interest. In short, we illustrate the feasibility of multiplex gene expression analysis in an integrated LOC system as a viable POCT means for hepatotoxicity assessment. PMID:26329655
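
    The normalized fold changes quoted above are commonly obtained with the 2^-ddCt method; a hedged sketch with hypothetical Ct values, not the study's raw measurements:

        # Normalise the target gene to a reference gene, compare treated vs. control,
        # and convert the cycle-threshold difference into a fold change.
        def fold_change(ct_target_treated, ct_ref_treated, ct_target_control, ct_ref_control):
            d_ct_treated = ct_target_treated - ct_ref_treated
            d_ct_control = ct_target_control - ct_ref_control
            return 2.0 ** (-(d_ct_treated - d_ct_control))

        # a ~0.7-cycle earlier target signal after treatment -> ~1.6-fold up-regulation
        print(round(fold_change(22.3, 18.0, 23.0, 18.0), 2))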

  13. A portable automated system for trace gas sampling in the field and stable isotope analysis in the laboratory.

    PubMed

    Theis, Daniel E; Saurer, Matthias; Blum, Herbert; Frossard, Emmanuel; Siegwolf, Rolf T W

    2004-01-01

    A computer-controllable mobile system is presented which enables the automatic collection of 33 air samples in the field and the subsequent analysis for delta13C and delta18O stable isotope ratios of a carbon-containing trace gas in the laboratory, e.g. CO2, CO or CH4. The system includes a manifold gas source input for profile sampling and an infrared gas analyzer for in situ CO2 concentration measurements. Measurements of delta13C and delta18O of all 33 samples can run unattended and take less than six hours for CO2. Laboratory tests with three gases (compressed air with different pCO2 and stable isotope compositions) showed a measurement precision of 0.03 per thousand for delta13C and 0.02 per thousand for delta18O of CO2 (standard error (SE), n = 11). A field test of our system, in which 66 air samples were collected within a 24-hour period above grassland, showed a correlation of 0.99 (r2) between the inverse of pCO2 and delta13C of CO2. Storage of samples until analysis is possible for about 1 week; this can be an important factor for sampling in remote areas. A wider range of applications in the field is open with our system, since sampling and analysis of CO and CH4 for stable isotope composition is also possible. Samples of compressed air had a measurement precision (SE, n = 33) of 0.03 per thousand for delta13C and of 0.04 per thousand for delta18O on CO and of 0.07 per thousand for delta13C on CH4. Our system should therefore further facilitate research of trace gases in the context of the carbon cycle in the field, and opens many other possible applications with carbon- and possibly non-carbon-containing trace gases. PMID:15317047
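
    The delta values reported above follow the standard per-mil notation for isotope ratios relative to an international reference standard:

        \delta^{13}\mathrm{C} \;=\; \left(\frac{R_{\mathrm{sample}}}{R_{\mathrm{standard}}} - 1\right) \times 1000\ \text{per mil}, \qquad R = {}^{13}\mathrm{C}/{}^{12}\mathrm{C},

    and analogously for delta18O with R = 18O/16O. A quoted precision of 0.03 per mil therefore corresponds to a relative uncertainty in R of about 3 x 10^-5.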

  14. Assessment of Different Sampling Methods for Measuring and Representing Macular Cone Density Using Flood-Illuminated Adaptive Optics

    PubMed Central

    Feng, Shu; Gale, Michael J.; Fay, Jonathan D.; Faridi, Ambar; Titus, Hope E.; Garg, Anupam K.; Michaels, Keith V.; Erker, Laura R.; Peters, Dawn; Smith, Travis B.; Pennesi, Mark E.

    2015-01-01

    Purpose To describe a standardized flood-illuminated adaptive optics (AO) imaging protocol suitable for the clinical setting and to assess sampling methods for measuring cone density. Methods Cone density was calculated following three measurement protocols: 50 × 50-μm sampling window values every 0.5° along the horizontal and vertical meridians (fixed-interval method), the mean density of expanding 0.5°-wide arcuate areas in the nasal, temporal, superior, and inferior quadrants (arcuate mean method), and the peak cone density of a 50 × 50-μm sampling window within expanding arcuate areas near the meridian (peak density method). Repeated imaging was performed in nine subjects to determine intersession repeatability of cone density. Results Cone density montages could be created for 67 of the 74 subjects. Image quality was determined to be adequate for automated cone counting for 35 (52%) of the 67 subjects. We found that cone density varied with different sampling methods and regions tested. In the nasal and temporal quadrants, peak density most closely resembled histological data, whereas the arcuate mean and fixed-interval methods tended to underestimate the density compared with histological data. However, in the inferior and superior quadrants, arcuate mean and fixed-interval methods most closely matched histological data, whereas the peak density method overestimated cone density compared with histological data. Intersession repeatability testing showed that repeatability was greatest when sampling by arcuate mean and lowest when sampling by fixed interval. Conclusions We show that different methods of sampling can significantly affect cone density measurements. Therefore, care must be taken when interpreting cone density results, even in a normal population. PMID:26325414
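
    The difference between the fixed-interval and peak-density protocols can be sketched as a window-counting exercise; the cone coordinates below are synthetic and the window size is the 50 x 50 um value used in the paper.

        import numpy as np

        def window_density(cones_um, centre_um, size_um=50.0):
            inside = np.all(np.abs(cones_um - centre_um) <= size_um / 2.0, axis=1)
            return inside.sum() / (size_um / 1000.0) ** 2        # cones per mm^2

        rng = np.random.default_rng(1)
        cones = rng.uniform(0, 500, size=(4000, 2))              # synthetic cone centres, 500 um patch

        fixed = window_density(cones, np.array([250.0, 250.0]))  # one window at the chosen location
        candidates = [np.array([x, y], dtype=float) for x in range(50, 451, 25) for y in range(50, 451, 25)]
        peak = max(window_density(cones, c) for c in candidates) # best window in the region
        # 'peak' >= 'fixed' by construction, mirroring the paper's peak-density protocol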

  15. Simple semi-automated portable capillary electrophoresis instrument with contactless conductivity detection for the determination of β-agonists in pharmaceutical and pig-feed samples.

    PubMed

    Nguyen, Thi Anh Huong; Pham, Thi Ngoc Mai; Doan, Thi Tuoi; Ta, Thi Thao; Sáiz, Jorge; Nguyen, Thi Quynh Hoa; Hauser, Peter C; Mai, Thanh Duc

    2014-09-19

    An inexpensive, robust and easy to use portable capillary electrophoresis instrument with miniaturized high-voltage capacitively coupled contactless conductivity detection was developed. The system utilizes pneumatic operation to manipulate the solutions for all flushing steps. The different operations, i.e. capillary flushing, interface rinsing, and electrophoretic separation, are easily activated by turning an electronic switch. To allow the analysis of samples with limited available volume, and to render the construction less complicated compared to a computer-controlled counterpart, sample injection is carried out hydrodynamically directly from the sample vial into the capillary by manual syphoning. The system is a well performing solution where the financial means for the highly expensive commercial instruments are not available and where the in-house construction of a sophisticated automated instrument is not possible due to limited mechanical and electronic workshop facilities and software programming expertise. For demonstration, the system was employed successfully for the determination of some β-agonists, namely salbutamol, metoprolol and ractopamine down to 0.7ppm in pharmaceutical and pig-feed sample matrices in Vietnam. PMID:25115456

  16. Assessing rare earth elements in quartz rich geological samples.

    PubMed

    Santoro, A; Thoss, V; Guevara, S Ribeiro; Urgast, D; Raab, A; Mastrolitti, S; Feldmann, J

    2016-01-01

    Sodium peroxide (Na2O2) fusion coupled to Inductively Coupled Plasma Tandem Mass Spectrometry (ICP-MS/MS) measurements was used to rapidly screen quartz-rich geological samples for rare earth element (REE) content. The method accuracy was checked with a geological reference material and Instrumental Neutron Activation Analysis (INAA) measurements. The used mass-mode combinations presented accurate results (only exception being (157)Gd in He gas mode) with recovery of the geological reference material QLO-1 between 80% and 98% (lower values for Lu, Nd and Sm) and in general comparable to INAA measurements. Low limits of detection for all elements were achieved, generally below 10 pg g(-1), as well as measurement repeatability below 15%. Overall, the Na2O2/ICP-MS/MS method proved to be a suitable lab-based method to quickly and accurately screen rock samples originating from quartz-rich geological areas for rare earth element content; particularly useful if checking commercial viability. PMID:26595776

  17. AGWA: The Automated Geospatial Watershed Assessment Tool to inform rangeland management

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Do you want a relatively easy to use tool to assess rangeland soil and water conservation practices on rangeland erosion that is specifically designed to use ecological information? Effective rangeland management requires the ability to assess the potential impacts of management actions on soil ero...

  18. Assessing Rotation-Invariant Feature Classification for Automated Wildebeest Population Counts

    PubMed Central

    Torney, Colin J.; Dobson, Andrew P.; Borner, Felix; Lloyd-Jones, David J.; Moyer, David; Maliti, Honori T.; Mwita, Machoke; Fredrick, Howard; Borner, Markus; Hopcraft, J. Grant C.

    2016-01-01

    Accurate and on-demand animal population counts are the holy grail for wildlife conservation organizations throughout the world because they enable fast and responsive adaptive management policies. While the collection of image data from camera traps, satellites, and manned or unmanned aircraft has advanced significantly, the detection and identification of animals within images remains a major bottleneck since counting is primarily conducted by dedicated enumerators or citizen scientists. Recent developments in the field of computer vision suggest a potential resolution to this issue through the use of rotation-invariant object descriptors combined with machine learning algorithms. Here we implement an algorithm to detect and count wildebeest from aerial images collected in the Serengeti National Park in 2009 as part of the biennial wildebeest count. We find that the per image error rates are greater than, but comparable to, two separate human counts. For the total count, the algorithm is more accurate than both manual counts, suggesting that human counters have a tendency to systematically over or under count images. While the accuracy of the algorithm is not yet at an acceptable level for fully automatic counts, our results show this method is a promising avenue for further research and we highlight specific areas where future research should focus in order to develop fast and accurate enumeration of aerial count data. If combined with a bespoke image collection protocol, this approach may yield a fully automated wildebeest count in the near future. PMID:27227888

  19. Assessing Rotation-Invariant Feature Classification for Automated Wildebeest Population Counts.

    PubMed

    Torney, Colin J; Dobson, Andrew P; Borner, Felix; Lloyd-Jones, David J; Moyer, David; Maliti, Honori T; Mwita, Machoke; Fredrick, Howard; Borner, Markus; Hopcraft, J Grant C

    2016-01-01

    Accurate and on-demand animal population counts are the holy grail for wildlife conservation organizations throughout the world because they enable fast and responsive adaptive management policies. While the collection of image data from camera traps, satellites, and manned or unmanned aircraft has advanced significantly, the detection and identification of animals within images remains a major bottleneck since counting is primarily conducted by dedicated enumerators or citizen scientists. Recent developments in the field of computer vision suggest a potential resolution to this issue through the use of rotation-invariant object descriptors combined with machine learning algorithms. Here we implement an algorithm to detect and count wildebeest from aerial images collected in the Serengeti National Park in 2009 as part of the biennial wildebeest count. We find that the per image error rates are greater than, but comparable to, two separate human counts. For the total count, the algorithm is more accurate than both manual counts, suggesting that human counters have a tendency to systematically over or under count images. While the accuracy of the algorithm is not yet at an acceptable level for fully automatic counts, our results show this method is a promising avenue for further research and we highlight specific areas where future research should focus in order to develop fast and accurate enumeration of aerial count data. If combined with a bespoke image collection protocol, this approach may yield a fully automated wildebeest count in the near future. PMID:27227888

  20. Trypanosoma cruzi infectivity assessment in "in vitro" culture systems by automated cell counting.

    PubMed

    Liempi, Ana; Castillo, Christian; Cerda, Mauricio; Droguett, Daniel; Duaso, Juan; Barahona, Katherine; Hernández, Ariane; Díaz-Luján, Cintia; Fretes, Ricardo; Härtel, Steffen; Kemmerling, Ulrike

    2015-03-01

    Chagas disease is an endemic, neglected tropical disease in Latin America that is caused by the protozoan parasite Trypanosoma cruzi. In vitro models constitute the first experimental approach to study the physiopathology of the disease and to assay potential new trypanocidal agents. Here, we report and clearly describe the use of commercial software (MATLAB(®)) to quantify T. cruzi amastigotes and infected mammalian cells (BeWo) and compared this analysis with the manual one. There was no statistically significant difference between the manual and the automatic quantification of the parasite; the two methods showed a correlation analysis r(2) value of 0.9159. The most significant advantage of the automatic quantification was the efficiency of the analysis. The drawback of this automated cell counting method was that some parasites were assigned to the wrong BeWo cell; however, this error did not exceed 5% when adequate experimental conditions were chosen. We conclude that this quantification method constitutes an excellent tool for evaluating the parasite load in cells and therefore constitutes an easy and reliable way to study parasite infectivity. PMID:25553972

  1. Aquatic hazard assessment of a commercial sample of naphthenic acids.

    PubMed

    Swigert, James P; Lee, Carol; Wong, Diana C L; White, Russell; Scarlett, Alan G; West, Charles E; Rowland, Steven J

    2015-04-01

    This paper presents chemical composition and aquatic toxicity characteristics of a commercial sample of naphthenic acids (NAs). Naphthenic acids are derived from the refining of petroleum middle distillates and can contribute to refinery effluent toxicity. NAs are also present in oil sands process-affected water (OSPW), but differences in the NAs compositions from these sources precludes using a common aquatic toxicity dataset to represent the aquatic hazards of NAs from both origins. Our chemical characterization of a commercial sample of NAs showed it to contain in order of abundance, 1-ring>2-ring>acyclic>3-ring acids (∼84%). Also present were monoaromatic acids (7%) and non-acids (9%, polyaromatic hydrocarbons and sulfur heterocyclic compounds). While the acyclic acids were only the third most abundant group, the five most abundant individual compounds were identified as C(10-14) n-acids (n-decanoic acid to n-tetradecanoic acid). Aquatic toxicity testing of fish (Pimephales promelas), invertebrate (Daphnia magna), algae (Pseudokirchneriella subcapitata), and bacteria (Vibrio fischeri) showed P. promelas to be the most sensitive species with 96-h LL50=9.0 mg L(-1) (LC50=5.6 mg L(-1)). Acute EL50 values for the other species ranged 24-46 mg L(-1) (EC50 values ranged 20-30 mg L(-1)). Biomimetic extraction via solid-phase-microextraction (BE-SPME) suggested a nonpolar narcosis mode of toxic action for D. magna, P. subcapitata, and V. fischeri. The BE analysis under-predicted fish toxicity, which indicates that a specific mode of action, besides narcosis, may be a factor for fishes. PMID:25434270

  2. Simultaneous analysis of organochlorinated pesticides (OCPs) and polychlorinated biphenyls (PCBs) from marine samples using automated pressurized liquid extraction (PLE) and Power Prep™ clean-up.

    PubMed

    Helaleh, Murad I H; Al-Rashdan, Amal; Ibtisam, A

    2012-05-30

    An automated pressurized liquid extraction (PLE) method followed by Power Prep™ clean-up was developed for organochlorinated pesticide (OCP) and polychlorinated biphenyl (PCB) analysis in environmental marine samples of fish, squid, bivalves, shells, octopus and shrimp. OCPs and PCBs were simultaneously determined in a single chromatographic run using gas chromatography-mass spectrometry-negative chemical ionization (GC-MS-NCI). About 5 g of each biological marine sample was mixed with anhydrous sodium sulphate and placed in the extraction cell of the PLE system. PLE is controlled by means of a PC using DMS 6000 software. Purification of the extract was accomplished using automated Power Prep™ clean-up with a pre-packed disposable silica column (6 g) supplied by Fluid Management Systems (FMS). All OCPs and PCBs were eluted from the silica column using two types of solvent: 80 mL of hexane and a 50 mL mixture of hexane and dichloromethane (1:1). A wide variety of fish and shellfish were collected from the fish market and analyzed using this method. The total PCB concentrations were 2.53, 0.25, 0.24, 0.24, 0.17 and 1.38 ng g(-1) (w/w) for fish, squid, bivalves, shells, octopus and shrimp, respectively, and the corresponding total OCP concentrations were 30.47, 2.86, 0.92, 10.72, 5.13 and 18.39 ng g(-1) (w/w). Lipids were removed using an SX-3 Bio-Beads gel permeation chromatography (GPC) column. Analytical criteria such as recovery, reproducibility and repeatability were evaluated through a range of biological matrices. PMID:22608412

  3. Assessing the Alcohol-BMI Relationship in a US National Sample of College Students

    ERIC Educational Resources Information Center

    Barry, Adam E.; Piazza-Gardner, Anna K.; Holton, M. Kim

    2015-01-01

    Objective: This study sought to assess the body mass index (BMI)-alcohol relationship among a US national sample of college students. Design: Secondary data analysis using the Fall 2011 National College Health Assessment (NCHA). Setting: A total of 44 US higher education institutions. Methods: Participants included a national sample of college…

  4. Improving Preservice Teacher Preparation through the Teacher Work Sample: Exploring Assessment and Analysis of Student Learning

    ERIC Educational Resources Information Center

    Stobaugh, Rebecca Ruth; Tassell, Janet Lynne; Norman, Antony D.

    2010-01-01

    This study focuses on the Renaissance Teacher Work Sample's critical sections Assessment Plan and Analysis of Student Learning. Preliminary review of scoring data based on the sample revealed that preservice teachers at a large comprehensive institution teacher program appeared to be most challenged with designing assessments and analyzing student…

  5. Isotope Enrichment Detection by Laser Ablation - Laser Absorption Spectrometry: Automated Environmental Sampling and Laser-Based Analysis for HEU Detection

    SciTech Connect

    Anheier, Norman C.; Bushaw, Bruce A.

    2010-01-01

    The global expansion of nuclear power, and consequently the uranium enrichment industry, requires the development of new safeguards technology to mitigate proliferation risks. Current enrichment monitoring instruments exist that provide only yes/no detection of highly enriched uranium (HEU) production. More accurate accountancy measurements are typically restricted to gamma-ray and weight measurements taken in cylinder storage yards. Analysis of environmental and cylinder content samples has much higher effectiveness, but this approach requires onsite sampling, shipping, and time-consuming laboratory analysis and reporting. Given that large modern gaseous centrifuge enrichment plants (GCEPs) can quickly produce a significant quantity (SQ) of HEU, these limitations in verification suggest the need for more timely detection of potential facility misuse. The Pacific Northwest National Laboratory (PNNL) is developing an unattended safeguards instrument concept, combining continuous aerosol particulate collection with uranium isotope assay, to provide timely analysis of enrichment levels within low enriched uranium facilities. This approach is based on laser vaporization of aerosol particulate samples, followed by wavelength tuned laser diode spectroscopy to characterize the uranium isotopic ratio through subtle differences in atomic absorption wavelengths. Environmental sampling (ES) media from an integrated aerosol collector is introduced into a small, reduced pressure chamber, where a focused pulsed laser vaporizes material from a 10 to 20-µm diameter spot on the surface of the sampling media. The plume of ejected material begins as high-temperature plasma that yields ions and atoms, as well as molecules and molecular ions. We concentrate on the plume of atomic vapor that remains after the plasma has expanded and then cooled by the surrounding cover gas. Tunable diode lasers are directed through this plume and each isotope is detected by monitoring absorbance

  6. Verification of performance with the automated direct optical TIRF immunosensor (River Analyser) in single and multi-analyte assays with real water samples.

    PubMed

    Tschmelak, Jens; Proll, Guenther; Gauglitz, Guenter

    2004-11-01

    In order to verify the reproducibility, precision, and robustness of the optical immunosensor River Analyser (RIANA), we investigated two common statistical methods to evaluate the limit of detection (LOD) and the limit of quantification (LOQ). Therefore, we performed a simultaneous multi-analyte calibration with atrazine, bisphenol A, and estrone in Milli-Q water. Using an automated biosensor, it was possible for the first time to achieve a LOD below 0.020 microg L(-1) using a common statistically based method without sample pre-treatment and pre-concentration for each of the analytes in a simultaneous multi-analyte calibration. This biosensor setup shows values comparable to those obtained by more classical analytical methods. Based on this calibration, we measured spiked and un-spiked real water samples with complex matrices (samples from different water bodies, from ground water sources, and tap water samples). The comparison between our River Analyser and common analytical methods (like GC-MS and HPLC-DAD) shows overall comparable values for all three analytes. Furthermore, a calibration of isoproturon (IPU) (in single analyte mode) resulted in a LOD of 0.016 microg L(-1), and a LOQ of 0.091 microg L(-1). In compliance with guidelines of the Association of Analytical Communities (AOAC) International, six out of nine recovery rates (measured concentration divided by real concentration, in percent) for three surface water samples with different matrices (spiked and un-spiked) fell between 70 and 120%, as demanded by the AOAC International guidelines. The reproducibility was checked by measuring replicates of each sample in independent repetitions. Robustness could be demonstrated by long-term stability tests of the biosensor surface. These studies show that the biosensor used offers the necessary reproducibility, precision, and robustness required for an analytical method. PMID:15522589
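
    A common statistically based convention for these quantities takes the LOD and LOQ as 3 and 10 times the blank standard deviation divided by the calibration slope; a hedged sketch with illustrative numbers, not the RIANA calibration data:

        import numpy as np

        def lod_loq(blank_signals, conc, signal):
            slope, _ = np.polyfit(conc, signal, 1)        # linear calibration fit
            sigma = np.std(blank_signals, ddof=1)         # blank standard deviation
            return 3 * sigma / slope, 10 * sigma / slope

        conc = np.array([0.01, 0.05, 0.1, 0.5, 1.0])      # microg/L, illustrative
        signal = np.array([0.8, 4.1, 8.3, 40.2, 81.0])    # arbitrary response units
        blanks = np.array([0.05, 0.11, 0.02, 0.08, 0.06, 0.09])
        lod, loq = lod_loq(blanks, conc, signal)          # both in microg/L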

  7. Using integrated environmental modeling to automate a process-based Quantitative Microbial Risk Assessment

    EPA Science Inventory

    Integrated Environmental Modeling (IEM) organizes multidisciplinary knowledge that explains and predicts environmental-system response to stressors. A Quantitative Microbial Risk Assessment (QMRA) is an approach integrating a range of disparate data (fate/transport, exposure, an...

  8. Using Integrated Environmental Modeling to Automate a Process-Based Quantitative Microbial Risk Assessment (presentation)

    EPA Science Inventory

    Integrated Environmental Modeling (IEM) organizes multidisciplinary knowledge that explains and predicts environmental-system response to stressors. A Quantitative Microbial Risk Assessment (QMRA) is an approach integrating a range of disparate data (fate/transport, exposure, and...

  9. An evaluation of automated broncho-arterial ratios for reliable assessment of bronchiectasis

    NASA Astrophysics Data System (ADS)

    Odry, Benjamin L.; Kiraly, Atilla P.; Novak, Carol L.; Naidich, David P.; Lerallut, Jean-Francois

    2008-03-01

    Bronchiectasis, the permanent dilatation of the airways, is frequently evaluated by computed tomography (CT) in order to determine disease progression and response to treatment. Normal airways have diameters of approximately the same size as their accompanying artery, and most scoring systems for quantifying bronchiectasis severity ask physicians to estimate the broncho-arterial ratio. However, the lack of standardization coupled with inter-observer variability limits diagnostic sensitivity and the ability to make reliable comparisons with follow-up CT studies. We have developed a Computer Aided Diagnosis method to detect airway disease by locating abnormal broncho-arterial ratios. Our approach is based on computing a tree model of the airways followed by automated measurements of broncho-arterial ratios at peripheral airway locations. The artery accompanying a given bronchus is automatically determined by correlation of its orientation and proximity to the airway, while the diameter measurements are based on the full-width half maximum method. This method was previously evaluated subjectively; in this work we quantitatively evaluate the airway and vessel measurements on 9 CT studies and compare the results with three independent readers. The automatically selected artery location was in agreement with the readers in 75.3% of the cases compared with 65.6% agreement of the readers with each other. The reader-computer variability in lumen diameters (7%) was slightly lower than that of the readers with respect to each other (9%), whereas the reader-computer variability in artery diameter (18%) was twice that of the readers (8%), but still acceptable for detecting disease. We conclude that the automatic system has comparable accuracy to that of readers, while providing greater speed and consistency.
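
    The full-width half maximum measurement mentioned above can be sketched on a 1-D intensity profile drawn across a bright structure such as the accompanying artery; the profile and pixel spacing are hypothetical, and the paper's implementation may differ in detail.

        import numpy as np

        def fwhm_mm(profile, spacing_mm):
            profile = np.asarray(profile, dtype=float)
            half = (profile.max() + profile.min()) / 2.0   # level halfway between background and peak
            above = np.where(profile >= half)[0]
            return (above[-1] - above[0]) * spacing_mm if above.size >= 2 else 0.0

        vessel_profile = [5, 6, 8, 40, 95, 100, 97, 42, 9, 6, 5]   # dark background, bright vessel
        diameter = fwhm_mm(vessel_profile, spacing_mm=0.6)          # ~1.2 mm in this toy profile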

  10. Detection of coronary calcifications from computed tomography scans for automated risk assessment of coronary artery disease

    SciTech Connect

    Isgum, Ivana; Rutten, Annemarieke; Prokop, Mathias; Ginneken, Bram van

    2007-04-15

    A fully automated method for coronary calcification detection from non-contrast-enhanced, ECG-gated multi-slice computed tomography (CT) data is presented. Candidates for coronary calcifications are extracted by thresholding and component labeling. These candidates include coronary calcifications, calcifications in the aorta and in the heart, and other high-density structures such as noise and bone. A dedicated set of 64 features is calculated for each candidate object. They characterize the object's spatial position relative to the heart and the aorta, for which an automatic segmentation scheme was developed, its size and shape, and its appearance, which is described by a set of approximated Gaussian derivatives for which an efficient computational scheme is presented. Three classification strategies were designed. The first one tested direct classification without feature selection. The second approach also utilized direct classification, but with feature selection. Finally, the third scheme employed two-stage classification. In a computationally inexpensive first stage, the most easily recognizable false positives were discarded. The second stage discriminated between more difficult to separate coronary calcium and other candidates. Performance of linear, quadratic, nearest neighbor, and support vector machine classifiers was compared. The method was tested on 76 scans containing 275 calcifications in the coronary arteries and 335 calcifications in the heart and aorta. The best performance was obtained employing a two-stage classification system with a k-nearest neighbor (k-NN) classifier and a feature selection scheme. The method detected 73.8% of coronary calcifications at the expense of on average 0.1 false positives per scan. A calcium score was computed for each scan and subjects were assigned one of four risk categories based on this score. The method assigned the correct risk category to 93.4% of all scans.
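
    The candidate-extraction step (thresholding plus connected-component labeling) can be sketched as below; the 130 HU threshold and size limits are common conventions assumed here for illustration, and the feature computation and two-stage k-NN classification of the paper are omitted.

        import numpy as np
        from scipy import ndimage

        def calcification_candidates(ct_hu, threshold_hu=130, min_voxels=3, max_voxels=2000):
            mask = ct_hu >= threshold_hu                   # bright voxels
            labels, n = ndimage.label(mask)                # connected-component labeling
            candidates = []
            for lab in range(1, n + 1):
                component = labels == lab
                size = int(component.sum())
                if min_voxels <= size <= max_voxels:       # drop single-voxel noise and large bone
                    candidates.append({"voxels": size,
                                       "centroid": ndimage.center_of_mass(component)})
            return candidates
        # each candidate would then be described by position/size/appearance features
        # and passed to the classifier stage discussed in the abstract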

  11. A fully automated effervescence assisted dispersive liquid-liquid microextraction based on a stepwise injection system. Determination of antipyrine in saliva samples.

    PubMed

    Medinskaia, Kseniia; Vakh, Christina; Aseeva, Darina; Andruch, Vasil; Moskvin, Leonid; Bulatov, Andrey

    2016-01-01

    A first attempt to automate the effervescence assisted dispersive liquid-liquid microextraction (EA-DLLME) has been reported. The method is based on the aspiration of a sample and all required aqueous reagents into the stepwise injection analysis (SWIA) manifold, followed by simultaneous counterflow injection of the extraction solvent (dichloromethane), the mixture of the effervescence agent (0.5 mol L(-1) Na2CO3) and the proton donor solution (1 mol L(-1) CH3COOH). Formation of carbon dioxide microbubbles generated in situ leads to the dispersion of the extraction solvent in the whole aqueous sample and extraction of the analyte into the organic phase. Unlike conventional DLLME, EA-DLLME avoids the addition of a dispersive solvent as well as the time-consuming centrifugation step for disruption of the cloudy state. The phase separation was achieved by gentle bubbling of a nitrogen stream (2 mL min(-1) during 2 min). The performance of the suggested approach is demonstrated by the determination of antipyrine in saliva samples. The procedure is based on the derivatization of antipyrine by nitrite-ion followed by EA-DLLME of 4-nitrosoantipyrine and subsequent UV-Vis detection using the SWIA manifold. The absorbance of the yellow-colored extract at the wavelength of 345 nm obeys Beer's law in the range of 1.5-100 µmol L(-1) of antipyrine in saliva. The LOD, calculated from a blank test based on 3σ, was 0.5 µmol L(-1). PMID:26703262

  12. Higher-Order Exploratory Factor Analysis of the Reynolds Intellectual Assessment Scales with a Referred Sample

    ERIC Educational Resources Information Center

    Nelson, Jason M.; Canivez, Gary L.; Lindstrom, Will; Hatt, Clifford V.

    2007-01-01

    The factor structure of the Reynolds Intellectual Assessment Scales (RIAS; [Reynolds, C.R., & Kamphaus, R.W. (2003). "Reynolds Intellectual Assessment Scales". Lutz, FL: Psychological Assessment Resources, Inc.]) was investigated with a large (N=1163) independent sample of referred students (ages 6-18). More rigorous factor extraction criteria…

  13. Sequential sampling: cost-effective approach for monitoring benthic macroinvertebrates in environmental impact assessements

    SciTech Connect

    Resh, V.H.; Price, D.G.

    1984-01-01

    Sequential sampling is a method for monitoring benthic macroinvertebrates that can significantly reduce the number of samples required to reach a decision, and consequently, decrease the cost of benthic sampling in environmental impact assessments. Rather than depending on a fixed number of samples, this analysis cumulatively compares measured parameter values (for example, density, community diversity) from individual samples, with thresholds that are based on specified degrees of precision. In addition to reducing sample size, a monitoring program based on sequential sampling can provide clear-cut decisions as to whether a priori-defined changes in the measured parameter(s) have or have not occurred.
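
    The decision rule can be sketched as a running comparison of the cumulative count against lower and upper decision lines; the linear boundaries below are illustrative (SPRT-style), not the thresholds derived in the paper.

        # After each sample, stop as soon as the cumulative count crosses a decision line.
        def sequential_decision(counts, slope=25.0, lower_offset=-30.0, upper_offset=30.0):
            total = 0.0
            for n, count in enumerate(counts, start=1):
                total += count
                if total <= slope * n + lower_offset:
                    return n, "no impact (accept)"
                if total >= slope * n + upper_offset:
                    return n, "impact detected (reject)"
            return len(counts), "no decision - continue sampling"

        n_used, decision = sequential_decision([12, 9, 15, 8, 11, 10])   # decides after 3 samples here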

  14. An Automated Grass-Based Procedure to Assess the Geometrical Accuracy of the Openstreetmap Paris Road Network

    NASA Astrophysics Data System (ADS)

    Brovelli, M. A.; Minghini, M.; Molinari, M. E.

    2016-06-01

    OpenStreetMap (OSM) is the largest spatial database of the world. One of the most frequently occurring geospatial elements within this database is the road network, whose quality is crucial for applications such as routing and navigation. Several methods have been proposed for the assessment of OSM road network quality, however they are often tightly coupled to the characteristics of the authoritative dataset involved in the comparison. This makes it hard to replicate and extend these methods. This study relies on an automated procedure which was recently developed for comparing OSM with any road network dataset. It is based on three Python modules for the open source GRASS GIS software and provides measures of OSM road network spatial accuracy and completeness. Provided that the user is familiar with the authoritative dataset used, he can adjust the values of the parameters involved thanks to the flexibility of the procedure. The method is applied to assess the quality of the Paris OSM road network dataset through a comparison against the French official dataset provided by the French National Institute of Geographic and Forest Information (IGN). The results show that the Paris OSM road network has both a high completeness and spatial accuracy. It has a greater length than the IGN road network, and is found to be suitable for applications requiring spatial accuracies up to 5-6 m. Also, the results confirm the flexibility of the procedure for supporting users in carrying out their own comparisons between OSM and reference road datasets.
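
    A generic buffer-overlap completeness measure of the kind used in such road-network comparisons can be sketched as follows; this is an illustration under assumed parameters, not the actual GRASS GIS modules developed in the paper. Coordinates are in metres in a projected reference system.

        from shapely.geometry import LineString
        from shapely.ops import unary_union

        def completeness(osm_roads, reference_roads, buffer_m=10.0):
            ref_zone = unary_union([r.buffer(buffer_m) for r in reference_roads])
            matched = sum(road.intersection(ref_zone).length for road in osm_roads)
            total = sum(road.length for road in osm_roads)
            return matched / total if total else 0.0     # share of OSM length near the reference

        osm = [LineString([(0, 0), (100, 0)]), LineString([(0, 50), (100, 60)])]
        ref = [LineString([(0, 2), (100, 2)])]
        print(f"completeness = {completeness(osm, ref):.0%}")   # ~50% in this toy example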

  15. Assessment of social cognition in non-human primates using a network of computerized automated learning device (ALDM) test systems.

    PubMed

    Fagot, Joël; Marzouki, Yousri; Huguet, Pascal; Gullstrand, Julie; Claidière, Nicolas

    2015-01-01

    Fagot & Paleressompoulle(1) and Fagot & Bonte(2) have published an automated learning device (ALDM) for the study of cognitive abilities of monkeys maintained in semi-free ranging conditions. Data accumulated during the last five years have consistently demonstrated the efficiency of this protocol to investigate individual/physical cognition in monkeys, and have further shown that this procedure reduces stress level during animal testing(3). This paper demonstrates that networks of ALDM can also be used to investigate different facets of social cognition and in-group expressed behaviors in monkeys, and describes three illustrative protocols developed for that purpose. The first study demonstrates how ethological assessments of social behavior and computerized assessments of cognitive performance could be integrated to investigate the effects of socially exhibited moods on the cognitive performance of individuals. The second study shows that batteries of ALDM running in parallel can provide unique information on the influence of the presence of others on task performance. Finally, the last study shows that networks of ALDM test units can also be used to study issues related to social transmission and cultural evolution. Combined together, these three studies demonstrate clearly that ALDM testing is a highly promising experimental tool for bridging the gap in the animal literature between research on individual cognition and research on social cognition. PMID:25992495

  16. Shorter sampling periods and accurate estimates of milk volume and components are possible for pasture based dairy herds milked with automated milking systems.

    PubMed

    Kamphuis, Claudia; Burke, Jennie K; Taukiri, Sarah; Petch, Susan-Fay; Turner, Sally-Anne

    2016-08-01

    Dairy cows grazing pasture and milked using automated milking systems (AMS) have lower milking frequencies than indoor fed cows milked using AMS. Therefore, milk recording intervals used for herd testing indoor fed cows may not be suitable for cows on pasture based farms. We hypothesised that accurate standardised 24 h estimates could be determined for AMS herds with milk recording intervals of less than the Gold Standard (48 h), but that the optimum milk recording interval would depend on the herd average for milking frequency. The Gold Standard protocol was applied on five commercial dairy farms with AMS, between December 2011 and February 2013. From 12 milk recording test periods, involving 2211 cow-test days and 8049 cow milkings, standardised 24 h estimates for milk volume and milk composition were calculated for the Gold Standard protocol and compared with those collected during nine alternative sampling scenarios, including six shorter sampling periods and three in which a fixed number of milk samples per cow were collected. Results indicate that a 48 h milk recording protocol is unnecessarily long for collecting accurate estimates during milk recording on pasture based AMS farms. Collection of only two milk samples per cow was optimal in terms of high concordance correlation coefficients for milk volume and components and a low proportion of missed cow-test days. Further research is required to determine the effects of diurnal variations in milk composition on standardised 24 h estimates for milk volume and components, before a protocol based on a fixed number of samples could be considered. Based on the results of this study, New Zealand has adopted a split protocol for herd testing based on the average milking frequency for the herd (NZ Herd Test Standard 8100:2015). PMID:27600967
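
    The agreement statistic referred to above, the concordance correlation coefficient, can be computed as in the sketch below; the milk-volume figures are hypothetical.

        import numpy as np

        def concordance_ccc(x, y):
            # Lin's CCC: penalises both scatter about the 45-degree line and mean offset.
            x, y = np.asarray(x, float), np.asarray(y, float)
            sxy = np.mean((x - x.mean()) * (y - y.mean()))
            return 2 * sxy / (x.var() + y.var() + (x.mean() - y.mean()) ** 2)

        gold_standard = [22.1, 18.4, 25.0, 30.2, 19.8]    # 48 h protocol estimates (L/day), hypothetical
        short_period  = [21.8, 18.9, 24.6, 29.8, 20.1]    # shorter-interval estimates, hypothetical
        print(round(concordance_ccc(gold_standard, short_period), 3))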

  17. Evaluation of the RapidHIT™ 200, an automated human identification system for STR analysis of single source samples.

    PubMed

    Holland, Mitchell; Wendt, Frank

    2015-01-01

    The RapidHIT™ 200 Human Identification System was evaluated to determine its suitability for STR analysis of single-source buccal swabs. Overall, the RapidHIT™ 200 performed as well as our traditional capillary electrophoresis-based method in producing usable profile information on a first-pass basis. General observations included 100% concordance with known profile information, consistent instrument performance after two weeks of buccal swab storage, and an absence of contamination in negative controls. When data analysis was performed by the instrument software, 95.3% of the 85 samples in the reproducibility study gave full profiles. Including the 81 full profiles, a total of 2682 alleles were correctly called by the instrument software, or 98.6% of the 2720 possible alleles tested. Profile information was generated from as few as 10,000 nucleated cells, with swab collection technique being a major contributing factor to profile quality. The average peak-height ratio for heterozygote profiles (81%) was comparable to conventional STR analysis, and while a high analytical threshold was required when offline profile analysis was performed (800 RFU), it was proportionally consistent with traditional methods. Stochastic sampling effects were evaluated, and a manageable approach to addressing limits of detection for homozygote profiles is provided. These results support consideration of the RapidHIT™ 200 as an acceptable alternative to conventional, laboratory-based STR analysis for the testing of single-source buccal samples, with review of profile information remaining a requirement until an expert software system is incorporated and proper developmental and internal validation studies have been completed. PMID:25286443

  18. Automated determination of urinary Na+, K+, chloride, inorganic phosphate, urea, and creatinine without sample dilution, with the "RA-XT".

    PubMed

    Fesus, P D; Pressac, M; Braconnier, F; Aymard, P

    1989-03-01

    We describe how concentrations of chloride, urea, inorganic phosphate, and creatinine in urine can be measured directly, without manual sample dilution, in a discrete analyzer (the Technicon "RA-XT"). These methods were accurate for concentrations of chloride up to 280 mmol/L, urea up to 500 mmol/L, inorganic phosphate up to 50 mmol/L, and creatinine up to 30 mmol/L. CVs are less than 3%, and results correlate well with those obtained by continuous-flow analysis (SMA-II). All these reagents are stable at room temperature for three weeks. Analyses are easy to perform, and only infrequent calibration is required. PMID:2920416

  19. Assessment of an Automated Touchdown Detection Algorithm for the Orion Crew Module

    NASA Technical Reports Server (NTRS)

    Gay, Robert S.

    2011-01-01

    Orion Crew Module (CM) touchdown detection is critical to activating the post-landing sequence that safes the Reaction Control System (RCS) jets, ensures that the vehicle remains upright, and establishes communication with recovery forces. In order to accommodate safe landing of an unmanned vehicle or an incapacitated crew, an onboard automated detection system is required. An Orion-specific touchdown detection algorithm was developed and evaluated to differentiate landing events from in-flight events. The proposed method will be used to initiate post-landing cutting of the parachute riser lines, to prevent CM rollover, and to terminate RCS jet firing prior to submersion. The RCS jets continue to fire until touchdown to maintain proper CM orientation with respect to the flight path and to limit impact loads, but have potentially hazardous consequences if submerged while firing. The time available after impact to cut risers and initiate the CM Up-righting System (CMUS) is measured in minutes, whereas the time from touchdown to RCS jet submersion depends on descent velocity and sea state conditions and is often less than one second. Evaluation of the detection algorithms was performed for in-flight events (e.g., descent under chutes) using high-fidelity rigid body analyses in the Decelerator Systems Simulation (DSS), whereas water impacts were simulated using a rigid finite element model of the Orion CM in LS-DYNA. Two touchdown detection algorithms were evaluated with various thresholds: acceleration-magnitude spike detection, and accumulated velocity change (over a given time window) spike detection. Data for both detection methods are acquired from an onboard Inertial Measurement Unit (IMU) sensor. The detection algorithms were tested with analytically generated in-flight and landing IMU data simulations. The acceleration spike detection proved to be faster while maintaining the desired safety margin. Time to RCS jet submersion was predicted analytically across a series of
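    For illustration only, a minimal sketch of the two detection ideas described above (an acceleration-magnitude spike test and an accumulated velocity-change test over a sliding window) is given below. The thresholds, window length, and interface are hypothetical assumptions; the actual Orion algorithm and its tuned parameters are not given in this record.

```python
import numpy as np

def detect_touchdown(accel, dt, accel_thresh, dv_thresh, window):
    """Return the index of the first sample at which either detector trips, or None.

    accel        : (N, 3) array of IMU specific-force samples [m/s^2]
    dt           : sample period [s]
    accel_thresh : acceleration-magnitude threshold [m/s^2]
    dv_thresh    : accumulated velocity-change threshold [m/s]
    window       : length of the accumulation window [samples]
    """
    mag = np.linalg.norm(accel, axis=1)

    # Detector 1: acceleration-magnitude spike
    spike = mag > accel_thresh

    # Detector 2: accumulated velocity change over a trailing window
    dv = np.convolve(mag * dt, np.ones(window))[: len(mag)]
    accumulated = dv > dv_thresh

    hits = np.flatnonzero(spike | accumulated)
    return int(hits[0]) if hits.size else None
```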

  20. Determination of polycyclic aromatic hydrocarbons in food samples by automated on-line in-tube solid-phase microextraction coupled with high-performance liquid chromatography-fluorescence detection.

    PubMed

    Ishizaki, A; Saito, K; Hanioka, N; Narimatsu, S; Kataoka, H

    2010-08-27

    A simple and sensitive automated method, consisting of in-tube solid-phase microextraction (SPME) coupled with high-performance liquid chromatography-fluorescence detection (HPLC-FLD), was developed for the determination of 15 polycyclic aromatic hydrocarbons (PAHs) in food samples. PAHs were separated within 15 min by HPLC using a Zorbax Eclipse PAH column with a water/acetonitrile gradient elution program as the mobile phase. The optimum in-tube SPME conditions were 20 draw/eject cycles of 40 microL of sample using a CP-Sil 19CB capillary column as an extraction device. Low- and high-molecular weight PAHs were extracted effectively onto the capillary coating from 5% and 30% methanol solutions, respectively. The extracted PAHs were readily desorbed from the capillary by passage of the mobile phase, and no carryover was observed. Using the in-tube SPME HPLC-FLD method, good linearity of the calibration curve (r>0.9972) was obtained in the concentration range of 0.05-2.0 ng/mL, and the detection limits (S/N=3) of PAHs were 0.32-4.63 pg/mL. The in-tube SPME method showed 18-47 fold higher sensitivity than the direct injection method. The intra-day and inter-day precision (relative standard deviations) for a 1 ng/mL PAH mixture were below 5.1% and 7.6% (n=5), respectively. This method was applied successfully to the analysis of tea products and dried food samples without interference peaks, and the recoveries of PAHs spiked into the tea samples were >70%. Low-molecular weight PAHs such as naphthalene and pyrene were detected in many foods, and carcinogenic benzo[a]pyrene, at relatively high concentrations, was also detected in some black tea samples. This method was also utilized to assess the release of PAHs from tea leaves into the liquor. PMID:20637468

  1. Integration and Evaluation of CAL Courseware and Automated Assessment in the Delivery of a Geography Module.

    ERIC Educational Resources Information Center

    Towse, Raymond J.; Garside, Peter

    1998-01-01

    Discusses software and logistical issues and proposed solutions in the development of a system for delivery of flexible and distance-based learning using a computer-assisted learning module and computer-based testing. Assesses initial evaluative responses to the new system and to this mode of learning in general. (DSK)

  2. Force sensing system for automated assessment of motor performance during fMRI.

    PubMed

    Rogers, Bill; Zhang, Wei; Narayana, Shalini; Lancaster, Jack L; Robin, Donald A; Fox, Peter T

    2010-06-30

    Finger tapping sequences are a commonly used measure of motor learning in functional imaging studies. Subjects repeat a defined sequence of finger taps as fast as possible for a set period of time. The number of sequences completed per unit time is the measure of performance. Assessment of speed and accuracy is generally accomplished by video recording the session and then replaying it in slow motion to assess rate and accuracy. This is a time-consuming and error-prone process. Keyboards and instrumented gloves have also been used for task assessment, though they are relatively expensive and not usually compatible with a magnetic resonance imaging (MRI) scanner. To address these problems, we developed a low-cost system using MRI-compatible force-sensitive resistors (FSR) to assess performance during a finger sequence task. This system additionally provides information on finger coordination, including time between sequences, intervals between taps, and tap duration. The method was validated by comparing the FSR system results with results obtained by video analysis during the same session. PMID:20417235
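    As an example of the post-processing such a force-sensing system enables, the sketch below extracts tap durations and inter-tap intervals from a single FSR channel by simple thresholding. The threshold, sampling rate, and function names are assumptions for the example and are not the authors' analysis code.

```python
import numpy as np

def extract_taps(force, fs, threshold):
    """Tap durations and inter-tap intervals (seconds) from one FSR trace.

    force     : 1-D array of force samples from a single sensor
    fs        : sampling rate [Hz]
    threshold : force level separating finger contact from no contact
    """
    pressed = np.asarray(force) > threshold
    edges = np.diff(pressed.astype(int))
    onsets = np.flatnonzero(edges == 1) + 1
    offsets = np.flatnonzero(edges == -1) + 1
    if offsets.size and onsets.size and offsets[0] < onsets[0]:
        offsets = offsets[1:]                  # trace started mid-press
    n = min(onsets.size, offsets.size)
    durations = (offsets[:n] - onsets[:n]) / fs
    intervals = np.diff(onsets[:n]) / fs       # onset-to-onset intervals
    return durations, intervals
```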

  3. GIS-BASED HYDROLOGIC MODELING: THE AUTOMATED GEOSPATIAL WATERSHED ASSESSMENT TOOL

    EPA Science Inventory

    Planning and assessment in land and water resource management are evolving from simple, local scale problems toward complex, spatially explicit regional ones. Such problems have to be addressed with distributed models that can compute runoff and erosion at different spatial a...

  4. AGWA: The Automated Geospatial Watershed Assessment Tool to Inform Rangeland Management

    EPA Science Inventory

    Do you want a relatively easy to use tool to assess rangeland soil and water conservation practices on rangeland erosion that is specifically designed to use ecological information? New Decision Support Tools (DSTs) that are easy-to-use, incorporate ecological concepts and rangel...

  5. Direct Sampling and Analysis from Solid Phase Extraction Cards using an Automated Liquid Extraction Surface Analysis Nanoelectrospray Mass Spectrometry System

    SciTech Connect

    Walworth, Matthew J; ElNaggar, Mariam S; Stankovich, Joseph J; Witkowski II, Charles E.; Norris, Jeremy L; Van Berkel, Gary J

    2011-01-01

    Direct liquid extraction-based surface sampling, a technique previously demonstrated with continuous flow and autonomous pipette liquid microjunction surface sampling probes, has recently been implemented as the Liquid Extraction Surface Analysis (LESA) mode on the commercially available Advion NanoMate chip-based infusion nanoelectrospray ionization system. In the present paper, the LESA mode was applied to the analysis of 96-well format custom solid phase extraction (SPE) cards, with each well consisting of either a 1 or 2 mm diameter monolithic hydrophobic stationary phase. These substrate wells were conditioned, loaded with either single or multi-component aqueous mixtures, and read out using the LESA mode of a TriVersa NanoMate or a NanoMate 100 coupled to an ABI/Sciex 4000 QTRAP hybrid triple-quadrupole/linear ion trap mass spectrometer and a Thermo LTQ XL linear ion trap mass spectrometer. Extraction conditions, including extraction/nanoESI solvent composition, volume, and dwell times, were optimized in the analysis of targeted compounds. Limit of detection and quantitation as well as analysis reproducibility figures of merit were measured. Calibration data were obtained for propranolol using a deuterated internal standard, demonstrating linearity and reproducibility. A 10x increase in signal, along with cleanup of micromolar Angiotensin II from a concentrated salt solution, was demonstrated. Additionally, a multicomponent herbicide mixture at ppb concentration levels was analyzed using MS3 spectra for compound identification in the presence of isobaric interferences.

  6. Quality assurance guidance for field sampling and measurement assessment plates in support of EM environmental sampling and analysis activities

    SciTech Connect

    Not Available

    1994-05-01

    This document is one of several guidance documents developed by the US Department of Energy (DOE) Office of Environmental Restoration and Waste Management (EM). These documents support the EM Analytical Services Program (ASP) and are based on applicable regulatory requirements and DOE Orders. They address requirements in DOE Orders by providing guidance that pertains specifically to environmental restoration and waste management sampling and analysis activities. DOE 5700.6C Quality Assurance (QA) defines policy and requirements to establish QA programs ensuring that risks and environmental impacts are minimized and that safety, reliability, and performance are maximized. This is accomplished through the application of effective management systems commensurate with the risks imposed by the facility and the project. Every organization supporting EM's environmental sampling and analysis activities must develop and document a QA program. Management of each organization is responsible for appropriate QA program implementation, assessment, and improvement. The collection of credible and cost-effective environmental data is critical to the long-term success of remedial and waste management actions performed at DOE facilities. Only well established and management supported assessment programs within each EM-support organization will enable DOE to demonstrate data quality. The purpose of this series of documents is to offer specific guidance for establishing an effective assessment program for EM's environmental sampling and analysis (ESA) activities.

  7. Test-retest reliability analysis of the Cambridge Neuropsychological Automated Tests for the assessment of dementia in older people living in retirement homes.

    PubMed

    Gonçalves, Marta Matos; Pinho, Maria Salomé; Simões, Mário R

    2016-01-01

    The validity of the Cambridge Neuropsychological Automated Tests has been widely studied, but their reliability has not. This study aimed to estimate the test-retest reliability of these tests in a sample of 34 older adults, aged 69 to 90 years old, without neuropsychiatric diagnoses and living in retirement homes in the district of Lisbon, Portugal. The battery was administered twice, with a 4-week interval between sessions. The Paired Associates Learning (PAL), Spatial Working Memory (SWM), Rapid Visual Information Processing, and Reaction Time tests revealed measures with high-to-adequate test-retest correlations (.71-.89), although several PAL and SWM measures showed susceptibility to practice effects. Two estimated standardized regression-based methods were found to be more efficient at correcting for practice effects than a method of fixed correction. We also found weak test-retest correlations (.56-.68) for several measures. These results suggest that some, but not all, measures are suitable for cognitive assessment and monitoring in this population. PMID:26574661

  8. AUTOMATED GEOSPATIAL WATERSHED ASSESSMENT (AGWA): A GIS-BASED HYDROLOGIC MODELING TOOL FOR LANDSCAPE ASSESSMENT AND WATERSHED MANAGEMENT

    EPA Science Inventory

    The assessment of land use and land cover is an extremely important activity for contemporary land management. A large body of current literature suggests that human land-use practice is the most important factor influencing natural resource management and environmental condition...

  9. Automated solid-phase extraction and liquid chromatography-electrospray ionization-mass spectrometry for the determination of flunitrazepam and its metabolites in human urine and plasma samples.

    PubMed

    Jourdil, N; Bessard, J; Vincent, F; Eysseric, H; Bessard, G

    2003-05-25

    A sensitive and specific method using reversed-phase liquid chromatography coupled with electrospray ionization-mass spectrometry (LC-ESI-MS) has been developed for the quantitative determination of flunitrazepam (F) and its metabolites 7-aminoflunitrazepam (7-AF), N-desmethylflunitrazepam (N-DMF) and 3-hydroxyflunitrazepam (3-OHF) in biological fluids. After the addition of deuterium-labelled standards of F, 7-AF and N-DMF, the drugs were isolated from urine or plasma by automated solid-phase extraction, then chromatographed in an isocratic elution mode with a salt-free eluent. The quantification was performed using selected ion monitoring of the protonated molecular ions (M+H(+)). Experiments were carried out to improve the extraction recovery (81-100%) and the sensitivity (limit of detection 0.025 ng/ml for F and 7-AF, 0.040 ng/ml for N-DMF and 0.200 ng/ml for 3-OHF). The method was applied to the determination of F and metabolites in samples from drug addicts, including withdrawal urine samples, and in plasma and urine samples from one date-rape case. PMID:12705961

  10. Automated trace determination of earthy-musty odorous compounds in water samples by on-line purge-and-trap-gas chromatography-mass spectrometry.

    PubMed

    Salemi, Amir; Lacorte, Sílvia; Bagheri, Habib; Barceló, Damià

    2006-12-15

    An automated technique based on purge-and-trap coupled to gas chromatography with mass spectrometric detection has been developed and optimized for the trace determination of five of the most important water odorants: 2-isopropyl-3-methoxypyrazine, 2-isobutyl-3-methoxypyrazine, 2-methylisoborneol, 2,4,6-trichloroanisole and geosmin. The extraction method was completely solvent-free. Analytes were purged from 20 ml of water sample containing sodium chloride at room temperature by a flow of He and trapped on a Tenax sorbent. The desorption step was performed with helium and temperature programming, and the desorbed analytes were directly transferred to a gas chromatograph coupled to a mass spectrometer for separation and determination. The method was reproducible (RSD<8%) and linear over the calibration range (10-200 ng l(-1)). The relative recoveries of the analytes from a ground water sample were between 80 and 103%, and limits of detection (LOD) below odor thresholds were achieved for most of the compounds. PMID:17055519

  11. Automated Technology for In-home Fall Risk Assessment and Detection Sensor System

    PubMed Central

    Rantz, Marilyn J.; Skubic, Marjorie; Abbott, Carmen; Galambos, Colleen; Pak, Youngju; Ho, Dominic K.C.; Stone, Erik E.; Rui, Liyang; Back, Jessica; Miller, Steven J.

    2013-01-01

    Falls are a major problem for older adults. A continuous, unobtrusive, environmentally mounted in-home monitoring system that automatically detects when falls have occurred or when the risk of falling is increasing could alert health care providers and family members so they could intervene to improve physical function or manage illnesses that are precipitating falls. Researchers at the University of Missouri (MU) Center for Eldercare and Rehabilitation Technology are testing such sensor systems for fall risk assessment and detection in older adults’ apartments in a senior living community. Initial results comparing ground truth fall risk assessment data and GAITRite gait parameters with gait parameters captured from Microsoft Kinect and pulse-Doppler radar are reported. PMID:23675644

  12. An automated headspace solid-phase microextraction followed by gas chromatography–mass spectrometry method to determine macrocyclic musk fragrances in wastewater samples.

    PubMed

    Vallecillos, Laura; Borrull, Francesc; Pocurull, Eva

    2013-11-01

    A fully automated method has been developed for determining eight macrocyclic musk fragrances in wastewater samples. The method is based on headspace solid-phase microextraction (HS-SPME) followed by gas chromatography–mass spectrometry (GC-MS). Five different fibres (PDMS 7 μm, PDMS 30 μm, PDMS 100 μm, PDMS/DVB 65 μm and PA 85 μm) were tested. The best conditions were achieved when a PDMS/DVB 65 μm fibre was exposed for 45 min in the headspace of 10 mL water samples at 100 °C. Method detection limits were found in the low ng L−1 range, between 0.75 and 5 ng L−1 depending on the target analyte. Moreover, under optimized conditions, the method gave good levels of intra-day and inter-day repeatability in wastewater samples, with relative standard deviations (n = 5, 1,000 ng L−1) of less than 9% and 14%, respectively. The applicability of the method was tested with influent and effluent urban wastewater samples from different wastewater treatment plants (WWTPs). The analysis of influent urban wastewater revealed the presence of most of the target macrocyclic musks, with maximum ambrettolide concentrations of 4.36 μg L−1 in WWTP A and 12.29 μg L−1 in WWTP B. The analysis of effluent urban wastewater showed a decrease in target analyte concentrations, with exaltone and ambrettolide being the most abundant compounds with concentrations varying between below method quantification limit (

  13. Automated Tracking of Quantitative Assessments of Tumor Burden in Clinical Trials

    PubMed Central

    Rubin, Daniel L; Willrett, Debra; O'Connor, Martin J; Hage, Cleber; Kurtz, Camille; Moreira, Dilvan A

    2014-01-01

    There are two key challenges hindering effective use of quantitative assessment of imaging in cancer response assessment: 1) Radiologists usually describe the cancer lesions in imaging studies subjectively and sometimes ambiguously, and 2) it is difficult to repurpose imaging data, because lesion measurements are not recorded in a format that permits machine interpretation and interoperability. We have developed a freely available software platform on the basis of open standards, the electronic Physician Annotation Device (ePAD), to tackle these challenges in two ways. First, ePAD facilitates the radiologist in carrying out cancer lesion measurements as part of routine clinical trial image interpretation workflow. Second, ePAD records all image measurements and annotations in a data format that permits repurposing image data for analyses of alternative imaging biomarkers of treatment response. To determine the impact of ePAD on radiologist efficiency in quantitative assessment of imaging studies, a radiologist evaluated computed tomography (CT) imaging studies from 20 subjects having one baseline and three consecutive follow-up imaging studies with and without ePAD. The radiologist made measurements of target lesions in each imaging study using Response Evaluation Criteria in Solid Tumors 1.1 criteria, initially with the aid of ePAD, and then after a 30-day washout period, the exams were reread without ePAD. The mean total time required to review the images and summarize measurements of target lesions was 15% (P < .039) shorter using ePAD than without using this tool. In addition, it was possible to rapidly reanalyze the images to explore lesion cross-sectional area as an alternative imaging biomarker to linear measure. We conclude that ePAD appears promising to potentially improve reader efficiency for quantitative assessment of CT examinations, and it may enable discovery of future novel image-based biomarkers of cancer treatment response. PMID:24772204
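    For context, Response Evaluation Criteria in Solid Tumors (RECIST) 1.1 reduces target-lesion measurements to a sum of longest diameters and a percent change; the sketch below shows a simplified classification on that basis. It uses the baseline sum in place of the nadir and ignores new lesions and lymph-node rules, and it is a hypothetical illustration rather than ePAD's implementation.

```python
def recist_category(baseline_diams_mm, followup_diams_mm):
    """Simplified RECIST 1.1 response class from target-lesion longest diameters.

    Teaching sketch only: baseline is used as the reference for progression
    instead of the true nadir, and new lesions are ignored. Assumes a
    non-zero baseline sum.
    """
    base = sum(baseline_diams_mm)
    follow = sum(followup_diams_mm)
    change_pct = (follow - base) / base * 100.0
    if follow == 0:
        return "CR", change_pct                    # complete response
    if change_pct <= -30:
        return "PR", change_pct                    # partial response
    if change_pct >= 20 and (follow - base) >= 5:
        return "PD", change_pct                    # progressive disease
    return "SD", change_pct                        # stable disease
```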

  14. Semi-automated Volumetric and Morphological Assessment of Glioblastoma Resection with Fluorescence-Guided Surgery

    PubMed Central

    Cordova, J. Scott; Gurbani, Saumya S.; Holder, Chad A.; Olson, Jeffrey J.; Schreibmann, Eduard; Shi, Ran; Guo, Ying; Shu, Hui-Kuo G.; Shim, Hyunsuk; Hadjipanayis, Costas G.

    2016-01-01

    Purpose: Glioblastoma (GBM) neurosurgical resection relies on contrast-enhanced MRI-based neuronavigation. However, it is well known that infiltrating tumor extends beyond contrast enhancement. Fluorescence-guided surgery (FGS) using 5-aminolevulinic acid (5-ALA) was evaluated to improve the extent of resection (EOR) of GBMs. Pre-operative morphological tumor metrics were also assessed. Procedures: Thirty patients from a Phase II trial evaluating 5-ALA FGS in newly diagnosed GBM were assessed. Tumors were segmented pre-operatively to assess morphological features as well as post-operatively to evaluate EOR and residual tumor volume (RTV). Results: Median EOR and RTV were 94.3% and 0.821 cm3, respectively. Pre-operative surface area to volume ratio and RTV were significantly associated with overall survival, even when controlling for the known survival confounders. Conclusions: This study supports claims that 5-ALA FGS is helpful at decreasing tumor burden and prolonging survival in GBM. Moreover, morphological indices are shown to impact both resection and patient survival. PMID:26463215

  15. Automated breast tissue density assessment using high order regional texture descriptors in mammography

    NASA Astrophysics Data System (ADS)

    Law, Yan Nei; Lieng, Monica Keiko; Li, Jingmei; Khoo, David Aik-Aun

    2014-03-01

    Breast cancer is the most common cancer and second leading cause of cancer death among women in the US. The relative survival rate is lower among women with a more advanced stage at diagnosis. Early detection through screening is vital. Mammography is the most widely used and only proven screening method for reliably and effectively detecting abnormal breast tissues. In particular, mammographic density is one of the strongest breast cancer risk factors, after age and gender, and can be used to assess the future risk of disease before individuals become symptomatic. A reliable method for automatic density assessment would be beneficial and could assist radiologists in the evaluation of mammograms. To address this problem, we propose a density classification method which uses statistical features from different parts of the breast. Our method is composed of three parts: breast region identification, feature extraction and building ensemble classifiers for density assessment. It explores the potential of the features extracted from second and higher order statistical information for mammographic density classification. We further investigate the registration of bilateral pairs and time-series of mammograms. The experimental results on 322 mammograms demonstrate that (1) a classifier using features from dense regions has higher discriminative power than a classifier using only features from the whole breast region; (2) these high-order features can be effectively combined to boost the classification accuracy; (3) a classifier using these statistical features from dense regions achieves 75% accuracy, which is a significant improvement from 70% accuracy obtained by the existing approaches.
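    A minimal sketch of the final step described above (feeding statistical features from the segmented dense regions into an ensemble classifier) might look as follows. The choice of a random forest, the file names, and the 5-fold evaluation are assumptions for illustration and are not the authors' pipeline.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

# Hypothetical inputs: texture features extracted from segmented dense regions.
# X has shape (n_mammograms, n_features); y holds the density class labels.
X = np.load("dense_region_features.npy")
y = np.load("density_labels.npy")

clf = RandomForestClassifier(n_estimators=200, random_state=0)
scores = cross_val_score(clf, X, y, cv=5)   # 5-fold cross-validated accuracy
print(f"mean accuracy: {scores.mean():.2f}")
```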

  16. Vertical Sampling in Recharge Areas Versus Lateral Sampling in Discharge Areas: Assessing the Agricultural Nitrogen Legacy in Groundwater

    NASA Astrophysics Data System (ADS)

    Gilmore, T. E.; Genereux, D. P.; Solomon, D. K.; Mitasova, H.; Burnette, M.

    2014-12-01

    Agricultural nitrogen (N) is a legacy contaminant often found in shallow groundwater systems. This legacy has commonly been observed using well nests (vertical sampling) in recharge areas, but may also be observed by sampling at points in/beneath a streambed using pushable probes along transects across a channel (lateral sampling). We compared results from two different streambed point sampling approaches and from wells in the recharge area to assess whether the different approaches give fundamentally different pictures of (1) the magnitude of N contamination, (2) historic trends in N contamination, and (3) the extent to which denitrification attenuates nitrate transport through the surficial aquifer. Two different arrangements of streambed points (SP) were used to sample groundwater discharging into a coastal plain stream in North Carolina. In July 2012, a 58 m reach was sampled using closely-spaced lateral transects of SP, revealing high average [NO3-] (808 μM, n=39). In March 2013, transects of SP were widely distributed through a 2.7 km reach that contained the 58 m reach and suggested overall lower [NO3-] (210 μM, n=30), possibly due to variation in land use along the longer study reach. Mean [NO3-] from vertical sampling (2 well nests with 3 wells each) was 296 μM. Groundwater apparent ages from SP in the 58 m and 2.7 km reaches suggested lower recharge [NO3-] (observed [NO3-] plus modeled excess N2) in 0-10 year-old water (1250 μM and 525 μM, respectively), compared to higher recharge [NO3-] from 10-30 years ago (about 1600 μM and 900 μM, respectively). In the wells, [NO3-] was highest (835 μM) in groundwater with apparent age of 12-15 years and declined as apparent age increased, a trend that was consistent with SP in the 2.7 km reach. The 58 m reach suggested elevated recharge [NO3-] (>1100 μM) over a 50-year period. Excess N2 from wells suggested that about 62% of nitrate had been removed via denitrification since recharge, versus 51% and 78

  17. Automated solvent concentrator

    NASA Technical Reports Server (NTRS)

    Griffith, J. S.; Stuart, J. L.

    1976-01-01

    Designed for the automated drug identification system (AUDRI), the device increases sample concentration by a factor of 100. The sample is first filtered, removing particulate contaminants and reducing its water content. The sample is then extracted from the filtered residue by a specific solvent. The concentrator provides input material to the analysis subsystem.

  18. Enabling automated magnetic resonance imaging-based targeting assessment during dipole field navigation

    NASA Astrophysics Data System (ADS)

    Latulippe, Maxime; Felfoul, Ouajdi; Dupont, Pierre E.; Martel, Sylvain

    2016-02-01

    The magnetic navigation of drugs in the vascular network promises to increase the efficacy and reduce the secondary toxicity of cancer treatments by targeting tumors directly. Recently, dipole field navigation (DFN) was proposed as the first method achieving both high field and high navigation gradient strengths for whole-body interventions in deep tissues. This is achieved by introducing large ferromagnetic cores around the patient inside a magnetic resonance imaging (MRI) scanner. However, doing so distorts the static field inside the scanner, which prevents imaging during the intervention. This limitation constrains DFN to open-loop navigation, thus introducing the risk of harmful toxicity in case of a navigation failure. Here, we are interested in periodically assessing drug targeting efficiency using MRI even in the presence of a core. We demonstrate, using a clinical scanner, that it is in fact possible to acquire, in specific regions around a core, images of sufficient quality to perform this task. We show that the core can be moved inside the scanner to a position minimizing the distortion effect in the region of interest for imaging. Moving the core can be done automatically using the gradient coils of the scanner, which then also enables the core to be repositioned to perform navigation to additional targets. The feasibility and potential of the approach are validated in an in vitro experiment demonstrating navigation and assessment at two targets.

  19. Automated Assessment of Medical Students’ Clinical Exposures according to AAMC Geriatric Competencies

    PubMed Central

    Chen, Yukun; Wrenn, Jesse; Xu, Hua; Spickard, Anderson; Habermann, Ralf; Powers, James; Denny, Joshua C.

    2014-01-01

    Competence is essential for health care professionals. Current methods to assess competency, however, do not efficiently capture medical students’ experience. In this preliminary study, we used machine learning and natural language processing (NLP) to identify geriatric competency exposures from students’ clinical notes. The system applied NLP to generate the concepts and related features from notes. We extracted a refined list of concepts associated with corresponding competencies. This system was evaluated through 10-fold cross validation for six geriatric competency domains: “medication management (MedMgmt)”, “cognitive and behavioral disorders (CBD)”, “falls, balance, gait disorders (Falls)”, “self-care capacity (SCC)”, “palliative care (PC)”, “hospital care for elders (HCE)” – each an Association of American Medical Colleges (AAMC) competency for medical students. The system could accurately assess MedMgmt, SCC, HCE, and Falls competencies with F-measures of 0.94, 0.86, 0.85, and 0.84, respectively, but did not attain good performance for PC and CBD (0.69 and 0.62 in F-measure, respectively). PMID:25954341
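    The F-measures reported above are the harmonic mean of precision and recall; a minimal sketch of the computation from per-domain counts is given below (the counts in the example are illustrative, not taken from the study).

```python
def f_measure(tp, fp, fn):
    """F-measure from true-positive, false-positive and false-negative counts."""
    precision = tp / (tp + fp) if (tp + fp) else 0.0
    recall = tp / (tp + fn) if (tp + fn) else 0.0
    if precision + recall == 0:
        return 0.0
    return 2 * precision * recall / (precision + recall)

# Hypothetical example: 94 notes correctly flagged, 8 false alarms, 4 misses
print(round(f_measure(94, 8, 4), 2))
```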

  20. TongueSim: Development of an Automated Method for Rapid Assessment of Fungiform Papillae Density for Taste Research.

    PubMed

    Sanyal, Shourjya; O'Brien, Shauna M; Hayes, John E; Feeney, Emma L

    2016-05-01

    Taste buds are found on the tongue in 3 types of structures: the fungiform papillae, the foliate papillae, and the circumvallate papillae. Of these, the fungiform papillae (FP) are present in the greatest numbers on the tongue, and are thought to be correlated to the overall number of taste buds. For this reason, FP density on the tongue is often used to infer taste function, although this has been controversial. Historically, videomicroscopy techniques were used to assess FP. More recently, advances in digital still photography and in software have allowed the development of rapid methods for obtaining high quality images in situ. However, these can be subject to inter-researcher variation in FP identification, and are somewhat limited in the parameters that can be measured. Here, we describe the development of a novel, automated method to count the FP, using the TongueSim suite of software. Advantages include the reduction in time required for image analysis, elimination of researcher bias, and the added potential to measure characteristics such as the degree of roundness of each papilla. We envisage that such software has a wide variety of novel research applications. PMID:26892308

  1. A METHOD FOR AUTOMATED ANALYSIS OF 10 ML WATER SAMPLES CONTAINING ACIDIC, BASIC, AND NEUTRAL SEMIVOLATILE COMPOUNDS LISTED IN USEPA METHOD 8270 BY SOLID PHASE EXTRACTION COUPLED IN-LINE TO LARGE VOLUME INJECTION GAS CHROMATOGRAPHY/MASS SPECTROMETRY

    EPA Science Inventory

    Data is presented showing the progress made towards the development of a new automated system combining solid phase extraction (SPE) with gas chromatography/mass spectrometry for the single run analysis of water samples containing a broad range of acid, base and neutral compounds...

  2. Automation of the quantitative determination of elemental content in samples using neutron activation analysis on the IBR-2 reactor at the Frank Laboratory for Neutron Physics, Joint Institute for Nuclear Research

    NASA Astrophysics Data System (ADS)

    Dmitriev, A. Yu.; Pavlov, S. S.

    2013-01-01

    Software for the automated quantitative determination of element concentrations in samples is described. This software is used in neutron activation analysis (NAA) at the IBR-2 reactor of the Frank Laboratory for Neutron Physics, Joint Institute for Nuclear Research (FLNP JINR).

  3. Assessment of fully automated antibody homology modeling protocols in molecular operating environment

    PubMed Central

    Maier, Johannes K X; Labute, Paul

    2014-01-01

    The success of antibody-based drugs has led to an increased demand for predictive computational tools to assist antibody engineering efforts surrounding the six hypervariable loop regions making up the antigen binding site. Accurate computational modeling of isolated protein loop regions can be quite difficult; consequently, modeling an antigen binding site that includes six loops is particularly challenging. In this work, we present a method for automatic modeling of the FV region of an immunoglobulin based upon the use of a precompiled antibody x-ray structure database, which serves as a source of framework and hypervariable region structural templates that are grafted together. We applied this method (on common desktop hardware) to the Second Antibody Modeling Assessment (AMA-II) target structures as well as an experimental specialized CDR-H3 loop modeling method. The results of the computational structure predictions will be presented and discussed. PMID:24715627

  4. Assessment of fully automated antibody homology modeling protocols in molecular operating environment.

    PubMed

    Maier, Johannes K X; Labute, Paul

    2014-08-01

    The success of antibody-based drugs has led to an increased demand for predictive computational tools to assist antibody engineering efforts surrounding the six hypervariable loop regions making up the antigen binding site. Accurate computational modeling of isolated protein loop regions can be quite difficult; consequently, modeling an antigen binding site that includes six loops is particularly challenging. In this work, we present a method for automatic modeling of the FV region of an immunoglobulin based upon the use of a precompiled antibody x-ray structure database, which serves as a source of framework and hypervariable region structural templates that are grafted together. We applied this method (on common desktop hardware) to the Second Antibody Modeling Assessment (AMA-II) target structures as well as an experimental specialized CDR-H3 loop modeling method. The results of the computational structure predictions will be presented and discussed. PMID:24715627

  5. Development of a field-friendly automated dietary assessment tool and nutrient database for India.

    PubMed

    Daniel, Carrie R; Kapur, Kavita; McAdams, Mary J; Dixit-Joshi, Sujata; Devasenapathy, Niveditha; Shetty, Hemali; Hariharan, Sriram; George, Preethi S; Mathew, Aleyamma; Sinha, Rashmi

    2014-01-14

    Studies of diet and disease risk in India and among other Asian-Indian populations are hindered by the need for a comprehensive dietary assessment tool to capture data on the wide variety of food and nutrient intakes across different regions and ethnic groups. The nutritional component of the India Health Study, a multicentre pilot cohort study, included 3908 men and women, aged 35-69 years, residing in three regions of India (New Delhi in the north, Mumbai in the west and Trivandrum in the south). We developed a computer-based, interviewer-administered dietary assessment software known as the 'NINA-DISH (New Interactive Nutrition Assistant - Diet in India Study of Health)', which consisted of four sections: (1) a diet history questionnaire with defined questions on frequency and portion size; (2) an open-ended section for each mealtime; (3) a food-preparer questionnaire; (4) a 24 h dietary recall. Using the preferred meal-based approach, frequency of intake and portion size were recorded and linked to a nutrient database that we developed and modified from a set of existing international databases containing data on Indian foods and recipes. The NINA-DISH software was designed to be easily adaptable and was well accepted by the interviewers and participants in the field. A predominant three-meal eating pattern emerged; however, patterns in the number of foods reported and the primary contributors to macro- and micronutrient intakes differed by region and demographic factors. The newly developed NINA-DISH software provides a much-needed tool for measuring diet and nutrient profiles across the diverse populations of India with the potential for application in other South Asian populations living throughout the world. PMID:23796477

  6. Automated quantitative assessment of cardiovascular magnetic resonance-derived atrioventricular junction velocities.

    PubMed

    Leng, Shuang; Zhao, Xiao-Dan; Huang, Fei-Qiong; Wong, Jia-Ing; Su, Bo-Yang; Allen, John Carson; Kassab, Ghassan S; Tan, Ru-San; Zhong, Liang

    2015-12-01

    The assessment of atrioventricular junction (AVJ) deformation plays an important role in evaluating left ventricular systolic and diastolic function in clinical practice. This study aims to demonstrate the effectiveness and consistency of cardiovascular magnetic resonance (CMR) for quantitative assessment of AVJ velocity compared with tissue Doppler echocardiography (TDE). A group of 145 human subjects comprising 21 healthy volunteers, 8 patients with heart failure, 17 patients with hypertrophic cardiomyopathy, 52 patients with myocardial infarction, and 47 patients with repaired Tetralogy of Fallot were prospectively enrolled and underwent TDE and CMR scan. Six AVJ points were tracked with three CMR views. The peak systolic velocity (Sm1), diastolic velocity during early diastolic filling (Em), and late diastolic velocity during atrial contraction (Am) were extracted and analyzed. All CMR-derived septal and lateral AVJ velocities correlated well with TDE measurements (Sm1: r = 0.736; Em: r = 0.835; Am: r = 0.701; Em/Am: r = 0.691; all p < 0.001) and demonstrated excellent reproducibility [intrastudy: r = 0.921-0.991, intraclass correlation coefficient (ICC): 0.918-0.991; interstudy: r = 0.900-0.970, ICC: 0.887-0.957; all p < 0.001]. The evaluation of three-dimensional AVJ motion incorporating measurements from all views better differentiated normal and diseased states [area under the curve (AUC) = 0.918] and provided further insights into mechanical dyssynchrony diagnosis in HF patients (AUC = 0.987). These findings suggest that the CMR-based method is feasible, accurate, and consistent in quantifying the AVJ deformation, and subsequently in diagnosing systolic and diastolic cardiac dysfunction. PMID:26408537

  7. 296-B-5 Stack monitoring and sampling system annual system assessment report

    SciTech Connect

    Ridge, T.M.

    1995-02-01

    The B Plant Administration Manual requires an annual system assessment to evaluate and report the present condition of the sampling and monitoring system associated with Stack 296-B-5 at B Plant. The sampling and monitoring system associated with stack 296-B-5 is functional and performing satisfactorily. This document is an annual assessment report of the systems associated with the 296-B-5 stack.

  8. Toward Semi-automated Assessment of Target Volume Delineation in Radiotherapy Trials: The SCOPE 1 Pretrial Test Case

    SciTech Connect

    Gwynne, Sarah; Spezi, Emiliano; Wills, Lucy; Nixon, Lisette; Hurt, Chris; Joseph, George; Evans, Mererid; Griffiths, Gareth; Crosby, Tom; Staffurth, John

    2012-11-15

    Purpose: To evaluate different conformity indices (CIs) for use in the analysis of outlining consistency within the pretrial quality assurance (Radiotherapy Trials Quality Assurance [RTTQA]) program of a multicenter chemoradiation trial of esophageal cancer and to make recommendations for their use in future trials. Methods and Materials: The National Cancer Research Institute SCOPE 1 trial is an ongoing Cancer Research UK-funded phase II/III randomized controlled trial of chemoradiation with capecitabine and cisplatin with or without cetuximab for esophageal cancer. The pretrial RTTQA program included a detailed radiotherapy protocol, an educational package, and a single mid-esophageal tumor test case that were sent to each investigator to outline. Investigator gross tumor volumes (GTVs) were received from 50 investigators in 34 UK centers, and CERR (Computational Environment for Radiotherapy Research) was used to perform an assessment of each investigator GTV against a predefined gold-standard GTV using different CIs. A new metric, the local conformity index (l-CI), that can localize areas of maximal discordance was developed. Results: The median Jaccard conformity index (JCI) was 0.69 (interquartile range, 0.62-0.70), with 14 of 50 investigators (28%) achieving a JCI of 0.7 or greater. The median geographical miss index was 0.09 (interquartile range, 0.06-0.16), and the mean discordance index was 0.27 (95% confidence interval, 0.25-0.30). The l-CI was highest in the middle section of the volume, where the tumor was bulky and more easily definable, and identified 4 slices where fewer than 20% of investigators achieved an l-CI of 0.7 or greater. Conclusions: The available CIs analyze different aspects of a gold standard-observer variation, with JCI being the most useful as a single metric. Additional information is provided by the l-CI and can focus the efforts of the RTTQA team in these areas, possibly leading to semi-automated outlining assessment.
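    For reference, the Jaccard conformity index is the intersection-over-union of the investigator and gold-standard volumes. The sketch below computes it, together with a geographical miss index and a discordance index under their commonly used definitions, from co-registered binary masks; the mask representation and function name are assumptions for the example, and the exact definitions used in the trial QA program should be taken from the paper.

```python
import numpy as np

def conformity_indices(observer_mask, gold_mask):
    """Jaccard CI, geographical miss index and discordance index for two binary volumes.

    Assumes both masks are on the same voxel grid and neither is empty.
    """
    obs = np.asarray(observer_mask, dtype=bool)
    gold = np.asarray(gold_mask, dtype=bool)
    inter = np.logical_and(obs, gold).sum()
    union = np.logical_or(obs, gold).sum()
    jci = inter / union if union else 1.0
    gmi = np.logical_and(gold, ~obs).sum() / gold.sum()  # gold volume missed by the observer
    di = np.logical_and(obs, ~gold).sum() / obs.sum()    # observer volume outside the gold standard
    return jci, gmi, di
```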

  9. Assessing the accuracy and repeatability of automated photogrammetrically generated digital surface models from unmanned aerial system imagery

    NASA Astrophysics Data System (ADS)

    Chavis, Christopher

    Using commercial digital cameras in conjunction with Unmanned Aerial Systems (UAS) to generate 3-D Digital Surface Models (DSMs) and orthomosaics is emerging as a cost-effective alternative to Light Detection and Ranging (LiDAR). Powerful software applications such as Pix4D and APS can automate the generation of DSM and orthomosaic products from a handful of inputs. However, the accuracy of these models is relatively untested. The objectives of this study were to generate multiple DSM and orthomosaic pairs of the same area using Pix4D and APS from flights of imagery collected with a lightweight UAS. The accuracy of each individual DSM was assessed in addition to the consistency of the method to model one location over a period of time. Finally, this study determined if the DSMs automatically generated using lightweight UAS and commercial digital cameras could be used for detecting changes in elevation and at what scale. Accuracy was determined by comparing DSMs to a series of reference points collected with survey grade GPS. Other GPS points were also used as control points to georeference the products within Pix4D and APS. The effectiveness of the products for change detection was assessed through image differencing and observance of artificially induced, known elevation changes. The vertical accuracy with the optimal data and model is ≈ 25 cm and the highest consistency over repeat flights is a standard deviation of ≈ 5 cm. Elevation change detection based on such UAS imagery and DSM models should be viable for detecting infrastructure change in urban or suburban environments with little dense canopy vegetation.

  10. Automating Risk Assessments of Hazardous Material Shipments for Transportation Routes and Mode Selection

    SciTech Connect

    Barbara H. Dolphin; William D. Richins; Stephen R. Novascone

    2010-10-01

    The METEOR project at Idaho National Laboratory (INL) successfully addresses the difficult problem in risk assessment analyses of combining bounding deterministic simulation results with probabilistic (Monte Carlo) risk assessment techniques. This paper describes a software suite designed to perform sensitivity and cost/benefit analyses on selected transportation routes and vehicles to minimize risk associated with the shipment of hazardous materials. METEOR uses Monte Carlo techniques to estimate the probability of an accidental release of a hazardous substance along a proposed transportation route. A METEOR user selects the mode of transportation, origin and destination points, and charts the route using interactive graphics. Inputs to METEOR (many selections built in) include crash rates for the specific aircraft, soil/rock type and population densities over the proposed route, and bounding limits for potential accident types (velocity, temperature, etc.). New vehicle, materials, and location data are added when available. If the risk estimates are unacceptable, the risks associated with alternate transportation modes or routes can be quickly evaluated and compared. Systematic optimizing methods will provide the user with the route and vehicle selection identified with the lowest risk of hazardous material release. The effects of a selected range of potential accidents such as vehicle impact, fire, fuel explosions, excessive containment pressure, flooding, etc. are evaluated primarily using hydrocodes capable of accurately simulating the material response of critical containment components. Bounding conditions that represent credible accidents (i.e., for an impact event: velocity, orientation, and soil conditions) are used as input parameters to the hydrocode models, yielding correlation functions relating accident parameters to component damage. The Monte Carlo algorithms use random number generators to make selections at the various decision

  11. Automated quantitative assessment of three-dimensional bioprinted hydrogel scaffolds using optical coherence tomography

    PubMed Central

    Wang, Ling; Xu, Mingen; Zhang, LieLie; Zhou, QingQing; Luo, Li

    2016-01-01

    Reconstructing and quantitatively assessing the internal architecture of opaque three-dimensional (3D) bioprinted hydrogel scaffolds is difficult but vital to the improvement of 3D bioprinting techniques and to the fabrication of functional engineered tissues. In this study, swept-source optical coherence tomography was applied to acquire high-resolution images of hydrogel scaffolds. Novel 3D gelatin/alginate hydrogel scaffolds with six different representative architectures were fabricated using our 3D bioprinting system. Both the scaffold material networks and the interconnected flow channel networks were reconstructed through volume rendering and binarisation processing to provide a 3D volumetric view. An image analysis algorithm was developed based on the automatic selection of the spatially-isolated region-of-interest. Via this algorithm, the spatially-resolved morphological parameters including pore size, pore shape, strut size, surface area, porosity, and interconnectivity were quantified precisely. Fabrication defects and differences between the designed and as-produced scaffolds were clearly identified in both 2D and 3D; the locations and dimensions of each of the fabrication defects were also defined. It is concluded that this method will be a key tool for non-destructive and quantitative characterization, design optimisation and fabrication refinement of 3D bioprinted hydrogel scaffolds. Furthermore, this method enables investigation into the quantitative relationship between scaffold structure and biological outcome. PMID:27231597

  12. Automated quantitative assessment of three-dimensional bioprinted hydrogel scaffolds using optical coherence tomography.

    PubMed

    Wang, Ling; Xu, Mingen; Zhang, LieLie; Zhou, QingQing; Luo, Li

    2016-03-01

    Reconstructing and quantitatively assessing the internal architecture of opaque three-dimensional (3D) bioprinted hydrogel scaffolds is difficult but vital to the improvement of 3D bioprinting techniques and to the fabrication of functional engineered tissues. In this study, swept-source optical coherence tomography was applied to acquire high-resolution images of hydrogel scaffolds. Novel 3D gelatin/alginate hydrogel scaffolds with six different representative architectures were fabricated using our 3D bioprinting system. Both the scaffold material networks and the interconnected flow channel networks were reconstructed through volume rendering and binarisation processing to provide a 3D volumetric view. An image analysis algorithm was developed based on the automatic selection of the spatially-isolated region-of-interest. Via this algorithm, the spatially-resolved morphological parameters including pore size, pore shape, strut size, surface area, porosity, and interconnectivity were quantified precisely. Fabrication defects and differences between the designed and as-produced scaffolds were clearly identified in both 2D and 3D; the locations and dimensions of each of the fabrication defects were also defined. It is concluded that this method will be a key tool for non-destructive and quantitative characterization, design optimisation and fabrication refinement of 3D bioprinted hydrogel scaffolds. Furthermore, this method enables investigation into the quantitative relationship between scaffold structure and biological outcome. PMID:27231597
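    As an illustration of two of the morphological parameters listed above, the sketch below computes porosity and pore interconnectivity from a binarised volume (True for scaffold material, False for pore space). The coding convention, the face-connectivity definition, and the use of SciPy labelling are assumptions for the example and are not taken from the published algorithm.

```python
import numpy as np
from scipy import ndimage

def porosity(binary_volume):
    """Fraction of the volume occupied by pores (material voxels coded as True)."""
    material = np.asarray(binary_volume, dtype=bool)
    return 1.0 - material.mean()

def interconnectivity(binary_volume):
    """Fraction of pore space belonging to the largest connected pore component."""
    pores = ~np.asarray(binary_volume, dtype=bool)
    labels, n_components = ndimage.label(pores)   # face-connected labelling by default
    if n_components == 0:
        return 0.0
    sizes = np.bincount(labels.ravel())[1:]       # drop the background label 0
    return sizes.max() / sizes.sum()
```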

  13. Near-Infrared Fluorescent Digital Pathology for the Automation of Disease Diagnosis and Biomarker Assessment

    PubMed Central

    Gibbs, Summer L.; Genega, Elizabeth; Salemi, Jeffery; Kianzad, Vida; Goodwill, Haley L.; Xie, Yang; Oketokoun, Rafiou; Khurd, Parmeshwar; Kamen, Ali; Frangioni, John V.

    2015-01-01

    Hematoxylin and eosin (H&E) staining of tissue has been the mainstay of pathology for more than a century. However, the learning curve for H&E tissue interpretation is long while intra- and interobserver variability remains high. Computer-assisted image analysis of H&E sections holds promise for increased throughput and decreased variability, but has yet to demonstrate significant improvement in diagnostic accuracy. Addition of biomarkers to H&E staining can improve diagnostic accuracy, however co-registration of immunohistochemical staining with H&E is problematic as immunostaining is completed on slides that are at best 4 μm apart. Simultaneous H&E and immunostaining would alleviate co-registration problems; however current opaque pigments used for immunostaining obscure H&E. In this study, we demonstrate that diagnostic information provided by two or more independent wavelengths of near-infrared (NIR) fluorescence leave the H&E stain unchanged while enabling computer-assisted diagnosis and assessment of human disease. Using prostate cancer as a model system, we introduce NIR digital pathology and demonstrate its utility along the spectrum from prostate biopsy to whole mount analysis of H&E-stained tissue. PMID:25812603

  14. Using Teacher Work Samples to Develop and Assess Best Practices in Physical Education Teacher Education

    ERIC Educational Resources Information Center

    Sariscsany, Mary Jo

    2010-01-01

    Teacher work samples (TWS) are an integrated, comprehensive assessment tool that can be used as evidence of a beginning teacher's readiness to teach. Unlike linear assessments used to determine teaching effectiveness, TWS are relevant and reflective of "real" teaching. They are designed to exhibit a clear relationship among teacher candidate…

  15. In situ derivatization combined to automated microextraction by packed sorbents for the determination of chlorophenols in soil samples by gas chromatography mass spectrometry.

    PubMed

    González Paredes, Rosa María; García Pinto, Carmelo; Pérez Pavón, José Luis; Moreno Cordero, Bernardo

    2014-09-12

    A method based on the coupling of in situ extraction and derivatization of chlorophenols (CPs) (2-chlorophenol, 4-chloro-3-methylphenol, 2,4-dichlorophenol, and 2,4,6-trichlorophenol) from soils, accomplishing their preconcentration by means of automated microextraction by packed sorbent (MEPS), is proposed. After extraction and acylation of the chlorophenols in aqueous medium, the liquid phase obtained is subjected to the MEPS procedure. The QuEChERS (Quick, Easy, Cheap, Effective, Rugged and Safe) and MEPS techniques were compared, and the results confirmed the preconcentration achieved with MEPS. The existence of a matrix effect was checked, and the analytical characteristics of the method were determined in a soil sample. The method provided good linearity (from 1 to 12 μg kg(-1)), together with good repeatability and reproducibility values (RSD equal to or less than 10%). The limits of detection were in the 0.118-0.894 μg kg(-1) range. A certified reference material was used to validate the proposed methodology. PMID:25113872

  16. A fully automated effervescence-assisted switchable solvent-based liquid phase microextraction procedure: Liquid chromatographic determination of ofloxacin in human urine samples.

    PubMed

    Vakh, Christina; Pochivalov, Aleksei; Andruch, Vasil; Moskvin, Leonid; Bulatov, Andrey

    2016-02-11

    A novel, fully automated effervescence-assisted switchable solvent-based liquid phase microextraction procedure has been suggested. In this extraction method, medium-chain saturated fatty acids were investigated as switchable-hydrophilicity solvents. The conversion of the fatty acid into its hydrophilic form was carried out in the presence of sodium carbonate. The injection of sulfuric acid decreased the pH value of the solution and thereby generated microdroplets of the fatty acid. Carbon dioxide bubbles were generated in situ and promoted the extraction process and the final phase separation. The performance of the suggested approach was demonstrated by the determination of ofloxacin in human urine samples using high-performance liquid chromatography with fluorescence detection. This analytical task was used as a proof-of-concept example. Under the optimal conditions, the detector response of ofloxacin was linear in the concentration range of 3·10(-8)-3·10(-6) mol L(-1). The limit of detection, calculated from a blank test based on 3σ, was 1·10(-8) mol L(-1). The results demonstrated that the presented approach is highly cost-effective, simple, rapid and environmentally friendly. PMID:26803002
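    The detection limit quoted above follows the common 3σ blank criterion; a minimal sketch, assuming a linear calibration with a known slope, is given below (the function name is illustrative).

```python
import numpy as np

def lod_from_blank(blank_signals, calibration_slope):
    """Limit of detection as 3 x the standard deviation of blank responses divided by the slope."""
    sigma_blank = np.std(blank_signals, ddof=1)   # sample standard deviation of the blank
    return 3.0 * sigma_blank / calibration_slope
```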

  17. Clinical Sampling in the Assessment of Young Children with Handicaps: Shopping for Skills.

    ERIC Educational Resources Information Center

    LeVan, Richard R.

    1990-01-01

    The article proposes a strategy for the assessment of young children with handicaps in which brief clinical sampling is used to document developmental skills. This information may then be used in designing intervention programs. Principles and techniques of brief clinical sampling are described and application examples are given. (Author/DB)

  18. Assessing Writing Competence through Writing Samples. Studies in Language Education Report No. 36.

    ERIC Educational Resources Information Center

    Hudson, Sally; Veal, Ramon

    This report discusses a plan to help Georgia school systems develop their own evaluation instruments for student writing. The report indicates that the use of actual writing samples in assessing writing skills can be valid, reliable, and economical. It includes attachments with sample writing scores from a school, the purposes and statistics that…

  19. TAXONOMIC LEVEL AND SAMPLE SIZE SUFFICIENT FOR ASSESSING POLLUTION IMPACTS ON THE SOUTHERN CALIFORNIA BIGHT MACROBENTHOS

    EPA Science Inventory

    Macrobenthic data from samples taken in 1980, 1983 and 1985 along a pollution gradient in the Southern California Bight (USA) were analyzed at 5 taxonomic levels (species, genus, family, order, phylum) to determine the taxon and sample size sufficient for assessing pollution impa...

  20. Using Language Sample Analysis to Assess Spoken Language Production in Adolescents

    ERIC Educational Resources Information Center

    Miller, Jon F.; Andriacchi, Karen; Nockerts, Ann

    2016-01-01

    Purpose: This tutorial discusses the importance of language sample analysis and how Systematic Analysis of Language Transcripts (SALT) software can be used to simplify the process and effectively assess the spoken language production of adolescents. Method: Over the past 30 years, thousands of language samples have been collected from typical…