Science.gov

Sample records for automated sampling assessment

  1. Automated Factor Slice Sampling

    PubMed Central

    Tibbits, Matthew M.; Groendyke, Chris; Haran, Murali; Liechty, John C.

    2013-01-01

    Markov chain Monte Carlo (MCMC) algorithms offer a very general approach for sampling from arbitrary distributions. However, designing and tuning MCMC algorithms for each new distribution can be challenging and time-consuming. It is particularly difficult to create an efficient sampler when there is strong dependence among the variables in a multivariate distribution. We describe a two-pronged approach for constructing efficient, automated MCMC algorithms: (1) we propose the “factor slice sampler”, a generalization of the univariate slice sampler where we treat the selection of a coordinate basis (factors) as an additional tuning parameter, and (2) we develop an approach for automatically selecting tuning parameters in order to construct an efficient factor slice sampler. In addition to automating the factor slice sampler, our tuning approach also applies to the standard univariate slice sampler. We demonstrate the efficiency and general applicability of our automated MCMC algorithm with a number of illustrative examples. PMID:24955002
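    The univariate slice sampler that the factor slice sampler generalizes can be sketched compactly. Below is a minimal Python sketch of slice sampling with stepping-out and shrinkage, where the interval width `w` is the kind of tuning parameter the paper's approach selects automatically; the function name, defaults, and target density are illustrative, not from the paper.

```python
import math
import random

def slice_sample(logpdf, x0, w=1.0, n_samples=1000, rng=None):
    """Univariate slice sampler with stepping-out and shrinkage.

    logpdf: log of an (unnormalized) target density
    x0:     starting point
    w:      initial interval width -- the tuning parameter that
            automated approaches like the paper's aim to select
    """
    rng = rng or random.Random(0)
    samples = []
    x = x0
    for _ in range(n_samples):
        # Draw the auxiliary "height" under the density at x.
        log_y = logpdf(x) + math.log(rng.random())
        # Stepping out: expand an interval of width w until it brackets the slice.
        left = x - rng.random() * w
        right = left + w
        while logpdf(left) > log_y:
            left -= w
        while logpdf(right) > log_y:
            right += w
        # Shrinkage: propose uniformly, shrinking the interval toward x on rejection.
        while True:
            x_new = rng.uniform(left, right)
            if logpdf(x_new) > log_y:
                x = x_new
                break
            if x_new < x:
                left = x_new
            else:
                right = x_new
        samples.append(x)
    return samples
```

    Run against a standard normal log density (`lambda x: -0.5 * x * x`), the chain's sample mean and variance settle near 0 and 1; a poorly chosen `w` mainly costs extra stepping-out or shrinkage iterations rather than correctness, which is why slice sampling is attractive for automation.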

  2. AUTOMATING GROUNDWATER SAMPLING AT HANFORD

    SciTech Connect

    CONNELL CW; HILDEBRAND RD; CONLEY SF; CUNNINGHAM DE

    2009-01-16

    Until this past October, Fluor Hanford managed Hanford's integrated groundwater program for the U.S. Department of Energy (DOE). With the new contract awards at the Site, however, the CH2M HILL Plateau Remediation Company (CHPRC) has assumed responsibility for the groundwater-monitoring programs at the 586-square-mile reservation in southeastern Washington State. These programs are regulated by the Resource Conservation and Recovery Act (RCRA) and the Comprehensive Environmental Response Compensation and Liability Act (CERCLA). The purpose of monitoring is to track existing groundwater contamination from past practices, as well as other potential contamination that might originate from RCRA treatment, storage, and disposal (TSD) facilities. An integral part of the groundwater-monitoring program involves taking samples of the groundwater and measuring the water levels in wells scattered across the site. More than 1,200 wells are sampled each year. Historically, field personnel or 'samplers' have been issued pre-printed forms that have information about the well(s) for a particular sampling evolution. This information is taken from the Hanford Well Information System (HWIS) and the Hanford Environmental Information System (HEIS)--official electronic databases. The samplers used these hardcopy forms to document the groundwater samples and well water-levels. After recording the entries in the field, the samplers turned the forms in at the end of the day and the collected information was posted onto a spreadsheet that was then printed and included in a log book. The log book was then used to make manual entries of the new information into the software application(s) for the HEIS and HWIS databases. This is a pilot project for automating this tedious process by providing an electronic tool for automating water-level measurements and groundwater field-sampling activities. The automation will eliminate the manual forms and associated data entry, improve the accuracy of the…

  3. Continuous Monitoring, Automated Analyses, and Sampling Procedures.

    ERIC Educational Resources Information Center

    Hensley, C. P.; And Others

    1978-01-01

    Presents water analysis literature, covering publications of 1976-77. This series covers: (1) monitoring strategies and sampling protocols; (2) continuous monitoring applications; (3) biological monitoring systems; and (4) automated analysis. A list of 57 references is also presented. (HM)

  4. Automated microorganism Sample Collection Module

    NASA Technical Reports Server (NTRS)

    Gall, L. S.; Graham, M. D.; Umbreit, W.

    1969-01-01

    Modified Gelman Sampler obtains representative sample of microorganism population. Proposed Sample Collection Module is based on direct inoculation of selected solid growth media encased in a cartridge at all times except during inoculation. Cartridge can be handled with no danger of contamination to sample or operator.

  5. Technology modernization assessment flexible automation

    SciTech Connect

    Bennett, D.W.; Boyd, D.R.; Hansen, N.H.; Hansen, M.A.; Yount, J.A.

    1990-12-01

    The objectives of this report are: (1) to present technology assessment guidelines to be considered in conjunction with defense regulations before an automation project is developed; (2) to give examples showing how assessment guidelines may be applied to a current project; and (3) to present several potential areas where automation might be applied successfully in the depot system. Depots perform primarily repair and remanufacturing operations, with limited small-batch manufacturing runs. While certain activities (such as Management Information Systems and warehousing) are directly applicable to either environment, the majority of applications will require combining existing and emerging technologies in different ways to meet the special needs of the depot remanufacturing environment. Industry generally enjoys the ability to make revisions to its product lines seasonally, followed by batch runs of thousands or more. Depot batch runs are in the tens, at best the hundreds, of parts with a potential for large variation in product mix; reconfiguration may be required on a week-to-week basis. This need for a higher degree of flexibility suggests a higher level of operator interaction, and, in turn, control systems that go beyond the state of the art for less flexible automation and industry in general. This report investigates the benefits and barriers to automation and concludes that, while significant benefits do exist for automation, depots must be prepared to carefully investigate the technical feasibility of each opportunity and the life-cycle costs associated with implementation. Implementation is suggested in two ways: (1) develop an implementation plan for automation technologies based on results of small demonstration automation projects; (2) use phased implementation for both these and later-stage automation projects to allow major technical and administrative risk issues to be addressed. 10 refs., 2 figs., 2 tabs. (JF)

  6. High throughput sample processing and automated scoring.

    PubMed

    Brunborg, Gunnar; Jackson, Petra; Shaposhnikov, Sergey; Dahl, Hildegunn; Azqueta, Amaya; Collins, Andrew R; Gutzkow, Kristine B

    2014-01-01

    The comet assay is a sensitive and versatile method for assessing DNA damage in cells. In the traditional version of the assay, there are many manual steps involved and few samples can be treated in one experiment. High throughput (HT) modifications have been developed during recent years, and they are reviewed and discussed. These modifications include accelerated scoring of comets; other important elements that have been studied and adapted to HT are cultivation and manipulation of cells or tissues before and after exposure, and freezing of treated samples until comet analysis and scoring. HT methods save time and money, but they are also useful for other reasons: large-scale experiments may be performed which are otherwise not practicable (e.g., analysis of many organs from exposed animals, and human biomonitoring studies), and automation gives more uniform sample treatment and less dependence on operator performance. The HT modifications now available vary widely in their versatility, capacity, complexity, and costs. The bottleneck for further increase of throughput appears to be the scoring. PMID:25389434

  7. Automated storm water sampling on small watersheds

    USGS Publications Warehouse

    Harmel, R.D.; King, K.W.; Slade, R.M.

    2003-01-01

    Few guidelines are currently available to assist in designing appropriate automated storm water sampling strategies for small watersheds. Therefore, guidance is needed to develop strategies that achieve an appropriate balance between accurate characterization of storm water quality and loads and limitations of budget, equipment, and personnel. In this article, we explore the important sampling strategy components (minimum flow threshold, sampling interval, and discrete versus composite sampling) and project-specific considerations (sampling goal, sampling and analysis resources, and watershed characteristics) based on personal experiences and pertinent field and analytical studies. These components and considerations are important in achieving the balance between sampling goals and limitations because they determine how and when samples are taken and the potential sampling error. Several general recommendations are made, including: setting low minimum flow thresholds, using flow-interval or variable time-interval sampling, and using composite sampling to limit the number of samples collected. Guidelines are presented to aid in selection of an appropriate sampling strategy based on user's project-specific considerations. Our experiences suggest these recommendations should allow implementation of a successful sampling strategy for most small watershed sampling projects with common sampling goals.
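    The flow-interval strategy recommended above can be illustrated with a short sketch: pull a composite aliquot each time a fixed increment of cumulative flow volume passes, and only while discharge exceeds the minimum flow threshold. The article gives no reference implementation, so the function name, units, and numbers below are illustrative assumptions.

```python
def flow_interval_triggers(flow_rates, dt, min_flow, flow_increment):
    """Return indices at which an automated sampler should pull an aliquot.

    flow_rates:     discharge time series (e.g., L/s), one reading per dt seconds
    dt:             seconds between readings
    min_flow:       minimum flow threshold below which the sampler stays idle
    flow_increment: cumulative flow volume (L) between successive aliquots
    """
    triggers = []
    accumulated = 0.0
    for i, q in enumerate(flow_rates):
        if q < min_flow:
            continue  # below threshold: no volume accumulates, no sampling
        accumulated += q * dt  # volume passed since the last reading
        if accumulated >= flow_increment:
            triggers.append(i)
            accumulated -= flow_increment  # carry any overshoot forward
    return triggers
```

    With a constant 2 L/s flow sampled every second and a 10 L increment, an aliquot is triggered every fifth reading; a low `min_flow`, as the authors recommend, ensures small events still contribute to the composite.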

  8. National Sample Assessment Protocols

    ERIC Educational Resources Information Center

    Ministerial Council on Education, Employment, Training and Youth Affairs (NJ1), 2012

    2012-01-01

    These protocols represent a working guide for planning and implementing national sample assessments in connection with the national Key Performance Measures (KPMs). The protocols are intended for agencies involved in planning or conducting national sample assessments and personnel responsible for administering associated tenders or contracts,…

  9. Constructing Aligned Assessments Using Automated Test Construction

    ERIC Educational Resources Information Center

    Porter, Andrew; Polikoff, Morgan S.; Barghaus, Katherine M.; Yang, Rui

    2013-01-01

    We describe an innovative automated test construction algorithm for building aligned achievement tests. By incorporating the algorithm into the test construction process, along with other test construction procedures for building reliable and unbiased assessments, the result is much more valid tests than result from current test construction…

  10. AGWA: The Automated Geospatial Watershed Assessment Tool

    EPA Science Inventory

    The Automated Geospatial Watershed Assessment Tool (AGWA, see: www.tucson.ars.ag.gov/agwa or http://www.epa.gov/esd/land-sci/agwa/) is a GIS interface jointly developed by the USDA-Agricultural Research Service, the U.S. Environmental Protection Agency, the University of Arizona...

  11. Automated Geospatial Watershed Assessment Tool (AGWA)

    Technology Transfer Automated Retrieval System (TEKTRAN)

    The Automated Geospatial Watershed Assessment tool (AGWA, see: www.tucson.ars.ag.gov/agwa or http://www.epa.gov/esd/land-sci/agwa/) is a GIS interface jointly developed by the USDA Agricultural Research Service, the U.S. Environmental Protection Agency, the University of Arizona, and the University ...

  12. Automated data quality assessment of marine sensors.

    PubMed

    Timms, Greg P; de Souza, Paulo A; Reznik, Leon; Smith, Daniel V

    2011-01-01

    The automated collection of data (e.g., through sensor networks) has led to a massive increase in the quantity of environmental and other data available. The sheer quantity of data and growing need for real-time ingestion of sensor data (e.g., alerts and forecasts from physical models) means that automated Quality Assurance/Quality Control (QA/QC) is necessary to ensure that the data collected is fit for purpose. Current automated QA/QC approaches provide assessments based upon hard classifications of the gathered data; often as a binary decision of good or bad data that fails to quantify our confidence in the data for use in different applications. We propose a novel framework for automated data quality assessments that uses Fuzzy Logic to provide a continuous scale of data quality. This continuous quality scale is then used to compute error bars upon the data, which quantify the data uncertainty and provide a more meaningful measure of the data's fitness for purpose in a particular application compared with hard quality classifications. The design principles of the framework are presented and enable both data statistics and expert knowledge to be incorporated into the uncertainty assessment. We have implemented and tested the framework upon a real time platform of temperature and conductivity sensors that have been deployed to monitor the Derwent Estuary in Hobart, Australia. Results indicate that the error bars generated from the Fuzzy QA/QC implementation are in good agreement with the error bars manually encoded by a domain expert.
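    The core idea of a continuous fuzzy quality score driving error bars can be sketched as follows. The membership functions, tests, and error-bar scaling below are illustrative assumptions for a range test combined with a rate-of-change test, not the paper's actual rules.

```python
def trapezoid(x, a, b, c, d):
    """Trapezoidal fuzzy membership: 0 outside [a, d], 1 on [b, c], linear between."""
    if x <= a or x >= d:
        return 0.0
    if b <= x <= c:
        return 1.0
    return (x - a) / (b - a) if x < b else (d - x) / (d - c)

def quality_score(value, expected_range, max_step, previous=None):
    """Continuous [0, 1] data-quality score (illustrative, not the paper's rules).

    Combines a climatological range test and a rate-of-change test
    via fuzzy AND (minimum).
    """
    lo, hi = expected_range
    margin = 0.1 * (hi - lo)  # soft shoulders around the expected range
    in_range = trapezoid(value, lo - margin, lo, hi, hi + margin)
    if previous is None:
        return in_range
    step = abs(value - previous)
    smooth = max(0.0, 1.0 - step / max_step)  # 1 when stable, 0 at max_step
    return min(in_range, smooth)

def error_bar(value, score, base_uncertainty):
    """Widen the error bar as quality drops: score 1 -> base, score 0 -> 10x base."""
    return base_uncertainty * (1.0 + 9.0 * (1.0 - score))
```

    Unlike a binary good/bad flag, a reading that is slightly out of family gets a score of, say, 0.8 and a modestly widened error bar, letting each downstream application decide whether the data are fit for its purpose.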

  13. Automated Bone Age Assessment: Motivation, Taxonomies, and Challenges

    PubMed Central

    Ismail, Maizatul Akmar; Herawan, Tutut; Gopal Raj, Ram; Abdul Kareem, Sameem; Nasaruddin, Fariza Hanum

    2013-01-01

    Bone age assessment (BAA) of unknown people is one of the most important topics in clinical procedures for evaluating the biological maturity of children. BAA is usually performed by comparing an X-ray of the left hand and wrist with an atlas of known sample bones. Recently, BAA has gained remarkable ground in academia and medicine. Manual methods of BAA are time-consuming and prone to observer variability. This is a motivation for developing automated methods of BAA. However, although there is considerable research on automated assessment, much of it is still in the experimental stage. This survey provides a taxonomy of automated BAA approaches and discusses the challenges. Finally, we present suggestions for future research. PMID:24454534

  14. Automated collection and processing of environmental samples

    DOEpatents

    Troyer, Gary L.; McNeece, Susan G.; Brayton, Darryl D.; Panesar, Amardip K.

    1997-01-01

    For monitoring an environmental parameter such as the level of nuclear radiation, at distributed sites, bar coded sample collectors are deployed and their codes are read using a portable data entry unit that also records the time of deployment. The time and collector identity are cross referenced in memory in the portable unit. Similarly, when later recovering the collector for testing, the code is again read and the time of collection is stored as indexed to the sample collector, or to a further bar code, for example as provided on a container for the sample. The identity of the operator can also be encoded and stored. After deploying and/or recovering the sample collectors, the data is transmitted to a base processor. The samples are tested, preferably using a test unit coupled to the base processor, and again the time is recorded. The base processor computes the level of radiation at the site during exposure of the sample collector, using the detected radiation level of the sample, the delay between recovery and testing, the duration of exposure and the half life of the isotopes collected. In one embodiment, an identity code and a site code are optically read by an image grabber coupled to the portable data entry unit.
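    The decay correction the base processor performs can be sketched under a simplifying model: a constant deposition rate at the site during exposure, with first-order decay both before and after recovery. The patent's exact formula is not given in this abstract, so the following is an illustrative reconstruction.

```python
import math

def site_activity_rate(measured, delay, exposure, half_life):
    """Back-calculate the (assumed constant) deposition rate at the site.

    measured:  activity counted at test time (e.g., Bq)
    delay:     time between collector recovery and testing
    exposure:  time the collector was deployed
    half_life: half-life of the collected isotope (same time units throughout)
    """
    lam = math.log(2) / half_life
    # Undo the decay that occurred between recovery and testing.
    at_recovery = measured * math.exp(lam * delay)
    # During exposure, activity builds up as A(T) = R * (1 - exp(-lam*T)) / lam,
    # so invert for the deposition rate R.
    return at_recovery * lam / (1.0 - math.exp(-lam * exposure))
```

    As a sanity check, for a half-life much longer than the exposure and zero delay, the rate reduces to measured activity divided by exposure time; a nonzero delay always raises the inferred rate, since part of the collected activity has decayed before counting.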

  15. Rapid Automated Sample Preparation for Biological Assays

    SciTech Connect

    Shusteff, M

    2011-03-04

    Our technology utilizes acoustic, thermal, and electric fields to separate out contaminants such as debris or pollen from environmental samples, lyse open cells, and extract the DNA from the lysate. The objective of the project is to optimize the system described for a forensic sample, and demonstrate its performance for integration with downstream assay platforms (e.g., MIT-LL's ANDE). We intend to increase the quantity of DNA recovered from the sample beyond the ~80% currently achieved using solid-phase extraction methods. Task 1: Develop and test an acoustic filter for cell extraction. Task 2: Develop and test a lysis chip. Task 3: Develop and test a DNA extraction chip. All chips have been fabricated based on the designs laid out in last month's report.

  16. Automated Sample collection and Analysis unit

    SciTech Connect

    Latner, Norman; Sanderson, Colin G.; Negro, Vincent C.

    1999-03-31

    Autoramp is an atmospheric radionuclide collection and analysis unit designed for unattended operation. A large volume of air passes through one of 31 filter cartridges, which is then moved from the sampling chamber, past a bar code reader, to a shielded enclosure. The collected dust-borne radionuclides are counted with a high-resolution germanium gamma-ray detector. An analysis is made and the results are transmitted to a central station that can also remotely control the unit.

  17. Enhanced training effectiveness using automated student assessment.

    SciTech Connect

    Forsythe, James Chris

    2010-05-01

    Training simulators have become increasingly popular tools for instructing humans on performance in complex environments. However, the question of how to provide individualized and scenario-specific assessment and feedback to students remains largely an open question. In this work, we follow-up on previous evaluations of the Automated Expert Modeling and Automated Student Evaluation (AEMASE) system, which automatically assesses student performance based on observed examples of good and bad performance in a given domain. The current study provides an empirical evaluation of the enhanced training effectiveness achievable with this technology. In particular, we found that students given feedback via the AEMASE-based debrief tool performed significantly better than students given only instructor feedback.

  18. Automated biowaste sampling system feces monitoring system

    NASA Technical Reports Server (NTRS)

    Hunt, S. R.; Glanfield, E. J.

    1979-01-01

    The Feces Monitoring System (FMS) Program designed, fabricated, assembled and tested an engineering model waste collector system (WCS) to be used in support of life science and medical experiments related to Shuttle missions. The FMS design was patterned closely after the Shuttle WCS, including: interface provisions; mounting; configuration; and operating procedures. These similarities make it possible to eventually substitute an FMS for the Shuttle WCS on the Orbiter. In addition, several advanced waste collection features, including the capability of real-time inertial fecal separation and fecal mass measurement and sampling, were incorporated into the FMS design.

  19. Automated Data Quality Assessment of Marine Sensors

    PubMed Central

    Timms, Greg P.; de Souza, Paulo A.; Reznik, Leon; Smith, Daniel V.

    2011-01-01

    The automated collection of data (e.g., through sensor networks) has led to a massive increase in the quantity of environmental and other data available. The sheer quantity of data and growing need for real-time ingestion of sensor data (e.g., alerts and forecasts from physical models) means that automated Quality Assurance/Quality Control (QA/QC) is necessary to ensure that the data collected is fit for purpose. Current automated QA/QC approaches provide assessments based upon hard classifications of the gathered data; often as a binary decision of good or bad data that fails to quantify our confidence in the data for use in different applications. We propose a novel framework for automated data quality assessments that uses Fuzzy Logic to provide a continuous scale of data quality. This continuous quality scale is then used to compute error bars upon the data, which quantify the data uncertainty and provide a more meaningful measure of the data’s fitness for purpose in a particular application compared with hard quality classifications. The design principles of the framework are presented and enable both data statistics and expert knowledge to be incorporated into the uncertainty assessment. We have implemented and tested the framework upon a real time platform of temperature and conductivity sensors that have been deployed to monitor the Derwent Estuary in Hobart, Australia. Results indicate that the error bars generated from the Fuzzy QA/QC implementation are in good agreement with the error bars manually encoded by a domain expert. PMID:22163714

  20. Non-Contact Conductivity Measurement for Automated Sample Processing Systems

    NASA Technical Reports Server (NTRS)

    Beegle, Luther W.; Kirby, James P.

    2012-01-01

    A new method has been developed for monitoring and control of automated sample processing and preparation, focusing especially on desalting of samples before analysis (described in more detail in Automated Desalting Apparatus (NPO-45428), NASA Tech Briefs, Vol. 34, No. 8 (August 2010), page 44). The use of non-contact conductivity probes, one at the inlet and one at the outlet of the solid-phase sample preparation media, allows monitoring of the process and acts as a trigger for the start of the next step in the sequence (see figure). At each step of the multi-step process, the system is flushed with low-conductivity water, which sets the system back to an overall low-conductivity state. This measurement then triggers the next stage of sample processing protocols and greatly minimizes the use of consumables. In the case of amino acid sample preparation for desalting, the conductivity measurement will define three key conditions for the sample preparation process: first, when the system is neutralized (low conductivity, by washing with excess de-ionized water); second, when the system is acidified by washing with a strong acid (high conductivity); and third, when the system is at a basic condition of high pH (high conductivity). Taken together, this non-contact conductivity measurement will not only facilitate automation of sample preparation and processing, but will also act as a way to optimize operational time and the use of consumables.
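    The trigger logic described above, where a conductivity reading crossing a threshold cues the next protocol step, can be sketched as a simple debounced comparator. The thresholds and hold count below are hypothetical; the brief gives no numeric values.

```python
def next_step_trigger(readings, threshold, below=True, hold=3):
    """Index at which conductivity first stays below (or above) `threshold`
    for `hold` consecutive readings -- the cue to start the next protocol step.

    `hold` debounces the trigger against single noisy readings.
    Returns None if the condition is never sustained.
    """
    run = 0
    for i, c in enumerate(readings):
        ok = (c < threshold) if below else (c > threshold)
        run = run + 1 if ok else 0
        if run >= hold:
            return i
    return None
```

    For the neutralization step one would wait for sustained low conductivity after a de-ionized water flush (`below=True`); for the acidification step, sustained high conductivity (`below=False`).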

  1. Investigating Factors Affecting the Uptake of Automated Assessment Technology

    ERIC Educational Resources Information Center

    Dreher, Carl; Reiners, Torsten; Dreher, Heinz

    2011-01-01

    Automated assessment is an emerging innovation in educational praxis, however its pedagogical potential is not fully utilised in Australia, particularly regarding automated essay grading. The rationale for this research is that the usage of automated assessment currently lags behind the capacity that the technology provides, thus restricting the…

  2. Automated Power Assessment for Helicopter Turboshaft Engines

    NASA Technical Reports Server (NTRS)

    Simon, Donald L.; Litt, Jonathan S.

    2008-01-01

    An accurate indication of available power is required for helicopter mission planning purposes. Available power is currently estimated on U.S. Army Blackhawk helicopters by performing a Maximum Power Check (MPC), a manual procedure performed by maintenance pilots on a periodic basis. The MPC establishes Engine Torque Factor (ETF), an indication of available power. It is desirable to replace the current manual MPC procedure with an automated approach that will enable continuous real-time assessment of available power utilizing normal mission data. This report presents an automated power assessment approach which processes data currently collected within helicopter Health and Usage Monitoring System (HUMS) units. The overall approach consists of: 1) a steady-state data filter which identifies and extracts steady-state operating points within HUMS data sets; 2) engine performance curve trend monitoring and updating; and 3) automated ETF calculation. The algorithm is coded in MATLAB (The MathWorks, Inc.) and currently runs on a PC. Results from the application of this technique to HUMS mission data collected from UH-60L aircraft equipped with T700-GE-701C engines are presented and compared to manually calculated ETF values. Potential future enhancements are discussed.
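    The steady-state data filter in step 1 of the approach can be illustrated with a simple trailing-window criterion: a point counts as steady when every reading in the window stays within a small fraction of the window mean. The report's actual filter and thresholds are not reproduced here, so the window size and tolerance are placeholder assumptions.

```python
def steady_state_points(series, window=5, tol=0.02):
    """Flag indices of `series` where the signal is steady.

    A point is steady when the largest deviation from the mean of the
    trailing `window` readings is within `tol` (fractional) of that mean.
    Hypothetical criterion, illustrating the HUMS steady-state filter idea.
    """
    flags = []
    for i in range(len(series)):
        if i + 1 < window:
            flags.append(False)  # not enough history yet
            continue
        w = series[i + 1 - window: i + 1]
        mean = sum(w) / window
        flags.append(max(abs(x - mean) for x in w) <= tol * abs(mean))
    return flags
```

    Only the flagged points would then feed the performance-curve trending and ETF calculation, so transients such as torque spikes during maneuvers are excluded from the power assessment.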

  3. Automated sample preparation for cholesterol determination in foods.

    PubMed

    Johnson, J H; McIntyre, P; Zdunek, J

    1995-12-22

    An automated sample preparation system has been developed for the determination of cholesterol in a wide range of matrices. Isolation of cholesterol is performed with a robotic arm coupled with a series of modular stations. Samples are introduced into the system which adds the appropriate reagents, carries out the saponification, pH adjustment, solid-phase extraction and drying steps. This system was evaluated using 15 different food matrices. The average recovery for NIST standards exceeded 97%. A solution of n-hexane-2-propanol was substituted for the traditional methanol-chloroform extraction. Manual pH adjustment was replaced with a buffer. Manual and automated methods were compared and no difference was observed at the 95% confidence level.

  4. An Automated Home Made Low Cost Vibrating Sample Magnetometer

    NASA Astrophysics Data System (ADS)

    Kundu, S.; Nath, T. K.

    2011-07-01

    The design and operation of a homemade, low-cost vibrating sample magnetometer are described here. The sensitivity of this instrument is better than 10^-2 emu, and it has been found to be very efficient for measuring the magnetization of most ferromagnetic and other magnetic materials as a function of temperature down to 77 K and magnetic field up to 800 Oe. Both M(H) and M(T) data acquisition are fully automated, employing a computer and LabVIEW software.

  5. Automated Imaging Techniques for Biosignature Detection in Geologic Samples

    NASA Astrophysics Data System (ADS)

    Williford, K. H.

    2015-12-01

    Robust biosignature detection in geologic samples typically requires the integration of morphological/textural data with biogeochemical data across a variety of scales. We present new automated imaging and coordinated biogeochemical analysis techniques developed at the JPL Astrobiogeochemistry Laboratory (abcLab) in support of biosignature detection in terrestrial samples as well as those that may eventually be returned from Mars. Automated gigapixel mosaic imaging of petrographic thin sections in transmitted and incident light (including UV epifluorescence) is supported by a microscopy platform with a digital XYZ stage. Images are acquired, processed, and co-registered using multiple software platforms at JPL and can be displayed and shared using Gigapan, a freely available, web-based toolset. Automated large-area (cm-scale) elemental mapping at sub-micrometer spatial resolution is enabled by a variable pressure scanning electron microscope (SEM) with a large (150 mm2) silicon drift energy dispersive spectroscopy (EDS) detector system. The abcLab light and electron microscopy techniques are augmented by additional elemental chemistry, mineralogy and organic detection/classification using laboratory Micro-XRF and UV Raman/fluorescence systems, precursors to the PIXL and SHERLOC instrument platforms selected for flight on the NASA Mars 2020 rover mission. A workflow including careful sample preparation followed by iterative gigapixel imaging, SEM/EDS, Micro-XRF and UV fluorescence/Raman in support of organic, mineralogic, and elemental biosignature target identification, and follow-up analysis with other techniques including secondary ion mass spectrometry (SIMS), will be discussed.

  6. Experiences of Using Automated Assessment in Computer Science Courses

    ERIC Educational Resources Information Center

    English, John; English, Tammy

    2015-01-01

    In this paper we discuss the use of automated assessment in a variety of computer science courses that have been taught at Israel Academic College by the authors. The course assignments were assessed entirely automatically using Checkpoint, a web-based automated assessment framework. The assignments all used free-text questions (where the students…

  7. Automated blood-sample handling in the clinical laboratory.

    PubMed

    Godolphin, W; Bodtker, K; Uyeno, D; Goh, L O

    1990-09-01

    The only significant advances in blood-taking in 25 years have been the disposable needle and evacuated blood-drawing tube. With the exception of a few isolated barcode experiments, most sample-tracking is performed through handwritten or computer-printed labels. Attempts to reduce the hazards of centrifugation have resulted in air-tight lids or chambers, the use of which is time-consuming and cumbersome. Most commonly used clinical analyzers require serum or plasma, distributed into specialized containers, unique to that analyzer. Aliquots for different tests are prepared by handpouring or pipetting. Moderate to large clinical laboratories perform so many different tests that even multi-analyzers performing multiple analyses on a single sample may account for only a portion of all tests ordered for a patient. Thus several aliquots of each specimen are usually required. We have developed a proprietary serial centrifuge and blood-collection tube suitable for incorporation into an automated or robotic sample-handling system. The system we propose is (a) safe--avoids or prevents biological danger to the many "handlers" of blood; (b) small--minimizes the amount of sample taken and space required to adapt to the needs of satellite and mobile testing, and direct interfacing with analyzers; (c) serial--permits each sample to be treated according to its own "merits," optimizes throughput, and facilitates flexible automation; and (d) smart--ensures quality results through monitoring and intelligent control of patient identification, sample characteristics, and separation process.

  8. High-throughput sample processing and sample management; the functional evolution of classical cytogenetic assay towards automation.

    PubMed

    Ramakumar, Adarsh; Subramanian, Uma; Prasanna, Pataje G S

    2015-11-01

    High-throughput individual diagnostic dose assessment is essential for medical management of radiation-exposed subjects after a mass casualty. Cytogenetic assays such as the Dicentric Chromosome Assay (DCA) are recognized as the gold standard by international regulatory authorities. DCA is a multi-step and multi-day bioassay. DCA, as described in the IAEA manual, can be used to assess dose up to 4-6 weeks post-exposure quite accurately but throughput is still a major issue and automation is very essential. The throughput is limited, both in terms of sample preparation as well as analysis of chromosome aberrations. Thus, there is a need to design and develop novel solutions that could utilize extensive laboratory automation for sample preparation, and bioinformatics approaches for chromosome-aberration analysis to overcome throughput issues. We have transitioned the bench-based cytogenetic DCA to a coherent process performing high-throughput automated biodosimetry for individual dose assessment ensuring quality control (QC) and quality assurance (QA) aspects in accordance with international harmonized protocols. A Laboratory Information Management System (LIMS) is designed, implemented and adapted to manage increased sample processing capacity, develop and maintain standard operating procedures (SOP) for robotic instruments, avoid data transcription errors during processing, and automate analysis of chromosome-aberrations using an image analysis platform. Our efforts described in this paper intend to bridge the current technological gaps and enhance the potential application of DCA for a dose-based stratification of subjects following a mass casualty. This paper describes one such potential integrated automated laboratory system and functional evolution of the classical DCA towards increasing critically needed throughput. PMID:26520383

  10. A Method For Parallel, Automated, Thermal Cycling of Submicroliter Samples

    PubMed Central

    Nakane, Jonathan; Broemeling, David; Donaldson, Roger; Marziali, Andre; Willis, Thomas D.; O'Keefe, Matthew; Davis, Ronald W.

    2001-01-01

    A large fraction of the cost of DNA sequencing and other DNA-analysis processes results from the reagent costs incurred during cycle sequencing or PCR. In particular, the high cost of the enzymes and dyes used in these processes often results in thermal cycling costs exceeding $0.50 per sample. In the case of high-throughput DNA sequencing, this is a significant and unnecessary expense. Improved detection efficiency of new sequencing instrumentation allows the reaction volumes for cycle sequencing to be scaled down to one-tenth of presently used volumes, resulting in at least a 10-fold decrease in the cost of this process. However, commercially available thermal cyclers and automated reaction setup devices have inherent design limitations which make handling volumes of <1 μL extremely difficult. In this paper, we describe a method for thermal cycling aimed at reliable, automated cycling of submicroliter reaction volumes. PMID:11230168
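
    The cost argument above is linear in reaction volume: if reagents dominate the per-sample cost, a 10-fold volume reduction yields a 10-fold cost reduction. A toy calculation, assuming a conventional ~10 µL reaction at the quoted $0.50 (both figures used here only for illustration):

```python
def cycling_cost(volume_ul: float, cost_per_ul: float) -> float:
    # Reagent-dominated cost model: cost scales linearly with reaction volume.
    return volume_ul * cost_per_ul

cost_per_ul = 0.50 / 10.0                 # $0.50 spread over an assumed 10 µL reaction
full = cycling_cost(10.0, cost_per_ul)    # conventional volume
scaled = cycling_cost(1.0, cost_per_ul)   # one-tenth volume, sub-microliter regime
print(full, scaled, round(full / scaled)) # → 0.5 0.05 10
```

    The 10x ratio is exactly the "at least a 10-fold decrease" the abstract claims, under the stated linear-cost assumption.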

  11. Automated Formative Feedback and Summative Assessment Using Individualised Spreadsheet Assignments

    ERIC Educational Resources Information Center

    Blayney, Paul; Freeman, Mark

    2004-01-01

    This paper reports on the effects of automating formative feedback at the student's discretion and automating summative assessment with individualised spreadsheet assignments. Quality learning outcomes are achieved when students adopt deep approaches to learning (Ramsden, 2003). Learning environments designed to align assessment to learning…

  12. An automated microfluidic sample preparation system for laser scanning cytometry.

    PubMed

    Wu, Eric; Menon, Vidya; Geddie, William; Sun, Yu

    2011-04-01

    Laser scanning cytometry (LSC) is emerging as a clinical tool. In one application, a "Clatch" slide, named after its inventor, is used in conjunction with LSC for cell-surface-marker immunophenotyping of patient samples. The slide requires time-consuming and laborious pipetting steps, making each test tedious and prone to handling errors. The Clatch slide also consumes a significant number of cells, limiting the number of analyses possible on paucicellular samples. This paper presents an automated system, consisting of a control circuit, a microfluidic module, and an aluminum frame, capable of performing immunophenotyping procedures. This prototype system reduces 36 pipetting steps to 1, reduces the amount of cell sample from 180 μL to 56 μL, and shortens technician hands-on time.

  13. Validation of Automated Scoring of Science Assessments

    ERIC Educational Resources Information Center

    Liu, Ou Lydia; Rios, Joseph A.; Heilman, Michael; Gerard, Libby; Linn, Marcia C.

    2016-01-01

    Constructed response items can both measure the coherence of student ideas and serve as reflective experiences to strengthen instruction. We report on new automated scoring technologies that can reduce the cost and complexity of scoring constructed-response items. This study explored the accuracy of c-rater-ML, an automated scoring engine…

  14. Six Key Topics for Automated Assessment Utilisation and Acceptance

    ERIC Educational Resources Information Center

    Reiners, Torsten; Dreher, Carl; Dreher, Heinz

    2011-01-01

    Automated assessment technologies have been used in education for decades (e.g., computerised multiple choice tests). In contrast, Automated Essay Grading (AEG) technologies: have existed for decades; are "good in theory" (e.g., as accurate as humans, temporally and financially efficient, and can enhance formative feedback), and yet; are…

  15. The ECLSS Advanced Automation Project Evolution and Technology Assessment

    NASA Technical Reports Server (NTRS)

    Dewberry, Brandon S.; Carnes, James R.; Lukefahr, Brenda D.; Rogers, John S.; Rochowiak, Daniel M.; Mckee, James W.; Benson, Brian L.

    1990-01-01

    Viewgraphs on Environmental Control and Life Support System (ECLSS) advanced automation project evolution and technology assessment are presented. Topics covered include: the ECLSS advanced automation project; automatic fault diagnosis of ECLSS subsystems descriptions; in-line, real-time chemical and microbial fluid analysis; and object-oriented, distributed chemical and microbial modeling of regenerative environmental control systems description.

  16. Automation of sample plan creation for process model calibration

    NASA Astrophysics Data System (ADS)

    Oberschmidt, James; Abdo, Amr; Desouky, Tamer; Al-Imam, Mohamed; Krasnoperova, Azalia; Viswanathan, Ramya

    2010-04-01

    Preparing a sample plan for optical and resist model calibration has always been tedious: the plan must accurately represent full-chip designs with countless combinations of widths, spaces and environments, yet metrology constraints limit the number of structures that can be measured. There are also limits on the types of structures, mainly because measurement accuracy varies across geometries; pitch measurements, for instance, are normally more accurate than corner-rounding measurements, so only certain geometrical shapes are typically included in a sample plan. Time is also becoming crucial as the number of development and production nodes increases with each technology migration, and the process grows more complicated if process-window-aware models are to be developed in a reasonable time frame. Reliable methods are therefore needed to choose sample plans while reducing cycle time. In this context, an automated flow is proposed for sample plan creation. Once the illumination and film stack are defined, all errors in the input data are fixed and sites are centered. Bad sites are then excluded, and the clean data are reduced based on geometrical resemblance. An editable database of measurement-reliable and critical structures is provided, and their percentage in the final sample plan, as well as the total number of 1D/2D samples, can be predefined. The flow eliminates manual selection and filtering, provides powerful tools for customizing the final plan, and greatly reduces the time needed to generate sample plans.
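
    The flow described, exclude flagged sites, collapse geometrically similar structures, then cap the 1D/2D mix, can be sketched as a simple filter pipeline. All field names, the 2 nm similarity tolerance, and the 30% 2D fraction below are illustrative assumptions, not values from the paper.

```python
def build_sample_plan(sites, tol=2.0, max_total=100, frac_2d=0.3):
    """Reduce raw measurement sites to a sample plan: drop flagged sites,
    merge geometrically similar ones, then cap the 1D/2D mix."""
    clean = [s for s in sites if not s.get("bad", False)]

    # Merge sites whose (width, space) signatures differ by less than `tol` nm.
    reduced = []
    for s in clean:
        if not any(abs(s["width"] - r["width"]) < tol and
                   abs(s["space"] - r["space"]) < tol and
                   s["kind"] == r["kind"] for r in reduced):
            reduced.append(s)

    # Predefined budget: at most frac_2d of the plan is 2D, the rest 1D.
    two_d = [s for s in reduced if s["kind"] == "2D"][:int(max_total * frac_2d)]
    one_d = [s for s in reduced if s["kind"] == "1D"][:max_total - len(two_d)]
    return one_d + two_d

sites = [
    {"width": 45.0, "space": 90.0, "kind": "1D"},
    {"width": 45.5, "space": 90.5, "kind": "1D"},                # near-duplicate, merged away
    {"width": 45.0, "space": 90.0, "kind": "2D"},
    {"width": 80.0, "space": 160.0, "kind": "1D", "bad": True},  # excluded
]
plan = build_sample_plan(sites)
print(len(plan))   # 2
```

    A production flow would compare richer geometric signatures than (width, space), but the stages mirror the abstract's order: clean, exclude, reduce, budget.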

  17. Ability-Training-Oriented Automated Assessment in Introductory Programming Course

    ERIC Educational Resources Information Center

    Wang, Tiantian; Su, Xiaohong; Ma, Peijun; Wang, Yuying; Wang, Kuanquan

    2011-01-01

    Learning to program is a difficult process for novice programmers. AutoLEP, an automated learning and assessment system, was developed by us, to aid novice programmers to obtain programming skills. AutoLEP is ability-training-oriented. It adopts a novel assessment mechanism, which combines static analysis with dynamic testing to analyze student…

  18. Using Software Tools to Automate the Assessment of Student Programs.

    ERIC Educational Resources Information Center

    Jackson, David

    1991-01-01

    Argues that advent of computer-aided instruction (CAI) systems for teaching introductory computer programing makes it imperative that software be developed to automate assessment and grading of student programs. Examples of typical student programing problems are given, and application of the Unix tools Lex and Yacc to the automatic assessment of…

  19. Manual versus automated blood sampling: impact of repeated blood sampling on stress parameters and behavior in male NMRI mice

    PubMed Central

    Kalliokoski, Otto; Sørensen, Dorte B; Hau, Jann; Abelson, Klas S P

    2014-01-01

    Facial vein (cheek blood) and caudal vein (tail blood) phlebotomy are two commonly used techniques for obtaining blood samples from laboratory mice, while automated blood sampling through a permanent catheter is a relatively new technique in mice. The present study compared physiological parameters, glucocorticoid dynamics, and behavior of mice sampled repeatedly over 24 h by cheek blood, tail blood, or automated blood sampling from the carotid artery. Mice subjected to cheek blood sampling lost significantly more body weight, had elevated levels of plasma corticosterone, excreted more fecal corticosterone metabolites, and expressed more anxious behavior than mice in the other groups. Plasma corticosterone levels of mice subjected to tail blood sampling were also elevated, although to a lesser extent. Mice subjected to automated blood sampling were less affected with regard to the parameters measured, and expressed less anxious behavior. We conclude that repeated blood sampling by automated sampling or from the tail vein is less stressful than cheek blood sampling. The choice between automated and tail blood sampling should be based on the study requirements, the resources of the laboratory, and the skills of the staff. PMID:24958546

  20. Automated biowaste sampling system improved feces collection, mass measurement and sampling. [by use of a breadboard model]

    NASA Technical Reports Server (NTRS)

    Fogal, G. L.; Mangialardi, J. K.; Young, R.

    1974-01-01

    The capability of the basic automated Biowaste Sampling System (ABSS) hardware was extended and improved through the design, fabrication and test of breadboard hardware. A preliminary system design effort established the feasibility of integrating the breadboard concepts into the ABSS.

  1. Automated Aqueous Sample Concentration Methods for in situ Astrobiological Instrumentation

    NASA Astrophysics Data System (ADS)

    Aubrey, A. D.; Grunthaner, F. J.

    2009-12-01

    The era of wet chemical experiments for in situ planetary science investigations is upon us, as evidenced by recent results from the surface of Mars by Phoenix’s microscopy, electrochemistry, and conductivity analyzer, MECA [1]. Studies suggest that traditional thermal volatilization methods for planetary science in situ investigations induce organic degradation during sample processing [2], an effect that is enhanced in the presence of oxidants [3]. Recent developments have trended towards adaptation of non-destructive aqueous extraction and analytical methods for future astrobiological instrumentation. Wet chemical extraction techniques under investigation include subcritical water extraction, SCWE [4], aqueous microwave-assisted extraction, MAE, and organic solvent extraction [5]. Similarly, miniaturized analytical space flight instruments under development that require aqueous extracts include microfluidic capillary electrophoresis chips, μCE [6], liquid-chromatography mass spectrometers, LC-MS [7], and life marker chips, LMC [8]. If organics are present on the surface of Mars, they are expected to occur at extremely low concentrations (parts-per-billion), orders of magnitude below the sensitivities of most flight instrument technologies. Therefore, it becomes necessary to develop and integrate concentration mechanisms for in situ sample processing before delivery to analytical flight instrumentation. We present preliminary results of automated solid-phase-extraction (SPE) sample purification and concentration methods for the treatment of highly saline aqueous soil extracts. These methods take advantage of the affinity of low-molecular-weight organic compounds for natural and synthetic scavenger materials. These interactions allow for the separation of target organic analytes from unfavorable background species (i.e., salts) during inline treatment, and a clever method for selective desorption is utilized to obtain concentrated solutions on the order

  2. Needs Assessments for Automated Manufacturing Training Programs.

    ERIC Educational Resources Information Center

    Northampton Community Coll., Bethlehem, PA.

    This document contains needs assessments used by Northampton Community College to develop training courses for a business-industry technology resource center for firms in eastern Pennsylvania. The following needs assessments are included: (1) individual skills survey for workers at Keystone Cement Company; (2) Keystone group skills survey; (3)…

  3. Validity Arguments for Diagnostic Assessment Using Automated Writing Evaluation

    ERIC Educational Resources Information Center

    Chapelle, Carol A.; Cotos, Elena; Lee, Jooyoung

    2015-01-01

    Two examples demonstrate an argument-based approach to validation of diagnostic assessment using automated writing evaluation (AWE). "Criterion"®, was developed by Educational Testing Service to analyze students' papers grammatically, providing sentence-level error feedback. An interpretive argument was developed for its use as part of…

  4. Human and Automated Assessment of Oral Reading Fluency

    ERIC Educational Resources Information Center

    Bolaños, Daniel; Cole, Ron A.; Ward, Wayne H.; Tindal, Gerald A.; Hasbrouck, Jan; Schwanenflugel, Paula J.

    2013-01-01

    This article describes a comprehensive approach to fully automated assessment of children's oral reading fluency (ORF), one of the most informative and frequently administered measures of children's reading ability. Speech recognition and machine learning techniques are described that model the 3 components of oral reading fluency: word accuracy,…

  5. Automated Geospatial Watershed Assessment (AGWA) Documentation Version 2.0

    EPA Science Inventory

    The Automated Geospatial Watershed Assessment (AGWA; http://www.epa.gov/nerlesd1/landsci/agwa/introduction.htm and www.tucson.ars.ag.gov/agwa) tool is a GIS interface jointly developed by the U.S. Environmental Protection Agency, USDA-Agricultural Research Service, University of Arizon...

  6. Automated Geospatial Watershed Assessment (AGWA) 3.0 Software Tool

    EPA Science Inventory

    The Automated Geospatial Watershed Assessment (AGWA) tool has been developed under an interagency research agreement between the U.S. Environmental Protection Agency, Office of Research and Development, and the U.S. Department of Agriculture, Agricultural Research Service. AGWA i...

  7. Automated Geospatial Watershed Assessment Tool (AGWA) Poster Presentation

    EPA Science Inventory

    The Automated Geospatial Watershed Assessment tool (AGWA, see: www.tucson.ars.ag.gov/agwa or http://www.epa.gov/esd/land-sci/agwa/) is a GIS interface jointly developed by the USDA-Agricultural Research Service, the U.S. Environmental Protection Agency, the University of Arizona...

  8. Automated Rendezvous and Capture in Space: A Technology Assessment

    NASA Technical Reports Server (NTRS)

    Polites, Michael E.

    1998-01-01

    This paper presents the results of a study to assess the technology of automated rendezvous and capture (AR&C) in space. The outline of the paper is as follows: First, the history of manual and automated rendezvous and capture and rendezvous and dock is presented. Next, the need for AR&C in space is reviewed. In light of these, AR&C systems are proposed that meet NASA's future needs, but can be developed in a reasonable amount of time with a reasonable amount of money. Technology plans for developing these systems are presented; cost and schedule are included.

  9. A modular approach for automated sample preparation and chemical analysis

    NASA Technical Reports Server (NTRS)

    Clark, Michael L.; Turner, Terry D.; Klingler, Kerry M.; Pacetti, Randolph

    1994-01-01

    Changes in international relations, especially within the past several years, have dramatically affected the programmatic thrusts of the U.S. Department of Energy (DOE). The DOE now is addressing the environmental cleanup required as a result of 50 years of nuclear arms research and production. One major obstacle in the remediation of these areas is the chemical determination of potentially contaminated material using currently acceptable practices. Process bottlenecks and exposure to hazardous conditions pose problems for the DOE. One proposed solution is the application of modular automated chemistry using Standard Laboratory Modules (SLM) to perform Standard Analysis Methods (SAM). The Contaminant Analysis Automation (CAA) Program has developed standards and prototype equipment that will accelerate the development of modular chemistry technology and is transferring this technology to private industry.

  10. Operator-based metric for nuclear operations automation assessment

    SciTech Connect

    Zacharias, G.L.; Miao, A.X.; Kalkan, A.

    1995-04-01

    Continuing advances in real-time computational capabilities will support enhanced levels of smart automation and AI-based decision-aiding systems in the nuclear power plant (NPP) control room of the future. To support development of these aids, we describe in this paper a research tool, and more specifically, a quantitative metric, to assess the impact of proposed automation/aiding concepts in a manner that can account for a number of interlinked factors in the control room environment. In particular, we describe a cognitive operator/plant model that serves as a framework for integrating the operator's information-processing capabilities with his procedural knowledge, to provide insight as to how situations are assessed by the operator, decisions made, procedures executed, and communications conducted. Our focus is on the situation assessment (SA) behavior of the operator, the development of a quantitative metric reflecting overall operator awareness, and the use of this metric in evaluating automation/aiding options. We describe the results of a model-based simulation of a selected emergency scenario, and metric-based evaluation of a range of contemplated NPP control room automation/aiding options. The results demonstrate the feasibility of model-based analysis of contemplated control room enhancements, and highlight the need for empirical validation.
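
    One simple way to turn situation awareness into a number, loosely in the spirit of the metric described, is the safety-weighted agreement between the operator's believed plant state and the true plant state. The variables, weights, and scoring rule below are illustrative assumptions, not the authors' model.

```python
def sa_score(true_state, believed_state, weights):
    # Weighted fraction of plant variables the operator currently has right;
    # 1.0 = full awareness, 0.0 = none. Weights encode safety relevance.
    total = sum(weights.values())
    hit = sum(w for var, w in weights.items()
              if believed_state.get(var) == true_state[var])
    return hit / total

true_state = {"pump_A": "on", "valve_3": "closed", "pressure": "high"}
believed   = {"pump_A": "on", "valve_3": "open",   "pressure": "high"}
weights    = {"pump_A": 1.0, "valve_3": 2.0, "pressure": 3.0}
print(sa_score(true_state, believed, weights))   # 4/6 ≈ 0.667
```

    A metric of this shape lets automation/aiding options be compared by how much awareness they preserve across a simulated scenario, which is the use the abstract describes.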

  11. Requirements specification for automated fall and injury risk assessment.

    PubMed

    Currie, Leanne M; Mellino, Lourdes V; Cimino, James J; Li, Jianhua; Bakken, Suzanne

    2006-01-01

    Fall and injury prevention continues to be a challenge in the acute care environment. Identification of patients at risk can guide preventive care for these individuals. This study employed usability engineering methods, via a series of focus groups, to specify functional and design requirements for an automated Fall-Injury Risk Assessment Instrument. Focus groups were held with interdisciplinary decision makers and end-users. The results were mapped to usability heuristics, which were used to guide design decisions. The main elements identified were data completeness, workflow processes, resource access, and cognitive burden. The main usability factors identified were efficiency of use, match with the real world, error prevention, recognition rather than recall, and minimalist design. Focus groups are a useful methodology for specifying requirements for healthcare applications. An outcomes evaluation of the automated instrument is in progress. PMID:17102234

  12. Assessing Mitochondrial Movement Within Neurons: Manual Versus Automated Tracking Methods.

    PubMed

    Bros, Helena; Hauser, Anja; Paul, Friedemann; Niesner, Raluca; Infante-Duarte, Carmen

    2015-08-01

    Owing to the small size of mitochondria and the complexity of their motility patterns, mitochondrial tracking is technically challenging. Mitochondria are often tracked manually; however, this is time-consuming and prone to measurement error. Here, we examined the suitability of four commercial and open-source software alternatives for automated mitochondrial tracking in neurons compared with manual measurements. We show that all the automated tracking tools dramatically underestimated track length, mitochondrial displacement and movement duration, with reductions ranging from 45 to 77% of the values obtained manually. In contrast, mitochondrial velocity was generally overestimated. Only the number of motile mitochondria and their directionality were similar between strategies. Despite these discrepancies, we show that automated tools successfully detected transport alterations after applying an oxidant agent. Thus, automated methods appear to be suitable for assessing relative transport differences between experimental groups, but not for absolute quantification of mitochondrial dynamics. Although useful for objective and time-efficient measurements of mitochondrial movements, results provided by automated methods should be interpreted with caution.
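
    The conclusion that automated tracking suits relative but not absolute comparisons follows if the tracker's error is roughly multiplicative: a constant bias cancels when groups are compared as ratios. A toy illustration with an assumed 50% underestimate (all numbers invented):

```python
control = [10.0, 12.0, 11.0]   # manual track lengths, µm (illustrative)
treated = [5.0, 6.0, 5.5]      # oxidant-treated group (illustrative)

bias = 0.5                     # suppose the tracker reports only ~50% of true length
auto_control = [bias * x for x in control]
auto_treated = [bias * x for x in treated]

def mean(xs):
    return sum(xs) / len(xs)

# Absolute values are off by a factor of two...
print(mean(auto_control), mean(control))          # 5.5 11.0
# ...but the treated/control ratio is unchanged, so the group effect survives.
print(mean(treated) / mean(control))              # 0.5
print(mean(auto_treated) / mean(auto_control))    # 0.5
```

    In practice the bias reported in the abstract varied from 45 to 77% across tools and parameters, so this cancellation only holds approximately, which is why the authors caution against absolute quantification.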

  13. Systematic Nursing Assessment: A Step toward Automation.

    ERIC Educational Resources Information Center

    State Univ. of New York, Buffalo. School of Nursing.

    The project's broad objective was to improve patient care through the development of a manual or computer-assisted tool for assessing patient health/illness status and recording essential information throughout a period of care. The project sought contributions from practicing nurses in identifying the descriptive clinical information needed to…

  14. Automated Assessment and Experiences of Teaching Programming

    ERIC Educational Resources Information Center

    Higgins, Colin A.; Gray, Geoffrey; Symeonidis, Pavlos; Tsintsifas, Athanasios

    2005-01-01

    This article reports on the design, implementation, and usage of the CourseMarker (formerly known as CourseMaster) courseware Computer Based Assessment (CBA) system at the University of Nottingham. Students use CourseMarker to solve (programming) exercises and to submit their solutions. CourseMarker returns immediate results and feedback to the…

  15. Evaluation of the measurement uncertainty in automated long-term sampling of PCDD/PCDFs.

    PubMed

    Vicaretti, M; D'Emilia, G; Mosca, S; Guerriero, E; Rotatori, M

    2013-12-01

    Since the publication of the first version of European standard EN-1948 in 1996, long-term sampling equipment has been improved to a high standard for the sampling and analysis of polychlorodibenzo-p-dioxin (PCDD)/polychlorodibenzofuran (PCDF) emissions from industrial sources. Current automated PCDD/PCDF sampling systems extend the measurement time from 6-8 h to 15-30 days, so that the data better represent the plant's real pollutant emissions over the long term. EN-1948:2006 is still the European technical reference standard for the determination of PCDD/PCDF from stationary source emissions. In this paper, a methodology to estimate the measurement uncertainty of long-term automated sampling is presented. The methodology has been tested on a set of high-concentration sampling data from a specific campaign; it is proposed with the intent that it be applied to further similar studies and generalized. A comparison between short-term sampling data from parallel manual and automated measurements was also considered, to verify the feasibility and usefulness of automated systems and to establish correlations between the results of the two methods, so that the manual method can be used to calibrate the automated long-term one. The uncertainty components of the manual method are analyzed, following the requirements of EN-1948-3:2006, allowing a preliminary evaluation of the corresponding uncertainty components of the automated system. Then, experimental data from parallel sampling campaigns carried out over short- and long-term sampling periods are compared. Long-term sampling is more reliable for monitoring PCDD/PCDF emissions than occasional short-term sampling. Automated sampling systems can provide very useful emission data over both short and long sampling periods. Despite this, due to the different application of the long-term sampling systems, the automated results could not be
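
    Once independent relative uncertainty components have been quantified, as EN-1948-3 requires for the manual method, they combine by root-sum-of-squares in the usual GUM fashion. A minimal sketch, with invented component values rather than those of the study:

```python
import math

def combined_relative_uncertainty(components):
    # Root-sum-of-squares combination of independent relative standard
    # uncertainties (GUM approach); returns a relative standard uncertainty.
    return math.sqrt(sum(u ** 2 for u in components.values()))

components = {                    # illustrative values, not those of the study
    "sampling_volume": 0.02,
    "extraction_recovery": 0.05,
    "calibration": 0.04,
    "standard_purity": 0.01,
}
u_c = combined_relative_uncertainty(components)
U = 2 * u_c                       # expanded uncertainty, coverage factor k = 2
print(round(u_c, 4), round(U, 4)) # → 0.0678 0.1356
```

    Correlated components would need covariance terms, but for independent contributions this quadrature sum is the standard first step in an uncertainty budget of the kind the abstract describes.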

  16. Flightdeck Automation Problems (FLAP) Model for Safety Technology Portfolio Assessment

    NASA Technical Reports Server (NTRS)

    Ancel, Ersin; Shih, Ann T.

    2014-01-01

    NASA's Aviation Safety Program (AvSP) develops and advances methodologies and technologies to improve air transportation safety. The Safety Analysis and Integration Team (SAIT) conducts a safety technology portfolio assessment (PA) to analyze the program content, to examine the benefits and risks of products with respect to program goals, and to support programmatic decision making. The PA process includes systematic identification of current and future safety risks as well as tracking several quantitative and qualitative metrics to ensure the program goals are addressing prominent safety risks accurately and effectively. One of the metrics within the PA process involves using quantitative aviation safety models to gauge the impact of the safety products. This paper demonstrates the role of aviation safety modeling by providing model outputs and evaluating a sample of portfolio elements using the Flightdeck Automation Problems (FLAP) model. The model enables not only ranking of the quantitative relative risk reduction impact of all portfolio elements, but also highlighting the areas with high potential impact via sensitivity and gap analyses in support of the program office. Although the model outputs are preliminary and products are notional, the process shown in this paper is essential to a comprehensive PA of NASA's safety products in the current program and future programs/projects.

  17. AUTOMATED GEOSPATIAL WATERSHED ASSESSMENT (AGWA): A GIS-BASED TOOL FOR WATERSHED ASSESSMENT AND PLANNING

    EPA Science Inventory

    The Automated Geospatial Watershed Assessment tool (AGWA) is a GIS interface jointly developed by the USDA Agricultural Research Service, the U.S. Environmental Protection Agency, the University of Arizona, and the University of Wyoming to automate the parameterization and execu...

  18. Classification, change-detection and accuracy assessment: Toward fuller automation

    NASA Astrophysics Data System (ADS)

    Podger, Nancy E.

    This research aims to automate methods for conducting change detection studies using remotely sensed images. Five major objectives were tested on two study sites, one encompassing Madison, Wisconsin, and the other Fort Hood, Texas. (Objective 1) Enhance accuracy assessments by estimating standard errors using bootstrap analysis. Bootstrap estimates of the standard errors were found to be comparable to parametric statistical estimates. Also, results show that bootstrapping can be used to evaluate the consistency of a classification process. (Objective 2) Automate the guided clustering classifier. This research shows that the guided clustering classification process can be automated while maintaining highly accurate results. Three different evaluation methods were used. (Evaluation 1) Appraised the consistency of 25 classifications produced from the automated system. The classifications differed from one another by only two to four percent. (Evaluation 2) Compared accuracies produced by the automated system to classification accuracies generated following a manual guided clustering protocol. Results: The automated system produced higher overall accuracies in 50 percent of the tests and was comparable for all but one of the remaining tests. (Evaluation 3) Assessed the time and effort required to produce accurate classifications. Results: The automated system produced classifications in less time and with less effort than the manual 'protocol' method. (Objective 3) Built a flexible, interactive software tool to aid in producing binary change masks. (Objective 4) Reduced by automation the amount of training data needed to classify the second image of a two-time-period change detection project. Locations of the training sites in 'unchanged' areas employed to classify the first image were used to identify sites where spectral information was automatically extracted from the second image. Results: The automatically generated training data produces classification accuracies
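
    Objective 1's bootstrap standard error can be computed by resampling the per-pixel correctness indicators of an accuracy-assessment sample. A minimal stdlib sketch (sample size, accuracy, and replicate count are illustrative):

```python
import random

def bootstrap_se_accuracy(correct, n_boot=2000, seed=42):
    """Bootstrap standard error of overall accuracy from per-pixel 0/1
    correctness indicators (1 = reference pixel classified correctly)."""
    rng = random.Random(seed)
    n = len(correct)
    accs = []
    for _ in range(n_boot):
        # Resample the reference pixels with replacement and recompute accuracy.
        resample = [correct[rng.randrange(n)] for _ in range(n)]
        accs.append(sum(resample) / n)
    mean = sum(accs) / n_boot
    var = sum((a - mean) ** 2 for a in accs) / (n_boot - 1)
    return var ** 0.5

# 200 reference pixels, 85% classified correctly (illustrative)
correct = [1] * 170 + [0] * 30
se = bootstrap_se_accuracy(correct)
# Should be close to the parametric binomial SE sqrt(p(1-p)/n) ≈ 0.0252,
# which is the comparability the abstract reports.
print(round(se, 3))
```

    The same resampling loop, run over repeated classifications rather than pixels, supports the abstract's second use of bootstrapping: evaluating the consistency of a classification process.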

  19. Automated assessment of medical training evaluation text.

    PubMed

    Zhang, Rui; Pakhomov, Serguei; Gladding, Sophia; Aylward, Michael; Borman-Shoap, Emily; Melton, Genevieve B

    2012-01-01

    Medical post-graduate residency training and medical student training increasingly utilize electronic systems to evaluate trainee performance based on defined training competencies, with quantitative and qualitative data, the latter of which typically consists of text comments. Medical education is concomitantly becoming a growing area of clinical research. While electronic systems have proliferated in number, little work has been done to help manage and analyze qualitative data from these evaluations. We explored the use of text-mining techniques to assist medical education researchers in sentiment analysis and topic analysis of residency evaluations, with a sample of 812 evaluation statements. While comments were predominantly positive, sentiment analysis improved the ability to discriminate statements with 93% accuracy. Similar to other domains, Latent Dirichlet Allocation and Information Gain revealed groups of core subjects and appear to be useful for identifying topics in this data. PMID:23304426

  1. Situation Awareness and Levels of Automation: Empirical Assessment of Levels of Automation in the Commercial Cockpit

    NASA Technical Reports Server (NTRS)

    Kaber, David B.; Schutte, Paul C. (Technical Monitor)

    2000-01-01

    This report has been prepared to close out a NASA grant to Mississippi State University (MSU) for research into situation awareness (SA) and automation in the advanced commercial aircraft cockpit. The grant was divided into two obligations, including $60,000 for the period from May 11, 2000 to December 25, 2000. The information presented in this report summarizes work completed through this obligation. It also details work to be completed with the balance of the current obligation and unobligated funds amounting to $50,043, which are to be granted to North Carolina State University for completion of the research project from July 31, 2000 to May 10, 2001. This research was to involve investigation of the effects of a broad spectrum of degrees of automation of complex systems on human-machine performance and SA. The work was to empirically assess the effect of theoretical levels of automation (LOAs), described in a taxonomy developed by Endsley & Kaber (1999), on naive and experienced subject performance and SA in simulated flight tasks. The study was to be conducted in the context of a realistic simulation of aircraft flight control. The objective of this work was to identify LOAs that effectively integrate humans and machines under normal operating conditions and failure modes. In general, the work was to provide insight into the design of automation in the commercial aircraft cockpit. Both laboratory and field investigations were to be conducted. At this point in time, a high-fidelity flight simulator of the McDonnell Douglas MD-11 aircraft has been completed. The simulator integrates a reconfigurable flight simulator developed by the Georgia Institute of Technology and stand-alone simulations of MD-11 autoflight systems developed at MSU. Use of the simulator has been integrated into a study plan for the laboratory research, and it is expected that the simulator will also be used in the field study with actual commercial pilots. In addition to the flight simulator, an electronic

  2. Development of an automated data processing method for sample to sample comparison of seized methamphetamines.

    PubMed

    Choe, Sanggil; Lee, Jaesin; Choi, Hyeyoung; Park, Yujin; Lee, Heesang; Pyo, Jaesung; Jo, Jiyeong; Park, Yonghoon; Choi, Hwakyung; Kim, Suncheun

    2012-11-30

    The information about the sources of supply, trafficking routes, distribution patterns and conspiracy links can be obtained from methamphetamine profiling. The precursor and synthetic method used in clandestine manufacture can be estimated from the analysis of minor impurities contained in methamphetamine. Also, the similarity between samples can be evaluated using the peaks that appear in chromatograms. In South Korea, methamphetamine was the most popular drug, but the total amount of methamphetamine seized throughout the country was very small. Therefore, finding the links between samples is more important than the other uses of methamphetamine profiling. Many Asian countries, including Japan and South Korea, have been using the method developed by the National Research Institute of Police Science of Japan. The method uses a gas chromatography-flame ionization detector (GC-FID), a DB-5 column, and four internal standards, and was developed to increase the amount of impurities and minimize the amount of methamphetamine. After GC-FID analysis, the raw data have to be processed. The data processing steps are very complex and require a lot of time and effort. In this study, Microsoft Visual Basic for Applications (VBA) modules were developed to handle these data processing steps. The modules collect the results into an Excel file and then correct the retention time shift and response deviation generated during sample preparation and instrument analysis. The developed modules were tested for their performance using 10 samples from 5 different cases. The processed results were analyzed with the Pearson correlation coefficient for similarity assessment, and the correlation coefficient between two samples from the same case was more than 0.99. When the modules were applied to 131 seized methamphetamine samples, four samples from two different cases were found to have a common origin, and the chromatograms of the four samples appeared visually identical
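    The similarity step above can be sketched as follows. Assuming peak-area vectors that have already been retention-time aligned and response-corrected (the numbers below are invented, not case data), the Pearson correlation coefficient flags likely common-origin pairs:

```python
from math import sqrt

# Hedged sketch of the similarity assessment described above: given
# impurity peak-area vectors from two chromatograms, compute the Pearson
# correlation coefficient. In the study, values above ~0.99 suggested a
# common origin; the peak areas here are hypothetical.

def pearson(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sqrt(sum((a - mx) ** 2 for a in x))
    sy = sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

sample_a = [12.1, 3.4, 0.8, 45.2, 7.7]   # hypothetical peak areas
sample_b = [11.9, 3.6, 0.7, 44.8, 7.9]
print(round(pearson(sample_a, sample_b), 4))
```

    The hard part in practice is the alignment and response correction that precede this calculation; once vectors are comparable, the correlation itself is a one-liner.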

  4. Automated versus Manual Sample Inoculations in Routine Clinical Microbiology: a Performance Evaluation of the Fully Automated InoqulA Instrument

    PubMed Central

    Froment, P.; Marchandin, H.; Vande Perre, P.

    2014-01-01

    The process of plate streaking has been automated to improve the culture readings, isolation quality, and workflow of microbiology laboratories. However, instruments have not been well evaluated under routine conditions. We aimed to evaluate the performance of the fully automated InoqulA instrument (BD Kiestra B.V., The Netherlands) in the automated seeding of liquid specimens and samples collected using swabs with transport medium. We compared manual and automated methods according to the (i) within-run reproducibility using Escherichia coli-calibrated suspensions, (ii) intersample contamination using a series of alternating sterile broths and broths with >105 CFU/ml of either E. coli or Proteus mirabilis, (iii) isolation quality with standardized mixed bacterial suspensions of diverse complexity and a 4-category standardized scale (very poor, poor, fair to good, or excellent), and (iv) agreement of the results obtained from 244 clinical specimens. By involving 15 technicians in the latter part of the comparative study, we estimated the variability in the culture quality at the level of the laboratory team. The instrument produced satisfactory reproducibility with no sample cross-contamination, and it performed better than the manual method, with more colony types recovered and isolated (up to 11% and 17%, respectively). Finally, we showed that the instrument did not shorten the seeding time over short periods of work compared to that for the manual method. Altogether, the instrument improved the quality and standardization of the isolation, thereby contributing to a better overall workflow, shortened the time to results, and provided more accurate results for polymicrobial specimens. PMID:24353001

  5. Automated Sample Exchange Robots for the Structural Biology Beam Lines at the Photon Factory

    SciTech Connect

    Hiraki, Masahiko; Watanabe, Shokei; Yamada, Yusuke; Matsugaki, Naohiro; Igarashi, Noriyuki; Gaponov, Yurii; Wakatsuki, Soichi

    2007-01-19

    We are now developing automated sample exchange robots for high-throughput protein crystallographic experiments for onsite use at synchrotron beam lines. It is part of the fully automated robotics systems being developed at the Photon Factory, for the purposes of protein crystallization, monitoring crystal growth, harvesting and freezing crystals, mounting the crystals inside a hutch and for data collection. We have already installed the sample exchange robots based on the SSRL automated mounting system at our insertion device beam lines BL-5A and AR-NW12A at the Photon Factory. In order to reduce the time required for sample exchange further, a prototype of a double-tonged system was developed. As a result of preliminary experiments with double-tonged robots, the sample exchange time was successfully reduced from 70 seconds to 10 seconds with the exception of the time required for pre-cooling and warming up the tongs.

  6. The Stanford Automated Mounter: Pushing the limits of sample exchange at the SSRL macromolecular crystallography beamlines

    DOE PAGES

    Russi, Silvia; Song, Jinhu; McPhillips, Scott E.; Cohen, Aina E.

    2016-02-24

    The Stanford Automated Mounter System, a system for mounting and dismounting cryo-cooled crystals, has been upgraded to increase the throughput of samples on the macromolecular crystallography beamlines at the Stanford Synchrotron Radiation Lightsource. This upgrade speeds up robot maneuvers, reduces the heating/drying cycles, pre-fetches samples and adds an air-knife to remove frost from the gripper arms. As a result, sample pin exchange during automated crystal quality screening now takes about 25 s, five times faster than before this upgrade.

  7. The Stanford Automated Mounter: pushing the limits of sample exchange at the SSRL macromolecular crystallography beamlines

    PubMed Central

    Russi, Silvia; Song, Jinhu; McPhillips, Scott E.; Cohen, Aina E.

    2016-01-01

    The Stanford Automated Mounter System, a system for mounting and dismounting cryo-cooled crystals, has been upgraded to increase the throughput of samples on the macromolecular crystallography beamlines at the Stanford Synchrotron Radiation Lightsource. This upgrade speeds up robot maneuvers, reduces the heating/drying cycles, pre-fetches samples and adds an air-knife to remove frost from the gripper arms. Sample pin exchange during automated crystal quality screening now takes about 25 s, five times faster than before this upgrade. PMID:27047309

  8. Automated Research Impact Assessment: A New Bibliometrics Approach

    PubMed Central

    Drew, Christina H.; Pettibone, Kristianna G.; Finch, Fallis Owen; Giles, Douglas; Jordan, Paul

    2016-01-01

    As federal programs are held more accountable for their research investments, the National Institute of Environmental Health Sciences (NIEHS) has developed a new method to quantify the impact of our funded research on the scientific and broader communities. In this article we review traditional bibliometric analyses, address challenges associated with them, and describe a new bibliometric analysis method, the Automated Research Impact Assessment (ARIA). ARIA taps into a resource that has only rarely been used for bibliometric analyses: references cited in “important” research artifacts, such as policies, regulations, clinical guidelines, and expert panel reports. The approach includes new statistics that science managers can use to benchmark contributions to research by funding source. This new method provides the ability to conduct automated impact analyses of federal research that can be incorporated in program evaluations. We apply this method to several case studies to examine the impact of NIEHS-funded research. PMID:26989272

  9. Automated biowaste sampling system, solids subsystem operating model, part 2

    NASA Technical Reports Server (NTRS)

    Fogal, G. L.; Mangialardi, J. K.; Stauffer, R. E.

    1973-01-01

    The detail design and fabrication of the Solids Subsystem were implemented. The system's capacity for the collection, storage or sampling of feces and vomitus from six subjects was tested and verified.

  10. Automated sample treatment with the injection of large sample volumes for the determination of contaminants and metabolites in urine.

    PubMed

    Rodríguez-Gonzalo, Encarnación; García-Gómez, Diego; Herrero-Hernández, Eliseo; Carabias-Martínez, Rita

    2010-08-01

    This work reports the development of a simple and automated method for the quantitative determination of several contaminants (triazine, phenylurea, and phenoxyacid herbicides; carbamate insecticides and industrial chemicals) and their metabolites in human urine with a simplified sample treatment. The method is based on the online coupling of an extraction column with RP LC separation-UV detection; this coupling enabled fast online cleanup of the urine samples, efficiently eliminating matrix components and providing appropriate selectivity for the determination of such compounds. The variables affecting the automated method were optimized: sorbent type, washing solvent and time, and the sample volume injected. The optimized sample treatment reported here allowed the direct injection of large volumes of urine (1500 microL) into the online system as a way to improve the sensitivity of the method; limits of detection in the 1-10 ng/mL range were achieved for an injected volume of 1500 microL of urine, precision being 10% or better at a concentration level of 20 ng/mL. The online configuration proposed has advantages such as automation (all the steps involved in the analysis - injection of the urine, sample cleanup, analyte enrichment, separation and detection - are carried out automatically) with high precision and sensitivity, reducing manual sample manipulation to freezing and sample filtration.

  11. Automated biowaste sampling system urine subsystem operating model, part 1

    NASA Technical Reports Server (NTRS)

    Fogal, G. L.; Mangialardi, J. K.; Rosen, F.

    1973-01-01

    The urine subsystem automatically provides for the collection, volume sensing, and sampling of urine from six subjects during space flight. Verification of the subsystem design was a primary objective of the current effort, which was accomplished through the detail design, fabrication, and verification testing of an operating model of the subsystem.

  12. Continuous monitoring, automated analysis, and sampling procedures. [Review (63 references)

    SciTech Connect

    Pitt, W.W. Jr.

    1981-06-01

    This article emphasizes the need for a well-documented quality control system in wastewater monitoring and sampling procedures. The US EPA has continued its strong emphasis on effluent monitoring and has published a list of 155 organic chemicals and 23 plastic or synthetic materials industries for which it proposed to require monitoring of the process wastewater under the Clean Water Act. (KRM)

  13. An Automated Sample Divider for Farmers Stock Peanuts

    Technology Transfer Automated Retrieval System (TEKTRAN)

    In-shell peanuts are harvested, loaded into drying trailers, and delivered to a central facility where they are dried to a moisture content safe for long term storage, sampled, graded, then unloaded into bulk storage. Drying trailers have capacities ranging from five to twenty-five tons of dry farme...

  14. Automation of the Papanicolaou smear: a technology assessment perspective.

    PubMed

    Linder, J

    1997-03-01

    Cytology automation has captured the attention of industry, the public, and the pathology community as a potential solution to false negatives and other limitations of the conventional Papanicolaou smear. However, cytology automation includes a mixed group of technologies, including cytology rescreening, prescreening, independent screening, automated preparation technologies, and screening process control. While certain of these technologies may prove valuable to improving the quality of the Papanicolaou test, a structured analysis approach, such as is offered by technology assessment, is required to determine whether the technology is safe, effective under conditions of actual use, cost-effective, and whether it adds value and improves outcomes in patient care. Such studies must be carefully constructed to eliminate bias so that proper decisions can be made. The implications of these devices on individual screening standards are yet to be determined; sufficient peer-reviewed literature studies must accumulate to document their value, and the cytology community must participate with other interested parties in establishing the standard for care.

  15. An Automated Algorithm to Screen Massive Training Samples for a Global Impervious Surface Classification

    NASA Technical Reports Server (NTRS)

    Tan, Bin; Brown de Colstoun, Eric; Wolfe, Robert E.; Tilton, James C.; Huang, Chengquan; Smith, Sarah E.

    2012-01-01

    An algorithm was developed to automatically screen outliers from the massive training samples for the Global Land Survey - Imperviousness Mapping Project (GLS-IMP). GLS-IMP will produce a global 30 m spatial resolution impervious cover data set for the years 2000 and 2010 based on the Landsat Global Land Survey (GLS) data set. This unprecedented high-resolution impervious cover data set is not only significant for urbanization studies but also needed for global carbon, hydrology, and energy balance research. A supervised classification method, regression tree, is applied in this project, and a set of accurate training samples is the key to any supervised classification. We developed global training samples from fine-resolution (approximately 1 m) satellite data (Quickbird and WorldView-2) and then aggregated the fine-resolution impervious cover maps to 30 m resolution. To improve classification accuracy, the training samples should be screened before being used to train the regression tree, but it is impossible to manually screen 30 m resolution training samples collected globally. In Europe alone, for example, there are 174 training sites, ranging in size from 4.5 km by 4.5 km to 8.1 km by 3.6 km, with more than six million training samples. We therefore developed this automated, statistics-based algorithm to screen the training samples at two levels: site and scene. At the site level, all training samples are divided into 10 groups according to the percentage of impervious surface within a sample pixel; the samples falling within each 10% interval form one group. For each group, both univariate and multivariate outliers are detected and removed. The screening then escalates to the scene level, where a similar process with a looser threshold is applied to account for possible variance due to site differences. We do not perform the screening across scenes because the scenes might vary due to
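    The site-level screening step can be sketched as follows. This is a simplified, hypothetical version: it bins samples by impervious fraction into 10% groups and removes univariate outliers by z-score, omitting the multivariate (e.g., Mahalanobis-distance) screening and the scene-level pass the project also performs.

```python
from statistics import mean, stdev

# Illustrative sketch of site-level training-sample screening: bin samples
# by impervious fraction into 10% groups, then drop univariate outliers
# whose z-score exceeds a threshold. The data and threshold are invented.

def screen(samples, z_max=2.5):
    """samples: list of (impervious_fraction, reflectance) tuples."""
    kept = []
    for lo in range(0, 100, 10):
        group = [s for s in samples if lo <= s[0] * 100 < lo + 10]
        if len(group) < 3:          # too few points to estimate spread
            kept.extend(group)
            continue
        vals = [r for _, r in group]
        m, sd = mean(vals), stdev(vals)
        kept.extend(s for s in group if sd == 0 or abs(s[1] - m) / sd <= z_max)
    return kept

# Nine samples in the 10-20% bin; the last one has an anomalous reflectance.
data = [(0.12, 0.21), (0.15, 0.19), (0.18, 0.22), (0.11, 0.20), (0.14, 0.18),
        (0.16, 0.23), (0.19, 0.21), (0.13, 0.20), (0.17, 0.95)]
print(len(screen(data)))  # the anomalous sample is removed
```

    Per-bin statistics matter here: an outlier that is extreme for 10-20% impervious pixels might be perfectly normal for 80-90% pixels, which is why the screening is not done globally.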

  16. The Impact of Sampling Approach on Population Invariance in Automated Scoring of Essays. Research Report. ETS RR-13-18

    ERIC Educational Resources Information Center

    Zhang, Mo

    2013-01-01

    Many testing programs use automated scoring to grade essays. One issue in automated essay scoring that has not been examined adequately is population invariance and its causes. The primary purpose of this study was to investigate the impact of sampling in model calibration on population invariance of automated scores. This study analyzed scores…

  17. Automated Portable Test System (APTS) - A performance envelope assessment tool

    NASA Technical Reports Server (NTRS)

    Kennedy, R. S.; Dunlap, W. P.; Jones, M. B.; Wilkes, R. L.; Bittner, A. C., Jr.

    1985-01-01

    The reliability and stability of microcomputer-based psychological tests are evaluated. The hardware, test programs, and system control of the Automated Portable Test System, which assesses human performance and subjective status, are described. Subjects were administered 11 paper-and-pencil and microcomputer-based tests for 10 sessions. The data reveal that nine of the 10 tests stabilized by the third administration; intertrial correlations were high and consistent. It is noted that the microcomputer-based tests display good psychometric properties in terms of differential stability and reliability.

  18. Automated sample mounting and alignment system for biological crystallography at a synchrotron source

    SciTech Connect

    Snell, Gyorgy; Cork, Carl; Nordmeyer, Robert; Cornell, Earl; Meigs, George; Yegian, Derek; Jaklevic, Joseph; Jin, Jian; Stevens, Raymond C.; Earnest, Thomas

    2004-01-07

    High-throughput data collection for macromolecular crystallography requires an automated sample mounting system for cryo-protected crystals that functions reliably when integrated into protein-crystallography beamlines at synchrotrons. Rapid mounting and dismounting of the samples increases the efficiency of the crystal screening and data collection processes, where many crystals can be tested for the quality of diffraction. The sample-mounting subsystem has random access to 112 samples stored under liquid nitrogen. Results of extensive tests regarding the performance and reliability of the system are presented. To further increase throughput, we have also developed a sample transport/storage system based on ''puck-shaped'' cassettes, which can hold sixteen samples each. Seven cassettes fit into a standard dry shipping Dewar. The capabilities of the robotic crystal mounting and alignment system, together with instrumentation control software and a relational database, allow automated screening and data collection protocols to be developed.

  19. Automated syringe sampler. [remote sampling of air and water

    NASA Technical Reports Server (NTRS)

    Purgold, G. C. (Inventor)

    1981-01-01

    A number of sampling devices are disposed in a rack which slides into a housing. In response to a signal from an antenna, circuitry elements are activated which provide power individually, collectively, or selectively to a servomechanism, thereby moving an actuator arm and the attached jawed bracket supporting an evacuated tube toward a stationary needle. One open end of the needle extends through the side wall of a conduit to the interior, and the other open end is maintained within a protective sleeve supported by a bifurcated bracket. A septum is punctured by the end of the needle within the sleeve, and a sample of the fluid medium in the conduit flows through the needle and is transferred to the tube. The signal to the servo is then reversed, and the actuator arm moves the tube back to its original position, permitting the septum to expand and seal the hole made by the needle. The jawed bracket is attached by a pivot to the actuator to facilitate tube replacement.

  20. Assessing Working Memory in Spanish-Speaking Children: Automated Working Memory Assessment Battery Adaptation

    ERIC Educational Resources Information Center

    Injoque-Ricle, Irene; Calero, Alejandra D.; Alloway, Tracy P.; Burin, Debora I.

    2011-01-01

    The Automated Working Memory Assessment battery was designed to assess verbal and visuospatial passive and active working memory processing in children and adolescents. The aim of this paper is to present the adaptation and validation of the AWMA battery to Argentinean Spanish-speaking children aged 6 to 11 years. Verbal subtests were adapted and…

  1. Automated Geospatial Watershed Assessment Tool (AGWA): Applications for Fire Management and Assessment.

    EPA Science Inventory

    New tools and functionality have been incorporated into the Automated Geospatial Watershed Assessment Tool (AGWA) to assess the impacts of wildland fire on runoff and erosion. AGWA (see: www.tucson.ars.ag.gov/agwa or http://www.epa.gov/esd/land-sci/agwa/) is a GIS interface joi...

  2. Automated Sample Preparation for Radiogenic and Non-Traditional Metal Isotopes: Removing an Analytical Barrier for High Sample Throughput

    NASA Astrophysics Data System (ADS)

    Field, M. Paul; Romaniello, Stephen; Gordon, Gwyneth W.; Anbar, Ariel D.; Herrmann, Achim; Martinez-Boti, Miguel A.; Anagnostou, Eleni; Foster, Gavin L.

    2014-05-01

    MC-ICP-MS has dramatically improved the analytical throughput for high-precision radiogenic and non-traditional isotope ratio measurements compared to TIMS. The generation of large data sets, however, remains hampered by the tedious manual drip chromatography required for sample purification. A new, automated chromatography system reduces this laboratory bottleneck and expands the utility of high-precision isotope analyses in applications where large data sets are required: geochemistry, forensic anthropology, nuclear forensics, medical research and food authentication. We have developed protocols to automate ion exchange purification for several isotopic systems (B, Ca, Fe, Cu, Zn, Sr, Cd, Pb and U) using the new prepFAST-MC™ (ESI, Omaha, Nebraska). The system is not only inert (all-fluoropolymer flow paths) but also very flexible, and can easily accommodate different resins, samples, and reagent types. When programmed, precise and accurate user-defined volumes and flow rates are implemented to automatically load samples, wash the column, condition the column, and elute fractions. Unattended, the automated, low-pressure ion exchange chromatography system can process up to 60 samples overnight. Excellent reproducibility, reliability, and recovery, with low blanks and carryover, have been demonstrated for samples in a variety of different matrices, giving accurate and precise isotopic ratios within analytical error for several isotopic systems (B, Ca, Fe, Cu, Zn, Sr, Cd, Pb and U). This illustrates the potential of the new prepFAST-MC™ as a powerful tool in radiogenic and non-traditional isotope research.

  3. Assessing respondent-driven sampling.

    PubMed

    Goel, Sharad; Salganik, Matthew J

    2010-04-13

    Respondent-driven sampling (RDS) is a network-based technique for estimating traits in hard-to-reach populations, for example, the prevalence of HIV among drug injectors. In recent years RDS has been used in more than 120 studies in more than 20 countries and by leading public health organizations, including the Centers for Disease Control and Prevention in the United States. Despite the widespread use and growing popularity of RDS, there has been little empirical validation of the methodology. Here we investigate the performance of RDS by simulating sampling from 85 known, network populations. Across a variety of traits we find that RDS is substantially less accurate than generally acknowledged and that reported RDS confidence intervals are misleadingly narrow. Moreover, because we model a best-case scenario in which the theoretical RDS sampling assumptions hold exactly, it is unlikely that RDS performs any better in practice than in our simulations. Notably, the poor performance of RDS is driven not by the bias but by the high variance of estimates, a possibility that had been largely overlooked in the RDS literature. Given the consistency of our results across networks and our generous sampling conditions, we conclude that RDS as currently practiced may not be suitable for key aspects of public health surveillance where it is now extensively applied. PMID:20351258
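    The simulation idea above can be sketched in miniature: run a random walk over a known network and repeat it many times to see the spread of prevalence estimates. This is a deliberate simplification (real RDS uses multiple recruits per seed and degree-weighted estimators); the network and trait below are invented.

```python
import random

# Toy illustration of sampling from a known network: estimate a binary
# trait's prevalence with a random walk (a simplification of RDS without
# degree weighting) and inspect the spread of repeated estimates.
random.seed(1)

# Hypothetical network: a ring of 100 nodes plus a few random shortcuts.
n = 100
neighbors = {i: {(i - 1) % n, (i + 1) % n} for i in range(n)}
for _ in range(30):
    a, b = random.randrange(n), random.randrange(n)
    if a != b:
        neighbors[a].add(b)
        neighbors[b].add(a)

trait = {i: i < 30 for i in range(n)}  # true prevalence: 0.30

def walk_estimate(steps=200):
    node = random.randrange(n)
    hits = 0
    for _ in range(steps):
        node = random.choice(sorted(neighbors[node]))
        hits += trait[node]
    return hits / steps

estimates = [walk_estimate() for _ in range(50)]
print(min(estimates), max(estimates))  # spread reflects estimator variance
```

    The spread of `estimates` around the true value is the quantity at issue in the paper: even when each walk is unbiased on average, the walk-to-walk variance can be large, which is what makes reported confidence intervals misleadingly narrow.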

  4. An automated integrated platform for rapid and sensitive multiplexed protein profiling using human saliva samples

    PubMed Central

    Nie, Shuai; Henley, W. Hampton; Miller, Scott E.; Zhang, Huaibin; Mayer, Kathryn M.; Dennis, Patty J.; Oblath, Emily A.; Alarie, Jean Pierre; Wu, Yue; Oppenheim, Frank G.; Little, Frédéric F.; Uluer, Ahmet Z.; Wang, Peidong; Ramsey, J. Michael

    2014-01-01

    During the last decade, saliva has emerged as a potentially ideal diagnostic biofluid for noninvasive testing. In this paper, we present an automated, integrated platform useable by minimally trained personnel in the field for the diagnosis of respiratory diseases using human saliva as a sample specimen. In this platform, a saliva sample is loaded onto a disposable microfluidic chip containing all the necessary reagents and components required for saliva analysis. The chip is then inserted into the automated analyzer, the SDReader, where multiple potential protein biomarkers for respiratory diseases are measured simultaneously using a microsphere-based array via fluorescence sandwich immunoassays. The results are read optically, and the images are analyzed by a custom-designed algorithm. The fully automated assay requires as little as 10 μL of saliva sample, and the results are reported in 70 min. The performance of the platform was characterized by testing protein standard solutions, and the results were comparable to those from the 3.5-h lab bench assay that we have previously reported. The device was also deployed in two clinical environments where 273 human saliva samples collected from different subjects were successfully tested, demonstrating the device’s potential to assist clinicians with the diagnosis of respiratory diseases by providing timely protein biomarker profiling information. This platform, which combines non-invasive sample collection and fully automated analysis, can also be utilized in point-of-care diagnostics. PMID:24448498

  5. Automated LSA Assessment of Summaries in Distance Education: Some Variables to Be Considered

    ERIC Educational Resources Information Center

    Jorge-Botana, Guillermo; Luzón, José M.; Gómez-Veiga, Isabel; Martín-Cordero, Jesús I.

    2015-01-01

    A latent semantic analysis-based automated summary assessment is described; this automated system is applied to a real learning-from-text task in a distance education context. We comment on the use of automated content, plagiarism, and text coherence measures, as well as average word weights, and their impact on predicting human judges' summary scores. A…

  6. Automated system for global atmospheric sampling using B-747 airliners. Final report

    SciTech Connect

    Lew, K.Q.; Gustafsson, U.R.C.; Johnson, R.E.

    1981-10-01

    The global air sampling program utilizes commercial aircraft in scheduled service to measure atmospheric constituents. A fully automated system designed for the 747 aircraft is described. Airline operational constraints and data and control subsystems are treated. The overall program management, system monitoring, and data retrieval from four aircraft in global service is described.

  7. Automated Video Quality Assessment for Deep-Sea Video

    NASA Astrophysics Data System (ADS)

    Pirenne, B.; Hoeberechts, M.; Kalmbach, A.; Sadhu, T.; Branzan Albu, A.; Glotin, H.; Jeffries, M. A.; Bui, A. O. V.

    2015-12-01

    Video provides a rich source of data for geophysical analysis, often supplying detailed information about the environment when other instruments may not. This is especially true of deep-sea environments, where direct visual observations cannot be made. As computer vision techniques improve and volumes of video data increase, automated video analysis is emerging as a practical alternative to labor-intensive manual analysis. Automated techniques can be much more sensitive to video quality than their manual counterparts, so performing quality assessment before doing full analysis is critical to producing valid results. Ocean Networks Canada (ONC), an initiative of the University of Victoria, operates cabled ocean observatories that supply continuous power and Internet connectivity to a broad suite of subsea instruments from the coast to the deep sea, including video and still cameras. This network of ocean observatories has produced almost 20,000 hours of video (about 38 hours are recorded each day) and an additional 8,000 hours of logs from remotely operated vehicle (ROV) dives. We begin by surveying some ways in which deep-sea video poses challenges for automated analysis, including: 1. Non-uniform lighting: single, directional light sources produce uneven luminance distributions and shadows; remotely operated lighting equipment is also susceptible to technical failures. 2. Particulate noise: turbidity and marine snow are often present in underwater video; particles in the water column can have sharper focus and higher contrast than the objects of interest due to their proximity to the light source, and can also influence the camera's autofocus and auto white-balance routines. 3. Color distortion (low contrast): the rate of absorption of light in water varies by wavelength and is higher overall than in air, altering apparent colors and lowering the contrast of objects at a distance. We also describe measures under development at ONC for detecting and mitigating
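    Simple frame-level cues like those discussed above can be computed directly from pixel values. The sketch below, with invented frames and no claim to match ONC's actual measures, computes RMS contrast and a crude luminance non-uniformity proxy (the spread of quadrant means, relevant to uneven artificial lighting).

```python
from statistics import mean, pstdev

# Hedged sketch of two frame-level quality cues: RMS contrast (low values
# suggest underwater contrast loss) and luminance non-uniformity (spread
# of quadrant means, a crude proxy for uneven lighting). Frames invented.

def frame_metrics(frame):
    """frame: 2D list of grayscale values in [0, 255]."""
    pixels = [p for row in frame for p in row]
    rms_contrast = pstdev(pixels)
    h, w = len(frame), len(frame[0])
    quads = [
        [frame[r][c] for r in rows for c in cols]
        for rows in (range(0, h // 2), range(h // 2, h))
        for cols in (range(0, w // 2), range(w // 2, w))
    ]
    nonuniformity = pstdev([mean(q) for q in quads])
    return rms_contrast, nonuniformity

flat = [[128] * 8 for _ in range(8)]  # uniform frame: zero on both metrics
spot = [[200 if r < 4 and c < 4 else 40 for c in range(8)] for r in range(8)]
print(frame_metrics(flat))
print(frame_metrics(spot))
```

    A quality gate would threshold such metrics per frame before the expensive analysis stages; in production one would use a proper image library rather than nested lists.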

  8. Automated bone age assessment of older children using the radius

    NASA Astrophysics Data System (ADS)

    Tsao, Sinchai; Gertych, Arkadiusz; Zhang, Aifeng; Liu, Brent J.; Huang, Han K.

    2008-03-01

    The Digital Hand Atlas in Assessment of Skeletal Development is a large-scale Computer Aided Diagnosis (CAD) project for automating the grading of skeletal development in children from 0 to 18 years of age. It includes a complete collection of 1,400 normal hand X-rays of children in this age range. Bone age assessment is used as an index of skeletal development for the detection of growth pathologies related to endocrine disorders, malnutrition, and other diseases. Previous work at the Image Processing and Informatics Lab (IPILab) allowed the bone age CAD algorithm to accurately assess the bone age of children from 1 to 16 (male) or 14 (female) years of age using the phalanges as well as the carpal bones. At older ages (16 or 14 to 19 years of age for males and females, respectively), the phalanges and carpal bones are fully developed and no longer provide well-defined features for accurate bone age assessment. Integration of the radius as a region of interest (ROI) is therefore greatly needed and will significantly improve the ability to accurately assess the bone age of older children. Preliminary studies show that an integrated bone age CAD that utilizes the phalanges, carpal bones, and radius forms a robust method for automatic bone age assessment throughout the entire age range (1-19 years of age).

  9. A Highly Flexible, Automated System Providing Reliable Sample Preparation in Element- and Structure-Specific Measurements.

    PubMed

    Vorberg, Ellen; Fleischer, Heidi; Junginger, Steffen; Liu, Hui; Stoll, Norbert; Thurow, Kerstin

    2016-10-01

    Life science areas require specific sample pretreatment to increase the concentration of the analytes and/or to convert the analytes into a form appropriate for the detection and separation systems. Various workstations allowing automated biological sample pretreatment are commercially available. Nevertheless, due to the temperature, pressure, and volume conditions required in typical element- and structure-specific measurements, these platforms are not suitable for such analytical processes. Thus, the purpose of the presented investigation was the design, realization, and evaluation of an automated system ensuring high-precision sample preparation for a variety of analytical measurements. The developed system had to enable easy adaptation and high flexibility, and it had to handle the wide range of required vessels simultaneously, allowing for less costly and less time-consuming process steps. The system's functionality was confirmed in various validation sequences: in element-specific measurements, the automated system was up to 25% more precise than the manual procedure, and in structure-specific measurements it matched the precision of the manual procedure.

  10. Fully Automated Sample Preparation for Ultrafast N-Glycosylation Analysis of Antibody Therapeutics.

    PubMed

    Szigeti, Marton; Lew, Clarence; Roby, Keith; Guttman, Andras

    2016-04-01

    There is a growing demand in the biopharmaceutical industry for high-throughput, large-scale N-glycosylation profiling of therapeutic antibodies in all phases of product development, but especially during clone selection when hundreds of samples should be analyzed in a short period of time to assure their glycosylation-based biological activity. Our group has recently developed a magnetic bead-based protocol for N-glycosylation analysis of glycoproteins to alleviate the hard-to-automate centrifugation and vacuum-centrifugation steps of the currently used protocols. Glycan release, fluorophore labeling, and cleanup were all optimized, resulting in a <4 h magnetic bead-based process with excellent yield and good repeatability. This article demonstrates the next level of this work by automating all steps of the optimized magnetic bead-based protocol from endoglycosidase digestion, through fluorophore labeling and cleanup with high-throughput sample processing in 96-well plate format, using an automated laboratory workstation. Capillary electrophoresis analysis of the fluorophore-labeled glycans was also optimized for rapid (<3 min) separation to accommodate the high-throughput processing of the automated sample preparation workflow. Ultrafast N-glycosylation analyses of several commercially relevant antibody therapeutics are also shown and compared to their biosimilar counterparts, addressing the biological significance of the differences.

  11. Automated quality assessment in three-dimensional breast ultrasound images.

    PubMed

    Schwaab, Julia; Diez, Yago; Oliver, Arnau; Martí, Robert; van Zelst, Jan; Gubern-Mérida, Albert; Mourri, Ahmed Bensouda; Gregori, Johannes; Günther, Matthias

    2016-04-01

    Automated three-dimensional breast ultrasound (ABUS) is a valuable adjunct to x-ray mammography for breast cancer screening of women with dense breasts. High image quality is essential for proper diagnostics and computer-aided detection. We propose an automated image quality assessment system for ABUS images that detects artifacts at the time of acquisition. Therefore, we study three aspects that can corrupt ABUS images: the nipple position relative to the rest of the breast, the shadow caused by the nipple, and the shape of the breast contour on the image. Image processing and machine learning algorithms are combined to detect these artifacts based on 368 clinical ABUS images that have been rated manually by two experienced clinicians. At a specificity of 0.99, 55% of the images that were rated as low quality are detected by the proposed algorithms. The areas under the ROC curves of the single classifiers are 0.99 for the nipple position, 0.84 for the nipple shadow, and 0.89 for the breast contour shape. The proposed algorithms work fast and reliably, which makes them adequate for online evaluation of image quality during acquisition. The presented concept may be extended to further image modalities and quality aspects. PMID:27158633

  12. Automated sample-changing robot for solution scattering experiments at the EMBL Hamburg SAXS station X33.

    PubMed

    Round, A R; Franke, D; Moritz, S; Huchler, R; Fritsche, M; Malthan, D; Klaering, R; Svergun, D I; Roessle, M

    2008-10-01

    There is a rapidly increasing interest in the use of synchrotron small-angle X-ray scattering (SAXS) for large-scale studies of biological macromolecules in solution, and this requires an adequate means of automating the experiment. A prototype has been developed of an automated sample changer for solution SAXS, where the solutions are kept in thermostatically controlled well plates allowing for operation with up to 192 samples. The measuring protocol involves controlled loading of protein solutions and matching buffers, followed by cleaning and drying of the cell between measurements. The system was installed and tested at the X33 beamline of the EMBL, at the storage ring DORIS-III (DESY, Hamburg), where it was used by over 50 external groups during 2007. At X33, a throughput of approximately 12 samples per hour, with a failure rate of sample loading of less than 0.5%, was observed. The feedback from users indicates that the ease of use and reliability of the user operation at the beamline were greatly improved compared with the manual filling mode. The changer is controlled by a client-server-based network protocol, locally and remotely. During the testing phase, the changer was operated in an attended mode to assess its reliability and convenience. Full integration with the beamline control software, allowing for automated data collection of all samples loaded into the machine with remote control from the user, is presently being implemented. The approach reported is not limited to synchrotron-based SAXS but can also be used on laboratory and neutron sources. PMID:25484841

  13. An Automated Summarization Assessment Algorithm for Identifying Summarizing Strategies

    PubMed Central

    Abdi, Asad; Idris, Norisma; Alguliyev, Rasim M.; Aliguliyev, Ramiz M.

    2016-01-01

    Background: Summarization is a process of selecting important information from a source text. Summarizing strategies are the core cognitive processes in summarization activity. Since summarization can be an important tool to improve comprehension, it has attracted the interest of teachers for teaching summary writing through direct instruction. To do this, they need to review and assess students' summaries, tasks that are very time-consuming. Thus, computer-assisted assessment can be used to help teachers conduct this task more effectively. Design/Results: This paper proposes an algorithm based on the combination of semantic relations between words and their syntactic composition to identify the summarizing strategies employed by students in summary writing. An innovative aspect of our algorithm lies in its ability to identify summarizing strategies at both the syntactic and semantic levels. The efficiency of the algorithm is measured in terms of Precision, Recall, and F-measure. We then implemented the algorithm in an automated summarization assessment system that can be used to identify the summarizing strategies used by students in summary writing. PMID:26735139
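
    The evaluation metrics named in the abstract are the standard information-retrieval ones. A minimal sketch, treating identified strategies as a set of (sentence, strategy) pairs; the strategy labels below are hypothetical examples, not the authors' taxonomy:

```python
# Precision, Recall and F-measure over identified summarizing strategies.
def evaluate(identified, gold):
    """identified, gold: sets of (sentence_id, strategy) pairs."""
    tp = len(identified & gold)                     # true positives
    precision = tp / len(identified) if identified else 0.0
    recall = tp / len(gold) if gold else 0.0
    f = (2 * precision * recall / (precision + recall)) if tp else 0.0
    return precision, recall, f

gold = {(1, "deletion"), (2, "paraphrase"), (3, "generalization")}
identified = {(1, "deletion"), (2, "paraphrase"), (4, "copy-verbatim")}
p, r, f = evaluate(identified, gold)   # p = r = f = 2/3
```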

  14. An automated system for assessing cognitive function in any environment

    NASA Astrophysics Data System (ADS)

    Wesnes, Keith A.

    2005-05-01

    The Cognitive Drug Research (CDR) computerized assessment system has been in use in worldwide clinical trials for over 20 years. It is a computer-based system which assesses core aspects of human cognitive function, including attention, information processing, working memory, and long-term memory. It has been extensively validated and can be performed by a wide range of clinical populations, including patients with various types of dementia. It is currently in worldwide use in clinical trials to evaluate new medicines, as well as in a variety of programs studying the effects of age, stressors, illnesses, and trauma upon human cognitive function. Besides being highly sensitive to drugs which impair or improve function, its utility has been maintained over the last two decades by constantly increasing the number of platforms upon which it can operate. Besides notebook versions, the system can be used on a wrist-worn device, on a PDA, via the telephone, and over the Internet. It is the most widely used automated cognitive function assessment system in worldwide clinical research. It has dozens of parallel forms and requires little training to use or administer. The basic development of the system will be identified, and the large databases (normative, patient population, drug effects) built up from hundreds of clinical trials will be described. The system is available for use in virtually any environment or type of trial.

  15. Automated semiquantitative direct-current-arc spectrographic analysis of eight argonne premium coal ash samples

    USGS Publications Warehouse

    Skeen, C.J.; Libby, B.J.; Crandell, W.B.

    1990-01-01

    The automated semiquantitative direct-current-arc spectrographic method was used to analyze 62 elements in eight Argonne Premium Coal Ash samples. All eight coal ash samples were analyzed in triplicate to verify the precision and accuracy of the method. The precision for most elements was within ±10%. The accuracy of this method is limited to +50% or -33% because of the nature of the standard curves for each of the elements. Adjustments to the computer program were implemented to account for unique matrix interferences in these particular coal ash samples.
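
    The triplicate precision check reported above can be expressed as a relative standard deviation (RSD) test. A minimal sketch with illustrative concentration values (not data from the study):

```python
# Check whether a triplicate measurement meets a +/-10% precision criterion
# via relative standard deviation (RSD); values are illustrative.
import statistics

def within_precision(triplicate, limit_pct=10.0):
    mean = statistics.mean(triplicate)
    rsd = 100.0 * statistics.stdev(triplicate) / mean   # sample std dev (n-1)
    return rsd <= limit_pct

assert within_precision([102.0, 98.0, 100.0])       # RSD = 2%  -> passes
assert not within_precision([60.0, 100.0, 140.0])   # RSD = 40% -> fails
```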

  16. Functional Profiling of Live Melanoma Samples Using a Novel Automated Platform

    PubMed Central

    Schayowitz, Adam; Bertenshaw, Greg; Jeffries, Emiko; Schatz, Timothy; Cotton, James; Villanueva, Jessie; Herlyn, Meenhard; Krepler, Clemens; Vultur, Adina; Xu, Wei; Yu, Gordon H.; Schuchter, Lynn; Clark, Douglas P.

    2012-01-01

    Aims: This proof-of-concept study was designed to determine whether functional, pharmacodynamic profiles relevant to targeted therapy could be derived from live human melanoma samples using a novel automated platform. Methods: A series of 13 melanoma cell lines was briefly exposed to a BRAF inhibitor (PLX-4720) on a platform employing automated fluidics for sample processing. Levels of the phosphoprotein p-ERK in the mitogen-activated protein kinase (MAPK) pathway from treated and untreated sample aliquots were determined using a bead-based immunoassay. Comparison of these levels provided a determination of the pharmacodynamic effect of the drug on the MAPK pathway. A similar ex vivo analysis was performed on fine needle aspiration (FNA) biopsy samples from four murine xenograft models of metastatic melanoma, as well as on 12 FNA samples from patients with metastatic melanoma. Results: Melanoma cell lines with known sensitivity to BRAF inhibitors displayed marked suppression of the MAPK pathway in this system, while most BRAF inhibitor-resistant cell lines showed intact MAPK pathway activity despite exposure to a BRAF inhibitor (PLX-4720). FNA samples from melanoma xenografts showed ex vivo MAPK activity comparable to that of their respective cell lines in this system. FNA samples from patients with metastatic melanoma successfully yielded three categories of functional profiles: MAPK pathway suppression, MAPK pathway reactivation, and MAPK pathway stimulation. These profiles correlated with the anticipated MAPK activity based on the known BRAF mutation status, as well as with observed clinical responses to BRAF inhibitor therapy. Conclusion: Pharmacodynamic information regarding the ex vivo effect of BRAF inhibitors on the MAPK pathway in live human melanoma samples can be reproducibly determined using a novel automated platform. Such information may be useful in preclinical and clinical drug development, as well as in predicting response to targeted therapy in individual patients.
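
    The core pharmacodynamic readout, comparing p-ERK levels in treated and untreated aliquots, reduces to a relative-change calculation. A hedged sketch with invented signal values and an arbitrary 20% threshold for binning into the three reported profile categories (the study's actual thresholds are not given):

```python
# Bin a treated/untreated p-ERK comparison into one of the three profile
# categories named in the abstract; the 20% threshold is illustrative.
def mapk_profile(perk_untreated, perk_treated, threshold=0.2):
    change = (perk_treated - perk_untreated) / perk_untreated
    if change <= -threshold:
        return "MAPK pathway suppression"
    if change >= threshold:
        return "MAPK pathway stimulation"
    return "MAPK pathway reactivation"   # little net change despite inhibitor

profile = mapk_profile(perk_untreated=1000.0, perk_treated=150.0)
# 85% reduction -> "MAPK pathway suppression"
```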

  17. Development of automated preparation system for isotopocule analysis of N2O in various air samples

    NASA Astrophysics Data System (ADS)

    Toyoda, Sakae; Yoshida, Naohiro

    2016-05-01

    Nitrous oxide (N2O), an increasingly abundant greenhouse gas in the atmosphere, is the most important stratospheric ozone-depleting gas of this century. Natural abundance ratios of isotopocules of N2O, NNO molecules substituted with stable isotopes of nitrogen and oxygen, are a promising index of various sources or production pathways of N2O and of its sink or decomposition pathways. Several automated methods have been reported to improve the analytical precision for the isotopocule ratio of atmospheric N2O and to reduce the labor necessary for complicated sample preparation procedures related to mass spectrometric analysis. However, no method accommodates flask samples with limited volume or pressure. Here we present an automated preconcentration system which offers flexibility with respect to the available gas volume, pressure, and N2O concentration. The shortest processing time for a single analysis of typical atmospheric sample is 40 min. Precision values of isotopocule ratio analysis are < 0.1 ‰ for δ15Nbulk (average abundances of 14N15N16O and 15N14N16O relative to 14N14N16O), < 0.2 ‰ for δ18O (relative abundance of 14N14N18O), and < 0.5 ‰ for site preference (SP; difference between relative abundance of 14N15N16O and 15N14N16O). This precision is comparable to that of other automated systems, but better than that of our previously reported manual measurement system.
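
    The reported quantities follow the standard isotope-delta conventions: delta = (R_sample / R_standard - 1) * 1000 (in per mil), δ15Nbulk is the mean of the central-position (alpha) and terminal-position (beta) nitrogen deltas, and site preference (SP) is their difference. A small worked sketch; the sample ratios below are invented, while the AIR reference ratio is the accepted value:

```python
# Standard isotope-delta arithmetic for N2O isotopocules (sample ratios
# invented for illustration).
def delta_permil(r_sample, r_standard):
    return (r_sample / r_standard - 1.0) * 1000.0

R_STD = 0.0036765                            # 15N/14N of atmospheric N2 (AIR)
d_alpha = delta_permil(0.0036950, R_STD)     # central N (14N15N16O)
d_beta = delta_permil(0.0036700, R_STD)      # terminal N (15N14N16O)
sp = d_alpha - d_beta                        # site preference
d_bulk = (d_alpha + d_beta) / 2              # delta15N-bulk
```

With these invented ratios the site preference comes out near 7 per mil, i.e. the <0.5 per mil precision quoted above resolves it comfortably.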

  18. An instrument for automated purification of nucleic acids from contaminated forensic samples.

    PubMed

    Broemeling, David J; Pel, Joel; Gunn, Dylan C; Mai, Laura; Thompson, Jason D; Poon, Hiron; Marziali, Andre

    2008-02-01

    Forensic crime scene sample analysis, by its nature, often deals with samples in which there are low amounts of nucleic acids, on substrates that often lead to inhibition of subsequent enzymatic reactions such as PCR amplification for STR profiling. Common substrates include denim from blue jeans, which yields indigo dye as a PCR inhibitor, and soil, which yields humic substances as inhibitors. These inhibitors frequently co-extract with nucleic acids in standard column or bead-based preps, leading to frequent failure of STR profiling. We present a novel instrument for DNA purification of forensic samples that is capable of highly effective concentration of nucleic acids from soil particulates, fabric, and other complex samples including solid components. The novel concentration process, known as SCODA, is inherently selective for long charged polymers such as DNA, and therefore is able to effectively reject known contaminants. We present an automated sample preparation instrument based on this process, and preliminary results based on mock forensic samples.

  19. Using automated continual performance assessment to improve health care.

    PubMed

    Wulff, K R; Westphal, J R; Shray, S L; Hunkeler, E F

    1997-01-01

    Inefficiency in the work of health care providers is evident and contributes to health care costs. In the early 20th century, industrial engineers developed scientific methods for studying work to improve performance (efficiency) by measuring results--i.e., quality, cost, and productivity. In the mid-20th century, business managers developed ways to apply these methods to improve the work process. These scientific methods and management approaches can be applied to improving medical work. Fee-for-service practice has had incentives to maximize productivity, and prepaid practice has had incentives to minimize costs, but no sector of the health care system has systematically pursued the optimization of all performance variables: quality, cost, and productivity. We have reviewed evolving methods for the automation of continual assessment of performance in health care using touch screen and computer telephone, logging and scheduling software, appropriate combinations of generic or disease-specific health status questionnaires, physiologic measurements or laboratory assays from computerized records, and cost and productivity data from computerized registration logs. We propose that the results of outcome assessment be rapidly and continually transmitted to providers, patients, and managers so that health care processes can be progressively improved. The evolving systems we have described are the practical tools that can help us achieve our performance goals.

  20. Automated mango fruit assessment using fuzzy logic approach

    NASA Astrophysics Data System (ADS)

    Hasan, Suzanawati Abu; Kin, Teoh Yeong; Sauddin@Sa'duddin, Suraiya; Aziz, Azlan Abdul; Othman, Mahmod; Mansor, Ab Razak; Parnabas, Vincent

    2014-06-01

    In terms of value and volume of production, mango is the third most important fruit product after pineapple and banana. Accurate size assessment of mango fruits during harvesting is vital to ensure that they are classified into the appropriate grade. However, the current practice in the mango industry is to grade the fruit manually using human graders. This method is inconsistent, inefficient, and labor intensive. In this project, a new method of automated mango size and grade assessment is developed using an RGB fiber optic sensor and a fuzzy logic approach. Maximum, minimum, and mean values are calculated from the RGB fiber optic sensor readings, and a decision-making model based on a minimum entropy formulation analyses the data and classifies the fruit. The proposed method is capable of differentiating three grades of mango fruit automatically, with an overall accuracy of 77.78% compared with human graders' sorting. This method was found to be helpful for application in the current agricultural industry.
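
    The fuzzy-logic grading step can be sketched with triangular membership functions over fruit size. The grade boundaries and membership shapes below are invented for illustration, not the authors' calibrated values, and the RGB readings and minimum-entropy formulation are omitted:

```python
# Minimal fuzzy classification sketch: grade a mango by size using invented
# triangular membership functions (not the paper's calibrated model).
def triangular(x, a, b, c):
    """Triangular membership function rising from a, peaking at b, falling to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def grade_mango(size_mm):
    memberships = {
        "Grade C": triangular(size_mm, 60, 80, 100),
        "Grade B": triangular(size_mm, 80, 100, 120),
        "Grade A": triangular(size_mm, 100, 120, 140),
    }
    # Defuzzify by picking the grade with the highest membership.
    return max(memberships, key=memberships.get)

g = grade_mango(112)   # closest to the Grade A peak
```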

  1. Automated Prediction of Catalytic Mechanism and Rate Law Using Graph-Based Reaction Path Sampling.

    PubMed

    Habershon, Scott

    2016-04-12

    In a recent article [ J. Chem. Phys. 2015 , 143 , 094106 ], we introduced a novel graph-based sampling scheme which can be used to generate chemical reaction paths in many-atom systems in an efficient and highly automated manner. The main goal of this work is to demonstrate how this approach, when combined with direct kinetic modeling, can be used to determine the mechanism and phenomenological rate law of a complex catalytic cycle, namely cobalt-catalyzed hydroformylation of ethene. Our graph-based sampling scheme generates 31 unique chemical products and 32 unique chemical reaction pathways; these sampled structures and reaction paths enable automated construction of a kinetic network model of the catalytic system when combined with density functional theory (DFT) calculations of free energies and resultant transition-state theory rate constants. Direct simulations of this kinetic network across a range of initial reactant concentrations enables determination of both the reaction mechanism and the associated rate law in an automated fashion, without the need for either presupposing a mechanism or making steady-state approximations in kinetic analysis. Most importantly, we find that the reaction mechanism which emerges from these simulations is exactly that originally proposed by Heck and Breslow; furthermore, the simulated rate law is also consistent with previous experimental and computational studies, exhibiting a complex dependence on carbon monoxide pressure. While the inherent errors of using DFT simulations to model chemical reactivity limit the quantitative accuracy of our calculated rates, this work confirms that our automated simulation strategy enables direct analysis of catalytic mechanisms from first principles. PMID:26938837
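
    The "direct kinetic modeling" step described above combines transition-state-theory rate constants with numerical integration of a reaction network. A heavily simplified sketch using an invented two-step cycle and made-up free-energy barriers (the real hydroformylation network has 32 reaction paths):

```python
# Toy kinetic network A -> I -> P with invented barriers (kJ/mol); catalyst
# concentration is folded into the effective first-order rate constants.
import math

KB, H, R, T = 1.380649e-23, 6.62607015e-34, 8.314462618, 298.15  # SI units

def tst_rate(dg_kj_per_mol):
    """Eyring/TST rate constant: k = (kB*T/h) * exp(-dG_act / (R*T))."""
    return (KB * T / H) * math.exp(-dg_kj_per_mol * 1000.0 / (R * T))

k1 = tst_rate(80.0)   # A -> I, invented barrier
k2 = tst_rate(70.0)   # I -> P, invented barrier

# Forward-Euler integration of the rate equations to t = 100 s.
A, I, P = 1.0, 0.0, 0.0
dt = 0.01
for _ in range(10_000):
    dA = -k1 * A * dt
    dI = (k1 * A - k2 * I) * dt
    dP = k2 * I * dt
    A, I, P = A + dA, I + dI, P + dP
# Mass is conserved (A + I + P stays 1) and nearly all A converts to P.
```

Scanning such simulations over initial concentrations is what lets the mechanism and phenomenological rate law emerge without steady-state assumptions.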

  3. Ground Truth Sampling and LANDSAT Accuracy Assessment

    NASA Technical Reports Server (NTRS)

    Robinson, J. W.; Gunther, F. J.; Campbell, W. J.

    1982-01-01

    It is noted that the key factor in any accuracy assessment of remote sensing data is the method used for determining the ground truth, independent of the remote sensing data itself. The sampling and accuracy-assessment procedures developed for a nuclear power plant siting study are described. The purpose of the sampling procedure was to provide data for developing supervised classifications for two study sites and for assessing their accuracy. The purpose of the accuracy assessment was to allow comparison of the cost and accuracy of the various classification procedures as applied to various data types.
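
    Accuracy assessment against ground truth is conventionally computed from a confusion matrix. A generic sketch with illustrative land-cover labels (not the study's actual classes):

```python
# Overall classification accuracy from paired ground-truth and predicted
# labels, via a confusion-matrix tally; class names are illustrative.
from collections import Counter

def overall_accuracy(truth, predicted):
    confusion = Counter(zip(truth, predicted))        # (true, pred) -> count
    correct = sum(n for (t, p), n in confusion.items() if t == p)
    return correct / len(truth)

truth     = ["water", "forest", "urban", "forest", "water", "urban"]
predicted = ["water", "forest", "forest", "forest", "water", "urban"]
acc = overall_accuracy(truth, predicted)   # 5 of 6 pixels correct
```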

  4. RoboDiff: combining a sample changer and goniometer for highly automated macromolecular crystallography experiments

    PubMed Central

    Nurizzo, Didier; Bowler, Matthew W.; Caserotto, Hugo; Dobias, Fabien; Giraud, Thierry; Surr, John; Guichard, Nicolas; Papp, Gergely; Guijarro, Matias; Mueller-Dieckmann, Christoph; Flot, David; McSweeney, Sean; Cipriani, Florent; Theveneau, Pascal; Leonard, Gordon A.

    2016-01-01

    Automation of the mounting of cryocooled samples is now a feature of the majority of beamlines dedicated to macromolecular crystallography (MX). Robotic sample changers have been developed over many years, with the latest designs increasing capacity, reliability and speed. Here, the development of a new sample changer deployed at the ESRF beamline MASSIF-1 (ID30A-1), based on an industrial six-axis robot, is described. The device, named RoboDiff, includes a high-capacity dewar, acts as both a sample changer and a high-accuracy goniometer, and has been designed for completely unattended sample mounting and diffraction data collection. This aim has been achieved using a high level of diagnostics at all steps of the process from mounting and characterization to data collection. The RoboDiff has been in service on the fully automated endstation MASSIF-1 at the ESRF since September 2014 and, at the time of writing, has processed more than 20 000 samples completely automatically. PMID:27487827

  5. Electrochemical pesticide detection with AutoDip--a portable platform for automation of crude sample analyses.

    PubMed

    Drechsel, Lisa; Schulz, Martin; von Stetten, Felix; Moldovan, Carmen; Zengerle, Roland; Paust, Nils

    2015-02-01

    Lab-on-a-chip devices hold promise for automating complex workflows from sample to answer with minimal consumption of reagents in portable devices. However, complex, inhomogeneous samples such as those encountered in environmental or food analysis may block microchannels and thus often cause malfunction of the system. Here we present the novel AutoDip platform, which is based on moving a solid phase through the reagents and sample instead of transporting a sequence of reagents through a fixed solid phase. A ball-pen mechanism operated by an external actuator automates unit operations such as incubation and washing by consecutively dipping the solid phase into the corresponding liquids. The platform is applied to electrochemical detection of organophosphorus pesticides in real food samples using an acetylcholinesterase (AChE) biosensor. Minimal sample preparation and an integrated reagent pre-storage module hold promise for easy handling of the assay. Detection of the pesticide chlorpyrifos-oxon (CPO) spiked into apple samples at a concentration of 10(-7) M has been demonstrated. This concentration is below the maximum residue level for chlorpyrifos in apples defined by the European Commission.

  6. RoboDiff: combining a sample changer and goniometer for highly automated macromolecular crystallography experiments.

    PubMed

    Nurizzo, Didier; Bowler, Matthew W; Caserotto, Hugo; Dobias, Fabien; Giraud, Thierry; Surr, John; Guichard, Nicolas; Papp, Gergely; Guijarro, Matias; Mueller-Dieckmann, Christoph; Flot, David; McSweeney, Sean; Cipriani, Florent; Theveneau, Pascal; Leonard, Gordon A

    2016-08-01

    Automation of the mounting of cryocooled samples is now a feature of the majority of beamlines dedicated to macromolecular crystallography (MX). Robotic sample changers have been developed over many years, with the latest designs increasing capacity, reliability and speed. Here, the development of a new sample changer deployed at the ESRF beamline MASSIF-1 (ID30A-1), based on an industrial six-axis robot, is described. The device, named RoboDiff, includes a high-capacity dewar, acts as both a sample changer and a high-accuracy goniometer, and has been designed for completely unattended sample mounting and diffraction data collection. This aim has been achieved using a high level of diagnostics at all steps of the process from mounting and characterization to data collection. The RoboDiff has been in service on the fully automated endstation MASSIF-1 at the ESRF since September 2014 and, at the time of writing, has processed more than 20 000 samples completely automatically. PMID:27487827

  8. Automation of Workplace Lifting Hazard Assessment for Musculoskeletal Injury Prevention

    PubMed Central

    2014-01-01

    posture and temporal elements of tasks such as task frequency in an automated fashion, although these findings should be confirmed in a larger study. Further work is needed to incorporate force assessments and address workplace feasibility challenges. We anticipate that this approach could ultimately be used to perform large-scale musculoskeletal exposure assessment not only for research but also to provide real-time feedback to workers and employers during work method improvement activities and employee training. PMID:24987523

  9. Automated combustion accelerator mass spectrometry for the analysis of biomedical samples in the low attomole range.

    PubMed

    van Duijn, Esther; Sandman, Hugo; Grossouw, Dimitri; Mocking, Johannes A J; Coulier, Leon; Vaes, Wouter H J

    2014-08-01

    The increasing role of accelerator mass spectrometry (AMS) in biomedical research necessitates modernization of the traditional sample handling process. AMS was originally developed and used for carbon dating, therefore focusing on very high precision but with a comparatively low sample throughput. Here, we describe the combination of automated sample combustion with an elemental analyzer (EA) online coupled to an AMS via a dedicated interface. This setup allows direct radiocarbon measurements for over 70 samples daily by AMS. No sample processing is required apart from the pipetting of the sample into a tin foil cup, which is placed in the carousel of the EA. In our system, up to 200 AMS analyses are performed automatically without the need for manual interventions. We present results on the direct total (14)C count measurements in <2 μL human plasma samples. The method shows linearity over a range of 0.65-821 mBq/mL, with a lower limit of quantification of 0.65 mBq/mL (corresponding to 0.67 amol for acetaminophen). At these extremely low levels of activity, it becomes important to quantify plasma-specific carbon percentages. This carbon percentage is automatically generated upon combustion of a sample on the EA. Apparent advantages of the present approach include complete omission of sample preparation (reduced hands-on time) and fully automated sample analysis. These improvements clearly stimulate the standard incorporation of microtracer research in the drug development process. In combination with the particularly low sample volumes required and extreme sensitivity, AMS strongly improves its position as a bioanalysis method. PMID:25033319
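
    As a back-of-the-envelope plausibility check (assuming the standard (14)C half-life and a 2 μL sample; the labeling stoichiometry is not stated in the abstract), the quoted 0.65 mBq/mL lower limit can be converted to an amount of (14)C label via N = A / λ:

```python
# Convert the quoted lower limit of quantification to an amount of 14C label.
# Assumptions: 14C half-life 5730 y, sample volume 2 uL, one label per molecule.
import math

T_HALF_14C = 5730 * 365.25 * 24 * 3600   # half-life of 14C in seconds
lam = math.log(2) / T_HALF_14C           # decay constant lambda, s^-1
activity = 0.65e-3 * 2e-3                # Bq in a 2 uL sample at 0.65 mBq/mL
n_atoms = activity / lam                 # N = A / lambda: number of 14C atoms
amol = n_atoms / 6.02214076e23 * 1e18    # amount of 14C label in attomole
# Roughly 0.6 amol, the same order as the 0.67 amol quoted for acetaminophen.
```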

  11. Automated sample preparation and LC-MS for high-throughput ADME quantification.

    PubMed

    O'Connor, Desmond

    2002-01-01

    Bioanalytical groups in the pharmaceutical industry provide quantitative data to support all stages of drug discovery. The increased use of 96-well plates and robotic liquid handling systems, the availability of robust triple quadrupole mass spectrometers, and developments in chromatographic and sample preparation techniques have all increased the rate at which these data can be generated. This review describes currently used methods and emerging technologies for the automation of high-throughput quantitative bioanalysis. The focus is on recent applications of sample preparation and chromatography techniques compatible with detection by triple quadrupole mass spectrometers.

  12. Assessment of organic matter resistance to biodegradation in volcanic ash soils assisted by automated interpretation of infrared spectra from humic acid and whole soil samples by using partial least squares

    NASA Astrophysics Data System (ADS)

    Hernández, Zulimar; Pérez Trujillo, Juan Pedro; Hernández-Hernández, Sergio Alexander; Almendros, Gonzalo; Sanz, Jesús

    2014-05-01

    From a practical viewpoint, the most interesting possibilities of applying infrared (IR) spectroscopy to soil studies lie in processing IR spectra of whole soil (WS) samples [1] in order to forecast functional descriptors at high organizational levels of the soil system, such as soil C resilience. Currently, there is a discussion on whether the resistance to biodegradation of soil organic matter (SOM) depends on its molecular composition or on environmental interactions between SOM and mineral components, as could be the case with physical encapsulation of particulate SOM or organo-mineral derivatives, e.g., those formed with amorphous oxides [2]. A set of about 200 dependent variables from WS and isolated, ash-free humic acids (HA) [3] was obtained in 30 volcanic ash soils from Tenerife Island (Spain). Soil biogeochemical properties such as SOM, allophane (Alo + 1/2 Feo), total mineralization coefficient (TMC) or aggregate stability were determined in WS. In addition, structural information on SOM was obtained from the isolated HA fractions by visible spectroscopy and analytical pyrolysis (Py-GC/MS). Aiming to explore the potential of partial least squares regression (PLS) in forecasting soil dependent variables exclusively from the information extracted from WS and HA IR spectral profiles, data were processed using the ParLeS [4] and Unscrambler programs. Data pre-treatments had to be carefully chosen: the most significant PLS models from IR spectra of HA were obtained after second-derivative pre-treatment, which prevented effects of the intrinsically broadband spectral profiles typical of macromolecular heterogeneous material such as HA. Conversely, when using IR spectra of WS, the best forecasting models were obtained using linear baseline correction and maximum normalization pre-treatments. With WS spectra, the most successful prediction models were obtained for SOM, magnetite, allophane, aggregate stability, clay and total aromatic compounds, whereas the PLS
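
    The workflow above can be sketched in miniature: second-derivative pre-treatment of spectra followed by a PLS1 regression. The data below are synthetic (a drifting linear baseline plus one analyte band scaled by the target variable), and the PLS implementation is a minimal NIPALS version, not the ParLeS/Unscrambler machinery used in the study.

```python
import numpy as np

rng = np.random.default_rng(0)
n_samples, n_wavenumbers = 30, 200
concentration = rng.uniform(0, 1, n_samples)            # e.g. SOM content
x_axis = np.linspace(0, 1, n_wavenumbers)
# drifting linear baseline + analyte band whose depth tracks the target
spectra = (np.outer(rng.uniform(0.5, 1.5, n_samples), x_axis)
           + np.outer(concentration, np.exp(-((x_axis - 0.5) / 0.03) ** 2))
           + rng.normal(0, 0.01, (n_samples, n_wavenumbers)))

# second-derivative pre-treatment suppresses the broadband baseline
d2 = np.diff(spectra, n=2, axis=1)

def pls1_fit_predict(X, y, n_comp=3):
    """Minimal NIPALS PLS1: returns fitted predictions on the training set."""
    Xc, yc = X - X.mean(0), y - y.mean()
    X_res, y_res = Xc.copy(), yc.copy()
    W, P, q = [], [], []
    for _ in range(n_comp):
        w = X_res.T @ y_res
        w /= np.linalg.norm(w)
        t = X_res @ w
        p = X_res.T @ t / (t @ t)
        qk = y_res @ t / (t @ t)
        X_res = X_res - np.outer(t, p)   # deflate X
        y_res = y_res - qk * t           # deflate y
        W.append(w); P.append(p); q.append(qk)
    W, P, q = np.array(W).T, np.array(P).T, np.array(q)
    B = W @ np.linalg.solve(P.T @ W, q)  # regression vector
    return Xc @ B + y.mean()

pred = pls1_fit_predict(d2, concentration)
r = np.corrcoef(pred, concentration)[0, 1]
print(f"training-set correlation: {r:.3f}")
```

    Note the linear baseline vanishes exactly under the second difference, which is the point of that pre-treatment for HA-like broadband profiles.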

  13. Automated sample preparation facilitated by PhyNexus MEA purification system for oligosaccharide mapping of glycoproteins.

    PubMed

    Prater, Bradley D; Anumula, Kalyan R; Hutchins, Jeff T

    2007-10-15

    A reproducible high-throughput sample cleanup method for fluorescent oligosaccharide mapping of glycoproteins is described. Oligosaccharides are released from glycoproteins using PNGase F and labeled with 2-aminobenzoic acid (anthranilic acid, AA). A PhyNexus MEA system was adapted for automated isolation of the fluorescently labeled oligosaccharides from the reaction mixture prior to mapping by HPLC. The oligosaccharide purification uses a normal-phase polyamide resin (DPA-6S) in custom-made pipette tips. The resin volume, wash, and elution steps involved were optimized to obtain high recovery of oligosaccharides with the least amount of contaminating free fluorescent dye in the shortest amount of time. The automated protocol for sample cleanup eliminated all manual manipulations with a recycle time of 23 min. We have reduced the amount of excess AA by 150-fold, allowing quantitative oligosaccharide mapping from as little as 500 ng of digested recombinant immunoglobulin G (rIgG). This low sample requirement allows early selection of a cell line with desired characteristics (e.g., oligosaccharide profile and high specific productivity) for the production of glycoprotein drugs. In addition, the use of a Tecan or other robotic platform in conjunction with this method should allow the cleanup of 96 samples in 23 min, a significant decrease in the amount of time currently required to process such a large number of samples.

  14. Automated performance assessment of ultrasound systems using a dynamic phantom

    PubMed Central

    Riedel, F; Valente, AA; Cochran, S; Corner, GA

    2014-01-01

    Quality assurance of medical ultrasound imaging systems is limited by repeatability, difficulty in quantifying results, and the time involved. A particularly interesting approach is demonstrated in the Edinburgh pipe phantom which, with an accompanying mathematical transformation, produces a single figure of merit for image quality from individual measurements of resolution over a range of depths. However, the Edinburgh pipe phantom still requires time-consuming manual scanning, militating against its routine use. This paper presents a means to overcome this limitation with a new device, termed the Dundee dynamic phantom, allowing rapid set-up and automated operation. The Dundee dynamic phantom is based on imaging two filamentary targets, positioned by computer control at different depths in a tank of 9.4% ethanol–water solution. The images are analysed in real time to assess if the targets are resolved, with individual measurements at different depths again used to calculate a single figure of merit, in this case for lateral resolution only. Test results are presented for a total of 18 scanners in clinical use for different applications. As a qualitative indication of viability, the figure of merit produced by the Dundee dynamic phantom is shown to differentiate between scanners operating at different frequencies and between a relatively new, higher quality system and an older, lower quality system. PMID:27433220
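
    The core "are the two filaments resolved?" decision can be illustrated with a toy model: treat each filament's lateral image profile as a Gaussian and apply a dip criterion between the two peaks. The 0.81 dip threshold and Gaussian beam model below are assumptions roughly analogous to the Rayleigh criterion, not the phantom's actual image analysis.

```python
import numpy as np

def resolved(separation_mm, beam_width_mm, dip_threshold=0.81):
    """Two Gaussian peaks count as resolved if the midpoint dips below
    dip_threshold of the profile maximum."""
    x = np.linspace(-5, 5, 2001)
    profile = (np.exp(-((x - separation_mm / 2) / beam_width_mm) ** 2)
               + np.exp(-((x + separation_mm / 2) / beam_width_mm) ** 2))
    midpoint = profile[len(x) // 2]       # x = 0 lies between the peaks
    return bool(midpoint / profile.max() < dip_threshold)

# Sweep separations at one (hypothetical) depth to bracket the lateral
# resolution there; repeating over depths would feed the figure of merit.
for sep in (0.5, 1.0, 2.0):
    print(sep, resolved(sep, beam_width_mm=1.0))
```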

  15. Automated, Ultra-Sterile Solid Sample Handling and Analysis on a Chip

    NASA Technical Reports Server (NTRS)

    Mora, Maria F.; Stockton, Amanda M.; Willis, Peter A.

    2013-01-01

    There are no existing ultra-sterile lab-on-a-chip systems that can accept solid samples and perform complete chemical analyses without human intervention. The proposed solution is to demonstrate completely automated lab-on-a-chip manipulation of powdered solid samples, followed by on-chip liquid extraction and chemical analysis. This technology utilizes a newly invented glass micro-device for solid manipulation, which mates with existing lab-on-a-chip instrumentation. Devices are fabricated in a Class 10 cleanroom at the JPL MicroDevices Lab, and are plasma-cleaned before and after assembly. Solid samples enter the device through a drilled hole in the top. Existing micro-pumping technology is used to transfer milligrams of powdered sample into an extraction chamber where it is mixed with liquids to extract organic material. Subsequent chemical analysis is performed using portable microchip capillary electrophoresis systems (CE). These instruments have been used for ultra-highly sensitive (parts-per-trillion, pptr) analysis of organic compounds including amines, amino acids, aldehydes, ketones, carboxylic acids, and thiols. Fully autonomous amino acid analyses in liquids were demonstrated; however, to date there have been no reports of completely automated analysis of solid samples on chip. This approach utilizes an existing portable instrument that houses optics, high-voltage power supplies, and solenoids for fully autonomous microfluidic sample processing and CE analysis with laser-induced fluorescence (LIF) detection. Furthermore, the entire system can be sterilized and placed in a cleanroom environment for analyzing samples returned from extraterrestrial targets, if desired. This is an entirely new capability never demonstrated before. 
The ability to manipulate solid samples, coupled with lab-on-a-chip analysis technology, will enable ultraclean and ultrasensitive end-to-end analysis of samples that is orders of magnitude more sensitive than the ppb goal given

  16. Automated biphasic morphological assessment of hepatitis B-related liver fibrosis using second harmonic generation microscopy.

    PubMed

    Wang, Tong-Hong; Chen, Tse-Ching; Teng, Xiao; Liang, Kung-Hao; Yeh, Chau-Ting

    2015-01-01

    Liver fibrosis assessment by biopsy and conventional staining scores is based on histopathological criteria. Variations in sample preparation and the use of semi-quantitative histopathological methods commonly result in discrepancies between medical centers. Thus, minor changes in liver fibrosis might be overlooked in multi-center clinical trials, leading to statistically non-significant data. Here, we developed a computer-assisted, fully automated, staining-free method for hepatitis B-related liver fibrosis assessment. In total, 175 liver biopsies were divided into training (n = 105) and verification (n = 70) cohorts. Collagen was observed using second harmonic generation (SHG) microscopy without prior staining, and hepatocyte morphology was recorded using two-photon excitation fluorescence (TPEF) microscopy. The training cohort was utilized to establish a quantification algorithm. Eleven of 19 computer-recognizable SHG/TPEF microscopic morphological features were significantly correlated with the ISHAK fibrosis stages (P < 0.001). A biphasic scoring method was applied, combining support vector machine and multivariate generalized linear models to assess the early and late stages of fibrosis, respectively, based on these parameters. The verification cohort was used to verify the scoring method, and the area under the receiver operating characteristic curve was >0.82 for liver cirrhosis detection. Since no subjective gradings are needed, interobserver discrepancies could be avoided using this fully automated method. PMID:26260921
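
    The biphasic architecture described above can be sketched structurally: a first-stage binary classifier routes a sample to an early-stage or late-stage scoring model. All weights below are random placeholders, not the paper's fitted SVM and generalized linear models over the 11 SHG/TPEF features; only the routing logic is the point.

```python
import numpy as np

N_FEATURES = 11                           # computer-recognizable morphology features
rng = np.random.default_rng(1)
w_gate = rng.normal(size=N_FEATURES)      # stage-1 linear decision (SVM-like)
w_early = rng.normal(size=N_FEATURES)     # early-fibrosis scoring model
w_late = rng.normal(size=N_FEATURES)      # late-fibrosis scoring model

def biphasic_score(features):
    """Route the sample to an early- or late-stage model, then score it."""
    late = bool(features @ w_gate > 0.0)
    w = w_late if late else w_early
    return ("late" if late else "early"), float(features @ w)

phase, score = biphasic_score(rng.normal(size=N_FEATURES))
print(phase, round(score, 3))
```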

  17. Automated biphasic morphological assessment of hepatitis B-related liver fibrosis using second harmonic generation microscopy

    NASA Astrophysics Data System (ADS)

    Wang, Tong-Hong; Chen, Tse-Ching; Teng, Xiao; Liang, Kung-Hao; Yeh, Chau-Ting

    2015-08-01

    Liver fibrosis assessment by biopsy and conventional staining scores is based on histopathological criteria. Variations in sample preparation and the use of semi-quantitative histopathological methods commonly result in discrepancies between medical centers. Thus, minor changes in liver fibrosis might be overlooked in multi-center clinical trials, leading to statistically non-significant data. Here, we developed a computer-assisted, fully automated, staining-free method for hepatitis B-related liver fibrosis assessment. In total, 175 liver biopsies were divided into training (n = 105) and verification (n = 70) cohorts. Collagen was observed using second harmonic generation (SHG) microscopy without prior staining, and hepatocyte morphology was recorded using two-photon excitation fluorescence (TPEF) microscopy. The training cohort was utilized to establish a quantification algorithm. Eleven of 19 computer-recognizable SHG/TPEF microscopic morphological features were significantly correlated with the ISHAK fibrosis stages (P < 0.001). A biphasic scoring method was applied, combining support vector machine and multivariate generalized linear models to assess the early and late stages of fibrosis, respectively, based on these parameters. The verification cohort was used to verify the scoring method, and the area under the receiver operating characteristic curve was >0.82 for liver cirrhosis detection. Since no subjective gradings are needed, interobserver discrepancies could be avoided using this fully automated method.

  18. Neurodegenerative changes in Alzheimer's disease: a comparative study of manual, semi-automated, and fully automated assessment using MRI

    NASA Astrophysics Data System (ADS)

    Fritzsche, Klaus H.; Giesel, Frederik L.; Heimann, Tobias; Thomann, Philipp A.; Hahn, Horst K.; Pantel, Johannes; Schröder, Johannes; Essig, Marco; Meinzer, Hans-Peter

    2008-03-01

    Objective quantification of disease specific neurodegenerative changes can facilitate diagnosis and therapeutic monitoring in several neuropsychiatric disorders. Reproducibility and easy-to-perform assessment are essential to ensure applicability in clinical environments. Aim of this comparative study is the evaluation of a fully automated approach that assesses atrophic changes in Alzheimer's disease (AD) and Mild Cognitive Impairment (MCI). 21 healthy volunteers (mean age 66.2), 21 patients with MCI (66.6), and 10 patients with AD (65.1) were enrolled. Subjects underwent extensive neuropsychological testing and MRI was conducted on a 1.5 Tesla clinical scanner. Atrophic changes were measured automatically by a series of image processing steps including state of the art brain mapping techniques. Results were compared with two reference approaches: a manual segmentation of the hippocampal formation and a semi-automated estimation of temporal horn volume, which is based upon interactive selection of two to six landmarks in the ventricular system. All approaches separated controls and AD patients significantly (10^-5 < p < 10^-4) and showed a slight but not significant increase of neurodegeneration for subjects with MCI compared to volunteers. The automated approach correlated significantly with the manual (r = -0.65, p < 10^-6) and semi-automated (r = -0.83, p < 10^-13) measurements. It proved highly accurate while maximizing observer independence and reducing assessment time, and is thus well suited for clinical routine.

  19. Development and Validation of an Automated Sepsis Risk Assessment System.

    PubMed

    Back, Ji-Sun; Jin, Yinji; Jin, Taixian; Lee, Sun-Mi

    2016-10-01

    Aggressive resuscitation can decrease sepsis mortality, but its success depends on early detection of sepsis. The purpose of this study was to develop and verify an Automated Sepsis Risk Assessment System (Auto-SepRAS), which would automatically assess the sepsis risk of inpatients by applying data mining techniques to electronic health record (EHR) data and provide daily updates. The predictors retained in the Auto-SepRAS after initial analysis were admission via the emergency department (which had the highest odds ratio), diastolic blood pressure, length of stay, respiratory rate, heart rate, and age. Auto-SepRAS classifies inpatients into three risk levels (high, moderate, and low) based on the predictive values from the sepsis risk-scoring algorithm. The sepsis risk for each patient is presented on the nursing screen of the EHR. The Auto-SepRAS was implemented retrospectively in several stages using EHR data, and its cut-off scores were adjusted. Overall discrimination power was moderate (AUC > .80). The Auto-SepRAS should be verified or updated continuously or intermittently to maintain high predictive performance, but it does not require invasive tests or data input by nurses that would require additional time. Nurses are able to provide patients with nursing care appropriate to their risk levels by using the sepsis risk information provided by the Auto-SepRAS. In particular, with early detection of changes related to sepsis, nurses should be able to help in providing rapid initial resuscitation of high-risk patients. © 2016 Wiley Periodicals, Inc.
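
    The scoring-plus-stratification step can be sketched as a logistic score over the predictors named in the abstract, mapped to three levels by cut-offs. The coefficients, intercept, and cut-offs below are invented placeholders; the real Auto-SepRAS weights were fitted by data mining and its cut-offs tuned retrospectively.

```python
import math

COEFS = {                      # placeholder log-odds weights (not the fitted model)
    "ed_admission": 1.6,       # admission via ED: highest odds ratio per the abstract
    "diastolic_bp": -0.02,
    "length_of_stay": 0.03,
    "respiratory_rate": 0.08,
    "heart_rate": 0.02,
    "age": 0.01,
}
INTERCEPT = -8.0
CUTOFFS = (0.1, 0.4)           # placeholder low/moderate and moderate/high cuts

def sepsis_risk(patient):
    """Logistic predictive value, then three-level stratification."""
    z = INTERCEPT + sum(COEFS[k] * patient[k] for k in COEFS)
    p = 1.0 / (1.0 + math.exp(-z))
    level = "low" if p < CUTOFFS[0] else "moderate" if p < CUTOFFS[1] else "high"
    return p, level

p, level = sepsis_risk({"ed_admission": 1, "diastolic_bp": 55,
                        "length_of_stay": 3, "respiratory_rate": 28,
                        "heart_rate": 118, "age": 71})
print(f"risk={p:.2f} level={level}")
```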

  20. Device and method for automated separation of a sample of whole blood into aliquots

    DOEpatents

    Burtis, Carl A.; Johnson, Wayne F.

    1989-01-01

    A device and a method for automated processing and separation of an unmeasured sample of whole blood into multiple aliquots of plasma. Capillaries are radially oriented on a rotor, with the rotor defining a sample chamber, transfer channels, overflow chamber, overflow channel, vent channel, cell chambers, and processing chambers. A sample of whole blood is placed in the sample chamber, and when the rotor is rotated, the blood moves outward through the transfer channels to the processing chambers where the blood is centrifugally separated into a solid cellular component and a liquid plasma component. When the rotor speed is decreased, the plasma component backfills the capillaries resulting in uniform aliquots of plasma which may be used for subsequent analytical procedures.
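
    As a side calculation relevant to any such rotor, the relative centrifugal force at a capillary tip follows RCF = ω²r/g. The speed and radius below are example values, not figures from the patent.

```python
import math

def rcf(rpm, radius_cm):
    """Relative centrifugal force (in multiples of g) at radius r for a given rpm."""
    omega = 2 * math.pi * rpm / 60.0          # angular speed, rad/s
    return omega ** 2 * (radius_cm / 100.0) / 9.80665

print(round(rcf(3000, 4.0)))   # ~403 g at 3000 rpm and 4 cm radius
```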

  1. Automated Genotyping of Biobank Samples by Multiplex Amplification of Insertion/Deletion Polymorphisms

    PubMed Central

    Mathot, Lucy; Falk-Sörqvist, Elin; Moens, Lotte; Allen, Marie; Sjöblom, Tobias; Nilsson, Mats

    2012-01-01

    The genomic revolution in oncology will entail mutational analyses of vast numbers of patient-matched tumor and normal tissue samples. This has meant an increased risk of patient sample mix-ups due to manual handling. Therefore, scalable genotyping and sample identification procedures are essential to pathology biobanks. We have developed an efficient alternative to traditional genotyping methods suited for automated analysis. By targeting 53 prevalent deletions and insertions found in human populations with fluorescent multiplex ligation dependent genome amplification, followed by separation in a capillary sequencer, a peak spectrum is obtained that can be automatically analyzed. Twenty-four tumor-normal patient samples were successfully matched using this method. The potential use of the developed assay for forensic applications is discussed. PMID:23300761
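
    The sample-matching step can be sketched as follows: each sample yields a presence/absence profile across the indel markers, and tumor profiles are paired to normal profiles by maximum concordance. The marker count, profiles, and 0.8 concordance threshold below are made up for illustration; the assay itself reads 53 indels as a capillary peak spectrum.

```python
def concordance(a, b):
    """Fraction of markers with identical presence/absence calls."""
    return sum(x == y for x, y in zip(a, b)) / len(a)

def match_pairs(tumors, normals, min_concordance=0.8):
    """Greedy matching of each tumor profile to its best-scoring normal."""
    pairs = {}
    for t_id, t_prof in tumors.items():
        best = max(normals, key=lambda n_id: concordance(t_prof, normals[n_id]))
        if concordance(t_prof, normals[best]) >= min_concordance:
            pairs[t_id] = best
    return pairs

tumors = {"T1": [1, 0, 1, 1, 0, 1], "T2": [0, 0, 1, 0, 1, 1]}
normals = {"N1": [1, 0, 1, 1, 0, 1], "N2": [0, 1, 1, 0, 1, 1]}
print(match_pairs(tumors, normals))   # T1 -> N1 (6/6); T2 -> N2 (5/6)
```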

  2. AUTOMATED GEOSPATIAL WATERSHED ASSESSMENT (AGWA): A GIS-BASED HYDROLOGIC MODELING TOOL FOR WATERSHED ASSESSMENT AND ANALYSIS

    EPA Science Inventory

    The Automated Geospatial Watershed Assessment tool (AGWA) is a GIS interface jointly developed by the USDA Agricultural Research Service, the U.S. Environmental Protection Agency, the University of Arizona, and the University of Wyoming to automate the parame...

  3. AUTOMATED GEOSPATIAL WATERSHED ASSESSMENT (AGWA): A GIS-BASED HYDROLOGIC MODELING TOOL FOR WATERSHED ASSESSMENT AND ANALYSIS

    EPA Science Inventory

    The Automated Geospatial Watershed Assessment tool (AGWA) is a GIS interface jointly developed by the USDA Agricultural Research Service, the U.S. Environmental Protection Agency, the University of Arizona, and the University of Wyoming to automate the parameterization and execu...

  4. AUTOMATED GEOSPATIAL WATERSHED ASSESSMENT (AGWA): A GIS-BASED HYDROLOGIC MODELING TOOL FOR WATERSHED ASSESSMENT AND ANALYSIS

    EPA Science Inventory

    The Automated Geospatial Watershed Assessment tool (AGWA) is a GIS interface jointly developed by the USDA Agricultural Research Service, the U.S. Environmental Protection Agency, the University of Arizona, and the University of Wyoming to automate the parameterization and execut...

  5. A Bayesian Framework for the Automated Online Assessment of Sensor Data Quality

    PubMed Central

    Smith, Daniel; Timms, Greg; De Souza, Paulo; D'Este, Claire

    2012-01-01

    Online automated quality assessment is critical to determine a sensor's fitness for purpose in real-time applications. A Dynamic Bayesian Network (DBN) framework is proposed to produce probabilistic quality assessments and represent the uncertainty of sequentially correlated sensor readings. This is a novel framework to represent the causes, quality state and observed effects of individual sensor errors without imposing any constraints upon the physical deployment or measured phenomenon. It represents the causal relationship between quality tests and combines them to generate uncertainty estimates for samples. The DBN was implemented for a particular marine deployment of temperature and conductivity sensors in Hobart, Australia. The DBN was shown to offer a substantial average improvement (34%) in replicating the error bars that were generated by experts when compared to a fuzzy logic approach. PMID:23012554
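
    The idea of combining quality-test outcomes into a probabilistic assessment can be shown with a static Bayes-rule toy (the paper's framework is a *dynamic* Bayesian network that also carries state across sequentially correlated readings). The prior, test names, and likelihoods below are invented, and the tests are assumed conditionally independent given the fault state.

```python
PRIOR_FAULT = 0.05             # assumed prior probability of an erroneous reading
# (P(test fires | fault), P(test fires | no fault)) for each quality test
TESTS = {
    "range_check": (0.90, 0.02),
    "spike_check": (0.70, 0.05),
    "rate_of_change": (0.60, 0.10),
}

def fault_posterior(fired):
    """Posterior P(fault | test outcomes); fired maps test name -> bool."""
    p_f, p_ok = PRIOR_FAULT, 1 - PRIOR_FAULT
    for name, (hit, false_alarm) in TESTS.items():
        if fired[name]:
            p_f *= hit
            p_ok *= false_alarm
        else:
            p_f *= 1 - hit
            p_ok *= 1 - false_alarm
    return p_f / (p_f + p_ok)

post = fault_posterior({"range_check": True, "spike_check": True,
                        "rate_of_change": False})
print(f"P(fault | tests) = {post:.3f}")
```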

  6. Automated headspace solid-phase dynamic extraction for the determination of cannabinoids in hair samples.

    PubMed

    Musshoff, Frank; Lachenmeier, Dirk W; Kroener, Lars; Madea, Burkhard

    2003-04-23

    This article describes a fully automated procedure for detecting cannabinoids in human hair samples. The procedure uses alkaline hydrolysis and headspace solid-phase dynamic extraction (HS-SPDE), followed by on-coating derivatization and gas chromatography-mass spectrometry (GC-MS). SPDE is a further development of solid-phase microextraction (SPME), based on an inside-needle capillary absorption trap. It uses a hollow needle with an internal coating of polydimethylsiloxane as extraction and pre-concentration medium. Ten mg of hair were washed with deionised water, petroleum ether and dichloromethane. After adding deuterated internal standards, the sample was hydrolyzed with sodium hydroxide and directly submitted to HS-SPDE. After analyte absorption, the SPDE needle was placed directly into the headspace of a second vial containing N-methyl-N-trimethylsilyl-trifluoroacetamide for on-coating derivatization before GC-MS analysis. The limit of detection was 0.14 ng/mg for Delta(9)-tetrahydrocannabinol, 0.09 ng/mg for cannabidiol, and 0.12 ng/mg for cannabinol. Absolute recoveries were in the range of 0.6 to 8.4%. Linearity was verified over a range from 0.2 to 20 ng/mg, with coefficients of correlation between 0.998 and 0.999. Intra- and inter-day precision were determined at two different concentrations, with ranges of 2.3-6.0% (intra-day) and 3.3-7.6% (inter-day). Compared with conventional methods of hair analysis, this automated HS-SPDE-GC-MS procedure is substantially faster. It is easy to perform without using solvents and with minimal sample quantities, and it yields the same sensitivity and reproducibility. Compared to SPME, we found a higher extraction rate, coupled with faster automated operation and greater stability of the device.
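
    The linearity check reported above amounts to fitting a least-squares calibration line over the validated concentration range and inspecting the correlation coefficient. The concentrations below mirror the 0.2-20 ng/mg range; the peak-area ratios are synthetic, not the study's data.

```python
import numpy as np

conc = np.array([0.2, 0.5, 1.0, 2.0, 5.0, 10.0, 20.0])        # ng/mg, spiked levels
# synthetic detector response: linear with small residuals
ratio = 0.31 * conc + 0.02 + np.array([0.01, -0.02, 0.03, -0.01, 0.05, -0.04, 0.02])

slope, intercept = np.polyfit(conc, ratio, 1)                 # calibration line
r = np.corrcoef(conc, ratio)[0, 1]                            # coefficient of correlation
print(f"slope={slope:.3f} intercept={intercept:.3f} r={r:.4f}")
```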

  7. Mechanical Alteration And Contamination Issues In Automated Subsurface Sample Acquisition And Handling

    NASA Astrophysics Data System (ADS)

    Glass, B. J.; Cannon, H.; Bonaccorsi, R.; Zacny, K.

    2006-12-01

    The Drilling Automation for Mars Exploration (DAME) project's purpose is to develop and field-test drilling automation and robotics technologies for projected use in missions in the 2011-15 period. DAME includes control of the drilling hardware and state estimation of the hardware, the lithology being drilled, and the state of the hole. A sister drill was constructed for the Mars Analog Río Tinto Experiment (MARTE) project and demonstrated automated core handling and string changeout in 2005 drilling tests at Río Tinto, Spain. DAME focused instead on the problem of controlling the drill while actively drilling, without getting stuck. Together, the DAME and MARTE projects demonstrate a fully automated robotic drilling capability, including hands-off drilling, adjustment to different strata and downhole conditions, recovery from drilling faults (binding, choking, etc.), drill string changeouts, core acquisition and removal, and sample handling and conveyance to in-situ instruments. The 2006 top-level goal of DAME drilling in-situ tests was to verify and demonstrate a capability for hands-off automated drilling at an Arctic Mars-analog site. There were three sets of 2006 test goals, all of which were exceeded during the July 2006 field season. The first was to demonstrate the recognition, while drilling, of at least three of the six known major fault modes for the DAME planetary-prototype drill, and to employ the correct recovery or safing procedure in response. The second set of 2006 goals was to operate for three or more hours autonomously, hands-off. And the third 2006 goal was to exceed 3 m depth into the frozen breccia and permafrost with the DAME drill (it had not gone deeper than 2.2 m previously). Five of six faults were detected and corrected, there were 43 hours of hands-off drilling (including a 4-hour sequence with no human presence nearby), and 3.2 m was the total depth. And ground truth drilling used small commercial drilling equipment in parallel in

  8. Application and flexibility of robotics in automating extraction methods for food samples.

    PubMed

    Higgs, D J; Vanderslice, J T

    1987-05-01

    Laboratory robotic technology has made it possible to automate the manually intensive operations associated with the extraction of vitamins from food. The modular approach to robotics allows the conversion from one extraction procedure to another by a simple addition or replacement of a module plus reprogramming. This is illustrated for the extraction of vitamins C and B1 from food samples. Because many of the organic micronutrients are unstable, storage and extraction conditions must be established to stabilize labile compounds if the full capabilities of robotics are to be realized.

  9. Design and practices for use of automated drilling and sample handling in MARTE while minimizing terrestrial and cross contamination.

    PubMed

    Miller, David P; Bonaccorsi, Rosalba; Davis, Kiel

    2008-10-01

    Mars Astrobiology Research and Technology Experiment (MARTE) investigators used an automated drill and sample processing hardware to detect and categorize life-forms found in subsurface rock at Río Tinto, Spain. For the science to be successful, it was necessary for the biomass from other sources--whether from previously processed samples (cross contamination) or the terrestrial environment (forward contamination)--to be insignificant. The hardware and practices used in MARTE were designed around this problem. Here, we describe some of the design issues that were faced and classify them into problems that are unique to terrestrial tests versus problems that would also exist for a system that was flown to Mars. Assessment of the biomass at various stages in the sample handling process revealed mixed results; the instrument design seemed to minimize cross contamination, but contamination from the surrounding environment sometimes made its way onto the surface of samples. Techniques used during the MARTE Río Tinto project, such as facing the sample, appear to remove this environmental contamination without introducing significant cross contamination from previous samples.

  10. Design and Practices for Use of Automated Drilling and Sample Handling in MARTE While Minimizing Terrestrial and Cross Contamination

    NASA Astrophysics Data System (ADS)

    Miller, David P.; Bonaccorsi, Rosalba; Davis, Kiel

    2008-10-01

    Mars Astrobiology Research and Technology Experiment (MARTE) investigators used an automated drill and sample processing hardware to detect and categorize life-forms found in subsurface rock at Río Tinto, Spain. For the science to be successful, it was necessary for the biomass from other sources -- whether from previously processed samples (cross contamination) or the terrestrial environment (forward contamination) -- to be insignificant. The hardware and practices used in MARTE were designed around this problem. Here, we describe some of the design issues that were faced and classify them into problems that are unique to terrestrial tests versus problems that would also exist for a system that was flown to Mars. Assessment of the biomass at various stages in the sample handling process revealed mixed results; the instrument design seemed to minimize cross contamination, but contamination from the surrounding environment sometimes made its way onto the surface of samples. Techniques used during the MARTE Río Tinto project, such as facing the sample, appear to remove this environmental contamination without introducing significant cross contamination from previous samples.

  11. Automated DNA extraction platforms offer solutions to challenges of assessing microbial biofouling in oil production facilities.

    PubMed

    Oldham, Athenia L; Drilling, Heather S; Stamps, Blake W; Stevenson, Bradley S; Duncan, Kathleen E

    2012-11-20

    The analysis of microbial assemblages in industrial, marine, and medical systems can inform decisions regarding quality control or mitigation. Modern molecular approaches to detect, characterize, and quantify microorganisms provide rapid and thorough measures unbiased by the need for cultivation. The requirement of timely extraction of high-quality nucleic acids for molecular analysis is faced with specific challenges when used to study the influence of microorganisms on oil production. Production facilities are often ill-equipped for nucleic acid extraction techniques, making the preservation and transportation of samples off-site a priority. As a potential solution, the possibility of extracting nucleic acids on-site using automated platforms was tested. The performance of two such platforms, the Fujifilm QuickGene-Mini80™ and Promega Maxwell®16, was compared to a widely used manual extraction kit, MOBIO PowerBiofilm™ DNA Isolation Kit, in terms of ease of operation, DNA quality, and microbial community composition. Three pipeline biofilm samples were chosen for these comparisons; two contained crude oil and corrosion products and the third transported seawater. Overall, the two more automated extraction platforms produced higher DNA yields than the manual approach. DNA quality was evaluated for amplification by quantitative PCR (qPCR) and end-point PCR to generate 454 pyrosequencing libraries for 16S rRNA microbial community analysis. Microbial community structure, as assessed by DGGE analysis and pyrosequencing, was comparable among the three extraction methods. Therefore, the use of automated extraction platforms should enhance the feasibility of rapidly evaluating microbial biofouling at remote locations or those with limited resources. PMID:23168231

  13. [Comparative validation of manual and automated methods for mixing and volume control of total blood samples].

    PubMed

    Folléa, G; Bigey, F; Jacob, D; Cazenave, J P

    1997-07-01

During blood collection, agitation and volume limitation are critical to ensure thorough mixing of the blood with the anticoagulant and attainment of the predetermined volume. These 2 factors are essential to prevent blood activation and to obtain well standardized blood products. The objective of this study was to compare the quality of the blood collected using 2 types of collection method: tripping of a scale at a predetermined volume limit of 450 mL combined with manual agitation, and the 3 blood collection monitors currently available in France. A minimum of 100 collection procedures was performed for each of the 4 methods tested. The manual and automated procedures were found to be equivalent with regard to both the accuracy and reproducibility of the blood volumes obtained and the collection times and flow rates. The characteristics of the red blood cell concentrates, platelet concentrates and plasma units prepared from the first 30 collections of each group were assessed and compared to regulatory requirements. The quality of all these products was comparable to that currently observed at quality control, and no product was rejected at the release control for reasons of poor collection. An assessment of the practicability of the different methods showed that the automated devices are subject to practical difficulties involving transport and battery loading. In addition, the cost of this equipment is approximately 5 times higher than that of the scales. In conclusion, the results of this study show that, in our hands, no significant advantage could be expected from the use of automated blood collection monitors as compared to simple scales with manual mixing. These results further raise the question of the applicability to labile blood products of the comparative validations currently accepted in the pharmaceutical industry, in order to allow the use of correctly validated alternative methods.

  14. Automation of high-frequency sampling of environmental waters for reactive species

    NASA Astrophysics Data System (ADS)

    Kim, H.; Bishop, J. K.; Wood, T.; Fung, I.; Fong, M.

    2011-12-01

Trace metals, particularly iron and manganese, play a critical role in some ecosystems as limiting factors for primary productivity, in geochemistry, especially redox chemistry, as important electron donors and acceptors, and in aquatic environments as carriers of contaminant transport. The dynamics of trace metals are closely related to hydrologic events such as rainfall. Storm flow triggers dramatic changes in both dissolved and particulate trace metal concentrations and affects other important environmental parameters linked to trace metal behavior, such as dissolved organic carbon (DOC). To improve our understanding of the behavior of trace metals and the underlying processes, water chemistry information must be collected over an adequately long period of time at higher frequency than conventional manual sampling (e.g. weekly, biweekly). In this study, we developed an automated sampling system to document the dynamics of trace metals, focusing on Fe and Mn, and DOC for a multiple-year high-frequency geochemistry time series in a small catchment called Rivendell, located at the Angelo Coast Range Reserve, California. We are sampling ground and streamwater with the automated sampling system at daily frequency, and the condition of the site varies substantially from season to season. The pH ranges of ground and streamwater are 5-7 and 7.8-8.3, respectively. DOC is usually sub-ppm, but during rain events it increases by an order of magnitude. The automated sampling system focuses on two aspects: (1) a modified sampler design to improve sample integrity for trace metals and DOC, and (2) a remote control system to update sampling volume and timing according to hydrological conditions. To maintain sample integrity, the developed method employed gravity filtering using large-volume syringes (140 mL) and syringe filters connected to a set of polypropylene bottles and a borosilicate bottle via Teflon tubing. Without filtration, in a few days, the

  15. Robowell: An automated process for monitoring ground water quality using established sampling protocols

    USGS Publications Warehouse

    Granato, G.E.; Smith, K.P.

    1999-01-01

Robowell is an automated process for monitoring selected ground water quality properties and constituents by pumping a well or multilevel sampler. Robowell was developed and tested to provide a cost-effective monitoring system that meets protocols expected for manual sampling. The process uses commercially available electronics, instrumentation, and hardware, so it can be configured to monitor ground water quality using the equipment, purge protocol, and monitoring well design most appropriate for the monitoring site and the contaminants of interest. A Robowell prototype was installed on a sewage treatment plant infiltration bed that overlies a well-studied unconfined sand and gravel aquifer at the Massachusetts Military Reservation, Cape Cod, Massachusetts, during a time when two distinct plumes of constituents were released. The prototype was operated from May 10 to November 13, 1996, and quality-assurance/quality-control measurements demonstrated that the data obtained by the automated method were equivalent to data obtained by manual sampling methods using the same sampling protocols. Water level, specific conductance, pH, water temperature, dissolved oxygen, and dissolved ammonium were monitored by the prototype as the wells were purged according to U.S. Geological Survey (USGS) ground water sampling protocols. Remote access to the data record, via phone modem communications, indicated the arrival of each plume over a few days and the subsequent geochemical reactions over the following weeks. Real-time availability of the monitoring record provided the information needed to initiate manual sampling efforts in response to changes in measured ground water quality, which proved the method and characterized the screened portion of the plume in detail through time. The methods and the case study described are presented to document the process for future use.

  16. Automated Generation and Assessment of Autonomous Systems Test Cases

    NASA Technical Reports Server (NTRS)

    Barltrop, Kevin J.; Friberg, Kenneth H.; Horvath, Gregory A.

    2008-01-01

This slide presentation reviews issues concerning the verification and validation testing of autonomous spacecraft, which routinely culminates in the exploration of anomalous or faulted mission-like scenarios, using work from the Dawn mission's tests as examples. Prioritizing which scenarios to develop usually comes down to focusing on the most vulnerable areas and ensuring the best return on investment of test time. Rules-of-thumb strategies often come into play, such as injecting applicable anomalies prior to, during, and after system state changes; or creating cases that ensure good safety-net algorithm coverage. Although experience and judgment in test selection can lead to high levels of confidence about the majority of a system's autonomy, it is likely that important test cases are overlooked. One method to fill in potential test coverage gaps is to automatically generate and execute test cases using algorithms that ensure desirable properties about the coverage. For example, generate cases for all possible fault monitors, and across all state change boundaries. Of course, the scope of coverage is determined by the test environment capabilities, where a faster-than-real-time, high-fidelity, software-only simulation would allow the broadest coverage. Even real-time systems that can be replicated and run in parallel, and that have reliable set-up and operations features, provide an excellent resource for automated testing. Making detailed predictions of the outcome of such tests can be difficult, and when algorithmic means are employed to produce hundreds or even thousands of cases, generating predictions individually is impractical, while generating predictions with tools requires executable models of the design and environment that themselves require a complete test program. Therefore, evaluating the results of a large number of mission scenario tests poses special challenges. A good approach to address this problem is to automatically score the results
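
The coverage-driven generation described above (every fault monitor crossed with every state-change boundary and injection timing) reduces to a Cartesian product. A minimal sketch follows; all monitor, transition, and timing names are invented placeholders, not items from the Dawn test program:

```python
from itertools import product

# All monitor, transition, and timing names below are invented placeholders.
fault_monitors = ["thruster_stuck", "sensor_dropout", "low_battery"]
state_transitions = [("cruise", "approach"), ("approach", "orbit")]
injection_timings = ["before", "during", "after"]

def generate_test_cases(monitors, transitions, timings):
    """Enumerate every fault monitor across every state-change boundary
    and every injection timing, giving exhaustive combinatorial coverage."""
    return [
        {"fault": m, "transition": t, "timing": w}
        for m, t, w in product(monitors, transitions, timings)
    ]

cases = generate_test_cases(fault_monitors, state_transitions, injection_timings)
print(len(cases))  # 3 x 2 x 3 = 18 scenarios
```

Exhaustive enumeration like this is only practical when the test environment (ideally a faster-than-real-time simulation) can absorb the resulting case count.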

  17. Adapting Assessment Procedures for Delivery via an Automated Format.

    ERIC Educational Resources Information Center

    Kelly, Karen L.; And Others

    The Office of Personnel Management (OPM) decided to explore alternative examining procedures for positions covered by the Administrative Careers with America (ACWA) examination. One requirement for new procedures was that they be automated for use with OPM's recently developed Microcomputer Assisted Rating System (MARS), a highly efficient system…

  18. Automated negotiation in environmental resource management: Review and assessment.

    PubMed

    Eshragh, Faezeh; Pooyandeh, Majeed; Marceau, Danielle J

    2015-10-01

Negotiation is an integral part of our daily life and plays an important role in resolving conflicts and facilitating human interactions. Automated negotiation, which aims at capturing the human negotiation process using artificial intelligence and machine learning techniques, is well-established in e-commerce, but its application in environmental resource management remains limited. This is due to the inherent uncertainties and complexity of environmental issues, along with the diversity of stakeholders' perspectives when dealing with these issues. The objective of this paper is to describe the main components of automated negotiation, review and compare machine learning techniques in automated negotiation, and provide a guideline for the selection of suitable methods in the particular context of stakeholders' negotiation over environmental resource issues. We advocate that automated negotiation can facilitate the involvement of stakeholders in the exploration of a plurality of solutions in order to reach a mutually satisfying agreement and contribute to informed decisions in environmental management; further studies are needed to consolidate the potential of this modeling approach.

  19. Automated Assessment of Speech Fluency for L2 English Learners

    ERIC Educational Resources Information Center

    Yoon, Su-Youn

    2009-01-01

This dissertation provides an automated scoring method of speech fluency for second language learners of English (L2 learners) based on speech recognition technology. Non-standard pronunciation, frequent disfluencies, faulty grammar, and inappropriate lexical choices are crucial characteristics of L2 learners' speech. Due to the ease of…

  1. Automated Scoring in Context: Rapid Assessment for Placed Students

    ERIC Educational Resources Information Center

    Klobucar, Andrew; Elliot, Norbert; Deess, Perry; Rudniy, Oleksandr; Joshi, Kamal

    2013-01-01

    This study investigated the use of automated essay scoring (AES) to identify at-risk students enrolled in a first-year university writing course. An application of AES, the "Criterion"[R] Online Writing Evaluation Service was evaluated through a methodology focusing on construct modelling, response processes, disaggregation, extrapolation,…

  2. Toward Automated Computer-Based Visualization and Assessment of Team-Based Performance

    ERIC Educational Resources Information Center

    Ifenthaler, Dirk

    2014-01-01

    A considerable amount of research has been undertaken to provide insights into the valid assessment of team performance. However, in many settings, manual and therefore labor-intensive assessment instruments for team performance have limitations. Therefore, automated assessment instruments enable more flexible and detailed insights into the…

  3. Fully Automated Laser Ablation Liquid Capture Sample Analysis using NanoElectrospray Ionization Mass Spectrometry

    SciTech Connect

    Lorenz, Matthias; Ovchinnikova, Olga S; Van Berkel, Gary J

    2014-01-01

RATIONALE: Laser ablation provides for the possibility of sampling a large variety of surfaces with high spatial resolution. This type of sampling, when employed in conjunction with liquid capture followed by nanoelectrospray ionization, provides the opportunity for sensitive and prolonged interrogation of samples by mass spectrometry as well as the ability to analyze surfaces not amenable to direct liquid extraction. METHODS: A fully automated, reflection geometry, laser ablation liquid capture spot sampling system was achieved by incorporating appropriate laser fiber optics and a focusing lens into a commercially available, liquid extraction surface analysis (LESA™) ready Advion TriVersa NanoMate system. RESULTS: Under optimized conditions about 10% of laser ablated material could be captured in a droplet positioned vertically over the ablation region using the NanoMate robot controlled pipette. The sampling spot size area with this laser ablation liquid capture surface analysis (LA/LCSA) mode of operation (typically about 120 µm x 160 µm) was approximately 50 times smaller than that achievable by direct liquid extraction using LESA™ (ca. 1 mm diameter liquid extraction spot). The set-up was successfully applied to the analysis of ink on glass and paper as well as the endogenous components in Alstroemeria Yellow King flower petals. In a second mode of operation with a comparable sampling spot size, termed laser ablation/LESA™, the laser system was used to drill through, penetrate, or otherwise expose material beneath a solvent resistant surface. Once drilled, LESA™ was effective in sampling soluble material exposed at that location on the surface. CONCLUSIONS: Incorporating the capability for different laser ablation liquid capture spot sampling modes of operation into a LESA™ ready Advion TriVersa NanoMate enhanced the spot sampling spatial resolution of this device and broadened the surface types amenable to analysis to include absorbent and solvent resistant surfaces.

  4. Harmonization of automated hemolysis index assessment and use: Is it possible?

    PubMed

    Dolci, Alberto; Panteghini, Mauro

    2014-05-15

The major source of errors producing unreliable laboratory test results is the pre-analytical phase, with hemolysis accounting for approximately half of these errors and being the leading cause of unsuitable blood specimens. Hemolysis may produce interference in many laboratory tests by a variety of biological and analytical mechanisms. Consequently, laboratories need to systematically detect and reliably quantify hemolysis in every collected sample by means of objective and consistent technical tools that assess sample integrity. This is currently done by automated estimation of the hemolysis index (HI), available on almost all clinical chemistry platforms, making hemolysis detection reliable and reported patient test results more accurate. Despite these advantages, a degree of variability still affects the HI estimate, and more effort should be placed on harmonization of this index. The harmonization of HI results from different analytical systems should be the immediate goal, but the scope of harmonization should go beyond analytical steps to include other aspects, such as HI decision thresholds, criteria for result interpretation and application in clinical practice, as well as report formats. Relevant issues to overcome remain the objective definition of a maximum allowable bias for hemolysis interference based on the clinical application of the measurements, and the management of unsuitable samples. For the latter in particular, a harmonized approach is needed: numerical results of unsuitable samples with significantly increased HI should not be reported, and the test result should be replaced with a specific comment highlighting hemolysis of the sample.

  5. Automated MALDI matrix coating system for multiple tissue samples for imaging mass spectrometry.

    PubMed

    Mounfield, William P; Garrett, Timothy J

    2012-03-01

    Uniform matrix deposition on tissue samples for matrix-assisted laser desorption/ionization (MALDI) is key for reproducible analyte ion signals. Current methods often result in nonhomogenous matrix deposition, and take time and effort to produce acceptable ion signals. Here we describe a fully-automated method for matrix deposition using an enclosed spray chamber and spray nozzle for matrix solution delivery. A commercial air-atomizing spray nozzle was modified and combined with solenoid controlled valves and a Programmable Logic Controller (PLC) to control and deliver the matrix solution. A spray chamber was employed to contain the nozzle, sample, and atomized matrix solution stream, and to prevent any interference from outside conditions as well as allow complete control of the sample environment. A gravity cup was filled with MALDI matrix solutions, including DHB in chloroform/methanol (50:50) at concentrations up to 60 mg/mL. Various samples (including rat brain tissue sections) were prepared using two deposition methods (spray chamber, inkjet). A linear ion trap equipped with an intermediate-pressure MALDI source was used for analyses. Optical microscopic examination showed a uniform coating of matrix crystals across the sample. Overall, the mass spectral images gathered from tissues coated using the spray chamber system were of better quality and more reproducible than from tissue specimens prepared by the inkjet deposition method.

  6. Automated MALDI Matrix Coating System for Multiple Tissue Samples for Imaging Mass Spectrometry

    NASA Astrophysics Data System (ADS)

    Mounfield, William P.; Garrett, Timothy J.

    2012-03-01

    Uniform matrix deposition on tissue samples for matrix-assisted laser desorption/ionization (MALDI) is key for reproducible analyte ion signals. Current methods often result in nonhomogenous matrix deposition, and take time and effort to produce acceptable ion signals. Here we describe a fully-automated method for matrix deposition using an enclosed spray chamber and spray nozzle for matrix solution delivery. A commercial air-atomizing spray nozzle was modified and combined with solenoid controlled valves and a Programmable Logic Controller (PLC) to control and deliver the matrix solution. A spray chamber was employed to contain the nozzle, sample, and atomized matrix solution stream, and to prevent any interference from outside conditions as well as allow complete control of the sample environment. A gravity cup was filled with MALDI matrix solutions, including DHB in chloroform/methanol (50:50) at concentrations up to 60 mg/mL. Various samples (including rat brain tissue sections) were prepared using two deposition methods (spray chamber, inkjet). A linear ion trap equipped with an intermediate-pressure MALDI source was used for analyses. Optical microscopic examination showed a uniform coating of matrix crystals across the sample. Overall, the mass spectral images gathered from tissues coated using the spray chamber system were of better quality and more reproducible than from tissue specimens prepared by the inkjet deposition method.

  7. Application of automated bioacoustic identification in environmental education and assessment.

    PubMed

    Oba, Teruyo

    2004-06-01

    Developments in electronics and computer science have led to the introduction of an automated bioacoustic identification device used to resolve commonly encountered problems in the identification of animal species in the field. This technology aids our auditory observations, and also improves the quality of biological surveys and environmental monitoring. In this paper the future roles and possibilities of bioacoustics are discussed, providing some examples from the realm of environmental education and monitoring that focus on the use of nature sounds.

  8. Determination of the acaricide fenbutatin oxide in water samples by automated headspace-SPME-GC/MS.

    PubMed

    Devos, Christophe; Moens, Luc; Sandra, Pat

    2005-05-01

The analysis of the acaricide fenbutatin oxide (FBTO), which has a molecular weight of 1052.66 g mol(-1), in water samples by capillary GC/MS after in-situ derivatization with sodium tetraethylborate (NaBEt4) and headspace-SPME enrichment is described. Automated SPME is performed at 80 degrees C for 30 min. Detection is carried out in the ion monitoring mode with deuterated triphenyltin (TPhT-d15) as internal standard. Good linearity (R2 = 0.9993) was obtained in the dynamic range 20 to 1000 ng L(-1), with a limit of detection of 16 ng L(-1) (LOD at 3 S/N) and a limit of quantitation of 50 ng L(-1) (LOQ at 10 S/N). The intra-day RSD% for n=6 was 8.9 at the LOQ level. PMID:15912737
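
The LOD and LOQ figures quoted above follow the usual signal-to-noise convention (LOD at 3 S/N, LOQ at 10 S/N) relative to a calibration slope. A minimal sketch of that calculation; the calibration data and noise level below are invented for illustration and are not taken from the paper:

```python
import numpy as np

# Invented calibration data: concentration (ng/L) vs. detector response.
conc = np.array([20.0, 50.0, 100.0, 250.0, 500.0, 1000.0])
signal = np.array([41.0, 103.0, 201.0, 498.0, 1005.0, 1990.0])

# Least-squares calibration line and linearity check.
slope, intercept = np.polyfit(conc, signal, 1)
r_squared = np.corrcoef(conc, signal)[0, 1] ** 2

noise = 10.0                 # assumed baseline noise (same units as signal)
lod = 3.0 * noise / slope    # limit of detection, S/N = 3
loq = 10.0 * noise / slope   # limit of quantitation, S/N = 10
print(f"R^2 = {r_squared:.4f}, LOD = {lod:.1f} ng/L, LOQ = {loq:.1f} ng/L")
```

With this convention the LOQ is always 10/3 of the LOD, which matches the rough ratio of the 50 and 16 ng L(-1) values reported in the abstract.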

  9. Using an automated recruitment process to generate an unbiased study sample of multiple sclerosis patients.

    PubMed

    Miller, Deborah M; Fox, R; Atreja, A; Moore, S; Lee, J-C; Fu, A Z; Jain, A; Saupe, W; Chakraborty, S; Stadtler, M; Rudick, R A

    2010-01-01

The objective of this study was to test the efficiency of an automated recruitment methodology developed as a component of a practical controlled trial assessing the benefits of a Web-based personal health site, called Mellen Center Care On-line, for guiding self-management of multiple sclerosis symptoms. We describe the study's automated recruitment methodology using clinical and administrative databases and assess the comparability between subjects who completed informed consent (IC) forms and individuals who were invited to participate but did not reply, designated as patient nonresponders (PNR). The IC and PNR groups were compared on demographics, number of physician or advanced practice nurse/physician assistant visits during the 12 months prior to the initial invitation, and level of comorbidity as measured by the Charlson Comorbidity Index (CCI). Out of a total dynamic potential pool of 2,421 patients, 2,041 had been invited to participate, 309 had become ineligible to participate during the study, and 71 individuals remained in the pool at the end of recruitment. The IC group had a slightly greater proportion of females. Both groups were predominantly white with comparable marital status. The groups had comparable mean household income, education level, and commercial insurance. The computed mean CCI was similar between the groups. The only significant difference was that the PNR group had fewer clinic visits in the preceding 12 months. The subjects were highly representative of the target population, indicating that there was little bias in our selection process despite a constantly changing pool of eligible individuals. PMID:20064056

  10. Assessing bat detectability and occupancy with multiple automated echolocation detectors

    USGS Publications Warehouse

    Gorresen, P.M.; Miles, A.C.; Todd, C.M.; Bonaccorso, F.J.; Weller, T.J.

    2008-01-01

Occupancy analysis and its ability to account for differential detection probabilities is important for studies in which detecting echolocation calls is used as a measure of bat occurrence and activity. We examined the feasibility of remotely acquiring bat encounter histories to estimate detection probability and occupancy. We used echolocation detectors coupled to digital recorders operating at a series of proximate sites on consecutive nights in 2 trial surveys for the Hawaiian hoary bat (Lasiurus cinereus semotus). Our results confirmed that the technique is readily amenable for use in occupancy analysis. We also conducted a simulation exercise to assess the effects of sampling effort on parameter estimation. The results indicated that the precision and bias of parameter estimation were often more influenced by the number of sites sampled than number of visits. Acceptable accuracy often was not attained until at least 15 sites or 15 visits were used to estimate detection probability and occupancy. The method has significant potential for use in monitoring trends in bat activity and in comparative studies of habitat use. © 2008 American Society of Mammalogists.
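
The occupancy analysis referred to above separates the probability that a site is occupied (psi) from the probability of detecting the species on a visit to an occupied site (p). A minimal sketch of the standard single-season occupancy likelihood, fitted by maximum likelihood to invented detection histories (not data from the paper):

```python
import numpy as np
from scipy.optimize import minimize

# Invented detection histories: rows = sites, columns = repeat visits,
# 1 = species detected on that visit.
histories = np.array([
    [1, 0, 1], [0, 0, 0], [1, 1, 0], [0, 1, 0],
    [0, 0, 0], [1, 0, 0], [0, 0, 0], [1, 1, 1],
])

def neg_log_likelihood(params, y):
    # Logit-scale parameters keep psi and p inside (0, 1).
    psi, p = np.clip(1.0 / (1.0 + np.exp(-np.asarray(params))), 1e-9, 1 - 1e-9)
    k = y.shape[1]
    d = y.sum(axis=1)  # detections per site
    occupied_and_seen = psi * p**d * (1 - p)**(k - d)
    never_seen = psi * (1 - p)**k + (1 - psi)  # occupied-but-missed or absent
    like = np.where(d > 0, occupied_and_seen, never_seen)
    return -np.log(like).sum()

fit = minimize(neg_log_likelihood, x0=[0.0, 0.0], args=(histories,),
               method="Nelder-Mead")
psi_hat, p_hat = 1.0 / (1.0 + np.exp(-fit.x))
print(f"occupancy psi = {psi_hat:.2f}, detection p = {p_hat:.2f}")
```

Because sites with no detections may still be occupied, the estimated psi is at least as large as the naive proportion of sites with detections, which is exactly the correction the abstract's analysis relies on.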

  11. Development of an Automated Security Risk Assessment Methodology Tool for Critical Infrastructures.

    SciTech Connect

    Jaeger, Calvin Dell; Roehrig, Nathaniel S.; Torres, Teresa M.

    2008-12-01

This document presents the automated security Risk Assessment Methodology (RAM) prototype tool developed by Sandia National Laboratories (SNL). This work leverages SNL's capabilities and skills in security risk analysis and the development of vulnerability assessment/risk assessment methodologies to develop an automated prototype security RAM tool for critical infrastructures (RAM-CI™). The prototype automated RAM tool provides a user-friendly, systematic, and comprehensive risk-based tool to assist CI sector and security professionals in assessing and managing security risk from malevolent threats. The current tool is structured on the basic RAM framework developed by SNL. It is envisioned that this prototype tool will be adapted to meet the requirements of different CI sectors and thereby provide additional capabilities.

  12. Automated measurement and quantification of heterotrophic bacteria in water samples based on the MPN method.

    PubMed

    Fuchsluger, C; Preims, M; Fritz, I

    2011-01-01

    Quantification of heterotrophic bacteria is a widely used measure for water analysis. Especially in terms of drinking water analysis, testing for microorganisms is strictly regulated by the European Drinking Water Directive, including quality criteria and detection limits. The quantification procedure presented in this study is based on the most probable number (MPN) method, which was adapted to comply with the need for a quick and easy screening tool for different kinds of water samples as well as varying microbial loads. Replacing tubes with 24-well titer plates for cultivation of bacteria drastically reduces the amount of culture media and also simplifies incubation. Automated photometric measurement of turbidity instead of visual evaluation of bacterial growth avoids misinterpretation by operators. Definition of a threshold ensures definite and user-independent determination of microbial growth. Calculation of the MPN itself is done using a program provided by the US Food and Drug Administration (FDA). For evaluation of the method, real water samples of different origins as well as pure cultures of bacteria were analyzed in parallel with the conventional plating methods. Thus, the procedure described requires less preparation time, reduces costs and ensures both stable and reliable results for water samples. PMID:20835882
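
The abstract notes that the MPN itself is calculated with a program provided by the FDA. As a sketch of the underlying maximum-likelihood calculation (not that program), the score equation for the organism concentration can be solved by bisection; the function and its arguments below are illustrative:

```python
import math

def mpn(volumes, wells, positives):
    """Maximum-likelihood most probable number (organisms per unit volume).

    volumes[i]   -- inoculum volume per well at dilution level i
    wells[i]     -- number of wells at level i
    positives[i] -- number of wells showing growth at level i
    """
    if all(g == 0 for g in positives):
        return 0.0
    if all(g == n for g, n in zip(positives, wells)):
        raise ValueError("all wells positive: MPN is unbounded")

    def score(lam):
        # Derivative of the log-likelihood with respect to lam.
        s = 0.0
        for v, n, g in zip(volumes, wells, positives):
            if g:
                s += g * v * math.exp(-lam * v) / (1.0 - math.exp(-lam * v))
            s -= (n - g) * v
        return s

    lo, hi = 1e-9, 1e9  # score is decreasing in lam; bisect in log space
    while hi / lo > 1.0 + 1e-12:
        mid = math.sqrt(lo * hi)
        if score(mid) > 0.0:
            lo = mid
        else:
            hi = mid
    return math.sqrt(lo * hi)
```

As a sanity check, a single dilution level has the closed-form solution lam = -ln(1 - g/n) / v; for 12 positives out of 24 wells at 1 mL this gives ln 2 per mL, which the bisection reproduces.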

  13. Uranium monitoring tool for rapid analysis of environmental samples based on automated liquid-liquid microextraction.

    PubMed

    Rodríguez, Rogelio; Avivar, Jessica; Ferrer, Laura; Leal, Luz O; Cerdà, Víctor

    2015-03-01

A fully automated in-syringe (IS) magnetic stirring assisted (MSA) liquid-liquid microextraction (LLME) method for uranium(VI) determination was developed, exploiting a long path-length liquid waveguide capillary cell (LWCC) with spectrophotometric detection. On-line extraction of uranium was performed within a glass syringe containing a magnetic stirrer for homogenization of the sample and the successive reagents: cyanex-272 in dodecane as extractant, EDTA as interference eliminator, hydrochloric acid to back-extract U(VI), and arsenazo-III as chromogenic reagent to accomplish the spectrophotometric detection at 655 nm. Magnetic stirring assistance was performed by a specially designed driving device placed around the syringe body, creating a rotating magnetic field in the syringe and forcing the rotation of the stirring bar located inside the syringe. The detection limit (LOD) of the developed method is 3.2 µg L(-1). Its good interday precision (Relative Standard Deviation, RSD 3.3%) and its high extraction frequency (up to 6 h(-1)) make this method an inexpensive and fast screening tool for monitoring uranium(VI) in environmental samples. It was successfully applied to different environmental matrices: channel sediment certified reference material (BCR-320R), soil and phosphogypsum reference materials, and natural water samples, with recoveries close to 100%. PMID:25618721

  14. Establishing a novel automated magnetic bead-based method for the extraction of DNA from a variety of forensic samples.

    PubMed

    Witt, Sebastian; Neumann, Jan; Zierdt, Holger; Gébel, Gabriella; Röscheisen, Christiane

    2012-09-01

    Automated systems have been increasingly utilized for DNA extraction by many forensic laboratories to handle growing numbers of forensic casework samples while minimizing the risk of human errors and assuring high reproducibility. The step towards automation, however, is not easy: the automated extraction method has to be very versatile to reliably prepare high yields of pure genomic DNA from a broad variety of sample types on different carrier materials. To prevent possible cross-contamination of samples or the loss of DNA, the components of the kit have to be designed in a way that allows for the automated handling of the samples with no manual intervention necessary. DNA extraction using paramagnetic particles coated with a DNA-binding surface is predestined for an automated approach. For this study, we tested different DNA extraction kits using DNA-binding paramagnetic particles with regard to DNA yield and handling by a Freedom EVO(®)150 extraction robot (Tecan) equipped with a Te-MagS magnetic separator. Among others, the extraction kits tested were the ChargeSwitch(®) Forensic DNA Purification Kit (Invitrogen), the PrepFiler™ Automated Forensic DNA Extraction Kit (Applied Biosystems) and NucleoMag™ 96 Trace (Macherey-Nagel). After an extensive test phase, we established a novel magnetic bead extraction method based upon the NucleoMag™ extraction kit (Macherey-Nagel). The new method is readily automatable and produces high yields of DNA from different sample types (blood, saliva, sperm, contact stains) on various substrates (filter paper, swabs, cigarette butts) with no evidence of a loss of magnetic beads or sample cross-contamination.

  15. To the development of an automated system of assessment of radiological images of joints

    NASA Astrophysics Data System (ADS)

    Grechikhin, A. I.; Grunina, E. A.; Karetnikova, I. R.

    2008-03-01

    An algorithm developed for the adaptive automated computer processing of radiological images of hands and feet, in order to assess the degree of bone and cartilage destruction in rheumatoid arthritis, is described. A set of new numerical indicators was proposed for assessing the degree of radiological progression of the arthritis.

  16. ADDING GLOBAL SOILS DATA TO THE AUTOMATED GEOSPATIAL WATERSHED ASSESSMENT TOOL (AGWA)

    EPA Science Inventory

    The Automated Geospatial Watershed Assessment Tool (AGWA) is a GIS-based hydrologic modeling tool that is available as an extension for ArcView 3.x from the USDA-ARS Southwest Watershed Research Center (www.tucson.ars.ag.gov/agwa). AGWA is designed to facilitate the assessment of...

  17. Automated Formative Assessment as a Tool to Scaffold Student Documentary Writing

    ERIC Educational Resources Information Center

    Ferster, Bill; Hammond, Thomas C.; Alexander, R. Curby; Lyman, Hunt

    2012-01-01

    The hurried pace of the modern classroom does not permit formative feedback on writing assignments at the frequency or quality recommended by the research literature. One solution for increasing individual feedback to students is to incorporate some form of computer-generated assessment. This study explores the use of automated assessment of…

  18. Automated nanoscale flow cytometry for assessing protein-protein interactions.

    PubMed

    von Kolontaj, Kerstin; Horvath, Gabor L; Latz, Eicke; Büscher, Martin

    2016-09-01

    Despite their importance for signaling events, protein-protein interactions cannot easily be analyzed on a single-cell level. We developed a robust automated FRET measurement system implemented on a commercial flow cytometer, allowing for rapid profiling of molecular associations in living cells. We used this method to measure the most proximal signaling events upon human T lymphocyte activation, which preceded calcium influx, and could automatically detect T cell receptor/CD3 complex clustering defects in immunocompromised patients. © 2016 International Society for Advancement of Cytometry. PMID:27584593

  19. Renewable Microcolumns for Automated DNA Purification and Flow-through Amplification: From Sediment Samples through Polymerase Chain Reaction

    SciTech Connect

    Bruckner-Lea, Cindy J. ); Tsukuda, Toyoko ); Dockendorff, Brian P. ); Follansbee, James C. ); Kingsley, Mark T. ); Ocampo, Catherine O.; Stults, Jennie R.; Chandler, Darrell P.

    2001-12-01

    There is an increasing need for field-portable systems for the detection and characterization of microorganisms in the environment. Nucleic acid analysis is frequently the method of choice for discriminating between bacteria in complex systems, but standard protocols are difficult to automate and current microfluidic devices are not configured specifically for environmental sample analysis. In this report, we describe the development of an integrated DNA purification and PCR amplification system and demonstrate its use for the automated purification and amplification of Geobacter chapellei DNA (genomic DNA or plasmid targets) from sediments. The system includes renewable separation columns for the automated capture and release of microparticle purification matrices, and can be easily reprogrammed for new separation chemistries and sample types. The DNA extraction efficiency for the automated system ranged from 3 to 25 percent, depending on the length and concentration of the DNA target. The system was more efficient than batch capture methods for the recovery of dilute genomic DNA, even though the reagent volumes were smaller than required for the batch procedure. The automated DNA concentration and purification module was coupled to a flow-through, Peltier-controlled DNA amplification chamber, and used to successfully purify and amplify genomic and plasmid DNA from sediment extracts. Cleaning protocols were also developed to allow reuse of the integrated sample preparation system, including the flow-through PCR tube.

  20. Accuracy assessment and automation of free energy calculations for drug design.

    PubMed

    Christ, Clara D; Fox, Thomas

    2014-01-27

    As the free energy of binding of a ligand to its target is one of the crucial optimization parameters in drug design, its accurate prediction is highly desirable. In the present study we have assessed the average accuracy of free energy calculations for a total of 92 ligands binding to five different targets. To make this study and future larger-scale applications possible, we automated the setup procedure. Starting from user-defined binding modes, the procedure decides which ligands to connect via a perturbation based on maximum common substructure criteria and produces all necessary parameter files for free energy calculations in AMBER 11. For the systems investigated, errors due to insufficient sampling were found to be substantial in some cases, whereas differences in estimators (thermodynamic integration (TI) versus multistate Bennett acceptance ratio (MBAR)) were found to be negligible. Analytical uncertainty estimates calculated from a single free energy calculation were found to be much smaller than the sample standard deviation obtained from two independent free energy calculations. Agreement with experiment was found to be system dependent, ranging from excellent to mediocre (RMSE = [0.9, 8.2, 4.7, 5.7, 8.7] kJ/mol). When restricting analyses to free energy calculations with sample standard deviations below 1 kJ/mol, agreement with experiment improved (RMSE = [0.8, 6.9, 1.8, 3.9, 5.6] kJ/mol).
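
    Of the two estimators compared above, thermodynamic integration reduces to numerical quadrature of per-window ⟨∂U/∂λ⟩ averages over the coupling parameter λ. A minimal sketch with a synthetic integrand (the window values are toy data, not from the study):

```python
def ti_free_energy(lambdas, dudl_means):
    """Thermodynamic integration: trapezoidal quadrature of the
    ensemble averages <dU/dlambda> collected at discrete lambda windows."""
    dg = 0.0
    for i in range(len(lambdas) - 1):
        width = lambdas[i + 1] - lambdas[i]
        dg += 0.5 * (dudl_means[i] + dudl_means[i + 1]) * width
    return dg

# Toy check: if <dU/dlambda> = 2*lambda exactly, the integral over
# [0, 1] is 1, and the trapezoid rule is exact for a linear integrand.
lams = [i / 10 for i in range(11)]
dg = ti_free_energy(lams, [2.0 * l for l in lams])
```

    In production codes the per-window averages come from equilibrated simulations, and the sampling error the abstract discusses enters through the statistical uncertainty of each average.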

  1. Erratum to: Automated Sample Preparation Method for Suspension Arrays using Renewable Surface Separations with Multiplexed Flow Cytometry Fluorescence Detection

    SciTech Connect

    Grate, Jay W.; Bruckner-Lea, Cindy J.; Jarrell, Ann E.; Chandler, Darrell P.

    2003-04-10

    In this paper we describe a new method of automated sample preparation for multiplexed biological analysis systems that use flow cytometry fluorescence detection. In this approach, color-encoded microspheres derivatized to capture particular biomolecules are temporarily trapped in a renewable surface separation column to enable perfusion with sample and reagents prior to delivery to the detector. This method provides for separation of the biomolecules of interest from other sample matrix components as well as from labeling solutions.

  2. Automation and integration of multiplexed on-line sample preparation with capillary electrophoresis for DNA sequencing

    SciTech Connect

    Tan, H.

    1999-03-31

    The purpose of this research is to develop a multiplexed sample processing system in conjunction with multiplexed capillary electrophoresis for high-throughput DNA sequencing. The concept from DNA template to called bases was first demonstrated with a manually operated single capillary system. Later, an automated microfluidic system with 8 channels based on the same principle was successfully constructed. The instrument automatically processes 8 templates through reaction, purification, denaturation, pre-concentration, injection, separation and detection in a parallel fashion. A multiplexed freeze/thaw switching principle and a distribution network were implemented to manage flow direction and sample transportation. Dye-labeled terminator cycle-sequencing reactions are performed in an 8-capillary array in a hot air thermal cycler. Subsequently, the sequencing ladders are directly loaded into a corresponding size-exclusion chromatographic column operated at approximately 60 °C for purification. On-line denaturation and stacking injection for capillary electrophoresis is simultaneously accomplished at a cross assembly set at approximately 70 °C. Not only the separation capillary array but also the reaction capillary array and purification columns can be regenerated after every run. DNA sequencing data from this system allow base calling up to 460 bases with accuracy of 98%.

  3. The Upgrade Programme for the Structural Biology beamlines at the European Synchrotron Radiation Facility - High throughput sample evaluation and automation

    NASA Astrophysics Data System (ADS)

    Theveneau, P.; Baker, R.; Barrett, R.; Beteva, A.; Bowler, M. W.; Carpentier, P.; Caserotto, H.; de Sanctis, D.; Dobias, F.; Flot, D.; Guijarro, M.; Giraud, T.; Lentini, M.; Leonard, G. A.; Mattenet, M.; McCarthy, A. A.; McSweeney, S. M.; Morawe, C.; Nanao, M.; Nurizzo, D.; Ohlsson, S.; Pernot, P.; Popov, A. N.; Round, A.; Royant, A.; Schmid, W.; Snigirev, A.; Surr, J.; Mueller-Dieckmann, C.

    2013-03-01

    Automation and advances in technology are the key elements in addressing the steadily increasing complexity of Macromolecular Crystallography (MX) experiments. Much of this complexity is due to the inter- and intra-crystal heterogeneity in diffraction quality often observed for crystals of multi-component macromolecular assemblies or membrane proteins. Such heterogeneity makes high-throughput sample evaluation an important and necessary tool for increasing the chances of a successful structure determination. The introduction at the ESRF of automatic sample changers in 2005 dramatically increased the number of samples that were tested for diffraction quality. This "first generation" of automation, coupled with advances in software aimed at optimising data collection strategies in MX, resulted in a three-fold increase in the number of crystal structures elucidated per year using data collected at the ESRF. In addition, sample evaluation can be further complemented using small angle scattering experiments on the newly constructed bioSAXS facility on BM29 and the micro-spectroscopy facility (ID29S). The construction of a second generation of automated facilities on the MASSIF (Massively Automated Sample Screening Integrated Facility) beamlines will build on these advances and should provide a paradigm shift in how MX experiments are carried out, which will benefit the entire Structural Biology community.

  4. Plasma cortisol and norepinephrine concentrations in pigs: automated sampling of freely moving pigs housed in the PigTurn versus manually sampled and restrained pigs

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Minimizing effects of restraint and human interaction on the endocrine physiology of animals is essential for collection of accurate physiological measurements. Our objective was to compare stress-induced cortisol (CORT) and norepinephrine (NE) responses in automated versus manual blood sampling. A ...

  5. Plasma cortisol and noradrenalin concentrations in pigs: automated sampling of freely moving pigs housed in PigTurn versus manually sampled and restrained pigs

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Minimizing the effects of restraint and human interaction on the endocrine physiology of animals is essential for collection of accurate physiological measurements. Our objective was to compare stress-induced cortisol (CORT) and noradrenalin (NorA) responses in automated versus manual blood sampling...

  6. Strategies for automated sample preparation, nucleic acid purification, and concentration of low-target-number nucleic acids in environmental and food processing samples

    NASA Astrophysics Data System (ADS)

    Bruckner-Lea, Cynthia J.; Holman, David A.; Schuck, Beatrice L.; Brockman, Fred J.; Chandler, Darrell P.

    1999-01-01

    The purpose of this work is to develop a rapid, automated system for nucleic acid purification and concentration from environmental and food processing samples. Our current approach involves off-line filtration and cell lysis (ballistic disintegration) functions in appropriate buffers followed by automated nucleic acid capture and purification on renewable affinity matrix microcolumns. Physical cell lysis and renewable affinity microcolumns eliminate the need for toxic organic solvents, enzyme digestions or other time-consuming sample manipulations. Within the renewable affinity microcolumn, we have examined nucleic acid capture and purification efficiency with various microbead matrices (glass, polymer, paramagnetic), surface derivatization (sequence-specific capture oligonucleotides or peptide nucleic acids), and DNA target size and concentration under variable solution conditions and temperatures. Results will be presented comparing automated system performance relative to benchtop procedures for both clean (pure DNA from a laboratory culture) and environmental (soil extract) samples, including results which demonstrate 8-minute purification and elution of low-copy nucleic acid targets from a crude soil extract in a form suitable for PCR or microarray-based detectors. Future research will involve the development of improved affinity reagents and complete system integration, including upstream cell concentration and cell lysis functions and downstream, gene-based detectors. Results of this research will ultimately lead to improved processes and instrumentation for on-line, automated monitors for pathogenic micro-organisms in food, water, air, and soil samples.

  7. Fully automated algorithm for wound surface area assessment.

    PubMed

    Deana, Alessandro Melo; de Jesus, Sérgio Henrique Costa; Sampaio, Brunna Pileggi Azevedo; Oliveira, Marcelo Tavares; Silva, Daniela Fátima Teixeira; França, Cristiane Miranda

    2013-01-01

    Worldwide, clinicians, dentists, nurses, researchers, and other health professionals need to monitor wound healing progress and to quantify the rate of wound closure. The aim of this study is to demonstrate, step by step, a fully automated numerical method to estimate the size of a wound and the percentage of body surface area (BSA) damaged, from images, without the requirement for human intervention. We included the formula for BSA in rats in the algorithm. The methodology was validated on experimental wounds and human ulcers and was compared with the analysis of an experienced pathologist, with good agreement. Therefore, this algorithm is suitable for experimental wounds and burns and for human ulcers, as they have a high contrast with adjacent normal skin.
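
    The area-to-BSA step the abstract describes can be sketched as follows. This is a minimal illustration, not the authors' code: it assumes a Meeh-type rat BSA formula, BSA = k·m^(2/3), with k = 9.46 cm²·g^(−2/3) (a commonly quoted constant; the study's exact formula and segmentation method are not reproduced here).

```python
def wound_percent_bsa(mask, pixel_area_cm2, body_mass_g, k=9.46):
    """Percent of body surface area covered by a wound.

    mask: 2-D list of 0/1 pixel labels (1 = wound, from segmentation).
    pixel_area_cm2: area of one pixel from the image calibration.
    k: Meeh constant for rats (assumed value).
    """
    wound_area = sum(sum(row) for row in mask) * pixel_area_cm2
    bsa = k * body_mass_g ** (2.0 / 3.0)  # Meeh-type formula, cm^2
    return 100.0 * wound_area / bsa

# Hypothetical example: a fully wounded 50 x 40 pixel region at
# 0.01 cm^2 per pixel, on a 250 g rat.
mask = [[1] * 50 for _ in range(40)]
pct = wound_percent_bsa(mask, 0.01, 250.0)
```

    In the full pipeline the mask would come from automated segmentation of the photograph rather than being supplied by hand.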

  8. An automated maze task for assessing hippocampus-sensitive memory in mice.

    PubMed

    Pioli, Elsa Y; Gaskill, Brianna N; Gilmour, Gary; Tricklebank, Mark D; Dix, Sophie L; Bannerman, David; Garner, Joseph P

    2014-03-15

    Memory deficits associated with hippocampal dysfunction are a key feature of a number of neurodegenerative and psychiatric disorders. The discrete-trial rewarded alternation T-maze task is highly sensitive to hippocampal dysfunction. Normal mice have spontaneously high levels of alternation, whereas hippocampal-lesioned mice are dramatically impaired. However, this is a hand-run task and handling has been shown to impact crucially on behavioural responses, as well as being labour-intensive and therefore unsuitable for high-throughput studies. To overcome this, a fully automated maze was designed. The maze was attached to the mouse's home cage and the subject earned all of its food by running through the maze. In this study the hippocampal dependence of rewarded alternation in the automated maze was assessed. Bilateral hippocampal-lesioned mice were assessed in the standard, hand-run, discrete-trial rewarded alternation paradigm and in the automated paradigm, according to a cross-over design. A similarly robust lesion effect on alternation performance was found in both mazes, confirming the sensitivity of the automated maze to hippocampal lesions. Moreover, the performance of the animals in the automated maze was not affected by their handling history whereas performance in the hand-run maze was affected by prior testing history. By having more stable performance and by decreasing human contact the automated maze may offer opportunities to reduce extraneous experimental variation and therefore increase the reproducibility within and/or between laboratories. Furthermore, automation potentially allows for greater experimental throughput and hence suitability for use in assessment of cognitive function in drug discovery.

  9. An automated maze task for assessing hippocampus-sensitive memory in mice☆

    PubMed Central

    Pioli, Elsa Y.; Gaskill, Brianna N.; Gilmour, Gary; Tricklebank, Mark D.; Dix, Sophie L.; Bannerman, David; Garner, Joseph P.

    2014-01-01

    Memory deficits associated with hippocampal dysfunction are a key feature of a number of neurodegenerative and psychiatric disorders. The discrete-trial rewarded alternation T-maze task is highly sensitive to hippocampal dysfunction. Normal mice have spontaneously high levels of alternation, whereas hippocampal-lesioned mice are dramatically impaired. However, this is a hand-run task and handling has been shown to impact crucially on behavioural responses, as well as being labour-intensive and therefore unsuitable for high-throughput studies. To overcome this, a fully automated maze was designed. The maze was attached to the mouse's home cage and the subject earned all of its food by running through the maze. In this study the hippocampal dependence of rewarded alternation in the automated maze was assessed. Bilateral hippocampal-lesioned mice were assessed in the standard, hand-run, discrete-trial rewarded alternation paradigm and in the automated paradigm, according to a cross-over design. A similarly robust lesion effect on alternation performance was found in both mazes, confirming the sensitivity of the automated maze to hippocampal lesions. Moreover, the performance of the animals in the automated maze was not affected by their handling history whereas performance in the hand-run maze was affected by prior testing history. By having more stable performance and by decreasing human contact the automated maze may offer opportunities to reduce extraneous experimental variation and therefore increase the reproducibility within and/or between laboratories. Furthermore, automation potentially allows for greater experimental throughput and hence suitability for use in assessment of cognitive function in drug discovery. PMID:24333574

  10. Development of an automated multiple-target mask CD disposition system to enable new sampling strategy

    NASA Astrophysics Data System (ADS)

    Ma, Jian; Farnsworth, Jeff; Bassist, Larry; Cui, Ying; Mammen, Bobby; Padmanaban, Ramaswamy; Nadamuni, Venkatesh; Kamath, Muralidhar; Buckmann, Ken; Neff, Julie; Freiberger, Phil

    2006-03-01

    Traditional mask critical dimension (CD) disposition systems with only one or two targets are being challenged by new requirements from mask users as wafer process control becomes more complicated in newer generations of technology. Historically, the mask shop does not necessarily measure and disposition off the same kind of CD structures that wafer fabs do. Mask disposition specifications and structures come from the frame-design and the tapeout, while wafer-level CD dispositions are mainly based on the historical process window established per CD-skew experiments and EOL (end of line) yield. In the current high-volume manufacturing environment, the mask CDs are mainly dispositioned off their mean-to-target (MTT) and uniformity (6sigma) on one or two types of pre-determined structures. The disposition specification is set to ensure the printed mask will meet the design requirements and to ensure minimum deviation from them. The CD data are also used to adjust the dose of the mask exposure tools to control CD MTT. As a result, the mask CD disposition automation system was built to allow only one or two kinds of targets at most. In contrast, wafer fabs measure a fairly wide range of different structures to ensure their process is on target and in control. The number of such structures that are considered critical is increasing due to the growing complexity of the technology. To fully comprehend the wafer-level requirements, it is highly desirable to align the mask CD sample site and disposition to be the same as that of the wafer fabs, to measure the OPC (optical proximity correction) structures or equivalent whenever possible, and to establish the true correlation between mask CD measurements and wafer CD measurements. In this paper, the development of an automated multiple-target mask CD disposition system, with the goal of enabling a new sampling strategy, is presented. The pros and cons of its implementation are discussed.
The new system has been inserted in

  11. Rapid magnetic bead based sample preparation for automated and high throughput N-glycan analysis of therapeutic antibodies.

    PubMed

    Váradi, Csaba; Lew, Clarence; Guttman, András

    2014-06-17

    Full automation to enable high-throughput N-glycosylation profiling and sequencing with good reproducibility is vital to fulfill the contemporary needs of the biopharmaceutical industry and the requirements of national regulatory agencies. The most prevalently used glycoanalytical methods of capillary electrophoresis and hydrophilic interaction liquid chromatography, while very efficient, both necessitate extensive sample preparation and cleanup, including glycoprotein capture, N-glycan release, fluorescent derivatization, purification, and preconcentration steps during the process. Currently used protocols to fulfill these tasks require multiple centrifugation and vacuum-centrifugation steps, making automated sample preparation by liquid-handling robots difficult and expensive. In this paper we report on a rapid magnetic-bead-based sample preparation approach that enables full automation, including all process phases, in just a couple of hours without requiring any centrifugation and/or vacuum-centrifugation steps. This novel protocol has been compared to conventional glycan sample preparation strategies using standard glycoproteins (IgG, fetuin, and RNase B) and featured rapid processing time, high release and labeling efficiency, good reproducibility, and the potential for easy automation.

  12. Automated quantification of FISH signals in urinary cells enables the assessment of chromosomal aberration patterns characteristic for bladder cancer.

    PubMed

    Köhler, Christina U; Martin, Laura; Bonberg, Nadine; Behrens, Thomas; Deix, Thomas; Braun, Katharina; Noldus, Joachim; Jöckel, Karl-Heinz; Erbel, Raimund; Sommerer, Florian; Tannapfel, Andrea; Harth, Volker; Käfferlein, Heiko U; Brüning, Thomas

    2014-06-13

    Targeting the centromeres of chromosomes 3, 7, 17 (CEP3, 7, 17) and the 9p21-locus (LSI9p21) for diagnosing bladder cancer (BC) is time- and cost-intensive and requires a manual investigation of the sample by a well-trained investigator, overall limiting its use in clinical diagnostics and large-scale epidemiological studies. Here we introduce a new computer-assisted FISH spot analysis tool enabling an automated, objective and quantitative assessment of FISH patterns in the urinary sediment. Utilizing a controllable microscope workstation, the microscope software Scan^R was programmed to allow automatic batch-scanning of up to 32 samples and identifying quadruple FISH signals in DAPI-scanned nuclei of urinary sediments. The assay allowed a time- and cost-efficient, automated and objective assessment of CEP3, 7 and 17 FISH signals and facilitated the quantification of nuclei harboring specific FISH patterns in all cells of the urinary sediment. To explore the diagnostic capability of the developed tool, we analyzed the abundance of 51 different FISH patterns in a pilot set of urine specimens from 14 patients with BC and 21 population controls (PC). Herein, the results of the fully automated approach yielded a high degree of conformity when compared to those obtained by an expert-guided re-evaluation of archived scans. The best cancer-identifying pattern was characterized by a concurrent gain of CEP3, 7 and 17. Overall, our automated analysis refines current FISH protocols and encourages its use to establish reliable diagnostic cutoffs in future large-scale studies with well-characterized specimen collectives. PMID:24802410

  13. IHC Profiler: An Open Source Plugin for the Quantitative Evaluation and Automated Scoring of Immunohistochemistry Images of Human Tissue Samples

    PubMed Central

    Malhotra, Renu; De, Abhijit

    2014-01-01

    In anatomic pathology, immunohistochemistry (IHC) serves as a diagnostic and prognostic method for identification of disease markers in tissue samples that directly influences classification and grading of the disease, thereby influencing patient management. However, in most of the world pathological analysis of tissue samples remains a time-consuming and subjective procedure, wherein the intensity of antibody staining is manually judged and the scoring decision is thus directly influenced by visual bias. This prompted us to design a simple automated digital IHC image analysis algorithm for an unbiased, quantitative assessment of antibody staining intensity in tissue sections. As a first step, we adopted the spectral deconvolution method of DAB/hematoxylin color spectra, using optimized optical density vectors of the color deconvolution plugin for proper separation of the DAB color spectra. The DAB-stained image is then displayed in a new window wherein it undergoes pixel-by-pixel analysis, and displays the full profile along with its scoring decision. Based on the mathematical formula conceptualized, the algorithm was thoroughly tested by analyzing scores assigned to thousands (n = 1703) of DAB-stained IHC images, including sample images taken from the human protein atlas web resource. The IHC Profiler plugin developed is compatible with the open-resource digital image analysis software ImageJ; it creates a pixel-by-pixel analysis profile of a digital IHC image and further assigns a score in a four-tier system. A comparison study between manual pathological analysis and IHC Profiler showed a match of 88.6% (P<0.0001, CI = 95%). This new tool developed for clinical histopathological sample analysis can be adopted globally for scoring most protein targets where the marker protein expression is of cytoplasmic and/or nuclear type. We foresee that this method will minimize the problem of inter-observer variations across labs and further help in
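
    The core scoring idea, separating the DAB stain and binning pixels into a four-tier score, can be sketched as follows. This is a simplified illustration, not the plugin's actual code: the Ruifrok-Johnston DAB optical-density vector is standard, but the projection shortcut (used here in place of full three-stain color deconvolution) and the tier cutoffs are assumptions for the example.

```python
import math

# Ruifrok-Johnston optical-density vector for DAB (normalized).
_DAB = (0.268, 0.570, 0.776)

def dab_od(rgb):
    """Project a pixel's RGB optical density onto the DAB stain vector."""
    od = [-math.log10(max(c, 1) / 255.0) for c in rgb]
    return sum(o * d for o, d in zip(od, _DAB))

def score_pixel(rgb, thresholds=(0.10, 0.35, 0.70)):
    """Four-tier call from DAB optical density (illustrative cutoffs)."""
    od = dab_od(rgb)
    tiers = ("negative", "low positive", "positive", "high positive")
    for t, name in zip(thresholds, tiers):
        if od < t:
            return name
    return tiers[-1]
```

    A whole-image score would then aggregate these per-pixel calls, which is the percentage-based profile the plugin reports.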

  14. A semi-automated measurement technique for the assessment of radiolucency

    PubMed Central

    Pegg, E. C.; Kendrick, B. J. L.; Pandit, H. G.; Gill, H. S.; Murray, D. W.

    2014-01-01

    The assessment of radiolucency around an implant is qualitative, poorly defined and has low agreement between clinicians. Accurate and repeatable assessment of radiolucency is essential to prevent misdiagnosis, minimize cases of unnecessary revision, and to correctly monitor and treat patients at risk of loosening and implant failure. The purpose of this study was to examine whether a semi-automated imaging algorithm could improve repeatability and enable quantitative assessment of radiolucency. Six surgeons assessed 38 radiographs of knees after unicompartmental knee arthroplasty for radiolucency, and results were compared with assessments made by the semi-automated program. Large variation was found between the surgeon results, with total agreement in only 9.4% of zones and a kappa value of 0.602; whereas the automated program had total agreement in 81.6% of zones and a kappa value of 0.802. The software had a ‘fair to excellent’ prediction of the presence or the absence of radiolucency, where the area under the curve of the receiver operating characteristic curves was 0.82 on average. The software predicted radiolucency equally well for cemented and cementless implants (p = 0.996). The identification of radiolucency using an automated method is feasible and these results indicate that it could aid the definition and quantification of radiolucency. PMID:24759544
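
    The agreement statistics quoted above (kappa 0.602 for surgeons versus 0.802 for the program) follow the standard Cohen's kappa definition, (p_o − p_e)/(1 − p_e). A minimal sketch, using a hypothetical 2 × 2 zone-agreement table (the counts are illustrative, not from the study):

```python
def cohens_kappa(table):
    """Cohen's kappa from a square agreement table
    (rows: rater A's calls, columns: rater B's calls)."""
    total = sum(sum(row) for row in table)
    # Observed agreement: fraction of cases on the diagonal.
    p_obs = sum(table[i][i] for i in range(len(table))) / total
    # Chance agreement: product of each category's marginal rates.
    p_exp = sum(
        sum(table[i]) * sum(row[i] for row in table)
        for i in range(len(table))
    ) / total ** 2
    return (p_obs - p_exp) / (1.0 - p_exp)

# Hypothetical: both raters call radiolucency present in 20 zones and
# absent in 15, disagreeing on the remaining 15.
kappa = cohens_kappa([[20, 5], [10, 15]])
```

    Kappa of 0 means chance-level agreement and 1 means perfect agreement, which is why the rise from 0.602 to 0.802 marks a substantial repeatability gain.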

  15. Computerized Analytical Data Management System and Automated Analytical Sample Transfer System at the COGEMA Reprocessing Plants in La Hague

    SciTech Connect

    Flament, T.; Goasmat, F.; Poilane, F.

    2002-02-25

    Managing the operation of large commercial spent nuclear fuel reprocessing plants, such as UP3 and UP2-800 in La Hague, France, requires an extensive analytical program and the shortest possible analysis response times. COGEMA, together with its engineering subsidiary SGN, decided to build high-performance laboratories to support operations in its plants. These laboratories feature automated equipment, safe environments for operators, and short response times, all in centralized installations. Implementation of a computerized analytical data management system and a fully automated pneumatic system for the transfer of radioactive samples was a key factor contributing to the successful operation of the laboratories and plants.

  16. Using Automated Assessment Feedback to Enhance the Quality of Student Learning in Universities: A Case Study

    NASA Astrophysics Data System (ADS)

    Biggam, John

    There are many different ways of providing university students with feedback on their assessment performance, ranging from written checklists and handwritten commentaries to individual verbal feedback. Regardless of whether the feedback is summative or formative in nature, it is widely recognized that providing consistent, meaningful written feedback to students on assessment performance is not a simple task, particularly where a module is delivered by a team of staff. Typical student complaints about such feedback include: inconsistency of comment between lecturers; illegible handwriting; difficulty in relating feedback to assessment criteria; and vague remarks. For staff themselves, there is the problem that written comments, to be of any benefit to students, are immensely time-consuming. This paper illustrates, through a case study, the enormous benefits of Automated Assessment Feedback for staff and students. A clear strategy on how to develop an automated assessment feedback system, using the simplest of technologies, is provided.

  17. High-Throughput Serum 25-Hydroxy Vitamin D Testing with Automated Sample Preparation.

    PubMed

    Stone, Judy

    2016-01-01

Serum from bar-coded tubes, and then internal standard, are pipetted to 96-well plates with an 8-channel automated liquid handler (ALH). The first precipitation reagent (methanol:ZnSO4) is added and mixed with the 8-channel ALH. A second protein precipitating agent, 1 % formic acid in acetonitrile, is added and mixed with a 96-channel ALH. After a 4-min delay for larger precipitates to settle to the bottom of the plate, the upper 36 % of the precipitate/supernatant mix is transferred with the 96-channel ALH to a Sigma Hybrid SPE(®) plate and vacuumed through for removal of phospholipids and precipitated proteins. The filtrate is collected in a second 96-well plate (collection plate), which is foil-sealed, placed in the autosampler (ALS), and injected into a multiplexed LC-MS/MS system running AB Sciex Cliquid(®) and MPX(®) software. Two Shimadzu LC stacks, with multiplex timing controlled by MPX(®) software, inject alternately to one AB Sciex API-5000 MS/MS using positive atmospheric pressure chemical ionization (APCI) and a 1.87 min water/acetonitrile LC gradient with a 2.1 × 20 mm, 2.7 μm, C18 fused core particle column (Sigma Ascentis Express). LC-MS/MS throughput is ~44 samples/h per LC-MS/MS system with dual-LC-channel multiplexing. Plate maps are transferred electronically from the ALH and reformatted into LC-MS/MS sample table format using the Data Innovations LLC (DI) Instrument Manager middleware application. Before collection plates are loaded into the ALS, the plate bar code is manually scanned to download the sample table from the DI middleware to the LC-MS/MS. After acquisition, LC-MS/MS data are analyzed with AB Sciex Multiquant(®) software using customized queries, and then results are transferred electronically via a DI interface to the LIS. 2500 samples/day can be extracted by two analysts using four ALHs in 4-6 h. LC-MS/MS analysis of those samples on three dual-channel LC multiplexed LC-MS/MS systems requires 19-21 h and data analysis can be

  18. Automated peroperative assessment of stents apposition from OCT pullbacks.

    PubMed

    Dubuisson, Florian; Péry, Emilie; Ouchchane, Lemlih; Combaret, Nicolas; Kauffmann, Claude; Souteyrand, Géraud; Motreff, Pascal; Sarry, Laurent

    2015-04-01

This study's aim was to assess stent apposition by automatically analyzing endovascular optical coherence tomography (OCT) sequences. The lumen is detected using threshold, morphological, and gradient operators to run a Dijkstra algorithm. Wrong detections tagged by the user and caused by bifurcations, struts' presence, thrombotic lesions, or dissections can be corrected using a morphing algorithm. Struts are also segmented by computing symmetrical and morphological operators. The Euclidean distance between detected struts and the artery wall initializes a complete distance map of the stent, and missing data are interpolated with thin-plate spline functions. Rejection of detected outliers, regularization of parameters by generalized cross-validation, and use of the one-side cyclic property of the map further optimize accuracy. Several indices computed from the map provide quantitative measures of malapposition. The algorithm was run on four in-vivo OCT sequences including different cases of incomplete stent apposition. Comparison with manual expert measurements validates the segmentation's accuracy and shows an almost perfect concordance of automated results. PMID:25700272
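    The graph-search step named in the abstract is standard Dijkstra; a generic sketch on a toy weighted graph (not the paper's image graph, whose edge costs come from threshold, morphological, and gradient operators) looks like this:

```python
import heapq

def dijkstra(graph, src):
    """Shortest-path costs from src over a weighted adjacency dict."""
    dist = {src: 0}
    pq = [(0, src)]
    while pq:
        d, u = heapq.heappop(pq)
        if d > dist.get(u, float("inf")):
            continue  # stale queue entry
        for v, w in graph.get(u, ()):
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                heapq.heappush(pq, (nd, v))
    return dist

# Toy graph standing in for the lumen-detection cost graph
g = {"a": [("b", 1), ("c", 4)], "b": [("c", 2)], "c": []}
print(dijkstra(g, "a"))  # {'a': 0, 'b': 1, 'c': 3}
```

    In the paper's setting, nodes would be image pixels and edge weights gradient-derived costs, so the minimum-cost path traces the lumen contour.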

  19. Automated Tissue Classification Framework for Reproducible Chronic Wound Assessment

    PubMed Central

    Mukherjee, Rashmi; Manohar, Dhiraj Dhane; Das, Dev Kumar; Achar, Arun; Mitra, Analava; Chakraborty, Chandan

    2014-01-01

The aim of this paper was to develop a computer assisted tissue classification (granulation, necrotic, and slough) scheme for chronic wound (CW) evaluation using medical image processing and statistical machine learning techniques. The red-green-blue (RGB) wound images, captured with an ordinary digital camera, were first transformed into HSI (hue, saturation, and intensity) color space, and the “S” component of the HSI color channels was selected because it provided higher contrast. Wound areas from 6 different types of CW were segmented from whole images using fuzzy divergence based thresholding by minimizing edge ambiguity. A set of color and textural features describing granulation, necrotic, and slough tissues in the segmented wound area were extracted using various mathematical techniques. Finally, statistical learning algorithms, namely, Bayesian classification and support vector machine (SVM), were trained and tested for wound tissue classification in different CW images. The performance of the wound area segmentation protocol was further validated against ground truth images labeled by clinical experts. SVM with a 3rd-order polynomial kernel provided the highest accuracies, that is, 86.94%, 90.47%, and 75.53%, for classifying granulation, slough, and necrotic tissues, respectively. The proposed automated tissue classification technique achieved the highest overall accuracy, that is, 87.61%, with the highest kappa statistic value (0.793). PMID:25114925
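    The final classification step can be sketched with scikit-learn's SVC using the 3rd-order polynomial kernel the authors report as best; the feature vectors below are synthetic stand-ins for the paper's color and textural features.

```python
import numpy as np
from sklearn.svm import SVC

# Synthetic feature vectors for wound-tissue patches
# (0 = granulation, 1 = slough, 2 = necrotic); real features would be
# extracted from the segmented "S"-channel wound area.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(loc=c, scale=0.3, size=(30, 4))
               for c in (0.0, 1.5, 3.0)])
y = np.repeat([0, 1, 2], 30)

# 3rd-order polynomial kernel, as reported to give the best accuracy
clf = SVC(kernel="poly", degree=3).fit(X, y)
print(clf.predict([[0.0, 0.0, 0.0, 0.0]]))
```

    A patch whose features sit near the class-0 cluster is classified as granulation tissue in this toy setup.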


  1. Improving EFL Graduate Students' Proficiency in Writing through an Online Automated Essay Assessing System

    ERIC Educational Resources Information Center

    Ma, Ke

    2013-01-01

    This study investigates the effects of using an online Automated Essay Assessing (AEA) system on EFL graduate students' writing. Eighty four EFL graduate students divided into the treatment group and the control group participated in this study. The treatment group was asked to use an AEA system to assist their essay writing. Both groups were…

  2. Assessing Writing in MOOCs: Automated Essay Scoring and Calibrated Peer Review™

    ERIC Educational Resources Information Center

    Balfour, Stephen P.

    2013-01-01

    Two of the largest Massive Open Online Course (MOOC) organizations have chosen different methods for the way they will score and provide feedback on essays students submit. EdX, MIT and Harvard's non-profit MOOC federation, recently announced that they will use a machine-based Automated Essay Scoring (AES) application to assess written work in…

  3. Adult Students' Perceptions of Automated Writing Assessment Software: Does It Foster Engagement?

    ERIC Educational Resources Information Center

    LaGuerre, Joselle L.

    2013-01-01

    Generally, this descriptive study endeavored to include the voice of adult learners to the scholarly body of research regarding automated writing assessment tools (AWATs). Specifically, the study sought to determine the extent to which students perceive that the AWAT named Criterion fosters learning and if students' opinions differ depending…

  4. Automated Spacecraft Conjunction Assessment at Mars and the Moon

    NASA Technical Reports Server (NTRS)

    Berry, David; Guinn, Joseph; Tarzi, Zahi; Demcak, Stuart

    2012-01-01

Conjunction assessment and collision avoidance are areas of current high interest in space operations. Most current conjunction assessment activity focuses on the Earth orbital environment. Several of the world's space agencies have satellites in orbit at Mars and the Moon, and avoiding collisions there is important too. There are fewer assets than at Earth and fewer organizations involved, but the consequences are similar to Earth scenarios. This presentation will examine conjunction assessment processes implemented at JPL for spacecraft in orbit at Mars and the Moon.

  5. Assessment of Automated Measurement and Verification (M&V) Methods

    SciTech Connect

    Granderson, Jessica; Touzani, Samir; Custodio, Claudine; Sohn, Michael; Fernandes, Samuel; Jump, David

    2015-07-01

    This report documents the application of a general statistical methodology to assess the accuracy of baseline energy models, focusing on its application to Measurement and Verification (M&V) of whole-building energy savings.

  6. AUTOMATED GEOSPATIAL WATERSHED ASSESSMENT (AGWA): A GIS-BASED HYDROLOGICAL MODELING TOOL FOR WATERSHED MANAGEMENT AND LANDSCAPE ASSESSMENT

    EPA Science Inventory

    The Automated Geospatial Watershed Assessment (http://www.epa.gov/nerlesd1/land-sci/agwa/introduction.htm and www.tucson.ars.ag.gov/agwa) tool is a GIS interface jointly developed by the U.S. Environmental Protection Agency, USDA-Agricultural Research Service, and the University ...

  7. Automated high-throughput assessment of prostate biopsy tissue using infrared spectroscopic chemical imaging

    NASA Astrophysics Data System (ADS)

    Bassan, Paul; Sachdeva, Ashwin; Shanks, Jonathan H.; Brown, Mick D.; Clarke, Noel W.; Gardner, Peter

    2014-03-01

Fourier transform infrared (FT-IR) chemical imaging has been demonstrated as a promising technique to complement histopathological assessment of biomedical tissue samples. Current histopathology practice involves preparing thin tissue sections and staining them using hematoxylin and eosin (H&E), after which a histopathologist manually assesses the tissue architecture under a visible microscope. Studies have shown that there is disagreement between operators viewing the same tissue, suggesting that a complementary technique for verification could improve the robustness of the evaluation, and improve patient care. FT-IR chemical imaging allows the spatial distribution of chemistry to be rapidly imaged at a high (diffraction-limited) spatial resolution where each pixel represents an area of 5.5 × 5.5 μm² and contains a full infrared spectrum providing a chemical fingerprint which studies have shown contains the diagnostic potential to discriminate between different cell-types, and even the benign or malignant state of prostatic epithelial cells. We report a label-free (i.e. no chemical de-waxing, or staining) method of imaging large pieces of prostate tissue (typically 1 cm × 2 cm) in tens of minutes (at a rate of 0.704 × 0.704 mm² every 14.5 s) yielding images containing millions of spectra. Due to refractive index matching between sample and surrounding paraffin, minimal signal processing is required to recover spectra with their natural profile as opposed to harsh baseline correction methods, paving the way for future quantitative analysis of biochemical signatures. The quality of the spectral information is demonstrated by building and testing an automated cell-type classifier based upon spectral features.

  8. Thrombin generation using the calibrated automated thrombinoscope to assess reversibility of dabigatran and rivaroxaban.

    PubMed

    Herrmann, Richard; Thom, James; Wood, Alicia; Phillips, Michael; Muhammad, Shoaib; Baker, Ross

    2014-05-01

The new direct-acting anticoagulants such as dabigatran and rivaroxaban are usually not monitored but may be associated with haemorrhage, particularly where renal impairment occurs. They have no effective "antidotes". We studied 17 patients receiving dabigatran 150 mg twice daily for non-valvular atrial fibrillation and 15 patients receiving rivaroxaban 10 mg daily for the prevention of deep venous thrombosis after hip or knee replacement surgery. We assessed the effect of these drugs on commonly used laboratory tests and the Calibrated Automated Thrombogram (CAT) using plasma samples. We also assessed effects in fresh whole blood citrated patient samples using thromboelastography on the TEG and the ROTEM. The efficacy of the nonspecific haemostatic agents prothrombin complex concentrate (PCC), Factor VIII Inhibitor By-passing Activity (FEIBA) and recombinant activated factor VII (rFVIIa) was tested by reversal of abnormal thrombin generation using the CAT. Concentrations added ex vivo were chosen to reflect doses normally given in vivo. Dabigatran significantly increased the dynamic parameters of the TEG and ROTEM and the lag time of the CAT. It significantly reduced the endogenous thrombin potential (ETP) and reduced the peak height of the CAT. Rivaroxaban did not affect the TEG and ROTEM parameters but did increase the lag time and reduce the ETP and peak height of the CAT. For both drugs, these parameters were significantly and meaningfully corrected by PCC and FEIBA and to a lesser but still significant extent by rFVIIa. These results may be useful in devising a reversal strategy in patients but clinical experience will be needed to verify them.

  9. Automated fiber-type-specific cross-sectional area assessment and myonuclei counting in skeletal muscle

    PubMed Central

    Liu, Fujun; Fry, Christopher S.; Mula, Jyothi; Jackson, Janna R.; Lee, Jonah D.; Peterson, Charlotte A.

    2013-01-01

Skeletal muscle is an exceptionally adaptive tissue that comprises 40% of mammalian body mass. Skeletal muscle functions in locomotion, but also plays important roles in thermogenesis and metabolic homeostasis. Thus, characterizing the structural and functional properties of skeletal muscle is important in many facets of biomedical research, ranging from myopathies to rehabilitation sciences to exercise interventions aimed at improving quality of life in the face of chronic disease and aging. In this paper, we focus on automated quantification of three important morphological features of muscle: 1) muscle fiber-type composition; 2) muscle fiber-type-specific cross-sectional area; and 3) myonuclear content and location. We experimentally prove that the proposed automated image analysis approaches for fiber-type-specific assessments and automated myonuclei counting are fast, accurate, and reliable. PMID:24092696

  10. Assessing Creative Problem-Solving with Automated Text Grading

    ERIC Educational Resources Information Center

    Wang, Hao-Chuan; Chang, Chun-Yen; Li, Tsai-Yen

    2008-01-01

    The work aims to improve the assessment of creative problem-solving in science education by employing language technologies and computational-statistical machine learning methods to grade students' natural language responses automatically. To evaluate constructs like creative problem-solving with validity, open-ended questions that elicit…

  11. Falcon: automated optimization method for arbitrary assessment criteria

    DOEpatents

    Yang, Tser-Yuan; Moses, Edward I.; Hartmann-Siantar, Christine

    2001-01-01

    FALCON is a method for automatic multivariable optimization for arbitrary assessment criteria that can be applied to numerous fields where outcome simulation is combined with optimization and assessment criteria. A specific implementation of FALCON is for automatic radiation therapy treatment planning. In this application, FALCON implements dose calculations into the planning process and optimizes available beam delivery modifier parameters to determine the treatment plan that best meets clinical decision-making criteria. FALCON is described in the context of the optimization of external-beam radiation therapy and intensity modulated radiation therapy (IMRT), but the concepts could also be applied to internal (brachytherapy) radiotherapy. The radiation beams could consist of photons or any charged or uncharged particles. The concept of optimizing source distributions can be applied to complex radiography (e.g. flash x-ray or proton) to improve the imaging capabilities of facilities proposed for science-based stockpile stewardship.
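    The general pattern of coupling an outcome simulation to a multivariable optimizer over an arbitrary assessment criterion can be sketched with a generic numerical optimizer; the quadratic criterion below is a toy stand-in, not FALCON's dose model.

```python
from scipy.optimize import minimize

# Toy assessment criterion standing in for a dose-outcome simulation:
# the optimizer searches the modifier-parameter space x for the plan
# that best meets the (here, simple quadratic) decision criterion.
def criterion(x):
    return (x[0] - 2.0) ** 2 + (x[1] + 1.0) ** 2

res = minimize(criterion, x0=[0.0, 0.0])
print(res.x)  # close to [2.0, -1.0]
```

    In a treatment-planning application, `criterion` would run the dose calculation and score the result against clinical decision-making criteria.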

  12. Automated Cognitive Health Assessment Using Smart Home Monitoring of Complex Tasks

    PubMed Central

    Dawadi, Prafulla N.; Cook, Diane J.; Schmitter-Edgecombe, Maureen

    2014-01-01

    One of the many services that intelligent systems can provide is the automated assessment of resident well-being. We hypothesize that the functional health of individuals, or ability of individuals to perform activities independently without assistance, can be estimated by tracking their activities using smart home technologies. In this paper, we introduce a machine learning-based method for assessing activity quality in smart homes. To validate our approach we quantify activity quality for 179 volunteer participants who performed a complex, interweaved set of activities in our smart home apartment. We observed a statistically significant correlation (r=0.79) between automated assessment of task quality and direct observation scores. Using machine learning techniques to predict the cognitive health of the participants based on task quality is accomplished with an AUC value of 0.64. We believe that this capability is an important step in understanding everyday functional health of individuals in their home environments. PMID:25530925
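    The two validation metrics cited (Pearson correlation against direct observation, and AUC for predicting cognitive health from task quality) can be computed as follows; the scores and labels below are invented for illustration, not the study's data.

```python
import numpy as np
from sklearn.metrics import roc_auc_score

# Illustrative data: direct-observation scores vs. automated task quality
obs = np.array([3.1, 2.4, 4.0, 1.5, 3.6, 2.9])
auto = np.array([2.8, 2.2, 3.9, 1.9, 3.3, 3.1])
r = np.corrcoef(obs, auto)[0, 1]  # Pearson correlation

# AUC for predicting a binary cognitive-health label from task quality
labels = np.array([1, 0, 1, 0, 1, 1])
auc = roc_auc_score(labels, auto)
print(round(r, 2), auc)
```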

  13. Method and Apparatus for Automated Isolation of Nucleic Acids from Small Cell Samples

    NASA Technical Reports Server (NTRS)

    Sundaram, Shivshankar; Prabhakarpandian, Balabhaskar; Pant, Kapil; Wang, Yi

    2014-01-01

RNA isolation is a ubiquitous need, driven by current emphasis on microarrays and miniaturization. With commercial systems requiring 100,000 to 1,000,000 cells for successful isolation, there is a growing need for a small-footprint, easy-to-use device that can harvest nucleic acids from much smaller cell samples (1,000 to 10,000 cells). The process of extraction of RNA from cell cultures is a complex, multi-step one, and requires timed, asynchronous operations with multiple reagents/buffers. An added complexity is the fragility of RNA (subject to degradation) and its reactivity to surfaces. A novel, microfluidics-based, integrated cartridge has been developed that can fully automate the complex process of RNA isolation (lyse, capture, and elute RNA) from small cell culture samples. On-cartridge cell lysis is achieved using either reagents or high-strength electric fields made possible by the miniaturized format. Traditionally, silica-based, porous-membrane formats have been used for RNA capture, requiring slow perfusion for effective capture. In this design, high-efficiency capture and elution are achieved using a microsphere-based "microfluidized" format. Electrokinetic phenomena are harnessed to actively mix microspheres with the cell lysate and capture/elution buffer, providing important advantages in extraction efficiency, processing time, and operational flexibility. Successful RNA isolation was demonstrated using both suspension (HL-60) and adherent (BHK-21) cells. Novel features associated with this development are twofold. First, novel designs that execute needed processes with improved speed and efficiency were developed. These primarily encompass electric-field-driven lysis of cells. 
The configurations include electrode-containing constructs, or an "electrode-less" chip design, which is easy to fabricate and mitigates fouling at the electrode surface; and the "fluidized" extraction format based on electrokinetically assisted mixing and contacting of microbeads

  14. Donor disc attachment assessment with intraoperative spectral optical coherence tomography during descemet stripping automated endothelial keratoplasty

    PubMed Central

    Wylegala, Edward; Nowinska, Anna K; Wroblewska-Czajka, Ewa; Janiszewska, Dominika

    2013-01-01

    Optical coherence tomography has already been proven to be useful for pre- and post-surgical anterior eye segment assessment, especially in lamellar keratoplasty procedures. There is no evidence for intraoperative usefulness of optical coherence tomography (OCT). We present a case report of the intraoperative donor disc attachment assessment with spectral-domain optical coherence tomography in case of Descemet stripping automated endothelial keratoplasty (DSAEK) surgery combined with corneal incisions. The effectiveness of the performed corneal stab incisions was visualized directly by OCT scan analysis. OCT assisted DSAEK allows the assessment of the accuracy of the Descemet stripping and donor disc attachment. PMID:24104711

  15. Automated Peripheral Neuropathy Assessment Using Optical Imaging and Foot Anthropometry.

    PubMed

    Siddiqui, Hafeez-U R; Spruce, Michelle; Alty, Stephen R; Dudley, Sandra

    2015-08-01

A large proportion of individuals who live with type-2 diabetes suffer from plantar sensory neuropathy. Regular testing and assessment for the condition is required to avoid ulceration or other damage to patients' feet. Currently accepted practice involves a trained clinician testing a patient's feet manually with a hand-held nylon monofilament probe. The procedure is time consuming, labor intensive, requires special training, is prone to error, and repeatability is difficult. With the vast increase in type-2 diabetes, the number of plantar sensory neuropathy sufferers has already grown to such an extent as to make a traditional manual test problematic. This paper presents the first investigation of a novel approach to automatically identify the pressure points on a given patient's foot for the examination of sensory neuropathy via optical image processing incorporating plantar anthropometry. The method automatically selects suitable test points on the plantar surface that correspond to those repeatedly chosen by a trained podiatrist. The proposed system automatically identifies the specific pressure points at different locations, namely the toe (hallux), metatarsal heads, and heel (calcaneum) areas. The approach is generic and has shown 100% reliability on the available database used. The database consists of Chinese, Asian, African, and Caucasian foot images.


  17. Bayesian stratified sampling to assess corpus utility

    SciTech Connect

    Hochberg, J.; Scovel, C.; Thomas, T.; Hall, S.

    1998-12-01

This paper describes a method for asking statistical questions about a large text corpus. The authors exemplify the method by addressing the question, "What percentage of Federal Register documents are real documents, of possible interest to a text researcher or analyst?" They estimate an answer to this question by evaluating 200 documents selected from a corpus of 45,820 Federal Register documents. Bayesian analysis and stratified sampling are used to reduce the sampling uncertainty of the estimate from over 3,100 documents to fewer than 1,000. A possible application of the method is to establish baseline statistics used to estimate recall rates for information retrieval systems.
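    Under an assumed Beta(1,1) prior per stratum, a Bayesian stratified estimate of the "real document" proportion combines per-stratum posteriors by stratum weight; the stratum sizes and evaluation counts below are illustrative, not the paper's data.

```python
import numpy as np

# Illustrative strata over a 45,820-document corpus (counts invented)
strata_sizes = np.array([30000, 10000, 5820])  # documents per stratum
sampled = np.array([100, 60, 40])              # documents evaluated
relevant = np.array([80, 30, 10])              # judged "real"

# Beta(1,1) prior + binomial likelihood -> Beta posterior per stratum
alpha = 1 + relevant
beta = 1 + sampled - relevant
post_mean = alpha / (alpha + beta)
post_var = alpha * beta / ((alpha + beta) ** 2 * (alpha + beta + 1))

# Combine strata by population weight (strata assumed independent)
w = strata_sizes / strata_sizes.sum()
est = float(w @ post_mean)               # estimated proportion of real docs
sd = float(np.sqrt(w ** 2 @ post_var))   # posterior standard deviation
print(round(est, 3), round(sd, 4))
```

    Allocating more of the evaluation budget to high-variance strata is what shrinks the overall uncertainty relative to simple random sampling.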

  18. Automated Assessment of Right Ventricular Volumes and Function Using Three-Dimensional Transesophageal Echocardiography.

    PubMed

    Nillesen, Maartje M; van Dijk, Arie P J; Duijnhouwer, Anthonie L; Thijssen, Johan M; de Korte, Chris L

    2016-02-01

Assessment of right ventricular (RV) function is known to be of diagnostic value in patients with RV dysfunction. Because of its complex anatomic shape, automated determination of the RV volume is difficult and strong reliance on geometric assumptions is not desired. A method for automated RV assessment was developed using three-dimensional (3-D) echocardiography without relying on a priori knowledge of the cardiac anatomy. A 3-D adaptive filtering technique that optimizes the discrimination between blood and myocardium was applied to facilitate endocardial border detection. Filtered image data were incorporated in a segmentation model to automatically detect the endocardial RV border. End-systolic and end-diastolic RV volumes, as well as ejection fraction, were computed from the automatically segmented endocardial surfaces and compared against reference volumes manually delineated by two expert cardiologists. The results showed good performance in terms of correlation and agreement with the reference volumes.

  19. Assessing genetic diversity in a sugarcane germplasm collection using an automated AFLP analysis.

    PubMed

    Besse, P; Taylor, G; Carroll, B; Berding, N; Burner, D; McIntyre, C L

    1998-10-01

An assessment of genetic diversity within and between Saccharum, Old World Erianthus sect. Ripidium, and North American E. giganteus (S. giganteum) was conducted using Amplified Fragment Length Polymorphism (AFLP™) markers. An automated gel scoring system (GelCompar™) was successfully used to analyse the complex AFLP patterns obtained in sugarcane and its relatives. Similarity coefficient calculations and clustering revealed a genetic structure for Saccharum and Erianthus sect. Ripidium that was identical to the one previously obtained using other molecular marker types, showing the appropriateness of AFLP markers and the associated automated analysis in assessing genetic diversity in sugarcane. A genetic structure that correlated with cytotype (2n=30, 60, 90) was revealed within the North American species, E. giganteus (S. giganteum). Complex relationships among Saccharum, Erianthus sect. Ripidium, and North American E. giganteus were revealed and are discussed in the light of a similar study which involved RAPD markers.
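    Scoring binary band-presence patterns and clustering them, the core of this kind of AFLP diversity analysis, can be sketched with SciPy; the 0/1 band matrix below is invented, and Jaccard distance with UPGMA (average) linkage is one common choice of similarity coefficient and clustering method, not necessarily the exact settings used with GelCompar.

```python
import numpy as np
from scipy.cluster.hierarchy import fcluster, linkage
from scipy.spatial.distance import pdist

# Illustrative 0/1 AFLP band-presence matrix (rows = accessions)
bands = np.array([[1, 1, 0, 1, 0],
                  [1, 1, 0, 1, 1],
                  [0, 0, 1, 0, 1],
                  [0, 1, 1, 0, 1]], dtype=bool)

d = pdist(bands, metric="jaccard")   # pairwise Jaccard distances
tree = linkage(d, method="average")  # UPGMA hierarchical clustering
groups = fcluster(tree, t=2, criterion="maxclust")
print(groups)
```

    Accessions with similar band patterns (the first two rows, and the last two) fall into the same cluster, mirroring how the genetic structure was recovered from the scored gels.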

  20. Automated Gel Size Selection to Improve the Quality of Next-generation Sequencing Libraries Prepared from Environmental Water Samples.

    PubMed

    Uyaguari-Diaz, Miguel I; Slobodan, Jared R; Nesbitt, Matthew J; Croxen, Matthew A; Isaac-Renton, Judith; Prystajecky, Natalie A; Tang, Patrick

    2015-04-17

Next-generation sequencing of environmental samples can be challenging because of the variable DNA quantity and quality in these samples. High quality DNA libraries are needed for optimal results from next-generation sequencing. Environmental samples such as water may have low quality and quantities of DNA as well as contaminants that co-precipitate with DNA. The mechanical and enzymatic processes involved in extraction and library preparation may further damage the DNA. Gel size selection enables purification and recovery of DNA fragments of a defined size for sequencing applications. Nevertheless, this task is one of the most time-consuming steps in the DNA library preparation workflow. The protocol described here enables complete automation of agarose gel loading, electrophoretic analysis, and recovery of targeted DNA fragments. In this study, we describe a high-throughput approach to prepare high quality DNA libraries from freshwater samples that can also be applied to other environmental samples. We used an indirect approach to concentrate bacterial cells from environmental freshwater samples; DNA was extracted using a commercially available DNA extraction kit, and DNA libraries were prepared using a commercial transposon-based protocol. DNA fragments of 500 to 800 bp were gel size selected using Ranger Technology, an automated electrophoresis workstation. Sequencing of the size-selected DNA libraries demonstrated significant improvements to read length and quality of the sequencing reads.

  1. Automated sample preparation for radiogenic and non-traditional metal isotope analysis by MC-ICP-MS

    NASA Astrophysics Data System (ADS)

    Field, M. P.; Romaniello, S. J.; Gordon, G. W.; Anbar, A. D.

    2012-12-01

High throughput analysis is becoming increasingly important for many applications of radiogenic and non-traditional metal isotopes. While MC-ICP-MS instruments offer the potential for very high sample throughput, the requirement for labor-intensive sample preparation and purification procedures remains a substantial bottleneck. Current purification protocols require manually feeding gravity-driven separation columns, a process that is both costly and time consuming. This bottleneck is eliminated with the prepFAST-MC™, an automated, low-pressure ion exchange chromatography system that can process from 1 to 60 samples in unattended operation. The syringe-driven system allows the sample loading, multiple acid washes, column conditioning, and elution cycles necessary to isolate elements of interest, and automatically collects up to 3 discrete eluent fractions at user-defined intervals (time, volume, and flow rate). Newly developed protocols for automated purification of uranium illustrate high throughput (>30 per run), multiple samples processed per column (>30), complete (>99%) matrix removal, high recovery (>98%, n = 25), and excellent precision (2σ = 0.03‰, n = 10). The prepFAST-MC™ maximizes sample throughput and minimizes costs associated with personnel and consumables, providing an opportunity to greatly expand research horizons in fields where large isotopic data sets are required, including archeology, geochemistry, and climate/environmental science.

  2. SAMPLING DESIGN FOR ASSESSING RECREATIONAL WATER QUALITY

    EPA Science Inventory

Current U.S. EPA guidelines for monitoring recreational water quality refer to the geometric mean density of indicator organisms, enterococci and E. coli in marine and fresh water, respectively, from at least five samples collected over a four-week period. In order to expand thi...

  3. Automation impact study of Army training management 2: Extension of sampling and collection of installation resource data

    SciTech Connect

    Sanquist, T.F.; McCallum, M.C.; Hunt, P.S.; Slavich, A.L.; Underwood, J.A.; Toquam, J.L.; Seaver, D.A.

    1989-05-01

    This automation impact study of Army training management (TM) was performed for the Army Development and Employment Agency (ADEA) and the Combined Arms Training Activity (CATA) by the Battelle Human Affairs Research Centers and the Pacific Northwest Laboratory. The primary objective of the study was to provide the Army with information concerning the potential costs and savings associated with automating the TM process. This study expands the sample of units surveyed in Phase I of the automation impact effort (Sanquist et al., 1988), and presents data concerning installation resource management in relation to TM. The structured interview employed in Phase I was adapted to a self-administered survey. The data collected were compatible with those of Phase I, and both were combined for analysis. Three US sites, one reserve division, one National Guard division, and one unit in the active component outside the continental US (OCONUS) (referred to in this report as forward deployed) were surveyed. The total sample size was 459, of which 337 respondents contributed the most detailed data. 20 figs., 62 tabs.

  4. A conceptual model of the automated credibility assessment of the volunteered geographic information

    NASA Astrophysics Data System (ADS)

    Idris, N. H.; Jackson, M. J.; Ishak, M. H. I.

    2014-02-01

    The use of Volunteered Geographic Information (VGI) in collecting, sharing and disseminating geospatially referenced information on the Web is increasingly common. The potential of this localized and collective information has been seen to complement the maintenance of authoritative mapping data sources and to support the development of Digital Earth. The main barrier to using these data in such a bottom-up approach is the credibility (trust), completeness, accuracy, and quality of both the data input and the outputs generated. The only feasible approach to assessing these data at scale is an automated process. This paper describes a conceptual model of indicators (parameters) and practical approaches for automatically assessing the credibility of information contributed through VGI, including map mashups, Geo Web and crowd-sourced applications. Two main components are proposed for assessment in the conceptual model: metadata and data. The metadata component comprises indicators of the hosting websites and the sources of the data/information. The data component comprises indicators for assessing absolute and relative positioning, attribute, thematic, temporal and geometric correctness and consistency. To assess the metadata component, automated text categorization using supervised machine learning is proposed. To assess correctness and consistency in the data component, we suggest a matching-validation approach that draws on emerging Linked Data infrastructures and third-party review validation. This study contributes to the research domain that focuses on the credibility, trust and quality of data contributed by citizen web providers.

  5. The Effects of Finite Sampling Corrections on State Assessment Sample Requirements. NAEP Validity Studies (NVS).

    ERIC Educational Resources Information Center

    Chromy, James R.

    States participating in the National Assessment of Educational Progress State Assessment program (state NAEP) are required to sample at least 2,500 students from at least 100 schools per subject assessed. In this ideal situation, 25 students are assessed for a subject in each school selected for that subject. Two problems have arisen: some states…

  6. Determination of heterocyclic aromatic amines in food products: automation of the sample preparation method prior to HPLC and HPLC-MS quantification.

    PubMed

    Fay, L B; Ali, S; Gross, G A

    1997-05-12

    Heat processing of protein-rich foods may cause the formation of heterocyclic aromatic amines (HAAs), all of which have mutagenic and some also carcinogenic potential. Accurately measuring HAA levels in food products is therefore necessary to realistically assess this risk factor. We have developed a solid-phase extraction method for quantitative HAA analysis over the last few years. This method has recently been automated using a robotic workstation and now allows almost unattended sample preparation, a process which saves a human operator about five hours of benchwork. Cleaned-up samples were analyzed by high-performance liquid chromatography (HPLC) with ultraviolet (UV) or mass spectrometric (MS) detection. While HPLC-UV remains the daily tool for quantifying HAAs, we found HPLC-electrospray-MS to be an alternative detection method with unique advantages, suited for both HAA identification and quantification.

  7. Automated Multiple-Sample Tray Manipulation Designed and Fabricated for Atomic Oxygen Facility

    NASA Technical Reports Server (NTRS)

    Sechkar, Edward A.; Stueber, Thomas J.; Dever, Joyce A.; Banks, Bruce A.; Rutledge, Sharon K.

    2000-01-01

    Extensive improvements to increase testing capacity and flexibility and to automate the in situ Reflectance Measurement System (RMS) are in progress at the Electro-Physics Branch's Atomic Oxygen (AO) beam facility of the NASA Glenn Research Center at Lewis Field. These improvements will triple the system's capacity while placing a significant portion of the testing cycle under computer control for added reliability, repeatability, and ease of use.

  8. A Psycholinguistic Model for Simultaneous Translation, and Proficiency Assessment by Automated Acoustic Analysis of Discourse.

    NASA Astrophysics Data System (ADS)

    Yaghi, Hussein M.

    Two separate but related issues are addressed: how simultaneous translation (ST) works on a cognitive level and how such translation can be objectively assessed. Both of these issues are discussed in the light of qualitative and quantitative analyses of a large corpus of recordings of ST and shadowing. The proposed ST model utilises knowledge derived from a discourse analysis of the data, many accepted facts in the psychology tradition, and evidence from controlled experiments that are carried out here. This model has three advantages: (i) it is based on analyses of extended spontaneous speech rather than word-, syllable-, or clause -bound stimuli; (ii) it draws equally on linguistic and psychological knowledge; and (iii) it adopts a non-traditional view of language called 'the linguistic construction of reality'. The discourse-based knowledge is also used to develop three computerised systems for the assessment of simultaneous translation: one is a semi-automated system that treats the content of the translation; and two are fully automated, one of which is based on the time structure of the acoustic signals whilst the other is based on their cross-correlation. For each system, several parameters of performance are identified, and they are correlated with assessments rendered by the traditional, subjective, qualitative method. Using signal processing techniques, the acoustic analysis of discourse leads to the conclusion that quality in simultaneous translation can be assessed quantitatively with varying degrees of automation. It identifies as measures of performance (i) three content-based standards; (ii) four time management parameters that reflect the influence of the source on the target language time structure; and (iii) two types of acoustical signal coherence. Proficiency in ST is shown to be directly related to coherence and speech rate but inversely related to omission and delay. High proficiency is associated with a high degree of simultaneity and
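    One of the fully automated assessment systems described above is based on cross-correlation of the acoustic signals. A simplified sketch of that idea, a zero-lag normalized cross-correlation between a source and a target recording envelope, with made-up sample values (the thesis's actual signal processing pipeline is not specified in the abstract):

```python
def normalized_cross_correlation(x, y):
    """Zero-lag normalized cross-correlation of two equal-length signals.

    Returns a value in [-1, 1]; values near 1 indicate closely matched
    time structure between the two signals.
    """
    n = len(x)
    mx = sum(x) / n
    my = sum(y) / n
    num = sum((a - mx) * (b - my) for a, b in zip(x, y))
    den = (sum((a - mx) ** 2 for a in x) *
           sum((b - my) ** 2 for b in y)) ** 0.5
    return num / den

# Hypothetical amplitude envelopes of source speech and its translation
source = [0.0, 1.0, 0.5, -0.5, -1.0, 0.0]
target = [0.1, 0.9, 0.6, -0.4, -1.1, -0.1]
r = normalized_cross_correlation(source, target)  # close to 1
```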

  9. Automated assessment of conditioning parameters for context and cued fear in mice.

    PubMed

    Contarino, Angelo; Baca, Leonardo; Kennelly, Arthur; Gold, Lisa H

    2002-01-01

    A behavioral technique often used to evaluate the cognitive performance of rats and mice is the fear conditioning paradigm. During conditioned fear experiments, freezing responses shown by rodents after exposure to environmental stimuli previously paired to an aversive experience provide a behavioral index of the animal's associative abilities. The present study examined the ability of a computer-controlled automated Freeze Monitor system for recording immobility behavior in mice. The sensitivity of the automated procedure to detect group differences caused by the application of various training protocols was also evaluated. Statistical analyses revealed significant positive correlations between immobility scores obtained with the automated apparatus and hand-scored data collected by a continuous or a time-sampling method. Behavioral patterns recorded by the computerized system were very similar to those obtained by the hand-scoring methods adopted. In particular, during context testing, exposure to environmental stimuli previously paired with a mild foot shock (unconditioned stimulus [US]) evoked increased immobility behavior in mice conditioned with the US compared with levels of immobility displayed by mice previously confined to the same contextual stimuli without receiving the US. Moreover, although during conditioned stimulus (CS) testing, mice previously exposed to the US displayed high levels of immobility when confined to environmental cues much different from those paired with the US (contextual fear generalization), both hand-scored and automated results revealed the effect of CS-US pairing (increased immobility) only in mice trained to associate the two stimuli (paired group) but not in mice exposed to both CS and US separated by a 40-sec time interval (unpaired group) or in mice receiving only the US (US group) during conditioning sessions. Overall, the results show associative conditioning measured in an automated apparatus and highlight the utility
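    The time-sampling method mentioned above amounts to scoring immobility at fixed intervals and reporting the fraction of frozen observations. A minimal sketch with hypothetical observations (the study's sampling interval and session lengths are not given in the abstract):

```python
def percent_freezing(samples):
    """Percentage of observation samples scored as immobile.

    `samples` is a sequence of booleans, one per sampling interval
    (e.g., one observation every 10 s), True meaning the mouse froze.
    """
    return 100.0 * sum(samples) / len(samples)

# Hypothetical 8-min context test sampled every 10 s (48 observations)
obs = [True] * 30 + [False] * 18
pct = percent_freezing(obs)  # 62.5
```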

  10. Automated Detection of Toxigenic Clostridium difficile in Clinical Samples: Isothermal tcdB Amplification Coupled to Array-Based Detection

    PubMed Central

    Pasko, Chris; Groves, Benjamin; Ager, Edward; Corpuz, Maylene; Frech, Georges; Munns, Denton; Smith, Wendy; Warcup, Ashley; Denys, Gerald; Ledeboer, Nathan A.; Lindsey, Wes; Owen, Charles; Rea, Larry; Jenison, Robert

    2012-01-01

    Clostridium difficile can carry a genetically variable pathogenicity locus (PaLoc), which encodes clostridial toxins A and B. In hospitals and in the community at large, this organism is increasingly identified as a pathogen. To develop a diagnostic test that combines the strengths of immunoassays (cost) and DNA amplification assays (sensitivity/specificity), we targeted a genetically stable PaLoc region, amplifying tcdB sequences and detecting them by hybridization capture. The assay employs a hot-start isothermal method coupled to a multiplexed chip-based readout, creating a manual assay that detects toxigenic C. difficile with high sensitivity and specificity within 1 h. Assay automation on an electromechanical instrument produced an analytical sensitivity of 10 CFU (95% probability of detection) of C. difficile in fecal samples, along with discrimination against other enteric bacteria. To verify automated assay function, 130 patient samples were tested: 31/32 positive samples (97% sensitive; 95% confidence interval [CI], 82 to 99%) and 98/98 negative samples (100% specific; 95% CI, 95 to 100%) were scored correctly. Large-scale clinical studies are now planned to determine clinical sensitivity and specificity. PMID:22675134
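    The reported sensitivity and specificity follow directly from the quoted counts. The sketch below recomputes them and attaches a Wilson score interval; the paper's exact CI method is not stated in the abstract, so the Wilson bounds differ slightly from the quoted 82 to 99%:

```python
import math

def wilson_ci(successes, n, z=1.96):
    """Wilson score 95% confidence interval for a binomial proportion."""
    p = successes / n
    denom = 1 + z * z / n
    centre = (p + z * z / (2 * n)) / denom
    half = z * math.sqrt(p * (1 - p) / n + z * z / (4 * n * n)) / denom
    return centre - half, centre + half

sens = 31 / 32            # 0.969 -> reported as 97% sensitive
spec = 98 / 98            # 1.0   -> reported as 100% specific
lo, hi = wilson_ci(31, 32)  # roughly (0.84, 0.99)
```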

  11. Automated assessment of bilateral breast volume asymmetry as a breast cancer biomarker during mammographic screening

    NASA Astrophysics Data System (ADS)

    Williams, Alex C.; Hitt, Austin; Voisin, Sophie; Tourassi, Georgia

    2013-03-01

    The biological concept of bilateral symmetry as a marker of developmental stability and good health is well established. Although most individuals deviate slightly from perfect symmetry, humans are essentially considered bilaterally symmetrical. Consequently, increased fluctuating asymmetry of paired structures could be an indicator of disease. There are several published studies linking bilateral breast size asymmetry with increased breast cancer risk. These studies were based on radiologists' manual measurements of breast size from mammographic images. We aim to develop a computerized technique to assess fluctuating breast volume asymmetry in screening mammograms and investigate whether it correlates with the presence of breast cancer. Using a large database of screening mammograms with known ground truth we applied automated breast region segmentation and automated breast size measurements in CC and MLO views using three well established methods. All three methods confirmed that indeed patients with breast cancer have statistically significantly higher fluctuating asymmetry of their breast volumes. However, statistically significant difference between patients with cancer and benign lesions was observed only for the MLO views. The study suggests that automated assessment of global bilateral asymmetry could serve as a breast cancer risk biomarker for women undergoing mammographic screening. Such biomarker could be used to alert radiologists or computer-assisted detection (CAD) systems to exercise increased vigilance if higher than normal cancer risk is suspected.
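    A common way to quantify fluctuating asymmetry of paired volumes is the unsigned left-right difference normalized by the mean volume. A minimal sketch with hypothetical breast volumes (the paper's three specific size-measurement methods are not detailed in the abstract):

```python
def volume_asymmetry(left_vol, right_vol):
    """Unsigned fluctuating asymmetry, normalized by the mean volume.

    Returns 0 for perfectly symmetric volumes; larger values indicate
    greater asymmetry, independent of overall breast size.
    """
    mean = (left_vol + right_vol) / 2.0
    return abs(left_vol - right_vol) / mean

# Hypothetical volumes (cm^3) estimated from segmented CC views
fa = volume_asymmetry(610.0, 655.0)  # ~0.071
```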

  12. A method to establish seismic noise baselines for automated station assessment

    USGS Publications Warehouse

    McNamara, D.E.; Hutt, C.R.; Gee, L.S.; Benz, H.M.; Buland, R.P.

    2009-01-01

    We present a method for quantifying station noise baselines and characterizing the spectral shape of out-of-nominal noise sources. Our intent is to automate this method in order to ensure that only the highest-quality data are used in rapid earthquake products at NEIC. In addition, the station noise baselines provide a valuable tool to support the quality control of GSN and ANSS backbone data and metadata. The procedures addressed here are currently in development at the NEIC, and work is underway to understand how quickly changes from nominal can be observed and used within the NEIC processing framework. The spectral methods and software used to compute station baselines and described herein (PQLX) can be useful to both permanent and portable seismic stations operators. Applications include: general seismic station and data quality control (QC), evaluation of instrument responses, assessment of near real-time communication system performance, characterization of site cultural noise conditions, and evaluation of sensor vault design, as well as assessment of gross network capabilities (McNamara et al. 2005). Future PQLX development plans include incorporating station baselines for automated QC methods and automating station status report generation and notification based on user-defined QC parameters. The PQLX software is available through the USGS (http://earthquake.usgs.gov/research/software/pqlx.php) and IRIS (http://www.iris.edu/software/pqlx/).
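    A station noise baseline of the kind described can be sketched as a low quantile of many power spectral density (PSD) estimates, taken independently in each frequency bin; hourly PSDs that rise well above the baseline then flag out-of-nominal noise. This is a simplified sketch with made-up values (PQLX's actual probability-density-function statistics are more involved):

```python
def noise_baseline(psd_curves, percentile=0.1):
    """Per-frequency-bin baseline: a low quantile of many PSD estimates.

    psd_curves: list of equal-length PSD lists (e.g., dB rel. 1 (m/s^2)^2/Hz),
    one list per observation window. Returns one baseline value per bin.
    """
    baseline = []
    for bin_values in zip(*psd_curves):
        ranked = sorted(bin_values)
        idx = int(percentile * (len(ranked) - 1))
        baseline.append(ranked[idx])
    return baseline

# Three hypothetical hourly PSD estimates over two frequency bins
curves = [[-140, -135], [-138, -136], [-150, -131]]
base = noise_baseline(curves, percentile=0.0)  # per-bin minimum: [-150, -136]
```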

  13. Automated assessment of bilateral breast volume asymmetry as a breast cancer biomarker during mammographic screening

    SciTech Connect

    Williams, Alex C; Hitt, Austin N; Voisin, Sophie; Tourassi, Georgia

    2013-01-01

    The biological concept of bilateral symmetry as a marker of developmental stability and good health is well established. Although most individuals deviate slightly from perfect symmetry, humans are essentially considered bilaterally symmetrical. Consequently, increased fluctuating asymmetry of paired structures could be an indicator of disease. There are several published studies linking bilateral breast size asymmetry with increased breast cancer risk. These studies were based on radiologists' manual measurements of breast size from mammographic images. We aim to develop a computerized technique to assess fluctuating breast volume asymmetry in screening mammograms and investigate whether it correlates with the presence of breast cancer. Using a large database of screening mammograms with known ground truth we applied automated breast region segmentation and automated breast size measurements in CC and MLO views using three well established methods. All three methods confirmed that indeed patients with breast cancer have statistically significantly higher fluctuating asymmetry of their breast volumes. However, statistically significant difference between patients with cancer and benign lesions was observed only for the MLO views. The study suggests that automated assessment of global bilateral asymmetry could serve as a breast cancer risk biomarker for women undergoing mammographic screening. Such biomarker could be used to alert radiologists or computer-assisted detection (CAD) systems to exercise increased vigilance if higher than normal cancer risk is suspected.

  14. An Assessment of the Technology of Automated Rendezvous and Capture in Space

    NASA Technical Reports Server (NTRS)

    Polites, M. E.

    1998-01-01

    This paper presents the results of a study to assess the technology of automated rendezvous and capture (AR&C) in space. The outline of the paper is as follows. First, the history of manual and automated rendezvous and capture and rendezvous and dock is presented. Next, the need for AR&C in space is established. Then, today's technology and ongoing technology efforts related to AR&C in space are reviewed. In light of these, AR&C systems are proposed that meet NASA's future needs, but can be developed in a reasonable amount of time with a reasonable amount of money. Technology plans for developing these systems are presented; cost and schedule are included.

  15. The development of an automated sentence generator for the assessment of reading speed.

    PubMed

    Crossland, Michael D; Legge, Gordon E; Dakin, Steven C

    2008-03-28

    Reading speed is an important outcome measure for many studies in neuroscience and psychology. Conventional reading speed tests have a limited corpus of sentences and usually require observers to read sentences aloud. Here we describe an automated sentence generator which can create over 100,000 unique sentences, scored using a true/false response. We propose that an estimate of the minimum exposure time required for observers to categorise the truth of such sentences is a good alternative to reading speed measures that guarantees comprehension of the printed material. Removing one word from the sentence reduces performance to chance, indicating minimal redundancy. Reading speed assessed using rapid serial visual presentation (RSVP) of these sentences is not statistically different from using MNREAD sentences. The automated sentence generator would be useful for measuring reading speed with button-press response (such as within MRI scanners) and for studies requiring many repeated measures of reading speed.
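    The generator's key idea, combinatorial assembly of sentences whose truth value follows from a category rule, can be sketched with a toy vocabulary. These word lists and the rule are illustrative only; the published generator uses a far larger lexicon to reach over 100,000 unique sentences:

```python
import random

# Toy vocabulary; truth follows from a category rule
# (animals eat food -> true; furniture eats anything -> false).
ANIMALS = ["the dog", "the cat", "the horse"]
FURNITURE = ["the chair", "the table"]
FOODS = ["bread", "apples", "grass"]

def corpus():
    """Yield every subject/object pairing with its truth label."""
    for subj in ANIMALS:
        for obj in FOODS:
            yield f"{subj} eats {obj}", True
    for subj in FURNITURE:
        for obj in FOODS:
            yield f"{subj} eats {obj}", False

sentences = list(corpus())  # 15 unique true/false sentences
sentence, is_true = random.Random(1).choice(sentences)
```

    Corpus size grows multiplicatively with each word list, which is how a modest lexicon yields a very large pool of unique, comprehension-checkable sentences.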

  16. Towards Automating Clinical Assessments: A Survey of the Timed Up and Go (TUG)

    PubMed Central

    Sprint, Gina; Cook, Diane; Weeks, Douglas

    2016-01-01

    Older adults often suffer from functional impairments that affect their ability to perform everyday tasks. To detect the onset and changes in abilities, healthcare professionals administer standardized assessments. Recently, technology has been utilized to complement these clinical assessments to gain a more objective and detailed view of functionality. In the clinic and at home, technology is able to provide more information about patient performance and reduce subjectivity in outcome measures. The timed up and go (TUG) test is one such assessment recently instrumented with technology in several studies, yielding promising results towards the future of automating clinical assessments. Potential benefits of technological TUG implementations include additional performance parameters, generated reports, and the ability to be self-administered in the home. In this paper, we provide an overview of the TUG test and technologies utilized for TUG instrumentation. We then critically review the technological advancements and follow up with an evaluation of the benefits and limitations of each approach. Finally, we analyze the gaps in the implementations and discuss challenges for future research towards automated, self-administered assessment in the home. PMID:25594979

  17. An Automated System for Skeletal Maturity Assessment by Extreme Learning Machines.

    PubMed

    Mansourvar, Marjan; Shamshirband, Shahaboddin; Raj, Ram Gopal; Gunalan, Roshan; Mazinani, Iman

    2015-01-01

    Assessing skeletal age is a subjective and tedious examination process. Hence, automated assessment methods have been developed to replace manual evaluation in medical applications. In this study, a new fully automated method based on content-based image retrieval and using extreme learning machines (ELM) is designed and adapted to assess skeletal maturity. The main novelty of this approach is that it overcomes the segmentation problem suffered by existing systems. The estimation results of ELM models are compared with those of genetic programming (GP) and artificial neural networks (ANNs) models. The experimental results signify improvement in assessment accuracy over GP and ANN, while generalization capability is retained with the ELM approach. Moreover, the results indicate that the ELM model developed can be used confidently in further work on formulating novel models of skeletal age assessment strategies. According to the experimental results, the new presented method has the capacity to learn many hundreds of times faster than traditional learning methods and it has sufficient overall performance in many aspects. It has conclusively been found that applying ELM is particularly promising as an alternative method for evaluating skeletal age. PMID:26402795

  18. A Framework to Automate Assessment of Upper-Limb Motor Function Impairment: A Feasibility Study

    PubMed Central

    Otten, Paul; Kim, Jonghyun; Son, Sang Hyuk

    2015-01-01

    Standard upper-limb motor function impairment assessments, such as the Fugl-Meyer Assessment (FMA), are a critical aspect of rehabilitation after neurological disorders. These assessments typically take a long time (about 30 min for the FMA) for a clinician to perform on a patient, which is a severe burden in a clinical environment. In this paper, we propose a framework for automating upper-limb motor assessments that uses low-cost sensors to collect movement data. The sensor data is then processed through a machine learning algorithm to determine a score for a patient’s upper-limb functionality. To demonstrate the feasibility of the proposed approach, we implemented a system based on the proposed framework that can automate most of the FMA. Our experiment shows that the system provides similar FMA scores to clinician scores, and reduces the time spent evaluating each patient by 82%. Moreover, the proposed framework can be used to implement customized tests or tests specified in other existing standard assessment methods. PMID:26287206

  19. A Framework to Automate Assessment of Upper-Limb Motor Function Impairment: A Feasibility Study.

    PubMed

    Otten, Paul; Kim, Jonghyun; Son, Sang Hyuk

    2015-01-01

    Standard upper-limb motor function impairment assessments, such as the Fugl-Meyer Assessment (FMA), are a critical aspect of rehabilitation after neurological disorders. These assessments typically take a long time (about 30 min for the FMA) for a clinician to perform on a patient, which is a severe burden in a clinical environment. In this paper, we propose a framework for automating upper-limb motor assessments that uses low-cost sensors to collect movement data. The sensor data is then processed through a machine learning algorithm to determine a score for a patient's upper-limb functionality. To demonstrate the feasibility of the proposed approach, we implemented a system based on the proposed framework that can automate most of the FMA. Our experiment shows that the system provides similar FMA scores to clinician scores, and reduces the time spent evaluating each patient by 82%. Moreover, the proposed framework can be used to implement customized tests or tests specified in other existing standard assessment methods. PMID:26287206

  20. A semi-automated micro-method for the histological assessment of fat embolism.

    PubMed

    Busuttil, A; Hanley, J J

    1994-01-01

    A method of quantitatively determining the volume of fat emboli in tissue using an image analysis system (I.B.A.S.) was developed. This procedure is an interactive, semi-automated tool allowing the quick and accurate gathering of large quantities of data from sections of different tissue samples stained with osmium tetroxide. The development of this procedure was aimed at producing a system which is reliable, reproducible and semi-automated, thereby enabling epidemiological and serial studies of large numbers of histological sections from different tissues. The system was tested in a study of tissue sections from a series of fatalities from an aircraft crash, correlating the quantitative presence of fat emboli with the extent and severity of multiple fractures and soft-tissue injuries. PMID:7529546

  1. Interdisciplinary development of manual and automated product usability assessments for older adults with dementia: lessons learned.

    PubMed

    Boger, Jennifer; Taati, Babak; Mihailidis, Alex

    2016-10-01

    The changes in cognitive abilities that accompany dementia can make it difficult to use everyday products that are required to complete activities of daily living. Products that are inherently more usable for people with dementia could facilitate independent activity completion, thus reducing the need for caregiver assistance. The objectives of this research were to: (1) gain an understanding of how water tap design impacted tap usability and (2) create an automated computerized tool that could assess tap usability. 27 older adults, ranging from cognitively intact to those with advanced dementia, completed 1309 trials on five tap designs. Data were manually analyzed to investigate tap usability and were also used to develop an automated usability analysis tool. Researchers collaborated to modify existing techniques and to create novel ones to accomplish both goals. This paper presents lessons learned through the course of this research, which could be applicable in the development of other usability studies, automated vision-based assessments, and assistive technologies for cognitively impaired older adults. Collaborative interdisciplinary teamwork, which included participants who were older adults with dementia, was key to enabling the innovative advances that achieved the project's research goals. Implications for Rehabilitation: Products that are implicitly familiar and usable by older adults could foster independent activity completion, potentially reducing reliance on a caregiver. The computer-based automated tool can significantly reduce the time and effort required to perform product usability analysis, making this type of analysis more feasible. Interdisciplinary collaboration can result in a more holistic understanding of assistive technology research challenges and enable innovative solutions.

  2. Fully automated Liquid Extraction-Based Surface Sampling and Ionization Using a Chip-Based Robotic Nanoelectrospray Platform

    SciTech Connect

    Kertesz, Vilmos; Van Berkel, Gary J

    2010-01-01

    A fully automated liquid extraction-based surface sampling device utilizing an Advion NanoMate chip-based infusion nanoelectrospray ionization system is reported. Analyses were enabled for discrete spot sampling by using the Advanced User Interface of the current commercial control software. This software interface provided the parameter control necessary for the NanoMate robotic pipettor to both form and withdraw a liquid microjunction for sampling from a surface. The system was tested with three types of analytically important sample surface types, viz., spotted sample arrays on a MALDI plate, dried blood spots on paper, and whole-body thin tissue sections from drug dosed mice. The qualitative and quantitative data were consistent with previous studies employing other liquid extraction-based surface sampling techniques. The successful analyses performed here utilized the hardware and software elements already present in the NanoMate system developed to handle and analyze liquid samples. Implementation of an appropriate sample (surface) holder, a solvent reservoir, faster movement of the robotic arm, finer control over solvent flow rate when dispensing and retrieving the solution at the surface, and the ability to select any location on a surface to sample from would improve the analytical performance and utility of the platform.

  3. A New Automated Sample Transfer System for Instrumental Neutron Activation Analysis

    PubMed Central

    Ismail, S. S.

    2010-01-01

    A fully automated and fast pneumatic transport system for short-time activation analysis was recently developed. It is suitable for small nuclear research reactors or laboratories that use neutron generators and other neutron sources. It is equipped with a programmable logic controller, a software package, and 12 devices to facilitate optimal analytical procedures. Only 550 ms were necessary to transfer the irradiated capsule (diameter: 15 mm, length: 50 mm, weight: 4 grams) to the counting chamber at a distance of 20 meters using pressurized air (4 bars) as the transport gas. PMID:20369063
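    The quoted figures imply the capsule's mean transport speed directly; a quick check in Python:

```python
distance_m = 20.0   # reported distance to the counting chamber
transfer_s = 0.550  # reported transfer time (550 ms)

mean_speed = distance_m / transfer_s  # ~36.4 m/s average over the line
```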

  4. High-frequency, long-duration water sampling in acid mine drainage studies: a short review of current methods and recent advances in automated water samplers

    USGS Publications Warehouse

    Chapin, Thomas

    2015-01-01

    Hand-collected grab samples are the most common water sampling method but using grab sampling to monitor temporally variable aquatic processes such as diel metal cycling or episodic events is rarely feasible or cost-effective. Currently available automated samplers are a proven, widely used technology and typically collect up to 24 samples during a deployment. However, these automated samplers are not well suited for long-term sampling in remote areas or in freezing conditions. There is a critical need for low-cost, long-duration, high-frequency water sampling technology to improve our understanding of the geochemical response to temporally variable processes. This review article will examine recent developments in automated water sampler technology and utilize selected field data from acid mine drainage studies to illustrate the utility of high-frequency, long-duration water sampling.

  5. An automated procedure for the simultaneous determination of specific conductance and pH in natural water samples

    USGS Publications Warehouse

    Eradmann, D.E.; Taylor, H.E.

    1978-01-01

    An automated, continuous-flow system is utilized to determine specific conductance and pH simultaneously in natural waters. A direct electrometric procedure is used to determine values in the range pH 4-9. The specific conductance measurements are made with an electronically modified, commercially available conductivity meter interfaced to a separate module containing the readout control devices and printer. The system is designed to switch ranges automatically to accommodate optimum analysis of widely varying conductances ranging from a few µmho cm-1 to 15,000 µmho cm-1. Thirty samples per hour can be analyzed. Comparison of manual and automated procedures for 40 samples showed that the average differences were 1.3% for specific conductance and 0.07 units for pH. The relative standard deviation for 25 replicate values for each of five samples was significantly less than 1% for the specific conductance determination; the standard deviation for the pH determination was ±0.06 pH units. © 1978.
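    The manual-versus-automated comparison reported above reduces to simple paired statistics. A minimal sketch with hypothetical paired conductance readings (not the study's data):

```python
def mean_percent_difference(manual, automated):
    """Average unsigned percent difference, relative to the manual value."""
    diffs = [abs(a - m) / m * 100.0 for m, a in zip(manual, automated)]
    return sum(diffs) / len(diffs)

# Hypothetical paired specific-conductance readings (µmho/cm)
manual = [250.0, 1200.0, 8800.0]
automated = [252.0, 1185.0, 8900.0]
mpd = mean_percent_difference(manual, automated)  # ~1.06%
```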

  6. Sensitivity testing of trypanosome detection by PCR from whole blood samples using manual and automated DNA extraction methods.

    PubMed

    Dunlop, J; Thompson, C K; Godfrey, S S; Thompson, R C A

    2014-11-01

    Automated extraction of DNA for testing of laboratory samples is an attractive alternative to labour-intensive manual methods when higher throughput is required. However, it is important to maintain the maximum detection sensitivity possible to reduce the occurrence of type II errors (false negatives; failure to detect the target when it is present), especially in the biomedical field, where PCR is used for diagnosis. We used blood infected with known concentrations of Trypanosoma copemani to test the impact of analysis techniques on trypanosome detection sensitivity by PCR. We compared combinations of a manual and an automated DNA extraction method and two different PCR primer sets to investigate the impact of each on detection levels. Both extraction techniques and specificity of primer sets had a significant impact on detection sensitivity. Samples extracted using the same DNA extraction technique performed substantially differently for each of the separate primer sets. Type I errors (false positives; detection of the target when it is not present), produced by contaminants, were avoided with both extraction methods. This study highlights the importance of testing laboratory techniques with known samples to optimise accuracy of test results.

  7. SU-E-I-94: Automated Image Quality Assessment of Radiographic Systems Using An Anthropomorphic Phantom

    SciTech Connect

    Wells, J; Wilson, J; Zhang, Y; Samei, E; Ravin, Carl E.

    2014-06-01

    Purpose: In a large, academic medical center, consistent radiographic imaging performance is difficult to routinely monitor and maintain, especially for a fleet consisting of multiple vendors, models, software versions, and numerous imaging protocols. Thus, an automated image quality control methodology has been implemented using routine image quality assessment with a physical, stylized anthropomorphic chest phantom. Methods: The “Duke” Phantom (Digital Phantom 07-646, Supertech, Elkhart, IN) was imaged twice on each of 13 radiographic units from a variety of vendors at 13 primary care clinics. The first acquisition used the clinical PA chest protocol to acquire the post-processed “FOR PRESENTATION” image. The second image was acquired without an antiscatter grid followed by collection of the “FOR PROCESSING” image. Manual CNR measurements were made from the largest and thickest contrast-detail inserts in the lung, heart, and abdominal regions of the phantom in each image. An automated image registration algorithm was used to estimate the CNR of the same insert using similar ROIs. Automated measurements were then compared to the manual measurements. Results: Automatic and manual CNR measurements obtained from “FOR PRESENTATION” images had average percent differences of 0.42%±5.18%, −3.44%±4.85%, and 1.04%±3.15% in the lung, heart, and abdominal regions, respectively; measurements obtained from “FOR PROCESSING” images had average percent differences of −0.63%±6.66%, −0.97%±3.92%, and −0.53%±4.18%, respectively. The maximum absolute difference in CNR was 15.78%, 10.89%, and 8.73% in the respective regions. In addition to CNR assessment of the largest and thickest contrast-detail inserts, the automated method also provided CNR estimates for all 75 contrast-detail inserts in each phantom image. Conclusion: Automated analysis of a radiographic phantom has been shown to be a fast, robust, and objective means for assessing radiographic
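    The contrast-to-noise ratio underlying both the manual and automated measurements can be sketched as follows. This uses one common CNR definition (insert mean minus background mean, divided by background standard deviation); the abstract does not spell out the authors' exact formula, so that definition is an assumption:

```python
import statistics

def cnr(insert_pixels, background_pixels):
    """Contrast-to-noise ratio of a contrast-detail insert.

    Defined here as (mean_insert - mean_background) / stdev_background;
    other CNR variants exist, and the paper's exact definition is assumed.
    """
    contrast = statistics.mean(insert_pixels) - statistics.mean(background_pixels)
    return contrast / statistics.pstdev(background_pixels)

def percent_difference(automated, manual):
    """Percent difference of the automated CNR relative to the manual one."""
    return 100.0 * (automated - manual) / manual
```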

  8. Sample registration software for process automation in the Neutron Activation Analysis (NAA) Facility in Malaysia nuclear agency

    SciTech Connect

    Rahman, Nur Aira Abd; Yussup, Nolida; Ibrahim, Maslina Bt. Mohd; Mokhtar, Mukhlis B.; Soh Shaari, Syirrazie Bin Che; Azman, Azraf B.; Salim, Nazaratul Ashifa Bt. Abdullah; Ismail, Nadiah Binti

    2015-04-29

    Neutron Activation Analysis (NAA) has been established at Nuclear Malaysia since the 1980s. Most of the established procedures, including sample registration, were carried out manually. Samples were recorded manually in a logbook and given an ID number; all samples, standards, SRMs, and blanks were then recorded on the irradiation vial and on several forms prior to irradiation. These manual procedures carried out by the NAA laboratory personnel were time-consuming and inefficient. Sample registration software was developed as part of the IAEA/CRP project ‘Development of Process Automation in the Neutron Activation Analysis (NAA) Facility in Malaysia Nuclear Agency (RC17399)’. The objective of the project is to create PC-based data-entry software for the sample preparation stage. This is an effective way to replace the redundant manual data entries that laboratory personnel would otherwise need to complete. The software automatically generates a sample code for each sample in a batch, creates printable registration forms for administrative purposes, and stores selected parameters that are passed to the sample analysis program. The software is developed using National Instruments LabVIEW 8.6.
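    The automatic per-batch sample-code generation can be sketched in Python (the actual software is written in LabVIEW, and the "NAA-<batch>-<seq>" code format below is invented for illustration):

```python
def generate_sample_codes(batch_id, n_samples, prefix="NAA"):
    """Return one unique, zero-padded code per sample in a batch.

    The code format is hypothetical; the abstract says only that a
    sample code is generated automatically for each sample in a batch.
    """
    return [f"{prefix}-{batch_id}-{i:03d}" for i in range(1, n_samples + 1)]
```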

  9. Sample registration software for process automation in the Neutron Activation Analysis (NAA) Facility in Malaysia nuclear agency

    NASA Astrophysics Data System (ADS)

    Rahman, Nur Aira Abd; Yussup, Nolida; Salim, Nazaratul Ashifa Bt. Abdullah; Ibrahim, Maslina Bt. Mohd; Mokhtar, Mukhlis B.; Soh@Shaari, Syirrazie Bin Che; Azman, Azraf B.; Ismail, Nadiah Binti

    2015-04-01

    Neutron Activation Analysis (NAA) has been established at Nuclear Malaysia since the 1980s. Most of the established procedures, including sample registration, were carried out manually. Samples were recorded manually in a logbook and given an ID number; all samples, standards, SRMs, and blanks were then recorded on the irradiation vial and on several forms prior to irradiation. These manual procedures carried out by the NAA laboratory personnel were time-consuming and inefficient. Sample registration software was developed as part of the IAEA/CRP project `Development of Process Automation in the Neutron Activation Analysis (NAA) Facility in Malaysia Nuclear Agency (RC17399)'. The objective of the project is to create PC-based data-entry software for the sample preparation stage. This is an effective way to replace the redundant manual data entries that laboratory personnel would otherwise need to complete. The software automatically generates a sample code for each sample in a batch, creates printable registration forms for administrative purposes, and stores selected parameters that are passed to the sample analysis program. The software is developed using National Instruments LabVIEW 8.6.

  10. Assessing V and V Processes for Automation with Respect to Vulnerabilities to Loss of Airplane State Awareness

    NASA Technical Reports Server (NTRS)

    Whitlow, Stephen; Wilkinson, Chris; Hamblin, Chris

    2014-01-01

    Automation has contributed substantially to the sustained improvement of aviation safety by minimizing the physical workload of the pilot and increasing operational efficiency. Nevertheless, in complex and highly automated aircraft, automation also has unintended consequences. As systems become more complex and the authority and autonomy (A&A) of the automation increases, human operators become relegated to the role of a system supervisor or administrator, a passive role not conducive to maintaining engagement and airplane state awareness (ASA). The consequence is that flight crews can often come to over-rely on the automation, become less engaged in the human-machine interaction, and lose awareness of the automation mode under which the aircraft is operating. Likewise, the complexity of the system and automation modes may lead to poor understanding of the interaction between a mode of automation and a particular system configuration or phase of flight. These and other examples of mode confusion often lead to mismanaging the aircraft's energy state or the aircraft deviating from the intended flight path. This report examines methods for assessing whether, and how, operational constructs properly assign authority and autonomy in a safe and coordinated manner, with particular emphasis on assuring adequate airplane state awareness by the flight crew and air traffic controllers in off-nominal and/or complex situations.

  11. Non-destructive automated sampling of mycotoxins in bulk food and feed - A new tool for required harmonization.

    PubMed

    Spanjer, M; Stroka, J; Patel, S; Buechler, S; Pittet, A; Barel, S

    2001-06-01

    Mycotoxin contamination is highly non-uniformly distributed, as is well recognized by the EC, which has not only set legal limits for a series of commodities but also scheduled a sampling plan that takes this heterogeneity into account. In practice, however, it turns out to be very difficult to carry out this sampling plan in a harmonised way. Applying the sampling plan to a container filled with pallets of bags (i.e. with nuts or coffee beans) ranges from very laborious to almost impossible. The presented non-destructive automated method to sample bulk food could help to overcome these practical problems and to enforce EC directives. It is derived from a tested and approved technology for detection of illicit substances in security applications. It has the capability to collect and identify ultra-trace contaminants, i.e. from a fingerprint of a chemical substance in a bulk of goods, a cargo pallet load (~ 1000 kg) with boxes and commodities. The technology, patented for explosives detection, uses physical and chemical processes for excitation and remote, rapid, enhanced release of contaminant residues, vapours, and particulates from the inner/outer surfaces of the inspected bulk, and collects them on selective probes. The process is automated, takes only 10 minutes, and is non-destructive; the bulk itself remains unharmed. The system design is based on applicable international regulations for shipped cargo handling and transportation by road, sea and air. After this process the pallet can be loaded on a truck, ship or plane. Analysis can be carried out before the cargo leaves the place of shipping. The potential application of this technology for mycotoxin detection has been demonstrated by preliminary feasibility experiments. Aflatoxins were detected in bulk pistachios and ochratoxin A in bulk green coffee beans. Both commodities were naturally contaminated, previously found and confirmed by the common methods used in routine inspections. 
Once the contaminants are extracted from a

  12. Automated protein hydrolysis delivering sample to a solid acid catalyst for amino acid analysis.

    PubMed

    Masuda, Akiko; Dohmae, Naoshi

    2010-11-01

    In this study, we developed an automatic protein hydrolysis system using strong cation-exchange resins as solid acid catalysts. Examining several kinds of inorganic solid acids and cation-exchange resins, we found that a few cation-exchange resins worked as acid catalysts for protein hydrolysis when heated in the presence of water. The most efficient resin yielded amounts of amino acids that were over 70% of those recovered after conventional hydrolysis with hydrochloric acid and resulted in amino acid compositions matching the theoretical values. The solid-acid hydrolysis was automated by packing the resin into columns, combining the columns with a high-performance liquid chromatography system, and heating them. The amino acids that constitute a protein can thereby be determined, minimizing contamination from the environment.

  13. Automated assessment of noninvasive filling pressure using color Doppler M-mode echocardiography

    NASA Technical Reports Server (NTRS)

    Greenberg, N. L.; Firstenberg, M. S.; Cardon, L. A.; Zuckerman, J.; Levine, B. D.; Garcia, M. J.; Thomas, J. D.

    2001-01-01

    Assessment of left ventricular filling pressure usually requires invasive hemodynamic monitoring to follow the progression of disease or the response to therapy. Previous investigations have shown accurate estimation of wedge pressure using noninvasive Doppler information obtained from the ratio of the wave propagation slope from color M-mode (CMM) images and the peak early diastolic filling velocity from transmitral Doppler images. This study reports an automated algorithm that derives an estimate of wedge pressure based on the spatiotemporal velocity distribution available from digital CMM Doppler images of LV filling.

  14. IntelliCages and automated assessment of learning in group-housed mice

    NASA Astrophysics Data System (ADS)

    Puścian, Alicja; Knapska, Ewelina

    2014-11-01

    IntelliCage is a fully automated, computer-controlled system that can be used for long-term monitoring of the behavior of group-housed mice. Using standardized experimental protocols we can assess cognitive abilities and behavioral flexibility in appetitively and aversively motivated tasks, as well as measure social influences on the subjects' learning. We have also identified groups of neurons within the amygdala that are specifically activated by appetitively and aversively motivated learning, the function of which we plan to investigate optogenetically in the future.

  15. Laboratory and Field Testing of an Automated Atmospheric Particle-Bound Reactive Oxygen Species Sampling-Analysis System

    PubMed Central

    Wang, Yungang; Hopke, Philip K.; Sun, Liping; Chalupa, David C.; Utell, Mark J.

    2011-01-01

    In this study, various laboratory and field tests were performed to develop an effective automated particle-bound ROS sampling-analysis system. The system uses the 2′,7′-dichlorofluorescin (DCFH) fluorescence method as a nonspecific, general indicator of particle-bound ROS. A sharp-cut cyclone and a particle-into-liquid sampler (PILS) were used to collect PM2.5 atmospheric particles into a slurry of DCFH-HRP solution. The laboratory results show that the DCFH and H2O2 standard solutions could be kept at room temperature for at least three and eight days, respectively. The field test in Rochester, NY, shows that the average ROS concentration was 8.3 ± 2.2 nmol of equivalent H2O2 m−3 of air. ROS concentrations were observed to be greater after foggy conditions. This study demonstrates the first practical automated sampling-analysis system to measure this ambient particle component. PMID:21577270

  16. Automated method for simultaneous lead and strontium isotopic analysis applied to rainwater samples and airborne particulate filters (PM10).

    PubMed

    Beltrán, Blanca; Avivar, Jessica; Mola, Montserrat; Ferrer, Laura; Cerdà, Víctor; Leal, Luz O

    2013-09-01

    A new automated, sensitive, and fast system for the simultaneous online isolation and preconcentration of lead and strontium by sorption on a microcolumn packed with Sr-resin using an inductively coupled plasma mass spectrometry (ICP-MS) detector was developed, hyphenating lab-on-valve (LOV) and multisyringe flow injection analysis (MSFIA). Pb and Sr are directly retained on the sorbent column and eluted with a solution of 0.05 mol L(-1) ammonium oxalate. The detection limits achieved were 0.04 ng for lead and 0.03 ng for strontium. Mass calibration curves were used since the proposed system allows the use of different sample volumes for preconcentration. Mass linear working ranges were between 0.13 and 50 ng and 0.1 and 50 ng for lead and strontium, respectively. The repeatability of the method, expressed as RSD, was 2.1% and 2.7% for Pb and Sr, respectively. Environmental samples such as rainwater and airborne particulate (PM10) filters as well as a certified reference material SLRS-4 (river water) were satisfactorily analyzed obtaining recoveries between 90 and 110% for both elements. The main features of the LOV-MSFIA-ICP-MS system proposed are the capability to renew solid phase extraction at will in a fully automated way, the remarkable stability of the column which can be reused up to 160 times, and the potential to perform isotopic analysis.
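    The mass calibration curves mentioned above amount to fitting detector signal against absolute analyte mass (rather than concentration), so that any preconcentrated sample volume can be quantified. A minimal least-squares sketch, with invented data values:

```python
def fit_calibration(masses_ng, signals):
    """Ordinary least-squares fit of signal = slope * mass + intercept."""
    n = len(masses_ng)
    mean_x = sum(masses_ng) / n
    mean_y = sum(signals) / n
    sxx = sum((x - mean_x) ** 2 for x in masses_ng)
    sxy = sum((x - mean_x) * (y - mean_y) for x, y in zip(masses_ng, signals))
    slope = sxy / sxx
    return slope, mean_y - slope * mean_x

def mass_from_signal(signal, slope, intercept):
    """Invert the calibration to obtain the analyte mass of an unknown."""
    return (signal - intercept) / slope
```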

  17. Qualification of an automated device to objectively assess the effect of hair care products on hair shine.

    PubMed

    Hagens, Ralf; Wiersbinski, Tim; Becker, Michael E; Weisshaar, Jürgen; Schreiner, Volker; Wenck, Horst

    2011-01-01

    The authors developed and qualified an automated routine screening tool to quantify hair shine. This tool is able to separately record individual properties of hair shine such as specular reflection and multiple reflection, as well as additional features such as sparkle, parallelism of hair fibers, and hair color, which strongly affect the subjective ranking by individual readers. A side-by-side comparison of different hair care and styling products with regard to hair shine using the automated screening tool in parallel with standard panel assessment showed that the automated system provides an almost identical ranking and the same statistical significances as the panel assessment. Provided stringent stratification of hair fibers for color and parallelism, the automated tool competes favorably with panel assessments of hair shine. In this case, data generated with the opsira Shine-Box are clearly superior over data generated by panel assessment in terms of reliability and repeatability, workload and time consumption, and sensitivity and specificity to detect differences after shampoo, conditioner, and leave-in treatment. The automated tool is therefore well suited to replace standard panel assessments in claim support, at least as a screening tool. A further advantage of the automated system over panel assessments is the fact that absolute numeric values are generated for a given hair care product, whereas panel assessments can only give rankings of a series of hair care products included in the same study. Thus, the absolute numeric data generated with the automated system allow comparison of hair care products between studies or at different time points after treatment.

  18. In vivo assessment of human burn scars through automated quantification of vascularity using optical coherence tomography

    NASA Astrophysics Data System (ADS)

    Liew, Yih Miin; McLaughlin, Robert A.; Gong, Peijun; Wood, Fiona M.; Sampson, David D.

    2013-06-01

    In scars arising from burns, objective assessment of vascularity is important in the early identification of pathological scarring, and in the assessment of progression and treatment response. We demonstrate the first clinical assessment and automated quantification of vascularity in cutaneous burn scars of human patients in vivo that uses optical coherence tomography (OCT). Scar microvasculature was delineated in three-dimensional OCT images using speckle decorrelation. The diameter and area density of blood vessels were automatically quantified. A substantial increase was observed in the measured density of vasculature in hypertrophic scar tissues (38%) when compared against normal, unscarred skin (22%). A proliferation of larger vessels (diameter≥100 μm) was revealed in hypertrophic scarring, which was absent from normal scars and normal skin over the investigated physical depth range of 600 μm. This study establishes the feasibility of this methodology as a means of clinical monitoring of scar progression.
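    Once vessels have been delineated (here, by speckle decorrelation), the area density reduces to the fraction of en-face pixels flagged as vascular. A minimal sketch over a binary vessel mask (the authors' full pipeline, including vessel diameter estimation, is more involved):

```python
def vessel_area_density(mask):
    """Percent of pixels marked as vessel (truthy) in a 2-D binary mask."""
    flat = [bool(p) for row in mask for p in row]
    return 100.0 * sum(flat) / len(flat)
```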

  19. Automated Ground-Water Sampling and Analysis of Hexavalent Chromium using a “Universal” Sampling/Analytical System

    PubMed Central

    Burge, Scott R.; Hoffman, Dave A.; Hartman, Mary J.; Venedam, Richard J.

    2005-01-01

    The capabilities of a “universal platform” for the deployment of analytical sensors in the field for long-term monitoring of environmental contaminants were expanded in this investigation. The platform was previously used to monitor trichloroethene in monitoring wells and at groundwater treatment systems (1,2). The platform was interfaced with chromium (VI) and conductivity analytical systems to monitor shallow wells installed adjacent to the Columbia River at the 100-D Area of the Hanford Site, Washington. A groundwater plume of hexavalent chromium is discharging into the Columbia River through the gravel beds used by spawning salmon. The sampling/analytical platform was deployed for the purpose of collecting data on subsurface hexavalent chromium concentrations at more frequent intervals than was possible with the previous sampling and analysis methods employed at the Site.

  20. Sequential automated fusion/extraction chromatography methodology for the dissolution of uranium in environmental samples for mass spectrometric determination.

    PubMed

    Milliard, Alex; Durand-Jézéquel, Myriam; Larivière, Dominic

    2011-01-17

    An improved methodology has been developed, based on dissolution by automated fusion followed by extraction chromatography, for the detection and quantification of uranium in environmental matrices by mass spectrometry. A rapid fusion protocol (<8 min) was investigated for the complete dissolution of various samples. It could be preceded, if required, by an effective ashing procedure using the M4 fluxer and a newly designed platinum lid. Complete dissolution of the sample was observed and measured using standard reference materials (SRMs), and experimental data show no evidence of cross-contamination of crucibles when LiBO(2)/LiBr melts were used. The use of an M4 fusion unit also improved repeatability in sample preparation over muffle-furnace fusion. Instrumental issues originating from the presence of high salt concentrations in the digestate after lithium metaborate fusion were also mitigated using an extraction chromatography (EXC) protocol aimed at removing lithium and interfering matrix constituents prior to the elution of uranium. The sequential methodology, which can be performed simultaneously on three samples, requires less than 20 min per sample for fusion and separation. It was successfully coupled to inductively coupled plasma mass spectrometry (ICP-MS), achieving detection limits below 100 pg kg(-1) for 5-300 mg of sample. PMID:21167982

  1. ALVEOLAR BREATH SAMPLING AND ANALYSIS IN HUMAN EXPOSURE ASSESSMENT STUDIES

    EPA Science Inventory

    Alveolar breath sampling and analysis can be extremely useful in exposure assessment studies involving volatile organic compounds (VOCs). Over recent years scientists from the EPA's National Exposure Research Laboratory have developed and refined an alveolar breath collection ...

  2. Automating Flood Hazard Mapping Methods for Near Real-time Storm Surge Inundation and Vulnerability Assessment

    NASA Astrophysics Data System (ADS)

    Weigel, A. M.; Griffin, R.; Gallagher, D.

    2015-12-01

    Storm surge has enough destructive power to damage buildings and infrastructure, erode beaches, and threaten human life across large geographic areas, hence posing the greatest threat of all the hurricane hazards. The United States Gulf of Mexico has proven vulnerable to hurricanes as it has been hit by some of the most destructive hurricanes on record. With projected rises in sea level and increases in hurricane activity, there is a need to better understand the associated risks for disaster mitigation, preparedness, and response. GIS has become a critical tool in enhancing disaster planning, risk assessment, and emergency response by communicating spatial information through a multi-layer approach. However, there is a need for a near real-time method of identifying areas with a high risk of being impacted by storm surge. Research was conducted alongside Baron, a private industry weather enterprise, to facilitate automated modeling and visualization of storm surge inundation and vulnerability on a near real-time basis. This research successfully automated current flood hazard mapping techniques using a GIS framework written in a Python programming environment, and displayed resulting data through an Application Program Interface (API). Data used for this methodology included high resolution topography, NOAA Probabilistic Surge model outputs parsed from Rich Site Summary (RSS) feeds, and the NOAA Census tract level Social Vulnerability Index (SoVI). The development process required extensive data processing and management to provide high resolution visualizations of potential flooding and population vulnerability in a timely manner. The accuracy of the developed methodology was assessed using Hurricane Isaac as a case study, which through a USGS and NOAA partnership, contained ample data for statistical analysis. 
This research successfully created a fully automated, near real-time method for mapping high resolution storm surge inundation and vulnerability for the

  3. Sequential sampling: a novel method in farm animal welfare assessment.

    PubMed

    Heath, C A E; Main, D C J; Mullan, S; Haskell, M J; Browne, W J

    2016-02-01

    Lameness in dairy cows is an important welfare issue. As part of a welfare assessment, herd level lameness prevalence can be estimated from scoring a sample of animals, where higher levels of accuracy are associated with larger sample sizes. As the financial cost is related to the number of cows sampled, smaller samples are preferred. Sequential sampling schemes have been used for informing decision making in clinical trials. Sequential sampling involves taking samples in stages, where sampling can stop early depending on the estimated lameness prevalence. When welfare assessment is used for a pass/fail decision, a similar approach could be applied to reduce the overall sample size. The sampling schemes proposed here apply the principles of sequential sampling within a diagnostic testing framework. This study develops three sequential sampling schemes of increasing complexity to classify 80 fully assessed UK dairy farms, each with known lameness prevalence. Using the Welfare Quality herd-size-based sampling scheme, the first 'basic' scheme involves two sampling events. At the first sampling event half the Welfare Quality sample size is drawn, and then depending on the outcome, sampling either stops or is continued and the same number of animals is sampled again. In the second 'cautious' scheme, an adaptation is made to ensure that correctly classifying a farm as 'bad' is done with greater certainty. The third scheme is the only scheme to go beyond lameness as a binary measure and investigates the potential for increasing accuracy by incorporating the number of severely lame cows into the decision. The three schemes are evaluated with respect to accuracy and average sample size by running 100 000 simulations for each scheme, and a comparison is made with the fixed size Welfare Quality herd-size-based sampling scheme. All three schemes performed almost as well as the fixed size scheme but with much smaller average sample sizes. For the third scheme, an overall
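    The 'basic' two-stage scheme described above can be sketched as follows. The sample size, pass/fail threshold, and decision margin are placeholders; the paper derives sample sizes from the Welfare Quality herd-size-based scheme and evaluates decision rules by simulation:

```python
import random

def two_stage_classify(herd, half_n, threshold, margin, rng=random):
    """'Basic' two-stage scheme: score half_n cows; stop early if the
    observed lameness prevalence is decisively below (pass) or above
    (fail) the threshold, otherwise score half_n more and decide on
    the combined estimate.

    herd: list of booleans, True = lame. threshold/margin are
    hypothetical tuning parameters for this sketch.
    """
    first = rng.sample(herd, half_n)
    p1 = sum(first) / half_n
    if p1 <= threshold - margin:
        return "pass", half_n          # clearly below: stop early
    if p1 >= threshold + margin:
        return "fail", half_n          # clearly above: stop early
    second = rng.sample(herd, half_n)  # simplification: may overlap the first draw
    p = (sum(first) + sum(second)) / (2 * half_n)
    return ("fail" if p >= threshold else "pass"), 2 * half_n
```

Average sample size under this rule can then be estimated by repeated simulation, as the study does with 100 000 runs per scheme.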

  4. Development of a Fully Automated Flow Injection Analyzer Implementing Bioluminescent Biosensors for Water Toxicity Assessment

    PubMed Central

    Komaitis, Efstratios; Vasiliou, Efstathios; Kremmydas, Gerasimos; Georgakopoulos, Dimitrios G.; Georgiou, Constantinos

    2010-01-01

    This paper describes the development of an automated Flow Injection analyzer for water toxicity assessment. The analyzer is validated by assessing the toxicity of heavy metal (Pb2+, Hg2+ and Cu2+) solutions. One hundred μL of a Vibrio fischeri suspension are injected into a carrier solution containing different heavy metal concentrations. Biosensor cells are mixed with the toxic carrier solution in the mixing coil on the way to the detector. The registered response is the % inhibition of biosensor bioluminescence due to heavy metal toxicity, relative to that obtained by injecting the Vibrio fischeri suspension into deionised water. Carrier solutions of mercury showed higher toxicity than those of the other heavy metals, and all metals showed concentration-related levels of toxicity. The biosensor’s response to carrier solutions of different pHs was tested; Vibrio fischeri’s bioluminescence is promoted in the pH 5–10 range. The experiments indicate that the whole-cell biosensor, as applied in the automated fluidic system, responds to various toxic solutions. PMID:22163592
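    The registered response can be computed directly from paired luminescence readings; a minimal sketch of the arithmetic (variable names invented):

```python
def percent_inhibition(sample_rlu, control_rlu):
    """Percent inhibition of bioluminescence relative to the control.

    sample_rlu: reading with the biosensor injected into the toxic carrier;
    control_rlu: reading with the biosensor injected into deionised water.
    """
    return 100.0 * (1.0 - sample_rlu / control_rlu)
```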

  5. Use of automated monitoring to assess behavioral toxicology in fish: Linking behavior and physiology

    USGS Publications Warehouse

    Brewer, S.K.; DeLonay, A.J.; Beauvais, S.L.; Little, E.E.; Jones, S.B.

    1999-01-01

    We measured locomotory behaviors (distance traveled, speed, tortuosity of path, and rate of change in direction) with computer-assisted analysis in 30 day posthatch rainbow trout (Oncorhynchus mykiss) exposed to pesticides. We also examined cholinesterase inhibition as a potential endpoint linking physiology and behavior. Sublethal exposure to chemicals often causes changes in swimming behavior, reflecting alterations in sensory and motor systems. Swimming behavior also integrates functions of the nervous system. Rarely are the connections between physiology and behavior made. Although behavior is often suggested as a sensitive, early indicator of toxicity, behavioral toxicology has not been used to its full potential because conventional methods of behavioral assessment have relied on manual techniques, which are often time-consuming and difficult to quantify. This has severely limited the application and utility of behavioral procedures. Swimming behavior is particularly amenable to computerized assessment and automated monitoring. Locomotory responses are sensitive to toxicants and can be easily measured. We briefly discuss the use of behavior in toxicology and automated techniques used in behavioral toxicology. We also describe the system we used to determine locomotory behaviors of fish, and present data demonstrating the system's effectiveness in measuring alterations in response to chemical challenges. Lastly, we correlate behavioral and physiological endpoints.
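    The locomotory endpoints named above (distance traveled, speed, tortuosity of path) can be computed from tracked coordinates. The sketch below uses path straightness (net over total displacement) as a stand-in for tortuosity, since the authors' exact formulation is not given in the abstract:

```python
import math

def path_metrics(points, dt):
    """Distance, mean speed, and straightness from an (x, y) path
    sampled every `dt` seconds. Straightness = net displacement /
    total distance, a common proxy for (inverse) path tortuosity.
    """
    steps = [math.dist(a, b) for a, b in zip(points, points[1:])]
    total = sum(steps)
    net = math.dist(points[0], points[-1])
    speed = total / (dt * len(steps))
    straightness = net / total if total else 1.0
    return total, speed, straightness
```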

  6. Development of a fully automated Flow Injection analyzer implementing bioluminescent biosensors for water toxicity assessment.

    PubMed

    Komaitis, Efstratios; Vasiliou, Efstathios; Kremmydas, Gerasimos; Georgakopoulos, Dimitrios G; Georgiou, Constantinos

    2010-01-01

    This paper describes the development of an automated Flow Injection analyzer for water toxicity assessment. The analyzer is validated by assessing the toxicity of heavy metal (Pb(2+), Hg(2+) and Cu(2+)) solutions. One hundred μL of a Vibrio fischeri suspension are injected into a carrier solution containing different heavy metal concentrations. Biosensor cells are mixed with the toxic carrier solution in the mixing coil on the way to the detector. The registered response is the % inhibition of biosensor bioluminescence due to heavy metal toxicity, relative to that obtained by injecting the Vibrio fischeri suspension into deionised water. Carrier solutions of mercury showed higher toxicity than those of the other heavy metals, and all metals showed concentration-related levels of toxicity. The biosensor's response to carrier solutions of different pHs was tested; Vibrio fischeri's bioluminescence is promoted in the pH 5-10 range. The experiments indicate that the whole-cell biosensor, as applied in the automated fluidic system, responds to various toxic solutions. PMID:22163592

  7. Rapid and automated sample preparation for nucleic acid extraction on a microfluidic CD (compact disk)

    NASA Astrophysics Data System (ADS)

    Kim, Jitae; Kido, Horacio; Zoval, Jim V.; Gagné, Dominic; Peytavi, Régis; Picard, François J.; Bastien, Martine; Boissinot, Maurice; Bergeron, Michel G.; Madou, Marc J.

    2006-01-01

    Rapid and automated preparation of PCR (polymerase chain reaction)-ready genomic DNA was demonstrated on a multiplexed CD (compact disk) platform by using hard-to-lyse bacterial spores. Cell disruption is carried out while bead-cell suspensions are pushed back and forth in center-tapered lysing chambers by angular oscillation of the disk (the keystone effect). During this lysis period, the cell suspensions are securely held within the lysing chambers by heat-activated wax valves. Upon application of remote heat to the disk in motion, the wax valves release lysate solutions into centrifuge chambers where cell debris is separated by an elevated rotation of the disk. Only debris-free DNA extract is then transferred to collection chambers by a capillary-assisted siphon and collected for heating that inactivates PCR inhibitors. Lysing capacity was evaluated using a real-time PCR assay to monitor the efficiency of Bacillus globigii spore lysis. PCR analysis showed that a 5-minute CD lysis run gave spore lysis efficiency similar to that obtained with a popular commercial DNA extraction kit (the IDI-lysis kit from GeneOhm Sciences Inc.), which is highly efficient for microbial cell and spore lysis. This work will contribute to the development of an integrated CD-based assay for rapid diagnosis of infectious diseases.

  8. Automated, Unobtrusive, Action-by-Action Assessment of Self-Regulation during Learning with an Intelligent Tutoring System

    ERIC Educational Resources Information Center

    Aleven, Vincent; Roll, Ido; McLaren, Bruce M.; Koedinger, Kenneth R.

    2010-01-01

    Assessment of students' self-regulated learning (SRL) requires a method for evaluating whether observed actions are appropriate acts of self-regulation in the specific learning context in which they occur. We review research that has resulted in an automated method for context-sensitive assessment of a specific SRL strategy, help seeking while…

  9. The Effects of Finite Sampling on State Assessment Sample Requirements. NAEP Validity Studies. Working Paper Series.

    ERIC Educational Resources Information Center

    Chromy, James R.

    This study addressed statistical techniques that might ameliorate some of the sampling problems currently facing states with small populations participating in State National Assessment of Educational Progress (NAEP) assessments. The study explored how the application of finite population correction factors to the between-school component of…

  10. Automated radioanalytical system incorporating microwave-assisted sample preparation, chemical separation, and online radiometric detection for the monitoring of total 99Tc in nuclear waste processing streams.

    PubMed

    Egorov, Oleg B; O'Hara, Matthew J; Grate, Jay W

    2012-04-01

    An automated fluidic instrument is described that rapidly determines the total (99)Tc content of aged nuclear waste samples, where the matrix is chemically and radiologically complex and the existing speciation of the (99)Tc is variable. The monitor links microwave-assisted sample preparation with an automated anion exchange column separation and detection using a flow-through solid scintillator detector. The sample preparation steps acidify the sample, decompose organics, and convert all Tc species to the pertechnetate anion. The column-based anion exchange procedure separates the pertechnetate from the complex sample matrix, so that radiometric detection can provide accurate measurement of (99)Tc. We developed a preprogrammed spike addition procedure to automatically determine matrix-matched calibration. The overall measurement efficiency that is determined simultaneously provides a self-diagnostic parameter for the radiochemical separation and overall instrument function. Continuous, automated operation was demonstrated over the course of 54 h, which resulted in the analysis of 215 samples plus 54 hourly spike-addition samples, with consistent overall measurement efficiency for the operation of the monitor. A sample can be processed and measured automatically in just 12.5 min with a detection limit of 23.5 Bq/mL of (99)Tc in low activity waste (0.495 mL sample volume), with better than 10% RSD precision at concentrations above the quantification limit. This rapid automated analysis method was developed to support nuclear waste processing operations planned for the Hanford nuclear site.
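The spike-addition calibration described above can be sketched as follows; this is an illustrative reconstruction under assumed variable names and example numbers, not the instrument's actual control logic:

```python
def measurement_efficiency(net_cps_spiked, net_cps_unspiked, spike_bq):
    """Overall measurement efficiency (counts/s per Bq) recovered from a
    preprogrammed spike (standard) addition in the same sample matrix."""
    return (net_cps_spiked - net_cps_unspiked) / spike_bq

def tc99_concentration(net_cps, efficiency, sample_volume_ml):
    """99Tc concentration (Bq/mL) using the spike-derived, matrix-matched
    efficiency and the processed sample volume."""
    return net_cps / efficiency / sample_volume_ml

# Hypothetical run: 50 cps net from the sample, +100 cps from a 200 Bq spike
eff = measurement_efficiency(150.0, 50.0, 200.0)        # 0.5 counts per decay
conc = tc99_concentration(50.0, eff, 0.495)             # Bq/mL in 0.495 mL
```

A drop in the recovered efficiency between runs would flag a separation or detector problem, which is the self-diagnostic role the abstract describes.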

  11. Automated Radioanalytical System Incorporating Microwave-Assisted Sample Preparation, Chemical Separation, and Online Radiometric Detection for the Monitoring of Total 99Tc in Nuclear Waste Processing Streams

    SciTech Connect

    Egorov, Oleg; O'Hara, Matthew J.; Grate, Jay W.

    2012-04-03

    An automated fluidic instrument is described that rapidly determines the total 99Tc content of aged nuclear waste samples, where the matrix is chemically and radiologically complex and the existing speciation of the 99Tc is variable. The monitor links microwave-assisted sample preparation with an automated anion exchange column separation and detection using a flow-through solid scintillator detector. The sample preparation steps acidify the sample, decompose organics, and convert all Tc species to the pertechnetate anion. The column-based anion exchange procedure separates the pertechnetate from the complex sample matrix, so that radiometric detection can provide accurate measurement of 99Tc. We developed a preprogrammed spike addition procedure to automatically determine matrix-matched calibration. The overall measurement efficiency that is determined simultaneously provides a self-diagnostic parameter for the radiochemical separation and overall instrument function. Continuous, automated operation was demonstrated over the course of 54 h, which resulted in the analysis of 215 samples plus 54 hourly spike-addition samples, with consistent overall measurement efficiency for the operation of the monitor. A sample can be processed and measured automatically in just 12.5 min with a detection limit of 23.5 Bq/mL of 99Tc in low activity waste (0.495 mL sample volume), with better than 10% RSD precision at concentrations above the quantification limit. This rapid automated analysis method was developed to support nuclear waste processing operations planned for the Hanford nuclear site.

  12. High-throughput automated image analysis of neuroinflammation and neurodegeneration enables quantitative assessment of virus neurovirulence

    PubMed Central

    Maximova, Olga A.; Murphy, Brian R.; Pletnev, Alexander G.

    2010-01-01

    Historically, the safety of live attenuated vaccine candidates against neurotropic viruses was assessed by semi-quantitative analysis of virus-induced histopathology in the central nervous system of monkeys. We have developed a high-throughput automated image analysis (AIA) for the quantitative assessment of virus-induced neuroinflammation and neurodegeneration. Evaluation of the results generated by AIA showed that quantitative estimates of lymphocytic infiltration, microglial activation, and neurodegeneration strongly and significantly correlated with results of traditional histopathological scoring. In addition, we show that AIA is a targeted, objective, accurate, and time-efficient approach that provides reliable differentiation of virus neurovirulence. As such, it may become a useful tool in establishing consistent analytical standards across research and development laboratories and regulatory agencies, and may improve the safety evaluation of live virus vaccines. The implementation of this high-throughput AIA will markedly advance many fields of research including virology, neuroinflammation, neuroscience, and vaccinology. PMID:20688036

  13. Performance on the Defense Automated Neurobehavioral Assessment across controlled environmental conditions.

    PubMed

    Haran, F Jay; Dretsch, Michael N; Bleiberg, Joseph

    2016-01-01

    Neurocognitive assessment tools (NCAT) are commonly used to screen for changes in cognitive functioning following a mild traumatic brain injury and to assist with a return to duty decision. As such, it is critical to determine if performance on the Defense Automated Neurobehavioral Assessment (DANA) is adversely affected by operationally-relevant field environments. Differences in DANA performance between a thermoneutral environment and three simulated operationally-relevant field environments across the thermal stress continuum were calculated for 16 healthy U.S. Navy service members. Practice effects associated with brief test-retest intervals were calculated within each environmental condition. There were no significant differences between the simulated environmental conditions suggesting that performance on the DANA Brief is not impacted by thermal stress. Additionally, there were no significant differences in performance within each simulated environmental condition associated with repeated administrations. PMID:27182844

  14. Automated Broad-Range Molecular Detection of Bacteria in Clinical Samples.

    PubMed

    Budding, Andries E; Hoogewerf, Martine; Vandenbroucke-Grauls, Christina M J E; Savelkoul, Paul H M

    2016-04-01

    Molecular detection methods, such as quantitative PCR (qPCR), have found their way into clinical microbiology laboratories for the detection of an array of pathogens. Most routinely used methods, however, are directed at specific species. Thus, anything that is not explicitly searched for will be missed. This greatly limits the flexibility and universal application of these techniques. We investigated the application of a rapid universal bacterial molecular identification method, IS-pro, to routine patient samples received in a clinical microbiology laboratory. IS-pro is a eubacterial technique based on the detection and categorization of 16S-23S rRNA gene interspace regions with lengths that are specific for each microbial species. As this is an open technique, clinicians do not need to decide in advance what to look for. We compared routine culture to IS-pro using 66 samples sent in for routine bacterial diagnostic testing. The samples were obtained from patients with infections in normally sterile sites (without a resident microbiota). The results were identical in 20 (30%) samples, IS-pro detected more bacterial species than culture in 31 (47%) samples, and five of the 10 culture-negative samples were positive with IS-pro. The case histories of the five patients from whom these culture-negative/IS-pro-positive samples were obtained suggest that the IS-pro findings are highly clinically relevant. Our findings indicate that an open molecular approach, such as IS-pro, may have a high added value for clinical practice.
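The length-based categorization principle behind IS-pro can be illustrated with a toy lookup; the length ranges below are invented for illustration only and are not real IS-pro calibration data:

```python
# Invented interspace-length ranges (bp) -- NOT actual IS-pro calibration data.
SPECIES_BY_LENGTH = {
    (480, 490): "Species A",
    (520, 535): "Species B",
}

def classify(fragment_length):
    """Assign a 16S-23S rRNA interspace fragment length to a species
    category, mirroring the open, length-specific detection idea."""
    for (low, high), species in SPECIES_BY_LENGTH.items():
        if low <= fragment_length <= high:
            return species
    return "unidentified"
```

Because the lookup is open-ended, any fragment within a known range is reported, without the clinician pre-selecting targets.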

  15. Monitoring cognitive function and need with the automated neuropsychological assessment metrics in Decompression Sickness (DCS) research

    NASA Technical Reports Server (NTRS)

    Nesthus, Thomas E.; Schiflett, Sammuel G.

    1993-01-01

    Hypobaric decompression sickness (DCS) research presents the medical monitor with the difficult task of assessing the onset and progression of DCS largely on the basis of subjective symptoms. Even with the introduction of precordial Doppler ultrasound techniques for the detection of venous gas emboli (VGE), correct prediction of DCS can be made only about 65 percent of the time according to data from the Armstrong Laboratory's (AL's) hypobaric DCS database. An AL research protocol concerned with exercise and its effects on denitrogenation efficiency includes implementation of a performance assessment test battery to evaluate cognitive functioning during a 4-h simulated 30,000 ft (9144 m) exposure. Information gained from such a test battery may assist the medical monitor in identifying early signs of DCS and subtle neurologic dysfunction related to cases of asymptomatic, but advanced, DCS. This presentation concerns the selection and integration of a test battery and the timely graphic display of subject test results for the principal investigator and medical monitor. A subset of the Automated Neuropsychological Assessment Metrics (ANAM) developed through the Office of Military Performance Assessment Technology (OMPAT) was selected. The ANAM software provides a library of simple tests designed for precise measurement of processing efficiency in a variety of cognitive domains. For our application and time constraints, two tests requiring high levels of cognitive processing and memory were chosen along with one test requiring fine psychomotor performance. Accuracy, speed, and processing throughput variables, as well as RMS error, were collected. An automated mood survey provided 'state' information on six scales including anger, happiness, fear, depression, activity, and fatigue. An integrated and interactive LOTUS 1-2-3 macro was developed to import and display past and present task performance and mood-change information.

  16. Investigation of Mercury Wet Deposition Physicochemistry in the Ohio River Valley through Automated Sequential Sampling

    EPA Science Inventory

    Intra-storm variability and soluble fractionation was explored for summer-time rain events in Steubenville, Ohio to evaluate the physical processes controlling mercury (Hg) in wet deposition in this industrialized region. Comprehensive precipitation sample collection was conducte...

  17. Automated sample preparation station for studying self-diffusion in porous solids with NMR spectroscopy

    NASA Astrophysics Data System (ADS)

    Hedin, Niklas; DeMartin, Gregory J.; Reyes, Sebastián C.

    2006-03-01

    In studies of gas diffusion in porous solids with nuclear magnetic resonance (NMR) spectroscopy the sample preparation procedure becomes very important. An apparatus is presented here that pretreats the sample ex situ and accurately sets the desired pressure and temperature within the NMR tube prior to its introduction in the spectrometer. The gas manifold that supplies the NMR tube is also connected to a microbalance containing another portion of the same sample, which is kept at the same temperature as the sample in the NMR tube. This arrangement permits the simultaneous measurement of the adsorption loading on the sample, which is required for the interpretation of the NMR diffusion experiments. Furthermore, to ensure a good seal of the NMR tube, a hybrid valve design composed of titanium, a Teflon® seat, and Kalrez® O-rings is utilized. A computer controlled algorithm ensures the accuracy and reproducibility of all the procedures, enabling the NMR diffusion experiments to be performed at well controlled conditions of pressure, temperature, and amount of gas adsorbed on the porous sample.

  18. Automated total and radioactive strontium separation and preconcentration in samples of environmental interest exploiting a lab-on-valve system.

    PubMed

    Rodríguez, Rogelio; Avivar, Jessica; Ferrer, Laura; Leal, Luz O; Cerdà, Victor

    2012-07-15

    A novel lab-on-valve system has been developed for strontium determination in environmental samples. The miniaturized lab-on-valve system can accommodate a wide range of chemical and physical processes, including fluidic and microcarrier-bead control, homogeneous reactions, and liquid-solid interactions. A rapid, inexpensive and fully automated method for the separation and preconcentration of total and radioactive strontium, using a solid phase extraction material (Sr-Resin), has been developed. Total strontium concentrations are determined by ICP-OES and (90)Sr activities by a low background proportional counter. The method has been successfully applied to different water samples of environmental interest. The proposed system minimizes sample handling, drastically reduces reagent volume, improves reproducibility and sample throughput, and attains a significant decrease in both time and cost per analysis. The LLD reached for total Sr is 1.8 ng, and the minimum detectable activity for (90)Sr is 0.008 Bq. The repeatability of the separation procedure is 1.2% (n=10). PMID:22817934

  19. Detection of motile micro-organisms in biological samples by means of a fully automated image processing system

    NASA Astrophysics Data System (ADS)

    Alanis, Elvio; Romero, Graciela; Alvarez, Liliana; Martinez, Carlos C.; Hoyos, Daniel; Basombrio, Miguel A.

    2001-08-01

    A fully automated image processing system for the detection of motile microorganisms in biological samples is presented. The system is specifically calibrated for determining the concentration of Trypanosoma cruzi parasites in blood samples of mice infected with Chagas disease. The method can be adapted for use in other biological samples. A thin layer of blood infected by T. cruzi parasites is examined in a common microscope in which the images of the vision field are taken by a CCD camera and temporarily stored in the computer memory. In a typical field, a few motile parasites are observable surrounded by blood red cells. The parasites have low contrast and are thus difficult to detect visually, but their great motility betrays their presence through the movement of the nearest-neighbor red cells. Several consecutive images of the same field are taken, decorrelated with each other where parasites are present, and digitally processed in order to measure the number of parasites present in the field. Several fields are sequentially processed in the same fashion, displacing the sample by means of step motors driven by the computer. A direct advantage of this system is that its results are more reliable and the process is less time consuming than the current subjective evaluations made visually by technicians.
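The decorrelation idea (static red cells cancel between consecutive frames, pixels stirred by parasite motion do not) can be sketched with simple frame differencing. This is an illustrative stand-in for the concept, not the authors' actual pipeline:

```python
import numpy as np

def motion_mask(frames, threshold):
    """Accumulate absolute frame-to-frame differences: static background
    cancels out, while regions disturbed by motion exceed the threshold."""
    diffs = [np.abs(frames[i + 1].astype(int) - frames[i].astype(int))
             for i in range(len(frames) - 1)]
    return np.sum(diffs, axis=0) > threshold

# Synthetic frames: identical low-contrast background, one small moving patch.
rng = np.random.default_rng(0)
background = rng.integers(0, 50, size=(64, 64))
frame1 = background.copy()
frame2 = background.copy()
frame2[30:34, 30:34] += 120  # simulated local motion between frames
mask = motion_mask([frame1, frame2], threshold=60)
```

Counting connected regions in `mask` across many fields would then give a per-sample parasite estimate.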

  20. Automated Broad-Range Molecular Detection of Bacteria in Clinical Samples

    PubMed Central

    Hoogewerf, Martine; Vandenbroucke-Grauls, Christina M. J. E.; Savelkoul, Paul H. M.

    2016-01-01

    Molecular detection methods, such as quantitative PCR (qPCR), have found their way into clinical microbiology laboratories for the detection of an array of pathogens. Most routinely used methods, however, are directed at specific species. Thus, anything that is not explicitly searched for will be missed. This greatly limits the flexibility and universal application of these techniques. We investigated the application of a rapid universal bacterial molecular identification method, IS-pro, to routine patient samples received in a clinical microbiology laboratory. IS-pro is a eubacterial technique based on the detection and categorization of 16S-23S rRNA gene interspace regions with lengths that are specific for each microbial species. As this is an open technique, clinicians do not need to decide in advance what to look for. We compared routine culture to IS-pro using 66 samples sent in for routine bacterial diagnostic testing. The samples were obtained from patients with infections in normally sterile sites (without a resident microbiota). The results were identical in 20 (30%) samples, IS-pro detected more bacterial species than culture in 31 (47%) samples, and five of the 10 culture-negative samples were positive with IS-pro. The case histories of the five patients from whom these culture-negative/IS-pro-positive samples were obtained suggest that the IS-pro findings are highly clinically relevant. Our findings indicate that an open molecular approach, such as IS-pro, may have a high added value for clinical practice. PMID:26763956

  1. Low-Cost 3D Printers Enable High-Quality and Automated Sample Preparation and Molecular Detection

    PubMed Central

    Chan, Kamfai; Coen, Mauricio; Hardick, Justin; Gaydos, Charlotte A.; Wong, Kah-Yat; Smith, Clayton; Wilson, Scott A.; Vayugundla, Siva Praneeth; Wong, Season

    2016-01-01

    Most molecular diagnostic assays require upfront sample preparation steps to isolate the target’s nucleic acids, followed by its amplification and detection using various nucleic acid amplification techniques. Because molecular diagnostic methods are generally rather difficult to perform manually without highly trained users, automated and integrated systems are highly desirable but too costly for use at point-of-care or low-resource settings. Here, we showcase the development of a low-cost and rapid nucleic acid isolation and amplification platform by modifying entry-level 3D printers that cost between $400 and $750. Our modifications consisted of replacing the extruder with a tip-comb attachment that houses magnets to conduct magnetic particle-based nucleic acid extraction. We then programmed the 3D printer to conduct motions that can perform high-quality extraction protocols. Up to 12 samples can be processed simultaneously in under 13 minutes and the efficiency of nucleic acid isolation matches well against gold-standard spin-column-based extraction technology. Additionally, we used the 3D printer’s heated bed to supply heat to perform water bath-based polymerase chain reactions (PCRs). Using another attachment to hold PCR tubes, the 3D printer was programmed to automate the process of shuttling PCR tubes between water baths. By eliminating the temperature ramping needed in most commercial thermal cyclers, the run time of a 35-cycle PCR protocol was shortened by 33%. This article demonstrates that for applications in resource-limited settings, expensive nucleic acid extraction devices and thermal cyclers that are used in many central laboratories can be potentially replaced by a device modified from inexpensive entry-level 3D printers. PMID:27362424
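The reported 33% run-time reduction follows directly from removing per-cycle temperature ramps. A back-of-envelope sketch with assumed (not published) per-cycle timings that happen to reproduce the same fraction:

```python
def pcr_runtime_min(cycles, hold_s_per_cycle, ramp_s_per_cycle):
    """Total PCR run time in minutes: per-cycle temperature holds plus
    ramp transitions."""
    return cycles * (hold_s_per_cycle + ramp_s_per_cycle) / 60.0

# Assumed timings: 60 s of holds and 30 s of ramping per cycle on a
# conventional cycler; shuttling tubes between pre-heated water baths
# effectively removes the ramping term.
with_ramping = pcr_runtime_min(35, 60, 30)     # conventional thermal cycler
without_ramping = pcr_runtime_min(35, 60, 0)   # water-bath shuttling
savings = 1 - without_ramping / with_ramping   # fractional run-time reduction
```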

  2. Low-Cost 3D Printers Enable High-Quality and Automated Sample Preparation and Molecular Detection.

    PubMed

    Chan, Kamfai; Coen, Mauricio; Hardick, Justin; Gaydos, Charlotte A; Wong, Kah-Yat; Smith, Clayton; Wilson, Scott A; Vayugundla, Siva Praneeth; Wong, Season

    2016-01-01

    Most molecular diagnostic assays require upfront sample preparation steps to isolate the target's nucleic acids, followed by its amplification and detection using various nucleic acid amplification techniques. Because molecular diagnostic methods are generally rather difficult to perform manually without highly trained users, automated and integrated systems are highly desirable but too costly for use at point-of-care or low-resource settings. Here, we showcase the development of a low-cost and rapid nucleic acid isolation and amplification platform by modifying entry-level 3D printers that cost between $400 and $750. Our modifications consisted of replacing the extruder with a tip-comb attachment that houses magnets to conduct magnetic particle-based nucleic acid extraction. We then programmed the 3D printer to conduct motions that can perform high-quality extraction protocols. Up to 12 samples can be processed simultaneously in under 13 minutes and the efficiency of nucleic acid isolation matches well against gold-standard spin-column-based extraction technology. Additionally, we used the 3D printer's heated bed to supply heat to perform water bath-based polymerase chain reactions (PCRs). Using another attachment to hold PCR tubes, the 3D printer was programmed to automate the process of shuttling PCR tubes between water baths. By eliminating the temperature ramping needed in most commercial thermal cyclers, the run time of a 35-cycle PCR protocol was shortened by 33%. This article demonstrates that for applications in resource-limited settings, expensive nucleic acid extraction devices and thermal cyclers that are used in many central laboratories can be potentially replaced by a device modified from inexpensive entry-level 3D printers.

  4. Automation of preparation of nonmetallic samples for analysis by atomic absorption and inductively coupled plasma spectrometry

    NASA Technical Reports Server (NTRS)

    Wittmann, A.; Willay, G.

    1986-01-01

    For rapid preparation of solutions intended for analysis by inductively coupled plasma emission spectrometry or atomic absorption spectrometry, an automatic device called Plasmasol was developed. This apparatus uses the non-wettability of glassy carbon to fuse the sample in an appropriate flux. The sample-flux mixture is placed in a composite crucible, heated at high temperature, swirled until full dissolution is achieved, and then poured into a water-filled beaker. After acid addition, dissolution of the melt, and filling to the mark, the solution is ready for analysis. The analytical results obtained, whether for oxide samples or for prereduced iron ores, show that solutions prepared with this device are indistinguishable from those obtained by manual dissolution via acid digestion or high-temperature fusion. Preparation reproducibility and analytical tests illustrate the performance of Plasmasol.

  5. Rapid habitability assessment of Mars samples by pyrolysis-FTIR

    NASA Astrophysics Data System (ADS)

    Gordon, Peter R.; Sephton, Mark A.

    2016-02-01

    Pyrolysis Fourier transform infrared spectroscopy (pyrolysis FTIR) is a potential sample selection method for Mars Sample Return missions. FTIR spectroscopy can be performed on solid and liquid samples but also on gases following preliminary thermal extraction, pyrolysis or gasification steps. The detection of hydrocarbon and non-hydrocarbon gases can reveal information on sample mineralogy and past habitability of the environment in which the sample was created. The absorption of IR radiation at specific wavenumbers by organic functional groups can indicate the presence and type of any organic matter present. Here we assess the utility of pyrolysis-FTIR to release water, carbon dioxide, sulfur dioxide and organic matter from Mars relevant materials to enable a rapid habitability assessment of target rocks for sample return. For our assessment a range of minerals were analyzed by attenuated total reflectance FTIR. Subsequently, the mineral samples were subjected to single step pyrolysis and multi step pyrolysis and the products characterised by gas phase FTIR. Data from both single step and multi step pyrolysis-FTIR provide the ability to identify minerals that reflect habitable environments through their water and carbon dioxide responses. Multi step pyrolysis-FTIR can be used to gain more detailed information on the sources of the liberated water and carbon dioxide owing to the characteristic decomposition temperatures of different mineral phases. Habitation can be suggested when pyrolysis-FTIR indicates the presence of organic matter within the sample. Pyrolysis-FTIR, therefore, represents an effective method to assess whether Mars Sample Return target rocks represent habitable conditions and potential records of habitation and can play an important role in sample triage operations.
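The functional-group/wavenumber reasoning above can be sketched as a peak-assignment table. The band centres are approximate literature values and the matching logic is an illustration, not the authors' algorithm:

```python
# Approximate gas-phase IR band centres (cm^-1) for the volatiles named above;
# values are literature approximations used for illustration only.
BANDS = {
    "H2O": 3657,   # O-H stretch
    "CO2": 2349,   # asymmetric stretch
    "SO2": 1362,   # asymmetric stretch
}

def assign_peaks(peaks_cm1, tolerance=25):
    """Match observed gas-phase FTIR peak positions to candidate evolved
    gases within a wavenumber tolerance."""
    hits = []
    for peak in peaks_cm1:
        for gas, centre in BANDS.items():
            if abs(peak - centre) <= tolerance:
                hits.append((peak, gas))
    return hits
```

Tracking which gases appear at each step of a multi-step pyrolysis run is what lets the mineral sources of water and carbon dioxide be distinguished.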

  6. Automated Lung Segmentation and Image Quality Assessment for Clinical 3-D/4-D-Computed Tomography

    PubMed Central

    Li, Guang

    2014-01-01

    4-D-computed tomography (4DCT) provides not only a new dimension of patient-specific information for radiation therapy planning and treatment, but also a challenging scale of data volume to process and analyze. Manual analysis using existing 3-D tools cannot keep up with the vastly increased 4-D data volume; automated processing and analysis are thus needed to handle 4DCT data effectively and efficiently. In this paper, we applied ideas and algorithms from image/signal processing, computer vision, and machine learning to 4DCT lung data so that lungs can be reliably segmented in a fully automated manner, lung features can be visualized and measured on the fly via user interactions, and data quality classifications can be computed robustly. Comparisons of our results with an established treatment planning system and calculations by experts demonstrated negligible discrepancies (within ±2%) for volume assessment but one to two orders of magnitude performance enhancement. An empirical Fourier-analysis-based quality measure delivered performance closely emulating human experts. Three machine learners are inspected to justify the viability of machine learning techniques used to robustly identify the data quality of 4DCT images in a scalable manner. The resultant system provides a toolkit that speeds up 4-D tasks in the clinic and facilitates clinical research to improve current clinical practice. PMID:25621194
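As a rough illustration of a Fourier-based image-quality metric (not the paper's exact empirical measure), one can score the fraction of spectral power at high spatial frequencies, which differs between sharp and blurred or artifact-laden slices:

```python
import numpy as np

def high_frequency_fraction(image, cutoff=0.25):
    """Toy quality score: fraction of 2-D spectral power beyond a
    normalized radial frequency cutoff."""
    spectrum = np.abs(np.fft.fftshift(np.fft.fft2(image))) ** 2
    h, w = image.shape
    fy = (np.arange(h) - h // 2) / (h / 2)   # normalized vertical frequency
    fx = (np.arange(w) - w // 2) / (w / 2)   # normalized horizontal frequency
    r = np.sqrt(fy[:, None] ** 2 + fx[None, :] ** 2)
    return spectrum[r > cutoff].sum() / spectrum.sum()
```

A constant (featureless) slice scores near zero, while noisy or detail-rich slices score much higher; a classifier can then threshold or learn on such scores.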

  7. Automated radioanalytical system for the determination of 90Sr in environmental water samples by 90Y Cherenkov radiation counting.

    PubMed

    O'Hara, Matthew J; Burge, Scott R; Grate, Jay W

    2009-02-01

    Strontium-90 is an environmental contaminant at several U.S. Department of Energy sites, including the Hanford site, Washington. Due to its high biological toxicity and moderately long half-life of approximately 29 years, groundwater and surface water contamination plumes containing 90Sr must be closely monitored. The highly energetic beta radiation from the short-lived 90Y daughter of 90Sr generates Cherenkov photons in aqueous media that can be detected by photomultiplier tubes with good sensitivity, without the use of scintillation cocktails. A laboratory-based automated fluid handling system coupled to a Cherenkov radiation detector for measuring 90Sr via the high-energy beta decay of its daughter, 90Y, has been assembled and tested using standards prepared in Hanford groundwater. A SuperLig 620 column in the system enables preconcentration and separation of 90Sr from matrix and radiological interferences and, by removing the 90Y present in the sample, creates a pure 90Sr source from which subsequent 90Y ingrowth can be measured. This 90Y is fluidically transferred from the column to the Cherenkov detection flow cell for quantification and calculation of the original 90Sr concentration. Preconcentrating 0.35 L sample volumes by this approach, we have demonstrated a detection limit of 0.057 Bq/L using a 5 mL volume Cherenkov flow cell, which is below the drinking water limit of 0.30 Bq/L. This method does not require that the sample be at secular equilibrium prior to measurement. The system can also deliver water samples directly to the counting cell for analysis without preconcentration or separation, assuming that the sample is in secular equilibrium, with a detection limit of 7 Bq/L. The performance of the analysis method using a preconcentrating separation column is characterized in detail and compared with direct counting. This method is proposed as the basis for an automated fluidic monitor for 90Sr for unattended at-site operation.
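The 90Y-ingrowth calculation implied above follows from the daughter ingrowth law: after a clean separation, A_Y(t) = A_Sr(1 - exp(-λ_Y t)), with 90Sr decay negligible over hours. A minimal sketch using an approximate half-life constant:

```python
import math

Y90_HALF_LIFE_H = 64.1  # approximate 90Y half-life in hours

def sr90_activity(y90_bq, hours_since_separation):
    """Infer 90Sr activity (Bq) from the 90Y activity regrown in the time
    since 90Y was stripped from the purified 90Sr source."""
    lam = math.log(2) / Y90_HALF_LIFE_H
    return y90_bq / (1.0 - math.exp(-lam * hours_since_separation))
```

This is why the column-based method does not require the sample to be at secular equilibrium: the ingrowth clock starts at the separation step.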

  8. Toxicity assessment of ionic liquids with Vibrio fischeri: an alternative fully automated methodology.

    PubMed

    Costa, Susana P F; Pinto, Paula C A G; Lapa, Rui A S; Saraiva, M Lúcia M F S

    2015-03-01

    A fully automated Vibrio fischeri methodology based on sequential injection analysis (SIA) has been developed. The methodology was based on the aspiration of 75 μL of bacteria and 50 μL of inhibitor followed by measurement of the luminescence of bacteria. The assays were conducted for contact times of 5, 15, and 30 min, by means of three mixing chambers that ensured adequate mixing conditions. The optimized methodology provided a precise control of the reaction conditions which is an asset for the analysis of a large number of samples. The developed methodology was applied to the evaluation of the impact of a set of ionic liquids (ILs) on V. fischeri and the results were compared with those provided by a conventional assay kit (Biotox(®)). The collected data evidenced the influence of different cation head groups and anion moieties on the toxicity of ILs. Generally, aromatic cations and fluorine-containing anions displayed higher impact on V. fischeri, evidenced by lower EC50. The proposed methodology was validated through statistical analysis which demonstrated a strong positive correlation (P>0.98) between assays. It is expected that the automated methodology can be tested for more classes of compounds and used as alternative to microplate based V. fischeri assay kits.
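EC50 values such as those compared above can be interpolated from a measured dose-response series. The helper and data below are illustrative only, not the study's fitting procedure:

```python
import math

def ec50(concentrations, inhibitions):
    """Interpolate the concentration giving 50% inhibition from an
    increasing dose-response series (log-linear between bracketing points)."""
    points = list(zip(concentrations, inhibitions))
    for (c1, i1), (c2, i2) in zip(points, points[1:]):
        if i1 <= 50 <= i2:
            frac = (50 - i1) / (i2 - i1)
            log_c = math.log10(c1) + frac * (math.log10(c2) - math.log10(c1))
            return 10 ** log_c
    raise ValueError("50% inhibition not bracketed by the data")

# Illustrative series (mg/L vs % inhibition), not data from the paper:
value = ec50([1, 10, 100], [20, 40, 80])
```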

  9. [Automated serial diagnosis of donor blood samples. Ergonomic and economic organization structure].

    PubMed

    Stoll, T; Fischer-Fröhlich, C L; Mayer, G; Hanfland, P

    1990-01-01

    A comprehensive computer-aided administration system for blood donors is presented. Encoded information on barcode labels enables automatic yet selective pipetting of samples by pipetting robots. Analysis results are transferred automatically to a host computer to update a donor database.

  10. Development of an Automated and Sensitive Microfluidic Device for Capturing and Characterizing Circulating Tumor Cells (CTCs) from Clinical Blood Samples.

    PubMed

    Gogoi, Priya; Sepehri, Saedeh; Zhou, Yi; Gorin, Michael A; Paolillo, Carmela; Capoluongo, Ettore; Gleason, Kyle; Payne, Austin; Boniface, Brian; Cristofanilli, Massimo; Morgan, Todd M; Fortina, Paolo; Pienta, Kenneth J; Handique, Kalyan; Wang, Yixin

    2016-01-01

    Current analysis of circulating tumor cells (CTCs) is hindered by sub-optimal sensitivity and specificity of devices or assays, as well as a lack of capability for characterizing CTCs with clinical biomarkers. Here, we validate a novel technology to enrich and characterize CTCs from blood samples of patients with metastatic breast, prostate and colorectal cancers using a microfluidic chip that is processed by an automated staining and scanning system from sample preparation to image processing. The Celsee system allowed for the detection of CTCs with apparent high sensitivity and specificity (94% sensitivity and 100% specificity). Moreover, the system facilitated rapid capture of CTCs from blood samples and also allowed for downstream characterization of the captured cells by immunohistochemistry and DNA and mRNA fluorescence in-situ hybridization (FISH). In a subset of patients with prostate cancer we compared the technology with an FDA-approved CTC device, CellSearch, and found a higher degree of sensitivity with the Celsee instrument. In conclusion, the integrated Celsee system represents a promising CTC technology for enumeration and molecular characterization.
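The reported 94% sensitivity and 100% specificity come from the standard confusion-matrix definitions. A minimal sketch, using hypothetical counts chosen only to reproduce those percentages (the paper does not state the raw counts here):

```python
def sensitivity_specificity(tp, fn, tn, fp):
    """Sensitivity = detection rate among true positives (patients);
    specificity = negative rate among true negatives (healthy donors)."""
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    return sensitivity, specificity

# Hypothetical example: 47 of 50 patients CTC-positive, 0 of 20 healthy
# donors positive reproduces 94% sensitivity and 100% specificity.
sens, spec = sensitivity_specificity(tp=47, fn=3, tn=20, fp=0)
```

A device trade-off worth noting: raising capture sensitivity (fewer missed CTCs) usually risks counting non-tumor cells, so both figures must be validated together, as done here against CellSearch.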

  11. Development of an Automated and Sensitive Microfluidic Device for Capturing and Characterizing Circulating Tumor Cells (CTCs) from Clinical Blood Samples

    PubMed Central

    Gogoi, Priya; Sepehri, Saedeh; Zhou, Yi; Gorin, Michael A.; Paolillo, Carmela; Capoluongo, Ettore; Gleason, Kyle; Payne, Austin; Boniface, Brian; Cristofanilli, Massimo; Morgan, Todd M.; Fortina, Paolo; Pienta, Kenneth J.; Handique, Kalyan; Wang, Yixin

    2016-01-01

    Current analysis of circulating tumor cells (CTCs) is hindered by sub-optimal sensitivity and specificity of devices or assays, as well as a lack of capability for characterizing CTCs with clinical biomarkers. Here, we validate a novel technology to enrich and characterize CTCs from blood samples of patients with metastatic breast, prostate and colorectal cancers using a microfluidic chip that is processed by an automated staining and scanning system from sample preparation to image processing. The Celsee system allowed for the detection of CTCs with apparent high sensitivity and specificity (94% sensitivity and 100% specificity). Moreover, the system facilitated rapid capture of CTCs from blood samples and also allowed for downstream characterization of the captured cells by immunohistochemistry and DNA and mRNA fluorescence in-situ hybridization (FISH). In a subset of patients with prostate cancer we compared the technology with an FDA-approved CTC device, CellSearch, and found a higher degree of sensitivity with the Celsee instrument. In conclusion, the integrated Celsee system represents a promising CTC technology for enumeration and molecular characterization. PMID:26808060

  12. Automated on-line preconcentration of palladium on different sorbents and its determination in environmental samples.

    PubMed

    Sánchez Rojas, Fuensanta; Bosch Ojeda, Catalina; Cano Pavón, José Manuel

    2007-01-01

    The determination of noble metals in environmental samples is of increasing importance. Palladium is often employed as a catalyst in the chemical industry and is also used with platinum and rhodium in motor-car catalytic converters, which might cause environmental pollution problems. Two different sorbents for palladium preconcentration from different samples were investigated: silica gel functionalized with 1,5-bis(di-2-pyridyl)methylene thiocarbohydrazide (DPTH-gel), and 1,5-bis(2-pyridyl)-3-sulphophenyl methylene thiocarbonohydrazide (PSTH) immobilised on an anion-exchange resin (Dowex 1x8-200). The sorbents were tested in a micro-column, placed in the auto-sampler arm, at a flow rate of 2.8 mL min(-1). Elution was performed with 4 M HCl and 4 M HNO3, respectively. Satisfactory results were obtained for both sorbents.

  13. Improving semi-automated segmentation by integrating learning with active sampling

    NASA Astrophysics Data System (ADS)

    Huo, Jing; Okada, Kazunori; Brown, Matthew

    2012-02-01

    Interactive segmentation algorithms such as GrowCut usually require quite a few user interactions to perform well, and have poor repeatability. In this study, we developed a novel technique to boost the performance of the interactive segmentation method GrowCut involving: 1) a novel "focused sampling" approach for supervised learning, as opposed to conventional random sampling; and 2) boosting GrowCut using the machine-learned results. We applied the proposed technique to glioblastoma multiforme (GBM) brain tumor segmentation and evaluated it on a dataset of ten cases from a multi-center pharmaceutical drug trial. The results showed that the proposed system has the potential to reduce user interaction while maintaining similar segmentation accuracy.

  14. Automated microextraction sample preparation coupled on-line to FT-ICR-MS: application to desalting and concentration of river and marine dissolved organic matter.

    PubMed

    Morales-Cid, Gabriel; Gebefugi, Istvan; Kanawati, Basem; Harir, Mourad; Hertkorn, Norbert; Rosselló-Mora, Ramón; Schmitt-Kopplin, Philippe

    2009-10-01

    Sample preparation procedures are in most cases both sample- and time-consuming, and commonly require large amounts of solvent. Automation in this regard can minimize the needed injection volume and efficiently reduce solvent consumption. A new, fully automated sample desalting and preconcentration technique employing microextraction by packed sorbents (MEPS) cartridges is implemented and coupled to an ion cyclotron resonance Fourier-transform mass spectrometer (ICR-FT/MS). The performance of non-target mass spectrometric analysis is compared for automated versus off-line sample preparation for several samples of aqueous natural organic matter. This approach can be generalized to any metabolite profiling or metabolome analysis of biological materials, but was optimized here on highly complex organic mixtures: a surface water with well-characterized natural organic matter, and a marine sample with a high salt load that enabled validation of the presented automatic system for salty samples. The analysis of Suwannee River water showed selective C18-MEPS enrichment of chemical signatures with average H/C and O/C elemental ratios, and loss of both highly polar and highly aromatic structures from the original sample. Automated on-line application to marine samples showed desalting and different chemical signatures from surface to bottom water. Relative comparison of structural footprints with the C18 concentration/desalting procedure nevertheless demonstrated that the surface water film was more concentrated in surface-active components of natural (fatty acids) and anthropogenic (sulfur-containing surfactants) origin. Overall, the relative standard deviation distribution in terms of peak intensity was improved by automating the proposed on-line method. PMID:19685041

  15. Steady-State Vacuum Ultraviolet Exposure Facility With Automated Lamp Calibration and Sample Positioning Fabricated

    NASA Technical Reports Server (NTRS)

    Sechkar, Edward A.; Steuber, Thomas J.; Banks, Bruce A.; Dever, Joyce A.

    2000-01-01

    The Next Generation Space Telescope (NGST) will be placed in an orbit that will subject it to constant solar radiation during its planned 10-year mission. A sunshield will be necessary to passively cool the telescope, protecting it from the Sun's energy and assuring proper operating temperatures for the telescope's instruments. This sunshield will be composed of metalized polymer multilayer insulation with an outer polymer membrane (12 to 25 mm in thickness) that will be metalized on the back to assure maximum reflectance of sunlight. The sunshield must maintain mechanical integrity and optical properties for the full 10 years. This durability requirement is most challenging for the outermost, constantly solar-facing polymer membrane of the sunshield. One of the potential threats to the membrane material's durability is vacuum ultraviolet (VUV) radiation at wavelengths below 200 nm. Such radiation can be absorbed in the bulk of these thin polymer membrane materials and degrade the polymer's optical and mechanical properties. So that a suitable membrane material can be selected that demonstrates durability to solar VUV radiation, ground-based testing of candidate materials must be conducted to simulate the total 10-year VUV exposure expected during the Next Generation Space Telescope mission. The Steady State Vacuum Ultraviolet exposure facility was designed and fabricated at the NASA Glenn Research Center at Lewis Field to provide unattended 24-hr exposure of candidate materials to VUV radiation of 3 to 5 times the Sun's intensity in the wavelength range of 115 to 200 nm. The facility's chamber, which maintains a pressure of approximately 5×10⁻⁶ torr, is divided into three individual exposure cells, each with a separate VUV source and sample-positioning mechanism. The three test cells are separated by a water-cooled copper shield plate assembly to minimize thermal effects from adjacent test cells. Part of the interior sample positioning mechanism of one…

  16. Standardized assessment of cognitive functioning during development and aging using an automated touchscreen battery.

    PubMed

    Clark, C Richard; Paul, Robert H; Williams, Leanne M; Arns, Martijn; Fallahpour, Kamran; Handmer, Carolyn; Gordon, Evian

    2006-08-01

    This study examined the effects of age, gender and education on subjects spanning nine decades using a new cognitive battery of 12 tests. One thousand and seven participants between 6 and 82 years of age completed the battery under standardized conditions using an automated, computerized touchscreen. Sensitive indicators of change were obtained on measures of attention and working memory, learning and memory retrieval, language, visuospatial function, sensori-motor function and executive function. Performance tended to improve through to the third and fourth decades of life, followed by gradual decline and/or stabilized performance thereafter. Gender differences were obtained on measures of sustained attention, verbal learning and memory, visuospatial processing and dexterity. Years of education in adults was reflected in performance on measures of verbal function. Overall, the test battery provided sensitive indicators of a range of cognitive functions suitable for the assessment of abnormal cognition, the evaluation of treatment effects and longitudinal case management.

  17. Automated cell viability assessment using a microfluidics based portable imaging flow analyzer

    PubMed Central

    Jagannadh, Veerendra Kalyan; Adhikari, Jayesh Vasudeva; Gorthi, Sai Siva

    2015-01-01

    In this work, we report a system-level integration of portable microscopy and microfluidics for the realization of an optofluidic imaging flow analyzer with a throughput of 450 cells/s. With the use of a cellphone augmented with off-the-shelf optical components and custom-designed microfluidics, we demonstrate a portable optofluidic imaging flow analyzer. A multiple-microfluidic-channel geometry was employed to demonstrate the enhancement of throughput in the context of low frame-rate imaging systems. Using the cellphone-based digital imaging flow analyzer, we imaged yeast cells present in a suspension. By digitally processing the recorded videos of the flow stream on the cellphone, we demonstrated an automated cell viability assessment of the yeast cell population. In addition, we also demonstrate the suitability of the system for blood cell counting. PMID:26015835

  18. Automated structural design with aeroelastic constraints - A review and assessment of the state of the art

    NASA Technical Reports Server (NTRS)

    Stroud, W. J.

    1974-01-01

    A review and assessment of the state of the art in automated aeroelastic design is presented. Most of the aeroelastic design studies appearing in the literature deal with flutter, and, therefore, this paper also concentrates on flutter. The flutter design problem is divided into three cases: an isolated flutter mode, neighboring flutter modes, and a hump mode which can rise and cause a sudden, discontinuous change in the flutter velocity. Synthesis procedures are presented in terms of techniques that are appropriate for problems of various levels of difficulty. Current trends, which should result in more efficient, powerful and versatile design codes, are discussed. Approximate analysis procedures and the need for simultaneous consideration of multiple design requirements are emphasized.

  19. Automated cell viability assessment using a microfluidics based portable imaging flow analyzer.

    PubMed

    Jagannadh, Veerendra Kalyan; Adhikari, Jayesh Vasudeva; Gorthi, Sai Siva

    2015-03-01

    In this work, we report a system-level integration of portable microscopy and microfluidics for the realization of an optofluidic imaging flow analyzer with a throughput of 450 cells/s. With the use of a cellphone augmented with off-the-shelf optical components and custom-designed microfluidics, we demonstrate a portable optofluidic imaging flow analyzer. A multiple-microfluidic-channel geometry was employed to demonstrate the enhancement of throughput in the context of low frame-rate imaging systems. Using the cellphone-based digital imaging flow analyzer, we imaged yeast cells present in a suspension. By digitally processing the recorded videos of the flow stream on the cellphone, we demonstrated an automated cell viability assessment of the yeast cell population. In addition, we also demonstrate the suitability of the system for blood cell counting. PMID:26015835

  20. Quantitative Assessment of Mouse Mammary Gland Morphology Using Automated Digital Image Processing and TEB Detection.

    PubMed

    Blacher, Silvia; Gérard, Céline; Gallez, Anne; Foidart, Jean-Michel; Noël, Agnès; Péqueux, Christel

    2016-04-01

    The assessment of rodent mammary gland morphology is widely used to study the molecular mechanisms driving breast development and to analyze the impact of various endocrine disruptors with putative pathological implications. In this work, we propose a methodology relying on fully automated digital image analysis, including image processing and quantification of the whole ductal tree as well as the terminal end buds. It allows both growth parameters and fine morphological glandular structures to be measured accurately and objectively. Mammary gland elongation was characterized by two parameters: the length and the epithelial area of the ductal tree. Ductal tree fine structures were characterized by: 1) branch end-point density, 2) branching density, and 3) branch length distribution. The proposed methodology was compared with quantification methods classically used in the literature. This procedure can be transposed to several software packages and thus be widely used by scientists studying rodent mammary gland morphology. PMID:26910307
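Once the ductal tree has been skeletonized, the three fine-structure descriptors listed above reduce to simple counts normalized by epithelial area. A minimal sketch, assuming branch lengths, end-point and branch-point counts have already been extracted from the skeleton (the function name and units are illustrative, not from the paper):

```python
def ductal_tree_metrics(branch_lengths_mm, end_points, branch_points, epithelial_area_mm2):
    """Summarize a skeletonized ductal tree with the three fine-structure
    descriptors used above: end-point density and branching density per
    unit epithelial area, plus the branch length distribution (summarized
    here by its mean)."""
    return {
        "end_point_density": end_points / epithelial_area_mm2,
        "branching_density": branch_points / epithelial_area_mm2,
        "mean_branch_length_mm": sum(branch_lengths_mm) / len(branch_lengths_mm),
    }
```

Normalizing by epithelial area rather than reporting raw counts is what makes glands of different sizes or developmental stages comparable.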

  1. Automated Geospatial Watershed Assessment Tool (AGWA): Applications for Assessing the Impact of Urban Growth and the use of Low Impact Development Practices.

    EPA Science Inventory

    New tools and functionality have been incorporated into the Automated Geospatial Watershed Assessment Tool (AGWA) to assess the impact of urban growth and evaluate the effects of low impact development (LID) practices. AGWA (see: www.tucson.ars.ag.gov/agwa or http://www.epa.gov...

  2. Consistency of breast density categories in serial screening mammograms: A comparison between automated and human assessment.

    PubMed

    Holland, Katharina; van Zelst, Jan; den Heeten, Gerard J; Imhof-Tas, Mechli; Mann, Ritse M; van Gils, Carla H; Karssemeijer, Nico

    2016-10-01

    Reliable breast density measurement is needed to personalize screening by using density as a risk factor and offering supplemental screening to women with dense breasts. We investigated the categorization of pairs of subsequent screening mammograms into density classes by human readers and by an automated system. With software (VDG) and by four readers, including three specialized breast radiologists, 1000 mammograms belonging to 500 pairs of subsequent screening exams were categorized into either two or four density classes. We calculated percent agreement and the percentage of women that changed from dense to non-dense and vice versa. Inter-exam agreement (IEA) was calculated with kappa statistics. Results were computed for each reader individually and for the case in which each mammogram was classified by one of the four readers by random assignment (group reading). Higher percent agreement was found with VDG (90.4%, CI 87.9-92.9%) than with readers (86.2-89.2%), while the less plausible changes from non-dense to dense occurred less often with VDG (2.8%, CI 1.4-4.2%) than with group reading (4.2%, CI 2.4-6.0%). We found an IEA of 0.68-0.77 for the readers using two classes and an IEA of 0.76-0.82 using four classes. IEA is significantly higher with VDG compared to group reading. The categorization of serial mammograms into density classes is more consistent with automated software than with a mixed group of human readers. When using breast density to personalize screening protocols, assessment with software may be preferred over assessment by radiologists.
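The inter-exam agreement (IEA) reported above is a kappa statistic: observed agreement corrected for the agreement expected by chance from the marginal class frequencies. A minimal Cohen's kappa over two equal-length lists of class labels (e.g. the density class of the first and second exam of each pair):

```python
def cohens_kappa(labels_a, labels_b):
    """Cohen's kappa between two categorical labelings of the same cases:
    (observed agreement - chance agreement) / (1 - chance agreement)."""
    n = len(labels_a)
    categories = sorted(set(labels_a) | set(labels_b))
    p_obs = sum(a == b for a, b in zip(labels_a, labels_b)) / n
    # chance agreement from the product of marginal class frequencies
    p_exp = sum(
        (labels_a.count(c) / n) * (labels_b.count(c) / n) for c in categories
    )
    return (p_obs - p_exp) / (1.0 - p_exp)
```

Kappa of 1 means perfect consistency between exams, 0 means no better than chance, which is why the 0.76-0.82 range for four classes indicates substantial consistency despite the finer categorization.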

  3. Automated quantitation of hemoglobin-based blood substitutes in whole blood samples.

    PubMed

    Kunicka, J; Malin, M; Zelmanovic, D; Katzenberg, M; Canfield, W; Shapiro, P; Mohandas, N

    2001-12-01

    It is necessary to develop methods for accurate monitoring of cell-free hemoglobin in circulation. Routine monitoring of circulating cell-free hemoglobin will be useful for evaluating the efficacy of blood substitute administration and for determining the clearance rates of the blood substitute from circulation. In addition, discriminating between cell-free hemoglobin and cell-associated hemoglobin will enable accurate determination of the RBC indices, mean cell hemoglobin and mean corpuscular hemoglobin concentration, in individuals receiving hemoglobin-based blood substitutes. As the colorimetric methods used by hematology analyzers to quantitate the hemoglobin value of a blood sample cannot distinguish between cell-associated and cell-free hemoglobin, it is currently not feasible to quantitate the levels of hemoglobin substitutes in circulation. The advent of a technology that measures the volume and hemoglobin concentration of individual RBCs provides an alternative strategy for quantitating the cell-associated hemoglobin in a blood sample. We document that the combined use of cell-based and colorimetric hemoglobin measurements provides accurate discrimination between cell-associated and cell-free hemoglobin over a wide range of hemoglobin levels. This strategy should enable rapid and accurate monitoring of the levels of cell-free hemoglobin substitutes in the circulation of recipients of these blood substitutes.
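The combined strategy amounts to a subtraction: the colorimetric measurement gives total hemoglobin, the cell-by-cell measurement reconstructs the cell-associated fraction (RBC count × mean cell hemoglobin), and the difference is the circulating cell-free substitute. A minimal sketch with conventional hematology units (the function and example values are illustrative, not from the paper):

```python
def cell_free_hemoglobin(total_hb_g_dl, mean_cell_hb_pg, rbc_count_m_ul):
    """Cell-free hemoglobin (g/dL) as colorimetric total minus the
    cell-associated hemoglobin reconstructed from single-cell data.

    rbc_count_m_ul: RBC count in 10^6 cells/uL; mean_cell_hb_pg: MCH in pg.
    Cell-associated Hb (g/dL) = RBC * MCH / 10 (unit conversion factor).
    """
    cell_associated = rbc_count_m_ul * mean_cell_hb_pg / 10.0
    return total_hb_g_dl - cell_associated
```

For example, with a total of 16 g/dL, an MCH of 30 pg and 5×10⁶ RBC/µL, the cell-associated fraction is 15 g/dL, leaving 1 g/dL attributable to the substitute.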

  4. Evaluation of the appropriate time period between sampling and analyzing for automated urinalysis

    PubMed Central

    Dolscheid-Pommerich, Ramona C.; Klarmann-Schulz, Ute; Conrad, Rupert; Stoffel-Wagner, Birgit; Zur, Berndt

    2016-01-01

    Introduction Preanalytical specifications for urinalysis must be strictly adhered to in order to avoid false interpretations. The aim of the present study is to examine whether the preanalytical factor ‘time point of analysis’ significantly influences the stability of urine samples for urine particle and dipstick analysis. Materials and methods In 321 pathological spontaneous urine samples, urine dipstick (Urisys™2400, Combur-10-Test™ strips, Roche Diagnostics, Mannheim, Germany) and particle analysis (UF-1000 i™, Sysmex, Norderstedt, Germany) were performed within 90 min, 120 min and 240 min after urine collection. Results For urine particle analysis, a significant increase in conductivity (120 vs. 90 min: P < 0.001, 240 vs. 90 min: P < 0.001) and significant decreases in WBC (120 vs. 90 min P < 0.001, 240 vs. 90 min P < 0.001), RBC (120 vs. 90 min P < 0.001, 240 vs. 90 min P < 0.001), casts (120 vs. 90 min P < 0.001, 240 vs. 90 min P < 0.001) and epithelial cells (120 vs. 90 min P = 0.610, 240 vs. 90 min P = 0.041) were found. There were no significant changes for bacteria. Regarding urine dipstick analysis, misclassification rates between measurements were significant for pH (120 vs. 90 min P < 0.001, 240 vs. 90 min P < 0.001), leukocytes (120 vs. 90 min P < 0.001, 240 vs. 90 min P < 0.001), nitrite (120 vs. 90 min P < 0.001, 240 vs. 90 min P < 0.001), protein (120 vs. 90 min P < 0.001, 240 vs. 90 min P < 0.001), ketone (120 vs. 90 min P < 0.001, 240 vs. 90 min P < 0.001), blood (120 vs. 90 min P < 0.001, 240 vs. 90 min P < 0.001), specific gravity (120 vs. 90 min P < 0.001, 240 vs. 90 min P < 0.001) and urobilinogen (120 vs. 90 min, P = 0.031). Misclassification rates were not significant for glucose and bilirubin. Conclusion Most parameters depend critically on the time window between sampling and analysis. Our study stresses the importance of adherence to early time points in urinalysis (within 90 min). PMID:26981022

  5. Versatile sample environments and automation for biological solution X-ray scattering experiments at the P12 beamline (PETRA III, DESY)

    PubMed Central

    Blanchet, Clement E.; Spilotros, Alessandro; Schwemmer, Frank; Graewert, Melissa A.; Kikhney, Alexey; Jeffries, Cy M.; Franke, Daniel; Mark, Daniel; Zengerle, Roland; Cipriani, Florent; Fiedler, Stefan; Roessle, Manfred; Svergun, Dmitri I.

    2015-01-01

    A high-brilliance synchrotron P12 beamline of the EMBL located at the PETRA III storage ring (DESY, Hamburg) is dedicated to biological small-angle X-ray scattering (SAXS) and has been designed and optimized for scattering experiments on macromolecular solutions. Scatterless slits reduce the parasitic scattering, a custom-designed miniature active beamstop ensures accurate data normalization and the photon-counting PILATUS 2M detector enables the background-free detection of weak scattering signals. The high flux and small beam size allow for rapid experiments with exposure time down to 30–50 ms covering the resolution range from about 300 to 0.5 nm. P12 possesses a versatile and flexible sample environment system that caters for the diverse experimental needs required to study macromolecular solutions. These include an in-vacuum capillary mode for standard batch sample analyses with robotic sample delivery and for continuous-flow in-line sample purification and characterization, as well as an in-air capillary time-resolved stopped-flow setup. A novel microfluidic centrifugal mixing device (SAXS disc) is developed for a high-throughput screening mode using sub-microlitre sample volumes. Automation is a key feature of P12; it is controlled by a beamline meta server, which coordinates and schedules experiments from either standard or nonstandard operational setups. The integrated SASFLOW pipeline automatically checks for consistency, and processes and analyses the data, providing near real-time assessments of overall parameters and the generation of low-resolution models within minutes of data collection. These advances, combined with a remote access option, allow for rapid high-throughput analysis, as well as time-resolved and screening experiments for novice and expert biological SAXS users. PMID:25844078

  6. Dried blood spot proteomics: surface extraction of endogenous proteins coupled with automated sample preparation and mass spectrometry analysis.

    PubMed

    Martin, Nicholas J; Bunch, Josephine; Cooper, Helen J

    2013-08-01

    Dried blood spots offer many advantages as a sample format including ease and safety of transport and handling. To date, the majority of mass spectrometry analyses of dried blood spots have focused on small molecules or hemoglobin. However, dried blood spots are a potentially rich source of protein biomarkers, an area that has been overlooked. To address this issue, we have applied an untargeted bottom-up proteomics approach to the analysis of dried blood spots. We present an automated and integrated method for extraction of endogenous proteins from the surface of dried blood spots and sample preparation via trypsin digestion by use of the Advion Biosciences Triversa Nanomate robotic platform. Liquid chromatography tandem mass spectrometry of the resulting digests enabled identification of 120 proteins from a single dried blood spot. The proteins identified cross a concentration range of four orders of magnitude. The method is evaluated and the results discussed in terms of the proteins identified and their potential use as biomarkers in screening programs.

  7. Automated Large Scale Parameter Extraction of Road-Side Trees Sampled by a Laser Mobile Mapping System

    NASA Astrophysics Data System (ADS)

    Lindenbergh, R. C.; Berthold, D.; Sirmacek, B.; Herrero-Huerta, M.; Wang, J.; Ebersbach, D.

    2015-08-01

    In urbanized Western Europe, trees are considered an important component of the built-up environment. This also means that there is an increasing demand for tree inventories. Laser mobile mapping systems provide an efficient and accurate way to sample the 3D road surroundings, including notable roadside trees. Indeed, at, say, 50 km/h such systems collect point clouds consisting of half a million points per 100 m. Methods exist that extract tree parameters from relatively small patches of such data, but a remaining challenge is to operationally extract roadside tree parameters at a regional level. For this purpose a workflow is presented as follows: the input point clouds are consecutively downsampled, retiled, classified, segmented into individual trees and upsampled to enable automated extraction of tree location, tree height, canopy diameter and trunk diameter at breast height (DBH). The workflow is implemented to work on a laser mobile mapping data set sampling 100 km of road in Sachsen, Germany, and is tested on a 7 km stretch of road. Along this road, the method detected 315 trees that were considered well detected, plus 56 clusters of tree points in which no individual trees could be identified. Using voxels, the data volume could be reduced by about 97% in a default scenario. Processing the results of this scenario took ~2500 seconds, corresponding to about 10 km/h, which is getting close to, but is still below, the acquisition rate, estimated at 50 km/h.
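The ~97% data reduction comes from the voxel downsampling step: points are binned into a regular 3D grid and each occupied voxel is replaced by a single representative point. A minimal sketch using voxel centroids, assuming points as (x, y, z) tuples (the implementation details are illustrative, not from the paper):

```python
def voxel_downsample(points, voxel_size):
    """Reduce a point cloud by keeping one representative point (the
    centroid) per occupied voxel of edge length `voxel_size`."""
    buckets = {}
    for x, y, z in points:
        # integer voxel index along each axis
        key = (int(x // voxel_size), int(y // voxel_size), int(z // voxel_size))
        buckets.setdefault(key, []).append((x, y, z))
    # centroid of the points that fell into each voxel
    return [
        tuple(sum(coord) / len(pts) for coord in zip(*pts))
        for pts in buckets.values()
    ]
```

The voxel size controls the trade-off: larger voxels shrink the data (and speed up segmentation) but blur fine structure such as trunk cross-sections needed for DBH estimation.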

  8. Dried Blood Spot Proteomics: Surface Extraction of Endogenous Proteins Coupled with Automated Sample Preparation and Mass Spectrometry Analysis

    NASA Astrophysics Data System (ADS)

    Martin, Nicholas J.; Bunch, Josephine; Cooper, Helen J.

    2013-08-01

    Dried blood spots offer many advantages as a sample format including ease and safety of transport and handling. To date, the majority of mass spectrometry analyses of dried blood spots have focused on small molecules or hemoglobin. However, dried blood spots are a potentially rich source of protein biomarkers, an area that has been overlooked. To address this issue, we have applied an untargeted bottom-up proteomics approach to the analysis of dried blood spots. We present an automated and integrated method for extraction of endogenous proteins from the surface of dried blood spots and sample preparation via trypsin digestion by use of the Advion Biosciences Triversa Nanomate robotic platform. Liquid chromatography tandem mass spectrometry of the resulting digests enabled identification of 120 proteins from a single dried blood spot. The proteins identified cross a concentration range of four orders of magnitude. The method is evaluated and the results discussed in terms of the proteins identified and their potential use as biomarkers in screening programs.

  9. Enabling Wide-Scale Computer Science Education through Improved Automated Assessment Tools

    NASA Astrophysics Data System (ADS)

    Boe, Bryce A.

    There is a proliferating demand for newly trained computer scientists as the number of computer science related jobs continues to increase. University programs will only be able to train enough new computer scientists to meet this demand when two things happen: when there are more primary and secondary school students interested in computer science, and when university departments have the resources to handle the resulting increase in enrollment. To meet these goals, significant effort is being made to both incorporate computational thinking into existing primary school education, and to support larger university computer science class sizes. We contribute to this effort through the creation and use of improved automated assessment tools. To enable wide-scale computer science education we do two things. First, we create a framework called Hairball to support the static analysis of Scratch programs targeted for fourth, fifth, and sixth grade students. Scratch is a popular building-block language utilized to pique interest in and teach the basics of computer science. We observe that Hairball allows for rapid curriculum alterations and thus contributes to wide-scale deployment of computer science curriculum. Second, we create a real-time feedback and assessment system utilized in university computer science classes to provide better feedback to students while reducing assessment time. Insights from our analysis of student submission data show that modifications to the system configuration support the way students learn and progress through course material, making it possible for instructors to tailor assignments to optimize learning in growing computer science classes.

  10. ICSH recommendations for assessing automated high-performance liquid chromatography and capillary electrophoresis equipment for the quantitation of HbA2.

    PubMed

    Stephens, A D; Colah, R; Fucharoen, S; Hoyer, J; Keren, D; McFarlane, A; Perrett, D; Wild, B J

    2015-10-01

    Automated high-performance liquid chromatography and capillary electrophoresis are used to quantitate the proportion of hemoglobin A2 (HbA2) in blood samples in order to enable screening and diagnosis of carriers of β-thalassemia. Since there is only a very small difference in HbA2 levels between people who are carriers and people who are not, such analyses need to be both precise and accurate. This paper examines the different parameters of such equipment and discusses how they should be assessed.

  11. Automated Pilot Performance Assessment in the T-37: A Feasibility Study. Final Report (May 1968-April 1971).

    ERIC Educational Resources Information Center

    Knoop, Patricia A.; Welde, William L.

    Air Force investigators conducted a three-year program to develop a capability for automated quantification and assessment of in-flight pilot performance. Such a capability enhances pilot training by making ratings more objective, valid, reliable and sensitive, and by freeing instructors from rating responsibilities, allowing them to concentrate…

  12. An Automated Version of the BAT Syntactic Comprehension Task for Assessing Auditory L2 Proficiency in Healthy Adults

    ERIC Educational Resources Information Center

    Achim, Andre; Marquis, Alexandra

    2011-01-01

    Studies of bilingualism sometimes require healthy subjects to be assessed for proficiency at auditory sentence processing in their second language (L2). The Syntactic Comprehension task of the Bilingual Aphasia Test could satisfy this need. For ease and uniformity of application, we automated its English (Paradis, M., Libben, G., and Hummel, K.…

  13. Adjustable virtual pore-size filter for automated sample preparation using acoustic radiation force

    SciTech Connect

    Jung, B; Fisher, K; Ness, K; Rose, K; Mariella, R

    2008-05-22

    We present a rapid and robust size-based separation method for high-throughput microfluidic devices using acoustic radiation force. We developed a finite element modeling tool to predict the two-dimensional acoustic radiation force field perpendicular to the flow direction in microfluidic devices. Here we compare the results from this model with experimental parametric studies, including variations of the PZT driving frequencies and voltages as well as various particle sizes, compressibilities, and densities. These experimental parametric studies also provide insight into the development of an adjustable 'virtual' pore-size filter as well as optimal operating conditions for various microparticle sizes. We demonstrated the separation of Saccharomyces cerevisiae and MS2 bacteriophage using acoustic focusing. The acoustic radiation force did not affect the MS2 viruses, and their concentration profile remained unchanged. With an optimized design of our microfluidic flow system we were able to achieve yields of >90% for the MS2, with >80% of the S. cerevisiae being removed, in this continuous-flow sample preparation device.
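
    The size dependence that underlies this separation can be sketched with the standard expression for the primary acoustic radiation force on a small particle in a 1D standing wave (the Yosioka-Kawasima/Gor'kov result). The material values below are illustrative assumptions, not the paper's parameters; the a³ scaling is what makes micron-scale yeast focus while ~27 nm MS2 virions are essentially unmoved.

```python
import math

# Sketch: primary acoustic radiation force in a 1D standing wave,
#   F(z) = 4*pi * phi * k * a^3 * E_ac * sin(2kz)
#   phi  = (1/3) * [ (5*rho_t - 2) / (2*rho_t + 1) - kappa_t ]
# where rho_t and kappa_t are the particle/fluid density and
# compressibility ratios.  All numeric inputs below are assumed values.

def contrast_factor(rho_p, rho_f, kappa_p, kappa_f):
    rho_t, kappa_t = rho_p / rho_f, kappa_p / kappa_f
    return ((5 * rho_t - 2) / (2 * rho_t + 1) - kappa_t) / 3.0

def radiation_force(a, z, freq, c_f, E_ac, rho_p, rho_f, kappa_p, kappa_f):
    k = 2 * math.pi * freq / c_f          # wavenumber in the fluid
    phi = contrast_factor(rho_p, rho_f, kappa_p, kappa_f)
    return 4 * math.pi * phi * k * a**3 * E_ac * math.sin(2 * k * z)

# 5-um-radius yeast-like particle in water at a 2 MHz drive (assumed):
F = radiation_force(a=5e-6, z=50e-6, freq=2e6, c_f=1480.0, E_ac=10.0,
                    rho_p=1100.0, rho_f=1000.0, kappa_p=3.0e-10, kappa_f=4.5e-10)
print(F)  # force in newtons; the a**3 factor dominates the size selectivity
```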

  14. Assessment of sampling strategy for explosives-contaminated soils

    SciTech Connect

    Thiboutot, S.; Ampleman, G.; Jenkins, T.F.; Walsh, M.E.; Thorne, P.G.; Ranney, T.A.; Grant, C.L.

    1997-12-31

    An explosives-contaminated site was characterized using composite sampling, in-field sample homogenization, and on-site analysis. Explosives-contaminated sites exhibit large short-range heterogeneity due to the crystalline nature and poor water solubility of the dispersed contaminants. The sampling strategy must be carefully planned in order to minimize sampling error and total uncertainty. The site investigated in this study is an anti-tank firing range that has been in use for over 20 years. The ammunition fired at this range is a melt-cast explosive based on a 70:30 mixture of HMX and TNT. Two previous preliminary sampling surveys of this site showed high levels of HMX in soil samples collected near the targeted tanks. This site was chosen for a collaborative effort between the Canadian Department of National Defence and the U.S. Department of Defense to study sampling strategies and sample heterogeneity where HMX is the main contaminant. On-site colorimetric TNT and HMX methods and enzyme immunoassay TNT and RDX methods were used initially to evaluate whether the sampling pattern provided representative results. A 6 m square grid (36 m²) pattern was established, encompassing two of the targeted tanks. Seventeen grids were installed, and composite samples were collected within those grids. Four surface composite samples were collected in each quadrant of each grid using a circular pattern that sampled about 10% of the top 5 cm of the surface. Replicates were collected to assess the representativeness achieved. Field analysis showed concentrations of HMX ranging from as high as 1640 mg/kg near one target to 2.1 mg/kg at a distance of 15 m from the target. On the other hand, TNT concentrations were much lower than would be expected based on the 70:30 composition ratio. Results from the colorimetric on-site analyses were in excellent agreement with laboratory results.

  15. Automated contour mapping using sparse volume sampling for 4D radiation therapy

    SciTech Connect

    Chao Ming; Schreibmann, Eduard; Li Tianfang; Wink, Nicole; Xing Lei

    2007-10-15

    The purpose of this work is to develop a novel strategy to automatically map organ contours from one phase of respiration to all other phases on a four-dimensional computed tomography (4D CT) scan. A region of interest (ROI) was manually delineated by a physician on one phase-specific image set of a 4D CT. A number of cubic control volumes of size ~1 cm were automatically placed along the contours. The control volumes were then collectively mapped to the next phase using a rigid transformation. To accommodate organ deformation, a model-based adaptation of the control volume positions followed the rigid mapping procedure. This further adjustment of control volume positions was performed by minimizing an energy function that balances the tendency for the control volumes to move to their correspondences against the desire to maintain similar image features and the shape integrity of the contour. The mapped ROI surface was then constructed from the central positions of the control volumes using a triangulated surface construction technique. The proposed technique was assessed using a digital phantom and 4D CT images of three lung patients. Our digital phantom study indicated that a spatial accuracy better than 2.5 mm is achievable using the proposed technique. The patient study showed a similar level of accuracy. In addition, the computational speed of our algorithm was significantly improved compared with a conventional deformable registration-based contour mapping technique. The robustness and accuracy of this approach make it a valuable tool for the efficient use of the available spatio-temporal information for 4D simulation and treatment.
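
    The two-stage mapping can be illustrated with a toy sketch: rigidly map the control-volume centers to the next phase, then nudge each center toward a feature match while retaining part of the rigid prediction (a stand-in for the paper's energy minimization). The transform, feature positions, and weighting are assumptions for illustration, not the published algorithm.

```python
import numpy as np

# Stage 1: rigid mapping of control-volume centers between phases.
def rigid_map(points, R, t):
    return points @ R.T + t

# Stage 2 (toy): blend the rigid prediction with feature-matched positions.
# alpha trades image-feature attraction against shape integrity.
def adjust(points, feature_targets, alpha=0.7):
    return (1 - alpha) * points + alpha * feature_targets

theta = np.deg2rad(3.0)                        # small inter-phase rotation (assumed)
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
t = np.array([1.5, -0.8])                      # mm translation (assumed)
centers = np.array([[10.0, 20.0], [12.0, 21.0], [14.0, 20.5]])

rigid = rigid_map(centers, R, t)
targets = rigid + np.array([0.4, -0.2])        # pretend feature matches nearby
adjusted = adjust(rigid, targets)
print(adjusted)
```

    A real implementation would iterate this adjustment while also penalizing changes in inter-volume spacing, then triangulate the final centers into the mapped ROI surface.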

  16. Automated Assessment of Children’s Postoperative Pain Using Computer Vision

    PubMed Central

    Sikka, Karan; Ahmed, Alex A.; Diaz, Damaris; Goodwin, Matthew S.; Craig, Kenneth D.; Bartlett, Marian S.

    2015-01-01

    BACKGROUND: Current pain assessment methods in youth are suboptimal and vulnerable to bias and underrecognition of clinical pain. Facial expressions are a sensitive, specific biomarker of the presence and severity of pain, and computer vision (CV) and machine-learning (ML) techniques enable reliable, valid measurement of pain-related facial expressions from video. We developed and evaluated a CVML approach to measure pain-related facial expressions for automated pain assessment in youth. METHODS: A CVML-based model for assessment of pediatric postoperative pain was developed from videos of 50 neurotypical youth 5 to 18 years old in both endogenous/ongoing and exogenous/transient pain conditions after laparoscopic appendectomy. Model accuracy was assessed for self-reported pain ratings in children and time since surgery, and compared with by-proxy parent and nurse estimates of observed pain in youth. RESULTS: Model detection of pain versus no-pain demonstrated good-to-excellent accuracy (Area under the receiver operating characteristic curve 0.84–0.94) in both ongoing and transient pain conditions. Model detection of pain severity demonstrated moderate-to-strong correlations (r = 0.65–0.86 within; r = 0.47–0.61 across subjects) for both pain conditions. The model performed equivalently to nurses but not as well as parents in detecting pain versus no-pain conditions, but performed equivalently to parents in estimating pain severity. Nurses were more likely than the model to underestimate youth self-reported pain ratings. Demographic factors did not affect model performance. CONCLUSIONS: CVML pain assessment models derived from automatic facial expression measurements demonstrated good-to-excellent accuracy in binary pain classifications, strong correlations with patient self-reported pain ratings, and parent-equivalent estimation of children’s pain levels over typical pain trajectories in youth after appendectomy. PMID:26034245

  17. Development testing of the chemical analysis automation polychlorinated biphenyl standard analysis method during surface soils sampling at the David Witherspoon 1630 site

    SciTech Connect

    Hunt, M.A.; Klatt, L.N.; Thompson, D.H.

    1998-02-01

    The Chemical Analysis Automation (CAA) project is developing standardized, software-driven, site-deployable robotic laboratory systems with the objective of lowering the per-sample analysis cost, decreasing sample turnaround time, and minimizing human exposure to hazardous and radioactive materials associated with DOE remediation projects. The first integrated system developed by the CAA project is designed to determine polychlorinated biphenyl (PCB) content in soil matrices. Demonstration and development testing of this system were conducted in conjunction with surface soil characterization activities at the David Witherspoon 1630 Site in Knoxville, Tennessee. The PCB system consists of five hardware standard laboratory modules (SLMs), one software SLM, the task sequence controller (TSC), and the human-computer interface (HCI). Four of the hardware SLMs included a four-channel Soxhlet extractor, a high-volume concentrator, a column cleanup, and a gas chromatograph. These SLMs performed the sample preparation and measurement steps within the total analysis protocol. The fifth hardware module was a robot that transports samples between the SLMs and delivers the required consumable supplies to the SLMs. The software SLM is an automated data interpretation module that receives raw data from the gas chromatograph SLM and analyzes the data to yield the analyte information. The TSC is a software system that provides the scheduling, management of system resources, and coordination of all SLM activities. The HCI is a graphical user interface that presents the automated laboratory to the analyst in terms of the analytical procedures and methods. Human control of the automated laboratory is accomplished via the HCI. Sample information required for processing by the automated laboratory is entered through the HCI. Information related to the sample and the system status is presented to the analyst via graphical icons.

  18. Reliability of manual versus automated techniques for assessing passive stiffness of the posterior muscles of the hip and thigh.

    PubMed

    Palmer, Ty B; Jenkins, Nathaniel D M; Cramer, Joel T

    2013-01-01

    The purpose of this study was to compare the reliability of passive stiffness, passive torque, range of motion (ROM), and electromyography (EMG) of the biceps femoris during passive thigh flexion motions intended to assess the ROM of the posterior muscles of the hip and thigh during manual versus automated assessment techniques. Eleven healthy men (mean ± s age = 22 ± 4 years; mass = 85 ± 12 kg; and height = 178 ± 4 cm) and nine healthy women (age = 19 ± 1 years; mass = 66 ± 15 kg; and height = 164 ± 5 cm) completed four randomly ordered passive straight-legged ROM assessments. Two ROM assessments were performed using a manual technique, which consisted of the primary investigator applying slow passive resistance against a load cell attached to the heel while the foot was moved toward the head. Two automated ROM assessments were also performed using a Biodex System 3 isokinetic dynamometer programmed in passive mode to move the foot toward the head at 0.087 rad·s⁻¹. The intraclass correlation coefficients (ICCs) for passive stiffness measured with the manual technique ranged from 0.81-0.86, while for the automated technique they were 0.72-0.92. Standard error of measurement (SEM) values for passive stiffness expressed as a percentage of the mean ranged from 15.5-21.7% for the manual and 17.8-23.7% for the automated technique. Both techniques (manual and automated) were comparably reliable across the three trials, which suggested that the manual technique could be applied outside the laboratory.
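
    The SEM figures quoted above follow from the standard reliability relation SEM = SD·√(1 − ICC), expressed as a percentage of the group mean. The SD and mean below are made-up illustrative numbers, not study data:

```python
import math

# SEM% = 100 * SD * sqrt(1 - ICC) / mean  (standard reliability formula).
def sem_percent(sd, icc, mean):
    sem = sd * math.sqrt(1.0 - icc)
    return 100.0 * sem / mean

# e.g. an ICC of 0.81 with a between-subject SD equal to 40% of the mean
# yields an SEM of about 17% of the mean, in line with the quoted range:
print(round(sem_percent(sd=4.0, icc=0.81, mean=10.0), 1))  # -> 17.4
```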

  19. Automated on-line liquid-liquid extraction system for temporal mass spectrometric analysis of dynamic samples.

    PubMed

    Hsieh, Kai-Ta; Liu, Pei-Han; Urban, Pawel L

    2015-09-24

    Most real samples cannot be infused directly into mass spectrometers because they could contaminate delicate parts of the ion source and ion guides, or cause ion suppression. Conventional sample preparation procedures limit the temporal resolution of analysis. We have developed an automated liquid-liquid extraction system that enables unsupervised repetitive treatment of dynamic samples and instantaneous analysis by mass spectrometry (MS). It incorporates inexpensive open-source microcontroller boards (Arduino and Netduino) to guide the extraction and analysis process. Each extraction cycle takes 17 min. The system enables monitoring of dynamic processes over many hours. The extracts are automatically transferred to the ion source, which incorporates a Venturi pump. Operation of the device has been characterized (repeatability, RSD = 15%, n = 20; concentration range for ibuprofen, 0.053-2.000 mM; LOD for ibuprofen, ∼0.005 mM; including extraction and detection). To exemplify its usefulness in real-world applications, we implemented this device in chemical profiling of a pharmaceutical formulation dissolution process. Temporal dissolution profiles of commercial ibuprofen and acetaminophen tablets were recorded over 10 h. The extraction-MS datasets were fitted with exponential functions to characterize the rates of release of the main and auxiliary ingredients (e.g., ibuprofen, k = 0.43 ± 0.01 h⁻¹). The electronic control unit of this system interacts with the operator via touch screen, internet, voice, and short text messages sent to a mobile phone, which is helpful when launching long-term (e.g., overnight) measurements. Owing to these interactive features, the platform brings the concept of the Internet of Things (IoT) to the chemistry laboratory environment.
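
    The exponential fitting step can be sketched with a first-order release model, C(t) = C_max·(1 − e^(−kt)). The data below are synthetic (generated with k = 0.43 h⁻¹, matching the ibuprofen figure quoted above), not the actual measurements:

```python
import numpy as np
from scipy.optimize import curve_fit

# First-order release model assumed for the dissolution profile.
def release(t, c_max, k):
    return c_max * (1.0 - np.exp(-k * t))

# Synthetic "extraction-MS" time series: one point per ~17-min cycle over 10 h.
rng = np.random.default_rng(0)
t = np.linspace(0.0, 10.0, 36)                         # hours
c = release(t, 2.0, 0.43) + rng.normal(0.0, 0.02, t.size)  # mM, with noise

(c_max_fit, k_fit), _ = curve_fit(release, t, c, p0=(1.0, 0.1))
print(round(k_fit, 2))  # recovered rate constant, 1/h
```

    Fitting each ingredient's time series this way reduces the whole profile to a single release-rate constant per compound.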

  20. An automated serial Grinding, Imaging and Reconstruction Instrument (GIRI) for digital modeling of samples with weak density contrasts

    NASA Astrophysics Data System (ADS)

    Maloof, A. C.; Samuels, B.; Mehra, A.; Spatzier, A.

    2013-12-01

    We present the first results from the new Princeton University Grinder Lab dedicated to the digital reconstruction of hidden objects through serial grinding and imaging. The purpose of a destructive technique like serial grinding is to facilitate the discovery of embedded objects with weak density contrasts outside the sensitivity limits of X-ray CT-scanning devices (feature segmentation and object reconstruction are based on color and textural contrasts in the stack of images rather than on density). The device we have developed is a retrofit imaging station designed for a precision CNC surface grinder. The instrument is capable of processing a sample 20x25x40 cm in size at 1 micron resolution in the x, y, and z axes. Directly coupled to the vertical axis of the grinder is an 80-megapixel medium-format camera and a specialty macro lens capable of imaging a 4x5 cm surface at 5 micron resolution in full 16-bit color. The system is automated such that after each surface grind, the sample is cleaned, travels to the opposite end of the bed from the grinder wheel, is photographed, and then moves back to the grinding position. This process establishes a comprehensive archive of the specimen that is used for digital reconstruction and quantitative analysis. For example, in one night, a 7 cm thick sample can be imaged completely at 20 micron horizontal and vertical resolution without human supervision. Initial results presented here include new digital reconstructions of early animal fossils, 3D sedimentary bedforms, the size and shape distributions of chondrules in chondritic meteorites, and the porosity structure of carbonate-cemented reservoir rocks.

  2. Examples of Optical Assessment of Surface Cleanliness of Genesis Samples

    NASA Technical Reports Server (NTRS)

    Rodriquez, Melissa C.; Allton, J. H.; Burkett, P. J.; Gonzalez, C. P.

    2013-01-01

    Optical microscope assessment of Genesis solar wind collector surfaces is a coordinated part of the effort to obtain an assessed clean subset of flown wafer material for the scientific community. Microscopic survey is typically done at 50X magnification on selected approximately 1 square millimeter areas of the fragment surface. This survey is performed each time a principal investigator (PI) returns a sample to JSC for documentation as part of the established cleaning plan. The cleaning plan encompasses sample handling and analysis by Genesis science team members, and optical survey is done at each step in the process. Sample surface cleaning is performed at JSC (ultrapure water [1] and UV ozone cleaning [2]) and experimentally by other science team members (acid etch [3], acetate replica peels [4], CO2 snow [5], etc.). The effectiveness of each cleaning method can potentially be assessed with optical observation utilizing Image Pro Plus software [6]. Differences in particle counts can be studied and discussed within analysis groups. Approximately 25 samples have been identified as part of the cleaning matrix effort to date.

  3. Automated Cognitive Health Assessment From Smart Home-Based Behavior Data.

    PubMed

    Dawadi, Prafulla Nath; Cook, Diane Joyce; Schmitter-Edgecombe, Maureen

    2016-07-01

    Smart home technologies offer potential benefits for assisting clinicians by automating health monitoring and well-being assessment. In this paper, we examine the actual benefits of smart home-based analysis by monitoring daily behavior in the home and predicting clinical scores of the residents. To accomplish this goal, we propose a clinical assessment using activity behavior (CAAB) approach to model a smart home resident's daily behavior and predict the corresponding clinical scores. CAAB uses statistical features that describe characteristics of a resident's daily activity performance to train machine learning algorithms that predict the clinical scores. We evaluate the performance of CAAB utilizing smart home sensor data collected from 18 smart homes over two years. We obtain a statistically significant correlation (r = 0.72) between CAAB-predicted and clinician-provided cognitive scores and a statistically significant correlation (r = 0.45) between CAAB-predicted and clinician-provided mobility scores. These prediction results suggest that it is feasible to predict clinical scores using smart home sensor data and learning-based data analysis.
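
    The reported agreement figures are Pearson correlations between predicted and clinician-provided scores. A minimal sketch on made-up score pairs (not the study's data):

```python
import math

# Pearson correlation coefficient, computed from first principles.
def pearson_r(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

predicted = [24.0, 27.5, 22.0, 29.0, 25.5]   # hypothetical CAAB outputs
clinician = [25.0, 28.0, 21.0, 30.0, 24.0]   # hypothetical clinician scores
print(round(pearson_r(predicted, clinician), 2))
```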

  4. Performance of the Automated Neuropsychological Assessment Metrics (ANAM) in Detecting Cognitive Impairment in Heart Failure Patients

    PubMed Central

    Xie, Susan S.; Goldstein, Carly M.; Gathright, Emily C.; Gunstad, John; Dolansky, Mary A.; Redle, Joseph; Hughes, Joel W.

    2015-01-01

    Objective: Evaluate the capacity of the Automated Neuropsychological Assessment Metrics (ANAM) to detect cognitive impairment (CI) in heart failure (HF) patients. Background: CI is a key prognostic marker in HF. Though the most widely used cognitive screen in HF, the Mini-Mental State Examination (MMSE) is insufficiently sensitive. The ANAM has demonstrated sensitivity to cognitive domains affected by HF, but has not been assessed in this population. Methods: Investigators administered the ANAM and MMSE to 57 HF patients, compared against a composite model of cognitive function. Results: ANAM efficiency (p < .05) and accuracy scores (p < .001) successfully differentiated CI and non-CI. ANAM efficiency and accuracy scores classified 97.7% and 93.0% of non-CI patients, and 14.3% and 21.4% with CI, respectively. Conclusions: The ANAM is more effective than the MMSE for detecting CI, but further research is needed to develop a more optimal cognitive screen for routine use in HF patients. PMID:26354858

  5. Bias Assessment of General Chemistry Analytes using Commutable Samples.

    PubMed

    Koerbin, Gus; Tate, Jillian R; Ryan, Julie; Jones, Graham Rd; Sikaris, Ken A; Kanowski, David; Reed, Maxine; Gill, Janice; Koumantakis, George; Yen, Tina; St John, Andrew; Hickman, Peter E; Simpson, Aaron; Graham, Peter

    2014-11-01

    Harmonisation of reference intervals for routine general chemistry analytes has been a goal for many years. Analytical bias may prevent this harmonisation. To determine whether analytical bias is present when comparing methods, commutable samples (samples that have the same properties as the clinical samples routinely analysed) should be used as reference materials to eliminate the possibility of matrix effects. The use of commutable samples has improved the identification of unacceptable analytical performance in the Netherlands and Spain. The International Federation of Clinical Chemistry and Laboratory Medicine (IFCC) has undertaken a pilot study using commutable samples in an attempt not only to determine country-specific reference intervals but to make them comparable between countries. Australia and New Zealand, through the Australasian Association of Clinical Biochemists (AACB), have also undertaken an assessment of analytical bias using commutable samples and determined that, of the 27 general chemistry analytes studied, 19 showed between-method biases sufficiently small as not to prevent harmonisation of reference intervals. Application of evidence-based approaches, including the determination of analytical bias using commutable material, is necessary when seeking to harmonise reference intervals.

  6. Automated Fast Screening Method for Cocaine Identification in Seized Drug Samples Using a Portable Fourier Transform Infrared (FT-IR) Instrument.

    PubMed

    Mainali, Dipak; Seelenbinder, John

    2016-05-01

    Quick and presumptive identification of seized drug samples without destroying evidence is necessary for law enforcement officials to control the trafficking and abuse of drugs. This work reports an automated screening method to detect the presence of cocaine in seized samples using portable Fourier transform infrared (FT-IR) spectrometers. The method is based on the identification of well-defined characteristic vibrational frequencies related to the functional group of the cocaine molecule and is fully automated through the use of an expert system. Traditionally, analysts look for key functional group bands in the infrared spectra and characterization of the molecules present is dependent on user interpretation. This implies the need for user expertise, especially in samples that likely are mixtures. As such, this approach is biased and also not suitable for non-experts. The method proposed in this work uses the well-established "center of gravity" peak picking mathematical algorithm and combines it with the conditional reporting feature in MicroLab software to provide an automated method that can be successfully employed by users with varied experience levels. The method reports the confidence level of cocaine present only when a certain number of cocaine related peaks are identified by the automated method. Unlike library search and chemometric methods that are dependent on the library database or the training set samples used to build the calibration model, the proposed method is relatively independent of adulterants and diluents present in the seized mixture. This automated method in combination with a portable FT-IR spectrometer provides law enforcement officials, criminal investigators, or forensic experts a quick field-based prescreening capability for the presence of cocaine in seized drug samples.
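
    The "center of gravity" peak picking step is the intensity-weighted mean wavenumber over a window around a band. The spectrum below is synthetic and the implementation is a minimal sketch, not MicroLab's:

```python
# Center-of-gravity peak position: intensity-weighted mean wavenumber
# over a window around a candidate band.
def center_of_gravity(wavenumbers, intensities):
    total = sum(intensities)
    return sum(w * i for w, i in zip(wavenumbers, intensities)) / total

# Symmetric synthetic band centred at 1712 cm^-1 (near a carbonyl stretch):
wn = [1708, 1710, 1712, 1714, 1716]
absorbance = [0.10, 0.40, 0.80, 0.40, 0.10]
print(round(center_of_gravity(wn, absorbance), 1))  # -> 1712.0
```

    An expert-system rule then compares each computed peak position against the expected cocaine band positions and reports a confidence level only when enough of them match.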

  8. Information-Theoretic Assessment of Sample Imaging Systems

    NASA Technical Reports Server (NTRS)

    Huck, Friedrich O.; Alter-Gartenberg, Rachel; Park, Stephen K.; Rahman, Zia-ur

    1999-01-01

    By rigorously extending modern communication theory to the assessment of sampled imaging systems, we develop the formulations that are required to optimize the performance of these systems within the critical constraints of image gathering, data transmission, and image display. The goal of this optimization is to produce images with the best possible visual quality for the wide range of statistical properties of the radiance field of natural scenes that one normally encounters. Extensive computational results are presented to assess the performance of sampled imaging systems in terms of information rate, theoretical minimum data rate, and fidelity. Comparisons of this assessment with perceptual and measurable performance demonstrate that (1) the information rate that a sampled imaging system conveys from the captured radiance field to the observer is closely correlated with the fidelity, sharpness and clarity with which the observed images can be restored and (2) the associated theoretical minimum data rate is closely correlated with the lowest data rate with which the acquired signal can be encoded for efficient transmission.
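
    The information-rate figure of merit comes from Shannon's channel capacity applied to the image-gathering chain. A minimal sketch for a Gaussian signal in additive Gaussian noise, with assumed SNR values (not the paper's computations):

```python
import math

# Shannon capacity per sample for a Gaussian channel: C = (1/2) log2(1 + SNR).
def bits_per_sample(snr):
    return 0.5 * math.log2(1.0 + snr)

# Higher SNR in the image-gathering stage raises the achievable
# information rate, but only logarithmically:
for snr in (10.0, 100.0, 1000.0):
    print(snr, round(bits_per_sample(snr), 2))
```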

  9. Development of Automated Scoring Algorithms for Complex Performance Assessments: A Comparison of Two Approaches.

    ERIC Educational Resources Information Center

    Clauser, Brian E.; Margolis, Melissa J.; Clyman, Stephen G.; Ross, Linette P.

    1997-01-01

    Research on automated scoring is extended by comparing alternative automated systems for scoring a computer simulation of physicians' patient management skills. A regression-based system is more highly correlated with experts' evaluations than a system that uses complex rules to map performances into score levels, but both approaches are feasible.…

  10. Assessing Library Automation and Virtual Library Development in Four Academic Libraries in Oyo, Oyo State, Nigeria

    ERIC Educational Resources Information Center

    Gbadamosi, Belau Olatunde

    2011-01-01

    The paper examines the level of library automation and virtual library development in four academic libraries. A validated questionnaire was used to capture responses from academic librarians at the libraries under study. The paper finds that none of the four academic libraries is fully automated. The libraries make use of librarians with…

  11. In Support of Collection Assessment: The Role of Automation in the Acquisitions and Serials Departments.

    ERIC Educational Resources Information Center

    Hawks, Carol Pitts

    1992-01-01

    Describes the role of automation in library acquisitions and serials departments in support of collection development. Highlights include workstations and expert systems; links to external databases; vendor services, including serials services and online review and selection of approval books; and automated access to collection development…

  12. Comparison of Automated Scoring Methods for a Computerized Performance Assessment of Clinical Judgment

    ERIC Educational Resources Information Center

    Harik, Polina; Baldwin, Peter; Clauser, Brian

    2013-01-01

    Growing reliance on complex constructed response items has generated considerable interest in automated scoring solutions. Many of these solutions are described in the literature; however, relatively few studies have been published that "compare" automated scoring strategies. Here, comparisons are made among five strategies for…

  13. Microwave-assisted sample treatment in a fully automated flow-based instrument: oxidation of reduced technetium species in the analysis of total technetium-99 in caustic aged nuclear waste samples.

    PubMed

    Egorov, Oleg B; O'Hara, Matthew J; Grate, Jay W

    2004-07-15

    An automated flow-based instrument for microwave-assisted treatment of liquid samples has been developed and characterized. The instrument utilizes a flow-through reaction vessel design that facilitates the addition of multiple reagents during sample treatment and removal of the gaseous reaction products, and enables quantitative removal of liquids from the reaction vessel for carryover-free operations. Matrix modification and speciation control chemistries that are required for the radiochemical determination of total ⁹⁹Tc in caustic aged nuclear waste samples have been investigated. A rapid and quantitative oxidation procedure using peroxydisulfate in acidic solution was developed to convert reduced technetium species to pertechnetate in samples with a high content of reducing organics. The effectiveness of the automated sample treatment procedures has been validated in the radiochemical analysis of total ⁹⁹Tc in caustic aged nuclear waste matrixes from the Hanford site.

  14. Assessment of the 296-S-21 Stack Sampling Probe Location

    SciTech Connect

    Glissmeyer, John A.

    2006-09-08

    Tests were performed to assess the suitability of the location of the air sampling probe on the 296-S-21 stack according to the criteria of ANSI N13.1-1999, Sampling and Monitoring Releases of Airborne Radioactive Substances from the Stacks and Ducts of Nuclear Facilities. Pacific Northwest National Laboratory conducted most tests on a 3.67:1 scale model of the stack. CH2M HILL also performed some limited confirmatory tests on the actual stack. The tests assessed the capability of the air-monitoring probe to extract a sample representative of the effluent stream. The tests were conducted for the practical combinations of operating fans and addressed: (1) Angular Flow--The purpose is to determine whether the velocity vector is aligned with the sampling nozzle. The average yaw angle relative to the nozzle axis should not be more than 20 degrees. The measured values ranged from 5 to 11 degrees on the scale model and 10 to 12 degrees on the actual stack. (2) Uniform Air Velocity--The gas momentum across the stack cross section where the sample is extracted should be well mixed or uniform. The uniformity is expressed as the variability of the measurements about the mean, the coefficient of variance (COV). The lower the COV value, the more uniform the velocity. The acceptance criterion is that the COV of the air velocity must be ≤20% across the center two-thirds of the area of the stack. At the location simulating the sampling probe, the measured values ranged from 4 to 11%, which are within the criterion. To confirm the validity of the scale model results, air velocity uniformity measurements were made both on the actual stack and on the scale model at the test ports 1.5 stack diameters upstream of the sampling probe. The results ranged from 6 to 8% COV on the actual stack and 10 to 13% COV on the scale model. The average difference for the eight runs was 4.8% COV, which is within the validation criterion. The fact that the scale model results were slightly higher than the
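
    The COV criterion above is straightforward to compute. A minimal sketch, using illustrative velocity readings rather than the study's measurements:

```python
import numpy as np

# Velocity readings (m/s) taken on a grid covering the center two-thirds
# of the stack cross section. Values here are illustrative, not measured data.
velocities = np.array([12.1, 11.8, 12.4, 12.0, 11.6, 12.3, 11.9, 12.2])

mean_v = velocities.mean()
# Coefficient of variance: sample standard deviation as a percent of the mean.
cov_percent = velocities.std(ddof=1) / mean_v * 100.0

print(f"COV = {cov_percent:.1f}%  (acceptance criterion: <= 20%)")
```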

  15. Automated content and quality assessment of full-motion-video for the generation of meta data

    NASA Astrophysics Data System (ADS)

    Harguess, Josh

    2015-05-01

    Virtually all of the video data (and full-motion-video (FMV)) that is currently collected and stored in support of missions has been corrupted to various extents by image acquisition and compression artifacts. Additionally, video collected by wide-area motion imagery (WAMI) surveillance systems, unmanned aerial vehicles (UAVs), and similar sources is often of low quality or otherwise corrupted, so that it is not worth storing or analyzing. To make progress on automatic video analysis, the first problem to solve is deciding whether the content of the video is worth analyzing at all. We present a work in progress to address three types of scenes which are typically found in real-world data stored in support of Department of Defense (DoD) missions: no or very little motion in the scene, large occlusions in the scene, and fast camera motion. Each of these produces video that is generally not usable to an analyst or automated algorithm for mission support and therefore should be removed or flagged to the user as such. We utilize recent computer vision advances in motion detection and optical flow to automatically assess FMV for the identification and generation of meta-data (or tagging) of video segments which exhibit the unwanted scenarios described above. Results are shown on representative real-world video data.
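
    The paper relies on motion detection and optical flow; as a simplified stand-in, a plain frame-difference score can flag the "no or very little motion" case. The threshold below is a hypothetical choice, not a value from the paper:

```python
import numpy as np

def motion_score(prev_frame: np.ndarray, frame: np.ndarray) -> float:
    """Mean absolute gray-level difference between consecutive frames."""
    return float(np.mean(np.abs(frame.astype(float) - prev_frame.astype(float))))

def tag_static_segments(frames, threshold=1.0):
    """Return indices of frames whose motion score falls below `threshold`,
    i.e. candidate 'no or very little motion' segments to flag or drop."""
    return [i for i in range(1, len(frames))
            if motion_score(frames[i - 1], frames[i]) < threshold]

# Synthetic example: five identical frames, then one with a bright moving block.
static = np.zeros((32, 32), dtype=np.uint8)
moving = static.copy()
moving[:8, :8] = 255  # simulate motion in one corner
frames = [static] * 5 + [moving]
print(tag_static_segments(frames))  # → [1, 2, 3, 4]
```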

  16. Automated Health Alerts Using In-Home Sensor Data for Embedded Health Assessment

    PubMed Central

    Guevara, Rainer Dane; Rantz, Marilyn

    2015-01-01

    We present an example of unobtrusive, continuous monitoring in the home for the purpose of assessing early health changes. Sensors embedded in the environment capture behavior and activity patterns. Changes in patterns are detected as potential signs of changing health. We first present results of a preliminary study investigating 22 features extracted from in-home sensor data. A 1-D alert algorithm was then implemented to generate health alerts to clinicians in a senior housing facility. Clinicians analyze each alert and provide a rating on the clinical relevance. These ratings are then used as ground truth for training and testing classifiers. Here, we present the methodology for four classification approaches that fuse multisensor data. Results are shown using embedded sensor data and health alert ratings collected on 21 seniors over nine months. The best results show similar performance for two techniques, where one approach uses only domain knowledge and the second uses supervised learning for training. Finally, we propose a health change detection model based on these results and clinical expertise. The system of in-home sensors and algorithms for automated health alerts provides a method for detecting health problems very early so that early treatment is possible. This method of passive in-home sensing alleviates compliance issues. PMID:27170900
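
    The abstract does not specify the 1-D alert rule; one plausible sketch flags days when a single sensor-derived feature deviates sharply from its recent baseline. The window and threshold are illustrative assumptions, not the study's parameters:

```python
import numpy as np

def one_d_alerts(feature: np.ndarray, window: int = 14, z_thresh: float = 2.0):
    """Flag time points whose value deviates from the trailing-window mean
    by more than z_thresh standard deviations (hypothetical 1-D alert rule)."""
    alerts = []
    for t in range(window, len(feature)):
        base = feature[t - window:t]
        sd = base.std(ddof=1)
        if sd > 0 and abs(feature[t] - base.mean()) / sd > z_thresh:
            alerts.append(t)
    return alerts

# Synthetic nightly restlessness counts: stable baseline, jump on day 20.
counts = np.concatenate([np.full(20, 10.0), np.full(10, 30.0)])
counts += np.tile([0.5, -0.5], 15)  # small deterministic jitter so sd > 0
print(one_d_alerts(counts))
```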

  17. Automated and manufacturer independent assessment of the battery status of implanted cardiac pacemakers by electrocardiogram analysis.

    PubMed

    Schreier, G; Hayn, D; Kollmann, A; Scherr, D; Lercher, P; Rotman, B; Klein, W

    2004-01-01

    According to international standards, cardiac pacemakers have to indicate the status of their batteries upon magnet application by specific stimulation patterns. The purpose of this study was to assess whether this concept can be used as a basis for automated and manufacturer-independent examination of the depletion level of pacemakers in the framework of a collaborative telemedical pacemaker follow-up system. A prototype of such a system was developed and tested in a real clinical environment. Electrocardiograms (ECGs) were recorded during magnet application and automatically processed to extract the specific stimulation patterns. The results were used to assign each signal a corresponding pacemaker status: "ok," "replace," or "undefined," based on the expected behavior of the devices as specified by the manufacturer. The outcome of this procedure was compared to the result of an expert examination, resulting in a positive predictive value of 100% for the detection of ECGs indicating pacemaker status "ok." The method can, therefore, be used to quickly, safely, and in a manufacturer-neutral way classify cases into the categories "ok" and "needs further checking," which, in a telemedical setting, may be used to increase the efficiency of pacemaker follow-up procedures in the future. PMID:17271607

  18. Passive sampling methods for contaminated sediments: Risk assessment and management

    PubMed Central

    Greenberg, Marc S; Chapman, Peter M; Allan, Ian J; Anderson, Kim A; Apitz, Sabine E; Beegan, Chris; Bridges, Todd S; Brown, Steve S; Cargill, John G; McCulloch, Megan C; Menzie, Charles A; Shine, James P; Parkerton, Thomas F

    2014-01-01

    This paper details how activity-based passive sampling methods (PSMs), which provide information on bioavailability in terms of freely dissolved contaminant concentrations (Cfree), can be used to better inform risk management decision making at multiple points in the process of assessing and managing contaminated sediment sites. PSMs can increase certainty in site investigation and management, because Cfree is a better predictor of bioavailability than total bulk sediment concentration (Ctotal) for 4 key endpoints included in conceptual site models (benthic organism toxicity, bioaccumulation, sediment flux, and water column exposures). The use of passive sampling devices (PSDs) presents challenges with respect to representative sampling for estimating average concentrations and other metrics relevant for exposure and risk assessment. These challenges can be addressed by designing studies that account for sources of variation associated with PSMs and considering appropriate spatial scales to meet study objectives. Possible applications of PSMs include quantifying spatial and temporal trends in bioavailable contaminants, identifying and evaluating contaminant source contributions, calibrating site-specific models, and improving weight-of-evidence based decision frameworks. PSM data can be used to assist in delineating sediment management zones based on likelihood of exposure effects, monitor remedy effectiveness, and evaluate risk reduction after sediment treatment, disposal, or beneficial reuse after management actions. Examples are provided illustrating why PSMs and freely dissolved contaminant concentrations (Cfree) should be incorporated into contaminated sediment investigations and study designs to better focus on and understand contaminant bioavailability, more accurately estimate exposure to sediment-associated contaminants, and better inform risk management decisions. Research and communication needs for encouraging broader use are discussed.

  19. Automated size-specific CT dose monitoring program: Assessing variability in CT dose

    SciTech Connect

    Christianson, Olav; Li Xiang; Frush, Donald; Samei, Ehsan

    2012-11-15

    Purpose: The potential health risks associated with low levels of ionizing radiation have created a movement in the radiology community to optimize computed tomography (CT) imaging protocols to use the lowest radiation dose possible without compromising the diagnostic usefulness of the images. Despite efforts to use appropriate and consistent radiation doses, studies suggest that a great deal of variability in radiation dose exists both within and between institutions for CT imaging. In this context, the authors have developed an automated size-specific radiation dose monitoring program for CT and used this program to assess variability in size-adjusted effective dose from CT imaging. Methods: The authors' radiation dose monitoring program operates on an independent Health Insurance Portability and Accountability Act (HIPAA) compliant dosimetry server. Digital Imaging and Communications in Medicine (DICOM) routing software is used to isolate dose report screen captures and scout images for all incoming CT studies. Effective dose conversion factors (k-factors) are determined based on the protocol, and optical character recognition is used to extract the CT dose index and dose-length product. The patient's thickness is obtained by applying an adaptive thresholding algorithm to the scout images and is used to calculate the size-adjusted effective dose (ED_adj). The radiation dose monitoring program was used to collect data on 6351 CT studies from three scanner models (GE Lightspeed Pro 16, GE Lightspeed VCT, and GE Definition CT750 HD) and two institutions over a one-month period and to analyze the variability in ED_adj between scanner models and across institutions. Results: No significant difference was found between computer measurements of patient thickness and observer measurements (p = 0.17), and the average difference between the two methods was less than 4%. Applying the size correction resulted in ED_adj values that differed by up to 44% from effective dose estimates
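
    The pipeline described above reduces to effective dose = k-factor × dose-length product, plus a size correction driven by patient thickness. The abstract does not give the correction model; the exponential form, reference thickness, and coefficient below are hypothetical placeholders:

```python
import math

def effective_dose(dlp_mgy_cm: float, k_factor: float) -> float:
    """Effective dose (mSv) from dose-length product via a protocol k-factor."""
    return dlp_mgy_cm * k_factor

def size_adjusted_dose(ed_msv: float, thickness_cm: float,
                       reference_cm: float = 30.0, alpha: float = 0.04) -> float:
    """Hypothetical size correction: scale the dose by an exponential in the
    difference between patient thickness and a reference thickness.
    `reference_cm` and `alpha` are illustrative, not values from the study."""
    return ed_msv * math.exp(alpha * (reference_cm - thickness_cm))

ed = effective_dose(dlp_mgy_cm=500.0, k_factor=0.015)  # abdomen-like k (illustrative)
print(f"ED = {ed:.2f} mSv, ED_adj for a 24 cm patient = "
      f"{size_adjusted_dose(ed, 24.0):.2f} mSv")
```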

  20. Beyond crosswalks: reliability of exposure assessment following automated coding of free-text job descriptions for occupational epidemiology.

    PubMed

    Burstyn, Igor; Slutsky, Anton; Lee, Derrick G; Singer, Alison B; An, Yuan; Michael, Yvonne L

    2014-05-01

    Epidemiologists typically collect narrative descriptions of occupational histories because these are less prone than self-reported exposures to recall bias of exposure to a specific hazard. However, the task of coding these narratives can be daunting and prohibitively time-consuming in some settings. The aim of this manuscript is to evaluate the performance of a computer algorithm to translate the narrative description of occupational codes into standard classification of jobs (2010 Standard Occupational Classification) in an epidemiological context. The fundamental question we address is whether exposure assignment resulting from manual (presumed gold standard) coding of the narratives is materially different from that arising from the application of automated coding. We pursued our work through three motivating examples: assessment of physical demands in the Women's Health Initiative observational study, evaluation of predictors of exposure to coal tar pitch volatiles in the US Occupational Safety and Health Administration's (OSHA) Integrated Management Information System, and assessment of exposure to agents known to cause occupational asthma in a pregnancy cohort. In these diverse settings, we demonstrate that automated coding of occupations results in assignment of exposures that are in reasonable agreement with results that can be obtained through manual coding. The correlation between physical demand scores based on manual and automated job classification schemes was reasonable (r = 0.5). The agreement between predictive probability of exceeding the OSHA's permissible exposure level for polycyclic aromatic hydrocarbons, using coal tar pitch volatiles as a surrogate, based on manual and automated coding of jobs was modest (Kendall rank correlation = 0.29). 
In the case of binary assignment of exposure to asthmagens, we observed that fair to excellent agreement in classifications can be reached, depending on presence of ambiguity in assigned job classification (κ
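
    Agreement between manual and automated binary exposure assignment is summarized with statistics such as Cohen's kappa, which can be computed directly. The label sequences below are toy data for illustration:

```python
import numpy as np

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two categorical label sequences."""
    a = np.asarray(rater_a)
    b = np.asarray(rater_b)
    labels = np.union1d(a, b)
    p_o = np.mean(a == b)                                   # observed agreement
    p_e = sum(np.mean(a == c) * np.mean(b == c) for c in labels)  # chance agreement
    return (p_o - p_e) / (1.0 - p_e)

manual    = [1, 1, 0, 1, 0, 0, 1, 0]  # manual coder: exposed / not exposed
automated = [1, 1, 0, 0, 0, 0, 1, 1]  # automated coder
print(f"kappa = {cohens_kappa(manual, automated):.2f}")  # → kappa = 0.50
```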

  1. LAVA (Los Alamos Vulnerability and Risk Assessment Methodology): A conceptual framework for automated risk analysis

    SciTech Connect

    Smith, S.T.; Lim, J.J.; Phillips, J.R.; Tisinger, R.M.; Brown, D.C.; FitzGerald, P.D.

    1986-01-01

    At Los Alamos National Laboratory, we have developed an original methodology for performing risk analyses on subject systems characterized by a general set of asset categories, a general spectrum of threats, a definable system-specific set of safeguards protecting the assets from the threats, and a general set of outcomes resulting from threats exploiting weaknesses in the safeguards system. The Los Alamos Vulnerability and Risk Assessment Methodology (LAVA) models complex systems having large amounts of "soft" information about both the system itself and occurrences related to the system. Its structure lends itself well to automation on a portable computer, making it possible to analyze numerous similar but geographically separated installations consistently and in as much depth as the subject system warrants. LAVA is based on hierarchical systems theory, event trees, fuzzy sets, natural-language processing, decision theory, and utility theory. LAVA's framework is a hierarchical set of fuzzy event trees that relate the results of several embedded (or sub-) analyses: a vulnerability assessment providing information about the presence and efficacy of system safeguards, a threat analysis providing information about static (background) and dynamic (changing) threat components coupled with an analysis of asset "attractiveness" to the dynamic threat, and a consequence analysis providing information about the outcome spectrum's severity measures and impact values. By using LAVA, we have modeled our widely used computer security application as well as LAVA/CS systems for physical protection, transborder data flow, contract awards, and property management. It is presently being applied for modeling risk management in embedded systems, survivability systems, and weapons systems security. LAVA is especially effective in modeling subject systems that include a large human component.

  2. A novel automated behavioral test battery assessing cognitive rigidity in two genetic mouse models of autism

    PubMed Central

    Puścian, Alicja; Łęski, Szymon; Górkiewicz, Tomasz; Meyza, Ksenia; Lipp, Hans-Peter; Knapska, Ewelina

    2014-01-01

    Repetitive behaviors are a key feature of many pervasive developmental disorders, such as autism. As a heterogeneous group of symptoms, repetitive behaviors are conceptualized into two main subgroups: sensory/motor (lower-order) and cognitive rigidity (higher-order). Although lower-order repetitive behaviors are measured in mouse models in several paradigms, so far there have been no high-throughput tests directly measuring cognitive rigidity. We describe a novel approach for monitoring repetitive behaviors during reversal learning in mice in the automated IntelliCage system. During the reward-motivated place preference reversal learning, designed to assess cognitive abilities of mice, visits to the previously rewarded places were recorded to measure cognitive flexibility. Thereafter, emotional flexibility was assessed by measuring conditioned fear extinction. Additionally, to look for neuronal correlates of cognitive impairments, we measured CA3-CA1 hippocampal long term potentiation (LTP). To standardize the designed tests we used C57BL/6 and BALB/c mice, representing two genetic backgrounds, for induction of autism by prenatal exposure to the sodium valproate. We found impairments of place learning related to perseveration and no LTP impairments in C57BL/6 valproate-treated mice. In contrast, BALB/c valproate-treated mice displayed severe deficits of place learning not associated with perseverative behaviors and accompanied by hippocampal LTP impairments. Alterations of cognitive flexibility observed in C57BL/6 valproate-treated mice were related to neither restricted exploration pattern nor to emotional flexibility. Altogether, we showed that the designed tests of cognitive performance and perseverative behaviors are efficient and highly replicable. Moreover, the results suggest that genetic background is crucial for the behavioral effects of prenatal valproate treatment. PMID:24808839

  3. Assessment of anti-Salmonella activity of boot dip samples.

    PubMed

    Rabie, André J; McLaren, Ian M; Breslin, Mark F; Sayers, Robin; Davies, Rob H

    2015-01-01

    The introduction of pathogens from the external environment into poultry houses via the boots of farm workers and visitors presents a significant risk. The use of boot dips containing disinfectant to help prevent this from happening is common practice, but the effectiveness of these boot dips as a preventive measure can vary. The aim of this study was to assess the anti-Salmonella activity of boot dips that are being used on poultry farms. Boot dip samples were collected from commercial laying hen farms in the UK and tested within 24 hours of receipt at the laboratory to assess their anti-Salmonella activity. All boot dip samples were tested against a field strain of Salmonella enterica serovar Enteritidis using three test models: pure culture, paper disc surface matrix, and yeast suspension. Of the 112 boot dip samples tested, 83.6% were effective against Salmonella in pure culture, 37.3% in the paper disc surface matrix model, and 44.5% in the yeast suspension model. Numerous factors may influence the efficacy of the disinfectants. Disinfectants used in the dips may not always be fully active against surface or organic matter contamination; they may be inaccurately measured or diluted to a concentration other than that specified or recommended; dips may not be changed regularly or may have been exposed to rain and other environmental elements. This study showed that boot dips in use on poultry farms are frequently ineffective. PMID:25650744

  4. Influence of commonly used primer systems on automated ribosomal intergenic spacer analysis of bacterial communities in environmental samples.

    PubMed

    Purahong, Witoon; Stempfhuber, Barbara; Lentendu, Guillaume; Francioli, Davide; Reitz, Thomas; Buscot, François; Schloter, Michael; Krüger, Dirk

    2015-01-01

    Due to the high diversity of bacteria in many ecosystems, their slow generation times, specific but mostly unknown nutrient requirements, and syntrophic interactions, isolation-based approaches in microbial ecology mostly fail to describe microbial community structure. Thus, cultivation-independent techniques, which rely on nucleic acids extracted directly from the environment, are a widely used alternative. For example, bacterial automated ribosomal intergenic spacer analysis (B-ARISA) is one of the widely used methods for fingerprinting bacterial communities after PCR-based amplification of selected regions of the operon coding for rRNA genes using community DNA. However, B-ARISA alone does not provide any taxonomic information, and the results may be severely biased by the choice of primer set. Furthermore, amplified DNA stemming from mitochondrial or chloroplast templates might strongly bias the obtained fingerprints. In this study, we determined the applicability of three different B-ARISA primer sets to the study of bacterial communities. The results from in silico analysis harnessing publicly available sequence databases showed that all three primer sets tested are specific to bacteria, but only two primer sets assure high bacterial taxa coverage (1406f/23Sr and ITSF/ITSReub). Considering the study of bacteria at a plant interface, the primer set ITSF/ITSReub was found to amplify (in silico) sequences of some important crop species such as Sorghum bicolor and Zea mays. Bacterial genera and plant species potentially amplified by the different primer sets are given. These data were confirmed when DNA extracted from soil and plant samples was analyzed. The presented information could be useful when interpreting existing B-ARISA results and planning B-ARISA experiments, especially when plant DNA can be expected. PMID:25749323

  5. Assessment of the application of an automated electronic milk analyzer for the enumeration of total bacteria in raw goat milk.

    PubMed

    Ramsahoi, L; Gao, A; Fabri, M; Odumeru, J A

    2011-07-01

    Automated electronic milk analyzers for rapid enumeration of total bacteria counts (TBC) are widely used for raw milk testing by many analytical laboratories worldwide. In Ontario, Canada, Bactoscan flow cytometry (BsnFC; Foss Electric, Hillerød, Denmark) is the official anchor method for TBC in raw cow milk. Penalties are levied at the BsnFC equivalent level of 50,000 cfu/mL, the standard plate count (SPC) regulatory limit. This study was conducted to assess the BsnFC for TBC in raw goat milk, to determine the mathematical relationship between the SPC and BsnFC methods, and to identify probable reasons for the difference in the SPC:BsnFC equivalents for goat and cow milks. Test procedures were conducted according to International Dairy Federation Bulletin guidelines. Approximately 115 farm bulk tank milk samples per month were tested for inhibitor residues, SPC, BsnFC, psychrotrophic bacteria count, composition (fat, protein, lactose, lactose and other solids, and freezing point), and somatic cell count from March 2009 to February 2010. Data analysis of the results for the samples tested indicated that the BsnFC method would be a good alternative to the SPC method, providing accurate and more precise results with a faster turnaround time. Although a linear regression model showed good correlation and prediction, tests for linearity indicated that the relationship was linear only beyond log 4.1 SPC. The logistic growth curve best modeled the relationship between the SPC and BsnFC for the entire sample population. The BsnFC equivalent to the SPC 50,000 cfu/mL regulatory limit was estimated to be 321,000 individual bacteria count (ibc)/mL. This estimate differs considerably from the BsnFC equivalent for cow milk (121,000 ibc/mL). Because of the low frequency of bulk tank milk pickups at goat farms, 78.5% of the samples had their oldest milking in the tank to be 6.5 to 9.0 d old when tested, compared with the cow milk samples, which had their oldest milking at 4 d
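
    Deriving a BsnFC equivalent for a regulatory SPC limit amounts to fitting paired measurements and inverting the fit at the limit. A sketch using a linear fit on the log10 scale, as the study suggests beyond log 4.1 SPC; the paired values below are invented for illustration, not data from the study:

```python
import numpy as np

# Hypothetical paired results (SPC in cfu/mL, BsnFC in ibc/mL); illustrative only.
spc   = np.array([2.0e4, 5.0e4, 1.0e5, 3.0e5, 8.0e5])
bsnfc = np.array([1.1e5, 3.3e5, 7.0e5, 2.4e6, 7.5e6])

# Linear fit on the log10 scale: log10(BsnFC) = m * log10(SPC) + c
m, c = np.polyfit(np.log10(spc), np.log10(bsnfc), 1)

def bsnfc_equivalent(spc_limit: float) -> float:
    """Predicted BsnFC reading corresponding to a given SPC value."""
    return float(10 ** (m * np.log10(spc_limit) + c))

print(f"BsnFC equivalent of SPC 50,000 cfu/mL: {bsnfc_equivalent(5.0e4):,.0f} ibc/mL")
```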

  6. Mixed species radioiodine air sampling readout and dose assessment system

    DOEpatents

    Distenfeld, Carl H.; Klemish, Jr., Joseph R.

    1978-01-01

    This invention provides a simple, reliable, inexpensive and portable means and method for determining the thyroid dose rate of mixed airborne species of solid and gaseous radioiodine without requiring highly skilled personnel, such as health physicists or electronics technicians. To this end, this invention provides a means and method for sampling a gas from a source of a mixed species of solid and gaseous radioiodine for collection of the mixed species and readout and assessment of the emissions therefrom by cylindrically, concentrically and annularly molding the respective species around a cylindrical passage for receiving a conventional probe-type Geiger-Mueller radiation detector.

  7. Automated gravimetric sample pretreatment using an industrial robot for the high-precision determination of plutonium by isotope dilution mass spectrometry.

    PubMed

    Surugaya, Naoki; Hiyama, Toshiaki; Watahiki, Masaru

    2008-06-01

    A robotized sample-preparation method for the determination of Pu, which is recovered by extraction reprocessing of spent nuclear fuel, by isotope dilution mass spectrometry (IDMS) is described. The automated system uses a six-axis industrial robot, whose motility is very fast, accurate, and flexible, installed in a glove box. The automation of the weighing and dilution steps enables operator-unattended sample pretreatment for the high-precision analysis of Pu in aqueous solutions. Using the developed system, the Pu concentration in a HNO(3) medium was successfully determined using a set of subsequent mass spectrometric measurements. The relative uncertainty in determining the Pu concentration by IDMS using this system was estimated to be less than 0.1% (k = 2), which is equal to that expected of a talented analyst. The operation time required was the same as that for a skilled operator.

  8. Performance assessment of automated tissue characterization for prostate H&E-stained histopathology

    NASA Astrophysics Data System (ADS)

    DiFranco, Matthew D.; Reynolds, Hayley M.; Mitchell, Catherine; Williams, Scott; Allan, Prue; Haworth, Annette

    2015-03-01

    Reliable automated prostate tumor detection and characterization in whole-mount histology images is sought in many applications, including post-resection tumor staging and as ground-truth data for multi-parametric MRI interpretation. In this study, an ensemble-based supervised classification algorithm for high-resolution histology images was trained on tile-based image features including histogram and gray-level co-occurrence statistics. The algorithm was assessed using different combinations of H&E prostate slides from two separate medical centers and at two different magnifications (400x and 200x), with the aim of applying tumor classification models to new data. Slides from both datasets were annotated by expert pathologists in order to identify homogeneous cancerous and non-cancerous tissue regions of interest, which were then categorized as (1) low-grade tumor (LG-PCa), including Gleason 3 and high-grade prostatic intraepithelial neoplasia (HG-PIN), (2) high-grade tumor (HG-PCa), including various Gleason 4 and 5 patterns, or (3) non-cancerous, including benign stroma and benign prostatic hyperplasia (BPH). Classification models for both LG-PCa and HG-PCa were separately trained using a support vector machine (SVM) approach, and per-tile tumor prediction maps were generated from the resulting ensembles. Results showed high sensitivity for predicting HG-PCa, with an AUC up to 0.822 using training data from both medical centers, while LG-PCa showed a lower sensitivity of 0.763 with the same training data. Visual inspection of cancer probability heatmaps from 9 patients showed that 17/19 tumors were detected, and HG-PCa generally produced fewer false positives than LG-PCa.
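
    Per-tile gray-level co-occurrence statistics like those used for training can be computed with plain NumPy. A simplified sketch; the quantization level and single-pixel displacement are chosen for illustration:

```python
import numpy as np

def glcm(tile: np.ndarray, levels: int = 8, dx: int = 1, dy: int = 0) -> np.ndarray:
    """Normalized gray-level co-occurrence matrix for one displacement (dx, dy)."""
    q = (tile.astype(float) / 256.0 * levels).astype(int)  # quantize to `levels` bins
    m = np.zeros((levels, levels))
    h, w = q.shape
    for y in range(h - dy):
        for x in range(w - dx):
            m[q[y, x], q[y + dy, x + dx]] += 1
    return m / m.sum()

def glcm_features(m: np.ndarray) -> dict:
    """A few classic Haralick-style statistics derived from a GLCM."""
    i, j = np.indices(m.shape)
    return {
        "contrast": float(np.sum(m * (i - j) ** 2)),
        "energy": float(np.sum(m ** 2)),
        "homogeneity": float(np.sum(m / (1.0 + np.abs(i - j)))),
    }

rng = np.random.default_rng(1)
tile = rng.integers(0, 256, size=(64, 64)).astype(np.uint8)  # stand-in for an H&E tile
print(glcm_features(glcm(tile)))
```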

  9. Fully Automated Assessment of the Severity of Parkinson’s Disease from Speech

    PubMed Central

    Bayestehtashk, Alireza; Asgari, Meysam; Shafran, Izhak; McNames, James

    2014-01-01

    For several decades now, there has been sporadic interest in automatically characterizing the speech impairment due to Parkinson’s disease (PD). Most early studies were confined to quantifying a few speech features that were easy to compute. More recent studies have adopted a machine learning approach where a large number of potential features are extracted and the models are learned automatically from the data. In the same vein, here we characterize the disease using a relatively large cohort of 168 subjects, collected from multiple (three) clinics. We elicited speech using three tasks – the sustained phonation task, the diadochokinetic task and a reading task, all within a time budget of 4 minutes, prompted by a portable device. From these recordings, we extracted 1582 features for each subject using openSMILE, a standard feature extraction tool. We compared the effectiveness of three strategies for learning a regularized regression and find that ridge regression performs better than lasso and support vector regression for our task. We refine the feature extraction to capture pitch-related cues, including jitter and shimmer, more accurately using a time-varying harmonic model of speech. Our results show that the severity of the disease can be inferred from speech with a mean absolute error of about 5.5, explaining 61% of the variance and consistently well-above chance across all clinics. Of the three speech elicitation tasks, we find that the reading task is significantly better at capturing cues than diadochokinetic or sustained phonation task. In all, we have demonstrated that the data collection and inference can be fully automated, and the results show that speech-based assessment has promising practical application in PD. The techniques reported here are more widely applicable to other paralinguistic tasks in clinical domain. PMID:25382935
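
    Ridge regression, which the study found to outperform lasso and support vector regression for this task, has a simple closed form. A toy sketch on a synthetic stand-in for the openSMILE feature matrix (the dimensions echo the study's cohort, the data do not):

```python
import numpy as np

def ridge_fit(X: np.ndarray, y: np.ndarray, lam: float = 1.0) -> np.ndarray:
    """Closed-form ridge regression: w = (X^T X + lam * I)^-1 X^T y."""
    d = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ y)

# Toy stand-in: 168 subjects, 20 of the 1582 features, synthetic severity scores.
rng = np.random.default_rng(42)
X = rng.normal(size=(168, 20))
true_w = rng.normal(size=20)
y = X @ true_w + rng.normal(scale=0.5, size=168)  # scores plus noise

w = ridge_fit(X, y, lam=10.0)
mae = float(np.mean(np.abs(X @ w - y)))
print(f"training MAE: {mae:.2f}")
```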

  10. Cardiac activity in marine invertebrates in response to pollutants: Automated interpulse duration assessment

    SciTech Connect

    Lundebye, A.K.; Curtis, T.; Depledge, M.H.

    1995-12-31

    The updated method of the Computer-Aided Physiological Monitoring (CAPMON) system was used to study the effects of copper exposure on cardiac activity in the shore crab (Carcinus maenas) and the common mussel (Mytilus edulis). This new Automated Interpulse Duration Assessment (AIDA) system measures the time interval between heart beats, and was found to be a more sensitive tool for evaluating cardiac responses to pollutant exposure than other techniques. In addition to information regarding heart rate, also obtained by the CAPMON system (as beats per minute), the new system enables frequency distribution analysis of interpulse duration. An experiment involving C. maenas examined the effects of short-term (24 h) and chronic (4 week) exposure to copper concentrations of 0, 0.2, 0.4, 0.6, and 0.8 mg l⁻¹ Cu. Subsequent recovery (6 weeks) of cardiac activity was also examined. In a second experiment, mussels were exposed to one of five copper concentrations (in the range of 0-0.1 mg l⁻¹ Cu) and "normal" cardiac activity was compared with activity after copper exposure. A dose-response relationship was established between copper concentration and heart rate in crabs. The control group had the longest mean interpulse duration, and mean interpulse duration decreased in a concentration-dependent manner for the copper treatments, reflecting an increase in heart rate. The distribution of interpulse duration changed from a variable, rather wide distribution in control crabs to a sharp-peaked normal distribution in exposed crabs. Results after 4 weeks of exposure were not significantly different from those found after 24 h. Return to normal cardiac activity was evident after a 6-week "recovery" period. Results from the mussel experiment showed burst activity followed by a decline in heart rate in response to copper exposure.
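
    Interpulse duration analysis reduces to differencing detected beat timestamps; the frequency distribution and mean rate then follow directly. A minimal sketch with invented timestamps:

```python
import numpy as np

def interpulse_durations(beat_times_s: np.ndarray) -> np.ndarray:
    """Interpulse durations (s) from a series of detected beat timestamps."""
    return np.diff(beat_times_s)

def heart_rate_bpm(beat_times_s: np.ndarray) -> float:
    """Mean heart rate (beats per minute) from the same timestamps."""
    return 60.0 / float(interpulse_durations(beat_times_s).mean())

# Illustrative beat timestamps (s), not measured data.
beats = np.array([0.0, 0.8, 1.6, 2.5, 3.5, 4.6, 5.6, 6.5, 7.3])
ipd = interpulse_durations(beats)
print(f"mean IPD = {ipd.mean():.2f} s, rate = {heart_rate_bpm(beats):.1f} bpm")
```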

  11. Data Quality Verification at STScI - Automated Assessment and Your Data

    NASA Astrophysics Data System (ADS)

    Dempsey, R.; Swade, D.; Scott, J.; Hamilton, F.; Holm, A.

    1996-12-01

    As satellite-based observatories deliver ever wider varieties and more complex types of scientific data, the work of analyzing and reducing those data grows in step. It is therefore imperative that Guest Observers or Archival Researchers have access to an accurate, consistent, and easily understandable summary of the quality of their data. Previously at STScI, an astronomer would display and examine the quality and scientific usefulness of every observation obtained with HST. Recently, this process has undergone a major reorganization at the Institute: the majority of data are now assessed automatically, with little or no human intervention. As part of routine processing in the OSS--PODPS Unified System (OPUS), the Observatory Monitoring System (OMS) observation logs, the science processing trailer file (also known as the TRL file), and the science data headers are inspected by an automated tool, AUTO_DQ. AUTO_DQ then determines whether any anomalous events occurred during the observation, or during processing and calibration of the data, that affect the procedural quality of the data. The results are placed directly into the Procedural Data Quality (PDQ) file as a string of predefined data-quality keywords and comments. These in turn are used by the Contact Scientist (CS) to check the scientific usefulness of the observations. In this manner the telemetry stream is checked for known problems such as losses of lock, re-centerings, or degraded guiding, while missing data or calibration errors are also easily flagged. If a problem is serious, the data are queued for manual inspection by an astronomer. The success of every target acquisition is verified manually. If serious failures are confirmed, the PI and the scheduling staff are notified so that options for rescheduling the observations can be explored.

  12. Passive sampling methods for contaminated sediments: risk assessment and management.

    PubMed

    Greenberg, Marc S; Chapman, Peter M; Allan, Ian J; Anderson, Kim A; Apitz, Sabine E; Beegan, Chris; Bridges, Todd S; Brown, Steve S; Cargill, John G; McCulloch, Megan C; Menzie, Charles A; Shine, James P; Parkerton, Thomas F

    2014-04-01

    This paper details how activity-based passive sampling methods (PSMs), which provide information on bioavailability in terms of freely dissolved contaminant concentrations (Cfree), can be used to better inform risk-management decision making at multiple points in the process of assessing and managing contaminated sediment sites. PSMs can increase certainty in site investigation and management, because Cfree is a better predictor of bioavailability than total bulk sediment concentration (Ctotal) for 4 key endpoints included in conceptual site models (benthic organism toxicity, bioaccumulation, sediment flux, and water column exposures). The use of passive sampling devices (PSDs) presents challenges with respect to representative sampling for estimating average concentrations and other metrics relevant for exposure and risk assessment. These challenges can be addressed by designing studies that account for sources of variation associated with PSMs and by considering appropriate spatial scales to meet study objectives. Possible applications of PSMs include quantifying spatial and temporal trends in bioavailable contaminants, identifying and evaluating contaminant source contributions, calibrating site-specific models, and improving weight-of-evidence-based decision frameworks. PSM data can be used to assist in delineating sediment management zones based on likelihood of exposure effects, to monitor remedy effectiveness, and to evaluate risk reduction after sediment treatment, disposal, or beneficial reuse. Examples are provided illustrating why PSMs and Cfree should be incorporated into contaminated sediment investigations and study designs to better focus on and understand contaminant bioavailability, more accurately estimate exposure to sediment-associated contaminants, and better inform risk-management decisions. Research and communication needs for encouraging broader use are discussed.
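    For equilibrium passive samplers, Cfree is commonly back-calculated from the analyte mass accumulated in the sampler and a sampler-water partition coefficient. This relation is standard equilibrium-partitioning practice rather than something stated in the abstract, and the values below are hypothetical:

```python
def c_free(n_psm_ng, k_psm_water, v_psm_l):
    """Freely dissolved concentration (ng/L) from equilibrium PSM uptake:
    Cfree = N_PSM / (K_PSM-water * V_PSM)
    n_psm_ng:    analyte mass accumulated in the sampler phase (ng)
    k_psm_water: sampler-water partition coefficient (L water / L sampler)
    v_psm_l:     volume of the sampler phase (L)
    """
    return n_psm_ng / (k_psm_water * v_psm_l)

# Hypothetical sampler: 500 ng accumulated, K = 1e5, 0.1 mL polymer phase
c = c_free(500.0, 1e5, 1e-4)   # roughly 50 ng/L freely dissolved
```

    The same Cfree estimate then feeds the exposure endpoints listed above (toxicity, bioaccumulation, flux) instead of the bulk Ctotal.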

  14. Automated Liquid Microjunction Surface Sampling-HPLC-MS/MS Analysis of Drugs and Metabolites in Whole-Body Thin Tissue Sections

    SciTech Connect

    Kertesz, Vilmos; Van Berkel, Gary J

    2013-01-01

    A fully automated liquid extraction-based surface sampling system utilizing a commercially available autosampler coupled to high performance liquid chromatography-tandem mass spectrometry (HPLC-MS/MS) detection is reported. Selection of discrete spots for droplet-based sampling and automated generation of sample queues for both the autosampler and the MS were enabled by in-house-developed software. In addition, co-registration of spatially resolved sampling positions with HPLC-MS information to generate heatmaps of the monitored compounds for subsequent data analysis was also available in the software. The system was evaluated with whole-body thin tissue sections from a propranolol-dosed rat. The hands-free operation of the system was demonstrated by creating heatmaps of the parent drug and its hydroxypropranolol glucuronide metabolites with 1 mm resolution in the areas of interest. The sample throughput was approximately 5 min/sample, defined by the time needed for chromatographic separation. The spatial distributions of both the drug and its metabolites were consistent with previous studies employing other liquid extraction-based surface sampling methodologies.

  15. Automated extraction of DNA from blood and PCR setup using a Tecan Freedom EVO liquid handler for forensic genetic STR typing of reference samples.

    PubMed

    Stangegaard, Michael; Frøslev, Tobias G; Frank-Hansen, Rune; Hansen, Anders J; Morling, Niels

    2011-04-01

    We have implemented and validated automated protocols for DNA extraction and PCR setup using a Tecan Freedom EVO liquid handler mounted with the Te-MagS magnetic separation device (Tecan, Männedorf, Switzerland). The protocols were validated for accredited forensic genetic work according to ISO 17025 using the Qiagen MagAttract DNA Mini M48 kit (Qiagen GmbH, Hilden, Germany) on fresh whole blood and blood from deceased individuals. The workflow was simplified by returning the DNA extracts to the original tubes, minimizing the risk of misplacing samples. The tubes that originally contained the samples were washed with MilliQ water before the return of the DNA extracts. PCR was set up in 96-well microtiter plates. The methods were validated for the kits AmpFℓSTR Identifiler, SGM Plus and Yfiler (Applied Biosystems, Foster City, CA), and GenePrint FFFL and PowerPlex Y (Promega, Madison, WI). The automated protocols allowed extraction and addition of PCR master mix for 96 samples within 3.5 h. In conclusion, we demonstrated that (1) DNA extraction with magnetic beads and (2) PCR setup for accredited, forensic genetic short tandem repeat typing can be implemented on a simple automated liquid handler, leading to reduced manual work and increased quality and throughput. PMID:21609694

  16. Experimental Assessment of Mouse Sociability Using an Automated Image Processing Approach.

    PubMed

    Varghese, Frency; Burket, Jessica A; Benson, Andrew D; Deutsch, Stephen I; Zemlin, Christian W

    2016-01-01

    The mouse is the preferred model organism for testing drugs designed to increase sociability. We present a method to quantify mouse sociability in which the test mouse is placed in a standardized apparatus and relevant behaviors are assessed in three different sessions (sessions I, II, and III). The apparatus has three compartments (see Figure 1); the left and right compartments each contain an inverted cup, which can house a mouse (the "stimulus mouse"). In session I, the test mouse is placed in the cage and its mobility is characterized by the number of transitions made between compartments. In session II, a stimulus mouse is placed under one of the inverted cups and the sociability of the test mouse is quantified by the amount of time it spends near the cup containing the enclosed stimulus mouse vs. the empty inverted cup. In session III, the inverted cups are removed and both mice interact freely. The sociability of the test mouse in session III is quantified by the number of social approaches it makes toward the stimulus mouse and by the number of times it avoids a social approach by the stimulus mouse. Automated evaluation of the movie detects the nose of the test mouse, which allows the determination of all described sociability measures in sessions I and II (in session III, approaches are identified automatically but classified manually). To find the nose, the image of an empty cage is digitally subtracted from each frame of the movie and the resulting image is binarized to identify the mouse pixels. The mouse tail is automatically removed and the two most distant points of the remaining mouse are determined; these are close to the nose and the base of the tail. By analyzing the motion of the mouse and using continuity arguments, the nose is identified. Figure 1. Assessment of sociability during the three sessions. Session I (top): acclimation of the test mouse to the cage. Session II (middle): test mouse moving freely in the cage while the stimulus mouse is enclosed in an inverted cup.
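    The endpoint-finding step described above (the two most distant pixels of the binarized, tail-stripped mouse) can be sketched in a few lines; the toy pixel coordinates are hypothetical, not data from the study:

```python
from itertools import combinations
from math import dist

def farthest_pixel_pair(pixels):
    """Two most distant foreground pixels of the binarized mouse mask;
    after automated tail removal these approximate nose and tail base."""
    return max(combinations(pixels, 2), key=lambda pair: dist(*pair))

# Toy mask: background-subtracted, binarized 'mouse' pixels as (row, col)
mouse_pixels = [(2, 2), (3, 3), (4, 4), (7, 7)]
endpoints = farthest_pixel_pair(mouse_pixels)   # ((2, 2), (7, 7))
```

    The brute-force pairwise search is quadratic in the number of mask pixels; for full video frames a convex-hull prefilter would keep it fast, but the logic is the same. Disambiguating which endpoint is the nose then uses the motion-continuity argument described above.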

  17. Robotic Mars Sample Return: Risk Assessment and Analysis Report

    NASA Technical Reports Server (NTRS)

    Lalk, Thomas R.; Spence, Cliff A.

    2003-01-01

    A comparison of the risk associated with two alternative scenarios for a robotic Mars sample return mission was conducted. Two alternative mission scenarios were identified, the Jet Propulsion Lab (JPL) reference Mission and a mission proposed by Johnson Space Center (JSC). The JPL mission was characterized by two landers and an orbiter, and a Mars orbit rendezvous to retrieve the samples. The JSC mission (Direct/SEP) involves a solar electric propulsion (SEP) return to earth followed by a rendezvous with the space shuttle in earth orbit. A qualitative risk assessment to identify and characterize the risks, and a risk analysis to quantify the risks were conducted on these missions. Technical descriptions of the competing scenarios were developed in conjunction with NASA engineers and the sequence of events for each candidate mission was developed. Risk distributions associated with individual and combinations of events were consolidated using event tree analysis in conjunction with Monte Carlo techniques to develop probabilities of mission success for each of the various alternatives. The results were the probability of success of various end states for each candidate scenario. These end states ranged from complete success through various levels of partial success to complete failure. Overall probability of success for the Direct/SEP mission was determined to be 66% for the return of at least one sample and 58% for the JPL mission for the return of at least one sample cache. Values were also determined for intermediate events and end states as well as for the probability of violation of planetary protection. Overall mission planetary protection event probabilities of occurrence were determined to be 0.002% and 1.3% for the Direct/SEP and JPL Reference missions respectively.
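    The event-tree analysis combined with Monte Carlo sampling that the report describes can be illustrated with a minimal sketch; the stage structure and reliabilities below are hypothetical placeholders, not values from the JPL or JSC assessments:

```python
import random

def mission_success_probability(stage_reliabilities, trials=100_000, seed=1):
    """Monte Carlo event-tree estimate: the mission end state is 'success'
    only if every sequential stage (e.g. launch, landing, ascent,
    rendezvous) succeeds; each stage is an independent Bernoulli trial."""
    rng = random.Random(seed)
    successes = sum(
        all(rng.random() < p for p in stage_reliabilities)
        for _ in range(trials)
    )
    return successes / trials

# Four hypothetical stages; analytic product = 0.9 * 0.95 * 0.9 * 0.9 = 0.69255
p_hat = mission_success_probability([0.9, 0.95, 0.9, 0.9])
```

    Real event trees also enumerate the partial-success branches (e.g. one of two sample caches returned) by recording which stage failed rather than collapsing everything to a single success flag.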

  18. Fully automated, quantitative, noninvasive assessment of collagen fiber content and organization in thick collagen gels

    NASA Astrophysics Data System (ADS)

    Bayan, Christopher; Levitt, Jonathan M.; Miller, Eric; Kaplan, David; Georgakoudi, Irene

    2009-05-01

    Collagen is the most prominent protein of human tissues. Its content and organization define to a large extent the mechanical properties of tissue as well as its function. Methods that have been used traditionally to visualize and analyze collagen are invasive, provide only qualitative or indirect information, and have limited use in studies that aim to understand the dynamic nature of collagen remodeling and its interactions with the surrounding cells and other matrix components. Second harmonic generation (SHG) imaging has emerged as a promising noninvasive modality for providing high-resolution images of collagen fibers within thick specimens, such as tissues. In this article, we present a fully automated procedure to acquire quantitative information on the content, orientation, and organization of collagen fibers. We use this procedure to monitor the dynamic remodeling of collagen gels in the absence or presence of fibroblasts over periods of 12 or 14 days. We find that an adaptive thresholding and stretching approach provides great insight into the content of collagen fibers within SHG images without the need for user input. An additional feature-erosion and feature-dilation step is useful for preserving structure and removing noise in images with low signal. To quantitatively assess the orientation of collagen fibers, we extract the orientation index (OI), a parameter based on the power distribution of the spatial-frequency-averaged, two-dimensional Fourier transform of the SHG images. To measure the local organization of the collagen fibers, we compute the Hough transform of small tiles of the image and the entropy distribution, which represents the probability of finding the direction of fibers along a dominant direction. Using these methods we observed that the presence and number of fibroblasts within the collagen gel significantly affects the remodeling of the collagen matrix.
In the absence of fibroblasts, gels contract, especially during the first few
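    As a rough illustration of frequency-domain orientation analysis of this kind, the sketch below bins 2-D FFT power by angle and scores alignment with a Shannon entropy. This is a generic stand-in, not the paper's exact orientation-index or Hough-entropy definition:

```python
import numpy as np

def angular_power(img, nbins=18):
    """Distribution of 2-D FFT power over orientation angle. A peaked
    distribution indicates aligned fibers; a flat one, random orientation."""
    F = np.fft.fftshift(np.fft.fft2(img - img.mean()))   # remove DC first
    power = np.abs(F) ** 2
    h, w = img.shape
    y, x = np.mgrid[-h // 2:h - h // 2, -w // 2:w - w // 2]
    theta = np.mod(np.arctan2(y, x), np.pi)              # fold to [0, pi)
    hist, _ = np.histogram(theta, bins=nbins, range=(0, np.pi), weights=power)
    return hist / hist.sum()

def orientation_entropy(p):
    """Shannon entropy of the angular distribution (low = strongly aligned)."""
    p = p[p > 0]
    return float(-(p * np.log(p)).sum())

# Synthetic 'fibers': horizontal stripes are far more ordered than noise
stripes = np.tile(np.sin(np.linspace(0, 8 * np.pi, 64))[:, None], (1, 64))
rng = np.random.default_rng(0)
noise = rng.standard_normal((64, 64))
e_stripes = orientation_entropy(angular_power(stripes))
e_noise = orientation_entropy(angular_power(noise))   # e_stripes < e_noise
```

    In the paper's terms, the OI summarizes the global angular power distribution, while tile-wise Hough entropy captures local organization; both reduce to "how concentrated is the orientation evidence," which the entropy above mimics.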

  19. Development and testing of external quality assessment samples for Salmonella detection in poultry samples.

    PubMed

    Martelli, F; Gosling, R; McLaren, I; Wales, A; Davies, R

    2014-10-01

    Salmonella-contaminated poultry house dust plus 10 g chicken faeces inoculated with Salmonella Enteritidis and then frozen for storage and transport were used as candidate external quality assurance test samples. Variations in faeces sample preparation, storage and culture were examined initially. This indicated that, within modest limits, the age of the inoculating culture and of the faeces did not affect detection, nor did swirling the pre-enrichment culture or extending its duration. Under optimal conditions of preparation and storage, Salmonella numbers of 70 colony-forming units (CFU) and above were reliably detected at the originating laboratory. A ring trial was performed, involving 13 external UK laboratories plus the originating laboratory. Faeces samples inoculated with Salmonella Enteritidis were frozen, transported on dry ice and tested by the ISO 6579:2002 (Annex D) method. Detection by the originating laboratory was consistent with the previously established lower limit for reliability of 70 CFU. However, the sensitivity of detection by the external laboratories was apparently poorer in several cases, with significant interlaboratory variation seen at the lowest inoculum level, using Fisher's exact test. Detection of Salmonella in poultry house dust appeared to be more sensitive and uniform among laboratories. Significance and impact of the study: Salmonella surveillance and control regimes in the European poultry industry and elsewhere require sensitive culture detection of Salmonella in environmental samples, including poultry faeces. A ring trial was conducted, and the results highlighted that some of the participating laboratories failed to identify Salmonella. This suggests that contaminated frozen faeces cubes could be beneficial to assess proficiency, according to the results of this preliminary study. 
The data obtained in this study can be used as an indication for the design of realistic external quality assurance for laboratories involved in
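    Interlaboratory differences in detection at a given inoculum, tested here with Fisher's exact test, can be reproduced with a small sketch. The counts below are hypothetical, not the ring-trial data, and a one-sided test is implemented for brevity (libraries such as scipy.stats.fisher_exact offer the two-sided version):

```python
from math import comb

def fisher_exact_one_sided(a, b, c, d):
    """One-sided Fisher's exact test P(X >= a) for the 2x2 table
    [[a, b], [c, d]] (e.g. detected/missed for two labs) under the
    hypergeometric null of equal detection rates."""
    n = a + b + c + d
    row1, col1 = a + b, a + c
    # Sum hypergeometric tail probabilities from the observed count upward
    return sum(
        comb(row1, x) * comb(n - row1, col1 - x)
        for x in range(a, min(row1, col1) + 1)
    ) / comb(n, col1)

# Hypothetical lowest-inoculum result: lab A detects 9/10, lab B only 2/10
p_diff = fisher_exact_one_sided(9, 1, 2, 8)    # small p: labs differ
p_same = fisher_exact_one_sided(5, 5, 5, 5)    # large p: no evidence
```

    Fisher's exact test is the appropriate choice here because the per-level counts (ten or so replicates per laboratory) are too small for a chi-squared approximation.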

  20. Automated analysis of perfluorinated compounds in human hair and urine samples by turbulent flow chromatography coupled to tandem mass spectrometry.

    PubMed

    Perez, Francisca; Llorca, Marta; Farré, Marinella; Barceló, Damià

    2012-03-01

    Perfluorinated compounds (PFCs) are ubiquitous contaminants of humans and animals worldwide. PFCs are bioaccumulated because of their affinity for proteins. It has been shown that they can have a variety of toxicological effects and cause damage to human health, emphasizing the need for sensitive and robust analytical methods to assess their bioaccumulation in humans. In this paper we report the development and validation of an analytical method for analysis of PFCs in the non-invasive human matrices hair and urine. The method is based on rapid and simple sample pre-treatment followed by online turbulent flow liquid chromatography and tandem mass spectrometry (TFC-LC-MS-MS) for analysis of 21 PFCs. The method was validated for both matrices. Percentage recovery was between 60% and 105% for most compounds in both matrices. Limits of quantification ranged from 0.1 to 9 ng mL(-1) in urine and from 0.04 to 13.4 in hair. The good performance of the method was demonstrated by investigating the presence of selected PFCs in 24 hair and 30 urine samples from different donors living in Barcelona (NE Spain). The results were indicative of bioaccumulation of these compounds in both types of sample. PFOS and PFOA were most frequently detected in hair, and PFBA in urine.

  1. AUTOMATED ANALYSIS OF AQUEOUS SAMPLES CONTAINING PESTICIDES, ACIDIC/BASIC/NEUTRAL SEMIVOLATILES AND VOLATILE ORGANIC COMPOUNDS BY SOLID PHASE EXTRACTION COUPLED IN-LINE TO LARGE VOLUME INJECTION GC/MS

    EPA Science Inventory

    Data is presented on the development of a new automated system combining solid phase extraction (SPE) with GC/MS spectrometry for the single-run analysis of water samples containing a broad range of organic compounds. The system uses commercially available automated in-line 10-m...

  2. An automated tool for the design and assessment of space systems

    NASA Technical Reports Server (NTRS)

    Dalcambre, Lois M. L.; Landry, Steve P.

    1990-01-01

    Space systems can be characterized as both large and complex, but they often rely on reusable subcomponents. One problem in the design of such systems is the representation and validation of the system, particularly at the higher levels of management. An automated tool is described for the representation, refinement, and validation of such complex systems based on a formal design theory, the Theory of Plausible Design. In particular, the steps necessary to automate the tool and make it a competent, usable assistant are described.

  3. Sampling for Soil Carbon Stock Assessment in Rocky Agricultural Soils

    NASA Technical Reports Server (NTRS)

    Beem-Miller, Jeffrey P.; Kong, Angela Y. Y.; Ogle, Stephen; Wolfe, David

    2016-01-01

    Coring methods commonly employed in soil organic C (SOC) stock assessment may not accurately capture soil rock fragment (RF) content or soil bulk density (ρb) in rocky agricultural soils, potentially biasing SOC stock estimates. Quantitative pits are considered less biased than coring methods but are invasive and often cost-prohibitive. We compared fixed-depth and mass-based estimates of SOC stocks (0.3 m depth) for hammer, hydraulic push, and rotary coring methods relative to quantitative pits at four agricultural sites ranging in RF content from <0.01 to 0.24 m³ m⁻³. Sampling costs were also compared. Coring methods significantly underestimated RF content at all rocky sites, but significant differences (p < 0.05) in SOC stocks between pits and corers were only found with the hammer method using the fixed-depth approach at the <0.01 m³ m⁻³ RF site (pit, 5.80 kg C m⁻²; hammer, 4.74 kg C m⁻²) and at the 0.14 m³ m⁻³ RF site (pit, 8.81 kg C m⁻²; hammer, 6.71 kg C m⁻²). The hammer corer also underestimated ρb at all sites, as did the hydraulic push corer at the 0.21 m³ m⁻³ RF site. No significant differences in mass-based SOC stock estimates were observed between pits and corers. Our results indicate that (i) calculating SOC stocks on a mass basis can overcome biases in RF and ρb estimates introduced by sampling equipment and (ii) a quantitative pit is the optimal sampling method for establishing reference soil masses, followed by rotary and then hydraulic push corers.
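    The fixed-depth stock calculation at issue can be sketched as follows; the rock-fragment correction is the standard textbook form, and the sample values are illustrative assumptions rather than the study's data:

```python
def soc_stock_fixed_depth(bulk_density, depth, c_conc, rock_frac):
    """Fixed-depth SOC stock (kg C m^-2), excluding rock fragments.
    bulk_density: fine-earth bulk density (kg m^-3)
    depth:        sampling depth (m)
    c_conc:       organic C concentration (kg C per kg fine earth)
    rock_frac:    rock fragment content (m^3 m^-3)
    Any corer-induced underestimate of bulk_density or rock_frac
    propagates linearly into the stock estimate, which is the bias
    the mass-based approach avoids.
    """
    return bulk_density * depth * c_conc * (1.0 - rock_frac)

# Hypothetical profile: 1300 kg/m^3, 0.3 m depth, 1.5% C, 14% rock fragments
stock = soc_stock_fixed_depth(1300.0, 0.3, 0.015, 0.14)   # ~5.03 kg C m^-2

# Same profile with rock fragments missed entirely (rock_frac = 0):
stock_biased = soc_stock_fixed_depth(1300.0, 0.3, 0.015, 0.0)
```

    A mass-based comparison would instead fix a reference fine-earth mass per unit area (established from the pit) and integrate C down to that mass, making the result insensitive to depth-vs-density trade-offs in compacted cores.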

  4. Development of Automated Signal and Meta-data Quality Assessment at the USGS ANSS NOC

    NASA Astrophysics Data System (ADS)

    McNamara, D.; Buland, R.; Boaz, R.; Benz, H.; Gee, L.; Leith, W.

    2007-05-01

    Real-time earthquake processing systems at the Advanced National Seismic System (ANSS) National Operations Center (NOC) rely on high-quality broadband seismic data to compute accurate earthquake locations, moment-tensor solutions, finite-fault models, Shakemaps and impact assessments. The NEIC receives real-time seismic data from the ANSS backbone, the Global Seismographic Network, ANSS regional network operators, foreign regional and national networks, the tsunami warning centers and the International Monitoring System. For many contributed stations, calibration information is not well known. In addition, equipment upgrades or changes may occur, making it difficult to maintain accurate metadata. The high degree of real-time integration of seismic data necessitates the development of automated QC tools and procedures that identify changes in instrument response, waveform quality and other systematic changes in station performance that might affect NEIC computations and products. We present new tools and methods that will allow NEIC and other network operators to evaluate seismic station performance and characteristics in both the time and frequency domains using probability density functions (PDFs) of power spectral densities (PSDs) (McNamara and Buland, 2004). The method involves determining a station's standard noise conditions and characterizing deviations from that standard using the probabilistic distribution of hourly PSDs. We define the standard station noise conditions to lie within the 10th and 90th percentiles of the PSD distribution. The computed PSDs are stored in a database, allowing a user to access specific time periods of PSDs (PDF subsets) and time series segments through a client interface or programmatic database calls. This allows the user to visually define the spectral characteristics of known system transients.
In order to identify instrument response changes or systems transients we compare short-term spectral envelopes (1 hour to 1 day) against
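    The PDF-of-PSDs quality-control idea can be sketched as follows: build per-frequency percentile envelopes from many hourly PSDs, then flag new spectra that leave the 10th-90th percentile band. This is a simplified stand-in (plain windowed periodograms of synthetic noise), not the McNamara & Buland implementation:

```python
import numpy as np

def hourly_psds(segments, fs):
    """One Hann-windowed periodogram (PSD estimate) per hourly segment."""
    psds = []
    for x in segments:
        X = np.fft.rfft(x * np.hanning(len(x)))
        psds.append((np.abs(X) ** 2) / (fs * len(x)))
    return np.array(psds)

def noise_envelope(psds, lo=10, hi=90):
    """Station 'standard noise' band: 10th-90th percentile of the PSD
    distribution at each frequency bin."""
    return np.percentile(psds, [lo, hi], axis=0)

def fraction_outside(psd, envelope):
    """Fraction of frequency bins where a new PSD leaves the standard band;
    a sustained jump suggests an instrument response change or transient."""
    low, high = envelope
    return float(np.mean((psd < low) | (psd > high)))

rng = np.random.default_rng(2)
fs, n = 100, 1024
baseline = [rng.standard_normal(n) for _ in range(50)]   # 50 'hours'
env = noise_envelope(hourly_psds(baseline, fs))
quiet = fraction_outside(hourly_psds([rng.standard_normal(n)], fs)[0], env)
loud = fraction_outside(hourly_psds([5 * rng.standard_normal(n)], fs)[0], env)
```

    A typical hour lands mostly inside the band (roughly 20% of bins outside by construction of the 10th-90th percentiles), while a gain change or noisy hour pushes nearly every bin above the envelope.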

  5. Context Sampling Descriptive Assessment: A Pilot Study of a Further Approach to Functional Assessment

    ERIC Educational Resources Information Center

    Garbutt, Nathalie; Furniss, Frederick

    2007-01-01

    Background: The ability of descriptive assessments to differentiate functions of problem behaviours might be increased by systematically sampling natural contexts characterized by different establishing operations. This study evaluated the stability of such characteristics, and variability in challenging behaviour, for three school contexts.…

  6. Taking Advantage of Automated Assessment of Student-Constructed Graphs in Science

    ERIC Educational Resources Information Center

    Vitale, Jonathan M.; Lai, Kevin; Linn, Marcia C.

    2015-01-01

    We present a new system for automated scoring of graph construction items that address complex science concepts, feature qualitative prompts, and support a range of possible solutions. This system utilizes analysis of spatial features (e.g., slope of a line) to evaluate potential student ideas represented within graphs. Student ideas are then…

  7. Assessment of H.264 video compression on automated face recognition performance in surveillance and mobile video scenarios

    NASA Astrophysics Data System (ADS)

    Klare, Brendan; Burge, Mark

    2010-04-01

    We assess the impact of the H.264 video codec on the match performance of automated face recognition in surveillance and mobile video applications. A set of two hundred access control (90 pixel inter-pupillary distance) and distance surveillance (45 pixel inter-pupillary distance) videos taken under non-ideal imaging and face recognition (e.g., pose, illumination, and expression) conditions was matched using two commercial face recognition engines. The first study evaluated automated face recognition performance on access control and distance surveillance videos at CIF and VGA resolutions using the H.264 baseline profile at nine bitrates ranging from 8 kb/s to 2048 kb/s. In our experiments, video signals could be compressed to bitrates as low as 128 kb/s before a significant drop in face recognition performance occurred. The second study evaluated automated face recognition on mobile devices at QCIF, iPhone, and Android resolutions for each of the H.264 PDA profiles. Rank-one match performance, cumulative match scores, and failure-to-enroll rates are reported.
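    Rank-one performance and cumulative match scores are both derived from the rank at which each probe's true identity appears among the sorted gallery matches. A generic sketch with toy similarity scores (hypothetical values, not output from the study's commercial matchers):

```python
import numpy as np

def cumulative_match_scores(similarity, true_ids):
    """Cumulative match characteristic (CMC): fraction of probes whose true
    gallery identity appears within the top-k matches, for k = 1..gallery size.
    similarity: (n_probes, n_gallery) score matrix, larger = better match."""
    order = np.argsort(-similarity, axis=1)          # best match first
    ranks = np.array([
        np.where(order[i] == true_ids[i])[0][0]      # 0-based rank of truth
        for i in range(len(true_ids))
    ])
    n_gallery = similarity.shape[1]
    return np.array([(ranks < k).mean() for k in range(1, n_gallery + 1)])

# Toy 3-probe, 4-identity gallery: probes 0 and 1 match at rank 1,
# probe 2 only at rank 2, so CMC = [2/3, 1, 1, 1]
sim = np.array([[0.9, 0.1, 0.2, 0.0],
                [0.2, 0.8, 0.1, 0.0],
                [0.7, 0.6, 0.1, 0.0]])
cmc = cumulative_match_scores(sim, true_ids=[0, 1, 1])
rank_one = cmc[0]
```

    Compression-induced degradation shows up as the whole CMC curve shifting downward; comparing cmc[0] across bitrates reproduces the rank-one analysis described above.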

  8. Negative symptoms in schizophrenia: a study in a large clinical sample of patients using a novel automated method

    PubMed Central

    Patel, Rashmi; Jayatilleke, Nishamali; Broadbent, Matthew; Chang, Chin-Kuo; Foskett, Nadia; Gorrell, Genevieve; Hayes, Richard D; Jackson, Richard; Johnston, Caroline; Shetty, Hitesh; Roberts, Angus; McGuire, Philip; Stewart, Robert

    2015-01-01

    Objectives To identify negative symptoms in the clinical records of a large sample of patients with schizophrenia using natural language processing and assess their relationship with clinical outcomes. Design Observational study using an anonymised electronic health record case register. Setting South London and Maudsley NHS Trust (SLaM), a large provider of inpatient and community mental healthcare in the UK. Participants 7678 patients with schizophrenia receiving care during 2011. Main outcome measures Hospital admission, readmission and duration of admission. Results 10 different negative symptoms were ascertained with precision statistics above 0.80. 41% of patients had 2 or more negative symptoms. Negative symptoms were associated with younger age, male gender and single marital status, and with increased likelihood of hospital admission (OR 1.24, 95% CI 1.10 to 1.39), longer duration of admission (β-coefficient 20.5 days, 7.6–33.5), and increased likelihood of readmission following discharge (OR 1.58, 1.28 to 1.95). Conclusions Negative symptoms were common and associated with adverse clinical outcomes, consistent with evidence that these symptoms account for much of the disability associated with schizophrenia. Natural language processing provides a means of conducting research in large representative samples of patients, using data recorded during routine clinical practice. PMID:26346872

  9. Ecological Momentary Assessments and Automated Time Series Analysis to Promote Tailored Health Care: A Proof-of-Principle Study

    PubMed Central

    Emerencia, Ando C; Bos, Elisabeth H; Rosmalen, Judith GM; Riese, Harriëtte; Aiello, Marco; Sytema, Sjoerd; de Jonge, Peter

    2015-01-01

    Background Health promotion can be tailored by combining ecological momentary assessments (EMA) with time series analysis. This combined method allows for studying the temporal order of dynamic relationships among variables, which may provide concrete indications for intervention. However, application of this method in health care practice is hampered because analyses are conducted manually and advanced statistical expertise is required. Objective This study aims to show how this limitation can be overcome by introducing automated vector autoregressive modeling (VAR) of EMA data and to evaluate its feasibility through comparisons with results of previously published manual analyses. Methods We developed a Web-based open source application, called AutoVAR, which automates time series analyses of EMA data and provides output that is intended to be interpretable by nonexperts. The statistical technique we used was VAR. AutoVAR tests and evaluates all possible VAR models within a given combinatorial search space and summarizes their results, thereby replacing the researcher’s tasks of conducting the analysis, making an informed selection of models, and choosing the best model. We compared the output of AutoVAR to the output of a previously published manual analysis (n=4). Results An illustrative example consisting of 4 analyses was provided. Compared to the manual output, the AutoVAR output presents similar model characteristics and statistical results in terms of the Akaike information criterion, the Bayesian information criterion, and the test statistic of the Granger causality test. Conclusions Results suggest that automated analysis and interpretation of time series is feasible. Compared to a manual procedure, the automated procedure is more robust and can save days of time. These findings may pave the way for using time series analysis for health promotion on a larger scale. AutoVAR was evaluated using the results of a previously conducted manual analysis
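    AutoVAR's core loop, fitting every candidate VAR model in a search space and ranking the fits by information criteria, can be approximated in a short sketch. This minimal numpy version (lag order as the only search dimension, AIC as the only criterion, hypothetical simulated EMA series) is an illustration, not the AutoVAR implementation:

```python
import numpy as np

def fit_var(data, p):
    """Least-squares VAR(p): regress each observation vector on p lagged
    vectors plus an intercept. Returns coefficients and residuals."""
    T, k = data.shape
    Y = data[p:]
    X = np.hstack([np.ones((T - p, 1))] +
                  [data[p - i:T - i] for i in range(1, p + 1)])
    B, *_ = np.linalg.lstsq(X, Y, rcond=None)
    return B, Y - X @ B

def aic(resid, n_params):
    """Gaussian AIC from the determinant of the residual covariance."""
    T = len(resid)
    sigma = (resid.T @ resid) / T
    return T * np.log(np.linalg.det(sigma)) + 2 * n_params

def best_var_order(data, max_p):
    """Exhaustive search over lag orders; keep the model with lowest AIC."""
    k = data.shape[1]
    scores = {p: aic(fit_var(data, p)[1], k * (k * p + 1))
              for p in range(1, max_p + 1)}
    return min(scores, key=scores.get)

# Simulated bivariate VAR(1), standing in for e.g. daily mood/activity EMA
rng = np.random.default_rng(3)
A = np.array([[0.6, 0.2], [0.1, 0.5]])
x = np.zeros((300, 2))
for t in range(1, 300):
    x[t] = A @ x[t - 1] + rng.standard_normal(2) * 0.5
p_best = best_var_order(x, max_p=4)
```

    The full tool additionally searches over variable transformations, trend and seasonal terms, evaluates BIC and residual diagnostics, and runs Granger causality tests on the selected model; the exhaustive fit-and-rank structure is the same.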

  10. Automation in Clinical Microbiology

    PubMed Central

    Ledeboer, Nathan A.

    2013-01-01

    Historically, the trend toward automation in clinical pathology laboratories has largely bypassed the clinical microbiology laboratory. In this article, we review the historical impediments to automation in the microbiology laboratory and offer insight into the reasons why we believe that we are on the cusp of a dramatic change that will sweep a wave of automation into clinical microbiology laboratories. We review the currently available specimen-processing instruments as well as the total laboratory automation solutions. Lastly, we outline the types of studies that will need to be performed to fully assess the benefits of automation in microbiology laboratories. PMID:23515547

  11. Aquarius's Instrument Science Data System (ISDS) Automated to Acquire, Process, Trend Data and Produce Radiometric System Assessment Reports

    NASA Technical Reports Server (NTRS)

    2008-01-01

    The Aquarius Radiometer, a subsystem of the Aquarius Instrument, required a data acquisition ground system to support calibration and radiometer performance assessment. To support calibration and compose performance assessments, we developed an automated system which uploaded raw data to an FTP server and saved raw and processed data to a database. This paper details the overall functionality of the Aquarius Instrument Science Data System (ISDS) and the individual electrical ground support equipment (EGSE) which produced data files that were infused into the ISDS. Real-time EGSEs include an ICDS simulator, calibration GSE, a LabVIEW-controlled power supply, and a chamber data acquisition system. The ICDS simulator serves as the test conductor's primary workstation, collecting radiometer housekeeping (HK) and science data and passing commands and HK telemetry collection requests to the radiometer. The calibration GSE (Radiometer Active Test Source) provides a choice of multiple source targets for the radiometer's external calibration. The power supply GSE, controlled by LabVIEW, provides real-time voltage and current monitoring of the radiometer. Finally, the chamber data acquisition system produces data reflecting chamber vacuum pressure, thermistor temperatures, AVG and watts. Each GSE system produces text-based data files every two to six minutes and automatically copies the data files to the central archiver PC. The archiver PC stores the data files, schedules automated uploads of these files to an external FTP server, and accepts requests to copy all data files to the ISDS for offline data processing and analysis. The Aquarius Radiometer ISDS contains PHP and MATLAB programs to parse, process, and save all data to a MySQL database. Analysis tools (MATLAB programs) in the ISDS are capable of displaying radiometer science, telemetry, and auxiliary data in near real time as well as performing data analysis and producing automated performance assessment reports of the Aquarius

  12. Assessing tiger population dynamics using photographic capture-recapture sampling.

    PubMed

    Karanth, K Ullas; Nichols, James D; Kumar, N Samba; Hines, James E

    2006-11-01

    Although wide-ranging, elusive, large carnivore species, such as the tiger, are of scientific and conservation interest, rigorous inferences about their population dynamics are scarce because of methodological problems of sampling populations at the required spatial and temporal scales. We report the application of a rigorous, noninvasive method for assessing tiger population dynamics to test model-based predictions about population viability. We obtained photographic capture histories for 74 individual tigers during a nine-year study involving 5725 trap-nights of effort. These data were modeled under a likelihood-based, "robust design" capture-recapture analytic framework. We explicitly modeled and estimated ecological parameters such as time-specific abundance, density, survival, recruitment, temporary emigration, and transience, using models that incorporated effects of factors such as individual heterogeneity, trap-response, and time on probabilities of photo-capturing tigers. The model estimated a random temporary emigration parameter of gamma" = gamma' = 0.10 +/- 0.069 (values are estimated mean +/- SE). When scaled to an annual basis, tiger survival rates were estimated at S = 0.77 +/- 0.051, and the estimated probability that a newly caught animal was a transient was tau = 0.18 +/- 0.11. During the period when the sampled area was of constant size, the estimated population size N(t) varied from 17 +/- 1.7 to 31 +/- 2.1 tigers, with a geometric mean rate of annual population change estimated as lambda = 1.03 +/- 0.020, representing a 3% annual increase. The estimated recruitment of new animals, B(t), varied from 0 +/- 3.0 to 14 +/- 2.9 tigers. Population density estimates, D, ranged from 7.33 +/- 0.8 tigers/100 km2 to 21.73 +/- 1.7 tigers/100 km2 during the study. Thus, despite substantial annual losses and temporal variation in recruitment, the tiger density remained at relatively high levels in Nagarahole. Our results are consistent with the hypothesis
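The "geometric mean rate of annual population change" reported above reduces to simple arithmetic on the annual abundance estimates. A short illustration follows; the counts below are invented stand-ins spanning the 17-31 range in the abstract, not the study's actual N(t) series.

```python
import math

# Hypothetical annual abundance estimates (NOT the study's data), spanning
# the 17-31 tiger range reported in the abstract.
N = [17, 19, 22, 21, 25, 28, 31]
ratios = [b / a for a, b in zip(N, N[1:])]      # year-to-year changes N(t+1)/N(t)
lam = math.prod(ratios) ** (1 / len(ratios))    # geometric mean rate of change
# The product telescopes, so this equals (N[-1] / N[0]) ** (1 / (len(N) - 1)).
print(round(lam, 3))                            # → 1.105, i.e. ~10.5%/yr increase
```

Because the intermediate ratios cancel, lambda depends only on the first and last estimates and the number of intervals; a value of 1.03, as in the study, corresponds to a 3% mean annual increase.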

  13. Assessing tiger population dynamics using photographic capture-recapture sampling

    USGS Publications Warehouse

    Karanth, K.U.; Nichols, J.D.; Kumar, N.S.; Hines, J.E.

    2006-01-01

    Although wide-ranging, elusive, large carnivore species, such as the tiger, are of scientific and conservation interest, rigorous inferences about their population dynamics are scarce because of methodological problems of sampling populations at the required spatial and temporal scales. We report the application of a rigorous, noninvasive method for assessing tiger population dynamics to test model-based predictions about population viability. We obtained photographic capture histories for 74 individual tigers during a nine-year study involving 5725 trap-nights of effort. These data were modeled under a likelihood-based, "robust design" capture-recapture analytic framework. We explicitly modeled and estimated ecological parameters such as time-specific abundance, density, survival, recruitment, temporary emigration, and transience, using models that incorporated effects of factors such as individual heterogeneity, trap-response, and time on probabilities of photo-capturing tigers. The model estimated a random temporary emigration parameter of gamma" = gamma' = 0.10 +/- 0.069 (values are estimated mean +/- SE). When scaled to an annual basis, tiger survival rates were estimated at S = 0.77 +/- 0.051, and the estimated probability that a newly caught animal was a transient was tau = 0.18 +/- 0.11. During the period when the sampled area was of constant size, the estimated population size N(t) varied from 17 +/- 1.7 to 31 +/- 2.1 tigers, with a geometric mean rate of annual population change estimated as lambda = 1.03 +/- 0.020, representing a 3% annual increase. The estimated recruitment of new animals, B(t), varied from 0 +/- 3.0 to 14 +/- 2.9 tigers. Population density estimates, D, ranged from 7.33 +/- 0.8 tigers/100 km2 to 21.73 +/- 1.7 tigers/100 km2 during the study. Thus, despite substantial annual losses and temporal variation in recruitment, the tiger density remained at relatively high levels in Nagarahole. Our results are consistent with the hypothesis that protected wild tiger populations can remain

  14. Assessing Racial Microaggression Distress in a Diverse Sample.

    PubMed

    Torres-Harding, Susan; Turner, Tasha

    2015-12-01

    Racial microaggressions are everyday subtle or ambiguous racially related insults, slights, mistreatment, or invalidations. Racial microaggressions are a type of perceived racism that may negatively impact the health and well-being of people of color in the United States. This study examined the reliability and validity of the Racial Microaggression Scale distress subscales, which measure the perceived stressfulness of six types of microaggression experiences in a racially and ethnically diverse sample. These subscales exhibited acceptable to good internal consistency. The distress subscales also evidenced good convergent validity; the distress subscales were positively correlated with additional measures of stressfulness due to experiencing microaggressions or everyday discrimination. When controlling for the frequency of one's exposure to microaggression incidents, some racial/ethnic group differences were found. Asian Americans reported comparatively lower distress and Latinos reported comparatively higher distress in response to Foreigner, Low-Achieving, Invisibility, and Environmental microaggressions. African Americans reported higher distress than the other groups in response to Environmental microaggressions. Results suggest that the Racial Microaggressions Scale distress subscales may aid health professionals in assessing the distress elicited by different types of microaggressions. In turn, this may facilitate diagnosis and treatment planning in order to provide multiculturally competent care for African American, Latino, and Asian American clients.

  16. Assessing uncertainty in DNA evidence caused by sampling effects.

    PubMed

    Curran, J M; Buckleton, J S; Triggs, C M; Weir, B S

    2002-01-01

    The authors discuss whether an estimate of sampling error is necessary in forensic DNA testimony and how such an estimate should be made. They find that all modern methods have areas of strength and weakness; which is 'best' is subjective and depends on the performance of the method, the type of problem (criminal work or paternity), the database size, and the availability of computing software and support. The authors preferred the highest posterior density approach for its performance, although the other methods all have areas where their performance is adequate. For single-contributor stains, normal approximation methods are suitable, as are the bootstrap and the highest posterior density method. For multiple-contributor stains or other complex situations, the match probability expressions become quite complex and it may not be possible to derive the necessary variance expressions; the highest posterior density method or the bootstrap then provides a better general approach, with non-zero theta. The size-bias correction and factor-of-10 approaches may be considered acceptable by many forensic scientists as long as their limitations are understood.
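Of the methods the abstract compares, the bootstrap is the easiest to sketch: resample the allele database with replacement, recompute the match probability each time, and report a percentile interval. The database, genotype, and allele frequencies below are invented for illustration, and theta is set to zero for simplicity.

```python
import random

random.seed(1)
# Hypothetical database of 200 alleles at one locus; the profile of interest
# is heterozygous A/B.
db = ["A"] * 30 + ["B"] * 24 + ["C"] * 146

def match_prob(alleles):
    """Hardy-Weinberg match probability 2*pA*pB (theta = 0)."""
    pA = alleles.count("A") / len(alleles)
    pB = alleles.count("B") / len(alleles)
    return 2 * pA * pB

# Bootstrap: resample the database with replacement and recompute.
boot = sorted(
    match_prob(random.choices(db, k=len(db))) for _ in range(2000)
)
lo, hi = boot[50], boot[1949]               # ~95% percentile interval
print(f"point {match_prob(db):.4f}, 95% CI ({lo:.4f}, {hi:.4f})")
```

The width of the interval directly expresses the sampling uncertainty the abstract is concerned with: a small database yields a wide interval even when the point estimate looks precise.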

  17. Using Group Projects to Assess the Learning of Sampling Distributions

    ERIC Educational Resources Information Center

    Neidigh, Robert O.; Dunkelberger, Jake

    2012-01-01

    In an introductory business statistics course, student groups used sample data to compare a set of sample means to the theoretical sampling distribution. Each group was given a production measurement with a population mean and standard deviation. The groups were also provided an Excel spreadsheet with 40 sample measurements per week for 52 weeks…
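The comparison the student groups performed can be reproduced in a short simulation: draw 52 weekly samples of n = 40, compute the weekly sample means, and check their spread against the theoretical standard error sigma/sqrt(n). The production figures below are invented for illustration.

```python
import math
import random

random.seed(42)
mu, sigma, n_per_week, weeks = 50.0, 8.0, 40, 52   # hypothetical process values

# One sample mean per week, each from n_per_week measurements.
weekly_means = [
    sum(random.gauss(mu, sigma) for _ in range(n_per_week)) / n_per_week
    for _ in range(weeks)
]

grand_mean = sum(weekly_means) / weeks
sd_of_means = math.sqrt(
    sum((m - grand_mean) ** 2 for m in weekly_means) / (weeks - 1)
)
theoretical_se = sigma / math.sqrt(n_per_week)      # 8 / sqrt(40) ≈ 1.265
print(f"observed SD of means {sd_of_means:.2f} vs theoretical SE {theoretical_se:.2f}")
```

The observed standard deviation of the 52 weekly means should land close to sigma/sqrt(n), which is exactly the property of the sampling distribution the project is designed to make tangible.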

  18. Manual versus Automated Rodent Behavioral Assessment: Comparing Efficacy and Ease of Bederson and Garcia Neurological Deficit Scores to an Open Field Video-Tracking System.

    PubMed

    Desland, Fiona A; Afzal, Aqeela; Warraich, Zuha; Mocco, J

    2014-01-01

    Animal models of stroke have been crucial in advancing our understanding of the pathophysiology of cerebral ischemia. Currently, the standards for determining neurological deficit in rodents are the Bederson and Garcia scales, manual assessments scoring animals based on parameters ranked on a narrow scale of severity. Automated open field analysis of a live-video tracking system that analyzes animal behavior may provide a more sensitive test. Results obtained from the manual Bederson and Garcia scales did not show significant differences between pre- and post-stroke animals in a small cohort. When using the same cohort, however, post-stroke data obtained from automated open field analysis showed significant differences in several parameters. Furthermore, large cohort analysis also demonstrated increased sensitivity with automated open field analysis versus the Bederson and Garcia scales. These early data indicate use of automated open field analysis software may provide a more sensitive assessment when compared to traditional Bederson and Garcia scales.

  19. Sampling cows to assess lying time for on-farm animal welfare assessment.

    PubMed

    Vasseur, E; Rushen, J; Haley, D B; de Passillé, A M

    2012-09-01

    The time that dairy cows spend lying down is an important measure of their welfare, and data loggers can be used to automatically monitor lying time on commercial farms. To determine how the number of days of sampling, parity, stage of lactation, and production level affect lying time, electronic data loggers were used to record lying time for 10 d consecutively, at 3 stages of lactation [early: when cows were at 10-40 d in milk (DIM), mid: 100-140 DIM, late: 200-240 DIM] of 96 Holstein cows in tiestalls (TS) and 127 in freestalls (FS). We calculated daily duration of lying, bout frequency, and mean bout duration. We observed complex interactions between parity and stage of lactation, which differed somewhat between tiestalls and freestalls. First-parity cows had higher bout frequency and shorter lying bouts than older cows, but bout frequency decreased and mean bout duration increased as DIM increased. We found that individual cows were not consistent in time spent lying between early and mid lactation (Pearson coefficient, TS: r = 0.1, FS: r = 0.2), whereas cows seemed to be more consistent in time spent lying between mid and late lactation (TS: r = 0.7, FS: r = 0.3). For both TS and FS cows, daily milk production was significantly, but slightly, negatively correlated with lying time across the lactation (range, r: -0.2 to -0.4), whereas parity was slightly to moderately positively correlated with mean bout duration across the lactation (r: +0.2 to +0.6) and negatively with bout frequency (r: -0.2 to -0.5). To estimate how the duration of the time sample affected the estimates of lying time, subsets of data consisting of 1, 2, 3, 4, 5, 6, 7, 8, and 9 d per cow were created, and the relationship between the overall mean (based on 10 d) and the mean of each subset was tested by regression. For both TS and FS, lying time based on 4 d of sampling provided good estimates of the average 10-d estimate (90% accuracy). Automated monitoring of lying time has
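The subset analysis described above can be sketched with a small simulation: give each cow a typical daily lying time plus day-to-day noise, then compare the mean of a k-day subset against the full 10-day mean. All numbers below are synthetic, not the study's data; the point is only that agreement improves with k, as the abstract's 4-day result illustrates.

```python
import random

random.seed(7)
n_cows, days = 100, 10

# Each cow gets her own typical lying time (h/d) plus day-to-day noise.
cows = []
for _ in range(n_cows):
    typical = random.gauss(11, 1.5)
    cows.append([random.gauss(typical, 1.0) for _ in range(days)])

def corr(xs, ys):
    """Pearson correlation coefficient."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    syy = sum((y - my) ** 2 for y in ys)
    return sxy / (sxx * syy) ** 0.5

full = [sum(c) / days for c in cows]            # the 10-d reference mean
results = {}
for k in (1, 4, 9):
    sub = [sum(c[:k]) / k for c in cows]        # mean of a k-day subset
    results[k] = corr(sub, full)
    print(f"{k} d subset vs 10 d mean: r = {results[k]:.2f}")
```

Runs of this kind make the trade-off concrete: each extra day of logging buys a smaller improvement in agreement with the 10-day mean, so a short window such as 4 d can already be adequate for on-farm assessment.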

  20. Automated DNA-based plant identification for large-scale biodiversity assessment.

    PubMed

    Papadopoulou, Anna; Chesters, Douglas; Coronado, Indiana; De la Cadena, Gissela; Cardoso, Anabela; Reyes, Jazmina C; Maes, Jean-Michel; Rueda, Ricardo M; Gómez-Zurita, Jesús

    2015-01-01

    Rapid degradation of tropical forests urges us to improve our efficiency in large-scale biodiversity assessment. DNA barcoding can assist greatly in this task, but commonly used phenetic approaches for DNA-based identifications rely on the existence of comprehensive reference databases, which are infeasible for hyperdiverse tropical ecosystems. Alternatively, phylogenetic methods are more robust to sparse taxon sampling but time-consuming, while multiple alignment of species-diagnostic, typically length-variable, markers can be problematic across divergent taxa. We advocate the combination of phylogenetic and phenetic methods for taxonomic assignment of DNA-barcode sequences against incomplete reference databases such as GenBank, and we developed a pipeline to implement this approach on large-scale plant diversity projects. The pipeline workflow includes several steps: database construction and curation, query sequence clustering, sequence retrieval, distance calculation, multiple alignment and phylogenetic inference. We describe the strategies used to establish these steps and the optimization of parameters to fit the selected psbA-trnH marker. We tested the pipeline using infertile plant samples and herbivore diet sequences from the highly threatened Nicaraguan seasonally dry forest and exploiting a valuable purpose-built resource: a partial local reference database of plant psbA-trnH. The selected methodology proved efficient and reliable for high-throughput taxonomic assignment, and our results corroborate the advantage of applying 'strict' tree-based criteria to avoid false positives. The pipeline tools are distributed as the scripts suite 'BAGpipe' (pipeline for Biodiversity Assessment using GenBank data), which can be readily adjusted to the purposes of other projects and applied to sequence-based identification for any marker or taxon.

  1. Advanced manufacturing rules check (MRC) for fully automated assessment of complex reticle designs: Part II

    NASA Astrophysics Data System (ADS)

    Straub, J. A.; Aguilar, D.; Buck, P. D.; Dawkins, D.; Gladhill, R.; Nolke, S.; Riddick, J.

    2006-10-01

    Advanced electronic design automation (EDA) tools, with their simulation, modeling, design rule checking, and optical proximity correction capabilities, have facilitated the improvement of first pass wafer yields. While the data produced by these tools may have been processed for optimal wafer manufacturing, it is possible for the same data to be far from ideal for photomask manufacturing, particularly at lithography and inspection stages, resulting in production delays and increased costs. The same EDA tools used to produce the data can be used to detect potential problems for photomask manufacturing in the data. In the previous paper, it was shown how photomask MRC is used to uncover data related problems prior to automated defect inspection. It was demonstrated how jobs which are likely to have problems at inspection could be identified and separated from those which are not. The use of photomask MRC in production was shown to reduce time lost to aborted runs and troubleshooting due to data issues. In this paper, the effectiveness of this photomask MRC program in a high volume photomask factory over the course of a year as applied to more than ten thousand jobs will be shown. Statistics on the results of the MRC runs will be presented along with the associated impact to the automated defect inspection process. Common design problems will be shown as well as their impact to mask manufacturing throughput and productivity. Finally, solutions to the most common and most severe problems will be offered and discussed.

  2. Space Station Freedom automation and robotics: An assessment of the potential for increased productivity

    NASA Technical Reports Server (NTRS)

    Weeks, David J.; Zimmerman, Wayne F.; Swietek, Gregory E.; Reid, David H.; Hoffman, Ronald B.; Stammerjohn, Lambert W., Jr.; Stoney, William; Ghovanlou, Ali H.

    1990-01-01

    This report presents the results of a study performed in support of the Space Station Freedom Advanced Development Program, under the sponsorship of the Space Station Engineering (Code MT), Office of Space Flight. The study consisted of the collection, compilation, and analysis of lessons learned, crew time requirements, and other factors influencing the application of advanced automation and robotics, with emphasis on potential improvements in productivity. The lessons learned data collected were based primarily on Skylab, Spacelab, and other Space Shuttle experiences, consisting principally of interviews with current and former crew members and other NASA personnel with relevant experience. The objectives of this report are to present a summary of this data and its analysis, and to present conclusions regarding promising areas for the application of advanced automation and robotics technology to the Space Station Freedom and the potential benefits in terms of increased productivity. In this study, primary emphasis was placed on advanced automation technology because of its fairly extensive utilization within private industry including the aerospace sector. In contrast, other than the Remote Manipulator System (RMS), there has been relatively limited experience with advanced robotics technology applicable to the Space Station. This report should be used as a guide and is not intended to be used as a substitute for official Astronaut Office crew positions on specific issues.

  3. Laboratory automation in clinical bacteriology: what system to choose?

    PubMed

    Croxatto, A; Prod'hom, G; Faverjon, F; Rochais, Y; Greub, G

    2016-03-01

    Automation was introduced many years ago in several diagnostic disciplines such as chemistry, haematology and molecular biology. The first laboratory automation system for clinical bacteriology was released in 2006, and it rapidly proved its value by increasing productivity, allowing a continuous increase in sample volumes despite limited budgets and personnel shortages. Today, two major manufacturers, BD Kiestra and Copan, are commercializing partial or complete laboratory automation systems for bacteriology. The laboratory automation systems are rapidly evolving to provide improved hardware and software solutions to optimize laboratory efficiency. However, the complex parameters of the laboratory and automation systems must be considered to determine the best system for each given laboratory. We address several topics on laboratory automation that may help clinical bacteriologists to understand the particularities and operative modalities of the different systems. We present (a) a comparison of the engineering and technical features of the various elements composing the two different automated systems currently available, (b) the system workflows of partial and complete laboratory automation, which define the basis for laboratory reorganization required to optimize system efficiency, (c) the concept of digital imaging and telebacteriology, (d) the connectivity of laboratory automation to the laboratory information system, (e) the general advantages and disadvantages as well as the expected impacts provided by laboratory automation and (f) the laboratory data required to conduct a workflow assessment to determine the best configuration of an automated system for the laboratory activities and specificities.

  4. Parenchymal texture analysis in digital mammography: A fully automated pipeline for breast cancer risk assessment

    PubMed Central

    Zheng, Yuanjie; Keller, Brad M.; Ray, Shonket; Wang, Yan; Conant, Emily F.; Gee, James C.; Kontos, Despina

    2015-01-01

    Purpose: Mammographic percent density (PD%) is known to be a strong risk factor for breast cancer. Recent studies also suggest that parenchymal texture features, which are more granular descriptors of the parenchymal pattern, can provide additional information about breast cancer risk. To date, most studies have measured mammographic texture within selected regions of interest (ROIs) in the breast, which cannot adequately capture the complexity of the parenchymal pattern throughout the whole breast. To better characterize patterns of the parenchymal tissue, the authors have developed a fully automated software pipeline based on a novel lattice-based strategy to extract a range of parenchymal texture features from the entire breast region. Methods: Digital mammograms from 106 cases with 318 age-matched controls were retrospectively analyzed. The lattice-based approach is based on a regular grid virtually overlaid on each mammographic image. Texture features are computed from the intersection (i.e., lattice) points of the grid lines within the breast, using a local window centered at each lattice point. Using this strategy, a range of statistical (gray-level histogram, co-occurrence, and run-length) and structural (edge-enhancing, local binary pattern, and fractal dimension) features are extracted. To cover the entire breast, the size of the local window for feature extraction is set equal to the lattice grid spacing and optimized experimentally by evaluating different window sizes. The association between the lattice-based texture features and breast cancer was evaluated using logistic regression with leave-one-out cross validation and further compared to that of breast PD% and commonly used single-ROI texture features extracted from the retroareolar or the central breast region. Classification performance was evaluated using the area under the curve (AUC) of the receiver operating characteristic (ROC). DeLong's test was used to compare the different ROCs in
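The lattice-based strategy described above can be sketched in a few lines of numpy: lay a regular grid over the image and compute a texture statistic in a local window centred on each lattice point, with window size equal to the grid spacing. A real pipeline extracts many features (co-occurrence, run-length, local binary patterns, fractal dimension); this sketch computes only the gray-level histogram entropy, and the image is a synthetic stand-in, not a mammogram.

```python
import numpy as np

rng = np.random.default_rng(0)
image = rng.integers(0, 256, size=(256, 256))      # synthetic stand-in image
spacing = 32                                       # grid spacing = window size
half = spacing // 2

def window_entropy(win):
    """Shannon entropy of the gray-level histogram in one local window."""
    counts = np.bincount(win.ravel(), minlength=256)
    p = counts[counts > 0] / win.size
    return float(-(p * np.log2(p)).sum())

# One feature value per lattice point; windows tile the whole image.
features = []
for r in range(half, image.shape[0] - half + 1, spacing):
    for c in range(half, image.shape[1] - half + 1, spacing):
        win = image[r - half : r + half, c - half : c + half]
        features.append(window_entropy(win))

print(len(features), "lattice points, mean entropy", round(np.mean(features), 2))
```

Setting the window size equal to the grid spacing, as in the paper, means the windows tile the region without gaps or overlap, so every pixel contributes to exactly one lattice feature.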

  5. Method and apparatus for automated processing and aliquoting of whole blood samples for analysis in a centrifugal fast analyzer

    DOEpatents

    Burtis, C.A.; Johnson, W.F.; Walker, W.A.

    1985-08-05

    A rotor and disc assembly for use in a centrifugal fast analyzer. The assembly is designed to process multiple samples of whole blood followed by aliquoting of the resultant serum into precisely measured samples for subsequent chemical analysis. The assembly requires minimal operator involvement with no mechanical pipetting. The system comprises: (1) a whole blood sample disc; (2) a serum sample disc; (3) a sample preparation rotor; and (4) an analytical rotor. The blood sample disc and serum sample disc are designed with a plurality of precision bore capillary tubes arranged in a spoked array. Samples of blood are loaded into the blood sample disc by capillary action and centrifugally discharged into cavities of the sample preparation rotor where separation of serum and solids is accomplished. The serum is loaded into the capillaries of the serum sample disc by capillary action and subsequently centrifugally expelled into cuvettes of the analytical rotor for analysis by conventional methods. 5 figs.

  6. Method and apparatus for automated processing and aliquoting of whole blood samples for analysis in a centrifugal fast analyzer

    DOEpatents

    Burtis, Carl A.; Johnson, Wayne F.; Walker, William A.

    1988-01-01

    A rotor and disc assembly for use in a centrifugal fast analyzer. The assembly is designed to process multiple samples of whole blood followed by aliquoting of the resultant serum into precisely measured samples for subsequent chemical analysis. The assembly requires minimal operator involvement with no mechanical pipetting. The system comprises (1) a whole blood sample disc, (2) a serum sample disc, (3) a sample preparation rotor, and (4) an analytical rotor. The blood sample disc and serum sample disc are designed with a plurality of precision bore capillary tubes arranged in a spoked array. Samples of blood are loaded into the blood sample disc in capillary tubes filled by capillary action and centrifugally discharged into cavities of the sample preparation rotor where separation of serum and solids is accomplished. The serum is loaded into the capillaries of the serum sample disc by capillary action and subsequently centrifugally expelled into cuvettes of the analytical rotor for analysis by conventional methods.

  7. Automated ambulatory assessment of cognitive performance, environmental conditions, and motor activity during military operations

    NASA Astrophysics Data System (ADS)

    Lieberman, Harris R.; Kramer, F. Matthew; Montain, Scott J.; Niro, Philip; Young, Andrew J.

    2005-05-01

    Until recently scientists had limited opportunities to study human cognitive performance in non-laboratory, fully ambulatory situations. Recently, advances in technology have made it possible to extend behavioral assessment to the field environment. One of the first devices to measure human behavior in the field was the wrist-worn actigraph. This device, now widely employed, can acquire minute-by-minute information on an individual's level of motor activity. Actigraphs can, with reasonable accuracy, distinguish sleep from waking, the most critical and basic aspect of human behavior. However, rapid technologic advances have provided the opportunity to collect much more information from fully ambulatory humans. Our laboratory has developed a series of wrist-worn devices, not much larger than a watch, which can assess simple and choice reaction time, vigilance, and memory. In addition, the devices can concurrently assess motor activity with much greater temporal resolution than the standard actigraph. Furthermore, they continuously monitor multiple environmental variables including temperature, humidity, sound, and light. We have employed these monitors during training and simulated military operations to collect information that would typically be unavailable under such circumstances. In this paper we will describe various versions of the vigilance monitor and how each successive version extended the capabilities of the device. Samples of data from several studies are presented, including studies conducted in harsh field environments during simulated infantry assaults, a Marine Corps Officer training course, and mechanized infantry (Stryker) operations. The monitors have been useful for documenting environmental conditions experienced by wearers, studying patterns of sleep and activity, and examining the effects of nutritional manipulations on warfighter performance.

  8. Automated isotope dilution liquid chromatography-tandem mass spectrometry with on-line dilution and solid phase extraction for the measurement of cortisol in human serum sample.

    PubMed

    Kawaguchi, Migaku; Eyama, Sakae; Takatsu, Akiko

    2014-08-01

    A candidate reference measurement procedure involving automated isotope dilution coupled with liquid chromatography-tandem mass spectrometry (ID-LC-MS/MS) with on-line dilution and solid phase extraction (SPE) has been developed and critically evaluated. We constructed the LC-MS/MS system with on-line dilution and SPE. An isotopically labelled internal standard, cortisol-d4, was added to the serum sample. After equilibration, methanol was added to the sample, and deproteination was performed. The sample was then applied to the LC-MS/MS system. The limit of detection (LOD) and limit of quantification (LOQ) were 0.2 and 1 ng g(-1), respectively. Excellent precision was obtained, with within-day variation (RSD) of 1.9% for ID-LC-MS/MS analysis (n=6). This method is simple and easy to perform, demonstrates good accuracy and high precision, and is free from interference by structural analogues; it therefore qualifies as a reference measurement procedure.
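Isotope dilution quantification ultimately reduces to a ratio calculation: the analyte amount follows from the measured analyte/internal-standard peak-area ratio and the known amount of labelled standard spiked into the sample. The sketch below shows that arithmetic in its simplest single-point form (real reference procedures use calibration blends and exact matching); all numbers are invented to illustrate the calculation, not values from the paper.

```python
# Hypothetical single-point isotope dilution calculation (simplified; real
# ID-MS reference procedures use matched calibration blends).
mass_serum_g = 1.0          # mass of the serum sample (g)
is_added_ng = 50.0          # cortisol-d4 internal standard spiked in (ng)
area_ratio = 1.42           # measured peak-area ratio, cortisol / cortisol-d4
response_factor = 1.0       # calibrated ratio response (assumed 1:1 here)

cortisol_ng = area_ratio / response_factor * is_added_ng
concentration = cortisol_ng / mass_serum_g
print(f"{concentration:.1f} ng/g")    # → 71.0 ng/g
```

Because analyte and labelled standard co-elute and ionize almost identically, losses during SPE and deproteination affect both equally and cancel in the ratio, which is what gives isotope dilution its reference-level accuracy.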

  9. An automated system to mount cryo-cooled protein crystals on a synchrotron beam line, using compact sample cassettes and a small-scale robot

    PubMed Central

    Cohen, Aina E.; Ellis, Paul J.; Miller, Mitchell D.; Deacon, Ashley M.; Phizackerley, R. Paul

    2014-01-01

    An automated system for mounting and dismounting pre-frozen crystals has been implemented at the Stanford Synchrotron Radiation Laboratory (SSRL). It is based on a small industrial robot and compact cylindrical cassettes, each holding up to 96 crystals mounted on Hampton Research sample pins. For easy shipping and storage, the cassette fits inside several popular dry-shippers and long-term storage Dewars. A dispensing Dewar holds up to three cassettes in liquid nitrogen adjacent to the beam line goniometer. The robot uses a permanent magnet tool to extract samples from, and insert samples into, a cassette, and a cryo-tong tool to transfer them to and from the beam line goniometer. The system is simple, with few moving parts, reliable in operation and convenient to use. PMID:24899734

  10. Automated DNA Sequencing System

    SciTech Connect

    Armstrong, G.A.; Ekkebus, C.P.; Hauser, L.J.; Kress, R.L.; Mural, R.J.

    1999-04-25

    Oak Ridge National Laboratory (ORNL) is developing a core DNA sequencing facility to support biological research endeavors at ORNL and to conduct basic sequencing automation research. This facility is novel because its development is based on existing standard biology laboratory equipment; thus, the development process is of interest to the many small laboratories trying to use automation to control costs and increase throughput. Before automation, biology laboratory personnel purified DNA, completed cycle sequencing, and prepared 96-well sample plates with commercially available hardware designed specifically for each step in the process. Following purification and thermal cycling, an automated sequencing machine was used for the sequencing. A technician handled all movement of the 96-well sample plates between machines. To automate the process, ORNL is adding a CRS Robotics A-465 arm, ABI 377 sequencing machine, automated centrifuge, automated refrigerator, and possibly an automated SpeedVac. The entire system will be integrated with one central controller that will direct each machine and the robot. The goal of this system is to completely automate the sequencing procedure from bacterial cell samples through ready-to-be-sequenced DNA and ultimately to completed sequence. The system will be flexible and will accommodate different chemistries than existing automated sequencing lines. The system will be expanded in the future to include colony picking and/or actual sequencing. This discrete-event DNA sequencing system will demonstrate that smaller sequencing labs can achieve cost-effective automation as the laboratory grows.

  11. Preliminary biogeochemical assessment of EPICA LGM and Holocene ice samples

    NASA Astrophysics Data System (ADS)

    Bulat, S.; Alekhina, I.; Marie, D.; Wagenbach, D.; Raynaud, D.; Petit, J. R.

    2009-04-01

    It was possible to generate weak signals, which are now being cloned. The signals were hard to reproduce because of the rather low sample volumes. More ice volume is needed to make the biosignal stronger and reproducible. Meanwhile, we are adjusting the PCR and, in addition, testing a DNA repair-enzyme cocktail in case of DNA damage. As a preliminary conclusion we would like to highlight the following: both Holocene and LGM ice samples (EDC99 and EDML) are very clean in terms of ultra-low biomass and ultra-low DOC content. The most basal ice of the EDC and EDML ice cores could help in assessing microbial biomass and diversity, if present under the glacier at the ice-bedrock boundary. * The present-day consortium includes S. Bulat, I. Alekhina, P. Normand, D. Prieur, J-R. Petit and D. Raynaud (France) and E. Willerslev and J.P. Steffensen (Denmark)

  12. Eco-HAB as a fully automated and ecologically relevant assessment of social impairments in mouse models of autism

    PubMed Central

    Puścian, Alicja; Łęski, Szymon; Kasprowicz, Grzegorz; Winiarski, Maciej; Borowska, Joanna; Nikolaev, Tomasz; Boguszewski, Paweł M; Lipp, Hans-Peter; Knapska, Ewelina

    2016-01-01

    Eco-HAB is an open source, RFID-based system for automated measurement and analysis of social preference and in-cohort sociability in mice. The system closely follows murine ethology. It requires no contact between a human experimenter and tested animals, overcoming the confounding factors that lead to irreproducible assessment of murine social behavior between laboratories. In Eco-HAB, group-housed animals live in a spacious, four-compartment apparatus with shadowed areas and narrow tunnels, resembling natural burrows. Eco-HAB allows for assessment of the tendency of mice to voluntarily spend time together in ethologically relevant mouse group sizes. Custom-made software for automated tracking, data extraction, and analysis enables quick evaluation of social impairments. The developed protocols and standardized behavioral measures demonstrate high replicability. Unlike classic three-chambered sociability tests, Eco-HAB provides measurements of spontaneous, ecologically relevant social behaviors in group-housed animals. Results are obtained faster, with less manpower, and without confounding factors. DOI: http://dx.doi.org/10.7554/eLife.19532.001 PMID:27731798

  13. A feasibility assessment of automated FISH image and signal analysis to assist cervical cancer detection

    NASA Astrophysics Data System (ADS)

    Wang, Xingwei; Li, Yuhua; Liu, Hong; Li, Shibo; Zhang, Roy R.; Zheng, Bin

    2012-02-01

    Fluorescence in situ hybridization (FISH) technology provides a promising molecular imaging tool to detect cervical cancer. Since manual FISH analysis is difficult, time-consuming, and inconsistent, automated FISH image scanning systems have been developed. Due to the limited focal depth of the scanned microscopic images, a FISH-probed specimen needs to be scanned in multiple layers, which generates huge volumes of image data. To improve the diagnostic efficiency of automated FISH image analysis, we developed a computer-aided detection (CAD) scheme. In this experiment, four pap-smear specimen slides were scanned by a dual-detector fluorescence image scanning system that acquired two spectrum images simultaneously, representing images of interphase cells and FISH-probed chromosome X. During image scanning, once a cell signal was detected, the system captured nine image slices by automatically adjusting the optical focus. Based on the sharpness index and maximum intensity measurement, cells and FISH signals distributed in 3-D space were projected into a 2-D con-focal image. The CAD scheme was applied to each con-focal image to detect analyzable interphase cells using an adaptive multiple-threshold algorithm and to detect FISH-probed signals using a top-hat transform. The ratio of abnormal cells was calculated to detect positive cases. In the four scanned specimen slides, CAD generated 1676 con-focal images that depicted analyzable cells. FISH-probed signals were independently detected by our CAD algorithm and an observer. The Kappa coefficients for agreement between CAD and observer ranged from 0.69 to 1.0 in detecting/counting FISH signal spots. The study demonstrated the feasibility of applying automated FISH image and signal analysis to assist cyto-geneticists in detecting cervical cancers.
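
    The white top-hat transform used for FISH-signal detection subtracts a morphological opening (erosion followed by dilation) from the image, so that small bright spots stand out against a slowly varying background. A minimal pure-Python sketch on a toy image; the structuring-element size and threshold are illustrative, not the paper's actual parameters:

```python
def _local_filter(img, size, func):
    """Apply min (erosion) or max (dilation) over a size x size neighborhood."""
    h, w = len(img), len(img[0])
    r = size // 2
    out = [[0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            vals = [img[yy][xx]
                    for yy in range(max(0, y - r), min(h, y + r + 1))
                    for xx in range(max(0, x - r), min(w, x + r + 1))]
            out[y][x] = func(vals)
    return out

def white_tophat(img, size=3):
    """Grayscale white top-hat: image minus its morphological opening.
    Bright features smaller than the structuring element survive;
    the smooth background is removed."""
    opened = _local_filter(_local_filter(img, size, min), size, max)
    return [[img[y][x] - opened[y][x] for x in range(len(img[0]))]
            for y in range(len(img))]

# Toy "cell" image: flat background of 10 with one bright FISH-like spot.
img = [[10] * 7 for _ in range(7)]
img[3][3] = 100
th = white_tophat(img, size=3)
spots = [(y, x) for y in range(7) for x in range(7) if th[y][x] > 50]
```

    On real microscopy data a library implementation (e.g. a morphology routine with a disk-shaped structuring element) would replace this quadratic-time loop, but the principle is the same.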

  14. Toxicological Assessment of ISS Air Quality: Contingency Sampling - February 2013

    NASA Technical Reports Server (NTRS)

    Meyers, Valerie

    2013-01-01

    Two grab sample containers (GSCs) were collected by crew members onboard ISS in response to a vinegar-like odor in the US Lab. On February 5, the first sample was collected approximately 1 hour after the odor was noted by the crew in the forward portion of the Lab. The second sample was collected on February 22 when a similar odor was noted and localized to the end ports of the microgravity science glovebox (MSG). The crewmember removed a glove from the MSG and collected the GSC inside the glovebox volume. Both samples were returned on SpaceX-2 for ground analysis.

  15. Automated assessment of split lung function in post-lung-transplant evaluation

    NASA Astrophysics Data System (ADS)

    Goldin, Jonathan G.; Brown, Matthew S.; McNitt-Gray, Michael F.; Greaser, Lloyd E.; Martin, Katherine; Sayre, James W.; Aberle, Denise R.

    1998-07-01

    The purpose of this work was to develop an automated technique for calculating dynamic lung attenuation changes, through a forced expiratory maneuver, as a measure of split lung function. A total of ten patients post single lung transplantation (SLT) for emphysema were imaged using an Electron Beam CT Scanner; three were studied twice following stent placement. A single-slice flow study, using 100 msec exposures and 3 mm collimation, was performed at the level of the anastomosis during a forced expiration. Images were acquired every 500 msec for the first 3 seconds and every second for the last 4 seconds. An automated, knowledge-based system was developed to segment the chest wall, mediastinum, large airways and lung parenchyma in each image. Knowledge of the expected size, shape, topology and X-ray attenuation of anatomical structures was used to guide image segmentation involving attenuation thresholding, region-growing and morphology. From the segmented left and right parenchyma, the system calculated median attenuation (HU) and cross-sectional areas. These results were plotted against time for both the native and transplanted lungs. In five patients, a significant shift of the attenuation/time curve to the right (slower flow) was detected, although the end-expiration attenuation was not different. Following stent placement the curve shifted back to the left (faster flow).

  16. Statistical analysis to assess automated level of suspicion scoring methods in breast ultrasound

    NASA Astrophysics Data System (ADS)

    Galperin, Michael

    2003-05-01

    A well-defined rule-based system has been developed for scoring the Level of Suspicion (LOS) from 0 to 5 based on a qualitative lexicon describing the ultrasound appearance of breast lesions. The purpose of this research is to assess and select one of the automated quantitative LOS scoring methods developed during preliminary studies on reducing benign biopsies. The study used a Computer Aided Imaging System (CAIS) to improve the uniformity and accuracy of applying the LOS scheme by automatically detecting, analyzing and comparing breast masses. The overall goal is to reduce biopsies on masses with lower levels of suspicion, rather than to increase the accuracy of diagnosis of cancers (which require biopsy anyway). On complex cyst and fibroadenoma cases, experienced radiologists were up to 50% less certain in true negatives than CAIS. Full correlation analysis was applied to determine which of the proposed LOS quantification methods serves CAIS accuracy best. This paper presents current results of applying statistical analysis to automated LOS scoring quantification for breast masses with known biopsy results. The First Order Ranking method was found to yield the most accurate results. The CAIS system (Image Companion, Data Companion software) was developed by Almen Laboratories and was used to achieve the results.
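
    A rule-based 0-5 LOS scorer can be sketched as a weighted tally of lexicon descriptors. The descriptors and weights below are hypothetical illustrations, not the actual CAIS lexicon, which is not given in this abstract:

```python
# Hypothetical descriptor weights; a real lexicon (e.g. BI-RADS-style terms)
# would be calibrated against biopsy outcomes.
SUSPICIOUS = {"spiculated_margin": 2, "taller_than_wide": 1,
              "posterior_shadowing": 1, "microlobulation": 1}
BENIGN = {"circumscribed_margin": 1, "wider_than_tall": 1,
          "posterior_enhancement": 1}

def los_score(findings):
    """Map a set of lexicon descriptors to a 0-5 Level of Suspicion."""
    suspicious = sum(SUSPICIOUS.get(f, 0) for f in findings)
    benign = sum(BENIGN.get(f, 0) for f in findings)
    # Baseline of 2 represents an indeterminate lesion; clamp to the 0-5 scale.
    return max(0, min(5, suspicious - benign + 2))

high = los_score({"spiculated_margin", "taller_than_wide"})
low = los_score({"circumscribed_margin", "wider_than_tall",
                 "posterior_enhancement"})
```

    The point of such a rule base is reproducibility: two readers who record the same descriptors always obtain the same score, which is the uniformity the CAIS system aims for.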

  17. A timber inventory based upon manual and automated analysis of ERTS-1 and supporting aircraft data using multistage probability sampling. [Plumas National Forest, California

    NASA Technical Reports Server (NTRS)

    Nichols, J. D.; Gialdini, M.; Jaakkola, S.

    1974-01-01

    A quasi-operational study demonstrated that a timber inventory based on manual and automated analysis of ERTS-1 data, supporting aircraft data, and ground data could be made using multistage sampling techniques. The inventory proved to be a timely, cost-effective alternative to conventional timber inventory techniques. The timber volume on the Quincy Ranger District of the Plumas National Forest was estimated to be 2.44 billion board feet with a sampling error of 8.2 percent. The cost of the inventory procedure, at 1.1 cents/acre, compared favorably with the cost of a conventional inventory at 25 cents/acre. A point-by-point comparison of CALSCAN-classified ERTS data with human-interpreted low-altitude photo plots indicated no significant differences in overall classification accuracy.

  18. Assessment Study on Sensors and Automation in the Industries of the Future. Reports on Industrial Controls, Information Processing, Automation, and Robotics

    SciTech Connect

    Bennett, Bonnie; Boddy, Mark; Doyle, Frank; Jamshidi, Mo; Ogunnaike, Tunde

    2004-11-01

    This report presents the results of an expert study to identify research opportunities for Sensors & Automation, a sub-program of the U.S. Department of Energy (DOE) Industrial Technologies Program (ITP). The research opportunities are prioritized by realizable energy savings. The study encompasses the technology areas of industrial controls, information processing, automation, and robotics. These areas have been central areas of focus of many Industries of the Future (IOF) technology roadmaps. This report identifies opportunities for energy savings as a direct result of advances in these areas and also recognizes indirect means of achieving energy savings, such as product quality improvement, productivity improvement, and reduction of recycle.

  19. Automated assessment of footpad dermatitis in broiler chickens at the slaughter-line: evaluation and correspondence with human expert scores.

    PubMed

    Vanderhasselt, R F; Sprenger, M; Duchateau, L; Tuyttens, F A M

    2013-01-01

    Footpad dermatitis is increasingly used as an indicator of decreased broiler welfare, and automation of dermatitis monitoring potentially reduces the effort needed to monitor commercial flocks. In this study we evaluated a prototype system for the automatic assessment of footpad dermatitis in broiler chickens by comparing the automatic assessment with a human expert assessment. The expert aimed to select, on 2 separate occasions, 20 broilers per footpad dermatitis category (5 categories in total) from 2 different flocks of 38-d-old broilers on an experimental farm. Two days later these broilers were transported to the slaughterhouse, where footpad dermatitis was assessed by the automatic system. Subsequently, the footpads were reassessed by the same expert who had selected the birds. Automatic scores were only weakly correlated with scores given by the expert on-farm (r = 0.54) and at the slaughterhouse (r = 0.59). Manual evaluation of the photographs on which the automatic system based its scores revealed several errors. For 41.1% of the birds, the automatic system assessed only one of the footpads, whereas for 15.2% neither footpad was assessed. For 49.4% of the birds, scores were based on partially incorrectly identified areas. When data from such incomplete and obviously incorrect assessments were discarded, stronger correlations between automatic and expert scores were found (r = 0.68 and r = 0.74 for expert scores given on-farm and at-slaughter, respectively). Footpads that were missed by the automatic system were more likely to receive a high expert score at slaughter (P = 0.02). However, average flock scores did not differ greatly between automatic and expert scores. The prototype system for automatic dermatitis assessment needs to be improved on several points if it is to replace expert assessment of footpad dermatitis.

  20. Influence of sample preparation and reliability of automated numerical refocusing in stain-free analysis of dissected tissues with quantitative phase digital holographic microscopy

    NASA Astrophysics Data System (ADS)

    Kemper, Björn; Lenz, Philipp; Bettenworth, Dominik; Krausewitz, Philipp; Domagk, Dirk; Ketelhut, Steffi

    2015-05-01

    Digital holographic microscopy (DHM) has been demonstrated to be a versatile tool for high-resolution, non-destructive quantitative phase imaging of surfaces and for multi-modal, minimally invasive monitoring of living cell cultures in vitro. DHM provides quantitative monitoring of physiological processes through functional imaging and structural analysis, which, for example, gives new insight into the signalling of cellular water permeability and into cell morphology changes due to toxins and infections. Quantitative DHM phase contrast also opens prospective application fields in the analysis of dissected tissues, through stain-free imaging and the quantification of tissue density changes. We show that DHM allows imaging of different tissue layers with high contrast in unstained tissue sections. As the investigation of fixed samples represents a very important application field in pathology, we also analyzed the influence of the sample preparation. The retrieved data demonstrate that the quality of quantitative DHM phase images of dissected tissues depends strongly on the fixing method and on common staining agents. As in DHM the reconstruction is performed numerically, multi-focus imaging is achieved from a single digital hologram. We therefore evaluated the automated refocussing feature of DHM on different types of dissected tissues and found that highly reproducible holographic autofocussing can be achieved on moderately stained samples. Finally, it is demonstrated that alterations of the spatial refractive index distribution in murine and human tissue samples represent a reliable absolute parameter that is related to different degrees of inflammation in experimental colitis and Crohn's disease. This paves the way toward the use of DHM in digital pathology for automated histological examinations and further studies to elucidate the translational potential of quantitative phase microscopy for the clinical management of patients, e.g., with inflammatory bowel disease.

  1. High-performance liquid chromatographic determination of ochratoxin A in artificially contaminated cocoa beans using automated sample clean-up.

    PubMed

    Hurst, W J; Martin, R A

    1998-06-12

    An HPLC method is described for the analysis of ochratoxin A at low-ppb levels in samples of artificially contaminated cocoa beans. The samples are extracted in a mixture of methanol-water containing ascorbic acid, adjusted to pH and evaporated to dryness. Samples in this state are then placed onto a Benchmate sample preparation workstation, where C18 solid-phase extraction operations are performed. The resulting materials are evaporated to dryness and analyzed by reversed-phase HPLC with fluorescence detection. The method was evaluated for accuracy and precision, with R.S.D.s for multiple injections calculated as 1.1% for sample and 2.5% for standard. Recoveries of ochratoxin A added to cocoa beans ranged from 87 to 106% over the range of the assay.

  2. Quantifying Vocal Mimicry in the Greater Racket-Tailed Drongo: A Comparison of Automated Methods and Human Assessment

    PubMed Central

    Agnihotri, Samira; Sundeep, P. V. D. S.; Seelamantula, Chandra Sekhar; Balakrishnan, Rohini

    2014-01-01

    Objective identification and description of mimicked calls is a primary component of any study on avian vocal mimicry but few studies have adopted a quantitative approach. We used spectral feature representations commonly used in human speech analysis in combination with various distance metrics to distinguish between mimicked and non-mimicked calls of the greater racket-tailed drongo, Dicrurus paradiseus and cross-validated the results with human assessment of spectral similarity. We found that the automated method and human subjects performed similarly in terms of the overall number of correct matches of mimicked calls to putative model calls. However, the two methods also misclassified different subsets of calls and we achieved a maximum accuracy of ninety five per cent only when we combined the results of both the methods. This study is the first to use Mel-frequency Cepstral Coefficients and Relative Spectral Amplitude - filtered Linear Predictive Coding coefficients to quantify vocal mimicry. Our findings also suggest that in spite of several advances in automated methods of song analysis, corresponding cross-validation by humans remains essential. PMID:24603717
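
    Matching a mimicked call to a putative model call with spectral feature vectors and a distance metric can be sketched as follows. The feature vectors and species labels are purely illustrative; in the study, MFCC and RASTA-filtered LPC features extracted from recordings would take their place:

```python
import math

def cosine_similarity(a, b):
    """Cosine of the angle between two feature vectors (1.0 = identical direction)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

def best_model_match(mimic_vec, model_vecs):
    """Return the model-call label whose feature vector is most similar."""
    return max(model_vecs, key=lambda k: cosine_similarity(mimic_vec, model_vecs[k]))

# Hypothetical 3-dimensional feature vectors for two putative model species.
models = {"jungle_babbler": [0.9, 0.1, 0.3], "shikra": [0.1, 0.8, 0.5]}
match = best_model_match([0.85, 0.15, 0.25], models)
```

    Real systems compare whole feature-vector sequences (e.g. with dynamic time warping) rather than single vectors, but the classification step reduces to exactly this nearest-template decision.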

  3. Automated 3D quantitative assessment and measurement of alpha angles from the femoral head-neck junction using MR imaging

    NASA Astrophysics Data System (ADS)

    Xia, Ying; Fripp, Jurgen; Chandra, Shekhar S.; Walker, Duncan; Crozier, Stuart; Engstrom, Craig

    2015-10-01

    To develop an automated approach for 3D quantitative assessment and measurement of alpha angles from the femoral head-neck (FHN) junction using bone models derived from magnetic resonance (MR) images of the hip joint. Bilateral MR images of the hip joints were acquired from 30 male volunteers (healthy active individuals and high-performance athletes, aged 18-49 years) using a water-excited 3D dual echo steady state (DESS) sequence. In a subset of these subjects (18 water-polo players), additional True Fast Imaging with Steady-state Precession (TrueFISP) images were acquired from the right hip joint. For both MR image sets, an active shape model based algorithm was used to generate automated 3D bone reconstructions of the proximal femur. Subsequently, a local coordinate system of the femur was constructed to compute a 2D shape map to project femoral head sphericity for calculation of alpha angles around the FHN junction. To evaluate automated alpha angle measures, manual analyses were performed on anterosuperior and anterior radial MR slices from the FHN junction that were automatically reformatted using the constructed coordinate system. High intra- and inter-rater reliability (intra-class correlation coefficients > 0.95) was found for manual alpha angle measurements from the auto-extracted anterosuperior and anterior radial slices. Strong correlations were observed between manual and automatic measures of alpha angles for anterosuperior (r = 0.84) and anterior (r = 0.92) FHN positions. For matched DESS and TrueFISP images, there were no significant differences between automated alpha angle measures obtained from the upper anterior quadrant of the FHN junction (two-way repeated measures ANOVA, F < 0.01, p = 0.98). Our automatic 3D method analysed MR images of the hip joints to generate alpha angle measures around the FHN junction circumference with very good reliability and reproducibility. This work has the

  4. Automated system for generation of soil moisture products for agricultural drought assessment

    NASA Astrophysics Data System (ADS)

    Raja Shekhar, S. S.; Chandrasekar, K.; Sesha Sai, M. V. R.; Diwakar, P. G.; Dadhwal, V. K.

    2014-11-01

    Drought is a frequently occurring disaster affecting the lives of millions of people across the world every year. Several parameters, indices and models are used globally for drought forecasting/early warning and for monitoring drought prevalence, persistence and severity. Since drought is a complex phenomenon, a large number of parameters/indices need to be evaluated to sufficiently address the problem. It is a challenge to generate input parameters from different sources such as space-based data, ground data and collateral data in short intervals of time, where there may be limitations in terms of processing power, availability of domain expertise, and specialized models and tools. In this study, an effort has been made to automate the derivation of one of the important parameters in drought studies, viz. soil moisture. The soil water balance bucket model is in vogue for arriving at soil moisture products and is widely popular for its sensitivity to soil conditions and rainfall parameters. This model has been encoded into a "Fish-Bone" architecture using COM technologies and open source libraries for the best possible automation, to fulfill the need for a standard procedure of preparing input parameters and processing routines. The main aim of the system is to provide an operational environment for the generation of soil moisture products, allowing users to concentrate on further enhancements and implementation of these parameters in related areas of research without re-discovering the established models. The emphasis of the architecture is on available open source libraries for GIS and raster I/O operations for different file formats, to ensure that the products can be widely distributed without the burden of any commercial dependencies. Further, the system is automated to the extent of user-free operation if required, with inbuilt chain processing for everyday generation of products at specified intervals. The operational software has inbuilt capabilities to automatically
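
    A single-layer soil water balance bucket model of the kind referred to can be sketched in a few lines. The bucket capacity, initial moisture, and the linear moisture-limitation of evapotranspiration below are illustrative assumptions, not the parameters of the operational system:

```python
def bucket_soil_moisture(rain_mm, pet_mm, capacity_mm=150.0, sm0_mm=75.0):
    """Daily one-layer bucket water balance:
        SM(t) = SM(t-1) + rain - AET, clamped to [0, capacity].
    Actual evapotranspiration (AET) is potential ET scaled by relative
    soil wetness; rain in excess of capacity is lost as runoff/drainage."""
    sm, series = sm0_mm, []
    for rain, pet in zip(rain_mm, pet_mm):
        aet = pet * (sm / capacity_mm)   # moisture-limited evapotranspiration
        sm = min(capacity_mm, max(0.0, sm + rain - aet))
        series.append(sm)
    return series

# Four days: 10 mm rain, two dry days, then a 25 mm event, with 5 mm/day PET.
series = bucket_soil_moisture([10.0, 0.0, 0.0, 25.0], [5.0, 5.0, 5.0, 5.0])
```

    In an operational chain this daily update would run per pixel over gridded rainfall and PET rasters, which is exactly the kind of raster I/O the described architecture wraps around the model.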

  5. Design, construction and six years' experience of an integrated system for automated handling of discrete blood samples.

    PubMed

    Andersson, J L; Schneider, H

    1998-01-01

    The present paper describes the design of an integrated system to aid in the taking and measurement of manual blood samples during nuclear medical examinations requiring blood sampling. In contrast to previously published systems, the present system is not used in the actual sampling of the blood, but aims to aid in all other aspects of handling and measurement. It consists of two main parts. One part is a distributed software system running on the scanner host computer used to register sample times, to display information pertaining to the ongoing examination and to collect data from a number of well crystals. The other main part consists of an industrial robot used to perform the actual weighing, centrifugation, pipetting and measurement of the samples. The system has been operational for 6 years, during which time it has had an "up-time" in excess of 95% and has handled and measured the blood samples from more than 5000 examinations, each comprising an average of 15 blood samples. The throughput of the system is 50 whole blood samples or 21 plasma samples per hour. In addition it has to a large extent removed the "human factor" from the process, thereby increasing the reliability of the data.

  6. [Information as physical factor: problems of measurement, hygienic assessment and IT-automation].

    PubMed

    Denisov, É I; Prokopenko, L V; Eremin, A L; Kur'erov, N N; Bodiakin, V I; Stepanian, I V

    2014-01-01

    The increasing flow of information, while speeding up the progress of society, can affect health, which poses the task of its hygienic regulation. The physical aspects of information, its parameters and units of measurement, and aspects of its measurement and evaluation are considered with account of information quantity and quality, as well as criteria for its permissible and optimal levels. The results of measurements of the quantity of text information produced per year on a computer in 17 occupations across 10 economic sectors are presented. The principle of IT-automation of the operator's work and of dynamic monitoring is proposed. On the basis of the research performed, a glossary of terms and a computer-supported guide to the problem were elaborated for the accumulation of experience and clarification of prospects.

  7. AUTOMATED GEOSPATIAL WATERSHED ASSESSMENT: A GIS-BASED HYDROLOGIC MODELING TOOL

    EPA Science Inventory

    Planning and assessment in land and water resource management are evolving toward complex, spatially explicit regional assessments. These problems have to be addressed with distributed models that can compute runoff and erosion at different spatial and temporal scales. The extens...

  8. On-line automated sample preparation for liquid chromatography using parallel supported liquid membrane extraction and microporous membrane liquid-liquid extraction.

    PubMed

    Sandahl, Margareta; Mathiasson, Lennart; Jönsson, Jan Ake

    2002-10-25

    An automated system was developed for the analysis of non-polar and polar ionisable compounds at trace levels in natural water. Sample work-up was performed in a flow system using two parallel membrane extraction units. This system was connected on-line to a reversed-phase HPLC system for the final determination. One of the membrane units was used for supported liquid membrane (SLM) extraction, which is suitable for ionisable or permanently charged compounds. The other unit was used for microporous membrane liquid-liquid extraction (MMLLE), suitable for uncharged compounds. The fungicide thiophanate methyl and its polar metabolites carbendazim and 2-aminobenzimidazole were used as model compounds. The whole system was controlled by means of four syringe pumps. While one part of the sample was extracted using the SLM technique, the extract from the MMLLE extraction was analysed, and vice versa. This gave a total analysis time of 63 min for each sample, resulting in a sample throughput of 22 samples per 24 h.

  9. Improvements in automated analysis of catecholamine and related metabolites in biological samples by column-switching high-performance liquid chromatography.

    PubMed

    Grossi, G; Bargossi, A M; Lucarelli, C; Paradisi, R; Sprovieri, C; Sprovieri, G

    1991-03-22

    Previously two fully automated methods based on column switching and high-performance liquid chromatography have been described, one for plasma and urinary catecholamines and the other for catecholamine urinary metabolites. Improvements in these methods, after 3 years of routine application, are now reported. The sample processing scheme was changed in order to eliminate memory effects and, in the procedure for plasma catecholamines, a pre-analytical deproteinization step was added which enhances the analytical column lifetime. The applied voltages for the electrochemical detector have been optimized, resulting in an automated method, suitable for the simultaneous determination of vanillylmandelic acid, 3,4-dihydroxyphenylacetic acid, homovanillic acid and 5-hydroxyindoleacetic acid. The sensitivity of the methods allows the detection of 2-3 ng/l of plasma catecholamines and 0.01-0.06 mg/l of urinary metabolites. Also, it is possible to switch from one method to the other in only 30 min. The normal values obtained from 200 healthy people are reported, together with a list of 57 potential interfering substances tested.

  10. Rapid assessment of soil and groundwater tritium by vegetation sampling

    SciTech Connect

    Murphy, C.E. Jr.

    1995-09-01

    A rapid and relatively inexpensive technique for defining the extent of groundwater contamination by tritium has been investigated. The technique uses existing vegetation to sample the groundwater. Water taken up by deep rooted trees is collected by enclosing tree branches in clear plastic bags. Water evaporated from the leaves condenses on the inner surface of the bag. The water is removed from the bag with a syringe. The bags can be sampled many times. Tritium in the water is detected by liquid scintillation counting. The water collected in the bags has no color and counts as well as distilled water reference samples. The technique was used in an area of known tritium contamination and proved to be useful in defining the extent of tritium contamination.

  11. Design and construction of a medium-scale automated direct measurement respirometric system to assess aerobic biodegradation of polymers

    NASA Astrophysics Data System (ADS)

    Castro Aguirre, Edgar

    A medium-scale automated direct measurement respirometric (DMR) system was designed and built to assess the aerobic biodegradation of up to 30 materials in triplicate simultaneously. Likewise, a computer application was developed for rapid analysis of the data generated. The developed DMR system was able to simulate different testing conditions by varying temperature and relative humidity, which are the major exposure conditions affecting biodegradation. Two complete tests for determining the aerobic biodegradation of polymers under composting conditions were performed to show the efficacy and efficiency of both the DMR system and the DMR data analyzer. In both cases, cellulose reached 70% mineralization at 139 and 45 days. The difference in time for cellulose to reach 70% mineralization was attributed to the composition of the compost and water availability, which highly affect the biodegradation rate. Finally, among the tested materials, at least 60% of the organic carbon content of the biodegradable polymers was converted into carbon dioxide by the end of the test.
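
    Respirometric tests like this report mineralization as the fraction of the material's organic carbon that is evolved as CO2, relative to the theoretical CO2 (organic carbon times the 44/12 CO2-to-carbon molar mass ratio), after subtracting the CO2 from blank compost. A sketch with illustrative numbers:

```python
def percent_mineralization(co2_sample_g, co2_blank_g, organic_carbon_g):
    """Aerobic biodegradation as % of organic carbon evolved as CO2
    (the standard composting-test calculation; numbers here are illustrative).

    Theoretical CO2 = organic carbon x 44/12 (molar masses of CO2 and C)."""
    theoretical_co2 = organic_carbon_g * 44.0 / 12.0
    return 100.0 * (co2_sample_g - co2_blank_g) / theoretical_co2

# e.g. a sample with 20 g of organic carbon; the test vessel evolved 55 g CO2
# while the blank compost vessel evolved 15 g over the same period.
pm = percent_mineralization(55.0, 15.0, 20.0)
```

    The 70% cellulose-mineralization figures in the abstract are checkpoints of exactly this quantity, computed cumulatively from the CO2 measured by the DMR system.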

  12. Adaptive Sampling Algorithms for Probabilistic Risk Assessment of Nuclear Simulations

    SciTech Connect

    Diego Mandelli; Dan Maljovec; Bei Wang; Valerio Pascucci; Peer-Timo Bremer

    2013-09-01

    Nuclear simulations are often computationally expensive, time-consuming, and high-dimensional with respect to the number of input parameters. Thus, exploring the space of all possible simulation outcomes is infeasible using finite computing resources. During simulation-based probabilistic risk analysis, it is important to discover the relationship between a potentially large number of input parameters and the output of a simulation using as few simulation trials as possible. This is a typical context for performing adaptive sampling, where a few observations are obtained from the simulation, a surrogate model is built to represent the simulation space, and new samples are selected based on the model constructed. The surrogate model is then updated based on the simulation results of the sampled points. In this way, we attempt to gain the most information possible with a small number of carefully selected sampled points, limiting the number of expensive trials needed to understand features of the simulation space. We analyze the specific use case of identifying the limit surface, i.e., the boundaries in the simulation space between system failure and system success. In this study, we explore several techniques for adaptively sampling the parameter space in order to reconstruct the limit surface. We focus on several adaptive sampling schemes. First, we seek to learn a global model of the entire simulation space using prediction models or neighborhood graphs and extract the limit surface as an iso-surface of the global model. Second, we estimate the limit surface by sampling in the neighborhood of the current estimate based on topological segmentations obtained locally. Our techniques draw inspiration from a topological structure known as the Morse-Smale complex. We highlight the advantages and disadvantages of using a global prediction model versus local topological view of the simulation space, comparing several different strategies for adaptive sampling in both
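    The loop the abstract describes (fit a surrogate, sample near the estimated limit surface, rerun the simulator, refit) can be sketched in a few lines. The threshold simulator and the one-dimensional midpoint "surrogate" below are hypothetical stand-ins, not the authors' models:

```python
import random

def simulator(x):
    # Hypothetical expensive simulation: 1 (success) below a failure
    # threshold of 0.62, else 0 (failure). Stand-in only.
    return 1 if x < 0.62 else 0

def fit_surrogate(samples):
    # Trivial 1-D "surrogate": midpoint between the largest input that
    # succeeded and the smallest input that failed.
    succ = [x for x, y in samples if y == 1]
    fail = [x for x, y in samples if y == 0]
    return (max(succ) + min(fail)) / 2.0

random.seed(0)
samples = [(x, simulator(x)) for x in (0.0, 1.0)]  # initial design
for _ in range(20):
    est = fit_surrogate(samples)
    # Adaptive step: draw the next sample near the current limit-surface
    # estimate instead of sampling the whole input space uniformly.
    x_new = min(1.0, max(0.0, est + random.uniform(-0.05, 0.05)))
    samples.append((x_new, simulator(x_new)))

limit_estimate = fit_surrogate(samples)  # converges toward the threshold 0.62
```

    With 22 simulator calls the bracket around the failure boundary shrinks to roughly the jitter width; a real scheme would replace the midpoint rule with the prediction models or topological segmentations described above.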

  13. Genesis Solar Wind Collector Cleaning Assessment: 60366 Sample Case Study

    NASA Technical Reports Server (NTRS)

    Goreva, Y. S.; Gonzalez, C. P.; Kuhlman, K. R.; Burnett, D. S.; Woolum, D.; Jurewicz, A. J.; Allton, J. H.; Rodriguez, M. C.; Burkett, P. J.

    2014-01-01

    In order to recognize, localize, characterize and remove particle and thin film surface contamination, a small subset of Genesis mission collector fragments are being subjected to extensive study via various techniques [1-5]. Here we present preliminary results for sample 60366, a Czochralski silicon (Si-CZ) based wafer from the bulk array (B/C).

  14. Towards automated early cancer detection: Non-invasive, fluorescence-based approaches for quantitative assessment of cells and tissue to identify pre-cancers

    NASA Astrophysics Data System (ADS)

    Levitt, Jonathan Michael

    Cancer is the second leading cause of death globally, second only to heart disease. As in many diseases, patient survival is directly related to how early lesions are detected. Using conventional screening methods, the early changes associated with cancer, which occur on the microscopic scale, can easily go overlooked. Due to the inherent drawbacks of conventional techniques, we present non-invasive, optically based methods to acquire high resolution images from live samples and assess cellular function associated with the onset of disease. Specifically, we acquired fluorescence images from NADH and FAD to quantify morphology and metabolic activity. We first conducted studies to monitor monolayers of keratinocytes in response to apoptosis, which has been shown to be disrupted during cancer progression. We found that as keratinocytes undergo apoptosis, populations of mitochondria that exhibit higher metabolic activity become progressively confined to a gradually smaller perinuclear region. To further assess the changes associated with early cancer growth, we developed automated methods to rapidly quantify fluorescence images and extract morphological and metabolic information from live tissue. In this study, we simultaneously quantified mitochondrial organization, metabolic activity, nuclear size distribution, and the localization of the structural protein keratin, to differentiate between normal and pre-cancerous engineered tissues. We found that the degree of mitochondrial organization, as determined from the fractal-derived Hurst parameter, was well correlated with the level of cellular differentiation. We also found that the metabolic activity in the pre-cancerous cells was greater and more consistent throughout tissue depths in comparison to normal tissue. Keratin localization, also quantified from the fluorescence images, was confined to the uppermost layers of normal tissue, while it was more evenly distributed in the precancerous tissues. To

  15. Locoregional Control of Non-Small Cell Lung Cancer in Relation to Automated Early Assessment of Tumor Regression on Cone Beam Computed Tomography

    SciTech Connect

    Brink, Carsten; Bernchou, Uffe; Bertelsen, Anders; Hansen, Olfred; Schytte, Tine; Bentzen, Soren M.

    2014-07-15

    Purpose: Large interindividual variations in volume regression of non-small cell lung cancer (NSCLC) are observable on standard cone beam computed tomography (CBCT) during fractionated radiation therapy. Here, a method for automated assessment of tumor volume regression is presented and its potential use in response adapted personalized radiation therapy is evaluated empirically. Methods and Materials: Automated deformable registration with calculation of the Jacobian determinant was applied to serial CBCT scans in a series of 99 patients with NSCLC. Tumor volume at the end of treatment was estimated on the basis of the first one third and two thirds of the scans. The concordance between estimated and actual relative volume at the end of radiation therapy was quantified by Pearson's correlation coefficient. On the basis of the estimated relative volume, the patients were stratified into 2 groups having volume regressions below or above the population median value. Kaplan-Meier plots of locoregional disease-free rate and overall survival in the 2 groups were used to evaluate the predictive value of tumor regression during treatment. A Cox proportional hazards model was used to adjust for other clinical characteristics. Results: Automatic measurement of the tumor regression from standard CBCT images was feasible. Pearson's correlation coefficient between manual and automatic measurement was 0.86 in a sample of 9 patients. Most patients experienced tumor volume regression, and this could be quantified early into the treatment course. Interestingly, patients with pronounced volume regression had worse locoregional tumor control and overall survival. This effect was significant in patients with non-adenocarcinoma histology. Conclusions: Evaluation of routinely acquired CBCT images during radiation therapy provides biological information on the specific tumor. This could potentially form the basis for personalized response adaptive therapy.
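    The key quantity here is the Jacobian determinant of the deformation field: at each point it gives the local relative volume change, and integrating it over the tumor yields the regressed volume. As a toy illustration only, the sketch below evaluates the determinant numerically for a hypothetical deformation that shrinks each axis uniformly by 10% (a real registration produces a spatially varying field):

```python
# Hypothetical deformation: uniform 10%-per-axis shrinkage of the tumor region.
def deform(x, y, z):
    s = 0.9
    return (s * x, s * y, s * z)

def jacobian_det(f, p, h=1e-5):
    # Numerical Jacobian determinant of a 3-D mapping via central differences.
    x, y, z = p
    cols = []
    for axis in range(3):
        dp = [0.0, 0.0, 0.0]
        dp[axis] = h
        fp = f(x + dp[0], y + dp[1], z + dp[2])
        fm = f(x - dp[0], y - dp[1], z - dp[2])
        cols.append([(a - b) / (2 * h) for a, b in zip(fp, fm)])
    # cols[j] is column j of the Jacobian; expand the 3x3 determinant.
    a, b, c = cols
    return (a[0] * (b[1] * c[2] - b[2] * c[1])
          - b[0] * (a[1] * c[2] - a[2] * c[1])
          + c[0] * (a[1] * b[2] - a[2] * b[1]))

# Local relative volume after deformation: 0.9 cubed = 0.729.
relative_volume = jacobian_det(deform, (1.0, 2.0, 3.0))
```

    A determinant below 1 means local shrinkage; averaging it over all tumor voxels gives the relative volume used to stratify patients.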

  16. An automated on-line minicolumn preconcentration cold vapour atomic absorption spectrometer: application to determination of cadmium in water samples.

    PubMed

    Sahan, Serkan; Sahin, Uğur

    2012-01-15

    A method was developed for on-line solid phase preconcentration and cold vapour atomic absorption spectrometric determination of Cd(II) in aqueous samples. Lewatit Monoplus TP207 iminodiacetate chelating resin was used for the separation and preconcentration of Cd(II) ions at pH 4.0. The whole system was lab-made. The influence of analytical parameters such as the concentrations of the eluent and sodium tetrahydroborate solutions, the flow rates of the eluent, sample, and Ar, and the effect of matrix ions was investigated. A preconcentration factor of 20 and a detection limit (3s(b)) of 2.1 ng L(-1), along with a sampling frequency of 28 h(-1), were achieved with 1.4 min of sample loading time and 2.8 mL of sample consumption. The relative standard deviation (R.S.D.) was 2.5% at the 0.05 μg L(-1) Cd(II) level. The developed method was used for Cd(II) analysis in water samples. The certified reference material (LGC6019) experimental results are in good agreement with the certified value.
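    The 3s(b) detection limit quoted above is a standard criterion: three times the standard deviation of replicate blank signals, divided by the calibration slope. A minimal sketch with hypothetical blank readings and a hypothetical slope (neither value is from the paper):

```python
import statistics

# Hypothetical blank-signal replicates (absorbance units) and calibration
# slope (absorbance per ng L^-1); illustrative numbers only.
blanks = [0.0011, 0.0013, 0.0010, 0.0012, 0.0011, 0.0014, 0.0012]
slope = 0.00018

s_b = statistics.stdev(blanks)  # sample standard deviation of the blank
lod = 3 * s_b / slope           # 3s(b) detection limit, in ng L^-1
```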

  17. Simple semi-automated portable capillary electrophoresis instrument with contactless conductivity detection for the determination of β-agonists in pharmaceutical and pig-feed samples.

    PubMed

    Nguyen, Thi Anh Huong; Pham, Thi Ngoc Mai; Doan, Thi Tuoi; Ta, Thi Thao; Sáiz, Jorge; Nguyen, Thi Quynh Hoa; Hauser, Peter C; Mai, Thanh Duc

    2014-09-19

    An inexpensive, robust and easy-to-use portable capillary electrophoresis instrument with miniaturized high-voltage capacitively coupled contactless conductivity detection was developed. The system utilizes pneumatic operation to manipulate the solutions for all flushing steps. The different operations, i.e. capillary flushing, interface rinsing, and electrophoretic separation, are easily activated by turning an electronic switch. To allow the analysis of samples with limited available volume, and to render the construction less complicated compared to a computer-controlled counterpart, sample injection is carried out hydrodynamically directly from the sample vial into the capillary by manual syphoning. The system is a well-performing alternative where the financial means for highly expensive commercial instruments are not available and where the in-house construction of a sophisticated automated instrument is not possible due to limited mechanical and electronic workshop facilities and software programming expertise. For demonstration, the system was employed successfully for the determination of some β-agonists, namely salbutamol, metoprolol and ractopamine, down to 0.7 ppm in pharmaceutical and pig-feed sample matrices in Vietnam.

  18. Automated in-syringe single-drop head-space micro-extraction applied to the determination of ethanol in wine samples.

    PubMed

    Srámková, Ivana; Horstkotte, Burkhard; Solich, Petr; Sklenářová, Hana

    2014-05-30

    A novel approach of head-space single-drop micro-extraction applied to the determination of ethanol in wine is presented. For the first time, the syringe of an automated syringe pump was used as an extraction chamber of adaptable size for a volatile analyte. This approach made it possible to apply negative pressure during the enrichment step, which favored the evaporation of the analyte. By placing a slowly spinning magnetic stirring bar inside the syringe, effective syringe cleaning as well as mixing of the sample with buffer solution to suppress the interference of acetic acid was achieved. Ethanol determination was based on the reduction of a single drop of 3 mmol L(-1) potassium dichromate dissolved in 8 mol L(-1) sulfuric acid. The drop was positioned in the syringe inlet in the head-space above the sample, with posterior spectrophotometric quantification. The entire procedure was carried out automatically using a simple sequential injection analyzer system. One analysis required less than 5 min including the washing step. A limit of detection of 0.025% (v/v) of ethanol and an average repeatability of less than 5.0% RSD were achieved. The consumption of dichromate reagent, buffer, and sample per analysis was only 20 μL, 200 μL, and 1 mL, respectively. The results of real samples analysis did not differ significantly from those obtained with the reference gas chromatography method.

  19. GenomEra MRSA/SA, a fully automated homogeneous PCR assay for rapid detection of Staphylococcus aureus and the marker of methicillin resistance in various sample matrixes.

    PubMed

    Hirvonen, Jari J; Kaukoranta, Suvi-Sirkku

    2013-09-01

    The GenomEra MRSA/SA assay (Abacus Diagnostica, Turku, Finland) is the first commercial homogeneous PCR assay using thermally stable, intrinsically fluorescent time-resolved fluorometric (TRF) labels resistant to autofluorescence and other background effects. This fully automated closed tube PCR assay simultaneously detects Staphylococcus aureus specific DNA and the mecA gene within 50 min. It can be used for both screening and confirmation of methicillin-resistant and -sensitive S. aureus (MRSA and MSSA) directly in different specimen types or from preceding cultures. The assay has shown excellent performance in comparisons with other diagnostic methods in all the sample types tested. The GenomEra MRSA/SA assay provides rapid assistance for the detection of MRSA as well as invasive staphylococcal infections and helps the early targeting of antimicrobial therapy to patients with potential MRSA infection.

  20. Universality of Generalized Bunching and Efficient Assessment of Boson Sampling

    NASA Astrophysics Data System (ADS)

    Shchesnovich, V. S.

    2016-03-01

    It is found that identical bosons (fermions) show a generalized bunching (antibunching) property in linear networks: the absolute maximum (minimum) of the probability that all N input particles are detected in a subset of K output modes of any nontrivial linear M-mode network is attained only by completely indistinguishable bosons (fermions). For fermions K is arbitrary; for bosons it is either (i) arbitrary for only classically correlated bosons or (ii) satisfies K ≥ N (or K = 1) for arbitrary input states of N particles. The generalized bunching allows us to certify, in a number of runs polynomial in N, that a physical device realizing boson sampling with an arbitrary network operates in the regime of full quantum coherence compatible only with completely indistinguishable bosons. The protocol needs only polynomial classical computations for the standard boson sampling, whereas an analytic formula is available for the scattershot version.

  1. Development of Genesis Solar Wind Sample Cleanliness Assessment: Initial Report on Sample 60341 Optical Imagery and Elemental Mapping

    NASA Technical Reports Server (NTRS)

    Gonzalez, C. P.; Goreva, Y. S.; Burnett, D. S.; Woolum, D.; Jurewicz, A. J.; Allton, J. H.; Rodriguez, P. J.; Burkett, P. J.

    2014-01-01

    Since 2005 the Genesis science team has experimented with techniques for removing the contaminant particles and films from the collection surface of the Genesis fragments. A subset of 40 samples have been designated as "cleaning matrix" samples. These are small samples to which various cleaning approaches are applied and then cleanliness is assessed optically, by TRXRF, SEM, ToF-SIMS, XPS, ellipsometry or other means [1-9]. Most of these samples remain available for allocation, with cleanliness assessment data. This assessment allows evaluation of various cleaning techniques and handling or analytical effects. Cleaning techniques investigated by the Genesis community include acid/base etching, acetate replica peels, ion beam, and CO2 snow jet cleaning [10-16]. JSC provides surface cleaning using UV ozone exposure and ultra-pure water (UPW) [17-20]. The UPW rinse is commonly used to clean samples for handling debris between processing by different researchers. Optical microscopic images of the sample taken before and after UPW cleaning show what has been added or removed during the cleaning process.

  2. Assessing total and volatile solids in municipal solid waste samples.

    PubMed

    Peces, M; Astals, S; Mata-Alvarez, J

    2014-01-01

    Municipal solid waste is broadly generated in everyday activities and its treatment is a global challenge. Total solids (TS) and volatile solids (VS) are typical control parameters measured in biological treatments. In this study, the TS and VS were determined using the standard methods, as well as introducing some variants: (i) the drying temperature for the TS assays was 105°C, 70°C and 50°C and (ii) the VS were determined using different heating ramps from room temperature to 550°C. TS could be determined at either 105°C or 70°C, but oven residence time was tripled at 70°C, increasing from 48 to 144 h. The VS could be determined by smouldering the sample (where the sample is burnt without a flame), which avoids the release of fumes and odours in the laboratory. However, smouldering can generate undesired pyrolysis products as a consequence of carbonization, which leads to VS being underestimated. Carbonization can be avoided using slow heating ramps to prevent the oxygen limitation. Furthermore, crushing the sample cores decreased the time to reach constant weight and decreased the potential to underestimate VS.
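    Behind these assays, TS and VS reduce to simple weight ratios: TS is the dried residue as a fraction of wet weight, and VS is the mass lost on ignition, here expressed as a fraction of TS (conventions vary; some labs report VS per wet weight). The crucible weighings below are hypothetical:

```python
# Hypothetical crucible weighings (grams) for one waste sample.
crucible = 25.000
crucible_wet = 35.000   # crucible + wet sample
crucible_dry = 28.000   # after drying to constant weight (e.g. 105 degC)
crucible_ash = 25.900   # after ignition at 550 degC

wet = crucible_wet - crucible
dry = crucible_dry - crucible
ash = crucible_ash - crucible

ts_pct = 100.0 * dry / wet                 # total solids, % of wet weight
vs_pct_of_ts = 100.0 * (dry - ash) / dry   # volatile solids, % of TS
```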

  3. An automated method for assessing routine radiographs of patients with total hip replacements.

    PubMed

    Redhead, A L; Kotcheff, A C; Taylor, C J; Porter, M L; Hukins, D W

    1997-01-01

    This paper describes a new, fully automated method of locating objects on radiographs of patients with total joint replacements (TJRs). A statistical computer model, known as an active shape model, was trained to identify the position of the femur, pelvis, stem and cup marker wire on radiographs of patients with Charnley total hip prostheses. Once trained, the model was able to locate these objects through a process of automatic image searching, despite their appearance depending on the orientation and anatomy of the patient. Experiments were carried out to test the accuracy with which the model was able to fit to previously unseen data and with which reference points could be calculated from the model points. The model was able to locate the femur and stem with a mean error of approximately 0.8 mm and a 95 per cent confidence limit of 1.7 mm. Once the model had successfully located these objects, the midpoint of the stem head could be calculated with a mean error of approximately 0.2 mm. Although the model has been trained on Charnley total hip replacements, the method is generic and so can be applied to radiographs of patients with any TJR. This paper shows that computer models can form the basis of a quick, automatic method of taking measurements from standard clinical radiographs.

  4. Assessing Rotation-Invariant Feature Classification for Automated Wildebeest Population Counts

    PubMed Central

    Torney, Colin J.; Dobson, Andrew P.; Borner, Felix; Lloyd-Jones, David J.; Moyer, David; Maliti, Honori T.; Mwita, Machoke; Fredrick, Howard; Borner, Markus; Hopcraft, J. Grant C.

    2016-01-01

    Accurate and on-demand animal population counts are the holy grail for wildlife conservation organizations throughout the world because they enable fast and responsive adaptive management policies. While the collection of image data from camera traps, satellites, and manned or unmanned aircraft has advanced significantly, the detection and identification of animals within images remains a major bottleneck since counting is primarily conducted by dedicated enumerators or citizen scientists. Recent developments in the field of computer vision suggest a potential resolution to this issue through the use of rotation-invariant object descriptors combined with machine learning algorithms. Here we implement an algorithm to detect and count wildebeest from aerial images collected in the Serengeti National Park in 2009 as part of the biennial wildebeest count. We find that the per image error rates are greater than, but comparable to, two separate human counts. For the total count, the algorithm is more accurate than both manual counts, suggesting that human counters have a tendency to systematically over or under count images. While the accuracy of the algorithm is not yet at an acceptable level for fully automatic counts, our results show this method is a promising avenue for further research and we highlight specific areas where future research should focus in order to develop fast and accurate enumeration of aerial count data. If combined with a bespoke image collection protocol, this approach may yield a fully automated wildebeest count in the near future. PMID:27227888
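    A minimal illustration of the rotation-invariance idea behind such detectors: a descriptor built from pixel intensities binned by distance from the patch centre is unchanged when the patch is rotated, so an animal is scored the same regardless of its heading in the aerial image. The 5 × 5 "blob" patch is hypothetical, and real systems use much richer descriptors:

```python
def radial_descriptor(patch, n_bins=3):
    # Mean intensity in concentric rings around the patch centre; any
    # rotation of the patch permutes pixels within rings, so the
    # descriptor is rotation-invariant (for 90-degree grid rotations exactly).
    size = len(patch)
    c = (size - 1) / 2.0
    rmax = (2 * c * c) ** 0.5
    sums = [0.0] * n_bins
    counts = [0] * n_bins
    for i in range(size):
        for j in range(size):
            r = ((i - c) ** 2 + (j - c) ** 2) ** 0.5
            b = min(int(r / rmax * n_bins), n_bins - 1)
            sums[b] += patch[i][j]
            counts[b] += 1
    return [s / max(k, 1) for s, k in zip(sums, counts)]

# A dark blob on bright ground, and the same patch rotated 90 degrees.
blob = [[9, 9, 9, 9, 9],
        [9, 2, 1, 9, 9],
        [9, 1, 1, 9, 9],
        [9, 9, 9, 9, 9],
        [9, 9, 9, 9, 9]]
rot90 = [list(row) for row in zip(*blob[::-1])]

d1 = radial_descriptor(blob)
d2 = radial_descriptor(rot90)  # identical to d1
```

    A classifier trained on such descriptors then separates "wildebeest" from "background" windows without needing rotated copies of every training example.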

  5. Trypanosoma cruzi infectivity assessment in "in vitro" culture systems by automated cell counting.

    PubMed

    Liempi, Ana; Castillo, Christian; Cerda, Mauricio; Droguett, Daniel; Duaso, Juan; Barahona, Katherine; Hernández, Ariane; Díaz-Luján, Cintia; Fretes, Ricardo; Härtel, Steffen; Kemmerling, Ulrike

    2015-03-01

    Chagas disease is an endemic, neglected tropical disease in Latin America that is caused by the protozoan parasite Trypanosoma cruzi. In vitro models constitute the first experimental approach to study the physiopathology of the disease and to assay potential new trypanocidal agents. Here, we report and describe clearly the use of commercial software (MATLAB(®)) to quantify T. cruzi amastigotes and infected mammalian cells (BeWo) and compared this analysis with the manual one. There was no statistically significant difference between the manual and the automatic quantification of the parasite; the two methods showed a correlation analysis r(2) value of 0.9159. The most significant advantage of the automatic quantification was the efficiency of the analysis. The drawback of this automated cell counting method was that some parasites were assigned to the wrong BeWo cell; however, this error did not exceed 5% when adequate experimental conditions were chosen. We conclude that this quantification method constitutes an excellent tool for evaluating the parasite load in cells and therefore constitutes an easy and reliable way to study parasite infectivity. PMID:25553972
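    The reported r(2) is the square of the Pearson correlation between paired manual and automated counts. A sketch with hypothetical paired counts per microscope field (not the study's data):

```python
# Hypothetical paired counts: manual vs. automated amastigotes per field.
manual = [12, 30, 45, 22, 60, 18, 51, 37]
auto_counts = [13, 28, 47, 21, 58, 20, 49, 39]

n = len(manual)
mx = sum(manual) / n
my = sum(auto_counts) / n
cov = sum((x - mx) * (y - my) for x, y in zip(manual, auto_counts))
vx = sum((x - mx) ** 2 for x in manual)
vy = sum((y - my) ** 2 for y in auto_counts)
r = cov / (vx * vy) ** 0.5   # Pearson correlation coefficient
r_squared = r ** 2           # close to 1 when the two counts agree
```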

  6. Assessing workload through physiological measurements in bus drivers using an automated system during docking.

    PubMed

    Collet, Christian; Petit, Claire; Champely, Stephane; Dittmar, Andre

    The aim of the experiment was to test the effect of an automated system of bus docking on drivers' mental workload. Reduced workload is thought to be brought about by helping the driver to maneuver, as he or she is required only to monitor proper functioning of the system. However, the true impact of the system on drivers must be studied to guarantee good acceptance and minimal distraction from traffic. Workload was estimated by electrodermal activity recording while drivers tested 5 scenarios involving (or not involving) the docking system. Results showed that docking precision was improved when the system was used. When drivers monitored the functioning of the system, their workload was higher than that observed during manual docking; however, reduced workload was evidenced after a learning process. The docking system was also shown to increase workload in the event of dysfunction, especially when drivers had to take over control. Despite this particular situation, and after habituation, such a system could be integrated into buses to improve safety during boarding and egress.

  7. Trypanosoma cruzi infectivity assessment in "in vitro" culture systems by automated cell counting.

    PubMed

    Liempi, Ana; Castillo, Christian; Cerda, Mauricio; Droguett, Daniel; Duaso, Juan; Barahona, Katherine; Hernández, Ariane; Díaz-Luján, Cintia; Fretes, Ricardo; Härtel, Steffen; Kemmerling, Ulrike

    2015-03-01

    Chagas disease is an endemic, neglected tropical disease in Latin America that is caused by the protozoan parasite Trypanosoma cruzi. In vitro models constitute the first experimental approach to study the physiopathology of the disease and to assay potential new trypanocidal agents. Here, we report and describe clearly the use of commercial software (MATLAB(®)) to quantify T. cruzi amastigotes and infected mammalian cells (BeWo) and compared this analysis with the manual one. There was no statistically significant difference between the manual and the automatic quantification of the parasite; the two methods showed a correlation analysis r(2) value of 0.9159. The most significant advantage of the automatic quantification was the efficiency of the analysis. The drawback of this automated cell counting method was that some parasites were assigned to the wrong BeWo cell; however, this error did not exceed 5% when adequate experimental conditions were chosen. We conclude that this quantification method constitutes an excellent tool for evaluating the parasite load in cells and therefore constitutes an easy and reliable way to study parasite infectivity.

  8. Assessing Rotation-Invariant Feature Classification for Automated Wildebeest Population Counts.

    PubMed

    Torney, Colin J; Dobson, Andrew P; Borner, Felix; Lloyd-Jones, David J; Moyer, David; Maliti, Honori T; Mwita, Machoke; Fredrick, Howard; Borner, Markus; Hopcraft, J Grant C

    2016-01-01

    Accurate and on-demand animal population counts are the holy grail for wildlife conservation organizations throughout the world because they enable fast and responsive adaptive management policies. While the collection of image data from camera traps, satellites, and manned or unmanned aircraft has advanced significantly, the detection and identification of animals within images remains a major bottleneck since counting is primarily conducted by dedicated enumerators or citizen scientists. Recent developments in the field of computer vision suggest a potential resolution to this issue through the use of rotation-invariant object descriptors combined with machine learning algorithms. Here we implement an algorithm to detect and count wildebeest from aerial images collected in the Serengeti National Park in 2009 as part of the biennial wildebeest count. We find that the per image error rates are greater than, but comparable to, two separate human counts. For the total count, the algorithm is more accurate than both manual counts, suggesting that human counters have a tendency to systematically over or under count images. While the accuracy of the algorithm is not yet at an acceptable level for fully automatic counts, our results show this method is a promising avenue for further research and we highlight specific areas where future research should focus in order to develop fast and accurate enumeration of aerial count data. If combined with a bespoke image collection protocol, this approach may yield a fully automated wildebeest count in the near future. PMID:27227888

  9. Semi-Automated Assessment of Transdiaphragmatic Pressure Variability across Motor Behaviors

    PubMed Central

    Medina-Martínez, Juan S.; Greising, Sarah M.; Sieck, Gary C.; Mantilla, Carlos B.

    2015-01-01

    We developed and tested a semi-automated algorithm to generate large data sets of ventilatory information (amplitude, premotor drive and timing) across a range of motor behaviors. Adult spontaneously breathing, anesthetized mice (n=27) underwent measurements of transdiaphragmatic pressure (Pdi) during eupnea, hypoxia-hypercapnia, and tracheal occlusion with values ranging from 8±1 to 9±2 to 44±3 cm H2O, respectively. Premotor drive to phrenic motor neurons (estimated by the rate of rise during initial 60 ms) was ~5-fold greater during tracheal occlusion compared to other behaviors. Variability in Pdi amplitude (normalized to spontaneously occurring sighs for each animal) displayed minimal evidence of complex temporal structure or dynamic clustering across the entire period of examination. Using a deterministic model to evaluate predictor variables for Pdi amplitude between successive inspiratory events, there was a large correlation for premotor drive and preceding Pdi amplitude vs. Pdi amplitude (r=0.52). These findings highlight substantial variability in Pdi amplitude that primarily reflects linear components rather than complex, dynamic effects over time. PMID:26003850
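    The premotor-drive estimate described above (rate of rise of Pdi over the initial 60 ms) can be computed directly from a sampled pressure trace. The sampling rate, onset index, and synthetic linear rise below are hypothetical, not the study's recordings:

```python
# Hypothetical Pdi trace sampled at 2 kHz: flat baseline, then a linear
# rise of 0.05 cm H2O per sample after inspiratory onset.
fs = 2000.0                       # sampling rate, Hz
onset = 10                        # sample index of inspiratory onset
pdi = [0.0] * onset + [0.05 * k for k in range(200)]

n60 = round(0.060 * fs)           # number of samples spanning 60 ms (120)
segment = pdi[onset:onset + n60]
# Endpoint slope estimate over the 60 ms window, in cm H2O per second.
rate_of_rise = (segment[-1] - segment[0]) / ((n60 - 1) / fs)
```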

  10. ON-LINE TOOLS FOR PROPER VERTICAL POSITIONING OF VERTICAL SAMPLING INTERVALS DURING SITE ASSESSMENT

    EPA Science Inventory

    This presentation describes on-line tools for proper vertical positioning of vertical sampling intervals during site assessment. Proper vertical sample interval selection is critical for generating data on the vertical distribution of contamination. Without vertical delineation, th...

  11. Possibility of Using Nonmetallic Check Samples to Assess the Sensitivity of Penetrant Testing

    NASA Astrophysics Data System (ADS)

    Kalinichenko, N.; Lobanova, I.; Kalinichenko, A.; Loboda, E.; Jakubec, T.

    2016-06-01

    Versions of check sample manufacturing for penetrant inspection are considered. A statistical analysis of crack width measurements on nonmetallic samples is performed to determine whether such samples can be used to assess penetrant testing sensitivity.

  12. A filter paper-based microdevice for low-cost, rapid, and automated DNA extraction and amplification from diverse sample types.

    PubMed

    Gan, Wupeng; Zhuang, Bin; Zhang, Pengfei; Han, Junping; Li, Cai-Xia; Liu, Peng

    2014-10-01

    A plastic microfluidic device that integrates a filter disc as a DNA capture phase was successfully developed for low-cost, rapid and automated DNA extraction and PCR amplification from various raw samples. The microdevice was constructed by sandwiching a piece of Fusion 5 filter, as well as a PDMS (polydimethylsiloxane) membrane, between two PMMA (poly(methyl methacrylate)) layers. An automated DNA extraction from 1 μL of human whole blood can be finished on the chip in 7 minutes by sequentially aspirating NaOH, HCl, and water through the filter. The filter disc containing extracted DNA was then taken out directly for PCR. On-chip DNA purification from 0.25-1 μL of human whole blood yielded 8.1-21.8 ng of DNA, higher than those obtained using QIAamp® DNA Micro kits. To realize DNA extraction from raw samples, an additional sample loading chamber containing a filter net with an 80 μm mesh size was designed in front of the extraction chamber to accommodate sample materials. Real-world samples, including whole blood, dried blood stains on Whatman® 903 paper, dried blood stains on FTA™ cards, buccal swabs, saliva, and cigarette butts, can all be processed in the system in 8 minutes. In addition, multiplex amplification of 15 STR (short tandem repeat) loci and Sanger-based DNA sequencing of the 520 bp GJB2 gene were accomplished from the filters that contained extracted DNA from blood. To further prove the feasibility of integrating this extraction method with downstream analyses, "in situ" PCR amplifications were successfully performed in the DNA extraction chamber following DNA purification from blood and blood stains without DNA elution. Using a modified protocol to bond the PDMS and PMMA, our plastic PDMS devices withstood the PCR process without any leakage. This study represents a significant step towards the practical application of on-chip DNA extraction methods, as well as the development of fully integrated genetic analytical systems.

  13. Isotope Enrichment Detection by Laser Ablation - Laser Absorption Spectrometry: Automated Environmental Sampling and Laser-Based Analysis for HEU Detection

    SciTech Connect

    Anheier, Norman C.; Bushaw, Bruce A.

    2010-01-01

    The global expansion of nuclear power, and consequently the uranium enrichment industry, requires the development of new safeguards technology to mitigate proliferation risks. Current enrichment monitoring instruments exist that provide only yes/no detection of highly enriched uranium (HEU) production. More accurate accountancy measurements are typically restricted to gamma-ray and weight measurements taken in cylinder storage yards. Analysis of environmental and cylinder content samples has much higher effectiveness, but this approach requires onsite sampling, shipping, and time-consuming laboratory analysis and reporting. Given that large modern gaseous centrifuge enrichment plants (GCEPs) can quickly produce a significant quantity (SQ) of HEU, these limitations in verification suggest the need for more timely detection of potential facility misuse. The Pacific Northwest National Laboratory (PNNL) is developing an unattended safeguards instrument concept, combining continuous aerosol particulate collection with uranium isotope assay, to provide timely analysis of enrichment levels within low enriched uranium facilities. This approach is based on laser vaporization of aerosol particulate samples, followed by wavelength-tuned laser diode spectroscopy to characterize the uranium isotopic ratio through subtle differences in atomic absorption wavelengths. Environmental sampling (ES) media from an integrated aerosol collector is introduced into a small, reduced pressure chamber, where a focused pulsed laser vaporizes material from a 10- to 20-µm diameter spot on the surface of the sampling media. The plume of ejected material begins as high-temperature plasma that yields ions and atoms, as well as molecules and molecular ions. We concentrate on the plume of atomic vapor that remains after the plasma has expanded and then cooled by the surrounding cover gas. Tunable diode lasers are directed through this plume and each isotope is detected by monitoring absorbance

  14. Assessment of Different Sampling Methods for Measuring and Representing Macular Cone Density Using Flood-Illuminated Adaptive Optics

    PubMed Central

    Feng, Shu; Gale, Michael J.; Fay, Jonathan D.; Faridi, Ambar; Titus, Hope E.; Garg, Anupam K.; Michaels, Keith V.; Erker, Laura R.; Peters, Dawn; Smith, Travis B.; Pennesi, Mark E.

    2015-01-01

    Purpose: To describe a standardized flood-illuminated adaptive optics (AO) imaging protocol suitable for the clinical setting and to assess sampling methods for measuring cone density. Methods: Cone density was calculated following three measurement protocols: 50 × 50-μm sampling window values every 0.5° along the horizontal and vertical meridians (fixed-interval method), the mean density of expanding 0.5°-wide arcuate areas in the nasal, temporal, superior, and inferior quadrants (arcuate mean method), and the peak cone density of a 50 × 50-μm sampling window within expanding arcuate areas near the meridian (peak density method). Repeated imaging was performed in nine subjects to determine intersession repeatability of cone density. Results: Cone density montages could be created for 67 of the 74 subjects. Image quality was determined to be adequate for automated cone counting for 35 (52%) of the 67 subjects. We found that cone density varied with different sampling methods and regions tested. In the nasal and temporal quadrants, peak density most closely resembled histological data, whereas the arcuate mean and fixed-interval methods tended to underestimate the density compared with histological data. However, in the inferior and superior quadrants, arcuate mean and fixed-interval methods most closely matched histological data, whereas the peak density method overestimated cone density compared with histological data. Intersession repeatability testing showed that repeatability was greatest when sampling by arcuate mean and lowest when sampling by fixed interval. Conclusions: We show that different methods of sampling can significantly affect cone density measurements. Therefore, care must be taken when interpreting cone density results, even in a normal population. PMID:26325414
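The fixed-interval vs. peak-density distinction above can be sketched as window counting over a field of cone coordinates: a fixed window gives one density value, while the peak method takes the maximum over candidate window positions. The function names and the toy coordinates are hypothetical; real AO montages supply (x, y) positions per identified cone.

```python
# Sketch: cone density from 50 x 50-um sampling windows over (x, y) cone
# coordinates given in micrometres. Names and data are hypothetical.

def window_density(cones, cx, cy, size=50.0):
    """Cones per mm^2 inside a size x size um window centred at (cx, cy)."""
    half = size / 2.0
    n = sum(1 for x, y in cones
            if abs(x - cx) <= half and abs(y - cy) <= half)
    area_mm2 = (size / 1000.0) ** 2          # um^2 -> mm^2
    return n / area_mm2

def peak_density(cones, centres, size=50.0):
    """Maximum window density over a set of candidate window centres."""
    return max(window_density(cones, cx, cy, size) for cx, cy in centres)
```

With three cones clustered inside a window, `window_density` reports 3 cones / 0.0025 mm² = 1200 cones/mm²; `peak_density` simply maximises that value over the candidate centres, mirroring the peak density method's search within an arcuate area.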

  15. Automated registration of optical coherence tomography and dermoscopy in the assessment of sub-clinical spread in basal cell carcinoma

    PubMed Central

    Penney, G. P.; Richardson, T. J.; Guyot, A.; Choi, M. J.; Sheth, N.; Craythorne, E.; Robson, A.; Mallipeddi, R.

    2014-01-01

    Optical coherence tomography (OCT) has been shown to be of clinical value in imaging basal cell carcinoma (BCC). A novel dual OCT-video imaging system, providing automated registration of OCT and dermoscopy, has been developed to assess the potential of OCT in measuring the degree of sub-clinical spread of BCC. Seventeen patients selected for Mohs micrographic surgery (MMS) for BCC were recruited to the study. The extent of BCC infiltration beyond a segment of the clinically assessed pre-surgical border was evaluated using OCT. Sufficiently accurate (<0.5 mm) registration of OCT and dermoscopy images was achieved in 9 patients. The location of the OCT-assessed BCC border was also compared with that of the final surgical defect. Infiltration of BCC across the clinical border ranged from 0 mm to >2.5 mm. In addition, the OCT border lay between 0.5 mm and 2.0 mm inside the final MMS defect in those cases where this could be assessed. In one case, where the final MMS defect was over 17 mm from the clinical border, OCT showed >2.5 mm infiltration across the clinical border at the FOV limit. These results provide evidence that OCT allows more accurate assessment of sub-clinical spread of BCC than clinical observation alone. Such a capability may have clinical value in reducing the number of surgical stages in MMS for BCC. There may also be a role for OCT in aiding the selection of patients most suitable for MMS. PMID:24784842

  16. Assessing rare earth elements in quartz rich geological samples.

    PubMed

    Santoro, A; Thoss, V; Guevara, S Ribeiro; Urgast, D; Raab, A; Mastrolitti, S; Feldmann, J

    2016-01-01

    Sodium peroxide (Na2O2) fusion coupled to Inductively Coupled Plasma Tandem Mass Spectrometry (ICP-MS/MS) measurements was used to rapidly screen quartz-rich geological samples for rare earth element (REE) content. The method's accuracy was checked against a geological reference material and Instrumental Neutron Activation Analysis (INAA) measurements. The mass-mode combinations used gave accurate results (the only exception being (157)Gd in He gas mode), with recovery of the geological reference material QLO-1 between 80% and 98% (lower values for Lu, Nd and Sm) and in general comparable to INAA measurements. Low limits of detection were achieved for all elements, generally below 10 pg g(-1), as well as measurement repeatability below 15%. Overall, the Na2O2/ICP-MS/MS method proved to be a suitable lab-based method to quickly and accurately screen rock samples originating from quartz-rich geological areas for rare earth element content; it is particularly useful when assessing commercial viability.
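The two figures of merit quoted above reduce to simple formulas: percent recovery against a certified reference value, and repeatability as the relative standard deviation (RSD) of replicate measurements. A minimal sketch, with hypothetical replicate values rather than the study's data:

```python
# Sketch: recovery (%) against a certified value and repeatability as RSD (%).
# Uses the sample standard deviation; all numbers are hypothetical.

from statistics import mean, stdev

def recovery_pct(measured: float, certified: float) -> float:
    return 100.0 * measured / certified

def rsd_pct(replicates) -> float:
    return 100.0 * stdev(replicates) / mean(replicates)

print(round(recovery_pct(19.6, 20.0), 1))      # recovery, %
print(round(rsd_pct([9.8, 10.0, 10.2]), 1))    # repeatability, %
```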

  18. In planta passive sampling devices for assessing subsurface chlorinated solvents.

    PubMed

    Shetty, Mikhil K; Limmer, Matt A; Waltermire, Kendra; Morrison, Glenn C; Burken, Joel G

    2014-06-01

    Contaminant concentrations in trees have been used to delineate groundwater contaminant plumes (i.e., phytoscreening); however, variability in tree composition hinders accurate measurement of contaminant concentrations in planta, particularly for long-term monitoring. This study investigated in planta passive sampling devices (PSDs), termed solid phase samplers (SPSs) to be used as a surrogate tree core. Characteristics studied for five materials included material-air partitioning coefficients (Kma) for chlorinated solvents, sampler equilibration time and field suitability. The materials investigated were polydimethylsiloxane (PDMS), low-density polyethylene (LDPE), linear low-density polyethylene (LLDPE), polyoxymethylene (POM) and plasticized polyvinyl chloride (PVC). Both PDMS and LLDPE samplers demonstrated high partitioning coefficients and diffusivities and were further tested in greenhouse experiments and field trials. While most of the materials could be used for passive sampling, the PDMS SPSs performed best as an in planta sampler. Such a sampler was able to accurately measure trichloroethylene (TCE) and tetrachloroethylene (PCE) concentrations while simultaneously incorporating simple operation and minimal impact to the surrounding property and environment. PMID:24268175
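Once an SPS has equilibrated in the trunk, the material-air partitioning coefficient Kma described above relates what the sampler holds to the gas-phase concentration around it. A minimal sketch of that relationship; the function name and the Kma value are hypothetical, not the paper's measured constants:

```python
# Sketch: in planta gas-phase concentration from an equilibrated passive
# sampler via a material-air partitioning coefficient Kma (hypothetical value).

def in_planta_concentration(c_sampler: float, k_ma: float) -> float:
    """Gas-phase concentration (same units as c_sampler) at equilibrium."""
    if k_ma <= 0:
        raise ValueError("Kma must be positive")
    return c_sampler / k_ma

# e.g. 50 ug/L measured in a PDMS sampler with a hypothetical Kma of 250
print(in_planta_concentration(50.0, 250.0))  # → 0.2
```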

  19. Using integrated environmental modeling to automate a process-based Quantitative Microbial Risk Assessment

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Integrated Environmental Modeling (IEM) organizes multidisciplinary knowledge that explains and predicts environmental-system response to stressors. A Quantitative Microbial Risk Assessment (QMRA) is an approach integrating a range of disparate data (fate/transport, exposure, and human health effect...

  20. Using integrated environmental modeling to automate a process-based Quantitative Microbial Risk Assessment

    EPA Science Inventory

    Integrated Environmental Modeling (IEM) organizes multidisciplinary knowledge that explains and predicts environmental-system response to stressors. A Quantitative Microbial Risk Assessment (QMRA) is an approach integrating a range of disparate data (fate/transport, exposure, an...

  1. Using Integrated Environmental Modeling to Automate a Process-Based Quantitative Microbial Risk Assessment (presentation)

    EPA Science Inventory

    Integrated Environmental Modeling (IEM) organizes multidisciplinary knowledge that explains and predicts environmental-system response to stressors. A Quantitative Microbial Risk Assessment (QMRA) is an approach integrating a range of disparate data (fate/transport, exposure, and...

  2. Detection of coronary calcifications from computed tomography scans for automated risk assessment of coronary artery disease

    SciTech Connect

    Isgum, Ivana; Rutten, Annemarieke; Prokop, Mathias; Ginneken, Bram van

    2007-04-15

    A fully automated method for coronary calcification detection from non-contrast-enhanced, ECG-gated multi-slice computed tomography (CT) data is presented. Candidates for coronary calcifications are extracted by thresholding and component labeling. These candidates include coronary calcifications, calcifications in the aorta and in the heart, and other high-density structures such as noise and bone. A dedicated set of 64 features is calculated for each candidate object. They characterize the object's spatial position relative to the heart and the aorta, for which an automatic segmentation scheme was developed, its size and shape, and its appearance, which is described by a set of approximated Gaussian derivatives for which an efficient computational scheme is presented. Three classification strategies were designed. The first one tested direct classification without feature selection. The second approach also utilized direct classification, but with feature selection. Finally, the third scheme employed two-stage classification. In a computationally inexpensive first stage, the most easily recognizable false positives were discarded. The second stage discriminated between more difficult to separate coronary calcium and other candidates. Performance of linear, quadratic, nearest neighbor, and support vector machine classifiers was compared. The method was tested on 76 scans containing 275 calcifications in the coronary arteries and 335 calcifications in the heart and aorta. The best performance was obtained employing a two-stage classification system with a k-nearest neighbor (k-NN) classifier and a feature selection scheme. The method detected 73.8% of coronary calcifications at the expense of on average 0.1 false positives per scan. A calcium score was computed for each scan and subjects were assigned one of four risk categories based on this score. The method assigned the correct risk category to 93.4% of all scans.
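The two-stage classification described above can be sketched as a cheap first-stage filter that discards obvious non-calcium candidates, followed by a k-NN vote on the harder cases. The feature space, threshold, and training points below are hypothetical toy values, not the paper's 64-feature scheme.

```python
# Sketch of two-stage candidate classification: a threshold filter (stage 1)
# followed by a k-NN vote (stage 2). Features and data are hypothetical.

from math import dist

def two_stage_classify(candidate, train, k=3, max_dist_to_heart=10.0):
    """Return True if the candidate is labelled coronary calcium."""
    # Stage 1: candidates far from the heart cannot be coronary calcium.
    if candidate["dist_to_heart"] > max_dist_to_heart:
        return False
    # Stage 2: k-NN majority vote in a simple 2-D feature space.
    feats = (candidate["dist_to_heart"], candidate["volume"])
    neighbours = sorted(train, key=lambda t: dist(feats, t[0]))[:k]
    votes = sum(1 for _, label in neighbours if label)
    return votes > k // 2

train = [((2.0, 5.0), True), ((3.0, 6.0), True), ((9.0, 50.0), False),
         ((8.0, 40.0), False), ((2.5, 4.0), True)]
print(two_stage_classify({"dist_to_heart": 2.2, "volume": 5.5}, train))  # → True
```

The design point mirrors the paper's motivation: the first stage is computationally inexpensive, so the costlier neighbour search only runs on candidates that survive it.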

  3. A fully automated effervescence assisted dispersive liquid-liquid microextraction based on a stepwise injection system. Determination of antipyrine in saliva samples.

    PubMed

    Medinskaia, Kseniia; Vakh, Christina; Aseeva, Darina; Andruch, Vasil; Moskvin, Leonid; Bulatov, Andrey

    2016-01-01

    A first attempt to automate effervescence-assisted dispersive liquid-liquid microextraction (EA-DLLME) is reported. The method is based on the aspiration of a sample and all required aqueous reagents into the stepwise injection analysis (SWIA) manifold, followed by simultaneous counterflow injection of the extraction solvent (dichloromethane) and a mixture of the effervescence agent (0.5 mol L(-1) Na2CO3) and the proton donor solution (1 mol L(-1) CH3COOH). Carbon dioxide microbubbles generated in situ disperse the extraction solvent throughout the aqueous sample and extract the analyte into the organic phase. Unlike conventional DLLME, EA-DLLME avoids both the addition of a dispersive solvent and the time-consuming centrifugation step needed to disrupt the cloudy state. Phase separation was achieved by gentle bubbling of a nitrogen stream (2 mL min(-1) for 2 min). The performance of the suggested approach is demonstrated by the determination of antipyrine in saliva samples. The procedure is based on the derivatization of antipyrine by nitrite ion, followed by EA-DLLME of 4-nitrosoantipyrine and subsequent UV-Vis detection using the SWIA manifold. The absorbance of the yellow-colored extract at a wavelength of 345 nm obeys Beer's law in the range of 1.5-100 µmol L(-1) of antipyrine in saliva. The LOD, calculated from a blank test based on 3σ, was 0.5 µmol L(-1).
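The linear-range and LOD figures above come from a standard recipe: fit absorbance against concentration by least squares, then take 3σ of blank readings divided by the calibration slope. A sketch of that arithmetic; all readings below are hypothetical, not the paper's calibration data.

```python
# Sketch: Beer's-law calibration slope (least squares) and LOD = 3*sigma/slope.
# Concentrations in umol/L, absorbances at 345 nm; all values hypothetical.

from statistics import stdev

def lsq_slope(xs, ys):
    """Least-squares slope of ys against xs."""
    mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
    return (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
            / sum((x - mx) ** 2 for x in xs))

conc = [1.5, 10.0, 25.0, 50.0, 100.0]            # umol/L
absorbance = [0.006, 0.040, 0.100, 0.200, 0.400]  # perfectly linear toy data
slope = lsq_slope(conc, absorbance)               # AU per umol/L

blanks = [0.0010, 0.0012, 0.0008, 0.0011, 0.0009]
lod = 3 * stdev(blanks) / slope                   # umol/L
```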

  4. The second round of Critical Assessment of Automated Structure Determination of Proteins by NMR: CASD-NMR-2013.

    PubMed

    Rosato, Antonio; Vranken, Wim; Fogh, Rasmus H; Ragan, Timothy J; Tejero, Roberto; Pederson, Kari; Lee, Hsiau-Wei; Prestegard, James H; Yee, Adelinda; Wu, Bin; Lemak, Alexander; Houliston, Scott; Arrowsmith, Cheryl H; Kennedy, Michael; Acton, Thomas B; Xiao, Rong; Liu, Gaohua; Montelione, Gaetano T; Vuister, Geerten W

    2015-08-01

    The second round of the community-wide initiative Critical Assessment of automated Structure Determination of Proteins by NMR (CASD-NMR-2013) comprised ten blind target datasets, consisting of unprocessed spectral data, assigned chemical shift lists, and unassigned NOESY peak and RDC lists, made available in either curated (i.e. manually refined) or un-curated (i.e. automatically generated) form. Ten structure calculation programs, using fully automated protocols only, generated a total of 164 three-dimensional structures (entries) for the ten targets, sometimes using both curated and un-curated lists to generate multiple entries for a single target. The accuracy of the entries could be established by comparing them to the corresponding manually solved structure of each target, which was not available at the time the data were provided. Across the entire dataset, 71% of all entries submitted achieved an accuracy relative to the reference NMR structure better than 1.5 Å. Methods based on NOESY peak lists achieved even better results, with up to 100% of the entries within the 1.5 Å threshold for some programs. However, some methods did not converge for some targets using un-curated NOESY peak lists. Over 90% of the entries achieved an accuracy better than the more relaxed threshold of 2.5 Å that was used in the previous CASD-NMR-2010 round. Comparisons between entries generated with un-curated versus curated peaks show only marginal improvements for the latter in those cases where both calculations converged. PMID:26071966

  5. Aquatic hazard assessment of a commercial sample of naphthenic acids.

    PubMed

    Swigert, James P; Lee, Carol; Wong, Diana C L; White, Russell; Scarlett, Alan G; West, Charles E; Rowland, Steven J

    2015-04-01

    This paper presents the chemical composition and aquatic toxicity characteristics of a commercial sample of naphthenic acids (NAs). Naphthenic acids are derived from the refining of petroleum middle distillates and can contribute to refinery effluent toxicity. NAs are also present in oil sands process-affected water (OSPW), but differences in the NA compositions from these sources preclude using a common aquatic toxicity dataset to represent the aquatic hazards of NAs from both origins. Our chemical characterization of a commercial sample of NAs showed it to contain, in order of abundance, 1-ring>2-ring>acyclic>3-ring acids (∼84%). Also present were monoaromatic acids (7%) and non-acids (9%, polyaromatic hydrocarbons and sulfur heterocyclic compounds). While the acyclic acids were only the third most abundant group, the five most abundant individual compounds were identified as C(10-14) n-acids (n-decanoic acid to n-tetradecanoic acid). Aquatic toxicity testing of fish (Pimephales promelas), an invertebrate (Daphnia magna), algae (Pseudokirchneriella subcapitata), and bacteria (Vibrio fischeri) showed P. promelas to be the most sensitive species, with 96-h LL50=9.0 mg L(-1) (LC50=5.6 mg L(-1)). Acute EL50 values for the other species ranged from 24 to 46 mg L(-1) (EC50 values ranged from 20 to 30 mg L(-1)). Biomimetic extraction via solid-phase microextraction (BE-SPME) suggested a nonpolar narcosis mode of toxic action for D. magna, P. subcapitata, and V. fischeri. The BE analysis under-predicted fish toxicity, which indicates that a specific mode of action, besides narcosis, may be a factor for fishes. PMID:25434270
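Point estimates such as the LC50 quoted above are commonly read off acute test data by interpolating mortality against log10(concentration) between the two doses that bracket 50%. A sketch of that interpolation; the dose-response values are hypothetical, not the study's test results.

```python
# Sketch: LC50 by log-linear interpolation between the two doses that
# bracket 50% mortality. Dose-response data are hypothetical.

from math import log10

def lc50(doses_mg_l, mortality_pct):
    pairs = sorted(zip(doses_mg_l, mortality_pct))
    for (d0, m0), (d1, m1) in zip(pairs, pairs[1:]):
        if m0 <= 50 <= m1:
            f = (50 - m0) / (m1 - m0)
            return 10 ** (log10(d0) + f * (log10(d1) - log10(d0)))
    raise ValueError("50% mortality not bracketed by the tested doses")

print(round(lc50([1, 3.2, 10, 32], [0, 20, 80, 100]), 2))  # → 5.66
```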

  7. Shorter sampling periods and accurate estimates of milk volume and components are possible for pasture based dairy herds milked with automated milking systems.

    PubMed

    Kamphuis, Claudia; Burke, Jennie K; Taukiri, Sarah; Petch, Susan-Fay; Turner, Sally-Anne

    2016-08-01

    Dairy cows grazing pasture and milked using automated milking systems (AMS) have lower milking frequencies than indoor-fed cows milked using AMS. Therefore, milk recording intervals used for herd testing indoor-fed cows may not be suitable for cows on pasture-based farms. We hypothesised that accurate standardised 24 h estimates could be determined for AMS herds with milk recording intervals shorter than the Gold Standard (48 h), but that the optimum milk recording interval would depend on the herd average for milking frequency. The Gold Standard protocol was applied on five commercial dairy farms with AMS between December 2011 and February 2013. From 12 milk recording test periods, involving 2211 cow-test days and 8049 cow milkings, standardised 24 h estimates for milk volume and milk composition were calculated for the Gold Standard protocol and compared with those collected during nine alternative sampling scenarios, including six shorter sampling periods and three in which a fixed number of milk samples per cow were collected. The results indicate that a 48 h milk recording protocol is unnecessarily long for collecting accurate estimates during milk recording on pasture-based AMS farms. Collection of only two milk samples per cow was optimal in terms of high concordance correlation coefficients for milk volume and components and a low proportion of missed cow-test days. Further research is required to determine the effects of diurnal variation in milk composition on standardised 24 h estimates for milk volume and components before a protocol based on a fixed number of samples could be considered. Based on the results of this study, New Zealand has adopted a split protocol for herd testing based on the average milking frequency for the herd (NZ Herd Test Standard 8100:2015). PMID:27600967
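The agreement statistic mentioned above, the concordance correlation coefficient, is Lin's CCC: it penalises both poor correlation and systematic offset between a shortened protocol's 24 h estimates and the 48 h Gold Standard. A sketch with hypothetical paired estimates (the study's data are not reproduced here):

```python
# Sketch: Lin's concordance correlation coefficient between paired estimates.
# Uses population (1/n) moments; data below are hypothetical milk volumes.

def ccc(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sx = sum((x - mx) ** 2 for x in xs) / n
    sy = sum((y - my) ** 2 for y in ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / n
    return 2 * sxy / (sx + sy + (mx - my) ** 2)

gold = [18.2, 22.5, 25.1, 19.8, 30.4]    # 48 h Gold Standard estimates (L)
short = [18.0, 22.9, 24.8, 20.1, 30.0]   # shortened-protocol estimates (L)
print(round(ccc(gold, short), 3))
```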

  8. Assessment of social cognition in non-human primates using a network of computerized automated learning device (ALDM) test systems.

    PubMed

    Fagot, Joël; Marzouki, Yousri; Huguet, Pascal; Gullstrand, Julie; Claidière, Nicolas

    2015-01-01

    Fagot & Paleressompoulle(1) and Fagot & Bonte(2) have published an automated learning device (ALDM) for the study of cognitive abilities of monkeys maintained in semi-free ranging conditions. Data accumulated during the last five years have consistently demonstrated the efficiency of this protocol to investigate individual/physical cognition in monkeys, and have further shown that this procedure reduces stress level during animal testing(3). This paper demonstrates that networks of ALDM can also be used to investigate different facets of social cognition and in-group expressed behaviors in monkeys, and describes three illustrative protocols developed for that purpose. The first study demonstrates how ethological assessments of social behavior and computerized assessments of cognitive performance could be integrated to investigate the effects of socially exhibited moods on the cognitive performance of individuals. The second study shows that batteries of ALDM running in parallel can provide unique information on the influence of the presence of others on task performance. Finally, the last study shows that networks of ALDM test units can also be used to study issues related to social transmission and cultural evolution. Combined together, these three studies demonstrate clearly that ALDM testing is a highly promising experimental tool for bridging the gap in the animal literature between research on individual cognition and research on social cognition. PMID:25992495

  10. Evaluating hydrological response to forecasted land-use change—scenario testing with the automated geospatial watershed assessment (AGWA) tool

    USGS Publications Warehouse

    Kepner, William G.; Semmens, Darius J.; Hernandez, Mariano; Goodrich, David C.

    2009-01-01

    Envisioning and evaluating future scenarios has emerged as a critical component of both science and social decision-making. The ability to assess, report, map, and forecast the life support functions of ecosystems is absolutely critical to our capacity to make informed decisions to maintain the sustainable nature of our ecosystem services now and into the future. During the past two decades, important advances in the integration of remote imagery, computer processing, and spatial-analysis technologies have been used to develop landscape information that can be integrated with hydrologic models to determine long-term change and make predictive inferences about the future. Two diverse case studies in northwest Oregon (Willamette River basin) and southeastern Arizona (San Pedro River) were examined in regard to future land use scenarios relative to their impact on surface water conditions (e.g., sediment yield and surface runoff) using hydrologic models associated with the Automated Geospatial Watershed Assessment (AGWA) tool. The base reference grid for land cover was modified in both study locations to reflect stakeholder preferences 20 to 60 yrs into the future, and the consequences of landscape change were evaluated relative to the selected future scenarios. The two studies provide examples of integrating hydrologic modeling with a scenario analysis framework to evaluate plausible future forecasts and to understand the potential impact of landscape change on ecosystem services.

  11. An Automated Grass-Based Procedure to Assess the Geometrical Accuracy of the Openstreetmap Paris Road Network

    NASA Astrophysics Data System (ADS)

    Brovelli, M. A.; Minghini, M.; Molinari, M. E.

    2016-06-01

    OpenStreetMap (OSM) is the largest spatial database of the world. One of the most frequently occurring geospatial elements within this database is the road network, whose quality is crucial for applications such as routing and navigation. Several methods have been proposed for the assessment of OSM road network quality; however, they are often tightly coupled to the characteristics of the authoritative dataset involved in the comparison, which makes them hard to replicate and extend. This study relies on an automated procedure which was recently developed for comparing OSM with any road network dataset. It is based on three Python modules for the open source GRASS GIS software and provides measures of OSM road network spatial accuracy and completeness. Provided that users are familiar with the authoritative dataset used, they can adjust the values of the parameters involved thanks to the flexibility of the procedure. The method is applied to assess the quality of the Paris OSM road network dataset through a comparison against the French official dataset provided by the French National Institute of Geographic and Forest Information (IGN). The results show that the Paris OSM road network has both high completeness and high spatial accuracy. It has a greater length than the IGN road network and is found to be suitable for applications requiring spatial accuracies up to 5-6 m. The results also confirm the flexibility of the procedure for supporting users in carrying out their own comparisons between OSM and reference road datasets.
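A completeness measure of the kind such comparisons report can be sketched as the share of network length that falls within a buffer of the reference dataset. The segment bookkeeping below is deliberately toy-level (each segment arrives with its length and a precomputed within-buffer flag); the GRASS-based procedure derives those matches with spatial overlays.

```python
# Sketch: length-weighted completeness of a road network against a reference.
# Each segment is a (length_km, within_reference_buffer) pair; hypothetical data.

def completeness_pct(segments):
    """Percentage of total length that lies within the reference buffer."""
    total = sum(length for length, _ in segments)
    matched = sum(length for length, ok in segments if ok)
    return 100.0 * matched / total

segments = [(2.0, True), (1.0, False), (1.0, True)]
print(completeness_pct(segments))  # → 75.0
```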

  13. Assessment of an Automated Touchdown Detection Algorithm for the Orion Crew Module

    NASA Technical Reports Server (NTRS)

    Gay, Robert S.

    2011-01-01

    Orion Crew Module (CM) touchdown detection is critical to activating the post-landing sequence that safes the Reaction Control System (RCS) jets, ensures that the vehicle remains upright, and establishes communication with recovery forces. In order to accommodate safe landing of an unmanned vehicle or incapacitated crew, an onboard automated detection system is required. An Orion-specific touchdown detection algorithm was developed and evaluated to differentiate landing events from in-flight events. The proposed method will be used to initiate post-landing cutting of the parachute riser lines, to prevent CM rollover, and to terminate RCS jet firing prior to submersion. The RCS jets continue to fire until touchdown to maintain proper CM orientation with respect to the flight path and to limit impact loads, but have potentially hazardous consequences if submerged while firing. The time available after impact to cut risers and initiate the CM Up-righting System (CMUS) is measured in minutes, whereas the time from touchdown to RCS jet submersion is a function of descent velocity and sea state conditions, and is often less than one second. Evaluation of the detection algorithms was performed for in-flight events (e.g. descent under chutes) using high-fidelity rigid body analyses in the Decelerator Systems Simulation (DSS), whereas water impacts were simulated using a rigid finite element model of the Orion CM in LS-DYNA. Two touchdown detection algorithms were evaluated with various thresholds: acceleration magnitude spike detection, and accumulated velocity change (over a given time window) spike detection. Data for both detection methods are acquired from an onboard Inertial Measurement Unit (IMU) sensor. The detection algorithms were tested with analytically generated in-flight and landing IMU data simulations. The acceleration spike detection proved to be faster while maintaining the desired safety margin. Time to RCS jet submersion was predicted analytically across a series of
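The first of the two detection schemes above, acceleration magnitude spike detection, can be sketched as a threshold test on the IMU acceleration magnitude. The 8 g threshold and the sample stream below are hypothetical illustrations, not Orion flight values.

```python
# Sketch: touchdown detection by acceleration-magnitude thresholding of
# IMU samples (ax, ay, az) in m/s^2. Threshold and data are hypothetical.

from math import sqrt

G = 9.81  # m/s^2 per g

def detect_touchdown(samples, threshold_g=8.0):
    """Return the index of the first sample whose |a| exceeds the threshold,
    or None if no touchdown-like spike is seen."""
    for i, (ax, ay, az) in enumerate(samples):
        if sqrt(ax * ax + ay * ay + az * az) >= threshold_g * G:
            return i
    return None

descent = [(0.0, 0.0, -9.8)] * 5          # steady descent under chutes
impact = descent + [(5.0, 2.0, -120.0)]   # water-impact spike
print(detect_touchdown(impact))           # → 5
```

The accumulated-velocity-change variant would instead integrate |a| over a sliding window before thresholding, trading a little latency for robustness to single-sample noise, which is consistent with the abstract's finding that the plain acceleration spike test is faster.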

  14. Direct Sampling and Analysis from Solid Phase Extraction Cards using an Automated Liquid Extraction Surface Analysis Nanoelectrospray Mass Spectrometry System

    SciTech Connect

    Walworth, Matthew J; ElNaggar, Mariam S; Stankovich, Joseph J; Witkowski II, Charles E.; Norris, Jeremy L; Van Berkel, Gary J

    2011-01-01

    Direct liquid extraction based surface sampling, a technique previously demonstrated with continuous flow and autonomous pipette liquid microjunction surface sampling probes, has recently been implemented as the Liquid Extraction Surface Analysis (LESA) mode on the commercially available Advion NanoMate chip-based infusion nanoelectrospray ionization system. In the present paper, the LESA mode was applied to the analysis of 96-well format custom solid phase extraction (SPE) cards, with each well consisting of either a 1 or 2 mm diameter monolithic hydrophobic stationary phase. These substrate wells were conditioned, loaded with either single or multi-component aqueous mixtures, and read out using the LESA mode of a TriVersa NanoMate or a NanoMate 100 coupled to an ABI/Sciex 4000QTRAP™ hybrid triple quadrupole/linear ion trap mass spectrometer and a Thermo LTQ XL linear ion trap mass spectrometer. Extraction conditions, including extraction/nanoESI solvent composition, volume, and dwell times, were optimized in the analysis of targeted compounds. Limits of detection and quantitation, as well as analysis reproducibility, were measured as figures of merit. Calibration data were obtained for propranolol using a deuterated internal standard, demonstrating linearity and reproducibility. A 10× increase in signal and cleanup of micromolar angiotensin II from a concentrated salt solution were demonstrated. Additionally, a multicomponent herbicide mixture at ppb concentration levels was analyzed using MS3 spectra for compound identification in the presence of isobaric interferences.

  15. Assessing the Alcohol-BMI Relationship in a US National Sample of College Students

    ERIC Educational Resources Information Center

    Barry, Adam E.; Piazza-Gardner, Anna K.; Holton, M. Kim

    2015-01-01

    Objective: This study sought to assess the body mass index (BMI)-alcohol relationship among a US national sample of college students. Design: Secondary data analysis using the Fall 2011 National College Health Assessment (NCHA). Setting: A total of 44 US higher education institutions. Methods: Participants included a national sample of college…

  16. GIS-BASED HYDROLOGIC MODELING: THE AUTOMATED GEOSPATIAL WATERSHED ASSESSMENT TOOL

    EPA Science Inventory

    Planning and assessment in land and water resource management are evolving from simple, local scale problems toward complex, spatially explicit regional ones. Such problems have to be addressed with distributed models that can compute runoff and erosion at different spatial a...

  17. Automated extraction of BI-RADS final assessment categories from radiology reports with natural language processing.

    PubMed

    Sippo, Dorothy A; Warden, Graham I; Andriole, Katherine P; Lacson, Ronilda; Ikuta, Ichiro; Birdwell, Robyn L; Khorasani, Ramin

    2013-10-01

    The objective of this study is to evaluate a natural language processing (NLP) algorithm that determines American College of Radiology Breast Imaging Reporting and Data System (BI-RADS) final assessment categories from radiology reports. This HIPAA-compliant study was granted institutional review board approval with waiver of informed consent. This cross-sectional study involved 1,165 breast imaging reports in the electronic medical record (EMR) from a tertiary care academic breast imaging center from 2009. Reports included screening mammography, diagnostic mammography, breast ultrasound, combined diagnostic mammography and breast ultrasound, and breast magnetic resonance imaging studies. Over 220 reports were included from each study type. The recall (sensitivity) and precision (positive predictive value) of a NLP algorithm to collect BI-RADS final assessment categories stated in the report final text was evaluated against a manual human review standard reference. For all breast imaging reports, the NLP algorithm demonstrated a recall of 100.0 % (95 % confidence interval (CI), 99.7, 100.0 %) and a precision of 96.6 % (95 % CI, 95.4, 97.5 %) for correct identification of BI-RADS final assessment categories. The NLP algorithm demonstrated high recall and precision for extraction of BI-RADS final assessment categories from the free text of breast imaging reports. NLP may provide an accurate, scalable data extraction mechanism from reports within EMRs to create databases to track breast imaging performance measures and facilitate optimal breast cancer population management strategies. PMID:23868515
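The recall, precision, and 95% confidence intervals reported for the NLP algorithm follow directly from confusion counts. A minimal sketch for reproducing such figures of merit (function names are illustrative; the Wilson score interval is one common choice for proportions near 100%):

```python
import math

def recall_precision(tp, fn, fp):
    """Recall (sensitivity) and precision (PPV) from confusion counts."""
    return tp / (tp + fn), tp / (tp + fp)

def wilson_ci(successes, n, z=1.96):
    """Wilson score confidence interval for a proportion (z=1.96 -> 95%)."""
    p = successes / n
    denom = 1 + z * z / n
    centre = (p + z * z / (2 * n)) / denom
    half = z * math.sqrt(p * (1 - p) / n + z * z / (4 * n * n)) / denom
    return centre - half, centre + half
```

The Wilson interval, unlike the naive normal approximation, stays inside [0, 1] even when the observed proportion is exactly 100%, as for the recall figure quoted above.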

  18. AGWA: The Automated Geospatial Watershed Assessment Tool to Inform Rangeland Management

    EPA Science Inventory

    Do you want a relatively easy to use tool to assess rangeland soil and water conservation practices on rangeland erosion that is specifically designed to use ecological information? New Decision Support Tools (DSTs) that are easy-to-use, incorporate ecological concepts and rangel...

  19. Determination of polycyclic aromatic hydrocarbons in food samples by automated on-line in-tube solid-phase microextraction coupled with high-performance liquid chromatography-fluorescence detection.

    PubMed

    Ishizaki, A; Saito, K; Hanioka, N; Narimatsu, S; Kataoka, H

    2010-08-27

    A simple and sensitive automated method, consisting of in-tube solid-phase microextraction (SPME) coupled with high-performance liquid chromatography-fluorescence detection (HPLC-FLD), was developed for the determination of 15 polycyclic aromatic hydrocarbons (PAHs) in food samples. PAHs were separated within 15 min by HPLC using a Zorbax Eclipse PAH column with a water/acetonitrile gradient elution program as the mobile phase. The optimum in-tube SPME conditions were 20 draw/eject cycles of 40 microL of sample using a CP-Sil 19CB capillary column as an extraction device. Low- and high-molecular-weight PAHs were extracted effectively onto the capillary coating from 5% and 30% methanol solutions, respectively. The extracted PAHs were readily desorbed from the capillary by passage of the mobile phase, and no carryover was observed. Using the in-tube SPME HPLC-FLD method, good linearity of the calibration curve (r>0.9972) was obtained in the concentration range of 0.05-2.0 ng/mL, and the detection limits (S/N=3) of PAHs were 0.32-4.63 pg/mL. The in-tube SPME method showed 18- to 47-fold higher sensitivity than the direct injection method. The intra-day and inter-day precision (relative standard deviations) for a 1 ng/mL PAH mixture were below 5.1% and 7.6% (n=5), respectively. This method was applied successfully to the analysis of tea products and dried food samples without interference peaks, and the recoveries of PAHs spiked into the tea samples were >70%. Low-molecular-weight PAHs such as naphthalene and pyrene were detected in many foods, and carcinogenic benzo[a]pyrene was also detected, at relatively high concentrations, in some black tea samples. This method was also utilized to assess the release of PAHs from tea leaves into the liquor. PMID:20637468
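The calibration linearity (r) and detection-limit figures of merit reported above follow standard formulas. A minimal pure-Python sketch (function names are illustrative; the paper uses an instrumental S/N = 3 criterion, approximated here as k times the baseline noise SD divided by the calibration slope):

```python
import math

def calibrate(conc, signal):
    """Least-squares calibration line and correlation coefficient r."""
    n = len(conc)
    mx, my = sum(conc) / n, sum(signal) / n
    sxx = sum((x - mx) ** 2 for x in conc)
    syy = sum((y - my) ** 2 for y in signal)
    sxy = sum((x - mx) * (y - my) for x, y in zip(conc, signal))
    slope = sxy / sxx
    intercept = my - slope * mx
    r = sxy / math.sqrt(sxx * syy)
    return slope, intercept, r

def detection_limit(noise_sd, slope, k=3.0):
    """Concentration giving a signal k times the baseline noise SD."""
    return k * noise_sd / slope
```

With a calibration series spanning the 0.05-2.0 ng/mL range used in the paper, `r` close to 1 corresponds to the reported linearity.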

  1. Test-retest reliability analysis of the Cambridge Neuropsychological Automated Tests for the assessment of dementia in older people living in retirement homes.

    PubMed

    Gonçalves, Marta Matos; Pinho, Maria Salomé; Simões, Mário R

    2016-01-01

    The validity of the Cambridge Neuropsychological Automated Tests has been widely studied, but their reliability has not. This study aimed to estimate the test-retest reliability of these tests in a sample of 34 older adults, aged 69 to 90 years, without neuropsychiatric diagnoses and living in retirement homes in the district of Lisbon, Portugal. The battery was administered twice, with a 4-week interval between sessions. The Paired Associates Learning (PAL), Spatial Working Memory (SWM), Rapid Visual Information Processing, and Reaction Time tests revealed measures with high-to-adequate test-retest correlations (.71-.89), although several PAL and SWM measures showed susceptibility to practice effects. Two estimated standardized regression-based methods were found to be more efficient at correcting for practice effects than a method of fixed correction. We also found weak test-retest correlations (.56-.68) for several measures. These results suggest that some, but not all, measures are suitable for cognitive assessment and monitoring in this population.

  2. Automated solid-phase extraction and liquid chromatography-electrospray ionization-mass spectrometry for the determination of flunitrazepam and its metabolites in human urine and plasma samples.

    PubMed

    Jourdil, N; Bessard, J; Vincent, F; Eysseric, H; Bessard, G

    2003-05-25

    A sensitive and specific method using reversed-phase liquid chromatography coupled with electrospray ionization-mass spectrometry (LC-ESI-MS) has been developed for the quantitative determination of flunitrazepam (F) and its metabolites 7-aminoflunitrazepam (7-AF), N-desmethylflunitrazepam (N-DMF), and 3-hydroxyflunitrazepam (3-OHF) in biological fluids. After the addition of deuterium-labelled standards of F, 7-AF, and N-DMF, the drugs were isolated from urine or plasma by automated solid-phase extraction, then chromatographed in an isocratic elution mode with a salt-free eluent. Quantification was performed using selected ion monitoring of protonated molecular ions ([M+H]+). Experiments were carried out to improve the extraction recovery (81-100%) and the sensitivity (limit of detection 0.025 ng/ml for F and 7-AF, 0.040 ng/ml for N-DMF, and 0.200 ng/ml for 3-OHF). The method was applied to the determination of F and its metabolites in samples from drug addicts, including withdrawal urine samples, and in plasma and urine samples from one date-rape case. PMID:12705961

  3. Automated detection of ambiguity in BI-RADS assessment categories in mammography reports.

    PubMed

    Bozkurt, Selen; Rubin, Daniel

    2014-01-01

    An unsolved challenge in biomedical natural language processing (NLP) is detecting ambiguities in the reports that can help physicians to improve report clarity. Our goal was to develop NLP methods to tackle the challenges of identifying ambiguous descriptions of the laterality of BI-RADS Final Assessment Categories in mammography radiology reports. We developed a text processing system that uses a BI-RADS ontology we built as a knowledge source for automatic annotation of the entities in mammography reports relevant to this problem. We used the GATE NLP toolkit and developed customized processing resources for report segmentation, named entity recognition, and detection of mismatches between BI-RADS Final Assessment Categories and mammogram laterality. Our system detected 55 mismatched cases in 190 reports and the accuracy rate was 81%. We conclude that such NLP techniques can detect ambiguities in mammography reports and may reduce discrepancy and variability in reporting. PMID:24743074

  5. Automated Tracking of Quantitative Assessments of Tumor Burden in Clinical Trials

    PubMed Central

    Rubin, Daniel L; Willrett, Debra; O'Connor, Martin J; Hage, Cleber; Kurtz, Camille; Moreira, Dilvan A

    2014-01-01

    There are two key challenges hindering effective use of quantitative assessment of imaging in cancer response assessment: 1) Radiologists usually describe the cancer lesions in imaging studies subjectively and sometimes ambiguously, and 2) it is difficult to repurpose imaging data, because lesion measurements are not recorded in a format that permits machine interpretation and interoperability. We have developed a freely available software platform on the basis of open standards, the electronic Physician Annotation Device (ePAD), to tackle these challenges in two ways. First, ePAD facilitates the radiologist in carrying out cancer lesion measurements as part of routine clinical trial image interpretation workflow. Second, ePAD records all image measurements and annotations in a data format that permits repurposing image data for analyses of alternative imaging biomarkers of treatment response. To determine the impact of ePAD on radiologist efficiency in quantitative assessment of imaging studies, a radiologist evaluated computed tomography (CT) imaging studies from 20 subjects having one baseline and three consecutive follow-up imaging studies with and without ePAD. The radiologist made measurements of target lesions in each imaging study using Response Evaluation Criteria in Solid Tumors 1.1 criteria, initially with the aid of ePAD, and then after a 30-day washout period, the exams were reread without ePAD. The mean total time required to review the images and summarize measurements of target lesions was 15% (P < .039) shorter using ePAD than without using this tool. In addition, it was possible to rapidly reanalyze the images to explore lesion cross-sectional area as an alternative imaging biomarker to linear measure. We conclude that ePAD appears promising to potentially improve reader efficiency for quantitative assessment of CT examinations, and it may enable discovery of future novel image-based biomarkers of cancer treatment response. PMID:24772204

  6. Semi-automated Volumetric and Morphological Assessment of Glioblastoma Resection with Fluorescence-Guided Surgery

    PubMed Central

    Cordova, J. Scott; Gurbani, Saumya S.; Holder, Chad A.; Olson, Jeffrey J.; Schreibmann, Eduard; Shi, Ran; Guo, Ying; Shu, Hui-Kuo G.; Shim, Hyunsuk; Hadjipanayis, Costas G.

    2016-01-01

    Purpose Glioblastoma (GBM) neurosurgical resection relies on contrast-enhanced MRI-based neuronavigation. However, it is well-known that infiltrating tumor extends beyond contrast enhancement. Fluorescence-guided surgery (FGS) using 5-aminolevulinic acid (5-ALA) was evaluated to improve extent of resection (EOR) of GBMs. Pre-operative morphological tumor metrics were also assessed. Procedures Thirty patients from a Phase II trial evaluating 5-ALA FGS in newly diagnosed GBM were assessed. Tumors were segmented pre-operatively to assess morphological features as well as post-operatively to evaluate EOR and residual tumor volume (RTV). Results Median EOR and RTV were 94.3% and 0.821 cm3, respectively. Pre-operative surface area to volume ratio and RTV were significantly associated with overall survival, even when controlling for the known survival confounders. Conclusions This study supports claims that 5-ALA FGS is helpful at decreasing tumor burden and prolonging survival in GBM. Moreover, morphological indices are shown to impact both resection and patient survival. PMID:26463215

  7. Automated breast tissue density assessment using high order regional texture descriptors in mammography

    NASA Astrophysics Data System (ADS)

    Law, Yan Nei; Lieng, Monica Keiko; Li, Jingmei; Khoo, David Aik-Aun

    2014-03-01

    Breast cancer is the most common cancer and second leading cause of cancer death among women in the US. The relative survival rate is lower among women with a more advanced stage at diagnosis. Early detection through screening is vital. Mammography is the most widely used and only proven screening method for reliably and effectively detecting abnormal breast tissues. In particular, mammographic density is one of the strongest breast cancer risk factors, after age and gender, and can be used to assess the future risk of disease before individuals become symptomatic. A reliable method for automatic density assessment would be beneficial and could assist radiologists in the evaluation of mammograms. To address this problem, we propose a density classification method which uses statistical features from different parts of the breast. Our method is composed of three parts: breast region identification, feature extraction and building ensemble classifiers for density assessment. It explores the potential of the features extracted from second and higher order statistical information for mammographic density classification. We further investigate the registration of bilateral pairs and time-series of mammograms. The experimental results on 322 mammograms demonstrate that (1) a classifier using features from dense regions has higher discriminative power than a classifier using only features from the whole breast region; (2) these high-order features can be effectively combined to boost the classification accuracy; (3) a classifier using these statistical features from dense regions achieves 75% accuracy, which is a significant improvement from 70% accuracy obtained by the existing approaches.

  8. AUTOMATED GEOSPATIAL WATERSHED ASSESSMENT (AGWA): A GIS-BASED HYDROLOGIC MODELING TOOL FOR LANDSCAPE ASSESSMENT AND WATERSHED MANAGEMENT

    EPA Science Inventory

    The assessment of land use and land cover is an extremely important activity for contemporary land management. A large body of current literature suggests that human land-use practice is the most important factor influencing natural resource management and environmental condition...

  9. Higher-Order Exploratory Factor Analysis of the Reynolds Intellectual Assessment Scales with a Referred Sample

    ERIC Educational Resources Information Center

    Nelson, Jason M.; Canivez, Gary L.; Lindstrom, Will; Hatt, Clifford V.

    2007-01-01

    The factor structure of the Reynolds Intellectual Assessment Scales (RIAS; [Reynolds, C.R., & Kamphaus, R.W. (2003). "Reynolds Intellectual Assessment Scales". Lutz, FL: Psychological Assessment Resources, Inc.]) was investigated with a large (N=1163) independent sample of referred students (ages 6-18). More rigorous factor extraction criteria…

  10. Sequential sampling: cost-effective approach for monitoring benthic macroinvertebrates in environmental impact assessments

    SciTech Connect

    Resh, V.H.; Price, D.G.

    1984-01-01

    Sequential sampling is a method for monitoring benthic macroinvertebrates that can significantly reduce the number of samples required to reach a decision, and consequently, decrease the cost of benthic sampling in environmental impact assessments. Rather than depending on a fixed number of samples, this analysis cumulatively compares measured parameter values (for example, density, community diversity) from individual samples, with thresholds that are based on specified degrees of precision. In addition to reducing sample size, a monitoring program based on sequential sampling can provide clear-cut decisions as to whether a priori-defined changes in the measured parameter(s) have or have not occurred.
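The cumulative compare-against-thresholds scheme described here is in the spirit of Wald's sequential probability ratio test. A sketch for deciding between two hypothesized means under a normal model with known sigma (function name and defaults are illustrative, not taken from the paper):

```python
import math

def sequential_decision(samples, h0_mean, h1_mean, sigma,
                        alpha=0.05, beta=0.05):
    """Wald SPRT for a shift in mean (normal model, known sigma).

    Returns ('accept H0' | 'accept H1' | 'continue', samples used).
    alpha/beta are the tolerated type I/II error rates.
    """
    a = math.log(beta / (1 - alpha))        # lower log-threshold
    b = math.log((1 - beta) / alpha)        # upper log-threshold
    llr = 0.0
    for n, x in enumerate(samples, 1):
        # log-likelihood ratio increment, N(h1, sigma) vs N(h0, sigma)
        llr += ((x - h0_mean) ** 2 - (x - h1_mean) ** 2) / (2 * sigma ** 2)
        if llr >= b:
            return "accept H1", n
        if llr <= a:
            return "accept H0", n
    return "continue", len(samples)
```

As the abstract notes, the benefit is that a decision can be reached after far fewer samples than a fixed-size design whenever the evidence is strong, while the thresholds encode the specified degree of precision.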

  11. The Clinical Outcomes Assessment Toolkit: A Framework to Support Automated Clinical Records–based Outcomes Assessment and Performance Measurement Research

    PubMed Central

    D'Avolio, Leonard W.; Bui, Alex A.T.

    2008-01-01

    The Clinical Outcomes Assessment Toolkit (COAT) was created through a collaboration between the University of California, Los Angeles and Brigham and Women's Hospital to address the challenge of gathering, formatting, and abstracting data for clinical outcomes and performance measurement research. COAT provides a framework for the development of information pipelines to transform clinical data from its original structured, semi-structured, and unstructured forms to a standardized format amenable to statistical analysis. This system includes a collection of clinical data structures, reusable utilities for information analysis and transformation, and a graphical user interface through which pipelines can be controlled and their results audited by nontechnical users. The COAT architecture is presented, as well as two case studies of current implementations in the domain of prostate cancer outcomes assessment. PMID:18308990

  12. Quality assurance guidance for field sampling and measurement assessment plates in support of EM environmental sampling and analysis activities

    SciTech Connect

    Not Available

    1994-05-01

    This document is one of several guidance documents developed by the US Department of Energy (DOE) Office of Environmental Restoration and Waste Management (EM). These documents support the EM Analytical Services Program (ASP) and are based on applicable regulatory requirements and DOE Orders. They address requirements in DOE Orders by providing guidance that pertains specifically to environmental restoration and waste management sampling and analysis activities. DOE 5700.6C Quality Assurance (QA) defines policy and requirements to establish QA programs ensuring that risks and environmental impacts are minimized and that safety, reliability, and performance are maximized. This is accomplished through the application of effective management systems commensurate with the risks imposed by the facility and the project. Every organization supporting EM's environmental sampling and analysis activities must develop and document a QA program. Management of each organization is responsible for appropriate QA program implementation, assessment, and improvement. The collection of credible and cost-effective environmental data is critical to the long-term success of remedial and waste management actions performed at DOE facilities. Only well established and management supported assessment programs within each EM-support organization will enable DOE to demonstrate data quality. The purpose of this series of documents is to offer specific guidance for establishing an effective assessment program for EM's environmental sampling and analysis (ESA) activities.

  13. Quantification of serum apolipoproteins A-I and B-100 in clinical samples using an automated SISCAPA-MALDI-TOF-MS workflow.

    PubMed

    van den Broek, Irene; Nouta, Jan; Razavi, Morteza; Yip, Richard; Bladergroen, Marco R; Romijn, Fred P H T M; Smit, Nico P M; Drews, Oliver; Paape, Rainer; Suckau, Detlev; Deelder, André M; van der Burgt, Yuri E M; Pearson, Terry W; Anderson, N Leigh; Cobbaert, Christa M

    2015-06-15

    A fully automated workflow was developed and validated for simultaneous quantification of the cardiovascular disease risk markers apolipoproteins A-I (apoA-I) and B-100 (apoB-100) in clinical sera. By coupling of stable-isotope standards and capture by anti-peptide antibodies (SISCAPA) for enrichment of proteotypic peptides from serum digests to matrix-assisted laser desorption/ionization time-of-flight (MALDI-TOF) MS detection, the standardized platform enabled rapid, liquid chromatography-free quantification at a relatively high throughput of 96 samples in 12 h. The average imprecision in normo- and triglyceridemic serum pools was 3.8% for apoA-I and 4.2% for apoB-100 (4 replicates over 5 days). If stored properly, the MALDI target containing enriched apoA-I and apoB-100 peptides could be re-analyzed without any effect on bias or imprecision for at least 7 days after initial analysis. Validation of the workflow revealed excellent linearity for daily calibration with external, serum-based calibrators (R² of 0.984 for apoA-I and 0.976 for apoB-100 as averages over five days), and absence of matrix effects or interference from triglycerides, protein content, hemolysates, or bilirubins. Quantification of apoA-I in 93 normo- and hypertriglyceridemic clinical sera showed good agreement with immunoturbidimetric analysis (slope = 1.01, R² = 0.95, mean bias = 4.0%). Measurement of apoB-100 in the same clinical sera using both methods, however, revealed several outliers in SISCAPA-MALDI-TOF-MS measurements, possibly as a result of the lower MALDI-TOF-MS signal intensity (slope = 1.09, R² = 0.91, mean bias = 2.0%). The combination of analytical performance, rapid cycle time, and automation potential validates the SISCAPA-MALDI-TOF-MS platform as a valuable approach for standardized and high-throughput quantification of apoA-I and apoB-100 in large sample cohorts.
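The imprecision (%CV) and mean percent bias figures quoted above are straightforward to compute from replicate and paired measurements. A small sketch using only the standard library (function names are illustrative):

```python
from statistics import mean, stdev

def percent_cv(replicates):
    """Imprecision as the coefficient of variation (%CV) of replicates."""
    return 100.0 * stdev(replicates) / mean(replicates)

def mean_percent_bias(test_vals, reference_vals):
    """Average percent difference of a test method vs. a comparison
    method, computed pairwise, sample by sample."""
    diffs = [100.0 * (t - r) / r for t, r in zip(test_vals, reference_vals)]
    return mean(diffs)
```

Applied to paired SISCAPA-MALDI-TOF-MS and immunoturbidimetric results, `mean_percent_bias` corresponds to the "mean bias" figures in the abstract.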

  14. Enabling automated magnetic resonance imaging-based targeting assessment during dipole field navigation

    NASA Astrophysics Data System (ADS)

    Latulippe, Maxime; Felfoul, Ouajdi; Dupont, Pierre E.; Martel, Sylvain

    2016-02-01

    The magnetic navigation of drugs in the vascular network promises to increase the efficacy and reduce the secondary toxicity of cancer treatments by targeting tumors directly. Recently, dipole field navigation (DFN) was proposed as the first method achieving both high field and high navigation gradient strengths for whole-body interventions in deep tissues. This is achieved by introducing large ferromagnetic cores around the patient inside a magnetic resonance imaging (MRI) scanner. However, doing so distorts the static field inside the scanner, which prevents imaging during the intervention. This limitation constrains DFN to open-loop navigation, thus exposing the risk of a harmful toxicity in case of a navigation failure. Here, we are interested in periodically assessing drug targeting efficiency using MRI even in the presence of a core. We demonstrate, using a clinical scanner, that it is in fact possible to acquire, in specific regions around a core, images of sufficient quality to perform this task. We show that the core can be moved inside the scanner to a position minimizing the distortion effect in the region of interest for imaging. Moving the core can be done automatically using the gradient coils of the scanner, which then also enables the core to be repositioned to perform navigation to additional targets. The feasibility and potential of the approach are validated in an in vitro experiment demonstrating navigation and assessment at two targets.

  15. A METHOD FOR AUTOMATED ANALYSIS OF 10 ML WATER SAMPLES CONTAINING ACIDIC, BASIC, AND NEUTRAL SEMIVOLATILE COMPOUNDS LISTED IN USEPA METHOD 8270 BY SOLID PHASE EXTRACTION COUPLED IN-LINE TO LARGE VOLUME INJECTION GAS CHROMATOGRAPHY/MASS SPECTROMETRY

    EPA Science Inventory

    Data is presented showing the progress made towards the development of a new automated system combining solid phase extraction (SPE) with gas chromatography/mass spectrometry for the single run analysis of water samples containing a broad range of acid, base and neutral compounds...

  16. Cockpit automation

    NASA Technical Reports Server (NTRS)

    Wiener, Earl L.

    1988-01-01

    The aims and methods of aircraft cockpit automation are reviewed from a human-factors perspective. Consideration is given to the mixed pilot reception of increased automation, government concern with the safety and reliability of highly automated aircraft, the formal definition of automation, and the ground-proximity warning system and accidents involving controlled flight into terrain. The factors motivating automation include technology availability; safety; economy, reliability, and maintenance; workload reduction and two-pilot certification; more accurate maneuvering and navigation; display flexibility; economy of cockpit space; and military requirements.

  17. Assessment of fully automated antibody homology modeling protocols in molecular operating environment.

    PubMed

    Maier, Johannes K X; Labute, Paul

    2014-08-01

    The success of antibody-based drugs has led to an increased demand for predictive computational tools to assist antibody engineering efforts surrounding the six hypervariable loop regions making up the antigen binding site. Accurate computational modeling of isolated protein loop regions can be quite difficult; consequently, modeling an antigen binding site that includes six loops is particularly challenging. In this work, we present a method for automatic modeling of the Fv region of an immunoglobulin based upon the use of a precompiled antibody X-ray structure database, which serves as a source of framework and hypervariable region structural templates that are grafted together. We applied this method (on common desktop hardware) to the Second Antibody Modeling Assessment (AMA-II) target structures, along with an experimental specialized CDR-H3 loop modeling method. The results of the computational structure predictions are presented and discussed. PMID:24715627

  18. Quantitative assessment of levodopa-induced dyskinesia using automated motion sensing technology.

    PubMed

    Mera, Thomas O; Burack, Michelle A; Giuffrida, Joseph P

    2012-01-01

    The objective was to capture levodopa-induced dyskinesia (LID) in patients with Parkinson's disease (PD) using body-worn motion sensors. Dopaminergic treatment in PD can induce abnormal involuntary movements, including choreatic dyskinesia (brief, rapid, irregular movements). Adjustments in medication to reduce LID often sacrifice control of motor symptoms, and balancing this tradeoff poses a significant challenge for management of advanced PD. Fifteen PD subjects with known LID were recruited and instructed to perform two stationary motor tasks while wearing a compact wireless motion sensor unit positioned on each hand over the course of a levodopa dose cycle. Videos of subjects performing the motor tasks were later scored by expert clinicians to assess global dyskinesia using the modified Abnormal Involuntary Movement Scale (m-AIMS). Kinematic features were extracted from motion data in different frequency bands (1-3 Hz and 3-8 Hz) to quantify LID severity and to distinguish between LID and PD tremor. Receiver operating characteristic analysis was used to determine thresholds for individual features to detect the presence of LID. A sensitivity of 0.73 and specificity of 1.00 were achieved. A neural network was also trained to output dyskinesia severity on a 0 to 4 scale, similar to the m-AIMS. The model generalized well to new data (coefficient of determination = 0.85 and mean squared error = 0.3). This study demonstrated that hand-worn motion sensors can be used to assess global dyskinesia severity, independent of PD tremor, over the levodopa dose cycle.
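Choosing a per-feature detection threshold from an ROC analysis, as done here for the kinematic features, is often formalized by maximizing Youden's J statistic. An illustrative sketch (the authors' actual threshold-selection rule is not specified in the abstract):

```python
def best_threshold(scores_pos, scores_neg):
    """Scan candidate thresholds over observed scores and return the
    (threshold, J) pair maximizing Youden's J = sensitivity + specificity - 1,
    a common way to pick an operating point from an ROC curve."""
    best_t, best_j = None, -1.0
    for t in sorted(set(scores_pos + scores_neg)):
        sens = sum(s >= t for s in scores_pos) / len(scores_pos)
        spec = sum(s < t for s in scores_neg) / len(scores_neg)
        j = sens + spec - 1.0
        if j > best_j:
            best_t, best_j = t, j
    return best_t, best_j
```

With perfectly separated feature scores this returns J = 1.0; the sensitivity/specificity pair reported above (0.73/1.00) corresponds to a threshold favoring specificity.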

  19. A Computer-Based Automated Algorithm for Assessing Acinar Cell Loss after Experimental Pancreatitis

    PubMed Central

    Eisses, John F.; Davis, Amy W.; Tosun, Akif Burak; Dionise, Zachary R.; Chen, Cheng; Ozolek, John A.; Rohde, Gustavo K.; Husain, Sohail Z.

    2014-01-01

    The change in exocrine mass is an important parameter to follow in experimental models of pancreatic injury and regeneration. However, at present, the quantitative assessment of exocrine content by histology is tedious and operator-dependent, requiring manual assessment of acinar area on serial pancreatic sections. In this study, we utilized a novel computer-based learning algorithm to construct an accurate and rapid method of quantifying acinar content. The algorithm works by learning differences in pixel characteristics from input examples provided by human experts. H&E-stained pancreatic sections were obtained in mice recovering from a 2-day, hourly caerulein hyperstimulation model of experimental pancreatitis. For training data, a pathologist carefully outlined discrete regions of acinar and non-acinar tissue in 21 sections at various stages of pancreatic injury and recovery (termed the “ground truth”). After the expert defined the ground truth, the computer was able to develop a prediction rule that was then applied to a unique set of high-resolution images in order to validate the process. For baseline, non-injured pancreatic sections, the software demonstrated close agreement with the ground truth, identifying baseline acinar tissue area with a difference of only 1%±0.05% (p = 0.21). Within regions of injured tissue, the software reported a difference of 2.5%±0.04% in acinar area compared with the pathologist (p = 0.47). Surprisingly, on detailed morphological examination, the discrepancy arose primarily because the software outlined acini and excluded inter-acinar and luminal white space with greater precision. The findings suggest that the software will be of great potential benefit to both clinicians and researchers in quantifying pancreatic acinar cell flux in the injured and recovering pancreas. PMID:25343460
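
    The headline numbers above compare the acinar-area fraction reported by the software with the pathologist's outline; that comparison is a difference of mask fractions, sketched here with invented binary masks:

```python
import numpy as np

# Toy binary masks (True = acinar pixel): a hypothetical pathologist outline
# versus a hypothetical software prediction on the same section.
truth = np.zeros((100, 100), dtype=bool)
truth[20:80, 20:80] = True             # pathologist "ground truth" region
pred = np.zeros((100, 100), dtype=bool)
pred[21:80, 21:80] = True              # software draws a slightly tighter outline

def acinar_fraction(mask):
    """Fraction of the section classified as acinar tissue."""
    return mask.sum() / mask.size

diff = abs(acinar_fraction(pred) - acinar_fraction(truth))
```

    Here the two outlines disagree by about 1.2% of section area, the same order as the 1-2.5% differences the study reports.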

  20. Development of a field-friendly automated dietary assessment tool and nutrient database for India.

    PubMed

    Daniel, Carrie R; Kapur, Kavita; McAdams, Mary J; Dixit-Joshi, Sujata; Devasenapathy, Niveditha; Shetty, Hemali; Hariharan, Sriram; George, Preethi S; Mathew, Aleyamma; Sinha, Rashmi

    2014-01-14

    Studies of diet and disease risk in India and among other Asian-Indian populations are hindered by the need for a comprehensive dietary assessment tool to capture data on the wide variety of food and nutrient intakes across different regions and ethnic groups. The nutritional component of the India Health Study, a multicentre pilot cohort study, included 3908 men and women, aged 35-69 years, residing in three regions of India (New Delhi in the north, Mumbai in the west and Trivandrum in the south). We developed a computer-based, interviewer-administered dietary assessment software known as the 'NINA-DISH (New Interactive Nutrition Assistant - Diet in India Study of Health)', which consisted of four sections: (1) a diet history questionnaire with defined questions on frequency and portion size; (2) an open-ended section for each mealtime; (3) a food-preparer questionnaire; (4) a 24 h dietary recall. Using the preferred meal-based approach, frequency of intake and portion size were recorded and linked to a nutrient database that we developed and modified from a set of existing international databases containing data on Indian foods and recipes. The NINA-DISH software was designed to be easily adaptable and was well accepted by the interviewers and participants in the field. A predominant three-meal eating pattern emerged; however, patterns in the number of foods reported and the primary contributors to macro- and micronutrient intakes differed by region and demographic factors. The newly developed NINA-DISH software provides a much-needed tool for measuring diet and nutrient profiles across the diverse populations of India with the potential for application in other South Asian populations living throughout the world. PMID:23796477

  1. Toward Semi-automated Assessment of Target Volume Delineation in Radiotherapy Trials: The SCOPE 1 Pretrial Test Case

    SciTech Connect

    Gwynne, Sarah; Spezi, Emiliano; Wills, Lucy; Nixon, Lisette; Hurt, Chris; Joseph, George; Evans, Mererid; Griffiths, Gareth; Crosby, Tom; Staffurth, John

    2012-11-15

    Purpose: To evaluate different conformity indices (CIs) for use in the analysis of outlining consistency within the pretrial quality assurance (Radiotherapy Trials Quality Assurance [RTTQA]) program of a multicenter chemoradiation trial of esophageal cancer and to make recommendations for their use in future trials. Methods and Materials: The National Cancer Research Institute SCOPE 1 trial is an ongoing Cancer Research UK-funded phase II/III randomized controlled trial of chemoradiation with capecitabine and cisplatin with or without cetuximab for esophageal cancer. The pretrial RTTQA program included a detailed radiotherapy protocol, an educational package, and a single mid-esophageal tumor test case that were sent to each investigator to outline. Investigator gross tumor volumes (GTVs) were received from 50 investigators in 34 UK centers, and CERR (Computational Environment for Radiotherapy Research) was used to perform an assessment of each investigator GTV against a predefined gold-standard GTV using different CIs. A new metric, the local conformity index (l-CI), which can localize areas of maximal discordance, was developed. Results: The median Jaccard conformity index (JCI) was 0.69 (interquartile range, 0.62-0.70), with 14 of 50 investigators (28%) achieving a JCI of 0.7 or greater. The median geographical miss index was 0.09 (interquartile range, 0.06-0.16), and the mean discordance index was 0.27 (95% confidence interval, 0.25-0.30). The l-CI was highest in the middle section of the volume, where the tumor was bulky and more easily definable, and identified 4 slices where fewer than 20% of investigators achieved an l-CI of 0.7 or greater. Conclusions: The available CIs analyze different aspects of the variation between observers and the gold standard, with the JCI being the most useful single metric. Additional information is provided by the l-CI, which can focus the efforts of the RTTQA team on discordant areas, possibly leading to semi-automated outlining assessment.
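
    The Jaccard conformity index used above is the ratio of the intersection to the union of the two volumes; a minimal sketch on hypothetical voxel masks:

```python
import numpy as np

# Two hypothetical GTV outlines on the same voxel grid: a gold-standard
# volume and an investigator volume shifted by two voxels. Invented data.
gold = np.zeros((50, 50, 20), dtype=bool)
gold[10:40, 10:40, 5:15] = True
inv = np.zeros((50, 50, 20), dtype=bool)
inv[12:42, 12:42, 5:15] = True

def jaccard(a, b):
    """Jaccard conformity index: |A intersect B| / |A union B|."""
    return np.logical_and(a, b).sum() / np.logical_or(a, b).sum()

jci = jaccard(gold, inv)
```

    Applying the same function slice by slice along the cranio-caudal axis yields a per-slice profile in the spirit of the l-CI described above.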

  2. Assessment of the Phoenix™ automated system and EUCAST breakpoints for antimicrobial susceptibility testing against isolates expressing clinically relevant resistance mechanisms.

    PubMed

    Giani, T; Morosini, M I; D'Andrea, M M; García-Castillo, M; Rossolini, G M; Cantón, R

    2012-11-01

    EUCAST breakpoint criteria are being adopted by automated antimicrobial susceptibility testing systems. The accuracy of the Phoenix Automated System in combination with 2012 EUCAST breakpoints against recent clinical isolates was evaluated. A total of 697 isolates (349 Enterobacteriaceae, 113 Pseudomonas spp., 25 Acinetobacter baumannii, 11 Stenotrophomonas maltophilia, 95 Staphylococcus aureus, 6 coagulase-negative staphylococci, 77 enterococci and 21 Streptococcus pneumoniae) with defined resistance phenotypes and well-characterized resistance mechanisms recovered in Spain (n = 343) and Italy (n = 354) were tested. Comparator antimicrobial susceptibility testing data were obtained following CLSI guidelines. Essential agreement (EA), defined as MIC agreement within ±1 log2 dilution, category agreement (CA) and relative discrepancies (minor (mD), major (MD) and very major (VMD) discrepancies) were determined. The overall EA and CA for all organism-antimicrobial agent combinations (n = 6,294) were 97.3% and 95.2%, respectively. mD, MD and VMD rates were 4.7%, 1.3% and 2.7%, all in agreement with the ISO (ISO 20776-2:2007) acceptance criteria for the assessment of susceptibility testing devices. VMD were mainly observed with amoxicillin-clavulanate and cefuroxime in Enterobacteriaceae and gentamicin in Pseudomonas aeruginosa, whereas MD were mainly observed with amoxicillin-clavulanate in Enterobacteriaceae. mD were mainly observed in Enterobacteriaceae but were distributed across different antimicrobials. For S. aureus and enterococci, relative discrepancies were low. The Phoenix system showed accuracy in accordance with the ISO standards when using EUCAST breakpoints. Inclusion of EUCAST criteria in automated antimicrobial susceptibility testing systems will facilitate the implementation of EUCAST breakpoints in clinical microbiology laboratories.
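
    The EA criterion above, a device MIC within ±1 two-fold (log2) dilution of the reference MIC, can be sketched as follows; the MIC pairs are invented for illustration:

```python
import math

# Hypothetical MIC pairs (µg/mL): (device result, reference-method result).
pairs = [(0.5, 0.5), (1.0, 2.0), (4.0, 1.0), (8.0, 8.0)]

def essential_agreement(device, reference):
    """True when the device MIC is within ±1 two-fold dilution of the reference."""
    return abs(math.log2(device) - math.log2(reference)) <= 1.0

ea_rate = sum(essential_agreement(d, r) for d, r in pairs) / len(pairs)
```

    The third pair is two dilutions off and therefore fails EA, giving an EA rate of 75% on this toy set.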

  3. Assessing the accuracy and repeatability of automated photogrammetrically generated digital surface models from unmanned aerial system imagery

    NASA Astrophysics Data System (ADS)

    Chavis, Christopher

    Using commercial digital cameras in conjunction with Unmanned Aerial Systems (UAS) to generate 3-D Digital Surface Models (DSMs) and orthomosaics is emerging as a cost-effective alternative to Light Detection and Ranging (LiDAR). Powerful software applications such as Pix4D and APS can automate the generation of DSM and orthomosaic products from a handful of inputs. However, the accuracy of these models is relatively untested. The objectives of this study were to generate multiple DSM and orthomosaic pairs of the same area using Pix4D and APS from flights of imagery collected with a lightweight UAS. The accuracy of each individual DSM was assessed in addition to the consistency of the method to model one location over a period of time. Finally, this study determined if the DSMs automatically generated using lightweight UAS and commercial digital cameras could be used for detecting changes in elevation and at what scale. Accuracy was determined by comparing DSMs to a series of reference points collected with survey grade GPS. Other GPS points were also used as control points to georeference the products within Pix4D and APS. The effectiveness of the products for change detection was assessed through image differencing and observance of artificially induced, known elevation changes. The vertical accuracy with the optimal data and model is ≈ 25 cm and the highest consistency over repeat flights is a standard deviation of ≈ 5 cm. Elevation change detection based on such UAS imagery and DSM models should be viable for detecting infrastructure change in urban or suburban environments with little dense canopy vegetation.
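
    Accuracy assessment against survey-grade GPS check points, as described above, typically reduces to residual statistics such as the vertical RMSE; a minimal sketch with invented elevations:

```python
import math

# Hypothetical DSM elevations sampled at survey-grade GPS check points (m).
dsm_z = [101.10, 98.75, 105.40, 99.90]
gps_z = [101.00, 98.90, 105.20, 100.10]

# Vertical residual at each check point, then root-mean-square error.
residuals = [d - g for d, g in zip(dsm_z, gps_z)]
rmse = math.sqrt(sum(r * r for r in residuals) / len(residuals))
```

    The mean and standard deviation of the same residuals across repeat flights would quantify the consistency figure (≈ 5 cm) reported above.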

  4. Automating Risk Assessments of Hazardous Material Shipments for Transportation Routes and Mode Selection

    SciTech Connect

    Barbara H. Dolphin; William D. Richins; Stephen R. Novascone

    2010-10-01

    The METEOR project at Idaho National Laboratory (INL) successfully addresses the difficult problem in risk assessment analyses of combining the results from bounding deterministic simulation results with probabilistic (Monte Carlo) risk assessment techniques. This paper describes a software suite designed to perform sensitivity and cost/benefit analyses on selected transportation routes and vehicles to minimize risk associated with the shipment of hazardous materials. METEOR uses Monte Carlo techniques to estimate the probability of an accidental release of a hazardous substance along a proposed transportation route. A METEOR user selects the mode of transportation, origin and destination points, and charts the route using interactive graphics. Inputs to METEOR (many selections built in) include crash rates for the specific aircraft, soil/rock type and population densities over the proposed route, and bounding limits for potential accident types (velocity, temperature, etc.). New vehicle, materials, and location data are added when available. If the risk estimates are unacceptable, the risks associated with alternate transportation modes or routes can be quickly evaluated and compared. Systematic optimizing methods will provide the user with the route and vehicle selection identified with the lowest risk of hazardous material release. The effects of a selected range of potential accidents such as vehicle impact, fire, fuel explosions, excessive containment pressure, flooding, etc. are evaluated primarily using hydrocodes capable of accurately simulating the material response of critical containment components. Bounding conditions that represent credible accidents (i.e., for an impact event: velocity, orientation, and soil conditions) are used as input parameters to the hydrocode models, yielding correlation functions relating accident parameters to component damage. The Monte Carlo algorithms use random number generators to make selections at the various decision points.
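
    The probabilistic core described above, sampling accident parameters and propagating them through a damage model, can be caricatured as a Monte Carlo loop; the crash rate, route length, severity distribution, and damage threshold below are invented placeholders, not METEOR's actual inputs or models:

```python
import random

random.seed(42)  # reproducible illustration

# Invented, simplified parameters (not METEOR's data).
CRASH_RATE_PER_KM = 1e-7     # assumed accidents per vehicle-km
ROUTE_KM = 500.0             # assumed route length
DAMAGE_THRESHOLD_MS = 60.0   # assumed impact velocity (m/s) that breaches containment

def release_probability(trials=100_000):
    """Estimate P(release) = P(accident) * P(containment damage | accident)."""
    p_accident = CRASH_RATE_PER_KM * ROUTE_KM
    releases = 0
    for _ in range(trials):
        # Sample an accident severity; real codes would sample many correlated
        # parameters and query hydrocode-derived correlation functions instead.
        velocity = random.uniform(0.0, 100.0)
        if velocity > DAMAGE_THRESHOLD_MS:
            releases += 1
    return p_accident * releases / trials

p = release_probability()
```

    Rerunning the loop for alternate routes or vehicles (different crash rates and thresholds) supports the comparative analyses the abstract describes.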

  5. Comparison of various capillary electrophoretic approaches for the study of drug-protein interaction with emphasis on minimal consumption of protein sample and possibility of automation.

    PubMed

    Michalcová, Lenka; Glatz, Zdeněk

    2015-01-01

    The binding ability of a drug to plasma proteins influences the pharmacokinetics of a drug. As a result, it is a very important issue in new drug development. In this study, affinity capillary electrophoresis, capillary electrophoresis with frontal analysis, and Hummel-Dreyer methods with internal and external calibration were used to study the affinity between bovine serum albumin and salicylic acid. The binding constant was measured by all these approaches, including equilibrium dialysis, which is considered to be a reference method. The comparison of results and other considerations showed the best electrophoretic approach to be capillary electrophoresis-frontal analysis, which is characterized by high sample throughput with the possibility of automation, very small quantities of biomacromolecules, simplicity, and a short analysis time. The mechanism of complex formation was then examined by capillary electrophoresis with frontal analysis. The binding parameters were determined and the corresponding thermodynamic parameters, such as the Gibbs free energy change ΔG°, enthalpy change ΔH°, and entropy change ΔS°, at various temperatures were calculated. The results showed that the binding of bovine serum albumin and salicylic acid was spontaneous, and that hydrogen bonding and van der Waals forces played a major role in the formation of the complex.
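
    The thermodynamic quantities mentioned above follow from standard relations: ΔG° = −RT ln K at each temperature, ΔH° from the van't Hoff equation across temperatures, and ΔS° = (ΔH° − ΔG°)/T. A sketch with invented binding constants, chosen so that ΔH° and ΔS° come out negative, the signature of hydrogen bonding and van der Waals interactions:

```python
import math

R = 8.314  # gas constant, J mol^-1 K^-1

# Hypothetical binding constants at two temperatures (illustrative values,
# not the paper's data).
T1, K1 = 298.0, 1.2e4
T2, K2 = 310.0, 0.8e4

# Standard Gibbs free energy change at T1.
dG1 = -R * T1 * math.log(K1)

# van't Hoff: ln(K2/K1) = -(dH/R) * (1/T2 - 1/T1), assuming dH constant.
dH = -R * math.log(K2 / K1) / (1.0 / T2 - 1.0 / T1)

# Entropy change from dG = dH - T*dS.
dS1 = (dH - dG1) / T1
```

    A negative ΔG° with negative ΔH° and ΔS°, as here, is the pattern the abstract interprets as spontaneous, enthalpy-driven binding.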

  6. A fully automated effervescence-assisted switchable solvent-based liquid phase microextraction procedure: Liquid chromatographic determination of ofloxacin in human urine samples.

    PubMed

    Vakh, Christina; Pochivalov, Aleksei; Andruch, Vasil; Moskvin, Leonid; Bulatov, Andrey

    2016-02-11

    A novel fully automated effervescence-assisted switchable solvent-based liquid phase microextraction procedure has been suggested. In this extraction method, medium-chain saturated fatty acids were investigated as switchable hydrophilicity solvents. The conversion of the fatty acid into its hydrophilic form was carried out in the presence of sodium carbonate. The injection of sulfuric acid into the solution decreased its pH; thus, microdroplets of the fatty acid were generated. Carbon dioxide bubbles were generated in situ, promoting the extraction process and the final phase separation. The performance of the suggested approach was demonstrated by the determination of ofloxacin in human urine samples using high-performance liquid chromatography with fluorescence detection. This analytical task was used as a proof-of-concept example. Under the optimal conditions, the detector response for ofloxacin was linear over the concentration range 3·10⁻⁸ to 3·10⁻⁶ mol L⁻¹. The limit of detection, calculated from a blank test based on 3σ, was 1·10⁻⁸ mol L⁻¹. The results demonstrated that the presented approach is highly cost-effective, simple, rapid and environmentally friendly.

  7. Evaluation of a Portable Automated Serum Chemistry Analyzer for Field Assessment of Harlequin Ducks, Histrionicus histrionicus.

    PubMed

    Stoskopf, Michael K; Mulcahy, Daniel M; Esler, Daniel

    2010-01-01

    A portable analytical chemistry analyzer was used to make field assessments of wild harlequin ducks (Histrionicus histrionicus) in association with telemetry studies of winter survival in Prince William Sound, Alaska. We compared serum chemistry results obtained on-site with results from a traditional laboratory. Particular attention was paid to serum glucose and potassium concentrations as potential indicators of high-risk surgical candidates based on evaluation of the field data. The median differential for glucose values (N = 82) between methods was 0.6 mmol/L (quartiles 0.3 and 0.9 mmol/L), with the median value higher when assayed on-site. Analysis of potassium on-site returned a median of 2.7 mmol/L (N = 88; quartiles 2.4 and 3.0 mmol/L). Serum potassium values were too low for quantitation by the traditional laboratory. Changes in several serum chemistry values following a three-day storm during the study support the value of on-site evaluation of serum potassium to identify presurgical patients with increased anesthetic risk.

  8. Automated In-Home Fall Risk Assessment and Detection Sensor System for Elders

    PubMed Central

    Rantz, Marilyn; Skubic, Marjorie; Abbott, Carmen; Galambos, Colleen; Popescu, Mihail; Keller, James; Stone, Erik; Back, Jessie; Miller, Steven J.; Petroski, Gregory F.

    2015-01-01

    Purpose of the Study: Falls are a major problem for elderly people, leading to injury, disability, and even death. An unobtrusive, in-home sensor system that continuously monitors older adults for fall risk and detects falls could revolutionize fall prevention and care. Design and Methods: A fall risk and detection system was developed and installed in the apartments of 19 older adults at a senior living facility. The system includes pulse-Doppler radar, a Microsoft Kinect, and 2 web cameras. To collect data for comparison with sensor data and for algorithm development, stunt actors performed falls in participants’ apartments each month for 2 years and participants completed fall risk assessments (FRAs) using clinically valid, standardized instruments. The FRAs were scored by clinicians and recorded by the sensing modalities. Participants’ gait parameters were measured as they walked on a GAITRite mat. These data were used as ground-truth, objective data for algorithm development and for comparison with radar- and Kinect-generated variables. Results: All FRAs are highly correlated (p < .01) with the Kinect gait velocity and Kinect stride length. Radar velocity is correlated (p < .05) with all the FRAs and highly correlated (p < .01) with most. Real-time alerts of actual falls are being sent to clinicians, providing faster responses to urgent situations. Implications: The in-home FRA and detection system has the potential to help older adults remain independent, maintain functional ability, and live at home longer. PMID:26055784

  9. Automated quantitative assessment of three-dimensional bioprinted hydrogel scaffolds using optical coherence tomography

    PubMed Central

    Wang, Ling; Xu, Mingen; Zhang, LieLie; Zhou, QingQing; Luo, Li

    2016-01-01

    Reconstructing and quantitatively assessing the internal architecture of opaque three-dimensional (3D) bioprinted hydrogel scaffolds is difficult but vital to the improvement of 3D bioprinting techniques and to the fabrication of functional engineered tissues. In this study, swept-source optical coherence tomography was applied to acquire high-resolution images of hydrogel scaffolds. Novel 3D gelatin/alginate hydrogel scaffolds with six different representative architectures were fabricated using our 3D bioprinting system. Both the scaffold material networks and the interconnected flow channel networks were reconstructed through volume rendering and binarisation processing to provide a 3D volumetric view. An image analysis algorithm was developed based on the automatic selection of the spatially-isolated region of interest. Via this algorithm, the spatially-resolved morphological parameters, including pore size, pore shape, strut size, surface area, porosity, and interconnectivity, were quantified precisely. Fabrication defects and differences between the designed and as-produced scaffolds were clearly identified in both 2D and 3D; the locations and dimensions of each of the fabrication defects were also defined. We conclude that this method will be a key tool for non-destructive and quantitative characterization, design optimisation and fabrication refinement of 3D bioprinted hydrogel scaffolds. Furthermore, this method enables investigation into the quantitative relationship between scaffold structure and biological outcome. PMID:27231597
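
    Porosity, one of the morphological parameters quantified above, reduces to a voxel count once the OCT volume is binarised; a toy sketch with an invented scaffold volume:

```python
import numpy as np

# Toy binarised OCT volume: True = scaffold material, False = pore/channel.
vol = np.ones((40, 40, 40), dtype=bool)
vol[10:30, 10:30, :] = False   # a square flow channel running through the volume

# Porosity = void fraction of the total volume.
porosity = 1.0 - vol.sum() / vol.size
```

    Labelling connected components of the void phase in the same binarised volume is one route to the interconnectivity measure the abstract also reports.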

  10. Transforming Biology Assessment with Machine Learning: Automated Scoring of Written Evolutionary Explanations

    NASA Astrophysics Data System (ADS)

    Nehm, Ross H.; Ha, Minsu; Mayfield, Elijah

    2012-02-01

    This study explored the use of machine learning to automatically evaluate the accuracy of students' written explanations of evolutionary change. Performance of the Summarization Integrated Development Environment (SIDE) program was compared to human expert scoring using a corpus of 2,260 evolutionary explanations written by 565 undergraduate students in response to two different evolution instruments (the EGALT-F and EGALT-P) that contained prompts that differed in various surface features (such as species and traits). We tested human-SIDE scoring correspondence under a series of different training and testing conditions, using Kappa inter-rater agreement values of greater than 0.80 as a performance benchmark. In addition, we examined the effects of response length on scoring success; that is, whether SIDE scoring models functioned with comparable success on short and long responses. We found that SIDE performance was most effective when scoring models were built and tested at the individual item level and that performance degraded when suites of items or entire instruments were used to build and test scoring models. Overall, SIDE was found to be a powerful and cost-effective tool for assessing student knowledge and performance in a complex science domain.
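
    Kappa, the agreement benchmark used above, corrects raw human-machine agreement for agreement expected by chance. A minimal sketch of Cohen's kappa on hypothetical 0/1 accuracy scores (illustrative labels, not the study's data):

```python
from collections import Counter

# Hypothetical scores (1 = explanation judged accurate) from a human rater
# and a machine scorer on the same ten responses; invented for illustration.
human = [1, 0, 1, 1, 0, 1, 0, 0, 1, 1]
machine = [1, 0, 1, 0, 0, 1, 0, 1, 1, 1]

def cohen_kappa(a, b):
    """Cohen's kappa: (observed - chance agreement) / (1 - chance agreement)."""
    n = len(a)
    p_obs = sum(x == y for x, y in zip(a, b)) / n
    ca, cb = Counter(a), Counter(b)
    p_exp = sum(ca[k] * cb[k] for k in set(a) | set(b)) / (n * n)
    return (p_obs - p_exp) / (1 - p_exp)

kappa = cohen_kappa(human, machine)
```

    On this toy set, 80% raw agreement shrinks to kappa ≈ 0.58 after the chance correction, well below the 0.80 benchmark used in the study.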

  11. Using RNA sample titrations to assess microarray platform performance and normalization techniques.

    PubMed

    Shippy, Richard; Fulmer-Smentek, Stephanie; Jensen, Roderick V; Jones, Wendell D; Wolber, Paul K; Johnson, Charles D; Pine, P Scott; Boysen, Cecilie; Guo, Xu; Chudin, Eugene; Sun, Yongming Andrew; Willey, James C; Thierry-Mieg, Jean; Thierry-Mieg, Danielle; Setterquist, Robert A; Wilson, Mike; Lucas, Anne Bergstrom; Novoradovskaya, Natalia; Papallo, Adam; Turpaz, Yaron; Baker, Shawn C; Warrington, Janet A; Shi, Leming; Herman, Damir

    2006-09-01

    We have assessed the utility of RNA titration samples for evaluating microarray platform performance and the impact of different normalization methods on the results obtained. As part of the MicroArray Quality Control project, we investigated the performance of five commercial microarray platforms using two independent RNA samples and two titration mixtures of these samples. Focusing on 12,091 genes common across all platforms, we determined the ability of each platform to detect the correct titration response across the samples. Global deviations from the response predicted by the titration ratios were observed. These differences could be explained by variations in relative amounts of messenger RNA as a fraction of total RNA between the two independent samples. Overall, both the qualitative and quantitative correspondence across platforms was high. In summary, titration samples may be regarded as a valuable tool, not only for assessing microarray platform performance and different analysis methods, but also for determining some underlying biological features of the samples.
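
    The expected titration response referenced above follows from linear mixing: a mixture that is fraction f of sample A should show signal f·A + (1 − f)·B on a linear scale, and deviations from that prediction are what the study examines. A minimal sketch with invented intensities:

```python
# Invented linear-scale signals for one gene in the two pure RNA samples.
signal_a = 1200.0
signal_b = 300.0
f = 0.75  # the mixture is 75% sample A, 25% sample B

# Predicted signal in the titration mixture under linear mixing.
expected_mix = f * signal_a + (1.0 - f) * signal_b

# A hypothetical measured intensity; the residual is the titration deviation.
measured_mix = 950.0
deviation = measured_mix - expected_mix
```

    Systematic deviations across many genes are what the abstract attributes to differing mRNA fractions of total RNA between the two samples.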

  12. Measurement of acceleration while walking as an automated method for gait assessment in dairy cattle.

    PubMed

    Chapinal, N; de Passillé, A M; Pastell, M; Hänninen, L; Munksgaard, L; Rushen, J

    2011-06-01

    The aims were to determine whether measures of acceleration of the legs and back of dairy cows while they walk could help detect changes in gait or locomotion associated with lameness and differences in the walking surface. In 2 experiments, 12 or 24 multiparous dairy cows were fitted with five 3-dimensional accelerometers, 1 attached to each leg and 1 to the back, and acceleration data were collected while cows walked in a straight line on concrete (experiment 1) or on both concrete and rubber (experiment 2). Cows were video-recorded while walking to assess overall gait, asymmetry of the steps, and walking speed. In experiment 1, cows were selected to maximize the range of gait scores, whereas no clinically lame cows were enrolled in experiment 2. For each accelerometer location, overall acceleration was calculated as the magnitude of the 3-dimensional acceleration vector and the variance of overall acceleration, as well as the asymmetry of variance of acceleration within the front and rear pair of legs. In experiment 1, the asymmetry of variance of acceleration in the front and rear legs was positively correlated with overall gait and the visually assessed asymmetry of the steps (r ≥ 0.6). Walking speed was negatively correlated with the asymmetry of variance of the rear legs (r=-0.8) and positively correlated with the acceleration and the variance of acceleration of each leg and back (r ≥ 0.7). In experiment 2, cows had lower gait scores [2.3 vs. 2.6; standard error of the difference (SED)=0.1, measured on a 5-point scale] and lower scores for asymmetry of the steps (18.0 vs. 23.1; SED=2.2, measured on a continuous 100-unit scale) when they walked on rubber compared with concrete, and their walking speed increased (1.28 vs. 1.22 m/s; SED=0.02). The acceleration of the front (1.67 vs. 1.72 g; SED=0.02) and rear (1.62 vs. 1.67 g; SED=0.02) legs and the variance of acceleration of the rear legs (0.88 vs. 0.94 g; SED=0.03) were lower when cows walked on rubber
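
    The core kinematic quantities above, overall acceleration as the magnitude of the 3-dimensional vector and the asymmetry of its variance within a pair of legs, can be sketched as follows; the simulated traces and the normalized asymmetry formula are illustrative assumptions, not the study's exact definitions:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated 3-axis accelerometer traces (units of g) for a pair of legs;
# gravity sits on the z axis. Purely illustrative data, with one leg made
# more variable to mimic an asymmetric gait.
left = rng.normal(0.0, 0.30, size=(500, 3)) + [0.0, 0.0, 1.0]
right = rng.normal(0.0, 0.45, size=(500, 3)) + [0.0, 0.0, 1.0]

def overall(acc):
    """Overall acceleration: magnitude of the 3-D acceleration vector."""
    return np.linalg.norm(acc, axis=1)

def asymmetry_of_variance(a, b):
    """Normalized difference between the variances of overall acceleration."""
    va, vb = overall(a).var(), overall(b).var()
    return abs(va - vb) / max(va, vb)

asym = asymmetry_of_variance(left, right)
```

    Larger values of such an asymmetry index within the front or rear pair of legs are what correlated with visually assessed gait scores in experiment 1.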

  13. Efficiency of EPI cluster sampling for assessing diarrhoea and dysentery prevalence.

    PubMed Central

    Yoon, S. S.; Katz, J.; Brendel, K.; West, K. P.

    1997-01-01

    This study examines the efficiency of EPI cluster sampling in assessing the prevalence of diarrhoea and dysentery. A computer was used to simulate fieldwork carried out by a survey taker. The bias and variance of prevalence estimates obtained using EPI cluster sampling were compared with those obtained using simple random sampling and cluster (stratified random) sampling. Efficiency ratios, calculated as the mean square error divided by total distance travelled, were used to compare EPI cluster sampling to simple random sampling and standard cluster sampling. EPI cluster sampling may be an appropriate low-cost tool for monitoring trends in the prevalence of diarrhoea and dysentery over time. However, it should be used with caution when estimating the prevalence of diarrhoea at a single point in time because of the bias associated with this cluster sampling method. PMID:9447775
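
    The bias/variance comparison underlying the efficiency ratio above (mean square error divided by total distance travelled) can be sketched as follows; the prevalence estimates and distance are invented for illustration:

```python
# Hypothetical prevalence estimates from repeated simulated surveys of a
# population whose true prevalence is known; numbers are illustrative.
true_prevalence = 0.20
estimates = [0.24, 0.22, 0.25, 0.23, 0.26]   # e.g., repeated EPI cluster samples

n = len(estimates)
mean_est = sum(estimates) / n
bias = mean_est - true_prevalence
variance = sum((e - mean_est) ** 2 for e in estimates) / n

# Mean square error decomposes into squared bias plus variance.
mse = bias ** 2 + variance

# Efficiency ratio in the spirit of the study: error per unit of fieldwork cost.
distance_travelled_km = 120.0                # hypothetical total travel
efficiency = mse / distance_travelled_km
```

    The persistent positive bias in this toy run mirrors the caution above: EPI cluster sampling can track trends cheaply yet bias single-point prevalence estimates.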

  14. 296-B-5 Stack monitoring and sampling system annual system assessment report

    SciTech Connect

    Ridge, T.M.

    1995-02-01

    The B Plant Administration Manual requires an annual system assessment to evaluate and report the present condition of the sampling and monitoring system associated with Stack 296-B-5 at B Plant. The sampling and monitoring system associated with stack 296-B-5 is functional and performing satisfactorily. This document is an annual assessment report of the systems associated with the 296-B-5 stack.

  15. Assessment of the Current Level of Automation in the Manufacture of Fuel Cell Systems for Combined Heat and Power Applications

    SciTech Connect

    Ulsh, M.; Wheeler, D.; Protopappas, P.

    2011-08-01

    The U.S. Department of Energy (DOE) is interested in supporting manufacturing research and development (R&D) for fuel cell systems in the 10-1,000 kilowatt (kW) power range relevant to stationary and distributed combined heat and power applications, with the intent to reduce manufacturing costs and increase production throughput. To assist in future decision-making, DOE requested that the National Renewable Energy Laboratory (NREL) provide a baseline understanding of the current levels of adoption of automation in manufacturing processes and flow, as well as of continuous processes. NREL identified and visited or interviewed key manufacturers, universities, and laboratories relevant to the study using a standard questionnaire. The questionnaire covered the current level of vertical integration, the importance of quality control developments for automation, the current level of automation and source of automation design, critical balance of plant issues, potential for continuous cell manufacturing, key manufacturing steps or processes that would benefit from DOE support for manufacturing R&D, the potential for cell or stack design changes to support automation, and the relationship between production volume and decisions on automation.

  16. Testing of an automated online EA-IRMS method for fast and simultaneous carbon content and stable isotope measurement of aerosol samples

    NASA Astrophysics Data System (ADS)

    Major, István; Gyökös, Brigitta; Túri, Marianna; Futó, István; Filep, Ágnes; Hoffer, András; Molnár, Mihály

    2016-04-01

    Comprehensive atmospheric studies have demonstrated that carbonaceous aerosol is one of the main components of atmospheric particulate matter over Europe. Various methods, considering optical or thermal properties, have been developed for accurate quantification of both the organic and elemental carbon constituents of atmospheric aerosol. The aim of our work was to develop an alternative, fast and easy method for determination of the total carbon content of individual aerosol samples collected on prebaked quartz filters, whereby the mass and surface concentration become simply computable. We applied the conventional "elemental analyzer (EA) coupled online with an isotope ratio mass spectrometer (IRMS)" technique, which is ubiquitously used in mass spectrometry. Using this technique we are able to measure the carbon stable isotope ratio of the samples simultaneously as well. During the development process, we compared the EA-IRMS technique with an off-line catalytic combustion method worked out previously at the Hertelendi Laboratory of Environmental Studies (HEKAL). We tested the combined online total carbon content and stable isotope ratio measurement both on standard materials and on real aerosol samples. Regarding the test results, the novel method assures, on the one hand, a carbon recovery yield of at least 95% over a broad total carbon mass range (between 100 and 3000 µg) and, on the other hand, good reproducibility of the stable isotope measurements with an uncertainty of ±0.2 per mil. Comparing the total carbon results obtained by the EA-IRMS and the off-line catalytic combustion methods, we found a very good correlation (R²=0.94), which proves the applicability of both preparation methods. Advantages of the novel method are the fast and simplified sample preparation steps and the fully automated, simultaneous carbon stable isotope ratio measurement processes. Furthermore, the stable isotope ratio results can effectively be applied in source apportionment.

  17. Fully automated ionic liquid-based headspace single drop microextraction coupled to GC-MS/MS to determine musk fragrances in environmental water samples.

    PubMed

    Vallecillos, Laura; Pocurull, Eva; Borrull, Francesc

    2012-09-15

    A fully automated ionic liquid-based headspace single drop microextraction (IL-HS-SDME) procedure has been developed for the first time to preconcentrate trace amounts of ten musk fragrances extensively used in personal care products (six polycyclic musks, three nitro musks and one polycyclic musk degradation product) from wastewater samples prior to analysis by gas chromatography and ion trap tandem mass spectrometry (GC-IT-MS/MS). Due to the low volatility of the ILs, a large internal diameter liner (3.4 mm i.d.) was used to improve IL evaporation. Furthermore, a piece of glass wool was introduced into the liner to prevent the entrance of the ILs into the GC column, and a guard column was used to prevent damage to the analytical column. The main factors influencing the IL-HS-SDME were optimized. For all species, the highest enrichment factors were achieved using 1 μL of 1-octyl-3-methylimidazolium hexafluorophosphate ([OMIM][PF₆]) ionic liquid exposed in the headspace of 10 mL water samples containing 300 g L⁻¹ of NaCl and stirred at 750 rpm and 60 °C for 45 min. All compounds were determined by direct injection GC-IT-MS/MS with a chromatographic time of 19 min. Method detection limits were found in the low ng mL⁻¹ range, between 0.010 ng mL⁻¹ and 0.030 ng mL⁻¹, depending on the target analytes. Also, under optimized conditions, the method gave good levels of intra-day and inter-day repeatability in wastewater samples, with relative standard deviations varying between 3% and 6% and 5% and 11%, respectively (n=3, 1 ng mL⁻¹). The applicability of the method was tested with different wastewater samples from influent and effluent urban wastewater treatment plants (WWTPs) and one potable treatment plant (PTP). The analysis of influent urban wastewater revealed the presence of galaxolide and tonalide at concentrations of between 2.10 ng mL⁻¹ and 0.29 ng mL⁻¹ and 0.32 ng mL⁻¹ and

  18. Using Teacher Work Samples to Develop and Assess Best Practices in Physical Education Teacher Education

    ERIC Educational Resources Information Center

    Sariscsany, Mary Jo

    2010-01-01

    Teacher work samples (TWS) are an integrated, comprehensive assessment tool that can be used as evidence of a beginning teacher's readiness to teach. Unlike linear assessments used to determine teaching effectiveness, TWS are relevant and reflective of "real" teaching. They are designed to exhibit a clear relationship among teacher candidate…

  19. Assessing Writing Competence through Writing Samples. Studies in Language Education Report No. 36.

    ERIC Educational Resources Information Center

    Hudson, Sally; Veal, Ramon

    This report discusses a plan to help Georgia school systems develop their own evaluation instruments for student writing. The report indicates that the use of actual writing samples in assessing writing skills can be valid, reliable, and economical. It includes attachments with sample writing scores from a school, the purposes and statistics that…

  20. Samples of Students' Responses from the Grade 9 Science Performance-Based Assessment Tasks, June 1993.

    ERIC Educational Resources Information Center

    Alberta Dept. of Education, Edmonton. Student Evaluation Branch.

    The purpose of this document is to provide teachers, administrators, students, and parents with samples of students' performances that exemplify standards in relation to the 1993 Grade 9 Science Performance-Based Assessment Tasks for the province of Alberta, Canada. A sample of 698 randomly selected students from 31 schools did the…

  1. Development and Validation of an Admission Test Designed to Assess Samples of Performance on Academic Tasks

    ERIC Educational Resources Information Center

    Tanilon, Jenny; Segers, Mien; Vedder, Paul; Tillema, Harm

    2009-01-01

    This study illustrates the development and validation of an admission test, labeled as Performance Samples on Academic Tasks in Educational Sciences (PSAT-Ed), designed to assess samples of performance on academic tasks characteristic of those that would eventually be encountered by examinees in an Educational Sciences program. The test was based…

  2. Using Language Sample Analysis to Assess Spoken Language Production in Adolescents

    ERIC Educational Resources Information Center

    Miller, Jon F.; Andriacchi, Karen; Nockerts, Ann

    2016-01-01

    Purpose: This tutorial discusses the importance of language sample analysis and how Systematic Analysis of Language Transcripts (SALT) software can be used to simplify the process and effectively assess the spoken language production of adolescents. Method: Over the past 30 years, thousands of language samples have been collected from typical…

  3. Benzodiazepine Use During Hospitalization: Automated Identification of Potential Medication Errors and Systematic Assessment of Preventable Adverse Events

    PubMed Central

    Niedrig, David Franklin; Hoppe, Liesa; Mächler, Sarah; Russmann, Heike; Russmann, Stefan

    2016-01-01

Objective Benzodiazepines and “Z-drug” GABA-receptor modulators (BDZ) are among the most frequently used drugs in hospitals. Adverse drug events (ADE) associated with BDZ can be the result of preventable medication errors (ME) related to dosing, drug interactions and comorbidities. The present study evaluated inpatient use of BDZ and related ME and ADE. Methods We conducted an observational study within a pharmacoepidemiological database derived from the clinical information system of a tertiary care hospital. We developed algorithms that identified dosing errors and interacting comedication for all administered BDZ. Associated ADE and risk factors were validated in medical records. Results Among 53,081 patients contributing 495,813 patient-days, BDZ were administered to 25,626 patients (48.3%) on 115,150 patient-days (23.2%). We identified 3,372 patient-days (2.9%) with comedication that inhibits BDZ metabolism, and 1,197 (1.0%) with lorazepam administration in severe renal impairment. After validation we classified 134, 56, 12, and 3 cases involving lorazepam, zolpidem, midazolam and triazolam, respectively, as clinically relevant ME. Among these there were 23 cases with associated adverse drug events, including severe CNS depression, falls with subsequent injuries and severe dyspnea. Causality for BDZ was formally assessed as ‘possible’ or ‘probable’ in 20 of these cases. Four cases with ME and associated severe ADE required administration of the BDZ antagonist flumazenil. Conclusions BDZ use was remarkably high in the studied setting, frequently involved potential ME related to dosing, co-medication and comorbidities, and in rare cases was associated with ADE. We propose the implementation of automated ME screening and validation for the prevention of BDZ-related ADE. PMID:27711224
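The dosing- and interaction-screening algorithms described in this abstract lend themselves to a simple rule-based sketch. The following is a hypothetical illustration only: the drug sets, the eGFR threshold, and the record layout are assumptions for demonstration, not the study's actual algorithm.

```python
# Hypothetical rule-based screen for potential medication errors (ME)
# in benzodiazepine (BDZ) administrations. Illustrative only.

# Illustrative set of comedications assumed to inhibit BDZ metabolism (CYP3A4)
CYP3A4_INHIBITORS = {"clarithromycin", "itraconazole", "ritonavir"}

def screen_bdz_administration(record):
    """Return a list of potential-ME flags for one patient-day record."""
    flags = []
    drug = record["bdz"]
    # Flag lorazepam given in severe renal impairment (assumed eGFR < 30 mL/min)
    if drug == "lorazepam" and record.get("egfr_ml_min", 999) < 30:
        flags.append("lorazepam in severe renal impairment")
    # Flag comedication that inhibits the metabolism of CYP3A4-cleared BDZ
    inhibitors = CYP3A4_INHIBITORS & set(record.get("comedication", []))
    if drug in {"midazolam", "triazolam", "zolpidem"} and inhibitors:
        flags.append(f"CYP3A4 inhibitor comedication: {sorted(inhibitors)}")
    return flags

print(screen_bdz_administration(
    {"bdz": "midazolam", "comedication": ["clarithromycin"], "egfr_ml_min": 80}
))  # one interaction flag
```

As in the study, such automated flags would only mark *potential* ME; each case still requires validation against the medical record before it is classified as clinically relevant.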

  4. Automated quantification of DNA demethylation effects in cells via 3D mapping of nuclear signatures and population homogeneity assessment.

    PubMed

    Gertych, Arkadiusz; Wawrowsky, Kolja A; Lindsley, Erik; Vishnevsky, Eugene; Farkas, Daniel L; Tajbakhsh, Jian

    2009-07-01

Today's advanced microscopic imaging applies to the preclinical stages of drug discovery, employing high-throughput, high-content three-dimensional (3D) analysis of cells to screen candidate compounds more efficiently. Drug efficacy can be assessed by measuring the homogeneity of response to treatment within a cell population. In this study, topologically quantified nuclear patterns of methylated cytosine and global nuclear DNA are utilized as signatures of cellular response to the treatment of cultured cells with the demethylating anti-cancer agents 5-azacytidine (5-AZA) and octreotide (OCT). Mouse pituitary folliculostellate TtT-GF cells treated with 5-AZA and OCT for 48 hours, and untreated populations, were studied by immunofluorescence with a specific antibody against 5-methylcytosine (MeC), and 4,6-diamidino-2-phenylindole (DAPI) for delineation of methylated sites and global DNA in nuclei (n = 163). Cell images were processed with automated 3D analysis software that we developed by combining seeded watershed segmentation, to extract nuclear shells, with measurements of Kullback-Leibler (K-L) divergence, to analyze cell-population homogeneity in the relative nuclear distribution patterns of MeC versus DAPI stained sites. Each cell was assigned to one of four classes: similar, likely similar, unlikely similar, and dissimilar. Evaluation of the different cell groups revealed a significantly higher number of cells with similar or likely similar MeC/DAPI patterns among untreated cells (approximately 100%) and 5-AZA-treated cells (90%), and a lower proportion of such cells (64%) in the OCT-treated population. The latter group contained unlikely similar (28%) or dissimilar (7%) cells. Our approach was successful in the assessment of cellular behavior relevant to the biological impact of the applied drugs, i.e., the reorganization of MeC/DAPI distribution by demethylation. In a comparison with other metrics, K-L divergence has proven to be a more
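As a sketch of the homogeneity measure used here, the discrete Kullback-Leibler divergence between two normalized intensity histograms can be computed and thresholded into the four similarity classes. The cut-points and the toy histograms below are illustrative assumptions; the study's actual class boundaries are not given in the abstract.

```python
import math

def kl_divergence(p, q, eps=1e-12):
    """Discrete Kullback-Leibler divergence D(P||Q) between two normalized
    histograms (e.g. relative MeC vs. DAPI nuclear distributions)."""
    return sum(pi * math.log((pi + eps) / (qi + eps)) for pi, qi in zip(p, q))

def classify_similarity(d, cuts=(0.1, 0.5, 1.0)):
    """Map a divergence value onto the four classes used in the study.
    The cut-points are invented for illustration, not the published ones."""
    labels = ("similar", "likely similar", "unlikely similar", "dissimilar")
    for cut, label in zip(cuts, labels):
        if d < cut:
            return label
    return labels[-1]

mec  = [0.40, 0.35, 0.15, 0.10]   # toy MeC histogram (sums to 1)
dapi = [0.38, 0.34, 0.17, 0.11]   # toy DAPI histogram
print(classify_similarity(kl_divergence(mec, dapi)))  # → similar
```

Note that K-L divergence is asymmetric (D(P||Q) ≠ D(Q||P) in general), which is one reason implementations must fix a convention for which distribution is the reference.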

  5. An automated system for access to derived climate indices in support of ecological impacts assessments and resource management

    NASA Astrophysics Data System (ADS)

    Walker, J.; Morisette, J. T.; Talbert, C.; Blodgett, D. L.; Kunicki, T.

    2012-12-01

A U.S. Geological Survey team is working with several providers to establish standard data services for the climate projection data they host. To meet the needs of the climate adaptation science and landscape management communities, the team is establishing a set of climate index calculation algorithms that will consume data from various providers and provide directly useful data derivatives. Climate projections coming from various scenarios, modeling centers, and downscaling methods are increasing in number and size. Global change impact modeling and assessment generally requires inputs in the form of climate indices or values derived from raw climate projections. This requirement puts a large burden on a community not familiar with climate data formats, semantics, and processing techniques, and demands storage capacity and computing resources out of the reach of most. In order to fully understand the implications of our best available climate projections, assessments must take into account an ensemble of climate projections and potentially a range of parameters for calculating climate indices. These data access and processing requirements vary little from project to project, or even among projected climate data sets, pointing to the need for a reusable tool to generate climate indices. The U.S. Geological Survey has developed a pilot application and supporting web service framework that automates the generation of climate indices. The web service framework consists of standards-based data servers and a data integration broker. The resulting system allows data producers to publish and maintain ownership of their data, and data consumers to access climate derivatives via a simple-to-use "data product ordering" workflow. Data access and processing are completed on enterprise "cloud" computing resources, and only the relatively small, derived climate indices are delivered to the scientist or land manager. These services will assist the scientific and land
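A minimal sketch of the kind of climate index such a service might derive from raw daily projections is growing degree days (GDD), a standard agricultural and ecological index. The base temperature and input layout here are assumptions for illustration, not part of the described framework.

```python
# Illustrative derived climate index: growing degree days (GDD) computed
# from daily minimum/maximum temperatures (°C). A service like the one
# described would return only this small derivative, not the raw series.

def growing_degree_days(tmin, tmax, base=10.0):
    """Accumulate (mean daily temperature - base), floored at zero."""
    total = 0.0
    for lo, hi in zip(tmin, tmax):
        total += max(((lo + hi) / 2.0) - base, 0.0)
    return total

# Four invented days of projected temperatures
tmin = [8.0, 12.0, 15.0, 9.0]
tmax = [18.0, 24.0, 27.0, 17.0]
print(growing_degree_days(tmin, tmax))  # → 25.0
```

The point of the framework is that this reduction runs server-side over a full projection ensemble, so only a few numbers per scenario reach the scientist or land manager.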

  6. Differential proteomic analysis of mouse macrophages exposed to adsorbate-loaded heavy fuel oil derived combustion particles using an automated sample-preparation workflow.

    PubMed

    Kanashova, Tamara; Popp, Oliver; Orasche, Jürgen; Karg, Erwin; Harndorf, Horst; Stengel, Benjamin; Sklorz, Martin; Streibel, Thorsten; Zimmermann, Ralf; Dittmar, Gunnar

    2015-08-01

Ship diesel combustion particles are known to cause broad cytotoxic effects and thereby strongly impact human health. Particles from heavy fuel oil (HFO)-operated ships are considered particularly dangerous. However, little is known about which components of the ship emission particles are relevant. In particular, it is of interest whether the particle cores, consisting of soot and metal oxides, or the adsorbate layers, consisting of semi- and low-volatile organic compounds and salts, are more relevant. We therefore sought to relate the adsorbates and the core composition of HFO combustion particles to the early cellular responses, allowing for the development of measures that counteract their detrimental effects. Hence, the semi-volatile coating of HFO-operated ship diesel engine particles was removed by stepwise thermal stripping at different temperatures. RAW 264.7 macrophages were exposed to native and thermally stripped particles in submersed culture. Proteomic changes were monitored by two different quantitative mass spectrometry approaches: stable isotope labeling by amino acids in cell culture (SILAC) and dimethyl labeling. Our data revealed that cells reacted differently to native and stripped HFO combustion particles. Cells exposed to thermally stripped particles reacted differentially depending on the particles' individual chemical load. The cellular reactions to the HFO particles included responses to oxidative stress, reorganization of the cytoskeleton and changes in endocytosis. Cells exposed to the 280 °C-treated particles showed an induction of RNA-related processes, a number of mitochondria-associated processes as well as a DNA damage response, while exposure to 580 °C-treated HFO particles mainly induced the regulation of intracellular transport. In summary, our analysis, based on a highly reproducible automated proteomic sample-preparation procedure, shows a diverse cellular response, depending on the

  7. Assessment of fully-automated atlas-based segmentation of novel oral mucosal surface organ-at-risk

    PubMed Central

    Dean, Jamie A; Welsh, Liam C; McQuaid, Dualta; Wong, Kee H; Aleksic, Aleksandar; Dunne, Emma; Islam, Mohammad R; Patel, Anushka; Patel, Priyanka; Petkar, Imran; Phillips, Iain; Sham, Jackie; Newbold, Kate L; Bhide, Shreerang A; Harrington, Kevin J; Gulliford, Sarah L; Nutting, Christopher M

    2016-01-01

Background and Purpose Current oral mucositis normal tissue complication probability models, based on the dose distribution to the oral cavity volume, have suboptimal predictive power. Improving the delineation of the oral mucosa is likely to improve these models, but is resource intensive. We developed and evaluated fully-automated atlas-based segmentation (ABS) of a novel delineation technique for the oral mucosal surfaces. Material and Methods An atlas of mucosal surface contours (MSC) consisting of 46 patients was developed. It was applied to an independent test cohort of 10 patients for whom manual segmentation of MSC structures, by three different clinicians, and conventional outlining of oral cavity contours (OCC), by an additional clinician, were also performed. Geometric comparisons were made using the Dice similarity coefficient (DSC), validation index (VI) and Hausdorff distance (HD). Dosimetric comparisons were carried out using dose-volume histograms. Results The median differences in the DSC and HD between automated-manual comparisons and manual-manual comparisons were small and non-significant (-0.024; p = 0.33 and -0.5; p = 0.88, respectively). The median VI was 0.086. The maximum normalised volume difference between automated and manual MSC structures across all of the dose levels, averaged over the test cohort, was 8%. This difference reached approximately 28% when comparing automated MSC and OCC structures. Conclusions Fully-automated ABS of MSC is suitable for use in radiotherapy dose-response modelling. PMID:26970676
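The geometric comparisons reported here can be sketched directly. Below is a minimal, assumed implementation of the Dice similarity coefficient and the symmetric Hausdorff distance over point sets; the study's actual software will differ in detail (e.g. it operates on full 3D contour volumes).

```python
import math

def dice(mask_a, mask_b):
    """Dice similarity coefficient DSC = 2|A ∩ B| / (|A| + |B|),
    with masks given as sets of voxel indices."""
    a, b = set(mask_a), set(mask_b)
    if not a and not b:
        return 1.0
    return 2.0 * len(a & b) / (len(a) + len(b))

def hausdorff(points_a, points_b):
    """Symmetric Hausdorff distance between two point sets:
    the largest distance from any point to the nearest point of the other set."""
    def directed(src, dst):
        return max(min(math.dist(p, q) for q in dst) for p in src)
    return max(directed(points_a, points_b), directed(points_b, points_a))

a = {(0, 0), (0, 1), (1, 0), (1, 1)}   # toy 2D "mask" A
b = {(0, 1), (1, 1), (2, 1)}           # toy 2D "mask" B
print(round(dice(a, b), 3))                            # → 0.571
print(hausdorff([(0, 0), (1, 0)], [(0, 0), (3, 0)]))   # → 2.0
```

DSC rewards overlap (1.0 is perfect agreement), while HD penalizes the single worst boundary disagreement, which is why the two metrics are reported together.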

  8. Automated Assessment of Reviews

    ERIC Educational Resources Information Center

    Ramachandran, Lakshmi

    2013-01-01

    Relevance helps identify to what extent a review's content pertains to that of the submission. Relevance metric helps distinguish generic or vague reviews from the useful ones. Relevance of a review to a submission can be determined by identifying semantic and syntactic similarities between them. Our work introduces the use of a word-order graph…

  9. 21 CFR 864.5200 - Automated cell counter.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

Section 864.5200 Automated cell counter. (a) Identification. An automated cell counter is a fully-automated or semi-automated device used to count red blood cells, white blood cells, or blood platelets using a sample of...

  10. 21 CFR 864.5200 - Automated cell counter.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

Section 864.5200 Automated cell counter. (a) Identification. An automated cell counter is a fully-automated or semi-automated device used to count red blood cells, white blood cells, or blood platelets using a sample of...

  11. 21 CFR 864.5200 - Automated cell counter.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

Section 864.5200 Automated cell counter. (a) Identification. An automated cell counter is a fully-automated or semi-automated device used to count red blood cells, white blood cells, or blood platelets using a sample of...

  12. 21 CFR 864.5200 - Automated cell counter.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

Section 864.5200 Automated cell counter. (a) Identification. An automated cell counter is a fully-automated or semi-automated device used to count red blood cells, white blood cells, or blood platelets using a sample of...

  13. 21 CFR 864.5200 - Automated cell counter.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

Section 864.5200 Automated cell counter. (a) Identification. An automated cell counter is a fully-automated or semi-automated device used to count red blood cells, white blood cells, or blood platelets using a sample of...

  14. Assessing the accuracy of Landsat Thematic Mapper classification using double sampling

    USGS Publications Warehouse

    Kalkhan, M.A.; Reich, R.M.; Stohlgren, T.J.

    1998-01-01

Double sampling was used to provide a cost-efficient estimate of the accuracy of a Landsat Thematic Mapper (TM) classification map of a scene located in Rocky Mountain National Park, Colorado. In the first phase, 200 sample points were randomly selected to assess the agreement between Landsat TM data and aerial photography. The overall accuracy and Kappa statistic were 49.5% and 32.5%, respectively. In the second phase, 25 sample points identified in the first phase were selected using stratified random sampling and located in the field. This information was used to correct for misclassification errors associated with the first-phase samples. The overall accuracy and Kappa statistic increased to 59.6% and 45.6%, respectively.
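The accuracy figures reported here come from a confusion matrix. The following is a generic sketch of overall accuracy and the Kappa statistic (Cohen's kappa), not the paper's double-sampling estimator; the toy matrix is invented.

```python
def accuracy_and_kappa(confusion):
    """Overall accuracy and Cohen's kappa from a square confusion matrix
    (rows = reference classes, columns = classified classes)."""
    k = len(confusion)
    n = sum(sum(row) for row in confusion)
    observed = sum(confusion[i][i] for i in range(k)) / n
    # Chance agreement expected from the row and column marginals
    expected = sum(
        sum(confusion[i]) * sum(row[i] for row in confusion) for i in range(k)
    ) / (n * n)
    return observed, (observed - expected) / (1 - expected)

# Toy two-class example (invented numbers, not the study's data)
acc, kappa = accuracy_and_kappa([[30, 10], [15, 45]])
print(round(acc, 2), round(kappa, 3))  # → 0.75 0.49
```

Kappa discounts the agreement expected by chance alone, which is why it is lower than overall accuracy and is reported alongside it in classification-accuracy assessments like this one.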

  15. Automated home cage assessment shows behavioral changes in a transgenic mouse model of spinocerebellar ataxia type 17.

    PubMed

    Portal, Esteban; Riess, Olaf; Nguyen, Huu Phuc

    2013-08-01

Spinocerebellar ataxia type 17 (SCA17) is an autosomal dominantly inherited neurodegenerative disease characterized by ataxia, involuntary movements, and dementia. A novel SCA17 mouse model carrying a 71-polyglutamine repeat expansion in the TATA-binding protein (TBP) has shown an age-related motor deficit in a classic motor test, although a concomitant weight increase might confound this measurement. In this study we used an automated home cage system to test several motor readouts for this same model, to confirm the pathological behavior results and to evaluate the benefits of the automated home cage in behavioral phenotyping. Our results confirm motor deficits in the Tbp/Q71 mice and reveal previously unrecognized behavioral characteristics obtained from the automated home cage, supporting its use for high-throughput screening and testing, e.g. of therapeutic compounds.

  16. An Assessment of the State of the Art of Curriculum Materials and a Status Assessment of Training Programs for Robotics/Automated Systems Technicians. Task Analysis and Descriptions of Required Job Competencies of Robotics/Automated Systems Technicians.

    ERIC Educational Resources Information Center

    Hull, Daniel M.; Lovett, James E.

    This report presents the results of research conducted to determine the current state of the art of robotics/automated systems technician (RAST) training offered in the United States. Section I discusses the RAST curriculum project, of which this state-of-the-art review is a part, and offers a RAST job description. Section II describes the…

  17. The effect of sampling strategies on assessment of water quality criteria attainment.

    PubMed

    Wang, Yuxin; Wilson, Jessica M; VanBriesen, Jeanne M

    2015-05-01

Sample locations for large river studies affect the representativeness of data, and thus can alter decisions made regarding river conditions and the need for interventions to improve water quality. The present study evaluated three water-quality sampling programs for Total Dissolved Solids (TDS) assessment in the Monongahela River from 2008 to 2012. The sampling plans cover the same 145 km of river but differ in frequency, sample location and type (e.g., river water sample vs. drinking water plant intake sample). Differences resulting from temporal and spatial variability in sampling lead to different conclusions regarding water quality in the river (including regulatory listing decisions), especially when low flow leads to concentrations at or near the water quality criterion (500 mg/L TDS). Drinking water samples exceeded the criterion in 82 of 650 samples (12.6%), while river water samples exceeded it in 47 of 464 samples (10.1%). Different water sample types could thus provide different pictures of water quality in the river and lead to different regulatory listing decisions.
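The listing-relevant exceedance rates in this abstract follow directly from the reported counts. A small sketch (the sample values in the last line are invented; only the 82/650 and 47/464 counts come from the abstract):

```python
def exceedance_rate(values, criterion=500.0):
    """Fraction of samples whose TDS concentration (mg/L) exceeds the criterion."""
    return sum(1 for v in values if v > criterion) / len(values)

# Reproducing the reported rates from the raw counts:
# 82 of 650 drinking-water samples and 47 of 464 river samples exceeded 500 mg/L
print(f"drinking water: {82 / 650:.1%}")  # → 12.6%
print(f"river water:    {47 / 464:.1%}")  # → 10.1%

# Usage on (invented) individual sample concentrations
print(exceedance_rate([420.0, 510.0, 505.0, 480.0]))  # → 0.5
```

Because listing decisions hinge on whether this fraction crosses a regulatory threshold, the two sampling programs' different denominators and locations can yield different regulatory outcomes from the same river.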

  18. Methods for sampling fish communities as part of the National Water-Quality Assessment Program

    USGS Publications Warehouse

    Meador, M.R.; Cuffney, T.F.; Gurtz, M.E.

    1993-01-01

Fish community structure is characterized in the U.S. Geological Survey's National Water-Quality Assessment Program as part of an integrated physical, chemical, and biological assessment of the Nation's water quality. The objective of the National Water-Quality Assessment characterization of fish community structure is to relate fish community characteristics to physical, chemical, and other biological factors to assess water-quality conditions. To accomplish this, fish community structure is described at sites representing selected environmental settings. In addition, spatial and temporal patterns in fish community structure are examined at local, regional, and national levels. A representative sample of the fish community is collected by sampling a stream reach using two complementary methods. The primary collection method is electrofishing using backpack, towed, or boat-operated electrofishing gear; seining is a secondary technique. Other secondary techniques may be substituted after careful consideration of sampling efficiency and consultation with local fish ecologists. Before fish sampling is conducted, careful consideration must be given to collecting permits; protecting endangered, threatened, and special-concern species; and coordinating sampling efforts with other fish ecologists. After the sample is collected, individual fish are identified to species by ichthyologists. Length and weight measurements are taken, and the presence of external anomalies is recorded.

  19. Evaluation of sampling sizes on the intertidal macroinfauna assessment in a subtropical mudflat of Hong Kong.

    PubMed

    Shen, Ping-Ping; Zhou, Hong; Zhao, Zhenye; Yu, Xiao-Zhang; Gu, Ji-Dong

    2012-08-01

In this study, two types of sediment cores with different diameters were used to collect sediment samples from an intertidal mudflat in Hong Kong to investigate the influence of sampling unit on the quantitative assessment of benthic macroinfaunal communities. Both univariate and multivariate analyses were employed to detect differences in sampling efficiency between the two samplers, through total abundance and biomass, species richness and diversity, community structure, and relative abundance of major taxa of the infaunal community. The species-area curves were further compared to determine the influence of the sampling units. Results showed that the two sampling devices provided similar information on the estimates of species diversity, density and species composition of the benthos in the main part of the mudflat, where the sediment was fine and homogeneous; but at the station that contained coarse sand and gravel, significant differences were detected between the quantitative assessments of macrobenthic infauna by the two samplers. Most importantly, the species-area curves indicated that more, smaller samples captured more species than fewer large ones for an equal total sampling area. Therefore, the efficiency of the sampler largely depended on the sediment properties, and sampling devices must be chosen based on the physical conditions and the desired level of precision of the sampling program. PMID:22766844
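The species-area comparison above rests on species accumulation curves: sampling units are pooled one at a time and the distinct species seen so far are counted. A minimal sketch (taxa and core contents invented):

```python
def species_accumulation(samples):
    """Cumulative distinct-species count as sampling units are pooled in order."""
    seen = set()
    curve = []
    for unit in samples:            # each unit is the set of species in one core
        seen.update(unit)
        curve.append(len(seen))
    return curve

# Three small cores vs. one large core covering the same total area (invented taxa)
small_cores = [{"a", "b"}, {"b", "c"}, {"c", "d"}]
large_core = [{"a", "b", "c"}]
print(species_accumulation(small_cores))  # → [2, 3, 4]
print(species_accumulation(large_core))   # → [3]
```

In this toy example the three small cores accumulate four species over the same area in which the single large core captures three, illustrating (not proving) the paper's finding that many small samples can recover more species than fewer large ones.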
