Science.gov

Sample records for automated sampling assessment

  1. Reference values for performance on the Automated Neuropsychological Assessment Metrics V3.0 in an active duty military sample.

    PubMed

    Reeves, Dennis L; Bleiberg, Joseph; Roebuck-Spencer, Tresa; Cernich, Alison N; Schwab, Karen; Ivins, Brian; Salazar, Andres M; Harvey, Sally C; Brown, Fred H; Warden, Deborah

    2006-10-01

The Automated Neuropsychological Assessment Metrics (ANAM) is a computerized measure of processing speed, cognitive efficiency, and memory. This study describes the performance and psychometric properties of ANAM in an active duty, healthy military sample (N = 2,371) composed primarily of young (18-46 years) adult males. Rarely have neuropsychological reference values for use with individuals in the military been derived from a large, active duty military population, and this is the first computerized neuropsychological test battery with military-specific reference values. Although these results do not provide demographically corrected, formal normative data, they provide reference points for neuropsychologists and other health care providers who are using ANAM data in research or clinical settings with patients whose demographics are comparable to those of the present sample.

  2. AUTOMATING GROUNDWATER SAMPLING AT HANFORD

    SciTech Connect

    CONNELL CW; HILDEBRAND RD; CONLEY SF; CUNNINGHAM DE

    2009-01-16

Until this past October, Fluor Hanford managed Hanford's integrated groundwater program for the U.S. Department of Energy (DOE). With the new contract awards at the Site, however, the CH2M HILL Plateau Remediation Company (CHPRC) has assumed responsibility for the groundwater-monitoring programs at the 586-square-mile reservation in southeastern Washington State. These programs are regulated by the Resource Conservation and Recovery Act (RCRA) and the Comprehensive Environmental Response Compensation and Liability Act (CERCLA). The purpose of monitoring is to track existing groundwater contamination from past practices, as well as other potential contamination that might originate from RCRA treatment, storage, and disposal (TSD) facilities. An integral part of the groundwater-monitoring program involves taking samples of the groundwater and measuring the water levels in wells scattered across the site. More than 1,200 wells are sampled each year. Historically, field personnel or 'samplers' have been issued pre-printed forms that have information about the well(s) for a particular sampling evolution. This information is taken from the Hanford Well Information System (HWIS) and the Hanford Environmental Information System (HEIS)--official electronic databases. The samplers used these hardcopy forms to document the groundwater samples and well water-levels. After recording the entries in the field, the samplers turned the forms in at the end of the day and the collected information was posted onto a spreadsheet that was then printed and included in a log book. The log book was then used to make manual entries of the new information into the software application(s) for the HEIS and HWIS databases. This is a pilot project for automating this tedious process by providing an electronic tool for automating water-level measurements and groundwater field-sampling activities. The automation will eliminate the manual forms and associated data entry, improve the accuracy of the

  3. Water-quality assessment of south-central Texas : comparison of water quality in surface-water samples collected manually and by automated samplers

    USGS Publications Warehouse

    Ging, Patricia B.

    1999-01-01

    Surface-water sampling protocols of the U.S. Geological Survey National Water-Quality Assessment (NAWQA) Program specify samples for most properties and constituents to be collected manually in equal-width increments across a stream channel and composited for analysis. Single-point sampling with an automated sampler (autosampler) during storms was proposed in the upper part of the South-Central Texas NAWQA study unit, raising the question of whether property and constituent concentrations from automatically collected samples differ significantly from those in samples collected manually. Statistical (Wilcoxon signed-rank test) analyses of 3 to 16 paired concentrations for each of 26 properties and constituents from water samples collected using both methods at eight sites in the upper part of the study unit indicated that there were no significant differences in concentrations for dissolved constituents, other than calcium and organic carbon.
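The comparison above rests on the Wilcoxon signed-rank test applied to paired concentrations. As a minimal sketch of that computation, the pure-Python function below implements the standard signed-rank statistic with a normal approximation for the p-value; the paired calcium concentrations are hypothetical illustration values, not data from the study.

```python
# Paired comparison of manual vs. autosampler concentrations using the
# Wilcoxon signed-rank test (normal approximation, two-sided p-value).
import math

def wilcoxon_signed_rank(x, y):
    """Return (W, p) for paired samples x, y; zero differences are dropped."""
    diffs = [round(a - b, 6) for a, b in zip(x, y) if round(a - b, 6) != 0]
    n = len(diffs)
    if n == 0:
        return 0.0, 1.0
    # Rank absolute differences, averaging ranks for ties.
    abs_sorted = sorted(abs(d) for d in diffs)
    def rank(v):
        lo = abs_sorted.index(v) + 1
        hi = lo + abs_sorted.count(v) - 1
        return (lo + hi) / 2
    w_plus = sum(rank(abs(d)) for d in diffs if d > 0)
    w_minus = sum(rank(abs(d)) for d in diffs if d < 0)
    w = min(w_plus, w_minus)
    # Normal approximation to the null distribution of W.
    mean = n * (n + 1) / 4
    sd = math.sqrt(n * (n + 1) * (2 * n + 1) / 24)
    z = (w - mean) / sd
    p = 1 + math.erf(-abs(z) / math.sqrt(2))  # 2 * Phi(-|z|)
    return w, p

# Hypothetical paired calcium concentrations (mg/L) at one site.
manual = [48.2, 51.0, 47.5, 50.3, 49.1, 52.4, 46.8, 50.9]
auto   = [47.9, 50.2, 47.8, 49.6, 48.5, 51.8, 47.1, 50.1]
w, p = wilcoxon_signed_rank(manual, auto)
```

A p-value above the chosen significance level would indicate no significant difference between collection methods, which is the conclusion the study reached for most dissolved constituents.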

  4. Automated Chromosome Breakage Assessment

    NASA Technical Reports Server (NTRS)

    Castleman, Kenneth

    1985-01-01

An automated karyotyping machine was built at JPL in 1972. It does computerized karyotyping, but it has some hardware limitations: the image processing hardware that was available at a reasonable price in 1972 was marginal, at best, for this job. In the meantime, NASA has developed an interest in longer-term spaceflights and in using chromosome breakage studies as a dosimeter for radiation or perhaps other damage that might occur to the tissues. This approach uses circulating lymphocytes as a physiological dosimeter, looking for chromosome breakage on long-term spaceflights. For that reason, we have reactivated the automated karyotyping work at JPL. An update on that work and a description of where it appears to be headed are presented.

  5. Technology modernization assessment flexible automation

    SciTech Connect

    Bennett, D.W.; Boyd, D.R.; Hansen, N.H.; Hansen, M.A.; Yount, J.A.

    1990-12-01

The objectives of this report are: to present technology assessment guidelines to be considered in conjunction with defense regulations before an automation project is developed; to give examples showing how assessment guidelines may be applied to a current project; and to present several potential areas where automation might be applied successfully in the depot system. Depots perform primarily repair and remanufacturing operations, with limited small-batch manufacturing runs. While certain activities (such as Management Information Systems and warehousing) are directly applicable to either environment, the majority of applications will require combining existing and emerging technologies in different ways to meet the special needs of the depot remanufacturing environment. Industry generally enjoys the ability to make revisions to its product lines seasonally, followed by batch runs of thousands or more. Depot batch runs are in the tens, at best the hundreds, of parts, with a potential for large variation in product mix; reconfiguration may be required on a week-to-week basis. This need for a higher degree of flexibility suggests a higher level of operator interaction and, in turn, control systems that go beyond the state of the art for less flexible automation and industry in general. This report investigates the benefits and barriers to automation and concludes that, while significant benefits do exist for automation, depots must be prepared to carefully investigate the technical feasibility of each opportunity and the life-cycle costs associated with implementation. Implementation is suggested in two ways: (1) develop an implementation plan for automation technologies based on results of small demonstration automation projects; (2) use phased implementation for both these and later-stage automation projects to allow major technical and administrative risk issues to be addressed. 10 refs., 2 figs., 2 tabs. (JF)

  6. Automated sample preparation for CE-SDS.

    PubMed

    Le, M Eleanor; Vizel, Alona; Hutterer, Katariina M

    2013-05-01

    Traditionally, CE with SDS (CE-SDS) places many restrictions on sample composition. Requirements include low salt content, known initial sample concentration, and a narrow window of final sample concentration. As these restrictions require buffer exchange for many sample types, sample preparation is often tedious and yields poor sample recoveries. To improve capacity and streamline sample preparation, an automated robotic platform was developed using the PhyNexus Micro-Extractor Automated Instrument (MEA) for both the reduced and nonreduced CE-SDS assays. This automated sample preparation normalizes sample concentration, removes salts and other contaminants, and adds the required CE-SDS reagents, essentially eliminating manual steps during sample preparation. Fc-fusion proteins and monoclonal antibodies were used in this work to demonstrate benefits of this approach when compared to the manual method. With optimized conditions, this application has demonstrated decreased analyst "hands on" time and reduced total assay time. Sample recovery greater than 90% can be achieved, regardless of initial composition and concentration of analyte.

  7. Black tea volatiles fingerprinting by comprehensive two-dimensional gas chromatography - Mass spectrometry combined with high concentration capacity sample preparation techniques: Toward a fully automated sensomic assessment.

    PubMed

    Magagna, Federico; Cordero, Chiara; Cagliero, Cecilia; Liberto, Erica; Rubiolo, Patrizia; Sgorbini, Barbara; Bicchi, Carlo

    2017-06-15

Tea, prepared by infusion of the dried leaves of Camellia sinensis (L.) Kuntze, is the world's second most popular beverage after water. Its consumption is associated with its chemical composition, which influences its sensory and nutritional quality, addresses consumer preferences, and offers potential health benefits. This study aims to obtain an informative chemical signature of the volatile fraction of black tea samples from Ceylon by applying the principles of sensomics. In particular, several high concentration capacity (HCC) sample preparation techniques were tested in combination with GC×GC-MS to investigate chemical signatures of black tea volatiles. This platform, using headspace solid phase microextraction (HS-SPME) with a multicomponent fiber as the sampling technique, recovers 95% of the key odorants in a fully automated work-flow. A group of 123 components, including key odorants and technological and botanical tracers, was mapped. The resulting 2D fingerprints were interpreted by pattern recognition tools (i.e., template matching fingerprinting and scripting), providing highly informative chemical signatures for quality assessment.

  8. Automated Sample Deoxygenation for Improved Luminescence Measurements.

    DTIC Science & Technology

    1986-11-25

[OCR of the report's documentation page; recoverable fields only.] Title: Automated Sample Deoxygenation for Improved Luminescence Measurements. Performing organization: Emory Univ., Atlanta, GA, Dept. of Chemistry. Personal authors: Rollie, M. E.; Patonay, Gabor; Warner, Isiah M. Report date: 25 Nov 1986. Subject terms: luminescence spectroscopy; fluorescence analysis; room-temperature phosphorescence; deoxygenation; quenching.

  9. Automated Assessment in Massive Open Online Courses

    ERIC Educational Resources Information Center

    Ivaniushin, Dmitrii A.; Shtennikov, Dmitrii G.; Efimchick, Eugene A.; Lyamin, Andrey V.

    2016-01-01

This paper describes an approach to using automated assessments in online courses. The Open edX platform is used as the course platform. The new assessment type uses Scilab as the learning and solution-validation tool. This approach allows automated individual variant generation and automated solution checks without involving the course…

  10. Precise and automated microfluidic sample preparation.

    SciTech Connect

    Crocker, Robert W.; Patel, Kamlesh D.; Mosier, Bruce P.; Harnett, Cindy K.

    2004-07-01

Autonomous bio-chemical agent detectors require sample preparation involving multiplex fluid control. We have developed a portable microfluidic pump array for metering sub-microliter volumes at flow rates of 1-100 µL/min. Each pump is composed of an electrokinetic (EK) pump and a high-voltage power supply with 15-Hz feedback from flow sensors. The combination of high pump fluid impedance and active control results in precise fluid metering with nanoliter accuracy. Automated sample preparation will be demonstrated by labeling proteins with fluorescamine and subsequent injection into a capillary gel electrophoresis (CGE) chip.
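The abstract describes closed-loop metering: flow-sensor feedback at 15 Hz adjusts the pump's high-voltage supply. The sketch below simulates one plausible form of that loop, a proportional-integral controller driving an idealized linear pump; the gains, pump constant, and instantaneous-response assumption are all illustrative and not taken from the paper.

```python
# Minimal sketch of closed-loop flow metering: a PI controller adjusts pump
# voltage from 15-Hz flow-sensor feedback. All numerical values (gains,
# pump gain, setpoint) are assumed for illustration.

def simulate_metering(setpoint_ul_min, steps=300, dt=1 / 15):
    kp, ki = 50.0, 120.0      # controller gains (assumed)
    pump_gain = 0.01          # µL/min per volt (assumed linear pump)
    integral = 0.0
    flow = 0.0
    for _ in range(steps):
        error = setpoint_ul_min - flow      # flow-sensor feedback
        integral += error * dt              # accumulate for the I term
        voltage = kp * error + ki * integral
        flow = pump_gain * voltage          # idealized, instantaneous response
    return flow

final = simulate_metering(10.0)  # converges toward the 10 µL/min setpoint
```

The integral term is what lets the loop hold the setpoint despite the pump needing a nonzero steady-state voltage, which is the essence of "active control" compensating high pump fluid impedance.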

  11. National Sample Assessment Protocols

    ERIC Educational Resources Information Center

    Ministerial Council on Education, Employment, Training and Youth Affairs (NJ1), 2012

    2012-01-01

    These protocols represent a working guide for planning and implementing national sample assessments in connection with the national Key Performance Measures (KPMs). The protocols are intended for agencies involved in planning or conducting national sample assessments and personnel responsible for administering associated tenders or contracts,…

  12. Automated Assessment in a Programming Tools Course

    ERIC Educational Resources Information Center

    Fernandez Aleman, J. L.

    2011-01-01

    Automated assessment systems can be useful for both students and instructors. Ranking and immediate feedback can have a strongly positive effect on student learning. This paper presents an experience using automatic assessment in a programming tools course. The proposal aims at extending the traditional use of an online judging system with a…

  13. Constructing Aligned Assessments Using Automated Test Construction

    ERIC Educational Resources Information Center

    Porter, Andrew; Polikoff, Morgan S.; Barghaus, Katherine M.; Yang, Rui

    2013-01-01

    We describe an innovative automated test construction algorithm for building aligned achievement tests. By incorporating the algorithm into the test construction process, along with other test construction procedures for building reliable and unbiased assessments, the result is much more valid tests than result from current test construction…

  14. Automated Geospatial Watershed Assessment Tool (AGWA)

    Technology Transfer Automated Retrieval System (TEKTRAN)

    The Automated Geospatial Watershed Assessment tool (AGWA, see: www.tucson.ars.ag.gov/agwa or http://www.epa.gov/esd/land-sci/agwa/) is a GIS interface jointly developed by the USDA Agricultural Research Service, the U.S. Environmental Protection Agency, the University of Arizona, and the University ...

  15. AGWA: The Automated Geospatial Watershed Assessment Tool

    EPA Science Inventory

    The Automated Geospatial Watershed Assessment Tool (AGWA, see: www.tucson.ars.ag.gov/agwa or http://www.epa.gov/esd/land-sci/agwa/) is a GIS interface jointly developed by the USDA-Agricultural Research Service, the U.S. Environmental Protection Agency, the University of Arizona...

  16. Automated bone age assessment: motivation, taxonomies, and challenges.

    PubMed

    Mansourvar, Marjan; Ismail, Maizatul Akmar; Herawan, Tutut; Raj, Ram Gopal; Kareem, Sameem Abdul; Nasaruddin, Fariza Hanum

    2013-01-01

Bone age assessment (BAA) of unknown people is one of the most important topics in clinical procedures for evaluating the biological maturity of children. BAA is usually performed by comparing an X-ray of the left hand and wrist with an atlas of known sample bones. Recently, BAA has gained remarkable ground in academia and medicine. Manual methods of BAA are time-consuming and prone to observer variability, which is a motivation for developing automated methods. However, although there is considerable research on automated assessment, much of it is still in the experimental stage. This survey provides a taxonomy of automated BAA approaches and discusses the challenges. Finally, we present suggestions for future research.

  17. Automated Bone Age Assessment: Motivation, Taxonomies, and Challenges

    PubMed Central

    Ismail, Maizatul Akmar; Herawan, Tutut; Gopal Raj, Ram; Abdul Kareem, Sameem; Nasaruddin, Fariza Hanum

    2013-01-01

Bone age assessment (BAA) of unknown people is one of the most important topics in clinical procedures for evaluating the biological maturity of children. BAA is usually performed by comparing an X-ray of the left hand and wrist with an atlas of known sample bones. Recently, BAA has gained remarkable ground in academia and medicine. Manual methods of BAA are time-consuming and prone to observer variability, which is a motivation for developing automated methods. However, although there is considerable research on automated assessment, much of it is still in the experimental stage. This survey provides a taxonomy of automated BAA approaches and discusses the challenges. Finally, we present suggestions for future research. PMID:24454534

  18. Automated collection and processing of environmental samples

    DOEpatents

    Troyer, Gary L.; McNeece, Susan G.; Brayton, Darryl D.; Panesar, Amardip K.

    1997-01-01

    For monitoring an environmental parameter such as the level of nuclear radiation, at distributed sites, bar coded sample collectors are deployed and their codes are read using a portable data entry unit that also records the time of deployment. The time and collector identity are cross referenced in memory in the portable unit. Similarly, when later recovering the collector for testing, the code is again read and the time of collection is stored as indexed to the sample collector, or to a further bar code, for example as provided on a container for the sample. The identity of the operator can also be encoded and stored. After deploying and/or recovering the sample collectors, the data is transmitted to a base processor. The samples are tested, preferably using a test unit coupled to the base processor, and again the time is recorded. The base processor computes the level of radiation at the site during exposure of the sample collector, using the detected radiation level of the sample, the delay between recovery and testing, the duration of exposure and the half life of the isotopes collected. In one embodiment, an identity code and a site code are optically read by an image grabber coupled to the portable data entry unit.
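The computation the patent describes combines the measured sample activity, the recovery-to-test delay, the exposure duration, and the isotope half-life. A simplified sketch of the two core steps follows; the numbers are illustrative, and the model ignores decay occurring during the exposure period itself, which the full method would account for.

```python
# Back-calculating site activity from a delayed measurement using
# half-life decay correction, then averaging over the exposure period.
# Simplified sketch: decay during exposure itself is neglected.
import math

def activity_at_collection(measured_bq, delay_h, half_life_h):
    """Correct a measured activity for decay during the recovery-to-test delay."""
    return measured_bq * math.exp(math.log(2) * delay_h / half_life_h)

def mean_rate(collected_bq, exposure_h):
    """Average collection rate over the deployment period (Bq per hour)."""
    return collected_bq / exposure_h

# Illustrative numbers: one half-life elapses between recovery and testing,
# so the corrected activity is exactly twice the measured value.
corrected = activity_at_collection(measured_bq=50.0, delay_h=8.0, half_life_h=8.0)
rate = mean_rate(corrected, exposure_h=24.0)
```

The bar-code time stamps recorded at deployment and recovery supply `delay_h` and `exposure_h`, which is why the system cross-references collector identity with time in the portable unit.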

  19. Rapid Automated Sample Preparation for Biological Assays

    SciTech Connect

    Shusteff, M

    2011-03-04

Our technology utilizes acoustic, thermal, and electric fields to separate out contaminants such as debris or pollen from environmental samples, lyse open cells, and extract the DNA from the lysate. The objective of the project is to optimize the system described for a forensic sample and demonstrate its performance for integration with downstream assay platforms (e.g., MIT-LL's ANDE). We intend to increase the quantity of DNA recovered from the sample beyond the approximately 80% currently achieved using solid-phase extraction methods. Task 1: Develop and test an acoustic filter for cell extraction. Task 2: Develop and test a lysis chip. Task 3: Develop and test a DNA extraction chip. All chips have been fabricated based on the designs laid out in last month's report.

  20. Automated Sample collection and Analysis unit

    SciTech Connect

    Latner, Norman; Sanderson, Colin G.; Negro, Vincent C.

    1999-03-31

Autoramp is an atmospheric radionuclide collection and analysis unit designed for unattended operation. A large volume of air passes through one of 31 filter cartridges, which is then moved from a sampling chamber, past a bar code reader, to a shielded enclosure. The collected dust-borne radionuclides are counted with a high-resolution germanium gamma-ray detector. An analysis is made and the results are transmitted to a central station that can also remotely control the unit.

  1. Automated biowaste sampling system feces monitoring system

    NASA Technical Reports Server (NTRS)

    Hunt, S. R.; Glanfield, E. J.

    1979-01-01

Under the Feces Monitoring System (FMS) Program, an engineering model waste collector system (WCS) was designed, fabricated, assembled, and tested for use in support of life science and medical experiments related to Shuttle missions. The FMS design was patterned closely after the Shuttle WCS, including interface provisions, mounting, configuration, and operating procedures. These similarities make it possible to eventually substitute an FMS for the Shuttle WCS on the Orbiter. In addition, several advanced waste collection features, including the capability of real-time inertial fecal separation and fecal mass measurement and sampling, were incorporated into the FMS design.

  2. Automated Data Quality Assessment of Marine Sensors

    PubMed Central

    Timms, Greg P.; de Souza, Paulo A.; Reznik, Leon; Smith, Daniel V.

    2011-01-01

The automated collection of data (e.g., through sensor networks) has led to a massive increase in the quantity of environmental and other data available. The sheer quantity of data and the growing need for real-time ingestion of sensor data (e.g., alerts and forecasts from physical models) mean that automated Quality Assurance/Quality Control (QA/QC) is necessary to ensure that the data collected are fit for purpose. Current automated QA/QC approaches provide assessments based upon hard classifications of the gathered data, often as a binary decision of good or bad data, that fail to quantify our confidence in the data for use in different applications. We propose a novel framework for automated data quality assessments that uses Fuzzy Logic to provide a continuous scale of data quality. This continuous quality scale is then used to compute error bars upon the data, which quantify the data uncertainty and provide a more meaningful measure of the data's fitness for purpose in a particular application compared with hard quality classifications. The design principles of the framework are presented and enable both data statistics and expert knowledge to be incorporated into the uncertainty assessment. We have implemented and tested the framework upon a real-time platform of temperature and conductivity sensors that have been deployed to monitor the Derwent Estuary in Hobart, Australia. Results indicate that the error bars generated from the Fuzzy QA/QC implementation are in good agreement with the error bars manually encoded by a domain expert. PMID:22163714
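The key idea above is replacing a good/bad flag with a continuous quality score that widens or narrows an error bar. A minimal sketch of that idea follows, using a single trapezoidal membership function over a sensor reading's expected range; the membership shape, thresholds, and error-bar scaling are assumptions for illustration, not the published framework's actual rules.

```python
# Illustrative fuzzy-style quality score: a trapezoidal membership function
# maps a reading's deviation from an expected range onto a continuous
# [0, 1] quality scale, which then scales an error bar.
# All thresholds and bounds below are assumed values.

def quality(reading, lo=10.0, hi=20.0, fuzz=5.0):
    """1.0 inside [lo, hi], falling linearly to 0.0 over a `fuzz`-wide band."""
    if lo <= reading <= hi:
        return 1.0
    dist = (lo - reading) if reading < lo else (reading - hi)
    return max(0.0, 1.0 - dist / fuzz)

def error_bar(reading, base_uncertainty=0.1, max_uncertainty=2.0):
    """Lower quality widens the error bar between the two uncertainty bounds."""
    q = quality(reading)
    return base_uncertainty + (1.0 - q) * (max_uncertainty - base_uncertainty)

q = quality(22.5)  # 2.5 units outside the range -> partial quality, not "bad"
```

A hard classifier would call 22.5 simply "bad"; the continuous score instead yields an intermediate quality and a proportionally wider error bar, which is the paper's "fitness for purpose" argument in miniature.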

  3. Non-Contact Conductivity Measurement for Automated Sample Processing Systems

    NASA Technical Reports Server (NTRS)

    Beegle, Luther W.; Kirby, James P.

    2012-01-01

A new method has been developed for monitoring and control of automated sample processing and preparation, focusing especially on desalting of samples before analysis (described in more detail in Automated Desalting Apparatus (NPO-45428), NASA Tech Briefs, Vol. 34, No. 8 (August 2010), page 44). The use of non-contact conductivity probes, one at the inlet and one at the outlet of the solid-phase sample preparation media, allows monitoring of the process and acts as a trigger for the start of the next step in the sequence (see figure). At each step of the multi-step process, the system is flushed with low-conductivity water, which sets the system back to an overall low-conductivity state. This measurement then triggers the next stage of the sample processing protocol and greatly minimizes the use of consumables. In the case of amino acid sample preparation for desalting, the conductivity measurement defines three key conditions for the sample preparation process: first, when the system is neutralized (low conductivity, by washing with excess de-ionized water); second, when the system is acidified by washing with a strong acid (high conductivity); and third, when the system is at a basic condition of high pH (high conductivity). Taken together, this non-contact conductivity measurement will not only facilitate automation of sample preparation and processing, but will also optimize operational time and the use of consumables.
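The triggering logic described above is essentially a small state machine: the outlet conductivity crossing a threshold confirms the expected condition and advances the sequence. The sketch below illustrates that pattern; the step names and conductivity thresholds are hypothetical, chosen only to mirror the three conditions (neutral, acidified, basic) named in the abstract.

```python
# Sketch of conductivity-triggered sequencing: a threshold crossing at the
# outlet probe advances a desalting sequence to its next step.
# Step names and thresholds (µS/cm) are assumed for illustration.

LOW, HIGH = 50.0, 500.0  # assumed low/high conductivity thresholds

def next_step(current, conductivity):
    """Advance only when the outlet conductivity confirms the expected state."""
    if current == "wash_neutral" and conductivity < LOW:   # system neutralized
        return "acidify"
    if current == "acidify" and conductivity > HIGH:       # strong acid present
        return "elute_basic"
    if current == "elute_basic" and conductivity > HIGH:   # basic, high pH
        return "done"
    return current  # condition not met yet; keep flushing

state = "wash_neutral"
state = next_step(state, 420.0)  # still salty: remain in the wash step
state = next_step(state, 12.0)   # conductivity low: advance to acidify
```

Because each advance waits for the measured condition rather than a fixed flush time, the sequence uses only as much de-ionized water and reagent as the media actually require, which is how the monitoring "greatly minimizes use of consumables."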

  4. Investigating Factors Affecting the Uptake of Automated Assessment Technology

    ERIC Educational Resources Information Center

    Dreher, Carl; Reiners, Torsten; Dreher, Heinz

    2011-01-01

    Automated assessment is an emerging innovation in educational praxis, however its pedagogical potential is not fully utilised in Australia, particularly regarding automated essay grading. The rationale for this research is that the usage of automated assessment currently lags behind the capacity that the technology provides, thus restricting the…

  5. Automated Power Assessment for Helicopter Turboshaft Engines

    NASA Technical Reports Server (NTRS)

    Simon, Donald L.; Litt, Jonathan S.

    2008-01-01

    An accurate indication of available power is required for helicopter mission planning purposes. Available power is currently estimated on U.S. Army Blackhawk helicopters by performing a Maximum Power Check (MPC), a manual procedure performed by maintenance pilots on a periodic basis. The MPC establishes Engine Torque Factor (ETF), an indication of available power. It is desirable to replace the current manual MPC procedure with an automated approach that will enable continuous real-time assessment of available power utilizing normal mission data. This report presents an automated power assessment approach which processes data currently collected within helicopter Health and Usage Monitoring System (HUMS) units. The overall approach consists of: 1) a steady-state data filter which identifies and extracts steady-state operating points within HUMS data sets; 2) engine performance curve trend monitoring and updating; and 3) automated ETF calculation. The algorithm is coded in MATLAB (The MathWorks, Inc.) and currently runs on a PC. Results from the application of this technique to HUMS mission data collected from UH-60L aircraft equipped with T700-GE-701C engines are presented and compared to manually calculated ETF values. Potential future enhancements are discussed.
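Step 1 of the approach, the steady-state data filter, can be sketched as a sliding-window test: an operating point counts as steady when recent samples show little variation. The window length, threshold, and torque values below are illustrative assumptions, not parameters from the report.

```python
# Sketch of a steady-state filter: flag points in a torque time series whose
# trailing window has standard deviation below a threshold.
# Window length and threshold are assumed, illustrative values.
import statistics

def steady_state_points(series, window=5, max_std=0.5):
    """Return indices whose trailing `window` samples are steady (std <= max_std)."""
    flags = []
    for i in range(window - 1, len(series)):
        chunk = series[i - window + 1 : i + 1]
        if statistics.pstdev(chunk) <= max_std:
            flags.append(i)
    return flags

# Hypothetical torque trace: climb, a steady plateau, then descent.
torque = [10, 30, 55, 70, 71, 70, 70, 71, 40, 20]
steady = steady_state_points(torque)  # only the plateau qualifies
```

Only the flagged points would feed the downstream performance-curve trending and ETF calculation, so transient maneuvers in the HUMS data never contaminate the power estimate.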

  6. Modular Automated Processing System (MAPS) for analysis of biological samples.

    SciTech Connect

    Gil, Geun-Cheol; Chirica, Gabriela S.; Fruetel, Julia A.; VanderNoot, Victoria A.; Branda, Steven S.; Schoeniger, Joseph S.; Throckmorton, Daniel J.; Brennan, James S.; Renzi, Ronald F.

    2010-10-01

We have developed a novel modular automated processing system (MAPS) that enables reliable, high-throughput analysis as well as sample-customized processing. This system comprises a set of independent modules that carry out individual sample processing functions: cell lysis, protein concentration (based on hydrophobic, ion-exchange, and affinity interactions), interferent depletion, buffer exchange, and enzymatic digestion of proteins of interest. Taking advantage of its unique capacity for enclosed processing of intact bioparticulates (viruses, spores) and complex serum samples, we have used MAPS for analysis of BSL1 and BSL2 samples to identify specific protein markers through integration with the portable microChemLab™ and MALDI.

  7. Automated assessment of postural stability system.

    PubMed

    Napoli, Alessandro; Ward, Christian R; Glass, Stephen M; Tucker, Carole; Obeid, Iyad

    2016-08-01

The Balance Error Scoring System (BESS) is one of the most commonly used clinical tests to evaluate static postural stability deficits resulting from traumatic brain events and musculoskeletal injury. This test requires a trained operator to visually assess balance and give the subject a performance score based on the number of balance "errors" they committed. Despite being regularly used in several real-world situations, the BESS test is scored by clinician observation and therefore (a) is potentially susceptible to biased and inaccurate test scores and (b) cannot be administered in the absence of a trained provider. The purpose of this research is to develop, calibrate, and field test a computerized version of the BESS test using low-cost commodity motion tracking technology. This 'Automated Assessment of Postural Stability' (AAPS) system will quantify balance control in field conditions. The goal of this research is to overcome the main limitations of both the commercially available motion capture systems and the standard BESS test. The AAPS system has been designed to be operated by a minimally trained user, and it requires little set-up time, with no sensor calibration necessary. These features make the proposed automated system a valuable balance assessment tool for use in the field.

  8. Automated DNA extraction for large numbers of plant samples.

    PubMed

    Mehle, Nataša; Nikolić, Petra; Rupar, Matevž; Boben, Jana; Ravnikar, Maja; Dermastia, Marina

    2013-01-01

    The method described here is a rapid, total DNA extraction procedure applicable to a large number of plant samples requiring pathogen detection. The procedure combines a simple and quick homogenization step of crude extracts with DNA extraction based upon the binding of DNA to magnetic beads. DNA is purified in an automated process in which the magnetic beads are transferred through a series of washing buffers. The eluted DNA is suitable for efficient amplification in PCR reactions.

  9. An Automated Home Made Low Cost Vibrating Sample Magnetometer

    NASA Astrophysics Data System (ADS)

    Kundu, S.; Nath, T. K.

    2011-07-01

The design and operation of a homemade low-cost vibrating sample magnetometer is described here. The sensitivity of this instrument is better than 10^-2 emu, and it is found to be very efficient for measuring the magnetization of most ferromagnetic and other magnetic materials as a function of temperature down to 77 K and magnetic field up to 800 Oe. Both M(H) and M(T) data acquisition are fully automated, employing a computer and LabVIEW software.

  10. Experiences of Using Automated Assessment in Computer Science Courses

    ERIC Educational Resources Information Center

    English, John; English, Tammy

    2015-01-01

    In this paper we discuss the use of automated assessment in a variety of computer science courses that have been taught at Israel Academic College by the authors. The course assignments were assessed entirely automatically using Checkpoint, a web-based automated assessment framework. The assignments all used free-text questions (where the students…

  11. Statistical and Economical Efficiency in Assessment of Liver Regeneration Using Defined Sample Size and Selection in Combination With a Fully Automated Image Analysis System

    PubMed Central

    Deng, Meihong; Kleinert, Robert; Huang, Hai; He, Qing; Madrahimova, Fotima; Dirsch, Olaf; Dahmen, Uta

    2009-01-01

    Quantification of liver regeneration is frequently based on determining the 5-bromo-2-deoxyuridine labeling index (BrdU-LI). The quantitative result is influenced by preanalytical, analytical, and postanalytical variables such as the region of interest (ROI). We aimed to present our newly developed and validated automatic computer-based image analysis system (AnalySIS-Macro), and to standardize the selection and sample size of ROIs. Images from BrdU-labeled and immunohistochemically stained liver sections were analyzed conventionally and with the newly developed AnalySIS-Macro and used for validation of the system. Automatic quantification correlated well with the manual counting result (r=0.9976). Validation of our AnalySIS-Macro revealed its high sensitivity (>90%) and specificity. The BrdU-LI ranged from 11% to 57% within the same liver (32.96 ± 11.94%), reflecting the highly variable spatial distribution of hepatocyte proliferation. At least 2000 hepatocytes (10 images at 200× magnification) per lobe were required as sample size for achieving a representative BrdU-LI. Furthermore, the number of pericentral areas should be equal to that of periportal areas. The combination of our AnalySIS-Macro with rules for the selection and size of ROIs represents an accurate, sensitive, specific, and efficient diagnostic tool for the determination of the BrdU-LI and the spatial distribution of proliferating hepatocytes. (J Histochem Cytochem 57:1075–1085, 2009) PMID:19620322
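The quantity being automated here is simple arithmetic: the BrdU labeling index is the percentage of BrdU-positive hepatocyte nuclei among all hepatocytes counted, and the paper's sampling rule requires at least 2000 hepatocytes (10 images at 200× magnification) per lobe. A minimal sketch, with hypothetical per-image counts:

```python
# BrdU labeling index (BrdU-LI) as a percentage of counted hepatocytes,
# plus a check of the >= 2000-hepatocytes-per-lobe sample-size rule.
# The per-image counts below are hypothetical illustration values.

def brdu_li(labeled, total):
    """BrdU-LI: labeled hepatocytes as a percentage of all counted."""
    if total == 0:
        raise ValueError("no hepatocytes counted")
    return 100.0 * labeled / total

def enough_cells(roi_counts, minimum=2000):
    """True if the ROIs together meet the minimum hepatocyte count."""
    return sum(roi_counts) >= minimum

# Hypothetical counts from 10 images at 200x magnification.
rois = [210, 195, 220, 205, 198, 202, 215, 190, 208, 200]
li = brdu_li(labeled=680, total=sum(rois))
```

The sample-size rule matters precisely because of the spatial variability the authors report (BrdU-LI ranging from 11% to 57% within one liver): too few or unbalanced ROIs would make the index unrepresentative.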

  12. High-throughput sample processing and sample management; the functional evolution of classical cytogenetic assay towards automation.

    PubMed

    Ramakumar, Adarsh; Subramanian, Uma; Prasanna, Pataje G S

    2015-11-01

High-throughput individual diagnostic dose assessment is essential for medical management of radiation-exposed subjects after a mass casualty. Cytogenetic assays such as the Dicentric Chromosome Assay (DCA) are recognized as the gold standard by international regulatory authorities. DCA is a multi-step and multi-day bioassay. DCA, as described in the IAEA manual, can assess dose quite accurately up to 4-6 weeks post-exposure, but throughput remains a major limitation, making automation essential. Throughput is limited both in sample preparation and in the analysis of chromosome aberrations. Thus, there is a need to design and develop novel solutions that utilize extensive laboratory automation for sample preparation, and bioinformatics approaches for chromosome-aberration analysis, to overcome throughput issues. We have transitioned the bench-based cytogenetic DCA to a coherent process performing high-throughput automated biodosimetry for individual dose assessment, ensuring quality control (QC) and quality assurance (QA) in accordance with internationally harmonized protocols. A Laboratory Information Management System (LIMS) was designed, implemented and adapted to manage increased sample processing capacity, develop and maintain standard operating procedures (SOPs) for robotic instruments, avoid data transcription errors during processing, and automate analysis of chromosome aberrations using an image analysis platform. Our efforts described in this paper intend to bridge the current technological gaps and enhance the potential application of DCA for a dose-based stratification of subjects following a mass casualty. This paper describes one such potential integrated automated laboratory system and the functional evolution of the classical DCA toward the critically needed increase in throughput.
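The dose estimation underlying the dicentric assay inverts a linear-quadratic calibration curve, Y = c + αD + βD², where Y is the dicentric yield per cell and D the absorbed dose in Gy. A minimal sketch of that inversion (the coefficient values in the test are illustrative placeholders, not the IAEA reference calibration):

```python
import math

def dose_from_dicentrics(dics, cells, c=0.001, alpha=0.02, beta=0.06):
    """Estimate absorbed dose (Gy) from an observed dicentric count by
    inverting the linear-quadratic calibration Y = c + alpha*D + beta*D^2.
    Coefficients here are illustrative, not laboratory-specific values."""
    y = dics / cells
    if y <= c:
        return 0.0
    # positive root of beta*D^2 + alpha*D + (c - y) = 0
    return (-alpha + math.sqrt(alpha**2 + 4 * beta * (y - c))) / (2 * beta)
```

With these example coefficients, 281 dicentrics in 1000 metaphases corresponds to a dose of 2.0 Gy.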

  13. CRITICAL ASSESSMENT OF AUTOMATED FLOW CYTOMETRY DATA ANALYSIS TECHNIQUES

    PubMed Central

    Aghaeepour, Nima; Finak, Greg; Hoos, Holger; Mosmann, Tim R.; Gottardo, Raphael; Brinkman, Ryan; Scheuermann, Richard H.

    2013-01-01

    Traditional methods for flow cytometry (FCM) data processing rely on subjective manual gating. Recently, several groups have developed computational methods for identifying cell populations in multidimensional FCM data. The Flow Cytometry: Critical Assessment of Population Identification Methods (FlowCAP) challenges were established to compare the performance of these methods on two tasks – mammalian cell population identification to determine if automated algorithms can reproduce expert manual gating, and sample classification to determine if analysis pipelines can identify characteristics that correlate with external variables (e.g., clinical outcome). This analysis presents the results of the first of these challenges. Several methods performed well compared to manual gating or external variables using statistical performance measures, suggesting that automated methods have reached a sufficient level of maturity and accuracy for reliable use in FCM data analysis. PMID:23396282

  14. Critical assessment of automated flow cytometry data analysis techniques.

    PubMed

    Aghaeepour, Nima; Finak, Greg; Hoos, Holger; Mosmann, Tim R; Brinkman, Ryan; Gottardo, Raphael; Scheuermann, Richard H

    2013-03-01

    Traditional methods for flow cytometry (FCM) data processing rely on subjective manual gating. Recently, several groups have developed computational methods for identifying cell populations in multidimensional FCM data. The Flow Cytometry: Critical Assessment of Population Identification Methods (FlowCAP) challenges were established to compare the performance of these methods on two tasks: (i) mammalian cell population identification, to determine whether automated algorithms can reproduce expert manual gating and (ii) sample classification, to determine whether analysis pipelines can identify characteristics that correlate with external variables (such as clinical outcome). This analysis presents the results of the first FlowCAP challenges. Several methods performed well as compared to manual gating or external variables using statistical performance measures, which suggests that automated methods have reached a sufficient level of maturity and accuracy for reliable use in FCM data analysis.
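Agreement between automated population identification and expert manual gating is typically scored with cell-level overlap statistics. A minimal sketch of one such measure, the F-measure over matched cell-index sets (a generic formulation, not FlowCAP's exact scoring code):

```python
def f_measure(manual, automated):
    """Cell-level F-measure between a manually gated population (set of
    cell indices) and an automatically identified cluster: the harmonic
    mean of precision and recall on cell membership."""
    manual, automated = set(manual), set(automated)
    tp = len(manual & automated)
    if tp == 0:
        return 0.0
    precision = tp / len(automated)
    recall = tp / len(manual)
    return 2 * precision * recall / (precision + recall)
```

A cluster that captures half of a manual gate while being half contaminated scores 0.5; perfect reproduction of the gate scores 1.0.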

  15. Drug discovery from Nature: automated high-quality sample preparation

    PubMed Central

    Thiericke, Ralf

    2000-01-01

Secondary metabolites from plants, animals and microorganisms have been proven to be an outstanding source for new and innovative drugs and show a striking structural diversity that supplements chemically synthesized compounds or libraries in drug discovery programs. Unfortunately, extracts from natural sources are usually complex mixtures of compounds, often generated in time-consuming and largely manual processes. As the quality and quantity of the provided samples play a pivotal role in the success of high-throughput screening programs, this poses serious problems. In order to make samples of natural origin competitive with synthetic compound libraries, we devised a novel, automated sample preparation procedure based on solid-phase extraction (SPE). By making use of a modified Zymark RapidTrace® SPE workstation, an easy-to-handle and effective fractionation method has been developed which allows the generation of high-quality samples of natural origin, fulfilling the requirements for integration into high-throughput screening programs. PMID:18924703

  16. Automated Formative Feedback and Summative Assessment Using Individualised Spreadsheet Assignments

    ERIC Educational Resources Information Center

    Blayney, Paul; Freeman, Mark

    2004-01-01

    This paper reports on the effects of automating formative feedback at the student's discretion and automating summative assessment with individualised spreadsheet assignments. Quality learning outcomes are achieved when students adopt deep approaches to learning (Ramsden, 2003). Learning environments designed to align assessment to learning…

  17. Components for automated microfluidics sample preparation and analysis

    NASA Astrophysics Data System (ADS)

    Archer, M.; Erickson, J. S.; Hilliard, L. R.; Howell, P. B., Jr.; Stenger, D. A.; Ligler, F. S.; Lin, B.

    2008-02-01

    The increasing demand for portable devices to detect and identify pathogens represents an interdisciplinary effort between engineering, materials science, and molecular biology. Automation of both sample preparation and analysis is critical for performing multiplexed analyses on real world samples. This paper selects two possible components for such automated portable analyzers: modified silicon structures for use in the isolation of nucleic acids and a sheath flow system suitable for automated microflow cytometry. Any detection platform that relies on the genetic content (RNA and DNA) present in complex matrices requires careful extraction and isolation of the nucleic acids in order to ensure their integrity throughout the process. This sample pre-treatment step is commonly performed using commercially available solid phases along with various molecular biology techniques that require multiple manual steps and dedicated laboratory space. Regardless of the detection scheme, a major challenge in the integration of total analysis systems is the development of platforms compatible with current isolation techniques that will ensure the same quality of nucleic acids. Silicon is an ideal candidate for solid phase separations since it can be tailored structurally and chemically to mimic the conditions used in the laboratory. For analytical purposes, we have developed passive structures that can be used to fully ensheath one flow stream with another. As opposed to traditional flow focusing methods, our sheath flow profile is truly two dimensional, making it an ideal candidate for integration into a microfluidic flow cytometer. Such a microflow cytometer could be used to measure targets captured on either antibody- or DNA-coated beads.

  18. Sample Tracking in an Automated Cytogenetic Biodosimetry Laboratory for Radiation Mass Casualties.

    PubMed

    Martin, P R; Berdychevski, R E; Subramanian, U; Blakely, W F; Prasanna, P G S

    2007-07-01

Chromosome aberration-based dicentric assay is expected to be used after mass casualty life-threatening radiation exposures to assess radiation dose to individuals. This will require processing of a large number of samples for individual dose assessment and clinical triage to aid treatment decisions. We have established an automated, high-throughput, cytogenetic biodosimetry laboratory to process a large number of samples for conducting the dicentric assay using peripheral blood from exposed individuals according to internationally accepted laboratory protocols (i.e., within days following radiation exposures). The components of an automated cytogenetic biodosimetry laboratory include blood collection kits for sample shipment, a cell viability analyzer, a robotic liquid handler, an automated metaphase harvester, a metaphase spreader, high-throughput slide stainer and coverslipper, a high-throughput metaphase finder, multiple satellite chromosome-aberration analysis systems, and a computerized sample tracking system. Laboratory automation using commercially available, off-the-shelf technologies, customized technology integration, and implementation of a laboratory information management system (LIMS) for cytogenetic analysis will significantly increase throughput. This paper focuses on our efforts to eliminate data transcription errors, increase efficiency, and maintain samples' positive chain-of-custody by sample tracking during sample processing and data analysis. This sample tracking system represents a "beta" version, which can be modeled elsewhere in a cytogenetic biodosimetry laboratory, and includes a customized LIMS with a central server, personal computer workstations, barcode printers, fixed station and wireless hand-held devices to scan barcodes at various critical steps, and data transmission over a private intra-laboratory computer network. Our studies will improve diagnostic biodosimetry response, aid confirmation of clinical triage, and medical

  19. Digital microfluidic hub for automated nucleic acid sample preparation.

    SciTech Connect

    He, Jim; Bartsch, Michael S.; Patel, Kamlesh D.; Kittlaus, Eric A.; Remillared, Erin M.; Pezzola, Genevieve L.; Renzi, Ronald F.; Kim, Hanyoup

    2010-07-01

    We have designed, fabricated, and characterized a digital microfluidic (DMF) platform to function as a central hub for interfacing multiple lab-on-a-chip sample processing modules towards automating the preparation of clinically-derived DNA samples for ultrahigh throughput sequencing (UHTS). The platform enables plug-and-play installation of a two-plate DMF device with consistent spacing, offers flexible connectivity for transferring samples between modules, and uses an intuitive programmable interface to control droplet/electrode actuations. Additionally, the hub platform uses transparent indium-tin oxide (ITO) electrodes to allow complete top and bottom optical access to the droplets on the DMF array, providing additional flexibility for various detection schemes.

  20. Automated acoustic matrix deposition for MALDI sample preparation.

    PubMed

    Aerni, Hans-Rudolf; Cornett, Dale S; Caprioli, Richard M

    2006-02-01

    Novel high-throughput sample preparation strategies for MALDI imaging mass spectrometry (IMS) and profiling are presented. An acoustic reagent multispotter was developed to provide improved reproducibility for depositing matrix onto a sample surface, for example, such as a tissue section. The unique design of the acoustic droplet ejector and its optimization for depositing matrix solution are discussed. Since it does not contain a capillary or nozzle for fluid ejection, issues with clogging of these orifices are avoided. Automated matrix deposition provides better control of conditions affecting protein extraction and matrix crystallization with the ability to deposit matrix accurately onto small surface features. For tissue sections, matrix spots of 180-200 microm in diameter were obtained and a procedure is described for generating coordinate files readable by a mass spectrometer to permit automated profile acquisition. Mass spectral quality and reproducibility was found to be better than that obtained with manual pipet spotting. The instrument can also deposit matrix spots in a dense array pattern so that, after analysis in a mass spectrometer, two-dimensional ion images may be constructed. Example ion images from a mouse brain are presented.
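Generating a coordinate file for automated profile acquisition amounts to laying out matrix-spot positions on a regular grid. A minimal sketch (row-major layout; the actual file format consumed by the mass spectrometer is instrument-specific and not reproduced here):

```python
def spot_grid(x0, y0, n_cols, n_rows, pitch_um):
    """Generate (x, y) spot-center coordinates in micrometres for a dense
    matrix-spot array, starting at (x0, y0) with a fixed center-to-center
    pitch. Rows are emitted in row-major order."""
    return [(x0 + c * pitch_um, y0 + r * pitch_um)
            for r in range(n_rows) for c in range(n_cols)]
```

For 180-200 µm spots, a pitch of roughly 200 µm or more keeps neighboring deposits from merging.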

  1. Validation of Automated Scoring of Science Assessments

    ERIC Educational Resources Information Center

    Liu, Ou Lydia; Rios, Joseph A.; Heilman, Michael; Gerard, Libby; Linn, Marcia C.

    2016-01-01

    Constructed response items can both measure the coherence of student ideas and serve as reflective experiences to strengthen instruction. We report on new automated scoring technologies that can reduce the cost and complexity of scoring constructed-response items. This study explored the accuracy of c-rater-ML, an automated scoring engine…

  2. The ECLSS Advanced Automation Project Evolution and Technology Assessment

    NASA Technical Reports Server (NTRS)

    Dewberry, Brandon S.; Carnes, James R.; Lukefahr, Brenda D.; Rogers, John S.; Rochowiak, Daniel M.; Mckee, James W.; Benson, Brian L.

    1990-01-01

    Viewgraphs on Environmental Control and Life Support System (ECLSS) advanced automation project evolution and technology assessment are presented. Topics covered include: the ECLSS advanced automation project; automatic fault diagnosis of ECLSS subsystems descriptions; in-line, real-time chemical and microbial fluid analysis; and object-oriented, distributed chemical and microbial modeling of regenerative environmental control systems description.

  3. The Development of the Missouri Automated Reinforcer Assessment (MARA).

    ERIC Educational Resources Information Center

    Vatterott, Madeleine

    A knowledge of an individual's preferences is essential to create an effective reward or reinforcer program for individuals who have either a need to reduce maladaptive behaviors or to increase adaptive behaviors. The goal of the Missouri Automated Reinforcer Assessment (MARA) project is to develop an efficient yet thorough automated reinforcer…

  4. Comparison of manual and automated nucleic acid extraction from whole-blood samples.

    PubMed

    Riemann, Kathrin; Adamzik, Michael; Frauenrath, Stefan; Egensperger, Rupert; Schmid, Kurt W; Brockmeyer, Norbert H; Siffert, Winfried

    2007-01-01

    Nucleic acid extraction and purification from whole blood is a routine application in many laboratories. Automation of this procedure promises standardized sample treatment, a low error rate, and avoidance of contamination. The performance of the BioRobot M48 (Qiagen) and the manual QIAmp DNA Blood Mini Kit (Qiagen) was compared for the extraction of DNA from whole blood. The concentration and purity of the extracted DNAs were determined by spectrophotometry. Analytical sensitivity was assessed by common PCR and genotyping techniques. The quantity and quality of the generated DNAs were slightly higher using the manual extraction method. The results of downstream applications were comparable to each other. Amplification of high-molecular-weight PCR fragments, genotyping by restriction digest, and pyrosequencing were successful for all samples. No cross-contamination could be detected. While automated DNA extraction requires significantly less hands-on time, it is slightly more expensive than the manual extraction method.
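The spectrophotometric checks mentioned here reduce to two standard calculations: dsDNA concentration from absorbance at 260 nm (1 A260 unit ≈ 50 ng/µL for double-stranded DNA) and purity from the A260/A280 ratio (~1.8 for clean DNA). A minimal sketch (function name is ours):

```python
def dsdna_quant(a260, a280, dilution_factor=1.0):
    """Spectrophotometric dsDNA assessment: concentration in ng/uL from
    the standard conversion 1 A260 unit ~= 50 ng/uL dsDNA, and purity
    as the A260/A280 ratio."""
    conc_ng_per_ul = a260 * 50.0 * dilution_factor
    purity = a260 / a280 if a280 else float("inf")
    return conc_ng_per_ul, purity
```

An undiluted reading of A260 = 0.5 thus corresponds to 25 ng/µL; an A260/A280 well below 1.8 suggests protein carryover.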

  5. The Automation of Nowcast Model Assessment Processes

    DTIC Science & Technology

    2016-09-01

developed at NCAR through a grant from the United States Air Force 557th Weather Wing (formerly the Air Force Weather Agency), where NCAR is sponsored...that will automate real-time WRE-N model simulations, collect and quality-control check weather observations for assimilation and verification, and...observations and performing quality-control checks for the pre-forecast data assimilation period. 2. Run the WRE-N model to generate model forecast data

  6. Evaluating the Validity of the Automated Working Memory Assessment

    ERIC Educational Resources Information Center

    Alloway, Tracy; Gathercole, Susan E.; Kirkwood, Hannah; Elliott, Julian

    2008-01-01

    The aim of the present study was to investigate the construct stability and diagnostic validity of a standardised computerised tool for assessing working memory: the Automated Working Memory Assessment (AWMA). The purpose of the AWMA is to provide educators with a quick and effective tool to screen for and support those with memory impairments.…

  7. Ability-Training-Oriented Automated Assessment in Introductory Programming Course

    ERIC Educational Resources Information Center

    Wang, Tiantian; Su, Xiaohong; Ma, Peijun; Wang, Yuying; Wang, Kuanquan

    2011-01-01

Learning to program is a difficult process for novice programmers. We developed AutoLEP, an automated learning and assessment system, to help novice programmers acquire programming skills. AutoLEP is ability-training-oriented. It adopts a novel assessment mechanism, which combines static analysis with dynamic testing to analyze student…

  8. Manual versus automated blood sampling: impact of repeated blood sampling on stress parameters and behavior in male NMRI mice

    PubMed Central

    Kalliokoski, Otto; Sørensen, Dorte B; Hau, Jann; Abelson, Klas S P

    2014-01-01

    Facial vein (cheek blood) and caudal vein (tail blood) phlebotomy are two commonly used techniques for obtaining blood samples from laboratory mice, while automated blood sampling through a permanent catheter is a relatively new technique in mice. The present study compared physiological parameters, glucocorticoid dynamics as well as the behavior of mice sampled repeatedly for 24 h by cheek blood, tail blood or automated blood sampling from the carotid artery. Mice subjected to cheek blood sampling lost significantly more body weight, had elevated levels of plasma corticosterone, excreted more fecal corticosterone metabolites, and expressed more anxious behavior than did the mice of the other groups. Plasma corticosterone levels of mice subjected to tail blood sampling were also elevated, although less significantly. Mice subjected to automated blood sampling were less affected with regard to the parameters measured, and expressed less anxious behavior. We conclude that repeated blood sampling by automated blood sampling and from the tail vein is less stressful than cheek blood sampling. The choice between automated blood sampling and tail blood sampling should be based on the study requirements, the resources of the laboratory and skills of the staff. PMID:24958546

  9. Automated Force Volume Image Processing for Biological Samples

    PubMed Central

    Duan, Junbo; Duval, Jérôme F. L.; Brie, David; Francius, Grégory

    2011-01-01

    Atomic force microscopy (AFM) has now become a powerful technique for investigating on a molecular level, surface forces, nanomechanical properties of deformable particles, biomolecular interactions, kinetics, and dynamic processes. This paper specifically focuses on the analysis of AFM force curves collected on biological systems, in particular, bacteria. The goal is to provide fully automated tools to achieve theoretical interpretation of force curves on the basis of adequate, available physical models. In this respect, we propose two algorithms, one for the processing of approach force curves and another for the quantitative analysis of retraction force curves. In the former, electrostatic interactions prior to contact between AFM probe and bacterium are accounted for and mechanical interactions operating after contact are described in terms of Hertz-Hooke formalism. Retraction force curves are analyzed on the basis of the Freely Jointed Chain model. For both algorithms, the quantitative reconstruction of force curves is based on the robust detection of critical points (jumps, changes of slope or changes of curvature) which mark the transitions between the various relevant interactions taking place between the AFM tip and the studied sample during approach and retraction. Once the key regions of separation distance and indentation are detected, the physical parameters describing the relevant interactions operating in these regions are extracted making use of regression procedure for fitting experiments to theory. The flexibility, accuracy and strength of the algorithms are illustrated with the processing of two force-volume images, which collect a large set of approach and retraction curves measured on a single biological surface. For each force-volume image, several maps are generated, representing the spatial distribution of the searched physical parameters as estimated for each pixel of the force-volume image. PMID:21559483
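After the contact point is detected, the approach curve is fitted with a mechanical contact model. One common textbook form of the Hertzian term referenced above (for a spherical indenter; a sketch of the physics, not the authors' implementation, and the parameter values in the test are illustrative):

```python
import math

def hertz_force(indentation_m, young_pa, tip_radius_m, poisson=0.5):
    """Hertzian contact force for a spherical indenter:
    F = (4/3) * E / (1 - nu^2) * sqrt(R) * delta^(3/2),
    with indentation delta (m), Young's modulus E (Pa), tip radius R (m),
    and Poisson ratio nu. Returns force in newtons; zero before contact."""
    if indentation_m <= 0:
        return 0.0
    reduced = young_pa / (1.0 - poisson**2)
    return (4.0 / 3.0) * reduced * math.sqrt(tip_radius_m) * indentation_m**1.5
```

The characteristic δ^(3/2) scaling (doubling the indentation multiplies the force by 2^1.5 ≈ 2.83) is what distinguishes the Hertzian regime from a linear Hookean response when classifying curve segments.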

  10. Automated biowaste sampling system improved feces collection, mass measurement and sampling. [by use of a breadboard model

    NASA Technical Reports Server (NTRS)

    Fogal, G. L.; Mangialardi, J. K.; Young, R.

    1974-01-01

    The capability of the basic automated Biowaste Sampling System (ABSS) hardware was extended and improved through the design, fabrication and test of breadboard hardware. A preliminary system design effort established the feasibility of integrating the breadboard concepts into the ABSS.

  11. Automated Mars surface sample return mission concepts for achievement of essential scientific objectives

    NASA Technical Reports Server (NTRS)

    Weaver, W. L.; Norton, H. N.; Darnell, W. L.

    1975-01-01

    Mission concepts were investigated for automated return to Earth of a Mars surface sample adequate for detailed analyses in scientific laboratories. The minimum sample mass sufficient to meet scientific requirements was determined. Types of materials and supporting measurements for essential analyses are reported. A baseline trajectory profile was selected for its low energy requirements and relatively simple implementation, and trajectory profile design data were developed for 1979 and 1981 launch opportunities. Efficient spacecraft systems were conceived by utilizing existing technology where possible. Systems concepts emphasized the 1979 launch opportunity, and the applicability of results to other opportunities was assessed. It was shown that the baseline missions (return through Mars parking orbit) and some comparison missions (return after sample transfer in Mars orbit) can be accomplished by using a single Titan III E/Centaur as the launch vehicle. All missions investigated can be accomplished by use of Space Shuttle/Centaur vehicles.

  12. Automated Cough Assessment on a Mobile Platform.

    PubMed

    Sterling, Mark; Rhee, Hyekyun; Bocko, Mark

    2014-01-01

    The development of an Automated System for Asthma Monitoring (ADAM) is described. This consists of a consumer electronics mobile platform running a custom application. The application acquires an audio signal from an external user-worn microphone connected to the device analog-to-digital converter (microphone input). This signal is processed to determine the presence or absence of cough sounds. Symptom tallies and raw audio waveforms are recorded and made easily accessible for later review by a healthcare provider. The symptom detection algorithm is based upon standard speech recognition and machine learning paradigms and consists of an audio feature extraction step followed by a Hidden Markov Model based Viterbi decoder that has been trained on a large database of audio examples from a variety of subjects. Multiple Hidden Markov Model topologies and orders are studied. Performance of the recognizer is presented in terms of the sensitivity and the rate of false alarm as determined in a cross-validation test.
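The decoder described above finds the most likely hidden state sequence given the audio features. A minimal discrete-observation Viterbi sketch (generic textbook form; ADAM's actual features, topology, and trained probabilities are not given here, so the "cough"/"other" symbols and probabilities in the usage below are invented for illustration):

```python
def viterbi(obs, states, start_p, trans_p, emit_p):
    """Most likely hidden state path for a discrete-observation HMM,
    e.g. labeling each audio frame 'cough' or 'other'."""
    V = [{s: start_p[s] * emit_p[s][obs[0]] for s in states}]
    path = {s: [s] for s in states}
    for t in range(1, len(obs)):
        V.append({})
        new_path = {}
        for s in states:
            # best predecessor for state s at time t
            prob, prev = max(
                (V[t - 1][p] * trans_p[p][s] * emit_p[s][obs[t]], p)
                for p in states)
            V[t][s] = prob
            new_path[s] = path[prev] + [s]
        path = new_path
    best = max(states, key=lambda s: V[-1][s])
    return path[best]
```

With sticky self-transitions (0.8) and emissions favoring "burst" frames for cough, the observation sequence ("burst", "burst", "quiet") decodes to ["cough", "cough", "other"].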

  13. Automated Aqueous Sample Concentration Methods for in situ Astrobiological Instrumentation

    NASA Astrophysics Data System (ADS)

    Aubrey, A. D.; Grunthaner, F. J.

    2009-12-01

The era of wet chemical experiments for in situ planetary science investigations is upon us, as evidenced by recent results from the surface of Mars by Phoenix’s microscopy, electrochemistry, and conductivity analyzer, MECA [1]. Studies suggest that traditional thermal volatilization methods for planetary science in situ investigations induce organic degradation during sample processing [2], an effect that is enhanced in the presence of oxidants [3]. Recent developments have trended towards adaptation of non-destructive aqueous extraction and analytical methods for future astrobiological instrumentation. Wet chemical extraction techniques under investigation include subcritical water extraction, SCWE [4], aqueous microwave assisted extraction, MAE, and organic solvent extraction [5]. Similarly, development of miniaturized analytical space flight instruments that require aqueous extracts include microfluidic capillary electrophoresis chips, μCE [6], liquid-chromatography mass spectrometers, LC-MS [7], and life marker chips, LMC [8]. If organics are present on the surface of Mars, they are expected to be present at extremely low concentrations (parts-per-billion), orders of magnitude below the sensitivities of most flight instrument technologies. Therefore, it becomes necessary to develop and integrate concentration mechanisms for in situ sample processing before delivery to analytical flight instrumentation. We present preliminary results of automated solid-phase-extraction (SPE) sample purification and concentration methods for the treatment of highly saline aqueous soil extracts. These methods take advantage of the affinity of low molecular weight organic compounds with natural and synthetic scavenger materials. These interactions allow for the separation of target organic analytes from unfavorable background species (i.e. salts) during inline treatment, and a clever method for selective desorption is utilized to obtain concentrated solutions on the order

  14. Needs Assessments for Automated Manufacturing Training Programs.

    ERIC Educational Resources Information Center

    Northampton Community Coll., Bethlehem, PA.

    This document contains needs assessments used by Northampton Community College to develop training courses for a business-industry technology resource center for firms in eastern Pennsylvania. The following needs assessments are included: (1) individual skills survey for workers at Keystone Cement Company; (2) Keystone group skills survey; (3)…

  15. Automated Geospatial Watershed Assessment Tool (AGWA) Poster Presentation

    EPA Science Inventory

    The Automated Geospatial Watershed Assessment tool (AGWA, see: www.tucson.ars.ag.gov/agwa or http://www.epa.gov/esd/land-sci/agwa/) is a GIS interface jointly developed by the USDA-Agricultural Research Service, the U.S. Environmental Protection Agency, the University of Arizona...

  16. Automated Geospatial Watershed Assessment (AGWA) 3.0 Software Tool

    EPA Science Inventory

    The Automated Geospatial Watershed Assessment (AGWA) tool has been developed under an interagency research agreement between the U.S. Environmental Protection Agency, Office of Research and Development, and the U.S. Department of Agriculture, Agricultural Research Service. AGWA i...

  17. Automated Geospatial Watershed Assessment (AGWA) Documentation Version 2.0

    EPA Science Inventory

The Automated Geospatial Watershed Assessment (http://www.epa.gov/nerlesd1/landsci/agwa/introduction.htm and www.tucson.ars.ag.gov/agwa) tool is a GIS interface jointly developed by the U.S. Environmental Protection Agency, USDA-Agricultural Research Service, University of Arizon...

  18. Validity Arguments for Diagnostic Assessment Using Automated Writing Evaluation

    ERIC Educational Resources Information Center

    Chapelle, Carol A.; Cotos, Elena; Lee, Jooyoung

    2015-01-01

Two examples demonstrate an argument-based approach to validation of diagnostic assessment using automated writing evaluation (AWE). "Criterion"® was developed by Educational Testing Service to analyze students' papers grammatically, providing sentence-level error feedback. An interpretive argument was developed for its use as part of…

  19. Human and Automated Assessment of Oral Reading Fluency

    ERIC Educational Resources Information Center

    Bolaños, Daniel; Cole, Ron A.; Ward, Wayne H.; Tindal, Gerald A.; Hasbrouck, Jan; Schwanenflugel, Paula J.

    2013-01-01

    This article describes a comprehensive approach to fully automated assessment of children's oral reading fluency (ORF), one of the most informative and frequently administered measures of children's reading ability. Speech recognition and machine learning techniques are described that model the 3 components of oral reading fluency: word accuracy,…

  20. Automated PCR setup for forensic casework samples using the Normalization Wizard and PCR Setup robotic methods.

    PubMed

    Greenspoon, S A; Sykes, K L V; Ban, J D; Pollard, A; Baisden, M; Farr, M; Graham, N; Collins, B L; Green, M M; Christenson, C C

    2006-12-20

Human genome, pharmaceutical and research laboratories have long enjoyed the application of robotics to performing repetitive laboratory tasks. However, the utilization of robotics in forensic laboratories for processing casework samples is relatively new and poses particular challenges. Since the quantity and quality (a mixture versus a single source sample, the level of degradation, the presence of PCR inhibitors) of the DNA contained within a casework sample is unknown, particular attention must be paid to procedural susceptibility to contamination, as well as DNA yield, especially as it pertains to samples with little biological material. The Virginia Department of Forensic Science (VDFS) has successfully automated forensic casework DNA extraction utilizing the DNA IQ™ System in conjunction with the Biomek 2000 Automation Workstation. Human DNA quantitation is also performed in a near complete automated fashion utilizing the AluQuant Human DNA Quantitation System and the Biomek 2000 Automation Workstation. Recently, the PCR setup for casework samples has been automated, employing the Biomek 2000 Automation Workstation and Normalization Wizard, Genetic Identity version, which utilizes the quantitation data, imported into the software, to create a customized automated method for DNA dilution, unique to that plate of DNA samples. The PCR Setup software method, used in conjunction with the Normalization Wizard method and written for the Biomek 2000, functions to mix the diluted DNA samples, transfer the PCR master mix, and transfer the diluted DNA samples to PCR amplification tubes. Once the process is complete, the DNA extracts, still on the deck of the robot in PCR amplification strip tubes, are transferred to pre-labeled 1.5 mL tubes for long-term storage using an automated method. The automation of these steps in the process of forensic DNA casework analysis has been accomplished by performing extensive optimization, validation and testing of the
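The per-sample dilution that a normalization step derives from quantitation data reduces to simple volume arithmetic. A minimal sketch (the fallback rule for over-dilute samples and the function name are our illustrative assumptions, not the Normalization Wizard's actual logic):

```python
def normalization_volumes(conc_ng_per_ul, target_ng, final_ul):
    """Given a sample's quantitated DNA concentration, compute the extract
    volume and diluent volume to combine so that final_ul of diluted DNA
    carries target_ng of template. If the extract is too dilute to reach
    the target, use it neat (illustrative safeguard)."""
    if conc_ng_per_ul <= 0:
        raise ValueError("sample below quantitation range")
    dna_ul = target_ng / conc_ng_per_ul
    if dna_ul > final_ul:
        dna_ul = final_ul
    return round(dna_ul, 2), round(final_ul - dna_ul, 2)
```

A 2.0 ng/µL extract targeted at 1 ng in 10 µL needs 0.5 µL of DNA and 9.5 µL of diluent; a 0.05 ng/µL extract is used neat.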

  1. Automated Rendezvous and Capture in Space: A Technology Assessment

    NASA Technical Reports Server (NTRS)

    Polites, Michael E.

    1998-01-01

    This paper presents the results of a study to assess the technology of automated rendezvous and capture (AR&C) in space. The outline of the paper is as follows: First, the history of manual and automated rendezvous and capture and rendezvous and dock is presented. Next, the need for AR&C in space is reviewed. In light of these, AR&C systems are proposed that meet NASA's future needs, but can be developed in a reasonable amount of time with a reasonable amount of money. Technology plans for developing these systems are presented; cost and schedule are included.

  2. A modular approach for automated sample preparation and chemical analysis

    NASA Technical Reports Server (NTRS)

    Clark, Michael L.; Turner, Terry D.; Klingler, Kerry M.; Pacetti, Randolph

    1994-01-01

    Changes in international relations, especially within the past several years, have dramatically affected the programmatic thrusts of the U.S. Department of Energy (DOE). The DOE now is addressing the environmental cleanup required as a result of 50 years of nuclear arms research and production. One major obstacle in the remediation of these areas is the chemical determination of potentially contaminated material using currently acceptable practices. Process bottlenecks and exposure to hazardous conditions pose problems for the DOE. One proposed solution is the application of modular automated chemistry using Standard Laboratory Modules (SLM) to perform Standard Analysis Methods (SAM). The Contaminant Analysis Automation (CAA) Program has developed standards and prototype equipment that will accelerate the development of modular chemistry technology and is transferring this technology to private industry.

  3. Estimates of Radionuclide Loading to Cochiti Lake from Los Alamos Canyon Using Manual and Automated Sampling

    SciTech Connect

    McLean, Christopher T.

    2000-07-01

Los Alamos National Laboratory has a long-standing program of sampling storm water runoff inside the Laboratory boundaries. In 1995, the Laboratory started collecting the samples using automated storm water sampling stations; prior to this time the samples were collected manually. The Laboratory has also been periodically collecting sediment samples from Cochiti Lake. This paper presents the data for Pu-238 and Pu-239 bound to the sediments in Los Alamos Canyon storm water runoff and compares the sampling types by mass loading and as a percentage of the sediment deposition to Cochiti Lake. The data for both manual and automated sampling are used to calculate mass loads from Los Alamos Canyon on a yearly basis. The automated samples show mass loading 200-500 percent greater for Pu-238 and 300-700 percent greater for Pu-239 than the manual samples. Using the mean manual flow volume for mass-loading calculations, the automated samples are over 900 percent greater for Pu-238 and over 1800 percent greater for Pu-239. Evaluating the Pu-238 and Pu-239 activities as a percentage of deposition to Cochiti Lake indicates that the automated samples are 700-1300 percent greater for Pu-238 and 200-500 percent greater for Pu-239. The variance was calculated by two methods: the first calculates the variance for each sample event; the second calculates the variance from the total volume of water discharged in Los Alamos Canyon for the year.
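The mass-loading comparison can be illustrated with a minimal sketch. The function, units, and numbers below are assumptions for illustration only, not the Laboratory's actual data or calculation method.

```python
# Illustrative sketch: an annual sediment-bound mass load estimated as
# activity concentration x suspended-sediment concentration x discharge
# volume, then compared between two sampling types as a percentage.

def annual_load(activity_pci_per_g, sediment_g_per_l, discharge_l):
    """Annual mass load in pCi carried by the year's runoff."""
    return activity_pci_per_g * sediment_g_per_l * discharge_l

# Hypothetical numbers chosen only to show the percentage comparison.
manual = annual_load(0.5, 2.0, 1.0e6)
automated = annual_load(0.5, 2.0, 3.0e6)
print(f"automated/manual = {automated / manual:.0%}")
```

The real comparison also has to account for event-by-event flow volumes, which is where the two variance methods in the abstract differ.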

  4. Operator-based metric for nuclear operations automation assessment

    SciTech Connect

    Zacharias, G.L.; Miao, A.X.; Kalkan, A.

    1995-04-01

Continuing advances in real-time computational capabilities will support enhanced levels of smart automation and AI-based decision-aiding systems in the nuclear power plant (NPP) control room of the future. To support development of these aids, we describe in this paper a research tool, and more specifically, a quantitative metric, to assess the impact of proposed automation/aiding concepts in a manner that can account for a number of interlinked factors in the control room environment. In particular, we describe a cognitive operator/plant model that serves as a framework for integrating the operator's information-processing capabilities with his procedural knowledge, to provide insight as to how situations are assessed by the operator, decisions made, procedures executed, and communications conducted. Our focus is on the situation assessment (SA) behavior of the operator, the development of a quantitative metric reflecting overall operator awareness, and the use of this metric in evaluating automation/aiding options. We describe the results of a model-based simulation of a selected emergency scenario, and a metric-based evaluation of a range of contemplated NPP control room automation/aiding options. The results demonstrate the feasibility of model-based analysis of contemplated control room enhancements, and highlight the need for empirical validation.

  5. Evaluation of the measurement uncertainty in automated long-term sampling of PCDD/PCDFs.

    PubMed

    Vicaretti, M; D'Emilia, G; Mosca, S; Guerriero, E; Rotatori, M

    2013-12-01

Since the publication of the first version of European standard EN-1948 in 1996, long-term sampling equipment has been refined to a high standard for the sampling and analysis of polychlorodibenzo-p-dioxin (PCDD)/polychlorodibenzofuran (PCDF) emissions from industrial sources. Current automated PCDD/PCDF sampling systems make it possible to extend the measurement time from 6-8 h to 15-30 days, yielding values more representative of the plant's real pollutant emission over the long period. EN-1948:2006 is still the European technical reference standard for the determination of PCDD/PCDF from stationary-source emissions. In this paper, a methodology to estimate the measurement uncertainty of long-term automated sampling is presented. The methodology has been tested on a set of high-concentration sampling data resulting from a specific campaign; it is proposed with the intent that it be applied to further similar studies and generalized. A comparison between short-term sampling data from parallel manual and automated measurements is also considered, in order to verify the feasibility and usefulness of automated systems and to establish correlations between the results of the two methods, so that the manual method can be used to calibrate the automated long-term one. The uncertainty components of the manual method are analyzed, following the requirements of EN-1948-3:2006, allowing a preliminary evaluation of the corresponding uncertainty components of the automated system. Then, experimental data from parallel sampling campaigns carried out over short- and long-term sampling periods are compared. Long-term sampling is more reliable for monitoring PCDD/PCDF emissions than occasional short-term sampling, and automated sampling systems can provide very useful emission data over both short and long sampling periods. Despite this, due to the different application of the long-term sampling systems, the automated results could not be
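A GUM-style combination of independent uncertainty components, of the kind such a budget produces, can be sketched as follows. The component names and values are illustrative assumptions, not figures from EN-1948 or from this study.

```python
# Minimal sketch of an uncertainty budget: independent relative standard
# uncertainties are combined in quadrature, then expanded with a coverage
# factor k = 2 (~95% coverage). Names and values are invented for illustration.

import math

components = {
    "sampled_volume": 0.02,   # relative standard uncertainty of gas volume
    "recovery": 0.05,         # sampling-train recovery
    "analysis": 0.08,         # chemical analysis
}
u_c = math.sqrt(sum(u ** 2 for u in components.values()))
U = 2 * u_c                   # expanded uncertainty, k = 2
print(f"u_c = {u_c:.3f}, U(k=2) = {U:.3f}")
```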

  6. Automated Assessment and Experiences of Teaching Programming

    ERIC Educational Resources Information Center

    Higgins, Colin A.; Gray, Geoffrey; Symeonidis, Pavlos; Tsintsifas, Athanasios

    2005-01-01

    This article reports on the design, implementation, and usage of the CourseMarker (formerly known as CourseMaster) courseware Computer Based Assessment (CBA) system at the University of Nottingham. Students use CourseMarker to solve (programming) exercises and to submit their solutions. CourseMarker returns immediate results and feedback to the…

  7. Flightdeck Automation Problems (FLAP) Model for Safety Technology Portfolio Assessment

    NASA Technical Reports Server (NTRS)

    Ancel, Ersin; Shih, Ann T.

    2014-01-01

    NASA's Aviation Safety Program (AvSP) develops and advances methodologies and technologies to improve air transportation safety. The Safety Analysis and Integration Team (SAIT) conducts a safety technology portfolio assessment (PA) to analyze the program content, to examine the benefits and risks of products with respect to program goals, and to support programmatic decision making. The PA process includes systematic identification of current and future safety risks as well as tracking several quantitative and qualitative metrics to ensure the program goals are addressing prominent safety risks accurately and effectively. One of the metrics within the PA process involves using quantitative aviation safety models to gauge the impact of the safety products. This paper demonstrates the role of aviation safety modeling by providing model outputs and evaluating a sample of portfolio elements using the Flightdeck Automation Problems (FLAP) model. The model enables not only ranking of the quantitative relative risk reduction impact of all portfolio elements, but also highlighting the areas with high potential impact via sensitivity and gap analyses in support of the program office. Although the model outputs are preliminary and products are notional, the process shown in this paper is essential to a comprehensive PA of NASA's safety products in the current program and future programs/projects.

  8. Automated FMV image quality assessment based on power spectrum statistics

    NASA Astrophysics Data System (ADS)

    Kalukin, Andrew

    2015-05-01

    Factors that degrade image quality in video and other sensor collections, such as noise, blurring, and poor resolution, also affect the spatial power spectrum of imagery. Prior research in human vision and image science from the last few decades has shown that the image power spectrum can be useful for assessing the quality of static images. The research in this article explores the possibility of using the image power spectrum to automatically evaluate full-motion video (FMV) imagery frame by frame. This procedure makes it possible to identify anomalous images and scene changes, and to keep track of gradual changes in quality as collection progresses. This article will describe a method to apply power spectral image quality metrics for images subjected to simulated blurring, blocking, and noise. As a preliminary test on videos from multiple sources, image quality measurements for image frames from 185 videos are compared to analyst ratings based on ground sampling distance. The goal of the research is to develop an automated system for tracking image quality during real-time collection, and to assign ratings to video clips for long-term storage, calibrated to standards such as the National Imagery Interpretability Rating System (NIIRS).
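A minimal sketch of one such spectral cue follows: blurring suppresses high spatial frequencies, so the radially averaged power spectrum of a blurred frame falls off faster. The radial-average helper and the crude shift-average blur are assumptions for illustration, not the article's exact metric.

```python
# Sketch: compare the radially averaged 2D power spectrum of a frame before
# and after blurring; reduced high-frequency power is a degradation cue.

import numpy as np

def radial_power_spectrum(img):
    """Mean spectral power in each integer-radius frequency bin."""
    f = np.fft.fftshift(np.fft.fft2(img))
    power = np.abs(f) ** 2
    cy, cx = np.array(img.shape) // 2
    y, x = np.indices(img.shape)
    r = np.hypot(y - cy, x - cx).astype(int)
    return np.bincount(r.ravel(), power.ravel()) / np.bincount(r.ravel())

rng = np.random.default_rng(0)
sharp = rng.standard_normal((64, 64))
# Crude blur: average the frame with circularly shifted copies of itself.
blurred = (sharp + np.roll(sharp, 1, 0) + np.roll(sharp, 1, 1)) / 3
ps_sharp = radial_power_spectrum(sharp)
ps_blur = radial_power_spectrum(blurred)
print(ps_blur[30] < ps_sharp[30])   # expect True: blur suppresses high freqs
```

A frame-by-frame monitor could track a statistic of this curve (e.g. its high-frequency tail) and flag anomalous frames or scene changes.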

  9. AUTOMATED GEOSPATIAL WATERSHED ASSESSMENT (AGWA): A GIS-BASED TOOL FOR WATERSHED ASSESSMENT AND PLANNING

    EPA Science Inventory

    The Automated Geospatial Watershed Assessment tool (AGWA) is a GIS interface jointly developed by the USDA Agricultural Research Service, the U.S. Environmental Protection Agency, the University of Arizona, and the University of Wyoming to automate the parameterization and execu...

  10. Automated Assessment of Medical Training Evaluation Text

    PubMed Central

    Zhang, Rui; Pakhomov, Serguei; Gladding, Sophia; Aylward, Michael; Borman-Shoap, Emily; Melton, Genevieve B.

    2012-01-01

Medical post-graduate residency training and medical student training increasingly utilize electronic systems to evaluate trainee performance against defined training competencies using quantitative and qualitative data, the latter of which typically consists of text comments. Medical education is concomitantly becoming a growing area of clinical research. While electronic systems have proliferated in number, little work has been done to help manage and analyze the qualitative data from these evaluations. We explored the use of text-mining techniques to assist medical education researchers in sentiment analysis and topic analysis of residency evaluations, with a sample of 812 evaluation statements. While comments were predominantly positive, sentiment analysis was able to discriminate statements with 93% accuracy. As in other domains, Latent Dirichlet Allocation and Information Gain revealed groups of core subjects and appear to be useful for identifying topics in these data. PMID:23304426

  11. Automated Neuropsychological Assessment Metrics (ANAM) Traumatic Brain Injury (TBI): Human Factors Assessment

    DTIC Science & Technology

    2011-07-01

Monitoring Recovery from Traumatic Brain Injury Using Automated Neuropsychological Assessment Metrics (ANAM™ V1.0). Archives of Clinical Neuropsychology, 1997. ... Bleiberg, J.; Kane, R. ANAM™ Genogram: Historical Perspectives, Description and Current Endeavors. Archives of Clinical Neuropsychology, Supplement.

  12. A New Automated Method and Sample Data Flow for Analysis of Volatile Nitrosamines in Human Urine*

    PubMed Central

    Hodgson, James A.; Seyler, Tiffany H.; McGahee, Ernest; Arnstein, Stephen; Wang, Lanqing

    2016-01-01

Volatile nitrosamines (VNAs) are a group of compounds classified as probable (group 2A) and possible (group 2B) carcinogens in humans. Along with certain foods and contaminated drinking water, VNAs are detected at high levels in tobacco products and in both mainstream and sidestream smoke. Our laboratory monitors six urinary VNAs—N-nitrosodimethylamine (NDMA), N-nitrosomethylethylamine (NMEA), N-nitrosodiethylamine (NDEA), N-nitrosopiperidine (NPIP), N-nitrosopyrrolidine (NPYR), and N-nitrosomorpholine (NMOR)—using isotope dilution GC-MS/MS (QQQ) for large population studies such as the National Health and Nutrition Examination Survey (NHANES). In this paper, we report for the first time a new automated sample preparation method to more efficiently quantitate these VNAs. Automation is done using Hamilton STAR™ and Caliper Staccato™ workstations. This new automated method reduces sample preparation time from 4 hours to 2.5 hours while maintaining precision (inter-run CV < 10%) and accuracy (85%-111%). More importantly, this method increases sample throughput while maintaining a low limit of detection (<10 pg/mL) for all analytes. A streamlined sample data flow was created in parallel to the automated method, in which samples can be tracked from receiving to final LIMS output with minimal human intervention, further minimizing human error in the sample preparation process. This new automated method and the sample data flow are currently applied in bio-monitoring of VNAs in the US non-institutionalized population in the NHANES 2013-2014 cycle. PMID:26949569

  13. A Primer on Sampling for Statewide Assessment.

    ERIC Educational Resources Information Center

    Jaeger, Richard M.

    This paper is a primer on sampling procedures for statewide assessment. The careful reader should gain substantial knowledge about the promises and pitfalls of sampling for assessment. The primer has three basic objectives: (1) to define terms and concepts basic to sampling theory and its application, including population, sampling unit, sampling…

  14. Development of an automated data processing method for sample to sample comparison of seized methamphetamines.

    PubMed

    Choe, Sanggil; Lee, Jaesin; Choi, Hyeyoung; Park, Yujin; Lee, Heesang; Pyo, Jaesung; Jo, Jiyeong; Park, Yonghoon; Choi, Hwakyung; Kim, Suncheun

    2012-11-30

Information about sources of supply, trafficking routes, distribution patterns, and conspiracy links can be obtained from methamphetamine profiling. The precursor and synthetic method used in clandestine manufacture can be inferred from analysis of the minor impurities contained in methamphetamine, and the similarity between samples can be evaluated using the peaks that appear in their chromatograms. In South Korea, methamphetamine is the most popular illicit drug, but the total amount seized throughout the country is very small, so finding links between samples is more important than the other uses of methamphetamine profiling. Many Asian countries, including Japan and South Korea, have been using the method developed by the National Research Institute of Police Science of Japan, which uses a gas chromatograph with flame ionization detector (GC-FID), a DB-5 column, and four internal standards, and was designed to maximize the recovered impurities while minimizing the amount of methamphetamine. After GC-FID analysis, the raw data have to be processed; the data-processing steps are complex and require considerable time and effort. In this study, Microsoft Visual Basic for Applications (VBA) modules were developed to handle these data-processing steps. The modules collect the results into an Excel file and then correct the retention-time shift and response deviation introduced by sample preparation and instrumental analysis. The developed modules were tested for performance using 10 samples from 5 different cases. The processed results were analyzed with the Pearson correlation coefficient for similarity assessment, and the correlation coefficient of two samples from the same case was more than 0.99. When the modules were applied to 131 seized methamphetamine samples, four samples from two different cases were found to have a common origin, and the chromatograms of the four samples appeared visually identical.
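A minimal sketch of the chromatogram-similarity step follows. The `pearson` helper and the peak-response vectors are invented for illustration, and it is assumed that peaks have already been aligned by retention time so that the vectors are comparable position by position.

```python
# Sketch: after retention-time alignment, each sample is a vector of
# normalized impurity-peak responses; Pearson's r between two vectors
# scores their similarity (same-case samples should score near 1).

import math

def pearson(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Invented peak vectors for two samples from the same hypothetical case.
case_a1 = [0.12, 0.80, 0.33, 0.05, 0.41]
case_a2 = [0.13, 0.78, 0.35, 0.06, 0.40]
print(round(pearson(case_a1, case_a2), 4))
```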

  15. Automation of a Surface Sampling Probe/Electrospray Mass Spectrometry System

    SciTech Connect

    Kertesz, Vilmos; Ford, Michael J; Van Berkel, Gary J

    2005-01-01

    An image analysis automation concept and the associated software (HandsFree TLC/MS) were developed to control the surface sampling probe-to-surface distance during operation of a surface sampling electrospray system. This automation system enables both 'hands-free' formation of the liquid microjunction used to sample material from the surface and hands-free reoptimization of the microjunction thickness during a surface scan to achieve a fully automated surface sampling system. The image analysis concept and the practical implementation of the monitoring and automated adjustment of the sampling probe-to-surface distance (i.e., liquid microjunction thickness) are presented. The added capabilities for the preexisting surface sampling electrospray system afforded through this software control are illustrated by an example of automated scanning of multiple development lanes on a reversed-phase C8 TLC plate and by imaging inked lettering on a paper surface. The post data acquisition processing and data display aspects of the software package are also discussed.

  16. Situation Awareness and Levels of Automation: Empirical Assessment of Levels of Automation in the Commercial Cockpit

    NASA Technical Reports Server (NTRS)

    Kaber, David B.; Schutte, Paul C. (Technical Monitor)

    2000-01-01

This report has been prepared to close out a NASA grant to Mississippi State University (MSU) for research into situation awareness (SA) and automation in the advanced commercial aircraft cockpit. The grant was divided into two obligations, including $60,000 for the period from May 11, 2000 to December 25, 2000. The information presented in this report summarizes work completed under this obligation. It also details work to be completed with the balance of the current obligation and unobligated funds, amounting to $50,043, which are to be granted to North Carolina State University for completion of the research project from July 31, 2000 to May 10, 2001. The research was to investigate the effect of a broad spectrum of degrees of automation of complex systems on human-machine performance and SA. The work was to empirically assess the effect of theoretical levels of automation (LOAs), described in a taxonomy developed by Endsley & Kaber (1999), on naive and experienced subjects' performance and SA in simulated flight tasks. The study was to be conducted in the context of a realistic simulation of aircraft flight control. The objective of this work was to identify LOAs that effectively integrate humans and machines under normal operating conditions and failure modes. In general, the work was to provide insight into the design of automation in the commercial aircraft cockpit. Both laboratory and field investigations were to be conducted. At this point in time, a high-fidelity flight simulator of the McDonnell Douglas MD-11 aircraft has been completed. The simulator integrates a reconfigurable flight simulator developed by the Georgia Institute of Technology with stand-alone simulations of MD-11 autoflight systems developed at MSU. Use of the simulator has been integrated into a study plan for the laboratory research, and it is expected that the simulator will also be used in the field study with actual commercial pilots. In addition to the flight simulator, an electronic

  17. Automated versus manual sample inoculations in routine clinical microbiology: a performance evaluation of the fully automated InoqulA instrument.

    PubMed

    Froment, P; Marchandin, H; Vande Perre, P; Lamy, B

    2014-03-01

    The process of plate streaking has been automated to improve the culture readings, isolation quality, and workflow of microbiology laboratories. However, instruments have not been well evaluated under routine conditions. We aimed to evaluate the performance of the fully automated InoqulA instrument (BD Kiestra B.V., The Netherlands) in the automated seeding of liquid specimens and samples collected using swabs with transport medium. We compared manual and automated methods according to the (i) within-run reproducibility using Escherichia coli-calibrated suspensions, (ii) intersample contamination using a series of alternating sterile broths and broths with >10(5) CFU/ml of either E. coli or Proteus mirabilis, (iii) isolation quality with standardized mixed bacterial suspensions of diverse complexity and a 4-category standardized scale (very poor, poor, fair to good, or excellent), and (iv) agreement of the results obtained from 244 clinical specimens. By involving 15 technicians in the latter part of the comparative study, we estimated the variability in the culture quality at the level of the laboratory team. The instrument produced satisfactory reproducibility with no sample cross-contamination, and it performed better than the manual method, with more colony types recovered and isolated (up to 11% and 17%, respectively). Finally, we showed that the instrument did not shorten the seeding time over short periods of work compared to that for the manual method. Altogether, the instrument improved the quality and standardization of the isolation, thereby contributing to a better overall workflow, shortened the time to results, and provided more accurate results for polymicrobial specimens.

  18. The Stanford Automated Mounter: Pushing the limits of sample exchange at the SSRL macromolecular crystallography beamlines

    SciTech Connect

    Russi, Silvia; Song, Jinhu; McPhillips, Scott E.; Cohen, Aina E.

    2016-02-24

    The Stanford Automated Mounter System, a system for mounting and dismounting cryo-cooled crystals, has been upgraded to increase the throughput of samples on the macromolecular crystallography beamlines at the Stanford Synchrotron Radiation Lightsource. This upgrade speeds up robot maneuvers, reduces the heating/drying cycles, pre-fetches samples and adds an air-knife to remove frost from the gripper arms. As a result, sample pin exchange during automated crystal quality screening now takes about 25 s, five times faster than before this upgrade.

  19. The Stanford Automated Mounter: Pushing the limits of sample exchange at the SSRL macromolecular crystallography beamlines

    DOE PAGES

    Russi, Silvia; Song, Jinhu; McPhillips, Scott E.; ...

    2016-02-24

    The Stanford Automated Mounter System, a system for mounting and dismounting cryo-cooled crystals, has been upgraded to increase the throughput of samples on the macromolecular crystallography beamlines at the Stanford Synchrotron Radiation Lightsource. This upgrade speeds up robot maneuvers, reduces the heating/drying cycles, pre-fetches samples and adds an air-knife to remove frost from the gripper arms. As a result, sample pin exchange during automated crystal quality screening now takes about 25 s, five times faster than before this upgrade.

  20. The Stanford Automated Mounter: pushing the limits of sample exchange at the SSRL macromolecular crystallography beamlines

    PubMed Central

    Russi, Silvia; Song, Jinhu; McPhillips, Scott E.; Cohen, Aina E.

    2016-01-01

    The Stanford Automated Mounter System, a system for mounting and dismounting cryo-cooled crystals, has been upgraded to increase the throughput of samples on the macromolecular crystallography beamlines at the Stanford Synchrotron Radiation Lightsource. This upgrade speeds up robot maneuvers, reduces the heating/drying cycles, pre-fetches samples and adds an air-knife to remove frost from the gripper arms. Sample pin exchange during automated crystal quality screening now takes about 25 s, five times faster than before this upgrade. PMID:27047309

  1. Automated Research Impact Assessment: A New Bibliometrics Approach

    PubMed Central

    Drew, Christina H.; Pettibone, Kristianna G.; Finch, Fallis Owen; Giles, Douglas; Jordan, Paul

    2016-01-01

    As federal programs are held more accountable for their research investments, The National Institute of Environmental Health Sciences (NIEHS) has developed a new method to quantify the impact of our funded research on the scientific and broader communities. In this article we review traditional bibliometric analyses, address challenges associated with them, and describe a new bibliometric analysis method, the Automated Research Impact Assessment (ARIA). ARIA taps into a resource that has only rarely been used for bibliometric analyses: references cited in “important” research artifacts, such as policies, regulations, clinical guidelines, and expert panel reports. The approach includes new statistics that science managers can use to benchmark contributions to research by funding source. This new method provides the ability to conduct automated impact analyses of federal research that can be incorporated in program evaluations. We apply this method to several case studies to examine the impact of NIEHS funded research. PMID:26989272

  2. Automated Research Impact Assessment: A New Bibliometrics Approach.

    PubMed

    Drew, Christina H; Pettibone, Kristianna G; Finch, Fallis Owen; Giles, Douglas; Jordan, Paul

    2016-03-01

    As federal programs are held more accountable for their research investments, The National Institute of Environmental Health Sciences (NIEHS) has developed a new method to quantify the impact of our funded research on the scientific and broader communities. In this article we review traditional bibliometric analyses, address challenges associated with them, and describe a new bibliometric analysis method, the Automated Research Impact Assessment (ARIA). ARIA taps into a resource that has only rarely been used for bibliometric analyses: references cited in "important" research artifacts, such as policies, regulations, clinical guidelines, and expert panel reports. The approach includes new statistics that science managers can use to benchmark contributions to research by funding source. This new method provides the ability to conduct automated impact analyses of federal research that can be incorporated in program evaluations. We apply this method to several case studies to examine the impact of NIEHS funded research.

  3. Biological Environmental Sampling Technologies Assessment

    DTIC Science & Technology

    2015-12-01

[Table excerpt] Versatility, types of surfaces (informational only): Can the device support all types of surface sampling, including but not limited to tile, concrete, wood, glass, stone, and plastic? (Yes/no) Answer: Yes. Sampling area size is also assessed.

  4. Automated biowaste sampling system, solids subsystem operating model, part 2

    NASA Technical Reports Server (NTRS)

    Fogal, G. L.; Mangialardi, J. K.; Stauffer, R. E.

    1973-01-01

The detailed design and fabrication of the Solids Subsystem were carried out. The subsystem's capacity for the collection, storage, or sampling of feces and vomitus from six subjects was tested and verified.

  5. An Automated Sample Divider for Farmers Stock Peanuts

    Technology Transfer Automated Retrieval System (TEKTRAN)

    In-shell peanuts are harvested, loaded into drying trailers, and delivered to a central facility where they are dried to a moisture content safe for long term storage, sampled, graded, then unloaded into bulk storage. Drying trailers have capacities ranging from five to twenty-five tons of dry farme...

  6. Automated biowaste sampling system urine subsystem operating model, part 1

    NASA Technical Reports Server (NTRS)

    Fogal, G. L.; Mangialardi, J. K.; Rosen, F.

    1973-01-01

The urine subsystem automatically provides for the collection, volume sensing, and sampling of urine from six subjects during space flight. Verification of the subsystem design was a primary objective of the current effort, accomplished through the detailed design, fabrication, and verification testing of an operating model of the subsystem.

  7. An automated method of sample preparation of biofluids using pierceable caps to eliminate the uncapping of the sample tubes during sample transfer.

    PubMed

    Teitz, D S; Khan, S; Powell, M L; Jemal, M

    2000-09-11

    Biological samples are normally collected and stored frozen in capped tubes until analysis. To obtain aliquots of biological samples for analysis, the sample tubes have to be thawed, uncapped, samples removed and then recapped for further storage. In this paper, we report an automated method of sample transfer devised to eliminate the uncapping and recapping process. This sampling method was incorporated into an automated liquid-liquid extraction procedure of plasma samples. Using a robotic system, the plasma samples were transferred directly from pierceable capped tubes into microtubes contained in a 96-position block. The aliquoted samples were extracted with methyl-tert-butyl ether in the same microtubes. The supernatant organic layers were transferred to a 96-well collection plate and evaporated to dryness. The dried extracts were reconstituted and injected from the same plate for analysis by liquid chromatography with tandem mass spectrometry.

  8. An Automated Algorithm to Screen Massive Training Samples for a Global Impervious Surface Classification

    NASA Technical Reports Server (NTRS)

    Tan, Bin; Brown de Colstoun, Eric; Wolfe, Robert E.; Tilton, James C.; Huang, Chengquan; Smith, Sarah E.

    2012-01-01

    An algorithm is developed to automatically screen outliers from the massive training samples for the Global Land Survey - Imperviousness Mapping Project (GLS-IMP). GLS-IMP will produce a global 30 m spatial resolution impervious cover data set for the years 2000 and 2010 based on the Landsat Global Land Survey (GLS) data set. This unprecedented high resolution impervious cover data set is not only significant to urbanization studies but also needed for global carbon, hydrology, and energy balance research. A supervised classification method, regression tree, is applied in this project, and a set of accurate training samples is the key to supervised classification. Here we developed global-scale training samples from fine resolution (about 1 m) satellite data (Quickbird and Worldview2), and then aggregated the fine resolution impervious cover maps to 30 m resolution. To improve the classification accuracy, the training samples should be screened before being used to train the regression tree, and it is impossible to manually screen 30 m resolution training samples collected globally. In Europe alone, for example, there are 174 training sites, ranging in size from 4.5 km by 4.5 km to 8.1 km by 3.6 km, with over six million training samples in total. Therefore, we developed this automated, statistics-based algorithm to screen the training samples at two levels: the site level and the scene level. At the site level, all training samples are divided into 10 groups according to the percentage of impervious surface within a sample pixel; the samples falling within each 10% interval form one group. For each group, both univariate and multivariate outliers are detected and removed. The screening then escalates to the scene level, where a similar process with a looser threshold is applied to allow for the possible variance due to site differences.
We do not perform the screening across scenes because the scenes might vary due to
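    The group-wise, two-pass outlier screen described in this abstract can be sketched as follows. This is an illustrative reconstruction in Python, not the GLS-IMP implementation; the 10% binning follows the abstract, but the z-score cutoff, Mahalanobis threshold, and minimum group size are assumed parameters:

```python
import numpy as np

def screen_training_samples(features, impervious_pct, z_thresh=3.0, md_scale=3.0):
    """Screen training samples within each 10%-imperviousness group:
    first drop univariate z-score outliers, then multivariate
    Mahalanobis-distance outliers relative to the group centroid."""
    features = np.asarray(features, dtype=float)
    keep = np.ones(len(features), dtype=bool)
    groups = np.clip((np.asarray(impervious_pct) // 10).astype(int), 0, 9)
    for g in range(10):
        idx = np.where(groups == g)[0]
        if len(idx) < 5:              # too few samples to estimate statistics
            continue
        X = features[idx]
        # univariate screen: any band more than z_thresh standard deviations out
        z = np.abs((X - X.mean(axis=0)) / (X.std(axis=0) + 1e-12))
        uni_ok = (z < z_thresh).all(axis=1)
        # multivariate screen: Mahalanobis distance from the group mean
        cov = np.cov(X, rowvar=False) + 1e-6 * np.eye(X.shape[1])
        d = X - X.mean(axis=0)
        md = np.sqrt(np.einsum('ij,jk,ik->i', d, np.linalg.inv(cov), d))
        keep[idx] = uni_ok & (md < md_scale * np.sqrt(X.shape[1]))
    return keep
```

A looser scene-level pass, as the abstract describes, would simply re-run the same screen with larger thresholds on samples pooled per scene.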

  9. The Impact of Sampling Approach on Population Invariance in Automated Scoring of Essays. Research Report. ETS RR-13-18

    ERIC Educational Resources Information Center

    Zhang, Mo

    2013-01-01

    Many testing programs use automated scoring to grade essays. One issue in automated essay scoring that has not been examined adequately is population invariance and its causes. The primary purpose of this study was to investigate the impact of sampling in model calibration on population invariance of automated scores. This study analyzed scores…

  10. Development of an automated sample preparation module for environmental monitoring of biowarfare agents.

    PubMed

    Hindson, Benjamin J; Brown, Steve B; Marshall, Graham D; McBride, Mary T; Makarewicz, Anthony J; Gutierrez, Dora M; Wolcott, Duane K; Metz, Thomas R; Madabhushi, Ramakrishna S; Dzenitis, John M; Colston, Billy W

    2004-07-01

    An automated sample preparation module, based upon sequential injection analysis (SIA), has been developed for use within an autonomous pathogen detection system. The SIA system interfaced aerosol sampling with multiplexed microsphere immunoassay-flow cytometric detection. Metering and sequestering of microspheres using SIA was found to be reproducible and reliable, over 24-h periods of autonomous operation. Four inbuilt immunoassay controls showed excellent immunoassay and system stability over five days of unattended continuous operation. Titration curves for two biological warfare agents, Bacillus anthracis and Yersinia pestis, obtained using the automated SIA procedure were shown to be similar to those generated using a manual microtiter plate procedure.

  11. Automated Assessment of the Quality of Depression Websites

    PubMed Central

    Tang, Thanh Tin; Hawking, David; Christensen, Helen

    2005-01-01

    Background Since health information on the World Wide Web is of variable quality, methods are needed to assist consumers to identify health websites containing evidence-based information. Manual assessment tools may assist consumers to evaluate the quality of sites. However, these tools are poorly validated and often impractical. There is a need to develop better consumer tools, and in particular to explore the potential of automated procedures for evaluating the quality of health information on the web. Objective This study (1) describes the development of an automated quality assessment procedure (AQA) designed to automatically rank depression websites according to their evidence-based quality; (2) evaluates the validity of the AQA relative to human rated evidence-based quality scores; and (3) compares the validity of Google PageRank and the AQA as indicators of evidence-based quality. Method The AQA was developed using a quality feedback technique and a set of training websites previously rated manually according to their concordance with statements in the Oxford University Centre for Evidence-Based Mental Health’s guidelines for treating depression. The validation phase involved 30 websites compiled from the DMOZ, Yahoo! and LookSmart Depression Directories by randomly selecting six sites from each of the Google PageRank bands of 0, 1-2, 3-4, 5-6 and 7-8. Evidence-based ratings from two independent raters (based on concordance with the Oxford guidelines) were then compared with scores derived from the automated AQA and Google algorithms. There was no overlap in the websites used in the training and validation phases of the study. Results The correlation between the AQA score and the evidence-based ratings was high and significant (r=0.85, P<.001). Addition of a quadratic component improved the fit, the combined linear and quadratic model explaining 82 percent of the variance. The correlation between Google PageRank and the evidence-based score was lower than
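    The validation arithmetic reported here (a Pearson correlation between automated and human scores, plus the additional variance captured by adding a quadratic term) is easy to reproduce. The sketch below uses invented numbers, not the study's data:

```python
import numpy as np

def r_squared(x, y, degree):
    """Variance explained (R^2) by a least-squares polynomial fit."""
    coeffs = np.polyfit(x, y, degree)
    resid = y - np.polyval(coeffs, x)
    return 1.0 - resid.var() / y.var()

# invented AQA scores and human evidence-based ratings with some curvature
aqa = np.array([0.1, 0.2, 0.3, 0.4, 0.5, 0.6, 0.7, 0.8, 0.9])
human = np.array([1.0, 1.1, 1.5, 2.2, 3.1, 4.4, 5.8, 7.6, 9.5])

r = np.corrcoef(aqa, human)[0, 1]      # Pearson correlation
lin = r_squared(aqa, human, 1)         # linear model
quad = r_squared(aqa, human, 2)        # linear + quadratic term
```

Because the quadratic model nests the linear one, its R^2 can only match or exceed the linear fit, which is why adding the quadratic component "improved the fit" in the study.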

  12. Automated Portable Test System (APTS) - A performance envelope assessment tool

    NASA Technical Reports Server (NTRS)

    Kennedy, R. S.; Dunlap, W. P.; Jones, M. B.; Wilkes, R. L.; Bittner, A. C., Jr.

    1985-01-01

    The reliability and stability of microcomputer-based psychological tests are evaluated. The hardware, test programs, and system control of the Automated Portable Test System, which assesses human performance and subjective status, are described. Subjects were administered 11 pen-and-pencil and microcomputer-based tests for 10 sessions. The data reveal that nine of the 10 tests stabilized by the third administration; inertial correlations were high and consistent. It is noted that the microcomputer-based tests display good psychometric properties in terms of differential stability and reliability.

  13. Automated Geospatial Watershed Assessment Tool (AGWA): Applications for Fire Management and Assessment.

    EPA Science Inventory

    New tools and functionality have been incorporated into the Automated Geospatial Watershed Assessment Tool (AGWA) to assess the impacts of wildland fire on runoff and erosion. AGWA (see: www.tucson.ars.ag.gov/agwa or http://www.epa.gov/esd/land-sci/agwa/) is a GIS interface joi...

  14. Automated Sample Preparation for Radiogenic and Non-Traditional Metal Isotopes: Removing an Analytical Barrier for High Sample Throughput

    NASA Astrophysics Data System (ADS)

    Field, M. Paul; Romaniello, Stephen; Gordon, Gwyneth W.; Anbar, Ariel D.; Herrmann, Achim; Martinez-Boti, Miguel A.; Anagnostou, Eleni; Foster, Gavin L.

    2014-05-01

    MC-ICP-MS has dramatically improved the analytical throughput for high-precision radiogenic and non-traditional isotope ratio measurements compared to TIMS. The generation of large data sets, however, remains hampered by the tedious manual drip chromatography required for sample purification. A new, automated chromatography system reduces this laboratory bottleneck and expands the utility of high-precision isotope analyses in applications where large data sets are required: geochemistry, forensic anthropology, nuclear forensics, medical research and food authentication. We have developed protocols to automate ion exchange purification for several isotopic systems (B, Ca, Fe, Cu, Zn, Sr, Cd, Pb and U) using the new prepFAST-MC™ (ESI, Omaha, Nebraska). The system is not only inert (all-fluoropolymer flow paths) but also very flexible, easily accommodating different resins, samples, and reagent types. Once programmed, precise and accurate user-defined volumes and flow rates are implemented to automatically load samples, wash the column, condition the column and elute fractions. Unattended, the automated, low-pressure ion exchange chromatography system can process up to 60 samples overnight. Excellent reproducibility, reliability, and recovery, with low blanks and carry-over, have been demonstrated for samples in a variety of matrices, giving accurate and precise isotopic ratios within analytical error for several isotopic systems (B, Ca, Fe, Cu, Zn, Sr, Cd, Pb and U). This illustrates the potential of the new prepFAST-MC™ as a powerful tool in radiogenic and non-traditional isotope research.

  15. Automated syringe sampler. [remote sampling of air and water]

    NASA Technical Reports Server (NTRS)

    Purgold, G. C. (Inventor)

    1981-01-01

    A number of sampling devices are disposed in a rack which slides into a housing. In response to a signal from an antenna, circuitry elements are activated which provide power individually, collectively, or selectively to a servomechanism, thereby moving an actuator arm and the attached jawed bracket supporting an evacuated tube toward a stationary needle. One open end of the needle extends through the side wall of a conduit to the interior, and the other open end is maintained within a protective sleeve supported by a bifurcated bracket. A septum is punctured by the end of the needle within the sleeve, and a sample of the fluid medium in the conduit flows through the needle and is transferred to a tube. The signal to the servo is then reversed, and the actuator arm moves the tube back to its original position, permitting the septum to expand and seal the hole made by the needle. The jawed bracket is attached by a pivot to the actuator to facilitate tube replacement.

  16. Automated Protein Biomarker Analysis: on-line extraction of clinical samples by Molecularly Imprinted Polymers

    NASA Astrophysics Data System (ADS)

    Rossetti, Cecilia; Świtnicka-Plak, Magdalena A.; Grønhaug Halvorsen, Trine; Cormack, Peter A. G.; Sellergren, Börje; Reubsaet, Léon

    2017-03-01

    Robust biomarker quantification is essential for the accurate diagnosis of diseases and is of great value in cancer management. In this paper, an innovative diagnostic platform is presented which provides automated molecularly imprinted solid-phase extraction (MISPE) followed by liquid chromatography-mass spectrometry (LC-MS) for biomarker determination using ProGastrin Releasing Peptide (ProGRP), a highly sensitive biomarker for Small Cell Lung Cancer, as a model. Molecularly imprinted polymer microspheres were synthesized by precipitation polymerization and analytical optimization of the most promising material led to the development of an automated quantification method for ProGRP. The method enabled analysis of patient serum samples with elevated ProGRP levels. Particularly low sample volumes were permitted using the automated extraction within a method which was time-efficient, thereby demonstrating the potential of such a strategy in a clinical setting.

  17. Automated Protein Biomarker Analysis: on-line extraction of clinical samples by Molecularly Imprinted Polymers

    PubMed Central

    Rossetti, Cecilia; Świtnicka-Plak, Magdalena A.; Grønhaug Halvorsen, Trine; Cormack, Peter A.G.; Sellergren, Börje; Reubsaet, Léon

    2017-01-01

    Robust biomarker quantification is essential for the accurate diagnosis of diseases and is of great value in cancer management. In this paper, an innovative diagnostic platform is presented which provides automated molecularly imprinted solid-phase extraction (MISPE) followed by liquid chromatography-mass spectrometry (LC-MS) for biomarker determination using ProGastrin Releasing Peptide (ProGRP), a highly sensitive biomarker for Small Cell Lung Cancer, as a model. Molecularly imprinted polymer microspheres were synthesized by precipitation polymerization and analytical optimization of the most promising material led to the development of an automated quantification method for ProGRP. The method enabled analysis of patient serum samples with elevated ProGRP levels. Particularly low sample volumes were permitted using the automated extraction within a method which was time-efficient, thereby demonstrating the potential of such a strategy in a clinical setting. PMID:28303910

  18. An automated integrated platform for rapid and sensitive multiplexed protein profiling using human saliva samples.

    PubMed

    Nie, Shuai; Henley, W Hampton; Miller, Scott E; Zhang, Huaibin; Mayer, Kathryn M; Dennis, Patty J; Oblath, Emily A; Alarie, Jean Pierre; Wu, Yue; Oppenheim, Frank G; Little, Frédéric F; Uluer, Ahmet Z; Wang, Peidong; Ramsey, J Michael; Walt, David R

    2014-03-21

    During the last decade, saliva has emerged as a potentially ideal diagnostic biofluid for noninvasive testing. In this paper, we present an automated, integrated platform useable by minimally trained personnel in the field for the diagnosis of respiratory diseases using human saliva as a sample specimen. In this platform, a saliva sample is loaded onto a disposable microfluidic chip containing all the necessary reagents and components required for saliva analysis. The chip is then inserted into the automated analyzer, the SDReader, where multiple potential protein biomarkers for respiratory diseases are measured simultaneously using a microsphere-based array via fluorescence sandwich immunoassays. The results are read optically, and the images are analyzed by a custom-designed algorithm. The fully automated assay requires as little as 10 μL of saliva sample, and the results are reported in 70 min. The performance of the platform was characterized by testing protein standard solutions, and the results were comparable to those from the 3.5 h lab bench assay that we have previously reported. The device was also deployed in two clinical environments where 273 human saliva samples collected from different subjects were successfully tested, demonstrating the device's potential to assist clinicians with the diagnosis of respiratory diseases by providing timely protein biomarker profiling information. This platform, which combines noninvasive sample collection and fully automated analysis, can also be utilized in point-of-care diagnostics.

  19. Fully automated open access platform for rapid, combined serial evaporation and sample reformatting.

    PubMed

    Benali, Otman; Davies, Gary; Deal, Martyn; Farrant, Elizabeth; Guthrie, Duncan; Holden, John; Wheeler, Rob

    2008-01-01

    This paper reports a novel evaporator and its integration with an automated sample handling system to create a high-throughput evaporation platform. The Vaportec V-10 evaporator uses a high-speed rotation motor (approximately 6000 rpm) to spin the vial containing a sample, creating a thin film of solvent which can be readily evaporated by the application of heat to the vial, while the consequent centrifugal force prevents "bumping". An intelligent algorithm controls pressure and temperature for optimum solvent removal conditions and end-of-run detection, critical for automation. The system allows the option of evaporation directly from a sample source vial; alternatively, integrated liquid handling facilities provide the capability of transferring samples portionwise from a (large) source vial or bottle to a (small) daughter container, enabling efficient sample reformatting with minimum user intervention. The open access system makes significant advances over current vacuum centrifugal evaporators in terms of evaporation rate and ease of automation. The evaporator's main features, the integration of robotics to provide automation, and examples of evaporation rates for a wide range of solvents from a variety of containers are described.

  20. Automated LSA Assessment of Summaries in Distance Education: Some Variables to Be Considered

    ERIC Educational Resources Information Center

    Jorge-Botana, Guillermo; Luzón, José M.; Gómez-Veiga, Isabel; Martín-Cordero, Jesús I.

    2015-01-01

    A latent semantic analysis-based automated summary assessment is described; this automated system is applied to a real learning-from-text task in a Distance Education context. We comment on the use of automated content, plagiarism, and text coherence measures, as well as average word weights, and their impact on predicting human judges' summary scoring. A…
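    The core of an LSA-based summary assessor of this kind can be sketched with a term-document count matrix, a truncated SVD, and cosine similarity in the latent space. This is a generic illustration, not the system from the paper; the fold-in projection and the choice of k = 2 dimensions are assumptions:

```python
import numpy as np

def lsa_scores(source_docs, summary, k=2):
    """Score a summary against source passages via latent semantic analysis:
    build a term-document count matrix, truncate it with an SVD, fold the
    summary into the latent space, and compare by cosine similarity."""
    vocab = sorted({w for text in source_docs + [summary] for w in text.lower().split()})
    index = {w: i for i, w in enumerate(vocab)}

    def bow(text):
        v = np.zeros(len(vocab))
        for w in text.lower().split():
            v[index[w]] += 1.0
        return v

    A = np.column_stack([bow(d) for d in source_docs])   # terms x documents
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    k = min(k, int((s > 1e-10).sum()))
    q = (U[:, :k].T @ bow(summary)) / s[:k]              # fold-in: Sigma^-1 U^T q
    scores = []
    for j in range(A.shape[1]):
        d = Vt[:k, j]                                    # document coords in latent space
        denom = np.linalg.norm(q) * np.linalg.norm(d) + 1e-12
        scores.append(float(q @ d) / denom)
    return scores
```

A real grader would weight terms (e.g. log-entropy) and compare the summary against an expert summary rather than raw passages, but the latent-space cosine is the same mechanism.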

  1. Automated laboratory based X-ray beamline with multi-capillary sample chamber

    SciTech Connect

    Purushothaman, S.; Gauthé, B. L. L. E.; Brooks, N. J.; Templer, R. H.; Ces, O.

    2013-08-15

    An automated laboratory based X-ray beamline with a multi-capillary sample chamber capable of undertaking small angle X-ray scattering measurements on a maximum of 104 samples at a time as a function of temperature between 5 and 85 °C has been developed. The modular format of the system enables the user to simultaneously equilibrate samples at eight different temperatures with an accuracy of ±0.005 °C. This system couples a rotating anode generator and 2D optoelectronic detector with Franks X-ray optics, leading to typical exposure times of less than 5 min for lyotropic liquid crystalline samples. Beamline control including sample exchange and data acquisition has been fully automated via a custom designed LabVIEW framework.

  2. An automated system for global atmospheric sampling using B-747 airliners

    NASA Technical Reports Server (NTRS)

    Lew, K. Q.; Gustafsson, U. R. C.; Johnson, R. E.

    1981-01-01

    The global air sampling program utilizes commercial aircraft in scheduled service to measure atmospheric constituents. A fully automated system designed for the 747 aircraft is described, and airline operational constraints and the data and control subsystems are treated. The overall program management, system monitoring, and data retrieval from four aircraft in global service are described.

  3. Automated Video Quality Assessment for Deep-Sea Video

    NASA Astrophysics Data System (ADS)

    Pirenne, B.; Hoeberechts, M.; Kalmbach, A.; Sadhu, T.; Branzan Albu, A.; Glotin, H.; Jeffries, M. A.; Bui, A. O. V.

    2015-12-01

    Video provides a rich source of data for geophysical analysis, often supplying detailed information about the environment when other instruments may not. This is especially true of deep-sea environments, where direct visual observations cannot be made. As computer vision techniques improve and volumes of video data increase, automated video analysis is emerging as a practical alternative to labor-intensive manual analysis. Automated techniques can be much more sensitive to video quality than their manual counterparts, so performing quality assessment before doing full analysis is critical to producing valid results. Ocean Networks Canada (ONC), an initiative of the University of Victoria, operates cabled ocean observatories that supply continuous power and Internet connectivity to a broad suite of subsea instruments from the coast to the deep sea, including video and still cameras. This network of ocean observatories has produced almost 20,000 hours of video (about 38 hours are recorded each day) and an additional 8,000 hours of logs from remotely operated vehicle (ROV) dives. We begin by surveying some ways in which deep-sea video poses challenges for automated analysis, including: 1. Non-uniform lighting: Single, directional light sources produce uneven luminance distributions and shadows; remotely operated lighting equipment is also susceptible to technical failures. 2. Particulate noise: Turbidity and marine snow are often present in underwater video; particles in the water column can have sharper focus and higher contrast than the objects of interest due to their proximity to the light source and can also influence the camera's autofocus and auto white-balance routines. 3. Color distortion (low contrast): The rate of absorption of light in water varies by wavelength, and is higher overall than in air, altering apparent colors and lowering the contrast of objects at a distance. We also describe measures under development at ONC for detecting and mitigating
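    As a hedged illustration of what automated per-frame screening for the first and third challenges might look like (the thresholds and the 4x4 block grid are arbitrary choices, not ONC's method):

```python
import numpy as np

def frame_quality_flags(frame, contrast_min=0.1, uniformity_max=0.5):
    """Cheap quality screens for a grayscale frame scaled to [0, 1]:
    flag low global contrast and strongly non-uniform illumination."""
    contrast = float(frame.std())                 # RMS contrast
    h, w = frame.shape
    # illumination non-uniformity: spread of the means over a 4x4 block grid
    cropped = frame[:h - h % 4, :w - w % 4]
    blocks = cropped.reshape(4, h // 4, 4, w // 4).mean(axis=(1, 3))
    return {
        "low_contrast": contrast < contrast_min,
        "uneven_lighting": float(blocks.max() - blocks.min()) > uniformity_max,
    }
```

Flags like these can gate which frames are forwarded to the full (and far more expensive) analysis pipeline.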

  4. Automated bone age assessment of older children using the radius

    NASA Astrophysics Data System (ADS)

    Tsao, Sinchai; Gertych, Arkadiusz; Zhang, Aifeng; Liu, Brent J.; Huang, Han K.

    2008-03-01

    The Digital Hand Atlas in Assessment of Skeletal Development is a large-scale Computer Aided Diagnosis (CAD) project for automating the process of grading skeletal development of children from 0 to 18 years of age. It includes a complete collection of 1,400 normal hand X-rays of children in this age range. Bone age assessment is used as an index of skeletal development for detection of growth pathologies that can be related to endocrine disorders, malnutrition, and other disease types. Previous work at the Image Processing and Informatics Lab (IPILab) allowed the bone age CAD algorithm to accurately assess the bone age of children from 1 to 16 (male) or 14 (female) years of age using the phalanges as well as the carpal bones. At the older ages (16 (male) or 14 (female) to 19 years of age), the phalanges and carpal bones are fully developed and do not provide well-defined features for accurate bone age assessment. Therefore, integration of the radius bone as a region of interest (ROI) is greatly needed and will significantly improve the ability to accurately assess the bone age of older children. Preliminary studies show that an integrated bone age CAD that utilizes the phalanges, carpal bones, and radius forms a robust method for automatic bone age assessment throughout the entire age range (1-19 years of age).

  5. Application of bar codes to the automation of analytical sample data collection

    SciTech Connect

    Jurgensen, H A

    1986-01-01

    The Health Protection Department at the Savannah River Plant collects 500 urine samples per day for tritium analyses. Prior to automation, all sample information was compiled manually. Bar code technology was chosen for automating this program because it provides a more accurate, efficient, and inexpensive method for data entry. The system has three major functions: sample labeling, data collection, and data storage. Sample labeling is accomplished at remote bar code label stations composed of an Intermec 8220 (Intermec Corp.) interfaced to an IBM-PC. Data collection is done on a central VAX 11/730 (Digital Equipment Corp.): bar code readers are used to log in samples to be analyzed on liquid scintillation counters, and the VAX 11/730 processes the data and generates reports. Data storage is on the VAX 11/730, backed up on the plant's central computer. A brief description of several other bar code applications at the Savannah River Plant is also presented.

  6. Using sample entropy for automated sign language recognition on sEMG and accelerometer data.

    PubMed

    Kosmidou, Vasiliki E; Hadjileontiadis, Leontios I

    2010-03-01

    Communication using sign language (SL) provides an alternative means of information transmission among the deaf. Automated recognition of the gestures involved in SL, however, could further expand this communication channel to the world of hearers. In this study, data from a five-channel surface electromyogram and a three-dimensional accelerometer on signers' dominant hand were subjected to a feature extraction process. The latter consisted of sample entropy (SampEn)-based analysis, with time-frequency feature (TFF) analysis also performed as a baseline method, for the automated recognition of isolated signs from a 60-word lexicon of Greek SL (GSL). Experimental results showed 66% and 92% mean classification accuracy using TFF and SampEn, respectively. These results justify the superiority of SampEn over conventional methods, such as TFF, in providing high recognition hit-ratios, combined with feature vector dimension reduction, toward fast and reliable automated GSL gesture recognition.
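    Sample entropy itself is a standard statistic: SampEn(m, r) = -ln(A/B), where B counts template pairs of length m lying within a Chebyshev tolerance of r times the signal's standard deviation (self-matches excluded), and A is the same count for length m + 1. A compact NumPy version (one common counting convention; published variants differ slightly at the boundaries) is:

```python
import numpy as np

def sample_entropy(x, m=2, r=0.2):
    """SampEn(m, r) of a 1-D signal: -ln(A/B), with B the number of matching
    length-m template pairs and A the number of matching length-(m+1) pairs,
    using Chebyshev distance and tolerance r * std(x)."""
    x = np.asarray(x, dtype=float)
    tol = r * x.std()

    def match_count(mm):
        t = np.array([x[i:i + mm] for i in range(len(x) - mm + 1)])
        total = 0
        for i in range(len(t) - 1):          # pairs (i, j) with j > i only
            dist = np.abs(t[i + 1:] - t[i]).max(axis=1)
            total += int((dist <= tol).sum())
        return total

    B, A = match_count(m), match_count(m + 1)
    return float(-np.log(A / B)) if A > 0 and B > 0 else float("inf")
```

Regular signals (where a length-m match usually extends to m + 1) score near zero, while noisy signals score high, which is what makes SampEn a useful complexity feature for sEMG and accelerometer channels.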

  7. Automated quality assessment in three-dimensional breast ultrasound images.

    PubMed

    Schwaab, Julia; Diez, Yago; Oliver, Arnau; Martí, Robert; van Zelst, Jan; Gubern-Mérida, Albert; Mourri, Ahmed Bensouda; Gregori, Johannes; Günther, Matthias

    2016-04-01

    Automated three-dimensional breast ultrasound (ABUS) is a valuable adjunct to x-ray mammography for breast cancer screening of women with dense breasts. High image quality is essential for proper diagnostics and computer-aided detection. We propose an automated image quality assessment system for ABUS images that detects artifacts at the time of acquisition. Therefore, we study three aspects that can corrupt ABUS images: the nipple position relative to the rest of the breast, the shadow caused by the nipple, and the shape of the breast contour on the image. Image processing and machine learning algorithms are combined to detect these artifacts based on 368 clinical ABUS images that have been rated manually by two experienced clinicians. At a specificity of 0.99, 55% of the images that were rated as low quality are detected by the proposed algorithms. The areas under the ROC curves of the single classifiers are 0.99 for the nipple position, 0.84 for the nipple shadow, and 0.89 for the breast contour shape. The proposed algorithms work fast and reliably, which makes them adequate for online evaluation of image quality during acquisition. The presented concept may be extended to further image modalities and quality aspects.

  8. Fully Automated Deep Learning System for Bone Age Assessment.

    PubMed

    Lee, Hyunkwang; Tajmir, Shahein; Lee, Jenny; Zissen, Maurice; Yeshiwas, Bethel Ayele; Alkasab, Tarik K; Choy, Garry; Do, Synho

    2017-03-08

    Skeletal maturity progresses through discrete phases, a fact that is used routinely in pediatrics where bone age assessments (BAAs) are compared to chronological age in the evaluation of endocrine and metabolic disorders. While central to many disease evaluations, little has changed to improve the tedious process since its introduction in 1950. In this study, we propose a fully automated deep learning pipeline to segment a region of interest, standardize and preprocess input radiographs, and perform BAA. Our models use an ImageNet pretrained, fine-tuned convolutional neural network (CNN) to achieve 57.32 and 61.40% accuracies for the female and male cohorts on our held-out test images. Female test radiographs were assigned a BAA within 1 year 90.39% and within 2 years 98.11% of the time. Male test radiographs were assigned 94.18% within 1 year and 99.00% within 2 years. Using the input occlusion method, attention maps were created which reveal what features the trained model uses to perform BAA. These correspond to what human experts look at when manually performing BAA. Finally, the fully automated BAA system was deployed in the clinical environment as a decision supporting system for more accurate and efficient BAAs at much faster interpretation time (<2 s) than the conventional method.

  9. Evaluation of automated streamwater sampling during storm events for total mercury analysis.

    PubMed

    Riscassi, Ami L; Converse, Amber D; Hokanson, Kelly J; Scanlon, Todd M

    2010-10-06

    Understanding the processes by which mercury is mobilized from soil to stream is currently limited by a lack of observations during high-flow events, when the majority of this transport occurs. An automated technique to collect stream water for unfiltered total mercury (HgT) analysis was systematically evaluated in a series of laboratory experiments. Potential sources of error investigated were 1) carry-over effects associated with sequential sampling, 2) deposition of HgT into empty bottles prior to sampling, and 3) deposition to or evasion from samples prior to retrieval. Contamination from carry-over effects was minimal (<2%) and HgT deposition to open bottles was negligible. Potentially greater errors are associated with evasive losses of HgT from uncapped samples, with higher temperatures leading to greater evasion. These evasive losses were found to take place primarily within the first eight hours. HgT associated with particulate material is much less prone to evasion than HgT in dissolved form. A field test conducted during a high-flow event confirmed unfiltered HgT concentrations sampled with an automated system were comparable to those taken manually, as the mean absolute difference between automated and manual samples (10%) was similar to the mean difference between duplicate grab samples (9%). Results from this study have demonstrated that a standard automated sampler, retrofitted with appropriately cleaned fluoropolymer tubing and glass bottles, can effectively be used for collection of streamwater during high-flow events for low-level mercury analysis.

  10. Current status and future prospects of an automated sample exchange system PAM for protein crystallography

    NASA Astrophysics Data System (ADS)

    Hiraki, M.; Yamada, Y.; Chavas, L. M. G.; Matsugaki, N.; Igarashi, N.; Wakatsuki, S.

    2013-03-01

    To achieve fully-automated and/or remote data collection in high-throughput X-ray experiments, the Structural Biology Research Centre at the Photon Factory (PF) has installed the PF automated mounting system (PAM) sample exchange robots at the PF macromolecular crystallography beamlines BL-1A, BL-5A, BL-17A, AR-NW12A and AR-NE3A. We are upgrading the experimental systems, including the PAM, for stable and efficient operation. To prevent human error in automated data collection, we installed a two-dimensional barcode reader for identification of the cassettes and sample pins. Because no liquid nitrogen pipeline is installed in the PF experimental hutch, users commonly add liquid nitrogen using a small Dewar. To address this issue, an automated liquid nitrogen filling system that links a 100-liter tank to the robot Dewar has been installed on the PF macromolecular beamlines. Here we describe this new implementation, as well as future prospects.

  11. Automated sample-changing robot for solution scattering experiments at the EMBL Hamburg SAXS station X33.

    PubMed

    Round, A R; Franke, D; Moritz, S; Huchler, R; Fritsche, M; Malthan, D; Klaering, R; Svergun, D I; Roessle, M

    2008-10-01

    There is a rapidly increasing interest in the use of synchrotron small-angle X-ray scattering (SAXS) for large-scale studies of biological macromolecules in solution, and this requires an adequate means of automating the experiment. A prototype has been developed of an automated sample changer for solution SAXS, where the solutions are kept in thermostatically controlled well plates allowing for operation with up to 192 samples. The measuring protocol involves controlled loading of protein solutions and matching buffers, followed by cleaning and drying of the cell between measurements. The system was installed and tested at the X33 beamline of the EMBL, at the storage ring DORIS-III (DESY, Hamburg), where it was used by over 50 external groups during 2007. At X33, a throughput of approximately 12 samples per hour, with a failure rate of sample loading of less than 0.5%, was observed. The feedback from users indicates that the ease of use and reliability of the user operation at the beamline were greatly improved compared with the manual filling mode. The changer is controlled by a client-server-based network protocol, locally and remotely. During the testing phase, the changer was operated in an attended mode to assess its reliability and convenience. Full integration with the beamline control software, allowing for automated data collection of all samples loaded into the machine with remote control from the user, is presently being implemented. The approach reported is not limited to synchrotron-based SAXS but can also be used on laboratory and neutron sources.

  12. A fully automated plasma protein precipitation sample preparation method for LC-MS/MS bioanalysis.

    PubMed

    Ma, Ji; Shi, Jianxia; Le, Hoa; Cho, Robert; Huang, Judy Chi-jou; Miao, Shichang; Wong, Bradley K

    2008-02-01

    This report describes the development and validation of a robust robotic system that fully integrates all peripheral devices needed for the automated preparation of plasma samples by protein precipitation. The liquid handling system consisted of a Tecan Freedom EVO 200 liquid handling platform equipped with an 8-channel liquid handling arm, two robotic plate-handling arms, and two plate shakers. Important additional components integrated into the platform were a robotic temperature-controlled centrifuge, a plate sealer, and a plate seal piercing station. These enabled unattended operation starting from a stock solution of the test compound, a set of test plasma samples and associated reagents. The stock solution of the test compound was used to prepare plasma calibration and quality control samples. Once calibration and quality control samples were prepared, precipitation of plasma proteins was achieved by addition of three volumes of acetonitrile. Integration of the peripheral devices allowed automated sequential completion of the centrifugation, plate sealing, piercing and supernatant transferral steps. The method produced a sealed, injection-ready 96-well plate of plasma extracts. Accuracy and precision of the automated system were satisfactory for the intended use: intra-day and the inter-day precision were excellent (C.V.<5%), while the intra-day and inter-day accuracies were acceptable (relative error<8%). The flexibility of the platform was sufficient to accommodate pharmacokinetic studies of different numbers of animals and time points. To the best of our knowledge, this represents the first complete automation of the protein precipitation method for plasma sample analysis.

  13. Automated Neuropsychological Assessment Metrics Version 4 (ANAM4): Select Psychometric Properties and Administration Procedures

    DTIC Science & Technology

    2015-12-01

    Award Number: W81XWH-08-1-0021. Annual report covering 01 Dec 2014 – 30 Nov 2015 on the project "Automated Neuropsychological Assessment Metrics Version 4 (ANAM4): Select Psychometric Properties and Administration Procedures." The Automated Neuropsychological Assessment Metrics (ANAM) is a computer-assisted tool for evaluating neurocognitive performance with demonstrated effectiveness.

  14. An Automated Summarization Assessment Algorithm for Identifying Summarizing Strategies

    PubMed Central

    Abdi, Asad; Idris, Norisma; Alguliyev, Rasim M.; Aliguliyev, Ramiz M.

    2016-01-01

    Background Summarization is a process to select important information from a source text. Summarizing strategies are the core cognitive processes in summarization activity. Since summarization can be important as a tool to improve comprehension, it has attracted the interest of teachers for teaching summary writing through direct instruction. To do this, they need to review and assess the students' summaries, tasks that are very time-consuming. Thus, a computer-assisted assessment can be used to help teachers conduct these tasks more effectively. Design/Results This paper aims to propose an algorithm based on the combination of semantic relations between words and their syntactic composition to identify summarizing strategies employed by students in summary writing. An innovative aspect of our algorithm lies in its ability to identify summarizing strategies at the syntactic and semantic levels. The efficiency of the algorithm is measured in terms of Precision, Recall and F-measure. We then implemented the algorithm in an automated summarization assessment system that can be used to identify the summarizing strategies used by students in summary writing. PMID:26735139
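    The Precision/Recall/F-measure evaluation mentioned above can be sketched in a few lines; this is a generic illustration with invented strategy labels, not the authors' system:

```python
def precision_recall_f1(identified, reference):
    """Score the set of identified summarizing strategies against a
    reference annotation. Generic metric code, not the authors' system."""
    true_pos = len(identified & reference)
    precision = true_pos / len(identified) if identified else 0.0
    recall = true_pos / len(reference) if reference else 0.0
    if precision + recall == 0:
        return precision, recall, 0.0
    return precision, recall, 2 * precision * recall / (precision + recall)

# 3 of 4 identified strategies match the 5 annotated ones (labels invented)
p, r, f = precision_recall_f1(
    {"deletion", "copy-verbatim", "paraphrase", "invention"},
    {"deletion", "copy-verbatim", "paraphrase", "generalization", "topic-sentence"},
)
print(round(p, 2), round(r, 2), round(f, 3))  # 0.75 0.6 0.667
```

    The F-measure rewards systems that balance over-detection (low precision) against missed strategies (low recall).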

  15. An automated system for assessing cognitive function in any environment

    NASA Astrophysics Data System (ADS)

    Wesnes, Keith A.

    2005-05-01

    The Cognitive Drug Research (CDR) computerized assessment system has been in use in worldwide clinical trials for over 20 years. It is a computer-based system which assesses core aspects of human cognitive function including attention, information processing, working memory and long-term memory. It has been extensively validated and can be performed by a wide range of clinical populations including patients with various types of dementia. It is currently in worldwide use in clinical trials to evaluate new medicines, as well as a variety of programs involving the effects of age, stressors, illnesses and trauma upon human cognitive function. Besides being highly sensitive to drugs which impair or improve function, its utility has been maintained over the last two decades by constantly increasing the number of platforms upon which it can operate. Besides notebook versions, the system can be used on a wrist-worn device, a PDA, via the telephone and over the internet. It is the most widely used automated cognitive function assessment system in worldwide clinical research. It has dozens of parallel forms and requires little training to use or administer. The basic development of the system will be outlined, and the huge databases (normative, patient population, drug effects) which have been built up from hundreds of clinical trials will be described. The system is available for use in virtually any environment or type of trial.

  16. Automated semiquantitative direct-current-arc spectrographic analysis of eight argonne premium coal ash samples

    USGS Publications Warehouse

    Skeen, C.J.; Libby, B.J.; Crandell, W.B.

    1990-01-01

    The automated semiquantitative direct-current-arc spectrographic method was used to analyze 62 elements in eight Argonne Premium Coal Ash samples. All eight coal ash samples were analyzed in triplicate to verify precision and accuracy of the method. The precision for most elements was within ±10%. The accuracy of this method is limited to +50% or -33% because of the nature of the standard curves for each of the elements. Adjustments to the computer program were implemented to account for unique matrix interferences in these particular coal ash samples.
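    The asymmetric +50%/-33% accuracy bound is what results if a reported value may lie one geometric step of a factor of 1.5 away from the true concentration, as is conventional for semiquantitative emission spectrography. The factor of 1.5 is an assumption here, not stated in the report; the arithmetic is:

```python
# If the semiquantitative method reports concentrations in geometric steps
# with ratio 1.5 (an assumption about the standard curves, not stated in
# the report), a reported value can be off by at most one step:
step = 1.5
true = 100.0
upper = true * step       # reported one step high -> +50% relative error
lower = true / step       # reported one step low  -> -33% relative error
print((upper - true) / true, round((lower - true) / true, 3))  # 0.5 -0.333
```

    The bounds are asymmetric because a fixed multiplicative factor maps to different relative errors up and down.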

  17. Automated sampling system for the analysis of amino acids using microfluidic capillary electrophoresis.

    PubMed

    Xu, Zhang-Run; Lan, Yue; Fan, Xiao-Feng; Li, Qi

    2009-04-30

    An improved automated continuous sample introduction system for microfluidic capillary electrophoresis (CE) is described. A gear-shaped sample plate was fixed onto the shaft of a step motor. Twenty slotted reservoirs for containing samples and working electrolytes were fabricated on the "gear teeth" of the plate. A single 7.5-cm long Teflon AF-coated silica capillary served as the separation channel, sampling probe, and liquid-core waveguide (LCW) for light transmission. A platinum layer deposited on the capillary tip served as the electrode. Automated continuous sample introduction was achieved by scanning the capillary tip through the slots of the reservoirs. The sample was introduced into the capillary and separated immediately, with only about 2-nL gross sample consumption. The laser-induced fluorescence (LIF) method with the LCW technique was used for detecting fluorescein isothiocyanate (FITC)-labeled amino acids. With an electric-field strength of 320 V/cm for injection and separation, and a 1.0-s sample injection time, a mixture of FITC-labeled arginine and leucine was separated with a throughput of 60/h and a carryover of 2.7%.

  18. Ground Truth Sampling and LANDSAT Accuracy Assessment

    NASA Technical Reports Server (NTRS)

    Robinson, J. W.; Gunther, F. J.; Campbell, W. J.

    1982-01-01

    It is noted that the key factor in any accuracy assessment of remote sensing data is the method used for determining the ground truth, independent of the remote sensing data itself. The sampling and accuracy procedures developed for a nuclear power plant siting study are described. The purpose of the sampling procedure was to provide data for developing supervised classifications for two study sites and for assessing the accuracy of that and other classification procedures. The purpose of the accuracy assessment was to allow the comparison of the cost and accuracy of various classification procedures as applied to various data types.

  19. Development of a Miniature Mass Spectrometer and an Automated Detector for Sampling Explosive Materials

    PubMed Central

    Hashimoto, Yuichiro

    2017-01-01

    The development of a robust ionization source using counter-flow APCI, a miniature mass spectrometer, and an automated sampling system for detecting explosives is described. These developments were made to improve the efficiency of on-site detection in security, environmental, and industrial applications. A development team, including the author, has struggled for nearly 20 years to enhance the robustness and reduce the size of mass spectrometers to meet the requirements needed for on-site applications. This article focuses on the recent results related to the detection of explosive materials, where automated particle sampling using a cyclone concentrator permitted the inspection time to be successfully reduced to 3 s. PMID:28337396

  20. Non-uniform Sampling and J-UNIO Automation for Efficient Protein NMR Structure Determination

    PubMed Central

    Didenko, Tatiana; Proudfoot, Andrew; Dutta, Samit Kumar; Serrano, Pedro; Wüthrich, Kurt

    2015-01-01

    High-resolution structure determination of small proteins in solution is one of the big assets of NMR spectroscopy in structural biology. Improvements in efficiency of NMR structure determination by advances in NMR experiments and automation of data handling therefore attract continued interest. Here, non-uniform sampling (NUS) of 3D heteronuclear-resolved [1H,1H]-NOESY data yielded two- to three-fold savings of instrument time for structure determinations of soluble proteins. With the 152-residue protein NP_372339.1 from Staphylococcus aureus and the 71-residue protein NP_346341.1 from Streptococcus pneumoniae we show that high-quality structures can be obtained with NUS NMR data, which are equally well amenable to robust automated analysis as the corresponding uniformly sampled data. PMID:26227870
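    The idea of non-uniform sampling can be illustrated with a toy schedule generator that records only a biased random subset of the indirect-dimension increments. This is a generic sketch (exponential bias toward early increments, where the signal is strongest, is a common choice); it is not the schedule actually used in the study:

```python
import math
import random

def nus_schedule(n_increments, fraction, decay=2.0, seed=7):
    """Pick a random, exponentially biased subset of indirect-dimension
    increments to record. Generic NUS illustration, not the study's schedule."""
    rng = random.Random(seed)
    weights = [math.exp(-decay * i / n_increments) for i in range(n_increments)]
    target = int(n_increments * fraction)
    chosen = set()
    while len(chosen) < target:
        chosen.add(rng.choices(range(n_increments), weights=weights)[0])
    return sorted(chosen)

# Recording one third of 128 increments, i.e. roughly the two- to
# three-fold time saving reported above.
sched = nus_schedule(128, 1 / 3)
print(len(sched))  # 42
```

    Reconstruction of the full spectrum from such a schedule is then handled by dedicated NUS processing software.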

  1. Development of automated preparation system for isotopocule analysis of N2O in various air samples

    NASA Astrophysics Data System (ADS)

    Toyoda, Sakae; Yoshida, Naohiro

    2016-05-01

    Nitrous oxide (N2O), an increasingly abundant greenhouse gas in the atmosphere, is the most important stratospheric ozone-depleting gas of this century. Natural abundance ratios of isotopocules of N2O, NNO molecules substituted with stable isotopes of nitrogen and oxygen, are a promising index of various sources or production pathways of N2O and of its sink or decomposition pathways. Several automated methods have been reported to improve the analytical precision for the isotopocule ratio of atmospheric N2O and to reduce the labor necessary for complicated sample preparation procedures related to mass spectrometric analysis. However, no method accommodates flask samples with limited volume or pressure. Here we present an automated preconcentration system which offers flexibility with respect to the available gas volume, pressure, and N2O concentration. The shortest processing time for a single analysis of typical atmospheric sample is 40 min. Precision values of isotopocule ratio analysis are < 0.1 ‰ for δ15Nbulk (average abundances of 14N15N16O and 15N14N16O relative to 14N14N16O), < 0.2 ‰ for δ18O (relative abundance of 14N14N18O), and < 0.5 ‰ for site preference (SP; difference between relative abundance of 14N15N16O and 15N14N16O). This precision is comparable to that of other automated systems, but better than that of our previously reported manual measurement system.
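    The delta notation used above follows the standard definition of an isotope ratio expressed relative to a reference standard; a minimal sketch (standard formulas, not code from the paper; the example ratios are hypothetical):

```python
def delta_permil(r_sample, r_standard):
    """Delta value in per mil: relative deviation of a sample's isotope
    ratio from that of a reference standard (standard definition)."""
    return (r_sample / r_standard - 1.0) * 1000.0

def site_preference(d15n_alpha, d15n_beta):
    """SP: difference between the delta values of the central (alpha) and
    terminal (beta) nitrogen sites of the N2O molecule."""
    return d15n_alpha - d15n_beta

# Hypothetical 15N/14N ratio for a sample, against the atmospheric N2
# standard ratio of 0.0036765
print(round(delta_permil(0.0036790, 0.0036765), 2))  # 0.68
```

    Achieving the quoted precision of < 0.1 ‰ for δ15Nbulk thus requires resolving ratio differences of a few parts in 10^5.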

  2. ALARA ASSESSMENT OF SETTLER SLUDGE SAMPLING METHODS

    SciTech Connect

    NELSEN LA

    2009-01-30

    The purpose of this assessment is to compare underwater and above-water settler sludge sampling methods to determine if the added cost of underwater sampling for the sole purpose of worker dose reduction is justified. Initial planning for sludge sampling included container, settler and knock-out-pot (KOP) sampling. Due to the significantly higher dose consequence of KOP sludge, a decision was made to sample KOP underwater to achieve worker dose reductions. Additionally, initial plans were to utilize the underwater sampling apparatus for settler sludge. Since there are no longer plans to sample KOP sludge, the decision for underwater sampling of settler sludge needs to be revisited. The present sampling plan calls for spending an estimated $2,500,000 to design and construct a new underwater sampling system (per A21 C-PL-001 RevOE). This evaluation compares the present method of above-water sampling with the underwater method planned by the Sludge Treatment Project (STP) and determines whether settler samples can be taken using the existing sampling cart (with potentially minor modifications) while keeping worker doses As Low As Reasonably Achievable (ALARA), eliminating the need for costly redesigns, testing and personnel retraining.

  3. An instrument for automated purification of nucleic acids from contaminated forensic samples.

    PubMed

    Broemeling, David J; Pel, Joel; Gunn, Dylan C; Mai, Laura; Thompson, Jason D; Poon, Hiron; Marziali, Andre

    2008-02-01

    Forensic crime scene sample analysis, by its nature, often deals with samples in which there are low amounts of nucleic acids, on substrates that often lead to inhibition of subsequent enzymatic reactions such as PCR amplification for STR profiling. Common substrates include denim from blue jeans, which yields indigo dye as a PCR inhibitor, and soil, which yields humic substances as inhibitors. These inhibitors frequently co-extract with nucleic acids in standard column or bead-based preps, leading to frequent failure of STR profiling. We present a novel instrument for DNA purification of forensic samples that is capable of highly effective concentration of nucleic acids from soil particulates, fabric, and other complex samples including solid components. The novel concentration process, known as SCODA, is inherently selective for long charged polymers such as DNA, and therefore is able to effectively reject known contaminants. We present an automated sample preparation instrument based on this process, and preliminary results based on mock forensic samples.

  4. Automated mango fruit assessment using fuzzy logic approach

    NASA Astrophysics Data System (ADS)

    Hasan, Suzanawati Abu; Kin, Teoh Yeong; Sauddin@Sa'duddin, Suraiya; Aziz, Azlan Abdul; Othman, Mahmod; Mansor, Ab Razak; Parnabas, Vincent

    2014-06-01

    In terms of value and volume of production, mango is the third most important fruit product after pineapple and banana. Accurate size assessment of mango fruits during harvesting is vital to ensure that they are classified into the appropriate grade. However, the current practice in the mango industry is to grade the fruit manually using human graders. This method is inconsistent, inefficient and labor intensive. In this project, a new method of automated mango size and grade assessment is developed using an RGB fiber optic sensor and a fuzzy logic approach. Maximum, minimum and mean values are calculated from the RGB fiber optic sensor readings, and a decision-making scheme based on minimum entropy formulation is developed to analyse the data and classify the mango fruit. The proposed method is capable of differentiating three different grades of mango fruit automatically, with 77.78% overall accuracy compared to sorting by human graders. This method was found to be helpful for application in the current agricultural industry.
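    A fuzzy grading step of this general kind can be sketched with triangular membership functions. All membership boundaries below are invented for illustration; the paper's actual grading rules and the minimum-entropy formulation are not reproduced here:

```python
def tri(x, a, b, c):
    """Triangular fuzzy membership: rises from a to peak at b, falls to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def grade(size_mm):
    """Assign a mango grade by the highest membership value.
    Size ranges are hypothetical, not from the paper."""
    memberships = {
        "C": tri(size_mm, 60, 75, 90),
        "B": tri(size_mm, 80, 95, 110),
        "A": tri(size_mm, 100, 115, 130),
    }
    return max(memberships, key=memberships.get), memberships

g, m = grade(98)
print(g)  # B
```

    Overlapping memberships let borderline fruit carry partial degrees of more than one grade before the final decision is taken.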

  5. An automated system for performance assessment of airport lighting

    NASA Astrophysics Data System (ADS)

    Niblock, James; Peng, Jian-Xun; McMenemy, Karen; Irwin, George

    2008-02-01

    This paper presents work undertaken on the development of an automated air-based vision system for assessing the performance of an approach lighting system (ALS) installation in accordance with International Civil Aviation Organisation (ICAO) standards. The measuring device consists of an image sensor with associated lens system fitted to the interior of an aircraft. The vision system is capable of capturing sequences of airport lighting images during a normal approach to the airport. These images are then processed to determine the uniformity of the ALS. To assess the uniformity of the ALS the luminaires must first be uniquely identified and tracked through an image sequence. A model-based matching technique is utilised which uses a camera projection system to match a set of template data to the extracted image data. From the matching results the associated position and pose of the camera is estimated. Each luminaire emits an intensity which is dependent on its angular displacement from the camera. As such, it is possible to predict the intensity that each luminaire within the ALS emits during an approach. Luminaires emitting the same intensity are banded together for the uniformity analysis. Uniformity assumes that luminaires in close proximity exhibit similar luminous intensity characteristics. During a typical approach, grouping information is obtained for the various sectors of luminaires. This grouping information is used to compare luminaires against one another in terms of their extracted grey level information. The developed software is validated using data acquired during an actual approach to a UK airport.

  6. Automated assessment of bradykinesia and dyskinesia in Parkinson's disease.

    PubMed

    Griffiths, Robert I; Kotschet, Katya; Arfon, Sian; Xu, Zheng Ming; Johnson, William; Drago, John; Evans, Andrew; Kempster, Peter; Raghav, Sanjay; Horne, Malcolm K

    2012-01-01

    There is a need for objective measures of dyskinesia and bradykinesia of Parkinson's disease (PD) that are continuous throughout the day and related to levodopa dosing. The output of an algorithm that calculates dyskinesia and bradykinesia scores every two minutes over 10 days (PKG: Global Kinetics Corporation) was compared with conventional rating scales for PD in PD subjects. The algorithm recognises bradykinesia as movements made with lower acceleration and amplitude and with longer intervals between movements. Similarly, the algorithm recognises dyskinesia as movements of normal amplitude and acceleration but with shorter periods without movement. The distribution of the bradykinesia and dyskinesia scores from PD subjects differed from that of normal subjects. The algorithm predicted the clinical dyskinesia rating scale AIMS with a 95% margin of error of 3.2 units, compared with the inter-rater 95% limits of agreement from 3 neurologists of -3.4 to +4.3 units. Similarly, the algorithm predicted the UPDRS III score with a margin of error similar to the inter-rater limits of agreement. Improvement in scores in response to changes in medication could be assessed statistically in individual patients. This algorithm provides objective, continuous and automated assessment of the clinical features of bradykinesia and dyskinesia in PD.
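    The amplitude/interval logic the algorithm relies on can be mimicked with a toy epoch scorer. This is NOT the proprietary PKG algorithm; the threshold, sampling rate and signals below are invented for illustration:

```python
import numpy as np

def epoch_scores(accel, fs, epoch_s=120):
    """Toy per-epoch movement summary (NOT the proprietary PKG algorithm):
    for each 2-minute epoch, report the mean amplitude of supra-threshold
    movement and the mean interval between movement samples."""
    n = int(fs * epoch_s)
    scores = []
    for start in range(0, len(accel) - n + 1, n):
        seg = np.abs(accel[start:start + n])
        moving = seg > 0.1          # hypothetical movement threshold (g)
        amplitude = seg[moving].mean() if moving.any() else 0.0
        idx = np.flatnonzero(moving)
        interval = np.diff(idx).mean() / fs if idx.size > 1 else float(epoch_s)
        scores.append((amplitude, interval))
    return scores

# Bradykinesia-like (low amplitude, sparse) vs. normal movement signals
rng = np.random.default_rng(0)
fs = 100                                  # Hz; 12000 samples = one epoch
quiet = rng.normal(0, 0.05, 12000)
active = rng.normal(0, 0.5, 12000)
(amp_q, int_q), = epoch_scores(quiet, fs)
(amp_a, int_a), = epoch_scores(active, fs)
print(amp_a > amp_q, int_q > int_a)  # True True
```

    Lower amplitude together with longer inter-movement intervals is exactly the bradykinesia signature the abstract describes.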

  7. Using automated continual performance assessment to improve health care.

    PubMed

    Wulff, K R; Westphal, J R; Shray, S L; Hunkeler, E F

    1997-01-01

    Inefficiency in the work of health care providers is evident and contributes to health care costs. In the early 20th century, industrial engineers developed scientific methods for studying work to improve performance (efficiency) by measuring results--i.e., quality, cost, and productivity. In the mid-20th century, business managers developed ways to apply these methods to improve the work process. These scientific methods and management approaches can be applied to improving medical work. Fee-for-service practice has had incentives to maximize productivity, and prepaid practice has had incentives to minimize costs, but no sector of the health care system has systematically pursued the optimization of all performance variables: quality, cost, and productivity. We have reviewed evolving methods for the automation of continual assessment of performance in health care using touch screen and computer telephone, logging and scheduling software, appropriate combinations of generic or disease-specific health status questionnaires, physiologic measurements or laboratory assays from computerized records, and cost and productivity data from computerized registration logs. We propose that the results of outcome assessment be rapidly and continually transmitted to providers, patients, and managers so that health care processes can be progressively improved. The evolving systems we have described are the practical tools that can help us achieve our performance goals.

  8. Automated Prediction of Catalytic Mechanism and Rate Law Using Graph-Based Reaction Path Sampling.

    PubMed

    Habershon, Scott

    2016-04-12

    In a recent article [J. Chem. Phys. 2015, 143, 094106], we introduced a novel graph-based sampling scheme which can be used to generate chemical reaction paths in many-atom systems in an efficient and highly automated manner. The main goal of this work is to demonstrate how this approach, when combined with direct kinetic modeling, can be used to determine the mechanism and phenomenological rate law of a complex catalytic cycle, namely cobalt-catalyzed hydroformylation of ethene. Our graph-based sampling scheme generates 31 unique chemical products and 32 unique chemical reaction pathways; these sampled structures and reaction paths enable automated construction of a kinetic network model of the catalytic system when combined with density functional theory (DFT) calculations of free energies and resultant transition-state theory rate constants. Direct simulations of this kinetic network across a range of initial reactant concentrations enables determination of both the reaction mechanism and the associated rate law in an automated fashion, without the need for either presupposing a mechanism or making steady-state approximations in kinetic analysis. Most importantly, we find that the reaction mechanism which emerges from these simulations is exactly that originally proposed by Heck and Breslow; furthermore, the simulated rate law is also consistent with previous experimental and computational studies, exhibiting a complex dependence on carbon monoxide pressure. While the inherent errors of using DFT simulations to model chemical reactivity limit the quantitative accuracy of our calculated rates, this work confirms that our automated simulation strategy enables direct analysis of catalytic mechanisms from first principles.
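    The "direct kinetic modeling" step, integrating the sampled reaction network rather than presupposing a rate law, can be sketched with a toy two-step catalytic cycle. The species, rate constants and Euler integration below are invented for illustration, not taken from the paper:

```python
def simulate(k1, k2, a0, cat0, dt=1e-3, steps=20000):
    """Euler-integrate mass-action kinetics for a toy two-step cycle:
    A + Cat -> Int (k1), Int -> Product + Cat (k2). All constants invented."""
    a, cat, inter, prod = a0, cat0, 0.0, 0.0
    for _ in range(steps):
        r1 = k1 * a * cat
        r2 = k2 * inter
        a += -r1 * dt
        cat += (-r1 + r2) * dt
        inter += (r1 - r2) * dt
        prod += r2 * dt
    return prod

# Reading an empirical order from simulation instead of presupposing it:
# doubling the catalyst loading roughly doubles turnover in this toy model.
p1 = simulate(1.0, 5.0, a0=1.0, cat0=0.01)
p2 = simulate(1.0, 5.0, a0=1.0, cat0=0.02)
print(1.8 < p2 / p1 < 2.1)  # True
```

    Repeating such simulations across a grid of initial concentrations is how an effective rate law can be read off without steady-state approximations.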

  9. RoboDiff: combining a sample changer and goniometer for highly automated macromolecular crystallography experiments

    PubMed Central

    Nurizzo, Didier; Bowler, Matthew W.; Caserotto, Hugo; Dobias, Fabien; Giraud, Thierry; Surr, John; Guichard, Nicolas; Papp, Gergely; Guijarro, Matias; Mueller-Dieckmann, Christoph; Flot, David; McSweeney, Sean; Cipriani, Florent; Theveneau, Pascal; Leonard, Gordon A.

    2016-01-01

    Automation of the mounting of cryocooled samples is now a feature of the majority of beamlines dedicated to macromolecular crystallography (MX). Robotic sample changers have been developed over many years, with the latest designs increasing capacity, reliability and speed. Here, the development of a new sample changer deployed at the ESRF beamline MASSIF-1 (ID30A-1), based on an industrial six-axis robot, is described. The device, named RoboDiff, includes a high-capacity dewar, acts as both a sample changer and a high-accuracy goniometer, and has been designed for completely unattended sample mounting and diffraction data collection. This aim has been achieved using a high level of diagnostics at all steps of the process from mounting and characterization to data collection. The RoboDiff has been in service on the fully automated endstation MASSIF-1 at the ESRF since September 2014 and, at the time of writing, has processed more than 20 000 samples completely automatically. PMID:27487827

  10. Electrochemical pesticide detection with AutoDip--a portable platform for automation of crude sample analyses.

    PubMed

    Drechsel, Lisa; Schulz, Martin; von Stetten, Felix; Moldovan, Carmen; Zengerle, Roland; Paust, Nils

    2015-02-07

    Lab-on-a-chip devices hold promise for automation of complex workflows from sample to answer with minimal consumption of reagents in portable devices. However, complex, inhomogeneous samples as they occur in environmental or food analysis may block microchannels and thus often cause malfunction of the system. Here we present the novel AutoDip platform which is based on the movement of a solid phase through the reagents and sample instead of transporting a sequence of reagents through a fixed solid phase. A ball-pen mechanism operated by an external actuator automates unit operations such as incubation and washing by consecutively dipping the solid phase into the corresponding liquids. The platform is applied to electrochemical detection of organophosphorus pesticides in real food samples using an acetylcholinesterase (AChE) biosensor. Minimal sample preparation and an integrated reagent pre-storage module hold promise for easy handling of the assay. Detection of the pesticide chlorpyrifos-oxon (CPO) spiked into apple samples at concentrations of 10^-7 M has been demonstrated. This concentration is below the maximum residue level for chlorpyrifos in apples defined by the European Commission.

  11. Automated combustion accelerator mass spectrometry for the analysis of biomedical samples in the low attomole range.

    PubMed

    van Duijn, Esther; Sandman, Hugo; Grossouw, Dimitri; Mocking, Johannes A J; Coulier, Leon; Vaes, Wouter H J

    2014-08-05

    The increasing role of accelerator mass spectrometry (AMS) in biomedical research necessitates modernization of the traditional sample handling process. AMS was originally developed and used for carbon dating, therefore focusing on a very high precision but with a comparably low sample throughput. Here, we describe the combination of automated sample combustion with an elemental analyzer (EA) online coupled to an AMS via a dedicated interface. This setup allows direct radiocarbon measurements for over 70 samples daily by AMS. No sample processing is required apart from the pipetting of the sample into a tin foil cup, which is placed in the carousel of the EA. In our system, up to 200 AMS analyses are performed automatically without the need for manual interventions. We present results on the direct total 14C count measurements in <2 μL human plasma samples. The method shows linearity over a range of 0.65-821 mBq/mL, with a lower limit of quantification of 0.65 mBq/mL (corresponding to 0.67 amol for acetaminophen). At these extremely low levels of activity, it becomes important to quantify plasma specific carbon percentages. This carbon percentage is automatically generated upon combustion of a sample on the EA. Apparent advantages of the present approach include complete omission of sample preparation (reduced hands-on time) and fully automated sample analysis. These improvements clearly stimulate the standard incorporation of microtracer research in the drug development process. In combination with the particularly low sample volumes required and extreme sensitivity, AMS strongly improves its position as a bioanalysis method.
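    The relation between a measured activity and the number of radiocarbon atoms behind it follows from the standard decay law A = λN. As a rough order-of-magnitude check (textbook arithmetic, independent of the paper; the amol figure quoted above additionally depends on the compound's labeling):

```python
import math

# Half-life of 14C: 5730 years, converted to seconds
T_HALF_14C_S = 5730 * 365.25 * 24 * 3600
lam = math.log(2) / T_HALF_14C_S          # decay constant, 1/s

def atoms_from_activity(bq):
    """Number of 14C atoms N implied by an activity A via A = lambda * N."""
    return bq / lam

# 0.65 mBq of 14C corresponds to roughly 1.7e8 atoms
print(f"{atoms_from_activity(0.65e-3):.2e}")  # 1.70e+08
```

    The very long half-life of 14C is why even attomole-scale quantities still produce a countable decay signal.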

  12. Automation of Workplace Lifting Hazard Assessment for Musculoskeletal Injury Prevention

    PubMed Central

    2014-01-01

    posture and temporal elements of tasks such as task frequency in an automated fashion, although these findings should be confirmed in a larger study. Further work is needed to incorporate force assessments and address workplace feasibility challenges. We anticipate that this approach could ultimately be used to perform large-scale musculoskeletal exposure assessment not only for research but also to provide real-time feedback to workers and employers during work method improvement activities and employee training. PMID:24987523

  13. Assessment of organic matter resistance to biodegradation in volcanic ash soils assisted by automated interpretation of infrared spectra from humic acid and whole soil samples by using partial least squares

    NASA Astrophysics Data System (ADS)

    Hernández, Zulimar; Pérez Trujillo, Juan Pedro; Hernández-Hernández, Sergio Alexander; Almendros, Gonzalo; Sanz, Jesús

    2014-05-01

    From a practical viewpoint, the most interesting possibilities of applying infrared (IR) spectroscopy to soil studies lie in processing IR spectra of whole-soil (WS) samples [1] in order to forecast functional descriptors at high organizational levels of the soil system, such as soil C resilience. Currently, there is a discussion on whether the resistance to biodegradation of soil organic matter (SOM) depends on its molecular composition or on environmental interactions between SOM and mineral components, as could be the case with physical encapsulation of particulate SOM or organo-mineral derivatives, e.g., those formed with amorphous oxides [2]. A set of about 200 dependent variables from WS and isolated, ash-free humic acids (HA) [3] was obtained in 30 volcanic ash soils from Tenerife Island (Spain). Soil biogeochemical properties such as SOM, allophane (Alo + 1/2 Feo), total mineralization coefficient (TMC) and aggregate stability were determined in WS. In addition, structural information on SOM was obtained from the isolated HA fractions by visible spectroscopy and analytical pyrolysis (Py-GC/MS). Aiming to explore the potential of partial least squares regression (PLS) in forecasting the soil dependent variables exclusively from the information extracted from the WS and HA IR spectral profiles, the data were processed using the ParLeS [4] and Unscrambler programs. Data pre-treatments should be carefully chosen: the most significant PLS models from IR spectra of HA were obtained after second-derivative pre-treatment, which prevented the effects of the intrinsically broadband spectral profiles typical of heterogeneous macromolecular material such as HA. Conversely, when using IR spectra of WS, the best forecasting models were obtained using linear baseline correction and maximum normalization pre-treatment.
With WS spectra, the most successful prediction models were obtained for SOM, magnetite, allophane, aggregate stability, clay and total aromatic compounds, whereas the PLS
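    A PLS regression of the kind described can be sketched from scratch as bare-bones single-response NIPALS on synthetic "spectra". The data below are invented; the study itself used the ParLeS and Unscrambler implementations:

```python
import numpy as np

def pls1(X, y, n_comp=2):
    """Bare-bones PLS1 (NIPALS): regress one soil property on spectra.
    Illustrative sketch only, not a production implementation."""
    Xk = X - X.mean(axis=0)
    yk = y - y.mean()
    W, P, q = [], [], []
    for _ in range(n_comp):
        w = Xk.T @ yk
        w /= np.linalg.norm(w)
        t = Xk @ w
        tt = t @ t
        p = Xk.T @ t / tt
        W.append(w); P.append(p); q.append(yk @ t / tt)
        Xk = Xk - np.outer(t, p)      # deflate X
        yk = yk - q[-1] * t           # deflate y
    W, P, q = np.array(W).T, np.array(P).T, np.array(q)
    coef = W @ np.linalg.solve(P.T @ W, q)
    intercept = y.mean() - X.mean(axis=0) @ coef
    return coef, intercept

# Toy "spectra": 100 samples x 50 wavenumbers; the target property depends
# on two hidden absorption bands plus noise (all values invented).
rng = np.random.default_rng(1)
X = rng.normal(size=(100, 50))
y = 3.0 * X[:, 5] - 2.0 * X[:, 30] + rng.normal(0, 0.1, 100)
coef, b0 = pls1(X, y, n_comp=2)
pred = X @ coef + b0
r2 = 1 - ((y - pred) ** 2).sum() / ((y - y.mean()) ** 2).sum()
print(r2 > 0.9)  # True
```

    Pre-treatments such as second derivatives or baseline correction would be applied to X before fitting, which is exactly where the choices discussed in the abstract enter.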

  14. Automated sample preparation facilitated by PhyNexus MEA purification system for oligosaccharide mapping of glycoproteins.

    PubMed

    Prater, Bradley D; Anumula, Kalyan R; Hutchins, Jeff T

    2007-10-15

    A reproducible high-throughput sample cleanup method for fluorescent oligosaccharide mapping of glycoproteins is described. Oligosaccharides are released from glycoproteins using PNGase F and labeled with 2-aminobenzoic acid (anthranilic acid, AA). A PhyNexus MEA system was adapted for automated isolation of the fluorescently labeled oligosaccharides from the reaction mixture prior to mapping by HPLC. The oligosaccharide purification uses a normal-phase polyamide resin (DPA-6S) in custom-made pipette tips. The resin volume, wash, and elution steps involved were optimized to obtain high recovery of oligosaccharides with the least amount of contaminating free fluorescent dye in the shortest amount of time. The automated protocol for sample cleanup eliminated all manual manipulations with a recycle time of 23 min. We have reduced the amount of excess AA by 150-fold, allowing quantitative oligosaccharide mapping from as little as 500 ng digested recombinant immunoglobulin G (rIgG). This low sample requirement allows early selection of a cell line with desired characteristics (e.g., oligosaccharide profile and high specific productivity) for the production of glycoprotein drugs. In addition, the use of Tecan or another robotic platform in conjunction with this method should allow the cleanup of 96 samples in 23 min, a significant decrease in the amount of time currently required to process such a large number of samples.

  15. How to assess driver's interaction with partially automated driving systems - A framework for early concept assessment.

    PubMed

    van den Beukel, Arie P; van der Voort, Mascha C

    2017-03-01

    The introduction of partially automated driving systems changes the driving task into supervising the automation with an occasional need to intervene. To develop interface solutions that adequately support drivers in this new role, this study proposes and evaluates an assessment framework that allows designers to evaluate driver-support within relevant real-world scenarios. Aspects identified as requiring assessment in terms of driver-support within the proposed framework are Accident Avoidance, gained Situation Awareness (SA) and Concept Acceptance. Measurement techniques selected to operationalise these aspects and the associated framework are pilot-tested with twenty-four participants in a driving simulator experiment. The objective of the test is to determine the reliability of the applied measurements for the assessment of the framework and whether the proposed framework is effective in predicting the level of support offered by the concepts. Based on the congruency between measurement scores produced in the test and scores with predefined differences in concept-support, this study demonstrates the framework's reliability. A remaining concern is the framework's weak sensitivity to small differences in offered support. The article concludes that applying the framework is especially advantageous for evaluating early design phases and can successfully contribute to the efficient development of safe means of operating partially automated vehicles that keep drivers in control.

  16. Automated biphasic morphological assessment of hepatitis B-related liver fibrosis using second harmonic generation microscopy

    NASA Astrophysics Data System (ADS)

    Wang, Tong-Hong; Chen, Tse-Ching; Teng, Xiao; Liang, Kung-Hao; Yeh, Chau-Ting

    2015-08-01

    Liver fibrosis assessment by biopsy and conventional staining scores is based on histopathological criteria. Variations in sample preparation and the use of semi-quantitative histopathological methods commonly result in discrepancies between medical centers. Thus, minor changes in liver fibrosis might be overlooked in multi-center clinical trials, leading to statistically non-significant data. Here, we developed a computer-assisted, fully automated, staining-free method for hepatitis B-related liver fibrosis assessment. In total, 175 liver biopsies were divided into training (n = 105) and verification (n = 70) cohorts. Collagen was observed using second harmonic generation (SHG) microscopy without prior staining, and hepatocyte morphology was recorded using two-photon excitation fluorescence (TPEF) microscopy. The training cohort was utilized to establish a quantification algorithm. Eleven of 19 computer-recognizable SHG/TPEF microscopic morphological features were significantly correlated with the ISHAK fibrosis stages (P < 0.001). A biphasic scoring method was applied, combining support vector machine and multivariate generalized linear models to assess the early and late stages of fibrosis, respectively, based on these parameters. The verification cohort was used to verify the scoring method, and the area under the receiver operating characteristic curve was >0.82 for liver cirrhosis detection. Since no subjective gradings are needed, interobserver discrepancies could be avoided using this fully automated method.
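The biphasic idea above (one model routes a sample to an early- or late-stage scorer) can be illustrated with a toy sketch. This is not the authors' code: the linear "router" standing in for the trained SVM, the stage-specific linear models standing in for the GLMs, and all weights are invented for illustration.

```python
# Illustrative sketch of a "biphasic" scorer: a router decides early vs. late
# phase, then a phase-specific linear model predicts the fibrosis stage.
# All weights below are toy values, not the trained models from the paper.

def dot(w, x):
    return sum(wi * xi for wi, xi in zip(w, x))

# Hypothetical linear decision boundary standing in for the trained SVM:
# positive score -> "late" phase, negative -> "early" phase.
ROUTER_W = [0.8, 1.2, -0.5]
ROUTER_B = -1.0

# Hypothetical phase-specific linear models (stand-ins for the GLMs).
EARLY_W, EARLY_B = [0.4, 0.3, 0.1], 0.5   # maps features to low Ishak stages
LATE_W, LATE_B = [0.9, 0.7, 0.2], 2.5     # maps features to high Ishak stages

def biphasic_score(features):
    """Return (phase, predicted_stage) for one sample's SHG/TPEF features."""
    phase = "late" if dot(ROUTER_W, features) + ROUTER_B > 0 else "early"
    if phase == "early":
        stage = dot(EARLY_W, features) + EARLY_B
    else:
        stage = dot(LATE_W, features) + LATE_B
    return phase, stage

phase, stage = biphasic_score([2.0, 2.0, 2.0])  # clearly "late-phase" toy features
```

The design point is the routing step: early and late fibrosis express different morphological features, so a single regression over all stages fits worse than two specialised models behind a classifier.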

  17. Automated, Ultra-Sterile Solid Sample Handling and Analysis on a Chip

    NASA Technical Reports Server (NTRS)

    Mora, Maria F.; Stockton, Amanda M.; Willis, Peter A.

    2013-01-01

    There are no existing ultra-sterile lab-on-a-chip systems that can accept solid samples and perform complete chemical analyses without human intervention. The proposed solution is to demonstrate completely automated lab-on-a-chip manipulation of powdered solid samples, followed by on-chip liquid extraction and chemical analysis. This technology utilizes a newly invented glass micro-device for solid manipulation, which mates with existing lab-on-a-chip instrumentation. Devices are fabricated in a Class 10 cleanroom at the JPL MicroDevices Lab, and are plasma-cleaned before and after assembly. Solid samples enter the device through a drilled hole in the top. Existing micro-pumping technology is used to transfer milligrams of powdered sample into an extraction chamber where it is mixed with liquids to extract organic material. Subsequent chemical analysis is performed using portable microchip capillary electrophoresis systems (CE). These instruments have been used for ultra-highly sensitive (parts-per-trillion, pptr) analysis of organic compounds including amines, amino acids, aldehydes, ketones, carboxylic acids, and thiols. Fully autonomous amino acid analyses in liquids were demonstrated; however, to date there have been no reports of completely automated analysis of solid samples on chip. This approach utilizes an existing portable instrument that houses optics, high-voltage power supplies, and solenoids for fully autonomous microfluidic sample processing and CE analysis with laser-induced fluorescence (LIF) detection. Furthermore, the entire system can be sterilized and placed in a cleanroom environment for analyzing samples returned from extraterrestrial targets, if desired. This is an entirely new capability never demonstrated before. 
The ability to manipulate solid samples, coupled with lab-on-a-chip analysis technology, will enable ultraclean and ultrasensitive end-to-end analysis of samples that is orders of magnitude more sensitive than the ppb goal given

  18. Neurodegenerative changes in Alzheimer's disease: a comparative study of manual, semi-automated, and fully automated assessment using MRI

    NASA Astrophysics Data System (ADS)

    Fritzsche, Klaus H.; Giesel, Frederik L.; Heimann, Tobias; Thomann, Philipp A.; Hahn, Horst K.; Pantel, Johannes; Schröder, Johannes; Essig, Marco; Meinzer, Hans-Peter

    2008-03-01

    Objective quantification of disease-specific neurodegenerative changes can facilitate diagnosis and therapeutic monitoring in several neuropsychiatric disorders. Reproducibility and easy-to-perform assessment are essential to ensure applicability in clinical environments. The aim of this comparative study is to evaluate a fully automated approach that assesses atrophic changes in Alzheimer's disease (AD) and Mild Cognitive Impairment (MCI). 21 healthy volunteers (mean age 66.2), 21 patients with MCI (66.6), and 10 patients with AD (65.1) were enrolled. Subjects underwent extensive neuropsychological testing, and MRI was conducted on a 1.5 Tesla clinical scanner. Atrophic changes were measured automatically by a series of image processing steps including state-of-the-art brain mapping techniques. Results were compared with two reference approaches: a manual segmentation of the hippocampal formation and a semi-automated estimation of temporal horn volume, which is based upon interactive selection of two to six landmarks in the ventricular system. All approaches separated controls and AD patients significantly (10^-5 < p < 10^-4) and showed a slight but not significant increase of neurodegeneration for subjects with MCI compared to volunteers. The automated approach correlated significantly with the manual (r = -0.65, p < 10^-6) and semi-automated (r = -0.83, p < 10^-13) measurements. It achieved high accuracy while maximizing observer independence and reducing assessment time, and is thus well suited for clinical routine.

  19. Application of existing technology to meet increasing demands for automated sample handling.

    PubMed

    Chow, A T; Kegelman, J E; Kohli, C; McCabe, D D; Moore, J F

    1990-09-01

    As the clinical laboratory advances toward total automation, the marketplace is demanding more-efficient sample-handling systems. These demands have arisen over a relatively short period of time, in part because of heightened concern over laboratory safety and the resulting manpower shortages. Adding sample-handling capabilities to existing instrumentation is often a challenge because mechanical or system constraints frequently interfere. This challenge has been overcome in the DuPont Sample Management System (SMS), a second-generation general chemistry analyzer that incorporates the latest barcode and computer-interfacing technology. The development of the SMS system relied heavily on recent advances in technology, e.g., software modeling and computer-aided design. The SMS system includes a barcode scanner based on charge-coupled device (CCD) technology, a random-access sample wheel, and new software that oversees the various functions.

  20. Device and method for automated separation of a sample of whole blood into aliquots

    DOEpatents

    Burtis, Carl A.; Johnson, Wayne F.

    1989-01-01

    A device and a method for automated processing and separation of an unmeasured sample of whole blood into multiple aliquots of plasma. Capillaries are radially oriented on a rotor, with the rotor defining a sample chamber, transfer channels, overflow chamber, overflow channel, vent channel, cell chambers, and processing chambers. A sample of whole blood is placed in the sample chamber, and when the rotor is rotated, the blood moves outward through the transfer channels to the processing chambers where the blood is centrifugally separated into a solid cellular component and a liquid plasma component. When the rotor speed is decreased, the plasma component backfills the capillaries resulting in uniform aliquots of plasma which may be used for subsequent analytical procedures.

  1. Direct determination of selenium in serum by electrothermal atomic absorption spectrometry using automated ultrasonic slurry sampling

    NASA Astrophysics Data System (ADS)

    Chen, Wen-Kang; Yen, Cheng-Chieh; Wei, Bai-Luh; Hu, Chao-Chin; Yu, Jya-Jyun; Chung, Chien; Kuo, Sheng-Chu

    1998-01-01

    Selenium concentration in body fluids is a good index of human selenium status. This work discusses the determination of selenium in serum by ETAAS using longitudinal Zeeman-effect background correction combined with automated slurry sampling. The standard reference materials bovine serum (NIST, SRM 1598) and a second-generation biological freeze-dried human serum are analyzed to verify the accuracy and precision of the technique. The direct method proposed in this study is used for the determination of selenium in human serum collected from healthy people aged 19-25 years. The average accuracy values for certified reference serum samples and the recovery values for spiked samples indicate that this method is an efficient and rapid technique for determining selenium in biological samples.

  2. AUTOMATED GEOSPATIAL WATERSHED ASSESSMENT (AGWA): A GIS-BASED HYDROLOGICAL MODELING TOOL FOR WATERSHED ASSESSMENT AND ANALYSIS

    EPA Science Inventory

    The Automated Geospatial Watershed Assessment tool (AGWA) is a GIS interface jointly developed by the USDA Agricultural Research Service, the U.S. Environmental Protection Agency, the University of Arizona, and the University of Wyoming to automate the parameterization and execut...

  3. AUTOMATED GEOSPATIAL WATERSHED ASSESSMENT (AGWA): A GIS-BASED HYDROLOGIC MODELING TOOL FOR WATERSHED ASSESSMENT AND ANALYSIS

    EPA Science Inventory

    The Automated Geospatial Watershed Assessment tool (AGWA) is a GIS interface jointly developed by the USDA Agricultural Research Service, the U.S. Environmental Protection Agency, the University of Arizona, and the University of Wyoming to automate the parame...

  4. AUTOMATED GEOSPATIAL WATERSHED ASSESSMENT (AGWA): A GIS-BASED HYDROLOGIC MODELING TOOL FOR WATERSHED ASSESSMENT AND ANALYSIS

    EPA Science Inventory

    The Automated Geospatial Watershed Assessment tool (AGWA) is a GIS interface jointly developed by the USDA Agricultural Research Service, the U.S. Environmental Protection Agency, the University of Arizona, and the University of Wyoming to automate the parameterization and execu...

  5. A Bayesian Framework for the Automated Online Assessment of Sensor Data Quality

    PubMed Central

    Smith, Daniel; Timms, Greg; De Souza, Paulo; D'Este, Claire

    2012-01-01

    Online automated quality assessment is critical to determine a sensor's fitness for purpose in real-time applications. A Dynamic Bayesian Network (DBN) framework is proposed to produce probabilistic quality assessments and represent the uncertainty of sequentially correlated sensor readings. This is a novel framework for representing the causes, quality state, and observed effects of individual sensor errors without imposing any constraints upon the physical deployment or measured phenomenon. It represents the causal relationship between quality tests and combines them to generate uncertainty estimates for individual samples. The DBN was implemented for a particular marine deployment of temperature and conductivity sensors in Hobart, Australia. The DBN was shown to offer a substantial average improvement (34%) in replicating the error bars generated by experts when compared to a fuzzy logic approach. PMID:23012554
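The core mechanism described in the abstract (a quality state evolving over time, updated by evidence from several quality tests) can be sketched as a two-state forward filter. This is an illustrative toy, not the authors' DBN: the states, transition probabilities, and test likelihoods below are invented.

```python
# Toy forward-filtering sketch of a sensor-quality state. A "good"/"bad"
# quality variable persists over time via a transition model, and each
# quality test contributes a likelihood for the current reading.

TRANSITION = {  # P(state_t | state_{t-1}); quality tends to persist
    "good": {"good": 0.95, "bad": 0.05},
    "bad": {"good": 0.10, "bad": 0.90},
}

def filter_step(prior, test_likelihoods):
    """One forward-filtering step.

    prior: dict state -> probability at t-1.
    test_likelihoods: list of dicts state -> P(test result | state);
    independent tests are combined by multiplying their likelihoods.
    """
    # Predict: push the prior through the transition model.
    predicted = {
        s: sum(prior[p] * TRANSITION[p][s] for p in prior) for s in ("good", "bad")
    }
    # Update: multiply in each test's likelihood, then normalise.
    posterior = {}
    for s in predicted:
        like = 1.0
        for t in test_likelihoods:
            like *= t[s]
        posterior[s] = predicted[s] * like
    z = sum(posterior.values())
    return {s: v / z for s, v in posterior.items()}

belief = {"good": 0.5, "bad": 0.5}
# Invented evidence: a spike test and a range test both flag the reading.
belief = filter_step(belief, [{"good": 0.2, "bad": 0.8}, {"good": 0.3, "bad": 0.7}])
```

The posterior probability of the "bad" state is a probabilistic quality flag for the sample, in the spirit of the paper's per-sample uncertainty estimates.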

  6. Automated aerosol Raman spectrometer for semi-continuous sampling of atmospheric aerosol

    NASA Astrophysics Data System (ADS)

    Doughty, David C.; Hill, Steven C.

    2017-02-01

    Raman spectroscopy (RS) is useful in characterizing atmospheric aerosol. It is not commonly used in studying ambient particles, partly because automated instrumentation for aerosol RS has not been available. Battelle (Columbus, Ohio, USA) has developed the Resource Effective Bioidentification System (REBS) for automated detection of airborne bioagents based on RS. We use a version of the REBS that measures Raman spectra of one set of particles while the next set of particles is collected from air, then moves the newly collected particles to the analysis region and repeats. Here we investigate the use of the REBS as the core of a general-purpose automated Aerosol Raman Spectrometer (ARS) for atmospheric applications. This REBS-based ARS can be operated as a line-scanning Raman imaging spectrometer. Spectra measured by this ARS for single particles made of polystyrene, black carbon, and several other materials are clearly distinguishable. Raman spectra from a 15 min ambient sample (approximately 35-50 particles, 158 spectra) were analyzed using a hierarchical clustering method, which showed that the cluster spectra are consistent with soot, inorganic aerosol, and other organic compounds. The ARS ran unattended, collecting atmospheric aerosol and measuring spectra for a 7 hr period at 15-min intervals. A total of 32,718 spectra were measured; 5892 exceeded a threshold and were clustered during this time. The number of particles exhibiting the D-G bands of amorphous carbon, plotted vs. time at 15-min intervals, increases during the morning commute, then decreases. These data illustrate the potential of the ARS to measure thousands of time-resolved aerosol Raman spectra in the ambient atmosphere over the course of several hours. The capability of this ARS for automated measurements of Raman spectra should lead to more extensive RS-based studies of atmospheric aerosols.
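Grouping particle spectra by similarity, as in the hierarchical clustering step above, can be sketched with a minimal single-linkage agglomerative clusterer. The Euclidean distance, the threshold, and the toy four-point "spectra" are invented for illustration; the paper's actual algorithm and distance measure are not specified here.

```python
# Minimal single-linkage agglomerative clustering on toy "spectra":
# repeatedly merge the two closest clusters until every remaining gap
# exceeds a distance threshold.

def dist(a, b):
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

def cluster(spectra, threshold):
    """Return clusters as lists of spectrum indices."""
    clusters = [[i] for i in range(len(spectra))]
    while len(clusters) > 1:
        best = None
        for i in range(len(clusters)):
            for j in range(i + 1, len(clusters)):
                # Single linkage: distance between closest members.
                d = min(dist(spectra[a], spectra[b])
                        for a in clusters[i] for b in clusters[j])
                if best is None or d < best[0]:
                    best = (d, i, j)
        d, i, j = best
        if d > threshold:
            break
        clusters[i] = clusters[i] + clusters[j]
        del clusters[j]
    return clusters

# Two "soot-like" spectra with a strong D-G region and one flat spectrum.
spectra = [[0.1, 0.9, 0.8, 0.1], [0.2, 1.0, 0.7, 0.1], [0.1, 0.1, 0.1, 0.1]]
groups = cluster(spectra, threshold=0.5)
```

On real data one would cluster the threshold-exceeding spectra (5892 here) and inspect each cluster's mean spectrum, e.g., for the D-G bands of amorphous carbon.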

  7. Automated sample exchange and tracking system for neutron research at cryogenic temperatures.

    PubMed

    Rix, J E; Weber, J K R; Santodonato, L J; Hill, B; Walker, L M; McPherson, R; Wenzel, J; Hammons, S E; Hodges, J; Rennich, M; Volin, K J

    2007-01-01

    An automated system for sample exchange and tracking in a cryogenic environment and under remote computer control was developed. Up to 24 sample "cans" per cycle can be inserted and retrieved in a programmed sequence. A video camera acquires a unique identification marked on the sample can to provide a record of the sequence. All operations are coordinated via a LabVIEW program that can be operated locally or over a network. The samples are contained in vanadium cans 6-10 mm in diameter, equipped with a hermetically sealed lid that interfaces with the sample handler. The system uses a closed-cycle refrigerator (CCR) for cooling. The sample was delivered to a precooling location at approximately 25 K; after several minutes, it was moved onto a "landing pad" at approximately 10 K that locates the sample in the probe beam. After the sample was released onto the landing pad, the sample handler was retracted. Reading the sample identification and performing the exchange operation takes approximately 2 min. The time to cool the sample from ambient temperature to approximately 10 K was approximately 7 min including precooling; the cooling time increases to approximately 12 min if precooling is not used. Small differences in cooling rate were observed between sample materials and for different sample can sizes. Filling the sample well and the sample can with low-pressure helium is essential to provide heat transfer and achieve useful cooling rates. A resistive heating coil can be used to offset the refrigeration so that temperatures up to approximately 350 K can be accessed and controlled using a proportional-integral-derivative (PID) control loop. The time for the landing pad to cool to approximately 10 K after it has been heated to approximately 240 K was approximately 20 min.
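The proportional-integral-derivative loop mentioned at the end of the abstract can be sketched as follows. The gains, time step, and first-order plant model (heater power working against fixed refrigeration pulling toward ~10 K) are invented for illustration and are not taken from the instrument.

```python
# Sketch of a PID loop holding a heated setpoint against constant
# refrigeration. Gains and the plant model are toy values.

class PID:
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = None

    def update(self, setpoint, measured):
        error = setpoint - measured
        self.integral += error * self.dt
        deriv = 0.0 if self.prev_error is None else (error - self.prev_error) / self.dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * deriv

pid = PID(kp=2.0, ki=0.5, kd=0.1, dt=1.0)
temp = 10.0  # K, starting near the landing-pad base temperature
for _ in range(400):
    power = max(0.0, pid.update(240.0, temp))     # heater can only add heat
    temp += 0.01 * power - 0.05 * (temp - 10.0)   # invented first-order plant
```

The integral term is what lets the loop hold a setpoint far above the base temperature: at steady state the proportional term vanishes and the accumulated integral supplies the heater power that exactly offsets the refrigeration.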

  8. Mechanical Alteration And Contamination Issues In Automated Subsurface Sample Acquisition And Handling

    NASA Astrophysics Data System (ADS)

    Glass, B. J.; Cannon, H.; Bonaccorsi, R.; Zacny, K.

    2006-12-01

    The Drilling Automation for Mars Exploration (DAME) project's purpose is to develop and field-test drilling automation and robotics technologies for projected use in missions in the 2011-15 period. DAME includes control of the drilling hardware and state estimation of the hardware, the lithology being drilled, and the state of the hole. A sister drill was constructed for the Mars Analog Río Tinto Experiment (MARTE) project and demonstrated automated core handling and string changeout in 2005 drilling tests at Río Tinto, Spain. DAME focused instead on the problem of controlling the drill while actively drilling without getting stuck. Together, the DAME and MARTE projects demonstrate a fully automated robotic drilling capability, including hands-off drilling, adjustment to different strata and downhole conditions, recovery from drilling faults (binding, choking, etc.), drill string changeouts, core acquisition and removal, and sample handling and conveyance to in-situ instruments. The 2006 top-level goal of the DAME drilling in-situ tests was to verify and demonstrate a capability for hands-off automated drilling at an Arctic Mars-analog site. There were three sets of 2006 test goals, all of which were exceeded during the July 2006 field season. The first was to demonstrate the recognition, while drilling, of at least three of the six known major fault modes for the DAME planetary-prototype drill, and to employ the correct recovery or safing procedure in response. The second was to operate for three or more hours autonomously, hands-off. And the third was to exceed 3 m depth into the frozen breccia and permafrost with the DAME drill (it had not gone further than 2.2 m previously). Five of six faults were detected and corrected, there were 43 hours of hands-off drilling (including a 4 hour sequence with no human presence nearby), and 3.2 m was the total depth.
And ground truth drilling used small commercial drilling equipment in parallel in

  9. Design and practices for use of automated drilling and sample handling in MARTE while minimizing terrestrial and cross contamination.

    PubMed

    Miller, David P; Bonaccorsi, Rosalba; Davis, Kiel

    2008-10-01

    Mars Astrobiology Research and Technology Experiment (MARTE) investigators used an automated drill and sample processing hardware to detect and categorize life-forms found in subsurface rock at Río Tinto, Spain. For the science to be successful, it was necessary for the biomass from other sources--whether from previously processed samples (cross contamination) or the terrestrial environment (forward contamination)--to be insignificant. The hardware and practices used in MARTE were designed around this problem. Here, we describe some of the design issues that were faced and classify them into problems that are unique to terrestrial tests versus problems that would also exist for a system that was flown to Mars. Assessment of the biomass at various stages in the sample handling process revealed mixed results; the instrument design seemed to minimize cross contamination, but contamination from the surrounding environment sometimes made its way onto the surface of samples. Techniques used during the MARTE Río Tinto project, such as facing the sample, appear to remove this environmental contamination without introducing significant cross contamination from previous samples.

  10. SAMPL4 & DOCK3.7: Lessons for automated docking procedures

    PubMed Central

    Coleman, Ryan G.; Sterling, Teague; Weiss, Dahlia R.

    2014-01-01

    The SAMPL4 challenges were used to test current automated methods for solvation energy, virtual screening, pose, and affinity prediction of the molecular docking pipeline DOCK 3.7. Additionally, first-order models of binding affinity were proposed as milestones for any method predicting binding affinity. Several important discoveries about the molecular docking software were made during the challenge: (1) solvation energies of ligands were five-fold worse than those of any other method used in SAMPL4, including methods that were similarly fast; (2) HIV integrase is a challenging target, but automated docking on the correct allosteric site performed well in terms of virtual screening and pose prediction (compared to other methods), while affinity prediction, as expected, was very poor; (3) molecular docking grid sizes can be very important: serious errors were discovered with default settings, which have been adjusted for all future work. Overall, lessons from SAMPL4 suggest many changes to molecular docking tools, not just DOCK 3.7, that could improve the state of the art. Future difficulties and projects are discussed. PMID:24515818

  11. Automated processing of forensic casework samples using robotic workstations equipped with nondisposable tips: contamination prevention.

    PubMed

    Frégeau, Chantal J; Lett, C Marc; Elliott, Jim; Yensen, Craig; Fourney, Ron M

    2008-05-01

    An automated process has been developed for the analysis of forensic casework samples using TECAN Genesis RSP 150/8 or Freedom EVO liquid handling workstations equipped exclusively with nondisposable tips. Robot tip cleaning routines have been incorporated strategically within the DNA extraction process as well as at the end of each session. Alternative options were examined for cleaning the tips and different strategies were employed to verify cross-contamination. A 2% sodium hypochlorite wash (1/5th dilution of the 10.8% commercial bleach stock) proved to be the best overall approach for preventing cross-contamination of samples processed using our automated protocol. The bleach wash steps do not adversely impact the short tandem repeat (STR) profiles developed from DNA extracted robotically and allow for major cost savings through the implementation of fixed tips. We have demonstrated that robotic workstations equipped with fixed pipette tips can be used with confidence with properly designed tip washing routines to process casework samples using an adapted magnetic bead extraction protocol.
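The stated bleach dilution is plain C1·V1 = C2·V2 arithmetic: a 1/5th dilution of the 10.8% stock actually yields 2.16%, which the protocol rounds to approximately 2%. A minimal sketch of the calculation (function names are ours, not from the paper):

```python
# Dilution arithmetic behind the "2% sodium hypochlorite (1/5th dilution
# of the 10.8% commercial bleach stock)" wash solution.

def diluted_concentration(stock_pct, stock_parts, total_parts):
    """Concentration after mixing stock_parts of stock into total_parts total."""
    return stock_pct * stock_parts / total_parts

def stock_volume_for(target_pct, stock_pct, final_volume_ml):
    """Volume of stock needed to make final_volume_ml at target_pct (C1V1 = C2V2)."""
    return target_pct * final_volume_ml / stock_pct

one_fifth = diluted_concentration(10.8, 1, 5)   # 1 part stock in 5 parts total
ml_stock = stock_volume_for(2.0, 10.8, 500.0)   # stock needed for 500 mL at exactly 2%
```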

  12. Accelerating in vitro studies on circadian clock systems using an automated sampling device

    PubMed Central

    Furuike, Yoshihiko; Abe, Jun; Mukaiyama, Atsushi; Akiyama, Shuji

    2016-01-01

    KaiC, a core protein of the cyanobacterial circadian clock, is rhythmically autophosphorylated and autodephosphorylated with a period of approximately 24 h in the presence of two other Kai proteins, KaiA and KaiB. In vitro experiments to investigate the KaiC phosphorylation cycle consume considerable time and effort. To automate the fractionation, quantification, and evaluation steps, we developed a suite consisting of an automated sampling device equipped with an 8-channel temperature controller and accompanying analysis software. Eight sample tables can be controlled independently at different temperatures within a fluctuation of ±0.01°C, enabling investigation of the temperature dependency of clock activities simultaneously in a single experiment. The suite includes independent software that helps users intuitively conduct a densitometric analysis of gel images in a short time with improved reliability. Multiple lanes on a gel can be detected quasi-automatically through an auto-detection procedure implemented in the software, with or without correction for lane 'smiling.' To demonstrate the performance of the suite, robustness of the period against temperature variations was evaluated using 32 datasets of the KaiC phosphorylation cycle. By using the software, the time required for the analysis was reduced by approximately 65% relative to the conventional method, with reasonable reproducibility and quality. The suite is potentially applicable to other clock or clock-related systems in higher organisms, relieving users of the need to repeat multiple manual sampling and analytical steps. PMID:27924279
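The quasi-automatic lane detection described above can be illustrated by a much-simplified sketch: sum pixel intensities down each image column, then take local maxima above a threshold as lane centres. The suite's actual algorithm, including its 'smiling' correction, is not reproduced here; the toy gel image is invented.

```python
# Toy lane detection for a gel image represented as a list of rows
# (equal-length lists of intensities): build a column-intensity profile
# and report local maxima above a threshold as lane centres.

def lane_centres(image, threshold):
    ncols = len(image[0])
    profile = [sum(row[c] for row in image) for c in range(ncols)]
    centres = []
    for c in range(1, ncols - 1):
        if profile[c] >= threshold and profile[c - 1] < profile[c] >= profile[c + 1]:
            centres.append(c)
    return centres

# Toy 3-row "gel" with two bright lanes at columns 2 and 6.
gel = [
    [0, 1, 9, 1, 0, 1, 8, 1, 0],
    [0, 2, 9, 2, 0, 2, 9, 2, 0],
    [0, 1, 8, 1, 0, 1, 8, 1, 0],
]
lanes = lane_centres(gel, threshold=20)
```

A real implementation would also fit each lane's centre line row by row so that curved ("smiling") lanes are straightened before densitometry.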

  13. A proposed protocol for remote control of automated assessment devices

    SciTech Connect

    Kissock, P.S.

    1996-09-01

    Systems and devices that are controlled remotely are becoming more common in security systems in the US Air Force and other government agencies to provide protection of valuable assets. These systems reduce the number of needed personnel while still providing a high level of protection. However, each remotely controlled device usually has its own communication protocol. This limits the ability to change devices without changing the system that provides communications control to the device. Sandia is pursuing a standard protocol that can be used to communicate with the different devices currently in use, or that may be used in the future, in the US Air Force and other government agencies throughout the security community. Devices to be controlled include intelligent pan/tilt mounts, day/night video cameras, thermal imaging cameras, and remote data processors. Important features of this protocol include the ability to send messages of varying length, identify the sender, and, more importantly, control remote data processors. As camera and digital signal processor (DSP) use expands, the DSP will begin to reside in the camera itself. The DSP can be used to provide auto-focus, frame-to-frame image registration, video motion detection (VMD), target detection, tracking, image compression, and many other functions. With the serial data control link, the actual DSP software can be updated or changed as required. Coaxial video cables may become obsolete once a compression algorithm is established in the DSP. This paper describes the proposed public domain protocol, its features, and examples of use. The authors hope to elicit comments from security technology developers regarding the format and use of remotely controlled automated assessment devices. 2 figs., 1 tab.
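Two of the protocol features named above (variable-length messages and sender identification) can be illustrated with a hypothetical frame layout. The field sizes and the XOR checksum below are invented for illustration and are not taken from the Sandia protocol.

```python
# Hypothetical frame: 1-byte sender ID, 2-byte big-endian payload length,
# variable-length payload, then a 1-byte XOR checksum over everything before it.

import struct

def frame(sender_id, payload):
    header = struct.pack(">BH", sender_id, len(payload))
    body = header + payload
    checksum = 0
    for b in body:
        checksum ^= b
    return body + bytes([checksum])

def parse(message):
    sender_id, length = struct.unpack(">BH", message[:3])
    payload = message[3:3 + length]
    checksum = 0
    for b in message[:-1]:
        checksum ^= b
    if checksum != message[-1]:
        raise ValueError("checksum mismatch")
    return sender_id, payload

msg = frame(0x12, b"PAN +15.0 TILT -3.2")
sender, cmd = parse(msg)
```

The explicit length field is what lets one serial link carry short pan/tilt commands and longer DSP software updates without per-device framing conventions.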

  14. Automated DNA extraction platforms offer solutions to challenges of assessing microbial biofouling in oil production facilities

    PubMed Central

    2012-01-01

    The analysis of microbial assemblages in industrial, marine, and medical systems can inform decisions regarding quality control or mitigation. Modern molecular approaches to detect, characterize, and quantify microorganisms provide rapid and thorough measures unbiased by the need for cultivation. Timely extraction of high-quality nucleic acids for molecular analysis faces specific challenges in studies of the influence of microorganisms on oil production. Production facilities are often ill-equipped for nucleic acid extraction techniques, making the preservation and transportation of samples off-site a priority. As a potential solution, the possibility of extracting nucleic acids on-site using automated platforms was tested. The performance of two such platforms, the Fujifilm QuickGene-Mini80™ and Promega Maxwell®16, was compared to a widely used manual extraction kit, the MOBIO PowerBiofilm™ DNA Isolation Kit, in terms of ease of operation, DNA quality, and microbial community composition. Three pipeline biofilm samples were chosen for these comparisons; two contained crude oil and corrosion products, and the third transported seawater. Overall, the two more automated extraction platforms produced higher DNA yields than the manual approach. DNA quality was evaluated for amplification by quantitative PCR (qPCR) and end-point PCR to generate 454 pyrosequencing libraries for 16S rRNA microbial community analysis. Microbial community structure, as assessed by DGGE analysis and pyrosequencing, was comparable among the three extraction methods. Therefore, the use of automated extraction platforms should enhance the feasibility of rapidly evaluating microbial biofouling at remote locations or those with limited resources. PMID:23168231

  15. Screening urine samples for the absence of urinary tract infection using the sediMAX automated microscopy analyser.

    PubMed

    Sterry-Blunt, Rosanne E; S Randall, Karen; J Doughton, Michael; H Aliyu, Sani; Enoch, David A

    2015-06-01

    Urinalysis accounts for a disproportionate share of the workload within the clinical microbiology laboratory. Routine processing involves screening via manual microscopy or biochemical dipstick measurement, followed by culture of each sample. Despite this, as many as 80% of specimens are reported as negative; resources and time are therefore wasted, and turnaround of results is delayed, as numerous negative cultures fulfil their required incubation time. Automation provides the potential for streamlining sample screening by efficiently (>30% sample exclusion) and reliably [negative predictive value (NPV) ≥ 95%] ruling out samples likely to be negative, whilst also reducing resource usage and hands-on time. The present study explored this idea using the sediMAX automated microscopy urinalysis platform. We prospectively collected and processed 1411 non-selected samples directly after routine laboratory processing. The results from this study showed multiple optimum cut-off values for microscopy. However, although optimum cut-off values permitted rule-out of 40.1% of specimens, the associated NPV of 87.5% was lower than the acceptable limit of 95%. The sensitivity and specificity of leukocytes and bacteria in determining urinary tract infection were assessed by receiver operating characteristic curves, with area under the curve values of 0.697 [95% confidence interval (CI): 0.665-0.729] and 0.587 (95% CI: 0.551-0.623), respectively. We conclude that the sediMAX is not suitable for use as a rule-out screen prior to culture and that further validation work must be carried out before routine use of the analyser.
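The screening figures of merit used in the abstract (NPV, sensitivity, specificity, and the ruled-out fraction) reduce to confusion-matrix arithmetic. The counts below are invented for illustration, chosen so the NPV matches the 87.5% scenario described; they are not the study's data.

```python
# Confusion-matrix arithmetic for a rule-out screen: samples flagged
# negative by the analyser skip culture, so the NPV is the safety metric
# and the negative fraction is the workload saving.

def screening_metrics(tp, fp, tn, fn):
    return {
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
        "npv": tn / (tn + fn),                          # rule-out safety (>= 0.95 required)
        "ruled_out": (tn + fn) / (tp + fp + tn + fn),   # fraction excluded from culture
    }

# Invented counts giving NPV = 0.875, below the 95% acceptance limit.
m = screening_metrics(tp=180, fp=120, tn=280, fn=40)
```

The tension the abstract reports falls out directly: raising the cut-off excludes more samples (larger ruled-out fraction) but moves true infections into `fn`, which drags the NPV below the acceptable limit.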

  16. Performance verification of the Maxwell 16 Instrument and DNA IQ Reference Sample Kit for automated DNA extraction of known reference samples.

    PubMed

    Krnajski, Z; Geering, S; Steadman, S

    2007-12-01

    Advances in automation have been made for a number of processes conducted in the forensic DNA laboratory. However, because most robotic systems are designed for high-throughput laboratories batching large numbers of samples, smaller laboratories are left with a limited number of cost-effective options for employing automation. The Maxwell 16 Instrument and DNA IQ Reference Sample Kit marketed by Promega are designed for rapid, automated purification of DNA extracts from sample sets consisting of sixteen or fewer samples. Because the system is based on DNA capture by paramagnetic particles with maximum binding capacity, it is designed to generate extracts with yield consistency. The studies herein enabled evaluation of STR profile concordance, consistency of yield, and cross-contamination performance for the Maxwell 16 Instrument. Results indicate that the system performs suitably for streamlining the process of extracting known reference samples generally used for forensic DNA analysis and has many advantages in a small or moderate-sized laboratory environment.

  17. Automation of high-frequency sampling of environmental waters for reactive species

    NASA Astrophysics Data System (ADS)

    Kim, H.; Bishop, J. K.; Wood, T.; Fung, I.; Fong, M.

    2011-12-01

    Trace metals, particularly iron and manganese, play a critical role in some ecosystems as limiting factors for primary productivity, in geochemistry (especially redox chemistry) as important electron donors and acceptors, and in aquatic environments as carriers of contaminant transport. The dynamics of trace metals are closely related to hydrologic events such as rainfall. Storm flow triggers dramatic changes in both dissolved and particulate trace metal concentrations and affects other important environmental parameters linked to trace metal behavior, such as dissolved organic carbon (DOC). To improve our understanding of the behavior of trace metals and the underlying processes, water chemistry information must be collected over an adequately long period of time and at higher frequency than conventional manual sampling (e.g., weekly or biweekly). In this study, we developed an automated sampling system to document the dynamics of trace metals, focusing on Fe and Mn, and DOC for a multiple-year high-frequency geochemistry time series in a small catchment called Rivendell, located at the Angelo Coast Range Reserve, California. We are sampling ground and streamwater with the automated sampling system at daily frequency, and the condition of the site varies substantially from season to season. The pH ranges of groundwater and streamwater are 5-7 and 7.8-8.3, respectively. DOC is usually sub-ppm, but during rain events it increases by an order of magnitude. The automated sampling system focuses on two aspects: (1) a modified sampler design that improves sample integrity for trace metals and DOC, and (2) a remote control system that updates sampling volume and timing according to hydrological conditions. To maintain sample integrity, the method employs gravity filtering using large-volume syringes (140 mL) and syringe filters connected to a set of polypropylene bottles and a borosilicate bottle via Teflon tubing. Without filtration, in a few days, the

  18. An automated method for 'clumped-isotope' measurements on small carbonate samples.

    PubMed

    Schmid, Thomas W; Bernasconi, Stefano M

    2010-07-30

Clumped-isotope geochemistry deals with the state of ordering of rare isotopes in molecules, in particular with their tendency to form bonds with other rare isotopes rather than with the most abundant ones. Among its possible applications, carbonate clumped-isotope thermometry has gained the most attention because of its wide potential across many disciplines of the earth sciences. Clumped-isotope thermometry allows the formation temperature of carbonate minerals to be reconstructed without knowing the isotopic composition of the water from which they were formed. This feature enables new approaches in paleothermometry. The currently published method is, however, limited by sample-weight requirements of 10-15 mg and by the fact that measurements are performed manually. In this paper we present a new method using an automated sample preparation device coupled to an isotope ratio mass spectrometer. The method is based on the repeated analysis (n = 6-8) of 200 microg aliquots of sample material and completely automated measurements. In addition, we propose to use precisely calibrated carbonates spanning a wide range in Delta(47) instead of heated gases to correct for isotope effects caused by the source of the mass spectrometer, following the principle of equal treatment of samples and standards. We present data for international standards (NBS 19 and LSVEC) and different carbonates formed at temperatures exceeding 600 degrees C to show that precisions in the range of 10 to 15 ppm (1 SE) can be reached for repeated analyses of a single sample. Finally, we discuss and validate the correction procedure based on high-temperature carbonates instead of heated gases.
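The proposed correction — carbonate standards of known Delta(47) replacing heated gases — amounts to fitting an empirical transfer function from measured to accepted values and applying it to unknowns. A minimal sketch of that idea follows; all standard values below are hypothetical, not the paper's calibration data.

```python
# Hedged sketch of a carbonate-standard correction: fit a linear transfer
# function from measured to accepted Delta47 using standards spanning a
# wide Delta47 range, then apply it to unknowns.
import numpy as np

# Hypothetical standards: (measured, accepted) Delta47 in permil
measured = np.array([0.300, 0.450, 0.620, 0.740])
accepted = np.array([0.258, 0.415, 0.601, 0.729])

slope, intercept = np.polyfit(measured, accepted, 1)

def correct_delta47(raw):
    """Map a raw (measured) Delta47 value onto the standard scale."""
    return slope * raw + intercept

print(round(correct_delta47(0.500), 3))  # -> 0.471 with these toy standards
```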

  19. Robowell: An automated process for monitoring ground water quality using established sampling protocols

    USGS Publications Warehouse

    Granato, G.E.; Smith, K.P.

    1999-01-01

Robowell is an automated process for monitoring selected ground water quality properties and constituents by pumping a well or multilevel sampler. Robowell was developed and tested to provide a cost-effective monitoring system that meets protocols expected for manual sampling. The process uses commercially available electronics, instrumentation, and hardware, so it can be configured to monitor ground water quality using the equipment, purge protocol, and monitoring well design most appropriate for the monitoring site and the contaminants of interest. A Robowell prototype was installed on a sewage treatment plant infiltration bed that overlies a well-studied unconfined sand and gravel aquifer at the Massachusetts Military Reservation, Cape Cod, Massachusetts, during a time when two distinct plumes of constituents were released. The prototype was operated from May 10 to November 13, 1996, and quality-assurance/quality-control measurements demonstrated that the data obtained by the automated method were equivalent to data obtained by manual sampling methods using the same sampling protocols. Water level, specific conductance, pH, water temperature, dissolved oxygen, and dissolved ammonium were monitored by the prototype as the wells were purged according to U.S. Geological Survey (USGS) ground water sampling protocols. Remote access to the data record, via phone modem communications, indicated the arrival of each plume over a few days and the subsequent geochemical reactions over the following weeks. Real-time availability of the monitoring record provided the information needed to initiate manual sampling efforts in response to changes in measured ground water quality, which validated the method and characterized the screened portion of the plume in detail through time. The methods and the case study described are presented to document the process for future use.

  20. Automated high-throughput in vitro screening of the acetylcholine esterase inhibiting potential of environmental samples, mixtures and single compounds.

    PubMed

    Froment, Jean; Thomas, Kevin V; Tollefsen, Knut Erik

    2016-08-01

A high-throughput and automated assay for testing the presence of acetylcholine esterase (AChE) inhibiting compounds was developed, validated and applied to screen different types of environmental samples. Automation involved running the assay in 96-well plates and adapting it for use with an automated workstation. Validation was performed by comparing the results of the automated assay with those of a previously validated and standardised assay for two known AChE inhibitors (paraoxon and dichlorvos). The results show that the assay provides similar concentration-response curves (CRCs) when run according to the manual and automated protocols. Automation of the assay reduced the run time as well as the intra- and inter-assay variation. High-quality CRCs were obtained for both of the model AChE inhibitors (dichlorvos IC50 = 120 µM and paraoxon IC50 = 0.56 µM) when tested alone. The effects of co-exposure to an equipotent binary mixture of the two chemicals were consistent with predictions of additivity and best described by the concentration addition model for combined toxicity. Extracts of different environmental samples (landfill leachate, wastewater treatment plant effluent, and road tunnel construction run-off) were then screened for AChE-inhibiting activity using the automated bioassay; only the landfill leachate was shown to contain potential AChE inhibitors. Potential uses and limitations of the assay are discussed based on the present results.
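The concentration addition prediction used for the binary mixture can be sketched numerically. The following is an illustrative implementation of the classical Loewe-additivity formula, not the authors' code; only the two IC50 values come from the abstract.

```python
def ca_mixture_ic50(ic50s, fractions):
    """Mixture IC50 predicted by concentration addition (Loewe additivity):
    IC50_mix = 1 / sum(f_i / IC50_i)."""
    if abs(sum(fractions) - 1.0) > 1e-9:
        raise ValueError("concentration fractions must sum to 1")
    return 1.0 / sum(f / ic for f, ic in zip(fractions, ic50s))

def equipotent_fractions(ic50s):
    """Concentration fractions giving each component an equal toxic-unit share."""
    total = sum(ic50s)
    return [ic / total for ic in ic50s]

# IC50 values from the abstract: dichlorvos 120 uM, paraoxon 0.56 uM
ic50s = [120.0, 0.56]
mix_ic50 = ca_mixture_ic50(ic50s, equipotent_fractions(ic50s))
print(round(mix_ic50, 2))  # -> 60.28 (uM, total mixture concentration)
```

For an equipotent binary mixture this reduces to the mean of the two IC50s, which is why the paraoxon component dominates the mixture's potency per unit mass.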

  1. Toward Automated Computer-Based Visualization and Assessment of Team-Based Performance

    ERIC Educational Resources Information Center

    Ifenthaler, Dirk

    2014-01-01

    A considerable amount of research has been undertaken to provide insights into the valid assessment of team performance. However, in many settings, manual and therefore labor-intensive assessment instruments for team performance have limitations. Therefore, automated assessment instruments enable more flexible and detailed insights into the…

  2. Automated Generation and Assessment of Autonomous Systems Test Cases

    NASA Technical Reports Server (NTRS)

    Barltrop, Kevin J.; Friberg, Kenneth H.; Horvath, Gregory A.

    2008-01-01

This slide presentation reviews issues in the verification and validation testing of autonomous spacecraft, which routinely culminates in the exploration of anomalous or faulted mission-like scenarios, using the work performed during the Dawn mission's tests as examples. Prioritizing which scenarios to develop usually comes down to focusing on the most vulnerable areas and ensuring the best return on investment of test time. Rules-of-thumb strategies often come into play, such as injecting applicable anomalies before, during, and after system state changes, or creating cases that ensure good safety-net algorithm coverage. Although experience and judgment in test selection can lead to high levels of confidence about the majority of a system's autonomy, it is likely that important test cases are overlooked. One method to fill in potential test coverage gaps is to automatically generate and execute test cases using algorithms that ensure desirable properties about the coverage, for example, generating cases for all possible fault monitors and across all state-change boundaries. Of course, the scope of coverage is determined by the test environment's capabilities; a faster-than-real-time, high-fidelity, software-only simulation would allow the broadest coverage. Even real-time systems that can be replicated and run in parallel, and that have reliable set-up and operations features, provide an excellent resource for automated testing. Making detailed predictions of the outcome of such tests can be difficult, and when algorithmic means are employed to produce hundreds or even thousands of cases, generating predictions individually is impractical, while generating them with tools requires executable models of the design and environment that themselves require a complete test program. Evaluating the results of a large number of mission scenario tests therefore poses special challenges. A good approach to address this problem is to automatically score the results
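The coverage-driven generation idea described above (every fault monitor crossed with every state-change boundary, with injection before, during, and after the transition) can be sketched as a simple cross product. The monitor and transition names below are hypothetical placeholders, not Dawn's actual inventory.

```python
from itertools import product

# Hypothetical monitor and state-change inventories; a real mission
# database would supply these lists.
fault_monitors = ["thruster_overtemp", "gyro_dropout", "battery_undervolt"]
state_changes = ["launch->cruise", "cruise->approach", "approach->orbit"]
# Inject the anomaly before, during, or after each state change, per the
# rules-of-thumb strategy described in the abstract.
timings = ["before", "during", "after"]

def generate_cases(monitors, transitions, timings):
    """Enumerate one test case per (monitor, transition, timing) combination,
    guaranteeing coverage of every fault monitor across every boundary."""
    return [
        {"monitor": m, "transition": t, "inject": when}
        for m, t, when in product(monitors, transitions, timings)
    ]

cases = generate_cases(fault_monitors, state_changes, timings)
print(len(cases))  # 3 monitors x 3 transitions x 3 timings = 27 cases
```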

  3. Automated Assessment of Speech Fluency for L2 English Learners

    ERIC Educational Resources Information Center

    Yoon, Su-Youn

    2009-01-01

This dissertation provides an automated scoring method of speech fluency for second language learners of English (L2 learners) based on speech recognition technology. Non-standard pronunciation, frequent disfluencies, faulty grammar, and inappropriate lexical choices are crucial characteristics of L2 learners' speech. Due to the ease of…

  4. Automated negotiation in environmental resource management: Review and assessment.

    PubMed

    Eshragh, Faezeh; Pooyandeh, Majeed; Marceau, Danielle J

    2015-10-01

Negotiation is an integral part of our daily life and plays an important role in resolving conflicts and facilitating human interactions. Automated negotiation, which aims at capturing the human negotiation process using artificial intelligence and machine learning techniques, is well-established in e-commerce, but its application in environmental resource management remains limited. This is due to the inherent uncertainties and complexity of environmental issues, along with the diversity of stakeholders' perspectives when dealing with these issues. The objective of this paper is to describe the main components of automated negotiation, to review and compare machine learning techniques used in automated negotiation, and to provide a guideline for the selection of suitable methods in the particular context of stakeholders' negotiation over environmental resource issues. We advocate that automated negotiation can facilitate the involvement of stakeholders in the exploration of a plurality of solutions in order to reach a mutually satisfying agreement and contribute to informed decisions in environmental management; further studies are needed to consolidate the potential of this modeling approach.

  5. Automated Scoring in Context: Rapid Assessment for Placed Students

    ERIC Educational Resources Information Center

    Klobucar, Andrew; Elliot, Norbert; Deess, Perry; Rudniy, Oleksandr; Joshi, Kamal

    2013-01-01

    This study investigated the use of automated essay scoring (AES) to identify at-risk students enrolled in a first-year university writing course. An application of AES, the "Criterion"[R] Online Writing Evaluation Service was evaluated through a methodology focusing on construct modelling, response processes, disaggregation, extrapolation,…

  6. Harmonization of automated hemolysis index assessment and use: Is it possible?

    PubMed

    Dolci, Alberto; Panteghini, Mauro

    2014-05-15

The major source of errors producing unreliable laboratory test results is the pre-analytical phase, with hemolysis accounting for approximately half of them and being the leading cause of unsuitable blood specimens. Hemolysis may interfere with many laboratory tests through a variety of biological and analytical mechanisms. Consequently, laboratories need to systematically detect and reliably quantify hemolysis in every collected sample by means of objective and consistent technical tools that assess sample integrity. This is currently done by automated estimation of the hemolysis index (HI), available on almost all clinical chemistry platforms, which makes hemolysis detection reliable and reported patient test results more accurate. Despite these advantages, a degree of variability still affects the HI estimate, and more effort should be placed on harmonization of this index. Harmonizing HI results from different analytical systems should be the immediate goal, but the scope of harmonization should go beyond the analytical steps to include other aspects, such as HI decision thresholds, criteria for result interpretation and application in clinical practice, as well as report formats. Relevant issues still to overcome are the objective definition of a maximum allowable bias for hemolysis interference, based on the clinical application of the measurements, and the management of unsuitable samples. For the latter in particular, a harmonized approach is required for withholding the numerical results of unsuitable samples with significantly increased HI and replacing them with a specific comment highlighting hemolysis of the sample.

  7. Fully Automated Laser Ablation Liquid Capture Sample Analysis using NanoElectrospray Ionization Mass Spectrometry

    SciTech Connect

    Lorenz, Matthias; Ovchinnikova, Olga S; Van Berkel, Gary J

    2014-01-01

RATIONALE: Laser ablation provides for the possibility of sampling a large variety of surfaces with high spatial resolution. This type of sampling, when employed in conjunction with liquid capture followed by nanoelectrospray ionization, provides the opportunity for sensitive and prolonged interrogation of samples by mass spectrometry as well as the ability to analyze surfaces not amenable to direct liquid extraction. METHODS: A fully automated, reflection geometry, laser ablation liquid capture spot sampling system was achieved by incorporating appropriate laser fiber optics and a focusing lens into a commercially available, liquid extraction surface analysis (LESA)-ready Advion TriVersa NanoMate system. RESULTS: Under optimized conditions about 10% of laser ablated material could be captured in a droplet positioned vertically over the ablation region using the NanoMate robot-controlled pipette. The sampling spot size with this laser ablation liquid capture surface analysis (LA/LCSA) mode of operation (typically about 120 µm x 160 µm) was approximately 50 times smaller than that achievable by direct liquid extraction using LESA (ca. 1 mm diameter liquid extraction spot). The set-up was successfully applied for the analysis of ink on glass and paper as well as the endogenous components in Alstroemeria Yellow King flower petals. In a second mode of operation with a comparable sampling spot size, termed laser ablation/LESA, the laser system was used to drill through, penetrate, or otherwise expose material beneath a solvent-resistant surface. Once drilled, LESA was effective in sampling soluble material exposed at that location on the surface. CONCLUSIONS: Incorporating the capability for different laser ablation liquid capture spot sampling modes of operation into a LESA-ready Advion TriVersa NanoMate enhanced the spot sampling spatial resolution of this device and broadened the surface types amenable to analysis to include absorbent and solvent resistant

  8. Automated MALDI Matrix Coating System for Multiple Tissue Samples for Imaging Mass Spectrometry

    NASA Astrophysics Data System (ADS)

    Mounfield, William P.; Garrett, Timothy J.

    2012-03-01

    Uniform matrix deposition on tissue samples for matrix-assisted laser desorption/ionization (MALDI) is key for reproducible analyte ion signals. Current methods often result in nonhomogenous matrix deposition, and take time and effort to produce acceptable ion signals. Here we describe a fully-automated method for matrix deposition using an enclosed spray chamber and spray nozzle for matrix solution delivery. A commercial air-atomizing spray nozzle was modified and combined with solenoid controlled valves and a Programmable Logic Controller (PLC) to control and deliver the matrix solution. A spray chamber was employed to contain the nozzle, sample, and atomized matrix solution stream, and to prevent any interference from outside conditions as well as allow complete control of the sample environment. A gravity cup was filled with MALDI matrix solutions, including DHB in chloroform/methanol (50:50) at concentrations up to 60 mg/mL. Various samples (including rat brain tissue sections) were prepared using two deposition methods (spray chamber, inkjet). A linear ion trap equipped with an intermediate-pressure MALDI source was used for analyses. Optical microscopic examination showed a uniform coating of matrix crystals across the sample. Overall, the mass spectral images gathered from tissues coated using the spray chamber system were of better quality and more reproducible than from tissue specimens prepared by the inkjet deposition method.

  9. Automated MALDI matrix coating system for multiple tissue samples for imaging mass spectrometry.

    PubMed

    Mounfield, William P; Garrett, Timothy J

    2012-03-01

    Uniform matrix deposition on tissue samples for matrix-assisted laser desorption/ionization (MALDI) is key for reproducible analyte ion signals. Current methods often result in nonhomogenous matrix deposition, and take time and effort to produce acceptable ion signals. Here we describe a fully-automated method for matrix deposition using an enclosed spray chamber and spray nozzle for matrix solution delivery. A commercial air-atomizing spray nozzle was modified and combined with solenoid controlled valves and a Programmable Logic Controller (PLC) to control and deliver the matrix solution. A spray chamber was employed to contain the nozzle, sample, and atomized matrix solution stream, and to prevent any interference from outside conditions as well as allow complete control of the sample environment. A gravity cup was filled with MALDI matrix solutions, including DHB in chloroform/methanol (50:50) at concentrations up to 60 mg/mL. Various samples (including rat brain tissue sections) were prepared using two deposition methods (spray chamber, inkjet). A linear ion trap equipped with an intermediate-pressure MALDI source was used for analyses. Optical microscopic examination showed a uniform coating of matrix crystals across the sample. Overall, the mass spectral images gathered from tissues coated using the spray chamber system were of better quality and more reproducible than from tissue specimens prepared by the inkjet deposition method.

  10. Microbiological monitoring and automated event sampling at karst springs using LEO-satellites.

    PubMed

    Stadler, H; Skritek, P; Sommer, R; Mach, R L; Zerobin, W; Farnleitner, A H

    2008-01-01

Data communication via Low-Earth-Orbit (LEO) satellites between portable hydrometeorological measuring stations is the backbone of our system. This networking allows automated event sampling with short time increments, including for E. coli field analysis. All activities during the course of event sampling can be observed on an Internet platform based on a Linux server. Conventionally taken samples compared with the auto-sampling procedure revealed corresponding results and were in agreement with the ISO 9308-1 reference method. E. coli concentrations were individually corrected by event-specific inactivation coefficients (0.10-0.14 day(-1)), compensating for losses due to sample storage at spring temperature in the autosampler. Two large summer events in 2005/2006 at an important alpine karst spring (LKAS2) were monitored, including detailed analysis of E. coli dynamics (n = 271) together with comprehensive hydrological characterisations. High-resolution time series demonstrated a sudden increase of E. coli concentrations in spring water (approximately 2 log10 units) with a specific time delay after the beginning of the event. Statistical analysis suggested the spectral absorption coefficient measured at 254 nm (SAC254) as an early-warning surrogate for real-time monitoring of faecal input. Together with the LEO-satellite-based system, it is a helpful tool for early-warning systems in the field of drinking water protection.
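The storage correction described above, assuming first-order die-off at the event-specific inactivation coefficient, can be sketched as follows; the counts and storage time in the example are illustrative, not values from the study.

```python
import math

def correct_for_storage(measured_cfu, k_per_day, storage_days):
    """Back-correct an E. coli count for first-order die-off during storage
    in the autosampler: C0 = C_measured * exp(k * t)."""
    return measured_cfu * math.exp(k_per_day * storage_days)

# Example: 100 CFU/100 mL measured after 2 days at spring temperature,
# using the upper event-specific coefficient from the abstract (0.14 / day).
print(round(correct_for_storage(100.0, 0.14, 2.0), 1))  # -> 132.3
```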

  11. Artificial Neural Network for Total Laboratory Automation to Improve the Management of Sample Dilution.

    PubMed

    Ialongo, Cristiano; Pieri, Massimo; Bernardini, Sergio

    2017-02-01

Diluting a sample to obtain a measure within the analytical range is a common task in clinical laboratories. However, for urgent samples, it can cause delays in test reporting, which can put patients' safety at risk. The aim of this work is to show a simple artificial neural network that can be used to decide whether a sample needs predilution, using the information available through the laboratory information system. In particular, a Multilayer Perceptron neural network built on a data set of 16,106 cardiac troponin I test records produced a correct inference rate of 100% for samples not requiring predilution and 86.2% for those requiring predilution. With respect to inference reliability, the most relevant inputs were the presence of a cardiac event or surgery and the result of the previous assay. Such an artificial neural network can therefore be easily implemented into a total automation framework to appreciably reduce the turnaround time of critical orders delayed by the operations required to retrieve, dilute, and retest the sample.
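A minimal sketch of such a classifier follows, using synthetic data and the two inputs the abstract identifies as most relevant (the cardiac event/surgery flag and the previous assay result). This is a toy reconstruction, not the authors' 16,106-record model; the decision rule generating the labels is invented for illustration.

```python
# Toy Multilayer Perceptron flagging orders likely to need predilution.
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
n = 400
event = rng.integers(0, 2, n)              # 1 = recent cardiac event or surgery
prev_result = rng.lognormal(0.0, 1.0, n)   # previous troponin I (synthetic units)
# Invented labelling rule: dilution needed when a high prior result follows an event.
needs_dilution = ((event == 1) & (prev_result > 2.0)).astype(int)

X = np.column_stack([event, prev_result])
clf = make_pipeline(StandardScaler(),
                    MLPClassifier(hidden_layer_sizes=(16,), max_iter=3000,
                                  random_state=0))
clf.fit(X, needs_dilution)
print(round(clf.score(X, needs_dilution), 2))  # training accuracy on the toy rule
```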

  12. A modifiable microarray-based universal sensor: providing sample-to-results automation.

    PubMed

    Yasmin, Rubina; Zhu, Hui; Chen, Zongyuan; Montagna, Richard A

    2016-10-01

A microfluidic system consisting of generic single-use cartridges, which interface with a workstation, allows the automatic performance of all necessary sample preparation, PCR analysis and interpretation of multiplex PCR assays. The cartridges contain a DNA array with 20 different 16mer DNA "universal" probes immobilized at defined locations. PCR amplicons can be detected via hybridization of user-defined "reporter" probes that are complementary at their 3' termini to one or more of the universal probes and complementary to the target amplicons at their 5' termini. The system was able to detect single-plex and multiplex PCR amplicons from various infectious agents as well as wild-type and mutant alleles of single nucleotide polymorphisms. The system's ease of use was further demonstrated by converting a published PCR assay for the detection of Mycoplasma genitalium to a fully automated format. Excellent correlation between traditional manual methods and the automated analysis performed by the workstation suggests that the system can provide a means to easily design and implement a variety of customized PCR-based assays. The system will be useful to researchers or clinical investigators seeking to develop their own user-defined assays. As the U.S. FDA continues to pursue regulatory oversight of LDTs, the system would also allow labs to continue to develop compliant assays.

  13. Automated Device for Asynchronous Extraction of RNA, DNA, or Protein Biomarkers from Surrogate Patient Samples.

    PubMed

    Bitting, Anna L; Bordelon, Hali; Baglia, Mark L; Davis, Keersten M; Creecy, Amy E; Short, Philip A; Albert, Laura E; Karhade, Aditya V; Wright, David W; Haselton, Frederick R; Adams, Nicholas M

    2016-12-01

Many biomarker-based diagnostic methods are inhibited by nontarget molecules in patient samples, necessitating biomarker extraction before detection. We have developed a simple device that purifies RNA, DNA, or protein biomarkers from complex biological samples without robotics or fluid pumping. The device design is based on functionalized magnetic beads, which capture biomarkers and remove background biomolecules by magnetically transferring the beads through processing solutions arrayed within small-diameter tubing. The process was automated by wrapping the tubing around a disc-like cassette and rotating it past a magnet using a programmable motor. This device recovered biomarkers at ~80% of the yield of the operator-dependent extraction method published previously. The device was validated by extracting biomarkers from a panel of surrogate patient samples containing clinically relevant concentrations of (1) influenza A RNA in nasal swabs, (2) Escherichia coli DNA in urine, (3) Mycobacterium tuberculosis DNA in sputum, and (4) Plasmodium falciparum protein and DNA in blood. The device successfully extracted each biomarker type from samples representing low levels of clinically relevant infectivity (i.e., 7.3 copies/µL of influenza A RNA, 405 copies/µL of E. coli DNA, 0.22 copies/µL of M. tuberculosis DNA, 167 copies/µL of malaria parasite DNA, and 2.7 pM of malaria parasite protein).

  14. Automated suppression of sample-related artifacts in Fluorescence Correlation Spectroscopy.

    PubMed

    Ries, Jonas; Bayer, Mathias; Csúcs, Gábor; Dirkx, Ronald; Solimena, Michele; Ewers, Helge; Schwille, Petra

    2010-05-24

    Fluorescence Correlation Spectroscopy (FCS) in cells often suffers from artifacts caused by bright aggregates or vesicles, depletion of fluorophores or bleaching of a fluorescent background. The common practice of manually discarding distorted curves is time consuming and subjective. Here we demonstrate the feasibility of automated FCS data analysis with efficient rejection of corrupted parts of the signal. As test systems we use a solution of fluorescent molecules, contaminated with bright fluorescent beads, as well as cells expressing a fluorescent protein (ICA512-EGFP), which partitions into bright secretory granules. This approach improves the accuracy of FCS measurements in biological samples, extends its applicability to especially challenging systems and greatly simplifies and accelerates the data analysis.
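One simple way to automate the rejection of corrupted signal stretches, sketched below as a toy scheme rather than the authors' algorithm, is to split the intensity trace into segments and drop any segment whose mean count rate is a robust outlier (median/MAD test), as a bright aggregate passing the focus would produce.

```python
import numpy as np

def keep_clean_segments(trace, n_segments=20, z_max=3.5):
    """Split an intensity trace into segments and keep only those whose
    mean count rate is not a robust (median/MAD) outlier."""
    segments = np.array_split(np.asarray(trace), n_segments)
    means = np.array([s.mean() for s in segments])
    med = np.median(means)
    mad = np.median(np.abs(means - med)) + 1e-12   # avoid division by zero
    keep = np.abs(means - med) <= z_max * 1.4826 * mad
    return [s for s, ok in zip(segments, keep) if ok]

# Synthetic photon-count trace with one bright-aggregate transit.
rng = np.random.default_rng(3)
trace = rng.poisson(100, 20000).astype(float)
trace[5000:6000] += 500.0                 # aggregate crossing the focal volume
clean = keep_clean_segments(trace)
print(len(clean))                         # segments kept out of 20
```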

  15. Assessing bat detectability and occupancy with multiple automated echolocation detectors

    USGS Publications Warehouse

    Gorresen, P.M.; Miles, A.C.; Todd, C.M.; Bonaccorso, F.J.; Weller, T.J.

    2008-01-01

Occupancy analysis and its ability to account for differential detection probabilities is important for studies in which detecting echolocation calls is used as a measure of bat occurrence and activity. We examined the feasibility of remotely acquiring bat encounter histories to estimate detection probability and occupancy. We used echolocation detectors coupled to digital recorders operating at a series of proximate sites on consecutive nights in 2 trial surveys for the Hawaiian hoary bat (Lasiurus cinereus semotus). Our results confirmed that the technique is readily amenable for use in occupancy analysis. We also conducted a simulation exercise to assess the effects of sampling effort on parameter estimation. The results indicated that the precision and bias of parameter estimation were often more influenced by the number of sites sampled than number of visits. Acceptable accuracy often was not attained until at least 15 sites or 15 visits were used to estimate detection probability and occupancy. The method has significant potential for use in monitoring trends in bat activity and in comparative studies of habitat use. © 2008 American Society of Mammalogists.
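The occupancy analysis referred to above can be illustrated with the standard single-season model: jointly estimating occupancy (psi) and per-visit detection probability (p) by maximum likelihood from encounter histories. The sketch below simulates 15 sites and 15 visits (the effort level the abstract flags) and is an illustrative reimplementation, not the authors' code.

```python
import numpy as np
from scipy.optimize import minimize

# Simulate encounter histories: a site is occupied with probability psi,
# and an occupied site yields a detection on each visit with probability p.
rng = np.random.default_rng(1)
n_sites, n_visits = 15, 15
true_psi, true_p = 0.7, 0.4
occupied = rng.random(n_sites) < true_psi
history = (rng.random((n_sites, n_visits)) < true_p) & occupied[:, None]

def neg_log_lik(params):
    """Single-season occupancy likelihood on the logit scale."""
    psi, p = 1.0 / (1.0 + np.exp(-np.asarray(params)))  # logit -> probability
    d = history.sum(axis=1)                             # detections per site
    ll = np.where(
        d > 0,
        np.log(psi) + d * np.log(p) + (n_visits - d) * np.log(1.0 - p),
        np.log(psi * (1.0 - p) ** n_visits + (1.0 - psi)),  # never detected
    )
    return -ll.sum()

fit = minimize(neg_log_lik, x0=[0.0, 0.0], method="Nelder-Mead")
psi_hat, p_hat = 1.0 / (1.0 + np.exp(-fit.x))
print(round(psi_hat, 2), round(p_hat, 2))
```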

  16. Development of an Automated Security Risk Assessment Methodology Tool for Critical Infrastructures.

    SciTech Connect

    Jaeger, Calvin Dell; Roehrig, Nathaniel S.; Torres, Teresa M.

    2008-12-01

This document presents the automated security Risk Assessment Methodology (RAM) prototype tool developed by Sandia National Laboratories (SNL). This work leverages SNL's capabilities and skills in security risk analysis and the development of vulnerability assessment/risk assessment methodologies to develop an automated prototype security RAM tool for critical infrastructures (RAM-CI). The prototype automated RAM tool provides a user-friendly, systematic, and comprehensive risk-based tool to assist CI sector and security professionals in assessing and managing security risk from malevolent threats. The current tool is structured on the basic RAM framework developed by SNL. It is envisioned that this prototype tool will be adapted to meet the requirements of different CI sectors and thereby provide additional capabilities.

  17. An automated method for fibrin clot permeability assessment.

    PubMed

    Ząbczyk, Michał; Piłat, Adam; Awsiuk, Magdalena; Undas, Anetta

    2015-01-01

The fibrin clot permeability coefficient (Ks) is a useful measure of the porosity of the fibrin network, which is determined by a number of genetic and environmental factors. Currently available methods to evaluate Ks are time-consuming, require constant supervision and provide only one parameter. We present an automated method in which drops are weighed individually, buffer is dosed by a pump and well-defined clot washing is controlled by software. A direct association between drop mass and dripping time allows the measurement time to be halved. In 40 healthy individuals, Ks, the number of drops required to reach the plateau (DTP), the time to achieve the plateau (TTP) and the DTP/TTP ratio (DTR) were calculated. There was a positive association between Ks (r = 0.69, P < 0.0001) evaluated using the manual [median of 4.17 (3.60-5.18) ·10⁻⁹ cm²] and the automated method [median of 4.35 (3.74-5.38) ·10⁻⁹ cm²]. The correlation was stronger (r = 0.85, P < 0.001) in clots with DTP of 7 or less (n = 12). DTP was associated with total homocysteine (tHcy) (r = 0.35, P < 0.05) and activated partial thromboplastin time (APTT) (r = -0.34, P < 0.05); TTP with Ks (r = -0.55, P < 0.01 for the manual method and r = -0.44, P < 0.01 for the automated method) and DTP (r = 0.75, P < 0.0001); and DTR with Ks (r = 0.70, P < 0.0001 for the manual method and r = 0.76, P < 0.0001 for the automated method), fibrinogen (r = -0.58, P < 0.0001) and C-reactive protein (CRP) (r = -0.47, P < 0.01). The automated method might be a suitable tool for research and clinical use and may offer additional parameters describing fibrin clot structure.
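For context, Ks is conventionally computed from Darcy's law, with the percolated buffer volume recovered here from the summed drop masses the device weighs. The sketch below uses that standard formula; the clot dimensions, pressure, and drop masses are illustrative assumptions, not values from the paper.

```python
def permeability_ks(drop_masses_g, seconds, clot_length_cm, area_cm2,
                    pressure_dyn_cm2, viscosity_poise=0.01, density_g_cm3=1.0):
    """Fibrin clot permeability coefficient in cm^2 via Darcy's law:
    Ks = V * L * eta / (t * A * dP), with V from total drop mass / density."""
    volume_cm3 = sum(drop_masses_g) / density_g_cm3
    return (volume_cm3 * clot_length_cm * viscosity_poise) / (
        seconds * area_cm2 * pressure_dyn_cm2)

# Illustrative numbers only: 10 drops of 0.02 g collected over 1 hour
# through a 1.3 cm clot under ~4 cm of water pressure head.
ks = permeability_ks(drop_masses_g=[0.02] * 10, seconds=3600,
                     clot_length_cm=1.3, area_cm2=0.049,
                     pressure_dyn_cm2=3900.0)
print(f"{ks:.2e} cm^2")  # ~3.8e-9 cm^2, within the range reported above
```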

  18. In vivo hippocampal measurement and memory: a comparison of manual tracing and automated segmentation in a large community-based sample.

    PubMed

    Cherbuin, Nicolas; Anstey, Kaarin J; Réglade-Meslin, Chantal; Sachdev, Perminder S

    2009-01-01

While manual tracing is the method of choice in measuring hippocampal volume, its time-intensive nature and proneness to human error make automated methods attractive, especially when applied to large samples. Few studies have systematically compared the performance of the two techniques. In this study, we measured hippocampal volumes in a large (N = 403) population-based sample of individuals aged 44-48 years using manual tracing by a trained researcher and an automated procedure using the Freesurfer (http://surfer.nmr.mgh.harvard.edu) imaging suite. Results showed that absolute hippocampal volumes assessed with these methods were significantly different, with the automated measures being larger by 23% for the left and 29% for the right hippocampus. The correlation between the two methods varied from 0.61 to 0.80, with lower correlations for hippocampi with visible abnormalities. Inspection of 2D and 3D models suggested that this difference was largely due to greater inclusion of boundary voxels by the automated method and to variations in subiculum/entorhinal segmentation. The correlation between left and right hippocampal volumes was very similar for the two methods. The relationship of hippocampal volumes to selected sociodemographic and cognitive variables was not affected by the measurement method, with each measure showing an association with memory performance, suggesting that both were equally valid for this purpose. This study supports the use of automated measures, based on Freesurfer in this instance, as sufficiently reliable and valid, particularly in the context of larger sample sizes, when the research question does not rely on 'true' hippocampal volumes.
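The two headline comparisons, mean percent difference and between-method correlation, can be sketched on synthetic paired volumes (all numbers below are simulated, not the study's data).

```python
import numpy as np

rng = np.random.default_rng(7)
manual = rng.normal(3000, 300, 50)                   # mm^3, synthetic tracings
automated = manual * 1.25 + rng.normal(0, 150, 50)   # ~25% larger, with noise

# Mean percent difference (automated relative to manual) and Pearson r.
pct_larger = 100 * (automated.mean() - manual.mean()) / manual.mean()
r = np.corrcoef(manual, automated)[0, 1]
print(round(pct_larger, 1), round(r, 2))
```

A Bland-Altman plot of the paired differences would be the natural next step for checking whether the bias is uniform across volume sizes.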

  19. Automation of sample preparation for mass cytometry barcoding in support of clinical research: protocol optimization.

    PubMed

    Nassar, Ala F; Wisnewski, Adam V; Raddassi, Khadir

    2017-03-01

Analysis of multiplexed assays is highly important for clinical diagnostics and other analytical applications. Mass cytometry enables multi-dimensional, single-cell analysis of cell type and state. In mass cytometry, the rare earth metals used as reporters on antibodies allow determination of marker expression in individual cells. Barcode-based bioassays for CyTOF are able to encode and decode different experimental conditions or samples within the same experiment, facilitating progress in producing straightforward and consistent results. Herein, an integrated protocol for automated sample preparation for barcoding, used in conjunction with mass cytometry for clinical bioanalysis samples, is described, together with the results of our barcoding protocol optimization. In addition, we present some points to consider in order to minimize the variability of quantitative mass cytometry measurements. For example, we discuss the importance of having multiple populations during titration of the antibodies and the effect of storage and shipping of labelled samples on staining stability for CyTOF analysis. Data quality is not affected when labelled samples are stored either frozen or at 4 °C and used within 10 days; we observed that cell loss is greater if cells are washed with deionized water prior to shipment or are shipped at a lower concentration. Once the labelled samples for CyTOF are suspended in deionized water, the analysis should be performed expeditiously, preferably within the first hour. Damage can be minimized if the cells are resuspended in phosphate-buffered saline (PBS) rather than deionized water while waiting for data acquisition.

  20. Establishing a novel automated magnetic bead-based method for the extraction of DNA from a variety of forensic samples.

    PubMed

    Witt, Sebastian; Neumann, Jan; Zierdt, Holger; Gébel, Gabriella; Röscheisen, Christiane

    2012-09-01

Automated systems have been increasingly utilized for DNA extraction by many forensic laboratories to handle growing numbers of forensic casework samples while minimizing the risk of human error and assuring high reproducibility. The step towards automation, however, is not easy: the automated extraction method has to be very versatile to reliably prepare high yields of pure genomic DNA from a broad variety of sample types on different carrier materials. To prevent possible cross-contamination of samples or the loss of DNA, the components of the kit have to be designed in a way that allows for the automated handling of the samples with no manual intervention necessary. DNA extraction using paramagnetic particles coated with a DNA-binding surface is predestined for an automated approach. For this study, we tested different DNA extraction kits using DNA-binding paramagnetic particles with regard to DNA yield and handling by a Freedom EVO® 150 extraction robot (Tecan) equipped with a Te-MagS magnetic separator. Among others, the extraction kits tested were the ChargeSwitch® Forensic DNA Purification Kit (Invitrogen), the PrepFiler™ Automated Forensic DNA Extraction Kit (Applied Biosystems) and NucleoMag™ 96 Trace (Macherey-Nagel). After an extensive test phase, we established a novel magnetic bead extraction method based upon the NucleoMag™ extraction kit (Macherey-Nagel). The new method is readily automatable and produces high yields of DNA from different sample types (blood, saliva, sperm, contact stains) on various substrates (filter paper, swabs, cigarette butts) with no evidence of loss of magnetic beads or sample cross-contamination.

  1. Drug Discovery Testing Compounds in Patients Samples by Automated Flow Cytometry.

    PubMed

    Hernández, Pilar; Gorrochategui, Julián; Primo, Daniel; Robles, Alicia; Rojas, José Luis; Espinosa, Ana Belén; Gómez, Cristina; Martínez-López, Joaquín; Bennett, Teresa A; Ballesteros, Joan

    2017-03-01

    Functional ex vivo assays that predict a patient's clinical response to anticancer drugs for guiding cancer treatment have long been a goal, but few have yet proved to be reliable. To address this, we have developed an automated flow cytometry platform for drug screening that evaluates multiple endpoints with a robust data analysis system that can capture the complex mechanisms of action across different compounds. This system, called PharmaFlow, is used to test peripheral blood or bone marrow samples from patients diagnosed with hematological malignancies. Functional assays that use the whole sample, retaining all the microenvironmental components contained in the sample, offer an approach to ex vivo testing that may give results that are clinically relevant. This new approach can help to predict the patients' response to existing treatments or to drugs under development, for hematological malignancies or other tumors. In addition, relevant biomarkers can be identified that determine the patient's sensitivity, resistance, or toxicity to a given treatment. We propose that this approach, which better recapitulates the human microenvironment, constitutes a more predictive assay for personalized medicine and preclinical drug discovery.

  2. From Sample Changer to the Robotic Rheometer: Automation and High Throughput Screening in Rotational Rheometry

    NASA Astrophysics Data System (ADS)

    Läuger, Jörg; Krenn, Michael

    2008-07-01

A fully automated, robotically operated rheometer was developed. The full functionality, modularity, and accuracy of the rotational rheometer are retained, bringing the modern principles of high-throughput screening to rotational rheometry. The basic rheometer setup remains as modular as before, including the ability to run all test modes the rheometer offers, with the difference that the high-throughput rheometer now performs all measuring steps automatically. In addition, the standard and proven environmental chambers of the rheometer are available. The rheometer itself is run by the standard rheometer software, and the measurement data and analysis results can be transferred to a monitoring database. Sample loading and cleaning of the geometries are assisted by a sample preparation unit and a cleaning station, respectively. Sample throughput is further maximized by the use of multiple geometries, allowing a rheological measurement to proceed on the rheometer while the robot cleans used geometries at the cleaning station. The High-Throughput Rheometer (HTR) and its special adaptation to different applications, such as dispersions and polymer melts, are described.

  3. Design and Implementation of an Automated Illuminating, Culturing, and Sampling System for Microbial Optogenetic Applications.

    PubMed

    Stewart, Cameron J; McClean, Megan N

    2017-02-19

    Optogenetic systems utilize genetically-encoded proteins that change conformation in response to specific wavelengths of light to alter cellular processes. There is a need for culturing and measuring systems that incorporate programmed illumination and stimulation of optogenetic systems. We present a protocol for building and using a continuous culturing apparatus to illuminate microbial cells with programmed doses of light, and automatically acquire and analyze images of cells in the effluent. The operation of this apparatus as a chemostat allows the growth rate and the cellular environment to be tightly controlled. The effluent of the continuous cell culture is regularly sampled and the cells are imaged by multi-channel microscopy. The culturing, sampling, imaging, and image analysis are fully automated so that dynamic responses in the fluorescence intensity and cellular morphology of cells sampled from the culture effluent are measured over multiple days without user input. We demonstrate the utility of this culturing apparatus by dynamically inducing protein production in a strain of Saccharomyces cerevisiae engineered with an optogenetic system that activates transcription.

  4. Automated measurement and quantification of heterotrophic bacteria in water samples based on the MPN method.

    PubMed

    Fuchsluger, C; Preims, M; Fritz, I

    2011-01-01

    Quantification of heterotrophic bacteria is a widely used measure for water analysis. Especially in terms of drinking water analysis, testing for microorganisms is strictly regulated by the European Drinking Water Directive, including quality criteria and detection limits. The quantification procedure presented in this study is based on the most probable number (MPN) method, which was adapted to comply with the need for a quick and easy screening tool for different kinds of water samples as well as varying microbial loads. Replacing tubes with 24-well titer plates for cultivation of bacteria drastically reduces the amount of culture media and also simplifies incubation. Automated photometric measurement of turbidity instead of visual evaluation of bacterial growth avoids misinterpretation by operators. Definition of a threshold ensures definite and user-independent determination of microbial growth. Calculation of the MPN itself is done using a program provided by the US Food and Drug Administration (FDA). For evaluation of the method, real water samples of different origins as well as pure cultures of bacteria were analyzed in parallel with the conventional plating methods. Thus, the procedure described requires less preparation time, reduces costs and ensures both stable and reliable results for water samples.
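    The MPN calculation that the procedure delegates to the FDA program can be illustrated for the simplest case of a single dilution level with replicate wells. A minimal sketch, assuming Poisson-distributed cells across wells; the function name and the 24-well example values are illustrative, not taken from the FDA tool:

    ```python
    import math

    def mpn_single_dilution(total_wells: int, positive_wells: int,
                            volume_ml: float) -> float:
        """Most-probable-number estimate for one dilution level.

        Under a Poisson model, the probability that a well stays negative
        is exp(-c * v), so the MLE is c = -ln(negatives/total) / v.
        """
        negatives = total_wells - positive_wells
        if negatives == 0:
            raise ValueError("all wells positive: MPN unbounded, dilute further")
        fraction_negative = negatives / total_wells
        return -math.log(fraction_negative) / volume_ml

    # e.g. 16 of 24 wells positive, 1 mL of sample per well
    print(round(mpn_single_dilution(24, 16, 1.0), 2))  # ~1.10 cells/mL
    ```

    Multi-dilution series require a joint maximum-likelihood fit over all levels, which is what dedicated MPN software such as the FDA program provides.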

  5. Automated Formative Assessment as a Tool to Scaffold Student Documentary Writing

    ERIC Educational Resources Information Center

    Ferster, Bill; Hammond, Thomas C.; Alexander, R. Curby; Lyman, Hunt

    2012-01-01

    The hurried pace of the modern classroom does not permit formative feedback on writing assignments at the frequency or quality recommended by the research literature. One solution for increasing individual feedback to students is to incorporate some form of computer-generated assessment. This study explores the use of automated assessment of…

  6. ADDING GLOBAL SOILS DATA TO THE AUTOMATED GEOSPATIAL WATERSHED ASSESSMENT TOOL (AGWA)

    EPA Science Inventory

    The Automated Geospatial Watershed Assessment Tool (AGWA) is a GIS-based hydrologic modeling tool that is available as an extension for ArcView 3.x from the USDA-ARS Southwest Watershed Research Center (www.tucson.ars.ag.gov/agwa). AGWA is designed to facilitate the assessment of...

  7. To the development of an automated system of assessment of radiological images of joints

    NASA Astrophysics Data System (ADS)

    Grechikhin, A. I.; Grunina, E. A.; Karetnikova, I. R.

    2008-03-01

An algorithm for adaptive, automated computer processing of radiological images of hands and feet, developed to assess the degree of bone and cartilage destruction in rheumatoid arthritis, is described. A set of new numerical indicators is proposed for assessing the degree of radiological progression of the arthritis.

  8. Designing an Automated Assessment of Public Speaking Skills Using Multimodal Cues

    ERIC Educational Resources Information Center

    Chen, Lei; Feng, Gary; Leong, Chee Wee; Joe, Jilliam; Kitchen, Christopher; Lee, Chong Min

    2016-01-01

    Traditional assessments of public speaking skills rely on human scoring. We report an initial study on the development of an automated scoring model for public speaking performances using multimodal technologies. Task design, rubric development, and human rating were conducted according to standards in educational assessment. An initial corpus of…

  9. Development of computer automated decision support system for surface water quality assessment

    NASA Astrophysics Data System (ADS)

    Sharma, Asheesh; Naidu, Madhuri; Sargaonkar, Aabha

    2013-02-01

The Overall Index of Pollution (OIP) is a single number that expresses overall water quality by integrating measurements of 14 different physicochemical, toxicological, and bacteriological water quality parameters. It provides a simple and concise method for classifying water quality as 'Excellent', 'Acceptable', 'Slightly Polluted', 'Polluted', or 'Heavily Polluted'. OIP values range from 0 to 16: a high OIP value signals poor water quality, while a low value signals good water quality, based on the classification scheme developed for India. In this paper, we present a computer-automated, user-friendly, standalone Surface Water Quality Assessment Tool (SWQAT), which calculates OIP values and displays them on a Google map. The software is developed in VB.Net with an SQL database. The application is demonstrated through water quality assessment of two rivers of India, the Cauvery and the Tungabhadra. OIP values are estimated at 10 sampling stations on the Cauvery and at eight sampling stations on the Tungabhadra. OIP values for the Cauvery range from 0.85 to 7.91, while those for the Tungabhadra range from 2.08 to 8.97. The results are useful for analyzing variations in water quality across sites and over time. SWQAT improves understanding of general water quality issues, communicates water quality status, and highlights the need for and effectiveness of protection measures.
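    The mapping from an OIP score to a class label can be sketched as follows. The abstract gives only the class names and the 0-16 range; the threshold values below are the ones commonly cited for the OIP scheme and should be treated as illustrative:

    ```python
    def classify_oip(oip: float) -> str:
        """Map an Overall Index of Pollution score (0-16) to its class.

        Thresholds are illustrative (commonly cited OIP class boundaries);
        the abstract itself lists only the class names.
        """
        if oip < 0 or oip > 16:
            raise ValueError("OIP is defined on the range 0-16")
        for upper, label in [(1, "Excellent"), (2, "Acceptable"),
                             (4, "Slightly Polluted"), (8, "Polluted"),
                             (16, "Heavily Polluted")]:
            if oip <= upper:
                return label

    print(classify_oip(0.85))  # Cauvery minimum -> Excellent
    print(classify_oip(8.97))  # Tungabhadra maximum -> Heavily Polluted
    ```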

  10. Quantification of 4-beta-hydroxycholesterol in human plasma using automated sample preparation and LC-ESI-MS/MS analysis.

    PubMed

    Goodenough, Angela K; Onorato, Joelle M; Ouyang, Zheng; Chang, Shu; Rodrigues, A David; Kasichayanula, Sreeneeranj; Huang, Shu-Pang; Turley, Wesley; Burrell, Richard; Bifano, Marc; Jemal, Mohammed; LaCreta, Frank; Tymiak, Adrienne; Wang-Iverson, David

    2011-09-19

It has recently been proposed that plasma levels of 4β-hydroxycholesterol (4βHC) may be indicative of cytochrome P450 3A4 (P450 3A) activity and therefore could be used to probe for P450 3A-mediated drug-drug interactions. With this in mind, we describe a highly sensitive and precise liquid chromatography-electrospray ionization-tandem mass spectrometry method for the measurement of 4βHC in human plasma, with a lower limit of quantification of 2 ng/mL using 50 μL of plasma. The entire sample preparation scheme, including saponification and derivatization of 4βHC to the corresponding dipicolinyl ester (DPE), was completed in less than 8 h using an automated workflow enabling higher-throughput capabilities. Chromatographic resolution of 4βHC from 4α-hydroxycholesterol and other endogenous isobaric species was achieved in 11 min using isocratic elution on a C18 column. Because of endogenous concentrations of 4βHC in plasma, a stable isotope labeled (SIL) analogue, d7-4βHC, was used as a surrogate analyte and measured in the standard curve and quality control samples prepared in plasma. A second SIL analogue, d4-4βHC, was used as the internal standard. The intraday and interday accuracy for the assay was within 6% of nominal concentrations, and the precision of these measurements was less than 5% relative standard deviation. Rigorous stability assessments demonstrated adequate stability of endogenous 4βHC in plasma and the corresponding DPE derivative for the analysis of clinical study samples. The results from clinical samples following treatment with a potent P450 3A inducer (rifampin) or inhibitor (ketoconazole) are reported and demonstrate the potential future application of this highly precise and robust analytical assay.

  11. A user-friendly robotic sample preparation program for fully automated biological sample pipetting and dilution to benefit the regulated bioanalysis.

    PubMed

    Jiang, Hao; Ouyang, Zheng; Zeng, Jianing; Yuan, Long; Zheng, Naiyu; Jemal, Mohammed; Arnold, Mark E

    2012-06-01

    Biological sample dilution is a rate-limiting step in bioanalytical sample preparation when the concentrations of samples are beyond standard curve ranges, especially when multiple dilution factors are needed in an analytical run. We have developed and validated a Microsoft Excel-based robotic sample preparation program (RSPP) that automatically transforms Watson worklist sample information (identification, sequence and dilution factor) to comma-separated value (CSV) files. The Freedom EVO liquid handler software imports and transforms the CSV files to executable worklists (.gwl files), allowing the robot to perform sample dilutions at variable dilution factors. The dynamic dilution range is 1- to 1000-fold and divided into three dilution steps: 1- to 10-, 11- to 100-, and 101- to 1000-fold. The whole process, including pipetting samples, diluting samples, and adding internal standard(s), is accomplished within 1 h for two racks of samples (96 samples/rack). This platform also supports online sample extraction (liquid-liquid extraction, solid-phase extraction, protein precipitation, etc.) using 96 multichannel arms. This fully automated and validated sample dilution and preparation process has been applied to several drug development programs. The results demonstrate that application of the RSPP for fully automated sample processing is efficient and rugged. The RSPP not only saved more than 50% of the time in sample pipetting and dilution but also reduced human errors. The generated bioanalytical data are accurate and precise; therefore, this application can be used in regulated bioanalysis.
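    The worklist transformation described above (sample identification, sequence, and dilution factor in; liquid-handler-ready CSV out) can be sketched as follows. How RSPP decomposes a factor across its three tiers is not spelled out in the abstract, so the serial-step logic and the CSV column names below are assumptions for illustration only:

    ```python
    import csv
    import io
    import math

    def split_dilution(factor: int) -> list:
        """Decompose a 1-1000x dilution into equal serial steps of <=10x.

        One plausible reading of the three-tier scheme (1-10x: one step,
        11-100x: two, 101-1000x: three); the actual RSPP logic is not
        published in this level of detail.
        """
        if not 1 <= factor <= 1000:
            raise ValueError("supported dynamic range is 1- to 1000-fold")
        steps = max(1, math.ceil(math.log10(factor)))
        per_step = factor ** (1 / steps)
        return [round(per_step, 3)] * steps

    def write_worklist(samples, out):
        """Emit one CSV row per dilution step for liquid-handler import."""
        w = csv.writer(out)
        w.writerow(["sample_id", "sequence", "step", "step_factor"])
        for sample_id, seq, factor in samples:
            for i, f in enumerate(split_dilution(factor), start=1):
                w.writerow([sample_id, seq, i, f])

    buf = io.StringIO()
    write_worklist([("S001", 1, 500)], buf)  # 500-fold -> three ~7.94x steps
    print(buf.getvalue())
    ```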

  12. Automation and integration of multiplexed on-line sample preparation with capillary electrophoresis for DNA sequencing

    SciTech Connect

    Tan, H.

    1999-03-31

The purpose of this research is to develop a multiplexed sample processing system in conjunction with multiplexed capillary electrophoresis for high-throughput DNA sequencing. The concept from DNA template to called bases was first demonstrated with a manually operated single capillary system. Later, an automated microfluidic system with 8 channels based on the same principle was successfully constructed. The instrument automatically processes 8 templates through reaction, purification, denaturation, pre-concentration, injection, separation and detection in a parallel fashion. A multiplexed freeze/thaw switching principle and a distribution network were implemented to manage flow direction and sample transportation. Dye-labeled terminator cycle-sequencing reactions are performed in an 8-capillary array in a hot air thermal cycler. Subsequently, the sequencing ladders are directly loaded into a corresponding size-exclusion chromatographic column operated at ~60 °C for purification. On-line denaturation and stacking injection for capillary electrophoresis is simultaneously accomplished at a cross assembly set at ~70 °C. Not only the separation capillary array but also the reaction capillary array and purification columns can be regenerated after every run. DNA sequencing data from this system allow base calling up to 460 bases with an accuracy of 98%.

  13. Analysis of zearalenone in cereal and Swine feed samples using an automated flow-through immunosensor.

    PubMed

    Urraca, Javier L; Benito-Peña, Elena; Pérez-Conde, Concepción; Moreno-Bondi, María C; Pestka, James J

    2005-05-04

The development of a sensitive flow-through immunosensor for the analysis of the mycotoxin zearalenone in cereal samples is described. The sensor was completely automated and was based on a direct competitive immunosorbent assay and fluorescence detection. The mycotoxin competes with a horseradish-peroxidase-labeled derivative for the binding sites of a rabbit polyclonal antibody. Controlled-pore glass covalently bound to Protein A was used for the oriented immobilization of the antibody-antigen immunocomplexes. The immunosensor shows an IC(50) value of 0.087 ng mL(-1) (RSD = 2.8%, n = 6) and a dynamic range from 0.019 to 0.422 ng mL(-1). The limit of detection (90% of blank signal) of 0.007 ng mL(-1) (RSD = 3.9%, n = 3) is lower than that of previously published methods. Corn, wheat, and swine feed samples have been analyzed with the device after extraction of the analyte using accelerated solvent extraction (ASE). The immunosensor has been validated using a corn certified reference material and HPLC with fluorescence detection.

  14. Measurement of airborne carbonyls using an automated sampling and analysis system.

    PubMed

    Aiello, Mauro; McLaren, Robert

    2009-12-01

Based upon the well-established method of derivatization with 2,4-dinitrophenylhydrazine, an instrument was developed for ambient measurement of carbonyls with significantly improved temporal resolution and detection limits through automation, direct injection, and continuous use of a single microsilica DNPH cartridge. Kinetic experiments indicate that the derivatization reaction on the cartridge is fast enough for continuous measurements with 50 min air sampling. Reaction efficiencies measured on the cartridge were 100% for the carbonyls tested, including formaldehyde, acetaldehyde, propanal, acetone, and benzaldehyde. Transmission of the carbonyls through an ozone scrubber (KI) was in the range of 97-101%. Blank levels and detection limits were lower than those obtainable with conventional DNPH methods by an order of magnitude or greater. Mixing ratio detection limits of carbonyls in ambient air were 38-73 ppt for a 50 min air sample (2.5 L). The instrument made continuous measurements of carbonyls on a 2 h cycle over a period of 10 days during a field study in southwestern Ontario. Median mixing ratios were 0.58 ppb formaldehyde, 0.29 ppb acetaldehyde, 1.14 ppb acetone, and 0.45 ppb glyoxal. Glyoxal shows a significant correlation with ozone and a zero intercept, consistent with a secondary source and only a minor direct source to the atmosphere. The method should easily be extendable to the detection of other low-molecular-weight carbonyls that have been previously reported using the DNPH technique.
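    The relationship between a mass detection limit and the reported mixing-ratio detection limits (e.g. ~38 ppt for a 2.5 L sample) is a unit conversion that can be sketched as follows, assuming ideal-gas behaviour at 25 °C and 1 atm; the function is illustrative, not part of the instrument software:

    ```python
    def mass_to_mixing_ratio_ppt(mass_ng: float, molar_mass_g: float,
                                 air_volume_l: float,
                                 molar_volume_l: float = 24.45) -> float:
        """Convert an analyte mass in a sampled air volume to a mixing ratio.

        Uses the ideal-gas molar volume of 24.45 L/mol (25 degC, 1 atm):
        ppt = (moles of analyte / moles of air) * 1e12.
        """
        moles_analyte = mass_ng * 1e-9 / molar_mass_g
        moles_air = air_volume_l / molar_volume_l
        return moles_analyte / moles_air * 1e12

    # formaldehyde (30.03 g/mol): ~0.12 ng in a 2.5 L air sample
    # corresponds to roughly the reported 38 ppt detection limit
    print(round(mass_to_mixing_ratio_ppt(0.117, 30.03, 2.5)))
    ```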

  15. Plasma cortisol and norepinephrine concentrations in pigs: automated sampling of freely moving pigs housed in the PigTurn versus manually sampled and restrained pigs

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Minimizing effects of restraint and human interaction on the endocrine physiology of animals is essential for collection of accurate physiological measurements. Our objective was to compare stress-induced cortisol (CORT) and norepinephrine (NE) responses in automated versus manual blood sampling. A ...

  16. Plasma cortisol and noradrenalin concentrations in pigs: automated sampling of freely moving pigs housed in PigTurn versus manually sampled and restrained pigs

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Minimizing the effects of restraint and human interaction on the endocrine physiology of animals is essential for collection of accurate physiological measurements. Our objective was to compare stress-induced cortisol (CORT) and noradrenalin (NorA) responses in automated versus manual blood sampling...

  17. Strategies for automated sample preparation, nucleic acid purification, and concentration of low-target-number nucleic acids in environmental and food processing samples

    NASA Astrophysics Data System (ADS)

    Bruckner-Lea, Cynthia J.; Holman, David A.; Schuck, Beatrice L.; Brockman, Fred J.; Chandler, Darrell P.

    1999-01-01

The purpose of this work is to develop a rapid, automated system for nucleic acid purification and concentration from environmental and food processing samples. Our current approach involves off-line filtration and cell lysis (ballistic disintegration) functions in appropriate buffers followed by automated nucleic acid capture and purification on renewable affinity matrix microcolumns. Physical cell lysis and renewable affinity microcolumns eliminate the need for toxic organic solvents, enzyme digestions or other time-consuming sample manipulations. Within the renewable affinity microcolumn, we have examined nucleic acid capture and purification efficiency with various microbead matrices (glass, polymer, paramagnetic), surface derivatization (sequence-specific capture oligonucleotides or peptide nucleic acids), and DNA target size and concentration under variable solution conditions and temperatures. Results will be presented comparing automated system performance relative to benchtop procedures for both clean (pure DNA from a laboratory culture) and environmental (soil extract) samples, including results which demonstrate 8 minute purification and elution of low-copy nucleic acid targets from a crude soil extract in a form suitable for PCR or microarray-based detectors. Future research will involve the development of improved affinity reagents and complete system integration, including upstream cell concentration and cell lysis functions and downstream, gene-based detectors. Results of this research will ultimately lead to improved processes and instrumentation for on-line, automated monitors for pathogenic micro-organisms in food, water, air, and soil samples.

  18. Automated extraction and quantitation of oncogenic HPV genotypes from cervical samples by a real-time PCR-based system.

    PubMed

    Broccolo, Francesco; Cocuzza, Clementina E

    2008-03-01

    Accurate laboratory assays for the diagnosis of persistent oncogenic HPV infection are being recognized increasingly as essential for clinical management of women with cervical precancerous lesions. HPV viral load has been suggested to be a surrogate marker of persistent infection. Four independent real-time quantitative TaqMan PCR assays were developed for: HPV-16, -31, -18 and/or -45 and -33 and/or -52, -58, -67. The assays had a wide dynamic range of detection and a high degree of accuracy, repeatability and reproducibility. In order to minimize material and hands-on time, automated nucleic acid extraction was performed using a 96-well plate format integrated into a robotic liquid handler workstation. The performance of the TaqMan assays for HPV identification was assessed by comparing results with those obtained by means of PCR using consensus primers (GP5+/GP6+) and sequencing (296 samples) and INNO-LiPA analysis (31 samples). Good agreement was found generally between results obtained by real-time PCR assays and GP(+)-PCR system (kappa statistic=0.91). In conclusion, this study describes four newly developed real-time PCR assays that provide a reliable and high-throughput method for detection of not only HPV DNA but also HPV activity of the most common oncogenic HPV types in cervical specimens.
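    The kappa statistic used above to summarize agreement between the TaqMan assays and the GP(+)-PCR system can be computed directly from paired genotype calls. A minimal sketch of Cohen's kappa with hypothetical calls (not data from the study):

    ```python
    from collections import Counter

    def cohens_kappa(calls_a, calls_b):
        """Chance-corrected agreement between two sets of categorical calls.

        kappa = (p_observed - p_expected) / (1 - p_expected), where
        p_expected is the agreement expected from the marginal label
        frequencies of each method.
        """
        n = len(calls_a)
        observed = sum(a == b for a, b in zip(calls_a, calls_b)) / n
        freq_a, freq_b = Counter(calls_a), Counter(calls_b)
        expected = sum(freq_a[lab] * freq_b[lab]
                       for lab in freq_a.keys() & freq_b.keys()) / n ** 2
        return (observed - expected) / (1 - expected)

    # hypothetical genotype calls from two assays on four specimens
    a = ["HPV16", "HPV16", "HPV18", "neg"]
    b = ["HPV16", "HPV16", "HPV18", "HPV18"]
    print(cohens_kappa(a, b))  # 0.6
    ```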

  19. Automated dispersive liquid-liquid microextraction coupled to high performance liquid chromatography - cold vapour atomic fluorescence spectroscopy for the determination of mercury species in natural water samples.

    PubMed

    Liu, Yao-Min; Zhang, Feng-Ping; Jiao, Bao-Yu; Rao, Jin-Yu; Leng, Geng

    2017-04-14

An automated, home-constructed, and low-cost dispersive liquid-liquid microextraction (DLLME) device directly coupled to a high performance liquid chromatography (HPLC) - cold vapour atomic fluorescence spectroscopy (CVAFS) system was designed and developed for the determination of trace concentrations of methylmercury (MeHg(+)), ethylmercury (EtHg(+)) and inorganic mercury (Hg(2+)) in natural waters. With a simple, miniaturized and efficient automated DLLME system, nanogram amounts of these mercury species were extracted from natural water samples and injected into a hyphenated HPLC-CVAFS for quantification. The complete analytical procedure, including chelation, extraction, phase separation, collection and injection of the extracts, as well as HPLC-CVAFS quantification, was automated. Key parameters, such as the type and volume of the chelation, extraction and dispersive solvents, aspiration speed, sample pH, salt effect and matrix effect, were thoroughly investigated. Under the optimum conditions, the linear range was 10-1200 ng L(-1) for EtHg(+) and 5-450 ng L(-1) for MeHg(+) and Hg(2+). Limits of detection were 3.0 ng L(-1) for EtHg(+) and 1.5 ng L(-1) for MeHg(+) and Hg(2+). Reproducibility and recoveries were assessed by spiking three natural water samples with different Hg concentrations, giving recoveries of 88.4-96.1% and relative standard deviations <5.1%.

  20. Carotid Catheterization and Automated Blood Sampling Induce Systemic IL-6 Secretion and Local Tissue Damage and Inflammation in the Heart, Kidneys, Liver and Salivary Glands in NMRI Mice

    PubMed Central

    Teilmann, Anne Charlotte; Rozell, Björn; Kalliokoski, Otto; Hau, Jann; Abelson, Klas S. P.

    2016-01-01

Automated blood sampling through a vascular catheter is a frequently utilized technique in laboratory mice. The potential immunological and physiological implications associated with this technique have, however, not been investigated in detail. The present study compared plasma levels of the cytokines IL-1β, IL-2, IL-6, IL-10, IL-17A, GM-CSF, IFN-γ and TNF-α in male NMRI mice that had been subjected to carotid artery catheterization and subsequent automated blood sampling with those of age-matched control mice. Body weight and histopathological changes in the surgical area, including the salivary glands, the heart, brain, spleen, liver, kidneys and lungs, were compared. Catheterized mice had higher levels of IL-6 than did control mice, but other cytokine levels did not differ between the groups. No significant difference in body weight was found. The histology revealed inflammatory and regenerative (healing) changes at the surgical sites of all catheterized mice, with mild inflammatory changes extending into the salivary glands. Several catheterized mice had multifocal degenerative to necrotic changes with inflammation in the heart, kidneys and liver, suggesting that thrombi had detached from the catheter tip and embolized to distant sites. Thus, catheterization and subsequent automated blood sampling may have a physiological impact. Possible confounding effects of visceral damage should be assessed and considered when using catheterized mouse models. PMID:27832170

  1. Development of an automated multiple-target mask CD disposition system to enable new sampling strategy

    NASA Astrophysics Data System (ADS)

    Ma, Jian; Farnsworth, Jeff; Bassist, Larry; Cui, Ying; Mammen, Bobby; Padmanaban, Ramaswamy; Nadamuni, Venkatesh; Kamath, Muralidhar; Buckmann, Ken; Neff, Julie; Freiberger, Phil

    2006-03-01

Traditional mask critical dimension (CD) disposition systems with only one or two targets are being challenged by new requirements from mask users as wafer process control becomes more complicated in newer generations of technology. Historically, the mask shop does not necessarily measure and disposition off the same kind of CD structures that wafer fabs do. Mask disposition specifications and structures come from the frame-design and the tapeout, while wafer-level CD dispositions are mainly based on the historical process window established per CD-skew experiments and EOL (end of line) yield. In the current high-volume manufacturing environment, mask CDs are mainly dispositioned off their mean-to-target (MTT) and uniformity (6sigma) on one or two types of pre-determined structures. The disposition specification is set to ensure the printed mask will meet the design requirements and to ensure minimum deviation from them. The CD data are also used to adjust the dose of the mask exposure tools to control CD MTT. As a result, the mask CD disposition automation system was built to allow only one or two kinds of targets at most. In contrast, wafer fabs measure a fairly wide range of different structures to ensure their process is on target and in control. The number of such structures considered critical is increasing due to the growing complexity of the technology. To fully comprehend the wafer-level requirements, it is highly desirable to align the mask CD sample sites and dispositions with those of the wafer fabs, to measure the OPC (optical proximity correction) structures or their equivalents whenever possible, and to establish the true correlation between mask CD measurements and wafer CD measurements. In this paper, the development of an automated multiple-target mask CD disposition system with the goal of enabling a new sampling strategy is presented. The pros and cons of its implementation are discussed. The new system has been inserted in

  2. AST: an automated sequence-sampling method for improving the taxonomic diversity of gene phylogenetic trees.

    PubMed

    Zhou, Chan; Mao, Fenglou; Yin, Yanbin; Huang, Jinling; Gogarten, Johann Peter; Xu, Ying

    2014-01-01

    A challenge in phylogenetic inference of gene trees is how to properly sample a large pool of homologous sequences to derive a good representative subset of sequences. Such a need arises in various applications, e.g. when (1) accuracy-oriented phylogenetic reconstruction methods may not be able to deal with a large pool of sequences due to their high demand in computing resources; (2) applications analyzing a collection of gene trees may prefer to use trees with fewer operational taxonomic units (OTUs), for instance for the detection of horizontal gene transfer events by identifying phylogenetic conflicts; and (3) the pool of available sequences is biased towards extensively studied species. In the past, the creation of subsamples often relied on manual selection. Here we present an Automated sequence-Sampling method for improving the Taxonomic diversity of gene phylogenetic trees, AST, to obtain representative sequences that maximize the taxonomic diversity of the sampled sequences. To demonstrate the effectiveness of AST, we have applied it to four problems, namely, inference of the evolutionary histories of the small ribosomal subunit protein S5 of E. coli, 16S ribosomal RNAs and glycosyl-transferase gene family 8, and a study of ancient horizontal gene transfers from bacteria to plants. Our results show that the resolution of our computational results is almost as good as that of manual inference by domain experts, hence making the tool generally useful to phylogenetic studies by non-phylogeny specialists. The program is available at http://csbl.bmb.uga.edu/~zhouchan/AST.php.
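
    The core idea behind AST, selecting a subset of sequences that maximizes taxonomic diversity, can be illustrated with a greedy sketch. This is not the authors' algorithm; the lineage tuples and the "count unseen ranks" scoring rule below are illustrative assumptions only.

    ```python
    # Greedy selection of sequences that maximizes taxonomic diversity.
    # Hedged sketch: each sequence carries a (phylum, genus, species) lineage,
    # and at each step we pick the sequence whose lineage contributes the most
    # taxonomic ranks not yet represented in the chosen subset.

    def greedy_diverse_subset(sequences, k):
        """sequences: dict id -> tuple of taxonomic ranks; returns k ids."""
        chosen, seen = [], set()
        pool = dict(sequences)
        while pool and len(chosen) < k:
            # Score each candidate by the number of still-unseen ranks it adds.
            best = max(pool, key=lambda s: sum(r not in seen for r in pool[s]))
            chosen.append(best)
            seen.update(pool.pop(best))
        return chosen

    # Hypothetical pool biased towards E. coli (an extensively studied species).
    seqs = {
        "ecoli_1": ("Proteobacteria", "Escherichia", "E. coli"),
        "ecoli_2": ("Proteobacteria", "Escherichia", "E. coli"),
        "bsub_1":  ("Firmicutes", "Bacillus", "B. subtilis"),
        "syn_1":   ("Cyanobacteria", "Synechococcus", "S. elongatus"),
    }
    picked = greedy_diverse_subset(seqs, 3)  # skips the redundant E. coli copy
    ```

    The greedy rule reproduces the intuition in the abstract: redundant sequences from over-represented species are dropped first, so the retained subset spans more of the taxonomy.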

  3. AST: An Automated Sequence-Sampling Method for Improving the Taxonomic Diversity of Gene Phylogenetic Trees

    PubMed Central

    Zhou, Chan; Mao, Fenglou; Yin, Yanbin; Huang, Jinling; Gogarten, Johann Peter; Xu, Ying

    2014-01-01

    A challenge in phylogenetic inference of gene trees is how to properly sample a large pool of homologous sequences to derive a good representative subset of sequences. Such a need arises in various applications, e.g. when (1) accuracy-oriented phylogenetic reconstruction methods may not be able to deal with a large pool of sequences due to their high demand in computing resources; (2) applications analyzing a collection of gene trees may prefer to use trees with fewer operational taxonomic units (OTUs), for instance for the detection of horizontal gene transfer events by identifying phylogenetic conflicts; and (3) the pool of available sequences is biased towards extensively studied species. In the past, the creation of subsamples often relied on manual selection. Here we present an Automated sequence-Sampling method for improving the Taxonomic diversity of gene phylogenetic trees, AST, to obtain representative sequences that maximize the taxonomic diversity of the sampled sequences. To demonstrate the effectiveness of AST, we have applied it to four problems, namely, inference of the evolutionary histories of the small ribosomal subunit protein S5 of E. coli, 16S ribosomal RNAs and glycosyl-transferase gene family 8, and a study of ancient horizontal gene transfers from bacteria to plants. Our results show that the resolution of our computational results is almost as good as that of manual inference by domain experts, hence making the tool generally useful to phylogenetic studies by non-phylogeny specialists. The program is available at http://csbl.bmb.uga.edu/~zhouchan/AST.php. PMID:24892935

  4. Computerized Analytical Data Management System and Automated Analytical Sample Transfer System at the COGEMA Reprocessing Plants in La Hague

    SciTech Connect

    Flament, T.; Goasmat, F.; Poilane, F.

    2002-02-25

    Managing the operation of large commercial spent nuclear fuel reprocessing plants, such as UP3 and UP2-800 in La Hague, France, requires an extensive analytical program and the shortest possible analysis response times. COGEMA, together with its engineering subsidiary SGN, decided to build high-performance laboratories to support operations in its plants. These laboratories feature automated equipment, safe environments for operators, and short response times, all in centralized installations. Implementation of a computerized analytical data management system and a fully automated pneumatic system for the transfer of radioactive samples was a key factor contributing to the successful operation of the laboratories and plants.

  5. Automated sample preparation techniques for the determination of drug enantiomers in biological fluids using liquid chromatography with chiral stationary phases.

    PubMed

    Ceccato, A; Toussaint, B; Chiap, P; Hubert, P; Crommen, J

    1999-01-01

    The determination of drug enantiomers has become of prime importance in the field of pharmaceutical and biomedical analysis. Liquid chromatography (LC) is one of the most frequently used techniques for achieving the separation and quantitation of the enantiomers of drug compounds. In the bioanalytical field, the integrated systems present an interesting alternative to time-consuming sample preparation techniques such as liquid-liquid extraction. Solid phase extraction (SPE) on disposable cartridges, dialysis or column switching are sample preparation techniques that can be fully automated and applied to enantioselective analysis in biological fluids. The selection of the most appropriate LC mode and chiral stationary phase for enantioseparations in bioanalysis is discussed and some aspects of these automated sample preparation procedures are compared, such as selectivity, detectability, elution of the analytes from the extraction sorbent, sample volume and analyte stability.

  6. Automated assessment of exclusion criteria for DXA lumbar spine scans.

    PubMed

    Barden, Howard S; Markwardt, Paul; Payne, Randy; Hawkins, Brent; Frank, Matt; Faulkner, Kenneth G

    2003-01-01

    Modern bone densitometry systems using dual-energy X-ray absorptiometry (DXA) automatically analyze lumbar spine scans and provide clinically important information concerning spine bone mineral density (BMD) and fracture risk. Lumbar spine BMD accurately reflects skeletal health and fracture risk in most cases, but degenerative diseases associated with aging may lead to the formation of reactive bone (osteophytes) and other confounding conditions that elevate BMD without a concomitant increase in bone strength or decrease in fracture risk. Automated densitometry software known as computer-aided densitometry (CAD) (GE Medical Systems Lunar) assists the user in identifying scans with common acquisition and analysis irregularities known to influence BMD values. Visual examination of 231 female spine scans measured with DXA found abnormal conditions that could influence BMD results in 29% of scans. The sensitivity and specificity of several criteria for identifying scans with conditions that could influence BMD were determined. A good criterion for identifying scans with abnormal conditions was a T-score difference of greater than 0.9 or 1.0 between L1-L4 mean and individual vertebrae. Criteria for excluding affected vertebrae were determined. Exclusion of affected vertebrae resulted in a mean BMD decrease of nearly 0.6 SD (T-score) among affected scans.
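
    The exclusion criterion reported above (a T-score difference greater than 0.9-1.0 between the L1-L4 mean and an individual vertebra) is simple enough to sketch directly. The T-scores below are invented for illustration; this is not the CAD software's implementation.

    ```python
    # Sketch of the reported DXA exclusion criterion: flag a vertebra when its
    # T-score differs from the L1-L4 mean by more than a threshold (the study
    # found 0.9-1.0 to discriminate well). Scan values here are hypothetical.

    def flag_vertebrae(t_scores, threshold=1.0):
        """t_scores: dict vertebra -> T-score; returns vertebrae exceeding the
        threshold difference from the mean (candidates for exclusion)."""
        mean_t = sum(t_scores.values()) / len(t_scores)
        return [v for v, t in t_scores.items() if abs(t - mean_t) > threshold]

    # L4 is artificially elevated, as an osteophyte would cause.
    scan = {"L1": -1.8, "L2": -1.6, "L3": -1.5, "L4": 0.4}
    flagged = flag_vertebrae(scan)  # only L4 exceeds the 1.0 T-score criterion
    ```

    Excluding the flagged vertebra and recomputing the mean would lower the reported BMD, consistent with the ~0.6 SD mean decrease the study observed among affected scans.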

  7. Automated Quality Assessment of Structural Magnetic Resonance Brain Images Based on a Supervised Machine Learning Algorithm

    PubMed Central

    Pizarro, Ricardo A.; Cheng, Xi; Barnett, Alan; Lemaitre, Herve; Verchinski, Beth A.; Goldman, Aaron L.; Xiao, Ena; Luo, Qian; Berman, Karen F.; Callicott, Joseph H.; Weinberger, Daniel R.; Mattay, Venkata S.

    2016-01-01

    High-resolution three-dimensional magnetic resonance imaging (3D-MRI) is being increasingly used to delineate morphological changes underlying neuropsychiatric disorders. Unfortunately, artifacts frequently compromise the utility of 3D-MRI yielding irreproducible results, from both type I and type II errors. It is therefore critical to screen 3D-MRIs for artifacts before use. Currently, quality assessment involves slice-wise visual inspection of 3D-MRI volumes, a procedure that is both subjective and time consuming. Automating the quality rating of 3D-MRI could improve the efficiency and reproducibility of the procedure. The present study is one of the first efforts to apply a support vector machine (SVM) algorithm in the quality assessment of structural brain images, using global and region of interest (ROI) automated image quality features developed in-house. SVM is a supervised machine-learning algorithm that can predict the category of test datasets based on the knowledge acquired from a learning dataset. The performance (accuracy) of the automated SVM approach was assessed, by comparing the SVM-predicted quality labels to investigator-determined quality labels. The accuracy for classifying 1457 3D-MRI volumes from our database using the SVM approach is around 80%. These results are promising and illustrate the possibility of using SVM as an automated quality assessment tool for 3D-MRI. PMID:28066227

  8. Assessing Writing in MOOCs: Automated Essay Scoring and Calibrated Peer Review™

    ERIC Educational Resources Information Center

    Balfour, Stephen P.

    2013-01-01

    Two of the largest Massive Open Online Course (MOOC) organizations have chosen different methods for the way they will score and provide feedback on essays students submit. EdX, MIT and Harvard's non-profit MOOC federation, recently announced that they will use a machine-based Automated Essay Scoring (AES) application to assess written work in…

  9. Automated Quality Assessment of Structural Magnetic Resonance Brain Images Based on a Supervised Machine Learning Algorithm.

    PubMed

    Pizarro, Ricardo A; Cheng, Xi; Barnett, Alan; Lemaitre, Herve; Verchinski, Beth A; Goldman, Aaron L; Xiao, Ena; Luo, Qian; Berman, Karen F; Callicott, Joseph H; Weinberger, Daniel R; Mattay, Venkata S

    2016-01-01

    High-resolution three-dimensional magnetic resonance imaging (3D-MRI) is being increasingly used to delineate morphological changes underlying neuropsychiatric disorders. Unfortunately, artifacts frequently compromise the utility of 3D-MRI yielding irreproducible results, from both type I and type II errors. It is therefore critical to screen 3D-MRIs for artifacts before use. Currently, quality assessment involves slice-wise visual inspection of 3D-MRI volumes, a procedure that is both subjective and time consuming. Automating the quality rating of 3D-MRI could improve the efficiency and reproducibility of the procedure. The present study is one of the first efforts to apply a support vector machine (SVM) algorithm in the quality assessment of structural brain images, using global and region of interest (ROI) automated image quality features developed in-house. SVM is a supervised machine-learning algorithm that can predict the category of test datasets based on the knowledge acquired from a learning dataset. The performance (accuracy) of the automated SVM approach was assessed, by comparing the SVM-predicted quality labels to investigator-determined quality labels. The accuracy for classifying 1457 3D-MRI volumes from our database using the SVM approach is around 80%. These results are promising and illustrate the possibility of using SVM as an automated quality assessment tool for 3D-MRI.
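
    The supervised-classification step can be illustrated with a minimal linear SVM trained by sub-gradient descent on hinge loss. This is a from-scratch sketch under stated assumptions, not the study's implementation: the two quality features (an artifact score and a normalized SNR) and the pass/fail labels are synthetic.

    ```python
    # Minimal linear SVM (hinge loss, sub-gradient descent), standing in for
    # the supervised quality classifier described above. Features and labels
    # are synthetic assumptions: (artifact score, SNR), +1 = usable scan.

    def train_linear_svm(X, y, lr=0.01, lam=0.01, epochs=200):
        """X: list of feature tuples, y: labels in {-1, +1}."""
        w = [0.0] * len(X[0])
        b = 0.0
        for _ in range(epochs):
            for xi, yi in zip(X, y):
                margin = yi * (sum(wj * xj for wj, xj in zip(w, xi)) + b)
                if margin < 1:   # inside margin: hinge gradient + regularizer
                    w = [wj + lr * (yi * xj - lam * wj)
                         for wj, xj in zip(w, xi)]
                    b += lr * yi
                else:            # outside margin: only regularization shrinks w
                    w = [wj * (1 - lr * lam) for wj in w]
        return w, b

    def predict(w, b, x):
        return 1 if sum(wj * xj for wj, xj in zip(w, x)) + b >= 0 else -1

    X = [(0.1, 0.9), (0.2, 0.8), (0.8, 0.2), (0.9, 0.1)]  # toy training scans
    y = [1, 1, -1, -1]
    w, b = train_linear_svm(X, y)
    ```

    In practice a library implementation (e.g. a soft-margin SVM with a non-linear kernel) and cross-validated accuracy against investigator labels, as in the study, would replace this toy loop.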

  10. AUTOMATED GEOSPATIAL WATERSHED ASSESSMENT (AGWA): A GIS-BASED HYDROLOGICAL MODELING TOOL FOR WATERSHED MANAGEMENT AND LANDSCAPE ASSESSMENT

    EPA Science Inventory

    The Automated Geospatial Watershed Assessment (http://www.epa.gov/nerlesd1/land-sci/agwa/introduction.htm and www.tucson.ars.ag.gov/agwa) tool is a GIS interface jointly developed by the U.S. Environmental Protection Agency, USDA-Agricultural Research Service, and the University ...

  11. Assessment of Automated Measurement and Verification (M&V) Methods

    SciTech Connect

    Granderson, Jessica; Touzani, Samir; Custodio, Claudine; Sohn, Michael; Fernandes, Samuel; Jump, David

    2015-07-01

    This report documents the application of a general statistical methodology to assess the accuracy of baseline energy models, focusing on its application to Measurement and Verification (M&V) of whole-building energy savings.

  12. Assessment of two automated imaging systems in evaluating estrogen receptor status in breast carcinoma.

    PubMed

    Gokhale, Sumita; Rosen, Daniel; Sneige, Nour; Diaz, Leslie K; Resetkova, Erika; Sahin, Aysegul; Liu, Jinsong; Albarracin, Constance T

    2007-12-01

    Immunohistochemical staining for estrogen receptor (ER) status is widely used in the management of breast cancer. These stains have traditionally been scored manually, which results in generally good agreement among observers when the cases are strongly positive. However, significant interobserver and intraobserver differences in scoring can occur in borderline or weakly staining cases. Recently, automated systems have been proposed to provide a more sensitive and objective method of ER quantification. The ChromaVision Automated Cellular Imaging System and the Applied Imaging Ariol SL-50 quantify the color intensity of the immunoreactive product. To assess the accuracy of these 2 automated systems and to compare them to one another and to manual scoring, we performed immunostaining for ER on 64 cases of breast cancer. The percentages of positive cells were scored manually by 4 pathologists and by the 2 imaging systems. A discrepancy in scoring was defined as that which resulted in the reclassification of a case from negative to positive or vice versa. Our results showed significant agreement between the 2 automated systems. When automated scores were compared with the manual scores, only 5 of the 64 cases (7%) were discrepant. In 4 of these, the percentage of cells staining for ER was low (0% to 20%). Overall, the 2 systems were comparable, and discrepant results were most frequently seen when analyzing tumors with low levels of ER positive cells.

  13. Assessing Creative Problem-Solving with Automated Text Grading

    ERIC Educational Resources Information Center

    Wang, Hao-Chuan; Chang, Chun-Yen; Li, Tsai-Yen

    2008-01-01

    The work aims to improve the assessment of creative problem-solving in science education by employing language technologies and computational-statistical machine learning methods to grade students' natural language responses automatically. To evaluate constructs like creative problem-solving with validity, open-ended questions that elicit…

  14. Automated Cognitive Health Assessment Using Smart Home Monitoring of Complex Tasks.

    PubMed

    Dawadi, Prafulla N; Cook, Diane J; Schmitter-Edgecombe, Maureen

    2013-11-01

    One of the many services that intelligent systems can provide is the automated assessment of resident well-being. We hypothesize that the functional health of individuals, or ability of individuals to perform activities independently without assistance, can be estimated by tracking their activities using smart home technologies. In this paper, we introduce a machine learning-based method for assessing activity quality in smart homes. To validate our approach we quantify activity quality for 179 volunteer participants who performed a complex, interweaved set of activities in our smart home apartment. We observed a statistically significant correlation (r=0.79) between automated assessment of task quality and direct observation scores. Using machine learning techniques to predict the cognitive health of the participants based on task quality is accomplished with an AUC value of 0.64. We believe that this capability is an important step in understanding everyday functional health of individuals in their home environments.
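
    The reported statistic (r = 0.79 between automated task-quality assessment and direct observation) is a plain Pearson correlation, computable as below. The paired scores are invented for illustration; this is a sketch of the statistic, not the study's data.

    ```python
    # Pearson correlation between automated task-quality scores and direct
    # observation scores, the statistic reported above. Scores are invented.
    from math import sqrt

    def pearson_r(xs, ys):
        n = len(xs)
        mx, my = sum(xs) / n, sum(ys) / n
        cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
        sx = sqrt(sum((x - mx) ** 2 for x in xs))
        sy = sqrt(sum((y - my) ** 2 for y in ys))
        return cov / (sx * sy)

    auto_scores = [3.1, 4.0, 2.2, 4.8, 3.5]  # hypothetical automated scores
    observed    = [3.0, 4.2, 2.5, 4.9, 3.2]  # hypothetical observer ratings
    r = pearson_r(auto_scores, observed)
    ```

    A separate AUC computation over predicted-versus-actual cognitive-health labels would correspond to the study's 0.64 classification result.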

  15. Method and Apparatus for Automated Isolation of Nucleic Acids from Small Cell Samples

    NASA Technical Reports Server (NTRS)

    Sundaram, Shivshankar; Prabhakarpandian, Balabhaskar; Pant, Kapil; Wang, Yi

    2014-01-01

    RNA isolation is a ubiquitous need, driven by the current emphasis on microarrays and miniaturization. With commercial systems requiring 100,000 to 1,000,000 cells for successful isolation, there is a growing need for a small-footprint, easy-to-use device that can harvest nucleic acids from much smaller cell samples (1,000 to 10,000 cells). The extraction of RNA from cell cultures is a complex, multi-step process that requires timed, asynchronous operations with multiple reagents/buffers. An added complexity is the fragility of RNA (subject to degradation) and its reactivity to surfaces. A novel, microfluidics-based, integrated cartridge has been developed that can fully automate the complex process of RNA isolation (lyse, capture, and elute RNA) from small cell culture samples. On-cartridge cell lysis is achieved using either reagents or high-strength electric fields made possible by the miniaturized format. Traditionally, silica-based, porous-membrane formats have been used for RNA capture, requiring slow perfusion for effective capture. In this design, high-efficiency capture and elution are achieved using a microsphere-based "microfluidized" format. Electrokinetic phenomena are harnessed to actively mix microspheres with the cell lysate and capture/elution buffer, providing important advantages in extraction efficiency, processing time, and operational flexibility. Successful RNA isolation was demonstrated using both suspension (HL-60) and adherent (BHK-21) cells. Novel features associated with this development are twofold. First, novel designs that execute the needed processes with improved speed and efficiency were developed; these primarily encompass electric-field-driven lysis of cells. The configurations include electrode-containing constructs, or an "electrode-less" chip design, which is easy to fabricate and mitigates fouling at the electrode surface; and the "fluidized" extraction format based on electrokinetically assisted mixing and contacting of microbeads.

  16. Falcon: automated optimization method for arbitrary assessment criteria

    DOEpatents

    Yang, Tser-Yuan; Moses, Edward I.; Hartmann-Siantar, Christine

    2001-01-01

    FALCON is a method for automatic multivariable optimization for arbitrary assessment criteria that can be applied to numerous fields where outcome simulation is combined with optimization and assessment criteria. A specific implementation of FALCON is for automatic radiation therapy treatment planning. In this application, FALCON implements dose calculations into the planning process and optimizes available beam delivery modifier parameters to determine the treatment plan that best meets clinical decision-making criteria. FALCON is described in the context of the optimization of external-beam radiation therapy and intensity modulated radiation therapy (IMRT), but the concepts could also be applied to internal (brachytherapy) radiotherapy. The radiation beams could consist of photons or any charged or uncharged particles. The concept of optimizing source distributions can be applied to complex radiography (e.g. flash x-ray or proton) to improve the imaging capabilities of facilities proposed for science-based stockpile stewardship.

  17. Capturing temporal variation in phosphorus dynamics in groundwater dominated rivers using automated high-frequency sampling

    NASA Astrophysics Data System (ADS)

    Bieroza, M. Z.; Heathwaite, A. L.; Mullinger, N. J.; Keenan, P. O.

    2012-04-01

    High-frequency river water quality monitoring provides detailed hydrochemical information on the time scale of hydrologic response. Several studies (Kirchner et al., 2004; Johnes, 2007; Cassidy and Jordan, 2011) have shown that coarse sampling approaches fail to quantify nutrient and sediment loads and to capture the fine structure of water quality dynamics correctly. A robust analysis of high-frequency nutrient and water quality time series can present a complex conceptual, analytical and computational problem. High-frequency nutrient monitoring provides new evidence of processes and patterns that could not be observed previously using standard coarse-resolution sampling schemes. However, to fully utilise the wealth of information contained in high-frequency nutrient data, we need to address the following questions: how to detect complex coupling patterns and processes in high-resolution flow-nutrient data, how these patterns and processes change throughout the period of observation, and how to distinguish noise from evidence of real processes (Harris and Heathwaite, 2005). Here, hourly measurements of total phosphorus (TP), soluble reactive phosphorus (SRP) and turbidity were carried out using bank-side analysers to study the biogeochemical response of a 54 km2 catchment of the River Leith, a tributary of the River Eden (Cumbria, UK). A remote automated mobile lab facilitates real-time high-frequency nutrient and water quality monitoring, with no time delay between collection and analysis of the reactive elements. The objectives of this study were two-fold: first, to investigate the intrinsic complexity of the temporal relationships between phosphorus fractions (SRP, TP), turbidity and continuous hydrometric time series, and second, to investigate the possibility of infilling missing high-frequency phosphorus data using continuous hydrometric time series. Complex non-linear relationships between flow, TP, SRP and turbidity were observed.

  18. Transitioning the Defense Automated Neurobehavioral Assessment (DANA) to Operational Use

    DTIC Science & Technology

    2015-12-01

    The primary purpose of the PTSD study conducted by our partners, the Pacific Health Research and Education Institute (PHREI), was to provide psychological data. [Remainder of the record is garbled; legible fragments indicate a database holding both cognitive and psychological test information, by summary and by trial (database diagram in Appendix M), and a related publication: "Neurocognitive Assessment Tool," Applied Psychological Measurement (2015): 0146621615577361; published: yes; acknowledgement of federal support: yes.]

  19. Random Sample Consensus: A Paradigm for Model Fitting with Applications to Image Analysis and Automated Cartography

    DTIC Science & Technology

    1980-03-01

    interpreting/smoothing data containing a significant percentage of gross errors, and thus is ideally suited for applications in automated image ... analysis where interpretation is based on the data provided by error-prone feature detectors. A major portion of the paper describes the application of

  20. The Effects of Finite Sampling Corrections on State Assessment Sample Requirements. NAEP Validity Studies (NVS).

    ERIC Educational Resources Information Center

    Chromy, James R.

    States participating in the National Assessment of Educational Progress State Assessment program (state NAEP) are required to sample at least 2,500 students from at least 100 schools per subject assessed. In this ideal situation, 25 students are assessed for a subject in each school selected for that subject. Two problems have arisen: some states…

  1. Automated Gel Size Selection to Improve the Quality of Next-generation Sequencing Libraries Prepared from Environmental Water Samples.

    PubMed

    Uyaguari-Diaz, Miguel I; Slobodan, Jared R; Nesbitt, Matthew J; Croxen, Matthew A; Isaac-Renton, Judith; Prystajecky, Natalie A; Tang, Patrick

    2015-04-17

    Next-generation sequencing of environmental samples can be challenging because of the variable DNA quantity and quality in these samples. High quality DNA libraries are needed for optimal results from next-generation sequencing. Environmental samples such as water may have low quality and quantities of DNA as well as contaminants that co-precipitate with DNA. The mechanical and enzymatic processes involved in extraction and library preparation may further damage the DNA. Gel size selection enables purification and recovery of DNA fragments of a defined size for sequencing applications. Nevertheless, this task is one of the most time-consuming steps in the DNA library preparation workflow. The protocol described here enables complete automation of agarose gel loading, electrophoretic analysis, and recovery of targeted DNA fragments. In this study, we describe a high-throughput approach to prepare high quality DNA libraries from freshwater samples that can be applied also to other environmental samples. We used an indirect approach to concentrate bacterial cells from environmental freshwater samples; DNA was extracted using a commercially available DNA extraction kit, and DNA libraries were prepared using a commercial transposon-based protocol. DNA fragments of 500 to 800 bp were gel size selected using Ranger Technology, an automated electrophoresis workstation. Sequencing of the size-selected DNA libraries demonstrated significant improvements to read length and quality of the sequencing reads.

  2. Asbestos Workshop: Sampling, Analysis, and Risk Assessment

    DTIC Science & Technology

    2012-03-01

    [Presentation slides; only fragments are legible.] Topics include fibrosis of the lining of the cavity holding the lungs, illustrated with a chest X-ray showing areas of scarring related to asbestosis (EMDQ, March 2012). For counting fibers in a sample (e.g., soil): if the expected number of asbestos structures in a sample is λ, the probability that there are exactly x asbestos fibers is given by the Poisson distribution. Risk is estimated as Risk = Exposure × Toxicity = [Air] × ET × EF × IUR, with units f/cm3 × hour/hour × day/day × (f/cm3)-1. A final fragment ("For asbestos, ED is") is cut off.
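
    Two quantitative pieces survive in the slide fragments: a Poisson model for fiber counts and the risk equation Risk = [Air] × ET × EF × IUR. A hedged numeric sketch follows; all input values (air concentration, exposure time and frequency fractions, and the unit risk) are invented for illustration, not taken from the workshop.

    ```python
    # Sketch of the two recoverable formulas, with invented inputs:
    #   P(X = x) = exp(-lam) * lam**x / x!   (Poisson count of fibers)
    #   Risk = [Air] * ET * EF * IUR          (inhalation risk estimate)
    from math import exp, factorial

    def poisson_pmf(lam, x):
        """Probability of observing exactly x fibers when the mean is lam."""
        return exp(-lam) * lam ** x / factorial(x)

    def asbestos_risk(air_conc, et, ef, iur):
        """air_conc in f/cm^3; et, ef as unitless time fractions
        (hour/hour, day/day); iur in (f/cm^3)^-1."""
        return air_conc * et * ef * iur

    # Chance a sample whose expected count is 2 shows no fibers at all:
    p_zero = poisson_pmf(2.0, 0)
    # Illustrative risk for a hypothetical exposure scenario:
    risk = asbestos_risk(0.01, 8 / 24, 250 / 365, 0.23)
    ```

    The zero-count probability illustrates why a non-detect in a single sample does not rule out contamination: with a true mean of 2 fibers per sample, roughly one sample in seven still shows none.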

  3. Assessing genetic diversity in a sugarcane germplasm collection using an automated AFLP analysis.

    PubMed

    Besse, P; Taylor, G; Carroll, B; Berding, N; Burner, D; McIntyre, C L

    1998-10-01

    An assessment of genetic diversity within and between Saccharum, Old World Erianthus sect. Ripidium, and North American E. giganteus (S. giganteum) was conducted using Amplified Fragment Length Polymorphism (AFLP™) markers. An automated gel scoring system (GelCompar™) was successfully used to analyse the complex AFLP patterns obtained in sugarcane and its relatives. Similarity coefficient calculations and clustering revealed a genetic structure for Saccharum and Erianthus sect. Ripidium that was identical to the one previously obtained using other molecular marker types, showing the appropriateness of AFLP markers and the associated automated analysis in assessing genetic diversity in sugarcane. A genetic structure that correlated with cytotype (2n=30, 60, 90) was revealed within the North American species E. giganteus (S. giganteum). Complex relationships among Saccharum, Erianthus sect. Ripidium, and North American E. giganteus were revealed and are discussed in the light of a similar study involving RAPD markers.
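
    The "similarity coefficient calculations" step of such AFLP analyses operates on presence/absence band vectors. The sketch below uses the Dice coefficient, a common choice for AFLP data; the band patterns are invented, and the abstract does not state which coefficient GelCompar was configured to use.

    ```python
    # Similarity step of an AFLP analysis in pure Python: each accession is a
    # 0/1 presence/absence vector over scored bands, compared with the Dice
    # coefficient (assumption; band patterns below are invented).

    def dice(a, b):
        """Dice similarity between two equal-length 0/1 band vectors."""
        shared = sum(x and y for x, y in zip(a, b))
        return 2 * shared / (sum(a) + sum(b))

    saccharum_1 = [1, 1, 0, 1, 1, 0, 1, 0]
    saccharum_2 = [1, 1, 0, 1, 0, 0, 1, 0]
    erianthus_1 = [0, 1, 1, 0, 0, 1, 0, 1]

    within  = dice(saccharum_1, saccharum_2)  # congeners share more bands
    between = dice(saccharum_1, erianthus_1)
    ```

    Clustering a full matrix of such pairwise coefficients (e.g. with UPGMA) is what recovers the genetic structure described in the abstract.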

  4. Using after-action review based on automated performance assessment to enhance training effectiveness.

    SciTech Connect

    Stevens-Adams, Susan Marie; Gieseler, Charles J.; Basilico, Justin Derrick; Abbott, Robert G.; Forsythe, James Chris

    2010-09-01

    Training simulators have become increasingly popular tools for instructing humans on performance in complex environments. However, the question of how to provide individualized and scenario-specific assessment and feedback to students remains largely an open question. In this work, we follow-up on previous evaluations of the Automated Expert Modeling and Automated Student Evaluation (AEMASE) system, which automatically assesses student performance based on observed examples of good and bad performance in a given domain. The current study provides a rigorous empirical evaluation of the enhanced training effectiveness achievable with this technology. In particular, we found that students given feedback via the AEMASE-based debrief tool performed significantly better than students given only instructor feedback on two out of three domain-specific performance metrics.

  5. Automated Peripheral Neuropathy Assessment Using Optical Imaging and Foot Anthropometry.

    PubMed

    Siddiqui, Hafeez-U R; Spruce, Michelle; Alty, Stephen R; Dudley, Sandra

    2015-08-01

    A large proportion of individuals who live with type-2 diabetes suffer from plantar sensory neuropathy. Regular testing and assessment for the condition is required to avoid ulceration or other damage to patient's feet. Currently accepted practice involves a trained clinician testing a patient's feet manually with a hand-held nylon monofilament probe. The procedure is time consuming, labor intensive, requires special training, is prone to error, and repeatability is difficult. With the vast increase in type-2 diabetes, the number of plantar sensory neuropathy sufferers has already grown to such an extent as to make a traditional manual test problematic. This paper presents the first investigation of a novel approach to automatically identify the pressure points on a given patient's foot for the examination of sensory neuropathy via optical image processing incorporating plantar anthropometry. The method automatically selects suitable test points on the plantar surface that correspond to those repeatedly chosen by a trained podiatrist. The proposed system automatically identifies the specific pressure points at different locations, namely the toe (hallux), metatarsal heads and heel (Calcaneum) areas. The approach is generic and has shown 100% reliability on the available database used. The database consists of Chinese, Asian, African, and Caucasian foot images.

  6. Automation impact study of Army training management 2: Extension of sampling and collection of installation resource data

    SciTech Connect

    Sanquist, T.F.; McCallum, M.C.; Hunt, P.S.; Slavich, A.L.; Underwood, J.A.; Toquam, J.L.; Seaver, D.A.

    1989-05-01

    This automation impact study of Army training management (TM) was performed for the Army Development and Employment Agency (ADEA) and the Combined Arms Training Activity (CATA) by the Battelle Human Affairs Research Centers and the Pacific Northwest Laboratory. The primary objective of the study was to provide the Army with information concerning the potential costs and savings associated with automating the TM process. This study expands the sample of units surveyed in Phase I of the automation impact effort (Sanquist et al., 1988), and presents data concerning installation resource management in relation to TM. The structured interview employed in Phase I was adapted to a self-administered survey. The data collected were compatible with that of Phase I, and both were combined for analysis. Three US sites, one reserve division, one National Guard division, and one unit in the active component outside the continental US (OCONUS) (referred to in this report as forward deployed) were surveyed. The total sample size was 459, of which 337 respondents contributed the most detailed data. 20 figs., 62 tabs.

  7. A device for automated direct sampling and quantitation from solid-phase sorbent extraction cards by electrospray tandem mass spectrometry.

    PubMed

    Wachs, Timothy; Henion, Jack

    2003-04-01

    A new solid-phase extraction (SPE) device in the 96-well format (SPE Card) has been employed for automated off-line sample preparation of low-volume urine samples. On-line automated analyte elution via SPE and direct quantitation by micro ion spray mass spectrometry is reported. This sample preparation device has the format of a microtiter plate and is molded in a plastic frame which houses 96 separate sandwiched 3M Empore sorbents (0.5-mm-thickness, 8-microm particles) covered on both sides by a microfiber support material. Ninety-six discrete SPE zones, each 7 mm in diameter, are imbedded into the sheet in the conventional 9-mm pitch (spacing) of a 96-well microtiter plate. In this study one-quarter of an SPE Card (24 individual zones) was used merely as a convenience. After automated off-line interference elution of applied human urine from 24 samples, a section of SPE Card is mounted vertically on a computer-controlled X, Y, Z positioner in front of a micro ion spray direct sampling tube equipped with a beveled tip. The beveled tip of this needle robotically penetrates each SPE elution zone (sorbent disk) or stationary phase in a serial fashion. The eluted analytes are sequentially transferred directly to a microelectrosprayer to obtain tandem mass spectrometric (MS/MS) analysis. This strategy precludes any HPLC separation and the associated method development. The quantitative determination of Ritalin (methylphenidate) from fortified human urine samples is demonstrated. A trideuterated internal standard of methylphenidate was used to obtain ion current response ratios between the parent drug and the internal standard. 
Human control urine samples fortified from 6.6 to 3300 ng/mL (normal therapeutic levels have been determined in other studies to be between 50 and 100 ng/mL urine) were analyzed and a linear calibration curve was obtained with a correlation coefficient of 0.9999, where the precision of the quality control (QC) samples ranged from 9.6% at the 24
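The internal-standard quantitation described in this record can be illustrated with a short sketch. The code below (hypothetical values, not the study's data) fits a linear calibration of analyte/internal-standard response ratio against concentration and back-calculates an unknown:

```python
# Illustrative sketch (not the authors' code): internal-standard calibration.
# The analyte/IS ion-current ratio is regressed against known concentrations,
# and unknowns are quantified from the fitted line. All values are hypothetical.

def fit_line(x, y):
    """Ordinary least-squares fit y = a*x + b; returns (a, b, r)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    syy = sum((yi - my) ** 2 for yi in y)
    a = sxy / sxx
    b = my - a * mx
    r = sxy / (sxx * syy) ** 0.5
    return a, b, r

# Hypothetical calibrators (ng/mL) and measured analyte/IS response ratios
conc  = [6.6, 33.0, 165.0, 825.0, 3300.0]
ratio = [0.010, 0.050, 0.251, 1.249, 5.001]

slope, intercept, r = fit_line(conc, ratio)

def quantify(sample_ratio):
    """Back-calculate concentration of an unknown from its response ratio."""
    return (sample_ratio - intercept) / slope

print(round(r, 4))            # correlation coefficient of the fit
print(round(quantify(0.30)))  # ng/mL for a hypothetical QC sample
```

The same ratio-based scheme generalizes to any stable-isotope-labeled internal standard, since the label corrects for matrix and spray-efficiency effects.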

  8. Lab on valve-multisyringe flow injection system (LOV-MSFIA) for fully automated uranium determination in environmental samples.

    PubMed

    Avivar, Jessica; Ferrer, Laura; Casas, Montserrat; Cerdà, Víctor

    2011-06-15

    The hyphenation of lab-on-valve (LOV) and multisyringe flow analysis (MSFIA), coupled to a long path length liquid waveguide capillary cell (LWCC), allows the spectrophotometric determination of uranium in different types of environmental sample matrices, without any manual pre-treatment, achieving high selectivity and sensitivity levels. On-line separation and preconcentration of uranium is carried out by means of UTEVA resin. The potential of the LOV-MSFIA makes possible the full automation of the system by the in-line regeneration of the column. After elution, uranium(VI) is spectrophotometrically detected after reaction with arsenazo-III. The determination of uranium levels present in environmental samples is required in order to establish an environmental control. Thus, we propose a rapid, cheap and fully automated method to determine uranium(VI) in environmental samples. The limit of detection reached is 1.9 ng of uranium and, depending on the preconcentrated volume, corresponds to ppt levels (10.3 ng L(-1)). Different water sample matrices (seawater, well water, freshwater, tap water and mineral water) and a phosphogypsum sample (with natural uranium content) were satisfactorily analyzed.

  9. Automated Multiple-Sample Tray Manipulation Designed and Fabricated for Atomic Oxygen Facility

    NASA Technical Reports Server (NTRS)

    Sechkar, Edward A.; Stueber, Thomas J.; Dever, Joyce A.; Banks, Bruce A.; Rutledge, Sharon K.

    2000-01-01

    Extensive improvements to increase testing capacity and flexibility and to automate the in situ Reflectance Measurement System (RMS) are in progress at the Electro-Physics Branch's Atomic Oxygen (AO) beam facility of the NASA Glenn Research Center at Lewis Field. These improvements will triple the system's capacity while placing a significant portion of the testing cycle under computer control for added reliability, repeatability, and ease of use.

  10. A Psycholinguistic Model for Simultaneous Translation, and Proficiency Assessment by Automated Acoustic Analysis of Discourse.

    NASA Astrophysics Data System (ADS)

    Yaghi, Hussein M.

    Two separate but related issues are addressed: how simultaneous translation (ST) works on a cognitive level and how such translation can be objectively assessed. Both of these issues are discussed in the light of qualitative and quantitative analyses of a large corpus of recordings of ST and shadowing. The proposed ST model utilises knowledge derived from a discourse analysis of the data, many accepted facts in the psychology tradition, and evidence from controlled experiments that are carried out here. This model has three advantages: (i) it is based on analyses of extended spontaneous speech rather than word-, syllable-, or clause-bound stimuli; (ii) it draws equally on linguistic and psychological knowledge; and (iii) it adopts a non-traditional view of language called 'the linguistic construction of reality'. The discourse-based knowledge is also used to develop three computerised systems for the assessment of simultaneous translation: one is a semi-automated system that treats the content of the translation; and two are fully automated, one of which is based on the time structure of the acoustic signals whilst the other is based on their cross-correlation. For each system, several parameters of performance are identified, and they are correlated with assessments rendered by the traditional, subjective, qualitative method. Using signal processing techniques, the acoustic analysis of discourse leads to the conclusion that quality in simultaneous translation can be assessed quantitatively with varying degrees of automation. It identifies as measures of performance (i) three content-based standards; (ii) four time management parameters that reflect the influence of the source on the target language time structure; and (iii) two types of acoustical signal coherence. Proficiency in ST is shown to be directly related to coherence and speech rate but inversely related to omission and delay. High proficiency is associated with a high degree of simultaneity and
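The cross-correlation measure mentioned in this record lends itself to a brief illustration. The sketch below (synthetic envelopes, not the thesis corpus) estimates the translator's lag as the shift that maximizes the cross-correlation between source and target speech energy envelopes:

```python
# Illustrative sketch (an assumption, not the thesis software): estimating the
# translator's delay by cross-correlating the energy envelopes of the source
# and target speech signals. The envelopes here are synthetic.

def cross_correlation_lag(source, target, max_lag):
    """Return the non-negative lag (in frames) at which the target envelope
    best matches the source envelope, i.e. the peak of the cross-correlation."""
    best_lag, best_score = 0, float("-inf")
    for lag in range(0, max_lag + 1):
        # correlate source with the target shifted back by `lag` frames
        score = sum(s * t for s, t in zip(source, target[lag:]))
        if score > best_score:
            best_lag, best_score = lag, score
    return best_lag

# Synthetic envelopes: the target repeats the source pattern 3 frames later
source = [0, 1, 4, 9, 4, 1, 0, 0, 0, 0, 0, 0]
target = [0, 0, 0, 0, 1, 4, 9, 4, 1, 0, 0, 0]

print(cross_correlation_lag(source, target, max_lag=6))  # → 3
```

A low, stable lag would indicate high simultaneity; a drifting or large lag would indicate delay, matching the proficiency measures described above.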

  11. A Fully Automated Drosophila Olfactory Classical Conditioning and Testing System for Behavioral Learning and Memory Assessment

    PubMed Central

    Jiang, Hui; Hanna, Eriny; Gatto, Cheryl L.; Page, Terry L.; Bhuva, Bharat; Broadie, Kendal

    2016-01-01

    Background: Aversive olfactory classical conditioning has been the standard method to assess Drosophila learning and memory behavior for decades, yet training and testing are conducted manually under exceedingly labor-intensive conditions. To overcome this severe limitation, a fully automated, inexpensive system has been developed, which allows accurate and efficient Pavlovian associative learning/memory analyses for high-throughput pharmacological and genetic studies. New Method: The automated system employs a linear actuator coupled to an odorant T-maze with airflow-mediated transfer of animals between training and testing stages. Odorant, airflow and electrical shock delivery are automatically administered and monitored during training trials. Control software allows operator-input variables to define parameters of Drosophila learning, short-term memory and long-term memory assays. Results: The approach allows accurate learning/memory determinations with operational fail-safes. Automated learning indices (immediately post-training) and memory indices (after 24 hours) are comparable to traditional manual experiments, while minimizing experimenter involvement. Comparison with Existing Methods: The automated system provides vast improvements over labor-intensive manual approaches with no experimenter involvement required during either training or testing phases. It provides quality control tracking of airflow rates, odorant delivery and electrical shock treatments, and an expanded platform for high-throughput studies of combinational drug tests and genetic screens. The design uses inexpensive hardware and software for a total cost of ~$500US, making it affordable to a wide range of investigators. Conclusions: This study demonstrates the design, construction and testing of a fully automated Drosophila olfactory classical association apparatus to provide low-labor, high-fidelity, quality-monitored, high-throughput and inexpensive learning and memory behavioral assays.
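The learning index mentioned in the record can be sketched simply. This is the conventional Drosophila performance index, shown here with hypothetical T-maze counts rather than the paper's data:

```python
# Minimal sketch (not the published control software): the performance index
# conventionally used in Drosophila olfactory conditioning. After training,
# flies choose between the shock-paired odor (CS+) and the control odor (CS-).

def performance_index(n_cs_minus, n_cs_plus):
    """PI = (flies avoiding CS+ minus flies choosing CS+) / total flies.
    PI = 1 means perfect learning; PI = 0 means no odor preference."""
    total = n_cs_minus + n_cs_plus
    return (n_cs_minus - n_cs_plus) / total

# Hypothetical T-maze counts from one trial
print(performance_index(n_cs_minus=80, n_cs_plus=20))  # → 0.6
```

In practice the index is averaged over reciprocal trials with the odor roles swapped, to cancel any innate odor bias.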

  12. Evaluation of an Automated Instrument for Inoculating and Spreading Samples onto Agar Plates.

    PubMed

    Glasson, J H; Guthrie, L H; Nielsen, D J; Bethell, F A

    2008-04-01

    The findings from a preliminary assessment of a new instrument designed for the inoculation and spreading of specimens for microbiological analysis onto agar plates are described. The study found that the instrument was able to select full plates or biplates from a number of input cassettes, each containing a different agar type. Samples were then inoculated by the instrument onto the agar surfaces and spread by a novel plastic applicator. Following this, the instrument labeled the plates and sorted them into a number of specified output stations. It was found that the instrument was able to inoculate and spread samples over a greater proportion of the agar plate surface than the manual loop-to-plate method. As a consequence, up to 44% more usable colonies were produced per plate from clinical specimens and standard cultures. Viable counts showed that the instrument was able to detect as few as 10(2) CFU/ml in fluids and also facilitated the enumeration of organisms, particularly in specimens such as urine.
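The viable-count arithmetic behind detection figures like 10(2) CFU/ml can be sketched briefly; the plate values below are hypothetical, not the study's measurements:

```python
# Illustrative sketch (assumed arithmetic, not instrument firmware): converting
# colony counts on a spread plate back to viable organisms per mL of fluid.

def cfu_per_ml(colonies, volume_plated_ml, dilution_factor):
    """CFU/mL = colonies / volume plated, corrected for serial dilution."""
    return colonies * dilution_factor / volume_plated_ml

# Hypothetical plate: 25 colonies grown from 0.1 mL of a 1:100 dilution
print(round(cfu_per_ml(colonies=25, volume_plated_ml=0.1,
                       dilution_factor=100)))  # → 25000 CFU/mL
```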

  13. Automated Image Sampling and Classification Can Be Used to Explore Perceived Naturalness of Urban Spaces.

    PubMed

    Hyam, Roger

    2017-01-01

    The psychological restorative effects of exposure to nature are well established and extend to the mere viewing of images of nature. A previous study has shown that Perceived Naturalness (PN) of images correlates with their restorative value. This study tests whether it is possible to detect the degree of PN of images using an image classifier. It takes images that have been scored by humans for PN (including a subset that have been assessed for restorative value) and passes them through the Google Vision API image classification service. The resulting labels are assigned to broad semantic classes to create a Calculated Semantic Naturalness (CSN) metric for each image. It was found that CSN correlates with PN. CSN was then calculated for a geospatial sampling of Google Street View images across the city of Edinburgh. CSN was found to correlate with PN in this sample as well, indicating that the technique may be useful in large-scale studies. Because CSN correlates with PN, which correlates with restorativeness, it is suggested that CSN or a similar measure may be useful in automatically detecting restorative images and locations. In an exploratory aside, CSN was not found to correlate with an indicator of socioeconomic deprivation.
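The CSN computation can be sketched as follows. The label-to-class mapping and the scores below are hypothetical stand-ins, not the study's actual semantic classes or real Google Vision output:

```python
# Exploratory sketch (the label sets are hypothetical, not the study's mapping):
# computing a Calculated Semantic Naturalness (CSN) score by assigning image-
# classifier labels to broad "natural" vs "man-made" semantic classes.

NATURAL = {"tree", "grass", "sky", "water", "flower", "plant", "mountain"}
MANMADE = {"building", "road", "car", "window", "signage", "asphalt"}

def csn(labels_with_scores):
    """Fraction of classifier confidence mass assigned to natural classes."""
    nat = sum(s for lbl, s in labels_with_scores if lbl in NATURAL)
    man = sum(s for lbl, s in labels_with_scores if lbl in MANMADE)
    if nat + man == 0:
        return 0.0
    return nat / (nat + man)

# Hypothetical classifier output for one street-view image
labels = [("tree", 0.9), ("grass", 0.8), ("road", 0.7), ("sky", 0.6)]
print(round(csn(labels), 2))  # → 0.77
```

Per-image CSN scores could then be correlated against human PN ratings across the geospatial sample.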

  14. Automated Image Sampling and Classification Can Be Used to Explore Perceived Naturalness of Urban Spaces

    PubMed Central

    2017-01-01

    The psychological restorative effects of exposure to nature are well established and extend to the mere viewing of images of nature. A previous study has shown that Perceived Naturalness (PN) of images correlates with their restorative value. This study tests whether it is possible to detect the degree of PN of images using an image classifier. It takes images that have been scored by humans for PN (including a subset that have been assessed for restorative value) and passes them through the Google Vision API image classification service. The resulting labels are assigned to broad semantic classes to create a Calculated Semantic Naturalness (CSN) metric for each image. It was found that CSN correlates with PN. CSN was then calculated for a geospatial sampling of Google Street View images across the city of Edinburgh. CSN was found to correlate with PN in this sample as well, indicating that the technique may be useful in large-scale studies. Because CSN correlates with PN, which correlates with restorativeness, it is suggested that CSN or a similar measure may be useful in automatically detecting restorative images and locations. In an exploratory aside, CSN was not found to correlate with an indicator of socioeconomic deprivation. PMID:28052110

  15. A compact tritium enrichment unit for large sample volumes with automated re-filling and higher enrichment factor.

    PubMed

    Kumar, B; Han, L-F; Wassenaar, L I; Klaus, P M; Kainz, G G; Hillegonds, D; Brummer, D; Ahmad, M; Belachew, D L; Araguás, L; Aggarwal, P

    2016-12-01

    Tritium ((3)H) in natural waters is a powerful tracer of hydrological processes, but its low concentrations require electrolytic enrichment before precise measurements can be made with a liquid scintillation counter. Here, we describe a newly developed, compact tritium enrichment unit (TEU) which can be used to enrich up to 2 L of a water sample. This allows a high enrichment factor (>100) for measuring low (3)H contents of <0.05 TU. The TEU uses a small cell (250 mL) with automated re-filling and a CO2 bubbling technique to neutralize the high alkalinity of enriched samples. The enriched residual sample is retrieved from the cell under vacuum by cryogenic distillation at -20°C, and the tritium enrichment factor for each sample is accurately determined by measuring pre- and post-enrichment (2)H concentrations with laser spectrometry.
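The deuterium-based determination of the enrichment factor can be sketched roughly. This is a simplified illustration: the isotope ratios are hypothetical, and the tritium/deuterium exponent `beta` is an assumed calibration parameter, not a value from the paper:

```python
# Simplified sketch (hedged; the actual procedure involves calibrated
# separation parameters): deuterium acts as an internal tracer, so the
# pre-/post-enrichment 2H ratios constrain the tritium enrichment factor.

def deuterium_enrichment(r_pre, r_post):
    """Deuterium enrichment factor from 2H/1H isotope ratios measured
    before and after electrolysis (here, by laser spectrometry)."""
    return r_post / r_pre

def tritium_enrichment(e_d, beta=1.0):
    """Tritium enrichment approximated from the deuterium enrichment.
    beta is an empirically calibrated tritium/deuterium exponent
    (assumed value here; the unit's calibration would supply the real one)."""
    return e_d ** beta

# Hypothetical ratios: 2H/1H rises ~8x during electrolytic volume reduction
e_d = deuterium_enrichment(r_pre=1.55e-4, r_post=1.24e-3)
print(round(tritium_enrichment(e_d), 1))  # → 8.0
```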

  16. Automated Sampling and Imaging of Analytes Separated on Thin-Layer Chromatography Plates Using Desorption Electrospray Ionization Mass Spectrometry

    SciTech Connect

    Van Berkel, Gary J; Kertesz, Vilmos

    2006-01-01

    Modest modifications to the atmospheric sampling capillary of a commercial electrospray mass spectrometer and upgrades to an in-house developed surface positioning control software package (HandsFree TLC/MS) were used to enable the automated sampling and imaging of analytes on and/or within large-area surface substrates using desorption electrospray ionization mass spectrometry. Sampling and imaging of rhodamine dyes separated on TLC plates were used to illustrate some of the practical applications of this system. Examples are shown for user-defined spot sampling from separated bands on a TLC plate (one or multiple spots), scanning of a complete development lane (one or multiple lanes), or imaging of analyte bands in a development lane (i.e., multiple lane scans with close spacing). The post-processing and data display aspects of the software system are also discussed.

  17. Automated assessment of bilateral breast volume asymmetry as a breast cancer biomarker during mammographic screening

    SciTech Connect

    Williams, Alex C; Hitt, Austin N; Voisin, Sophie; Tourassi, Georgia

    2013-01-01

    The biological concept of bilateral symmetry as a marker of developmental stability and good health is well established. Although most individuals deviate slightly from perfect symmetry, humans are essentially considered bilaterally symmetrical. Consequently, increased fluctuating asymmetry of paired structures could be an indicator of disease. There are several published studies linking bilateral breast size asymmetry with increased breast cancer risk. These studies were based on radiologists' manual measurements of breast size from mammographic images. We aim to develop a computerized technique to assess fluctuating breast volume asymmetry in screening mammograms and investigate whether it correlates with the presence of breast cancer. Using a large database of screening mammograms with known ground truth, we applied automated breast region segmentation and automated breast size measurements in CC and MLO views using three well-established methods. All three methods confirmed that patients with breast cancer indeed have statistically significantly higher fluctuating asymmetry of their breast volumes. However, a statistically significant difference between patients with cancer and benign lesions was observed only for the MLO views. The study suggests that automated assessment of global bilateral asymmetry could serve as a breast cancer risk biomarker for women undergoing mammographic screening. Such a biomarker could be used to alert radiologists or computer-assisted detection (CAD) systems to exercise increased vigilance if higher than normal cancer risk is suspected.
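A fluctuating-asymmetry index of the kind this study measures can be sketched simply; the volumes and the normalization below are illustrative assumptions, not the paper's exact formulation:

```python
# Illustrative sketch (not the study's implementation): a simple fluctuating-
# asymmetry index over automatically measured left/right breast volumes.

def fluctuating_asymmetry(left_vol, right_vol):
    """Absolute volume difference normalized by the mean volume, so the
    index is comparable across breast sizes."""
    mean = (left_vol + right_vol) / 2
    return abs(left_vol - right_vol) / mean

# Hypothetical volumes (cm^3) from automated segmentation of the two views
print(round(fluctuating_asymmetry(620.0, 580.0), 3))  # → 0.067
```

An index of this kind, computed per screening exam, is the sort of scalar that could be compared between cancer and control groups or fed to a CAD system as a risk feature.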

  18. A method to establish seismic noise baselines for automated station assessment

    USGS Publications Warehouse

    McNamara, D.E.; Hutt, C.R.; Gee, L.S.; Benz, H.M.; Buland, R.P.

    2009-01-01

    We present a method for quantifying station noise baselines and characterizing the spectral shape of out-of-nominal noise sources. Our intent is to automate this method in order to ensure that only the highest-quality data are used in rapid earthquake products at NEIC. In addition, the station noise baselines provide a valuable tool to support the quality control of GSN and ANSS backbone data and metadata. The procedures addressed here are currently in development at the NEIC, and work is underway to understand how quickly changes from nominal can be observed and used within the NEIC processing framework. The spectral methods and software (PQLX) used to compute station baselines, described herein, can be useful to operators of both permanent and portable seismic stations. Applications include: general seismic station and data quality control (QC), evaluation of instrument responses, assessment of near real-time communication system performance, characterization of site cultural noise conditions, and evaluation of sensor vault design, as well as assessment of gross network capabilities (McNamara et al. 2005). Future PQLX development plans include incorporating station baselines for automated QC methods and automating station status report generation and notification based on user-defined QC parameters. The PQLX software is available through the USGS (http://earthquake.usgs.gov/research/software/pqlx.php) and IRIS (http://www.iris.edu/software/pqlx/).

  19. An Assessment of the Technology of Automated Rendezvous and Capture in Space

    NASA Technical Reports Server (NTRS)

    Polites, M. E.

    1998-01-01

    This paper presents the results of a study to assess the technology of automated rendezvous and capture (AR&C) in space. The outline of the paper is as follows. First, the history of manual and automated rendezvous and capture and rendezvous and dock is presented. Next, the need for AR&C in space is established. Then, today's technology and ongoing technology efforts related to AR&C in space are reviewed. In light of these, AR&C systems are proposed that meet NASA's future needs, but can be developed in a reasonable amount of time with a reasonable amount of money. Technology plans for developing these systems are presented; cost and schedule are included.

  20. An Automated System for Skeletal Maturity Assessment by Extreme Learning Machines.

    PubMed

    Mansourvar, Marjan; Shamshirband, Shahaboddin; Raj, Ram Gopal; Gunalan, Roshan; Mazinani, Iman

    2015-01-01

    Assessing skeletal age is a subjective and tedious examination process. Hence, automated assessment methods have been developed to replace manual evaluation in medical applications. In this study, a new fully automated method based on content-based image retrieval and using extreme learning machines (ELM) is designed and adapted to assess skeletal maturity. The main novelty of this approach is that it overcomes the segmentation problem suffered by existing systems. The estimation results of ELM models are compared with those of genetic programming (GP) and artificial neural network (ANN) models. The experimental results signify improvement in assessment accuracy over GP and ANN, while the ELM approach retains generalization capability. Moreover, the results indicate that the ELM model developed can be used confidently in further work on formulating novel models of skeletal age assessment strategies. According to the experimental results, the new method has the capacity to learn many hundreds of times faster than traditional learning methods and has sufficient overall performance in many aspects. It has conclusively been found that applying ELM is particularly promising as an alternative method for evaluating skeletal age.
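The extreme learning machine itself is easy to sketch: the hidden layer is random and fixed, and only the output weights are solved in closed form, which accounts for the training-speed claim above. The toy regression below is an illustration of the technique, not the paper's bone-age system:

```python
# Minimal extreme learning machine sketch (an illustration of the technique,
# not the paper's system): random tanh hidden layer plus a ridge least-squares
# solve for the output weights only.

import math, random

random.seed(0)

def elm_train(xs, ys, hidden=12, ridge=1e-6):
    """Fit an ELM on scalar inputs: random fixed hidden layer, closed-form
    output weights via the normal equations."""
    w = [random.uniform(-1, 1) for _ in range(hidden)]
    b = [random.uniform(-1, 1) for _ in range(hidden)]
    H = [[math.tanh(w[j] * x + b[j]) for j in range(hidden)] for x in xs]
    # Solve (H^T H + ridge*I) beta = H^T y by Gaussian elimination.
    A = [[sum(H[k][i] * H[k][j] for k in range(len(xs)))
          + (ridge if i == j else 0.0) for j in range(hidden)]
         for i in range(hidden)]
    v = [sum(H[k][i] * ys[k] for k in range(len(xs))) for i in range(hidden)]
    for i in range(hidden):                     # forward elimination
        for r in range(i + 1, hidden):
            f = A[r][i] / A[i][i]
            for c in range(i, hidden):
                A[r][c] -= f * A[i][c]
            v[r] -= f * v[i]
    beta = [0.0] * hidden
    for i in reversed(range(hidden)):           # back substitution
        beta[i] = (v[i] - sum(A[i][j] * beta[j]
                              for j in range(i + 1, hidden))) / A[i][i]
    return w, b, beta

def elm_predict(model, x):
    w, b, beta = model
    return sum(beta[j] * math.tanh(w[j] * x + b[j]) for j in range(len(w)))

# Toy regression standing in for a bone-age scoring target: y = sin(x)
xs = [i * 0.1 for i in range(31)]
ys = [math.sin(x) for x in xs]
model = elm_train(xs, ys)
err = abs(elm_predict(model, 1.5) - math.sin(1.5))
print(round(err, 4))  # residual at a training point; small if the fit worked
```

Because only the linear solve depends on the data, training cost is a single least-squares problem, which is why ELMs train orders of magnitude faster than backpropagated networks.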

  1. Towards Automating Clinical Assessments: A Survey of the Timed Up and Go (TUG)

    PubMed Central

    Sprint, Gina; Cook, Diane; Weeks, Douglas

    2016-01-01

    Older adults often suffer from functional impairments that affect their ability to perform everyday tasks. To detect the onset and changes in abilities, healthcare professionals administer standardized assessments. Recently, technology has been utilized to complement these clinical assessments to gain a more objective and detailed view of functionality. In the clinic and at home, technology is able to provide more information about patient performance and reduce subjectivity in outcome measures. The timed up and go (TUG) test is one such assessment recently instrumented with technology in several studies, yielding promising results towards the future of automating clinical assessments. Potential benefits of technological TUG implementations include additional performance parameters, generated reports, and the ability to be self-administered in the home. In this paper, we provide an overview of the TUG test and technologies utilized for TUG instrumentation. We then critically review the technological advancements and follow up with an evaluation of the benefits and limitations of each approach. Finally, we analyze the gaps in the implementations and discuss challenges for future research towards automated, self-administered assessment in the home. PMID:25594979

  2. A Framework to Automate Assessment of Upper-Limb Motor Function Impairment: A Feasibility Study

    PubMed Central

    Otten, Paul; Kim, Jonghyun; Son, Sang Hyuk

    2015-01-01

    Standard upper-limb motor function impairment assessments, such as the Fugl-Meyer Assessment (FMA), are a critical aspect of rehabilitation after neurological disorders. These assessments typically take a long time (about 30 min for the FMA) for a clinician to perform on a patient, which is a severe burden in a clinical environment. In this paper, we propose a framework for automating upper-limb motor assessments that uses low-cost sensors to collect movement data. The sensor data is then processed through a machine learning algorithm to determine a score for a patient’s upper-limb functionality. To demonstrate the feasibility of the proposed approach, we implemented a system based on the proposed framework that can automate most of the FMA. Our experiment shows that the system provides similar FMA scores to clinician scores, and reduces the time spent evaluating each patient by 82%. Moreover, the proposed framework can be used to implement customized tests or tests specified in other existing standard assessment methods. PMID:26287206

  3. Interdisciplinary development of manual and automated product usability assessments for older adults with dementia: lessons learned.

    PubMed

    Boger, Jennifer; Taati, Babak; Mihailidis, Alex

    2016-10-01

    The changes in cognitive abilities that accompany dementia can make it difficult to use everyday products that are required to complete activities of daily living. Products that are inherently more usable for people with dementia could facilitate independent activity completion, thus reducing the need for caregiver assistance. The objectives of this research were to: (1) gain an understanding of how water tap design impacted tap usability and (2) create an automated computerized tool that could assess tap usability. 27 older adults, who ranged from cognitively intact to advanced dementia, completed 1309 trials on five tap designs. Data were manually analyzed to investigate tap usability as well as used to develop an automated usability analysis tool. Researchers collaborated to modify existing techniques and to create novel ones to accomplish both goals. This paper presents lessons learned through the course of this research, which could be applicable to the development of other usability studies, automated vision-based assessments and assistive technologies for cognitively impaired older adults. Collaborative interdisciplinary teamwork, which included older adults with dementia as participants, was key to enabling innovative advances that achieved the project's research goals. Implications for Rehabilitation: Products that are implicitly familiar and usable by older adults could foster independent activity completion, potentially reducing reliance on a caregiver. The computer-based automated tool can significantly reduce the time and effort required to perform product usability analysis, making this type of analysis more feasible. Interdisciplinary collaboration can result in a more holistic understanding of assistive technology research challenges and enable innovative solutions.

  4. Validation of a fully automated robotic setup for preparation of whole blood samples for LC-MS toxicology analysis.

    PubMed

    Andersen, David; Rasmussen, Brian; Linnet, Kristian

    2012-05-01

    A fully automated setup was developed for preparing whole blood samples using a Tecan Evo workstation. By integrating several add-ons to the robotic platform, the flexible setup was able to prepare samples from sample tubes to a 96-well sample plate ready for injection on liquid chromatography-mass spectrometry using several preparation techniques, including protein precipitation, solid-phase extraction and centrifugation, without any manual intervention. Pipetting of a known aliquot of whole blood was achieved by integrating a balance and performing gravimetric measurements. The system was able to handle 1,073 of 1,092 (98.3%) samples of whole blood from forensic material, including postmortem samples, without any need for repeating sample preparation. Only three samples required special treatment such as dilution. The addition of internal and calibration standards was validated by pipetting a solution of Orange G and measuring the weight and absorbance. Internal standard (20 µL) was added in a multi-pipetting sequence with an accuracy of 99.9% and an imprecision (coefficient of variation) of 1.6%. Calibration standards were added with high accuracy at volumes as low as 6.00 µL (±0.21 µL). The general setup of the offline sample preparation and the key validation parameters of a quantitative analysis of Δ(9)-tetrahydrocannabinol are presented.
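The gravimetric validation metrics (accuracy and coefficient of variation) can be sketched with hypothetical dispense weights; the formulas are standard, but the numbers below are not the study's:

```python
# Illustrative sketch (assumed formulas, not the Tecan scripts): gravimetric
# validation of pipetting, reporting accuracy and imprecision (CV) from the
# weights of repeated dispenses of a dye solution.

def accuracy_and_cv(weights_mg, target_mg):
    n = len(weights_mg)
    mean = sum(weights_mg) / n
    var = sum((w - mean) ** 2 for w in weights_mg) / (n - 1)
    cv = 100 * (var ** 0.5) / mean          # coefficient of variation, %
    accuracy = 100 * mean / target_mg       # % of the nominal amount
    return accuracy, cv

# Hypothetical weights (mg) for ten 20-uL dispenses of a ~1 g/mL solution
weights = [19.9, 20.1, 20.0, 19.8, 20.2, 20.0, 19.9, 20.1, 20.0, 20.0]
acc, cv = accuracy_and_cv(weights, target_mg=20.0)
print(round(acc, 1), round(cv, 2))  # → 100.0 0.58
```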

  5. Fully automated Liquid Extraction-Based Surface Sampling and Ionization Using a Chip-Based Robotic Nanoelectrospray Platform

    SciTech Connect

    Kertesz, Vilmos; Van Berkel, Gary J

    2010-01-01

    A fully automated liquid extraction-based surface sampling device utilizing an Advion NanoMate chip-based infusion nanoelectrospray ionization system is reported. Analyses were enabled for discrete spot sampling by using the Advanced User Interface of the current commercial control software. This software interface provided the parameter control necessary for the NanoMate robotic pipettor to both form and withdraw a liquid microjunction for sampling from a surface. The system was tested with three types of analytically important sample surface types, viz., spotted sample arrays on a MALDI plate, dried blood spots on paper, and whole-body thin tissue sections from drug dosed mice. The qualitative and quantitative data were consistent with previous studies employing other liquid extraction-based surface sampling techniques. The successful analyses performed here utilized the hardware and software elements already present in the NanoMate system developed to handle and analyze liquid samples. Implementation of an appropriate sample (surface) holder, a solvent reservoir, faster movement of the robotic arm, finer control over solvent flow rate when dispensing and retrieving the solution at the surface, and the ability to select any location on a surface to sample from would improve the analytical performance and utility of the platform.

  6. Evaluation of a software package for automated quality assessment of contrast detail images—comparison with subjective visual assessment

    NASA Astrophysics Data System (ADS)

    Pascoal, A.; Lawinski, C. P.; Honey, I.; Blake, P.

    2005-12-01

    Contrast detail analysis is commonly used to assess image quality (IQ) associated with diagnostic imaging systems. Applications include routine assessment of equipment performance and optimization studies. Most frequently, the evaluation of contrast detail images involves human observers visually detecting the threshold contrast detail combinations in the image. However, the subjective nature of human perception and the variations in the decision threshold pose limits to the minimum image quality variations detectable with reliability. Objective methods of assessment of image quality, such as automated scoring, have the potential to overcome the above limitations. A software package (CDRAD analyser) developed for automated scoring of images produced with the CDRAD test object was evaluated. Its performance in assessing absolute and relative IQ was compared with that of an average observer. Results show that the software does not mimic the absolute performance of the average observer. The software proved more sensitive and was able to detect smaller low-contrast variations. The observer's performance was superior to the software's in the detection of smaller details. Both scoring methods showed frequent agreement in the detection of image quality variations resulting from changes in kVp and detector KERMA, which indicates the potential to use the CDRAD analyser software for assessment of relative IQ.

  7. High-frequency, long-duration water sampling in acid mine drainage studies: a short review of current methods and recent advances in automated water samplers

    USGS Publications Warehouse

    Chapin, Thomas

    2015-01-01

    Hand-collected grab samples are the most common water sampling method but using grab sampling to monitor temporally variable aquatic processes such as diel metal cycling or episodic events is rarely feasible or cost-effective. Currently available automated samplers are a proven, widely used technology and typically collect up to 24 samples during a deployment. However, these automated samplers are not well suited for long-term sampling in remote areas or in freezing conditions. There is a critical need for low-cost, long-duration, high-frequency water sampling technology to improve our understanding of the geochemical response to temporally variable processes. This review article will examine recent developments in automated water sampler technology and utilize selected field data from acid mine drainage studies to illustrate the utility of high-frequency, long-duration water sampling.

  8. A new automated sample transfer system for instrumental neutron activation analysis.

    PubMed

    Ismail, S S

    2010-01-01

    A fully automated and fast pneumatic transport system for short-time activation analysis was recently developed. It is suitable for small nuclear research reactors or laboratories that use neutron generators and other neutron sources. It is equipped with a programmable logic controller, a software package, and 12 devices to facilitate optimal analytical procedures. Only 550 ms were necessary to transfer the irradiated capsule (diameter: 15 mm; length: 50 mm; weight: 4 g) to the counting chamber over a distance of 20 m, using pressurized air (4 bar) as the transport gas.

  9. A New Automated Sample Transfer System for Instrumental Neutron Activation Analysis

    PubMed Central

    Ismail, S. S.

    2010-01-01

    A fully automated and fast pneumatic transport system for short-time activation analysis was recently developed. It is suitable for small nuclear research reactors or laboratories that use neutron generators and other neutron sources. It is equipped with a programmable logic controller, a software package, and 12 devices to facilitate optimal analytical procedures. Only 550 ms were necessary to transfer the irradiated capsule (diameter: 15 mm; length: 50 mm; weight: 4 g) to the counting chamber over a distance of 20 m, using pressurized air (4 bar) as the transport gas. PMID:20369063

  10. Sensitivity testing of trypanosome detection by PCR from whole blood samples using manual and automated DNA extraction methods.

    PubMed

    Dunlop, J; Thompson, C K; Godfrey, S S; Thompson, R C A

    2014-11-01

    Automated extraction of DNA for testing of laboratory samples is an attractive alternative to labour-intensive manual methods when higher throughput is required. However, it is important to maintain the maximum detection sensitivity possible to reduce the occurrence of type II errors (false negatives; failure to detect the target when it is present), especially in the biomedical field, where PCR is used for diagnosis. We used blood infected with known concentrations of Trypanosoma copemani to test the impact of analysis techniques on trypanosome detection sensitivity by PCR. We compared combinations of a manual and an automated DNA extraction method and two different PCR primer sets to investigate the impact of each on detection levels. Both extraction techniques and specificity of primer sets had a significant impact on detection sensitivity. Samples extracted using the same DNA extraction technique performed substantially differently for each of the separate primer sets. Type I errors (false positives; detection of the target when it is not present), produced by contaminants, were avoided with both extraction methods. This study highlights the importance of testing laboratory techniques with known samples to optimise accuracy of test results.

  11. Sample registration software for process automation in the Neutron Activation Analysis (NAA) Facility in Malaysia Nuclear Agency

    SciTech Connect

    Rahman, Nur Aira Abd Yussup, Nolida; Ibrahim, Maslina Bt. Mohd; Mokhtar, Mukhlis B.; Soh Shaari, Syirrazie Bin Che; Azman, Azraf B.; Salim, Nazaratul Ashifa Bt. Abdullah; Ismail, Nadiah Binti

    2015-04-29

    Neutron Activation Analysis (NAA) has been established at Nuclear Malaysia since the 1980s. Most of the established procedures, including sample registration, were performed manually: samples were recorded in a logbook and assigned ID numbers, and all samples, standards, SRMs, and blanks were then recorded on the irradiation vial and on several forms prior to irradiation. These manual procedures, carried out by the NAA laboratory personnel, were time-consuming and inefficient. Sample registration software was developed as part of the IAEA/CRP project on ‘Development of Process Automation in the Neutron Activation Analysis (NAA) Facility in Malaysia Nuclear Agency (RC17399)’. The objective of the project is to create PC-based data-entry software for the sample preparation stage. This is an effective way to replace the redundant manual data entries that laboratory personnel would otherwise need to complete. The software automatically generates a sample code for each sample in a batch, creates printable registration forms for administrative purposes, and stores selected parameters that are passed to the sample analysis program. The software is developed using National Instruments LabVIEW 8.6.
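The batch sample-code generation step might look like the sketch below; the code format, date stamp, and function name are hypothetical illustrations, not the agency's actual scheme:

```python
from datetime import date

def generate_sample_codes(batch_id, n_per_kind, kinds=("SAMPLE",)):
    """Generate one registration code per item in a batch.
    The YYYYMMDD-batch-kind-serial format is an invented example."""
    stamp = date(2015, 4, 29).strftime("%Y%m%d")  # fixed date for the example
    return [f"{stamp}-{batch_id}-{kind}-{i:03d}"
            for kind in kinds for i in range(1, n_per_kind + 1)]

codes = generate_sample_codes("B07", 3, kinds=("SAMPLE", "SRM", "BLANK"))
print(codes[0])  # 20150429-B07-SAMPLE-001
```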

  12. Automated Assessment of Patients' Self-Narratives for Posttraumatic Stress Disorder Screening Using Natural Language Processing and Text Mining.

    PubMed

    He, Qiwei; Veldkamp, Bernard P; Glas, Cees A W; de Vries, Theo

    2017-03-01

    Patients' narratives about traumatic experiences and symptoms are useful in clinical screening and diagnostic procedures. In this study, we presented an automated assessment system to screen patients for posttraumatic stress disorder via a natural language processing and text-mining approach. Four machine-learning algorithms (decision tree, naive Bayes, support vector machine, and an alternative classification approach called the product score model) were used in combination with n-gram representation models to identify patterns between verbal features in self-narratives and psychiatric diagnoses. With our sample, the product score model with unigrams attained the highest prediction accuracy when compared with practitioners' diagnoses. The addition of multigrams contributed most to balancing the metrics of sensitivity and specificity. This article also demonstrates that text mining is a promising approach for analyzing patients' self-expression behavior, thus helping clinicians identify potential patients from an early stage.
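As an illustration of the general idea (a plain unigram naive Bayes classifier, not the authors' product score model), a narrative can be scored by multiplying smoothed per-class word probabilities, i.e. summing their logs. The toy corpus below is invented:

```python
import math
from collections import Counter

def train(docs):
    """docs: list of (text, class_label). Returns per-class word counts."""
    counts = {label: Counter() for _, label in docs}
    for text, label in docs:
        counts[label].update(text.lower().split())
    return counts

def score(counts, text, label):
    """Laplace-smoothed log-probability of the text under one class."""
    vocab = set().union(*counts.values())
    total = sum(counts[label].values()) + len(vocab)
    return sum(math.log((counts[label][w] + 1) / total)
               for w in text.lower().split())

def classify(counts, text):
    return max(counts, key=lambda label: score(counts, text, label))

docs = [("nightmares flashbacks avoidance", "ptsd"),
        ("sleep well calm recovered", "control")]
model = train(docs)
print(classify(model, "flashbacks and avoidance"))  # ptsd
```

Multigram (n-gram) variants extend the same scheme from single words to short word sequences as features.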

  13. Automated retinal image quality assessment on the UK Biobank dataset for epidemiological studies.

    PubMed

    Welikala, R A; Fraz, M M; Foster, P J; Whincup, P H; Rudnicka, A R; Owen, C G; Strachan, D P; Barman, S A

    2016-04-01

    Morphological changes in the retinal vascular network are associated with future risk of many systemic and vascular diseases. However, uncertainty over the presence and nature of some of these associations exists. Analysis of data from large population based studies will help to resolve these uncertainties. The QUARTZ (QUantitative Analysis of Retinal vessel Topology and siZe) retinal image analysis system allows automated processing of large numbers of retinal images. However, an image quality assessment module is needed to achieve full automation. In this paper, we propose such an algorithm, which uses the segmented vessel map to determine the suitability of retinal images for use in the creation of vessel morphometric data suitable for epidemiological studies. This includes an effective 3-dimensional feature set and support vector machine classification. A random subset of 800 retinal images from UK Biobank (a large prospective study of 500,000 middle aged adults; where 68,151 underwent retinal imaging) was used to examine the performance of the image quality algorithm. The algorithm achieved a sensitivity of 95.33% and a specificity of 91.13% for the detection of inadequate images. The strong performance of this image quality algorithm will make rapid automated analysis of vascular morphometry feasible on the entire UK Biobank dataset (and other large retinal datasets), with minimal operator involvement, and at low cost.
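The reported sensitivity (95.33%) and specificity (91.13%) follow directly from a confusion matrix over labelled images. A minimal sketch with made-up labels, where 1 marks an inadequate image:

```python
def sensitivity_specificity(y_true, y_pred):
    """Binary quality labels: 1 = inadequate image, 0 = adequate."""
    tp = sum(t == 1 and p == 1 for t, p in zip(y_true, y_pred))
    tn = sum(t == 0 and p == 0 for t, p in zip(y_true, y_pred))
    fn = sum(t == 1 and p == 0 for t, p in zip(y_true, y_pred))
    fp = sum(t == 0 and p == 1 for t, p in zip(y_true, y_pred))
    return tp / (tp + fn), tn / (tn + fp)

labels      = [1, 1, 1, 0, 0, 0, 0, 1]   # manual quality grades (toy data)
predictions = [1, 1, 0, 0, 0, 1, 0, 1]   # classifier output (toy data)
sens, spec = sensitivity_specificity(labels, predictions)
print(f"sensitivity={sens:.2f}, specificity={spec:.2f}")  # 0.75 and 0.75
```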

  14. Assessing V and V Processes for Automation with Respect to Vulnerabilities to Loss of Airplane State Awareness

    NASA Technical Reports Server (NTRS)

    Whitlow, Stephen; Wilkinson, Chris; Hamblin, Chris

    2014-01-01

    Automation has contributed substantially to the sustained improvement of aviation safety by minimizing the physical workload of the pilot and increasing operational efficiency. Nevertheless, in complex and highly automated aircraft, automation also has unintended consequences. As systems become more complex and the authority and autonomy (A&A) of the automation increases, human operators become relegated to the role of a system supervisor or administrator, a passive role not conducive to maintaining engagement and airplane state awareness (ASA). The consequence is that flight crews can often come to over-rely on the automation, become less engaged in the human-machine interaction, and lose awareness of the automation mode under which the aircraft is operating. Likewise, the complexity of the system and automation modes may lead to poor understanding of the interaction between a mode of automation and a particular system configuration or phase of flight. These and other examples of mode confusion often lead to mismanaging the aircraft's energy state or the aircraft deviating from the intended flight path. This report examines methods for assessing whether, and how, operational constructs properly assign authority and autonomy in a safe and coordinated manner, with particular emphasis on assuring adequate airplane state awareness by the flight crew and air traffic controllers in off-nominal and/or complex situations.

  15. ALVEOLAR BREATH SAMPLING AND ANALYSIS IN HUMAN EXPOSURE ASSESSMENT STUDIES

    EPA Science Inventory

    Alveolar breath sampling and analysis can be extremely useful in exposure assessment studies involving volatile organic compounds (VOCs). Over recent years scientists from the EPA's National Exposure Research Laboratory have developed and refined an alveolar breath collection ...

  16. Method for Effective Calibration of Temperature Loggers with Automated Data Sampling and Evaluation

    NASA Astrophysics Data System (ADS)

    Ljungblad, S.; Josefson, L. E.; Holmsten, M.

    2011-12-01

    A highly automated calibration method for temperature loggers is presented. By using an automated procedure, a time- and cost-efficient calibration of temperature loggers is made possible. The method is directed at loggers that lack the function/property of direct reading from a display. This type of logger has to be connected to a computer for the setting-up of the measurement and again for collection of the measurement results. During the calibration, the loggers are offline. This method has been developed for temperature loggers from Gemini Data loggers, but the software and method could be modified to suit other types of loggers as well. Calibration is performed by comparison to a reference thermometer in liquid baths; and for loggers which have external sensors, only the sensor is normally placed in the bath. Loggers with internal sensors are protected from the liquid by placing them in an exterior plastic or metallic cover, and thereafter the entire loggers are placed in the bath. A digital thermometer measures the reference temperature of the bath and transmits it to a computer by way of Bluetooth. The developed calibration software, SPTempLogger, controls the logger software, and thus the communication protocol of the logger software does not need to be known. The previous method, with manual handling of the start and termination of every measuring sequence, evaluation of the resulting data and its corresponding uncertainty components, can be replaced by this automated method. Both the logger and reference measurement data are automatically downloaded once the logger has been connected to a computer after the calibration, and the calibration software started. The data are then evaluated automatically, and by statistical analysis of the confidence coefficient and standard deviation, the temperature plateaus that the calibration includes are identified. 
If a number of control parameters comply with the requirements, then the correction, resolution, and short
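The plateau-identification step can be sketched as a rolling stability test: a run of readings is treated as a plateau once its standard deviation stays under a tolerance. The window size, tolerance, and data below are illustrative assumptions, not SPTempLogger's actual rules:

```python
import statistics

def find_plateaus(temps, window=5, max_std=0.02):
    """Return (start_index, end_index, mean_temperature) for each stable run."""
    plateaus, i = [], 0
    while i + window <= len(temps):
        if statistics.pstdev(temps[i:i + window]) <= max_std:
            j = i + window
            # extend the plateau while the run stays stable
            while j < len(temps) and statistics.pstdev(temps[i:j + 1]) <= max_std:
                j += 1
            plateaus.append((i, j - 1, statistics.mean(temps[i:j])))
            i = j
        else:
            i += 1
    return plateaus

series = [20.0, 20.01, 19.99, 20.0, 20.01,   # plateau near 20 degC
          22.5, 24.8,                        # ramp between bath set points
          25.0, 25.01, 24.99, 25.0, 25.02]   # plateau near 25 degC
print(find_plateaus(series))
```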

  17. An automated sample preparation system with mini-reactor to isolate and process submegabase fragments of bacterial DNA.

    PubMed

    Mollova, Emilia T; Patil, Vishal A; Protozanova, Ekaterina; Zhang, Meng; Gilmanshin, Rudolf

    2009-08-15

    Existing methods for extraction and processing of large fragments of bacterial genomic DNA are manual, time-consuming, and prone to variability in DNA quality and recovery. To solve these problems, we have designed and built an automated fluidic system with a mini-reactor. By balancing flows through and tangential to the ultrafiltration membrane in the reactor, cells and then released DNA can be immobilized and subjected to a series of consecutive processing steps. The steps may include enzymatic reactions, tag hybridization, buffer exchange, and selective removal of cell debris and by-products of the reactions. The system can produce long DNA fragments (up to 0.5 Mb) of bacterial genome restriction digest and perform DNA tagging with fluorescent sequence-specific probes. The DNA obtained is of high purity and floating free in solution, and it can be directly analyzed by pulsed-field gel electrophoresis (PFGE) or used in applications requiring submegabase DNA fragments. PFGE-ready samples of DNA restriction digests can be produced in as little as 2.1 h and require fewer than 10^8 cells. All fluidic operations are automated except for the injection of the sample and reagents.

  18. Laboratory and field testing of an automated atmospheric particle-bound reactive oxygen species sampling-analysis system.

    PubMed

    Wang, Yungang; Hopke, Philip K; Sun, Liping; Chalupa, David C; Utell, Mark J

    2011-01-01

    In this study, various laboratory and field tests were performed to develop an effective automated particle-bound ROS sampling-analysis system. The system uses the 2′,7′-dichlorofluorescin (DCFH) fluorescence method as a nonspecific, general indicator of the particle-bound ROS. A sharp-cut cyclone and a particle-into-liquid sampler (PILS) were used to collect PM2.5 atmospheric particles into slurry produced by a DCFH-HRP solution. The laboratory results show that the DCFH and H2O2 standard solutions could be kept at room temperature for at least three and eight days, respectively. The field test in Rochester, NY, shows that the average ROS concentration was 8.3 ± 2.2 nmol of equivalent H2O2 m−3 of air. The ROS concentrations were observed to be greater after foggy conditions. This study demonstrates the first practical automated sampling-analysis system to measure this ambient particle component.

  19. Automated method for simultaneous lead and strontium isotopic analysis applied to rainwater samples and airborne particulate filters (PM10).

    PubMed

    Beltrán, Blanca; Avivar, Jessica; Mola, Montserrat; Ferrer, Laura; Cerdà, Víctor; Leal, Luz O

    2013-09-03

    A new automated, sensitive, and fast system for the simultaneous online isolation and preconcentration of lead and strontium by sorption on a microcolumn packed with Sr-resin using an inductively coupled plasma mass spectrometry (ICP-MS) detector was developed, hyphenating lab-on-valve (LOV) and multisyringe flow injection analysis (MSFIA). Pb and Sr are directly retained on the sorbent column and eluted with a solution of 0.05 mol L−1 ammonium oxalate. The detection limits achieved were 0.04 ng for lead and 0.03 ng for strontium. Mass calibration curves were used since the proposed system allows the use of different sample volumes for preconcentration. Mass linear working ranges were between 0.13 and 50 ng and 0.1 and 50 ng for lead and strontium, respectively. The repeatability of the method, expressed as RSD, was 2.1% and 2.7% for Pb and Sr, respectively. Environmental samples such as rainwater and airborne particulate (PM10) filters as well as a certified reference material SLRS-4 (river water) were satisfactorily analyzed obtaining recoveries between 90 and 110% for both elements. The main features of the LOV-MSFIA-ICP-MS system proposed are the capability to renew solid phase extraction at will in a fully automated way, the remarkable stability of the column which can be reused up to 160 times, and the potential to perform isotopic analysis.
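A mass calibration curve of this kind is a least-squares line from loaded mass to detector response, inverted to quantify unknowns, with repeatability expressed as the relative standard deviation (RSD). The sketch below uses invented ICP-MS counts, not the paper's data:

```python
def fit_line(x, y):
    """Ordinary least-squares slope and intercept."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    slope = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
             / sum((xi - mx) ** 2 for xi in x))
    return slope, my - slope * mx

def rsd_percent(values):
    """Relative standard deviation (sample SD / mean) in percent."""
    mean = sum(values) / len(values)
    var = sum((v - mean) ** 2 for v in values) / (len(values) - 1)
    return 100 * var ** 0.5 / mean

masses  = [0.13, 1, 5, 10, 25, 50]             # ng of Pb loaded on the column
signals = [26, 200, 1000, 2010, 4990, 10000]   # hypothetical ICP-MS counts
slope, intercept = fit_line(masses, signals)
unknown_mass = (3100 - intercept) / slope      # invert calibration for a sample
print(f"estimated mass: {unknown_mass:.1f} ng")
```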

  20. Laboratory and Field Testing of an Automated Atmospheric Particle-Bound Reactive Oxygen Species Sampling-Analysis System

    PubMed Central

    Wang, Yungang; Hopke, Philip K.; Sun, Liping; Chalupa, David C.; Utell, Mark J.

    2011-01-01

    In this study, various laboratory and field tests were performed to develop an effective automated particle-bound ROS sampling-analysis system. The system uses the 2′,7′-dichlorofluorescin (DCFH) fluorescence method as a nonspecific, general indicator of the particle-bound ROS. A sharp-cut cyclone and a particle-into-liquid sampler (PILS) were used to collect PM2.5 atmospheric particles into slurry produced by a DCFH-HRP solution. The laboratory results show that the DCFH and H2O2 standard solutions could be kept at room temperature for at least three and eight days, respectively. The field test in Rochester, NY, shows that the average ROS concentration was 8.3 ± 2.2 nmol of equivalent H2O2 m−3 of air. The ROS concentrations were observed to be greater after foggy conditions. This study demonstrates the first practical automated sampling-analysis system to measure this ambient particle component. PMID:21577270

  1. IntelliCages and automated assessment of learning in group-housed mice

    NASA Astrophysics Data System (ADS)

    Puścian, Alicja; Knapska, Ewelina

    2014-11-01

    IntelliCage is a fully automated, computer controlled system, which can be used for long-term monitoring of behavior of group-housed mice. Using standardized experimental protocols we can assess cognitive abilities and behavioral flexibility in appetitively and aversively motivated tasks, as well as measure social influences on learning of the subjects. We have also identified groups of neurons specifically activated by appetitively and aversively motivated learning within the amygdala, function of which we are going to investigate optogenetically in the future.

  2. Deep learning for automated skeletal bone age assessment in X-ray images.

    PubMed

    Spampinato, C; Palazzo, S; Giordano, D; Aldinucci, M; Leonardi, R

    2017-02-01

    Skeletal bone age assessment is a common clinical practice to investigate endocrinology, genetic and growth disorders in children. It is generally performed by radiological examination of the left hand using either the Greulich and Pyle (G&P) method or the Tanner-Whitehouse (TW) method. However, both clinical procedures show several limitations, from the examination effort of radiologists to (most importantly) significant intra- and inter-operator variability. To address these problems, several automated approaches (especially relying on the TW method) have been proposed; nevertheless, none of them has proven able to generalize to different races, age ranges and genders. In this paper, we propose and test several deep learning approaches to assess skeletal bone age automatically; the results showed an average discrepancy between manual and automatic evaluation of about 0.8 years, which is state-of-the-art performance. Furthermore, this is the first automated skeletal bone age assessment work tested on a public dataset and for all age ranges, races and genders, for which the source code is available, thus representing an exhaustive baseline for future research in the field. Besides the specific application scenario, this paper aims at providing answers to more general questions about deep learning on medical images: from the comparison between deep-learned features and manually-crafted ones, to the usage of deep-learning methods trained on general imagery for medical problems, to how to train a CNN with few images.

  3. In vivo assessment of human burn scars through automated quantification of vascularity using optical coherence tomography

    NASA Astrophysics Data System (ADS)

    Liew, Yih Miin; McLaughlin, Robert A.; Gong, Peijun; Wood, Fiona M.; Sampson, David D.

    2013-06-01

    In scars arising from burns, objective assessment of vascularity is important in the early identification of pathological scarring, and in the assessment of progression and treatment response. We demonstrate the first clinical assessment and automated quantification of vascularity in cutaneous burn scars of human patients in vivo that uses optical coherence tomography (OCT). Scar microvasculature was delineated in three-dimensional OCT images using speckle decorrelation. The diameter and area density of blood vessels were automatically quantified. A substantial increase was observed in the measured density of vasculature in hypertrophic scar tissues (38%) when compared against normal, unscarred skin (22%). A proliferation of larger vessels (diameter≥100 μm) was revealed in hypertrophic scarring, which was absent from normal scars and normal skin over the investigated physical depth range of 600 μm. This study establishes the feasibility of this methodology as a means of clinical monitoring of scar progression.
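The vascular density figures (38% vs. 22%) correspond to the fraction of segmented vessel pixels per unit image area. A minimal sketch of that quantification over a binary vessel mask (toy data, not OCT output):

```python
def vessel_area_density(mask):
    """Fraction of pixels flagged as vessel (1) in a 2-D binary mask."""
    pixels = [p for row in mask for p in row]
    return sum(pixels) / len(pixels)

# Toy segmented vessel map for a small patch of scar tissue
scar_mask = [[1, 1, 0, 0],
             [1, 0, 0, 1],
             [0, 1, 1, 0]]
print(f"{vessel_area_density(scar_mask):.0%}")  # 50%
```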

  4. Sequential sampling: a novel method in farm animal welfare assessment.

    PubMed

    Heath, C A E; Main, D C J; Mullan, S; Haskell, M J; Browne, W J

    2016-02-01

    Lameness in dairy cows is an important welfare issue. As part of a welfare assessment, herd level lameness prevalence can be estimated from scoring a sample of animals, where higher levels of accuracy are associated with larger sample sizes. As the financial cost is related to the number of cows sampled, smaller samples are preferred. Sequential sampling schemes have been used for informing decision making in clinical trials. Sequential sampling involves taking samples in stages, where sampling can stop early depending on the estimated lameness prevalence. When welfare assessment is used for a pass/fail decision, a similar approach could be applied to reduce the overall sample size. The sampling schemes proposed here apply the principles of sequential sampling within a diagnostic testing framework. This study develops three sequential sampling schemes of increasing complexity to classify 80 fully assessed UK dairy farms, each with known lameness prevalence. Using the Welfare Quality herd-size-based sampling scheme, the first 'basic' scheme involves two sampling events. At the first sampling event half the Welfare Quality sample size is drawn, and then depending on the outcome, sampling either stops or is continued and the same number of animals is sampled again. In the second 'cautious' scheme, an adaptation is made to ensure that correctly classifying a farm as 'bad' is done with greater certainty. The third scheme is the only scheme to go beyond lameness as a binary measure and investigates the potential for increasing accuracy by incorporating the number of severely lame cows into the decision. The three schemes are evaluated with respect to accuracy and average sample size by running 100 000 simulations for each scheme, and a comparison is made with the fixed size Welfare Quality herd-size-based sampling scheme. All three schemes performed almost as well as the fixed size scheme but with much smaller average sample sizes. For the third scheme, an overall
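The 'basic' two-stage scheme can be simulated to estimate the average sample size; the herd size, pass/fail threshold, and early-stopping rule below are invented stand-ins for the paper's calibrated values:

```python
import random

def classify_farm(true_prev, full_n=60, fail_threshold=0.20, seed=None):
    """Two-stage sequential classification of one farm.
    Returns the decision and the number of cows actually sampled."""
    rng = random.Random(seed)
    half = full_n // 2
    stage1 = sum(rng.random() < true_prev for _ in range(half))
    p1 = stage1 / half
    # stop early only when the first-stage estimate is decisive
    if p1 <= fail_threshold / 2:
        return "pass", half
    if p1 >= fail_threshold * 2:
        return "fail", half
    stage2 = sum(rng.random() < true_prev for _ in range(half))
    p = (stage1 + stage2) / full_n
    return ("fail" if p >= fail_threshold else "pass"), full_n

# Average sample size across many simulated farms with 10% true prevalence
runs = [classify_farm(0.10, seed=i) for i in range(10_000)]
avg_n = sum(n for _, n in runs) / len(runs)
print(f"average sample size: {avg_n:.1f} (fixed scheme: 60)")
```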

  5. Measurement, Sampling, and Equating Errors in Large-Scale Assessments

    ERIC Educational Resources Information Center

    Wu, Margaret

    2010-01-01

    In large-scale assessments, such as state-wide testing programs, national sample-based assessments, and international comparative studies, there are many steps involved in the measurement and reporting of student achievement. There are always sources of inaccuracies in each of the steps. It is of interest to identify the source and magnitude of…

  6. Pharmacokinetic Studies of Chinese Medicinal Herbs Using an Automated Blood Sampling System and Liquid Chromatography-mass Spectrometry

    PubMed Central

    Wu, Yu-Tse; Wu, Ming-Tsang; Lin, Chia-Chun; Chien, Chao-Feng; Tsai, Tung-Hu

    2012-01-01

    The safety of herbal products is one of the major concerns for the modernization of traditional Chinese medicine, and pharmacokinetic data of medicinal herbs guide us to design the rational use of the herbal formula. This article reviews the advantages of the automated blood sampling (ABS) systems for pharmacokinetic studies. In addition, three commonly used sample preparative methods, protein precipitation, liquid-liquid extraction and solid-phase extraction, are introduced. Furthermore, the definition, causes and evaluation of matrix effects in liquid chromatography-mass spectrometry (LC/MS) analysis are demonstrated. Finally, we present our previous works as practical examples of the application of ABS systems and LC/MS for the pharmacokinetic studies of Chinese medicinal herbs. PMID:24716112

  7. The Effects of Finite Sampling on State Assessment Sample Requirements. NAEP Validity Studies. Working Paper Series.

    ERIC Educational Resources Information Center

    Chromy, James R.

    This study addressed statistical techniques that might ameliorate some of the sampling problems currently facing states with small populations participating in State National Assessment of Educational Progress (NAEP) assessments. The study explored how the application of finite population correction factors to the between-school component of…

  8. Fully automated determination of cannabinoids in hair samples using headspace solid-phase microextraction and gas chromatography-mass spectrometry.

    PubMed

    Musshoff, Frank; Junker, Heike P; Lachenmeier, Dirk W; Kroener, Lars; Madea, Burkhard

    2002-01-01

    This paper describes a fully automated procedure using alkaline hydrolysis and headspace solid-phase microextraction (HS-SPME) followed by on-fiber derivatization and gas chromatographic-mass spectrometric (GC-MS) detection of cannabinoids in human hair samples. Ten milligrams of hair was washed with deionized water, petroleum ether, and dichloromethane. After the addition of deuterated internal standards, the sample was hydrolyzed with sodium hydroxide and submitted directly to HS-SPME. After absorption of the analytes, the fiber was placed directly into the headspace of a second vial containing N-methyl-N-trimethylsilyltrifluoroacetamide (MSTFA) for on-fiber derivatization before GC-MS analysis. The limit of detection was 0.05 ng/mg for delta9-tetrahydrocannabinol (THC), 0.08 ng/mg for cannabidiol (CBD), and 0.14 ng/mg for cannabinol (CBN). Absolute recoveries were in the range between 0.3 and 7.5%. Linearity was proven over a range from 0.1 to 20 ng/mg with coefficients of correlation from 0.998 to 0.999. Validation of the whole procedure revealed excellent results. In comparison with conventional methods of hair analysis, this automated HS-SPME-GC-MS procedure is substantially faster. It is easy to perform without the use of solvents and with minimal sample quantities, but with the same degree of sensitivity and reproducibility. The applicability was demonstrated by the analysis of 25 hair samples from several forensic cases. The following concentration ranges were determined: THC 0.29-2.20 (mean 1.7) ng/mg, CBN 0.55-4.54 (mean 1.2) ng/mg, and CBD 0.53-18.36 (mean 1.3) ng/mg. 11-nor-Delta9-tetrahydrocannabinol-9-carboxylic acid could not be detected with this method.

  9. Assessment for Operator Confidence in Automated Space Situational Awareness and Satellite Control Systems

    NASA Astrophysics Data System (ADS)

    Gorman, J.; Voshell, M.; Sliva, A.

    2016-09-01

    The United States is highly dependent on space resources to support military, government, commercial, and research activities. Satellites operate at great distances, observation capacity is limited, and operator actions and observations can be significantly delayed. Safe operations require support systems that provide situational understanding, enhance decision making, and facilitate collaboration between human operators and system automation, both in-the-loop and on-the-loop. Joint cognitive systems engineering (JCSE) provides a rich set of methods for analyzing and informing the design of complex systems that include both human decision-makers and autonomous elements as coordinating teammates. While JCSE-based systems can enhance a system analyst's understanding of both existing and new system processes, JCSE activities typically occur outside of traditional systems engineering (SE) methods, providing sparse guidance about how systems should be implemented. In contrast, the Joint Directors of Laboratories (JDL) information fusion model and extensions, such as the Dual Node Network (DNN) technical architecture, provide the means to divide and conquer such engineering and implementation complexity, but are loosely coupled to specialized organizational contexts and needs. We previously described how Dual Node Decision Wheels (DNDW) extend the DNN to integrate JCSE analysis and design with the practicalities of system engineering and implementation using the DNN. Insights from Rasmussen's JCSE Decision Ladders align system implementation with organizational structures and processes. In the current work, we present a novel approach to assessing system performance based on patterns occurring in operational decisions that are documented by JCSE processes as traces in a decision ladder. 
In this way, system assessment is closely tied not just to system design, but the design of the joint cognitive system that includes human operators, decision-makers, information systems, and

  10. Automating Flood Hazard Mapping Methods for Near Real-time Storm Surge Inundation and Vulnerability Assessment

    NASA Astrophysics Data System (ADS)

    Weigel, A. M.; Griffin, R.; Gallagher, D.

    2015-12-01

    Storm surge has enough destructive power to damage buildings and infrastructure, erode beaches, and threaten human life across large geographic areas, hence posing the greatest threat of all the hurricane hazards. The United States Gulf of Mexico has proven vulnerable to hurricanes as it has been hit by some of the most destructive hurricanes on record. With projected rises in sea level and increases in hurricane activity, there is a need to better understand the associated risks for disaster mitigation, preparedness, and response. GIS has become a critical tool in enhancing disaster planning, risk assessment, and emergency response by communicating spatial information through a multi-layer approach. However, there is a need for a near real-time method of identifying areas with a high risk of being impacted by storm surge. Research was conducted alongside Baron, a private industry weather enterprise, to facilitate automated modeling and visualization of storm surge inundation and vulnerability on a near real-time basis. This research successfully automated current flood hazard mapping techniques using a GIS framework written in a Python programming environment, and displayed resulting data through an Application Program Interface (API). Data used for this methodology included high resolution topography, NOAA Probabilistic Surge model outputs parsed from Rich Site Summary (RSS) feeds, and the NOAA Census tract level Social Vulnerability Index (SoVI). The development process required extensive data processing and management to provide high resolution visualizations of potential flooding and population vulnerability in a timely manner. The accuracy of the developed methodology was assessed using Hurricane Isaac as a case study, which through a USGS and NOAA partnership, contained ample data for statistical analysis. 
This research successfully created a fully automated, near real-time method for mapping high resolution storm surge inundation and vulnerability for the
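At its core, the inundation step compares forecast surge height with ground elevation cell by cell and intersects the result with vulnerability data. A toy sketch of those two operations (grids, threshold, and function names are illustrative assumptions, not the actual Baron/NOAA pipeline):

```python
def inundation_mask(elevation, surge_height):
    """Cells flood where forecast surge exceeds ground elevation (metres)."""
    return [[surge - elev > 0 for elev, surge in zip(erow, srow)]
            for erow, srow in zip(elevation, surge_height)]

def flag_vulnerable(mask, sovi, threshold=0.75):
    """Flag flooded cells whose social-vulnerability index exceeds a threshold."""
    return [[flooded and v >= threshold for flooded, v in zip(mrow, vrow)]
            for mrow, vrow in zip(mask, sovi)]

elev  = [[0.5, 2.0], [1.0, 4.0]]   # toy elevation grid (m)
surge = [[1.8, 1.8], [1.8, 1.8]]   # toy forecast surge heights (m)
sovi  = [[0.9, 0.4], [0.8, 0.9]]   # toy vulnerability index per cell
mask = inundation_mask(elev, surge)          # [[True, False], [True, False]]
print(flag_vulnerable(mask, sovi))           # [[True, False], [True, False]]
```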

  11. Determination of selenium in marine biological tissues by transverse heated electrothermal atomic absorption spectrometry with longitudinal Zeeman background correction and automated ultrasonic slurry sampling.

    PubMed

    Méndez, H; Alava, F; Lavilla, I; Bendicho, C

    2001-01-01

    A fast, sensitive, and reliable method for determination of selenium in marine biological tissues by electrothermal atomic absorption spectrometry with slurry sampling was developed. Slurries were prepared from fresh and frozen seafood samples that were previously homogenized, dried, and ground; particle sizes <100 microm were taken for analysis. A 3% (v/v) HNO3 solution containing 0.01% (v/v) Triton X-100 was used as slurry diluent. Slurries were mixed on an automated ultrasonic slurry sampler at 20% amplitude for 30 s just before an aliquot was injected into the furnace. The method was successfully validated against the following certified reference materials: NRCC CRM DORM-2 (Dogfish muscle); NRCC CRM TORT-2 (Lobster hepatopancreas); NRCC CRM DOLT-2 (Dogfish liver); and BCR CRM 278 (Mussel tissue), and was subsequently applied to determination of Se in 10 marine biological samples. The influences of the drying procedure (oven-, microwave-, and freeze-drying), matrix modifier amount, mass of solid material in cup, and pipetting sequence are discussed. The limit of determination of Se was 0.16 microg/g and the repeatability, estimated as between-batch precision, was in the range of 4-8%. Se contents in the samples ranged from 0.6 to 2.8 microg/g. The proposed method should be useful for fast assessment of the daily dietary intake of Se.

  12. Solid recovered fuels in the cement industry--semi-automated sample preparation unit as a means for facilitated practical application.

    PubMed

    Aldrian, Alexia; Sarc, Renato; Pomberger, Roland; Lorber, Karl E; Sipple, Ernst-Michael

    2016-03-01

    One of the challenges for the cement industry is the quality assurance of alternative fuel (e.g., solid recovered fuel, SRF) in co-incineration plants--especially for inhomogeneous alternative fuels with large particle sizes (d95⩾100 mm), which will gain even more importance in the substitution of conventional fuels due to low production costs. Existing standards for sampling and sample preparation do not cover the challenges resulting from these kinds of materials. A possible approach to ensure quality monitoring is shown in the present contribution. For this, a specially manufactured, automated comminution and sample divider device was installed at a cement plant in Rohožnik. In order to prove its practical suitability with methods according to current standards, the sampling and sample preparation process were validated for alternative fuel with a grain size >30 mm (i.e., d95=approximately 100 mm), so-called 'Hotdisc SRF'. Therefore, series of samples were taken and analysed. A comparison of the analysis results with the yearly average values obtained through a reference investigation route showed good accordance. Further investigations during the validation process also showed that segregation or enrichment of material throughout the comminution plant does not occur. The results also demonstrate that compliance with legal standards regarding the minimum sample amount is not sufficient for inhomogeneous and coarse particle size alternative fuels. Instead, higher sample amounts after the first particle size reduction step are strongly recommended in order to gain a representative laboratory sample.

  13. Use of automated monitoring to assess behavioral toxicology in fish: Linking behavior and physiology

    USGS Publications Warehouse

    Brewer, S.K.; DeLonay, A.J.; Beauvais, S.L.; Little, E.E.; Jones, S.B.

    1999-01-01

    We measured locomotory behaviors (distance traveled, speed, tortuosity of path, and rate of change in direction) with computer-assisted analysis in 30 day posthatch rainbow trout (Oncorhynchus mykiss) exposed to pesticides. We also examined cholinesterase inhibition as a potential endpoint linking physiology and behavior. Sublethal exposure to chemicals often causes changes in swimming behavior, reflecting alterations in sensory and motor systems. Swimming behavior also integrates functions of the nervous system. Rarely are the connections between physiology and behavior made. Although behavior is often suggested as a sensitive, early indicator of toxicity, behavioral toxicology has not been used to its full potential because conventional methods of behavioral assessment have relied on manual techniques, which are often time-consuming and difficult to quantify. This has severely limited the application and utility of behavioral procedures. Swimming behavior is particularly amenable to computerized assessment and automated monitoring. Locomotory responses are sensitive to toxicants and can be easily measured. We briefly discuss the use of behavior in toxicology and automated techniques used in behavioral toxicology. We also describe the system we used to determine locomotory behaviors of fish, and present data demonstrating the system's effectiveness in measuring alterations in response to chemical challenges. Lastly, we correlate behavioral and physiological endpoints.
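    The locomotory endpoints listed above (distance traveled, speed, tortuosity of path, and rate of change in direction) can all be derived from a time-stamped track of positions. A minimal sketch, assuming a 2-D track sampled at a fixed interval; the function name and the definition of tortuosity as path length over net displacement are illustrative choices, not the authors' implementation:

    ```python
    import math

    def track_metrics(points, dt):
        """Compute simple locomotory metrics from a tracked (x, y) path.

        points: list of (x, y) positions sampled every dt seconds.
        Returns (distance, mean_speed, tortuosity, mean_turn_rate).
        Tortuosity here is path length / net displacement (>= 1.0);
        turn rate is in radians per second, averaged over heading changes.
        """
        steps = list(zip(points[:-1], points[1:]))
        seg = [math.hypot(x2 - x1, y2 - y1) for (x1, y1), (x2, y2) in steps]
        distance = sum(seg)
        duration = dt * (len(points) - 1)
        mean_speed = distance / duration
        net = math.hypot(points[-1][0] - points[0][0],
                         points[-1][1] - points[0][1])
        tortuosity = distance / net if net > 0 else float('inf')
        headings = [math.atan2(y2 - y1, x2 - x1) for (x1, y1), (x2, y2) in steps]
        # Wrap each heading change into [-pi, pi] before averaging
        turns = [abs(math.atan2(math.sin(b - a), math.cos(b - a)))
                 for a, b in zip(headings[:-1], headings[1:])]
        mean_turn_rate = sum(turns) / (dt * len(turns)) if turns else 0.0
        return distance, mean_speed, tortuosity, mean_turn_rate

    print(track_metrics([(0, 0), (1, 0), (2, 0), (3, 0)], 1.0))  # → (3.0, 1.0, 1.0, 0.0)
    ```

    A toxicant-exposed fish would typically show reduced speed and distance with elevated tortuosity relative to controls.
    
    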

  14. Impact of Moderate Blast Exposures on Thrombin Biomarkers Assessed by Calibrated Automated Thrombography in Rats

    PubMed Central

    Serebruany, Victor L.; Svetlov, Artem; Hayes, Ronald L.

    2013-01-01

    Abstract Severe blast exposures are frequently complicated with fatal intracranial hemorrhages. However, many more sustain low-level blasts without tissue damage detectable by brain imaging. To investigate effects of nonlethal blast on thrombin-related biomarkers, rats were subjected to two different types of head-directed blast: 1) moderate “composite” blast with strong head acceleration or 2) moderate primary blast, without head acceleration. Thrombin generation (TG) ex vivo after blast was studied by calibrated automated thrombography (CAT). In the same blood samples, we assessed maximal concentration of TG (TGmax), start time, peak time, mean time, and concentrations of protein markers for vascular/hemostatic dysfunctions: integrin α/β, soluble endothelial selectin (sE-selectin), soluble intercellular cell adhesion molecule-1 (sICAM-1), and matrix metalloproteinases (MMP)-2, MMP-8, and MMP-13. Blast remarkably affected all TG indices. In animals exposed to “composite” blast, TGmax peaked at 6 h (∼4.5-fold vs. control), was sustained at day 1 (∼3.8-fold increase), and declined to a 2-fold increase over control at day 7 post-blast. After primary blast, TGmax also rose to ∼4.2-fold of control at 6 h, dropped to ∼1.7-fold of control at day 1, and then exhibited a slight secondary increase to 2-fold of control at day 7. Other TG indices did not differ significantly between the two types of blast exposure. Changes were also observed in other microvascular/inflammatory/hemostatic biomarkers. Integrin α/β and sICAM-1 levels were elevated after both “composite” and primary blast at 6 h, 1 day, and 7 days. sE-selectin exhibited near-normal levels after “composite” blast, but increased significantly at 7 days after primary blast; MMP-2, MMP-8, and MMP-13 rose slightly after “composite” blast and increased significantly (∼2-4-fold) after primary blast. In summary, CAT may have a clinical diagnostic utility in combination with selected

  15. An Automated Laboratory-scale Methodology for the Generation of Sheared Mammalian Cell Culture Samples.

    PubMed

    Joseph, Adrian; Goldrick, Stephen; Mollet, Michael; Turner, Richard; Bender, Jean; Gruber, David; Farid, Suzanne S; Titchener-Hooker, Nigel

    2017-02-24

    Continuous disk-stack centrifugation is typically used for the removal of cells and cellular debris from mammalian cell culture broths at manufacturing-scale. The use of scale-down methods to characterise disk-stack centrifugation performance enables substantial reductions in material requirements and allows a much wider design space to be tested than is currently possible at pilot-scale. The process of scaling down centrifugation has historically been challenging due to the difficulties in mimicking the Energy Dissipation Rates (EDRs) in typical machines. This paper describes an alternative and easy-to-assemble automated capillary-based methodology to generate levels of EDRs consistent with those found in a continuous disk-stack centrifuge. Variations in EDR were achieved through changes in capillary internal diameter and the flow rate of operation through the capillary. The EDRs found to match the levels of shear in the feed zone of a pilot-scale centrifuge using the experimental method developed in this paper (2.4×10(5) W/kg) are consistent with those obtained through previously published computational fluid dynamic (CFD) studies (2.0×10(5) W/kg). Furthermore, this methodology can be incorporated into existing scale-down methods to model the process performance of continuous disk-stack centrifuges. This was demonstrated through the characterisation of culture hold time, culture temperature and EDRs on centrate quality.
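    For a rough sense of how capillary internal diameter and flow rate set the EDR, the average dissipation rate can be estimated from the pressure drop and the fluid mass held in the capillary. A minimal sketch assuming laminar Hagen-Poiseuille flow (an assumption; the device's flows may be turbulent, and the function name and default water properties are illustrative, not from the paper):

    ```python
    import math

    def capillary_edr(flow_rate, diameter, length, viscosity=1.0e-3, density=1.0e3):
        """Rough average energy dissipation rate (W/kg) in a capillary.

        Assumes laminar Hagen-Poiseuille flow; flow_rate in m^3/s,
        diameter and length in m, viscosity in Pa.s, density in kg/m^3.
        """
        radius = diameter / 2.0
        # Hagen-Poiseuille pressure drop across the capillary
        dp = 8.0 * viscosity * length * flow_rate / (math.pi * radius**4)
        volume = math.pi * radius**2 * length
        # Dissipated power divided by the fluid mass in the capillary
        return dp * flow_rate / (density * volume)

    # Illustrative numbers: 0.6 mL/min of water through a 0.5 mm x 10 cm capillary
    print(capillary_edr(1e-8, 5e-4, 0.1))
    ```

    Note the strong radius dependence (EDR scales as 1/r^6 at fixed flow rate), which is why small changes in capillary internal diameter span a wide EDR range.
    
    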

  16. Automated sample preparation for monitoring groundwater pollution by carbamate insecticides and their transformation products.

    PubMed

    Chiron, S; Valverde, A; Fernandez-Alba, A; Barceló, D

    1995-01-01

    We investigated automated on-line solid-phase extraction (SPE) followed by liquid chromatographic (LC) techniques for monitoring carbamates and their transformation products. Analytical determinations were performed by LC with UV or postcolumn fluorescence detection (U.S. Environmental Protection Agency Method 531.1 for carbamate insecticides) after preconcentration with on-line SPE using C18 Empore extraction disks. On-line SPE/LC/thermospray mass spectrometry with time-scheduled selected-ion monitoring was used as confirmatory method. The method was used to determine pesticide traces in well waters of a typical aquifer in the Almeria area (Andalucia, south of Spain) from March 1993 to February 1994. The major pollutants, found in highest amounts, were carbofuran, methiocarb, and methomyl, at levels of 0.32, 0.3, and 0.8 micrograms/L, respectively. According to results of seasonal variation studies, pollution by carbamate insecticides is sporadic and exceeds the limit of 0.5 micrograms/L for total pesticides allowed by the European Economic Community Drinking Water Directive only twice a year. 3-Hydroxycarbofuran and methiocarb sulfone also were detected, showing the importance of including the main toxic break-down products of carbamate insecticides in future monitoring programs.

  17. Rapid and automated sample preparation for nucleic acid extraction on a microfluidic CD (compact disk)

    NASA Astrophysics Data System (ADS)

    Kim, Jitae; Kido, Horacio; Zoval, Jim V.; Gagné, Dominic; Peytavi, Régis; Picard, François J.; Bastien, Martine; Boissinot, Maurice; Bergeron, Michel G.; Madou, Marc J.

    2006-01-01

    Rapid and automated preparation of PCR (polymerase chain reaction)-ready genomic DNA was demonstrated on a multiplexed CD (compact disk) platform by using hard-to-lyse bacterial spores. Cell disruption is carried out while bead-cell suspensions are pushed back and forth in center-tapered lysing chambers by angular oscillation of the disk (the keystone effect). During this lysis period, the cell suspensions are securely held within the lysing chambers by heat-activated wax valves. Upon application of remote heat to the disk in motion, the wax valves release lysate solutions into centrifuge chambers, where cell debris is separated by an elevated rotation of the disk. Only debris-free DNA extract is then transferred to collection chambers by capillary-assisted siphon and collected for heating that inactivates PCR inhibitors. Lysing capacity was evaluated using a real-time PCR assay to monitor the efficiency of Bacillus globigii spore lysis. PCR analysis showed that a 5-minute CD lysis run gave spore lysis efficiency similar to that obtained with a popular commercial DNA extraction kit (i.e., the IDI-lysis kit from GeneOhm Sciences Inc.), which is highly efficient for microbial cell and spore lysis. This work will contribute to the development of an integrated CD-based assay for rapid diagnosis of infectious diseases.

  18. Automated Algorithm for J-Tpeak and Tpeak-Tend Assessment of Drug-Induced Proarrhythmia Risk

    PubMed Central

    Johannesen, Lars; Vicente, Jose; Hosseini, Meisam; Strauss, David G.

    2016-01-01

    Background Prolongation of the heart rate corrected QT (QTc) interval is a sensitive marker of torsade de pointes risk; however, it is not specific, as QTc-prolonging drugs that block inward currents are often not associated with torsade. Recent work demonstrated that separate analysis of the heart rate corrected J-Tpeak (J-Tpeakc) and Tpeak-Tend intervals can identify QTc-prolonging drugs with inward current block; this approach is being proposed as part of a new cardiac safety paradigm for new drugs (the “CiPA” initiative). Methods In this work, we describe an automated measurement methodology for assessment of the J-Tpeakc and Tpeak-Tend intervals using the vector magnitude lead. The automated measurement methodology was developed using data from one clinical trial and was evaluated using independent data from a second clinical trial. Results Comparison between the automated and the prior semi-automated measurements shows that the automated algorithm reproduces the semi-automated measurements with a mean difference of single-deltas <1 ms and no difference in intra-time point variability (p for all > 0.39). In addition, the time-profiles of the baseline- and placebo-adjusted changes are within 1 ms for 63% of the time-points (86% within 2 ms). Importantly, the automated results lead to the same conclusions about the electrophysiological mechanisms of the studied drugs. Conclusions We have developed an automated algorithm for assessment of J-Tpeakc and Tpeak-Tend intervals that can be applied in clinical drug trials. Under the CiPA initiative this ECG assessment would determine if there are unexpected ion channel effects in humans compared to preclinical studies. The algorithm is being released as open-source software. Trial Registration NCT02308748 and NCT01873950 PMID:28036330
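    The vector magnitude lead mentioned in the Methods is conventionally computed point-by-point from three orthogonal leads, and the two subintervals follow directly from the fiducial points. A minimal sketch (the helper names and the millisecond fiducials are illustrative, not the released algorithm):

    ```python
    import math

    def vector_magnitude(x, y, z):
        """Vector magnitude (VM) lead from orthogonal X/Y/Z ECG leads,
        as commonly derived for global T-wave measurements."""
        return [math.sqrt(a * a + b * b + c * c) for a, b, c in zip(x, y, z)]

    def subintervals(j_ms, tpeak_ms, tend_ms):
        """Split the J-Tend interval into J-Tpeak and Tpeak-Tend (ms)."""
        return tpeak_ms - j_ms, tend_ms - tpeak_ms

    print(vector_magnitude([3.0], [4.0], [0.0]))  # → [5.0]
    print(subintervals(100, 320, 400))            # → (220, 80)
    ```

    Heart-rate correction of J-Tpeak would then be applied on top of these raw interval values.
    
    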

  19. Automated, Unobtrusive, Action-by-Action Assessment of Self-Regulation during Learning with an Intelligent Tutoring System

    ERIC Educational Resources Information Center

    Aleven, Vincent; Roll, Ido; McLaren, Bruce M.; Koedinger, Kenneth R.

    2010-01-01

    Assessment of students' self-regulated learning (SRL) requires a method for evaluating whether observed actions are appropriate acts of self-regulation in the specific learning context in which they occur. We review research that has resulted in an automated method for context-sensitive assessment of a specific SRL strategy, help seeking while…

  20. Automated radioanalytical system incorporating microwave-assisted sample preparation, chemical separation, and online radiometric detection for the monitoring of total 99Tc in nuclear waste processing streams.

    PubMed

    Egorov, Oleg B; O'Hara, Matthew J; Grate, Jay W

    2012-04-03

    An automated fluidic instrument is described that rapidly determines the total (99)Tc content of aged nuclear waste samples, where the matrix is chemically and radiologically complex and the existing speciation of the (99)Tc is variable. The monitor links microwave-assisted sample preparation with an automated anion exchange column separation and detection using a flow-through solid scintillator detector. The sample preparation steps acidify the sample, decompose organics, and convert all Tc species to the pertechnetate anion. The column-based anion exchange procedure separates the pertechnetate from the complex sample matrix, so that radiometric detection can provide accurate measurement of (99)Tc. We developed a preprogrammed spike addition procedure to automatically determine matrix-matched calibration. The overall measurement efficiency that is determined simultaneously provides a self-diagnostic parameter for the radiochemical separation and overall instrument function. Continuous, automated operation was demonstrated over the course of 54 h, which resulted in the analysis of 215 samples plus 54 hourly spike-addition samples, with consistent overall measurement efficiency for the operation of the monitor. A sample can be processed and measured automatically in just 12.5 min with a detection limit of 23.5 Bq/mL of (99)Tc in low activity waste (0.495 mL sample volume), with better than 10% RSD precision at concentrations above the quantification limit. This rapid automated analysis method was developed to support nuclear waste processing operations planned for the Hanford nuclear site.
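    The preprogrammed spike addition amounts to a standard-addition calibration: the extra count rate produced by a known added activity gives the overall measurement efficiency, which then converts the sample count rate into an activity concentration. A minimal sketch with illustrative names and numbers, not the instrument's software:

    ```python
    def measurement_efficiency(spiked_cps, sample_cps, spike_bq):
        """Overall measurement efficiency from a spike addition:
        extra count rate (counts/s) divided by the known added activity (Bq)."""
        return (spiked_cps - sample_cps) / spike_bq

    def activity_concentration(sample_cps, efficiency, volume_ml, blank_cps=0.0):
        """Sample activity concentration (Bq/mL) using the efficiency
        determined from the matrix-matched spike addition."""
        return (sample_cps - blank_cps) / (efficiency * volume_ml)

    eff = measurement_efficiency(15.0, 5.0, 50.0)   # → 0.2
    print(activity_concentration(5.0, eff, 0.495))  # Bq/mL for a 0.495 mL sample
    ```

    Because the efficiency is re-determined with each spike addition, a drop in its value doubles as the self-diagnostic the abstract describes.
    
    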

  1. Revisiting the Hubble sequence in the SDSS DR7 spectroscopic sample: a publicly available Bayesian automated classification

    NASA Astrophysics Data System (ADS)

    Huertas-Company, M.; Aguerri, J. A. L.; Bernardi, M.; Mei, S.; Sánchez Almeida, J.

    2011-01-01

    We present an automated morphological classification into 4 types (E, S0, Sab, Scd) of ~700 000 galaxies from the SDSS DR7 spectroscopic sample based on support vector machines. The main new property of the classification is that we associate a probability to each galaxy of being in the four morphological classes instead of assigning a single class. The classification is therefore better adapted to nature, where we expect a continuous transition between different morphological types. The algorithm is trained with a visual classification and then compared to several independent visual classifications, including the Galaxy Zoo first-release catalog. We find a very good correlation between the automated classification and classical visual ones. The compiled catalog is intended for use in different applications and is therefore freely available through a dedicated webpage and soon from the CasJobs database. The full catalog is only available in electronic form at CDS via anonymous ftp to cdsarc.u-strasbg.fr (130.79.128.5) or via http://cdsarc.u-strasbg.fr/viz-bin/qcat?J/A+A/525/A157 or via http://gepicom04.obspm.fr/sdss_morphology/Morphology_2010.html
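    The probabilistic output described above, one probability per morphological class rather than a hard label, is what a support vector machine with probability calibration provides. A minimal sketch using scikit-learn on synthetic two-feature data; the features, class centers, and `SVC` settings are stand-ins, not the catalog's actual training setup:

    ```python
    import numpy as np
    from sklearn.svm import SVC

    rng = np.random.default_rng(0)
    # Fake 2-feature "galaxies" drawn around four class centers (E, S0, Sab, Scd)
    centers = np.array([[0, 0], [2, 0], [0, 2], [2, 2]])
    X = np.vstack([c + 0.3 * rng.standard_normal((50, 2)) for c in centers])
    y = np.repeat(np.arange(4), 50)

    # probability=True enables Platt-scaled class probabilities
    clf = SVC(kernel="rbf", probability=True, random_state=0).fit(X, y)
    proba = clf.predict_proba([[1.0, 1.0]])[0]  # one probability per class
    print(dict(zip(["E", "S0", "Sab", "Scd"], proba.round(2))))
    ```

    An ambiguous object near the boundary between classes receives spread-out probabilities, which is exactly the "continuous transition" behavior the catalog aims to capture.
    
    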

  2. An automated thermophoretic soot sampling device for laboratory-scale high-pressure flames.

    PubMed

    Leschowski, M; Dreier, T; Schulz, C

    2014-04-01

    Studying soot particle morphology in high-pressure flames via thermophoretic sampling critically depends on sampling precision, speed, and reproducibility. This is mainly limited by the challenges of applying pneumatically driven devices for burner chamber pressures higher than the pneumatic pressure. We present a pneumatically driven device for high-pressure applications up to 90 bars. The novelty is to separate the pneumatic driver section from the high-pressure environment in the burner chamber. The device was tested by sampling soot from a laminar high-pressure flame at 20 bars.

  3. Monitoring cognitive function and need with the automated neuropsychological assessment metrics in Decompression Sickness (DCS) research

    NASA Technical Reports Server (NTRS)

    Nesthus, Thomas E.; Schiflett, Sammuel G.

    1993-01-01

    Hypobaric decompression sickness (DCS) research presents the medical monitor with the difficult task of assessing the onset and progression of DCS largely on the basis of subjective symptoms. Even with the introduction of precordial Doppler ultrasound techniques for the detection of venous gas emboli (VGE), correct prediction of DCS can be made only about 65 percent of the time according to data from the Armstrong Laboratory's (AL's) hypobaric DCS database. An AL research protocol concerned with exercise and its effects on denitrogenation efficiency includes implementation of a performance assessment test battery to evaluate cognitive functioning during a 4-h simulated 30,000 ft (9144 m) exposure. Information gained from such a test battery may assist the medical monitor in identifying early signs of DCS and subtle neurologic dysfunction related to cases of asymptomatic, but advanced, DCS. This presentation concerns the selection and integration of a test battery and the timely graphic display of subject test results for the principal investigator and medical monitor. A subset of the Automated Neuropsychological Assessment Metrics (ANAM) developed through the Office of Military Performance Assessment Technology (OMPAT) was selected. The ANAM software provides a library of simple tests designed for precise measurement of processing efficiency in a variety of cognitive domains. For our application and time constraints, two tests requiring high levels of cognitive processing and memory were chosen along with one test requiring fine psychomotor performance. Accuracy, speed, and processing throughput variables, as well as RMS error, were collected. An automated mood survey provided 'state' information on six scales including anger, happiness, fear, depression, activity, and fatigue. An integrated and interactive LOTUS 1-2-3 macro was developed to import and display past and present task performance and mood-change information.

  4. Accelerated Evaluation of Automated Vehicles Safety in Lane-Change Scenarios Based on Importance Sampling Techniques.

    PubMed

    Zhao, Ding; Lam, Henry; Peng, Huei; Bao, Shan; LeBlanc, David J; Nobukawa, Kazutoshi; Pan, Christopher S

    2016-08-05

    Automated vehicles (AVs) must be thoroughly evaluated before their release and deployment. A widely used evaluation approach is the Naturalistic-Field Operational Test (N-FOT), which tests prototype vehicles directly on the public roads. Due to the low exposure to safety-critical scenarios, N-FOTs are time consuming and expensive to conduct. In this paper, we propose an accelerated evaluation approach for AVs. The results can be used to generate motions of the other primary vehicles to accelerate the verification of AVs in simulations and controlled experiments. Frontal collision due to unsafe cut-ins is the target crash type of this paper. Human-controlled vehicles making unsafe lane changes are modeled as the primary disturbance to AVs based on data collected by the University of Michigan Safety Pilot Model Deployment Program. The cut-in scenarios are generated based on skewed statistics of collected human driver behaviors, which generate risky testing scenarios while preserving the statistical information so that the safety benefits of AVs in nonaccelerated cases can be accurately estimated. The cross-entropy method is used to recursively search for the optimal skewing parameters. The frequencies of the occurrences of conflicts, crashes, and injuries are estimated for a modeled AV, and the achieved accelerated rate is around 2000 to 20 000. In other words, in the accelerated simulations, driving for 1000 miles will expose the AV with challenging scenarios that will take about 2 to 20 million miles of real-world driving to encounter. This technique thus has the potential to greatly reduce the development and validation time for AVs.
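    The core statistical idea, sampling disturbances from a skewed distribution and re-weighting by the likelihood ratio so that real-world event frequencies are still estimated without bias, can be shown on a one-dimensional toy problem. A minimal sketch; the Gaussian severities and crash threshold are illustrative, not the paper's lane-change model:

    ```python
    import math
    import random

    random.seed(1)
    MU_REAL, MU_SKEW, SIGMA = 0.0, 3.0, 1.0  # natural vs. skewed sampling mean
    THRESHOLD = 4.0                          # "crash" if severity exceeds this

    def normal_pdf(x, mu, sigma):
        return math.exp(-((x - mu) ** 2) / (2 * sigma**2)) / (sigma * math.sqrt(2 * math.pi))

    n, acc = 200_000, 0.0
    for _ in range(n):
        x = random.gauss(MU_SKEW, SIGMA)  # accelerated (riskier) scenario
        if x > THRESHOLD:                 # rare event observed far more often
            # likelihood ratio maps the count back to real-world statistics
            acc += normal_pdf(x, MU_REAL, SIGMA) / normal_pdf(x, MU_SKEW, SIGMA)
    p_est = acc / n
    print(p_est)  # should be close to P(N(0,1) > 4) ≈ 3.17e-5
    ```

    Naive sampling from the real-world distribution would see roughly 6 such events in 200,000 trials; the skewed sampler sees tens of thousands, which is the source of the acceleration factor. The cross-entropy method in the paper goes one step further by searching for the skewing parameters that minimize estimator variance.
    
    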

  5. Accelerated Evaluation of Automated Vehicles Safety in Lane-Change Scenarios Based on Importance Sampling Techniques

    PubMed Central

    Zhao, Ding; Lam, Henry; Peng, Huei; Bao, Shan; LeBlanc, David J.; Nobukawa, Kazutoshi; Pan, Christopher S.

    2016-01-01

    Automated vehicles (AVs) must be thoroughly evaluated before their release and deployment. A widely used evaluation approach is the Naturalistic-Field Operational Test (N-FOT), which tests prototype vehicles directly on the public roads. Due to the low exposure to safety-critical scenarios, N-FOTs are time consuming and expensive to conduct. In this paper, we propose an accelerated evaluation approach for AVs. The results can be used to generate motions of the other primary vehicles to accelerate the verification of AVs in simulations and controlled experiments. Frontal collision due to unsafe cut-ins is the target crash type of this paper. Human-controlled vehicles making unsafe lane changes are modeled as the primary disturbance to AVs based on data collected by the University of Michigan Safety Pilot Model Deployment Program. The cut-in scenarios are generated based on skewed statistics of collected human driver behaviors, which generate risky testing scenarios while preserving the statistical information so that the safety benefits of AVs in nonaccelerated cases can be accurately estimated. The cross-entropy method is used to recursively search for the optimal skewing parameters. The frequencies of the occurrences of conflicts, crashes, and injuries are estimated for a modeled AV, and the achieved accelerated rate is around 2000 to 20 000. In other words, in the accelerated simulations, driving for 1000 miles will expose the AV with challenging scenarios that will take about 2 to 20 million miles of real-world driving to encounter. This technique thus has the potential to greatly reduce the development and validation time for AVs. PMID:27840592

  6. Investigation of Mercury Wet Deposition Physicochemistry in the Ohio River Valley through Automated Sequential Sampling

    EPA Science Inventory

    Intra-storm variability and soluble fractionation was explored for summer-time rain events in Steubenville, Ohio to evaluate the physical processes controlling mercury (Hg) in wet deposition in this industrialized region. Comprehensive precipitation sample collection was conducte...

  7. Automated transmission line fault analysis using synchronized sampling at two ends

    SciTech Connect

    Kezunovic, M.; Perunicic, B.

    1996-02-01

    This paper introduces a new approach to fault analysis using synchronized sampling. A digital fault recorder with Global Positioning System (GPS) satellite receiver is the source of data for this approach. Fault analysis functions, such as fault detection, classification and location are implemented for a transmission line using synchronized samples from two ends of a line. This technique can be extremely fast, selective and accurate, providing fault analysis performance that can not easily be matched by other known techniques.

  8. Automated transmission line fault analysis using synchronized sampling at two ends

    SciTech Connect

    Kezunovic, M.; Perunicic, B.

    1995-12-31

    This paper introduces a new approach to fault analysis using synchronized sampling. A digital fault recorder with Global Positioning System (GPS) satellite receiver is the source of data for this approach. Fault analysis functions, such as fault detection, classification and location are implemented for a transmission line using synchronized samples from two ends of a line. This technique can be extremely fast, selective and accurate, providing fault analysis performance that can not easily be matched by other known techniques.

  9. High-throughput automated microfluidic sample preparation for accurate microbial genomics

    PubMed Central

    Kim, Soohong; De Jonghe, Joachim; Kulesa, Anthony B.; Feldman, David; Vatanen, Tommi; Bhattacharyya, Roby P.; Berdy, Brittany; Gomez, James; Nolan, Jill; Epstein, Slava; Blainey, Paul C.

    2017-01-01

    Low-cost shotgun DNA sequencing is transforming the microbial sciences. Sequencing instruments are so effective that sample preparation is now the key limiting factor. Here, we introduce a microfluidic sample preparation platform that integrates the key steps of cell-to-sequencing-library sample preparation for up to 96 samples and reduces DNA input requirements 100-fold while maintaining or improving data quality. The general-purpose microarchitecture we demonstrate supports workflows with arbitrary numbers of reaction and clean-up or capture steps. By reducing the sample quantity requirements, we enabled low-input (∼10,000 cells) whole-genome shotgun (WGS) sequencing of Mycobacterium tuberculosis and soil micro-colonies with superior results. We also leveraged the enhanced throughput to sequence ∼400 clinical Pseudomonas aeruginosa libraries and demonstrate excellent single-nucleotide polymorphism detection performance that explained phenotypically observed antibiotic resistance. Fully-integrated lab-on-chip sample preparation overcomes technical barriers to enable broader deployment of genomics across many basic research and translational applications. PMID:28128213

  10. Development of a full automation solid phase microextraction method for investigating the partition coefficient of organic pollutant in complex sample.

    PubMed

    Jiang, Ruifen; Lin, Wei; Wen, Sijia; Zhu, Fang; Luan, Tiangang; Ouyang, Gangfeng

    2015-08-07

    A fully automated solid phase microextraction (SPME) depletion method was developed to study the partition coefficient of organic compound between complex matrix and water sample. The SPME depletion process was conducted by pre-loading the fiber with a specific amount of organic compounds from a proposed standard gas generation vial, and then desorbing the fiber into the targeted samples. Based on the proposed method, the partition coefficients (Kmatrix) of 4 polyaromatic hydrocarbons (PAHs) between humic acid (HA)/hydroxypropyl-β-cyclodextrin (β-HPCD) and aqueous sample were determined. The results showed that the logKmatrix of 4 PAHs with HA and β-HPCD ranged from 3.19 to 4.08, and 2.45 to 3.15, respectively. In addition, the logKmatrix values decreased about 0.12-0.27 log units for different PAHs for every 10°C increase in temperature. The effect of temperature on the partition coefficient followed van't Hoff plot, and the partition coefficient at any temperature can be predicted based on the plot. Furthermore, the proposed method was applied for the real biological fluid analysis. The partition coefficients of 6 PAHs between the complex matrices in the fetal bovine serum and water were determined, and compared to ones obtained from SPME extraction method. The result demonstrated that the proposed method can be applied to determine the sorption coefficients of hydrophobic compounds between complex matrix and water in a variety of samples.
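    The van't Hoff treatment described above makes log K linear in 1/T, so two measured partition coefficients fix the line and predict K at any other temperature. A minimal sketch with made-up K values, not the paper's PAH data:

    ```python
    import math

    def vant_hoff_fit(t1_k, k1, t2_k, k2):
        """Return (slope, intercept) of ln K vs. 1/T from two measurements
        at absolute temperatures t1_k and t2_k (kelvin)."""
        x1, x2 = 1.0 / t1_k, 1.0 / t2_k
        slope = (math.log(k1) - math.log(k2)) / (x1 - x2)
        intercept = math.log(k1) - slope * x1
        return slope, intercept

    def predict_k(t_k, slope, intercept):
        """Partition coefficient predicted at temperature t_k (kelvin)."""
        return math.exp(slope / t_k + intercept)

    # Illustrative: log K = 4.0 at 15 °C falling to 3.6 at 35 °C
    slope, intercept = vant_hoff_fit(288.15, 10 ** 4.0, 308.15, 10 ** 3.6)
    print(round(math.log10(predict_k(298.15, slope, intercept)), 2))  # → 3.79
    ```

    The slope equals -ΔH/R, so the fit also yields the enthalpy of the partitioning process as a by-product.
    
    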

  11. Effect of sampling frequency on shoreline microbiology assessments.

    PubMed

    Leecaster, M K; Weisberg, S B

    2001-11-01

    More than 80,000 shoreline bacteriological samples are collected annually in southern California to protect beachgoer health, but sampling frequency varies from daily to monthly among sampling sites. To assess the effectiveness of various sampling frequencies, we used five years of data from 24 Los Angeles area sites that have been monitored daily to simulate five alternative sampling strategies: five weekdays, five days per week including a weekend day, three days per week, weekly, and monthly. For each of these sampling strategies, we included in the simulation the local custom of adaptive sampling, in which a site is resampled the following day if bacterial concentrations exceed the State of California's beach water quality standards. We found that sampling five times per week resulted in observing about 80% of the events in which State standards were exceeded. This frequency dropped to 55%, 25%, and 5% for three times per week, weekly, and monthly sampling, respectively. Adaptive sampling was ineffective because nearly 70% of the water quality exceedances were single-day events, even at the most frequently contaminated sites. This high frequency of single-day events is of concern because the public is typically notified about water quality conditions 24-48 h after samples are collected, meaning that most warnings are out-of-date when they are issued.
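    The simulation logic, counting how many exceedance events a sparser schedule would still catch in a daily record, can be sketched in a few lines. The 28-day record and the two schedules below are synthetic illustrations, not the Los Angeles data:

    ```python
    def events(daily_exceed):
        """Group consecutive exceedance days into events (start, length)."""
        out, i, n = [], 0, len(daily_exceed)
        while i < n:
            if daily_exceed[i]:
                j = i
                while j < n and daily_exceed[j]:
                    j += 1
                out.append((i, j - i))
                i = j
            else:
                i += 1
        return out

    def fraction_caught(daily_exceed, sampled_days):
        """Fraction of exceedance events with at least one sampled day."""
        evs = events(daily_exceed)
        caught = sum(any(d in sampled_days for d in range(s, s + ln))
                     for s, ln in evs)
        return caught / len(evs) if evs else 1.0

    # 28-day record dominated by single-day exceedance events
    record = [False] * 28
    for d in (2, 9, 10, 15, 21):
        record[d] = True

    mwf = {d for d in range(28) if d % 7 in (0, 2, 4)}  # three days per week
    weekly = set(range(0, 28, 7))                       # one day per week
    print(fraction_caught(record, mwf), fraction_caught(record, weekly))  # → 0.75 0.25
    ```

    Because most events last a single day, the caught fraction falls roughly in proportion to the sampling frequency, which mirrors the 80%/55%/25%/5% pattern reported above.
    
    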

  12. Use of single particle aerosol mass spectrometry for the automated nondestructive identification of drugs in multicomponent samples.

    PubMed

    Martin, Audrey N; Farquar, George R; Steele, Paul T; Jones, A Daniel; Frank, Matthias

    2009-11-15

    In this work, single particle aerosol mass spectrometry (SPAMS) was used to identify the active drug ingredients in samples of multicomponent over-the-counter (OTC) drug tablets with minimal damage to the tablets. OTC drug tablets in various formulations were analyzed including single active ingredient tablets and multi-ingredient tablets. Using a sampling apparatus developed in-house, micrometer-sized particles were simultaneously dislodged from tablets and introduced to the SPAMS, where dual-polarity mass spectra were obtained from individual particles. Active ingredients were identified from the parent ions and fragment ions formed from each sample, and alarm files were developed for each active ingredient, allowing successful automated identification of each compound in a mixture. The alarm algorithm developed for SPAMS correctly identified all drug compounds in all single-ingredient and multi-ingredient tablets studied. A further study demonstrated the ability of this technique to identify the active ingredient in a single tablet analyzed in the presence of several other nonidentical tablets. In situ measurements were also made by sampling directly from a drug sample in its original bottle. A single tablet embedded in 11 identical tablets of different composition was detected in this manner. Overall, this work demonstrates the ability of the SPAMS technique to detect a target drug compound both in complex tablets, i.e., multidrug ingredient tablets, and complex sampling environments, i.e., multitablet sampling sources. The technique is practically nondestructive, leaving the characteristic shape, color, and imprint of a tablet intact for further analysis. Applications of this technique may include forensic and pharmaceutical analysis.

  13. The T-lock: Automated compensation of radio-frequency induced sample heating

    PubMed Central

    Hiller, Sebastian; Arthanari, Haribabu; Wagner, Gerhard

    2009-01-01

    Modern high-field NMR spectrometers can stabilize the nominal sample temperature at a precision of less than 0.1 K. However, the actual sample temperature may differ from the nominal value by several degrees because the sample heating caused by high-power radio frequency pulses is not readily detected by the temperature sensors. Without correction, transfer of chemical shifts between different experiments causes problems in the data analysis. In principle, the temperature differences can be corrected by manual procedures but this is cumbersome and not fully reliable. Here, we introduce the concept of a "T-lock", which automatically maintains the sample at the same reference temperature over the course of different NMR experiments. The T-lock works by continuously measuring the resonance frequency of a suitable spin and simultaneously adjusting the temperature control, thus locking the sample temperature at the reference value. For three different nuclei, 13C, 17O and 31P in the compounds alanine, water, and phosphate, respectively, the T-lock accuracy was found to be < 0.1 K. The use of dummy scan periods with variable lengths allows a reliable establishment of the thermal equilibrium before the acquisition of an experiment starts. PMID:19434373
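    The T-lock is, in essence, a feedback loop: the thermometer spin's resonance frequency is compared with its reference value and the temperature control is adjusted until the shift returns to the reference. A toy sketch of such a loop; the shift sensitivity, gain, and starting offset are illustrative numbers, not the paper's implementation:

    ```python
    SENSITIVITY = -0.01  # ppm chemical-shift change per kelvin (assumed value)
    GAIN = 0.5           # fraction of the inferred error corrected per cycle

    def shift_at(temp_k, ref_temp_k, ref_shift_ppm):
        """Simulated thermometer spin: shift is linear in temperature."""
        return ref_shift_ppm + SENSITIVITY * (temp_k - ref_temp_k)

    ref_temp, ref_shift = 298.0, 4.70
    temp = 303.0  # sample starts 5 K too warm from RF heating
    for _ in range(20):
        # Infer the temperature error from the observed shift deviation
        err_k = (shift_at(temp, ref_temp, ref_shift) - ref_shift) / SENSITIVITY
        temp -= GAIN * err_k  # nudge the temperature-control setpoint
    print(temp - ref_temp)  # residual error in kelvin, near zero
    ```

    With a proportional gain below 1 the loop converges geometrically; the real system additionally has to cope with measurement noise and the thermal lag of the probe.
    
    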

  14. Low-Cost 3D Printers Enable High-Quality and Automated Sample Preparation and Molecular Detection.

    PubMed

    Chan, Kamfai; Coen, Mauricio; Hardick, Justin; Gaydos, Charlotte A; Wong, Kah-Yat; Smith, Clayton; Wilson, Scott A; Vayugundla, Siva Praneeth; Wong, Season

    2016-01-01

    Most molecular diagnostic assays require upfront sample preparation steps to isolate the target's nucleic acids, followed by its amplification and detection using various nucleic acid amplification techniques. Because molecular diagnostic methods are generally rather difficult to perform manually without highly trained users, automated and integrated systems are highly desirable but too costly for use at point-of-care or low-resource settings. Here, we showcase the development of a low-cost and rapid nucleic acid isolation and amplification platform by modifying entry-level 3D printers that cost between $400 and $750. Our modifications consisted of replacing the extruder with a tip-comb attachment that houses magnets to conduct magnetic particle-based nucleic acid extraction. We then programmed the 3D printer to conduct motions that can perform high-quality extraction protocols. Up to 12 samples can be processed simultaneously in under 13 minutes and the efficiency of nucleic acid isolation matches well against gold-standard spin-column-based extraction technology. Additionally, we used the 3D printer's heated bed to supply heat to perform water bath-based polymerase chain reactions (PCRs). Using another attachment to hold PCR tubes, the 3D printer was programmed to automate the process of shuttling PCR tubes between water baths. By eliminating the temperature ramping needed in most commercial thermal cyclers, the run time of a 35-cycle PCR protocol was shortened by 33%. This article demonstrates that for applications in resource-limited settings, expensive nucleic acid extraction devices and thermal cyclers that are used in many central laboratories can be potentially replaced by a device modified from inexpensive entry-level 3D printers.
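The reported 33% run-time reduction is easy to sanity-check: shuttling tubes between pre-heated water baths replaces ramp time with a much shorter transfer time. The hold, ramp, and shuttle durations below are hypothetical round numbers chosen only to show the arithmetic, not values from the paper.

```python
# Back-of-envelope check of a ~33% PCR run-time reduction, using hypothetical
# per-cycle hold and transition times (not values from the paper).
CYCLES = 35

def run_time(denature_s, anneal_extend_s, transition_s, cycles=CYCLES):
    """Total PCR time: per-cycle holds plus temperature-transition overhead."""
    return cycles * (denature_s + anneal_extend_s + transition_s)

conventional = run_time(15, 30, transition_s=30)  # thermal ramping dominates
water_bath   = run_time(15, 30, transition_s=5)   # tubes shuttled between baths

reduction = 1 - water_bath / conventional
print(f"{reduction:.0%}")  # → 33%
```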

  15. Low-Cost 3D Printers Enable High-Quality and Automated Sample Preparation and Molecular Detection

    PubMed Central

    Chan, Kamfai; Coen, Mauricio; Hardick, Justin; Gaydos, Charlotte A.; Wong, Kah-Yat; Smith, Clayton; Wilson, Scott A.; Vayugundla, Siva Praneeth; Wong, Season

    2016-01-01

    Most molecular diagnostic assays require upfront sample preparation steps to isolate the target’s nucleic acids, followed by its amplification and detection using various nucleic acid amplification techniques. Because molecular diagnostic methods are generally rather difficult to perform manually without highly trained users, automated and integrated systems are highly desirable but too costly for use at point-of-care or low-resource settings. Here, we showcase the development of a low-cost and rapid nucleic acid isolation and amplification platform by modifying entry-level 3D printers that cost between $400 and $750. Our modifications consisted of replacing the extruder with a tip-comb attachment that houses magnets to conduct magnetic particle-based nucleic acid extraction. We then programmed the 3D printer to conduct motions that can perform high-quality extraction protocols. Up to 12 samples can be processed simultaneously in under 13 minutes and the efficiency of nucleic acid isolation matches well against gold-standard spin-column-based extraction technology. Additionally, we used the 3D printer’s heated bed to supply heat to perform water bath-based polymerase chain reactions (PCRs). Using another attachment to hold PCR tubes, the 3D printer was programmed to automate the process of shuttling PCR tubes between water baths. By eliminating the temperature ramping needed in most commercial thermal cyclers, the run time of a 35-cycle PCR protocol was shortened by 33%. This article demonstrates that for applications in resource-limited settings, expensive nucleic acid extraction devices and thermal cyclers that are used in many central laboratories can be potentially replaced by a device modified from inexpensive entry-level 3D printers. PMID:27362424

  16. Automation of preparation of nonmetallic samples for analysis by atomic absorption and inductively coupled plasma spectrometry

    NASA Technical Reports Server (NTRS)

    Wittmann, A.; Willay, G.

    1986-01-01

    For the rapid preparation of solutions intended for analysis by inductively coupled plasma emission spectrometry or atomic absorption spectrometry, an automatic device called Plasmasol was developed. This apparatus uses the nonwettability of glassy carbon to fuse the sample in an appropriate flux. The sample-flux mixture is placed in a composite crucible, heated at high temperature, swirled until full dissolution is achieved, and then poured into a water-filled beaker. After acid addition, dissolution of the melt, and filling to the mark, the solution is ready for analysis. The analytical results obtained, either for oxide samples or for prereduced iron ores, show that the solutions prepared with this device are indistinguishable from those obtained by manual dissolution, whether by acid digestion or by high-temperature fusion. Preparation reproducibility and analytical tests illustrate the performance of Plasmasol.

  17. Toxicity assessment of ionic liquids with Vibrio fischeri: an alternative fully automated methodology.

    PubMed

    Costa, Susana P F; Pinto, Paula C A G; Lapa, Rui A S; Saraiva, M Lúcia M F S

    2015-03-02

    A fully automated Vibrio fischeri methodology based on sequential injection analysis (SIA) has been developed. The methodology was based on the aspiration of 75 μL of bacteria and 50 μL of inhibitor, followed by measurement of the luminescence of the bacteria. The assays were conducted for contact times of 5, 15, and 30 min by means of three mixing chambers that ensured adequate mixing conditions. The optimized methodology provided precise control of the reaction conditions, which is an asset for the analysis of a large number of samples. The developed methodology was applied to the evaluation of the impact of a set of ionic liquids (ILs) on V. fischeri and the results were compared with those provided by a conventional assay kit (Biotox®). The collected data evidenced the influence of different cation head groups and anion moieties on the toxicity of ILs. Generally, aromatic cations and fluorine-containing anions displayed a higher impact on V. fischeri, evidenced by lower EC50 values. The proposed methodology was validated through statistical analysis, which demonstrated a strong positive correlation (P > 0.98) between assays. It is expected that the automated methodology can be tested for more classes of compounds and used as an alternative to microplate-based V. fischeri assay kits.
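An EC50 (the concentration causing 50% luminescence inhibition) is commonly obtained from a dose-response series. One simple estimator, not necessarily the fitting procedure used by the kit or the authors, is log-linear interpolation between the two concentrations bracketing 50% inhibition; the dose-response data below are invented.

```python
import math

# EC50 estimated by log-linear interpolation between the two concentrations
# bracketing 50% inhibition. Assumes inhibition increases with concentration.
def ec50(concentrations, inhibition_pct):
    """Interpolate the concentration giving 50% inhibition on a log scale."""
    for (c_lo, i_lo), (c_hi, i_hi) in zip(
        zip(concentrations, inhibition_pct),
        zip(concentrations[1:], inhibition_pct[1:]),
    ):
        if i_lo <= 50.0 <= i_hi:
            frac = (50.0 - i_lo) / (i_hi - i_lo)
            log_c = math.log10(c_lo) + frac * (math.log10(c_hi) - math.log10(c_lo))
            return 10 ** log_c
    raise ValueError("50% inhibition not bracketed by the data")

conc = [0.1, 1.0, 10.0, 100.0]    # mM, increasing ionic liquid dose (made up)
inhib = [5.0, 30.0, 70.0, 95.0]   # % light loss after a fixed contact time
print(round(ec50(conc, inhib), 2))
```

A lower EC50 means the 50% inhibition point is reached at a lower dose, i.e. higher toxicity, which is how the abstract ranks the ionic liquids.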

  18. Assessing the accuracy of an inter-institutional automated patient-specific health problem list

    PubMed Central

    2010-01-01

    Background Health problem lists are a key component of electronic health records and are instrumental in the development of decision-support systems that encourage best practices and optimal patient safety. Most health problem lists require initial clinical information to be entered manually and few integrate information across care providers and institutions. This study assesses the accuracy of a novel approach to create an inter-institutional automated health problem list in a computerized medical record (MOXXI) that integrates three sources of information for an individual patient: diagnostic codes from medical services claims from all treating physicians, therapeutic indications from electronic prescriptions, and single-indication drugs. Methods Data for this study were obtained from 121 general practitioners and all medical services provided for 22,248 of their patients. At the opening of a patient's file, all health problems detected through medical service utilization or single-indication drug use were flagged to the physician in the MOXXI system. Each newly arising health problem was presented as 'potential' and physicians were prompted to specify whether the health problem was valid (Y) or not (N) or whether they preferred to reassess its validity at a later time. Results A total of 263,527 health problems, representing 891 unique problems, were identified for the group of 22,248 patients. Medical services claims contributed the majority of problems identified (77%), followed by therapeutic indications from electronic prescriptions (14%), and single-indication drugs (9%). Physicians actively chose to assess 41.7% (n = 106,950) of health problems. Overall, 73% of the problems assessed were considered valid; 42% originated from medical service diagnostic codes, 11% from single-indication drugs, and 47% from prescription indications.
Twelve percent of problems identified through other treating physicians were considered valid compared to 28% identified through study

  19. A self-contained polymeric cartridge for automated biological sample preparation.

    PubMed

    Xu, Guolin; Lee, Daniel Yoke San; Xie, Hong; Chiew, Deon; Hsieh, Tseng-Ming; Ali, Emril Mohamed; Lun Looi, Xing; Li, Mo-Huang; Ying, Jackie Y

    2011-09-01

    Sample preparation is one of the most crucial processes for nucleic acid-based disease diagnosis. Several steps are required for nucleic acid extraction, impurity washes, and DNA/RNA elution. Careful sample preparation is vital to obtaining a reliable diagnosis, especially with low copies of pathogens and cells. This paper describes a low-cost, disposable lab cartridge for automatic sample preparation, which is capable of handling flexible sample volumes of 10 μl to 1 ml. This plastic cartridge contains all the necessary reagents for pathogen and cell lysis, DNA/RNA extraction, impurity washes, DNA/RNA elution and waste processing in a completely sealed cartridge. The entire sample preparation process is conducted automatically within the cartridge on a desktop unit using a pneumatic fluid manipulation approach. Reagent transport is achieved with a combination of push and pull forces (compressed air and vacuum, respectively), which are connected to the pneumatic inlets at the bottom of the cartridge. These pneumatic forces are regulated by a pinch valve manifold and two pneumatic syringe pumps within the desktop unit. The performance of this pneumatic reagent delivery method was examined. We have demonstrated the capability of on-cartridge RNA extraction and cancer-specific gene amplification from 10 copies of MCF-7 breast cancer cells. The on-cartridge DNA recovery efficiency was 54-63%, which was comparable to or better than the conventional manual approach using a silica spin column. The lab cartridge would be suitable for integration with lab-chip real-time polymerase chain reaction devices to provide a portable system for decentralized disease diagnosis.

  20. On-line sample preparation for the automated sequential determination of Hg in blood, urine and waste water

    SciTech Connect

    Schlemmer, G.; Erler, W.

    1995-12-31

    The accurate determination of mercury in environmental and clinical samples such as waste water, urine or blood with the cold vapour technique requires complete oxidation and stabilization of mercury in the liquid phase prior to its reduction. It has been shown that the oxidation of all relevant organo-mercury compounds in this type of matrix can be achieved on-line by an appropriate oxidizing agent used in an open microwave system coupled to a flow injection cold vapour system. The various matrices, however, are handled individually. Blood samples, for example, are diluted and injected into a neutral carrier. The acid to start the reaction is added on-line only shortly before the sample enters the heating zone of the microwave oven. Urine and waste water, on the other hand, are already acidified in the autosampler vessel and the microwave digestion is used only for completion of the oxidation. In this application, blood, urine and waste water, the three most commonly encountered matrices, were analyzed using the same FIAS and microwave parameters in an automated run. The time for one individual measurement including the on-line deposition is about 90 s. The detection limits obtained with a mercury-specific detector are about 20 ng/L for urine and waste water and 100 ng/L for blood.

  1. Automated sample-scanning methods for radiation damage mitigation and diffraction-based centering of macromolecular crystals.

    PubMed

    Hilgart, Mark C; Sanishvili, Ruslan; Ogata, Craig M; Becker, Michael; Venugopalan, Nagarajan; Stepanov, Sergey; Makarov, Oleg; Smith, Janet L; Fischetti, Robert F

    2011-09-01

    Automated scanning capabilities have been added to the data acquisition software, JBluIce-EPICS, at the National Institute of General Medical Sciences and the National Cancer Institute Collaborative Access Team (GM/CA CAT) at the Advanced Photon Source. A 'raster' feature enables sample centering via diffraction scanning over two-dimensional grids of simple rectangular or complex polygonal shape. The feature is used to locate crystals that are optically invisible owing to their small size or are visually obfuscated owing to properties of the sample mount. The raster feature is also used to identify the best-diffracting regions of large inhomogeneous crystals. Low-dose diffraction images taken at grid positions are automatically processed in real time to provide a quick quality ranking of potential data-collection sites. A 'vector collect' feature mitigates the effects of radiation damage by scanning the sample along a user-defined three-dimensional vector during data collection to maximize the use of the crystal volume and the quality of the collected data. These features are integrated into the JBluIce-EPICS data acquisition software developed at GM/CA CAT where they are used in combination with a robust mini-beam of rapidly changeable diameter from 5 µm to 20 µm. The powerful software-hardware combination is being applied to challenging problems in structural biology.
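Stripped of the diffraction physics, the real-time quality ranking of grid positions reduces to sorting positions by a per-image score. A minimal sketch, with an invented score (e.g. the number of indexable spots in each low-dose image):

```python
# Minimal sketch of ranking raster-grid positions by a diffraction-quality
# score, as the real-time processing step does; the scores are made up.
def rank_grid(scores):
    """Return grid positions sorted from best- to worst-diffracting."""
    return sorted(scores, key=scores.get, reverse=True)

# (row, col) -> hypothetical spot count from the low-dose image at that point
scores = {(0, 0): 3, (0, 1): 41, (1, 0): 17, (1, 1): 2}
print(rank_grid(scores)[0])  # → (0, 1), the best data-collection site
```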

  2. Development of an Automated and Sensitive Microfluidic Device for Capturing and Characterizing Circulating Tumor Cells (CTCs) from Clinical Blood Samples.

    PubMed

    Gogoi, Priya; Sepehri, Saedeh; Zhou, Yi; Gorin, Michael A; Paolillo, Carmela; Capoluongo, Ettore; Gleason, Kyle; Payne, Austin; Boniface, Brian; Cristofanilli, Massimo; Morgan, Todd M; Fortina, Paolo; Pienta, Kenneth J; Handique, Kalyan; Wang, Yixin

    2016-01-01

    Current analysis of circulating tumor cells (CTCs) is hindered by the sub-optimal sensitivity and specificity of devices or assays, as well as a lack of capability to characterize CTCs with clinical biomarkers. Here, we validate a novel technology to enrich and characterize CTCs from blood samples of patients with metastatic breast, prostate and colorectal cancers using a microfluidic chip that is processed by an automated staining and scanning system from sample preparation to image processing. The Celsee system allowed for the detection of CTCs with apparently high sensitivity and specificity (94% sensitivity and 100% specificity). Moreover, the system facilitated rapid capture of CTCs from blood samples and also allowed for downstream characterization of the captured cells by immunohistochemistry and DNA and mRNA fluorescence in-situ hybridization (FISH). In a subset of patients with prostate cancer, we compared the technology with an FDA-approved CTC device, CellSearch, and found a higher degree of sensitivity with the Celsee instrument. In conclusion, the integrated Celsee system represents a promising CTC technology for enumeration and molecular characterization.
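The 94%/100% figures are the standard confusion-matrix ratios. A quick sketch with hypothetical counts (not the study's actual sample numbers) that reproduce those values:

```python
# Sensitivity and specificity as confusion-matrix ratios; the counts below
# are illustrative, not taken from the study.
def sensitivity(tp, fn):
    """Fraction of true positives detected: TP / (TP + FN)."""
    return tp / (tp + fn)

def specificity(tn, fp):
    """Fraction of true negatives correctly rejected: TN / (TN + FP)."""
    return tn / (tn + fp)

# hypothetical tally: 47 of 50 known-positive samples detected, no false alarms
print(f"sensitivity = {sensitivity(tp=47, fn=3):.0%}")   # → 94%
print(f"specificity = {specificity(tn=20, fp=0):.0%}")   # → 100%
```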

  3. Development of an Automated and Sensitive Microfluidic Device for Capturing and Characterizing Circulating Tumor Cells (CTCs) from Clinical Blood Samples

    PubMed Central

    Gogoi, Priya; Sepehri, Saedeh; Zhou, Yi; Gorin, Michael A.; Paolillo, Carmela; Capoluongo, Ettore; Gleason, Kyle; Payne, Austin; Boniface, Brian; Cristofanilli, Massimo; Morgan, Todd M.; Fortina, Paolo; Pienta, Kenneth J.; Handique, Kalyan; Wang, Yixin

    2016-01-01

    Current analysis of circulating tumor cells (CTCs) is hindered by the sub-optimal sensitivity and specificity of devices or assays, as well as a lack of capability to characterize CTCs with clinical biomarkers. Here, we validate a novel technology to enrich and characterize CTCs from blood samples of patients with metastatic breast, prostate and colorectal cancers using a microfluidic chip that is processed by an automated staining and scanning system from sample preparation to image processing. The Celsee system allowed for the detection of CTCs with apparently high sensitivity and specificity (94% sensitivity and 100% specificity). Moreover, the system facilitated rapid capture of CTCs from blood samples and also allowed for downstream characterization of the captured cells by immunohistochemistry and DNA and mRNA fluorescence in-situ hybridization (FISH). In a subset of patients with prostate cancer, we compared the technology with an FDA-approved CTC device, CellSearch, and found a higher degree of sensitivity with the Celsee instrument. In conclusion, the integrated Celsee system represents a promising CTC technology for enumeration and molecular characterization. PMID:26808060

  4. A Simple Method for Automated Solid Phase Extraction of Water Samples for Immunological Analysis of Small Pollutants.

    PubMed

    Heub, Sarah; Tscharner, Noe; Kehl, Florian; Dittrich, Petra S; Follonier, Stéphane; Barbe, Laurent

    2016-01-01

    A new method for solid phase extraction (SPE) of environmental water samples is proposed. The developed prototype is cost-efficient and user-friendly, and enables rapid, automated and simple SPE. The pre-concentrated solution is compatible with analysis by immunoassay, with a low organic solvent content. A method is described for the extraction and pre-concentration of the natural hormone 17β-estradiol in 100 ml water samples. Reverse phase SPE is performed with octadecyl-silica sorbent and elution is done with 200 µl of methanol 50% v/v. The eluent is diluted by adding de-ionized water to lower the amount of methanol. After manually preparing the SPE column, the overall procedure is performed automatically within 1 hr. At the end of the process, the estradiol concentration is measured by using a commercial enzyme-linked immunosorbent assay (ELISA). 100-fold pre-concentration is achieved and the methanol content is only 10% v/v. Full recoveries of the molecule are achieved with 1 ng/L spiked de-ionized and synthetic sea water samples.
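The volume arithmetic in this abstract is internally consistent, and a short check makes the relationship explicit: a 100 ml sample eluted into 200 µl of 50% methanol, then diluted with water to reach exactly 100-fold pre-concentration, ends up at 1 ml final volume containing 10% v/v methanol.

```python
# Pre-concentration arithmetic: final volume follows from the sample volume
# and target factor; the methanol fraction follows from the eluent carried in.
def final_extract(sample_ml, factor, eluent_ul, eluent_methanol_frac):
    final_ul = sample_ml * 1000 / factor             # volume giving the factor
    methanol_frac = eluent_ul * eluent_methanol_frac / final_ul
    return final_ul, methanol_frac

volume_ul, methanol = final_extract(sample_ml=100, factor=100,
                                    eluent_ul=200, eluent_methanol_frac=0.5)
print(volume_ul, f"{methanol:.0%}")  # → 1000.0 10%
```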

  5. Automated on-line preconcentration of palladium on different sorbents and its determination in environmental samples.

    PubMed

    Sánchez Rojas, Fuensanta; Bosch Ojeda, Catalina; Cano Pavón, José Manuel

    2007-01-01

    The determination of noble metals in environmental samples is of increasing importance. Palladium is often employed as a catalyst in the chemical industry and is also used with platinum and rhodium in motor car catalytic converters, which might cause environmental pollution problems. Two different sorbents for palladium preconcentration in different samples were investigated: silica gel functionalized with 1,5-bis(di-2-pyridyl)methylene thiocarbohydrazide (DPTH-gel) and 1,5-bis(2-pyridyl)-3-sulphophenyl methylene thiocarbonohydrazide (PSTH) immobilised on an anion-exchange resin (Dowex 1x8-200). The sorbents were tested in a micro-column, placed in the auto-sampler arm, at a flow rate of 2.8 mL min(-1). Elution was performed with 4 M HCl and 4 M HNO3, respectively. Satisfactory results were obtained for both sorbents.

  6. Improving semi-automated segmentation by integrating learning with active sampling

    NASA Astrophysics Data System (ADS)

    Huo, Jing; Okada, Kazunori; Brown, Matthew

    2012-02-01

    Interactive segmentation algorithms such as GrowCut usually require quite a few user interactions to perform well and have poor repeatability. In this study, we developed a novel technique to boost the performance of the interactive segmentation method GrowCut involving: 1) a novel "focused sampling" approach for supervised learning, as opposed to conventional random sampling; and 2) boosting GrowCut using the machine-learned results. We applied the proposed technique to glioblastoma multiforme (GBM) brain tumor segmentation and evaluated it on a dataset of ten cases from a multicenter pharmaceutical drug trial. The results showed that the proposed system has the potential to reduce user interaction while maintaining similar segmentation accuracy.

  7. Automated Geospatial Watershed Assessment Tool (AGWA): Applications for Assessing the Impact of Urban Growth and the use of Low Impact Development Practices.

    EPA Science Inventory

    New tools and functionality have been incorporated into the Automated Geospatial Watershed Assessment Tool (AGWA) to assess the impact of urban growth and evaluate the effects of low impact development (LID) practices. AGWA (see: www.tucson.ars.ag.gov/agwa or http://www.epa.gov...

  8. High-throughput pharmacokinetics screen of VLA-4 antagonists by LC/MS/MS coupled with automated solid-phase extraction sample preparation.

    PubMed

    Tong, Xinchun S; Wang, Junying; Zheng, Song; Pivnichny, James V

    2004-06-29

    Automation of plasma sample preparation for pharmacokinetic studies on VLA-4 antagonists has been achieved by using 96-well format solid-phase extraction operated by a Beckman Coulter Biomek 2000 liquid handling system. The Biomek 2000 robot is used to perform fully automated plasma sample preparation tasks that include serial dilution of standard solutions, pipetting plasma samples, addition of standard and internal standard solutions, and performing solid-phase extraction (SPE) on Waters OASIS 96-well plates. This automated sample preparation process takes less than 2 h for a typical pharmacokinetic study, including 51 samples, 24 standards, 9 quality controls, and 3-6 dose checks, with minimal manual intervention. Extensive validation has been made to ensure the accuracy and reliability of this method. A two-stage vacuum pressure controller has been incorporated in the program to improve SPE efficiency. This automated SPE sample preparation approach, combined with the high sensitivity and selectivity of liquid chromatography coupled with tandem mass spectrometry (LC/MS/MS), has been successfully applied on both individual and cassette dosing for pharmacokinetic screening of a large number of VLA-4 antagonists with a limit of quantitation in the range of 1-5 ng/ml. Consequently, a significant throughput increase has been achieved along with the elimination of tedious labor and its attendant tendency to produce errors.
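The serial dilution of standards that the liquid handler automates is a fixed-factor geometric series. A sketch of the concentration arithmetic, with a hypothetical stock concentration and dilution factor (the paper only states the 1-5 ng/ml quantitation limit, not its standard-curve design):

```python
# Fixed-factor serial dilution: each transfer dilutes the previous standard
# by the same factor. Stock concentration and factor here are hypothetical.
def serial_dilution(stock_ng_ml, factor, n_points):
    """Concentrations of an n-point standard curve from a single stock."""
    return [stock_ng_ml / factor ** i for i in range(n_points)]

standards = serial_dilution(stock_ng_ml=500.0, factor=5, n_points=4)
print(standards)  # → [500.0, 100.0, 20.0, 4.0]
```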

  9. Steady-State Vacuum Ultraviolet Exposure Facility With Automated Lamp Calibration and Sample Positioning Fabricated

    NASA Technical Reports Server (NTRS)

    Sechkar, Edward A.; Steuber, Thomas J.; Banks, Bruce A.; Dever, Joyce A.

    2000-01-01

    The Next Generation Space Telescope (NGST) will be placed in an orbit that will subject it to constant solar radiation during its planned 10-year mission. A sunshield will be necessary to passively cool the telescope, protecting it from the Sun's energy and assuring proper operating temperatures for the telescope's instruments. This sunshield will be composed of metalized polymer multilayer insulation with an outer polymer membrane (12 to 25 µm in thickness) that will be metalized on the back to assure maximum reflectance of sunlight. The sunshield must maintain mechanical integrity and optical properties for the full 10 years. This durability requirement is most challenging for the outermost, constantly solar-facing polymer membrane of the sunshield. One of the potential threats to the membrane material's durability is vacuum ultraviolet (VUV) radiation at wavelengths below 200 nm. Such radiation can be absorbed in the bulk of these thin polymer membrane materials and degrade the polymer's optical and mechanical properties. So that a suitable membrane material can be selected that demonstrates durability to solar VUV radiation, ground-based testing of candidate materials must be conducted to simulate the total 10-year VUV exposure expected during the Next Generation Space Telescope mission. The Steady State Vacuum Ultraviolet exposure facility was designed and fabricated at the NASA Glenn Research Center at Lewis Field to provide unattended 24-hr exposure of candidate materials to VUV radiation at 3 to 5 times the Sun's intensity in the wavelength range of 115 to 200 nm. The facility's chamber, which maintains a pressure of approximately 5 × 10^-6 torr, is divided into three individual exposure cells, each with a separate VUV source and sample-positioning mechanism. The three test cells are separated by a water-cooled copper shield plate assembly to minimize thermal effects from adjacent test cells.
Part of the interior sample positioning mechanism of one

  10. Quantitative Assessment of Mouse Mammary Gland Morphology Using Automated Digital Image Processing and TEB Detection.

    PubMed

    Blacher, Silvia; Gérard, Céline; Gallez, Anne; Foidart, Jean-Michel; Noël, Agnès; Péqueux, Christel

    2016-04-01

    The assessment of rodent mammary gland morphology is widely used to study the molecular mechanisms driving breast development and to analyze the impact of various endocrine disruptors with putative pathological implications. In this work, we propose a methodology relying on fully automated digital image analysis, including image processing and quantification of the whole ductal tree as well as of the terminal end buds. It allows both growth parameters and fine morphological glandular structures to be measured accurately and objectively. Mammary gland elongation was characterized by two parameters: the length and the epithelial area of the ductal tree. Ductal tree fine structure was characterized by: 1) branch end-point density, 2) branching density, and 3) branch length distribution. The proposed methodology was compared with quantification methods classically used in the literature. This procedure can be implemented in a variety of software packages and thus widely used by scientists studying rodent mammary gland morphology.
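End-point and branching counts of the kind listed above can be read off a binary skeleton of the ductal tree by counting each skeleton pixel's neighbours. The sketch below uses simple 4-connectivity on a made-up plus-shaped skeleton; it illustrates the counting idea only and is not the authors' image-processing pipeline, which works on full gland images.

```python
import numpy as np

# Toy version of the fine-structure metrics: on a binary skeleton image, a
# pixel with exactly one 4-connected neighbour is a branch end-point and a
# pixel with three or more is a branching point.
def skeleton_metrics(skel):
    s = np.asarray(skel, dtype=int)
    p = np.pad(s, 1)  # zero border so edge pixels have well-defined neighbours
    # 4-connected neighbour count (up, down, left, right) for every pixel
    neigh = p[:-2, 1:-1] + p[2:, 1:-1] + p[1:-1, :-2] + p[1:-1, 2:]
    endpoints = int(np.sum((s == 1) & (neigh == 1)))
    branchpoints = int(np.sum((s == 1) & (neigh >= 3)))
    return endpoints, branchpoints

plus = [[0, 0, 1, 0, 0],
        [0, 0, 1, 0, 0],
        [1, 1, 1, 1, 1],
        [0, 0, 1, 0, 0],
        [0, 0, 1, 0, 0]]
print(skeleton_metrics(plus))  # → (4, 1): four branch tips, one branching point
```

Dividing such counts by the area covered by the ductal tree yields the end-point and branching densities the abstract refers to.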

  11. Automated cell viability assessment using a microfluidics based portable imaging flow analyzer

    PubMed Central

    Jagannadh, Veerendra Kalyan; Adhikari, Jayesh Vasudeva; Gorthi, Sai Siva

    2015-01-01

    In this work, we report a system-level integration of portable microscopy and microfluidics for the realization of optofluidic imaging flow analyzer with a throughput of 450 cells/s. With the use of a cellphone augmented with off-the-shelf optical components and custom designed microfluidics, we demonstrate a portable optofluidic imaging flow analyzer. A multiple microfluidic channel geometry was employed to demonstrate the enhancement of throughput in the context of low frame-rate imaging systems. Using the cell-phone based digital imaging flow analyzer, we have imaged yeast cells present in a suspension. By digitally processing the recorded videos of the flow stream on the cellphone, we demonstrated an automated cell viability assessment of the yeast cell population. In addition, we also demonstrate the suitability of the system for blood cell counting. PMID:26015835
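The role of the multiple-channel geometry is simple arithmetic: throughput is the number of channels times the frame rate times the cells captured per channel per frame, so parallel channels compensate for a low frame rate. The figures below are illustrative assumptions that happen to reproduce the reported 450 cells/s; the paper does not state this particular breakdown.

```python
# Throughput of a multi-channel imaging flow analyzer; the channel count,
# frame rate and per-frame cell count are hypothetical.
def throughput(channels, fps, cells_per_channel_frame):
    """Cells analyzed per second across all parallel channels."""
    return channels * fps * cells_per_channel_frame

print(throughput(channels=5, fps=30, cells_per_channel_frame=3))  # prints 450
```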

  12. Consistency of breast density categories in serial screening mammograms: A comparison between automated and human assessment.

    PubMed

    Holland, Katharina; van Zelst, Jan; den Heeten, Gerard J; Imhof-Tas, Mechli; Mann, Ritse M; van Gils, Carla H; Karssemeijer, Nico

    2016-10-01

    Reliable breast density measurement is needed to personalize screening by using density as a risk factor and offering supplemental screening to women with dense breasts. We investigated the categorization of pairs of subsequent screening mammograms into density classes by human readers and by an automated system. With software (VDG) and by four readers, including three specialized breast radiologists, 1000 mammograms belonging to 500 pairs of subsequent screening exams were categorized into either two or four density classes. We calculated percent agreement and the percentage of women who changed from dense to non-dense and vice versa. Inter-exam agreement (IEA) was calculated with kappa statistics. Results were computed for each reader individually and for the case in which each mammogram was classified by one of the four readers by random assignment (group reading). Higher percent agreement was found with VDG (90.4%, CI 87.9-92.9%) than with the readers (86.2-89.2%), while the less plausible changes from non-dense to dense occurred less often with VDG (2.8%, CI 1.4-4.2%) than with group reading (4.2%, CI 2.4-6.0%). We found an IEA of 0.68-0.77 for the readers using two classes and an IEA of 0.76-0.82 using four classes. IEA is significantly higher with VDG than with group reading. The categorization of serial mammograms into density classes is more consistent with automated software than with a mixed group of human readers. When using breast density to personalize screening protocols, assessment with software may be preferred over assessment by radiologists.
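Inter-exam agreement via Cohen's kappa compares the observed agreement between the two screening rounds with the agreement expected by chance from the class frequencies. A sketch for the two-class (dense/non-dense) case follows; the transition counts are illustrative, chosen only so the result lands inside the reported 0.68-0.77 reader range.

```python
# Cohen's kappa for a two-class transition table between screening rounds.
# matrix[i][j]: exams rated class i at screening 1 and class j at screening 2.
def cohens_kappa(matrix):
    total = sum(sum(row) for row in matrix)
    observed = sum(matrix[i][i] for i in range(len(matrix))) / total
    expected = sum(
        sum(matrix[i]) * sum(row[i] for row in matrix)
        for i in range(len(matrix))
    ) / total ** 2
    return (observed - expected) / (1 - expected)

pairs = [[380, 20],   # non-dense at both exams / non-dense then dense
         [ 28, 72]]   # dense then non-dense     / dense at both exams
print(round(cohens_kappa(pairs), 2))  # → 0.69
```

Note how 90% raw agreement shrinks to a kappa near 0.7 once chance agreement from the dominant non-dense class is discounted, which is why the study reports kappa rather than percent agreement alone.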

  13. Automated signal quality assessment of mobile phone-recorded heart sound signals.

    PubMed

    Springer, David B; Brennan, Thomas; Ntusi, Ntobeko; Abdelrahman, Hassan Y; Zühlke, Liesl J; Mayosi, Bongani M; Tarassenko, Lionel; Clifford, Gari D

    Mobile phones, due to their audio processing capabilities, have the potential to facilitate the diagnosis of heart disease through automated auscultation. However, such a platform is likely to be used by non-experts, and hence, it is essential that such a device is able to automatically differentiate poor quality from diagnostically useful recordings since non-experts are more likely to make poor-quality recordings. This paper investigates the automated signal quality assessment of heart sound recordings performed using both mobile phone-based and commercial medical-grade electronic stethoscopes. The recordings, each 60 s long, were taken from 151 random adult individuals with varying diagnoses referred to a cardiac clinic and were professionally annotated by five experts. A mean voting procedure was used to compute a final quality label for each recording. Nine signal quality indices were defined and calculated for each recording. A logistic regression model for classifying binary quality was then trained and tested. The inter-rater agreement level for the stethoscope and mobile phone recordings was measured using Conger's kappa for multiclass sets and found to be 0.24 and 0.54, respectively. One-third of all the mobile phone-recorded phonocardiogram (PCG) signals were found to be of sufficient quality for analysis. The classifier was able to distinguish good- and poor-quality mobile phone recordings with 82.2% accuracy, and those made with the electronic stethoscope with an accuracy of 86.5%. We conclude that our classification approach provides a mechanism for substantially improving auscultation recordings by non-experts. This work is the first systematic evaluation of a PCG signal quality classification algorithm (using a separate test dataset) and assessment of the quality of PCG recordings captured by non-experts, using both a medical-grade digital stethoscope and a mobile phone.
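The classifier described is a logistic regression over signal quality indices (SQIs). A minimal self-contained sketch of that model class follows, trained by plain gradient descent on two synthetic SQI-like features; the feature definitions, data, and training details are assumptions, not the authors' nine indices or fitting procedure.

```python
import numpy as np

# Minimal logistic-regression quality classifier on synthetic SQI features.
rng = np.random.default_rng(0)
n = 200
good = rng.normal([0.8, 0.7], 0.1, size=(n, 2))   # e.g. high "spectral purity"
poor = rng.normal([0.3, 0.2], 0.1, size=(n, 2))   # noisy, low-quality records
X = np.vstack([good, poor])
X = np.hstack([np.ones((2 * n, 1)), X])           # prepend a bias column
y = np.concatenate([np.ones(n), np.zeros(n)])     # 1 = good, 0 = poor

w = np.zeros(3)
for _ in range(500):                              # plain gradient descent
    p = 1.0 / (1.0 + np.exp(-X @ w))              # sigmoid of the linear score
    w -= 0.1 * X.T @ (p - y) / len(y)             # gradient of the log loss

accuracy = np.mean(((1.0 / (1.0 + np.exp(-X @ w))) > 0.5) == (y == 1))
print(f"training accuracy: {accuracy:.1%}")
```

The real system does the same thing with nine SQIs and a labelled corpus of expert-annotated recordings, reporting held-out accuracy rather than training accuracy.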

  14. Improved automation of dissolved organic carbon sampling for organic-rich surface waters.

    PubMed

    Grayson, Richard P; Holden, Joseph

    2016-02-01

    In-situ UV-Vis spectrophotometers offer the potential for improved estimates of dissolved organic carbon (DOC) fluxes for organic-rich systems such as peatlands because they are able to sample and log DOC proxies automatically through time at low cost. In turn, this could enable improved total carbon budget estimates for peatlands. The ability of such instruments to accurately measure DOC depends on a number of factors, not least of which is how absorbance measurements relate to DOC and the environmental conditions. Here we test the ability of a S::can Spectro::lyser™ for measuring DOC in peatland streams with routinely high DOC concentrations. Through analysis of the spectral response data collected by the instrument we have been able to accurately measure DOC up to 66 mg L(-1), which is more than double the original upper calibration limit for this particular instrument. A linear regression modelling approach resulted in an accuracy >95%. The greatest accuracy was achieved when absorbance values for several different wavelengths were used at the same time in the model. However, an accuracy >90% was achieved using absorbance values for a single wavelength to predict DOC concentration. Our calculations indicated that, for organic-rich systems, in-situ measurement with a scanning spectrophotometer can improve fluvial DOC flux estimates by 6 to 8% compared with traditional sampling methods. Thus, our techniques pave the way for improved long-term carbon budget calculations from organic-rich systems such as peatlands.
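The calibration approach described, regressing lab-measured DOC on absorbance at one or more wavelengths, reduces to ordinary least squares. The sketch below uses synthetic absorbance/DOC data spanning the paper's 0-66 mg/L range; the response coefficients and noise level are assumptions, not the authors' calibration.

```python
import numpy as np

# Multi-wavelength DOC calibration by least squares on synthetic data.
rng = np.random.default_rng(1)
doc = rng.uniform(5, 66, size=40)                 # mg/L, lab reference values
true_coef = np.array([0.9, 0.4])                  # assumed per-wavelength response
A = np.outer(doc, true_coef) + rng.normal(0, 0.02, (40, 2))  # absorbances

X = np.hstack([np.ones((40, 1)), A])              # intercept + two wavelengths
beta, *_ = np.linalg.lstsq(X, doc, rcond=None)    # fit DOC ~ absorbances

predicted = X @ beta
accuracy = 1 - np.mean(np.abs(predicted - doc) / doc)
print(f"mean relative accuracy: {accuracy:.1%}")  # near 100% on clean data
```

On real stream data the single-wavelength model in the paper already reaches >90% accuracy, and adding wavelengths, as here, lifts it above 95%.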

  15. Versatile sample environments and automation for biological solution X-ray scattering experiments at the P12 beamline (PETRA III, DESY)

    PubMed Central

    Blanchet, Clement E.; Spilotros, Alessandro; Schwemmer, Frank; Graewert, Melissa A.; Kikhney, Alexey; Jeffries, Cy M.; Franke, Daniel; Mark, Daniel; Zengerle, Roland; Cipriani, Florent; Fiedler, Stefan; Roessle, Manfred; Svergun, Dmitri I.

    2015-01-01

    A high-brilliance synchrotron P12 beamline of the EMBL located at the PETRA III storage ring (DESY, Hamburg) is dedicated to biological small-angle X-ray scattering (SAXS) and has been designed and optimized for scattering experiments on macromolecular solutions. Scatterless slits reduce the parasitic scattering, a custom-designed miniature active beamstop ensures accurate data normalization and the photon-counting PILATUS 2M detector enables the background-free detection of weak scattering signals. The high flux and small beam size allow for rapid experiments with exposure time down to 30–50 ms covering the resolution range from about 300 to 0.5 nm. P12 possesses a versatile and flexible sample environment system that caters for the diverse experimental needs required to study macromolecular solutions. These include an in-vacuum capillary mode for standard batch sample analyses with robotic sample delivery and for continuous-flow in-line sample purification and characterization, as well as an in-air capillary time-resolved stopped-flow setup. A novel microfluidic centrifugal mixing device (SAXS disc) is developed for a high-throughput screening mode using sub-microlitre sample volumes. Automation is a key feature of P12; it is controlled by a beamline meta server, which coordinates and schedules experiments from either standard or nonstandard operational setups. The integrated SASFLOW pipeline automatically checks for consistency, and processes and analyses the data, providing near real-time assessments of overall parameters and the generation of low-resolution models within minutes of data collection. These advances, combined with a remote access option, allow for rapid high-throughput analysis, as well as time-resolved and screening experiments for novice and expert biological SAXS users. PMID:25844078

  16. Evaluation of the appropriate time period between sampling and analyzing for automated urinalysis

    PubMed Central

    Dolscheid-Pommerich, Ramona C.; Klarmann-Schulz, Ute; Conrad, Rupert; Stoffel-Wagner, Birgit; Zur, Berndt

    2016-01-01

Introduction Preanalytical specifications for urinalysis must be strictly adhered to in order to avoid false interpretations. The aim of the present study is to examine whether the preanalytical factor ‘time point of analysis’ significantly influences the stability of urine samples for urine particle and dipstick analysis. Materials and methods In 321 pathological spontaneous urine samples, urine dipstick (Urisys™2400, Combur-10-Test™strips, Roche Diagnostics, Mannheim, Germany) and particle analysis (UF-1000 i™, Sysmex, Norderstedt, Germany) were performed within 90 min, 120 min and 240 min after urine collection. Results For urine particle analysis, a significant increase in conductivity (120 vs. 90 min: P < 0.001, 240 vs. 90 min: P < 0.001) and a significant decrease in WBC (120 vs. 90 min P < 0.001, 240 vs. 90 min P < 0.001), RBC (120 vs. 90 min P < 0.001, 240 vs. 90 min P < 0.001), casts (120 vs. 90 min P < 0.001, 240 vs. 90 min P < 0.001) and epithelial cells (120 vs. 90 min P = 0.610, 240 vs. 90 min P = 0.041) were found. There were no significant changes for bacteria. Regarding urine dipstick analysis, misclassification rates between measurements were significant for pH (120 vs. 90 min P < 0.001, 240 vs. 90 min P < 0.001), leukocytes (120 vs. 90 min P < 0.001, 240 vs. 90 min P < 0.001), nitrite (120 vs. 90 min P < 0.001, 240 vs. 90 min P < 0.001), protein (120 vs. 90 min P < 0.001, 240 vs. 90 min P < 0.001), ketone (120 vs. 90 min P < 0.001, 240 vs. 90 min P < 0.001), blood (120 vs. 90 min P < 0.001, 240 vs. 90 min P < 0.001), specific gravity (120 vs. 90 min P < 0.001, 240 vs. 90 min P < 0.001) and urobilinogen (120 vs. 90 min, P = 0.031). Misclassification rates were not significant for glucose and bilirubin. Conclusion Most parameters critically depend on the time window between sampling and analysis. Our study stresses the importance of adherence to early time points in urinalysis (within 90 min). PMID:26981022

  17. Evaluation of automated sample preparation, retention time locked gas chromatography-mass spectrometry and data analysis methods for the metabolomic study of Arabidopsis species.

    PubMed

    Gu, Qun; David, Frank; Lynen, Frédéric; Rumpel, Klaus; Dugardeyn, Jasper; Van Der Straeten, Dominique; Xu, Guowang; Sandra, Pat

    2011-05-27

In this paper, automated sample preparation, retention time locked gas chromatography-mass spectrometry (GC-MS) and data analysis methods for metabolomic studies were evaluated. A miniaturized and automated derivatisation method using sequential oximation and silylation was applied to a polar extract of 4 types (2 types × 2 ages) of Arabidopsis thaliana, a popular model organism often used in plant sciences and genetics. Automation of the derivatisation process offers excellent repeatability, and the time between sample preparation and analysis was short and constant, reducing artifact formation. Retention time locked (RTL) gas chromatography-mass spectrometry was used, resulting in reproducible retention times and GC-MS profiles. Two approaches were used for data analysis: XCMS followed by principal component analysis (approach 1), and AMDIS deconvolution combined with a commercially available program (Mass Profiler Professional) followed by principal component analysis (approach 2). Several features that were up- or down-regulated in the different types were detected.
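Approach 1 (a peak table reduced by principal component analysis) can be sketched as below. The synthetic matrix stands in for XCMS output, and the four-group structure only loosely mirrors the 2 types × 2 ages design.

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(2)
# Four sample groups, 6 replicates each, 50 metabolite features (synthetic).
groups = [rng.normal(shift, 0.3, size=(6, 50)) for shift in (0.0, 1.0, 2.0, 3.0)]
X = np.vstack(groups)                       # 24 samples x 50 features

pca = PCA(n_components=2).fit(X)
scores = pca.transform(X)                   # sample coordinates in PC space
print("explained variance ratio:", pca.explained_variance_ratio_)
```

In a real workflow the score plot would be inspected for clustering by genotype and age, and the loadings used to trace back which features drive the separation.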

  18. ICSH recommendations for assessing automated high-performance liquid chromatography and capillary electrophoresis equipment for the quantitation of HbA2.

    PubMed

    Stephens, A D; Colah, R; Fucharoen, S; Hoyer, J; Keren, D; McFarlane, A; Perrett, D; Wild, B J

    2015-10-01

Automated high-performance liquid chromatography and capillary electrophoresis are used to quantitate the proportion of hemoglobin A2 (HbA2) in blood samples in order to enable screening and diagnosis of carriers of β-thalassemia. Since there is only a very small difference in HbA2 levels between carriers and non-carriers, such analyses need to be both precise and accurate. This paper examines the different parameters of such equipment and discusses how they should be assessed.

  19. Dried Blood Spot Proteomics: Surface Extraction of Endogenous Proteins Coupled with Automated Sample Preparation and Mass Spectrometry Analysis

    NASA Astrophysics Data System (ADS)

    Martin, Nicholas J.; Bunch, Josephine; Cooper, Helen J.

    2013-08-01

Dried blood spots offer many advantages as a sample format including ease and safety of transport and handling. To date, the majority of mass spectrometry analyses of dried blood spots have focused on small molecules or hemoglobin. However, dried blood spots are a potentially rich source of protein biomarkers, an area that has been overlooked. To address this issue, we have applied an untargeted bottom-up proteomics approach to the analysis of dried blood spots. We present an automated and integrated method for extraction of endogenous proteins from the surface of dried blood spots and sample preparation via trypsin digestion by use of the Advion Biosciences Triversa Nanomate robotic platform. Liquid chromatography tandem mass spectrometry of the resulting digests enabled identification of 120 proteins from a single dried blood spot. The proteins identified span a concentration range of four orders of magnitude. The method is evaluated and the results discussed in terms of the proteins identified and their potential use as biomarkers in screening programs.

  20. Examples of Optical Assessment of Surface Cleanliness of Genesis Samples

    NASA Technical Reports Server (NTRS)

    Rodriquez, Melissa C.; Allton, J. H.; Burkett, P. J.; Gonzalez, C. P.

    2013-01-01

Optical microscope assessment of Genesis solar wind collector surfaces is a coordinated part of the effort to obtain an assessed clean subset of flown wafer material for the scientific community. Microscopic survey is typically done at 50X magnification at selected approximately 1 square millimeter areas on the fragment surface. This survey is performed each time a principal investigator (PI) returns a sample to JSC for documentation as part of the established cleaning plan. The cleaning plan encompasses sample handling and analysis by Genesis science team members, and optical survey is done at each step in the process. Sample surface cleaning is performed at JSC (ultrapure water [1] and UV ozone cleaning [2]) and experimentally by other science team members (acid etch [3], acetate replica peels [4], CO2 snow [5], etc.). The documentation of each cleaning method can potentially be assessed with optical observation utilizing Image Pro Plus software [6]. Differences in particle counts can be studied and discussed within analysis groups. Approximately 25 samples have been identified as part of the cleaning matrix effort to date.

  1. Enabling Wide-Scale Computer Science Education through Improved Automated Assessment Tools

    NASA Astrophysics Data System (ADS)

    Boe, Bryce A.

    There is a proliferating demand for newly trained computer scientists as the number of computer science related jobs continues to increase. University programs will only be able to train enough new computer scientists to meet this demand when two things happen: when there are more primary and secondary school students interested in computer science, and when university departments have the resources to handle the resulting increase in enrollment. To meet these goals, significant effort is being made to both incorporate computational thinking into existing primary school education, and to support larger university computer science class sizes. We contribute to this effort through the creation and use of improved automated assessment tools. To enable wide-scale computer science education we do two things. First, we create a framework called Hairball to support the static analysis of Scratch programs targeted for fourth, fifth, and sixth grade students. Scratch is a popular building-block language utilized to pique interest in and teach the basics of computer science. We observe that Hairball allows for rapid curriculum alterations and thus contributes to wide-scale deployment of computer science curriculum. Second, we create a real-time feedback and assessment system utilized in university computer science classes to provide better feedback to students while reducing assessment time. Insights from our analysis of student submission data show that modifications to the system configuration support the way students learn and progress through course material, making it possible for instructors to tailor assignments to optimize learning in growing computer science classes.
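A miniature of what a Hairball-style static check over Scratch programs might look like is sketched below. The project structure and block names are invented simplifications (real Scratch projects are much richer JSON structures, and Hairball's actual plugin API is not reproduced here).

```python
# Flag sprites whose scripts never begin with an event ("hat") block,
# i.e. code that can never run. The format below is a toy assumption.
project = {
    "sprites": [
        {"name": "Cat", "scripts": [["whenGreenFlag", "moveSteps"]]},
        {"name": "Dog", "scripts": [["moveSteps", "turnRight"]]},  # unreachable
    ]
}

HAT_BLOCKS = {"whenGreenFlag", "whenKeyPressed", "whenClicked"}

def dead_scripts(project):
    """Return names of sprites containing scripts that lack a hat block."""
    flagged = []
    for sprite in project["sprites"]:
        for script in sprite["scripts"]:
            if script and script[0] not in HAT_BLOCKS:
                flagged.append(sprite["name"])
    return flagged

print("sprites with unreachable scripts:", dead_scripts(project))
```

Checks of this kind are cheap to rewrite when the curriculum changes, which is the property the abstract credits for rapid curriculum alterations.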

  2. An Automated Version of the BAT Syntactic Comprehension Task for Assessing Auditory L2 Proficiency in Healthy Adults

    ERIC Educational Resources Information Center

    Achim, Andre; Marquis, Alexandra

    2011-01-01

    Studies of bilingualism sometimes require healthy subjects to be assessed for proficiency at auditory sentence processing in their second language (L2). The Syntactic Comprehension task of the Bilingual Aphasia Test could satisfy this need. For ease and uniformity of application, we automated its English (Paradis, M., Libben, G., and Hummel, K.…

  3. ALL-ON-ALL CONJUNCTION ASSESSMENT: Methods for Automating and Minimizing the Computation Time

    NASA Astrophysics Data System (ADS)

    Hall, R.; Berry, M.; Coppola, V.; Woodburn, J.

    assessment results in terms of computational performance, automation, and capability, making full catalog propagation and conjunction assessment available within very short times. The authors will discuss and present the detailed approach and results for performing conjunction assessment including All-on-All assessments.

4. Aerothermodynamics Feasibility Assessment of a Mars Atmospheric Sample Return Mission

    NASA Astrophysics Data System (ADS)

    Ferracina, L.; Larranaga, J.; Falkner, P.

    2011-02-01

ESA's optional Mars Robotic Exploration Preparation (MREP) programme is based on a long-term collaboration with NASA, taking Mars exploration as the global objective and a Mars Sample Return (MSR) mission as the long-term goal to be achieved by the mid-2020s. Considering today's uncertainties, different missions are envisaged and prepared by ESA as possible alternatives to MSR in the 2020-2026 timeframe, in case the required technology readiness is not reached by 2015 or landed-mass capabilities are exceeded for any of the MSR mission elements. One of the missions considered by ESA within this framework is the Mars Atmospheric Sample Return Mission. This mission has recently been assessed by ESA using its Concurrent Design Facility (CDF), with the aim of entering the Martian atmosphere with a probe at low altitudes (≈50 km), collecting a sample of airborne atmosphere (gas and dust) and returning the sample to Earth. This paper reports the preliminary aerothermodynamic assessment of the Martian entry probe design conducted within the CDF study. Special attention has been paid to the selection of aerodynamically efficient vehicle concepts compared with blunt bodies, and to the effect of the high-temperature shock on the cavity placed at the stagnation point and used in the atmospheric sampling system.

  5. Adjustable virtual pore-size filter for automated sample preparation using acoustic radiation force

    SciTech Connect

    Jung, B; Fisher, K; Ness, K; Rose, K; Mariella, R

    2008-05-22

We present a rapid and robust size-based separation method for high-throughput microfluidic devices using acoustic radiation force. We developed a finite element modeling tool to predict the two-dimensional acoustic radiation force field perpendicular to the flow direction in microfluidic devices. Here we compare the results from this model with experimental parametric studies including variations of the PZT driving frequencies and voltages as well as various particle sizes and compressibilities. These experimental parametric studies also provide insight into the development of an adjustable 'virtual' pore-size filter as well as optimal operating conditions for various microparticle sizes. We demonstrated the separation of Saccharomyces cerevisiae and MS2 bacteriophage using acoustic focusing. The acoustic radiation force did not affect the MS2 viruses, and their concentration profile remained unchanged. With an optimized design of our microfluidic flow system we were able to achieve yields of > 90% for the MS2, with > 80% of the S. cerevisiae being removed in this continuous-flow sample preparation device.
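The size selectivity reported (yeast focused, MS2 virions unmoved) follows from the primary acoustic radiation force scaling with particle volume. A back-of-the-envelope sketch, using rough literature-style material values and drive parameters rather than the paper's own:

```python
import math

def contrast_factor(rho_p, rho_f, beta_p, beta_f):
    """Gor'kov acoustic contrast factor for a small compressible sphere."""
    return (5 * rho_p - 2 * rho_f) / (2 * rho_p + rho_f) - beta_p / beta_f

def peak_radiation_force(radius, phi, p0, freq, rho_f, c_f):
    """Peak primary radiation force on a sphere in a 1-D standing wave."""
    wavelength = c_f / freq
    beta_f = 1.0 / (rho_f * c_f ** 2)
    volume = (4.0 / 3.0) * math.pi * radius ** 3
    return math.pi * p0 ** 2 * volume * beta_f / (2 * wavelength) * phi

rho_f, c_f = 1000.0, 1480.0                 # water density, sound speed
beta_f = 1.0 / (rho_f * c_f ** 2)
phi = contrast_factor(1100.0, rho_f, 3.7e-10, beta_f)   # yeast-like cell

f_yeast = peak_radiation_force(2.5e-6, phi, 0.5e6, 2.0e6, rho_f, c_f)   # ~5 um cell
f_virus = peak_radiation_force(13.5e-9, phi, 0.5e6, 2.0e6, rho_f, c_f)  # ~27 nm MS2
print(f"force ratio (yeast/virus): {f_yeast / f_virus:.2e}")
```

The radius-cubed dependence makes the force on a micron-scale cell millions of times larger than on a virion, so the virus concentration profile is expected to be essentially unchanged, consistent with the abstract.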

  6. Bias Assessment of General Chemistry Analytes using Commutable Samples

    PubMed Central

    Koerbin, Gus; Tate, Jillian R; Ryan, Julie; Jones, Graham RD; Sikaris, Ken A; Kanowski, David; Reed, Maxine; Gill, Janice; Koumantakis, George; Yen, Tina; St John, Andrew; Hickman, Peter E; Simpson, Aaron; Graham, Peter

    2014-01-01

Harmonisation of reference intervals for routine general chemistry analytes has been a goal for many years. Analytical bias may prevent this harmonisation. To determine whether analytical bias is present when comparing methods, commutable samples (samples that have the same properties as the clinical samples routinely analysed) should be used as reference samples to eliminate the possibility of matrix effects. The use of commutable samples has improved the identification of unacceptable analytical performance in the Netherlands and Spain. The International Federation of Clinical Chemistry and Laboratory Medicine (IFCC) has undertaken a pilot study using commutable samples in an attempt not only to determine country-specific reference intervals but to make them comparable between countries. Australia and New Zealand, through the Australasian Association of Clinical Biochemists (AACB), have also undertaken an assessment of analytical bias using commutable samples and determined that, of the 27 general chemistry analytes studied, 19 showed between-method biases sufficiently small as not to prevent harmonisation of reference intervals. Evidence-based approaches, including the determination of analytical bias using commutable material, are necessary when seeking to harmonise reference intervals. PMID:25678726
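A bias check of the kind described reduces to comparing paired results on the same commutable samples and testing the mean bias against an allowable limit. The analyte values, bias, and the allowable limit below are hypothetical illustrations, not figures from the AACB study.

```python
import numpy as np

rng = np.random.default_rng(3)
reference = rng.uniform(3.5, 5.0, 40)                   # comparison method, mmol/L
candidate = reference * 1.02 + rng.normal(0, 0.03, 40)  # ~2% proportional bias

bias_pct = 100 * np.mean((candidate - reference) / reference)
allowable = 3.0   # hypothetical allowable-bias limit, not from the paper
verdict = "harmonisation feasible" if abs(bias_pct) <= allowable else "bias too large"
print(f"mean bias: {bias_pct:+.1f}% (allowable limit {allowable}%) -> {verdict}")
```

Because the samples are commutable, an observed bias can be attributed to the methods themselves rather than to matrix effects in the reference material.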

  7. Automated contour mapping using sparse volume sampling for 4D radiation therapy

    SciTech Connect

    Chao Ming; Schreibmann, Eduard; Li Tianfang; Wink, Nicole; Xing Lei

    2007-10-15

The purpose of this work is to develop a novel strategy to automatically map organ contours from one phase of respiration to all other phases on a four-dimensional computed tomography (4D CT). A region of interest (ROI) was manually delineated by a physician on one phase-specific image set of a 4D CT. A number of cubic control volumes of approximately 1 cm in size were automatically placed along the contours. The control volumes were then collectively mapped to the next phase using a rigid transformation. To accommodate organ deformation, a model-based adaptation of the control volume positions followed the rigid mapping procedure. This further adjustment of control volume positions was performed by minimizing an energy function which balances the tendency for the control volumes to move to their correspondences with the desire to maintain similar image features and the shape integrity of the contour. The mapped ROI surface was then constructed from the central positions of the control volumes using a triangulated surface construction technique. The proposed technique was assessed using a digital phantom and 4D CT images of three lung patients. Our digital phantom study data indicated that a spatial accuracy better than 2.5 mm is achievable using the proposed technique. The patient study showed a similar level of accuracy. In addition, the computational speed of our algorithm was significantly improved as compared with a conventional deformable registration-based contour mapping technique. The robustness and accuracy of this approach make it a valuable tool for the efficient use of the available spatio-temporal information for 4D simulation and treatment.
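The two-term energy balancing feature correspondence against shape integrity can be illustrated in one dimension. The energy form, optimizer, and data below are simplifications assumed for the sketch, not the authors' implementation.

```python
import numpy as np

def energy(positions, targets, alpha=1.0):
    """Feature-match term plus a shape term penalizing uneven spacing."""
    match = np.sum((positions - targets) ** 2)
    spacing = np.diff(positions)
    shape = np.sum((spacing - spacing.mean()) ** 2)
    return match + alpha * shape

# Control-volume positions after the rigid mapping (1-D for brevity),
# and where the matching image features actually moved to.
pos = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
targets = pos + np.array([0.0, 0.3, -0.2, 0.25, 0.0])

# Simple finite-difference gradient descent on the energy.
eps, lr = 1e-5, 0.05
for _ in range(500):
    grad = np.zeros_like(pos)
    for i in range(len(pos)):
        p = pos.copy()
        p[i] += eps
        grad[i] = (energy(p, targets) - energy(pos, targets)) / eps
    pos -= lr * grad
print("adjusted positions:", np.round(pos, 2))
```

The adjusted positions move toward the feature correspondences while the spacing penalty resists any single control volume tearing away from its neighbours, which is the trade-off the abstract's energy function encodes.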

  8. Liquid chromatography coupled with multi-channel electrochemical detection for the determination of daidzin in rat blood sampled by an automated blood sampling system.

    PubMed

    Tian, Feifei; Zhu, Yongxin; Long, Hong; Cregor, Meloney; Xie, Fuming; Kissinger, Candice B; Kissinger, Peter T

    2002-05-25

Daidzin, a soy-derived biologically active natural product, has been reported to inhibit mitochondrial aldehyde dehydrogenase and suppress ethanol intake. This paper describes a method for the determination of daidzin in rat blood. After administration of daidzin, blood samples were periodically collected from awake, freely moving animals by a Culex automated blood sampler. Daidzin was extracted from 50 microl of diluted blood (blood and saline at a ratio of 1:1) with ethyl acetate. Chromatographic separation was achieved within 12 min using a microbore C(18) (100 x 1.0 mm) 3 microm column with a mobile phase containing 20 mM sodium acetate, 0.25 mM EDTA, pH 4.3, 4% methanol and 11% acetonitrile at a flow-rate of 90 microl/min. Detection was attained using a four-channel electrochemical detector with glassy carbon electrodes at oxidation potentials of +1100, 950, 850, 750 mV vs. Ag/AgCl. The limit of detection for daidzin in rat plasma was 5 ng/ml at a signal-to-noise ratio of 3:1. The extraction recovery of daidzin from rat plasma was over 74%. Linearity was obtained for the range of 25-1000 ng/ml. The intra- and inter-assay precisions were in the ranges of 2.7-6.6 and 1.9-3.7%, respectively. This method is suitable for routine in vivo monitoring of daidzin in rat plasma.
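The calibration and detection-limit figures quoted can be reproduced in outline with a straight-line fit over the linear range and the S/N = 3 convention. The concentrations, detector responses, and noise level below are invented for illustration only.

```python
import numpy as np

rng = np.random.default_rng(5)
conc = np.array([25, 50, 100, 250, 500, 1000], dtype=float)  # ng/mL standards
peak = 0.8 * conc + rng.normal(0, 5, conc.size)              # detector response (a.u.)

slope, intercept = np.polyfit(conc, peak, 1)   # calibration line
r = np.corrcoef(conc, peak)[0, 1]              # linearity check

noise_sd = 1.3                     # assumed baseline noise (a.u.)
lod = 3 * noise_sd / slope         # concentration giving S/N = 3
print(f"r = {r:.4f}, estimated LOD ~ {lod:.1f} ng/mL")
```

Samples falling above the top calibrator would be diluted back into the validated 25-1000 ng/mL range before quantitation.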

  9. Development testing of the chemical analysis automation polychlorinated biphenyl standard analysis method during surface soils sampling at the David Witherspoon 1630 site

    SciTech Connect

    Hunt, M.A.; Klatt, L.N.; Thompson, D.H.

    1998-02-01

The Chemical Analysis Automation (CAA) project is developing standardized, software-driven, site-deployable robotic laboratory systems with the objective of lowering the per-sample analysis cost, decreasing sample turnaround time, and minimizing human exposure to hazardous and radioactive materials associated with DOE remediation projects. The first integrated system developed by the CAA project is designed to determine polychlorinated biphenyls (PCB) content in soil matrices. A demonstration and development testing of this system was conducted in conjunction with surface soil characterization activities at the David Witherspoon 1630 Site in Knoxville, Tennessee. The PCB system consists of five hardware standard laboratory modules (SLMs), one software SLM, the task sequence controller (TSC), and the human-computer interface (HCI). Four of the hardware SLMs included a four-channel Soxhlet extractor, a high-volume concentrator, a column cleanup, and a gas chromatograph. These SLMs performed the sample preparation and measurement steps within the total analysis protocol. The fifth hardware module was a robot that transports samples between the SLMs and the required consumable supplies to the SLMs. The software SLM is an automated data interpretation module that receives raw data from the gas chromatograph SLM and analyzes the data to yield the analyte information. The TSC is a software system that provides the scheduling, management of system resources, and the coordination of all SLM activities. The HCI is a graphical user interface that presents the automated laboratory to the analyst in terms of the analytical procedures and methods. Human control of the automated laboratory is accomplished via the HCI. Sample information required for processing by the automated laboratory is entered through the HCI. Information related to the sample and the system status is presented to the analyst via graphical icons.

  10. Re-Emergence of Under-Selected Stimuli, after the Extinction of Over-Selected Stimuli in an Automated Match to Samples Procedure

    ERIC Educational Resources Information Center

    Broomfield, Laura; McHugh, Louise; Reed, Phil

    2008-01-01

    Stimulus over-selectivity occurs when one of potentially many aspects of the environment comes to control behaviour. In two experiments, adults with no developmental disabilities, were trained and tested in an automated match to samples (MTS) paradigm. In Experiment 1, participants completed two conditions, in one of which the over-selected…

  11. All Hazards Risk Assessment Transition Project: Report on Capability Assessment Management System (CAMS) Automation

    DTIC Science & Technology

    2014-04-01

…used to gather the requirements needed to develop the Capability Assessment…comparison and analysis of capability gaps and requirements across the emergency management spectrum.

  12. An automated serial Grinding, Imaging and Reconstruction Instrument (GIRI) for digital modeling of samples with weak density contrasts

    NASA Astrophysics Data System (ADS)

    Maloof, A. C.; Samuels, B.; Mehra, A.; Spatzier, A.

    2013-12-01

We present the first results from the new Princeton University Grinder Lab dedicated to the digital reconstruction of hidden objects through serial grinding and imaging. The purpose of a destructive technique like serial grinding is to facilitate the discovery of embedded objects with weak density contrasts outside the sensitivity limits of X-ray CT-scanning devices (feature segmentation and object reconstruction are based on color and textural contrasts in the stack of images rather than density). The device we have developed is a retrofit imaging station designed for a precision CNC surface grinder. The instrument is capable of processing a sample 20x25x40 cm in size at 1 micron resolution in the x, y and z axes. Directly coupled to the vertical axis of the grinder is an 80 megapixel medium format camera and specialty macro lens capable of imaging a 4x5 cm surface at 5 micron resolution in full 16 bit color. The system is automated such that after each surface grind, the sample is cleaned, travels to the opposite end of the bed from the grinder wheel, is photographed, and then moved back to the grinding position. This process establishes a comprehensive archive of the specimen that is used for digital reconstruction and quantitative analysis. For example, in one night, a 7 cm thick sample can be imaged completely at 20 micron horizontal and vertical resolution without human supervision. Some of the initial results we present here include new digital reconstructions of early animal fossils, 3D sedimentary bedforms, the size and shape distribution of chondrules in chondritic meteorites, and the porosity structure of carbonate cemented reservoir rocks.

  13. Information-Theoretic Assessment of Sample Imaging Systems

    NASA Technical Reports Server (NTRS)

    Huck, Friedrich O.; Alter-Gartenberg, Rachel; Park, Stephen K.; Rahman, Zia-ur

    1999-01-01

    By rigorously extending modern communication theory to the assessment of sampled imaging systems, we develop the formulations that are required to optimize the performance of these systems within the critical constraints of image gathering, data transmission, and image display. The goal of this optimization is to produce images with the best possible visual quality for the wide range of statistical properties of the radiance field of natural scenes that one normally encounters. Extensive computational results are presented to assess the performance of sampled imaging systems in terms of information rate, theoretical minimum data rate, and fidelity. Comparisons of this assessment with perceptual and measurable performance demonstrate that (1) the information rate that a sampled imaging system conveys from the captured radiance field to the observer is closely correlated with the fidelity, sharpness and clarity with which the observed images can be restored and (2) the associated theoretical minimum data rate is closely correlated with the lowest data rate with which the acquired signal can be encoded for efficient transmission.

  14. Assessing the matrix effects of hemolyzed samples in bioanalysis.

    PubMed

    Hughes, Nicola C; Bajaj, Navgeet; Fan, Juan; Wong, Ernest Y K

    2009-09-01

    Validation of LC-MS/MS assays includes an assessment of matrix effects. Hemolysis effect, a special type of matrix effect, can also have an impact on analyte quantitation. In situations where the hemolysis effect is marginal, this can be resolved simply by dilution of hemolyzed samples with plasma prior to analysis. However, in some cases, the impact can be so dramatic that analytes are completely immeasurable. In such situations, modification to the bioanalytical method will be required, including, but not limited to, adjusting the chromatographic conditions to separate interferences present in hemolyzed samples; additional sample clean-up techniques such as protein precipitation in combination with SPE or a change in extraction technique such as from SPE to a liquid-liquid extraction method. Here, we report examples from four bioanalytical methods, where the presence of hemolyzed blood in plasma was found to have an impact on analyte quantitation and a description of the solutions adopted to resolve this are provided.

  15. Automated Cognitive Health Assessment From Smart Home-Based Behavior Data.

    PubMed

    Dawadi, Prafulla Nath; Cook, Diane Joyce; Schmitter-Edgecombe, Maureen

    2016-07-01

Smart home technologies offer potential benefits for assisting clinicians by automating health monitoring and well-being assessment. In this paper, we examine the actual benefits of smart home-based analysis by monitoring daily behavior in the home and predicting clinical scores of the residents. To accomplish this goal, we propose a clinical assessment using activity behavior (CAAB) approach to model a smart home resident's daily behavior and predict the corresponding clinical scores. CAAB uses statistical features that describe characteristics of a resident's daily activity performance to train machine learning algorithms that predict the clinical scores. We evaluate the performance of CAAB utilizing smart home sensor data collected from 18 smart homes over two years. We obtain a statistically significant correlation (r = 0.72) between CAAB-predicted and clinician-provided cognitive scores and a statistically significant correlation (r = 0.45) between CAAB-predicted and clinician-provided mobility scores. These prediction results suggest that it is feasible to predict clinical scores using smart home sensor data and learning-based data analysis.
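In outline, CAAB's prediction step is a supervised regression from activity-performance features to a clinical score, evaluated by the correlation between predicted and clinician-provided values. The model choice (ridge regression) and the synthetic data below are assumptions for the sketch, not the paper's algorithm or dataset.

```python
import numpy as np
from scipy.stats import pearsonr
from sklearn.linear_model import Ridge

rng = np.random.default_rng(4)
n, d = 120, 8                                  # observations x activity features
X = rng.normal(size=(n, d))                    # e.g. sleep, mobility statistics
true_w = rng.normal(size=d)
score = X @ true_w + rng.normal(0, 1.0, n)     # noisy "clinician-provided" score

model = Ridge(alpha=1.0).fit(X[:80], score[:80])
pred = model.predict(X[80:])
r, _ = pearsonr(pred, score[80:])
print(f"predicted vs provided correlation: r = {r:.2f}")
```

The reported r = 0.72 (cognitive) and r = 0.45 (mobility) correspond to exactly this kind of held-out predicted-versus-provided comparison.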

  16. Electromechanical probe and automated indentation maps are sensitive techniques in assessing early degenerated human articular cartilage.

    PubMed

    Sim, Sotcheadt; Chevrier, Anik; Garon, Martin; Quenneville, Eric; Lavigne, Patrick; Yaroshinsky, Alex; Hoemann, Caroline D; Buschmann, Michael D

    2016-06-09

Recent advances in the development of new drugs to halt or even reverse the progression of osteoarthritis at an early stage require new tools to detect early degeneration of articular cartilage. We investigated the ability of an electromechanical probe and an automated indentation technique to characterize entire human articular surfaces for rapid non-destructive discrimination between early degenerated and healthy articular cartilage. Human cadaveric asymptomatic articular surfaces (4 pairs of distal femurs and 4 pairs of tibial plateaus) were used. They were assessed ex vivo: macroscopically, electromechanically (maps of the electromechanical quantitative parameter, QP, reflecting streaming potentials), mechanically (maps of the instantaneous modulus, IM) and through cartilage thickness. Osteochondral cores were also harvested from healthy and degenerated regions for histological assessment, biochemical analyses and unconfined compression tests. The macroscopic visual assessment delimited three distinct regions on each articular surface: region I was macroscopically degenerated, region II was macroscopically normal but adjacent to region I and region III was the remaining normal articular surface. Thus, each extracted core was assigned to one of the three regions. A mixed effect model revealed that only the QP (p < 0.0001) and IM (p < 0.0001) were able to statistically discriminate the three regions. Effect size was higher for QP and IM than for the other assessments, indicating greater sensitivity to distinguish early degeneration of cartilage. The mapping feature of the QP and IM techniques also revealed bilateral symmetry, with a moderately similar distribution pattern between bilateral joints.

  17. Assessment of the 296-S-21 Stack Sampling Probe Location

    SciTech Connect

    Glissmeyer, John A.

    2006-09-08

Tests were performed to assess the suitability of the location of the air sampling probe on the 296-S-21 stack according to the criteria of ANSI N13.1-1999, Sampling and Monitoring Releases of Airborne Radioactive Substances from the Stacks and Ducts of Nuclear Facilities. Pacific Northwest National Laboratory conducted most tests on a 3.67:1 scale model of the stack. CH2MHill also performed some limited confirmatory tests on the actual stack. The tests assessed the capability of the air-monitoring probe to extract a sample representative of the effluent stream. The tests were conducted for the practical combinations of operating fans and addressed: (1) Angular Flow--The purpose is to determine whether the velocity vector is aligned with the sampling nozzle. The average yaw angle relative to the nozzle axis should not be more than 20 degrees. The measured values ranged from 5 to 11 degrees on the scale model and 10 to 12 degrees on the actual stack. (2) Uniform Air Velocity--The gas momentum across the stack cross section where the sample is extracted should be well mixed or uniform. The uniformity is expressed as the variability of the measurements about the mean, the coefficient of variance (COV). The lower the COV value, the more uniform the velocity. The acceptance criterion is that the COV of the air velocity must be ≤20% across the center two-thirds of the area of the stack. At the location simulating the sampling probe, the measured values ranged from 4 to 11%, which are within the criterion. To confirm the validity of the scale model results, air velocity uniformity measurements were made both on the actual stack and on the scale model at the test ports 1.5 stack diameters upstream of the sampling probe. The results ranged from 6 to 8% COV on the actual stack and 10 to 13% COV on the scale model. The average difference for the eight runs was 4.8% COV, which is within the validation criterion. The fact that the scale model results were slightly higher than the
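The uniformity criterion reduces to a simple coefficient-of-variance computation over velocity measurements in the centre two-thirds of the stack cross-section. The velocity readings below are invented sample numbers, not the test data.

```python
import numpy as np

# Traverse readings across the centre two-thirds of the cross-section (m/s).
velocities = np.array([11.8, 12.1, 12.4, 12.0, 11.6, 12.3, 12.2, 11.9])

# COV = 100 * sample standard deviation / mean, compared to the <=20% limit.
cov = 100 * velocities.std(ddof=1) / velocities.mean()
print(f"velocity COV = {cov:.1f}% -> {'PASS' if cov <= 20 else 'FAIL'} (criterion: <=20%)")
```

The same statistic applied at two stations (actual stack vs. scale model) underlies the validation comparison quoted at the end of the abstract.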

  18. Passive sampling methods for contaminated sediments: Risk assessment and management

    PubMed Central

    Greenberg, Marc S; Chapman, Peter M; Allan, Ian J; Anderson, Kim A; Apitz, Sabine E; Beegan, Chris; Bridges, Todd S; Brown, Steve S; Cargill, John G; McCulloch, Megan C; Menzie, Charles A; Shine, James P; Parkerton, Thomas F

    2014-01-01

    This paper details how activity-based passive sampling methods (PSMs), which provide information on bioavailability in terms of freely dissolved contaminant concentrations (Cfree), can be used to better inform risk-management decision making at multiple points in the process of assessing and managing contaminated sediment sites. PSMs can increase certainty in site investigation and management, because Cfree is a better predictor of bioavailability than total bulk sediment concentration (Ctotal) for 4 key endpoints included in conceptual site models (benthic organism toxicity, bioaccumulation, sediment flux, and water column exposures). The use of passive sampling devices (PSDs) presents challenges with respect to representative sampling for estimating average concentrations and other metrics relevant for exposure and risk assessment. These challenges can be addressed by designing studies that account for sources of variation associated with PSMs and considering appropriate spatial scales to meet study objectives. Possible applications of PSMs include: quantifying spatial and temporal trends in bioavailable contaminants, identifying and evaluating contaminant source contributions, calibrating site-specific models, and improving weight-of-evidence-based decision frameworks. PSM data can be used to assist in delineating sediment management zones based on likelihood of exposure effects, monitor remedy effectiveness, and evaluate risk reduction after sediment treatment, disposal, or beneficial reuse after management actions. Examples are provided illustrating why PSMs and Cfree should be incorporated into contaminated sediment investigations and study designs to better focus on and understand contaminant bioavailability, more accurately estimate exposure to sediment-associated contaminants, and better inform risk-management decisions. Research and communication needs for encouraging broader use are discussed.

  19. Microwave-Assisted Sample Treatment in a Fully Automated Flow-Based Instrument: Oxidation of Reduced Technetium Species in the Analysis of Total Technetium-99 in Caustic Aged Nuclear Waste Samples

    SciTech Connect

    Egorov, Oleg B.; O'Hara, Matthew J.; Grate, Jay W.

    2004-07-15

    An automated flow-based instrument for microwave-assisted treatment of liquid samples has been developed and characterized. The instrument utilizes a flow-through reaction vessel design that facilitates the addition of multiple reagents during sample treatment, removal of the gaseous reaction products, and enables quantitative removal of liquids from the reaction vessel for carryover-free operations. Matrix modification and speciation control chemistries that are required for the radiochemical determination of total 99Tc in caustic aged nuclear waste samples have been investigated. A rapid and quantitative oxidation procedure using peroxydisulfate in acidic solution was developed to convert reduced technetium species to pertechnetate in samples with high content of reducing organics. The effectiveness of the automated sample treatment procedures has been validated in the radiochemical analysis of total 99Tc in caustic aged nuclear waste matrixes from the Hanford site.

  20. Assessing Library Automation and Virtual Library Development in Four Academic Libraries in Oyo, Oyo State, Nigeria

    ERIC Educational Resources Information Center

    Gbadamosi, Belau Olatunde

    2011-01-01

    The paper examines the level of library automation and virtual library development in four academic libraries. A validated questionnaire was used to capture the responses from academic librarians of the libraries under study. The paper discovers that none of the four academic libraries is fully automated. The libraries make use of librarians with…

  1. Comparison of Automated Scoring Methods for a Computerized Performance Assessment of Clinical Judgment

    ERIC Educational Resources Information Center

    Harik, Polina; Baldwin, Peter; Clauser, Brian

    2013-01-01

    Growing reliance on complex constructed response items has generated considerable interest in automated scoring solutions. Many of these solutions are described in the literature; however, relatively few studies have been published that "compare" automated scoring strategies. Here, comparisons are made among five strategies for…

  2. Automated assessment of pain in rats using a voluntarily accessed static weight-bearing test

    PubMed Central

    Kim, Hung Tae; Uchimoto, Kazuhiro; Duellman, Tyler; Yang, Jay

    2015-01-01

    The weight-bearing test is one method to assess pain in rodent animal models; however, the acceptance of this convenient method is limited by the low-throughput data acquisition and the necessity of confining the rodents to a small chamber. New methods: We developed novel data-acquisition hardware and software, data-analysis software, and a conditioning protocol for an automated, high-throughput static weight-bearing assessment of pain. With this device, the rats voluntarily enter the weighing chamber, precluding the necessity to restrain the animals and thereby removing the potential stress-induced confounds as well as operator selection bias during data collection. We name this device the Voluntarily Accessed Static Incapacitance Chamber (VASIC). Results: Control rats subjected to the VASIC device provided hundreds of weight-bearing data points in a single behavioral assay. Chronic constriction injury (CCI) surgery and paw-pad injection of complete Freund's adjuvant (CFA) or carrageenan in rats generated hundreds of weight-bearing data points during a 30-minute recording session. Rats subjected to CCI, CFA, or carrageenan demonstrated the expected bias in weight distribution favoring the un-operated leg, and the analgesic effect of i.p. morphine was demonstrated. Comparison with existing methods: Brief water restriction encouraged the rats to enter the weighing chamber to access water, and an infrared detector confirmed the rat position with feet properly positioned on the footplates, triggering data collection. This allowed hands-off measurement of weight-distribution data, reducing operator selection bias. Conclusion: The VASIC device should enhance the hands-free parallel collection of unbiased weight-bearing data in a high-throughput manner, allowing further testing of this behavioral measure as an effective assessment of pain in rodents. PMID:26143745

  3. Mixed species radioiodine air sampling readout and dose assessment system

    DOEpatents

    Distenfeld, Carl H.; Klemish, Jr., Joseph R.

    1978-01-01

    This invention provides a simple, reliable, inexpensive and portable means and method for determining the thyroid dose rate of mixed airborne species of solid and gaseous radioiodine without requiring highly skilled personnel, such as health physicists or electronics technicians. To this end, this invention provides a means and method for sampling a gas from a source of a mixed species of solid and gaseous radioiodine for collection of the mixed species and readout and assessment of the emissions therefrom by cylindrically, concentrically and annularly molding the respective species around a cylindrical passage for receiving a conventional probe-type Geiger-Mueller radiation detector.

  4. Automated content and quality assessment of full-motion-video for the generation of meta data

    NASA Astrophysics Data System (ADS)

    Harguess, Josh

    2015-05-01

    Virtually all of the video data (and full-motion-video (FMV)) that is currently collected and stored in support of missions has been corrupted to various extents by image acquisition and compression artifacts. Additionally, video collected by wide-area motion imagery (WAMI) surveillance systems and unmanned aerial vehicles (UAVs) and similar sources is often of low quality or in other ways corrupted so that it is not worth storing or analyzing. In order to make progress in the problem of automatic video analysis, the first problem that should be solved is deciding whether the content of the video is even worth analyzing to begin with. We present a work in progress to address three types of scenes which are typically found in real-world data stored in support of Department of Defense (DoD) missions: no or very little motion in the scene, large occlusions in the scene, and fast camera motion. Each of these produce video that is generally not usable to an analyst or automated algorithm for mission support and therefore should be removed or flagged to the user as such. We utilize recent computer vision advances in motion detection and optical flow to automatically assess FMV for the identification and generation of meta-data (or tagging) of video segments which exhibit unwanted scenarios as described above. Results are shown on representative real-world video data.

  5. Automated Health Alerts Using In-Home Sensor Data for Embedded Health Assessment

    PubMed Central

    Guevara, Rainer Dane; Rantz, Marilyn

    2015-01-01

    We present an example of unobtrusive, continuous monitoring in the home for the purpose of assessing early health changes. Sensors embedded in the environment capture behavior and activity patterns. Changes in patterns are detected as potential signs of changing health. We first present results of a preliminary study investigating 22 features extracted from in-home sensor data. A 1-D alert algorithm was then implemented to generate health alerts to clinicians in a senior housing facility. Clinicians analyze each alert and provide a rating on the clinical relevance. These ratings are then used as ground truth for training and testing classifiers. Here, we present the methodology for four classification approaches that fuse multisensor data. Results are shown using embedded sensor data and health alert ratings collected on 21 seniors over nine months. The best results show similar performance for two techniques, where one approach uses only domain knowledge and the second uses supervised learning for training. Finally, we propose a health change detection model based on these results and clinical expertise. The system of in-home sensors and algorithms for automated health alerts provides a method for detecting health problems very early so that early treatment is possible. This method of passive in-home sensing alleviates compliance issues. PMID:27170900

  6. Preliminary performance assessment of computer automated facial approximations using computed tomography scans of living individuals.

    PubMed

    Parks, Connie L; Richard, Adam H; Monson, Keith L

    2013-12-10

    ReFace (Reality Enhancement Facial Approximation by Computational Estimation) is a computer-automated facial approximation application jointly developed by the Federal Bureau of Investigation and GE Global Research. The application derives a statistically based approximation of a face from an unidentified skull using a dataset of ~400 human head computed tomography (CT) scans of living adult American individuals from four ancestry groups: African, Asian, European and Hispanic (self-identified). To date, only one unpublished subjective recognition study has been conducted using ReFace approximations. It indicated that approximations produced by ReFace were recognized above chance rates (10%). This preliminary study assesses: (i) the recognizability of five ReFace approximations; (ii) the recognizability of CT-derived skin surface replicas of the same individuals whose skulls were used to create the ReFace approximations; and (iii) the relationship between recognition performance and resemblance ratings of target individuals. All five skin surface replicas were recognized at rates statistically significantly above chance (22-50%). Four of five ReFace approximations were recognized above chance (5-18%), although with statistical significance only at the higher rate. Such results suggest reconsideration of the usefulness of the type of output format utilized in this study, particularly in regard to facial approximations employed as a means of identifying unknown individuals.

  7. Holistic approach for automated background EEG assessment in asphyxiated full-term infants

    NASA Astrophysics Data System (ADS)

    Matic, Vladimir; Cherian, Perumpillichira J.; Koolen, Ninah; Naulaers, Gunnar; Swarte, Renate M.; Govaert, Paul; Van Huffel, Sabine; De Vos, Maarten

    2014-12-01

    Objective. To develop an automated algorithm to quantify background EEG abnormalities in full-term neonates with hypoxic ischemic encephalopathy. Approach. The algorithm classifies 1 h of continuous neonatal EEG (cEEG) into a mild, moderate or severe background abnormality grade. These classes are well established in the literature and a clinical neurophysiologist labeled 272 1 h cEEG epochs selected from 34 neonates. The algorithm is based on adaptive EEG segmentation and mapping of the segments into the so-called segments’ feature space. Three features are suggested and further processing is obtained using a discretized three-dimensional distribution of the segments’ features represented as a 3-way data tensor. Further classification has been achieved using recently developed tensor decomposition/classification methods that reduce the size of the model and extract a significant and discriminative set of features. Main results. Effective parameterization of cEEG data has been achieved resulting in high classification accuracy (89%) to grade background EEG abnormalities. Significance. For the first time, the algorithm for the background EEG assessment has been validated on an extensive dataset which contained major artifacts and epileptic seizures. The demonstrated high robustness, while processing real-case EEGs, suggests that the algorithm can be used as an assistive tool to monitor the severity of hypoxic insults in newborns.

  8. Interim assessment of the VAL automated guideway transit system. Interim report

    SciTech Connect

    Anagnostopoulos, G.

    1981-11-01

    This report describes an interim assessment of the VAL (Vehicules Automatiques Legers or Light Automated Vehicle) AGT system which is currently under construction in Lille, France, and which is to become fully operational in December 1983. This report contains a technical description and performance data resulting from a demonstration test program performed concurrently in August 1980. VAL is the first driverless AGT urban system application in France. The system operates at grade, elevated, and in tunnels on an exclusive concrete dual-lane guideway that is 12.7 kilometers long. The configuration of the system is a push-pull loop operating between 17 on-line stations. The system is designed to provide scheduled operation at 60-second headways and a normal one-way capacity of 7440 passengers per hour per direction with 55 percent of the passengers seated. Two pneumatic-tired vehicles are coupled into a single vehicle capable of carrying 124 passengers at line speeds of 60 km/hr. During the course of the demonstration test program, VAL demonstrated that it could achieve high levels of dependability and availability and could perform safely under all perceivable conditions.

  9. Automated processing of high resolution airborne images for earthquake damage assessment

    NASA Astrophysics Data System (ADS)

    Nex, F.; Rupnik, E.; Toschi, I.; Remondino, F.

    2014-11-01

    Emergency response ought to be rapid, reliable and efficient in terms of bringing the necessary help to sites where it is actually needed. Although remote sensing techniques require minimal fieldwork and allow for continuous coverage, the established approaches rely on vast manual work and visual assessment and are thus time-consuming and imprecise. Automated processes with minimal manual interaction are in demand. This paper attempts to address the aforementioned issues by employing an unsupervised classification approach to identify building areas affected by an earthquake event. The classification task is formulated in the Markov Random Fields (MRF) framework and only post-event airborne high-resolution images serve as the input. The generated photogrammetric Digital Surface Model (DSM) and a true orthophoto provide height and spectral information to characterize the urban scene through a set of features. The classification proceeds in two phases, one for distinguishing the buildings out of an urban context (urban classification), and the other for identifying the damaged structures (building classification). The algorithms are evaluated on a dataset consisting of aerial images (7 cm GSD) taken after the Emilia-Romagna (Italy) earthquake in 2012.

  10. Automated size-specific CT dose monitoring program: Assessing variability in CT dose

    SciTech Connect

    Christianson, Olav; Li Xiang; Frush, Donald; Samei, Ehsan

    2012-11-15

    Purpose: The potential health risks associated with low levels of ionizing radiation have created a movement in the radiology community to optimize computed tomography (CT) imaging protocols to use the lowest radiation dose possible without compromising the diagnostic usefulness of the images. Despite efforts to use appropriate and consistent radiation doses, studies suggest that a great deal of variability in radiation dose exists both within and between institutions for CT imaging. In this context, the authors have developed an automated size-specific radiation dose monitoring program for CT and used this program to assess variability in size-adjusted effective dose from CT imaging. Methods: The authors' radiation dose monitoring program operates on an independent Health Insurance Portability and Accountability Act (HIPAA)-compliant dosimetry server. Digital Imaging and Communications in Medicine (DICOM) routing software is used to isolate dose report screen captures and scout images for all incoming CT studies. Effective dose conversion factors (k-factors) are determined based on the protocol, and optical character recognition is used to extract the CT dose index and dose-length product. The patient's thickness is obtained by applying an adaptive thresholding algorithm to the scout images and is used to calculate the size-adjusted effective dose (ED_adj). The radiation dose monitoring program was used to collect data on 6351 CT studies from three scanner models (GE Lightspeed Pro 16, GE Lightspeed VCT, and GE Definition CT750 HD) and two institutions over a one-month period and to analyze the variability in ED_adj between scanner models and across institutions. Results: No significant difference was found between computer measurements of patient thickness and observer measurements (p = 0.17), and the average difference between the two methods was less than 4%. Applying the size correction resulted in ED_adj values that differed by up to 44% from effective dose estimates
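    The pipeline described above reduces to the standard estimate ED = k x DLP plus a thickness-based correction. A minimal sketch, with hypothetical k-factors and an assumed exponential size correction (the study's actual correction model is not given in this abstract):

```python
import math

# Hypothetical k-factors (mSv per mGy*cm) keyed by protocol; real values
# are protocol- and body-region-specific.
K_FACTORS = {"chest": 0.014, "abdomen": 0.015, "head": 0.0021}

def effective_dose(protocol, dlp):
    """Standard estimate: ED = k * DLP."""
    return K_FACTORS[protocol] * dlp

def size_adjusted_effective_dose(protocol, dlp, thickness_cm,
                                 ref_thickness_cm=30.0, b=0.04):
    """Illustrative size correction: scale ED by an exponential factor of
    patient thickness relative to a reference patient. Thinner patients
    receive a larger correction factor."""
    correction = math.exp(b * (ref_thickness_cm - thickness_cm))
    return effective_dose(protocol, dlp) * correction
```

    At the reference thickness the correction factor is 1 and the two estimates coincide; the per-study thickness (from the scout image) then moves ED_adj above or below the unadjusted value.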

  11. LAVA (Los Alamos Vulnerability and Risk Assessment Methodology): A conceptual framework for automated risk analysis

    SciTech Connect

    Smith, S.T.; Lim, J.J.; Phillips, J.R.; Tisinger, R.M.; Brown, D.C.; FitzGerald, P.D.

    1986-01-01

    At Los Alamos National Laboratory, we have developed an original methodology for performing risk analyses on subject systems characterized by a general set of asset categories, a general spectrum of threats, a definable system-specific set of safeguards protecting the assets from the threats, and a general set of outcomes resulting from threats exploiting weaknesses in the safeguards system. The Los Alamos Vulnerability and Risk Assessment Methodology (LAVA) models complex systems having large amounts of "soft" information about both the system itself and occurrences related to the system. Its structure lends itself well to automation on a portable computer, making it possible to analyze numerous similar but geographically separated installations consistently and in as much depth as the subject system warrants. LAVA is based on hierarchical systems theory, event trees, fuzzy sets, natural-language processing, decision theory, and utility theory. LAVA's framework is a hierarchical set of fuzzy event trees that relate the results of several embedded (or sub-) analyses: a vulnerability assessment providing information about the presence and efficacy of system safeguards, a threat analysis providing information about static (background) and dynamic (changing) threat components coupled with an analysis of asset "attractiveness" to the dynamic threat, and a consequence analysis providing information about the outcome spectrum's severity measures and impact values. By using LAVA, we have modeled our widely used computer security application as well as LAVA/CS systems for physical protection, transborder data flow, contract awards, and property management. It is presently being applied for modeling risk management in embedded systems, survivability systems, and weapons systems security. LAVA is especially effective in modeling subject systems that include a large human component.

  12. The visual assessment of broth cultures for tissue bank samples.

    PubMed

    Varettas, Kerry

    2017-01-05

    The bioburden screening process of allograft musculoskeletal tissue samples received at the South Eastern Area Laboratory Services includes the routine use of solid agar and cooked meat (CM) broth media. CM has been routinely sub-cultured onto solid agar plates after aerobic incubation at 35 °C. This study evaluated, by an in vitro inoculation study and a prospective study, whether a visual assessment of CM can replace sub-culture. Eight challenge organisms were serially diluted and inoculated into CM. The average inoculum of 0.5-5.5 CFU produced visible turbidity of CM after 24-h incubation for 7 of the challenge organisms, with one organism producing turbidity after 48-h incubation. The prospective study evaluated 222 CM, of which 213 were visually clear and no-growth on sub-culture, and 9 were turbid and culture positive. Broth cultures are an integral part of the bioburden screening process of allograft musculoskeletal tissue and swab samples, and visual assessment of CM can replace sub-culture.

  13. Automated sample preparation by pressurized liquid extraction-solid-phase extraction for the liquid chromatographic-mass spectrometric investigation of polyphenols in the brewing process.

    PubMed

    Papagiannopoulos, Menelaos; Mellenthin, Annett

    2002-11-08

    The analysis of polyphenols from solid plant or food samples usually requires laborious sample preparation. The liquid extraction of these compounds from the sample is compromised by apolar matrix interferences, an excess of which has to be eliminated prior to subsequent purification and separation. When pressurized liquid extraction is applied to the extraction of polyphenols from hops, the sequential use of different solvents can partly overcome these problems. Initial extraction with pentane eliminates hydrophobic compounds such as hop resins and oils and enables straightforward automated on-line solid-phase extraction as part of an optimized LC-MS analysis.

  14. Robotic Mars Sample Return: Risk Assessment and Analysis Report

    NASA Technical Reports Server (NTRS)

    Lalk, Thomas R.; Spence, Cliff A.

    2003-01-01

    A comparison of the risk associated with two alternative scenarios for a robotic Mars sample return mission was conducted. Two alternative mission scenarios were identified: the Jet Propulsion Laboratory (JPL) Reference Mission and a mission proposed by Johnson Space Center (JSC). The JPL mission was characterized by two landers and an orbiter, and a Mars orbit rendezvous to retrieve the samples. The JSC mission (Direct/SEP) involves a solar electric propulsion (SEP) return to Earth followed by a rendezvous with the space shuttle in Earth orbit. A qualitative risk assessment to identify and characterize the risks, and a risk analysis to quantify the risks, were conducted on these missions. Technical descriptions of the competing scenarios were developed in conjunction with NASA engineers, and the sequence of events for each candidate mission was developed. Risk distributions associated with individual and combinations of events were consolidated using event tree analysis in conjunction with Monte Carlo techniques to develop probabilities of mission success for each of the various alternatives. The results were the probability of success of various end states for each candidate scenario. These end states ranged from complete success through various levels of partial success to complete failure. Overall probability of success for the Direct/SEP mission was determined to be 66% for the return of at least one sample and 58% for the JPL mission for the return of at least one sample cache. Values were also determined for intermediate events and end states as well as for the probability of violation of planetary protection. Overall mission planetary-protection event probabilities of occurrence were determined to be 0.002% and 1.3% for the Direct/SEP and JPL Reference missions, respectively.
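    The event-tree/Monte Carlo combination can be sketched as follows, with hypothetical per-event success probabilities (the study's actual event trees, end states, and distributions are far more detailed):

```python
import random

# Hypothetical per-event success probabilities for a simplified, purely
# serial sample-return sequence; illustrative values only.
EVENTS = [("launch", 0.95), ("mars_landing", 0.90),
          ("sample_collection", 0.92), ("ascent_and_return", 0.85)]

def simulate_mission(rng):
    """Walk the event tree once; the mission succeeds only if every event does."""
    return all(rng.random() < p for _, p in EVENTS)

def probability_of_success(n_trials=100_000, seed=1):
    rng = random.Random(seed)
    successes = sum(simulate_mission(rng) for _ in range(n_trials))
    return successes / n_trials

# For a serial tree the analytic answer is just the product of branch
# probabilities; the Monte Carlo estimate should converge to it.
analytic = 0.95 * 0.90 * 0.92 * 0.85
print(f"Monte Carlo: {probability_of_success():.3f}  analytic: {analytic:.3f}")
```

    The Monte Carlo machinery earns its keep once the tree has branches with partial-success end states and uncertainty distributions on the probabilities themselves, where no closed-form product exists.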

  15. Performance assessment of automated tissue characterization for prostate H&E stained histopathology

    NASA Astrophysics Data System (ADS)

    DiFranco, Matthew D.; Reynolds, Hayley M.; Mitchell, Catherine; Williams, Scott; Allan, Prue; Haworth, Annette

    2015-03-01

    Reliable automated prostate tumor detection and characterization in whole-mount histology images is sought in many applications, including post-resection tumor staging and as ground-truth data for multi-parametric MRI interpretation. In this study, an ensemble-based supervised classification algorithm for high-resolution histology images was trained on tile-based image features, including histogram and gray-level co-occurrence statistics. The algorithm was assessed using different combinations of H&E-stained prostate slides from two separate medical centers and at two different magnifications (400x and 200x), with the aim of applying tumor classification models to new data. Slides from both datasets were annotated by expert pathologists in order to identify homogeneous cancerous and non-cancerous tissue regions of interest, which were then categorized as (1) low-grade tumor (LG-PCa), including Gleason 3 and high-grade prostatic intraepithelial neoplasia (HG-PIN), (2) high-grade tumor (HG-PCa), including various Gleason 4 and 5 patterns, or (3) non-cancerous, including benign stroma and benign prostatic hyperplasia (BPH). Classification models for both LG-PCa and HG-PCa were separately trained using a support vector machine (SVM) approach, and per-tile tumor prediction maps were generated from the resulting ensembles. Results showed high sensitivity for predicting HG-PCa, with an AUC up to 0.822 using training data from both medical centers, while LG-PCa showed a lower sensitivity of 0.763 with the same training data. Visual inspection of cancer-probability heatmaps from 9 patients showed that 17/19 tumors were detected, and HG-PCa generally reported fewer false positives than LG-PCa.
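    A minimal sketch of the per-tile SVM-ensemble idea, using synthetic features as stand-ins for the histogram/GLCM statistics and scikit-learn's bagging in place of the authors' ensemble scheme (both substitutions are assumptions, not the paper's method):

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.ensemble import BaggingClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Synthetic per-tile feature vectors; "tumor" tiles are drawn from a
# shifted distribution to mimic separable texture statistics.
n = 400
X_benign = rng.normal(0.0, 1.0, size=(n, 8))
X_tumor = rng.normal(1.0, 1.0, size=(n, 8))
X = np.vstack([X_benign, X_tumor])
y = np.array([0] * n + [1] * n)

X_tr, X_te, y_tr, y_te = train_test_split(
    X, y, test_size=0.25, random_state=0, stratify=y)

# Bagged ensemble of RBF-kernel SVMs; per-tile predictions from such an
# ensemble could be reassembled into a tumor prediction map.
clf = BaggingClassifier(SVC(kernel="rbf"), n_estimators=10, random_state=0)
clf.fit(X_tr, y_tr)
print(f"held-out tile accuracy: {clf.score(X_te, y_te):.2f}")
```

    In the real pipeline each tile's feature vector comes from the stained image itself, and the per-tile class probabilities are rendered back at the tile's slide coordinates to form the heatmap.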

  16. Associations between Interhemispheric Functional Connectivity and the Automated Neuropsychological Assessment Metrics (ANAM) in Civilian Mild TBI

    PubMed Central

    Sours, Chandler; Rosenberg, Joseph; Kane, Robert; Roys, Steve; Zhuo, Jiachen; Shanmuganathan, Kathirkamanthan; Gullapalli, Rao P.

    2014-01-01

    This study investigates cognitive deficits and alterations in resting state functional connectivity in civilian mild traumatic brain injury (mTBI) participants with high and low symptoms. Forty-one mTBI participants completed a resting state fMRI scan and the Automated Neuropsychological Assessment Metrics (ANAM) during initial testing (<10 days of injury) and a one-month follow-up. Data were compared to 30 healthy control subjects. Results from the ANAM demonstrate that mTBI participants performed significantly worse than controls on the code substitution delayed subtest (p=0.032) and weighted throughput score (p=0.001). Among the mTBI patients, high symptom mTBI participants performed worse than those with low symptoms on the code substitution delayed (p=0.017), code substitution (p=0.012), repeated simple reaction time (p=0.031), and weighted throughput score (p=0.009). Imaging results reveal that during the initial visit, low symptom mTBI participants had reduced interhemispheric functional connectivity (IH-FC) within the lateral parietal lobe (p=0.020); however, during follow-up, high symptom mTBI participants showed reduced IH-FC compared to the control group within the dorsolateral prefrontal cortex (DLPFC) (p=0.013). Reduced IH-FC within the DLPFC during the follow-up was associated with reduced cognitive performance. Together, these findings suggest that reduced rs-FC may contribute to the subtle cognitive deficits noted in high symptom mTBI participants compared to control subjects and low symptom mTBI participants. PMID:24557591

  17. Automated Liquid Microjunction Surface Sampling-HPLC-MS/MS Analysis of Drugs and Metabolites in Whole-Body Thin Tissue Sections

    SciTech Connect

    Kertesz, Vilmos; Van Berkel, Gary J

    2013-01-01

    A fully automated liquid extraction-based surface sampling system utilizing a commercially available autosampler coupled to high performance liquid chromatography-tandem mass spectrometry (HPLC-MS/MS) detection is reported. Discrete spots selected for droplet-based sampling and automated sample queue generation for both the autosampler and MS were enabled by using in-house developed software. In addition, co-registration of spatially resolved sampling position and HPLC-MS information to generate heatmaps of compounds monitored for subsequent data analysis was also available in the software. The system was evaluated with whole-body thin tissue sections from propranolol dosed rat. The hands-free operation of the system was demonstrated by creating heatmaps of the parent drug and its hydroxypropranolol glucuronide metabolites with 1 mm resolution in the areas of interest. The sample throughput was approximately 5 min/sample defined by the time needed for chromatographic separation. The spatial distributions of both the drug and its metabolites were consistent with previous studies employing other liquid extraction-based surface sampling methodologies.

  18. Development of a fully automated open-column chemical-separation system—COLUMNSPIDER—and its application to Sr-Nd-Pb isotope analyses of igneous rock samples

    NASA Astrophysics Data System (ADS)

    Miyazaki, Takashi; Vaglarov, Bogdan Stefanov; Takei, Masakazu; Suzuki, Masahiro; Suzuki, Hiroaki; Ohsawa, Kouzou; Chang, Qing; Takahashi, Toshiro; Hirahara, Yuka; Hanyu, Takeshi; Kimura, Jun-Ichi; Tatsumi, Yoshiyuki

    A fully automated open-column resin-bed chemical-separation system, named COLUMNSPIDER, has been developed. The system consists of a programmable micropipetting robot that dispenses chemical reagents and sample solutions into an open-column resin bed for elemental separation. After the initial set up of resin columns, chemical reagents, and beakers for the separated chemical components, all separation procedures are automated. As many as ten samples can be eluted in parallel in a single automated run. Many separation procedures, such as radiogenic isotope ratio analyses for Sr and Nd, involve the use of multiple column separations with different resin columns, chemical reagents, and beakers of various volumes. COLUMNSPIDER completes these separations using multiple runs. Programmable functions, including the positioning of the micropipetter, reagent volume, and elution time, enable flexible operation. Optimized movements for solution take-up and high-efficiency column flushing allow the system to perform as precisely as when carried out manually by a skilled operator. Procedural blanks, examined for COLUMNSPIDER separations of Sr, Nd, and Pb, are low and negligible. The measured Sr, Nd, and Pb isotope ratios for JB-2 and Nd isotope ratios for JB-3 and BCR-2 rock standards all fall within the ranges reported previously in high-accuracy analyses. COLUMNSPIDER is a versatile tool for the efficient elemental separation of igneous rock samples, a process that is both labor intensive and time consuming.

  19. Automated extraction of DNA from blood and PCR setup using a Tecan Freedom EVO liquid handler for forensic genetic STR typing of reference samples.

    PubMed

    Stangegaard, Michael; Frøslev, Tobias G; Frank-Hansen, Rune; Hansen, Anders J; Morling, Niels

    2011-04-01

    We have implemented and validated automated protocols for DNA extraction and PCR setup using a Tecan Freedom EVO liquid handler mounted with the Te-MagS magnetic separation device (Tecan, Männedorf, Switzerland). The protocols were validated for accredited forensic genetic work according to ISO 17025, using the Qiagen MagAttract DNA Mini M48 kit (Qiagen GmbH, Hilden, Germany) on fresh whole blood and blood from deceased individuals. The workflow was simplified by returning the DNA extracts to the original tubes, minimizing the risk of misplacing samples; the tubes that originally contained the samples were washed with MilliQ water before the return of the DNA extracts. PCR was set up in 96-well microtiter plates. The methods were validated for the following kits: AmpFℓSTR Identifiler, SGM Plus, and Yfiler (Applied Biosystems, Foster City, CA), and GenePrint FFFL and PowerPlex Y (Promega, Madison, WI). The automated protocols allowed for extraction and addition of PCR master mix for 96 samples within 3.5 h. In conclusion, we demonstrated that (1) DNA extraction with magnetic beads and (2) PCR setup for accredited forensic genetic short tandem repeat typing can be implemented on a simple automated liquid handler, reducing manual work while increasing quality and throughput.

  20. Automated sample preparation based on the sequential injection principle. Solid-phase extraction on a molecularly imprinted polymer coupled on-line to high-performance liquid chromatography.

    PubMed

    Theodoridis, Georgios; Zacharis, Constantinos K; Tzanavaras, Paraskevas D; Themelis, Demetrius G; Economou, Anastasios

    2004-03-19

    A molecularly imprinted polymer (MIP), prepared using caffeine as a template, was validated as a selective sorbent for solid-phase extraction (SPE) within an automated on-line sample preparation method. The polymer produced was packed in a polypropylene cartridge, which was incorporated in a flow system prior to the HPLC analytical instrumentation. The principle of sequential injection was utilised for a rapid, automated, and efficient SPE procedure on the MIP. Samples, buffers, and washing and elution solvents were introduced to the extraction cartridge via a peristaltic pump and a multi-position valve, both controlled by software developed in-house. The method was optimised in terms of flow rates, extraction time, and volume. After extraction, the final eluent from the extraction cartridge was directed to the injection loop and was subsequently analysed by HPLC. The overall set-up facilitated unattended operation and improved both mixing fluidics and method development flexibility. This system may be readily built in the laboratory and can further serve as an automated platform for on-line sample preparation.

  1. Environmental sampling at remote sites based on radiological screening assessments

    SciTech Connect

    Ebinger, M.H.; Hansen, W.R.; Wenz, G.; Oxenberg, T.P.

    1996-06-01

    Environmental radiation monitoring (ERM) data from remote sites on the White Sands Missile Range, New Mexico, were used to estimate doses to humans and terrestrial mammals from residual radiation deposited during testing of components containing depleted uranium (DU) and thorium (Th). ERM data were used with the DOE code RESRAD and a simple steady-state pathway code to estimate the potential adverse effects of DU and Th on workers in the contaminated zones, on hunters consuming animals from the contaminated zones, and on terrestrial mammals that inhabit the contaminated zones. Assessments of zones contaminated with both DU and Th, and with DU alone, were conducted. Radiological doses from Th and DU in soils were largest, with a maximum of about 3.5 mrem y⁻¹ in humans and a maximum of about 0.1 mrad d⁻¹ in deer. Dose estimates from DU alone in soils were significantly lower, with a maximum of about 1 mrem y⁻¹ in humans and about 0.04 mrad d⁻¹ in deer. These dose estimates strongly suggest that environmental sampling in the affected areas can be infrequent and still provide adequate assessments of radiological doses to workers, hunters, and terrestrial mammals.
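    As a rough illustration of what a steady-state pathway calculation of this kind involves (soil concentration × intake rate × dose conversion factor), here is a minimal sketch; the function, its parameters, and all values are hypothetical and are not taken from the report or from RESRAD:

```python
# Hypothetical steady-state ingestion-pathway dose estimate of the kind the
# report describes: soil activity concentration -> annual intake -> dose via
# a dose conversion factor (DCF). All names and values are illustrative.

def ingestion_dose_mrem_per_yr(soil_pci_per_g, intake_g_per_yr, dcf_mrem_per_pci):
    # dose (mrem/y) = activity concentration x mass ingested x dose per unit activity
    return soil_pci_per_g * intake_g_per_yr * dcf_mrem_per_pci

dose = ingestion_dose_mrem_per_yr(soil_pci_per_g=50.0,      # hypothetical DU in soil
                                  intake_g_per_yr=36.5,     # ~0.1 g soil per day
                                  dcf_mrem_per_pci=2.5e-4)  # hypothetical DCF
print(round(dose, 3))  # → 0.456
```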

  2. Sampling for Soil Carbon Stock Assessment in Rocky Agricultural Soils

    NASA Technical Reports Server (NTRS)

    Beem-Miller, Jeffrey P.; Kong, Angela Y. Y.; Ogle, Stephen; Wolfe, David

    2016-01-01

    Coring methods commonly employed in soil organic C (SOC) stock assessment may not accurately capture soil rock fragment (RF) content or soil bulk density (rho (sub b)) in rocky agricultural soils, potentially biasing SOC stock estimates. Quantitative pits are considered less biased than coring methods but are invasive and often cost-prohibitive. We compared fixed-depth and mass-based estimates of SOC stocks (0.3-meters depth) for hammer, hydraulic push, and rotary coring methods relative to quantitative pits at four agricultural sites ranging in RF content from less than 0.01 to 0.24 cubic meters per cubic meter. Sampling costs were also compared. Coring methods significantly underestimated RF content at all rocky sites, but significant differences (p is less than 0.05) in SOC stocks between pits and corers were only found with the hammer method using the fixed-depth approach at the less than 0.01 cubic meters per cubic meter RF site (pit, 5.80 kilograms C per square meter; hammer, 4.74 kilograms C per square meter) and at the 0.14 cubic meters per cubic meter RF site (pit, 8.81 kilograms C per square meter; hammer, 6.71 kilograms C per square meter). The hammer corer also underestimated rho (sub b) at all sites as did the hydraulic push corer at the 0.21 cubic meters per cubic meter RF site. No significant differences in mass-based SOC stock estimates were observed between pits and corers. Our results indicate that (i) calculating SOC stocks on a mass basis can overcome biases in RF and rho (sub b) estimates introduced by sampling equipment and (ii) a quantitative pit is the optimal sampling method for establishing reference soil masses, followed by rotary and then hydraulic push corers.
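    The fixed-depth stock calculation that underlies these comparisons can be sketched in a few lines; the formula (fine-earth bulk density × depth × C fraction, discounted by rock fragment volume) is standard, but the variable names and input values below are illustrative, not data from the study:

```python
# Fixed-depth SOC stock (kg C per m^2), correcting for rock fragment volume.
# A minimal sketch with illustrative inputs; underestimating either the rock
# fragment fraction or bulk density biases the result, as the abstract notes.

def soc_stock_fixed_depth(bulk_density_kg_m3, depth_m, c_frac, rock_frac_vol):
    """bulk_density_kg_m3: fine-earth bulk density (rho_b)
    depth_m: sampling depth (e.g., 0.3 m)
    c_frac: organic C mass fraction of the fine earth
    rock_frac_vol: rock fragment content as a volume fraction (0-1)
    """
    fine_earth_mass = bulk_density_kg_m3 * depth_m * (1.0 - rock_frac_vol)
    return fine_earth_mass * c_frac

stock = soc_stock_fixed_depth(1300.0, 0.3, 0.02, 0.14)
print(round(stock, 2))  # kg C per square meter → 6.71
```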

  4. Experimental Assessment of Mouse Sociability Using an Automated Image Processing Approach.

    PubMed

    Varghese, Frency; Burket, Jessica A; Benson, Andrew D; Deutsch, Stephen I; Zemlin, Christian W

    2016-05-15

    The mouse is the preferred model organism for testing drugs designed to increase sociability. We present a method to quantify mouse sociability in which the test mouse is placed in a standardized apparatus and relevant behaviors are assessed in three sessions (sessions I, II, and III). The apparatus has three compartments (see Figure 1); the left and right compartments each contain an inverted cup, which can house a mouse (the "stimulus mouse"). In session I, the test mouse is placed in the cage and its mobility is characterized by the number of transitions made between compartments. In session II, a stimulus mouse is placed under one of the inverted cups, and the sociability of the test mouse is quantified by the amount of time it spends near the cup containing the enclosed stimulus mouse vs. the empty inverted cup. In session III, the inverted cups are removed and both mice interact freely; sociability is quantified by the number of social approaches the test mouse makes toward the stimulus mouse and by the number of times it avoids a social approach by the stimulus mouse. Automated evaluation of the movie detects the nose of the test mouse, which allows determination of all described sociability measures in sessions I and II (in session III, approaches are identified automatically but classified manually). To find the nose, the image of an empty cage is digitally subtracted from each frame of the movie and the resulting image is binarized to identify the mouse pixels. The mouse tail is automatically removed and the two most distant points of the remaining mouse are determined; these are close to the nose and the base of the tail. By analyzing the motion of the mouse and using continuity arguments, the nose is identified.
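    The nose-finding pipeline described above (background subtraction, binarization, farthest-point pair) can be sketched as follows; the toy frame, threshold, and helper name are invented, and the tail-removal and motion-continuity steps are omitted:

```python
import math

# Sketch of the described pipeline: subtract an empty-cage image, binarize to
# get mouse pixels, then take the two most distant pixels (approximately the
# nose and the base of the tail). Toy nested lists stand in for video frames.

def mouse_extremes(frame, empty_cage, threshold=30):
    # Mouse pixels: where the frame differs strongly from the empty-cage image.
    pts = [(x, y)
           for y, (row, bg_row) in enumerate(zip(frame, empty_cage))
           for x, (v, bg) in enumerate(zip(row, bg_row))
           if abs(v - bg) > threshold]
    # Farthest pair of mask pixels ~ nose and base of tail.
    # Brute force is fine for small masks; use a convex hull for speed.
    return max(((p, q) for p in pts for q in pts),
               key=lambda pq: math.dist(pq[0], pq[1]))

empty = [[0] * 10 for _ in range(10)]
frame = [row[:] for row in empty]
for x in range(2, 8):
    frame[4][x] = 200                      # a horizontal "mouse" blob
p1, p2 = mouse_extremes(frame, empty)
print(p1, p2)  # endpoints of the blob: (2, 4) (7, 4)
```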

  5. Automated determination of nitrate plus nitrite in aqueous samples with flow injection analysis using vanadium (III) chloride as reductant.

    PubMed

    Wang, Shu; Lin, Kunning; Chen, Nengwang; Yuan, Dongxing; Ma, Jian

    2016-01-01

    Determination of nitrate in aqueous samples is an important analytical objective for environmental monitoring and assessment. Here we report the first automated flow injection analysis (FIA) of nitrate (plus nitrite) using VCl3 as the reductant, instead of the well-known but toxic cadmium column, for reducing nitrate to nitrite. The reduced nitrate, plus the nitrite originally present in the sample, reacts with the Griess reagent (sulfanilamide and N-1-naphthylethylenediamine dihydrochloride) under acidic conditions, and the resulting pink azo dye can be detected at 540 nm. The Griess reagent and VCl3 are used as a single mixed reagent solution to simplify the system. The various parameters of the FIA procedure, including reagent composition, temperature, volume of the injection loop, and flow rate, were carefully investigated and optimized via univariate experimental design. Under the optimized conditions, the linear range and detection limit of the method are 0-100 µM (R² = 0.9995) and 0.1 µM, respectively. The analytical range can easily be extended to higher concentrations by selecting alternative detection wavelengths or increasing the flow rate. The FIA system provides a sample throughput of 20 h⁻¹, much higher than that of previously reported manual methods based on the same chemistry. National reference solutions and several kinds of aqueous samples were analyzed with our method as well as the cadmium column reduction method; the results agree well with both the certified values and the results from the cadmium column reduction method (no significant difference at P = 0.95). Spiked recovery varied from 89% to 108% for samples with different matrices, indicating insignificant matrix interference.
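    The quantification step implied by the abstract is an ordinary linear calibration: fit absorbance at 540 nm against the nitrate standards, then invert the fit for unknowns. A minimal sketch with invented standard readings:

```python
# Least-squares line through calibration standards, then inversion for an
# unknown sample. The standard concentrations mirror the 0-100 uM linear
# range in the abstract; the absorbance values are invented.

def fit_line(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx          # slope, intercept

conc = [0, 20, 40, 60, 80, 100]            # uM nitrate standards
absb = [0.002, 0.101, 0.200, 0.301, 0.399, 0.500]  # absorbance at 540 nm
m, b = fit_line(conc, absb)
unknown = (0.250 - b) / m                  # invert the calibration for a sample
print(round(unknown, 1))  # → 49.9 (uM)
```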

  6. Automated assessment of bone changes in cross-sectional micro-CT studies of murine experimental osteoarthritis

    PubMed Central

    Vincent, Tonia L.; Marenzana, Massimo

    2017-01-01

    Objective The degradation of articular cartilage, which characterises osteoarthritis (OA), is usually paired with excessive bone remodelling, including subchondral bone sclerosis, cysts, and osteophyte formation. Experimental models of OA are widely used to investigate pathogenesis, yet few validated methodologies for assessing periarticular bone morphology exist, and quantitative measurements are limited by manual segmentation of micro-CT scans. The aim of this work was to chart the temporal changes in periarticular bone in murine OA by novel, automated micro-CT methods. Methods OA was induced by destabilisation of the medial meniscus (DMM) in 10-week-old male mice and disease assessed cross-sectionally from 1 to 20 weeks post-surgery. A novel approach was developed to automatically segment subchondral bone into plate and trabecular compartments in micro-CT scans of tibial epiphyses. Osteophyte volume was assessed by shape differences using 3D image registration and by measuring total epiphyseal volume. Results Significant linear and volumetric structural modifications in subchondral bone compartments and osteophytes were measured from 4 weeks post-surgery and showed progressive changes at all time points; by 20 weeks, medial subchondral bone plate thickness had increased by 160±19.5 μm and the medial osteophyte had grown by 0.124±0.028 μm³. Excellent agreement was found when automated measurements were compared with manual assessments. Conclusion Our automated methods for assessing bone changes in murine periarticular bone are rapid, quantitative, and highly accurate, and promise to be a useful tool in future preclinical studies of OA progression and treatment. The current approaches were developed specifically for cross-sectional micro-CT studies but could be applied to longitudinal studies. PMID:28334010

  7. AUTOMATED ANALYSIS OF AQUEOUS SAMPLES CONTAINING PESTICIDES, ACIDIC/BASIC/NEUTRAL SEMIVOLATILES AND VOLATILE ORGANIC COMPOUNDS BY SOLID PHASE EXTRACTION COUPLED IN-LINE TO LARGE VOLUME INJECTION GC/MS

    EPA Science Inventory

    Data are presented on the development of a new automated system combining solid phase extraction (SPE) with GC/MS for the single-run analysis of water samples containing a broad range of organic compounds. The system uses commercially available automated in-line 10-m...

  8. Assessment of automated access to existing federal marine pollution data and information systems

    SciTech Connect

    Not Available

    1986-06-01

    The objective of the Ocean Pollution Data and Information Network (OPDIN) under Section 8 of Public Law 95-273 is to improve the dissemination of information concerning Federal marine pollution-related projects. One means of improving dissemination is to develop direct access to, and a working knowledge of, those systems that have automated data and information files. The Central Coordination and Referral Office (CCRO) maintains a description of the systems and services with pollution interests that are available from eleven Federal organizations. These systems and services range from completely automated, user-accessible systems of national interest to non-automated services for retrieval of hard-copy materials and products of regional interest. This report focuses on the CCRO's present access capabilities to those systems and identifies additional systems with potential access by the CCRO and its Network participants.

  9. Automation of flow injection gas diffusion-ion chromatography for the nanomolar determination of methylamines and ammonia in seawater and atmospheric samples

    PubMed Central

    Gibb, Stuart W.; Wood, John W.; Fauzi, R.; Mantoura, C.

    1995-01-01

    The automation and improved design and performance of Flow Injection Gas Diffusion-Ion Chromatography (FIGD-IC), a novel technique for the simultaneous analysis of trace ammonia (NH3) and methylamines (MAs) in aqueous media, are presented. Automated Flow Injection Gas Diffusion (FIGD) promotes the selective transmembrane diffusion of MAs and NH3 from the aqueous sample, under strongly alkaline (pH > 12, NaOH), chelated (EDTA) conditions, into a recycled acidic acceptor stream. The acceptor is then injected onto an ion chromatograph, where NH3 and the MAs are fully resolved as their cations and detected conductimetrically. A versatile PC-interfaced control unit and data capture unit (DCU) are employed in series to direct the solenoid valve switching sequence, IC operation, and collection of data. Automation, together with other modifications, improved both the linearity (R² > 0.99; MAs 0-100 nM, NH3 0-1000 nM) and the precision (<8%) of FIGD-IC at nanomolar concentrations, compared with the manual procedure. The system was successfully applied to the determination of MAs and NH3 in seawater and in trapped particulate and gaseous atmospheric samples during an oceanographic research cruise. PMID:18925047

  10. An automated tool for the design and assessment of space systems

    NASA Technical Reports Server (NTRS)

    Dalcambre, Lois M. L.; Landry, Steve P.

    1990-01-01

    Space systems can be characterized as both large and complex, but they often rely on reusable subcomponents. One problem in the design of such systems is the representation and validation of the system, particularly at the higher levels of management. An automated tool is described for the representation, refinement, and validation of such complex systems, based on a formal design theory, the Theory of Plausible Design. In particular, the steps necessary to automate the tool and make it a competent, usable assistant are described.

  11. Assessing tiger population dynamics using photographic capture-recapture sampling.

    PubMed

    Karanth, K Ullas; Nichols, James D; Kumar, N Samba; Hines, James E

    2006-11-01

    Although wide-ranging, elusive, large carnivore species, such as the tiger, are of scientific and conservation interest, rigorous inferences about their population dynamics are scarce because of methodological problems of sampling populations at the required spatial and temporal scales. We report the application of a rigorous, noninvasive method for assessing tiger population dynamics to test model-based predictions about population viability. We obtained photographic capture histories for 74 individual tigers during a nine-year study involving 5725 trap-nights of effort. These data were modeled under a likelihood-based, "robust design" capture-recapture analytic framework. We explicitly modeled and estimated ecological parameters such as time-specific abundance, density, survival, recruitment, temporary emigration, and transience, using models that incorporated effects of factors such as individual heterogeneity, trap-response, and time on probabilities of photo-capturing tigers. The model estimated a random temporary emigration parameter of γ″ = γ′ = 0.10 ± 0.069 (values are estimated mean ± SE). When scaled to an annual basis, tiger survival rates were estimated at S = 0.77 ± 0.051, and the estimated probability that a newly caught animal was a transient was τ = 0.18 ± 0.11. During the period when the sampled area was of constant size, the estimated population size N(t) varied from 17 ± 1.7 to 31 ± 2.1 tigers, with a geometric mean rate of annual population change estimated as λ = 1.03 ± 0.020, representing a 3% annual increase. The estimated recruitment of new animals, B(t), varied from 0 ± 3.0 to 14 ± 2.9 tigers. Population density estimates, D, ranged from 7.33 ± 0.8 tigers/100 km² to 21.73 ± 1.7 tigers/100 km² during the study. Thus, despite substantial annual losses and temporal variation in recruitment, the tiger density remained at relatively high levels in Nagarahole. Our results are consistent with the hypothesis
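    The study fits a likelihood-based robust-design model, which is well beyond a few lines of code, but the underlying capture-recapture idea can be illustrated with the classic two-occasion Chapman estimator. All counts below are invented:

```python
# Chapman-corrected Lincoln-Petersen estimator for a two-occasion
# capture-recapture survey. This is a deliberately simple stand-in for the
# robust-design model used in the study; the counts are made up.

def chapman_estimate(n1, n2, m2):
    """n1: animals marked on occasion 1 (e.g., photo-identified tigers),
    n2: animals caught on occasion 2, m2: recaptures (seen both times)."""
    return (n1 + 1) * (n2 + 1) / (m2 + 1) - 1

n_hat = chapman_estimate(n1=20, n2=15, m2=11)
print(round(n_hat, 1))  # → 27.0 animals estimated in the sampled population
```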

  12. Assessing tiger population dynamics using photographic capture-recapture sampling

    USGS Publications Warehouse

    Karanth, K.U.; Nichols, J.D.; Kumar, N.S.; Hines, J.E.

    2006-01-01

    Although wide-ranging, elusive, large carnivore species, such as the tiger, are of scientific and conservation interest, rigorous inferences about their population dynamics are scarce because of methodological problems of sampling populations at the required spatial and temporal scales. We report the application of a rigorous, noninvasive method for assessing tiger population dynamics to test model-based predictions about population viability. We obtained photographic capture histories for 74 individual tigers during a nine-year study involving 5725 trap-nights of effort. These data were modeled under a likelihood-based, "robust design" capture-recapture analytic framework. We explicitly modeled and estimated ecological parameters such as time-specific abundance, density, survival, recruitment, temporary emigration, and transience, using models that incorporated effects of factors such as individual heterogeneity, trap-response, and time on probabilities of photo-capturing tigers. The model estimated a random temporary emigration parameter of γ″ = γ′ = 0.10 ± 0.069 (values are estimated mean ± SE). When scaled to an annual basis, tiger survival rates were estimated at S = 0.77 ± 0.051, and the estimated probability that a newly caught animal was a transient was τ = 0.18 ± 0.11. During the period when the sampled area was of constant size, the estimated population size N(t) varied from 17 ± 1.7 to 31 ± 2.1 tigers, with a geometric mean rate of annual population change estimated as λ = 1.03 ± 0.020, representing a 3% annual increase. The estimated recruitment of new animals, B(t), varied from 0 ± 3.0 to 14 ± 2.9 tigers. Population density estimates, D, ranged from 7.33 ± 0.8 tigers/100 km² to 21.73 ± 1.7 tigers/100 km² during the study. Thus, despite substantial annual losses and temporal variation in recruitment, the tiger density remained at relatively high levels in Nagarahole. Our results are consistent with the hypothesis that protected wild tiger populations can remain

  13. Development of Automated Signal and Meta-data Quality Assessment at the USGS ANSS NOC

    NASA Astrophysics Data System (ADS)

    McNamara, D.; Buland, R.; Boaz, R.; Benz, H.; Gee, L.; Leith, W.

    2007-05-01

    Real-time earthquake processing systems at the Advanced National Seismic System (ANSS) National Operations Center (NOC) rely on high-quality broadband seismic data to compute accurate earthquake locations, moment-tensor solutions, finite-fault models, ShakeMaps, and impact assessments. The NEIC receives real-time seismic data from the ANSS backbone, the Global Seismographic Network, ANSS regional network operators, foreign regional and national networks, the tsunami warning centers, and the International Monitoring System. For many contributed stations, calibration information is not well known. In addition, equipment upgrades or changes may occur, making it difficult to maintain accurate metadata. The high degree of real-time integration of seismic data necessitates the development of automated QC tools and procedures that identify changes in instrument response, waveform quality, and other systematic changes in station performance that might affect NEIC computations and products. We present new tools and methods that allow NEIC and other network operators to evaluate seismic station performance and characteristics in both the time and frequency domains using probability density functions (PDFs) of power spectral densities (PSDs) (McNamara and Buland, 2004). The method involves determining a station's standard noise conditions and characterizing deviations from the standard using the probabilistic distribution of hourly PSDs. We define the standard station noise conditions to lie within the 10th and 90th percentiles of the PSD distribution. The computed PSDs are stored in a database, allowing a user to access specific time periods of PSDs (PDF subsets) and time series segments through a client interface or programmatic database calls. This allows the user to visually define the spectral characteristics of known system transients. In order to identify instrument response changes or system transients, we compare short-term spectral envelopes (1 hour to 1 day) against
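    The percentile-based check described above can be sketched simply: collect many hourly power values at a given frequency, take the 10th-90th percentile band as the station's standard noise, and flag hours that fall outside it. The synthetic data and function below are illustrative only:

```python
import random
import statistics

# Sketch of a McNamara-Buland style QC check at one frequency: define
# "standard" noise as the 10th-90th percentile band of many hourly power
# values, then flag hours outside it. The dB values here are synthetic.

random.seed(0)
hourly_db = [random.gauss(-140.0, 3.0) for _ in range(1000)]  # hourly PSD values, dB

q = statistics.quantiles(hourly_db, n=10)   # deciles: q[0] ~ 10th, q[-1] ~ 90th pct
lo, hi = q[0], q[-1]

def flag_outliers(psd_values, lo, hi):
    """Indices of hours whose power falls outside the standard noise band."""
    return [i for i, v in enumerate(psd_values) if not lo <= v <= hi]

# Append one obvious transient (-100 dB, far above the noise band) at index 1000.
outliers = flag_outliers(hourly_db + [-100.0], lo, hi)
print(1000 in outliers)  # → True: the transient hour is flagged
```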

  14. Assessing uncertainty in DNA evidence caused by sampling effects.

    PubMed

    Curran, J M; Buckleton, J S; Triggs, C M; Weir, B S

    2002-01-01

    Sampling error estimation in forensic DNA testimony is discussed: is an estimate necessary, and how should it be made? The authors find that all modern methods have areas of strength and weakness; the assessment of which is 'best' is subjective and depends on the performance of the method, the type of problem (criminal work or paternity), the database size, and the availability of computing software and support. The authors preferred the highest posterior density approach for its performance; however, the other methods all have areas where their performance is adequate. For single-contributor stains, normal approximation methods are suitable, as are the bootstrap and the highest posterior density method. For multiple-contributor stains or other complex situations, the match probability expressions become quite complex and it may not be possible to derive the necessary variance expressions; the highest posterior density method or the bootstrap provides a better general method, with non-zero theta. The size-bias correction and factor-of-10 approaches may be considered acceptable by many forensic scientists as long as their limitations are understood.
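    The bootstrap approach mentioned here is easy to sketch for a single locus: resample the reference allele database with replacement, recompute the profile frequency each time, and read off percentile bounds. The database, locus model (a homozygote under Hardy-Weinberg, with theta set to zero), and counts are all invented:

```python
import random

# Bootstrap sketch of sampling uncertainty in a profile frequency: resample
# a toy allele database, recompute the frequency, take percentile bounds.
# Real casework uses multi-locus products and non-zero theta corrections.

random.seed(1)
database = ["A"] * 12 + ["B"] * 88            # 100 alleles observed at one locus

def profile_freq(sample, allele="A"):
    p = sample.count(allele) / len(sample)
    return p * p                              # homozygote "A/A" frequency under HWE

reps = sorted(profile_freq([random.choice(database) for _ in database])
              for _ in range(2000))
lo, hi = reps[49], reps[1949]                 # ~2.5th and ~97.5th percentiles
print(lo <= profile_freq(database) <= hi)     # point estimate inside the interval
```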

  15. Using Group Projects to Assess the Learning of Sampling Distributions

    ERIC Educational Resources Information Center

    Neidigh, Robert O.; Dunkelberger, Jake

    2012-01-01

    In an introductory business statistics course, student groups used sample data to compare a set of sample means to the theoretical sampling distribution. Each group was given a production measurement with a population mean and standard deviation. The groups were also provided an Excel spreadsheet with 40 sample measurements per week for 52 weeks…

  16. Taking Advantage of Automated Assessment of Student-Constructed Graphs in Science

    ERIC Educational Resources Information Center

    Vitale, Jonathan M.; Lai, Kevin; Linn, Marcia C.

    2015-01-01

    We present a new system for automated scoring of graph construction items that address complex science concepts, feature qualitative prompts, and support a range of possible solutions. This system utilizes analysis of spatial features (e.g., slope of a line) to evaluate potential student ideas represented within graphs. Student ideas are then…

  17. Automation in East Asian Libraries in the United States: A Review and Assessment.

    ERIC Educational Resources Information Center

    Elman, Sarah Su-erh

    1991-01-01

    Examines the development and implementation of the Chinese, Japanese, and Korean (CJK) systems on RLIN (Research Libraries Information Network) and OCLC and describes the results of a survey of academic and research member libraries that was conducted to learn how the two CJK systems have been incorporated into local automated library systems. (10…

  18. Assessing the Potential Value for an Automated Body Condition Scoring System through Stochastic Simulation

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Automated body condition scoring (BCS) through extraction of information from digital images has been demonstrated to be feasible; and commercial technologies are being developed. The primary objective of this research was to identify the factors that influence the potential profitability of investi...

  19. Assessment of H.264 video compression on automated face recognition performance in surveillance and mobile video scenarios

    NASA Astrophysics Data System (ADS)

    Klare, Brendan; Burge, Mark

    2010-04-01

    We assess the impact of the H.264 video codec on the match performance of automated face recognition in surveillance and mobile video applications. A set of two hundred access control (90 pixel inter-pupillary distance) and distance surveillance (45 pixel inter-pupillary distance) videos, taken under non-ideal imaging and facial recognition (e.g., pose, illumination, and expression) conditions, were matched using two commercial face recognition engines. The first study evaluated automated face recognition performance on access control and distance surveillance videos at CIF and VGA resolutions using the H.264 baseline profile at nine bitrates ranging from 8 kb/s to 2048 kb/s. In our experiments, video signals could be compressed down to 128 kb/s before a significant drop in face recognition performance occurred. The second study evaluated automated face recognition on mobile devices at QCIF, iPhone, and Android resolutions for each of the H.264 PDA profiles. Rank-one match performance, cumulative match scores, and failure-to-enroll rates are reported.
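    Rank-one match performance and cumulative match scores, as reported in the abstract, are computed from a probe-by-gallery similarity matrix; a minimal sketch with an invented score matrix:

```python
# Cumulative match characteristic (CMC) from a similarity matrix:
# scores[i][j] is the similarity of probe i to gallery identity j, and
# true_ids[i] is the correct identity for probe i. Scores are made up.

def cumulative_match(scores, true_ids):
    n = len(scores)
    ranks = []
    for i, row in enumerate(scores):
        order = sorted(range(len(row)), key=lambda j: row[j], reverse=True)
        ranks.append(order.index(true_ids[i]) + 1)   # rank of the correct identity
    # Fraction of probes whose correct identity appears within the top k.
    return [sum(r <= k for r in ranks) / n for k in range(1, len(scores[0]) + 1)]

scores = [[0.9, 0.2, 0.1],       # probe 0: correct id 0 ranked first
          [0.3, 0.5, 0.8],       # probe 1: correct id 1 ranked second
          [0.1, 0.4, 0.7]]       # probe 2: correct id 2 ranked first
cmc = cumulative_match(scores, true_ids=[0, 1, 2])
print(cmc)  # rank-1 ≈ 0.67, rank-2 = 1.0, rank-3 = 1.0
```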

  20. Negative symptoms in schizophrenia: a study in a large clinical sample of patients using a novel automated method

    PubMed Central

    Patel, Rashmi; Jayatilleke, Nishamali; Broadbent, Matthew; Chang, Chin-Kuo; Foskett, Nadia; Gorrell, Genevieve; Hayes, Richard D; Jackson, Richard; Johnston, Caroline; Shetty, Hitesh; Roberts, Angus; McGuire, Philip; Stewart, Robert

    2015-01-01

    Objectives To identify negative symptoms in the clinical records of a large sample of patients with schizophrenia using natural language processing and assess their relationship with clinical outcomes. Design Observational study using an anonymised electronic health record case register. Setting South London and Maudsley NHS Trust (SLaM), a large provider of inpatient and community mental healthcare in the UK. Participants 7678 patients with schizophrenia receiving care during 2011. Main outcome measures Hospital admission, readmission and duration of admission. Results 10 different negative symptoms were ascertained with precision statistics above 0.80. 41% of patients had 2 or more negative symptoms. Negative symptoms were associated with younger age, male gender and single marital status, and with increased likelihood of hospital admission (OR 1.24, 95% CI 1.10 to 1.39), longer duration of admission (β-coefficient 20.5 days, 7.6–33.5), and increased likelihood of readmission following discharge (OR 1.58, 1.28 to 1.95). Conclusions Negative symptoms were common and associated with adverse clinical outcomes, consistent with evidence that these symptoms account for much of the disability associated with schizophrenia. Natural language processing provides a means of conducting research in large representative samples of patients, using data recorded during routine clinical practice. PMID:26346872
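    The study used purpose-built natural language processing applications; purely as an illustration of the ascertainment idea, here is a toy keyword matcher with naive negation handling. The patterns, note text, and negation window are invented and far simpler than a real clinical NLP pipeline:

```python
import re

# Toy rule-based symptom ascertainment from free text: match symptom
# phrases, then suppress matches preceded by a nearby negation word.
# Everything here (patterns, window size, note) is illustrative only.

SYMPTOMS = {
    "blunted_affect": r"\bblunted affect\b",
    "poverty_of_speech": r"\bpoverty of speech\b",
    "social_withdrawal": r"\bsocially withdrawn\b",
}

def ascertain(note):
    found = set()
    for name, pattern in SYMPTOMS.items():
        for m in re.finditer(pattern, note, flags=re.I):
            window = note[max(0, m.start() - 20):m.start()].lower()
            if not re.search(r"\b(no|denies|without)\b", window):
                found.add(name)          # mentioned and not negated nearby
    return found

note = "Patient appears socially withdrawn with blunted affect; no poverty of speech."
print(sorted(ascertain(note)))  # → ['blunted_affect', 'social_withdrawal']
```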

  1. A fully automated system with online sample loading, isotope dimethyl labeling and multidimensional separation for high-throughput quantitative proteome analysis.

    PubMed

    Wang, Fangjun; Chen, Rui; Zhu, Jun; Sun, Deguang; Song, Chunxia; Wu, Yifeng; Ye, Mingliang; Wang, Liming; Zou, Hanfa

    2010-04-01

    Multidimensional separation is often applied for large-scale qualitative and quantitative proteome analysis. A fully automated system, integrating a reversed phase-strong cation exchange (RP-SCX) biphasic trap column into a vented sample injection system, was developed to realize online sample loading, isotope dimethyl labeling, and online multidimensional separation of proteome samples. Compared with conventional manual isotope labeling and off-line fractionation technologies, this system is fully automated and time-saving, which improves quantification reproducibility and accuracy. As a phosphate SCX monolith was integrated into the biphasic trap column, a high sample injection flow rate and high-resolution stepwise fractionation could easily be achieved. Approximately 1000 proteins could be quantified in an approximately 30 h proteome analysis, and the proteome coverage of quantitative analysis can be further improved by prolonging the multidimensional separation time. This system was applied to analyze differences in protein expression between HCC and normal human liver tissues. After triplicate analyses, 94 up-regulated and 249 down-regulated (HCC/normal) proteins were obtained. These significantly regulated proteins have been widely validated by previous gene and protein expression studies; for example, several liver enzymes involved in the urea cycle, the methylation cycle, and fatty acid catabolism were all observed to be down-regulated.
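    In isotope dimethyl labeling, relative expression falls out of the light/heavy intensity ratio per protein. A minimal sketch of that final ratio step, with invented intensities and an illustrative 1.5-fold regulation threshold (the paper's actual cutoff is not stated in the abstract):

```python
# Classify proteins as up-/down-regulated from light/heavy dimethyl-label
# intensity ratios (light = HCC, heavy = normal, per the abstract's
# HCC/Normal convention). Intensities and the 1.5-fold cutoff are invented.

def regulation(light, heavy, up=1.5, down=1 / 1.5):
    ratio = light / heavy
    if ratio >= up:
        return "up"
    if ratio <= down:
        return "down"
    return "unchanged"

print(regulation(3.2e6, 1.1e6),   # → up (ratio ~2.9)
      regulation(0.9e6, 2.4e6))   # → down (ratio ~0.38)
```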

  2. Aquarius's Instrument Science Data System (ISDS) Automated to Acquire, Process, Trend Data and Produce Radiometric System Assessment Reports

    NASA Technical Reports Server (NTRS)

    2008-01-01

    The Aquarius Radiometer, a subsystem of the Aquarius Instrument, required a data acquisition ground system to support calibration and radiometer performance assessment. To support calibration and compose performance assessments, we developed an automated system which uploaded raw data to an FTP server and saved raw and processed data to a database. This paper details the overall functionality of the Aquarius Instrument Science Data System (ISDS) and the individual electrical ground support equipment (EGSE) units which produced data files that were infused into the ISDS. Real-time EGSEs include an ICDS simulator, calibration GSE, a LabVIEW-controlled power supply, and a chamber data acquisition system. The ICDS simulator serves as the test conductor's primary workstation, collecting radiometer housekeeping (HK) and science data and passing commands and HK telemetry collection requests to the radiometer. The calibration GSE (Radiometer Active Test Source) provides a choice of multiple targets for external calibration of the radiometer. The power supply GSE, controlled by LabVIEW, provides real-time voltage and current monitoring of the radiometer. Finally, the chamber data acquisition system produces data reflecting chamber vacuum pressure, thermistor temperatures, AVG and watts. Each GSE system produces text-based data files every two to six minutes and automatically copies the data files to the central archiver PC. The archiver PC stores the data files, schedules automated uploads of these files to an external FTP server, and accepts requests to copy all data files to the ISDS for offline data processing and analysis. The Aquarius Radiometer ISDS contains PHP and MATLAB programs to parse, process and save all data to a MySQL database. Analysis tools (MATLAB programs) in the ISDS are capable of displaying radiometer science, telemetry and auxiliary data in near real time, as well as performing data analysis and producing automated performance assessment reports of the Aquarius Radiometer.
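The core parse-and-store step of such a system can be sketched. The actual ISDS uses PHP and MATLAB with a MySQL database; this sketch substitutes Python with SQLite, and the comma-separated "timestamp,channel,value" file format and channel names are assumptions for illustration.

```python
import sqlite3

# Hypothetical GSE record format: "timestamp,channel,value" per line.
SAMPLE = """2008-01-15T12:00:00,chamber_pressure,1.2e-6
2008-01-15T12:00:00,thermistor_1,21.4
2008-01-15T12:02:00,chamber_pressure,1.1e-6"""

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE telemetry (ts TEXT, channel TEXT, value REAL)")

# Parse each text record and insert it with a parameterized query
for line in SAMPLE.splitlines():
    ts, channel, value = line.strip().split(",")
    db.execute("INSERT INTO telemetry VALUES (?, ?, ?)", (ts, channel, float(value)))
db.commit()

rows = db.execute(
    "SELECT channel, COUNT(*) FROM telemetry GROUP BY channel ORDER BY channel"
).fetchall()
print(rows)  # [('chamber_pressure', 2), ('thermistor_1', 1)]
```

Once the data are queryable like this, trending and automated report generation reduce to scheduled queries over the telemetry table.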

  3. Automation in Clinical Microbiology

    PubMed Central

    Ledeboer, Nathan A.

    2013-01-01

    Historically, the trend toward automation in clinical pathology laboratories has largely bypassed the clinical microbiology laboratory. In this article, we review the historical impediments to automation in the microbiology laboratory and offer insight into the reasons why we believe that we are on the cusp of a dramatic change that will sweep a wave of automation into clinical microbiology laboratories. We review the currently available specimen-processing instruments as well as the total laboratory automation solutions. Lastly, we outline the types of studies that will need to be performed to fully assess the benefits of automation in microbiology laboratories. PMID:23515547

  4. Assessment of neonatal rat's activity by the automated registration of the animal entries in the squares of a testing arena.

    PubMed

    Menshanov, Petr N; Dygalo, Nikolay N

    2007-08-30

    Automated registration of neonatal rat entries in the squares of a testing chamber is suggested for the animal locomotion assessment. This method allows detection of paddling and pivoting activities that are not accompanied by forward movement of the animal. The proposed technique is also relatively insensitive to nonlocomotor changes in a pup's body position, such as breathing and shaking, and thus offers a selective detection of locomotor-related activity. The application of the method permits the evaluation of spontaneous and stimulated motor activity of neonatal rats using relatively short test duration and a minimal number of animals.
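The entry-counting idea above can be sketched as follows: positions are binned into grid squares and only transitions between squares are counted, so small in-place movements (breathing, shaking) within one square are ignored. The grid size and trajectory below are hypothetical, not from the paper.

```python
def count_square_entries(track, square=10.0):
    """Count transitions between grid squares along an (x, y) trajectory.
    In-place jitter that stays within one square is not counted."""
    entries = 0
    prev = None
    for x, y in track:
        cell = (int(x // square), int(y // square))  # which square we are in
        if prev is not None and cell != prev:
            entries += 1
        prev = cell
    return entries

# Jitter within one square (no entries), then a move into a new square
track = [(2.0, 2.0), (2.3, 1.9), (2.1, 2.2), (12.5, 2.0)]
print(count_square_entries(track))  # 1
```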

  5. Preliminary biogeochemical assessment of EPICA LGM and Holocene ice samples

    NASA Astrophysics Data System (ADS)

    Bulat, S.; Alekhina, I.; Marie, D.; Wagenbach, D.; Raynaud, D.; Petit, J. R.

    2009-04-01

    Weak signals could be generated, which are now being cloned. The signals were hard to reproduce because of the rather low sample volume; more ice volume is needed to make the biosignal stronger and reproducible. Meanwhile, we are adjusting the PCR and, in addition, testing a DNA repair-enzyme cocktail in case of DNA damage. As a preliminary conclusion, we would like to highlight the following: both Holocene and LGM ice samples (EDC99 and EDML) are very clean in terms of ultra-low biomass and ultra-low DOC content. The most basal ice of the EDC and EDML ice cores could help in assessing microbial biomass and diversity, if present, under the glacier at the ice-bedrock boundary. * The present-day consortium includes S. Bulat, I. Alekhina, P. Normand, D. Prieur, J-R. Petit and D. Raynaud (France) and E. Willerslev and J.P. Steffensen (Denmark)

  6. Effects of sample aging on total cholesterol values determined by the automated ferric chloride-sulfuric acid and Liebermann-Burchard procedures.

    PubMed

    Wood, P D; Bachorik, P S; Albers, J J; Stewart, C C; Winn, C; Lippel, K

    1980-04-01

    To investigate the comparability of three commonly used methods for determination of total cholesterol in plasma in several studies, we used fresh plasma samples as well as plasmas and reference sera that had been stored frozen at -15 degrees C for as long as several years. Duplicate determinations by the manual method of Abell et al. (J. Biol. Chem. 195: 357, 1952) were compared with estimates from one to five continuous-flow analyzers by the ferric chloride-sulfuric acid procedure and also with estimates from five to 13 continuous-flow analyzers by the Liebermann-Burchard procedure with calibrator, as part of the laboratory standardization activities of the Lipid Research Clinics. The agreement among all three procedures was generally within acceptable limits (within 5% of the manual method) when plasmas or sera were fresh or had been frozen for less than one month. Results by the manual method of Abell et al. agreed well with those by the automated Liebermann-Burchard method for samples that had been stored at -15 degrees C for as long as two years. However, the automated ferric chloride-sulfuric acid procedure often showed unacceptably high values (as compared with those from the manual method) for samples that had been stored frozen for a year or more. With the ferric chloride-sulfuric acid method, measured cholesterol concentration increased about 2.5% per year of storage for at least two years. We conclude that reference sera or plasmas that have been kept in long-term frozen storage (-15 degrees C) are not suitable for ongoing standardization of the automated ferric chloride-sulfuric acid assay for cholesterol.

  7. Energy Impact of Different Penetrations of Connected and Automated Vehicles: A Preliminary Assessment

    SciTech Connect

    Rios-Torres, Jackeline; Malikopoulos, Andreas

    2016-01-01

    Previous research reported in the literature has shown the benefits of traffic coordination in alleviating congestion and reducing fuel consumption and emissions. However, many challenges remain to be addressed before mass deployment of fully automated vehicles. This paper investigates the energy impacts of different penetration rates of connected and automated vehicles (CAVs) and their interaction with human-driven vehicles. We develop a simulation framework for mixed traffic (CAVs interacting with human-driven vehicles) on merging roadways and analyze the impact of different CAV penetration rates on energy consumption. The Gipps car-following model is used along with heuristic controls to represent driver decisions in a merging-roadways traffic scenario. Across the penetration rates studied, the simulation results indicate that at low penetration rates the fuel consumption benefits are significant but total travel time increases; the travel time benefits become noticeable at higher CAV penetration rates.
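The Gipps car-following model named above sets each vehicle's next speed to the minimum of a free-flow acceleration branch and a safe-braking branch. A sketch of one update step, written with positive deceleration magnitudes; the parameter values are illustrative, not the paper's calibration.

```python
import math

def gipps_speed(v, v_lead, gap, a=1.7, b=3.0, b_lead=3.0, V=30.0, tau=0.7):
    """One update of the Gipps (1981) car-following model.
    v, v_lead: follower/leader speeds [m/s]; gap: net spacing [m];
    a: max acceleration, b / b_lead: braking magnitudes [m/s^2];
    V: desired speed [m/s]; tau: reaction time [s]."""
    # Free-flow branch: accelerate toward the desired speed V
    v_acc = v + 2.5 * a * tau * (1 - v / V) * math.sqrt(0.025 + v / V)
    # Safe-speed branch: be able to stop if the leader brakes at b_lead
    inside = b * b * tau * tau + b * (2 * gap - v * tau + v_lead ** 2 / b_lead)
    v_safe = -b * tau + math.sqrt(max(inside, 0.0))
    return max(0.0, min(v_acc, v_safe))

# Follower at 20 m/s closing on a leader at 15 m/s with a 30 m gap:
print(round(gipps_speed(20.0, 15.0, 30.0), 2))  # braking branch binds
```

In a merging simulation like the one described, each human-driven vehicle would apply this update every time step, while CAVs follow the coordination controls instead.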

  8. Automated DNA-based plant identification for large-scale biodiversity assessment.

    PubMed

    Papadopoulou, Anna; Chesters, Douglas; Coronado, Indiana; De la Cadena, Gissela; Cardoso, Anabela; Reyes, Jazmina C; Maes, Jean-Michel; Rueda, Ricardo M; Gómez-Zurita, Jesús

    2015-01-01

    Rapid degradation of tropical forests makes it urgent to improve our efficiency in large-scale biodiversity assessment. DNA barcoding can assist greatly in this task, but commonly used phenetic approaches to DNA-based identification rely on the existence of comprehensive reference databases, which are infeasible for hyperdiverse tropical ecosystems. Alternatively, phylogenetic methods are more robust to sparse taxon sampling but are time-consuming, while multiple alignment of species-diagnostic, typically length-variable, markers can be problematic across divergent taxa. We advocate the combination of phylogenetic and phenetic methods for taxonomic assignment of DNA-barcode sequences against incomplete reference databases such as GenBank, and we developed a pipeline to implement this approach on large-scale plant diversity projects. The pipeline workflow includes several steps: database construction and curation, query sequence clustering, sequence retrieval, distance calculation, multiple alignment and phylogenetic inference. We describe the strategies used to establish these steps and the optimization of parameters to fit the selected psbA-trnH marker. We tested the pipeline using infertile plant samples and herbivore diet sequences from the highly threatened Nicaraguan seasonally dry forest, exploiting a valuable purpose-built resource: a partial local reference database of plant psbA-trnH. The selected methodology proved efficient and reliable for high-throughput taxonomic assignment, and our results corroborate the advantage of applying 'strict' tree-based criteria to avoid false positives. The pipeline tools are distributed as the scripts suite 'BAGpipe' (pipeline for Biodiversity Assessment using GenBank data), which can be readily adjusted to the purposes of other projects and applied to sequence-based identification for any marker or taxon.
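The distance-calculation step of such a pipeline can be illustrated with the simplest metric, uncorrected p-distance, plus nearest-reference assignment. The sequences below are toy data, not real psbA-trnH fragments, and BAGpipe's actual scripts are not reproduced here.

```python
def p_distance(a, b):
    """Uncorrected p-distance between two aligned sequences,
    ignoring positions where either sequence has a gap."""
    pairs = [(x, y) for x, y in zip(a, b) if x != "-" and y != "-"]
    if not pairs:
        return 1.0
    return sum(x != y for x, y in pairs) / len(pairs)

# Toy reference fragments (hypothetical taxa and sequences)
references = {
    "Taxon_A": "ATGCATGCAT",
    "Taxon_B": "ATGCATGGTA",
}
query = "ATGCATGCTT"
best = min(references, key=lambda name: p_distance(query, references[name]))
print(best, round(p_distance(query, references[best]), 2))  # Taxon_A 0.1
```

A phenetic identification stops here; the tree-based ('strict') criteria the paper advocates would additionally require the query to fall inside the reference taxon's clade before accepting the assignment.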

  9. Laboratory automation in clinical bacteriology: what system to choose?

    PubMed

    Croxatto, A; Prod'hom, G; Faverjon, F; Rochais, Y; Greub, G

    2016-03-01

    Automation was introduced many years ago in several diagnostic disciplines such as chemistry, haematology and molecular biology. The first laboratory automation system for clinical bacteriology was released in 2006, and it rapidly proved its value by increasing productivity, allowing a continuous increase in sample volumes despite limited budgets and personnel shortages. Today, two major manufacturers, BD Kiestra and Copan, are commercializing partial or complete laboratory automation systems for bacteriology. The laboratory automation systems are rapidly evolving to provide improved hardware and software solutions to optimize laboratory efficiency. However, the complex parameters of the laboratory and automation systems must be considered to determine the best system for each given laboratory. We address several topics on laboratory automation that may help clinical bacteriologists to understand the particularities and operative modalities of the different systems. We present (a) a comparison of the engineering and technical features of the various elements composing the two different automated systems currently available, (b) the system workflows of partial and complete laboratory automation, which define the basis for laboratory reorganization required to optimize system efficiency, (c) the concept of digital imaging and telebacteriology, (d) the connectivity of laboratory automation to the laboratory information system, (e) the general advantages and disadvantages as well as the expected impacts provided by laboratory automation and (f) the laboratory data required to conduct a workflow assessment to determine the best configuration of an automated system for the laboratory activities and specificities.

  10. Space Station Freedom automation and robotics: An assessment of the potential for increased productivity

    NASA Technical Reports Server (NTRS)

    Weeks, David J.; Zimmerman, Wayne F.; Swietek, Gregory E.; Reid, David H.; Hoffman, Ronald B.; Stammerjohn, Lambert W., Jr.; Stoney, William; Ghovanlou, Ali H.

    1990-01-01

    This report presents the results of a study performed in support of the Space Station Freedom Advanced Development Program, under the sponsorship of the Space Station Engineering (Code MT), Office of Space Flight. The study consisted of the collection, compilation, and analysis of lessons learned, crew time requirements, and other factors influencing the application of advanced automation and robotics, with emphasis on potential improvements in productivity. The lessons learned data collected were based primarily on Skylab, Spacelab, and other Space Shuttle experiences, consisting principally of interviews with current and former crew members and other NASA personnel with relevant experience. The objectives of this report are to present a summary of this data and its analysis, and to present conclusions regarding promising areas for the application of advanced automation and robotics technology to the Space Station Freedom and the potential benefits in terms of increased productivity. In this study, primary emphasis was placed on advanced automation technology because of its fairly extensive utilization within private industry including the aerospace sector. In contrast, other than the Remote Manipulator System (RMS), there has been relatively limited experience with advanced robotics technology applicable to the Space Station. This report should be used as a guide and is not intended to be used as a substitute for official Astronaut Office crew positions on specific issues.

  11. Advanced manufacturing rules check (MRC) for fully automated assessment of complex reticle designs: Part II

    NASA Astrophysics Data System (ADS)

    Straub, J. A.; Aguilar, D.; Buck, P. D.; Dawkins, D.; Gladhill, R.; Nolke, S.; Riddick, J.

    2006-10-01

    Advanced electronic design automation (EDA) tools, with their simulation, modeling, design rule checking, and optical proximity correction capabilities, have facilitated the improvement of first pass wafer yields. While the data produced by these tools may have been processed for optimal wafer manufacturing, it is possible for the same data to be far from ideal for photomask manufacturing, particularly at lithography and inspection stages, resulting in production delays and increased costs. The same EDA tools used to produce the data can be used to detect potential problems for photomask manufacturing in the data. In the previous paper, it was shown how photomask MRC is used to uncover data related problems prior to automated defect inspection. It was demonstrated how jobs which are likely to have problems at inspection could be identified and separated from those which are not. The use of photomask MRC in production was shown to reduce time lost to aborted runs and troubleshooting due to data issues. In this paper, the effectiveness of this photomask MRC program in a high volume photomask factory over the course of a year as applied to more than ten thousand jobs will be shown. Statistics on the results of the MRC runs will be presented along with the associated impact to the automated defect inspection process. Common design problems will be shown as well as their impact to mask manufacturing throughput and productivity. Finally, solutions to the most common and most severe problems will be offered and discussed.

  12. Reliability assessment of an automated forced swim test device using two mouse strains.

    PubMed

    Kurtuncu, Murat; Luka, Lance J; Dimitrijevic, Nikola; Uz, Tolga; Manev, Hari

    2005-11-30

    The Porsolt forced swim test (FST) is one of the most widely used behavioral tests in the evaluation of the antidepressant effects of drugs. It is based on the fact that these drugs reduce the depression-related behaviors of learned helplessness. The model has been modified for use in mice. In contrast to rats, mice are exposed to forced swimming only once and their immobility behavior is measured and considered a "depression-like" phenotype. Like many other behavioral tests, FST can be affected by observer-related artifacts. In recent years, automated testing systems have been developed to decrease artifacts that may greatly influence the interpretation of results. In this work, we used two strains of mice, i.e., C3H/HeJ and C57BL/6J, which differ in their FST immobility times. We employed a new commercially available automated FST device and a blinded observer-based FST, and we examined their ability to measure behavioral differences between these two mouse strains. Our results suggest that the tested automated FST system generates reliable data comparable to results obtained by trained observers.

  13. Semi-automated disk-type solid-phase extraction method for polychlorinated dibenzo-p-dioxins and dibenzofurans in aqueous samples and its application to natural water.

    PubMed

    Choi, J W; Lee, J H; Moon, B S; Baek, K H

    2007-07-20

    A disk-type solid-phase extraction (SPE) method was used for the extraction of polychlorinated dibenzo-p-dioxins and dibenzofurans (PCDD/Fs) in natural water and tap water. Since this SPE system comprised airtight glass covers with a decompression pump, it enabled continuous, semi-automated extraction. The disk-type SPE method was validated by comparing its recovery rates of spiked internal standards with those of liquid-liquid extraction (LLE). The recovery ranges of both methods were similar in terms of (13)C-labeled internal standards: 64.3-99.2% for the LLE and 52.4-93.6% for the SPE. For the native spike of 1,3,6,8-tetrachlorinated dibenzo-p-dioxin (TCDD) and octachlorinated dibenzo-p-dioxin (OCDD), the recoveries in the SPE were in the normal range of 77.9-101.1%. However, in the LLE, the recoveries of 1,3,6,8-TCDD decreased significantly; one reason for the low recovery is the high solubility of this congener. The semi-automated SPE method was applied to the analysis of different types of water: river water, snow, sea water, raw water for drinking purposes, and tap water. PCDD/F congeners were found in some sea water and snow samples, while their concentrations in the other samples were below the limits of detection (LODs). This SPE system is appropriate for the routine analysis of water samples below 50 L.

  14. An automated system to mount cryo-cooled protein crystals on a synchrotron beam line, using compact sample cassettes and a small-scale robot

    PubMed Central

    Cohen, Aina E.; Ellis, Paul J.; Miller, Mitchell D.; Deacon, Ashley M.; Phizackerley, R. Paul

    2014-01-01

    An automated system for mounting and dismounting pre-frozen crystals has been implemented at the Stanford Synchrotron Radiation Laboratory (SSRL). It is based on a small industrial robot and compact cylindrical cassettes, each holding up to 96 crystals mounted on Hampton Research sample pins. For easy shipping and storage, the cassette fits inside several popular dry-shippers and long-term storage Dewars. A dispensing Dewar holds up to three cassettes in liquid nitrogen adjacent to the beam line goniometer. The robot uses a permanent magnet tool to extract samples from, and insert samples into a cassette, and a cryo-tong tool to transfer them to and from the beam line goniometer. The system is simple, with few moving parts, reliable in operation and convenient to use. PMID:24899734

  15. Automated analysis of mitomycin C in body fluids by high-performance liquid chromatography with on-line sample pre-treatment.

    PubMed

    Tjaden, U R; de Bruijn, E A; van der Hoeven, R A; Jol, C; van der Greef, J; Lingeman, H

    1987-09-04

    A fully automated liquid chromatographic system for the bioanalysis of mitomycin C has been described. The isolation of the analyte from the biological matrix (plasma, ascites and urine) is performed using a continuous-flow system equipped with a dialysis membrane in order to remove proteins. The samples are concentrated on a reversed-phase pre-column and subsequently introduced on to a reversed-phase analytical column by applying column-switching techniques. The drug is detected by absorbance measurements at 360 nm. Using the described system up to 100 samples a day can be analysed with determination limits of the order of 1 ng/ml, with a linear dynamic range of at least three decades for plasma and urine samples. The procedure was applied to pharmacokinetic studies of ovarian cancer patients treated intraperitoneally with mitomycin C.

  16. Optimization of automated gas sample collection and isotope ratio mass spectrometric analysis of delta(13)C of CO(2) in air.

    PubMed

    Zeeman, Matthias J; Werner, Roland A; Eugster, Werner; Siegwolf, Rolf T W; Wehrle, Günther; Mohn, Joachim; Buchmann, Nina

    2008-12-01

    The application of (13)C/(12)C in ecosystem-scale tracer models for CO(2) in air requires accurate measurements of the mixing ratios and stable isotope ratios of CO(2). To increase measurement reliability and data intercomparability, as well as to shorten analysis times, we have improved an existing field sampling setup with portable air sampling units and developed a laboratory setup for the analysis of the delta(13)C of CO(2) in air by isotope ratio mass spectrometry (IRMS). The changes consist of (a) optimization of sample and standard gas flow paths, (b) additional software configuration, and (c) automation of liquid nitrogen refilling for the cryogenic trap. We achieved a precision better than 0.1 per thousand and an accuracy of 0.11 +/- 0.04 per thousand for the measurement of delta(13)C of CO(2) in air and unattended operation of measurement sequences up to 12 h.
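Delta(13)C values like those reported above are conventionally expressed in per mil relative to the VPDB standard. A minimal sketch of the conversion, using a commonly cited value for the VPDB (13)C/(12)C ratio (the exact reference ratio used by a given laboratory may differ):

```python
R_VPDB = 0.0111802  # commonly cited 13C/12C ratio of the VPDB standard

def delta13C(r_sample):
    """delta-13C in per mil relative to VPDB, from a measured 13C/12C ratio."""
    return (r_sample / R_VPDB - 1.0) * 1000.0

# A ratio slightly below the standard gives a negative delta value,
# as for atmospheric CO2 (around -8 per mil):
print(round(delta13C(0.0110908), 2))
```

A precision of 0.1 per mil, as reported here, thus corresponds to resolving ratio differences of about one part in ten thousand.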

  17. Method and apparatus for automated processing and aliquoting of whole blood samples for analysis in a centrifugal fast analyzer

    DOEpatents

    Burtis, Carl A.; Johnson, Wayne F.; Walker, William A.

    1988-01-01

    A rotor and disc assembly for use in a centrifugal fast analyzer. The assembly is designed to process multiple samples of whole blood followed by aliquoting of the resultant serum into precisely measured samples for subsequent chemical analysis. The assembly requires minimal operator involvement with no mechanical pipetting. The system comprises (1) a whole blood sample disc, (2) a serum sample disc, (3) a sample preparation rotor, and (4) an analytical rotor. The blood sample disc and serum sample disc are designed with a plurality of precision bore capillary tubes arranged in a spoked array. Samples of blood are loaded into the blood sample disc in capillary tubes filled by capillary action and centrifugally discharged into cavities of the sample preparation rotor where separation of serum and solids is accomplished. The serum is loaded into the capillaries of the serum sample disc by capillary action and subsequently centrifugally expelled into cuvettes of the analytical rotor for analysis by conventional methods.

  18. Method and apparatus for automated processing and aliquoting of whole blood samples for analysis in a centrifugal fast analyzer

    DOEpatents

    Burtis, C.A.; Johnson, W.F.; Walker, W.A.

    1985-08-05

    A rotor and disc assembly for use in a centrifugal fast analyzer. The assembly is designed to process multiple samples of whole blood followed by aliquoting of the resultant serum into precisely measured samples for subsequent chemical analysis. The assembly requires minimal operator involvement with no mechanical pipetting. The system comprises: (1) a whole blood sample disc; (2) a serum sample disc; (3) a sample preparation rotor; and (4) an analytical rotor. The blood sample disc and serum sample disc are designed with a plurality of precision bore capillary tubes arranged in a spoked array. Samples of blood are loaded into the blood sample disc by capillary action and centrifugally discharged into cavities of the sample preparation rotor where separation of serum and solids is accomplished. The serum is loaded into the capillaries of the serum sample disc by capillary action and subsequently centrifugally expelled into cuvettes of the analytical rotor for analysis by conventional methods. 5 figs.

  19. Automated ambulatory assessment of cognitive performance, environmental conditions, and motor activity during military operations

    NASA Astrophysics Data System (ADS)

    Lieberman, Harris R.; Kramer, F. Matthew; Montain, Scott J.; Niro, Philip; Young, Andrew J.

    2005-05-01

    Until recently, scientists had limited opportunities to study human cognitive performance in non-laboratory, fully ambulatory situations. Recently, advances in technology have made it possible to extend behavioral assessment to the field environment. One of the first devices to measure human behavior in the field was the wrist-worn actigraph. This device, now widely employed, can acquire minute-by-minute information on an individual's level of motor activity. Actigraphs can, with reasonable accuracy, distinguish sleep from waking, the most critical and basic aspect of human behavior. However, rapid technological advances have provided the opportunity to collect much more information from fully ambulatory humans. Our laboratory has developed a series of wrist-worn devices, not much larger than a watch, which can assess simple and choice reaction time, vigilance and memory. In addition, the devices can concurrently assess motor activity with much greater temporal resolution than the standard actigraph. Furthermore, they continuously monitor multiple environmental variables, including temperature, humidity, sound and light. We have employed these monitors during training and simulated military operations to collect information that would typically be unavailable under such circumstances. In this paper we describe various versions of the vigilance monitor and how each successive version extended the capabilities of the device. Samples of data from several studies are presented, including studies conducted in harsh field environments during simulated infantry assaults, a Marine Corps officer training course and mechanized infantry (Stryker) operations. The monitors have been useful for documenting environmental conditions experienced by wearers, studying patterns of sleep and activity, and examining the effects of nutritional manipulations on warfighter performance.
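Actigraphic sleep/wake scoring of the kind mentioned above is typically a weighted moving-window threshold over per-minute activity counts (in the spirit of published actigraphy algorithms). The weights, threshold, and data below are invented for illustration and are not those of any device described in this record.

```python
def score_sleep_wake(counts, weights=(0.04, 0.2, 1.0, 0.2, 0.04), threshold=2.0):
    """Toy actigraphy scoring: weighted sum over a 5-epoch window;
    scores below threshold are scored 'S' (sleep), else 'W' (wake)."""
    half = len(weights) // 2
    padded = [0] * half + list(counts) + [0] * half  # zero-pad the edges
    epochs = []
    for i in range(len(counts)):
        window = padded[i:i + len(weights)]
        score = sum(w * c for w, c in zip(weights, window))
        epochs.append("S" if score < threshold else "W")
    return "".join(epochs)

activity = [30, 25, 2, 0, 0, 0, 0, 1, 28, 33]  # activity counts per minute
print(score_sleep_wake(activity))  # WWWSSSSWWW
```

The neighboring-epoch weights keep a single quiet minute between active minutes from being misscored as sleep, which is why the windowed sum is used rather than a per-minute threshold.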

  20. Toxicological Assessment of ISS Air Quality: Contingency Sampling - February 2013

    NASA Technical Reports Server (NTRS)

    Meyers, Valerie

    2013-01-01

    Two grab sample containers (GSCs) were collected by crew members onboard ISS in response to a vinegar-like odor in the US Lab. On February 5, the first sample was collected approximately 1 hour after the odor was noted by the crew in the forward portion of the Lab. The second sample was collected on February 22 when a similar odor was noted and localized to the end ports of the microgravity science glovebox (MSG). The crewmember removed a glove from the MSG and collected the GSC inside the glovebox volume. Both samples were returned on SpaceX-2 for ground analysis.

  1. Adjustment for unbalanced sample size for analytical biosimilar equivalence assessment.

    PubMed

    Dong, Xiaoyu Cassie; Weng, Yu-Ting; Tsong, Yi

    2017-01-06

    Large sample size imbalance is not uncommon in biosimilar development. At the beginning of product development, sample sizes of a biosimilar and a reference product may be limited, so a sample size calculation may not be feasible. During the development stage, more batches of the reference product may be added at a later stage to obtain a more reliable estimate of the reference variability. On the other hand, a sufficient number of biosimilar batches is also needed in order to understand the product well. These challenges lead to a potential sample size imbalance. In this paper, we show that large sample size imbalance may increase the power of the equivalence test in an unfavorable way, giving higher power for less similar products when the sample size of the biosimilar is much smaller than that of the reference product. Thus, it is necessary to make sample size imbalance adjustments that motivate a sufficient sample size for the biosimilar as well. This paper discusses two adjustment methods for the equivalence test in analytical biosimilarity studies. Sufficient sample sizes for both biosimilar and reference products (if feasible) remain desirable at the planning stage.
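The imbalance effect described above can be illustrated with a toy equivalence test. This z-based two one-sided tests (TOST) sketch is not the paper's adjustment method, and the numbers are hypothetical; analytical similarity assessments are commonly described as using a t-statistic with a margin tied to the reference variability instead.

```python
import math

def tost_z(mean_t, mean_r, sd, n_t, n_r, margin, z_crit=1.645):
    """Two one-sided z-tests for mean equivalence with a common SD.
    Equivalence is concluded when the (1 - 2*alpha) CI for the
    difference lies entirely within (-margin, +margin)."""
    se = sd * math.sqrt(1 / n_t + 1 / n_r)
    diff = mean_t - mean_r
    lower = diff - z_crit * se
    upper = diff + z_crit * se
    return -margin < lower and upper < margin

# Same observed difference; only the number of reference batches changes
print(tost_z(0.5, 0.0, sd=1.0, n_t=4, n_r=4, margin=1.5))    # False
print(tost_z(0.5, 0.0, sd=1.0, n_t=4, n_r=100, margin=1.5))  # True
```

Adding reference batches alone tightens the confidence interval enough to flip the conclusion even though no new information about the biosimilar was collected, which is the unfavorable behavior the paper's adjustments aim to control.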

  2. Automated DNA Sequencing System

    SciTech Connect

    Armstrong, G.A.; Ekkebus, C.P.; Hauser, L.J.; Kress, R.L.; Mural, R.J.

    1999-04-25

    Oak Ridge National Laboratory (ORNL) is developing a core DNA sequencing facility to support biological research endeavors at ORNL and to conduct basic sequencing automation research. This facility is novel because its development is based on existing standard biology laboratory equipment; thus, the development process is of interest to the many small laboratories trying to use automation to control costs and increase throughput. Before automation, biology laboratory personnel purified DNA, completed cycle sequencing, and prepared 96-well sample plates with commercially available hardware designed specifically for each step in the process. Following purification and thermal cycling, an automated sequencing machine was used for the sequencing. A technician handled all movement of the 96-well sample plates between machines. To automate the process, ORNL is adding a CRS Robotics A-465 arm, an ABI 377 sequencing machine, an automated centrifuge, an automated refrigerator, and possibly an automated SpeedVac. The entire system will be integrated with one central controller that will direct each machine and the robot. The goal of this system is to completely automate the sequencing procedure from bacterial cell samples through ready-to-be-sequenced DNA and, ultimately, to completed sequence. The system will be flexible and will accommodate different chemistries than existing automated sequencing lines. The system will be expanded in the future to include colony picking and/or actual sequencing. This discrete-event DNA sequencing system will demonstrate that smaller sequencing labs can achieve cost-effective growth through automation.

  3. Eco-HAB as a fully automated and ecologically relevant assessment of social impairments in mouse models of autism

    PubMed Central

    Puścian, Alicja; Łęski, Szymon; Kasprowicz, Grzegorz; Winiarski, Maciej; Borowska, Joanna; Nikolaev, Tomasz; Boguszewski, Paweł M; Lipp, Hans-Peter; Knapska, Ewelina

    2016-01-01

    Eco-HAB is an open source, RFID-based system for automated measurement and analysis of social preference and in-cohort sociability in mice. The system closely follows murine ethology. It requires no contact between a human experimenter and tested animals, overcoming the confounding factors that lead to irreproducible assessment of murine social behavior between laboratories. In Eco-HAB, group-housed animals live in a spacious, four-compartment apparatus with shadowed areas and narrow tunnels, resembling natural burrows. Eco-HAB allows for assessment of the tendency of mice to voluntarily spend time together in ethologically relevant mouse group sizes. Custom-made software for automated tracking, data extraction, and analysis enables quick evaluation of social impairments. The developed protocols and standardized behavioral measures demonstrate high replicability. Unlike classic three-chambered sociability tests, Eco-HAB provides measurements of spontaneous, ecologically relevant social behaviors in group-housed animals. Results are obtained faster, with less manpower, and without confounding factors. DOI: http://dx.doi.org/10.7554/eLife.19532.001 PMID:27731798

  5. Automated Writing Evaluation for Formative Assessment of Second Language Writing: Investigating the Accuracy and Usefulness of Feedback as Part of Argument-Based Validation

    ERIC Educational Resources Information Center

    Ranalli, Jim; Link, Stephanie; Chukharev-Hudilainen, Evgeny

    2017-01-01

    An increasing number of studies on the use of tools for automated writing evaluation (AWE) in writing classrooms suggest growing interest in their potential for formative assessment. As with all assessments, these applications should be validated in terms of their intended interpretations and uses. A recent argument-based validation framework…

  6. A feasibility assessment of automated FISH image and signal analysis to assist cervical cancer detection

    NASA Astrophysics Data System (ADS)

    Wang, Xingwei; Li, Yuhua; Liu, Hong; Li, Shibo; Zhang, Roy R.; Zheng, Bin

    2012-02-01

    Fluorescence in situ hybridization (FISH) technology provides a promising molecular imaging tool to detect cervical cancer. Since manual FISH analysis is difficult, time-consuming, and inconsistent, automated FISH image scanning systems have been developed. Due to the limited focal depth of scanned microscopic images, a FISH-probed specimen needs to be scanned in multiple layers, which generates a huge volume of image data. To improve the diagnostic efficiency of automated FISH image analysis, we developed a computer-aided detection (CAD) scheme. In this experiment, four pap-smear specimen slides were scanned by a dual-detector fluorescence image scanning system that acquired two spectrum images simultaneously, representing images of interphase cells and FISH-probed chromosome X. During image scanning, once a cell signal was detected, the system captured nine image slices by automatically adjusting the optical focus. Based on the sharpness index and maximum intensity measurement, cells and FISH signals distributed in 3-D space were projected into a 2-D con-focal image. The CAD scheme was applied to each con-focal image to detect analyzable interphase cells using an adaptive multiple-threshold algorithm and to detect FISH-probed signals using a top-hat transform. The ratio of abnormal cells was calculated to detect positive cases. In four scanned specimen slides, the CAD scheme generated 1676 con-focal images that depicted analyzable cells. FISH-probed signals were independently detected by our CAD algorithm and an observer. The Kappa coefficients for agreement between CAD and observer ranged from 0.69 to 1.0 in detecting/counting FISH signal spots. The study demonstrated the feasibility of applying automated FISH image and signal analysis to assist cyto-geneticists in detecting cervical cancers.
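
    The CAD-versus-observer agreement quoted above is measured with Cohen's kappa, which corrects raw agreement for the agreement expected by chance. A minimal pure-Python version, with made-up category labels standing in for per-image spot counts:

```python
def cohens_kappa(labels_a, labels_b):
    """Cohen's kappa: chance-corrected agreement between two raters.
    Returns 1.0 for perfect agreement, about 0.0 for chance-level agreement."""
    n = len(labels_a)
    categories = set(labels_a) | set(labels_b)
    observed = sum(a == b for a, b in zip(labels_a, labels_b)) / n
    # Chance agreement: product of each rater's marginal category frequencies
    expected = sum(
        (labels_a.count(c) / n) * (labels_b.count(c) / n) for c in categories
    )
    return (observed - expected) / (1 - expected)

# Hypothetical per-image FISH spot categories assigned by CAD and by the observer:
kappa = cohens_kappa([1, 1, 2, 2], [1, 2, 2, 2])
```

    Values above roughly 0.6 are conventionally read as substantial agreement, which is why the 0.69-1.0 range reported here supports the feasibility claim.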

  7. Automated screening of 2D crystallization trials using transmission electron microscopy: a high-throughput tool-chain for sample preparation and microscopic analysis.

    PubMed

    Coudray, Nicolas; Hermann, Gilles; Caujolle-Bert, Daniel; Karathanou, Argyro; Erne-Brand, Françoise; Buessler, Jean-Luc; Daum, Pamela; Plitzko, Juergen M; Chami, Mohamed; Mueller, Urs; Kihl, Hubert; Urban, Jean-Philippe; Engel, Andreas; Rémigy, Hervé-W

    2011-02-01

    We have built and extensively tested a tool-chain to prepare and screen two-dimensional crystals of membrane proteins by transmission electron microscopy (TEM) at room temperature. This automated process is an extension of a new procedure described recently that allows membrane protein 2D crystallization in parallel (Iacovache et al., 2010). The system includes a gantry robot that transfers and prepares the crystalline solutions on grids suitable for TEM analysis and an entirely automated microscope that can analyze 96 grids at once without human intervention. The operation of the system at the user level is controlled solely within the MATLAB environment: the commands to perform sample handling (loading/unloading in the microscope), microscope steering (magnification, focus, image acquisition, etc.) as well as automatic crystal detection have been implemented. Different types of thin samples can be screened efficiently provided that the particular detection algorithm is adapted to the specific task. Hence, operating time can be shared between multiple users. This is a major step towards the integration of transmission electron microscopy into a high-throughput workflow.

  8. Post-operative corticosterone levels in plasma and feces of mice subjected to permanent catheterization and automated blood sampling.

    PubMed

    Sundbom, Renée; Jacobsen, Kirsten R; Kalliokoski, Otto; Hau, Jann; Abelson, Klas S P

    2011-01-01

    This study investigated the effects of surgical placement of permanent arterial catheters on plasma corticosterone levels, fecal corticosterone excretion and body weight in male BALB/c/Sca mice. In addition, the effects of voluntarily ingested buprenorphine in doses of 0.5 and 1.0 mg/kg body weight on these parameters were studied. A catheter was placed in the carotid artery during isoflurane anesthesia. Immediately after surgery, the mice were connected to an AccuSampler® μ and blood samples for plasma corticosterone quantification were collected automatically during the first 24 h postoperatively. All fecal boli produced 24 h before and 24 h after surgery were collected for fecal corticosterone excretion measures and the pre- and post-operative body weights were registered. Plasma corticosterone levels were in the range of 150-300 ng/ml after the surgical procedure and the body weight was significantly lower 24 h after surgery compared to its pre-operative value. Contrary to what was expected, the total fecal corticosterone excretion was significantly reduced 24 h after surgery, as was the defecation. Buprenorphine treatment significantly lowered the plasma corticosterone levels, but had no effect on fecal corticosterone excretion or body weight change. It was concluded that surgical placement of an arterial catheter induces a significant stress response, as judged by its effect on plasma corticosterone and body weight. Voluntary ingestion of buprenorphine improved postoperative recovery by lowering plasma corticosterone concentrations. Neither fecal corticosterone excretion nor body weight change seems suitable for postoperative stress assessment in mice in the present experimental setup.

  9. A timber inventory based upon manual and automated analysis of ERTS-1 and supporting aircraft data using multistage probability sampling. [Plumas National Forest, California]

    NASA Technical Reports Server (NTRS)

    Nichols, J. D.; Gialdini, M.; Jaakkola, S.

    1974-01-01

    A quasi-operational study demonstrated that a timber inventory based on manual and automated analysis of ERTS-1 data, supporting aircraft data, and ground data could be made using multistage sampling techniques. The inventory proved to be a timely, cost-effective alternative to conventional timber inventory techniques. The timber volume on the Quincy Ranger District of the Plumas National Forest was estimated to be 2.44 billion board feet with a sampling error of 8.2 percent. Costs for the inventory procedure, at 1.1 cents/acre, compared favorably with the costs of a conventional inventory at 25 cents/acre. A point-by-point comparison of CALSCAN-classified ERTS data with human-interpreted low altitude photo plots indicated no significant differences in the overall classification accuracies.
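
    Volume-plus-sampling-error figures of this kind come from probability-sampling expansion estimators. As a sketch of the principle (one probability-proportional-to-size stage in the Hansen-Hurwitz form, with made-up numbers — the study's actual multistage design is more elaborate):

```python
from math import sqrt

def pps_total_estimate(samples):
    """Hansen-Hurwitz estimator of a population total from a
    probability-proportional-to-size (PPS) sample drawn with replacement.

    samples: (y_i, p_i) pairs, where y_i is the measured value of a sampled
    unit (e.g. its timber volume) and p_i its selection probability.
    Returns (estimated total, relative sampling error in percent).
    """
    n = len(samples)
    expansions = [y / p for y, p in samples]   # per-draw estimates of the total
    total = sum(expansions) / n                # mean of per-draw estimates
    var = sum((e - total) ** 2 for e in expansions) / (n * (n - 1))
    return total, 100 * sqrt(var) / total

# Three hypothetical sampling units, each drawn with probability 0.001:
total, rel_err = pps_total_estimate([(10, 0.001), (12, 0.001), (8, 0.001)])
```

    The relative error plays the role of the "8.2 percent sampling error" quoted for the 2.44-billion-board-foot estimate.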

  10. A molecular method to assess Phytophthora diversity in environmental samples.

    PubMed

    Scibetta, Silvia; Schena, Leonardo; Chimento, Antonio; Cacciola, Santa O; Cooke, David E L

    2012-03-01

    Current molecular detection methods for the genus Phytophthora are specific to a few key species rather than the whole genus, and this is a recognized weakness of protocols for ecological studies and international plant health legislation. In the present study a molecular approach was developed to detect Phytophthora species in soil and water samples using novel sets of genus-specific primers designed against the internal transcribed spacer (ITS) regions. Two different rDNA primer sets were tested: one assay amplified a long product including the ITS1, 5.8S and ITS2 regions (LP) and the other a shorter product including the ITS1 only (SP). Both assays specifically amplified products from Phytophthora species without cross-reaction with the related Pythium s. lato; however, the SP assay proved more sensitive and reliable. The method was validated using woodland soil and stream water from Invergowrie, Scotland. On-site use of a knapsack sprayer and in-line water filters proved more rapid and effective than centrifugation at sampling Phytophthora propagules. A total of 15 different Phytophthora phylotypes were identified, which clustered within the reported ITS-clades 1, 2, 3, 6, 7 and 8. The range and type of the sequences detected varied from sample to sample, and up to three and five different Phytophthora phylotypes were detected within a single sample of soil or water, respectively. The most frequently detected sequences were related to members of ITS-clade 6 (i.e. P. gonapodyides-like). The new method proved very effective at discriminating multiple species in a given sample and can also detect as yet unknown species. The reported primers and methods will prove valuable for ecological studies, biosecurity and commercial plant, soil or water (e.g. irrigation water) testing as well as the wider metagenomic sampling of this fascinating component of microbial pathogen diversity.

  11. Automated assessment of split lung function in post-lung-transplant evaluation

    NASA Astrophysics Data System (ADS)

    Goldin, Jonathan G.; Brown, Matthew S.; McNitt-Gray, Michael F.; Greaser, Lloyd E.; Martin, Katherine; Sayre, James W.; Aberle, Denise R.

    1998-07-01

    The purpose of this work was to develop an automated technique for calculating dynamic lung attenuation changes, through a forced expiratory maneuver, as a measure of split lung function. A total of ten patients post single lung transplantation (SLT) for emphysema were imaged using an Electron Beam CT Scanner; three were studied twice following stent placement. A single-slice flow study, using 100 msec exposures and 3 mm collimation, was performed at the level of the anastomosis during a forced expiration. Images were acquired every 500 msec for the first 3 seconds and every second for the last 4 seconds. An automated, knowledge-based system was developed to segment the chest wall, mediastinum, large airways and lung parenchyma in each image. Knowledge of the expected size, shape, topology and X-ray attenuation of anatomical structures was used to guide image segmentation involving attenuation thresholding, region-growing and morphology. From the segmented left and right parenchyma, the system calculated median attenuation (HU) and cross-sectional areas. These results were plotted against time for both the native and transplanted lungs. In five patients, a significant shift of the attenuation/time curve to the right (slower flow) was detected, although the end-expiration attenuation was not different. Following stent placement the curve shifted back to the left (faster flow).
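
    The region-growing step used in the knowledge-based segmentation can be illustrated with a toy 4-connected grower on a tiny synthetic CT slice. The HU values and bounds below are illustrative, not the study's actual parameters:

```python
def region_grow(image, seed, low, high):
    """Grow a region from a seed pixel, accepting 4-connected neighbours
    whose value (e.g. CT attenuation in HU) lies within [low, high]."""
    rows, cols = len(image), len(image[0])
    region, stack = set(), [seed]
    while stack:
        r, c = stack.pop()
        if (r, c) in region or not (0 <= r < rows and 0 <= c < cols):
            continue
        if not (low <= image[r][c] <= high):
            continue  # outside the attenuation threshold window
        region.add((r, c))
        stack.extend([(r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)])
    return region

# Toy slice: aerated lung (about -800 HU) surrounded by soft tissue (about 40 HU)
slice_hu = [[40, 40, 40, 40],
            [40, -800, -780, 40],
            [40, -790, -810, 40],
            [40, 40, 40, 40]]
lung = region_grow(slice_hu, seed=(1, 1), low=-1000, high=-500)
```

    From such a segmented region, a median attenuation and cross-sectional area per time point follow directly, giving the attenuation/time curves discussed above.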

  12. Automated cytochrome c oxidase bioassay developed for ionic liquids' toxicity assessment.

    PubMed

    Costa, Susana P F; Martins, Bárbara S F; Pinto, Paula C A G; Saraiva, M Lúcia M F S

    2016-05-15

    A fully automated cytochrome c oxidase assay resorting to sequential injection analysis (SIA) was developed for the first time and implemented to evaluate potentially toxic compounds. The bioassay was validated by evaluation of 15 ionic liquids (ILs) with distinct cationic head groups, alkyl side chains and anions. The assay was based on the reduction of cytochrome c oxidase activity in the presence of the tested compounds and quantification of the inhibitor concentration required to cause 50% inhibition of enzyme activity (EC50). The obtained results demonstrated that enzyme activity was considerably inhibited by the BF4 anion and by ILs incorporating non-aromatic pyrrolidinium and tetrabutylphosphonium cation cores. Emim [Ac] and chol [Ac], on the contrary, presented the highest EC50 values among the ILs tested. The developed automated SIA methodology is a simple and robust high-throughput screening bioassay and exhibited good repeatability in all the tested conditions (rsd<3.7%, n=10). Therefore, it is expected that, due to its simplicity and low cost, the developed approach can be used as an alternative to traditional screening assays for evaluation of IL toxicity and identification of possible toxicophore structures. Additionally, the results presented in this study provide further information about IL toxicity.
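
    An EC50 of the kind quantified here is typically read off a dose-response curve. A minimal sketch — linear interpolation on log concentration at the 50%-activity crossing, with hypothetical data (real workflows usually fit a full four-parameter logistic model instead):

```python
from math import log10

def ec50(concentrations, activities):
    """Estimate the EC50 by linear interpolation on log10(concentration)
    at the point where enzyme activity crosses 50% of the control value.
    Assumes activities (in % of control) decrease with concentration."""
    points = list(zip(concentrations, activities))
    for (c1, a1), (c2, a2) in zip(points, points[1:]):
        if a1 >= 50 >= a2:
            frac = (a1 - 50) / (a1 - a2)
            return 10 ** (log10(c1) + frac * (log10(c2) - log10(c1)))
    raise ValueError("activity never crosses 50% of control")

# Hypothetical inhibition curve for one ionic liquid (concentrations in mM):
value = ec50([1, 10, 100, 1000], [95, 75, 25, 5])
```

    A lower EC50 means the compound inhibits the enzyme at lower concentrations, i.e. is more toxic in this assay — which is why Emim [Ac] and chol [Ac], with the highest EC50 values, rank as least inhibitory.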

  13. Fully automated muscle quality assessment by Gabor filtering of second harmonic generation images.

    PubMed

    Paesen, Rik; Smolders, Sophie; Vega, José Manolo de Hoyos; Eijnde, Bert O; Hansen, Dominique; Ameloot, Marcel

    2016-02-01

    Although structural changes on the sarcomere level of skeletal muscle are known to occur due to various pathologies, rigorous studies of the reduced sarcomere quality remain scarce. This can possibly be explained by the lack of an objective tool for analyzing and comparing sarcomere images across biological conditions. Recent developments in second harmonic generation (SHG) microscopy and increasing insight into the interpretation of sarcomere SHG intensity profiles have made SHG microscopy a valuable tool to study microstructural properties of sarcomeres. Typically, sarcomere integrity is analyzed by fitting a set of manually selected, one-dimensional SHG intensity profiles with a supramolecular SHG model. To circumvent this tedious manual selection step, we developed a fully automated image analysis procedure to map the sarcomere disorder for the entire image at once. The algorithm relies on a single-frequency wavelet-based Gabor approach and includes a newly developed normalization procedure allowing for unambiguous data interpretation. The method was validated by showing the correlation between the sarcomere disorder, quantified by the M-band size obtained from manually selected profiles, and the normalized Gabor value ranging from 0 to 1 for decreasing disorder. Finally, to elucidate the applicability of our newly developed protocol, Gabor analysis was used to study the effect of experimental autoimmune encephalomyelitis on the sarcomere regularity. We believe that the technique developed in this work holds great promise for high-throughput, unbiased, and automated image analysis to study sarcomere integrity by SHG microscopy.
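
    The single-frequency Gabor idea at the heart of the method can be sketched in one dimension: a complex Gabor filter responds strongly where the intensity profile is periodic at the filter frequency (an ordered sarcomere pattern) and weakly where it is flat or disordered. The parameters and the normalization below are simplified relative to the paper's wavelet-based approach:

```python
from math import cos, exp, pi, sin, sqrt

def gabor_magnitude(profile, center, freq, sigma):
    """Magnitude response of a single-frequency complex Gabor filter applied
    to a 1D intensity profile at position `center`. Regular striation at the
    filter frequency yields a large response; flat regions yield a small one."""
    re = im = 0.0
    for x, intensity in enumerate(profile):
        envelope = exp(-((x - center) ** 2) / (2 * sigma ** 2))
        re += intensity * envelope * cos(2 * pi * freq * (x - center))
        im += intensity * envelope * sin(2 * pi * freq * (x - center))
    return sqrt(re * re + im * im)

# A regular "sarcomere" profile with a 10-pixel period responds far more
# strongly at freq=0.1 than a featureless profile of the same mean intensity:
regular = [1 + cos(2 * pi * x / 10) for x in range(41)]
flat = [1.0] * 41
```

    Mapping this response over a whole image and normalizing it to [0, 1], as the authors do, turns local sarcomere regularity into a per-pixel disorder score.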

  14. Statistical analysis to assess automated level of suspicion scoring methods in breast ultrasound

    NASA Astrophysics Data System (ADS)

    Galperin, Michael

    2003-05-01

    A well-defined rule-based system has been developed for scoring the Level of Suspicion (LOS) from 0 to 5 based on a qualitative lexicon describing the ultrasound appearance of a breast lesion. The purpose of the research is to assess and select one of the automated quantitative LOS scoring methods developed during preliminary studies on benign biopsy reduction. The study used a Computer Aided Imaging System (CAIS) to improve the uniformity and accuracy of applying the LOS scheme by automatically detecting, analyzing and comparing breast masses. The overall goal is to reduce biopsies on the masses with lower levels of suspicion, rather than to increase the accuracy of diagnosis of cancers (which will require biopsy anyway). On complex cysts and fibroadenoma cases, experienced radiologists were up to 50% less certain in true negatives than CAIS. Full correlation analysis was applied to determine which of the proposed LOS quantification methods serves CAIS accuracy the best. This paper presents current results of applying statistical analysis to automated LOS scoring quantification for breast masses with known biopsy results. It was found that the First Order Ranking method yielded the most accurate results. The CAIS system (Image Companion, Data Companion software) was developed by Almen Laboratories and was used to achieve the results.

  16. High-throughput, automated extraction of DNA and RNA from clinical samples using TruTip technology on common liquid handling robots.

    PubMed

    Holmberg, Rebecca C; Gindlesperger, Alissa; Stokes, Tinsley; Brady, Dane; Thakore, Nitu; Belgrader, Philip; Cooney, Christopher G; Chandler, Darrell P

    2013-06-11

    TruTip is a simple nucleic acid extraction technology whereby a porous, monolithic binding matrix is inserted into a pipette tip. The geometry of the monolith can be adapted for specific pipette tips ranging in volume from 1.0 to 5.0 ml. The large porosity of the monolith enables viscous or complex samples to readily pass through it with minimal fluidic backpressure. Bi-directional flow maximizes residence time between the monolith and sample, and enables large sample volumes to be processed within a single TruTip. The fundamental steps, irrespective of sample volume or TruTip geometry, include cell lysis, nucleic acid binding to the inner pores of the TruTip monolith, washing away unbound sample components and lysis buffers, and eluting purified and concentrated nucleic acids into an appropriate buffer. The attributes and adaptability of TruTip are demonstrated in three automated clinical sample processing protocols using Eppendorf epMotion 5070, Hamilton STAR and Hamilton STARplus liquid handling robots: RNA isolation from nasopharyngeal aspirate, genomic DNA isolation from whole blood, and fetal DNA extraction and enrichment from large volumes of maternal plasma, respectively.

  17. Protein Quality Assessment on Saliva Samples for Biobanking Purposes.

    PubMed

    Rosa, Nuno; Marques, Jéssica; Esteves, Eduardo; Fernandes, Mónica; Mendes, Vera M; Afonso, Ângela; Dias, Sérgio; Pereira, Joaquim Polido; Manadas, Bruno; Correia, Maria José; Barros, Marlene

    2016-08-01

    Biobank saliva sample quality depends on specific criteria applied to collection, processing, and storage. In spite of the growing interest in saliva as a diagnostic fluid, few biobanks currently store large collections of such samples. The development of a standard operating procedure (SOP) for saliva collection and quality control is fundamental for the establishment of a new saliva biobank, which stores samples to be made available to the saliva research community. Different collection methods were tested regarding total volume of protein obtained, protein content, and protein profiles, and the results were used to choose the best method for protein studies. Furthermore, the impact of the circadian variability and inter- and intraindividual differences, as well as the saliva sample stability at room temperature, were also evaluated. Considering our results, a sublingual cotton roll method for saliva collection proved to produce saliva with the best characteristics and should be applied in the morning, whenever possible. In addition, there is more variability in salivary proteins between individuals than in the same individual for a 5-month period. According to the electrophoretic protein profile, protein stability is guaranteed for 24 hours at room temperature and the protein degradation profile and protein identification were characterized. All this information was used to establish an SOP for saliva collection, processing, and storage in a biobank. We conclude that it is possible to collect saliva using an easy and inexpensive protocol, resulting in saliva samples for protein analysis with sufficient quality for biobanking purposes.

  18. Influence of sample preparation and reliability of automated numerical refocusing in stain-free analysis of dissected tissues with quantitative phase digital holographic microscopy

    NASA Astrophysics Data System (ADS)

    Kemper, Björn; Lenz, Philipp; Bettenworth, Dominik; Krausewitz, Philipp; Domagk, Dirk; Ketelhut, Steffi

    2015-05-01

    Digital holographic microscopy (DHM) has been demonstrated to be a versatile tool for high resolution non-destructive quantitative phase imaging of surfaces and multi-modal minimally-invasive monitoring of living cell cultures in-vitro. DHM provides quantitative monitoring of physiological processes through functional imaging and structural analysis which, for example, gives new insight into signalling of cellular water permeability and cell morphology changes due to toxins and infections. Quantitative DHM phase contrast also opens prospective application fields in the analysis of dissected tissues through stain-free imaging and the quantification of tissue density changes. We show that DHM allows imaging of different tissue layers with high contrast in unstained tissue sections. As the investigation of fixed samples represents a very important application field in pathology, we also analyzed the influence of sample preparation. The retrieved data demonstrate that the quality of quantitative DHM phase images of dissected tissues depends strongly on the fixing method and common staining agents. As the reconstruction in DHM is performed numerically, multi-focus imaging is achieved from a single digital hologram. Thus, we evaluated the automated refocusing feature of DHM for application on different types of dissected tissues and found that highly reproducible holographic autofocusing can be achieved on moderately stained samples. Finally, it is demonstrated that alterations of the spatial refractive index distribution in murine and human tissue samples represent a reliable absolute parameter that is related to different degrees of inflammation in experimental colitis and Crohn's disease. This paves the way towards the use of DHM in digital pathology for automated histological examinations and further studies to elucidate the translational potential of quantitative phase microscopy for the clinical management of patients, e.g., with inflammatory bowel disease.
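
    Numerical autofocusing of the kind evaluated here amounts to reconstructing the hologram at many propagation distances and picking the reconstruction that maximizes a focus metric. A toy sketch of the selection step (the squared-gradient metric is one common choice; the 3x3 "reconstructions" are purely illustrative):

```python
def sharpness(image):
    """Squared-gradient focus metric: larger values mean sharper structure."""
    total = 0.0
    for r in range(len(image) - 1):
        for c in range(len(image[0]) - 1):
            total += (image[r][c + 1] - image[r][c]) ** 2
            total += (image[r + 1][c] - image[r][c]) ** 2
    return total

def autofocus(stack):
    """Index of the sharpest reconstruction in a stack of refocused images."""
    return max(range(len(stack)), key=lambda i: sharpness(stack[i]))

# Toy stack: the middle "reconstruction" carries the strongest edges
blurred = [[0.4, 0.5, 0.6], [0.4, 0.5, 0.6], [0.4, 0.5, 0.6]]
sharp = [[0.0, 1.0, 0.0], [1.0, 0.0, 1.0], [0.0, 1.0, 0.0]]
best = autofocus([blurred, sharp, blurred])  # -> 1
```

    The reproducibility finding above corresponds to this maximum landing on (nearly) the same propagation distance across repeated reconstructions of moderately stained samples.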

  19. Rapid assessment of soil and groundwater tritium by vegetation sampling

    SciTech Connect

    Murphy, C.E. Jr.

    1995-09-01

    A rapid and relatively inexpensive technique for defining the extent of groundwater contamination by tritium has been investigated. The technique uses existing vegetation to sample the groundwater. Water taken up by deep rooted trees is collected by enclosing tree branches in clear plastic bags. Water evaporated from the leaves condenses on the inner surface of the bag. The water is removed from the bag with a syringe. The bags can be sampled many times. Tritium in the water is detected by liquid scintillation counting. The water collected in the bags has no color and counts as well as distilled water reference samples. The technique was used in an area of known tritium contamination and proved to be useful in defining the extent of tritium contamination.

  20. Assessment Study on Sensors and Automation in the Industries of the Future. Reports on Industrial Controls, Information Processing, Automation, and Robotics

    SciTech Connect

    Bennett, Bonnie; Boddy, Mark; Doyle, Frank; Jamshidi, Mo; Ogunnaike, Tunde

    2004-11-01

    This report presents the results of an expert study to identify research opportunities for Sensors & Automation, a sub-program of the U.S. Department of Energy (DOE) Industrial Technologies Program (ITP). The research opportunities are prioritized by realizable energy savings. The study encompasses the technology areas of industrial controls, information processing, automation, and robotics. These areas have been central areas of focus of many Industries of the Future (IOF) technology roadmaps. This report identifies opportunities for energy savings as a direct result of advances in these areas and also recognizes indirect means of achieving energy savings, such as product quality improvement, productivity improvement, and reduction of recycle.

  1. Metabolomic Quality Assessment of EDTA Plasma and Serum Samples.

    PubMed

    Malm, Linus; Tybring, Gunnel; Moritz, Thomas; Landin, Britta; Galli, Joakim

    2016-10-01

    Handling and processing of blood can significantly alter the molecular composition and consistency of biobank samples and can have a major impact on the identification of biomarkers. It is thus crucial to identify tools to determine the quality of samples to be used in biomarker discovery studies. In this study, a non-targeted gas chromatography/time-of-flight mass spectrometry (GC-TOFMS) metabolomic strategy was used with the aim of identifying quality markers for serum and plasma biobank collections lacking proper documentation of preanalytical handling. The effect of postcentrifugation delay was examined in serum stored in tubes with gel separation plugs and ethylenediaminetetraacetic acid (EDTA) plasma in tubes with or without gel separation plugs. The change in metabolic pattern was negligible in all sample types processed within 3 hours after centrifugation regardless of whether the samples were kept at 4°C or 22°C. After 8 and 24 hours postcentrifugation delay before aliquoting, there was a pronounced increase in the number of affected metabolites, as well as in the magnitude of the observed changes. No protective effect on the metabolites was observed in gel-separated EDTA plasma samples. In a separate series of experiments, lactate and glucose levels were determined in plasma to estimate the effect of precentrifugation delay. This separate experiment indicates that the lactate to glucose ratio may serve as a marker to identify samples with delayed time to centrifugation. Although our data from the untargeted GC-TOFMS analysis did not identify any specific markers, we conclude that plasma and serum metabolic profiles remain quite stable when plasma and serum are centrifuged and separated from the blood cells within 3 hours.

  2. Adaptive Sampling Algorithms for Probabilistic Risk Assessment of Nuclear Simulations

    SciTech Connect

    Diego Mandelli; Dan Maljovec; Bei Wang; Valerio Pascucci; Peer-Timo Bremer

    2013-09-01

    Nuclear simulations are often computationally expensive, time-consuming, and high-dimensional with respect to the number of input parameters. Thus exploring the space of all possible simulation outcomes is infeasible using finite computing resources. During simulation-based probabilistic risk analysis, it is important to discover the relationship between a potentially large number of input parameters and the output of a simulation using as few simulation trials as possible. This is a typical context for performing adaptive sampling, where a few observations are obtained from the simulation, a surrogate model is built to represent the simulation space, and new samples are selected based on the model constructed. The surrogate model is then updated based on the simulation results of the sampled points. In this way, we attempt to gain the most information possible with a small number of carefully selected sampled points, limiting the number of expensive trials needed to understand features of the simulation space. We analyze the specific use case of identifying the limit surface, i.e., the boundaries in the simulation space between system failure and system success. In this study, we explore several techniques for adaptively sampling the parameter space in order to reconstruct the limit surface. We focus on several adaptive sampling schemes. First, we seek to learn a global model of the entire simulation space using prediction models or neighborhood graphs and extract the limit surface as an iso-surface of the global model. Second, we estimate the limit surface by sampling in the neighborhood of the current estimate based on topological segmentations obtained locally. Our techniques draw inspiration from the topological structure known as the Morse-Smale complex. We highlight the advantages and disadvantages of using a global prediction model versus a local topological view of the simulation space, comparing several different strategies for adaptive sampling in both
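
    The sample-model-resample loop for limit-surface identification can be sketched in one dimension, where the "surrogate" reduces to the labels of already-sampled points and the most informative next sample is the midpoint of the narrowest interval whose endpoints disagree. This is a drastically simplified stand-in for the paper's surrogate models and topological segmentations:

```python
def adaptive_limit_surface(simulate, lo, hi, n_init=4, n_adapt=10):
    """Adaptively locate the limit surface (success/failure boundary) of a
    1D simulation. A coarse sweep seeds the sampled points; each adaptive
    step then runs the simulator at the midpoint of the narrowest interval
    whose endpoints disagree, i.e. where the boundary must lie."""
    xs = [lo + i * (hi - lo) / (n_init - 1) for i in range(n_init)]
    labels = {x: simulate(x) for x in xs}
    for _ in range(n_adapt):
        pts = sorted(labels)
        gaps = [(b - a, a, b) for a, b in zip(pts, pts[1:])
                if labels[a] != labels[b]]
        if not gaps:          # no boundary bracketed in [lo, hi]
            return None
        _, a, b = min(gaps)   # narrowest disagreeing interval
        mid = (a + b) / 2
        labels[mid] = simulate(mid)
    pts = sorted(labels)
    for a, b in zip(pts, pts[1:]):
        if labels[a] != labels[b]:
            return (a + b) / 2    # boundary estimate
    return None

# Toy "simulation": the system succeeds only below an input of 0.62
estimate = adaptive_limit_surface(lambda x: x < 0.62, 0.0, 1.0)
```

    With 14 simulator calls this brackets the boundary to well under 1% of the input range — the same economy of expensive trials that motivates adaptive sampling in higher dimensions.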

  3. Quantifying Vocal Mimicry in the Greater Racket-Tailed Drongo: A Comparison of Automated Methods and Human Assessment

    PubMed Central

    Agnihotri, Samira; Sundeep, P. V. D. S.; Seelamantula, Chandra Sekhar; Balakrishnan, Rohini

    2014-01-01

    Objective identification and description of mimicked calls is a primary component of any study on avian vocal mimicry, but few studies have adopted a quantitative approach. We used spectral feature representations commonly used in human speech analysis in combination with various distance metrics to distinguish between mimicked and non-mimicked calls of the greater racket-tailed drongo, Dicrurus paradiseus, and cross-validated the results with human assessment of spectral similarity. We found that the automated method and human subjects performed similarly in terms of the overall number of correct matches of mimicked calls to putative model calls. However, the two methods also misclassified different subsets of calls, and we achieved a maximum accuracy of ninety-five per cent only when we combined the results of both methods. This study is the first to use Mel-frequency Cepstral Coefficients and Relative Spectral Amplitude-filtered Linear Predictive Coding coefficients to quantify vocal mimicry. Our findings also suggest that in spite of several advances in automated methods of song analysis, corresponding cross-validation by humans remains essential. PMID:24603717
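
    The core matching step — assign each mimicked call to the closest putative model call under a distance metric over spectral features — can be sketched as follows. The species names and three-coefficient vectors are hypothetical stand-ins for the paper's MFCC/RASTA-LPC features:

```python
from math import sqrt

def cosine_distance(u, v):
    """1 minus the cosine similarity between two feature vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = sqrt(sum(a * a for a in u))
    norm_v = sqrt(sum(b * b for b in v))
    return 1 - dot / (norm_u * norm_v)

def match_to_model(call, model_calls):
    """Assign a mimicked call to the putative model species whose spectral
    feature vector is closest by cosine distance."""
    return min(model_calls,
               key=lambda name: cosine_distance(call, model_calls[name]))

# Hypothetical feature vectors for two candidate model species:
models = {"koel": [1.0, 0.1, 0.0], "babbler": [0.0, 0.2, 1.0]}
species = match_to_model([0.9, 0.2, 0.1], models)  # -> "koel"
```

    Swapping in other metrics (Euclidean, Mahalanobis) at the `cosine_distance` slot is how the "various distance metrics" comparison would be run.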

  4. Automated 3D quantitative assessment and measurement of alpha angles from the femoral head-neck junction using MR imaging

    NASA Astrophysics Data System (ADS)

    Xia, Ying; Fripp, Jurgen; Chandra, Shekhar S.; Walker, Duncan; Crozier, Stuart; Engstrom, Craig

    2015-10-01

    To develop an automated approach for 3D quantitative assessment and measurement of alpha angles from the femoral head-neck (FHN) junction using bone models derived from magnetic resonance (MR) images of the hip joint. Bilateral MR images of the hip joints were acquired from 30 male volunteers (healthy active individuals and high-performance athletes, aged 18-49 years) using a water-excited 3D dual echo steady state (DESS) sequence. In a subset of these subjects (18 water-polo players), additional True Fast Imaging with Steady-state Precession (TrueFISP) images were acquired from the right hip joint. For both MR image sets, an active shape model based algorithm was used to generate automated 3D bone reconstructions of the proximal femur. Subsequently, a local coordinate system of the femur was constructed to compute a 2D shape map to project femoral head sphericity for calculation of alpha angles around the FHN junction. To evaluate automated alpha angle measures, manual analyses were performed on anterosuperior and anterior radial MR slices from the FHN junction that were automatically reformatted using the constructed coordinate system. High intra- and inter-rater reliability (intra-class correlation coefficients > 0.95) was found for manual alpha angle measurements from the auto-extracted anterosuperior and anterior radial slices. Strong correlations were observed between manual and automatic measures of alpha angles for anterosuperior (r = 0.84) and anterior (r = 0.92) FHN positions. For matched DESS and TrueFISP images, there were no significant differences between automated alpha angle measures obtained from the upper anterior quadrant of the FHN junction (two-way repeated measures ANOVA, F < 0.01, p = 0.98). Our automatic 3D method analysed MR images of the hip joints to generate alpha angle measures around the FHN junction circumference with very good reliability and reproducibility. This work has the
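The alpha angle itself is a simple geometric quantity: the angle at the fitted head centre between the femoral neck axis and the first contour point where the head departs from sphericity. A toy 2D version (the tolerance, contour, and geometry here are illustrative, not the authors' 3D shape-model pipeline):

```python
import numpy as np

def alpha_angle(head_center, head_radius, neck_point, contour, tol=1.02):
    """Angle at the head centre between the neck axis and the first contour
    point whose distance from the centre exceeds the fitted head radius.
    Illustrative 2D version; tol is a hypothetical noise margin."""
    c = np.asarray(head_center, float)
    axis = np.asarray(neck_point, float) - c
    for p in contour:
        v = np.asarray(p, float) - c
        if np.linalg.norm(v) > head_radius * tol:
            cosang = np.dot(axis, v) / (np.linalg.norm(axis) * np.linalg.norm(v))
            return float(np.degrees(np.arccos(np.clip(cosang, -1.0, 1.0))))
    return None  # head stays spherical along the whole contour

# Toy contour: unit circle swept from the neck side, with a "cam" bump at 120 deg.
thetas = np.radians(np.arange(180, -1, -10))
contour = [(np.cos(a), np.sin(a)) for a in thetas]
contour[6] = (1.1 * np.cos(np.radians(120)), 1.1 * np.sin(np.radians(120)))
angle = alpha_angle((0.0, 0.0), 1.0, (-2.0, 0.0), contour)
```

With the bump 60 degrees from the neck axis, the sketch returns an alpha angle of 60 degrees.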

  5. Genesis Solar Wind Collector Cleaning Assessment: 60366 Sample Case Study

    NASA Technical Reports Server (NTRS)

    Goreva, Y. S.; Gonzalez, C. P.; Kuhlman, K. R.; Burnett, D. S.; Woolum, D.; Jurewicz, A. J.; Allton, J. H.; Rodriguez, M. C.; Burkett, P. J.

    2014-01-01

    In order to recognize, localize, characterize and remove particle and thin film surface contamination, a small subset of Genesis mission collector fragments are being subjected to extensive study via various techniques [1-5]. Here we present preliminary results for sample 60336, a Czochralski silicon (Si-CZ) based wafer from the bulk array (B/C).

  6. Automated system for generation of soil moisture products for agricultural drought assessment

    NASA Astrophysics Data System (ADS)

    Raja Shekhar, S. S.; Chandrasekar, K.; Sesha Sai, M. V. R.; Diwakar, P. G.; Dadhwal, V. K.

    2014-11-01

    Drought is a frequently occurring disaster affecting the lives of millions of people across the world every year. Several parameters, indices and models are used globally for forecasting and early warning of drought, and for monitoring its prevalence, persistence and severity. Since drought is a complex phenomenon, a large number of parameters and indices need to be evaluated to sufficiently address the problem. It is a challenge to generate input parameters from different sources such as space-based data, ground data and collateral data in short intervals of time, where there may be limitations in terms of processing power, availability of domain expertise, and specialized models and tools. In this study, an effort has been made to automate the derivation of one of the important parameters in drought studies, viz. soil moisture. The soil water balance bucket model is widely used to arrive at soil moisture products and is popular for its sensitivity to soil conditions and rainfall parameters. This model has been encoded into a "Fish-Bone" architecture using COM technologies and open-source libraries for the best possible automation, to fulfill the need for a standard procedure for preparing input parameters and processing routines. The main aim of the system is to provide an operational environment for generation of soil moisture products, allowing users to concentrate on further enhancements and implementation of these parameters in related areas of research without re-discovering the established models. The architecture relies mainly on available open-source libraries for GIS and raster IO operations for different file formats, to ensure that the products can be widely distributed without the burden of any commercial dependencies. Further, the system is automated to the extent of user-free operation if required, with inbuilt chain processing for daily generation of products at specified intervals. The operational software has inbuilt capabilities to automatically
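A bucket water-balance model of the kind referred to above can be sketched in a few lines: rainfall fills a single soil store, overflow becomes runoff, and evapotranspiration draws the store down in proportion to its relative wetness. The capacity, initial storage, and ET scaling below are hypothetical, not the study's calibration:

```python
def bucket_soil_moisture(rainfall, pet, capacity=150.0, initial=75.0):
    """Single-layer 'bucket' water balance (all values in mm): rainfall fills
    the bucket, overflow runs off, and actual evapotranspiration is potential
    ET scaled by relative wetness. Capacity and initial storage are illustrative."""
    storage = initial
    series = []
    for rain, pot_et in zip(rainfall, pet):
        storage += rain
        storage = min(storage, capacity)          # excess becomes runoff
        storage -= pot_et * (storage / capacity)  # ET declines as the soil dries
        storage = max(storage, 0.0)
        series.append(storage)
    return series

# Three hypothetical days: a 100 mm storm, then two dry days (PET 5 mm/day).
daily = bucket_soil_moisture(rainfall=[100.0, 0.0, 20.0], pet=[5.0, 5.0, 5.0])
```

Automating a daily product then amounts to running this update once per day over gridded rainfall and PET inputs.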

  7. AUTOMATED GEOSPATIAL WATERSHED ASSESSMENT: A GIS-BASED HYDROLOGIC MODELING TOOL

    EPA Science Inventory

    Planning and assessment in land and water resource management are evolving toward complex, spatially explicit regional assessments. These problems have to be addressed with distributed models that can compute runoff and erosion at different spatial and temporal scales. The extens...

  8. Assessing the Validity of Automated Webcrawlers as Data Collection Tools to Investigate Online Child Sexual Exploitation.

    PubMed

    Westlake, Bryce; Bouchard, Martin; Frank, Richard

    2015-11-26

    The distribution of child sexual exploitation (CE) material has been aided by the growth of the Internet. The graphic nature and prevalence of the material have made researching and combating it difficult. Although used to study online CE distribution, automated data collection tools (e.g., webcrawlers) have yet to be shown effective at targeting only relevant data. Using CE-related image and keyword criteria, we compare networks starting from CE websites to those from similar non-CE sexuality websites and dissimilar sports websites. Our results provide evidence that (a) webcrawlers have the potential to provide valid CE data, if the appropriate criterion is selected; (b) CE distribution is still heavily image-based, suggesting images are an effective criterion; (c) CE-seeded networks are more hub-based and differ from non-CE-seeded networks on several website characteristics. Recommendations for improvements to reliable criteria selection are discussed.

  9. An Accuracy Assessment of Automated Photogrammetric Techniques for 3d Modeling of Complex Interiors

    NASA Astrophysics Data System (ADS)

    Georgantas, A.; Brédif, M.; Pierrot-Desseilligny, M.

    2012-07-01

    This paper presents a comparison of automatic photogrammetric techniques to terrestrial laser scanning for 3D modelling of complex interior spaces. We evaluate the automated photogrammetric techniques not only in terms of their geometric quality compared to laser scanning but also in terms of monetary cost and acquisition and computational time. For this purpose we chose a modern building's stairway as a test site. APERO/MICMAC (©IGN), an open-source photogrammetric software suite, was used for the production of the 3D photogrammetric point cloud, which was compared to the one acquired by a Leica ScanStation 2 laser scanner. After performing various qualitative and quantitative controls we present the advantages and disadvantages of each 3D modelling method applied in a complex interior of a modern building.

  10. Development of Genesis Solar Wind Sample Cleanliness Assessment: Initial Report on Sample 60341 Optical Imagery and Elemental Mapping

    NASA Technical Reports Server (NTRS)

    Gonzalez, C. P.; Goreva, Y. S.; Burnett, D. S.; Woolum, D.; Jurewicz, A. J.; Allton, J. H.; Rodriguez, P. J.; Burkett, P. J.

    2014-01-01

    Since 2005 the Genesis science team has experimented with techniques for removing the contaminant particles and films from the collection surface of the Genesis fragments. A subset of 40 samples has been designated as "cleaning matrix" samples. These are small samples to which various cleaning approaches are applied and then cleanliness is assessed optically, by TRXRF, SEM, ToF-SIMS, XPS, ellipsometry or other means [1-9]. Most of these samples remain available for allocation, with cleanliness assessment data. This assessment allows evaluation of various cleaning techniques and handling or analytical effects. Cleaning techniques investigated by the Genesis community include acid/base etching, acetate replica peels, ion beam, and CO2 snow jet cleaning [10-16]. JSC provides surface cleaning using UV ozone exposure and ultra-pure water (UPW) [17-20]. The UPW rinse is commonly used to clean samples of handling debris between processing by different researchers. Optical microscopic images of the sample taken before and after UPW cleaning show what has been added or removed during the cleaning process.

  11. Assessing total and volatile solids in municipal solid waste samples.

    PubMed

    Peces, M; Astals, S; Mata-Alvarez, J

    2014-01-01

    Municipal solid waste is broadly generated in everyday activities and its treatment is a global challenge. Total solids (TS) and volatile solids (VS) are typical control parameters measured in biological treatments. In this study, the TS and VS were determined using the standard methods, as well as introducing some variants: (i) the drying temperature for the TS assays was 105°C, 70°C or 50°C and (ii) the VS were determined using different heating ramps from room temperature to 550°C. TS could be determined at either 105°C or 70°C, but oven residence time was tripled at 70°C, increasing from 48 to 144 h. The VS could be determined by smouldering the sample (where the sample is burnt without a flame), which avoids the release of fumes and odours in the laboratory. However, smouldering can generate undesired pyrolysis products as a consequence of carbonization, which leads to VS being underestimated. Carbonization can be avoided by using slow heating ramps to prevent oxygen limitation. Furthermore, crushing the sample cores decreased the time to reach constant weight and decreased the potential to underestimate VS.
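The arithmetic behind these assays reduces to two mass ratios; a small sketch using the standard definitions (not tied to this paper's temperature variants):

```python
def total_solids(wet_mass_g, dry_mass_g):
    """TS fraction: residue after oven drying over the wet sample mass."""
    return dry_mass_g / wet_mass_g

def volatile_solids(dry_mass_g, ash_mass_g):
    """VS fraction of TS: mass lost on ignition at 550 C over the dry mass."""
    return (dry_mass_g - ash_mass_g) / dry_mass_g

# Hypothetical weighings: 100 g wet sample, 40 g after drying, 10 g ash.
ts = total_solids(100.0, 40.0)    # 0.40 of wet mass is solids
vs = volatile_solids(40.0, 10.0)  # 0.75 of those solids are volatile
```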

  12. Determination of the Biologically Relevant Sampling Depth for Terrestrial and Aquatic Ecological Risk Assessments (Final Report)

    EPA Science Inventory

    The Ecological Risk Assessment Support Center (ERASC) announced the release of the final report, Determination of the Biologically Relevant Sampling Depth for Terrestrial and Aquatic Ecological Risk Assessments. This technical paper provides defensible approximations fo...

  13. ON-LINE TOOLS FOR PROPER VERTICAL POSITIONING OF VERTICAL SAMPLING INTERVALS DURING SITE ASSESSMENT

    EPA Science Inventory

    This presentation describes on-line tools for proper vertical positioning of sampling intervals during site assessment. Proper vertical sample interval selection is critical for generating data on the vertical distribution of contamination. Without vertical delineation, th...

  14. Design and construction of a medium-scale automated direct measurement respirometric system to assess aerobic biodegradation of polymers

    NASA Astrophysics Data System (ADS)

    Castro Aguirre, Edgar

    A medium-scale automated direct measurement respirometric (DMR) system was designed and built to assess the aerobic biodegradation of up to 30 materials in triplicate simultaneously. In addition, a computer application was developed for rapid analysis of the data generated. The developed DMR system was able to simulate different testing conditions by varying temperature and relative humidity, which are the major exposure conditions affecting biodegradation. Two complete tests for determining the aerobic biodegradation of polymers under composting conditions were performed to show the efficacy and efficiency of both the DMR system and the DMR data analyzer. In the two tests, cellulose reached 70% mineralization after 139 and 45 days, respectively. The difference in time for cellulose to reach 70% mineralization was attributed to the composition of the compost and water availability, which strongly affect the biodegradation rate. Finally, among the tested materials, at least 60% of the organic carbon content of the biodegradable polymers was converted into carbon dioxide by the end of the test.
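Percent mineralization in such respirometric tests is conventionally computed from the CO2 evolved relative to the test material's organic carbon; a sketch under that standard assumption (the quantities below are hypothetical, not this study's measurements):

```python
def percent_mineralization(co2_sample_g, co2_blank_g, organic_carbon_g):
    """Mineralization (%) as in standard aerobic-biodegradation respirometry:
    carbon evolved as CO2 (test reactor minus compost-only blank, converted
    to carbon via the 12/44 mass ratio) over the material's organic carbon."""
    carbon_evolved_g = (co2_sample_g - co2_blank_g) * (12.0 / 44.0)
    return 100.0 * carbon_evolved_g / organic_carbon_g

# Hypothetical totals: 44 g CO2 from the sample reactor, 22 g from the blank,
# 6 g organic carbon in the test material -> fully mineralized.
m = percent_mineralization(44.0, 22.0, 6.0)
```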

  15. Towards automated early cancer detection: Non-invasive, fluorescence-based approaches for quantitative assessment of cells and tissue to identify pre-cancers

    NASA Astrophysics Data System (ADS)

    Levitt, Jonathan Michael

    Cancer is the second leading cause of death globally, behind only heart disease. As in many diseases, patient survival is directly related to how early lesions are detected. Using conventional screening methods, the early changes associated with cancer, which occur on the microscopic scale, can easily go overlooked. Due to the inherent drawbacks of conventional techniques, we present non-invasive, optically based methods to acquire high-resolution images from live samples and assess cellular function associated with the onset of disease. Specifically, we acquired fluorescence images from NADH and FAD to quantify morphology and metabolic activity. We first conducted studies to monitor monolayers of keratinocytes undergoing apoptosis, a process that is disrupted during cancer progression. We found that as keratinocytes undergo apoptosis, populations of mitochondria exhibiting higher metabolic activity become progressively confined to a gradually smaller perinuclear region. To further assess the changes associated with early cancer growth we developed automated methods to rapidly quantify fluorescence images and extract morphological and metabolic information from live tissue. In this study, we simultaneously quantified mitochondrial organization, metabolic activity, nuclear size distribution, and the localization of the structural protein keratin, to differentiate between normal and pre-cancerous engineered tissues. We found the degree of mitochondrial organization, as determined from the fractal-derived Hurst parameter, was well correlated with the level of cellular differentiation. We also found that the metabolic activity in the pre-cancerous cells was greater and more consistent throughout tissue depths in comparison to normal tissue. Keratin localization, also quantified from the fluorescence images, was confined to the uppermost layers of normal tissue, while it was more evenly distributed in the precancerous tissues. To

  16. Locoregional Control of Non-Small Cell Lung Cancer in Relation to Automated Early Assessment of Tumor Regression on Cone Beam Computed Tomography

    SciTech Connect

    Brink, Carsten; Bernchou, Uffe; Bertelsen, Anders; Hansen, Olfred; Schytte, Tine; Bentzen, Soren M.

    2014-07-15

    Purpose: Large interindividual variations in volume regression of non-small cell lung cancer (NSCLC) are observable on standard cone beam computed tomography (CBCT) during fractionated radiation therapy. Here, a method for automated assessment of tumor volume regression is presented and its potential use in response-adapted personalized radiation therapy is evaluated empirically. Methods and Materials: Automated deformable registration with calculation of the Jacobian determinant was applied to serial CBCT scans in a series of 99 patients with NSCLC. Tumor volume at the end of treatment was estimated on the basis of the first one third and two thirds of the scans. The concordance between estimated and actual relative volume at the end of radiation therapy was quantified by Pearson's correlation coefficient. On the basis of the estimated relative volume, the patients were stratified into 2 groups having volume regressions below or above the population median value. Kaplan-Meier plots of locoregional disease-free rate and overall survival in the 2 groups were used to evaluate the predictive value of tumor regression during treatment. A Cox proportional hazards model was used to adjust for other clinical characteristics. Results: Automatic measurement of the tumor regression from standard CBCT images was feasible. Pearson's correlation coefficient between manual and automatic measurement was 0.86 in a sample of 9 patients. Most patients experienced tumor volume regression, and this could be quantified early into the treatment course. Interestingly, patients with pronounced volume regression had worse locoregional tumor control and overall survival. This effect was significant in patients with non-adenocarcinoma histology. Conclusions: Evaluation of routinely acquired CBCT images during radiation therapy provides biological information on the specific tumor. This could potentially form the basis for personalized response-adaptive therapy.
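Once deformable registration yields a Jacobian determinant field, the relative tumour volume follows from averaging det J over the baseline tumour mask; a minimal sketch of that step (illustrative, not the authors' implementation):

```python
import numpy as np

def relative_volume(jacobian_det, baseline_mask):
    """Relative tumour volume V_end / V_start after deformable registration:
    the mean Jacobian determinant over the baseline tumour mask
    (det J < 1 everywhere means the tumour shrank)."""
    j = np.asarray(jacobian_det, float)
    m = np.asarray(baseline_mask, bool)
    return float(j[m].mean())

# Toy field: uniform 20% local volume shrinkage inside a cubic "tumour".
jac = np.full((4, 4, 4), 0.8)
mask = np.ones((4, 4, 4), dtype=bool)
rv = relative_volume(jac, mask)
```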

  17. A New Capability for Automated Target Selection and Sampling for use with Remote Sensing Instruments on the MER Rovers

    NASA Astrophysics Data System (ADS)

    Castano, R.; Estlin, T.; Anderson, R. C.; Gaines, D.; Bornstein, B.; de Granville, C.; Tang, B.; Thompson, D.; Judd, M.

    2008-12-01

    The Onboard Autonomous Science Investigation System (OASIS) evaluates geologic data gathered by a planetary rover. The system is designed to operate onboard a rover identifying and reacting to serendipitous science opportunities, such as rocks with novel properties. OASIS operates by analyzing data the rover gathers, and then using machine learning techniques, prioritizing the data based on criteria set by the science team. This prioritization can be used to organize data for transmission back to Earth and it can be used to search for specific targets it has been told to find by the science team. If one of these targets is found, it is identified as a new science opportunity and a "science alert" is sent to a planning and scheduling system. After reviewing the rover's current operational status to ensure that it has enough resources to complete its traverse and act on the new science opportunity, OASIS can change the command sequence of the rover in order to obtain additional science measurements. Currently, OASIS is being applied on a new front. OASIS is providing a new rover mission technology that enables targeted remote-sensing science in an automated fashion during or after rover traverses. Currently, targets for remote sensing instruments, especially narrow field-of-view instruments (such as the MER Mini-TES spectrometer or the 2009 MSL ChemCam spectrometer) must be selected manually based on imagery already on the ground with the operations team. OASIS will enable the rover flight software to analyze imagery onboard in order to autonomously select and sequence targeted remote-sensing observations in an opportunistic fashion. We are in the process of scheduling an onboard MER experiment to demonstrate the OASIS capability in early 2009.

  18. A lab-on-a-chip system integrating tissue sample preparation and multiplex RT-qPCR for gene expression analysis in point-of-care hepatotoxicity assessment.

    PubMed

    Lim, Geok Soon; Chang, Joseph S; Lei, Zhang; Wu, Ruige; Wang, Zhiping; Cui, Kemi; Wong, Stephen

    2015-10-21

    A truly practical lab-on-a-chip (LOC) system for point-of-care testing (POCT) hepatotoxicity assessment necessitates the embodiment of full-automation, ease-of-use and "sample-in-answer-out" diagnostic capabilities. To date, the reported microfluidic devices for POCT hepatotoxicity assessment remain rudimentary as they largely embody only semi-quantitative or single sample/gene detection capabilities. In this paper, we describe, for the first time, an integrated LOC system that is somewhat close to a practical POCT hepatotoxicity assessment device - it embodies both tissue sample preparation and multiplex real-time RT-PCR. It features semi-automation, is relatively easy to use, and has "sample-in-answer-out" capabilities for multiplex gene expression analysis. Our tissue sample preparation module incorporating both a microhomogenizer and surface-treated paramagnetic microbeads yielded high purity mRNA extracts, considerably better than manual means of extraction. A primer preloading surface treatment procedure and the single-loading inlet on our multiplex real-time RT-PCR module simplify off-chip handling procedures for ease-of-use. To demonstrate the efficacy of our LOC system for POCT hepatotoxicity assessment, we perform a preclinical animal study with the administration of cyclophosphamide, followed by gene expression analysis of two critical protein biomarkers for liver function tests, aspartate transaminase (AST) and alanine transaminase (ALT). Our experimental results depict normalized fold changes of 1.62 and 1.31 for AST and ALT, respectively, illustrating up-regulations in their expression levels and hence validating their selection as critical genes of interest. In short, we illustrate the feasibility of multiplex gene expression analysis in an integrated LOC system as a viable POCT means for hepatotoxicity assessment.
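The reported fold changes are the kind of quantity the standard 2^-ΔΔCt (Livak) method produces from RT-qPCR Ct values; a sketch of that calculation (the Ct values are hypothetical, and this is not necessarily the authors' exact normalization):

```python
def fold_change(ct_target_treated, ct_ref_treated,
                ct_target_control, ct_ref_control):
    """Relative expression by the 2^-ddCt (Livak) method: the target gene's
    Ct is normalized to a reference gene within each condition, then the
    treated condition is compared to control."""
    d_ct_treated = ct_target_treated - ct_ref_treated
    d_ct_control = ct_target_control - ct_ref_control
    return 2.0 ** -(d_ct_treated - d_ct_control)

# Hypothetical Ct values: relative to the reference gene, the target crosses
# threshold one cycle earlier in the treated sample -> two-fold up-regulation.
fc = fold_change(24.0, 20.0, 25.0, 20.0)
```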

  19. Evaluation of automated direct sample introduction with comprehensive two-dimensional gas chromatography/time-of-flight mass spectrometry for the screening analysis of dioxins in fish oil.

    PubMed

    Hoh, Eunha; Lehotay, Steven J; Mastovska, Katerina; Huwe, Janice K

    2008-08-01

    An automated direct sample introduction technique coupled to comprehensive two-dimensional gas chromatography-time of flight mass spectrometry (DSI-GC x GC/TOF-MS) was applied for the development of a relatively fast and easy analytical screening method for 17 polychlorinated dibenzo-p-dioxins/dibenzofurans (PCDD/Fs) and 4 non-ortho polychlorinated biphenyls (PCBs) in fish oil. Comparison of instrumental performance between DSI-GC x GC/TOF-MS and the traditional gas chromatographic high resolution mass spectrometric (GC-HRMS) method showed good agreement of results for standard solutions analyzed in blind fashion. Relatively high tolerance of the DSI technique for lipids in the final extracts enabled a streamlined sample preparation procedure that only required gel permeation chromatography (GPC) and solid-phase extraction (SPE) cleanup with graphitized carbon black. The sample size for the method was 2 g of cod liver oil, which achieved limits of quantitation (LOQs) of 0.019-7.8 pg/g toxic equivalent quotients for the individual PCDD/Fs. Lower detection limits can be achieved by using a larger sample size and scaling up the sample preparation procedure, but this adds to the labor, time, solvent consumption, and expense of the approach. However, the streamlined method yielded 0.94 pg/g and 2.3 pg/g LOQs for 2,3,7,8-tetrachloro dibenzofuran (TCDF) and 3,3',4,4',5-pentachloro biphenyl (CB126), which were sufficiently low for regulatory monitoring of 2-g samples. Therefore, instead of congener-specific analysis, this streamlined analytical screening method for TCDF and CB126 has the potential to monitor fish oil contaminated with dioxin and dioxin-like PCBs at or above current food safety limits. Acceptable recoveries for nearly all analytes at three different spiking levels in fish oil samples were achieved with good repeatability.
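Toxic equivalent quotients like those quoted are TEF-weighted sums of congener concentrations; a sketch using the WHO 2005 TEF values for the congeners named above (the concentrations in the example are hypothetical):

```python
# WHO 2005 toxic equivalency factors for congeners mentioned in the abstract.
TEF = {"2,3,7,8-TCDD": 1.0, "2,3,7,8-TCDF": 0.1, "PCB-126": 0.1}

def toxic_equivalents(concentrations_pg_g):
    """TEQ (pg/g): congener concentrations weighted by their TEFs and summed."""
    return sum(conc * TEF[name] for name, conc in concentrations_pg_g.items())

# Hypothetical fish-oil result: 10 pg/g TCDF and 5 pg/g PCB-126 -> 1.5 pg/g TEQ.
teq = toxic_equivalents({"2,3,7,8-TCDF": 10.0, "PCB-126": 5.0})
```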

  20. Aquatic hazard assessment of a commercial sample of naphthenic acids.

    PubMed

    Swigert, James P; Lee, Carol; Wong, Diana C L; White, Russell; Scarlett, Alan G; West, Charles E; Rowland, Steven J

    2015-04-01

    This paper presents chemical composition and aquatic toxicity characteristics of a commercial sample of naphthenic acids (NAs). Naphthenic acids are derived from the refining of petroleum middle distillates and can contribute to refinery effluent toxicity. NAs are also present in oil sands process-affected water (OSPW), but differences in the NAs compositions from these sources precludes using a common aquatic toxicity dataset to represent the aquatic hazards of NAs from both origins. Our chemical characterization of a commercial sample of NAs showed it to contain in order of abundance, 1-ring>2-ring>acyclic>3-ring acids (∼84%). Also present were monoaromatic acids (7%) and non-acids (9%, polyaromatic hydrocarbons and sulfur heterocyclic compounds). While the acyclic acids were only the third most abundant group, the five most abundant individual compounds were identified as C(10-14) n-acids (n-decanoic acid to n-tetradecanoic acid). Aquatic toxicity testing of fish (Pimephales promelas), invertebrate (Daphnia magna), algae (Pseudokirchneriella subcapitata), and bacteria (Vibrio fischeri) showed P. promelas to be the most sensitive species with 96-h LL50=9.0 mg L(-1) (LC50=5.6 mg L(-1)). Acute EL50 values for the other species ranged 24-46 mg L(-1) (EC50 values ranged 20-30 mg L(-1)). Biomimetic extraction via solid-phase-microextraction (BE-SPME) suggested a nonpolar narcosis mode of toxic action for D. magna, P. subcapitata, and V. fischeri. The BE analysis under-predicted fish toxicity, which indicates that a specific mode of action, besides narcosis, may be a factor for fishes.
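A quick way to estimate an LC50 (or EC50/EL50) from raw dose-response points is linear interpolation between the concentrations that bracket 50% effect; a screening-level sketch only (regulatory practice fits probit or log-logistic models, and the data below are hypothetical):

```python
def lc50_interpolated(concentrations, mortality_pct):
    """LC50 by linear interpolation between the two test concentrations that
    bracket 50% mortality. Assumes inputs are sorted by increasing
    concentration with non-decreasing mortality."""
    pairs = list(zip(concentrations, mortality_pct))
    for (c_lo, m_lo), (c_hi, m_hi) in zip(pairs, pairs[1:]):
        if m_lo <= 50.0 <= m_hi:
            frac = (50.0 - m_lo) / (m_hi - m_lo)
            return c_lo + frac * (c_hi - c_lo)
    return None  # 50% mortality never bracketed by the tested range

# Hypothetical dose-response points (mg/L, % mortality).
lc50 = lc50_interpolated([1.0, 10.0, 100.0], [0.0, 40.0, 80.0])
```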

  1. Simultaneous analysis of organochlorinated pesticides (OCPs) and polychlorinated biphenyls (PCBs) from marine samples using automated pressurized liquid extraction (PLE) and Power Prep™ clean-up.

    PubMed

    Helaleh, Murad I H; Al-Rashdan, Amal; Ibtisam, A

    2012-05-30

    An automated pressurized liquid extraction (PLE) method followed by Power Prep™ clean-up was developed for organochlorinated pesticide (OCP) and polychlorinated biphenyl (PCB) analysis in environmental marine samples of fish, squid, bivalves, shells, octopus and shrimp. OCPs and PCBs were simultaneously determined in a single chromatographic run using gas chromatography-mass spectrometry-negative chemical ionization (GC-MS-NCI). About 5 g of each biological marine sample was mixed with anhydrous sodium sulphate and placed in the extraction cell of the PLE system. PLE is controlled by means of a PC using DMS 6000 software. Purification of the extract was accomplished using automated Power Prep™ clean-up with a pre-packed disposable silica column (6 g) supplied by Fluid Management Systems (FMS). All OCPs and PCBs were eluted from the silica column using two types of solvent: 80 mL of hexane and a 50 mL mixture of hexane and dichloromethane (1:1). A wide variety of fish and shellfish were collected from the fish market and analyzed using this method. The total PCB concentrations were 2.53, 0.25, 0.24, 0.24, 0.17 and 1.38 ng g(-1) (w/w) for fish, squid, bivalves, shells, octopus and shrimp, respectively, and the corresponding total OCP concentrations were 30.47, 2.86, 0.92, 10.72, 5.13 and 18.39 ng g(-1) (w/w). Lipids were removed using an SX-3 Bio-Beads gel permeation chromatography (GPC) column. Analytical criteria such as recovery, reproducibility and repeatability were evaluated through a range of biological matrices.

  2. Fully automated sample preparation microsystem for genetic testing of hereditary hearing loss using two-color multiplex allele-specific PCR.

    PubMed

    Zhuang, Bin; Gan, Wupeng; Wang, Shuaiqin; Han, Junping; Xiang, Guangxin; Li, Cai-Xia; Sun, Jing; Liu, Peng

    2015-01-20

    A fully automated microsystem consisting of a disposable DNA extraction and PCR microchip, as well as a compact control instrument, has been successfully developed for genetic testing of hereditary hearing loss from human whole blood. DNA extraction and PCR were integrated into a single 15-μL reaction chamber, where a piece of filter paper was embedded for capturing genomic DNA, followed by in-situ PCR amplification without elution. Diaphragm microvalves actuated by external solenoids together with a "one-way" fluidic control strategy operated by a modular valve positioner and a syringe pump were employed to control the fluids and to seal the chamber during thermal cycling. Fully automated DNA extractions from as low as 0.3-μL human whole blood followed by amplifications of 59-bp β-actin fragments can be completed on the microsystem in about 100 min. Negative control tests that were performed between blood sample analyses proved the successful elimination of any contamination or carryover in the system. To more critically test the microsystem, a two-color multiplex allele-specific PCR (ASPCR) assay for detecting c.176_191del16, c.235delC, and c.299_300delAT mutations in GJB2 gene that accounts for hereditary hearing loss was constructed. Two allele-specific primers, one labeled with TAMRA for wild type and the other with FAM for mutation, were designed for each locus. DNA extraction from blood and ASPCR were performed on the microsystem, followed by an electrophoretic analysis on a portable microchip capillary electrophoresis system. Blood samples from a healthy donor and five persons with genetic mutations were all accurately analyzed with only two steps in less than 2 h.

  3. Computerized self-assessment of automated lesion segmentation in breast ultrasound: implication for CADx applied to findings in the axilla

    NASA Astrophysics Data System (ADS)

    Drukker, K.; Giger, M. L.

    2008-03-01

    We developed a self-assessment method in which the CADx system provided a confidence level for its lesion segmentations. The self-assessment was performed by a fuzzy-inference system based on 4 computer-extracted features of the computer-segmented lesions in a leave-one-case-out evaluation protocol. In instances where the initial segmentation received a low assessment rating, lesions were re-segmented using the same segmentation method but based on a user-defined region-of-interest. A total of 542 cases with 1133 lesions were collected in this study, and we focused here on the 97 normal lymph nodes in this dataset since these pose challenges for automated segmentation due to their inhomogeneous appearance. The percentage of all lesions with satisfactory segmentation (i.e., normalized overlap with the radiologist-delineated lesion >=0.3) was 85%. For normal lymph nodes, however, this percentage was only 36%. Of the lymph nodes, 53 received a low confidence rating (<0.3) for their initial segmentation. When those lymph nodes were re-segmented, the percentage with a satisfactory segmentation improved to 80.0%. Computer-assessed confidence levels demonstrated potential to 1) help radiologists decide whether to use or disregard CADx output, and 2) provide a guide for improvement of lesion segmentation.
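One common reading of the "normalized overlap" used as the satisfactory-segmentation criterion is intersection-over-union of the two binary masks; a sketch under that assumption (the study's exact formula may differ):

```python
import numpy as np

def normalized_overlap(mask_a, mask_b):
    """Overlap of two binary segmentation masks as intersection over union."""
    a = np.asarray(mask_a, dtype=bool)
    b = np.asarray(mask_b, dtype=bool)
    union = np.logical_or(a, b).sum()
    if union == 0:
        return 1.0  # two empty masks agree trivially
    return float(np.logical_and(a, b).sum() / union)

computer = [[1, 1], [0, 0]]  # toy computer segmentation
manual   = [[1, 0], [0, 0]]  # toy radiologist delineation
overlap = normalized_overlap(computer, manual)
```

Here the toy overlap of 0.5 would count as "satisfactory" under the paper's >=0.3 cutoff.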

  4. Automated Sample Preparation (ASP): Development of a Rapid Method to Sequentially Isolate Nucleic Acids and Protein from Any Sample Type by a Cartridge-Based System

    DTIC Science & Technology

    2013-11-27

    Specifically, CUBRC will design and manufacture a prototype cartridge(s) and test the prototype cartridge for its ability to isolate each analyte individually and in succession. Testing will be performed on both laboratory-derived samples

  5. Assessing Rotation-Invariant Feature Classification for Automated Wildebeest Population Counts

    PubMed Central

    Torney, Colin J.; Dobson, Andrew P.; Borner, Felix; Lloyd-Jones, David J.; Moyer, David; Maliti, Honori T.; Mwita, Machoke; Fredrick, Howard; Borner, Markus; Hopcraft, J. Grant C.

    2016-01-01

    Accurate and on-demand animal population counts are the holy grail for wildlife conservation organizations throughout the world because they enable fast and responsive adaptive management policies. While the collection of image data from camera traps, satellites, and manned or unmanned aircraft has advanced significantly, the detection and identification of animals within images remains a major bottleneck, since counting is primarily conducted by dedicated enumerators or citizen scientists. Recent developments in the field of computer vision suggest a potential resolution to this issue through the use of rotation-invariant object descriptors combined with machine learning algorithms. Here we implement an algorithm to detect and count wildebeest from aerial images collected in the Serengeti National Park in 2009 as part of the biennial wildebeest count. We find that the per-image error rates are greater than, but comparable to, two separate human counts. For the total count, the algorithm is more accurate than both manual counts, suggesting that human counters have a tendency to systematically over- or under-count images. While the accuracy of the algorithm is not yet at an acceptable level for fully automatic counts, our results show this method is a promising avenue for further research, and we highlight specific areas where future research should focus in order to develop fast and accurate enumeration of aerial count data. If combined with a bespoke image collection protocol, this approach may yield a fully automated wildebeest count in the near future. PMID:27227888

  6. An automated assay for the assessment of cardiac arrest in fish embryo.

    PubMed

    Puybareau, Elodie; Genest, Diane; Barbeau, Emilie; Léonard, Marc; Talbot, Hugues

    2017-02-01

    Fish embryo models are widely used in research, in fields including drug discovery and environmental toxicology. In this article, we propose an entirely automated assay to detect cardiac arrest in Medaka (Oryzias latipes) based on image analysis. We propose a multi-scale pipeline based on mathematical morphology. Starting from video sequences of entire wells in 24-well plates, we focus on the embryo, detect its heart, and ascertain whether or not the heart is beating based on intensity variation analysis. Our image analysis pipeline uses only commonly available operators. It has a low computational cost, allowing analysis at the same rate as acquisition. From an initial dataset of 3192 videos, 660 were discarded as unusable (20.7%): 655 of them correctly so (99.25%) and only 5 incorrectly (0.75%). The remaining 2532 videos were used for our test. On these, 45 errors were made, giving a success rate of 98.23%.
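The intensity-variation test at the core of such an assay can be sketched as follows; the coefficient-of-variation criterion and its threshold are invented for illustration, not taken from the paper:

```python
import numpy as np

def is_beating(roi_intensity, cv_threshold=0.01):
    """Classify a heart-region mean-intensity trace: beating vs. arrest.

    A beating heart modulates the ROI intensity periodically; a flat,
    noise-only trace indicates cardiac arrest. The threshold value is
    hypothetical and would need calibration on labeled videos.
    """
    x = np.asarray(roi_intensity, dtype=float)
    cv = x.std() / max(x.mean(), 1e-9)   # coefficient of variation
    return bool(cv > cv_threshold)

t = np.linspace(0, 10, 300)                           # 10 s of video frames
beating = 100 + 5 * np.sin(2 * np.pi * 2.0 * t)       # ~2 Hz heartbeat
arrest = 100 + 0.2 * np.random.default_rng(1).standard_normal(t.size)
```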

  7. Trypanosoma cruzi infectivity assessment in "in vitro" culture systems by automated cell counting.

    PubMed

    Liempi, Ana; Castillo, Christian; Cerda, Mauricio; Droguett, Daniel; Duaso, Juan; Barahona, Katherine; Hernández, Ariane; Díaz-Luján, Cintia; Fretes, Ricardo; Härtel, Steffen; Kemmerling, Ulrike

    2015-03-01

    Chagas disease is an endemic, neglected tropical disease in Latin America that is caused by the protozoan parasite Trypanosoma cruzi. In vitro models constitute the first experimental approach to study the physiopathology of the disease and to assay potential new trypanocidal agents. Here, we report and clearly describe the use of commercial software (MATLAB®) to quantify T. cruzi amastigotes and infected mammalian cells (BeWo), and compare this automated analysis with manual counting. There was no statistically significant difference between the manual and the automatic quantification of the parasite; the two methods showed a correlation analysis r² value of 0.9159. The most significant advantage of the automatic quantification was the efficiency of the analysis. The drawback of this automated cell-counting method was that some parasites were assigned to the wrong BeWo cell; however, such errors did not exceed 5% when adequate experimental conditions were chosen. We conclude that this quantification method constitutes an excellent tool for evaluating the parasite load in cells and is therefore an easy and reliable way to study parasite infectivity.

  8. [Assessment of AFP in amniotic fluid: comparison of three automated techniques].

    PubMed

    Leguy, Marie-Clémence; Tavares, Silvina Dos Reis; Tsatsaris, Vassili; Lewin, Fanny; Clauser, Eric; Guibourdenche, Jean

    2011-01-01

    Ultrasound scanning is useful to detect neural tube defects (NTD) but scarcely distinguishes between closed and open NTD, which have very different prognoses. An amniotic fluid puncture is thus mandatory to search for an increase in alpha-fetoprotein (AFP) levels and for the presence of acetylcholinesterase, which identifies open NTD. However, AFP levels fluctuate both with gestational age and with the assay used. Our aim was to establish normative values for AFP in amniotic fluid in the second half of pregnancy using three different immunoassays and to improve their clinical relevance. Amniotic fluid punctures were performed on 527 patients from 9 weeks of gestation (WG) to 37 WG, either for maternal age, trisomy 21 screening, or increased nuchal translucency (control group, n = 527), or for suspicion of neural tube or abdominal defect (n = 5). AFP was measured using the immunoassay developed for serum AFP on the Access 2 system, the Immulite 2000 and the Advia Centaur. Results were expressed in ng/mL, multiples of the median (MoM) and percentiles. AFP decreases 1.5-fold between 9 and 19 WG. When NTD was suspected, an increase in amniotic AFP was observed (from 2.5 MoM to 9.3 MoM), confirming an open NTD. In conclusion, the assay developed on these three automated analyzers is suitable for the measurement of AFP in amniotic fluid.
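Expressing a result as a multiple of the median (MoM) is a simple normalization by the gestational-age median, which is what makes results comparable across assays and gestational ages. A sketch with invented medians (real values must come from each laboratory's own reference data for the assay in use):

```python
# Hypothetical per-week amniotic-fluid AFP medians (ng/mL); these numbers
# are invented for illustration, not taken from the paper.
AF_AFP_MEDIANS = {14: 24000.0, 15: 21000.0, 16: 18000.0, 17: 15500.0}

def afp_mom(value_ng_ml, gestational_week):
    """Express an amniotic-fluid AFP result as a multiple of the median (MoM)."""
    return value_ng_ml / AF_AFP_MEDIANS[gestational_week]
```

With these invented medians, a result of 45,000 ng/mL at 16 WG is 2.5 MoM, at the lower end of the elevation range the paper reports for open NTD.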

  9. A filter paper-based microdevice for low-cost, rapid, and automated DNA extraction and amplification from diverse sample types.

    PubMed

    Gan, Wupeng; Zhuang, Bin; Zhang, Pengfei; Han, Junping; Li, Cai-Xia; Liu, Peng

    2014-10-07

    A plastic microfluidic device that integrates a filter disc as a DNA capture phase was successfully developed for low-cost, rapid and automated DNA extraction and PCR amplification from various raw samples. The microdevice was constructed by sandwiching a piece of Fusion 5 filter, as well as a PDMS (polydimethylsiloxane) membrane, between two PMMA (poly(methyl methacrylate)) layers. An automated DNA extraction from 1 μL of human whole blood can be finished on the chip in 7 minutes by sequentially aspirating NaOH, HCl, and water through the filter. The filter disc containing extracted DNA was then taken out directly for PCR. On-chip DNA purification from 0.25-1 μL of human whole blood yielded 8.1-21.8 ng of DNA, higher than those obtained using QIAamp® DNA Micro kits. To realize DNA extraction from raw samples, an additional sample loading chamber containing a filter net with an 80 μm mesh size was designed in front of the extraction chamber to accommodate sample materials. Real-world samples, including whole blood, dried blood stains on Whatman® 903 paper, dried blood stains on FTA™ cards, buccal swabs, saliva, and cigarette butts, can all be processed in the system in 8 minutes. In addition, multiplex amplification of 15 STR (short tandem repeat) loci and Sanger-based DNA sequencing of the 520 bp GJB2 gene were accomplished from the filters that contained extracted DNA from blood. To further prove the feasibility of integrating this extraction method with downstream analyses, "in situ" PCR amplifications were successfully performed in the DNA extraction chamber following DNA purification from blood and blood stains without DNA elution. Using a modified protocol to bond the PDMS and PMMA, our plastic PDMS devices withstood the PCR process without any leakage. This study represents a significant step towards the practical application of on-chip DNA extraction methods, as well as the development of fully integrated genetic analytical systems.

  10. Automated assessment of joint synovitis activity from medical ultrasound and power doppler examinations using image processing and machine learning methods

    PubMed Central

    Ziębiński, Adam

    2016-01-01

    Objectives Rheumatoid arthritis is the most common rheumatic disease involving arthritis, and causes substantial functional disability in approximately 50% of patients after 10 years. Accurate measurement of disease activity is crucial to provide adequate treatment and care to patients. This study focuses on a computer-aided diagnostic system that supports the assessment of synovitis severity. Material and methods This paper focuses on a computer-aided diagnostic system developed within a joint Polish–Norwegian research project on the automated assessment of the severity of synovitis. Semiquantitative ultrasound with power Doppler is a reliable and widely used method of assessing synovitis. Synovitis is estimated by the ultrasound examiner using a scoring system graded from 0 to 3. The activity score is estimated on the basis of the examiner's experience or standardized ultrasound atlases. The method needs trained medical personnel and the result can be affected by human error. Results The prototype of a computer-aided diagnostic system and the algorithms essential for the analysis of ultrasonic images of finger joints are the main scientific outputs of the MEDUSA project. The MEDUSA evaluation system prototype uses bone, skin, joint and synovitis area detectors for mutual, structural-model-based evaluation of synovitis. Finally, several algorithms that support the semi-automatic or automatic detection of the bone region were prepared, as well as a system that uses a statistical data processing approach to automatically localize the regions of interest. Conclusions Semiquantitative ultrasound with power Doppler is a reliable and widely used method of assessing synovitis. The activity score is estimated on the basis of the examiner's experience and the result can be affected by human error. In this paper we present the MEDUSA project, which is focused on a computer-aided diagnostic system that supports the assessment of synovitis severity.

  11. Assessing the Alcohol-BMI Relationship in a US National Sample of College Students

    ERIC Educational Resources Information Center

    Barry, Adam E.; Piazza-Gardner, Anna K.; Holton, M. Kim

    2015-01-01

    Objective: This study sought to assess the body mass index (BMI)-alcohol relationship among a US national sample of college students. Design: Secondary data analysis using the Fall 2011 National College Health Assessment (NCHA). Setting: A total of 44 US higher education institutions. Methods: Participants included a national sample of college…

  12. A fully automated effervescence assisted dispersive liquid-liquid microextraction based on a stepwise injection system. Determination of antipyrine in saliva samples.

    PubMed

    Medinskaia, Kseniia; Vakh, Christina; Aseeva, Darina; Andruch, Vasil; Moskvin, Leonid; Bulatov, Andrey

    2016-01-01

    A first attempt to automate effervescence-assisted dispersive liquid-liquid microextraction (EA-DLLME) is reported. The method is based on the aspiration of a sample and all required aqueous reagents into the stepwise injection analysis (SWIA) manifold, followed by simultaneous counterflow injection of the extraction solvent (dichloromethane) and of a mixture of the effervescence agent (0.5 mol L⁻¹ Na2CO3) and the proton donor solution (1 mol L⁻¹ CH3COOH). Formation of carbon dioxide microbubbles generated in situ leads to the dispersion of the extraction solvent throughout the aqueous sample and extraction of the analyte into the organic phase. Unlike conventional DLLME, EA-DLLME avoids both the addition of a dispersive solvent and the time-consuming centrifugation step for disruption of the cloudy state. Phase separation was achieved by gentle bubbling of a nitrogen stream (2 mL min⁻¹ for 2 min). The performance of the suggested approach is demonstrated by the determination of antipyrine in saliva samples. The procedure is based on the derivatization of antipyrine by nitrite ion, followed by EA-DLLME of 4-nitrosoantipyrine and subsequent UV-Vis detection using the SWIA manifold. The absorbance of the yellow-colored extract at 345 nm obeys Beer's law in the range of 1.5-100 µmol L⁻¹ of antipyrine in saliva. The LOD, calculated from a blank test based on 3σ, was 0.5 µmol L⁻¹.
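The 3σ blank-based LOD and the Beer's-law calibration the abstract describes can be reproduced with a few lines; all numerical values below are invented for illustration, only the 1.5-100 µmol/L linear range and the 345 nm wavelength come from the paper:

```python
import numpy as np

def lod_3sigma(blank_signals, slope):
    """Limit of detection as 3 * SD(blank) / calibration slope."""
    return 3 * np.std(blank_signals, ddof=1) / slope

# Invented calibration points: absorbance at 345 nm vs. concentration.
conc = np.array([1.5, 5.0, 10.0, 25.0, 50.0, 100.0])   # umol/L
absorb = 0.004 * conc + 0.002                           # Beer's-law line
slope, intercept = np.polyfit(conc, absorb, 1)

# Invented replicate blank absorbances.
blanks = np.array([0.0020, 0.0023, 0.0018, 0.0021, 0.0019])
lod = lod_3sigma(blanks, slope)                         # umol/L
```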

  13. Using integrated environmental modeling to automate a process-based Quantitative Microbial Risk Assessment

    EPA Science Inventory

    Integrated Environmental Modeling (IEM) organizes multidisciplinary knowledge that explains and predicts environmental-system response to stressors. A Quantitative Microbial Risk Assessment (QMRA) is an approach integrating a range of disparate data (fate/transport, exposure, an...

  14. Using Integrated Environmental Modeling to Automate a Process-Based Quantitative Microbial Risk Assessment (presentation)

    EPA Science Inventory

    Integrated Environmental Modeling (IEM) organizes multidisciplinary knowledge that explains and predicts environmental-system response to stressors. A Quantitative Microbial Risk Assessment (QMRA) is an approach integrating a range of disparate data (fate/transport, exposure, and...

  15. Using integrated environmental modeling to automate a process-based Quantitative Microbial Risk Assessment

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Integrated Environmental Modeling (IEM) organizes multidisciplinary knowledge that explains and predicts environmental-system response to stressors. A Quantitative Microbial Risk Assessment (QMRA) is an approach integrating a range of disparate data (fate/transport, exposure, and human health effect...

  16. An assessment of two automated snow water equivalent instruments during the WMO Solid Precipitation Intercomparison Experiment

    NASA Astrophysics Data System (ADS)

    Smith, Craig D.; Kontu, Anna; Laffin, Richard; Pomeroy, John W.

    2017-01-01

    During the World Meteorological Organization (WMO) Solid Precipitation Intercomparison Experiment (SPICE), automated measurements of snow water equivalent (SWE) were made at the Sodankylä (Finland), Weissfluhjoch (Switzerland) and Caribou Creek (Canada) SPICE sites during the northern hemispheric winters of 2013/14 and 2014/15. Supplementary intercomparison measurements were made at Fortress Mountain (Kananaskis, Canada) during the 2013/14 winter. The objectives of this analysis are to compare automated SWE measurements with a reference, comment on their performance and, where possible, to make recommendations on how to best use the instruments and interpret their measurements. Sodankylä, Caribou Creek and Fortress Mountain hosted a Campbell Scientific CS725 passive gamma radiation SWE sensor. Sodankylä and Weissfluhjoch hosted a Sommer Messtechnik SSG1000 snow scale. The CS725 operating principle is based on measuring the attenuation of soil emitted gamma radiation by the snowpack and relating the attenuation to SWE. The SSG1000 measures the mass of the overlying snowpack directly by using a weighing platform and load cell. Manual SWE measurements were obtained at the intercomparison sites on a bi-weekly basis over the accumulation-ablation periods using bulk density samplers. These manual measurements are considered to be the reference for the intercomparison. Results from Sodankylä and Caribou Creek showed that the CS725 generally overestimates SWE as compared to manual measurements by roughly 30-35 % with correlations (r2) as high as 0.99 for Sodankylä and 0.90 for Caribou Creek. The RMSE varied from 30 to 43 mm water equivalent (mm w.e.) and from 18 to 25 mm w.e. at Sodankylä and Caribou Creek, which had respective SWE maximums of approximately 200 and 120 mm w.e. The correlation at Fortress Mountain was 0.94 (RMSE of 48 mm w.e. with a maximum SWE of approximately 650 mm w.e.) with no systematic overestimation. The SSG1000 snow scale, having a different
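The bias, correlation, and RMSE statistics used in the intercomparison can be computed in a few lines; the series below is invented to mimic the reported ~30% CS725 overestimate, not taken from the SPICE data:

```python
import numpy as np

def compare_to_reference(auto_swe, manual_swe):
    """Bias (%), r^2, and RMSE of automated vs. manual (reference) SWE."""
    a = np.asarray(auto_swe, dtype=float)
    m = np.asarray(manual_swe, dtype=float)
    bias_pct = 100.0 * (a - m).mean() / m.mean()
    r2 = np.corrcoef(a, m)[0, 1] ** 2
    rmse = np.sqrt(np.mean((a - m) ** 2))
    return bias_pct, r2, rmse

# Invented accumulation-season series (mm w.e.), automated = 1.3 x manual.
manual = np.array([20.0, 50.0, 80.0, 120.0, 150.0])
auto = 1.3 * manual
bias, r2, rmse = compare_to_reference(auto, manual)
```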

  17. Detection of coronary calcifications from computed tomography scans for automated risk assessment of coronary artery disease

    SciTech Connect

    Isgum, Ivana; Rutten, Annemarieke; Prokop, Mathias; Ginneken, Bram van

    2007-04-15

    A fully automated method for coronary calcification detection from non-contrast-enhanced, ECG-gated multi-slice computed tomography (CT) data is presented. Candidates for coronary calcifications are extracted by thresholding and component labeling. These candidates include coronary calcifications, calcifications in the aorta and in the heart, and other high-density structures such as noise and bone. A dedicated set of 64 features is calculated for each candidate object. They characterize the object's spatial position relative to the heart and the aorta, for which an automatic segmentation scheme was developed, its size and shape, and its appearance, which is described by a set of approximated Gaussian derivatives for which an efficient computational scheme is presented. Three classification strategies were designed. The first one tested direct classification without feature selection. The second approach also utilized direct classification, but with feature selection. Finally, the third scheme employed two-stage classification. In a computationally inexpensive first stage, the most easily recognizable false positives were discarded. The second stage discriminated between more difficult to separate coronary calcium and other candidates. Performance of linear, quadratic, nearest neighbor, and support vector machine classifiers was compared. The method was tested on 76 scans containing 275 calcifications in the coronary arteries and 335 calcifications in the heart and aorta. The best performance was obtained employing a two-stage classification system with a k-nearest neighbor (k-NN) classifier and a feature selection scheme. The method detected 73.8% of coronary calcifications at the expense of on average 0.1 false positives per scan. A calcium score was computed for each scan and subjects were assigned one of four risk categories based on this score. The method assigned the correct risk category to 93.4% of all scans.
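The two-stage scheme, a computationally cheap rule that discards the most easily recognizable false positives followed by a k-NN decision on the harder candidates, can be sketched as follows; the features, cut-off, and toy training data are invented for illustration and are not the paper's 64-feature set:

```python
import numpy as np

def knn_predict(X_train, y_train, x, k=3):
    """Plain k-nearest-neighbour majority vote (Euclidean distance)."""
    d = np.linalg.norm(X_train - x, axis=1)
    votes = y_train[np.argsort(d)[:k]]
    return int(np.round(votes.mean()))

def two_stage_classify(x, X_train, y_train, max_intensity=2000.0):
    """Stage 1: cheap rule rejects obvious non-calcium candidates;
    stage 2: k-NN separates the harder cases. Feature 0 is taken to
    be a mean-intensity feature and the cut-off value is invented."""
    if x[0] > max_intensity:       # easy reject, no k-NN cost incurred
        return 0
    return knn_predict(X_train, y_train, x)

# Toy training set: [intensity, distance-to-coronary]; label 1 = coronary calcium.
X_train = np.array([[400.0, 1.0], [420.0, 2.0], [900.0, 8.0], [950.0, 9.0]])
y_train = np.array([1, 1, 0, 0])
```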

  18. A Large-Sample Test of a Semi-Automated Clavicle Search Engine to Assist Skeletal Identification by Radiograph Comparison.

    PubMed

    D'Alonzo, Susan S; Guyomarc'h, Pierre; Byrd, John E; Stephan, Carl N

    2017-01-01

    In 2014, a morphometric capability to search chest radiograph databases by quantified clavicle shape was published to assist skeletal identification. Here, we extend the validation tests conducted by increasing the search universe 18-fold, from 409 to 7361 individuals to determine whether there is any associated decrease in performance under these more challenging circumstances. The number of trials and analysts were also increased, respectively, from 17 to 30 skeletons, and two to four examiners. Elliptical Fourier analysis was conducted on clavicles from each skeleton by each analyst (shadowgrams trimmed from scratch in every instance) and compared to the search universe. Correctly matching individuals were found in shortlists of 10% of the sample 70% of the time. This rate is similar to, although slightly lower than, rates previously found for much smaller samples (80%). Accuracy and reliability are thereby maintained, even when the comparison system is challenged by much larger search universes.

  19. Higher-Order Exploratory Factor Analysis of the Reynolds Intellectual Assessment Scales with a Referred Sample

    ERIC Educational Resources Information Center

    Nelson, Jason M.; Canivez, Gary L.; Lindstrom, Will; Hatt, Clifford V.

    2007-01-01

    The factor structure of the Reynolds Intellectual Assessment Scales (RIAS; [Reynolds, C.R., & Kamphaus, R.W. (2003). "Reynolds Intellectual Assessment Scales". Lutz, FL: Psychological Assessment Resources, Inc.]) was investigated with a large (N=1163) independent sample of referred students (ages 6-18). More rigorous factor extraction criteria…

  20. Assessment of Social Cognition in Non-human Primates Using a Network of Computerized Automated Learning Device (ALDM) Test Systems

    PubMed Central

    Fagot, Joël; Marzouki, Yousri; Huguet, Pascal; Gullstrand, Julie; Claidière, Nicolas

    2015-01-01

    Fagot & Paleressompoulle1 and Fagot & Bonte2 have published an automated learning device (ALDM) for the study of cognitive abilities of monkeys maintained in semi-free ranging conditions. Data accumulated during the last five years have consistently demonstrated the efficiency of this protocol to investigate individual/physical cognition in monkeys, and have further shown that this procedure reduces stress level during animal testing3. This paper demonstrates that networks of ALDM can also be used to investigate different facets of social cognition and in-group expressed behaviors in monkeys, and describes three illustrative protocols developed for that purpose. The first study demonstrates how ethological assessments of social behavior and computerized assessments of cognitive performance could be integrated to investigate the effects of socially exhibited moods on the cognitive performance of individuals. The second study shows that batteries of ALDM running in parallel can provide unique information on the influence of the presence of others on task performance. Finally, the last study shows that networks of ALDM test units can also be used to study issues related to social transmission and cultural evolution. Combined together, these three studies demonstrate clearly that ALDM testing is a highly promising experimental tool for bridging the gap in the animal literature between research on individual cognition and research on social cognition. PMID:25992495

  1. Assessment of social cognition in non-human primates using a network of computerized automated learning device (ALDM) test systems.

    PubMed

    Fagot, Joël; Marzouki, Yousri; Huguet, Pascal; Gullstrand, Julie; Claidière, Nicolas

    2015-05-05

    Fagot & Paleressompoulle(1) and Fagot & Bonte(2) have published an automated learning device (ALDM) for the study of cognitive abilities of monkeys maintained in semi-free ranging conditions. Data accumulated during the last five years have consistently demonstrated the efficiency of this protocol to investigate individual/physical cognition in monkeys, and have further shown that this procedure reduces stress level during animal testing(3). This paper demonstrates that networks of ALDM can also be used to investigate different facets of social cognition and in-group expressed behaviors in monkeys, and describes three illustrative protocols developed for that purpose. The first study demonstrates how ethological assessments of social behavior and computerized assessments of cognitive performance could be integrated to investigate the effects of socially exhibited moods on the cognitive performance of individuals. The second study shows that batteries of ALDM running in parallel can provide unique information on the influence of the presence of others on task performance. Finally, the last study shows that networks of ALDM test units can also be used to study issues related to social transmission and cultural evolution. Combined together, these three studies demonstrate clearly that ALDM testing is a highly promising experimental tool for bridging the gap in the animal literature between research on individual cognition and research on social cognition.

  2. Evaluating hydrological response to forecasted land-use change—scenario testing with the automated geospatial watershed assessment (AGWA) tool

    USGS Publications Warehouse

    Kepner, William G.; Semmens, Darius J.; Hernandez, Mariano; Goodrich, David C.

    2009-01-01

    Envisioning and evaluating future scenarios has emerged as a critical component of both science and social decision-making. The ability to assess, report, map, and forecast the life support functions of ecosystems is absolutely critical to our capacity to make informed decisions to maintain the sustainable nature of our ecosystem services now and into the future. During the past two decades, important advances in the integration of remote imagery, computer processing, and spatial-analysis technologies have been used to develop landscape information that can be integrated with hydrologic models to determine long-term change and make predictive inferences about the future. Two diverse case studies in northwest Oregon (Willamette River basin) and southeastern Arizona (San Pedro River) were examined in regard to future land use scenarios relative to their impact on surface water conditions (e.g., sediment yield and surface runoff) using hydrologic models associated with the Automated Geospatial Watershed Assessment (AGWA) tool. The base reference grid for land cover was modified in both study locations to reflect stakeholder preferences 20 to 60 yrs into the future, and the consequences of landscape change were evaluated relative to the selected future scenarios. The two studies provide examples of integrating hydrologic modeling with a scenario analysis framework to evaluate plausible future forecasts and to understand the potential impact of landscape change on ecosystem services.

  3. Disturbance automated reference toolset (DART): Assessing patterns in ecological recovery from energy development on the Colorado Plateau.

    PubMed

    Nauman, Travis W; Duniway, Michael C; Villarreal, Miguel L; Poitras, Travis B

    2017-04-15

    A new disturbance automated reference toolset (DART) was developed to monitor human land surface impacts using soil-type and ecological context. DART identifies reference areas with similar soils, topography, and geology; and compares the disturbance condition to the reference area condition using a quantile-based approach based on a satellite vegetation index. DART was able to represent 26-55% of variation of relative differences in bare ground and 26-41% of variation in total foliar cover when comparing sites with nearby ecological reference areas using the Soil Adjusted Total Vegetation Index (SATVI). Assessment of ecological recovery at oil and gas pads on the Colorado Plateau with DART revealed that more than half of well-pads were below the 25th percentile of reference areas. Machine learning trend analysis of poorly recovering well-pads (quantile<0.23) had out-of-bag error rates between 37 and 40% indicating moderate association with environmental and management variables hypothesized to influence recovery. Well-pads in grasslands (median quantile [MQ]=13%), blackbrush (Coleogyne ramosissima) shrublands (MQ=18%), arid canyon complexes (MQ=18%), warmer areas with more summer-dominated precipitation, and state administered areas (MQ=12%) had low recovery rates. Results showcase the usefulness of DART for assessing discrete surface land disturbances, and highlight the need for more targeted rehabilitation efforts at oil and gas well-pads in the arid southwest US.
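DART's quantile-based condition score reduces to locating the disturbed site's vegetation-index value within the distribution over its matched reference pixels; a sketch with invented SATVI values:

```python
import numpy as np

def disturbance_quantile(disturbed_value, reference_values):
    """Quantile of a disturbed site's vegetation-index value within the
    distribution of its matched ecological reference pixels."""
    ref = np.sort(np.asarray(reference_values, dtype=float))
    return np.searchsorted(ref, disturbed_value, side='right') / ref.size

ref_satvi = np.linspace(0.2, 0.6, 100)     # invented reference-pixel SATVI values
q = disturbance_quantile(0.25, ref_satvi)  # low quantile -> poor recovery
```

Under the threshold the abstract reports, a well-pad with q < 0.23 would be flagged as poorly recovering.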

  4. A novel 2D and 3D method for automated insulin granule measurement and its application in assessing accepted preparation methods for electron microscopy

    NASA Astrophysics Data System (ADS)

    Mantell, J.; Nam, D.; Bull, D.; Achim, A.; Verkade, P.

    2014-06-01

    Transmission electron microscopy images of insulin-producing beta cells in the islets of Langerhans contain many complex structures, making it difficult to accurately segment insulin granules. Furthermore the appearance of the granules and surrounding halo and limiting membrane can vary enormously depending on the methods used for sample preparation. An automated method has been developed using active contours to segment the insulin core initially and then expand to segment the halos [1]. The method has been validated against manual measurements and also yields higher accuracy than other automated methods [2]. It has then been extended to three dimensions to analyse a tomographic reconstruction from a thick section of the same material. The final step has been to produce a GUI and use the automated process to compare a number of different electron microscopy preparation protocols including chemical fixation (where many of halos are often distended) and to explore the many subtleties of high pressure freezing (where the halos are often minimal, [3]).

  5. Assessing automated image analysis of sand grain shape to identify sedimentary facies, Gran Dolina archaeological site (Burgos, Spain)

    NASA Astrophysics Data System (ADS)

    Campaña, I.; Benito-Calvo, A.; Pérez-González, A.; Bermúdez de Castro, J. M.; Carbonell, E.

    2016-12-01

    Gran Dolina is a cave (Sierra de Atapuerca, Spain) infilled by a 25 m thick sedimentary record, divided into 12 lithostratigraphic units that have been separated into 19 sedimentary facies containing Early and Middle Pleistocene hominin remains. In this paper, an automated image analysis method has been used to study the shape of the sedimentary particles. Since particle shape is interpreted as the result of sedimentary transport and sediment source, this study can provide valuable data about the sedimentological mechanisms of sequence formation. The shape of the sand fraction in 73 samples from the Gran Dolina site and Sierra de Atapuerca was analyzed using the Malvern Morphologi G3, an advanced particle characterization tool. In this first complete test, we applied the method to the published sequence of Gran Dolina, defined previously through fieldwork observations and geochemical and textural analysis. The results indicate that this image analysis method allows differentiation of the sedimentary facies, providing objective tools to identify weathered layers and measure the textural maturity of the sediments. Channel facies have the highest values of circularity and convexity, showing the highest textural maturity of particles. On the other hand, terra rossa and debris flow samples show similar values, with the lowest particle maturity.
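The circularity and convexity measures reported for the sand grains have standard definitions, sketched here from a particle's area and perimeter (the instrument computes these from 2D particle outlines):

```python
import math

def circularity(area, perimeter):
    """4*pi*A / P^2: equals 1.0 for a perfect circle, lower for rougher outlines."""
    return 4 * math.pi * area / perimeter ** 2

def convexity(hull_perimeter, perimeter):
    """Convex-hull perimeter over actual perimeter: 1.0 for a convex outline."""
    return hull_perimeter / perimeter
```

A unit circle (A = π, P = 2π) gives circularity 1.0, while a unit square gives π/4 ≈ 0.785, so more rounded, transport-matured grains score higher on both measures.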

  6. Automated quality assessment of autonomously acquired microscopic images of fluorescently stained bacteria.

    PubMed

    Zeder, M; Kohler, E; Pernthaler, J

    2010-01-01

    Quality assessment of autonomously acquired microscopic images is an important issue in high-throughput imaging systems. For example, the presence of low quality images (≥10%) in a dataset significantly influences the counting precision of fluorescently stained bacterial cells. We present an approach based on an artificial neural network (ANN) to assess the quality of such images. Spatially invariant estimators were extracted as ANN input data from subdivided images by low level image processing. Different ANN designs were compared and >400 ANNs were trained and tested on a set of 25,000 manually classified images. The optimal ANN featured a correct identification rate of 94% (3% false positives, 3% false negatives) and could process about 10 images per second. We compared its performance with the image quality assessment by different humans and discuss the difficulties in assigning images to the correct quality class. The computer program and the documented source code (VB.NET) are provided under the General Public Licence.

  7. Late cardiac sodium current can be assessed using automated patch-clamp

    PubMed Central

    Gawali, Vaibhavkumar; Todt, Hannes; Knott, Thomas; Scheel, Olaf; Abriel, Hugues

    2014-01-01

    The cardiac late Na+ current is generated by a small fraction of voltage-dependent Na+ channels that undergo a conformational change to a burst-gating mode, with repeated openings and closures during the action potential (AP) plateau. Its magnitude can be augmented by inactivation-defective mutations, myocardial ischemia, or prolonged exposure to chemical compounds leading to drug-induced (di-)long QT syndrome, and results in an increased susceptibility to cardiac arrhythmias. Using CytoPatch™ 2 automated patch-clamp equipment, we performed whole-cell recordings in HEK293 cells stably expressing human Nav1.5, and measured the late Na+ component as the average current over the last 100 ms of 300 ms depolarizing pulses to -10 mV from a holding potential of -100 mV, with a repetition frequency of 0.33 Hz. Averaged values in different steady-state experimental conditions were further corrected by subtraction of the average current during the application of 30 μM tetrodotoxin (TTX). We show that ranolazine at 10 and 30 μM in 3 min applications reduced the late Na+ current to 75.0 ± 2.7% (mean ± SEM, n = 17) and 58.4 ± 3.5% (n = 18) of initial levels, respectively, while a 5 min application of 1 μM veratridine resulted in a reversible current increase to 269.1 ± 16.1% (n = 28) of initial values. Using fluctuation analysis, we observed that 30 μM ranolazine decreased mean open probability p from 0.6 to 0.38 without modifying the number of active channels n, while 1 μM veratridine increased n 2.5-fold without changing p. In human iPSC-derived cardiomyocytes, 1 μM veratridine reversibly increased APD90 2.12 ± 0.41-fold (mean ± SEM, n = 6). This effect is attributable to inactivation removal in Nav1.5 channels, since significant inhibitory effects on hERG current were detected only at higher concentrations in hERG-expressing HEK293 cells, with a 28.9 ± 6.0% inhibition (mean ± SD, n = 10) at 50 μM veratridine. PMID:25383189
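The fluctuation analysis used to separate a change in open probability p (ranolazine) from a change in active channel count n (veratridine) follows from the binomial statistics of n identical, independent channels; a sketch with a synthetic consistency check:

```python
def channels_from_fluctuations(mean_current, variance, unitary_current):
    """Estimate open probability p and active channel count n from the
    mean and variance of the current, assuming n identical, independent
    channels of unitary current i: mean = n*p*i, var = n*p*(1-p)*i**2."""
    p = 1.0 - variance / (unitary_current * mean_current)
    n = mean_current / (unitary_current * p)
    return p, n

# Synthetic check with invented values: n = 100 channels, p = 0.6, i = 1.5 pA.
n_true, p_true, i_unit = 100, 0.6, 1.5
mean_i = n_true * p_true * i_unit                      # 90 pA
var_i = n_true * p_true * (1 - p_true) * i_unit ** 2   # 54 pA^2
p_hat, n_hat = channels_from_fluctuations(mean_i, var_i, i_unit)
```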

  8. Assessment of an Automated Touchdown Detection Algorithm for the Orion Crew Module

    NASA Technical Reports Server (NTRS)

    Gay, Robert S.

    2011-01-01

    Orion Crew Module (CM) touchdown detection is critical to activating the post-landing sequence that safes the Reaction Control Jets (RCS), ensures that the vehicle remains upright, and establishes communication with recovery forces. In order to accommodate safe landing of an unmanned vehicle or incapacitated crew, an onboard automated detection system is required. An Orion-specific touchdown detection algorithm was developed and evaluated to differentiate landing events from in-flight events. The proposed method will be used to initiate post-landing cutting of the parachute riser lines, to prevent CM rollover, and to terminate RCS jet firing prior to submersion. The RCS jets continue to fire until touchdown to maintain proper CM orientation with respect to the flight path and to limit impact loads, but have potentially hazardous consequences if submerged while firing. The time available after impact to cut risers and initiate the CM Up-righting System (CMUS) is measured in minutes, whereas the time from touchdown to RCS jet submersion is a function of descent velocity and sea state conditions, and is often less than one second. Evaluation of the detection algorithms was performed for in-flight events (e.g., descent under chutes) using high-fidelity rigid body analyses in the Decelerator Systems Simulation (DSS), whereas water impacts were simulated using a rigid finite element model of the Orion CM in LS-DYNA. Two touchdown detection algorithms were evaluated with various thresholds: acceleration magnitude spike detection, and accumulated velocity change (over a given time window) spike detection. Data for both detection methods are acquired from an onboard Inertial Measurement Unit (IMU) sensor. The detection algorithms were tested with analytically generated in-flight and landing IMU data simulations. The acceleration spike detection proved to be faster while maintaining the desired safety margin. Time to RCS jet submersion was predicted analytically across a series of
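
    A minimal sketch of the two detector families described above, run on a synthetic sampled IMU trace; the thresholds, sample rate, and splashdown signature are invented assumptions, not Orion flight values.

```python
# Two illustrative touchdown detectors operating on IMU acceleration
# magnitudes (in g): a direct magnitude-spike test and an accumulated
# velocity-change (delta-v over a sliding window) test.

def accel_spike(acc_mags, threshold_g):
    """Trigger on the first sample whose |a| exceeds a magnitude threshold."""
    for k, a in enumerate(acc_mags):
        if a > threshold_g:
            return k
    return None

def delta_v_spike(acc_mags, dt, window, threshold_dv):
    """Trigger when the accumulated velocity change over the last `window`
    samples (integral of |a|*dt, in g*s) exceeds a threshold."""
    for k in range(len(acc_mags)):
        lo = max(0, k - window + 1)
        dv = sum(acc_mags[lo:k + 1]) * dt
        if dv > threshold_dv:
            return k
    return None

# 100 Hz synthetic trace: quiet descent (~1 g) then a 12 g splashdown spike
dt = 0.01
trace = [1.0] * 50 + [12.0, 9.0, 6.0] + [1.0] * 47

print(accel_spike(trace, threshold_g=8.0))                    # 50
print(delta_v_spike(trace, dt, window=10, threshold_dv=0.3))  # 52
```

The magnitude test fires on the first spike sample, while the windowed delta-v test fires a couple of samples later, consistent with the abstract's finding that the acceleration spike detector is faster.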

  9. Direct Sampling and Analysis from Solid Phase Extraction Cards using an Automated Liquid Extraction Surface Analysis Nanoelectrospray Mass Spectrometry System

    SciTech Connect

    Walworth, Matthew J; ElNaggar, Mariam S; Stankovich, Joseph J; Witkowski II, Charles E.; Norris, Jeremy L; Van Berkel, Gary J

    2011-01-01

    Direct liquid extraction based surface sampling, a technique previously demonstrated with continuous flow and autonomous pipette liquid microjunction surface sampling probes, has recently been implemented as the Liquid Extraction Surface Analysis (LESA) mode on the commercially available Advion NanoMate chip-based infusion nanoelectrospray ionization system. In the present paper, the LESA mode was applied to the analysis of 96-well format custom solid phase extraction (SPE) cards, with each well consisting of either a 1 or 2 mm diameter monolithic hydrophobic stationary phase. These substrate wells were conditioned, loaded with either single or multi-component aqueous mixtures, and read out using the LESA mode of a TriVersa NanoMate or a NanoMate 100 coupled to an ABI/Sciex 4000 QTRAP™ hybrid triple quadrupole/linear ion trap mass spectrometer and a Thermo LTQ XL linear ion trap mass spectrometer. Extraction conditions, including extraction/nanoESI solvent composition, volume, and dwell times, were optimized in the analysis of targeted compounds. Limits of detection and quantitation, as well as analysis reproducibility figures of merit, were measured. Calibration data were obtained for propranolol using a deuterated internal standard, demonstrating linearity and reproducibility. A 10x increase in signal and cleanup of micromolar Angiotensin II from a concentrated salt solution were demonstrated. Additionally, a multicomponent herbicide mixture at ppb concentration levels was analyzed using MS3 spectra for compound identification in the presence of isobaric interferences.

  10. Comparison of an automated ELFA and two different real-time PCR techniques for Salmonella detection in poultry samples.

    PubMed

    Rohonczy, Kata; Zoller, Linda; Hermann, Zsolt; Fodor, Andrea; Mráz, Balázs; Tabajdi-Pintér, Veronika

    2014-09-01

    The aim of this study was to compare an enzyme-linked fluorescent assay (ELFA)-based method and two real-time polymerase chain reaction (PCR) methods with the results of the standard culture-based method EN ISO 6579:2002 (the bacteriological standard method used in the European Union) for the detection of Salmonella spp. in raw chicken meat. Our investigations were performed on 141 poultry samples sourced from supermarkets. Relative accuracy, relative specificity and relative sensitivity were determined. According to the ISO 16140:2003 criteria for validation of alternative microbiological methods, the ELFA-based method (VIDAS ICS2 + SLM) and the real-time PCR methods (TaqMan, Bax) were comparable to the reference standard method for the detection of Salmonella spp. in chicken meat. The use of these methods provides results within 48 hours with high sensitivity (100%). The TaqMan real-time PCR showed a relative specificity of 98%, while both of the other methods presented 100%. The VIDAS ICS2 + SLM and the Bax real-time PCR methods showed the highest relative accuracy (100%), versus 99% for the TaqMan method. In conclusion, both the real-time PCR methods and the ELFA-based assay can be used as rapid and user-friendly diagnostic methods for the detection of Salmonella spp. in chicken meat samples.

  11. Quality assurance guidance for field sampling and measurement assessment plans in support of EM environmental sampling and analysis activities

    SciTech Connect

    Not Available

    1994-05-01

    This document is one of several guidance documents developed by the US Department of Energy (DOE) Office of Environmental Restoration and Waste Management (EM). These documents support the EM Analytical Services Program (ASP) and are based on applicable regulatory requirements and DOE Orders. They address requirements in DOE Orders by providing guidance that pertains specifically to environmental restoration and waste management sampling and analysis activities. DOE 5700.6C Quality Assurance (QA) defines policy and requirements to establish QA programs ensuring that risks and environmental impacts are minimized and that safety, reliability, and performance are maximized. This is accomplished through the application of effective management systems commensurate with the risks imposed by the facility and the project. Every organization supporting EM's environmental sampling and analysis activities must develop and document a QA program. Management of each organization is responsible for appropriate QA program implementation, assessment, and improvement. The collection of credible and cost-effective environmental data is critical to the long-term success of remedial and waste management actions performed at DOE facilities. Only well established and management supported assessment programs within each EM-support organization will enable DOE to demonstrate data quality. The purpose of this series of documents is to offer specific guidance for establishing an effective assessment program for EM's environmental sampling and analysis (ESA) activities.

  12. GIS-BASED HYDROLOGIC MODELING: THE AUTOMATED GEOSPATIAL WATERSHED ASSESSMENT TOOL

    EPA Science Inventory

    Planning and assessment in land and water resource management are evolving from simple, local scale problems toward complex, spatially explicit regional ones. Such problems have to be
    addressed with distributed models that can compute runoff and erosion at different spatial a...

  13. AGWA: The Automated Geospatial Watershed Assessment Tool to Inform Rangeland Management

    EPA Science Inventory

    Do you want a relatively easy to use tool to assess rangeland soil and water conservation practices on rangeland erosion that is specifically designed to use ecological information? New Decision Support Tools (DSTs) that are easy-to-use, incorporate ecological concepts and rangel...

  14. Automated liquid chromatographic determination of atenolol in plasma using dialysis and trace enrichment on a cation-exchange precolumn for sample handling.

    PubMed

    Chiap, P; Buraglia, B M; Ceccato, A; Hubert, P; Crommen, J

    2000-02-28

    A fully automated method involving dialysis combined with trace enrichment was developed for the liquid chromatographic (LC) determination of atenolol, a hydrophilic beta-blocking agent, in human plasma. The plasma samples were dialysed on a cellulose acetate membrane and the dialysate was reconcentrated on a short trace enrichment column (TEC) packed with a strong cation-exchange material. All sample handling operations can be executed automatically by a sample processor (ASTED system). After TEC conditioning, the plasma sample, to which the internal standard (sotalol, another hydrophilic beta-blocker) was automatically added, was introduced in the donor channel and dialysed in the static/pulsed mode. The dialysis liquid consisted of 4.3 mM phosphoric acid. When the dialysis process was discontinued, the analytes were eluted from the TEC in the back-flush mode by the LC mobile phase and transferred to the analytical column, packed with octyl silica. The LC mobile phase consisted of phosphate buffer (pH 7.0)-methanol (81:19, v/v) with 1-octanesulfonate. Atenolol and the internal standard were monitored photometrically at 225 nm. The different parameters influencing the dialysis and trace enrichment processes were optimised with respect to analyte recovery. The influence of two different kinds of cation-exchange material on analyte recovery and peak efficiency was also studied. The method was then validated in the concentration range 25-1000 ng/ml. The mean recovery for atenolol was 65% and the limit of quantitation was 25 ng/ml.

  15. AUTOMATED GEOSPATIAL WATERSHED ASSESSMENT (AGWA): A GIS-BASED HYDROLOGIC MODELING TOOL FOR LANDSCAPE ASSESSMENT AND WATERSHED MANAGEMENT

    EPA Science Inventory

    The assessment of land use and land cover is an extremely important activity for contemporary land management. A large body of current literature suggests that human land-use practice is the most important factor influencing natural resource management and environmental condition...

  16. Automated Tracking of Quantitative Assessments of Tumor Burden in Clinical Trials

    PubMed Central

    Rubin, Daniel L; Willrett, Debra; O'Connor, Martin J; Hage, Cleber; Kurtz, Camille; Moreira, Dilvan A

    2014-01-01

    There are two key challenges hindering effective use of quantitative assessment of imaging in cancer response assessment: 1) Radiologists usually describe the cancer lesions in imaging studies subjectively and sometimes ambiguously, and 2) it is difficult to repurpose imaging data, because lesion measurements are not recorded in a format that permits machine interpretation and interoperability. We have developed a freely available software platform on the basis of open standards, the electronic Physician Annotation Device (ePAD), to tackle these challenges in two ways. First, ePAD facilitates the radiologist in carrying out cancer lesion measurements as part of routine clinical trial image interpretation workflow. Second, ePAD records all image measurements and annotations in a data format that permits repurposing image data for analyses of alternative imaging biomarkers of treatment response. To determine the impact of ePAD on radiologist efficiency in quantitative assessment of imaging studies, a radiologist evaluated computed tomography (CT) imaging studies from 20 subjects having one baseline and three consecutive follow-up imaging studies with and without ePAD. The radiologist made measurements of target lesions in each imaging study using Response Evaluation Criteria in Solid Tumors 1.1 criteria, initially with the aid of ePAD, and then after a 30-day washout period, the exams were reread without ePAD. The mean total time required to review the images and summarize measurements of target lesions was 15% (P < .039) shorter using ePAD than without using this tool. In addition, it was possible to rapidly reanalyze the images to explore lesion cross-sectional area as an alternative imaging biomarker to linear measure. We conclude that ePAD appears promising to potentially improve reader efficiency for quantitative assessment of CT examinations, and it may enable discovery of future novel image-based biomarkers of cancer treatment response. PMID:24772204

  17. Automated breast tissue density assessment using high order regional texture descriptors in mammography

    NASA Astrophysics Data System (ADS)

    Law, Yan Nei; Lieng, Monica Keiko; Li, Jingmei; Khoo, David Aik-Aun

    2014-03-01

    Breast cancer is the most common cancer and second leading cause of cancer death among women in the US. The relative survival rate is lower among women with a more advanced stage at diagnosis. Early detection through screening is vital. Mammography is the most widely used and only proven screening method for reliably and effectively detecting abnormal breast tissues. In particular, mammographic density is one of the strongest breast cancer risk factors, after age and gender, and can be used to assess the future risk of disease before individuals become symptomatic. A reliable method for automatic density assessment would be beneficial and could assist radiologists in the evaluation of mammograms. To address this problem, we propose a density classification method which uses statistical features from different parts of the breast. Our method is composed of three parts: breast region identification, feature extraction and building ensemble classifiers for density assessment. It explores the potential of the features extracted from second and higher order statistical information for mammographic density classification. We further investigate the registration of bilateral pairs and time-series of mammograms. The experimental results on 322 mammograms demonstrate that (1) a classifier using features from dense regions has higher discriminative power than a classifier using only features from the whole breast region; (2) these high-order features can be effectively combined to boost the classification accuracy; (3) a classifier using these statistical features from dense regions achieves 75% accuracy, which is a significant improvement from 70% accuracy obtained by the existing approaches.
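
    As an illustration of the second-order statistics mentioned above, the sketch below computes a gray-level co-occurrence matrix (GLCM) for a horizontal pixel offset plus two classic texture features (contrast and energy). This is not the authors' actual feature set, and the tiny quantized image is invented for demonstration.

```python
# Hedged sketch of a second-order texture descriptor: a normalized GLCM
# and two Haralick-style features computed from it.
import numpy as np

def glcm(img, levels, dx=1):
    """Count horizontal co-occurrences of gray levels at offset dx,
    normalized to joint probabilities."""
    m = np.zeros((levels, levels))
    for r in range(img.shape[0]):
        for c in range(img.shape[1] - dx):
            m[img[r, c], img[r, c + dx]] += 1
    return m / m.sum()

img = np.array([[0, 0, 1, 1],
                [0, 0, 1, 1],
                [0, 2, 2, 2],
                [2, 2, 3, 3]])
P = glcm(img, levels=4)
i, j = np.indices(P.shape)
contrast = float(((i - j) ** 2 * P).sum())  # local heterogeneity
energy = float((P ** 2).sum())              # uniformity of the distribution
print(round(contrast, 3), round(energy, 3))  # 0.583 0.167
```

Dense tissue tends to show different GLCM statistics than fatty tissue, which is why such descriptors can feed the ensemble classifiers the abstract describes.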

  18. Semi-automated Volumetric and Morphological Assessment of Glioblastoma Resection with Fluorescence-Guided Surgery

    PubMed Central

    Cordova, J. Scott; Gurbani, Saumya S.; Holder, Chad A.; Olson, Jeffrey J.; Schreibmann, Eduard; Shi, Ran; Guo, Ying; Shu, Hui-Kuo G.; Shim, Hyunsuk; Hadjipanayis, Costas G.

    2016-01-01

    Purpose: Glioblastoma (GBM) neurosurgical resection relies on contrast-enhanced MRI-based neuronavigation. However, it is well known that infiltrating tumor extends beyond contrast enhancement. Fluorescence-guided surgery (FGS) using 5-aminolevulinic acid (5-ALA) was evaluated to improve extent of resection (EOR) of GBMs. Pre-operative morphological tumor metrics were also assessed. Procedures: Thirty patients from a Phase II trial evaluating 5-ALA FGS in newly diagnosed GBM were assessed. Tumors were segmented pre-operatively to assess morphological features as well as post-operatively to evaluate EOR and residual tumor volume (RTV). Results: Median EOR and RTV were 94.3% and 0.821 cm3, respectively. Pre-operative surface area to volume ratio and RTV were significantly associated with overall survival, even when controlling for the known survival confounders. Conclusions: This study supports claims that 5-ALA FGS is helpful at decreasing tumor burden and prolonging survival in GBM. Moreover, morphological indices are shown to impact both resection and patient survival. PMID:26463215
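
    The two resection metrics can be expressed directly. In the sketch below, the pre-operative volume is invented and chosen only so the example reproduces the reported median figures; it is not patient data.

```python
# Hedged sketch of the two volumetric metrics: extent of resection (EOR)
# as the percentage of pre-operative tumor volume removed, and residual
# tumor volume (RTV) as the volume left post-operatively.

def resection_metrics(preop_cm3, postop_cm3):
    eor_pct = 100.0 * (preop_cm3 - postop_cm3) / preop_cm3
    rtv_cm3 = postop_cm3
    return eor_pct, rtv_cm3

# Invented pre-op volume consistent with the reported medians
eor, rtv = resection_metrics(preop_cm3=14.4, postop_cm3=0.821)
print(round(eor, 1), rtv)  # 94.3 0.821
```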

  19. Vertical Sampling in Recharge Areas Versus Lateral Sampling in Discharge Areas: Assessing the Agricultural Nitrogen Legacy in Groundwater

    NASA Astrophysics Data System (ADS)

    Gilmore, T. E.; Genereux, D. P.; Solomon, D. K.; Mitasova, H.; Burnette, M.

    2014-12-01

    Agricultural nitrogen (N) is a legacy contaminant often found in shallow groundwater systems. This legacy has commonly been observed using well nests (vertical sampling) in recharge areas, but may also be observed by sampling at points in/beneath a streambed using pushable probes along transects across a channel (lateral sampling). We compared results from two different streambed point sampling approaches and from wells in the recharge area to assess whether the different approaches give fundamentally different pictures of (1) the magnitude of N contamination, (2) historic trends in N contamination, and (3) the extent to which denitrification attenuates nitrate transport through the surficial aquifer. Two different arrangements of streambed points (SP) were used to sample groundwater discharging into a coastal plain stream in North Carolina. In July 2012, a 58 m reach was sampled using closely-spaced lateral transects of SP, revealing high average [NO3-] (808 μM, n=39). In March 2013, transects of SP were widely distributed through a 2.7 km reach that contained the 58 m reach and suggested overall lower [NO3-] (210 μM, n=30), possibly due to variation in land use along the longer study reach. Mean [NO3-] from vertical sampling (2 well nests with 3 wells each) was 296 μM. Groundwater apparent ages from SP in the 58 m and 2.7 km reaches suggested lower recharge [NO3-] (observed [NO3-] plus modeled excess N2) in 0-10 year-old water (1250 μM and 525 μM, respectively), compared to higher recharge [NO3-] from 10-30 years ago (about 1600 μM and 900 μM, respectively). In the wells, [NO3-] was highest (835 μM) in groundwater with apparent age of 12-15 years and declined as apparent age increased, a trend that was consistent with SP in the 2.7 km reach. The 58 m reach suggested elevated recharge [NO3-] (>1100 μM) over a 50-year period. Excess N2 from wells suggested that about 62% of nitrate had been removed via denitrification since recharge, versus 51% and 78
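
    The recharge-nitrate reconstruction described above is a simple mass balance: each mole of excess dissolved N2 represents two moles of nitrate-N lost to denitrification. The sketch below uses invented concentrations, chosen only to be consistent with the 296 μM well mean and the ~62% removal figure quoted.

```python
# Hedged sketch of the mass balance: recharge [NO3-] equals observed
# [NO3-] plus denitrified N carried as excess N2 (2 mol N per mol N2).

def recharge_no3(no3_obs_uM, excess_n2_uM):
    recharge = no3_obs_uM + 2.0 * excess_n2_uM        # uM as N
    frac_denitrified = 2.0 * excess_n2_uM / recharge  # fraction removed
    return recharge, frac_denitrified

# Invented excess-N2 value consistent with the quoted 62% removal
recharge, frac = recharge_no3(no3_obs_uM=296.0, excess_n2_uM=241.6)
print(round(recharge, 1), round(frac, 2))  # 779.2 0.62
```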

  20. A METHOD FOR AUTOMATED ANALYSIS OF 10 ML WATER SAMPLES CONTAINING ACIDIC, BASIC, AND NEUTRAL SEMIVOLATILE COMPOUNDS LISTED IN USEPA METHOD 8270 BY SOLID PHASE EXTRACTION COUPLED IN-LINE TO LARGE VOLUME INJECTION GAS CHROMATOGRAPHY/MASS SPECTROMETRY

    EPA Science Inventory

    Data is presented showing the progress made towards the development of a new automated system combining solid phase extraction (SPE) with gas chromatography/mass spectrometry for the single run analysis of water samples containing a broad range of acid, base and neutral compounds...

  1. Assessing organic contaminants in fish: comparison of a nonlethal tissue sampling technique to mobile and stationary passive sampling devices.

    PubMed

    Heltsley, Rebecca M; Cope, W Gregory; Shea, Damian; Bringolf, Robert B; Kwak, Thomas J; Malindzak, Edward G

    2005-10-01

    As concerns mount over the human health risks associated with consumption of fish contaminated with persistent organic pollutants, there exists a need to better evaluate fish body burdens without lethally sampling many of the important commercial and sport species of interest. The aim of this study was to investigate two novel methods for estimating organic contaminants in fish that are a concern for both fish and human health. The removal of fish adipose fins, commonly done in mark-recapture studies with salmonid species, was evaluated as a nonlethal sampling technique to estimate concentrations of polychlorinated biphenyls (PCBs) and organochlorine pesticides (OCPs) in flathead catfish (Pylodictis olivaris), relative to those found in muscle fillets of the same fish. We also assessed the efficacy of using poly(dimethylsiloxane) (PDMS) as a mobile passive sampling device (PSD) attached directly to wild flathead catfish for assessing location-specific exposure of the fish to waterborne contaminants. The results of this study have demonstrated for the first time that organic contaminant concentrations in adipose fin were highly correlated (R2 = 0.87) with muscle fillet concentrations, indicating that the adipose fin of certain fishes may be used to accurately estimate tissue concentrations without the need for lethal sampling. Moreover, mobile PSDs attached directly to fish and used here for the first time accurately estimated ultratrace concentrations of waterborne PCBs and OCPs without any apparent harm to the fish, indicating that there are no practical or physical barriers to the use of mobile passive samplers attached to aquatic organisms. Among the many practical implications of this research, two potential priority items include the analysis of organic contaminants in farm-raised and sport fish intended for human consumption, without the economic and population losses associated with lethally sampling fish to obtain tissues, and identifying specific areas

  2. Enabling automated magnetic resonance imaging-based targeting assessment during dipole field navigation

    NASA Astrophysics Data System (ADS)

    Latulippe, Maxime; Felfoul, Ouajdi; Dupont, Pierre E.; Martel, Sylvain

    2016-02-01

    The magnetic navigation of drugs in the vascular network promises to increase the efficacy and reduce the secondary toxicity of cancer treatments by targeting tumors directly. Recently, dipole field navigation (DFN) was proposed as the first method achieving both high field and high navigation gradient strengths for whole-body interventions in deep tissues. This is achieved by introducing large ferromagnetic cores around the patient inside a magnetic resonance imaging (MRI) scanner. However, doing so distorts the static field inside the scanner, which prevents imaging during the intervention. This limitation constrains DFN to open-loop navigation, thus exposing the patient to the risk of harmful toxicity in the event of a navigation failure. Here, we are interested in periodically assessing drug targeting efficiency using MRI even in the presence of a core. We demonstrate, using a clinical scanner, that it is in fact possible to acquire, in specific regions around a core, images of sufficient quality to perform this task. We show that the core can be moved inside the scanner to a position minimizing the distortion effect in the region of interest for imaging. Moving the core can be done automatically using the gradient coils of the scanner, which then also enables the core to be repositioned to perform navigation to additional targets. The feasibility and potential of the approach are validated in an in vitro experiment demonstrating navigation and assessment at two targets.

  3. A Multivariate Analysis of the Neonatal Behavioral Assessment Scale in Several Samples.

    ERIC Educational Resources Information Center

    Strauss, Milton E.; Rourke, Daniel L.

    1978-01-01

    Discusses differences in results of factor analyses of ten diverse samples which have been studied using the Brazelton Neonatal Behavioral Assessment Scale (NBAS). Concludes that a single common factor structure accounts for the intercorrelations among NBAS items. (Author/BH)

  4. Efficiency of EPI cluster sampling for assessing diarrhoea and dysentery prevalence.

    PubMed Central

    Yoon, S. S.; Katz, J.; Brendel, K.; West, K. P.

    1997-01-01

    This study examines the efficiency of EPI cluster sampling in assessing the prevalence of diarrhoea and dysentery. A computer was used to simulate fieldwork carried out by a survey taker. The bias and variance of prevalence estimates obtained using EPI cluster sampling were compared with those obtained using simple random sampling and cluster (stratified random) sampling. Efficiency ratios, calculated as the mean square error divided by total distance travelled, were used to compare EPI cluster sampling to simple random sampling and standard cluster sampling. EPI cluster sampling may be an appropriate low-cost tool for monitoring trends in the prevalence of diarrhoea and dysentery over time. However, it should be used with caution when estimating the prevalence of diarrhoea at a single point in time because of the bias associated with this cluster sampling method. PMID:9447775
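
    A toy re-creation of the comparison, assuming single-stage cluster sampling with random within-cluster selection in place of the EPI quota walk; the cluster count, cluster sizes, and risk levels are all invented for illustration.

```python
# Hedged simulation: mean square error of a prevalence estimate under
# simple random sampling (SRS) vs. cluster sampling of a population whose
# disease risk varies by cluster (spatial clustering of cases).
import random

random.seed(1)
N_CLUSTERS, CLUSTER_SIZE = 30, 100
risks = [random.uniform(0.05, 0.35) for _ in range(N_CLUSTERS)]
pop = [[1 if random.random() < r else 0 for _ in range(CLUSTER_SIZE)]
       for r in risks]
true_prev = sum(map(sum, pop)) / (N_CLUSTERS * CLUSTER_SIZE)

def srs_estimate(n):
    """Prevalence from n subjects drawn at random from the whole population."""
    flat = [x for cl in pop for x in cl]
    return sum(random.sample(flat, n)) / n

def cluster_estimate(n_clusters, m):
    """Prevalence from m subjects in each of n_clusters randomly chosen clusters."""
    chosen = random.sample(pop, n_clusters)
    return sum(sum(random.sample(cl, m)) for cl in chosen) / (n_clusters * m)

def mse(estimator, reps=2000):
    errs = [(estimator() - true_prev) ** 2 for _ in range(reps)]
    return sum(errs) / reps

mse_srs = mse(lambda: srs_estimate(210))          # 210 subjects by SRS
mse_clu = mse(lambda: cluster_estimate(10, 21))   # 10 clusters of 21 (same n)
print(mse_srs, mse_clu)  # cluster design shows higher MSE here
```

Dividing each MSE by the travel distance its design implies would give the efficiency ratio the study uses; cluster designs trade extra variance for far less travel.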

  5. Assessment of fully automated antibody homology modeling protocols in molecular operating environment.

    PubMed

    Maier, Johannes K X; Labute, Paul

    2014-08-01

    The success of antibody-based drugs has led to an increased demand for predictive computational tools to assist antibody engineering efforts surrounding the six hypervariable loop regions making up the antigen binding site. Accurate computational modeling of isolated protein loop regions can be quite difficult; consequently, modeling an antigen binding site that includes six loops is particularly challenging. In this work, we present a method for automatic modeling of the FV region of an immunoglobulin based upon the use of a precompiled antibody x-ray structure database, which serves as a source of framework and hypervariable region structural templates that are grafted together. We applied this method (on common desktop hardware) to the Second Antibody Modeling Assessment (AMA-II) target structures as well as an experimental specialized CDR-H3 loop modeling method. The results of the computational structure predictions will be presented and discussed.

  6. Assessment of fully automated antibody homology modeling protocols in molecular operating environment

    PubMed Central

    Maier, Johannes K X; Labute, Paul

    2014-01-01

    The success of antibody-based drugs has led to an increased demand for predictive computational tools to assist antibody engineering efforts surrounding the six hypervariable loop regions making up the antigen binding site. Accurate computational modeling of isolated protein loop regions can be quite difficult; consequently, modeling an antigen binding site that includes six loops is particularly challenging. In this work, we present a method for automatic modeling of the FV region of an immunoglobulin based upon the use of a precompiled antibody x-ray structure database, which serves as a source of framework and hypervariable region structural templates that are grafted together. We applied this method (on common desktop hardware) to the Second Antibody Modeling Assessment (AMA-II) target structures as well as an experimental specialized CDR-H3 loop modeling method. The results of the computational structure predictions will be presented and discussed. PMID:24715627

  7. Survey material choices in haematology EQA: a confounding factor in automated counting performance assessment

    PubMed Central

    De la Salle, Barbara

    2017-01-01

    The complete blood count (CBC) is one of the most frequently requested tests in laboratory medicine, perfo