Sample records for efficient automated methods

  1. Laboratory automation of high-quality and efficient ligand-binding assays for biotherapeutic drug development.

    PubMed

    Wang, Jin; Patel, Vimal; Burns, Daniel; Laycock, John; Pandya, Kinnari; Tsoi, Jennifer; DeSilva, Binodh; Ma, Mark; Lee, Jean

    2013-07-01

    Regulated bioanalytical laboratories that run ligand-binding assays in support of biotherapeutics development face ever-increasing demand to support more projects with greater efficiency. Laboratory automation is a tool that has the potential to improve both quality and efficiency in a bioanalytical laboratory. The success of laboratory automation requires thoughtful evaluation of program needs and fit-for-purpose strategies, followed by pragmatic implementation plans and continuous user support. In this article, we present the development of fit-for-purpose automation in total walk-away and flexible modular modes, and share our experience of sustaining automation use through vendor collaboration and teamwork to educate, promote and track the use of automation. The implementation of laboratory automation improves assay performance, data quality, process efficiency and method transfer to CROs in a regulated bioanalytical laboratory environment.

  2. Automated Measurement of Patient-Specific Tibial Slopes from MRI

    PubMed Central

    Amerinatanzi, Amirhesam; Summers, Rodney K.; Ahmadi, Kaveh; Goel, Vijay K.; Hewett, Timothy E.; Nyman, Edward

    2017-01-01

    Background: Multi-planar proximal tibial slopes may be associated with increased likelihood of osteoarthritis and anterior cruciate ligament injury, due in part to their role in checking the anterior-posterior stability of the knee. Established methods suffer repeatability limitations and lack the computational efficiency needed for intuitive clinical adoption. The aims of this study were to develop a novel automated approach and to compare its repeatability and computational efficiency against previously established methods. Methods: Tibial slope geometries were obtained via MRI and measured using an automated Matlab-based approach. Data were compared for repeatability and evaluated for computational efficiency. Results: Mean lateral tibial slope (LTS) for females (7.2°) was greater than for males (1.66°). Mean LTS in the lateral concavity zone was greater for females (7.8° vs. 4.2°). Mean medial tibial slope (MTS) for females was also greater (9.3° vs. 4.6°), and along the medial concavity zone female subjects demonstrated greater MTS. Conclusion: The automated method was more repeatable and computationally efficient than previously identified methods and may aid in the clinical assessment of knee injury risk and inform surgical planning and implant design efforts. PMID:28952547
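
    A minimal sketch of the core slope computation (illustrative only: the record does not include the authors' Matlab code, and the input points below are hypothetical). The slope of a plateau zone reduces to the angle of a least-squares line fitted to plateau surface points in a sagittal MRI slice:

        import numpy as np

        def tibial_slope_deg(plateau_xz):
            # plateau_xz: (N, 2) array of (anterior-posterior, superior-inferior)
            # coordinates of tibial plateau points from a segmented sagittal slice
            x, z = plateau_xz[:, 0], plateau_xz[:, 1]
            slope, _intercept = np.polyfit(x, z, 1)  # fit z = slope*x + intercept
            return np.degrees(np.arctan(slope))      # signed slope angle in degrees

        pts = np.array([[0.0, 0.0], [5.0, -0.6], [10.0, -1.2], [15.0, -1.9]])
        print(f"posterior slope of this sample: {abs(tibial_slope_deg(pts)):.1f} deg")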

  3. Automated optimization techniques for aircraft synthesis

    NASA Technical Reports Server (NTRS)

    Vanderplaats, G. N.

    1976-01-01

    Application of numerical optimization techniques to automated conceptual aircraft design is examined. These methods are shown to be a general and efficient way to obtain quantitative information for evaluating alternative new vehicle projects. Fully automated design is compared with traditional point design methods and time and resource requirements for automated design are given. The NASA Ames Research Center aircraft synthesis program (ACSYNT) is described with special attention to calculation of the weight of a vehicle to fly a specified mission. The ACSYNT procedures for automatically obtaining sensitivity of the design (aircraft weight, performance and cost) to various vehicle, mission, and material technology parameters are presented. Examples are used to demonstrate the efficient application of these techniques.

  3. Semi-automatic mapping of geological structures using UAV-based photogrammetric data: An image analysis approach

    NASA Astrophysics Data System (ADS)

    Vasuki, Yathunanthan; Holden, Eun-Jung; Kovesi, Peter; Micklethwaite, Steven

    2014-08-01

    Recent advances in data acquisition technologies, such as Unmanned Aerial Vehicles (UAVs), have led to a growing interest in capturing high-resolution rock surface images. However, due to the large volumes of data that can be captured in a short flight, efficient analysis of these data brings new challenges, especially in the time it takes to digitise maps and extract orientation data. We outline a semi-automated method that allows efficient mapping of geological faults using photogrammetric data of rock surfaces, generated from aerial photographs collected by a UAV. Our method harnesses advanced automated image analysis techniques and human data interaction to rapidly map structures and then calculate their dip and dip directions. Geological structures (faults, joints and fractures) are first detected from the primary photographic dataset and the equivalent three-dimensional (3D) structures are then identified within a 3D surface model generated by structure from motion (SfM). From this information the location, dip and dip direction of the geological structures are calculated. A structure map generated by our semi-automated method obtained a recall rate of 79.8% when compared against a fault map produced using expert manual digitising and interpretation methods. The semi-automated structure map was produced in 10 min, whereas the manual method took approximately 7 h. In addition, the dip and dip direction calculations made by our automated method differ from field measurements by a mean ± standard error of 1.9° ± 2.2° and 4.4° ± 2.6°, respectively. This shows the potential of our semi-automated method for accurate and efficient mapping of geological structures, particularly from remote, inaccessible or hazardous sites.
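
    Once a planar structure has been identified in the 3D model, its dip and dip direction follow directly from the plane's normal vector. A minimal sketch, assuming an x=east, y=north, z=up coordinate frame (not the authors' implementation):

        import numpy as np

        def dip_and_dip_direction(p1, p2, p3):
            # Plane through three points sampled on a fault surface.
            n = np.cross(np.asarray(p2) - np.asarray(p1), np.asarray(p3) - np.asarray(p1))
            if n[2] < 0:
                n = -n                                   # use the upward-pointing normal
            dip = np.degrees(np.arccos(n[2] / np.linalg.norm(n)))
            dip_direction = np.degrees(np.arctan2(n[0], n[1])) % 360.0  # azimuth from north
            return dip, dip_direction

        # A plane dropping 0.5 m per metre eastward: dip ~26.6 deg towards 090
        print(dip_and_dip_direction([0, 0, 0], [1, 0, -0.5], [0, 1, 0]))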

  5. A LSQR-type method provides a computationally efficient automated optimal choice of regularization parameter in diffuse optical tomography.

    PubMed

    Prakash, Jaya; Yalavarthy, Phaneendra K

    2013-03-01

    The aim of this work was to develop a computationally efficient automated method for the optimal choice of regularization parameter in diffuse optical tomography. The least-squares QR (LSQR)-type method that uses Lanczos bidiagonalization is known to be computationally efficient in performing the reconstruction procedure in diffuse optical tomography. It is deployed here within an optimization procedure that uses the simplex method to find the optimal regularization parameter. The proposed LSQR-type method is compared with traditional methods such as the L-curve, generalized cross-validation (GCV), and the recently proposed minimal residual method (MRM)-based choice of regularization parameter, using numerical and experimental phantom data. The results indicate that the proposed LSQR-type and MRM-based methods perform similarly in terms of reconstructed image quality, and both are superior to the L-curve and GCV-based methods. The computational complexity of the proposed method is at least five times lower than that of the MRM-based method. The LSQR-type method thus overcomes the computationally expensive nature of the MRM-based automated way of finding the optimal regularization parameter in diffuse optical tomographic imaging, making it more suitable for real-time deployment.
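
    The structure of such a search can be sketched as a simplex (Nelder-Mead) optimization over the damping parameter of a damped LSQR solve. The stopping criterion below is the Morozov discrepancy principle, used here only as a stand-in since the record does not give the paper's actual objective function; the Jacobian is a random placeholder:

        import numpy as np
        from scipy.optimize import minimize
        from scipy.sparse.linalg import lsqr

        rng = np.random.default_rng(0)
        J = rng.standard_normal((80, 120))         # stand-in sensitivity (Jacobian) matrix
        x_true = np.zeros(120); x_true[10] = 1.0   # single absorber
        noise = 0.01
        y = J @ x_true + noise * rng.standard_normal(80)
        target_resid = noise * np.sqrt(len(y))     # expected residual norm

        def objective(log_lam):
            lam = np.exp(log_lam[0])
            x = lsqr(J, y, damp=lam)[0]            # damped LSQR (Lanczos bidiagonalization)
            return (np.linalg.norm(y - J @ x) - target_resid) ** 2

        res = minimize(objective, x0=[np.log(0.1)], method="Nelder-Mead")
        print("chosen regularization parameter:", float(np.exp(res.x[0])))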

  6. Accuracy and efficiency of computer-aided anatomical analysis using 3D visualization software based on semi-automated and automated segmentations.

    PubMed

    An, Gao; Hong, Li; Zhou, Xiao-Bing; Yang, Qiong; Li, Mei-Qing; Tang, Xiang-Yang

    2017-03-01

    We investigated and compared the functionality of two 3D visualization software packages provided by a CT vendor and a third-party vendor, respectively. Using surgical anatomical measurement as a baseline, we evaluated the accuracy of 3D visualization and verified its utility in computer-aided anatomical analysis. The study cohort consisted of 50 adult cadavers fixed with the classical formaldehyde method. The computer-aided anatomical analysis was based on CT images (in DICOM format) acquired by helical scan with contrast enhancement, using a CT vendor-provided 3D visualization workstation (Syngo) and a third-party 3D visualization software package (Mimics) installed on a PC. Automated and semi-automated segmentations were utilized in the 3D visualization workstation and software, respectively, and the functionality and efficiency of the two segmentation methods were compared. Using surgical anatomical measurement as a baseline, the accuracy of 3D visualization based on automated and semi-automated segmentations was quantitatively compared. In semi-automated segmentation, the Mimics 3D visualization software outperformed the Syngo 3D visualization workstation. No significant difference was observed in anatomical data measured by the Syngo 3D visualization workstation and the Mimics 3D visualization software (P>0.05). Both the Syngo 3D visualization workstation provided by the CT vendor and the Mimics 3D visualization software from the third-party vendor possessed the functionality, efficiency and accuracy needed for computer-aided anatomical analysis.

  7. Validity of automated measurement of left ventricular ejection fraction and volume using the Philips EPIQ system.

    PubMed

    Hovnanians, Ninel; Win, Theresa; Makkiya, Mohammed; Zheng, Qi; Taub, Cynthia

    2017-11-01

    To assess the efficiency and reproducibility of automated measurements of left ventricular (LV) volumes and LV ejection fraction (LVEF) in comparison to the manually traced biplane Simpson's method. This is a single-center prospective study. Apical four- and two-chamber views were acquired in patients in sinus rhythm. Two operators independently measured LV volumes and LVEF using the biplane Simpson's method. In addition, the image analysis software a2DQ on the Philips EPIQ system was applied to automatically assess the LV volumes and LVEF. Time spent on each analysis, using both methods, was documented. Concordance of echocardiographic measures was evaluated using intraclass correlation (ICC) and Bland-Altman analysis. Manual tracing and automated measurement of LV volumes and LVEF were performed in 184 patients with a mean age of 67.3 ± 17.3 years and BMI of 28.0 ± 6.8 kg/m². ICC and Bland-Altman analysis showed good agreement between manual and automated methods measuring LVEF, end-systolic, and end-diastolic volumes. The average analysis time was significantly less using the automated method than manual tracing (116 vs 217 seconds/patient, P < .0001). Automated measurement using the novel image analysis software a2DQ on the Philips EPIQ system produced accurate, efficient, and reproducible assessment of LV volumes and LVEF compared with manual measurement.
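
    The agreement statistics used here are simple to reproduce; a minimal sketch of a Bland-Altman analysis (bias and 95% limits of agreement) on illustrative values, not the study's data:

        import numpy as np

        def bland_altman(manual, automated):
            diff = np.asarray(automated, float) - np.asarray(manual, float)
            bias = diff.mean()
            half_width = 1.96 * diff.std(ddof=1)   # 95% limits of agreement
            return bias, (bias - half_width, bias + half_width)

        ef_manual    = [55, 60, 42, 35, 65, 50]    # LVEF (%), hypothetical values
        ef_automated = [54, 62, 40, 36, 63, 52]
        bias, limits = bland_altman(ef_manual, ef_automated)
        print(f"bias = {bias:.2f}%, limits of agreement = {limits}")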

  8. [Automated analyser of organ cultured corneal endothelial mosaic].

    PubMed

    Gain, P; Thuret, G; Chiquet, C; Gavet, Y; Turc, P H; Théillère, C; Acquart, S; Le Petit, J C; Maugery, J; Campos, L

    2002-05-01

    Until now, the organ-cultured corneal endothelial mosaic has been assessed in France by cell counting using a calibrated graticule, or by drawing cells on a computerized image. The former method is unsatisfactory because it lacks an objective evaluation of cell surface and hexagonality and requires an experienced technician. The latter method is time-consuming and requires careful attention. We aimed to build an efficient, fast and easy-to-use automated digital analyzer of video images of the corneal endothelium. The hardware included a Pentium III® PC (800 MHz, 256 MB RAM), a Data Translation 3155 acquisition card, a Sony SC 75 CE CCD camera, and a 22-inch screen. Special functions for automated cell boundary determination consisted of plug-in programs included in the ImageTool software. Calibration was performed using a calibrated micrometer. Cell densities of 40 organ-cultured corneas measured by both manual and automated counting were compared using parametric tests (Student's t test for paired variables and the Pearson correlation coefficient). All steps were made more ergonomic, i.e., endothelial image capture, image selection, thresholding of multiple areas of interest, automated cell count, automated detection of errors in cell boundary drawing, and presentation of the results in an HTML file including the number of counted cells, cell density, coefficient of variation of cell area, cell surface histogram and cell hexagonality. The device was efficient: the global process lasted 7 minutes on average and did not require an experienced technician. The correlation between cell densities obtained with the two methods was high (r=+0.84, p<0.001). The results showed an under-estimation by manual counting compared with the automated method (2191±322 vs. 2273±457 cells/mm², p=0.046). Our automated endothelial cell analyzer is efficient and gives reliable results quickly and easily. A multicenter validation would allow us to standardize cell counts among cornea banks in our country.
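
    After the thresholding step produces a binary mask of cell regions, the automated count and density calculation is a connected-component operation. A minimal sketch with a fabricated mask (not the plug-in code described in the article):

        import numpy as np
        from scipy import ndimage

        def cell_density_per_mm2(mask, mm2_per_pixel):
            # Each connected region of the binary mask is counted as one cell.
            _labels, n_cells = ndimage.label(mask)
            return n_cells / (mask.size * mm2_per_pixel)

        mask = np.zeros((100, 100), dtype=bool)
        mask[10:20, 10:20] = True                  # fake cell 1
        mask[40:55, 40:52] = True                  # fake cell 2
        print(cell_density_per_mm2(mask, mm2_per_pixel=1e-6))  # 2 cells / 0.01 mm²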

  9. Use of an automated drug distribution cabinet system in a disaster response mobile emergency department.

    PubMed

    Morchel, Herman; Ogedegbe, Chinwe; Desai, Nilesh; Faley, Brian; Mahmood, Nasir; Moro, Gary Del; Feldman, Joseph

    2015-01-01

    This article describes the innovative use of an automated drug distribution cabinet system for medication supply in a disaster response mobile Emergency Department vehicle. Prior to the use of the automated drug distribution cabinet system described in this article, the mobile hospitals were stocked as needed with drugs in individual boxes and drawers. Experience with multiple deployments found this method to be cumbersome and labor intensive in preparation, operational use, and demobilization. For a recent deployment to provide emergency medical care at the 2014 Super Bowl football event, the automated drug distribution cabinet system in the institution's main campus Emergency Department was duplicated and incorporated into the mobile Emergency Department. This method of drug stocking and dispensing was found to be far more efficient than gathering and placing drugs in onboard drawers and racks. Automated drug distribution cabinet systems can be used to significantly improve patient care and overall efficiency in mobile hospital deployments.

  10. Automation of On-Board Flightpath Management

    NASA Technical Reports Server (NTRS)

    Erzberger, H.

    1981-01-01

    The status of concepts and techniques for the design of onboard flight path management systems is reviewed. Such systems are designed to increase flight efficiency and safety by automating the optimization of flight procedures onboard aircraft. After a brief review of the origins and functions of such systems, two complementary methods are described for attacking the key design problem, namely, the synthesis of efficient trajectories. One method optimizes en route, the other optimizes terminal area flight; both methods are rooted in optimal control theory. Simulation and flight test results are reviewed to illustrate the potential of these systems for fuel and cost savings.

  11. Computational efficiency for the surface renewal method

    NASA Astrophysics Data System (ADS)

    Kelley, Jason; Higgins, Chad

    2018-04-01

    Measuring surface fluxes using the surface renewal (SR) method requires programmatic algorithms for tabulation, algebraic calculation, and data quality control. A number of different methods have been published describing automated calibration of SR parameters. Because the SR method utilizes high-frequency (10 Hz+) measurements, some steps in the flux calculation are computationally expensive, especially when automating SR to perform many iterations of these calculations. Several new algorithms were written that perform the required calculations more efficiently and rapidly, and these were tested for sensitivity to the length of the flux averaging period, the ability to measure over a large range of lag timescales, and overall computational efficiency. The algorithms utilize signal processing techniques and algebraic simplifications, demonstrating simple modifications that dramatically improve computational efficiency. The results here complement efforts by other authors to standardize a robust and accurate computational SR method. Increased computation speed grants flexibility in implementing the SR method, opening new avenues for SR to be used in research, for applied monitoring, and in novel field deployments.
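
    The record does not list the specific algorithms that were rewritten, but a representative SR computation is the set of lagged structure functions used in Van Atta-type analysis, where vectorized array slicing replaces an explicit loop over samples. A sketch under that assumption:

        import numpy as np

        def structure_functions(series, lag, orders=(2, 3, 5)):
            # Moments of the lagged differences of a high-frequency scalar series;
            # the slice-based difference avoids a Python loop over samples.
            d = series[lag:] - series[:-lag]
            return {n: float(np.mean(d ** n)) for n in orders}

        temperature = np.random.default_rng(1).standard_normal(36000)  # 1 h at 10 Hz
        print(structure_functions(temperature, lag=5))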

  12. On automating domain connectivity for overset grids

    NASA Technical Reports Server (NTRS)

    Chiu, Ing-Tsau

    1994-01-01

    An alternative method for domain connectivity among systems of overset grids is presented. Reference uniform Cartesian systems of points are used to achieve highly efficient domain connectivity, and form the basis for a future fully automated system. The Cartesian systems are used to approximate body surfaces and to map the computational space of component grids. By exploiting the characteristics of Cartesian systems, Chimera-type hole-cutting and identification of donor elements for intergrid boundary points can be carried out very efficiently. The method is tested for a range of geometrically complex multiple-body overset grid systems.

  13. Automated Dissolution for Enteric-Coated Aspirin Tablets: A Case Study for Method Transfer to a RoboDis II.

    PubMed

    Ibrahim, Sarah A; Martini, Luigi

    2014-08-01

    Dissolution method transfer is a complicated yet common process in the pharmaceutical industry. With increased pharmaceutical product manufacturing and dissolution acceptance requirements, dissolution testing has become one of the most labor-intensive quality control testing methods. There is an increasing trend toward automation in dissolution testing, particularly in large pharmaceutical companies, to reduce variability and increase personnel efficiency. There is no official guideline for transferring a dissolution testing method from a manual or semi-automated to a fully automated dissolution tester. In this study, a manual multipoint dissolution testing procedure for an enteric-coated aspirin tablet was transferred effectively and reproducibly to a fully automated dissolution testing device, the RoboDis II. Enteric-coated aspirin samples were used as a model formulation to assess the feasibility and accuracy of media pH change during continuous automated dissolution testing. Several RoboDis II parameters were evaluated to ensure the integrity and equivalency of the dissolution method transfer from a manual dissolution tester. This study provides a systematic outline for the transfer of a manual dissolution testing protocol to an automated dissolution tester, and further supports that automated dissolution testers compliant with regulatory requirements and comparable to manual dissolution testers facilitate method transfer.

  14. An automation-assisted generic approach for biological sample preparation and LC-MS/MS method validation.

    PubMed

    Zhang, Jie; Wei, Shimin; Ayres, David W; Smith, Harold T; Tse, Francis L S

    2011-09-01

    Although it is well known that automation can significantly improve the efficiency of biological sample preparation in quantitative LC-MS/MS analysis, it has not been widely implemented in bioanalytical laboratories throughout the industry. This can be attributed to the lack of a sound strategy and practical procedures for working with robotic liquid-handling systems. Several comprehensive automation-assisted procedures for biological sample preparation and method validation were developed and qualified using two types of Hamilton Microlab liquid-handling robots. The procedures developed were generic, user-friendly and covered the majority of steps involved in routine sample preparation and method validation. Generic automation procedures were established as a practical approach to widely implementing automation in the routine bioanalysis of samples in support of drug-development programs.

  15. Input-output identification of controlled discrete manufacturing systems

    NASA Astrophysics Data System (ADS)

    Estrada-Vargas, Ana Paula; López-Mellado, Ernesto; Lesage, Jean-Jacques

    2014-03-01

    The automated construction of discrete event models from observations of a system's external behaviour is addressed. This problem, often referred to as system identification, allows models of ill-known (or even unknown) systems to be obtained. In this article, an identification method for discrete event systems (DESs) controlled by a programmable logic controller is presented. The method processes large quantities of observed long sequences of input/output signals generated by the controller and yields an interpreted Petri net model describing the closed-loop behaviour of the automated DES. The proposed technique allows the identification of actual complex systems because it is sufficiently efficient and well adapted to cope with both the technological characteristics of industrial controllers and data collection requirements. Based on polynomial-time algorithms, the method is implemented as an efficient software tool which constructs and draws the model automatically; an overview of this tool is given through a case study dealing with an automated manufacturing system.
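
    The article's algorithms are not reproduced in this record, but the first step of such identification can be sketched as collecting the observed transition relation from long sequences of controller I/O vectors, which takes time linear in the observation length:

        from collections import defaultdict

        def observed_transitions(sequences):
            # Each distinct I/O vector is treated as a state; every observed
            # change is recorded as an event labelled by the signals that flipped.
            transitions = defaultdict(set)
            for seq in sequences:
                for prev, cur in zip(seq, seq[1:]):
                    if prev != cur:
                        event = tuple(i for i, (a, b) in enumerate(zip(prev, cur)) if a != b)
                        transitions[prev].add((event, cur))
            return transitions

        obs = [((0, 0), (1, 0), (1, 1), (0, 1)), ((0, 0), (1, 0), (0, 0))]
        for state, outgoing in observed_transitions(obs).items():
            print(state, "->", sorted(outgoing))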

  16. Automation of diagnostic genetic testing: mutation detection by cyclic minisequencing.

    PubMed

    Alagrund, Katariina; Orpana, Arto K

    2014-01-01

    The rising role of nucleic acid testing in clinical decision making is creating a need for efficient and automated diagnostic nucleic acid test platforms. Clinical use of nucleic acid testing sets demands for shorter turnaround times (TATs), lower production costs and robust, reliable methods that can easily adopt new test panels and run rare tests on a random-access principle. Here we present a novel home-brew laboratory automation platform for diagnostic mutation testing. The platform is based on cyclic minisequencing (cMS) and two-color near-infrared (NIR) detection. Pipetting is automated using Tecan Freedom EVO pipetting robots and all assays are performed in 384-well microplate format. The automation platform includes a data processing system that controls all procedures, and automated patient result reporting to the hospital information system. We have found automated cMS to be a reliable, inexpensive and robust method for nucleic acid testing across a wide variety of diagnostic tests. The platform is currently in clinical use for over 80 mutations or polymorphisms. In addition to tests performed on blood samples, the system also performs an epigenetic test for methylation of the MGMT gene promoter, and companion diagnostic tests for analysis of KRAS and BRAF gene mutations from formalin-fixed, paraffin-embedded tumor samples. Automation of genetic test reporting has proved reliable and efficient, decreasing the workload of academic personnel.

  17. Application of automated measurement and verification to utility energy efficiency program data

    DOE PAGES

    Granderson, Jessica; Touzani, Samir; Fernandes, Samuel; ...

    2017-02-17

    Trustworthy savings calculations are critical to convincing regulators of both the cost-effectiveness of energy efficiency program investments and their ability to defer supply-side capital investments. Today’s methods for measurement and verification (M&V) of energy savings constitute a significant portion of the total costs of energy efficiency programs. They also require time-consuming data acquisition. A spectrum of savings calculation approaches is used, with some relying more heavily on measured data and others relying more heavily on estimated, modeled, or stipulated data. The increasing availability of “smart” meters and devices that report near-real time data, combined with new analytical approaches to quantify savings, offers the potential to conduct M&V more quickly and at lower cost, with comparable or improved accuracy. Commercial energy management and information systems (EMIS) technologies are beginning to offer these ‘M&V 2.0’ capabilities, and program administrators want to understand how they might assist programs in quickly and accurately measuring energy savings. This paper presents the results of recent testing of the ability to use automation to streamline the M&V process. In this paper, we apply an automated whole-building M&V tool to historic data sets from energy efficiency programs to begin to explore the accuracy, cost, and time trade-offs between more traditional M&V, and these emerging streamlined methods that use high-resolution energy data and automated computational intelligence. For the data sets studied we evaluate the fraction of buildings that are well suited to automated baseline characterization, the uncertainty in gross savings that is due to M&V 2.0 tools’ model error, indications of labor time savings, and how the automated savings results compare to prior, traditionally determined savings results. The results show that 70% of the buildings were well suited to the automated approach. In a majority of the cases (80%) savings and uncertainties for each individual building were quantified to levels above the criteria in ASHRAE Guideline 14. In addition, the findings suggest that M&V 2.0 methods may also offer time-savings relative to traditional approaches. Lastly, we discuss the implications of these findings relative to the potential evolution of M&V, and pilots currently being launched to test how M&V automation can be integrated into ratepayer-funded programs and professional implementation and evaluation practice.
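
    In its simplest form, the whole-building baseline that such M&V 2.0 tools automate is a regression of metered load on weather over the pre-measure period, with savings taken as the baseline prediction minus the metered post-measure load. A minimal sketch with illustrative numbers (production tools use richer time-of-week and change-point models):

        import numpy as np

        def fit_baseline(temps, loads):
            # Ordinary least squares: load ~ b0 + b1 * outdoor temperature
            X = np.column_stack([np.ones_like(temps), temps])
            beta, *_ = np.linalg.lstsq(X, loads, rcond=None)
            return beta

        t_pre = np.array([5.0, 10.0, 15.0, 20.0, 25.0, 30.0])      # pre-retrofit
        y_pre = np.array([40.0, 52.0, 64.0, 77.0, 90.0, 102.0])
        beta = fit_baseline(t_pre, y_pre)

        t_post = np.array([12.0, 22.0, 28.0])                      # post-retrofit
        y_post = np.array([50.0, 70.0, 85.0])
        predicted = np.column_stack([np.ones_like(t_post), t_post]) @ beta
        print("estimated savings per period:", (predicted - y_post).round(1))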

  18. An automated and efficient conformation search of L-cysteine and L,L-cystine using the scaled hypersphere search method

    NASA Astrophysics Data System (ADS)

    Kishimoto, Naoki; Waizumi, Hiroki

    2017-10-01

    Stable conformers of L-cysteine and L,L-cystine were explored using an automated and efficient conformational searching method. The Gibbs energies of the stable conformers of L-cysteine and L,L-cystine were calculated with the G4 and MP2 methods, respectively, at 450, 298.15, and 150 K. Assuming thermodynamic equilibrium and accounting for the barrier energies of the conformational isomerization pathways, the estimated ratios of the stable conformers of L-cysteine were compared with those determined by microwave spectroscopy in a previous study. Equilibrium structures of 1:1 and 2:1 cystine-Fe complexes were also calculated, and the energy of insertion of Fe into the disulfide bond was obtained.
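
    Conformer ratios at thermodynamic equilibrium follow from the relative Gibbs energies through a Boltzmann distribution. A minimal sketch at the three temperatures used in the study, with hypothetical energies rather than the paper's values:

        import numpy as np

        R = 8.314462618e-3  # gas constant, kJ/(mol*K)

        def populations(rel_gibbs_kj_mol, temperature):
            # Boltzmann weights normalized to fractional populations.
            w = np.exp(-np.asarray(rel_gibbs_kj_mol) / (R * temperature))
            return w / w.sum()

        dG = [0.0, 1.3, 2.1, 4.0]            # hypothetical relative Gibbs energies
        for T in (450.0, 298.15, 150.0):
            print(f"{T:7.2f} K:", populations(dG, T).round(3))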

  19. Comparison of Manual and Automated Measurements of Tracheobronchial Airway Geometry in Three Balb/c Mice.

    PubMed

    Islam, Asef; Oldham, Michael J; Wexler, Anthony S

    2017-11-01

    Mammalian lungs are comprised of large numbers of tracheobronchial airways that transition from the trachea to alveoli. Studies as wide ranging as pollutant deposition and lung development rely on accurate characterization of these airways. Advancements in CT imaging and the value of computational approaches in eliminating the burden of manual measurement are providing increased efficiency in obtaining these geometric data. In this study, we compare an automated method to a manual one for the first six generations of three Balb/c mouse lungs. We find good agreement between manual and automated methods and that much of the disagreement can be attributed to method precision. Using the automated method, we then provide anatomical data for the entire tracheobronchial airway tree from three Balb/c mice. Anat Rec, 300:2046-2057, 2017.

  1. Exploring the impact of an automated prescription-filling device on community pharmacy technician workflow

    PubMed Central

    Walsh, Kristin E.; Chui, Michelle Anne; Kieser, Mara A.; Williams, Staci M.; Sutter, Susan L.; Sutter, John G.

    2012-01-01

    Objective: To explore community pharmacy technician workflow change after implementation of an automated robotic prescription-filling device. Methods: At an independent community pharmacy in rural Mayville, WI, pharmacy technicians were observed before and 3 months after installation of an automated robotic prescription-filling device. The main outcome measures were sequences and timing of technician workflow steps, workflow interruptions, automation surprises, and workarounds. Results: Of the 77 and 80 observations made before and 3 months after robot installation, respectively, 17 different workflow sequences were observed before installation and 38 after installation. Average prescription filling time was reduced by 40 seconds per prescription with use of the robot. Workflow interruptions per observation increased from 1.49 to 1.79 (P = 0.11), and workarounds increased from 10% to 36% after robot use. Conclusion: Although automated prescription-filling devices can increase efficiency, workflow interruptions and workarounds may negate that efficiency. Assessing changes in workflow and sequencing of tasks that may result from the use of automation can help uncover opportunities for workflow policy and procedure redesign. PMID:21896459

  2. Automation and intensity modulated radiation therapy for individualized high-quality tangent breast treatment plans.

    PubMed

    Purdie, Thomas G; Dinniwell, Robert E; Fyles, Anthony; Sharpe, Michael B

    2014-11-01

    To demonstrate the large-scale clinical implementation and performance of an automated treatment planning methodology for tangential breast intensity modulated radiation therapy (IMRT). Automated planning was used to prospectively plan tangential breast IMRT treatment for 1661 patients between June 2009 and November 2012. The automated planning method emulates the manual steps performed by the user during treatment planning, including anatomical segmentation, beam placement, optimization, dose calculation, and plan documentation. The user specifies clinical requirements of the plan to be generated through a user interface embedded in the planning system. The automated method uses heuristic algorithms to define and simplify the technical aspects of the treatment planning process. Automated planning was used in 1661 of 1708 patients receiving tangential breast IMRT during the time interval studied. Therefore, automated planning was applicable in greater than 97% of cases. The time for treatment planning using the automated process is routinely 5 to 6 minutes on standard commercially available planning hardware. We have shown a consistent reduction in plan rejections from plan reviews through the standard quality control process or weekly quality review multidisciplinary breast rounds as we have automated the planning process for tangential breast IMRT. Clinical plan acceptance increased from 97.3% using our previous semiautomated inverse method to 98.9% using the fully automated method. Automation has become the routine standard method for treatment planning of tangential breast IMRT at our institution and is clinically feasible on a large scale. The method has wide clinical applicability and can add tremendous efficiency, standardization, and quality to the current treatment planning process. The use of automated methods can allow centers to more rapidly adopt IMRT and enhance access to the documented improvements in care for breast cancer patients, using technologies that are widely available and already in clinical use.

  3. On automating domain connectivity for overset grids

    NASA Technical Reports Server (NTRS)

    Chiu, Ing-Tsau; Meakin, Robert L.

    1995-01-01

    An alternative method for domain connectivity among systems of overset grids is presented. Reference uniform Cartesian systems of points are used to achieve highly efficient domain connectivity, and form the basis for a future fully automated system. The Cartesian systems are used to approximate body surfaces and to map the computational space of component grids. By exploiting the characteristics of Cartesian systems, Chimera type hole-cutting and identification of donor elements for intergrid boundary points can be carried out very efficiently. The method is tested for a range of geometrically complex multiple-body overset grid systems. A dynamic hole expansion/contraction algorithm is also implemented to obtain optimum domain connectivity; however, it is tested only for geometry of generic shapes.

  4. Semi-Automated Classification of Seafloor Data Collected on the Delmarva Inner Shelf

    NASA Astrophysics Data System (ADS)

    Sweeney, E. M.; Pendleton, E. A.; Brothers, L. L.; Mahmud, A.; Thieler, E. R.

    2017-12-01

    We tested automated classification methods on acoustic bathymetry and backscatter data collected by the U.S. Geological Survey (USGS) and National Oceanic and Atmospheric Administration (NOAA) on the Delmarva inner continental shelf to efficiently and objectively identify sediment texture and geomorphology. Automated classification techniques are generally less subjective and take significantly less time than manual classification methods. We used a semi-automated process combining unsupervised and supervised classification techniques to characterize the seafloor based on bathymetric slope and relative backscatter intensity. Statistical comparison of our automated classification results with those of a manual classification conducted on a subset of the acoustic imagery indicates that our automated method was highly accurate (95% total accuracy and 93% Kappa). Our methods resolve sediment ridges, zones of flat seafloor and areas of high and low backscatter. We compared our classification scheme with mean grain size statistics of samples collected in the study area and found strong correlations between backscatter intensity and sediment texture. High backscatter zones are associated with the presence of gravel and shells mixed with sand, and low backscatter areas are primarily clean sand or sand mixed with mud. Slope classes further elucidate textural and geomorphologic differences in the seafloor, such that steep slopes (>0.35°) with high backscatter are most often associated with the updrift side of sand ridges and bedforms, whereas low slopes with high backscatter correspond to coarse lag or shell deposits. Low backscatter and high slopes are most often found on the downdrift side of ridges and bedforms, and low backscatter and low slopes identify swale areas and sand sheets. We found that poor acoustic data quality was the most significant cause of inaccurate classification results, which required additional user input to mitigate. Our approach worked well along the primarily sandy Delmarva inner continental shelf and outlines a method that can be used to efficiently and consistently produce surficial geologic interpretations of the seafloor from ground-truthed geophysical or hydrographic data.
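
    The agreement figures quoted (total accuracy and Kappa) compare the automated class map against the manual one cell by cell. A minimal sketch with made-up class labels:

        import numpy as np

        def accuracy_and_kappa(manual, automated):
            manual, automated = np.ravel(manual), np.ravel(automated)
            p_obs = np.mean(manual == automated)             # total accuracy
            classes = np.union1d(manual, automated)
            p_exp = sum(np.mean(manual == c) * np.mean(automated == c) for c in classes)
            return p_obs, (p_obs - p_exp) / (1.0 - p_exp)    # Cohen's kappa

        manual_map    = [0, 0, 1, 1, 2, 2, 2, 0]
        automated_map = [0, 0, 1, 2, 2, 2, 2, 0]
        print(accuracy_and_kappa(manual_map, automated_map))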

  5. Adaptive radial basis function mesh deformation using data reduction

    NASA Astrophysics Data System (ADS)

    Gillebaart, T.; Blom, D. S.; van Zuijlen, A. H.; Bijl, H.

    2016-09-01

    Radial Basis Function (RBF) mesh deformation is one of the most robust mesh deformation methods available. Using the greedy (data reduction) method in combination with an explicit boundary correction results in an efficient method, as shown in the literature. However, to ensure the method remains robust, two issues are addressed: 1) how to ensure that the set of control points remains an accurate representation of the geometry in time and 2) how to use/automate the explicit boundary correction while ensuring high mesh quality. In this paper, we propose an adaptive RBF mesh deformation method which ensures the set of control points always represents the geometry/displacement up to a certain (user-specified) criterion, by keeping track of the boundary error throughout the simulation and re-selecting when needed. As opposed to the unit displacement and prescribed displacement selection methods, the adaptive method is more robust, user-independent and efficient for the cases considered. Secondly, the analysis of a single high-aspect-ratio cell is used to formulate an equation for the required correction radius, depending on the characteristics of the correction function used, the maximum aspect ratio, the minimum first cell height and the boundary error. Based on this analysis, two new radial basis correction functions are derived and proposed. The proposed automated procedure is verified while varying the correction function, the Reynolds number (and thus the first cell height and aspect ratio) and the boundary error. Finally, the parallel efficiency is studied for the two adaptive methods, unit displacement and prescribed displacement, for both the CPU and memory formulations, with a 2D oscillating and translating airfoil with an oscillating flap, a 3D flexible locally deforming tube and a deforming wind turbine blade. Generally, the memory formulation requires less work (due to the large amount of work required for evaluating RBFs), but its parallel efficiency is reduced by the limited bandwidth available between CPU and memory. In terms of parallel efficiency/scaling the different methods studied perform similarly, with the greedy algorithm being the bottleneck. In terms of absolute computational work the adaptive methods are better for the cases studied due to their more efficient selection of the control points. By automating most of the RBF mesh deformation, a robust, efficient and almost user-independent mesh deformation method is presented.
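
    A minimal sketch of the greedy data-reduction step at the heart of this family of methods: control points are added where the boundary displacement is currently worst represented, until a user-specified error criterion is met. The Gaussian basis, points and displacements below are fabricated for illustration; this is not the authors' code:

        import numpy as np

        def greedy_rbf_select(pts, disp, tol, phi=lambda r: np.exp(-(4.0 * r) ** 2)):
            # Start from the boundary node with the largest displacement.
            sel = [int(np.argmax(np.linalg.norm(disp, axis=1)))]
            while True:
                r = np.linalg.norm(pts[:, None, :] - pts[sel][None, :, :], axis=2)
                weights = np.linalg.solve(phi(r[sel]), disp[sel])   # exact on selected pts
                err = np.linalg.norm(phi(r) @ weights - disp, axis=1)
                if err.max() < tol or len(sel) == len(pts):
                    return np.array(sel), weights
                sel.append(int(np.argmax(err)))                     # add the worst point

        pts = np.random.default_rng(2).random((200, 2))             # boundary nodes
        disp = np.column_stack([0.05 * np.sin(3.0 * pts[:, 0]), np.zeros(200)])
        sel, _ = greedy_rbf_select(pts, disp, tol=1e-3)
        print(f"{len(sel)} of {len(pts)} boundary points kept as control points")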

  6. The Influence of Rater Effects in Training Sets on the Psychometric Quality of Automated Scoring for Writing Assessments

    ERIC Educational Resources Information Center

    Wind, Stefanie A.; Wolfe, Edward W.; Engelhard, George, Jr.; Foltz, Peter; Rosenstein, Mark

    2018-01-01

    Automated essay scoring engines (AESEs) are becoming increasingly popular as an efficient method for performance assessments in writing, including many language assessments that are used worldwide. Before they can be used operationally, AESEs must be "trained" using machine-learning techniques that incorporate human ratings. However, the…

  7. Automated sizing of large structures by mixed optimization methods

    NASA Technical Reports Server (NTRS)

    Sobieszczanski, J.; Loendorf, D.

    1973-01-01

    A procedure for automating the sizing of wing-fuselage airframes was developed and implemented in the form of an operational program. The program combines fully stressed design to determine an overall material distribution with mass-strength and mathematical programming methods to design structural details accounting for realistic design constraints. The practicality and efficiency of the procedure are demonstrated for transport aircraft configurations. The methodology is sufficiently general to be applicable to other large and complex structures.

  8. A combination of HPLC and automated data analysis for monitoring the efficiency of high-pressure homogenization.

    PubMed

    Eggenreich, Britta; Rajamanickam, Vignesh; Wurm, David Johannes; Fricke, Jens; Herwig, Christoph; Spadiut, Oliver

    2017-08-01

    Cell disruption is a key unit operation to make valuable, intracellular target products accessible for further downstream unit operations. Independent of the applied cell disruption method, each cell disruption process must be evaluated with respect to disruption efficiency and potential product loss. Current state-of-the-art methods, like measuring the total amount of released protein and plating-out assays, are usually time-delayed and involve manual intervention, making them error-prone. An automated method to monitor cell disruption efficiency at-line has not been available to date. In the current study we implemented a methodology, which we had originally developed to monitor E. coli cell integrity during bioreactor cultivations, to automatically monitor and evaluate cell disruption of a recombinant E. coli strain by high-pressure homogenization. We compared our tool with a library of state-of-the-art methods, analyzed the effect of freezing the biomass before high-pressure homogenization and finally investigated this unit operation in more detail by a multivariate approach. A combination of HPLC and automated data analysis provides a valuable, novel tool to monitor and evaluate cell disruption processes. Our methodology, which can be used in both upstream (USP) and downstream processing (DSP), is a valuable tool for evaluating cell disruption processes, as it can be implemented at-line, gives results within minutes of sampling and needs no manual intervention.

  9. Automated Protein Biomarker Analysis: on-line extraction of clinical samples by Molecularly Imprinted Polymers

    NASA Astrophysics Data System (ADS)

    Rossetti, Cecilia; Świtnicka-Plak, Magdalena A.; Grønhaug Halvorsen, Trine; Cormack, Peter A. G.; Sellergren, Börje; Reubsaet, Léon

    2017-03-01

    Robust biomarker quantification is essential for the accurate diagnosis of diseases and is of great value in cancer management. In this paper, an innovative diagnostic platform is presented which provides automated molecularly imprinted solid-phase extraction (MISPE) followed by liquid chromatography-mass spectrometry (LC-MS) for biomarker determination, using ProGastrin Releasing Peptide (ProGRP), a highly sensitive biomarker for Small Cell Lung Cancer, as a model. Molecularly imprinted polymer microspheres were synthesized by precipitation polymerization, and analytical optimization of the most promising material led to the development of an automated quantification method for ProGRP. The method enabled analysis of patient serum samples with elevated ProGRP levels. The automated extraction permitted particularly low sample volumes within a time-efficient method, demonstrating the potential of such a strategy in a clinical setting.

  10. [DNA Extraction from Old Bones by AutoMate Express™ System].

    PubMed

    Li, B; Lü, Z

    2017-08-01

    To establish a method for extracting DNA from old bones with the AutoMate Express™ system, bones were ground into powder with a freezer mill. After extraction by the AutoMate Express™, DNA was amplified and genotyped with the Identifiler® Plus and MiniFiler™ kits. DNA was extracted within 3 hours from 10 old bone samples that had been kept in different environments, with postmortem intervals of 10 to 20 years. Complete STR typing results were obtained from 8 samples. The AutoMate Express™ system can quickly and efficiently extract DNA from old bones and can be applied in forensic practice.

  11. An evaluation of NASA's program in human factors research: Aircrew-vehicle system interaction

    NASA Technical Reports Server (NTRS)

    1982-01-01

    Research in human factors in the aircraft cockpit and a proposed program augmentation were reviewed. The dramatic growth of microprocessor technology makes it entirely feasible to automate increasingly more functions in the aircraft cockpit, and the promise of improved vehicle performance, efficiency, and safety through automation makes highly automated flight inevitable. However, an organized database and validated methodology for predicting the effects of automation on human performance, and thus on safety, are lacking; without them, increased automation may introduce new risks. Efforts should be concentrated on developing methods and techniques for analyzing man-machine interactions, including human workload and prediction of performance.

  12. High-density grids for efficient data collection from multiple crystals

    PubMed Central

    Baxter, Elizabeth L.; Aguila, Laura; Alonso-Mori, Roberto; Barnes, Christopher O.; Bonagura, Christopher A.; Brehmer, Winnie; Brunger, Axel T.; Calero, Guillermo; Caradoc-Davies, Tom T.; Chatterjee, Ruchira; Degrado, William F.; Fraser, James S.; Ibrahim, Mohamed; Kern, Jan; Kobilka, Brian K.; Kruse, Andrew C.; Larsson, Karl M.; Lemke, Heinrik T.; Lyubimov, Artem Y.; Manglik, Aashish; McPhillips, Scott E.; Norgren, Erik; Pang, Siew S.; Soltis, S. M.; Song, Jinhu; Thomaston, Jessica; Tsai, Yingssu; Weis, William I.; Woldeyes, Rahel A.; Yachandra, Vittal; Yano, Junko; Zouni, Athina; Cohen, Aina E.

    2016-01-01

    Higher throughput methods to mount and collect data from multiple small and radiation-sensitive crystals are important to support challenging structural investigations using microfocus synchrotron beamlines. Furthermore, efficient sample-delivery methods are essential to carry out productive femtosecond crystallography experiments at X-ray free-electron laser (XFEL) sources such as the Linac Coherent Light Source (LCLS). To address these needs, a high-density sample grid useful as a scaffold for both crystal growth and diffraction data collection has been developed and utilized for efficient goniometer-based sample delivery at synchrotron and XFEL sources. A single grid contains 75 mounting ports and fits inside an SSRL cassette or uni-puck storage container. The use of grids with an SSRL cassette expands the cassette capacity up to 7200 samples. Grids may also be covered with a polymer film or sleeve for efficient room-temperature data collection from multiple samples. New automated routines have been incorporated into the Blu-Ice/DCSS experimental control system to support grids, including semi-automated grid alignment, fully automated positioning of grid ports, rastering and automated data collection. Specialized tools have been developed to support crystallization experiments on grids, including a universal adaptor, which allows grids to be filled by commercial liquid-handling robots, as well as incubation chambers, which support vapor-diffusion and lipidic cubic phase crystallization experiments. Experiments in which crystals were loaded into grids or grown on grids using liquid-handling robots and incubation chambers are described. Crystals were screened at LCLS-XPP and SSRL BL12-2 at room temperature and cryogenic temperatures. PMID:26894529

  15. High-density grids for efficient data collection from multiple crystals

    DOE PAGES

    Baxter, Elizabeth L.; Aguila, Laura; Alonso-Mori, Roberto; ...

    2015-11-03

    Higher throughput methods to mount and collect data from multiple small and radiation-sensitive crystals are important to support challenging structural investigations using microfocus synchrotron beamlines. Furthermore, efficient sample-delivery methods are essential to carry out productive femtosecond crystallography experiments at X-ray free-electron laser (XFEL) sources such as the Linac Coherent Light Source (LCLS). To address these needs, a high-density sample grid useful as a scaffold for both crystal growth and diffraction data collection has been developed and utilized for efficient goniometer-based sample delivery at synchrotron and XFEL sources. A single grid contains 75 mounting ports and fits inside an SSRL cassettemore » or uni-puck storage container. The use of grids with an SSRL cassette expands the cassette capacity up to 7200 samples. Grids may also be covered with a polymer film or sleeve for efficient room-temperature data collection from multiple samples. New automated routines have been incorporated into theBlu-Ice/DCSSexperimental control system to support grids, including semi-automated grid alignment, fully automated positioning of grid ports, rastering and automated data collection. Specialized tools have been developed to support crystallization experiments on grids, including a universal adaptor, which allows grids to be filled by commercial liquid-handling robots, as well as incubation chambers, which support vapor-diffusion and lipidic cubic phase crystallization experiments. Experiments in which crystals were loaded into grids or grown on grids using liquid-handling robots and incubation chambers are described. As a result, crystals were screened at LCLS-XPP and SSRL BL12-2 at room temperature and cryogenic temperatures.« less

  13. Automated solid-phase subcloning based on beads brought into proximity by magnetic force.

    PubMed

    Hudson, Elton P; Nikoshkov, Andrej; Uhlen, Mathias; Rockberg, Johan

    2012-01-01

    In the fields of proteomics, metabolic engineering and synthetic biology there is a need for high-throughput and reliable cloning methods to facilitate construction of expression vectors and genetic pathways. Here, we describe a new approach for solid-phase cloning in which both the vector and the gene are immobilized to separate paramagnetic beads and brought into proximity by magnetic force. Ligation events were directly evaluated using fluorescent-based microscopy and flow cytometry. The highest ligation efficiencies were obtained when gene- and vector-coated beads were brought into close contact by application of a magnet during the ligation step. An automated procedure was developed using a laboratory workstation to transfer genes into various expression vectors and more than 95% correct clones were obtained in a number of various applications. The method presented here is suitable for efficient subcloning in an automated manner to rapidly generate a large number of gene constructs in various vectors intended for high throughput applications.

  17. Automated Solid-Phase Subcloning Based on Beads Brought into Proximity by Magnetic Force

    PubMed Central

    Hudson, Elton P.; Nikoshkov, Andrej; Uhlen, Mathias; Rockberg, Johan

    2012-01-01

    In the fields of proteomics, metabolic engineering and synthetic biology there is a need for high-throughput and reliable cloning methods to facilitate construction of expression vectors and genetic pathways. Here, we describe a new approach for solid-phase cloning in which both the vector and the gene are immobilized to separate paramagnetic beads and brought into proximity by magnetic force. Ligation events were directly evaluated using fluorescence-based microscopy and flow cytometry. The highest ligation efficiencies were obtained when gene- and vector-coated beads were brought into close contact by application of a magnet during the ligation step. An automated procedure was developed using a laboratory workstation to transfer genes into various expression vectors, and more than 95% correct clones were obtained in a number of different applications. The method presented here is suitable for efficient subcloning in an automated manner to rapidly generate a large number of gene constructs in various vectors intended for high-throughput applications. PMID:22624028

  18. Automated 4D analysis of dendritic spine morphology: applications to stimulus-induced spine remodeling and pharmacological rescue in a disease model

    PubMed Central

    2011-01-01

    Uncovering the mechanisms that regulate dendritic spine morphology has been limited, in part, by the lack of efficient and unbiased methods for analyzing spines. Here, we describe an automated 3D spine morphometry method and its application to spine remodeling in live neurons and spine abnormalities in a disease model. We anticipate that this approach will advance studies of synapse structure and function in brain development, plasticity, and disease. PMID:21982080

  19. Deterministic binary vectors for efficient automated indexing of MEDLINE/PubMed abstracts.

    PubMed

    Wahle, Manuel; Widdows, Dominic; Herskovic, Jorge R; Bernstam, Elmer V; Cohen, Trevor

    2012-01-01

    The need to maintain accessibility of the biomedical literature has led to development of methods to assist human indexers by recommending index terms for newly encountered articles. Given the rapid expansion of this literature, it is essential that these methods be scalable. Document vector representations are commonly used for automated indexing, and Random Indexing (RI) provides the means to generate them efficiently. However, RI is difficult to implement in real-world indexing systems, as (1) efficient nearest-neighbor search requires retaining all document vectors in RAM, and (2) it is necessary to maintain a store of randomly generated term vectors to index future documents. Motivated by these concerns, this paper documents the development and evaluation of a deterministic binary variant of RI. The increased capacity demonstrated by binary vectors has implications for information retrieval, and the elimination of the need to retain term vectors facilitates distributed implementations, enhancing the scalability of RI.
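    A minimal sketch of the deterministic idea described above: instead of storing randomly generated term vectors, each term's binary vector is regenerated on demand by seeding a PRNG with a hash of the term. The dimensionality, hashing scheme and majority-vote superposition below are illustrative assumptions, not the authors' implementation.

    ```python
    import hashlib
    import numpy as np

    DIM = 1024  # dimensionality of the binary vectors (illustrative choice)

    def term_vector(term):
        """Deterministically regenerate a binary vector for a term.

        Seeding a PRNG with a hash of the term removes the need to keep a
        store of randomly generated term vectors for future documents.
        """
        seed = int.from_bytes(hashlib.sha256(term.encode()).digest()[:8], "big")
        rng = np.random.default_rng(seed)
        return rng.integers(0, 2, size=DIM, dtype=np.uint8)

    def document_vector(terms):
        """Superpose term vectors by bitwise majority vote (ties round up)."""
        stack = np.stack([term_vector(t) for t in terms])
        return (2 * stack.sum(axis=0) >= len(terms)).astype(np.uint8)

    def hamming_similarity(a, b):
        """Similarity in [0, 1]; 1.0 means identical bit patterns."""
        return 1.0 - np.count_nonzero(a != b) / a.size

    doc_a = document_vector(["indexing", "medline", "abstract"])
    doc_b = document_vector(["indexing", "pubmed", "abstract"])
    print(f"similarity: {hamming_similarity(doc_a, doc_b):.3f}")
    ```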

  20. Deterministic Binary Vectors for Efficient Automated Indexing of MEDLINE/PubMed Abstracts

    PubMed Central

    Wahle, Manuel; Widdows, Dominic; Herskovic, Jorge R.; Bernstam, Elmer V.; Cohen, Trevor

    2012-01-01

    The need to maintain accessibility of the biomedical literature has led to development of methods to assist human indexers by recommending index terms for newly encountered articles. Given the rapid expansion of this literature, it is essential that these methods be scalable. Document vector representations are commonly used for automated indexing, and Random Indexing (RI) provides the means to generate them efficiently. However, RI is difficult to implement in real-world indexing systems, as (1) efficient nearest-neighbor search requires retaining all document vectors in RAM, and (2) it is necessary to maintain a store of randomly generated term vectors to index future documents. Motivated by these concerns, this paper documents the development and evaluation of a deterministic binary variant of RI. The increased capacity demonstrated by binary vectors has implications for information retrieval, and the elimination of the need to retain term vectors facilitates distributed implementations, enhancing the scalability of RI. PMID:23304369

  1. The Status and Promise of Advanced M&V: An Overview of “M&V 2.0” Methods, Tools, and Applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Franconi, Ellen; Gee, Matt; Goldberg, Miriam

    Advanced measurement and verification (M&V) of energy efficiency savings, often referred to as M&V 2.0 or advanced M&V, is currently an object of much industry attention. Thus far, however, there has been a lack of clarity about what techniques M&V 2.0 includes, how those techniques differ from traditional approaches, what the key considerations are for their use, and what value propositions M&V 2.0 presents to different stakeholders. The objective of this paper is to provide background information and frame key discussion points related to advanced M&V. The paper identifies the benefits, methods, and requirements of advanced M&V and outlines key technical issues for applying these methods. It presents an overview of the distinguishing elements of M&V 2.0 tools and of how the industry is addressing needs for tool testing, consistency, and standardization, and it identifies opportunities for collaboration. In this paper, we consider two key features of M&V 2.0: (1) automated analytics that can provide ongoing, near-real-time savings estimates, and (2) increased data granularity in terms of frequency, volume, or end-use detail. Greater data granularity for large numbers of customers, such as that derived from comprehensive implementation of advanced metering infrastructure (AMI) systems, leads to very large data volumes. This drives interest in automated processing systems. It is worth noting, however, that automated processing can provide value even when applied to less granular data, such as monthly consumption data series. Likewise, more granular data, such as interval or end-use data, delivers value with or without automated processing, provided the processing is manageable. But it is the combination of greater data detail with automated processing that offers the greatest opportunity for value. Using M&V methods that capture load shapes together with automated processing can determine savings in near-real time to provide stakeholders with more timely and detailed information. This information can be used to inform ongoing building operations, provide early input on energy efficiency program design, or assess the impact of efficiency by location and time of day. Stakeholders who can make use of such information include regulators, energy efficiency program administrators, program evaluators, contractors and aggregators, building owners, the investment community, and grid planners. Although each stakeholder has its own priorities and challenges related to savings measurement and verification, the potential exists for all to draw from a single set of efficiency valuation data. Such an integrated approach could provide a base consistency across stakeholder uses.

  2. Willingness to pay for more efficient irrigation techniques in the Lake Karla basin, Greece.

    NASA Astrophysics Data System (ADS)

    Mylopoulos, Nikitas; Fafoutis, Chrysostomos

    2014-05-01

    Thessaly, the second largest plain of Greece, is an intensively cultivated agricultural region. The intense and widespread agriculture of hydrophilic crops, such as cotton, has led to a remarkable water demand increase, which is usually covered by the overexploitation of groundwater resources. The Lake Karla basin is a prominent example of this unsustainable practice. Competition for the limited available freshwater resources in the Lake Karla basin is expected to increase in the near future as demand for irrigation water increases and drought years are expected to increase due to climate change. Together with the Unions of Agricultural Cooperatives, the Local Organizations of Land Reclamation are planning to introduce more efficient, water-saving automated drip irrigation in the area among farmers who currently use non-automated drip irrigation, in order to ensure that these farmers can better cope with drought years and that water will be used more efficiently in crop production. Saving water in irrigated agriculture is expected to benefit both farmers and the restoration of Lake Karla and its wildlife, such as plants and birds. The aim of this study is to understand and record the farmers' opinions regarding the use of irrigation water and the restoration of Lake Karla, and to extract valuable conclusions and perform detailed analysis of the criteria for a new irrigation method. A choice experiment with face-to-face interviews was conducted, using a random sample of 150 open field farmers from the study area. The farmers, who use the non-automated drip irrigation method and whose farms are located within the watershed of Lake Karla, were interviewed regarding their willingness to switch to more efficient irrigation techniques, such as automated and controlled drip irrigation. The most important benefits of automated drip irrigation are an increase in crop yield, as plants are given water in a more precise way (based on their needs during the growing season), and a saving in water use. The choice experiment presents to the farmers two possible options for automated drip irrigation, described in terms of expected increase in crop yield, expected water saving, the duration of the restoration of Lake Karla to its original state before it was drained in the 1960s, and the corresponding investment cost. The survey results show that socio-demographic factors and the average annual income influence the criteria and the views of farmers on a possible investment in the new method of automated drip irrigation. Moreover, there is a positive demand and willingness to pay for automated drip irrigation from the farmers in order to increase crop yield and speed up the restoration of Lake Karla, considering that they are highly dependent on it.
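    For readers unfamiliar with how willingness to pay is extracted from a choice experiment: in a conditional logit model, the marginal WTP for an attribute is commonly taken as the negative ratio of the attribute coefficient to the cost coefficient. The sketch below uses hypothetical coefficients; the study's actual estimates are not reproduced here.

    ```python
    # Illustrative coefficients from a hypothetical conditional logit fit;
    # attribute names are placeholders for the attributes in the survey.
    beta = {"crop_yield": 0.042, "water_saving": 0.031,
            "restoration_speed": 0.018, "cost": -0.0007}

    def willingness_to_pay(attribute):
        """Marginal WTP: amount a farmer would pay for one unit of the attribute."""
        return -beta[attribute] / beta["cost"]

    for attr in ("crop_yield", "water_saving", "restoration_speed"):
        print(f"WTP for {attr}: {willingness_to_pay(attr):.1f} EUR per unit")
    ```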

  3. An entirely automated method to score DSS-induced colitis in mice by digital image analysis of pathology slides

    PubMed Central

    Kozlowski, Cleopatra; Jeet, Surinder; Beyer, Joseph; Guerrero, Steve; Lesch, Justin; Wang, Xiaoting; DeVoss, Jason; Diehl, Lauri

    2013-01-01

    The DSS (dextran sulfate sodium) model of colitis is a mouse model of inflammatory bowel disease. Microscopic symptoms include loss of crypt cells from the gut lining and infiltration of inflammatory cells into the colon. An experienced pathologist requires several hours per study to score histological changes in selected regions of the mouse gut. In order to increase the efficiency of scoring, Definiens Developer software was used to devise an entirely automated method to quantify histological changes in the whole H&E slide. When the algorithm was applied to slides from historical drug-discovery studies, automated scores classified 88% of drug candidates in the same way as pathologists’ scores. In addition, another automated image analysis method was developed to quantify colon-infiltrating macrophages, neutrophils, B cells and T cells in immunohistochemical stains of serial sections of the H&E slides. The timing of neutrophil and macrophage infiltration had the highest correlation to pathological changes, whereas T and B cell infiltration occurred later. Thus, automated image analysis enables quantitative comparisons between tissue morphology changes and cell-infiltration dynamics. PMID:23580198

  4. RNA isolation from mammalian cells using porous polymer monoliths: an approach for high-throughput automation.

    PubMed

    Chatterjee, Anirban; Mirer, Paul L; Zaldivar Santamaria, Elvira; Klapperich, Catherine; Sharon, Andre; Sauer-Budge, Alexis F

    2010-06-01

    The life science and healthcare communities have been redefining the importance of ribonucleic acid (RNA) through the study of small molecule RNA (in RNAi/siRNA technologies), micro RNA (in cancer research and stem cell research), and mRNA (gene expression analysis for biologic drug targets). Research in this field increasingly requires efficient and high-throughput isolation techniques for RNA. Currently, several commercial kits are available for isolating RNA from cells. Although the quality and quantity of RNA yielded from these kits are sufficiently good for many purposes, limitations exist in terms of extraction efficiency from small cell populations and the ability to automate the extraction process. Traditionally, automating a process decreases the cost and personnel time while simultaneously increasing the throughput and reproducibility. As the RNA field matures, new methods for automating its extraction, especially from low cell numbers and in high throughput, are needed to achieve these improvements. The technology presented in this article is a step toward this goal. The method is based on a solid-phase extraction technology using a porous polymer monolith (PPM). A novel cell lysis approach and a larger binding surface throughout the PPM extraction column ensure a high yield from small starting samples, increasing sensitivity and reducing indirect costs in cell culture and sample storage. The method ensures a fast and simple procedure for RNA isolation from eukaryotic cells, with a high yield both in terms of quality and quantity. The technique is amenable to automation and streamlined workflow integration, with possible miniaturization of the sample handling process making it suitable for high-throughput applications.

  5. A New Automated Method and Sample Data Flow for Analysis of Volatile Nitrosamines in Human Urine*

    PubMed Central

    Hodgson, James A.; Seyler, Tiffany H.; McGahee, Ernest; Arnstein, Stephen; Wang, Lanqing

    2016-01-01

    Volatile nitrosamines (VNAs) are a group of compounds classified as probable (group 2A) and possible (group 2B) carcinogens in humans. Along with certain foods and contaminated drinking water, VNAs are detected at high levels in tobacco products and in both mainstream and sidestream smoke. Our laboratory monitors six urinary VNAs—N-nitrosodimethylamine (NDMA), N-nitrosomethylethylamine (NMEA), N-nitrosodiethylamine (NDEA), N-nitrosopiperidine (NPIP), N-nitrosopyrrolidine (NPYR), and N-nitrosomorpholine (NMOR)—using isotope dilution GC-MS/MS (QQQ) for large population studies such as the National Health and Nutrition Examination Survey (NHANES). In this paper, we report for the first time a new automated sample preparation method to more efficiently quantitate these VNAs. Automation is done using Hamilton STAR™ and Caliper Staccato™ workstations. This new automated method reduces sample preparation time from 4 hours to 2.5 hours while maintaining precision (inter-run CV < 10%) and accuracy (85% - 111%). More importantly this method increases sample throughput while maintaining a low limit of detection (<10 pg/mL) for all analytes. A streamlined sample data flow was created in parallel to the automated method, in which samples can be tracked from receiving to final LIMs output with minimal human intervention, further minimizing human error in the sample preparation process. This new automated method and the sample data flow are currently applied in bio-monitoring of VNAs in the US non-institutionalized population NHANES 2013-2014 cycle. PMID:26949569
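    A schematic of the quantitation step in an isotope dilution workflow like the one above: the analyte/internal-standard peak-area ratio is calibrated against spiked standards and unknowns are back-calculated from the fitted line. All peak areas and concentrations below are hypothetical.

    ```python
    import numpy as np

    # Hypothetical calibration: analyte/internal-standard peak-area ratios
    # measured for spiked standards of known concentration (pg/mL).
    cal_conc  = np.array([10, 50, 100, 500, 1000], dtype=float)
    cal_ratio = np.array([0.021, 0.102, 0.199, 1.010, 1.985])

    # Ordinary least-squares line: ratio = slope * conc + intercept
    slope, intercept = np.polyfit(cal_conc, cal_ratio, 1)

    def quantify(area_analyte, area_istd):
        """Back-calculate concentration from an unknown's area ratio."""
        ratio = area_analyte / area_istd
        return (ratio - intercept) / slope

    print(f"NDMA: {quantify(15400, 30100):.1f} pg/mL")
    ```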

  6. Reproducibility of myelin content-based human habenula segmentation at 3 Tesla.

    PubMed

    Kim, Joo-Won; Naidich, Thomas P; Joseph, Joshmi; Nair, Divya; Glasser, Matthew F; O'halloran, Rafael; Doucet, Gaelle E; Lee, Won Hee; Krinsky, Hannah; Paulino, Alejandro; Glahn, David C; Anticevic, Alan; Frangou, Sophia; Xu, Junqian

    2018-03-26

    In vivo morphological study of the human habenula, a pair of small epithalamic nuclei adjacent to the dorsomedial thalamus, has recently gained significant interest for its role in reward and aversion processing. However, segmenting the habenula from in vivo magnetic resonance imaging (MRI) is challenging due to the habenula's small size and low anatomical contrast. Although manual and semi-automated habenula segmentation methods have been reported, the test-retest reproducibility of the segmented habenula volume and the consistency of the boundaries of habenula segmentation have not been investigated. In this study, we evaluated the intra- and inter-site reproducibility of in vivo human habenula segmentation from 3T MRI (0.7-0.8 mm isotropic resolution) using our previously proposed semi-automated myelin contrast-based method and its fully-automated version, as well as a previously published manual geometry-based method. The habenula segmentation using our semi-automated method showed consistent boundary definition (high Dice coefficient, low mean distance, and moderate Hausdorff distance) and reproducible volume measurement (low coefficient of variation). Furthermore, the habenula boundary in our semi-automated segmentation from 3T MRI agreed well with that in the manual segmentation from 7T MRI (0.5 mm isotropic resolution) of the same subjects. Overall, our proposed semi-automated habenula segmentation showed reliable and reproducible habenula localization, while its fully-automated version offers an efficient way for large sample analysis. © 2018 Wiley Periodicals, Inc.

  7. Efficiency of an Automated Reception and Turnaround Time Management System for the Phlebotomy Room

    PubMed Central

    Yun, Soon Gyu; Park, Eun Su; Bang, Hae In; Kang, Jung Gu

    2016-01-01

    Background Recent advances in laboratory information systems have largely been focused on automation. However, the phlebotomy services have not been completely automated. To address this issue, we introduced an automated reception and turnaround time (TAT) management system, for the first time in Korea, whereby the patient's information is transmitted directly to the actual phlebotomy site and the TAT for each phlebotomy step can be monitored at a glance. Methods The GNT5 system (Energium Co., Ltd., Korea) was installed in June 2013. The automated reception and TAT management system has been in operation since February 2014. Integration of the automated reception machine with the GNT5 allowed for direct transmission of laboratory order information to the GNT5 without involving any manual reception step. We used the mean TAT from reception to actual phlebotomy as the parameter for evaluating the efficiency of our system. Results Mean TAT decreased from 5:45 min to 2:42 min after operationalization of the system. The mean number of patients in queue decreased from 2.9 to 1.0. Further, the number of cases taking more than five minutes from reception to phlebotomy, defined as the defect rate, decreased from 20.1% to 9.7%. Conclusions The use of automated reception and TAT management system was associated with a decrease of overall TAT and an improved workflow at the phlebotomy room. PMID:26522759
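    The two efficiency measures reported above are straightforward to compute from paired timestamps. A minimal sketch, assuming reception and phlebotomy times are exported as HH:MM:SS strings; the five-minute defect threshold follows the study's definition.

    ```python
    from datetime import datetime

    # Hypothetical (reception, phlebotomy) timestamp pairs for one morning.
    events = [
        ("08:01:10", "08:03:30"),
        ("08:02:45", "08:09:12"),
        ("08:05:00", "08:06:40"),
    ]

    def to_seconds(hhmmss):
        t = datetime.strptime(hhmmss, "%H:%M:%S")
        return t.hour * 3600 + t.minute * 60 + t.second

    # Turnaround time from reception to actual phlebotomy, in seconds
    tats = [to_seconds(done) - to_seconds(received) for received, done in events]
    mean_tat = sum(tats) / len(tats)
    defect_rate = sum(t > 300 for t in tats) / len(tats)  # >5 min, as in the study

    print(f"mean TAT: {mean_tat / 60:.2f} min, defect rate: {defect_rate:.1%}")
    ```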

  8. Can a canopy temperature-based stress index enhance water use efficiency in irrigated wine grape under arid conditions?

    USDA-ARS?s Scientific Manuscript database

    Enhancement of irrigation water use efficiency and water productivity in arid wine grape production regions is hindered by a lack of automated, real-time methods for monitoring and interpreting vine water status. A normalized, water stress index calculated from real-time vine canopy temperature meas...

  9. Techniques for Increasing the Efficiency of Automation Systems in School Library Media Centers.

    ERIC Educational Resources Information Center

    Caffarella, Edward P.

    1996-01-01

    Discusses methods of managing queues (waiting lines) to optimize the use of student computer stations in school library media centers and to make searches more efficient and effective. The three major factors in queue management are arrival interval of the patrons, service time, and number of stations. (Author/LRW)

  10. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shwehdi, M.H.; Khan, A.Z.

    Building automation technology is rapidly developing towards more reliable communication systems and devices that control electronic equipment. Controlling this equipment leads to efficient energy management and savings on the monthly electricity bill. Power line communication (PLC) has been one of the dreams of the electronics industry for decades, especially for building automation. It is the purpose of this paper to demonstrate communication methods among electronic control devices through an AC power line carrier within buildings for more efficient energy control. The paper outlines methods of communication over a power line, namely the X-10 and CEBus. It also introduces spread spectrum technology to increase speed to 100--150 times faster than the X-10 system. The power line carrier has tremendous applications in the field of building automation. The paper presents an attempt to realize a so-called smart house concept, in which all home electronic devices, from a coffee maker or a water heater to household robots, can be operated by an intelligent network whenever one wishes to do so. The designed system may be applied very profitably to help in energy management for both customer and utility.

  11. Decision support system for the detection and grading of hard exudates from color fundus photographs

    NASA Astrophysics Data System (ADS)

    Jaafar, Hussain F.; Nandi, Asoke K.; Al-Nuaimy, Waleed

    2011-11-01

    Diabetic retinopathy is a major cause of blindness, and its earliest signs include damage to the blood vessels and the formation of lesions in the retina. Automated detection and grading of hard exudates from the color fundus image is a critical step in the automated screening system for diabetic retinopathy. We propose novel methods for the detection and grading of hard exudates and the main retinal structures. For exudate detection, a novel approach based on coarse-to-fine strategy and a new image-splitting method are proposed with overall sensitivity of 93.2% and positive predictive value of 83.7% at the pixel level. The average sensitivity of the blood vessel detection is 85%, and the success rate of fovea localization is 100%. For exudate grading, a polar fovea coordinate system is adopted in accordance with medical criteria. Because of its competitive performance and ability to deal efficiently with images of variable quality, the proposed technique offers promising and efficient performance as part of an automated screening system for diabetic retinopathy.
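    The reported pixel-level figures are standard confusion-matrix metrics. A small sketch of how sensitivity and positive predictive value are computed from binary lesion masks (the toy masks are illustrative):

    ```python
    import numpy as np

    def pixel_metrics(pred, truth):
        """Sensitivity and positive predictive value for binary lesion masks."""
        tp = np.logical_and(pred, truth).sum()   # lesion pixels correctly found
        fp = np.logical_and(pred, ~truth).sum()  # false detections
        fn = np.logical_and(~pred, truth).sum()  # missed lesion pixels
        sensitivity = tp / (tp + fn)
        ppv = tp / (tp + fp)
        return sensitivity, ppv

    pred  = np.array([[0, 1, 1], [0, 1, 0]], dtype=bool)
    truth = np.array([[0, 1, 0], [1, 1, 0]], dtype=bool)
    print("sensitivity %.2f, PPV %.2f" % pixel_metrics(pred, truth))
    ```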

  12. Predicting Flows of Rarefied Gases

    NASA Technical Reports Server (NTRS)

    LeBeau, Gerald J.; Wilmoth, Richard G.

    2005-01-01

    DSMC Analysis Code (DAC) is a flexible, highly automated, easy-to-use computer program for predicting flows of rarefied gases -- especially flows of upper-atmospheric, propulsion, and vented gases impinging on spacecraft surfaces. DAC implements the direct simulation Monte Carlo (DSMC) method, which is widely recognized as standard for simulating flows at densities so low that the continuum-based equations of computational fluid dynamics are invalid. DAC enables users to model complex surface shapes and boundary conditions quickly and easily. The discretization of a flow field into computational grids is automated, thereby relieving the user of a traditionally time-consuming task while ensuring (1) appropriate refinement of grids throughout the computational domain, (2) determination of optimal settings for temporal discretization and other simulation parameters, and (3) satisfaction of the fundamental constraints of the method. In so doing, DAC ensures an accurate and efficient simulation. In addition, DAC can utilize parallel processing to reduce computation time. The domain decomposition needed for parallel processing is completely automated, and the software employs a dynamic load-balancing mechanism to ensure optimal parallel efficiency throughout the simulation.

  13. Automated basin delineation from digital terrain data

    NASA Technical Reports Server (NTRS)

    Marks, D.; Dozier, J.; Frew, J.

    1983-01-01

    While digital terrain grids are now in wide use, accurate delineation of drainage basins from these data is difficult to efficiently automate. A recursive order N solution to this problem is presented. The algorithm is fast because no point in the basin is checked more than once, and no points outside the basin are considered. Two applications for terrain analysis and one for remote sensing are given to illustrate the method, on a basin with high relief in the Sierra Nevada. This technique for automated basin delineation will enhance the utility of digital terrain analysis for hydrologic modeling and remote sensing.
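    A rough sketch of the order-N idea: starting from the outlet, only cells that drain into an already-accepted cell are visited, and each cell is checked once. An explicit stack replaces the recursion of the original algorithm; the D8-style flow-direction encoding and the toy grid are assumptions for illustration.

    ```python
    # Each cell stores the (dr, dc) offset of the cell it drains to (D8 style).
    OFFSETS = [(-1, -1), (-1, 0), (-1, 1), (0, -1),
               (0, 1), (1, -1), (1, 0), (1, 1)]

    def delineate_basin(flow_dir, outlet):
        """Return the set of cells draining to `outlet`.

        Each cell is visited at most once and cells outside the basin are
        never examined, which is the source of the order-N behaviour.
        """
        rows, cols = len(flow_dir), len(flow_dir[0])
        basin, stack = {outlet}, [outlet]
        while stack:
            r, c = stack.pop()
            for dr, dc in OFFSETS:                 # look at the 8 neighbours
                nr, nc = r + dr, c + dc
                if 0 <= nr < rows and 0 <= nc < cols and (nr, nc) not in basin:
                    # a neighbour belongs to the basin if it drains into (r, c)
                    if flow_dir[nr][nc] == (-dr, -dc):
                        basin.add((nr, nc))
                        stack.append((nr, nc))
        return basin

    flow_dir = [
        [(1, 0), (1, 0), (1, 0)],
        [(1, 1), (1, 0), (1, -1)],
        [(0, 1), (0, 0), (0, -1)],
    ]
    print(len(delineate_basin(flow_dir, (2, 1))))  # -> 9, every cell drains here
    ```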

  14. Efficient Permeability Measurement and Numerical Simulation of the Resin Flow in Low Permeability Preform Fabricated by Automated Dry Fiber Placement

    NASA Astrophysics Data System (ADS)

    Agogue, Romain; Chebil, Naziha; Deleglise-Lagardere, Mylène; Beauchene, Pierre; Park, Chung Hae

    2017-10-01

    We propose a new experimental method using a Hassler cell and air injection to measure the permeability of fiber preform while avoiding a race tracking effect. This method was proven to be particularly efficient to measure very low through-thickness permeability of preform fabricated by automated dry fiber placement. To validate the reliability of the permeability measurement, the experiments of viscous liquid infusion into the preform with or without a distribution medium were performed. The experimental data of flow front advancement was compared with the numerical simulation result using the permeability values obtained by the Hassler cell permeability measurement set-up as well as by the liquid infusion experiments. To address the computational cost issue, the model for the equivalent permeability of distribution medium was employed in the numerical simulation of liquid flow. The new concept using air injection and Hassler cell for the fiber preform permeability measurement was shown to be reliable and efficient.
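    A sketch of the permeability calculation implied by a steady-state gas injection measurement, using the compressible form of Darcy's law. The sample dimensions, pressures and flow rate are hypothetical, and details such as gas slippage corrections are ignored.

    ```python
    import math

    def gas_permeability(q_out, mu, length, area, p_in, p_out):
        """Through-thickness permeability k [m^2] from steady gas flow.

        Uses the compressible Darcy form
            k = 2 * q_out * mu * L * p_out / (A * (p_in**2 - p_out**2)),
        with the volumetric flow rate q_out measured at the outlet pressure.
        """
        return 2.0 * q_out * mu * length * p_out / (area * (p_in**2 - p_out**2))

    # Hypothetical values: 1 mm thick preform, 50 mm diameter sample.
    k = gas_permeability(
        q_out=2.0e-6,             # m^3/s at outlet
        mu=1.8e-5,                # Pa.s, air at room temperature
        length=1.0e-3,            # m
        area=math.pi * 0.025**2,  # m^2
        p_in=2.0e5,               # Pa
        p_out=1.0e5,              # Pa
    )
    print(f"k = {k:.3e} m^2")
    ```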

  15. Efficient high-throughput biological process characterization: Definitive screening design with the ambr250 bioreactor system.

    PubMed

    Tai, Mitchell; Ly, Amanda; Leung, Inne; Nayar, Gautam

    2015-01-01

    The burgeoning pipeline for new biologic drugs has increased the need for high-throughput process characterization to efficiently use process development resources. Breakthroughs in highly automated and parallelized upstream process development have led to technologies such as the 250-mL automated mini bioreactor (ambr250™) system. Furthermore, developments in modern design of experiments (DoE) have promoted the use of definitive screening design (DSD) as an efficient method to combine factor screening and characterization. Here we utilize the 24-bioreactor ambr250™ system with 10-factor DSD to demonstrate a systematic experimental workflow to efficiently characterize an Escherichia coli (E. coli) fermentation process for recombinant protein production. The generated process model is further validated by laboratory-scale experiments and shows how the strategy is useful for quality by design (QbD) approaches to control strategies for late-stage characterization. © 2015 American Institute of Chemical Engineers.

  16. Video and accelerometer-based motion analysis for automated surgical skills assessment.

    PubMed

    Zia, Aneeq; Sharma, Yachna; Bettadapura, Vinay; Sarin, Eric L; Essa, Irfan

    2018-03-01

    Basic surgical skills of suturing and knot tying are an essential part of medical training. Having an automated system for surgical skills assessment could help save experts time and improve training efficiency. There have been some recent attempts at automated surgical skills assessment using either video analysis or acceleration data. In this paper, we present a novel approach for automated assessment of OSATS-like surgical skills and provide an analysis of different features on multi-modal data (video and accelerometer data). We conduct a large study for basic surgical skill assessment on a dataset that contained video and accelerometer data for suturing and knot-tying tasks. We introduce "entropy-based" features: approximate entropy and cross-approximate entropy, which quantify the amount of predictability and regularity of fluctuations in time series data. The proposed features are compared to existing methods of Sequential Motion Texture, Discrete Cosine Transform and Discrete Fourier Transform, for surgical skills assessment. We report average performance of different features across all applicable OSATS-like criteria for suturing and knot-tying tasks. Our analysis shows that the proposed entropy-based features outperform previous state-of-the-art methods using video data, achieving average classification accuracies of 95.1 and 92.2% for suturing and knot tying, respectively. For accelerometer data, our method performs better for suturing, achieving 86.8% average accuracy. We also show that fusion of video and acceleration features can improve overall performance for skill assessment. Automated surgical skills assessment can be achieved with high accuracy using the proposed entropy features. Such a system can significantly improve the efficiency of surgical training in medical schools and teaching hospitals.
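    Approximate entropy has a compact standard definition (Pincus's formulation), sketched below for a 1-D motion signal; the template length m = 2 and tolerance r = 0.2 x SD are common parameter choices, not necessarily those used in the paper.

    ```python
    import numpy as np

    def approximate_entropy(x, m=2, r=None):
        """Approximate entropy of a 1-D series (Pincus, 1991).

        Lower values indicate more regular, predictable motion, which is the
        property the entropy-based skill features exploit.
        """
        x = np.asarray(x, dtype=float)
        n = len(x)
        if r is None:
            r = 0.2 * x.std()  # common tolerance choice

        def phi(m):
            # all overlapping length-m templates
            templates = np.array([x[i:i + m] for i in range(n - m + 1)])
            # Chebyshev distance between every pair of templates
            dist = np.max(np.abs(templates[:, None, :] - templates[None, :, :]), axis=2)
            c = (dist <= r).mean(axis=1)  # fraction of templates within tolerance
            return np.log(c).mean()

        return phi(m) - phi(m + 1)

    rng = np.random.default_rng(0)
    regular = np.sin(np.linspace(0, 20 * np.pi, 500))
    noisy = rng.standard_normal(500)
    print(f"ApEn(sine)  = {approximate_entropy(regular):.3f}")
    print(f"ApEn(noise) = {approximate_entropy(noisy):.3f}")
    ```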

  17. Efficient quantification of water content in edible oils by headspace gas chromatography with vapour phase calibration.

    PubMed

    Xie, Wei-Qi; Gong, Yi-Xian; Yu, Kong-Xian

    2018-06-01

    An automated and accurate headspace gas chromatographic (HS-GC) technique was investigated for rapidly quantifying water content in edible oils. In this method, multiple headspace extraction (MHE) procedures were used to analyse the integrated water content from the edible oil sample. A simple vapour phase calibration technique with an external vapour standard was used to calibrate both the water content in the gas phase and the total weight of water in edible oil sample. After that the water in edible oils can be quantified. The data showed that the relative standard deviation of the present HS-GC method in the precision test was less than 1.13%, the relative differences between the new method and a reference method (i.e. the oven-drying method) were no more than 1.62%. The present HS-GC method is automated, accurate, efficient, and can be a reliable tool for quantifying water content in edible oil related products and research. © 2017 Society of Chemical Industry.
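    In MHE, consecutive extractions of the same vial yield peak areas that decay geometrically, so the exhaustive (total) peak area can be recovered from a log-linear fit and the sum of the geometric series. A minimal sketch with hypothetical peak areas:

    ```python
    import numpy as np

    # Hypothetical peak areas from consecutive headspace extractions of one vial.
    areas = np.array([10400.0, 7280.0, 5100.0, 3570.0])

    # In MHE the areas decay geometrically, so ln(A_i) is linear in i.
    i = np.arange(1, len(areas) + 1)
    slope, intercept = np.polyfit(i, np.log(areas), 1)
    q = np.exp(slope)  # common ratio, 0 < q < 1

    # Sum of the geometric series gives the total (exhaustive) peak area,
    # i.e. the signal corresponding to all the water in the sample.
    total_area = areas[0] / (1.0 - q)
    print(f"ratio q = {q:.3f}, total area = {total_area:.0f}")
    ```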

  18. Cancer Detection Using Neural Computing Methodology

    NASA Technical Reports Server (NTRS)

    Toomarian, Nikzad; Kohen, Hamid S.; Bearman, Gregory H.; Seligson, David B.

    2001-01-01

    This paper describes a novel learning methodology used to analyze bio-materials. The premise of this research is to help pathologists quickly identify anomalous cells in a cost-efficient manner. Skilled pathologists must methodically, efficiently and carefully analyze histopathologic materials manually for the presence, amount and degree of malignancy and/or other disease states. The prolonged attention required to accomplish this task induces fatigue that may result in a higher rate of diagnostic errors. In addition, automated image analysis systems to date lack a sufficiently intelligent means of identifying even the most general regions of interest in tissue-based studies, and this shortfall greatly limits their utility. An intelligent data understanding system that could quickly and accurately identify diseased tissues and/or choose regions of interest would be expected to increase the accuracy of diagnosis and usher in truly automated tissue-based image analysis.

  19. Does the use of automated fetal biometry improve clinical work flow efficiency?

    PubMed

    Espinoza, Jimmy; Good, Sara; Russell, Evie; Lee, Wesley

    2013-05-01

    This study was designed to compare the work flow efficiency of manual measurements of 5 fetal parameters with a novel technique that automatically measures these parameters from 2-dimensional sonograms. This prospective study included 200 singleton pregnancies between 15 and 40 weeks' gestation. Patients were randomly allocated to either manual (n = 100) or automatic (n = 100) fetal biometry. The automatic measurement was performed using a commercially available software application. A digital video recorder captured all on-screen activity associated with the sonographic examination. The examination time and number of steps required to obtain fetal measurements were compared between manual and automatic methods. The mean time required to obtain the biometric measurements was significantly shorter using the automated technique than the manual approach (P < .001 for all comparisons). Similarly, the mean number of steps required to perform these measurements was significantly fewer with automatic measurements compared to the manual technique (P < .001). In summary, automated biometry reduced the examination time required for standard fetal measurements. This approach may improve work flow efficiency in busy obstetric sonography practices.

  20. SU-G-TeP1-05: Development and Clinical Introduction of Automated Radiotherapy Treatment Planning for Prostate Cancer

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Winkel, D; Bol, GH; Asselen, B van

    Purpose: To develop an automated radiotherapy treatment planning and optimization workflow for prostate cancer in order to generate clinical treatment plans. Methods: A fully automated radiotherapy treatment planning and optimization workflow was developed based on the treatment planning system Monaco (Elekta AB, Stockholm, Sweden). To evaluate our method, a retrospective planning study (n=100) was performed on patients treated for prostate cancer with 5 field intensity modulated radiotherapy, receiving a dose of 35×2Gy to the prostate and vesicles and a simultaneous integrated boost of 35×0.2Gy to the prostate only. A comparison was made between the dosimetric values of the automatically and manually generated plans. Operator time to generate a plan and plan efficiency was measured. Results: A comparison of the dosimetric values shows that automatically generated plans yield more beneficial dosimetric values. In automatic plans reductions of 43% in the V72Gy of the rectum and 13% in the V72Gy of the bladder are observed when compared to the manually generated plans. Smaller variance in dosimetric values is seen, i.e. the intra- and interplanner variability is decreased. For 97% of the automatically generated plans and 86% of the clinical plans all criteria for target coverage and organs at risk constraints are met. The amount of plan segments and monitor units is reduced by 13% and 9% respectively. Automated planning requires less than one minute of operator time compared to over an hour for manual planning. Conclusion: The automatically generated plans are highly suitable for clinical use. The plans have less variance and a large gain in time efficiency has been achieved. Currently, a pilot study is performed, comparing the preference of the clinician and clinical physicist for the automatic versus manual plan. Future work will include expanding our automated treatment planning method to other tumor sites and develop other automated radiotherapy workflows.

  1. A longitudinal evaluation of performance of automated BCR-ABL1 quantitation using cartridge-based detection system.

    PubMed

    Enjeti, Anoop; Granter, Neil; Ashraf, Asma; Fletcher, Linda; Branford, Susan; Rowlings, Philip; Dooley, Susan

    2015-10-01

    An automated cartridge-based detection system (GeneXpert; Cepheid) is being widely adopted in low throughput laboratories for monitoring BCR-ABL1 transcript in chronic myelogenous leukaemia. This Australian study evaluated the longitudinal performance-specific characteristics of the automated system. The automated cartridge-based system was compared prospectively with the manual qRT-PCR-based reference method at SA Pathology, Adelaide, over a period of 2.5 years. A conversion factor determination was followed by four re-validations. Peripheral blood samples (n = 129) with international scale (IS) values within detectable range were selected for assessment. The mean bias, proportion of results within specified fold difference (2-, 3- and 5-fold), the concordance rate of major molecular remission (MMR) and concordance across a range of IS values on paired samples were evaluated. The initial conversion factor for the automated system was determined as 0.43. Except for the second re-validation, where a negative bias of 1.9-fold was detected, all other biases fell within desirable limits. A cartridge-specific conversion factor and efficiency value was introduced and the conversion factor was confirmed to be stable in subsequent re-validation cycles. Concordance with the reference method/laboratory at >0.1-≤10 IS was 78.2% and at ≤0.001 was 80%, compared to 86.8% in the >0.01-≤0.1 IS range. The overall and MMR concordance were 85.7% and 94% respectively, for samples that fell within ± 5-fold of the reference laboratory value over the entire period of study. Conversion factor and performance-specific characteristics for the automated system were longitudinally stable in the clinically relevant range, following introduction by the manufacturer of lot specific efficiency values.
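    A sketch of the two calculations at the core of such a comparison: converting a raw ratio to the International Scale with a laboratory conversion factor, and expressing method agreement as a geometric-mean fold difference. The paired values below are hypothetical.

    ```python
    import math

    CONVERSION_FACTOR = 0.43  # value determined for the automated system above

    def to_international_scale(ratio_percent):
        """Convert a raw BCR-ABL1/control ratio (%) to the International Scale."""
        return ratio_percent * CONVERSION_FACTOR

    def mean_fold_bias(test, reference):
        """Geometric-mean fold difference between paired IS results.

        Values > 1 indicate the test method reads higher than the reference.
        """
        logs = [math.log(t / r) for t, r in zip(test, reference)]
        return math.exp(sum(logs) / len(logs))

    # Hypothetical paired IS values (automated vs reference method)
    auto = [0.12, 1.40, 0.011, 9.5]
    ref  = [0.10, 1.20, 0.010, 8.8]
    print(f"mean bias: {mean_fold_bias(auto, ref):.2f}-fold")
    ```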

  2. Automated classification of articular cartilage surfaces based on surface texture.

    PubMed

    Stachowiak, G P; Stachowiak, G W; Podsiadlo, P

    2006-11-01

    In this study the automated classification system previously developed by the authors was used to classify articular cartilage surfaces with different degrees of wear. This automated system classifies surfaces based on their texture. Plug samples of sheep cartilage (pins) were run on stainless steel discs under various conditions using a pin-on-disc tribometer. Testing conditions were specifically designed to produce different severities of cartilage damage due to wear. Environmental scanning electron microscope (ESEM) images of cartilage surfaces, which formed a database for pattern recognition analysis, were acquired. The ESEM images of cartilage were divided into five groups (classes), each class representing different wear conditions or wear severity. Each class was first examined and assessed visually. Next, the automated classification system (pattern recognition) was applied to all classes. The results of the automated surface texture classification were compared to those based on visual assessment of surface morphology. It was shown that the texture-based automated classification system was an efficient and accurate method of distinguishing between various cartilage surfaces generated under different wear conditions. It appears that the texture-based classification method has potential to become a useful tool in medical diagnostics.

  3. The Location of Sources of Human Computer Processed Cerebral Potentials for the Automated Assessment of Visual Field Impairment

    PubMed Central

    Leisman, Gerald; Ashkenazi, Maureen

    1979-01-01

    Objective psychophysical techniques for investigating visual fields are described. The paper concerns methods for the collection and analysis of evoked potentials using a small laboratory computer and provides efficient methods for obtaining information about the conduction pathways of the visual system.

  4. Improving the detection efficiency in nuclear emulsion trackers

    NASA Astrophysics Data System (ADS)

    Alexandrov, A.; Bozza, C.; Buonaura, A.; Consiglio, L.; D'Ambrosio, N.; Lellis, G. De; De Serio, M.; Di Capua, F.; Di Crescenzo, A.; Di Ferdinando, D.; Di Marco, N.; Fini, R. A.; Galati, G.; Giacomelli, G.; Grella, G.; Hosseini, B.; Kose, U.; Lauria, A.; Longhin, A.; Mandrioli, G.; Mauri, N.; Medinaceli, E.; Montesi, M. C.; Paoloni, A.; Pastore, A.; Patrizii, L.; Pozzato, M.; Pupilli, F.; Rescigno, R.; Roda, M.; Rosa, G.; Schembri, A.; Shchedrina, T.; Simone, S.; Sioli, M.; Sirignano, C.; Sirri, G.; Spinetti, M.; Stellacci, S. M.; Tenti, M.; Tioukov, V.

    2015-03-01

    Nuclear emulsion films are a tracking device with unique space resolution. Their use in today's large-scale experiments relies on the availability of automated microscopes operating at very high speed. In this paper we describe the features and the latest improvements of the European Scanning System, a last-generation automated microscope for emulsion scanning. In particular, we present a new method for the recovery of tracking inefficiencies. Stacks of double coated emulsion films have been exposed to a 10 GeV/c pion beam. Efficiencies as high as 98% have been achieved for minimum ionising particle tracks perpendicular to the emulsion films, and 93% for tracks with tan(θ) ≃ 0.8.

  5. Research on automated disassembly technology for waste LCD

    NASA Astrophysics Data System (ADS)

    Qin, Qin; Zhu, Dongdong; Wang, Jingwei; Dou, Jianfang; Wang, Sujuan; Tu, Zimei

    2017-11-01

    In the field of waste LCD disassembly and recycling, two major problems exist: 1) the disassembly of waste LCDs mainly depends on manual mechanical crushing; and 2) the level of resource recovery is not high. To address these problems, in this paper we develop an efficient, safe and automated disassembly line technology for waste LCDs. This technology can disassemble and sort mainstream LCDs into four components: liquid crystal display panels, housings and metal shields, and PCB assemblies. It can also handle many kinds of waste LCDs. Compared with the traditional method of manual labor assisted by electric tools, the proposed technology significantly improves disassembly efficiency and demonstrates good prospects and promotional value.

  6. Development of Sub-Ischial Prosthetic Sockets with Vacuum-Assisted Suspension for Highly Active Persons with Transfemoral Amputations

    DTIC Science & Technology

    2011-10-01

    International Conference on Robotics and Automation, Pasadena CA, USA, May 19-23, 2008, p 3672-3677. APPENDICES A Socket Breakdown for Scanning... the LimbLogic is the more efficient of the two pumps. These tests also showed that the performance for both pumps was self-consistent over the... Donelan, J. M. Biomechanical Energy Harvesting: Apparatus and Method. IEEE International Conference on Robotics and Automation, May 19-23, 2008. Lyon

  7. Automated MRI parcellation of the frontal lobe

    PubMed Central

    Ranta, Marin E.; Chen, Min; Crocetti, Deana; Prince, Jerry L.; Subramaniam, Krish; Fischl, Bruce; Kaufmann, Walter E.; Mostofsky, Stewart H.

    2014-01-01

    Examination of associations between specific disorders and physical properties of functionally relevant frontal lobe sub-regions is a fundamental goal in neuropsychiatry. Here we present and evaluate automated methods of frontal lobe parcellation with the programs FreeSurfer (FS) and TOADS-CRUISE (T-C), based on the manual method described in Ranta et al. (2009) in which sulcal-gyral landmarks were used to manually delimit functionally relevant regions within the frontal lobe: i.e., primary motor cortex, anterior cingulate, deep white matter, premotor cortex regions (supplementary motor complex, frontal eye field and lateral premotor cortex) and prefrontal cortex (PFC) regions (medial PFC, dorsolateral PFC, inferior PFC, lateral orbitofrontal cortex (OFC) and medial OFC). Dice's coefficient, a measure of overlap, and percent volume difference were used to measure the reliability between manual and automated delineations for each frontal lobe region. For FS, mean Dice's coefficient for all regions was 0.75 and percent volume difference was 21.2%. For T-C the mean Dice's coefficient was 0.77 and the mean percent volume difference for all regions was 20.2%. These results, along with a high degree of agreement between the two automated methods (mean Dice's coefficient = 0.81, percent volume difference = 12.4%) and a proof-of-principle group difference analysis that highlights the consistency and sensitivity of the automated methods, indicate that the automated methods are valid techniques for parcellation of the frontal lobe into functionally relevant sub-regions. Thus, the methodology has the potential to increase efficiency, statistical power and reproducibility for population analyses of neuropsychiatric disorders with hypothesized frontal lobe contributions. PMID:23897577
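    Both agreement measures used above are simple to compute from a pair of binary label volumes. A minimal sketch (2-D toy masks stand in for 3-D parcellations):

    ```python
    import numpy as np

    def dice_coefficient(a, b):
        """Overlap between two binary masks: 2|A∩B| / (|A| + |B|)."""
        a, b = a.astype(bool), b.astype(bool)
        return 2.0 * np.logical_and(a, b).sum() / (a.sum() + b.sum())

    def percent_volume_difference(a, b):
        """Absolute volume difference relative to the manual volume |A|."""
        va, vb = a.astype(bool).sum(), b.astype(bool).sum()
        return 100.0 * abs(va - int(vb)) / va

    manual = np.zeros((10, 10), dtype=bool); manual[2:7, 2:7] = True
    auto   = np.zeros((10, 10), dtype=bool); auto[3:8, 2:8]   = True
    print(f"Dice = {dice_coefficient(manual, auto):.2f}, "
          f"dV = {percent_volume_difference(manual, auto):.1f}%")
    ```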

  8. Automated structure determination of proteins with the SAIL-FLYA NMR method.

    PubMed

    Takeda, Mitsuhiro; Ikeya, Teppei; Güntert, Peter; Kainosho, Masatsune

    2007-01-01

    The labeling of proteins with stable isotopes enhances the NMR method for the determination of 3D protein structures in solution. Stereo-array isotope labeling (SAIL) provides an optimal stereospecific and regiospecific pattern of stable isotopes that yields sharpened lines, spectral simplification without loss of information, and the ability to collect rapidly and evaluate fully automatically the structural restraints required to solve a high-quality solution structure for proteins up to twice as large as those that can be analyzed using conventional methods. Here, we describe a protocol for the preparation of SAIL proteins by cell-free methods, including the preparation of S30 extract and their automated structure analysis using the FLYA algorithm and the program CYANA. Once efficient cell-free expression of the unlabeled or uniformly labeled target protein has been achieved, the NMR sample preparation of a SAIL protein can be accomplished in 3 d. A fully automated FLYA structure calculation can be completed in 1 d on a powerful computer system.

  9. High‐throughput automated scoring of Ki67 in breast cancer tissue microarrays from the Breast Cancer Association Consortium

    PubMed Central

    Howat, William J; Daley, Frances; Zabaglo, Lila; McDuffus, Leigh‐Anne; Blows, Fiona; Coulson, Penny; Raza Ali, H; Benitez, Javier; Milne, Roger; Brenner, Herman; Stegmaier, Christa; Mannermaa, Arto; Chang‐Claude, Jenny; Rudolph, Anja; Sinn, Peter; Couch, Fergus J; Tollenaar, Rob A.E.M.; Devilee, Peter; Figueroa, Jonine; Sherman, Mark E; Lissowska, Jolanta; Hewitt, Stephen; Eccles, Diana; Hooning, Maartje J; Hollestelle, Antoinette; WM Martens, John; HM van Deurzen, Carolien; Investigators, kConFab; Bolla, Manjeet K; Wang, Qin; Jones, Michael; Schoemaker, Minouk; Broeks, Annegien; van Leeuwen, Flora E; Van't Veer, Laura; Swerdlow, Anthony J; Orr, Nick; Dowsett, Mitch; Easton, Douglas; Schmidt, Marjanka K; Pharoah, Paul D; Garcia‐Closas, Montserrat

    2016-01-01

    Automated methods are needed to facilitate high-throughput and reproducible scoring of Ki67 and other markers in breast cancer tissue microarrays (TMAs) in large-scale studies. To address this need, we developed an automated protocol for Ki67 scoring and evaluated its performance in studies from the Breast Cancer Association Consortium. We utilized 166 TMAs containing 16,953 tumour cores representing 9,059 breast cancer cases, from 13 studies, with information on other clinical and pathological characteristics. TMAs were stained for Ki67 using standard immunohistochemical procedures, and scanned and digitized using the Ariol system. An automated algorithm was developed for the scoring of Ki67, and scores were compared to computer assisted visual (CAV) scores in a subset of 15 TMAs in a training set. We also assessed the correlation between automated Ki67 scores and other clinical and pathological characteristics. Overall, we observed good discriminatory accuracy (AUC = 85%) and good agreement (kappa = 0.64) between the automated and CAV scoring methods in the training set. The performance of the automated method varied by TMA (kappa range = 0.37–0.87) and study (kappa range = 0.39–0.69). The automated method performed better in satisfactory cores (kappa = 0.68) than suboptimal (kappa = 0.51) cores (p-value for comparison = 0.005); and among cores with higher total nuclei counted by the machine (4,000–4,500 cells: kappa = 0.78) than those with lower counts (50–500 cells: kappa = 0.41; p-value = 0.010). Among the 9,059 cases in this study, the correlations between automated Ki67 and clinical and pathological characteristics were found to be in the expected directions. Our findings indicate that automated scoring of Ki67 can be an efficient method to obtain good quality data across large numbers of TMAs from multicentre studies. However, robust algorithm development and rigorous pre- and post-analytical quality control procedures are necessary in order to ensure satisfactory performance. PMID:27499923
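    Cohen's kappa, the agreement statistic quoted throughout the abstract, corrects the observed agreement for the agreement expected by chance. A small sketch with hypothetical Ki67 status calls:

    ```python
    from collections import Counter

    def cohens_kappa(rater_a, rater_b):
        """Agreement between two categorical scorings, corrected for chance."""
        n = len(rater_a)
        observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
        pa, pb = Counter(rater_a), Counter(rater_b)
        expected = sum(pa[c] * pb[c] for c in pa) / (n * n)
        return (observed - expected) / (1.0 - expected)

    # Hypothetical Ki67 status calls (1 = high, 0 = low) on the same cores
    automated = [1, 1, 0, 0, 1, 0, 1, 1, 0, 0]
    visual    = [1, 0, 0, 0, 1, 0, 1, 1, 1, 0]
    print(f"kappa = {cohens_kappa(automated, visual):.2f}")
    ```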

  10. Silicon solar cells by ion implantation and pulsed energy processing

    NASA Technical Reports Server (NTRS)

    Kirkpatrick, A. R.; Minnucci, J. A.; Shaughnessy, T. S.; Greenwald, A. C.

    1976-01-01

    A new method for fabrication of silicon solar cells is being developed around ion implantation in conjunction with pulsed electron beam techniques to replace conventional furnace processing. Solar cells can be fabricated totally in a vacuum environment at room temperature. Cells with 10% AM0 efficiency have been demonstrated. High efficiency cells and effective automated processing capabilities are anticipated.

  11. Establishment of integrated protocols for automated high throughput kinetic chlorophyll fluorescence analyses.

    PubMed

    Tschiersch, Henning; Junker, Astrid; Meyer, Rhonda C; Altmann, Thomas

    2017-01-01

    Automated plant phenotyping has been established as a powerful new tool in studying plant growth, development and response to various types of biotic or abiotic stressors. Respective facilities mainly apply non-invasive imaging based methods, which enable the continuous quantification of the dynamics of plant growth and physiology during developmental progression. However, especially for plants of larger size, integrative, automated and high throughput measurements of complex physiological parameters such as photosystem II efficiency determined through kinetic chlorophyll fluorescence analysis remain a challenge. We present the technical installations and the establishment of experimental procedures that allow the integrated high throughput imaging of all commonly determined PSII parameters for small and large plants using kinetic chlorophyll fluorescence imaging systems (FluorCam, PSI) integrated into automated phenotyping facilities (Scanalyzer, LemnaTec). Besides determination of the maximum PSII efficiency, we focused on implementation of high throughput amenable protocols recording PSII operating efficiency (ΦPSII). Using the presented setup, this parameter is shown to be reproducibly measured in differently sized plants despite the corresponding variation in distance between plants and light source that caused small differences in incident light intensity. Values of ΦPSII obtained with the automated chlorophyll fluorescence imaging setup correlated very well with conventionally determined data using a spot-measuring chlorophyll fluorometer. The established high throughput operating protocols enable the screening of up to 1080 small and 184 large plants per hour, respectively. The application of the implemented high throughput protocols is demonstrated in screening experiments performed with large Arabidopsis and maize populations assessing natural variation in PSII efficiency. The incorporation of imaging systems suitable for kinetic chlorophyll fluorescence analysis leads to a substantial extension of the feature spectrum to be assessed in the presented high throughput automated plant phenotyping platforms, thus enabling the simultaneous assessment of plant architectural and biomass-related traits and their relations to physiological features such as PSII operating efficiency. The implemented high throughput protocols are applicable to a broad spectrum of model and crop plants of different sizes (up to 1.80 m height) and architectures. The deeper understanding of the relation of plant architecture, biomass formation and photosynthetic efficiency has a great potential with respect to crop and yield improvement strategies.
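    PSII operating efficiency is computed per pixel from two fluorescence images using the Genty relation (Fm' - F)/Fm'. The sketch below illustrates the arithmetic on toy arrays; background masking in a real pipeline is more involved.

    ```python
    import numpy as np

    def phi_psii(f_steady, fm_prime):
        """PSII operating efficiency per pixel: (Fm' - F) / Fm'.

        F is steady-state fluorescence in the light; Fm' is the maximal
        fluorescence during a saturating pulse.
        """
        with np.errstate(divide="ignore", invalid="ignore"):
            phi = (fm_prime - f_steady) / fm_prime
        # background pixels (Fm' = 0) become NaN; map them to 0
        return np.clip(np.nan_to_num(phi), 0.0, 1.0)

    # Hypothetical 2x2 pixel fluorescence images
    f  = np.array([[120.0, 150.0], [90.0, 0.0]])
    fm = np.array([[400.0, 420.0], [380.0, 0.0]])  # zero = background pixel
    print(phi_psii(f, fm))
    ```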

  12. Will the Measurement Robots Take Our Jobs? An Update on the State of Automated M&V for Energy Efficiency Programs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Granderson, Jessica; Touzani, Samir; Taylor, Cody

    Trustworthy savings calculations are critical to convincing regulators of both the cost-effectiveness of energy efficiency program investments and their ability to defer supply-side capital investments. Today’s methods for measurement and verification (M&V) of energy savings constitute a significant portion of the total costs of energy efficiency programs. They also require time-consuming data acquisition. A spectrum of savings calculation approaches is used, with some relying more heavily on measured data and others relying more heavily on estimated, modeled, or stipulated data. The rising availability of “smart” meters and devices that report near-real time data, combined with new analytical approaches to quantifying savings, offers potential to conduct M&V more quickly and at lower cost, with comparable or improved accuracy. Commercial energy management and information systems (EMIS) technologies are beginning to offer M&V capabilities, and program administrators want to understand how they might assist programs in quickly and accurately measuring energy savings. This paper presents the results of recent testing of the ability to use automation to streamline some parts of M&V. In this paper, we detail metrics to assess the performance of these new M&V approaches, and a framework to compute the metrics. We also discuss the accuracy, cost, and time trade-offs between more traditional M&V, and these emerging streamlined methods that use high-resolution energy data and automated computational intelligence. Finally, we discuss the potential evolution of M&V and early results of pilots currently underway to incorporate M&V automation into ratepayer-funded programs and professional implementation and evaluation practice.
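    Two accuracy metrics widely used to judge baseline models in automated M&V (for example in ASHRAE Guideline 14 style evaluations) are CV(RMSE) and NMBE; the paper's own metric framework may differ in detail. A minimal sketch with hypothetical daily energy data, omitting degrees-of-freedom corrections:

    ```python
    import numpy as np

    def cv_rmse(measured, predicted):
        """Coefficient of variation of the RMSE, in percent."""
        rmse = np.sqrt(np.mean((measured - predicted) ** 2))
        return 100.0 * rmse / measured.mean()

    def nmbe(measured, predicted):
        """Normalized mean bias error, in percent."""
        return 100.0 * np.sum(measured - predicted) / (len(measured) * measured.mean())

    # Hypothetical daily-energy baseline model check (kWh)
    measured  = np.array([410.0, 395.0, 430.0, 388.0, 402.0])
    predicted = np.array([400.0, 401.0, 425.0, 392.0, 399.0])
    print(f"CV(RMSE) = {cv_rmse(measured, predicted):.1f}%, "
          f"NMBE = {nmbe(measured, predicted):.1f}%")
    ```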

  13. Automated Development of Accurate Algorithms and Efficient Codes for Computational Aeroacoustics

    NASA Technical Reports Server (NTRS)

    Goodrich, John W.; Dyson, Rodger W.

    1999-01-01

    The simulation of sound generation and propagation in three space dimensions with realistic aircraft components is a very large time dependent computation with fine details. Simulations in open domains with embedded objects require accurate and robust algorithms for propagation, for artificial inflow and outflow boundaries, and for the definition of geometrically complex objects. The development, implementation, and validation of methods for solving these demanding problems are being done to support the NASA pillar goals for reducing aircraft noise levels. Our goal is to provide algorithms which are sufficiently accurate and efficient to produce usable results rapidly enough to allow design engineers to study the effects on sound levels of design changes in propulsion systems, and in the integration of propulsion systems with airframes. There is a lack of design tools for these purposes at this time. Our technical approach to this problem combines the development of new algorithms with the use of Mathematica and Unix utilities to automate the algorithm development, code implementation, and validation. We use explicit methods to ensure effective implementation by domain decomposition for SPMD parallel computing. There are several orders of magnitude difference in the computational efficiencies of the algorithms which we have considered. We currently have new artificial inflow and outflow boundary conditions that are stable, accurate, and unobtrusive, with implementations that match the accuracy and efficiency of the propagation methods. The artificial numerical boundary treatments have been proven to have solutions which converge to the full open domain problems, so that the error from the boundary treatments can be driven as low as is required. The purpose of this paper is to briefly present a method for developing highly accurate algorithms for computational aeroacoustics, the use of computer automation in this process, and a brief survey of the algorithms that have resulted from this work. A review of computational aeroacoustics has recently been given by Lele.

  14. Recent Developments and Applications of the MMPBSA Method

    PubMed Central

    Wang, Changhao; Greene, D'Artagnan; Xiao, Li; Qi, Ruxi; Luo, Ray

    2018-01-01

    The Molecular Mechanics Poisson-Boltzmann Surface Area (MMPBSA) approach has been widely applied as an efficient and reliable free energy simulation method to model molecular recognition, such as for protein-ligand binding interactions. In this review, we focus on recent developments and applications of the MMPBSA method. The methodology review covers solvation terms, the entropy term, extensions to membrane proteins and high-speed screening, and new automation toolkits. Recent applications in various important biomedical and chemical fields are also reviewed. We conclude with a few future directions aimed at making MMPBSA a more robust and efficient method. PMID:29367919
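
    The core MMPBSA estimate averages end-state energies over snapshots of a molecular dynamics trajectory. The hedged sketch below shows only that bookkeeping step, assuming the per-frame terms (in kcal/mol) have already been extracted by an upstream tool; the dictionary keys and function name are illustrative, not any package's API.

        import numpy as np

        def mmpbsa_dG(cpx, rec, lig, T_dS=0.0):
            """dG_bind = <G_cpx> - <G_rec> - <G_lig> - T*dS,
            with G = E_MM + G_PB + G_SA per frame."""
            def mean_G(terms):
                return float(np.mean(terms['E_MM'] + terms['G_PB'] + terms['G_SA']))
            return mean_G(cpx) - mean_G(rec) - mean_G(lig) - T_dS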

  15. Improved compliance by BPM-driven workflow automation.

    PubMed

    Holzmüller-Laue, Silke; Göde, Bernd; Fleischer, Heidi; Thurow, Kerstin

    2014-12-01

    Using methods and technologies of business process management (BPM) for laboratory automation has important benefits (i.e., the agility of high-level automation processes, rapid interdisciplinary prototyping and implementation of laboratory tasks and procedures, and efficient real-time process documentation). A principal goal of model-driven development is the improved transparency of processes and the alignment of process diagrams and technical code. First experiences of using the business process model and notation (BPMN) show that easy-to-read graphical process models can achieve and provide standardization of laboratory workflows. Model-based development allows processes to be changed quickly and adapted easily to changing requirements. The process models are able to host work procedures and their scheduling in compliance with predefined guidelines and policies. Finally, the process-controlled documentation of complex workflow results addresses modern laboratory needs of quality assurance. As an automation language that can control every kind of activity or subprocess, BPMN 2.0 is directed at complete, end-to-end workflows. BPMN is applicable as a system-independent and cross-disciplinary graphical language to document all methods in laboratories (i.e., screening procedures or analytical processes). With the BPM standard, laboratories thus also gain a common method for sharing process knowledge. © 2014 Society for Laboratory Automation and Screening.

  16. BoB, a best-of-breed automated text de-identification system for VHA clinical documents.

    PubMed

    Ferrández, Oscar; South, Brett R; Shen, Shuying; Friedlin, F Jeffrey; Samore, Matthew H; Meystre, Stéphane M

    2013-01-01

    De-identification allows faster and more collaborative clinical research while protecting patient confidentiality. Clinical narrative de-identification is a tedious process that can be alleviated by automated natural language processing methods. The goal of this research is the development of an automated text de-identification system for Veterans Health Administration (VHA) clinical documents. We devised a novel stepwise hybrid approach designed to improve the current strategies used for text de-identification. The proposed system is based on a previous study on the best de-identification methods for VHA documents. This best-of-breed automated clinical text de-identification system (aka BoB) tackles the problem as two separate tasks: (1) maximize patient confidentiality by redacting as much protected health information (PHI) as possible; and (2) leave de-identified documents in a usable state preserving as much clinical information as possible. We evaluated BoB with a manually annotated corpus of a variety of VHA clinical notes, as well as with the 2006 i2b2 de-identification challenge corpus. We present evaluations at the instance- and token-level, with detailed results for BoB's main components. Moreover, an existing text de-identification system was also included in our evaluation. BoB's design efficiently takes advantage of the methods implemented in its pipeline, resulting in high sensitivity values (especially for sensitive PHI categories) and a limited number of false positives. Our system successfully addressed VHA clinical document de-identification, and its hybrid stepwise design demonstrates robustness and efficiency, prioritizing patient confidentiality while leaving most clinical information intact.
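
    As a hedged toy version of one rule-based component of such a hybrid pipeline (the actual BoB system combines many rule-based and machine-learned classifiers), the sketch below redacts a few pattern-like PHI types with regular expressions; the patterns are illustrative only.

        import re

        PHI_PATTERNS = {
            'PHONE': re.compile(r'\b\d{3}[-.\s]\d{3}[-.\s]\d{4}\b'),
            'SSN':   re.compile(r'\b\d{3}-\d{2}-\d{4}\b'),
            'DATE':  re.compile(r'\b\d{1,2}/\d{1,2}/\d{2,4}\b'),
        }

        def redact(text):
            """Replace each PHI match with its category label."""
            for label, pattern in PHI_PATTERNS.items():
                text = pattern.sub(f'[{label}]', text)
            return text

        print(redact('Seen 03/14/2012; call 555-123-4567.'))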

  17. Toward reliable and repeatable automated STEM-EDS metrology with high throughput

    NASA Astrophysics Data System (ADS)

    Zhong, Zhenxin; Donald, Jason; Dutrow, Gavin; Roller, Justin; Ugurlu, Ozan; Verheijen, Martin; Bidiuk, Oleksii

    2018-03-01

    New materials and designs in complex 3D architectures in logic and memory devices have raised the complexity of S/TEM metrology. In this paper, we report on a newly developed, automated, scanning transmission electron microscopy (STEM) based, energy dispersive X-ray spectroscopy (STEM-EDS) metrology method that addresses these challenges. Different methodologies toward repeatable and efficient, automated STEM-EDS metrology with high throughput are presented: we introduce the best known auto-EDS acquisition and quantification methods for robust and reliable metrology and present how electron exposure dose impacts the EDS metrology reproducibility, either due to poor signal-to-noise ratio (SNR) at low dose or due to sample modifications at high dose conditions. Finally, we discuss the limitations of the STEM-EDS metrology technique and propose strategies to optimize the process both in terms of throughput and metrology reliability.

  18. Automated protein NMR structure determination using wavelet de-noised NOESY spectra.

    PubMed

    Dancea, Felician; Günther, Ulrich

    2005-11-01

    A major time-consuming step of protein NMR structure determination is the generation of reliable NOESY cross peak lists, which usually requires a significant amount of manual interaction. Here we present a new algorithm for automated peak picking involving wavelet de-noised NOESY spectra in a process where the identification of peaks is coupled to automated structure determination. The core of this method is the generation of incremental peak lists by applying different wavelet de-noising procedures which yield peak lists with differing noise content. In combination with additional filters which probe the consistency of the peak lists, good convergence of the NOESY-based automated structure determination could be achieved. These algorithms were implemented in the context of the ARIA software for automated NOE assignment and structure determination and were validated for a polysulfide-sulfur transferase protein of known structure. The procedures presented here should be commonly applicable for efficient protein NMR structure determination and automated NMR peak picking.
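
    As a hedged sketch of wavelet de-noising applied to a 2D spectrum (using PyWavelets; the wavelet family, decomposition level, and threshold rule are assumptions, not the procedures of the paper):

        import numpy as np
        import pywt

        def denoise2d(spectrum, wavelet='db4', level=3, k=3.0):
            coeffs = pywt.wavedec2(spectrum, wavelet, level=level)
            # Estimate noise sigma from the finest diagonal detail band.
            sigma = np.median(np.abs(coeffs[-1][-1])) / 0.6745
            thr = k * sigma
            out = [coeffs[0]] + [
                tuple(pywt.threshold(d, thr, mode='soft') for d in details)
                for details in coeffs[1:]
            ]
            return pywt.waverec2(out, wavelet)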

  19. Design of automated oil sludge treatment unit

    NASA Astrophysics Data System (ADS)

    Chukhareva, N.; Korotchenko, T.; Yurkin, A.

    2015-11-01

    The article provides a feasibility study of contemporary oil sludge treatment methods. The basic parameters of a new resource-efficient oil sludge treatment unit that allows extracting as much oil as possible and disposing of the other components efficiently have been outlined. Based on the calculation results, it has been revealed that in order to reduce the cost of the treatment unit and the expenses related to sludge disposal, it is essential to apply various combinations of the existing treatment methods.

  20. Opportunities for Energy Efficiency and Open Automated Demand Response in Wastewater Treatment Facilities in California -- Phase I Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lekov, Alex; Thompson, Lisa; McKane, Aimee

    This report summarizes the Lawrence Berkeley National Laboratory's research to date in characterizing energy efficiency and automated demand response opportunities for wastewater treatment facilities in California. The report describes the characteristics of wastewater treatment facilities, the nature of the wastewater stream, energy use and demand, as well as details of the wastewater treatment process. It also discusses control systems and energy efficiency and automated demand response opportunities. In addition, several energy efficiency and load management case studies are provided for wastewater treatment facilities. This study shows that wastewater treatment facilities can be excellent candidates for open automated demand response and that facilities which have implemented energy efficiency measures and have centralized control systems are well-suited to shift or shed electrical loads in response to financial incentives, utility bill savings, and/or opportunities to enhance reliability of service. Control technologies installed for energy efficiency and load management purposes can often be adapted for automated demand response at little additional cost. These improved controls may prepare facilities to be more receptive to open automated demand response due to both increased confidence in the opportunities for controlling energy cost/use and access to the real-time data.

  1. Automation, Miniature Robotics and Sensors for Nondestructive Testing and Evaluation, Volume 4

    NASA Technical Reports Server (NTRS)

    Bar-Cohen, Y.; Baumgartner, E.; Backes, P.; Sherrit, S.; Bao, X.; Leary, S.; Kennedy, B.; Mavroidis, C.; Pfeiffer, C.; Culbert, C.; hide

    1999-01-01

    The development of NDE techniques has always been driven by the ongoing need for low-cost, rapid, user-friendly, reliable and efficient methods of detecting and characterizing flaws as well as determining material properties.

  2. Automated seamline detection along skeleton for remote sensing image mosaicking

    NASA Astrophysics Data System (ADS)

    Zhang, Hansong; Chen, Jianyu; Liu, Xin

    2015-08-01

    The automatic generation of a seamline along the overlap region skeleton is a key problem in the mosaicking of Remote Sensing (RS) images. As RS image resolution improves, it is necessary to ensure rapid and accurate processing under complex conditions. Thus, an automated seamline detection method for RS image mosaicking based on image objects and overlap region contour contraction is introduced. This approach ensures the universality and efficiency of mosaicking. The experiments also show that this method can select seamlines of RS images with great speed and high accuracy over arbitrary overlap regions, and realize rapid RS image mosaicking in surveying and mapping production.

  3. Efficient, Multi-Scale Designs Take Flight

    NASA Technical Reports Server (NTRS)

    2003-01-01

    Engineers can solve aerospace design problems faster and more efficiently with a versatile software product that performs automated structural analysis and sizing optimization. Collier Research Corporation's HyperSizer Structural Sizing Software is a design, analysis, and documentation tool that increases productivity and standardization for a design team. Based on established aerospace structural methods for strength, stability, and stiffness, HyperSizer can be used all the way from conceptual design to in-service support. The software originated from NASA's efforts to automate its capability to perform aircraft strength analyses, structural sizing, and weight prediction and reduction. With a strategy to combine finite element analysis with an automated design procedure, NASA's Langley Research Center led the development of a software code known as ST-SIZE from 1988 to 1995. Collier Research employees were principal developers of the code along with Langley researchers. The code evolved into one that could analyze the strength and stability of stiffened panels constructed of any material, including lightweight, fiber-reinforced composites.

  4. PHYSICO2: an UNIX based standalone procedure for computation of physicochemical, window-dependent and substitution based evolutionary properties of protein sequences along with automated block preparation tool, version 2.

    PubMed

    Banerjee, Shyamashree; Gupta, Parth Sarthi Sen; Nayek, Arnab; Das, Sunit; Sur, Vishma Pratap; Seth, Pratyay; Islam, Rifat Nawaz Ul; Bandyopadhyay, Amal K

    2015-01-01

    The automated genome sequencing procedure is enriching the sequence database very fast. To achieve a balance between the entry of sequences into the database and their analysis, efficient software is required. To this end PHYSICO2, compared to the earlier PHYSICO and other public-domain tools, is most efficient in that it (i) extracts physicochemical, window-dependent and homologous-position-based substitution (PWS) properties, including positional and BLOCK-specific diversity and conservation, (ii) provides users with optional flexibility in setting relevant input parameters, (iii) helps users prepare BLOCK-FASTA files through the program's Automated Block Preparation Tool, (iv) performs fast, accurate and user-friendly analyses, and (v) exports itemized outputs in Excel format along with detailed methodology. The program package contains documentation describing the application of the methods. Overall the program acts as an efficient PWS analyzer and finds application in sequence bioinformatics. PHYSICO2 is freely available at http://sourceforge.net/projects/physico2/ along with its documentation at https://sourceforge.net/projects/physico2/files/Documentation.pdf/download for all users.

  5. PHYSICO2: an UNIX based standalone procedure for computation of physicochemical, window-dependent and substitution based evolutionary properties of protein sequences along with automated block preparation tool, version 2

    PubMed Central

    Banerjee, Shyamashree; Gupta, Parth Sarthi Sen; Nayek, Arnab; Das, Sunit; Sur, Vishma Pratap; Seth, Pratyay; Islam, Rifat Nawaz Ul; Bandyopadhyay, Amal K

    2015-01-01

    The automated genome sequencing procedure is enriching the sequence database very fast. To achieve a balance between the entry of sequences into the database and their analysis, efficient software is required. To this end PHYSICO2, compared to the earlier PHYSICO and other public-domain tools, is most efficient in that it (i) extracts physicochemical, window-dependent and homologous-position-based substitution (PWS) properties, including positional and BLOCK-specific diversity and conservation, (ii) provides users with optional flexibility in setting relevant input parameters, (iii) helps users prepare BLOCK-FASTA files through the program's Automated Block Preparation Tool, (iv) performs fast, accurate and user-friendly analyses, and (v) exports itemized outputs in Excel format along with detailed methodology. The program package contains documentation describing the application of the methods. Overall the program acts as an efficient PWS analyzer and finds application in sequence bioinformatics. Availability: PHYSICO2 is freely available at http://sourceforge.net/projects/physico2/ along with its documentation at https://sourceforge.net/projects/physico2/files/Documentation.pdf/download for all users. PMID:26339154

  6. A semi-automated tool for treatment plan-quality evaluation and clinical trial quality assurance

    NASA Astrophysics Data System (ADS)

    Wang, Jiazhou; Chen, Wenzhou; Studenski, Matthew; Cui, Yunfeng; Lee, Andrew J.; Xiao, Ying

    2013-07-01

    The goal of this work is to develop a plan-quality evaluation program for clinical routine and multi-institutional clinical trials so that the overall evaluation efficiency is improved. In multi-institutional clinical trials, evaluating plan quality is a time-consuming and labor-intensive process. In this note, we present a semi-automated plan-quality evaluation program which combines MIMVista, Java/MATLAB, and extensible markup language (XML). More specifically, MIMVista is used for data visualization; Java and its powerful function library are implemented for calculating dosimetry parameters; and to improve the clarity of the index definitions, XML is applied. The accuracy and the efficiency of the program were evaluated by comparing the results of the program with the manually recorded results in two RTOG trials. A slight difference of about 0.2% in volume or 0.6 Gy in dose between the semi-automated program and manual recording was observed. According to the criteria of the indices, there are minimal differences between the two methods. The evaluation time is reduced from 10-20 min to 2 min by applying the semi-automated plan-quality evaluation program.

  7. Automated MRI parcellation of the frontal lobe.

    PubMed

    Ranta, Marin E; Chen, Min; Crocetti, Deana; Prince, Jerry L; Subramaniam, Krish; Fischl, Bruce; Kaufmann, Walter E; Mostofsky, Stewart H

    2014-05-01

    Examination of associations between specific disorders and physical properties of functionally relevant frontal lobe sub-regions is a fundamental goal in neuropsychiatry. Here, we present and evaluate automated methods of frontal lobe parcellation with the programs FreeSurfer (FS) and TOADS-CRUISE (T-C), based on the manual method described in Ranta et al. [2009]: Psychiatry Res 172:147-154, in which sulcal-gyral landmarks were used to manually delimit functionally relevant regions within the frontal lobe: i.e., primary motor cortex, anterior cingulate, deep white matter, premotor cortex regions (supplementary motor complex, frontal eye field, and lateral premotor cortex) and prefrontal cortex (PFC) regions (medial PFC, dorsolateral PFC, inferior PFC, lateral orbitofrontal cortex [OFC] and medial OFC). Dice's coefficient, a measure of overlap, and percent volume difference were used to measure the reliability between manual and automated delineations for each frontal lobe region. For FS, the mean Dice's coefficient for all regions was 0.75 and the percent volume difference was 21.2%. For T-C, the mean Dice's coefficient was 0.77 and the mean percent volume difference for all regions was 20.2%. These results, along with a high degree of agreement between the two automated methods (mean Dice's coefficient = 0.81, percent volume difference = 12.4%) and a proof-of-principle group difference analysis that highlights the consistency and sensitivity of the automated methods, indicate that the automated methods are valid techniques for parcellation of the frontal lobe into functionally relevant sub-regions. Thus, the methodology has the potential to increase efficiency, statistical power and reproducibility for population analyses of neuropsychiatric disorders with hypothesized frontal lobe contributions. Copyright © 2013 Wiley Periodicals, Inc.
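
    For reference, Dice's coefficient is 2|A∩B|/(|A|+|B|) for two segmentation masks A and B. The hedged sketch below computes it and a percent volume difference for boolean volumes; the exact denominator convention used by the paper for the volume difference is an assumption here.

        import numpy as np

        def dice(a, b):
            a, b = np.asarray(a, bool), np.asarray(b, bool)
            return 2.0 * np.logical_and(a, b).sum() / (a.sum() + b.sum())

        def pct_volume_diff(a, b):
            va = int(np.asarray(a, bool).sum())
            vb = int(np.asarray(b, bool).sum())
            return 100.0 * abs(va - vb) / ((va + vb) / 2.0)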

  8. Efficient model checking of network authentication protocol based on SPIN

    NASA Astrophysics Data System (ADS)

    Tan, Zhi-hua; Zhang, Da-fang; Miao, Li; Zhao, Dan

    2013-03-01

    Model checking is a very useful technique for verifying network authentication protocols. In order to improve the efficiency of modeling and verifying these protocols with model checking technology, this paper first proposes a universal formal description method for such protocols. Combined with the model checker SPIN, the method can conveniently verify the properties of a protocol. With some simplifying modeling strategies, several protocols can be modeled efficiently and the state space of the model reduced. Compared with the previous literature, this approach achieves a higher degree of automation and better verification efficiency. Finally, based on the method described in the paper, we model and verify the Privacy and Key Management (PKM) authentication protocol. The experimental results show that the model checking method is effective and can be applied to other authentication protocols.

  9. Model-centric distribution automation: Capacity, reliability, and efficiency

    DOE PAGES

    Onen, Ahmet; Jung, Jaesung; Dilek, Murat; ...

    2016-02-26

    A series of analyses along with field validations that evaluate efficiency, reliability, and capacity improvements of model-centric distribution automation are presented. With model-centric distribution automation, the same model is used from design to real-time control calculations. A 14-feeder system with 7 substations is considered. The analyses involve hourly time-varying loads and annual load growth factors. Phase balancing and capacitor redesign modifications are used to better prepare the system for distribution automation, where the designs are performed considering time-varying loads. Coordinated control of load tap changing transformers, line regulators, and switched capacitor banks is considered. In evaluating distribution automation versus traditional system design and operation, quasi-steady-state power flow analysis is used. In evaluating distribution automation performance for substation transformer failures, reconfiguration for restoration analysis is performed. In evaluating distribution automation for storm conditions, Monte Carlo simulations coupled with reconfiguration for restoration calculations are used. As a result, the evaluations demonstrate that model-centric distribution automation has positive effects on system efficiency, capacity, and reliability.

  10. Model-centric distribution automation: Capacity, reliability, and efficiency

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Onen, Ahmet; Jung, Jaesung; Dilek, Murat

    A series of analyses along with field validations that evaluate efficiency, reliability, and capacity improvements of model-centric distribution automation are presented. With model-centric distribution automation, the same model is used from design to real-time control calculations. A 14-feeder system with 7 substations is considered. The analyses involve hourly time-varying loads and annual load growth factors. Phase balancing and capacitor redesign modifications are used to better prepare the system for distribution automation, where the designs are performed considering time-varying loads. Coordinated control of load tap changing transformers, line regulators, and switched capacitor banks is considered. In evaluating distribution automation versus traditional system design and operation, quasi-steady-state power flow analysis is used. In evaluating distribution automation performance for substation transformer failures, reconfiguration for restoration analysis is performed. In evaluating distribution automation for storm conditions, Monte Carlo simulations coupled with reconfiguration for restoration calculations are used. As a result, the evaluations demonstrate that model-centric distribution automation has positive effects on system efficiency, capacity, and reliability.

  11. Visual Servoing-Based Nanorobotic System for Automated Electrical Characterization of Nanotubes inside SEM.

    PubMed

    Ding, Huiyang; Shi, Chaoyang; Ma, Li; Yang, Zhan; Wang, Mingyu; Wang, Yaqiong; Chen, Tao; Sun, Lining; Toshio, Fukuda

    2018-04-08

    The maneuvering and electrical characterization of nanotubes inside a scanning electron microscope (SEM) has historically been time-consuming and laborious for operators. Before the development of automated nanomanipulation-enabled techniques, the pick-and-place and characterization of nanoobjects remained incomplete and largely manual operations. In this paper, a dual-probe nanomanipulation system with vision-based feedback is demonstrated to automatically perform 3D nanomanipulation tasks and investigate the electrical characterization of nanotubes. The XY-positions of Atomic Force Microscope (AFM) cantilevers and individual carbon nanotubes (CNTs) were precisely recognized via a series of image processing operations. A coarse-to-fine positioning strategy in the Z-direction was applied through the combination of a sharpness-based depth estimation method and a contact-detection method. The use of nanorobotic magnification-regulated speed helped improve working efficiency and reliability. Additionally, we proposed automated alignment of the manipulator axes by visually tracking the movement trajectory of the end effector. The experimental results indicate the system's capability for automated measurement of the electrical characteristics of CNTs. Furthermore, the automated nanomanipulation system has the potential to be extended to other nanomanipulation tasks.

  12. Visual Servoing-Based Nanorobotic System for Automated Electrical Characterization of Nanotubes inside SEM

    PubMed Central

    Ding, Huiyang; Shi, Chaoyang; Ma, Li; Yang, Zhan; Wang, Mingyu; Wang, Yaqiong; Chen, Tao; Sun, Lining; Toshio, Fukuda

    2018-01-01

    The maneuvering and electrical characterization of nanotubes inside a scanning electron microscope (SEM) has historically been time-consuming and laborious for operators. Before the development of automated nanomanipulation-enabled techniques, the pick-and-place and characterization of nanoobjects remained incomplete and largely manual operations. In this paper, a dual-probe nanomanipulation system with vision-based feedback is demonstrated to automatically perform 3D nanomanipulation tasks and investigate the electrical characterization of nanotubes. The XY-positions of Atomic Force Microscope (AFM) cantilevers and individual carbon nanotubes (CNTs) were precisely recognized via a series of image processing operations. A coarse-to-fine positioning strategy in the Z-direction was applied through the combination of a sharpness-based depth estimation method and a contact-detection method. The use of nanorobotic magnification-regulated speed helped improve working efficiency and reliability. Additionally, we proposed automated alignment of the manipulator axes by visually tracking the movement trajectory of the end effector. The experimental results indicate the system’s capability for automated measurement of the electrical characteristics of CNTs. Furthermore, the automated nanomanipulation system has the potential to be extended to other nanomanipulation tasks. PMID:29642495

  13. Performance of automated and manual coding systems for occupational data: a case study of historical records.

    PubMed

    Patel, Mehul D; Rose, Kathryn M; Owens, Cindy R; Bang, Heejung; Kaufman, Jay S

    2012-03-01

    Occupational data are a common source of workplace exposure and socioeconomic information in epidemiologic research. We compared the performance of two occupation coding methods, automated software and a manual coder, using occupation and industry titles from U.S. historical records. We collected parental occupational data from 1920-40s birth certificates, Census records, and city directories on 3,135 deceased individuals in the Atherosclerosis Risk in Communities (ARIC) study. Unique occupation-industry narratives were assigned codes by a manual coder and by the Standardized Occupation and Industry Coding software program. We calculated agreement between the coding methods on classification into major Census occupational groups. The automated coding software assigned codes to 71% of occupations and 76% of industries. Of this subset coded by software, 73% of occupation codes and 69% of industry codes matched between automated and manual coding. For major occupational groups, agreement improved to 89% (kappa = 0.86). Automated occupational coding is a cost-efficient alternative to manual coding. However, some manual coding is required to code incomplete information. We found substantial variability between coders in the assignment of occupations, although less so for major groups.
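
    Agreement statistics of the kind reported here (percent agreement with a chance-corrected kappa) can be computed as in the hedged sketch below, which implements Cohen's kappa for two coders' label sequences:

        from collections import Counter

        def cohens_kappa(codes_a, codes_b):
            n = len(codes_a)
            p_obs = sum(a == b for a, b in zip(codes_a, codes_b)) / n
            fa, fb = Counter(codes_a), Counter(codes_b)
            p_chance = sum(fa[c] * fb.get(c, 0) for c in fa) / n ** 2
            return (p_obs - p_chance) / (1.0 - p_chance)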

  14. Inventory management and reagent supply for automated chemistry.

    PubMed

    Kuzniar, E

    1999-08-01

    Developments in automated chemistry have kept pace with developments in HTS such that hundreds of thousands of new compounds can be rapidly synthesized in the belief that the greater the number and diversity of compounds that can be screened, the more successful HTS will be. The increasing use of automation for Multiple Parallel Synthesis (MPS) and the move to automated combinatorial library production is placing an overwhelming burden on the management of reagents. Although automation has improved the efficiency of the processes involved in compound synthesis, the bottleneck has shifted to ordering, collating and preparing reagents for automated chemistry resulting in loss of time, materials and momentum. Major efficiencies have already been made in the area of compound management for high throughput screening. Most of these efficiencies have been achieved with sophisticated library management systems using advanced engineering and data handling for the storage, tracking and retrieval of millions of compounds. The Automation Partnership has already provided many of the top pharmaceutical companies with modular automated storage, preparation and retrieval systems to manage compound libraries for high throughput screening. This article describes how these systems may be implemented to solve the specific problems of inventory management and reagent supply for automated chemistry.

  15. Deep convolutional neural network and 3D deformable approach for tissue segmentation in musculoskeletal magnetic resonance imaging.

    PubMed

    Liu, Fang; Zhou, Zhaoye; Jang, Hyungseok; Samsonov, Alexey; Zhao, Gengyan; Kijowski, Richard

    2018-04-01

    To describe and evaluate a new fully automated musculoskeletal tissue segmentation method using deep convolutional neural network (CNN) and three-dimensional (3D) simplex deformable modeling to improve the accuracy and efficiency of cartilage and bone segmentation within the knee joint. A fully automated segmentation pipeline was built by combining a semantic segmentation CNN and 3D simplex deformable modeling. A CNN technique called SegNet was applied as the core of the segmentation method to perform high resolution pixel-wise multi-class tissue classification. The 3D simplex deformable modeling refined the output from SegNet to preserve the overall shape and maintain a desirable smooth surface for musculoskeletal structure. The fully automated segmentation method was tested using a publicly available knee image data set to compare with currently used state-of-the-art segmentation methods. The fully automated method was also evaluated on two different data sets, which include morphological and quantitative MR images with different tissue contrasts. The proposed fully automated segmentation method provided good segmentation performance with segmentation accuracy superior to most of state-of-the-art methods in the publicly available knee image data set. The method also demonstrated versatile segmentation performance on both morphological and quantitative musculoskeletal MR images with different tissue contrasts and spatial resolutions. The study demonstrates that the combined CNN and 3D deformable modeling approach is useful for performing rapid and accurate cartilage and bone segmentation within the knee joint. The CNN has promising potential applications in musculoskeletal imaging. Magn Reson Med 79:2379-2391, 2018. © 2017 International Society for Magnetic Resonance in Medicine.

  16. Computational methods for structural load and resistance modeling

    NASA Technical Reports Server (NTRS)

    Thacker, B. H.; Millwater, H. R.; Harren, S. V.

    1991-01-01

    An automated capability for computing structural reliability considering uncertainties in both load and resistance variables is presented. The computations are carried out using an automated Advanced Mean Value iteration algorithm (AMV+) with performance functions involving load and resistance variables obtained by both explicit and implicit methods. A complete description of the procedures used is given as well as several illustrative examples, verified by Monte Carlo analysis. In particular, the computational methods described in the paper are shown to be quite accurate and efficient for a material nonlinear structure considering material damage as a function of several primitive random variables. The results show clearly the effectiveness of the algorithms for computing the reliability of large-scale structural systems with a maximum number of resolutions.
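
    A brute-force Monte Carlo check of a reliability result can be as simple as the hedged sketch below, which estimates the failure probability of a made-up limit state g = R - L (failure when g <= 0) and converts it to an equivalent reliability index; the distributions are invented for illustration.

        import numpy as np
        from statistics import NormalDist

        rng = np.random.default_rng(0)
        n = 200_000
        load = rng.normal(100.0, 15.0, n)         # hypothetical load variable
        resistance = rng.normal(150.0, 20.0, n)   # hypothetical resistance variable
        pf = np.mean(resistance - load <= 0.0)
        beta = -NormalDist().inv_cdf(pf)          # equivalent reliability index
        print(f'Pf = {pf:.4f}, beta = {beta:.2f}')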

  17. Sci—Thur PM: Planning and Delivery — 03: Automated delivery and quality assurance of a modulated electron radiation therapy plan

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Connell, T; Papaconstadopoulos, P; Alexander, A

    2014-08-15

    Modulated electron radiation therapy (MERT) offers the potential to improve healthy tissue sparing through increased dose conformity. Challenges remain, however, in accurate beamlet dose calculation, plan optimization, collimation method and delivery accuracy. In this work, we investigate the accuracy and efficiency of an end-to-end MERT plan and automated-delivery workflow for the electron boost portion of a previously treated whole breast irradiation case. Dose calculations were performed using Monte Carlo methods and beam weights were determined using a research-based treatment planning system capable of inverse optimization. The plan was delivered to radiochromic film placed in a water-equivalent phantom for verification, using an automated motorized tertiary collimator. The automated delivery, which covered 4 electron energies, 196 subfields and 6183 total MU, was completed in 25.8 minutes, including 6.2 minutes of beam-on time with the remainder of the delivery time spent on collimator leaf motion and the automated interfacing with the accelerator in service mode. The delivery time could be reduced by 5.3 minutes with minor electron collimator modifications and the beam-on time could be reduced by an estimated factor of 2–3 through redesign of the scattering foils. Comparison of the planned and delivered film dose gave 3%/3 mm gamma pass rates of 62.1, 99.8, 97.8, 98.3, and 98.7 percent for the 9, 12, 16, 20 MeV, and combined energy deliveries, respectively. Good results were also seen in the delivery verification performed with a MapCHECK 2 device. The results showed that accurate and efficient MERT delivery is possible with current technologies.
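
    The 3%/3 mm gamma pass rates quoted above combine a dose-difference criterion with a distance-to-agreement criterion. The hedged sketch below is a brute-force 1D global gamma index for illustration only; film QA in practice is 2D and uses dedicated software.

        import numpy as np

        def gamma_1d(x_ref, d_ref, x_eval, d_eval, dta_mm=3.0, dd_pct=3.0):
            x_ref, d_ref = np.asarray(x_ref, float), np.asarray(d_ref, float)
            x_eval, d_eval = np.asarray(x_eval, float), np.asarray(d_eval, float)
            dd = dd_pct / 100.0 * d_ref.max()     # global dose criterion
            g = np.empty_like(d_ref)
            for i in range(d_ref.size):
                dist2 = ((x_eval - x_ref[i]) / dta_mm) ** 2
                dose2 = ((d_eval - d_ref[i]) / dd) ** 2
                g[i] = np.sqrt((dist2 + dose2).min())
            return g                              # pass rate: np.mean(g <= 1.0)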

  18. Hierarchical extraction of urban objects from mobile laser scanning data

    NASA Astrophysics Data System (ADS)

    Yang, Bisheng; Dong, Zhen; Zhao, Gang; Dai, Wenxia

    2015-01-01

    Point clouds collected in urban scenes contain a huge number of points (e.g., billions), numerous objects with significant size variability, complex and incomplete structures, and variable point densities, raising great challenges for the automated extraction of urban objects in the field of photogrammetry, computer vision, and robotics. This paper addresses these challenges by proposing an automated method to extract urban objects robustly and efficiently. The proposed method generates multi-scale supervoxels from 3D point clouds using the point attributes (e.g., colors, intensities) and spatial distances between points, and then segments the supervoxels rather than individual points by combining graph based segmentation with multiple cues (e.g., principal direction, colors) of the supervoxels. The proposed method defines a set of rules for merging segments into meaningful units according to types of urban objects and forms the semantic knowledge of urban objects for the classification of objects. Finally, the proposed method extracts and classifies urban objects in a hierarchical order ranked by the saliency of the segments. Experiments show that the proposed method is efficient and robust for extracting buildings, streetlamps, trees, telegraph poles, traffic signs, cars, and enclosures from mobile laser scanning (MLS) point clouds, with an overall accuracy of 92.3%.

  19. Study of Adaptive Mathematical Models for Deriving Automated Pilot Performance Measurement Techniques. Volume I. Model Development.

    ERIC Educational Resources Information Center

    Connelly, Edward A.; And Others

    A new approach to deriving human performance measures and criteria for use in automatically evaluating trainee performance is documented in this report. The ultimate application of the research is to provide methods for automatically measuring pilot performance in a flight simulator or from recorded in-flight data. An efficient method of…

  20. Two pass method and radiation interchange processing when applied to thermal-structural analysis of large space truss structures

    NASA Technical Reports Server (NTRS)

    Warren, Andrew H.; Arelt, Joseph E.; Lalicata, Anthony L.; Rogers, Karen M.

    1993-01-01

    A method of efficient and automated thermal-structural processing of very large space structures is presented. The method interfaces the finite element and finite difference techniques. It also results in a pronounced reduction of the quantity of computations, computer resources and manpower required for the task, while assuring the desired accuracy of the results.

  1. Automated Literature Searches for Longitudinal Tracking of Cancer Research Training Program Graduates.

    PubMed

    Padilla, Luz A; Desmond, Renee A; Brooks, C Michael; Waterbor, John W

    2018-06-01

    A key outcome measure of cancer research training programs is the number of cancer-related peer-reviewed publications after training. Because program graduates do not routinely report their publications, staff must periodically conduct electronic literature searches on each graduate. The purpose of this study is to compare the findings of an innovative computer-based automated search program versus repeated manual literature searches to identify post-training peer-reviewed publications. In late 2014, manual searches for publications by former R25 students identified 232 cancer-related articles published by 112 of 543 program graduates. In 2016, a research assistant was instructed in performing Scopus literature searches for comparison with individual PubMed searches on our 543 program graduates. Through 2014, Scopus found 304 cancer publications, 220 of which had been retrieved manually, plus an additional 84 papers. However, Scopus missed 12 publications found manually. Together, both methods found 316 publications. The automated method found 96.2% of the 316 publications while individual searches found only 73.4%. An automated search method such as using the Scopus database is a key tool for conducting comprehensive literature searches, but it must be supplemented with periodic manual searches to find the initial publications of program graduates. A time-saving feature of Scopus is its periodic automatic alerts of new publications. Although a training period is needed and initial costs can be high, an automated search method is worthwhile due to its high sensitivity and efficiency in the long term.
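
    Automated per-graduate queries of the kind described can be scripted against a bibliographic web service. The hedged sketch below uses the public NCBI E-utilities endpoint rather than Scopus (whose API requires a key); the query term and date window are illustrative and would need each graduate's name variants.

        import requests

        ESEARCH = 'https://eutils.ncbi.nlm.nih.gov/entrez/eutils/esearch.fcgi'

        def cancer_pmids(author, mindate='2014/01/01', maxdate='2018/12/31'):
            params = {
                'db': 'pubmed',
                'term': f'{author}[Author] AND (cancer OR neoplasm)',
                'datetype': 'pdat', 'mindate': mindate, 'maxdate': maxdate,
                'retmode': 'json', 'retmax': 500,
            }
            r = requests.get(ESEARCH, params=params, timeout=30)
            r.raise_for_status()
            return r.json()['esearchresult']['idlist']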

  2. Volumetric analysis of pelvic hematomas after blunt trauma using semi-automated seeded region growing segmentation: a method validation study.

    PubMed

    Dreizin, David; Bodanapally, Uttam K; Neerchal, Nagaraj; Tirada, Nikki; Patlas, Michael; Herskovits, Edward

    2016-11-01

    Manually segmented traumatic pelvic hematoma volumes are strongly predictive of active bleeding at conventional angiography, but the method is time intensive, limiting its clinical applicability. We compared volumetric analysis using semi-automated region growing segmentation to manual segmentation and diameter-based size estimates in patients with pelvic hematomas after blunt pelvic trauma. A 14-patient cohort was selected in an anonymous randomized fashion from a dataset of patients with pelvic binders at MDCT, collected retrospectively as part of a HIPAA-compliant IRB-approved study from January 2008 to December 2013. To evaluate intermethod differences, one reader (R1) performed three volume measurements using the manual technique and three volume measurements using the semi-automated technique. To evaluate interobserver differences for semi-automated segmentation, a second reader (R2) performed three semi-automated measurements. One-way analysis of variance was used to compare differences in mean volumes. Time effort was also compared. Correlation between the two methods as well as two shorthand appraisals (greatest diameter, and the ABC/2 method for estimating ellipsoid volumes) was assessed with Spearman's rho (r). Intraobserver variability was lower for semi-automated compared to manual segmentation, with standard deviations ranging between ±5-32 mL and ±17-84 mL, respectively (p = 0.0003). There was no significant difference in mean volumes between the two readers' semi-automated measurements (p = 0.83); however, means were lower for the semi-automated compared with the manual technique (manual: mean and SD 309.6 ± 139 mL; R1 semi-auto: 229.6 ± 88.2 mL, p = 0.004; R2 semi-auto: 243.79 ± 99.7 mL, p = 0.021). Despite differences in means, the correlation between the two methods was very strong and highly significant (r = 0.91, p < 0.001). Correlations with diameter-based methods were only moderate and nonsignificant. Mean semi-automated segmentation time effort was 2 min and 6 s and 2 min and 35 s for R1 and R2, respectively, vs. 22 min and 8 s for manual segmentation. Semi-automated pelvic hematoma volumes correlate strongly with manually segmented volumes. Since semi-automated segmentation can be performed reliably and efficiently, volumetric analysis of traumatic pelvic hematomas is potentially valuable at the point-of-care.
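
    For context, the ABC/2 shorthand mentioned above approximates a hematoma as an ellipsoid: the product of its three orthogonal diameters divided by two, since pi/6 is roughly one half. A minimal sketch:

        def abc_over_2(a_cm, b_cm, c_cm):
            """Ellipsoid volume estimate in mL from three diameters in cm:
            (4/3)*pi*(a/2)*(b/2)*(c/2) = pi*a*b*c/6 ~ a*b*c/2."""
            return a_cm * b_cm * c_cm / 2.0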

  3. Quantification of 31 illicit and medicinal drugs and metabolites in whole blood by fully automated solid-phase extraction and ultra-performance liquid chromatography-tandem mass spectrometry.

    PubMed

    Bjørk, Marie Kjærgaard; Simonsen, Kirsten Wiese; Andersen, David Wederkinck; Dalsgaard, Petur Weihe; Sigurðardóttir, Stella Rögn; Linnet, Kristian; Rasmussen, Brian Schou

    2013-03-01

    An efficient method for analyzing illegal and medicinal drugs in whole blood using fully automated sample preparation and a short ultra-high-performance liquid chromatography-tandem mass spectrometry (UHPLC-MS/MS) run time is presented. A selection of 31 drugs, including amphetamines, cocaine, opioids, and benzodiazepines, was used. In order to increase the efficiency of routine analysis, a robotic system based on automated liquid handling and capable of handling all unit operations for sample preparation was built on a Freedom Evo 200 platform with several add-ons from Tecan and third-party vendors. Solid-phase extraction was performed using Strata X-C plates. Extraction time for 96 samples was less than 3 h. Chromatography was performed using an ACQUITY UPLC system (Waters Corporation, Milford, USA). Analytes were separated on a 100 mm × 2.1 mm, 1.7 μm Acquity UPLC CSH C(18) column using a 6.5 min 0.1% ammonia (25%) in water/0.1% ammonia (25%) in methanol gradient and quantified by MS/MS (Waters Quattro Premier XE) in multiple-reaction monitoring mode. Full validation, including linearity, precision and trueness, matrix effect, ion suppression/enhancement of co-eluting analytes, recovery, and specificity, was performed. The method was employed successfully in the laboratory and used for routine analysis of forensic material. In combination with tetrahydrocannabinol analysis, the method covered 96% of cases involving driving under the influence of drugs. The manual labor involved in preparing blood samples, solvents, etc., was reduced to half an hour per batch. The automated sample preparation setup also minimized human exposure to hazardous materials, provided highly improved ergonomics, and eliminated manual pipetting.

  4. Development of the automated circulating tumor cell recovery system with microcavity array.

    PubMed

    Negishi, Ryo; Hosokawa, Masahito; Nakamura, Seita; Kanbara, Hisashige; Kanetomo, Masafumi; Kikuhara, Yoshihito; Tanaka, Tsuyoshi; Matsunaga, Tadashi; Yoshino, Tomoko

    2015-05-15

    Circulating tumor cells (CTCs) are well recognized as a useful biomarker for cancer diagnosis and a potential target of drug discovery for metastatic cancer. Efficient and precise recovery of extremely low concentrations of CTCs from blood is required to increase detection sensitivity. Here, an automated system equipped with a microcavity array (MCA) was demonstrated for highly efficient and reproducible CTC recovery. The use of the MCA allows selective recovery of cancer cells from whole blood on the basis of differences in size between tumor and blood cells. Intra- and inter-assays revealed that the automated system achieved efficiency and reproducibility equal to the assay performed manually by a well-trained operator. Under the optimized assay workflow, the automated system allows efficient and precise cell recovery for non-small cell lung cancer cells spiked into whole blood. The automated CTC recovery system will contribute to high-throughput analysis in further clinical studies on large cohorts of cancer patients. Copyright © 2014 Elsevier B.V. All rights reserved.

  5. Comparison of automated erythrocytapheresis versus manual exchange transfusion to treat cerebral macrovasculopathy in sickle cell anemia.

    PubMed

    Koehl, Bérengère; Sommet, Julie; Holvoet, Laurent; Abdoul, Hendy; Boizeau, Priscilla; Ithier, Ghislaine; Missud, Florence; Couque, Nathalie; Verlhac, Suzanne; Voultoury, Pauline; Sellami, Fatiha; Baruchel, André; Benkerrou, Malika

    2016-05-01

    Chronic exchange transfusion is effective for primary and secondary prevention of stroke in children with sickle cell anemia (SCA). Erythrocytapheresis is recognized to be the most efficient approach; however, it is not widely implemented and is not suitable for all patients. The aim of our study was to compare automated exchange transfusion (AET) with our manual method of exchange transfusion and, in particular, to evaluate the efficacy, safety, and cost of our manual method. Thirty-nine SCA children with stroke and/or abnormal findings on transcranial Doppler were included in the study. We retrospectively analyzed 1353 exchange sessions, including 333 sessions of AET and 1020 sessions of manual exchange transfusion (MET). Both methods were well tolerated. The median decrease in hemoglobin S (HbS) per session was 21.5% with AET and 18.8% with our manual method (p < 0.0001) with no major increase in red blood cell consumption. Iron overload was well controlled, even with the manual method, with a median (interquartile range) ferritin level of 312 (152-994) µg/L after 24 months of transfusions. The main differences in annual cost relate to equipment costs, which were 74 times higher with the automated method. Our study shows that continuous MET has comparable efficacy to the automated method in terms of stroke prevention, decrease in HbS, and iron overload prevention. It is feasible in all hospital settings and is often combined with AET successively over time. © 2016 AABB.

  6. Scarless assembly of unphosphorylated DNA fragments with a simplified DATEL method.

    PubMed

    Ding, Wenwen; Weng, Huanjiao; Jin, Peng; Du, Guocheng; Chen, Jian; Kang, Zhen

    2017-05-04

    Efficient assembly of multiple DNA fragments is a pivotal technology for synthetic biology. A scarless and sequence-independent DNA assembly method (DATEL) using thermal exonucleases has been developed recently. Here, we present a simplified DATEL (sDATEL) for efficient assembly of unphosphorylated DNA fragments at low cost. The sDATEL method depends only on Taq DNA polymerase and Taq DNA ligase. After optimizing the critical parameters of the reaction system, such as pH and the concentrations of Mg2+ and NAD+, the assembly efficiency was increased 32-fold. To further improve the assembly capacity, the number of thermal cycles was optimized, resulting in the successful assembly of four unphosphorylated DNA fragments with an accuracy of 75%. sDATEL could be a desirable method for routine manual and automated assembly.

  7. Model-based classification of CPT data and automated lithostratigraphic mapping for high-resolution characterization of a heterogeneous sedimentary aquifer

    PubMed Central

    Mallants, Dirk; Batelaan, Okke; Gedeon, Matej; Huysmans, Marijke; Dassargues, Alain

    2017-01-01

    Cone penetration testing (CPT) is one of the most efficient and versatile methods currently available for geotechnical, lithostratigraphic and hydrogeological site characterization. Currently available methods for soil behaviour type classification (SBT) of CPT data however have severe limitations, often restricting their application to a local scale. For parameterization of regional groundwater flow or geotechnical models, and delineation of regional hydro- or lithostratigraphy, regional SBT classification would be very useful. This paper investigates the use of model-based clustering for SBT classification, and the influence of different clustering approaches on the properties and spatial distribution of the obtained soil classes. We additionally propose a methodology for automated lithostratigraphic mapping of regionally occurring sedimentary units using SBT classification. The methodology is applied to a large CPT dataset, covering a groundwater basin of ~60 km2 with predominantly unconsolidated sandy sediments in northern Belgium. Results show that the model-based approach is superior in detecting the true lithological classes when compared to more frequently applied unsupervised classification approaches or literature classification diagrams. We demonstrate that automated mapping of lithostratigraphic units using advanced SBT classification techniques can provide a large gain in efficiency, compared to more time-consuming manual approaches and yields at least equally accurate results. PMID:28467468

  8. Model-based classification of CPT data and automated lithostratigraphic mapping for high-resolution characterization of a heterogeneous sedimentary aquifer.

    PubMed

    Rogiers, Bart; Mallants, Dirk; Batelaan, Okke; Gedeon, Matej; Huysmans, Marijke; Dassargues, Alain

    2017-01-01

    Cone penetration testing (CPT) is one of the most efficient and versatile methods currently available for geotechnical, lithostratigraphic and hydrogeological site characterization. Currently available methods for soil behaviour type classification (SBT) of CPT data however have severe limitations, often restricting their application to a local scale. For parameterization of regional groundwater flow or geotechnical models, and delineation of regional hydro- or lithostratigraphy, regional SBT classification would be very useful. This paper investigates the use of model-based clustering for SBT classification, and the influence of different clustering approaches on the properties and spatial distribution of the obtained soil classes. We additionally propose a methodology for automated lithostratigraphic mapping of regionally occurring sedimentary units using SBT classification. The methodology is applied to a large CPT dataset, covering a groundwater basin of ~60 km2 with predominantly unconsolidated sandy sediments in northern Belgium. Results show that the model-based approach is superior in detecting the true lithological classes when compared to more frequently applied unsupervised classification approaches or literature classification diagrams. We demonstrate that automated mapping of lithostratigraphic units using advanced SBT classification techniques can provide a large gain in efficiency, compared to more time-consuming manual approaches and yields at least equally accurate results.
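
    As a hedged stand-in for the model-based clustering used here, the sketch below fits a Gaussian mixture to a conventional CPT feature space (log cone resistance and log friction ratio); the feature choice and class count are assumptions, not the paper's exact configuration.

        import numpy as np
        from sklearn.mixture import GaussianMixture

        def sbt_classes(qc, fs, n_classes=6, seed=0):
            # Assumes positive cone resistance qc and sleeve friction fs.
            qc, fs = np.asarray(qc, float), np.asarray(fs, float)
            X = np.column_stack([np.log10(qc), np.log10(100.0 * fs / qc)])
            gmm = GaussianMixture(n_components=n_classes,
                                  covariance_type='full',
                                  random_state=seed).fit(X)
            return gmm.predict(X)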

  9. Automated clinical trial eligibility prescreening: increasing the efficiency of patient identification for clinical trials in the emergency department

    PubMed Central

    Ni, Yizhao; Kennebeck, Stephanie; Dexheimer, Judith W; McAneney, Constance M; Tang, Huaxiu; Lingren, Todd; Li, Qi; Zhai, Haijun; Solti, Imre

    2015-01-01

    Objectives: (1) To develop an automated eligibility screening (ES) approach for clinical trials in an urban tertiary care pediatric emergency department (ED); (2) to assess the effectiveness of natural language processing (NLP), information extraction (IE), and machine learning (ML) techniques on real-world clinical data and trials. Data and methods: We collected eligibility criteria for 13 randomly selected, disease-specific clinical trials actively enrolling patients between January 1, 2010 and August 31, 2012. In parallel, we retrospectively selected data fields including demographics, laboratory data, and clinical notes from the electronic health record (EHR) to represent profiles of all 202,795 patients visiting the ED during the same period. Leveraging NLP, IE, and ML technologies, the automated ES algorithms identified patients whose profiles matched the trial criteria to reduce the pool of candidates for staff screening. The performance was validated on both a physician-generated gold standard of trial–patient matches and a reference standard of historical trial–patient enrollment decisions, where workload, mean average precision (MAP), and recall were assessed. Results: Compared with the case without automation, the workload with automated ES was reduced by 92% on the gold standard set, with a MAP of 62.9%. The automated ES achieved a 450% increase in trial screening efficiency. The findings on the gold standard set were confirmed by large-scale evaluation on the reference set of trial–patient matches. Discussion and conclusion: By exploiting the text of trial criteria and the content of EHRs, we demonstrated that NLP-, IE-, and ML-based automated ES could successfully identify patients for clinical trials. PMID:25030032
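
    As a hedged toy version of the text-matching idea (the actual system used richer NLP, information extraction, and supervised ML), the sketch below ranks patient notes against one trial's criteria text by TF-IDF cosine similarity:

        from sklearn.feature_extraction.text import TfidfVectorizer
        from sklearn.metrics.pairwise import cosine_similarity

        def rank_patients(criteria_text, patient_notes):
            vec = TfidfVectorizer(stop_words='english')
            X = vec.fit_transform([criteria_text] + list(patient_notes))
            sims = cosine_similarity(X[0], X[1:]).ravel()
            return sims.argsort()[::-1]   # best-matching patient indices first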

  10. The Use of Computer Simulation Methods to Reach Data for Economic Analysis of Automated Logistic Systems

    NASA Astrophysics Data System (ADS)

    Neradilová, Hana; Fedorko, Gabriel

    2016-12-01

    Automated logistic systems are becoming more widely used within enterprise logistics processes. Their main advantage is that they increase the efficiency and reliability of logistics processes. In evaluating their effectiveness, it is necessary to take into account the economic aspect of the entire process. However, many users ignore and underestimate this area, which is a mistake. One of the reasons the economic aspect is overlooked is that obtaining information for such an analysis is not easy. The aim of this paper is to present how computer simulation methods can supply the data needed for a full-scale economic analysis.
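
    As a hedged illustration of how simulation can supply such data, the sketch below Monte Carlo-samples the annual operating cost of an automated handling loop; every distribution and cost figure is invented for illustration.

        import random

        def expected_annual_cost(runs=10_000):
            total = 0.0
            for _ in range(runs):
                cycles = random.gauss(120_000, 8_000)            # cycles per year
                energy = cycles * random.uniform(0.04, 0.06)     # EUR per cycle
                maintenance = random.gauss(12_000, 2_000)        # EUR per year
                total += energy + maintenance
            return total / runs

        print(f'expected annual operating cost: {expected_annual_cost():,.0f} EUR')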

  11. Optimization of Composite Structures with Curved Fiber Trajectories

    NASA Astrophysics Data System (ADS)

    Lemaire, Etienne; Zein, Samih; Bruyneel, Michael

    2014-06-01

    This paper studies the problem of optimizing composite shells manufactured using Automated Tape Layup (ATL) or Automated Fiber Placement (AFP) processes. The optimization procedure relies on a new approach to generating equidistant fiber trajectories based on the Fast Marching Method. Starting with a (possibly curved) reference fiber direction defined on a (possibly curved) meshed surface, the new method allows determining the fiber orientations resulting from a uniform-thickness layup. The design variables are the parameters defining the position and the shape of the reference curve, which results in very few design variables. Thanks to this efficient parameterization, maximum-stiffness optimization numerical applications are proposed. The shape of the design space is discussed with regard to local and global optimal solutions.

  12. ICSH guidelines for the verification and performance of automated cell counters for body fluids.

    PubMed

    Bourner, G; De la Salle, B; George, T; Tabe, Y; Baum, H; Culp, N; Keng, T B

    2014-12-01

    One of the many challenges facing laboratories is the verification of their automated Complete Blood Count cell counters for the enumeration of body fluids. These analyzers offer improved accuracy, precision, and efficiency in performing the enumeration of cells compared with manual methods. A patterns of practice survey was distributed to laboratories that participate in proficiency testing in Ontario, Canada, the United States, the United Kingdom, and Japan to determine the number of laboratories that are testing body fluids on automated analyzers and the performance specifications that were performed. Based on the results of this questionnaire, an International Working Group for the Verification and Performance of Automated Cell Counters for Body Fluids was formed by the International Council for Standardization in Hematology (ICSH) to prepare a set of guidelines to help laboratories plan and execute the verification of their automated cell counters to provide accurate and reliable results for automated body fluid counts. These guidelines were discussed at the ICSH General Assemblies and reviewed by an international panel of experts to achieve further consensus. © 2014 John Wiley & Sons Ltd.

  13. Application of the H-Mode, a Design and Interaction Concept for Highly Automated Vehicles, to Aircraft

    NASA Technical Reports Server (NTRS)

    Goodrich, Kenneth H.; Flemisch, Frank O.; Schutte, Paul C.; Williams, Ralph A.

    2006-01-01

    Driven by the need for increased safety, efficiency, and airspace capacity, automation is playing an increasing role in aircraft operations. As aircraft become increasingly able to respond autonomously to a range of situations, with performance surpassing that of human operators, we are compelled to look for new methods that help us understand their use and guide their design using new forms of automation and interaction. We propose a novel design metaphor to aid the conceptualization, design, and operation of highly automated aircraft. Design metaphors transfer meaning from common experiences to less familiar applications or functions. A notable example is the "Desktop metaphor" for manipulating files on a computer. This paper describes a metaphor for highly automated vehicles known as the H-metaphor and a specific embodiment of the metaphor, known as the H-mode, as applied to aircraft. The fundamentals of the H-metaphor are reviewed, followed by an overview of an exploratory usability study investigating human-automation interaction issues for a simple H-mode implementation. The envisioned application of the H-mode concept to aircraft is then described, as are two planned evaluations.

  14. The Effect of Computer Automation on Institutional Review Board (IRB) Office Efficiency

    ERIC Educational Resources Information Center

    Oder, Karl; Pittman, Stephanie

    2015-01-01

    Companies purchase computer systems to make their processes more efficient through automation. Some academic medical centers (AMC) have purchased computer systems for their institutional review boards (IRB) to increase efficiency and compliance with regulations. IRB computer systems are expensive to purchase, deploy, and maintain. An AMC should…

  15. Automated Assessment of Child Vocalization Development Using LENA

    ERIC Educational Resources Information Center

    Richards, Jeffrey A.; Xu, Dongxin; Gilkerson, Jill; Yapanel, Umit; Gray, Sharmistha; Paul, Terrance

    2017-01-01

    Purpose: To produce a novel, efficient measure of children's expressive vocal development on the basis of automatic vocalization assessment (AVA), child vocalizations were automatically identified and extracted from audio recordings using Language Environment Analysis (LENA) System technology. Method: Assessment was based on full-day audio…

  16. Automation of Endmember Pixel Selection in SEBAL/METRIC Model

    NASA Astrophysics Data System (ADS)

    Bhattarai, N.; Quackenbush, L. J.; Im, J.; Shaw, S. B.

    2015-12-01

    The commonly applied surface energy balance for land (SEBAL) model and its variant, mapping evapotranspiration (ET) at high resolution with internalized calibration (METRIC), require manual selection of endmember (i.e., hot and cold) pixels to calibrate sensible heat flux. Current approaches for automating this process are based on statistical methods and do not appear to be robust under varying climate conditions and seasons. In this paper, we introduce a new approach based on simple machine learning tools and search algorithms that provides an automatic and time-efficient way of identifying endmember pixels for use in these models. The fully automated models were applied to over 100 cloud-free Landsat images, each covering several eddy covariance flux sites in Florida and Oklahoma. Observed land surface temperatures at automatically identified hot and cold pixels were within 0.5% of those from pixels manually identified by an experienced operator (coefficient of determination, R2, ≥ 0.92; Nash-Sutcliffe efficiency, NSE, ≥ 0.92; and root mean squared error, RMSE, ≤ 1.67 K). Daily ET estimates derived from the automated SEBAL and METRIC models were in good agreement with their manual counterparts (e.g., NSE ≥ 0.91 and RMSE ≤ 0.35 mm day-1). Automated and manual pixel selection resulted in similar estimates of observed ET across all sites. The proposed approach should reduce the time demands of applying SEBAL/METRIC models and allow for their more widespread and frequent use. This automation can also reduce the potential bias introduced by an inexperienced operator and extend the domain of the models to new users.
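
    The agreement statistics quoted above are standard; as a minimal sketch (with hypothetical numbers, not the study's data), the Nash-Sutcliffe efficiency and RMSE between manually and automatically calibrated ET estimates can be computed as:

        import numpy as np

        def nse(obs, sim):
            # Nash-Sutcliffe efficiency: 1 is a perfect match, 0 means the
            # simulation is no better than the observed mean
            obs, sim = np.asarray(obs, float), np.asarray(sim, float)
            return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

        def rmse(obs, sim):
            obs, sim = np.asarray(obs, float), np.asarray(sim, float)
            return float(np.sqrt(np.mean((obs - sim) ** 2)))

        # hypothetical daily ET (mm/day): manual- vs. automated-calibration output
        manual = [3.1, 4.0, 2.2, 5.3, 4.8]
        auto = [3.0, 4.2, 2.1, 5.1, 4.9]
        print(nse(manual, auto), rmse(manual, auto))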

  17. An advanced method for classifying atmospheric circulation types based on prototypes connectivity graph

    NASA Astrophysics Data System (ADS)

    Zagouras, Athanassios; Argiriou, Athanassios A.; Flocas, Helena A.; Economou, George; Fotopoulos, Spiros

    2012-11-01

    Classification of weather maps at various isobaric levels has been used for many years as a methodological tool in problems related to meteorology, climatology, atmospheric pollution, and other fields. Initially the classification was performed manually. The criteria used by the person performing the classification are features of isobars or isopleths of geopotential height, depending on the type of maps to be classified. Although manual classifications integrate the perceptual experience and other unquantifiable qualities of the meteorology specialists involved, they are typically subjective and time consuming. In recent years, various automated, so-called objective methods for atmospheric circulation classification have therefore been proposed. In this paper a new method for the classification of atmospheric circulation from isobaric maps is presented. The method is based on graph theory. It starts with an intelligent prototype selection using an over-partitioning mode of the fuzzy c-means (FCM) algorithm, proceeds to a graph formulation for the entire dataset, and produces the clusters with the dominant sets clustering method. Graph theory allows a more efficient representation of spatially correlated data than the classical Euclidean space representations used in conventional classification methods. The method has been applied to the classification of the 850 hPa atmospheric circulation over the Eastern Mediterranean. The automated method is evaluated by statistical indexes; the results indicate that the classification is adequately comparable with other state-of-the-art automated map classification methods for a variable number of clusters.
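
    The dominant-sets step can be sketched with discrete replicator dynamics on an affinity matrix; this is a generic implementation of the clustering idea on our own toy data, not the authors' full prototype-selection pipeline:

        import numpy as np

        def dominant_set(A, iters=5000, tol=1e-10):
            # discrete replicator dynamics on the affinity matrix; the support
            # of the converged vector approximates one dominant set (cluster)
            n = len(A)
            x = np.full(n, 1.0 / n)
            for _ in range(iters):
                y = x * (A @ x)
                s = y.sum()
                if s <= 0:
                    break
                y /= s
                if np.abs(y - x).sum() < tol:
                    x = y
                    break
                x = y
            return np.flatnonzero(x > 1.0 / n ** 2)  # prune near-zero weights

        # toy affinity between 5 prototype circulation maps: two tight groups
        A = np.array([[0.0, 0.9, 0.8, 0.1, 0.0],
                      [0.9, 0.0, 0.85, 0.05, 0.1],
                      [0.8, 0.85, 0.0, 0.1, 0.05],
                      [0.1, 0.05, 0.1, 0.0, 0.9],
                      [0.0, 0.1, 0.05, 0.9, 0.0]])
        print(dominant_set(A))  # members of the densest group: [0 1 2]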

  18. Opportunities for Energy Efficiency and Automated Demand Response in Industrial Refrigerated Warehouses in California

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lekov, Alex; Thompson, Lisa; McKane, Aimee

    2009-05-11

    This report summarizes the Lawrence Berkeley National Laboratory's research to date in characterizing energy efficiency and open automated demand response opportunities for industrial refrigerated warehouses in California. The report describes refrigerated warehouse characteristics, energy use and demand, and control systems. It also discusses energy efficiency and open automated demand response opportunities and provides analysis results from three demand response studies. In addition, several energy efficiency, load management, and demand response case studies are provided for refrigerated warehouses. This study shows that refrigerated warehouses can be excellent candidates for open automated demand response and that facilities which have implemented energy efficiency measures and have centralized control systems are well-suited to shift or shed electrical loads in response to financial incentives, utility bill savings, and/or opportunities to enhance reliability of service. Control technologies installed for energy efficiency and load management purposes can often be adapted for open automated demand response (OpenADR) at little additional cost. These improved controls may prepare facilities to be more receptive to OpenADR due to both increased confidence in the opportunities for controlling energy cost/use and access to real-time data.

  19. Non-linear Multidimensional Optimization for use in Wire Scanner Fitting

    NASA Astrophysics Data System (ADS)

    Henderson, Alyssa; Terzic, Balsa; Hofler, Alicia; Center Advanced Studies of Accelerators Collaboration

    2014-03-01

    To ensure experiment efficiency and quality from the Continuous Electron Beam Accelerator at Jefferson Lab, beam energy, size, and position must be measured. Wire scanners are devices inserted into the beamline to produce measurements from which beam properties are obtained. Extracting physical information from the wire scanner measurements begins by fitting Gaussian curves to the data. This study focuses on optimizing and automating this curve-fitting procedure. We use a hybrid approach combining the efficiency of the Newton Conjugate Gradient (NCG) method with the global convergence of three nature-inspired (NI) optimization approaches: genetic algorithms, differential evolution, and particle-swarm optimization. In this Python-implemented approach, augmenting the locally convergent NCG with one of the globally convergent methods ensures the quality, robustness, and automation of curve-fitting. After comparing the methods, we establish that, given an initial data-derived guess, each finds a solution with the same chi-square, a measure of the agreement of the fit with the data. NCG is the fastest method, so it is the first to attempt data fitting. The curve-fitting procedure escalates to one of the globally convergent NI methods only if NCG fails, thereby ensuring a successful fit. This method allows for an optimal signal fit and can be easily applied to similar problems.
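
    A minimal sketch of the escalation strategy described above, using SciPy's local least-squares fit in place of NCG and differential evolution as the globally convergent fallback (function names, initial guess, and bounds are our illustrative choices):

        import numpy as np
        from scipy.optimize import curve_fit, differential_evolution

        def gauss(x, a, mu, sigma, c):
            return a * np.exp(-0.5 * ((x - mu) / sigma) ** 2) + c

        def hybrid_fit(x, y):
            # try the fast local fit first; escalate to a globally convergent
            # method only on failure
            p0 = [y.max() - y.min(), x[np.argmax(y)], (x[-1] - x[0]) / 10.0, y.min()]
            try:
                popt, _ = curve_fit(gauss, x, y, p0=p0, maxfev=2000)
                return popt
            except RuntimeError:
                chi2 = lambda p: float(np.sum((y - gauss(x, *p)) ** 2))
                span = x[-1] - x[0]
                bounds = [(0.0, 2.0 * np.ptp(y)), (x[0], x[-1]),
                          (span / 100.0, span), (y.min(), y.max())]
                return differential_evolution(chi2, bounds, seed=0).x

        rng = np.random.default_rng(0)
        x = np.linspace(-5, 5, 200)
        y = gauss(x, 2.0, 0.5, 0.8, 0.1) + 0.05 * rng.standard_normal(200)
        print(hybrid_fit(x, y))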

  20. SU-E-T-406: Use of TrueBeam Developer Mode and API to Increase the Efficiency and Accuracy of Commissioning Measurements for the Varian EDGE Stereotactic Linac

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gardner, S; Gulam, M; Song, K

    2014-06-01

    Purpose: The Varian EDGE machine is a new stereotactic platform, combining Calypso and VisionRT localization systems with a stereotactic linac. The system includes TrueBeam Developer Mode, making possible the use of XML scripting for automation of linac-related tasks. This study details the use of Developer Mode to automate commissioning tasks for the Varian EDGE, thereby improving efficiency and measurement consistency. Methods: XML scripting was used for various commissioning tasks, including couch model verification, beam scanning, and isocenter verification. For couch measurements, point measurements were acquired for several field sizes (2×2, 4×4, and 10×10 cm2) at 42 gantry angles for two couch models. Measurements were acquired with variations in couch position (rails in/out, couch shifted in each of the motion axes) and compared to treatment planning system (TPS)-calculated values, which were logged automatically through advanced planning interface (API) scripting functionality. For beam scanning, XML scripts were used to create custom MLC apertures. For isocenter verification, XML scripts were used to automate various Winston-Lutz-type tests. Results: For couch measurements, the time required for each set of angles was approximately 9 minutes. Without scripting, each set required approximately 12 minutes. Automated measurements required only one physicist, while manual measurements required at least two physicists to handle linac positions/beams and data recording. MLC apertures were generated outside of the TPS, and with the .xml file format, double-checking without use of the TPS/operator console was possible. Similar time-efficiency gains were found for isocenter verification measurements. Conclusion: The use of XML scripting in TrueBeam Developer Mode allows for efficient and accurate data acquisition during commissioning. The efficiency improvement is most pronounced for iterative measurements, exemplified by the time savings for couch modeling measurements (approximately 10 hours). The scripting also allowed for creation of the files in advance without requiring access to the TPS. The API scripting functionality enabled efficient creation/mining of TPS data. Finally, automation reduces the potential for human error in entering linac values at the machine console, and the script provides a log of measurements acquired for each session. This research was supported in part by a grant from Varian Medical Systems, Palo Alto, CA.

  1. A new, fast and semi-automated size determination method (SASDM) for studying multicellular tumor spheroids

    PubMed Central

    Monazzam, Azita; Razifar, Pasha; Lindhe, Örjan; Josephsson, Raymond; Långström, Bengt; Bergström, Mats

    2005-01-01

    Background Considering the breadth and importance of the use of Multicellular Tumor Spheroids (MTS) in oncology research, an accurate and fast method for determining MTS size is essential. In the present study an effective, fast, and semi-automated method, SASDM, was developed to determine the size of MTSs. The method was applied and tested on MTSs of three different cell lines. Frozen-section autoradiography and Hematoxylin and Eosin (H&E) staining were used for further confirmation. Results SASDM was shown to be effective, user-friendly, and time-efficient, to be more precise than the traditional methods, and to be applicable to MTSs of different cell lines. Furthermore, the results of image analysis showed high correspondence with the results of autoradiography and staining. Conclusion The combination of assessment of metabolic condition and image analysis in MTSs provides a good model to evaluate the effect of various anti-cancer treatments. PMID:16283948

  2. A fully automated primary screening system for the discovery of therapeutic antibodies directly from B cells.

    PubMed

    Tickle, Simon; Howells, Louise; O'Dowd, Victoria; Starkie, Dale; Whale, Kevin; Saunders, Mark; Lee, David; Lightwood, Daniel

    2015-04-01

    For a therapeutic antibody to succeed, it must meet a range of potency, stability, and specificity criteria. Many of these characteristics are conferred by the amino acid sequence of the heavy and light chain variable regions and, for this reason, can be screened for during antibody selection. However, it is important to consider that antibodies satisfying all these criteria may be of low frequency in an immunized animal; for this reason, it is essential to have a mechanism that allows for efficient sampling of the immune repertoire. UCB's core antibody discovery platform combines high-throughput B cell culture screening and the identification and isolation of single, antigen-specific IgG-secreting B cells through a proprietary technique called the "fluorescent foci" method. Using state-of-the-art automation to facilitate primary screening, extremely efficient interrogation of the natural antibody repertoire is made possible; more than 1 billion immune B cells can now be screened to provide a useful starting point from which to identify the rare therapeutic antibody. This article will describe the design, construction, and commissioning of a bespoke automated screening platform and two examples of how it was used to screen for antibodies against two targets. © 2014 Society for Laboratory Automation and Screening.

  3. Improving automation standards via semantic modelling: Application to ISA88.

    PubMed

    Dombayci, Canan; Farreres, Javier; Rodríguez, Horacio; Espuña, Antonio; Graells, Moisès

    2017-03-01

    Standardization is essential for automation. Extensibility, scalability, and reusability are important features for automation software and rely on efficient modelling of the addressed systems. The work presented here stems from the ongoing development of a methodology for semi-automatic ontology construction from technical documents. The main aim of this work is to systematically check the consistency of technical documents and support its improvement. The formalization of conceptual models and the subsequent writing of technical standards are analyzed together, and guidelines are proposed for application to future technical standards. Three paradigms are discussed for the development of domain ontologies from technical documents, starting from the current state of the art, continuing with the intermediate method presented and used in this paper, and ending with the suggested paradigm for the future. The ISA88 Standard is taken as a representative case study. Linguistic techniques from the semi-automatic ontology construction methodology are applied to the ISA88 Standard, and different modelling and standardization aspects that are worth sharing with the automation community are addressed. This study discusses different paradigms for developing and sharing conceptual models for the subsequent development of automation software, and presents the systematic consistency checking method. Copyright © 2017 ISA. Published by Elsevier Ltd. All rights reserved.

  4. Managing the Implementation of Mission Operations Automation

    NASA Technical Reports Server (NTRS)

    Sodano, R.; Crouse, P.; Odendahl, S.; Fatig, M.; McMahon, K.; Lakin, J.

    2006-01-01

    Reducing the cost of mission operations has necessitated a high level of automation in both spacecraft and ground systems. While automation on spacecraft is implemented during the design phase, ground system automation tends to be implemented during the prime mission operations phase. Experience has shown that such late automation development can be hindered by several factors: additional hardware and software resources may need to be procured; software must be developed and tested on a non-interference basis with primary operations using limited manpower; and established procedures may not be suited for automation, requiring substantial rework. In this paper we review the experience of successfully automating mission operations for seven on-orbit missions: the Compton Gamma Ray Observatory (CGRO), the Rossi X-Ray Timing Explorer (RXTE), the Advanced Composition Explorer (ACE), the Far Ultraviolet Spectroscopic Explorer (FUSE), the Interplanetary Physics Laboratory (WIND), the Polar Plasma Laboratory (POLAR), and the Imager for Magnetopause-to-Aurora Global Exploration (IMAGE). We provide lessons learned in areas such as spacecraft recorder management, procedure development, lights-out commanding from the ground system versus stored command loads, spacecraft contingency response time, and ground station interfaces. We also discuss why implementing automation strategies during the mission concept and spacecraft integration and test phases is the most efficient approach.

  5. Automated brain tumor segmentation in magnetic resonance imaging based on sliding-window technique and symmetry analysis.

    PubMed

    Lian, Yanyun; Song, Zhijian

    2014-01-01

    Brain tumor segmentation from magnetic resonance imaging (MRI) is an important step toward surgical planning, treatment planning, and monitoring of therapy. However, manual tumor segmentation as commonly used in the clinic is time-consuming and challenging, and none of the existing automated methods are sufficiently robust, reliable, and efficient for clinical application. An accurate and automated method has therefore been developed for brain tumor segmentation that provides reproducible and objective results close to those of manual segmentation. Exploiting the symmetry of the human brain, we employed a sliding-window technique and the correlation coefficient to locate the tumor position. First, the image to be segmented was normalized, rotated, denoised, and bisected. Subsequently, vertical and horizontal sliding windows were applied in turn: two windows, one in the left and one in the right part of the brain image, were moved simultaneously pixel by pixel while the correlation coefficient between them was calculated. The pair of windows with the minimal correlation coefficient was retained; the window with the larger average gray value marks the location of the tumor, and the pixel with the largest gray value within it serves as the tumor locating point. Finally, the segmentation threshold was set to the average gray value of the pixels in a square of 10-pixel side length centered at the locating point, and threshold segmentation and morphological operations were used to obtain the final tumor region. The method was evaluated on 3D FSPGR brain MR images of 10 patients. The average rate of correct location was 93.4% for 575 slices containing tumor, the average Dice similarity coefficient was 0.77 per scan, and the average time spent on one scan was 40 seconds. A fully automated, simple, and efficient segmentation method for brain tumors is proposed and is promising for future clinical use. The correlation coefficient is a new and effective feature for tumor location.
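
    A simplified sketch of the locating step follows; window size, step, and the synthetic test image are our assumptions, and the paper's pre-processing (normalization, rotation, denoising) is taken as already done:

        import numpy as np

        def locate_tumor(image, win=32, step=4):
            # find the pair of mirror-symmetric windows (left/right half) with
            # the lowest correlation coefficient
            h, w = image.shape
            half = w // 2
            best_rho, best_rc = 1.0, (0, 0)
            for r in range(0, h - win + 1, step):
                for c in range(0, half - win + 1, step):
                    left = image[r:r + win, c:c + win]
                    right = image[r:r + win, w - c - win:w - c][:, ::-1]
                    rho = np.corrcoef(left.ravel(), right.ravel())[0, 1]
                    if np.isfinite(rho) and rho < best_rho:
                        best_rho, best_rc = rho, (r, c)
            r, c = best_rc
            left = image[r:r + win, c:c + win]
            right = image[r:r + win, w - c - win:w - c][:, ::-1]
            side = left if left.mean() > right.mean() else right
            seed = np.unravel_index(np.argmax(side), side.shape)  # locating point
            return best_rho, best_rc, seed

        # toy use: a synthetic slice with a bright blob in the left hemisphere
        rng = np.random.default_rng(0)
        img = rng.normal(100.0, 5.0, (128, 128))
        img[40:60, 20:40] += 80.0
        print(locate_tumor(img))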

  6. Validation of an automated colony counting system for group A Streptococcus.

    PubMed

    Frost, H R; Tsoi, S K; Baker, C A; Laho, D; Sanderson-Smith, M L; Steer, A C; Smeesters, P R

    2016-02-08

    The practice of counting bacterial colony forming units on agar plates has long been used as a method to estimate the concentration of live bacteria in culture. However, due to the laborious and potentially error-prone nature of this measurement technique, an alternative method is desirable. Recent technologic advancements have facilitated the development of automated colony counting systems, which reduce errors introduced during the manual counting process and recording of information. An additional benefit is the significant reduction in time taken to analyse colony counting data. Whilst automated counting procedures have been validated for a number of microorganisms, the process has not been successful for all bacteria due to the requirement for a relatively high contrast between bacterial colonies and growth medium. The purpose of this study was to validate an automated counting system for use with group A Streptococcus (GAS). Twenty-one different GAS strains, representative of major emm-types, were selected for assessment. In order to introduce the required contrast for automated counting, 2,3,5-triphenyl-2H-tetrazolium chloride (TTC) dye was added to Todd-Hewitt broth with yeast extract (THY) agar. Growth on THY agar with TTC was compared with growth on blood agar and THY agar to ensure the dye was not detrimental to bacterial growth. Automated colony counts using a ProtoCOL 3 instrument were compared with manual counting to confirm accuracy over the stages of the growth cycle (latent, mid-log and stationary phases) and in a number of different assays. The average percentage differences between plating and counting methods were analysed using the Bland-Altman method. A percentage difference of ±10% was set as the cut-off for a critical difference between plating and counting methods. All strains measured had an average difference of less than 10% when plated on THY agar with TTC. This consistency was also observed over all phases of the growth cycle and when plated in blood following bactericidal assays. Agreement between these methods suggests that an automated colony counting technique for GAS will significantly reduce the time spent counting bacteria and enable a more efficient and accurate measurement of bacterial concentration in culture.
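
    As a hedged sketch of the comparison described above, a percentage-difference Bland-Altman analysis with a ±10% criterion can be computed as follows (the counts are hypothetical, not the study's data):

        import numpy as np

        def bland_altman_pct(manual, automated):
            # mean percentage difference (bias) and 95% limits of agreement;
            # |bias| <= 10% mirrors the study's acceptance criterion
            manual = np.asarray(manual, float)
            automated = np.asarray(automated, float)
            means = (manual + automated) / 2.0
            pct_diff = 100.0 * (automated - manual) / means
            bias = pct_diff.mean()
            loa = 1.96 * pct_diff.std(ddof=1)
            return bias, (bias - loa, bias + loa)

        # example: colony counts from 6 plates
        manual = [112, 98, 240, 57, 180, 133]
        auto = [108, 101, 233, 60, 176, 137]
        bias, (lo, hi) = bland_altman_pct(manual, auto)
        print(f"bias = {bias:.1f}%, 95% LoA = [{lo:.1f}%, {hi:.1f}%]")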

  7. Using simulated fluorescence cell micrographs for the evaluation of cell image segmentation algorithms.

    PubMed

    Wiesmann, Veit; Bergler, Matthias; Palmisano, Ralf; Prinzen, Martin; Franz, Daniela; Wittenberg, Thomas

    2017-03-18

    Manual assessment and evaluation of fluorescent micrograph cell experiments is time-consuming and tedious. Automated segmentation pipelines can ensure efficient and reproducible evaluation and analysis with constant high quality for all images of an experiment. Such cell segmentation approaches are usually validated and rated in comparison to manually annotated micrographs. Nevertheless, manual annotations are prone to errors and display inter- and intra-observer variability, which influences the validation results of automated cell segmentation pipelines. We present a new approach to simulating fluorescent cell micrographs that provides an objective ground truth for the validation of cell segmentation methods. The cell simulation was evaluated twofold: (1) an expert observer study showed that the proposed approach generates realistic fluorescent cell micrograph simulations, and (2) an automated segmentation pipeline applied to the simulated fluorescent cell micrographs reproduced the segmentation performance of that pipeline on real fluorescent cell micrographs. The proposed simulation approach produces realistic fluorescent cell micrographs with corresponding ground truth. The simulated data are suited to evaluating image segmentation pipelines more efficiently and reproducibly than is possible with manually annotated real micrographs.

  8. Manual versus automated methods for cleaning reusable accessory devices used for minimally invasive surgical procedures.

    PubMed

    Alfa, M J; Nemes, R

    2004-09-01

    We undertook a simulated-use study using quantitative methods to evaluate the cleaning efficacy of ported and non-ported accessory devices used in minimally invasive surgery. We chose laparoscopic scissors and forceps to represent worst-case devices which were inoculated with artificial test soil containing 10(6) cfu/mL Enterococcus faecalis and Geobacillus stearothermophilus and allowed to dry for 1 h. Cleaning was performed manually, as well as by the automated SI-Auto Narrow lumen cleaner. Manual cleaning left two- to 50-fold more soil residuals (protein, haemoglobin and carbohydrate) inside the lumen of non-ported versus ported laparoscopic accessory devices. The SI-Auto Narrow lumen cleaner was more efficient than manual cleaning and achieved >99% reduction in soil parameters in both non-ported (using retro-flushing) and ported laparoscopic devices. Only the automated cleaning of ported devices achieved 10(3)-10(4)-fold reduction in bacterial numbers. Sonication alone (no flushing of inner channel) did not effectively remove soil or organisms from the inner channel. Our findings indicate that non-ported accessory devices cannot be as reliably cleaned as ported devices regardless of the cleaning method used. If non-ported accessory devices are reprocessed, they should be cleaned using retro-flushing in an automated narrow lumen cleaner.

  9. Development of Semi-Automatic Lathe by using Intelligent Soft Computing Technique

    NASA Astrophysics Data System (ADS)

    Sakthi, S.; Niresh, J.; Vignesh, K.; Anand Raj, G.

    2018-03-01

    This paper discusses the enhancement of a conventional lathe machine into a semi-automated lathe machine by implementing a soft computing method. In the present scenario, the lathe plays a vital role in the engineering division of the manufacturing industry. While manual lathe machines are economical, their accuracy and efficiency are not up to the mark. On the other hand, CNC machines provide the desired accuracy and efficiency, but require a huge capital investment. To overcome this situation, a semi-automated approach to the conventional lathe machine is developed by fitting stepper motors to the horizontal and vertical drives, controlled by an Arduino UNO microcontroller. Based on the input parameters of the lathe operation, the Arduino code is generated and transferred to the UNO board. Upgrading from manual to semi-automatic lathe machines can thus significantly increase accuracy and efficiency while keeping investment cost in check, and consequently provide a much-needed boost to the manufacturing industry.

  10. Maximizing coupling-efficiency of high-power diode lasers utilizing hybrid assembly technology

    NASA Astrophysics Data System (ADS)

    Zontar, D.; Dogan, M.; Fulghum, S.; Müller, T.; Haag, S.; Brecher, C.

    2015-03-01

    In this paper, we present hybrid assembly technology to maximize coupling efficiency for spatially combined laser systems. High-quality components, such as center-turned focusing units, as well as suitable assembly strategies, are necessary to obtain the highest possible output ratios. Alignment strategies are challenging tasks due to their complexity and sensitivity. Especially in low-volume production, fully automated systems are economically at a disadvantage, as operator experience is often expensive. However, the reproducibility and quality of automatically assembled systems can be superior. Therefore, automated and manual assembly techniques are combined to obtain high coupling efficiency while preserving maximum flexibility. The paper describes the equipment and software needed to enable hybrid assembly processes. Micromanipulator technology with high step resolution and six degrees of freedom provides a large number of possible evaluation points. Automated algorithms are necessary to speed up data gathering and alignment in order to efficiently utilize the available granularity for manual assembly processes. Furthermore, an engineering environment is presented to enable rapid prototyping of automation tasks with simultaneous data evaluation. Integration with simulation environments, e.g., Zemax, allows the verification of assembly strategies in advance. Data-driven decision making ensures constant high quality, documents the assembly process, and is a basis for further improvement. The hybrid assembly technology has been applied in several applications with efficiencies above 80% and is discussed in this paper. High coupling efficiency has been achieved with minimized assembly effort as a result of semi-automated alignment. This paper focuses on hybrid automation for optimizing and attaching turning mirrors and collimation lenses.

  11. Assessing V and V Processes for Automation with Respect to Vulnerabilities to Loss of Airplane State Awareness

    NASA Technical Reports Server (NTRS)

    Whitlow, Stephen; Wilkinson, Chris; Hamblin, Chris

    2014-01-01

    Automation has contributed substantially to the sustained improvement of aviation safety by minimizing the physical workload of the pilot and increasing operational efficiency. Nevertheless, in complex and highly automated aircraft, automation also has unintended consequences. As systems become more complex and the authority and autonomy (A&A) of the automation increase, human operators become relegated to the role of a system supervisor or administrator, a passive role not conducive to maintaining engagement and airplane state awareness (ASA). The consequence is that flight crews can come to over-rely on the automation, become less engaged in the human-machine interaction, and lose awareness of the automation mode under which the aircraft is operating. Likewise, the complexity of the system and its automation modes may lead to poor understanding of the interaction between a mode of automation and a particular system configuration or phase of flight. These and other examples of mode confusion often lead to mismanaging the aircraft's energy state or to the aircraft deviating from the intended flight path. This report examines methods for assessing whether, and how, operational constructs properly assign authority and autonomy in a safe and coordinated manner, with particular emphasis on assuring adequate airplane state awareness by the flight crew and air traffic controllers in off-nominal and/or complex situations.

  12. Quantitative Radiology: Automated CT Liver Volumetry Compared With Interactive Volumetry and Manual Volumetry

    PubMed Central

    Suzuki, Kenji; Epstein, Mark L.; Kohlbrenner, Ryan; Garg, Shailesh; Hori, Masatoshi; Oto, Aytekin; Baron, Richard L.

    2014-01-01

    OBJECTIVE The purpose of this study was to evaluate automated CT volumetry in the assessment of living-donor livers for transplant and to compare this technique with software-aided interactive volumetry and manual volumetry. MATERIALS AND METHODS Hepatic CT scans of 18 consecutively registered prospective liver donors were obtained under a liver transplant protocol. Automated liver volumetry was developed on the basis of 3D active-contour segmentation. To establish reference standard liver volumes, a radiologist manually traced the contour of the liver on each CT slice. We compared the results obtained with automated and interactive volumetry with those obtained with the reference standard for this study, manual volumetry. RESULTS The average interactive liver volume was 1553 ± 343 cm3, and the average automated liver volume was 1520 ± 378 cm3. The average manual volume was 1486 ± 343 cm3. Both interactive and automated volumetric results had excellent agreement with manual volumetric results (intraclass correlation coefficients, 0.96 and 0.94). The average user time for automated volumetry was 0.57 ± 0.06 min/case, whereas those for interactive and manual volumetry were 27.3 ± 4.6 and 39.4 ± 5.5 min/case, the difference being statistically significant (p < 0.05). CONCLUSION Both interactive and automated volumetry are accurate for measuring liver volume with CT, but automated volumetry is substantially more efficient. PMID:21940543
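
    The agreement statistic used above has several variants; as a hedged sketch, a single-measure consistency ICC (two-way model, often denoted ICC(3,1); the paper's exact ICC model is not stated) can be computed from a subjects-by-methods table of volumes:

        import numpy as np

        def icc_consistency(ratings):
            # ratings: array of shape (subjects, methods)
            Y = np.asarray(ratings, float)
            n, k = Y.shape
            grand = Y.mean()
            ssb = k * np.sum((Y.mean(axis=1) - grand) ** 2)   # between subjects
            ssr = n * np.sum((Y.mean(axis=0) - grand) ** 2)   # between methods
            sst = np.sum((Y - grand) ** 2)
            msb = ssb / (n - 1)
            mse = (sst - ssb - ssr) / ((n - 1) * (k - 1))
            return (msb - mse) / (msb + (k - 1) * mse)

        # hypothetical liver volumes (cm3) for 5 donors: manual vs. automated
        vols = np.array([[1400, 1380], [1650, 1610], [1200, 1225],
                         [1750, 1740], [1500, 1530]])
        print(round(icc_consistency(vols), 3))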

  13. Comparison of manual and automated nucleic acid extraction methods from clinical specimens for microbial diagnosis purposes.

    PubMed

    Wozniak, Aniela; Geoffroy, Enrique; Miranda, Carolina; Castillo, Claudia; Sanhueza, Francia; García, Patricia

    2016-11-01

    The choice of nucleic acid (NA) extraction method for molecular diagnosis in microbiology is of major importance because of the low microbial load and the differing natures of microorganisms and clinical specimens. The NA yield of different extraction methods has mostly been studied using spiked samples. However, information from real human clinical specimens is scarce. The purpose of this study was to compare the performance of a manual low-cost extraction method (Qiagen kit or salting-out extraction method) with the automated high-cost MagNAPure Compact method. According to cycle threshold values for different pathogens, MagNAPure is as efficient as Qiagen for NA extraction from noncomplex clinical specimens (nasopharyngeal swabs, skin swabs, plasma, respiratory specimens). In contrast, according to cycle threshold values for RNase P, the MagNAPure method may not be appropriate for NA extraction from blood. We believe that MagNAPure's versatility, reduced risk of cross-contamination, and reduced hands-on time compensate for its high cost. Copyright © 2016 Elsevier Inc. All rights reserved.

  14. An automated method to find reaction mechanisms and solve the kinetics in organometallic catalysis.

    PubMed

    Varela, J A; Vázquez, S A; Martínez-Núñez, E

    2017-05-01

    A novel computational method is proposed in this work for use in discovering reaction mechanisms and solving the kinetics of transition metal-catalyzed reactions. The method does not rely on either chemical intuition or a priori assumed mechanisms, and it works in a fully automated fashion. Its core is a procedure, recently developed by one of the authors, that combines accelerated direct dynamics with an efficient geometry-based post-processing algorithm to find transition states (Martinez-Nunez, E., J. Comput. Chem. 2015, 36, 222-234). In the present work, several auxiliary tools have been added to deal with the specific features of transition metal catalytic reactions. As a test case, we chose the cobalt-catalyzed hydroformylation of ethylene because of its well-established mechanism and the fact that it has already been used in previous automated computational studies. Besides the generally accepted mechanism of Heck and Breslow, several side reactions, such as hydrogenation of the alkene, emerged from our calculations. Additionally, the calculated rate law for the hydroformylation reaction agrees reasonably well with those obtained in previous experimental and theoretical studies.

  15. Semi-automating the manual literature search for systematic reviews increases efficiency.

    PubMed

    Chapman, Andrea L; Morgan, Laura C; Gartlehner, Gerald

    2010-03-01

    To minimise retrieval bias, manual literature searches are a key part of the search process of any systematic review. Given the need for accurate information, valid results of the manual literature search are essential to ensure scientific standards; likewise, efficient approaches that minimise the amount of personnel time required to conduct a manual literature search are of great interest. The objective of this project was to determine the validity and efficiency of a new manual search method that utilises the Scopus database. We used the traditional manual search approach as the gold standard against which to judge the validity and efficiency of the proposed Scopus method. Outcome measures included completeness of article detection and the personnel time involved. Using both methods independently, we compared the results with respect to accuracy of article detection (validity) and time spent conducting the search (efficiency). Regarding accuracy, the Scopus method identified the same studies as the traditional approach, indicating its validity. In terms of efficiency, using Scopus led to a time saving of 62.5% compared with the traditional approach (3 h versus 8 h). The Scopus method can significantly improve the efficiency of manual searches and thus of systematic reviews.

  16. Segmentation of nuclear images in automated cervical cancer screening

    NASA Astrophysics Data System (ADS)

    Dadeshidze, Vladimir; Olsson, Lars J.; Domanik, Richard A.

    1995-08-01

    This paper describes an efficient method of segmenting cell nuclei from complex scenes based on adaptive region growing in conjunction with nucleus-specific filters. Results of segmenting potentially abnormal (cancerous or neoplastic) cell nuclei in Papanicolaou smears from images with 0.8 square micrometer resolution are also presented.

  17. Automated Analysis of Renewable Energy Datasets ('EE/RE Data Mining')

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bush, Brian; Elmore, Ryan; Getman, Dan

    This poster illustrates methods to substantially improve the understanding of renewable energy data sets and the depth and efficiency of their analysis through the application of statistical learning methods ('data mining') in the intelligent processing of these often large and messy information sources. The six examples apply methods for anomaly detection, data cleansing, and pattern mining to time-series data (measurements from metering points in buildings) and spatiotemporal data (renewable energy resource datasets).

  18. Robust design of microchannel cooler

    NASA Astrophysics Data System (ADS)

    He, Ye; Yang, Tao; Hu, Li; Li, Leimin

    2005-12-01

    The microchannel cooler offers a new method for cooling high-power diode lasers, with the advantages of small volume, high thermal dissipation efficiency, and low cost when mass-produced. In order to reduce the sensitivity of the design to manufacturing errors and other disturbances, the Taguchi method, a robust design technique, was chosen to optimize three parameters important to the cooling performance of a roof-like microchannel cooler. The hydromechanical and thermal model of the varying-section microchannel was solved with the finite volume method using FLUENT. A special program was written to automate the design process and improve efficiency. The optimal design presented compromises between cooling performance and robustness. The design method proves to be effective.

  19. Radio Frequency Identification and Motion-sensitive Video Efficiently Automate Recording of Unrewarded Choice Behavior by Bumblebees

    PubMed Central

    Orbán, Levente L.; Plowright, Catherine M.S.

    2014-01-01

    We present two methods for observing bumblebee choice behavior in an enclosed testing space. The first method consists of Radio Frequency Identification (RFID) readers built into artificial flowers that display various visual cues, and RFID tags (i.e., passive transponders) glued to the thorax of bumblebee workers. The novelty in our implementation is that RFID readers are built directly into artificial flowers that are capable of displaying several distinct visual properties such as color, pattern type, spatial frequency (i.e., “busyness” of the pattern), and symmetry (spatial frequency and symmetry were not manipulated in this experiment). Additionally, these visual displays in conjunction with the automated systems are capable of recording unrewarded and untrained choice behavior. The second method consists of recording choice behavior at artificial flowers using motion-sensitive high-definition camcorders. Bumblebees have number tags glued to their thoraces for unique identification. The advantage in this implementation over RFID is that in addition to observing landing behavior, alternate measures of preference such as hovering and antennation may also be observed. Both automation methods increase experimental control, and internal validity by allowing larger scale studies that take into account individual differences. External validity is also improved because bees can freely enter and exit the testing environment without constraints such as the availability of a research assistant on-site. Compared to human observation in real time, the automated methods are more cost-effective and possibly less error-prone. PMID:25489677

  20. Radio Frequency Identification and motion-sensitive video efficiently automate recording of unrewarded choice behavior by bumblebees.

    PubMed

    Orbán, Levente L; Plowright, Catherine M S

    2014-11-15

    We present two methods for observing bumblebee choice behavior in an enclosed testing space. The first method consists of Radio Frequency Identification (RFID) readers built into artificial flowers that display various visual cues, and RFID tags (i.e., passive transponders) glued to the thorax of bumblebee workers. The novelty in our implementation is that RFID readers are built directly into artificial flowers that are capable of displaying several distinct visual properties such as color, pattern type, spatial frequency (i.e., "busyness" of the pattern), and symmetry (spatial frequency and symmetry were not manipulated in this experiment). Additionally, these visual displays in conjunction with the automated systems are capable of recording unrewarded and untrained choice behavior. The second method consists of recording choice behavior at artificial flowers using motion-sensitive high-definition camcorders. Bumblebees have number tags glued to their thoraces for unique identification. The advantage in this implementation over RFID is that in addition to observing landing behavior, alternate measures of preference such as hovering and antennation may also be observed. Both automation methods increase experimental control, and internal validity by allowing larger scale studies that take into account individual differences. External validity is also improved because bees can freely enter and exit the testing environment without constraints such as the availability of a research assistant on-site. Compared to human observation in real time, the automated methods are more cost-effective and possibly less error-prone.

  1. Efficient production of a gene mutant cell line through integrating TALENs and high-throughput cell cloning.

    PubMed

    Sun, Changhong; Fan, Yu; Li, Juan; Wang, Gancheng; Zhang, Hanshuo; Xi, Jianzhong Jeff

    2015-02-01

    Transcription activator-like effectors (TALEs) are becoming powerful DNA-targeting tools in a variety of mammalian cells and model organisms. However, generating a stable cell line with specific gene mutations in a simple and rapid manner remains a challenging task. Here, we report a new method to efficiently produce monoclonal cells using integrated TALE nuclease technology and a series of high-throughput cell cloning approaches. Following this method, we obtained three mTOR mutant 293T cell lines within 2 months, which included one homozygous mutant line. © 2014 Society for Laboratory Automation and Screening.

  2. An automated system for reduction of the firm's employees under maximal overall efficiency

    NASA Astrophysics Data System (ADS)

    Yonchev, Yoncho; Nikolov, Simeon; Baeva, Silvia

    2012-11-01

    Achieving maximal overall efficiency is a priority in all companies. This problem is first formulated as a knapsack problem and then as a linear assignment problem. An automated system is created for solving this problem.
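
    The linear assignment formulation has standard off-the-shelf solvers; as a hedged sketch (the efficiency matrix and scoring below are hypothetical, not the paper's data), SciPy's Hungarian-style solver maximizes overall efficiency like so:

        import numpy as np
        from scipy.optimize import linear_sum_assignment

        # illustrative efficiency matrix: eff[i, j] = efficiency of employee i
        # in role j; with more employees than roles, some remain unassigned
        eff = np.array([
            [0.9, 0.6, 0.3],
            [0.4, 0.8, 0.7],
            [0.5, 0.5, 0.9],
            [0.7, 0.2, 0.6],
        ])

        # linear_sum_assignment minimizes cost, so negate efficiency to maximize
        rows, cols = linear_sum_assignment(-eff)
        print("assignments:", list(zip(rows, cols)))
        print("overall efficiency:", eff[rows, cols].sum())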

  3. Automated variance reduction for MCNP using deterministic methods.

    PubMed

    Sweezy, J; Brown, F; Booth, T; Chiaramonte, J; Preeg, B

    2005-01-01

    In order to reduce the user's time and the computer time needed to solve deep penetration problems, an automated variance reduction capability has been developed for the MCNP Monte Carlo transport code. This new variance reduction capability developed for MCNP5 employs the PARTISN multigroup discrete ordinates code to generate mesh-based weight windows. The technique of using deterministic methods to generate importance maps has been widely used to increase the efficiency of deep penetration Monte Carlo calculations. The application of this method in MCNP uses the existing mesh-based weight window feature to translate the MCNP geometry into geometry suitable for PARTISN. The adjoint flux, which is calculated with PARTISN, is used to generate mesh-based weight windows for MCNP. Additionally, the MCNP source energy spectrum can be biased based on the adjoint energy spectrum at the source location. This method can also use angle-dependent weight windows.
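
    The adjoint-based weight-window idea can be sketched as follows; this is a generic CADIS-style normalization (our assumption), not MCNP5's actual implementation:

        import numpy as np

        def weight_window_lower_bounds(adjoint_flux, source_cell, ratio=5.0):
            # target weight is inversely proportional to the adjoint flux
            # (importance), normalized so source particles are born at the
            # center of their window; ratio is the upper/lower window ratio
            phi = np.asarray(adjoint_flux, float)
            w_center = phi[source_cell] / phi          # w ~ 1 in the source cell
            return 2.0 * w_center / (ratio + 1.0)      # center-to-lower-bound

        # toy 1D deep-penetration mesh: adjoint flux rises toward the detector
        phi_adj = np.array([1e-6, 1e-5, 1e-4, 1e-3, 1e-2, 1e-1])
        print(weight_window_lower_bounds(phi_adj, source_cell=0))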

  4. A Computational Framework for Automation of Point Defect Calculations

    NASA Astrophysics Data System (ADS)

    Goyal, Anuj; Gorai, Prashun; Peng, Haowei; Lany, Stephan; Stevanovic, Vladan; National Renewable Energy Laboratory, Golden, Colorado 80401 Collaboration

    A complete and rigorously validated open-source Python framework to automate point defect calculations using density functional theory has been developed. The framework provides an effective and efficient method for defect structure generation and the creation of simple yet customizable workflows to analyze defect calculations. The package provides the capability to compute widely accepted correction schemes to overcome finite-size effects, including (1) potential alignment, (2) image-charge correction, and (3) band filling correction for shallow defects. Using Si, ZnO, and In2O3 as test examples, we demonstrate the package capabilities and validate the methodology. We believe that a robust automated tool like this will enable the materials-by-design community to assess the impact of point defects on materials performance.
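
    Of the listed corrections, the image-charge term has a compact leading-order form; below is a sketch of a Makov-Payne-style estimate for a charged defect in a cubic supercell (an illustration of the physics, not the framework's exact scheme):

        # leading-order image-charge correction for a cubic supercell
        COULOMB_EV_ANG = 14.399645  # e^2 / (4*pi*eps0) in eV*Angstrom
        MADELUNG_SC = 2.8373        # Madelung constant, simple cubic lattice

        def image_charge_correction(q, L, eps):
            # q: defect charge (e), L: cubic cell edge (Angstrom),
            # eps: static dielectric constant of the host
            return COULOMB_EV_ANG * MADELUNG_SC * q ** 2 / (2.0 * eps * L)

        # example: a q = +2 defect in a 10 Angstrom cell with eps = 10
        print(f"E_corr = {image_charge_correction(2, 10.0, 10.0):.3f} eV")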

  5. Automated Pollution Control

    NASA Technical Reports Server (NTRS)

    1997-01-01

    Patterned after the Cassini Resource Exchange (CRE), Sholtz and Associates established the Automated Credit Exchange (ACE), an Internet-based concept that automates the auctioning of "pollution credits" in Southern California. An early challenge of the Jet Propulsion Laboratory's Cassini mission was allocating the spacecraft's resources. To support the decision-making process, the CRE was developed. The system removes the need for the science instrument manager to know the individual instruments' requirements for spacecraft resources. Instead, by utilizing principles of exchange, the CRE induces the instrument teams to reveal their requirements. In doing so, they arrive at an efficient allocation of spacecraft resources by trading among themselves. A Southern California RECLAIM air pollution credit trading market has been set up using the same bartering methods utilized in the Cassini mission, in order to help companies keep pollution and costs down.

  6. A modified method for determining tannin-protein precipitation capacity using accelerated solvent extraction (ASE) and microplate gel filtration.

    PubMed

    McArt, Scott H; Spalinger, Donald E; Kennish, John M; Collins, William B

    2006-06-01

    The protein precipitation assay of Robbins et al. (1987, Ecology 68:98-107) has been shown to successfully predict the reduction in protein availability to some ruminants due to tannins. The procedure, however, is expensive and laborious, which limits its utility, especially for quantitative ecological or nutritional applications where large numbers of assays may be required. We have modified the method to decrease its cost and increase laboratory efficiency by (1) automating the extraction using Accelerated Solvent Extraction (ASE) and (2) scaling and automating the precipitation reaction, chromatography, and spectrometry with microplate gel filtration and an automated UV-VIS microplate spectrometer. ASE extraction is shown to be as effective at extracting tannins as the hot methanol technique. Additionally, the microplate assay is sensitive and precise. We show that the results of the new technique correspond in a nearly 1:1 relationship to those of the previous technique. Hence, this method can reliably replace the older one with no loss in relevance to herbivore protein digestion. Moreover, the ASE extraction technique should be applicable to other tannin-protein precipitation assays and possibly other phenolic assays.

  7. A computational framework for automation of point defect calculations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Goyal, Anuj; Gorai, Prashun; Peng, Haowei

    We have developed a complete and rigorously validated open-source Python framework to automate point defect calculations using density functional theory. Furthermore, the framework provides an effective and efficient method for defect structure generation, and creation of simple yet customizable workflows to analyze defect calculations. This package provides the capability to compute widely-accepted correction schemes to overcome finite-size effects, including (1) potential alignment, (2) image-charge correction, and (3) band filling correction to shallow defects. Using Si, ZnO and In2O3 as test examples, we demonstrate the package capabilities and validate the methodology.

  8. A mixed optimization method for automated design of fuselage structures.

    NASA Technical Reports Server (NTRS)

    Sobieszczanski, J.; Loendorf, D.

    1972-01-01

    A procedure for automating the design of transport aircraft fuselage structures has been developed and implemented in the form of an operational program. The structure is designed in two stages. First, an overall distribution of structural material is obtained by means of optimality criteria to meet strength and displacement constraints. Subsequently, the detailed design of selected rings and panels consisting of skin and stringers is performed by mathematical optimization accounting for a set of realistic design constraints. The practicality and computational efficiency of the procedure are demonstrated on cylindrical and area-ruled large transport fuselages.
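
    Optimality-criteria sizing is often implemented as stress-ratio resizing; the sketch below shows the generic recurrence on a statically determinate toy example (our assumption, not the paper's exact scheme):

        import numpy as np

        def fully_stressed_resize(areas, member_stresses, sigma_allow, iters=10):
            # classical optimality criterion: scale each member's area by the
            # ratio of its stress to the allowable stress
            a = np.asarray(areas, float)
            for _ in range(iters):
                s = np.abs(member_stresses(a))
                a = np.maximum(a * s / sigma_allow, 1e-8)
            return a

        # toy statically determinate truss: member forces independent of sizing
        forces = np.array([1.0e5, 5.0e4, 2.0e4])   # N
        stresses = lambda a: forces / a            # sigma = F / A
        print(fully_stressed_resize(np.ones(3) * 1e-3, stresses, 2.5e8))
        # converges to A = F / sigma_allow = [4.0e-4, 2.0e-4, 8.0e-5] m^2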

  9. A computational framework for automation of point defect calculations

    DOE PAGES

    Goyal, Anuj; Gorai, Prashun; Peng, Haowei; ...

    2017-01-13

    We have developed a complete and rigorously validated open-source Python framework to automate point defect calculations using density functional theory. Furthermore, the framework provides an effective and efficient method for defect structure generation, and creation of simple yet customizable workflows to analyze defect calculations. This package provides the capability to compute widely-accepted correction schemes to overcome finite-size effects, including (1) potential alignment, (2) image-charge correction, and (3) band filling correction to shallow defects. Using Si, ZnO and In2O3 as test examples, we demonstrate the package capabilities and validate the methodology.

  10. A Fully Automated Drosophila Olfactory Classical Conditioning and Testing System for Behavioral Learning and Memory Assessment

    PubMed Central

    Jiang, Hui; Hanna, Eriny; Gatto, Cheryl L.; Page, Terry L.; Bhuva, Bharat; Broadie, Kendal

    2016-01-01

    Background Aversive olfactory classical conditioning has been the standard method to assess Drosophila learning and memory behavior for decades, yet training and testing are conducted manually under exceedingly labor-intensive conditions. To overcome this severe limitation, a fully automated, inexpensive system has been developed, which allows accurate and efficient Pavlovian associative learning/memory analyses for high-throughput pharmacological and genetic studies. New Method The automated system employs a linear actuator coupled to an odorant T-maze with airflow-mediated transfer of animals between training and testing stages. Odorant, airflow and electrical shock delivery are automatically administered and monitored during training trials. Control software allows operator-input variables to define parameters of Drosophila learning, short-term memory and long-term memory assays. Results The approach allows accurate learning/memory determinations with operational fail-safes. Automated learning indices (immediately post-training) and memory indices (after 24 hours) are comparable to traditional manual experiments, while minimizing experimenter involvement. Comparison with Existing Methods The automated system provides vast improvements over labor-intensive manual approaches with no experimenter involvement required during either training or testing phases. It provides quality control tracking of airflow rates, odorant delivery and electrical shock treatments, and an expanded platform for high-throughput studies of combinational drug tests and genetic screens. The design uses inexpensive hardware and software for a total cost of ~$500US, making it affordable to a wide range of investigators. Conclusions This study demonstrates the design, construction and testing of a fully automated Drosophila olfactory classical association apparatus to provide low-labor, high-fidelity, quality-monitored, high-throughput and inexpensive learning and memory behavioral assays. PMID:26703418
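
    The learning and memory indices mentioned above are typically performance indices of the standard T-maze form; a minimal sketch follows (the paper's exact scoring is an assumption here):

        def performance_index(n_avoid, n_approach):
            # PI = (flies avoiding the shock-paired odor - flies choosing it)
            #      / total flies tested; PI = 1 is perfect learning, 0 is chance
            total = n_avoid + n_approach
            return (n_avoid - n_approach) / total if total else 0.0

        print(performance_index(78, 22))  # 0.56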

  11. Automated indirect immunofluorescence evaluation of antinuclear autoantibodies on HEp-2 cells.

    PubMed

    Voigt, Jörn; Krause, Christopher; Rohwäder, Edda; Saschenbrecker, Sandra; Hahn, Melanie; Danckwardt, Maick; Feirer, Christian; Ens, Konstantin; Fechner, Kai; Barth, Erhardt; Martinetz, Thomas; Stöcker, Winfried

    2012-01-01

    Indirect immunofluorescence (IIF) on human epithelial (HEp-2) cells is considered the gold standard screening method for the detection of antinuclear autoantibodies (ANA). However, in terms of automation and standardization, it has not been able to keep pace with most other analytical techniques used in diagnostic laboratories. Although some automation solutions for IIF incubation are already on the market, the automation of result evaluation is still in its infancy. Therefore, the EUROPattern Suite has been developed as a comprehensive automated processing and interpretation system for standardized and efficient ANA detection by HEp-2 cell-based IIF. In this study, automated pattern recognition was compared with conventional visual interpretation in a total of 351 sera. In the discrimination of positive from negative samples, concordant results between visual and automated evaluation were obtained for 349 sera (99.4%, kappa = 0.984). The system missed none of the 272 antibody-positive samples and identified 77 of 79 visually negative samples (analytical sensitivity/specificity: 100%/97.5%). Moreover, 94.0% of all main antibody patterns were recognized correctly by the software. Owing to its performance characteristics, EUROPattern enables fast, objective, and economical IIF ANA analysis and has the potential to reduce intra- and interlaboratory variability.
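
    The reported agreement figures can be reproduced from the 2x2 confusion matrix implied by the abstract; a small sketch:

        def agreement_stats(tp, fn, tn, fp):
            # sensitivity, specificity, and Cohen's kappa for automated vs.
            # visual IIF results; counts below mirror the abstract's figures
            n = tp + fn + tn + fp
            sens = tp / (tp + fn)
            spec = tn / (tn + fp)
            po = (tp + tn) / n                                   # observed
            pe = ((tp + fp) * (tp + fn) + (fn + tn) * (fp + tn)) / n ** 2
            kappa = (po - pe) / (1 - pe)
            return sens, spec, kappa

        # 272 positives all detected; 77 of 79 negatives correctly identified
        # reproduces sensitivity 100%, specificity ~97.5%, kappa ~0.984
        print(agreement_stats(tp=272, fn=0, tn=77, fp=2))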

  12. Automated Indirect Immunofluorescence Evaluation of Antinuclear Autoantibodies on HEp-2 Cells

    PubMed Central

    Voigt, Jörn; Krause, Christopher; Rohwäder, Edda; Saschenbrecker, Sandra; Hahn, Melanie; Danckwardt, Maick; Feirer, Christian; Ens, Konstantin; Fechner, Kai; Barth, Erhardt; Martinetz, Thomas; Stöcker, Winfried

    2012-01-01

    Indirect immunofluorescence (IIF) on human epithelial (HEp-2) cells is considered the gold standard screening method for the detection of antinuclear autoantibodies (ANA). However, in terms of automation and standardization, it has not been able to keep pace with most other analytical techniques used in diagnostic laboratories. Although some automation solutions for IIF incubation are already on the market, the automation of result evaluation is still in its infancy. Therefore, the EUROPattern Suite has been developed as a comprehensive automated processing and interpretation system for standardized and efficient ANA detection by HEp-2 cell-based IIF. In this study, automated pattern recognition was compared with conventional visual interpretation in a total of 351 sera. In the discrimination of positive from negative samples, concordant results between visual and automated evaluation were obtained for 349 sera (99.4%, kappa = 0.984). The system missed none of the 272 antibody-positive samples and identified 77 of 79 visually negative samples (analytical sensitivity/specificity: 100%/97.5%). Moreover, 94.0% of all main antibody patterns were recognized correctly by the software. Owing to its performance characteristics, EUROPattern enables fast, objective, and economical IIF ANA analysis and has the potential to reduce intra- and interlaboratory variability. PMID:23251220

  13. Production technology for high efficiency ion implanted solar cells

    NASA Technical Reports Server (NTRS)

    Kirkpatrick, A. R.; Minnucci, J. A.; Greenwald, A. C.; Josephs, R. H.

    1978-01-01

    Ion implantation is being developed for high volume automated production of silicon solar cells. An implanter designed for solar cell processing and able to properly implant up to 300 4-inch wafers per hour is now operational. A machine to implant 180 sq m/hr of solar cell material has been designed. Implanted silicon solar cells with efficiencies exceeding 16% AM1 are now being produced and higher efficiencies are expected. Ion implantation and transient processing by pulsed electron beams are being integrated with electrostatic bonding to accomplish a simple method for large scale, low cost production of high efficiency solar cell arrays.

  14. Energy conservation and management system using efficient building automation

    NASA Astrophysics Data System (ADS)

    Ahmed, S. Faiz; Hazry, D.; Tanveer, M. Hassan; Joyo, M. Kamran; Warsi, Faizan A.; Kamarudin, H.; Wan, Khairunizam; Razlan, Zuradzman M.; Shahriman A., B.; Hussain, A. T.

    2015-05-01

    In countries where the gap between electricity demand and supply is large and people endure increasing hours of load shedding, unnecessary consumption of electricity makes matters worse, so the need for electricity conservation grows accordingly. This paper outlines a step toward the conservation of energy in general, and electricity in particular, through efficient building automation. Careful design and implementation of a building automation system can reduce energy consumption by 30% to 40%, a substantial saving. In this study, the concept is verified by experiments on a prototype room in which an efficient building automation scheme is implemented. A programmable logic controller (PLC) serves as the main controller, monitoring various system parameters and switching appliances as required. Hardware test runs and experimental findings support the concept. An added advantage of this project is that it can be applied to both small and medium-sized domestic homes, greatly reducing unnecessary load on the utility provider.

  15. Non-linear Multidimensional Optimization for use in Wire Scanner Fitting

    NASA Astrophysics Data System (ADS)

    Henderson, Alyssa; Terzic, Balsa; Hofler, Alicia; CASA and Accelerator Ops Collaboration

    2013-10-01

    To ensure experiment efficiency and quality from the Continuous Electron Beam Accelerator at Jefferson Lab, beam energy, size, and position must be measured. Wire scanners are devices inserted into the beamline to produce measurements from which beam properties are obtained. Extracting physical information from the wire scanner measurements begins by fitting Gaussian curves to the data. This study focuses on optimizing and automating this curve-fitting procedure. We use a hybrid approach combining the efficiency of the Newton Conjugate Gradient (NCG) method with the global convergence of three nature-inspired (NI) optimization approaches: genetic algorithm, differential evolution, and particle-swarm optimization. In this Python-implemented approach, augmenting the locally convergent NCG with one of the globally convergent methods ensures the quality, robustness, and automation of curve fitting. After comparing the methods, we establish that, given an initial data-derived guess, each finds a solution with the same chi-square, a measure of the agreement between the fit and the data. NCG is the fastest method, so it is the first to attempt the fit. The curve-fitting procedure escalates to one of the globally convergent NI methods only if NCG fails, thereby ensuring a successful fit. This method allows for an optimal signal fit and can be easily applied to similar problems. Financial support from DoE, NSF, ODU, DoD, and Jefferson Lab.
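
    The escalation strategy described above is easy to prototype; the sketch below (a minimal illustration, not the authors' code) tries a fast local Newton-CG fit first and falls back to globally convergent differential evolution only on failure. The parameter bounds and the initial-guess heuristic are assumptions:

      import numpy as np
      from scipy.optimize import minimize, differential_evolution, approx_fprime

      def gaussian(x, amp, mu, sigma, offset):
          return amp * np.exp(-0.5 * ((x - mu) / sigma) ** 2) + offset

      def chi_square(params, x, y):
          return np.sum((y - gaussian(x, *params)) ** 2)

      def fit_wire_scan(x, y):
          # Data-derived initial guess, as the abstract describes.
          p0 = np.array([y.max() - y.min(), x[np.argmax(y)],
                         0.1 * (x.max() - x.min()), y.min()])
          obj = lambda p: chi_square(p, x, y)
          grad = lambda p: approx_fprime(p, obj, 1e-8)
          local = minimize(obj, p0, jac=grad, method='Newton-CG')
          if local.success:
              return local.x
          # Escalate to a globally convergent, nature-inspired method.
          bounds = [(0.0, 2 * (y.max() - y.min())), (x.min(), x.max()),
                    (1e-6, x.max() - x.min()), (y.min() - 1.0, y.max())]
          return differential_evolution(obj, bounds, seed=0).x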

  16. A high-throughput AO/PI-based cell concentration and viability detection method using the Celigo image cytometry.

    PubMed

    Chan, Leo Li-Ying; Smith, Tim; Kumph, Kendra A; Kuksin, Dmitry; Kessel, Sarah; Déry, Olivier; Cribbes, Scott; Lai, Ning; Qiu, Jean

    2016-10-01

    To ensure cell-based assays are performed properly, both cell concentration and viability have to be determined so that the data can be normalized to generate meaningful and comparable results. Cell-based assays performed in immuno-oncology, toxicology, or bioprocessing research often require measurement of multiple samples and conditions, so automated cell counters that use single disposable counting slides are impractical for high-throughput screening assays. In recent years, a plate-based image cytometry system has been developed for high-throughput biomolecular screening assays. In this work, we demonstrate a high-throughput AO/PI-based cell concentration and viability method using the Celigo image cytometer. First, we validate the method by comparing it directly to the Cellometer automated cell counter. Next, cell concentration dynamic range, viability dynamic range, and consistency are determined. The high-throughput AO/PI method described here allows 96-well to 384-well plate samples to be analyzed in less than 7 min, greatly reducing the time required by single-sample automated cell counters. In addition, this method can improve the efficiency of high-throughput screening assays, where multiple cell counts and viability measurements are needed before performing assays such as flow cytometry, ELISA, or simply plating cells for cell culture.
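
    The underlying count arithmetic is straightforward: AO stains all nucleated cells while PI enters only dead cells. A hedged sketch of that calculation (function name and unit conventions are assumptions):

      def concentration_and_viability(ao_count, pi_count, imaged_volume_ul,
                                      dilution_factor=1.0):
          # AO+ cells = all nucleated cells; PI+ cells = dead cells.
          live = ao_count - pi_count
          concentration = ao_count * dilution_factor / (imaged_volume_ul * 1e-3)  # cells/mL
          viability_pct = 100.0 * live / ao_count if ao_count else 0.0
          return concentration, viability_pct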

  17. Individual differences and their impact on the safety and the efficiency of human-wheelchair systems.

    PubMed

    Jipp, Meike

    2012-12-01

    The extent to which individual differences in fine motor abilities affect indoor safety and efficiency of human-wheelchair systems was examined. To reduce the currently large number of indoor wheelchair accidents, assistance systems with a high level of automation have been developed. It was proposed to adapt the wheelchair's level of automation to the user's ability to steer the device, to avoid the drawbacks of highly automated wheelchairs. The state of the art, however, lacks an empirical identification of those abilities. A study with 23 participants is described. The participants drove through various sections of a course with a powered wheelchair. Repeatedly measured criteria were safety (number of collisions) and efficiency (time required to reach goals). As covariates, the participants' fine motor abilities were assessed. A random coefficient modeling approach was used to analyze the data, which were available on two levels, as course sections were nested within participants. The participants' aiming, precision, and arm-hand speed contributed significantly to both criteria: participants with lower fine motor abilities had more collisions and required more time to reach goals. Adapting the wheelchair's level of automation to these fine motor abilities can improve indoor safety and efficiency. In addition, the results highlight the need to further examine the impact of individual differences on the design of automation features for powered wheelchairs as well as other applications of automation. The results facilitate the improvement of current wheelchair technology.

  18. Capillary electrophoresis method to determine siRNA complexation with cationic liposomes.

    PubMed

    Furst, Tania; Bettonville, Virginie; Farcas, Elena; Frere, Antoine; Lechanteur, Anna; Evrard, Brigitte; Fillet, Marianne; Piel, Géraldine; Servais, Anne-Catherine

    2016-10-01

    Small interfering RNA (siRNA), which induces gene silencing, has great potential to treat many human diseases. To ensure effective siRNA delivery, it must be complexed with an appropriate vector, generally nanoparticles. The nanoparticulate complex requires optimal physicochemical characterization, and the complexation efficiency has to be precisely determined. The methods usually used to measure complexation are gel electrophoresis and the RiboGreen® fluorescence-based assay. However, those approaches are not automated and present drawbacks such as low throughput and the use of carcinogenic reagents. The aim of this study is to develop a new simple and fast method to accurately quantify the complexation efficiency. Here, capillary electrophoresis (CE) was used to determine the siRNA complexation with cationic liposomes. The short-end injection mode applied enabled siRNA detection in less than 5 min. Moreover, the CE technique offers many advantages compared with the other classical methods: it is automated, requires neither sample preparation nor expensive reagents, and carries no mutagenic risk since no carcinogenic product is used. Finally, this methodology can also be extended to the characterization of other types of nanoparticles encapsulating siRNA, such as cationic polymeric nanoparticles. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
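
    Quantifying complexation from the electropherogram reduces to comparing the free-siRNA peak with an uncomplexed control; the normalization below is an assumption about the assay design, not a formula quoted from the paper:

      def complexation_efficiency_pct(free_sirna_peak_area, control_peak_area):
          # Free siRNA still migrates and is detected; whatever disappears
          # relative to the liposome-free control is taken as complexed.
          return 100.0 * (1.0 - free_sirna_peak_area / control_peak_area)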

  19. Enhanced mobility for aging populations using automated vehicles : [summary].

    DOT National Transportation Integrated Search

    2016-01-01

    Studies show that aging adults have travel needs that can be inadequately addressed by today's transportation system. Automated vehicles (AVs), ranging from assistive technologies to full automation, may offer a safe and efficient transportatio...

  20. Coutts/Sweetgrass automated border crossing : phase I

    DOT National Transportation Integrated Search

    1999-03-01

    The Coutts/Sweetgrass Automated Border Crossing Project was intended to improve operational efficiency of this rural border crossing facility using ITS applications. Phase I of the Coutts/Sweetgrass Automated Border Crossing Project was intended to r...

  1. SCSD: The Project and the Schools. A Report from Educational Facilities Laboratories.

    ERIC Educational Resources Information Center

    Benet, James; And Others

    SCSD, a structurally coordinated school building components system, is a highly automated method of building new schools that creatively meet the needs of the ever-changing educational environment through functional and flexible planning. Examples of why SCSD high schools are efficient, flexible, and spatially planned are cited. Environmental…

  2. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Egorov, Oleg; O'Hara, Matthew J.; Grate, Jay W.

    An automated fluidic instrument is described that rapidly determines the total 99Tc content of aged nuclear waste samples, where the matrix is chemically and radiologically complex and the existing speciation of the 99Tc is variable. The monitor links microwave-assisted sample preparation with an automated anion exchange column separation and detection using a flow-through solid scintillator detector. The sample preparation steps acidify the sample, decompose organics, and convert all Tc species to the pertechnetate anion. The column-based anion exchange procedure separates the pertechnetate from the complex sample matrix, so that radiometric detection can provide accurate measurement of 99Tc. We developed a preprogrammed spike addition procedure to automatically determine matrix-matched calibration. The overall measurement efficiency that is determined simultaneously provides a self-diagnostic parameter for the radiochemical separation and overall instrument function. Continuous, automated operation was demonstrated over the course of 54 h, which resulted in the analysis of 215 samples plus 54 hourly spike-addition samples, with consistent overall measurement efficiency for the operation of the monitor. A sample can be processed and measured automatically in just 12.5 min with a detection limit of 23.5 Bq/mL of 99Tc in low activity waste (0.495 mL sample volume), with better than 10% RSD precision at concentrations above the quantification limit. This rapid automated analysis method was developed to support nuclear waste processing operations planned for the Hanford nuclear site.
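
    The spike-addition calibration reduces to simple counting arithmetic; a minimal sketch under assumed variable names (spike_net_counts is the count increase attributable to the known spike):

      def tc99_concentration_bq_per_ml(sample_net_counts, spike_net_counts,
                                       spike_activity_bq, sample_volume_ml):
          # Matrix-matched measurement efficiency: counts observed per Bq
          # of 99Tc added; it doubles as a self-diagnostic parameter.
          efficiency = spike_net_counts / spike_activity_bq
          return sample_net_counts / (efficiency * sample_volume_ml)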

  3. A Novel Secure IoT-Based Smart Home Automation System Using a Wireless Sensor Network.

    PubMed

    Pirbhulal, Sandeep; Zhang, Heye; E Alahi, Md Eshrat; Ghayvat, Hemant; Mukhopadhyay, Subhas Chandra; Zhang, Yuan-Ting; Wu, Wanqing

    2016-12-30

    Wireless sensor networks (WSNs) provide noteworthy benefits over traditional approaches for several applications, including smart homes, healthcare, environmental monitoring, and homeland security. WSNs are integrated with the Internet Protocol (IP) to develop the Internet of Things (IoT) for connecting everyday objects to the internet. Hence, major challenges of WSNs include: (i) how to efficiently utilize small, low-power nodes to implement security during data transmission among several sensor nodes; (ii) how to resolve security issues associated with harsh and complex environmental conditions during data transmission over a long coverage range. In this study, a secure IoT-based smart home automation system was developed. To facilitate energy-efficient data encryption, a method named the Triangle-Based Security Algorithm (TBSA), based on an efficient key generation mechanism, was proposed. The proposed TBSA, integrated with low-power Wi-Fi, was incorporated into Internet-connected WSNs to develop a novel IoT-based smart home providing secure data transmission among the associated sensor nodes over a long coverage range. The developed IoT-based system performed well, fulfilling all the necessary security requirements. The experimental results showed that the proposed TBSA algorithm consumed less energy than some existing methods.

  4. A Novel Secure IoT-Based Smart Home Automation System Using a Wireless Sensor Network

    PubMed Central

    Pirbhulal, Sandeep; Zhang, Heye; E Alahi, Md Eshrat; Ghayvat, Hemant; Mukhopadhyay, Subhas Chandra; Zhang, Yuan-Ting; Wu, Wanqing

    2016-01-01

    Wireless sensor networks (WSNs) provide noteworthy benefits over traditional approaches for several applications, including smart homes, healthcare, environmental monitoring, and homeland security. WSNs are integrated with the Internet Protocol (IP) to develop the Internet of Things (IoT) for connecting everyday objects to the internet. Hence, major challenges of WSNs include: (i) how to efficiently utilize small, low-power nodes to implement security during data transmission among several sensor nodes; (ii) how to resolve security issues associated with harsh and complex environmental conditions during data transmission over a long coverage range. In this study, a secure IoT-based smart home automation system was developed. To facilitate energy-efficient data encryption, a method named the Triangle-Based Security Algorithm (TBSA), based on an efficient key generation mechanism, was proposed. The proposed TBSA, integrated with low-power Wi-Fi, was incorporated into Internet-connected WSNs to develop a novel IoT-based smart home providing secure data transmission among the associated sensor nodes over a long coverage range. The developed IoT-based system performed well, fulfilling all the necessary security requirements. The experimental results showed that the proposed TBSA algorithm consumed less energy than some existing methods. PMID:28042831

  5. AZOrange - High performance open source machine learning for QSAR modeling in a graphical programming environment

    PubMed Central

    2011-01-01

    Background Machine learning has a vast range of applications. In particular, advanced machine learning methods are routinely and increasingly used in quantitative structure-activity relationship (QSAR) modeling. QSAR data sets often encompass tens of thousands of compounds, and the size of proprietary as well as public data sets is rapidly growing. Hence, there is a demand for computationally efficient machine learning algorithms that are easily available to researchers without extensive machine learning knowledge. In keeping with the scientific principles of transparency and reproducibility, Open Source solutions are increasingly acknowledged by regulatory authorities. Thus, an Open Source, state-of-the-art, high performance machine learning platform, interfacing multiple customized machine learning algorithms for both graphical programming and scripting, to be used for large scale development of QSAR models of regulatory quality, is of great value to the QSAR community. Results This paper describes the implementation of the Open Source machine learning package AZOrange. AZOrange is developed specifically to support batch generation of QSAR models, providing the full workflow of QSAR modeling, from descriptor calculation to automated model building, validation and selection. The automated workflow relies upon customization of the machine learning algorithms and a generalized, automated model hyper-parameter selection process. Several high performance machine learning algorithms are interfaced for efficient, data set specific selection of the statistical method, promoting model accuracy. Using the high performance machine learning algorithms of AZOrange does not require programming knowledge, as flexible applications can be created not only at a scripting level but also in a graphical programming environment. Conclusions AZOrange is a step towards meeting the need for an Open Source high performance machine learning platform supporting the efficient development of highly accurate QSAR models fulfilling regulatory requirements. PMID:21798025

  6. AZOrange - High performance open source machine learning for QSAR modeling in a graphical programming environment.

    PubMed

    Stålring, Jonna C; Carlsson, Lars A; Almeida, Pedro; Boyer, Scott

    2011-07-28

    Machine learning has a vast range of applications. In particular, advanced machine learning methods are routinely and increasingly used in quantitative structure-activity relationship (QSAR) modeling. QSAR data sets often encompass tens of thousands of compounds, and the size of proprietary as well as public data sets is rapidly growing. Hence, there is a demand for computationally efficient machine learning algorithms that are easily available to researchers without extensive machine learning knowledge. In keeping with the scientific principles of transparency and reproducibility, Open Source solutions are increasingly acknowledged by regulatory authorities. Thus, an Open Source, state-of-the-art, high performance machine learning platform, interfacing multiple customized machine learning algorithms for both graphical programming and scripting, to be used for large scale development of QSAR models of regulatory quality, is of great value to the QSAR community. This paper describes the implementation of the Open Source machine learning package AZOrange. AZOrange is developed specifically to support batch generation of QSAR models, providing the full workflow of QSAR modeling, from descriptor calculation to automated model building, validation and selection. The automated workflow relies upon customization of the machine learning algorithms and a generalized, automated model hyper-parameter selection process. Several high performance machine learning algorithms are interfaced for efficient, data set specific selection of the statistical method, promoting model accuracy. Using the high performance machine learning algorithms of AZOrange does not require programming knowledge, as flexible applications can be created not only at a scripting level but also in a graphical programming environment. AZOrange is a step towards meeting the need for an Open Source high performance machine learning platform supporting the efficient development of highly accurate QSAR models fulfilling regulatory requirements.
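
    AZOrange's own interfaces are not reproduced in the abstract; as a stand-in, this sketch uses scikit-learn to illustrate the same automated idea of per-data-set learner and hyper-parameter selection (the candidate learners and grids are illustrative):

      from sklearn.ensemble import RandomForestRegressor
      from sklearn.model_selection import GridSearchCV
      from sklearn.svm import SVR

      def auto_select_qsar_model(X, y):
          # Tune each candidate learner on the data set and keep the most
          # accurate cross-validated model, as in automated QSAR workflows.
          candidates = [
              GridSearchCV(RandomForestRegressor(random_state=0),
                           {"n_estimators": [100, 300]}, cv=5),
              GridSearchCV(SVR(),
                           {"C": [1, 10], "gamma": ["scale", "auto"]}, cv=5),
          ]
          fitted = [c.fit(X, y) for c in candidates]
          return max(fitted, key=lambda c: c.best_score_).best_estimator_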

  7. Fast and accurate determination of the detergent efficiency by optical fiber sensors

    NASA Astrophysics Data System (ADS)

    Patitsa, Maria; Pfeiffer, Helge; Wevers, Martine

    2011-06-01

    An optical fiber sensor was developed to assess the cleaning efficiency of surfactants. Prior to the measurements, the sensing part of the probe is covered with a uniform, standardized soil layer (a lipid multilayer), and a gold mirror is deposited at the end of the optical fiber. The lipid multilayer is deposited on the fiber by the Langmuir-Blodgett technique, with the progress of deposition followed online by ultraviolet spectroscopy. The invention provides a miniaturized Surface Plasmon Resonance dip-sensor for automated on-line testing that can replace costly and time-consuming existing methods, combining optical sensing, surface chemistry and automated data acquisition into a breakthrough in detergent testing. The sensor is to be used to evaluate the detergency of different cleaning products and to indicate how formulation, concentration, lipid nature and temperature affect the cleaning behavior of a surfactant.

  8. LAMMPS integrated materials engine (LIME) for efficient automation of particle-based simulations: application to equation of state generation

    NASA Astrophysics Data System (ADS)

    Barnes, Brian C.; Leiter, Kenneth W.; Becker, Richard; Knap, Jaroslaw; Brennan, John K.

    2017-07-01

    We describe the development, accuracy, and efficiency of an automation package for molecular simulation, the large-scale atomic/molecular massively parallel simulator (LAMMPS) integrated materials engine (LIME). Heuristics and algorithms employed for equation of state (EOS) calculation using a particle-based model of a molecular crystal, hexahydro-1,3,5-trinitro-s-triazine (RDX), are described in detail. The simulation method for the particle-based model is energy-conserving dissipative particle dynamics, but the techniques used in LIME are generally applicable to molecular dynamics simulations with a variety of particle-based models. The newly created tool set is tested through use of its EOS data in plate impact and Taylor anvil impact continuum simulations of solid RDX. The coarse-grain model results from LIME provide an approach to bridge the scales from atomistic simulations to continuum simulations.

  9. Wireless energizing system for an automated implantable sensor.

    PubMed

    Swain, Biswaranjan; Nayak, Praveen P; Kar, Durga P; Bhuyan, Satyanarayan; Mishra, Laxmi P

    2016-07-01

    The wireless drive of an automated implantable electronic sensor has been explored for health monitoring applications. The proposed system comprises an automated biomedical sensing system that is energized through resonant inductive coupling. The implantable sensor unit monitors body temperature and sends the corresponding telemetry data wirelessly to the data recording unit. It has been observed that the wireless power delivery system is capable of energizing the automated biomedical implantable electronic sensor placed at a distance of 3 cm from the power transmitter, with an energy transfer efficiency of 26% at the operating resonant frequency of 562 kHz. The proposed method enables real-time monitoring of different human body temperatures around the clock. The monitored temperature data have been compared with a calibrated temperature measurement system to ascertain the accuracy of the proposed system. The investigated technique can also be useful for monitoring other body parameters such as blood pressure, bladder pressure, and physiological signals of the patient in vivo using various implantable sensors.

  10. Designing of smart home automation system based on Raspberry Pi

    NASA Astrophysics Data System (ADS)

    Saini, Ravi Prakash; Singh, Bhanu Pratap; Sharma, Mahesh Kumar; Wattanawisuth, Nattapol; Leeprechanon, Nopbhorn

    2016-03-01

    Locally networked or remotely controlled home automation systems have become a popular paradigm because of their numerous advantages and are suitable for academic research. This paper proposes a method for implementing a Raspberry Pi based home automation system with an Android phone access interface. The power consumption profile across the connected load is measured accurately through programming. Users can access a graph of total power consumption with respect to time from anywhere in the world using their Dropbox account. An Android application has been developed to channel the monitoring and controlling of home appliances remotely. This application controls the operating pins of the Raspberry Pi: pressing the corresponding key turns any desired appliance on or off. Systems can range from simple room lighting control to smart microcontroller based hybrid systems incorporating several additional features. Smart home automation systems are being adopted to achieve flexibility, scalability, security in the sense of data protection through a cloud-based data storage protocol, reliability, energy efficiency, etc.

  11. Designing of smart home automation system based on Raspberry Pi

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Saini, Ravi Prakash; Singh, Bhanu Pratap; Sharma, Mahesh Kumar

    Locally networked or remotely controlled home automation systems have become a popular paradigm because of their numerous advantages and are suitable for academic research. This paper proposes a method for implementing a Raspberry Pi based home automation system with an Android phone access interface. The power consumption profile across the connected load is measured accurately through programming. Users can access a graph of total power consumption with respect to time from anywhere in the world using their Dropbox account. An Android application has been developed to channel the monitoring and controlling of home appliances remotely. This application controls the operating pins of the Raspberry Pi: pressing the corresponding key turns any desired appliance on or off. Systems can range from simple room lighting control to smart microcontroller based hybrid systems incorporating several additional features. Smart home automation systems are being adopted to achieve flexibility, scalability, security in the sense of data protection through a cloud-based data storage protocol, reliability, energy efficiency, etc.
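
    The appliance-switching step comes down to driving a GPIO pin wired to a relay; a minimal sketch using the common RPi.GPIO library, with the pin number and relay wiring as assumptions:

      import RPi.GPIO as GPIO

      RELAY_PIN = 17  # BCM numbering; assumed wiring to an appliance relay

      GPIO.setmode(GPIO.BCM)
      GPIO.setup(RELAY_PIN, GPIO.OUT)

      def set_appliance(on):
          # HIGH/LOW polarity depends on the relay board in use.
          GPIO.output(RELAY_PIN, GPIO.HIGH if on else GPIO.LOW)

      set_appliance(True)   # appliance on
      set_appliance(False)  # appliance off
      GPIO.cleanup()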

  12. Workload Capacity: A Response Time-Based Measure of Automation Dependence.

    PubMed

    Yamani, Yusuke; McCarley, Jason S

    2016-05-01

    An experiment used the workload capacity measure C(t) to quantify the processing efficiency of human-automation teams and identify operators' automation usage strategies in a speeded decision task. Although response accuracy rates and related measures are often used to measure the influence of an automated decision aid on human performance, aids can also influence response speed. Mean response times (RTs), however, conflate the influence of the human operator and the automated aid on team performance and may mask changes in the operator's performance strategy under aided conditions. The present study used a measure of parallel processing efficiency, or workload capacity, derived from empirical RT distributions as a novel gauge of human-automation performance and automation dependence in a speeded task. Participants performed a speeded probabilistic decision task with and without the assistance of an automated aid. RT distributions were used to calculate two variants of a workload capacity measure, C_OR(t) and C_AND(t). Capacity measures gave evidence that a diagnosis from the automated aid speeded human participants' responses, and that participants did not moderate their own decision times in anticipation of diagnoses from the aid. Workload capacity provides a sensitive and informative measure of human-automation performance and operators' automation dependence in speeded tasks. © 2016, Human Factors and Ergonomics Society.
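
    On the standard definitions from the workload-capacity literature (an assumption, since the abstract does not restate them), the two coefficients are ratios of integrated hazard functions of the RT distributions:

      C_{OR}(t) = \frac{H_{AB}(t)}{H_A(t) + H_B(t)}, \qquad
      C_{AND}(t) = \frac{K_A(t) + K_B(t)}{K_{AB}(t)}

    where H(t) = -\ln S(t) for survivor function S, K(t) = \ln F(t) for cumulative distribution F, subscripts A and B denote the individual channels (here, the unaided human and the aid) and AB the team; C(t) = 1 indicates unlimited-capacity parallel processing, with lower (higher) values indicating limited (super) capacity.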

  13. Automation: the competitive edge for HMOs and other alternative delivery systems.

    PubMed

    Prussin, J A

    1987-12-01

    Until recently, many, if not most, Health Maintenance Organizations (HMOs) were not automated, and those that were tended to be automated only on a limited basis. Recently, however, the highly competitive marketplace within which HMOs and other Alternative Delivery Systems (ADSs) exist has required that they operate at maximum effectiveness and efficiency. Given the complex nature of ADSs, the volume of transactions in ADSs, the large number of members served, and the numerous providers who are paid at different rates and on different bases, it is impossible for an ADS to operate effectively or efficiently, let alone show optimal performance, without a sophisticated, comprehensive automated system. Reliable automated systems designed specifically to address ADS functions such as enrollment and premium billing, finance and accounting, medical information and patient management, and marketing have recently become available at a reasonable cost.

  14. EpHLA software: a timesaving and accurate tool for improving identification of acceptable mismatches for clinical purposes.

    PubMed

    Filho, Herton Luiz Alves Sales; da Mata Sousa, Luiz Claudio Demes; von Glehn, Cristina de Queiroz Carrascosa; da Silva, Adalberto Socorro; dos Santos Neto, Pedro de Alcântara; do Nascimento, Ferraz; de Castro, Adail Fonseca; do Nascimento, Liliane Machado; Kneib, Carolina; Bianchi Cazarote, Helena; Mayumi Kitamura, Daniele; Torres, Juliane Roberta Dias; da Cruz Lopes, Laiane; Barros, Aryela Loureiro; da Silva Edlin, Evelin Nildiane; de Moura, Fernanda Sá Leal; Watanabe, Janine Midori Figueiredo; do Monte, Semiramis Jamil Hadad

    2012-06-01

    The HLAMatchmaker algorithm, which allows the identification of “safe” acceptable mismatches (AMMs) for recipients of solid organ and cell allografts, is rarely used, in part due to the difficulty of using it in its current Excel format. Automation of this algorithm may universalize its use to benefit the allocation of allografts. Recently, we developed new software called EpHLA, the first computer program to automate the use of the HLAMatchmaker algorithm. Herein, we present the experimental validation of the EpHLA program in terms of time efficiency and quality of operation. Single antigen bead assay results from sera of 10 sensitized patients awaiting kidney transplants were analyzed both by the conventional HLAMatchmaker method and by the automated EpHLA method. Users testing the two methods were asked to record: (i) the time required for completion of the analysis (in minutes); (ii) the number of eplets obtained for class I and class II HLA molecules; (iii) the categorization of eplets as reactive or non-reactive based on the MFI cutoff value; and (iv) the determination of AMMs based on eplet reactivities. Although both methods had similar accuracy, the automated EpHLA method was more than eight times faster than the conventional HLAMatchmaker method, and it proved equally accurate but more reliable in defining AMMs for allografts. The EpHLA software is an accurate and quick method for the identification of AMMs and thus may be a very useful tool in the decision-making process of organ allocation for highly sensitized patients, as well as in many other applications.
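
    At its core the eplet analysis is a set comparison; the sketch below is a hypothetical illustration of the logic (the data structures and MFI cutoff are assumptions, not the EpHLA API):

      def acceptable_mismatches(allele_eplets, eplet_mfi, mfi_cutoff=1000.0):
          # allele_eplets: {allele: set of eplet names it carries}
          # eplet_mfi: {eplet name: highest MFI on beads carrying it}
          reactive = {e for e, mfi in eplet_mfi.items() if mfi >= mfi_cutoff}
          # An allele is an acceptable mismatch when none of its eplets react.
          return [a for a, eplets in allele_eplets.items()
                  if not (eplets & reactive)]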

  15. Automating Content Analysis of Open-Ended Responses: Wordscores and Affective Intonation

    PubMed Central

    Baek, Young Min; Cappella, Joseph N.; Bindman, Alyssa

    2014-01-01

    This study presents automated methods for predicting the valence of a text and quantifying its valenced thoughts. First, it examines whether Wordscores, developed by Laver, Benoit, and Garry (2003), can be adapted to reliably predict the valence of open-ended responses in a survey about bioethical issues in genetics research; it then tests a complementary and novel technique for counting the valenced thoughts in open-ended responses, termed Affective Intonation. Results show that Wordscores successfully predicts the valence of brief and grammatically imperfect open-ended responses, and Affective Intonation achieves performance comparable to human coders when estimating the number of valenced thoughts. Both Wordscores and Affective Intonation show promise as reliable, effective, and efficient methods for researchers who content-analyze large amounts of textual data systematically. PMID:25558294
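
    Wordscores itself is compactly specified in Laver, Benoit, and Garry (2003): each word inherits a score from reference texts with known positions, and a new text is scored by the frequency-weighted average of its word scores. A brief sketch of that published algorithm (matrix conventions are assumptions; words absent from all reference texts must be dropped first):

      import numpy as np

      def wordscores(ref_counts, ref_positions, virgin_counts):
          # ref_counts: words x refs count matrix; ref_positions: known
          # scores of the reference texts; virgin_counts: counts in a new text.
          F = ref_counts / ref_counts.sum(axis=0)       # P(word | ref text)
          P = F / F.sum(axis=1, keepdims=True)          # P(ref text | word)
          word_scores = P @ ref_positions               # expected position per word
          fv = virgin_counts / virgin_counts.sum()
          return fv @ word_scores                       # raw score of the new text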

  16. SU-F-T-247: Collision Risks in a Modern Radiation Oncology Department: An Efficient Approach to Failure Modes and Effects Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schubert, L; Westerly, D; Vinogradskiy, Y

    Purpose: Collisions between treatment equipment and patients are potentially catastrophic. Modern technology now commonly involves automated remote motion during imaging and treatment, yet a systematic assessment to identify and mitigate collision risks has yet to be performed. Failure modes and effects analysis (FMEA) is a method of risk assessment that has been increasingly used in healthcare, yet it can be resource intensive. This work presents an efficient approach to FMEA to identify collision risks and implement practical interventions within a modern radiation therapy department. Methods: Potential collisions (i.e., failure modes) were assessed for all treatment and simulation rooms by teams consisting of physicists, therapists, and radiation oncologists. Failure modes were grouped into classes according to similar characteristics. A single group meeting was held to identify implementable interventions for the highest priority classes of failure modes. Results: A total of 60 unique failure modes were identified by 6 different teams of physicists, therapists, and radiation oncologists. Failure modes were grouped into four main classes: specific patient setups, automated equipment motion, manual equipment motion, and actions in QA or service mode. Two of these classes, specific patient setups and automated equipment motion, were identified as high priority in terms of severity of consequence and addressability by interventions. These two highest risk classes comprised 33 failure modes (55% of the total). In a single one-hour group meeting, 6 interventions were identified. Those interventions addressed 100% of the high risk classes of failure modes (55% of all failure modes identified). Conclusion: A class-based approach to FMEA was developed to efficiently identify collision risks and implement interventions in a modern radiation oncology department. Failure modes and interventions will be listed, and a comparison of this approach against traditional FMEA methods will be presented.

  17. Efficient determination of average valence of manganese in manganese oxides by reaction headspace gas chromatography.

    PubMed

    Xie, Wei-Qi; Gong, Yi-Xian; Yu, Kong-Xian

    2017-08-18

    This work investigates a new reaction headspace gas chromatographic (HS-GC) technique for efficiently quantifying the average valence of manganese (Mn) in manganese oxides. The method is based on the redox reaction between manganese oxides and sodium oxalate under acidic conditions. The carbon dioxide (CO2) formed by the reaction is quantitatively analyzed by headspace gas chromatography. The data showed that the reaction in the closed headspace vial can be completed in 20 min at 80°C. The relative standard deviation of this reaction HS-GC method in precision testing was within 1.08%, and the relative differences between the new method and the reference method (titration) were no more than 5.71%. The new HS-GC method is automated and efficient, and it can be a reliable tool for the quantitative analysis of the average valence of manganese in manganese oxide related research and applications. Copyright © 2017 Elsevier B.V. All rights reserved.
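
    The valence determination is stoichiometric: oxalate gives up one electron per CO2 formed (C2O4^2- -> 2 CO2 + 2 e-), while reducing Mn(n+) to Mn(2+) consumes n - 2 electrons per Mn. Assuming the reaction is quantitative, the average valence follows in one line:

      def average_mn_valence(mol_co2, mol_mn):
          # Electron balance: mol_CO2 = (n - 2) * mol_Mn, so
          # n = 2 + mol_CO2 / mol_Mn.
          return 2.0 + mol_co2 / mol_mn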

  18. Advertisement recognition using mode voting acoustic fingerprint

    NASA Astrophysics Data System (ADS)

    Fahmi, Reza; Abedi Firouzjaee, Hosein; Janalizadeh Choobbasti, Ali; Mortazavi Najafabadi, S. H. E.; Safavi, Saeid

    2017-12-01

    The emergence of media outlets and public relations tools such as TV, radio and the Internet since the 20th century has provided companies with a good platform for advertising their goods and services. Advertisement recognition is an important task that can help companies measure the efficiency of their advertising campaigns in the market and compare their performance with competitors in order to get better business insights. Advertisement recognition is usually performed manually by human labor or through automated methods based mainly on heuristic features; such methods usually lack scalability and the ability to generalize to different situations. In this paper, we present an automated method for advertisement recognition based on audio processing that makes this process fairly simple and eliminates the human factor from the equation. This method has ultimately been used at Miras information technology to monitor 56 TV channels and detect all ad video clips broadcast over those networks.
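
    The abstract does not spell out the fingerprinting pipeline, but the mode-voting idea in the title can be illustrated: time-stamped fingerprint hashes from the query vote for (ad, time-offset) pairs, and a true match produces a strong modal offset. A hypothetical sketch, with all names and structures assumed:

      from collections import Counter

      def best_matching_ad(query_prints, reference_db):
          # query_prints: [(hash, time_in_query), ...] from the query audio
          # reference_db: {hash: [(ad_id, time_in_ad), ...]} built offline
          votes = Counter()
          for h, t_query in query_prints:
              for ad_id, t_ref in reference_db.get(h, ()):
                  # Genuine matches keep a consistent time offset.
                  votes[(ad_id, round(t_ref - t_query, 1))] += 1
          if not votes:
              return None
          (ad_id, _offset), n_votes = votes.most_common(1)[0]
          return ad_id, n_votes  # the modal vote identifies the ad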

  19. An accurate and efficient method for evaluating the kernel of the integral equation relating pressure to normalwash in unsteady potential flow

    NASA Technical Reports Server (NTRS)

    Desmarais, R. N.

    1982-01-01

    This paper describes an accurate, economical method for generating approximations to the kernel of the integral equation relating unsteady pressure to normalwash in nonplanar flow. The method is capable of generating approximations of arbitrary accuracy. It is based on approximating the algebraic part of the nonelementary integrals in the kernel by exponential approximations and then integrating termwise. The exponent spacing in the approximation is a geometric sequence. The coefficients and the exponent multiplier of the exponential approximation are computed by least squares, so the method is completely automated. Exponential approximations generated in this manner are two orders of magnitude more accurate than the exponential approximation currently most often used for this purpose. Coefficients for 8-, 12-, 24-, and 72-term approximations are tabulated in the report. Also, since the method is automated, it can be used to generate approximations attaining any desired trade-off between accuracy and computing cost.
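
    With the exponents fixed on a geometric grid, the coefficients follow from ordinary linear least squares, which is what makes the procedure fully automatic. A brief sketch under assumed grid parameters (the sample function is illustrative, not the report's exact kernel term):

      import numpy as np

      def exp_sum_coefficients(f, n_terms, b0=0.01, ratio=1.25,
                               u=np.linspace(0.0, 20.0, 400)):
          # Exponents form a geometric sequence b0 * ratio**n; the
          # coefficients minimize ||A c - f(u)|| in the least-squares sense.
          exponents = b0 * ratio ** np.arange(n_terms)
          A = np.exp(-np.outer(u, exponents))
          coeffs, *_ = np.linalg.lstsq(A, f(u), rcond=None)
          return exponents, coeffs

      # Example: an algebraic factor of the kind described, 1 - u/sqrt(1 + u^2).
      exps, c = exp_sum_coefficients(lambda u: 1 - u / np.sqrt(1 + u ** 2), 12)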

  20. Scalable Device for Automated Microbial Electroporation in a Digital Microfluidic Platform.

    PubMed

    Madison, Andrew C; Royal, Matthew W; Vigneault, Frederic; Chen, Liji; Griffin, Peter B; Horowitz, Mark; Church, George M; Fair, Richard B

    2017-09-15

    Electrowetting-on-dielectric (EWD) digital microfluidic laboratory-on-a-chip platforms demonstrate excellent performance in automating labor-intensive protocols. When coupled with an on-chip electroporation capability, these systems hold promise for streamlining cumbersome processes such as multiplex automated genome engineering (MAGE). We integrated a single Ti:Au electroporation electrode into an otherwise standard parallel-plate EWD geometry to enable high-efficiency transformation of Escherichia coli with reporter plasmid DNA in a 200 nL droplet. Test devices exhibited robust operation, with more than 10 transformation experiments performed per device without cross-contamination or failure. Despite the intrinsic electric-field nonuniformity present in the EP/EWD device, the peak on-chip transformation efficiency was measured to be 8.6 ± 1.0 × 10^8 cfu·μg^-1 for an average applied electric field strength of 2.25 ± 0.50 kV·mm^-1. Cell survival and transformation fractions at this electroporation pulse strength were found to be 1.5 ± 0.3% and 2.3 ± 0.1%, respectively. Our work expands the EWD toolkit to include on-chip microbial electroporation and opens the possibility of scaling advanced genome engineering methods, like MAGE, into the submicroliter regime.

  1. PEGASUS 5: An Automated Pre-Processor for Overset-Grid CFD

    NASA Technical Reports Server (NTRS)

    Suhs, Norman E.; Rogers, Stuart E.; Dietz, William E.; Kwak, Dochan (Technical Monitor)

    2002-01-01

    An all-new, automated version of the PEGASUS software has been developed and tested. PEGASUS provides the hole-cutting and connectivity information between overlapping grids and is used as the final part of the grid-generation process for overset-grid computational fluid dynamics approaches. The new PEGASUS code (Version 5) has many new features: automated hole cutting; a projection scheme for fixing gaps in overset surfaces; more efficient interpolation search methods using an alternating digital tree; hole-size optimization based on adding additional layers of fringe points; and an automatic restart capability. The new code has also been parallelized using the Message Passing Interface standard. The parallelization provides efficient speed-up of the execution time by an order of magnitude, and up to a factor of 30 for very large problems. The results of three example cases are presented: a three-element high-lift airfoil, a generic business jet configuration, and a complete Boeing 777-200 aircraft in a high-lift landing configuration. Comparisons of the computed flow fields for the airfoil and 777 test cases between the old and new versions of PEGASUS show excellent agreement with each other and with experimental results.

  2. Extension Master Gardener Intranet: Automating Administration, Motivating Volunteers, Increasing Efficiency, and Facilitating Impact Reporting

    ERIC Educational Resources Information Center

    Bradley, Lucy K.; Cook, Jonneen; Cook, Chris

    2011-01-01

    North Carolina State University has incorporated many aspects of volunteer program administration and reporting into an on-line solution that integrates impact reporting into daily program management. The Extension Master Gardener Intranet automates many of the administrative tasks associated with volunteer management, increasing efficiency, and…

  3. Integrated Communications and Work Efficiency: Impacts on Organizational Structure and Power.

    ERIC Educational Resources Information Center

    Wigand, Rolf T.

    This paper reviews the work environment surrounding integrated office systems, synthesizes the known effects of automated office technologies, and discusses their impact on work efficiency in office environments. Particular attention is given to the effect of automated technologies on networks, workflow/processes, and organizational structure and…

  4. Automated Clean Chemistry for Bulk Analysis of Environmental Swipe Samples - FY17 Year End Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ticknor, Brian W.; Metzger, Shalina C.; McBay, Eddy H.

    Sample preparation methods for mass spectrometry are being automated using commercial-off-the-shelf (COTS) equipment to shorten lengthy and costly manual chemical purification procedures. This development addresses a serious need in the International Atomic Energy Agency's Network of Analytical Laboratories (IAEA NWAL) to increase efficiency in the Bulk Analysis of Environmental Samples for Safeguards program with a method that allows unattended, overnight operation. In collaboration with Elemental Scientific Inc., the prepFAST-MC2 was designed based on COTS equipment. It was modified for uranium/plutonium separations using renewable columns packed with Eichrom TEVA and UTEVA resins, with a chemical separation method based on the Oak Ridge National Laboratory (ORNL) NWAL chemical procedure. The newly designed prepFAST-SR has had several upgrades compared with the original prepFAST-MC2. Both systems are currently installed in the Ultra-Trace Forensics Science Center at ORNL.

  5. How to sharpen your automated tools.

    DOT National Transportation Integrated Search

    2014-12-01

    New programs that claim to make flying more efficient have several things in common: new tasks for pilots, new flight deck displays, automated support tools, changes to ground automation, and displays for air traffic control. Training is one of the t...

  6. Root Gravitropism: Quantification, Challenges, and Solutions.

    PubMed

    Muller, Lukas; Bennett, Malcolm J; French, Andy; Wells, Darren M; Swarup, Ranjan

    2018-01-01

    Better understanding of root traits such as root angle and root gravitropism will be crucial for the development of crops with improved resource-use efficiency. This chapter describes a high-throughput, automated image analysis method to trace Arabidopsis (Arabidopsis thaliana) seedling roots grown on agar plates. The method combines a particle-filtering algorithm with a graph-based method to trace the center line of a root, and it can be adapted for the analysis of several root parameters, such as length, curvature, and stimulus, from the original root traces.

  7. An Intelligent Automation Platform for Rapid Bioprocess Design.

    PubMed

    Wu, Tianyi; Zhou, Yuhong

    2014-08-01

    Bioprocess development is very labor intensive, requiring many experiments to characterize each unit operation in the process sequence to achieve product safety and process efficiency. Recent advances in microscale biochemical engineering have led to automated experimentation. A process design workflow is implemented sequentially in which (1) a liquid-handling system performs high-throughput wet-lab experiments, (2) standalone analysis devices collect the data, and (3) specific software is used for data analysis and experiment design given the user's inputs. We report an intelligent automation platform that integrates these three activities to enhance the efficiency of such a workflow. A multiagent intelligent architecture has been developed, incorporating agent communication, to perform the tasks automatically. The key contribution of this work is the automation of data analysis and experiment design, together with the ability to generate scripts that run the experiments automatically, eliminating human involvement. A first-generation prototype has been established and demonstrated through lysozyme precipitation process design. All procedures in the case study have been fully automated through the intelligent automation platform. The realization of automated data analysis and experiment design, and automated script programming for experimental procedures, has the potential to increase lab productivity. © 2013 Society for Laboratory Automation and Screening.

  8. Defining the drivers for accepting decision making automation in air traffic management.

    PubMed

    Bekier, Marek; Molesworth, Brett R C; Williamson, Ann

    2011-04-01

    Air Traffic Management (ATM) operators are under increasing pressure to improve the efficiency of their operations to cater for forecasted increases in air traffic movements. One solution involves increasing the utilisation of automation within the ATM system. The success of this approach is contingent on Air Traffic Control Operators' (ATCOs') willingness to accept increased levels of automation. The main aim of the present research was to examine the drivers underpinning ATCOs' willingness to accept increased utilisation of automation within their role. Two fictitious scenarios involving the application of two new automated decision-making tools were created. The results of an online survey revealed that traditional predictors of automation acceptance, such as age, trust and job satisfaction, explain between 4 and 7% of the variance. Furthermore, these predictors varied depending on the purpose for which the automation was to be employed. These results are discussed from an applied and theoretical perspective. STATEMENT OF RELEVANCE: Efficiency improvements in ATM are required to cater for forecasted increases in air traffic movements. One solution is to increase the utilisation of automation within Air Traffic Control. The present research examines the drivers underpinning air traffic controllers' willingness to accept increased levels of automation in their role.

  9. Automation in an addiction treatment research clinic: computerised contingency management, ecological momentary assessment and a protocol workflow system.

    PubMed

    Vahabzadeh, Massoud; Lin, Jia-Ling; Mezghanni, Mustapha; Epstein, David H; Preston, Kenzie L

    2009-01-01

    A challenge in treatment research is the necessity of adhering to protocol and regulatory strictures while maintaining flexibility to meet patients' treatment needs and to accommodate variations among protocols. Another challenge is the acquisition of large amounts of data in an occasionally hectic environment, along with the provision of seamless methods for exporting, mining and querying the data. We have automated several major functions of our outpatient treatment research clinic for studies in drug abuse and dependence. Here we describe three such specialised applications: the Automated Contingency Management (ACM) system for the delivery of behavioural interventions, the transactional electronic diary (TED) system for the management of behavioural assessments and the Protocol Workflow System (PWS) for computerised workflow automation and guidance of each participant's daily clinic activities. These modules are integrated into our larger information system to enable data sharing in real time among authorised staff. ACM and the TED have each permitted us to conduct research that was not previously possible. In addition, the time to data analysis at the end of each study is substantially shorter. With the implementation of the PWS, we have been able to manage a research clinic with an 80 patient capacity, having an annual average of 18,000 patient visits and 7300 urine collections with a research staff of five. Finally, automated data management has considerably enhanced our ability to monitor and summarise participant safety data for research oversight. When developed in consultation with end users, automation in treatment research clinics can enable more efficient operations, better communication among staff and expansions in research methods.

  10. Automated dynamic analytical model improvement for damped structures

    NASA Technical Reports Server (NTRS)

    Fuh, J. S.; Berman, A.

    1985-01-01

    A method is described to improve a linear, nonproportionally damped analytical model of a structure. The procedure finds the smallest changes in the analytical model such that the improved model matches the measured modal parameters. Features of the method are: (1) the ability to properly treat the complex-valued modal parameters of a damped system; (2) applicability to realistically large structural models; and (3) computational efficiency, without involving eigensolutions or inversion of a large matrix.

  11. Cockpit Adaptive Automation and Pilot Performance

    NASA Technical Reports Server (NTRS)

    Parasuraman, Raja

    2001-01-01

    The introduction of high-level automated systems in the aircraft cockpit has provided several benefits, e.g., new capabilities, enhanced operational efficiency, and reduced crew workload. At the same time, conventional 'static' automation has sometimes degraded human operator monitoring performance, increased workload, and reduced situation awareness. Adaptive automation represents an alternative to static automation. In this approach, task allocation between human operators and computer systems is flexible and context-dependent rather than static. Adaptive automation, or adaptive task allocation, is thought to provide for regulation of operator workload and performance, while preserving the benefits of static automation. In previous research we have reported beneficial effects of adaptive automation on the performance of both pilots and non-pilots of flight-related tasks. For adaptive systems to be viable, however, such benefits need to be examined jointly in the context of a single set of tasks. The studies carried out under this project evaluated a systematic method for combining different forms of adaptive automation. A model for effective combination of different forms of adaptive automation, based on matching adaptation to operator workload, was proposed and tested. The model was evaluated in studies using IFR-rated pilots flying a general-aviation simulator. Performance, subjective, and physiological (heart rate variability, eye scan-paths) measures of workload were recorded. The studies compared workload-based adaptation to non-adaptive control conditions and found evidence for systematic benefits of adaptive automation. The research provides an empirical basis for evaluating the effectiveness of adaptive automation in the cockpit. The results contribute to the development of design principles and guidelines for the implementation of adaptive automation in the cockpit, particularly in general aviation, and in other human-machine systems. Project goals were met or exceeded. The results of the research extended knowledge of automation-related performance decrements in pilots and demonstrated the positive effects of adaptive task allocation. In addition, several practical implications for cockpit automation design were drawn from the research conducted. A total of 12 articles deriving from the project were published.

  12. Automated fault-management in a simulated spaceflight micro-world

    NASA Technical Reports Server (NTRS)

    Lorenz, Bernd; Di Nocera, Francesco; Rottger, Stefan; Parasuraman, Raja

    2002-01-01

    BACKGROUND: As human spaceflight missions extend in duration and distance from Earth, a self-sufficient crew will bear far greater onboard responsibility and authority for mission success. This will increase the need for automated fault management (FM). Human factors issues in the use of such systems include maintenance of cognitive skill, situational awareness (SA), trust in automation, and workload. This study examines the human performance consequences of operator use of intelligent FM support in interaction with an autonomous, space-related, atmospheric control system. METHODS: An expert system representing a model-based reasoning agent supported operators at a low level of automation (LOA) through a computerized fault-finding guide, at a medium LOA through an automated diagnosis and recovery advisory, and at a high LOA through automated diagnosis and recovery implementation, subject to operator approval or veto. Ten percent of the experimental trials involved complete failure of FM support. RESULTS: Benefits of automation were reflected in more accurate diagnoses, shorter fault identification time, and reduced subjective operator workload. Unexpectedly, fault identification times deteriorated more at the medium than at the high LOA during automation failure. Analyses of information sampling behavior showed that offloading operators from recovery implementation during reliable automation enabled operators at the high LOA to engage in fault assessment activities. CONCLUSIONS: The potential threat to SA imposed by high-level automation, in which decision advisories are automatically generated, need not inevitably be counteracted by choosing a lower LOA. Instead, freeing operator cognitive resources through automatic implementation of recovery plans at a higher LOA can promote better fault comprehension, so long as the automation interface is designed to support efficient information sampling.

  13. Automated determination of cup-to-disc ratio for classification of glaucomatous and normal eyes on stereo retinal fundus images

    NASA Astrophysics Data System (ADS)

    Muramatsu, Chisako; Nakagawa, Toshiaki; Sawada, Akira; Hatanaka, Yuji; Yamamoto, Tetsuya; Fujita, Hiroshi

    2011-09-01

    Early diagnosis of glaucoma, which is the second leading cause of blindness in the world, can halt or slow the progression of the disease. We propose an automated method for analyzing the optic disc and measuring the cup-to-disc ratio (CDR) on stereo retinal fundus images, to improve ophthalmologists' diagnostic efficiency and potentially reduce variation in CDR measurement. The method was developed using 80 retinal fundus image pairs, including 25 glaucomatous and 55 nonglaucomatous eyes, obtained at our institution. The disc region was segmented using the active contour method with brightness and edge information. Segmentation of the cup region was performed using a depth map of the optic disc, reconstructed on the basis of the stereo disparity. The CDRs were measured and compared with those determined using the manual segmentation results of an expert ophthalmologist. The method was applied to a new database consisting of 98 stereo image pairs, including 60 and 30 pairs with and without signs of glaucoma, respectively. Using the CDRs, an area under the receiver operating characteristic curve of 0.90 was obtained for classification of the glaucomatous and nonglaucomatous eyes. The result indicates the potential usefulness of automated CDR determination for the diagnosis of glaucoma.
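
    Once disc and cup regions are segmented, the CDR itself is a one-line ratio; a small sketch computing the vertical CDR from boolean masks (array conventions are assumptions):

      import numpy as np

      def vertical_cdr(cup_mask, disc_mask):
          # Vertical extent = number of image rows each region touches.
          cup_height = np.count_nonzero(cup_mask.any(axis=1))
          disc_height = np.count_nonzero(disc_mask.any(axis=1))
          return cup_height / disc_height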

  14. Study of Adaptive Mathematical Models for Deriving Automated Pilot Performance Measurement Techniques. Volume II. Appendices. Final Report.

    ERIC Educational Resources Information Center

    Connelly, E. M.; And Others

    A new approach to deriving human performance measures and criteria for use in automatically evaluating trainee performance is described. Ultimately, this approach will allow automatic measurement of pilot performance in a flight simulator or from recorded in-flight data. An efficient method of representing performance data within a computer is…

  15. Sequential-Injection Analysis: Principles, Instrument Construction, and Demonstration by a Simple Experiment

    ERIC Educational Resources Information Center

    Economou, A.; Tzanavaras, P. D.; Themelis, D. G.

    2005-01-01

    Sequential-injection analysis (SIA) is an approach to sample handling that enables the automation of manual wet-chemistry procedures in a rapid, precise, and efficient manner. Experiments using SIA fit well in the course of Instrumental Chemical Analysis, especially in the section on automatic methods of analysis provided by chemistry…

  16. Improving treatment plan evaluation with automation.

    PubMed

    Covington, Elizabeth L; Chen, Xiaoping; Younge, Kelly C; Lee, Choonik; Matuszak, Martha M; Kessler, Marc L; Keranen, Wayne; Acosta, Eduardo; Dougherty, Ashley M; Filpansick, Stephanie E; Moran, Jean M

    2016-11-08

    The goal of this work is to evaluate the effectiveness of the Plan-Checker Tool (PCT), which was created to improve first-time plan quality, reduce patient delays, increase the efficiency of our electronic workflow, and standardize and automate the physics plan review in the treatment planning system (TPS). PCT uses an application programming interface to check and compare data from the TPS and treatment management system (TMS). PCT includes a comprehensive checklist of automated and manual checks that are documented when performed by the user as part of a plan readiness check for treatment. Prior to and during PCT development, errors identified during the physics review and causes of patient treatment start delays were tracked to prioritize which checks should be automated. Nineteen of 33 checklist items were automated, with data extracted by PCT. There was a 60% reduction in the number of patient delays in the six months after PCT release. PCT was successfully implemented for use on all external beam treatment plans in our clinic. While the number of errors found during the physics check did not decrease, automation of checks increased the visibility of errors during the physics check, which led to decreased patient delays. The methods used here can be applied to any TMS and TPS that allows queries of the database. © 2016 The Authors.
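
    A minimal sketch of the kind of automated cross-check such a tool performs, comparing corresponding fields pulled from the TPS and TMS; the field names and data structures here are hypothetical, not PCT's actual API.

        def run_checks(tps_plan, tms_plan, checks):
            # Each check maps a human-readable name to the TPS/TMS fields to compare
            failures = []
            for name, tps_field, tms_field in checks:
                tps_val = tps_plan.get(tps_field)
                tms_val = tms_plan.get(tms_field)
                if tps_val != tms_val:
                    failures.append((name, tps_val, tms_val))
            return failures

        # Hypothetical checklist entries
        checks = [("prescription dose", "rx_dose", "dose"),
                  ("fraction count", "n_fractions", "fractions")]
        # failures = run_checks(tps_export, tms_export, checks)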

  17. Stromal vascular fraction isolated from lipo-aspirates using an automated processing system: bench and bed analysis.

    PubMed

    Doi, Kentaro; Tanaka, Shinsuke; Iida, Hideo; Eto, Hitomi; Kato, Harunosuke; Aoi, Noriyuki; Kuno, Shinichiro; Hirohi, Toshitsugu; Yoshimura, Kotaro

    2013-11-01

    The heterogeneous stromal vascular fraction (SVF), containing adipose-derived stem/progenitor cells (ASCs), can be easily isolated through enzymatic digestion of aspirated adipose tissue. In clinical settings, however, strict control of technical procedures according to standard operating procedures and validation of cell-processing conditions are required. We therefore evaluated the efficiency and reliability of an automated system for SVF isolation from adipose tissue. SVF cells freshly isolated using the automated procedure showed numbers and viability comparable to those from manual isolation. Flow cytometric analysis confirmed an SVF cell composition profile similar to that after manual isolation. In addition, the ASC yield after 1 week in culture was not significantly different between the two groups. Our clinical study, in which SVF cells isolated with the automated system were transplanted with aspirated fat tissue for soft tissue augmentation/reconstruction in 42 patients, showed satisfactory outcomes with no serious side effects. Taken together, our results suggest that the automated isolation system is as reliable as manual isolation and may also be useful in clinical settings. Automated isolation is expected to enable cell-based clinical trials in small facilities with an aseptic room, without the necessity of a good manufacturing practice-level cell processing area. Copyright © 2012 John Wiley & Sons, Ltd.

  18. Automation of Cataloging: Effects on Use of Staff, Efficiency, and Service to Patrons.

    ERIC Educational Resources Information Center

    Bednar, Marie

    1988-01-01

    Describes the effects of the automation of cataloging processes at Pennsylvania State University. The discussion covers the reorganization of professional and paraprofessional personnel and job responsibilities, staff reactions to the changes, the impact on cataloging quality and efficiency, and patron satisfaction with the services offered. (15…

  19. Automated Reconstruction of Historic Roof Structures from Point Clouds - Development and Examples

    NASA Astrophysics Data System (ADS)

    Pöchtrager, M.; Styhler-Aydın, G.; Döring-Williams, M.; Pfeifer, N.

    2017-08-01

    The analysis of historic roof constructions is an important task for planning the adaptive reuse of buildings or for maintenance and restoration issues. Current approaches to modeling roof constructions consist of several consecutive operations that must be done manually or with semi-automatic routines. To increase efficiency and allow the focus to be on analysis rather than on data processing, a set of methods was developed for the fully automated analysis of roof constructions, including integration of architectural and structural modeling. Terrestrial laser scanning permits high-detail surveying of large-scale structures within a short time. Whereas 3-D laser scan data consist of millions of single points on the object surface, a structural model requires a geometric description of the structural elements, consisting of beam axes and connections. Preliminary results showed that the developed methods work well for beams in flawless condition with a quadratic cross section and no bending. Deformations or damage such as cracks and cuts on the wooden beams can lead to incomplete representations in the model. Overall, a high degree of automation was achieved.
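
    One plausible core step in going from points to a structural model is fitting a straight axis to each beam's point cluster. The sketch below does this with a principal-component fit in NumPy; it assumes the points have already been segmented into a single beam and is not the published pipeline.

        import numpy as np

        def fit_beam_axis(points):
            # points: Nx3 array of laser-scan points belonging to one beam
            centroid = points.mean(axis=0)
            # Principal direction = right singular vector with largest singular value
            _, _, vt = np.linalg.svd(points - centroid, full_matrices=False)
            direction = vt[0]
            # Axis endpoints from the extreme projections onto the direction
            t = (points - centroid) @ direction
            return centroid + t.min() * direction, centroid + t.max() * direction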

  20. Improving automatic peptide mass fingerprint protein identification by combining many peak sets.

    PubMed

    Rögnvaldsson, Thorsteinn; Häkkinen, Jari; Lindberg, Claes; Marko-Varga, György; Potthast, Frank; Samuelsson, Jim

    2004-08-05

    An automated peak picking strategy is presented in which several peak sets with different signal-to-noise levels are combined to form a more reliable statement on the protein identity. The strategy is compared against both manual peak picking and industry-standard automated peak picking on a set of mass spectra obtained after tryptic in-gel digestion of 2D-gel samples from human fetal fibroblasts. The set of spectra contains samples ranging from strong to weak, and the proposed multiple-scale method is shown to be much better on weak spectra than the industry-standard method and a human operator, and equal in performance to these on strong and medium-strong spectra. It is also demonstrated that peak sets selected by a human operator display considerable variability and that it is impossible to speak of a single "true" peak set for a given spectrum. The described multiple-scale strategy both avoids time-consuming parameter tuning and exceeds the human operator in protein identification efficiency. The strategy therefore promises reliable automated user-independent protein identification using peptide mass fingerprints.
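
    The idea of combining peak sets from several signal-to-noise levels can be pictured as a simple voting scheme: pick peaks at each threshold and keep m/z values supported at a majority of scales. The thresholds, tolerance, and noise estimate below are illustrative assumptions, not the paper's exact algorithm.

        import numpy as np
        from scipy.signal import find_peaks

        def multiscale_peaks(mz, intensity, snr_levels=(2, 3, 5, 8), tol=0.2):
            # mz, intensity: 1-D NumPy arrays of the spectrum
            noise = np.median(np.abs(intensity - np.median(intensity)))  # MAD noise
            peak_sets = []
            for snr in snr_levels:
                idx, _ = find_peaks(intensity, height=snr * noise)
                peak_sets.append(mz[idx])
            # Keep m/z values detected (within tol) at a majority of SNR levels
            kept = []
            for m in np.unique(np.concatenate(peak_sets)):
                support = sum(np.any(np.abs(ps - m) <= tol) for ps in peak_sets)
                if support >= len(snr_levels) // 2 + 1:
                    kept.append(m)
            return kept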

  1. Object-based classification of semi-arid wetlands

    NASA Astrophysics Data System (ADS)

    Halabisky, Meghan; Moskal, L. Monika; Hall, Sonia A.

    2011-01-01

    Wetlands are valuable ecosystems that benefit society. However, throughout history wetlands have been converted to other land uses. For this reason, timely wetland maps are necessary for developing strategies to protect wetland habitat. The goal of this research was to develop a time-efficient, automated, low-cost method to map wetlands in a semi-arid landscape that could be scaled up for use at a county or state level, and could lay the groundwork for expanding to forested areas. Therefore, it was critical that the research project contain two components: accurate automated feature extraction and the use of low-cost imagery. For that reason, we tested the effectiveness of geographic object-based image analysis (GEOBIA) to delineate and classify wetlands using freely available true color aerial photographs provided through the National Agriculture Inventory Program. The GEOBIA method produced an overall accuracy of 89% (khat = 0.81), despite the absence of infrared spectral data. GEOBIA provides the automation that can save significant resources when scaled up while still providing sufficient spatial resolution and accuracy to be useful to state and local resource managers and policymakers.
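
    For reference, the reported overall accuracy and khat (Cohen's kappa) follow directly from the error (confusion) matrix of reference versus classified labels, as in this short sketch:

        import numpy as np

        def accuracy_and_kappa(cm):
            # cm: square confusion matrix of reference (rows) vs classified (cols)
            n = cm.sum()
            po = np.trace(cm) / n                      # observed agreement
            pe = (cm.sum(0) * cm.sum(1)).sum() / n**2  # chance agreement
            return po, (po - pe) / (1 - pe)            # overall accuracy, kappa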

  2. Automated reporting of pharmacokinetic study results: gaining efficiency downstream from the laboratory.

    PubMed

    Schaefer, Peter

    2011-07-01

    The purpose of bioanalysis in the pharmaceutical industry is to provide 'raw' data about the concentration of a drug candidate and its metabolites as input for studies of drug properties such as pharmacokinetic (PK), toxicokinetic, bioavailability/bioequivalence, and other studies. Building a seamless workflow from the laboratory to final reports is an ongoing challenge for IT groups and users alike. In such a workflow, PK automation can provide companies with the means to vastly increase the productivity of their scientific staff while improving the quality and consistency of their reports on PK analyses. This report presents the concept and benefits of PK automation and discusses which features of an automated reporting workflow should be translated into software requirements that pharmaceutical companies can use to select or build an efficient and effective PK automation solution that best meets their needs.

  3. Automated Passive Capillary Lysimeters for Estimating Water Drainage in the Vadose Zone

    NASA Astrophysics Data System (ADS)

    Jabro, J.; Evans, R.

    2009-04-01

    In this study, we demonstrated and evaluated the performance and accuracy of automated passive capillary (PCAP) lysimeters that we designed for continuous in-situ measurement and estimation of drainage water below the root zone of a sugarbeet-potato-barley rotation under two irrigation frequencies. Twelve automated PCAPs with sampling surface dimensions of 31 cm × 91 cm and a height of 87 cm were placed 90 cm below the soil surface in a Lihen sandy loam. Our state-of-the-art design incorporated Bluetooth wireless technology to enable an automated datalogger to transmit drainage water data every 15 minutes to a remote host, and had greater efficiency than other types of lysimeters. It also offered a significantly larger coverage area (2700 cm²) than similarly designed vadose zone lysimeters. The cumulative manually extracted drainage water was compared with the cumulative volume of drainage water recorded by the datalogger from the tipping bucket using several statistical methods. Our results indicated that our automated PCAPs are accurate and provide a convenient means for estimating water drainage in the vadose zone without the need for costly and time-consuming supportive systems.

  4. Automated data mining: an innovative and efficient web-based approach to maintaining resident case logs.

    PubMed

    Bhattacharya, Pratik; Van Stavern, Renee; Madhavan, Ramesh

    2010-12-01

    Use of resident case logs has been considered by the Residency Review Committee for Neurology of the Accreditation Council for Graduate Medical Education (ACGME). This study explores the effectiveness of a data-mining program for creating resident logs and compares the results to a manual data-entry system. Other potential applications of data mining for enhancing resident education are also explored. Patient notes dictated by residents were extracted from the Hospital Information System and analyzed using an unstructured-data mining program. History, examination, and ICD codes were gathered for a 30-day period and compared to the existing manual case logs. The automated method extracted all resident dictations with the dates of encounter and transcription. The automated data miner processed information from all 19 residents, while only 4 residents logged manually. The manual method identified only broad categories of diseases; the major categories were stroke or vascular disorder 53 (27.6%), epilepsy 28 (14.7%), and pain syndromes 26 (13.5%). In the automated method, epilepsy 114 (21.1%), cerebral atherosclerosis 114 (21.1%), and headache 105 (19.4%) were the most frequent primary diagnoses, and headache 89 (16.5%), seizures 94 (17.4%), and low back pain 47 (9%) were the most common chief complaints. More detailed patient information, such as tobacco use 227 (42%), alcohol use 205 (38%), and drug use 38 (7%), was extracted by the data-mining method. Manual case logs are time-consuming, provide limited information, and may be unpopular with residents. Data mining is a time-effective tool that may aid in the assessment of resident experience, the ACGME core competencies, or resident clinical research. More study of this method in larger numbers of residency programs is needed.
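
    A toy example of the unstructured-mining idea: pulling ICD-9-style codes out of dictated note text with a regular expression. Real clinical text mining also maps diagnosis phrases to codes; the pattern here is only illustrative.

        import re

        ICD9 = re.compile(r"\b\d{3}\.\d{1,2}\b")  # e.g., "345.90" (epilepsy, unspecified)

        def extract_codes(note_text):
            # Return all ICD-9-style numeric codes found in a dictated note
            return ICD9.findall(note_text)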

  5. A Method of Separation Assurance for Instrument Flight Procedures at Non-Radar Airports

    NASA Technical Reports Server (NTRS)

    Conway, Sheila R.; Consiglio, Maria

    2002-01-01

    A method to provide automated air traffic separation assurance services during approach to or departure from a non-radar, non-towered airport environment is described. The method is constrained by provision of these services without radical changes or ambitious investments in current ground-based technologies. The proposed procedures are designed to grant access to a large number of airfields that currently have no or very limited access under Instrument Flight Rules (IFR), thus increasing mobility with minimal infrastructure investment. This paper primarily addresses a low-cost option for airport and instrument approach infrastructure, but is designed to be an architecture from which a more efficient, albeit more complex, system may be developed. A functional description of the capabilities in the current NAS infrastructure is provided. Automated terminal operations and procedures are introduced. Rules of engagement and the operations are defined. Results of preliminary simulation testing are presented. Finally, application of the method to more terminal-like operations, and major research areas, including necessary piloted studies, are discussed.

  6. Expected Improvements in Work Truck Efficiency Through Connectivity and Automation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Walkowicz, Kevin A

    This presentation focuses on the potential impact of connected and automated technologies on commercial vehicle operations. It includes topics such as the U.S. Department of Energy's Energy Efficient Mobility Systems (EEMS) program and the Systems and Modeling for Accelerated Research in Transportation (SMART) Mobility Initiative. It also describes National Renewable Energy Laboratory (NREL) research findings pertaining to the potential energy impacts of connectivity and automation and stresses the need for integration and optimization to take advantage of the benefits offered by these transformative technologies while mitigating the potential negative consequences.

  7. Automated assay for screening the enzymatic release of reducing sugars from micronized biomass.

    PubMed

    Navarro, David; Couturier, Marie; da Silva, Gabriela Ghizzi Damasceno; Berrin, Jean-Guy; Rouau, Xavier; Asther, Marcel; Bignon, Christophe

    2010-07-16

    To reduce the production cost of bioethanol obtained from fermentation of the sugars provided by degradation of lignocellulosic biomass (i.e., second-generation bioethanol), it is necessary to screen for new enzymes endowed with more efficient biomass-degrading properties. This demands the set-up of high-throughput screening methods. Several methods have been devised, all using microplates in the industrial SBS format. Although this size reduction and standardization has greatly improved the screening process, the published methods comprise one or more manual steps that seriously decrease throughput. We therefore worked to devise a screening method devoid of any manual steps. We describe a fully automated assay for measuring the amount of reducing sugars released by biomass-degrading enzymes from wheat straw and spruce. The method comprises two independent and automated steps. The first step is the making of "substrate plates": 96-well microplates are filled with slurry suspensions of micronized substrate and stored frozen until use. The second step is the enzymatic activity assay. After thawing, the substrate plates are supplemented by the robot with cell-wall-degrading enzymes where necessary, and the whole process from addition of enzymes to quantification of released sugars is performed autonomously by the robot. We describe how critical parameters (amount of substrate, amount of enzyme, incubation duration, and temperature) were selected to fit our specific use. The ability of this automated small-scale assay to discriminate among different enzymatic activities was validated using a set of commercial enzymes. Using an automatic microplate sealer solved three main problems generally encountered during the set-up of methods for measuring the sugar-releasing activity of plant cell wall-degrading enzymes: throughput, automation, and evaporation losses. In its present set-up, the robot can autonomously process 120 triplicate wheat-straw samples per day. This throughput can be doubled if the incubation time is reduced from 24 h to 4 h (for initial-rate measurements, for instance). This method can potentially be used with any insoluble substrate that is micronizable. A video illustrating the method can be seen at the following URL: http://www.youtube.com/watch?v=NFg6TxjuMWU.

  8. Automated method for measuring the extent of selective logging damage with airborne LiDAR data

    NASA Astrophysics Data System (ADS)

    Melendy, L.; Hagen, S. C.; Sullivan, F. B.; Pearson, T. R. H.; Walker, S. M.; Ellis, P.; Kustiyo; Sambodo, Ari Katmoko; Roswintiarti, O.; Hanson, M. A.; Klassen, A. W.; Palace, M. W.; Braswell, B. H.; Delgado, G. M.

    2018-05-01

    Selective logging has an impact on the global carbon cycle, as well as on the forest micro-climate and longer-term changes in erosion, soil and nutrient cycling, and fire susceptibility. Our ability to quantify these impacts depends on methods and tools that accurately identify the extent and features of logging activity. LiDAR-based measurement of these features offers significant promise. Here, we present a set of algorithms for automated detection and mapping of critical features associated with logging - roads/decks, skid trails, and gaps - using commercial airborne LiDAR data as input. The automated algorithm was applied to commercial LiDAR data collected over two logging concessions in Kalimantan, Indonesia in 2014. The algorithm results were compared to measurements of the logging features collected in the field soon after logging was complete. The algorithm-mapped road/deck and skid trail features match closely with features measured in the field, with agreement levels ranging from 69% to 99% when adjusting for GPS location error. The algorithm performed most poorly with gaps, which, by their nature, are variable due to the unpredictable impact of tree fall versus the linear and regular features created directly by mechanical means. Overall, the automated algorithm performs well and offers significant promise as a generalizable tool to efficiently and accurately capture the effects of selective logging, including the potential to distinguish reduced-impact logging from conventional logging.
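
    As a sketch of how gap detection from airborne LiDAR might look, the snippet below builds a canopy height model (CHM) from surface and terrain rasters and keeps low-canopy patches above a minimum size. The thresholds are assumptions for illustration, not the algorithm's published values.

        import numpy as np
        from scipy import ndimage

        def detect_gaps(dsm, dtm, height_thresh=5.0, min_area_px=25):
            # Canopy height model: surface elevation minus terrain elevation
            chm = dsm - dtm
            low = chm < height_thresh                  # candidate gap cells
            labels, n = ndimage.label(low)             # connected low-canopy patches
            sizes = ndimage.sum(low, labels, range(1, n + 1))
            # Keep only patches large enough to count as logging gaps
            keep_labels = np.flatnonzero(np.asarray(sizes) >= min_area_px) + 1
            return np.isin(labels, keep_labels)        # boolean gap mask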

  9. Protein Crystal Growth

    NASA Technical Reports Server (NTRS)

    2003-01-01

    In order to rapidly and efficiently grow crystals, tools were needed to automatically identify and analyze the growing process of protein crystals. To meet this need, Diversified Scientific, Inc. (DSI), with the support of a Small Business Innovation Research (SBIR) contract from NASA's Marshall Space Flight Center, developed CrystalScore(trademark), the first automated image acquisition, analysis, and archiving system designed specifically for the macromolecular crystal growing community. It offers automated hardware control, image and data archiving, image processing, a searchable database, and surface plotting of experimental data. CrystalScore is currently being used by numerous pharmaceutical companies and academic and nonprofit research centers. DSI, located in Birmingham, Alabama, was awarded the patent "Method for acquiring, storing, and analyzing crystal images" on March 4, 2003. Another DSI product made possible by Marshall SBIR funding is VaporPro(trademark), a unique, comprehensive system that allows for the automated control of vapor diffusion for crystallization experiments.

  10. Assessment of Alternative Interfaces for Manual Commanding of Spacecraft Systems: Compatibility with Flexible Allocation Policies

    NASA Technical Reports Server (NTRS)

    Billman, Dorrit Owen; Schreckenghost, Debra; Miri, Pardis

    2014-01-01

    Astronauts will be responsible for executing a much larger body of procedures as human exploration moves further from Earth and Mission Control. Efficient, reliable methods for executing these procedures, including manual, automated, and mixed execution, will be important. Our interface integrates step-by-step instruction with the means for execution. The research reported here compared manual execution using the new system against a system analogous to the manual-only system currently in use on the International Space Station, to assess whether user performance in manual operations would be as good as or better with the new system than with the legacy system. The system used also allows flexible automated execution. The system and our data lay the foundation for integrating automated execution into the flow of procedures designed for humans. In our formative study, we found that the speed and accuracy of manual procedure execution were better with the new, integrated interface than with the legacy design.

  11. Semantic enrichment of medical forms - semi-automated coding of ODM-elements via web services.

    PubMed

    Breil, Bernhard; Watermann, Andreas; Haas, Peter; Dziuballe, Philipp; Dugas, Martin

    2012-01-01

    Semantic interoperability is an unsolved problem that occurs while working with medical forms from different information systems or institutions. Standards like ODM or CDA assure structural homogenization, but in order to compare elements from different data models it is necessary to use semantic concepts and codes at the item level of those structures. We developed and implemented a web-based tool that enables a domain expert to perform semi-automated coding of ODM files. For each item it is possible to query web services that return unique concept codes without leaving the context of the document. Although fully automated coding was not feasible, we implemented a dialog-based method to perform efficient coding of all data elements in the context of the whole document. The proportion of codable items was comparable to results from previous studies.
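
    The per-item lookup can be pictured as a small client that queries a terminology web service and returns candidate concept codes for the expert to confirm. The endpoint URL and response shape below are placeholders, not the actual service used in the study.

        import requests

        def suggest_codes(item_text, base_url="https://terminology.example/search"):
            # Ask a (placeholder) terminology service for candidate concept codes
            resp = requests.get(base_url, params={"q": item_text, "limit": 5},
                                timeout=10)
            resp.raise_for_status()
            # A human reviewer then confirms one of the returned suggestions
            return [(hit["code"], hit["display"])
                    for hit in resp.json().get("results", [])]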

  12. Laboratory automation in clinical bacteriology: what system to choose?

    PubMed

    Croxatto, A; Prod'hom, G; Faverjon, F; Rochais, Y; Greub, G

    2016-03-01

    Automation was introduced many years ago in several diagnostic disciplines such as chemistry, haematology and molecular biology. The first laboratory automation system for clinical bacteriology was released in 2006, and it rapidly proved its value by increasing productivity, allowing a continuous increase in sample volumes despite limited budgets and personnel shortages. Today, two major manufacturers, BD Kiestra and Copan, are commercializing partial or complete laboratory automation systems for bacteriology. The laboratory automation systems are rapidly evolving to provide improved hardware and software solutions to optimize laboratory efficiency. However, the complex parameters of the laboratory and automation systems must be considered to determine the best system for each given laboratory. We address several topics on laboratory automation that may help clinical bacteriologists to understand the particularities and operative modalities of the different systems. We present (a) a comparison of the engineering and technical features of the various elements composing the two different automated systems currently available, (b) the system workflows of partial and complete laboratory automation, which define the basis for laboratory reorganization required to optimize system efficiency, (c) the concept of digital imaging and telebacteriology, (d) the connectivity of laboratory automation to the laboratory information system, (e) the general advantages and disadvantages as well as the expected impacts provided by laboratory automation and (f) the laboratory data required to conduct a workflow assessment to determine the best configuration of an automated system for the laboratory activities and specificities. Copyright © 2016 The Authors. Published by Elsevier Ltd. All rights reserved.

  13. Automated classification of self-grooming in mice using open-source software.

    PubMed

    van den Boom, Bastijn J G; Pavlidi, Pavlina; Wolf, Casper J H; Mooij, Adriana H; Willuhn, Ingo

    2017-09-01

    Manual analysis of behavior is labor intensive and subject to inter-rater variability. Although considerable progress in automation of analysis has been made, complex behavior such as grooming still lacks satisfactory automated quantification. We trained a freely available, automated classifier, Janelia Automatic Animal Behavior Annotator (JAABA), to quantify self-grooming duration and number of bouts based on video recordings of SAPAP3 knockout mice (a mouse line that self-grooms excessively) and wild-type animals. We compared the JAABA classifier with human expert observers to test its ability to measure self-grooming in three scenarios: mice in an open field, mice on an elevated plus-maze, and tethered mice in an open field. In each scenario, the classifier identified both grooming and non-grooming with great accuracy and correlated highly with results obtained by human observers. Consistently, the JAABA classifier confirmed previous reports of excessive grooming in SAPAP3 knockout mice. Thus far, manual analysis was regarded as the only valid quantification method for self-grooming. We demonstrate that the JAABA classifier is a valid and reliable scoring tool, more cost-efficient than manual scoring, easy to use, requires minimal effort, provides high throughput, and prevents inter-rater variability. We introduce the JAABA classifier as an efficient analysis tool for the assessment of rodent self-grooming with expert quality. In our "how-to" instructions, we provide all information necessary to implement behavioral classification with JAABA. Copyright © 2017 Elsevier B.V. All rights reserved.

  14. An Intelligent Automation Platform for Rapid Bioprocess Design

    PubMed Central

    Wu, Tianyi

    2014-01-01

    Bioprocess development is very labor intensive, requiring many experiments to characterize each unit operation in the process sequence to achieve product safety and process efficiency. Recent advances in microscale biochemical engineering have led to automated experimentation. A process design workflow is implemented sequentially in which (1) a liquid-handling system performs high-throughput wet lab experiments, (2) standalone analysis devices detect the data, and (3) specific software is used for data analysis and experiment design given the user’s inputs. We report an intelligent automation platform that integrates these three activities to enhance the efficiency of such a workflow. A multiagent intelligent architecture has been developed incorporating agent communication to perform the tasks automatically. The key contribution of this work is the automation of data analysis and experiment design and also the ability to generate scripts to run the experiments automatically, allowing the elimination of human involvement. A first-generation prototype has been established and demonstrated through lysozyme precipitation process design. All procedures in the case study have been fully automated through an intelligent automation platform. The realization of automated data analysis and experiment design, and automated script programming for experimental procedures has the potential to increase lab productivity. PMID:24088579

  15. Quantifying autonomous vehicles national fuel consumption impacts: A data-rich approach

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, Yuche; Gonder, Jeffrey; Young, Stanley

    Autonomous vehicles are drawing significant attention from governments, manufacturers and consumers. Experts predict them to be the primary means of transportation by the middle of this century. Recent literature shows that vehicle automation has the potential to alter traffic patterns, vehicle ownership, and land use, which may affect fuel consumption from the transportation sector. In this paper, we developed a data-rich analytical framework to quantify system-wide fuel impacts of automation in the United States by integrating (1) a dynamic vehicle sales, stock, and usage model, (2) an historical transportation network-level vehicle miles traveled (VMT)/vehicle activity database, and (3) estimates of automation's impacts on fuel efficiency and travel demand. The vehicle model considers dynamics in vehicle fleet turnover and fuel efficiency improvements of conventional and advanced vehicle fleet. The network activity database contains VMT, free-flow speeds, and historical speeds of road links that can help us accurately identify fuel-savings opportunities of automation. Based on the model setup and assumptions, we found that the impacts of automation on fuel consumption are quite wide-ranging - with the potential to reduce fuel consumption by 45% in our 'Optimistic' case or increase it by 30% in our 'Pessimistic' case. Second, implementing automation on urban roads could potentially result in larger fuel savings compared with highway automation because of the driving features of urban roads. Lastly, through scenario analysis, we showed that the proposed framework can be used for refined assessments as better data on vehicle-level fuel efficiency and travel demand impacts of automation become available.

  16. Quantifying autonomous vehicles national fuel consumption impacts: A data-rich approach

    DOE PAGES

    Chen, Yuche; Gonder, Jeffrey; Young, Stanley; ...

    2017-11-06

    Autonomous vehicles are drawing significant attention from governments, manufacturers and consumers. Experts predict them to be the primary means of transportation by the middle of this century. Recent literature shows that vehicle automation has the potential to alter traffic patterns, vehicle ownership, and land use, which may affect fuel consumption from the transportation sector. In this paper, we developed a data-rich analytical framework to quantify system-wide fuel impacts of automation in the United States by integrating (1) a dynamic vehicle sales, stock, and usage model, (2) an historical transportation network-level vehicle miles traveled (VMT)/vehicle activity database, and (3) estimates of automation's impacts on fuel efficiency and travel demand. The vehicle model considers dynamics in vehicle fleet turnover and fuel efficiency improvements of conventional and advanced vehicle fleet. The network activity database contains VMT, free-flow speeds, and historical speeds of road links that can help us accurately identify fuel-savings opportunities of automation. Based on the model setup and assumptions, we found that the impacts of automation on fuel consumption are quite wide-ranging - with the potential to reduce fuel consumption by 45% in our 'Optimistic' case or increase it by 30% in our 'Pessimistic' case. Second, implementing automation on urban roads could potentially result in larger fuel savings compared with highway automation because of the driving features of urban roads. Lastly, through scenario analysis, we showed that the proposed framework can be used for refined assessments as better data on vehicle-level fuel efficiency and travel demand impacts of automation become available.

  17. Automation to improve efficiency of field expedient injury prediction screening.

    PubMed

    Teyhen, Deydre S; Shaffer, Scott W; Umlauf, Jon A; Akerman, Raymond J; Canada, John B; Butler, Robert J; Goffar, Stephen L; Walker, Michael J; Kiesel, Kyle B; Plisky, Phillip J

    2012-07-01

    Musculoskeletal injuries are a primary source of disability in the U.S. Military. Physical training and sports-related activities account for up to 90% of all injuries, and 80% of these injuries are considered overuse in nature. As a result, there is a need to develop an evidence-based musculoskeletal screen that can assist with injury prevention. The purpose of this study was to assess the capability of an automated system to improve the efficiency of field expedient tests that may help predict injury risk and provide corrective strategies for deficits identified. The field expedient tests include survey questions and measures of movement quality, balance, trunk stability, power, mobility, and foot structure and mobility. Data entry for these tests was automated using handheld computers, barcode scanning, and netbook computers. An automated algorithm for injury risk stratification and mitigation techniques was run on a server computer. Without automation support, subjects were assessed in 84.5 ± 9.1 minutes per subject compared with 66.8 ± 6.1 minutes per subject with automation and 47.1 ± 5.2 minutes per subject with automation and process improvement measures (p < 0.001). The average time to manually enter the data was 22.2 ± 7.4 minutes per subject. An additional 11.5 ± 2.5 minutes per subject was required to manually assign an intervention strategy. Automation of this injury prevention screening protocol using handheld devices and netbook computers allowed for real-time data entry and enhanced the efficiency of injury screening, risk stratification, and prescription of a risk mitigation strategy.

  18. Automated selective disruption of slow wave sleep

    PubMed Central

    Ooms, Sharon J.; Zempel, John M.; Holtzman, David M.; Ju, Yo-El S.

    2017-01-01

    Background: Slow wave sleep (SWS) plays an important role in neurophysiologic restoration. Experimentally testing the effect of SWS disruption previously required highly time-intensive and subjective methods. Our goal was to develop an automated and objective protocol to reduce SWS without affecting sleep architecture. New Method: We developed a custom Matlab™ protocol to calculate electroencephalogram spectral power every 10 seconds live during a polysomnogram, exclude artifact, and, if measurements met criteria for SWS, deliver increasingly louder tones through earphones. Middle-aged healthy volunteers (n=10) each underwent two polysomnograms, one with the SWS disruption protocol and one with a sham condition. Results: The SWS disruption protocol reduced SWS compared to the sham condition, as measured by spectral power in the delta (0.5–4 Hz) band, particularly in the 0.5–2 Hz range (mean 20% decrease). A compensatory increase in the proportion of total spectral power in the theta (4–8 Hz) and alpha (8–12 Hz) bands was seen, but otherwise normal sleep features were preserved. N3 sleep decreased from 20±34 to 3±6 minutes; otherwise, there were no significant changes in total sleep time, sleep efficiency, or other macrostructural sleep characteristics. Comparison with Existing Methods: This novel SWS disruption protocol produces specific reductions in delta band power similar to existing methods, but has the advantage of being automated, such that SWS disruption can be performed easily in a highly standardized and operator-independent manner. Conclusion: This automated SWS disruption protocol effectively reduces SWS without impacting overall sleep architecture. PMID:28238859
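
    The live spectral computation can be sketched as follows: estimate the power spectrum of each 10-second epoch and integrate the delta band; if the value crosses a calibrated SWS criterion, a tone would be delivered. The sampling rate and threshold logic below are assumptions, not the study's exact parameters.

        import numpy as np
        from scipy.signal import welch

        def delta_power(epoch, fs=250.0, band=(0.5, 4.0)):
            # Welch power spectral density of one 10-s EEG epoch
            f, pxx = welch(epoch, fs=fs, nperseg=int(4 * fs))
            mask = (f >= band[0]) & (f <= band[1])
            # Integrated power in the delta band
            return np.trapz(pxx[mask], f[mask])

        # if delta_power(epoch) > sws_threshold: play_tone(volume)  # hypothetical step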

  19. Role of Gist and PHOG Features in Computer-Aided Diagnosis of Tuberculosis without Segmentation

    PubMed Central

    Chauhan, Arun; Chauhan, Devesh; Rout, Chittaranjan

    2014-01-01

    Purpose: Effective diagnosis of tuberculosis (TB) relies on accurate interpretation of radiological patterns found in a chest radiograph (CXR). Lack of skilled radiologists and other resources, especially in developing countries, hinders its efficient diagnosis. Computer-aided diagnosis (CAD) methods provide a second opinion to radiologists on their findings and thereby assist in better diagnosis of cancer and other diseases including TB. However, existing CAD methods for TB are based on the extraction of textural features from manually or semi-automatically segmented CXRs. These methods are prone to errors and cannot be implemented in X-ray machines for automated classification. Methods: Gabor, Gist, histogram of oriented gradients (HOG), and pyramid histogram of oriented gradients (PHOG) features extracted from the whole image can be implemented into existing X-ray machines to discriminate between TB and non-TB CXRs in an automated manner. Localized features were extracted for the above methods using various parameters, such as frequency range, blocks, and region of interest. The performance of these features was evaluated against textural features. Two digital CXR image datasets (8-bit DA and 14-bit DB) were used for evaluating the performance of these features. Results: Gist (accuracy 94.2% for DA, 86.0% for DB) and PHOG (accuracy 92.3% for DA, 92.0% for DB) features provided better results for both datasets. These features were implemented to develop a MATLAB toolbox, TB-Xpredict, which is freely available for academic use at http://sourceforge.net/projects/tbxpredict/. This toolbox provides both automated training and prediction modules and does not require expertise in image processing for operation. Conclusion: Since the features used in TB-Xpredict do not require segmentation, the toolbox can easily be implemented in X-ray machines. This toolbox can effectively be used for the mass screening of TB in high-burden areas with improved efficiency. PMID:25390291
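
    Because the features are computed on the whole image, the classification pipeline is short. A hedged sketch using scikit-image HOG descriptors and a support vector machine follows; the image size, HOG parameters, and classifier choice are illustrative, not the exact TB-Xpredict configuration.

        import numpy as np
        from skimage.feature import hog
        from skimage.transform import resize
        from sklearn.svm import SVC

        def cxr_features(image, size=(256, 256)):
            # Whole-image HOG descriptor; no lung segmentation required
            img = resize(image, size, anti_aliasing=True)
            return hog(img, orientations=9, pixels_per_cell=(32, 32),
                       cells_per_block=(2, 2))

        # X: list of CXR arrays, y: 1 = TB, 0 = normal (hypothetical training data)
        # clf = SVC(kernel="rbf").fit(np.stack([cxr_features(x) for x in X]), y)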

  20. A robotic system for automation of logistics functions on the Space Station

    NASA Technical Reports Server (NTRS)

    Martin, J. C.; Purves, R. B.; Hosier, R. N.; Krein, B. A.

    1988-01-01

    Spacecraft inventory management is currently performed by the crew, and as systems become more complex, increased crew time will be required to perform routine logistics activities. If future spacecraft are to function effectively as research labs and production facilities, crew time must be used efficiently as a limited resource for performing mission functions. The use of automation and robotics technology, such as automated warehouse and materials handling functions, can free the crew from many logistics tasks and provide more efficient use of crew time. Design criteria for a Space Station Automated Logistics Inventory Management System are explored through the design and demonstration of a mobile, two-armed terrestrial robot. The system functionally represents a zero-gravity automated inventory management system and the problems associated with operating in such an environment. Features of the system include automated storage and retrieval, item recognition, two-armed robotic manipulation, and software control of all inventory item transitions and queries.

  1. Exploring the impact of an automated prescription-filling device on community pharmacy technician workflow.

    PubMed

    Walsh, Kristin E; Chui, Michelle Anne; Kieser, Mara A; Williams, Staci M; Sutter, Susan L; Sutter, John G

    2011-01-01

    To explore community pharmacy technician workflow change after implementation of an automated robotic prescription-filling device. At an independent community pharmacy in rural Mayville, WI, pharmacy technicians were observed before and 3 months after installation of an automated robotic prescription-filling device. The main outcome measures were sequences and timing of technician workflow steps, workflow interruptions, automation surprises, and workarounds. Of the 77 and 80 observations made before and 3 months after robot installation, respectively, 17 different workflow sequences were observed before installation and 38 after installation. Average prescription filling time was reduced by 40 seconds per prescription with use of the robot. Workflow interruptions per observation increased from 1.49 to 1.79 (P = 0.11), and workarounds increased from 10% to 36% after robot use. Although automated prescription-filling devices can increase efficiency, workflow interruptions and workarounds may negate that efficiency. Assessing changes in workflow and sequencing of tasks that may result from the use of automation can help uncover opportunities for workflow policy and procedure redesign.

  2. Who watches the watchers?: preventing fault in a fault tolerance library

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stanavige, C. D.

    The Scalable Checkpoint/Restart library (SCR) was developed and is used by researchers at Lawrence Livermore National Laboratory to provide a fast and efficient method of saving and recovering large applications during runtime on high-performance computing (HPC) systems. Though SCR protects other programs, up until June 2017 nothing was actively protecting SCR. The goal of this project was to automate the building and testing of this library on the varying HPC architectures on which it is used. Our methods centered on the use of a continuous integration tool called Bamboo that allowed automation agents to be installed on the HPC systems themselves. These agents provided a new and unique way to automate and customize the allocation of resources and the running of tests with CMake's unit testing framework, CTest, as well as integration testing scripts through an HPC package manager called Spack. These methods provided a parallel environment in which to test the more complex features of SCR. As a result, SCR is now automatically built and tested on several HPC architectures any time developers change the library's source code. The results of these tests are then communicated back to the developers for immediate feedback, allowing them to fix SCR functionality that may have broken. Hours of developers' time are now saved from the tedious process of manual testing and debugging, which saves money and allows the SCR project team to focus their efforts on development. Thus, HPC system users can use SCR in conjunction with their own applications to efficiently and effectively checkpoint and restart as needed, with the assurance that SCR itself is functioning properly.

  3. A Practical Guide to Calibration of a GSSHA Hydrologic Model Using ERDC Automated Model Calibration Software - Efficient Local Search

    DTIC Science & Technology

    2012-02-01

    use the ERDC software implementation of the secant LM method that accommodates the PEST model-independent interface to calibrate a GSSHA...how the method works. We will also demonstrate how our LM/SLM implementation compares with its counterparts as implemented in the popular PEST ...function values and total model calls for local search to converge) associated with Examples 1 and 3 using the PEST LM/SLM implementations

  4. An Automated High-Throughput Metabolic Stability Assay Using an Integrated High-Resolution Accurate Mass Method and Automated Data Analysis Software.

    PubMed

    Shah, Pranav; Kerns, Edward; Nguyen, Dac-Trung; Obach, R Scott; Wang, Amy Q; Zakharov, Alexey; McKew, John; Simeonov, Anton; Hop, Cornelis E C A; Xu, Xin

    2016-10-01

    Advancement of in silico tools would be enabled by the availability of data for metabolic reaction rates and intrinsic clearance (CLint) of a structurally diverse compound set by specific metabolic enzymes. Our goal is to measure CLint for a large set of compounds with each major human cytochrome P450 (P450) isozyme. To achieve this goal, it is of utmost importance to develop an automated, robust, sensitive, high-throughput metabolic stability assay that can efficiently handle a large volume of compound sets. The substrate depletion method [in vitro half-life (t1/2) method] was chosen to determine CLint. The assay (384-well format) consisted of three parts: 1) a robotic system for incubation and sample cleanup; 2) two different integrated ultraperformance liquid chromatography/mass spectrometry (UPLC/MS) platforms to determine the percent remaining of parent compound; and 3) an automated data analysis system. The CYP3A4 assay was evaluated using two long-t1/2 compounds, carbamazepine and antipyrine (t1/2 > 30 minutes); one moderate-t1/2 compound, ketoconazole (10 < t1/2 < 30 minutes); and two short-t1/2 compounds, loperamide and buspirone (t1/2 < 10 minutes). Interday and intraday precision and accuracy of the assay were within the acceptable range (∼12%) for the linear range observed. Using this assay, CYP3A4 CLint and t1/2 values for more than 3000 compounds were measured. This high-throughput, automated, and robust assay allows for rapid metabolic stability screening of large compound sets and enables advanced computational modeling for individual human P450 isozymes. U.S. Government work not protected by U.S. copyright.
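
    The substrate depletion calculation itself is compact: regress ln(% remaining) on time, take t1/2 = ln 2 / k, and scale the rate constant by incubation volume per milligram of protein. The sketch below uses illustrative incubation conditions, not the assay's exact ones.

        import numpy as np

        def clint_from_depletion(t_min, pct_remaining, vol_ul=200.0, protein_mg=0.1):
            # Slope of ln(% remaining) vs time gives the depletion rate k (1/min)
            k = -np.polyfit(t_min, np.log(pct_remaining), 1)[0]
            t_half = np.log(2) / k               # in vitro half-life, minutes
            clint = k * vol_ul / protein_mg      # intrinsic clearance, uL/min/mg
            return t_half, clint

        # t_half, clint = clint_from_depletion([0, 5, 15, 30], [100, 80, 55, 30])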

  5. Toward high-throughput phenotyping: unbiased automated feature extraction and selection from knowledge sources.

    PubMed

    Yu, Sheng; Liao, Katherine P; Shaw, Stanley Y; Gainer, Vivian S; Churchill, Susanne E; Szolovits, Peter; Murphy, Shawn N; Kohane, Isaac S; Cai, Tianxi

    2015-09-01

    Analysis of narrative (text) data from electronic health records (EHRs) can improve population-scale phenotyping for clinical and genetic research. Currently, selection of text features for phenotyping algorithms is slow and laborious, requiring extensive and iterative involvement by domain experts. This paper introduces a method to develop phenotyping algorithms in an unbiased manner by automatically extracting and selecting informative features, which can be comparable to expert-curated ones in classification accuracy. Comprehensive medical concepts were collected from publicly available knowledge sources in an automated, unbiased fashion. Natural language processing (NLP) revealed the occurrence patterns of these concepts in EHR narrative notes, which enabled selection of informative features for phenotype classification. When combined with additional codified features, a penalized logistic regression model was trained to classify the target phenotype. We applied this method to develop algorithms that identify patients with rheumatoid arthritis (RA), and coronary artery disease (CAD) cases among those with RA, from a large multi-institutional EHR. The areas under the receiver operating characteristic curves (AUC) for classifying RA and CAD using models trained with automated features were 0.951 and 0.929, respectively, compared to AUCs of 0.938 and 0.929 for models trained with expert-curated features. Models trained with NLP text features selected through an unbiased, automated procedure achieved comparable or slightly higher accuracy than those trained with expert-curated features. The majority of the selected model features were interpretable. The proposed automated feature extraction method, generating highly accurate phenotyping algorithms with improved efficiency, is a significant step toward high-throughput phenotyping. © The Author 2015. Published by Oxford University Press on behalf of the American Medical Informatics Association. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
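
    The classification step described in the abstract, a penalized logistic regression over automatically extracted NLP concept features plus codified features, can be sketched in a few lines with scikit-learn; the feature matrix, labels, and penalty strength below are hypothetical.

        from sklearn.linear_model import LogisticRegression

        # X: patients x (NLP concept counts + codified features); y: phenotype labels.
        # An L1 penalty drives uninformative features to zero, mirroring the
        # automated feature-selection step (the paper's exact penalty may differ).
        model = LogisticRegression(penalty="l1", solver="liblinear", C=0.1)
        # model.fit(X, y)
        # selected = model.coef_.nonzero()[1]   # indices of retained features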

  6. Fully automated chest wall line segmentation in breast MRI by using context information

    NASA Astrophysics Data System (ADS)

    Wu, Shandong; Weinstein, Susan P.; Conant, Emily F.; Localio, A. Russell; Schnall, Mitchell D.; Kontos, Despina

    2012-03-01

    Breast MRI has emerged as an effective modality for the clinical management of breast cancer. Evidence suggests that computer-aided applications can further improve the diagnostic accuracy of breast MRI. A critical and challenging first step for automated breast MRI analysis is to separate the breast as an organ from the chest wall. Manual segmentation or user-assisted interactive tools are inefficient, tedious, and error-prone, making them prohibitively impractical for processing large amounts of data from clinical trials. To address this challenge, we developed a fully automated and robust computerized segmentation method that intensively utilizes context information of breast MR imaging and the breast tissue's morphological characteristics to accurately delineate the breast and chest wall boundary. A critical component is the joint application of anisotropic diffusion and bilateral image filtering to enhance the edge that corresponds to the chest wall line (CWL) and to reduce the effect of adjacent non-CWL tissues. A CWL voting algorithm is proposed based on CWL candidates yielded from multiple sequential MRI slices, in which a CWL representative is generated and used, through a dynamic time warping (DTW) algorithm, to filter out inferior candidates, leaving the optimal one. Our method is validated on a representative dataset of 20 3D unilateral breast MRI scans that span the full range of the American College of Radiology (ACR) Breast Imaging Reporting and Data System (BI-RADS) fibroglandular density categorization. A promising performance (average overlap percentage of 89.33%) is observed when the automated segmentation is compared to manually segmented ground truth obtained by an experienced breast imaging radiologist. The automated method runs time-efficiently at ~3 minutes for each breast MR image set (28 slices).
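
    The dynamic time warping comparison at the heart of the candidate filtering can be illustrated with the classic quadratic-time recursion; the absolute-difference cost below is an assumption, not necessarily the cost the authors used.

        import numpy as np

        def dtw_distance(a, b):
            # DTW between two 1-D sequences, e.g. a candidate CWL profile and
            # the voted representative
            n, m = len(a), len(b)
            D = np.full((n + 1, m + 1), np.inf)
            D[0, 0] = 0.0
            for i in range(1, n + 1):
                for j in range(1, m + 1):
                    cost = abs(a[i - 1] - b[j - 1])
                    # Extend the cheapest of match, insertion, or deletion
                    D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
            return D[n, m]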

  7. Development and evaluation of needle trap device geometry and packing methods for automated and manual analysis.

    PubMed

    Warren, Jamie M; Pawliszyn, Janusz

    2011-12-16

    For air/headspace analysis, needle trap devices (NTDs) are applicable for sampling a wide range of volatiles, such as benzene and alkanes, and semi-volatile particulate-bound compounds such as pyrene. This paper describes a new NTD that is simpler to produce and improves performance relative to previous NTD designs. An NTD utilizing a side-hole needle used a modified tip, which removed the need for epoxy glue to hold sorbent particles inside the NTD. This design also improved the seal between the NTD and the narrow-neck liner of the GC injector, thereby improving desorption efficiency. A new packing method using solvent to pack the device has been developed and evaluated, and is compared to NTDs prepared using the previous vacuum aspiration method. The slurry packing method reduced preparation time and improved reproducibility between NTDs. To evaluate the NTDs, automated headspace extraction was completed using benzene, toluene, ethylbenzene, p-xylene (BTEX), anthracene, and pyrene (PAH). NTD geometries evaluated include: blunt tip with side-hole needle, tapered tip with side-hole needle, slider tip with side-hole, dome tapered tip with side-hole, and blunt with no side-hole needle (expanded desorptive flow). Results demonstrate that the tapered and slider tip NTDs performed with improved desorption efficiency. Copyright © 2011 Elsevier B.V. All rights reserved.

  8. An automated in vitro model for the evaluation of ultrasound modalities measuring myocardial deformation

    PubMed Central

    2010-01-01

    Background: Echocardiography is the method of choice when one wishes to examine myocardial function. Qualitative assessment of the 2D grey scale images obtained is subjective, and objective methods are required. Speckle Tracking Ultrasound is an emerging technology offering an objective means of quantifying left ventricular wall motion. However, before a new ultrasound technology can be adopted in the clinic, its accuracy and reproducibility need to be investigated. Aim: It was hypothesized that the collection of ultrasound sample data from an in vitro model could be automated. The aim was to optimize an in vitro model to allow for efficient collection of sample data. Material & Methods: A tissue-mimicking phantom was made from water, gelatin powder, psyllium fibers, and a preservative. Sonomicrometry crystals were molded into the phantom. The solid phantom was mounted in a stable stand and cyclically compressed. Peak strain was then measured by Speckle Tracking Ultrasound and sonomicrometry. Results: We succeeded in automating the acquisition and analysis of sample data. Sample data were collected at a rate of 200 measurement pairs in 30 minutes. We found good agreement between Speckle Tracking Ultrasound and sonomicrometry in the in vitro model. Best agreement was 0.83 ± 0.70%. Worst agreement was -1.13 ± 6.46%. Conclusions: It has been shown possible to automate a model that can be used for evaluating the in vitro accuracy and precision of ultrasound modalities measuring deformation. Sonomicrometry and Speckle Tracking Ultrasound had acceptable agreement. PMID:20822532

  9. Image-based path planning for automated virtual colonoscopy navigation

    NASA Astrophysics Data System (ADS)

    Hong, Wei

    2008-03-01

    Virtual colonoscopy (VC) is a noninvasive method for colonic polyp screening that reconstructs three-dimensional models of the colon using computerized tomography (CT). In virtual colonoscopy fly-through navigation, it is crucial to generate an optimal camera path for efficient clinical examination. In conventional methods, the centerline of the colon lumen is usually used as the camera path. In order to extract the colon centerline, some time-consuming pre-processing algorithms must be performed before the fly-through navigation, such as colon segmentation, distance transformation, or topological thinning. In this paper, we present an efficient image-based path planning algorithm for automated virtual colonoscopy fly-through navigation that requires no pre-processing. Our algorithm only needs the physician to provide a seed point as the starting camera position using 2D axial CT images. A wide-angle fisheye camera model is used to generate a depth image from the current camera position. Two types of navigational landmarks, safe regions and target regions, are extracted from the depth images. The camera position and its corresponding view direction are then determined using these landmarks. The experimental results show that the generated paths are accurate and increase user comfort during fly-through navigation. Moreover, because of the efficiency of our path planning and rendering algorithms, our VC fly-through navigation system can still guarantee 30 FPS.
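
    A toy version of the landmark idea: treat the deepest part of the fisheye depth image as the target region and steer the camera toward its depth-weighted centroid. The depth threshold is an assumption for illustration, not the published algorithm's value.

        import numpy as np

        def next_view(depth_img, min_safe_depth=20.0):
            # Candidate "target region": pixels looking far down the lumen
            safe = depth_img >= min_safe_depth
            if not safe.any():
                return None                      # no navigable direction found
            ys, xs = np.nonzero(safe)
            w = depth_img[ys, xs]
            # Depth-weighted centroid of the target region (row, col)
            return (np.average(ys, weights=w), np.average(xs, weights=w))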

  10. A self-adapting system for the automated detection of inter-ictal epileptiform discharges.

    PubMed

    Lodder, Shaun S; van Putten, Michel J A M

    2014-01-01

    Scalp EEG remains the standard clinical procedure for the diagnosis of epilepsy. Manual detection of inter-ictal epileptiform discharges (IEDs) is slow and cumbersome, and few automated methods are used to assist in practice. This is mostly due to low sensitivities, high false positive rates, or a lack of trust in the automated method. In this study we aim to find a solution that makes computer-assisted detection more efficient than conventional methods, while preserving the detection certainty of a manual search. Our solution consists of two phases. First, a detection phase finds all events similar to epileptiform activity by using a large database of template waveforms. Individual template detections are combined to form "IED nominations", each with a corresponding certainty value based on the reliability of their contributing templates. The second phase takes the ten nominations with the highest certainty and presents them to the reviewer one by one for confirmation. Confirmations are used to update certainty values of the remaining nominations, and another iteration is performed in which the ten nominations with the highest certainty are presented. This continues until the reviewer is satisfied with what has been seen. Reviewer feedback is also used to update template accuracies globally and improve future detections. Using the described method and fifteen evaluation EEGs (241 IEDs), one third of all inter-ictal events were shown after one iteration, half after two iterations, and 74%, 90%, and 95% after 5, 10, and 15 iterations, respectively. Reviewing fifteen iterations for the 20-30 min recordings took approximately 5 min. The proposed method shows a practical approach for combining automated detection with visual searching for inter-ictal epileptiform activity. Further evaluation is needed to verify its clinical feasibility and measure the added value it presents.
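
    The feedback loop can be pictured as a certainty update over the remaining nominations, boosted when they share contributing templates with events the reviewer confirmed. The data structures and update rule below are assumptions for illustration, not the published scheme.

        from dataclasses import dataclass, field

        @dataclass
        class Nomination:
            certainty: float
            templates: set = field(default_factory=set)  # contributing template IDs

        def next_batch(nominations, confirmed_templates, alpha=0.2, batch=10):
            # Boost nominations whose templates were confirmed in earlier rounds
            for nom in nominations:
                overlap = nom.templates & confirmed_templates
                if overlap:
                    nom.certainty = min(1.0, nom.certainty +
                                        alpha * len(overlap) / len(nom.templates))
            # Present the highest-certainty nominations for the next review round
            nominations.sort(key=lambda n: n.certainty, reverse=True)
            return nominations[:batch]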

  11. Assessing the Efficiency of Phenotyping Early Traits in a Greenhouse Automated Platform for Predicting Drought Tolerance of Soybean in the Field.

    PubMed

    Peirone, Laura S; Pereyra Irujo, Gustavo A; Bolton, Alejandro; Erreguerena, Ignacio; Aguirrezábal, Luis A N

    2018-01-01

    Drought is the most important factor limiting yield at a global scale, and conventional field phenotyping for drought tolerance is labor-intensive and time-consuming. Automated greenhouse platforms can increase the precision and throughput of plant phenotyping and contribute to a faster release of drought-tolerant varieties. The aim of this work was to establish a framework of analysis to identify early traits that could be efficiently measured in a greenhouse automated phenotyping platform for predicting the drought tolerance of field-grown soybean genotypes. A group of genotypes that showed variation in their drought susceptibility index (DSI) for final biomass and leaf area was evaluated. A large number of traits were measured before and after the onset of a water deficit treatment and were analyzed under several criteria: the significance of the regression with the DSI, phenotyping cost, earliness, and repeatability. The most efficient trait was found to be transpiration efficiency measured at 13 days after emergence. This trait was further tested in a second experiment with different water deficit intensities, and validated against field data from a trial network using a different set of genotypes in a third experiment. The framework applied in this work for assessing traits under different criteria could be helpful for selecting those most efficient for automated phenotyping.

  12. Content Based Lecture Video Retrieval Using Speech and Video Text Information

    ERIC Educational Resources Information Center

    Yang, Haojin; Meinel, Christoph

    2014-01-01

    In the last decade e-lecturing has become more and more popular. The amount of lecture video data on the "World Wide Web" (WWW) is growing rapidly. Therefore, a more efficient method for video retrieval in WWW or within large lecture video archives is urgently needed. This paper presents an approach for automated video indexing and video…

  13. Research on the Impact of a Computerized Circulation System on the Performance of a Large College Library. Final Report.

    ERIC Educational Resources Information Center

    Frohmberg, Katherine A.; Moffett, William A.

    In order to study the effects of introducing an automated circulation system at Oberlin College, Ohio, data were collected from September 1978 until June 1982 on book availability, usage of library facilities, attitudes of library users toward the library, and the efficiency of circulation activities. Data collection methods included circulation…

  14. Hippocampal Structure and Human Cognition: Key Role of Spatial Processing and Evidence Supporting the Efficiency Hypothesis in Females

    ERIC Educational Resources Information Center

    Colom, Roberto; Stein, Jason L.; Rajagopalan, Priya; Martinez, Kenia; Hermel, David; Wang, Yalin; Alvarez-Linera, Juan; Burgaleta, Miguel; Quiroga, Ma. Angeles; Shih, Pei Chun; Thompson, Paul M.

    2013-01-01

    Here we apply a method for automated segmentation of the hippocampus in 3D high-resolution structural brain MRI scans. One hundred and four healthy young adults completed twenty one tasks measuring abstract, verbal, and spatial intelligence, along with working memory, executive control, attention, and processing speed. After permutation tests…

  15. The Impact of Learning Context on Intent to Use Marketing and Sales Technology: A Comparison of Scenario-Based and Task-Based Approaches

    ERIC Educational Resources Information Center

    Mallin, Michael L.; Jones, Deirdre E.; Cordell, Jennifer L.

    2010-01-01

    With firms focused on increasing efficiency and effectiveness in today's marketing and sales environment, it is crucial that salesforce training methods facilitate greater adoption of salesforce automation technology. Given the growth in sales education at colleges and universities, firms are looking to recruit their frontline marketing and sales…

  16. 3-D segmentation of articular cartilages by graph cuts using knee MR images from osteoarthritis initiative

    NASA Astrophysics Data System (ADS)

    Shim, Hackjoon; Lee, Soochan; Kim, Bohyeong; Tao, Cheng; Chang, Samuel; Yun, Il Dong; Lee, Sang Uk; Kwoh, Kent; Bae, Kyongtae

    2008-03-01

    Knee osteoarthritis is the most common debilitating health condition affecting the elderly population. MR imaging of the knee is highly sensitive for diagnosis and evaluation of the extent of knee osteoarthritis. Quantitative analysis of the progression of osteoarthritis is commonly based on segmentation and measurement of articular cartilage from knee MR images. Segmentation of the knee articular cartilage, however, is extremely laborious and technically demanding, because the cartilage has a complex geometry and is thin and small. To improve the precision and efficiency of cartilage segmentation, we have applied a semi-automated segmentation method based on an s/t graph cut algorithm. The cost function was defined by integrating regional and boundary cues. While regional cues can encode any intensity distributions of the two regions, "object" (cartilage) and "background" (the rest), boundary cues are based on the intensity differences between neighboring pixels. For three-dimensional (3-D) segmentation, hard constraints are also specified in a 3-D manner, facilitating user interaction. When our proposed semi-automated method was tested on clinical patients' MR images (160 slices, 0.7 mm slice thickness), a considerable amount of segmentation time was saved, with improved efficiency compared to a manual segmentation approach.
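
    A sketch of the regional-plus-boundary energy described above, using the PyMaxflow library as one possible s/t graph cut backend; the λ and σ values, the intensity models, and the single-direction boundary term are simplifying assumptions.

```python
import numpy as np
import maxflow  # PyMaxflow, one possible s/t graph cut backend

def segment_slice(img, obj_nll, bkg_nll, lam=1.0, sigma=10.0):
    """img: 2D MR slice; obj_nll/bkg_nll: per-pixel negative
    log-likelihoods under the 'cartilage' and 'background'
    intensity models (the regional cues)."""
    g = maxflow.Graph[float]()
    nodes = g.add_grid_nodes(img.shape)
    # Boundary cue: separating similar neighboring intensities is
    # expensive (simplified here to one neighbor direction).
    d = np.gradient(img.astype(float))[0]
    g.add_grid_edges(nodes, weights=lam * np.exp(-d**2 / (2 * sigma**2)))
    # Regional cues enter as terminal (s/t) capacities.
    g.add_grid_tedges(nodes, bkg_nll, obj_nll)
    g.maxflow()
    # Boolean mask; which side is 'cartilage' depends on the wiring.
    return g.get_grid_segments(nodes)
```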

  17. An algorithm for the detection and characterisation of volcanic plumes using thermal camera imagery

    NASA Astrophysics Data System (ADS)

    Bombrun, Maxime; Jessop, David; Harris, Andrew; Barra, Vincent

    2018-02-01

    Volcanic plumes are turbulent mixtures of particles and gas that are injected into the atmosphere during a volcanic eruption. Depending on the intensity of the eruption, plumes can rise from a few tens of metres up to many tens of kilometres above the vent and thus present a major hazard for the surrounding population. Currently, however, few if any algorithms are available for automated plume tracking and assessment. Here, we present a new image processing algorithm for segmentation, tracking and parameter extraction of convective plumes recorded with thermal cameras. We used thermal video of two volcanic eruptions and two plumes simulated in the laboratory to develop and test an efficient technique for the analysis of volcanic plumes. We validated our method with two different approaches. First, we compared our segmentation method to previously published algorithms. Next, we computed plume parameters, such as height, width and spreading angle, at regular intervals of time. These parameters allowed us to calculate an entrainment coefficient and obtain information about entrainment efficiency in Strombolian eruptions. Our proposed algorithm is rapid and automated, produces better visual outlines than the other segmentation algorithms, and provides output that is at least as accurate as manual measurements of plumes.
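
    A minimal sketch of the parameter-extraction step, assuming a binary plume mask per frame with the vent at the bottom edge; estimating the spreading angle from a linear fit of half-width against height is an illustrative simplification.

```python
import numpy as np

def plume_parameters(mask, px_per_m=1.0):
    """mask: binary plume segmentation of one frame (rows x cols),
    vent assumed at the bottom edge. Returns rise height, maximum
    width (scene units) and spreading half-angle (degrees)."""
    rows = np.nonzero(mask.any(axis=1))[0]
    height = (mask.shape[0] - rows.min()) / px_per_m
    widths = mask.sum(axis=1)[rows] / px_per_m       # width at each row
    heights = (mask.shape[0] - rows) / px_per_m      # distance above vent
    # Spreading angle from a linear fit of half-width vs. height; the
    # entrainment coefficient is commonly proportional to this slope.
    slope, _ = np.polyfit(heights, widths / 2.0, 1)
    return height, widths.max(), np.degrees(np.arctan(slope))
```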

  18. Automated quality control in a file-based broadcasting workflow

    NASA Astrophysics Data System (ADS)

    Zhang, Lina

    2014-04-01

    Benefiting from the development of information and internet technologies, television broadcasting is transforming from inefficient tape-based production and distribution to integrated file-based workflows. However, no matter how many changes have taken place, successful broadcasting still depends on the ability to deliver a consistent, high-quality signal to the audience. After the transition from tape to file, traditional methods of manual quality control (QC) become inadequate, subjective, and inefficient. Based on China Central Television's fully file-based workflow at its new site, this paper introduces an automated quality control test system for the accurate detection of hidden defects in media content. It discusses the system framework and workflow control when automated QC is added. It puts forward a QC criterion and presents QC software that follows this criterion. It also reports experiments on QC speed using parallel processing and distributed computing. The performance of the test system shows that the adoption of automated QC can make production effective and efficient, and help the station achieve a competitive advantage in the media market.

  19. Quantitative semi-automated analysis of morphogenesis with single-cell resolution in complex embryos.

    PubMed

    Giurumescu, Claudiu A; Kang, Sukryool; Planchon, Thomas A; Betzig, Eric; Bloomekatz, Joshua; Yelon, Deborah; Cosman, Pamela; Chisholm, Andrew D

    2012-11-01

    A quantitative understanding of tissue morphogenesis requires description of the movements of individual cells in space and over time. In transparent embryos, such as C. elegans, fluorescently labeled nuclei can be imaged in three-dimensional time-lapse (4D) movies and automatically tracked through early cleavage divisions up to ~350 nuclei. A similar analysis of later stages of C. elegans development has been challenging owing to the increased error rates of automated tracking of large numbers of densely packed nuclei. We present Nucleitracker4D, a freely available software solution for tracking nuclei in complex embryos that integrates automated tracking of nuclei in local searches with manual curation. Using these methods, we have been able to track >99% of all nuclei generated in the C. elegans embryo. Our analysis reveals that ventral enclosure of the epidermis is accompanied by complex coordinated migration of the neuronal substrate. We can efficiently track large numbers of migrating nuclei in 4D movies of zebrafish cardiac morphogenesis, suggesting that this approach is generally useful in situations in which the number, packing or dynamics of nuclei present challenges for automated tracking.
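
    Nucleitracker4D's exact algorithm is not reproduced here; the sketch below shows a generic nearest-neighbour, local-search linking step of the kind such trackers combine with manual curation. The search radius is a hypothetical parameter.

```python
import numpy as np
from scipy.spatial import cKDTree

def link_nuclei(prev_pts, next_pts, search_radius=5.0):
    """Greedy local-search linking of nuclei between consecutive time
    points; unmatched nuclei are left for manual curation."""
    tree = cKDTree(next_pts)
    dist, idx = tree.query(prev_pts, distance_upper_bound=search_radius)
    links, used = {}, set()
    for i in np.argsort(dist):               # closest pairs first
        if np.isfinite(dist[i]) and idx[i] not in used:
            links[i] = idx[i]                # prev index -> next index
            used.add(idx[i])
    return links
```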

  1. Laboratory automation: a challenge for the 1990s

    PubMed Central

    Mordini, Claude

    1994-01-01

    There is tremendous pressure on industry and laboratories to develop increasingly complex procucts: for example catalysts, chiral chemicals, drugs and ceramics; conform to regulations; cope with increasingly severe competition; and meet steadily increasing costs. It is difficult, in this situation, to remain productive and competitive. It is vital to be equipped with, and be able to use appropriately, all the suitable methodologies and technologies. Working methods and personnel have to be appropriate. The future depends on three interdependent domains: automation in the broadest sense of the word, instrumentation and information systems. The easy work has already been done. Between 1984 and 1990, it was a question of going from nothing to something; now, it is necessary to increase and optimize. Therefore, the crucial question is now: ‘how can we go quicker in experimentation and acquire more knowledge, while spending less money?’ One solution is to use all the aspects of automation (robotics, instrumentation, data). Successful laboratory automation depends.on: shortened time to market; improved efficiency/cost ratio; motivation/competence/ expertise; communication; and knowledge acquisition. This paper examines some of the major technological areas of application. PMID:18924998

  2. Automated acoustic analysis in detection of spontaneous swallows in Parkinson's disease.

    PubMed

    Golabbakhsh, Marzieh; Rajaei, Ali; Derakhshan, Mahmoud; Sadri, Saeed; Taheri, Masoud; Adibi, Peyman

    2014-10-01

    Acoustic monitoring of swallow frequency has become important because the frequency of spontaneous swallowing can be an index for dysphagia and related complications. In addition, it can be employed as an objective quantification of ingestive behavior. Commonly, swallowing complications are detected manually using videofluoroscopy recordings, which require expensive equipment and exposure to radiation. In this study, a noninvasive automated technique is proposed that uses breath and swallowing recordings obtained via a microphone located over the laryngopharynx. Nonlinear diffusion filters were used, in which a scale-space decomposition of the recorded sound at different levels extracts swallows from breath sounds and artifacts. This technique was compared to manual detection of swallows using acoustic signals on a sample of 34 subjects with Parkinson's disease. A speech language pathologist identified five subjects who showed aspiration during the videofluoroscopic swallowing study. The proposed automated method identified swallows with a sensitivity of 86.67%, a specificity of 77.50%, and an accuracy of 82.35%. These results indicate the validity of automated acoustic recognition of swallowing as a fast and efficient approach to objectively estimating spontaneous swallow frequency.
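
    The scale-space decomposition can be illustrated with a generic Perona-Malik-type nonlinear diffusion filter; the edge-stopping parameter, step count, and time step are illustrative, and the published filter may differ in detail. Swallows would then appear in the detail removed between a fine and a coarse scale level.

```python
import numpy as np

def diffuse(signal, kappa=0.1, steps=50, dt=0.2):
    """Perona-Malik-type nonlinear diffusion of a 1-D sound envelope;
    more steps = coarser scale. Sharp transients are preserved while
    smooth breath noise is diffused away."""
    u = signal.astype(float).copy()
    for _ in range(steps):
        grad = np.diff(u)                        # forward differences
        c = np.exp(-((grad / kappa) ** 2))       # edge-stopping function
        flux = c * grad
        u[1:-1] += dt * (flux[1:] - flux[:-1])   # divergence of the flux
    return u

# Detail between two scale levels, where swallow events would stand out:
# detail = diffuse(env_sound, steps=10) - diffuse(env_sound, steps=200)
```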

  3. Wireless energizing system for an automated implantable sensor

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Swain, Biswaranjan; Nayak, Praveen P.; Kar, Durga P.

    The wireless drive of an automated implantable electronic sensor has been explored for health monitoring applications. The proposed system comprises an automated biomedical sensing system that is energized through resonant inductive coupling. The implantable sensor unit is able to monitor body temperature and sends the corresponding telemetry data back wirelessly to the data recording unit. It has been observed that the wireless power delivery system is capable of energizing the automated biomedical implantable electronic sensor placed at a distance of 3 cm from the power transmitter, with an energy transfer efficiency of 26% at the operating resonant frequency of 562 kHz. The proposed method ensures real-time monitoring of different human body temperatures around the clock. The monitored temperature data have been compared with a calibrated temperature measurement system to ascertain the accuracy of the proposed system. The investigated technique can also be useful for monitoring other body parameters such as blood pressure, bladder pressure, and physiological signals of the patient in vivo using various implantable sensors.
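
    For orientation, the resonant operating point and the usual efficiency figure of merit of such an inductive link can be computed as below; the inductance, coupling coefficient, and quality factors are assumed values for illustration, not the authors' measured parameters.

```python
import math

# Resonant operating point of an LC tank: f0 = 1 / (2*pi*sqrt(L*C)).
L = 100e-6                                  # assumed coil inductance (H)
C = 1.0 / (L * (2 * math.pi * 562e3) ** 2)  # C that tunes L to 562 kHz
f0 = 1.0 / (2 * math.pi * math.sqrt(L * C))
print(f"resonant frequency: {f0 / 1e3:.0f} kHz")

# Maximum link efficiency of a resonant inductive link for coupling
# coefficient k and coil quality factors Q1, Q2 (standard figure of merit).
k, Q1, Q2 = 0.05, 100.0, 100.0              # assumed values
x = k * k * Q1 * Q2
print(f"max link efficiency: {x / (1 + math.sqrt(1 + x)) ** 2:.0%}")
```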

  4. Automated volume of interest delineation and rendering of cone beam CT images in interventional cardiology

    NASA Astrophysics Data System (ADS)

    Lorenz, Cristian; Schäfer, Dirk; Eshuis, Peter; Carroll, John; Grass, Michael

    2012-02-01

    Interventional C-arm systems allow the efficient acquisition of 3D cone beam CT images. They can be used for intervention planning, navigation, and outcome assessment. We present a fast and completely automated volume of interest (VOI) delineation for cardiac interventions, covering the whole visceral cavity including mediastinum and lungs but leaving out rib-cage and spine. The problem is addressed in a model-based approach. The procedure has been evaluated on 22 patient cases and achieves an average surface error below 2 mm. The method is able to cope with varying image intensities, varying truncations due to the limited reconstruction volume, and partially with heavy metal and motion artifacts.

  5. MIEC-SVM: automated pipeline for protein peptide/ligand interaction prediction.

    PubMed

    Li, Nan; Ainsworth, Richard I; Wu, Meixin; Ding, Bo; Wang, Wei

    2016-03-15

    MIEC-SVM is a structure-based method for predicting protein recognition specificity. Here, we present an automated MIEC-SVM pipeline providing an integrated and user-friendly workflow for construction and application of MIEC-SVM models. This pipeline can handle standard amino acids and those with post-translational modifications (PTMs), as well as small molecules. Moreover, multi-threading and support for Sun Grid Engine (SGE) are implemented to significantly boost computational efficiency. The program is available at http://wanglab.ucsd.edu/MIEC-SVM. Contact: wei-wang@ucsd.edu. Supplementary data are available at Bioinformatics online. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.

  6. Automation is an Effective Way to Improve Quality of Verification (Calibration) of Measuring Instruments

    NASA Astrophysics Data System (ADS)

    Golobokov, M.; Danilevich, S.

    2018-04-01

    In order to assess calibration reliability and automate such assessment, procedures for data collection and a simulation study of the thermal imager calibration procedure have been elaborated. The existing calibration techniques do not always provide high reliability. A new method for analyzing the existing calibration techniques and developing new, efficient ones has been suggested and tested. A type of software has been studied that generates instrument calibration reports automatically, monitors their proper configuration, processes measurement results, and assesses instrument validity. The use of such software reduces the man-hours spent on finalizing calibration data by a factor of 2 to 5 and eliminates a whole set of typical operator errors.

  7. Automated model-based quantitative analysis of phantoms with spherical inserts in FDG PET scans.

    PubMed

    Ulrich, Ethan J; Sunderland, John J; Smith, Brian J; Mohiuddin, Imran; Parkhurst, Jessica; Plichta, Kristin A; Buatti, John M; Beichel, Reinhard R

    2018-01-01

    Quality control plays an increasingly important role in quantitative PET imaging and is typically performed using phantoms. The purpose of this work was to develop and validate a fully automated analysis method for two common PET/CT quality assurance phantoms: the NEMA NU-2 IQ and the SNMMI/CTN oncology phantom. The algorithm was designed to utilize only the PET scan, to enable the analysis of phantoms with thin-walled inserts. We introduce a model-based method for automated analysis of phantoms with spherical inserts. Models are first constructed for each type of phantom to be analyzed. A robust insert detection algorithm uses the model to locate all inserts inside the phantom. First, candidate inserts are detected using a scale-space detection approach. Second, candidates are given an initial label using a score-based optimization algorithm. Third, a robust model fitting step aligns the phantom model to the initial labeling and fixes incorrect labels. Finally, the detected insert locations are refined and measurements are taken for each insert and several background regions. In addition, an approach for automated selection of NEMA and CTN phantom models is presented. The method was evaluated on a diverse set of 15 NEMA and 20 CTN phantom PET/CT scans. NEMA phantoms were filled with radioactive tracer solution at a 9.7:1 activity ratio over background, and CTN phantoms were filled at 4:1 and 2:1 activity ratios over background. For quantitative evaluation, an independent reference standard was generated by two experts using PET/CT scans of the phantoms. In addition, the automated approach was compared against manual analysis of the PET phantom scans by four experts, which represents the current clinical standard approach. The automated analysis method successfully detected and measured all inserts in all test phantom scans. It is a deterministic algorithm (zero variability), and the insert detection RMS error (i.e., bias) was 0.97, 1.12, and 1.48 mm for phantom activity ratios 9.7:1, 4:1, and 2:1, respectively. For all phantoms and at all contrast ratios, the average RMS error was significantly lower for the proposed automated method than for manual analysis of the phantom scans. The uptake measurements produced by the automated method showed high correlation with the independent reference standard (R² ≥ 0.9987). In addition, the average computing time for the automated method was 30.6 s, significantly lower (P ≪ 0.001) than manual analysis (mean: 247.8 s). The proposed automated approach was found to have less error when measured against the independent reference than the manual approach. It can be easily adapted to other phantoms with spherical inserts. In addition, it eliminates inter- and intraoperator variability in PET phantom analysis and is significantly more time efficient, and therefore represents a promising approach to facilitate and simplify PET standardization and harmonization efforts. © 2017 American Association of Physicists in Medicine.
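
    The scale-space candidate detection stage can be illustrated with a Laplacian-of-Gaussian blob detector from scikit-image; the sigma-to-diameter mapping, thresholds, and normalization below are assumptions for illustration, not the validated algorithm's parameters.

```python
import numpy as np
from skimage.feature import blob_log

def find_insert_candidates(volume, voxel_mm=2.0, min_d_mm=10.0, max_d_mm=37.0):
    """Scale-space (Laplacian-of-Gaussian) detection of candidate
    spherical inserts in a PET volume. For a 3-D blob, radius is
    roughly sqrt(3) * sigma, which fixes the sigma search range."""
    vol = volume / volume.max()                 # normalize uptake
    s = 2.0 * voxel_mm * np.sqrt(3.0)           # diameter-to-sigma factor
    blobs = blob_log(vol, min_sigma=min_d_mm / s, max_sigma=max_d_mm / s,
                     num_sigma=10, threshold=0.1)
    return blobs                                # rows: (z, y, x, sigma)
```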

  8. Automation in an Addiction Treatment Research Clinic: Computerized Contingency Management, Ecological Momentary Assessment, and a Protocol Workflow System

    PubMed Central

    Vahabzadeh, Massoud; Lin, Jia-Ling; Mezghanni, Mustapha; Epstein, David H.; Preston, Kenzie L.

    2009-01-01

    Issues: A challenge in treatment research is the necessity of adhering to protocol and regulatory strictures while maintaining flexibility to meet patients’ treatment needs and accommodate variations among protocols. Another challenge is the acquisition of large amounts of data in an occasionally hectic environment, along with provision of seamless methods for exporting, mining, and querying the data. Approach: We have automated several major functions of our outpatient treatment research clinic for studies in drug abuse and dependence. Here we describe three such specialized applications: the Automated Contingency Management (ACM) system for delivery of behavioral interventions, the Transactional Electronic Diary (TED) system for management of behavioral assessments, and the Protocol Workflow System (PWS) for computerized workflow automation and guidance of each participant’s daily clinic activities. These modules are integrated into our larger information system to enable data sharing in real time among authorized staff. Key Findings: ACM and TED have each permitted us to conduct research that was not previously possible. In addition, the time to data analysis at the end of each study is substantially shorter. With the implementation of the PWS, we have been able to manage a research clinic with an 80-patient capacity having an annual average of 18,000 patient-visits and 7,300 urine collections with a research staff of five. Finally, automated data management has considerably enhanced our ability to monitor and summarize participant-safety data for research oversight. Implications and conclusion: When developed in consultation with end users, automation in treatment-research clinics can enable more efficient operations, better communication among staff, and expansions in research methods. PMID:19320669

  9. Development and clinical introduction of automated radiotherapy treatment planning for prostate cancer

    NASA Astrophysics Data System (ADS)

    Winkel, D.; Bol, G. H.; van Asselen, B.; Hes, J.; Scholten, V.; Kerkmeijer, L. G. W.; Raaymakers, B. W.

    2016-12-01

    To develop an automated radiotherapy treatment planning and optimization workflow to efficiently create patient-specifically optimized clinical grade treatment plans for prostate cancer and to implement it in clinical practice. A two-phased planning and optimization workflow was developed to automatically generate 77 Gy 5-field simultaneously integrated boost intensity modulated radiation therapy (SIB-IMRT) plans for prostate cancer treatment. A retrospective planning study (n = 100) was performed in which automatically and manually generated treatment plans were compared. A clinical pilot (n = 21) was performed to investigate the usability of our method. Operator time for the planning process was reduced to <5 min. The retrospective planning study showed that 98 plans met all clinical constraints. Significant improvements were made in the volume receiving 72 Gy (V72Gy) for the bladder and rectum and the mean dose of the bladder and the body. A reduced plan variance was observed. During the clinical pilot, 20 automatically generated plans met all constraints and 17 plans were selected for treatment. The automated radiotherapy treatment planning and optimization workflow is capable of efficiently generating patient-specifically optimized and improved clinical grade plans. It has now been adopted as the standard workflow in our clinic for generating prostate cancer treatment plans.

  10. Methods for detection of ataxia telangiectasia mutations

    DOEpatents

    Gatti, Richard A.

    2005-10-04

    The present invention is directed to a method of screening large, complex, polyexonic eukaryotic genes such as the ATM gene for mutations and polymorphisms by an improved version of single strand conformation polymorphism (SSCP) electrophoresis that allows electrophoresis of two or three amplified segments in a single lane. The present invention also is directed to new mutations and polymorphisms in the ATM gene that are useful in performing more accurate screening of human DNA samples for mutations and in distinguishing mutations from polymorphisms, thereby improving the efficiency of automated screening methods.

  11. Modified SSCP method using sequential electrophoresis of multiple nucleic acid segments

    DOEpatents

    Gatti, Richard A.

    2002-10-01

    The present invention is directed to a method of screening large, complex, polyexonic eukaryotic genes such as the ATM gene for mutations and polymorphisms by an improved version of single strand conformation polymorphism (SSCP) electrophoresis that allows electrophoresis of two or three amplified segments in a single lane. The present invention also is directed to new mutations and polymorphisms in the ATM gene that are useful in performing more accurate screening of human DNA samples for mutations and in distinguishing mutations from polymorphisms, thereby improving the efficiency of automated screening methods.

  12. Human-Automation Allocations for Current Robotic Space Operations

    NASA Technical Reports Server (NTRS)

    Marquez, Jessica J.; Chang, Mai L.; Beard, Bettina L.; Kim, Yun Kyung; Karasinski, John A.

    2018-01-01

    Within the Human Research Program, one risk delineates the uncertainty surrounding crew working with automation and robotics in spaceflight. The Risk of Inadequate Design of Human and Automation/Robotic Integration (HARI) is concerned with the detrimental effects on crew performance of ineffective user interfaces, system designs, and/or functional task allocation, potentially compromising mission success and safety. Risk arises because we have limited experience with complex automation and robotics. One key gap within HARI is the gap related to functional allocation. The gap states: we need to evaluate, develop, and validate methods and guidelines for identifying human-automation/robot task information needs, function allocation, and team composition for future long duration, long distance space missions. Allocations determine human-system performance, as they identify the functions and performance levels required of the automation/robotic system and, in turn, the work the crew is expected to perform and the necessary human performance requirements. Allocations must take into account the capabilities and limitations of each of the human, automation, and robotic systems. Some functions may be intuitively assigned to the human versus the robot, but to optimize efficiency and effectiveness, purposeful role assignments will be required. The role of automation and robotics will change significantly in future exploration missions, particularly as crews become more autonomous from ground controllers. Thus, we must understand the suitability of existing function allocation methods within NASA as well as the existing allocations established by the few robotic systems that are operational in spaceflight. In order to evaluate future methods of robotic allocation, we must first benchmark the allocations and allocation methods that have been used. We present 1) documentation of human-automation-robotic allocations in existing, operational spaceflight systems; and 2) lessons learned and best practices in these role assignments, gathered from the spaceflight operational experience of crew and ground teams, that may be used to guide development of future systems. NASA and other space agencies have operational spaceflight experience with two key Human-Automation-Robotic (HAR) systems: heavy-lift robotic arms and planetary robotic explorers. Additionally, NASA has invested in high-fidelity rover systems that can carry crew, building beyond Apollo's lunar rover. The heavy-lift robotic arms reviewed are the Space Station Remote Manipulator System (SSRMS), the Japanese Remote Manipulator System (JEMRMS), and the European Robotic Arm (ERA, designed but not deployed in space). The robotic rover systems reviewed are the Mars Exploration Rovers, the Mars Science Laboratory rover, and the high-fidelity K10 rovers. Much of the design and operational feedback for these systems has been communicated to flight controllers and robotic design teams. As part of mitigating the HARI risk for future human spaceflight operations, we must document function allocations between robots and humans that have worked well in practice.

  13. In Vitro Mass Propagation of Cymbopogon citratus Stapf., a Medicinal Gramineae.

    PubMed

    Quiala, Elisa; Barbón, Raúl; Capote, Alina; Pérez, Naivy; Jiménez, Elio

    2016-01-01

    Cymbopogon citratus (D.C.) Stapf. is a medicinal plant and the source of lemon grass oils with multiple uses in the pharmaceutical and food industries. Conventional propagation in semisolid culture medium has become a fast tool for mass propagation of lemon grass, but production costs must be reduced. A solution could be the application of in vitro propagation methods that exploit the advantages of liquid culture and automation. This chapter provides two efficient protocols for in vitro propagation of this medicinal plant, via organogenesis and somatic embryogenesis. First, we report the production of shoots using a temporary immersion system (TIS). Second, a protocol for somatic embryogenesis is described that uses semisolid culture for callus formation and multiplication, and liquid culture in a rotary shaker and conventional bioreactors for the maintenance of embryogenic cultures. Well-developed plants can be achieved with both protocols. Here we provide a fast and efficient technology for mass propagation of this medicinal plant that takes advantage of liquid culture and automation.

  14. Semi-automated solid phase extraction method for the mass spectrometric quantification of 12 specific metabolites of organophosphorus pesticides, synthetic pyrethroids, and select herbicides in human urine.

    PubMed

    Davis, Mark D; Wade, Erin L; Restrepo, Paula R; Roman-Esteva, William; Bravo, Roberto; Kuklenyik, Peter; Calafat, Antonia M

    2013-06-15

    Organophosphate and pyrethroid insecticides and phenoxyacetic acid herbicides represent important classes of pesticides applied in commercial and residential settings. Interest in assessing the extent of human exposure to these pesticides exists because of their widespread use and their potential adverse health effects. An analytical method for measuring 12 biomarkers of several of these pesticides in urine has been developed. The target analytes were extracted from one milliliter of urine by a semi-automated solid phase extraction technique, separated from each other and from other urinary biomolecules by reversed-phase high performance liquid chromatography, and detected using tandem mass spectrometry with isotope dilution quantitation. This method can be used to measure all the target analytes in one injection, with repeatability and detection limits similar to those of previous methods that required more than one injection. Each step of the procedure was optimized to produce a robust, reproducible, accurate, precise and efficient method. The required selectivity and sensitivity for trace-level analysis (e.g., limits of detection below 0.5 ng/mL) was achieved using a narrow diameter analytical column, higher than unit mass resolution for certain analytes, and stable isotope labeled internal standards. The method was applied to the analysis of 55 samples collected from adult anonymous donors with no known exposure to the target pesticides. This efficient and cost-effective method is adequate to handle the large number of samples required for national biomonitoring surveys. Published by Elsevier B.V.

  15. Improving treatment plan evaluation with automation

    PubMed Central

    Covington, Elizabeth L.; Chen, Xiaoping; Younge, Kelly C.; Lee, Choonik; Matuszak, Martha M.; Kessler, Marc L.; Keranen, Wayne; Acosta, Eduardo; Dougherty, Ashley M.; Filpansick, Stephanie E.

    2016-01-01

    The goal of this work is to evaluate the effectiveness of the Plan-Checker Tool (PCT), which was created to improve first-time plan quality, reduce patient delays, increase the efficiency of our electronic workflow, and standardize and automate the physics plan review in the treatment planning system (TPS). PCT uses an application programming interface to check and compare data from the TPS and treatment management system (TMS). PCT includes a comprehensive checklist of automated and manual checks that are documented when performed by the user as part of a plan readiness check for treatment. Prior to and during PCT development, errors identified during the physics review and causes of patient treatment start delays were tracked to prioritize which checks should be automated. Nineteen of 33 checklist items were automated, with data extracted by PCT. There was a 60% reduction in the number of patient delays in the six months after PCT release. PCT was successfully implemented for use on all external beam treatment plans in our clinic. While the number of errors found during the physics check did not decrease, automation of checks increased the visibility of errors during the physics check, which led to decreased patient delays. The methods used here can be applied to any TMS and TPS that allows queries of the database. PACS number(s): 87.55.-x, 87.55.N-, 87.55.Qr, 87.55.tm, 89.20.Bb PMID:27929478

  16. Impact of pharmacy automation on patient waiting time: an application of computer simulation.

    PubMed

    Tan, Woan Shin; Chua, Siang Li; Yong, Keng Woh; Wu, Tuck Seng

    2009-06-01

    This paper aims to illustrate the use of computer simulation in evaluating the impact of a prototype automated dispensing system on waiting time in an outpatient pharmacy, and its potential as a routine tool in pharmacy management. A discrete event simulation model was developed to investigate the impact of a prototype automated dispensing system on operational efficiency and service standards in an outpatient pharmacy. The simulation results suggest that automating the prescription-filling function using a prototype that picks and packs at 20 seconds per item will not help the pharmacy achieve the waiting time target of 30 minutes for all patients. Regardless of the state of automation, meeting the waiting time target requires 2 additional pharmacists to overcome the process bottleneck at the point of medication dispensing. However, if automated dispensing is the preferred option, the speed of the system needs to be twice that of the current configuration to bring the 95th percentile patient waiting time below 30 minutes. The faster processing speed would concomitantly allow the pharmacy to reduce the number of pharmacy technicians from 11 to 8. Simulation was found to be a useful and low-cost method that allows an otherwise expensive and resource-intensive evaluation of new work processes and technology to be completed within a short time.
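
    A toy discrete-event model of the dispensing queue, in the spirit of the study, using the SimPy library; the arrival rate, service times, item counts, and staffing are invented parameters, not the paper's calibrated inputs.

```python
import random
import simpy

def patient(env, pharmacists, robot_s_per_item, waits):
    """One patient: automated pick-and-pack, then a pharmacist check."""
    t0 = env.now
    n_items = random.randint(1, 5)
    yield env.timeout(n_items * robot_s_per_item)       # robot fills items
    with pharmacists.request() as req:                  # dispensing counter
        yield req
        yield env.timeout(random.uniform(120, 240))     # check/counsel (s)
    waits.append(env.now - t0)

def wait_p95_min(robot_s_per_item=20, n_pharmacists=2, n_patients=200):
    random.seed(1)
    env = simpy.Environment()
    pharmacists = simpy.Resource(env, capacity=n_pharmacists)
    waits = []

    def arrivals():
        for _ in range(n_patients):
            env.process(patient(env, pharmacists, robot_s_per_item, waits))
            yield env.timeout(random.expovariate(1 / 90))  # ~1 per 90 s

    env.process(arrivals())
    env.run()
    return sorted(waits)[int(0.95 * len(waits))] / 60   # 95th pct, minutes
```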

  17. A Practical Guide to Calibration of a GSSHA Hydrologic Model Using ERDC Automated Model Calibration Software - Effective and Efficient Stochastic Global Optimization

    DTIC Science & Technology

    2012-02-01

    parameter estimation method, but rather to carefully describe how to use the ERDC software implementation of MLSL that accommodates the PEST model... model independent LM method based parameter estimation software PEST (Doherty, 2004, 2007a, 2007b), which quantifies model to measurement misfit... et al. (2011) focused on one drawback associated with LM-based model independent parameter estimation as implemented in PEST; viz., that it requires

  18. Automation of [18F]fluoroacetaldehyde synthesis: application to a recombinant human interleukin-1 receptor antagonist (rhIL-1RA).

    PubMed

    Morris, Olivia; McMahon, Adam; Boutin, Herve; Grigg, Julian; Prenant, Christian

    2016-06-15

    [18F]Fluoroacetaldehyde is a biocompatible prosthetic group that has been implemented pre-clinically using a semi-automated, remotely controlled system. Automation of radiosyntheses permits use of higher levels of [18F]fluoride whilst minimising radiochemist exposure and enhancing reproducibility. In order to achieve fully automated [18F]fluoroacetaldehyde peptide radiolabelling, a customised GE Tracerlab FX-FN with a fully programmed automated synthesis was developed. The automated synthesis of [18F]fluoroacetaldehyde is carried out using a commercially available precursor, with reproducible yields of 26% ± 3 (decay-corrected, n = 10) within 45 min. Fully automated radiolabelling of a protein, recombinant human interleukin-1 receptor antagonist (rhIL-1RA), with [18F]fluoroacetaldehyde was achieved within 2 h. Radiolabelling efficiency of rhIL-1RA with [18F]fluoroacetaldehyde was confirmed using HPLC and reached 20% ± 10 (n = 5). Overall RCY of [18F]rhIL-1RA was 5% ± 2 (decay-corrected, n = 5) within 2 h starting from 35-40 GBq of [18F]fluoride. Specific activity measurements of 8.11-13.5 GBq/µmol were attained (n = 5), a near three-fold improvement over those achieved using the semi-automated approach. The strategy can be applied to radiolabelling a range of peptides and proteins with [18F]fluoroacetaldehyde analogous to other aldehyde-bearing prosthetic groups, yet automation of the method provides reproducibility, thereby aiding translation to Good Manufacturing Practice manufacture and the transition from pre-clinical to clinical production. Copyright © 2016 The Authors. Journal of Labelled Compounds and Radiopharmaceuticals published by John Wiley & Sons, Ltd.

  19. Boundary element analysis of post-tensioned slabs

    NASA Astrophysics Data System (ADS)

    Rashed, Youssef F.

    2015-06-01

    In this paper, the boundary element method is applied to the structural analysis of post-tensioned flat slabs. The shear-deformable plate-bending model is employed. The effect of the pre-stressing cables is taken into account via the equivalent load method. The formulation is automated using a computer program that employs quadratic boundary elements. Verification examples are presented, and finally a practical application is analyzed, with results compared against those obtained from the finite element method. The proposed method is efficient in terms of computer storage and processing time, as well as ease of data input and modification.

  20. Classification methods to detect sleep apnea in adults based on respiratory and oximetry signals: a systematic review.

    PubMed

    Uddin, M B; Chow, C M; Su, S W

    2018-03-26

    Sleep apnea (SA), a common sleep disorder, can significantly decrease the quality of life, and is closely associated with major health risks such as cardiovascular disease, sudden death, depression, and hypertension. The normal diagnostic process of SA using polysomnography is costly and time consuming. In addition, the accuracy of different classification methods to detect SA varies with the use of different physiological signals. If an effective, reliable, and accurate classification method is developed, then the diagnosis of SA and its associated treatment will be time-efficient and economical. This study aims to systematically review the literature and present an overview of classification methods to detect SA using respiratory and oximetry signals and address the automated detection approach. Sixty-two included studies revealed the application of single and multiple signals (respiratory and oximetry) for the diagnosis of SA. Both airflow and oxygen saturation signals alone were effective in detecting SA in the case of binary decision-making, whereas multiple signals were good for multi-class detection. In addition, some machine learning methods were superior to the other classification methods for SA detection using respiratory and oximetry signals. To deal with the respiratory and oximetry signals, a good choice of classification method as well as the consideration of associated factors would result in high accuracy in the detection of SA. An accurate classification method should provide a high detection rate with an automated (independent of human action) analysis of respiratory and oximetry signals. Future high-quality automated studies using large samples of data from multiple patient groups or record batches are recommended.
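
    As a generic illustration of the machine learning classifiers the review surveys (not the method of any particular study), a per-epoch oximetry classifier might be sketched as follows; the features, the 3% desaturation rule, and the labels are assumptions.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def spo2_features(spo2_epoch):
    """Simple per-epoch oximetry features: desaturation count (a crude
    ODI surrogate), mean and minimum SpO2, and signal variability."""
    drops = np.sum(np.diff(spo2_epoch) <= -3)   # >=3% sample-to-sample drops
    return [drops, spo2_epoch.mean(), spo2_epoch.min(), spo2_epoch.std()]

def train(epochs, labels):
    """epochs: list of SpO2 arrays; labels: 1 = apnoeic epoch (from PSG)."""
    X = np.array([spo2_features(e) for e in epochs])
    clf = RandomForestClassifier(n_estimators=200, random_state=0)
    return clf.fit(X, np.asarray(labels))
```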

  1. Automated Weight-Window Generation for Threat Detection Applications Using ADVANTG

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mosher, Scott W; Miller, Thomas Martin; Evans, Thomas M

    2009-01-01

    Deterministic transport codes have been used for some time to generate weight-window parameters that can improve the efficiency of Monte Carlo simulations. As the use of this hybrid computational technique is becoming more widespread, the scope of applications in which it is being applied is expanding. An active source of new applications is the field of homeland security--particularly the detection of nuclear material threats. For these problems, automated hybrid methods offer an efficient alternative to trial-and-error variance reduction techniques (e.g., geometry splitting or the stochastic weight window generator). The ADVANTG code has been developed to automate the generation of weight-window parameters for MCNP using the Consistent Adjoint Driven Importance Sampling method and employs the TORT or Denovo 3-D discrete ordinates codes to generate importance maps. In this paper, we describe the application of ADVANTG to a set of threat-detection simulations. We present numerical results for an 'active-interrogation' problem in which a standard cargo container is irradiated by a deuterium-tritium fusion neutron generator. We also present results for two passive detection problems in which a cargo container holding a shielded neutron or gamma source is placed near a portal monitor. For the passive detection problems, ADVANTG obtains an O(10^4) speedup and, for a detailed gamma spectrum tally, an average O(10^2) speedup relative to implicit-capture-only simulations, including the deterministic calculation time. For the active-interrogation problem, an O(10^4) speedup is obtained when compared to a simulation with angular source biasing and crude geometry splitting.
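
    For context, CADIS-style schemes set target particle weights inversely proportional to the adjoint (importance) flux and derive the weight-window bounds from them. A minimal sketch, with the window-width ratio and normalization as assumed, textbook-style choices rather than ADVANTG's exact implementation:

```python
import numpy as np

def weight_windows(adjoint_flux, source, ratio=5.0):
    """CADIS-style weight-window lower/upper bounds from an adjoint
    importance map, normalised so that source particles are born
    inside their window."""
    # Response estimate from the forward source and adjoint flux.
    R = float(np.sum(source * adjoint_flux))
    # Target weights: inversely proportional to importance.
    w_target = R / np.clip(adjoint_flux, 1e-30, None)
    # Center the window on the target weight; upper = ratio * lower.
    w_lower = w_target * 2.0 / (ratio + 1.0)
    return w_lower, ratio * w_lower
```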

  2. Automation for Accommodating Fuel-Efficient Descents in Constrained Airspace

    NASA Technical Reports Server (NTRS)

    Coopenbarger, Richard A.

    2010-01-01

    Continuous descents at low engine power are desired to reduce fuel consumption, emissions, and noise during arrival operations. The challenge is to allow airplanes to fly these efficient descents without interruption during busy traffic conditions. During busy conditions today, airplanes are commonly forced to fly inefficient step-down descents as air traffic controllers work to ensure separation and maximize throughput. NASA, in collaboration with government and industry partners, is developing new automation to help controllers accommodate continuous descents in the presence of complex traffic and airspace constraints. This automation relies on accurate trajectory predictions to compute strategic maneuver advisories. The talk will describe the concept behind this new automation and provide an overview of the simulations and flight testing used to develop and refine its underlying technology.

  3. CLIPS: A tool for corn disease diagnostic system and an aid to neural network for automated knowledge acquisition

    NASA Technical Reports Server (NTRS)

    Wu, Cathy; Taylor, Pam; Whitson, George; Smith, Cathy

    1990-01-01

    This paper describes the building of a corn disease diagnostic expert system using CLIPS, and the development of a neural expert system using the fact representation method of CLIPS for automated knowledge acquisition. The CLIPS corn expert system diagnoses 21 diseases from 52 symptoms and signs, with certainty factors. CLIPS has several unique features: it allows the facts in rules to be broken down into object-attribute-value (OAV) triples, allows rule grouping, and fires rules based on pattern matching. These features, combined with the chained inference engine, result in a natural user query system and speedy execution. In order to develop a method for automated knowledge acquisition, an Artificial Neural Expert System (ANES) was developed by a direct mapping from the CLIPS system. The ANES corn expert system uses the same OAV triples as the CLIPS system for its facts. The LHS and RHS facts of the CLIPS rules are mapped into the input and output layers of the ANES, respectively, and the inference engine of the rules is embedded in the hidden layer. The fact representation by OAV triples gives a natural grouping of the rules. These features allow the ANES system to automate rule generation, and make it efficient to execute and easy to expand for a large and complex domain.

  4. Evaluation Of A Powder-Free DNA Extraction Method For Skeletal Remains.

    PubMed

    Harrel, Michelle; Mayes, Carrie; Gangitano, David; Hughes-Stamm, Sheree

    2018-02-07

    Bones are often recovered in forensic investigations, including missing persons and mass disasters. While traditional DNA extraction methods rely on grinding bone into powder prior to DNA purification, the TBone Ex buffer (DNA Chip Research Inc.) digests bone chips without powdering. In this study, six bones were extracted using the TBone Ex kit in conjunction with the PrepFiler ® BTA™ DNA extraction kit (Thermo Fisher Scientific) both manually and via an automated platform. Comparable amounts of DNA were recovered from a 50 mg bone chip using the TBone Ex kit and 50 mg of powdered bone with the PrepFiler ® BTA™ kit. However, automated DNA purification decreased DNA yield (p < 0.05). Nevertheless, short tandem repeat (STR) success was comparable across all methods tested. This study demonstrates that digestion of whole bone fragments is an efficient alternative to powdering bones for DNA extraction without compromising downstream STR profile quality. © 2018 American Academy of Forensic Sciences.

  5. Using a geographic information system and scanning technology to create high-resolution land-use data sets

    USGS Publications Warehouse

    Harvey, Craig A.; Kolpin, Dana W.; Battaglin, William A.

    1996-01-01

    A geographic information system (GIS) procedure was developed to compile low-altitude aerial photography, digitized data, and land-use data from U.S. Department of Agriculture Consolidated Farm Service Agency (CFSA) offices into a high-resolution (approximately 5 meters) land-use GIS data set. The aerial photography consisted of 35-mm slides that were scanned into tagged image file format (TIFF) images. These TIFF images were then imported into the GIS, where they were registered to a geographically referenced coordinate system. Boundaries between land uses were delineated from these GIS data sets using on-screen digitizing techniques. Crop types were determined using information obtained from the U.S. Department of Agriculture CFSA offices. Crop information not supplied by the CFSA was attributed by manual classification procedures. Automated methods for delineating field boundaries and classifying land use were investigated; with these data sources, the automated methods proved less efficient and accurate than manual methods.

  6. Comparative evaluation of three automated systems for DNA extraction in conjunction with three commercially available real-time PCR assays for quantitation of plasma Cytomegalovirus DNAemia in allogeneic stem cell transplant recipients.

    PubMed

    Bravo, Dayana; Clari, María Ángeles; Costa, Elisa; Muñoz-Cobo, Beatriz; Solano, Carlos; José Remigia, María; Navarro, David

    2011-08-01

    Limited data are available on the performance of different automated extraction platforms and commercially available quantitative real-time PCR (QRT-PCR) methods for the quantitation of cytomegalovirus (CMV) DNA in plasma. We compared the performance characteristics of the Abbott mSample preparation system DNA kit on the m24 SP instrument (Abbott), the High Pure viral nucleic acid kit on the COBAS AmpliPrep system (Roche), and the EZ1 Virus 2.0 kit on the BioRobot EZ1 extraction platform (Qiagen), coupled with the Abbott CMV PCR kit, the LightCycler CMV Quant kit (Roche), and the Q-CMV complete kit (Nanogen), for both plasma specimens from allogeneic stem cell transplant (Allo-SCT) recipients (n = 42) and the OptiQuant CMV DNA panel (AcroMetrix). The EZ1 system displayed the highest extraction efficiency over a wide range of CMV plasma DNA loads, followed by the m24 and the AmpliPrep methods. The Nanogen PCR assay yielded higher mean CMV plasma DNA values than the Abbott and Roche PCR assays, regardless of the platform used for DNA extraction. Overall, the effects of the extraction method and the QRT-PCR used on CMV plasma DNA load measurements were less pronounced for specimens with high CMV DNA content (>10,000 copies/ml). The performance characteristics of the extraction methods and QRT-PCR assays observed for clinical samples extended to the cell-based standards from AcroMetrix. In conclusion, different automated systems are not equally efficient for CMV DNA extraction from plasma specimens, and the plasma CMV DNA loads measured by commercially available QRT-PCRs can differ significantly. These findings should be taken into consideration when establishing cutoff values for the initiation or cessation of preemptive antiviral therapies and when interpreting data from clinical studies in the Allo-SCT setting.

  7. LAMBADA and InflateGRO2: efficient membrane alignment and insertion of membrane proteins for molecular dynamics simulations.

    PubMed

    Schmidt, Thomas H; Kandt, Christian

    2012-10-22

    At the beginning of each molecular dynamics membrane simulation stands the generation of a suitable starting structure, which includes the working steps of aligning membrane and protein and seamlessly accommodating the protein in the membrane. Here we introduce two efficient and complementary methods based on pre-equilibrated membrane patches that automate these steps. Using a voxel-based cast of the coarse-grained protein, LAMBADA computes a hydrophilicity-profile-derived scoring function based on which the optimal rotation and translation operations are determined to align protein and membrane. Employing an entirely geometrical approach, LAMBADA is independent of any precalculated data and aligns even large membrane proteins within minutes on a regular workstation. LAMBADA is the first tool to perform the entire alignment process automatically while providing the user with the explicit 3D coordinates of the aligned protein and membrane. The second tool is an extension of the InflateGRO method that addresses the shortcomings of its predecessor in a fully automated workflow. Determining the exact number of overlapping lipids based on the area occupied by the protein, and restricting expansion, compression and energy minimization steps to a subset of relevant lipids through automatically calculated, system-optimized operation parameters, InflateGRO2 yields optimal lipid packing and reduces lipid vacuum exposure to a minimum, preserving as much of the equilibrated membrane structure as possible. Applicable to atomistic and coarse-grained structures in MARTINI format, InflateGRO2 offers high accuracy, fast performance, and increased application flexibility, permitting the easy preparation of systems with heterogeneous lipid composition as well as embedding proteins into multiple membranes. Both tools can be used separately, in combination with other methods, or in tandem, permitting a fully automated workflow while retaining a maximum level of usage control and flexibility. To assess the performance of both methods, we carried out test runs using 22 membrane proteins of different size and transmembrane structure.

  8. Oxygen-controlled automated neural differentiation of mouse embryonic stem cells.

    PubMed

    Mondragon-Teran, Paul; Tostoes, Rui; Mason, Chris; Lye, Gary J; Veraitch, Farlan S

    2013-03-01

    Automation and oxygen tension control are two tools that provide significant improvements to the reproducibility and efficiency of stem cell production processes. The aim of this study was to establish a novel automation platform capable of controlling oxygen tension during both the cell-culture and liquid-handling steps of neural differentiation processes. We built a bespoke automation platform that enclosed a liquid-handling platform in a sterile, oxygen-controlled environment. An airtight connection was used to transfer cell culture plates to and from an automated oxygen-controlled incubator. Our results demonstrate that our system yielded comparable cell numbers, viabilities, metabolism profiles and differentiation efficiencies when compared with traditional manual processes. Interestingly, eliminating exposure to ambient conditions during the liquid-handling stage resulted in significant improvements in the yield of MAP2-positive neural cells, indicating that this level of control can improve differentiation processes. This article describes, for the first time, an automation platform capable of maintaining oxygen tension control during both the cell-culture and liquid-handling stages of a 2D embryonic stem cell differentiation process.

  9. Automated detection and enumeration of marine wildlife using unmanned aircraft systems (UAS) and thermal imagery

    PubMed Central

    Seymour, A. C.; Dale, J.; Hammill, M.; Halpin, P. N.; Johnston, D. W.

    2017-01-01

    Estimating animal populations is critical for wildlife management. Aerial surveys are used for generating population estimates, but can be hampered by cost, logistical complexity, and human risk. Additionally, human counts of organisms in aerial imagery can be tedious and subjective. Automated approaches show promise, but can be constrained by long setup times and difficulty discriminating animals in aggregations. We combine unmanned aircraft systems (UAS), thermal imagery and computer vision to improve traditional wildlife survey methods. During spring 2015, we flew fixed-wing UAS equipped with thermal sensors, imaging two grey seal (Halichoerus grypus) breeding colonies in eastern Canada. Human analysts counted and classified individual seals in imagery manually. Concurrently, an automated classification and detection algorithm discriminated seals based upon temperature, size, and shape of thermal signatures. Automated counts were within 95–98% of human estimates; at Saddle Island, the model estimated 894 seals compared to analyst counts of 913, and at Hay Island estimated 2188 seals compared to analysts’ 2311. The algorithm improves upon shortcomings of computer vision by effectively recognizing seals in aggregations while keeping model setup time minimal. Our study illustrates how UAS, thermal imagery, and automated detection can be combined to efficiently collect population data critical to wildlife management. PMID:28338047
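
    A minimal sketch of the temperature/size/shape discrimination described above, using scikit-image; the temperature threshold and the area and eccentricity bounds are invented for illustration and would need tuning to the actual sensor and flight altitude.

```python
import numpy as np
from skimage.measure import label, regionprops

def count_seals(thermal_c, t_min_c=15.0, min_px=20, max_px=400):
    """Discriminate seals in one georeferenced thermal frame by the
    temperature, size and shape of their signatures."""
    warm = thermal_c > t_min_c                   # temperature test
    seals = [r for r in regionprops(label(warm))
             if min_px <= r.area <= max_px       # size test
             and r.eccentricity < 0.97]          # shape: not a thin streak
    return len(seals), [r.centroid for r in seals]
```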

  10. Automated detection and enumeration of marine wildlife using unmanned aircraft systems (UAS) and thermal imagery

    NASA Astrophysics Data System (ADS)

    Seymour, A. C.; Dale, J.; Hammill, M.; Halpin, P. N.; Johnston, D. W.

    2017-03-01

    Estimating animal populations is critical for wildlife management. Aerial surveys are used for generating population estimates, but can be hampered by cost, logistical complexity, and human risk. Additionally, human counts of organisms in aerial imagery can be tedious and subjective. Automated approaches show promise, but can be constrained by long setup times and difficulty discriminating animals in aggregations. We combine unmanned aircraft systems (UAS), thermal imagery and computer vision to improve traditional wildlife survey methods. During spring 2015, we flew fixed-wing UAS equipped with thermal sensors, imaging two grey seal (Halichoerus grypus) breeding colonies in eastern Canada. Human analysts counted and classified individual seals in imagery manually. Concurrently, an automated classification and detection algorithm discriminated seals based upon temperature, size, and shape of thermal signatures. Automated counts were within 95-98% of human estimates; at Saddle Island, the model estimated 894 seals compared to analyst counts of 913, and at Hay Island estimated 2188 seals compared to analysts’ 2311. The algorithm improves upon shortcomings of computer vision by effectively recognizing seals in aggregations while keeping model setup time minimal. Our study illustrates how UAS, thermal imagery, and automated detection can be combined to efficiently collect population data critical to wildlife management.

  11. A completely automated flow heat-capacity calorimeter for use at high temperatures and pressures

    NASA Astrophysics Data System (ADS)

    Rogers, P. S. Z.; Sandarusi, Jamal

    1990-11-01

    An automated flow calorimeter has been constructed to measure the isobaric heat capacities of concentrated aqueous electrolyte solutions using a differential calorimetry technique. The calorimeter is capable of operation up to 700 K and 40 MPa with a measurement accuracy of 0.03% relative to the heat capacity of the pure reference fluid (water). A novel design encloses the calorimeter within a double set of separately controlled copper adiabatic shields that minimize calorimeter heat losses and precisely control the temperature of the inlet fluids. A multistage preheat train, used to efficiently heat the flowing fluid, includes a counter-current heat exchanger for the inlet and outlet fluid streams in tandem with two calorimeter preheaters. Complete system automation is accomplished with a distributed control scheme using multiple processors, allowing the major control tasks of calorimeter operation and control, data logging and display, and pump control to be performed simultaneously. A sophisticated pumping strategy for the two separate syringe pumps allows continuous fluid delivery. This automation system enables the calorimeter to operate unattended except for the reloading of sample fluids. In addition, automation has allowed the development and implementation of an improved heat-loss calibration method that provides calorimeter calibration with an absolute accuracy comparable to the overall measurement precision, even for very concentrated solutions.

  12. A Flexible Workflow for Automated Bioluminescent Kinase Selectivity Profiling.

    PubMed

    Worzella, Tracy; Butzler, Matt; Hennek, Jacquelyn; Hanson, Seth; Simdon, Laura; Goueli, Said; Cowan, Cris; Zegzouti, Hicham

    2017-04-01

    Kinase profiling during drug discovery is a necessary process to confirm inhibitor selectivity and assess off-target activities. However, cost and logistical limitations prevent profiling activities from being performed in-house. We describe the development of an automated and flexible kinase profiling workflow that combines ready-to-use kinase enzymes and substrates in convenient eight-tube strips, a bench-top liquid handling device, ADP-Glo Kinase Assay (Promega, Madison, WI) technology to quantify enzyme activity, and a multimode detection instrument. Automated methods were developed for kinase reactions and quantification reactions to be assembled on a Gilson (Middleton, WI) PIPETMAX, following standardized plate layouts for single- and multidose compound profiling. Pipetting protocols were customized at runtime based on user-provided information, including compound number, increment for compound titrations, and number of kinase families to use. After the automated liquid handling procedures, a GloMax Discover (Promega) microplate reader preloaded with SMART protocols was used for luminescence detection and automatic data analysis. The functionality of the automated workflow was evaluated with several compound-kinase combinations in single-dose or dose-response profiling formats. Known target-specific inhibitions were confirmed. Novel small molecule-kinase interactions, including off-target inhibitions, were identified and confirmed in secondary studies. By adopting this streamlined profiling process, researchers can quickly and efficiently profile compounds of interest on site.
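
    For the dose-response profiling format mentioned above, the standard analysis step is a four-parameter logistic fit to normalized activity data. The following sketch shows how an IC50 could be extracted with SciPy; the titration data are made up for illustration (the assay's real readout is ADP-Glo luminescence).

```python
import numpy as np
from scipy.optimize import curve_fit

def four_pl(logc, bottom, top, logic50, hill):
    """Four-parameter logistic: percent activity vs. log10 compound concentration."""
    return bottom + (top - bottom) / (1.0 + 10 ** (hill * (logc - logic50)))

# Illustrative 10-point compound titration, normalized to no-inhibitor controls.
logc = np.linspace(-9.0, -4.5, 10)               # log10 molar concentration
activity = four_pl(logc, 2.0, 100.0, -6.5, 1.0)
activity += np.random.default_rng(2).normal(0.0, 3.0, 10)

params, _ = curve_fit(four_pl, logc, activity, p0=(0.0, 100.0, -6.0, 1.0))
print(f"IC50 ~ {10 ** params[2] * 1e9:.0f} nM")
```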

  13. Cost-Efficient Phase Noise Measurement

    NASA Astrophysics Data System (ADS)

    Perić, Ana; Bjelica, Milan

    2014-05-01

    In this paper, an automated system for oscillator phase-noise measurement is described. The system is primarily intended for use in academic institutions, such as smaller university or research laboratories, as it relies on a standard spectrum analyzer and free software. A method to compensate for the effect of the instrument's intrinsic noise is proposed. Through a series of experimental tests, the good performance of our system is verified and its compliance with theoretical expectations is demonstrated.
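
    One common way to compensate for the instrument's intrinsic noise, shown in the hedged sketch below, is to subtract the measured noise floor from the measured phase-noise curve in the linear power domain; the paper's exact compensation scheme may differ.

```python
import numpy as np

def subtract_noise_floor(l_meas_dbc, l_floor_dbc):
    """Remove instrument intrinsic noise from a measured phase-noise curve by
    power subtraction in the linear domain (a common correction, assumed here)."""
    p_meas = 10 ** (np.asarray(l_meas_dbc) / 10.0)
    p_floor = 10 ** (np.asarray(l_floor_dbc) / 10.0)
    p_dut = np.clip(p_meas - p_floor, 1e-30, None)   # guard against negatives
    return 10.0 * np.log10(p_dut)

# DUT measured at -100 dBc/Hz with a -110 dBc/Hz instrument floor: ~ -100.46 dBc/Hz
print(subtract_noise_floor([-100.0], [-110.0]))
```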

  14. Assessing the Efficiency of Phenotyping Early Traits in a Greenhouse Automated Platform for Predicting Drought Tolerance of Soybean in the Field

    PubMed Central

    Peirone, Laura S.; Pereyra Irujo, Gustavo A.; Bolton, Alejandro; Erreguerena, Ignacio; Aguirrezábal, Luis A. N.

    2018-01-01

    Conventional field phenotyping for drought tolerance, the most important factor limiting yield at a global scale, is labor-intensive and time-consuming. Automated greenhouse platforms can increase the precision and throughput of plant phenotyping and contribute to a faster release of drought tolerant varieties. The aim of this work was to establish a framework of analysis to identify early traits which could be efficiently measured in a greenhouse automated phenotyping platform, for predicting the drought tolerance of field grown soybean genotypes. A group of genotypes was evaluated, which showed variation in their drought susceptibility index (DSI) for final biomass and leaf area. A large number of traits were measured before and after the onset of a water deficit treatment, which were analyzed under several criteria: the significance of the regression with the DSI, phenotyping cost, earliness, and repeatability. The most efficient trait was found to be transpiration efficiency measured at 13 days after emergence. This trait was further tested in a second experiment with different water deficit intensities, and validated using a different set of genotypes against field data from a trial network in a third experiment. The framework applied in this work for assessing traits under different criteria could be helpful for selecting those most efficient for automated phenotyping. PMID:29774042

  15. Automated Cell Detection and Morphometry on Growth Plate Images of Mouse Bone

    PubMed Central

    Ascenzi, Maria-Grazia; Du, Xia; Harding, James I; Beylerian, Emily N; de Silva, Brian M; Gross, Ben J; Kastein, Hannah K; Wang, Weiguang; Lyons, Karen M; Schaeffer, Hayden

    2014-01-01

    Microscopy imaging of mouse growth plates is extensively used in biology to understand the effect of specific molecules on various stages of normal bone development and on bone disease. Until now, such image analysis has been conducted by manual detection. Indeed, when existing automated detection techniques were applied, morphological variations across the growth plate, heterogeneity of image background color (including the faint presence of cells, chondrocytes, located deeper in the tissue away from the image's plane of focus) and the lack of cell-specific features interfered with cell identification. We propose the first method of automated detection and morphometry applicable to images of cells in the growth plate of long bone. Through ad hoc sequential application of the Retinex method, anisotropic diffusion and thresholding, our new cell detection algorithm (CDA) addresses these challenges in bright-field microscopy images of mouse growth plates. Five parameters, chosen by the user according to image characteristics, regulate our CDA. Our results demonstrate the effectiveness of the proposed numerical method relative to manual methods. Our CDA confirms previously established results regarding the number, area, orientation, height and shape of chondrocytes in normal growth plates. Our CDA also confirms differences previously found between the genetically mutated mouse Smad1/5CKO and its control mouse on fluorescence images. The CDA aims to aid biomedical research by increasing the efficiency and consistency of data collection regarding the arrangement and characteristics of chondrocytes. Our results suggest that automated extraction of data from microscopy imaging of growth plates can assist in unlocking information on normal and pathological development, key to the underlying biological mechanisms of bone growth.
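
    The sequential Retinex / anisotropic diffusion / thresholding pipeline can be sketched in a few lines of Python. The version below uses a single-scale Retinex, a hand-rolled Perona-Malik diffusion and Otsu thresholding (assuming NumPy, SciPy and scikit-image are available); the parameter values are illustrative and do not correspond to the paper's five user-chosen settings.

```python
import numpy as np
from scipy.ndimage import gaussian_filter
from skimage.filters import threshold_otsu

def single_scale_retinex(img, sigma=30.0):
    """Illumination correction: log image minus log of its blurred estimate."""
    img = img.astype(float) + 1.0
    return np.log(img) - np.log(gaussian_filter(img, sigma) + 1.0)

def perona_malik(img, n_iter=20, kappa=0.1, gamma=0.2):
    """Edge-preserving anisotropic diffusion (Perona-Malik, 4-neighbour scheme)."""
    u = img.astype(float).copy()
    for _ in range(n_iter):
        dn = np.roll(u, 1, 0) - u; ds = np.roll(u, -1, 0) - u
        de = np.roll(u, 1, 1) - u; dw = np.roll(u, -1, 1) - u
        u += gamma * sum(np.exp(-(d / kappa) ** 2) * d for d in (dn, ds, de, dw))
    return u

img = np.random.default_rng(3).random((128, 128))   # stand-in bright-field tile
smooth = perona_malik(single_scale_retinex(img))
cells = smooth > threshold_otsu(smooth)             # binary cell mask
```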

  16. Automated matching of multiple terrestrial laser scans for stem mapping without the use of artificial references

    NASA Astrophysics Data System (ADS)

    Liu, Jingbin; Liang, Xinlian; Hyyppä, Juha; Yu, Xiaowei; Lehtomäki, Matti; Pyörälä, Jiri; Zhu, Lingli; Wang, Yunsheng; Chen, Ruizhi

    2017-04-01

    Terrestrial laser scanning has been widely used to analyze the 3D structure of a forest in detail and to generate data at the level of a reference plot for forest inventories without destructive measurements. Multi-scan terrestrial laser scanning is commonly applied to collect plot-level data so that all of the stems can be detected and analyzed. However, the point clouds of multiple scans must be matched to yield a unified point cloud for automated processing; mismatches between datasets will lead to errors during the processing of multi-scan data. Classic registration methods based on flat surfaces cannot be directly applied in forest environments; therefore, artificial reference objects have conventionally been used to assist with scan matching. The use of artificial references requires additional labor and expertise and greatly increases the cost. In this study, we present an automated processing method for plot-level stem mapping that matches multiple scans without artificial references. In contrast to previous studies, the registration method developed here exploits the natural geometric characteristics of the set of tree stems in a plot and combines the point clouds of multiple scans into a unified coordinate system. Integrating multiple scans improves the overall performance of stem mapping in terms of the correctness of tree detection, as well as the bias and root-mean-square errors of forest attributes such as diameter at breast height and tree height. In addition, the automated processing method makes stem mapping more reliable and consistent among plots, reduces the costs associated with plot-based stem mapping, and enhances efficiency.
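
    Once stem correspondences between two scans have been established from their geometric configuration, combining the scans reduces to estimating a rigid transform between matched stem centres. The sketch below solves that last step with a 2D Kabsch/Procrustes fit; finding the correspondences themselves is the harder problem addressed by the paper and is assumed solved here.

```python
import numpy as np

def rigid_transform_2d(src, dst):
    """Least-squares rotation + translation mapping matched stem centres src -> dst."""
    src_c, dst_c = src - src.mean(0), dst - dst.mean(0)
    u, _, vt = np.linalg.svd(src_c.T @ dst_c)
    d = np.sign(np.linalg.det(u @ vt))                  # guard against reflections
    r = (u @ np.diag([1.0, d]) @ vt).T
    t = dst.mean(0) - r @ src.mean(0)
    return r, t

# Toy check: rotate a stem map by 30 degrees, shift it, and recover the transform.
theta = np.deg2rad(30.0)
r_true = np.array([[np.cos(theta), -np.sin(theta)], [np.sin(theta), np.cos(theta)]])
stems = np.random.default_rng(4).uniform(0.0, 30.0, (25, 2))
r_est, t_est = rigid_transform_2d(stems, stems @ r_true.T + [5.0, -2.0])
print(np.allclose(r_est, r_true), np.round(t_est, 3))
```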

  17. Taboo search algorithm for item assignment in synchronized zone automated order picking system

    NASA Astrophysics Data System (ADS)

    Wu, Yingying; Wu, Yaohua

    2014-07-01

    The idle time, which is part of the order-fulfillment time, is determined by the number of items in each zone; the item-assignment method therefore affects picking efficiency. Previous studies, however, focus only on balancing the number of item types between zones, not the number of items and the idle time in each zone. In this paper, an idle factor is proposed to measure the idle time exactly. The idle factor is proven to follow the same trend as the idle time, so the objective can be simplified from minimizing the idle time to minimizing the idle factor. On this basis, a model of the item assignment problem in a synchronized zone automated order picking system is built. The model is a relaxation of the parallel machine scheduling problem, which has been proven NP-complete. To solve the model, a taboo search algorithm is proposed. The main idea of the algorithm is to minimize the greatest idle factor among the zones using a 2-exchange procedure. Finally, a simulation using data collected from a tobacco distribution center is conducted to evaluate the performance of the algorithm. The results verify the model and show that the algorithm reliably reduces the idle time, by 45.63% on average. This research proposes an approach to measuring the idle time in a synchronized zone automated order picking system. The approach can improve picking efficiency significantly and can serve as a theoretical basis for optimizing synchronized zone automated order picking systems.
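
    A minimal sketch of the approach: a tabu search that repeatedly applies 2-exchange moves to reduce the greatest per-zone idle factor. As a stand-in assumption for the paper's exact formula, the idle factor of a zone is approximated by its total item workload.

```python
import random

random.seed(0)

def max_idle_factor(zones):
    """Objective: the largest per-zone idle factor (approximated by total workload)."""
    return max(sum(zone) for zone in zones)

def tabu_search(items, n_zones, iters=2000, tabu_len=25):
    zones = [items[i::n_zones] for i in range(n_zones)]   # initial assignment
    best_val, tabu = max_idle_factor(zones), []
    for _ in range(iters):
        # 2-exchange move: swap one item between the busiest zone and another zone.
        worst = max(range(n_zones), key=lambda k: sum(zones[k]))
        other = random.randrange(n_zones)
        if other == worst:
            continue
        i = random.randrange(len(zones[worst]))
        j = random.randrange(len(zones[other]))
        move = (zones[worst][i], zones[other][j])
        if move in tabu:
            continue                                      # forbidden (tabu) move
        zones[worst][i], zones[other][j] = zones[other][j], zones[worst][i]
        tabu = (tabu + [move])[-tabu_len:]
        best_val = min(best_val, max_idle_factor(zones))
    return zones, best_val

items = [random.randint(1, 50) for _ in range(60)]
print(max_idle_factor([items[i::5] for i in range(5)]), tabu_search(items, 5)[1])
```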

  18. Automated Computerized Analysis of Speech in Psychiatric Disorders

    PubMed Central

    Cohen, Alex S.; Elvevåg, Brita

    2014-01-01

    Purpose of Review Disturbances in communication are a hallmark of severe mental illness (SMI). Recent technological advances have paved the way for objectifying communication using automated computerized linguistic and acoustic analysis. We review recent studies applying various computer-based assessments to the natural language produced by adult patients with severe mental illness. Recent Findings Automated computerized methods afford tools with which it is possible to objectively evaluate patients in a reliable, valid and efficient manner that complements human ratings. Crucially, these measures correlate with important clinical measures. The clinical relevance of these novel metrics has been demonstrated by showing their relationship to functional outcome measures, their in vivo link to classic 'language' regions in the brain, and, in the case of linguistic analysis, their relationship to candidate genes for severe mental illness. Summary Computer-based assessments of natural language afford a framework with which to measure communication disturbances in adults with SMI. Emerging evidence suggests that they can be reliable and valid, and overcome many practical limitations of more traditional assessment methods. The advancement of these technologies offers unprecedented potential for measuring and understanding some of the most crippling symptoms of some of the most debilitating illnesses known to humankind. PMID:24613984

  19. Microtiter miniature shaken bioreactor system as a scale-down model for process development of production of therapeutic alpha-interferon2b by recombinant Escherichia coli.

    PubMed

    Tan, Joo Shun; Abbasiliasi, Sahar; Kadkhodaei, Saeid; Tam, Yew Joon; Tang, Teck-Kim; Lee, Yee-Ying; Ariff, Arbakariya B

    2018-01-04

    Demand for high-throughput bioprocessing has increased dramatically, especially in the biopharmaceutical industry, because these technologies are of vital importance to process optimization and media development. This can be boosted efficiently by using a microtiter plate (MTP) cultivation setup embedded in an automated liquid-handling system. The objective of this study was to establish an automated microscale method for upstream and downstream bioprocessing of α-IFN2b production by recombinant Escherichia coli. The extraction performance of α-IFN2b by osmotic shock was compared between two systems: an automated microscale platform and manual extraction in MTP. The amount of α-IFN2b extracted using the automated microscale platform (49.2 μg/L) was comparable to that of the manual osmotic-shock method (48.8 μg/L), but the standard deviation was twofold lower than with the manual method. Fermentation parameters in MTP, involving inoculum size, agitation speed, working volume and induction profiling, revealed that the highest production of α-IFN2b (85.5 μg/L) was attained at an inoculum size of 8%, a working volume of 40% and an agitation speed of 1000 rpm, with induction at 4 h after inoculation. Although the findings at MTP scale did not show perfectly scalable results compared with shake-flask culture, microscale technique development would serve as a convenient and low-cost solution for process optimization of recombinant protein production.

  20. A control system based on field programmable gate array for papermaking sewage treatment

    NASA Astrophysics Data System (ADS)

    Zhang, Zi Sheng; Xie, Chang; Qing Xiong, Yan; Liu, Zhi Qiang; Li, Qing

    2013-03-01

    A sewage treatment control system is designed to improve the efficiency of a papermaking wastewater treatment system. The automation control system is based on a Field Programmable Gate Array (FPGA), coded in the Very-High-Speed Integrated Circuit Hardware Description Language (VHDL), and compiled and simulated with Quartus. To ensure the stability of the data used in the FPGA, the data are collected through temperature sensors, a water-level sensor and an online pH measurement system. The automated control system is more responsive, and both the treatment efficiency and the processing capacity are increased. This work provides a new method for sewage treatment control.

  1. Combined process automation for large-scale EEG analysis.

    PubMed

    Sfondouris, John L; Quebedeaux, Tabitha M; Holdgraf, Chris; Musto, Alberto E

    2012-01-01

    Epileptogenesis is a dynamic process producing increased seizure susceptibility. Electroencephalography (EEG) data provides information critical in understanding the evolution of epileptiform changes throughout epileptic foci. We designed an algorithm to facilitate efficient large-scale EEG analysis via linked automation of multiple data processing steps. Using EEG recordings obtained from electrical stimulation studies, the following steps of EEG analysis were automated: (1) alignment and isolation of pre- and post-stimulation intervals, (2) generation of user-defined band frequency waveforms, (3) spike-sorting, (4) quantification of spike and burst data and (5) power spectral density analysis. This algorithm allows for quicker, more efficient EEG analysis. Copyright © 2011 Elsevier Ltd. All rights reserved.
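
    Steps (2) and (3) of such a pipeline can be illustrated with a short SciPy sketch: zero-phase band filtering to produce a user-defined band waveform, followed by a simple robust-threshold spike detector. The threshold rule is an illustrative assumption; the study's actual spike-sorting step is more involved.

```python
import numpy as np
from scipy.signal import butter, sosfiltfilt

def band_waveform(eeg, fs, lo, hi, order=4):
    """User-defined band frequency waveform via zero-phase Butterworth filtering."""
    sos = butter(order, [lo, hi], btype="band", fs=fs, output="sos")
    return sosfiltfilt(sos, eeg)

def detect_spikes(x, k=4.0):
    """Flag samples exceeding k times a robust (MAD-based) noise estimate."""
    sigma = np.median(np.abs(x)) / 0.6745
    return np.flatnonzero(np.abs(x) > k * sigma)

fs = 1000.0
t = np.arange(0.0, 10.0, 1.0 / fs)
eeg = np.random.default_rng(5).normal(0.0, 10.0, t.size)   # microvolts
eeg[2500] += 200.0                                          # injected spike
gamma = band_waveform(eeg, fs, 30.0, 80.0)                  # one example band
print(detect_spikes(band_waveform(eeg, fs, 1.0, 100.0)).size)
```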

  2. Toward fully automated processing of dynamic susceptibility contrast perfusion MRI for acute ischemic cerebral stroke.

    PubMed

    Kim, Jinsuh; Leira, Enrique C; Callison, Richard C; Ludwig, Bryan; Moritani, Toshio; Magnotta, Vincent A; Madsen, Mark T

    2010-05-01

    We developed fully automated software for dynamic susceptibility contrast (DSC) MR perfusion-weighted imaging (PWI) to efficiently and reliably derive critical hemodynamic information for acute stroke treatment decisions. Brain MR PWI was performed in 80 consecutive patients with acute nonlacunar ischemic stroke within 24 h of symptom onset, from January 2008 to August 2009. These studies were automatically processed to generate hemodynamic parameters that included cerebral blood flow, cerebral blood volume and the mean transit time (MTT). To develop reliable software for PWI analysis, we used computationally robust algorithms, including a piecewise continuous regression method to determine the bolus arrival time (BAT), log-linear curve fitting, an arrival-time-independent deconvolution method and sophisticated motion correction methods. An optimal arterial input function (AIF) search algorithm using a new artery-likelihood metric was also developed. The anatomical locations of the automatically determined AIFs were reviewed and validated. The automatically computed BAT values were statistically compared with BAT values estimated by a single observer. In addition, gamma-variate curve-fitting errors of the AIF and inter-subject variability of AIFs were analyzed. Lastly, two observers independently assessed the quality and area of hypoperfusion mismatched with the restricted diffusion area on motion-corrected MTT maps and compared them with time-to-peak (TTP) maps obtained using the standard approach. The AIF was identified within an arterial branch and enhanced areas of perfusion deficit were visualized in all evaluated cases. Total processing time was 10.9 ± 2.5 s (mean ± s.d.) without motion correction and 267 ± 80 s (mean ± s.d.) with motion correction on a standard personal computer. The MTT map produced with our software adequately estimated brain areas with perfusion deficit and was significantly less affected by the random noise of the PWI than the TTP map. Image quality assessment by the two observers revealed that the MTT maps exhibited superior quality over the TTP maps (88% good ratings for MTT compared with 68% for TTP). Our software allows fully automated deconvolution analysis of DSC PWI using proven, efficient algorithms that can be applied to acute stroke treatment decisions. Our streamlined method also offers promise for further development of automated quantitative analysis of the ischemic penumbra. Copyright (c) 2009 Elsevier Ireland Ltd. All rights reserved.
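
    The gamma-variate fitting of the AIF mentioned above is a standard step that can be sketched as follows; the bolus model is the classic gamma-variate form, while the synthetic concentration-time data, initial guesses and bounds are assumptions for illustration.

```python
import numpy as np
from scipy.optimize import curve_fit

def gamma_variate(t, k, t0, alpha, beta):
    """Classic gamma-variate bolus model, zero before the arrival time t0."""
    dt = np.clip(t - t0, 0.0, None)
    return k * dt ** alpha * np.exp(-dt / beta)

# Illustrative concentration-time curve for one candidate AIF voxel.
t = np.arange(0.0, 60.0, 1.5)                  # seconds, one point per TR
truth = gamma_variate(t, 1.0, 8.0, 3.0, 1.5)
noisy = truth + np.random.default_rng(6).normal(0.0, 0.05, t.size)

p, _ = curve_fit(gamma_variate, t, noisy, p0=(1.0, 5.0, 2.0, 2.0),
                 bounds=([0.0, 0.0, 0.1, 0.1], [10.0, 30.0, 10.0, 10.0]))
rms = np.sqrt(np.mean((gamma_variate(t, *p) - noisy) ** 2))
print(f"fitted arrival time {p[1]:.1f} s, RMS fit error {rms:.3f}")
```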

  3. Automated detection of diagnostically relevant regions in H&E stained digital pathology slides

    NASA Astrophysics Data System (ADS)

    Bahlmann, Claus; Patel, Amar; Johnson, Jeffrey; Ni, Jie; Chekkoury, Andrei; Khurd, Parmeshwar; Kamen, Ali; Grady, Leo; Krupinski, Elizabeth; Graham, Anna; Weinstein, Ronald

    2012-03-01

    We present a computationally efficient method for analyzing H&E stained digital pathology slides with the objective of discriminating diagnostically relevant vs. irrelevant regions. Such technology is useful for several applications: (1) It can speed up computer aided diagnosis (CAD) for histopathology based cancer detection and grading by an order of magnitude through a triage-like preprocessing and pruning. (2) It can improve the response time for an interactive digital pathology workstation (which usually deals with digital pathology slides of several gigabytes), e.g., through controlling adaptive compression or prioritization algorithms. (3) It can support the detection and grading workflow for expert pathologists in a semi-automated diagnosis, thereby increasing throughput and accuracy. At the core of the presented method is the statistical characterization of tissue components that are indicative for the pathologist's decision about malignancy vs. benignity, such as nuclei, tubules, cytoplasm, etc. In order to allow for effective yet computationally efficient processing, we propose visual descriptors that capture the distribution of color intensities observed for nuclei and cytoplasm. Discrimination between statistics of relevant vs. irrelevant regions is learned from annotated data, and inference is performed via linear classification. We validate the proposed method both qualitatively and quantitatively. Experiments show a cross validation error rate of 1.4%. We further show that the proposed method can prune ~90% of the area of pathological slides while maintaining 100% of all relevant information, which allows for a speedup of a factor of 10 for CAD systems.
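
    A minimal version of the descriptor-plus-linear-classifier scheme can be sketched as below: per-channel color histograms as visual descriptors and logistic regression as the linear classifier. The synthetic tiles stand in for expert-annotated relevant and irrelevant slide regions; the paper's actual descriptors are tuned to nuclei and cytoplasm statistics.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

def color_histogram(tile, bins=8):
    """Descriptor: per-channel intensity histograms of an RGB tile, concatenated."""
    return np.concatenate([
        np.histogram(tile[..., c], bins=bins, range=(0, 256), density=True)[0]
        for c in range(3)
    ])

# Synthetic stand-ins: darker nuclei-rich "relevant" tiles vs. pale "irrelevant" ones.
rng = np.random.default_rng(7)
relevant = [rng.normal(90, 30, (64, 64, 3)).clip(0, 255) for _ in range(50)]
irrelevant = [rng.normal(200, 20, (64, 64, 3)).clip(0, 255) for _ in range(50)]
X = np.array([color_histogram(t) for t in relevant + irrelevant])
y = np.array([1] * 50 + [0] * 50)

clf = LogisticRegression(max_iter=1000).fit(X, y)
print(clf.score(X, y))
```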

  4. Optimization of the working process of the axial compressor according to the criterion of efficiency

    NASA Astrophysics Data System (ADS)

    Baturin, O. V.; Popov, G. M.; Goryachkin, E. S.; Novikova, Yu D.

    2017-01-01

    The paper presents the search for the optimal shape of the low-pressure compressor blades of an industrial gas turbine plant using computational fluid dynamics and multicriteria methods of mathematical optimization. The requirement is that the increase in compressor efficiency be achieved while raising the degree of compression by up to 2% and reducing the air flow by up to 8% relative to the basic engine parameters, and while keeping the compressor design elements as unchanged as possible. In the course of the work, a computational model of the working process in the test compressor was developed and verified in the NUMECA software package, and an automated algorithm for changing the blade shape using a small number of variables, while maintaining the stress-strain state, was also developed. This reduces the number of variables to be changed by more than half. As a result of this study, a compressor variant was found that increases efficiency by 1.3% (abs.).

  5. Computer laser system for prevention and treatment of dental diseases: new methods and results

    NASA Astrophysics Data System (ADS)

    Fedyai, S. G.; Prochonchukov, Alexander A.; Zhizhina, Nina A.; Metelnikov, Michael A.

    1995-05-01

    We report the results of clinical application of a new computer-laser system. The system includes hardware and software that implement new, efficient methods for the prevention and treatment of the main dental diseases. The hardware comprises a laser physiotherapeutic device (LPD) `Optodan' and a fiber-optic laser delivery system with special endodontic rigging. A semiconductor Ga-Al-As laser diode with wavelengths in the spectral range of 850-950 nm (produced by the Scientific-Industrial Concern `Reflector') is used as the basic unit. The LPD `Optodan' and the methods of treatment are covered by Russian patent No. 2014107 and certified by the Russian Ministry of Health. The automated computer system allows patients to be examined quickly, a differential diagnosis to be made, the indications (and contraindications), parameters and regimen of laser therapy to be determined, and treatment efficacy to be monitored (for caries, through clinical indices of enamel solubility, demineralization velocity and other tests; for periodontal diseases, through a complex of periodontal indices with automated registration and calculation). We present the latest results of applying the new technique and methods in the treatment of dental diseases in Russian clinics.

  6. Automated Knowledge Discovery from Simulators

    NASA Technical Reports Server (NTRS)

    Burl, Michael C.; DeCoste, D.; Enke, B. L.; Mazzoni, D.; Merline, W. J.; Scharenbroich, L.

    2006-01-01

    In this paper, we explore one aspect of knowledge discovery from simulators, the landscape characterization problem, where the aim is to identify regions in the input/ parameter/model space that lead to a particular output behavior. Large-scale numerical simulators are in widespread use by scientists and engineers across a range of government agencies, academia, and industry; in many cases, simulators provide the only means to examine processes that are infeasible or impossible to study otherwise. However, the cost of simulation studies can be quite high, both in terms of the time and computational resources required to conduct the trials and the manpower needed to sift through the resulting output. Thus, there is strong motivation to develop automated methods that enable more efficient knowledge extraction.
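
    As a toy illustration of landscape characterization, one can sample the simulator's parameter space and fit an interpretable classifier whose decision rules describe the region producing the target behavior. The sketch below uses a cheap stand-in "simulator" and a shallow decision tree for this purpose; the methods actually explored in the paper may differ.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier, export_text

def simulator(x1, x2):
    """Toy stand-in for an expensive simulator: flags an 'interesting' output region."""
    return (x1 ** 2 + x2 ** 2 < 1.0) & (x2 > 0.0)

rng = np.random.default_rng(8)
X = rng.uniform(-2.0, 2.0, (500, 2))             # sampled input/parameter space
y = simulator(X[:, 0], X[:, 1])                  # one simulator run per sample

tree = DecisionTreeClassifier(max_depth=3).fit(X, y)
print(export_text(tree, feature_names=["x1", "x2"]))  # human-readable region rules
```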

  7. Automated optimization of an aspheric light-emitting diode lens for uniform illumination.

    PubMed

    Luo, Xiaoxia; Liu, Hua; Lu, Zhenwu; Wang, Yao

    2011-07-10

    In this paper, an automated optimization method in the sequential mode of ZEMAX is proposed for the design of an aspheric lens providing uniform illuminance from an LED source. A feedback modification is introduced into the design for the LED extended source. The user-defined merit function is written in the ZEMAX Programming Language (ZPL) macro language and, as an example, the optimum parameters of an aspheric lens are obtained by running the optimization. The optical simulation results show that the illumination efficiency and uniformity can reach 83% and 90%, respectively, on a target surface 40 mm in diameter located 60 mm away, for a 1×1 mm LED source. © 2011 Optical Society of America

  8. Enhanced mobility for aging populations using automated vehicles.

    DOT National Transportation Integrated Search

    2015-12-01

    Automated vehicles (AV) offer a unique opportunity to improve the safety and efficiency of the transportation : system and enhance the mobility of aging and transportation disadvantaged populations simultaneously. : However, before this potential can...

  9. Automated matching software for clinical trials eligibility: measuring efficiency and flexibility.

    PubMed

    Penberthy, Lynne; Brown, Richard; Puma, Federico; Dahman, Bassam

    2010-05-01

    Clinical trials (CT) serve as the medium that translates clinical research into standards of care. Low or slow recruitment leads to delays in the delivery of new therapies to the public. Determination of eligibility in all patients is one of the most important factors in assuring unbiased results from the clinical trials process, and it represents the first step in addressing the issues of underrepresentation and equal access to clinical trials. This pilot project evaluated the efficiency, flexibility, and generalizability of an automated clinical trials eligibility screening tool across 5 different clinical trials and clinical trial scenarios. There was a substantial total saving in research staff time spent evaluating patients for eligibility during the study period, ranging from 165 h to 1329 h. There was a marked enhancement in efficiency with the automated system for all but one study in the pilot. The ratio of mean staff time required per eligible patient identified ranged from 0.8 to 19.4 for the manual versus the automated process. The results of this study demonstrate that automation offers an opportunity to reduce the burden of the manual processes required for CT eligibility screening and to assure that all patients have an opportunity to be evaluated for participation in clinical trials as appropriate. The automated process greatly reduces the time spent on eligibility screening compared with the traditional manual process by effectively transferring the load of the eligibility assessment process to the computer. Copyright (c) 2010 Elsevier Inc. All rights reserved.

  10. 31 CFR 205.17 - Are funds transfers delayed by automated payment systems restrictions based on the size and...

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... automated payment systems restrictions based on the size and timing of the drawdown request subject to this... EFFICIENT FEDERAL-STATE FUNDS TRANSFERS Rules Applicable to Federal Assistance Programs Included in a Treasury-State Agreement § 205.17 Are funds transfers delayed by automated payment systems restrictions...

  11. Shifting and power sharing control of a novel dual input clutchless transmission for electric vehicles

    NASA Astrophysics Data System (ADS)

    Liang, Jiejunyi; Yang, Haitao; Wu, Jinglai; Zhang, Nong; Walker, Paul D.

    2018-05-01

    To improve the overall efficiency of electric vehicles and guarantee driving comfort and vehicle drivability, under the concept of simplifying mechanism complexity and minimizing manufacturing cost, this paper proposes a novel clutchless power-shifting transmission system with a shifting control strategy and a power-sharing control strategy. The proposed shifting strategy takes advantage of the transmission architecture to achieve power-on shifting, which greatly improves driving comfort compared with a conventional automated manual transmission, using a bump-function-based shifting control method. To maximize overall efficiency, a real-time power-sharing control strategy is designed to solve the power-distribution problem between the two motors. A detailed mathematical model is built to verify the effectiveness of the proposed methods. The results demonstrate that the proposed strategies considerably improve overall efficiency while achieving uninterrupted power-on shifting and keeping vehicle jerk during shifting below an acceptable threshold.
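
    The bump-function-based shifting control invites a small illustration. Below is a hedged sketch that builds a smooth 0-to-1 torque-handover weight from the classic C-infinity bump function; the paper's exact profile and timing parameters are not reproduced here.

```python
import numpy as np

def bump_weight(t, t0, duration):
    """Smooth 0 -> 1 handover weight built from the classic C-infinity bump
    function (an assumed profile; the paper's exact form may differ)."""
    s = np.clip((t - t0) / duration, 0.0, 1.0)
    f = lambda x: np.where(x > 0.0, np.exp(-1.0 / np.maximum(x, 1e-12)), 0.0)
    return f(s) / (f(s) + f(1.0 - s))            # smooth partition of unity

t = np.linspace(0.0, 2.0, 9)
w = bump_weight(t, t0=0.5, duration=1.0)         # off-going torque scales by (1 - w)
print(np.round(w, 3))
```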

  12. Chapter 22: Compressed Air Evaluation Protocol. The Uniform Methods Project: Methods for Determining Energy Efficiency Savings for Specific Measures

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kurnik, Charles W; Benton, Nathanael; Burns, Patrick

    Compressed-air systems are used widely throughout industry for many operations, including pneumatic tools, packaging and automation equipment, conveyors, and other industrial process operations. Compressed-air systems are defined as a group of subsystems composed of air compressors, air treatment equipment, controls, piping, pneumatic tools, pneumatically powered machinery, and process applications using compressed air. A compressed-air system has three primary functional subsystems: supply, distribution, and demand. Air compressors are the primary energy consumers in a compressed-air system and are the primary focus of this protocol. The two compressed-air energy efficiency measures specifically addressed in this protocol are: high-efficiency/variable speed drive (VSD) compressor replacing a modulating, load/unload, or constant-speed compressor; and compressed-air leak survey and repairs. This protocol provides direction on how to reliably verify savings from these two measures using a consistent approach for each.

  13. Efficient Construction of Discrete Adjoint Operators on Unstructured Grids by Using Complex Variables

    NASA Technical Reports Server (NTRS)

    Nielsen, Eric J.; Kleb, William L.

    2005-01-01

    A methodology is developed and implemented to mitigate the lengthy software development cycle typically associated with constructing a discrete adjoint solver for aerodynamic simulations. The approach is based on a complex-variable formulation that enables straightforward differentiation of complicated real-valued functions. An automated scripting process is used to create the complex-variable form of the set of discrete equations. An efficient method for assembling the residual and cost function linearizations is developed. The accuracy of the implementation is verified through comparisons with a discrete direct method as well as a previously developed handcoded discrete adjoint approach. Comparisons are also shown for a large-scale configuration to establish the computational efficiency of the present scheme. To ultimately demonstrate the power of the approach, the implementation is extended to high temperature gas flows in chemical nonequilibrium. Finally, several fruitful research and development avenues enabled by the current work are suggested.
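
    The complex-variable formulation referred to above is the complex-step derivative trick: for a real-analytic f, f(x + ih) ≈ f(x) + ih f'(x), so the first derivative is recovered from the imaginary part with no subtractive cancellation. A minimal demonstration on a complicated real-valued function:

```python
import numpy as np

def complex_step_derivative(f, x, h=1e-30):
    """Complex-step derivative: accurate to machine precision because, unlike
    finite differences, no subtraction of nearly equal numbers occurs."""
    return np.imag(f(x + 1j * h)) / h

f = lambda x: np.exp(x) * np.sin(x) / np.sqrt(x)   # example real-valued function
x0 = 1.5
exact = np.exp(x0) * (np.sin(x0) + np.cos(x0)) / np.sqrt(x0) - f(x0) / (2 * x0)
print(complex_step_derivative(f, x0), exact)       # the two agree to ~16 digits
```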

  14. Efficient Construction of Discrete Adjoint Operators on Unstructured Grids Using Complex Variables

    NASA Technical Reports Server (NTRS)

    Nielsen, Eric J.; Kleb, William L.

    2005-01-01

    A methodology is developed and implemented to mitigate the lengthy software development cycle typically associated with constructing a discrete adjoint solver for aerodynamic simulations. The approach is based on a complex-variable formulation that enables straightforward differentiation of complicated real-valued functions. An automated scripting process is used to create the complex-variable form of the set of discrete equations. An efficient method for assembling the residual and cost function linearizations is developed. The accuracy of the implementation is verified through comparisons with a discrete direct method as well as a previously developed handcoded discrete adjoint approach. Comparisons are also shown for a large-scale configuration to establish the computational efficiency of the present scheme. To ultimately demonstrate the power of the approach, the implementation is extended to high temperature gas flows in chemical nonequilibrium. Finally, several fruitful research and development avenues enabled by the current work are suggested.

  15. Accelerated Enveloping Distribution Sampling: Enabling Sampling of Multiple End States while Preserving Local Energy Minima.

    PubMed

    Perthold, Jan Walther; Oostenbrink, Chris

    2018-05-17

    Enveloping distribution sampling (EDS) is an efficient approach to calculate multiple free-energy differences from a single molecular dynamics (MD) simulation. However, the construction of an appropriate reference-state Hamiltonian that samples all states efficiently is not straightforward. We propose a novel approach for the construction of the EDS reference-state Hamiltonian, related to a previously described procedure to smoothen energy landscapes. In contrast to previously suggested EDS approaches, our reference-state Hamiltonian preserves local energy minima of the combined end-states. Moreover, we propose an intuitive, robust and efficient parameter optimization scheme to tune EDS Hamiltonian parameters. We demonstrate the proposed method with established and novel test systems and conclude that our approach allows for the automated calculation of multiple free-energy differences from a single simulation. Accelerated EDS promises to be a robust and user-friendly method to compute free-energy differences based on solid statistical mechanics.
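
    For context, the classic EDS reference-state Hamiltonian on which such schemes build can be written as below; the accelerated variant proposed in the paper modifies this construction so that local energy minima of the end states are preserved, and its exact form is given there.

```latex
% Classic EDS reference state enveloping N end-state Hamiltonians H_i,
% with smoothness parameter s and energy offsets E_i^R (beta = 1/k_B T).
H_R(\mathbf{r}) = -\frac{1}{\beta s}
  \ln \sum_{i=1}^{N} \exp\!\left[ -\beta s \left( H_i(\mathbf{r}) - E_i^{R} \right) \right]
```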

  16. Automated assay for screening the enzymatic release of reducing sugars from micronized biomass

    PubMed Central

    2010-01-01

    Background To reduce the production cost of bioethanol obtained from fermentation of the sugars provided by degradation of lignocellulosic biomass (i.e., second generation bioethanol), it is necessary to screen for new enzymes endowed with more efficient biomass degrading properties. This demands the set-up of high-throughput screening methods. Several methods have been devised all using microplates in the industrial SBS format. Although this size reduction and standardization has greatly improved the screening process, the published methods comprise one or more manual steps that seriously decrease throughput. Therefore, we worked to devise a screening method devoid of any manual steps. Results We describe a fully automated assay for measuring the amount of reducing sugars released by biomass-degrading enzymes from wheat-straw and spruce. The method comprises two independent and automated steps. The first step is the making of "substrate plates". It consists of filling 96-well microplates with slurry suspensions of micronized substrate which are then stored frozen until use. The second step is an enzymatic activity assay. After thawing, the substrate plates are supplemented by the robot with cell-wall degrading enzymes where necessary, and the whole process from addition of enzymes to quantification of released sugars is autonomously performed by the robot. We describe how critical parameters (amount of substrate, amount of enzyme, incubation duration and temperature) were selected to fit with our specific use. The ability of this automated small-scale assay to discriminate among different enzymatic activities was validated using a set of commercial enzymes. Conclusions Using an automatic microplate sealer solved three main problems generally encountered during the set-up of methods for measuring the sugar-releasing activity of plant cell wall-degrading enzymes: throughput, automation, and evaporation losses. In its present set-up, the robot can autonomously process 120 triplicate wheat-straw samples per day. This throughput can be doubled if the incubation time is reduced from 24 h to 4 h (for initial rates measurements, for instance). This method can potentially be used with any insoluble substrate that is micronizable. A video illustrating the method can be seen at the following URL: http://www.youtube.com/watch?v=NFg6TxjuMWU PMID:20637080

  17. Analysis of nitrosamines in water by automated SPE and isotope dilution GC/HRMS Occurrence in the different steps of a drinking water treatment plant, and in chlorinated samples from a reservoir and a sewage treatment plant effluent.

    PubMed

    Planas, Carles; Palacios, Oscar; Ventura, Francesc; Rivera, Josep; Caixach, Josep

    2008-08-15

    A method based on automated solid-phase extraction (SPE) and isotope dilution gas chromatography/high resolution mass spectrometry (GC/HRMS) has been developed for the analysis of nine nitrosamines in water samples. The combination of automated SPE and GC/HRMS for the analysis of nitrosamines has not been reported previously. The advantages of the method are the selectivity and sensitivity of GC/HRMS analysis and the high efficiency of automated SPE with coconut charcoal EPA 521 cartridges. Low method detection limits (MDLs) were achieved, along with a simpler procedure and less dependence on the operator compared with methods based on manual SPE. Quality requirements for isotope dilution-based methods were met for most analysed nitrosamines with regard to trueness (80-120%), method precision (<15%) and MDLs (0.08-1.7 ng/L). Nineteen water samples (16 samples from a drinking water treatment plant (DWTP), 2 chlorinated samples from a sewage treatment plant (STP) effluent, and 1 chlorinated sample from a reservoir) were analysed. Concentrations of nitrosamines in the STP effluent were 309.4 and 730.2 ng/L, being higher when higher doses of chlorine were applied. N-Nitrosodimethylamine (NDMA) and N-nitrosodiethylamine (NDEA) were the main compounds identified in the STP effluent, and NDEA was detected above 200 ng/L, the regulatory level for NDMA in effluents in Ontario, Canada. Lower concentrations of nitrosamines were found in the reservoir (20.3 ng/L) and in the DWTP samples (n.d.-28.6 ng/L). NDMA and NDEA were found, respectively, in the reservoir and in treated and highly chlorinated DWTP samples at concentrations above 10 ng/L (the guide value established in several countries). The highest concentrations of nitrosamines were found after the chlorination and ozonation processes (ozonated, treated and highly chlorinated water) in the DWTP samples.

  18. A Self-Adapting System for the Automated Detection of Inter-Ictal Epileptiform Discharges

    PubMed Central

    Lodder, Shaun S.; van Putten, Michel J. A. M.

    2014-01-01

    Purpose Scalp EEG remains the standard clinical procedure for the diagnosis of epilepsy. Manual detection of inter-ictal epileptiform discharges (IEDs) is slow and cumbersome, and few automated methods are used to assist in practice. This is mostly due to low sensitivities, high false positive rates, or a lack of trust in the automated method. In this study we aim to find a solution that will make computer assisted detection more efficient than conventional methods, while preserving the detection certainty of a manual search. Methods Our solution consists of two phases. First, a detection phase finds all events similar to epileptiform activity by using a large database of template waveforms. Individual template detections are combined to form “IED nominations”, each with a corresponding certainty value based on the reliability of their contributing templates. The second phase uses the ten nominations with highest certainty and presents them to the reviewer one by one for confirmation. Confirmations are used to update certainty values of the remaining nominations, and another iteration is performed where ten nominations with the highest certainty are presented. This continues until the reviewer is satisfied with what has been seen. Reviewer feedback is also used to update template accuracies globally and improve future detections. Key Findings Using the described method and fifteen evaluation EEGs (241 IEDs), one third of all inter-ictal events were shown after one iteration, half after two iterations, and 74%, 90%, and 95% after 5, 10 and 15 iterations respectively. Reviewing fifteen iterations for the 20–30 min recordings took approximately 5 min. Significance The proposed method shows a practical approach for combining automated detection with visual searching for inter-ictal epileptiform activity. Further evaluation is needed to verify its clinical feasibility and measure the added value it presents. PMID:24454813

  19. A Bright Future for Evolutionary Methods in Drug Design.

    PubMed

    Le, Tu C; Winkler, David A

    2015-08-01

    Most medicinal chemists understand that chemical space is extremely large, essentially infinite. Although high-throughput experimental methods allow exploration of drug-like space more rapidly, they are still insufficient to fully exploit the opportunities that such large chemical space offers. Evolutionary methods can synergistically blend automated synthesis and characterization methods with computational design to identify promising regions of chemical space more efficiently. We describe how evolutionary methods are implemented, and provide examples of published drug development research in which these methods have generated molecules with increased efficacy. We anticipate that evolutionary methods will play an important role in future drug discovery. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  20. Automated selected reaction monitoring software for accurate label-free protein quantification.

    PubMed

    Teleman, Johan; Karlsson, Christofer; Waldemarson, Sofia; Hansson, Karin; James, Peter; Malmström, Johan; Levander, Fredrik

    2012-07-06

    Selected reaction monitoring (SRM) is a mass spectrometry method with documented ability to quantify proteins accurately and reproducibly using labeled reference peptides. However, the use of labeled reference peptides becomes impractical if large numbers of peptides are targeted and when high flexibility is desired when selecting peptides. We have developed a label-free quantitative SRM workflow that relies on a new automated algorithm, Anubis, for accurate peak detection. Anubis efficiently removes interfering signals from contaminating peptides to estimate the true signal of the targeted peptides. We evaluated the algorithm on a published multisite data set and achieved results in line with manual data analysis. In complex peptide mixtures from whole proteome digests of Streptococcus pyogenes we achieved a technical variability across the entire proteome abundance range of 6.5-19.2%, which was considerably below the total variation across biological samples. Our results show that the label-free SRM workflow with automated data analysis is feasible for large-scale biological studies, opening up new possibilities for quantitative proteomics and systems biology.

  1. Automated alignment of a reconfigurable optical system using focal-plane sensing and Kalman filtering.

    PubMed

    Fang, Joyce; Savransky, Dmitry

    2016-08-01

    Automation of alignment tasks can provide improved efficiency and greatly increase the flexibility of an optical system. Current optical systems with automated alignment capabilities are typically designed to include a dedicated wavefront sensor. Here, we demonstrate a self-aligning method for a reconfigurable system using only focal-plane images. We define a two-lens optical system with 8 degrees of freedom. Images are simulated for given misalignment parameters using ZEMAX software. We perform a principal component analysis on the simulated data set to obtain Karhunen-Loève modes, which form the basis set whose weights are the system measurements. A model function, which maps the state to the measurement, is learned using nonlinear least-squares fitting and serves as the measurement function for the nonlinear estimators (extended and unscented Kalman filters) used to calculate control inputs to align the system. We present and discuss simulated and experimental results of the full system in operation.
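
    The estimator step can be sketched compactly. Below is a generic extended-Kalman-filter measurement update with a numerically differentiated measurement function; the toy model standing in for the learned state-to-measurement map, and all noise settings, are assumptions for illustration.

```python
import numpy as np

def numerical_jacobian(h, x, eps=1e-6):
    """Finite-difference Jacobian of the measurement function h at state x."""
    y0 = h(x)
    J = np.zeros((y0.size, x.size))
    for i in range(x.size):
        dx = np.zeros_like(x); dx[i] = eps
        J[:, i] = (h(x + dx) - y0) / eps
    return J

def ekf_update(x, P, z, h, R):
    """One EKF measurement update for the misalignment state x, where h maps
    alignment state to (here, stand-in) Karhunen-Loeve mode weights."""
    H = numerical_jacobian(h, x)
    S = H @ P @ H.T + R                       # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)            # Kalman gain
    x_new = x + K @ (z - h(x))
    P_new = (np.eye(x.size) - K @ H) @ P
    return x_new, P_new

# Toy quadratic measurement model standing in for the least-squares-fitted map.
h = lambda x: np.array([x[0] + 0.1 * x[1] ** 2, x[1] - 0.05 * x[0] * x[1]])
x, P = np.zeros(2), np.eye(2)
z = h(np.array([0.3, -0.2])) + 1e-3           # noisy focal-plane measurement
x, P = ekf_update(x, P, z, h, R=1e-4 * np.eye(2))
print(np.round(x, 3))                          # estimate moves toward (0.3, -0.2)
```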

  2. The discriminatory power of ribotyping as automatable technique for differentiation of bacteria.

    PubMed

    Schumann, Peter; Pukall, Rüdiger

    2013-09-01

    Since the introduction of ribonucleic acid gene restriction patterns as taxonomic tools in 1986, ribotyping has become an established method for systematics and for epidemiological, ecological and population studies of microorganisms. In the last 25 years, several modifications have improved the convenience, reproducibility and turn-around time of this technique. The technological development culminated in the automation of ribotyping, which allowed for high-throughput applications, e.g., in the quality control of food production, the pharmaceutical industry and culture collections. The capability of the fully automated RiboPrinter(®) System for the differentiation of bacteria below the species level is compared with the discriminatory power of traditional ribotyping, of molecular fingerprint techniques such as PFGE, MLST and MLVA, and of MALDI-TOF mass spectrometry. While automated RiboPrinting is advantageous with respect to standardization, ease and speed, PCR ribotyping has proved to be a highly discriminatory, flexible, robust and cost-efficient routine technique that also makes inter-laboratory comparison and the building of ribotype databases possible. Copyright © 2013 Elsevier GmbH. All rights reserved.

  3. Moving toward the automation of the systematic review process: a summary of discussions at the second meeting of International Collaboration for the Automation of Systematic Reviews (ICASR).

    PubMed

    O'Connor, Annette M; Tsafnat, Guy; Gilbert, Stephen B; Thayer, Kristina A; Wolfe, Mary S

    2018-01-09

    The second meeting of the International Collaboration for Automation of Systematic Reviews (ICASR) was held 3-4 October 2016 in Philadelphia, Pennsylvania, USA. ICASR is an interdisciplinary group whose aim is to maximize the use of technology for conducting rapid, accurate, and efficient systematic reviews of scientific evidence. Having automated tools for systematic review should enable more transparent and timely review, maximizing the potential for identifying and translating research findings to practical application. The meeting brought together multiple stakeholder groups including users of summarized research, methodologists who explore production processes and systematic review quality, and technologists such as software developers, statisticians, and vendors. This diversity of participants was intended to ensure effective communication with numerous stakeholders about progress toward automation of systematic reviews and stimulate discussion about potential solutions to identified challenges. The meeting highlighted challenges, both simple and complex, and raised awareness among participants about ongoing efforts by various stakeholders. An outcome of this forum was to identify several short-term projects that participants felt would advance the automation of tasks in the systematic review workflow including (1) fostering better understanding about available tools, (2) developing validated datasets for testing new tools, (3) determining a standard method to facilitate interoperability of tools such as through an application programming interface or API, and (4) establishing criteria to evaluate the quality of tools' output. ICASR 2016 provided a beneficial forum to foster focused discussion about tool development and resources and reconfirm ICASR members' commitment toward systematic reviews' automation.

  4. Automated profiling of individual cell-cell interactions from high-throughput time-lapse imaging microscopy in nanowell grids (TIMING).

    PubMed

    Merouane, Amine; Rey-Villamizar, Nicolas; Lu, Yanbin; Liadi, Ivan; Romain, Gabrielle; Lu, Jennifer; Singh, Harjeet; Cooper, Laurence J N; Varadarajan, Navin; Roysam, Badrinath

    2015-10-01

    There is a need for effective automated methods for profiling dynamic cell-cell interactions with single-cell resolution from high-throughput time-lapse imaging data, especially, the interactions between immune effector cells and tumor cells in adoptive immunotherapy. Fluorescently labeled human T cells, natural killer cells (NK), and various target cells (NALM6, K562, EL4) were co-incubated on polydimethylsiloxane arrays of sub-nanoliter wells (nanowells), and imaged using multi-channel time-lapse microscopy. The proposed cell segmentation and tracking algorithms account for cell variability and exploit the nanowell confinement property to increase the yield of correctly analyzed nanowells from 45% (existing algorithms) to 98% for wells containing one effector and a single target, enabling automated quantification of cell locations, morphologies, movements, interactions, and deaths without the need for manual proofreading. Automated analysis of recordings from 12 different experiments demonstrated automated nanowell delineation accuracy >99%, automated cell segmentation accuracy >95%, and automated cell tracking accuracy of 90%, with default parameters, despite variations in illumination, staining, imaging noise, cell morphology, and cell clustering. An example analysis revealed that NK cells efficiently discriminate between live and dead targets by altering the duration of conjugation. The data also demonstrated that cytotoxic cells display higher motility than non-killers, both before and during contact. broysam@central.uh.edu or nvaradar@central.uh.edu Supplementary data are available at Bioinformatics online. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.

  5. Position verification systems for an automated highway system.

    DOT National Transportation Integrated Search

    2015-03-01

    Automated vehicles promote road safety, fuel efficiency, and reduced travel time by decreasing traffic : congestion and driver workload. In a vehicle platoon (grouping vehicles to increase road capacity by : managing distance between vehicles using e...

  6. Office Automation Boosts University's Productivity.

    ERIC Educational Resources Information Center

    School Business Affairs, 1986

    1986-01-01

    The University of Pittsburgh has a 2-year agreement designating the Xerox Corporation as the primary supplier of word processing and related office automation equipment in order to increase productivity and more efficient use of campus resources. (MLF)

  7. Quantitative semi-automated analysis of morphogenesis with single-cell resolution in complex embryos

    PubMed Central

    Giurumescu, Claudiu A.; Kang, Sukryool; Planchon, Thomas A.; Betzig, Eric; Bloomekatz, Joshua; Yelon, Deborah; Cosman, Pamela; Chisholm, Andrew D.

    2012-01-01

    A quantitative understanding of tissue morphogenesis requires description of the movements of individual cells in space and over time. In transparent embryos, such as C. elegans, fluorescently labeled nuclei can be imaged in three-dimensional time-lapse (4D) movies and automatically tracked through early cleavage divisions up to ~350 nuclei. A similar analysis of later stages of C. elegans development has been challenging owing to the increased error rates of automated tracking of large numbers of densely packed nuclei. We present Nucleitracker4D, a freely available software solution for tracking nuclei in complex embryos that integrates automated tracking of nuclei in local searches with manual curation. Using these methods, we have been able to track >99% of all nuclei generated in the C. elegans embryo. Our analysis reveals that ventral enclosure of the epidermis is accompanied by complex coordinated migration of the neuronal substrate. We can efficiently track large numbers of migrating nuclei in 4D movies of zebrafish cardiac morphogenesis, suggesting that this approach is generally useful in situations in which the number, packing or dynamics of nuclei present challenges for automated tracking. PMID:23052905

  8. Automated mitosis detection of stem cell populations in phase-contrast microscopy images.

    PubMed

    Huh, Seungil; Ker, Dai Fei Elmer; Bise, Ryoma; Chen, Mei; Kanade, Takeo

    2011-03-01

    Due to the enormous potential and impact that stem cells may have on regenerative medicine, there has been a rapidly growing interest for tools to analyze and characterize the behaviors of these cells in vitro in an automated and high throughput fashion. Among these behaviors, mitosis, or cell division, is important since stem cells proliferate and renew themselves through mitosis. However, current automated systems for measuring cell proliferation often require destructive or sacrificial methods of cell manipulation such as cell lysis or in vitro staining. In this paper, we propose an effective approach for automated mitosis detection using phase-contrast time-lapse microscopy, which is a nondestructive imaging modality, thereby allowing continuous monitoring of cells in culture. In our approach, we present a probabilistic model for event detection, which can simultaneously 1) identify spatio-temporal patch sequences that contain a mitotic event and 2) localize a birth event, defined as the time and location at which cell division is completed and two daughter cells are born. Our approach significantly outperforms previous approaches in terms of both detection accuracy and computational efficiency, when applied to multipotent C3H10T1/2 mesenchymal and C2C12 myoblastic stem cell populations.

  9. Augmenting SCA project management and automation framework

    NASA Astrophysics Data System (ADS)

    Iyapparaja, M.; Sharma, Bhanupriya

    2017-11-01

    Keeping accurate records of things is essential to managing them efficiently. Our company manufactures semiconductor chips and sells them to buyers; sometimes it manufactures the entire product, sometimes only part of it, and sometimes it sells intermediate products obtained during manufacturing. Better management of the entire process therefore requires keeping track records of every entity involved. Materials and Methods: To address this problem, frameworks were developed for project maintenance and for automated testing. The project management framework provides an architecture that supports managing the project by maintaining records of all requirements, the test cases created for testing each unit of the software, and defects raised in past years, so that the quality of the project can be maintained. Results: The automation framework provides an architecture that supports the development and implementation of automated test scripts for the software testing process. Conclusion: The project management framework was implemented with HP Application Lifecycle Management, which provides a central repository for maintaining the project.

  10. Finding Flounders: Outsmarting the Art of Camouflage

    ScienceCinema

    None

    2018-06-13

    At Los Alamos National Laboratory, we study camouflage in nature to learn how we can identify things trying to disguise themselves. We do that by looking at marine organisms that are exceptionally good at the art of blending in: flounders, skates, cuttlefish and octopi. The goal of this work was to develop efficient automated methods for detecting and analyzing features in remote sensing imagery for national security and intelligence applications.

  11. Finding Flounders: Outsmarting the Art of Camouflage

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None

    At Los Alamos National Laboratory, we study camouflage in nature to learn how we can identify things trying to disguise themselves. We do that by looking at marine organisms that are exceptionally good at the art of blending in: flounders, skates, cuttlefish and octopi. The goal of this work was to develop efficient automated methods for detecting and analyzing features in remote sensing imagery for national security and intelligence applications.

  12. Intelligent interface design and evaluation

    NASA Technical Reports Server (NTRS)

    Greitzer, Frank L.

    1988-01-01

    Intelligent interface concepts and systematic approaches to assessing their functionality are discussed. Four general features of intelligent interfaces are described: interaction efficiency, subtask automation, context sensitivity, and use of an appropriate design metaphor. Three evaluation methods are discussed: Functional Analysis, Part-Task Evaluation, and Operational Testing. Design and evaluation concepts are illustrated with examples from a prototype expert system interface for environmental control and life support systems for manned space platforms.

  13. Use of a Novel Fluidics Microbead Trap/Flow-cell Enhances Speed and Sensitivity of Bead-Based Bioassays

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ozanich, Rich M.; Antolick, Kathryn C.; Bruckner-Lea, Cindy J.

    2007-09-15

    Automated devices and methods for biological sample preparation often utilize surface functionalized microbeads (superparamagnetic or non-magnetic) to allow capture, purification and pre-concentration of trace amounts of proteins, cells, or nucleic acids (DNA/RNA) from complex samples. We have developed unique methods and hardware for trapping either magnetic or non-magnetic functionalized beads that allow samples and reagents to be efficiently perfused over a micro-column of beads. This approach yields enhanced mass transport and up to 5-fold improvements in assay sensitivity or speed, dramatically improving assay capability relative to assays conducted in more traditional “batch modes” (i.e., in tubes or microplate wells). Summary results are given that highlight the analytical performance improvements obtained for automated microbead processing systems utilizing novel microbead trap/flow-cells for various applications, including: 1) simultaneous capture of multiple cytokines using an antibody-coupled polystyrene bead assay with subsequent flow cytometry detection; 2) capture of nucleic acids using oligonucleotide-coupled polystyrene beads with flow cytometry detection; and 3) capture of Escherichia coli O157:H7 (E. coli) from 50 mL sample volumes using antibody-coupled superparamagnetic microbeads with subsequent culturing to assess capture efficiency.

  14. A spatiotemporal-based scheme for efficient registration-based segmentation of thoracic 4-D MRI.

    PubMed

    Yang, Y; Van Reeth, E; Poh, C L; Tan, C H; Tham, I W K

    2014-05-01

    Dynamic three-dimensional (3-D) (four-dimensional, 4-D) magnetic resonance (MR) imaging is gaining importance in the study of pulmonary motion for respiratory diseases and pulmonary tumor motion for radiotherapy. To perform quantitative analysis using 4-D MR images, segmentation of anatomical structures such as the lung and pulmonary tumor is required. Manual segmentation of entire thoracic 4-D MRI data, which typically contain many 3-D volumes acquired over several breathing cycles, is extremely tedious, time consuming, and suffers from high user variability. This calls for the development of new automated segmentation schemes for 4-D MRI data. Registration-based segmentation, which uses automatic registration methods for segmentation, has been shown to be an accurate way to segment structures in 4-D data series. However, directly applying registration-based segmentation to a 4-D MRI series is inefficient. Here we propose an automated 4-D registration-based segmentation scheme that exploits spatiotemporal information for the segmentation of thoracic 4-D MR lung images. The proposed scheme saved up to 95% of the computation while achieving segmentations comparably accurate to those obtained by directly applying registration-based segmentation to the 4-D dataset. The scheme facilitates rapid 3-D/4-D visualization of the lung and tumor motion and, potentially, the tracking of the tumor during radiation delivery.
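
    The core registration-propagation step can be sketched with SimpleITK: a mask drawn on one reference volume is carried to another time point by registering the two volumes and resampling the mask. This is a minimal sketch assuming a simple translation transform and hypothetical file names, not the authors' spatiotemporal scheme, which reuses information across breathing cycles.

    ```python
    import SimpleITK as sitk

    def propagate_mask(reference_img, reference_mask, target_img):
        reg = sitk.ImageRegistrationMethod()
        reg.SetMetricAsMeanSquares()
        reg.SetOptimizerAsRegularStepGradientDescent(
            learningRate=1.0, minStep=1e-4, numberOfIterations=200)
        reg.SetInitialTransform(
            sitk.TranslationTransform(target_img.GetDimension()))
        reg.SetInterpolator(sitk.sitkLinear)
        # transform mapping points of the target volume into the reference
        tx = reg.Execute(target_img, reference_img)
        # resample the reference mask onto the target grid
        return sitk.Resample(reference_mask, target_img, tx,
                             sitk.sitkNearestNeighbor, 0,
                             reference_mask.GetPixelID())

    ref = sitk.ReadImage("volume_t00.nii.gz", sitk.sitkFloat32)   # hypothetical
    msk = sitk.ReadImage("lung_mask_t00.nii.gz")                  # hypothetical
    tgt = sitk.ReadImage("volume_t07.nii.gz", sitk.sitkFloat32)   # hypothetical
    sitk.WriteImage(propagate_mask(ref, msk, tgt), "lung_mask_t07.nii.gz")
    ```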

  15. Automation in organizations: Eternal conflict

    NASA Technical Reports Server (NTRS)

    Dieterly, D. L.

    1981-01-01

    Some ideas on and insights into the problems associated with automation in organizations are presented with emphasis on the concept of automation, its relationship to the individual, and its impact on system performance. An analogy is drawn, based on an American folk hero, to emphasize the extent of the problems encountered when dealing with automation within an organization. A model is proposed to focus attention on a set of appropriate dimensions. The function allocation process becomes a prominent aspect of the model. The current state of automation research is mentioned in relation to the ideas introduced. Proposed directions for an improved understanding of automation's effect on the individual's efficiency are discussed. The importance of understanding the individual's perception of the system in terms of the degree of automation is highlighted.

  16. Automated Cooperative Trajectories for a More Efficient and Responsive Air Transportation System

    NASA Technical Reports Server (NTRS)

    Hanson, Curt

    2015-01-01

    The NASA Automated Cooperative Trajectories project is developing a prototype avionics system that enables multi-vehicle cooperative control by integrating 1090 MHz ES ADS-B digital communications with onboard autopilot systems. This cooperative control capability will enable meta-aircraft operations for enhanced airspace utilization, as well as improved vehicle efficiency through wake surfing. This briefing describes the objectives and approach to a flight evaluation of this system planned for 2016.

  17. Principal axes estimation using the vibration modes of physics-based deformable models.

    PubMed

    Krinidis, Stelios; Chatzis, Vassilios

    2008-06-01

    This paper addresses the issue of accurate, effective, computationally efficient, fast, and fully automated 2-D object orientation and scaling factor estimation. The object orientation is calculated using object principal axes estimation. The approach relies on the object's frequency-based features, which are extracted by a 2-D physics-based deformable model that parameterizes the object's shape. The method was evaluated on synthetic and real images. The experimental results demonstrate the accuracy of the method in both the orientation and scaling estimations.
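
    The principal-axes idea can be made concrete with image moments: the eigenvectors of the pixel-coordinate covariance give an object's axes, and the major axis yields the orientation. This compact stand-in skips the paper's physics-based deformable model and frequency-based features; it only illustrates the underlying geometry.

    ```python
    import numpy as np

    def orientation_deg(binary_img):
        """Orientation of the major principal axis of a binary object, in degrees."""
        ys, xs = np.nonzero(binary_img)
        coords = np.column_stack([xs, ys]).astype(float)
        coords -= coords.mean(axis=0)               # centre the object
        cov = np.cov(coords.T)
        evals, evecs = np.linalg.eigh(cov)
        major = evecs[:, np.argmax(evals)]          # principal axis direction
        return np.degrees(np.arctan2(major[1], major[0])) % 180

    # an elongated blob tilted ~30 degrees
    img = np.zeros((100, 100), dtype=bool)
    t = np.linspace(-30, 30, 600)
    xs = (50 + t * np.cos(np.radians(30))).astype(int)
    ys = (50 + t * np.sin(np.radians(30))).astype(int)
    img[ys, xs] = True
    print(f"{orientation_deg(img):.1f} degrees")    # ~30
    ```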

  18. Chapter 19: HVAC Controls (DDC/EMS/BAS) Evaluation Protocol. The Uniform Methods Project: Methods for Determining Energy Efficiency Savings for Specific Measures

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kurnik, Charles W.; Romberger, Jeff

    The HVAC Controls Evaluation Protocol is designed to address evaluation issues for direct digital controls/energy management systems/building automation systems (DDC/EMS/BAS) that are installed to control heating, ventilation, and air-conditioning (HVAC) equipment in commercial and institutional buildings. (This chapter refers to the DDC/EMS/BAS measure as HVAC controls.) This protocol may also be applicable to industrial facilities such as clean rooms and labs, which have either significant HVAC equipment or spaces requiring special environmental conditions.

  19. Imputing missing data via sparse reconstruction techniques.

    DOT National Transportation Integrated Search

    2017-06-01

    The State of Texas does not currently have an automated approach for estimating volumes for links without counts. This research project proposes the development of an automated system to efficiently estimate the traffic volumes on uncounted links, in...

  20. Automated low-cost and real-time truck parking information system.

    DOT National Transportation Integrated Search

    2013-11-01

    In this project an automated real-time parking information system was developed to improve truck-parking safety through efficient gathering and dissemination of information regarding the use of existing parking capacity. The system consists of four ...

  1. The Adam and Eve Robot Scientists for the Automated Discovery of Scientific Knowledge

    NASA Astrophysics Data System (ADS)

    King, Ross

    A Robot Scientist is a physically implemented robotic system that applies techniques from artificial intelligence to execute cycles of automated scientific experimentation. A Robot Scientist can automatically execute cycles of hypothesis formation, selection of efficient experiments to discriminate between hypotheses, execution of experiments using laboratory automation equipment, and analysis of results. The motivation for developing Robot Scientists is to better understand science, and to make scientific research more efficient. The Robot Scientist `Adam' was the first machine to autonomously discover scientific knowledge: to both formulate and experimentally confirm novel hypotheses. Adam worked in the domain of yeast functional genomics. The Robot Scientist `Eve' was originally developed to automate early-stage drug development, with specific application to neglected tropical diseases such as malaria, African sleeping sickness, etc. We are now adapting Eve to work on cancer. We are also teaching Eve to autonomously extract information from the scientific literature.

  2. Workload and Performance in Air Traffic Control: Exploring the Influence of Levels of Automation and Variation in Task Demand

    NASA Technical Reports Server (NTRS)

    Edwards, Tamsyn El; Martin, Lynne; Bienert, Nancy; Mercer, Joey

    2017-01-01

    In air traffic control, task demand and workload have important implications for the safety and efficiency of air traffic. Task demand is dynamic; however, research on demand transitions and associated controller perception and performance is limited. In addition, there is a comparatively restricted understanding of the influence of task demand transitions on workload and performance in association with automation. This study used an air traffic control simulation to investigate the influence of task demand transitions, under two conditions of varying automation, on workload and efficiency-related performance. Findings showed that both the direction of the task demand variation and the amount of automation influenced the relationship between workload and performance. Further research is needed to enhance understanding of demand transition and workload history effects on operator experience and performance, in both air traffic control and other safety-critical domains.

  3. Estimated Bounds and Important Factors for Fuel Use and Consumer Costs of Connected and Automated Vehicles

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stephens, T. S.; Gonder, Jeff; Chen, Yuche

    This report details a study of the potential effects of connected and automated vehicle (CAV) technologies on vehicle miles traveled (VMT), vehicle fuel efficiency, and consumer costs. Related analyses focused on a range of light-duty CAV technologies in conventional powertrain vehicles -- from partial automation to full automation, with and without ridesharing -- compared to today's base-case scenario. Analysis results revealed widely disparate upper- and lower-bound estimates for fuel use and VMT, ranging from a tripling of fuel use to decreasing light-duty fuel use to below 40% of today's level. This wide range reflects uncertainties in the ways that CAV technologies can influence vehicle efficiency and use through changes in vehicle designs, driving habits, and travel behavior. The report further identifies the most significant potential impacting factors, the largest areas of uncertainty, and where further research is particularly needed.

  4. An operator interface design for a telerobotic inspection system

    NASA Technical Reports Server (NTRS)

    Kim, Won S.; Tso, Kam S.; Hayati, Samad

    1993-01-01

    The operator interface has recently emerged as an important element for efficient and safe interactions between human operators and telerobotic systems. Advances in graphical user interface and graphics technologies enable us to produce very efficient operator interface designs. This paper describes an efficient graphical operator interface design newly developed for remote surface inspection at NASA-JPL. The interface, designed so that remote surface inspection can be performed by a single operator with integrated robot control and image inspection capability, supports three inspection strategies: teleoperated human visual inspection, human visual inspection with automated scanning, and machine-vision-based automated inspection.

  5. Automated Estimation of Melanocytic Skin Tumor Thickness by Ultrasonic Radiofrequency Data.

    PubMed

    Andrekute, Kristina; Valiukeviciene, Skaidra; Raisutis, Renaldas; Linkeviciute, Gintare; Makstiene, Jurgita; Kliunkiene, Renata

    2016-05-01

    High-frequency (>20-MHz) ultrasound (US) is a noninvasive preoperative tool for assessment of melanocytic skin tumor thickness. Ultrasonic melanocytic skin tumor thickness estimation is not always easy and is related to the experience of the clinician. In this article, we present an automated thickness measurement method based on time-frequency analysis of US radiofrequency signals. The study was performed on 52 thin (≤1-mm) melanocytic skin tumors (46 melanocytic nevi and 6 melanomas). Radiofrequency signals were obtained with a single-element focused transducer (fundamental frequency, 22 MHz; bandwidth, 12-28 MHz). The radiofrequency data were analyzed in the time-frequency domain to make the tumor boundaries more noticeable. The thicknesses of the tumors were evaluated by 3 different metrics: histologically measured Breslow thickness, manually measured US thickness, and automatically measured US thickness. The results showed a higher correlation coefficient between the automatically measured US thickness and Breslow thickness (r = 0.83; P < .0001) than between the manually measured US thickness and Breslow thickness (r = 0.68; P < .0001). The sensitivity of the automated tumor thickness measurement algorithm was 96.55%, and the specificity was 78.26% compared with histologic measurement. The sensitivity of the manually measured US thickness was 75.86%, and the specificity was 73.91%. The efficient automated tumor thickness measurement method developed could be used as a tool for preoperative assessment of melanocytic skin tumor thickness. © 2016 by the American Institute of Ultrasound in Medicine.
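
    A hypothetical sketch of the underlying idea: analyse one RF A-scan in the time-frequency domain and take the extent of strong in-band energy as the span between tumor boundaries. The sampling rate, 1540 m/s speed of sound, band limits, and threshold are all assumptions for illustration, not the authors' parameters.

    ```python
    import numpy as np
    from scipy.signal import spectrogram

    FS = 100e6          # RF sampling rate, Hz (assumed)
    C = 1540.0          # speed of sound in tissue, m/s (assumed)

    def tumour_thickness_mm(rf_line, f_lo=12e6, f_hi=28e6, drop=0.1):
        f, t, sxx = spectrogram(rf_line, fs=FS, nperseg=256, noverlap=192)
        band = (f >= f_lo) & (f <= f_hi)
        energy = sxx[band].sum(axis=0)            # in-band energy vs. time
        strong = np.where(energy >= drop * energy.max())[0]
        t_start, t_end = t[strong[0]], t[strong[-1]]
        return (t_end - t_start) * C / 2 * 1e3    # round-trip time -> depth, mm

    # synthetic A-scan: a 22 MHz echo burst lasting ~1.3 microseconds
    n = 4096
    tt = np.arange(n) / FS
    rf = np.zeros(n)
    seg = (tt > 2e-6) & (tt < 3.3e-6)
    rf[seg] = np.sin(2 * np.pi * 22e6 * tt[seg])
    print(f"estimated thickness: {tumour_thickness_mm(rf):.2f} mm")
    ```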

  6. Automated selective disruption of slow wave sleep.

    PubMed

    Ooms, Sharon J; Zempel, John M; Holtzman, David M; Ju, Yo-El S

    2017-04-01

    Slow wave sleep (SWS) plays an important role in neurophysiologic restoration. Experimentally testing the effect of SWS disruption previously required highly time-intensive and subjective methods. Our goal was to develop an automated and objective protocol to reduce SWS without affecting sleep architecture. We developed a custom Matlab™ protocol to calculate electroencephalogram spectral power every 10 s live during a polysomnogram, exclude artifact, and, if measurements met criteria for SWS, deliver increasingly louder tones through earphones. Middle-aged healthy volunteers (n=10) each underwent 2 polysomnograms, one with the SWS disruption protocol and one with a sham condition. The SWS disruption protocol reduced SWS compared to the sham condition, as measured by spectral power in the delta (0.5-4 Hz) band, particularly in the 0.5-2 Hz range (mean 20% decrease). A compensatory increase in the proportion of total spectral power in the theta (4-8 Hz) and alpha (8-12 Hz) bands was seen, but otherwise normal sleep features were preserved. N3 sleep decreased from 20 ± 34 to 3 ± 6 min; otherwise, there were no significant changes in total sleep time, sleep efficiency, or other macrostructural sleep characteristics. This novel SWS disruption protocol produces specific reductions in delta band power similar to existing methods, but has the advantage of being automated, such that SWS disruption can be performed easily in a highly standardized and operator-independent manner. This automated SWS disruption protocol effectively reduces SWS without impacting overall sleep architecture. Copyright © 2017 Elsevier B.V. All rights reserved.
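
    A simplified sketch of the disruption loop described above: every 10-s epoch, estimate delta-band power with Welch's method and step the tone volume up while the SWS criterion holds. The sampling rate, power threshold, volume step, and tone-delivery stub are assumptions, not the published protocol's parameters.

    ```python
    import numpy as np
    from scipy.signal import welch

    FS = 200                  # EEG sampling rate, Hz (assumed)
    SWS_DELTA_POWER = 50.0    # delta-power criterion, uV^2 (assumed)

    def delta_power(eeg_10s):
        f, pxx = welch(eeg_10s, fs=FS, nperseg=4 * FS)
        band = (f >= 0.5) & (f <= 4.0)
        return np.trapz(pxx[band], f[band])

    def play_tone(volume):
        print(f"tone at volume {volume}")   # stand-in for audio delivery

    def run_epoch(eeg_10s, volume):
        """Return the updated tone volume after one 10-s epoch."""
        if delta_power(eeg_10s) > SWS_DELTA_POWER:
            volume = min(volume + 5, 100)   # escalate loudness, capped
            play_tone(volume)
        else:
            volume = 0                      # reset once SWS criteria end
        return volume

    # demo: a slow-wave-like 1.5 Hz signal triggers escalating tones
    t = np.arange(10 * FS) / FS
    sws_like = 75 * np.sin(2 * np.pi * 1.5 * t)
    vol = 0
    for _ in range(3):
        vol = run_epoch(sws_like, vol)
    ```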

  7. Automated face detection for occurrence and occupancy estimation in chimpanzees.

    PubMed

    Crunchant, Anne-Sophie; Egerer, Monika; Loos, Alexander; Burghardt, Tilo; Zuberbühler, Klaus; Corogenes, Katherine; Leinert, Vera; Kulik, Lars; Kühl, Hjalmar S

    2017-03-01

    Surveying endangered species is necessary to evaluate conservation effectiveness. Camera trapping and biometric computer vision are recent technological advances that have impacted the methods applicable to field surveys, and these methods have gained significant momentum over the last decade. Yet, most researchers inspect footage manually and few studies have used automated semantic processing of video trap data from the field. The particular aim of this study is to evaluate methods that incorporate automated face detection technology as an aid to estimate site use of two chimpanzee communities based on camera trapping. As a comparative baseline we employ traditional manual inspection of footage. Our analysis focuses specifically on the basic parameter of occurrence, where we assess the performance and practical value of chimpanzee face detection software. We found that the semi-automated data processing required only 2-4% of the time compared to the purely manual analysis. This is a non-negligible increase in efficiency that is critical when assessing the feasibility of camera trap occupancy surveys. Our evaluations suggest that our methodology estimates the proportion of sites used relatively reliably. Chimpanzees are mostly detected when they are present and when videos are filmed in high resolution: the highest recall rate was 77%, at a false alarm rate of 2.8%, for videos containing only chimpanzee frontal face views. Certainly, our study is only a first step toward transferring face detection software from the lab into field application. Our results are promising and indicate that the current limitation of detecting chimpanzees in camera trap footage due to a lack of suitable face views can easily be overcome at the level of field data collection, that is, by the combined placement of multiple high-resolution cameras facing reverse directions. This will enable researchers to routinely conduct chimpanzee occupancy surveys based on camera trapping and semi-automated processing of footage. Using semi-automated ape face detection technology for processing camera trap footage requires only 2-4% of the time compared to manual analysis and allows site use by chimpanzees to be estimated relatively reliably. © 2017 Wiley Periodicals, Inc.

  8. An event-related visual occlusion method for examining anticipatory skill in natural interceptive tasks.

    PubMed

    Mann, David L; Abernethy, Bruce; Farrow, Damian; Davis, Mark; Spratford, Wayne

    2010-05-01

    This article describes a new automated method for the controlled occlusion of vision during natural tasks. The method permits the time course of the presence or absence of visual information to be linked to identifiable events within the task of interest. An example application is presented in which the method is used to examine the ability of cricket batsmen to pick up useful information from the prerelease movement patterns of the opposing bowler. Two key events, separated by a consistent within-action time lag, were identified in the cricket bowling action sequence-namely, the penultimate foot strike prior to ball release (Event 1), and the subsequent moment of ball release (Event 2). Force-plate registration of Event 1 was then used as a trigger to facilitate automated occlusion of vision using liquid crystal occlusion goggles at time points relative to Event 2. Validation demonstrated that, compared with existing approaches that are based on manual triggering, this method of occlusion permitted considerable gains in temporal precision and a reduction in the number of unusable trials. A more efficient and accurate protocol to examine anticipation is produced, while preserving the important natural coupling between perception and action.

  9. Automated Segmentation of High-Resolution Photospheric Images of Active Regions

    NASA Astrophysics Data System (ADS)

    Yang, Meng; Tian, Yu; Rao, Changhui

    2018-02-01

    Due to the development of ground-based, large-aperture solar telescopes with adaptive optics (AO) resulting in increasing resolving ability, more accurate sunspot identifications and characterizations are required. In this article, we have developed a set of automated segmentation methods for high-resolution solar photospheric images. Firstly, a local-intensity-clustering level-set method is applied to roughly separate solar granulation and sunspots. Then reinitialization-free level-set evolution is adopted to adjust the boundaries of the photospheric patch; an adaptive intensity threshold is used to discriminate between umbra and penumbra; light bridges are selected according to their regional properties from candidates produced by morphological operations. The proposed method is applied to the solar high-resolution TiO 705.7-nm images taken by the 151-element AO system and Ground-Layer Adaptive Optics prototype system at the 1-m New Vacuum Solar Telescope of the Yunnan Observatory. Experimental results show that the method achieves satisfactory robustness and efficiency with low computational cost on high-resolution images. The method could also be applied to full-disk images, and the calculated sunspot areas correlate well with the data given by the National Oceanic and Atmospheric Administration (NOAA).
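
    As a toy stand-in for the segmentation stages above, the sketch below uses two Otsu thresholds instead of level-set evolution to split a synthetic photospheric image into granulation, penumbra, and umbra; the second threshold is computed adaptively within the sunspot, echoing the adaptive umbra/penumbra discrimination. Purely illustrative, not the published pipeline.

    ```python
    import numpy as np
    from skimage.filters import threshold_otsu

    def segment_sunspot(img):
        t1 = threshold_otsu(img)              # granulation vs. whole sunspot
        spot = img < t1
        t2 = threshold_otsu(img[spot])        # penumbra vs. umbra, adaptively
        labels = np.zeros(img.shape, dtype=np.uint8)   # 0 = granulation
        labels[spot] = 1                      # 1 = penumbra
        labels[img < t2] = 2                  # 2 = umbra
        return labels

    # synthetic image: bright granulation, darker ring, dark core
    img = np.full((64, 64), 1.0)
    yy, xx = np.mgrid[:64, :64]
    r = np.hypot(yy - 32, xx - 32)
    img[r < 20] = 0.6                         # penumbra
    img[r < 10] = 0.2                         # umbra
    img += 0.01 * np.random.default_rng(1).normal(size=img.shape)
    print(np.bincount(segment_sunspot(img).ravel()))
    ```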

  10. Automated Mini-Column Solid-Phase Extraction Cleanup for High-Throughput Analysis of Chemical Contaminants in Foods by Low-Pressure Gas Chromatography-Tandem Mass Spectrometry.

    PubMed

    Lehotay, Steven J; Han, Lijun; Sapozhnikova, Yelena

    2016-01-01

    This study demonstrated the application of an automated high-throughput mini-cartridge solid-phase extraction (mini-SPE) cleanup for the rapid low-pressure gas chromatography-tandem mass spectrometry (LPGC-MS/MS) analysis of pesticides and environmental contaminants in QuEChERS extracts of foods. Cleanup efficiencies and breakthrough volumes using different mini-SPE sorbents were compared using avocado, salmon, pork loin, and kale as representative matrices. Optimum extract load volume was 300 µL for the 45 mg mini-cartridges containing 20/12/12/1 (w/w/w/w) anh. MgSO4/PSA (primary secondary amine)/C18/CarbonX sorbents used in the final method. In method validation to demonstrate high-throughput capabilities and performance results, 230 spiked extracts of 10 different foods (apple, kiwi, carrot, kale, orange, black olive, wheat grain, dried basil, pork, and salmon) underwent automated mini-SPE cleanup and analysis over the course of 5 days. In all, 325 analyses for 54 pesticides and 43 environmental contaminants (3 analyzed together) were conducted using the 10 min LPGC-MS/MS method without changing the liner or retuning the instrument. Merely 1 mg equivalent of sample injected achieved <5 ng g⁻¹ limits of quantification. With the use of internal standards, method validation results showed that 91 of the 94 analytes, including pairs, achieved satisfactory results (70-120% recovery and RSD ≤ 25%) in the 10 tested food matrices (n = 160). Matrix effects were typically less than ±20%, mainly due to the use of analyte protectants, and minimal human review of software data processing was needed due to summation-function integration of analyte peaks. This study demonstrated that the automated mini-SPE + LPGC-MS/MS method yielded accurate results in rugged, high-throughput operations with minimal labor and data review.
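
    The acceptance rule used in the validation (70-120% mean recovery and RSD ≤ 25%) reduces to simple arithmetic; the sketch below applies it to invented replicate recoveries for two hypothetical analytes.

    ```python
    import numpy as np

    def passes_validation(recoveries_pct):
        """Return (pass/fail, mean recovery %, RSD %) for one analyte."""
        r = np.asarray(recoveries_pct, dtype=float)
        mean_rec = r.mean()
        rsd = 100.0 * r.std(ddof=1) / mean_rec
        return (70.0 <= mean_rec <= 120.0) and (rsd <= 25.0), mean_rec, rsd

    results = {                             # hypothetical spiked replicates
        "chlorpyrifos": [92, 101, 88, 95, 97],
        "PCB 153":      [64, 58, 71, 66, 60],
    }
    for analyte, recs in results.items():
        ok, mean_rec, rsd = passes_validation(recs)
        print(f"{analyte}: mean {mean_rec:.0f}%, RSD {rsd:.1f}% -> "
              f"{'pass' if ok else 'fail'}")
    ```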

  11. Automated Immunomagnetic Separation and Microarray Detection of E. coli O157:H7 from Poultry Carcass Rinse

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chandler, Darrell P.; Brown, Jeremy D.; Call, Douglas R.

    2001-09-01

    We describe the development and application of a novel electromagnetic flow cell and fluidics system for automated immunomagnetic separation of E. coli directly from unprocessed poultry carcass rinse, and the biochemical coupling of automated sample preparation with nucleic acid microarrays without cell growth. Highly porous nickel foam was used as a magnetic flux conductor. Up to 32% recovery efficiency of 'total' E. coli was achieved within the automated system with 6 sec contact times and a 15 minute protocol (from sample injection through elution), statistically similar to cell recovery efficiencies in > 1 hour 'batch' captures. The electromagnet flow cell allowed complete recovery of 2.8 µm particles directly from unprocessed poultry carcass rinse whereas the batch system did not. O157:H7 cells were reproducibly isolated directly from unprocessed poultry rinse with 39% recovery efficiency at a 10³ cells ml⁻¹ inoculum. Direct plating of washed beads showed positive recovery of O157:H7 directly from carcass rinse at an inoculum of 10 cells ml⁻¹. Recovered beads were used for direct PCR amplification and microarray detection, with a process-level detection limit (automated cell concentration through microarray detection) of < 10³ cells ml⁻¹ carcass rinse. The fluidic system and analytical approach described here are generally applicable to most microbial detection problems and applications.

  12. Automation Bias: Decision Making and Performance in High-Tech Cockpits

    NASA Technical Reports Server (NTRS)

    Mosier, Kathleen L.; Skitka, Linda J.; Heers, Susan; Burdick, Mark; Rosekind, Mark R. (Technical Monitor)

    1997-01-01

    Automated aids and decision support tools are rapidly becoming indispensable tools in high-technology cockpits, and are assuming increasing control of "cognitive" flight tasks, such as calculating fuel-efficient routes, navigating, or detecting and diagnosing system malfunctions and abnormalities. This study was designed to investigate "automation bias," a recently documented factor in the use of automated aids and decision support systems. The term refers to omission and commission errors resulting from the use of automated cues as a heuristic replacement for vigilant information seeking and processing. Glass-cockpit pilots flew flight scenarios involving automation "events," or opportunities for automation-related omission and commission errors. Pilots who perceived themselves as "accountable" for their performance and strategies of interaction with the automation were more likely to double-check automated functioning against other cues, and less likely to commit errors. Pilots were also likely to erroneously "remember" the presence of expected cues when describing their decision-making processes.

  13. An optimal model-based trajectory following architecture synthesising the lateral adaptive preview strategy and longitudinal velocity planning for highly automated vehicle

    NASA Astrophysics Data System (ADS)

    Cao, Haotian; Song, Xiaolin; Zhao, Song; Bao, Shan; Huang, Zhi

    2017-08-01

    Automated driving has received broad attention from academia and industry, since it can greatly reduce the severity of potential traffic accidents and help achieve the ultimate goals of automobile safety and comfort. This paper presents an optimal model-based trajectory following architecture for a highly automated vehicle in driving tasks such as automated guidance or lane keeping; it includes a velocity-planning module, a steering controller and a velocity-tracking controller. The velocity-planning module, which considers time efficiency and passenger comfort simultaneously, generates a smooth velocity profile. The robust sliding mode control (SMC) steering controller with an adaptive preview time strategy not only tracks the target path well, but also avoids the large lateral accelerations that can occur during path tracking, thanks to the fuzzy-adaptive preview time mechanism introduced. In addition, an SMC controller with an input-output linearisation method for velocity tracking is built and validated. Simulation results show that this trajectory following architecture is effective and feasible for a highly automated driving vehicle, in comparison with Driver-in-the-Loop simulations performed by an experienced driver and a novice driver, respectively. The simulation results demonstrate that the presented trajectory following architecture can plan a satisfying longitudinal speed profile and track the target path well and safely when dealing with different road geometry structures, ensuring good time efficiency and driving comfort simultaneously.

  14. Automated daily quality control analysis for mammography in a multi-unit imaging center.

    PubMed

    Sundell, Veli-Matti; Mäkelä, Teemu; Meaney, Alexander; Kaasalainen, Touko; Savolainen, Sauli

    2018-01-01

    Background: The high requirements for mammography image quality necessitate a systematic quality assurance process. Digital imaging allows automation of the image quality analysis, which can potentially improve repeatability and objectivity compared to a visual evaluation made by the users. Purpose: To develop an automatic image quality analysis software for daily mammography quality control in a multi-unit imaging center. Material and Methods: An automated image quality analysis software using the discrete wavelet transform and multiresolution analysis was developed for the American College of Radiology accreditation phantom. The software was validated by analyzing 60 randomly selected phantom images from six mammography systems and 20 phantom images with different dose levels from one mammography system. The results were compared to a visual analysis made by four reviewers. Additionally, long-term image quality trends of a full-field digital mammography system and a computed radiography mammography system were investigated. Results: The automated software produced feature detection levels comparable to visual analysis. The agreement was good in the case of fibers, while the software detected somewhat more microcalcifications and characteristic masses. Long-term follow-up via a quality assurance web portal demonstrated the feasibility of using the software for monitoring the performance of mammography systems in a multi-unit imaging center. Conclusion: Automated image quality analysis enables monitoring the performance of digital mammography systems in an efficient, centralized manner.

  15. Integrated Analysis Platform: An Open-Source Information System for High-Throughput Plant Phenotyping

    PubMed Central

    Klukas, Christian; Chen, Dijun; Pape, Jean-Michel

    2014-01-01

    High-throughput phenotyping is emerging as an important technology to dissect phenotypic components in plants. Efficient image processing and feature extraction are prerequisites to quantify plant growth and performance based on phenotypic traits. Issues include data management, image analysis, and result visualization of large-scale phenotypic data sets. Here, we present Integrated Analysis Platform (IAP), an open-source framework for high-throughput plant phenotyping. IAP provides user-friendly interfaces, and its core functions are highly adaptable. Our system supports image data transfer from different acquisition environments and large-scale image analysis for different plant species based on real-time imaging data obtained from different spectra. Due to the huge amount of data to manage, we utilized a common data structure for efficient storage and organization of both input data and result data. We implemented a block-based method for automated image processing to extract a representative list of plant phenotypic traits. We also provide tools for built-in data plotting and result export. For validation of IAP, we performed an example experiment that contains 33 maize (Zea mays ‘Fernandez’) plants, which were grown for 9 weeks in an automated greenhouse with nondestructive imaging. Subsequently, the image data were subjected to automated analysis with the maize pipeline implemented in our system. We found that the computed digital volume and number of leaves correlate with our manually measured data with high accuracy, up to 0.98 and 0.95, respectively. In summary, IAP provides a comprehensive set of functionalities for import/export, management, and automated analysis of high-throughput plant phenotyping data, and its analysis results are highly reliable. PMID:24760818

  16. Serum bactericidal assay for the evaluation of typhoid vaccine using a semi-automated colony-counting method.

    PubMed

    Jang, Mi Seon; Sahastrabuddhe, Sushant; Yun, Cheol-Heui; Han, Seung Hyun; Yang, Jae Seung

    2016-08-01

    Typhoid fever, mainly caused by Salmonella enterica serovar Typhi (S. Typhi), is a life-threatening disease, mostly in developing countries. Enzyme-linked immunosorbent assay (ELISA) is widely used to quantify antibodies against S. Typhi in serum but does not provide information about functional antibody titers. Although the serum bactericidal assay (SBA) using an agar plate is often used to measure functional antibody titers against various bacterial pathogens in clinical specimens, it has rarely been used for typhoid vaccines because it is time-consuming and labor-intensive. In the present study, we established an improved SBA against S. Typhi using a semi-automated colony-counting system with a square agar plate harboring 24 samples. The semi-automated SBA efficiently measured bactericidal titers of sera from individuals immunized with S. Typhi Vi polysaccharide vaccines. The assay responded specifically to S. Typhi Ty2 but not to other, irrelevant enteric bacteria including Vibrio cholerae and Shigella flexneri. Baby rabbit complement was a more appropriate source for the SBA against S. Typhi than complements from adult rabbit, guinea pig, and human. We also examined the correlation between SBA and ELISA for measuring antibody responses against S. Typhi using pre- and post-vaccination sera from 18 human volunteers. The SBA titer showed a good correlation with anti-Vi IgG quantity in the serum, as determined by a Spearman correlation coefficient of 0.737 (P < 0.001). Taken together, the semi-automated SBA might be efficient, accurate, sensitive, and specific enough to measure functional antibody titers against S. Typhi in sera from human subjects immunized with typhoid vaccines. Copyright © 2016 The Authors. Published by Elsevier Ltd. All rights reserved.

  17. Automated direct-immersion solid-phase microextraction using crosslinked polymeric ionic liquid sorbent coatings for the determination of water pollutants by gas chromatography.

    PubMed

    Cordero-Vaca, María; Trujillo-Rodríguez, María J; Zhang, Cheng; Pino, Verónica; Anderson, Jared L; Afonso, Ana M

    2015-06-01

    Four different crosslinked polymeric ionic liquid (PIL)-based sorbent coatings were evaluated in an automated direct-immersion solid-phase microextraction method (automated DI-SPME) in combination with gas chromatography (GC). The crosslinked PIL coatings were based on vinyl-alkylimidazolium- (ViCnIm-) or vinylbenzyl-alkylimidazolium- (ViBzCnIm-) IL monomers, and di-(vinylimidazolium)dodecane ((ViIm)2C12-) or di-(vinylbenzylimidazolium)dodecane ((ViBzIm)2C12-) dicationic IL crosslinkers. In addition, a PIL-based hybrid coating containing multi-walled carbon nanotubes (MWCNTs) was also studied. The studied PIL coatings were covalently attached to derivatized nitinol wires and mounted onto the Supelco assembly to ensure automation when acting as SPME coatings. Their behavior was evaluated in the determination of a group of water pollutants, after proper optimization. A comparison was carried out with three common commercial SPME fibers. It was observed that those PILs containing a benzyl group in their structures, either in the IL monomer and crosslinker (PIL-1-1) or only in the crosslinker (PIL-0-1), were the most efficient sorbents for the selected analytes. The validation of the overall automated DI-SPME-GC-flame ionization detector (FID) method gave limits of detection down to 135 μg·L⁻¹ for p-cresol when using the PIL-1-1 and down to 270 μg·L⁻¹ when using the PIL-0-1, despite their coating thicknesses of only ~2 and ~5 μm, respectively. Average relative recoveries in waters were 85 ± 14% and 87 ± 15% for PIL-1-1 and PIL-0-1, respectively. Precision values as relative standard deviation were always lower than 4.9% and 7.6% (spiked level between 10 and 750 μg·L⁻¹, as intra-day precision).

  18. Automated Cross-Sectional Measurement Method of Intracranial Dural Venous Sinuses.

    PubMed

    Lublinsky, S; Friedman, A; Kesler, A; Zur, D; Anconina, R; Shelef, I

    2016-03-01

    MRV is an important blood vessel imaging and diagnostic tool for the evaluation of stenosis, occlusions, or aneurysms. However, an accurate image-processing tool for vessel comparison is unavailable. The purpose of this study was to develop and test an automated technique for vessel cross-sectional analysis. An algorithm for vessel cross-sectional analysis was developed that included 7 main steps: 1) image registration, 2) masking, 3) segmentation, 4) skeletonization, 5) cross-sectional planes, 6) clustering, and 7) cross-sectional analysis. Phantom models were used to validate the technique. The method was also tested on a control subject and a patient with idiopathic intracranial hypertension (4 large sinuses tested: right and left transverse sinuses, superior sagittal sinus, and straight sinus). The cross-sectional area and shape measurements were evaluated before and after lumbar puncture in patients with idiopathic intracranial hypertension. The vessel-analysis algorithm had a high degree of stability with <3% of cross-sections manually corrected. All investigated principal cranial blood sinuses had a significant cross-sectional area increase after lumbar puncture (P ≤ .05). The average triangularity of the transverse sinuses was increased, and the mean circularity of the sinuses was decreased by 6% ± 12% after lumbar puncture. Comparison of phantom and real data showed that all computed errors were <1 voxel unit, which confirmed that the method provided a very accurate solution. In this article, we present a novel automated imaging method for cross-sectional vessels analysis. The method can provide an efficient quantitative detection of abnormalities in the dural sinuses. © 2016 by American Journal of Neuroradiology.
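
    One of the shape measures mentioned above, circularity, is worth making concrete: circularity = 4πA/P², which equals 1.0 for a circle and decreases as a cross-section flattens. The sketch below compares a circle with an equal-area 3:1 ellipse (perimeter via Ramanujan's approximation); it illustrates the metric only, not the full seven-step pipeline.

    ```python
    import numpy as np

    def circularity(area, perimeter):
        """4*pi*A / P^2: 1.0 for a circle, smaller for flattened shapes."""
        return 4.0 * np.pi * area / perimeter ** 2

    # a circle and a 3:1 ellipse of equal area
    r = 5.0
    a, b = r * np.sqrt(3), r / np.sqrt(3)
    h = ((a - b) / (a + b)) ** 2
    p_ellipse = np.pi * (a + b) * (1 + 3 * h / (10 + np.sqrt(4 - 3 * h)))
    print(circularity(np.pi * r**2, 2 * np.pi * r))    # -> 1.00
    print(circularity(np.pi * a * b, p_ellipse))       # -> ~0.66
    ```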

  19. Development and validation of automated 2D-3D bronchial airway matching to track changes in regional bronchial morphology using serial low-dose chest CT scans in children with chronic lung disease.

    PubMed

    Raman, Pavithra; Raman, Raghav; Newman, Beverley; Venkatraman, Raman; Raman, Bhargav; Robinson, Terry E

    2010-12-01

    To address potential concern for cumulative radiation exposure with serial spiral chest computed tomography (CT) scans in children with chronic lung disease, we developed an approach to match bronchial airways on low-dose spiral and low-dose high-resolution CT (HRCT) chest images to allow serial comparisons. An automated algorithm matches the position and orientation of bronchial airways obtained from HRCT slices with those in the spiral CT scan. To validate this algorithm, we compared manual matching vs automatic matching of bronchial airways in three pediatric patients. The mean absolute percentage difference between the manually matched spiral CT airways and the index HRCT airways was 9.4 ± 8.5% for the internal diameter measurements, 6.0 ± 4.1% for the outer diameter measurements, and 10.1 ± 9.3% for the wall thickness measurements. The mean absolute percentage difference between the automatically matched spiral CT airway measurements and the index HRCT airway measurements was 9.2 ± 8.6% for the inner diameter, 5.8 ± 4.5% for the outer diameter, and 9.9 ± 9.5% for the wall thickness. The overall difference between the manual and automated methods was 2.1 ± 1.2%, which was significantly less than the interuser variability of 5.1 ± 4.6% (p<0.05). Tests of equivalence had p<0.05, demonstrating no significant difference between the two methods. The time required for matching was significantly reduced in the automated method (p<0.01), which was as accurate as manual matching, allowing efficient comparison of airways obtained on low-dose spiral CT imaging with low-dose HRCT scans.

  20. Automated chromatographic laccase-mediator-system activity assay.

    PubMed

    Anders, Nico; Schelden, Maximilian; Roth, Simon; Spiess, Antje C

    2017-08-01

    To study the interaction of laccases, mediators, and substrates in laccase-mediator systems (LMS), an on-line measurement was developed using high performance anion exchange chromatography equipped with a CarboPac™ PA 100 column coupled to pulsed amperometric detection (HPAEC-PAD). The developed method was optimized for overall chromatographic run time (45 to 120 min) and automated sample drawing. As an example, the Trametes versicolor laccase-induced oxidation of 1-(3,4-dimethoxyphenyl)-2-(2-methoxyphenoxy)-1,3-dihydroxypropane (adlerol) using 1-hydroxybenzotriazole (HBT) as mediator was measured and analyzed on-line. Since the Au electrode of the PAD detects only hydroxyl-group-containing substances, with a limit of detection in the milligram-per-liter range, not all products are measurable. Therefore, this method was applied for the quantification of adlerol and, based on adlerol conversion, for the quantification of the LMS activity at a specific T. versicolor laccase/HBT ratio. The automated chromatographic activity assay allowed for a defined reaction start of all laccase-mediator-system reaction mixtures, and the LMS reaction progress was automatically monitored for 48 h. The automation enabled integrated monitoring overnight and over weekends and minimized manual errors such as the pipetting of solutions. The activity of the LMS based on adlerol consumption was determined to be 0.47 U/mg protein for a laccase/mediator ratio of 1.75 U laccase/g HBT. In the future, the automated method will allow for fast screening of combinations of laccases, mediators, and substrates that are efficient for lignin modification. In particular, it allows for fast and easy quantification of the oxidizing activity of an LMS on a lignin-related substrate, which is not covered by typical colorimetric laccase assays.

  1. Automated processing of whole blood units: operational value and in vitro quality of final blood components

    PubMed Central

    Jurado, Marisa; Algora, Manuel; Garcia-Sanchez, Félix; Vico, Santiago; Rodriguez, Eva; Perez, Sonia; Barbolla, Luz

    2012-01-01

    Background: The Community Transfusion Centre in Madrid currently processes whole blood using a conventional procedure (Compomat, Fresenius) followed by automated processing of buffy coats with the OrbiSac system (CaridianBCT). The Atreus 3C system (CaridianBCT) automates the production of red blood cells, plasma and an interim platelet unit from a whole blood unit. Interim platelet units are pooled to produce a transfusable platelet unit. In this study the Atreus 3C system was evaluated and compared to the routine method with regard to product quality and operational value. Materials and methods: Over a 5-week period 810 whole blood units were processed using the Atreus 3C system. The attributes of the automated process were compared to those of the routine method by assessing productivity, space, equipment and staffing requirements. The data obtained were evaluated in order to estimate the impact of implementing the Atreus 3C system in the routine setting of the blood centre. Yield and in vitro quality of the final blood components processed with the two systems were evaluated and compared. Results: The Atreus 3C system enabled higher throughput while requiring less space and employee time by decreasing the amount of equipment and processing time per unit of whole blood processed. Whole blood units processed on the Atreus 3C system gave a higher platelet yield, a similar amount of red blood cells and a smaller volume of plasma. Discussion: These results support the conclusion that the Atreus 3C system produces blood components meeting quality requirements while providing a high operational efficiency. Implementation of the Atreus 3C system could result in a large organisational improvement. PMID:22044958

  2. Blastocyst microinjection automation.

    PubMed

    Mattos, Leonardo S; Grant, Edward; Thresher, Randy; Kluckman, Kimberly

    2009-09-01

    Blastocyst microinjections are routinely involved in the process of creating genetically modified mice for biomedical research, but their efficiency is highly dependent on the skills of the operators. As a consequence, much time and resources are required for training microinjection personnel. This situation has been aggravated by the rapid growth of genetic research, which has increased the demand for mutant animals. Therefore, increased productivity and efficiency in this area are highly desired. Here, we pursue these goals through the automation of a previously developed teleoperated blastocyst microinjection system. This included the design of a new system setup to facilitate automation, the definition of rules for automatic microinjections, the implementation of video processing algorithms to extract feedback information from microscope images, and the creation of control algorithms for process automation. Experimentation conducted with this new system and operator assistance during the cells delivery phase demonstrated a 75% microinjection success rate. In addition, implantation of the successfully injected blastocysts resulted in a 53% birth rate and a 20% yield of chimeras. These results proved that the developed system was capable of automatic blastocyst penetration and retraction, demonstrating the success of major steps toward full process automation.

  3. USSR Report: Cybernetics, Computers and Automation Technology. No. 69.

    DTIC Science & Technology

    1983-05-06

    Partial contents: the use of computers in multiprocessor and multistation design, control, and scientific research automation systems, with results comparing their efficiency (Podvizhnaya, Scientific Research Institute of Control Computers, Severodonetsk); the most significant changes in the design of the SM-2M compared to earlier models (UPRAVLYAYUSHCHIYE SISTEMY I MASHINY, Nov-Dec 82); and, under Applications, the Kiev Automated Control System, its design features and prospects for development (V. A...).

  4. Space station as a vital focus for advancing the technologies of automation and robotics

    NASA Technical Reports Server (NTRS)

    Varsi, Giulio; Herman, Daniel H.

    1988-01-01

    A major guideline for the design of the U.S. Space Station is that the Space Station address a wide variety of functions. These functions include the servicing of unmanned assets in space, the support of commercial labs in space, and the efficient management of the Space Station itself, the largest space asset. The technologies of automation and robotics promise to help reduce Space Station operating costs and to achieve a highly efficient use of the human in space. The use of advanced automation and artificial intelligence techniques, such as expert systems, in Space Station subsystems for activity planning and failure mode management will enable us to reduce dependency on a mission control center and could ultimately result in breaking the umbilical link from Earth to the Space Station. The application of robotic technologies with advanced perception capability and hierarchical intelligent control to servicing systems will enable the servicing of assets either in space or in situ with a high degree of human efficiency. The results of studies leading toward the formulation of an automation and robotics plan for Space Station development are presented.

  5. Automated Acquisition of Proximal Femur Morphological Characteristics

    NASA Astrophysics Data System (ADS)

    Tabakovic, Slobodan; Zeljkovic, Milan; Milojevic, Zoran

    2014-10-01

    The success of hip arthroplasty surgery largely depends on how well the endoprosthesis is adjusted to the patient's femur. This implies that the position of the femoral bone in relation to the pelvis is preserved and that the endoprosthesis position ensures its longevity. The dimensions and body shape of the hip joint endoprosthesis and its position after the surgery depend on a number of geometrical parameters of the patient's femur. One of the most suitable methods for determining these parameters involves 3D reconstruction of the femur, based on diagnostic images, and subsequent determination of the required geometric parameters. In this paper, software for the automated determination of geometric parameters of the femur is presented. A detailed software development procedure, intended to enable faster and more efficient design of hip endoprostheses that meet patients' specific requirements, is also offered.

  6. The development of small-scale mechanization means positioning algorithm using radio frequency identification technology in industrial plants

    NASA Astrophysics Data System (ADS)

    Astafiev, A.; Orlov, A.; Privezencev, D.

    2018-01-01

    The article is devoted to the development of technology and software for constructing positioning and control systems for small-scale mechanization means in industrial plants based on radio frequency identification methods, which will be the basis for creating highly efficient intelligent systems for controlling product movement in industrial enterprises. The main standards applied in the field of product movement control automation and radio frequency identification are considered. The article reviews modern publications and automation systems for the control of product movement developed by domestic and foreign manufacturers. It describes the developed algorithm for positioning small-scale mechanization means in an industrial enterprise. Experimental studies in laboratory and production conditions have been conducted and are described in the article.

  7. Automated synthesis of 4-[(18)F]fluoroanisole, [(18)F]DAA1106 and 4-[(18)F]FPhe using Cu-mediated radiofluorination under "minimalist" conditions.

    PubMed

    Zischler, Johannes; Krapf, Philipp; Richarz, Raphael; Zlatopolskiy, Boris D; Neumaier, Bernd

    2016-09-01

    The application of the "minimalist" approach to Cu-mediated radiofluorination allows the efficient preparation of (18)F-labeled arenes regardless of their electronic properties. The implementation of this methodology on a commercially available synthesis module (hotbox(three), Scintomics, Germany) enabled the automated production of 4-[(18)F]fluoroanisole as well as the clinically relevant PET tracers 4-[(18)F]FPhe and [(18)F]DAA1106, in radiochemical yields of 41-61% and radiochemical purities of >95% within 30-60 min. These results demonstrate the high efficacy and versatility of the developed method, which will open up opportunities for broad application of Cu-mediated radiofluorination in PET chemistry. Copyright © 2016 Elsevier Ltd. All rights reserved.

  8. Online fully automated three-dimensional surface reconstruction of unknown objects

    NASA Astrophysics Data System (ADS)

    Khalfaoui, Souhaiel; Aigueperse, Antoine; Fougerolle, Yohan; Seulin, Ralph; Fofi, David

    2015-04-01

    This paper presents a novel scheme for automatic and intelligent 3D digitization using robotic cells. The advantage of our procedure is that it is generic, since it is not tied to a specific scanning technology. Moreover, it does not depend on the methods used to perform the tasks associated with each elementary process. The comparison of results between manual and automatic scanning of complex objects shows that our digitization strategy is very efficient and faster than trained experts. The 3D models of the different objects are obtained with a strongly reduced number of acquisitions while moving the ranging device efficiently.

  9. Human-Centered Aviation Automation: Principles and Guidelines

    NASA Technical Reports Server (NTRS)

    Billings, Charles E.

    1996-01-01

    This document presents principles and guidelines for human-centered automation in aircraft and in the aviation system. Drawing upon operational experience with highly automated aircraft, it describes classes of problems that have occurred in these vehicles, the effects of advanced automation on the human operators of the aviation system, and ways in which these problems may be avoided in the design of future aircraft and air traffic management automation. Many incidents and a few serious accidents suggest that these problems are related to automation complexity, autonomy, coupling, and opacity, or inadequate feedback to operators. An automation philosophy that emphasizes improved communication, coordination and cooperation between the human and machine elements of this complex, distributed system is required to improve the safety and efficiency of aviation operations in the future.

  10. Feasibility and Utility of Lexical Analysis for Occupational Health Text.

    PubMed

    Harber, Philip; Leroy, Gondy

    2017-06-01

    Assess feasibility and potential utility of natural language processing (NLP) for storing and analyzing occupational health data. Basic NLP lexical analysis methods were applied to 89,000 Mine Safety and Health Administration (MSHA) free text records. Steps included tokenization, term and co-occurrence counts, term annotation, and identifying exposure-health effect relationships. Presence of terms in the Unified Medical Language System (UMLS) was assessed. The methods efficiently demonstrated common exposures, health effects, and exposure-injury relationships. Many workplace terms are not present in UMLS or map inaccurately. Use of free text rather than narrowly defined numerically coded fields is feasible, flexible, and efficient. It has potential to encourage workers and clinicians to provide more data and to support automated knowledge creation. The lexical method used is easily generalizable to other areas. The UMLS vocabularies should be enhanced to be relevant to occupational health.

  11. A Simple Automated Method for the Determination of Nitrate and Nitrite in Infant Formula and Milk Powder Using Sequential Injection Analysis

    PubMed Central

    Pistón, Mariela; Mollo, Alicia; Knochen, Moisés

    2011-01-01

    A fast and efficient automated method using a sequential injection analysis (SIA) system, based on the Griess reaction, was developed for the determination of nitrate and nitrite in infant formulas and milk powder. The system mixes a measured amount of sample (previously reconstituted in liquid form and deproteinized) with the chromogenic reagent to produce a colored substance whose absorbance is recorded. For nitrate determination, an on-line prereduction step was added by passing the sample through a Cd minicolumn. The system was controlled from a PC by means of a user-friendly program. Figures of merit include linearity (r2 > 0.999 for both analytes), limits of detection (0.32 mg kg−1 NO3-N and 0.05 mg kg−1 NO2-N), and precision (sr%) of 0.8–3.0. Results were statistically in good agreement with those obtained with the reference ISO-IDF method. The sampling frequency was 30 hour−1 (nitrate) and 80 hour−1 (nitrite) when performed separately. PMID:21960750

  12. DEF: an automated dead-end filling approach based on quasi-endosymbiosis.

    PubMed

    Liu, Lili; Zhang, Zijun; Sheng, Taotao; Chen, Ming

    2017-02-01

    Gap filling for the reconstruction of metabolic networks aims to restore the connectivity of metabolites by finding high-confidence reactions that may be missing in the target organism. Current gap-filling methods either rely on network topology alone or have limited capability in finding missing reactions that are indirectly related to dead-end metabolites but of biological importance to the target model. We present an automated dead-end filling (DEF) approach, inspired by endosymbiosis theory, which fills gaps by finding the most efficient dead-end utilization paths in a constructed quasi-endosymbiosis model. The recalls of reactions and dead ends of DEF reach around 73% and 86%, respectively. This method is capable of finding indirectly dead-end-related reactions of biological importance to the target organism and is applicable to any given metabolic model. In the E. coli iJR904 model, for instance, about 42% of the dead-end metabolites were fixed by our proposed method. DEF is publicly available at http://bis.zju.edu.cn/DEF/. mchen@zju.edu.cn Supplementary data are available at Bioinformatics online.
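
    To make the notion of a dead end concrete, the following minimal Python sketch flags metabolites that are consumed but never produced (or vice versa) in a toy reaction set; the reactions are invented, and DEF's quasi-endosymbiosis model and path search are not reproduced here.

    ```python
    # Minimal sketch of dead-end detection in a metabolic network.
    # Reaction names and metabolites are invented for illustration.
    reactions = {
        "R1": ({"glc"}, {"g6p"}),        # (substrates, products)
        "R2": ({"g6p"}, {"f6p"}),
        "R3": ({"f6p", "atp"}, {"fbp"}),
    }

    produced = set().union(*(prods for _, prods in reactions.values()))
    consumed = set().union(*(subs for subs, _ in reactions.values()))

    # A metabolite is a dead end if it is consumed but never produced
    # (no-production) or produced but never consumed (no-consumption).
    no_production = consumed - produced     # here: glc, atp
    no_consumption = produced - consumed    # here: fbp
    print(no_production, no_consumption)
    ```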

  13. Efficiency of an automated reception and turnaround time management system for the phlebotomy room.

    PubMed

    Yun, Soon Gyu; Shin, Jeong Won; Park, Eun Su; Bang, Hae In; Kang, Jung Gu

    2016-01-01

    Recent advances in laboratory information systems have largely been focused on automation. However, the phlebotomy services have not been completely automated. To address this issue, we introduced an automated reception and turnaround time (TAT) management system, for the first time in Korea, whereby the patient's information is transmitted directly to the actual phlebotomy site and the TAT for each phlebotomy step can be monitored at a glance. The GNT5 system (Energium Co., Ltd., Korea) was installed in June 2013. The automated reception and TAT management system has been in operation since February 2014. Integration of the automated reception machine with the GNT5 allowed for direct transmission of laboratory order information to the GNT5 without involving any manual reception step. We used the mean TAT from reception to actual phlebotomy as the parameter for evaluating the efficiency of our system. Mean TAT decreased from 5:45 min to 2:42 min after operationalization of the system. The mean number of patients in queue decreased from 2.9 to 1.0. Further, the number of cases taking more than five minutes from reception to phlebotomy, defined as the defect rate, decreased from 20.1% to 9.7%. The use of automated reception and TAT management system was associated with a decrease of overall TAT and an improved workflow at the phlebotomy room.

  14. Automated Classification of Medical Percussion Signals for the Diagnosis of Pulmonary Injuries

    NASA Astrophysics Data System (ADS)

    Bhuiyan, Md Moinuddin

    Used for centuries in clinical practice, audible percussion is a method of eliciting sounds from areas of the human body with either the fingertips or a percussion hammer. Despite its advantages, pulmonary diagnostics by percussion is still highly subjective, depends on the physician's skills, and requires quiet surroundings. Automation of this well-established technique could help amplify its existing merits while removing the above drawbacks. In this study, an attempt is made to automatically decompose clinical percussion signals into a sum of Exponentially Damped Sinusoids (EDS) using the Matrix Pencil Method; in this case the EDS form a more natural basis than Fourier harmonics and thus allow for a more robust representation of the signal in the parametric space. It is found that some EDS represent transient oscillation modes of the thorax/abdomen excited by the percussion event, while others are associated with noise. It is demonstrated that relatively few EDS are usually enough to accurately reconstruct the original signal. It is shown that combining the frequency and damping parameters of the most significant EDS allows for efficient classification of percussion signals into the two main types historically known as "resonant" and "tympanic". This classification ability can provide a basis for the automated objective diagnostics of various pulmonary pathologies, including pneumothorax.
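
    As a concrete illustration of the decomposition, here is a minimal numpy sketch of the Matrix Pencil Method applied to a synthetic damped sinusoid; the pencil parameter and the singular-value threshold are illustrative choices, not the study's settings.

    ```python
    import numpy as np

    def matrix_pencil(x, fs, L=None, rank=None):
        """Estimate EDS poles of x[n] ~ sum_k a_k * z_k**n via the Matrix Pencil Method."""
        N = len(x)
        if L is None:
            L = N // 3                     # pencil parameter; N/3 is a common choice
        Y = np.array([x[i:i + L + 1] for i in range(N - L)])   # Hankel data matrix
        Y0, Y1 = Y[:, :-1], Y[:, 1:]
        U, s, Vh = np.linalg.svd(Y0, full_matrices=False)
        if rank is None:                   # keep singular values above a noise floor
            rank = int(np.sum(s > 1e-3 * s[0]))
        U, s, Vh = U[:, :rank], s[:rank], Vh[:rank]
        # Nonzero eigenvalues of pinv(Y0) @ Y1 equal those of this rank-by-rank matrix.
        z = np.linalg.eigvals(np.diag(1.0 / s) @ U.conj().T @ Y1 @ Vh.conj().T)
        return np.angle(z) * fs / (2 * np.pi), np.log(np.abs(z)) * fs   # Hz, 1/s

    # Synthetic percussion-like signal: one decaying 150 Hz mode plus noise.
    fs = 4000
    t = np.arange(0, 0.25, 1 / fs)
    x = np.exp(-20 * t) * np.cos(2 * np.pi * 150 * t) + 0.01 * np.random.randn(t.size)
    freqs, damping = matrix_pencil(x, fs)
    print(freqs, damping)   # significant EDS cluster near ±150 Hz with damping ~ -20/s
    ```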

  15. Automated planning of MRI scans of knee joints

    NASA Astrophysics Data System (ADS)

    Bystrov, Daniel; Pekar, Vladimir; Young, Stewart; Dries, Sebastian P. M.; Heese, Harald S.; van Muiswinkel, Arianne M.

    2007-03-01

    A novel and robust method for automatic scan planning of MRI examinations of knee joints is presented. Clinical knee examinations require acquisition of a 'scout' image, in which the operator manually specifies the scan volume orientations (off-centres, angulations, field-of-view) for the subsequent diagnostic scans. This planning task is time-consuming and requires skilled operators. The proposed automated planning system determines orientations for the diagnostic scan by using a set of anatomical landmarks derived by adapting active shape models of the femur, patella and tibia to the acquired scout images. The expert knowledge required to position scan geometries is learned from previous manually planned scans, allowing individual preferences to be taken into account. The system is able to automatically discriminate between left and right knees. This allows to use and merge training data from both left and right knees, and to automatically transform all learned scan geometries to the side for which a plan is required, providing a convenient integration of the automated scan planning system in the clinical routine. Assessment of the method on the basis of 88 images from 31 different individuals, exhibiting strong anatomical and positional variability demonstrates success, robustness and efficiency of all parts of the proposed approach, which thus has the potential to significantly improve the clinical workflow.

  16. Fully Automated Deep Learning System for Bone Age Assessment.

    PubMed

    Lee, Hyunkwang; Tajmir, Shahein; Lee, Jenny; Zissen, Maurice; Yeshiwas, Bethel Ayele; Alkasab, Tarik K; Choy, Garry; Do, Synho

    2017-08-01

    Skeletal maturity progresses through discrete phases, a fact that is used routinely in pediatrics, where bone age assessments (BAAs) are compared to chronological age in the evaluation of endocrine and metabolic disorders. While central to many disease evaluations, little has changed to improve the tedious process since its introduction in 1950. In this study, we propose a fully automated deep learning pipeline to segment a region of interest, standardize and preprocess input radiographs, and perform BAA. Our models use an ImageNet pretrained, fine-tuned convolutional neural network (CNN) to achieve 57.32 and 61.40% accuracies for the female and male cohorts on our held-out test images. Female test radiographs were assigned a BAA within 1 year 90.39% and within 2 years 98.11% of the time; male test radiographs, 94.18% within 1 year and 99.00% within 2 years. Using the input occlusion method, attention maps were created which reveal what features the trained model uses to perform BAA. These correspond to what human experts look at when manually performing BAA. Finally, the fully automated BAA system was deployed in the clinical environment as a decision support system for more accurate and efficient BAAs at much faster interpretation time (<2 s) than the conventional method.
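
    A minimal PyTorch sketch of the general recipe (an ImageNet-pretrained CNN with its final layer resized and fine-tuned for discrete bone-age classes) follows; the backbone, class count, and dummy batch are illustrative assumptions, not the paper's pipeline.

    ```python
    import torch
    import torch.nn as nn
    from torchvision import models

    # Illustrative class count; the paper's label scheme is not reproduced here.
    num_bone_age_classes = 20

    # Start from ImageNet weights and replace the classification head.
    model = models.resnet18(weights=models.ResNet18_Weights.IMAGENET1K_V1)
    model.fc = nn.Linear(model.fc.in_features, num_bone_age_classes)

    optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
    criterion = nn.CrossEntropyLoss()

    # One illustrative fine-tuning step on a dummy batch of radiographs.
    images = torch.randn(8, 3, 224, 224)     # stand-in for preprocessed hand films
    labels = torch.randint(0, num_bone_age_classes, (8,))
    optimizer.zero_grad()
    loss = criterion(model(images), labels)
    loss.backward()
    optimizer.step()
    print(float(loss))
    ```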

  17. The Human Side of Library Automation.

    ERIC Educational Resources Information Center

    Morris, Anne; Barnacle, Stephen

    1989-01-01

    Discusses the importance of recognizing the human component in library automation systems to ensure the smooth and efficient operation of the system. Human factors considerations are discussed in terms of health and safety aspects, ergonomics, workplace design, and job organization. (41 references) (CLB)

  18. A Novel Patient Recruitment Strategy: Patient Selection Directly from the Community through Linkage to Clinical Data.

    PubMed

    Zimmerman, Lindsay P; Goel, Satyender; Sathar, Shazia; Gladfelter, Charon E; Onate, Alejandra; Kane, Lindsey L; Sital, Shelly; Phua, Jasmin; Davis, Paris; Margellos-Anast, Helen; Meltzer, David O; Polonsky, Tamar S; Shah, Raj C; Trick, William E; Ahmad, Faraz S; Kho, Abel N

    2018-01-01

    This article describes our methods in developing a novel strategy for recruitment of underrepresented, community-based participants for pragmatic research studies leveraging routinely collected electronic health record (EHR) data. We designed a new approach for recruiting eligible patients from the community, while also leveraging affiliated health systems to extract clinical data for community participants. The strategy involves methods for data collection, linkage, and tracking. In this workflow, potential participants are identified in the community and surveyed regarding eligibility. These data are then encrypted and deidentified via a hashing algorithm for linkage of the community participant back to a record at a clinical site. The linkage allows for eligibility verification and automated follow-up. Longitudinal data are collected by querying the EHR data and surveying the community participant directly. We discuss this strategy within the context of two national research projects, a clinical trial and an observational cohort study. The community-based recruitment strategy is a novel, low-touch clinical trial enrollment method to engage a diverse set of participants. Direct outreach to community participants, while utilizing EHR data for clinical information and follow-up, allows for efficient recruitment and follow-up strategies. This new strategy links data reported by community participants to clinical data in the EHR and allows for eligibility verification and automated follow-up. The workflow has the potential to improve recruitment efficiency and engage traditionally underrepresented individuals in research. Schattauer GmbH Stuttgart.
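
    The linkage step can be illustrated with a minimal Python sketch: identifiers are normalized and passed through a salted one-way hash, so matching tokens, rather than names, link community surveys to EHR records. The field choice, normalization, and salt handling here are assumptions; the study's exact hashing algorithm is not specified in the abstract.

    ```python
    import hashlib

    SALT = "shared-secret-salt"   # hypothetical secret shared only by linking sites

    def linkage_hash(first, last, dob):
        """Deterministic, de-identified token from normalized identifiers."""
        key = f"{first.strip().lower()}|{last.strip().lower()}|{dob}|{SALT}"
        return hashlib.sha256(key.encode("utf-8")).hexdigest()

    # The same person surveyed in the community and seen at a clinical site
    # yields the same token, so records link without exchanging identifiers.
    community_token = linkage_hash("Maria", "Lopez", "1980-02-14")
    ehr_token = linkage_hash(" maria", "LOPEZ ", "1980-02-14")
    print(community_token == ehr_token)   # True
    ```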

  19. Automated property optimization via ab initio O(N) elongation method: Application to (hyper-)polarizability in DNA

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Orimoto, Yuuichi, E-mail: orimoto.yuuichi.888@m.kyushu-u.ac.jp; Aoki, Yuriko; Japan Science and Technology Agency, CREST, 4-1-8 Hon-chou, Kawaguchi, Saitama 332-0012

    An automated property optimization method was developed based on the ab initio O(N) elongation (ELG) method and applied to the optimization of nonlinear optical (NLO) properties in DNA as a first test. The ELG method mimics a polymerization reaction on a computer, and the reaction terminal of a starting cluster is attacked by monomers sequentially to elongate the electronic structure of the system by solving in each step a limited space including the terminal (localized molecular orbitals at the terminal) and monomer. The ELG-finite field (ELG-FF) method for calculating (hyper-)polarizabilities was used as the engine program of the optimization method, and it was found to show linear scaling efficiency while maintaining high computational accuracy for a random sequenced DNA model. Furthermore, the self-consistent field convergence was significantly improved by using the ELG-FF method compared with a conventional method, and it can lead to more feasible NLO property values in the FF treatment. The automated optimization method successfully chose an appropriate base pair from four base pairs (A, T, G, and C) for each elongation step according to an evaluation function. From test optimizations for the first order hyper-polarizability (β) in DNA, a substantial difference was observed depending on optimization conditions between “choose-maximum” (choose a base pair giving the maximum β for each step) and “choose-minimum” (choose a base pair giving the minimum β). In contrast, there was an ambiguous difference between these conditions for optimizing the second order hyper-polarizability (γ) because of the small absolute value of γ and the limitation of numerical differential calculations in the FF method. It can be concluded that the ab initio level property optimization method introduced here can be an effective step towards an advanced computer aided material design method as long as the numerical limitation of the FF method is taken into account.

  20. Automated property optimization via ab initio O(N) elongation method: Application to (hyper-)polarizability in DNA.

    PubMed

    Orimoto, Yuuichi; Aoki, Yuriko

    2016-07-14

    An automated property optimization method was developed based on the ab initio O(N) elongation (ELG) method and applied to the optimization of nonlinear optical (NLO) properties in DNA as a first test. The ELG method mimics a polymerization reaction on a computer, and the reaction terminal of a starting cluster is attacked by monomers sequentially to elongate the electronic structure of the system by solving in each step a limited space including the terminal (localized molecular orbitals at the terminal) and monomer. The ELG-finite field (ELG-FF) method for calculating (hyper-)polarizabilities was used as the engine program of the optimization method, and it was found to show linear scaling efficiency while maintaining high computational accuracy for a random sequenced DNA model. Furthermore, the self-consistent field convergence was significantly improved by using the ELG-FF method compared with a conventional method, and it can lead to more feasible NLO property values in the FF treatment. The automated optimization method successfully chose an appropriate base pair from four base pairs (A, T, G, and C) for each elongation step according to an evaluation function. From test optimizations for the first order hyper-polarizability (β) in DNA, a substantial difference was observed depending on optimization conditions between "choose-maximum" (choose a base pair giving the maximum β for each step) and "choose-minimum" (choose a base pair giving the minimum β). In contrast, there was an ambiguous difference between these conditions for optimizing the second order hyper-polarizability (γ) because of the small absolute value of γ and the limitation of numerical differential calculations in the FF method. It can be concluded that the ab initio level property optimization method introduced here can be an effective step towards an advanced computer aided material design method as long as the numerical limitation of the FF method is taken into account.
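
    The finite-field (FF) ingredient can be illustrated with a minimal Python sketch: (hyper-)polarizabilities are extracted from energies at a few field strengths by central differences. The model energy below is invented; the sketch also shows why small property values are numerically delicate in the FF treatment, since the result depends on the field step.

    ```python
    # Minimal sketch of finite-field (FF) extraction of (hyper-)polarizabilities
    # from energies E(F) at a few field strengths, via central differences.
    # E(F) is an invented model energy, not an ab initio calculation.
    def energy(F, mu=0.8, alpha=12.0, beta=150.0):
        return -mu * F - 0.5 * alpha * F**2 - beta * F**3 / 6.0

    h = 0.001   # field step (a.u.); FF results are sensitive to this choice
    E = {k: energy(k * h) for k in (-2, -1, 0, 1, 2)}

    # For the expansion above, E''(0) = -alpha and E'''(0) = -beta.
    alpha_ff = -(E[1] - 2 * E[0] + E[-1]) / h**2
    beta_ff = -(E[2] - 2 * E[1] + 2 * E[-1] - E[-2]) / (2 * h**3)
    print(alpha_ff, beta_ff)   # recovers ~12.0 and ~150.0
    ```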

  1. Rapid iterative reanalysis for automated design

    NASA Technical Reports Server (NTRS)

    Bhatia, K. G.

    1973-01-01

    A method for iterative reanalysis in automated structural design is presented for a finite-element analysis using the direct stiffness approach. A basic feature of the method is that the generalized stiffness and inertia matrices are expressed as functions of structural design parameters, and these generalized matrices are expanded in Taylor series about the initial design. Only the linear terms are retained in the expansions. The method is approximate because it uses static condensation, modal reduction, and the linear Taylor series expansions. The exact linear representation of the expansions of the generalized matrices is also described, and a basis for the present method is established. Results of applying the present method to the recalculation of the natural frequencies of two simple platelike structural models are presented and compared with results obtained using a commonly applied analysis procedure as a reference. In general, the results are in good agreement. A comparison of the computer times required by the present method and the reference method indicated that the present method required substantially less time for reanalysis. Although the results presented are for relatively small-order problems, the present method will become more efficient relative to the reference method as the problem size increases. An extension of the present method to static reanalysis is described, and a basis for unifying the static and dynamic reanalysis procedures is presented.
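
    The central idea, expanding the generalized matrices about the initial design and retaining only the linear terms, can be sketched in a few lines of numpy; the 2-DOF model and its cubic dependence on the design parameter are invented for illustration.

    ```python
    import numpy as np

    # Invented design dependence: stiffness scales as p**3 (e.g. plate thickness).
    def K(p):
        return p**3 * np.array([[4.0, -2.0], [-2.0, 4.0]])

    M = np.eye(2)   # mass matrix taken as identity for simplicity

    def freqs(Kmat):
        # Natural frequencies in Hz for the generalized problem with M = I.
        return np.sqrt(np.linalg.eigvalsh(Kmat)) / (2 * np.pi)

    p0, dp = 1.0, 0.05
    dK_dp = 3 * p0**2 * np.array([[4.0, -2.0], [-2.0, 4.0]])   # analytic dK/dp at p0

    K_taylor = K(p0) + dp * dK_dp    # linear-term-only expansion, no reassembly needed
    print(freqs(K_taylor))           # approximate reanalysis
    print(freqs(K(p0 + dp)))         # exact recomputation, for comparison
    ```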

  2. Efficient droplet router for digital microfluidic biochip using particle swarm optimizer

    NASA Astrophysics Data System (ADS)

    Pan, Indrajit; Samanta, Tuhina

    2013-01-01

    The digital microfluidic biochip has emerged as a revolutionary development in the field of micro-electromechanical research. Complex bioassays and pathological analyses are efficiently performed on this miniaturized chip with negligible amounts of sample specimens. Biochips were initially based on continuous fluid flow, but have since evolved toward the more flexible concept of digital (droplet-based) fluid flow, and these second-generation biochips can serve more complex bioassays. This operational change in biochip technology created a need for high-end computer-aided design tools for physical design automation and opened new avenues of research to assist such design automation. Droplet routing is one of the major aspects, requiring minimization of both routing completion time and total electrode usage, a task that involves optimization of multiple associated parameters. In this paper we propose a particle swarm optimization based approach for droplet routing. The process operates in two phases. First, we perform clustering of the state space and classification of nets into designated clusters; this reduces the solution space by redefining local suboptimal targets in the interleaved space between the source and the global target of a net. In the second phase, we resolve the concurrent routing issues of every suboptimal situation to generate the final routing schedule. The method was applied to some standard test benches and hard test sets. Comparative analysis of experimental results shows good improvement in unit cell usage, routing completion time, and execution time over several existing methods.
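
    For readers unfamiliar with the optimizer itself, a minimal particle swarm sketch follows; the quadratic cost is a stand-in for a droplet-routing objective (e.g., weighted completion time plus electrode usage), and the paper's clustering and concurrency-resolution phases are not modeled.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def cost(x):
        return np.sum(x**2)     # toy objective with its minimum at the origin

    n_particles, dim, iters = 20, 4, 100
    w, c1, c2 = 0.7, 1.5, 1.5   # inertia and acceleration coefficients

    x = rng.uniform(-5, 5, (n_particles, dim))
    v = np.zeros_like(x)
    pbest = x.copy()
    pbest_cost = np.apply_along_axis(cost, 1, x)
    gbest = pbest[pbest_cost.argmin()].copy()

    for _ in range(iters):
        r1, r2 = rng.random((2, n_particles, dim))
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = x + v
        c = np.apply_along_axis(cost, 1, x)
        improved = c < pbest_cost
        pbest[improved], pbest_cost[improved] = x[improved], c[improved]
        gbest = pbest[pbest_cost.argmin()].copy()

    print(gbest, pbest_cost.min())   # converges toward the origin
    ```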

  3. Localization of interictal epileptic spikes with MEG: optimization of an automated beamformer screening method (SAMepi) in a diverse epilepsy population

    PubMed Central

    Scott, Jonathan M.; Robinson, Stephen E.; Holroyd, Tom; Coppola, Richard; Sato, Susumu; Inati, Sara K.

    2016-01-01

    OBJECTIVE To describe and optimize an automated beamforming technique followed by identification of locations with excess kurtosis (g2) for efficient detection and localization of interictal spikes in medically refractory epilepsy patients. METHODS Synthetic Aperture Magnetometry with g2 averaged over a sliding time window (SAMepi) was performed in 7 focal epilepsy patients and 5 healthy volunteers. The effect of varied window lengths on detection of spiking activity was evaluated. RESULTS Sliding window lengths of 0.5–10 seconds performed similarly, with 0.5 and 1 second windows detecting spiking activity in one of the 3 virtual sensor locations with highest kurtosis. These locations were concordant with the region of eventual surgical resection in these 7 patients who remained seizure free at one year. Average g2 values increased with increasing sliding window length in all subjects. In healthy volunteers kurtosis values stabilized in datasets longer than two minutes. CONCLUSIONS SAMepi using g2 averaged over 1 second sliding time windows in datasets of at least 2 minutes duration reliably identified interictal spiking and the presumed seizure focus in these 7 patients. Screening the 5 locations with highest kurtosis values for spiking activity is an efficient and accurate technique for localizing interictal activity using MEG. SIGNIFICANCE SAMepi should be applied using the parameter values and procedure described for optimal detection and localization of interictal spikes. Use of this screening procedure could significantly improve the efficiency of MEG analysis if clinically validated. PMID:27760068
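
    The screening statistic itself is easy to reproduce: the sketch below computes excess kurtosis (g2) over 1-second sliding windows of a simulated virtual-sensor trace with one injected spike. The sampling rate, window overlap, and toy spike are assumptions; the beamformer stage is omitted.

    ```python
    import numpy as np
    from scipy.stats import kurtosis

    fs = 600                                   # Hz (illustrative)
    t = np.arange(0, 120, 1 / fs)              # 2-minute recording
    trace = np.random.randn(t.size)            # background activity
    trace[30 * fs:30 * fs + 60] += 8 * np.hanning(60)   # one injected "spike"

    win = int(1.0 * fs)                        # 1 s sliding window
    g2 = [kurtosis(trace[i:i + win])           # Fisher definition: excess kurtosis
          for i in range(0, trace.size - win, win // 2)]
    print(max(g2))                             # spiky segments drive g2 upward
    ```

    Channels (or virtual-sensor locations) whose windowed g2 is largest would then be screened visually for spiking activity, mirroring the five-location screening step described above.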

  4. Automation's influence on nuclear power plants: a look at three accidents and how automation played a role.

    PubMed

    Schmitt, Kara

    2012-01-01

    Nuclear power is one of the ways that we can design an efficient, sustainable future. Automation is the primary system used to assist operators in the task of monitoring and controlling nuclear power plants (NPPs). Automation performs tasks such as assessing the status of the plant's operations as well as making real-time, life-critical, situation-specific decisions. While the advantages and disadvantages of automation are well studied in a variety of domains, accidents remind us that there is still vulnerability to unknown variables. This paper looks at the effects of automation in three NPP accidents and incidents and considers why automation failed to prevent these accidents from occurring. It also reviews the accidents at the Three Mile Island, Chernobyl, and Fukushima Daiichi NPPs in order to determine where better use of automation could have resulted in a more desirable outcome.

  5. Evaluation of mouse red blood cell and platelet counting with an automated hematology analyzer.

    PubMed

    Fukuda, Teruko; Asou, Eri; Nogi, Kimiko; Goto, Kazuo

    2017-10-07

    An evaluation of mouse red blood cell (RBC) and platelet (PLT) counting with an automated hematology analyzer was performed with three strains of mice, C57BL/6 (B6), BALB/c (BALB) and DBA/2 (D2). There were no significant differences in RBC and PLT counts between manual and automated optical methods in any of the samples, except for D2 mice. For D2, RBC counts obtained using the manual method were significantly lower than those obtained using the automated optical method (P<0.05), and PLT counts obtained using the manual method were higher than those obtained using the automated optical method (P<0.05). An automated hematology analyzer can be used for RBC and PLT counting; however, an appropriate method should be selected when D2 mice samples are used.

  6. Characterization of glycoprotein biopharmaceutical products by Caliper LC90 CE-SDS gel technology.

    PubMed

    Chen, Grace; Ha, Sha; Rustandi, Richard R

    2013-01-01

    Over the last decade, science has greatly improved in the area of protein sizing and characterization. Efficient high-throughput methods are now available to substitute for the traditional labor-intensive SDS-PAGE methods, which can take days to analyze a very limited number of samples. PerkinElmer(®) (Caliper) has designed an automated chip-based fluorescence detection method capable of analyzing proteins in minutes with sensitivity similar to standard SDS-PAGE. Here, we describe the use and implementation of this technology to characterize and screen a large number of formulations of target glycoproteins in the 14-200 kDa molecular weight range.

  7. Fatigue Level Estimation of Bill Based on Acoustic Signal Feature by Supervised SOM

    NASA Astrophysics Data System (ADS)

    Teranishi, Masaru; Omatu, Sigeru; Kosaka, Toshihisa

    Fatigued bills have a harmful influence on the daily operation of Automated Teller Machines (ATMs). To make fatigued-bill classification more efficient, the development of an automatic classification method is desired. We propose a new method to estimate the bending rigidity of a bill from acoustic signal features of banking machines. The estimated bending rigidities are used as a continuous fatigue level for the classification of fatigued bills. Using a supervised Self-Organizing Map (supervised SOM), we estimate the bending rigidity effectively from the acoustic energy pattern alone. Experimental results with real bill samples show the effectiveness of the proposed method.

  8. Comparative cost-efficiency of the EVOTECH endoscope cleaner and reprocessor versus manual cleaning plus automated endoscope reprocessing in a real-world Canadian hospital endoscopy setting

    PubMed Central

    2011-01-01

    Background Reprocessing of endoscopes generally requires labour-intensive manual cleaning followed by high-level disinfection in an automated endoscope reprocessor (AER). EVOTECH Endoscope Cleaner and Reprocessor (ECR) is approved for fully automated cleaning and disinfection whereas AERs require manual cleaning prior to the high-level disinfection procedure. The purpose of this economic evaluation was to determine the cost-efficiency of the ECR versus AER methods of endoscopy reprocessing in an actual practice setting. Methods A time and motion study was conducted at a Canadian hospital to collect data on the personnel resources and consumable supplies costs associated with the use of EVOTECH ECR versus manual cleaning followed by AER with Medivators DSD-201. Reprocessing of all endoscopes was observed and timed for both reprocessor types over three days. Laboratory staff members were interviewed regarding the consumption and cost of all disposable supplies and equipment. Exact Wilcoxon rank sum test was used for assessing differences in total cycle reprocessing time. Results Endoscope reprocessing was significantly shorter with the ECR than with manual cleaning followed by AER. The differences in median time were 12.46 minutes per colonoscope (p < 0.0001), 6.31 minutes per gastroscope (p < 0.0001), and 5.66 minutes per bronchoscope (p = 0.0040). Almost 2 hours of direct labour time was saved daily with the ECR. The total per cycle cost of consumables and labour for maintenance was slightly higher for EVOTECH ECR versus manual cleaning followed by AER ($8.91 versus $8.31, respectively). Including the cost of direct labour time consumed in reprocessing scopes, the per cycle and annual costs of using the EVOTECH ECR was less than the cost of manual cleaning followed by AER disinfection ($11.50 versus $11.88). Conclusions The EVOTECH ECR was more efficient and less costly to use for the reprocessing of endoscopes than manual cleaning followed by AER disinfection. Although the cost of consumable supplies required to reprocess endoscopes with EVOTECH ECR was slightly higher, the value of the labour time saved with EVOTECH ECR more than offset the additional consumables cost. The increased efficiency with EVOTECH ECR could lead to even further cost-savings by shifting endoscopy laboratory personnel responsibilities but further study is required. PMID:21967345

  9. In-depth investigation of spin-on doped solar cells with thermally grown oxide passivation

    NASA Astrophysics Data System (ADS)

    Ahmad, Samir Mahmmod; Cheow, Siu Leong; Ludin, Norasikin A.; Sopian, K.; Zaidi, Saleem H.

    Solar cell industrial manufacturing, based largely on proven semiconductor processing technologies supported by significant advancements in automation, has reached a plateau in terms of cost and efficiency. However, solar cell manufacturing cost (dollars/watt) is still substantially higher than that of fossil fuels. The route to lowering cost may not lie with continuing automation and economies of scale. Alternative fabrication processes with lower cost and environmental sustainability, coupled with self-reliance, simplicity, and affordability, may lead to price compatibility with carbon-based fuels. In this paper, a custom-designed formulation of phosphoric acid has been investigated for n-type doping in p-type substrates as a function of concentration and drive-in temperature. For post-diffusion surface passivation and anti-reflection, thermally-grown oxide films of 50-150-nm thickness were grown. These fabrication methods facilitate process simplicity, reduced costs, and environmental sustainability by eliminating poisonous chemicals and toxic gases (POCl3, SiH4, NH3). A simultaneous fire-through contact formation process, based on screen-printed Ag on the front surface and back-surface contacts through the thermally grown oxide films, was optimized as a function of the peak temperature in a conveyor belt furnace. The highest-efficiency solar cells fabricated exhibited an efficiency of ∼13%. Analysis of results based on internal quantum efficiency and minority carrier lifetime measurements reveals three contributing factors: high front surface recombination, low minority carrier lifetime, and higher reflection. Solar cell simulations based on PC1D showed that, with improved passivation, lower reflection, and higher lifetimes, efficiency can be enhanced to match commercially-produced PECVD SiN-coated solar cells.

  10. MATLAB-based automated patch-clamp system for awake behaving mice

    PubMed Central

    Siegel, Jennifer J.; Taylor, William; Chitwood, Raymond A.; Johnston, Daniel

    2015-01-01

    Automation has been an important part of biomedical research for decades, and the use of automated and robotic systems is now standard for such tasks as DNA sequencing, microfluidics, and high-throughput screening. Recently, Kodandaramaiah and colleagues (Nat Methods 9: 585–587, 2012) demonstrated, using anesthetized animals, the feasibility of automating blind patch-clamp recordings in vivo. Blind patch is a good target for automation because it is a complex yet highly stereotyped process that revolves around analysis of a single signal (electrode impedance) and movement along a single axis. Here, we introduce an automated system for blind patch-clamp recordings from awake, head-fixed mice running on a wheel. In its design, we were guided by 3 requirements: easy-to-use and easy-to-modify software; seamless integration of behavioral equipment; and efficient use of time. The resulting system employs equipment that is standard for patch recording rigs, moderately priced, or simple to make. It is written entirely in MATLAB, a programming environment that has an enormous user base in the neuroscience community and many available resources for analysis and instrument control. Using this system, we obtained 19 whole cell patch recordings from neurons in the prefrontal cortex of awake mice, aged 8–9 wk. Successful recordings had series resistances that averaged 52 ± 4 MΩ and required 5.7 ± 0.6 attempts to obtain. These numbers are comparable with those of experienced electrophysiologists working manually, and this system, written in a simple and familiar language, will be useful to many cellular electrophysiologists who wish to study awake behaving mice. PMID:26084901

  11. A Framework for Evaluating Energy and Emissions of Connected and Automated Vehicles through Traffic Microsimulations

    DOT National Transportation Integrated Search

    2018-01-07

    Connected and automated vehicles (CAV) are poised to transform surface transportation systems in the United States. Near-term CAV technologies like cooperative adaptive cruise control (CACC) have the potential to deliver energy efficiency and air qua...

  12. Automated Bus Diagnostic System Demonstration in New York City

    DOT National Transportation Integrated Search

    1983-12-01

    In response to a growing problem with the quality and efficiency of nationwide bus maintenance practices, an award was granted to the Tri-State Regional Planning Commission for the testing of an automated bus diagnostic system (ABDS). The ABDS was de...

  13. Feasibility of rapid and automated importation of 3D echocardiographic left ventricular (LV) geometry into a finite element (FEM) analysis model

    PubMed Central

    Verhey, Janko F; Nathan, Nadia S

    2004-01-01

    Background Finite element method (FEM) analysis for intraoperative modeling of the left ventricle (LV) is presently not possible. Since 3D structural data of the LV is now obtainable using standard transesophageal echocardiography (TEE) devices intraoperatively, the present study describes a method to transfer this data into a commercially available FEM analysis system: ABAQUS©. Methods In this prospective study TomTec LV Analysis TEE© Software was used for semi-automatic endocardial border detection, reconstruction, and volume-rendering of the clinical 3D echocardiographic data. A newly developed software program MVCP FemCoGen©, written in Delphi, reformats the TomTec file structures in five patients for use in ABAQUS and allows visualization of regional deformation of the LV. Results This study demonstrates that a fully automated importation of 3D TEE data into FEM modeling is feasible and can be efficiently accomplished in the operating room. Conclusion For complete intraoperative 3D LV finite element analysis, three input elements are necessary: 1. time-gated, reality-based structural information, 2. continuous LV pressure and 3. instantaneous tissue elastance. The first of these elements is now available using the methods presented herein. PMID:15473901

  14. Evaluation of a High Intensity Focused Ultrasound-Immobilized Trypsin Digestion and 18O-Labeling Method for Quantitative Proteomics

    PubMed Central

    López-Ferrer, Daniel; Hixson, Kim K.; Smallwood, Heather; Squier, Thomas C.; Petritis, Konstantinos; Smith, Richard D.

    2009-01-01

    A new method that uses immobilized trypsin concomitant with ultrasonic irradiation results in ultra-rapid digestion and thorough 18O labeling for quantitative protein comparisons. The reproducible and highly efficient method provided effective digestions in <1 min with a minimized amount of enzyme required compared to traditional methods. This method was demonstrated for digestion of both simple and complex protein mixtures, including bovine serum albumin, a global proteome extract from the bacteria Shewanella oneidensis, and mouse plasma, as well as 18O labeling of such complex protein mixtures, which validated the application of this method for differential proteomic measurements. This approach is simple, reproducible, cost effective, rapid, and thus well-suited for automation. PMID:19555078

  15. Accuracy and generalizability of using automated methods for identifying adverse events from electronic health record data: a validation study protocol.

    PubMed

    Rochefort, Christian M; Buckeridge, David L; Tanguay, Andréanne; Biron, Alain; D'Aragon, Frédérick; Wang, Shengrui; Gallix, Benoit; Valiquette, Louis; Audet, Li-Anne; Lee, Todd C; Jayaraman, Dev; Petrucci, Bruno; Lefebvre, Patricia

    2017-02-16

    Adverse events (AEs) in acute care hospitals are frequent and associated with significant morbidity, mortality, and costs. Measuring AEs is necessary for quality improvement and benchmarking purposes, but current detection methods lack accuracy, efficiency, and generalizability. The growing availability of electronic health records (EHR) and the development of natural language processing techniques for encoding narrative data offer an opportunity to develop potentially better methods. The purpose of this study is to determine the accuracy and generalizability of using automated methods for detecting three high-incidence and high-impact AEs from EHR data: a) hospital-acquired pneumonia, b) ventilator-associated event, and c) central line-associated bloodstream infection. This validation study will be conducted among medical, surgical and ICU patients admitted between 2013 and 2016 to the Centre hospitalier universitaire de Sherbrooke (CHUS) and the McGill University Health Centre (MUHC), which has both French and English sites. A random 60% sample of CHUS patients will be used for model development purposes (cohort 1, development set). Using a random sample of these patients, a reference standard assessment of their medical chart will be performed. Multivariate logistic regression and the area under the curve (AUC) will be employed to iteratively develop and optimize three automated AE detection models (i.e., one per AE of interest) using EHR data from the CHUS. These models will then be validated on a random sample of the remaining 40% of CHUS patients (cohort 1, internal validation set) using chart review to assess accuracy. The most accurate models developed and validated at the CHUS will then be applied to EHR data from a random sample of patients admitted to the MUHC French site (cohort 2) and English site (cohort 3), a critical requirement given the use of narrative data, and accuracy will be assessed using chart review. Generalizability will be determined by comparing AUCs from cohorts 2 and 3 to those from cohort 1. This study will likely produce more accurate and efficient measures of AEs. These measures could be used to assess the incidence rates of AEs, evaluate the success of preventive interventions, or benchmark performance across hospitals.
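
    The develop/validate pattern described in the protocol can be sketched in a few lines with scikit-learn: fit a logistic regression detection model on a development set and report discrimination (AUC) on a held-out set. The features below are simulated stand-ins for EHR-derived predictors, not the study's variables.

    ```python
    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import roc_auc_score
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(1)
    X = rng.normal(size=(1000, 5))                   # simulated EHR-derived features
    logit = 0.8 * X[:, 0] - 1.2 * X[:, 1] + rng.normal(scale=0.5, size=1000)
    y = (logit > 0).astype(int)                      # simulated AE labels

    # 60/40 development/validation split, mirroring the protocol's cohort 1 design.
    X_dev, X_val, y_dev, y_val = train_test_split(X, y, train_size=0.6, random_state=0)
    model = LogisticRegression().fit(X_dev, y_dev)
    auc = roc_auc_score(y_val, model.predict_proba(X_val)[:, 1])
    print(f"internal validation AUC: {auc:.3f}")
    ```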

  16. Robust and efficient overset grid assembly for partitioned unstructured meshes

    NASA Astrophysics Data System (ADS)

    Roget, Beatrice; Sitaraman, Jayanarayanan

    2014-03-01

    This paper presents a method to perform efficient and automated Overset Grid Assembly (OGA) on a system of overlapping unstructured meshes in a parallel computing environment where all meshes are partitioned into multiple mesh-blocks and processed on multiple cores. The main task of the overset grid assembler is to identify, in parallel, among all points in the overlapping mesh system, at which points the flow solution should be computed (field points), interpolated (receptor points), or ignored (hole points). Point containment search or donor search, an algorithm to efficiently determine the cell that contains a given point, is the core procedure necessary for accomplishing this task. Donor search is particularly challenging for partitioned unstructured meshes because of the complex irregular boundaries that are often created during partitioning.
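
    The core primitive, point containment in a cell, reduces for a tetrahedron to a barycentric-coordinate test, as in the minimal numpy sketch below; real overset grid assembly wraps this in spatial search structures and inter-processor communication, which are omitted here.

    ```python
    import numpy as np

    def barycentric(p, tet):
        """Barycentric coordinates of point p in tetrahedron tet (4x3 array)."""
        T = np.column_stack([tet[1] - tet[0], tet[2] - tet[0], tet[3] - tet[0]])
        l123 = np.linalg.solve(T, p - tet[0])
        return np.concatenate([[1.0 - l123.sum()], l123])

    def contains(p, tet, tol=1e-12):
        # All four coordinates non-negative means the point is inside the cell.
        return bool(np.all(barycentric(p, tet) >= -tol))

    tet = np.array([[0, 0, 0], [1, 0, 0], [0, 1, 0], [0, 0, 1]], dtype=float)
    print(contains(np.array([0.2, 0.2, 0.2]), tet))   # True: donor cell found
    print(contains(np.array([0.9, 0.9, 0.9]), tet))   # False: keep searching
    ```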

  17. Alternative Approaches to Mission Control Automation at NASA's Goddard Space Flight Center

    NASA Technical Reports Server (NTRS)

    Rackley, Michael; Cooter, Miranda; Davis, George; Mackey, Jennifer

    2001-01-01

    To meet its objective of reducing operations costs without incurring a corresponding increase in risk, NASA is seeking new methods to automate mission operations. This paper examines the state of the art in automating ground operations for space missions. A summary of available technologies and methods for automating mission operations is provided. Responses from interviews with several space mission FOTs (Flight Operations Teams), conducted to assess the degree and success of the technologies and methods implemented, are presented. The mission operators interviewed approached automation using different tools and methods, with varying degrees of success, ranging from nearly completely automated to nearly completely manual. Two key criteria for successful automation are the active participation of the FOT in the planning, designing, testing, and implementation of the system and the relative degree of complexity of the mission.

  18. Automatized set-up procedure for transcranial magnetic stimulation protocols.

    PubMed

    Harquel, S; Diard, J; Raffin, E; Passera, B; Dall'Igna, G; Marendaz, C; David, O; Chauvin, A

    2017-06-01

    Transcranial Magnetic Stimulation (TMS) has established itself as a powerful technique for probing and treating the human brain. Major technological evolutions, such as neuronavigation and robotized systems, have continuously increased the spatial reliability and reproducibility of TMS by minimizing the influence of human and experimental factors. However, there is still a lack of efficient set-up procedures, which prevents the automation of TMS protocols. For example, the set-up procedure for defining the stimulation intensity specific to each subject is classically done manually by experienced practitioners, by assessing the motor cortical excitability level over the motor hotspot (HS) of a targeted muscle. This is time-consuming and introduces experimental variability. We therefore developed a probabilistic Bayesian model (AutoHS) that automatically identifies the HS position. Using virtual and real experiments, we compared the efficacy of the manual and automated procedures. AutoHS appeared to be more reproducible, faster, and at least as reliable as classical manual procedures. By combining AutoHS with robotized TMS and automated motor threshold estimation methods, our approach constitutes the first fully automated set-up procedure for TMS protocols. The use of this procedure decreases inter-experimenter variability while facilitating the handling of TMS protocols used for research and clinical routine. Copyright © 2017 Elsevier Inc. All rights reserved.

  19. Efficient anharmonic vibrational spectroscopy for large molecules using local-mode coordinates

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cheng, Xiaolu; Steele, Ryan P., E-mail: ryan.steele@utah.edu

    This article presents a general computational approach for efficient simulations of anharmonic vibrational spectra in chemical systems. An automated local-mode vibrational approach is presented, which borrows techniques from localized molecular orbitals in electronic structure theory. This approach generates spatially localized vibrational modes, in contrast to the delocalization exhibited by canonical normal modes. The method is rigorously tested across a series of chemical systems, ranging from small molecules to large water clusters and a protonated dipeptide. It is interfaced with exact, grid-based approaches, as well as vibrational self-consistent field methods. Most significantly, this new set of reference coordinates exhibits a well-behaved spatial decay of mode couplings, which allows for a systematic, a priori truncation of mode couplings and increased computational efficiency. Convergence can typically be reached by including modes within only about 4 Å. The local nature of this truncation suggests particular promise for the ab initio simulation of anharmonic vibrational motion in large systems, where connection to experimental spectra is currently most challenging.

  20. Office Automation in Student Affairs.

    ERIC Educational Resources Information Center

    Johnson, Sharon L.; Hamrick, Florence A.

    1987-01-01

    Offers recommendations to assist in introducing or expanding computer assistance in student affairs. Describes need for automation and considers areas of choosing hardware and software, funding and competitive bidding, installation and training, and system management. Cites greater efficiency in handling tasks and data and increased levels of…

  1. Reconfigurable Very Long Instruction Word (VLIW) Processor

    NASA Technical Reports Server (NTRS)

    Velev, Miroslav N.

    2015-01-01

    Future NASA missions will depend on radiation-hardened, power-efficient processing systems-on-a-chip (SOCs) that consist of a range of processor cores custom tailored for space applications. Aries Design Automation, LLC, has developed a processing SOC that is optimized for software-defined radio (SDR) uses. The innovation implements the Institute of Electrical and Electronics Engineers (IEEE) RazorII voltage management technique, a microarchitectural mechanism that allows processor cores to self-monitor, self-analyze, and self-heal after timing errors, regardless of their cause (e.g., radiation; chip aging; variations in voltage, frequency, temperature, or manufacturing process). This highly automated SOC can also execute legacy binary code for the PowerPC 750 instruction set architecture (ISA), which is used in the flight-control computers of many previous NASA space missions. In developing this innovation, Aries Design Automation has made significant contributions to the fields of formal verification of complex pipelined microprocessors and Boolean satisfiability (SAT) and has developed highly efficient electronic design automation tools that hold promise for future developments.

  2. Design and analysis on sorting blade for automated size-based sorting device

    NASA Astrophysics Data System (ADS)

    Razali, Zol Bahri; Kader, Mohamed Mydin M. Abdul; Samsudin, Yasser Suhaimi; Daud, Mohd Hisam

    2017-09-01

    Waste separation and recycling is a pressing national problem: people dump their rubbish into dumpsites without considering its value for recycling and reuse. The authors therefore propose an automated segregating device, intended to encourage people to separate their rubbish and to value what can be reused. The automated size-based mechanical segregating device provides significant improvements in terms of efficiency and consistency in the segregating process. This device is designed to make recycling easier and more user friendly, in the hope that more people will take responsibility if it requires less time and effort. This paper discusses the redesign of a blade for the sorting device, as part of developing an efficient automated mechanical sorting device for similar materials of different sizes. The machine identifies the size of the waste and relies on the coil inside the container to separate it out. The detailed design and methodology are described in this paper.

  3. Automated Classification of Asteroids into Families at Work

    NASA Astrophysics Data System (ADS)

    Knežević, Zoran; Milani, Andrea; Cellino, Alberto; Novaković, Bojan; Spoto, Federica; Paolicchi, Paolo

    2014-07-01

    We have recently proposed a new approach to asteroid family classification by combining the classical HCM method with an automated procedure to add newly discovered members to existing families. This approach is specifically intended to cope with ever increasing asteroid data sets, and consists of several steps to segment the problem and handle the very large amount of data in an efficient and accurate manner. We briefly present all these steps and show the results from three subsequent updates making use of only the automated step of attributing the newly numbered asteroids to the known families. We describe the changes in the membership of individual families, as well as the evolution of the classification due to the newly added intersections between families, resolved candidate family mergers, and the emergence of new merger candidates. We thus demonstrate how, with the new approach, the asteroid family classification becomes stable in general terms (converging towards a permanent list of confirmed families) while at the same time evolving in detail (to account for newly discovered asteroids) at each update.

  4. Phase editing as a signal pre-processing step for automated bearing fault detection

    NASA Astrophysics Data System (ADS)

    Barbini, L.; Ompusunggu, A. P.; Hillis, A. J.; du Bois, J. L.; Bartic, A.

    2017-07-01

    Scheduled maintenance and inspection of bearing elements in industrial machinery contributes significantly to the operating costs. Savings can be made through automatic vibration-based damage detection and prognostics, to permit condition-based maintenance. However automation of the detection process is difficult due to the complexity of vibration signals in realistic operating environments. The sensitivity of existing methods to the choice of parameters imposes a requirement for oversight from a skilled operator. This paper presents a novel approach to the removal of unwanted vibrational components from the signal: phase editing. The approach uses a computationally-efficient full-band demodulation and requires very little oversight. Its effectiveness is tested on experimental data sets from three different test-rigs, and comparisons are made with two state-of-the-art processing techniques: spectral kurtosis and cepstral pre-whitening. The results from the phase editing technique show a 10% improvement in damage detection rates compared to the state-of-the-art while simultaneously improving on the degree of automation. This outcome represents a significant contribution in the pursuit of fully automatic fault detection.
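
    As background for the signal-processing step, the sketch below performs a generic full-band amplitude demodulation via the analytic signal (Hilbert transform) and reads a hypothetical fault frequency off the envelope spectrum; this illustrates the demodulation idea only and is not the paper's phase-editing algorithm.

    ```python
    import numpy as np
    from scipy.signal import hilbert

    fs = 20000
    t = np.arange(0, 1.0, 1 / fs)
    carrier = np.sin(2 * np.pi * 3000 * t)          # resonance excited by impacts
    bpfo = 87.0                                     # hypothetical fault frequency, Hz
    impacts = (np.sin(2 * np.pi * bpfo * t) > 0.99).astype(float)
    x = (1 + impacts) * carrier + 0.1 * np.random.randn(t.size)

    envelope = np.abs(hilbert(x))                   # full-band amplitude demodulation
    spectrum = np.abs(np.fft.rfft(envelope - envelope.mean()))
    freqs = np.fft.rfftfreq(t.size, 1 / fs)
    print(freqs[spectrum.argmax()])                 # peak near the fault frequency (or a harmonic)
    ```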

  5. Automated Car Park Management System

    NASA Astrophysics Data System (ADS)

    Fabros, J. P.; Tabañag, D.; Espra, A.; Gerasta, O. J.

    2015-06-01

    This study aims to develop a prototype for an Automated Car Park Management System that increases the quality of service of parking lots through the integration of a smart system that assists motorists in finding vacant parking lots. The research was based on implementing an operating system and a monitoring system for parking without the use of manpower. This includes a Parking Guidance and Information System concept, which efficiently assists motorists and ensures the safety of the vehicles and the valuables inside them. For monitoring, Optical Character Recognition was employed to monitor and list all the cars entering the parking area. All parking events in this system are visible via a MATLAB GUI, which contains time-in, time-out, and time-consumed information, as well as the lot number where the car parks. For practical deployment, the system includes a payment method implemented as a coin-slot mechanism that controls the exit gate. The Automated Car Park Management System was successfully built by utilizing microcontrollers, specifically one PIC18F4550, two PIC16F84s, and one PIC16F628A.

  6. The Joint Modular Intermodal Container, is this the Future of Naval Logistics?

    DTIC Science & Technology

    2005-06-01

    pallet size. Contrast this with the commercial shipping industry, which for the last 40 years has been moving non-bulk goods in hyper-efficient container... a Heavy UNREP station than a current STREAM station. [Figure 4: Heavy UNREP Enables New Loads to be Passed Between Ships] ...man-hours are being spent on inefficient and relatively inaccurate paper-based accounting methods. The industry standard for automated accounting

  7. MoCha: Molecular Characterization of Unknown Pathways.

    PubMed

    Lobo, Daniel; Hammelman, Jennifer; Levin, Michael

    2016-04-01

    Automated methods for the reverse-engineering of complex regulatory networks are paving the way for the inference of mechanistic comprehensive models directly from experimental data. These novel methods can infer not only the relations and parameters of the known molecules defined in their input datasets, but also unknown components and pathways identified as necessary by the automated algorithms. Identifying the molecular nature of these unknown components is a crucial step for making testable predictions and experimentally validating the models, yet no specific and efficient tools exist to aid in this process. To this end, we present here MoCha (Molecular Characterization), a tool optimized for the search of unknown proteins and their pathways from a given set of known interacting proteins. MoCha uses the comprehensive dataset of protein-protein interactions provided by the STRING database, which currently includes more than a billion interactions from over 2,000 organisms. MoCha is highly optimized, performing typical searches within seconds. We demonstrate the use of MoCha with the characterization of unknown components from reverse-engineered models from the literature. MoCha is useful for working on network models by hand or as a downstream step of a model inference engine workflow and represents a valuable and efficient tool for the characterization of unknown pathways using known data from thousands of organisms. MoCha and its source code are freely available online under the GPLv3 license.

  8. Automated extraction method for the center line of spinal canal and its application to the spinal curvature quantification in torso X-ray CT images

    NASA Astrophysics Data System (ADS)

    Hayashi, Tatsuro; Zhou, Xiangrong; Chen, Huayue; Hara, Takeshi; Miyamoto, Kei; Kobayashi, Tatsunori; Yokoyama, Ryujiro; Kanematsu, Masayuki; Hoshi, Hiroaki; Fujita, Hiroshi

    2010-03-01

    X-ray CT images have been widely used in clinical routine in recent years. CT images scanned by a modern CT scanner can show the details of various organs and tissues, which means various organs and tissues can be interpreted simultaneously on CT images. However, CT image interpretation requires a lot of time and energy, so support for interpreting CT images based on image-processing techniques is expected. The interpretation of spinal curvature is important for clinicians because spinal curvature is associated with various spinal disorders. We propose a quantification scheme for spinal curvature based on the center line of the spinal canal on CT images. The proposed scheme consists of four steps: (1) automated extraction of the skeletal region based on CT number thresholding; (2) automated extraction of the center line of the spinal canal; (3) generation of the median plane image of the spine, reformatted based on the spinal canal; and (4) quantification of the spinal curvature. The proposed scheme was applied to 10 cases and compared with the Cobb angle that is commonly used by clinicians. We found a high correlation between values obtained by the proposed (vector) method and the Cobb angle (95% confidence interval for lumbar lordosis: 0.81-0.99). The proposed method also provides reproducible results (inter- and intra-observer variability within 2°). These experimental results suggest that the proposed method is efficient for quantifying spinal curvature on CT images.
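
    The quantification step can be illustrated with a minimal numpy sketch that measures the angle between tangents at the ends of a center-line segment, in the spirit of the vector method; the 3-D center-line points are invented, and the CT extraction steps (1)-(3) are omitted.

    ```python
    import numpy as np

    def tangent(points, i):
        """Unit tangent at index i via a central/one-sided difference."""
        d = points[min(i + 1, len(points) - 1)] - points[max(i - 1, 0)]
        return d / np.linalg.norm(d)

    def segment_angle_deg(points):
        t_top, t_bottom = tangent(points, 0), tangent(points, len(points) - 1)
        cosang = np.clip(np.dot(t_top, t_bottom), -1.0, 1.0)
        return np.degrees(np.arccos(cosang))

    # Invented center line of a gently curved spinal segment, in mm.
    z = np.linspace(0, 150, 16)
    centerline = np.column_stack([np.zeros_like(z), 20 * np.sin(z / 150 * np.pi / 3), z])
    print(f"segment angle: {segment_angle_deg(centerline):.1f} deg")
    ```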

  9. Demand Response Resource Quantification with Detailed Building Energy Models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hale, Elaine; Horsey, Henry; Merket, Noel

    Demand response is a broad suite of technologies that enables changes in electrical load operations in support of power system reliability and efficiency. Although demand response is not a new concept, there is new appetite for comprehensively evaluating its technical potential in the context of renewable energy integration. The complexity of demand response makes this task difficult -- we present new methods for capturing the heterogeneity of potential responses from buildings, their time-varying nature, and metrics such as thermal comfort that help quantify likely acceptability of specific demand response actions. Computed with an automated software framework, the methods are scalable.

  10. Novel microscale approaches for easy, rapid determination of protein stability in academic and commercial settings

    PubMed Central

    Alexander, Crispin G.; Wanner, Randy; Johnson, Christopher M.; Breitsprecher, Dennis; Winter, Gerhard; Duhr, Stefan; Baaske, Philipp; Ferguson, Neil

    2014-01-01

    Chemical denaturant titrations can be used to accurately determine protein stability. However, data acquisition is typically labour intensive, has low throughput and is difficult to automate. These factors, combined with high protein consumption, have limited the adoption of chemical denaturant titrations in commercial settings. Thermal denaturation assays can be automated, sometimes with very high throughput. However, thermal denaturation assays are incompatible with proteins that aggregate at high temperatures and large extrapolation of stability parameters to physiological temperatures can introduce significant uncertainties. We used capillary-based instruments to measure chemical denaturant titrations by intrinsic fluorescence and microscale thermophoresis. This allowed higher throughput, consumed several hundred-fold less protein than conventional, cuvette-based methods yet maintained the high quality of the conventional approaches. We also established efficient strategies for automated, direct determination of protein stability at a range of temperatures via chemical denaturation, which has utility for characterising stability for proteins that are difficult to purify in high yield. This approach may also have merit for proteins that irreversibly denature or aggregate in classical thermal denaturation assays. We also developed procedures for affinity ranking of protein–ligand interactions from ligand-induced changes in chemical denaturation data, and proved the principle for this by correctly ranking the affinity of previously unreported peptide–PDZ domain interactions. The increased throughput, automation and low protein consumption of protein stability determinations afforded by using capillary-based methods to measure denaturant titrations, can help to revolutionise protein research. We believe that the strategies reported are likely to find wide applications in academia, biotherapeutic formulation and drug discovery programmes. PMID:25262836
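
    A typical analysis step for such titration data is fitting a two-state unfolding model with the linear extrapolation method, sketched below with scipy; the simulated signal, constant baselines, and parameter values are assumptions, not the instrument's actual output.

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    R, T = 8.314e-3, 298.15             # gas constant in kJ/mol/K, temperature in K

    def two_state(D, dG0, m, yN, yU):
        # Linear extrapolation model: dG(D) = dG0 - m*D sets the unfolded fraction.
        fU = 1.0 / (1.0 + np.exp((dG0 - m * D) / (R * T)))
        return yN * (1 - fU) + yU * fU   # observed signal, constant baselines

    rng = np.random.default_rng(0)
    D = np.linspace(0, 8, 25)                                    # denaturant, M
    y = two_state(D, 20.0, 5.0, 1.0, 0.2) + 0.01 * rng.normal(size=D.size)

    popt, _ = curve_fit(two_state, D, y, p0=[15.0, 4.0, 1.0, 0.0])
    print(f"dG(H2O) = {popt[0]:.1f} kJ/mol, m = {popt[1]:.2f} kJ/mol/M")
    ```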

  11. Comparison of manual & automated analysis methods for corneal endothelial cell density measurements by specular microscopy.

    PubMed

    Huang, Jianyan; Maram, Jyotsna; Tepelus, Tudor C; Modak, Cristina; Marion, Ken; Sadda, SriniVas R; Chopra, Vikas; Lee, Olivia L

    2017-08-07

    To determine the reliability of corneal endothelial cell density (ECD) obtained by automated specular microscopy versus that of validated manual methods, and the factors that predict such reliability. Sharp central images from 94 control and 106 glaucomatous eyes were captured with a Konan specular microscope NSP-9900. All images were analyzed by trained graders using Konan CellChek Software, employing the fully- and semi-automated methods as well as the Center Method. Images with low cell count (input cell number <100) and/or guttata were compared with the Center and Flex-Center Methods. ECDs were compared and absolute error was used to assess variation. The effect on ECD of age, cell count, cell size, and cell size variation was evaluated. No significant difference was observed between the Center and Flex-Center Methods in corneas with guttata (p=0.48) or low ECD (p=0.11). No difference (p=0.32) was observed in the ECD of normal controls <40 years old between the fully-automated method and the manual Center Method. However, in older controls and glaucomatous eyes, ECD was overestimated by the fully-automated method (p=0.034) and the semi-automated method (p=0.025) compared with the manual method. Our findings show that automated analysis significantly overestimates ECD in eyes with high polymegathism and/or large cell size compared with the manual method. Therefore, we discourage reliance upon the fully-automated method alone to perform specular microscopy analysis, particularly if an accurate ECD value is imperative.

  12. Improving clinical laboratory efficiency: a time-motion evaluation of the Abbott m2000 RealTime and Roche COBAS AmpliPrep/COBAS TaqMan PCR systems for the simultaneous quantitation of HIV-1 RNA and HCV RNA.

    PubMed

    Amendola, Alessandra; Coen, Sabrina; Belladonna, Stefano; Pulvirenti, F Renato; Clemens, John M; Capobianchi, M Rosaria

    2011-08-01

    Diagnostic laboratories need automation that facilitates efficient processing and workflow management to meet today's challenges of expanding services and reducing cost while maintaining the highest levels of quality. The processing efficiency of two commercially available automated systems for quantifying HIV-1 and HCV RNA, the Abbott m2000 system and the Roche COBAS AmpliPrep/COBAS TaqMan 96 (docked) system (CAP/CTM), was evaluated in a mid/high-throughput workflow laboratory using a representative daily workload of 24 HCV and 72 HIV samples. Three test scenarios were evaluated: A) one run with four batches on the CAP/CTM system, B) two runs on the Abbott m2000 and C) one run using the Abbott m2000 maxCycle feature (maxCycle) for co-processing these assays. Cycle times for processing, throughput and hands-on time were evaluated. Overall processing cycle time was 10.3, 9.1 and 7.6 h for Scenarios A), B) and C), respectively. Total hands-on time for each scenario was, in order, 100.0 (A), 90.3 (B) and 61.4 min (C). The interface of an automated analyzer to the laboratory workflow, notably system setup for samples and reagents and cleanup functions, is as important as the automation capability of the analyzer for the overall impact on processing efficiency and operator hands-on time.

  13. Self-reconfigurable ship fluid-network modeling for simulation-based design

    NASA Astrophysics Data System (ADS)

    Moon, Kyungjin

    Our world is filled with large-scale engineering systems, which provide various services and conveniences in our daily life. A distinctive trend in the development of today's large-scale engineering systems is the extensive and aggressive adoption of automation and autonomy, which enable significant improvement of systems' robustness, efficiency, and performance with considerably reduced manning and maintenance costs. The U.S. Navy's DD(X), the next-generation destroyer program, is considered an extreme example of this trend. This thesis pursues a modeling solution for performing simulation-based analysis in the conceptual or preliminary design stage of an intelligent, self-reconfigurable ship fluid system, which is one of the concepts of DD(X) engineering plant development. Through investigation of the Navy's approach to designing a more survivable ship system, it was found that the current naval simulation-based analysis environment is limited by capability gaps in damage modeling, dynamic model reconfiguration, and simulation speed of the domain-specific models, especially fluid network models. Two essential elements were identified in the formulation of the modeling method as enablers for filling these gaps. The first is a graph-based topological modeling method, employed for rapid model reconstruction and damage modeling; the second is a recurrent neural network-based, component-level surrogate modeling method, used to improve the affordability and efficiency of the modeling and simulation (M&S) computations. The integration of the two methods delivers computationally efficient, flexible, and automation-friendly M&S, creating an environment for more rigorous damage analysis and exploration of design alternatives. As a demonstration for evaluating the developed method, a simulation model of a notional ship fluid system was created, and a damage analysis was performed. Next, models representing different design configurations of the fluid system were created, and damage analyses were performed with them in order to find an optimal design configuration for system survivability. Finally, the benefits and drawbacks of the developed method are discussed based on the results of the demonstration.
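
    The flavour of the graph-based topological approach can be sketched compactly: the network is an adjacency structure, damage removes nodes, and a traversal reveals which loads remain served. The small network below is hypothetical, not the notional ship system from the thesis:

```python
from collections import deque

# Hypothetical fluid network: node -> connected nodes (illustrative only)
network = {
    "pump1": ["main1"], "pump2": ["main2"],
    "main1": ["pump1", "main2", "load_a"],
    "main2": ["pump2", "main1", "load_b"],
    "load_a": ["main1"], "load_b": ["main2"],
}

def served_loads(net, sources, loads, damaged):
    """Breadth-first search over the undamaged graph: a load is served
    if any source can still reach it after damage removes nodes."""
    alive = {n: [m for m in nbrs if m not in damaged]
             for n, nbrs in net.items() if n not in damaged}
    seen, queue = set(), deque(s for s in sources if s not in damaged)
    while queue:
        n = queue.popleft()
        if n in seen:
            continue
        seen.add(n)
        queue.extend(alive.get(n, []))
    return [l for l in loads if l in seen]

# With main1 damaged, only load_b remains reachable -> ['load_b']
print(served_loads(network, ["pump1", "pump2"], ["load_a", "load_b"], {"main1"}))
```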

  14. A Novel Adjustment Method for Shearer Traction Speed through Integration of T-S Cloud Inference Network and Improved PSO

    PubMed Central

    Si, Lei; Wang, Zhongbin; Yang, Yinwei

    2014-01-01

    In order to efficiently and accurately adjust the shearer traction speed, a novel approach based on a Takagi-Sugeno (T-S) cloud inference network (CIN) and improved particle swarm optimization (IPSO) is proposed. The T-S CIN is built through the combination of a cloud model and a T-S fuzzy neural network. The IPSO algorithm employs a parameter automation adjustment strategy and velocity resetting to significantly improve the performance of the basic PSO algorithm in global search and fine-tuning of solutions, and a flowchart of the proposed approach is designed. Furthermore, simulation examples are carried out, and comparison results indicate that the proposed method is feasible, efficient, and outperforms the alternatives. Finally, an industrial application example from a coal mining face demonstrates the effectiveness of the proposed system. PMID:25506358
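
    A minimal sketch of a PSO loop with the two IPSO ingredients the abstract names, an automated (linearly decaying) inertia weight and velocity resetting for stalled particles; the coefficients and thresholds are illustrative assumptions, not the paper's values:

```python
import numpy as np

def ipso(f, dim, n=30, iters=100, lo=-5.0, hi=5.0, seed=0):
    """PSO sketch with a decaying inertia weight and velocity resetting."""
    rng = np.random.default_rng(seed)
    x = rng.uniform(lo, hi, (n, dim))
    v = np.zeros((n, dim))
    pbest, pval = x.copy(), np.apply_along_axis(f, 1, x)
    g = pbest[pval.argmin()].copy()
    for t in range(iters):
        w = 0.9 - 0.5 * t / iters                    # inertia: 0.9 -> 0.4
        r1, r2 = rng.random((2, n, dim))
        v = w * v + 2.0 * r1 * (pbest - x) + 2.0 * r2 * (g - x)
        stuck = np.linalg.norm(v, axis=1) < 1e-6     # velocity collapsed
        v[stuck] = rng.uniform(-1, 1, (stuck.sum(), dim))  # velocity reset
        x = np.clip(x + v, lo, hi)
        val = np.apply_along_axis(f, 1, x)
        better = val < pval
        pbest[better], pval[better] = x[better], val[better]
        g = pbest[pval.argmin()].copy()
    return g, pval.min()

print(ipso(lambda p: np.sum(p**2), dim=3))  # sphere-function smoke test
```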

  15. Technical devices of powered roof support for the top coal caving as automation objects

    NASA Astrophysics Data System (ADS)

    Nikitenko, M. S.; Kizilov, S. A.; Nikolaev, P. I.; Kuznetsov, I. S.

    2018-05-01

    This paper considers the technical devices of powered roof support for top coal caving, as automation objects, within a longwall top coal caving (LTCC) mining complex. The proposed concept for automating the top coal caving process ensures caving efficiency, prevents coal dilution and conveyor overloading, reduces the workload of shearer service personnel, and reduces the influence of the “human factor”.

  16. Preliminary Results from the Application of Automated Adjoint Code Generation to CFL3D

    NASA Technical Reports Server (NTRS)

    Carle, Alan; Fagan, Mike; Green, Lawrence L.

    1998-01-01

    This report describes preliminary results obtained using an automated adjoint code generator for Fortran to augment a widely-used computational fluid dynamics flow solver to compute derivatives. These preliminary results with this augmented code suggest that, even in its infancy, the automated adjoint code generator can accurately and efficiently deliver derivatives for use in transonic Euler-based aerodynamic shape optimization problems with hundreds to thousands of independent design variables.
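
    The core idea of adjoint (reverse-mode) differentiation can be shown on a toy function; an adjoint code generator mechanically applies the same forward-sweep/reverse-sweep pattern to every statement of a flow solver. A hand-written sketch, not output from the generator described in the report:

```python
import math

def f_and_grad(x1, x2):
    """Reverse-mode differentiation of y = x1*x2 + sin(x1), by hand."""
    # forward sweep: evaluate and record intermediates
    a = x1 * x2
    b = math.sin(x1)
    y = a + b
    # reverse sweep: propagate adjoints (dy/d.) back through the tape
    a_bar, b_bar = 1.0, 1.0
    x1_bar = a_bar * x2 + b_bar * math.cos(x1)
    x2_bar = a_bar * x1
    return y, (x1_bar, x2_bar)

print(f_and_grad(1.0, 2.0))  # y = 2 + sin(1); dy/dx1 = 2 + cos(1); dy/dx2 = 1
```

    One reverse sweep yields derivatives with respect to all inputs at once, which is why the adjoint approach scales to hundreds or thousands of design variables.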

  17. Human Systems Integration and Automation Issues in Small Unmanned Aerial Vehicles

    NASA Technical Reports Server (NTRS)

    McCauley, Michael E.; Matsangas, Panagiotis

    2004-01-01

    The goal of this report is to identify Human Systems Integration (HSI) and automation issues that contribute to improved effectiveness and efficiency in the operation of U.S. military Small Unmanned Aerial Vehicles (SUAVs). HSI issues relevant to SUAV operations are reviewed and observations from field trials are summarized. Short-term improvements are suggested, research issues are identified, and an overview is provided of automation technologies applicable to future SUAV design.

  18. Correction of spin diffusion during iterative automated NOE assignment

    NASA Astrophysics Data System (ADS)

    Linge, Jens P.; Habeck, Michael; Rieping, Wolfgang; Nilges, Michael

    2004-04-01

    Indirect magnetization transfer increases the observed nuclear Overhauser enhancement (NOE) between two protons in many cases, leading to an underestimation of target distances. Wider distance bounds are necessary to account for this error. However, this leads to a loss of information and may reduce the quality of the structures generated from the inter-proton distances. Although several methods for spin diffusion correction have been published, they are often not employed to derive distance restraints. This prompted us to write a user-friendly and CPU-efficient method to correct for spin diffusion that is fully integrated in our program ambiguous restraints for iterative assignment (ARIA). ARIA thus allows automated iterative NOE assignment and structure calculation with spin diffusion corrected distances. The method relies on numerical integration, by matrix squaring and sparse matrix techniques, of the coupled differential equations that govern relaxation. We derive a correction factor for the distance restraints from calculated NOE volumes and inter-proton distances. To evaluate the impact of our spin diffusion correction, we tested the new calibration process extensively with data from the Pleckstrin homology (PH) domain of Mus musculus β-spectrin. By comparing structures refined with and without spin diffusion correction, we show that spin diffusion corrected distance restraints give rise to structures of higher quality (notably fewer NOE violations and a more regular Ramachandran map). Furthermore, spin diffusion correction permits the use of tighter error bounds, which improves the distinction between signal and noise in an automated NOE assignment scheme.
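
    The effect can be illustrated with a toy relaxation matrix: full-matrix NOE intensities follow V(τ_m) = exp(−R·τ_m), which SciPy evaluates by a scaling-and-squaring (matrix squaring) scheme, and a correction factor compares the full-matrix cross peak with the isolated two-spin value. All rate values below are illustrative, not derived from the PH-domain data:

```python
import numpy as np
from scipy.linalg import expm  # Pade approximation with scaling-and-squaring

# Toy 3-spin relaxation matrix R (s^-1): diagonal auto-relaxation, negative
# off-diagonal cross-relaxation. Values are illustrative only.
R = np.array([[ 1.00, -0.50, -0.02],
              [-0.50,  1.00, -0.50],
              [-0.02, -0.50,  1.00]])
tau_m = 0.1  # mixing time, s

V = expm(-R * tau_m)        # full-matrix NOE intensities, spin diffusion included
direct = -R[0, 2] * tau_m   # isolated two-spin approximation for spins 1 and 3
print(f"two-spin: {direct:.4f}  full-matrix: {V[0, 2]:.4f}  "
      f"correction factor: {V[0, 2] / direct:.2f}")
# The full-matrix peak is ~1.5x larger here: indirect transfer via spin 2
# inflates the NOE, so the uncorrected distance would be underestimated.
```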

  19. Robotics in biomedical chromatography and electrophoresis.

    PubMed

    Fouda, H G

    1989-08-11

    The ideal laboratory robot can be viewed as "an indefatigable assistant capable of working continuously for 24 h a day with constant efficiency". The development of a system approaching that promise requires considerable skill and time commitment, a thorough understanding of the capabilities and limitations of the robot and its specialized modules, and an intimate knowledge of the functions to be automated. The robot need not emulate every manual step; effective substitutes for difficult steps must be devised. The future of laboratory robots depends not only on technological advances in other fields, but also on the skill and creativity of chromatographers and other scientists. The robot has been applied to automate numerous biomedical chromatography and electrophoresis methods. The quality of its data can approach, and in some cases exceed, that of manual methods. Maintaining high data quality during continuous operation requires frequent maintenance and validation. Well-designed robotic systems can yield a substantial increase in laboratory productivity without a corresponding increase in manpower. They can free skilled personnel from mundane tasks and can enhance the safety of the laboratory environment. The integration of robotics, chromatography systems and laboratory information management systems permits full automation and affords opportunities for unattended method development, for future incorporation of artificial intelligence techniques, and for the evolution of expert systems. Finally, humanoid attributes aside, robotic utilization in the laboratory should not be an end in itself. The robot is a useful tool that should be utilized only when it is prudent and cost-effective to do so.

  20. Screening for anabolic steroids in urine of forensic cases using fully automated solid phase extraction and LC-MS-MS.

    PubMed

    Andersen, David W; Linnet, Kristian

    2014-01-01

    A screening method for 18 frequently measured exogenous anabolic steroids and the testosterone/epitestosterone (T/E) ratio in forensic cases has been developed and validated. The method involves a fully automated sample preparation including enzyme treatment, addition of internal standards and solid phase extraction, followed by analysis by liquid chromatography-tandem mass spectrometry (LC-MS-MS) using electrospray ionization with adduct formation for two compounds. Urine samples from 580 forensic cases were analyzed to determine the T/E ratio and the occurrence of exogenous anabolic steroids. Extraction recoveries ranged from 77 to 95%, matrix effects from 48 to 78%, overall process efficiencies from 40 to 54%, and the lower limit of identification ranged from 2 to 40 ng/mL. In the 580 urine samples analyzed from routine forensic cases, 17 (2.9%) were found positive for one or more anabolic steroids. Only seven different steroids, including testosterone, were found in the material, suggesting that only a small number of common steroids are likely to occur in a forensic context. The steroids were often present in high concentrations (>100 ng/mL), and a combination of steroids and/or other drugs of abuse was seen in the majority of cases. The method presented serves as a fast and automated screening procedure, proving the suitability of LC-MS-MS for analyzing anabolic steroids.

  1. AutoStitcher: An Automated Program for Efficient and Robust Reconstruction of Digitized Whole Histological Sections from Tissue Fragments

    NASA Astrophysics Data System (ADS)

    Penzias, Gregory; Janowczyk, Andrew; Singanamalli, Asha; Rusu, Mirabela; Shih, Natalie; Feldman, Michael; Stricker, Phillip D.; Delprado, Warick; Tiwari, Sarita; Böhm, Maret; Haynes, Anne-Maree; Ponsky, Lee; Viswanath, Satish; Madabhushi, Anant

    2016-07-01

    In applications involving large tissue specimens that have been sectioned into smaller tissue fragments, manual reconstruction of a “pseudo whole-mount” histological section (PWMHS) can facilitate (a) pathological disease annotation, and (b) image registration and correlation with radiological images. We have previously presented a program called HistoStitcher, which allows for more efficient manual reconstruction than general-purpose image editing tools (such as Photoshop). However, HistoStitcher is still manual and hence can be laborious and subjective, especially in large cohort studies. In this work we present AutoStitcher, a novel automated algorithm for reconstructing PWMHSs from digitized tissue fragments. AutoStitcher reconstructs (“stitches”) a PWMHS from a set of 4 fragments by optimizing a novel cost function that is domain-inspired to ensure (i) alignment of similar tissue regions, and (ii) contiguity of the prostate boundary. The algorithm achieves computational efficiency by performing reconstruction in a multi-resolution hierarchy. Automated PWMHS reconstruction results (via AutoStitcher) were quantitatively and qualitatively compared to manual reconstructions obtained via HistoStitcher for 113 prostate pathology sections. Distances between corresponding fiducials placed on the automated and manual reconstruction results were between 2.7% and 3.2%, reflecting their excellent visual similarity.

  2. SU-G-BRB-04: Automated Output Factor Measurements Using Continuous Data Logging for Linac Commissioning

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhu, X; Li, S; Zheng, D

    Purpose: Linac commissioning is a time-consuming and labor-intensive process, the streamlining of which is highly desirable. In particular, manual measurement of output factors for a variety of field sizes and energies greatly hinders commissioning efficiency. In this study, automated measurement of output factors was demonstrated as ‘one-click’ using data logging of an electrometer. Methods: Beams to be measured were created in the recording and verifying (R&V) system and configured for continuous delivery. An electrometer with an automatic data logging feature enabled continuous data collection for all fields without human intervention. The electrometer saved data into a spreadsheet every 0.5 seconds. A Matlab program was developed to analyze the spreadsheet data to monitor and check the data quality. Results: For each photon energy, output factors were measured for five configurations, including an open field and four wedges. Each configuration includes 72 field sizes, ranging from 4×4 to 20×30 cm². Using automation, it took 50 minutes to complete the measurement of 72 field sizes, in contrast to 80 minutes when using the manual approach. The automation avoided the necessity of redundant Linac status checks between fields as in the manual approach. In fact, the only limiting factor in such automation is Linac overheating. The data collection beams in the R&V system are reusable, and the simplified process is less error-prone. In addition, our Matlab program extracted the output factors faithfully from the data logging, and the discrepancy between the automatic and manual measurements is within ±0.3%. For two separate automated measurements 30 days apart, a consistency check shows a discrepancy within ±1% for 6 MV photons with a 60 degree wedge. Conclusion: Automated output factor measurements can save time by 40% when compared with the conventional manual approach. This work laid the groundwork for further automation of Linac commissioning.
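
    The extraction step can be sketched as follows: split the continuous 0.5 s log into beam-on segments, average each plateau, and normalise to a reference field. The threshold, minimum segment length and simulated log are illustrative assumptions, not the study's data; the original analysis was done in Matlab, shown here in Python:

```python
import numpy as np

def output_factors(log, beam_on=0.05, ref=0):
    """Split a continuous electrometer log (one reading per 0.5 s) into
    beam-on segments, average each plateau, and normalise the means to
    a chosen reference field."""
    log = np.asarray(log, float)
    idx = np.flatnonzero(log > beam_on)                     # beam-on samples
    segments = np.split(idx, np.flatnonzero(np.diff(idx) > 1) + 1)
    means = np.array([log[s].mean() for s in segments if s.size > 4])
    return means / means[ref]

# Simulated log: three field deliveries separated by beam-off pauses
log = [0.0]*10 + [1.00]*40 + [0.0]*10 + [0.95]*40 + [0.0]*10 + [1.03]*40
print(np.round(output_factors(log), 3))   # -> [1.    0.95  1.03]
```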

  3. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Covington, E; Younge, K; Chen, X

    Purpose: To evaluate the effectiveness of an automated plan check tool to improve first-time plan quality as well as standardize and document performance of physics plan checks. Methods: The Plan Checker Tool (PCT) uses the Eclipse Scripting API to check and compare data from the treatment planning system (TPS) and treatment management system (TMS). PCT was created to improve first-time plan quality, reduce patient delays, increase efficiency of our electronic workflow, and to standardize and partially automate plan checks in the TPS. A framework was developed which can be configured with different reference values and types of checks. One example is the prescribed dose check where PCT flags the user when the planned dose and the prescribed dose disagree. PCT includes a comprehensive checklist of automated and manual checks that are documented when performed by the user. A PDF report is created and automatically uploaded into the TMS. Prior to and during PCT development, errors caught during plan checks and also patient delays were tracked in order to prioritize which checks should be automated. The most common and significant errors were determined. Results: Nineteen of 33 checklist items were automated with data extracted with the PCT. These include checks for prescription, reference point and machine scheduling errors which are three of the top six causes of patient delays related to physics and dosimetry. Since the clinical roll-out, no delays have been due to errors that are automatically flagged by the PCT. Development continues to automate the remaining checks. Conclusion: With PCT, 57% of the physics plan checklist has been partially or fully automated. Treatment delays have declined since release of the PCT for clinical use. By tracking delays and errors, we have been able to measure the effectiveness of automating checks and are using this information to prioritize future development. This project was supported in part by P01CA059827.
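
    The flavour of such a check is easy to sketch. The real PCT queries the (C#) Eclipse Scripting API; the Python below uses hypothetical record structures purely to illustrate the prescribed-dose comparison:

```python
# Hypothetical TPS/TMS records; the actual tool pulls these values
# through the Eclipse Scripting API.
tps_plan = {"plan_id": "PelvisA", "total_dose_gy": 50.4, "fractions": 28}
tms_rx   = {"plan_id": "PelvisA", "total_dose_gy": 50.4, "fractions": 28}

def check_prescription(plan, rx, tol_gy=0.01):
    """Flag the user when planned and prescribed dose/fractionation disagree."""
    issues = []
    if abs(plan["total_dose_gy"] - rx["total_dose_gy"]) > tol_gy:
        issues.append("planned dose differs from prescription")
    if plan["fractions"] != rx["fractions"]:
        issues.append("fractionation differs from prescription")
    return issues or ["prescription check: PASS"]

print(check_prescription(tps_plan, tms_rx))
```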

  4. An Automated HIV-1 Env-Pseudotyped Virus Production for Global HIV Vaccine Trials

    PubMed Central

    Fuss, Martina; Mazzotta, Angela S.; Sarzotti-Kelsoe, Marcella; Ozaki, Daniel A.; Montefiori, David C.; von Briesen, Hagen; Zimmermann, Heiko; Meyerhans, Andreas

    2012-01-01

    Background Infections with HIV still represent a major human health problem worldwide and a vaccine is the only long-term option to fight efficiently against this virus. Standardized assessments of HIV-specific immune responses in vaccine trials are essential for prioritizing vaccine candidates in preclinical and clinical stages of development. With respect to neutralizing antibodies, assays with HIV-1 Env-pseudotyped viruses are a high priority. To cover the increasing demands of HIV pseudoviruses, a complete cell culture and transfection automation system has been developed. Methodology/Principal Findings The automation system for HIV pseudovirus production comprises a modified Tecan-based Cellerity system. It covers an area of 5×3 meters and includes a robot platform, a cell counting machine, a CO2 incubator for cell cultivation and a media refrigerator. The processes for cell handling, transfection and pseudovirus production have been implemented according to manual standard operating procedures and are controlled and scheduled autonomously by the system. The system is housed in a biosafety level II cabinet that guarantees protection of personnel, environment and the product. HIV pseudovirus stocks in a scale from 140 ml to 1000 ml have been produced on the automated system. Parallel manual production of HIV pseudoviruses and comparisons (bridging assays) confirmed that the automated produced pseudoviruses were of equivalent quality as those produced manually. In addition, the automated method was fully validated according to Good Clinical Laboratory Practice (GCLP) guidelines, including the validation parameters accuracy, precision, robustness and specificity. Conclusions An automated HIV pseudovirus production system has been successfully established. It allows the high quality production of HIV pseudoviruses under GCLP conditions. In its present form, the installed module enables the production of 1000 ml of virus-containing cell culture supernatant per week. Thus, this novel automation facilitates standardized large-scale productions of HIV pseudoviruses for ongoing and upcoming HIV vaccine trials. PMID:23300558

  5. Linking Automated Data Analysis and Visualization with Applications in Developmental Biology and High-Energy Physics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ruebel, Oliver

    2009-11-20

    Knowledge discovery from large and complex collections of today's scientific datasets is a challenging task. With the ability to measure and simulate more processes at increasingly finer spatial and temporal scales, the growing number of data dimensions and data objects presents tremendous challenges for data analysis and effective data exploration methods and tools. Researchers are overwhelmed with data, and standard tools are often insufficient to enable effective data analysis and knowledge discovery. The main objective of this thesis is to provide important new capabilities to accelerate scientific knowledge discovery from large, complex, and multivariate scientific data. The research covered in this thesis addresses these scientific challenges using a combination of scientific visualization, information visualization, automated data analysis, and other enabling technologies, such as efficient data management. The effectiveness of the proposed analysis methods is demonstrated via applications in two distinct scientific research fields, namely developmental biology and high-energy physics. Advances in microscopy, image analysis, and embryo registration enable, for the first time, measurement of gene expression at cellular resolution for entire organisms. Analysis of high-dimensional spatial gene expression datasets is a challenging task. By integrating data clustering and visualization, analysis of complex, time-varying, spatial gene expression patterns and their formation becomes possible. The analysis framework has been integrated with MATLAB and the visualization system, making advanced analysis tools accessible to biologists and enabling bioinformatics researchers to directly integrate their analyses with the visualization. Laser wakefield particle accelerators (LWFAs) promise to be a new compact source of high-energy particles and radiation, with wide applications ranging from medicine to physics. To gain insight into the complex physical processes of particle acceleration, physicists model LWFAs computationally. The datasets produced by LWFA simulations are (i) extremely large, (ii) of varying spatial and temporal resolution, (iii) heterogeneous, and (iv) high-dimensional, making analysis and knowledge discovery from complex LWFA simulation data a challenging task. To address these challenges, this thesis describes the integration of the visualization system VisIt and the state-of-the-art index/query system FastBit, enabling interactive visual exploration of extremely large three-dimensional particle datasets. Researchers are especially interested in beams of high-energy particles formed during the course of a simulation. This thesis describes novel methods for automatic detection and analysis of particle beams, enabling a more accurate and efficient data analysis process. By integrating these automated analysis methods with visualization, this research enables more accurate, efficient, and effective analysis of LWFA simulation data than previously possible.

  6. Updating flood maps efficiently using existing hydraulic models, very-high-accuracy elevation data, and a geographic information system; a pilot study on the Nisqually River, Washington

    USGS Publications Warehouse

    Jones, Joseph L.; Haluska, Tana L.; Kresch, David L.

    2001-01-01

    A method of updating flood inundation maps at a fraction of the expense of traditional methods was piloted in Washington State as part of the U.S. Geological Survey Urban Geologic and Hydrologic Hazards Initiative. Large savings in expense may be achieved by building upon previous Flood Insurance Studies and automating the process of flood delineation with a Geographic Information System (GIS); increases in accuracy and detail result from the use of very-high-accuracy elevation data and automated delineation; and the resulting digital data sets contain valuable ancillary information such as flood depth, as well as greatly facilitating map storage and utility. The method consists of creating stage-discharge relations from the archived output of the existing hydraulic model, using these relations to create updated flood stages for recalculated flood discharges, and using a GIS to automate the map generation process. Many of the effective flood maps were created in the late 1970s and early 1980s, and suffer from a number of well-recognized deficiencies such as out-of-date or inaccurate estimates of discharges for selected recurrence intervals, changes in basin characteristics, and relatively low-quality elevation data used for flood delineation. FEMA estimates that 45 percent of effective maps are over 10 years old (FEMA, 1997). Consequently, Congress has mandated the updating and periodic review of existing maps, which have cost the Nation almost $3 billion (1997 dollars). The need to update maps and the cost of doing so were the primary motivations for piloting a more cost-effective and efficient updating method. New technologies such as Geographic Information Systems and LIDAR (Light Detection and Ranging) elevation mapping are key to improving the efficiency of flood map updating, but they also improve the accuracy, detail, and usefulness of the resulting digital flood maps. GISs produce digital maps without manual estimation of inundated areas between cross sections, and can generate working maps across a broad range of scales, for any selected area, overlaid with easily updated cultural features. Local governments are aggressively collecting very-high-accuracy elevation data for numerous reasons; this not only lowers the cost and increases the accuracy of flood maps, but also inherently boosts the level of community involvement in the mapping process. These elevation data are also ideal for hydraulic modeling, should an existing model be judged inadequate.
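
    The first step, turning archived hydraulic-model output into a stage-discharge relation that can then be evaluated at recalculated flood discharges, might look like the following sketch; the power-law rating form and all numbers are illustrative assumptions, not values from the pilot study:

```python
import numpy as np
from scipy.optimize import curve_fit

def rating(q, a, b, h0):
    """Stage-discharge relation h = h0 + a*Q**b (illustrative form)."""
    return h0 + a * q**b

# Archived hydraulic-model output at one cross-section (hypothetical values)
q_model = np.array([100, 250, 500, 1000, 2000.0])   # discharge
h_model = np.array([3.1, 4.6, 6.2, 8.3, 11.0])      # stage, ft

popt, _ = curve_fit(rating, q_model, h_model, p0=[0.3, 0.5, 1.0])
q100_updated = 1500.0   # recalculated 100-year discharge
print(f"updated 100-yr flood stage: {rating(q100_updated, *popt):.1f} ft")
```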

  7. Automated Antibody De Novo Sequencing and Its Utility in Biopharmaceutical Discovery

    NASA Astrophysics Data System (ADS)

    Sen, K. Ilker; Tang, Wilfred H.; Nayak, Shruti; Kil, Yong J.; Bern, Marshall; Ozoglu, Berk; Ueberheide, Beatrix; Davis, Darryl; Becker, Christopher

    2017-05-01

    Applications of antibody de novo sequencing in the biopharmaceutical industry range from the discovery of new antibody drug candidates to identifying reagents for research and determining the primary structure of innovator products for biosimilar development. When murine, phage display, or patient-derived monoclonal antibodies against a target of interest are available, but the cDNA or the original cell line is not, de novo protein sequencing is required to humanize and recombinantly express these antibodies, followed by in vitro and in vivo testing for functional validation. Availability of fully automated software tools for monoclonal antibody de novo sequencing enables efficient and routine analysis. Here, we present a novel method to automatically de novo sequence antibodies using mass spectrometry and the Supernovo software. The robustness of the algorithm is demonstrated through a series of stress tests.

  8. Defect Detection and Segmentation Framework for Remote Field Eddy Current Sensor Data

    PubMed Central

    2017-01-01

    Remote-Field Eddy-Current (RFEC) technology is often used as a Non-Destructive Evaluation (NDE) method to prevent water pipe failures. By analyzing the RFEC data, it is possible to quantify the corrosion present in pipes. Quantifying the corrosion involves detecting defects and extracting their depth and shape. For large sections of pipelines, this can be extremely time-consuming if performed manually. Automated approaches are therefore well motivated. In this article, we propose an automated framework to locate and segment defects in individual pipe segments, starting from raw RFEC measurements taken over large pipelines. The framework relies on a novel feature to robustly detect these defects and a segmentation algorithm applied to the deconvolved RFEC signal. The framework is evaluated using both simulated and real datasets, demonstrating its ability to efficiently segment the shape of corrosion defects. PMID:28984823
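
    The pipeline's two stages, deconvolving the measured trace and grouping above-threshold samples into defect segments, can be sketched in one dimension. The Gaussian sensor response, noise constant and threshold are illustrative assumptions, not the paper's model:

```python
import numpy as np

def wiener_deconvolve(signal, kernel, noise=1e-2):
    """Frequency-domain Wiener deconvolution of a 1-D trace, assuming a
    known sensor point-spread response (centred kernel)."""
    n = len(signal)
    H = np.fft.rfft(kernel, n)
    S = np.fft.rfft(signal, n)
    G = np.conj(H) / (np.abs(H) ** 2 + noise)       # Wiener filter
    out = np.fft.irfft(S * G, n)
    return np.roll(out, len(kernel) // 2)           # undo centring shift

def segment_defects(trace, thresh):
    """Group consecutive above-threshold samples into defect intervals."""
    idx = np.flatnonzero(trace > thresh)
    if idx.size == 0:
        return []
    breaks = np.flatnonzero(np.diff(idx) > 1) + 1
    return [(s[0], s[-1]) for s in np.split(idx, breaks)]

# Toy trace: two wall-loss defects blurred by the sensor response
kernel = np.exp(-0.5 * (np.arange(-10, 11) / 3.0) ** 2)
truth = np.zeros(200); truth[50:60] = 1.0; truth[120:125] = 0.6
trace = np.convolve(truth, kernel, mode="same")
print(segment_defects(wiener_deconvolve(trace, kernel), thresh=0.2))
```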

  9. Automated On-tip Affinity Capture Coupled with Mass Spectrometry to Characterize Intact Antibody-Drug Conjugates from Blood

    NASA Astrophysics Data System (ADS)

    Li, Ke Sherry; Chu, Phillip Y.; Fourie-O'Donohue, Aimee; Srikumar, Neha; Kozak, Katherine R.; Liu, Yichin; Tran, John C.

    2018-05-01

    Antibody-drug conjugates (ADCs) present unique challenges for ligand-binding assays primarily due to the dynamic changes of the drug-to-antibody ratio (DAR) distribution in vivo and in vitro. Here, an automated on-tip affinity capture platform with subsequent mass spectrometry analysis was developed to accurately characterize the DAR distribution of ADCs from biological matrices. A variety of elution buffers were tested to offer optimal recovery, with trastuzumab serving as a surrogate for the ADCs. High assay repeatability (CV 3%) was achieved for the trastuzumab antibody when captured below the maximal binding capacity of 7.5 μg. Efficient on-tip deglycosylation was also demonstrated in 1 h followed by affinity capture. Moreover, this tip-based platform affords higher throughput for DAR characterization when compared with a well-characterized bead-based method.
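
    Once the intact-mass spectrum is deconvoluted into DAR species, the average DAR is simply the intensity-weighted mean of the drug loads. A minimal sketch with hypothetical peak areas (not data from the study):

```python
import numpy as np

def mean_dar(peak_intensities):
    """Intensity-weighted mean drug-to-antibody ratio from deconvoluted
    intact-mass peaks, keyed by drug load (DAR species)."""
    dars = np.array(sorted(peak_intensities))
    w = np.array([peak_intensities[d] for d in dars])
    return (dars * w).sum() / w.sum()

# Hypothetical deconvoluted peak areas for DAR 0/2/4/6/8 species
peaks = {0: 5.0, 2: 30.0, 4: 40.0, 6: 20.0, 8: 5.0}
print(f"average DAR = {mean_dar(peaks):.2f}")   # -> 3.80
```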

  10. Software-supported USER cloning strategies for site-directed mutagenesis and DNA assembly.

    PubMed

    Genee, Hans Jasper; Bonde, Mads Tvillinggaard; Bagger, Frederik Otzen; Jespersen, Jakob Berg; Sommer, Morten O A; Wernersson, Rasmus; Olsen, Lars Rønn

    2015-03-20

    USER cloning is a fast and versatile method for engineering of plasmid DNA. We have developed a user-friendly Web server tool that automates the design of optimal PCR primers for several distinct USER cloning-based applications. Our Web server, named AMUSER (Automated DNA Modifications with USER cloning), facilitates DNA assembly and introduction of virtually any type of site-directed mutagenesis by designing optimal PCR primers for the desired genetic changes. To demonstrate the utility, we designed primers for a simultaneous two-position site-directed mutagenesis of green fluorescent protein (GFP) to yellow fluorescent protein (YFP), which in a single-step reaction resulted in a 94% cloning efficiency. AMUSER also supports degenerate nucleotide primers, single insert combinatorial assembly, and flexible parameters for PCR amplification. AMUSER is freely available online at http://www.cbs.dtu.dk/services/AMUSER/.

  11. An Automated Scheme for the Large-Scale Survey of Herbig-Haro Objects

    NASA Astrophysics Data System (ADS)

    Deng, Licai; Yang, Ji; Zheng, Zhongyuan; Jiang, Zhaoji

    2001-04-01

    Owing to their spectral properties, Herbig-Haro (HH) objects can be discovered using photometric methods through a combination of filters, sampling the characteristic spectral lines and the nearby continuum. The data are commonly processed through direct visual inspection of the images. To make data reduction more efficient and the results more uniform and complete, an automated searching scheme for HH objects is developed to manipulate the images using IRAF. This approach helps to extract images with only intrinsic HH emissions. By using this scheme, the pointlike stellar sources and extended nebulous sources with continuum emission can be eliminated from the original images. The objects with only characteristic HH emission become prominent and can be easily picked up. In this paper our scheme is illustrated by a sample field and has been applied to our surveys for HH objects.

  12. Defect Detection and Segmentation Framework for Remote Field Eddy Current Sensor Data.

    PubMed

    Falque, Raphael; Vidal-Calleja, Teresa; Miro, Jaime Valls

    2017-10-06

    Remote-Field Eddy-Current (RFEC) technology is often used as a Non-Destructive Evaluation (NDE) method to prevent water pipe failures. By analyzing the RFEC data, it is possible to quantify the corrosion present in pipes. Quantifying the corrosion involves detecting defects and extracting their depth and shape. For large sections of pipelines, this can be extremely time-consuming if performed manually. Automated approaches are therefore well motivated. In this article, we propose an automated framework to locate and segment defects in individual pipe segments, starting from raw RFEC measurements taken over large pipelines. The framework relies on a novel feature to robustly detect these defects and a segmentation algorithm applied to the deconvolved RFEC signal. The framework is evaluated using both simulated and real datasets, demonstrating its ability to efficiently segment the shape of corrosion defects.

  13. Ex vivo electroporation of retinal cells: a novel, high efficiency method for functional studies in primary retinal cultures

    PubMed Central

    Vergara, Maria Natalia; Gutierrez, Christian; O’Brien, David R.; Canto-Soler, Maria Valeria

    2013-01-01

    Primary retinal cultures constitute valuable tools not only for basic research on retinal cell development and physiology, but also for the identification of factors or drugs that promote cell survival and differentiation. In order to take full advantage of the benefits of this system it is imperative to develop efficient and reliable techniques for the manipulation of gene expression. However, achieving appropriate transfection efficiencies in these cultures has remained challenging. The purpose of this work was to develop and optimize a technique that would allow the transfection of chick retinal cells with high efficiency and reproducibility for multiple applications. We developed an ex vivo electroporation method applied to dissociated retinal cell cultures that offers a significant improvement over other currently available transfection techniques, increasing efficiency by five-fold. In this method, eyes were enucleated, devoid of RPE, and electroporated with GFP-encoding plasmids using custom-made electrodes. Electroporated retinas were then dissociated into single cells and plated in low density conditions, to be analyzed after 4 days of incubation. Parameters such as voltage and number of electric pulses, as well as plasmid concentration and developmental stage of the animal were optimized for efficiency. The characteristics of the cultures were assessed by morphology and immunocytochemistry, and cell viability was determined by ethidium homodimer staining. Cell imaging and counting was performed using an automated high-throughput system. This procedure resulted in transfection efficiencies in the order of 22–25 % of cultured cells, encompassing both photoreceptors and non-photoreceptor neurons, and without affecting normal cell survival and differentiation. Finally, the feasibility of the technique for cell-autonomous studies of gene function in a biologically relevant context was tested by carrying out gain and loss-of-function experiments for the transcription factor PAX6. Electroporation of a plasmid construct expressing PAX6 resulted in a marked upregulation in the expression levels of this protein that could be measured in the whole culture as well as cell-intrinsically. This was accompanied by a significant decrease in the percentage of cells differentiating as photoreceptors among the transfected population. Conversely, electroporation of an RNAi construct targeting PAX6 resulted in a significant decrease in the levels of this protein, with a concomitant increase in the proportion of photoreceptors. Taken together these results provide strong proof-of-principle of the suitability of this technique for genetic studies in retinal cultures. The combination of the high transfection efficiency obtained by this method with automated high-throughput cell analysis supplies the scientific community with a powerful system for performing functional studies in a cell-autonomous manner. PMID:23370269

  14. Ex vivo electroporation of retinal cells: a novel, high efficiency method for functional studies in primary retinal cultures.

    PubMed

    Vergara, M Natalia; Gutierrez, Christian; O'Brien, David R; Canto-Soler, M Valeria

    2013-04-01

    Primary retinal cultures constitute valuable tools not only for basic research on retinal cell development and physiology, but also for the identification of factors or drugs that promote cell survival and differentiation. In order to take full advantage of the benefits of this system it is imperative to develop efficient and reliable techniques for the manipulation of gene expression. However, achieving appropriate transfection efficiencies in these cultures has remained challenging. The purpose of this work was to develop and optimize a technique that would allow the transfection of chick retinal cells with high efficiency and reproducibility for multiple applications. We developed an ex vivo electroporation method applied to dissociated retinal cell cultures that offers a significant improvement over other currently available transfection techniques, increasing efficiency by five-fold. In this method, eyes were enucleated, devoid of RPE, and electroporated with GFP-encoding plasmids using custom-made electrodes. Electroporated retinas were then dissociated into single cells and plated in low density conditions, to be analyzed after 4 days of incubation. Parameters such as voltage and number of electric pulses, as well as plasmid concentration and developmental stage of the animal were optimized for efficiency. The characteristics of the cultures were assessed by morphology and immunocytochemistry, and cell viability was determined by ethidium homodimer staining. Cell imaging and counting was performed using an automated high-throughput system. This procedure resulted in transfection efficiencies in the order of 22-25% of cultured cells, encompassing both photoreceptors and non-photoreceptor neurons, and without affecting normal cell survival and differentiation. Finally, the feasibility of the technique for cell-autonomous studies of gene function in a biologically relevant context was tested by carrying out gain and loss-of-function experiments for the transcription factor PAX6. Electroporation of a plasmid construct expressing PAX6 resulted in a marked upregulation in the expression levels of this protein that could be measured in the whole culture as well as cell-intrinsically. This was accompanied by a significant decrease in the percentage of cells differentiating as photoreceptors among the transfected population. Conversely, electroporation of an RNAi construct targeting PAX6 resulted in a significant decrease in the levels of this protein, with a concomitant increase in the proportion of photoreceptors. Taken together these results provide strong proof-of-principle of the suitability of this technique for genetic studies in retinal cultures. The combination of the high transfection efficiency obtained by this method with automated high-throughput cell analysis supplies the scientific community with a powerful system for performing functional studies in a cell-autonomous manner.

  15. An automated hand hygiene training system improves hand hygiene technique but not compliance.

    PubMed

    Kwok, Yen Lee Angela; Callard, Michelle; McLaws, Mary-Louise

    2015-08-01

    The hand hygiene technique that the World Health Organization recommends for cleansing hands with soap and water or alcohol-based handrub consists of 7 poses. We used an automated training system to improve clinicians' hand hygiene technique and test whether this affected hospitalwide hand hygiene compliance. Seven hundred eighty-nine medical and nursing staff volunteered to participate in a self-directed training session using the automated training system. The proportion of successful first attempts was reported for each of the 7 poses. Hand hygiene compliance was collected according to the national requirement, and rates for 2011-2014 were used to determine the effect of the training system on compliance. The highest pass rate was for pose 1 (palm to palm) at 77% (606 out of 789), whereas pose 6 (clean thumbs) had the lowest pass rate at 27% (216 out of 789). One hundred volunteers provided feedback on 8 items related to satisfaction with the automated training system; most (86%) expressed a high degree of satisfaction, and all reported that this method was time-efficient. There was no significant change in compliance rates after the introduction of the automated training system. Observed compliance during the posttraining period declined but increased to 82% in response to other strategies. Technology for training clinicians in the 7 poses played an important education role but did not affect compliance rates.

  16. Bacterial screening by flow cytometry offers potential for extension of platelet storage: results of 14 months of active surveillance.

    PubMed

    Vollmer, T; Engemann, J; Kleesiek, K; Dreier, J

    2011-06-01

    Bacterial contamination is currently the major infectious hazard of platelet transfusion in developed countries. It has been demonstrated that a significant transfusion risk remains, in particular with older platelet concentrates (PCs). In 2009, the shelf life of PCs was therefore reduced in Germany to 4 days after the day of production according to Vote 38. The aim of the present study was the application and implementation of a recently developed flow cytometry-based rapid screening method (BactiFlow) for bacterial contamination at the end of PC shelf life as a routine in-process control. A total of 472 apheresis-derived PCs were tested using the BactiFlow flow cytometric assay to detect and count bacteria based on esterase activity in viable bacterial cells, while the BacT/Alert automated culture system served as the reference method. The automation potential of the flow cytometric assay was analysed by applying the semi-automated BactiFlow ALS system. An algorithm was developed for use in routine blood bank operations to extend the storage period of PCs. Two of the 472 apheresis PCs tested were positive in culture and identified as Propionibacterium species. One PC was positive for Staphylococcus aureus by both methods. All remaining specimens tested negative by both methods. Our study demonstrates that routine bacterial testing of PCs was successfully implemented and that the established algorithm proved efficient. The BactiFlow flow cytometric assay is the first rapid screening method suitable for routine application combined with high sensitivity.

  17. Automated lettuce nutrient solution management using an array of ion-selective electrodes

    USDA-ARS?s Scientific Manuscript database

    Automated sensing and control of macronutrients in hydroponic solutions would allow more efficient management of nutrients for crop growth in closed systems. This paper describes the development and evaluation of a computer-controlled nutrient management system with an array of ion-selective electro...

  18. Multi-Scale Low-Entropy Method for Optimizing the Processing Parameters during Automated Fiber Placement

    PubMed Central

    Han, Zhenyu; Sun, Shouzheng; Fu, Hongya; Fu, Yunzhong

    2017-01-01

    Automated fiber placement (AFP) process includes a variety of energy forms and multi-scale effects. This contribution proposes a novel multi-scale low-entropy method aiming at optimizing processing parameters in an AFP process, where multi-scale effect, energy consumption, energy utilization efficiency and mechanical properties of micro-system could be taken into account synthetically. Taking a carbon fiber/epoxy prepreg as an example, mechanical properties of macro–meso–scale are obtained by Finite Element Method (FEM). A multi-scale energy transfer model is then established to input the macroscopic results into the microscopic system as its boundary condition, which can communicate with different scales. Furthermore, microscopic characteristics, mainly micro-scale adsorption energy, diffusion coefficient entropy–enthalpy values, are calculated under different processing parameters based on molecular dynamics method. Low-entropy region is then obtained in terms of the interrelation among entropy–enthalpy values, microscopic mechanical properties (interface adsorbability and matrix fluidity) and processing parameters to guarantee better fluidity, stronger adsorption, lower energy consumption and higher energy quality collaboratively. Finally, nine groups of experiments are carried out to verify the validity of the simulation results. The results show that the low-entropy optimization method can reduce void content effectively, and further improve the mechanical properties of laminates. PMID:28869520

  19. Multi-Scale Low-Entropy Method for Optimizing the Processing Parameters during Automated Fiber Placement.

    PubMed

    Han, Zhenyu; Sun, Shouzheng; Fu, Hongya; Fu, Yunzhong

    2017-09-03

    Automated fiber placement (AFP) process includes a variety of energy forms and multi-scale effects. This contribution proposes a novel multi-scale low-entropy method aiming at optimizing processing parameters in an AFP process, where multi-scale effect, energy consumption, energy utilization efficiency and mechanical properties of micro-system could be taken into account synthetically. Taking a carbon fiber/epoxy prepreg as an example, mechanical properties of macro-meso-scale are obtained by Finite Element Method (FEM). A multi-scale energy transfer model is then established to input the macroscopic results into the microscopic system as its boundary condition, which can communicate with different scales. Furthermore, microscopic characteristics, mainly micro-scale adsorption energy, diffusion coefficient entropy-enthalpy values, are calculated under different processing parameters based on molecular dynamics method. Low-entropy region is then obtained in terms of the interrelation among entropy-enthalpy values, microscopic mechanical properties (interface adsorbability and matrix fluidity) and processing parameters to guarantee better fluidity, stronger adsorption, lower energy consumption and higher energy quality collaboratively. Finally, nine groups of experiments are carried out to verify the validity of the simulation results. The results show that the low-entropy optimization method can reduce void content effectively, and further improve the mechanical properties of laminates.

  20. A LabVIEW-based experiment system for the efficient collection and analysis of cyclic voltammetry and electrode charge capacity measurements.

    PubMed

    Detlefsen, D; Hu, Z; Troyk, P R

    2006-01-01

    Cyclic voltammetry and recording of stimulation electrode voltage excursions are two critical measurement methods for understanding the performance of implantable electrodes. Because implanted electrodes cannot easily be replaced, it is necessary to have an a-priori understanding of an electrode's implanted performance and capabilities. Exhaustive in-vitro tests are often needed to quantify an electrode's performance. Using commonly available equipment, the human labor cost to conduct this work is immense. Presented is a highly configurable automated experiment system that can efficiently conduct a battery of repeatable CV and stimulation recording measurements. Results of preparing 96 electrodes prior to animal implantation are also discussed.

  1. Automated systems to identify relevant documents in product risk management

    PubMed Central

    2012-01-01

    Background Product risk management involves critical assessment of the risks and benefits of health products circulating in the market. One of the important sources of safety information is the primary literature, especially for newer products with which regulatory authorities have relatively little experience. Although the primary literature provides vast and diverse information, only a small proportion of it is useful for product risk assessment work. Hence, the aim of this study is to explore the possibility of using text mining to automate the identification of useful articles, which would reduce the time taken for literature searches and hence improve work efficiency. In this study, term-frequency inverse document-frequency values were computed for predictors extracted from the titles and abstracts of articles related to three tumour necrosis factor-alpha blockers. A general automated system was developed using only general predictors and was tested for its generalizability using articles related to four other drug classes. Several specific automated systems were developed using both general and specific predictors and training sets of different sizes in order to determine the minimum number of articles required for developing such systems. Results The general automated system had an area under the curve value of 0.731 and was able to rank 34.6% and 46.2% of the total number of 'useful' articles among the first 10% and 20% of the articles presented to the evaluators when tested on the generalizability set. However, its use may be limited by the subjective definition of useful articles. For the specific automated systems, it was found that only 20 articles were required to develop a specific automated system with a prediction performance (AUC 0.748) that was better than that of the general automated system. Conclusions Specific automated systems can be developed rapidly and avoid problems caused by the subjective definition of useful articles. Thus the efficiency of product risk management can be improved with the use of specific automated systems. PMID:22380483
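
    A minimal sketch of this kind of system, pairing TF-IDF features with a simple probabilistic ranker; the study's actual classifier may differ, and scikit-learn plus the toy labelled titles are assumptions for illustration:

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Toy labelled examples (1 = useful for risk assessment); illustrative only
titles = [
    "hepatotoxicity reported in patients receiving the drug",
    "pharmacokinetics in healthy volunteers",
    "serious infection risk during long-term therapy",
    "synthesis and crystal structure of the compound",
]
useful = [1, 0, 1, 0]

clf = make_pipeline(TfidfVectorizer(), LogisticRegression())
clf.fit(titles, useful)

# Rank new articles so evaluators see the likely 'useful' ones first
scores = clf.predict_proba(["infection events during treatment"])[:, 1]
print(scores)
```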

  2. Automated and assisted RNA resonance assignment using NMR chemical shift statistics

    PubMed Central

    Aeschbacher, Thomas; Schmidt, Elena; Blatter, Markus; Maris, Christophe; Duss, Olivier; Allain, Frédéric H.-T.; Güntert, Peter; Schubert, Mario

    2013-01-01

    The three-dimensional structure determination of RNAs by NMR spectroscopy relies on chemical shift assignment, which still constitutes a bottleneck. In order to develop more efficient assignment strategies, we analysed relationships between sequence and 1H and 13C chemical shifts. Statistics of resonances from regularly Watson–Crick base-paired RNA revealed highly characteristic chemical shift clusters. We developed two approaches using these statistics for chemical shift assignment of double-stranded RNA (dsRNA): a manual approach that yields starting points for resonance assignment and simplifies decision trees and an automated approach based on the recently introduced automated resonance assignment algorithm FLYA. Both strategies require only unlabeled RNAs and three 2D spectra for assigning the H2/C2, H5/C5, H6/C6, H8/C8 and H1′/C1′ chemical shifts. The manual approach proved to be efficient and robust when applied to the experimental data of RNAs with a size between 20 nt and 42 nt. The more advanced automated assignment approach was successfully applied to four stem-loop RNAs and a 42 nt siRNA, assigning 92–100% of the resonances from dsRNA regions correctly. This is the first automated approach for chemical shift assignment of non-exchangeable protons of RNA and their corresponding 13C resonances, which provides an important step toward automated structure determination of RNAs. PMID:23921634

  3. Ideas that Work!. Retuning the Building Automation System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Parker, Steven

    A building automation system (BAS) can save considerable energy by effectively and efficiently operating building energy systems (fans, pumps, chillers, boilers, etc.), but only when the BAS is properly set up and operated. Tuning, or retuning, the BAS is a cost-effective process worthy of your time and attention.

  4. 75 FR 29374 - Self-Regulatory Organizations; The Depository Trust Company; Notice of Filing and Immediate...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-05-25

    ... automation. For instance, Participants could identify appropriate tax treatment prior to allocation which... offer greater automation and efficiency to consent solicitation collection, which is currently a manual... submit written data, views and arguments concerning the foregoing, including whether the proposed rule...

  5. Final-Approach-Spacing Subsystem For Air Traffic

    NASA Technical Reports Server (NTRS)

    Davis, Thomas J.; Erzberger, Heinz; Bergeron, Hugh

    1992-01-01

    Automation subsystem of computers, computer workstations, communication equipment, and radar helps air-traffic controllers in terminal radar approach-control (TRACON) facility manage sequence and spacing of arriving aircraft for both efficiency and safety. Called FAST (Final Approach Spacing Tool), subsystem enables controllers to choose among various levels of automation.

  6. Teacherbot: Interventions in Automated Teaching

    ERIC Educational Resources Information Center

    Bayne, Sian

    2015-01-01

    Promises of "teacher-light" tuition and of enhanced "efficiency" via the automation of teaching have been with us since the early days of digital education, sometimes embraced by academics and institutions, and sometimes resisted as a set of moves which are damaging to teacher professionalism and to the humanistic values of…

  7. Improving Learning Object Quality: Moodle HEODAR Implementation

    ERIC Educational Resources Information Center

    Munoz, Carlos; Garcia-Penalvo, Francisco J.; Morales, Erla Mariela; Conde, Miguel Angel; Seoane, Antonio M.

    2012-01-01

    Automation toward efficiency is the aim of most intelligent systems in an educational context, where automated calculation of results allows experts to spend most of their time on important tasks rather than on retrieving, ordering, and interpreting information. In this paper, the authors provide a tool that easily evaluates Learning Objects quality…

  8. Automating the conflict resolution process

    NASA Technical Reports Server (NTRS)

    Wike, Jeffrey S.

    1991-01-01

    The purpose is to initiate a discussion of how the conflict resolution process at the Network Control Center can be made more efficient. Described here are how resource conflicts are currently resolved as well as the impacts of automating conflict resolution in the ATDRSS era. A variety of conflict resolution strategies are presented.

  9. Automation for Primary Processing of Hardwoods

    Treesearch

    Daniel L. Schmoldt

    1992-01-01

    Hardwood sawmills critically need to incorporate automation and computer technology into their operations. Social constraints, forest biology constraints, forest product market changes, and financial necessity are forcing primary processors to boost their productivity and efficiency to higher levels. The locations, extent, and types of defects found in logs and on...

  10. An automated sleep-state classification algorithm for quantifying sleep timing and sleep-dependent dynamics of electroencephalographic and cerebral metabolic parameters

    PubMed Central

    Rempe, Michael J; Clegern, William C; Wisor, Jonathan P

    2015-01-01

    Introduction Rodent sleep research uses electroencephalography (EEG) and electromyography (EMG) to determine the sleep state of an animal at any given time. EEG and EMG signals, typically sampled at >100 Hz, are segmented arbitrarily into epochs of equal duration (usually 2–10 seconds), and each epoch is scored as wake, slow-wave sleep (SWS), or rapid-eye-movement sleep (REMS), on the basis of visual inspection. Automated state scoring can minimize the burden associated with state scoring and thereby facilitate the use of shorter epoch durations. Methods We developed a semiautomated state-scoring procedure that uses a combination of principal component analysis and naïve Bayes classification, with the EEG and EMG as inputs. We validated this algorithm against human sleep-state scoring of data from C57BL/6J and BALB/CJ mice. We then applied a general homeostatic model to characterize the state-dependent dynamics of sleep slow-wave activity and cerebral glycolytic flux, measured as lactate concentration. Results More than 89% of epochs scored as wake or SWS by the human were scored as the same state by the machine, whether scoring in 2-second or 10-second epochs. The majority of epochs scored as REMS by the human were also scored as REMS by the machine. However, of epochs scored as REMS by the human, more than 10% were scored as SWS by the machine and 18% (10-second epochs) to 28% (2-second epochs) were scored as wake. These biases were not strain-specific, as strain differences in sleep-state timing relative to the light/dark cycle, EEG power spectral profiles, and the homeostatic dynamics of both slow waves and lactate were detected equally effectively with the automated method or the manual scoring method. Error associated with mathematical modeling of temporal dynamics of both EEG slow-wave activity and cerebral lactate either did not differ significantly when state scoring was done with automated versus visual scoring, or was reduced with automated state scoring relative to manual classification. Conclusions Machine scoring is as effective as human scoring in detecting experimental effects in rodent sleep studies. Automated scoring is an efficient alternative to visual inspection in studies of strain differences in sleep and the temporal dynamics of sleep-related physiological parameters. PMID:26366107
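
    The two-stage design described here, PCA for dimensionality reduction followed by a naïve Bayes classifier, maps directly onto standard library components. A minimal sketch, with random arrays standing in for real EEG/EMG epoch features and human-scored labels:

    ```python
    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.naive_bayes import GaussianNB
    from sklearn.pipeline import make_pipeline

    # Hypothetical feature matrix: one row per epoch, columns are EEG band
    # powers plus integrated EMG; labels are human scores
    # (0 = wake, 1 = SWS, 2 = REMS). Real data replaces the random arrays.
    rng = np.random.default_rng(0)
    X_train = rng.normal(size=(5000, 20))
    y_train = rng.integers(0, 3, size=5000)

    # PCA for decorrelation/denoising, then naive Bayes classification,
    # mirroring the two-stage design described in the abstract.
    scorer = make_pipeline(PCA(n_components=5), GaussianNB())
    scorer.fit(X_train, y_train)

    X_new = rng.normal(size=(10, 20))
    print(scorer.predict(X_new))  # one state label per epoch
    ```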

  11. HT-COMET: a novel automated approach for high throughput assessment of human sperm chromatin quality

    PubMed Central

    Albert, Océane; Reintsch, Wolfgang E.; Chan, Peter; Robaire, Bernard

    2016-01-01

    STUDY QUESTION Can we make the comet assay (single-cell gel electrophoresis) for human sperm a more accurate and informative high throughput assay? SUMMARY ANSWER We developed a standardized automated high throughput comet (HT-COMET) assay for human sperm that improves its accuracy and efficiency, and could be of prognostic value to patients in the fertility clinic. WHAT IS KNOWN ALREADY The comet assay involves the collection of data on sperm DNA damage at the level of the single cell, allowing the use of samples from severe oligozoospermic patients. However, this makes comet scoring a low throughput procedure that renders large cohort analyses tedious. Furthermore, the comet assay comes with an inherent vulnerability to variability. Our objective is to develop an automated high throughput comet assay for human sperm that will increase both its accuracy and efficiency. STUDY DESIGN, SIZE, DURATION The study comprised two distinct components: a HT-COMET technical optimization section based on control versus DNAse treatment analyses (n = 3–5), and a cross-sectional study on 123 men presenting to a reproductive center with sperm concentrations categorized as severe oligozoospermia, oligozoospermia or normozoospermia. PARTICIPANTS/MATERIALS, SETTING, METHODS Sperm chromatin quality was measured using the comet assay: on classic 2-well slides for software comparison; on 96-well slides for HT-COMET optimization; after exposure to various concentrations of a damage-inducing agent, DNAse, using HT-COMET; on 123 subjects with different sperm concentrations using HT-COMET. Data from the 123 subjects were correlated to classic semen quality parameters and plotted as single-cell data in individual DNA damage profiles. MAIN RESULTS AND THE ROLE OF CHANCE We have developed a standard automated HT-COMET procedure for human sperm. It includes automated scoring of comets by a fully integrated high content screening setup that compares well with the most commonly used semi-manual analysis software. Using this method, a cross-sectional study on 123 men showed no significant correlation between sperm concentration and sperm DNA damage, confirming the existence of hidden chromatin damage in men with apparently normal semen characteristics, and a significant correlation between percentage DNA in the tail and percentage of progressively motile spermatozoa. Finally, the use of DNA damage profiles helped to distinguish subjects between and within sperm concentration categories, and allowed a determination of the proportion of highly damaged cells. LIMITATIONS, REASONS FOR CAUTION The main limitations of the HT-COMET are the high, yet indispensable, investment in an automated liquid handling system and heating block to ensure accuracy, and the availability of an automated plate reading microscope and analysis software. WIDER IMPLICATIONS OF THE FINDINGS This standardized HT-COMET assay offers many advantages, including higher accuracy and evenness due to automation of sensitive steps, a 14.4-fold increase in sample analysis capacity, and an imaging and scoring time of 1 min/well. Overall, HT-COMET offers a decrease in total experimental time of more than 90%. Hence, this assay constitutes a more efficient option to assess sperm chromatin quality, paves the way to using this assay to screen large cohorts, and holds prognostic value for infertile patients. STUDY FUNDING/COMPETING INTEREST(S) Funded by the CIHR Institute of Human Development, Child and Youth Health (IHDCYH; RHF 100625). O.A. 
is a fellow supported by the Fonds de la Recherche du Québec - Santé (FRQS) and the CIHR Training Program in Reproduction, Early Development, and the Impact on Health (REDIH). B.R. is a James McGill Professor. The authors declare no conflicts of interest. PMID:26975326
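
    The core comet metric reported above, percentage DNA in the tail, is the tail fluorescence expressed as a fraction of total comet fluorescence. A minimal sketch, assuming the head and tail pixel masks have already been segmented upstream:

    ```python
    import numpy as np

    def percent_tail_dna(comet_image, head_mask, tail_mask, background=0.0):
        """Percent DNA in the comet tail: tail fluorescence divided by
        total (head + tail) fluorescence, after background subtraction.
        Masks are boolean arrays of the same shape as the image."""
        img = np.asarray(comet_image, dtype=float) - background
        tail = img[tail_mask].sum()
        head = img[head_mask].sum()
        return 100.0 * tail / (head + tail)
    ```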

  12. The clinical applicability of an automated plethysmographic determination of the ankle-brachial index after vascular surgery.

    PubMed

    van der Slegt, Jasper; Verbogt, Nathalie Pa; Mulder, Paul Gh; Steunenberg, Stijn L; Steunenberg, Bastiaan E; van der Laan, Lijckle

    2016-10-01

    An automated ankle-brachial index device could lead to potential time savings and greater accuracy in ankle-brachial index determination after vascular surgery. This prospective cross-sectional study compared postprocedural ankle-brachial indices measured by a manual method with those of an automated plethysmographic method. Forty-two patients were included. No significant difference in the time to perform a measurement was observed (1.1 min, 95% CI: -0.2 to +2.4; P = 0.095). The mean ankle-brachial index with the automated method was 0.105 higher (95% CI: 0.017 to 0.193; P = 0.020) than with the manual method, with limits of agreement of -0.376 and +0.587. Total variance amounted to 0.0759 and the correlation between both methods was 0.60. Reliability, expressed as the maximum absolute difference (95% level) between duplicate ankle-brachial index measurements under identical conditions, was 0.350 (manual) and 0.152 (automated), although the difference was not significant (p = 0.053). Finally, the automated method had a failure rate 34 percentage points higher than the manual method. In conclusion, based on this study, the automated ankle-brachial index method does not seem clinically applicable for measuring the ankle-brachial index postoperatively in patients with vascular disease. © The Author(s) 2016.
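
    The limits of agreement quoted above follow the usual Bland-Altman construction: the mean of the paired differences plus or minus 1.96 standard deviations. A sketch with made-up paired readings (not the study data):

    ```python
    import numpy as np

    def bland_altman(manual, automated):
        """Mean difference (bias) and 95% limits of agreement
        between two measurement methods."""
        d = np.asarray(automated, float) - np.asarray(manual, float)
        bias = d.mean()
        sd = d.std(ddof=1)
        return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

    # Illustrative paired ABI readings (invented, not the study data):
    manual    = [0.55, 0.72, 0.90, 1.02, 0.65]
    automated = [0.60, 0.85, 1.05, 1.10, 0.75]
    bias, loa = bland_altman(manual, automated)
    print(f"bias={bias:.3f}, limits of agreement={loa}")
    ```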

  13. Optimized anion exchange column isolation of zirconium-89 (89Zr) from yttrium cyclotron target: Method development and implementation on an automated fluidic platform

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    O’Hara, Matthew J.; Murray, Nathaniel J.; Carter, Jennifer C.

    Zirconium-89 (89Zr), produced by the (p, n) reaction from naturally monoisotopic yttrium (natY), is a promising positron-emitting isotope for immunoPET imaging. Its long half-life of 78.4 h is sufficient for evaluating slow physiological processes. A prototype automated fluidic system, coupled to on-line and in-line detectors, has been constructed to facilitate development of new 89Zr purification methodologies. The highly reproducible reagent delivery platform and near-real-time monitoring of column effluents allow for efficient method optimization. The separation of Zr from dissolved Y metal targets was evaluated using several anion exchange resins. Each resin was evaluated against its ability to quantitatively capture Zr from a load solution high in dissolved Y. The most appropriate anion exchange resin for this application was identified, and the separation method was optimized. The method is capable of a high Y decontamination factor (>10^5) and has been shown to remove Fe, an abundant contaminant in Y foils, from the 89Zr elution fraction. Finally, the method was evaluated using cyclotron-bombarded Y foil targets; the method was shown to achieve >95% recovery of the 89Zr present in the foils. The anion exchange column method described here is intended to be the first 89Zr isolation stage in a dual-column purification process.

  14. Optimized anion exchange column isolation of zirconium-89 (89Zr) from yttrium cyclotron target: Method development and implementation on an automated fluidic platform

    DOE PAGES

    O’Hara, Matthew J.; Murray, Nathaniel J.; Carter, Jennifer C.; ...

    2018-02-24

    Zirconium-89 (89Zr), produced by the (p, n) reaction from naturally monoisotopic yttrium (natY), is a promising positron-emitting isotope for immunoPET imaging. Its long half-life of 78.4 h is sufficient for evaluating slow physiological processes. A prototype automated fluidic system, coupled to on-line and in-line detectors, has been constructed to facilitate development of new 89Zr purification methodologies. The highly reproducible reagent delivery platform and near-real-time monitoring of column effluents allow for efficient method optimization. The separation of Zr from dissolved Y metal targets was evaluated using several anion exchange resins. Each resin was evaluated against its ability to quantitatively capture Zr from a load solution high in dissolved Y. The most appropriate anion exchange resin for this application was identified, and the separation method was optimized. The method is capable of a high Y decontamination factor (>10^5) and has been shown to remove Fe, an abundant contaminant in Y foils, from the 89Zr elution fraction. Finally, the method was evaluated using cyclotron-bombarded Y foil targets; the method was shown to achieve >95% recovery of the 89Zr present in the foils. The anion exchange column method described here is intended to be the first 89Zr isolation stage in a dual-column purification process.

  15. An automated retinal imaging method for the early diagnosis of diabetic retinopathy.

    PubMed

    Franklin, S Wilfred; Rajan, S Edward

    2013-01-01

    Diabetic retinopathy is a microvascular complication of long-term diabetes and is the major cause of eyesight loss due to changes in the blood vessels of the retina. Major vision loss due to diabetic retinopathy is highly preventable with regular screening and timely intervention at the earlier stages. Retinal blood vessel segmentation methods help to identify the successive stages of such sight-threatening diseases as diabetic retinopathy. Our aim was to develop and test a novel retinal imaging method that segments the blood vessels automatically from retinal images, helping ophthalmologists in the diagnosis and follow-up of diabetic retinopathy. The method classifies each image pixel as vessel or nonvessel, which, in turn, is used for automatic recognition of the vasculature in retinal images. Retinal blood vessels were identified by means of a multilayer perceptron neural network, for which the inputs were derived from Gabor and moment invariants-based features. The back-propagation algorithm, which provides an efficient technique to update the weights in a feed-forward network, is utilized in our method. Quantitative results of sensitivity, specificity and predictive values were obtained, and the measured accuracy of our segmentation algorithm was 95.3%, which is better than that presented by state-of-the-art approaches. The evaluation procedure used and the demonstrated effectiveness of our automated retinal imaging method show it to be a powerful tool for diagnosing diabetic retinopathy in its earlier stages.
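
    A pixel-wise classifier of this kind can be sketched with off-the-shelf components: Gabor filter-bank magnitudes as per-pixel features and a small multilayer perceptron trained by backpropagation. The moment-invariant features used in the paper are omitted, and the toy image and vessel mask below are random stand-ins for a real fundus image and manual annotation:

    ```python
    import numpy as np
    from skimage.filters import gabor
    from sklearn.neural_network import MLPClassifier

    def gabor_features(image, frequencies=(0.1, 0.2, 0.3)):
        """Stack Gabor filter magnitude responses as per-pixel features."""
        feats = []
        for f in frequencies:
            for theta in np.linspace(0, np.pi, 4, endpoint=False):
                real, imag = gabor(image, frequency=f, theta=theta)
                feats.append(np.hypot(real, imag))
        return np.stack(feats, axis=-1).reshape(-1, len(feats))

    # Hypothetical training data: an image plus a manual vessel mask.
    rng = np.random.default_rng(1)
    image = rng.random((64, 64))
    vessel_mask = rng.random((64, 64)) > 0.9

    X = gabor_features(image)
    y = vessel_mask.ravel().astype(int)

    # A small MLP trained by backpropagation, standing in for the
    # network described in the abstract.
    clf = MLPClassifier(hidden_layer_sizes=(15,), max_iter=200).fit(X, y)
    segmented = clf.predict(X).reshape(image.shape)
    ```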

  16. Slotting optimization of automated storage and retrieval system (AS/RS) for efficient delivery of parts in an assembly shop using genetic algorithm: A case study

    NASA Astrophysics Data System (ADS)

    Yue, L.; Guan, Z.; He, C.; Luo, D.; Saif, U.

    2017-06-01

    In recent years, competitive pressure has shifted manufacturing companies from mass production to mass customization, producing a large variety of products. Meeting customized demand on time under a mixed-flow mode of production is a great challenge for companies today. With a large variety of products, the storage system that delivers parts to the production lines strongly influences timely production, as shown in the current research by a simulation study of an inefficient storage system at a real company. Therefore, this research proposes a slotting optimization model, coupled with the mixed-model assembly sequence of the final flow lines, to optimize the whole automated storage and retrieval system (AS/RS) and distribution system of the case company. The model simultaneously minimizes the vertical height of the centre of gravity of the AS/RS and the total time spent retrieving materials from it. A genetic algorithm is adopted to solve the proposed problem, and computational results show significant improvement in the stability and efficiency of the AS/RS compared with the existing method used in the case company.

  17. Visualization and Quantification of Rotor Tip Vortices in Helicopter Flows

    NASA Technical Reports Server (NTRS)

    Kao, David L.; Ahmad, Jasim U.; Holst, Terry L.

    2015-01-01

    This paper presents an automated approach for the effective extraction, visualization, and quantification of vortex core radii from Navier-Stokes simulations of a UH-60A rotor in forward flight. We adopt a scaled Q-criterion to determine vortex regions and then perform vortex core profiling in these regions to calculate vortex core radii. This method provides an efficient way of visualizing and quantifying the blade tip vortices. Moreover, the vortex core radii are displayed graphically in a plane.
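
    The Q-criterion labels as vortex candidates the regions where rotation dominates strain in the velocity gradient tensor. A minimal sketch on a uniform grid (the paper's particular scaling and thresholding strategy is not reproduced):

    ```python
    import numpy as np

    def q_criterion(u, v, w, dx=1.0):
        """Q-criterion on a uniform 3D grid: Q = 0.5*(||Omega||^2 - ||S||^2),
        positive where rotation dominates strain (candidate vortex regions)."""
        grads = [np.gradient(c, dx) for c in (u, v, w)]  # grads[i][j] = d(c_i)/dx_j
        J = np.array(grads)                              # Jacobian, shape (3, 3, *grid)
        S = 0.5 * (J + J.transpose(1, 0, 2, 3, 4))       # strain-rate tensor
        O = 0.5 * (J - J.transpose(1, 0, 2, 3, 4))       # rotation tensor
        return 0.5 * ((O**2).sum(axis=(0, 1)) - (S**2).sum(axis=(0, 1)))

    # Vortex candidates are cells where Q exceeds a scaled threshold, e.g.
    # mask = Q > alpha * Q.max(), where alpha is a tuning choice.
    ```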

  18. Brain tissues volume measurements from 2D MRI using parametric approach

    NASA Astrophysics Data System (ADS)

    L'vov, A. A.; Toropova, O. A.; Litovka, Yu. V.

    2018-04-01

    The purpose of this paper is to propose a fully automated method for assessing the volume of structures within the human brain. Our statistical approach uses the maximum interdependency principle in the decision-making process for measurement consistency and unequal observations. Outlier detection is performed using the maximum normalized residual test. We propose a statistical model that utilizes knowledge of tissue distribution in the human brain and applies partial data restoration to improve precision. The approach is computationally efficient and independent of the segmentation algorithm used in the application.
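
    The maximum normalized residual test mentioned here is Grubbs' test: the largest standardized deviation from the sample mean is compared with a critical value derived from the t-distribution. A minimal one-outlier sketch:

    ```python
    import numpy as np
    from scipy import stats

    def max_normalized_residual_test(x, alpha=0.05):
        """One round of Grubbs' maximum normalized residual test.
        Returns the index of the detected outlier, or None."""
        x = np.asarray(x, float)
        n = x.size
        G = np.abs(x - x.mean()).max() / x.std(ddof=1)
        t = stats.t.ppf(1 - alpha / (2 * n), n - 2)
        G_crit = (n - 1) / np.sqrt(n) * np.sqrt(t**2 / (n - 2 + t**2))
        return int(np.abs(x - x.mean()).argmax()) if G > G_crit else None

    print(max_normalized_residual_test([9.9, 10.1, 10.0, 10.2, 14.7]))  # -> 4
    ```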

  19. System and method for automated object detection in an image

    DOEpatents

    Kenyon, Garrett T.; Brumby, Steven P.; George, John S.; Paiton, Dylan M.; Schultz, Peter F.

    2015-10-06

    A contour/shape detection model may use relatively simple and efficient kernels to detect target edges in an object within an image or video. A co-occurrence probability may be calculated for two or more edge features in an image or video using an object definition. Edge features may be differentiated on the basis of measured contextual support, and prominent edge features may be extracted based on the measured contextual support. The object may then be identified based on the extracted prominent edge features.

  20. Screening for illicit and medicinal drugs in whole blood using fully automated SPE and ultra-high-performance liquid chromatography with TOF-MS with data-independent acquisition.

    PubMed

    Pedersen, Anders Just; Dalsgaard, Petur Weihe; Rode, Andrej Jaroslav; Rasmussen, Brian Schou; Müller, Irene Breum; Johansen, Sys Stybe; Linnet, Kristian

    2013-07-01

    A broad forensic screening method for 256 analytes in whole blood based on a fully automated SPE robotic extraction and ultra-high-performance liquid chromatography (UHPLC) with TOF-MS with data-independent acquisition has been developed. The limit of identification was evaluated for all 256 compounds and 95 of these compounds were validated with regard to matrix effects, extraction recovery, and process efficiency. The limit of identification ranged from 0.001 to 0.1 mg/kg, and the process efficiency exceeded 50% for 73 of the 95 analytes. As an example of application, 1335 forensic traffic cases were analyzed with the presented screening method. Of these, 992 cases (74%) were positive for one or more traffic-relevant drugs above the Danish legal limits. Commonly abused drugs such as amphetamine, cocaine, and frequent types of benzodiazepines were the major findings. Nineteen less frequently encountered drugs were detected e.g. buprenorphine, butylone, cathine, fentanyl, lysergic acid diethylamide, m-chlorophenylpiperazine, 3,4-methylenedioxypyrovalerone, mephedrone, 4-methylamphetamine, p-fluoroamphetamine, and p-methoxy-N-methylamphetamine. In conclusion, using UHPLC-TOF-MS screening with data-independent acquisition resulted in the detection of common drugs of abuse as well as new designer drugs and more rarely occurring drugs. Thus, TOF-MS screening of blood samples constitutes a practical way for screening traffic cases, with the exception of δ-9-tetrahydrocannabinol, which should be handled in a separate method. © 2013 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  1. A deep convolutional neural network approach to single-particle recognition in cryo-electron microscopy.

    PubMed

    Zhu, Yanan; Ouyang, Qi; Mao, Youdong

    2017-07-21

    Single-particle cryo-electron microscopy (cryo-EM) has become a mainstream tool for the structural determination of biological macromolecular complexes. However, high-resolution cryo-EM reconstruction often requires hundreds of thousands of single-particle images. Particle extraction from experimental micrographs thus can be laborious and presents a major practical bottleneck in cryo-EM structural determination. Existing computational methods for particle picking often use low-resolution templates for particle matching, making them susceptible to reference-dependent bias. It is critical to develop a highly efficient template-free method for the automatic recognition of particle images from cryo-EM micrographs. We developed a deep learning-based algorithmic framework, DeepEM, for single-particle recognition from noisy cryo-EM micrographs, enabling automated particle picking, selection and verification in an integrated fashion. The kernel of DeepEM is built upon a convolutional neural network (CNN) composed of eight layers, which can be recursively trained to be highly "knowledgeable". Our approach exhibits an improved performance and accuracy when tested on the standard KLH dataset. Application of DeepEM to several challenging experimental cryo-EM datasets demonstrated its ability to avoid the selection of unwanted particles and non-particles even when true particles contain fewer features. The DeepEM methodology, derived from a deep CNN, allows automated particle extraction from raw cryo-EM micrographs in the absence of a template. It demonstrates an improved performance, objectivity and accuracy. Application of this novel method is expected to free the labor involved in single-particle verification, significantly improving the efficiency of cryo-EM data processing.
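
    As an illustration of the classification setup (not DeepEM itself, whose eight-layer architecture and training data are specific to the paper), a miniature CNN that scores micrograph patches as particle versus non-particle might look like this in PyTorch; the patch size, channel counts, and random data are all assumptions:

    ```python
    import torch
    import torch.nn as nn

    # Miniature CNN scoring square micrograph patches as particle (1)
    # vs. background (0); far shallower than DeepEM's eight layers.
    model = nn.Sequential(
        nn.Conv2d(1, 8, kernel_size=5, padding=2), nn.ReLU(), nn.MaxPool2d(2),
        nn.Conv2d(8, 16, kernel_size=5, padding=2), nn.ReLU(), nn.MaxPool2d(2),
        nn.Flatten(),
        nn.Linear(16 * 16 * 16, 2),   # assumes 64x64 input patches
    )

    patches = torch.randn(32, 1, 64, 64)   # hypothetical patch batch
    labels = torch.randint(0, 2, (32,))    # hypothetical annotations

    # One training step: cross-entropy loss plus a gradient update.
    opt = torch.optim.Adam(model.parameters(), lr=1e-3)
    loss = nn.CrossEntropyLoss()(model(patches), labels)
    loss.backward()
    opt.step()
    ```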

  2. A fully automated Drosophila olfactory classical conditioning and testing system for behavioral learning and memory assessment.

    PubMed

    Jiang, Hui; Hanna, Eriny; Gatto, Cheryl L; Page, Terry L; Bhuva, Bharat; Broadie, Kendal

    2016-03-01

    Aversive olfactory classical conditioning has been the standard method to assess Drosophila learning and memory behavior for decades, yet training and testing are conducted manually under exceedingly labor-intensive conditions. To overcome this severe limitation, a fully automated, inexpensive system has been developed, which allows accurate and efficient Pavlovian associative learning/memory analyses for high-throughput pharmacological and genetic studies. The automated system employs a linear actuator coupled to an odorant T-maze with airflow-mediated transfer of animals between training and testing stages. Odorant, airflow and electrical shock delivery are automatically administered and monitored during training trials. Control software allows operator-input variables to define parameters of Drosophila learning, short-term memory and long-term memory assays. The approach allows accurate learning/memory determinations with operational fail-safes. Automated learning indices (immediately post-training) and memory indices (after 24h) are comparable to traditional manual experiments, while minimizing experimenter involvement. The automated system provides vast improvements over labor-intensive manual approaches with no experimenter involvement required during either training or testing phases. It provides quality control tracking of airflow rates, odorant delivery and electrical shock treatments, and an expanded platform for high-throughput studies of combinational drug tests and genetic screens. The design uses inexpensive hardware and software for a total cost of ∼$500US, making it affordable to a wide range of investigators. This study demonstrates the design, construction and testing of a fully automated Drosophila olfactory classical association apparatus to provide low-labor, high-fidelity, quality-monitored, high-throughput and inexpensive learning and memory behavioral assays. Copyright © 2015 Elsevier B.V. All rights reserved.

  3. SU-E-CAMPUS-T-01: Automation of the Winston-Lutz Test for Stereotactic Radiosurgery

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Litzenberg, D; Irrer, J; Kessler, M

    Purpose: To optimize clinical efficiency and shorten patient wait time by minimizing the time and effort required to perform the Winston-Lutz test before stereotactic radiosurgery (SRS) through automation of the delivery, analysis, and documentation of results. Methods: The radiation fields of the Winston-Lutz (WL) test were created in a “machine-QA patient” saved in ARIA for use before SRS cases. Images of the BRW target ball placed at mechanical isocenter are captured with the portal imager for each of four 2 cm × 2 cm MLC-shaped beams. When the WL plan is delivered and closed, this event is detected by in-house software called EventNet, which automates subsequent processes with the aid of the ARIA web services. Images are automatically retrieved from the ARIA database and analyzed to determine the offset of the target ball from radiation isocenter. The results are posted to a website and a composite summary image of the results is pushed back into ImageBrowser for review and authenticated documentation. Results: The total time to perform the test was reduced from 20-25 minutes to less than 4 minutes. The results were found to be more accurate and consistent than the previous method, which used radiochromic film. The images were also analyzed with DoseLab for comparison. The differences between the film and automated WL results in the X and Y directions and the radius were (−0.17 +/− 0.28) mm, (0.21 +/− 0.20) mm and (−0.14 +/− 0.27) mm, respectively. The differences between the DoseLab and automated WL results were (−0.05 +/− 0.06) mm, (−0.01 +/− 0.02) mm and (0.01 +/− 0.07) mm, respectively. Conclusions: This process reduced patient wait times by 15–20 minutes, making the treatment machine available to treat another patient. Accuracy and consistency of results were improved over the previous method and were comparable to other commercial solutions. Access to the ARIA web services is made possible through an Eclipse co-development agreement with Varian Medical Systems.
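
    The per-image analysis step reduces to comparing two centroids: the radiation field centre and the target-ball centre. A minimal sketch, with a hypothetical imager pixel size; combining the four gantry/collimator views into a 3D displacement is only noted in a comment:

    ```python
    import numpy as np

    def centroid(mask):
        """Centroid (row, col) of a boolean mask in pixel coordinates."""
        ys, xs = np.nonzero(mask)
        return ys.mean(), xs.mean()

    def wl_offset(field_mask, ball_mask, pixel_mm=0.392):
        """Winston-Lutz offset: displacement of the ball centre from the
        radiation field centre, in millimetres at the imager plane.
        The pixel size is a hypothetical value for illustration."""
        fy, fx = centroid(field_mask)
        by, bx = centroid(ball_mask)
        return (by - fy) * pixel_mm, (bx - fx) * pixel_mm

    # Combining the per-image 2D offsets from the four views yields the
    # 3D ball-to-isocentre displacement and the radius reported above.
    ```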

  4. Recent development in software and automation tools for high-throughput discovery bioanalysis.

    PubMed

    Shou, Wilson Z; Zhang, Jun

    2012-05-01

    Bioanalysis with LC-MS/MS has been established as the method of choice for quantitative determination of drug candidates in biological matrices in drug discovery and development. The LC-MS/MS bioanalytical support for drug discovery, especially for early discovery, often requires high-throughput (HT) analysis of large numbers of samples (hundreds to thousands per day) generated from many structurally diverse compounds (tens to hundreds per day) with a very quick turnaround time, in order to provide important activity and liability data to move discovery projects forward. Another important consideration for discovery bioanalysis is its fit-for-purpose quality requirement depending on the particular experiments being conducted at this stage, and it is usually not as stringent as those required in bioanalysis supporting drug development. These aforementioned attributes of HT discovery bioanalysis made it an ideal candidate for using software and automation tools to eliminate manual steps, remove bottlenecks, improve efficiency and reduce turnaround time while maintaining adequate quality. In this article we will review various recent developments that facilitate automation of individual bioanalytical procedures, such as sample preparation, MS/MS method development, sample analysis and data review, as well as fully integrated software tools that manage the entire bioanalytical workflow in HT discovery bioanalysis. In addition, software tools supporting the emerging high-resolution accurate MS bioanalytical approach are also discussed.

  5. Ariadne's Thread: A Robust Software Solution Leading to Automated Absolute and Relative Quantification of SRM Data.

    PubMed

    Nasso, Sara; Goetze, Sandra; Martens, Lennart

    2015-09-04

    Selected reaction monitoring (SRM) MS is a highly selective and sensitive technique to quantify protein abundances in complex biological samples. To enhance the pace of large SRM studies, a validated, robust method to fully automate absolute quantification and substitute for interactive evaluation would be valuable. To address this demand, we present Ariadne, a Matlab software package. To quantify monitored targets, Ariadne exploits metadata imported from the transition lists, and targets can be filtered according to mProphet output. Signal processing and statistical learning approaches are combined to compute peptide quantifications. To robustly estimate absolute abundances, the external calibration curve method is applied, ensuring linearity over the measured dynamic range. Ariadne was benchmarked against mProphet and Skyline by comparing its quantification performance on three different dilution series, featuring either noisy/smooth traces without background or smooth traces with complex background. Results, evaluated as efficiency, linearity, accuracy, and precision of quantification, showed that Ariadne's performance is independent of data smoothness and the presence of complex background, that Ariadne outperforms mProphet on the noisier data set, and that it improved Skyline's accuracy and precision 2-fold for the lowest-abundance dilution with complex background. Remarkably, Ariadne could statistically distinguish all the different abundances from one another, discriminating dilutions as low as 0.1 and 0.2 fmol. These results suggest that Ariadne offers reliable and automated analysis of large-scale SRM differential expression studies.
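
    External calibration means fitting a line to known spiked amounts versus measured response and inverting it to quantify unknowns. A sketch with invented calibration points:

    ```python
    import numpy as np

    # Hypothetical calibration points: spiked amounts (fmol) vs. measured
    # SRM peak areas. A straight line is fit over the dynamic range.
    amount = np.array([0.1, 0.2, 0.5, 1.0, 2.0, 5.0])
    area = np.array([210.0, 395.0, 1020.0, 1980.0, 4100.0, 10050.0])

    slope, intercept = np.polyfit(amount, area, 1)

    def quantify(peak_area):
        """Invert the calibration line to get absolute abundance."""
        return (peak_area - intercept) / slope

    print(f"{quantify(1500.0):.2f} fmol")
    ```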

  6. Crew/Automation Interaction in Space Transportation Systems: Lessons Learned from the Glass Cockpit

    NASA Technical Reports Server (NTRS)

    Rudisill, Marianne

    2000-01-01

    The progressive integration of automation technologies in commercial transport aircraft flight decks - the 'glass cockpit' - has had a major, and generally positive, impact on flight crew operations. Flight deck automation has provided significant benefits, such as economic efficiency, increased precision and safety, and enhanced functionality within the crew interface. These enhancements, however, may have been accrued at a price, such as complexity added to crew/automation interaction that has been implicated in a number of aircraft incidents and accidents. This report briefly describes 'glass cockpit' evolution. Some relevant aircraft accidents and incidents are described, followed by a more detailed description of human/automation issues and problems (e.g., crew error, monitoring, modes, command authority, crew coordination, workload, and training). This paper concludes with example principles and guidelines for considering 'glass cockpit' human/automation integration within space transportation systems.

  7. The design of the automated control system for warehouse equipment under radio-electronic manufacturing

    NASA Astrophysics Data System (ADS)

    Kapulin, D. V.; Chemidov, I. V.; Kazantsev, M. A.

    2017-01-01

    In this paper, aspects of the design, development and implementation of the automated control system for warehousing in the manufacturing process of the radio-electronic enterprise JSC «Radiosvyaz» are discussed. The architecture of the automated control system for warehousing proposed in the paper consists of a server connected to two physically separated information networks: the network with a database server, which stores information about the orders for picking, and the network with the automated storage and retrieval system. This principle allows the requirements for differentiation of access to be implemented, ensuring information safety and security. The efficiency of the developed automated solutions in terms of optimizing the warehouse's logistic characteristics is also researched.

  8. Technology demonstration of space intravehicular automation and robotics

    NASA Technical Reports Server (NTRS)

    Morris, A. Terry; Barker, L. Keith

    1994-01-01

    Automation and robotic technologies are being developed and capabilities demonstrated which would increase the productivity of microgravity science and materials processing in the space station laboratory module, especially when the crew is not present. The Automation Technology Branch at NASA Langley has been working in the area of intravehicular automation and robotics (IVAR) to provide a user-friendly development facility, to determine customer requirements for automated laboratory systems, and to improve the quality and efficiency of commercial production and scientific experimentation in space. This paper will describe the IVAR facility and present the results of a demonstration using a simulated protein crystal growth experiment inside a full-scale mockup of the space station laboratory module using a unique seven-degree-of-freedom robot.

  9. A Toolset for Supporting Iterative Human-Automation Interaction in Design

    NASA Technical Reports Server (NTRS)

    Feary, Michael S.

    2010-01-01

    The addition of automation has greatly extended humans' capability to accomplish tasks, including those that are difficult, complex and safety critical. The majority of Human-Automation Interaction (HAI) results in more efficient and safe operations; however, certain unexpected automation behaviors or "automation surprises" can be frustrating and, in certain safety-critical operations (e.g. transportation, manufacturing control, medicine), may result in injuries or the loss of life (Mellor, 1994; Leveson, 1995; FAA, 1995; BASI, 1998; Sheridan, 2002). This paper describes the development of a design tool that enables the rapid development and evaluation of automation prototypes. The ultimate goal of the work is to provide a design platform upon which automation surprise vulnerability analyses can be integrated.

  10. Energy monitoring based on human activity in the workplace

    NASA Astrophysics Data System (ADS)

    Mustafa, N. H.; Husain, M. N.; Abd Aziz, M. Z. A.; Othman, M. A.; Malek, F.

    2014-04-01

    Human behavior is the most important factor in managing energy usage. Nowadays, smart house technology offers a better quality of life by introducing automated appliance control and assistive services. However, human behavior contributes to the efficiency of the system. This paper focuses on monitoring efficiency based on the duration of use during office hours, around 8 am until 5 pm, which depends on human behavior at the workplace. The correlation coefficient method is then used to show the relation between energy consumption and energy saving based on the total hours of energy use. In the future, usage of energy monitoring systems is expected to increase so that energy can be managed efficiently based on human behavior. This scenario will have a positive impact on achieving energy savings in buildings and supporting a green environment.
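
    The correlation coefficient method referred to here is the ordinary Pearson correlation between usage duration and energy consumption. A sketch with invented daily records:

    ```python
    import numpy as np

    # Hypothetical daily records: hours of occupancy-driven appliance use
    # versus metered energy consumption (kWh) in the 8 am - 5 pm window.
    hours = np.array([6.5, 7.0, 8.0, 5.5, 9.0, 6.0, 7.5])
    energy = np.array([10.2, 11.0, 12.9, 8.8, 14.1, 9.5, 11.8])

    r = np.corrcoef(hours, energy)[0, 1]
    print(f"correlation coefficient r = {r:.3f}")
    ```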

  11. A comparison study of size-specific dose estimate calculation methods.

    PubMed

    Parikh, Roshni A; Wien, Michael A; Novak, Ronald D; Jordan, David W; Klahr, Paul; Soriano, Stephanie; Ciancibello, Leslie; Berlin, Sheila C

    2018-01-01

    The size-specific dose estimate (SSDE) has emerged as an improved metric for use by medical physicists and radiologists for estimating individual patient dose. Several methods of calculating SSDE have been described, ranging from patient thickness or attenuation-based (automated and manual) measurements to weight-based techniques. To compare the accuracy of thickness vs. weight measurement of body size to allow for the calculation of the size-specific dose estimate (SSDE) in pediatric body CT. We retrospectively identified 109 pediatric body CT examinations for SSDE calculation. We examined two automated methods measuring a series of level-specific diameters of the patient's body: method A used the effective diameter and method B used the water-equivalent diameter. Two manual methods measured patient diameter at two predetermined levels: the superior endplate of L2, where body width is typically most thin, and the superior femoral head or iliac crest (for scans that did not include the pelvis), where body width is typically most thick; method C averaged lateral measurements at these two levels from the CT projection scan, and method D averaged lateral and anteroposterior measurements at the same two levels from the axial CT images. Finally, we used body weight to characterize patient size, method E, and compared this with the various other measurement methods. Methods were compared across the entire population as well as by subgroup based on body width. Concordance correlation (ρc) between each of the SSDE calculation methods (methods A-E) was greater than 0.92 across the entire population, although the range was wider when analyzed by subgroup (0.42-0.99). When we compared each SSDE measurement method with CTDIvol, there was poor correlation, ρc < 0.77, with percentage differences between 20.8% and 51.0%. Automated computer algorithms are accurate and efficient in the calculation of SSDE. Manual methods based on patient thickness provide acceptable dose estimates for pediatric patients <30 cm in body width. Body weight provides a quick and practical method to identify conversion factors that can be used to estimate SSDE with reasonable accuracy in pediatric patients with body width ≥20 cm.
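
    All of the compared methods feed a body-size measure into SSDE = f(d) × CTDIvol, where f(d) is a size-dependent conversion factor. The sketch below uses the geometric-mean effective diameter and an exponential f(d) in the form of AAPM Report 204 for the 32-cm phantom; treat the coefficients as illustrative assumptions rather than a clinical reference:

    ```python
    import numpy as np

    def effective_diameter(ap_cm, lat_cm):
        """Geometric-mean body diameter from AP and lateral widths."""
        return np.sqrt(ap_cm * lat_cm)

    def ssde(ctdi_vol, diameter_cm, a=3.704369, b=0.03671937):
        """SSDE = f(d) * CTDIvol with an exponential conversion factor
        in the AAPM Report 204 form for the 32-cm phantom; the
        coefficients here are illustrative, not a clinical reference."""
        f = a * np.exp(-b * diameter_cm)
        return f * ctdi_vol

    print(ssde(ctdi_vol=5.0, diameter_cm=effective_diameter(18.0, 22.0)))
    ```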

  12. Evaluation of immunoturbidimetric rheumatoid factor method from Diagam on Abbott c8000 analyzer: comparison with immunonephelometric method.

    PubMed

    Dupuy, Anne Marie; Hurstel, Rémy; Bargnoux, Anne Sophie; Badiou, Stéphanie; Cristol, Jean Paul

    2014-01-01

    Rheumatoid factor (RF) consists of autoantibodies, and because of its heterogeneity its determination is not easy. Currently, nephelometry and ELISA are considered reference methods. Due to consolidation, many laboratories have fully automated turbidimetric apparatus, and specific nephelometric systems are not always available. In addition, nephelometry is more accurate, but it is time-consuming, expensive, and requires a specific device, resulting in lower efficiency. Turbidimetry could be an attractive alternative. The turbidimetric RF test from Diagam meets the requirements of accuracy and precision for optimal clinical use, with an acceptable measuring range, and could be an alternative for the determination of RF without the associated cost of a dedicated instrument, making consolidation and blood saving possible.

  13. An automated methodology development. [software design for combat simulation]

    NASA Technical Reports Server (NTRS)

    Hawley, L. R.

    1985-01-01

    The design methodology employed in testing the applicability of Ada in large-scale combat simulations is described. Ada was considered as a substitute for FORTRAN to lower life-cycle costs and ease program development efforts. An object-oriented approach was taken, which featured definitions of military targets, the capability of manipulating their condition in real time, and one-to-one correlation between the object states and real-world states. The simulation design process was automated by the problem statement language (PSL)/problem statement analyzer (PSA). The PSL/PSA system accessed the problem database directly to enhance code efficiency by, e.g., eliminating unused subroutines, and provided for automated report generation, besides allowing for functional and interface descriptions. The ways in which the methodology satisfied the responsiveness, reliability, transportability, modifiability, timeliness and efficiency goals are discussed.

  14. Non-Uniform Sampling and J-UNIO Automation for Efficient Protein NMR Structure Determination.

    PubMed

    Didenko, Tatiana; Proudfoot, Andrew; Dutta, Samit Kumar; Serrano, Pedro; Wüthrich, Kurt

    2015-08-24

    High-resolution structure determination of small proteins in solution is one of the big assets of NMR spectroscopy in structural biology. Improvements in the efficiency of NMR structure determination by advances in NMR experiments and automation of data handling therefore attract continued interest. Here, non-uniform sampling (NUS) of 3D heteronuclear-resolved [(1)H,(1)H]-NOESY data yielded two- to three-fold savings of instrument time for structure determinations of soluble proteins. With the 152-residue protein NP_372339.1 from Staphylococcus aureus and the 71-residue protein NP_346341.1 from Streptococcus pneumoniae we show that high-quality structures can be obtained with NUS NMR data, which are equally well amenable to robust automated analysis as the corresponding uniformly sampled data. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  15. Automation bias: decision making and performance in high-tech cockpits.

    PubMed

    Mosier, K L; Skitka, L J; Heers, S; Burdick, M

    1997-01-01

    Automated aids and decision support tools are rapidly becoming indispensable tools in high-technology cockpits and are assuming increasing control of "cognitive" flight tasks, such as calculating fuel-efficient routes, navigating, or detecting and diagnosing system malfunctions and abnormalities. This study was designed to investigate automation bias, a recently documented factor in the use of automated aids and decision support systems. The term refers to omission and commission errors resulting from the use of automated cues as a heuristic replacement for vigilant information seeking and processing. Glass-cockpit pilots flew flight scenarios involving automation events or opportunities for automation-related omission and commission errors. Although experimentally manipulated accountability demands did not significantly impact performance, post hoc analyses revealed that those pilots who reported an internalized perception of "accountability" for their performance and strategies of interaction with the automation were significantly more likely to double-check automated functioning against other cues and less likely to commit errors than those who did not share this perception. Pilots were also likely to erroneously "remember" the presence of expected cues when describing their decision-making processes.

  16. A vision-based automated guided vehicle system with marker recognition for indoor use.

    PubMed

    Lee, Jeisung; Hyun, Chang-Ho; Park, Mignon

    2013-08-07

    We propose an intelligent vision-based Automated Guided Vehicle (AGV) system using fiduciary markers. In this paper, we explore a low-cost, efficient vehicle guiding method using a consumer-grade web camera and fiduciary markers. In the proposed method, the system uses fiduciary markers containing a capital letter or a triangle indicating direction. The markers are very easy to produce, manipulate, and maintain, and the marker information is used to guide the vehicle. We use hue and saturation values in the image to extract marker candidates. When a fiduciary marker of known size is detected by using a bird's-eye view and the Hough transform, the positional relation between the marker and the vehicle can be calculated. To recognize the character in the marker, a distance transform is used: the probability of feature matching is calculated, and the feature with the highest probability is selected as the captured marker. Four directional signals and 10 alphabet features are defined and used as markers. A 98.87% recognition rate was achieved in the testing phase. The experimental results with the fiduciary markers show that the proposed method is a viable solution for an indoor AGV system.
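
    The first stage of the pipeline, extracting marker candidates by hue/saturation thresholding, is easy to sketch with OpenCV; the HSV band and area cutoff below are placeholders that would need tuning to the actual marker colour:

    ```python
    import cv2
    import numpy as np

    def marker_candidates(bgr_frame):
        """Extract fiduciary-marker candidate regions by thresholding hue
        and saturation. The threshold band is a placeholder, not the
        paper's calibrated values."""
        hsv = cv2.cvtColor(bgr_frame, cv2.COLOR_BGR2HSV)
        lower = np.array([100, 80, 50])    # hypothetical blue-ish marker
        upper = np.array([130, 255, 255])
        mask = cv2.inRange(hsv, lower, upper)
        # Contours of the mask become candidate marker regions; each would
        # then be rectified to a bird's-eye view and matched via the
        # distance transform, as described in the abstract.
        contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                       cv2.CHAIN_APPROX_SIMPLE)
        return [c for c in contours if cv2.contourArea(c) > 100]
    ```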

  17. Trypanosoma cruzi infectivity assessment in "in vitro" culture systems by automated cell counting.

    PubMed

    Liempi, Ana; Castillo, Christian; Cerda, Mauricio; Droguett, Daniel; Duaso, Juan; Barahona, Katherine; Hernández, Ariane; Díaz-Luján, Cintia; Fretes, Ricardo; Härtel, Steffen; Kemmerling, Ulrike

    2015-03-01

    Chagas disease is an endemic, neglected tropical disease in Latin America that is caused by the protozoan parasite Trypanosoma cruzi. In vitro models constitute the first experimental approach to studying the physiopathology of the disease and to assaying potential new trypanocidal agents. Here, we describe the use of commercial software (MATLAB®) to quantify T. cruzi amastigotes and infected mammalian cells (BeWo) and compare this analysis with manual counting. There was no statistically significant difference between the manual and the automatic quantification of the parasite; the two methods showed a correlation analysis r² value of 0.9159. The most significant advantage of the automatic quantification was the efficiency of the analysis. The drawback of this automated cell counting method was that some parasites were assigned to the wrong BeWo cell; however, such errors did not exceed 5% when adequate experimental conditions were chosen. We conclude that this quantification method constitutes an excellent tool for evaluating the parasite load in cells and therefore an easy and reliable way to study parasite infectivity. Copyright © 2014 Elsevier B.V. All rights reserved.

  18. Real-time OHT Dispatching Mechanism for the Interbay Automated Material Handling System with Shortcuts and Bypasses

    NASA Astrophysics Data System (ADS)

    Pan, Cong; Zhang, Jie; Qin, Wei

    2017-05-01

    As a key to improving the performance of the interbay automated material handling system (AMHS) in a 300 mm semiconductor wafer fabrication system, the real-time overhead hoist transport (OHT) dispatching problem has received much attention. This problem is first formulated as a special form of the assignment problem, and it is proved that more than one solution can be obtained simultaneously by the Hungarian algorithm. By proposing and strictly proving two propositions related to the characteristics of these solutions, a modified Hungarian algorithm is designed to distinguish among them. Finally, a new real-time OHT dispatching method is designed by implementing the solution obtained by the modified Hungarian algorithm. The experimental results of discrete event simulations show that, compared with the conventional Hungarian algorithm dispatching method, the proposed method that chooses the solution with the maximum variance reduces the average waiting time and average lead time of wafer lots by 4 s on average, and its performance is stable in multiple different scenarios of the interbay AMHS with different quantities of shortcuts. This research provides an efficient real-time OHT dispatching mechanism for the interbay AMHS with shortcuts and bypasses.
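
    The underlying assignment step, matching idle vehicles to waiting transport jobs at minimum total cost, is exactly what the Hungarian algorithm solves. A sketch with an invented cost matrix; the paper's modification for tie-breaking among multiple optima is only noted in a comment:

    ```python
    import numpy as np
    from scipy.optimize import linear_sum_assignment

    # Hypothetical cost matrix: cost[i, j] = estimated travel time for
    # idle vehicle i to reach waiting transport job j.
    cost = np.array([[12.0,  7.0, 30.0],
                     [ 9.0, 15.0,  8.0],
                     [20.0,  6.0, 11.0]])

    rows, cols = linear_sum_assignment(cost)       # Hungarian algorithm
    print(list(zip(rows, cols)), cost[rows, cols].sum())
    # When several assignments share the minimum total cost, the paper's
    # modified algorithm enumerates them and picks one by a variance rule;
    # linear_sum_assignment returns only a single optimum.
    ```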

  19. An automated workflow for enhancing microbial bioprocess optimization on a novel microbioreactor platform

    PubMed Central

    2012-01-01

    Background High-throughput methods are widely used for strain screening, effectively resulting in binary information regarding high or low productivity. Nevertheless, achieving quantitative and scalable parameters for fast bioprocess development is much more challenging, especially for heterologous protein production. Here, the nature of the foreign protein makes it impossible to predict, e.g., the best expression construct, secretion signal peptide, inductor concentration, induction time, temperature and substrate feed rate in fed-batch operation, to name only a few. Therefore, a high number of systematic experiments is necessary to elucidate the best conditions for heterologous expression of each new protein of interest. Results To increase the throughput in bioprocess development, we used a microtiter-plate-based cultivation system (Biolector) which was fully integrated into a liquid-handling platform enclosed in laminar airflow housing. This automated cultivation platform was used for optimization of the secretory production of a cutinase from Fusarium solani pisi with Corynebacterium glutamicum. The online monitoring of biomass, dissolved oxygen and pH in each of the microtiter plate wells makes it possible to trigger sampling or dosing events with the pipetting robot, enabling a reliable selection of the best-performing cutinase producers. In addition, further automated methods such as media optimization and induction profiling were developed and validated. All biological and bioprocess parameters were optimized exclusively at microtiter plate scale and showed perfectly scalable results to 1 L and 20 L stirred-tank bioreactor scale. Conclusions The optimization of heterologous protein expression in microbial systems currently requires extensive testing of biological and bioprocess engineering parameters. This can be efficiently boosted by using a microtiter plate cultivation setup embedded into a liquid-handling system, providing more throughput by parallelization and automation. Due to improved statistics from replicate cultivations, automated downstream analysis, and scalable process information, this setup has superior performance compared to standard microtiter plate cultivation. PMID:23113930

  20. 1366 Project Automate: Enabling Automation for <$0.10/W High-Efficiency Kerfless Wafers Manufactured in the US

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lorenz, Adam

    For photovoltaic (PV) manufacturing to thrive in the U.S., there must be an innovative core to the technology. Project Automate builds on 1366’s proprietary Direct Wafer® kerfless wafer technology and aims to unlock the cost and efficiency advantages of thin kerfless wafers. Direct Wafer is an innovative, U.S.-friendly (efficient, low-labor content) manufacturing process that addresses the main cost barrier limiting silicon PV cost reductions – the 35-year-old grand challenge of manufacturing quality wafers (40% of the cost of modules) without the cost and waste of sawing. This simple, scalable process will allow 1366 to manufacture “drop-in” replacement wafers for the $10 billion silicon PV wafer market at 50% of the cost, 60% of the capital, and 30% of the electricity of conventional casting and sawing manufacturing processes. This SolarMat project developed the Direct Wafer process’s unique capability to tailor the shape of wafers to simultaneously make thinner AND stronger wafers (with lower silicon usage) that enable high-efficiency cell architectures. By producing wafers with a unique target geometry, including a thick border (which determines handling characteristics) and thin interior regions (which control light capture and electron transport and therefore determine efficiency), 1366 can simultaneously improve quality and lower cost (using less silicon).

  1. Selecting automation for the clinical chemistry laboratory.

    PubMed

    Melanson, Stacy E F; Lindeman, Neal I; Jarolim, Petr

    2007-07-01

    Laboratory automation proposes to improve the quality and efficiency of laboratory operations, and may provide a solution to the quality demands and staff shortages faced by today's clinical laboratories. Several vendors offer automation systems in the United States, with both subtle and obvious differences. Arriving at a decision to automate, and the ensuing evaluation of available products, can be time-consuming and challenging. Although considerable discussion concerning the decision to automate has been published, relatively little attention has been paid to the process of evaluating and selecting automation systems. To outline a process for evaluating and selecting automation systems as a reference for laboratories contemplating laboratory automation. Our Clinical Chemistry Laboratory staff recently evaluated all major laboratory automation systems in the United States, with their respective chemistry and immunochemistry analyzers. Our experience is described and organized according to the selection process, the important considerations in clinical chemistry automation, decisions and implementation, and we give conclusions pertaining to this experience. Including the formation of a committee, workflow analysis, submitting a request for proposal, site visits, and making a final decision, the process of selecting chemistry automation took approximately 14 months. We outline important considerations in automation design, preanalytical processing, analyzer selection, postanalytical storage, and data management. Selecting clinical chemistry laboratory automation is a complex, time-consuming process. Laboratories considering laboratory automation may benefit from the concise overview and narrative and tabular suggestions provided.

  2. Novel strategy to implement active-space coupled-cluster methods

    NASA Astrophysics Data System (ADS)

    Rolik, Zoltán; Kállay, Mihály

    2018-03-01

    A new approach is presented for the efficient implementation of coupled-cluster (CC) methods including higher excitations based on a molecular orbital space partitioned into active and inactive orbitals. In the new framework, the string representation of amplitudes and intermediates is used as long as it is beneficial, but the contractions are evaluated as matrix products. Using a new diagrammatic technique, the CC equations are represented in a compact form due to the string notations we introduced. As an application of these ideas, a new automated implementation of the single-reference-based multi-reference CC equations is presented for arbitrary excitation levels. The new program can be considered as an improvement over the previous implementations in many respects; e.g., diagram contributions are evaluated by efficient vectorized subroutines. Timings for test calculations for various complete active-space problems are presented. As an application of the new code, the weak interactions in the Be dimer were studied.
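
    The core implementation idea, evaluating tensor contractions as matrix products after a suitable regrouping of indices, can be demonstrated on a toy contraction; the dimensions and tensors below are arbitrary stand-ins, not actual CC amplitudes:

    ```python
    import numpy as np

    # A two-electron-style contraction: R[a,i] = sum_{b,j} V[a,b,i,j] * T[b,j].
    # Evaluating it as one matrix product after reshaping is the
    # "contractions as matrix products" idea.
    na, nb, ni, nj = 4, 5, 6, 7
    V = np.random.rand(na, nb, ni, nj)
    T = np.random.rand(nb, nj)

    # Reference: index-wise contraction.
    R_ref = np.einsum('abij,bj->ai', V, T)

    # Same contraction as a matrix-vector product: group (a,i) as rows
    # and (b,j) as columns, then reshape the result back.
    M = V.transpose(0, 2, 1, 3).reshape(na * ni, nb * nj)
    R = (M @ T.reshape(nb * nj)).reshape(na, ni)

    assert np.allclose(R, R_ref)
    ```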

  3. MetaBAT, an efficient tool for accurately reconstructing single genomes from complex microbial communities

    DOE PAGES

    Kang, Dongwan D.; Froula, Jeff; Egan, Rob; ...

    2015-01-01

    Grouping large genomic fragments assembled from shotgun metagenomic sequences to deconvolute complex microbial communities, or metagenome binning, enables the study of individual organisms and their interactions. Because of the complex nature of these communities, existing metagenome binning methods often miss a large number of microbial species. In addition, most of the tools are not scalable to large datasets. Here we introduce automated software called MetaBAT that integrates empirical probabilistic distances of genome abundance and tetranucleotide frequency for accurate metagenome binning. MetaBAT outperforms alternative methods in accuracy and computational efficiency on both synthetic and real metagenome datasets. Lastly, it automatically forms hundreds of high-quality genome bins from a very large assembly consisting of millions of contigs in a matter of hours on a single node. MetaBAT is open source software and available at https://bitbucket.org/berkeleylab/metabat.
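
    One of MetaBAT's two signals, tetranucleotide frequency (TNF), is simply the normalized count of the 256 possible 4-mers in a contig. A minimal sketch (reverse-complement merging and the abundance distance are omitted):

    ```python
    from itertools import product
    import numpy as np

    KMERS = [''.join(p) for p in product('ACGT', repeat=4)]  # 256 tetramers
    INDEX = {k: i for i, k in enumerate(KMERS)}

    def tetranucleotide_freq(contig):
        """Normalized tetranucleotide frequency vector of one contig.
        (MetaBAT also merges reverse complements and combines TNF with
        abundance distances; this sketch shows only the raw TNF signal.)"""
        counts = np.zeros(len(KMERS))
        seq = contig.upper()
        for i in range(len(seq) - 3):
            j = INDEX.get(seq[i:i + 4])
            if j is not None:          # skip windows containing N, etc.
                counts[j] += 1
        return counts / max(counts.sum(), 1)

    print(tetranucleotide_freq("ACGTACGTGGCCATAT")[:5])
    ```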

  4. Automated Algorithms for Quantum-Level Accuracy in Atomistic Simulations: LDRD Final Report.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Thompson, Aidan Patrick; Schultz, Peter Andrew; Crozier, Paul

    2014-09-01

    This report summarizes the results of LDRD project 12-0395, titled "Automated Algorithms for Quantum-level Accuracy in Atomistic Simulations." During the course of this LDRD, we have developed an interatomic potential for solids and liquids called the Spectral Neighbor Analysis Potential (SNAP). The SNAP potential has a very general form and uses machine-learning techniques to reproduce the energies, forces, and stress tensors of a large set of small configurations of atoms, which are obtained using high-accuracy quantum electronic structure (QM) calculations. The local environment of each atom is characterized by a set of bispectrum components of the local neighbor density projected onto a basis of hyperspherical harmonics in four dimensions. The SNAP coefficients are determined using weighted least-squares linear regression against the full QM training set. This allows the SNAP potential to be fit in a robust, automated manner to large QM data sets using many bispectrum components. The calculation of the bispectrum components and the SNAP potential are implemented in the LAMMPS parallel molecular dynamics code. Global optimization methods in the DAKOTA software package are used to seek out good choices of hyperparameters that define the overall structure of the SNAP potential. FitSnap.py, a Python-based software package interfacing to both LAMMPS and DAKOTA, is used to formulate the linear regression problem, solve it, and analyze the accuracy of the resultant SNAP potential. We describe a SNAP potential for tantalum that accurately reproduces a variety of solid and liquid properties. Most significantly, in contrast to existing tantalum potentials, SNAP correctly predicts the Peierls barrier for screw dislocation motion. We also present results from SNAP potentials generated for indium phosphide (InP) and silica (SiO2). We describe efficient algorithms for calculating SNAP forces and energies in molecular dynamics simulations using massively parallel computers and advanced processor architectures. Finally, we briefly describe the MSM method for efficient calculation of electrostatic interactions on massively parallel computers.
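
    The fitting step described here is ordinary weighted linear least squares: bispectrum components form the design matrix and QM energies the target. A sketch with random stand-in data (real fits also include forces and stresses in the design matrix):

    ```python
    import numpy as np

    # Hypothetical training set: rows of B are per-configuration sums of
    # bispectrum components; y holds the matching QM energies; w are
    # per-configuration weights. All values here are random stand-ins.
    rng = np.random.default_rng(2)
    n_configs, n_bispectrum = 200, 30
    B = rng.normal(size=(n_configs, n_bispectrum))
    y = rng.normal(size=n_configs)
    w = rng.uniform(0.5, 2.0, size=n_configs)

    # Weighted least squares via row scaling by sqrt(w).
    sw = np.sqrt(w)
    coeffs, *_ = np.linalg.lstsq(B * sw[:, None], y * sw, rcond=None)
    energy_model = B @ coeffs   # SNAP-style linear energy prediction
    ```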

  5. Automated processing of label-free Raman microscope images of macrophage cells with standardized regression for high-throughput analysis.

    PubMed

    Milewski, Robert J; Kumagai, Yutaro; Fujita, Katsumasa; Standley, Daron M; Smith, Nicholas I

    2010-11-19

    Macrophages represent the front lines of our immune system; they recognize and engulf pathogens or foreign particles thus initiating the immune response. Imaging macrophages presents unique challenges, as most optical techniques require labeling or staining of the cellular compartments in order to resolve organelles, and such stains or labels have the potential to perturb the cell, particularly in cases where incomplete information exists regarding the precise cellular reaction under observation. Label-free imaging techniques such as Raman microscopy are thus valuable tools for studying the transformations that occur in immune cells upon activation, both on the molecular and organelle levels. Due to extremely low signal levels, however, Raman microscopy requires sophisticated image processing techniques for noise reduction and signal extraction. To date, efficient, automated algorithms for resolving sub-cellular features in noisy, multi-dimensional image sets have not been explored extensively. We show that hybrid z-score normalization and standard regression (Z-LSR) can highlight the spectral differences within the cell and provide image contrast dependent on spectral content. In contrast to typical Raman image processing methods using multivariate analysis, such as singular value decomposition (SVD), our implementation of the Z-LSR method can operate nearly in real-time. In spite of its computational simplicity, Z-LSR can automatically remove background and bias in the signal, improve the resolution of spatially distributed spectral differences and enable sub-cellular features to be resolved in Raman microscopy images of mouse macrophage cells. Significantly, the Z-LSR processed images automatically exhibited subcellular architectures whereas SVD, in general, requires human assistance in selecting the components of interest. The computational efficiency of Z-LSR enables automated resolution of sub-cellular features in large Raman microscopy data sets without compromise in image quality or information loss in associated spectra. These results motivate further use of label-free microscopy techniques in real-time imaging of live immune cells.
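
    The abstract gives only the outline of Z-LSR (per-pixel z-score normalization followed by a standardized least-squares regression), so the sketch below is a plausible reading rather than the authors' exact algorithm; in particular, using the mean cell spectrum as the regressor and the residual norm as the contrast image are assumptions.

    ```python
    import numpy as np

    def z_lsr_contrast(cube):
        """cube: (ny, nx, n_wavenumbers) Raman hyperspectral image."""
        spectra = cube.reshape(-1, cube.shape[-1]).astype(float)
        # z-score each pixel spectrum to remove per-pixel background and bias
        mu = spectra.mean(axis=1, keepdims=True)
        sd = spectra.std(axis=1, keepdims=True) + 1e-12
        z = (spectra - mu) / sd
        ref = z.mean(axis=0)                      # mean spectrum as regressor
        slope = z @ ref / (ref @ ref)             # least-squares slope per pixel
        resid = z - np.outer(slope, ref)          # spectral content not explained by ref
        contrast = np.linalg.norm(resid, axis=1)  # highlight spectrally distinct pixels
        return contrast.reshape(cube.shape[:2])
    ```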

  6. Automated SDI Services. (Selective Dissemination of Information).

    ERIC Educational Resources Information Center

    Altmann, Berthold

    An automated SDI service based on tapes supplied by DDC, Science Abstracts, and Engineering Index is evaluated as a component element of the entire HDL information system. Current studies for improving the efficiency are briefly described, in particular the establishment of a parameter reference service that should shorten the lead-time for the…

  7. Simple and Efficient Technique for Spatial/Temporal Composite Imagery

    DTIC Science & Technology

    2007-08-01

    visible spectrum between 412 nm and 869 nm, three bands at 500 m and two bands at 250 m. The MODIS data was processed using the Automated Processing System (Version 3.6) developed by the Naval Research Laboratory (NRL). The Automated Processing System (APS) is a collection of software programs assembled

  8. Advanced, Analytic, Automated (AAA) Measurement of Engagement during Learning

    ERIC Educational Resources Information Center

    D'Mello, Sidney; Dieterle, Ed; Duckworth, Angela

    2017-01-01

    It is generally acknowledged that engagement plays a critical role in learning. Unfortunately, the study of engagement has been stymied by a lack of valid and efficient measures. We introduce the advanced, analytic, and automated (AAA) approach to measure engagement at fine-grained temporal resolutions. The AAA measurement approach is grounded in…

  9. Automation of Random Conical Tilt and Orthogonal Tilt Data Collection using Feature Based Correlation

    PubMed Central

    Yoshioka, Craig; Pulokas, James; Fellmann, Denis; Potter, Clinton S.; Milligan, Ronald A.; Carragher, Bridget

    2007-01-01

    Visualization by electron microscopy has provided many insights into the composition, quaternary structure, and mechanism of macromolecular assemblies. By preserving samples in stain or vitreous ice it is possible to image them as discrete particles, and from these images generate three-dimensional structures. This ‘single-particle’ approach suffers from two major shortcomings: it requires an initial model to reconstitute 2D data into a 3D volume, and it often fails when faced with conformational variability. Random conical tilt (RCT) and orthogonal tilt (OTR) are methods developed to overcome these problems, but the data collection required, particularly for vitreous ice specimens, is difficult and tedious. In this paper we present an automated approach to RCT/OTR data collection that removes the burden of manual collection and offers higher quality and throughput than is otherwise possible. We show example datasets collected under stain and cryo conditions and provide statistics related to the efficiency and robustness of the process. Furthermore, we describe the new algorithms that make this method possible, which include new calibrations, improved targeting and feature-based tracking. PMID:17524663

  10. A conceptual framework for automating the operational and strategic decision-making process in the health care delivery system.

    PubMed

    Ruohonen, Toni; Ennejmy, Mohammed

    2013-01-01

    Making reliable and justified operational and strategic decisions is a challenging task in the health care domain. So far, decisions have been made based on the experience of managers and staff, or evaluated with traditional methods using inadequate data. As a result of this kind of decision-making process, attempts to improve operations have usually failed or led only to local improvements. Health care organizations have a lot of operational data, in addition to clinical data, which is the key element for making reliable and justified decisions; however, it is increasingly difficult to access and make use of. In this paper we discuss how to exploit operational data most efficiently in the decision-making process. We share our vision and propose a conceptual framework for automating the decision-making process.

  11. SAND: an automated VLBI imaging and analysing pipeline - I. Stripping component trajectories

    NASA Astrophysics Data System (ADS)

    Zhang, M.; Collioud, A.; Charlot, P.

    2018-02-01

    We present our implementation of an automated very long baseline interferometry (VLBI) data-reduction pipeline that is dedicated to interferometric data imaging and analysis. The pipeline can handle massive VLBI data efficiently, which makes it an appropriate tool to investigate multi-epoch multiband VLBI data. Compared to traditional manual data reduction, our pipeline provides more objective results as less human intervention is involved. The source extraction is carried out in the image plane, while deconvolution and model fitting are performed in both the image plane and the uv plane for parallel comparison. The output from the pipeline includes catalogues of CLEANed images and reconstructed models, polarization maps, proper motion estimates, core light curves and multiband spectra. We have developed a regression STRIP algorithm to automatically detect linear or non-linear patterns in the jet component trajectories. This algorithm offers an objective method to match jet components at different epochs and to determine their proper motions.
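
    For the simplest (linear) case handled by the STRIP algorithm, a jet component's proper motion is the slope of a straight-line fit to its positions across epochs. A minimal sketch follows; the regression STRIP algorithm itself also handles non-linear trajectories and the cross-epoch component matching, both omitted here.

    ```python
    import numpy as np

    def linear_proper_motion(epochs_yr, ra_mas, dec_mas):
        """Straight-line fit to a jet component's position offsets over epochs.

        Returns the proper-motion amplitude in mas/yr.
        """
        mu_ra = np.polyfit(epochs_yr, ra_mas, 1)[0]    # slope of RA offset vs time
        mu_dec = np.polyfit(epochs_yr, dec_mas, 1)[0]  # slope of Dec offset vs time
        return float(np.hypot(mu_ra, mu_dec))
    ```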

  12. A Binary Segmentation Approach for Boxing Ribosome Particles in Cryo EM Micrographs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Adiga, Umesh P.S.; Malladi, Ravi; Baxter, William

    Three-dimensional reconstruction of ribosome particles from electron micrographs requires selection of many single-particle images. Roughly 100,000 particles are required to achieve approximately 10 angstrom resolution. Manual selection of particles, by visual observation of the micrographs on a computer screen, is recognized as a bottleneck in automated single particle reconstruction. This paper describes an efficient approach for automated boxing of ribosome particles in micrographs. Use of a fast, anisotropic non-linear reaction-diffusion method to pre-process micrographs and rank-leveling to enhance the contrast between particles and the background, followed by binary and morphological segmentation, constitutes the core of this technique. Modifying the shape of the particles to facilitate segmentation of individual particles within clusters and boxing the isolated particles is successfully attempted. Tests on a limited number of micrographs have shown that over 80 percent success is achieved in automatic particle picking.
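
    The pipeline is: denoise, level the background, threshold, clean up with morphology, then box connected components. A rough scikit-image sketch follows; Gaussian smoothing stands in for the anisotropic reaction-diffusion step, large-scale background subtraction for rank-leveling, and the thresholds and sizes are illustrative.

    ```python
    from skimage import filters, measure, morphology

    def box_ribosomes(micrograph, min_area=400):
        """Return bounding boxes of candidate particles in a micrograph."""
        smoothed = filters.gaussian(micrograph, sigma=2)     # denoising stand-in
        background = filters.gaussian(micrograph, sigma=50)  # large-scale illumination
        leveled = smoothed - background                      # contrast enhancement
        mask = leveled > filters.threshold_otsu(leveled)     # binary segmentation
        mask = morphology.binary_opening(mask)               # break bridges in clusters
        mask = morphology.remove_small_objects(mask, min_area)
        return [region.bbox for region in measure.regionprops(measure.label(mask))]
    ```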

  13. A Hybrid Genetic Programming Algorithm for Automated Design of Dispatching Rules.

    PubMed

    Nguyen, Su; Mei, Yi; Xue, Bing; Zhang, Mengjie

    2018-06-04

    Designing effective dispatching rules for production systems is a difficult and time-consuming task if it is done manually. In the last decade, the growth of computing power, advanced machine learning, and optimisation techniques has made the automated design of dispatching rules possible, and automatically discovered rules are competitive with or outperform existing rules developed by researchers. Genetic programming is one of the most popular approaches to discovering dispatching rules in the literature, especially for complex production systems. However, the large heuristic search space may restrict genetic programming from finding near-optimal dispatching rules. This paper develops a new hybrid genetic programming algorithm for dynamic job shop scheduling based on a new representation, a new local search heuristic, and efficient fitness evaluators. Experiments show that the new method is effective regarding the quality of evolved rules. Moreover, evolved rules are also significantly smaller and contain more relevant attributes.

  14. Automated image quality assessment for chest CT scans.

    PubMed

    Reeves, Anthony P; Xie, Yiting; Liu, Shuang

    2018-02-01

    Medical image quality needs to be maintained at standards sufficient for effective clinical reading. Automated computer analytic methods may be applied to medical images for quality assessment. For chest CT scans in a lung cancer screening context, an automated quality assessment method is presented that characterizes image noise and image intensity calibration. This is achieved by image measurements in three automatically segmented homogeneous regions of the scan: external air, trachea lumen air, and descending aorta blood. Profiles of CT scanner behavior are also computed. The method has been evaluated on both phantom and real low-dose chest CT scans and results show that repeatable noise and calibration measures may be realized by automated computer algorithms. Noise and calibration profiles show relevant differences between different scanners and protocols. Automated image quality assessment may be useful for quality control for lung cancer screening and may enable performance improvements to automated computer analysis methods. © 2017 American Association of Physicists in Medicine.
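
    Given the three segmented homogeneous regions, the quality measures reduce to simple per-region statistics: the standard deviation estimates noise, and the deviation of the mean from the expected CT number estimates calibration error. A minimal sketch under nominal assumptions (air is -1000 HU by definition; the expected value for aortic blood is an assumed nominal figure, not the paper's):

    ```python
    import numpy as np

    # nominal CT numbers in HU; blood is assumed to be roughly 50 HU
    EXPECTED_HU = {"external_air": -1000.0, "trachea_air": -1000.0, "aorta_blood": 50.0}

    def image_quality(ct_volume, region_masks):
        """Noise and calibration metrics from homogeneous-region statistics."""
        report = {}
        for name, mask in region_masks.items():
            voxels = ct_volume[mask]
            report[name] = {
                "noise_sd_hu": float(voxels.std()),
                "calibration_bias_hu": float(voxels.mean() - EXPECTED_HU[name]),
            }
        return report
    ```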

  15. Modern architectures for intelligent systems: reusable ontologies and problem-solving methods.

    PubMed Central

    Musen, M. A.

    1998-01-01

    When interest in intelligent systems for clinical medicine soared in the 1970s, workers in medical informatics became particularly attracted to rule-based systems. Although many successful rule-based applications were constructed, development and maintenance of large rule bases remained quite problematic. In the 1980s, an entire industry dedicated to the marketing of tools for creating rule-based systems rose and fell, as workers in medical informatics began to appreciate deeply why knowledge acquisition and maintenance for such systems are difficult problems. During this time period, investigators began to explore alternative programming abstractions that could be used to develop intelligent systems. The notions of "generic tasks" and of reusable problem-solving methods became extremely influential. By the 1990s, academic centers were experimenting with architectures for intelligent systems based on two classes of reusable components: (1) domain-independent problem-solving methods (standard algorithms for automating stereotypical tasks) and (2) domain ontologies that captured the essential concepts (and relationships among those concepts) in particular application areas. This paper will highlight how intelligent systems for diverse tasks can be efficiently automated using these kinds of building blocks. The creation of domain ontologies and problem-solving methods is the fundamental end product of basic research in medical informatics. Consequently, these concepts need more attention by our scientific community. PMID:9929181

  16. Fast vessel segmentation in retinal images using multi-scale enhancement and second-order local entropy

    NASA Astrophysics Data System (ADS)

    Yu, H.; Barriga, S.; Agurto, C.; Zamora, G.; Bauman, W.; Soliz, P.

    2012-03-01

    Retinal vasculature is one of the most important anatomical structures in digital retinal photographs. Accurate segmentation of retinal blood vessels is an essential task in automated analysis of retinopathy. This paper presents a new and effective vessel segmentation algorithm that features computational simplicity and fast implementation. This method uses morphological pre-processing to decrease the disturbance of bright structures and lesions before vessel extraction. Next, a vessel probability map is generated by computing the eigenvalues of the second derivatives of the Gaussian-filtered image at multiple scales. Then, second-order local entropy thresholding is applied to segment the vessel map. Lastly, a rule-based decision step, which measures the geometric shape difference between vessels and lesions, is applied to reduce false positives. The algorithm is evaluated on the low-resolution DRIVE and STARE databases and the publicly available high-resolution image database from Friedrich-Alexander University Erlangen-Nuremberg (Germany). The proposed method achieved comparable performance to state-of-the-art unsupervised vessel segmentation methods at a competitively faster speed on the DRIVE and STARE databases. For the high-resolution fundus image database, the proposed algorithm outperforms an existing approach in both performance and speed. The efficiency and robustness make the blood vessel segmentation method described here suitable for broad application in automated analysis of retinal images.
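
    The vessel probability map described above is multiscale Hessian-eigenvalue filtering, for which scikit-image's Frangi filter is a close off-the-shelf analogue. In the sketch below, Otsu thresholding is a stand-in for the paper's second-order local entropy threshold, and the morphological pre-processing and rule-based false-positive step are omitted.

    ```python
    from skimage import filters

    def segment_vessels(green_channel):
        """Rough vessel map from a fundus image's green channel."""
        # multiscale Hessian-based vesselness; retinal vessels are dark ridges
        vesselness = filters.frangi(green_channel, sigmas=range(1, 6), black_ridges=True)
        # stand-in for second-order local entropy thresholding
        return vesselness > filters.threshold_otsu(vesselness)
    ```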

  17. Modern architectures for intelligent systems: reusable ontologies and problem-solving methods.

    PubMed

    Musen, M A

    1998-01-01

    When interest in intelligent systems for clinical medicine soared in the 1970s, workers in medical informatics became particularly attracted to rule-based systems. Although many successful rule-based applications were constructed, development and maintenance of large rule bases remained quite problematic. In the 1980s, an entire industry dedicated to the marketing of tools for creating rule-based systems rose and fell, as workers in medical informatics began to appreciate deeply why knowledge acquisition and maintenance for such systems are difficult problems. During this time period, investigators began to explore alternative programming abstractions that could be used to develop intelligent systems. The notions of "generic tasks" and of reusable problem-solving methods became extremely influential. By the 1990s, academic centers were experimenting with architectures for intelligent systems based on two classes of reusable components: (1) domain-independent problem-solving methods (standard algorithms for automating stereotypical tasks) and (2) domain ontologies that captured the essential concepts (and relationships among those concepts) in particular application areas. This paper will highlight how intelligent systems for diverse tasks can be efficiently automated using these kinds of building blocks. The creation of domain ontologies and problem-solving methods is the fundamental end product of basic research in medical informatics. Consequently, these concepts need more attention by our scientific community.

  18. Portable Automation of Static Chamber Sample Collection for Quantifying Soil Gas Flux

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Davis, Morgan P.; Groh, Tyler A.; Parkin, Timothy B.

    Quantification of soil gas flux using the static chamber method is labor intensive. The number of chambers that can be sampled is limited by the spacing between chambers and the availability of trained research technicians. An automated system for collecting gas samples from chambers in the field would eliminate the need for personnel to return to the chamber during a flux measurement period and would allow a single technician to sample multiple chambers simultaneously. This study describes Chamber Automated Sampling Equipment (FluxCASE) to collect and store chamber headspace gas samples at assigned time points for the measurement of soil gas flux. The FluxCASE design and operation is described, and the accuracy and precision of the FluxCASE system is evaluated. In laboratory measurements of nitrous oxide (N2O), carbon dioxide (CO2), and methane (CH4) concentrations of a standardized gas mixture, coefficients of variation associated with automated and manual sample collection were comparable, indicating no loss of precision. In the field, soil gas fluxes measured from FluxCASEs were in agreement with manual sampling for both N2O and CO2. Slopes of regression equations were 1.01 for CO2 and 0.97 for N2O. The 95% confidence limits of the slopes of the regression lines included the value of one, indicating no bias. Additionally, an expense analysis found a cost recovery ranging from 0.6 to 2.2 yr. Implementing the FluxCASE system is an alternative to improve the efficiency of the static chamber method for measuring soil gas flux while maintaining the accuracy and precision of manual sampling.
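
    Whether the headspace samples are drawn manually or by a FluxCASE, the flux itself comes from regressing concentration against time and converting the slope to a molar flux with the ideal gas law; the regression slopes of 1.01 and 0.97 quoted above compare exactly such fluxes. A minimal sketch with illustrative chamber parameters:

    ```python
    import numpy as np

    def chamber_flux(times_min, conc_ppm, volume_L, area_m2,
                     pressure_kpa=101.3, temp_k=298.0):
        """Soil gas flux from static-chamber headspace samples.

        Slope of concentration vs time, converted to a molar flux
        (umol m^-2 min^-1) via PV = nRT.
        """
        slope_ppm_per_min = np.polyfit(times_min, conc_ppm, 1)[0]
        mol_air = pressure_kpa * volume_L / (8.314 * temp_k)  # moles of air in chamber
        return slope_ppm_per_min * mol_air / area_m2          # ppm = umol per mol of air
    ```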

  19. Automated processing of whole blood units: operational value and in vitro quality of final blood components.

    PubMed

    Jurado, Marisa; Algora, Manuel; Garcia-Sanchez, Félix; Vico, Santiago; Rodriguez, Eva; Perez, Sonia; Barbolla, Luz

    2012-01-01

    The Community Transfusion Centre in Madrid currently processes whole blood using a conventional procedure (Compomat, Fresenius) followed by automated processing of buffy coats with the OrbiSac system (CaridianBCT). The Atreus 3C system (CaridianBCT) automates the production of red blood cells, plasma and an interim platelet unit from a whole blood unit. Interim platelet units are pooled to produce a transfusable platelet unit. In this study the Atreus 3C system was evaluated and compared to the routine method with regard to product quality and operational value. Over a 5-week period 810 whole blood units were processed using the Atreus 3C system. The attributes of the automated process were compared to those of the routine method by assessing productivity, space, equipment and staffing requirements. The data obtained were evaluated in order to estimate the impact of implementing the Atreus 3C system in the routine setting of the blood centre. Yield and in vitro quality of the final blood components processed with the two systems were evaluated and compared. The Atreus 3C system enabled higher throughput while requiring less space and employee time by decreasing the amount of equipment and processing time per unit of whole blood processed. Whole blood units processed on the Atreus 3C system gave a higher platelet yield, a similar amount of red blood cells and a smaller volume of plasma. These results support the conclusion that the Atreus 3C system produces blood components meeting quality requirements while providing a high operational efficiency. Implementation of the Atreus 3C system could result in a large organisational improvement.

  20. Old and new techniques mixed up into optical photomask measurement method

    NASA Astrophysics Data System (ADS)

    Fukui, Jumpei; Tachibana, Yusaku; Osanai, Makoto

    2017-07-01

    A cost-efficient, easy-to-operate solution for fully automated CD measurement of line widths from about 500 nm up to 5 μm on photomasks is still in high demand, because such photomasks are frequently used in manufacturing MEMS sensors for IoT and devices made in BCD (Bipolar CMOS DMOS) processes. In reply to this demand from the photomask manufacturing field, we incorporate recently developed low-noise digital camera technology and an i-line LED light source into a new measuring tool in order to achieve 1 nm (3σ) repeatability for line width measurements between 300 nm and 10 μm. In addition, for fully automated operation it is very important to locate the initial target line within a dense pattern. To achieve such automatic line detection precisely, we improved the accuracy of a high-precision stage (20 nm at 3σ) and adapted the alignment algorithm of a MEMS stepper to combine with this tool. As for the user-friendly interface, Windows-based software supports not only operation but also recipe creation and editing in Excel. In the MEMS manufacturing process there are various photomasks that need to be checked and measured frequently, so various recipe files also have to be created and edited frequently. To meet this requirement in photomask management, we mix old and new techniques together into one system, which results in a fully automated and cost-efficient tool with 1 nm repeatability in CD measurement.

  1. Evaluation of effectiveness of information systems implementation in organization (by example of ERP-systems)

    NASA Astrophysics Data System (ADS)

    Demyanova, O. V.; Andreeva, E. V.; Sibgatullina, D. R.; Kireeva-Karimova, A. M.; Gafurova, A. Y.; Zakirova, Ch S.

    2018-05-01

    ERP systems in a modern enterprise make it possible to optimize internal business processes, reduce production costs and increase the attractiveness of enterprises to investors. They are an important component of competitive success and an important condition for attracting investment in the state's key sectors. A vivid example is the enterprise information system built on the ERP (Enterprise Resource Planning) methodology. ERP is an integrated set of methods, processes, technologies and tools. It is based on: supply chain management; advanced planning and scheduling; sales automation; configuration tools; final resource planning; business intelligence; OLAP technology; an e-commerce block; and product data management. The main purpose of ERP systems is the automation of interrelated processes of planning, accounting and management in key areas of the company. ERP systems are automated systems that effectively address complex problems, including optimal allocation of business resources and quick and efficient delivery of goods and services to the consumer. Knowledge embedded in ERP systems provides enterprise-wide automation that represents the activities of all functional departments of the company as a single complex system. At the level of qualitative estimates, most managers understand that the implementation of ERP systems is a necessary and useful procedure. Assessment of the effectiveness of information systems implementation is therefore relevant.

  2. Clinical brain MR imaging prescriptions in Talairach space: technologist- and computer-driven methods.

    PubMed

    Weiss, Kenneth L; Pan, Hai; Storrs, Judd; Strub, William; Weiss, Jane L; Jia, Li; Eldevik, O Petter

    2003-05-01

    Variability in patient head positioning may yield substantial interstudy image variance in the clinical setting. We describe and test three-step technologist- and computer-automated algorithms designed to image the brain in a standard reference system and reduce variance. Triple oblique axial images obtained parallel to the Talairach anterior commissure (AC)-posterior commissure (PC) plane were reviewed in a prospective analysis of 126 consecutive patients. Requisite roll, yaw, and pitch corrections, as three authors determined independently and subsequently by consensus, were compared with the technologists' actual graphical prescriptions and those generated by a novel computer-automated three-step (CATS) program. Automated pitch determinations generated with Statistical Parametric Mapping '99 (SPM'99) were also compared. Requisite pitch correction (15.2° ± 10.2°) far exceeded that for roll (-0.6° ± 3.7°) and yaw (-0.9° ± 4.7°) in terms of magnitude and variance (P < .001). Technologist and computer-generated prescriptions substantially reduced interpatient image variance with regard to roll (3.4° and 3.9° vs 13.5°), yaw (0.6° and 2.5° vs 22.3°), and pitch (28.6°, 18.5° with CATS, and 59.3° with SPM'99 vs 104°). CATS performed worse than the technologists in yaw prescription, and it was equivalent in roll and pitch prescriptions. Talairach prescriptions better approximated standard CT canthomeatal angulations (9° vs 24°) and provided more efficient brain coverage than that of routine axial imaging. Brain MR prescriptions corrected for direct roll, yaw, and Talairach AC-PC pitch can be readily achieved by trained technologists or automated computer algorithms. This ability will substantially reduce interpatient variance, allow better approximation of standard CT angulation, and yield more efficient brain coverage than that of routine clinical axial imaging.
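
    Once the AC and PC landmarks are identified, the dominant correction, pitch, is just the angle of the AC-PC line out of the axial plane in the sagittal view. A minimal sketch; the coordinate convention (millimetre coordinates with y anterior and z superior) is an assumption, not the paper's stated frame.

    ```python
    import numpy as np

    def acpc_pitch_deg(ac, pc):
        """Pitch correction so slices lie parallel to the AC-PC line.

        ac, pc: (x, y, z) landmark coordinates in mm; y anterior, z superior.
        """
        dy = ac[1] - pc[1]  # AC lies anterior to PC
        dz = ac[2] - pc[2]  # superior-inferior offset between landmarks
        return float(np.degrees(np.arctan2(dz, dy)))
    ```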

  3. The Abbreviation of Personality, or how to Measure 200 Personality Scales with 200 Items

    PubMed Central

    Yarkoni, Tal

    2010-01-01

    Personality researchers have recently advocated the use of very short personality inventories in order to minimize administration time. However, few such inventories are currently available. Here I introduce an automated method that can be used to abbreviate virtually any personality inventory with minimal effort. After validating the method against existing measures in Studies 1 and 2, a new 181-item inventory is generated in Study 3 that accurately recaptures scores on 8 different broadband inventories comprising 203 distinct scales. Collectively, the results validate a powerful new way to improve the efficiency of personality measurement in research settings. PMID:20419061
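
    Yarkoni's method uses a genetic algorithm to pick items that best recapture full-scale scores; as a simplified illustration of the same objective, here is a greedy forward-selection sketch (all names and the selection criterion are illustrative stand-ins, not the paper's procedure):

    ```python
    import numpy as np

    def abbreviate(items, full_scores, k=10):
        """Pick k items whose simple sum best tracks the full-scale score.

        items: (n_subjects, n_items) response matrix
        full_scores: (n_subjects,) scores from the complete inventory
        """
        chosen = []
        for _ in range(k):
            best_j, best_r = None, -np.inf
            for j in range(items.shape[1]):
                if j in chosen:
                    continue
                proxy = items[:, chosen + [j]].sum(axis=1)  # candidate short scale
                r = np.corrcoef(proxy, full_scores)[0, 1]
                if r > best_r:
                    best_j, best_r = j, r
            chosen.append(best_j)
        return chosen
    ```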

  4. Automated grain mapping using wide angle convergent beam electron diffraction in transmission electron microscope for nanomaterials.

    PubMed

    Kumar, Vineet

    2011-12-01

    The grain size statistics, commonly derived from the grain map of a material sample, are important microstructure characteristics that greatly influence its properties. The grain map for nanomaterials is usually obtained manually by visual inspection of the transmission electron microscope (TEM) micrographs because automated methods do not perform satisfactorily. While the visual inspection method provides reliable results, it is a labor intensive process and is often prone to human errors. In this article, an automated grain mapping method is developed using TEM diffraction patterns. The presented method uses wide angle convergent beam diffraction in the TEM. The automated technique was applied on a platinum thin film sample to obtain the grain map and subsequently derive grain size statistics from it. The grain size statistics obtained with the automated method were found in good agreement with the visual inspection method.

  5. Kinematic synthesis of adjustable robotic mechanisms

    NASA Astrophysics Data System (ADS)

    Chuenchom, Thatchai

    1993-01-01

    Conventional hard automation, such as a linkage-based or a cam-driven system, provides high speed capability and repeatability but not the flexibility required in many industrial applications. Conventional mechanisms, which are typically single-degree-of-freedom systems, are being increasingly replaced by multi-degree-of-freedom multi-actuators driven by logic controllers. Although this new trend in sophistication provides greatly enhanced flexibility, there are many instances where the flexibility needs are exaggerated and the associated complexity is unnecessary. Traditional mechanism-based hard automation, on the other hand, can neither fulfill multi-task requirements nor be cost-effective, mainly due to a lack of methods and tools to design-in flexibility. This dissertation attempts to bridge this technological gap by developing Adjustable Robotic Mechanisms (ARM's) or 'programmable mechanisms' as a middle ground between high speed hard automation and expensive serial jointed-arm robots. This research introduces the concept of adjustable robotic mechanisms towards cost-effective manufacturing automation. A generalized analytical synthesis technique has been developed to support the computational design of ARM's that lays the theoretical foundation for synthesis of adjustable mechanisms. The synthesis method developed in this dissertation, called generalized adjustable dyad and triad synthesis, advances the well-known Burmester theory in kinematics to a new level. While this method provides planar solutions, a novel patented scheme is utilized for converting prescribed three-dimensional motion specifications into sets of planar projections. This provides an analytical and a computational tool for designing adjustable mechanisms that satisfy multiple sets of three-dimensional motion specifications. Several design issues were addressed, including adjustable parameter identification, branching defect, and mechanical errors. An efficient mathematical scheme for identification of the adjustable member was also developed. The analytical synthesis techniques developed in this dissertation were successfully implemented in a graphic-intensive user-friendly computer program. A physical prototype of a general purpose adjustable robotic mechanism has been constructed to serve as a proof-of-concept model.

  6. An automated dose tracking system for adaptive radiation therapy.

    PubMed

    Liu, Chang; Kim, Jinkoo; Kumarasiri, Akila; Mayyas, Essa; Brown, Stephen L; Wen, Ning; Siddiqui, Farzan; Chetty, Indrin J

    2018-02-01

    The implementation of adaptive radiation therapy (ART) into routine clinical practice is technically challenging and requires significant resources to perform and validate each process step. The objective of this report is to identify the key components of ART, to illustrate how a specific automated procedure improves efficiency, and to facilitate the routine clinical application of ART. Data were used from patient images, exported from a clinical database and converted to an intermediate format for point-wise dose tracking and accumulation. The process was automated using in-house developed software containing three modularized components: an ART engine, user interactive tools, and integration tools. The ART engine conducts computing tasks using the following modules: data importing, image pre-processing, dose mapping, dose accumulation, and reporting. In addition, custom graphical user interfaces (GUIs) were developed to allow user interaction with select processes such as deformable image registration (DIR). A commercial scripting application programming interface was used to incorporate automated dose calculation for application in routine treatment planning. Each module was considered an independent program, written in C++ or C#, running in a distributed Windows environment, scheduled and monitored by integration tools. The automated tracking system was retrospectively evaluated for 20 patients with prostate cancer and 96 patients with head and neck cancer, under institutional review board (IRB) approval. In addition, the system was evaluated prospectively using 4 patients with head and neck cancer. Altogether, 780 prostate dose fractions and 2586 head and neck cancer dose fractions were processed, including DIR and dose mapping. On average, daily cumulative dose was computed in 3 h and the manual work was limited to 13 min per case, with approximately 10% of cases requiring an additional 10 min for image registration refinement. An efficient and convenient dose tracking system for ART in the clinical setting is presented. The software and automated processes were rigorously evaluated and validated using patient image datasets. Automation of the various procedures has improved efficiency significantly, allowing for the routine clinical application of ART for improving radiation therapy effectiveness. Copyright © 2017 Elsevier B.V. All rights reserved.
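
    The core of the dose-tracking engine, mapping each daily dose through a deformable registration and summing point-wise on the reference anatomy, can be sketched with SciPy. The displacement-field convention below (voxel units, mapping reference voxels to daily-image coordinates) is an assumption for illustration, not the report's implementation.

    ```python
    import numpy as np
    from scipy.ndimage import map_coordinates

    def accumulate_dose(daily_doses, dvfs, ref_shape):
        """Point-wise dose accumulation on the reference anatomy.

        daily_doses: list of 3D dose arrays on daily anatomies
        dvfs: matching list of displacement fields, shape (3, nz, ny, nx),
              in voxel units, reference -> daily coordinates (assumed)
        """
        total = np.zeros(ref_shape)
        grid = np.indices(ref_shape).astype(float)
        for dose, dvf in zip(daily_doses, dvfs):
            coords = grid + dvf  # where each reference voxel lands in the daily image
            total += map_coordinates(dose, coords, order=1, mode="nearest")
        return total
    ```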

  7. Design on wireless auto-measurement system for lead rail straightness measurement based on PSD

    NASA Astrophysics Data System (ADS)

    Yan, Xiugang; Zhang, Shuqin; Dong, Dengfeng; Cheng, Zhi; Wu, Guanghua; Wang, Jie; Zhou, Weihu

    2016-10-01

    Straightness detection is not only one of the key technologies for ensuring the product quality and installation accuracy of all types of lead rail, but also an important dimensional measurement technology. The straightness measuring devices now available suffer from a low automation level, limitation by the measuring environment, and low measurement efficiency. In this paper, a wireless measurement system for straightness detection based on a position sensitive detector (PSD) is proposed. The system offers a high automation level, convenience, high measurement efficiency, and easy porting and expansion, and can detect the straightness of a lead rail in real time.

  8. Automated visual imaging interface for the plant floor

    NASA Astrophysics Data System (ADS)

    Wutke, John R.

    1991-03-01

    The paper will provide an overview of the challenges facing a user of automated visual imaging ("AVI") machines and the philosophies that should be employed in designing them. As manufacturing tools and equipment become more sophisticated it is increasingly difficult to maintain an efficient interaction between the operator and machine. The typical user of an AVI machine in a production environment is technically unsophisticated. Also operator and machine ergonomics are often a neglected or poorly addressed part of an efficient manufacturing process. This paper presents a number of man-machine interface design techniques and philosophies that effectively solve these problems.

  9. A Graphical Operator Interface for a Telerobotic Inspection System

    NASA Technical Reports Server (NTRS)

    Kim, W. S.; Tso, K. S.; Hayati, S.

    1993-01-01

    Operator interface has recently emerged as an important element for efficient and safe operator interactions with the telerobotic system. Recent advances in graphical user interface (GUI) and graphics/video merging technologies enable development of more efficient, flexible operator interfaces. This paper describes an advanced graphical operator interface newly developed for a remote surface inspection system at Jet Propulsion Laboratory. The interface has been designed so that remote surface inspection can be performed by a single operator with an integrated robot control and image inspection capability. It supports three inspection strategies of teleoperated human visual inspection, human visual inspection with automated scanning, and machine-vision-based automated inspection.

  10. Automated system for on-line desorption of dried blood spots applied to LC/MS/MS pharmacokinetic study of flurbiprofen and its metabolite.

    PubMed

    Déglon, Julien; Thomas, Aurélien; Daali, Youssef; Lauer, Estelle; Samer, Caroline; Desmeules, Jules; Dayer, Pierre; Mangin, Patrice; Staub, Christian

    2011-01-25

    This paper illustrates the development of an automated system for the on-line bioanalysis of dried blood spots (on-line DBS). In this way, a prototype was designed for integration into a conventional LC/MS/MS, allowing the successive extraction of 30 DBS toward the analytical system without any sample pretreatment. The developed method was assessed for the DBS analysis of flurbiprofen (FLB) and its metabolite 4-hydroxyflurbiprofen (OH-FLB) in human whole blood (i.e. 5 μL). The automated procedure was fully validated based on international criteria and showed good precision, trueness, and linearity over the expected concentration range (from 10 to 1000 ng/mL and 100 to 10,000 ng/mL for OH-FLB and FLB respectively). Furthermore, the prototype showed good results in terms of recovery and carry-over. Stability of both analytes on filter paper was also investigated and the results suggested that DBS could be stored at ambient temperature for over 1 month. The on-line DBS automated system was then successfully applied to a pharmacokinetic study performed on healthy male volunteers after oral administration of a single 50-mg dose of FLB. Additionally, a comparison between finger capillary DBS and classic venous plasma concentrations was investigated. A good correlation was observed, demonstrating the complementarity of both sampling forms. The automated system described in this article represents an efficient tool for the LC/MS/MS analysis of DBS samples in many bioanalytical applications. Copyright © 2010 Elsevier B.V. All rights reserved.

  11. Automated processing of whole blood samples for the determination of immunosuppressants by liquid chromatography tandem-mass spectrometry.

    PubMed

    Vogeser, Michael; Spöhrer, Ute

    2006-01-01

    Liquid chromatography tandem-mass spectrometry (LC-MS/MS) is an efficient technology for routine determination of immunosuppressants in whole blood; however, time-consuming manual sample preparation remains a significant limitation of this technique. Using a commercially available robotic pipetting system (Tecan Freedom EVO), we developed an automated sample-preparation protocol for quantification of tacrolimus in whole blood by LC-MS/MS. Barcode reading, sample resuspension, transfer of whole blood aliquots into a deep-well plate, addition of internal standard solution, mixing, and protein precipitation by addition of an organic solvent is performed by the robotic system. After centrifugation of the plate, the deproteinized supernatants are submitted to on-line solid phase extraction, using column switching prior to LC-MS/MS analysis. The only manual actions within the entire process are decapping of the tubes, and transfer of the deep-well plate from the robotic system to a centrifuge and finally to the HPLC autosampler. Whole blood pools were used to assess the reproducibility of the entire analytical system for measuring tacrolimus concentrations. A total coefficient of variation of 1.7% was found for the entire automated analytical process (n=40; mean tacrolimus concentration, 5.3 microg/L). Close agreement between tacrolimus results obtained after manual and automated sample preparation was observed. The analytical system described here, comprising automated protein precipitation, on-line solid phase extraction and LC-MS/MS analysis, is convenient and precise, and minimizes hands-on time and the risk of mistakes in the quantification of whole blood immunosuppressant concentrations compared to conventional methods.

  12. Automated packing systems: review of industrial implementations

    NASA Astrophysics Data System (ADS)

    Whelan, Paul F.; Batchelor, Bruce G.

    1993-08-01

    A rich theoretical background to the problems that occur in the automation of material handling can be found in the operations research, production engineering, systems engineering and automation (more specifically, machine vision) literature. This work has contributed towards the design of intelligent handling systems. This paper will review the application of these automated material handling and packing techniques to industrial problems. The discussion will also highlight the systems integration issues involved in these applications. An outline of one such industrial application, the automated placement of shape templates on to leather hides, is also discussed. The purpose of this system is to arrange shape templates on a leather hide in an efficient manner, so as to minimize the leather waste, before they are automatically cut from the hide. These pieces are used in the furniture and car manufacturing industries for the upholstery of high quality leather chairs and car seats. Currently this type of operation is semi-automated. The paper will outline the problems involved in the full automation of such a procedure.

  13. HT-COMET: a novel automated approach for high throughput assessment of human sperm chromatin quality.

    PubMed

    Albert, Océane; Reintsch, Wolfgang E; Chan, Peter; Robaire, Bernard

    2016-05-01

    Can we make the comet assay (single-cell gel electrophoresis) for human sperm a more accurate and informative high throughput assay? We developed a standardized automated high throughput comet (HT-COMET) assay for human sperm that improves its accuracy and efficiency, and could be of prognostic value to patients in the fertility clinic. The comet assay involves the collection of data on sperm DNA damage at the level of the single cell, allowing the use of samples from severe oligozoospermic patients. However, this makes comet scoring a low throughput procedure that renders large cohort analyses tedious. Furthermore, the comet assay comes with an inherent vulnerability to variability. Our objective is to develop an automated high throughput comet assay for human sperm that will increase both its accuracy and efficiency. The study comprised two distinct components: a HT-COMET technical optimization section based on control versus DNAse treatment analyses (n = 3-5), and a cross-sectional study on 123 men presenting to a reproductive center with sperm concentrations categorized as severe oligozoospermia, oligozoospermia or normozoospermia. Sperm chromatin quality was measured using the comet assay: on classic 2-well slides for software comparison; on 96-well slides for HT-COMET optimization; after exposure to various concentrations of a damage-inducing agent, DNAse, using HT-COMET; on 123 subjects with different sperm concentrations using HT-COMET. Data from the 123 subjects were correlated to classic semen quality parameters and plotted as single-cell data in individual DNA damage profiles. We have developed a standard automated HT-COMET procedure for human sperm. It includes automated scoring of comets by a fully integrated high content screening setup that compares well with the most commonly used semi-manual analysis software. Using this method, a cross-sectional study on 123 men showed no significant correlation between sperm concentration and sperm DNA damage, confirming the existence of hidden chromatin damage in men with apparently normal semen characteristics, and a significant correlation between percentage DNA in the tail and percentage of progressively motile spermatozoa. Finally, the use of DNA damage profiles helped to distinguish subjects between and within sperm concentration categories, and allowed a determination of the proportion of highly damaged cells. The main limitations of the HT-COMET are the high, yet indispensable, investment in an automated liquid handling system and heating block to ensure accuracy, and the availability of an automated plate reading microscope and analysis software. This standardized HT-COMET assay offers many advantages, including higher accuracy and evenness due to automation of sensitive steps, a 14.4-fold increase in sample analysis capacity, and an imaging and scoring time of 1 min/well. Overall, HT-COMET offers a decrease in total experimental time of more than 90%. Hence, this assay constitutes a more efficient option to assess sperm chromatin quality, paves the way to using this assay to screen large cohorts, and holds prognostic value for infertile patients. Funded by the CIHR Institute of Human Development, Child and Youth Health (IHDCYH; RHF 100625). O.A. is a fellow supported by the Fonds de la Recherche du Québec - Santé (FRQS) and the CIHR Training Program in Reproduction, Early Development, and the Impact on Health (REDIH). B.R. is a James McGill Professor. The authors declare no conflicts of interest. © The Author 2016. Published by Oxford University Press on behalf of the European Society of Human Reproduction and Embryology. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
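
    The primary endpoint above, percentage DNA in the tail, is a simple intensity ratio once head and tail regions of each comet are segmented; the automated scoring wraps exactly this computation per cell. A minimal sketch, assuming segmentation is done upstream:

    ```python
    import numpy as np

    def percent_tail_dna(image, head_mask, tail_mask):
        """Percentage of comet DNA (fluorescence intensity) in the tail."""
        head = float(image[head_mask].sum())
        tail = float(image[tail_mask].sum())
        return 100.0 * tail / (head + tail)
    ```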

  14. Automation of testing modules of controller ELSY-TMK

    NASA Astrophysics Data System (ADS)

    Dolotov, A. E.; Dolotova, R. G.; Petuhov, D. V.; Potapova, A. P.

    2017-01-01

    Modern means for automating various processes make it possible to maintain high quality standards for released products and to raise labour efficiency. This paper presents data on the automation of the test process for the ELSY-TMK controller [1]. The ELSY-TMK programmable logic controller is an effective modular platform for building automation systems for small and medium-sized industrial production. Its modern, functional communication standard and the open environment of the logic controller make it a powerful tool for a wide spectrum of industrial automation applications. The presented algorithm tests controller modules by operating the switching system and external devices faster and with higher quality than a human could without such means.

  15. Improved and Robust Detection of Cell Nuclei from Four Dimensional Fluorescence Images

    PubMed Central

    Bashar, Md. Khayrul; Yamagata, Kazuo; Kobayashi, Tetsuya J.

    2014-01-01

    Segmentation-free direct methods are quite efficient for automated nuclei extraction from high dimensional images. A few such methods do exist but most of them do not ensure algorithmic robustness to parameter and noise variations. In this research, we propose a method based on multiscale adaptive filtering for efficient and robust detection of nuclei centroids from four dimensional (4D) fluorescence images. A temporal feedback mechanism is employed between the enhancement and the initial detection steps of a typical direct method. We estimate the minimum and maximum nuclei diameters from the previous frame and feed them back as filter lengths for multiscale enhancement of the current frame. A radial intensity-gradient function is optimized at positions of initial centroids to estimate all nuclei diameters. This procedure continues for processing subsequent images in the sequence. The above mechanism thus ensures proper enhancement by automated estimation of major parameters. This brings robustness and safeguards the system against additive noise and the effects of wrong parameters. Later, the method and its single-scale variant are simplified for further reduction of parameters. The proposed method is then extended for nuclei volume segmentation. The same optimization technique is applied to final centroid positions of the enhanced image and the estimated diameters are projected onto the binary candidate regions to segment nuclei volumes. Our method is finally integrated with a simple sequential tracking approach to establish nuclear trajectories in the 4D space. Experimental evaluations with five image-sequences (each having 271 3D sequential images) corresponding to five different mouse embryos show promising performances of our methods in terms of nuclear detection, segmentation, and tracking. A detailed analysis with a sub-sequence of 101 3D images from an embryo reveals that the proposed method can improve the nuclei detection accuracy by 9% over the previous methods, which used inappropriately large valued parameters. Results also confirm that the proposed method and its variants achieve high detection accuracies (98% mean F-measure) irrespective of the large variations of filter parameters and noise levels. PMID:25020042
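
    The temporal feedback described above can be sketched with an off-the-shelf multiscale Laplacian-of-Gaussian blob detector, where the previous frame's estimated nucleus diameters set the filter scales for the current frame. blob_log is a stand-in for the authors' adaptive filter, and the sigma-to-diameter conversion assumes a LoG detector; the threshold is illustrative.

    ```python
    import numpy as np
    from skimage.feature import blob_log

    def detect_nuclei(frame, prev_min_d, prev_max_d):
        """Detect nuclei centroids, feeding back diameter estimates between frames."""
        k = 2.0 * np.sqrt(2.0)  # blob diameter ~ 2*sqrt(2)*sigma for a LoG detector
        blobs = blob_log(frame, min_sigma=prev_min_d / k, max_sigma=prev_max_d / k,
                         num_sigma=10, threshold=0.05)
        if blobs.size == 0:                  # keep old scales if nothing is found
            return blobs, prev_min_d, prev_max_d
        diameters = k * blobs[:, -1]         # last column is the detection sigma
        return blobs[:, :-1], diameters.min(), diameters.max()
    ```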

  16. Automated Segmentation of Kidneys from MR Images in Patients with Autosomal Dominant Polycystic Kidney Disease

    PubMed Central

    Kim, Youngwoo; Ge, Yinghui; Tao, Cheng; Zhu, Jianbing; Chapman, Arlene B.; Torres, Vicente E.; Yu, Alan S.L.; Mrug, Michal; Bennett, William M.; Flessner, Michael F.; Landsittel, Doug P.

    2016-01-01

    Background and objectives Our study developed a fully automated method for segmentation and volumetric measurements of kidneys from magnetic resonance images in patients with autosomal dominant polycystic kidney disease and assessed the performance of the automated method with the reference manual segmentation method. Design, setting, participants, & measurements Study patients were selected from the Consortium for Radiologic Imaging Studies of Polycystic Kidney Disease. At the enrollment of the Consortium for Radiologic Imaging Studies of Polycystic Kidney Disease Study in 2000, patients with autosomal dominant polycystic kidney disease were between 15 and 46 years of age with relatively preserved GFRs. Our fully automated segmentation method was based on a spatial prior probability map of the location of kidneys in abdominal magnetic resonance images and regional mapping with total variation regularization and propagated shape constraints that were formulated into a level set framework. T2-weighted magnetic resonance image sets of 120 kidneys were selected from 60 patients with autosomal dominant polycystic kidney disease and divided into the training and test datasets. The performance of the automated method in reference to the manual method was assessed by means of two metrics: Dice similarity coefficient and intraclass correlation coefficient of segmented kidney volume. The training and test sets were swapped for crossvalidation and reanalyzed. Results Successful segmentation of kidneys was performed with the automated method in all test patients. The segmented kidney volumes ranged from 177.2 to 2634 ml (mean, 885.4±569.7 ml). The mean Dice similarity coefficient ±SD between the automated and manual methods was 0.88±0.08. The mean correlation coefficient between the two segmentation methods for the segmented volume measurements was 0.97 (P<0.001 for each crossvalidation set). The results from the crossvalidation sets were highly comparable. Conclusions We have developed a fully automated method for segmentation of kidneys from abdominal magnetic resonance images in patients with autosomal dominant polycystic kidney disease with varying kidney volumes. The performance of the automated method was in good agreement with that of the manual method. PMID:26797708
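
    The agreement metric used above, the Dice similarity coefficient, is twice the overlap of the two segmentations divided by the sum of their sizes, so a score of 0.88 means the automated and manual masks share most of their voxels. A minimal sketch:

    ```python
    import numpy as np

    def dice(mask_a, mask_b):
        """Dice similarity coefficient between two binary segmentation masks."""
        a, b = mask_a.astype(bool), mask_b.astype(bool)
        intersection = np.logical_and(a, b).sum()
        return 2.0 * intersection / (a.sum() + b.sum())
    ```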

  17. Automated Segmentation of Kidneys from MR Images in Patients with Autosomal Dominant Polycystic Kidney Disease.

    PubMed

    Kim, Youngwoo; Ge, Yinghui; Tao, Cheng; Zhu, Jianbing; Chapman, Arlene B; Torres, Vicente E; Yu, Alan S L; Mrug, Michal; Bennett, William M; Flessner, Michael F; Landsittel, Doug P; Bae, Kyongtae T

    2016-04-07

    Our study developed a fully automated method for segmentation and volumetric measurements of kidneys from magnetic resonance images in patients with autosomal dominant polycystic kidney disease and assessed the performance of the automated method with the reference manual segmentation method. Study patients were selected from the Consortium for Radiologic Imaging Studies of Polycystic Kidney Disease. At the enrollment of the Consortium for Radiologic Imaging Studies of Polycystic Kidney Disease Study in 2000, patients with autosomal dominant polycystic kidney disease were between 15 and 46 years of age with relatively preserved GFRs. Our fully automated segmentation method was based on a spatial prior probability map of the location of kidneys in abdominal magnetic resonance images and regional mapping with total variation regularization and propagated shape constraints that were formulated into a level set framework. T2-weighted magnetic resonance image sets of 120 kidneys were selected from 60 patients with autosomal dominant polycystic kidney disease and divided into the training and test datasets. The performance of the automated method in reference to the manual method was assessed by means of two metrics: Dice similarity coefficient and intraclass correlation coefficient of segmented kidney volume. The training and test sets were swapped for crossvalidation and reanalyzed. Successful segmentation of kidneys was performed with the automated method in all test patients. The segmented kidney volumes ranged from 177.2 to 2634 ml (mean, 885.4±569.7 ml). The mean Dice similarity coefficient ±SD between the automated and manual methods was 0.88±0.08. The mean correlation coefficient between the two segmentation methods for the segmented volume measurements was 0.97 (P<0.001 for each crossvalidation set). The results from the crossvalidation sets were highly comparable. We have developed a fully automated method for segmentation of kidneys from abdominal magnetic resonance images in patients with autosomal dominant polycystic kidney disease with varying kidney volumes. The performance of the automated method was in good agreement with that of the manual method. Copyright © 2016 by the American Society of Nephrology.

  18. Color Retinal Image Enhancement Based on Luminosity and Contrast Adjustment.

    PubMed

    Zhou, Mei; Jin, Kai; Wang, Shaoze; Ye, Juan; Qian, Dahong

    2018-03-01

    Many common eye diseases and cardiovascular diseases can be diagnosed through retinal imaging. However, due to uneven illumination, image blurring, and low contrast, retinal images with poor quality are not useful for diagnosis, especially in automated image analyzing systems. Here, we propose a new image enhancement method to improve color retinal image luminosity and contrast. A luminance gain matrix, which is obtained by gamma correction of the value channel in the HSV (hue, saturation, and value) color space, is used to enhance the R, G, and B (red, green and blue) channels, respectively. Contrast is then enhanced in the luminosity channel of L*a*b* color space by CLAHE (contrast-limited adaptive histogram equalization). Image enhancement by the proposed method is compared to other methods by evaluating quality scores of the enhanced images. The performance of the method is mainly validated on a dataset of 961 poor-quality retinal images. Quality assessment (range 0-1) of image enhancement of this poor dataset indicated that our method improved color retinal image quality from an average of 0.0404 (standard deviation 0.0291) up to an average of 0.4565 (standard deviation 0.1000). The proposed method is shown to achieve superior image enhancement compared to contrast enhancement in other color spaces or by other related methods, while simultaneously preserving image naturalness. This method of color retinal image enhancement may be employed to assist ophthalmologists in more efficient screening of retinal diseases and in development of improved automated image analysis for clinical diagnosis.
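
    The two-stage scheme, a gamma-derived luminance gain applied to all three color channels followed by CLAHE on the L* channel, maps directly onto OpenCV primitives. A rough sketch; the gamma and clip-limit values are illustrative defaults, not the paper's tuned parameters.

    ```python
    import cv2
    import numpy as np

    def enhance_retina(bgr, gamma=0.8, clip=2.0):
        """Luminosity gain from gamma-corrected V channel, then CLAHE on L*."""
        hsv = cv2.cvtColor(bgr, cv2.COLOR_BGR2HSV)
        v = hsv[:, :, 2].astype(np.float32) / 255.0
        # per-pixel gain so that V is gamma-corrected; same gain scales R, G, B
        gain = np.power(v, gamma) / np.maximum(v, 1e-6)
        out = np.clip(bgr.astype(np.float32) * gain[..., None], 0, 255).astype(np.uint8)
        lab = cv2.cvtColor(out, cv2.COLOR_BGR2LAB)
        clahe = cv2.createCLAHE(clipLimit=clip, tileGridSize=(8, 8))
        lab[:, :, 0] = clahe.apply(lab[:, :, 0])  # contrast only on luminosity
        return cv2.cvtColor(lab, cv2.COLOR_LAB2BGR)
    ```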

  19. Logistics support economy and efficiency through consolidation and automation

    NASA Technical Reports Server (NTRS)

    Savage, G. R.; Fontana, C. J.; Custer, J. D.

    1985-01-01

    An integrated logistics support system, which would provide routine access to space and be cost-competitive as an operational space transportation system, was planned and implemented to support the NSTS program launch-on-time goal of 95 percent. A decision was made to centralize the Shuttle logistics functions in a modern facility that would provide office and training space and an efficient warehouse area. In this warehouse, the emphasis is on automation of the storage and retrieval function, while utilizing state-of-the-art warehousing and inventory management technology. This consolidation, together with the automation capabilities being provided, will allow for more effective utilization of personnel and improved responsiveness. In addition, this facility will be the prime support for the fully integrated logistics support of the operations era NSTS and reduce the program's management, procurement, transportation, and supply costs in the operations era.

  20. Auto-rickshaw: an automated crystal structure determination platform as an efficient tool for the validation of an X-ray diffraction experiment.

    PubMed

    Panjikar, Santosh; Parthasarathy, Venkataraman; Lamzin, Victor S; Weiss, Manfred S; Tucker, Paul A

    2005-04-01

    The EMBL-Hamburg Automated Crystal Structure Determination Platform is a system that combines a number of existing macromolecular crystallographic computer programs and several decision-makers into a software pipeline for automated and efficient crystal structure determination. The pipeline can be invoked as soon as X-ray data from derivatized protein crystals have been collected and processed. It is controlled by a web-based graphical user interface for data and parameter input, and for monitoring the progress of structure determination. A large number of possible structure-solution paths are encoded in the system and the optimal path is selected by the decision-makers as the structure solution evolves. The processes have been optimized for speed so that the pipeline can be used effectively for validating the X-ray experiment at a synchrotron beamline.
