Enhancer scanning to locate regulatory regions in genomic loci
Buckley, Melissa; Gjyshi, Anxhela; Mendoza-Fandiño, Gustavo; Baskin, Rebekah; Carvalho, Renato S.; Carvalho, Marcelo A.; Woods, Nicholas T.; Monteiro, Alvaro N.A.
2016-01-01
The present protocol provides a rapid, streamlined and scalable strategy to systematically scan genomic regions for the presence of transcriptional regulatory regions active in a specific cell type. It creates genomic tiles spanning a region of interest that are subsequently cloned by recombination into a luciferase reporter vector containing the Simian Virus 40 promoter. Tiling clones are transfected into specific cell types to test for the presence of transcriptional regulatory regions. The protocol includes testing of different SNP (single nucleotide polymorphism) alleles to determine their effect on regulatory activity. This procedure provides a systematic framework to identify candidate functional SNPs within a locus during functional analysis of genome-wide association studies. This protocol adapts and combines previous well-established molecular biology methods to provide a streamlined strategy, based on automated primer design and recombinational cloning to rapidly go from a genomic locus to a set of candidate functional SNPs in eight weeks. PMID:26658467
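The tiling step lends itself to a simple script. The sketch below is an illustration only (not the authors' pipeline); tile length, overlap, and coordinates are hypothetical values chosen to show how overlapping tiles spanning a locus could be generated ahead of automated primer design.

```python
def make_tiles(locus_start, locus_end, tile_len=2000, overlap=200):
    """Return (start, end) coordinates of overlapping tiles spanning a locus."""
    tiles = []
    step = tile_len - overlap
    pos = locus_start
    while pos < locus_end:
        tiles.append((pos, min(pos + tile_len, locus_end)))
        pos += step
    return tiles

# Example: a hypothetical 20 kb locus split into ~2 kb tiles with 200 bp overlap
for start, end in make_tiles(1_250_000, 1_270_000):
    print(f"tile {start}-{end} ({end - start} bp)")
```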
Design and Prototype of an Automated Column-Switching HPLC System for Radiometabolite Analysis.
Vasdev, Neil; Collier, Thomas Lee
2016-08-17
Column-switching high performance liquid chromatography (HPLC) is extensively used for the critical analysis of radiolabeled ligands and their metabolites in plasma. However, the lack of streamlined apparatus and consequently varying protocols remain as a challenge among positron emission tomography laboratories. We report here the prototype apparatus and implementation of a fully automated and simplified column-switching procedure to allow for the easy and automated determination of radioligands and their metabolites in up to 5 mL of plasma. The system has been used with conventional UV and coincidence radiation detectors, as well as with a single quadrupole mass spectrometer.
A Flexible Workflow for Automated Bioluminescent Kinase Selectivity Profiling.
Worzella, Tracy; Butzler, Matt; Hennek, Jacquelyn; Hanson, Seth; Simdon, Laura; Goueli, Said; Cowan, Cris; Zegzouti, Hicham
2017-04-01
Kinase profiling during drug discovery is a necessary process to confirm inhibitor selectivity and assess off-target activities. However, cost and logistical limitations prevent profiling activities from being performed in-house. We describe the development of an automated and flexible kinase profiling workflow that combines ready-to-use kinase enzymes and substrates in convenient eight-tube strips, a bench-top liquid handling device, ADP-Glo Kinase Assay (Promega, Madison, WI) technology to quantify enzyme activity, and a multimode detection instrument. Automated methods were developed for kinase reactions and quantification reactions to be assembled on a Gilson (Middleton, WI) PIPETMAX, following standardized plate layouts for single- and multidose compound profiling. Pipetting protocols were customized at runtime based on user-provided information, including compound number, increment for compound titrations, and number of kinase families to use. After the automated liquid handling procedures, a GloMax Discover (Promega) microplate reader preloaded with SMART protocols was used for luminescence detection and automatic data analysis. The functionality of the automated workflow was evaluated with several compound-kinase combinations in single-dose or dose-response profiling formats. Known target-specific inhibitions were confirmed. Novel small molecule-kinase interactions, including off-target inhibitions, were identified and confirmed in secondary studies. By adopting this streamlined profiling process, researchers can quickly and efficiently profile compounds of interest on site.
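As a rough illustration of the single-dose readout described above, the sketch below converts luminescence signals into percent inhibition; the control layout and numbers are hypothetical, not Promega's actual analysis templates (ADP-Glo signal rises with kinase activity, so inhibition lowers it).

```python
def percent_inhibition(signal, no_enzyme_ctrl, no_inhibitor_ctrl):
    """Percent inhibition from a luminescence readout, using a background
    (no-enzyme) control and an uninhibited (no-inhibitor) control."""
    window = no_inhibitor_ctrl - no_enzyme_ctrl
    return 100.0 * (no_inhibitor_ctrl - signal) / window

# Hypothetical relative luminescence units from a single-dose profiling well
print(f"{percent_inhibition(signal=12000, no_enzyme_ctrl=800, no_inhibitor_ctrl=45000):.1f}% inhibition")
```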
Honest broker protocol streamlines research access to data while safeguarding patient privacy.
Silvey, Scott A; Silvey, Scott Andrew; Schulte, Janet; Smaltz, Detlev H; Smaltz, Detlev Herb; Kamal, Jyoti
2008-11-06
At Ohio State University Medical Center, The Honest Broker Protocol provides a streamlined mechanism whereby investigators can obtain de-identified clinical data for non-FDA research without having to invest the significant time and effort necessary to craft a formalized protocol for IRB approval.
Automated electric valve for electrokinetic separation in a networked microfluidic chip.
Cui, Huanchun; Huang, Zheng; Dutta, Prashanta; Ivory, Cornelius F
2007-02-15
This paper describes an automated electric valve system designed to reduce dispersion and sample loss into a side channel when an electrokinetically mobilized concentration zone passes a T-junction in a networked microfluidic chip. One way to reduce dispersion is to control current streamlines since charged species are driven along them in the absence of electroosmotic flow. Computer simulations demonstrate that dispersion and sample loss can be reduced by applying a constant additional electric field in the side channel to straighten current streamlines in linear electrokinetic flow (zone electrophoresis). This additional electric field was provided by a pair of platinum microelectrodes integrated into the chip in the vicinity of the T-junction. Both simulations and experiments of this electric valve with constant valve voltages were shown to provide unsatisfactory valve performance during nonlinear electrophoresis (isotachophoresis). On the basis of these results, however, an automated electric valve system was developed with improved valve performance. Experiments conducted with this system showed decreased dispersion and increased reproducibility as protein zones isotachophoretically passed the T-junction. Simulations of the automated electric valve offer further support that the desired shape of current streamlines was maintained at the T-junction during isotachophoresis. Valve performance was evaluated at different valve currents based on statistical variance due to dispersion. With the automated control system, two integrated microelectrodes provide an effective way to manipulate current streamlines, thus acting as an electric valve for charged species in electrokinetic separations.
High-resolution Single Particle Analysis from Electron Cryo-microscopy Images Using SPHIRE
Moriya, Toshio; Saur, Michael; Stabrin, Markus; Merino, Felipe; Voicu, Horatiu; Huang, Zhong; Penczek, Pawel A.; Raunser, Stefan; Gatsogiannis, Christos
2017-01-01
SPHIRE (SPARX for High-Resolution Electron Microscopy) is a novel open-source, user-friendly software suite for the semi-automated processing of single particle electron cryo-microscopy (cryo-EM) data. The protocol presented here describes in detail how to obtain a near-atomic resolution structure starting from cryo-EM micrograph movies by guiding users through all steps of the single particle structure determination pipeline. These steps are controlled from the new SPHIRE graphical user interface and require minimum user intervention. Using this protocol, a 3.5 Å structure of TcdA1, a Tc toxin complex from Photorhabdus luminescens, was derived from only 9500 single particles. This streamlined approach will help novice users without extensive processing experience and a priori structural information, to obtain noise-free and unbiased atomic models of their purified macromolecular complexes in their native state. PMID:28570515
Improving Productivity in Copyright Registration. Report by the U.S. General Accounting Office.
ERIC Educational Resources Information Center
Comptroller General of the U.S., Washington, DC.
The productivity of the copyright registration process, which is administered by the Copyright Office within the Library of Congress, can be improved by streamlining the workflow, reducing and streamlining the handling of correspondence, measuring productivity/performance, increasing the use of automation, improving records management, and…
Scalable Device for Automated Microbial Electroporation in a Digital Microfluidic Platform.
Madison, Andrew C; Royal, Matthew W; Vigneault, Frederic; Chen, Liji; Griffin, Peter B; Horowitz, Mark; Church, George M; Fair, Richard B
2017-09-15
Electrowetting-on-dielectric (EWD) digital microfluidic laboratory-on-a-chip platforms demonstrate excellent performance in automating labor-intensive protocols. When coupled with an on-chip electroporation capability, these systems hold promise for streamlining cumbersome processes such as multiplex automated genome engineering (MAGE). We integrated a single Ti:Au electroporation electrode into an otherwise standard parallel-plate EWD geometry to enable high-efficiency transformation of Escherichia coli with reporter plasmid DNA in a 200 nL droplet. Test devices exhibited robust operation with more than 10 transformation experiments performed per device without cross-contamination or failure. Despite intrinsic electric-field nonuniformity present in the EP/EWD device, the peak on-chip transformation efficiency was measured to be 8.6 ± 1.0 × 10^8 cfu·μg^-1 for an average applied electric field strength of 2.25 ± 0.50 kV·mm^-1. Cell survival and transformation fractions at this electroporation pulse strength were found to be 1.5 ± 0.3 and 2.3 ± 0.1%, respectively. Our work expands the EWD toolkit to include on-chip microbial electroporation and opens the possibility of scaling advanced genome engineering methods, like MAGE, into the submicroliter regime.
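The cfu·μg^-1 figure above is a standard colonies-per-microgram calculation. A minimal sketch follows; the colony count, DNA mass, and plating dilution are invented solely to reproduce the reported order of magnitude and are not data from the study.

```python
def transformation_efficiency(colonies, dna_ug, dilution_factor=1.0):
    """Transformation efficiency in cfu per microgram of plasmid DNA,
    correcting for any dilution made before plating."""
    return colonies * dilution_factor / dna_ug

# Hypothetical example: 172 colonies from 2 pg (2e-6 ug) of DNA, plated after a 1:10 dilution
print(f"{transformation_efficiency(172, 2e-6, dilution_factor=10):.2e} cfu/ug")
```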
Lean coding machine. Facilities target productivity and job satisfaction with coding automation.
Rollins, Genna
2010-07-01
Facilities are turning to coding automation to help manage the volume of electronic documentation, streamlining workflow, boosting productivity, and increasing job satisfaction. As EHR adoption increases, computer-assisted coding may become a necessity, not an option.
Computerized Manufacturing Automation. Employment, Education, and the Workplace. Summary.
ERIC Educational Resources Information Center
Congress of the U.S., Washington, DC. Office of Technology Assessment.
The application of programmable automation (PA) offers new opportunities to enhance and streamline manufacturing processes. Five PA technologies are examined in this report: computer-aided design, robots, numerically controlled machine tools, flexible manufacturing systems, and computer-integrated manufacturing. Each technology is in a relatively…
Powsiri Klinkhachorn; J. Moody; Philip A. Araman
1995-01-01
For the past few decades, researchers have devoted time and effort to apply automation and modern computer technologies towards improving the productivity of traditional industries. To be competitive, one must streamline operations and minimize production costs, while maintaining an acceptable margin of profit. This paper describes the effort of one such endeavor...
Analysis of Jupiter's Oval BA: A Streamlined Approach
NASA Technical Reports Server (NTRS)
Sussman, Michael G.; Chanover, Nancy J.; Simon-Miller, Amy A.; Vasavada, Ashwin R.; Beebe, Reta F.
2010-01-01
We present a novel method of constructing streamlines to derive wind speeds within jovian vortices and demonstrate its application to Oval BA for 2001 pre-reddened Cassini flyby data, 2007 post-reddened New Horizons flyby data, and 1998 Galileo data of precursor Oval DE. Our method, while automated, attempts to combine the advantages of both automated and manual cloud tracking methods. The southern maximum wind speed of Oval BA does not show significant changes between these data sets to within our measurement uncertainty. The northern maximum does appear to have increased in strength during this time interval, which likely correlates with the oval's return to a symmetric shape. We demonstrate how the use of closed streamlines can provide measurements of vorticity averaged over the encircled area with no a priori assumptions concerning oval shape. We find increased averaged interior vorticity between pre- and post-reddened Oval BA, with the precursor Oval DE occupying a middle value of vorticity between these two.
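The closed-streamline vorticity estimate rests on Stokes' theorem: the area-averaged vorticity equals the circulation around the contour divided by the enclosed area. The sketch below illustrates that calculation on a synthetic solid-body vortex; it is not the authors' cloud-tracking code and the numbers are invented.

```python
import numpy as np

def mean_vorticity(x, y, u, v):
    """Area-averaged vorticity inside a closed streamline (Stokes' theorem):
    circulation around the contour divided by the enclosed (shoelace) area.
    x, y are contour vertex coordinates; u, v are wind components there."""
    x, y, u, v = map(np.asarray, (x, y, u, v))
    dx = np.roll(x, -1) - x
    dy = np.roll(y, -1) - y
    circulation = np.sum(u * dx + v * dy)                                 # line integral of velocity
    area = 0.5 * np.abs(np.sum(x * np.roll(y, -1) - np.roll(x, -1) * y))  # shoelace formula
    return circulation / area

# Synthetic circular streamline (radius 1000 km) in solid-body rotation
theta = np.linspace(0, 2 * np.pi, 200, endpoint=False)
r, omega = 1.0e6, 5e-5                     # m, s^-1
x, y = r * np.cos(theta), r * np.sin(theta)
u, v = -omega * y, omega * x               # tangential wind field
print(mean_vorticity(x, y, u, v))          # recovers ~2*omega = 1e-4 s^-1
```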
Space Wire Upper Layer Protocols
NASA Technical Reports Server (NTRS)
Rakow, Glenn; Schnurr, Richard; Gilley, Daniel; Parkes, Steve
2004-01-01
This viewgraph presentation addresses efforts to provide a streamlined approach for developing SpaceWire Upper layer protocols which allows industry to drive standardized communication solutions for real projects. The presentation proposes a simple packet header that will allow flexibility in implementing a diverse range of protocols.
Just-In-Time Inventory Management; Application and Recommendations for Naval Hospital, Oakland.
1992-12-01
Recommendations include collection of baseline data; break bulk on stored material; emphasis on continuous quality improvement; and streamlined order processing for the prime vendor (PV), including use of existing industry automation to expedite order processing to the prime vendor. The intent of this research is to present the JIT...
A Liquid-Handling Robot for Automated Attachment of Biomolecules to Microbeads.
Enten, Aaron; Yang, Yujia; Ye, Zihan; Chu, Ryan; Van, Tam; Rothschild, Ben; Gonzalez, Francisco; Sulchek, Todd
2016-08-01
Diagnostics, drug delivery, and other biomedical industries rely on cross-linking ligands to microbead surfaces. Microbead functionalization requires multiple steps of liquid exchange, incubation, and mixing, which are laborious and time intensive. Although automated systems exist, they are expensive and cumbersome, limiting their routine use in biomedical laboratories. We present a small, bench-top robotic system that automates microparticle functionalization and streamlines sample preparation. The robot uses a programmable microcontroller to regulate liquid exchange, incubation, and mixing functions. Filters with a pore diameter smaller than the minimum bead diameter are used to prevent bead loss during liquid exchange. The robot uses three liquid reagents and processes up to 10^7 microbeads per batch. The effectiveness of microbead functionalization was compared with a manual covalent coupling process and evaluated via flow cytometry and fluorescent imaging. The mean percentages of successfully functionalized beads were 91% and 92% for the robot and manual methods, respectively, with less than 5% bead loss. Although the two methods share similar qualities, the automated approach required approximately 10 min of active labor, compared with 3 h for the manual approach. These results suggest that a low-cost, automated microbead functionalization system can streamline sample preparation with minimal operator intervention. © 2015 Society for Laboratory Automation and Screening.
Automating payroll, billing, and medical records. Using technology to do more with less.
Vetter, E
1995-08-01
As home care agencies grow, so does the need to streamline the paperwork involved in running an agency. One agency found a way to reduce its payroll, billing, and medical records paperwork by implementing an automated, image-based data collection system that saves time, money, and paper.
Ialongo, Cristiano; Pieri, Massimo; Bernardini, Sergio
2017-02-01
Saving resources is a paramount issue for the modern laboratory, and new trainable as well as smart technologies can be used to allow the automated instrumentation to manage samples more efficiently in order to achieve streamlined processes. In this regard the serum free light chain (sFLC) testing represents an interesting challenge, as it usually requires a number of assays before an acceptable result within the analytical range is achieved. An artificial neural network based on the multi-layer perceptron (MLP-ANN) was used to infer the starting dilution status of sFLC samples based on the information available through the laboratory information system (LIS). After the learning phase, the MLP-ANN simulation was applied to the nephelometric testing routinely performed in our laboratory on a BN ProSpec® System analyzer (Siemens Healthcare) using the N Latex FLC kit. The MLP-ANN reduced the serum kappa free light chain (κ-FLC) and serum lambda free light chain (λ-FLC) wasted tests by 69.4% and 70.8% with respect to the naïve stepwise dilution scheme used by the automated analyzer, and by 64.9% and 66.9% compared to a "rational" dilution scheme based on a 4-step dilution. Although it was restricted to follow-up samples, the MLP-ANN showed good predictive performance, which alongside the possibility to implement it in any automated system, made it a suitable solution for achieving streamlined laboratory processes and saving resources.
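As a hedged sketch of the approach (not the authors' network or features), the fragment below trains a small scikit-learn multi-layer perceptron to suggest a starting dilution index from LIS-style inputs; the feature set, toy data, and class labels are all hypothetical.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Hypothetical LIS-derived features for follow-up sFLC samples:
# [previous kappa-FLC (mg/L), previous lambda-FLC (mg/L), days since last test]
X = np.array([
    [12.0,    15.0, 30],
    [850.0,   10.0, 45],
    [9.0,   1200.0, 60],
    [25.0,    20.0, 90],
    [2300.0,  12.0, 30],
    [14.0,   950.0, 40],
])
y = np.array([0, 2, 2, 0, 3, 2])  # index of the dilution step used last time

model = make_pipeline(StandardScaler(),
                      MLPClassifier(hidden_layer_sizes=(8,), max_iter=2000, random_state=0))
model.fit(X, y)
print(model.predict([[900.0, 11.0, 35]]))  # suggested starting dilution index
```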
A Recipe for Streamlining Mission Management
NASA Technical Reports Server (NTRS)
Mitchell, Andrew E.; Semancik, Susan K.
2004-01-01
This paper describes a project's design and implementation for streamlining mission management with knowledge capture processes across multiple organizations of a NASA directorate. The project's focus is on standardizing processes and reports; enabling secure information access and ease of maintenance; automating and tracking appropriate workflow rules through process mapping; and infusing new technologies. This paper will describe a small team's experiences using XML technologies through an enhanced vendor suite of applications integrated on Windows-based platforms called the Wallops Integrated Scheduling and Document Management System (WISDMS). This paper describes our results using this system in a variety of endeavors, including providing range project scheduling and resource management for a Range and Mission Management Office; implementing an automated Customer Feedback system for a directorate; streamlining mission status reporting across a directorate; and initiating a document management, configuration management and portal access system for a Range Safety Office's programs. The end result is a reduction of the knowledge gap through better integration and distribution of information, improved process performance, automated metric gathering, and quicker identification of problem areas and issues. However, the real proof of the pudding comes through overcoming the user's reluctance to replace familiar, seasoned processes with new technology ingredients blended with automated procedures in an untested recipe. This paper shares some of the team's observations that led to better implementation techniques, as well as an ISO 9001 Best Practices citation. This project has provided a unique opportunity to advance NASA's competency in new technologies, as well as to strategically implement them within an organizational structure, while whetting the appetite for continued improvements in mission management.
CFD Process Pre- and Post-processing Automation in Support of Space Propulsion
NASA Technical Reports Server (NTRS)
Dorney, Suzanne M.
2003-01-01
The use of Computational Fluid Dynamics or CFD has become standard practice in the design and analysis of the major components used for space propulsion. In an attempt to standardize and improve the CFD process a series of automated tools have been developed. Through the use of these automated tools the application of CFD to the design cycle has been improved and streamlined. This paper presents a series of applications in which deficiencies were identified in the CFD process and corrected through the development of automated tools.
PC-based automation system streamlines operations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bowman, J.
1995-10-01
The continued emergence of PC-based automation systems in the modern compressor station is driving the need for personnel who have the special skills needed to support them. However, the dilemma is that operating budget restraints limit the overall number of people available to operate and maintain compressor stations. An ideal solution is to deploy automation systems which can be easily understood and supported by existing compressor station personnel. This paper reviews such a system developed by Waukesha-Pearce Industries, Inc.
Bredfeldt, Christine E; Butani, Amy; Padmanabhan, Sandhyasree; Hitz, Paul; Pardee, Roy
2013-03-22
Multi-site health sciences research is becoming more common, as it enables investigation of rare outcomes and diseases and new healthcare innovations. Multi-site research usually involves the transfer of large amounts of research data between collaborators, which increases the potential for accidental disclosures of protected health information (PHI). Standard protocols for preventing release of PHI are extremely vulnerable to human error, particularly when the shared data sets are large. To address this problem, we developed an automated program (SAS macro) to identify possible PHI in research data before it is transferred between research sites. The macro reviews all data in a designated directory to identify suspicious variable names and data patterns. The macro looks for variables that may contain personal identifiers such as medical record numbers and social security numbers. In addition, the macro identifies dates and numbers that may identify people who belong to small groups, who may be identifiable even in the absence of traditional identifiers. Evaluation of the macro on 100 sample research data sets indicated a recall of 0.98 and precision of 0.81. When implemented consistently, the macro has the potential to streamline the PHI review process and significantly reduce accidental PHI disclosures.
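The macro itself is SAS; the Python sketch below only illustrates the two kinds of checks the abstract describes, flagging identifier-like variable names and SSN-like value patterns. The regexes, column names, and data are hypothetical, not the macro's actual rules.

```python
import re

SUSPICIOUS_NAMES = re.compile(r"(ssn|mrn|med_rec|dob|birth|name|address|phone)", re.I)
SSN_PATTERN = re.compile(r"^\d{3}-?\d{2}-?\d{4}$")

def flag_phi(columns, rows):
    """Return column names that look like identifiers, by name or by value pattern."""
    flagged = set()
    for col in columns:
        if SUSPICIOUS_NAMES.search(col):
            flagged.add(col)
    for row in rows:
        for col, value in zip(columns, row):
            if isinstance(value, str) and SSN_PATTERN.match(value):
                flagged.add(col)
    return sorted(flagged)

# Hypothetical research extract
cols = ["study_id", "enc_date", "mrn", "lab_value", "contact"]
data = [("A001", "2012-03-01", "00123456", 7.4, "123-45-6789")]
print(flag_phi(cols, data))  # ['contact', 'mrn']
```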
Ambulatory surgery centers best practices for the 90s.
Hoover, J A
1994-05-01
Outpatient surgery will be the driving force in the continued growth of ambulatory care in the 1990s. Providing efficient, high-quality ambulatory surgical services should therefore be a priority among healthcare providers. Arthur Andersen conducted a survey to discover best practices in ambulatory surgical service. General success characteristics of best performers were business-focused relationships with physicians, the use of clinical protocols, patient convenience, cost management, strong leadership, teamwork, streamlined processes and efficient design. Other important factors included scheduling to maximize OR room use; achieving surgical efficiencies through reduced case pack assembly errors and equipment availability; a focus on cost capture rather than charge capture; sound materiel management practices, such as standardization and vendor teaming; and the appropriate use of automated systems. It is important to evaluate whether the best practices are applicable to your environment and what specific changes to your current processes would be necessary to adopt them.
Lean and Information Technology Toolkit
The Lean and Information Technology Toolkit is a how-to guide which provides resources to environmental agencies to help them use Lean Startup, Lean process improvement, and Agile tools to streamline and automate processes.
Improving medical stores management through automation and effective communication.
Kumar, Ashok; Cariappa, M P; Marwaha, Vishal; Sharma, Mukti; Arora, Manu
2016-01-01
Medical stores management in hospitals is a tedious and time consuming chore with limited resources tasked for the purpose and poor penetration of Information Technology. The process of automation is slow paced due to various inherent factors and is being challenged by the increasing inventory loads and escalating budgets for procurement of drugs. We carried out an in-depth case study at the Medical Stores of a tertiary care health care facility. An iterative six step Quality Improvement (QI) process was implemented based on the Plan-Do-Study-Act (PDSA) cycle. The QI process was modified as per requirement to fit the medical stores management model. The results were evaluated after six months. After the implementation of QI process, 55 drugs of the medical store inventory which had expired since 2009 onwards were replaced with fresh stock by the suppliers as a result of effective communication through upgraded database management. Various pending audit objections were dropped due to the streamlined documentation and processes. Inventory management improved drastically due to automation, with disposal orders being initiated four months prior to the expiry of drugs and correct demands being generated two months prior to depletion of stocks. The monthly expense summary of drugs was now being done within ten days of the closing month. Improving communication systems within the hospital with vendor database management and reaching out to clinicians is important. Automation of inventory management needs to be simple and user-friendly, utilizing existing hardware. Physical stores monitoring is indispensable, especially due to the scattered nature of stores. Staff training and standardized documentation protocols are the other keystones for optimal medical store management.
Joelsson, Daniel; Gates, Irina V; Pacchione, Diana; Wang, Christopher J; Bennett, Philip S; Zhang, Yuhua; McMackin, Jennifer; Frey, Tina; Brodbeck, Kristin C; Baxter, Heather; Barmat, Scott L; Benetti, Luca; Bodmer, Jean-Luc
2010-06-01
Vaccine manufacturing requires constant analytical monitoring to ensure reliable quality and a consistent safety profile of the final product. Concentration and bioactivity of active components of the vaccine are key attributes routinely evaluated throughout the manufacturing cycle and for product release and dosage. In the case of live attenuated virus vaccines, bioactivity is traditionally measured in vitro by infection of susceptible cells with the vaccine followed by quantification of virus replication, cytopathology or expression of viral markers. These assays are typically multi-day procedures that require trained technicians and constant attention. Considering the need for high volumes of testing, automation and streamlining of these assays is highly desirable. In this study, the automation and streamlining of a complex infectivity assay for Varicella Zoster Virus (VZV) containing test articles is presented. The automation procedure was completed using existing liquid handling infrastructure in a modular fashion, limiting custom-designed elements to a minimum to facilitate transposition. In addition, cellular senescence data provided an optimal population doubling range for long term, reliable assay operation at high throughput. The results presented in this study demonstrate a successful automation paradigm resulting in an eightfold increase in throughput while maintaining assay performance characteristics comparable to the original assay. Copyright 2010 Elsevier B.V. All rights reserved.
Protocols for Automated Protist Analysis
2011-12-01
Report No. CG-D-14-13, December 2011. United States Coast Guard Research & Development Center, 1 Chelsea Street, New London, CT 06320. B. Nelson, et al. Distribution Statement A: Approved for public release; distribution is unlimited.
Streamlined Genome Sequence Compression using Distributed Source Coding
Wang, Shuang; Jiang, Xiaoqian; Chen, Feng; Cui, Lijuan; Cheng, Samuel
2014-01-01
We aim at developing a streamlined genome sequence compression algorithm to support alternative miniaturized sequencing devices, which have limited communication, storage, and computation power. Existing techniques that require heavy client (encoder side) cannot be applied. To tackle this challenge, we carefully examined distributed source coding theory and developed a customized reference-based genome compression protocol to meet the low-complexity need at the client side. Based on the variation between source and reference, our protocol will pick adaptively either syndrome coding or hash coding to compress subsequences of changing code length. Our experimental results showed promising performance of the proposed method when compared with the state-of-the-art algorithm (GRS). PMID:25520552
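A toy illustration of the per-subsequence mode decision described above (divergence from the reference decides between syndrome and hash coding) is sketched below; the threshold and sequences are invented, and no actual distributed source coding is performed.

```python
def choose_mode(source_block, reference_block, syndrome_threshold=0.05):
    """Toy per-block mode decision: low divergence from the reference favors
    syndrome (error-correcting) coding, high divergence favors hash coding."""
    mismatches = sum(a != b for a, b in zip(source_block, reference_block))
    rate = mismatches / len(reference_block)
    return ("syndrome" if rate <= syndrome_threshold else "hash"), rate

print(choose_mode("ACGTACGTAC", "ACGTACGTAC"))  # ('syndrome', 0.0)
print(choose_mode("ACGTTCGAAC", "ACGTACGTAC"))  # ('hash', 0.2)
```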
Building a world-class A/P function: how UPMC went paperless.
DeLuca, Michael; Smith, Corey
2010-03-01
UPMC engaged people, processes, and technology to move its A/P function from a highly manual, paper-based operation to a completely automated process. UPMC's CFO hired a chief supply chain officer to develop a strategic plan, and UPMC named a value analysis team to gain clinician buy-in. UPMC automated A/P by enabling receipt of electronic invoices. UPMC streamlined its processes for invoices.
Standardizing Flow Cytometry Immunophenotyping Analysis from the Human ImmunoPhenotyping Consortium
Finak, Greg; Langweiler, Marc; Jaimes, Maria; Malek, Mehrnoush; Taghiyar, Jafar; Korin, Yael; Raddassi, Khadir; Devine, Lesley; Obermoser, Gerlinde; Pekalski, Marcin L.; Pontikos, Nikolas; Diaz, Alain; Heck, Susanne; Villanova, Federica; Terrazzini, Nadia; Kern, Florian; Qian, Yu; Stanton, Rick; Wang, Kui; Brandes, Aaron; Ramey, John; Aghaeepour, Nima; Mosmann, Tim; Scheuermann, Richard H.; Reed, Elaine; Palucka, Karolina; Pascual, Virginia; Blomberg, Bonnie B.; Nestle, Frank; Nussenblatt, Robert B.; Brinkman, Ryan Remy; Gottardo, Raphael; Maecker, Holden; McCoy, J Philip
2016-01-01
Standardization of immunophenotyping requires careful attention to reagents, sample handling, instrument setup, and data analysis, and is essential for successful cross-study and cross-center comparison of data. Experts developed five standardized, eight-color panels for identification of major immune cell subsets in peripheral blood. These were produced as pre-configured, lyophilized, reagents in 96-well plates. We present the results of a coordinated analysis of samples across nine laboratories using these panels with standardized operating procedures (SOPs). Manual gating was performed by each site and by a central site. Automated gating algorithms were developed and tested by the FlowCAP consortium. Centralized manual gating can reduce cross-center variability, and we sought to determine whether automated methods could streamline and standardize the analysis. Within-site variability was low in all experiments, but cross-site variability was lower when central analysis was performed in comparison with site-specific analysis. It was also lower for clearly defined cell subsets than those based on dim markers and for rare populations. Automated gating was able to match the performance of central manual analysis for all tested panels, exhibiting little to no bias and comparable variability. Standardized staining, data collection, and automated gating can increase power, reduce variability, and streamline analysis for immunophenotyping. PMID:26861911
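Automated gating is, at its simplest, unsupervised clustering of fluorescence intensities. The sketch below fits a two-component Gaussian mixture to synthetic 1-D log-fluorescence data and reports the positive fraction; it is a toy stand-in, not one of the FlowCAP algorithms used in the study.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
# Synthetic 1-D log-fluorescence values: a negative and a positive population
log_fl = np.concatenate([rng.normal(1.0, 0.3, 5000), rng.normal(3.0, 0.4, 1500)])

gmm = GaussianMixture(n_components=2, random_state=0).fit(log_fl.reshape(-1, 1))
labels = gmm.predict(log_fl.reshape(-1, 1))
positive = labels == int(np.argmax(gmm.means_.ravel()))
print(f"positive fraction: {positive.mean():.1%}")  # roughly 1500/6500 in this toy example
```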
Feasibility of Developing a Protocol for Automated Protist Analysis
2010-03-01
Report No. CG-D-02-11, March 2010. United States Coast Guard Acquisition Directorate, Research & Development Center. Available through the National Technical Information Service, Springfield, VA 22161.
Task Decomposition Model for Dispatchers in Dynamic Scheduling of Demand Responsive Transit Systems
DOT National Transportation Integrated Search
2000-06-01
Since the passage of ADA, the demand for paratransit service is steadily increasing. Paratransit companies are relying on computer automation to streamline dispatch operations, increase productivity and reduce operator stress and error. Little resear...
Solar Asset Management Software
DOE Office of Scientific and Technical Information (OSTI.GOV)
Iverson, Aaron; Zviagin, George
Ra Power Management (RPM) has developed a cloud based software platform that manages the financial and operational functions of third party financed solar projects throughout their lifecycle. RPM’s software streamlines and automates the sales, financing, and management of a portfolio of solar assets. The software helps solar developers automate the most difficult aspects of asset management, leading to increased transparency, efficiency, and reduction in human error. More importantly, our platform will help developers save money by improving their operating margins.
A Combined Fabrication and Instrumentation Platform for Sample Preparation.
Guckenberger, David J; Thomas, Peter C; Rothbauer, Jacob; LaVanway, Alex J; Anderson, Meghan; Gilson, Dan; Fawcett, Kevin; Berto, Tristan; Barrett, Kevin; Beebe, David J; Berry, Scott M
2014-06-01
While potentially powerful, access to molecular diagnostics is substantially limited in the developing world. Here we present an approach to reduced cost molecular diagnostic instrumentation that has the potential to empower developing world communities by reducing costs through streamlining the sample preparation process. In addition, this instrument is capable of producing its own consumable devices on demand, reducing reliance on assay suppliers. Furthermore, this instrument is designed with an "open" architecture, allowing users to visually observe the assay process and make modifications as necessary (as opposed to traditional "black box" systems). This open environment enables integration of microfluidic fabrication and viral RNA purification onto an easy-to-use modular system via the use of interchangeable trays. Here we employ this system to develop a protocol to fabricate microfluidic devices and then use these devices to isolate viral RNA from serum for the measurement of human immunodeficiency virus (HIV) viral load. Results obtained from this method show significantly reduced error compared with similar nonautomated sample preparation processes. © 2014 Society for Laboratory Automation and Screening.
ERIC Educational Resources Information Center
Ramaswami, Rama
2008-01-01
Human Resources (HR) administrators are finding that as software modules are installed to automate various processes, they have more time to focus on strategic objectives. And as compliance with affirmative action and other employment regulations comes under increasing scrutiny, HR staffers are finding that software can deliver and track data with…
Expert system issues in automated, autonomous space vehicle rendezvous
NASA Technical Reports Server (NTRS)
Goodwin, Mary Ann; Bochsler, Daniel C.
1987-01-01
The problems involved in automated autonomous rendezvous are briefly reviewed, and the Rendezvous Expert (RENEX) expert system is discussed with reference to its goals, approach used, and knowledge structure and contents. RENEX has been developed to support streamlining operations for the Space Shuttle and Space Station program and to aid definition of mission requirements for the autonomous portions of rendezvous for the Mars Surface Sample Return and Comet Nucleus Sample Return unmanned missions. The experience with RENEX to date and recommendations for further development are presented.
DOT National Transportation Integrated Search
2017-04-15
In this 3-year project, the research team developed the Hydrologic Disaster Forecast and Response (HDFR) system, a set of integrated software tools for end users that streamlines hydrologic prediction workflows involving automated retrieval of hetero...
Advance Approach to Concept and Design Studies for Space Missions
NASA Technical Reports Server (NTRS)
Deutsch, M.; Nichols, J.
1999-01-01
Recent automated and advanced techniques developed at JPL have created a streamlined and fast-track approach to initial mission conceptualization and system architecture design, answering the need for rapid turnaround of trade studies for potential proposers, as well as mission and instrument study groups.
Knirsch, Charles; Alemayehu, Demissie; Botgros, Radu; Comic-Savic, Sabrina; Friedland, David; Holland, Thomas L; Merchant, Kunal; Noel, Gary J; Pelfrene, Eric; Reith, Christina; Santiago, Jonas; Tiernan, Rosemary; Tenearts, Pamela; Goldsack, Jennifer C; Fowler, Vance G
2016-08-15
Hospital-acquired or ventilator-associated bacterial pneumonia (HABP/VABP) is often caused by multidrug-resistant organisms. The evaluation of new antibacterial drugs for efficacy in this population is important, as many antibacterial drugs have demonstrated limitations when studied in these patients. HABP/VABP trials are expensive and challenging to conduct due to protocol complexity and low patient enrollment, among other factors. The Clinical Trials Transformation Initiative (CTTI) seeks to advance antibacterial drug development by streamlining HABP/VABP clinical trials to improve efficiency and feasibility while maintaining ethical rigor, patient safety, information value, and scientific validity. In 2013, CTTI engaged a multidisciplinary group of experts to discuss challenges impeding the conduct of HABP/VABP trials. Separate workstreams identified challenges associated with HABP/VABP protocol complexity. The Project Team developed potential solutions to streamline HABP/VABP trials using a Quality by Design approach. CTTI recommendations focus on 4 key areas to improve HABP/VABP trials: informed consent processes/practices, protocol design, choice of an institutional review board (IRB), and trial outcomes. Informed consent processes should include legally authorized representatives. Protocol design decisions should focus on eligibility criteria, prestudy antibacterial therapy considerations, use of new diagnostics, and sample size. CTTI recommends that sponsors use a central IRB and discuss trial endpoints with regulators, including defining a clinical failure and evaluating the impact of concomitant antibacterial drugs. Streamlining HABP/VABP trials by addressing key protocol elements can improve trial startup and patient recruitment/retention, reduce trial complexity and costs, and ensure patient safety while advancing antibacterial drug development. © The Author 2016. Published by Oxford University Press for the Infectious Diseases Society of America. All rights reserved. For permissions, e-mail journals.permissions@oup.com.
Streamlining Compliance Validation Through Automation Processes
2014-03-01
up to 16 data storage registers. By contrast, the Raspberry Pi is a credit-card-sized computer that is sold for $35 and comes "stock" with 512 MB...
Probst, Yasmine; Morrison, Evan; Sullivan, Emma; Dam, Hoa Khanh
2016-07-28
Standardizing the background diet of participants during a dietary randomized controlled trial is vital to trial outcomes. For this process, dietary modeling based on food groups and their target servings is employed via a dietary prescription before an intervention, often using a manual process. Partial automation has employed the use of linear programming. Validity of the modeling approach is critical to allow trial outcomes to be translated to practice. This paper describes the first-stage development of a tool to automatically perform dietary modeling using food group and macronutrient requirements as a test case. The Dietary Modeling Tool (DMT) was then compared with existing approaches to dietary modeling (manual and partially automated), which were previously available to dietitians working within a dietary intervention trial. Constraint optimization techniques were implemented to determine whether nonlinear constraints are best suited to the development of the automated dietary modeling tool using food composition and food consumption data. Dietary models were produced and compared with a manual Microsoft Excel calculator, a partially automated Excel Solver approach, and the automated DMT that was developed. The web-based DMT was produced using nonlinear constraint optimization, incorporating estimated energy requirement calculations, nutrition guidance systems, and the flexibility to amend food group targets for individuals. Percentage differences between modeling tools revealed similar results for the macronutrients. Polyunsaturated fatty acids and monounsaturated fatty acids showed greater variation between tools (practically equating to a 2-teaspoon difference), although it was not considered clinically significant when the whole diet, as opposed to targeted nutrients or energy requirements, were being addressed. Automated modeling tools can streamline the modeling process for dietary intervention trials ensuring consistency of the background diets, although appropriate constraints must be used in their development to achieve desired results. The DMT was found to be a valid automated tool producing similar results to tools with less automation. The results of this study suggest interchangeability of the modeling approaches used, although implementation should reflect the requirements of the dietary intervention trial in which it is used.
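The constrained-optimization core of such a tool can be sketched with SciPy: pick food-group servings that hit macronutrient targets while staying close to guideline servings. The per-serving composition values, targets, and bounds below are invented for illustration and are not the DMT's data or constraint set.

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical per-serving composition (protein, fat, carbohydrate in grams)
# for four food groups: grains, meat/alternatives, dairy, vegetables
composition = np.array([
    [3.0,  1.0, 15.0],   # grains
    [10.0, 5.0,  0.0],   # meat and alternatives
    [8.0,  4.0, 12.0],   # dairy
    [2.0,  0.0,  5.0],   # vegetables
])
targets = np.array([95.0, 40.0, 230.0])     # daily macronutrient targets (g)
preferred = np.array([6.0, 2.5, 2.5, 5.0])  # guideline serving targets

def objective(servings):
    # Stay as close as possible to the guideline servings
    return np.sum((servings - preferred) ** 2)

constraints = [{"type": "eq", "fun": lambda s, j=j: composition[:, j] @ s - targets[j]}
               for j in range(3)]
bounds = [(0.0, 15.0)] * 4

result = minimize(objective, preferred, method="SLSQP", bounds=bounds, constraints=constraints)
print(np.round(result.x, 2), result.success)  # servings per food group meeting the targets
```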
Distance-Based and Low Energy Adaptive Clustering Protocol for Wireless Sensor Networks
Gani, Abdullah; Anisi, Mohammad Hossein; Ab Hamid, Siti Hafizah; Akhunzada, Adnan; Khan, Muhammad Khurram
2016-01-01
A wireless sensor network (WSN) comprises small sensor nodes with limited energy capabilities. The power constraints of WSNs necessitate efficient energy utilization to extend the overall network lifetime of these networks. We propose a distance-based and low-energy adaptive clustering (DISCPLN) protocol to streamline the green issue of efficient energy utilization in WSNs. We also enhance our proposed protocol into the multi-hop-DISCPLN protocol to increase the lifetime of the network in terms of high throughput with minimum delay time and packet loss. We also propose the mobile-DISCPLN protocol to maintain the stability of the network. The modelling and comparison of these protocols with their corresponding benchmarks exhibit promising results. PMID:27658194
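The abstract does not give the DISCPLN election rule, so the sketch below is only a LEACH-style stand-in showing how residual energy and distance to the base station can weight cluster-head election; the threshold formula, node layout, and parameters are assumptions.

```python
import math
import random

def elect_cluster_heads(nodes, base_station, p=0.1, rng=random.Random(42)):
    """Toy distance- and energy-aware cluster-head election: a node's chance of
    becoming a head grows with residual energy and shrinks with distance to the
    base station (not the actual DISCPLN threshold)."""
    max_d = max(math.dist(n["pos"], base_station) for n in nodes)
    heads = []
    for n in nodes:
        d = math.dist(n["pos"], base_station)
        threshold = p * n["energy"] * (1.0 - d / (2.0 * max_d))
        if rng.random() < threshold:
            heads.append(n["id"])
    return heads

gen = random.Random(7)
nodes = [{"id": i, "pos": (gen.uniform(0, 100), gen.uniform(0, 100)),
          "energy": gen.uniform(0.5, 1.0)} for i in range(50)]
print(elect_cluster_heads(nodes, base_station=(50.0, 120.0)))
```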
Web-Based Time Entry Systems: Providing Greater Automation and Compliance
ERIC Educational Resources Information Center
Williams, Tracy
2005-01-01
Time and resources are becoming increasingly scarce in most higher education institutions today. As a result, colleges and universities are looking to streamline and simplify many costly, labor-intensive administrative processes. In this article, Tracy Williams examines how Web-based time-entry systems can help institutions save valuable time and…
Manpower Requirements Report FY 1994
1993-06-01
decrease in FY 1993 is primarily due to reductions in advanced weapons (-144), aerospace avionics (-48), materials (-37), and test and evaluation support...subsistence, medical goods, industrial and construction material, general and electronic supplies, and petroleum products. Logistic services include...efficiencies resulting from streamlining depots, modernizing/automating materials handling, and a projected decline in contract administration and
Automating Security Protocol Analysis
2004-03-01
language that allows easy representation of pattern interaction. Using CSP, Lowe tests whether a protocol achieves authentication. In the case of...only to correctly code whatever protocol they intend to evaluate. The tool, OCaml 3.04 [1], translates the protocol into Horn clauses and then...model protocol transactions. One example of automated modeling software is Maude [19]. Maude was the intended language for this research, but Java
Streamlining Collaborative Planning in Spacecraft Mission Architectures
NASA Technical Reports Server (NTRS)
Misra, Dhariti; Bopf, Michel; Fishman, Mark; Jones, Jeremy; Kerbel, Uri; Pell, Vince
2000-01-01
During the past two decades, the planning and scheduling community has substantially increased the capability and efficiency of individual planning and scheduling systems. Relatively recently, research work to streamline collaboration between planning systems is gaining attention. Spacecraft missions stand to benefit substantially from this work as they require the coordination of multiple planning organizations and planning systems. Up to the present time this coordination has demanded a great deal of human intervention and/or extensive custom software development efforts. This problem will become acute with increased requirements for cross-mission plan coordination and multi-spacecraft mission planning. The Advanced Architectures and Automation Branch of NASA's Goddard Space Flight Center is taking innovative steps to define collaborative planning architectures, and to identify coordinated planning tools for Cross-Mission Campaigns. Prototypes are being developed to validate these architectures and assess the usefulness of the coordination tools by the planning community. This presentation will focus on one such planning coordination tool, named Visual Observation Layout Tool (VOLT), which is currently being developed to streamline the coordination between astronomical missions.
Accurate, Streamlined Analysis of mRNA Translation by Sucrose Gradient Fractionation
Aboulhouda, Soufiane; Di Santo, Rachael; Therizols, Gabriel; Weinberg, David
2017-01-01
The efficiency with which proteins are produced from mRNA molecules can vary widely across transcripts, cell types, and cellular states. Methods that accurately assay the translational efficiency of mRNAs are critical to gaining a mechanistic understanding of post-transcriptional gene regulation. One way to measure translational efficiency is to determine the number of ribosomes associated with an mRNA molecule, normalized to the length of the coding sequence. The primary method for this analysis of individual mRNAs is sucrose gradient fractionation, which physically separates mRNAs based on the number of bound ribosomes. Here, we describe a streamlined protocol for accurate analysis of mRNA association with ribosomes. Compared to previous protocols, our method incorporates internal controls and improved buffer conditions that together reduce artifacts caused by non-specific mRNA–ribosome interactions. Moreover, our direct-from-fraction qRT-PCR protocol eliminates the need for RNA purification from gradient fractions, which greatly reduces the amount of hands-on time required and facilitates parallel analysis of multiple conditions or gene targets. Additionally, no phenol waste is generated during the procedure. We initially developed the protocol to investigate the translationally repressed state of the HAC1 mRNA in S. cerevisiae, but we also detail adapted procedures for mammalian cell lines and tissues. PMID:29170751
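A small calculation often paired with this protocol is turning per-fraction qRT-PCR signals into an average ribosome load per mRNA. The sketch below assumes a spike-in control in every fraction and uses 2^-ΔCt weighting; the Ct values and fraction assignments are hypothetical, not data from the paper.

```python
import numpy as np

def ribosome_load(target_cts, spikein_cts, ribosomes_per_fraction):
    """Average ribosomes per mRNA from direct-from-fraction qRT-PCR: abundance in
    each gradient fraction is estimated as 2^-(Ct_target - Ct_spikein) relative to
    a spike-in control, then used to weight the ribosome number of that fraction."""
    rel = 2.0 ** -(np.asarray(target_cts) - np.asarray(spikein_cts))
    weights = rel / rel.sum()
    return float(np.sum(weights * np.asarray(ribosomes_per_fraction)))

# Hypothetical 8-fraction gradient: free mRNP, 80S monosome, then 2-7 ribosomes
target_ct  = [26.1, 25.4, 24.9, 24.5, 24.8, 25.6, 26.8, 28.0]
spikein_ct = [21.0, 21.1, 20.9, 21.0, 21.1, 21.0, 20.9, 21.0]
ribosomes  = [0, 1, 2, 3, 4, 5, 6, 7]
print(f"{ribosome_load(target_ct, spikein_ct, ribosomes):.2f} ribosomes per mRNA")
```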
NASA Astrophysics Data System (ADS)
Yu, Peter; Eyles, Nick; Sookhan, Shane
2015-10-01
Resolving the origin(s) of drumlins and related megaridges in areas of megascale glacial lineations (MSGL) left by paleo-ice sheets is critical to understanding how ancient ice sheets interacted with their sediment beds. MSGL is now linked with fast-flowing ice streams but there is a broad range of erosional and depositional models. Further progress is reliant on constraining fluxes of subglacial sediment at the ice sheet base which in turn is dependent on morphological data such as landform shape and elongation and most importantly landform volume. Past practice in determining shape has employed a broad range of geomorphological methods from strictly visualisation techniques to more complex semi-automated and automated drumlin extraction methods. This paper reviews and builds on currently available visualisation, semi-automated and automated extraction methods and presents a new Curvature Based Relief Separation (CBRS) technique for drumlin mapping. This uses curvature analysis to generate a base level from which topography can be normalized and drumlin volume can be derived. This methodology is tested using a high-resolution (3 m) LiDAR elevation dataset from the Wadena Drumlin Field, Minnesota, USA, which was constructed by the Wadena Lobe of the Laurentide Ice Sheet ca. 20,000 years ago and which as a whole contains 2000 drumlins across an area of 7500 km². This analysis demonstrates that CBRS provides an objective and robust procedure for automated drumlin extraction. There is strong agreement with manually selected landforms but the method is also capable of resolving features that were not detectable manually, thereby considerably expanding the known population of streamlined landforms. CBRS provides an effective automatic method for visualisation of large areas of the streamlined beds of former ice sheets and for modelling sediment fluxes below ice sheets.
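The published CBRS method derives its base level from curvature analysis; the toy sketch below substitutes a heavily smoothed surface as the base level simply to illustrate the normalize-then-integrate idea for landform volume, using a synthetic DEM rather than the Wadena LiDAR data.

```python
import numpy as np
from scipy import ndimage

def normalized_relief(dem, smooth_sigma=10):
    """Approximate a base level (here by Gaussian smoothing, as a stand-in for the
    curvature-derived surface) and subtract it so positive-relief landforms remain."""
    base = ndimage.gaussian_filter(dem, sigma=smooth_sigma)
    return dem - base

# Synthetic 3 m DEM patch: a gently sloping plain with one drumlin-like bump
y, x = np.mgrid[0:200, 0:200]
dem = 0.01 * x + 5.0 * np.exp(-(((x - 100) / 40.0) ** 2 + ((y - 100) / 15.0) ** 2))
relief = normalized_relief(dem)
cell_area = 3.0 * 3.0                            # m^2 per 3 m LiDAR cell
volume = relief[relief > 0.5].sum() * cell_area  # integrate residual relief above 0.5 m
print(f"landform volume above 0.5 m residual relief: {volume:.0f} m^3")
```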
An automated perfusion bioreactor for the streamlined production of engineered osteogenic grafts.
Ding, Ming; Henriksen, Susan S; Wendt, David; Overgaard, Søren
2016-04-01
A computer-controlled perfusion bioreactor was developed for the streamlined production of engineered osteogenic grafts. This system automated the required bioprocesses, from the initial filling of the system through the phases of cell seeding and prolonged cell/tissue culture. Flow-through chemo-optic micro-sensors allowed the levels of oxygen and pH in the perfused culture medium to be monitored non-invasively throughout the culture period. To validate its performance, freshly isolated ovine bone marrow stromal cells were directly seeded on porous scaffold granules (hydroxyapatite/β-tricalcium-phosphate/poly-lactic acid), bypassing the phase of monolayer cell expansion in flasks. Either 10 or 20 days after culture, engineered cell-granule grafts were implanted in an ectopic mouse model to quantify new bone formation. After four weeks of implantation, histomorphometry showed more bone in bioreactor-generated grafts than in cell-free granule controls, while bone formation did not show significant differences between 10 days and 20 days of incubation. The implanted granules without cells had no bone formation. This novel perfusion bioreactor has demonstrated the capability of activating larger viable bone graft material, even after a shorter incubation time of the graft material. This study has demonstrated the feasibility of engineering osteogenic grafts in an automated bioreactor system, laying the foundation for a safe, regulatory-compliant, and cost-effective manufacturing process. © 2015 Wiley Periodicals, Inc.
Ko, Dae-Hyun; Ji, Misuk; Kim, Sollip; Cho, Eun-Jung; Lee, Woochang; Yun, Yeo-Min; Chun, Sail; Min, Won-Ki
2016-01-01
The results of urine sediment analysis have traditionally been reported semiquantitatively. However, as recent guidelines recommend quantitative reporting of urine sediment, and with the development of automated urine sediment analyzers, there is an increasing need for quantitative analysis of urine sediment. Here, we developed a protocol for urine sediment analysis and quantified the results. Based on questionnaires, various reports, guidelines, and experimental results, we developed a protocol for urine sediment analysis. The results of this new protocol were compared with those obtained with a standardized chamber and an automated sediment analyzer. Reference intervals were also estimated using the new protocol. We developed a protocol with centrifugation at 400 g for 5 min and an average concentration factor of 30. The correlations between the quantitative results of the new protocol, the standardized chamber, and the automated sediment analyzer were generally good. The conversion factor derived from the new protocol showed a better fit with the results of manual counting than the default conversion factor in the automated sediment analyzer. We developed a protocol for manual urine sediment analysis to report the results quantitatively. This protocol may provide a means for standardization of urine sediment analysis.
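The arithmetic behind converting a manual chamber count back to cells per microliter of uncentrifuged urine, given a concentration factor such as the 30-fold factor described above, can be sketched as follows. The function name, the chamber field volume, and the example numbers are illustrative assumptions, not the published conversion factors.

```python
def cells_per_microliter(cells_counted, fields_counted, field_volume_ul,
                         concentration_factor=30.0):
    """Convert a manual sediment count to cells per microliter of uncentrifuged urine.

    Assumes `field_volume_ul` is the chamber volume examined per field and
    `concentration_factor` is the fold-concentration produced by centrifugation
    (30x in the protocol summarised above). Illustrative only.
    """
    counted_volume_ul = fields_counted * field_volume_ul
    concentration_in_sediment = cells_counted / counted_volume_ul
    return concentration_in_sediment / concentration_factor

# Example: 60 cells counted over 10 fields of 0.1 uL each, 30-fold concentration.
print(f"{cells_per_microliter(60, 10, 0.1):.1f} cells/uL")  # -> 2.0
```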
Biocoder: A programming language for standardizing and automating biology protocols
2010-01-01
Background Published descriptions of biology protocols are often ambiguous and incomplete, making them difficult to replicate in other laboratories. However, there is increasing benefit to formalizing the descriptions of protocols, as laboratory automation systems (such as microfluidic chips) are becoming increasingly capable of executing them. Our goal in this paper is to improve both the reproducibility and automation of biology experiments by using a programming language to express the precise series of steps taken. Results We have developed BioCoder, a C++ library that enables biologists to express the exact steps needed to execute a protocol. In addition to being suitable for automation, BioCoder converts the code into a readable, English-language description for use by biologists. We have implemented over 65 protocols in BioCoder; the most complex of these was successfully executed by a biologist in the laboratory using BioCoder as the only reference. We argue that BioCoder exposes and resolves ambiguities in existing protocols, and could provide the software foundations for future automation platforms. BioCoder is freely available for download at http://research.microsoft.com/en-us/um/india/projects/biocoder/. Conclusions BioCoder represents the first practical programming system for standardizing and automating biology protocols. Our vision is to change the way that experimental methods are communicated: rather than publishing a written account of the protocols used, researchers will simply publish the code. Our experience suggests that this practice is tractable and offers many benefits. We invite other researchers to leverage BioCoder to improve the precision and completeness of their protocols, and also to adapt and extend BioCoder to new domains. PMID:21059251
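BioCoder itself is a C++ library, and the sketch below does not use its actual API; it only illustrates the protocol-as-code idea the abstract describes, in Python with hypothetical class and method names: a single encoding of the steps can render both an English-language narrative for a bench scientist and a machine-readable step list for an automation platform.

```python
"""Protocol-as-code sketch in the spirit of BioCoder (hypothetical API,
not the real C++ library): each step renders English text and can also be
emitted as a machine-readable instruction."""
from dataclasses import dataclass, field
from typing import List

@dataclass
class Protocol:
    name: str
    steps: List[dict] = field(default_factory=list)

    def add(self, action, **params):
        self.steps.append({"action": action, **params})

    def to_english(self):
        lines = [f"Protocol: {self.name}"]
        for i, step in enumerate(self.steps, 1):
            params = ", ".join(f"{k}={v}" for k, v in step.items() if k != "action")
            lines.append(f"  {i}. {step['action']} ({params})")
        return "\n".join(lines)

    def to_machine(self):
        # The same step list could be handed to a liquid-handling robot driver.
        return list(self.steps)

p = Protocol("Miniprep (fragment)")
p.add("add_reagent", reagent="resuspension buffer", volume_ul=250, target="pellet")
p.add("vortex", duration_s=30)
p.add("centrifuge", speed_g=12000, duration_min=10)
print(p.to_english())
```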
Lohmann, Amanda R; Carlson, Matthew L; Sladen, Douglas P
2018-03-01
Intraoperative cochlear implant device testing provides valuable information regarding device integrity, electrode position, and may assist with determining initial stimulation settings. Manual intraoperative device testing during cochlear implantation requires the time and expertise of a trained audiologist. The purpose of the current study is to investigate the feasibility of using automated remote intraoperative cochlear implant reverse telemetry testing as an alternative to standard testing. Prospective pilot study evaluating intraoperative remote automated impedance and Automatic Neural Response Telemetry (AutoNRT) testing in 34 consecutive cochlear implant surgeries using the Intraoperative Remote Assistant (Cochlear Nucleus CR120). In all cases, remote intraoperative device testing was performed by trained operating room staff. A comparison was made to the "gold standard" of manual testing by an experienced cochlear implant audiologist. Electrode position and absence of tip fold-over was confirmed using plain film x-ray. Automated remote reverse telemetry testing was successfully completed in all patients. Intraoperative x-ray demonstrated normal electrode position without tip fold-over. Average impedance values were significantly higher using standard testing versus CR120 remote testing (standard mean 10.7 kΩ, SD 1.2 vs. CR120 mean 7.5 kΩ, SD 0.7, p < 0.001). There was strong agreement between standard manual testing and remote automated testing with regard to the presence of open or short circuits along the array. There were, however, two cases in which standard testing identified an open circuit, when CR120 testing showed the circuit to be closed. Neural responses were successfully obtained in all patients using both systems. There was no difference in basal electrode responses (standard mean 195.0 μV, SD 14.10 vs. CR120 194.5 μV, SD 14.23; p = 0.7814); however, more favorable (lower μV amplitude) results were obtained with the remote automated system in the apical 10 electrodes (standard 185.4 μV, SD 11.69 vs. CR120 177.0 μV, SD 11.57; p value < 0.001). These preliminary data demonstrate that intraoperative cochlear implant device testing using a remote automated system is feasible. This system may be useful for cochlear implant programs with limited audiology support or for programs looking to streamline intraoperative device testing protocols. Future studies with larger patient enrollment are required to validate these promising, but preliminary, findings.
Integrated, automated revenue management for managed care contracts.
Burckhart, Kent
2002-04-01
Faced with increasing managed care penetration and declining net revenue in recent years, healthcare providers increasingly are emphasizing revenue management. To streamline processes and reduce costs in this area, many healthcare providers have implemented or are considering automated contract management systems. When selecting such a system, healthcare financial managers should make certain that the system can interface with both patient-accounting and decision-support systems of the organization. This integration enhances a healthcare provider's financial viability by providing integrated revenue-management capabilities to analyze projected performance of proposed managed care contracts and actual performance of existing contracts.
Automated Weaning from Mechanical Ventilation after Off-Pump Coronary Artery Bypass Grafting.
Fot, Evgenia V; Izotova, Natalia N; Yudina, Angelika S; Smetkin, Aleksei A; Kuzkov, Vsevolod V; Kirov, Mikhail Y
2017-01-01
The discontinuation of mechanical ventilation after coronary surgery may be prolonged and may significantly increase the load on intensive care unit personnel. We hypothesized that an automated mode using INTELLiVENT-ASV could decrease the duration of postoperative mechanical ventilation, reduce the workload on medical staff, and provide safe ventilation after off-pump coronary artery bypass grafting (OPCAB). The primary endpoint of our study was to assess the duration of postoperative mechanical ventilation during different modes of weaning from respiratory support (RS) after OPCAB. The secondary endpoint was to assess the safety of the automated weaning mode and the number of manual interventions to the ventilator settings during the weaning process in comparison with the protocolized weaning mode. Forty adult patients undergoing elective OPCAB were enrolled into a prospective single-center study. Patients were randomized into two groups: automated weaning (n = 20) using INTELLiVENT-ASV mode with the quick-wean option; and protocolized weaning (n = 20), using conventional synchronized intermittent mandatory ventilation (SIMV) + pressure support (PS) mode. We assessed the duration of postoperative ventilation, the incidence and duration of unacceptable RS, and the load on medical staff. We also performed a retrospective analysis of 102 patients (standard weaning) who were weaned from the ventilator with SIMV + PS mode based on the physician's experience, without a prearranged algorithm. Realization of the automated weaning protocol required a change in respiratory settings in only 2 patients vs. 7 (5-9) adjustments per patient in the protocolized weaning group. Both the incidence and duration of unacceptable RS were reduced significantly by means of the automated weaning approach. The FiO2 during spontaneous breathing trials was significantly lower in the automated weaning group: 30 (30-35) vs. 40 (40-45) % in the protocolized weaning group (p < 0.01). The average time until tracheal extubation did not differ between the automated weaning and the protocolized weaning groups: 193 (115-309) and 197 (158-253) min, respectively, but increased to 290 (210-411) min in the standard weaning group. The automated weaning system after off-pump coronary surgery might provide postoperative ventilation in a more protective way, reduce the workload on medical staff, and not prolong the duration of weaning from the ventilator. The use of automated or protocolized weaning can reduce the duration of postoperative mechanical ventilation in comparison with non-protocolized weaning based on the physician's decision.
Goyal, Manu S; Hoff, Brian G; Williams, Jennifer; Khoury, Naim; Wiesehan, Rebecca; Heitsch, Laura; Panagos, Peter; Vo, Katie D; Benzinger, Tammie; Derdeyn, Colin P; Lee, Jin-Moo; Ford, Andria L
2016-04-01
Stroke mimics (SM) challenge the initial assessment of patients presenting with possible acute ischemic stroke (AIS). When SM is considered likely, intravenous tissue-type plasminogen activator (tPA) may be withheld, risking an opportunity to treat AIS. Although computed tomography is routinely used for tPA decision making, magnetic resonance imaging (MRI) may diagnose AIS when SM is favored but not certain. We hypothesized that a hyperacute MRI (hMRI) protocol would identify tPA-eligible AIS patients among those initially favored to have SM. A streamlined hMRI protocol was designed based on barriers to rapid patient transport, MRI acquisition, and post-MRI tPA delivery. Neurologists were trained to order hMRI when SM was favored and tPA was being withheld. The use of hMRI for tPA decision making, door-to-needle times, and outcomes were compared before hMRI implementation (pre-hMRI: August 1, 2011 to July 31, 2013) and after (post-hMRI, August 1, 2013, to January 15, 2015). Post hMRI, 57 patients with suspected SM underwent hMRI (median MRI-order-to-start time, 29 minutes), of whom, 11 (19%) were diagnosed with AIS and 7 (12%) received tPA. Pre-hMRI, no tPA-treated patients were screened with hMRI. Post hMRI, 7 of 106 (6.6%) tPA-treated patients underwent hMRI to aid in decision making because of suspected SM (0% versus 6.6%; P=0.001). To ensure standard care was maintained after implementing the hMRI protocol, pre- versus post-hMRI tPA-treated cohorts were compared and did not differ: door-to-needle time (39 versus 37 minutes; P=0.63), symptomatic hemorrhage rate (4.5% versus 1.9%; P=0.32), and favorable discharge location (85% versus 89%; P=0.37). A streamlined hMRI protocol permitted tPA administration to a small, but significant, subset of AIS patients initially considered to have SM. © 2016 American Heart Association, Inc.
Streamlining the Process of Acquiring Secure Open Architecture Software Systems
2013-10-08
Microsoft.NET, Enterprise Java Beans, GNU Lesser General Public License (LGPL) libraries, and data communication protocols like the Hypertext Transfer...NetBeans development environments), customer relationship management (SugarCRM), database management systems (PostgreSQL, MySQL ), operating
DOE Office of Scientific and Technical Information (OSTI.GOV)
Granderson, Jessica; Touzani, Samir; Taylor, Cody
Trustworthy savings calculations are critical to convincing regulators of both the cost-effectiveness of energy efficiency program investments and their ability to defer supply-side capital investments. Today’s methods for measurement and verification (M&V) of energy savings constitute a significant portion of the total costs of energy efficiency programs. They also require time-consuming data acquisition. A spectrum of savings calculation approaches is used, with some relying more heavily on measured data and others relying more heavily on estimated, modeled, or stipulated data. The rising availability of “smart” meters and devices that report near-real time data, combined with new analytical approaches to quantifying savings, offers potential to conduct M&V more quickly and at lower cost, with comparable or improved accuracy. Commercial energy management and information systems (EMIS) technologies are beginning to offer M&V capabilities, and program administrators want to understand how they might assist programs in quickly and accurately measuring energy savings. This paper presents the results of recent testing of the ability to use automation to streamline some parts of M&V. In this paper, we detail metrics to assess the performance of these new M&V approaches, and a framework to compute the metrics. We also discuss the accuracy, cost, and time trade-offs between more traditional M&V, and these emerging streamlined methods that use high-resolution energy data and automated computational intelligence. Finally, we discuss the potential evolution of M&V and early results of pilots currently underway to incorporate M&V automation into ratepayer-funded programs and professional implementation and evaluation practice.
Agile based "Semi-"Automated Data ingest process : ORNL DAAC example
NASA Astrophysics Data System (ADS)
Santhana Vannan, S. K.; Beaty, T.; Cook, R. B.; Devarakonda, R.; Hook, L.; Wei, Y.; Wright, D.
2015-12-01
The ORNL DAAC archives and publishes data and information relevant to biogeochemical, ecological, and environmental processes. The data archived at the ORNL DAAC must be well formatted, self-descriptive, and documented, as well as referenced in a peer-reviewed publication. The ORNL DAAC ingest team curates diverse data sets from multiple data providers simultaneously. To streamline the ingest process, the data set submission process at the ORNL DAAC has recently been updated to use an agile process, and a semi-automated workflow system has been developed to provide a consistent data provider experience and to create a uniform data product. The goals of the semi-automated agile ingest process are to: (1) provide the ability to track a data set from acceptance to publication; (2) automate steps that can be automated to improve efficiency and reduce redundancy; (3) update legacy ingest infrastructure; and (4) provide a centralized system to manage the various aspects of ingest. This talk will cover the agile methodology, workflow, and tools developed through this system.
A LOTUS NOTES APPLICATION FOR PREPARING, REVIEWING, AND STORING NHEERL RESEARCH PROTOCOLS
Upon becoming QA Manager of the Health Effects Research Laboratory (HERL) in 1990, Ron became aware of the need to simplify and streamline the research planning process that Laboratory Principal Investigators (Pls) faced. Appropriately, animal studies require close scrutiny, both...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hunt, D.N.
1997-02-01
The Information Engineering thrust area develops information technology to support the programmatic needs of Lawrence Livermore National Laboratory's Engineering Directorate. Progress in five programmatic areas is described in separate reports contained herein. These are entitled Three-dimensional Object Creation, Manipulation, and Transport; Zephyr: A Secure Internet-Based Process to Streamline Engineering Procurements; Subcarrier Multiplexing: Optical Network Demonstrations; Parallel Optical Interconnect Technology Demonstration; and Intelligent Automation Architecture.
Lighting Automation - Flying an Earthlike Habit Project
NASA Technical Reports Server (NTRS)
Falker, Jay; Howard, Ricky; Culbert, Christopher; Clark, Toni Anne; Kolomenski, Andrei
2017-01-01
Our proposal will enable the development of automated spacecraft habitats for long duration missions. The majority of spacecraft lighting systems employ lamps or zone-specific switches and dimmers; automation is not in the "picture". If we are to build long duration environments which provide earth-like habitats, minimize crew time, and optimize spacecraft power reserves, innovation in lighting automation is a must. To transform how spacecraft lighting environments are automated, we will provide performance data on a standard lighting communication protocol. We will investigate utilization and application of an industry-accepted lighting control protocol, DMX512. We will demonstrate how lighting automation can conserve power, assist with lighting countermeasures, and utilize spatial body tracking. By using DMX512 we will show that the "wheel" does not need to be reinvented in terms of smart lighting and that future spacecraft can use a standard lighting protocol to produce an effective, optimized and potentially earthlike habitat.
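For context on the protocol named above: a DMX512 universe carries up to 512 one-byte channel levels per frame, preceded by a start code. The sketch below only assembles that byte payload in Python; the serial transport details (break, mark-after-break, 250 kbaud 8N2) are left to the interface driver, and the function name and channel patch are illustrative assumptions rather than any flight implementation.

```python
def build_dmx_frame(channel_levels):
    """Return a DMX512-style data packet: a zero start code followed by up to
    512 one-byte channel levels (0-255). Only the byte payload is assembled;
    electrical/serial framing is handled by the DMX interface itself."""
    if len(channel_levels) > 512:
        raise ValueError("a DMX512 universe carries at most 512 channels")
    levels = bytes(max(0, min(255, int(v))) for v in channel_levels)
    return bytes([0x00]) + levels  # 0x00 = standard start code for dimmer data

# Example: an RGB fixture patched at channels 1-3, set to a warm white.
universe = [0] * 512
universe[0:3] = [255, 200, 120]        # R, G, B levels
frame = build_dmx_frame(universe)
print(len(frame), frame[:4].hex())     # -> 513 00ffc878
```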
2004-12-01
handling using the X10 home automation protocol. Each 3D graphics client renders its scene according to an assigned virtual camera position. By having...control protocol. DMX is a versatile and robust framework which overcomes limitations of the X10 home automation protocol which we are currently using
Williams, Diana L; Adams, Linda B; Lahiri, Ramanuj
2014-10-01
Mycobacterium leprae, etiologic agent of leprosy, is propagated in athymic nude mouse footpads (FPs). The current purification protocol is tedious and physically demanding. A simpler, semi-automated protocol was developed using gentleMACS™ Octo Dissociator. The gentleMACS protocol provided a very effective means for purification of highly viable M. leprae from tissue. Copyright © 2014. Published by Elsevier B.V.
Application of automated measurement and verification to utility energy efficiency program data
Granderson, Jessica; Touzani, Samir; Fernandes, Samuel; ...
2017-02-17
Trustworthy savings calculations are critical to convincing regulators of both the cost-effectiveness of energy efficiency program investments and their ability to defer supply-side capital investments. Today’s methods for measurement and verification (M&V) of energy savings constitute a significant portion of the total costs of energy efficiency programs. They also require time-consuming data acquisition. A spectrum of savings calculation approaches is used, with some relying more heavily on measured data and others relying more heavily on estimated, modeled, or stipulated data. The increasing availability of “smart” meters and devices that report near-real time data, combined with new analytical approaches to quantify savings, offers the potential to conduct M&V more quickly and at lower cost, with comparable or improved accuracy. Commercial energy management and information systems (EMIS) technologies are beginning to offer these ‘M&V 2.0’ capabilities, and program administrators want to understand how they might assist programs in quickly and accurately measuring energy savings. This paper presents the results of recent testing of the ability to use automation to streamline the M&V process. In this paper, we apply an automated whole-building M&V tool to historic data sets from energy efficiency programs to begin to explore the accuracy, cost, and time trade-offs between more traditional M&V, and these emerging streamlined methods that use high-resolution energy data and automated computational intelligence. For the data sets studied we evaluate the fraction of buildings that are well suited to automated baseline characterization, the uncertainty in gross savings that is due to M&V 2.0 tools’ model error, and indications of labor time savings, and how the automated savings results compare to prior, traditionally determined savings results. The results show that 70% of the buildings were well suited to the automated approach. In a majority of the cases (80%) savings and uncertainties for each individual building were quantified to levels above the criteria in ASHRAE Guideline 14. In addition the findings suggest that M&V 2.0 methods may also offer time-savings relative to traditional approaches. Lastly, we discuss the implications of these findings relative to the potential evolution of M&V, and pilots currently being launched to test how M&V automation can be integrated into ratepayer-funded programs and professional implementation and evaluation practice.
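The core mechanism behind automated whole-building M&V can be illustrated with a toy baseline regression: fit the pre-retrofit load against temperature features, predict the post-retrofit period, and report avoided energy use together with a goodness-of-fit statistic in the spirit of ASHRAE Guideline 14. The model form, the 65 °F balance point, the function names, and the synthetic data below are illustrative assumptions, not the tool evaluated in the paper.

```python
import numpy as np

def baseline_and_savings(temp_pre, load_pre, temp_post, load_post):
    """Toy whole-building M&V: fit load ~ a + b*T + c*max(T-65, 0) on the
    pre-period, predict the post-period, and report avoided energy use."""
    def design(temp):
        t = np.asarray(temp, dtype=float)
        return np.column_stack([np.ones_like(t), t, np.clip(t - 65.0, 0, None)])

    y_pre = np.asarray(load_pre, dtype=float)
    X_pre = design(temp_pre)
    beta, *_ = np.linalg.lstsq(X_pre, y_pre, rcond=None)

    # Baseline goodness of fit; CV(RMSE) is one Guideline-14-style metric.
    resid = y_pre - X_pre @ beta
    cv_rmse = np.sqrt(np.mean(resid ** 2)) / np.mean(y_pre)

    predicted_post = design(temp_post) @ beta
    avoided_kwh = float(np.sum(predicted_post - np.asarray(load_post, dtype=float)))
    return avoided_kwh, cv_rmse

# Synthetic hourly example: the "post" load is 10% below the baseline relationship.
rng = np.random.default_rng(0)
temp = rng.uniform(40, 95, 2000)
base_load = 50 + 0.4 * temp + 1.5 * np.clip(temp - 65, 0, None)
pre = base_load + rng.normal(0, 2, temp.size)
post = 0.9 * base_load + rng.normal(0, 2, temp.size)
savings, cv = baseline_and_savings(temp, pre, temp, post)
print(f"avoided energy: {savings:.0f} kWh, baseline CV(RMSE): {cv:.2%}")
```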
Automatic control of pressure support for ventilator weaning in surgical intensive care patients.
Schädler, Dirk; Engel, Christoph; Elke, Gunnar; Pulletz, Sven; Haake, Nils; Frerichs, Inéz; Zick, Günther; Scholz, Jens; Weiler, Norbert
2012-03-15
Despite its ability to reduce overall ventilation time, protocol-guided weaning from mechanical ventilation is not routinely used in daily clinical practice. Clinical implementation of weaning protocols could be facilitated by integration of knowledge-based, closed-loop controlled protocols into respirators. To determine whether automated weaning decreases overall ventilation time compared with weaning based on a standardized written protocol in an unselected surgical patient population. In this prospective controlled trial patients ventilated for longer than 9 hours were randomly allocated to receive either weaning with automatic control of pressure support ventilation (automated-weaning group) or weaning based on a standardized written protocol (control group) using the same ventilation mode. The primary end point of the study was overall ventilation time. Overall ventilation time (median [25th and 75th percentile]) did not significantly differ between the automated-weaning (31 [19-101] h; n = 150) and control groups (39 [20-118] h; n = 150; P = 0.178). Patients who underwent cardiac surgery (n = 132) exhibited significantly shorter overall ventilation times in the automated-weaning (24 [18-57] h) than in the control group (35 [20-93] h; P = 0.035). The automated-weaning group exhibited shorter ventilation times until the first spontaneous breathing trial (1 [0-15] vs. 9 [1-51] h; P = 0.001) and a trend toward fewer tracheostomies (17 vs. 28; P = 0.075). Overall ventilation times did not significantly differ between weaning using automatic control of pressure support ventilation and weaning based on a standardized written protocol. Patients after cardiac surgery may benefit from automated weaning. Implementation of additional control variables besides the level of pressure support may further improve automated-weaning systems. Clinical trial registered with www.clinicaltrials.gov (NCT 00445289).
Garyfallidis, Eleftherios; Côté, Marc-Alexandre; Rheault, Francois; Sidhu, Jasmeen; Hau, Janice; Petit, Laurent; Fortin, David; Cunanne, Stephen; Descoteaux, Maxime
2018-04-15
Virtual dissection of diffusion MRI tractograms is cumbersome and needs extensive knowledge of white matter anatomy. This virtual dissection often requires several inclusion and exclusion regions-of-interest, which makes it a process that is very hard to reproduce across experts. Having automated tools that can extract white matter bundles for tract-based studies of large numbers of people is of great interest for neuroscience and neurosurgical planning. The purpose of our proposed method, named RecoBundles, is to segment white matter bundles and make virtual dissection easier to perform. This can help explore large tractograms from multiple persons directly in their native space. RecoBundles leverages the latest state-of-the-art streamline-based registration and clustering to recognize and extract bundles using prior bundle models. RecoBundles uses bundle models as shape priors for detecting similar streamlines and bundles in tractograms. RecoBundles is 100% streamline-based, works efficiently with millions of streamlines and, most importantly, is robust and adaptive to incomplete data and bundles with missing components. It is also robust to pathological brains with tumors and deformations. We evaluated our results using multiple bundles and showed that RecoBundles is in good agreement with the neuroanatomical experts and generally produced more dense bundles. Across all the different experiments reported in this paper, RecoBundles was able to identify the core parts of the bundles, independently from tractography type (deterministic or probabilistic) or size. Thus, RecoBundles can be a valuable method for exploring tractograms and facilitating tractometry studies. Copyright © 2017 Elsevier Inc. All rights reserved.
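The model-based recognition idea above can be illustrated with a heavily simplified NumPy sketch: streamlines are resampled to a fixed number of points, compared to the model bundle with a minimum average direct-flip (MDF) distance, and kept when they fall under a distance threshold. This is a toy version only, not the RecoBundles implementation; the function names, the 20-point resampling, and the 10 mm threshold are illustrative assumptions.

```python
import numpy as np

def resample(streamline, n_points=20):
    """Resample a streamline (N x 3 array) to n_points equally spaced along arc length."""
    sl = np.asarray(streamline, dtype=float)
    seg = np.r_[0.0, np.cumsum(np.linalg.norm(np.diff(sl, axis=0), axis=1))]
    targets = np.linspace(0, seg[-1], n_points)
    return np.column_stack([np.interp(targets, seg, sl[:, i]) for i in range(3)])

def mdf(a, b):
    """Minimum average direct-flip distance between two resampled streamlines."""
    direct = np.mean(np.linalg.norm(a - b, axis=1))
    flipped = np.mean(np.linalg.norm(a - b[::-1], axis=1))
    return min(direct, flipped)

def recognize(candidates, model_bundle, threshold_mm=10.0):
    """Keep candidates whose distance to any model streamline is below the
    threshold -- a toy stand-in for model-based bundle recognition."""
    model = [resample(m) for m in model_bundle]
    kept = []
    for sl in candidates:
        r = resample(sl)
        if min(mdf(r, m) for m in model) < threshold_mm:
            kept.append(sl)
    return kept

# Example: one model streamline, one nearby candidate and one far-away candidate.
model = [np.column_stack([np.linspace(0, 50, 30), np.zeros(30), np.zeros(30)])]
near = model[0] + np.array([0.0, 2.0, 0.0])
far = model[0] + np.array([0.0, 40.0, 0.0])
print(len(recognize([near, far], model)))  # -> 1
```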
Frood, R; Baren, J; McDermott, G; Bottomley, D; Patel, C; Scarsbrook, A
2018-04-30
To evaluate the efficacy of single time-point half-body (skull base to thighs) fluorine-18 choline positron emission tomography-computed tomography (PET-CT) compared to a triple-phase acquisition protocol in the detection of prostate carcinoma recurrence. Consecutive choline PET-CT studies performed at a single tertiary referral centre in patients with biochemical recurrence of prostate carcinoma between September 2012 and March 2017 were reviewed retrospectively. The indication for the study, imaging protocol used, imaging findings, whether management was influenced by the PET-CT, and subsequent patient outcome were recorded. Ninety-one examinations were performed during the study period; 42 were carried out using a triple-phase protocol (dynamic pelvic imaging for 20 minutes after tracer injection, half-body acquisition at 60 minutes and delayed pelvic scan at 90 minutes) between 2012 and August 2015. Subsequently, following an interim review of diagnostic performance, a streamlined protocol and appropriate-use criteria were introduced. Forty-nine examinations were carried out using the single-phase protocol between 2015 and 2017. Twenty-nine (69%) of the triple-phase studies were positive for recurrence compared to 38 (78%) of the single-phase studies. Only one patient who had a single-phase study would have benefited from a dynamic acquisition; this patient has required no further treatment or imaging and is currently under prostate-specific antigen (PSA) surveillance. Choline PET-CT remains a useful tool for the detection of prostate recurrence when used in combination with appropriate-use criteria. Removal of the dynamic and delayed acquisition phases reduces study time without adversely affecting accuracy. Benefits include a shorter imaging time, which improves patient comfort, reduced cost, and improved scanner efficiency. Copyright © 2018 The Royal College of Radiologists. Published by Elsevier Ltd. All rights reserved.
Optimizing the NASA Technical Report Server
NASA Technical Reports Server (NTRS)
Nelson, Michael L.; Maa, Ming-Hokng
1996-01-01
The NASA Technical Report Server (NTRS), a World Wide Web-based distribution service for NASA technical publications, is modified for performance enhancement, greater protocol support, and human interface optimization. Results include: parallel database queries, significantly decreasing user access times by an average factor of 2.3; access from clients behind firewalls and/or proxies which truncate excessively long Uniform Resource Locators (URLs); access to non-Wide Area Information Server (WAIS) databases and compatibility with the Z39.50 protocol; and a streamlined user interface.
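The parallel-database-query optimization reported above can be sketched in a few lines of Python: fanning a search out to several back ends concurrently and merging the results brings the wall-clock time close to that of the slowest source rather than the sum. The source names, latencies, and function names below are purely illustrative, not the NTRS implementation.

```python
import time
from concurrent.futures import ThreadPoolExecutor

def query_source(source_name, term, delay_s):
    """Stand-in for a per-database search (e.g., one report database back end);
    the sleep simulates network and search latency."""
    time.sleep(delay_s)
    return [f"{source_name}: hit for '{term}'"]

SOURCES = {"center-a": 0.4, "center-b": 0.6, "center-c": 0.5}  # hypothetical back ends

def search_all(term):
    with ThreadPoolExecutor(max_workers=len(SOURCES)) as pool:
        futures = [pool.submit(query_source, name, term, delay)
                   for name, delay in SOURCES.items()]
        results = []
        for f in futures:
            results.extend(f.result())
    return results

start = time.perf_counter()
hits = search_all("wind tunnel")
print(hits)
print(f"wall-clock time ~{time.perf_counter() - start:.2f}s "
      "(close to the slowest source, not the sum)")
```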
Bacterial and fungal DNA extraction from blood samples: automated protocols.
Lorenz, Michael G; Disqué, Claudia; Mühl, Helge
2015-01-01
Automation in DNA isolation is a necessity for routine practice employing molecular diagnosis of infectious agents. To this end, the development of automated systems for the molecular diagnosis of microorganisms directly in blood samples is at its beginning. Important characteristics demanded of systems for routine use include high recovery of microbial DNA, DNA-free containment for the reduction of DNA contamination from exogenous sources, DNA-free reagents and consumables, ideally a walkaway system, and economical pricing of the equipment and consumables. Such full automation of DNA extraction, evaluated and in use for sepsis diagnostics, is not yet available. Here, we present protocols for the semiautomated isolation of microbial DNA from blood culture and from low- and high-volume blood samples. The protocols include a manual pretreatment step followed by automated extraction and purification of microbial DNA.
j5 DNA assembly design automation.
Hillson, Nathan J
2014-01-01
Modern standardized methodologies, described in detail in the previous chapters of this book, have enabled the software-automated design of optimized DNA construction protocols. This chapter describes how to design (combinatorial) scar-less DNA assembly protocols using the web-based software j5. j5 assists biomedical and biotechnological researchers in constructing DNA by automating the design of optimized protocols for flanking homology sequence as well as type IIS endonuclease-mediated DNA assembly methodologies. Unlike any other software tool available today, j5 designs scar-less combinatorial DNA assembly protocols, performs a cost-benefit analysis to identify which portions of an assembly process would be less expensive to outsource to a DNA synthesis service provider, and designs hierarchical DNA assembly strategies to mitigate anticipated poor assembly junction sequence performance. Software integrated with j5 adds significant value to the j5 design process through graphical user-interface enhancement and downstream liquid-handling robotic laboratory automation.
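The cost-benefit decision mentioned above (outsource a fragment to DNA synthesis versus assemble it from PCR parts) can be caricatured with a toy calculation. The per-base synthesis price, the flat per-part assembly cost, and the function name are placeholder assumptions; j5's real cost model is considerably richer (oligo costs, failure-prone junctions, hierarchical assembly).

```python
def cheaper_route(fragment_length_bp,
                  synthesis_cost_per_bp=0.08,
                  pcr_cost_per_part=6.00,
                  parts_needed=1):
    """Return ('synthesize' | 'assemble', cost) for one fragment under a toy
    cost model. Prices are placeholders, not j5's actual cost-benefit analysis."""
    synthesis = fragment_length_bp * synthesis_cost_per_bp
    assembly = parts_needed * pcr_cost_per_part
    return ("synthesize", synthesis) if synthesis < assembly else ("assemble", assembly)

for length, parts in [(300, 1), (300, 5), (2500, 2)]:
    route, cost = cheaper_route(length, parts_needed=parts)
    print(f"{length:>5} bp, {parts} part(s): {route:10s} ~${cost:.2f}")
```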
BioBlocks: Programming Protocols in Biology Made Easier.
Gupta, Vishal; Irimia, Jesús; Pau, Iván; Rodríguez-Patón, Alfonso
2017-07-21
The methods to execute biological experiments are evolving. Affordable fluid handling robots and on-demand biology enterprises are making automating entire experiments a reality. Automation offers the benefit of high-throughput experimentation, rapid prototyping, and improved reproducibility of results. However, learning to automate and codify experiments is a difficult task as it requires programming expertise. Here, we present a web-based visual development environment called BioBlocks for describing experimental protocols in biology. It is based on Google's Blockly and Scratch, and requires little or no experience in computer programming to automate the execution of experiments. The experiments can be specified, saved, modified, and shared between multiple users in an easy manner. BioBlocks is open-source and can be customized to execute protocols on local robotic platforms or remotely, that is, in the cloud. It aims to serve as a de facto open standard for programming protocols in Biology.
Yang, Huiping; Jones, Carrie; Varga, Zoltan M.; Tiersch, Terrence R.
2009-01-01
Sperm cryopreservation offers potential for long-term storage of genetic resources. However, the current protocols for zebrafish Danio rerio are cumbersome and poorly reproducible. Our objective was to facilitate adoption of cryopreservation by streamlining methods from sperm collection through thawing and use. First, sperm activation was evaluated, and motility was completely inhibited when osmolality of the extender was ≥ 295 to 300 mOsmol/kg. To evaluate cryoprotectant toxicity, sperm were incubated with dimethyl sulfoxide (DMSO), N, N-dimethyl acetamide (DMA), methanol, or glycerol at 5, 10, and 15% concentrations. Based on motility, DMSO, DMA, and methanol (≤ 10%) were less toxic; therefore, sperm were cryopreserved using these cryoprotectants at cooling rates of 10 and 20 °C/min. The highest motility (mean ± SD) (35 ± 23%; P ≤ 0.0001) and fertility (13 ± 8%; P ≤ 0.001) in thawed sperm were obtained with the combination of 8% methanol and a cooling rate of 10 °C/min. Further evaluations of 8% methanol and 10 °C/min were performed with males from populations with high (2.05 ± 0.24) and low (1.18 ± 0.12) body condition (P = 0.0001). Motility of thawed sperm from the two populations was 38 ± 16% and 78 ± 10% (P = 0.0001), and fertilization was 6 ± 6% and 33 ± 20% (P = 0.0001). These values were positively related with body condition factor. Overall, this study simplified and standardized sperm cryopreservation, and established a protocol using French straws as a freezing container and an extender without powdered milk. This protocol can be readily adapted for high-throughput application using automated equipment, and motility and fertility comparable to previous reports were obtained. Male variability and sperm quality remain important considerations for future work, especially in mutant and inbred lines. PMID:17544099
Development of a safe and pragmatic awake craniotomy program at Maine Medical Center.
Rughani, Anand I; Rintel, Theodor; Desai, Rajiv; Cushing, Deborah A; Florman, Jeffrey E
2011-01-01
Awake craniotomy offers an excellent means of performing intraoperative mapping and optimizing surgical resection of brain tumors. Awake craniotomy relies on a strong collaboration between anesthesiologists, neurosurgeons, and operating room staff. The authors recently introduced awake craniotomy for tumor resection at the Maine Medical Center and propose that it can be performed safely, effectively, and efficiently in a high-volume community hospital. We describe a practical approach to performing awake craniotomy involving streamlined anesthetic protocols and simplified intraoperative testing parameters in a carefully selected group of patients. Our first 25 patients are retrospectively reviewed with particular attention to the anesthetic protocol, the extent of resection, the operative time, post-operative complications, the length of hospitalization, and their functional status at follow-up. The authors established an anesthetic protocol based primarily on midazolam, fentanyl, propofol, and local anesthetic. The authors note that all but one patient was able to tolerate the awake procedure. Gross total resection was achieved in nearly 80% of patients with a glial tumor. Operative time was short, averaging 159 minutes of entire anesthesia care. Length of stay averaged 3.7 days. Persistent new post-operative deficits were noted in 2 of 25 patients. There was no substantial difference in total hospital charges for patients undergoing awake craniotomy when compared to a matched historical control. With attention focused on patient selection and a streamlined anesthetic protocol, the authors were able to successfully implement an awake craniotomy protocol in a community setting with satisfying results, including low operative morbidity, short operative times, low anesthetic complications, and excellent patient tolerance.
SU-E-P-49: Evaluation of Image Quality and Radiation Dose of Various Unenhanced Head CT Protocols
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chen, L; Khan, M; Alapati, K
2015-06-15
Purpose: To evaluate the diagnostic value of various unenhanced head CT protocols and determine an acceptable radiation dose level for head CT examinations. Methods: Our retrospective analysis included 3 groups, 20 patients per group, who underwent clinical routine unenhanced adult head CT examination. All exams were performed axially with 120 kVp. Three protocols, 380 mAs without iterative reconstruction and automAs, 340 mAs with iterative reconstruction without automAs, and 340 mAs with iterative reconstruction and automAs, were applied to each group of patients respectively. The images were reconstructed with H30 and J30 for the brain window and H60 and J70 for the bone window. Images acquired with the three protocols were randomized and blindly reviewed by three radiologists. A 5-point scale was used to rate each exam. The percentage of exams scored above 3 and the average scores of each protocol were calculated for each reviewer and tissue type. Results: For the protocols without automAs, the average scores of the bone window with iterative reconstruction were higher than those without iterative reconstruction for each reviewer, although the radiation dose was 10% lower. 100% of exams were scored 3 or higher and the average scores were above 4 for both brain and bone reconstructions. The CTDIvols were 64.4 and 57.8 mGy for 380 and 340 mAs, respectively. With automAs, the radiation dose varied with head size, resulting in an average CTDIvol of 47.5 mGy (range 39.5 to 56.5 mGy). 93% and 98% of exams were scored greater than 3 for the brain and bone windows, respectively. The diagnostic confidence level and image quality of exams with automAs were lower than those without automAs for each reviewer. Conclusion: According to these results, the mAs was reduced to 300 with automAs OFF for the head CT exam. The radiation dose was 20% lower than with the original protocol and the CTDIvol was reduced to 51.2 mGy.
Greene, Leasa A; Isaac, Issa; Gray, Dean E; Schwartz, Sarah A
2007-09-01
Several species in the genus Echinacea are beneficial herbs popularly used for many ailments. The most popular Echinacea species for cultivation, wild collection, and herbal products include E. purpurea (L.) Moench, E. pallida (Nutt.) Nutt., and E. angustifolia (DC). Product adulteration is a key concern for the natural products industry, where botanical misidentification and introduction of other botanical and nonbotanical contaminants exist throughout the formulation and production process. Therefore, rapid and cost-effective methods that can be used to monitor these materials for complex product purity and consistency are of benefit to consumers and producers. The objective of this continuing research was to develop automated, high-throughput processing methods that, teamed with matrix-assisted laser desorption/ionization time-of-flight mass spectrometry (MALDI-TOF MS) analysis, differentiate Echinacea species by their mass profiles. Small molecules, peptide, and proteins from aerial parts (leaf/stem/flowers), seeds, and roots from E. purpurea and E. angustifolia; seeds and roots from E. pallida; and off-the-shelf Echinacea supplements were extracted and analyzed by MS using methods developed on the ProPrep liquid handling system (Genomic Solutions). Analysis of these samples highlighted key MS signal patterns from both small molecules and proteins that characterized the individual Echinacea materials analyzed. Based on analysis of pure Echinacea samples, off-the-shelf products containing Echinacea could then be evaluated in a streamlined process. Corresponding analysis of dietary supplements was used to monitor for product composition, including Echinacea species and plant materials used. These results highlight the potential for streamlined, automated approaches for agricultural species differentiation and botanical product evaluation.
ERIC Educational Resources Information Center
Branzburg, Jeffrey
2007-01-01
This article presents an interview with Peter J. Young, Director of Technology at Rockford Public Schools in Michigan. In the interview, Young talked about how his district has done a lot more automation to integrate its disparate systems. He also discussed how they streamline their systems, how parents and community benefit from these efforts,…
Fire Danger Rating: The next 20 Years
John E. Deeming
1987-01-01
For the next 10 years, few changes will be made to the fire-danger rating system. During that time, the focus will be on the automation of weather observing systems and the streamlining of the computation and display of ratings. The time horizon for projecting fire danger will be pushed to 30 days by the late 1990's. A close alignment of the fire-danger rating...
The Xpress Transfer Protocol (XTP): A tutorial (expanded version)
NASA Technical Reports Server (NTRS)
Sanders, Robert M.; Weaver, Alfred C.
1990-01-01
The Xpress Transfer Protocol (XTP) is a reliable, real-time, light weight transfer layer protocol. Current transport layer protocols such as DoD's Transmission Control Protocol (TCP) and ISO's Transport Protocol (TP) were not designed for the next generation of high speed, interconnected reliable networks such as fiber distributed data interface (FDDI) and the gigabit/second wide area networks. Unlike all previous transport layer protocols, XTP is being designed to be implemented in hardware as a VLSI chip set. By streamlining the protocol, combining the transport and network layers and utilizing the increased speed and parallelization possible with a VLSI implementation, XTP will be able to provide the end-to-end data transmission rates demanded in high speed networks without compromising reliability and functionality. This paper describes the operation of the XTP protocol and in particular, its error, flow and rate control; inter-networking addressing mechanisms; and multicast support features, as defined in the XTP Protocol Definition Revision 3.4.
Marquis-Nicholson, Renate; Lai, Daniel; Love, Jennifer M.; Love, Donald R.
2013-01-01
Purpose. The aim of this study was to develop a streamlined mutation screening protocol for the DMD gene in order to confirm a clinical diagnosis of Duchenne or Becker muscular dystrophy in affected males and to clarify the carrier status of female family members. Methods. Sequence analysis and array comparative genomic hybridization (aCGH) were used to identify mutations in the dystrophin DMD gene. We analysed genomic DNA from six individuals with a range of previously characterised mutations and from eight individuals who had not previously undergone any form of molecular analysis. Results. We successfully identified the known mutations in all six patients. A molecular diagnosis was also made in three of the four patients with a clinical diagnosis who had not undergone prior genetic screening, and testing for familial mutations was successfully completed for the remaining four patients. Conclusion. The mutation screening protocol described here meets best practice guidelines for molecular testing of the DMD gene in a diagnostic laboratory. The aCGH method is a superior alternative to more conventional assays such as multiplex ligation-dependent probe amplification (MLPA). The combination of aCGH and sequence analysis will detect mutations in 98% of patients with Duchenne or Becker muscular dystrophy. PMID:23476807
Ludgate, Jackie L; Wright, James; Stockwell, Peter A; Morison, Ian M; Eccles, Michael R; Chatterjee, Aniruddha
2017-08-31
Formalin fixed paraffin embedded (FFPE) tumor samples are a major source of DNA from patients in cancer research. However, FFPE is a challenging material to work with due to macromolecular fragmentation and nucleic acid crosslinking. FFPE tissue poses particular challenges for methylation analysis and for preparing sequencing-based libraries relying on bisulfite conversion. Successful bisulfite conversion is a key requirement for sequencing-based methylation analysis. Here we describe a complete and streamlined workflow for preparing next generation sequencing libraries for methylation analysis from FFPE tissues. This includes counting cells from FFPE blocks and extracting DNA from FFPE slides, testing bisulfite conversion efficiency with a polymerase chain reaction (PCR) based test, preparing reduced representation bisulfite sequencing libraries and massively parallel sequencing. The main features and advantages of this protocol are: an optimized method for extracting good quality DNA from FFPE tissues; an efficient bisulfite conversion and next generation sequencing library preparation protocol that uses 50 ng DNA from FFPE tissue; and incorporation of a PCR-based test to assess bisulfite conversion efficiency prior to sequencing. We provide a complete workflow and an integrated protocol for performing DNA methylation analysis at the genome-scale and we believe this will facilitate clinical epigenetic research that involves the use of FFPE tissue.
McDonald, Sandra A; Ryan, Benjamin J; Brink, Amy; Holtschlag, Victoria L
2012-02-01
Informatics systems, particularly those that provide capabilities for data storage, specimen tracking, retrieval, and order fulfillment, are critical to the success of biorepositories and other laboratories engaged in translational medical research. A crucial item-one easily overlooked-is an efficient way to receive and process investigator-initiated requests. A successful electronic ordering system should allow request processing in a maximally efficient manner, while also allowing streamlined tracking and mining of request data such as turnaround times and numerical categorizations (user groups, funding sources, protocols, and so on). Ideally, an electronic ordering system also facilitates the initial contact between the laboratory and customers, while still allowing for downstream communications and other steps toward scientific partnerships. We describe here the recently established Web-based ordering system for the biorepository at Washington University Medical Center, along with its benefits for workflow, tracking, and customer service. Because of the system's numerous value-added impacts, we think our experience can serve as a good model for other customer-focused biorepositories, especially those currently using manual or non-Web-based request systems. Our lessons learned also apply to the informatics developers who serve such biobanks.
Lew, Matthew D; von Diezmann, Alexander R S; Moerner, W E
2013-02-25
Automated processing of double-helix (DH) microscope images of single molecules (SMs) streamlines the protocol required to obtain super-resolved three-dimensional (3D) reconstructions of ultrastructures in biological samples by single-molecule active control microscopy. Here, we present a suite of MATLAB subroutines, bundled with an easy-to-use graphical user interface (GUI), that facilitates 3D localization of single emitters (e.g. SMs, fluorescent beads, or quantum dots) with precisions of tens of nanometers in multi-frame movies acquired using a wide-field DH epifluorescence microscope. The algorithmic approach is based upon template matching for SM recognition and least-squares fitting for 3D position measurement, both of which are computationally expedient and precise. Overlapping images of SMs are ignored, and the precision of least-squares fitting is not as high as that of maximum likelihood-based methods. However, once calibrated, the algorithm can fit 15-30 molecules per second on a 3 GHz Intel Core 2 Duo workstation, thereby producing a 3D super-resolution reconstruction of 100,000 molecules over a 20×20×2 μm field of view (processing 128×128 pixels × 20000 frames) in 75 min.
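The least-squares position measurement step described above can be illustrated with a toy double-lobe fit in Python: two symmetric Gaussian lobes are fit to a small image patch, the lobe midpoint gives x-y and the inter-lobe angle stands in for z via a calibration curve. This is a simplified sketch, not the published MATLAB suite; the model form, the placeholder angle-to-z conversion, and the synthetic patch are assumptions.

```python
import numpy as np
from scipy.optimize import least_squares

def two_lobe_model(params, xx, yy):
    """Two symmetric Gaussian lobes; in a DH-PSF the inter-lobe angle encodes z."""
    xc, yc, half_sep, angle, amp, sigma, bg = params
    dx, dy = half_sep * np.cos(angle), half_sep * np.sin(angle)
    lobe = lambda x0, y0: amp * np.exp(-((xx - x0) ** 2 + (yy - y0) ** 2) / (2 * sigma ** 2))
    return bg + lobe(xc + dx, yc + dy) + lobe(xc - dx, yc - dy)

def fit_patch(patch, angle_to_z=lambda a: 50.0 * np.degrees(a)):
    """Least-squares fit of one double-lobe spot; angle_to_z stands in for a
    measured calibration curve (the nm-per-degree factor is a placeholder)."""
    yy, xx = np.indices(patch.shape).astype(float)
    p0 = [patch.shape[1] / 2, patch.shape[0] / 2, 3.0, 0.2, patch.max(), 1.5, patch.min()]
    res = least_squares(lambda p: (two_lobe_model(p, xx, yy) - patch).ravel(), p0)
    xc, yc, _, angle, *_ = res.x
    return xc, yc, angle_to_z(angle)

# Example: fit a synthetic noisy patch generated by the same model.
yy, xx = np.indices((21, 21)).astype(float)
truth = [10.3, 9.7, 3.0, 0.6, 100.0, 1.5, 5.0]
patch = two_lobe_model(truth, xx, yy) + np.random.default_rng(1).normal(0, 1, xx.shape)
print(["%.2f" % v for v in fit_patch(patch)])  # x, y (pixels) and z (arbitrary nm scale)
```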
"First generation" automated DNA sequencing technology.
Slatko, Barton E; Kieleczawa, Jan; Ju, Jingyue; Gardner, Andrew F; Hendrickson, Cynthia L; Ausubel, Frederick M
2011-10-01
Beginning in the 1980s, automation of DNA sequencing has greatly increased throughput, reduced costs, and enabled large projects to be completed more easily. The development of automation technology paralleled the development of other aspects of DNA sequencing: better enzymes and chemistry, separation and imaging technology, sequencing protocols, robotics, and computational advancements (including base-calling algorithms with quality scores, database developments, and sequence analysis programs). Despite the emergence of high-throughput sequencing platforms, automated Sanger sequencing technology remains useful for many applications. This unit provides background and a description of the "First-Generation" automated DNA sequencing technology. It also includes protocols for using the current Applied Biosystems (ABI) automated DNA sequencing machines. © 2011 by John Wiley & Sons, Inc.
Cheng, Yuk Wah; Wilkinson, Jenny M
2015-08-01
This paper reports on an evaluation of the introduction of a blood bank automation system (Ortho AutoVue(®) Innova) in a hospital blood bank by considering its performance and workflow as compared with manual methods. The turnaround time was found to be 45% faster than with the manual method. The concordance rate was found to be 100% for both ABO/Rh(D) typing and antibody screening in both systems, and there was no significant difference in detection sensitivity for clinically significant antibodies. The Ortho AutoVue(®) Innova automated blood banking system streamlined routine pre-transfusion testing in the hospital blood bank with high throughput and with sensitivity and reliability equivalent to the conventional manual method. Copyright © 2015 Elsevier Ltd. All rights reserved.
Development of a Software Tool to Automate ADCO Flight Controller Console Planning Tasks
NASA Technical Reports Server (NTRS)
Anderson, Mark G.
2011-01-01
This independent study project covers the development of the International Space Station (ISS) Attitude Determination and Control Officer (ADCO) Planning Exchange APEX Tool. The primary goal of the tool is to streamline existing manual and time-intensive planning tools into a more automated, user-friendly application that interfaces with existing products and allows the ADCO to produce accurate products and timelines more effectively. This paper will survey the current ISS attitude planning process and its associated requirements, goals, documentation and software tools and how a software tool could simplify and automate many of the planning actions which occur at the ADCO console. The project will be covered from inception through the initial prototype delivery in November 2011 and will include development of design requirements and software as well as design verification and testing.
US Topo Maps 2014: Program updates and research
Fishburn, Kristin A.
2014-01-01
The U. S. Geological Survey (USGS) US Topo map program is now in year two of its second three-year update cycle. Since the program was launched in 2009, the product and the production system tools and processes have undergone enhancements that have made the US Topo maps a popular success story. Research and development continues with structural and content product enhancements, streamlined and more fully automated workflows, and the evaluation of a GIS-friendly US Topo GIS Packet. In addition, change detection methodologies are under evaluation to further streamline product maintenance and minimize resource expenditures for production in the future. The US Topo map program will continue to evolve in the years to come, providing traditional map users and Geographic Information System (GIS) analysts alike with a convenient, freely available product incorporating nationally consistent data that are quality assured to high standards.
MARS: bringing the automation of small-molecule bioanalytical sample preparations to a new frontier.
Li, Ming; Chou, Judy; Jing, Jing; Xu, Hui; Costa, Aldo; Caputo, Robin; Mikkilineni, Rajesh; Flannelly-King, Shane; Rohde, Ellen; Gan, Lawrence; Klunk, Lewis; Yang, Liyu
2012-06-01
In recent years, there has been a growing interest in automating small-molecule bioanalytical sample preparations specifically using the Hamilton MicroLab(®) STAR liquid-handling platform. In the most extensive work reported thus far, multiple small-molecule sample preparation assay types (protein precipitation extraction, SPE and liquid-liquid extraction) have been integrated into a suite that is composed of graphical user interfaces and Hamilton scripts. Using that suite, bioanalytical scientists have been able to automate various sample preparation methods to a great extent. However, there are still areas that could benefit from further automation, specifically, the full integration of analytical standard and QC sample preparation with study sample extraction in one continuous run, real-time 2D barcode scanning on the Hamilton deck and direct Laboratory Information Management System database connectivity. We developed a new small-molecule sample-preparation automation system that improves in all of the aforementioned areas. The improved system presented herein further streamlines the bioanalytical workflow, simplifies batch run design, reduces analyst intervention and eliminates sample-handling error.
MR efficiency using automated MRI-desktop eProtocol
NASA Astrophysics Data System (ADS)
Gao, Fei; Xu, Yanzhe; Panda, Anshuman; Zhang, Min; Hanson, James; Su, Congzhe; Wu, Teresa; Pavlicek, William; James, Judy R.
2017-03-01
MRI protocols are instruction sheets that radiology technologists use in routine clinical practice for guidance (e.g., slice position, acquisition parameters etc.). In Mayo Clinic Arizona (MCA), there are over 900 MR protocols (ranging across neuro, body, cardiac, breast etc.), which makes maintaining and updating the protocol instructions a labor-intensive effort. The task is even more challenging given different vendors (Siemens, GE etc.). This is a universal problem faced by all hospitals and/or medical research institutions. To increase the efficiency of the MR practice, we designed and implemented a web-based platform (eProtocol) to automate the management of MRI protocols. It is built upon a database that automatically extracts protocol information from DICOM compliant images and provides a user-friendly interface for the technologists to create, edit and update the protocols. Advanced operations such as protocol migration from scanner to scanner and the capability to upload multimedia content were also implemented. To the best of our knowledge, eProtocol is the first MR protocol automated management tool used clinically. It is expected that this platform will significantly improve radiology operations efficiency, including better image quality and exam consistency, fewer repeat examinations and fewer acquisition errors. These protocol instructions will be readily available to the technologists during scans. In addition, this web-based platform can be extended to other imaging modalities such as CT, Mammography, and Interventional Radiology and to different vendors for imaging protocol management.
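The first step such a system performs, pulling acquisition parameters out of DICOM-compliant images, can be sketched with pydicom. The attribute list, the defensive defaults, and the placeholder file path are illustrative assumptions, not the MCA database schema; availability of any given attribute varies by vendor and sequence.

```python
"""Minimal sketch of extracting scan-protocol parameters from a DICOM header
with pydicom, as an eProtocol-style database ingester might."""
import pydicom

PARAMS = ["Manufacturer", "MagneticFieldStrength", "ProtocolName",
          "SeriesDescription", "RepetitionTime", "EchoTime",
          "FlipAngle", "SliceThickness"]

def extract_protocol(dicom_path):
    ds = pydicom.dcmread(dicom_path, stop_before_pixels=True)  # header only, no pixel data
    # Not every attribute is present for every vendor/sequence, so default to None.
    return {name: getattr(ds, name, None) for name in PARAMS}

if __name__ == "__main__":
    record = extract_protocol("example_mr_image.dcm")  # placeholder path
    for key, value in record.items():
        print(f"{key:22s}: {value}")
```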
Prefield methods: streamlining forest or nonforest determinations to increase inventory efficiency
Sara Goeking; Gretchen Moisen; Kevin Megown; Jason Toombs
2009-01-01
Interior West Forest Inventory and Analysis has developed prefield protocols to distinguish forested plots that require field visits from nonforested plots that do not require field visits. Recent innovations have increased the efficiency of the prefield process. First, the incorporation of periodic inventory data into a prefield database increased the amount of...
ERIC Educational Resources Information Center
Oliver, Astrid; Dahlquist, Janet; Tankersley, Jan; Emrich, Beth
2010-01-01
This article discusses the processes that occurred when the Library, Controller's Office, and Information Technology Department agreed to create an interface between the Library's Innovative Interfaces patron database and campus administrative software, Banner, using file transfer protocol, in an effort to streamline the Library's accounts…
Organizational principles of cloud storage to support collaborative biomedical research.
Kanbar, Lara J; Shalish, Wissam; Robles-Rubio, Carlos A; Precup, Doina; Brown, Karen; Sant'Anna, Guilherme M; Kearney, Robert E
2015-08-01
This paper describes organizational guidelines and an anonymization protocol for the management of sensitive information in interdisciplinary, multi-institutional studies with multiple collaborators. This protocol is flexible, automated, and suitable for use in cloud-based projects as well as for publication of supplementary information in journal papers. A sample implementation of the anonymization protocol is illustrated for an ongoing study dealing with Automated Prediction of EXtubation readiness (APEX).
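One common building block for this kind of anonymization is keyed, deterministic pseudonymization, so that the same subject always maps to the same code across sites without the identifier being recoverable by outside parties. The sketch below illustrates the idea; the key handling and code format are assumptions, not the APEX protocol itself.

```python
import hmac
import hashlib

# Deterministic pseudonymization: the same patient ID always yields the same code,
# but the mapping cannot be reversed without the project key (key and code length
# are illustrative assumptions).
PROJECT_KEY = b"shared-secret-held-by-the-coordinating-site"

def pseudonymize(patient_id: str) -> str:
    digest = hmac.new(PROJECT_KEY, patient_id.encode(), hashlib.sha256).hexdigest()
    return "SUBJ-" + digest[:10].upper()

print(pseudonymize("hospital-A/12345"))
```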
An Automated Design Framework for Multicellular Recombinase Logic.
Guiziou, Sarah; Ulliana, Federico; Moreau, Violaine; Leclere, Michel; Bonnet, Jerome
2018-05-18
Tools to systematically reprogram cellular behavior are crucial to address pressing challenges in manufacturing, environment, or healthcare. Recombinases can very efficiently encode Boolean and history-dependent logic in many species, yet current designs are performed on a case-by-case basis, limiting their scalability and requiring time-consuming optimization. Here we present an automated workflow for designing recombinase logic devices executing Boolean functions. Our theoretical framework uses a reduced library of computational devices distributed into different cellular subpopulations, which are then composed in various manners to implement all desired logic functions at the multicellular level. Our design platform called CALIN (Composable Asynchronous Logic using Integrase Networks) is broadly accessible via a web server, taking truth tables as inputs and providing corresponding DNA designs and sequences as outputs (available at http://synbio.cbs.cnrs.fr/calin ). We anticipate that this automated design workflow will streamline the implementation of Boolean functions in many organisms and for various applications.
Helping System Engineers Bridge the Peaks
NASA Technical Reports Server (NTRS)
Rungta, Neha; Tkachuk, Oksana; Person, Suzette; Biatek, Jason; Whalen, Michael W.; Castle, Joseph; Gundy-Burlet, Karen
2014-01-01
In our experience at NASA, system engineers generally follow the Twin Peaks approach when developing safety-critical systems. However, iterations between the peaks require considerable manual, and in some cases duplicate, effort. A significant part of the manual effort stems from the fact that requirements are written in English natural language rather than a formal notation. In this work, we propose an approach that enables system engineers to leverage formal requirements and automated test generation to streamline iterations, effectively "bridging the peaks". The key to the approach is a formal language notation that a) system engineers are comfortable with, b) is supported by a family of automated V&V tools, and c) is semantically rich enough to describe the requirements of interest. We believe the combination of formalizing requirements and providing tool support to automate the iterations will lead to a more efficient Twin Peaks implementation at NASA.
Vogel, Anne Ilse Maria; Lale, Rahmi; Hohmann-Marriott, Martin Frank
2017-01-01
Synechococcus sp. PCC 7002 (henceforth Synechococcus) is developing into a powerful synthetic biology chassis. In order to streamline the integration of genes into the Synechococcus chromosome, validation of neutral integration sites and optimization of the DNA transformation protocol parameters are necessary. Availability of BioBrick-compatible integration modules is desirable to further simplify chromosomal integrations. We designed three BioBrick-compatible genetic modules, each targeting a separate neutral integration site, A2842, A0935, and A0159, with varying lengths of homologous region, spanning from 100 to 800 nt. The performance of the different modules for achieving DNA integration was tested. Our results demonstrate that 100 nt homologous regions are sufficient for inserting a 1 kb DNA fragment into the Synechococcus chromosome. By adapting a transformation protocol from a related cyanobacterium, we shortened the transformation procedure for Synechococcus significantly. The optimized transformation protocol reported in this study provides an efficient way to perform genetic engineering in Synechococcus. We demonstrated that homologous regions of 100 nt are sufficient for inserting a 1 kb DNA fragment into the three tested neutral integration sites. Integration at A2842, A0935 and A0159 results in only a minimal fitness cost for the chassis. This study contributes to developing Synechococcus as a prominent chassis for future synthetic biology applications.
Li, Rebecca H; Wacholtz, Mary C; Barnes, Mark; Boggs, Liam; Callery-D'Amico, Susan; Davis, Amy; Digilova, Alla; Forster, David; Heffernan, Kate; Luthin, Maeve; Lynch, Holly Fernandez; McNair, Lindsay; Miller, Jennifer E; Murphy, Jacquelyn; Van Campen, Luann; Wilenzick, Mark; Wolf, Delia; Woolston, Cris; Aldinger, Carmen; Bierer, Barbara E
2016-01-01
A novel Protocol Ethics Tool Kit (‘Ethics Tool Kit’) has been developed by a multi-stakeholder group of the Multi-Regional Clinical Trials Center of Brigham and Women's Hospital and Harvard. The purpose of the Ethics Tool Kit is to facilitate effective recognition, consideration and deliberation of critical ethical issues in clinical trial protocols. The Ethics Tool Kit may be used by investigators and sponsors to develop a dedicated Ethics Section within a protocol to improve the consistency and transparency between clinical trial protocols and research ethics committee reviews. It may also streamline ethics review and may facilitate and expedite the review process by anticipating the concerns of ethics committee reviewers. Specific attention was given to issues arising in multinational settings. With the use of this Tool Kit, researchers have the opportunity to address critical research ethics issues proactively, potentially speeding the time and easing the process to final protocol approval. PMID:26811365
Lighting Automation Flying an Earthlike Habitat
NASA Technical Reports Server (NTRS)
Clark, Toni A.; Kolomenski, Andrei
2017-01-01
Currently, spacecraft lighting systems are not demonstrating innovations in automation due to perceived costs in designing circuitry for the communication and automation of lights. The majority of spacecraft lighting systems employ lamps or zone-specific manual switches and dimmers. This type of 'hardwired' solution does not easily convert to automation. With advances in solid-state lighting, the potential to enhance a spacecraft habitat is lost if the communication and automation problem is not tackled. If we are to build long-duration environments that provide earth-like habitats, minimize crew time, and optimize spacecraft power reserves, innovation in lighting automation is a must. This project researched the use of the DMX512 communication protocol, originally developed for high-channel-count lighting systems. DMX512 is an internationally governed, industry-accepted lighting communication protocol with wide industry support. The lighting industry markets a wealth of hardware and software that utilizes DMX512, and there may be incentive to space-certify the system. Our goal in this research is to enable the development of automated spacecraft habitats for long-duration missions. To transform how spacecraft lighting environments are automated, our project conducted a variety of tests to determine a potential scope of capability. We investigated the utilization and application of an industry-accepted lighting control protocol, DMX512, by showcasing how the lighting system could help conserve power, assist with lighting countermeasures, and utilize spatial body tracking. We hope the evaluation and demonstrations we built will inspire other NASA engineers, architects and researchers to consider employing DMX512 "smart lighting" capabilities in their system architectures. By using DMX512 we will show that the 'wheel' does not need to be reinvented in terms of smart lighting and that future spacecraft can use a standard lighting protocol to produce an effective, optimized and potentially earthlike habitat.
Lighting Automation - Flying an Earthlike Habitat
NASA Technical Reports Server (NTRS)
Clark, Tori A. (Principal Investigator); Kolomenski, Andrei
2017-01-01
Currently, spacecraft lighting systems are not demonstrating innovations in automation due to perceived costs in designing circuitry for the communication and automation of lights. The majority of spacecraft lighting systems employ lamps or zone-specific manual switches and dimmers. This type of 'hardwired' solution does not easily convert to automation. With advances in solid-state lighting, the potential to enhance a spacecraft habitat is lost if the communication and automation problem is not tackled. If we are to build long-duration environments that provide earth-like habitats, minimize crew time, and optimize spacecraft power reserves, innovation in lighting automation is a must. This project researched the use of the DMX512 communication protocol, originally developed for high-channel-count lighting systems. DMX512 is an internationally governed, industry-accepted lighting communication protocol with wide industry support. The lighting industry markets a wealth of hardware and software that utilizes DMX512, and there may be incentive to space-certify the system. Our goal in this research is to enable the development of automated spacecraft habitats for long-duration missions. To transform how spacecraft lighting environments are automated, our project conducted a variety of tests to determine a potential scope of capability. We investigated the utilization and application of an industry-accepted lighting control protocol, DMX512, by showcasing how the lighting system could help conserve power, assist with lighting countermeasures, and utilize spatial body tracking. We hope the evaluation and demonstrations we built will inspire other NASA engineers, architects and researchers to consider employing DMX512 "smart lighting" capabilities in their system architectures. By using DMX512 we will show that the 'wheel' does not need to be reinvented in terms of smart lighting and that future spacecraft can use a standard lighting protocol to produce an effective, optimized and potentially earthlike habitat.
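At its core, DMX512 transmits a continuously refreshed frame of up to 512 one-byte channel levels. The sketch below shows that data model only; the channel assignments are assumptions for a single hypothetical fixture, and the physical transport (e.g., a USB-DMX interface) is omitted.

```python
# Conceptual sketch of a DMX512 "universe": 512 one-byte channel levels refreshed
# continuously by a controller. Channel assignments are assumptions for one RGBW fixture.
universe = bytearray(512)          # channels 1-512, each holding a level 0-255

def set_channel(channel: int, level: int) -> None:
    universe[channel - 1] = max(0, min(255, level))

# Dim a hypothetical fixture patched at channels 1-4 (R, G, B, W) to a warm low level.
set_channel(1, 40)   # red
set_channel(2, 25)   # green
set_channel(3, 10)   # blue
set_channel(4, 60)   # white
print(list(universe[:4]))
```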
Systematic review automation technologies
2014-01-01
Systematic reviews, a cornerstone of evidence-based medicine, are not produced quickly enough to support clinical practice. The cost of production, the availability of the requisite expertise and timeliness are often cited as major contributors to the delay. This detailed survey of the state of the art in information systems designed to support or automate individual tasks in the systematic review, in particular systematic reviews of randomized controlled clinical trials, reveals trends that see the convergence of several parallel research projects. We surveyed literature describing informatics systems that support or automate the processes of the systematic review or each of its tasks. Several projects focus on automating, simplifying and/or streamlining specific tasks of the systematic review. Some tasks are already fully automated while others are still largely manual. In this review, we describe each task and the effect that its automation would have on the entire systematic review process, summarize the existing information system support for each task, and highlight where further research is needed to realize automation for the task. Integration of the systems that automate systematic review tasks may lead to a revised systematic review workflow. We envisage that the optimized workflow will lead to a system in which each systematic review is described as a computer program that automatically retrieves relevant trials, appraises them, extracts and synthesizes data, evaluates the risk of bias, performs meta-analysis calculations, and produces a report in real time. PMID:25005128
Meyer, Markus; Donsa, Klaus; Truskaller, Thomas; Frohner, Matthias; Pohn, Birgit; Felfernig, Alexander; Sinner, Frank; Pieber, Thomas
2018-01-01
Fast and accurate data transmission from glucose meter to clinical decision support systems (CDSSs) is crucial for the management of type 2 diabetes mellitus, since almost all therapeutic interventions are derived from glucose measurements. The aim was to develop a prototype of an automated glucose measurement transmission protocol based on the Continua Design Guidelines and to embed the protocol into a CDSS used by healthcare professionals. Literature and market research was performed to analyze the state of the art and thereupon develop, integrate and validate an automated glucose measurement transmission protocol in an iterative process. Findings from the literature and market research guided the development of a standardized glucose measurement transmission protocol using a middleware. The interface description for communicating with the glucose meter was illustrated and embedded into a CDSS. A prototype of an interoperable transmission of glucose measurements was developed and implemented in a CDSS, presenting a promising way to reduce medication errors and improve user satisfaction.
Robotic automation of medication-use management.
Enright, S M
1993-11-01
In the October 1993 issue of Physician Assistant, we published "Robots for Health Care," the first of two articles on the medical applications of robotics. That article discussed ways in which robots could help patients with manipulative disabilities to perform activities of daily living and hold paid employment; transfer patients from bed to chair and back again; add precision to the most exacting surgical procedures; and someday carry out diagnostic and therapeutic techniques from within the human body. This month, we are pleased to offer an article by Sharon Enright, an authority on pharmacy operations, who considers how an automated medication-management system that makes use of bar-code technology is capable of streamlining drug dispensing, controlling safety, increasing cost-effectiveness, and ensuring accurate and complete record-keeping.
Linget, J M; du Vignaud, P
1999-05-01
A Gilson 215 liquid handler was used to automate enzymatic incubations using microsomes, cytosol and plasma. The design of the automated protocols is described. They were based on the use of 96-deep-well plates and on HPLC-based methods for assaying the substrate. The assessment of these protocols was made by comparing manual and automated incubations, and by evaluating the reliability and reproducibility of automated incubations in microsomes and cytosol. Examples of the use of these programs in metabolic studies in drug research, i.e., metabolic screening in microsomes and plasma, are shown. Even rapid processes (with disappearance half-lives as low as 1 min) can be analysed. This work demonstrates how stability studies can be automated to save time, render experiments involving human biological media less hazardous and possibly improve inter-laboratory reproducibility.
PR-PR: Cross-Platform Laboratory Automation System
DOE Office of Scientific and Technical Information (OSTI.GOV)
Linshiz, G; Stawski, N; Goyal, G
To enable protocol standardization, sharing, and efficient implementation across laboratory automation platforms, we have further developed the PR-PR open-source high-level biology-friendly robot programming language as a cross-platform laboratory automation system. Beyond liquid-handling robotics, PR-PR now supports microfluidic and microscopy platforms, as well as protocol translation into human languages, such as English. While the same set of basic PR-PR commands and features are available for each supported platform, the underlying optimization and translation modules vary from platform to platform. Here, we describe these further developments to PR-PR, and demonstrate the experimental implementation and validation of PR-PR protocols for combinatorial modified Golden Gate DNA assembly across liquid-handling robotic, microfluidic, and manual platforms. To further test PR-PR cross-platform performance, we then implement and assess PR-PR protocols for Kunkel DNA mutagenesis and hierarchical Gibson DNA assembly for microfluidic and manual platforms.
PR-PR: cross-platform laboratory automation system.
Linshiz, Gregory; Stawski, Nina; Goyal, Garima; Bi, Changhao; Poust, Sean; Sharma, Monica; Mutalik, Vivek; Keasling, Jay D; Hillson, Nathan J
2014-08-15
To enable protocol standardization, sharing, and efficient implementation across laboratory automation platforms, we have further developed the PR-PR open-source high-level biology-friendly robot programming language as a cross-platform laboratory automation system. Beyond liquid-handling robotics, PR-PR now supports microfluidic and microscopy platforms, as well as protocol translation into human languages, such as English. While the same set of basic PR-PR commands and features are available for each supported platform, the underlying optimization and translation modules vary from platform to platform. Here, we describe these further developments to PR-PR, and demonstrate the experimental implementation and validation of PR-PR protocols for combinatorial modified Golden Gate DNA assembly across liquid-handling robotic, microfluidic, and manual platforms. To further test PR-PR cross-platform performance, we then implement and assess PR-PR protocols for Kunkel DNA mutagenesis and hierarchical Gibson DNA assembly for microfluidic and manual platforms.
Outplacement Services in Support of BRAC and Competitive Sourcing Task Group
2003-07-30
Management(SM) is an automated Web-based solution that streamlines workforce adjustment initiatives and exit processing, saving time and money by getting...effected during FY02 – almost 7,000. o Over 160,000 employees have been saved from involuntary separation since program inception in 1993. Buyouts...monetary incentive, up to $25,000, for employee to retire, either optional or early, or resign. Payment of the incentive must save another DoD employee
GPSit: An automated method for evolutionary analysis of nonculturable ciliated microeukaryotes.
Chen, Xiao; Wang, Yurui; Sheng, Yalan; Warren, Alan; Gao, Shan
2018-05-01
Microeukaryotes are among the most important components of the microbial food web in almost all aquatic and terrestrial ecosystems worldwide. In order to gain a better understanding of their roles and functions in ecosystems, sequencing coupled with phylogenomic analyses of entire genomes or transcriptomes is increasingly used to reconstruct the evolutionary history and classification of these microeukaryotes and thus provide a more robust framework for determining their systematics and diversity. However, phylogenomic research usually requires high levels of hands-on bioinformatics experience. Here, we propose an efficient automated method, "Guided Phylogenomic Search in trees" (GPSit), which starts from predicted protein sequences of newly sequenced species and a well-defined customized orthologous database. Compared with previous protocols, our method streamlines the entire workflow by integrating all essential and other optional operations. In so doing, the manual operation time for reconstructing phylogenetic relationships is reduced from days to several hours compared to other methods. Furthermore, GPSit supports user-defined parameters in most steps and thus allows users to adapt it to their studies. The effectiveness of GPSit is demonstrated by incorporating available online data and new single-cell data of three nonculturable marine ciliates (Anteholosticha monilata, Deviata sp. and Diophrys scutum) obtained under moderate sequencing coverage (~5×). Our results indicate that the former can reconstruct robust "deep" phylogenetic relationships while the latter reveals the presence of intermediate taxa in shallow relationships. Based on empirical phylogenomic data, we also used GPSit to evaluate the impact of different levels of missing data on two commonly used methods of phylogenetic analysis, maximum likelihood (ML) and Bayesian inference (BI). We found that BI is less sensitive to missing data when fast-evolving sites are removed. © 2018 John Wiley & Sons Ltd.
Zhang, Zezhong; Qi, Qingqing
2014-05-01
Medication errors are dangerous and can cause serious or even fatal harm to patients. In order to reduce medication errors, automated patient medication systems using Radio Frequency Identification (RFID) technology have been deployed in many hospitals. The data transmitted in these medication systems is important and sensitive. In the past decade, many security protocols have been proposed to ensure its secure transmission, and these have attracted wide attention. Because it provides mutual authentication between the medication server and the tag, the RFID authentication protocol is considered the most important security protocol in these systems. In this paper, we propose an RFID authentication protocol to enhance patient medication safety using elliptic curve cryptography (ECC). The analysis shows that the proposed protocol overcomes security weaknesses in previous protocols and has better performance. Therefore, the proposed protocol is well suited for automated patient medication systems.
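The sketch below illustrates the kind of elliptic curve primitive such schemes build on, using a simple challenge-response with ECDSA from the Python cryptography package. It is not the protocol proposed in the paper; key provisioning and message formats are assumptions.

```python
# Minimal ECC challenge-response sketch between a medication server and an RFID tag.
# This only illustrates the underlying elliptic curve primitives, not the paper's protocol.
import os
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import ec

tag_private = ec.generate_private_key(ec.SECP256R1())   # provisioned on the tag
tag_public = tag_private.public_key()                    # registered at the server

challenge = os.urandom(32)                                          # server -> tag
response = tag_private.sign(challenge, ec.ECDSA(hashes.SHA256()))   # tag -> server

tag_public.verify(response, challenge, ec.ECDSA(hashes.SHA256()))   # raises if invalid
print("tag authenticated")
```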
NASA Technical Reports Server (NTRS)
Call, Jared A.; Kwok, John H.; Fisher, Forest W.
2013-01-01
This innovation is a tool used to verify and validate spacecraft sequences at the predicted events file (PEF) level for the GRAIL (Gravity Recovery and Interior Laboratory, see http://www.nasa.gov/mission_pages/grail/main/index.html) mission as part of the Multi-Mission Planning and Sequencing Team (MPST) operations process, to reduce the possibility of errors. The tool is used to catch any sequence-related errors or issues immediately after the seqgen modeling to streamline downstream processes. The script verifies and validates the seqgen modeling for the GRAIL MPST process. A PEF is provided as input, and dozens of checks are performed on it to verify and validate the command products, including command content, command ordering, flight-rule violations, modeling boundary consistency, resource limits, and ground commanding consistency. By performing as many checks as early in the process as possible, grl_pef_check streamlines the MPST task of generating GRAIL command and modeled products on an aggressive schedule. By enumerating each check being performed, and clearly stating the criteria and assumptions made at each step, grl_pef_check can be used as a manual checklist as well as an automated tool. This helper script was written with a focus on giving users the information they need to evaluate a sequence quickly and efficiently, while keeping them informed of and active in the overall sequencing process. grl_pef_check verifies and validates the modeling and sequence content prior to investing any more effort into the build. There are dozens of items in a modeling run that need to be checked, which is a time-consuming and error-prone task. Currently, no other software exists that provides this functionality. Compared to a manual process, this script reduces human error and saves considerable man-hours by automating and streamlining the mission planning and sequencing task for the GRAIL mission.
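A single PEF check of this kind can be thought of as a rule applied over time-ordered event records. The sketch below shows one hypothetical ground-commanding spacing check; the record format, field names, and threshold are assumptions, not the actual GRAIL PEF layout or flight rules.

```python
# Illustrative sketch of a PEF-style rule check; the CSV layout and the flight rule
# shown are assumptions, not the real predicted events file format.
import csv

MIN_COMMAND_SPACING_S = 1.0  # hypothetical ground-commanding constraint

def check_command_spacing(pef_rows):
    violations = []
    last_time = None
    for row in pef_rows:
        t = float(row["time_s"])
        if last_time is not None and (t - last_time) < MIN_COMMAND_SPACING_S:
            violations.append((last_time, t, row["command"]))
        last_time = t
    return violations

with open("predicted_events.csv") as fh:
    rows = sorted(csv.DictReader(fh), key=lambda r: float(r["time_s"]))
print(check_command_spacing(rows))
```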
USDA-ARS's Scientific Manuscript database
The National Animal Disease Center (NADC) conducts basic and applied research on endemic animal diseases of high priority that adversely affect U.S. livestock production or trade. Experiments conducted at this Center vary in range and scope, with a subset involving synthetic or recombinant nucleic a...
Sociometric Indicators of Leadership: An Exploratory Analysis
2018-01-01
streamline existing observational protocols and assessment methods. This research provides an initial test of sociometric badges in the context of the U.S...understand, the requirements of the mission. Traditional research and assessment methods focusing on leader and follower interactions require direct...based methods of social network analysis. Novel Measures of Leadership Building on these findings and earlier research, it is apparent that
Use of Flowchart for Automation of Clinical Protocols in mHealth.
Dias, Karine Nóra; Welfer, Daniel; Cordeiro d'Ornellas, Marcos; Pereira Haygert, Carlos Jesus; Dotto, Gustavo Nogara
2017-01-01
For healthcare professionals to use mobile applications, someone with software development expertise is needed to provide them. In healthcare institutions, health professionals use clinical protocols to govern care, and sometimes these documents are computerized through mobile applications to assist them. This work presents a proposal for using flowcharts to describe clinical protocols for the automatic generation of mobile applications that assist health professionals. The purpose of this research is to enable health professionals to develop applications from the description of their own clinical protocols. As a result, we developed a web system that automates clinical protocols for the Android platform, and we validated it with two clinical protocols used in a Brazilian hospital. Preliminary results of the developed architecture demonstrate the feasibility of this study.
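Conceptually, a clinical protocol of this kind can be captured as a flowchart graph whose decision nodes route to the next step by answer. The sketch below illustrates such a description; the node names, questions, and actions are hypothetical and are not taken from the validated protocols.

```python
# A clinical protocol expressed as a flowchart: each node asks a question and routes
# to the next node by answer. Nodes and wording are hypothetical examples only.
flowchart = {
    "start":         {"question": "Temperature >= 38 C?", "yes": "assess_sepsis", "no": "routine_care"},
    "assess_sepsis": {"question": "Heart rate > 90 bpm?", "yes": "sepsis_bundle", "no": "routine_care"},
    "sepsis_bundle": {"action": "Start sepsis bundle and notify physician."},
    "routine_care":  {"action": "Continue routine care."},
}

def run(node="start"):
    step = flowchart[node]
    if "action" in step:
        print(step["action"])
        return
    answer = input(step["question"] + " [y/n] ").strip().lower()
    run(step["yes"] if answer == "y" else step["no"])

run()
```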
Louzao, Maria Carmen; Rodriguez Vieytes, Mercedes; Garcia Cabado, Ana; Vieites Baptista De Sousa, Juan Manuel; Botana, Luis Miguel
2003-04-01
Paralytic shellfish poisoning is one of the most severe forms of food poisoning. The toxins responsible for this type of poisoning are metabolic products of dinoflagellates, which block neuronal transmission by binding to the voltage-gated Na+ channel. Accumulation of paralytic toxins in shellfish is an unpredictable phenomenon that necessitates the implementation of a widespread and thorough monitoring program for mollusk toxicity. All of these programs require periodic collection and analysis of a wide range of shellfish. Therefore, development of accurate analytical protocols for the rapid determination of toxicity levels would streamline this process. Our laboratory has developed a fluorimetric microplate bioassay that rapidly and specifically determines the presence of paralytic shellfish toxins in many seafood samples. This method is based on the pharmacological activity of the toxins and involves several steps: (i) incubation of excitable cells in 96-well microtiter plates with the fluorescent dye bis-oxonol, the distribution of which across the membrane is potential-dependent; (ii) cell depolarization with veratridine, a sodium channel-activating toxin; (iii) dose-dependent inhibition of depolarization with saxitoxin or natural samples containing paralytic shellfish toxins. Measuring toxin-induced changes in membrane potential allows quantification and estimation of the toxic potency of the samples. This new approach offers significant advantages over classical methods and can be easily automated.
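Estimating toxic potency from the dose-dependent inhibition in step (iii) typically amounts to fitting a sigmoidal dose-response curve. The sketch below shows a four-parameter logistic fit with SciPy on invented readings; the data and starting parameters are assumptions, not results from the assay.

```python
import numpy as np
from scipy.optimize import curve_fit

def four_pl(dose, bottom, top, ec50, hill):
    """Four-parameter logistic dose-response curve."""
    return bottom + (top - bottom) / (1.0 + (dose / ec50) ** hill)

# Hypothetical fluorescence readings (% of veratridine-depolarized signal) at
# increasing saxitoxin concentrations (nM); values are illustrative only.
dose = np.array([0.1, 0.3, 1, 3, 10, 30, 100])
signal = np.array([98, 95, 85, 60, 30, 12, 6])

params, _ = curve_fit(four_pl, dose, signal, p0=[5, 100, 3, 1])
print(f"estimated EC50 ~ {params[2]:.1f} nM")
```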
Semiautomated Workflow for Clinically Streamlined Glioma Parametric Response Mapping
Keith, Lauren; Ross, Brian D.; Galbán, Craig J.; Luker, Gary D.; Galbán, Stefanie; Zhao, Binsheng; Guo, Xiaotao; Chenevert, Thomas L.; Hoff, Benjamin A.
2017-01-01
Management of glioblastoma multiforme remains a challenging problem despite recent advances in targeted therapies. Timely assessment of therapeutic agents is hindered by the lack of standard quantitative imaging protocols for determining targeted response. Clinical response assessment for brain tumors is determined by volumetric changes assessed at 10 weeks post-treatment initiation. Further, current clinical criteria fail to use advanced quantitative imaging approaches, such as diffusion and perfusion magnetic resonance imaging. Development of the parametric response mapping (PRM) applied to diffusion-weighted magnetic resonance imaging has provided a sensitive and early biomarker of successful cytotoxic therapy in brain tumors while maintaining a spatial context within the tumor. Although PRM provides an earlier readout than volumetry and sometimes greater sensitivity compared with traditional whole-tumor diffusion statistics, it is not routinely used for patient management; an automated and standardized software for performing the analysis and for the generation of a clinical report document is required for this. We present a semiautomated and seamless workflow for image coregistration, segmentation, and PRM classification of glioblastoma multiforme diffusion-weighted magnetic resonance imaging scans. The software solution can be integrated using local hardware or performed remotely in the cloud while providing connectivity to existing picture archive and communication systems. This is an important step toward implementing PRM analysis of solid tumors in routine clinical practice. PMID:28286871
Automated selective disruption of slow wave sleep.
Ooms, Sharon J; Zempel, John M; Holtzman, David M; Ju, Yo-El S
2017-04-01
Slow wave sleep (SWS) plays an important role in neurophysiologic restoration. Experimentally testing the effect of SWS disruption previously required highly time-intensive and subjective methods. Our goal was to develop an automated and objective protocol to reduce SWS without affecting sleep architecture. We developed a custom Matlab™ protocol to calculate electroencephalogram spectral power every 10s live during a polysomnogram, exclude artifact, and, if measurements met criteria for SWS, deliver increasingly louder tones through earphones. Middle-aged healthy volunteers (n=10) each underwent 2 polysomnograms, one with the SWS disruption protocol and one with sham condition. The SWS disruption protocol reduced SWS compared to sham condition, as measured by spectral power in the delta (0.5-4Hz) band, particularly in the 0.5-2Hz range (mean 20% decrease). A compensatory increase in the proportion of total spectral power in the theta (4-8Hz) and alpha (8-12Hz) bands was seen, but otherwise normal sleep features were preserved. N3 sleep decreased from 20±34 to 3±6min, otherwise there were no significant changes in total sleep time, sleep efficiency, or other macrostructural sleep characteristics. This novel SWS disruption protocol produces specific reductions in delta band power similar to existing methods, but has the advantage of being automated, such that SWS disruption can be performed easily in a highly standardized and operator-independent manner. This automated SWS disruption protocol effectively reduces SWS without impacting overall sleep architecture. Copyright © 2017 Elsevier B.V. All rights reserved.
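The core computation in such a protocol is the band-limited spectral power of each short EEG epoch. A minimal sketch using Welch's method is shown below; the sampling rate, threshold, and SWS criterion are assumptions, not the authors' Matlab implementation.

```python
import numpy as np
from scipy.signal import welch

def band_power(eeg_epoch, fs, lo, hi):
    """Spectral power of one EEG epoch in the [lo, hi) Hz band."""
    freqs, psd = welch(eeg_epoch, fs=fs, nperseg=4 * fs)
    mask = (freqs >= lo) & (freqs < hi)
    return np.trapz(psd[mask], freqs[mask])

fs = 200                             # assumed sampling rate (Hz)
epoch = np.random.randn(10 * fs)     # placeholder for a live 10-s EEG segment
delta = band_power(epoch, fs, 0.5, 4.0)
total = band_power(epoch, fs, 0.5, 30.0)
deliver_tone = (delta / total) > 0.5  # illustrative SWS criterion; threshold is an assumption
```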
A holistic approach to ZigBee performance enhancement for home automation networks.
Betzler, August; Gomez, Carles; Demirkol, Ilker; Paradells, Josep
2014-08-14
Wireless home automation networks are gaining importance for smart homes. In this ambit, ZigBee networks play an important role. The ZigBee specification defines a default set of protocol stack parameters and mechanisms that is further refined by the ZigBee Home Automation application profile. In a holistic approach, we analyze how the network performance is affected with the tuning of parameters and mechanisms across multiple layers of the ZigBee protocol stack and investigate possible performance gains by implementing and testing alternative settings. The evaluations are carried out in a testbed of 57 TelosB motes. The results show that considerable performance improvements can be achieved by using alternative protocol stack configurations. From these results, we derive two improved protocol stack configurations for ZigBee wireless home automation networks that are validated in various network scenarios. In our experiments, these improved configurations yield a relative packet delivery ratio increase of up to 33.6%, a delay decrease of up to 66.6% and an improvement of the energy efficiency for battery powered devices of up to 48.7%, obtainable without incurring any overhead to the network.
A Holistic Approach to ZigBee Performance Enhancement for Home Automation Networks
Betzler, August; Gomez, Carles; Demirkol, Ilker; Paradells, Josep
2014-01-01
Wireless home automation networks are gaining importance for smart homes. In this ambit, ZigBee networks play an important role. The ZigBee specification defines a default set of protocol stack parameters and mechanisms that is further refined by the ZigBee Home Automation application profile. In a holistic approach, we analyze how the network performance is affected with the tuning of parameters and mechanisms across multiple layers of the ZigBee protocol stack and investigate possible performance gains by implementing and testing alternative settings. The evaluations are carried out in a testbed of 57 TelosB motes. The results show that considerable performance improvements can be achieved by using alternative protocol stack configurations. From these results, we derive two improved protocol stack configurations for ZigBee wireless home automation networks that are validated in various network scenarios. In our experiments, these improved configurations yield a relative packet delivery ratio increase of up to 33.6%, a delay decrease of up to 66.6% and an improvement of the energy efficiency for battery powered devices of up to 48.7%, obtainable without incurring any overhead to the network. PMID:25196004
Yuan, Peng; Mai, Huaming; Li, Jianfu; Ho, Dennis Chun-Yu; Lai, Yingying; Liu, Siting; Kim, Daeseung; Xiong, Zixiang; Alfi, David M; Teichgraeber, John F; Gateno, Jaime; Xia, James J
2017-12-01
There are many proven problems associated with traditional surgical planning methods for orthognathic surgery. To address these problems, we developed a computer-aided surgical simulation (CASS) system, the AnatomicAligner, to plan orthognathic surgery following our streamlined clinical protocol. The system includes six modules: image segmentation and three-dimensional (3D) reconstruction, registration and reorientation of models to neutral head posture, 3D cephalometric analysis, virtual osteotomy, surgical simulation, and surgical splint generation. The accuracy of the system was validated in a stepwise fashion: first to evaluate the accuracy of AnatomicAligner using 30 sets of patient data, then to evaluate the fitting of splints generated by AnatomicAligner using 10 sets of patient data. The industrial gold standard system, Mimics, was used as the reference. When comparing the results of segmentation, virtual osteotomy and transformation achieved with AnatomicAligner to the ones achieved with Mimics, the absolute deviation between the two systems was clinically insignificant. The average surface deviation between the two models after 3D model reconstruction in AnatomicAligner and Mimics was 0.3 mm with a standard deviation (SD) of 0.03 mm. All the average surface deviations between the two models after virtual osteotomy and transformations were smaller than 0.01 mm with a SD of 0.01 mm. In addition, the fitting of splints generated by AnatomicAligner was at least as good as the ones generated by Mimics. We successfully developed a CASS system, the AnatomicAligner, for planning orthognathic surgery following the streamlined planning protocol. The system has been proven accurate. AnatomicAligner will soon be available freely to the broader clinical and research communities.
Yuan, Peng; Mai, Huaming; Li, Jianfu; Ho, Dennis Chun-Yu; Lai, Yingying; Liu, Siting; Kim, Daeseung; Xiong, Zixiang; Alfi, David M.; Teichgraeber, John F.; Gateno, Jaime
2017-01-01
Purpose There are many proven problems associated with traditional surgical planning methods for orthognathic surgery. To address these problems, we developed a computer-aided surgical simulation (CASS) system, the AnatomicAligner, to plan orthognathic surgery following our streamlined clinical protocol. Methods The system includes six modules: image segmentation and three-dimensional (3D) reconstruction, registration and reorientation of models to neutral head posture, 3D cephalometric analysis, virtual osteotomy, surgical simulation, and surgical splint generation. The accuracy of the system was validated in a stepwise fashion: first to evaluate the accuracy of AnatomicAligner using 30 sets of patient data, then to evaluate the fitting of splints generated by AnatomicAligner using 10 sets of patient data. The industrial gold standard system, Mimics, was used as the reference. Result When comparing the results of segmentation, virtual osteotomy and transformation achieved with AnatomicAligner to the ones achieved with Mimics, the absolute deviation between the two systems was clinically insignificant. The average surface deviation between the two models after 3D model reconstruction in AnatomicAligner and Mimics was 0.3 mm with a standard deviation (SD) of 0.03 mm. All the average surface deviations between the two models after virtual osteotomy and transformations were smaller than 0.01 mm with a SD of 0.01 mm. In addition, the fitting of splints generated by AnatomicAligner was at least as good as the ones generated by Mimics. Conclusion We successfully developed a CASS system, the AnatomicAligner, for planning orthognathic surgery following the streamlined planning protocol. The system has been proven accurate. AnatomicAligner will soon be available freely to the broader clinical and research communities. PMID:28432489
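A surface-deviation comparison of this kind reduces to measuring distances between corresponding model surfaces. The sketch below shows a simple symmetric nearest-neighbor version on point clouds; the sampling scheme and symmetric averaging are assumptions, not the validation procedure used in the study.

```python
# Rough sketch of an average surface-deviation metric between two triangulated models,
# computed here on sampled vertices only (illustrative, not the paper's method).
import numpy as np
from scipy.spatial import cKDTree

def mean_surface_deviation(points_a, points_b):
    """Symmetric mean nearest-neighbor distance between two point clouds (mm)."""
    d_ab = cKDTree(points_b).query(points_a)[0]
    d_ba = cKDTree(points_a).query(points_b)[0]
    return 0.5 * (d_ab.mean() + d_ba.mean())

a = np.random.rand(5000, 3) * 100.0        # vertices of model A (placeholder)
b = a + np.random.normal(0, 0.3, a.shape)  # model B perturbed by ~0.3 mm
print(f"mean deviation: {mean_surface_deviation(a, b):.2f} mm")
```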
Price, Travis K.; Dune, Tanaka; Hilt, Evann E.; Thomas-White, Krystal J.; Kliethermes, Stephanie; Brincat, Cynthia; Brubaker, Linda; Wolfe, Alan J.
2016-01-01
Enhanced quantitative urine culture (EQUC) detects live microorganisms in the vast majority of urine specimens reported as “no growth” by the standard urine culture protocol. Here, we evaluated an expanded set of EQUC conditions (expanded-spectrum EQUC) to identify an optimal version that provides a more complete description of uropathogens in women experiencing urinary tract infection (UTI)-like symptoms. One hundred fifty adult urogynecology patient-participants were characterized using a self-completed validated UTI symptom assessment (UTISA) questionnaire and asked “Do you feel you have a UTI?” Women responding negatively were recruited into the no-UTI cohort, while women responding affirmatively were recruited into the UTI cohort; the latter cohort was reassessed with the UTISA questionnaire 3 to 7 days later. Baseline catheterized urine samples were plated using both standard urine culture and expanded-spectrum EQUC protocols: standard urine culture inoculated at 1 μl onto 2 agars incubated aerobically; expanded-spectrum EQUC inoculated at three different volumes of urine onto 7 combinations of agars and environments. Compared to expanded-spectrum EQUC, standard urine culture missed 67% of uropathogens overall and 50% in participants with severe urinary symptoms. Thirty-six percent of participants with missed uropathogens reported no symptom resolution after treatment by standard urine culture results. Optimal detection of uropathogens could be achieved using the following: 100 μl of urine plated onto blood (blood agar plate [BAP]), colistin-nalidixic acid (CNA), and MacConkey agars in 5% CO2 for 48 h. This streamlined EQUC protocol achieved 84% uropathogen detection relative to 33% detection by standard urine culture. The streamlined EQUC protocol improves detection of uropathogens that are likely relevant for symptomatic women, giving clinicians the opportunity to receive additional information not currently reported using standard urine culture techniques. PMID:26962083
The BACnet Campus Challenge - Part 1
DOE Office of Scientific and Technical Information (OSTI.GOV)
Masica, Ken; Tom, Steve
Here, the BACnet protocol was designed to achieve interoperability among building automation vendors and to evolve over time to include new functionality as well as support new communication technologies, such as the Ethernet and IP protocols, as they became prevalent and economical in the marketplace. For large multi-building, multi-vendor campus environments, standardizing on the BACnet protocol as an implementation strategy can be a key component in meeting the challenge of an interoperable, flexible, and scalable building automation system. The interoperability of BACnet is especially important when large campuses with legacy equipment have DDC upgrades to facilities performed over different time frames and use different contractors that install equipment from different vendors under the guidance of different campus HVAC project managers. In these circumstances, BACnet can serve as a common foundation for interoperability when potential variability exists in approaches to the design-build process by numerous parties over time. Likewise, BACnet support for a range of networking protocols and technologies can be a key strategy for achieving flexible and scalable automation systems as campuses and enterprises expand networking infrastructures using standard interoperable protocols like IP and Ethernet.
The BACnet Campus Challenge - Part 1
Masica, Ken; Tom, Steve
2015-12-01
Here, the BACnet protocol was designed to achieve interoperability among building automation vendors and to evolve over time to include new functionality as well as support new communication technologies, such as the Ethernet and IP protocols, as they became prevalent and economical in the marketplace. For large multi-building, multi-vendor campus environments, standardizing on the BACnet protocol as an implementation strategy can be a key component in meeting the challenge of an interoperable, flexible, and scalable building automation system. The interoperability of BACnet is especially important when large campuses with legacy equipment have DDC upgrades to facilities performed over different time frames and use different contractors that install equipment from different vendors under the guidance of different campus HVAC project managers. In these circumstances, BACnet can serve as a common foundation for interoperability when potential variability exists in approaches to the design-build process by numerous parties over time. Likewise, BACnet support for a range of networking protocols and technologies can be a key strategy for achieving flexible and scalable automation systems as campuses and enterprises expand networking infrastructures using standard interoperable protocols like IP and Ethernet.
Fully Automated Sample Preparation for Ultrafast N-Glycosylation Analysis of Antibody Therapeutics.
Szigeti, Marton; Lew, Clarence; Roby, Keith; Guttman, Andras
2016-04-01
There is a growing demand in the biopharmaceutical industry for high-throughput, large-scale N-glycosylation profiling of therapeutic antibodies in all phases of product development, but especially during clone selection when hundreds of samples should be analyzed in a short period of time to assure their glycosylation-based biological activity. Our group has recently developed a magnetic bead-based protocol for N-glycosylation analysis of glycoproteins to alleviate the hard-to-automate centrifugation and vacuum-centrifugation steps of the currently used protocols. Glycan release, fluorophore labeling, and cleanup were all optimized, resulting in a <4 h magnetic bead-based process with excellent yield and good repeatability. This article demonstrates the next level of this work by automating all steps of the optimized magnetic bead-based protocol from endoglycosidase digestion, through fluorophore labeling and cleanup with high-throughput sample processing in 96-well plate format, using an automated laboratory workstation. Capillary electrophoresis analysis of the fluorophore-labeled glycans was also optimized for rapid (<3 min) separation to accommodate the high-throughput processing of the automated sample preparation workflow. Ultrafast N-glycosylation analyses of several commercially relevant antibody therapeutics are also shown and compared to their biosimilar counterparts, addressing the biological significance of the differences. © 2015 Society for Laboratory Automation and Screening.
Vahabzadeh, Massoud; Lin, Jia-Ling; Mezghanni, Mustapha; Epstein, David H; Preston, Kenzie L
2009-01-01
A challenge in treatment research is the necessity of adhering to protocol and regulatory strictures while maintaining flexibility to meet patients' treatment needs and to accommodate variations among protocols. Another challenge is the acquisition of large amounts of data in an occasionally hectic environment, along with the provision of seamless methods for exporting, mining and querying the data. We have automated several major functions of our outpatient treatment research clinic for studies in drug abuse and dependence. Here we describe three such specialised applications: the Automated Contingency Management (ACM) system for the delivery of behavioural interventions, the transactional electronic diary (TED) system for the management of behavioural assessments and the Protocol Workflow System (PWS) for computerised workflow automation and guidance of each participant's daily clinic activities. These modules are integrated into our larger information system to enable data sharing in real time among authorised staff. ACM and the TED have each permitted us to conduct research that was not previously possible. In addition, the time to data analysis at the end of each study is substantially shorter. With the implementation of the PWS, we have been able to manage a research clinic with an 80 patient capacity, having an annual average of 18,000 patient visits and 7300 urine collections with a research staff of five. Finally, automated data management has considerably enhanced our ability to monitor and summarise participant safety data for research oversight. When developed in consultation with end users, automation in treatment research clinics can enable more efficient operations, better communication among staff and expansions in research methods.
Li, Rebecca H; Wacholtz, Mary C; Barnes, Mark; Boggs, Liam; Callery-D'Amico, Susan; Davis, Amy; Digilova, Alla; Forster, David; Heffernan, Kate; Luthin, Maeve; Lynch, Holly Fernandez; McNair, Lindsay; Miller, Jennifer E; Murphy, Jacquelyn; Van Campen, Luann; Wilenzick, Mark; Wolf, Delia; Woolston, Cris; Aldinger, Carmen; Bierer, Barbara E
2016-04-01
A novel Protocol Ethics Tool Kit ('Ethics Tool Kit') has been developed by a multi-stakeholder group of the Multi-Regional Clinical Trials Center of Brigham and Women's Hospital and Harvard. The purpose of the Ethics Tool Kit is to facilitate effective recognition, consideration and deliberation of critical ethical issues in clinical trial protocols. The Ethics Tool Kit may be used by investigators and sponsors to develop a dedicated Ethics Section within a protocol to improve the consistency and transparency between clinical trial protocols and research ethics committee reviews. It may also streamline ethics review and may facilitate and expedite the review process by anticipating the concerns of ethics committee reviewers. Specific attention was given to issues arising in multinational settings. With the use of this Tool Kit, researchers have the opportunity to address critical research ethics issues proactively, potentially speeding the time and easing the process to final protocol approval. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/
Streamlining workflow and automation to accelerate laboratory scale protein production.
Konczal, Jennifer; Gray, Christopher H
2017-05-01
Protein production facilities are often required to produce diverse arrays of proteins for demanding methodologies including crystallography, NMR, ITC and other reagent intensive techniques. It is common for these teams to find themselves a bottleneck in the pipeline of ambitious projects. This pressure to deliver has resulted in the evolution of many novel methods to increase capacity and throughput at all stages in the pipeline for generation of recombinant proteins. This review aims to describe current and emerging options to accelerate the success of protein production in Escherichia coli. We emphasize technologies that have been evaluated and implemented in our laboratory, including innovative molecular biology and expression vectors, small-scale expression screening strategies and the automation of parallel and multidimensional chromatography. Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.
Design, Development, and Commissioning of a Substation Automation Laboratory to Enhance Learning
ERIC Educational Resources Information Center
Thomas, M. S.; Kothari, D. P.; Prakash, A.
2011-01-01
Automation of power systems is gaining momentum across the world, and there is a need to expose graduate and undergraduate students to the latest developments in hardware, software, and related protocols for power automation. This paper presents the design, development, and commissioning of an automation lab to facilitate the understanding of…
2012-03-01
this list, adding "out-of-the-loop syndrome", mode awareness problems, and vigilance decrements to the SA challenges faced by RPA crews. 18...Systems, Man, and Cybernetics, vol. 19, no. 3, May/June. Ouma, J., Chappelle, W., & Salinas, A. (2011) "Faces of occupational burnout among U.S
Zelesky, Veronica; Schneider, Richard; Janiszewski, John; Zamora, Ismael; Ferguson, James; Troutman, Matthew
2013-05-01
The ability to supplement high-throughput metabolic clearance data with structural information defining the site of metabolism should allow design teams to streamline their synthetic decisions. However, broad application of metabolite identification in early drug discovery has been limited, largely due to the time required for data review and structural assignment. The advent of mass defect filtering and its application toward metabolite scouting paved the way for the development of software automation tools capable of rapidly identifying drug-related material in complex biological matrices. Two semi-automated commercial software applications, MetabolitePilot™ and Mass-MetaSite™, were evaluated to assess the relative speed and accuracy of structural assignments using data generated on a high-resolution MS platform. Review of these applications has demonstrated their utility in providing accurate results in a time-efficient manner, leading to acceleration of metabolite identification initiatives while highlighting the continued need for biotransformation expertise in the interpretation of more complex metabolic reactions.
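Mass defect filtering exploits the fact that metabolites usually retain a fractional mass (mass defect) close to that of the parent drug, so unrelated matrix ions can be discarded. The sketch below shows a single-window filter; the window widths and example masses are assumptions chosen only to illustrate the idea.

```python
def mass_defect_filter(ions_mz, parent_mz, mz_window=50.0, defect_window=0.040):
    """Keep ions whose nominal mass is near the parent and whose mass defect
    (fractional m/z) falls within a tolerance of the parent's defect."""
    parent_defect = parent_mz - round(parent_mz)
    kept = []
    for mz in ions_mz:
        defect = mz - round(mz)
        if abs(mz - parent_mz) <= mz_window and abs(defect - parent_defect) <= defect_window:
            kept.append(mz)
    return kept

# Example: hypothetical parent drug at m/z 310.1652 and three candidate ions.
print(mass_defect_filter([326.1601, 282.2793, 312.1448], 310.1652))
```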
Yang, Jianji J; Cohen, Aaron M; Cohen, Aaron; McDonagh, Marian S
2008-11-06
Automatic document classification can be valuable in increasing the efficiency in updating systematic reviews (SR). In order for the machine learning process to work well, it is critical to create and maintain high-quality training datasets consisting of expert SR inclusion/exclusion decisions. This task can be laborious, especially when the number of topics is large and source data format is inconsistent. To approach this problem, we build an automated system to streamline the required steps, from initial notification of update in source annotation files to loading the data warehouse, along with a web interface to monitor the status of each topic. In our current collection of 26 SR topics, we were able to standardize almost all of the relevance judgments and recovered PMIDs for over 80% of all articles. Of those PMIDs, over 99% were correct in a manual random sample study. Our system performs an essential function in creating training and evaluation data sets for SR text mining research.
Yang, Jianji J.; Cohen, Aaron M.; McDonagh, Marian S.
2008-01-01
Automatic document classification can be valuable in increasing the efficiency in updating systematic reviews (SR). In order for the machine learning process to work well, it is critical to create and maintain high-quality training datasets consisting of expert SR inclusion/exclusion decisions. This task can be laborious, especially when the number of topics is large and source data format is inconsistent. To approach this problem, we build an automated system to streamline the required steps, from initial notification of update in source annotation files to loading the data warehouse, along with a web interface to monitor the status of each topic. In our current collection of 26 SR topics, we were able to standardize almost all of the relevance judgments and recovered PMIDs for over 80% of all articles. Of those PMIDs, over 99% were correct in a manual random sample study. Our system performs an essential function in creating training and evaluation datasets for SR text mining research. PMID:18999194
Soares, Filipa A.C.; Chandra, Amit; Thomas, Robert J.; Pedersen, Roger A.; Vallier, Ludovic; Williams, David J.
2014-01-01
The transfer of a laboratory process into a manufacturing facility is one of the most critical steps required for the large-scale production of cell-based therapy products. This study describes the first published protocol for scalable automated expansion of human induced pluripotent stem cell lines growing in aggregates in feeder-free and chemically defined medium. Cells were successfully transferred between different sites representative of research and manufacturing settings, and passaged manually and using the CompacT SelecT automation platform. Modified protocols were developed for the automated system, and the management of cell aggregates (clumps) was identified as the critical step. Cellular morphology, pluripotency gene expression and differentiation into the three germ layers have been used to compare the outcomes of manual and automated processes. PMID:24440272
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhou, Xiaofan; Peris, David; Kominek, Jacek
The availability of genomes across the tree of life is highly biased toward vertebrates, pathogens, human disease models, and organisms with relatively small and simple genomes. Recent progress in genomics has enabled the de novo decoding of the genome of virtually any organism, greatly expanding its potential for understanding the biology and evolution of the full spectrum of biodiversity. The increasing diversity of sequencing technologies, assays, and de novo assembly algorithms has augmented the complexity of de novo genome sequencing projects in nonmodel organisms. To reduce the costs and challenges in de novo genome sequencing projects and streamline their experimental design and analysis, we developed iWGS (in silico Whole Genome Sequencer and Analyzer), an automated pipeline for guiding the choice of appropriate sequencing strategy and assembly protocols. iWGS seamlessly integrates the four key steps of a de novo genome sequencing project: data generation (through simulation), data quality control, de novo assembly, and assembly evaluation and validation. The last three steps can also be applied to the analysis of real data. iWGS is designed to enable the user to have great flexibility in testing the range of experimental designs available for genome sequencing projects, and supports all major sequencing technologies and popular assembly tools. Three case studies illustrate how iWGS can guide the design of de novo genome sequencing projects, and evaluate the performance of a wide variety of user-specified sequencing strategies and assembly protocols on genomes of differing architectures. iWGS, along with detailed documentation, is freely available at https://github.com/zhouxiaofan1983/iWGS.
Zhou, Xiaofan; Peris, David; Kominek, Jacek; ...
2016-09-16
The availability of genomes across the tree of life is highly biased toward vertebrates, pathogens, human disease models, and organisms with relatively small and simple genomes. Recent progress in genomics has enabled the de novo decoding of the genome of virtually any organism, greatly expanding its potential for understanding the biology and evolution of the full spectrum of biodiversity. The increasing diversity of sequencing technologies, assays, and de novo assembly algorithms has augmented the complexity of de novo genome sequencing projects in nonmodel organisms. To reduce the costs and challenges in de novo genome sequencing projects and streamline their experimental design and analysis, we developed iWGS (in silico Whole Genome Sequencer and Analyzer), an automated pipeline for guiding the choice of appropriate sequencing strategy and assembly protocols. iWGS seamlessly integrates the four key steps of a de novo genome sequencing project: data generation (through simulation), data quality control, de novo assembly, and assembly evaluation and validation. The last three steps can also be applied to the analysis of real data. iWGS is designed to give the user great flexibility in testing the range of experimental designs available for genome sequencing projects, and supports all major sequencing technologies and popular assembly tools. Three case studies illustrate how iWGS can guide the design of de novo genome sequencing projects and evaluate the performance of a wide variety of user-specified sequencing strategies and assembly protocols on genomes of differing architectures. iWGS, along with detailed documentation, is freely available at https://github.com/zhouxiaofan1983/iWGS.
RetroPath2.0: A retrosynthesis workflow for metabolic engineers.
Delépine, Baudoin; Duigou, Thomas; Carbonell, Pablo; Faulon, Jean-Loup
2018-01-01
Synthetic biology applied to industrial biotechnology is transforming the way we produce chemicals. However, despite advances in the scale and scope of metabolic engineering, the research and development process still remains costly. In order to expand the chemical repertoire for the production of next-generation compounds, a major engineering biology effort is required in the development of novel design tools that target chemical diversity through rapid and predictable protocols. Addressing that goal involves retrosynthesis approaches that explore the chemical biosynthetic space. However, the complexity associated with the large combinatorial retrosynthesis design space has often been recognized as the main challenge hindering the approach. Here, we provide RetroPath2.0, an automated open source workflow for retrosynthesis based on generalized reaction rules that performs the retrosynthesis search from chassis to target through an efficient and well-controlled protocol. Its ease of use and the versatility of its applications make this tool a valuable addition to the biological engineer's bench. We show through several examples the application of the workflow to biotechnologically relevant problems, including the identification of alternative biosynthetic routes through enzyme promiscuity and the development of biosensors. We thereby demonstrate the ability of the workflow to streamline retrosynthesis pathway design and its major role in reshaping the design, build, test and learn pipeline by driving the process toward the objective of optimizing bioproduction. The RetroPath2.0 workflow is built using tools developed by the bioinformatics and cheminformatics communities; because it is open source, we anticipate that community contributions will further expand its features. Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.
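The abstract does not give implementation details, but the core idea of a rule-based retrosynthetic search can be sketched as a breadth-first exploration from a target compound back toward chassis metabolites. The Python sketch below is a toy illustration under that assumption; the RULES and CHASSIS structures and all compound names are hypothetical and do not reflect RetroPath2.0's actual data model.

from collections import deque

# Hypothetical rule set: each compound maps to candidate precursor tuples
# (a stand-in for generalized reaction rules applied in reverse).
RULES = {
    "target_X": [("intermediate_A", "intermediate_B")],
    "intermediate_A": [("chassis_metabolite_1",)],
    "intermediate_B": [("chassis_metabolite_2",)],
}
CHASSIS = {"chassis_metabolite_1", "chassis_metabolite_2"}

def retro_routes(target, max_depth=5):
    # Breadth-first search from the target back toward chassis metabolites.
    queue = deque([(target, (target,))])
    while queue:
        compound, path = queue.popleft()
        if compound in CHASSIS:
            yield path
            continue
        if len(path) > max_depth:
            continue
        for precursors in RULES.get(compound, []):
            for p in precursors:
                queue.append((p, path + (p,)))

for route in retro_routes("target_X"):
    print(" <- ".join(route))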
ProDeGe: A computational protocol for fully automated decontamination of genomes
Tennessen, Kristin; Andersen, Evan; Clingenpeel, Scott; ...
2015-06-09
Single amplified genomes and genomes assembled from metagenomes have enabled the exploration of uncultured microorganisms at an unprecedented scale. However, both these types of products are plagued by contamination. Since these genomes are now being generated in a high-throughput manner and sequences from them are propagating into public databases to drive novel scientific discoveries, rigorous quality controls and decontamination protocols are urgently needed. Here, we present ProDeGe (Protocol for fully automated Decontamination of Genomes), the first computational protocol for fully automated decontamination of draft genomes. ProDeGe classifies sequences into two classes—clean and contaminant—using a combination of homology and feature-based methodologies. On average, 84% of the sequence from the non-target organism is removed from the data set (specificity) and 84% of the sequence from the target organism is retained (sensitivity). The procedure operates successfully at a rate of ~0.30 CPU core hours per megabase of sequence and can be applied to any type of genome sequence.
Using machine learning for sequence-level automated MRI protocol selection in neuroradiology.
Brown, Andrew D; Marotta, Thomas R
2018-05-01
Incorrect imaging protocol selection can lead to important clinical findings being missed, contributing to both wasted health care resources and patient harm. We present a machine learning method for analyzing the unstructured text of clinical indications and patient demographics from magnetic resonance imaging (MRI) orders to automatically protocol MRI procedures at the sequence level. We compared 3 machine learning models - support vector machine, gradient boosting machine, and random forest - to a baseline model that predicted the most common protocol for all observations in our test set. The gradient boosting machine model significantly outperformed the baseline and demonstrated the best performance of the 3 models in terms of accuracy (95%), precision (86%), recall (80%), and Hamming loss (0.0487). This demonstrates the feasibility of automating sequence selection by applying machine learning to MRI orders. Automated sequence selection has important safety, quality, and financial implications and may facilitate improvements in the quality and safety of medical imaging service delivery.
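As a rough sketch of the prediction task described above, the snippet below frames sequence-level protocoling as multi-label text classification with a gradient boosting model (one binary classifier per MRI sequence) and reports Hamming loss. It assumes scikit-learn and uses invented toy orders and sequence labels; it is not the authors' code, features, or data.

from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.multiclass import OneVsRestClassifier
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.preprocessing import MultiLabelBinarizer
from sklearn.metrics import hamming_loss

# Toy MRI orders (clinical indication text) and per-order sequence labels.
orders = [
    "new onset seizures, rule out lesion",
    "follow up multiple sclerosis",
    "acute stroke symptoms, left sided weakness",
    "pituitary adenoma follow up",
]
sequences = [
    {"T1", "T2", "FLAIR", "contrast"},
    {"T2", "FLAIR", "contrast"},
    {"DWI", "FLAIR", "GRE"},
    {"T1", "contrast", "thin_slice_sella"},
]

X = TfidfVectorizer().fit_transform(orders).toarray()
mlb = MultiLabelBinarizer()
Y = mlb.fit_transform(sequences)

# One gradient boosting classifier per candidate sequence label.
clf = OneVsRestClassifier(GradientBoostingClassifier(n_estimators=50))
clf.fit(X, Y)
pred = clf.predict(X)
print("Hamming loss:", hamming_loss(Y, pred))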
Eco-friendly streamlined process for sporopollenin exine capsule extraction
Mundargi, Raghavendra C.; Potroz, Michael G.; Park, Jae Hyeon; Seo, Jeongeun; Tan, Ee-Lin; Lee, Jae Ho; Cho, Nam-Joon
2016-01-01
Sporopollenin exine capsules (SECs) extracted from Lycopodium clavatum spores are an attractive biomaterial possessing a highly robust structure suitable for microencapsulation strategies. Despite several decades of research into SEC extraction methods, the protocols commonly used for L. clavatum still entail processing with both alkaline and acidolysis steps at temperatures up to 180 °C and lasting up to 7 days. Herein, we demonstrate a significantly streamlined processing regimen, which indicates that much lower temperatures and processing durations can be used without alkaline lysis. By employing CHN elemental analysis, scanning electron microscopy (SEM), confocal laser scanning microscopy (CLSM), and dynamic image particle analysis (DIPA), the optimum conditions for L. clavatum SEC processing were determined to be 30 hours of acidolysis at 70 °C without alkaline lysis. Extending these findings to proof-of-concept encapsulation studies, we further demonstrate that our SECs are able to achieve a loading of 0.170 ± 0.01 g BSA per 1 g SECs by vacuum-assisted loading. Taken together, our streamlined processing method and corresponding characterization of SECs provide important insights for the development of applications including drug delivery, cosmetics, personal care products, and foods. PMID:26818918
Manufacture of a human mesenchymal stem cell population using an automated cell culture platform.
Thomas, Robert James; Chandra, Amit; Liu, Yang; Hourd, Paul C; Conway, Paul P; Williams, David J
2007-09-01
Tissue engineering and regenerative medicine are rapidly developing fields that use cells or cell-based constructs as therapeutic products for a wide range of clinical applications. Efforts to commercialise these therapies are driving a need for capable, scalable manufacturing technologies to ensure therapies are able to meet regulatory requirements and are economically viable at industrial-scale production. We report the first automated expansion of a human bone marrow-derived mesenchymal stem cell population (hMSCs) using a fully automated cell culture platform. Differences in cell population growth profile, attributed to key methodological differences, were observed between the automated protocol and a benchmark manual protocol. However, qualitatively similar cell output, assessed by cell morphology and the expression of typical hMSC markers, was obtained from both systems. Furthermore, the critical importance of minor process variation, e.g. the effect of cell seeding density on characteristics such as population growth kinetics and cell phenotype, was observed irrespective of protocol type. This work highlights the importance of careful process design in therapeutic cell manufacture and demonstrates the potential of automated culture for future optimisation and scale-up studies required for the translation of regenerative medicine products from the laboratory to the clinic.
Autonomous Data Transfer Operations for Missions
NASA Technical Reports Server (NTRS)
Repaci, Max; Baker, Paul; Brosi, Fred
2000-01-01
Automating the data transfer operation can significantly reduce the cost of moving data from a spacecraft to a location on Earth. Automated data transfer methods have been developed for the terrestrial Internet. However, they often do not apply to the space environment, since in general they are based on assumptions about connectivity that are true on the Internet but not on space links. Automated file transfer protocols have been developed for use over space links that transfer data via store-and-forward of files or segments of files. This paper investigates some of the operational concepts made possible by these protocols.
Automated selective disruption of slow wave sleep
Ooms, Sharon J.; Zempel, John M.; Holtzman, David M.; Ju, Yo-El S.
2017-01-01
Background Slow wave sleep (SWS) plays an important role in neurophysiologic restoration. Experimentally testing the effect of SWS disruption previously required highly time-intensive and subjective methods. Our goal was to develop an automated and objective protocol to reduce SWS without affecting sleep architecture. New Method We developed a custom Matlab™ protocol to calculate electroencephalogram spectral power every 10 seconds live during a polysomnogram, exclude artifact, and, if measurements met criteria for SWS, deliver increasingly louder tones through earphones. Middle-aged healthy volunteers (n=10) each underwent 2 polysomnograms, one with the SWS disruption protocol and one with a sham condition. Results The SWS disruption protocol reduced SWS compared to the sham condition, as measured by spectral power in the delta (0.5–4 Hz) band, particularly in the 0.5–2 Hz range (mean 20% decrease). A compensatory increase in the proportion of total spectral power in the theta (4–8 Hz) and alpha (8–12 Hz) bands was seen, but otherwise normal sleep features were preserved. N3 sleep decreased from 20±34 to 3±6 minutes; otherwise, there were no significant changes in total sleep time, sleep efficiency, or other macrostructural sleep characteristics. Comparison with existing method This novel SWS disruption protocol produces specific reductions in delta band power similar to existing methods, but has the advantage of being automated, such that SWS disruption can be performed easily in a highly standardized and operator-independent manner. Conclusion This automated SWS disruption protocol effectively reduces SWS without impacting overall sleep architecture. PMID:28238859
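A minimal sketch of the kind of live decision rule described above (spectral power computed over 10-second epochs, with a tone triggered when delta power dominates) is given below, assuming NumPy and SciPy. The sampling rate, band edges, and threshold are illustrative, not the study's actual parameters.

import numpy as np
from scipy.signal import welch

FS = 200          # assumed EEG sampling rate (Hz)
EPOCH_SEC = 10    # scoring window length used for each decision

def delta_fraction(epoch, fs=FS):
    # Fraction of 0.5-12 Hz spectral power that falls in the 0.5-4 Hz delta band.
    freqs, psd = welch(epoch, fs=fs, nperseg=4 * fs)
    band = (freqs >= 0.5) & (freqs <= 12)
    delta = (freqs >= 0.5) & (freqs <= 4)
    return psd[delta].sum() / psd[band].sum()

def should_play_tone(epoch, threshold=0.6):
    # Deliver an increasingly loud tone only when delta dominance suggests SWS.
    return delta_fraction(epoch) > threshold

# Synthetic example: 10 s of 1.5 Hz-dominated signal plus noise.
t = np.arange(0, EPOCH_SEC, 1.0 / FS)
epoch = np.sin(2 * np.pi * 1.5 * t) + 0.1 * np.random.randn(t.size)
print(should_play_tone(epoch))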
Magnetic Resonance of Pelvic and Gastrointestinal Emergencies.
Wongwaisayawan, Sirote; Kaewlai, Rathachai; Dattwyler, Matthew; Abujudeh, Hani H; Singh, Ajay K
2016-05-01
Magnetic resonance (MR) imaging is gaining increased acceptance in the emergency setting despite the continued dominance of computed tomography. MR has the advantages of more precise tissue characterization, superior soft tissue contrast, and a lack of ionizing radiation. Traditional barriers to emergent MR are being overcome by streamlined imaging protocols and newer rapid-acquisition sequences. As the utilization of MR imaging in the emergency department increases, a strong working knowledge of the MR appearance of the most commonly encountered abdominopelvic pathologies is essential. In this article, MR imaging protocols and findings of acute pelvic, scrotal, and gastrointestinal pathologies are discussed. Copyright © 2016 Elsevier Inc. All rights reserved.
Automated Planning Enables Complex Protocols on Liquid-Handling Robots.
Whitehead, Ellis; Rudolf, Fabian; Kaltenbach, Hans-Michael; Stelling, Jörg
2018-03-16
Robotic automation in synthetic biology is especially relevant for liquid handling to facilitate complex experiments. However, research tasks that are not highly standardized are still rarely automated in practice. Two main reasons for this are the substantial investments required to translate molecular biological protocols into robot programs, and the fact that the resulting programs are often too specific to be easily reused and shared. Recent developments of standardized protocols and dedicated programming languages for liquid-handling operations addressed some aspects of ease-of-use and portability of protocols. However, either they focus on simplicity, at the expense of enabling complex protocols, or they entail detailed programming, with corresponding skills and efforts required from the users. To reconcile these trade-offs, we developed Roboliq, a software system that uses artificial intelligence (AI) methods to integrate (i) generic formal, yet intuitive, protocol descriptions, (ii) complete, but usually hidden, programming capabilities, and (iii) user-system interactions to automatically generate executable, optimized robot programs. Roboliq also enables high-level specifications of complex tasks with conditional execution. To demonstrate the system's benefits for experiments that are difficult to perform manually because of their complexity, duration, or time-critical nature, we present three proof-of-principle applications for the reproducible, quantitative characterization of GFP variants.
Anti-nuclear antibody screening using HEp-2 cells.
Buchner, Carol; Bryant, Cassandra; Eslami, Anna; Lakos, Gabriella
2014-06-23
The American College of Rheumatology position statement on ANA testing stipulates the use of IIF as the gold standard method for ANA screening(1). Although IIF is an excellent screening test in expert hands, the technical difficulties of processing and reading IIF slides (such as labor-intensive slide processing, manual reading, the need for experienced, trained technologists, and the use of a darkroom) make the IIF method difficult to fit into the workflow of modern, automated laboratories. The first and crucial step towards high-quality ANA screening is careful slide processing. This procedure is labor intensive and requires full understanding of the process, as well as attention to detail and experience. Slide reading is performed by fluorescent microscopy in darkrooms by trained technologists who are familiar with the various patterns, in the context of the cell cycle and the morphology of interphase and dividing cells. Given that IIF is the first-line screening tool for SARD, understanding the steps to correctly perform this technique is critical. Recently, digital imaging systems have been developed for the automated reading of IIF slides. These systems, such as the NOVA View Automated Fluorescent Microscope, are designed to streamline the routine IIF workflow. NOVA View acquires and stores high-resolution digital images of the wells, thereby separating image acquisition from interpretation; images are viewed and interpreted on high-resolution computer monitors. It stores images for future reference and supports the operator's interpretation by providing fluorescent light intensity data on the images. It also preliminarily categorizes results as positive or negative, and provides pattern recognition for positive samples. In summary, it eliminates the need for a darkroom, and automates and streamlines the IIF reading/interpretation workflow. Most importantly, it increases consistency between readers and readings. Moreover, with the use of barcoded slides, transcription errors are eliminated by providing sample traceability and positive patient identification. This results in increased patient data integrity and safety. The overall goal of this video is to demonstrate the IIF procedure, including slide processing, identification of common IIF patterns, and the introduction of new advancements to simplify and harmonize this technique.
Blom, Kimberly C; Farina, Sasha; Gomez, Yessica-Haydee; Campbell, Norm R C; Hemmelgarn, Brenda R; Cloutier, Lyne; McKay, Donald W; Dawes, Martin; Tobe, Sheldon W; Bolli, Peter; Gelfer, Mark; McLean, Donna; Bartlett, Gillian; Joseph, Lawrence; Featherstone, Robin; Schiffrin, Ernesto L; Daskalopoulou, Stella S
2015-04-01
Despite progress in automated blood pressure measurement (BPM) technology, there is limited research linking hard outcomes to automated office BPM (OBPM) treatment targets and thresholds. Equivalences for automated BPM devices have been estimated from approximations of standardized manual measurements of 140/90 mmHg. Until outcome-driven targets and thresholds become available for automated measurement methods, deriving evidence-based equivalences between automated methods and standardized manual OBPM is the next best solution. The MeasureBP study group was initiated by the Canadian Hypertension Education Program to close this critical knowledge gap. MeasureBP aims to define evidence-based equivalent values between standardized manual OBPM and automated BPM methods by synthesizing available evidence using a systematic review and individual subject-level data meta-analyses. This manuscript provides a review of the literature and the MeasureBP study protocol. These results will lay the evidence-based foundation to resolve uncertainties within blood pressure guidelines, which, in turn, will improve the management of hypertension.
Halladay, Jason S; Delarosa, Erlie Marie; Tran, Daniel; Wang, Leslie; Wong, Susan; Khojasteh, S Cyrus
2011-08-01
Here we describe a high-capacity, high-throughput, automated 384-well CYP inhibition assay using well-known HLM-based MS probes. We provide consistently robust IC50 values at the lead optimization stage of the drug discovery process. Our method uses the Agilent Technologies/Velocity11 BioCel 1200 system, timesaving techniques for sample analysis, and streamlined data processing steps. For each experiment, we generate IC50 values for up to 344 compounds and positive controls for five major CYP isoforms (probe substrate): CYP1A2 (phenacetin), CYP2C9 ((S)-warfarin), CYP2C19 ((S)-mephenytoin), CYP2D6 (dextromethorphan), and CYP3A4/5 (testosterone and midazolam). Each compound is incubated separately at four concentrations with each CYP probe substrate under the optimized incubation condition. Each incubation is quenched with acetonitrile containing the deuterated internal standard of the respective metabolite for each probe substrate. To minimize the number of samples to be analyzed by LC-MS/MS and reduce the amount of valuable MS runtime, we utilize the timesaving techniques of cassette analysis (pooling the incubation samples at the end of each CYP probe incubation into one) and column switching (reducing the amount of MS runtime). We also compare the IC50 results for the five major CYP isoforms obtained with our method to values reported in the literature.
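The abstract does not describe the curve-fitting step, but estimating an IC50 from a four-concentration inhibition experiment can be sketched as a logistic fit. The snippet below assumes SciPy and uses invented example data, not values from the study.

import numpy as np
from scipy.optimize import curve_fit

def percent_activity(conc, ic50, hill):
    # One-site inhibition model: remaining CYP activity versus inhibitor concentration.
    return 100.0 / (1.0 + (conc / ic50) ** hill)

# Hypothetical data: four inhibitor concentrations (uM) and measured % activity.
conc = np.array([0.1, 1.0, 10.0, 50.0])
activity = np.array([95.0, 78.0, 35.0, 12.0])

(ic50, hill), _ = curve_fit(percent_activity, conc, activity, p0=[5.0, 1.0])
print(f"Estimated IC50 = {ic50:.1f} uM (Hill slope {hill:.2f})")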
Automatized set-up procedure for transcranial magnetic stimulation protocols.
Harquel, S; Diard, J; Raffin, E; Passera, B; Dall'Igna, G; Marendaz, C; David, O; Chauvin, A
2017-06-01
Transcranial Magnetic Stimulation (TMS) has established itself as a powerful technique for probing and treating the human brain. Major technological evolutions, such as neuronavigation and robotized systems, have continuously increased the spatial reliability and reproducibility of TMS by minimizing the influence of human and experimental factors. However, there is still a lack of an efficient set-up procedure, which prevents the automation of TMS protocols. For example, the set-up procedure for defining the stimulation intensity specific to each subject is classically done manually by experienced practitioners, by assessing the motor cortical excitability level over the motor hotspot (HS) of a targeted muscle. This is time-consuming and introduces experimental variability. Therefore, we developed a probabilistic Bayesian model (AutoHS) that automatically identifies the HS position. Using virtual and real experiments, we compared the efficacy of the manual and automated procedures. AutoHS appeared to be more reproducible, faster, and at least as reliable as classical manual procedures. By combining AutoHS with robotized TMS and automated motor threshold estimation methods, our approach constitutes the first fully automated set-up procedure for TMS protocols. The use of this procedure decreases inter-experimenter variability while facilitating the handling of TMS protocols used in research and clinical routine. Copyright © 2017 Elsevier Inc. All rights reserved.
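AutoHS is described only as a probabilistic Bayesian model for hotspot identification; its actual formulation is not given in the abstract. The sketch below shows, under stated assumptions (a discrete grid of candidate positions and a Gaussian MEP-versus-distance forward model), how a posterior over hotspot locations could be updated after each stimulation. All parameters, the grid, and the forward model are illustrative.

import numpy as np

# Candidate hotspot positions on a 5x5 scalp grid (arbitrary units).
grid = np.array([(x, y) for x in range(5) for y in range(5)], dtype=float)
posterior = np.full(len(grid), 1.0 / len(grid))   # uniform prior over positions

def expected_mep(stim_site, hotspot, peak=1.0, sigma=1.5):
    # Assumed forward model: MEP amplitude decays with distance from the true hotspot.
    d = np.linalg.norm(stim_site - hotspot)
    return peak * np.exp(-d ** 2 / (2 * sigma ** 2))

def update(posterior, stim_site, observed_mep, noise=0.2):
    # Bayes update of the hotspot posterior after one stimulation and MEP measurement.
    mu = np.array([expected_mep(stim_site, h) for h in grid])
    likelihood = np.exp(-(observed_mep - mu) ** 2 / (2 * noise ** 2))
    posterior = posterior * likelihood
    return posterior / posterior.sum()

# Two simulated stimulations: a strong response near (2, 2), a weak one at (4, 0).
posterior = update(posterior, np.array([2.0, 2.0]), observed_mep=0.9)
posterior = update(posterior, np.array([4.0, 0.0]), observed_mep=0.1)
print("MAP hotspot estimate:", grid[np.argmax(posterior)])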
Vahabzadeh, Massoud; Lin, Jia-Ling; Mezghanni, Mustapha; Epstein, David H.; Preston, Kenzie L.
2009-01-01
Issues A challenge in treatment research is the necessity of adhering to protocol and regulatory strictures while maintaining flexibility to meet patients’ treatment needs and accommodate variations among protocols. Another challenge is the acquisition of large amounts of data in an occasionally hectic environment, along with provision of seamless methods for exporting, mining, and querying the data. Approach We have automated several major functions of our outpatient treatment research clinic for studies in drug abuse and dependence. Here we describe three such specialized applications: the Automated Contingency Management (ACM) system for delivery of behavioral interventions, the Transactional Electronic Diary (TED) system for management of behavioral assessments, and the Protocol Workflow System (PWS) for computerized workflow automation and guidance of each participant’s daily clinic activities. These modules are integrated into our larger information system to enable data sharing in real time among authorized staff. Key Findings ACM and TED have each permitted us to conduct research that was not previously possible. In addition, the time to data analysis at the end of each study is substantially shorter. With the implementation of the PWS, we have been able to manage a research clinic with an 80-patient capacity having an annual average of 18,000 patient-visits and 7,300 urine collections with a research staff of five. Finally, automated data management has considerably enhanced our ability to monitor and summarize participant-safety data for research oversight. Implications and conclusion When developed in consultation with end users, automation in treatment-research clinics can enable more efficient operations, better communication among staff, and expansions in research methods. PMID:19320669
Automated monitoring of medical protocols: a secure and distributed architecture.
Alsinet, T; Ansótegui, C; Béjar, R; Fernández, C; Manyà, F
2003-03-01
The control of the right application of medical protocols is a key issue in hospital environments. For the automated monitoring of medical protocols, we need a domain-independent language for their representation and a fully or semi-autonomous system that understands the protocols and supervises their application. In this paper we describe a specification language and a multi-agent system architecture for monitoring medical protocols. We model medical services in hospital environments as specialized domain agents and interpret a medical protocol as a negotiation process between agents. A medical service can be involved in multiple medical protocols, and so specialized domain agents are independent of negotiation processes and autonomous system agents perform monitoring tasks. We present the detailed architecture of the system agents and of an important domain agent, the database broker agent, which is responsible for obtaining relevant information about the clinical history of patients. We also describe how we tackle the problems of privacy, integrity and authentication during the process of exchanging information between agents.
Compact Modbus TCP/IP protocol for data acquisition systems based on limited hardware resources
NASA Astrophysics Data System (ADS)
Bai, Q.; Jin, B.; Wang, D.; Wang, Y.; Liu, X.
2018-04-01
Modbus TCP/IP has become a standard industrial communication protocol and is widely used for establishing sensor-cloud platforms on the Internet. However, many existing data acquisition systems built on traditional single-chip microcontrollers with limited resources cannot support it, because the complete Modbus TCP/IP protocol typically depends on a full operating system that occupies abundant hardware resources. Hence, a compact Modbus TCP/IP protocol is proposed in this work to make it run efficiently and stably even on a resource-limited hardware platform. First, the Modbus TCP/IP protocol stack is analyzed and the refined protocol suite is rebuilt by streamlining the typical TCP/IP suite. Then, the specific implementation of every hierarchical layer is presented in detail according to the protocol structure. In addition, the compact protocol is implemented on a traditional microprocessor to validate the feasibility of the scheme. Finally, the performance of the proposed scenario is assessed. The experimental results demonstrate that message packets match the frame format of the Modbus TCP/IP protocol and the average bandwidth reaches 1.15 Mbps. The compact protocol operates stably even on a traditional microcontroller with only 4-kB RAM and a 12-MHz system clock, and no communication congestion or frequent packet loss occurs.
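For reference, the fixed Modbus TCP framing that such a compact stack must produce and parse is small: a 7-byte MBAP header (transaction ID, protocol ID = 0, length, unit ID) followed by the PDU (function code plus data). The Python sketch below builds and parses a Read Holding Registers (function 0x03) request; it illustrates the frame format only and is unrelated to the authors' microcontroller implementation.

import struct

def build_read_holding_registers(transaction_id, unit_id, start_addr, quantity):
    # PDU: function code 0x03, starting address, register count (big-endian).
    pdu = struct.pack(">BHH", 0x03, start_addr, quantity)
    # MBAP header: transaction ID, protocol ID (0), length (unit ID + PDU), unit ID.
    mbap = struct.pack(">HHHB", transaction_id, 0x0000, len(pdu) + 1, unit_id)
    return mbap + pdu

def parse_mbap(frame):
    # Split a Modbus TCP frame into its MBAP header fields and the PDU.
    tid, proto, length, uid = struct.unpack(">HHHB", frame[:7])
    return {"transaction": tid, "protocol": proto, "length": length,
            "unit": uid, "pdu": frame[7:]}

frame = build_read_holding_registers(1, 0x11, start_addr=0x006B, quantity=3)
print(frame.hex(), parse_mbap(frame))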
Otani, Kyoko; Nakazono, Akemi; Salgo, Ivan S; Lang, Roberto M; Takeuchi, Masaaki
2016-10-01
Echocardiographic determination of left heart chamber volumetric parameters by using manual tracings during multiple beats is tedious in atrial fibrillation (AF). The aim of this study was to determine the usefulness of fully automated left chamber quantification software with single-beat three-dimensional transthoracic echocardiographic data sets in patients with AF. Single-beat full-volume three-dimensional transthoracic echocardiographic data sets were prospectively acquired during consecutive multiple cardiac beats (≥10 beats) in 88 patients with AF. In protocol 1, left ventricular volumes, left ventricular ejection fraction, and maximal left atrial volume were validated using automated quantification against the manual tracing method in identical beats in 10 patients. In protocol 2, automated quantification-derived averaged values from multiple beats were compared with the corresponding values obtained from the indexed beat in all patients. Excellent correlations of left chamber parameters between automated quantification and the manual method were observed (r = 0.88-0.98) in protocol 1. The time required for the analysis with the automated quantification method (5 min) was significantly less compared with the manual method (27 min) (P < .0001). In protocol 2, there were excellent linear correlations between the averaged left chamber parameters and the corresponding values obtained from the indexed beat (r = 0.94-0.99), and test-retest variability of left chamber parameters was low (3.5%-4.8%). Three-dimensional transthoracic echocardiography with fully automated quantification software is a rapid and reliable way to measure averaged values of left heart chamber parameters during multiple consecutive beats. Thus, it is a potential new approach for left chamber quantification in patients with AF in daily routine practice. Copyright © 2016 American Society of Echocardiography. Published by Elsevier Inc. All rights reserved.
Tan, Swee Jin; Phan, Huan; Gerry, Benjamin Michael; Kuhn, Alexandre; Hong, Lewis Zuocheng; Min Ong, Yao; Poon, Polly Suk Yean; Unger, Marc Alexander; Jones, Robert C.; Quake, Stephen R.; Burkholder, William F.
2013-01-01
Library preparation for next-generation DNA sequencing (NGS) remains a key bottleneck in the sequencing process which can be relieved through improved automation and miniaturization. We describe a microfluidic device for automating laboratory protocols that require one or more column chromatography steps and demonstrate its utility for preparing Next Generation sequencing libraries for the Illumina and Ion Torrent platforms. Sixteen different libraries can be generated simultaneously with significantly reduced reagent cost and hands-on time compared to manual library preparation. Using an appropriate column matrix and buffers, size selection can be performed on-chip following end-repair, dA tailing, and linker ligation, so that the libraries eluted from the chip are ready for sequencing. The core architecture of the device ensures uniform, reproducible column packing without user supervision and accommodates multiple routine protocol steps in any sequence, such as reagent mixing and incubation; column packing, loading, washing, elution, and regeneration; capture of eluted material for use as a substrate in a later step of the protocol; and removal of one column matrix so that two or more column matrices with different functional properties can be used in the same protocol. The microfluidic device is mounted on a plastic carrier so that reagents and products can be aliquoted and recovered using standard pipettors and liquid handling robots. The carrier-mounted device is operated using a benchtop controller that seals and operates the device with programmable temperature control, eliminating any requirement for the user to manually attach tubing or connectors. In addition to NGS library preparation, the device and controller are suitable for automating other time-consuming and error-prone laboratory protocols requiring column chromatography steps, such as chromatin immunoprecipitation. PMID:23894273
Anderson, William W.; Fitzjohn, Stephen M.; Collingridge, Graham L.
2012-01-01
WinLTP is a data acquisition program for studying long-term potentiation (LTP) and other aspects of synaptic function. Earlier versions of WinLTP (J. Neurosci. Methods, 162:346–356, 2007) provided automated electrical stimulation and data acquisition capable of running nearly an entire synaptic plasticity experiment, with the primary exception that perfusion solutions had to be changed manually. This automated stimulation and acquisition was done by using 'Sweep', 'Loop' and 'Delay' events to build scripts using the 'Protocol Builder'. However, this did not allow automatic changing of many solutions while running multiple slice experiments, or solution changing when this had to be performed rapidly and with accurate timing during patch-clamp experiments. We report here the addition of automated perfusion control to WinLTP. First, perfusion change between sweeps is enabled by adding the 'Perfuse' event to Protocol Builder scripting and is used in slice experiments. Second, fast perfusion changes during as well as between sweeps are enabled by using the Perfuse event in the protocol scripts to control changes between sweeps, and also by changing digital or analog output during a sweep, and are used for single-cell single-line perfusion patch-clamp experiments. The addition of stepper control of tube placement allows dual- or triple-line perfusion patch-clamp experiments for up to 48 solutions. The ability to automate perfusion changes and fully integrate them with the already automated stimulation and data acquisition goes a long way toward complete automation of multi-slice extracellularly recorded and single-cell patch-clamp experiments. PMID:22524994
Schmidt, Olga; Hausmann, Axel; Cancian de Araujo, Bruno; Sutrisno, Hari; Peggie, Djunijanti; Schmidt, Stefan
2017-01-01
Here we present a general collecting and preparation protocol for DNA barcoding of Lepidoptera as part of large-scale rapid biodiversity assessment projects, and a comparison with alternative preserving and vouchering methods. About 98% of the sequenced specimens processed using the present collecting and preparation protocol yielded sequences with more than 500 base pairs. The study is based on the first outcomes of the Indonesian Biodiversity Discovery and Information System (IndoBioSys). IndoBioSys is a German-Indonesian research project that is conducted by the Museum für Naturkunde in Berlin and the Zoologische Staatssammlung München, in close cooperation with the Research Center for Biology - Indonesian Institute of Sciences (RCB-LIPI, Bogor).
Automation of data acquisition in electron crystallography.
Cheng, Anchi
2013-01-01
General considerations for using automation software for acquiring high-resolution images of 2D crystals under low-dose conditions are presented. Protocol modifications specific to this application in Leginon are provided.
NASA Technical Reports Server (NTRS)
Kleis, Stanley J.; Truong, Tuan; Goodwin, Thomas J.
2004-01-01
This report is a documentation of a fluid dynamic analysis of the proposed Automated Static Culture System (ASCS) cell module mixing protocol. The report consists of a review of some basic fluid dynamics principles appropriate for the mixing of a patch of high oxygen content media into the surrounding media which is initially depleted of oxygen, followed by a computational fluid dynamics (CFD) study of this process for the proposed protocol over a range of the governing parameters. The time histories of oxygen concentration distributions and mechanical shear levels generated are used to characterize the mixing process for different parameter values.
Husen, Peter; Tarasov, Kirill; Katafiasz, Maciej; Sokol, Elena; Vogt, Johannes; Baumgart, Jan; Nitsch, Robert; Ekroos, Kim; Ejsing, Christer S
2013-01-01
Global lipidomics analysis across large sample sizes produces high-content datasets that require dedicated software tools supporting lipid identification and quantification, efficient data management and lipidome visualization. Here we present a novel software-based platform for streamlined data processing, management and visualization of shotgun lipidomics data acquired using high-resolution Orbitrap mass spectrometry. The platform features the ALEX framework designed for automated identification and export of lipid species intensity directly from proprietary mass spectral data files, and an auxiliary workflow using database exploration tools for integration of sample information, computation of lipid abundance and lipidome visualization. A key feature of the platform is the organization of lipidomics data in "database table format" which provides the user with an unsurpassed flexibility for rapid lipidome navigation using selected features within the dataset. To demonstrate the efficacy of the platform, we present a comparative neurolipidomics study of cerebellum, hippocampus and somatosensory barrel cortex (S1BF) from wild-type and knockout mice devoid of the putative lipid phosphate phosphatase PRG-1 (plasticity related gene-1). The presented framework is generic, extendable to processing and integration of other lipidomic data structures, can be interfaced with post-processing protocols supporting statistical testing and multivariate analysis, and can serve as an avenue for disseminating lipidomics data within the scientific community. The ALEX software is available at www.msLipidomics.info.
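The "database table format" mentioned above corresponds to keeping one row per lipid species per sample, which makes filtering and pivoting straightforward. The pandas sketch below illustrates the idea with invented values; the column names are hypothetical and do not reflect the actual ALEX schema.

import pandas as pd

# Long "database table" layout: one row per lipid species per sample.
rows = [
    {"sample": "WT_S1BF_1", "lipid_class": "PC", "species": "PC 34:1", "intensity": 1.8e6},
    {"sample": "WT_S1BF_1", "lipid_class": "PE", "species": "PE 38:4", "intensity": 9.5e5},
    {"sample": "KO_S1BF_1", "lipid_class": "PC", "species": "PC 34:1", "intensity": 1.1e6},
    {"sample": "KO_S1BF_1", "lipid_class": "PE", "species": "PE 38:4", "intensity": 1.6e6},
]
df = pd.DataFrame(rows)

# Pivot to a species-by-sample abundance matrix for visualization or statistics.
print(df.pivot(index="species", columns="sample", values="intensity"))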
Christen, Matthias; Del Medico, Luca; Christen, Heinz; Christen, Beat
2017-01-01
Recent advances in lower-cost DNA synthesis techniques have enabled new innovations in the field of synthetic biology. Still, efficient design and higher-order assembly of genome-scale DNA constructs remain a labor-intensive process. Given this complexity, computer-assisted design tools that fragment large DNA sequences into fabricable DNA blocks are needed to pave the way towards streamlined assembly of biological systems. Here, we present the Genome Partitioner software, implemented as a web-based interface that permits multi-level partitioning of genome-scale DNA designs. Without the need for specialized computing skills, biologists can submit their DNA designs to a fully automated pipeline that generates the optimal retrosynthetic route for higher-order DNA assembly. To test the algorithm, we partitioned a 783 kb Caulobacter crescentus genome design. We validated the partitioning strategy by assembling a 20 kb test segment encompassing a difficult-to-synthesize DNA sequence. Successful assembly from 1 kb subblocks into the 20 kb segment highlights the effectiveness of the Genome Partitioner in reducing synthesis costs and timelines for higher-order DNA assembly. The Genome Partitioner is broadly applicable to translate DNA designs into ready-to-order sequences that can be assembled with standardized protocols, thus offering new opportunities to harness the diversity of microbial genomes for synthetic biology applications. The Genome Partitioner web tool can be accessed at https://christenlab.ethz.ch/GenomePartitioner.
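The partitioning step itself is not specified in the abstract. As a simplified, single-level illustration of splitting a genome-scale design into fabricable subblocks with fixed overlaps for downstream assembly, one could write something like the sketch below; the block size, overlap length, and greedy tiling are assumptions, not the published multi-level algorithm.

def partition(sequence, block_size=1000, overlap=40):
    # Split a DNA design into blocks of block_size bases that overlap their
    # neighbors by 'overlap' bases, so adjacent blocks can be joined by assembly.
    step = block_size - overlap
    blocks = []
    for start in range(0, len(sequence), step):
        blocks.append((start, sequence[start:start + block_size]))
        if start + block_size >= len(sequence):
            break
    return blocks

design = "ATGC" * 5000          # 20 kb mock segment
for start, block in partition(design)[:3]:
    print(start, len(block))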
USDA-ARS's Scientific Manuscript database
A synthetic Candida antarctica lipase B (CALB) gene open reading frame (ORF) for expression in yeast was produced using an automated PCR assembly and DNA purification protocol on an integrated robotic workcell. The lycotoxin-1 (Lyt-1) C3 variant gene ORF was added in-frame with the CALB ORF to pote...
Woodhouse, Marjolein; Worsley, Peter R; Voegeli, David; Schoonhoven, Lisette; Bader, Dan L
2015-02-01
Individuals who have reduced mobility are at risk of developing pressure ulcers if they are subjected to sustained static postures. To reduce this risk, clinical guidelines advocate healthcare professionals reposition patients regularly. Automated tilting mechanisms have recently been introduced to provide periodic repositioning. This study compared the performance of such a prototype mattress to conventional manual repositioning. Ten healthy participants (7 male and 3 female, aged 23-66 years) were recruited to compare the effects of an automated tilting mattress to standard manual repositioning, using the 30° tilt. Measures during the tilting protocols (supine, right and left tilt) included comfort and safety scores, interface pressures, inclinometer angles and transcutaneous gas tensions (sacrum and shoulder). Data from these outcomes were compared between each protocol. Results indicated no significant differences for either interface pressures or transcutaneous gas responses between the two protocols (P>0.05 in both cases). Indeed a small proportion of participants (~30%) exhibited changes in transcutaneous oxygen and carbon dioxide values in the shoulder during a right tilt for both protocols. The tilt angles at the sternum and the pelvis were significantly less in the automated tilt compared to the manual tilt (mean difference=9.4-11.5°, P<0.001). Participants reported similar comfort scores for both protocols, although perceived safety was reduced on the prototype mattress. Although further studies are required to assess its performance in maintaining tissue viability, an automated tilting mattress offers the ability to periodically reposition vulnerable individuals, with potential economic savings to health services. Copyright © 2014 Elsevier Ltd. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wahi-Anwar, M; Lo, P; Kim, H
Purpose: The use of Quantitative Imaging (QI) methods in clinical trials requires both verification of adherence to a specified protocol and an assessment of scanner performance under that protocol, which are currently accomplished manually. This work introduces automated phantom identification and image QA measure extraction towards a fully automated CT phantom QA system to perform these functions and facilitate the use of QI methods in clinical trials. Methods: This study used a retrospective cohort of CT phantom scans from existing clinical trial protocols, totaling 84 phantoms across 3 phantom types, acquired with various scanners and protocols. The QA system identifies the input phantom scan through an ensemble of threshold-based classifiers. Each classifier, corresponding to a phantom type, contains a template slice, which is compared to the input scan on a slice-by-slice basis, resulting in slice-wise similarity metric values for each slice compared. Pre-trained thresholds (established from a training set of phantom images matching the template type) are used to filter the similarity distribution, and the slice with the best local mean similarity, whose neighboring slices also meet the threshold requirement, is chosen as the classifier's matched slice (if one exists). The classifier whose matched slice has the best local mean similarity is then chosen as the ensemble's best matching slice. If a best matching slice exists, the image QA algorithm and ROIs corresponding to the matching classifier are used to extract the image QA measures. Results: Automated phantom identification performed with 84.5% accuracy and 88.8% sensitivity on 84 phantoms. Automated image quality measurements (following standard protocol) on identified water phantoms (n=35) matched user QA decisions with 100% accuracy. Conclusion: We provide a fully automated CT phantom QA system consistent with manual QA performance. Further work will include a parallel component to automatically verify image acquisition parameters and automated adherence to specifications. Institutional research agreement, Siemens Healthcare; Past recipient, research grant support, Siemens Healthcare; Consultant, Toshiba America Medical Systems; Consultant, Samsung Electronics; NIH Grant support from: U01 CA181156.
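As a rough sketch of the slice-matching idea described above (a template slice compared against every slice of the input scan, with pre-trained thresholds filtering the similarity distribution and a local-mean criterion picking the matched slice), the NumPy snippet below uses normalized cross-correlation as the similarity metric. The metric, threshold, and window size are assumptions, not the authors' implementation.

import numpy as np

def best_matching_slice(scan, template, threshold=0.85, window=2):
    # Slice-wise similarity (normalized cross-correlation) against the template slice.
    b = (template - template.mean()) / (template.std() + 1e-9)
    sims = np.array([(((s - s.mean()) / (s.std() + 1e-9)) * b).mean() for s in scan])
    best_idx, best_score = None, -np.inf
    for i in range(window, len(sims) - window):
        local = sims[i - window:i + window + 1]
        # Keep only candidates whose whole neighborhood clears the threshold.
        if (local >= threshold).all() and local.mean() > best_score:
            best_idx, best_score = i, local.mean()
    return best_idx, best_score

# Mock scan: 40 slices that all resemble a single template plus noise.
rng = np.random.default_rng(0)
template = rng.random((64, 64))
scan = np.stack([template + 0.05 * rng.random((64, 64)) for _ in range(40)])
print(best_matching_slice(scan, template))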
A New Automated Method and Sample Data Flow for Analysis of Volatile Nitrosamines in Human Urine*
Hodgson, James A.; Seyler, Tiffany H.; McGahee, Ernest; Arnstein, Stephen; Wang, Lanqing
2016-01-01
Volatile nitrosamines (VNAs) are a group of compounds classified as probable (group 2A) and possible (group 2B) carcinogens in humans. Along with certain foods and contaminated drinking water, VNAs are detected at high levels in tobacco products and in both mainstream and sidestream smoke. Our laboratory monitors six urinary VNAs—N-nitrosodimethylamine (NDMA), N-nitrosomethylethylamine (NMEA), N-nitrosodiethylamine (NDEA), N-nitrosopiperidine (NPIP), N-nitrosopyrrolidine (NPYR), and N-nitrosomorpholine (NMOR)—using isotope dilution GC-MS/MS (QQQ) for large population studies such as the National Health and Nutrition Examination Survey (NHANES). In this paper, we report for the first time a new automated sample preparation method to more efficiently quantitate these VNAs. Automation is done using Hamilton STAR™ and Caliper Staccato™ workstations. This new automated method reduces sample preparation time from 4 hours to 2.5 hours while maintaining precision (inter-run CV < 10%) and accuracy (85% - 111%). More importantly this method increases sample throughput while maintaining a low limit of detection (<10 pg/mL) for all analytes. A streamlined sample data flow was created in parallel to the automated method, in which samples can be tracked from receiving to final LIMs output with minimal human intervention, further minimizing human error in the sample preparation process. This new automated method and the sample data flow are currently applied in bio-monitoring of VNAs in the US non-institutionalized population NHANES 2013-2014 cycle. PMID:26949569
Fogel, Mina; Harari, Ayelet; Müller-Holzner, Elisabeth; Zeimet, Alain G; Moldenhauer, Gerhard; Altevogt, Peter
2014-06-25
The L1 cell adhesion molecule (L1CAM) is overexpressed in many human cancers and can serve as a biomarker for prognosis in most of these cancers (including type I endometrial carcinomas). Here we provide an optimized immunohistochemical staining procedure for a widely used automated platform (VENTANA™), which has recourse to commercially available primary antibody and detection reagents. In parallel, we optimized the staining on a semi-automated BioGenix (i6000) immunostainer. These protocols yield good stainings and should represent the basis for a reliable and standardized immunohistochemical detection of L1CAM in a variety of malignancies in different laboratories.
Clarity: An Open Source Manager for Laboratory Automation
Delaney, Nigel F.; Echenique, José Rojas; Marx, Christopher J.
2013-01-01
Software to manage automated laboratories interfaces with hardware instruments, gives users a way to specify experimental protocols, and schedules activities to avoid hardware conflicts. In addition to these basics, modern laboratories need software that can run multiple different protocols in parallel and that can be easily extended to interface with a constantly growing diversity of techniques and instruments. We present Clarity: a laboratory automation manager that is hardware agnostic, portable, extensible and open source. Clarity provides critical features including remote monitoring, robust error reporting by phone or email, and full state recovery in the event of a system crash. We discuss the basic organization of Clarity; demonstrate an example of its implementation for the automated analysis of bacterial growth; and describe how the program can be extended to manage new hardware. Clarity is mature; well documented; actively developed; written in C# for the Common Language Infrastructure; and is free and open source software. These advantages set Clarity apart from currently available laboratory automation programs. PMID:23032169
Kim, Jungkyu; Jensen, Erik C; Stockton, Amanda M; Mathies, Richard A
2013-08-20
A fully integrated multilayer microfluidic chemical analyzer for automated sample processing and labeling, as well as analysis using capillary zone electrophoresis is developed and characterized. Using lifting gate microfluidic control valve technology, a microfluidic automaton consisting of a two-dimensional microvalve cellular array is fabricated with soft lithography in a format that enables facile integration with a microfluidic capillary electrophoresis device. The programmable sample processor performs precise mixing, metering, and routing operations that can be combined to achieve automation of complex and diverse assay protocols. Sample labeling protocols for amino acid, aldehyde/ketone and carboxylic acid analysis are performed automatically followed by automated transfer and analysis by the integrated microfluidic capillary electrophoresis chip. Equivalent performance to off-chip sample processing is demonstrated for each compound class; the automated analysis resulted in a limit of detection of ~16 nM for amino acids. Our microfluidic automaton provides a fully automated, portable microfluidic analysis system capable of autonomous analysis of diverse compound classes in challenging environments.
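The abstract describes assay protocols as combinations of primitive mixing, metering, and routing operations on the valve array. A minimal way to represent such a protocol in software is as a list of operation records executed in order, as sketched below; the operation names and parameters are illustrative, not the device's actual command set.

# A protocol is just data: an ordered list of primitive operations for the valve array.
PROTOCOL = [
    ("meter",    {"reagent": "sample",        "volume_nl": 200}),
    ("meter",    {"reagent": "label_reagent", "volume_nl": 200}),
    ("mix",      {"cycles": 10}),
    ("incubate", {"minutes": 15, "temp_c": 37}),
    ("route",    {"destination": "CE_injection_port"}),
]

def run(protocol, execute):
    # 'execute' is whatever driver actually actuates the valves; here we just log.
    for operation, params in protocol:
        execute(operation, params)

run(PROTOCOL, lambda op, p: print(f"{op:9s} {p}"))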
Automation of fluorescent differential display with digital readout.
Meade, Jonathan D; Cho, Yong-Jig; Fisher, Jeffrey S; Walden, Jamie C; Guo, Zhen; Liang, Peng
2006-01-01
Since its invention in 1992, differential display (DD) has become the most commonly used technique for identifying differentially expressed genes because of its many advantages over competing technologies such as DNA microarray, serial analysis of gene expression (SAGE), and subtractive hybridization. Despite the great impact of the method on biomedical research, there has been a lack of automation of DD technology to increase its throughput and accuracy for systematic gene expression analysis. Most previous DD work has taken a "shotgun" approach of identifying one gene at a time, with a limited number of polymerase chain reaction (PCR) reactions set up manually, giving DD a low-tech and low-throughput image. We have optimized the DD process with a new platform that incorporates fluorescent digital readout, automated liquid handling, and large-format gels capable of running entire 96-well plates. The resulting streamlined fluorescent DD (FDD) technology offers unprecedented accuracy, sensitivity, and throughput in comprehensive and quantitative analysis of gene expression. These major improvements will allow researchers to find differentially expressed genes of interest, both known and novel, quickly and easily.
PS1-41: Just Add Data: Implementing an Event-Based Data Model for Clinical Trial Tracking
Fuller, Sharon; Carrell, David; Pardee, Roy
2012-01-01
Background/Aims Clinical research trials often have similar fundamental tracking needs, despite being quite variable in their specific logic and activities. A model tracking database that can be quickly adapted by a variety of studies has the potential to achieve significant efficiencies in database development and maintenance. Methods Over the course of several different clinical trials, we have developed a database model that is highly adaptable to a variety of projects. Rather than hard-coding each specific event that might occur in a trial, along with its logical consequences, this model considers each event and its parameters to be a data record in its own right. Each event may have related variables (metadata) describing its prerequisites, subsequent events due, associated mailings, or events that it overrides. The metadata for each event is stored in the same record with the event name. When changes are made to the study protocol, no structural changes to the database are needed. One has only to add or edit events and their metadata. Changes in the event metadata automatically determine any related logic changes. In addition to streamlining application code, this model simplifies communication between the programmer and other team members. Database requirements can be phrased as changes to the underlying data, rather than to the application code. The project team can review a single report of events and metadata and easily see where changes might be needed. In addition to benefitting from streamlined code, the front end database application can also implement useful standard features such as automated mail merges and to do lists. Results The event-based data model has proven itself to be robust, adaptable and user-friendly in a variety of study contexts. We have chosen to implement it as a SQL Server back end and distributed Access front end. Interested readers may request a copy of the Access front end and scripts for creating the back end database. Discussion An event-based database with a consistent, robust set of features has the potential to significantly reduce development time and maintenance expense for clinical trial tracking databases.
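A stripped-down illustration of the event-as-data idea described above is given below: each event record carries its own metadata (prerequisites and subsequent events due), so protocol changes only require editing data, not application code. The field names, events, and day offsets are invented for the example and do not reflect the authors' schema.

from datetime import date, timedelta

# Each event is a data record; the study logic lives in the metadata.
EVENTS = {
    "consent":          {"prerequisites": [],                  "next": [("baseline_survey", 0)]},
    "baseline_survey":  {"prerequisites": ["consent"],         "next": [("followup_survey", 90)]},
    "followup_survey":  {"prerequisites": ["baseline_survey"], "next": []},
}

def events_due(completed):
    # completed: {event name: completion date}; returns (event, due date) pairs.
    due = []
    for name, done_on in completed.items():
        for nxt, offset_days in EVENTS[name]["next"]:
            if nxt in completed:
                continue
            if all(p in completed for p in EVENTS[nxt]["prerequisites"]):
                due.append((nxt, done_on + timedelta(days=offset_days)))
    return due

print(events_due({"consent": date(2024, 1, 2)}))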
An Outcomes Study on the Effects of the Singapore General Hospital Burns Protocol.
Liang, Weihao; Kok, Yee Onn; Tan, Bien Keem; Chong, Si Jack
2018-01-01
The Singapore General Hospital Burns Protocol was implemented in May 2014 to standardize treatment for all burns patients, incorporate new techniques and materials, and streamline the processes and workflow of burns management. This study aims to analyze the effects of the Burns Protocol 2 years after its implementation. Using a REDCap electronic database, all burns patients admitted from May 2013 to April 2016 were included in the study. The historical preimplementation control group was composed of patients admitted from May 2013 to April 2014 (n = 96). The postimplementation prospective study cohort consisted of patients admitted from May 2014 to April 2016 (n = 243). Details of the patients collected included age, sex, comorbidities, total body surface area (TBSA) burns, time until surgery, number of surgeries, number of positive tissue and blood cultures, and length of hospital stay. There was no statistically significant difference in the demographics of the two groups. The study group had a statistically significant shorter time to surgery compared with the control group (20.8 vs 38.1, P < 0.0001). The study group also averaged fewer surgeries performed (1.96 vs 2.29, P = 0.285), which, after accounting for the extent of burns, was statistically significant (number of surgeries/TBSA, 0.324 vs 0.506; P = 0.0499). The study group also had significantly shorter length of stay (12.5 vs 16.8, P = 0.0273), a shorter length of stay/TBSA burns (0.874 vs 1.342, P = 0.0101), and fewer positive tissue cultures (0.6 vs 1.3, P = 0.0003). The study group also trended toward fewer positive blood culture results (0.09 vs 0.35, P = 0.0593), although the difference was just shy of statistical significance. The new Singapore General Hospital Burns Protocol has revolutionized Singapore burns care by introducing a streamlined, multidisciplinary approach to burns management, resulting in improved patient outcomes, lowered health care costs, and improved system resource use.
Dalecki, Alex G; Wolschendorf, Frank
2016-07-01
Facing totally resistant bacteria, traditional drug discovery efforts have proven to be of limited use in replenishing our depleted arsenal of therapeutic antibiotics. Recently, the natural anti-bacterial properties of metal ions in synergy with metal-coordinating ligands have shown potential for generating new candidate molecules with therapeutic downstream applications. We recently developed a novel combinatorial screening approach to identify compounds with copper-dependent anti-bacterial properties. Through a parallel screening technique, the assay distinguishes between copper-dependent and independent activities against Mycobacterium tuberculosis, with hits being defined as compounds with copper-dependent activities. These activities must then be linked to a compound master list to process and analyze the data and to identify the hit molecules, a labor-intensive and error-prone analysis. Here, we describe a software program built to automate this analysis in order to streamline our workflow significantly. We conducted a small, 1440 compound screen against M. tuberculosis and used it as an example framework to build and optimize the software. Though specifically adapted to our own needs, it can be readily expanded for any small- to medium-throughput screening effort, parallel or conventional. Further, by virtue of the underlying Linux server, it can be easily adapted for chemoinformatic analysis of screens through packages such as OpenBabel. Overall, this setup represents an easy-to-use solution for streamlining processing and analysis of biological screening data, as well as offering a scaffold for ready functionality expansion. Copyright © 2016 Elsevier B.V. All rights reserved.
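The core step the abstract describes, joining parallel +Cu/-Cu plate reads to a compound master list and flagging copper-dependent hits, can be illustrated with the rough sketch below; the file layout, column names, and 50% inhibition threshold are assumptions for illustration, not the published software's format.

```python
import csv

def load_column(path, column):
    """Read a per-well value from a CSV with columns 'well' and the named column."""
    with open(path, newline="") as fh:
        return {row["well"]: row[column] for row in csv.DictReader(fh)}

def copper_dependent_hits(plus_cu_csv, minus_cu_csv, master_csv, threshold=50.0):
    """Keep compounds that inhibit growth only when copper is present."""
    plus_cu = {w: float(v) for w, v in load_column(plus_cu_csv, "percent_inhibition").items()}
    minus_cu = {w: float(v) for w, v in load_column(minus_cu_csv, "percent_inhibition").items()}
    master = load_column(master_csv, "compound_id")          # well -> compound ID
    return [(compound, well, plus_cu[well], minus_cu[well])
            for well, compound in master.items()
            if plus_cu.get(well, 0.0) >= threshold > minus_cu.get(well, 0.0)]
```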
Managing complexity in simulations of land surface and near-surface processes
Coon, Ethan T.; Moulton, J. David; Painter, Scott L.
2016-01-12
Increasing computing power and the growing role of simulation in Earth systems science have led to an increase in the number and complexity of processes in modern simulators. We present a multiphysics framework that specifies interfaces for coupled processes and automates weak and strong coupling strategies to manage this complexity. Process management is enabled by viewing the system of equations as a tree, where individual equations are associated with leaf nodes and coupling strategies with internal nodes. A dynamically generated dependency graph connects a variable to its dependencies, streamlining and automating model evaluation, easing model development, and ensuring models are modular and flexible. Additionally, the dependency graph is used to ensure that data requirements are consistent between all processes in a given simulation. Here we discuss the design and implementation of these concepts within the Arcos framework, and demonstrate their use for verification testing and hypothesis evaluation in numerical experiments.
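The dependency-graph pattern described above can be sketched schematically as follows; this is only an illustration of the idea (toy variable names and formulas), not the Arcos API: evaluators declare their dependencies, and requesting a variable recursively updates everything it depends on.

```python
class DependencyGraph:
    """Toy evaluator: each derived variable is a function of named dependencies."""
    def __init__(self):
        self.evaluators = {}   # name -> (dependencies, function)
        self.values = {}       # primary variables and cached results

    def register(self, name, depends_on, fn):
        self.evaluators[name] = (depends_on, fn)

    def get(self, name):
        if name in self.values:                 # primary or already evaluated
            return self.values[name]
        deps, fn = self.evaluators[name]        # evaluate dependencies first, recursively
        self.values[name] = fn(*[self.get(d) for d in deps])
        return self.values[name]

g = DependencyGraph()
g.values["pressure"] = 101325.0                 # primary (state) variables
g.values["temperature"] = 283.15
g.register("density", ["pressure", "temperature"], lambda p, T: p / (287.0 * T))
g.register("mass_flux", ["density"], lambda rho: rho * 0.5)
print(g.get("mass_flux"))   # evaluating mass_flux pulls density, then pressure and temperature
```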
Automated Fluid Feature Extraction from Transient Simulations
NASA Technical Reports Server (NTRS)
Haimes, Robert
2000-01-01
In the past, feature extraction and identification were interesting concepts, but not required in understanding the physics of a steady flow field. This is because the results of the more traditional tools like iso-surfaces, cuts and streamlines, were more interactive and easily abstracted so they could be represented to the investigator. These tools worked and properly conveyed the collected information at the expense of a great deal of interaction. For unsteady flow-fields, the investigator does not have the luxury of spending time scanning only one 'snap-shot' of the simulation. Automated assistance is required in pointing out areas of potential interest contained within the flow. This must not require a heavy compute burden (the visualization should not significantly slow down the solution procedure for co-processing environments like pV3). And methods must be developed to abstract the feature and display it in a manner that physically makes sense.
Hausmann, Axel; Cancian de Araujo, Bruno; Sutrisno, Hari; Peggie, Djunijanti; Schmidt, Stefan
2017-01-01
Here we present a general collecting and preparation protocol for DNA barcoding of Lepidoptera as part of large-scale rapid biodiversity assessment projects, and a comparison with alternative preserving and vouchering methods. About 98% of the sequenced specimens processed using the present collecting and preparation protocol yielded sequences with more than 500 base pairs. The study is based on the first outcomes of the Indonesian Biodiversity Discovery and Information System (IndoBioSys). IndoBioSys is a German-Indonesian research project that is conducted by the Museum für Naturkunde in Berlin and the Zoologische Staatssammlung München, in close cooperation with the Research Center for Biology – Indonesian Institute of Sciences (RCB-LIPI, Bogor). PMID:29134041
O'Brien, Jeremy J; Stormann, Jeremy; Roche, Kelli; Cabral-Goncalves, Ines; Monks, Annamarie; Hallett, Donna; Mortele, Koenraad J
2017-02-01
The purpose of this study was to describe and evaluate the effect of focused process improvements on protocol selection and scheduling in the MRI division of a busy academic medical center, as measured by examination and room times, magnet fill rate, and potential revenue increases and cost savings to the department. Focused process improvements, led by a multidisciplinary team at a large academic medical center, were directed at streamlining MRI protocols and optimizing the matching of protocol ordering to scheduling while maintaining or improving image quality. Data were collected before (June 2013) and after (March 2015) implementation of focused process improvements and divided by subspecialty according to type of examination, allotted examination time, actual examination time, and MRI parameters. Direct and indirect costs were compiled and analyzed in consultation with the business department. Data were compared to evaluate effects on selected outcome and efficiency measures, as well as revenue and cost considerations. Statistical analysis was performed using a t test. During the month of June 2013, 2145 MRI examinations were performed at our center; 2702 were performed in March 2015. Neuroradiology examinations were the most common (59% in June 2013, 56% in March 2015), followed by body examinations (25% and 27%). All protocols and parameters were analyzed and streamlined for each examination, with slice thickness, TR, and echo train length among the most adjusted parameters. Mean time per examination decreased from 43.4 minutes to 36.7 minutes, and mean room time per patient decreased from 46.3 to 43.6 minutes (p = 0.009). Potential revenue from increased throughput may yield up to $3 million yearly (at $800 net revenue per scan) or produce cost savings if the facility can reduce staffed scanner hours or the number of scanners in its fleet. Actual revenue and expense impacts depend on the facility's fixed and variable cost structure, payer contracts, MRI fleet composition, and unmet MRI demand. Focused process improvements in selecting MRI protocols and scheduling examinations significantly increased throughput in the MRI division, thereby increasing capacity and revenue. Shorter scan and department times may also improve patient experience.
Data exchange technology based on handshake protocol for industrial automation system
NASA Astrophysics Data System (ADS)
Astafiev, A. V.; Shardin, T. O.
2018-05-01
This article considers data exchange technology based on the handshake protocol for industrial automation systems. Methods of organizing the technology in client-server applications are analyzed, and the main threats to client-server applications that arise during information interaction between users are identified. A comparative analysis of analogous systems was also carried out, from which the most suitable option was chosen for further use. The basic schemes of operation of the handshake protocol are shown, as well as the general scheme of the implemented application, which describes the entire process of interaction between the client and the server.
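The article does not reproduce its message formats, so the following is only a generic sketch of an application-level handshake between a client and a server (the message names HELLO/TOKEN/ACK/READY are illustrative assumptions): the client opens the exchange, the server issues a session token, and the client must acknowledge it before any data flows.

```python
import socket, threading, time

def server(port=9000):
    srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    srv.bind(("127.0.0.1", port))
    srv.listen(1)
    conn, _ = srv.accept()
    if conn.recv(64) == b"HELLO":
        conn.sendall(b"TOKEN:42")
        if conn.recv(64) == b"ACK:42":
            conn.sendall(b"READY")      # handshake complete, payload may follow
    conn.close()
    srv.close()

def client(port=9000):
    c = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    c.connect(("127.0.0.1", port))
    c.sendall(b"HELLO")
    token = c.recv(64).split(b":")[1]
    c.sendall(b"ACK:" + token)
    print(c.recv(64))                   # b'READY'
    c.close()

t = threading.Thread(target=server)
t.start()
time.sleep(0.2)                         # give the server time to start listening
client()
t.join()
```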
Bhattacharyya, S; Fan, L; Vo, L; Labadie, J
2000-04-01
Amine libraries and their derivatives are important targets for high throughput synthesis because of their versatility as medicinal agents and agrochemicals. As a part of our efforts towards automated chemical library synthesis, a titanium(IV) isopropoxide mediated solution phase reductive amination protocol was successfully translated to automation on the Trident(TM) library synthesizer of Argonaut Technologies. An array of 24 secondary amines was prepared in high yield and purity from 4 primary amines and 6 carbonyl compounds. These secondary amines were further utilized in a split synthesis to generate libraries of ureas, amides and sulfonamides in solution phase on the Trident(TM). The automated runs included 192 reactions to synthesize 96 ureas in duplicate and 96 reactions to synthesize 48 amides and 48 sulfonamides. A number of polymer-assisted solution phase protocols were employed for parallel work-up and purification of the products in each step.
NASA Astrophysics Data System (ADS)
Krotov, Aleksei; Pankin, Victor
2017-09-01
The assessment of central circulation (including heart function) parameters is vital in the preventive diagnostics of inherent and acquired heart failures and during polychemotherapy. The protocols currently applied in Russia do not fully utilize the first-pass assessment (FPRNA), which results in poor data formalization, even though FPRNA is one of the fastest, most affordable and most compact methods among radioisotope diagnostic protocols. A non-imaging algorithm based on existing protocols has been designed to use the readings of an additional detector above the vena subclavia to determine the total blood volume (TBV), without requiring blood sampling in contrast to current protocols. An automated processing of precordial detector readings is presented, in order to determine the heart stroke volume (SV). Two techniques to estimate the ejection fraction (EF) of the heart are discussed.
Bing, Sen; Chen, Kang; Hou, Hong; Zhang, Weijuan; Li, Linyi; Wei, Jiao; Shu, Chang; Wan, Yi
2016-04-01
This study aimed to determine the accuracy of the Microlife BP A200 Comfort and W2 Slim automated blood pressure monitors according to the European Society of Hypertension International Protocol revision 2010 and the ANSI/AAMI/ISO 81060-2:2013 protocols. The devices were assessed on 33 participants according to the European Society of Hypertension requirements and were then tested on 85 participants according to the ANSI/AAMI/ISO 81060-2:2013 criteria. Procedures and data analysis were carried out following protocol guidelines precisely. The Microlife BP A200 Comfort and W2 Slim devices passed the criteria of the European Society of Hypertension International Protocol revision 2010 for both systolic blood pressure and diastolic blood pressure. The devices also fulfilled the ANSI/AAMI/ISO 81060-2:2013 criteria, with mean differences in SBP and DBP between the devices and observers of 0.38±5.12 and 0.28±4.29 mmHg for the BP A200 Comfort and 1.01±6.80 and 0.34±5.62 mmHg for the W2 Slim, respectively. The Microlife BP A200 Comfort and W2 Slim automated blood pressure monitors fulfilled the European Society of Hypertension revision 2010 and the ANSI/AAMI/ISO 81060-2:2013 protocols, and can be recommended for self-measurement in the general population.
An ethernet/IP security review with intrusion detection applications
DOE Office of Scientific and Technical Information (OSTI.GOV)
Laughter, S. A.; Williams, R. D.
2006-07-01
Supervisory Control and Data Acquisition (SCADA) and automation networks, used throughout utility and manufacturing applications, have their own specific set of operational and security requirements when compared to corporate networks. The modern climate of heightened national security and awareness of terrorist threats has made the security of these systems of prime concern. There is a need to understand the vulnerabilities of these systems and how to monitor and protect them. Ethernet/IP is a member of a family of protocols based on the Control and Information Protocol (CIP). Ethernet/IP allows automation systems to be utilized on and integrated with traditional TCP/IP networks, facilitating integration of these networks with corporate systems and even the Internet. A review of the CIP protocol and the additions Ethernet/IP makes to it has been done to reveal the kind of attacks made possible through the protocol. A set of rules for the SNORT Intrusion Detection software is developed based on the results of the security review. These can be used to monitor, and possibly actively protect, a SCADA or automation network that utilizes Ethernet/IP in its infrastructure. (authors)
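The SNORT rules themselves are not reproduced in the abstract. As a rough analogue of what such a rule encodes, the sketch below shows a whitelist-style check that flags EtherNet/IP connections from hosts outside an approved set; the port number (commonly cited as TCP 44818 for explicit messaging) and the addresses are assumptions for illustration only.

```python
APPROVED_HOSTS = {"10.0.0.5", "10.0.0.6"}      # hypothetical engineering workstations
ENIP_TCP_PORT = 44818                          # commonly cited EtherNet/IP explicit-messaging port

def alert_on_packet(src_ip, dst_ip, dst_port):
    """Return an alert string for EtherNet/IP connections from unapproved hosts,
    mimicking a whitelist-style intrusion detection rule."""
    if dst_port == ENIP_TCP_PORT and src_ip not in APPROVED_HOSTS:
        return f"ALERT: unauthorized EtherNet/IP connection {src_ip} -> {dst_ip}:{dst_port}"
    return None

print(alert_on_packet("192.168.1.44", "10.0.0.20", 44818))
```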
Ferrarini, Alberto; Forcato, Claudio; Buson, Genny; Tononi, Paola; Del Monaco, Valentina; Terracciano, Mario; Bolognesi, Chiara; Fontana, Francesca; Medoro, Gianni; Neves, Rui; Möhlendick, Birte; Rihawi, Karim; Ardizzoni, Andrea; Sumanasuriya, Semini; Flohr, Penny; Lambros, Maryou; de Bono, Johann; Stoecklein, Nikolas H; Manaresi, Nicolò
2018-01-01
Chromosomal instability and associated chromosomal aberrations are hallmarks of cancer and play a critical role in disease progression and development of resistance to drugs. Single-cell genome analysis has gained interest in recent years as a source of biomarkers for targeted-therapy selection and drug resistance, and several methods have been developed to amplify the genomic DNA and to produce libraries suitable for Whole Genome Sequencing (WGS). However, most protocols require several enzymatic and cleanup steps, thus increasing the complexity and length of protocols, while robustness and speed are key factors for clinical applications. To tackle this issue, we developed a single-tube, single-step, streamlined protocol, exploiting ligation mediated PCR (LM-PCR) Whole Genome Amplification (WGA) method, for low-pass genome sequencing with the Ion Torrent™ platform and copy number alterations (CNAs) calling from single cells. The method was evaluated on single cells isolated from 6 aberrant cell lines of the NCI-H series. In addition, to demonstrate the feasibility of the workflow on clinical samples, we analyzed single circulating tumor cells (CTCs) and white blood cells (WBCs) isolated from the blood of patients affected by prostate cancer or lung adenocarcinoma. The results obtained show that the developed workflow generates data accurately representing whole genome absolute copy number profiles of single cells and allows alterations calling at resolutions down to 100 Kbp with as few as 200,000 reads. The presented data demonstrate the feasibility of the Ampli1™ WGA-based low-pass workflow for detection of CNAs in single tumor cells which would be of particular interest for genome-driven targeted therapy selection and for monitoring of disease progression.
Tzeng, Yan-Kai; Chang, Cheng-Chun; Huang, Chien-Ning; Wu, Chih-Che; Han, Chau-Chung; Chang, Huan-Cheng
2008-09-01
A streamlined protocol has been developed to accelerate, simplify, and enhance matrix-assisted laser desorption/ionization (MALDI) time-of-flight (TOF) mass spectrometry (MS) of neutral underivatized glycans released from glycoproteins. It involved microwave-assisted enzymatic digestion and release of glycans, followed by rapid removal of proteins and peptides with carboxylated/oxidized diamond nanoparticles, and finally treating the analytes with NaOH before mixing them with acidic matrix (such as 2,5-dihydroxybenzoic acid) to suppress the formation of both peptide and potassiated oligosaccharide ions in MS analysis. The advantages of this protocol were demonstrated with MALDI-TOF-MS of N-linked glycans released from ovalbumin and ribonuclease B.
Butler, Tracy; Zaborszky, Laszlo; Pirraglia, Elizabeth; Li, Jinyu; Wang, Xiuyuan Hugh; Li, Yi; Tsui, Wai; Talos, Delia; Devinsky, Orrin; Kuchna, Izabela; Nowicki, Krzysztof; French, Jacqueline; Kuzniecky, Rubin; Wegiel, Jerzy; Glodzik, Lidia; Rusinek, Henry; DeLeon, Mony J.; Thesen, Thomas
2014-01-01
Septal nuclei, located in basal forebrain, are strongly connected with hippocampi and important in learning and memory, but have received limited research attention in human MRI studies. While probabilistic maps for estimating septal volume on MRI are now available, they have not been independently validated against manual tracing of MRI, typically considered the gold standard for delineating brain structures. We developed a protocol for manual tracing of the human septal region on MRI based on examination of neuroanatomical specimens. We applied this tracing protocol to T1 MRI scans (n=86) from subjects with temporal epilepsy and healthy controls to measure septal volume. To assess the inter-rater reliability of the protocol, a second tracer used the same protocol on 20 scans that were randomly selected from the 72 healthy controls. In addition to measuring septal volume, maximum septal thickness between the ventricles was measured and recorded. The same scans (n=86) were also analysed using septal probabilistic maps and Dartel toolbox in SPM. Results show that our manual tracing algorithm is reliable, and that septal volume measurements obtained via manual and automated methods correlate significantly with each other (p<.001). Both manual and automated methods detected significantly enlarged septal nuclei in patients with temporal lobe epilepsy in accord with a proposed compensatory neuroplastic process related to the strong connections between septal nuclei and hippocampi. Septal thickness, which was simple to measure with excellent inter-rater reliability, correlated well with both manual and automated septal volume, suggesting it could serve as an easy-to-measure surrogate for septal volume in future studies. Our results call attention to the important though understudied human septal region, confirm its enlargement in temporal lobe epilepsy, and provide a reliable new manual delineation protocol that will facilitate continued study of this critical region. PMID:24736183
Butler, Tracy; Zaborszky, Laszlo; Pirraglia, Elizabeth; Li, Jinyu; Wang, Xiuyuan Hugh; Li, Yi; Tsui, Wai; Talos, Delia; Devinsky, Orrin; Kuchna, Izabela; Nowicki, Krzysztof; French, Jacqueline; Kuzniecky, Rubin; Wegiel, Jerzy; Glodzik, Lidia; Rusinek, Henry; deLeon, Mony J; Thesen, Thomas
2014-08-15
Septal nuclei, located in basal forebrain, are strongly connected with hippocampi and important in learning and memory, but have received limited research attention in human MRI studies. While probabilistic maps for estimating septal volume on MRI are now available, they have not been independently validated against manual tracing of MRI, typically considered the gold standard for delineating brain structures. We developed a protocol for manual tracing of the human septal region on MRI based on examination of neuroanatomical specimens. We applied this tracing protocol to T1 MRI scans (n=86) from subjects with temporal epilepsy and healthy controls to measure septal volume. To assess the inter-rater reliability of the protocol, a second tracer used the same protocol on 20 scans that were randomly selected from the 72 healthy controls. In addition to measuring septal volume, maximum septal thickness between the ventricles was measured and recorded. The same scans (n=86) were also analyzed using septal probabilistic maps and DARTEL toolbox in SPM. Results show that our manual tracing algorithm is reliable, and that septal volume measurements obtained via manual and automated methods correlate significantly with each other (p<.001). Both manual and automated methods detected significantly enlarged septal nuclei in patients with temporal lobe epilepsy in accord with a proposed compensatory neuroplastic process related to the strong connections between septal nuclei and hippocampi. Septal thickness, which was simple to measure with excellent inter-rater reliability, correlated well with both manual and automated septal volume, suggesting it could serve as an easy-to-measure surrogate for septal volume in future studies. Our results call attention to the important though understudied human septal region, confirm its enlargement in temporal lobe epilepsy, and provide a reliable new manual delineation protocol that will facilitate continued study of this critical region. Copyright © 2014 Elsevier Inc. All rights reserved.
Automated Indexing of the Hazardous Substances Data Bank (HSDB)
Nuss, Carlo; Chang, Hua Florence; Moore, Dorothy; Fonger, George C.
2003-01-01
The Hazardous Substances Data Bank (HSDB), produced and maintained by the National Library of Medicine (NLM), contains over 4600 records on potentially hazardous chemicals. To enhance information retrieval from HSDB, NLM has undertaken the development of an automated HSDB indexing protocol as part of its Indexing Initiative. The NLM Indexing Initiative investigates methods whereby automated indexing may partially or completely substitute for human indexing. The poster’s purpose is to describe the HSDB Automated Indexing Project. PMID:14728459
Evaluation of Cross-Protocol Stability of a Fully Automated Brain Multi-Atlas Parcellation Tool.
Liang, Zifei; He, Xiaohai; Ceritoglu, Can; Tang, Xiaoying; Li, Yue; Kutten, Kwame S; Oishi, Kenichi; Miller, Michael I; Mori, Susumu; Faria, Andreia V
2015-01-01
Brain parcellation tools based on multiple-atlas algorithms have recently emerged as a promising method with which to accurately define brain structures. When dealing with data from various sources, it is crucial that these tools are robust for many different imaging protocols. In this study, we tested the robustness of a multiple-atlas, likelihood fusion algorithm using Alzheimer's Disease Neuroimaging Initiative (ADNI) data with six different protocols, comprising three manufacturers and two magnetic field strengths. The entire brain was parceled into five different levels of granularity. In each level, which defines a set of brain structures, ranging from eight to 286 regions, we evaluated the variability of brain volumes related to the protocol, age, and diagnosis (healthy or Alzheimer's disease). Our results indicated that, with proper pre-processing steps, the impact of different protocols is minor compared to biological effects, such as age and pathology. A precise knowledge of the sources of data variation enables sufficient statistical power and ensures the reliability of an anatomical analysis when using this automated brain parcellation tool on datasets from various imaging protocols, such as clinical databases.
Tschiersch, Henning; Junker, Astrid; Meyer, Rhonda C; Altmann, Thomas
2017-01-01
Automated plant phenotyping has been established as a powerful new tool in studying plant growth, development and response to various types of biotic or abiotic stressors. Respective facilities mainly apply non-invasive imaging-based methods, which enable the continuous quantification of the dynamics of plant growth and physiology during developmental progression. However, especially for plants of larger size, integrative, automated and high throughput measurements of complex physiological parameters such as photosystem II efficiency determined through kinetic chlorophyll fluorescence analysis remain a challenge. We present the technical installations and the establishment of experimental procedures that allow the integrated high throughput imaging of all commonly determined PSII parameters for small and large plants using kinetic chlorophyll fluorescence imaging systems (FluorCam, PSI) integrated into automated phenotyping facilities (Scanalyzer, LemnaTec). Besides determination of the maximum PSII efficiency, we focused on implementation of high throughput amenable protocols recording PSII operating efficiency (ΦPSII). Using the presented setup, this parameter is shown to be reproducibly measured in differently sized plants despite the corresponding variation in distance between plants and light source that caused small differences in incident light intensity. Values of ΦPSII obtained with the automated chlorophyll fluorescence imaging setup correlated very well with conventionally determined data using a spot-measuring chlorophyll fluorometer. The established high throughput operating protocols enable the screening of up to 1080 small and 184 large plants per hour, respectively. The application of the implemented high throughput protocols is demonstrated in screening experiments performed with large Arabidopsis and maize populations assessing natural variation in PSII efficiency. The incorporation of imaging systems suitable for kinetic chlorophyll fluorescence analysis leads to a substantial extension of the feature spectrum to be assessed in the presented high throughput automated plant phenotyping platforms, thus enabling the simultaneous assessment of plant architectural and biomass-related traits and their relations to physiological features such as PSII operating efficiency. The implemented high throughput protocols are applicable to a broad spectrum of model and crop plants of different sizes (up to 1.80 m height) and architectures. The deeper understanding of the relation of plant architecture, biomass formation and photosynthetic efficiency has a great potential with respect to crop and yield improvement strategies.
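For reference, the two PSII parameters named above are derived from kinetic chlorophyll fluorescence using the standard definitions shown below; these formulas are general textbook relations rather than anything specific to the FluorCam/LemnaTec setup, and the numerical values are made up for illustration.

```python
def max_psii_efficiency(f0, fm):
    """Maximum PSII efficiency of a dark-adapted leaf: Fv/Fm = (Fm - F0) / Fm."""
    return (fm - f0) / fm

def psii_operating_efficiency(fs, fm_prime):
    """PSII operating efficiency in the light: PhiPSII = (Fm' - Fs) / Fm'."""
    return (fm_prime - fs) / fm_prime

print(max_psii_efficiency(f0=0.16, fm=0.80))              # 0.80, typical of unstressed plants
print(psii_operating_efficiency(fs=0.35, fm_prime=0.55))  # ~0.36 at the given light level
```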
Stangegaard, Michael; Frøslev, Tobias G; Frank-Hansen, Rune; Hansen, Anders J; Morling, Niels
2011-04-01
We have implemented and validated automated protocols for DNA extraction and PCR setup using a Tecan Freedom EVO liquid handler mounted with the Te-MagS magnetic separation device (Tecan, Männedorf, Switzerland). The protocols were validated for accredited forensic genetic work according to ISO 17025 using the Qiagen MagAttract DNA Mini M48 kit (Qiagen GmbH, Hilden, Germany) from fresh whole blood and blood from deceased individuals. The workflow was simplified by returning the DNA extracts to the original tubes minimizing the risk of misplacing samples. The tubes that originally contained the samples were washed with MilliQ water before the return of the DNA extracts. The PCR was setup in 96-well microtiter plates. The methods were validated for the kits: AmpFℓSTR Identifiler, SGM Plus and Yfiler (Applied Biosystems, Foster City, CA), GenePrint FFFL and PowerPlex Y (Promega, Madison, WI). The automated protocols allowed for extraction and addition of PCR master mix of 96 samples within 3.5h. In conclusion, we demonstrated that (1) DNA extraction with magnetic beads and (2) PCR setup for accredited, forensic genetic short tandem repeat typing can be implemented on a simple automated liquid handler leading to the reduction of manual work, and increased quality and throughput. Copyright © 2011 Society for Laboratory Automation and Screening. Published by Elsevier Inc. All rights reserved.
Umbrello, M; Salice, V; Spanu, P; Formenti, P; Barassi, A; Melzi d'Eril, G V; Iapichino, G
2014-10-01
The optimal level and modality of glucose control in critically ill patients is still debated. A protocolized approach and the use of nearly-continuous technologies are recommended to manage hyperglycemia, hypoglycemia and glycemic variability. We recently proposed a pathophysiology-based glucose control protocol which takes into account patient glucose/carbohydrate intake and insulin resistance. The aim of the present investigation was to assess the performance of our protocol with an automated intermittent plasma glucose monitoring device (OptiScanner™ 5000). The OptiScanner™ was used in 6 septic patients, providing a glucose measurement every 15 min from a side-port of an indwelling central venous catheter. The target level of glucose was 80-150 mg/dL. Insulin infusion and kcal with nutritional support were also recorded. 6 septic patients were studied for 319 h (1277 measurements); 58 [45-65] hours for each patient (measurements/patient: 231 [172-265]). Blood glucose was at target for 93 [90-98]% of study time. Mean plasma glucose was 126 ± 11 mg/dL. Only 3 hypoglycemic episodes (78, 78, 69 mg/dL) were recorded. Glucose variability was limited: the plasma glucose coefficient of variation was 11.7 ± 4.0% and the plasma glucose standard deviation was 14.3 ± 5.5 mg/dL. The local glucose control protocol achieved satisfactory glucose control in septic patients along with a high degree of safety. Automated intermittent plasma glucose monitoring seemed useful to assess the performance of the protocol. Copyright © 2013 Elsevier Ltd and European Society for Clinical Nutrition and Metabolism. All rights reserved.
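The performance metrics quoted (time in target, standard deviation, coefficient of variation) follow directly from the 15-minute glucose series by simple arithmetic; a minimal sketch on made-up readings:

```python
from statistics import mean, stdev

def glucose_metrics(readings_mg_dl, low=80, high=150):
    """Time-in-target, SD and coefficient of variation for a series of plasma
    glucose readings taken at a fixed interval (e.g. every 15 minutes)."""
    m, sd = mean(readings_mg_dl), stdev(readings_mg_dl)
    in_target = sum(low <= g <= high for g in readings_mg_dl) / len(readings_mg_dl)
    return {"mean": m, "sd": sd, "cv_percent": 100 * sd / m, "time_in_target": in_target}

print(glucose_metrics([118, 126, 131, 140, 122, 135, 128, 119]))
```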
2005-03-01
This final report describes a new "compositional" method for protocol design and implementation, in which small microprotocols are combined to obtain a protocol customized to the needs of a specific setting, under control of an automated theorem proving system. The work relates to the Network Centric Enterprise (NCES) visions and documents a wide range of contributions and technology transitions.
Advanced Map For Real-Time Process Control
NASA Astrophysics Data System (ADS)
Shiobara, Yasuhisa; Matsudaira, Takayuki; Sashida, Yoshio; Chikuma, Makoto
1987-10-01
MAP, a communications protocol for factory automation proposed by General Motors [1], has been accepted by users throughout the world and is rapidly becoming a user standard. In fact, it is now a LAN standard for factory automation. MAP is intended to interconnect different devices, such as computers and programmable devices, made by different manufacturers, enabling them to exchange information. It is based on the OSI intercomputer communications protocol standard under development by the ISO. With progress and standardization, MAP is being investigated for application to process control fields other than factory automation [2]. The transmission response time of the network system and centralized management of data exchanged with various devices for distributed control are important in the case of a real-time process control with programmable controllers, computers, and instruments connected to a LAN system. MAP/EPA and MINI MAP aim at reduced overhead in protocol processing and enhanced transmission response. If applied to real-time process control, a protocol based on point-to-point and request-response transactions limits throughput and transmission response. This paper describes an advanced MAP LAN system applied to real-time process control by adding a new data transmission control that performs multicasting communication voluntarily and periodically in the priority order of data to be exchanged.
Collecting and Animating Online Satellite Images.
ERIC Educational Resources Information Center
Irons, Ralph
1995-01-01
Describes how to generate automated classroom resources from the Internet. Topics covered include viewing animated satellite weather images using file transfer protocol (FTP); sources of images on the Internet; shareware available for viewing images; software for automating image retrieval; procedures for animating satellite images; and storing…
32 CFR 806b.35 - Balancing protection.
Code of Federal Regulations, 2014 CFR
2014-07-01
..., Computer Security, 5 for procedures on safeguarding personal information in automated records. 5 http://www... automated system with a log-on protocol. Others may require more sophisticated security protection based on the sensitivity of the information. Classified computer systems or those with established audit and...
32 CFR 806b.35 - Balancing protection.
Code of Federal Regulations, 2013 CFR
2013-07-01
..., Computer Security, 5 for procedures on safeguarding personal information in automated records. 5 http://www... automated system with a log-on protocol. Others may require more sophisticated security protection based on the sensitivity of the information. Classified computer systems or those with established audit and...
32 CFR 806b.35 - Balancing protection.
Code of Federal Regulations, 2012 CFR
2012-07-01
..., Computer Security, 5 for procedures on safeguarding personal information in automated records. 5 http://www... automated system with a log-on protocol. Others may require more sophisticated security protection based on the sensitivity of the information. Classified computer systems or those with established audit and...
32 CFR 806b.35 - Balancing protection.
Code of Federal Regulations, 2011 CFR
2011-07-01
..., Computer Security, 5 for procedures on safeguarding personal information in automated records. 5 http://www... automated system with a log-on protocol. Others may require more sophisticated security protection based on the sensitivity of the information. Classified computer systems or those with established audit and...
32 CFR 806b.35 - Balancing protection.
Code of Federal Regulations, 2010 CFR
2010-07-01
..., Computer Security, 5 for procedures on safeguarding personal information in automated records. 5 http://www... automated system with a log-on protocol. Others may require more sophisticated security protection based on the sensitivity of the information. Classified computer systems or those with established audit and...
Refinement for fault-tolerance: An aircraft hand-off protocol
NASA Technical Reports Server (NTRS)
Marzullo, Keith; Schneider, Fred B.; Dehn, Jon
1994-01-01
Part of the Advanced Automation System (AAS) for air-traffic control is a protocol to permit flight hand-off from one air-traffic controller to another. The protocol must be fault-tolerant and, therefore, is subtle -- an ideal candidate for the application of formal methods. This paper describes a formal method for deriving fault-tolerant protocols that is based on refinement and proof outlines. The AAS hand-off protocol was actually derived using this method; that derivation is given.
NASA Astrophysics Data System (ADS)
Cooper, L. A.; Ballantyne, A.
2017-12-01
Forest disturbances are critical components of ecosystems. Knowledge of their prevalence and impacts is necessary to accurately describe forest health and ecosystem services through time. While there are currently several methods available to identify and describe forest disturbances, especially those which occur in North America, the process remains inefficient and inaccessible in many parts of the world. Here, we introduce a preliminary approach to streamline and automate both the detection and attribution of forest disturbances. We use a combination of the Breaks for Additive Season and Trend (BFAST) detection algorithm to detect disturbances in combination with supervised and unsupervised classification algorithms to attribute the detections to disturbance classes. Both spatial and temporal disturbance characteristics are derived and utilized for the goal of automating the disturbance attribution process. The resulting preliminary algorithm is applied to up-scaled (100m) Landsat data for several different ecosystems in North America, with varying success. Our results indicate that supervised classification is more reliable than unsupervised classification, but that limited training data are required for a region. Future work will improve the algorithm through refining and validating at sites within North America before applying this approach globally.
Del Medico, Luca; Christen, Heinz; Christen, Beat
2017-01-01
Recent advances in lower-cost DNA synthesis techniques have enabled new innovations in the field of synthetic biology. Still, efficient design and higher-order assembly of genome-scale DNA constructs remains a labor-intensive process. Given the complexity, computer assisted design tools that fragment large DNA sequences into fabricable DNA blocks are needed to pave the way towards streamlined assembly of biological systems. Here, we present the Genome Partitioner software implemented as a web-based interface that permits multi-level partitioning of genome-scale DNA designs. Without the need for specialized computing skills, biologists can submit their DNA designs to a fully automated pipeline that generates the optimal retrosynthetic route for higher-order DNA assembly. To test the algorithm, we partitioned a 783 kb Caulobacter crescentus genome design. We validated the partitioning strategy by assembling a 20 kb test segment encompassing a difficult to synthesize DNA sequence. Successful assembly from 1 kb subblocks into the 20 kb segment highlights the effectiveness of the Genome Partitioner for reducing synthesis costs and timelines for higher-order DNA assembly. The Genome Partitioner is broadly applicable to translate DNA designs into ready to order sequences that can be assembled with standardized protocols, thus offering new opportunities to harness the diversity of microbial genomes for synthetic biology applications. The Genome Partitioner web tool can be accessed at https://christenlab.ethz.ch/GenomePartitioner. PMID:28531174
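The Genome Partitioner itself is available at the URL above; as a much-simplified illustration of the underlying idea, the sketch below splits a long DNA design into synthesis-sized blocks with fixed overlaps for downstream assembly. The block and overlap sizes are arbitrary choices for illustration, not the tool's defaults.

```python
def partition(sequence, block_size=1000, overlap=40):
    """Split a DNA design into overlapping blocks suitable for synthesis and
    subsequent overlap-based higher-order assembly."""
    step = block_size - overlap
    blocks = []
    for start in range(0, len(sequence), step):
        blocks.append((start, sequence[start:start + block_size]))
        if start + block_size >= len(sequence):
            break
    return blocks

design = "ACGT" * 5000          # a 20 kb test segment, analogous to the one assembled above
blocks = partition(design)
print(len(blocks), len(blocks[0][1]))   # number of ~1 kb blocks and the size of the first
```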
Gallistel, C. R.; Balci, Fuat; Freestone, David; Kheifets, Aaron; King, Adam
2014-01-01
We describe a high-throughput, high-volume, fully automated, live-in 24/7 behavioral testing system for assessing the effects of genetic and pharmacological manipulations on basic mechanisms of cognition and learning in mice. A standard polypropylene mouse housing tub is connected through an acrylic tube to a standard commercial mouse test box. The test box has 3 hoppers, 2 of which are connected to pellet feeders. All are internally illuminable with an LED and monitored for head entries by infrared (IR) beams. Mice live in the environment, which eliminates handling during screening. They obtain their food during two or more daily feeding periods by performing in operant (instrumental) and Pavlovian (classical) protocols, for which we have written protocol-control software and quasi-real-time data analysis and graphing software. The data analysis and graphing routines are written in a MATLAB-based language created to simplify greatly the analysis of large time-stamped behavioral and physiological event records and to preserve a full data trail from raw data through all intermediate analyses to the published graphs and statistics within a single data structure. The data-analysis code harvests the data several times a day and subjects it to statistical and graphical analyses, which are automatically stored in the "cloud" and on in-lab computers. Thus, the progress of individual mice is visualized and quantified daily. The data-analysis code talks to the protocol-control code, permitting the automated advance from protocol to protocol of individual subjects. The behavioral protocols implemented are matching, autoshaping, timed hopper-switching, risk assessment in timed hopper-switching, impulsivity measurement, and the circadian anticipation of food availability. Open-source protocol-control and data-analysis code makes the addition of new protocols simple. Eight test environments fit in a 48 in x 24 in x 78 in cabinet; two such cabinets (16 environments) may be controlled by one computer. PMID:24637442
Gallistel, C R; Balci, Fuat; Freestone, David; Kheifets, Aaron; King, Adam
2014-02-26
We describe a high-throughput, high-volume, fully automated, live-in 24/7 behavioral testing system for assessing the effects of genetic and pharmacological manipulations on basic mechanisms of cognition and learning in mice. A standard polypropylene mouse housing tub is connected through an acrylic tube to a standard commercial mouse test box. The test box has 3 hoppers, 2 of which are connected to pellet feeders. All are internally illuminable with an LED and monitored for head entries by infrared (IR) beams. Mice live in the environment, which eliminates handling during screening. They obtain their food during two or more daily feeding periods by performing in operant (instrumental) and Pavlovian (classical) protocols, for which we have written protocol-control software and quasi-real-time data analysis and graphing software. The data analysis and graphing routines are written in a MATLAB-based language created to simplify greatly the analysis of large time-stamped behavioral and physiological event records and to preserve a full data trail from raw data through all intermediate analyses to the published graphs and statistics within a single data structure. The data-analysis code harvests the data several times a day and subjects it to statistical and graphical analyses, which are automatically stored in the "cloud" and on in-lab computers. Thus, the progress of individual mice is visualized and quantified daily. The data-analysis code talks to the protocol-control code, permitting the automated advance from protocol to protocol of individual subjects. The behavioral protocols implemented are matching, autoshaping, timed hopper-switching, risk assessment in timed hopper-switching, impulsivity measurement, and the circadian anticipation of food availability. Open-source protocol-control and data-analysis code makes the addition of new protocols simple. Eight test environments fit in a 48 in x 24 in x 78 in cabinet; two such cabinets (16 environments) may be controlled by one computer.
Implementation of Siemens USS protocol into LabVIEW.
Hosek, P; Diblik, M
2011-10-01
This article gives a basic overview of the USS protocol as a communication interface for driving Siemens frequency inverters. It presents our implementation of this protocol in LabVIEW, as there was persistent demand from the user community for a native LabVIEW implementation of the USS protocol. It also describes the problems encountered and their solutions. Copyright © 2011 Society for Laboratory Automation and Screening. Published by Elsevier Inc. All rights reserved.
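The abstract does not detail the telegram format, so the following is only an orientation sketch of how a USS telegram is commonly described (STX, length, address, net data, and a BCC checksum formed by XOR over the preceding bytes); the exact layout and checksum convention are assumptions here and should be verified against the Siemens specification rather than taken from this article.

```python
def build_uss_telegram(address, net_data):
    """Assemble a USS-style telegram: STX | LGE | ADR | net data | BCC.
    Layout and checksum convention (BCC = XOR over all preceding bytes) follow
    common descriptions of the protocol; verify against the Siemens spec."""
    STX = 0x02
    lge = len(net_data) + 2            # ADR + net data + BCC, per the usual definition
    frame = bytearray([STX, lge, address]) + bytearray(net_data)
    bcc = 0
    for b in frame:
        bcc ^= b
    frame.append(bcc)
    return bytes(frame)

# Example: empty PKW area, two PZD words (control word + setpoint), all zeros.
print(build_uss_telegram(address=0, net_data=bytes(4)).hex(" "))
```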
Evaluating Management Information Systems, A Protocol for Automated Peer Review Systems
Black, Gordon C.
1980-01-01
This paper discusses key issues in evaluating an automated Peer Review System. Included are the conceptual base, design, steps in planning structural components, operation parameters, criteria, costs and a detailed outline or protocol for use in the evaluation. At the heart of the Peer Review System are the criteria utilized for measuring quality. Criteria evaluation should embrace, as a minimum, appropriateness, validity and reliability, and completeness or comprehensiveness of content. Such an evaluation is not complete without determining the impact (clinical outcome) of the service system on the patient and the population served.
Larson, Joshua; Kirk, Matt; Drier, Eric A.; O’Brien, William; MacKay, James F.; Friedman, Larry; Hoskins, Aaron
2015-01-01
Colocalization Single Molecule Spectroscopy (CoSMoS) has proven to be a useful method for studying the composition, kinetics, and mechanisms of complex cellular machines. Key to the technique is the ability to simultaneously monitor multiple proteins and/or nucleic acids as they interact with one another. Here we describe a protocol for constructing a CoSMoS micromirror Total Internal Reflection Fluorescence Microscope (mmTIRFM). Design and construction of a scientific microscope often requires a number of custom components and a significant time commitment. In our protocol, we have streamlined this process by implementation of a commercially available microscopy platform designed to accommodate the optical components necessary for a mmTIRFM. The mmTIRF system eliminates the need for machining custom parts by the end-user and facilitates optical alignment. Depending on the experience-level of the microscope builder, these time-savings and the following protocol can enable mmTIRF construction to be completed within two months. PMID:25188633
Larson, Joshua; Kirk, Matt; Drier, Eric A; O'Brien, William; MacKay, James F; Friedman, Larry J; Hoskins, Aaron A
2014-10-01
Colocalization single-molecule spectroscopy (CoSMoS) has proven to be a useful method for studying the composition, kinetics and mechanisms of complex cellular machines. Key to the technique is the ability to simultaneously monitor multiple proteins and/or nucleic acids as they interact with one another. Here we describe a protocol for constructing a CoSMoS micromirror total internal reflection fluorescence microscope (mmTIRFM). Design and construction of a scientific microscope often requires a number of custom components and a substantial time commitment. In our protocol, we have streamlined this process by implementation of a commercially available microscopy platform designed to accommodate the optical components necessary for an mmTIRFM. The mmTIRF system eliminates the need for machining custom parts by the end user and facilitates optical alignment. Depending on the experience level of the microscope builder, these time savings and the following protocol can enable mmTIRF construction to be completed within 2 months.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sanders, J; Tian, X; Segars, P
2016-06-15
Purpose: To develop an automated technique for estimating patient-specific regional imparted energy and dose from tube current modulated (TCM) computed tomography (CT) exams across a diverse set of head and body protocols. Methods: A library of 58 adult computational anthropomorphic extended cardiac-torso (XCAT) phantoms were used to model a patient population. A validated Monte Carlo program was used to simulate TCM CT exams on the entire library of phantoms for three head and 10 body protocols. The net imparted energy to the phantoms, normalized by dose length product (DLP), and the net tissue mass in each of the scan regions were computed. A knowledgebase containing relationships between normalized imparted energy and scanned mass was established. An automated computer algorithm was written to estimate the scanned mass from actual clinical CT exams. The scanned mass estimate, DLP of the exam, and knowledgebase were used to estimate the imparted energy to the patient. The algorithm was tested on 20 chest and 20 abdominopelvic TCM CT exams. Results: The normalized imparted energy increased with increasing kV for all protocols. However, the normalized imparted energy was relatively unaffected by the strength of the TCM. The average imparted energy was 681 ± 376 mJ for abdominopelvic exams and 274 ± 141 mJ for chest exams. Overall, the method was successful in providing patient-specific estimates of imparted energy for 98% of the cases tested. Conclusion: Imparted energy normalized by DLP increased with increasing tube potential. However, the strength of the TCM did not have a significant effect on the net amount of energy deposited to tissue. The automated program can be implemented into the clinical workflow to provide estimates of regional imparted energy and dose across a diverse set of clinical protocols.
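The estimation step described above pairs the exam's DLP with a knowledgebase relating normalized imparted energy to scanned mass. A schematic of that lookup, with made-up coefficients rather than the study's fitted values, might look like:

```python
import bisect

# Hypothetical knowledgebase for one protocol: scanned mass (kg) versus imparted
# energy per unit DLP (mJ per mGy*cm); the values are illustrative only.
MASS_KG   = [5.0, 10.0, 20.0, 30.0, 40.0]
E_PER_DLP = [0.9, 0.8, 0.65, 0.55, 0.50]

def imparted_energy_mj(scanned_mass_kg, dlp_mgy_cm):
    """Linearly interpolate normalized imparted energy at the patient's scanned
    mass, then scale by the exam DLP."""
    i = bisect.bisect_left(MASS_KG, scanned_mass_kg)
    i = min(max(i, 1), len(MASS_KG) - 1)
    x0, x1 = MASS_KG[i - 1], MASS_KG[i]
    y0, y1 = E_PER_DLP[i - 1], E_PER_DLP[i]
    frac = (scanned_mass_kg - x0) / (x1 - x0)
    return dlp_mgy_cm * (y0 + frac * (y1 - y0))

print(imparted_energy_mj(scanned_mass_kg=25.0, dlp_mgy_cm=400.0))   # 240.0 mJ
```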
Poon, Candice C; Ebacher, Vincent; Liu, Katherine; Yong, Voon Wee; Kelly, John James Patrick
2018-05-03
Automated slide scanning and segmentation of fluorescently-labeled tissues is the most efficient way to analyze whole slides or large tissue sections. Unfortunately, many researchers spend large amounts of time and resources developing and optimizing workflows that are only relevant to their own experiments. In this article, we describe a protocol that can be used by those with access to a widefield high-content analysis system (WHCAS) to image any slide-mounted tissue, with options for customization within pre-built modules found in the associated software. Although the WHCAS was not originally intended for slide scanning, the steps detailed in this article make it possible to acquire slide-scanning images in the WHCAS, which can then be imported into the associated software. In this example, the automated segmentation of brain tumor slides is demonstrated, but the automated segmentation of any fluorescently-labeled nuclear or cytoplasmic marker is possible. Furthermore, there are a variety of other quantitative software modules including assays for protein localization/translocation, cellular proliferation/viability/apoptosis, and angiogenesis that can be run. This technique will save researchers time and effort and create an automated protocol for slide analysis.
Lo, Andy; Tang, Yanan; Chen, Lu; Li, Liang
2013-07-25
Isotope labeling liquid chromatography-mass spectrometry (LC-MS) is a major analytical platform for quantitative proteome analysis. Incorporation of isotopes used to distinguish samples plays a critical role in the success of this strategy. In this work, we optimized and automated a chemical derivatization protocol (dimethylation after guanidination, 2MEGA) to increase the labeling reproducibility and reduce human intervention. We also evaluated the reagent compatibility of this protocol to handle biological samples in different types of buffers and surfactants. A commercially available liquid handler was used for reagent dispensation to minimize analyst intervention and at least twenty protein digest samples could be prepared in a single run. Different front-end sample preparation methods for protein solubilization (SDS, urea, Rapigest™, and ProteaseMAX™) and two commercially available cell lysis buffers were evaluated for compatibility with the automated protocol. It was found that better than 94% desired labeling could be obtained in all conditions studied except urea, where the rate was reduced to about 92% due to carbamylation on the peptide amines. This work illustrates the automated 2MEGA labeling process can be used to handle a wide range of protein samples containing various reagents that are often encountered in protein sample preparation for quantitative proteome analysis. Copyright © 2013 Elsevier B.V. All rights reserved.
Streamlining the Design-to-Build Transition with Build-Optimization Software Tools.
Oberortner, Ernst; Cheng, Jan-Fang; Hillson, Nathan J; Deutsch, Samuel
2017-03-17
Scaling-up capabilities for the design, build, and test of synthetic biology constructs holds great promise for the development of new applications in fuels, chemical production, or cellular-behavior engineering. Construct design is an essential component in this process; however, not every designed DNA sequence can be readily manufactured, even using state-of-the-art DNA synthesis methods. Current biological computer-aided design and manufacture tools (bioCAD/CAM) do not adequately consider the limitations of DNA synthesis technologies when generating their outputs. Designed sequences that violate DNA synthesis constraints may require substantial sequence redesign or lead to price-premiums and temporal delays, which adversely impact the efficiency of the DNA manufacturing process. We have developed a suite of build-optimization software tools (BOOST) to streamline the design-build transition in synthetic biology engineering workflows. BOOST incorporates knowledge of DNA synthesis success determinants into the design process to output ready-to-build sequences, preempting the need for sequence redesign. The BOOST web application is available at https://boost.jgi.doe.gov and its Application Program Interfaces (API) enable integration into automated, customized DNA design processes. The herein presented results highlight the effectiveness of BOOST in reducing DNA synthesis costs and timelines.
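BOOST's actual rule set and API are not reproduced in the abstract; as a hedged illustration of the kind of synthesis-constraint screening such a tool performs, the sketch below flags two common issues (overall GC content and long homopolymer runs) with thresholds chosen for illustration only.

```python
import re

def synthesis_violations(seq, gc_min=0.25, gc_max=0.65, max_homopolymer=8):
    """Flag simple DNA-synthesis constraint violations (illustrative thresholds,
    not BOOST's rule set): global GC content and homopolymer run length."""
    seq = seq.upper()
    violations = []
    gc = (seq.count("G") + seq.count("C")) / len(seq)
    if not gc_min <= gc <= gc_max:
        violations.append(f"GC content {gc:.2f} outside [{gc_min}, {gc_max}]")
    for match in re.finditer(r"(A+|C+|G+|T+)", seq):
        run = match.group()
        if len(run) > max_homopolymer:
            violations.append(f"homopolymer {run[0]}x{len(run)} at position {match.start()}")
    return violations

print(synthesis_violations("ATGC" * 10 + "A" * 12 + "GGGGCCCCGGGGCCCC"))
```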
webpic: A flexible web application for collecting distance and count measurements from images
2018-01-01
Despite increasing ability to store and analyze large amounts of data for organismal and ecological studies, the process of collecting distance and count measurements from images has largely remained time consuming and error-prone, particularly for tasks for which automation is difficult or impossible. Improving the efficiency of these tasks, which allows for more high quality data to be collected in a shorter amount of time, is therefore a high priority. The open-source web application, webpic, implements common web languages and widely available libraries and productivity apps to streamline the process of collecting distance and count measurements from images. In this paper, I introduce the framework of webpic and demonstrate one readily available feature of this application, linear measurements, using fossil leaf specimens. This application fills the gap between workflows accomplishable by individuals through existing software and those accomplishable by large, unmoderated crowds. It demonstrates that flexible web languages can be used to streamline time-intensive research tasks without the use of specialized equipment or proprietary software and highlights the potential for web resources to facilitate data collection in research tasks and outreach activities with improved efficiency. PMID:29608592
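webpic itself is a web application; as a generic illustration of the linear-measurement task it streamlines, converting two clicked pixel coordinates and a known scale bar into a real-world distance reduces to the following (coordinates and scale are made-up values).

```python
import math

def pixel_distance_to_mm(p1, p2, scale_px_per_mm):
    """Convert the distance between two clicked points (in pixels) to millimetres
    using a scale bar of known length in the same image."""
    dx, dy = p2[0] - p1[0], p2[1] - p1[1]
    return math.hypot(dx, dy) / scale_px_per_mm

# A 300-pixel scale bar representing 10 mm gives 30 px/mm.
print(pixel_distance_to_mm((120, 80), (420, 480), scale_px_per_mm=30.0))  # ~16.67 mm
```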
Hardware Realization of an Ethernet Packet Analyzer Search Engine
2000-06-30
specific for the home automation industry. This analyzer will be at the gateway of a network and analyze Ethernet packets as they go by. It will keep... home automation and not the computer network. This system is a stand-alone real-time network analyzer capable of decoding Ethernet protocols. The
1991-09-01
A2352344 Layup Cover Sheets/Inspect. A2352345 Perform Automated Tape Laying Operations/Inspect: the tape is laid in 3-12 inch strips along the surface of the bond mold.
Bouhenguel, Jason T; Preiss, David A; Urman, Richard D
2017-12-01
Non-operating room anesthesia (NORA) encounters comprise a significant fraction of contemporary anesthesia practice. With the implementation of an anesthesia information management system (AIMS), anesthesia practitioners can better streamline preoperative assessment, intraoperative automated documentation, real-time decision support, and remote surveillance. Despite the large personal and financial commitments involved in adoption and implementation of AIMS and other electronic health records in these settings, the benefits to safety, efficacy, and efficiency are far too great to be ignored. Continued future innovation of AIMS technology only promises to further improve on our NORA experience and improve care quality and safety. Copyright © 2017 Elsevier Inc. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhu, X; Li, S; Zheng, D
Purpose: Linac commissioning is a time consuming and labor intensive process, the streamlining of which is highly desirable. In particular, manual measurement of output factors for a variety of field sizes and energies greatly hinders commissioning efficiency. In this study, automated measurement of output factors was demonstrated as 'one-click' using data logging of an electrometer. Methods: Beams to be measured were created in the recording and verifying (R&V) system and configured for continuous delivery. An electrometer with an automatic data logging feature enabled continuous data collection for all fields without human intervention. The electrometer saved data into a spreadsheet every 0.5 seconds. A Matlab program was developed to analyze the spreadsheet data to monitor and check the data quality. Results: For each photon energy, output factors were measured for five configurations, including open field and four wedges. Each configuration includes 72 field sizes, ranging from 4×4 to 20×30 cm². Using automation, it took 50 minutes to complete the measurement of 72 field sizes, in contrast to 80 minutes when using the manual approach. The automation avoided the necessity of redundant Linac status checks between fields as in the manual approach. In fact, the only limiting factor in such automation is Linac overheating. The data collection beams in the R&V system are reusable, and the simplified process is less error-prone. In addition, our Matlab program extracted the output factors faithfully from data logging, and the discrepancy between the automatic and manual measurement is within ±0.3%. For two separate automated measurements 30 days apart, a consistency check shows a discrepancy within ±1% for 6MV photons with a 60 degree wedge. Conclusion: Automated output factor measurements can save time by 40% when compared with the conventional manual approach. This work laid the ground for further improvement of the automation of Linac commissioning.
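The analysis step described above extracts per-field readings from a continuously logged electrometer trace. A simplified analogue of that step is sketched below (the beam-on threshold, sampling layout, and field order are assumptions for illustration, not the authors' Matlab implementation).

```python
def output_factors(log_readings, beam_on_threshold=0.05):
    """Split a continuously logged charge-rate trace (one sample per 0.5 s) into
    beam-on segments, integrate each field, and normalize to the reference field.
    Assumes fields were delivered in a known order, with the reference field first."""
    fields, current = [], []
    for r in log_readings:
        if r > beam_on_threshold:
            current.append(r)
        elif current:                      # beam just switched off: close the segment
            fields.append(sum(current))
            current = []
    if current:
        fields.append(sum(current))
    reference = fields[0]
    return [f / reference for f in fields]

# Three simulated fields separated by idle gaps; the first is the reference field.
trace = [0.0] * 5 + [1.0] * 20 + [0.0] * 10 + [0.9] * 20 + [0.0] * 10 + [1.1] * 20 + [0.0] * 5
print(output_factors(trace))               # approximately [1.0, 0.9, 1.1]
```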
A simple automated instrument for DNA extraction in forensic casework.
Montpetit, Shawn A; Fitch, Ian T; O'Donnell, Patrick T
2005-05-01
The Qiagen BioRobot EZ1 is a small, rapid, and reliable automated DNA extraction instrument capable of extracting DNA from up to six samples in as few as 20 min using magnetic bead technology. The San Diego Police Department Crime Laboratory has validated the BioRobot EZ1 for the DNA extraction of evidence and reference samples in forensic casework. The BioRobot EZ1 was evaluated for use on a variety of different evidence sample types including blood, saliva, and semen evidence. The performance of the BioRobot EZ1 with regard to DNA recovery and potential cross-contamination was also assessed. DNA yields obtained with the BioRobot EZ1 were comparable to those from organic extraction. The BioRobot EZ1 was effective at removing PCR inhibitors, which often co-purify with DNA in organic extractions. The incorporation of the BioRobot EZ1 into forensic casework has streamlined the DNA analysis process by reducing the need for labor-intensive phenol-chloroform extractions.
Automated structure refinement of macromolecular assemblies from cryo-EM maps using Rosetta.
Wang, Ray Yu-Ruei; Song, Yifan; Barad, Benjamin A; Cheng, Yifan; Fraser, James S; DiMaio, Frank
2016-09-26
Cryo-EM has revealed the structures of many challenging yet exciting macromolecular assemblies at near-atomic resolution (3-4.5Å), providing biological phenomena with molecular descriptions. However, at these resolutions, accurately positioning individual atoms remains challenging and error-prone. Manually refining thousands of amino acids - typical in a macromolecular assembly - is tedious and time-consuming. We present an automated method that can improve the atomic details in models that are manually built in near-atomic-resolution cryo-EM maps. Applying the method to three systems recently solved by cryo-EM, we are able to improve model geometry while maintaining the fit-to-density. Backbone placement errors are automatically detected and corrected, and the refinement shows a large radius of convergence. The results demonstrate that the method is amenable to structures with symmetry, of very large size, and containing RNA as well as covalently bound ligands. The method should streamline the cryo-EM structure determination process, providing accurate and unbiased atomic structure interpretation of such maps.
NASA Astrophysics Data System (ADS)
Kopielski, Andreas; Schneider, Anne; Csáki, Andrea; Fritzsche, Wolfgang
2015-01-01
The DNA origami technique offers great potential for nanotechnology. Using biomolecular self-assembly, defined 2D and 3D nanoscale DNA structures can be realized. DNA origami allows the positioning of proteins, fluorophores or nanoparticles with an accuracy of a few nanometers and enables thereby novel nanoscale devices. Origami assembly usually includes a thermal denaturation step at 90 °C. Additional components used for nanoscale assembly (such as proteins) are often thermosensitive, and possibly damaged by such harsh conditions. They have therefore to be attached in an extra second step to avoid defects. To enable a streamlined one-step nanoscale synthesis - a so called one-pot folding - an adaptation of the folding procedures is required. Here we present a thermal optimization of this process for a 2D DNA rectangle-shaped origami resulting in an isothermal assembly protocol below 60 °C without thermal denaturation. Moreover, a room temperature protocol is presented using the chemical additive betaine, which is biocompatible in contrast to chemical denaturing approaches reported previously. Electronic supplementary information (ESI) available. See DOI: 10.1039/c4nr04176c
Strep-Tagged Protein Purification.
Maertens, Barbara; Spriestersbach, Anne; Kubicek, Jan; Schäfer, Frank
2015-01-01
The Strep-tag system can be used to purify recombinant proteins from any expression system. Here, protocols for lysis and affinity purification of Strep-tagged proteins from E. coli, baculovirus-infected insect cells, and transfected mammalian cells are given. Depending on the amount of Strep-tagged protein in the lysate, a protocol for batch binding and subsequent washing and eluting by gravity flow can be used. Agarose-based matrices with the coupled Strep-Tactin ligand are the resins of choice, with a binding capacity of up to 9 mg ml(-1). For purification of lower amounts of Strep-tagged proteins, the use of Strep-Tactin magnetic beads is suitable. In addition, Strep-tagged protein purification can also be automated using prepacked columns for FPLC or other liquid-handling chromatography instrumentation, but automated purification is not discussed in this protocol. The protocols described here can be regarded as an update of the Strep-Tag Protein Handbook (Qiagen, 2009). © 2015 Elsevier Inc. All rights reserved.
Choi, Bryan; Asselin, Nicholas; Pettit, Catherine C; Dannecker, Max; Machan, Jason T; Merck, Derek L; Merck, Lisa H; Suner, Selim; Williams, Kenneth A; Jay, Gregory D; Kobayashi, Leo
2016-12-01
Effective resuscitation of out-of-hospital cardiac arrest (OHCA) patients is challenging. Alternative resuscitative approaches using electromechanical adjuncts may improve provider performance. Investigators applied simulation to study the effect of an experimental automation-assisted, goal-directed OHCA management protocol on EMS providers' resuscitation performance relative to standard protocols and equipment. Two-provider (emergency medical technician (EMT)-B and EMT-I/C/P) teams were randomized to a control or experimental group. Each team engaged in 3 simulations: baseline simulation (standard roles); repeat simulation (standard roles); and abbreviated repeat simulation (reversed roles, i.e., basic life support provider performing ALS tasks). Control teams used standard OHCA protocols and equipment (with a high-performance cardiopulmonary resuscitation training intervention); for the second and third simulations, experimental teams performed chest compression, defibrillation, airway, pulmonary ventilation, vascular access, medication, and transport tasks with the goal-directed protocol and resuscitation-automating devices. Video recorders and simulator logs collected resuscitation data. Ten control and 10 experimental teams comprised 20 EMT-Bs plus 1 EMT-I, 8 EMT-Cs, and 11 EMT-Ps; the study groups were not fully matched. Both groups suboptimally performed chest compressions and ventilations at baseline. For their second simulations, control teams performed similarly except for reduced on-scene time, whereas experimental teams improved their chest compressions (P=0.03), pulmonary ventilations (P<0.01), and medication administration (P=0.02); changes in their performance of chest compression, defibrillation, airway, and transport tasks did not attain significance against control teams' changes. Experimental teams maintained performance improvements during reversed-role simulations. Simulation-based investigation into OHCA resuscitation revealed considerable variability and improvable deficiencies in small EMS teams. Goal-directed, automation-assisted OHCA management augmented select resuscitation bundle element performance without comprehensive improvement.
Efficient seeding and defragmentation of curvature streamlines for colonic polyp detection
NASA Astrophysics Data System (ADS)
Zhao, Lingxiao; Botha, Charl P.; Truyen, Roel; Vos, Frans M.; Post, Frits H.
2008-03-01
Many computer aided diagnosis (CAD) schemes have been developed for colon cancer detection using Virtual Colonoscopy (VC). In earlier work, we developed an automatic polyp detection method integrating flow visualization techniques that forms part of the CAD functionality of an existing Virtual Colonoscopy pipeline. Curvature streamlines were used to characterize polyp surface shape. Features derived from curvature streamlines correlated highly with true polyp detections. During testing with a large number of patient data sets, we found that the correlation between streamline features and true polyps could be affected by noise and our streamline generation technique. The seeding and spacing constraints and CT noise could lead to streamline fragmentation, which reduced the discriminating power of our streamline features. In this paper, we present two major improvements of our curvature streamline generation. First, we adapted our streamline seeding strategy to the local surface properties and made the streamline generation faster. It generates a significantly smaller number of seeds but still results in a comparable and suitable streamline distribution. Second, based on our observation that longer streamlines are better surface shape descriptors, we improved our streamline tracing algorithm to produce longer streamlines. Our improved techniques are more efficient and also guide the streamline geometry to correspond better to colonic surface shape. These two adaptations support a robust and high correlation between our streamline features and true positive detections and lead to better polyp detection results.
A streamlined failure mode and effects analysis.
Ford, Eric C; Smith, Koren; Terezakis, Stephanie; Croog, Victoria; Gollamudi, Smitha; Gage, Irene; Keck, Jordie; DeWeese, Theodore; Sibley, Greg
2014-06-01
Explore the feasibility and impact of a streamlined failure mode and effects analysis (FMEA) using a structured process that is designed to minimize staff effort. FMEA for the external beam process was conducted at an affiliate radiation oncology center that treats approximately 60 patients per day. A structured FMEA process was developed which included clearly defined roles and goals for each phase. A core group of seven people was identified and a facilitator was chosen to lead the effort. Failure modes were identified and scored according to the FMEA formalism. A risk priority number, RPN, was calculated and used to rank failure modes. Failure modes with RPN > 150 received safety improvement interventions. Staff effort was carefully tracked throughout the project. Fifty-two failure modes were identified, 22 collected during meetings, and 30 from take-home worksheets. The four top-ranked failure modes were: delay in film check, missing pacemaker protocol/consent, critical structures not contoured, and pregnant patient simulated without the team's knowledge of the pregnancy. These four failure modes had RPN > 150 and received safety interventions. The FMEA was completed in one month in four 1-h meetings. A total of 55 staff hours were required and, additionally, 20 h by the facilitator. Streamlined FMEA provides a means of accomplishing a relatively large-scale analysis with modest effort. One potential value of FMEA is that it potentially provides a means of measuring the impact of quality improvement efforts through a reduction in risk scores. Future study of this possibility is needed.
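The RPN arithmetic used above is straightforward to reproduce; the short sketch below shows the severity × occurrence × detectability product and the RPN > 150 intervention cut-off reported in the abstract. The failure-mode names and scores in the sketch are invented for illustration and are not the study's data.

```python
# FMEA ranking sketch: RPN = severity * occurrence * detectability.
# Failure modes and scores below are invented, not taken from the study.
failure_modes = [
    ("delay in film check",              7, 6, 5),
    ("missing pacemaker protocol",       8, 4, 6),
    ("critical structure not contoured", 9, 3, 7),
    ("wrong wedge orientation",          6, 3, 4),
]

ranked = sorted(
    ((name, s * o * d) for name, s, o, d in failure_modes),
    key=lambda item: item[1],
    reverse=True,
)

for name, rpn in ranked:
    action = "INTERVENE" if rpn > 150 else "monitor"
    print(f"{name:35s} RPN={rpn:4d}  {action}")
```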
Chatterjee, Anirban; Mirer, Paul L; Zaldivar Santamaria, Elvira; Klapperich, Catherine; Sharon, Andre; Sauer-Budge, Alexis F
2010-06-01
The life science and healthcare communities have been redefining the importance of ribonucleic acid (RNA) through the study of small molecule RNA (in RNAi/siRNA technologies), micro RNA (in cancer research and stem cell research), and mRNA (gene expression analysis for biologic drug targets). Research in this field increasingly requires efficient and high-throughput isolation techniques for RNA. Currently, several commercial kits are available for isolating RNA from cells. Although the quality and quantity of RNA yielded from these kits is sufficiently good for many purposes, limitations exist in terms of extraction efficiency from small cell populations and the ability to automate the extraction process. Traditionally, automating a process decreases the cost and personnel time while simultaneously increasing the throughput and reproducibility. As the RNA field matures, new methods for automating its extraction, especially from low cell numbers and in high throughput, are needed to achieve these improvements. The technology presented in this article is a step toward this goal. The method is based on a solid-phase extraction technology using a porous polymer monolith (PPM). A novel cell lysis approach and a larger binding surface throughout the PPM extraction column ensure a high yield from small starting samples, increasing sensitivity and reducing indirect costs in cell culture and sample storage. The method ensures a fast and simple procedure for RNA isolation from eukaryotic cells, with a high yield both in terms of quality and quantity. The technique is amenable to automation and streamlined workflow integration, with possible miniaturization of the sample handling process making it suitable for high-throughput applications.
High-throughput mouse genotyping using robotics automation.
Linask, Kaari L; Lo, Cecilia W
2005-02-01
The use of mouse models is rapidly expanding in biomedical research. This has dictated the need for the rapid genotyping of mutant mouse colonies for more efficient utilization of animal holding space. We have established a high-throughput protocol for mouse genotyping using two robotics workstations: a liquid-handling robot to assemble PCR and a microfluidics electrophoresis robot for PCR product analysis. This dual-robotics setup incurs lower start-up costs than a fully automated system while still minimizing human intervention. Essential to this automation scheme is the construction of a database containing customized scripts for programming the robotics workstations. Using these scripts and the robotics systems, multiple combinations of genotyping reactions can be assembled simultaneously, allowing even complex genotyping data to be generated rapidly with consistency and accuracy. A detailed protocol, database, scripts, and additional background information are available at http://dir.nhlbi.nih.gov/labs/ldb-chd/autogene/.
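To illustrate the kind of script-driven setup such a database implies, the toy sketch below generates a liquid-handler worklist that pairs each mouse sample with two genotyping assays across a 96-well plate. The assay names, volumes, and plate geometry are invented and are not taken from the authors' database or scripts.

```python
# Hypothetical worklist generator for a liquid-handling robot assembling PCR
# reactions in a 96-well plate; assay names and volumes are invented.
from itertools import product

ASSAYS = {"wt_allele": 12.5, "mut_allele": 12.5}   # master-mix volume (uL), assumed
SAMPLE_VOL = 2.0                                    # template DNA per well (uL), assumed

def well_ids():
    for row, col in product("ABCDEFGH", range(1, 13)):
        yield f"{row}{col}"

def build_worklist(samples):
    wells = well_ids()
    worklist = []
    for sample in samples:
        for assay, mix_vol in ASSAYS.items():
            worklist.append({"well": next(wells), "sample": sample,
                             "assay": assay, "mix_uL": mix_vol,
                             "dna_uL": SAMPLE_VOL})
    return worklist

for step in build_worklist([f"mouse_{i:03d}" for i in range(1, 5)]):
    print(step)
```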
Luna, Jorge M; Yip, Natalie; Pivovarov, Rimma; Vawdrey, David K
2016-08-01
Clinical teams in acute inpatient settings can greatly benefit from automated charting technologies that continuously monitor patient vital status. NewYork-Presbyterian has designed and developed a real-time patient monitoring system that integrates vital signs sensors, networking, and electronic health records to allow for automatic charting of patient status. We evaluate the representativeness (a combination of agreement, safety and timing) of a core vital sign across nursing care intensity protocols for a preliminary feasibility assessment. Our findings suggest that an automated summary of heart rate represents true heart rate status and can facilitate alternative approaches to burdensome manual nurse charting of physiological parameters.
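One simple way such a summary could be computed is to bucket the monitor's continuous readings into fixed windows and chart one representative value per window. The sketch below uses a 15-minute window and the median as assumptions for illustration; it is not NewYork-Presbyterian's algorithm.

```python
# Hypothetical summarization of monitor-derived heart rate into charting values:
# group frequent readings into fixed windows and chart the window median.
from statistics import median

def chart_heart_rate(samples, window_s=900):
    """samples: list of (epoch_seconds, heart_rate_bpm); returns one value per window."""
    if not samples:
        return []
    start = samples[0][0]
    buckets = {}
    for t, hr in samples:
        buckets.setdefault(int((t - start) // window_s), []).append(hr)
    return [(start + k * window_s, median(v)) for k, v in sorted(buckets.items())]

# Example: 30 minutes of synthetic 1 Hz readings around 72 bpm
synthetic = [(t, 72 + (t % 7) - 3) for t in range(0, 1800)]
print(chart_heart_rate(synthetic))
```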
Automated observatory in Antarctica: real-time data transfer on constrained networks in practice
NASA Astrophysics Data System (ADS)
Bracke, Stephan; Gonsette, Alexandre; Rasson, Jean; Poncelet, Antoine; Hendrickx, Olivier
2017-08-01
In 2013 a project was started by the geophysical centre in Dourbes to install a fully automated magnetic observatory in Antarctica. This isolated place comes with specific requirements: an unmanned station for 6 months, low temperatures with extreme values down to -50 °C, minimum power consumption and satellite bandwidth limited to 56 kbit s-1. The ultimate aim is to transfer real-time magnetic data every second: vector data from a LEMI-25 vector magnetometer, absolute F measurements from a GEM Systems scalar proton magnetometer and absolute magnetic inclination-declination (DI) measurements (five times a day) with an automated DI-fluxgate magnetometer. Traditional file transfer protocols (for instance File Transfer Protocol (FTP), email, rsync) show severe limitations when it comes to real-time capability. After evaluating the pros and cons of the available real-time Internet of things (IoT) protocols and seismic software solutions, we chose to use Message Queuing Telemetry Transport (MQTT) and receive the 1 s data with a negligible latency cost and no loss of data. Each individual instrument sends the magnetic data immediately after capture, and the data arrive approximately 300 ms after being sent, which corresponds with the normal satellite latency.
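A minimal publisher sketch using the widely available paho-mqtt client illustrates the 1 s push pattern described above. The broker address, topic name, payload layout, and sensor read-out are assumptions for illustration and do not reproduce the observatory's actual configuration.

```python
# Hypothetical 1 Hz publisher for magnetometer samples over MQTT (paho-mqtt).
# Broker host, topic name, and payload layout are invented for illustration.
import json
import time
import paho.mqtt.client as mqtt

client = mqtt.Client()                       # paho-mqtt 1.x style constructor assumed
client.connect("broker.example.org", 1883)   # assumed broker endpoint
client.loop_start()

def read_vector_sample():
    """Placeholder for the LEMI-25 read-out; returns fake X/Y/Z values in nT."""
    return {"x": 20100.1, "y": -312.4, "z": 43211.9}

try:
    while True:
        payload = {"t": time.time(), **read_vector_sample()}
        client.publish("observatory/antarctica/vector", json.dumps(payload), qos=1)
        time.sleep(1.0)   # one sample per second, matching the 1 s data rate
except KeyboardInterrupt:
    client.loop_stop()
    client.disconnect()
```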
Eulberg, Dirk; Buchner, Klaus; Maasch, Christian; Klussmann, Sven
2005-01-01
We have developed an automated SELEX (Systematic Evolution of Ligands by EXponential Enrichment) process that allows the execution of in vitro selection cycles without any direct manual intervention steps. The automated selection protocol is designed to provide for high flexibility and versatility in terms of choice of buffers and reagents as well as stringency of selection conditions. Employing the automated SELEX process, we have identified RNA aptamers to the mirror-image configuration (d-peptide) of substance P. The peptide substance P belongs to the tachykinin family and exerts various biologically important functions, such as peripheral vasodilation, smooth muscle contraction and pain transmission. The aptamer that was identified most frequently was truncated to the 44mer SUP-A-004. The mirror-image configuration of SUP-A-004, the so-called Spiegelmer, has been shown to bind to naturally occurring l-substance P displaying a Kd of 40 nM and to inhibit (IC50 of 45 nM) l-substance P-mediated Ca2+ release in a cell culture assay. PMID:15745995
Wallace, Adam N; Vyhmeister, Ross; Bagade, Swapnil; Chatterjee, Arindam; Hicks, Brandon; Ramirez-Giraldo, Juan Carlos; McKinstry, Robert C
2015-06-01
Cerebrospinal fluid shunts are primarily used for the treatment of hydrocephalus. Shunt complications may necessitate multiple non-contrast head CT scans resulting in potentially high levels of radiation dose starting at an early age. A new head CT protocol using automatic exposure control and automated tube potential selection has been implemented at our institution to reduce radiation exposure. The purpose of this study was to evaluate the reduction in radiation dose achieved by this protocol compared with a protocol with fixed parameters. A retrospective sample of 60 non-contrast head CT scans assessing for cerebrospinal fluid shunt malfunction was identified, 30 of which were performed with each protocol. The radiation doses of the two protocols were compared using the volume CT dose index and dose length product. The diagnostic acceptability and quality of each scan were evaluated by three independent readers. The new protocol lowered the average volume CT dose index from 15.2 to 9.2 mGy representing a 39 % reduction (P < 0.01; 95 % CI 35-44 %) and lowered the dose length product from 259.5 to 151.2 mGy·cm representing a 42 % reduction (P < 0.01; 95 % CI 34-50 %). The new protocol produced diagnostically acceptable scans with comparable image quality to the fixed parameter protocol. A pediatric shunt non-contrast head CT protocol using automatic exposure control and automated tube potential selection reduced patient radiation dose compared with a fixed parameter protocol while producing diagnostic images of comparable quality.
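The reported percentage reductions follow directly from the dose indices; a short worked check using the values quoted in the abstract:

```python
# Worked check of the reported dose reductions (values from the abstract).
def percent_reduction(before, after):
    return 100.0 * (before - after) / before

print(round(percent_reduction(15.2, 9.2)))      # CTDIvol: ~39 %
print(round(percent_reduction(259.5, 151.2)))   # DLP: ~42 %
```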
Townsley, Brad T; Covington, Michael F; Ichihashi, Yasunori; Zumstein, Kristina; Sinha, Neelima R
2015-01-01
Next Generation Sequencing (NGS) is driving rapid advancement in biological understanding and RNA-sequencing (RNA-seq) has become an indispensable tool for biology and medicine. There is a growing need for access to these technologies although preparation of NGS libraries remains a bottleneck to wider adoption. Here we report a novel method for the production of strand specific RNA-seq libraries utilizing the terminal breathing of double-stranded cDNA to capture and incorporate a sequencing adapter. Breath Adapter Directional sequencing (BrAD-seq) reduces sample handling and requires far fewer enzymatic steps than most available methods to produce high quality strand-specific RNA-seq libraries. The method we present is optimized for 3-prime Digital Gene Expression (DGE) libraries and can easily extend to full transcript coverage shotgun (SHO) type strand-specific libraries and is modularized to accommodate a diversity of RNA and DNA input materials. BrAD-seq offers a highly streamlined and inexpensive option for RNA-seq libraries.
Prüller, Florian; Wagner, Jasmin; Raggam, Reinhard B; Hoenigl, Martin; Kessler, Harald H; Truschnig-Wilders, Martie; Krause, Robert
2014-07-01
Testing for (1→3)-beta-D-glucan (BDG) is used for detection of invasive fungal infection. However, current assays lack automation and the ability to conduct rapid single-sample testing. The Fungitell assay was adopted for automation and evaluated using clinical samples from patients with culture-proven candidemia and from culture-negative controls in duplicate. A comparison with the standard assay protocol was made in order to establish analytical specifications. With the automated protocol, the analytical measuring range was 8-2500 pg/ml of BDG, and precision testing resulted in coefficients of variation that ranged from 3.0% to 5.5%. Samples from 15 patients with culture-proven candidemia and 94 culture-negative samples were evaluated. All culture-proven samples showed BDG values >80 pg/ml (mean 1247 pg/ml; range, 116-2990 pg/ml), which were considered positive. Of the 94 culture-negative samples, 92 had BDG values <60 pg/ml (mean, 28 pg/ml), which were considered to be negative, and 2 samples were false-positive (≥80 pg/ml; up to 124 pg/ml). Results could be obtained within 45 min and showed excellent agreement with results obtained with the standard assay protocol. The automated Fungitell assay proved to be reliable and rapid for diagnosis of candidemia. It was demonstrated to be feasible and cost efficient for both single-sample and large-scale testing of serum BDG. Its 1-h time-to-result will allow better support for clinicians in the management of antifungal therapy. © The Author 2014. Published by Oxford University Press on behalf of The International Society for Human and Animal Mycology. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
Lelijveld, Natasha; Bailey, Jeanette; Mayberry, Amy; Trenouth, Lani; N'Diaye, Dieynaba S; Haghparast-Bidgoli, Hassan; Puett, Chloe
2018-04-24
Acute malnutrition is currently divided into severe (SAM) and moderate (MAM) based on level of wasting. SAM and MAM currently have separate treatment protocols and products, managed by separate international agencies. For SAM, the dose of treatment is allocated by the child's weight. A combined and simplified protocol for SAM and MAM, with a standardised dose of ready-to-use therapeutic food (RUTF), is being trialled for non-inferior recovery rates and may be more cost-effective than the current standard protocols for treating SAM and MAM. This is the protocol for the economic evaluation of the ComPAS trial, a cluster-randomised controlled, non-inferiority trial that compares a novel combined protocol for treating uncomplicated acute malnutrition with the current standard protocol in South Sudan and Kenya. We will calculate the total economic costs of both protocols from a societal perspective, using accounting data, interviews and survey questionnaires. The incremental cost of implementing the combined protocol will be estimated, and all costs and outcomes will be presented as a cost-consequence analysis. An incremental cost-effectiveness ratio will be calculated for the primary and secondary outcomes, if statistically significant. We hypothesise that implementing the combined protocol will be cost-effective due to streamlined logistics at clinic level, reduced length of treatment, especially for MAM, and reduced dosages of RUTF. The findings of this economic evaluation will be important for policymakers, especially given the hypothesised non-inferiority of the main health outcomes. The publication of this protocol aims to improve rigour of conduct and transparency of data collection and analysis. It is also intended to promote inclusion of economic evaluation in other nutrition intervention studies, especially for MAM, and improve comparability with other studies. ISRCTN 30393230, date: 16/03/2017.
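The incremental cost-effectiveness ratio mentioned above is simple arithmetic on the incremental cost and incremental effect of the two protocols. The sketch below uses invented numbers purely to show the calculation; it does not use or anticipate trial results.

```python
# ICER sketch with invented illustrative numbers; not trial results.
def icer(cost_new, cost_std, effect_new, effect_std):
    """Incremental cost per additional unit of effect (e.g. per child recovered)."""
    return (cost_new - cost_std) / (effect_new - effect_std)

# Hypothetical: the combined protocol costs less per child and recovers slightly more,
# so the ratio is negative (the new protocol dominates the standard one).
print(icer(cost_new=102.0, cost_std=115.0, effect_new=0.91, effect_std=0.89))
```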
Diurnal Soil Temperature Effects within the Globe[R] Program Dataset
ERIC Educational Resources Information Center
Witter, Jason D.; Spongberg, Alison L.; Czajkowski, Kevin P.
2007-01-01
Long-term collection of soil temperature with depth is important when studying climate change. The international program GLOBE[R] provides an excellent opportunity to collect such data, although currently endorsed temperature collection protocols need to be refined. To enhance data quality, protocol-based methodology and automated data logging,…
Streamlined bioreactor-based production of human cartilage tissues.
Tonnarelli, B; Santoro, R; Adelaide Asnaghi, M; Wendt, D
2016-05-27
Engineered tissue grafts have been manufactured using methods based predominantly on traditional labour-intensive manual benchtop techniques. These methods impart significant regulatory and economic challenges, hindering the successful translation of engineered tissue products to the clinic. Alternatively, bioreactor-based production systems have the potential to overcome such limitations. In this work, we present an innovative manufacturing approach to engineer cartilage tissue within a single bioreactor system, starting from freshly isolated human primary chondrocytes, through the generation of cartilaginous tissue grafts. The limited number of primary chondrocytes that can be isolated from a small clinically-sized cartilage biopsy could be seeded and extensively expanded directly within a 3D scaffold in our perfusion bioreactor (5.4 ± 0.9 doublings in 2 weeks), bypassing conventional 2D expansion in flasks. Chondrocytes expanded in 3D scaffolds better maintained a chondrogenic phenotype than chondrocytes expanded on plastic flasks (collagen type II mRNA, 18-fold; Sox-9, 11-fold). After this "3D expansion" phase, bioreactor culture conditions were changed to subsequently support chondrogenic differentiation for two weeks. Engineered tissues based on 3D-expanded chondrocytes were more cartilaginous than tissues generated from chondrocytes previously expanded in flasks. We then demonstrated that this streamlined bioreactor-based process could be adapted to effectively generate up-scaled cartilage grafts in a size with clinical relevance (50 mm diameter). Streamlined and robust tissue engineering processes, as the one described here, may be key for the future manufacturing of grafts for clinical applications, as they facilitate the establishment of compact and closed bioreactor-based production systems, with minimal automation requirements, lower operating costs, and increased compliance to regulatory guidelines.
Biomek 3000: the workhorse in an automated accredited forensic genetic laboratory.
Stangegaard, Michael; Meijer, Per-Johan; Børsting, Claus; Hansen, Anders J; Morling, Niels
2012-10-01
We have implemented and validated automated protocols for a wide range of processes such as sample preparation, PCR setup, and capillary electrophoresis setup using small, simple, and inexpensive automated liquid handlers. The flexibility and ease of programming enable the Biomek 3000 to be used in many parts of the laboratory process in a modern forensic genetics laboratory with low to medium sample throughput. In conclusion, we demonstrated that sample processing for accredited forensic genetic DNA typing can be implemented on small automated liquid handlers, leading to the reduction of manual work as well as increased quality and throughput.
Liu, Ze-Yu; Zhang, Qing-Han; Ye, Xiao-Lei; Liu, Da-Peng; Cheng, Kang; Zhang, Chun-Hai; Wan, Yi
2017-04-01
To validate the G.LAB MD2200 automated wrist blood pressure (BP) monitors according to the European Society of Hypertension International Protocol (ESH-IP) revision 2010, the British Hypertension Society (BHS), and the International Organization for Standardization (ISO) 81060-2:2013 protocols. The device was assessed on 33 participants according to the ESH requirements and was then tested on 85 participants according to the BHS and ISO 81060-2:2013 criteria. The validation procedures and data analysis followed the protocols precisely. The G.LAB MD2200 devices passed all parts of ESH-IP revision 2010 for both systolic and diastolic BP, with a device-observer difference of 2.15±5.51 and 1.51±5.16 mmHg, respectively. The device achieved A/A grading for the BHS protocol and it also fulfilled the criteria of ISO 81060-2:2013, with mean differences of systolic and diastolic BP between the device and the observer of 2.19±5.21 and 2.11±4.70 mmHg, respectively. The G.LAB MD2200 automated wrist BP monitor passed the ESH-IP revision 2010 and the ISO 81060-2:2013 protocol, and achieved the A/A grade of the BHS protocol, which can be recommended for self-measurement in the general population.
Zeng, Wei-Fang; Liu, Ming; Kang, Yuan-Yuan; Li, Yan; Wang, Ji-Guang
2013-08-01
The present study aimed to evaluate the accuracy of the fully automated oscillometric upper-arm blood pressure monitor TM-2656 according to the British Hypertension Society (BHS) Protocol 1993. We recruited individuals until there were 85 eligible participants and their blood pressure could meet the blood pressure distribution requirements specified by the BHS Protocol. For each individual, we sequentially measured the systolic and diastolic blood pressures using a mercury sphygmomanometer (two observers) and the TM-2656 device (one supervisor). Data analysis was carried out according to the BHS Protocol. The device achieved grade A. The percentage of blood pressure differences within 5, 10, and 15 mmHg was 62, 85, and 96%, respectively, for systolic blood pressure, and 71, 93, and 99%, respectively, for diastolic blood pressure. The average (±SD) of the device-observer differences was -2.1±7.8 mmHg (P<0.0001) and -1.1±5.8 mmHg (P<0.0001) for systolic and diastolic blood pressures, respectively. The A&D upper-arm blood pressure monitor TM-2656 has passed the requirements of the BHS Protocol, and can thus be recommended for blood pressure measurement.
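The BHS grading used in the two blood pressure monitor validations above rests on the cumulative percentages of absolute device-observer differences within 5, 10, and 15 mmHg. The sketch below tabulates those percentages and applies the commonly cited BHS 1993 grade thresholds, stated here as an assumption rather than quoted from the protocol text; the example differences are synthetic.

```python
# BHS-style tabulation of device-observer differences (in mmHg).
# Grade thresholds are the commonly cited BHS 1993 values (assumed, not quoted).
def within_percentages(differences):
    n = len(differences)
    return tuple(100.0 * sum(abs(d) <= lim for d in differences) / n
                 for lim in (5, 10, 15))

def bhs_grade(p5, p10, p15):
    if p5 >= 60 and p10 >= 85 and p15 >= 95:
        return "A"
    if p5 >= 50 and p10 >= 75 and p15 >= 90:
        return "B"
    if p5 >= 40 and p10 >= 65 and p15 >= 85:
        return "C"
    return "D"

# Synthetic example differences
diffs = [-2, 1, 4, -6, 3, 0, 7, -3, 2, 11, -1, 5]
print(bhs_grade(*within_percentages(diffs)))
```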
EON: a component-based approach to automation of protocol-directed therapy.
Musen, M A; Tu, S W; Das, A K; Shahar, Y
1996-01-01
Provision of automated support for planning protocol-directed therapy requires a computer program to take as input clinical data stored in an electronic patient-record system and to generate as output recommendations for therapeutic interventions and laboratory testing that are defined by applicable protocols. This paper presents a synthesis of research carried out at Stanford University to model the therapy-planning task and to demonstrate a component-based architecture for building protocol-based decision-support systems. We have constructed general-purpose software components that (1) interpret abstract protocol specifications to construct appropriate patient-specific treatment plans; (2) infer from time-stamped patient data higher-level, interval-based, abstract concepts; (3) perform time-oriented queries on a time-oriented patient database; and (4) allow acquisition and maintenance of protocol knowledge in a manner that facilitates efficient processing both by humans and by computers. We have implemented these components in a computer system known as EON. Each of the components has been developed, evaluated, and reported independently. We have evaluated the integration of the components as a composite architecture by implementing T-HELPER, a computer-based patient-record system that uses EON to offer advice regarding the management of patients who are following clinical trial protocols for AIDS or HIV infection. A test of the reuse of the software components in a different clinical domain demonstrated rapid development of a prototype application to support protocol-based care of patients who have breast cancer. PMID:8930854
de Blank, Peter; Fisher, Michael J; Gittleman, Haley; Barnholtz-Sloan, Jill S; Badve, Chaitra; Berman, Jeffrey I
2018-01-01
Fractional anisotropy (FA) of the optic radiations has been associated with vision deficit in multiple intrinsic brain pathologies including NF1 associated optic pathway glioma, but hand-drawn regions of interest used in previous tractography methods limit consistency of this potential biomarker. We created an automated method to identify white matter tracts in the optic radiations and compared this method to previously reported hand-drawn tractography. Automated tractography of the optic radiation using probabilistic streamline fiber tracking between the lateral geniculate nucleus of the thalamus and the occipital cortex was compared to the hand-drawn method between regions of interest posterior to Meyer's loop and anterior to tract branching near the calcarine cortex. Reliability was assessed by two independent raters in a sample of 20 healthy child controls. Among 50 children with NF1-associated optic pathway glioma, the association of FA and visual acuity deficit was compared for both tractography methods. Hand-drawn tractography methods required 2.6±0.9min/participant; automated methods were performed in <1min of operator time for all participants. Cronbach's alpha was 0.83 between two independent raters for FA in hand-drawn tractography, but repeated automated tractography resulted in identical FA values (Cronbach's alpha=1). On univariate and multivariate analyses, FA was similarly associated with visual acuity loss using both methods. Receiver operator characteristic curves of both multivariate models demonstrated that both automated and hand-drawn tractography methods were equally able to distinguish normal from abnormal visual acuity. Automated tractography of the optic radiations offers a fast, reliable and consistent method of tract identification that is not reliant on operator time or expertise. This method of tract identification may be useful as DTI is developed as a potential biomarker for visual acuity. Copyright © 2017 Elsevier Inc. All rights reserved.
Henzl, Michael T; Markus, Lindsey A; Davis, Meredith E; McMillan, Andrew T
2013-03-01
Capable of providing a detailed thermodynamic picture of noncovalent association reactions, isothermal titration calorimetry (ITC) has become a popular method for studying protein-ligand interactions. We routinely employ the technique to study divalent ion-binding by two-site EF-hand proteins from the parvalbumin- and polcalcin lineages. The combination of high Ca(2+) affinity and relatively low Mg(2+) affinity, and the attendant complication of parameter correlation, conspire to make the simultaneous extraction of binding constants and -enthalpies for both ions challenging. Although global analysis of multiple ITC experiments can overcome these hurdles, our current experimental protocol includes upwards of 10 titrations - requiring a substantial investment in labor, machine time, and material. This paper explores the potential for using a smaller suite of experiments that includes simultaneous titrations with Ca(2+) and Mg(2+) at different ratios of the two ions. The results obtained for four proteins, differing substantially in their divalent ion-binding properties, suggest that the approach has merit. The Ca(2+)- and Mg(2+)-binding constants afforded by the streamlined analysis are in reasonable agreement with those obtained from the standard analysis protocol. Likewise, the abbreviated analysis provides comparable values for the Ca(2+)-binding enthalpies. However, the streamlined analysis can yield divergent values for the Mg(2+)-binding enthalpies - particularly those for lower affinity sites. This shortcoming can be remedied, in large measure, by including data from a direct Ca(2+) titration in the presence of a high, fixed Mg(2+) concentration. Copyright © 2013. Published by Elsevier Inc.
Interactive Streamline Exploration and Manipulation Using Deformation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tong, Xin; Chen, Chun-Ming; Shen, Han-Wei
2015-01-12
Occlusion presents a major challenge in visualizing three-dimensional flow fields with streamlines. Displaying too many streamlines at once makes it difficult to locate interesting regions, but displaying too few streamlines risks missing important features. A more ideal streamline exploration model is to allow the viewer to freely move across the field that has been populated with interesting streamlines and pull away the streamlines that cause occlusion so that the viewer can inspect the hidden ones in detail. In this paper, we present a streamline deformation algorithm that supports such user-driven interaction with three-dimensional flow fields. We define a view-dependent focus+context technique that moves the streamlines occluding the focus area using a novel displacement model. To preserve the context surrounding the user-chosen focus area, we propose two shape models to define the transition zone for the surrounding streamlines, and the displacement of the contextual streamlines is solved interactively with a goal of preserving their shapes as much as possible. Based on our deformation model, we design an interactive streamline exploration tool using a lens metaphor. Our system runs interactively so that users can move their focus and examine the flow field freely.
2009-02-06
that could monitor sensors, evaluate environmental conditions, and control visual and sound devices was conducted. The home automation products used...the prototype system. Use of off-the-shelf home automation products allowed the implementation of an egress control prototype suitable for test and
Guppy-Coles, Kristyan B; Prasad, Sandhir B; Smith, Kym C; Hillier, Samuel; Lo, Ada; Atherton, John J
2015-06-01
We aimed to determine the feasibility of training cardiac nurses to evaluate left ventricular function utilising a semi-automated, workstation-based protocol on three dimensional echocardiography images. Assessment of left ventricular function by nurses is an attractive concept. Recent developments in three dimensional echocardiography coupled with border detection assistance have reduced inter- and intra-observer variability and analysis time. This could allow abbreviated training of nurses to assess cardiac function. A comparative, diagnostic accuracy study evaluating left ventricular ejection fraction assessment utilising a semi-automated, workstation-based protocol performed by echocardiography-naïve nurses on previously acquired three dimensional echocardiography images. Nine cardiac nurses underwent two brief lectures about cardiac anatomy, physiology and three dimensional left ventricular ejection fraction assessment, before a hands-on demonstration in 20 cases. We then selected 50 cases from our three dimensional echocardiography library based on optimal image quality with a broad range of left ventricular ejection fractions, which was quantified by two experienced sonographers and the average used as the comparator for the nurses. Nurses independently measured three dimensional left ventricular ejection fraction using the Auto lvq package with semi-automated border detection. The left ventricular ejection fraction range was 25-72% (70% with a left ventricular ejection fraction <55%). All nurses showed excellent agreement with the sonographers. Minimal intra-observer variability was noted on both short-term (same day) and long-term (>2 weeks later) retest. It is feasible to train nurses to measure left ventricular ejection fraction utilising a semi-automated, workstation-based protocol on previously acquired three dimensional echocardiography images. Further study is needed to determine the feasibility of training nurses to acquire three dimensional echocardiography images on real-world patients to measure left ventricular ejection fraction. Nurse-performed evaluation of left ventricular function could facilitate the broader application of echocardiography to allow cost-effective screening and monitoring for left ventricular dysfunction in high-risk populations. © 2014 John Wiley & Sons Ltd.
Ezra, Elishai; Maor, Idan; Bavli, Danny; Shalom, Itai; Levy, Gahl; Prill, Sebastian; Jaeger, Magnus S; Nahmias, Yaakov
2015-08-01
Microfluidic applications range from combinatorial synthesis to high throughput screening, with platforms integrating analog perfusion components, digitally controlled micro-valves and a range of sensors that demand a variety of communication protocols. Currently, discrete control units are used to regulate and monitor each component, resulting in scattered control interfaces that limit data integration and synchronization. Here, we present a microprocessor-based control unit, utilizing the MS Gadgeteer open framework that integrates all aspects of microfluidics through a high-current electronic circuit that supports and synchronizes digital and analog signals for perfusion components, pressure elements, and arbitrary sensor communication protocols using a plug-and-play interface. The control unit supports an integrated touch screen and TCP/IP interface that provides local and remote control of flow and data acquisition. To establish the ability of our control unit to integrate and synchronize complex microfluidic circuits we developed an equi-pressure combinatorial mixer. We demonstrate the generation of complex perfusion sequences, allowing the automated sampling, washing, and calibrating of an electrochemical lactate sensor continuously monitoring hepatocyte viability following exposure to the pesticide rotenone. Importantly, integration of an optical sensor allowed us to implement automated optimization protocols that require different computational challenges including: prioritized data structures in a genetic algorithm, distributed computational efforts in multiple-hill climbing searches and real-time realization of probabilistic models in simulated annealing. Our system offers a comprehensive solution for establishing optimization protocols and perfusion sequences in complex microfluidic circuits.
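Of the optimization routines named above, simulated annealing is the easiest to illustrate compactly. The sketch below anneals a single perfusion set-point against a made-up objective function; the parameter ranges, objective, and cooling schedule are invented and this is not the authors' implementation.

```python
# Simulated annealing sketch for tuning one perfusion set-point against a
# made-up objective function; parameters and objective are invented.
import math
import random

def objective(flow_ul_min):
    """Hypothetical sensor-derived cost: best response near 42 uL/min."""
    return (flow_ul_min - 42.0) ** 2

def anneal(start, steps=500, temp0=50.0, step_size=2.0, seed=1):
    rng = random.Random(seed)
    current, best = start, start
    for i in range(steps):
        temp = temp0 * (1 - i / steps) + 1e-9          # linear cooling schedule
        candidate = current + rng.uniform(-step_size, step_size)
        delta = objective(candidate) - objective(current)
        if delta < 0 or rng.random() < math.exp(-delta / temp):
            current = candidate                         # accept better or, sometimes, worse moves
        if objective(current) < objective(best):
            best = current
    return best

print(round(anneal(start=10.0), 2))   # converges near 42
```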
The impact of injector-based contrast agent administration in time-resolved MRA.
Budjan, Johannes; Attenberger, Ulrike I; Schoenberg, Stefan O; Pietsch, Hubertus; Jost, Gregor
2018-05-01
Time-resolved contrast-enhanced MR angiography (4D-MRA), which allows the simultaneous visualization of the vasculature and blood-flow dynamics, is widely used in clinical routine. In this study, the impact of two different contrast agent injection methods on 4D-MRA was examined in a controlled, standardized setting in an animal model. Six anesthetized Goettingen minipigs underwent two identical 4D-MRA examinations at 1.5 T in a single session. The contrast agent (0.1 mmol/kg body weight gadobutrol, followed by 20 ml saline) was injected using either manual injection or an automated injection system. A quantitative comparison of vascular signal enhancement and quantitative renal perfusion analyses were performed. Analysis of signal enhancement revealed higher peak enhancements and shorter time to peak intervals for the automated injection. Significantly different bolus shapes were found: automated injection resulted in a compact first-pass bolus shape clearly separated from the recirculation while manual injection resulted in a disrupted first-pass bolus with two peaks. In the quantitative perfusion analyses, statistically significant differences in plasma flow values were found between the injection methods. The results of both qualitative and quantitative 4D-MRA depend on the contrast agent injection method, with automated injection providing more defined bolus shapes and more standardized examination protocols. • Automated and manual contrast agent injection result in different bolus shapes in 4D-MRA. • Manual injection results in an undefined and interrupted bolus with two peaks. • Automated injection provides more defined bolus shapes. • Automated injection can lead to more standardized examination protocols.
Granato, G.E.; Smith, K.P.
1999-01-01
Robowell is an automated process for monitoring selected ground water quality properties and constituents by pumping a well or multilevel sampler. Robowell was developed and tested to provide a cost-effective monitoring system that meets protocols expected for manual sampling. The process uses commercially available electronics, instrumentation, and hardware, so it can be configured to monitor ground water quality using the equipment, purge protocol, and monitoring well design most appropriate for the monitoring site and the contaminants of interest. A Robowell prototype was installed on a sewage treatment plant infiltration bed that overlies a well-studied unconfined sand and gravel aquifer at the Massachusetts Military Reservation, Cape Cod, Massachusetts, during a time when two distinct plumes of constituents were released. The prototype was operated from May 10 to November 13, 1996, and quality-assurance/quality-control measurements demonstrated that the data obtained by the automated method was equivalent to data obtained by manual sampling methods using the same sampling protocols. Water level, specific conductance, pH, water temperature, dissolved oxygen, and dissolved ammonium were monitored by the prototype as the wells were purged according to U.S Geological Survey (USGS) ground water sampling protocols. Remote access to the data record, via phone modem communications, indicated the arrival of each plume over a few days and the subsequent geochemical reactions over the following weeks. Real-time availability of the monitoring record provided the information needed to initiate manual sampling efforts in response to changes in measured ground water quality, which proved the method and characterized the screened portion of the plume in detail through time. The methods and the case study described are presented to document the process for future use.
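USGS purge protocols of the kind referenced above typically require field parameters to stabilize before a sample is accepted. The sketch below shows what such a stabilization check looks like in code; the parameters, tolerances, and window size are assumptions and should not be read as the Robowell thresholds.

```python
# Hypothetical purge-stabilization check: readings are accepted once the last
# few measurements of each parameter vary by less than an assumed tolerance.
TOLERANCES = {"specific_conductance": 3.0,   # uS/cm, assumed
              "pH": 0.1,
              "temperature_C": 0.2,
              "dissolved_oxygen": 0.3}       # mg/L, assumed

def stabilized(history, window=3):
    """history: dict of parameter -> list of readings in purge order."""
    for param, tol in TOLERANCES.items():
        recent = history.get(param, [])[-window:]
        if len(recent) < window or max(recent) - min(recent) > tol:
            return False
    return True

readings = {"specific_conductance": [410, 406, 405, 404],
            "pH": [6.9, 6.8, 6.8, 6.8],
            "temperature_C": [11.4, 11.3, 11.3, 11.3],
            "dissolved_oxygen": [5.1, 5.0, 5.0, 4.9]}
print(stabilized(readings))   # True -> log the measurement / trigger sampling
```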
Computer imaging and workflow systems in the business office.
Adams, W T; Veale, F H; Helmick, P M
1999-05-01
Computer imaging and workflow technology automates many business processes that currently are performed using paper processes. Documents are scanned into the imaging system and placed in electronic patient account folders. Authorized users throughout the organization, including preadmission, verification, admission, billing, cash posting, customer service, and financial counseling staff, have online access to the information they need when they need it. Such streamlining of business functions can increase collections and customer satisfaction while reducing labor, supply, and storage costs. Because the costs of a comprehensive computer imaging and workflow system can be considerable, healthcare organizations should consider implementing parts of such systems that can be cost-justified or include implementation as part of a larger strategic technology initiative.
Automating individualized coaching and authentic role-play practice for brief intervention training.
Hayes-Roth, B; Saker, R; Amano, K
2010-01-01
Brief intervention helps to reduce alcohol abuse, but there is a need for accessible, cost-effective training of clinicians. This study evaluated STAR Workshop , a web-based training system that automates efficacious techniques for individualized coaching and authentic role-play practice. We compared STAR Workshop to a web-based, self-guided e-book and a no-treatment control, for training the Engage for Change (E4C) brief intervention protocol. Subjects were medical and nursing students. Brief written skill probes tested subjects' performance of individual protocol steps, in different clinical scenarios, at three test times: pre-training, post-training, and post-delay (two weeks). Subjects also did live phone interviews with a standardized patient, post-delay. STAR subjects performed significantly better than both other groups. They showed significantly greater improvement from pre-training probes to post-training and post-delay probes. They scored significantly higher on post-delay phone interviews. STAR Workshop appears to be an accessible, cost-effective approach for training students to use the E4C protocol for brief intervention in alcohol abuse. It may also be useful for training other clinical interviewing protocols.
Laboratory Testing Protocols for Heparin-Induced Thrombocytopenia (HIT) Testing.
Lau, Kun Kan Edwin; Mohammed, Soma; Pasalic, Leonardo; Favaloro, Emmanuel J
2017-01-01
Heparin-induced thrombocytopenia (HIT) represents a significant, high-morbidity complication of heparin therapy. The clinicopathological diagnosis of HIT remains challenging for many reasons; thus, laboratory testing represents an important component of an accurate diagnosis. Although there are many assays available to assess HIT, these essentially fall into two categories: (a) immunological assays, and (b) functional assays. The current chapter presents protocols for several HIT assays, being those that are most commonly performed in laboratory practice and have the widest geographic distribution. These comprise a manual lateral flow-based system (STiC), a fully automated latex immunoturbidimetric assay, a fully automated chemiluminescent assay (CLIA), light transmission aggregation (LTA), and whole blood aggregation (Multiplate).
Tips on hybridizing, washing, and scanning affymetrix microarrays.
Ares, Manuel
2014-02-01
Starting in the late 1990s, Affymetrix, Inc. produced a commercial system for hybridizing, washing, and scanning microarrays that was designed to be easy to operate and reproducible. The system used arrays packaged in a plastic cassette or chamber in which the prefabricated array was mounted and could be filled with fluid through resealable membrane ports either by hand or by an automated "fluidics station" specially designed to handle the arrays. A special rotating hybridization oven and a specially designed scanner were also required. Primarily because of automation and standardization the Affymetrix system was and still remains popular. Here, we provide a skeleton protocol with the potential pitfalls identified. It is designed to augment the protocols provided by Affymetrix.
Morschett, Holger; Wiechert, Wolfgang; Oldiges, Marco
2016-02-09
Within the context of microalgal lipid production for biofuels and bulk chemical applications, specialized higher throughput devices for small scale parallelized cultivation are expected to boost the time efficiency of phototrophic bioprocess development. However, the increasing number of possible experiments is directly coupled to the demand for lipid quantification protocols that enable reliably measuring large sets of samples within a short time and that can deal with the reduced sample volume typically generated at screening scale. To meet these demands, a dye based assay was established using a liquid handling robot to provide reproducible high throughput quantification of lipids with minimized hands-on-time. Lipid production was monitored using the fluorescent dye Nile red with dimethyl sulfoxide as solvent facilitating dye permeation. The staining kinetics of cells at different concentrations and physiological states were investigated to successfully down-scale the assay to 96 well microtiter plates. Gravimetric calibration against a well-established extractive protocol enabled absolute quantification of intracellular lipids, improving precision from ±8 to ±2 % on average. Implementation into an automated liquid handling platform allows for measuring up to 48 samples within 6.5 h, reducing hands-on-time to a third compared to manual operation. Moreover, it was shown that automation enhances accuracy and precision compared to manual preparation. It was revealed that established protocols relying on optical density or cell number for biomass adjustment prior to staining may suffer from errors due to significant changes of the cells' optical and physiological properties during cultivation. Alternatively, the biovolume was used as a measure for biomass concentration so that errors from morphological changes can be excluded. The newly established assay proved to be applicable for absolute quantification of algal lipids, avoiding limitations of currently established protocols, namely biomass adjustment and limited throughput. Automation was shown to improve data reliability as well as experimental throughput, while simultaneously reducing the needed hands-on-time to a third. Thereby, the presented protocol meets the demands for the analysis of samples generated by the upcoming generation of devices for higher throughput phototrophic cultivation and thereby contributes to boosting the time efficiency for setting up algae lipid production processes.
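The gravimetric-calibration step described above amounts to fitting fluorescence readings against lipid contents measured by the reference extractive protocol and applying the fit to new samples. The sketch below shows that conversion; all numbers are invented and do not come from the study.

```python
# Hypothetical Nile red calibration: least-squares line through gravimetric
# standards, then conversion of a sample's fluorescence to lipid content.
def fit_line(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
            sum((x - mx) ** 2 for x in xs)
    return slope, my - slope * mx

# Invented calibration points: fluorescence (a.u.) vs lipid (% of dry weight)
fluorescence = [120, 340, 610, 880, 1150]
lipid_pct    = [5.0, 15.0, 25.0, 35.0, 45.0]

slope, intercept = fit_line(fluorescence, lipid_pct)
sample_signal = 500.0
print(round(slope * sample_signal + intercept, 1))   # estimated lipid %
```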
Hierarchical streamline bundles.
Yu, Hongfeng; Wang, Chaoli; Shene, Ching-Kuang; Chen, Jacqueline H
2012-08-01
Effective 3D streamline placement and visualization play an essential role in many science and engineering disciplines. The main challenge for effective streamline visualization lies in seed placement, i.e., where to drop seeds and how many seeds should be placed. Seeding too many or too few streamlines may not reveal flow features and patterns either because it easily leads to visual clutter in rendering or it conveys little information about the flow field. Not only does the number of streamlines placed matter, their spatial relationships also play a key role in understanding the flow field. Therefore, effective flow visualization requires the streamlines to be placed in the right place and in the right amount. This paper introduces hierarchical streamline bundles, a novel approach to simplifying and visualizing 3D flow fields defined on regular grids. By placing seeds and generating streamlines according to flow saliency, we produce a set of streamlines that captures important flow features near critical points without enforcing the dense seeding condition. We group spatially neighboring and geometrically similar streamlines to construct a hierarchy from which we extract streamline bundles at different levels of detail. Streamline bundles highlight multiscale flow features and patterns through clustered yet not cluttered display. This selective visualization strategy effectively reduces visual clutter while accentuating visual foci, and therefore is able to convey the desired insight into the flow data.
NASA Astrophysics Data System (ADS)
Burba, G. G.; Avenson, T.; Burkart, A.; Gamon, J. A.; Guan, K.; Julitta, T.; Pastorello, G.; Sakowska, K.
2017-12-01
Many hundreds of flux towers are presently operational as standalone projects and as parts of regional networks. However, the vast majority of these towers do not allow straightforward coupling with remote sensing (drone, aircraft, satellite, etc.) data, and even fewer have optical sensors for validation of remote sensing products and upscaling from field to regional levels. In 2016-2017, new tools to collect, process, and share time-synchronized flux data from multiple towers were developed and deployed globally. Originally designed to automate site and data management and to streamline flux data analysis, these tools allow relatively easy matching of tower data with remote sensing data:
- GPS-driven PTP time protocol synchronizes instrumentation within the station, different stations with each other, and all of these with remote sensing data, to precisely align remote sensing and flux data in time
- Footprint size and coordinates computed and stored with flux data help correctly align tower flux footprints with drone, aircraft or satellite motion, to precisely align optical and flux data in space
- A full snapshot of the remote sensing pixel can then be constructed, including leaf-level, ground optical sensor, and flux tower measurements from the same footprint area, closely coupled with the remote sensing measurements to help interpret remote sensing data, validate models, and improve upscaling
Additionally, current flux towers can be augmented with advanced ground optical sensors and can use standard routines to deliver continuous products (e.g., SIF, PRI, NDVI) based on automated field spectrometers (e.g., FloX and RoX) and other optical systems. Several dozen new towers already operational globally can be readily used for the proposed workflow. Over 500 active traditional flux towers can be updated to synchronize their data with remote sensing measurements. This presentation will show how the new tools are used by major networks, and describe how this approach can be utilized for matching remote sensing and tower data to aid in ground truthing, improve scientific interactions, and promote joint grant writing and other forms of collaboration between the flux and remote sensing communities.
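A minimal sketch of the time-matching idea, assuming hypothetical half-hourly flux records and satellite overpass times (all column names and values are invented): pandas merge_asof pairs each overpass with the nearest synchronized tower record within a tolerance window.

```python
import pandas as pd

# Hypothetical half-hourly flux records (PTP/GPS-synchronized timestamps) and
# remote-sensing overpass times; column names are illustrative only.
flux = pd.DataFrame({
    "time": pd.date_range("2017-07-01 10:00", periods=6, freq="30min"),
    "nee": [-4.2, -5.1, -6.0, -5.7, -4.9, -3.8],        # net ecosystem exchange
    "footprint_radius_m": [210, 230, 250, 240, 220, 205],
})
overpass = pd.DataFrame({
    "time": pd.to_datetime(["2017-07-01 10:47", "2017-07-01 12:12"]),
    "ndvi": [0.71, 0.69],
})

# Match each overpass to the nearest flux record within a tolerance window,
# so optical and flux data refer to (approximately) the same moment.
matched = pd.merge_asof(
    overpass.sort_values("time"), flux.sort_values("time"),
    on="time", direction="nearest", tolerance=pd.Timedelta("15min"),
)
print(matched)
```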
A streamlined failure mode and effects analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ford, Eric C., E-mail: eford@uw.edu; Smith, Koren; Terezakis, Stephanie
Purpose: Explore the feasibility and impact of a streamlined failure mode and effects analysis (FMEA) using a structured process that is designed to minimize staff effort. Methods: FMEA for the external beam process was conducted at an affiliate radiation oncology center that treats approximately 60 patients per day. A structured FMEA process was developed which included clearly defined roles and goals for each phase. A core group of seven people was identified and a facilitator was chosen to lead the effort. Failure modes were identified and scored according to the FMEA formalism. A risk priority number, RPN, was calculated and used to rank failure modes. Failure modes with RPN > 150 received safety improvement interventions. Staff effort was carefully tracked throughout the project. Results: Fifty-two failure modes were identified, 22 collected during meetings, and 30 from take-home worksheets. The four top-ranked failure modes were: delay in film check, missing pacemaker protocol/consent, critical structures not contoured, and pregnant patient simulated without the team's knowledge of the pregnancy. These four failure modes had RPN > 150 and received safety interventions. The FMEA was completed in one month in four 1-h meetings. A total of 55 staff hours were required and, additionally, 20 h by the facilitator. Conclusions: Streamlined FMEA provides a means of accomplishing a relatively large-scale analysis with modest effort. One potential value of FMEA is that it provides a means of measuring the impact of quality improvement efforts through a reduction in risk scores. Future study of this possibility is needed.
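The scoring and ranking logic is simple enough to sketch in a few lines. The severity, occurrence, and detectability values below are invented for illustration; only the RPN > 150 intervention threshold comes from the study.

```python
# Minimal sketch of FMEA scoring: RPN = severity x occurrence x detectability,
# with failure modes above a threshold (150, as in the study) flagged for intervention.
failure_modes = [
    # (description, severity, occurrence, detectability), scores 1-10 (illustrative values)
    ("Delay in film check", 6, 7, 5),
    ("Missing pacemaker protocol/consent", 8, 4, 6),
    ("Critical structures not contoured", 9, 3, 7),
    ("Pregnant patient simulated unknowingly", 10, 2, 9),
    ("Wrong accessory documented", 3, 5, 4),
]

RPN_THRESHOLD = 150

scored = [(desc, s * o * d) for desc, s, o, d in failure_modes]
for desc, rpn in sorted(scored, key=lambda x: x[1], reverse=True):
    flag = "INTERVENE" if rpn > RPN_THRESHOLD else "monitor"
    print(f"{rpn:4d}  {flag:9s}  {desc}")
```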
Atlas-based automatic measurements of the morphology of the tibiofemoral joint
NASA Astrophysics Data System (ADS)
Brehler, M.; Thawait, G.; Shyr, W.; Ramsay, J.; Siewerdsen, J. H.; Zbijewski, W.
2017-03-01
Purpose: Anatomical metrics of the tibiofemoral joint support assessment of joint stability and surgical planning. We propose an automated, atlas-based algorithm to streamline the measurements in 3D images of the joint and reduce user-dependence of the metrics arising from manual identification of the anatomical landmarks. Methods: The method is initialized with coarse registrations of a set of atlas images to the fixed input image. The initial registrations are then refined separately for the tibia and femur and the best matching atlas is selected. Finally, the anatomical landmarks of the best matching atlas are transformed onto the input image by deforming a surface model of the atlas to fit the shape of the tibial plateau in the input image (a mesh-to-volume registration). We apply the method to weight-bearing volumetric images of the knee obtained from 23 subjects using an extremity cone-beam CT system. Results of the automated algorithm were compared to an expert radiologist for measurements of Static Alignment (SA), Medial Tibial Slope (MTS) and Lateral Tibial Slope (LTS). Results: Intra-reader variability as high as 10% for LTS and 7% for MTS (ratio of standard deviation to the mean in repeated measurements) was found for the expert radiologist, illustrating the potential benefits of an automated approach in improving the precision of the metrics. The proposed method achieved excellent registration of the atlas mesh to the input volumes. The resulting automated measurements yielded high correlations with the expert radiologist, as indicated by correlation coefficients of 0.72 for MTS, 0.8 for LTS, and 0.89 for SA. Conclusions: The automated method for measurement of anatomical metrics of the tibiofemoral joint achieves high correlation with the expert radiologist without the need for time-consuming and error-prone manual selection of landmarks.
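The reported metrics are straightforward to reproduce in code. The hedged sketch below computes intra-reader variability (standard deviation over mean of repeated measurements) and the Pearson correlation between expert and automated values, using invented numbers rather than the study data.

```python
import numpy as np
from scipy.stats import pearsonr

# Hypothetical repeated expert measurements of lateral tibial slope (degrees)
# for one subject; intra-reader variability = std / mean of the repeats.
lts_repeats = np.array([7.8, 8.6, 7.1, 8.9, 8.2])
intra_reader_cv = lts_repeats.std(ddof=1) / lts_repeats.mean()

# Hypothetical paired expert vs. automated measurements across subjects.
expert = np.array([6.5, 7.2, 8.1, 9.0, 7.7, 6.9, 8.4])
automated = np.array([6.8, 7.0, 8.3, 8.7, 7.9, 7.1, 8.2])
r, p = pearsonr(expert, automated)

print(f"intra-reader variability: {intra_reader_cv:.1%}")
print(f"expert vs automated correlation: r = {r:.2f} (p = {p:.3f})")
```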
Atlas-based automatic measurements of the morphology of the tibiofemoral joint.
Brehler, M; Thawait, G; Shyr, W; Ramsay, J; Siewerdsen, J H; Zbijewski, W
2017-02-11
Anatomical metrics of the tibiofemoral joint support assessment of joint stability and surgical planning. We propose an automated, atlas-based algorithm to streamline the measurements in 3D images of the joint and reduce user-dependence of the metrics arising from manual identification of the anatomical landmarks. The method is initialized with coarse registrations of a set of atlas images to the fixed input image. The initial registrations are then refined separately for the tibia and femur and the best matching atlas is selected. Finally, the anatomical landmarks of the best matching atlas are transformed onto the input image by deforming a surface model of the atlas to fit the shape of the tibial plateau in the input image (a mesh-to-volume registration). We apply the method to weight-bearing volumetric images of the knee obtained from 23 subjects using an extremity cone-beam CT system. Results of the automated algorithm were compared to an expert radiologist for measurements of Static Alignment (SA), Medial Tibial Slope (MTS) and Lateral Tibial Slope (LTS). Intra-reader variability as high as ~10% for LTS and 7% for MTS (ratio of standard deviation to the mean in repeated measurements) was found for the expert radiologist, illustrating the potential benefits of an automated approach in improving the precision of the metrics. The proposed method achieved excellent registration of the atlas mesh to the input volumes. The resulting automated measurements yielded high correlations with the expert radiologist, as indicated by correlation coefficients of 0.72 for MTS, 0.8 for LTS, and 0.89 for SA. The automated method for measurement of anatomical metrics of the tibiofemoral joint achieves high correlation with the expert radiologist without the need for time-consuming and error-prone manual selection of landmarks.
Dentinger, Bryn T M; Margaritescu, Simona; Moncalvo, Jean-Marc
2010-07-01
We present two methods for DNA extraction from fresh and dried mushrooms that are adaptable to high-throughput sequencing initiatives, such as DNA barcoding. Our results show that these protocols yield ∼85% sequencing success from recently collected materials. Tests with both recent (<2 year) and older (>100 years) specimens reveal that older collections have low success rates and may be an inefficient resource for populating a barcode database. However, our method of extracting DNA from herbarium samples using a small amount of tissue is reliable and could be used for important historical specimens. The application of these protocols greatly reduces the time, and therefore cost, of generating DNA sequences from mushrooms and other fungi compared with traditional extraction methods. The efficiency of these methods illustrates that standardization and streamlining of sample processing should be shifted from the laboratory to the field. © 2009 Blackwell Publishing Ltd.
NASA Astrophysics Data System (ADS)
Rotzoll, K.; Izuka, S. K.; Nishikawa, T.; Fienen, M. N.; El-Kadi, A. I.
2015-12-01
The volcanic-rock aquifers of Kauai, Oahu, and Maui are heavily developed, leading to concerns related to the effects of groundwater withdrawals on saltwater intrusion and streamflow. A numerical modeling analysis using the most recently available data (e.g., information on recharge, withdrawals, hydrogeologic framework, and conceptual models of groundwater flow) will substantially advance current understanding of groundwater flow and provide insight into the effects of human activity and climate change on Hawaii's water resources. Three island-wide groundwater-flow models were constructed using MODFLOW 2005 coupled with the Seawater-Intrusion Package (SWI2), which simulates the transition between saltwater and freshwater in the aquifer as a sharp interface. This approach allowed relatively fast model run times without ignoring the freshwater-saltwater system at the regional scale. Model construction (FloPy3), automated-parameter estimation (PEST), and analysis of results were streamlined using Python scripts. Model simulations included pre-development (1870) and current (average of 2001-10) scenarios for each island. Additionally, scenarios for future withdrawals and climate change were simulated for Oahu. We present our streamlined approach and preliminary results showing estimated effects of human activity on the groundwater resource by quantifying decline in water levels, reduction in stream base flow, and rise of the freshwater-saltwater interface.
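A minimal FloPy sketch along these lines is shown below; it is not one of the island models described above, and the grid dimensions, aquifer properties, and SWI2 package arguments are illustrative assumptions only. The written input files could then be handed to PEST for automated parameter estimation, mirroring the scripted workflow described in the abstract.

```python
import numpy as np
import flopy

# Minimal sketch (not the authors' models): a single-layer island-style model with the
# SWI2 sharp-interface package, built entirely from a script. All values are assumptions.
nlay, nrow, ncol = 1, 40, 40
m = flopy.modflow.Modflow("island_swi2", exe_name="mf2005")
dis = flopy.modflow.ModflowDis(m, nlay, nrow, ncol, delr=500.0, delc=500.0,
                               top=10.0, botm=-300.0)
bas = flopy.modflow.ModflowBas(m, ibound=np.ones((nlay, nrow, ncol), dtype=int), strt=1.0)
lpf = flopy.modflow.ModflowLpf(m, hk=500.0, vka=50.0)
rch = flopy.modflow.ModflowRch(m, rech=0.002)            # recharge, assumed units of m/d

# Initial freshwater-saltwater interface elevation (one zeta surface); SWI2 arguments
# here are assumptions based on the package documentation, not calibrated values.
zeta = -40.0 * np.ones((nlay, nrow, ncol))
swi = flopy.modflow.ModflowSwi2(m, nsrf=1, istrat=1, zeta=[zeta],
                                ssz=0.2, isource=0, nsolver=1)
pcg = flopy.modflow.ModflowPcg(m)
oc = flopy.modflow.ModflowOc(m)
m.write_input()   # the generated files could then be wrapped by PEST for calibration
```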
Earth System Documentation (ES-DOC) Preparation for CMIP6
NASA Astrophysics Data System (ADS)
Denvil, S.; Murphy, S.; Greenslade, M. A.; Lawrence, B.; Guilyardi, E.; Pascoe, C.; Treshanksy, A.; Elkington, M.; Hibling, E.; Hassell, D.
2015-12-01
During the course of 2015 the Earth System Documentation (ES-DOC) project began its preparations for CMIP6 (Coupled Model Inter-comparison Project 6) by further extending the ES-DOC tooling ecosystem in support of Earth System Model (ESM) documentation creation, search, viewing & comparison. The ES-DOC online questionnaire, the ES-DOC desktop notebook, and the ES-DOC python toolkit will serve as multiple complementary pathways to generating CMIP6 documentation. It is envisaged that institutes will leverage these tools at different points of the CMIP6 lifecycle. Institutes will be particularly interested to know that the documentation burden will be either streamlined or completely automated. As all the tools are tightly integrated with the ES-DOC web-service, institutes can be confident that the latency between documentation creation & publishing will be reduced to a minimum. Published documents will be viewable with the online ES-DOC Viewer (accessible via citable URLs). Model inter-comparison scenarios will be supported using the ES-DOC online Comparator tool. The Comparator is being extended to:
- Support comparison of both Model descriptions & Simulation runs;
- Greatly streamline the effort involved in compiling official tables.
The entire ES-DOC ecosystem is open source and built upon open standards such as the Common Information Model (CIM) (versions 1 and 2).
Nurse-driven protocols for febrile pediatric oncology patients.
Dobrasz, Gina; Hatfield, Marianne; Jones, Laura Masak; Berdis, Jennifer Joan; Miller, Erin Elizabeth; Entrekin, Melanie Smith
2013-05-01
Infection is a frequent complication experienced by many children with cancer, with potentially life-threatening consequences that may result in hospitalization, prolonged length of stay, and increased mortality. The need for prompt assessment and early intervention for infection is widely recognized by ED staff as best practice; however, the average length of time to antibiotic administration varies widely in published studies. An interdisciplinary quality improvement initiative including physician, nursing, and pharmacy leaders was created to streamline the identification and treatment for this high-risk population. Based on published evidence for best practice and national recognition of the need for rapid treatment, the goal was set for administration of antibiotic therapy to less than 60 minutes after ED arrival. This project was conducted at 2 emergency departments in a pediatric health care system with 520 beds and a level I and level II trauma designation. Approximately 154,000 patients are seen annually. In the emergency departments, 271 staff members, including registered nurses, paramedics, and patient care technicians, required education about using the newly designed process. Records from all patients with fever and a known history of pediatric cancer who presented to the emergency departments were included in the retrospective review, including patients with solid tumors, acute lymphoblastic leukemia, acute myeloid leukemia, and chronic myelogenous leukemia. Exclusion criteria included patients in known remission, those with prior antibiotic therapy at another facility, congenital neutropenia, or parental concern or objection to treatment. A retrospective medical record review of febrile oncology patients treated from September 2008 until May 2012 was conducted to evaluate the impact of this evidence-based practice change to streamline the "door to drug" process. The average length of time until antibiotic administration, nurses' compliance initiating the protocol, and ED length of stay were determined. The review included 2758 medical records. During the study period from 2008 to 2012, one emergency department's average time for drug administration dropped from 103 to 44 minutes, and the second dropped from 141 to 61 minutes. Both campuses also improved their protocol compliance, with ED 1 increasing from 24% to 78% and ED 2 improving from 30% to 84%. This quality initiative has direct application for all ED leaders who treat children with cancer. High-risk patients can benefit from a streamlined nurse-initiated process that decreases negative consequences of fever. Collaboration by interdisciplinary leadership within the health care facility, as well as key stakeholder buy-in, is imperative to achieve a process that may lead to decreased hospital stay and reduced systemic infection or mortality for these vulnerable patients. Copyright © 2013 Emergency Nurses Association. Published by Mosby, Inc. All rights reserved.
Remmelink, Esther; Loos, Maarten; Koopmans, Bastijn; Aarts, Emmeke; van der Sluis, Sophie; Smit, August B; Verhage, Matthijs
2015-04-15
Individuals are able to change their behavior based on its consequences, a process involving instrumental learning. Studying instrumental learning in mice can provide new insights in this elementary aspect of cognition. Conventional appetitive operant learning tasks that facilitate the study of this form of learning in mice, as well as more complex operant paradigms, require labor-intensive handling and food deprivation to motivate the animals. Here, we describe a 1-night operant learning protocol that exploits the advantages of automated home-cage testing and circumvents the interfering effects of food restriction. The task builds on behavior that is part of the spontaneous exploratory repertoire during the days before the task. We compared the behavior of C57BL/6J, BALB/cJ and DBA/2J mice and found various differences in behavior during this task, but no differences in learning curves. BALB/cJ mice showed the largest instrumental learning response, providing a superior dynamic range and statistical power to study instrumental learning by using this protocol. Insights gained with this home-cage-based learning protocol without food restriction will be valuable for the development of other, more complex, cognitive tasks in automated home-cages. Copyright © 2015 Elsevier B.V. All rights reserved.
ERIC Educational Resources Information Center
Song, Yi; Deane, Paul; Beigman Klebanov, Beata
2017-01-01
This project focuses on laying the foundations for automated analysis of argumentation schemes, supporting identification and classification of the arguments being made in a text, for the purpose of scoring the quality of written analyses of arguments. We developed annotation protocols for 20 argument prompts from a college-level test under the…
Vogel, Adam P; Block, Susan; Kefalianos, Elaina; Onslow, Mark; Eadie, Patricia; Barth, Ben; Conway, Laura; Mundt, James C; Reilly, Sheena
2015-04-01
To investigate the feasibility of adopting automated interactive voice response (IVR) technology for remotely capturing standardized speech samples from stuttering children. Participants were ten 6-year-old stuttering children. Their parents called a toll-free number from their homes and were prompted to elicit speech from their children using a standard protocol involving conversation, picture description and games. The automated IVR system was implemented using an off-the-shelf telephony software program and delivered by a standard desktop computer. The software infrastructure utilizes voice over internet protocol. Speech samples were automatically recorded during the calls. Video recordings were simultaneously acquired in the home at the time of the call to evaluate the fidelity of the telephone collected samples. Key outcome measures included syllables spoken, percentage of syllables stuttered and an overall rating of stuttering severity using a 10-point scale. Data revealed a high level of relative reliability in terms of intra-class correlation between the video and telephone acquired samples on all outcome measures during the conversation task. Findings were less consistent for speech samples during picture description and games. Results suggest that IVR technology can be used successfully to automate remote capture of child speech samples.
Automated Purification of Recombinant Proteins: Combining High-throughput with High Yield
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lin, Chiann Tso; Moore, Priscilla A.; Auberry, Deanna L.
2006-05-01
Protein crystallography, mapping protein interactions and other approaches of current functional genomics require not only purifying large numbers of proteins but also obtaining sufficient yield and homogeneity for downstream high-throughput applications. There is a need for the development of robust automated high-throughput protein expression and purification processes to meet these requirements. We developed and compared two alternative workflows for automated purification of recombinant proteins based on expression of bacterial genes in Escherichia coli: first, a filtration separation protocol based on expression in 800 ml E. coli cultures followed by filtration purification using Ni2+-NTA™ agarose (Qiagen); second, a smaller-scale magnetic separation method based on expression in 25 ml cultures of E. coli followed by 96-well purification on MagneHis™ Ni2+ agarose (Promega). Both workflows provided comparable average yields of about 8 µg of purified protein per unit of OD at 600 nm of bacterial culture. We discuss advantages and limitations of the automated workflows, which can provide proteins more than 90% pure in the range of 100 µg to 45 mg per purification run, as well as strategies for optimization of these protocols.
Dewes, Patricia; Frellesen, Claudia; Scholtz, Jan-Erik; Fischer, Sebastian; Vogl, Thomas J; Bauer, Ralf W; Schulz, Boris
2016-06-01
To evaluate a novel tin filter-based abdominal CT protocol for urolithiasis in terms of image quality and CT dose parameters. 130 consecutive patients with suspected urolithiasis underwent non-enhanced CT with three different protocols: 48 patients (group 1) were examined at tin-filtered 150 kV (150 kV Sn) on a third-generation dual-source CT, 33 patients were examined with automated kV selection (110-140 kV) based on the scout view on the same CT device (group 2), and 49 patients were examined on a second-generation dual-source CT (group 3) with automated kV selection (100-140 kV). Automated exposure control was active in all groups. Image quality was subjectively evaluated on a 5-point Likert scale by two radiologists, and interobserver agreement as well as signal-to-noise ratio (SNR) was calculated. Dose-length product (DLP) and volume CT dose index (CTDIvol) were compared. Image quality was rated in favour of the tin filter protocol with excellent interobserver agreement (ICC=0.86-0.91), and the difference reached statistical significance (p<0.001). SNR was significantly higher in groups 1 and 2 compared to the second-generation DSCT (p<0.001). On third-generation dual-source CT, there was no significant difference in SNR between the 150 kV Sn and the automated kV selection protocol (p=0.5). The DLP of group 1 was 23% and 21% (p<0.002) lower in comparison to groups 2 and 3, respectively, as was the CTDIvol of group 1 compared to group 2 (-36%) and group 3 (-32%) (p<0.001). Additional shaping of a 150 kV source spectrum by a tin filter substantially lowers patient exposure while improving image quality on unenhanced abdominal computed tomography for urinary stone disease. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
Moore, J A; Nemat-Gorgani, M; Madison, A C; Sandahl, M A; Punnamaraju, S; Eckhardt, A E; Pollack, M G; Vigneault, F; Church, G M; Fair, R B; Horowitz, M A; Griffin, P B
2017-01-01
This paper reports on the use of a digital microfluidic platform to perform multiplex automated genetic engineering (MAGE) cycles on droplets containing Escherichia coli cells. Bioactivated magnetic beads were employed for cell binding, washing, and media exchange in the preparation of electrocompetent cells in the electrowetting-on-dielectric (EWoD) platform. On-cartridge electroporation was used to deliver oligonucleotides into the cells. In addition to the optimization of a magnetic bead-based benchtop protocol for generating and transforming electrocompetent E. coli cells, we report on the implementation of this protocol in a fully automated digital microfluidic platform. Bead-based media exchange and electroporation pulse conditions were optimized on benchtop for transformation frequency to provide initial parameters for microfluidic device trials. Benchtop experiments comparing electrotransformation of free and bead-bound cells are presented. Our results suggest that dielectric shielding intrinsic to bead-bound cells significantly reduces electroporation field exposure efficiency. However, high transformation frequency can be maintained in the presence of magnetic beads through the application of more intense electroporation pulses. As a proof of concept, MAGE cycles were successfully performed on a commercial EWoD cartridge using variations of the optimal magnetic bead-based preparation procedure and pulse conditions determined by the benchtop results. Transformation frequencies up to 22% were achieved on benchtop; this frequency was matched within 1% (21%) by MAGE cycles on the microfluidic device. However, typical frequencies on the device remain lower, averaging 9% with a standard deviation of 9%. The presented results demonstrate the potential of digital microfluidics to perform complex and automated genetic engineering protocols.
Moore, J. A.; Nemat-Gorgani, M.; Madison, A. C.; Punnamaraju, S.; Eckhardt, A. E.; Pollack, M. G.; Church, G. M.; Fair, R. B.; Horowitz, M. A.; Griffin, P. B.
2017-01-01
This paper reports on the use of a digital microfluidic platform to perform multiplex automated genetic engineering (MAGE) cycles on droplets containing Escherichia coli cells. Bioactivated magnetic beads were employed for cell binding, washing, and media exchange in the preparation of electrocompetent cells in the electrowetting-on-dielectric (EWoD) platform. On-cartridge electroporation was used to deliver oligonucleotides into the cells. In addition to the optimization of a magnetic bead-based benchtop protocol for generating and transforming electrocompetent E. coli cells, we report on the implementation of this protocol in a fully automated digital microfluidic platform. Bead-based media exchange and electroporation pulse conditions were optimized on benchtop for transformation frequency to provide initial parameters for microfluidic device trials. Benchtop experiments comparing electrotransformation of free and bead-bound cells are presented. Our results suggest that dielectric shielding intrinsic to bead-bound cells significantly reduces electroporation field exposure efficiency. However, high transformation frequency can be maintained in the presence of magnetic beads through the application of more intense electroporation pulses. As a proof of concept, MAGE cycles were successfully performed on a commercial EWoD cartridge using variations of the optimal magnetic bead-based preparation procedure and pulse conditions determined by the benchtop results. Transformation frequencies up to 22% were achieved on benchtop; this frequency was matched within 1% (21%) by MAGE cycles on the microfluidic device. However, typical frequencies on the device remain lower, averaging 9% with a standard deviation of 9%. The presented results demonstrate the potential of digital microfluidics to perform complex and automated genetic engineering protocols. PMID:28191268
Enhancing reproducibility of ultrasonic measurements by new users
NASA Astrophysics Data System (ADS)
Pramanik, Manojit; Gupta, Madhumita; Krishnan, Kajoli Banerjee
2013-03-01
Operator perception influences ultrasound image acquisition and processing. Lower costs are attracting new users to medical ultrasound. Anticipating an increase in this trend, we conducted a study to quantify the variability in ultrasonic measurements made by novice users and identify methods to reduce it. We designed a protocol with four presets and trained four new users to scan and manually measure the head circumference of a fetal phantom with an ultrasound scanner. In the first phase, the users followed this protocol in seven distinct sessions. They then received feedback on the quality of the scans from an expert. In the second phase, two of the users repeated the entire protocol aided by visual cues provided to them during scanning. We performed off-line measurements on all the images using a fully automated algorithm capable of measuring the head circumference from fetal phantom images. The ground truth (198.1±1.6 mm) was based on sixteen scans and measurements made by an expert. Our analysis shows that: (1) the inter-observer variability of manual measurements was 5.5 mm, whereas the inter-observer variability of automated measurements was only 0.6 mm in the first phase; (2) consistency of image appearance improved and mean manual measurements were 4-5 mm closer to the ground truth in the second phase; and (3) automated measurements were more precise, accurate and less sensitive to different presets compared to manual measurements in both phases. Our results show that visual aids and automation can bring more reproducibility to ultrasonic measurements made by new users.
Sochor, Jiri; Ryvolova, Marketa; Krystofova, Olga; Salas, Petr; Hubalek, Jaromir; Adam, Vojtech; Trnkova, Libuse; Havel, Ladislav; Beklova, Miroslava; Zehnalek, Josef; Provaznik, Ivo; Kizek, Rene
2010-11-29
The aim of this study was to describe the behaviour, kinetics, time courses and limitations of six different fully automated spectrometric methods: DPPH, TEAC, FRAP, DMPD, Free Radicals and Blue CrO5. Absorption curves were measured and absorbance maxima were found. All methods were calibrated using the standard compounds Trolox® and/or gallic acid. Calibration curves were determined (relative standard deviation was within the range from 1.5 to 2.5%). The obtained characteristics were compared and discussed. Moreover, the data obtained were used to optimize and to automate all of the mentioned protocols. The automatic analyzer allowed us to analyse a larger set of samples simultaneously, to decrease the measurement time, to eliminate errors and to provide data of higher quality in comparison to manual analysis. The total time of analysis for one sample was decreased to 10 min for all six methods, whereas the total time of manual spectrometric determination was approximately 120 min. The obtained data showed good correlations between the studied methods (R=0.97-0.99).
Frégeau, Chantal J; Lett, C Marc; Elliott, Jim; Yensen, Craig; Fourney, Ron M
2008-05-01
An automated process has been developed for the analysis of forensic casework samples using TECAN Genesis RSP 150/8 or Freedom EVO liquid handling workstations equipped exclusively with nondisposable tips. Robot tip cleaning routines have been incorporated strategically within the DNA extraction process as well as at the end of each session. Alternative options were examined for cleaning the tips and different strategies were employed to verify cross-contamination. A 2% sodium hypochlorite wash (1/5th dilution of the 10.8% commercial bleach stock) proved to be the best overall approach for preventing cross-contamination of samples processed using our automated protocol. The bleach wash steps do not adversely impact the short tandem repeat (STR) profiles developed from DNA extracted robotically and allow for major cost savings through the implementation of fixed tips. We have demonstrated that robotic workstations equipped with fixed pipette tips can be used with confidence with properly designed tip washing routines to process casework samples using an adapted magnetic bead extraction protocol.
Automated Fluid Feature Extraction from Transient Simulations
NASA Technical Reports Server (NTRS)
Haimes, Robert
1998-01-01
In the past, feature extraction and identification were interesting concepts, but not required to understand the underlying physics of a steady flow field. This is because the results of the more traditional tools like iso-surfaces, cuts and streamlines were more interactive and easily abstracted so they could be represented to the investigator. These tools worked and properly conveyed the collected information, at the expense of much interaction. For unsteady flow fields, the investigator does not have the luxury of spending time scanning only one 'snap-shot' of the simulation. Automated assistance is required in pointing out areas of potential interest contained within the flow. This must not require a heavy compute burden (the visualization should not significantly slow down the solution procedure for co-processing environments like pV3), and methods must be developed to abstract the feature and display it in a manner that physically makes sense. The following is a list of the important physical phenomena found in transient (and steady-state) fluid flow: Shocks; Vortex cores; Regions of Recirculation; Boundary Layers; Wakes.
3D aquifer characterization using stochastic streamline calibration
NASA Astrophysics Data System (ADS)
Jang, Minchul
2007-03-01
In this study, a new inverse approach, stochastic streamline calibration, is proposed. Using both a streamline concept and a stochastic technique, stochastic streamline calibration optimizes an identified field to fit given observation data in an exceptionally fast and stable fashion. In stochastic streamline calibration, streamlines are adopted as basic elements not only for describing fluid flow but also for identifying the permeability distribution. Based on the streamline-based inversion of Agarwal et al. [Agarwal B, Blunt MJ. Streamline-based method with full-physics forward simulation for history matching performance data of a North sea field. SPE J 2003;8(2):171-80] and Wang and Kovscek [Wang Y, Kovscek AR. Streamline approach for history matching production data. SPE J 2000;5(4):353-62], permeability is modified along streamlines rather than at individual gridblocks. Permeabilities in the gridblocks through which a streamline passes are adjusted by multiplication with a factor chosen to match the flow and transport properties of that streamline. This enables the inverse process to achieve fast convergence. In addition, equipped with a stochastic module, the proposed technique calibrates the identified field in a stochastic manner while incorporating spatial information into the field. This prevents the inverse process from getting stuck in local minima and helps it search for a globally optimized solution. Simulation results indicate that stochastic streamline calibration identifies an unknown permeability field exceptionally quickly. More notably, the identified permeability distribution reflects realistic geological features, which the original work of Agarwal et al. could not achieve because of the large modifications made along streamlines to match production data only. The model constructed by stochastic streamline calibration forecasted plume transport similar to that of a reference model. We therefore expect the proposed approach to be applicable to the construction of aquifer models and the forecasting of aquifer performance measures of interest.
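The core update rule, multiplying the permeability of every grid block traversed by a streamline by a single factor derived from the travel-time mismatch, can be sketched in a few lines. The grid, streamline path, and travel times below are invented, and the real method applies this within a stochastic calibration loop rather than once.

```python
import numpy as np

# Toy sketch of the core idea (not the authors' implementation): permeability is
# modified along streamlines rather than cell by cell, by multiplying every grid
# block a streamline passes through by one factor derived from the mismatch
# between observed and simulated travel time for that streamline.
perm = np.full((20, 20), 100.0)          # mD, initial guess on a 2D grid

# Hypothetical streamline: list of (row, col) grid blocks it traverses.
streamline_cells = [(5, c) for c in range(3, 15)]

t_obs, t_sim = 80.0, 120.0               # observed vs simulated travel time (days)
# Travel time scales roughly inversely with permeability, so boost k where flow is too slow.
factor = t_sim / t_obs

for r, c in streamline_cells:
    perm[r, c] *= factor

print(perm[5, 3:15])
```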
Hargrave, Catriona; Mason, Nicole; Guidi, Robyn; Miller, Julie-Anne; Becker, Jillian; Moores, Matthew; Mengersen, Kerrie; Poulsen, Michael; Harden, Fiona
2016-03-01
Time-consuming manual methods have been required to register cone-beam computed tomography (CBCT) images with plans in the Pinnacle³ treatment planning system in order to replicate delivered treatments for adaptive radiotherapy. These methods rely on fiducial marker (FM) placement during CBCT acquisition or on the image mid-point to localise the image isocentre. A quality assurance study was conducted to validate an automated CBCT-plan registration method utilising the Digital Imaging and Communications in Medicine (DICOM) Structure Set (RS) and Spatial Registration (RE) files created during online image-guided radiotherapy (IGRT). CBCTs of a phantom were acquired with FMs and predetermined setup errors using various online IGRT workflows. The CBCTs and DICOM RS and RE files were imported into Pinnacle³ plans of the phantom, and the resulting automated CBCT-plan registrations were compared to existing manual methods. A clinical protocol for the automated method was subsequently developed and tested retrospectively using CBCTs and plans for six bladder patients. The automated CBCT-plan registration method was successfully applied to thirty-four phantom CBCT images acquired with an online 0 mm action level workflow. Ten CBCTs acquired with other IGRT workflows required manual workarounds. This was addressed during the development and testing of the clinical protocol using twenty-eight patient CBCTs. The automated CBCT-plan registrations were instantaneous, replicating delivered treatments in Pinnacle³ with errors of ±0.5 mm. These errors were comparable to mid-point-dependent manual registrations but superior to FM-dependent manual registrations. The automated CBCT-plan registration method quickly and reliably replicates delivered treatments in Pinnacle³ for adaptive radiotherapy.
In Vitro Mass Propagation of Cymbopogon citratus Stapf., a Medicinal Gramineae.
Quiala, Elisa; Barbón, Raúl; Capote, Alina; Pérez, Naivy; Jiménez, Elio
2016-01-01
Cymbopogon citratus (D.C.) Stapf. is a medicinal plant and the source of lemon grass oil, which has multiple uses in the pharmaceutical and food industries. Conventional propagation in semisolid culture medium has become a fast tool for mass propagation of lemon grass, but production costs must be lowered. A solution could be the application of in vitro propagation methods based on the advantages of liquid culture and automation. This chapter provides two efficient protocols for in vitro propagation of this medicinal plant via organogenesis and somatic embryogenesis. First, we report the production of shoots using a temporary immersion system (TIS). Second, a protocol for somatic embryogenesis is described, using semisolid culture for callus formation and multiplication, and liquid culture in a rotary shaker and conventional bioreactors for the maintenance of embryogenic cultures. Well-developed plants can be obtained from both protocols. Here we provide a fast and efficient technology for mass propagation of this medicinal plant that takes advantage of liquid culture and automation.
Novel SPECT Technologies and Approaches in Cardiac Imaging
Slomka, Piotr; Hung, Guang-Uei; Germano, Guido; Berman, Daniel S.
2017-01-01
Recent novel approaches in myocardial perfusion single photon emission CT (SPECT) have been facilitated by new dedicated high-efficiency hardware with solid-state detectors and optimized collimators. New protocols include very low-dose (1 mSv) stress-only, two-position imaging to mitigate attenuation artifacts, and simultaneous dual-isotope imaging. Attenuation correction can be performed by specialized low-dose systems or by previously obtained CT coronary calcium scans. Hybrid protocols using CT angiography have been proposed. Image quality improvements have been demonstrated by novel reconstructions and motion correction. Fast SPECT acquisition facilitates dynamic flow and early function measurements. Image processing algorithms have become automated with virtually unsupervised extraction of quantitative imaging variables. This automation facilitates integration with clinical variables derived by machine learning to predict patient outcome or diagnosis. In this review, we describe new imaging protocols made possible by the new hardware developments. We also discuss several novel software approaches for the quantification and interpretation of myocardial perfusion SPECT scans. PMID:29034066
Towards Automated Bargaining in Electronic Markets: A Partially Two-Sided Competition Model
NASA Astrophysics Data System (ADS)
Gatti, Nicola; Lazaric, Alessandro; Restelli, Marcello
This paper focuses on the prominent issue of automating bargaining agents within electronic markets. Bargaining models in the literature deal with settings in which there are only two agents; no model satisfactorily captures settings with competition among multiple buyers and, analogously, among multiple sellers. In this paper, we extend the principal bargaining protocol, i.e. the alternating-offers protocol, to capture bargaining in markets. The model we propose is such that, in the presence of a unique buyer and a unique seller, agents' equilibrium strategies are those of the original protocol. Moreover, we study the resulting game game-theoretically and provide the following results: in the presence of one-sided competition (more buyers and one seller, or vice versa) we provide agents' equilibrium strategies for all values of the parameters; in the presence of two-sided competition (more buyers and more sellers) we provide an algorithm that produces agents' equilibrium strategies for a large set of the parameters, and we experimentally evaluate its effectiveness.
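For reference, the two-agent baseline that the extended protocol reduces to is the classic alternating-offers (Rubinstein) split. The snippet below computes that textbook equilibrium share; it is not the paper's market extension, and the discount factors are arbitrary examples.

```python
# Baseline two-agent alternating-offers (Rubinstein) split used as the starting point
# before adding market competition; a standard textbook result, not the paper's model.
def rubinstein_shares(delta_proposer: float, delta_responder: float):
    """Equilibrium surplus shares when the proposer makes the first offer and both
    parties discount each further round by their respective delta."""
    first_mover = (1.0 - delta_responder) / (1.0 - delta_proposer * delta_responder)
    return first_mover, 1.0 - first_mover

print(rubinstein_shares(0.9, 0.8))   # e.g. (0.714..., 0.285...)
```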
Bergmeister, Konstantin D; Gröger, Marion; Aman, Martin; Willensdorfer, Anna; Manzano-Szalai, Krisztina; Salminger, Stefan; Aszmann, Oskar C
2016-08-01
Skeletal muscle consists of different fiber types which adapt to exercise, aging, disease, or trauma. Here we present a protocol for fast staining, automatic acquisition, and quantification of fiber populations with ImageJ. Biceps and lumbrical muscles were harvested from Sprague-Dawley rats. Quadruple immunohistochemical staining was performed on single sections using antibodies against myosin heavy chains and secondary fluorescent antibodies. Slides were scanned automatically with a slide scanner. Manual and automatic analyses were performed and compared statistically. The protocol provided rapid and reliable staining for automated image acquisition. Analyses of manual versus automatic data yielded Pearson correlation coefficients of 0.645-0.841 for biceps and 0.564-0.673 for lumbrical muscles. Relative fiber populations were accurate to within ±4%. This protocol provides a reliable tool for quantification of muscle fiber populations. Using freely available software, it decreases the time required to analyze whole muscle sections. Muscle Nerve 54: 292-299, 2016. © 2016 Wiley Periodicals, Inc.
Prediction of psychosis across protocols and risk cohorts using automated language analysis
Corcoran, Cheryl M.; Carrillo, Facundo; Fernández‐Slezak, Diego; Bedi, Gillinder; Klim, Casimir; Javitt, Daniel C.; Bearden, Carrie E.; Cecchi, Guillermo A.
2018-01-01
Language and speech are the primary source of data for psychiatrists to diagnose and treat mental disorders. In psychosis, the very structure of language can be disturbed, including semantic coherence (e.g., derailment and tangentiality) and syntactic complexity (e.g., concreteness). Subtle disturbances in language are evident in schizophrenia even prior to first psychosis onset, during prodromal stages. Using computer‐based natural language processing analyses, we previously showed that, among English‐speaking clinical (e.g., ultra) high‐risk youths, baseline reduction in semantic coherence (the flow of meaning in speech) and in syntactic complexity could predict subsequent psychosis onset with high accuracy. Herein, we aimed to cross‐validate these automated linguistic analytic methods in a second larger risk cohort, also English‐speaking, and to discriminate speech in psychosis from normal speech. We identified an automated machine‐learning speech classifier – comprising decreased semantic coherence, greater variance in that coherence, and reduced usage of possessive pronouns – that had an 83% accuracy in predicting psychosis onset (intra‐protocol), a cross‐validated accuracy of 79% of psychosis onset prediction in the original risk cohort (cross‐protocol), and a 72% accuracy in discriminating the speech of recent‐onset psychosis patients from that of healthy individuals. The classifier was highly correlated with previously identified manual linguistic predictors. Our findings support the utility and validity of automated natural language processing methods to characterize disturbances in semantics and syntax across stages of psychotic disorder. The next steps will be to apply these methods in larger risk cohorts to further test reproducibility, also in languages other than English, and identify sources of variability. This technology has the potential to improve prediction of psychosis outcome among at‐risk youths and identify linguistic targets for remediation and preventive intervention. More broadly, automated linguistic analysis can be a powerful tool for diagnosis and treatment across neuropsychiatry. PMID:29352548
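A crude, hedged stand-in for the feature extraction (not the authors' pipeline) can be sketched with scikit-learn: semantic coherence as the cosine similarity between consecutive sentence vectors from LSA over TF-IDF, plus its variance and a possessive-pronoun rate. The pronoun list, component count, and example sentences are assumptions for illustration only.

```python
import numpy as np
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.decomposition import TruncatedSVD
from sklearn.metrics.pairwise import cosine_similarity

def coherence_features(sentences):
    """Crude stand-in for the paper's features: mean and variance of coherence
    between consecutive sentences, plus the rate of possessive pronouns."""
    vec = TfidfVectorizer().fit_transform(sentences)
    n_comp = max(2, min(20, vec.shape[1] - 1, len(sentences) - 1))
    lsa = TruncatedSVD(n_components=n_comp).fit_transform(vec)
    sims = [cosine_similarity(lsa[i:i + 1], lsa[i + 1:i + 2])[0, 0]
            for i in range(len(sentences) - 1)]
    possessives = {"my", "mine", "your", "yours", "his", "her", "hers",
                   "our", "ours", "their", "theirs"}
    tokens = " ".join(sentences).lower().split()
    poss_rate = sum(t in possessives for t in tokens) / max(len(tokens), 1)
    return {"mean_coherence": float(np.mean(sims)),
            "var_coherence": float(np.var(sims)),
            "possessive_rate": poss_rate}

print(coherence_features([
    "I went to the store with my brother.",
    "We bought bread and milk for our mother.",
    "The sky turned a strange color over the mountains.",
]))
```

Features of this kind could then feed any standard classifier; the paper's reported accuracies refer to its own classifier and cohorts, not to this sketch.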
Prediction of psychosis across protocols and risk cohorts using automated language analysis.
Corcoran, Cheryl M; Carrillo, Facundo; Fernández-Slezak, Diego; Bedi, Gillinder; Klim, Casimir; Javitt, Daniel C; Bearden, Carrie E; Cecchi, Guillermo A
2018-02-01
Language and speech are the primary source of data for psychiatrists to diagnose and treat mental disorders. In psychosis, the very structure of language can be disturbed, including semantic coherence (e.g., derailment and tangentiality) and syntactic complexity (e.g., concreteness). Subtle disturbances in language are evident in schizophrenia even prior to first psychosis onset, during prodromal stages. Using computer-based natural language processing analyses, we previously showed that, among English-speaking clinical (e.g., ultra) high-risk youths, baseline reduction in semantic coherence (the flow of meaning in speech) and in syntactic complexity could predict subsequent psychosis onset with high accuracy. Herein, we aimed to cross-validate these automated linguistic analytic methods in a second larger risk cohort, also English-speaking, and to discriminate speech in psychosis from normal speech. We identified an automated machine-learning speech classifier - comprising decreased semantic coherence, greater variance in that coherence, and reduced usage of possessive pronouns - that had an 83% accuracy in predicting psychosis onset (intra-protocol), a cross-validated accuracy of 79% of psychosis onset prediction in the original risk cohort (cross-protocol), and a 72% accuracy in discriminating the speech of recent-onset psychosis patients from that of healthy individuals. The classifier was highly correlated with previously identified manual linguistic predictors. Our findings support the utility and validity of automated natural language processing methods to characterize disturbances in semantics and syntax across stages of psychotic disorder. The next steps will be to apply these methods in larger risk cohorts to further test reproducibility, also in languages other than English, and identify sources of variability. This technology has the potential to improve prediction of psychosis outcome among at-risk youths and identify linguistic targets for remediation and preventive intervention. More broadly, automated linguistic analysis can be a powerful tool for diagnosis and treatment across neuropsychiatry. © 2018 World Psychiatric Association.
Protocol Coordinator | Center for Cancer Research
PROGRAM DESCRIPTION Within the Leidos Biomedical Research Inc.’s Clinical Research Directorate, the Clinical Monitoring Research Program (CMRP) provides high-quality comprehensive and strategic operational support to the high-profile domestic and international clinical research initiatives of the National Cancer Institute (NCI), National Institute of Allergy and Infectious Diseases (NIAID), Clinical Center (CC), National Institute of Heart, Lung and Blood Institute (NHLBI), National Institute of Arthritis and Musculoskeletal and Skin Diseases (NIAMS), National Center for Advancing Translational Sciences (NCATS), National Institute of Neurological Disorders and Stroke (NINDS), and the National Institute of Mental Health (NIMH). Since its inception in 2001, CMRP’s ability to provide rapid responses, high-quality solutions, and to recruit and retain experts with a variety of backgrounds to meet the growing research portfolios of NCI, NIAID, CC, NHLBI, NIAMS, NCATS, NINDS, and NIMH has led to the considerable expansion of the program and its repertoire of support services. CMRP’s support services are strategically aligned with the program’s mission to provide comprehensive, dedicated support to assist National Institutes of Health researchers in providing the highest quality of clinical research in compliance with applicable regulations and guidelines, maintaining data integrity, and protecting human subjects. For the scientific advancement of clinical research, CMRP services include comprehensive clinical trials, regulatory, pharmacovigilance, protocol navigation and development, and programmatic and project management support for facilitating the conduct of 400+ Phase I, II, and III domestic and international trials on a yearly basis. These trials investigate the prevention, diagnosis, treatment of, and therapies for cancer, influenza, HIV, and other infectious diseases and viruses such as hepatitis C, tuberculosis, malaria, and Ebola virus; heart, lung, and blood diseases and conditions; parasitic infections; rheumatic and inflammatory diseases; and rare and neglected diseases. CMRP’s collaborative approach to clinical research and the expertise and dedication of staff to the continuation and success of the program’s mission has contributed to improving the overall standards of public health on a global scale. The Clinical Monitoring Research Program (CMRP) provides comprehensive clinical and administrative support to the National Cancer Institute’s Center for Cancer Research’s (CCR), Office of Regulatory Affairs for protocol development review, regulatory review, and the implementation process as well as oversees medical writing/editing, regulatory/ compliance, and protocol coordination/navigation and administration. 
KEY ROLES/RESPONSIBILITIES - THIS POSITION IS CONTINGENT UPON FUNDING APPROVAL
The Protocol Coordinator II:
- Provides programmatic and logistical support for the operations of clinical research for Phase I and Phase II clinical trials
- Provides deployment of clinical support services for clinical research
- Streamlines the protocol development timeline
- Provides data and document collection and compilation for regulatory filing with the FDA and other regulatory authorities
- Provides administrative coordination and general logistical support for regulatory activities
- Ensures the provision of training for investigators and associate staff to reinforce and enhance a GCP culture
- Provides quality assurance and quality control oversight
- Performs regulatory review of clinical protocols, informed consent and other clinical documents
- Tracks and facilitates a portfolio of protocols through each process step (IRB, RAC, DSMB, Office of Protocol Services)
- Assists clinical investigators in preparing clinical research protocols, including writing and formatting protocol documents and consent forms
- Prepares protocol packages for review and ensures that protocol packages include all of the required material and comply with CCR, NCI and NIH policies
- Collaborates with investigators to resolve any protocol/data issues
- Coordinates submission of protocols for scientific and ethical review by the Branch scientific review committees, the NCI Institutional Review Board (IRB) and the clinical trial sponsor or the FDA
- Monitors the review process and maintains detailed, complete and accurate records for each protocol of the approvals at the various stages of the review process, including new protocol submissions, amendments to protocols, and continuing reviews, as well as other submissions such as adverse events
- Attends and prepares minutes for the Branch Protocol Review Committees
- For protocols that are performed with other research centers: contacts coordinators at other centers to obtain review committee approvals at these centers, maintains records of these approvals at the outside centers in the protocol files, and sends protocol amendments and other reports to the participating centers
- Maintains a schedule of all review committee submission deadline dates and meeting dates
- Assists clinical investigators in understanding and complying with the entire review process
- Works closely with the NCI Protocol Review Office in establishing and maintaining a paperless automated document management and tracking system for NCI protocols
- Converts protocols from Word format to PDF with bookmarks
- Maintains the PDF version of the most current approved version of each active clinical protocol on a central server
This position has the option to be located in Frederick or Rockville, Maryland.
Evaluation in context: ATC automation in the field
NASA Technical Reports Server (NTRS)
Harwood, Kelly; Sanford, Beverly
1994-01-01
The process for incorporating advanced technologies into complex aviation systems is as important as the final product itself. This paper describes a process that is currently being applied to the development and assessment of an advanced ATC automation system, CTAS. The key element of the process is field exposure early in the system development cycle. The process deviates from current established practices of system development -- where field testing is an implementation endpoint -- and has been deemed necessary by the FAA for streamlining development and bringing system functions to a level of stability and usefulness. Methods and approaches for field assessment are borrowed from human factors engineering, cognitive engineering, and usability engineering and are tailored for the constraints of an operational ATC environment. To date, the focus has been on the qualitative assessment of the match between TMA capabilities and the context for their use. Capturing the users' experience with the automation tool and understanding tool use in the context of the operational environment is important, not only for developing a tool that is an effective problem-solving instrument but also for defining meaningful operational requirements. Such requirements form the basis for certifying the safety and efficiency of the system. CTAS is the first U.S. advanced ATC automation system of its scope and complexity to undergo this field development and assessment process. With the rapid advances in aviation technologies and our limited understanding of their impact on system performance, it is time we opened our eyes to new possibilities for developing, validating, and ultimately certifying complex aviation systems.
Indication-Based Ordering: A New Paradigm for Glycemic Control in Hospitalized Inpatients
Lee, Joshua; Clay, Brian; Zelazny, Ziband; Maynard, Gregory
2008-01-01
Background Inpatient glycemic control is a constant challenge. Institutional insulin management protocols and structured order sets are commonly advocated but poorly studied. Effective and validated methods to integrate algorithmic protocol guidance into the insulin ordering process are needed. Methods We introduced a basic structured set of computerized insulin orders (Version 1), and later introduced a paper insulin management protocol, to assist users with the order set. Metrics were devised to assess the impact of the protocol on insulin use, glycemic control, and hypoglycemia using pharmacy data and point of care glucose tests. When incremental improvement was seen (as described in the results), Version 2 of the insulin orders was created to further streamline the process. Results The percentage of regimens containing basal insulin improved with Version 1. The percentage of patient days with hypoglycemia improved from 3.68% at baseline to 2.59% with Version 1 plus the paper insulin management protocol, representing a relative risk for hypoglycemic day of 0.70 [confidence interval (CI) 0.62, 0.80]. The relative risk of an uncontrolled (mean glucose over 180 mg/dl) patient stay was reduced to 0.84 (CI 0.77, 0.91) with Version 1 and was reduced further to 0.73 (CI 0.66, 0.81) with the paper protocol. Version 2 used clinician-entered patient parameters to guide protocol-based insulin ordering and simultaneously improved the flexibility and ease of ordering over Version 1. Conclusion Patient parameter and protocol-based clinical decision support, added to computerized provider order entry, has a track record of improving glycemic control indices. This justifies the incorporation of these algorithms into online order management. PMID:19885198
Efficient model checking of network authentication protocol based on SPIN
NASA Astrophysics Data System (ADS)
Tan, Zhi-hua; Zhang, Da-fang; Miao, Li; Zhao, Dan
2013-03-01
Model checking is a very useful technique for verifying network authentication protocols. In order to improve the efficiency of modeling and verifying such protocols with model checking technology, this paper first proposes a universal formal description method for the protocols. Combined with the model checker SPIN, the method can conveniently verify the properties of a protocol. Using several model simplification strategies, the approach can model protocols efficiently and reduce the state space of the model. Compared with the previous literature, this work achieves a higher degree of automation and better verification efficiency. Finally, based on the described method, we model and verify the Privacy and Key Management (PKM) authentication protocol. The experimental results show that the model checking method is effective and useful for other authentication protocols.
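SPIN verifies Promela models; as a conceptual stand-in only, the Python sketch below does the same kind of work at toy scale, exhaustively exploring a protocol's state space by breadth-first search and reporting a counterexample if an undesired "responder accepts a stale, replayed nonce" state is reachable. The handshake model is invented and far simpler than the PKM protocol verified in the paper.

```python
from collections import deque

# Toy explicit-state model checking (a conceptual stand-in for SPIN/Promela): BFS over
# the protocol's state space, checking that a "bad" state is unreachable.
def reachable_bad_state(initial, successors, is_bad):
    seen, queue = {initial}, deque([initial])
    while queue:
        state = queue.popleft()
        if is_bad(state):
            return state                      # counterexample found
        for nxt in successors(state):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return None                               # property holds over the explored space

# State: (fresh_nonce_sent, network, accepted) -- all hashable, deliberately tiny.
def successors(state):
    fresh_sent, network, accepted = state
    out = []
    if not fresh_sent:                        # initiator sends a fresh nonce
        out.append((True, network | {"fresh"}, accepted))
    out.append((fresh_sent, network | {"stale"}, accepted))   # attacker replays an old nonce
    for msg in network:                       # responder accepts whatever arrives
        if accepted is None:
            out.append((fresh_sent, network, msg))
    return out

bad = reachable_bad_state((False, frozenset(), None), successors,
                          lambda s: s[2] == "stale")
print("counterexample:", bad)
```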
Automated image quality assessment for chest CT scans.
Reeves, Anthony P; Xie, Yiting; Liu, Shuang
2018-02-01
Medical image quality needs to be maintained at standards sufficient for effective clinical reading. Automated computer analytic methods may be applied to medical images for quality assessment. For chest CT scans in a lung cancer screening context, an automated quality assessment method is presented that characterizes image noise and image intensity calibration. This is achieved by image measurements in three automatically segmented homogeneous regions of the scan: external air, trachea lumen air, and descending aorta blood. Profiles of CT scanner behavior are also computed. The method has been evaluated on both phantom and real low-dose chest CT scans and results show that repeatable noise and calibration measures may be realized by automated computer algorithms. Noise and calibration profiles show relevant differences between different scanners and protocols. Automated image quality assessment may be useful for quality control for lung cancer screening and may enable performance improvements to automated computer analysis methods. © 2017 American Association of Physicists in Medicine.
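Assuming the three homogeneous regions have already been segmented, the noise and calibration measures described above reduce to simple statistics over the voxels in each region. The following sketch is an illustration only; the region names, nominal Hounsfield values and the exact noise definition are assumptions rather than the paper's specification.

```python
import numpy as np

# Approximate nominal Hounsfield units for the three homogeneous reference regions
NOMINAL_HU = {"external_air": -1000.0, "trachea_air": -1000.0, "aorta_blood": 40.0}

def region_quality_metrics(ct_volume, region_masks):
    """ct_volume: 3D array of HU values; region_masks: dict name -> boolean mask of same shape."""
    metrics = {}
    for name, mask in region_masks.items():
        voxels = ct_volume[mask]
        metrics[name] = {
            "mean_hu": float(voxels.mean()),
            "noise_sd": float(voxels.std(ddof=1)),                          # noise estimate
            "calibration_offset": float(voxels.mean() - NOMINAL_HU[name]),  # calibration error
        }
    return metrics

# Synthetic example standing in for a segmented low-dose chest CT scan
vol = np.random.normal(-1000.0, 15.0, size=(40, 64, 64))
external_air = np.zeros(vol.shape, dtype=bool)
external_air[:, :10, :10] = True   # real masks come from the automated segmentation step
print(region_quality_metrics(vol, {"external_air": external_air}))
```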
Transitioning the Defense Automated Neurobehavioral Assessment (DANA) to Operational Use
2013-10-01
science concentrates on CONUS-based studies such as testing DANA in clinical drug, fatigue/alertness, concussion and/or depression protocols. The ... operationally deployed into the military. Subject terms: neurocognitive, assessment, NCAT, concussion, mTBI, mild traumatic brain injury, psychological ...
HTAPP: High-Throughput Autonomous Proteomic Pipeline
Yu, Kebing; Salomon, Arthur R.
2011-01-01
Recent advances in the speed and sensitivity of mass spectrometers and in analytical methods, the exponential acceleration of computer processing speeds, and the availability of genomic databases from an array of species and protein information databases have led to a deluge of proteomic data. The development of a lab-based automated proteomic software platform for the automated collection, processing, storage, and visualization of expansive proteomic datasets is critically important. The high-throughput autonomous proteomic pipeline (HTAPP) described here is designed from the ground up to provide critically important flexibility for diverse proteomic workflows and to streamline the total analysis of a complex proteomic sample. This tool is comprised of software that controls the acquisition of mass spectral data along with automation of post-acquisition tasks such as peptide quantification, clustered MS/MS spectral database searching, statistical validation, and data exploration within a user-configurable lab-based relational database. The software design of HTAPP focuses on accommodating diverse workflows and providing missing software functionality to a wide range of proteomic researchers to accelerate the extraction of biological meaning from immense proteomic data sets. Although individual software modules in our integrated technology platform may have some similarities to existing tools, the true novelty of the approach described here is in the synergistic and flexible combination of these tools to provide an integrated and efficient analysis of proteomic samples. PMID:20336676
Ebert, Lars Christian; Ptacek, Wolfgang; Breitbeck, Robert; Fürst, Martin; Kronreif, Gernot; Martinez, Rosa Maria; Thali, Michael; Flach, Patricia M
2014-06-01
In this paper we present the second prototype of a robotic system to be used in forensic medicine. The system is capable of performing automated surface documentation using photogrammetry, optical surface scanning and image-guided, post-mortem needle placement for tissue sampling, liquid sampling, or the placement of guide wires. The upgraded system includes workflow optimizations, an automatic tool-change mechanism, a new software module for trajectory planning and a fully automatic computed tomography data-set registration algorithm. We tested the placement accuracy of the system using a needle phantom with radiopaque markers as targets. The system is routinely used for surface documentation, yielding 24 surface documentations over the course of 11 months. In accuracy tests for needle placement using a biopsy phantom, the Virtobot placed introducer needles with an accuracy of 1.4 mm (±0.9 mm). The second Virtobot prototype is an upgrade of the first that focuses mainly on streamlining the workflow and increasing the level of automation, and it also has a simpler user interface. These upgrades make the Virtobot a potentially valuable tool for case documentation in a scalpel-free setting that relies purely on imaging techniques and minimally invasive procedures, and they represent the next step toward virtual autopsy.
ROSE::FTTransform - A Source-to-Source Translation Framework for Exascale Fault-Tolerance Research
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lidman, J; Quinlan, D; Liao, C
2012-03-26
Exascale computing systems will require sufficient resilience to tolerate numerous types of hardware faults while still assuring correct program execution. Such extreme-scale machines are expected to be dominated by processors driven at lower voltages (near the minimum 0.5 volts for current transistors). At these voltage levels, the rate of transient errors increases dramatically due to the sensitivity to transient and geographically localized voltage drops on parts of the processor chip. To achieve power efficiency, these processors are likely to be streamlined and minimal, and thus they cannot be expected to handle transient errors entirely in hardware. Here we present an open, compiler-based framework to automate the armoring of High Performance Computing (HPC) software to protect it from these types of transient processor errors. We develop an open infrastructure to support research work in this area, and we define tools that, in the future, may provide more complete automated and/or semi-automated solutions to support software resiliency on future exascale architectures. Results demonstrate that our approach is feasible, pragmatic in how it can be separated from the software development process, and reasonably efficient (0% to 30% overhead for the Jacobi iteration on common hardware; and 20%, 40%, 26%, and 2% overhead for a randomly selected subset of benchmarks from the Livermore Loops [1]).
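ROSE::FTTransform works as a C/C++ source-to-source translator, but the transformation it automates can be sketched abstractly: run the protected computation redundantly and vote on the result. The Python fragment below is purely conceptual and does not reflect the framework's actual API or generated code.

```python
def armored(fn):
    """Triple-modular-redundancy wrapper: run the computation three times and vote."""
    def wrapper(*args, **kwargs):
        results = [fn(*args, **kwargs) for _ in range(3)]
        # Majority vote; with a single transient fault at most one result disagrees
        for candidate in results:
            if results.count(candidate) >= 2:
                return candidate
        raise RuntimeError("no majority - multiple faults detected")
    return wrapper

@armored
def jacobi_step(u, f, h):
    """One hypothetical stencil update (stands in for the protected numerical kernel)."""
    return [(u[i - 1] + u[i + 1] + h * h * f[i]) / 2.0 for i in range(1, len(u) - 1)]

print(jacobi_step([0.0, 1.0, 2.0, 3.0], [0.0] * 4, 0.1))
```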
EMHP: an accurate automated hole masking algorithm for single-particle cryo-EM image processing.
Berndsen, Zachary; Bowman, Charles; Jang, Haerin; Ward, Andrew B
2017-12-01
The Electron Microscopy Hole Punch (EMHP) is a streamlined suite of tools for quick assessment, sorting and hole masking of electron micrographs. With recent advances in single-particle electron cryo-microscopy (cryo-EM) data processing allowing for the rapid determination of protein structures using a smaller computational footprint, we saw the need for a fast and simple tool for data pre-processing that could run independently of existing high-performance computing (HPC) infrastructures. EMHP provides a data preprocessing platform in a small package that requires minimal Python dependencies to function. Availability and implementation: https://www.bitbucket.org/chazbot/emhp (Apache 2.0 License). Contact: bowman@scripps.edu. Supplementary data are available at Bioinformatics online. © The Author(s) 2017. Published by Oxford University Press.
Case study: impact of technology investment on lead discovery at Bristol-Myers Squibb, 1998-2006.
Houston, John G; Banks, Martyn N; Binnie, Alastair; Brenner, Stephen; O'Connell, Jonathan; Petrillo, Edward W
2008-01-01
We review strategic approaches taken over an eight-year period at BMS to implement new high-throughput approaches to lead discovery. Investments in compound management infrastructure and chemistry library production capability allowed significant growth in the size, diversity and quality of the BMS compound collection. Screening platforms were upgraded with robust automated technology to support miniaturized assay formats, while workflows and information handling technologies were streamlined for improved performance. These technology changes drove the need for a supporting organization in which critical engineering, informatics and scientific skills were more strongly represented. Taken together, these investments led to significant improvements in speed and productivity as well as a greater impact of screening campaigns on the initiation of new drug discovery programs.
Considerations and benefits of implementing an online database tool for business continuity.
Mackinnon, Susanne; Pinette, Jennifer
2016-01-01
In today's challenging climate of ongoing fiscal restraints, limited resources and complex organisational structures there is an acute need to investigate opportunities to facilitate enhanced delivery of business continuity programmes while maintaining or increasing acceptable levels of service delivery. In 2013, Health Emergency Management British Columbia (HEMBC), responsible for emergency management and business continuity activities across British Columbia's health sector, transitioned its business continuity programme from a manual to automated process with the development of a customised online database, known as the Health Emergency Management Assessment Tool (HEMAT). Key benefits to date include a more efficient business continuity input process, immediate situational awareness for use in emergency response and/or advanced planning and streamlined analyses for generation of reports.
Product-oriented Software Certification Process for Software Synthesis
NASA Technical Reports Server (NTRS)
Nelson, Stacy; Fischer, Bernd; Denney, Ewen; Schumann, Johann; Richardson, Julian; Oh, Phil
2004-01-01
The purpose of this document is to propose a product-oriented software certification process to facilitate use of software synthesis and formal methods. Why is such a process needed? Currently, software is tested until deemed bug-free rather than proving that certain software properties exist. This approach has worked well in most cases, but unfortunately, deaths still occur due to software failure. Using formal methods (techniques from logic and discrete mathematics like set theory, automata theory and formal logic as opposed to continuous mathematics like calculus) and software synthesis, it is possible to reduce this risk by proving certain software properties. Additionally, software synthesis makes it possible to automate some phases of the traditional software development life cycle resulting in a more streamlined and accurate development process.
Systems Maintenance Automated Repair Tasks (SMART)
NASA Technical Reports Server (NTRS)
2008-01-01
SMART is an interactive decision analysis and refinement software system that uses evaluation criteria for discrepant conditions to automatically provide and populate a document/procedure with predefined steps necessary to repair a discrepancy safely, effectively, and efficiently. SMART can store the tacit (corporate) knowledge merging the hardware specification requirements with the actual "how to" repair methods, sequences, and required equipment, all within a user-friendly interface. Besides helping organizations retain repair knowledge in streamlined procedures and sequences, SMART can also help them in saving processing time and expense, increasing productivity, improving quality, and adhering more closely to safety and other guidelines. Though SMART was developed for Space Shuttle applications, its interface is easily adaptable to any hardware that can be broken down by component, subcomponent, discrepancy, and repair.
Toward Intelligent Software Defect Detection
NASA Technical Reports Server (NTRS)
Benson, Markland J.
2011-01-01
Source code level software defect detection has gone from state of the art to a software engineering best practice. Automated code analysis tools streamline many of the aspects of formal code inspections but have the drawback of being difficult to construct and either prone to false positives or severely limited in the set of defects that can be detected. Machine learning technology provides the promise of learning software defects by example, easing construction of detectors and broadening the range of defects that can be found. Pinpointing software defects with the same level of granularity as prominent source code analysis tools distinguishes this research from past efforts, which focused on analyzing software engineering metrics data with granularity limited to that of a particular function rather than a line of code.
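As a toy illustration of line-granular defect detection by example (the features, labels, and model below are assumptions made for the sketch, not the method evaluated in this work), each source line can be treated as a short text and scored by a trained classifier.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Tiny illustrative corpus: (source line, 1 = known defective, 0 = clean)
LINES = [
    ("if (ptr = NULL) return;",              1),   # assignment instead of comparison
    ("for (i = 0; i <= n; i++) a[i] = 0;",   1),   # off-by-one overrun
    ("if (ptr == NULL) return;",             0),
    ("for (i = 0; i < n; i++) a[i] = 0;",    0),
    ("free(buf); buf = NULL;",               0),
    ("free(buf); use(buf);",                 1),   # use after free
]

model = make_pipeline(TfidfVectorizer(analyzer="char_wb", ngram_range=(2, 4)),
                      LogisticRegression(max_iter=1000))
model.fit([text for text, _ in LINES], [label for _, label in LINES])

# Rank new lines by predicted defect probability, mimicking line-granular flagging
candidates = ["if (count = limit) stop();", "if (count == limit) stop();"]
for line, p in zip(candidates, model.predict_proba(candidates)[:, 1]):
    print(f"{p:.2f}  {line}")
```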
Toward an Efficient Icing CFD Process Using an Interactive Software Toolkit: SmaggIce 2D
NASA Technical Reports Server (NTRS)
Vickerman, Mary B.; Choo, Yung K.; Schilling, Herbert W.; Baez, Marivell; Braun, Donald C.; Cotton, Barbara J.
2001-01-01
Two-dimensional CFD analysis for iced airfoils can be a labor-intensive task. The software toolkit SmaggIce 2D is being developed to help streamline the CFD process and provide the unique features needed for icing. When complete, it will include a combination of partially automated and fully interactive tools for all aspects of the tasks leading up to the flow analysis: geometry preparation, domain decomposition, block boundary discretization, gridding, and linking with a flow solver. It also includes tools to perform ice shape characterization, an important aid in determining the relationship between ice characteristics and their effects on aerodynamic performance. Completed tools, work-in-progress, and planned features of the software toolkit are presented here.
Protocol Coordinator | Center for Cancer Research
PROGRAM DESCRIPTION Within the Leidos Biomedical Research Inc.’s Clinical Research Directorate, the Clinical Monitoring Research Program (CMRP) provides high-quality comprehensive and strategic operational support to the high-profile domestic and international clinical research initiatives of the National Cancer Institute (NCI), National Institute of Allergy and Infectious Diseases (NIAID), Clinical Center (CC), National Institute of Heart, Lung and Blood Institute (NHLBI), National Institute of Arthritis and Musculoskeletal and Skin Diseases (NIAMS), National Center for Advancing Translational Sciences (NCATS), National Institute of Neurological Disorders and Stroke (NINDS), and the National Institute of Mental Health (NIMH). Since its inception in 2001, CMRP’s ability to provide rapid responses, high-quality solutions, and to recruit and retain experts with a variety of backgrounds to meet the growing research portfolios of NCI, NIAID, CC, NHLBI, NIAMS, NCATS, NINDS, and NIMH has led to the considerable expansion of the program and its repertoire of support services. CMRP’s support services are strategically aligned with the program’s mission to provide comprehensive, dedicated support to assist National Institutes of Health researchers in providing the highest quality of clinical research in compliance with applicable regulations and guidelines, maintaining data integrity, and protecting human subjects. For the scientific advancement of clinical research, CMRP services include comprehensive clinical trials, regulatory, pharmacovigilance, protocol navigation and development, and programmatic and project management support for facilitating the conduct of 400+ Phase I, II, and III domestic and international trials on a yearly basis. These trials investigate the prevention, diagnosis, treatment of, and therapies for cancer, influenza, HIV, and other infectious diseases and viruses such as hepatitis C, tuberculosis, malaria, and Ebola virus; heart, lung, and blood diseases and conditions; parasitic infections; rheumatic and inflammatory diseases; and rare and neglected diseases. CMRP’s collaborative approach to clinical research and the expertise and dedication of staff to the continuation and success of the program’s mission has contributed to improving the overall standards of public health on a global scale. The Clinical Monitoring Research Program (CMRP) provides comprehensive clinical and administrative support to the National Cancer Institute’s Center for Cancer Research’s (CCR) Protocol Support Office (PSO) for protocol development review, regulatory review, and the implementation process as well as oversees medical writing/editing, regulatory/compliance, and protocol coordination/navigation and administration. KEY ROLES/RESPONSIBILITIES The Protocol Coordinator III: Provides programmatic and logistical support for the operations of clinical research for Phase I and Phase II clinical trials. Provides deployment of clinical support services for clinical research. Streamlines the protocol development timeline. Provides data and document collection and compilation for regulatory filing with the U.S. Food and Drug Administration (FDA) and other regulatory authorities. Provides technical review and report preparation. Provides administrative coordination and general logistical support for regulatory activities. Ensures the provision of training for investigators and associate staff to reinforce and enhance a Good Clinical Practices (GCP) culture.
Oversees quality assurance and quality control, performs regulatory review of clinical protocols, informed consent and other clinical documents. Tracks and facilitates a portfolio of protocols through each process step (Institutional Review Board [IRB], Regulatory Affairs Compliance [RAC], Data Safety Monitoring Board [DSMB], Office of Protocol Services). Assists clinical investigators in preparing clinical research protocols, including writing and formatting consent forms. Prepares protocol packages for review and ensures that protocol packages include all of the required material and comply with CCR, NCI and NIH policies. Collaborates with investigators to resolve any protocol/data issues. Coordinates submission of protocols for scientific and ethical review by the Branch scientific review committees, the NCI IRB, and the clinical trial sponsor or the FDA. Monitors the review process and maintains detailed, complete and accurate records for each protocol of the approvals at the various stages of the review process, including new protocol submissions, amendments to protocols, and continuing reviews, as well as other submissions such as adverse events. Attends and prepares minutes for the Branch Protocol Review Committees. Contacts coordinators at other centers for protocols that are performed there to obtain review committee approvals at those centers, maintains records of these approvals and sends protocol amendments and other reports to the participating centers. Maintains a schedule of all review committee submission deadline dates and meeting dates. Assists clinical investigators in understanding and complying with the entire review process. Works closely with the NCI Protocol Review Office in establishing and maintaining a paperless automated document and tracking system for NCI protocols. Converts protocols from Word format to .pdf with bookmarks. Maintains the .pdf version of the most current approved version of each active clinical protocol on a central server. This position is located in Rockville, Maryland.
Polonchuk, Liudmila
2014-01-01
Patch-clamping is a powerful technique for investigating ion channel function and regulation. However, its low throughput has hampered profiling of large compound series in early drug development. Fortunately, automation has revolutionized the area of experimental electrophysiology over the past decade. Whereas the first automated patch-clamp instruments using the planar patch-clamp technology demonstrated rather moderate throughput, a few second-generation automated platforms recently launched by various companies have significantly increased the ability to form a high number of high-resistance seals. Among them is the SyncroPatch® 96 (Nanion Technologies GmbH, Munich, Germany), a fully automated giga-seal patch-clamp system with the highest throughput on the market. By recording from up to 96 cells simultaneously, the SyncroPatch® 96 makes it possible to substantially increase throughput without compromising data quality. This chapter describes features of this innovative automated electrophysiology system and the protocols used for a successful transfer of the established hERG assay to this high-throughput automated platform.
Donnelly, Helen; Alemayehu, Demissie; Botgros, Radu; Comic-Savic, Sabrina; Eisenstein, Barry; Lorenz, Benjamin; Merchant, Kunal; Pelfrene, Eric; Reith, Christina; Santiago, Jonas; Tiernan, Rosemary; Wunderink, Richard; Tenaerts, Pamela; Knirsch, Charles
2016-08-15
Resistant bacteria are one of the leading causes of hospital-acquired/ventilator-associated bacterial pneumonia (HABP/VABP). HABP/VABP trials are complex and difficult to conduct due to the large number of medical procedures, adverse events, and concomitant medications involved. Differences in the legislative frameworks between different regions of the world may also lead to excessive data collection. The Clinical Trials Transformation Initiative (CTTI) seeks to advance antibacterial drug development (ABDD) by streamlining clinical trials to improve efficiency and feasibility while maintaining ethical rigor, patient safety, information value, and scientific validity. In 2013, CTTI engaged a multidisciplinary group of experts to discuss challenges impeding the conduct of HABP/VABP trials. Separate workstreams identified challenges associated with current data collection processes. Experts defined "data collection" as the act of capturing and reporting certain data on the case report form as opposed to recording of data as part of routine clinical care. The ABDD Project Team developed strategies for streamlining safety data collection in HABP/VABP trials using a Quality by Design approach. Current safety data collection processes in HABP/VABP trials often include extraneous information. More targeted strategies for safety data collection in HABP/VABP trials will rely on optimal protocol design and prespecification of which safety data are essential to satisfy regulatory reporting requirements. A consensus and a cultural change in clinical trial design and conduct, which involve recognition of the need for more efficient data collection, are urgently needed to advance ABDD and to improve HABP/VABP trials in particular. © The Author 2016. Published by Oxford University Press for the Infectious Diseases Society of America. All rights reserved. For permissions, e-mail journals.permissions@oup.com.
Barnett, Adrian G; Herbert, Danielle L; Campbell, Megan; Daly, Naomi; Roberts, Jason A; Mudge, Alison; Graves, Nicholas
2015-02-07
Despite the widely recognised importance of sustainable health care systems, health services research remains generally underfunded in Australia. The Australian Centre for Health Services Innovation (AusHSI) is funding health services research in the state of Queensland. AusHSI has developed a streamlined protocol for applying and awarding funding using a short proposal and accelerated peer review. An observational study of proposals for four health services research funding rounds from May 2012 to November 2013. A short proposal of less than 1,200 words was submitted using a secure web-based portal. The primary outcome measures are: time spent preparing proposals; a simplified scoring of grant proposals (reject, revise or accept for interview) by a scientific review committee; and progressing from submission to funding outcomes within eight weeks. Proposals outside of health services research were deemed ineligible. There were 228 eligible proposals across 4 funding rounds: from 29% to 79% were shortlisted and 9% to 32% were accepted for interview. Success rates increased from 6% (in 2012) to 16% (in 2013) of eligible proposals. Applicants were notified of the outcomes within two weeks of the interview, which was a maximum of eight weeks after the submission deadline. Applicants spent 7 days on average preparing their proposal. Applicants with a ranking of reject or revise received written feedback and suggested improvements for their proposals, and resubmissions made up one third of the 2013 rounds. The AusHSI funding scheme is a streamlined application process that has simplified the allocation of health services research funding for both applicants and peer reviewers. The AusHSI process has minimised the time from submission to notification of funding outcomes.
Wolf, Dominik; Bocchetta, Martina; Preboske, Gregory M; Boccardi, Marina; Grothe, Michel J
2017-08-01
A harmonized protocol (HarP) for manual hippocampal segmentation on magnetic resonance imaging (MRI) has recently been developed by an international European Alzheimer's Disease Consortium-Alzheimer's Disease Neuroimaging Initiative project. We aimed to provide consensual certified HarP hippocampal labels in Montreal Neurological Institute (MNI) standard space to serve as reference in automated image analyses. Manual HarP tracings on the high-resolution MNI152 standard space template of four expert certified HarP tracers were combined to obtain consensual bilateral hippocampus labels. Utility and validity of these reference labels are demonstrated in a simple atlas-based morphometry approach for automated calculation of HarP-compliant hippocampal volumes within SPM software. Individual tracings showed very high agreement among the four expert tracers (pairwise Jaccard indices 0.82-0.87). Automatically calculated hippocampal volumes were highly correlated (r = 0.89/0.91 for the left/right hippocampus) with gold standard volumes in the HarP benchmark data set (N = 135 MRIs), with a mean volume difference of 9% (standard deviation 7%). The consensual HarP hippocampus labels in the MNI152 template can serve as a reference standard for automated image analyses involving MNI standard space normalization. Copyright © 2017 the Alzheimer's Association. Published by Elsevier Inc. All rights reserved.
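Two quantities quoted above, the pairwise Jaccard overlap between tracers and hippocampal volume, are simple to compute once binary label volumes are available. A minimal sketch (array shapes, voxel size, and example labels are illustrative assumptions):

```python
import numpy as np

def jaccard(label_a, label_b):
    """Jaccard index between two binary 3D label volumes."""
    a, b = label_a.astype(bool), label_b.astype(bool)
    intersection = np.logical_and(a, b).sum()
    union = np.logical_or(a, b).sum()
    return intersection / union

def label_volume_ml(label, voxel_size_mm=(1.0, 1.0, 1.0)):
    """Volume of a binary label in millilitres (1 ml = 1000 mm^3)."""
    voxel_mm3 = np.prod(voxel_size_mm)
    return label.astype(bool).sum() * voxel_mm3 / 1000.0

# Two hypothetical tracers' hippocampus labels on a template grid
tracer1 = np.zeros((20, 20, 20), bool); tracer1[5:15, 5:15, 5:15] = True
tracer2 = np.zeros((20, 20, 20), bool); tracer2[6:15, 5:15, 5:15] = True
print(f"Jaccard = {jaccard(tracer1, tracer2):.2f}, volume = {label_volume_ml(tracer1):.2f} ml")
```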
Refined approach for quantification of in vivo ischemia-reperfusion injury in the mouse heart
Medway, Debra J.; Schulz-Menger, Jeanette; Schneider, Jurgen E.; Neubauer, Stefan; Lygate, Craig A.
2009-01-01
Cardiac ischemia-reperfusion experiments in the mouse are important in vivo models of human disease. Infarct size is a particularly important scientific readout as virtually all cardiocirculatory pathways are affected by it. Therefore, such measurements must be exact and valid. The histological analysis, however, remains technically challenging, and the resulting quality is often unsatisfactory. For this report we have scrutinized each step involved in standard double-staining histology. We have tested published approaches and challenged their practicality. As a result, we propose an improved and streamlined protocol, which consistently yields high-quality histology, thereby minimizing experimental noise and group sizes. PMID:19820193
Compositional Verification of a Communication Protocol for a Remotely Operated Vehicle
NASA Technical Reports Server (NTRS)
Goodloe, Alwyn E.; Munoz, Cesar A.
2009-01-01
This paper presents the specification and verification in the Prototype Verification System (PVS) of a protocol intended to facilitate communication in an experimental remotely operated vehicle used by NASA researchers. The protocol is defined as a stack-layered composition of simpler protocols. It can be seen as the vertical composition of protocol layers, where each layer performs input and output message processing, and the horizontal composition of different processes concurrently inhabiting the same layer, where each process satisfies a distinct requirement. It is formally proven that the protocol components satisfy certain delivery guarantees. Compositional techniques are used to prove these guarantees also hold in the composed system. Although the protocol itself is not novel, the methodology employed in its verification extends existing techniques by automating the tedious and usually cumbersome part of the proof, thereby making the iterative design process of protocols feasible.
David Welsch; Roger Ryder; Tim Post
2006-01-01
The specific purpose of the BMP protocol is to create an economical, standardized, and repeatable BMP monitoring process that is completely automated, from data gathering through report generation, in order to provide measured data, ease of use, and compatibility with State BMP programs. The protocol was developed to meet the following needs: document the use and...
A hash based mutual RFID tag authentication protocol in telecare medicine information system.
Srivastava, Keerti; Awasthi, Amit K; Kaul, Sonam D; Mittal, R C
2015-01-01
Radio Frequency Identification (RFID) is a technology with wide-ranging applications that reduce the complexity of daily life. RFID is used extensively in areas such as access control, transportation, real-time inventory, asset management and automated payment systems. Recently, the technology has been expanding into healthcare environments, where potential applications include patient monitoring, object traceability and drug administration systems. In this paper, we propose a secure RFID-based protocol for the medical sector. The protocol is based on a hash operation with a synchronized secret and is safe against active and passive attacks such as forgery, traceability, replay and de-synchronization attacks.
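The abstract does not reproduce the paper's message flow, so the sketch below is only a generic illustration of the underlying idea: a challenge-response exchange built from hash operations over a shared secret that both parties update in lockstep after a successful session, which is what makes de-synchronization attacks a concern.

```python
import hashlib
import os

def h(*parts: bytes) -> bytes:
    """Hash primitive used for both authentication responses and secret updates."""
    return hashlib.sha256(b"|".join(parts)).digest()

class Party:
    def __init__(self, shared_secret: bytes):
        self.secret = shared_secret
    def respond(self, challenge: bytes) -> bytes:
        return h(self.secret, challenge)
    def verify_and_update(self, challenge: bytes, response: bytes) -> bool:
        ok = response == h(self.secret, challenge)
        if ok:                                   # synchronized secret update after a good session
            self.secret = h(self.secret, b"update")
        return ok

# Tag and reader start with the same synchronized secret
seed = os.urandom(32)
tag, reader = Party(seed), Party(seed)

challenge = os.urandom(16)                       # reader challenges the tag
response = tag.respond(challenge)
assert reader.verify_and_update(challenge, response)
tag.secret = h(tag.secret, b"update")            # tag performs the same update to stay in sync
print("secrets still synchronized:", tag.secret == reader.secret)
```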
Technology Transfer Opportunities: Automated Ground-Water Monitoring, A Proven Technology
Smith, Kirk P.; Granato, Gregory E.
1998-01-01
Introduction The U.S. Geological Survey (USGS) has developed and tested an automated ground-water monitoring system that measures and records values of selected water-quality properties and constituents using protocols approved for manual sampling. Prototypes using the automated process have demonstrated the ability to increase the quantity and quality of data collected and have shown the potential for reducing labor and material costs for ground-water quality data collection. Automated ground-water monitoring systems can be used to monitor known or potential contaminant sites, such as near landfills, underground storage tanks, or other facilities where potential contaminants are stored, to serve as early warning systems monitoring ground-water quality near public water-supply wells, and for ground-water quality research.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mahmood, U; Erdi, Y; Wang, W
Purpose: To assess the impact of General Electric's automated tube potential algorithm, kV assist (kVa), on radiation dose and image quality, with an emphasis on optimizing protocols based on noise texture. Methods: Radiation dose was assessed by inserting optically stimulated luminescence dosimeters (OSLs) throughout the body of a pediatric anthropomorphic phantom (CIRS). The baseline protocol was: 120 kVp, 80 mA, 0.7s rotation time. Image quality was assessed by calculating the contrast to noise ratio (CNR) and noise power spectrum (NPS) from the ACR CT accreditation phantom. CNRs were calculated according to the steps described in the ACR CT phantom testing document. NPS was determined by taking the 3D FFT of the uniformity section of the ACR phantom. NPS and CNR were evaluated with and without kVa and for all available adaptive iterative statistical reconstruction (ASiR) settings, ranging from 0 to 100%. Each NPS was also evaluated for its peak frequency difference (PFD) with respect to the baseline protocol. Results: For the baseline protocol, CNR was found to decrease from 0.460 ± 0.182 to 0.420 ± 0.057 when kVa was activated. When compared against the baseline protocol, the PFD at ASiR of 40% yielded a decrease in noise magnitude as realized by the increase in CNR = 0.620 ± 0.040. The liver dose decreased by 30% with kVa activation. Conclusion: Application of kVa reduces the liver dose by up to 30%. However, reduction in image quality for abdominal scans occurs when using the automated tube voltage selection feature at the baseline protocol. As demonstrated by the CNR and NPS analysis, the texture and magnitude of the noise in reconstructed images at ASiR 40% was found to be the same as in our baseline images. We have demonstrated that a 30% dose reduction is possible when using 40% ASiR with kVa in pediatric patients.
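CNR and the noise power spectrum used above are standard image-quality quantities. The sketch below shows, under simplifying assumptions (a generic CNR definition rather than the exact ACR formula, a uniform 3D region, unit voxel spacing), how they would typically be computed.

```python
import numpy as np

def cnr(roi_object, roi_background):
    """Generic contrast-to-noise ratio between an object ROI and a background ROI."""
    return abs(roi_object.mean() - roi_background.mean()) / roi_background.std(ddof=1)

def noise_power_spectrum(uniform_roi):
    """3D noise power spectrum of a uniform region (unit voxel spacing assumed)."""
    detrended = uniform_roi - uniform_roi.mean()
    nps = np.abs(np.fft.fftn(detrended)) ** 2 / detrended.size
    return np.fft.fftshift(nps)

# Synthetic uniform section with Gaussian noise standing in for the phantom data
roi = np.random.normal(0.0, 10.0, size=(64, 64, 64))
obj = np.random.normal(50.0, 10.0, size=(16, 16, 16))
nps = noise_power_spectrum(roi)

print("CNR:", cnr(obj, roi[:16, :16, :16]))
# Parseval check: the integrated NPS per voxel equals the ROI variance
print("integrated NPS / N:", nps.sum() / roi.size, "vs variance:", roi.var())
```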
Communications among elements of a space construction ensemble
NASA Technical Reports Server (NTRS)
Davis, Randal L.; Grasso, Christopher A.
1989-01-01
Space construction projects will require careful coordination between managers, designers, manufacturers, operators, astronauts, and robots with large volumes of information of varying resolution, timeliness, and accuracy flowing between the distributed participants over computer communications networks. Within the CSC Operations Branch, we are researching the requirements and options for such communications. Based on our work to date, we feel that communications standards being developed by the International Standards Organization, the CCITT, and other groups can be applied to space construction. We are currently studying in depth how such standards can be used to communicate with robots and automated construction equipment used in a space project. Specifically, we are looking at how the Manufacturing Automation Protocol (MAP) and the Manufacturing Message Specification (MMS), which tie together computers and machines in automated factories, might be applied to space construction projects. Together with our CSC industrial partner Computer Technology Associates, we are developing a MAP/MMS companion standard for space construction and we will produce software to allow the MAP/MMS protocol to be used in our CSC operations testbed.
Lutz, Barry; Liang, Tinny; Fu, Elain; Ramachandran, Sujatha; Kauffman, Peter; Yager, Paul
2013-01-01
Lateral flow tests (LFTs) are an ingenious format for rapid and easy-to-use diagnostics, but they are fundamentally limited to assay chemistries that can be reduced to a single chemical step. In contrast, most laboratory diagnostic assays rely on multiple timed steps carried out by a human or a machine. Here, we use dissolvable sugar applied to paper to create programmable flow delays and present a paper network topology that uses these time delays to program automated multi-step fluidic protocols. Solutions of sucrose at different concentrations (10-70% of saturation) were added to paper strips and dried to create fluidic time delays spanning minutes to nearly an hour. A simple folding card format employing sugar delays was shown to automate a four-step fluidic process initiated by a single user activation step (folding the card); this device was used to perform a signal-amplified sandwich immunoassay for a diagnostic biomarker for malaria. The cards are capable of automating multi-step assay protocols normally used in laboratories, but in a rapid, low-cost, and easy-to-use format. PMID:23685876
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kurnik, Charles W.; Romberger, Jeff
The HVAC Controls Evaluation Protocol is designed to address evaluation issues for direct digital controls/energy management systems/building automation systems (DDC/EMS/BAS) that are installed to control heating, ventilation, and air-conditioning (HVAC) equipment in commercial and institutional buildings. (This chapter refers to the DDC/EMS/BAS measure as HVAC controls.) This protocol may also be applicable to industrial facilities such as clean rooms and labs, which have either significant HVAC equipment or spaces requiring special environmental conditions.
Haug, M; Reischl, B; Prölß, G; Pollmann, C; Buckert, T; Keidel, C; Schürmann, S; Hock, M; Rupitsch, S; Heckel, M; Pöschel, T; Scheibel, T; Haynl, C; Kiriaev, L; Head, S I; Friedrich, O
2018-04-15
We engineered an automated biomechatronics system, MyoRobot, for robust objective and versatile assessment of muscle or polymer materials (bio-)mechanics. It covers multiple levels of muscle biosensor assessment, e.g. membrane voltage or contractile apparatus Ca2+ ion responses (force resolution 1µN, 0-10mN for the given sensor; [Ca2+] range ~ 100nM-25µM). It replaces previously tedious manual protocols to obtain exhaustive information on active/passive biomechanical properties across various morphological tissue levels. Deciphering mechanisms of muscle weakness requires sophisticated force protocols, dissecting contributions from altered Ca2+ homeostasis, electro-chemical, chemico-mechanical biosensors or visco-elastic components. From whole organ to single fibre levels, experimental demands and hardware requirements increase, limiting biomechanics research potential, as reflected by only a few commercial biomechatronics systems that can address resolution, experimental versatility and mostly, automation of force recordings. Our MyoRobot combines optical force transducer technology with high precision 3D actuation (e.g. voice coil, 1µm encoder resolution; stepper motors, 4µm feed motion), and customized control software, enabling modular experimentation packages and automated data pre-analysis. In small bundles and single muscle fibres, we demonstrate automated recordings of (i) caffeine-induced-, (ii) electrical field stimulation (EFS)-induced force, (iii) pCa-force, (iv) slack-tests and (v) passive length-tension curves. The system easily reproduces results from manual systems (two times larger stiffness in slow over fast muscle) and provides novel insights into unloaded shortening velocities (declining with increasing slack lengths). The MyoRobot enables automated complex biomechanics assessment in muscle research. Applications also extend to material sciences, exemplarily shown here for spider silk and collagen biopolymers. Copyright © 2017 Elsevier B.V. All rights reserved.
Burr, D.
2005-01-01
A unique clustering of layered streamlined forms in Athabasca Valles is hypothesized to reflect a significant hydraulic event. The forms, interpreted as sedimentary, are attributed to extensive sediment deposition during ponding and then streamlining of this sediment behind flow obstacles during ponded water outflow. These streamlined forms are analogous to those found in depositional basins and other loci of ponding in terrestrial catastrophic flood landscapes. These terrestrial streamlined forms can provide the best opportunity for reconstructing the history of the terrestrial flooding. Likewise, the streamlined forms in Athabasca Valles may provide the best opportunity to reconstruct the recent geologic history of this young Martian outflow channel. © 2005 Elsevier B.V. All rights reserved.
Telecommunications Protocol Processing Subsystem Using Reconfigurable Interoperable Gate Arrays
NASA Technical Reports Server (NTRS)
Pang, Jackson; Pingree, Paula; Torgerson, J. Leigh
2006-01-01
Deep Space Telecommunications Requirements: 1) Automated file transfer across inter-planetary distances; 2) Limited communication periods; 3) Reliable transport; 4) Delay and Disruption Tolerant; and 5) Asymmetric Data Channels.
Streamlines behind curved shock waves in axisymmetric flow fields
NASA Astrophysics Data System (ADS)
Filippi, A. A.; Skews, B. W.
2018-07-01
Streamlines behind axisymmetric curved shock waves were used to predict the internal surfaces that produced them. Axisymmetric ring wedge models with varying internal radii of curvature and leading-edge angles were used to produce numerical results. These numerical simulations were validated using experimental shadowgraph results for a series of ring wedge test pieces. The streamlines behind curved shock waves for lower leading-edge angles are examined at Mach 3.4, whereas the highest leading-edge angle cases are explored at Mach 2.8 and 3.4. Numerical and theoretical streamlines are compared for the highest leading-edge angle cases at Mach 3.6. It was found that wall-bounding theoretical streamlines did not match the internal curved surface. This was due to extreme streamline curvature when the shock angle approached the Mach angle at lower leading-edge angles. Increased Mach number and internal radius of curvature produced more reasonable results. Very good agreement was found between the theoretical and numerical streamlines at lower curvatures before the influence of the trailing edge expansion fan.
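The streamline deflection immediately behind a shock of known angle follows from the standard oblique-shock relation of gas dynamics (quoted here as background, not taken from the paper): for upstream Mach number M, shock angle β and ratio of specific heats γ, the flow deflection angle θ satisfies

```latex
\tan\theta \;=\; \frac{2\cot\beta\,\left(M^{2}\sin^{2}\beta - 1\right)}{M^{2}\left(\gamma + \cos 2\beta\right) + 2}
```

As β approaches the Mach angle μ = arcsin(1/M), the numerator vanishes and the deflection tends to zero, which is consistent with the difficulty reported above for weak, nearly Mach-angle shocks at low leading-edge angles.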
MolIDE: a homology modeling framework you can click with.
Canutescu, Adrian A; Dunbrack, Roland L
2005-06-15
Molecular Integrated Development Environment (MolIDE) is an integrated application designed to provide homology modeling tools and protocols under a uniform, user-friendly graphical interface. Its main purpose is to combine the most frequent modeling steps in a semi-automatic, interactive way, guiding the user from the target protein sequence to the final three-dimensional protein structure. The typical basic homology modeling process is composed of building sequence profiles of the target sequence family, secondary structure prediction, sequence alignment with PDB structures, assisted alignment editing, side-chain prediction and loop building. All of these steps are available through a graphical user interface. MolIDE's user-friendly and streamlined interactive modeling protocol allows the user to focus on the important modeling questions, hiding from the user the raw data generation and conversion steps. MolIDE was designed from the ground up as an open-source, cross-platform, extensible framework. This allows developers to integrate additional third-party programs into MolIDE. Availability: http://dunbrack.fccc.edu/molide/molide.php. Contact: rl_dunbrack@fccc.edu.
Gilad, S; Khosravi, R; Harnik, R; Ziv, Y; Shkedy, D; Galanty, Y; Frydman, M; Levi, J; Sanal, O; Chessa, L; Smeets, D; Shiloh, Y; Bar-Shira, A
1998-01-01
Ataxia-telangiectasia (A-T) is an autosomal recessive disorder characterized by neurodegeneration, immunodeficiency, cancer predisposition, and radiation sensitivity. The responsible gene, ATM, has an extensive genomic structure and encodes a large transcript with a 9.2 kb open reading frame (ORF). A-T mutations are extremely variable and most of them are private. We streamlined a high throughput protocol for the search for ATM mutations. The entire ATM ORF is amplified in a single RT-PCR step requiring a minimal amount of RNA. The product can serve for numerous nested PCRs in which overlapping portions of the ORF are further amplified and subjected to restriction endonuclease fingerprinting (REF) analysis. Splicing errors are readily detectable during the initial amplification of each portion. Using this protocol, we identified 5 novel A-T mutations and completed the elucidation of the molecular basis of A-T in the Israeli population.
All-in-One CRISPR-Cas9/FokI-dCas9 Vector-Mediated Multiplex Genome Engineering in Cultured Cells.
Sakuma, Tetsushi; Sakamoto, Takuya; Yamamoto, Takashi
2017-01-01
CRISPR-Cas9 enables highly convenient multiplex genome engineering in cultured cells, because it utilizes generic Cas9 nuclease and an easily customizable single-guide RNA (sgRNA) for site-specific DNA double-strand break induction. We previously established a multiplex CRISPR-Cas9 assembly system for constructing an all-in-one vector simultaneously expressing multiple sgRNAs and Cas9 nuclease or other Cas9 variants including FokI-dCas9, which supersedes the wild-type Cas9 with regard to high specificity. In this chapter, we describe a streamlined protocol to design and construct multiplex CRISPR-Cas9 or FokI-dCas9 vectors, to introduce them into cultured cells by lipofection or electroporation, to enrich the genomically edited cells with a transient puromycin selection, to validate the mutation efficiency by Surveyor nuclease assay, and to perform off-target analyses. We show that our protocol enables highly efficient multiplex genome engineering even in hard-to-transfect HepG2 cells.
Technology Transfer Opportunities: Automated Ground-Water Monitoring
Smith, Kirk P.; Granato, Gregory E.
1997-01-01
Introduction A new automated ground-water monitoring system developed by the U.S. Geological Survey (USGS) measures and records values of selected water-quality properties and constituents using protocols approved for manual sampling. Prototypes using the automated process have demonstrated the ability to increase the quantity and quality of data collected and have shown the potential for reducing labor and material costs for ground-water quality data collection. Automation of water-quality monitoring systems in the field, in laboratories, and in industry have increased data density and utility while reducing operating costs. Uses for an automated ground-water monitoring system include, (but are not limited to) monitoring ground-water quality for research, monitoring known or potential contaminant sites, such as near landfills, underground storage tanks, or other facilities where potential contaminants are stored, and as an early warning system monitoring groundwater quality near public water-supply wells.
Nomenclature in laboratory robotics and automation (IUPAC Recommendation 1994)
(Skip) Kingston, H. M.; Kingstonz, M. L.
1994-01-01
These recommended terms have been prepared to help provide a uniform approach to terminology and notation in laboratory automation and robotics. Since the terminology used in laboratory automation and robotics has been derived from diverse backgrounds, it is often vague, imprecise, and in some cases, in conflict with classical automation and robotic nomenclature. These definitions have been assembled from standards, monographs, dictionaries, journal articles, and documents of international organizations emphasizing laboratory and industrial automation and robotics. When appropriate, definitions have been taken directly from the original source and identified with that source. However, in some cases no acceptable definition could be found and a new definition was prepared to define the object, term, or action. Attention has been given to defining specific robot types, coordinate systems, parameters, attributes, communication protocols and associated workstations and hardware. Diagrams are included to illustrate specific concepts that can best be understood by visualization. PMID:18924684
Regan, John Frederick
2014-09-09
Removable cartridges are used on automated flow-through systems for the purpose of extracting and purifying genetic material from complex matrices. Different types of cartridges are paired with specific automated protocols to concentrate, extract, and purify pathogenic or human genetic material. Their flow-through nature allows large quantities of sample to be processed. Matrices may be filtered using size exclusion and/or affinity filters to concentrate the pathogen of interest. Lysed material is ultimately passed through a filter to remove the insoluble material before the soluble genetic material is delivered past a silica-like membrane that binds the genetic material, where it is washed, dried, and eluted. Cartridges are inserted into the housing areas of flow-through automated instruments, which are equipped with sensors to ensure proper placement and usage of the cartridges. Properly inserted cartridges create fluid- and air-tight seals with the flow lines of an automated instrument.
Comparison of in vivo 3D cone-beam computed tomography tooth volume measurement protocols.
Forst, Darren; Nijjar, Simrit; Flores-Mir, Carlos; Carey, Jason; Secanell, Marc; Lagravere, Manuel
2014-12-23
The objective of this study is to analyze a set of previously developed and proposed image segmentation protocols for precision in both intra- and inter-rater reliability for in vivo tooth volume measurements using cone-beam computed tomography (CBCT) images. Six 3D volume segmentation procedures were proposed and tested for intra- and inter-rater reliability to quantify maxillary first molar volumes. Ten randomly selected maxillary first molars were measured in vivo in random order three times with 10 days separation between measurements. Intra- and inter-rater agreement for all segmentation procedures was assessed using the intra-class correlation coefficient (ICC). The highest precision was for automated thresholding with manual refinements. A tooth volume measurement protocol for CBCT images employing automated segmentation with manual human refinement on a 2D slice-by-slice basis in all three planes of space possessed excellent intra- and inter-rater reliability. Three-dimensional volume measurements of the entire tooth structure are more precise than 3D volume measurements of only the dental roots apical to the cemento-enamel junction (CEJ).
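The abstract does not state which ICC variant was used; the sketch below computes the common two-way random-effects, absolute-agreement, single-measurement form, ICC(2,1), from a subjects-by-raters matrix of volume measurements (the example values are invented).

```python
import numpy as np

def icc_2_1(ratings):
    """ICC(2,1): two-way random effects, absolute agreement, single measurement.
    ratings: array of shape (n_subjects, n_raters)."""
    x = np.asarray(ratings, dtype=float)
    n, k = x.shape
    grand = x.mean()
    row_means, col_means = x.mean(axis=1), x.mean(axis=0)
    msr = k * np.sum((row_means - grand) ** 2) / (n - 1)              # between-subject mean square
    msc = n * np.sum((col_means - grand) ** 2) / (k - 1)              # between-rater mean square
    resid = x - row_means[:, None] - col_means[None, :] + grand
    mse = np.sum(resid ** 2) / ((n - 1) * (k - 1))                    # residual mean square
    return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)

# Hypothetical tooth volumes (mm^3) for 5 molars measured by 3 raters
volumes = [[452, 449, 455], [398, 401, 396], [510, 507, 512], [430, 433, 428], [475, 470, 478]]
print(f"ICC(2,1) = {icc_2_1(volumes):.3f}")
```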
101 Labeled Brain Images and a Consistent Human Cortical Labeling Protocol
Klein, Arno; Tourville, Jason
2012-01-01
We introduce the Mindboggle-101 dataset, the largest and most complete set of free, publicly accessible, manually labeled human brain images. To manually label the macroscopic anatomy in magnetic resonance images of 101 healthy participants, we created a new cortical labeling protocol that relies on robust anatomical landmarks and minimal manual edits after initialization with automated labels. The “Desikan–Killiany–Tourville” (DKT) protocol is intended to improve the ease, consistency, and accuracy of labeling human cortical areas. Given how difficult it is to label brains, the Mindboggle-101 dataset is intended to serve as brain atlases for use in labeling other brains, as a normative dataset to establish morphometric variation in a healthy population for comparison against clinical populations, and contribute to the development, training, testing, and evaluation of automated registration and labeling algorithms. To this end, we also introduce benchmarks for the evaluation of such algorithms by comparing our manual labels with labels automatically generated by probabilistic and multi-atlas registration-based approaches. All data and related software and updated information are available on the http://mindboggle.info/data website. PMID:23227001
Optimising mHealth helpdesk responsiveness in South Africa: towards automated message triage
Engelhard, Matthew; Copley, Charles; Watson, Jacqui; Pillay, Yogan; Barron, Peter
2018-01-01
In South Africa, a national-level helpdesk was established in August 2014 as a social accountability mechanism for improving governance, allowing recipients of public sector services to send complaints, compliments and questions directly to a team of National Department of Health (NDoH) staff members via text message. As demand increases, mechanisms to streamline and improve the helpdesk must be explored. This work aims to evaluate the need for and feasibility of automated message triage to improve helpdesk responsiveness to high-priority messages. Drawing from 65 768 messages submitted between October 2016 and July 2017, the quality of helpdesk message handling was evaluated via detailed inspection of (1) a random sample of 481 messages and (2) messages reporting mistreatment of women, as identified using expert-curated keywords. Automated triage was explored by training a naïve Bayes classifier to replicate message labels assigned by NDoH staff. Classifier performance was evaluated on 12 526 messages withheld from the training set. 90 of 481 (18.7%) NDoH responses were scored as suboptimal or incorrect, with median response time of 4.0 hours. 32 reports of facility-based mistreatment and 39 of partner and family violence were identified; NDoH response time and appropriateness for these messages were not superior to the random sample (P>0.05). The naïve Bayes classifier had average accuracy of 85.4%, with ≥98% specificity for infrequently appearing (<50%) labels. These results show that helpdesk handling of mistreatment of women could be improved. Keyword matching and naïve Bayes effectively identified uncommon messages of interest and could support automated triage to improve handling of high-priority messages. PMID:29713508
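A triage classifier of the kind evaluated above can be assembled from standard components. In the sketch below the labels, training messages, and the bag-of-words/multinomial naive Bayes setup are illustrative assumptions rather than the authors' pipeline; it combines keyword matching for known high-priority phrases with a classifier for the remaining traffic.

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

# Expert-curated keywords that should always escalate a message
PRIORITY_KEYWORDS = {"slapped", "hit me", "shouted at", "refused to help"}

# Tiny illustrative training set: (message, label)
TRAIN = [
    ("the nurse was very kind and helpful", "compliment"),
    ("thank you for the clinic service", "compliment"),
    ("when should my baby get the next immunisation", "question"),
    ("what time does the clinic open", "question"),
    ("the nurse shouted at me and refused to help", "mistreatment"),
    ("staff hit me when I asked for help", "mistreatment"),
]

model = make_pipeline(CountVectorizer(), MultinomialNB())
model.fit([text for text, _ in TRAIN], [label for _, label in TRAIN])

def triage(message: str) -> str:
    """Keyword match first, then fall back to the trained classifier."""
    if any(keyword in message.lower() for keyword in PRIORITY_KEYWORDS):
        return "mistreatment"
    return model.predict([message])[0]

print(triage("the sister shouted at me at the clinic"))
print(triage("what vaccines are due at six weeks"))
```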
Automated Procurement System (APS): Project management plan (DS-03), version 1.2
NASA Technical Reports Server (NTRS)
Murphy, Diane R.
1994-01-01
The National Aeronautics and Space Administration (NASA) Marshall Space Flight Center (MSFC) is implementing an Automated Procurement System (APS) to streamline its business activities that are used to procure goods and services. This Project Management Plan (PMP) is the governing document throughout the implementation process and is identified as the APS Project Management Plan (DS-03). At this point in time, the project plan includes the schedules and tasks necessary to proceed through implementation. Since the basis of APS is an existing COTS system, the implementation process is revised from the standard SDLC. The purpose of the PMP is to provide the framework for the implementation process. It discusses the roles and responsibilities of the NASA project staff, the functions to be performed by the APS Development Contractor (PAI), and the support required of the NASA computer support contractor (CSC). To be successful, these three organizations must work together as a team, working towards the goals established in this Project Plan. The Project Plan includes a description of the proposed system, describes the work to be done, establishes a schedule of deliverables, and discusses the major standards and procedures to be followed.
Radiation Planning Assistant - A Streamlined, Fully Automated Radiotherapy Treatment Planning System
Court, Laurence E.; Kisling, Kelly; McCarroll, Rachel; Zhang, Lifei; Yang, Jinzhong; Simonds, Hannah; du Toit, Monique; Trauernicht, Chris; Burger, Hester; Parkes, Jeannette; Mejia, Mike; Bojador, Maureen; Balter, Peter; Branco, Daniela; Steinmann, Angela; Baltz, Garrett; Gay, Skylar; Anderson, Brian; Cardenas, Carlos; Jhingran, Anuja; Shaitelman, Simona; Bogler, Oliver; Schmeller, Kathleen; Followill, David; Howell, Rebecca; Nelson, Christopher; Peterson, Christine; Beadle, Beth
2018-01-01
The Radiation Planning Assistant (RPA) is a system developed for the fully automated creation of radiotherapy treatment plans, including volume-modulated arc therapy (VMAT) plans for patients with head/neck cancer and 4-field box plans for patients with cervical cancer. It is a combination of specially developed in-house software that uses an application programming interface to communicate with a commercial radiotherapy treatment planning system. It also interfaces with a commercial secondary dose verification software. The necessary inputs to the system are a Treatment Plan Order, approved by the radiation oncologist, and a simulation computed tomography (CT) image, approved by the radiographer. The RPA then generates a complete radiotherapy treatment plan. For the cervical cancer treatment plans, no additional user intervention is necessary until the plan is complete. For head/neck treatment plans, after the normal tissue and some of the target structures are automatically delineated on the CT image, the radiation oncologist must review the contours, making edits if necessary. They also delineate the gross tumor volume. The RPA then completes the treatment planning process, creating a VMAT plan. Finally, the completed plan must be reviewed by qualified clinical staff. PMID:29708544
Automated Parallel Capillary Electrophoretic System
Li, Qingbo; Kane, Thomas E.; Liu, Changsheng; Sonnenschein, Bernard; Sharer, Michael V.; Kernan, John R.
2000-02-22
An automated electrophoretic system is disclosed. The system employs a capillary cartridge having a plurality of capillary tubes. The cartridge has a first array of capillary ends projecting from one side of a plate. The first array of capillary ends is spaced apart in substantially the same manner as the wells of a microtitre tray of standard size. This allows one to simultaneously perform capillary electrophoresis on samples present in each of the wells of the tray. The system includes a stacked, dual carousel arrangement to eliminate cross-contamination resulting from reuse of the same buffer tray on consecutive executions of electrophoresis. The system also has a gel delivery module containing a gel syringe/a stepper motor or a high pressure chamber with a pump to quickly and uniformly deliver gel through the capillary tubes. The system further includes a multi-wavelength beam generator to generate a laser beam which produces a beam with a wide range of wavelengths. An off-line capillary reconditioner thoroughly cleans a capillary cartridge to enable simultaneous execution of electrophoresis with another capillary cartridge. The streamlined nature of the off-line capillary reconditioner offers the advantage of increased system throughput with a minimal increase in system cost.
The automated Army ROTC Questionnaire (ARQ)
NASA Technical Reports Server (NTRS)
Young, David L. H.
1991-01-01
The Reserve Officer Training Corps Cadet Command (ROTCCC) takes applications for its officer training program from college students and Army enlisted personnel worldwide. Each applicant is required to complete a set of application forms prior to acceptance into the ROTC program. These forms are covered by several regulations that govern the eligibility of potential applicants and guide the applicant through the application process. Eligibility criteria change as Army regulations are periodically revised, and outdated information results in lost applications through frustration and error. ROTCCC asked for an inexpensive and reliable way of automating its application process. After reviewing the process, it was determined that an expert system with good end-user interface capabilities could solve a large part of the problem. The system captures the knowledge contained within the regulations, enables the quick distribution and implementation of eligibility criteria changes, and distributes the expertise of the admissions personnel to the education centers and colleges. The expert system uses a modified version of CLIPS that was streamlined to make the most efficient use of its capabilities. A user interface with windowing capabilities provides the applicant with a simple and effective way to input his or her personal data.
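To illustrate the kind of regulation-driven screening such an expert system performs, here is a minimal rule-check sketch in Python rather than CLIPS; the criteria, thresholds and field names are hypothetical placeholders, not the actual Army eligibility rules:

from dataclasses import dataclass

@dataclass
class Applicant:
    age: int
    gpa: float
    us_citizen: bool

# Hypothetical eligibility rules for illustration only; the real criteria
# live in Army regulations and change as those regulations are revised.
RULES = [
    ("age",         lambda a: 17 <= a.age <= 30, "Applicant age outside accepted range"),
    ("gpa",         lambda a: a.gpa >= 2.5,      "GPA below minimum"),
    ("citizenship", lambda a: a.us_citizen,      "U.S. citizenship required"),
]

def screen(applicant):
    """Return a list of failed-rule messages; an empty list means eligible."""
    return [msg for name, test, msg in RULES if not test(applicant)]

print(screen(Applicant(age=19, gpa=3.1, us_citizen=True)))  # []
print(screen(Applicant(age=34, gpa=2.0, us_citizen=True)))  # two failure messages

Encoding the rules as data rather than hard-coded logic mirrors the abstract's point that criteria can be redistributed quickly when regulations change.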
Sedgewick, Gerald J.; Ericson, Marna
2015-01-01
Obtaining digital images of color brightfield microscopy is an important aspect of biomedical research and the clinical practice of diagnostic pathology. Although the field of digital pathology has had tremendous advances in whole-slide imaging systems, little effort has been directed toward standardizing color brightfield digital imaging to maintain image-to-image consistency and tonal linearity. Using a single camera and microscope to obtain digital images of three stains, we show that microscope and camera systems inherently produce image-to-image variation. Moreover, we demonstrate that post-processing with a widely used raster graphics editor software program does not completely correct for session-to-session inconsistency. We introduce a reliable method for creating consistent images with a hardware/software solution (ChromaCal™; Datacolor Inc., NJ) along with its features for creating color standardization, preserving linear tonal levels, providing automated white balancing and setting automated brightness to consistent levels. The resulting image consistency using this method will also streamline mean density and morphometry measurements, as images are easily segmented and single thresholds can be used. We suggest that this is a superior method for color brightfield imaging, which can be used for quantification and can be readily incorporated into workflows. PMID:25575568
Using AUTORAD for Cassini File Uplinks: Incorporating Automated Commanding into Mission Operations
NASA Technical Reports Server (NTRS)
Goo, Sherwin
2014-01-01
As the Cassini spacecraft embarked on the Solstice Mission in October 2010, the flight operations team faced a significant challenge in planning and executing the continuing tour of the Saturnian system. Faced with budget cuts that reduced the science and engineering staff by over a third in size, new and streamlined processes had to be developed to allow the Cassini mission to maintain a high level of science data return with a lower amount of available resources while still minimizing the risk. Automation was deemed an important key in enabling mission operations with reduced workforce and the Cassini flight team has made this goal a priority for the Solstice Mission. The operations team learned about a utility called AUTORAD which would give the flight operations team the ability to program selected command files for radiation up to seven days in advance and help minimize the need for off-shift support that could deplete available staffing during the prime shift hours. This paper will describe how AUTORAD is being utilized by the Cassini flight operations team and the processes that were developed or modified to ensure that proper oversight and verification is maintained in the generation and execution of radiated command files.
Chaudhry, Waseem; Hussain, Nasir; Ahlberg, Alan W; Croft, Lori B; Fernandez, Antonio B; Parker, Mathew W; Swales, Heather H; Slomka, Piotr J; Henzlova, Milena J; Duvall, W Lane
2017-06-01
A stress-first myocardial perfusion imaging (MPI) protocol saves time, is cost-effective, and decreases radiation exposure. A limitation of this protocol is the requirement for physician review of the stress images to determine the need for rest images. This hurdle could be eliminated if an experienced technologist and/or automated computer quantification could make this determination. Images from consecutive patients who were undergoing a stress-first MPI with attenuation correction at two tertiary care medical centers were prospectively reviewed independently by a technologist and cardiologist blinded to clinical and stress test data. Their decision on the need for rest imaging, along with automated computer quantification of perfusion results, was compared with the clinical reference standard of an assessment of perfusion images by a board-certified nuclear cardiologist that included clinical and stress test data. A total of 250 patients (mean age 61 years and 55% female) who underwent a stress-first MPI were studied. According to the clinical reference standard, 42 (16.8%) and 208 (83.2%) stress-first images were interpreted as "needing" and "not needing" rest images, respectively. The technologists correctly classified 229 (91.6%) stress-first images as either "needing" (n = 28) or "not needing" (n = 201) rest images. Their sensitivity, specificity, positive predictive value (PPV), and negative predictive value (NPV) were 66.7%, 96.6%, 80.0%, and 93.5%, respectively. An automated stress total perfusion deficit (TPD) score ≥1.2 was associated with optimal sensitivity and specificity and correctly classified 179 (71.6%) stress-first images as either "needing" (n = 31) or "not needing" (n = 148) rest images. Its sensitivity, specificity, PPV, and NPV were 73.8%, 71.2%, 34.1%, and 93.1%, respectively. In a model whereby the computer or technologist could correct for the other's incorrect classification, 242 (96.8%) stress-first images were correctly classified. The composite sensitivity, specificity, PPV, and NPV were 83.3%, 99.5%, 97.2%, and 96.7%, respectively. Technologists and automated quantification software had a high degree of agreement with the clinical reference standard for determining the need for rest images in a stress-first imaging protocol. Utilizing an experienced technologist and automated systems to screen stress-first images could expand the use of stress-first MPI to sites where a cardiologist is not immediately available for interpretation.
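As a sketch of the quantitative comparison described above, the following Python fragment applies the reported TPD cutoff and computes sensitivity, specificity, PPV and NPV against a reference classification; the list-based data layout is an assumption made for illustration, not the study's actual data format:

def diagnostic_metrics(needs_rest_ref, needs_rest_pred):
    """Compare predicted 'needs rest imaging' calls against the clinical reference.
    Assumes each class is non-empty; inputs are parallel lists of booleans."""
    pairs = list(zip(needs_rest_ref, needs_rest_pred))
    tp = sum(r and p for r, p in pairs)
    tn = sum((not r) and (not p) for r, p in pairs)
    fp = sum((not r) and p for r, p in pairs)
    fn = sum(r and (not p) for r, p in pairs)
    return {
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
        "ppv": tp / (tp + fp),
        "npv": tn / (tn + fn),
    }

TPD_CUTOFF = 1.2  # stress TPD at or above this flags a study as needing rest images

def computer_call(stress_tpd_scores):
    return [score >= TPD_CUTOFF for score in stress_tpd_scores]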
del Río, Joaquín; Aguzzi, Jacopo; Costa, Corrado; Menesatti, Paolo; Sbragaglia, Valerio; Nogueras, Marc; Sarda, Francesc; Manuèl, Antoni
2013-10-30
Field measurements of the swimming activity rhythms of fishes are scant due to the difficulty of counting individuals at high frequency over a long period of time. Cabled-observatory video monitoring allows such sampling at high frequency over unlimited periods of time. Unfortunately, automation of the extraction of biological information (i.e., animal counts per unit of time) is still a major bottleneck. In this study, we describe a new automated video-imaging protocol for the 24-h continuous counting of fishes in colorimetrically calibrated time-lapse photographic outputs, taken by a shallow-water (20 m depth) cabled video platform, the OBSEA. The spectral reflectance value for each patch was measured between 400 and 700 nm and then converted into standard RGB, used as a reference for all subsequent calibrations. All the images were acquired within a standardized Region Of Interest (ROI), represented by a 2 × 2 m methacrylate panel endowed with a 9-colour calibration chart, and calibrated using the recently implemented "3D Thin-Plate Spline" warping approach in order to numerically define colour by its coordinates in n-dimensional space. That operation was repeated on a subset of 500 images used as a training set, selected manually because they were acquired under optimum visibility conditions. All images plus the training set were ordered together through Principal Component Analysis, allowing the selection of 614 images (67.6%) out of a total of 908, corresponding to 18 days at 30-min frequency. The Roberts operator (used in image processing and computer vision for edge detection) was applied to highlight regions of high spatial colour gradient corresponding to fishes' bodies. Time series of manual and automated counts were compared to evaluate efficiency. Periodogram and waveform analysis outputs provided very similar results, although the quantified parameters describing the strength of the respective rhythms differed. Results indicate that automation efficiency is limited by optimum visibility conditions. Data sets from manual counting present larger day-night fluctuations than those derived from automation. This comparison indicates that the automated protocol underestimates fish numbers but is nevertheless suitable for the study of community activity rhythms.
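The Roberts cross operator mentioned above is simple to reproduce; the following sketch (using NumPy and SciPy, which the authors did not necessarily use) computes the gradient magnitude and thresholds it into a candidate fish mask, with the threshold value left as a user-chosen assumption:

import numpy as np
from scipy import ndimage

def roberts_gradient(gray):
    """Roberts cross operator: gradient magnitude from two 2x2 diagonal kernels."""
    gray = gray.astype(float)
    gx = ndimage.convolve(gray, np.array([[1.0, 0.0], [0.0, -1.0]]))
    gy = ndimage.convolve(gray, np.array([[0.0, 1.0], [-1.0, 0.0]]))
    return np.hypot(gx, gy)

def fish_mask(gray, threshold):
    """Binary mask of high colour-gradient regions (candidate fish bodies)."""
    return roberts_gradient(gray) > threshold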
A framework for streamlining research workflow in neuroscience and psychology
Kubilius, Jonas
2014-01-01
Successful accumulation of knowledge is critically dependent on the ability to verify and replicate every part of scientific conduct. However, such principles are difficult to enact when researchers continue to rely on ad hoc workflows and poorly maintained code bases. In this paper I examine the needs of the neuroscience and psychology community, and introduce psychopy_ext, a unifying framework that seamlessly integrates popular experiment-building, analysis and manuscript-preparation tools by choosing reasonable defaults and implementing relatively rigid patterns of workflow. This structure allows for automation of multiple tasks, such as generating user interfaces, unit testing, control analyses of stimuli, single-command access to descriptive statistics, and publication-quality plotting. Taken together, psychopy_ext opens an exciting possibility for faster, more robust code development and collaboration among researchers. PMID:24478691
The operational processing of wind estimates from cloud motions: Past, present and future
NASA Technical Reports Server (NTRS)
Novak, C.; Young, M.
1977-01-01
Current NESS winds operations provide approximately 1800 high quality wind estimates per day to about twenty domestic and foreign users. This marked improvement in NESS winds operations was the result of computer techniques development which began in 1969 to streamline and improve operational procedures. In addition, the launch of the SMS-1 satellite in 1974, the first in the second generation of geostationary spacecraft, provided an improved source of visible and infrared scanner data for the extraction of wind estimates. Currently, operational winds processing at NESS is accomplished by the automated and manual analyses of infrared data from two geostationary spacecraft. This system uses data from SMS-2 and GOES-1 to produce wind estimates valid for 00Z, 12Z and 18Z synoptic times.
Shuttle Repair Tools Automate Vehicle Maintenance
NASA Technical Reports Server (NTRS)
2013-01-01
Successfully building, flying, and maintaining the space shuttles was an immensely complex job that required a high level of detailed, precise engineering. After each shuttle landed, it entered a maintenance, repair, and overhaul (MRO) phase. Each system was thoroughly checked and tested, and worn or damaged parts replaced, before the shuttle was rolled out for its next mission. During the MRO period, workers needed to record exactly what needed replacing and why, as well as follow precise guidelines and procedures in making their repairs. That meant traceability, and with it lots of paperwork. In 2007, the number of reports generated during electrical system repairs was getting out of hand, placing electrical systems among the top three in terms of paperwork volume. Repair specialists at Kennedy Space Center were unhappy spending so much time at a desk and so little time actually working on the shuttle. "Engineers weren't spending their time doing technical work," says Joseph Schuh, an electrical engineer at Kennedy. "Instead, they were busy with repetitive, time-consuming processes that, while important in their own right, provided a low return on time invested." The strain of such inefficiency was bad enough that slow electrical repairs jeopardized rollout on several occasions. Knowing there had to be a way to streamline operations, Kennedy asked Martin Belson, a project manager with 30 years' experience as an aerospace contractor, to co-lead a team in developing software that would reduce the effort required to document shuttle repairs. The result was System Maintenance Automated Repair Tasks (SMART) software. SMART is a tool for aggregating and applying information on every aspect of repairs, from procedures and instructions to a vehicle's troubleshooting history. Drawing on that data, SMART largely automates the processes of generating repair instructions and post-repair paperwork. In the case of the space shuttle, this meant that SMART had 30 years' worth of operations data that it could apply to ongoing maintenance work. According to Schuh, "SMART standardized and streamlined many shuttle repair processes, saving time and money while increasing safety and the quality of repairs." Maintenance technicians and engineers now had a tool that kept them in the field, and because SMART is capable of continually evolving, each time an engineer put it to use, it would enrich the Agency-wide knowledge base. "If an engineer sees something in the work environment that they could improve, a repair process or a procedure, SMART can incorporate that data for use in future operations," says Belson.
Fojtu, Michaela; Gumulec, Jaromir; Balvan, Jan; Raudenska, Martina; Sztalmachova, Marketa; Polanska, Hana; Smerkova, Kristyna; Adam, Vojtech; Kizek, Rene; Masarik, Michal
2014-02-01
Determination of serum mRNA has gained much attention in recent years, particularly from the perspective of disease markers. Streptavidin-modified paramagnetic particles (SMPs) are an attractive technique, mainly due to the possibility of automated isolation and high efficiency. The aim of this study was to optimize a serum isolation protocol to reduce the consumption of chemicals and sample volume. The following factors were optimized: amounts of (i) paramagnetic particles, (ii) oligo(dT)20 probe, (iii) serum, and (iv) the binding sequence (SMPs, oligo(dT)20, serum vs. oligo(dT)20, serum and SMPs). RNA content was measured, and the expression of metallothionein-2A as a possible prostate cancer marker was analyzed to demonstrate measurable RNA content with the ability for RT-PCR detection. Isolation is possible over a serum volume range of 10-200 μL without altering efficiency or purity. The amount of SMPs can be reduced to as little as 5 μL, with optimal results within 10-30 μL SMPs. The volume of oligo(dT)20 does not affect efficiency when used within 0.1-0.4 μL. This optimized protocol was also modified to fit the needs of automated one-step single-tube analysis with identical efficiency compared to the conventional setup. The one-step analysis protocol is considered a promising simplification, making RNA isolation suitable for an automatable process. © 2013 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
NASA Astrophysics Data System (ADS)
Manousaki, D.; Panagiotopoulou, A.; Bizimi, V.; Haynes, M. S.; Love, S.; Kallergi, M.
2017-11-01
The purpose of this study was the generation of ground truth files (GTFs) of the breast ducts from 3D images of the Invenia™ Automated Breast Ultrasound System (ABUS; GE Healthcare, Little Chalfont, UK) and the application of these GTFs for the optimization of the imaging protocol and the evaluation of a computer-aided detection (CADe) algorithm developed for automated duct detection. Six lactating, nursing volunteers were scanned with the ABUS before and right after breastfeeding their infants. An expert in breast ultrasound generated rough outlines of the milk-filled ducts in the transaxial slices of all image volumes, and the final GTFs were created using thresholding and smoothing tools in ImageJ. In addition, a CADe algorithm automatically segmented duct-like areas, and its results were compared to the expert's GTFs by estimating the true positive fraction (TPF), or % overlap. The CADe output differed significantly from the expert's, but both detected a smaller than expected volume of the ducts due to insufficient contrast (ducts were partially filled with milk), discontinuities, and artifacts. GTFs were used to modify the imaging protocol and improve the CADe method. In conclusion, electronic GTFs provide a valuable tool in the optimization of a tomographic imaging system, the imaging protocol, and the CADe algorithms. Their generation, however, is an extremely time-consuming, strenuous process, particularly for multi-slice examinations, and alternatives based on phantoms or simulations are highly desirable.
Application of an industrial robot to nuclear pharmacy
DOE Office of Scientific and Technical Information (OSTI.GOV)
Viola, J.
1994-12-31
Increased patient throughput and lengthened P.E.T. scan protocols have increased the radiation dose received by P.E.T. technologists. Automated methods of tracer infusion and blood sampling have been introduced to reduce direct contact with the radioisotopes, but significant radiation exposure still exists during the receipt and dispensing of the patient dose. To address this situation the authors have developed an automated robotic system which performs these tasks, thus limiting the physical contact between operator and radioisotope.
A universal method for automated gene mapping
Zipperlen, Peder; Nairz, Knud; Rimann, Ivo; Basler, Konrad; Hafen, Ernst; Hengartner, Michael; Hajnal, Alex
2005-01-01
Small insertions or deletions (InDels) constitute a ubiquitous class of sequence polymorphisms found in eukaryotic genomes. Here, we present an automated high-throughput genotyping method that relies on the detection of fragment-length polymorphisms (FLPs) caused by InDels. The protocol utilizes standard sequencers and genotyping software. We have established genome-wide FLP maps for both Caenorhabditis elegans and Drosophila melanogaster that facilitate genetic mapping with a minimum of manual input and at comparatively low cost. PMID:15693948
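A fragment-length-based genotype call of the kind this method automates can be sketched as follows; the expected allele sizes and the size tolerance are illustrative assumptions, not values from the paper:

def call_genotype(peak_sizes, allele_a, allele_b, tolerance=1.0):
    """Call a genotype from observed fragment sizes (in bp).

    allele_a / allele_b are the expected fragment lengths of the two alleles;
    the sizes and tolerance here are placeholders for illustration.
    """
    has_a = any(abs(p - allele_a) <= tolerance for p in peak_sizes)
    has_b = any(abs(p - allele_b) <= tolerance for p in peak_sizes)
    if has_a and has_b:
        return "heterozygous"
    if has_a:
        return "homozygous allele A"
    if has_b:
        return "homozygous allele B"
    return "no call"

print(call_genotype([182.4, 190.1], allele_a=182, allele_b=190))  # heterozygous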
Winter, York; Schaefers, Andrea T U
2011-03-30
Behavioral experiments based on operant procedures can be time-consuming relative to the small amounts of data they yield. While individual testing and handling of animals can influence attention, emotion, and behavior, and thereby interfere with experimental outcomes, many operant protocols require individual testing. We developed an RFID- and transponder-based sorting system that removes the human factor from longer-term experiments. Identity detectors and automated gates route mice individually from their social home cage to an adjacent operant compartment, operating 24/7. CD1 mice quickly learned to pass through the sorting system individually. At no time did more than a single mouse enter the operant compartment. After 3 days of adjusting to the sorting system, groups of 4 mice completed about 50 experimental trials per day in the operant compartment without experimenter intervention. The automated sorting system eliminates handling, isolation, and disturbance of the animals, eliminates experimenter-induced variability, saves experimenter time, and is economical. It makes possible a new approach to high-throughput experimentation and is a viable tool for increasing the quality and efficiency of many behavioral and neurobiological investigations. It can connect a social home cage, through individual sorting automation, to diverse setups including classical operant chambers, mazes, or arenas with video-based behavior classification. Such highly automated systems will permit efficient high-throughput screening even for transgenic animals with only subtle neurological or psychiatric symptoms, where elaborate or longer-term protocols are required for behavioral diagnosis. Copyright © 2011 Elsevier B.V. All rights reserved.
Re-refinement from deposited X-ray data can deliver improved models for most PDB entries
DOE Office of Scientific and Technical Information (OSTI.GOV)
Joosten, Robbie P.; Womack, Thomas; Vriend, Gert, E-mail: vriend@cmbi.ru.nl
2009-02-01
An evaluation of validation and real-space intervention possibilities for improving existing automated (re-)refinement methods. The deposition of X-ray data along with the customary structural models defining PDB entries makes it possible to apply large-scale re-refinement protocols to these entries, thus giving users the benefit of improvements in X-ray methods that have occurred since the structure was deposited. Automated gradient refinement is an effective method to achieve this goal, but real-space intervention is most often required in order to adequately address problems detected by structure-validation software. In order to improve the existing protocol, automated re-refinement was combined with structure validation and difference-density peak analysis to produce a catalogue of problems in PDB entries that are amenable to automatic correction. It is shown that re-refinement can be effective in producing improvements, which are often associated with the systematic use of the TLS parameterization of B factors, even for relatively new and high-resolution PDB entries, while the accompanying manual or semi-manual map analysis and fitting steps show good prospects for eventual automation. It is proposed that the potential for simultaneous improvements in methods and in re-refinement results be further encouraged by broadening the scope of depositions to include refinement metadata and ultimately primary rather than reduced X-ray data.
Khan, Ali R; Wang, Lei; Beg, Mirza Faisal
2008-07-01
Fully-automated brain segmentation methods have not been widely adopted for clinical use because of issues related to reliability, accuracy, and limitations of delineation protocol. By combining the probabilistic-based FreeSurfer (FS) method with the Large Deformation Diffeomorphic Metric Mapping (LDDMM)-based label-propagation method, we are able to increase reliability and accuracy, and allow for flexibility in template choice. Our method uses the automated FreeSurfer subcortical labeling to provide a coarse-to-fine introduction of information in the LDDMM template-based segmentation, resulting in a fully-automated subcortical brain segmentation method (FS+LDDMM). One major advantage of the FS+LDDMM-based approach is that the automatically generated segmentations are inherently smooth, thus subsequent steps in shape analysis can directly follow without manual post-processing or loss of detail. We have evaluated our new FS+LDDMM method on several databases containing a total of 50 subjects with different pathologies, scan sequences and manual delineation protocols for labeling the basal ganglia, thalamus, and hippocampus. In healthy controls we report Dice overlap measures of 0.81, 0.83, 0.74, 0.86 and 0.75 for the right caudate nucleus, putamen, pallidum, thalamus and hippocampus respectively. We also find statistically significant improvement of accuracy in FS+LDDMM over FreeSurfer for the caudate nucleus and putamen of Huntington's disease and Tourette's syndrome subjects, and the right hippocampus of Schizophrenia subjects.
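The Dice overlap measure used for the evaluation above is straightforward to compute; a minimal NumPy sketch for two binary segmentation masks:

import numpy as np

def dice(label_a, label_b):
    """Dice overlap between two binary segmentation masks of the same shape."""
    a = np.asarray(label_a, dtype=bool)
    b = np.asarray(label_b, dtype=bool)
    denom = a.sum() + b.sum()
    return 2.0 * np.logical_and(a, b).sum() / denom if denom else 1.0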
NASA Technical Reports Server (NTRS)
Wolf, S. W. D.; Goodyer, M. J.
1982-01-01
Operation of the Transonic Self-Streamlining Wind Tunnel (TSWT) involved on-line data acquisition with automatic wall adjustment. A tunnel run consisted of streamlining the walls from known starting contours in iterative steps and acquiring model data. Each run performs what is described as a streamlining cycle. The associated software is presented.
Barnett, Adrian G; Graves, Nicholas; Clarke, Philip; Herbert, Danielle
2015-01-01
Objective: To examine if streamlining a medical research funding application process saved time for applicants. Design: Cross-sectional surveys before and after the streamlining. Setting: The National Health and Medical Research Council (NHMRC) of Australia. Participants: Researchers who submitted one or more NHMRC Project Grant applications in 2012 or 2014. Main outcome measures: Average researcher time spent preparing an application and the total time for all applications in working days. Results: The average time per application increased from 34 working days before streamlining (95% CI 33 to 35) to 38 working days after streamlining (95% CI 37 to 39; mean difference 4 days, bootstrap p value <0.001). The estimated total time spent by all researchers on applications after streamlining was 614 working years, a 67-year increase from before streamlining. Conclusions: Streamlined applications were shorter but took longer to prepare on average. Researchers may be allocating a fixed amount of time to preparing funding applications based on their expected return, or may be increasing their time in response to increased competition. Many potentially productive years of researcher time are still being lost to preparing failed applications. PMID:25596201
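The bootstrap comparison of mean preparation times reported above can be sketched as follows; this pools the two surveys to resample under the null hypothesis of no difference, which is one common choice and not necessarily the authors' exact procedure:

import numpy as np

rng = np.random.default_rng(0)

def bootstrap_mean_diff(before, after, n_boot=10000):
    """Two-sided bootstrap p-value for a difference in mean preparation time.

    'before' and 'after' are arrays of working days per application; the data
    would come from the two surveys, not from this sketch.
    """
    before, after = np.asarray(before, float), np.asarray(after, float)
    observed = after.mean() - before.mean()
    pooled = np.concatenate([before, after])  # resample under H0: no difference
    diffs = np.empty(n_boot)
    for i in range(n_boot):
        b = rng.choice(pooled, size=before.size, replace=True)
        a = rng.choice(pooled, size=after.size, replace=True)
        diffs[i] = a.mean() - b.mean()
    return observed, np.mean(np.abs(diffs) >= abs(observed))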
Automated Formosat Image Processing System for Rapid Response to International Disasters
NASA Astrophysics Data System (ADS)
Cheng, M. C.; Chou, S. C.; Chen, Y. C.; Chen, B.; Liu, C.; Yu, S. J.
2016-06-01
FORMOSAT-2, Taiwan's first remote sensing satellite, was successfully launched in May 2004 into a Sun-synchronous orbit at an altitude of 891 kilometers. With its daily revisit feature, the 2-m panchromatic and 8-m multi-spectral resolution images captured have been used for research and operations in various societal benefit areas. This paper details the orchestration of tasks conducted by different institutions in Taiwan in efforts to respond to international disasters. The institutes involved include the space agency, the National Space Organization (NSPO); the Center for Satellite Remote Sensing Research of National Central University; the GIS Center of Feng-Chia University; and the National Center for High-performance Computing. Since each institution has its own mandate, the coordinated tasks ranged from receiving emergency observation requests, scheduling and tasking of satellite operations, and downlinking to ground stations, through image processing including data injection and ortho-rectification, to delivery of image products. With the lessons learned from working with international partners, the FORMOSAT Image Processing System has been extensively automated and streamlined with the goal of shortening the time between request and delivery. The integrated team has developed an Application Interface to its system platform that provides functions for searching the archive catalogue, requesting data services, mission planning, inquiring about service status, and downloading images. This automated system enables timely image acquisition and substantially increases the value of the data products. An example outcome of these efforts, the recent response supporting Sentinel Asia during the Nepal earthquake, is demonstrated herein.
Fernandez-Ricaud, Luciano; Kourtchenko, Olga; Zackrisson, Martin; Warringer, Jonas; Blomberg, Anders
2016-06-23
Phenomics is a field in functional genomics that records variation in organismal phenotypes in the genetic, epigenetic or environmental context at a massive scale. For microbes, the key phenotype is the growth in population size because it contains information that is directly linked to fitness. Due to technical innovations and extensive automation our capacity to record complex and dynamic microbial growth data is rapidly outpacing our capacity to dissect and visualize this data and extract the fitness components it contains, hampering progress in all fields of microbiology. To automate visualization, analysis and exploration of complex and highly resolved microbial growth data as well as standardized extraction of the fitness components it contains, we developed the software PRECOG (PREsentation and Characterization Of Growth-data). PRECOG allows the user to quality control, interact with and evaluate microbial growth data with ease, speed and accuracy, also in cases of non-standard growth dynamics. Quality indices filter high- from low-quality growth experiments, reducing false positives. The pre-processing filters in PRECOG are computationally inexpensive and yet functionally comparable to more complex neural network procedures. We provide examples where data calibration, project design and feature extraction methodologies have a clear impact on the estimated growth traits, emphasising the need for proper standardization in data analysis. PRECOG is a tool that streamlines growth data pre-processing, phenotypic trait extraction, visualization, distribution and the creation of vast and informative phenomics databases.
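As an illustration of the kind of fitness-component extraction that PRECOG standardizes, the following sketch estimates the maximum specific growth rate, doubling time and yield from a single growth curve; it omits PRECOG's calibration and quality filters, and the sliding-window choice is an assumption:

import numpy as np

def growth_traits(time_h, od, window=5):
    """Crude fitness-component extraction from one growth curve.

    Returns (max slope of ln(OD) per hour, doubling time in hours, yield).
    A simplified illustration only; assumes the curve has at least 'window' points.
    """
    time_h = np.asarray(time_h, float)
    log_od = np.log(np.clip(np.asarray(od, float), 1e-6, None))
    slopes = [
        np.polyfit(time_h[i:i + window], log_od[i:i + window], 1)[0]
        for i in range(len(time_h) - window + 1)
    ]
    mu_max = max(slopes)
    doubling_time = np.log(2) / mu_max if mu_max > 0 else float("inf")
    return mu_max, doubling_time, float(np.max(od))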
E-novo: an automated workflow for efficient structure-based lead optimization.
Pearce, Bradley C; Langley, David R; Kang, Jia; Huang, Hongwei; Kulkarni, Amit
2009-07-01
An automated E-Novo protocol designed as a structure-based lead optimization tool was prepared through Pipeline Pilot with existing CHARMm components in Discovery Studio. A scaffold core having 3D binding coordinates of interest is generated from a ligand-bound protein structural model. Ligands of interest are generated from the scaffold using an R-group fragmentation/enumeration tool within E-Novo, with their cores aligned. The ligand side chains are conformationally sampled and are subjected to core-constrained protein docking, using a modified CHARMm-based CDOCKER method to generate top poses along with CDOCKER energies. In the final stage of E-Novo, a physics-based binding energy scoring function ranks the top ligand CDOCKER poses using a more accurate Molecular Mechanics-Generalized Born with Surface Area method. Correlation of the calculated ligand binding energies with experimental binding affinities were used to validate protocol performance. Inhibitors of Src tyrosine kinase, CDK2 kinase, beta-secretase, factor Xa, HIV protease, and thrombin were used to test the protocol using published ligand crystal structure data within reasonably defined binding sites. In-house Respiratory Syncytial Virus inhibitor data were used as a more challenging test set using a hand-built binding model. Least squares fits for all data sets suggested reasonable validation of the protocol within the context of observed ligand binding poses. The E-Novo protocol provides a convenient all-in-one structure-based design process for rapid assessment and scoring of lead optimization libraries.
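The validation step described above, correlating calculated binding energies with experimental affinities by least squares, can be sketched in a few lines; the variable names and the use of pIC50-style affinities are assumptions for illustration:

import numpy as np

def validate_scoring(calc_binding_energy, exp_affinity):
    """Least-squares fit of computed binding energies against experimental affinities.

    Returns (slope, intercept, r_squared); sign conventions are assumed, not the
    protocol's exact definitions.
    """
    x = np.asarray(calc_binding_energy, float)
    y = np.asarray(exp_affinity, float)
    slope, intercept = np.polyfit(x, y, 1)
    r = np.corrcoef(x, y)[0, 1]
    return slope, intercept, r ** 2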
Graphical user interface for wireless sensor networks simulator
NASA Astrophysics Data System (ADS)
Paczesny, Tomasz; Paczesny, Daniel; Weremczuk, Jerzy
2008-01-01
Wireless Sensor Networks (WSN) are currently a very popular area of development. They are suited to many applications, from military uses through environment monitoring, healthcare, home automation and others. Such networks, when working in a dynamic, ad-hoc model, need effective protocols which must differ from common computer-network algorithms. Research on those protocols would be difficult without a simulation tool, because real applications often use many nodes, and tests on such big networks take much effort and cost. The paper presents a Graphical User Interface (GUI) for a simulator which is dedicated to WSN studies, especially the evaluation of routing and data link protocols.
Combined RT-qPCR of mRNA and microRNA Targets within One Fluidigm Integrated Fluidic Circuit.
Baldwin, Don A; Horan, Annamarie D; Hesketh, Patrick J; Mehta, Samir
2016-07-01
The ability to profile expression levels of a large number of mRNAs and microRNAs (miRNAs) within the same sample, using a single assay method, would facilitate investigations of miRNA effects on mRNA abundance and streamline biomarker screening across multiple RNA classes. A protocol is described for reverse transcription of long RNA and miRNA targets, followed by preassay amplification of the pooled cDNAs and quantitative PCR (qPCR) detection for a mixed panel of candidate RNA biomarkers. The method provides flexibility for designing custom target panels, is robust over a range of input RNA amounts, and demonstrated a high assay success rate.
Crowd-sourcing Meteorological Data for Student Field Projects
NASA Astrophysics Data System (ADS)
Bullard, J. E.
2016-12-01
This paper explains how students can rapidly collect large datasets to characterise wind speed and direction under different meteorological conditions. The tools used include a mobile device (tablet or phone), low cost wind speed/direction meters that are plugged in to the mobile device, and an app with online web support for uploading, collating and georeferencing data. Electronic customised data input forms downloaded to the mobile device are used to ensure students collect data using specified protocols which streamlines data management and reduces the likelihood of data entry errors. A key benefit is the rapid collection and quality control of field data that can be promptly disseminated to students for subsequent analysis.
Sédille-Mostafaie, Nazanin; Engler, Hanna; Lutz, Susanne; Korte, Wolfgang
2013-06-01
Laboratories today face increasing pressure to automate operations due to increasing workloads and the need to reduce expenditure. Few studies to date have focussed on the laboratory automation of preanalytical coagulation specimen processing. In the present study, we examined whether a clinical chemistry automation protocol meets the preanalytical requirements for the analyses of coagulation. During the implementation of laboratory automation, we began to operate a pre- and postanalytical automation system. The preanalytical unit processes blood specimens for chemistry, immunology and coagulation by automated specimen processing. As the production of platelet-poor plasma is highly dependent on optimal centrifugation, we examined specimen handling under different centrifugation conditions in order to produce optimal platelet deficient plasma specimens. To this end, manually processed models centrifuged at 1500 g for 5 and 20 min were compared to an automated centrifugation model at 3000 g for 7 min. For analytical assays that are performed frequently enough to be targets for full automation, Passing-Bablok regression analysis showed close agreement between different centrifugation methods, with a correlation coefficient between 0.98 and 0.99 and a bias between -5% and +6%. For seldom performed assays that do not mandate full automation, the Passing-Bablok regression analysis showed acceptable to poor agreement between different centrifugation methods. A full automation solution is suitable and can be recommended for frequent haemostasis testing.
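Passing-Bablok regression, used above to compare centrifugation methods, can be sketched as follows; this simplified version computes the shifted-median slope and matching intercept but omits the confidence intervals and tie handling of the full procedure:

import numpy as np

def passing_bablok(x, y):
    """Simplified Passing-Bablok regression for method comparison (no CIs)."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    pair_slopes = []
    for i in range(len(x)):
        for j in range(i + 1, len(x)):
            if x[j] != x[i]:
                s = (y[j] - y[i]) / (x[j] - x[i])
                if s != -1:  # pairs with slope exactly -1 are discarded
                    pair_slopes.append(s)
    slopes = np.sort(np.array(pair_slopes))
    n = len(slopes)
    k = int(np.sum(slopes < -1))  # offset correcting for negative slopes
    if n % 2:
        slope = slopes[(n - 1) // 2 + k]
    else:
        slope = 0.5 * (slopes[n // 2 + k - 1] + slopes[n // 2 + k])
    intercept = np.median(y - slope * x)
    return slope, intercept

A slope near 1 and intercept near 0 would support agreement between the manual and automated centrifugation results, which is the conclusion reported for the frequently performed assays.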
Reddy, Alavalapati Goutham; Das, Ashok Kumar; Odelu, Vanga; Yoo, Kee-Young
2016-01-01
Biometric based authentication protocols for multi-server architectures have gained momentum in recent times due to advancements in wireless technologies and associated constraints. Lu et al. recently proposed a robust biometric based authentication with key agreement protocol for a multi-server environment using smart cards. They claimed that their protocol is efficient and resistant to prominent security attacks. The careful investigation of this paper proves that Lu et al.'s protocol does not provide user anonymity, perfect forward secrecy and is susceptible to server and user impersonation attacks, man-in-middle attacks and clock synchronization problems. In addition, this paper proposes an enhanced biometric based authentication with key-agreement protocol for multi-server architecture based on elliptic curve cryptography using smartcards. We proved that the proposed protocol achieves mutual authentication using Burrows-Abadi-Needham (BAN) logic. The formal security of the proposed protocol is verified using the AVISPA (Automated Validation of Internet Security Protocols and Applications) tool to show that our protocol can withstand active and passive attacks. The formal and informal security analyses and performance analysis demonstrates that the proposed protocol is robust and efficient compared to Lu et al.'s protocol and existing similar protocols.
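The proposed protocol itself is not reproduced here, but the elliptic-curve key agreement primitive on which such schemes build can be illustrated with a generic ECDH exchange using the Python cryptography package; the curve choice and key-derivation parameters are illustrative assumptions, not the authors' design:

# Generic ECDH key agreement, NOT the authors' protocol; it only shows how two
# parties derive a shared session key over an elliptic curve.
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import ec
from cryptography.hazmat.primitives.kdf.hkdf import HKDF

server_key = ec.generate_private_key(ec.SECP256R1())
user_key = ec.generate_private_key(ec.SECP256R1())

# Each side combines its own private key with the other's public key.
shared_user = user_key.exchange(ec.ECDH(), server_key.public_key())
shared_server = server_key.exchange(ec.ECDH(), user_key.public_key())
assert shared_user == shared_server

session_key = HKDF(algorithm=hashes.SHA256(), length=32, salt=None,
                   info=b"session key").derive(shared_user)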
Air Permitting Streamlining Techniques and Approaches for Greenhouse Gases, 2012
This report presents potential GHG permit streamlining options and observations developed by the Clean Air Act Advisory Committee (CAAAC): Permits, New Source Review and Toxics Subcommittee GHG Permit Streamlining Workgroup
Ibrahim, Sarah A; Martini, Luigi
2014-08-01
Dissolution method transfer is a complicated yet common process in the pharmaceutical industry. With increased pharmaceutical product manufacturing and dissolution acceptance requirements, dissolution testing has become one of the most labor-intensive quality control testing methods. There is an increased trend for automation in dissolution testing, particularly for large pharmaceutical companies to reduce variability and increase personnel efficiency. There is no official guideline for dissolution testing method transfer from a manual, semi-automated, to automated dissolution tester. In this study, a manual multipoint dissolution testing procedure for an enteric-coated aspirin tablet was transferred effectively and reproducibly to a fully automated dissolution testing device, RoboDis II. Enteric-coated aspirin samples were used as a model formulation to assess the feasibility and accuracy of media pH change during continuous automated dissolution testing. Several RoboDis II parameters were evaluated to ensure the integrity and equivalency of dissolution method transfer from a manual dissolution tester. This current study provides a systematic outline for the transfer of the manual dissolution testing protocol to an automated dissolution tester. This study further supports that automated dissolution testers compliant with regulatory requirements and similar to manual dissolution testers facilitate method transfer. © 2014 Society for Laboratory Automation and Screening.
Comparison of Actual Costs to Integrate Commercial Buildings with the Grid
DOE Office of Scientific and Technical Information (OSTI.GOV)
Piette, Mary Ann; Black, Doug; Yin, Rongxin
During the past decade, the technology to automate demand response (DR) in buildings and industrial facilities has advanced significantly. Automation allows rapid, repeatable, reliable operation. This study focuses on costs for DR automation in commercial buildings with some discussion on residential buildings and industrial facilities. DR automation technology relies on numerous components, including communication systems, hardware and software gateways, standards-based messaging protocols, controls and integration platforms, and measurement and telemetry systems. This paper discusses the impact factors that contribute to the costs of automated DR systems, with a focus on OpenADR 1.0 and 2.0 systems. In addition, this report compares cost data from several DR automation programs and pilot projects, evaluates trends in the cost per unit of DR and kilowatts (kW) available from automated systems, and applies a standard naming convention and classification or taxonomy for system elements. In summary, median costs for the 56 installed automated DR systems studied here are about $200/kW. The deviation around this median is large with costs in some cases being an order of magnitude greater or less than median. Costs to automate fast DR systems for ancillary services are not fully analyzed in this report because additional research is needed to determine the total such costs.
Protocol-based care: the standardisation of decision-making?
Rycroft-Malone, Jo; Fontenla, Marina; Seers, Kate; Bick, Debra
2009-05-01
To explore how protocol-based care affects clinical decision-making. In the context of evidence-based practice, protocol-based care is a mechanism for facilitating the standardisation of care and streamlining decision-making through rationalising the information with which to make judgements and ultimately decisions. However, whether protocol-based care does, in the reality of practice, standardise decision-making is unknown. This paper reports on a study that explored the impact of protocol-based care on nurses' decision-making. Theoretically informed by realistic evaluation and the promoting action on research implementation in health services framework, a case study design using ethnographic methods was used. Two sites were purposively sampled: a diabetic and endocrine unit and a cardiac medical unit. Within each site, data collection included observation, post-observation semi-structured interviews with staff and patients, field notes, feedback sessions and document review. Data were inductively and thematically analysed. Decisions made by nurses in both sites varied according to many different and interacting factors. While several standardised care approaches were available for use, in reality, a variety of information sources informed decision-making. The primary approach to knowledge exchange and acquisition was person-to-person; decision-making was a social activity. Rarely were standardised care approaches obviously referred to; nurses described following a mental flowchart, not necessarily linked to a particular guideline or protocol. When standardised care approaches were used, it was reported that they were used flexibly and were particularised. While the logic of protocol-based care is algorithmic, in the reality of clinical practice, other sources of information supported nurses' decision-making process. This has significant implications for the political goal of standardisation. The successful implementation and judicious use of tools such as protocols and guidelines will likely be dependent on approaches that facilitate the development of nurses' decision-making processes in parallel with paying attention to the influence of context.
IntelliCages and automated assessment of learning in group-housed mice
NASA Astrophysics Data System (ADS)
Puścian, Alicja; Knapska, Ewelina
2014-11-01
IntelliCage is a fully automated, computer controlled system, which can be used for long-term monitoring of behavior of group-housed mice. Using standardized experimental protocols we can assess cognitive abilities and behavioral flexibility in appetitively and aversively motivated tasks, as well as measure social influences on learning of the subjects. We have also identified groups of neurons specifically activated by appetitively and aversively motivated learning within the amygdala, function of which we are going to investigate optogenetically in the future.
2011-01-01
Genome targeting methods enable cost-effective capture of specific subsets of the genome for sequencing. We present here an automated, highly scalable method for carrying out the Solution Hybrid Selection capture approach that provides a dramatic increase in scale and throughput of sequence-ready libraries produced. Significant process improvements and a series of in-process quality control checkpoints are also added. These process improvements can also be used in a manual version of the protocol. PMID:21205303
Barnett, Adrian G; Graves, Nicholas; Clarke, Philip; Herbert, Danielle
2015-01-16
To examine if streamlining a medical research funding application process saved time for applicants. Cross-sectional surveys before and after the streamlining. The National Health and Medical Research Council (NHMRC) of Australia. Researchers who submitted one or more NHMRC Project Grant applications in 2012 or 2014. Average researcher time spent preparing an application and the total time for all applications in working days. The average time per application increased from 34 working days before streamlining (95% CI 33 to 35) to 38 working days after streamlining (95% CI 37 to 39; mean difference 4 days, bootstrap p value <0.001). The estimated total time spent by all researchers on applications after streamlining was 614 working years, a 67-year increase from before streamlining. Streamlined applications were shorter but took longer to prepare on average. Researchers may be allocating a fixed amount of time to preparing funding applications based on their expected return, or may be increasing their time in response to increased competition. Many potentially productive years of researcher time are still being lost to preparing failed applications. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://group.bmj.com/group/rights-licensing/permissions.
Automated sample-preparation technologies in genome sequencing projects.
Hilbert, H; Lauber, J; Lubenow, H; Düsterhöft, A
2000-01-01
A robotic workstation system (BioRobot 9600, QIAGEN) and a 96-well UV spectrophotometer (Spectramax 250, Molecular Devices) were integrated into the process of high-throughput automated sequencing of double-stranded plasmid DNA templates. An automated 96-well miniprep kit protocol (QIAprep Turbo, QIAGEN) provided high-quality plasmid DNA from shotgun clones. The DNA prepared by this procedure was used to generate more than two megabases of final sequence data for two genomic projects (Arabidopsis thaliana and Schizosaccharomyces pombe), three thousand expressed sequence tags (ESTs) plus half a megabase of human full-length cDNA clones, and approximately 53,000 single reads for a whole-genome shotgun project (Pseudomonas putida).
Shi, Handuo; Colavin, Alexandre; Lee, Timothy K; Huang, Kerwyn Casey
2017-02-01
Single-cell microscopy is a powerful tool for studying gene functions using strain libraries, but it suffers from throughput limitations. Here we describe the Strain Library Imaging Protocol (SLIP), which is a high-throughput, automated microscopy workflow for large strain collections that requires minimal user involvement. SLIP involves transferring arrayed bacterial cultures from multiwell plates onto large agar pads using inexpensive replicator pins and automatically imaging the resulting single cells. The acquired images are subsequently reviewed and analyzed by custom MATLAB scripts that segment single-cell contours and extract quantitative metrics. SLIP yields rich data sets on cell morphology and gene expression that illustrate the function of certain genes and the connections among strains in a library. For a library arrayed on 96-well plates, image acquisition can be completed within 4 min per plate.
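SLIP's analysis scripts are written in MATLAB; a rough Python analogue of the segmentation-and-metrics step, with the thresholding direction and size filter chosen as assumptions, might look like this:

import numpy as np
from skimage import filters, measure, morphology

def segment_cells(frame, min_area=50):
    """Threshold one frame, label connected regions and extract per-cell metrics."""
    thresh = filters.threshold_otsu(frame)
    mask = frame < thresh  # assumes cells are darker than background
    mask = morphology.remove_small_objects(mask, min_size=min_area)
    labels = measure.label(mask)
    return [
        {"area": r.area, "length": r.major_axis_length, "width": r.minor_axis_length}
        for r in measure.regionprops(labels)
    ]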
Bates, Maxwell; Berliner, Aaron J; Lachoff, Joe; Jaschke, Paul R; Groban, Eli S
2017-01-20
Wet Lab Accelerator (WLA) is a cloud-based tool that allows a scientist to conduct biology via robotic control without the need for any programming knowledge. A drag and drop interface provides a convenient and user-friendly method of generating biological protocols. Graphically developed protocols are turned into programmatic instruction lists required to conduct experiments at the cloud laboratory Transcriptic. Prior to the development of WLA, biologists were required to write in a programming language called "Autoprotocol" in order to work with Transcriptic. WLA relies on a new abstraction layer we call "Omniprotocol" to convert the graphical experimental description into lower level Autoprotocol language, which then directs robots at Transcriptic. While WLA has only been tested at Transcriptic, the conversion of graphically laid out experimental steps into Autoprotocol is generic, allowing extension of WLA into other cloud laboratories in the future. WLA hopes to democratize biology by bringing automation to general biologists.
Mangold, Stefanie; De Cecco, Carlo N; Wichmann, Julian L; Canstein, Christian; Varga-Szemes, Akos; Caruso, Damiano; Fuller, Stephen R; Bamberg, Fabian; Nikolaou, Konstantin; Schoepf, U Joseph
2016-05-01
To compare, on an intra-individual basis, the effect of automated tube voltage selection (ATVS), an integrated circuit detector and advanced iterative reconstruction on radiation dose and image quality of aortic CTA studies using 2nd and 3rd generation dual-source CT (DSCT). We retrospectively evaluated 32 patients who had undergone CTA of the entire aorta with both 2nd generation DSCT at 120kV using filtered back projection (FBP) (protocol 1) and 3rd generation DSCT using ATVS, an integrated circuit detector and advanced iterative reconstruction (protocol 2). Contrast-to-noise ratio (CNR) was calculated. Image quality was subjectively evaluated using a five-point scale. Radiation dose parameters were recorded. All studies were considered of diagnostic image quality. CNR was significantly higher with protocol 2 (15.0±5.2 vs 11.0±4.2; p<.0001). Subjective image quality analysis revealed no significant differences for evaluation of attenuation (p=0.08501) but image noise was rated significantly lower with protocol 2 (p=0.0005). Mean tube voltage and effective dose were 94.7±14.1kV and 6.7±3.9mSv with protocol 2; 120±0kV and 11.5±5.2mSv with protocol 1 (p<0.0001, respectively). Aortic CTA performed with 3rd generation DSCT, ATVS, an integrated circuit detector, and advanced iterative reconstruction allows a substantial reduction of radiation exposure while improving image quality in comparison to 120kV imaging with FBP. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
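Contrast-to-noise ratio as reported above is typically computed from two regions of interest; a minimal sketch, assuming the common convention of contrast-enhanced aortic attenuation referenced to a soft-tissue ROI with noise taken as that ROI's standard deviation (the study's exact definition may differ):

import numpy as np

def contrast_to_noise_ratio(vessel_roi, reference_roi):
    """CNR = (mean vessel attenuation - mean reference attenuation) / image noise."""
    vessel_roi = np.asarray(vessel_roi, float)
    reference_roi = np.asarray(reference_roi, float)
    return (vessel_roi.mean() - reference_roi.mean()) / reference_roi.std(ddof=1)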
Hirasawa, Kazunori; Ito, Hikaru; Ohori, Yukari; Takano, Yui; Shoji, Nobuyuki
2017-01-01
AIM: To evaluate the refractive correction for standard automated perimetry (SAP) in eyes with refractive multifocal contact lenses (CL) in healthy young participants. METHODS: Twenty-nine eyes of 29 participants were included. Accommodation was paralyzed in all participants with 1% cyclopentolate hydrochloride. SAP was performed using the Humphrey SITA-standard 24-2 and 10-2 protocol under three refractive conditions: monofocal CL corrected for near distance (baseline); multifocal CL corrected for distance (mCL-D); and mCL-D corrected for near vision using a spectacle lens (mCL-N). Primary outcome measures were the foveal threshold, mean deviation (MD), and pattern standard deviation (PSD). RESULTS: The foveal threshold of mCL-N with both the 24-2 and 10-2 protocols significantly decreased by 2.2-2.5 dB (P<0.001), while that of mCL-D with the 24-2 protocol significantly decreased by 1.5 dB (P=0.0427), as compared with that of baseline. Although there was no significant difference between the MD of baseline and mCL-D with the 24-2 and 10-2 protocols, the MD of mCL-N was significantly decreased by 1.0-1.3 dB (P<0.001) as compared with that of both baseline and mCL-D, with both 24-2 and 10-2 protocols. There was no significant difference in the PSD among the three refractive conditions with both the 24-2 and 10-2 protocols. CONCLUSION: Despite the induced mydriasis and the optical design of the multifocal lens used in this study, our results indicated that, when the dome-shaped visual field test is performed with eyes with large pupils and wearing refractive multifocal CLs, distance correction without additional near correction is to be recommended. PMID:29062776
Manning’s equation and two-dimensional flow analogs
NASA Astrophysics Data System (ADS)
Hromadka, T. V., II; Whitley, R. J.; Jordan, N.; Meyer, T.
2010-07-01
Two-dimensional (2D) flow models based on the well-known governing 2D flow equations are applied to floodplain analysis purposes. These 2D models numerically solve the governing flow equations simultaneously or explicitly on a discretization of the floodplain using grid tiles or similar tile cell geometry, called "elements". By use of automated information systems such as digital terrain modeling, digital elevation models, and GIS, large-scale topographic floodplain maps can be readily discretized into thousands of elements that densely cover the floodplain in an edge-to-edge form. However, the assumed principal flow directions of the flow model analog, as applied across an array of elements, typically do not align with the floodplain flow streamlines. This paper examines the mathematical underpinnings of a four-direction flow analog using an array of square elements with respect to floodplain flow streamlines that are not in alignment with the analog's principal flow directions. It is determined that application of Manning's equation to estimate the friction slope terms of the governing flow equations, in directions that are not coincident with the flow streamlines, may introduce a bias in modeling results, in the form of slight underestimation of flow depths. It is also determined that the maximum theoretical bias occurs when a single square element is rotated by about 13°, and not 45° as would be intuitively thought. The bias as a function of rotation angle for an array of square elements approximately follows the bias for a single square element. For both the theoretical single square element and an array of square elements, the bias as a function of alignment angle holds a relatively constant value from about 5° to about 85°, centered at about 45°. This bias was first noted about a decade prior to the present paper, and its magnitude was estimated then to be about 20% at about 10° misalignment. An adjustment of Manning's n is investigated based on a considered steady-state uniform flow problem, but the magnitude of the adjustment (about 20%) is on the order of the magnitude of the accepted ranges of friction factors. For usual cases where random streamline trajectory variability within the floodplain flow is greater than a few degrees from perfect alignment, the apparent bias appears to be implicitly included in the Manning's n values. It can be concluded that the array of square elements may be applied over the digital terrain model without respect to topographic flow directions.
Streamline-curvature effect in three-dimensional boundary layers
NASA Technical Reports Server (NTRS)
Reed, Helen L.; Lin, Ray-Sing; Petraglia, Media M.
1992-01-01
The effect of including wall and streamline curvature terms in swept-wing boundary-layer stability calculations is studied. The linear disturbance equations are cast on a fixed, body-intrinsic, curvilinear coordinate system. Those nonparallel terms which contribute mainly to the streamline-curvature effect are retained in this formulation and approximated by their local finite-difference values. Convex-wall curvature has a stabilizing effect, while streamline curvature is destabilizing if the curvature exceeds a critical value.
View-Dependent Streamline Deformation and Exploration
Tong, Xin; Edwards, John; Chen, Chun-Ming; Shen, Han-Wei; Johnson, Chris R.; Wong, Pak Chung
2016-01-01
Occlusion presents a major challenge in visualizing 3D flow and tensor fields using streamlines. Displaying too many streamlines creates a dense visualization filled with occluded structures, but displaying too few streams risks losing important features. We propose a new streamline exploration approach by visually manipulating the cluttered streamlines by pulling visible layers apart and revealing the hidden structures underneath. This paper presents a customized view-dependent deformation algorithm and an interactive visualization tool to minimize visual clutter in 3D vector and tensor fields. The algorithm is able to maintain the overall integrity of the fields and expose previously hidden structures. Our system supports both mouse and direct-touch interactions to manipulate the viewing perspectives and visualize the streamlines in depth. By using a lens metaphor of different shapes to select the transition zone of the targeted area interactively, the users can move their focus and examine the vector or tensor field freely. PMID:26600061
Animating streamlines with repeated asymmetric patterns for steady flow visualization
NASA Astrophysics Data System (ADS)
Yeh, Chih-Kuo; Liu, Zhanping; Lee, Tong-Yee
2012-01-01
Animation provides intuitive cueing for revealing essential spatial-temporal features of data in scientific visualization. This paper explores the design of Repeated Asymmetric Patterns (RAPs) in animating evenly-spaced color-mapped streamlines for dense accurate visualization of complex steady flows. We present a smooth cyclic variable-speed RAP animation model that performs velocity (magnitude) integral luminance transition on streamlines. This model is extended with inter-streamline synchronization in luminance varying along the tangential direction to emulate orthogonal advancing waves from a geometry-based flow representation, and then with evenly-spaced hue differing in the orthogonal direction to construct tangential flow streaks. To weave these two mutually dual sets of patterns, we propose an energy-decreasing strategy that adopts an iterative yet efficient procedure for determining the luminance phase and hue of each streamline in HSL color space. We also employ adaptive luminance interleaving in the direction perpendicular to the flow to increase the contrast between streamlines.
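As a rough illustration of the kind of pattern being animated, the sketch below implements one plausible, simplified reading of a repeated asymmetric pattern: a cyclic sawtooth luminance profile parameterized by the velocity-magnitude integral along a streamline, so the pattern drifts faster where the flow is faster. The function and parameter names (wavelength, period, asymmetry fraction) are illustrative assumptions, not the authors' model.

    import numpy as np

    def rap_luminance(arc_len, speed, t, wavelength=20.0, period=5.0, asym=0.2):
        """Cyclic asymmetric luminance along one streamline (illustrative only).

        arc_len : cumulative arc length at each sample point
        speed   : local velocity magnitude at each sample point
        t       : animation time
        The phase is driven by the integral of |velocity| along the line, and each
        cycle has a short bright rise (fraction `asym`) and a long dim fall.
        """
        ds = np.diff(arc_len, prepend=arc_len[0])
        vel_integral = np.cumsum(speed * ds)
        phase = (vel_integral / wavelength - t / period) % 1.0
        return np.where(phase < asym, phase / asym, (1.0 - phase) / (1.0 - asym))

    # toy streamline: 200 samples with spatially varying speed
    s = np.linspace(0.0, 100.0, 200)
    v = 1.0 + 0.5 * np.sin(s / 15.0)
    luminance = rap_luminance(s, v, t=3.0)   # values in [0, 1], e.g. for the HSL L channel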
View-Dependent Streamline Deformation and Exploration
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tong, Xin; Edwards, John; Chen, Chun-Ming
Occlusion presents a major challenge in visualizing 3D flow and tensor fields using streamlines. Displaying too many streamlines creates a dense visualization filled with occluded structures, but displaying too few streams risks losing important features. We propose a new streamline exploration approach by visually manipulating the cluttered streamlines by pulling visible layers apart and revealing the hidden structures underneath. This paper presents a customized view-dependent deformation algorithm and an interactive visualization tool to minimize visual clutter in 3D vector and tensor fields. The algorithm is able to maintain the overall integrity of the fields and expose previously hidden structures. Our system supports both mouse and direct-touch interactions to manipulate the viewing perspectives and visualize the streamlines in depth. By using a lens metaphor of different shapes to select the transition zone of the targeted area interactively, the users can move their focus and examine the vector or tensor field freely.
View-Dependent Streamline Deformation and Exploration.
Tong, Xin; Edwards, John; Chen, Chun-Ming; Shen, Han-Wei; Johnson, Chris R; Wong, Pak Chung
2016-07-01
Occlusion presents a major challenge in visualizing 3D flow and tensor fields using streamlines. Displaying too many streamlines creates a dense visualization filled with occluded structures, but displaying too few streams risks losing important features. We propose a new streamline exploration approach by visually manipulating the cluttered streamlines by pulling visible layers apart and revealing the hidden structures underneath. This paper presents a customized view-dependent deformation algorithm and an interactive visualization tool to minimize visual clutter in 3D vector and tensor fields. The algorithm is able to maintain the overall integrity of the fields and expose previously hidden structures. Our system supports both mouse and direct-touch interactions to manipulate the viewing perspectives and visualize the streamlines in depth. By using a lens metaphor of different shapes to select the transition zone of the targeted area interactively, the users can move their focus and examine the vector or tensor field freely.
Streamlined design and self reliant hardware for active control of precision space structures
NASA Technical Reports Server (NTRS)
Hyland, David C.; King, James A.; Phillips, Douglas J.
1994-01-01
Precision space structures may require active vibration control to satisfy critical performance requirements relating to line-of-sight pointing accuracy and the maintenance of precise internal alignments. In order for vibration control concepts to become operational, it is necessary that their benefits be practically demonstrated in large-scale ground-based experiments. A unique opportunity to carry out such demonstrations on a wide variety of experimental testbeds was provided by the NASA Control-Structure Integration (CSI) Guest Investigator (GI) Program. This report surveys the experimental results achieved by the Harris Corporation GI team on both Phases 1 and 2 of the program and provides a detailed description of Phase 2 activities. The Phase 1 results illustrated the effectiveness of active vibration control for space structures and demonstrated a systematic methodology for control design, implementation, and test. In Phase 2, this methodology was significantly streamlined to yield an on-site, single-session design/test capability. Moreover, the Phase 2 research on adaptive neural control techniques made significant progress toward fully automated, self-reliant space structure control systems. As a further thrust toward productized, self-contained vibration control systems, the Harris Phase 2 activity concluded with experimental demonstration of new vibration isolation hardware suitable for a wide range of space-flight and ground-based commercial applications. The CSI GI Program Phase 1 activity was conducted under contract NASA1-18872, and the Phase 2 activity was conducted under contract NASA1-19372.
Qiu, Shuming; Xu, Guoai; Ahmad, Haseeb; Guo, Yanhui
2018-01-01
The Session Initiation Protocol (SIP) is an extensive and esteemed communication protocol employed to regulate signaling as well as for controlling multimedia communication sessions. Recently, Kumari et al. proposed an improved smart card based authentication scheme for SIP based on Farash's scheme. Farash claimed that his protocol is resistant against various known attacks. But, we observe some accountable flaws in Farash's protocol. We point out that Farash's protocol is prone to key-compromise impersonation attack and is unable to provide pre-verification in the smart card, efficient password change and perfect forward secrecy. To overcome these limitations, in this paper we present an enhanced authentication mechanism based on Kumari et al.'s scheme. We prove that the proposed protocol not only overcomes the issues in Farash's scheme, but it can also resist against all known attacks. We also provide the security analysis of the proposed scheme with the help of widespread AVISPA (Automated Validation of Internet Security Protocols and Applications) software. At last, comparing with the earlier proposals in terms of security and efficiency, we conclude that the proposed protocol is efficient and more secure.
Tang, Wanchun; Weil, Max Harry; Jorgenson, Dawn; Klouche, Kada; Morgan, Carl; Yu, Ting; Sun, Shijie; Snyder, David
2002-12-01
For adults, 150-J fixed-energy, impedance-compensating biphasic truncated exponential (ICBTE) shocks are now effectively used in automated defibrillators. However, the high energy levels delivered by adult automated defibrillators preclude their use for pediatric patients. Accordingly, we investigated a method by which adult automated defibrillators may be adapted to deliver a 50-J ICBTE shock for pediatric defibrillation. Prospective, randomized study. A university-affiliated research institution. Domestic piglets. We initially investigated four groups of anesthetized mechanically ventilated piglets weighing 3.8, 7.5, 15, and 25 kg. Ventricular fibrillation was induced with an AC current delivered to the right ventricular endocardium. After 7 mins of untreated ventricular fibrillation, a conventional manual defibrillator was used to deliver up to three 50-J ICBTE shocks. If ventricular fibrillation was not reversed, a 1-min interval of precordial compression preceded a second sequence of up to three shocks. The protocol was repeated until spontaneous circulation was restored, or for a total of 15 mins. In a second set of experiments, we evaluated a 150-J biphasic adult automated defibrillator that was operated in conjunction with energy-reducing electrodes such as to deliver 50-J shocks. The same resuscitation protocol was then exercised on piglets weighing 3.7, 13.5, and 24.2 kg. All animals were successfully resuscitated. Postresuscitation hemodynamic and myocardial function quickly returned to baseline values in both experimental groups, and all animals survived. An adaptation of a 150-J biphasic adult automated defibrillator in which energy-reducing electrodes delivered 50-J shocks successfully resuscitated animals ranging from 3.7 to 25 kg without compromise of postresuscitation myocardial function or survival.
Automated High-Throughput Permethylation for Glycosylation Analysis of Biologics Using MALDI-TOF-MS.
Shubhakar, Archana; Kozak, Radoslaw P; Reiding, Karli R; Royle, Louise; Spencer, Daniel I R; Fernandes, Daryl L; Wuhrer, Manfred
2016-09-06
Monitoring glycoprotein therapeutics for changes in glycosylation throughout the drug's life cycle is vital, as glycans significantly modulate the stability, biological activity, serum half-life, safety, and immunogenicity. Biopharma companies are increasingly adopting Quality by Design (QbD) frameworks for measuring, optimizing, and controlling drug glycosylation. Permethylation of glycans prior to analysis by matrix-assisted laser desorption/ionization time-of-flight mass spectrometry (MALDI-TOF-MS) is a valuable tool for glycan characterization and for screening of large numbers of samples in QbD drug realization. However, the existing protocols for manual permethylation and liquid-liquid extraction (LLE) steps are labor-intensive and are thus not practical for high-throughput (HT) studies. Here we present a glycan permethylation protocol, based on 96-well microplates, that has been developed into a kit suitable for HT work. The workflow is largely automated using a liquid handling robot and includes N-glycan release, enrichment of N-glycans, permethylation, and LLE. The kit has been validated according to industry analytical performance guidelines and applied to characterize biopharmaceutical samples, including IgG4 monoclonal antibodies (mAbs) and recombinant human erythropoietin (rhEPO). The HT permethylation enabled glycan characterization and relative quantitation with minimal side reactions: the MALDI-TOF-MS profiles obtained were in good agreement with hydrophilic interaction liquid chromatography (HILIC) and ultrahigh performance liquid chromatography (UHPLC) data. Automated permethylation and extraction of 96 glycan samples were achieved in less than 5 h, and automated data acquisition on MALDI-TOF-MS took on average less than 1 min per sample. This automated HT glycan preparation and permethylation workflow proved to be convenient, fast, and reliable and can be applied to drug glycan profiling and clinical glycan biomarker studies.
Design and Verification of a Distributed Communication Protocol
NASA Technical Reports Server (NTRS)
Munoz, Cesar A.; Goodloe, Alwyn E.
2009-01-01
The safety of remotely operated vehicles depends on the correctness of the distributed protocol that facilitates the communication between the vehicle and the operator. A failure in this communication can result in catastrophic loss of the vehicle. To complicate matters, the communication system may be required to satisfy several, possibly conflicting, requirements. The design of protocols is typically an informal process based on successive iterations of a prototype implementation. Yet distributed protocols are notoriously difficult to get correct using such informal techniques. We present a formal specification of the design of a distributed protocol intended for use in a remotely operated vehicle, which is built from the composition of several simpler protocols. We demonstrate proof strategies that allow us to prove properties of each component protocol individually while ensuring that the property is preserved in the composition forming the entire system. Given that designs are likely to evolve as additional requirements emerge, we show how we have automated most of the repetitive proof steps to enable verification of rapidly changing designs.
Automated measurement of office, home and ambulatory blood pressure in atrial fibrillation.
Kollias, Anastasios; Stergiou, George S
2014-01-01
1. Hypertension and atrial fibrillation (AF) often coexist and are strong risk factors for stroke. Current guidelines for blood pressure (BP) measurement in AF recommend repeated measurements using the auscultatory method, whereas the accuracy of the automated devices is regarded as questionable. This review presents the current evidence on the feasibility and accuracy of automated BP measurement in the presence of AF and the potential for automated detection of undiagnosed AF during such measurements. 2. Studies evaluating the use of automated BP monitors in AF are limited and have significant heterogeneity in methodology and protocols. Overall, the oscillometric method is feasible for static (office or home) and ambulatory use and appears to be more accurate for systolic than diastolic BP measurement. 3. Given that systolic hypertension is particularly common and important in the elderly, the automated BP measurement method may be acceptable for self-home and ambulatory monitoring, but not for professional office or clinic measurement. 4. An embedded algorithm for the detection of asymptomatic AF during routine automated BP measurement with high diagnostic accuracy has been developed and appears to be a useful screening tool for elderly hypertensives. © 2013 Wiley Publishing Asia Pty Ltd.
A fully automated non-external marker 4D-CT sorting algorithm using a serial cine scanning protocol.
Carnes, Greg; Gaede, Stewart; Yu, Edward; Van Dyk, Jake; Battista, Jerry; Lee, Ting-Yim
2009-04-07
Current 4D-CT methods require external marker data to retrospectively sort image data and generate CT volumes. In this work we develop an automated 4D-CT sorting algorithm that operates without the aid of data collected from an external respiratory surrogate. The sorting algorithm requires an overlapping cine scan protocol, which provides a spatial link between couch positions. Beginning with a starting scan position, images from the adjacent scan position (which spatially match the starting scan position) are selected by maximizing the normalized cross correlation (NCC) of the images at the overlapping slice position. The process was continued by 'daisy chaining' all couch positions using the selected images until an entire 3D volume was produced. The algorithm produced 16 phase volumes to complete a 4D-CT dataset. Additional 4D-CT datasets were also produced using external marker amplitude and phase angle sorting methods. The image quality of the volumes produced by the different methods was quantified by calculating the mean difference of the sorted overlapping slices from adjacent couch positions. The NCC-sorted images showed a significant decrease in the mean difference (p < 0.01) for the five patients.
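The slice-matching step lends itself to a compact sketch. The code below is an illustrative reconstruction rather than the authors' implementation (array shapes and names such as candidate_volumes and overlap_index are assumptions): it computes the normalized cross correlation between the overlapping slice of each candidate cine image at the next couch position and the reference slice from the current selection, and keeps the best match for daisy chaining.

    import numpy as np

    def ncc(a, b):
        """Normalized cross correlation between two equally sized 2D slices."""
        a = a - a.mean()
        b = b - b.mean()
        denom = np.sqrt((a * a).sum() * (b * b).sum())
        return float((a * b).sum() / denom) if denom > 0 else 0.0

    def pick_matching_image(ref_overlap_slice, candidate_volumes, overlap_index=0):
        """Choose the candidate cine image whose overlapping slice best matches the reference."""
        scores = [ncc(ref_overlap_slice, vol[overlap_index]) for vol in candidate_volumes]
        best = int(np.argmax(scores))
        return best, scores[best]

    # usage sketch with synthetic data: 10 candidate cine images of 4 slices, 64x64 pixels each
    rng = np.random.default_rng(0)
    candidates = rng.normal(size=(10, 4, 64, 64))
    reference_slice = candidates[3, 0] + 0.05 * rng.normal(size=(64, 64))   # noisy copy of candidate 3
    best_index, best_score = pick_matching_image(reference_slice, candidates)   # -> best_index == 3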
Garrette, Rachel; Jones, Alisha L; Wilson, Martha W
2018-05-15
The purpose of this study was to investigate whether acoustic reflex threshold testing before administration of distortion product otoacoustic emissions can affect the results of distortion product otoacoustic emission testing using an automated protocol. Fifteen young adults with normal hearing, ranging in age from 19 to 25 years, participated in the study. All participants had clear external ear canals and normal Jerger Type A tympanograms and had passed a hearing screening. Testing was performed using the Interacoustics Titan acoustic reflex threshold and distortion product otoacoustic emissions protocol. Participants underwent baseline distortion product otoacoustic emission testing, followed by acoustic reflex threshold testing and repeated distortion product otoacoustic emission measures. A paired-samples t test was conducted for both the right and left ears to assess within-group differences between the baseline and repeated distortion product otoacoustic emission measures. No significant differences were found in distortion product otoacoustic emission measures following administration of acoustic reflexes. A defined test protocol is important when an automated system includes both acoustic reflex and distortion product otoacoustic emission measurements. Overall, presenting acoustic reflexes before measuring distortion product otoacoustic emissions did not affect the results; therefore, the test sequence can be modified as needed.
EHR strategy: top down, bottom up or middle out?
Bowden, Thomas C
2011-01-01
Around the world a number of countries have made a concerted effort to embed Information and Communications Technology (ICT) within their health systems. It is widely acknowledged that the successful application of ICT to health systems can bring about significant benefits. A number of areas commonly singled out for improvement include: coordination of care; improved medication management; and streamlining the transfer of a patient's care from one healthcare provider to another. There are also perceived cost-benefits including reduced duplication of services and improved service utilization. Countries across the world have chosen many and varied paths to automating their health systems. Health systems are intrinsically very complicated and changing rapidly. Because they represent a high proportion of government expenditure, it is important to understand what is being achieved by each of the broad approaches that are being taken.
XMM-Newton mission operations - ready for its third decade
NASA Astrophysics Data System (ADS)
Kirsch, M.; Finn, T.; Godard, T.; v. Krusenstiern, N.; Pfeil, N.; Salt, D.; Toma, L.; Webert, D.; Weissmann, U.
2017-10-01
The XMM-Newton X-ray space observatory is approaching its third decade of operations. The spacecraft and payload are operating without major degradation, and scientific demand remains very high. With a change in 2013 to a new way of using the Attitude and Orbit Control System, fuel consumption was reduced by a factor of two, which has also reduced stress on the reaction wheels. The challenge for the next decade is to ensure that the saved fuel remains available for continuous usage. We describe the process of the so-called 'fuel migration and replenishment' activities needed to keep the spacecraft operational potentially up to 2029+. We also provide an overall health status of the mission, describe the evolution of the ground segment, and present concepts for streamlining mission operations with automation tools while maintaining high safety requirements.
Using CCSDS Standards to Reduce Mission Costs
NASA Technical Reports Server (NTRS)
Wilmot, Jonathan
2017-01-01
NASA's open-source Core Flight System (cFS) software framework has been using several Consultative Committee for Space Data Systems (CCSDS) standards since its inception. Recently developed CCSDS standards are now being applied by NASA, ESA, and other organizations to streamline and automate aspects of mission development, test, and operations, speeding mission schedules and reducing mission costs. This paper will present the new CCSDS Spacecraft Onboard Interfaces Services (SOIS) Electronic Data Sheet (EDS) standards and show how they are being applied to data interfaces in the cFS software framework, tool chain, and ground systems across a range of missions at NASA. Although NASA is focusing on the cFS, it is expected that these technologies are well suited for use in other system architectures and can lower costs for a wide range of both large and small satellites.
78 FR 78948 - Proposed Collection; Comment Request
Federal Register 2010, 2011, 2012, 2013, 2014
2013-12-27
...In compliance with Section 3506(c)(2)(A) of the Paperwork Reduction Act of 1995, and as part of an effort to streamline the process to seek feedback from the public on service delivery, the Department of Defense announces a proposed generic information collection and seeks public comment on the provisions thereof. Comments are invited on: (a) Whether the proposed collection of information is necessary for the proper performance of the functions of the agency, including whether the information shall have practical utility; (b) the accuracy of the agency's estimate of the burden of the proposed information collection; (c) ways to enhance the quality, utility, and clarity of the information to be collected; and (d) ways to minimize the burden of the information collection on respondents, including through the use of automated collection techniques or other forms of information technology.
78 FR 78947 - Proposed Collection; Comment Request
Federal Register 2010, 2011, 2012, 2013, 2014
2013-12-27
...In compliance with Section 3506(c)(2)(A) of the Paperwork Reduction Act of 1995, and as part of an effort to streamline the process to seek feedback from the public on service delivery, the Department of Defense announces a proposed generic information collection and seeks public comment on the provisions thereof. Comments are invited on: (a) Whether the proposed collection of information is necessary for the proper performance of the functions of the agency, including whether the information shall have practical utility; (b) the accuracy of the agency's estimate of the burden of the proposed information collection; (c) ways to enhance the quality, utility, and clarity of the information to be collected; and (d) ways to minimize the burden of the information collection on respondents, including through the use of automated collection techniques or other forms of information technology.
78 FR 78950 - Proposed Collection; Comment Request
Federal Register 2010, 2011, 2012, 2013, 2014
2013-12-27
...In compliance with Section 3506(c)(2)(A) of the Paperwork Reduction Act of 1995, and as part of an effort to streamline the process to seek feedback from the public on service delivery, the Department of Defense announces a proposed generic information collection and seeks public comment on the provisions thereof. Comments are invited on: (a) Whether the proposed collection of information is necessary for the proper performance of the functions of the agency, including whether the information shall have practical utility; (b) the accuracy of the agency's estimate of the burden of the proposed information collection; (c) ways to enhance the quality, utility, and clarity of the information to be collected; and (d) ways to minimize the burden of the information collection on respondents, including through the use of automated collection techniques or other forms of information technology.
78 FR 78938 - Proposed Collection; Comment Request
Federal Register 2010, 2011, 2012, 2013, 2014
2013-12-27
...In compliance with Section 3506(c)(2)(A) of the Paperwork Reduction Act of 1995, and as part of an effort to streamline the process to seek feedback from the public on service delivery, the Department of Defense announces a proposed generic information collection and seeks public comment on the provisions thereof. Comments are invited on: (a) Whether the proposed collection of information is necessary for the proper performance of the functions of the agency, including whether the information shall have practical utility; (b) the accuracy of the agency's estimate of the burden of the proposed information collection; (c) ways to enhance the quality, utility, and clarity of the information to be collected; and (d) ways to minimize the burden of the information collection on respondents, including through the use of automated collection techniques or other forms of information technology.
[Automated anesthesia record systems].
Heinrichs, W; Mönk, S; Eberle, B
1997-07-01
The introduction of electronic anaesthesia documentation systems was attempted as early as 1979, although their efficient application has become reality only in the past few years. The advantages of the electronic protocol are apparent: continuous high-quality documentation, comparability of data due to the availability of a data bank, reduction in the workload of the anaesthetist, and availability of additional data. Disadvantages of the electronic protocol have also been discussed in the literature. By going through the process of entering data on the course of the anaesthetic procedure on the protocol sheet, the information is mentally absorbed and evaluated by the anaesthetist. This information may, however, be lost when the data are recorded fully automatically, without active involvement on the part of the anaesthetist. Recent publications state that by using intelligent alarms and/or integrated displays, manual record keeping is no longer necessary for anaesthesia vigilance. The technical design of automated anaesthesia records depends on an integration of network technology into the hospital. It will be appropriate to connect the systems to the internet, but safety requirements have to be followed strictly. Concerning the database, client-server architecture as well as language standards like SQL should be used. Object-oriented databases will be available in the near future. Another future goal of automated anaesthesia record systems will be the use of knowledge-based technologies within these systems. Drug interactions, disease-related anaesthetic techniques and other information sources can be integrated. At this time, almost none of the commercially available systems has matured to a point where their purchase can be recommended without reservation. There is still a lack of standards for the subsequent exchange of data, and a solution to a number of ergonomic problems still remains to be found. Nevertheless, electronic anaesthesia protocols will be required in the near future. The advantages of accurate documentation and quality control, given careful planning, far outweigh cost considerations.
Job Prospects for Manufacturing Engineers.
ERIC Educational Resources Information Center
Basta, Nicholas
1985-01-01
Coming from a variety of disciplines, manufacturing engineers are keys to industry's efforts to modernize, with demand exceeding supply. The newest and fastest-growing areas include machine vision, composite materials, and manufacturing automation protocols, each of which is briefly discussed. (JN)
AHTD cracking protocol application with automated distress survey for design and management.
DOT National Transportation Integrated Search
2011-03-09
Manual surveys of pavement cracking have problems associated with variability, repeatability, processing speed, and cost. If conducted in the field, safety and related liability of manual survey present challenges to highway agencies. Therefore a...
NASA Technical Reports Server (NTRS)
Sulyma, P. R.; Mcanally, J. V.
1975-01-01
The streamline divergence program was developed to demonstrate the capability to trace inviscid surface streamlines and to calculate outflow-corrected laminar and turbulent convective heating rates on surfaces subjected to exhaust plume impingement. The analytical techniques used in formulating this program are discussed. A brief description of the streamline divergence program is given along with a user's guide. The program input and output for a sample case are also presented.
Aerofoil testing in a self-streamlining flexible walled wind tunnel. Ph.D. Thesis - Jul. 1987
NASA Technical Reports Server (NTRS)
Lewis, Mark Charles
1988-01-01
Two-dimensional self-streamlining flexible walled test sections eliminate, as far as experimentally possible, the top and bottom wall interference effects in transonic airfoil testing. The test section sidewalls are rigid, while the impervious top and bottom walls are flexible and contoured to streamline shapes by a system of jacks, without reference to the airfoil model. The concept of wall contouring to eliminate or minimize test section boundary interference in 2-D testing was first demonstrated by NPL in England during the early 1940s. The transonic streamlining strategy proposed, developed and used by NPL has been compared with several modern strategies. The NPL strategy has proved to be surprisingly good at providing a wall interference-free test environment, giving model performance indistinguishable from that obtained using the modern strategies over a wide range of test conditions. In all previous investigations, wall streamlining in flexible walled test sections had been achieved only for test conditions in which the model's shock at most just extended to a streamlined wall. This work, however, has also successfully demonstrated the feasibility of 2-D wall streamlining at test conditions where both model shocks have reached and penetrated through their respective flexible walls. Appropriate streamlining procedures have been established and are uncomplicated, enabling flexible walled test sections to cope easily with these high transonic flows.
Post, Harm; Penning, Renske; Fitzpatrick, Martin A; Garrigues, Luc B; Wu, W; MacGillavry, Harold D; Hoogenraad, Casper C; Heck, Albert J R; Altelaar, A F Maarten
2017-02-03
Because of the low stoichiometry of protein phosphorylation, targeted enrichment prior to LC-MS/MS analysis is still essential. The trend in phosphoproteome analysis is shifting toward an increasing number of biological replicates per experiment, ideally starting from very low sample amounts, placing new demands on enrichment protocols to make them less labor-intensive, more sensitive, and less prone to variability. Here we assessed an automated enrichment protocol using Fe(III)-IMAC cartridges on an AssayMAP Bravo platform to meet these demands. The automated Fe(III)-IMAC-based enrichment workflow proved to be more effective when compared to a TiO2-based enrichment using the same platform and a manual Ti(IV)-IMAC-based enrichment workflow. As initial samples, a dilution series of both human HeLa cell and primary rat hippocampal neuron lysates was used, going down to 0.1 μg of peptide starting material. The optimized workflow proved to be efficient, sensitive, and reproducible, identifying, localizing, and quantifying thousands of phosphosites from just micrograms of starting material. To further test the automated workflow in genuine biological applications, we monitored EGF-induced signaling in hippocampal neurons, starting with only 200 000 primary cells, resulting in ∼50 μg of protein material. This revealed a comprehensive phosphoproteome, showing regulation of multiple members of the MAPK pathway and reduced phosphorylation status of two glutamate receptors involved in synaptic plasticity.
CASE STUDIES EXAMINING LCA STREAMLINING TECHNIQUES
Pressure is mounting for more streamlined Life Cycle Assessment (LCA) methods that allow for evaluations that are quick and simple, but accurate. As part of an overall research effort to develop and demonstrate streamlined LCA, the U.S. Environmental Protection Agency has funded ...
Acquisition streamlining: A cultural change
NASA Technical Reports Server (NTRS)
Stewart, Jesse
1992-01-01
The topics are presented in viewgraph form and include the following: the defense systems management college, educational philosophy, the defense acquisition environment, streamlining initiatives, organizational streamlining types, defense law review, law review purpose, law review objectives, the Public Law Pilot Program, and cultural change.
Identifying Requirements for Effective Human-Automation Teamwork
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jeffrey C. Joe; John O'Hara; Heather D. Medema
Previous studies have shown that poorly designed human-automation collaboration, such as poorly designed communication protocols, often leads to problems for the human operators, such as lack of vigilance, complacency, and loss of skills. These problems often lead to suboptimal system performance. To address this situation, a considerable amount of research has been conducted to improve human-automation collaboration and to make automation function better as a “team player.” Much of this research is based on an understanding of what it means to be a good team player from the perspective of a human team. However, the research is often based on a simplified view of human teams and teamwork. In this study, we sought to better understand the capabilities and limitations of automation from the standpoint of human teams. We first examined human teams to identify the principles for effective teamwork. We next reviewed the research on integrating automation agents and human agents into mixed agent teams to identify the limitations of automation agents to conform to teamwork principles. This research resulted in insights that can lead to more effective human-automation collaboration by enabling a more realistic set of requirements to be developed based on the strengths and limitations of all agents.
Ford, Andria L; Williams, Jennifer A; Spencer, Mary; McCammon, Craig; Khoury, Naim; Sampson, Tomoko R; Panagos, Peter; Lee, Jin-Moo
2012-12-01
Earlier tissue-type plasminogen activator (tPA) treatment for acute ischemic stroke increases efficacy, prompting national efforts to reduce door-to-needle times. We used lean process improvement methodology to develop a streamlined intravenous tPA protocol. In early 2011, a multidisciplinary team analyzed the steps required to treat patients with acute ischemic stroke with intravenous tPA using value stream analysis (VSA). We directly compared the tPA-treated patients in the "pre-VSA" epoch with the "post-VSA" epoch with regard to baseline characteristics, protocol metrics, and clinical outcomes. The VSA revealed several tPA protocol inefficiencies: routing of patients to room, then to CT, then back to the room; serial processing of workflow; and delays in waiting for laboratory results. On March 1, 2011, a new protocol incorporated changes to minimize delays: routing patients directly to head CT before the patient room, using parallel process workflow, and implementing point-of-care laboratories. In the pre and post-VSA epochs, 132 and 87 patients were treated with intravenous tPA, respectively. Compared with pre-VSA, door-to-needle times and percent of patients treated ≤60 minutes from hospital arrival were improved in the post-VSA epoch: 60 minutes versus 39 minutes (P<0.0001) and 52% versus 78% (P<0.0001), respectively, with no change in symptomatic hemorrhage rate. Lean process improvement methodology can expedite time-dependent stroke care without compromising safety.
Reducing Door-to-Needle Times using Toyota’s Lean Manufacturing Principles and Value Stream Analysis
Ford, Andria L.; Williams, Jennifer A.; Spencer, Mary; McCammon, Craig; Khoury, Naim; Sampson, Tomoko; Panagos, Peter; Lee, Jin-Moo
2012-01-01
Background Earlier tPA treatment for acute ischemic stroke increases efficacy, prompting national efforts to reduce door-to-needle times (DNTs). We utilized lean process improvement methodology to develop a streamlined IV tPA protocol. Methods In early 2011, a multi-disciplinary team analyzed the steps required to treat acute ischemic stroke patients with IV tPA, utilizing value stream analysis (VSA). We directly compared the tPA-treated patients in the “pre-VSA” epoch to the “post-VSA” epoch with regard to baseline characteristics, protocol metrics, and clinical outcomes. Results The VSA revealed several tPA protocol inefficiencies: routing of patients to room, then to CT, then back to room; serial processing of work flow; and delays in waiting for lab results. On 3/1/2011, a new protocol incorporated changes to minimize delays: routing patients directly to head CT prior to patient room, utilizing parallel process work-flow, and implementing point-of-care labs. In the pre-and post-VSA epochs, 132 and 87 patients were treated with IV tPA, respectively. Compared to pre-VSA, DNTs and percent of patients treated ≤60 minutes from hospital arrival were improved in the post-VSA epoch: 60 min vs. 39 min (p<0.0001) and 52% vs. 78% (p<0.0001), respectively, with no change in symptomatic hemorrhage rate. Conclusions Lean process improvement methodology can expedite time-dependent stroke care, without compromising safety. PMID:23138440
NASA Astrophysics Data System (ADS)
Polan, Daniel F.; Brady, Samuel L.; Kaufman, Robert A.
2016-09-01
There is a need for robust, fully automated whole-body organ segmentation for diagnostic CT. This study investigates and optimizes a Random Forest algorithm for automated organ segmentation; explores the limitations of a Random Forest algorithm applied to the CT environment; and demonstrates segmentation accuracy in a feasibility study of pediatric and adult patients. To the best of our knowledge, this is the first study to investigate a trainable Weka segmentation (TWS) implementation using Random Forest machine learning as a means to develop a fully automated tissue segmentation tool specifically for pediatric and adult examinations in a diagnostic CT environment. Current innovation in computed tomography (CT) is focused on radiomics, patient-specific radiation dose calculation, and image quality improvement using iterative reconstruction, all of which require specific knowledge of tissue and organ systems within a CT image. The purpose of this study was to develop a fully automated Random Forest classifier algorithm for segmentation of neck-chest-abdomen-pelvis CT examinations based on pediatric and adult CT protocols. Seven materials were classified: background, lung/internal air or gas, fat, muscle, solid organ parenchyma, blood/contrast-enhanced fluid, and bone tissue, using Matlab and the TWS plugin of FIJI. The following classifier feature filters of TWS were investigated: minimum, maximum, mean, and variance, each evaluated over voxel radii of 2^n (n from 0 to 4), along with the noise reduction and edge-preserving filters Gaussian, bilateral, Kuwahara, and anisotropic diffusion. The Random Forest algorithm used 200 trees with 2 features randomly selected per node. The optimized auto-segmentation algorithm resulted in 16 image features, including features derived from the maximum, mean, variance, Gaussian, and Kuwahara filters. Dice similarity coefficient (DSC) calculations between manually segmented and Random Forest algorithm segmented images from 21 patient image sections were analyzed. The automated algorithm produced segmentation of seven material classes with a median DSC of 0.86 ± 0.03 for pediatric patient protocols and 0.85 ± 0.04 for adult patient protocols. Additionally, 100 randomly selected patient examinations were segmented and analyzed, and a mean sensitivity of 0.91 (range: 0.82-0.98), specificity of 0.89 (range: 0.70-0.98), and accuracy of 0.90 (range: 0.76-0.98) were demonstrated. In this study, we demonstrate that this fully automated segmentation tool was able to produce fast and accurate segmentation of the neck and trunk of the body over a wide range of patient habitus and scan parameters.
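For readers who want a concrete feel for the approach, the sketch below is a loose Python analog of the voxel-classification idea; the study itself used Matlab and the TWS plugin of FIJI, so the filter set and helper names here are assumptions rather than the authors' pipeline. It builds per-voxel features from minimum, maximum, mean, variance, and Gaussian filters at radii 2^n and trains a random forest with 200 trees and 2 features per node, matching the settings reported above.

    import numpy as np
    from scipy import ndimage
    from sklearn.ensemble import RandomForestClassifier

    def voxel_features(volume, radii=(1, 2, 4, 8, 16)):
        """Per-voxel feature stack loosely analogous to the TWS filters (illustrative only)."""
        volume = np.asarray(volume, dtype=np.float32)
        feats = [volume]
        for r in radii:
            size = 2 * r + 1
            local_mean = ndimage.uniform_filter(volume, size=size)
            feats.append(ndimage.minimum_filter(volume, size=size))
            feats.append(ndimage.maximum_filter(volume, size=size))
            feats.append(local_mean)
            feats.append(ndimage.uniform_filter(volume**2, size=size) - local_mean**2)  # local variance
            feats.append(ndimage.gaussian_filter(volume, sigma=r))
        return np.stack(feats, axis=-1).reshape(-1, len(feats))

    def train_classifier(ct_volume, label_volume):
        """Fit a random forest on voxels with manual labels 0..6 (-1 marks unlabeled voxels)."""
        X = voxel_features(ct_volume)
        y = label_volume.reshape(-1)
        labeled = y >= 0
        clf = RandomForestClassifier(n_estimators=200, max_features=2, n_jobs=-1)
        return clf.fit(X[labeled], y[labeled])

    def segment(clf, ct_volume):
        """Predict one of the seven material classes for every voxel."""
        return clf.predict(voxel_features(ct_volume)).reshape(np.asarray(ct_volume).shape)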
Chaudhry, Waseem; Hussain, Nasir; Ahlberg, Alan W.; Croft, Lori B.; Fernandez, Antonio B.; Parker, Mathew W.; Swales, Heather H.; Slomka, Piotr J.; Henzlova, Milena J.; Duvall, W. Lane
2016-01-01
Background A stress-first myocardial perfusion imaging (MPI) protocol saves time, is cost effective, and decreases radiation exposure. A limitation of this protocol is the requirement for physician review of the stress images to determine the need for rest images. This hurdle could be eliminated if an experienced technologist and/or automated computer quantification could make this determination. Methods Images from consecutive patients who were undergoing a stress-first MPI with attenuation correction at two tertiary care medical centers were prospectively reviewed independently by a technologist and cardiologist blinded to clinical and stress test data. Their decision on the need for rest imaging along with automated computer quantification of perfusion results was compared with the clinical reference standard of an assessment of perfusion images by a board-certified nuclear cardiologist that included clinical and stress test data. Results A total of 250 patients (mean age 61 years and 55% female) who underwent a stress-first MPI were studied. According to the clinical reference standard, 42 (16.8%) and 208 (83.2%) stress-first images were interpreted as “needing” and “not needing” rest images, respectively. The technologists correctly classified 229 (91.6%) stress-first images as either “needing” (n = 28) or “not needing” (n = 201) rest images. Their sensitivity, specificity, positive predictive value (PPV), and negative predictive value (NPV) were 66.7%, 96.6%, 80.0%, and 93.5%, respectively. An automated stress TPD score ≥1.2 was associated with optimal sensitivity and specificity and correctly classified 179 (71.6%) stress-first images as either “needing” (n = 31) or “not needing” (n = 148) rest images. Its sensitivity, specificity, PPV, and NPV were 73.8%, 71.2%, 34.1%, and 93.1%, respectively. In a model whereby the computer or technologist could correct for the other's incorrect classification, 242 (96.8%) stress-first images were correctly classified. The composite sensitivity, specificity, PPV, and NPV were 83.3%, 99.5%, 97.2%, and 96.7%, respectively. Conclusion Technologists and automated quantification software had a high degree of agreement with the clinical reference standard for determining the need for rest images in a stress-first imaging protocol. Utilizing an experienced technologist and automated systems to screen stress-first images could expand the use of stress-first MPI to sites where the cardiologist is not immediately available for interpretation. PMID:26566774
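The agreement statistics quoted here follow directly from a 2x2 table against the clinical reference standard, with "needing rest images" as the positive class. The snippet below shows the arithmetic; the counts are inferred from the technologists' results reported above (28 of 42 "needing" and 201 of 208 "not needing" studies classified correctly) and are included only to illustrate the calculation.

    def diagnostic_metrics(tp, fp, fn, tn):
        """Sensitivity, specificity, PPV and NPV from a 2x2 classification table."""
        return {
            "sensitivity": tp / (tp + fn),
            "specificity": tn / (tn + fp),
            "ppv": tp / (tp + fp),
            "npv": tn / (tn + fn),
        }

    # technologist vs. clinical reference standard (counts inferred from the abstract)
    print(diagnostic_metrics(tp=28, fp=7, fn=14, tn=201))
    # -> sensitivity 0.667, specificity 0.966, ppv 0.800, npv 0.935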
Jonas, Wayne B; Crawford, Cindy; Hilton, Lara; Elfenbaum, Pamela
2017-01-01
Answering the question of "what works" in healthcare can be complex and requires the careful design and sequential application of systematic methodologies. Over the last decade, the Samueli Institute has, along with multiple partners, developed a streamlined, systematic, phased approach to this process called the Scientific Evaluation and Review of Claims in Health Care (SEaRCH™). The SEaRCH process provides an approach for rigorously, efficiently, and transparently making evidence-based decisions about healthcare claims in research and practice with minimal bias. SEaRCH uses three methods combined in a coordinated fashion to help determine what works in healthcare. The first, the Claims Assessment Profile (CAP), seeks to clarify the healthcare claim and question, and its ability to be evaluated in the context of its delivery. The second method, the Rapid Evidence Assessment of the Literature (REAL©), is a streamlined, systematic review process conducted to determine the quantity, quality, and strength of evidence and risk/benefit for the treatment. The third method involves the structured use of expert panels (EPs). There are several types of EPs, depending on the purpose and need. Together, these three methods (CAP, REAL, and EP) can be integrated into a strategic approach to help answer the question "what works in healthcare?" and what it means in a comprehensive way. SEaRCH is a systematic, rigorous approach for evaluating healthcare claims of therapies, practices, programs, or products in an efficient and stepwise fashion. It provides an iterative, protocol-driven process that is customized to the intervention, consumer, and context. Multiple communities, including those involved in health service and policy, can benefit from this organized framework, assuring that evidence-based principles determine which healthcare practices with the greatest promise are used for improving the public's health and wellness.
77 FR 14700 - Streamlining Inherited Regulations
Federal Register 2010, 2011, 2012, 2013, 2014
2012-03-13
... contains notices to the public of the proposed issuance of rules and regulations. The purpose of these... X [Docket No. CFPB-2011-0039] Streamlining Inherited Regulations AGENCY: Bureau of Consumer... the public for streamlining regulations it recently inherited from other Federal agencies (the...
Synthesis of many different types of organic small molecules using one automated process.
Li, Junqi; Ballmer, Steven G; Gillis, Eric P; Fujii, Seiko; Schmidt, Michael J; Palazzolo, Andrea M E; Lehmann, Jonathan W; Morehouse, Greg F; Burke, Martin D
2015-03-13
Small-molecule synthesis usually relies on procedures that are highly customized for each target. A broadly applicable automated process could greatly increase the accessibility of this class of compounds to enable investigations of their practical potential. Here we report the synthesis of 14 distinct classes of small molecules using the same fully automated process. This was achieved by strategically expanding the scope of a building block-based synthesis platform to include even C(sp3)-rich polycyclic natural product frameworks and discovering a catch-and-release chromatographic purification protocol applicable to all of the corresponding intermediates. With thousands of compatible building blocks already commercially available, many small molecules are now accessible with this platform. More broadly, these findings illuminate an actionable roadmap to a more general and automated approach for small-molecule synthesis. Copyright © 2015, American Association for the Advancement of Science.
Reddy, Alavalapati Goutham; Das, Ashok Kumar; Odelu, Vanga; Yoo, Kee-Young
2016-01-01
Biometric-based authentication protocols for multi-server architectures have gained momentum in recent times due to advancements in wireless technologies and associated constraints. Lu et al. recently proposed a robust biometric-based authentication with key agreement protocol for a multi-server environment using smart cards. They claimed that their protocol is efficient and resistant to prominent security attacks. The careful investigation of this paper proves that Lu et al.'s protocol does not provide user anonymity or perfect forward secrecy and is susceptible to server and user impersonation attacks, man-in-the-middle attacks, and clock synchronization problems. In addition, this paper proposes an enhanced biometric-based authentication with key-agreement protocol for multi-server architecture based on elliptic curve cryptography using smartcards. We proved that the proposed protocol achieves mutual authentication using Burrows-Abadi-Needham (BAN) logic. The formal security of the proposed protocol is verified using the AVISPA (Automated Validation of Internet Security Protocols and Applications) tool to show that our protocol can withstand active and passive attacks. The formal and informal security analyses and the performance analysis demonstrate that the proposed protocol is robust and efficient compared to Lu et al.'s protocol and existing similar protocols. PMID:27163786
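The perfect forward secrecy property at issue in these schemes is typically obtained from ephemeral elliptic-curve Diffie-Hellman exchanges. The sketch below is a generic illustration of that primitive using the Python cryptography package; it is not the protocol proposed in this paper, and the transcript label passed to the key derivation step is an arbitrary placeholder.

    # Generic ephemeral ECDH illustration (not the paper's protocol).
    from cryptography.hazmat.primitives import hashes
    from cryptography.hazmat.primitives.asymmetric import ec
    from cryptography.hazmat.primitives.kdf.hkdf import HKDF

    def ephemeral_keypair():
        priv = ec.generate_private_key(ec.SECP256R1())   # fresh per session, never stored
        return priv, priv.public_key()

    def session_key(my_priv, peer_pub, transcript=b"client|server|nonces"):
        shared = my_priv.exchange(ec.ECDH(), peer_pub)   # raw ECDH shared secret
        return HKDF(algorithm=hashes.SHA256(), length=32, salt=None,
                    info=transcript).derive(shared)      # 256-bit session key

    # Each side contributes an ephemeral key pair; both derive the same session key.
    c_priv, c_pub = ephemeral_keypair()
    s_priv, s_pub = ephemeral_keypair()
    assert session_key(c_priv, s_pub) == session_key(s_priv, c_pub)
    # Because the ephemeral private keys are discarded after the session, a later
    # compromise of long-term credentials cannot recover past session keys.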
Carter, Catherine F; Lange, Heiko; Sakai, Daiki; Baxendale, Ian R; Ley, Steven V
2011-03-14
Diastereoselective chain-elongation reactions are important transformations for the assembly of complex molecular structures, such as those present in polyketide natural products. Here we report new methods for performing crotylation reactions and homopropargylation reactions by using newly developed low-temperature flow-chemistry technology. In-line purification protocols are described, as well as the application of the crotylation protocol in an automated multi-step sequence. Copyright © 2011 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
1982-02-23
...segregate the computer and storage from the outside world; 2. Administrative security to control access to secure computer facilities; 3. Network security to... [Figure A-2: Dedicated Switching Architecture Alternative; elements shown: network, KG, GENSER, DSSCS, AMPE, terminals] ...handles communications protocol with the network and GENSER message transmission to the I-S/A AMPE processor. 7. DSSCS TPU - Handles communications protocol with...
Suzuki, Kenji; Epstein, Mark L.; Kohlbrenner, Ryan; Garg, Shailesh; Hori, Masatoshi; Oto, Aytekin; Baron, Richard L.
2014-01-01
OBJECTIVE The purpose of this study was to evaluate automated CT volumetry in the assessment of living-donor livers for transplant and to compare this technique with software-aided interactive volumetry and manual volumetry. MATERIALS AND METHODS Hepatic CT scans of 18 consecutively registered prospective liver donors were obtained under a liver transplant protocol. Automated liver volumetry was developed on the basis of 3D active-contour segmentation. To establish reference standard liver volumes, a radiologist manually traced the contour of the liver on each CT slice. We compared the results obtained with automated and interactive volumetry with those obtained with the reference standard for this study, manual volumetry. RESULTS The average interactive liver volume was 1553 ± 343 cm3, and the average automated liver volume was 1520 ± 378 cm3. The average manual volume was 1486 ± 343 cm3. Both interactive and automated volumetric results had excellent agreement with manual volumetric results (intraclass correlation coefficients, 0.96 and 0.94). The average user time for automated volumetry was 0.57 ± 0.06 min/case, whereas those for interactive and manual volumetry were 27.3 ± 4.6 and 39.4 ± 5.5 min/case, the difference being statistically significant (p < 0.05). CONCLUSION Both interactive and automated volumetry are accurate for measuring liver volume with CT, but automated volumetry is substantially more efficient. PMID:21940543
Suzuki, Kenji; Epstein, Mark L; Kohlbrenner, Ryan; Garg, Shailesh; Hori, Masatoshi; Oto, Aytekin; Baron, Richard L
2011-10-01
The purpose of this study was to evaluate automated CT volumetry in the assessment of living-donor livers for transplant and to compare this technique with software-aided interactive volumetry and manual volumetry. Hepatic CT scans of 18 consecutively registered prospective liver donors were obtained under a liver transplant protocol. Automated liver volumetry was developed on the basis of 3D active-contour segmentation. To establish reference standard liver volumes, a radiologist manually traced the contour of the liver on each CT slice. We compared the results obtained with automated and interactive volumetry with those obtained with the reference standard for this study, manual volumetry. The average interactive liver volume was 1553 ± 343 cm(3), and the average automated liver volume was 1520 ± 378 cm(3). The average manual volume was 1486 ± 343 cm(3). Both interactive and automated volumetric results had excellent agreement with manual volumetric results (intraclass correlation coefficients, 0.96 and 0.94). The average user time for automated volumetry was 0.57 ± 0.06 min/case, whereas those for interactive and manual volumetry were 27.3 ± 4.6 and 39.4 ± 5.5 min/case, the difference being statistically significant (p < 0.05). Both interactive and automated volumetry are accurate for measuring liver volume with CT, but automated volumetry is substantially more efficient.
Jones, Darryl R; Thomas, Dallas; Alger, Nicholas; Ghavidel, Ata; Inglis, G Douglas; Abbott, D Wade
2018-01-01
Deposition of new genetic sequences in online databases is expanding at an unprecedented rate. As a result, sequence identification continues to outpace functional characterization of carbohydrate active enzymes (CAZymes). In this paradigm, the discovery of enzymes with novel functions is often hindered by high volumes of uncharacterized sequences, particularly when the enzyme sequence belongs to a family that exhibits diverse functional specificities (i.e., polyspecificity). Therefore, to direct sequence-based discovery and characterization of new enzyme activities, we have developed an automated in silico pipeline entitled Sequence Analysis and Clustering of CarboHydrate Active enzymes for Rapid Informed prediction of Specificity (SACCHARIS). This pipeline streamlines the selection of uncharacterized sequences for discovery of new CAZyme or CBM specificity from families currently maintained on the CAZy website or within user-defined datasets. SACCHARIS was used to generate a phylogenetic tree of GH43, a CAZyme family with defined subfamily designations. This analysis confirmed that large datasets can be organized into sequence clusters of manageable sizes that possess related functions. Seeding this tree with a GH43 sequence from Bacteroides dorei DSM 17855 (BdGH43b) revealed that it partitioned as a single sequence within the tree. This pattern was consistent with it possessing a unique enzyme activity for GH43, as BdGH43b is the first α-glucanase described for this family. The capacity of SACCHARIS to extract and cluster characterized carbohydrate binding module sequences was demonstrated using family 6 CBMs (i.e., CBM6s). This CBM family displays a polyspecific ligand binding profile and contains many structurally determined members. Using SACCHARIS to identify a cluster of divergent sequences, a CBM6 sequence from a unique clade was demonstrated to bind yeast mannan, which represents the first description of an α-mannan binding CBM. Additionally, we have performed a CAZome analysis of an in-house sequenced bacterial genome and a comparative analysis of B. thetaiotaomicron VPI-5482 and B. thetaiotaomicron 7330 to demonstrate that SACCHARIS can generate "CAZome fingerprints", which differentiate between the saccharolytic potential of two related strains in silico. Establishing sequence-function and sequence-structure relationships in polyspecific CAZyme families is a promising approach for streamlining enzyme discovery. SACCHARIS facilitates this process by embedding CAZyme and CBM family trees generated from biochemically or structurally characterized sequences with protein sequences that have unknown functions. In addition, these trees can be integrated with user-defined datasets (e.g., genomics, metagenomics, and transcriptomics) to inform experimental characterization of new CAZymes or CBMs not currently curated, and for researchers to compare differential sequence patterns between entire CAZomes. In this light, SACCHARIS provides an in silico tool that can be tailored for enzyme bioprospecting in datasets of increasing complexity and for diverse applications in glycobiotechnology.
Lo, Sheng-Ying; Baird, Geoffrey S; Greene, Dina N
2015-12-07
Proper utilization of resources is an important operational objective for clinical laboratories. To reduce unnecessary manual interventions on automated instruments, we conducted a workflow analysis that optimized dilution parameters and reporting of abnormally high chemistry results for the Beckman AU series of chemistry analyzers while maintaining clinically acceptable reportable ranges. Workflow analysis for the Beckman AU680/5812 and DxC800 chemistry analyzers was performed using historical data. Clinical reportable ranges for 53 chemistry analytes were evaluated. Optimized dilution parameters and upper limit of reportable ranges for the AU680/5812 instruments were derived and validated to meet these reportable ranges. The number of specimens that required manual dilutions before and after optimization was determined for both the AU680/5812 and DxC800, with the DxC800 serving as the reference instrument. Retrospective data analysis revealed that 7700 specimens required manual dilutions on the DxC over a 2-y period. Using our optimized AU-specific dilution and reporting parameters, the data-driven simulation analysis showed a 61% reduction in manual dilutions. For the specimens that required manual dilutions on the AU680/5812, we developed standardized dilution procedures to further streamline workflow. We provide a data-driven, practical outline for clinical laboratories to efficiently optimize their use of automated chemistry analyzers. The outcomes can be used to assist laboratories wishing to improve their existing procedures or to facilitate transitioning into a new line of instrumentation, regardless of the instrument model or manufacturer. Copyright © 2015 Elsevier B.V. All rights reserved.
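The kind of data-driven simulation described above can be expressed in a few lines. The sketch below counts how many historical results would still exceed a proposed upper limit of the automated dilution/reporting range and therefore require a manual dilution; the column names, analytes, and limit values are hypothetical, not the study's parameters.

    import pandas as pd

    # Hypothetical historical results (analytes, units, and limits are illustrative assumptions).
    history = pd.DataFrame({
        "analyte": ["ALT", "ALT", "glucose", "glucose", "glucose"],
        "result":  [1250,  9800,  480,       1450,      2600],
    })
    auto_upper_limit = {"ALT": 8000, "glucose": 1600}   # proposed optimized auto-dilution limits

    def manual_dilutions_needed(df, limits):
        """Count specimens whose result exceeds the automated reportable range for the analyte."""
        return int((df["result"] > df["analyte"].map(limits)).sum())

    print(manual_dilutions_needed(history, auto_upper_limit))   # -> 2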
Novel microneutralization assay for HCMV using automated data collection and analysis.
Abai, Anna Maria; Smith, Larry R; Wloch, Mary K
2007-04-30
In addition to being sensitive and specific, an assay for the assessment of neutralizing antibody activity from clinical trial samples must be amenable to automation for use in high-volume screening. To that effect, we developed a 96-well microplate assay for the measurement of HCMV-neutralizing activity in human sera using the HCMV-permissive human cell line HEL-299 and the laboratory strain of HCMV AD169. The degree to which neutralizing antibodies diminish HCMV infection of cells in the assay is determined by quantifying the nuclei of infected cells based on expression of the 72 kDa IE1 viral protein. Nuclear IE1 is visualized using highly sensitive immunoperoxidase staining, and the stained nuclei are counted using an automated ELISPOT analyzer. The use of Half Area 96-well microplates, with wells in which the surface area of the well bottom is half that of a standard 96-well microplate, improves signal detection compared with standard microplates and economizes on the usage of indicator cells, virus, and reagents. The staining process was also streamlined by using a microplate washer, and data analysis was simplified and accelerated by employing a software program that automatically plots neutralization curves and determines NT(50) values using 4-PL curve fitting. The optimized assay is not only fast and convenient, but also specific, sensitive, precise and reproducible and thus has the characteristics necessary for use in measuring HCMV-neutralizing activity in the sera of vaccine trial subjects such as the recipients of Vical's HCMV pDNA vaccine candidates.
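The 4-PL curve fitting and NT(50) interpolation mentioned above can be sketched with scipy. The counts and dilutions below are invented, and NT(50) is taken here as the serum dilution giving the half-maximal fitted response; this is an illustration of the general technique, not the assay's own analysis software.

```python
# Sketch of 4-parameter logistic (4-PL) fitting of a neutralization curve
# and interpolation of NT50 (illustrative data, not assay software output).
import numpy as np
from scipy.optimize import curve_fit

def four_pl(x, bottom, top, nt50, hill):
    """4-PL model: infected-nuclei count as a function of serum dilution factor x."""
    return bottom + (top - bottom) / (1.0 + (nt50 / x) ** hill)

dilutions = np.array([20, 40, 80, 160, 320, 640, 1280, 2560], dtype=float)
infected_nuclei = np.array([55, 70, 160, 390, 720, 930, 990, 1010], dtype=float)

params, _ = curve_fit(four_pl, dilutions, infected_nuclei,
                      p0=[50, 1000, 200, 1.0], maxfev=10000)
bottom, top, nt50, hill = params

# NT50: dilution at which the fitted response is halfway between bottom and top.
print(f"NT50 ~ 1:{nt50:.0f} (Hill slope {hill:.2f})")
```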
Mathur, Gagan; Haugen, Thomas H; Davis, Scott L; Krasowski, Matthew D
2014-01-01
Interfacing of clinical laboratory instruments with the laboratory information system (LIS) via "middleware" software is increasingly common. Our clinical laboratory implemented capillary electrophoresis using a Sebia(®) Capillarys-2™ (Norcross, GA, USA) instrument for serum and urine protein electrophoresis. Using Data Innovations Instrument Manager, an interface was established with the LIS (Cerner) that allowed for bi-directional transmission of numeric data. However, the text of the interpretive pathology report was not properly transferred. To reduce manual effort and possibility for error in text data transfer, we developed scripts in AutoHotkey, a free, open-source macro-creation and automation software utility. Scripts were written to create macros that automated mouse and key strokes. The scripts retrieve the specimen accession number, capture user input text, and insert the text interpretation in the correct patient record in the desired format. The scripts accurately and precisely transfer narrative interpretation into the LIS. Combined with bar-code reading by the electrophoresis instrument, the scripts transfer data efficiently to the correct patient record. In addition, the AutoHotkey scripts automated repetitive key strokes required for manual entry into the LIS, making protein electrophoresis sign-out easier to learn and faster to use by the pathology residents. Scripts allow for either preliminary verification by residents or final sign-out by the attending pathologist. Using the open-source AutoHotkey software, we successfully improved the transfer of text data between capillary electrophoresis software and the LIS. The use of open-source software tools should not be overlooked as tools to improve interfacing of laboratory instruments.
La Prairie, A J; Gross, M
1991-02-01
The banking of femoral heads from patients who undergo total hip arthroplasty provides a valuable resource for orthopedic surgery. Quality assurance of the banked bone used in clinical procedures requires documented policies for screening, procuring, storing and distributing. Potential donors are screened at the time of donation for malignant disease, possible communicable disease, sepsis and high-risk lifestyles. After negative culture results are confirmed and appropriate documentation has been completed, the bone is frozen at -70 degrees C. A quarantine period of 90 days follows. The donor is followed up 90 days or more postoperatively. At that time written consent is obtained for donation of the recovered tissue to the bone bank and for serology testing for human immunodeficiency virus (HIV-1) antibody, hepatitis B surface antigen (HBsAg), hepatitis B core antibody (HBcAb) and syphilis, and the donor is rescreened for contraindications. This protocol meets or exceeds all existing standards. The combination of obtaining consent and serology testing at 90 days streamlines the logistics of banking bone from surgical donors.
Del Mazo-Barbara, Anna; Mirabel, Clémentine; Nieto, Valentín; Reyes, Blanca; García-López, Joan; Oliver-Vila, Irene; Vives, Joaquim
2016-09-01
Computerized systems (CS) are essential in the development and manufacture of cell-based medicines and must comply with good manufacturing practice, thus pushing academic developers to implement methods that are typically found within pharmaceutical industry environments. Qualitative and quantitative risk analyses were performed by Ishikawa and Failure Mode and Effects Analysis, respectively. A process for qualification of a CS that keeps track of environmental conditions was designed and executed. The simplicity of the Ishikawa analysis permitted the identification of critical parameters that were subsequently quantified by Failure Mode and Effects Analysis, resulting in a list of tests included in the qualification protocols. The approach presented here helps to simplify and streamline the qualification of CS in compliance with pharmaceutical quality standards.
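The quantitative step of a Failure Mode and Effects Analysis is typically a Risk Priority Number (severity x occurrence x detectability) used to rank failure modes and decide which ones warrant a dedicated qualification test. The sketch below shows that ranking step; the failure modes, scores, and threshold are invented for illustration and are not taken from the paper.

```python
# Sketch of the FMEA ranking step: Risk Priority Number (RPN) = severity x
# occurrence x detectability, used to decide which failure modes need a
# qualification test. Scores and threshold are illustrative only.
failure_modes = [
    # (description, severity 1-10, occurrence 1-10, detectability 1-10)
    ("Temperature probe drift",            8, 4, 6),
    ("Network outage stops data logging",  7, 3, 2),
    ("Alarm e-mail not delivered",         6, 2, 7),
]

RPN_THRESHOLD = 100  # failure modes at or above this get a dedicated test

ranked = sorted(((s * o * d, name) for name, s, o, d in failure_modes), reverse=True)
for rpn, name in ranked:
    action = "include in qualification protocol" if rpn >= RPN_THRESHOLD else "monitor"
    print(f"{name}: RPN={rpn} -> {action}")
```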
Programs Automate Complex Operations Monitoring
NASA Technical Reports Server (NTRS)
2009-01-01
Kennedy Space Center, just off the east coast of Florida on Merritt Island, has been the starting place of every human space flight in NASA's history. It is where the first Americans left Earth during Project Mercury, the terrestrial departure point of the lunar-bound Apollo astronauts, as well as the last solid ground many astronauts set foot on before beginning their long stays aboard the International Space Station. It will also be the starting point for future NASA missions to the Moon and Mars and temporary host of the new Ares series rockets designed to take us there. Since the first days of the early NASA missions, in order to keep up with the demands of the intricate and critical Space Program, the launch complex - host to the large Vehicle Assembly Building, two launch pads, and myriad support facilities - has grown increasingly complex to accommodate the sophisticated technologies needed to manage today's space missions. To handle the complicated launch coordination safely, NASA found ways to automate mission-critical applications, resulting in streamlined decision-making. One of these methods, management software called the Control Monitor Unit (CMU), created in conjunction with McDonnell Douglas Space & Defense Systems, has since left NASA, and is finding its way into additional applications.
StrAuto: automation and parallelization of STRUCTURE analysis.
Chhatre, Vikram E; Emerson, Kevin J
2017-03-24
Population structure inference using the software STRUCTURE has become an integral part of population genetic studies covering a broad spectrum of taxa including humans. The ever-expanding size of genetic data sets poses computational challenges for this analysis. Although at least one tool currently implements parallel computing to reduce the computational overload of this analysis, it does not fully automate the use of replicate STRUCTURE analysis runs required for downstream inference of optimal K. There is a pressing need for a tool that can deploy population structure analysis on high performance computing clusters. We present an updated version of the popular Python program StrAuto, to streamline population structure analysis using parallel computing. StrAuto implements a pipeline that combines STRUCTURE analysis with the Evanno ΔK analysis and visualization of results using STRUCTURE HARVESTER. Using benchmarking tests, we demonstrate that StrAuto significantly reduces the computational time needed to perform iterative STRUCTURE analysis by distributing runs over two or more processors. StrAuto is the first tool to integrate STRUCTURE analysis with post-processing using a pipeline approach in addition to implementing parallel computation - a set up ideal for deployment on computing clusters. StrAuto is distributed under the GNU GPL (General Public License) and available to download from http://strauto.popgen.org.
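The Evanno ΔK statistic referred to above is computed from the mean and standard deviation of the STRUCTURE log-likelihoods over replicate runs at each K. The sketch below shows that calculation on invented log-likelihood values; it is not StrAuto code, only an illustration of the underlying formula.

```python
# Sketch of the Evanno et al. (2005) delta-K calculation from replicate
# STRUCTURE runs: deltaK(K) = |mean(L''(K))| / sd(L(K)), where
# L''(K) = L(K+1) - 2*L(K) + L(K-1). Log-likelihood values are illustrative.
import numpy as np

# ln P(D) for replicate runs at K = 1..5 (keys: K, values: replicates).
lnp = {
    1: [-4100, -4102, -4099],
    2: [-3900, -3895, -3910],
    3: [-3650, -3640, -3660],
    4: [-3645, -3643, -3647],
    5: [-3642, -3641, -3644],
}

ks = sorted(lnp)
mean_l = {k: np.mean(lnp[k]) for k in ks}
sd_l = {k: np.std(lnp[k], ddof=1) for k in ks}

for k in ks[1:-1]:  # deltaK is undefined at the smallest and largest K tested
    second_diff = mean_l[k + 1] - 2 * mean_l[k] + mean_l[k - 1]
    delta_k = abs(second_diff) / sd_l[k]
    print(f"K={k}: deltaK={delta_k:.1f}")
```

With these made-up values, the largest ΔK falls at K = 3, which is how the statistic is used to pick the most supported number of clusters.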
EPICS controlled sample mounting robots at the GM/CA CAT.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Makarov, O. A.; Benn, R.; Corcoran, S.
2007-11-11
GM/CA CAT at Sector 23 of the Advanced Photon Source (APS) is an NIH-funded facility for crystallographic structure determination of biological macromolecules by X-ray diffraction [R.F. Fischetti, et al., GM/CA canted undulator beamlines for protein crystallography, Acta Crystallogr. A 61 (2005) C139]. The facility consists of three beamlines; two based on canted undulators and one on a bending magnet. The scientific and technical goals of the CAT emphasize streamlined, efficient throughput for a variety of sample types, sizes and qualities, representing the cutting edge of structural biology research. For this purpose all three beamlines are equipped with the ALS-style robots [C.W. Cork, et al., Status of the BCSB automated sample mounting and alignment system for macromolecular crystallography at the Advanced Light Source, SRI-2003, San Francisco, CA, USA, August 25-29, 2003] for automated mounting of cryo-protected macromolecular crystals. This report summarizes software and technical solutions implemented with the first of the three operational robots at beamline 23-ID-B. The automounter's Dewar can hold up to 72 or 96 samples residing in six Rigaku ACTOR magazines or ALS-style pucks, respectively. Mounting of a crystal takes approximately 2 s, during which time the temperature of the crystal is maintained near that of liquid nitrogen.
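EPICS-controlled hardware of this kind is commonly driven from Python through the pyepics caget/caput interface to Channel Access process variables (PVs). The sketch below illustrates that general pattern for a mount request; the PV names, value conventions, and timeout are invented placeholders and are not the actual 23-ID-B records.

```python
# Sketch of driving a sample-mounting sequence over EPICS Channel Access
# with pyepics. The process-variable (PV) names are invented placeholders,
# not the real beamline records.
import time
from epics import caget, caput

PUCK_PV   = "23IDB:robot:puck"      # hypothetical PV: puck number
PORT_PV   = "23IDB:robot:port"      # hypothetical PV: sample port
MOUNT_PV  = "23IDB:robot:mount"     # hypothetical PV: 1 = start mount
STATUS_PV = "23IDB:robot:status"    # hypothetical PV: 0 = idle, 1 = busy

def mount_sample(puck, port, timeout=120.0):
    caput(PUCK_PV, puck, wait=True)
    caput(PORT_PV, port, wait=True)
    caput(MOUNT_PV, 1, wait=True)          # trigger the automounter
    deadline = time.time() + timeout
    while time.time() < deadline:
        if caget(STATUS_PV) == 0:          # robot reports idle again
            return True
        time.sleep(0.5)
    raise TimeoutError("Sample mount did not complete")

mount_sample(puck=3, port=7)
```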
Mühlebach, Anneke; Adam, Joachim; Schön, Uwe
2011-11-01
Automated medicinal chemistry (parallel chemistry) has become an integral part of the drug-discovery process in almost every large pharmaceutical company. Parallel array synthesis of individual organic compounds has been used extensively to generate diverse structural libraries to support different phases of the drug-discovery process, such as hit-to-lead, lead finding, or lead optimization. In order to guarantee effective project support, efficiency in the production of compound libraries has been maximized. As a consequence, throughput in chromatographic purification and analysis has also been adapted. As a recent trend, more laboratories are preparing smaller, yet more focused libraries with ever increasing demands on quality, i.e. optimal purity and unambiguous confirmation of identity. This paper presents an automated approach for combining effective purification and structural confirmation of a lead optimization library created by microwave-assisted organic synthesis. The results of complementary analytical techniques such as UHPLC-HRMS and NMR are not only regarded but even merged for fast and easy decision making, providing optimal quality of compound stock. In comparison with the previous procedures, throughput times are at least four times faster, while compound consumption could be decreased more than threefold. Copyright © 2011 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Improving Initiation and Tracking of Research Projects at an Academic Health Center: A Case Study.
Schmidt, Susanne; Goros, Martin; Parsons, Helen M; Saygin, Can; Wan, Hung-Da; Shireman, Paula K; Gelfond, Jonathan A L
2017-09-01
Research service cores at academic health centers are important in driving translational advancements. Specifically, biostatistics and research design units provide services and training in data analytics, biostatistics, and study design. However, the increasing demand and complexity of assigning appropriate personnel to time-sensitive projects strains existing resources, potentially decreasing productivity and increasing costs. Improving processes for project initiation, assigning appropriate personnel, and tracking time-sensitive projects can eliminate bottlenecks and utilize resources more efficiently. In this case study, we describe our application of lean six sigma principles to our biostatistics unit to establish a systematic continual process improvement cycle for intake, allocation, and tracking of research design and data analysis projects. The define, measure, analyze, improve, and control methodology was used to guide the process improvement. Our goal was to assess and improve the efficiency and effectiveness of operations by objectively measuring outcomes, automating processes, and reducing bottlenecks. As a result, we developed a web-based dashboard application to capture, track, categorize, streamline, and automate project flow. Our workflow system resulted in improved transparency, efficiency, and workload allocation. Using the dashboard application, we reduced the average study intake time from 18 to 6 days, a 66.7% reduction over 12 months (January to December 2015).
Why physicians need to be more than automated medical kiosks.
Bynum, William
2014-02-01
The last 20 years have seen an unprecedented technological revolution, including the development of the personal computer. The new technologies that have emerged during this age of innovation have allowed human beings to connect widely with one another through electronic media and have made life more efficient and streamlined. Likewise, this technological renaissance has helped to define medicine as one of the most innovative professions by providing physicians with diagnostics and interventions that are more accurate, efficacious, and safe, to the benefit of physicians and the public. However, in both life and the practice of medicine, these new technologies have had the unintended consequence of reducing the value of direct human connection and threaten to isolate individuals in spite of advancing society. In this commentary, the author argues that human beings need to make a more concerted effort to connect with each other through both enhanced communication technologies and direct human contact. Likewise, leaders in medicine need to embrace and promote technological advancement while at the same time working to maintain the human connection that physicians have with their patients and teaching learners to do the same. Doing so will prevent physicians from becoming automated medical kiosks that offer sound, innovative medical advice but that lack the personality, compassion, and emotion that will lead to better health.
Using PATIMDB to Create Bacterial Transposon Insertion Mutant Libraries
Urbach, Jonathan M.; Wei, Tao; Liberati, Nicole; Grenfell-Lee, Daniel; Villanueva, Jacinto; Wu, Gang; Ausubel, Frederick M.
2015-01-01
PATIMDB is a software package for facilitating the generation of transposon mutant insertion libraries. The software has two main functions: process tracking and automated sequence analysis. The process tracking function specifically includes recording the status and fates of multiwell plates and samples in various stages of library construction. Automated sequence analysis refers specifically to the pipeline of sequence analysis starting with ABI files from a sequencing facility and ending with insertion location identifications. The protocols in this unit describe installation and use of PATIMDB software. PMID:19343706
Evaluation of four automated protocols for extraction of DNA from FTA cards.
Stangegaard, Michael; Børsting, Claus; Ferrero-Miliani, Laura; Frank-Hansen, Rune; Poulsen, Lena; Hansen, Anders J; Morling, Niels
2013-10-01
Extraction of DNA using magnetic bead-based techniques on automated DNA extraction instruments provides a fast, reliable, and reproducible method for DNA extraction from various matrices. Here, we have compared the yield and quality of DNA extracted from FTA cards using four automated extraction protocols on three different instruments. The extraction processes were repeated up to six times with the same pieces of FTA cards. The sample material on the FTA cards was either blood or buccal cells. With the QIAamp DNA Investigator and QIAsymphony DNA Investigator kits, it was possible to extract DNA from the FTA cards in all six rounds of extractions in sufficient amount and quality to obtain complete short tandem repeat (STR) profiles on a QIAcube and a QIAsymphony SP. With the PrepFiler Express kit, almost all the extractable DNA was extracted in the first two rounds of extractions. Furthermore, we demonstrated that it was possible to successfully extract sufficient DNA for STR profiling from previously processed FTA card pieces that had been stored at 4 °C for up to 1 year. This showed that rare or precious FTA card samples may be saved for future analyses even though some DNA was already extracted from the FTA cards.
SWIFT MODELLER: a Java based GUI for molecular modeling.
Mathur, Abhinav; Shankaracharya; Vidyarthi, Ambarish S
2011-10-01
MODELLER is command-line-driven software that requires tedious formatting of inputs and the writing of Python scripts, with which most people are not comfortable. The visualization of output also becomes cumbersome due to verbose files. This makes the whole software protocol very complex and requires extensive study of MODELLER manuals and tutorials. Here we describe SWIFT MODELLER, a GUI that automates the formatting, scripting and data extraction processes and presents them in an interactive way, making MODELLER much easier to use than before. The screens in SWIFT MODELLER are designed with homology modeling in mind, and their flow is a depiction of its steps. It eliminates the formatting of inputs, scripting processes and analysis of verbose output files through automation and makes pasting of the target sequence the only prerequisite. Jmol (a 3D structure visualization tool) has been integrated into the GUI, which opens and displays the Protein Data Bank files created by the MODELLER software. All files required and created by the software are saved in a folder named after the work instance's date and time of execution. SWIFT MODELLER lowers the skill level required for the software through automation of many of the steps in the original software protocol, thus saving an enormous amount of time per instance and making MODELLER very easy to work with.
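The scripts that such a GUI generates correspond to MODELLER's standard automodel workflow. A minimal hand-written equivalent is sketched below as an illustration of the kind of script being automated; the alignment file name, template code, and target name are placeholders, and this is not the exact script SWIFT MODELLER emits.

```python
# Sketch of a basic MODELLER homology-modelling script of the kind that
# SWIFT MODELLER generates. Alignment file, template code ("1abcA") and
# target name ("target") are placeholders.
from modeller import environ
from modeller.automodel import automodel

env = environ()
env.io.atom_files_directory = ["."]           # where the template PDB files live

a = automodel(env,
              alnfile="target_template.ali",  # PIR alignment of target vs. template
              knowns="1abcA",                 # placeholder template code
              sequence="target")              # target entry name in the alignment
a.starting_model = 1
a.ending_model = 5                            # build five candidate models
a.make()                                      # writes target.B9999000*.pdb files
```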
Re-refinement from deposited X-ray data can deliver improved models for most PDB entries.
Joosten, Robbie P; Womack, Thomas; Vriend, Gert; Bricogne, Gérard
2009-02-01
The deposition of X-ray data along with the customary structural models defining PDB entries makes it possible to apply large-scale re-refinement protocols to these entries, thus giving users the benefit of improvements in X-ray methods that have occurred since the structure was deposited. Automated gradient refinement is an effective method to achieve this goal, but real-space intervention is most often required in order to adequately address problems detected by structure-validation software. In order to improve the existing protocol, automated re-refinement was combined with structure validation and difference-density peak analysis to produce a catalogue of problems in PDB entries that are amenable to automatic correction. It is shown that re-refinement can be effective in producing improvements, which are often associated with the systematic use of the TLS parameterization of B factors, even for relatively new and high-resolution PDB entries, while the accompanying manual or semi-manual map analysis and fitting steps show good prospects for eventual automation. It is proposed that the potential for simultaneous improvements in methods and in re-refinement results be further encouraged by broadening the scope of depositions to include refinement metadata and ultimately primary rather than reduced X-ray data.
ACHP | News | Nationwide Programmatic Agreement Streamlines 106 Process for NPS
On November 14, 2008, the National Park Service (NPS) executed a nationwide Programmatic Agreement (PA) with the Advisory Council on Historic Preservation (ACHP) that streamlines the Section 106 process for NPS.
Streamlining genomes: toward the generation of simplified and stabilized microbial systems.
Leprince, Audrey; van Passel, Mark W J; dos Santos, Vitor A P Martins
2012-10-01
At the junction between systems and synthetic biology, genome streamlining provides a solid foundation both for increased understanding of cellular circuitry and for the tailoring of microbial chassis towards innovative biotechnological applications. Iterative genomic deletions (targeted and random) help to generate simplified, stabilized and predictable genomes, whereas multiplexed genome engineering reveals a broad functional genetic diversity. The decrease in oligo and gene synthesis costs promises effective combinatorial tools for the generation of chassis based on streamlined and tractable genomes. Here we review recent progress in streamlining genomes through recombineering techniques, aimed at generating insights into cellular mechanisms and responses and at guiding the design and assembly of streamlined genome chassis together with new cellular modules for diverse biotechnological applications. Copyright © 2012 Elsevier Ltd. All rights reserved.
Enhanced Performance of Streamline-Traced External-Compression Supersonic Inlets
NASA Technical Reports Server (NTRS)
Slater, John W.
2015-01-01
A computational design study was conducted to enhance the aerodynamic performance of streamline-traced, external-compression inlets for Mach 1.6. Compared to traditional external-compression, two-dimensional and axisymmetric inlets, streamline-traced inlets promise reduced cowl wave drag and sonic boom, but at the expense of reduced total pressure recovery and increased total pressure distortion. The current study explored a new parent flowfield for the streamline tracing and several variations of inlet design factors, including the axial displacement and angle of the subsonic cowl lip, the vertical placement of the engine axis, and the use of porous bleed in the subsonic diffuser. The performance was enhanced over that of an earlier streamline-traced inlet such as to increase the total pressure recovery and reduce total pressure distortion.
The Carnegie Mellon/Sirsi Corporation Alliance.
ERIC Educational Resources Information Center
Troll, Denise A.; Depellegrin, Tracey A.; Myers, Melanie D.
1999-01-01
Describes the relationship between Carnegie Mellon University libraries and Sirsi Corporation, their integrated library-management system vendor. Topics include Carnegie Mellon's expertise in library automation research and development; and three primary elements of the alliance: research, including user protocols, surveys, and focus groups;…
A lightweight and secure two factor anonymous authentication protocol for Global Mobility Networks.
Baig, Ahmed Fraz; Hassan, Khwaja Mansoor Ul; Ghani, Anwar; Chaudhry, Shehzad Ashraf; Khan, Imran; Ashraf, Muhammad Usman
2018-01-01
Global Mobility Networks (GLOMONETs) in wireless communication permit global roaming services that enable a user to leverage mobile services in any foreign country. Technological growth in wireless communication is also accompanied by new security threats and challenges. A threat-proof authentication protocol in wireless communication may overcome the security flaws by allowing only legitimate users to access a particular service. Recently, Lee et al. found Mun et al.'s scheme vulnerable to different attacks and proposed an advanced secure scheme to overcome the security flaws. However, this article points out that Lee et al.'s scheme lacks user anonymity and local password verification, provides inefficient user authentication, and is vulnerable to replay and DoS attacks. Furthermore, this article presents a more robust anonymous authentication scheme to handle the threats and challenges found in Lee et al.'s protocol. The proposed protocol is formally verified with an automated tool (ProVerif). The proposed protocol has superior efficiency in comparison to the existing protocols.
Enhanced Multi-Modal Access to Planetary Exploration
NASA Technical Reports Server (NTRS)
Lamarra, Norm; Doyle, Richard; Wyatt, Jay
2003-01-01
Tomorrow's Interplanetary Network (IPN) will evolve from JPL's Deep-Space Network (DSN) and provide key capabilities to future investigators, such as simplified acquisition of higher-quality science at remote sites and enriched access to these sites. These capabilities could also be used to foster public interest, e.g., by making it possible for students to explore these environments personally, eventually perhaps interacting with a virtual world whose models could be populated by data obtained continuously from the IPN. Our paper looks at JPL's approach to making this evolution happen, starting from improved communications. Evolving space protocols (e.g., today's CCSDS proximity and file-transfer protocols) will provide the underpinning of such communications in the next decades, just as today's rich web was enabled by progress in Internet Protocols starting from the early 1970's (ARPAnet research). A key architectural thrust of this effort is to deploy persistent infrastructure incrementally, using a layered service model, where later higher-layer capabilities (such as adaptive science planning) are enabled by earlier lower-layer services (such as automated routing of object-based messages). In practice, there is also a mind shift needed from an engineering culture raised on point-to-point single-function communications (command uplink, telemetry downlink), to one in which assets are only indirectly accessed, via well-defined interfaces. We are aiming to foster a 'community of access' both among space assets and the humans who control them. This enables appropriate (perhaps eventually optimized) sharing of services and resources to the greater benefit of all participants. We envision such usage to be as automated in the future as using a cell phone is today - with all the steps in creating the real-time link being automated.
2018-01-01
The Session Initiation Protocol (SIP) is a widely used communication protocol employed to regulate signaling and to control multimedia communication sessions. Recently, Kumari et al. proposed an improved smart card based authentication scheme for SIP based on Farash's scheme. Farash claimed that his protocol is resistant against various known attacks. However, we observe some notable flaws in Farash's protocol. We point out that Farash's protocol is prone to a key-compromise impersonation attack and does not provide pre-verification in the smart card, efficient password change, or perfect forward secrecy. To overcome these limitations, in this paper we present an enhanced authentication mechanism based on Kumari et al.'s scheme. We prove that the proposed protocol not only overcomes the issues in Farash's scheme, but it can also resist all known attacks. We also provide the security analysis of the proposed scheme with the help of the widespread AVISPA (Automated Validation of Internet Security Protocols and Applications) software. Finally, comparing with earlier proposals in terms of security and efficiency, we conclude that the proposed protocol is efficient and more secure. PMID:29547619
Comparability of automated human induced pluripotent stem cell culture: a pilot study.
Archibald, Peter R T; Chandra, Amit; Thomas, Dave; Chose, Olivier; Massouridès, Emmanuelle; Laâbi, Yacine; Williams, David J
2016-12-01
Consistent and robust manufacturing is essential for the translation of cell therapies, and the utilisation of automation throughout the manufacturing process may allow for improvements in quality control, scalability, reproducibility and economics of the process. The aim of this study was to measure and establish the comparability between alternative process steps for the culture of hiPSCs. Consequently, the effects of manual centrifugation and automated non-centrifugation process steps, performed using TAP Biosystems' CompacT SelecT automated cell culture platform, upon the culture of a human induced pluripotent stem cell (hiPSC) line (VAX001024c07) were compared. This study has demonstrated that comparable morphologies and cell diameters were observed in hiPSCs cultured using either manual or automated process steps. However, non-centrifugation hiPSC populations exhibited greater cell yields, greater aggregate rates, increased pluripotency marker expression, and decreased differentiation marker expression compared to centrifugation hiPSCs. A trend for decreased variability in cell yield was also observed after the utilisation of the automated process step. This study also highlights the detrimental effect of the cryopreservation and thawing processes upon the growth and characteristics of hiPSC cultures, and demonstrates that automated hiPSC manufacturing protocols can be successfully transferred between independent laboratories.
Szydzik, C; Gavela, A F; Herranz, S; Roccisano, J; Knoerzer, M; Thurgood, P; Khoshmanesh, K; Mitchell, A; Lechuga, L M
2017-08-08
A primary limitation preventing practical implementation of photonic biosensors within point-of-care platforms is their integration with fluidic automation subsystems. For most diagnostic applications, photonic biosensors require complex fluid handling protocols; this is especially prominent in the case of competitive immunoassays, commonly used for detection of low-concentration, low-molecular weight biomarkers. For this reason, complex automated microfluidic systems are needed to realise the full point-of-care potential of photonic biosensors. To fulfil this requirement, we propose an on-chip valve-based microfluidic automation module, capable of automating such complex fluid handling. This module is realised through application of a PDMS injection moulding fabrication technique, recently described in our previous work, which enables practical fabrication of normally closed pneumatically actuated elastomeric valves. In this work, these valves are configured to achieve multiplexed reagent addressing for an on-chip diaphragm pump, providing the sample and reagent processing capabilities required for automation of cyclic competitive immunoassays. Application of this technique simplifies fabrication and introduces the potential for mass production, bringing point-of-care integration of complex automated microfluidics into the realm of practicality. This module is integrated with a highly sensitive, label-free bimodal waveguide photonic biosensor, and is demonstrated in the context of a proof-of-concept biosensing assay, detecting the low-molecular weight antibiotic tetracycline.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Aykac, Deniz; Chaum, Edward; Fox, Karen
A telemedicine network with retina cameras and automated quality control, physiological feature location, and lesion/anomaly detection is a low-cost way of achieving broad-based screening for diabetic retinopathy (DR) and other eye diseases. In the process of a routine eye-screening examination, other non-image data is often available which may be useful in automated diagnosis of disease. In this work, we report on the results of combining this non-image data with image data, using the protocol and processing steps of a prototype system for automated disease diagnosis of retina examinations from a telemedicine network. The system includes quality assessments, automated physiology detection, and automated lesion detection to create an archive of known cases. Non-image data such as diabetes onset date and hemoglobin A1c (HgA1c) for each patient examination are included as well, and the system is used to create a content-based image retrieval engine capable of automated diagnosis of disease into 'normal' and 'abnormal' categories. The system achieves a sensitivity and specificity of 91.2% and 71.6% using hold-one-out validation testing.
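The sensitivity and specificity figures quoted above follow directly from the confusion matrix of the 'normal'/'abnormal' calls under hold-one-out validation. The short sketch below shows that bookkeeping on made-up labels, purely as an illustration of the metric definitions.

```python
# Sketch of the sensitivity/specificity bookkeeping for a binary
# normal/abnormal screening classifier (labels below are made up).
def sensitivity_specificity(truth, predicted, positive="abnormal"):
    tp = sum(t == positive and p == positive for t, p in zip(truth, predicted))
    tn = sum(t != positive and p != positive for t, p in zip(truth, predicted))
    fp = sum(t != positive and p == positive for t, p in zip(truth, predicted))
    fn = sum(t == positive and p != positive for t, p in zip(truth, predicted))
    return tp / (tp + fn), tn / (tn + fp)

truth     = ["abnormal", "abnormal", "normal", "normal",   "abnormal", "normal"]
predicted = ["abnormal", "normal",   "normal", "abnormal", "abnormal", "normal"]
sens, spec = sensitivity_specificity(truth, predicted)
print(f"sensitivity={sens:.1%}, specificity={spec:.1%}")
```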
Sweet, Burgunda V; Tamer, Helen R; Siden, Rivka; McCreadie, Scott R; McGregory, Michael E; Benner, Todd; Tankanow, Roberta M
2008-05-15
The development of a computerized system for protocol management, dispensing, inventory accountability, and billing by the investigational drug service (IDS) of a university health system is described. After an unsuccessful search for a commercial system that would accommodate the variation among investigational protocols and meet regulatory requirements, the IDS worked with the health-system pharmacy's information technology staff and informatics pharmacists to develop its own system. The informatics pharmacists observed work-flow and information capture in the IDS and identified opportunities for improved efficiency with an automated system. An iterative build-test-design process was used to provide the flexibility needed for individual protocols. The intent was to design a system that would support most IDS processes, using components that would allow automated backup and redundancies. A browser-based system was chosen to allow remote access. Servers, bar-code scanners, and printers were integrated into the final system design. Initial implementation involved 10 investigational protocols chosen on the basis of dispensing volume and complexity of study design. Other protocols were added over a two-year period; all studies whose drugs were dispensed from the IDS were added, followed by those for which the drugs were dispensed from decentralized pharmacy areas. The IDS briefly used temporary staff to free pharmacist and technician time for system implementation. Decentralized pharmacy areas that rarely dispense investigational drugs continue to use manual processes, with subsequent data transcription into the system. Through the university's technology transfer division, the system was licensed by an external company for sale to other IDSs. The WebIDS system has improved daily operations, enhanced safety and efficiency, and helped meet regulatory requirements for investigational drugs.
Round, A. R.; Franke, D.; Moritz, S.; Huchler, R.; Fritsche, M.; Malthan, D.; Klaering, R.; Svergun, D. I.; Roessle, M.
2008-01-01
There is a rapidly increasing interest in the use of synchrotron small-angle X-ray scattering (SAXS) for large-scale studies of biological macromolecules in solution, and this requires an adequate means of automating the experiment. A prototype has been developed of an automated sample changer for solution SAXS, where the solutions are kept in thermostatically controlled well plates allowing for operation with up to 192 samples. The measuring protocol involves controlled loading of protein solutions and matching buffers, followed by cleaning and drying of the cell between measurements. The system was installed and tested at the X33 beamline of the EMBL, at the storage ring DORIS-III (DESY, Hamburg), where it was used by over 50 external groups during 2007. At X33, a throughput of approximately 12 samples per hour, with a failure rate of sample loading of less than 0.5%, was observed. The feedback from users indicates that the ease of use and reliability of the user operation at the beamline were greatly improved compared with the manual filling mode. The changer is controlled by a client–server-based network protocol, locally and remotely. During the testing phase, the changer was operated in an attended mode to assess its reliability and convenience. Full integration with the beamline control software, allowing for automated data collection of all samples loaded into the machine with remote control from the user, is presently being implemented. The approach reported is not limited to synchrotron-based SAXS but can also be used on laboratory and neutron sources. PMID:25484841
Tulip, Jennifer; Zimmermann, Jonas B; Farningham, David; Jackson, Andrew
2017-06-15
Behavioural training through positive reinforcement techniques is a well-recognised refinement to laboratory animal welfare. Behavioural neuroscience research requires subjects to be trained to perform repetitions of specific behaviours for food/fluid reward. Some animals fail to perform at a sufficient level, limiting the amount of data that can be collected and increasing the number of animals required for each study. We have implemented automated positive reinforcement training systems (comprising a button press task with variable levels of difficulty using LED cues and a fluid reward) at the breeding facility and research facility, to compare performance across these different settings, to pre-screen animals for selection and refine training protocols. Animals learned 1- and 4-choice button tasks within weeks of home enclosure training, with some inter-individual differences. High performance levels (∼200-300 trials per 60min session at ∼80% correct) were obtained without food or fluid restriction. Moreover, training quickly transferred to a laboratory version of the task. Animals that acquired the task at the breeding facility subsequently performed better both in early home enclosure sessions upon arrival at the research facility, and also in laboratory sessions. Automated systems at the breeding facility may be used to pre-screen animals for suitability for behavioural neuroscience research. In combination with conventional training, both the breeding and research facility systems facilitate acquisition and transference of learning. Automated systems have the potential to refine training protocols and minimise requirements for food/fluid control. Copyright © 2017 The Authors. Published by Elsevier B.V. All rights reserved.
Moenninghoff, Christoph; Umutlu, Lale; Kloeters, Christian; Ringelstein, Adrian; Ladd, Mark E; Sombetzki, Antje; Lauenstein, Thomas C; Forsting, Michael; Schlamann, Marc
2013-06-01
Workflow efficiency and workload of radiological technologists (RTs) were compared in head examinations performed with two 1.5 T magnetic resonance (MR) scanners equipped with or without an automated user interface called "day optimizing throughput" (Dot) workflow engine. Thirty-four patients with known intracranial pathology were examined with a 1.5 T MR scanner with Dot workflow engine (Siemens MAGNETOM Aera) and with a 1.5 T MR scanner with conventional user interface (Siemens MAGNETOM Avanto) using four standardized examination protocols. The elapsed time for all necessary work steps, which were performed by 11 RTs within the total examination time, was compared for each examination at both MR scanners. The RTs evaluated the user-friendliness of both scanners by a questionnaire. Normality of distribution was checked for all continuous variables by use of the Shapiro-Wilk test. Normally distributed variables were analyzed by Student's paired t-test, otherwise Wilcoxon signed-rank test was used to compare means. Total examination time of MR examinations performed with Dot engine was reduced from 24:53 to 20:01 minutes (P < .001) and the necessary RT intervention decreased by 61% (P < .001). The Dot engine's automated choice of MR protocols was significantly better assessed by the RTs than the conventional user interface (P = .001). According to this preliminary study, the Dot workflow engine is a time-saving user assistance software, which decreases the RTs' effort significantly and may help to automate neuroradiological examinations for a higher workflow efficiency. Copyright © 2013 AUR. Published by Elsevier Inc. All rights reserved.
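The statistical comparison described above (a Shapiro-Wilk normality check followed by either a paired t-test or a Wilcoxon signed-rank test) maps directly onto scipy.stats. The sketch below shows that decision logic on invented per-examination timings; it is an illustration of the approach, not the study's analysis code.

```python
# Sketch of the paired comparison described above: check normality of the
# paired differences with Shapiro-Wilk, then use a paired t-test or the
# Wilcoxon signed-rank test. Timings (minutes) are invented.
import numpy as np
from scipy import stats

time_conventional = np.array([25.1, 24.4, 26.0, 23.8, 25.5, 24.9, 26.3, 25.0])
time_dot          = np.array([20.3, 19.8, 21.0, 19.5, 20.7, 20.1, 21.4, 20.0])

differences = time_conventional - time_dot
_, p_normal = stats.shapiro(differences)

if p_normal > 0.05:                       # differences look normally distributed
    stat, p_value = stats.ttest_rel(time_conventional, time_dot)
    test = "paired t-test"
else:
    stat, p_value = stats.wilcoxon(time_conventional, time_dot)
    test = "Wilcoxon signed-rank"
print(f"{test}: statistic={stat:.2f}, p={p_value:.4f}")
```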
A review of mechanical move sprinkler irrigation control and automation technologies
USDA-ARS?s Scientific Manuscript database
Electronic sensors, equipment controls, and communication protocols have been developed to meet the growing interest in site-specific irrigation using center pivot and lateral move irrigation systems. Onboard and field-distributed sensors can collect data necessary for real-time irrigation manageme...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Schreibmann, E; Shu, H; Cordova, J
Purpose: We report on an automated segmentation algorithm for defining radiation therapy target volumes using spectroscopic MR images (sMRI) acquired at a nominal voxel resolution of 100 microliters. Methods: Whole-brain sMRI combining 3D echo-planar spectroscopic imaging, generalized auto-calibrating partially-parallel acquisitions, and elliptical k-space encoding was conducted on a 3T MRI scanner with a 32-channel head coil array. Metabolite maps generated include choline (Cho), creatine (Cr), and N-acetylaspartate (NAA), as well as Cho/NAA, Cho/Cr, and NAA/Cr ratio maps. Automated segmentation was achieved by concomitantly considering sMRI metabolite maps with standard contrast-enhancing (CE) imaging in a pipeline that first uses the water signal for skull stripping. Subsequently, an initial blob of tumor region is identified by searching for regions of FLAIR abnormalities that also display reduced NAA activity, using a mean ratio correlation and morphological filters. These regions are used as the starting point for a geodesic level-set refinement that adapts the initial blob to the fine details specific to each metabolite. Results: Accuracy of the segmentation model was tested on a cohort of 12 patients who had sMRI datasets acquired pre-, mid-, and post-treatment, providing a broad range of enhancement patterns. Compared to classical imaging, where heterogeneity in tumor appearance and shape posed a greater challenge to the algorithm, regions of abnormal activity were easily detected in the sMRI metabolite maps when combining the detail available in the standard imaging with the local enhancement produced by the metabolites. Results can be imported into treatment planning, leading in general to an increase in the target volumes (GTV60) when using sMRI+CE MRI compared to the standard CE MRI alone. Conclusion: Integration of automated segmentation of sMRI metabolite maps into planning is feasible and will likely streamline acceptance of this new acquisition modality in clinical practice.
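The initial blob-detection step described above (combining a FLAIR abnormality mask with an elevated Cho/NAA ratio and cleaning the result with morphological filters) can be sketched with NumPy and scikit-image. The thresholds, toy volumes, and function name below are assumptions for illustration, and the subsequent geodesic level-set refinement is not reproduced.

```python
# Sketch of the initial tumour-blob detection step: combine a FLAIR
# abnormality mask with an elevated Cho/NAA ratio, then clean the result
# with morphological filters. Thresholds are illustrative; the geodesic
# level-set refinement described above is not shown here.
import numpy as np
from skimage import morphology

def initial_blob(flair, cho, naa, flair_thresh=1.3, ratio_thresh=2.0):
    """flair, cho, naa: co-registered 3D arrays of identical shape."""
    ratio = cho / np.clip(naa, 1e-6, None)            # avoid division by zero
    candidate = (flair > flair_thresh) & (ratio > ratio_thresh)
    candidate = morphology.binary_opening(candidate, morphology.ball(1))
    candidate = morphology.remove_small_objects(candidate, min_size=50)
    return candidate

# Toy volumes just to show the call signature.
shape = (32, 32, 32)
rng = np.random.default_rng(0)
flair = rng.normal(1.0, 0.1, shape); flair[10:20, 10:20, 10:20] += 0.6
cho = np.ones(shape); cho[10:20, 10:20, 10:20] = 3.0
naa = np.ones(shape)
print(initial_blob(flair, cho, naa).sum(), "voxels in the initial blob")
```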
Härmä, Ville; Schukov, Hannu-Pekka; Happonen, Antti; Ahonen, Ilmari; Virtanen, Johannes; Siitari, Harri; Åkerfelt, Malin; Lötjönen, Jyrki; Nees, Matthias
2014-01-01
Glandular epithelial cells differentiate into complex multicellular or acinar structures, when embedded in three-dimensional (3D) extracellular matrix. The spectrum of different multicellular morphologies formed in 3D is a sensitive indicator for the differentiation potential of normal, non-transformed cells compared to different stages of malignant progression. In addition, single cells or cell aggregates may actively invade the matrix, utilizing epithelial, mesenchymal or mixed modes of motility. Dynamic phenotypic changes involved in 3D tumor cell invasion are sensitive to specific small-molecule inhibitors that target the actin cytoskeleton. We have used a panel of inhibitors to demonstrate the power of automated image analysis as a phenotypic or morphometric readout in cell-based assays. We introduce a streamlined stand-alone software solution that supports large-scale high-content screens, based on complex and organotypic cultures. AMIDA (Automated Morphometric Image Data Analysis) allows quantitative measurements of large numbers of images and structures, with a multitude of different spheroid shapes, sizes, and textures. AMIDA supports an automated workflow, and can be combined with quality control and statistical tools for data interpretation and visualization. We have used a representative panel of 12 prostate and breast cancer lines that display a broad spectrum of different spheroid morphologies and modes of invasion, challenged by a library of 19 direct or indirect modulators of the actin cytoskeleton which induce systematic changes in spheroid morphology and differentiation versus invasion. These results were independently validated by 2D proliferation, apoptosis and cell motility assays. We identified three drugs that primarily attenuated the invasion and formation of invasive processes in 3D, without affecting proliferation or apoptosis. Two of these compounds block Rac signalling, one affects cellular cAMP/cGMP accumulation. Our approach supports the growing needs for user-friendly, straightforward solutions that facilitate large-scale, cell-based 3D assays in basic research, drug discovery, and target validation. PMID:24810913
Evaluating conflation methods using uncertainty modeling
NASA Astrophysics Data System (ADS)
Doucette, Peter; Dolloff, John; Canavosio-Zuzelski, Roberto; Lenihan, Michael; Motsko, Dennis
2013-05-01
The classic problem of computer-assisted conflation involves the matching of individual features (e.g., point, polyline, or polygon vectors) as stored in a geographic information system (GIS), between two different sets (layers) of features. The classical goal of conflation is the transfer of feature metadata (attributes) from one layer to another. The age of free public and open source geospatial feature data has significantly increased the opportunity to conflate such data to create enhanced products. There are currently several spatial conflation tools in the marketplace with varying degrees of automation. An ability to evaluate conflation tool performance quantitatively is of operational value, although manual truthing of matched features is laborious and costly. In this paper, we present a novel methodology that uses spatial uncertainty modeling to simulate realistic feature layers to streamline evaluation of feature matching performance for conflation methods. Performance results are compiled for DCGIS street centerline features.
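The core evaluation idea described above (simulate a second feature layer from a known one using a spatial-uncertainty model, then score how often a matcher recovers the true correspondence) can be sketched for point features with a nearest-neighbour matcher. The noise level, match tolerance, and layer sizes below are arbitrary illustrative choices, not values from the study.

```python
# Sketch of the evaluation idea: simulate a second feature layer by
# perturbing known points with a spatial-uncertainty model, then score a
# nearest-neighbour matcher against the known ground truth.
import numpy as np
from scipy.spatial import cKDTree

rng = np.random.default_rng(42)
layer_a = rng.uniform(0, 1000, size=(200, 2))             # reference layer (metres)
sigma = 3.0                                               # simulated positional uncertainty
layer_b = layer_a + rng.normal(0, sigma, layer_a.shape)   # simulated second layer

tolerance = 10.0                                          # max distance for a valid match
distances, matched_index = cKDTree(layer_b).query(layer_a)

correct = np.sum((matched_index == np.arange(len(layer_a))) & (distances <= tolerance))
print(f"matching rate: {correct / len(layer_a):.1%} at sigma={sigma} m")
```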
Raja, Kalpana; Patrick, Matthew; Gao, Yilin; Madu, Desmond; Yang, Yuyang
2017-01-01
In the past decade, the volume of “omics” data generated by the different high-throughput technologies has expanded exponentially. The managing, storing, and analyzing of this big data have been a great challenge for the researchers, especially when moving towards the goal of generating testable data-driven hypotheses, which has been the promise of the high-throughput experimental techniques. Different bioinformatics approaches have been developed to streamline the downstream analyzes by providing independent information to interpret and provide biological inference. Text mining (also known as literature mining) is one of the commonly used approaches for automated generation of biological knowledge from the huge number of published articles. In this review paper, we discuss the recent advancement in approaches that integrate results from omics data and information generated from text mining approaches to uncover novel biomedical information. PMID:28331849
Global analysis of the yeast lipidome by quantitative shotgun mass spectrometry.
Ejsing, Christer S; Sampaio, Julio L; Surendranath, Vineeth; Duchoslav, Eva; Ekroos, Kim; Klemm, Robin W; Simons, Kai; Shevchenko, Andrej
2009-02-17
Although the transcriptome, proteome, and interactome of several eukaryotic model organisms have been described in detail, lipidomes remain relatively uncharacterized. Using Saccharomyces cerevisiae as an example, we demonstrate that automated shotgun lipidomics analysis enabled lipidome-wide absolute quantification of individual molecular lipid species by streamlined processing of a single sample of only 2 million yeast cells. By comparative lipidomics, we achieved the absolute quantification of 250 molecular lipid species covering 21 major lipid classes. This analysis provided approximately 95% coverage of the yeast lipidome achieved with 125-fold improvement in sensitivity compared with previous approaches. Comparative lipidomics demonstrated that growth temperature and defects in lipid biosynthesis induce ripple effects throughout the molecular composition of the yeast lipidome. This work serves as a resource for molecular characterization of eukaryotic lipidomes, and establishes shotgun lipidomics as a powerful platform for complementing biochemical studies and other systems-level approaches.
Enhancing Science and Automating Operations using Onboard Autonomy
NASA Technical Reports Server (NTRS)
Sherwood, Robert; Chien, Steve; Tran, Daniel; Davies, Ashley; Castano, Rebecca; Rabideau, Gregg; Mandl, Dan; Szwaczkowski, Joseph; Frye, Stuart; Shulman, Seth
2006-01-01
In this paper, we will describe the evolution of the software from prototype to full time operation onboard Earth Observing One (EO-1). We will quantify the increase in science, decrease in operations cost, and streamlining of operations procedures. Included will be a description of how this software was adapted post-launch to the EO-1 mission, which had very limited computing resources which constrained the autonomy flight software. We will discuss ongoing deployments of this software to the Mars Exploration Rovers and Mars Odyssey Missions as well as a discussion of lessons learned during this project. Finally, we will discuss how the onboard autonomy has been used in conjunction with other satellites and ground sensors to form an autonomous sensor-web to study volcanoes, floods, sea-ice topography, and wild fires. As demonstrated on EO-1, onboard autonomy is a revolutionary advance that will change the operations approach on future NASA missions...
NASA Astrophysics Data System (ADS)
Burnham, Nancy A.; Kadam, Snehalata V.; DeSilva, Erin
2017-11-01
An audience response system (‘clickers’) was gradually incorporated into introductory physics courses at Worcester Polytechnic Institute during the years 2011-14. Clickers were used in lectures, as a means of preparing for labs, and for collection of exam data and grading. Average student grades were 13.5% greater, as measured by comparing exam results with a previous year. Student acceptance of clickers was high, ranging from 66% to 95%, and grading time for exams was markedly reduced, from a full day to a few hours for approximately 150 students. The streamlined grading allowed for a second test on the same material for the students who failed the first one. These improvements have the immediate effects of engagement, learning, and efficiency, and ideally, they will also provide an environment in which more students will succeed in college and their careers.
Identification of Biokinetic Models Using the Concept of Extents.
Mašić, Alma; Srinivasan, Sriniketh; Billeter, Julien; Bonvin, Dominique; Villez, Kris
2017-07-05
The development of a wide array of process technologies to enable the shift from conventional biological wastewater treatment processes to resource recovery systems is matched by an increasing demand for predictive capabilities. Mathematical models are excellent tools to meet this demand. However, obtaining reliable and fit-for-purpose models remains a cumbersome task due to the inherent complexity of biological wastewater treatment processes. In this work, we present a first study in the context of environmental biotechnology that adopts and explores the use of extents as a way to simplify and streamline the dynamic process modeling task. In addition, the extent-based modeling strategy is enhanced by optimal accounting for nonlinear algebraic equilibria and nonlinear measurement equations. Finally, a thorough discussion of our results explains the benefits of extent-based modeling and its potential to turn environmental process modeling into a highly automated task.
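The central idea of extent-based modeling is to decompose the measured species amounts into one contribution per reaction, so each candidate rate law can be fitted against its own extent rather than against the full coupled balance equations. For a batch reactor with S species and R reactions, the commonly used relation is sketched below; this is the general textbook form, with N the R x S stoichiometric matrix, r(t) the reaction-rate vector, and V(t) the volume, and it is not claimed to be the exact formulation used in the paper.

```latex
% Extent-based decomposition for a batch reactor with S species and R reactions:
% n(t) = species moles, N = R x S stoichiometric matrix, x_r(t) = reaction extents.
\[
  \mathbf{n}(t) \;=\; \mathbf{n}_0 \;+\; \mathbf{N}^{\mathsf{T}}\,\mathbf{x}_r(t),
  \qquad
  \dot{\mathbf{x}}_r(t) \;=\; V(t)\,\mathbf{r}(t), \quad \mathbf{x}_r(0)=\mathbf{0}.
\]
```

Because each reaction rate appears in exactly one extent, candidate kinetic expressions can be identified one reaction at a time, which is what makes the approach attractive for streamlining model building.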
Automatic 1H-NMR Screening of Fatty Acid Composition in Edible Oils
Castejón, David; Fricke, Pascal; Cambero, María Isabel; Herrera, Antonio
2016-01-01
In this work, we introduce an NMR-based screening method for the fatty acid composition analysis of edible oils. We describe the evaluation and optimization needed for the automated analysis of vegetable oils by low-field NMR to obtain the fatty acid composition (FAC). To achieve this, two scripts, which automatically analyze and interpret the spectral data, were developed. The objective of this work was to drive forward the automated analysis of the FAC by NMR. Due to the fact that this protocol can be carried out at low field and that the complete process from sample preparation to printing the report only takes about 3 min, this approach is promising to become a fundamental technique for high-throughput screening. To demonstrate the applicability of this method, the fatty acid composition of extra virgin olive oils from various Spanish olive varieties (arbequina, cornicabra, hojiblanca, manzanilla, and picual) was determined by 1H-NMR spectroscopy according to this protocol. PMID:26891323
PaR-PaR Laboratory Automation Platform
DOE Office of Scientific and Technical Information (OSTI.GOV)
Linshiz, G; Stawski, N; Poust, S
2013-05-01
Labor-intensive multistep biological tasks, such as the construction and cloning of DNA molecules, are prime candidates for laboratory automation. Flexible and biology-friendly operation of robotic equipment is key to its successful integration in biological laboratories, and the efforts required to operate a robot must be much smaller than the alternative manual lab work. To achieve these goals, a simple high-level biology-friendly robot programming language is needed. We have developed and experimentally validated such a language: Programming a Robot (PaR-PaR). The syntax and compiler for the language are based on computer science principles and a deep understanding of biological workflows. PaR-PaR allows researchers to use liquid-handling robots effectively, enabling experiments that would not have been considered previously. After minimal training, a biologist can independently write complicated protocols for a robot within an hour. Adoption of PaR-PaR as a standard cross-platform language would enable hand-written or software-generated robotic protocols to be shared across laboratories.
Doing accelerator physics using SDDS, UNIX, and EPICS
DOE Office of Scientific and Technical Information (OSTI.GOV)
Borland, M.; Emery, L.; Sereno, N.
1995-12-31
The use of the SDDS (Self-Describing Data Sets) file protocol, together with the UNIX operating system and EPICS (Experimental Physics and Industrial Controls System), has proved powerful during the commissioning of the APS (Advanced Photon Source) accelerator complex. The SDDS file protocol has permitted a tool-oriented approach to developing applications, wherein generic programs are written that function as part of multiple applications. While EPICS-specific tools were written for data collection, automated experiment execution, closed-loop control, and so forth, data processing and display are done with the SDDS Toolkit. Experiments and data reduction are implemented as UNIX shell scripts that coordinate the execution of EPICS-specific tools and SDDS tools. Because of the power and generic nature of the individual tools and of the UNIX shell environment, automated experiments can be prepared and executed rapidly in response to unanticipated needs or new ideas. Examples are given of application of this methodology to beam motion characterization, beam-position-monitor offset measurements, and klystron characterization.
Hung, Kevin KC; Lai, W Y; Cocks, Robert A; Rainer, Timothy H; Graham, Colin A
2015-10-01
Automated wrist cuff blood pressure (BP) devices are more compact and easier to use, particularly when access to the upper arm is restricted, for example in emergencies. We tested the Omron HEM-650 wrist device using the validation criteria of the British Hypertension Society (BHS) protocol in a major emergency department (ED) in Hong Kong. 85 patients had three measurements each by both the Omron HEM-650 wrist device and the mercury sphygmomanometer. The conventional automated BP with arm cuff was also measured using an oscillometric (Colin BP-88S NXT) device for comparison. The Omron HEM-650 achieved a grade B for both systolic and diastolic BP and demonstrated acceptable accuracy and reliability in Chinese patients in the emergency setting. The Omron HEM 650 wrist device can be recommended for use in adult emergency patients. Further research is warranted for its use in pregnant women and critically ill patients.
Applying Content Management to Automated Provenance Capture
DOE Office of Scientific and Technical Information (OSTI.GOV)
Schuchardt, Karen L.; Gibson, Tara D.; Stephan, Eric G.
2008-04-10
Workflows and data pipelines are becoming increasingly valuable in both computational and experimental sciences. These automated systems are capable of generating significantly more data within the same amount of time than their manual counterparts. Automatically capturing and recording data provenance and annotation as part of these workflows is critical for data management, verification, and dissemination. Our goal in addressing the provenance challenge was to develop an end-to-end system that demonstrates real-time capture, persistent content management, and ad-hoc searches of both provenance and metadata using open source software and standard protocols. We describe our prototype, which extends the Kepler workflow tools for the execution environment, the Scientific Annotation Middleware (SAM) content management software for data services, and an existing HTTP-based query protocol. Our implementation offers several unique capabilities, and through the use of standards, is able to provide access to the provenance record to a variety of commonly available client tools.
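As a sketch of the kind of ad-hoc, HTTP-based metadata query mentioned above, the snippet below issues a simple GET request against a hypothetical provenance store. The endpoint URL and query parameters are assumptions for illustration; the prototype's actual query protocol and schema are not reproduced here.

```python
import urllib.parse
import urllib.request

# Hypothetical provenance-store endpoint and query fields.
BASE_URL = "http://provenance.example.org/query"
params = urllib.parse.urlencode({
    "type": "dataProduct",
    "createdBy": "kepler-workflow-42",
})

with urllib.request.urlopen(f"{BASE_URL}?{params}") as resp:
    print(resp.read().decode("utf-8"))
```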
PaR-PaR laboratory automation platform.
Linshiz, Gregory; Stawski, Nina; Poust, Sean; Bi, Changhao; Keasling, Jay D; Hillson, Nathan J
2013-05-17
Labor-intensive multistep biological tasks, such as the construction and cloning of DNA molecules, are prime candidates for laboratory automation. Flexible and biology-friendly operation of robotic equipment is key to its successful integration in biological laboratories, and the efforts required to operate a robot must be much smaller than the alternative manual lab work. To achieve these goals, a simple high-level biology-friendly robot programming language is needed. We have developed and experimentally validated such a language: Programming a Robot (PaR-PaR). The syntax and compiler for the language are based on computer science principles and a deep understanding of biological workflows. PaR-PaR allows researchers to use liquid-handling robots effectively, enabling experiments that would not have been considered previously. After minimal training, a biologist can independently write complicated protocols for a robot within an hour. Adoption of PaR-PaR as a standard cross-platform language would enable hand-written or software-generated robotic protocols to be shared across laboratories.
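The sketch below is a hypothetical, Python-flavored rendering of what a high-level, biology-friendly robot instruction might look like; it is not actual PaR-PaR syntax, and the class and method names are invented for illustration only.

```python
# Hypothetical high-level liquid-handling instructions; a real driver would
# translate each call into device-specific robot commands.
class Robot:
    def __init__(self):
        self.log = []

    def transfer(self, source, destination, volume_ul):
        self.log.append(f"transfer {volume_ul} uL from {source} to {destination}")

robot = Robot()
for well in ("A1", "A2", "A3"):
    robot.transfer(source=f"master_mix/{well}",
                   destination=f"assay_plate/{well}",
                   volume_ul=20)
print("\n".join(robot.log))
```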
Automated multi-dimensional purification of tagged proteins.
Sigrell, Jill A; Eklund, Pär; Galin, Markus; Hedkvist, Lotta; Liljedahl, Pia; Johansson, Christine Markeland; Pless, Thomas; Torstenson, Karin
2003-01-01
The capacity for high throughput purification (HTP) is essential in fields such as structural genomics where large numbers of protein samples are routinely characterized in, for example, studies of structural determination, functionality and drug development. Proteins required for such analysis must be pure and homogenous and available in relatively large amounts. AKTA 3D system is a powerful automated protein purification system, which minimizes preparation, run-time and repetitive manual tasks. It has the capacity to purify up to 6 different His6- or GST-tagged proteins per day and can produce 1-50 mg protein per run at >90% purity. The success of automated protein purification increases with careful experimental planning. Protocol, columns and buffers need to be chosen with the final application area for the purified protein in mind.
FISH-in-CHIPS: A Microfluidic Platform for Molecular Typing of Cancer Cells.
Perez-Toralla, Karla; Mottet, Guillaume; Tulukcuoglu-Guneri, Ezgi; Champ, Jérôme; Bidard, François-Clément; Pierga, Jean-Yves; Klijanienko, Jerzy; Draskovic, Irena; Malaquin, Laurent; Viovy, Jean-Louis; Descroix, Stéphanie
2017-01-01
Microfluidics offer powerful tools for the control, manipulation, and analysis of cells, in particular for the assessment of cell malignancy or the study of cell subpopulations. However, implementing complex biological protocols on chip remains a challenge. Sample preparation is often performed off chip using multiple manually performed steps, and protocols usually include different dehydration and drying steps that are not always compatible with a microfluidic format. Here, we report the implementation of a Fluorescence in situ Hybridization (FISH) protocol for the molecular typing of cancer cells in a simple and low-cost device. The geometry of the chip allows integrating the sample preparation steps to efficiently assess the genomic content of individual cells using a minute amount of sample. The FISH protocol can be fully automated, thus enabling its use in routine clinical practice.
The specification-based validation of reliable multicast protocol: Problem Report. M.S. Thesis
NASA Technical Reports Server (NTRS)
Wu, Yunqing
1995-01-01
Reliable Multicast Protocol (RMP) is a communication protocol that provides an atomic, totally ordered, reliable multicast service on top of unreliable IP multicasting. In this report, we develop formal models for RMP using existing automated verification systems, and perform validation on the formal RMP specifications. The validation analysis helped identify some minor specification and design problems. We also use the formal models of RMP to generate a test suite for conformance testing of the implementation. Throughout the process of RMP development, we follow an iterative, interactive approach that emphasizes concurrent and parallel progress of implementation and verification processes. Through this approach, we incorporate formal techniques into our development process, promote a common understanding for the protocol, increase the reliability of our software, and maintain high fidelity between the specifications of RMP and its implementation.
Tn5Prime, a Tn5 based 5' capture method for single cell RNA-seq.
Cole, Charles; Byrne, Ashley; Beaudin, Anna E; Forsberg, E Camilla; Vollmers, Christopher
2018-06-01
RNA-sequencing (RNA-seq) is a powerful technique to investigate and quantify entire transcriptomes. Recent advances in the field have made it possible to explore the transcriptomes of single cells. However, most widely used RNA-seq protocols fail to provide crucial information regarding transcription start sites. Here we present a protocol, Tn5Prime, that takes advantage of the Tn5 transposase-based Smart-seq2 protocol to create RNA-seq libraries that capture the 5' end of transcripts. The Tn5Prime method dramatically streamlines the 5' capture process and is both cost effective and reliable. By applying Tn5Prime to bulk RNA and single cell samples, we were able to define transcription start sites as well as quantify transcriptomes at high accuracy and reproducibility. Additionally, similar to 3' end-based high-throughput methods like Drop-seq and 10× Genomics Chromium, the 5' capture Tn5Prime method allows the introduction of cellular identifiers during reverse transcription, simplifying the analysis of large numbers of single cells. In contrast to 3' end-based methods, Tn5Prime also enables the assembly of the variable 5' ends of the antibody sequences present in single B-cell data. Therefore, Tn5Prime presents a robust tool for both basic and applied research into the adaptive immune system and beyond.
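To illustrate the general idea of cellular identifiers introduced during reverse transcription, the toy sketch below tallies reads per cell barcode. The read layout and barcode length are assumptions made for illustration, not the published protocol's exact design.

```python
from collections import Counter

# Toy reads: assume (for illustration only) that the first 8 bases encode the
# cellular identifier and the remainder is cDNA sequence.
reads = [
    "ACGTACGTTTGCTGACCTGA",
    "ACGTACGTGGACTTACGGTA",
    "TTGCAGGACCATGAACTGGA",
]

BARCODE_LEN = 8
reads_per_cell = Counter(read[:BARCODE_LEN] for read in reads)
for barcode, n_reads in reads_per_cell.items():
    print(barcode, n_reads)
```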
Self streamlining wind tunnel: Low speed testing and transonic test section design
NASA Technical Reports Server (NTRS)
Wolf, S. W. D.; Goodyer, M. J.
1977-01-01
Comprehensive aerodynamic data on an airfoil section were obtained through a wide range of angles of attack, both stalled and unstalled. Data were gathered using a self streamlining wind tunnel and were compared to results obtained on the same section in a conventional wind tunnel. The reduction of wall interference through streamlining was demonstrated.
An Architecture for SCADA Network Forensics
NASA Astrophysics Data System (ADS)
Kilpatrick, Tim; Gonzalez, Jesus; Chandia, Rodrigo; Papa, Mauricio; Shenoi, Sujeet
Supervisory control and data acquisition (SCADA) systems are widely used in industrial control and automation. Modern SCADA protocols often employ TCP/IP to transport sensor data and control signals. Meanwhile, corporate IT infrastructures are interconnecting with previously isolated SCADA networks. The use of TCP/IP as a carrier protocol and the interconnection of IT and SCADA networks raise serious security issues. This paper describes an architecture for SCADA network forensics. In addition to supporting forensic investigations of SCADA network incidents, the architecture incorporates mechanisms for monitoring process behavior, analyzing trends and optimizing plant performance.
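As a rough sketch of passively logging SCADA traffic metadata for later forensic analysis, the snippet below captures Modbus/TCP packets (TCP port 502) with scapy and prints basic flow metadata. It assumes scapy is installed and capture privileges are available; the fields logged here are illustrative and do not reflect the paper's architecture or schema.

```python
from scapy.all import IP, TCP, sniff  # requires scapy and packet-capture privileges

def log_packet(pkt):
    # Record minimal metadata for each Modbus/TCP packet seen on the wire.
    if IP in pkt and TCP in pkt:
        print(pkt.time, pkt[IP].src, "->", pkt[IP].dst, "len", len(pkt))

# TCP port 502 is the registered Modbus/TCP port; count limits the demo capture.
sniff(filter="tcp port 502", prn=log_packet, store=False, count=100)
```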
Kamson, David O.; Juhász, Csaba; Chugani, Harry T.; Jeong, Jeong-Won
2014-01-01
Background: Diffusion tensor imaging (DTI) has expanded our knowledge of corticospinal tract (CST) anatomy and development. However, previous developmental DTI studies assessed the CST as a whole, overlooking potential differences in development of its components related to control of the upper and lower extremities. The present cross-sectional study investigated age-related changes, side and gender differences in streamline volume of the leg- and hand-related segments of the CST in children. Subjects and methods: DTI data of 31 children (1–14 years; mean age: 6±4 years; 17 girls) with normal conventional MRI were analyzed. Leg- and hand-related CST streamline volumes were quantified separately, using a recently validated novel tractography approach. CST streamline volumes on both sides were compared between genders and correlated with age. Results: Higher absolute streamline volumes were found in the left leg-related CST compared to the right (p=0.001) without a gender effect (p=0.4), whereas no differences were found in the absolute hand-related CST volumes (p>0.4). CST leg-related streamline volumes, normalized to hemispheric white matter volumes, declined with age in the right hemisphere only (R=−.51; p=0.004). Absolute leg-related CST streamline volumes showed similar, but slightly weaker correlations. Hand-related absolute or normalized CST streamline volumes showed no age-related variations on either side. Conclusion: These results suggest differential development of CST segments controlling hand vs. leg movements. Asymmetric volume changes in the lower limb motor pathway may be secondary to gradually strengthening left hemispheric dominance and is consistent with previous data suggesting that footedness is a better predictor of hemispheric lateralization than handedness. PMID:25027193
Kamson, David O; Juhász, Csaba; Chugani, Harry T; Jeong, Jeong-Won
2015-04-01
Diffusion tensor imaging (DTI) has expanded our knowledge of corticospinal tract (CST) anatomy and development. However, previous developmental DTI studies assessed the CST as a whole, overlooking potential differences in development of its components related to control of the upper and lower extremities. The present cross-sectional study investigated age-related changes, side and gender differences in streamline volume of the leg- and hand-related segments of the CST in children. DTI data of 31 children (1-14 years; mean age: 6±4 years; 17 girls) with normal conventional MRI were analyzed. Leg- and hand-related CST streamline volumes were quantified separately, using a recently validated novel tractography approach. CST streamline volumes on both sides were compared between genders and correlated with age. Higher absolute streamline volumes were found in the left leg-related CST compared to the right (p=0.001) without a gender effect (p=0.4), whereas no differences were found in the absolute hand-related CST volumes (p>0.4). CST leg-related streamline volumes, normalized to hemispheric white matter volumes, declined with age in the right hemisphere only (R=-.51; p=0.004). Absolute leg-related CST streamline volumes showed similar, but slightly weaker correlations. Hand-related absolute or normalized CST streamline volumes showed no age-related variations on either side. These results suggest differential development of CST segments controlling hand vs. leg movements. Asymmetric volume changes in the lower limb motor pathway may be secondary to gradually strengthening left hemispheric dominance and is consistent with previous data suggesting that footedness is a better predictor of hemispheric lateralization than handedness. Copyright © 2014 The Japanese Society of Child Neurology. Published by Elsevier B.V. All rights reserved.
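A minimal sketch of the normalization and age-correlation step described above is shown below, using made-up numbers rather than the study's data.

```python
import numpy as np
from scipy.stats import pearsonr

# Illustrative values only (not the study's data).
age_years = np.array([2, 4, 6, 8, 10, 12, 14], dtype=float)
leg_cst_streamlines = np.array([410, 395, 380, 360, 350, 340, 330], dtype=float)
hemisphere_wm_volume = np.array([250e3, 270e3, 300e3, 320e3, 330e3, 340e3, 345e3])

normalized = leg_cst_streamlines / hemisphere_wm_volume   # normalize to WM volume
r, p = pearsonr(age_years, normalized)
print(f"r = {r:.2f}, p = {p:.3f}")
```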
Blot, Mathieu; Pivot, Diane; Bourredjem, Abderrahmane; Salmon-Rousseau, Arnaud; de Curraize, Claire; Croisier, Delphine; Chavanet, Pascal; Binquet, Christine; Piroth, Lionel
2017-09-01
Antibiotic streamlining is pivotal to reduce the emergence of resistant bacteria. However, whether streamlining is frequently performed and safe in difficult situations, such as bacteremic pneumococcal pneumonia (BPP), has still to be assessed. All adult patients admitted to Dijon Hospital (France) from 2005 to 2013 who had BPP without complications, and were alive on the third day were enrolled. Clinical, biological, radiological, microbiological and therapeutic data were recorded. A first analysis was conducted to assess factors associated with being on amoxicillin on the third day. A second analysis, adjusting for a propensity score, was performed to determine whether 30-day mortality was associated with streamlining to amoxicillin monotherapy. Of the 196 patients hospitalized for BPP, 161 were still alive on the third day and were included in the study. Treatment was streamlined to amoxicillin in 60 patients (37%). Factors associated with not streamlining were severe pneumonia (OR 3.11, 95%CI [1.23-7.87]) and a first-line antibiotic combination (OR 3.08, 95%CI [1.34-7.09]). By contrast, starting with amoxicillin monotherapy correlated inversely with the risk of subsequent treatment with antibiotics other than amoxicillin (OR 0.06, 95%CI [0.01-0.30]). The Cox model adjusted for the propensity-score analysis showed that streamlining to amoxicillin during BPP was not significantly associated with a higher risk of 30-day mortality (HR 0.38, 95%CI [0.08-1.87]). Streamlining to amoxicillin is insufficiently implemented during BPP. This strategy is safe and potentially associated with ecological and economic benefits; therefore, it should be further encouraged, particularly when antibiotic combinations are started for severe pneumonia. Copyright © 2017. Published by Elsevier B.V.
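The analysis pairs a propensity score with a Cox model for 30-day mortality. The sketch below shows one common way to set that up in Python with the lifelines package, using toy numbers; the study's actual dataset, covariates, and propensity-score construction are not reproduced here.

```python
import pandas as pd
from lifelines import CoxPHFitter

# Toy data: follow-up time (days), 30-day death indicator, streamlining to
# amoxicillin (1 = yes), and a precomputed propensity score. Illustrative only.
df = pd.DataFrame({
    "time":        [30, 12, 30, 25, 8, 30, 30, 21, 30, 15],
    "died":        [0,  1,  0,  1,  1, 0,  0,  1,  0,  1],
    "streamlined": [1,  0,  1,  1,  0, 0,  1,  0,  1,  0],
    "propensity":  [0.6, 0.3, 0.7, 0.5, 0.2, 0.4, 0.8, 0.3, 0.6, 0.25],
})

cph = CoxPHFitter()
cph.fit(df, duration_col="time", event_col="died")  # propensity enters as a covariate
cph.print_summary()
```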
What Do Lead and Copper Sampling Protocols Mean, and Which Is Right for You?
This presentation will provide a short review of the explicit and implicit concepts behind most of the currently-used regulatory and diagnostic sampling schemes for lead, such as: random daytime sampling; automated proportional sampler; 30 minute first draw stagnation; Sequential...
Seed sprout production: Consumables and a foundation for higher plant growth in space
NASA Technical Reports Server (NTRS)
Day, Michelle; Thomas, Terri; Johnson, Steve; Luttges, Marvin
1990-01-01
Seed sprouts can be produced as a source of fresh vegetable materials and as higher plant seedlings in space. Sprout production was undertaken to evaluate the mass accumulations possible, the technologies needed, and the reliability of the overall process. Baseline experiments corroborated the utility of sprout production protocols for a variety of seed types. The automated delivery of saturated humidity effectively supplants labor-intensive manual soaking techniques. Automated humidification also lends itself to modest centrifugal sprout growth environments. A small amount of ultraviolet radiation effectively suppressed bacterial and fungal contamination, and the sprouts were suitable for consumption.
Costs to Automate Demand Response - Taxonomy and Results from Field Studies and Programs
DOE Office of Scientific and Technical Information (OSTI.GOV)
Piette, Mary A.; Schetrit, Oren; Kiliccote, Sila
During the past decade, the technology to automate demand response (DR) in buildings and industrial facilities has advanced significantly. Automation allows rapid, repeatable, reliable operation. This study focuses on costs for DR automation in commercial buildings with some discussion on residential buildings and industrial facilities. DR automation technology relies on numerous components, including communication systems, hardware and software gateways, standards-based messaging protocols, controls and integration platforms, and measurement and telemetry systems. This report compares cost data from several DR automation programs and pilot projects, evaluates trends in the cost per unit of DR and kilowatts (kW) available from automated systems, and applies a standard naming convention and classification or taxonomy for system elements. Median costs for the 56 installed automated DR systems studied here are about $200/kW. The deviation around this median is large, with costs in some cases being an order of magnitude greater or less than the median. This wide range is a result of variations in system age, size of load reduction, sophistication, and type of equipment included in cost analysis. The costs to automate fast DR systems for ancillary services are not fully analyzed in this report because additional research is needed to determine the total cost to install, operate, and maintain these systems. However, recent research suggests that they could be developed at costs similar to those of existing hot-summer DR automation systems. This report considers installation and configuration costs and does not include the costs of owning and operating DR automation systems. Future analysis of the latter costs should include the costs to the building or facility manager as well as to the utility or third-party program manager.
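The cost metric discussed above is simply installed cost divided by the demand reduction it enables; the short example below computes it and its median for a few hypothetical sites (the numbers are invented, not the report's data).

```python
from statistics import median

# Hypothetical sites: (installed automation cost in $, demand reduction in kW).
sites = [(24_000, 120), (8_000, 35), (150_000, 600), (3_000, 40), (52_000, 260)]

cost_per_kw = [cost / kw for cost, kw in sites]
print("cost per kW by site:", [round(c) for c in cost_per_kw])
print("median $/kW:", round(median(cost_per_kw)))
```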
Automated Prescription of Oblique Brain 3D MRSI
Ozhinsky, Eugene; Vigneron, Daniel B.; Chang, Susan M.; Nelson, Sarah J.
2012-01-01
Two major difficulties encountered in implementing Magnetic Resonance Spectroscopic Imaging (MRSI) in a clinical setting are limited coverage and difficulty in prescription. The goal of this project was to completely automate the process of 3D PRESS MRSI prescription, including placement of the selection box, saturation bands and shim volume, while maximizing the coverage of the brain. The automated prescription technique included acquisition of an anatomical MRI image, optimization of the oblique selection box parameters, optimization of the placement of OVS saturation bands, and loading of the calculated parameters into a customized 3D MRSI pulse sequence. To validate the technique and compare its performance with existing protocols, 3D MRSI data were acquired from 6 exams from 3 healthy volunteers. To assess the performance of the automated 3D MRSI prescription for patients with brain tumors, the data were collected from 16 exams from 8 subjects with gliomas. This technique demonstrated robust coverage of the tumor, high consistency of prescription and very good data quality within the T2 lesion. PMID:22692829
Home Automation System Based on Intelligent Transducer Enablers.
Suárez-Albela, Manuel; Fraga-Lamas, Paula; Fernández-Caramés, Tiago M; Dapena, Adriana; González-López, Miguel
2016-09-28
This paper presents a novel home automation system named HASITE (Home Automation System based on Intelligent Transducer Enablers), which has been specifically designed to identify and configure transducers easily and quickly. These features are especially useful in situations where many transducers are deployed, since their setup becomes a cumbersome task that consumes a significant amount of time and human resources. HASITE simplifies the deployment of a home automation system by using wireless networks and both self-configuration and self-registration protocols. Thanks to the application of these three elements, HASITE is able to add new transducers by just powering them up. According to the tests performed in different realistic scenarios, a transducer is ready to be used in less than 13 s. Moreover, all HASITE functionalities can be accessed through an API, which also allows for the integration of third-party systems. As an example, an Android application based on the API is presented. Remote users can use it to interact with transducers by just using a regular smartphone or a tablet.
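To illustrate what accessing such functionality "through an API" might look like, the snippet below sends a state change to a hypothetical REST endpoint. The URL, route, and payload fields are assumptions for illustration; HASITE's actual API is not reproduced here.

```python
import json
import urllib.request

# Hypothetical gateway endpoint and transducer identifier.
url = "http://gateway.local/api/transducers/living-room-light/state"
payload = json.dumps({"state": "on"}).encode("utf-8")

req = urllib.request.Request(url, data=payload, method="PUT",
                             headers={"Content-Type": "application/json"})
with urllib.request.urlopen(req) as resp:
    print(resp.status, resp.read().decode("utf-8"))
```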
Designing of smart home automation system based on Raspberry Pi
NASA Astrophysics Data System (ADS)
Saini, Ravi Prakash; Singh, Bhanu Pratap; Sharma, Mahesh Kumar; Wattanawisuth, Nattapol; Leeprechanon, Nopbhorn
2016-03-01
Locally networked or remotely controlled home automation systems have become a popular paradigm because of their numerous advantages and are suitable for academic research. This paper proposes a method for implementing a Raspberry Pi-based home automation system with an Android phone access interface. The power consumption profile across the connected load is measured accurately through programming. Users can access the graph of total power consumption with respect to time worldwide using their Dropbox account. An Android application has been developed to channelize the monitoring and controlling operation of home appliances remotely. This application facilitates control of the Raspberry Pi's operating pins by pressing the corresponding key to turn any desired appliance "on" or "off". Systems can range from simple room lighting control to smart microcontroller based hybrid systems incorporating several other additional features. Smart home automation systems are being adopted to achieve flexibility, scalability, security in the sense of data protection through the cloud-based data storage protocol, reliability, energy efficiency, etc.
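Toggling an appliance through a Raspberry Pi pin typically reduces to driving a GPIO output wired to a relay. The sketch below uses the RPi.GPIO library for that basic step; the pin number and relay wiring are assumptions, and the paper's Android and Dropbox integration is not shown.

```python
import time
import RPi.GPIO as GPIO  # available on Raspberry Pi OS

RELAY_PIN = 17  # example BCM pin wired to an appliance relay (assumed wiring)

GPIO.setmode(GPIO.BCM)
GPIO.setup(RELAY_PIN, GPIO.OUT)
try:
    GPIO.output(RELAY_PIN, GPIO.HIGH)   # appliance "on"
    time.sleep(5)
    GPIO.output(RELAY_PIN, GPIO.LOW)    # appliance "off"
finally:
    GPIO.cleanup()
```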
Designing of smart home automation system based on Raspberry Pi
DOE Office of Scientific and Technical Information (OSTI.GOV)
Saini, Ravi Prakash; Singh, Bhanu Pratap; Sharma, Mahesh Kumar
Locally networked or remotely controlled home automation systems have become a popular paradigm because of their numerous advantages and are suitable for academic research. This paper proposes a method for implementing a Raspberry Pi-based home automation system with an Android phone access interface. The power consumption profile across the connected load is measured accurately through programming. Users can access the graph of total power consumption with respect to time worldwide using their Dropbox account. An Android application has been developed to channelize the monitoring and controlling operation of home appliances remotely. This application facilitates control of the Raspberry Pi's operating pins by pressing the corresponding key to turn any desired appliance “on” or “off”. Systems can range from simple room lighting control to smart microcontroller based hybrid systems incorporating several other additional features. Smart home automation systems are being adopted to achieve flexibility, scalability, security in the sense of data protection through the cloud-based data storage protocol, reliability, energy efficiency, etc.
Home Automation System Based on Intelligent Transducer Enablers
Suárez-Albela, Manuel; Fraga-Lamas, Paula; Fernández-Caramés, Tiago M.; Dapena, Adriana; González-López, Miguel
2016-01-01
This paper presents a novel home automation system named HASITE (Home Automation System based on Intelligent Transducer Enablers), which has been specifically designed to identify and configure transducers easily and quickly. These features are especially useful in situations where many transducers are deployed, since their setup becomes a cumbersome task that consumes a significant amount of time and human resources. HASITE simplifies the deployment of a home automation system by using wireless networks and both self-configuration and self-registration protocols. Thanks to the application of these three elements, HASITE is able to add new transducers by just powering them up. According to the tests performed in different realistic scenarios, a transducer is ready to be used in less than 13 s. Moreover, all HASITE functionalities can be accessed through an API, which also allows for the integration of third-party systems. As an example, an Android application based on the API is presented. Remote users can use it to interact with transducers by just using a regular smartphone or a tablet. PMID:27690031
On a modified streamline curvature method for the Euler equations
NASA Technical Reports Server (NTRS)
Cordova, Jeffrey Q.; Pearson, Carl E.
1988-01-01
A modification of the streamline curvature method leads to a quasilinear second-order partial differential equation for the streamline coordinate function. The existence of a stream function is not required. The method is applied to subsonic and supersonic nozzle flow, and to axially symmetric flow with swirl. For many situations, the associated numerical method is both fast and accurate.
The original goal of the Streamlined LCA workgroup was to define and document a process for a shortened form of LCA. At the time, because of the large amount of data needed to do a cradle-to-grave evaluation, it was believed that in addition to such a "full" LCA approach there w...
NASA Astrophysics Data System (ADS)
Rahman, Mir Mustafizur
In collaboration with The City of Calgary 2011 Sustainability Direction and as part of the HEAT (Heat Energy Assessment Technologies) project, the focus of this research is to develop a semi/automated 'protocol' to post-process large volumes of high-resolution (H-res) airborne thermal infrared (TIR) imagery to enable accurate urban waste heat mapping. HEAT is a free GeoWeb service, designed to help Calgary residents improve their home energy efficiency by visualizing the amount and location of waste heat leaving their homes and communities, as easily as clicking on their house in Google Maps. HEAT metrics are derived from 43 flight lines of TABI-1800 (Thermal Airborne Broadband Imager) data acquired on May 13--14, 2012 at night (11:00 pm--5:00 am) over The City of Calgary, Alberta (~825 km²) at a 50 cm spatial resolution and 0.05°C thermal resolution. At present, the only way to generate a large area, high-spatial resolution TIR scene is to acquire separate airborne flight lines and mosaic them together. However, the ambient sensed temperature within and between flight lines naturally changes during acquisition (due to varying atmospheric and local micro-climate conditions), resulting in mosaicked images with different temperatures for the same scene components (e.g. roads, buildings), and mosaic join-lines arbitrarily bisect many thousands of homes. In combination, these effects result in reduced utility and classification accuracy, including poorly defined HEAT metrics, inaccurate hotspot detection and raw imagery that is difficult to interpret. In an effort to minimize these effects, three new semi/automated post-processing algorithms (the protocol) are described, which are then used to generate a 43 flight line mosaic of TABI-1800 data from which accurate Calgary waste heat maps and HEAT metrics can be generated. These algorithms (presented as four peer-reviewed papers) are: (a) Thermal Urban Road Normalization (TURN), used to mitigate the microclimatic variability within a thermal flight line based on varying road temperatures; (b) Automated Polynomial Relative Radiometric Normalization (RRN), which mitigates the between-flight-line radiometric variability; and (c) Object Based Mosaicking (OBM), which minimizes the geometric distortion along the mosaic edge between each flight line. A modified Emissivity Modulation technique is also described to correct H-res TIR images for emissivity. This combined radiometric and geometric post-processing protocol (i) increases the visual agreement between TABI-1800 flight lines, (ii) improves radiometric agreement within/between flight lines, (iii) produces a visually seamless mosaic, (iv) improves hot-spot detection and landcover classification accuracy, and (v) provides accurate data for thermal-based HEAT energy models. Keywords: Thermal Infrared, Post-Processing, High Spatial Resolution, Airborne, Thermal Urban Road Normalization (TURN), Relative Radiometric Normalization (RRN), Object Based Mosaicking (OBM), TABI-1800, HEAT, and Automation.
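The core of a polynomial relative radiometric normalization is fitting a polynomial that maps one flight line's values onto a reference flight line using pixels from their overlap, then applying it to the whole subject line. The sketch below shows that idea with NumPy on made-up temperatures; the published method's polynomial order, sampling, and masking rules may differ.

```python
import numpy as np

# Illustrative pixel temperatures (°C) from the overlap of two flight lines.
reference_overlap = np.array([12.1, 13.0, 14.2, 15.1, 16.0, 17.3])
subject_overlap   = np.array([11.4, 12.2, 13.5, 14.3, 15.3, 16.4])

# Fit a 2nd-order polynomial mapping subject values onto the reference scale.
coeffs = np.polyfit(subject_overlap, reference_overlap, deg=2)

# Apply the mapping to the full subject flight line.
subject_full_scene = np.array([10.9, 13.8, 16.7])
normalized = np.polyval(coeffs, subject_full_scene)
print(normalized)
```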
Automating Data Submission to a National Archive
NASA Astrophysics Data System (ADS)
Work, T. T.; Chandler, C. L.; Groman, R. C.; Allison, M. D.; Gegg, S. R.; Biological; Chemical Oceanography Data Management Office
2010-12-01
In late 2006, the U.S. National Science Foundation (NSF) funded the Biological and Chemical Oceanographic Data Management Office (BCO-DMO) at Woods Hole Oceanographic Institution (WHOI) to work closely with investigators to manage oceanographic data generated from their research projects. One of the final data management tasks is to ensure that the data are permanently archived at the U.S. National Oceanographic Data Center (NODC) or other appropriate national archiving facility. In the past, BCO-DMO submitted data to NODC as an email with attachments including a PDF file (a manually completed metadata record) and one or more data files. This method is no longer feasible given the rate at which data sets are contributed to BCO-DMO. Working with collaborators at NODC, a more streamlined and automated workflow was developed to keep up with the increased volume of data that must be archived at NODC. We will describe our new workflow; a semi-automated approach for contributing data to NODC that includes a Federal Geographic Data Committee (FGDC) compliant Extensible Markup Language (XML) metadata file accompanied by comma-delimited data files. The FGDC XML file is populated from information stored in a MySQL database. A crosswalk described by an Extensible Stylesheet Language Transformation (XSLT) is used to transform the XML formatted MySQL result set to a FGDC compliant XML metadata file. To ensure data integrity, the MD5 algorithm is used to generate a checksum and manifest of the files submitted to NODC for permanent archive. The revised system supports preparation of detailed, standards-compliant metadata that facilitate data sharing and enable accurate reuse of multidisciplinary information. The approach is generic enough to be adapted for use by other data management groups.
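The integrity step described above boils down to hashing each submitted file and recording the digests in a manifest. A minimal sketch with Python's hashlib is shown below; the file names are hypothetical and the manifest layout is illustrative rather than NODC's required format.

```python
import hashlib
from pathlib import Path

def md5sum(path, chunk_size=1 << 20):
    """Compute the MD5 digest of a file, reading it in chunks."""
    digest = hashlib.md5()
    with open(path, "rb") as fh:
        for chunk in iter(lambda: fh.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

# Hypothetical submission package: FGDC XML metadata plus data files.
package = [Path("metadata.fgdc.xml"), Path("ctd_station1.csv"), Path("ctd_station2.csv")]
with open("MANIFEST.md5", "w") as manifest:
    for path in package:
        manifest.write(f"{md5sum(path)}  {path.name}\n")
```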
Automated Extraction of Flow Features
NASA Technical Reports Server (NTRS)
Dorney, Suzanne (Technical Monitor); Haimes, Robert
2005-01-01
Computational Fluid Dynamics (CFD) simulations are routinely performed as part of the design process of most fluid handling devices. In order to efficiently and effectively use the results of a CFD simulation, visualization tools are often used. These tools are used in all stages of the CFD simulation including pre-processing, interim-processing, and post-processing, to interpret the results. Each of these stages requires visualization tools that allow one to examine the geometry of the device, as well as the partial or final results of the simulation. An engineer will typically generate a series of contour and vector plots to better understand the physics of how the fluid is interacting with the physical device. Of particular interest is detecting features such as shocks, re-circulation zones, and vortices (which will highlight areas of stress and loss). As the demand for CFD analyses continues to increase, the need for automated feature extraction capabilities has become vital. In the past, feature extraction and identification were interesting concepts, but not required in understanding the physics of a steady flow field. This is because the results of the more traditional tools, like iso-surfaces, cuts, and streamlines, were more interactive and easily abstracted so they could be represented to the investigator. These tools worked and properly conveyed the collected information at the expense of a great deal of interaction. For unsteady flow-fields, the investigator does not have the luxury of spending time scanning only one "snapshot" of the simulation. Automated assistance is required in pointing out areas of potential interest contained within the flow. This must not require a heavy compute burden (the visualization should not significantly slow down the solution procedure for co-processing environments). Methods must be developed to abstract the feature of interest and display it in a manner that physically makes sense.
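One simple example of a derived quantity used in feature extraction is the out-of-plane vorticity of a 2-D velocity field, which highlights rotational regions such as vortices. The sketch below computes it with NumPy on a synthetic solid-body-rotation field; it illustrates the general idea only and is not the feature-extraction algorithm of this work.

```python
import numpy as np

# Synthetic 2-D velocity field (solid-body-like rotation) on a uniform grid.
x = np.linspace(-1.0, 1.0, 64)
y = np.linspace(-1.0, 1.0, 64)
X, Y = np.meshgrid(x, y)
u, v = -Y, X                      # velocity components u(x, y), v(x, y)

dx, dy = x[1] - x[0], y[1] - y[0]
du_dy, du_dx = np.gradient(u, dy, dx)   # arrays are indexed [y, x]
dv_dy, dv_dx = np.gradient(v, dy, dx)

vorticity = dv_dx - du_dy               # out-of-plane vorticity
print("peak |vorticity|:", np.abs(vorticity).max())  # ~2 for solid-body rotation
```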
Automated Extraction of Flow Features
NASA Technical Reports Server (NTRS)
Dorney, Suzanne (Technical Monitor); Haimes, Robert
2004-01-01
Computational Fluid Dynamics (CFD) simulations are routinely performed as part of the design process of most fluid handling devices. In order to efficiently and effectively use the results of a CFD simulation, visualization tools are often used. These tools are used in all stages of the CFD simulation including pre-processing, interim-processing, and post-processing, to interpret the results. Each of these stages requires visualization tools that allow one to examine the geometry of the device, as well as the partial or final results of the simulation. An engineer will typically generate a series of contour and vector plots to better understand the physics of how the fluid is interacting with the physical device. Of particular interest is detecting features such as shocks, recirculation zones, and vortices (which will highlight areas of stress and loss). As the demand for CFD analyses continues to increase, the need for automated feature extraction capabilities has become vital. In the past, feature extraction and identification were interesting concepts, but not required in understanding the physics of a steady flow field. This is because the results of the more traditional tools, like iso-surfaces, cuts, and streamlines, were more interactive and easily abstracted so they could be represented to the investigator. These tools worked and properly conveyed the collected information at the expense of a great deal of interaction. For unsteady flow-fields, the investigator does not have the luxury of spending time scanning only one "snapshot" of the simulation. Automated assistance is required in pointing out areas of potential interest contained within the flow. This must not require a heavy compute burden (the visualization should not significantly slow down the solution procedure for co-processing environments). Methods must be developed to abstract the feature of interest and display it in a manner that physically makes sense.
High-Frequency Percussive Ventilation: Pneumotachograph Validation and Tidal Volume Analysis
2010-06-01
protocol, preliminary experience has shown that the flow sensor is amenable to near-automated “plug-and-play” adaptability, permitting clinicians the...
Macrophage Responses to Epithelial Dysfunction Promote Lung Fibrosis in Aging
2017-10-01
and Christman, 2016, AJRCMB) and at the time of this report listed among highly accessed articles on the AJRCMB website. Importantly, our protocol and findings...seq were prepared using a high-throughput automated robotic platform (Agilent Bravo) to minimize a batch effect; all libraries have passed the QC
Problem Solving Techniques for the Design of Algorithms.
ERIC Educational Resources Information Center
Kant, Elaine; Newell, Allen
1984-01-01
Presents model of algorithm design (activity in software development) based on analysis of protocols of two subjects designing three convex hull algorithms. Automation methods, methods for studying algorithm design, role of discovery in problem solving, and comparison of different designs of case study according to model are highlighted.…
Office Automation and the Navy Finance Center.
1984-09-01
inexpensive premise-distribution cable. - a wide range of protocols and transmission speeds. - Modem-pooling. Modems can be accessed on an as-needed...the IBM System Network Architecture (SNA) and is composed of four 56K baud lines [Ref. 17]. 4. Photocopiers. NFC has Kodak and Royal copiers
High-throughput microfluidics to control and measure signaling dynamics in single yeast cells
Hansen, Anders S.; Hao, Nan; O'Shea, Erin K.
2015-01-01
Microfluidics coupled to quantitative time-lapse fluorescence microscopy is transforming our ability to control, measure, and understand signaling dynamics in single living cells. Here we describe a pipeline that incorporates multiplexed microfluidic cell culture, automated programmable fluid handling for cell perturbation, quantitative time-lapse microscopy, and computational analysis of time-lapse movies. We illustrate how this setup can be used to control the nuclear localization of the budding yeast transcription factor Msn2. Using this protocol, we generate oscillations of Msn2 localization and measure the dynamic gene expression response of individual genes in single cells. The protocol allows a single researcher to perform up to 20 different experiments in a single day, whilst collecting data for thousands of single cells. Compared to other protocols, the present protocol is relatively easy to adopt and higher-throughput. The protocol can be widely used to control and monitor single-cell signaling dynamics in other signal transduction systems in microorganisms. PMID:26158443
He, Bo; Kim, Sung Kyoung; Son, Sang Jun; Lee, Sang Bok
2010-01-01
Aims: The recent development of 1D barcode arrays has proved them applicable to highly multiplexed bioassays. This article introduces two magnetic decoding protocols for suspension arrays of shape-coded silica nanotubes to process multiplexed assays rapidly and easily, which will benefit the minimization and automation of the arrays. Methods: In the first protocol, the magnetic nanocrystals are incorporated into the inner voids of barcoded silica nanotubes in order to give the nanotubes magnetic properties. The second protocol is performed by trapping the barcoded silica nanotubes onto streptavidin-modified magnetic beads. Results: The rapid and easy decoding process was demonstrated by applying the above two protocols to multiplexed assays, resulting in high selectivity. Furthermore, the magnetic bead-trapped barcode nanotubes provided a great opportunity to exclude the use of dye molecules in multiplexed assays by using barcode nanotubes as signals. Conclusion: The rapid and easy manipulation of encoded carriers using magnetic properties could be used to develop promising suspension arrays for portable bioassays. PMID:20025466
Increasing the Automation and Autonomy for Spacecraft Operations with Criteria Action Table
NASA Technical Reports Server (NTRS)
Li, Zhen-Ping; Savki, Cetin
2005-01-01
The Criteria Action Table (CAT) is an automation tool developed for monitoring real-time system messages for specific events and processes in order to take user-defined actions based on a set of user-defined rules. CAT was developed by Lockheed Martin Space Operations as a part of a larger NASA effort at the Goddard Space Flight Center (GSFC) to create a component-based, middleware-based, and standards-based general purpose ground system architecture referred to as GMSEC - the GSFC Mission Services Evolution Center. CAT has been integrated into the upgraded ground systems for Tropical Rainfall Measuring Mission (TRMM) and Small Explorer (SMEX) satellites and it plays the central role in their automation effort to reduce the cost and increase the reliability for spacecraft operations. The GMSEC architecture provides a standard communication interface and protocol for components to publish/subscribe messages to an information bus. It also provides a standard message definition so components can send and receive messages to the bus interface rather than each other, thus reducing the component-to-component coupling, interface, protocols, and link (socket) management. With the GMSEC architecture, components can publish standard event messages to the bus for all nominal, significant, and surprising events in regard to satellite, celestial, ground system, or any other activity. In addition to sending standard event messages, each GMSEC compliant component is required to accept and process GMSEC directive request messages.
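A criteria/action table can be thought of as a list of (predicate, action) pairs evaluated against each incoming event message. The sketch below shows that shape in Python with a message represented as a plain dict; the field names, criteria, and actions are hypothetical and do not reflect the actual CAT or GMSEC message schema.

```python
# Each rule pairs a predicate over an event message with an action to take.
rules = [
    (lambda msg: msg.get("severity") == "CRITICAL",
     lambda msg: print("PAGE ON-CALL:", msg["text"])),
    (lambda msg: msg.get("subsystem") == "TRMM" and "loss of lock" in msg.get("text", ""),
     lambda msg: print("START TRACKING RECOVERY PROCEDURE")),
]

def handle(msg):
    for criteria, action in rules:
        if criteria(msg):
            action(msg)

handle({"subsystem": "TRMM", "severity": "CRITICAL", "text": "loss of lock on downlink"})
```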
Kulstein, Galina; Marienfeld, Ralf; Miltner, Erich; Wiegand, Peter
2016-10-01
In the last years, microRNA (miRNA) analysis came into focus in the field of forensic genetics. Yet, no standardized and recommendable protocols for co-isolation of miRNA and DNA from forensic relevant samples have been developed so far. Hence, this study evaluated the performance of an automated Maxwell® 16 System-based strategy (Promega) for co-extraction of DNA and miRNA from forensically relevant (blood and saliva) samples compared to (semi-)manual extraction methods. Three procedures were compared on the basis of recovered quantity of DNA and miRNA (as determined by real-time PCR and Bioanalyzer), miRNA profiling (shown by Cq values and extraction efficiency), STR profiles, duration, contamination risk and handling. All in all, the results highlight that the automated co-extraction procedure yielded the highest miRNA and DNA amounts from saliva and blood samples compared to both (semi-)manual protocols. Also, for aged and genuine samples of forensically relevant traces the miRNA and DNA yields were sufficient for subsequent downstream analysis. Furthermore, the strategy allows miRNA extraction only in cases where it is relevant to obtain additional information about the sample type. Besides, this system enables flexible sample throughput and labor-saving sample processing with reduced risk of cross-contamination. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Brito, Maíra M; Lúcio, Cristina F; Angrimani, Daniel S R; Losano, João Diego A; Dalmazzo, Andressa; Nichi, Marcílio; Vannucchi, Camila I
2017-01-02
Despite the existence of several cryopreservation protocols, no systematic research has been carried out to confirm the most suitable protocol for canine sperm. This study aims to assess the effect of adding 5% glycerol during cryopreservation at 37°C (one-step) and 5°C (two-steps), in addition to testing two thawing protocols (37°C for 30 seconds, and 70°C for 8 seconds). We used 12 sperm samples divided into four experimental groups: Single-Step - Slow Thawing Group; Two-Step - Slow Thawing Group; Single-Step - Fast Thawing Group; and Two-Step - Fast Thawing Group. Frozen-thawed samples were submitted to automated analysis of sperm motility, evaluation of plasmatic membrane integrity, acrosomal integrity, mitochondrial activity, sperm morphology, sperm susceptibility to oxidative stress, and sperm binding assay to the perivitelline membrane of chicken egg yolk. Considering the comparison between freezing protocols, no statistical differences were verified for any of the response variables. When comparison between thawing protocols was performed, the slow thawing protocol presented a higher sperm count bound to the perivitelline membrane of chicken egg yolk, compared to the fast thawing protocol. Regardless of the freezing process, the slow thawing protocol can be recommended for the large scale cryopreservation of canine semen, since it shows a consistently better functional result.
Surinova, Silvia; Hüttenhain, Ruth; Chang, Ching-Yun; Espona, Lucia; Vitek, Olga; Aebersold, Ruedi
2013-08-01
Targeted proteomics based on selected reaction monitoring (SRM) mass spectrometry is commonly used for accurate and reproducible quantification of protein analytes in complex biological mixtures. Strictly hypothesis-driven, SRM assays quantify each targeted protein by collecting measurements on its peptide fragment ions, called transitions. To achieve sensitive and accurate quantitative results, experimental design and data analysis must consistently account for the variability of the quantified transitions. This consistency is especially important in large experiments, which increasingly require profiling up to hundreds of proteins over hundreds of samples. Here we describe a robust and automated workflow for the analysis of large quantitative SRM data sets that integrates data processing, statistical protein identification and quantification, and dissemination of the results. The integrated workflow combines three software tools: mProphet for peptide identification via probabilistic scoring; SRMstats for protein significance analysis with linear mixed-effect models; and PASSEL, a public repository for storage, retrieval and query of SRM data. The input requirements for the protocol are files with SRM traces in mzXML format, and a file with a list of transitions in a text tab-separated format. The protocol is especially suited for data with heavy isotope-labeled peptide internal standards. We demonstrate the protocol on a clinical data set in which the abundances of 35 biomarker candidates were profiled in 83 blood plasma samples of subjects with ovarian cancer or benign ovarian tumors. The time frame to realize the protocol is 1-2 weeks, depending on the number of replicates used in the experiment.
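The protocol's inputs include a tab-separated transition list; the short sketch below reads such a file with pandas and counts transitions per protein. The file name and column names are hypothetical, not a schema required by mProphet, SRMstats, or PASSEL.

```python
import pandas as pd

# Hypothetical transition list, e.g. columns:
# protein, peptide_sequence, precursor_mz, fragment_mz, isotope_label
transitions = pd.read_csv("transitions.tsv", sep="\t")

n_per_protein = transitions.groupby("protein")["fragment_mz"].count()
print(n_per_protein.sort_values(ascending=False).head())
```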
High-throughput DNA extraction of forensic adhesive tapes.
Forsberg, Christina; Jansson, Linda; Ansell, Ricky; Hedman, Johannes
2016-09-01
Tape-lifting has since its introduction in the early 2000s become a well-established sampling method in forensic DNA analysis. Sampling is quick and straightforward while the following DNA extraction is more challenging due to the "stickiness", rigidity and size of the tape. We have developed, validated and implemented a simple and efficient direct lysis DNA extraction protocol for adhesive tapes that requires limited manual labour. The method uses Chelex beads and is applied with SceneSafe FAST tape. This direct lysis protocol provided higher mean DNA yields than PrepFiler Express BTA on Automate Express, although the differences were not significant when using clothes worn in a controlled fashion as reference material (p=0.13 and p=0.34 for T-shirts and button-down shirts, respectively). Through in-house validation we show that the method is fit-for-purpose for application in casework, as it provides high DNA yields and amplifiability, as well as good reproducibility and DNA extract stability. After implementation in casework, the proportion of extracts with DNA concentrations above 0.01 ng/μL increased from 71% to 76%. Apart from providing higher DNA yields compared with the previous method, the introduction of the developed direct lysis protocol also reduced the amount of manual labour by half and doubled the potential throughput for tapes at the laboratory. Generally, simplified manual protocols can serve as a cost-effective alternative to sophisticated automation solutions when the aim is to enable high-throughput DNA extraction of complex crime scene samples. Copyright © 2016 The Authors. Published by Elsevier Ireland Ltd. All rights reserved.
Experimental Evaluation of an Integrated Datalink and Automation-Based Strategic Trajectory Concept
NASA Technical Reports Server (NTRS)
Mueller, Eric
2007-01-01
This paper presents research on the interoperability of trajectory-based automation concepts and technologies with modern Flight Management Systems and datalink communication available on many of today's commercial aircraft. A tight integration of trajectory-based ground automation systems with the aircraft Flight Management System through datalink will enable mid-term and far-term benefits from trajectory-based automation methods. A two-way datalink connection between the trajectory-based automation resident in the Center/TRACON Automation System and the Future Air Navigation System-1 integrated FMS/datalink in NASA Ames B747-400 Level D simulator has been established and extensive simulation of the use of datalink messages to generate strategic trajectories completed. A strategic trajectory is defined as an aircraft deviation needed to solve a conflict or honor a route request and then merge the aircraft back to its nominal preferred trajectory using a single continuous trajectory clearance. Engineers on the ground side of the datalink generated lateral and vertical trajectory clearances and transmitted them to the Flight Management System of the 747; the airborne automation then flew the new trajectory without human intervention, requiring the flight crew only to review and to accept the trajectory. This simulation established the protocols needed for a significant majority of the trajectory change types required to solve a traffic conflict or deviate around weather. This demonstration provides a basis for understanding the requirements for integration of trajectory-based automation with current Flight Management Systems and datalink to support future National Airspace System operations.
NASA Technical Reports Server (NTRS)
Nickle, F. R.; Freeman, Arthur B.
1939-01-01
The safety of remotely operated vehicles depends on the correctness of the distributed protocol that facilitates the communication between the vehicle and the operator. A failure in this communication can result in catastrophic loss of the vehicle. To complicate matters, the communication system may be required to satisfy several, possibly conflicting, requirements. The design of protocols is typically an informal process based on successive iterations of a prototype implementation. Yet distributed protocols are notoriously difficult to get correct using such informal techniques. We present a formal specification of the design of a distributed protocol intended for use in a remotely operated vehicle, which is built from the composition of several simpler protocols. We demonstrate proof strategies that allow us to prove properties of each component protocol individually while ensuring that the property is preserved in the composition forming the entire system. Given that designs are likely to evolve as additional requirements emerge, we show how we have automated most of the repetitive proof steps to enable verification of rapidly changing designs.
Streamline coal slurry letdown valve
Platt, Robert J.; Shadbolt, Edward A.
1983-01-01
A streamlined coal slurry letdown valve is featured which has a two-piece throat comprised of a seat and seat retainer. The two-piece design allows for easy assembly and disassembly of the valve. A novel cage holds the two-piece throat together during the high pressure letdown. The coal slurry letdown valve has a long operating life as a result of its streamlined and erosion-resistant surfaces.
Streamline coal slurry letdown valve
Platt, R.J.; Shadbolt, E.A.
1983-11-08
A streamlined coal slurry letdown valve is featured which has a two-piece throat comprised of a seat and seat retainer. The two-piece design allows for easy assembly and disassembly of the valve. A novel cage holds the two-piece throat together during the high pressure letdown. The coal slurry letdown valve has a long operating life as a result of its streamlined and erosion-resistant surfaces. 5 figs.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kurnik, Charles W; Benton, Nathanael; Burns, Patrick
Compressed-air systems are used widely throughout industry for many operations, including pneumatic tools, packaging and automation equipment, conveyors, and other industrial process operations. Compressed-air systems are defined as a group of subsystems composed of air compressors, air treatment equipment, controls, piping, pneumatic tools, pneumatically powered machinery, and process applications using compressed air. A compressed-air system has three primary functional subsystems: supply, distribution, and demand. Air compressors are the primary energy consumers in a compressed-air system and are the primary focus of this protocol. The two compressed-air energy efficiency measures specifically addressed in this protocol are: High-efficiency/variable speed drive (VSD) compressor replacing modulating, load/unload, or constant-speed compressor; and Compressed-air leak survey and repairs. This protocol provides direction on how to reliably verify savings from these two measures using a consistent approach for each.
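As a rough illustration of how leak-repair savings are often estimated, the snippet below multiplies the surveyed leak flow by an assumed compressor specific power and operating hours. All constants are assumptions for illustration; the protocol's actual verification procedure and factors are not reproduced here.

```python
# Illustrative leak-repair savings estimate (all values assumed).
leak_flow_cfm = 25.0              # total leak flow identified in the survey
specific_power_kw_per_cfm = 0.2   # assumed compressor specific power
annual_hours = 6000               # assumed system operating hours per year
energy_rate = 0.10                # assumed electricity price, $/kWh

avoided_kw = leak_flow_cfm * specific_power_kw_per_cfm
annual_kwh = avoided_kw * annual_hours
print(f"avoided demand: {avoided_kw:.1f} kW")
print(f"annual savings: ${annual_kwh * energy_rate:,.0f}")
```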
Advani, Aneel; Jones, Neil; Shahar, Yuval; Goldstein, Mary K; Musen, Mark A
2004-01-01
We develop a method and algorithm for deciding the optimal approach to creating quality-auditing protocols for guideline-based clinical performance measures. An important element of the audit protocol design problem is deciding which guideline elements to audit. Specifically, the problem is how and when to aggregate individual patient case-specific guideline elements into population-based quality measures. The key statistical issue involved is the trade-off between increased reliability with more general population-based quality measures versus increased validity from individually case-adjusted but more restricted measures done at a greater audit cost. Our intelligent algorithm for auditing protocol design is based on hierarchically modeling incrementally case-adjusted quality constraints. We select quality constraints to measure using an optimization criterion based on statistical generalizability coefficients. We present results of the approach from a deployed decision support system for a hypertension guideline.
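One common single-facet form of a generalizability coefficient is the person variance divided by person variance plus error variance over the number of audited items, which captures the reliability/validity trade-off described above. The sketch below evaluates that form for a few item counts; the variance components are invented for illustration, and this is not necessarily the exact criterion used in the deployed system.

```python
# E(rho^2) = var_persons / (var_persons + var_error / n_items)
def g_coefficient(var_persons, var_error, n_items):
    return var_persons / (var_persons + var_error / n_items)

# More aggregated (population-based) measures audit more items per provider,
# raising reliability at the cost of case-specific adjustment.
for n in (1, 3, 5, 10):
    print(n, round(g_coefficient(var_persons=0.4, var_error=1.2, n_items=n), 2))
```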