Sample records for capture for elemental analysis

  1. Database of prompt gamma rays from slow neutron capture for elemental analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Firestone, R.B.; Choi, H.D.; Lindstrom, R.M.

    2004-12-31

    The increasing importance of Prompt Gamma-ray Activation Analysis (PGAA) in a broad range of applications is evident, and has been emphasized at many meetings related to this topic (e.g., Technical Consultants' Meeting, Use of neutron beams for low- and medium-flux research reactors: radiography and materials characterizations, IAEA Vienna, 4-7 May 1993, IAEA-TECDOC-837, 1993). Furthermore, an Advisory Group Meeting (AGM) for the Coordination of the Nuclear Structure and Decay Data Evaluators Network has stated that there is a need for a complete and consistent library of cold- and thermal-neutron capture gamma-ray and cross-section data (AGM held at Budapest, 14-18 October 1996, INDC(NDS)-363); this AGM also recommended the organization of an IAEA CRP on the subject. The International Nuclear Data Committee (INDC) is the primary advisory body to the IAEA Nuclear Data Section on their nuclear data programmes. At a biennial meeting in 1997, the INDC strongly recommended that the Nuclear Data Section support new measurements and update the database on Neutron-induced Prompt Gamma-ray Activation Analysis (21st INDC meeting, INDC/P(97)-20). As a consequence of the various recommendations, a CRP on "Development of a Database for Prompt Gamma-ray Neutron Activation Analysis (PGAA)" was initiated in 1999. Prior to this project, several consultants had defined the scope, objectives and tasks, as approved subsequently by the IAEA. Each CRP participant assumed responsibility for the execution of specific tasks. The results of their and other research work were discussed and approved by the participants in research co-ordination meetings (see Summary reports: INDC(NDS)-411, 2000; INDC(NDS)-424, 2001; and INDC(NDS)-443, 200). PGAA is a non-destructive radioanalytical method, capable of rapid or simultaneous "in-situ" multi-element analyses across the entire Periodic Table, from hydrogen to uranium. However, inaccurate and incomplete data were a significant hindrance in the qualitative and quantitative analysis of complicated capture-gamma spectra by means of PGAA. Therefore, the main goal of the CRP was to improve the quality and quantity of the required data in order to make possible the reliable application of PGAA in fields such as materials science, chemistry, geology, mining, archaeology, environment, food analysis and medicine. This aim was achieved thanks to the dedicated work and effort of the participants. The CD-ROM included with this publication contains the database, the retrieval system, the three CRM reports, and other important electronic documents related to the CRP. The IAEA wishes to thank all CRP participants who contributed to the success of the CRP and the formulation of this publication. Special thanks are due to R.B. Firestone for his leading role in the development of this CRP and his comprehensive compilation, analysis and provision of the adopted database, and to V. Zerkin for the software developments associated with the retrieval system. An essential component of this data compilation is the extensive sets of new measurements of capture gamma-ray energies and intensities undertaken at Budapest by Zs. Revay under the direction of G.L. Molnar. The extensive participation and assistance of H.D. Choi is also greatly appreciated. Other participants in this CRP were: R.M. Lindstrom, S.M. Mughabghab, A.V.R. Reddy, V.H. Tan and C.M. Zhou. Thanks are also due to S.C. Frankle and M.A. Lone for their active participation as consultants at some of the meetings. Finally, the participants wish to thank R. Paviotti-Corcuera (Nuclear Data Section, Division of Physical and Chemical Sciences), who was the IAEA responsible officer for the CRP, this publication and the resulting database. The participants are grateful to D.L. Muir and A.L. Nichols, successive Heads of the Nuclear Data Section, for their active and enthusiastic encouragement in furthering the work of the CRP.

  2. Age-structured mark-recapture analysis: A virtual-population-analysis-based model for analyzing age-structured capture-recapture data

    USGS Publications Warehouse

    Coggins, L.G.; Pine, William E.; Walters, C.J.; Martell, S.J.D.

    2006-01-01

    We present a new model to estimate capture probabilities, survival, abundance, and recruitment using traditional Jolly-Seber capture-recapture methods within a standard fisheries virtual population analysis framework. This approach compares the numbers of marked and unmarked fish at age captured in each year of sampling with predictions based on estimated vulnerabilities and abundance in a likelihood function. Recruitment to the earliest age at which fish can be tagged is estimated by using a virtual population analysis method to back-calculate the expected numbers of unmarked fish at risk of capture. By using information from both marked and unmarked animals in a standard fisheries age structure framework, this approach is well suited to the sparse data situations common in long-term capture-recapture programs with variable sampling effort. © Copyright by the American Fisheries Society 2006.
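    To make the structure of such a likelihood concrete, here is a minimal, purely illustrative Python sketch (not the authors' actual model): unmarked numbers-at-age are back-calculated from an assumed recruitment and natural mortality, and Poisson likelihood terms compare predicted with observed marked and unmarked catches at age. All data values and parameter names below are hypothetical.

    ```python
    import numpy as np
    from scipy.optimize import minimize
    from scipy.stats import poisson

    # Hypothetical catches-at-age for one sampling year (four age classes).
    m_obs = np.array([12, 30, 18, 7])          # marked fish captured at age
    u_obs = np.array([410, 260, 120, 45])      # unmarked fish captured at age
    M_at_large = np.array([60, 150, 90, 40])   # marked fish known to be at large at each age

    def neg_log_like(params, M_nat=0.3):
        """Poisson negative log-likelihood for marked and unmarked catches-at-age.

        params = [log recruitment R, logit vulnerability v]; natural mortality M_nat is fixed.
        Unmarked abundance-at-age is back-calculated VPA-style as exponential survival
        from recruitment (a drastic simplification of the model described above)."""
        R = np.exp(params[0])
        v = 1.0 / (1.0 + np.exp(-params[1]))
        ages = np.arange(len(u_obs))
        U = R * np.exp(-M_nat * ages)      # unmarked abundance at age
        pred_u = v * U                     # expected unmarked catch at age
        pred_m = v * M_at_large            # expected marked catch at age
        return -(poisson.logpmf(u_obs, pred_u).sum() + poisson.logpmf(m_obs, pred_m).sum())

    fit = minimize(neg_log_like, x0=[np.log(1000.0), 0.0], method="Nelder-Mead")
    print(fit.x)  # maximum-likelihood estimates of log-recruitment and logit-vulnerability
    ```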

  3. Systems Analysis of Physical Absorption of CO2 in Ionic Liquids for Pre-Combustion Carbon Capture.

    PubMed

    Zhai, Haibo; Rubin, Edward S

    2018-04-17

    This study develops an integrated technical and economic modeling framework to investigate the feasibility of ionic liquids (ILs) for precombustion carbon capture. The IL 1-hexyl-3-methylimidazolium bis(trifluoromethylsulfonyl)imide is modeled as a potential physical solvent for CO2 capture at integrated gasification combined cycle (IGCC) power plants. The analysis reveals that the energy penalty of the IL-based capture system comes mainly from compression of the process and product streams and from solvent pumping, while the major capital cost components are the compressors and absorbers. On the basis of the plant-level analysis, the cost of CO2 avoided by the IL-based capture and storage system is estimated to be $63 per tonne of CO2. Technical and economic comparisons between IL- and Selexol-based capture systems at the plant level show that an IL-based system could be a feasible option for CO2 capture. Improving the CO2 solubility of ILs can simplify the capture process configuration and lower the process energy and cost penalties to further enhance the viability of this technology.
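    The plant-level "$ per tonne of CO2 avoided" figure quoted above follows the standard definition sketched below; the input values are placeholders chosen only to show the arithmetic, not numbers taken from the study.

    ```python
    def cost_of_co2_avoided(coe_capture, coe_ref, em_ref, em_capture):
        """Standard plant-level metric, in $/tonne CO2 avoided.

        coe_*: levelized cost of electricity ($/MWh) with and without capture and storage.
        em_*:  CO2 emission rate (tonne CO2/MWh) with and without capture and storage."""
        return (coe_capture - coe_ref) / (em_ref - em_capture)

    # Hypothetical inputs (placeholders, not values reported in the paper):
    print(cost_of_co2_avoided(coe_capture=110.0, coe_ref=76.0, em_ref=0.80, em_capture=0.26))
    # -> about 63 $/tonne, i.e. the same order of magnitude as the estimate quoted above
    ```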

  4. Performance Characteristics of a Kernel-Space Packet Capture Module

    DTIC Science & Technology

    2010-03-01

    Defense, or the United States Government. AFIT/GCO/ENG/10-03: PERFORMANCE CHARACTERISTICS OF A KERNEL-SPACE PACKET CAPTURE MODULE. THESIS Presented to the ... 3.1.2.3 Prototype. The proof of concept for this research is the design, development, and comparative performance analysis of a kernel-level N2d capture ... changes to kernel code 5. Can be used for both user-space and kernel-space capture applications in order to control comparative performance analysis to

  5. Microfluidic immunocapture of circulating pancreatic cells using parallel EpCAM and MUC1 capture: characterization, optimization and downstream analysis.

    PubMed

    Thege, Fredrik I; Lannin, Timothy B; Saha, Trisha N; Tsai, Shannon; Kochman, Michael L; Hollingsworth, Michael A; Rhim, Andrew D; Kirby, Brian J

    2014-05-21

    We have developed and optimized a microfluidic device platform for the capture and analysis of circulating pancreatic cells (CPCs) and pancreatic circulating tumor cells (CTCs). Our platform uses parallel anti-EpCAM and cancer-specific mucin 1 (MUC1) immunocapture in a silicon microdevice. Using a combination of anti-EpCAM and anti-MUC1 capture in a single device, we are able to achieve efficient capture while extending immunocapture beyond single marker recognition. We also have detected a known oncogenic KRAS mutation in cells spiked in whole blood using immunocapture, RNA extraction, RT-PCR and Sanger sequencing. To allow for downstream single-cell genetic analysis, intact nuclei were released from captured cells by using targeted membrane lysis. We have developed a staining protocol for clinical samples, including standard CTC markers; DAPI, cytokeratin (CK) and CD45, and a novel marker of carcinogenesis in CPCs, mucin 4 (MUC4). We have also demonstrated a semi-automated approach to image analysis and CPC identification, suitable for clinical hypothesis generation. Initial results from immunocapture of a clinical pancreatic cancer patient sample show that parallel capture may capture more of the heterogeneity of the CPC population. With this platform, we aim to develop a diagnostic biomarker for early pancreatic carcinogenesis and patient risk stratification.

  6. Handbook of capture-recapture analysis

    USGS Publications Warehouse

    Amstrup, Steven C.; McDonald, Trent L.; Manly, Bryan F.J.

    2005-01-01

    Every day, biologists in parkas, raincoats, and rubber boots go into the field to capture and mark a variety of animal species. Back in the office, statisticians create analytical models for the field biologists' data. But many times, representatives of the two professions do not fully understand one another's roles. This book bridges this gap by helping biologists understand state-of-the-art statistical methods for analyzing capture-recapture data. In so doing, statisticians will also become more familiar with the design of field studies and with the real-life issues facing biologists. Reliable outcomes of capture-recapture studies are vital to answering key ecological questions. Is the population increasing or decreasing? Do more or fewer animals have a particular characteristic? In answering these questions, biologists cannot hope to capture and mark entire populations. And frequently, the populations change unpredictably during a study. Thus, increasingly sophisticated models have been employed to convert data into answers to ecological questions. This book, by experts in capture-recapture analysis, introduces the most up-to-date methods for data analysis while explaining the theory behind those methods. Thorough, concise, and portable, it will be immensely useful to biologists, biometricians, and statisticians, students in both fields, and anyone else engaged in the capture-recapture process.

  7. Microfluidic-Based Enrichment and Retrieval of Circulating Tumor Cells for RT-PCR Analysis.

    PubMed

    Gogoi, Priya; Sepehri, Saedeh; Chow, Will; Handique, Kalyan; Wang, Yixin

    2017-01-01

    Molecular analysis of circulating tumor cells (CTCs) is hindered by the low sensitivity and high level of background leukocytes of currently available CTC enrichment technologies. We have developed a novel device to enrich and retrieve CTCs from blood samples by using a microfluidic chip. The Celsee PREP100 device captures CTCs with high sensitivity and allows the captured CTCs to be retrieved for molecular analysis. It uses a microfluidic chip which has approximately 56,320 capture chambers. Based on differences in cell size and deformability, each chamber ensures that small blood cells escape while larger CTCs of varying sizes are trapped and isolated in the chambers. In this report, we used the Celsee PREP100 to capture cancer cells spiked into normal donor blood samples. We were able to show that the device can capture as few as 10 cells with high reproducibility. The captured CTCs were retrieved from the microfluidic chip. The cell recovery rate of this back-flow procedure is 100% and the level of remaining background leukocytes is very low (about 300-400 cells). RNA from the retrieved cells is extracted and converted to cDNA, and gene expression analysis of selected cancer markers can be carried out by using RT-PCR assays. The sensitive and easy-to-use Celsee PREP100 system represents a promising technology for capturing and molecular characterization of CTCs.

  8. A Systematic Approach for Evaluation of Capture Zones at Pump and Treat Systems

    EPA Science Inventory

    This document describes a systematic approach for performing capture zone analysis associated with ground water pump and treat systems. A “capture zone” refers to the three-dimensional region that contributes the ground water extracted by one or more wells or drains. A capture ...

  9. High purity microfluidic sorting and analysis of circulating tumor cells: towards routine mutation detection.

    PubMed

    Autebert, Julien; Coudert, Benoit; Champ, Jérôme; Saias, Laure; Guneri, Ezgi Tulukcuoglu; Lebofsky, Ronald; Bidard, François-Clément; Pierga, Jean-Yves; Farace, Françoise; Descroix, Stéphanie; Malaquin, Laurent; Viovy, Jean-Louis

    2015-05-07

    A new generation of the Ephesia cell capture technology optimized for CTC capture and genetic analysis is presented, characterized in depth and compared with the CellSearch system as a reference. This technology uses magnetic particles bearing tumour-cell specific EpCAM antibodies, self-assembled in a regular array in a microfluidic flow cell. 48,000 high aspect-ratio columns are generated using a magnetic field in a high throughput (>3 ml/h) device and act as sieves to specifically capture the cells of interest through antibody-antigen interactions. Using this device optimized for CTC capture and analysis, we demonstrated the capture of epithelial cells with capture efficiency above 90% for concentrations as low as a few cells per ml. We showed the high specificity of capture with only 0.26% of non-epithelial cells captured for concentrations above 10 million cells per ml. We investigated the capture behavior of cells in the device, and correlated the cell attachment rate with the EpCAM expression on the cell membranes for six different cell lines. We developed and characterized a two-step blood processing method to allow for rapid processing of 10 ml blood tubes in less than 4 hours, and showed a capture rate of 70% for as low as 25 cells spiked in 10 ml blood tubes, with less than 100 contaminating hematopoietic cells. Using this device and procedure, we validated our system on patient samples using an automated cell immunostaining procedure and a semi-automated cell counting method. Our device captured CTCs in 75% of metastatic prostate cancer patients and 80% of metastatic breast cancer patients, and showed similar or better results than the CellSearch device in 10 out of 13 samples. Finally, we demonstrated the possibility of detecting cancer-related PIK3CA gene mutation in 20 cells captured in the chip with a good correlation between the cell count and the quantitation value Cq of the post-capture qPCR.

  10. Meteoroid capture cell construction

    NASA Technical Reports Server (NTRS)

    Zook, H. A.; High, R. W. (Inventor)

    1976-01-01

    A thin membrane covering the open side of a meteoroid capture cell causes an impacting meteoroid to disintegrate as it penetrates the membrane. The capture cell then contains and holds the meteoroid particles for later analysis.

  11. Seamless presentation capture, indexing, and management

    NASA Astrophysics Data System (ADS)

    Hilbert, David M.; Cooper, Matthew; Denoue, Laurent; Adcock, John; Billsus, Daniel

    2005-10-01

    Technology abounds for capturing presentations. However, no simple solution exists that is completely automatic. ProjectorBox is a "zero user interaction" appliance that automatically captures, indexes, and manages presentation multimedia. It operates continuously to record the RGB information sent from presentation devices, such as a presenter's laptop, to display devices, such as a projector. It seamlessly captures high-resolution slide images, text and audio. It requires no operator, specialized software, or changes to current presentation practice. Automatic media analysis is used to detect presentation content and segment presentations. The analysis substantially enhances the web-based user interface for browsing, searching, and exporting captured presentations. ProjectorBox has been in use for over a year in our corporate conference room, and has been deployed in two universities. Our goal is to develop automatic capture services that address both corporate and educational needs.

  12. Linking animal-borne video to accelerometers reveals prey capture variability.

    PubMed

    Watanabe, Yuuki Y; Takahashi, Akinori

    2013-02-05

    Understanding foraging is important in ecology, as it determines the energy gains and, ultimately, the fitness of animals. However, monitoring prey captures of individual animals is difficult. Direct observations using animal-borne videos have short recording periods, and indirect signals (e.g., stomach temperature) are never validated in the field. We took an integrated approach to monitor prey captures by a predator by deploying a video camera (lasting for 85 min) and two accelerometers (on the head and back, lasting for 50 h) on free-swimming Adélie penguins. The movies showed that penguins moved the heads rapidly to capture krill in midwater and fish (Pagothenia borchgrevinki) underneath the sea ice. Captures were remarkably fast (two krill per second in swarms) and efficient (244 krill or 33 P. borchgrevinki in 78-89 min). Prey captures were detected by the signal of head acceleration relative to body acceleration with high sensitivity and specificity (0.83-0.90), as shown by receiver-operating characteristic analysis. Extension of signal analysis to the entire behavioral records showed that krill captures were spatially and temporally more variable than P. borchgrevinki captures. Notably, the frequency distribution of krill capture rate closely followed a power-law model, indicating that the foraging success of penguins depends on a small number of very successful dives. The three steps illustrated here (i.e., video observations, linking video to behavioral signals, and extension of signal analysis) are unique approaches to understanding the spatial and temporal variability of ecologically important events such as foraging.
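    A minimal sketch of this kind of detector and its ROC evaluation (illustrative only: the event definitions, window lengths and thresholds used by the authors are not reproduced here, and the data below are simulated):

    ```python
    import numpy as np
    from sklearn.metrics import roc_curve, roc_auc_score

    rng = np.random.default_rng(0)
    n = 600                                   # simulated 1-s records from the video overlap
    labels = rng.random(n) < 0.1              # 1 = prey capture seen on video (hypothetical)
    head = 0.5 + 1.5 * labels + rng.normal(0, 0.4, n)   # head accelerates sharply at captures
    back = 0.5 + rng.normal(0, 0.4, n)                  # body acceleration changes little

    signal = head - back                      # head acceleration relative to body acceleration
    fpr, tpr, thresholds = roc_curve(labels, signal)
    print("AUC:", roc_auc_score(labels, signal))

    # Choose the threshold maximizing sensitivity + specificity (Youden's J); the same
    # threshold could then be applied to the full 50 h accelerometer record.
    best_threshold = thresholds[np.argmax(tpr - fpr)]
    print("chosen threshold:", best_threshold)
    ```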

  13. FRAP Analysis: Accounting for Bleaching during Image Capture

    PubMed Central

    Wu, Jun; Shekhar, Nandini; Lele, Pushkar P.; Lele, Tanmay P.

    2012-01-01

    The analysis of Fluorescence Recovery After Photobleaching (FRAP) experiments involves mathematical modeling of the fluorescence recovery process. An important feature of FRAP experiments that tends to be ignored in the modeling is that there can be a significant loss of fluorescence due to bleaching during image capture. In this paper, we explicitly include the effects of bleaching during image capture in the model for the recovery process, instead of correcting for the effects of bleaching using reference measurements. Using experimental examples, we demonstrate the usefulness of such an approach in FRAP analysis. PMID:22912750
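    One common way to fold acquisition bleaching directly into the recovery model (a sketch of the general idea, not necessarily the exact formulation used in the paper) is to attenuate an ideal single-exponential recovery by a per-image bleaching factor:

    ```latex
    % F_0: post-bleach intensity, F_\infty: recovery plateau, k: recovery rate constant,
    % \delta: fractional fluorescence lost per captured image, n(t): images acquired by time t.
    F_{\mathrm{obs}}(t) \;=\; \bigl[\,F_{\infty} - (F_{\infty} - F_{0})\,e^{-k t}\,\bigr]\,(1-\delta)^{\,n(t)}
    ```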

  14. Morphological Idiosyncracies in Classical Arabic: Evidence Favoring Lexical Representations over Rules.

    ERIC Educational Resources Information Center

    Miller, Ann M.

    A lexical representational analysis of Classical Arabic is proposed that captures a generalization that McCarthy's (1979, 1981) autosegmental analysis misses, namely that idiosyncratic characteristics of the derivational binyanim in Arabic are lexical, not morphological. This analysis captures that generalization by treating all the idiosyncracies…

  15. Liquid biopsy on chip: a paradigm shift towards the understanding of cancer metastasis.

    PubMed

    Tadimety, Amogha; Syed, Abeer; Nie, Yuan; Long, Christina R; Kready, Kasia M; Zhang, John X J

    2017-01-23

    This comprehensive review serves as a guide for developing scalable and robust liquid biopsies on chip for capture, detection, and analysis of circulating tumor cells (CTCs). Liquid biopsy, the detection of biomarkers from body fluids, has proven challenging because of CTC rarity and the heterogeneity of CTCs shed from tumors. The review starts with the underlying biological mechanisms that make liquid biopsy a challenge before moving into an evaluation of current technological progress. Then, a framework for evaluation of the technologies is presented with special attention to throughput, capture rate, and cell viability for analysis. Technologies for CTC capture, detection, and analysis will be evaluated based on these criteria, with a focus on current approaches, limitations and future directions. The paper provides a critical review for microchip developers as well as clinical investigators to build upon the existing progress towards the goal of designing CTC capture, detection, and analysis platforms.

  16. Performance analysis of Aloha networks with power capture and near/far effect

    NASA Astrophysics Data System (ADS)

    McCartin, Joseph T.

    1989-06-01

    An analysis is presented of the throughput characteristics of several classes of Aloha packet networks. Specifically, the throughput for variable packet length Aloha utilizing multiple power levels to induce receiver capture is derived. The results are extended to an analysis of a selective-repeat ARQ Aloha network. Analytical results are presented which indicate a significant increase in throughput for a variable packet network implementing a random two power level capture scheme. Further research into the area of the near/far effect on Aloha networks is included. Improvements in throughput for mobile radio Aloha networks which are subject to the near/far effect are presented. Tactical Command, Control and Communications (C3) systems of the future will rely on Aloha ground mobile data networks. The incorporation of power capture and the near/far effect into future tactical networks will result in improved system analysis, design, and performance.
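    For orientation, the textbook slotted-Aloha throughput expression with receiver capture is shown below; the paper's variable-packet-length, multi-power-level derivation is more elaborate, so this is only the baseline form.

    ```latex
    % G: offered load (Poisson, packets per slot); C_n: probability that one of the
    % n overlapping packets is captured by the receiver.
    S \;=\; \sum_{n=1}^{\infty} \frac{G^{n} e^{-G}}{n!}\, C_n ,
    \qquad C_1 = 1,\; C_n = 0 \ (n \ge 2) \;\Longrightarrow\; S = G\,e^{-G}\ \text{(no capture)} .
    ```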

  17. Multiparameter cell affinity chromatography: separation and analysis in a single microfluidic channel.

    PubMed

    Li, Peng; Gao, Yan; Pappas, Dimitri

    2012-10-02

    The ability to sort and capture more than one cell type from a complex sample will enable a wide variety of studies of cell proliferation and death and the analysis of disease states. In this work, we integrated a pneumatic actuated control layer to an affinity separation layer to create different antibody-coating regions on the same fluidic channel. The comparison of different antibody capture capabilities to the same cell line was demonstrated by flowing Ramos cells through anti-CD19- and anti-CD71-coated regions in the same channel. It was determined that the cell capture density on the anti-CD19 region was 2.44 ± 0.13 times higher than that on the anti-CD71-coated region. This approach can be used to test different affinity molecules for selectivity and capture efficiency using a single cell line in one separation. Selective capture of Ramos and HuT 78 cells from a mixture was also demonstrated using two antibody regions in the same channel. Greater than 90% purity was obtained on both capture areas in both continuous flow and stop flow separation modes. A four-region antibody-coated device was then fabricated to study the simultaneous, serial capture of three different cell lines. In this case the device showed effective capture of cells in a single separation channel, opening up the possibility of multiple cell sorting. Multiparameter sequential blood sample analysis was also demonstrated with high capture specificity (>97% for both CD19+ and CD4+ leukocytes). The chip can also be used to selectively treat cells after affinity separation.

  18. Linking animal-borne video to accelerometers reveals prey capture variability

    PubMed Central

    Watanabe, Yuuki Y.; Takahashi, Akinori

    2013-01-01

    Understanding foraging is important in ecology, as it determines the energy gains and, ultimately, the fitness of animals. However, monitoring prey captures of individual animals is difficult. Direct observations using animal-borne videos have short recording periods, and indirect signals (e.g., stomach temperature) are never validated in the field. We took an integrated approach to monitor prey captures by a predator by deploying a video camera (lasting for 85 min) and two accelerometers (on the head and back, lasting for 50 h) on free-swimming Adélie penguins. The movies showed that penguins moved the heads rapidly to capture krill in midwater and fish (Pagothenia borchgrevinki) underneath the sea ice. Captures were remarkably fast (two krill per second in swarms) and efficient (244 krill or 33 P. borchgrevinki in 78–89 min). Prey captures were detected by the signal of head acceleration relative to body acceleration with high sensitivity and specificity (0.83–0.90), as shown by receiver-operating characteristic analysis. Extension of signal analysis to the entire behavioral records showed that krill captures were spatially and temporally more variable than P. borchgrevinki captures. Notably, the frequency distribution of krill capture rate closely followed a power-law model, indicating that the foraging success of penguins depends on a small number of very successful dives. The three steps illustrated here (i.e., video observations, linking video to behavioral signals, and extension of signal analysis) are unique approaches to understanding the spatial and temporal variability of ecologically important events such as foraging. PMID:23341596

  19. Motion Analysis System for Instruction of Nihon Buyo using Motion Capture

    NASA Astrophysics Data System (ADS)

    Shinoda, Yukitaka; Murakami, Shingo; Watanabe, Yuta; Mito, Yuki; Watanuma, Reishi; Marumo, Mieko

    The passing on and preserving of advanced technical skills has become an important issue in a variety of fields, and motion analysis using motion capture has recently become popular in the research of advanced physical skills. This research aims to construct a system having a high on-site instructional effect on dancers learning Nihon Buyo, a traditional dance in Japan, and to classify Nihon Buyo dancing according to style, school, and dancer's proficiency by motion analysis. We have been able to study motion analysis systems for teaching Nihon Buyo now that body-motion data can be digitized and stored by motion capture systems using high-performance computers. Thus, with the aim of developing a user-friendly instruction-support system, we have constructed a motion analysis system that displays a dancer's time series of body motions and center of gravity for instructional purposes. In this paper, we outline this instructional motion analysis system based on three-dimensional position data obtained by motion capture. We also describe motion analysis that we performed based on center-of-gravity data obtained by this system and motion analysis focusing on school and age group using this system.

  20. THE RADIATIVE NEUTRON CAPTURE ON 2H, 6Li, 7Li, 12C AND 13C AT ASTROPHYSICAL ENERGIES

    NASA Astrophysics Data System (ADS)

    Dubovichenko, Sergey; Dzhazairov-Kakhramanov, Albert; Burkova, Natalia

    2013-05-01

    The continued interest in the study of radiative neutron capture on atomic nuclei is due, on the one hand, to the important role played by this process in the analysis of many fundamental properties of nuclei and nuclear reactions, and, on the other hand, to the wide use of the capture cross-section data in the various applications of nuclear physics and nuclear astrophysics, and, also, to the importance of the analysis of primordial nucleosynthesis in the Universe. This paper is devoted to the description of results for the processes of the radiative neutron capture on certain light atomic nuclei at thermal and astrophysical energies. The consideration of these processes is done within the framework of the potential cluster model (PCM), a general description of which was given earlier. The methods of using the obtained results, based on intercluster potentials derived from phase shift analysis, are demonstrated in calculations of the radiative capture characteristics. The considered capture reactions are not part of stellar thermonuclear cycles, but are involved in the basic reaction chain of primordial nucleosynthesis in the course of the formation of the Universe.

  1. Bench Scale Process for Low Cost CO2 Capture Using a Phase-Changing Absorbent: Techno-Economic Analysis Topical Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Miebach, Barbara; McDuffie, Dwayne; Spiry, Irina

    The objective of this project is to design and build a bench-scale process for a novel phase-changing CO2 capture solvent. The project will establish scalability and technical and economic feasibility of using a phase-changing CO2 capture absorbent for post-combustion capture of CO2 from coal-fired power plants with 90% capture efficiency and 95% CO2 purity at a cost of $40/tonne of CO2 captured by 2025 and a cost of <$10/tonne of CO2 captured by 2035. This report presents system and economic analysis for a process that uses a phase-changing aminosilicone solvent to remove CO2 from pulverized coal (PC) power plant flue gas. The aminosilicone solvent is a pure 1,3-bis(3-aminopropyl)-1,1,3,3-tetramethyldisiloxane (GAP-0). Performance of the phase-changing aminosilicone technology is compared to that of a conventional carbon capture system using aqueous monoethanolamine (MEA). This analysis demonstrates that the aminosilicone process has significant advantages relative to an MEA-based system. The first-year CO2 removal cost for the phase-changing CO2 capture process is $52.1/tonne, compared to $66.4/tonne for the aqueous amine process. The phase-changing CO2 capture process is less costly than MEA because of advantageous solvent properties that include higher working capacity, lower corrosivity, lower vapor pressure, and lower heat capacity. The phase-changing aminosilicone process has approximately 32% lower equipment capital cost compared to that of the aqueous amine process. However, this solvent is susceptible to thermal degradation at CSTR desorber operating temperatures, which could add as much as $88/tonne to the CO2 capture cost associated with solvent makeup. Future work is focused on mitigating this critical risk by developing an advanced low-temperature desorber that can deliver comparable desorption performance and significantly reduced thermal degradation rate.

  2. Under-reporting of road traffic mortality in developing countries: application of a capture-recapture statistical model to refine mortality estimates.

    PubMed

    Samuel, Jonathan C; Sankhulani, Edward; Qureshi, Javeria S; Baloyi, Paul; Thupi, Charles; Lee, Clara N; Miller, William C; Cairns, Bruce A; Charles, Anthony G

    2012-01-01

    Road traffic injuries are a major cause of preventable death in sub-Saharan Africa. Accurate epidemiologic data are scarce and under-reporting from primary data sources is common. Our objectives were to estimate the incidence of road traffic deaths in Malawi using capture-recapture statistical analysis and determine what future efforts will best improve upon this estimate. Our capture-recapture model combined primary data from both police and hospital-based registries over a one year period (July 2008 to June 2009). The mortality incidences from the primary data sources were 0.075 and 0.051 deaths/1000 person-years, respectively. Using capture-recapture analysis, the combined incidence of road traffic deaths ranged 0.192-0.209 deaths/1000 person-years. Additionally, police data were more likely to include victims who were male, drivers or pedestrians, and victims from incidents with greater than one vehicle involved. We concluded that capture-recapture analysis is a good tool to estimate the incidence of road traffic deaths, and that capture-recapture analysis overcomes limitations of incomplete data sources. The World Health Organization estimated incidence of road traffic deaths for Malawi utilizing a binomial regression model and survey data and found a similar estimate despite strikingly different methods, suggesting both approaches are valid. Further research should seek to improve capture-recapture data through utilization of more than two data sources and improving accuracy of matches by minimizing missing data, application of geographic information systems, and use of names and civil registration numbers if available.
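    The basic two-source estimator underlying this kind of analysis is shown below in its Chapman form (a standard textbook expression; the study's statistical model may differ in detail):

    ```latex
    % n_P: road traffic deaths in the police registry, n_H: deaths in the hospital registry,
    % m: deaths matched in both sources.
    \hat{N} \;=\; \frac{(n_P + 1)(n_H + 1)}{m + 1} - 1,
    \qquad
    \widehat{\operatorname{Var}}(\hat{N}) \;=\; \frac{(n_P+1)(n_H+1)(n_P-m)(n_H-m)}{(m+1)^2 (m+2)} .
    ```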

  3. Under-Reporting of Road Traffic Mortality in Developing Countries: Application of a Capture-Recapture Statistical Model to Refine Mortality Estimates

    PubMed Central

    Samuel, Jonathan C.; Sankhulani, Edward; Qureshi, Javeria S.; Baloyi, Paul; Thupi, Charles; Lee, Clara N.; Miller, William C.; Cairns, Bruce A.; Charles, Anthony G.

    2012-01-01

    Road traffic injuries are a major cause of preventable death in sub-Saharan Africa. Accurate epidemiologic data are scarce and under-reporting from primary data sources is common. Our objectives were to estimate the incidence of road traffic deaths in Malawi using capture-recapture statistical analysis and determine what future efforts will best improve upon this estimate. Our capture-recapture model combined primary data from both police and hospital-based registries over a one year period (July 2008 to June 2009). The mortality incidences from the primary data sources were 0.075 and 0.051 deaths/1000 person-years, respectively. Using capture-recapture analysis, the combined incidence of road traffic deaths ranged 0.192–0.209 deaths/1000 person-years. Additionally, police data were more likely to include victims who were male, drivers or pedestrians, and victims from incidents with greater than one vehicle involved. We concluded that capture-recapture analysis is a good tool to estimate the incidence of road traffic deaths, and that capture-recapture analysis overcomes limitations of incomplete data sources. The World Health Organization estimated incidence of road traffic deaths for Malawi utilizing a binomial regression model and survey data and found a similar estimate despite strikingly different methods, suggesting both approaches are valid. Further research should seek to improve capture-recapture data through utilization of more than two data sources and improving accuracy of matches by minimizing missing data, application of geographic information systems, and use of names and civil registration numbers if available. PMID:22355338

  4. Loss of local capture of the pulmonary vein myocardium after antral isolation: prevalence and clinical significance.

    PubMed

    Squara, Fabien; Liuba, Ioan; Chik, William; Santangeli, Pasquale; Zado, Erica S; Callans, David J; Marchlinski, Francis E

    2015-03-01

    Capture of the myocardial sleeves of the pulmonary veins (PV) during PV pacing is mandatory for assessing exit block after PV isolation (PVI). However, previous studies reported that a significant proportion of PVs failed to demonstrate local capture after PVI. We designed this study to evaluate the prevalence and the clinical significance of loss of PV capture after PVI. Thirty patients (14 redo) undergoing antral PVI were included. Before and after PVI, local PV capture was assessed during circumferential pacing (10 mA/2 milliseconds) with a circular multipolar catheter (CMC), using EGM analysis from each dipole of the CMC and from the ablation catheter placed in ipsilateral PV. Pacing output was varied to optimize identification of sleeve capture. All PVs demonstrated sleeve capture before PVI, but only 81% and 40% after first time and redo PVI, respectively (P < 0.001 vs. before PVI). In multivariate analysis, absence of spontaneous PV depolarizations after PVI and previous PVI procedures were associated with less PV sleeve capture after PVI (40% sleeve capture, P < 0.001 for both). Loss of PV local capture by design was coincident with the development of PV entrance block and importantly predicted absence of acute reconnection during adenosine challenge with 96% positive predictive value (23% negative predictive value). Loss of PV local capture is common after antral PVI resulting in entrance block, and may be used as a specific alternate endpoint for PV electrical isolation. Additionally, loss of PV local capture may identify PVs at very low risk of acute reconnection during adenosine challenge. © 2014 Wiley Periodicals, Inc.

  5. Density estimation using the trapping web design: A geometric analysis

    USGS Publications Warehouse

    Link, W.A.; Barker, R.J.

    1994-01-01

    Population densities for small mammal and arthropod populations can be estimated using capture frequencies for a web of traps. A conceptually simple geometric analysis that avoids the need to estimate a point on a density function is proposed. This analysis incorporates data from the outermost rings of traps, explaining large capture frequencies in these rings rather than truncating them from the analysis.
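    For orientation, a minimal, purely illustrative sketch of the trapping-web geometry is given below (all numbers are hypothetical, and this is not the estimator proposed in the paper): captures in each ring are divided by the area of the corresponding annulus, and density is read from the innermost rings, where capture is assumed to be nearly certain.

    ```python
    import numpy as np

    # Hypothetical trapping-web data: ring radii (m) and animals first captured per ring.
    ring_radius = np.array([5, 10, 15, 20, 25, 30, 35, 40], dtype=float)
    captures    = np.array([18, 31, 47, 55, 52, 40, 24, 12], dtype=float)

    # Annulus associated with each ring (edges at midpoints between adjacent radii).
    edges = np.concatenate(([0.0], (ring_radius[:-1] + ring_radius[1:]) / 2, [ring_radius[-1] + 2.5]))
    annulus_area = np.pi * (edges[1:] ** 2 - edges[:-1] ** 2)     # m^2

    density_by_ring = captures / annulus_area                     # animals per m^2
    # Inner rings approximate the true density (capture probability ~1 near the web centre);
    # the apparent density in the outer rings falls off as capture probability declines.
    print(10_000 * density_by_ring[:3].mean(), "animals per hectare (inner-ring estimate)")
    ```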

  6. Coupled semivariogram uncertainty of hydrogeological and geophysical data on capture zone uncertainty analysis

    USGS Publications Warehouse

    Rahman, A.; Tsai, F.T.-C.; White, C.D.; Willson, C.S.

    2008-01-01

    This study investigates capture zone uncertainty that relates to the coupled semivariogram uncertainty of hydrogeological and geophysical data. Semivariogram uncertainty is represented by the uncertainty in structural parameters (range, sill, and nugget). We used the beta distribution function to derive the prior distributions of structural parameters. The probability distributions of structural parameters were further updated through the Bayesian approach with the Gaussian likelihood functions. Cokriging of noncollocated pumping test data and electrical resistivity data was conducted to better estimate hydraulic conductivity through autosemivariograms and pseudo-cross-semivariogram. Sensitivities of capture zone variability with respect to the spatial variability of hydraulic conductivity, porosity and aquifer thickness were analyzed using ANOVA. The proposed methodology was applied to the analysis of capture zone uncertainty at the Chicot aquifer in Southwestern Louisiana, where a regional groundwater flow model was developed. MODFLOW-MODPATH was adopted to delineate the capture zone. The ANOVA results showed that both capture zone area and compactness were sensitive to hydraulic conductivity variation. We concluded that the capture zone uncertainty due to the semivariogram uncertainty is much higher than that due to the kriging uncertainty for given semivariograms. In other words, the sole use of conditional variances of kriging may greatly underestimate the flow response uncertainty. Semivariogram uncertainty should also be taken into account in the uncertainty analysis. © 2008 ASCE.
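    For reference, a typical parametric semivariogram whose structural parameters carry the uncertainty analyzed here is the exponential model below (one standard form, shown only as an illustration):

    ```latex
    % c_0: nugget, c: partial sill, a: practical range, h: lag distance.
    \gamma(h) \;=\; c_0 + c\left(1 - e^{-3h/a}\right), \qquad h > 0 .
    ```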

  7. Fully automated corneal endothelial morphometry of images captured by clinical specular microscopy

    NASA Astrophysics Data System (ADS)

    Bucht, Curry; Söderberg, Per; Manneberg, Göran

    2009-02-01

    The corneal endothelium serves as the posterior barrier of the cornea. Factors such as clarity and refractive properties of the cornea are in direct relationship to the quality of the endothelium. The endothelial cell density is considered the most important morphological factor. Morphometry of the corneal endothelium is presently done by semi-automated analysis of pictures captured by a Clinical Specular Microscope (CSM). Because of the occasional need for operator involvement, this process can be tedious, having a negative impact on sampling size. This study was dedicated to the development of fully automated analysis of images of the corneal endothelium, captured by CSM, using Fourier analysis. Software was developed in the mathematical programming language Matlab. Pictures of the corneal endothelium, captured by CSM, were read into the analysis software. The software automatically performed digital enhancement of the images. The digitally enhanced images of the corneal endothelium were transformed, using the fast Fourier transform (FFT). Tools were developed and applied for identification and analysis of relevant characteristics of the Fourier transformed images. The data obtained from each Fourier transformed image was used to calculate the mean cell density of its corresponding corneal endothelium. The calculation was based on well-known diffraction theory. Results in the form of estimated cell density of the corneal endothelium were obtained, using fully automated analysis software on images captured by CSM. The cell density obtained by the fully automated analysis was compared to the cell density obtained from classical, semi-automated analysis and a high correlation was found.
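    A minimal sketch of the Fourier-based idea (illustrative assumptions: a square image, a roughly hexagonal endothelial mosaic, and a radially averaged power spectrum whose peak marks the mean cell spacing; this is not the authors' Matlab implementation):

    ```python
    import numpy as np

    def endothelial_density_from_fft(img, pixel_size_um):
        """Estimate endothelial cell density (cells/mm^2) from the dominant spatial
        frequency of a square specular-microscopy image.

        Sketch only: assumes a roughly regular, hexagonal endothelial mosaic, so the
        radially averaged power spectrum peaks at the mean cell spacing."""
        n = img.shape[0]
        power = np.abs(np.fft.fftshift(np.fft.fft2(img - img.mean()))) ** 2

        # Radially average the power spectrum around the zero-frequency centre.
        y, x = np.indices(img.shape)
        r = np.hypot(x - n / 2, y - n / 2).astype(int)
        sums = np.bincount(r.ravel(), weights=power.ravel())
        counts = np.bincount(r.ravel())
        radial = sums[: n // 2] / counts[: n // 2]

        # Peak radius (in frequency bins), skipping the low-frequency illumination term.
        r_peak = np.argmax(radial[5:]) + 5
        spacing_mm = (n * pixel_size_um) / r_peak / 1000.0    # mean centre-to-centre spacing
        return 2.0 / (np.sqrt(3.0) * spacing_mm ** 2)         # hexagonal-packing density
    ```

    Calling endothelial_density_from_fft(img, pixel_size_um=1.1) on an enhanced CSM image would return the estimated density in cells/mm^2; the 1.1 µm pixel size is a placeholder, not an instrument specification.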

  8. Modeling association among demographic parameters in analysis of open population capture-recapture data.

    PubMed

    Link, William A; Barker, Richard J

    2005-03-01

    We present a hierarchical extension of the Cormack-Jolly-Seber (CJS) model for open population capture-recapture data. In addition to recaptures of marked animals, we model first captures of animals and losses on capture. The parameter set includes capture probabilities, survival rates, and birth rates. The survival rates and birth rates are treated as a random sample from a bivariate distribution, thus the model explicitly incorporates correlation in these demographic rates. A key feature of the model is that the likelihood function, which includes a CJS model factor, is expressed entirely in terms of identifiable parameters; losses on capture can be factored out of the model. Since the computational complexity of classical likelihood methods is prohibitive, we use Markov chain Monte Carlo in a Bayesian analysis. We describe an efficient candidate-generation scheme for Metropolis-Hastings sampling of CJS models and extensions. The procedure is illustrated using mark-recapture data for the moth Gonodontis bidentata.
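    The hierarchical layer can be written schematically as a bivariate normal model on transformed demographic rates (the paper's exact parameterization and priors may differ):

    ```latex
    % \phi_t: survival rate and f_t: per-capita birth rate for interval t; the off-diagonal
    % element of \Sigma carries the survival-recruitment correlation the model is built to capture.
    \begin{pmatrix} \operatorname{logit}(\phi_t) \\ \log(f_t) \end{pmatrix}
    \;\sim\; \mathcal{N}_2\!\left(\boldsymbol{\mu}, \Sigma\right), \qquad t = 1, \dots, T-1 .
    ```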

  9. Highly sensitive detection of the group A Rotavirus using Apolipoprotein H-coated ELISA plates compared to quantitative real-time PCR.

    PubMed

    Adlhoch, Cornelia; Kaiser, Marco; Hoehne, Marina; Mas Marques, Andreas; Stefas, Ilias; Veas, Francisco; Ellerbrok, Heinz

    2011-02-10

    The principle of a capture ELISA is binding of specific capture antibodies (polyclonal or monoclonal) to the surface of a suitable 96 well plate. These immobilized antibodies are capable of specifically binding a virus present in a clinical sample. Subsequently, the captured virus is detected using a specific detection antibody. The drawback of this method is that a capture ELISA can only function for a single virus captured by the primary antibody. Human Apolipoprotein H (ApoH) or β2-glycoprotein 1 is able to poly-specifically bind viral pathogens. Replacing specific capture antibodies by ApoH should allow poly-specific capture of different viruses that subsequently could be revealed using specific detection antibodies. Thus, using a single capture ELISA format different viruses could be analysed depending on the detection antibody that is applied. In order to demonstrate that this is a valid approach we show detection of group A rotaviruses from stool samples as a proof of principle for a new method of capture ELISA that should also be applicable to other viruses. Stool samples of different circulating common human and potentially zoonotic group A rotavirus strains, which were pretested in commercial EIAs and genotyped by PCR, were tested in parallel in an ApoH-ELISA set-up and by quantitative real-time PCR (qPCR). Several control samples were included in the analysis. The ApoH-ELISA was suitable for the capture of rotavirus particles and their detection down to 1,000 infectious units (TCID50/ml). Subsets of diagnostic samples of different G- and P-types tested positive in the ApoH-ELISA in different dilutions. Compared to the qPCR results, the analysis showed high sensitivity, specificity and low cross-reactivity for the ApoH-ELISA, which was confirmed in receiver operating characteristics (ROC) analysis. In this study the development of a highly sensitive and specific capture ELISA was demonstrated by combining a poly-specific ApoH capture step with specific detection antibodies using group A rotaviruses as an example.

  10. Capturing Nanotechnology's Current State of Development via Analysis of Patents. OECD Science, Technology and Industry Working Papers, 2007/4

    ERIC Educational Resources Information Center

    Igami, Masatsura; Okazaki, Teruo

    2007-01-01

    This analysis aims at capturing current inventive activities in nanotechnologies based on the analysis of patent applications to the European Patent Office (EPO). Reported findings include: (1) Nanotechnology is a multifaceted technology, currently consisting of a set of technologies on the nanometre scale rather than a single technological field;…

  11. Qualitative Analysis of E-Liquid Emissions as a Function of Flavor Additives Using Two Aerosol Capture Methods.

    PubMed

    Eddingsaas, Nathan; Pagano, Todd; Cummings, Cody; Rahman, Irfan; Robinson, Risa; Hensel, Edward

    2018-02-13

    This work investigates emissions sampling methods employed for qualitative identification of compounds in e-liquids and their resultant aerosols to assess what capture methods may be sufficient to identify harmful and potentially harmful constituents present. Three popular e-liquid flavors (cinnamon, mango, vanilla) were analyzed using qualitative gas chromatography-mass spectrometry (GC-MS) in the un-puffed state. Each liquid was also machine-puffed under realistic-use flow rate conditions and emissions were captured using two techniques: filter pads and methanol impingers. GC-MS analysis was conducted on the emissions captured using both techniques from all three e-liquids. The e-liquid GC-MS analysis resulted in positive identification of 13 compounds from the cinnamon flavor e-liquid, 31 from mango, and 19 from vanilla, including a number of compounds observed in all e-liquid experiments. Nineteen compounds were observed in emissions which were not present in the un-puffed e-liquid. Qualitative GC-MS analysis of the emissions samples identified compounds observed in all three sample types (e-liquid, impinger, and filter pad) and in each subset thereof. A limited number of compounds were observed in emissions captured with impingers, but were not observed in emissions captured using filter pads; a larger number of compounds were observed in emissions collected on the filter pads, but not those captured with impingers. It is demonstrated that sampling methods have different sampling efficiencies and some compounds might be missed using only one method. It is recommended to investigate filter pads, impingers, thermal desorption tubes, and solvent extraction resins to establish robust sampling methods for emissions testing of e-cigarette emissions.

  12. Qualitative Analysis of E-Liquid Emissions as a Function of Flavor Additives Using Two Aerosol Capture Methods

    PubMed Central

    Eddingsaas, Nathan; Pagano, Todd; Cummings, Cody; Rahman, Irfan; Robinson, Risa

    2018-01-01

    This work investigates emissions sampling methods employed for qualitative identification of compounds in e-liquids and their resultant aerosols to assess what capture methods may be sufficient to identify harmful and potentially harmful constituents present. Three popular e-liquid flavors (cinnamon, mango, vanilla) were analyzed using qualitative gas chromatography-mass spectrometry (GC-MS) in the un-puffed state. Each liquid was also machine-puffed under realistic-use flow rate conditions and emissions were captured using two techniques: filter pads and methanol impingers. GC-MS analysis was conducted on the emissions captured using both techniques from all three e-liquids. The e-liquid GC-MS analysis resulted in positive identification of 13 compounds from the cinnamon flavor e-liquid, 31 from mango, and 19 from vanilla, including a number of compounds observed in all e-liquid experiments. Nineteen compounds were observed in emissions which were not present in the un-puffed e-liquid. Qualitative GC-MS analysis of the emissions samples identified compounds observed in all three sample types (e-liquid, impinger, and filter pad) and in each subset thereof. A limited number of compounds were observed in emissions captured with impingers, but were not observed in emissions captured using filter pads; a larger number of compounds were observed in emissions collected on the filter pads, but not those captured with impingers. It is demonstrated that sampling methods have different sampling efficiencies and some compounds might be missed using only one method. It is recommended to investigate filter pads, impingers, thermal desorption tubes, and solvent extraction resins to establish robust sampling methods for emissions testing of e-cigarette emissions. PMID:29438289

  13. Laser Capture Microdissection for Protein and NanoString RNA analysis

    PubMed Central

    Golubeva, Yelena; Salcedo, Rosalba; Mueller, Claudius; Liotta, Lance A.; Espina, Virginia

    2013-01-01

    Laser capture microdissection (LCM) allows the precise procurement of enriched cell populations from a heterogeneous tissue, or live cell culture, under direct microscopic visualization. Histologically enriched cell populations can be procured by harvesting cells of interest directly, or isolating specific cells by ablating unwanted cells. The basic components of laser microdissection technology are a) visualization of cells via light microscopy, b) transfer of laser energy to a thermolabile polymer with either the formation of a polymer-cell composite (capture method) or transfer of laser energy via an ultraviolet laser to photovolatize a region of tissue (cutting method), and c) removal of cells of interest from the heterogeneous tissue section. The capture and cutting methods (instruments) for laser microdissection differ in the manner by which cells of interest are removed from the heterogeneous sample. Laser energy in the capture method is infrared (810 nm), while in the cutting mode the laser is ultraviolet (355 nm). Infrared lasers melt a thermolabile polymer that adheres to the cells of interest, whereas ultraviolet lasers ablate cells for either removal of unwanted cells or excision of a defined area of cells. LCM technology is applicable to an array of applications including mass spectrometry, DNA genotyping and loss-of-heterozygosity analysis, RNA transcript profiling, cDNA library generation, proteomics discovery, and signal kinase pathway profiling. This chapter describes laser capture microdissection using an ArcturusXT instrument for protein LCM sample analysis, and using a mmi CellCut Plus® instrument for RNA analysis via NanoString technology. PMID:23027006

  14. The utility of live video capture to enhance debriefing following transcatheter aortic valve replacement.

    PubMed

    Seamans, David P; Louka, Boshra F; Fortuin, F David; Patel, Bhavesh M; Sweeney, John P; Lanza, Louis A; DeValeria, Patrick A; Ezrre, Kim M; Ramakrishna, Harish

    2016-10-01

    The surgical and procedural specialties are continually evolving their methods to include more complex and technically difficult cases. These cases can be longer and incorporate multiple teams in a different model of operating room synergy. Patients are frequently older, with comorbidities adding to the complexity of these cases. Recording of this environment has become more feasible recently with advancement in video and audio capture systems often used in the simulation realm. We began using live capture to record a new procedure shortly after starting these cases in our institution. This has provided continued assessment and evaluation of live procedures. The goal of this was to improve human factors and situational challenges by review and debriefing. B-Line Medical's LiveCapture video system was used to record successive transcatheter aortic valve replacement (TAVR) procedures in our cardiac catheterization/laboratory. An illustrative case is used to discuss analysis and debriefing of the case using this system. An illustrative case is presented that resulted in long-term changes to our approach of these cases. The video capture documented rare events during one of our TAVR procedures. Analysis and debriefing led to definitive changes in our practice. While there are hurdles to the use of this technology in every institution, the role for the ongoing use of video capture, analysis, and debriefing may play an important role in the future of patient safety and human factors analysis in the operating environment.

  15. The utility of live video capture to enhance debriefing following transcatheter aortic valve replacement

    PubMed Central

    Seamans, David P.; Louka, Boshra F.; Fortuin, F. David; Patel, Bhavesh M.; Sweeney, John P.; Lanza, Louis A.; DeValeria, Patrick A.; Ezrre, Kim M.; Ramakrishna, Harish

    2016-01-01

    Background: The surgical and procedural specialties are continually evolving their methods to include more complex and technically difficult cases. These cases can be longer and incorporate multiple teams in a different model of operating room synergy. Patients are frequently older, with comorbidities adding to the complexity of these cases. Recording of this environment has become more feasible recently with advancement in video and audio capture systems often used in the simulation realm. Aims: We began using live capture to record a new procedure shortly after starting these cases in our institution. This has provided continued assessment and evaluation of live procedures. The goal of this was to improve human factors and situational challenges by review and debriefing. Setting and Design: B-Line Medical's LiveCapture video system was used to record successive transcatheter aortic valve replacement (TAVR) procedures in our cardiac catheterization/laboratory. An illustrative case is used to discuss analysis and debriefing of the case using this system. Results and Conclusions: An illustrative case is presented that resulted in long-term changes to our approach of these cases. The video capture documented rare events during one of our TAVR procedures. Analysis and debriefing led to definitive changes in our practice. While there are hurdles to the use of this technology in every institution, the role for the ongoing use of video capture, analysis, and debriefing may play an important role in the future of patient safety and human factors analysis in the operating environment. PMID:27762242

  16. Economic and energetic analysis of capturing CO2 from ambient air

    PubMed Central

    House, Kurt Zenz; Baclig, Antonio C.; Ranjan, Manya; van Nierop, Ernst A.; Wilcox, Jennifer; Herzog, Howard J.

    2011-01-01

    Capturing carbon dioxide from the atmosphere (“air capture”) in an industrial process has been proposed as an option for stabilizing global CO2 concentrations. Published analyses suggest these air capture systems may cost a few hundred dollars per tonne of CO2, making it cost competitive with mainstream CO2 mitigation options like renewable energy, nuclear power, and carbon dioxide capture and storage from large CO2 emitting point sources. We investigate the thermodynamic efficiencies of commercial separation systems as well as trace gas removal systems to better understand and constrain the energy requirements and costs of these air capture systems. Our empirical analyses of operating commercial processes suggest that the energetic and financial costs of capturing CO2 from the air are likely to have been underestimated. Specifically, our analysis of existing gas separation systems suggests that, unless air capture significantly outperforms these systems, it is likely to require more than 400 kJ of work per mole of CO2, requiring it to be powered by CO2-neutral power sources in order to be CO2 negative. We estimate that total system costs of an air capture system will be on the order of $1,000 per tonne of CO2, based on experience with as-built large-scale trace gas removal systems. PMID:22143760
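    The benchmark against which figures like 400 kJ per mole are judged is the second-law minimum work of separation; for the dilute CO2 in ambient air it is approximately (standard thermodynamics, shown only for orientation):

    ```latex
    % y \approx 4\times10^{-4}: mole fraction of CO2 in ambient air; at T = 298\,\mathrm{K},
    % W_{\min} \approx RT\ln(1/y) \approx 19\text{--}20\ \mathrm{kJ\,mol^{-1}}, so a process
    % requiring 400\ \mathrm{kJ\,mol^{-1}} has a second-law efficiency of only about 5\%.
    W_{\min} \;\approx\; R\,T\,\ln\!\frac{1}{y}
    ```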

  17. Data Capture and Analysis Using the BBC Microcomputer--an Interfacing Project Applied to Enzyme Kinetics.

    ERIC Educational Resources Information Center

    Jones, Lawrence; Graham, Ian

    1986-01-01

    Reviews the main principles of interfacing and discusses the software developed to perform kinetic data capture and analysis with a BBC microcomputer linked to a recording spectrophotometer. Focuses on the steps in software development. Includes results of a lactate dehydrogenase assay. (ML)

  18. Quantitative analysis of arm movement smoothness

    NASA Astrophysics Data System (ADS)

    Szczesna, Agnieszka; Błaszczyszyn, Monika

    2017-07-01

    The paper deals with the problem of quantitative smoothness analysis of motion data. We investigated values of movement units, fluidity, and jerk for the healthy and the paralyzed arm of patients with hemiparesis after stroke. The patients performed a drinking task. To validate the approach, the movements of 24 patients were captured using an optical motion capture system.
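
    The abstract names movement units and jerk as smoothness measures but does not give formulas, so the sketch below is a hedged, generic illustration: movement units counted as speed peaks and a dimensionless squared-jerk cost, computed from a marker trajectory with numpy. The peak threshold and the synthetic example trajectory are illustrative choices, not taken from the paper.

    ```python
    import numpy as np

    def smoothness_metrics(pos, fs):
        """Generic smoothness measures for one marker trajectory.

        pos : (N, 3) array of marker positions in metres
        fs  : sampling frequency in Hz
        Returns the number of movement units and a dimensionless jerk cost.
        """
        dt = 1.0 / fs
        vel = np.gradient(pos, dt, axis=0)
        speed = np.linalg.norm(vel, axis=1)

        # Movement units: local speed maxima above 10% of the peak speed
        # (the 10% threshold is an illustrative choice).
        thresh = 0.1 * speed.max()
        peaks = np.flatnonzero(
            (speed[1:-1] > speed[:-2]) & (speed[1:-1] > speed[2:]) & (speed[1:-1] > thresh)
        )

        # Dimensionless squared jerk: integral of |jerk|^2, scaled by duration^3 / v_peak^2.
        jerk = np.gradient(np.gradient(vel, dt, axis=0), dt, axis=0)
        duration = len(pos) * dt
        jerk_integral = np.sum(np.sum(jerk**2, axis=1)) * dt
        dimless_jerk = jerk_integral * duration**3 / speed.max() ** 2
        return peaks.size, dimless_jerk

    # Example: a synthetic 0.3 m minimum-jerk reach lasting 2 s, sampled at 100 Hz.
    fs = 100.0
    tau = np.linspace(0.0, 1.0, int(2.0 * fs), endpoint=False)
    x = 0.3 * (10 * tau**3 - 15 * tau**4 + 6 * tau**5)
    pos = np.column_stack([x, np.zeros_like(x), np.zeros_like(x)])
    print(smoothness_metrics(pos, fs))   # -> (1 movement unit, low jerk cost)
    ```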

  19. Simultaneous capture and in situ analysis of circulating tumor cells using multiple hybrid nanoparticles.

    PubMed

    Lee, Hun Joo; Cho, Hyeon-Yeol; Oh, Jin Ho; Namkoong, Kak; Lee, Jeong Gun; Park, Jong-Myeon; Lee, Soo Suk; Huh, Nam; Choi, Jeong-Woo

    2013-09-15

    Using hybrid nanoparticles (HNPs), we demonstrate simultaneous capture, in situ protein expression analysis, and cellular phenotype identification of circulating tumor cells (CTCs). Each HNP consists of three parts: (i) antibodies that bind specifically to a known biomarker for CTCs, (ii) a quantum dot that emits fluorescence signals, and (iii) biotinylated DNA that allows capture and release of the CTC-HNP complex on an in-house developed capture & recovery chip (CRC). To evaluate our approach, cells representative of different breast cancer subtypes (MCF-7: luminal; SK-BR-3: HER2; and MDA-MB-231: basal-like) were captured onto the CRC and expressions of EpCAM, HER2, and EGFR were detected concurrently. The average capture efficiency of CTCs was 87.5%, with an identification accuracy of 92.4%. Subsequently, by cleaving the DNA portion with restriction enzymes, captured cells were released at an efficiency of 86.1%. Further studies showed that these recovered cells are viable and can proliferate in vitro. Using HNPs, it is possible to count, analyze in situ protein expression, and culture CTCs, all from the same set of cells, enabling a wide range of molecular- and cellular-based studies using CTCs. Copyright © 2013 Elsevier B.V. All rights reserved.

  20. Capturing Public Opinion on Public Health Topics: A Comparison of Experiences from a Systematic Review, Focus Group Study, and Analysis of Online, User-Generated Content.

    PubMed

    Giles, Emma Louise; Adams, Jean M

    2015-01-01

    Capturing public opinion toward public health topics is important to ensure that services, policy, and research are aligned with the beliefs and priorities of the general public. A number of approaches can be used to capture public opinion. We are conducting a program of work on the effectiveness and acceptability of health promoting financial incentive interventions. We have captured public opinion on financial incentive interventions using three methods: a systematic review, focus group study, and analysis of online user-generated comments to news media reports. In this short editorial-style piece, we compare and contrast our experiences with these three methods. Each of these methods had their advantages and disadvantages. Advantages include tailoring of the research question for systematic reviews, probing of answers during focus groups, and the ability to aggregate a large data set using online user-generated content. However, disadvantages include needing to update systematic reviews, participants conforming to a dominant perspective in focus groups, and being unable to collect respondent characteristics during analysis of user-generated online content. That said, analysis of user-generated online content offers additional time and resource advantages, and we found it elicited similar findings to those obtained via more traditional methods, such as systematic reviews and focus groups. A number of methods for capturing public opinions on public health topics are available. Public health researchers, policy makers, and practitioners should choose methods appropriate to their aims. Analysis of user-generated online content, especially in the context of news media reports, may be a quicker and cheaper alternative to more traditional methods, without compromising on the breadth of opinions captured.

  1. Visual Field Asymmetry in Attentional Capture

    ERIC Educational Resources Information Center

    Du, Feng; Abrams, Richard A.

    2010-01-01

    The present study examined the spatial distribution of involuntary attentional capture over the two visual hemi-fields. A new experiment, and an analysis of three previous experiments showed that distractors in the left visual field that matched a sought-for target in color produced a much larger capture effect than identical distractors in the…

  2. Cost-benefit analysis for invasive species control: the case of greater Canada goose Branta canadensis in Flanders (northern Belgium)

    PubMed Central

    Casaer, Jim; De Smet, Lieven; Devos, Koen; Huysentruyt, Frank; Robertson, Peter A.; Verbeke, Tom

    2018-01-01

    Background: Sound decisions on control actions for established invasive alien species (IAS) require information on ecological as well as socio-economic impact of the species and of its management. Cost-benefit analysis provides part of this information, yet has received relatively little attention in the scientific literature on IAS. Methods: We apply a bio-economic model in a cost-benefit analysis framework to greater Canada goose Branta canadensis, an IAS with documented social, economic and ecological impacts in Flanders (northern Belgium). We compared a business as usual (BAU) scenario which involved non-coordinated hunting and egg destruction with an enhanced scenario based on a continuation of these activities but supplemented with coordinated capture of moulting birds. To assess population growth under the BAU scenario we fitted a logistic growth model to the observed pre-moult capture population. Projected damage costs included water eutrophication and damage to cultivated grasslands and were calculated for all scenarios. Management costs of the moult captures were based on a representative average of the actual cost of planning and executing moult captures. Results: Comparing the scenarios with different capture rates, different costs for eutrophication and various discount rates, showed avoided damage costs were in the range of 21.15 M€ to 45.82 M€ under the moult capture scenario. The lowest value for the avoided costs applied to the scenario where we lowered the capture rate by 10%. The highest value occurred in the scenario where we lowered the real discount rate from 4% to 2.5%. Discussion: The reduction in damage costs always outweighed the additional management costs of moult captures. Therefore, additional coordinated moult captures could be applied to limit the negative economic impact of greater Canada goose at a regional scale. We further discuss the strengths and weaknesses of our approach and its potential application to other IAS. PMID:29404211

  3. Cost-benefit analysis for invasive species control: the case of greater Canada goose Branta canadensis in Flanders (northern Belgium).

    PubMed

    Reyns, Nikolaas; Casaer, Jim; De Smet, Lieven; Devos, Koen; Huysentruyt, Frank; Robertson, Peter A; Verbeke, Tom; Adriaens, Tim

    2018-01-01

    Sound decisions on control actions for established invasive alien species (IAS) require information on ecological as well as socio-economic impact of the species and of its management. Cost-benefit analysis provides part of this information, yet has received relatively little attention in the scientific literature on IAS. We apply a bio-economic model in a cost-benefit analysis framework to greater Canada goose Branta canadensis, an IAS with documented social, economic and ecological impacts in Flanders (northern Belgium). We compared a business as usual (BAU) scenario which involved non-coordinated hunting and egg destruction with an enhanced scenario based on a continuation of these activities but supplemented with coordinated capture of moulting birds. To assess population growth under the BAU scenario we fitted a logistic growth model to the observed pre-moult capture population. Projected damage costs included water eutrophication and damage to cultivated grasslands and were calculated for all scenarios. Management costs of the moult captures were based on a representative average of the actual cost of planning and executing moult captures. Comparing the scenarios with different capture rates, different costs for eutrophication and various discount rates, showed avoided damage costs were in the range of 21.15 M€ to 45.82 M€ under the moult capture scenario. The lowest value for the avoided costs applied to the scenario where we lowered the capture rate by 10%. The highest value occurred in the scenario where we lowered the real discount rate from 4% to 2.5%. The reduction in damage costs always outweighed the additional management costs of moult captures. Therefore, additional coordinated moult captures could be applied to limit the negative economic impact of greater Canada goose at a regional scale. We further discuss the strengths and weaknesses of our approach and its potential application to other IAS.
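
    A minimal sketch of the two quantitative ingredients named in both records above: a logistic growth projection for the business-as-usual population and a discounted comparison of avoided damage against additional management cost. All parameter values below are placeholders for illustration, not the figures used in the study.

    ```python
    import numpy as np

    def logistic_projection(n0, r, K, years):
        """Discrete logistic growth projection for the goose population."""
        n = [n0]
        for _ in range(years):
            n.append(n[-1] + r * n[-1] * (1.0 - n[-1] / K))
        return np.array(n)

    def npv(cashflows, rate):
        """Net present value of a yearly cash-flow series (year 0 undiscounted)."""
        t = np.arange(len(cashflows))
        return np.sum(np.asarray(cashflows) / (1.0 + rate) ** t)

    # Placeholder parameters, not the values used in the study.
    years, rate = 20, 0.04
    bau = logistic_projection(n0=10_000, r=0.25, K=60_000, years=years)
    managed = logistic_projection(n0=10_000, r=0.10, K=60_000, years=years)  # with moult captures

    damage_per_bird = 5.0          # EUR per bird per year (eutrophication + grassland), placeholder
    moult_capture_cost = 40_000.0  # EUR per year, placeholder

    avoided_damage = npv((bau - managed) * damage_per_bird, rate)
    extra_management = npv(np.full(years + 1, moult_capture_cost), rate)
    print(f"Avoided damage NPV: {avoided_damage:,.0f} EUR, extra management NPV: {extra_management:,.0f} EUR")
    ```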

  4. Ethanol-dispersed and antibody-conjugated polymer nanofibers for the selective capture and 3-dimensional culture of EpCAM-positive cells.

    PubMed

    Yoon, Junghyo; Yoon, Hee-Sook; Shin, Yoojin; Kim, Sanghyun; Ju, Youngjun; Kim, Jungbae; Chung, Seok

    2017-07-01

    Electrospun and ethanol-dispersed polystyrene-poly(styrene-co-maleic anhydride) (PS-PSMA) nanofibers (NFs) were used as a platform for the selective capture and three-dimensional culture of EpCAM-positive cells in cell culture medium and whole blood. The NFs were treated with streptavidin to facilitate bond formation between the amino groups of streptavidin and the maleic anhydride groups of the NFs. A biotinylated anti-EpCAM monoclonal antibody (mAb) was attached to the streptavidin-conjugated NFs via the selective binding of streptavidin and biotin. Upon simple mixing and shaking with EpCAM-positive cancer cells over a wide concentration range from 10 to 1,000,000 cells per 10 mL, the mAb-attached NFs (mAb-NFs) captured the EpCAM-positive cells with an efficiency of 59%-67%, depending on the initial cell concentration, with minor mechanical capture of 14%-36%. Captured cells were directly cultured in the NF matrix, forming cell aggregates, which supports cell proliferation and follow-up analysis. Furthermore, the capture capacity of the mAb-NFs was assessed in the presence of whole blood and blood lysates, indicating cluster formation that captured target cells. It is anticipated that the antibody-attached NFs can be employed for the capture and analysis of very rare EpCAM-positive circulating cancer cells. Copyright © 2017 Elsevier Inc. All rights reserved.

  5. Development of a balloon-borne device for analysis of high-altitude ice and aerosol particulates: Ice Cryo Encapsulator by Balloon (ICE-Ball)

    NASA Astrophysics Data System (ADS)

    Boaggio, K.; Bandamede, M.; Bancroft, L.; Hurler, K.; Magee, N. B.

    2016-12-01

    We report on details of continuing instrument development and deployment of a novel balloon-borne device for capturing and characterizing atmospheric ice and aerosol particles, the Ice Cryo Encapsulator by Balloon (ICE-Ball). The device is designed to capture and preserve cirrus ice particles, maintaining them at cold equilibrium temperatures, so that high-altitude particles can be recovered, transferred intact, and then imaged under SEM at an unprecedented resolution (approximately 3 nm maximum resolution). In addition to cirrus ice particles, high-altitude aerosol particles are also captured, imaged, and analyzed for geometry, chemical composition, and activity as ice nucleating particles. Prototype versions of ICE-Ball have successfully captured and preserved high-altitude ice particles and aerosols, then returned them for recovery and SEM imaging and analysis. New improvements include 1) the ability to capture particles from multiple narrowly-defined altitudes on a single payload, 2) high-quality measurements of coincident temperature, humidity, and high-resolution video at capture altitude, 3) the ability to capture particles during both ascent and descent, 4) better characterization of particle collection volume and collection efficiency, and 5) improved isolation and characterization of the capture-cell cryo environment. This presentation provides detailed capability specifications for anyone interested in using measurements, collaborating on continued instrument development, or including this instrument in ongoing or future field campaigns.

  6. Stability of a slotted ALOHA system with capture effect

    NASA Astrophysics Data System (ADS)

    Onozato, Yoshikuni; Liu, Jin; Noguchi, Shoichi

    1989-02-01

    The stability of a slotted ALOHA system with capture effect is investigated under a general communication environment where terminals are divided into two groups (low-power and high-power) and the capture effect is modeled by capture probabilities. An approximate analysis is developed using catastrophe theory, in which the effects of system and user parameters on the stability are characterized by the cusp catastrophe. Particular attention is given to the low-power group, since it must bear the strain under the capture effect. The stability conditions of the two groups are given explicitly by bifurcation sets.
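
    A small numerical illustration of how a capture probability raises slotted ALOHA throughput under a Poisson offered-load assumption. The single fixed capture probability used here is an assumed simplification; the paper's two-group, cusp-catastrophe stability analysis is more elaborate.

    ```python
    import math

    def slotted_aloha_throughput(G, capture_prob_given_k, kmax=60):
        """Expected successful packets per slot under a Poisson offered load G.

        capture_prob_given_k(k): probability that one packet is received correctly
        when k packets are transmitted in the same slot (q_1 = 1 for no collision).
        """
        return sum(
            math.exp(-G) * G**k / math.factorial(k) * capture_prob_given_k(k)
            for k in range(1, kmax)
        )

    # Assumed illustrative capture model (not the paper's two-group formulation):
    # a collided slot still yields one good packet with fixed probability p_cap,
    # e.g. because a high-power terminal dominates the receiver.
    p_cap = 0.2
    q = lambda k: 1.0 if k == 1 else p_cap

    for G in (0.5, 1.0, 1.5, 2.0):
        no_capture = G * math.exp(-G)   # classical slotted ALOHA throughput
        print(f"G={G}: S={slotted_aloha_throughput(G, q):.3f} (vs {no_capture:.3f} without capture)")
    ```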

  7. Self-Assembly, Guest Capture, and NMR Spectroscopy of a Metal-Organic Cage in Water

    ERIC Educational Resources Information Center

    Go, Eun Bin; Srisuknimit, Veerasak; Cheng, Stephanie L.; Vosburg, David A.

    2016-01-01

    A green organic-inorganic laboratory experiment has been developed in which students prepare a self-assembling iron cage in D2O at room temperature. The tetrahedral cage captures a small, neutral molecule such as cyclohexane or tetrahydrofuran. 1H NMR analysis distinguishes captured and free guests through diagnostic…

  8. Silver-mordenite for radiologic gas capture from complex streams. Dual catalytic CH3I decomposition and I confinement

    DOE PAGES

    Nenoff, Tina M.; Rodriguez, Mark A.; Soelberg, Nick R.; ...

    2014-05-09

    The selective capture of radiological iodine (129I) is a persistent concern for safe nuclear energy. In nuclear fuel reprocessing scenarios, the gas streams to be treated are extremely complex, containing several distinct iodine-containing molecules amongst a large variety of other species. Silver-containing mordenite (MOR) is a longstanding benchmark for radioiodine capture, reacting with molecular iodine (I2) to form AgI. However, the mechanism of organoiodine capture is not well understood. Here we investigate the capture of methyl iodide from complex mixed gas streams by combining chemical analysis of the effluent gas stream with in-depth characterization of the recovered sorbent. Tools applied include infrared spectroscopy, thermogravimetric analysis with mass spectrometry, micro X-ray fluorescence, powder X-ray diffraction analysis, and pair distribution function analysis. Moreover, the MOR zeolite catalyzes decomposition of the methyl iodide through formation of surface methoxy species (SMS), which subsequently react with water in the mixed gas stream to form methanol, and with methanol to form dimethyl ether, both of which are detected downstream in the effluent. The liberated iodine reacts with Ag in the MOR pore to form subnanometer AgI clusters, smaller than the MOR pores, suggesting that the iodine is both physically and chemically confined within the zeolite.

  9. Stochastic capture zone analysis of an arsenic-contaminated well using the generalized likelihood uncertainty estimator (GLUE) methodology

    NASA Astrophysics Data System (ADS)

    Morse, Brad S.; Pohll, Greg; Huntington, Justin; Rodriguez Castillo, Ramiro

    2003-06-01

    In 1992, Mexican researchers discovered concentrations of arsenic in excess of World Health Organization (WHO) standards in several municipal wells in the Zimapan Valley of Mexico. This study describes a method to delineate a capture zone for one of the most highly contaminated wells to aid in future well siting. A stochastic approach was used to model the capture zone because of the high level of uncertainty in several input parameters. Two stochastic techniques were applied and compared: "standard" Monte Carlo analysis and the generalized likelihood uncertainty estimator (GLUE) methodology. The GLUE procedure differs from standard Monte Carlo analysis in that it incorporates a goodness of fit (termed a likelihood measure) in evaluating the model. This allows more information (in this case, head data) to be used in the uncertainty analysis, resulting in smaller prediction uncertainty. Two likelihood measures are tested in this study to determine which is in better agreement with the observed heads. While the standard Monte Carlo approach does not aid in parameter estimation, the GLUE methodology indicates best-fit models when hydraulic conductivity is approximately 10^-6.5 m/s, with vertically isotropic conditions and large quantities of interbasin flow entering the basin. Probabilistic isochrones (capture zone boundaries) are then presented, and as predicted, the GLUE-derived capture zones are significantly smaller in area than those from the standard Monte Carlo approach.
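
    A minimal sketch of the GLUE idea described above, under stated assumptions: sample parameter sets from a prior, score each set with a likelihood measure based on the fit to observed heads, keep a behavioural subset, and weight predictions (here, a capture-zone area) by the normalised likelihoods. The inverse error-variance likelihood, the prior range, and the toy stand-in for the groundwater model are all illustrative choices, not the authors'.

    ```python
    import numpy as np

    rng = np.random.default_rng(42)

    def likelihood(sim_heads, obs_heads):
        """Inverse error-variance likelihood measure, one common GLUE choice."""
        return 1.0 / np.mean((sim_heads - obs_heads) ** 2)

    def glue_capture_area(model, obs_heads, n_samples=2000, behavioural_frac=0.1):
        """Sample parameter sets, score them against observed heads, and return a
        likelihood-weighted capture-zone area from the behavioural subset."""
        log10_k = rng.uniform(-8.0, -5.0, n_samples)   # log-uniform prior on K (placeholder range)
        scores = np.empty(n_samples)
        areas = np.empty(n_samples)
        for i, lk in enumerate(log10_k):
            sim_heads, area = model(10.0 ** lk)
            scores[i] = likelihood(sim_heads, obs_heads)
            areas[i] = area
        keep = scores >= np.quantile(scores, 1.0 - behavioural_frac)
        weights = scores[keep] / scores[keep].sum()
        return float(np.sum(weights * areas[keep]))

    # Toy stand-in for the groundwater model: returns simulated heads and a
    # capture-zone area for a given hydraulic conductivity K (purely illustrative).
    obs = np.array([10.0, 9.4, 8.9])
    def toy_model(k):
        mismatch = np.log10(k) + 6.5           # "best" K near 10^-6.5 m/s
        heads = obs + 0.3 * mismatch + rng.normal(0.0, 0.05, obs.size)
        area = 2.0e5 * (1.0 + 0.5 * mismatch)  # m^2, grows/shrinks with K
        return heads, area

    print(f"Likelihood-weighted capture-zone area: {glue_capture_area(toy_model, obs):,.0f} m^2")
    ```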

  10. Tilted pillar array fabrication by the combination of proton beam writing and soft lithography for microfluidic cell capture Part 2: Image sequence analysis based evaluation and biological application.

    PubMed

    Járvás, Gábor; Varga, Tamás; Szigeti, Márton; Hajba, László; Fürjes, Péter; Rajta, István; Guttman, András

    2018-02-01

    As a continuation of our previously published work, this paper presents a detailed evaluation of a microfabricated cell capture device utilizing a doubly tilted micropillar array. The device was fabricated using a novel hybrid technology based on the combination of proton beam writing and conventional lithography techniques. Tilted pillars offer unique flow characteristics and support enhanced fluidic interaction for improved immunoaffinity-based cell capture. The performance of the microdevice was evaluated with an in-house developed single-cell tracking system based on image sequence analysis. Individual cell tracking allowed in-depth analysis of the cell-chip surface interaction mechanism from a hydrodynamic point of view. Simulation results were validated using the hybrid device and the optimized surface functionalization procedure. Finally, the cell capture capability of this new-generation microdevice was demonstrated by efficiently arresting cells from an HT29 cell-line suspension. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  11. Cognitive Task Analysis for Instruction in Single-Injection Ultrasound Guided-Regional Anesthesia

    ERIC Educational Resources Information Center

    Gucev, Gligor V.

    2012-01-01

    Cognitive task analysis (CTA) is a methodology for eliciting knowledge from subject matter experts. CTA has been used to capture the cognitive processes, decision-making, and judgments that underlie expert behaviors. A review of the literature revealed that CTA has not yet been used to capture the knowledge required to perform ultrasound guided…

  12. Passive fishing techniques: a cause of turtle mortality in the Mississippi River

    USGS Publications Warehouse

    Barko, V.A.; Briggler, J.T.; Ostendorf, D.E.

    2004-01-01

    We investigated variation of incidentally captured turtle mortality in response to environmental factors and passive fishing techniques. We used Long Term Resource Monitoring Program (LTRMP) data collected from 1996 to 2001 in the unimpounded upper Mississippi River (UMR) adjacent to Missouri and Illinois, USA. We used a principal components analysis (PCA) and a stepwise discriminant function analysis to identify factors correlated with mortality of captured turtles. Furthermore, we were interested in what percentage of turtles died from passive fishing techniques and which techniques caused the most turtle mortality. The main factors influencing captured turtle mortality were water temperature and depth at net deployment. Fyke nets captured the most turtles and caused the most turtle mortality. Almost 90% of mortalities occurred in offshore aquatic areas (i.e., side channel or tributary). Our results provide information on causes of turtle mortality (as bycatch) in a riverine system and implications for river turtle conservation by suggesting management strategies to reduce turtle bycatch and decrease mortality of captured turtles.

  13. Modeling association among demographic parameters in analysis of open population capture-recapture data

    USGS Publications Warehouse

    Link, William A.; Barker, Richard J.

    2005-01-01

    We present a hierarchical extension of the Cormack–Jolly–Seber (CJS) model for open population capture–recapture data. In addition to recaptures of marked animals, we model first captures of animals and losses on capture. The parameter set includes capture probabilities, survival rates, and birth rates. The survival rates and birth rates are treated as a random sample from a bivariate distribution, thus the model explicitly incorporates correlation in these demographic rates. A key feature of the model is that the likelihood function, which includes a CJS model factor, is expressed entirely in terms of identifiable parameters; losses on capture can be factored out of the model. Since the computational complexity of classical likelihood methods is prohibitive, we use Markov chain Monte Carlo in a Bayesian analysis. We describe an efficient candidate-generation scheme for Metropolis–Hastings sampling of CJS models and extensions. The procedure is illustrated using mark-recapture data for the moth Gonodontis bidentata.
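
    For orientation, the sketch below evaluates the standard CJS likelihood (conditional on first capture) with constant survival phi and capture probability p; the hierarchical bivariate random-effects extension and the Metropolis–Hastings sampler described in the abstract build on this basic likelihood.

    ```python
    import numpy as np

    def cjs_loglik(histories, phi, p):
        """Log-likelihood of Cormack-Jolly-Seber capture histories (conditional on
        first capture) with constant survival phi and capture probability p.

        histories : (n_animals, n_occasions) array of 0/1 capture indicators
        """
        histories = np.asarray(histories)
        T = histories.shape[1]

        # chi[t] = Pr(never recaptured after occasion t | alive at t)
        chi = np.ones(T)
        for t in range(T - 2, -1, -1):
            chi[t] = (1.0 - phi) + phi * (1.0 - p) * chi[t + 1]

        loglik = 0.0
        for h in histories:
            seen = np.flatnonzero(h)
            if seen.size == 0:
                continue  # CJS conditions on first capture; all-zero rows carry no information
            first, last = seen[0], seen[-1]
            for t in range(first, last):
                loglik += np.log(phi) + np.log(p if h[t + 1] else 1.0 - p)
            loglik += np.log(chi[last])
        return loglik

    # Tiny example: four capture histories over three occasions.
    example = [[1, 1, 0], [1, 0, 1], [0, 1, 1], [1, 0, 0]]
    print(cjs_loglik(example, phi=0.8, p=0.6))
    ```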

  14. The effects of water depth on prey detection and capture by juvenile coho salmon and steelhead

    Treesearch

    J.J. Piccolo; N.F. Hughes; M.D. Bryant

    2007-01-01

    We used three-dimensional video analysis of feeding experiments to determine the effects of water depth on prey detection and capture by drift-feeding juvenile coho salmon (Oncorhynchus kisutch) and steelhead (O. mykiss irideus). Depth treatments were 0.15, 0.30, 0.45 and 0.60 m. Mean prey capture probabilities for both species...

  15. Multiplexed evaluation of capture agent binding kinetics using arrays of silicon photonic microring resonators.

    PubMed

    Byeon, Ji-Yeon; Bailey, Ryan C

    2011-09-07

    High affinity capture agents recognizing biomolecular targets are essential in the performance of many proteomic detection methods. Herein, we report the application of a label-free silicon photonic biomolecular analysis platform for simultaneously determining kinetic association and dissociation constants for two representative protein capture agents: a thrombin-binding DNA aptamer and an anti-thrombin monoclonal antibody. The scalability and inherent multiplexing capability of the technology make it an attractive platform for simultaneously evaluating the binding characteristics of multiple capture agents recognizing the same target antigen, and thus a tool complementary to emerging high-throughput capture agent generation strategies.
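
    The abstract does not give fitting details, so this is a hedged sketch of the 1:1 Langmuir kinetic model commonly fitted to label-free sensorgrams: the dissociation phase yields koff, the association phase yields the observed rate kobs, kon follows from kobs = kon*C + koff, and KD = koff/kon. The sensorgram data below are synthetic, not the paper's measurements.

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    def association(t, req, kobs):
        """Pseudo-first-order association phase of a 1:1 binding sensorgram."""
        return req * (1.0 - np.exp(-kobs * t))

    def dissociation(t, r0, koff):
        """Exponential decay of the signal after the analyte is washed out."""
        return r0 * np.exp(-koff * t)

    conc = 50e-9                                     # analyte concentration, 50 nM
    true_kon, true_koff, rmax = 2.0e5, 1.0e-3, 80.0  # 1/(M s), 1/s, signal units
    kobs_true = true_kon * conc + true_koff
    req_true = rmax * true_kon * conc / kobs_true

    rng = np.random.default_rng(0)
    t_on = np.linspace(0.0, 300.0, 151)
    t_off = np.linspace(0.0, 600.0, 301)
    r_on = association(t_on, req_true, kobs_true) + rng.normal(0, 0.5, t_on.size)
    r_off = dissociation(t_off, r_on[-1], true_koff) + rng.normal(0, 0.5, t_off.size)

    # Fit each phase; kon follows from the pseudo-first-order relation kobs = kon*C + koff.
    (req_fit, kobs_fit), _ = curve_fit(association, t_on, r_on, p0=(50.0, 0.02))
    (r0_fit, koff_fit), _ = curve_fit(dissociation, t_off, r_off, p0=(50.0, 0.005))
    kon_fit = (kobs_fit - koff_fit) / conc
    print(f"kon ~ {kon_fit:.2e} 1/(M s), koff ~ {koff_fit:.2e} 1/s, KD ~ {koff_fit / kon_fit:.2e} M")
    ```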

  16. Biomechanical analysis using Kinovea for sports application

    NASA Astrophysics Data System (ADS)

    Muaza Nor Adnan, Nor; Patar, Mohd Nor Azmi Ab; Lee, Hokyoo; Yamamoto, Shin-Ichiroh; Jong-Young, Lee; Mahmud, Jamaluddin

    2018-04-01

    This paper assesses the reliability of HD VideoCam–Kinovea as an alternative tool for conducting motion analysis and measuring the knee relative angle during drop jump movement. The motion capture and analysis procedure was conducted in the Biomechanics Lab, Shibaura Institute of Technology, Omiya Campus, Japan. A healthy subject without any gait disorder (BMI of 28.60 ± 1.40) was recruited. The volunteer subject was asked to perform the drop jump movement on a preset platform, and the motion was simultaneously recorded using an established infrared motion capture system (Hawk–Cortex) and a HD VideoCam in the sagittal plane only. The capture was repeated 5 times. The outputs (video recordings) from the HD VideoCam were input into Kinovea (an open-source software) and the drop jump pattern was tracked and analysed. These data were compared with the drop jump pattern tracked and analysed earlier using the Hawk–Cortex system. In general, the results obtained (drop jump pattern) using the HD VideoCam–Kinovea are close to the results obtained using the established motion capture system. Basic statistical analyses show that most average variances are less than 10%, supporting the repeatability of the protocol and the reliability of the results. It can be concluded that the integration of HD VideoCam–Kinovea has the potential to become a reliable motion capture–analysis system; moreover, it is low cost, portable, and easy to use. The findings of this study contribute useful knowledge pertaining to motion capture and analysis, drop jump movement, and HD VideoCam–Kinovea integration.
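
    A short sketch of the kind of post-processing such a study implies: computing the knee relative angle from 2D hip, knee, and ankle coordinates tracked in a sagittal-plane video. The coordinates below are made up, and this is not the exact pipeline used in the paper.

    ```python
    import numpy as np

    def knee_angle(hip, knee, ankle):
        """Relative knee angle (degrees) from 2D sagittal-plane marker coordinates.

        Each argument is an (N, 2) array of pixel or metric coordinates over N frames.
        Returns the angle between the thigh (knee->hip) and shank (knee->ankle) vectors.
        """
        thigh = np.asarray(hip) - np.asarray(knee)
        shank = np.asarray(ankle) - np.asarray(knee)
        cos_a = np.sum(thigh * shank, axis=1) / (
            np.linalg.norm(thigh, axis=1) * np.linalg.norm(shank, axis=1)
        )
        return np.degrees(np.arccos(np.clip(cos_a, -1.0, 1.0)))

    # Example with three frames of made-up (x, y) coordinates in pixels.
    hip   = np.array([[320, 200], [321, 205], [322, 215]])
    knee  = np.array([[330, 300], [333, 305], [336, 318]])
    ankle = np.array([[335, 400], [345, 402], [360, 404]])
    print(knee_angle(hip, knee, ankle))  # ~180 deg = fully extended; smaller = flexed
    ```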

  17. Adapting Cognitive Task Analysis to Investigate Clinical Decision Making and Medication Safety Incidents.

    PubMed

    Russ, Alissa L; Militello, Laura G; Glassman, Peter A; Arthur, Karen J; Zillich, Alan J; Weiner, Michael

    2017-05-03

    Cognitive task analysis (CTA) can yield valuable insights into healthcare professionals' cognition and inform system design to promote safe, quality care. Our objective was to adapt CTA (specifically, the critical decision method) to investigate patient safety incidents, overcome barriers to implementing this method, and facilitate more widespread use of cognitive task analysis in healthcare. We adapted CTA to facilitate recruitment of healthcare professionals and developed a data collection tool to capture incidents as they occurred. We also leveraged the electronic health record (EHR) to expand data capture and used EHR-stimulated recall to aid reconstruction of safety incidents. We investigated 3 categories of medication-related incidents: adverse drug reactions, drug-drug interactions, and drug-disease interactions. Healthcare professionals submitted incidents, and a subset of incidents was selected for CTA. We analyzed several outcomes to characterize incident capture and completed CTA interviews. We captured 101 incidents. Eighty incidents (79%) met eligibility criteria. We completed 60 CTA interviews, 20 for each incident category. Capturing incidents before interviews allowed us to shorten the interview duration and reduced reliance on healthcare professionals' recall. Incorporating the EHR into CTA enriched data collection. The adapted CTA technique was successful in capturing specific categories of safety incidents. Our approach may be especially useful for investigating safety incidents that healthcare professionals "fix and forget." Our innovations to CTA are expected to expand the application of this method in healthcare and inform a wide range of studies on clinical decision making and patient safety.

  18. Guidance, Navigation, and Control Techniques and Technologies for Active Satellite Removal

    NASA Astrophysics Data System (ADS)

    Ortega Hernando, Guillermo; Erb, Sven; Cropp, Alexander; Voirin, Thomas; Dubois-Matra, Olivier; Rinalducci, Antonio; Visentin, Gianfranco; Innocenti, Luisa; Raposo, Ana

    2013-09-01

    This paper presents an internal feasibility analysis, carried out by the Technical Directorate of the European Space Agency (ESA), of de-orbiting a large, non-functional satellite. The paper focuses specifically on the design of the techniques and technologies for the Guidance, Navigation, and Control (GNC) system of the spacecraft mission that will capture the satellite and ultimately de-orbit it in a controlled re-entry. The paper explains the guidance strategies for launch, rendezvous, close approach, and capture of the target satellite. The guidance strategy uses chaser manoeuvres, hold points, and collision avoidance trajectories to ensure a safe capture. It also details the guidance profile for de-orbiting the satellite in a controlled re-entry. The paper continues with an analysis of the required sensing suite and the navigation algorithms to allow the homing, fly-around, and capture of the target satellite. The emphasis is placed on the design of a system that allows rendezvous with an uncooperative target, including the autonomous acquisition of both the orbital elements and the attitude of the target satellite. Analysing the capture phase, the paper provides a trade-off between two selected capture systems, the net and the tentacles, both studied from the point of view of the GNC system. The paper also analyses the advanced algorithms proposed to control the final compound after capture, which will allow the controlled de-orbiting of the assembly to a safe location on Earth. The paper ends by proposing to continue this work with an analysis of the destruction process of the compound in consecutive segments, from the entry gate to rupture and break-up.

  19. Full-motion video analysis for improved gender classification

    NASA Astrophysics Data System (ADS)

    Flora, Jeffrey B.; Lochtefeld, Darrell F.; Iftekharuddin, Khan M.

    2014-06-01

    The ability of computer systems to perform gender classification using the dynamic motion of a human subject has important applications in medicine, human factors, and human-computer interface systems. Previous work in motion analysis has used data from sensors (including gyroscopes, accelerometers, and force plates), radar signatures, and video. However, full-motion video, motion capture, and range data provide datasets with higher temporal and spatial resolution for the analysis of dynamic motion. Work using motion capture data has been limited by small datasets collected in controlled environments. In this paper, we apply machine learning techniques to a new dataset that has a larger number of subjects. Additionally, these subjects move unrestricted through a capture volume, representing a more realistic, less controlled environment. We conclude that existing linear classification methods are insufficient for gender classification on this larger dataset captured in a relatively uncontrolled environment. A method based on a nonlinear support vector machine classifier is proposed to obtain gender classification for the larger dataset. In experimental testing with a dataset consisting of 98 trials (49 subjects, 2 trials per subject), classification rates using leave-one-out cross-validation are improved from 73% using linear discriminant analysis to 88% using the nonlinear support vector machine classifier.
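
    The comparison reported above can be reproduced in outline with scikit-learn (an assumed dependency; the paper does not state its tooling): linear discriminant analysis versus an RBF-kernel support vector machine, both scored with leave-one-out cross-validation. The features below are synthetic stand-ins for the motion-derived features.

    ```python
    import numpy as np
    from sklearn.datasets import make_classification
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
    from sklearn.model_selection import LeaveOneOut, cross_val_score
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.svm import SVC

    # Synthetic stand-in for motion features labelled by gender (98 samples, matching
    # the trial count in the abstract; the features themselves are made up).
    X, y = make_classification(n_samples=98, n_features=20, n_informative=8,
                               class_sep=0.8, random_state=0)

    loo = LeaveOneOut()
    lda = LinearDiscriminantAnalysis()
    svm = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=10.0, gamma="scale"))

    for name, clf in [("LDA", lda), ("RBF SVM", svm)]:
        acc = cross_val_score(clf, X, y, cv=loo).mean()
        print(f"{name}: leave-one-out accuracy = {acc:.2f}")
    ```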

  20. Fully automated corneal endothelial morphometry of images captured by clinical specular microscopy

    NASA Astrophysics Data System (ADS)

    Bucht, Curry; Söderberg, Per; Manneberg, Göran

    2010-02-01

    The corneal endothelium serves as the posterior barrier of the cornea. Factors such as the clarity and refractive properties of the cornea are directly related to the quality of the endothelium. The endothelial cell density is considered the most important morphological factor of the corneal endothelium. Pathological conditions and physical trauma may reduce the endothelial cell density to such an extent that the optical properties of the cornea, and thus clear eyesight, are threatened. Diagnosis of the corneal endothelium through morphometry is an important part of several clinical applications. Morphometry of the corneal endothelium is presently carried out by semi-automated analysis of pictures captured by a Clinical Specular Microscope (CSM). Because of the occasional need for operator involvement, this process can be tedious, which has a negative impact on sample size. This study was dedicated to the development and use of fully automated analysis of a very large range of images of the corneal endothelium, captured by CSM, using Fourier analysis. Software was developed in the mathematical programming language Matlab. Pictures of the corneal endothelium, captured by CSM, were read into the analysis software. The software automatically performed digital enhancement of the images, normalizing lightness and contrast. The digitally enhanced images of the corneal endothelium were Fourier transformed, using the fast Fourier transform (FFT), and stored as new images. Tools were developed and applied for identification and analysis of relevant characteristics of the Fourier transformed images. The data obtained from each Fourier transformed image were used to calculate the mean cell density of its corresponding corneal endothelium. The calculation was based on well known diffraction theory. Results in the form of estimated cell density of the corneal endothelium were obtained using the fully automated analysis software on 292 images captured by CSM. The cell density obtained by the fully automated analysis was compared to the cell density obtained from classical, semi-automated analysis, and a relatively high correlation was found.
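
    Schematically, the Fourier approach works because the quasi-regular endothelial mosaic concentrates spectral energy in a ring whose radius corresponds to the mean cell spacing. The sketch below finds that ring by radial averaging of the 2D power spectrum and converts the spacing into a cells-per-mm2 estimate; it simplifies the diffraction-theory calculation and calibration steps of the actual study, and the test image is synthetic.

    ```python
    import numpy as np

    def cell_density_from_fft(img, mm_per_pixel):
        """Estimate endothelial cell density (cells/mm^2) from a square specular image.

        The dominant non-DC spatial frequency of the cell mosaic gives the mean
        cell spacing; density is approximated as 1 / spacing^2 (a square-packing
        simplification of the diffraction-theory calculation used in the study).
        """
        n = img.shape[0]
        img = img - img.mean()                                   # drop the DC term
        power = np.abs(np.fft.fftshift(np.fft.fft2(img))) ** 2

        # Radially average the power spectrum and locate the dominant ring.
        yy, xx = np.indices(img.shape)
        r = np.hypot(yy - n // 2, xx - n // 2).astype(int)
        counts = np.bincount(r.ravel())
        radial = np.bincount(r.ravel(), weights=power.ravel()) / np.maximum(counts, 1)
        peak_r = int(np.argmax(radial[2 : n // 2])) + 2          # skip the lowest bins

        spacing_mm = (n / peak_r) * mm_per_pixel                 # dominant spatial period
        return 1.0 / spacing_mm**2

    # Synthetic test pattern with a ~25-pixel cell spacing and 2 um per pixel.
    n = 256
    y, x = np.indices((n, n))
    mosaic = np.cos(2 * np.pi * x / 25.0) + np.cos(2 * np.pi * y / 25.0)
    print(cell_density_from_fft(mosaic, mm_per_pixel=0.002))     # roughly 380-400 cells/mm^2
    ```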

  1. Estimating juvenile Chinook salmon (Oncorhynchus tshawytscha) abundance from beach seine data collected in the Sacramento–San Joaquin Delta and San Francisco Bay, California

    USGS Publications Warehouse

    Perry, Russell W.; Kirsch, Joseph E.; Hendrix, A. Noble

    2016-06-17

    Resource managers rely on abundance or density metrics derived from beach seine surveys to make vital decisions that affect fish population dynamics and assemblage structure. However, abundance and density metrics may be biased by imperfect capture and lack of geographic closure during sampling. Currently, there is considerable uncertainty about the capture efficiency of juvenile Chinook salmon (Oncorhynchus tshawytscha) by beach seines. Heterogeneity in capture can occur through unrealistic assumptions of closure and from variation in the probability of capture caused by environmental conditions. We evaluated the assumptions of closure and the influence of environmental conditions on capture efficiency and abundance estimates of Chinook salmon from beach seining within the Sacramento–San Joaquin Delta and the San Francisco Bay. Beach seine capture efficiency was measured using a stratified random sampling design combined with open and closed replicate depletion sampling. A total of 56 samples were collected during the spring of 2014. To assess variability in capture probability and the absolute abundance of juvenile Chinook salmon, beach seine capture efficiency data were fitted to the paired depletion design using modified N-mixture models. These models allowed us to explicitly test the closure assumption and estimate environmental effects on the probability of capture. We determined that our updated method allowing for lack of closure between depletion samples drastically outperformed traditional data analysis that assumes closure among replicate samples. The best-fit model (lowest-valued Akaike Information Criterion model) included the probability of fish being available for capture (relaxed closure assumption), capture probability modeled as a function of water velocity and percent coverage of fine sediment, and abundance modeled as a function of sample area, temperature, and water velocity. Given that beach seining is a ubiquitous sampling technique for many species, our improved sampling design and analysis could provide significant improvements in density and abundance estimation.
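
    For context, the closed-population two-pass removal estimator below shows how capture probability and abundance follow from replicate depletion counts when closure holds; it is this closure assumption that the open N-mixture formulation in the study relaxes. The counts in the example are invented.

    ```python
    def two_pass_removal(c1, c2):
        """Closed-population two-pass removal estimator (Seber & Le Cren form).

        c1, c2 : counts from the first and second depletion passes at one site.
        Returns (abundance estimate, capture probability estimate).
        Requires c1 > c2; if fish move in or out between passes (lack of closure),
        these closed-form estimates are biased, which is the issue the open
        N-mixture models in the study address.
        """
        if c1 <= c2:
            raise ValueError("Removal estimator requires declining catches (c1 > c2).")
        p_hat = (c1 - c2) / c1
        n_hat = c1**2 / (c1 - c2)
        return n_hat, p_hat

    print(two_pass_removal(c1=40, c2=15))  # ~(64 fish, capture probability ~0.63)
    ```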

  2. High Dynamic Range Digital Imaging of Spacecraft

    NASA Technical Reports Server (NTRS)

    Karr, Brian A.; Chalmers, Alan; Debattista, Kurt

    2014-01-01

    The ability to capture engineering imagery with a wide dynamic range during rocket launches is critical for post-launch processing and analysis [USC03, NNC86]. Rocket launches often present an extreme range of lightness, particularly during night launches. Night launches present a two-fold problem: capturing detail of the vehicle and scene that is masked by darkness, while also capturing detail in the engine plume.

  3. Water velocity influences prey detection and capture by drift-feeding juvenile coho salmon (Oncorhynchus kisutch) and steelhead (Oncorhynchus mykiss irideus)

    Treesearch

    John J. Piccolo; Nicholas F. Hughes; Mason D. Bryant

    2008-01-01

    We examined the effects of water velocity on prey detection and capture by drift-feeding juvenile coho salmon (Oncorhynchus kisutch) and steelhead (sea-run rainbow trout, Oncorhynchus mykiss irideus) in laboratory experiments. We used repeated-measures analysis of variance to test the effects of velocity, species, and the velocity x species interaction on prey capture...

  4. [Specificity of the Adultrap for capturing females of Aedes aegypti (Diptera: Culicidae)].

    PubMed

    Gomes, Almério de Castro; da Silva, Nilza Nunes; Bernal, Regina Tomie Ivata; Leandro, André de Souza; de Camargo, Natal Jataí; da Silva, Allan Martins; Ferreira, Adão Celestino; Ogura, Luis Carlos; de Oliveira, Sebastião José; de Moura, Silvestre Marques

    2007-01-01

    The Adultrap is a new trap built for capturing females of Aedes aegypti. Tests were carried out to evaluate the specificity of this trap in comparison with the technique of aspiration of specimens in artificial shelters. Adultraps were kept for 24 hours inside and outside 120 randomly selected homes in two districts of the city of Foz do Iguaçú, State of Paraná. The statistical test was a Poisson log-linear model. The result was 726 mosquitoes captured, of which 80 were Aedes aegypti. The Adultrap captured only females of this species, while the aspiration method captured both sexes of Aedes aegypti and another five species. The Adultrap captured Aedes aegypti inside and outside the homes, but the analysis indicated that, outside the homes, this trap captured significantly more females than aspiration did. The sensitivity of the Adultrap for detecting females of Aedes aegypti in low-frequency situations was also demonstrated.

  5. Prompt Gamma Activation Analysis (PGAA): Technique of choice for nondestructive bulk analysis of returned comet samples

    NASA Technical Reports Server (NTRS)

    Lindstrom, David J.; Lindstrom, Richard M.

    1989-01-01

    Prompt gamma activation analysis (PGAA) is a well-developed analytical technique. The technique involves irradiation of samples in an external neutron beam from a nuclear reactor, with simultaneous counting of gamma rays produced in the sample by neutron capture. Capture of neutrons leads to excited nuclei which decay immediately with the emission of energetic gamma rays to the ground state. PGAA has several advantages over other techniques for the analysis of cometary materials: (1) It is nondestructive; (2) It can be used to determine abundances of a wide variety of elements, including most major and minor elements (Na, Mg, Al, Si, P, K, Ca, Ti, Cr, Mn, Fe, Co, Ni), volatiles (H, C, N, F, Cl, S), and some trace elements (those with high neutron capture cross sections, including B, Cd, Nd, Sm, and Gd); and (3) It is a true bulk analysis technique. Recent developments should improve the technique's sensitivity and accuracy considerably.

  6. Energy and economic analysis of the carbon dioxide capture installation with the use of monoethanolamine and ammonia

    NASA Astrophysics Data System (ADS)

    Bochon, Krzysztof; Chmielniak, Tadeusz

    2015-03-01

    In this study, an accurate energy and economic analysis of the carbon capture installation was carried out. Chemical absorption with the use of monoethanolamine (MEA) and ammonia was adopted as the technology for carbon dioxide (CO2) capture from flue gases. The energy analysis was performed using a commercial software package to analyze the chemical processes. In the case of MEA, the demand for regeneration heat was about 3.5 MJ/kg of CO2, whereas for ammonia it totalled 2 MJ/kg CO2. The economic analysis was based on the net present value (NPV) method. The limit price for CO2 emissions allowances at which the investment project becomes profitable (NPV = 0) was more than 160 PLN/Mg for MEA and less than 150 PLN/Mg for ammonia. A sensitivity analysis was also carried out to determine the limit price of CO2 emissions allowances depending on electricity generation costs at different values of investment expenditures.
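
    A compact sketch of the break-even calculation described above: the limit allowance price is the price at which discounted avoided-allowance revenue exactly offsets investment and operating costs (NPV = 0). All cash-flow figures below are placeholders, not the study's values.

    ```python
    import numpy as np

    def npv(price, capex, annual_cost, co2_per_year, lifetime, rate):
        """NPV of a capture installation whose benefit is avoided allowance purchases."""
        years = np.arange(1, lifetime + 1)
        cash = (price * co2_per_year - annual_cost) / (1.0 + rate) ** years
        return -capex + cash.sum()

    # Placeholder figures (PLN), not the values from the study.
    p = dict(capex=2.0e9, annual_cost=1.5e8, co2_per_year=2.0e6, lifetime=25, rate=0.06)

    # Break-even allowance price: annualized capex plus operating cost per Mg captured.
    annuity = (1.0 - (1.0 + p["rate"]) ** -p["lifetime"]) / p["rate"]
    limit_price = (p["capex"] / annuity + p["annual_cost"]) / p["co2_per_year"]
    print(f"Limit price: {limit_price:.0f} PLN/Mg  ->  NPV there: {npv(limit_price, **p):,.0f} PLN")
    ```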

  7. Factors influencing the variation in capture rates of shrews in southern California, USA

    USGS Publications Warehouse

    Laakkonen, Juha; Fisher, Robert N.; Case, Ted J.

    2003-01-01

    We examined the temporal variation in capture rates of the shrews Notiosorex crawfordi (Coues, 1877) and Sorex ornatus (Merriam, 1895) in 20 sites representing fragmented and continuous habitats in southern California, USA. In N. crawfordi, the temporal variation was significantly correlated with the mean capture rates. Of the 6 landscape variables analyzed (size of the landscape, size of the sample area, altitude, edge, longitude and latitude), sample area was positively correlated with variation in capture rates of N. crawfordi. In S. ornatus, longitude was negatively correlated with variation in capture rates. Analysis of the effect of precipitation on the short- and long-term capture rates at 2 of the sites showed no correlation between rainfall and capture rates of shrews, even though the peak numbers of shrews at both sites were reached during the year with the highest rainfall. A key problem confounding capture rates of shrews in southern California is the low overall abundance of both shrew species in all habitats and seasons.

  8. Systematic R-matrix analysis of the 13C(p,γ)14N capture reaction

    NASA Astrophysics Data System (ADS)

    Chakraborty, Suprita; deBoer, Richard; Mukherjee, Avijit; Roy, Subinit

    2015-04-01

    Background: The proton capture reaction 13C(p,γ)14N is an important reaction in the CNO cycle during hydrogen burning in stars with mass greater than the mass of the Sun. It also occurs in astrophysical sites such as red giant stars: the asymptotic giant branch (AGB) stars. The low-energy astrophysical S factor of this reaction is dominated by a resonance state at an excitation energy of around 8.06 MeV (Jπ = 1-, T = 1) in 14N. The other significant contributions come from the low-energy tail of the broad resonance with Jπ = 0-, T = 1 at an excitation of 8.78 MeV and from the direct capture process. Purpose: Measurements of the low-energy astrophysical S factor of the radiative capture reaction 13C(p,γ)14N reported extrapolated values of S(0) that differ by about 30%. Subsequent R-matrix analyses and potential model calculations also yielded significantly different values for S(0). The present work looks into this discrepancy through a detailed R-matrix analysis with emphasis on the associated uncertainties. Method: A systematic reanalysis of the available decay data following capture to the Jπ = 1-, T = 1 resonance state of 14N around 8.06 MeV excitation was performed within the framework of the R-matrix method. A simultaneous analysis of the 13C(p,p0) data, measured over a similar energy range, was carried out with the capture data. The data for the ground-state decay of the broad resonance state (Jπ = 0-, T = 1) around 8.78 MeV excitation were included as well. The external capture model, along with background poles to simulate the internal capture contribution, was used to estimate the direct capture contribution. The asymptotic normalization constants (ANCs) for all states were extracted from the capture data. The multichannel, multilevel R-matrix code azure2 was used for the calculation. Results: The values of the astrophysical S factor at zero relative energy resulting from the present analysis are found to be consistent within the error bars for the two sets of capture data used. However, it is found from the fits to the elastic scattering data that the position of the Jπ = 1-, T = 1 resonance state is uncertain by about 0.6 keV, with a preferred excitation energy value of 8.062 MeV. The extracted ANC values for the states of 14N also corroborate the values from the transfer reaction studies. The reaction rates from the present calculation are about 10-15% lower than the values of the NACRE II compilation but compare well with those from NACRE I. Conclusion: The precise energy of the Jπ = 1-, T = 1 resonance level around 8.06 MeV in 14N must be determined. Further precision measurements around and below 100 keV are necessary to reduce the uncertainty in the S-factor value at zero relative energy.

  9. DETECTION OF K-RAS AND P53 MUTATIONS IN SPUTUM SAMPLES OF LUNG CANCER PATIENTS USING LASER CAPTURE MICRODISSECTION MICROSCOPE AND MUTATION ANALYSIS

    EPA Science Inventory

    Detection of K-ras and p53 Mutations in Sputum Samples of Lung Cancer Patients Using Laser Capture Microdissection Microscope and Mutation Analysis

    Phouthone Keohavong, Wei-Min Gao, Kui-Cheng Zheng, Hussam Mady, Qing Lan, Mona Melhem, and Judy Mumford.
    <...

  10. DNA-barcode directed capture and electrochemical metabolic analysis of single mammalian cells on a microelectrode array.

    PubMed

    Douglas, Erik S; Hsiao, Sonny C; Onoe, Hiroaki; Bertozzi, Carolyn R; Francis, Matthew B; Mathies, Richard A

    2009-07-21

    A microdevice is developed for DNA-barcode directed capture of single cells on an array of pH-sensitive microelectrodes for metabolic analysis. Cells are modified with membrane-bound single-stranded DNA, and specific single-cell capture is directed by the complementary strand bound in the sensor area of the iridium oxide pH microelectrodes within a microfluidic channel. This bifunctional microelectrode array is demonstrated for the pH monitoring and differentiation of primary T cells and Jurkat T lymphoma cells. Single Jurkat cells exhibited an extracellular acidification rate of 11 milli-pH min(-1), while primary T cells exhibited only 2 milli-pH min(-1). This system can be used to capture non-adherent cells specifically and to discriminate between visually similar healthy and cancerous cells in a heterogeneous ensemble based on their altered metabolic properties.

  11. DNA-barcode directed capture and electrochemical metabolic analysis of single mammalian cells on a microelectrode array

    PubMed Central

    Douglas, Erik S.; Hsiao, Sonny C.; Onoe, Hiroaki; Bertozzi, Carolyn R.; Francis, Matthew B.; Mathies, Richard A.

    2010-01-01

    A microdevice is developed for DNA-barcode directed capture of single cells on an array of pH-sensitive microelectrodes for metabolic analysis. Cells are modified with membrane-bound single-stranded DNA, and specific single-cell capture is directed by the complementary strand bound in the sensor area of the iridium oxide pH microelectrodes within a microfluidic channel. This bifunctional microelectrode array is demonstrated for the pH monitoring and differentiation of primary T cells and Jurkat T lymphoma cells. Single Jurkat cells exhibited an extracellular acidification rate of 11 milli-pH min−1, while primary T cells exhibited only 2 milli-pH min−1. This system can be used to capture non-adherent cells specifically and to discriminate between visually similar healthy and cancerous cells in a heterogeneous ensemble based on their altered metabolic properties. PMID:19568668

  12. Integrated studies on the use of cognitive task analysis to capture surgical expertise for central venous catheter placement and open cricothyrotomy.

    PubMed

    Yates, Kenneth; Sullivan, Maura; Clark, Richard

    2012-01-01

    Cognitive task analysis (CTA) methods were used for 2 surgical procedures to determine (1) the extent to which experts omitted critical information, (2) the number of experts required to capture the optimal amount of information, and (3) the effectiveness of a CTA-informed curriculum. Six expert physicians for both the central venous catheter placement and open cricothyrotomy were interviewed. The transcripts were coded, corrected, and aggregated as a "gold standard." The information captured for each surgeon was then analyzed against the gold standard. Experts omitted an average of 34% of the decisions for the central venous catheter and 77% of the decisions for the open cricothyrotomy. Three to 4 experts were required to capture the optimal amount of information. Significant positive effects on performance (t(21) = 2.08, P = .050) and self-efficacy ratings (t(18) = 2.38, P = .029) were found for the CTA-informed curriculum for cricothyrotomy. CTA is an effective method to capture expertise in surgery and a valuable component to improve surgical training. Copyright © 2012 Elsevier Inc. All rights reserved.

  13. Robust object tracking techniques for vision-based 3D motion analysis applications

    NASA Astrophysics Data System (ADS)

    Knyaz, Vladimir A.; Zheltov, Sergey Y.; Vishnyakov, Boris V.

    2016-04-01

    Automated and accurate spatial motion capture of an object is necessary for a wide variety of applications in industry and science, virtual reality and film, medicine and sports. For most applications, the reliability and accuracy of the obtained data, as well as convenience for the user, are the main characteristics defining the quality of a motion capture system. Among the existing systems for 3D data acquisition, which are based on different physical principles (accelerometry, magnetometry, time-of-flight, vision-based), optical motion capture systems have a set of advantages such as high acquisition speed, potential for high accuracy, and automation based on advanced image processing algorithms. For vision-based motion capture, accurate and robust detection and tracking of object features through the video sequence are the key elements, along with the level of automation of the capturing process. To provide high accuracy of the obtained spatial data, the developed vision-based motion capture system "Mosca" is based on photogrammetric principles of 3D measurement and supports high-speed image acquisition in synchronized mode. It includes from 2 to 4 technical vision cameras for capturing video sequences of object motion. The original camera calibration and external orientation procedures provide the basis for high accuracy of 3D measurements. A set of algorithms, both for detecting, identifying and tracking similar targets and for marker-less object motion capture, was developed and tested. The evaluation results show high robustness and reliability for various motion analysis tasks in technical and biomechanics applications.

  14. Dielectrophoretic Capture and Genetic Analysis of Single Neuroblastoma Tumor Cells

    PubMed Central

    Carpenter, Erica L.; Rader, JulieAnn; Ruden, Jacob; Rappaport, Eric F.; Hunter, Kristen N.; Hallberg, Paul L.; Krytska, Kate; O’Dwyer, Peter J.; Mosse, Yael P.

    2014-01-01

    Our understanding of the diversity of cells that escape the primary tumor and seed micrometastases remains rudimentary, and approaches for studying circulating and disseminated tumor cells have been limited by low throughput and sensitivity, reliance on single-parameter sorting, and a focus on enumeration rather than phenotypic and genetic characterization. Here, we utilize a highly sensitive microfluidic and dielectrophoretic approach for the isolation and genetic analysis of individual tumor cells. We employed fluorescence labeling to isolate 208 single cells from spiking experiments conducted with 11 cell lines, including 8 neuroblastoma cell lines, and achieved a capture sensitivity of 1 tumor cell per 10^6 white blood cells (WBCs). Sample fixation or freezing had no detectable effect on cell capture. Point mutations were accurately detected in the whole genome amplification product of captured single tumor cells but not in negative control WBCs. We applied this approach to capture 144 single tumor cells from 10 bone marrow samples of patients suffering from neuroblastoma. In this pediatric malignancy, high-risk patients often exhibit widespread hematogenous metastasis, but access to the primary tumor can be difficult or impossible. Here, we used flow-based sorting to pre-enrich samples with tumor involvement below 0.02%. For all patients for whom a mutation in the Anaplastic Lymphoma Kinase gene had already been detected in their primary tumor, the same mutation was detected in single cells from their marrow. These findings demonstrate a novel, non-invasive, and adaptable method for the capture and genetic analysis of single tumor cells from cancer patients. PMID:25133137

  15. Gold patterned biochips for on-chip immuno-MALDI-TOF MS: SPR imaging coupled multi-protein MS analysis.

    PubMed

    Kim, Young Eun; Yi, So Yeon; Lee, Chang-Soo; Jung, Yongwon; Chung, Bong Hyun

    2012-01-21

    Matrix-assisted laser desorption/ionization time-of-flight mass spectrometry (MALDI-TOF-MS) analysis of immuno-captured target protein efficiently complements conventional immunoassays by offering rich molecular information such as protein isoforms or modifications. Direct immobilization of antibodies on MALDI solid support enables both target enrichment and MS analysis on the same plate, allowing simplified and potentially multiplexing protein MS analysis. Reliable on-chip immuno-MALDI-TOF MS for multiple biomarkers requires successful adaptation of antibody array biochips, which also must accommodate consistent reaction conditions on antibody arrays during immuno-capture and MS analysis. Here we developed a facile fabrication process of versatile antibody array biochips for reliable on-chip MALDI-TOF-MS analysis of multiple immuno-captured proteins. Hydrophilic gold arrays surrounded by super-hydrophobic surfaces were formed on a gold patterned biochip via spontaneous chemical or protein layer deposition. From antibody immobilization to MALDI matrix treatment, this hydrophilic/phobic pattern allowed highly consistent surface reactions on each gold spot. Various antibodies were immobilized on these gold spots both by covalent coupling or protein G binding. Four different protein markers were successfully analyzed on the present immuno-MALDI biochip from complex protein mixtures including serum samples. Tryptic digests of captured PSA protein were also effectively detected by on-chip MALDI-TOF-MS. Moreover, the present MALDI biochip can be directly applied to the SPR imaging system, by which antibody and subsequent antigen immobilization were successfully monitored.

  16. Computational simulation of extravehicular activity dynamics during a satellite capture attempt.

    PubMed

    Schaffner, G; Newman, D J; Robinson, S K

    2000-01-01

    A more quantitative approach to the analysis of astronaut extravehicular activity (EVA) tasks is needed because of their increasing complexity, particularly in preparation for the on-orbit assembly of the International Space Station. Existing useful EVA computer analyses produce either high-resolution three-dimensional computer images based on anthropometric representations or empirically derived predictions of astronaut strength based on lean body mass and the position and velocity of body joints but do not provide multibody dynamic analysis of EVA tasks. Our physics-based methodology helps fill the current gap in quantitative analysis of astronaut EVA by providing a multisegment human model and solving the equations of motion in a high-fidelity simulation of the system dynamics. The simulation work described here improves on the realism of previous efforts by including three-dimensional astronaut motion, incorporating joint stops to account for the physiological limits of range of motion, and incorporating use of constraint forces to model interaction with objects. To demonstrate the utility of this approach, the simulation is modeled on an actual EVA task, namely, the attempted capture of a spinning Intelsat VI satellite during STS-49 in May 1992. Repeated capture attempts by an EVA crewmember were unsuccessful because the capture bar could not be held in contact with the satellite long enough for the capture latches to fire and successfully retrieve the satellite.

  17. Single cell array impedance analysis in a microfluidic device

    NASA Astrophysics Data System (ADS)

    Altinagac, Emre; Taskin, Selen; Kizil, Huseyin

    2016-10-01

    Impedance analysis of single cells is presented in this paper. Following the separation of a target cell type by dielectrophoresis in our previous work, this paper focuses on capturing the cells as a single-cell array and performing impedance analysis to point out the signature difference between each cell type. Lab-on-a-chip devices having a titanium interdigitated electrode layer on a glass substrate and a PDMS microchannel are fabricated to capture each cell individually and perform impedance analysis. HCT116 (human colon colorectal carcinoma) and HEK293 (human embryonic kidney) cells are used in our experiments.

  18. Capture of activation during ventricular arrhythmia using distributed stimulation.

    PubMed

    Meunier, Jason M; Ramalingam, Sanjiv; Lin, Shien-Fong; Patwardhan, Abhijit R

    2007-04-01

    Results of previous studies suggest that pacing strength stimuli can capture activation during ventricular arrhythmia locally near pacing sites. The existence of spatio-temporal distribution of excitable gap during arrhythmia suggests that multiple and timed stimuli delivered over a region may permit capture over larger areas. Our objective in this study was to evaluate the efficacy of using spatially distributed pacing (DP) to capture activation during ventricular arrhythmia. Data were obtained from rabbit hearts which were placed against a lattice of parallel wires through which biphasic pacing stimuli were delivered. Electrical activity was recorded optically. Pacing stimuli were delivered in sequence through the parallel wires starting with the wire closest to the apex and ending with one closest to the base. Inter-stimulus delay was based on conduction velocity. Time-frequency analysis of optical signals was used to determine variability in activation. A decrease in standard deviation of dominant frequencies of activation from a grid of locations that spanned the captured area and a concurrence with paced frequency were used as an index of capture. Results from five animals showed that the average standard deviation decreased from 0.81 Hz during arrhythmia to 0.66 Hz during DP at pacing cycle length of 125 ms (p = 0.03) reflecting decreased spatio-temporal variability in activation during DP. Results of time-frequency analysis during these pacing trials showed agreement between activation and paced frequencies. These results show that spatially distributed and timed stimulation can be used to modify and capture activation during ventricular arrhythmia.
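
    A minimal sketch (not the authors' pipeline) of the capture index described above: estimate each site's dominant frequency from its optical signal and take the standard deviation across the recording grid. The sample rate, the synthetic signals, and the 8 Hz "arrhythmia" are assumptions for illustration only.

```python
# Minimal sketch of the capture index: dominant frequency per site, then the
# standard deviation of those frequencies across the mapped grid.
import numpy as np

def dominant_frequency(signal, fs):
    """Frequency of the largest non-DC spectral peak, in Hz."""
    spectrum = np.abs(np.fft.rfft(signal - signal.mean()))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    return freqs[np.argmax(spectrum[1:]) + 1]

fs = 500.0                          # assumed optical-mapping sample rate [Hz]
t = np.arange(0, 2.0, 1.0 / fs)
rng = np.random.default_rng(0)
# Synthetic "arrhythmia" signals: each site oscillates near 8 Hz with jitter.
sites = [np.sin(2 * np.pi * (8.0 + rng.normal(0, 0.5)) * t) for _ in range(25)]

dfs = np.array([dominant_frequency(s, fs) for s in sites])
print("mean DF = %.2f Hz, SD = %.2f Hz" % (dfs.mean(), dfs.std()))
# A drop in this SD toward the paced frequency would be read as capture.
```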

  19. A comparative analysis of exome capture.

    PubMed

    Parla, Jennifer S; Iossifov, Ivan; Grabill, Ian; Spector, Mona S; Kramer, Melissa; McCombie, W Richard

    2011-09-29

    Human exome resequencing using commercial target capture kits has been and is being used for sequencing large numbers of individuals to search for variants associated with various human diseases. We rigorously evaluated the capabilities of two solution exome capture kits. These analyses help clarify the strengths and limitations of those data as well as systematically identify variables that should be considered in the use of those data. Each exome kit performed well at capturing the targets it was designed to capture, which mainly correspond to the consensus coding sequences (CCDS) annotations of the human genome. In addition, based on their respective targets, each capture kit coupled with high coverage Illumina sequencing produced highly accurate nucleotide calls. However, other databases, such as the Reference Sequence collection (RefSeq), define the exome more broadly, and so not surprisingly, the exome kits did not capture these additional regions. Commercial exome capture kits provide a very efficient way to sequence select areas of the genome at very high accuracy. Here we provide the data to help guide critical analyses of sequencing data derived from these products.
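
    A hedged illustration of the kind of target-versus-annotation comparison discussed above: computing what fraction of a broader exon annotation (RefSeq-style intervals) lies inside a kit's designed capture targets. The intervals below are toy values, not real kit or RefSeq tracks.

```python
# Toy interval-overlap calculation: fraction of annotated exon bases that fall
# inside the kit's capture targets.
def merge(intervals):
    merged = []
    for start, end in sorted(intervals):
        if merged and start <= merged[-1][1]:
            merged[-1][1] = max(merged[-1][1], end)
        else:
            merged.append([start, end])
    return merged

def covered_bases(exons, targets):
    targets = merge(targets)
    total = covered = 0
    for e_start, e_end in merge(exons):
        total += e_end - e_start
        for t_start, t_end in targets:
            covered += max(0, min(e_end, t_end) - max(e_start, t_start))
    return covered, total

exons = [(100, 200), (300, 420), (1000, 1100)]      # toy "RefSeq" exons
targets = [(120, 200), (300, 400)]                  # toy kit targets
cov, tot = covered_bases(exons, targets)
print("fraction of exon bases targeted: %.2f" % (cov / tot))
```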

  20. Combinatorial ligand libraries as a two-dimensional method for proteome analysis.

    PubMed

    Santucci, Laura; Candiano, Giovanni; Petretto, Andrea; Lavarello, Chiara; Bruschi, Maurizio; Ghiggeri, Gian Marco; Citterio, Attilio; Righetti, Pier Giorgio

    2013-07-05

    The present report tries to assess the possibility of performing capture of proteomes via combinatorial peptide ligand libraries (CPLL) in a two-dimensional (2D) mode, i.e. via orthogonal complementarity in the capture phase. To that aim, serum proteins are captured at physiological pH either at low ionic strength (25 mM NaCl) or at high concentrations of lyotropic salts of the Hofmeister series (1 M ammonium sulphate) favouring hydrophobic interaction. Indeed, such a 2D mechanism seems to be operative, since 52% of the captured proteins are common to the two capture modes, 20% are specific to the "ionic" interaction mode and 28% are found only in the "hydrophobically" driven interaction. As an additional bonus, losses of protein species from the initial sample, one of the major drawbacks of CPLLs, are diminished to about 5% and are found only in the ionic capture, whereas the hydrophobically engendered capture is loss-free. Copyright © 2013 Elsevier B.V. All rights reserved.

  1. DEVELOPMENT OF A NOVEL GAS PRESSURIZED STRIPPING (GPS)-BASED TECHNOLOGY FOR CO2 CAPTURE FROM POST-COMBUSTION FLUE GASES Topical Report: Techno-Economic Analysis of GPS-based Technology for CO2 Capture

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, Shiaoguo

    This topical report presents the techno-economic analysis, conducted by Carbon Capture Scientific, LLC (CCS) and Nexant, for a nominal 550 MWe supercritical pulverized coal (PC) power plant utilizing CCS's patented Gas Pressurized Stripping (GPS) technology for post-combustion carbon capture (PCC). Illinois No. 6 coal is used as fuel. Because of the difference in performance between the GPS-based PCC and the MEA-based CO2 absorption technology, the net power output of this plant is not exactly 550 MWe. DOE/NETL Case 11 (supercritical PC plant without CO2 capture) and Case 12 (supercritical PC plant with benchmark MEA-based CO2 capture) are chosen as references. In order to include the CO2 compression process for the baseline case, CCS independently evaluated the generic 30 wt% MEA-based PCC process together with the CO2 compression section. The net power produced in the supercritical PC plant with GPS-based PCC is 647 MW, greater than the MEA-based design. The levelized cost of electricity (LCOE) over a 20-year period is adopted to assess techno-economic performance. The LCOE for the supercritical PC plant with GPS-based PCC, not considering CO2 transport, storage and monitoring (TS&M), is 97.4 mills/kWh, or 152% of that of the Case 11 supercritical PC plant without CO2 capture, equivalent to $39.6/tonne for the cost of CO2 capture. GPS-based PCC is also significantly superior to the generic MEA-based PCC with CO2 compression section, whose LCOE is as high as 109.6 mills/kWh.
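
    The "cost of CO2 capture" figure quoted above can be illustrated with a rough back-of-the-envelope calculation: the LCOE increase over the no-capture reference divided by the CO2 captured per net kWh. The sketch below uses the LCOE numbers from the abstract but assumes a placeholder capture intensity, so it only indicates how the metric is formed, not the report's actual economics.

```python
# Rough, illustrative calculation of a "cost of CO2 capture" metric.
# The capture intensity below is an assumed placeholder, not a report value.
lcoe_capture_mills = 97.4          # mills/kWh, GPS-based PCC (from the abstract)
lcoe_reference_mills = 97.4 / 1.52 # mills/kWh, implied no-capture reference
co2_captured_kg_per_kwh = 0.84     # assumed capture intensity [kg CO2 / net kWh]

delta_usd_per_kwh = (lcoe_capture_mills - lcoe_reference_mills) / 1000.0
cost_per_tonne = delta_usd_per_kwh / (co2_captured_kg_per_kwh / 1000.0)
print("approximate cost of capture: $%.1f per tonne CO2" % cost_per_tonne)
```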

  2. Performance analysis of a generalized upset detection procedure

    NASA Technical Reports Server (NTRS)

    Blough, Douglas M.; Masson, Gerald M.

    1987-01-01

    A general procedure for upset detection in complex systems, called the data block capture and analysis upset monitoring process is described and analyzed. The process consists of repeatedly recording a fixed amount of data from a set of predetermined observation lines of the system being monitored (i.e., capturing a block of data), and then analyzing the captured block in an attempt to determine whether the system is functioning correctly. The algorithm which analyzes the data blocks can be characterized in terms of the amount of time it requires to examine a given length data block to ascertain the existence of features/conditions that have been predetermined to characterize the upset-free behavior of the system. The performance of linear, quadratic, and logarithmic data analysis algorithms is rigorously characterized in terms of three performance measures: (1) the probability of correctly detecting an upset; (2) the expected number of false alarms; and (3) the expected latency in detecting upsets.
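
    A schematic sketch of the data-block capture and analysis idea described above: capture fixed-length blocks, apply an analysis routine that checks a predetermined "upset-free" feature, and estimate detection probability and false-alarm counts empirically. The block length, threshold, test statistic, and injected upsets are all assumptions for illustration, not the paper's algorithms.

```python
# Schematic block-capture monitor: flag a block when a predetermined
# "healthy" feature (here, bounded peak deviation) is violated.
import numpy as np

BLOCK_LEN = 256
THRESHOLD = 4.0          # assumed decision threshold on the block statistic

def analyze_block(block):
    """Toy linear-time analysis: flag the block if its peak deviation is large."""
    return np.max(np.abs(block - block.mean())) > THRESHOLD

rng = np.random.default_rng(1)
n_blocks, false_alarms, detections = 1000, 0, 0
for i in range(n_blocks):
    block = rng.normal(0.0, 1.0, BLOCK_LEN)
    upset = i % 50 == 0                 # inject an upset in 2% of blocks
    if upset:
        block[100:110] += 8.0           # transient disturbance
    flagged = analyze_block(block)
    false_alarms += int(flagged and not upset)
    detections += int(flagged and upset)

print("empirical detection rate:", detections / (n_blocks // 50))
print("false alarms per 1000 blocks:", false_alarms)
```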

  3. The adaptation of GDL motion recognition system to sport and rehabilitation techniques analysis.

    PubMed

    Hachaj, Tomasz; Ogiela, Marek R

    2016-06-01

    The main novelty of this paper is presenting the adaptation of the Gesture Description Language (GDL) methodology to sport and rehabilitation data analysis and classification. In this paper we show that the Lua language can be successfully used to adapt the GDL classifier to those tasks. The newly applied scripting language allows easy extension and integration of the classifier with other software technologies and applications. The obtained execution speed allows the methodology to be used in real-time motion capture data processing, where the capture frequency ranges from 100 Hz to as much as 500 Hz depending on the number of features or classes to be calculated and recognized. Due to this fact the proposed methodology can be used with high-end motion capture systems. We anticipate that using this novel, efficient and effective method will greatly help both sports trainers and physiotherapists in their practice. The proposed approach can be directly applied to motion capture data kinematics analysis (evaluation of motion without regard to the forces that cause that motion). The ability to apply pattern recognition methods to GDL descriptions can be utilized in virtual reality environments and used for sport training or rehabilitation treatment.

  4. Early Design Choices: Capture, Model, Integrate, Analyze, Simulate

    NASA Technical Reports Server (NTRS)

    Malin, Jane T.

    2004-01-01

    I. Designs are constructed incrementally to meet requirements and solve problems: a) Requirements types: objectives, scenarios, constraints, ilities. etc. b) Problem/issue types: risk/safety, cost/difficulty, interaction, conflict, etc. II. Capture requirements, problems and solutions: a) Collect design and analysis products and make them accessible for integration and analysis; b) Link changes in design requirements, problems and solutions; and c) Harvest design data for design models and choice structures. III. System designs are constructed by multiple groups designing interacting subsystems a) Diverse problems, choice criteria, analysis methods and point solutions. IV. Support integration and global analysis of repercussions: a) System implications of point solutions; b) Broad analysis of interactions beyond totals of mass, cost, etc.

  5. Water Microbiology Kit/Microbial Capture Devices (WMK MCD)

    NASA Image and Video Library

    2009-08-04

    ISS020-E-027318 (4 Aug. 2009) --- Canadian Space Agency astronaut Robert Thirsk, Expedition 20 flight engineer, performs a subsequent in-flight analysis with a Water Microbiology Kit/Microbial Capture Devices (WMK MCD) for microbial traces in the Destiny laboratory of the International Space Station.

  6. Minimizing target interference in PK immunoassays: new approaches for low-pH-sample treatment.

    PubMed

    Partridge, Michael A; Pham, John; Dziadiv, Olena; Luong, Onson; Rafique, Ashique; Sumner, Giane; Torri, Albert

    2013-08-01

    Quantitating total levels of monoclonal antibody (mAb) biotherapeutics in serum using ELISA may be hindered by soluble targets. We developed two low-pH-sample-pretreatment techniques to minimize target interference. The first procedure involves sample pretreatment at pH <3.0 before neutralization and analysis in a target capture ELISA. Careful monitoring of acidification time is required to minimize potential impact on mAb detection. The second approach involves sample dilution into mild acid (pH ∼4.5) before transferring to an anti-human capture-antibody-coated plate without neutralization. Analysis of target-drug and drug-capture antibody interactions at pH 4.5 indicated that the capture antibody binds to the drug, while the drug and the target were dissociated. Using these procedures, total biotherapeutic levels were accurately measured when soluble target was >30-fold molar excess. These techniques provide alternatives for quantitating mAb biotherapeutics in the presence of a target when standard acid-dissociation procedures are ineffective.

  7. Thermal Neutron Capture onto the Stable Tungsten Isotopes

    NASA Astrophysics Data System (ADS)

    Hurst, A. M.; Firestone, R. B.; Sleaford, B. W.; Summers, N. C.; Revay, Zs.; Szentmiklósi, L.; Belgya, T.; Basunia, M. S.; Capote, R.; Choi, H.; Dashdorj, D.; Escher, J.; Krticka, M.; Nichols, A.

    2012-02-01

    Thermal neutron-capture measurements of the stable tungsten isotopes have been carried out using the guided thermal-neutron beam at the Budapest Reactor. Prompt singles spectra were collected and analyzed using the HYPERMET γ-ray analysis software package for the compound tungsten systems 183W, 184W, and 187W, prepared from isotopically-enriched samples of 182W, 183W, and 186W, respectively. These new data provide both confirmation of and new insights into the decay schemes and structure of the tungsten isotopes reported in the Evaluated Gamma-ray Activation File based upon previous elemental analysis. The experimental data have also been compared to Monte Carlo simulations of γ-ray emission following the thermal neutron-capture process using the statistical-decay code DICEBOX. Together, the experimental cross sections and the modeled feeding contribution from the quasi-continuum have been used to determine the total radiative thermal neutron-capture cross sections for the tungsten isotopes and provide improved decay-scheme information for the structural- and neutron-data libraries.

  8. Sequence selective capture, release and analysis of DNA using a magnetic microbead-assisted toehold-mediated DNA strand displacement reaction.

    PubMed

    Khodakov, Dmitriy A; Khodakova, Anastasia S; Linacre, Adrian; Ellis, Amanda V

    2014-07-21

    This paper reports on the modification of magnetic beads with oligonucleotide capture probes with a specially designed pendant toehold (overhang) aimed specifically to capture double-stranded PCR products. After capture, the PCR products were selectively released from the magnetic beads by means of a toehold-mediated strand displacement reaction using short artificial oligonucleotide triggers and analysed using capillary electrophoresis. The approach was successfully shown on two genes widely used in human DNA genotyping, namely human c-fms (macrophage colony-stimulating factor) proto-oncogene for the CSF-1 receptor (CSF1PO) and amelogenin.

  9. Mobile Motion Capture--MiMiC.

    PubMed

    Harbert, Simeon D; Jaiswal, Tushar; Harley, Linda R; Vaughn, Tyler W; Baranak, Andrew S

    2013-01-01

    The low cost, simple, robust, mobile, and easy to use Mobile Motion Capture (MiMiC) system is presented and the constraints which guided the design of MiMiC are discussed. The MiMiC Android application allows motion data to be captured from kinematic modules such as Shimmer 2r sensors over Bluetooth. MiMiC is cost effective and can be used for an entire day in a person's daily routine without being intrusive. MiMiC is a flexible motion capture system which can be used for many applications including fall detection, detection of fatigue in industry workers, and analysis of individuals' work patterns in various environments.

  10. What's Next in Complex Networks? Capturing the Concept of Attacking Play in Invasive Team Sports.

    PubMed

    Ramos, João; Lopes, Rui J; Araújo, Duarte

    2018-01-01

    The evolution of performance analysis within sports sciences is tied to technology development and practitioner demands. However, how individual and collective patterns self-organize and interact in invasive team sports remains elusive. Social network analysis has been recently proposed to resolve some aspects of this problem, and has proven successful in capturing collective features resulting from the interactions between team members, as well as serving as a powerful communication tool. Despite these advances, some fundamental team sports concepts such as an attacking play have not been properly captured by the more common applications of social network analysis to team sports performance. In this article, we propose a novel approach to team sports performance centered on sport concepts, namely that of an attacking play. Network theory and tools including temporal and bipartite or multilayered networks were used to capture this concept. We put forward eight questions directly related to team performance to discuss how common pitfalls in the use of network tools for capturing sports concepts can be avoided. Some answers are advanced in an attempt to be more precise in the description of team dynamics and to uncover other metrics directly applied to sport concepts, such as the structure and dynamics of attacking plays. Finally, we propose that, at this stage of knowledge, it may be advantageous to build up from fundamental sport concepts toward complex network theory and tools, and not the other way around.
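
    A speculative sketch of the bipartite representation discussed above, using the networkx library: one node set for players, one for attacking plays, with an edge whenever a player takes part in a play; projecting onto players gives co-participation weights. Player and play names are invented.

```python
# Bipartite player/attacking-play network and its weighted player projection.
import networkx as nx
from networkx.algorithms import bipartite

G = nx.Graph()
plays = ["play_1", "play_2", "play_3"]
players = ["GK", "DF1", "MF1", "MF2", "FW1"]
G.add_nodes_from(plays, bipartite="play")
G.add_nodes_from(players, bipartite="player")
G.add_edges_from([
    ("play_1", "DF1"), ("play_1", "MF1"), ("play_1", "FW1"),
    ("play_2", "MF1"), ("play_2", "MF2"), ("play_2", "FW1"),
    ("play_3", "GK"),  ("play_3", "DF1"), ("play_3", "MF2"),
])

# Project onto players: edge weights count how often two players co-occur
# in the same attacking play.
P = bipartite.weighted_projected_graph(G, players)
for u, v, data in P.edges(data=True):
    print(u, v, data["weight"])
```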

  11. Measurement of θ13 in Double Chooz using neutron captures on hydrogen with novel background rejection techniques

    DOE PAGES

    Abe, Y.; Appel, S.; Abrahão, T.; ...

    2016-01-27

    The Double Chooz collaboration presents a measurement of the neutrino mixing angle θ13 using reactor ν̄e observed via the inverse beta decay reaction in which the neutron is captured on hydrogen. Our measurement is based on 462.72 live days of data, approximately twice as much data as in the previous such analysis, collected with a detector positioned at an average distance of 1050 m from two reactor cores. Several novel techniques have been developed to achieve significant reductions of the backgrounds and systematic uncertainties. Accidental coincidences, the dominant background in this analysis, are suppressed by more than an order of magnitude with respect to our previous publication by a multi-variate analysis. Furthermore, these improvements demonstrate the capability of precise measurement of reactor ν̄e without gadolinium loading. Spectral distortions from the ν̄e reactor flux predictions previously reported with the neutron capture on gadolinium events are confirmed in the independent data sample presented here. A value of sin²2θ13 = 0.095 +0.038/−0.039 (stat+syst) is obtained from a fit to the observed event rate as a function of the reactor power, a method insensitive to the energy spectrum shape. A simultaneous fit of the hydrogen capture events and of the gadolinium capture events yields a measurement of sin²2θ13 = 0.088 ± 0.033 (stat+syst).

  12. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lorenz, Matthias; Ovchinnikova, Olga S; Van Berkel, Gary J

    RATIONALE: Laser ablation provides for the possibility of sampling a large variety of surfaces with high spatial resolution. This type of sampling when employed in conjunction with liquid capture followed by nanoelectrospray ionization provides the opportunity for sensitive and prolonged interrogation of samples by mass spectrometry as well as the ability to analyze surfaces not amenable to direct liquid extraction. METHODS: A fully automated, reflection geometry, laser ablation liquid capture spot sampling system was achieved by incorporating appropriate laser fiber optics and a focusing lens into a commercially available, liquid extraction surface analysis (LESA) ready Advion TriVersa NanoMate system. RESULTS: Under optimized conditions about 10% of laser ablated material could be captured in a droplet positioned vertically over the ablation region using the NanoMate robot controlled pipette. The sampling spot size area with this laser ablation liquid capture surface analysis (LA/LCSA) mode of operation (typically about 120 µm x 160 µm) was approximately 50 times smaller than that achievable by direct liquid extraction using LESA (ca. 1 mm diameter liquid extraction spot). The set-up was successfully applied for the analysis of ink on glass and paper as well as the endogenous components in Alstroemeria Yellow King flower petals. In a second mode of operation with a comparable sampling spot size, termed laser ablation/LESA, the laser system was used to drill through, penetrate, or otherwise expose material beneath a solvent resistant surface. Once drilled, LESA was effective in sampling soluble material exposed at that location on the surface. CONCLUSIONS: Incorporating the capability for different laser ablation liquid capture spot sampling modes of operation into a LESA ready Advion TriVersa NanoMate enhanced the spot sampling spatial resolution of this device and broadened the surface types amenable to analysis to include absorbent and solvent resistant materials.

  13. An automated time and hand motion analysis based on planar motion capture extended to a virtual environment

    NASA Astrophysics Data System (ADS)

    Tinoco, Hector A.; Ovalle, Alex M.; Vargas, Carlos A.; Cardona, María J.

    2015-09-01

    In the context of industrial engineering, predetermined time systems (PTS) play an important role in workplaces because inefficiencies are found in assembly processes that require manual manipulation. In this study, an approach is proposed with the aim of analyzing time and motions in a manual process using a motion capture system embedded in a virtual environment. The motion capture system tracks passive IR markers located on the hands to obtain the position of each one. For our purpose, a real workplace is virtually represented by domains to create a virtual workplace based on basic geometries. Motion capture data are combined with the virtual workplace to simulate operations carried out on it, and a time and motion analysis is completed by means of an algorithm. To test the analysis methodology, a case study was intentionally designed using and violating the principles of motion economy. In the results, it was possible to observe where the hands never crossed as well as where the hands passed through the same place. In addition, the activities done in each zone were observed and some known deficiencies were identified in the distribution of the workplace by computational analysis. Using a frequency analysis of hand velocities, errors in the chosen assembly method were revealed, showing differences in the hand velocities. An opportunity is seen to classify some quantifiable aspects that are not identified easily in a traditional time and motion analysis. The automated analysis is considered the main contribution of this study. In the industrial context, a great application is perceived in terms of monitoring the workplace to analyze repeatability, PTS, and the redistribution of the workplace and labor activities using the proposed methodology.
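
    A hedged sketch of the kind of automated check described above: assign each captured hand position to a rectangular zone of a virtual workplace, accumulate time per zone, and compute hand speeds for frequency-style analysis. The zones, sample rate, and trajectory are made up for illustration and are not taken from the study.

```python
# Toy time-and-motion bookkeeping over a virtual workplace made of zones.
import numpy as np

FS = 120.0  # assumed marker capture rate [Hz]
zones = {   # axis-aligned zones (xmin, xmax, ymin, ymax) in metres, assumed
    "bins":     (0.0, 0.3, 0.0, 0.4),
    "assembly": (0.3, 0.7, 0.0, 0.4),
    "fixture":  (0.7, 1.0, 0.0, 0.4),
}

rng = np.random.default_rng(2)
xy = np.cumsum(rng.normal(0, 0.01, size=(600, 2)), axis=0) + [0.5, 0.2]

time_in_zone = {name: 0.0 for name in zones}
for x, y in xy:
    for name, (x0, x1, y0, y1) in zones.items():
        if x0 <= x < x1 and y0 <= y < y1:
            time_in_zone[name] += 1.0 / FS

speeds = np.linalg.norm(np.diff(xy, axis=0), axis=1) * FS   # hand speed [m/s]
print(time_in_zone)
print("mean hand speed: %.3f m/s" % speeds.mean())
```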

  14. An integrated microfluidic analysis microsystem with bacterial capture enrichment and in-situ impedance detection

    NASA Astrophysics Data System (ADS)

    Liu, Hai-Tao; Wen, Zhi-Yu; Xu, Yi; Shang, Zheng-Guo; Peng, Jin-Lan; Tian, Peng

    2017-09-01

    In this paper, an integrated microfluidic analysis microsystem with bacterial capture enrichment and in-situ impedance detection is proposed, based on the microfluidic-chip dielectrophoresis technique and the electrochemical impedance detection principle. The microsystem includes a microfluidic chip, a main control module, a drive and control module, a signal detection and processing module, and a result display unit. The main control module produces the work sequence of the impedance detection system parts and provides data communication functions; the drive and control circuit generates an AC signal with adjustable amplitude and frequency, which is applied in the foodborne-pathogen impedance analysis microsystem to realize capture enrichment and impedance detection. The signal detection and processing circuit translates the current signal into the impedance of the bacteria and transfers it to a computer, where the final detection result is displayed. The experimental sample was prepared by adding an Escherichia coli standard sample to a chicken sample solution, and the samples were tested on the dielectrophoresis capture enrichment and in-situ impedance detection microsystem with micro-array electrode microfluidic chips. The experiments show that the Escherichia coli detection limit of the microsystem is 5 × 10⁴ CFU/mL and the detection time is within 6 min under the optimized operating conditions of 10 V detection voltage and 500 kHz detection frequency. The integrated microfluidic analysis microsystem lays a solid foundation for rapid, real-time, in-situ detection of bacteria.
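
    As a toy illustration of the impedance-detection step described above, the sketch below demodulates a simulated electrode current at the quoted excitation settings (10 V, 500 kHz) to obtain an impedance magnitude and phase. The sample rate, current amplitude, and phase lag are invented; the real system's signal chain is certainly different.

```python
# Lock-in style demodulation of a simulated electrode current to estimate |Z|.
import numpy as np

V_PEAK = 10.0            # excitation amplitude [V] (from the abstract)
F_EXC = 500e3            # excitation frequency [Hz] (from the abstract)
fs = 10e6                # assumed ADC sample rate [Hz]
t = np.arange(0, 200e-6, 1 / fs)

# Simulated electrode current with a small phase lag (cell + medium load).
i_meas = 2.0e-4 * np.sin(2 * np.pi * F_EXC * t - 0.3)

ref_i = np.sin(2 * np.pi * F_EXC * t)
ref_q = np.cos(2 * np.pi * F_EXC * t)
I_amp = 2 * np.mean(i_meas * ref_i)
Q_amp = 2 * np.mean(i_meas * ref_q)
current_amplitude = np.hypot(I_amp, Q_amp)

Z = V_PEAK / current_amplitude
phase = np.degrees(np.arctan2(Q_amp, I_amp))
print("|Z| = %.0f ohm, phase = %.1f deg" % (Z, phase))
```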

  15. CZAEM USER'S GUIDE: MODELING CAPTURE ZONES OF GROUND-WATER WELLS USING ANALYTIC ELEMENTS

    EPA Science Inventory

    The computer program CZAEM is designed for elementary capture zone analysis, and is based on the analytic element method. CZAEM is applicable to confined and/or unconfined flow in shallow aquifers; the Dupuit-Forchheimer assumption is adopted. CZAEM supports the following analyt...

  16. Research on Capturing of Customer Requirements Based on Innovation Theory

    NASA Astrophysics Data System (ADS)

    junwu, Ding; dongtao, Yang; zhenqiang, Bao

    To exactly and effectively capture customer requirements information, a new customer requirements capturing modeling method is proposed. Based on the analysis of the function requirement models of previous products and the application of the technology system evolution laws of the Theory of Inventive Problem Solving (TRIZ), customer requirements can be evolved from existing product designs by modifying the functional requirement unit and confirming the direction of evolutionary design. Finally, a case study is provided to illustrate the feasibility of the proposed approach.

  17. Lateral flow devices

    DOEpatents

    Mazumdar, Debapriya; Liu, Juewen; Lu, Yi

    2010-09-21

    An analytical test for an analyte comprises (a) a base, having a reaction area and a visualization area, (b) a capture species, on the base in the visualization area, comprising nucleic acid, and (c) analysis chemistry reagents, on the base in the reaction area. The analysis chemistry reagents comprise (i) a substrate comprising nucleic acid and a first label, and (ii) a reactor comprising nucleic acid. The analysis chemistry reagents can react with a sample comprising the analyte and water, to produce a visualization species comprising nucleic acid and the first label, and the capture species can bind the visualization species.

  18. Visualizing Ebolavirus Particles Using Single-Particle Interferometric Reflectance Imaging Sensor (SP-IRIS).

    PubMed

    Carter, Erik P; Seymour, Elif Ç; Scherr, Steven M; Daaboul, George G; Freedman, David S; Selim Ünlü, M; Connor, John H

    2017-01-01

    This chapter describes an approach for the label-free imaging and quantification of intact Ebola virus (EBOV) and EBOV viruslike particles (VLPs) using a light microscopy technique. In this technique, individual virus particles are captured onto a silicon chip that has been printed with spots of virus-specific capture antibodies. These captured virions are then detected using an optical approach called interference reflectance imaging. This approach allows for the detection of each virus particle that is captured on an antibody spot and can resolve the filamentous structure of EBOV VLPs without the need for electron microscopy. Capture of VLPs and virions can be done from a variety of sample types ranging from tissue culture medium to blood. The technique also allows automated quantitative analysis of the number of virions captured. This can be used to identify the virus concentration in an unknown sample. In addition, this technique offers the opportunity to easily image virions captured from native solutions without the need for additional labeling approaches while offering a means of assessing the range of particle sizes and morphologies in a quantitative manner.

  19. Making Definitions Explicit and Capturing Evaluation Policies.

    ERIC Educational Resources Information Center

    Houston, Samuel R.

    Judgment ANalysis (JAN) is described as a technique for identifying the rating policies that exist within a group of judges. Studies are presented in which JAN has been used in evaluating teacher effectiveness by capturing both student and faculty policies of teacher effectiveness at the University of Northern Colorado. In addition, research…

  20. Robust detection of chromosomal interactions from small numbers of cells using low-input Capture-C

    PubMed Central

    Oudelaar, A. Marieke; Davies, James O.J.; Downes, Damien J.; Higgs, Douglas R.

    2017-01-01

    Abstract Chromosome conformation capture (3C) techniques are crucial to understanding tissue-specific regulation of gene expression, but current methods generally require large numbers of cells. This hampers the investigation of chromatin architecture in rare cell populations. We present a new low-input Capture-C approach that can generate high-quality 3C interaction profiles from 10 000–20 000 cells, depending on the resolution used for analysis. We also present a PCR-free, sequencing-free 3C technique based on NanoString technology called C-String. By comparing C-String and Capture-C interaction profiles we show that the latter are not skewed by PCR amplification. Furthermore, we demonstrate that chromatin interactions detected by Capture-C do not depend on the degree of cross-linking by performing experiments with varying formaldehyde concentrations. PMID:29186505

  1. Improving nuclear data accuracy of 241Am and 237Np capture cross sections

    NASA Astrophysics Data System (ADS)

    Žerovnik, Gašper; Schillebeeckx, Peter; Cano-Ott, Daniel; Jandel, Marian; Hori, Jun-ichi; Kimura, Atsushi; Rossbach, Matthias; Letourneau, Alain; Noguere, Gilles; Leconte, Pierre; Sano, Tadafumi; Kellett, Mark A.; Iwamoto, Osamu; Ignatyuk, Anatoly V.; Cabellos, Oscar; Genreith, Christoph; Harada, Hideo

    2017-09-01

    In the framework of the OECD/NEA WPEC subgroup 41, ways to improve neutron induced capture cross sections for 241Am and 237Np are being sought. Decay data, energy dependent cross section data and neutron spectrum averaged data are important for that purpose and were investigated. New time-of-flight measurements were performed and analyzed, and considerable effort was put into development of methods for analysis of spectrum averaged data and re-analysis of existing experimental data.

  2. Proteomic analysis of laser capture microscopy purified myotendinous junction regions from muscle sections

    PubMed Central

    2014-01-01

    The myotendinous junction is a specialized structure of the muscle fibre enriched in mechanosensing complexes, including costameric proteins and core elements of the z-disc. Here, laser capture microdissection was applied to purify membrane regions from the myotendinous junctions of mouse skeletal muscles, which were then processed for proteomic analysis. Sarcolemma sections from the longitudinal axis of the muscle fibre were used as control for the specificity of the junctional preparation. Gene ontology term analysis of the combined lists indicated a statistically significant enrichment in membrane-associated proteins. The myotendinous junction preparation contained previously uncharacterized proteins, a number of z-disc costameric ligands (e.g., actinins, CapZ, αB-crystallin, filamin C, cypher, calsarcin, desmin, FHL1, telethonin, nebulin, titin and an enigma-like protein) and other proposed players of sarcomeric stretch sensing and signalling, such as myotilin and the three myomesin homologs. A subset was confirmed by immunofluorescence analysis as enriched at the myotendinous junction, suggesting that laser capture microdissection from muscle sections is a valid approach to identify novel myotendinous junction players potentially involved in mechanotransduction pathways. PMID:25071420

  3. Initial Adsorption of Fe on an Ethanol-Saturated Si(111)7 × 7 Surface: Statistical Analysis in Scanning Tunneling Microscopy

    NASA Astrophysics Data System (ADS)

    Yang, Haoyu; Hattori, Ken

    2018-03-01

    We studied the initial stage of iron deposition on an ethanol-saturated Si(111)7 × 7 surface at room temperature using scanning tunneling microscopy (STM). The statistical analysis of the Si adatom height at empty states for Si(111)-C2H5OH before and after the Fe deposition showed different types of adatoms: type B (before the deposition) and type B' (after the deposition) assigned to bare adatoms, type D and type D' to C2H5O-terminated adatoms, and type E' to adatoms with Fe. The analysis of the height distribution revealed that the molecular termination protects adatoms from Fe capture at the initial stage. The analysis also indicated the preferential capture of a single Fe atom by a bare center-adatom rather than a bare corner-adatom remaining after the C2H5OH saturation, but no selectivity was observed between faulted and unfaulted half unit-cells. This is the first STM-based report proving that a remaining bare adatom, but not a molecule-terminated adatom, captures a metal atom.

  4. Point-of-care mobile digital microscopy and deep learning for the detection of soil-transmitted helminths and Schistosoma haematobium.

    PubMed

    Holmström, Oscar; Linder, Nina; Ngasala, Billy; Mårtensson, Andreas; Linder, Ewert; Lundin, Mikael; Moilanen, Hannu; Suutala, Antti; Diwan, Vinod; Lundin, Johan

    2017-06-01

    Microscopy remains the gold standard in the diagnosis of neglected tropical diseases. As resource limited, rural areas often lack laboratory equipment and trained personnel, new diagnostic techniques are needed. Low-cost, point-of-care imaging devices show potential in the diagnosis of these diseases. Novel, digital image analysis algorithms can be utilized to automate sample analysis. Evaluation of the imaging performance of a miniature digital microscopy scanner for the diagnosis of soil-transmitted helminths and Schistosoma haematobium, and training of a deep learning-based image analysis algorithm for automated detection of soil-transmitted helminths in the captured images. A total of 13 iodine-stained stool samples containing Ascaris lumbricoides, Trichuris trichiura and hookworm eggs and 4 urine samples containing Schistosoma haematobium were digitized using a reference whole slide-scanner and the mobile microscopy scanner. Parasites in the images were identified by visual examination and by analysis with a deep learning-based image analysis algorithm in the stool samples. Results were compared between the digital and visual analysis of the images showing helminth eggs. Parasite identification by visual analysis of digital slides captured with the mobile microscope was feasible for all analyzed parasites. Although the spatial resolution of the reference slide-scanner is higher, the resolution of the mobile microscope is sufficient for reliable identification and classification of all parasites studied. Digital image analysis of stool sample images captured with the mobile microscope showed high sensitivity for detection of all helminths studied (range of sensitivity = 83.3-100%) in the test set (n = 217) of manually labeled helminth eggs. In this proof-of-concept study, the imaging performance of a mobile, digital microscope was sufficient for visual detection of soil-transmitted helminths and Schistosoma haematobium. Furthermore, we show that deep learning-based image analysis can be utilized for the automated detection and classification of helminths in the captured images.

  5. Point-of-care mobile digital microscopy and deep learning for the detection of soil-transmitted helminths and Schistosoma haematobium

    PubMed Central

    Holmström, Oscar; Linder, Nina; Ngasala, Billy; Mårtensson, Andreas; Linder, Ewert; Lundin, Mikael; Moilanen, Hannu; Suutala, Antti; Diwan, Vinod; Lundin, Johan

    2017-01-01

    ABSTRACT Background: Microscopy remains the gold standard in the diagnosis of neglected tropical diseases. As resource limited, rural areas often lack laboratory equipment and trained personnel, new diagnostic techniques are needed. Low-cost, point-of-care imaging devices show potential in the diagnosis of these diseases. Novel, digital image analysis algorithms can be utilized to automate sample analysis. Objective: Evaluation of the imaging performance of a miniature digital microscopy scanner for the diagnosis of soil-transmitted helminths and Schistosoma haematobium, and training of a deep learning-based image analysis algorithm for automated detection of soil-transmitted helminths in the captured images. Methods: A total of 13 iodine-stained stool samples containing Ascaris lumbricoides, Trichuris trichiura and hookworm eggs and 4 urine samples containing Schistosoma haematobium were digitized using a reference whole slide-scanner and the mobile microscopy scanner. Parasites in the images were identified by visual examination and by analysis with a deep learning-based image analysis algorithm in the stool samples. Results were compared between the digital and visual analysis of the images showing helminth eggs. Results: Parasite identification by visual analysis of digital slides captured with the mobile microscope was feasible for all analyzed parasites. Although the spatial resolution of the reference slide-scanner is higher, the resolution of the mobile microscope is sufficient for reliable identification and classification of all parasites studied. Digital image analysis of stool sample images captured with the mobile microscope showed high sensitivity for detection of all helminths studied (range of sensitivity = 83.3–100%) in the test set (n = 217) of manually labeled helminth eggs. Conclusions: In this proof-of-concept study, the imaging performance of a mobile, digital microscope was sufficient for visual detection of soil-transmitted helminths and Schistosoma haematobium. Furthermore, we show that deep learning-based image analysis can be utilized for the automated detection and classification of helminths in the captured images. PMID:28838305

  6. Free stream capturing in fluid conservation law for moving coordinates in three dimensions

    NASA Technical Reports Server (NTRS)

    Obayashi, Shigeru

    1991-01-01

    The free-stream capturing technique for both the finite-volume (FV) and finite-difference (FD) framework is summarized. For an arbitrary motion of the grid, the FV analysis shows that volumes swept by all six surfaces of the cell have to be computed correctly. This means that the free-stream capturing time-metric terms should be calculated not only from a surface vector of a cell at a single time level, but also from a volume swept by the cell surface in space and time. The FV free-stream capturing formulation is applicable to the FD formulation by proper translation from an FV cell to an FD mesh.

  7. Constant-parameter capture-recapture models

    USGS Publications Warehouse

    Brownie, C.; Hines, J.E.; Nichols, J.D.

    1986-01-01

    Jolly (1982, Biometrics 38, 301-321) presented modifications of the Jolly-Seber model for capture-recapture data, which assume constant survival and/or capture rates. Where appropriate, because of the reduced number of parameters, these models lead to more efficient estimators than the Jolly-Seber model. The tests to compare models given by Jolly do not make complete use of the data, and we present here the appropriate modifications, and also indicate how to carry out goodness-of-fit tests which utilize individual capture history information. We also describe analogous models for the case where young and adult animals are tagged. The availability of computer programs to perform the analysis is noted, and examples are given using output from these programs.
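
    A simplified, hedged sketch of a constant-parameter capture-recapture analysis in the spirit described above (it is not the programs referenced in the paper): capture histories are simulated with constant survival phi and capture probability p, and the two parameters are recovered by grid-search maximum likelihood conditional on first capture, in a Cormack-Jolly-Seber-style formulation.

```python
# Simulate capture histories with constant phi and p, then recover them by
# grid-search ML conditional on first capture (CJS-style likelihood).
import numpy as np

rng = np.random.default_rng(3)
T, N = 8, 400
PHI_TRUE, P_TRUE = 0.8, 0.5

def simulate():
    hist = np.zeros((N, T), dtype=int)
    for i in range(N):
        first = rng.integers(0, T - 1)          # occasion of first capture
        hist[i, first] = 1
        alive = True
        for t in range(first + 1, T):
            alive = alive and rng.random() < PHI_TRUE
            if alive and rng.random() < P_TRUE:
                hist[i, t] = 1
    return hist

def log_lik(hist, phi, p):
    # chi[t]: probability of never being seen again after occasion t
    chi = np.ones(T)
    for t in range(T - 2, -1, -1):
        chi[t] = (1 - phi) + phi * (1 - p) * chi[t + 1]
    ll = 0.0
    for h in hist:
        seen = np.flatnonzero(h)
        first, last = seen[0], seen[-1]
        for t in range(first + 1, last + 1):
            ll += np.log(phi) + (np.log(p) if h[t] else np.log(1 - p))
        ll += np.log(chi[last])
    return ll

hist = simulate()
grid = np.linspace(0.05, 0.95, 19)
best = max((log_lik(hist, phi, p), phi, p) for phi in grid for p in grid)
print("ML estimates: phi = %.2f, p = %.2f" % (best[1], best[2]))
```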

  8. Microsatellite analysis of medfly bioinfestations in California.

    PubMed

    Bonizzoni, M; Zheng, L; Guglielmino, C R; Haymer, D S; Gasperi, G; Gomulski, L M; Malacrida, A R

    2001-10-01

    The Mediterranean fruit fly, Ceratitis capitata, is a destructive agricultural pest with a long history of invasion success. This pest has been affecting different regions of the United States for the past 30 years, but a number of studies of medfly bioinfestations have focused on the situation in California. Although some progress has been made in terms of establishing the origin of infestations, the overall status of this pest in this area remains controversial. Specifically, do flies captured over the years represent independent infestations or the persistence of a resident population? We present an effort to answer this question based on the use of multilocus genotyping. Ten microsatellite loci were used to analyse 109 medflies captured in several infestations within California between 1992 and 1998. Using these same markers, 242 medflies from regions of the world having 'established' populations of this pest including Hawaii, Guatemala, El Salvador, Ecuador, Brazil, Argentina and Peru, were also analysed. Although phylogenetic analysis, AMOVA analysis, the IMMANC assignment test and GeneClass exclusion test analysis suggest that some of the medflies captured in California are derived from independent invasion events, analysis of specimens from the Los Angeles basin provides support for the hypothesis that an endemic population, probably derived from Guatemala, has been established.

  9. Feasibility of Using Low-Cost Motion Capture for Automated Screening of Shoulder Motion Limitation after Breast Cancer Surgery.

    PubMed

    Gritsenko, Valeriya; Dailey, Eric; Kyle, Nicholas; Taylor, Matt; Whittacre, Sean; Swisher, Anne K

    2015-01-01

    To determine if a low-cost, automated motion analysis system using Microsoft Kinect could accurately measure shoulder motion and detect motion impairments in women following breast cancer surgery. Descriptive study of motion measured via 2 methods. Academic cancer center oncology clinic. 20 women (mean age = 60 yrs) were assessed for active and passive shoulder motions during a routine post-operative clinic visit (mean = 18 days after surgery) following mastectomy (n = 4) or lumpectomy (n = 16) for breast cancer. Participants performed 3 repetitions of active and passive shoulder motions on the side of the breast surgery. Arm motion was recorded using motion capture by Kinect for Windows sensor and on video. Goniometric values were determined from video recordings, while motion capture data were transformed to joint angles using 2 methods (body angle and projection angle). Correlation of motion capture with goniometry and detection of motion limitation. Active shoulder motion measured with low-cost motion capture agreed well with goniometry (r = 0.70-0.80), while passive shoulder motion measurements did not correlate well. Using motion capture, it was possible to reliably identify participants whose range of shoulder motion was reduced by 40% or more. Low-cost, automated motion analysis may be acceptable to screen for moderate to severe motion impairments in active shoulder motion. Automatic detection of motion limitation may allow quick screening to be performed in an oncologist's office and trigger timely referrals for rehabilitation.
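
    A hedged sketch of turning Kinect-style 3D joint positions into a shoulder elevation angle. The "body angle" here is taken simply as the angle between the upper-arm vector and the shoulder-to-hip vector; the exact angle definitions and processing used in the study may differ, and the joint coordinates are invented.

```python
# Shoulder elevation from three 3D joint positions (toy coordinates).
import numpy as np

def angle_deg(u, v):
    cosang = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))
    return np.degrees(np.arccos(np.clip(cosang, -1.0, 1.0)))

# Example joint positions in metres (invented): shoulder, elbow, hip.
shoulder = np.array([0.20, 1.40, 2.50])
elbow    = np.array([0.25, 1.10, 2.45])
hip      = np.array([0.18, 0.95, 2.52])

upper_arm = elbow - shoulder
trunk_down = hip - shoulder
elevation = angle_deg(upper_arm, trunk_down)
print("shoulder elevation = %.1f deg" % elevation)
```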

  10. Phylogenomics of Phrynosomatid Lizards: Conflicting Signals from Sequence Capture versus Restriction Site Associated DNA Sequencing

    PubMed Central

    Leaché, Adam D.; Chavez, Andreas S.; Jones, Leonard N.; Grummer, Jared A.; Gottscho, Andrew D.; Linkem, Charles W.

    2015-01-01

    Sequence capture and restriction site associated DNA sequencing (RADseq) are popular methods for obtaining large numbers of loci for phylogenetic analysis. These methods are typically used to collect data at different evolutionary timescales; sequence capture is primarily used for obtaining conserved loci, whereas RADseq is designed for discovering single nucleotide polymorphisms (SNPs) suitable for population genetic or phylogeographic analyses. Phylogenetic questions that span both “recent” and “deep” timescales could benefit from either type of data, but studies that directly compare the two approaches are lacking. We compared phylogenies estimated from sequence capture and double digest RADseq (ddRADseq) data for North American phrynosomatid lizards, a species-rich and diverse group containing nine genera that began diversifying approximately 55 Ma. Sequence capture resulted in 584 loci that provided a consistent and strong phylogeny using concatenation and species tree inference. However, the phylogeny estimated from the ddRADseq data was sensitive to the bioinformatics steps used for determining homology, detecting paralogs, and filtering missing data. The topological conflicts among the SNP trees were not restricted to any particular timescale, but instead were associated with short internal branches. Species tree analysis of the largest SNP assembly, which also included the most missing data, supported a topology that matched the sequence capture tree. This preferred phylogeny provides strong support for the paraphyly of the earless lizard genera Holbrookia and Cophosaurus, suggesting that the earless morphology either evolved twice or evolved once and was subsequently lost in Callisaurus. PMID:25663487

  11. Analysis of Phase-Type Stochastic Petri Nets With Discrete and Continuous Timing

    NASA Technical Reports Server (NTRS)

    Jones, Robert L.; Goode, Plesent W. (Technical Monitor)

    2000-01-01

    The Petri net formalism is useful in studying many discrete-state, discrete-event systems exhibiting concurrency, synchronization, and other complex behavior. As a bipartite graph, the net can conveniently capture salient aspects of the system. As a mathematical tool, the net can specify an analyzable state space. Indeed, one can reason about certain qualitative properties (from state occupancies) and how they arise (the sequence of events leading there). By introducing deterministic or random delays, the model is forced to sojourn in states some amount of time, giving rise to an underlying stochastic process, one that can be specified in a compact way and capable of providing quantitative, probabilistic measures. We formalize a new non-Markovian extension to the Petri net that captures both discrete and continuous timing in the same model. The approach affords efficient, stationary analysis in most cases and efficient transient analysis under certain restrictions. Moreover, this new formalism has the added benefit in modeling fidelity stemming from the simultaneous capture of discrete- and continuous-time events (as opposed to capturing only one and approximating the other). We show how the underlying stochastic process, which is non-Markovian, can be resolved into simpler Markovian problems that enjoy efficient solutions. Solution algorithms are provided that can be easily programmed.

  12. Application of the Statistical ICA Technique in the DANCE Data Analysis

    NASA Astrophysics Data System (ADS)

    Baramsai, Bayarbadrakh; Jandel, M.; Bredeweg, T. A.; Rusev, G.; Walker, C. L.; Couture, A.; Mosby, S.; Ullmann, J. L.; Dance Collaboration

    2015-10-01

    The Detector for Advanced Neutron Capture Experiments (DANCE) at the Los Alamos Neutron Science Center is used to improve our understanding of the neutron capture reaction. DANCE is a highly efficient 4π γ-ray detector array consisting of 160 BaF2 crystals, which makes it an ideal tool for neutron capture experiments. The (n, γ) reaction Q-value equals the sum energy of all γ-rays emitted in the de-excitation cascades from the excited capture state to the ground state. The total γ-ray energy is used to identify reactions on different isotopes as well as the background. However, it is challenging to identify the contributions to the Esum spectra from different isotopes with similar Q-values. Recently we have tested the applicability of modern statistical methods such as Independent Component Analysis (ICA) to identify and separate the (n, γ) reaction yields on the different isotopes that are present in the target material. ICA is a recently developed computational tool for separating multidimensional data into statistically independent additive subcomponents. In this conference talk, we present some results of the application of ICA algorithms and their modification to the DANCE experimental data analysis. This research is supported by the U.S. Department of Energy, Office of Science, Nuclear Physics under the Early Career Award No. LANL20135009.
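
    A conceptual sketch of the ICA step only, using scikit-learn's FastICA to unmix two synthetic, overlapping "Esum-like" spectra from three mixed observations. The spectra, mixing matrix, and noise are invented; the actual DANCE analysis is far more involved.

```python
# FastICA on toy mixed spectra: recover two independent source components.
import numpy as np
from sklearn.decomposition import FastICA

bins = np.linspace(0, 10, 500)                       # pseudo energy axis [MeV]
def peak(center, width):
    return np.exp(-0.5 * ((bins - center) / width) ** 2)

# Two independent source "spectra" with different Q-value peaks (toy shapes).
s1 = peak(6.2, 0.15) + 0.3 * peak(3.0, 0.4)
s2 = peak(6.5, 0.15) + 0.5 * peak(2.0, 0.4)
S = np.c_[s1, s2]

A = np.array([[0.7, 0.3], [0.4, 0.6], [0.2, 0.8]])   # assumed mixing matrix
X = S @ A.T + 0.01 * np.random.default_rng(4).normal(size=(500, 3))

ica = FastICA(n_components=2, random_state=0)
S_est = ica.fit_transform(X)                         # estimated source spectra
print("recovered component shapes:", S_est.shape)
```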

  13. Automated Video Based Facial Expression Analysis of Neuropsychiatric Disorders

    PubMed Central

    Wang, Peng; Barrett, Frederick; Martin, Elizabeth; Milanova, Marina; Gur, Raquel E.; Gur, Ruben C.; Kohler, Christian; Verma, Ragini

    2008-01-01

    Deficits in emotional expression are prominent in several neuropsychiatric disorders, including schizophrenia. Available clinical facial expression evaluations provide subjective and qualitative measurements, which are based on static 2D images that do not capture the temporal dynamics and subtleties of expression changes. Therefore, there is a need for automated, objective and quantitative measurements of facial expressions captured using videos. This paper presents a computational framework that creates probabilistic expression profiles for video data and can potentially help to automatically quantify emotional expression differences between patients with neuropsychiatric disorders and healthy controls. Our method automatically detects and tracks facial landmarks in videos, and then extracts geometric features to characterize facial expression changes. To analyze temporal facial expression changes, we employ probabilistic classifiers that analyze facial expressions in individual frames, and then propagate the probabilities throughout the video to capture the temporal characteristics of facial expressions. The applications of our method to healthy controls and case studies of patients with schizophrenia and Asperger’s syndrome demonstrate the capability of the video-based expression analysis method in capturing subtleties of facial expression. Such results can pave the way for a video based method for quantitative analysis of facial expressions in clinical research of disorders that cause affective deficits. PMID:18045693

  14. Cretaceous origin of the unique prey-capture apparatus in mega-diverse genus: stem lineage of Steninae rove beetles discovered in Burmese amber

    PubMed Central

    Żyła, Dagmara; Yamamoto, Shûhei; Wolf-Schwenninger, Karin; Solodovnikov, Alexey

    2017-01-01

    Stenus is the largest genus of rove beetles and the second largest among animals. Its evolutionary success was associated with the adhesive labial prey-capture apparatus, a unique apomorphy of that genus. Definite Stenus with prey-capture apparatus are known from the Cenozoic fossils, while the age and early evolution of Steninae was hardly ever hypothesized. Our study of several Cretaceous Burmese amber inclusions revealed a stem lineage of Steninae that possibly possesses the Stenus-like prey-capture apparatus. Phylogenetic analysis of extinct and extant taxa of Steninae and putatively allied subfamilies of Staphylinidae with parsimony and Bayesian approaches resolved the Burmese amber lineage as a member of Steninae. It justified the description of a new extinct stenine genus Festenus with two new species, F. robustus and F. gracilis. The Late Cretaceous age of Festenus suggests an early origin of prey-capture apparatus in Steninae that, perhaps, drove the evolution towards the crown Stenus. Our analysis confirmed the well-established sister relationships between Steninae and Euaesthetinae and resolved Scydmaeninae as their next closest relative, the latter having no stable position in recent phylogenetic studies of rove beetles. Close affiliation of Megalopsidiinae, a subfamily often considered as a sister group to Euaesthetinae + Steninae clade, is rejected. PMID:28397786

  15. Visually driven chaining of elementary swim patterns into a goal-directed motor sequence: a virtual reality study of zebrafish prey capture.

    PubMed

    Trivedi, Chintan A; Bollmann, Johann H

    2013-01-01

    Prey capture behavior critically depends on rapid processing of sensory input in order to track, approach, and catch the target. When using vision, the nervous system faces the problem of extracting relevant information from a continuous stream of input in order to detect and categorize visible objects as potential prey and to select appropriate motor patterns for approach. For prey capture, many vertebrates exhibit intermittent locomotion, in which discrete motor patterns are chained into a sequence, interrupted by short periods of rest. Here, using high-speed recordings of full-length prey capture sequences performed by freely swimming zebrafish larvae in the presence of a single paramecium, we provide a detailed kinematic analysis of first and subsequent swim bouts during prey capture. Using Fourier analysis, we show that individual swim bouts represent an elementary motor pattern. Changes in orientation are directed toward the target on a graded scale and are implemented by an asymmetric tail bend component superimposed on this basic motor pattern. To further investigate the role of visual feedback on the efficiency and speed of this complex behavior, we developed a closed-loop virtual reality setup in which minimally restrained larvae recapitulated interconnected swim patterns closely resembling those observed during prey capture in freely moving fish. Systematic variation of stimulus properties showed that prey capture is initiated within a narrow range of stimulus size and velocity. Furthermore, variations in the delay and location of swim triggered visual feedback showed that the reaction time of secondary and later swims is shorter for stimuli that appear within a narrow spatio-temporal window following a swim. This suggests that the larva may generate an expectation of stimulus position, which enables accelerated motor sequencing if the expectation is met by appropriate visual feedback.

  16. Evaluation by latent class analysis of a magnetic capture based DNA extraction followed by real-time qPCR as a new diagnostic method for detection of Echinococcus multilocularis in definitive hosts.

    PubMed

    Maas, Miriam; van Roon, Annika; Dam-Deisz, Cecile; Opsteegh, Marieke; Massolo, Alessandro; Deksne, Gunita; Teunis, Peter; van der Giessen, Joke

    2016-10-30

    A new method, based on a magnetic capture based DNA extraction followed by qPCR, was developed for the detection of the zoonotic parasite Echinococcus multilocularis in definitive hosts. Latent class analysis was used to compare this new method with the currently used phenol-chloroform DNA extraction followed by single tube nested PCR. In total, 60 red foxes and coyotes from three different locations were tested with both molecular methods and the sedimentation and counting technique (SCT) or intestinal scraping technique (IST). Though based on a limited number of samples, it could be established that the magnetic capture based DNA extraction followed by qPCR showed similar sensitivity and specificity as the currently used phenol-chloroform DNA extraction followed by single tube nested PCR. All methods have a high specificity as shown by Bayesian latent class analysis. Both molecular assays have higher sensitivities than the combined SCT and IST, though the uncertainties in sensitivity estimates were wide for all assays tested. The magnetic capture based DNA extraction followed by qPCR has the advantage of not requiring hazardous chemicals like the phenol-chloroform DNA extraction followed by single tube nested PCR. This supports the replacement of the phenol-chloroform DNA extraction followed by single tube nested PCR by the magnetic capture based DNA extraction followed by qPCR for molecular detection of E. multilocularis in definitive hosts. Copyright © 2016 Elsevier B.V. All rights reserved.

  17. Low-cost human motion capture system for postural analysis onboard ships

    NASA Astrophysics Data System (ADS)

    Nocerino, Erica; Ackermann, Sebastiano; Del Pizzo, Silvio; Menna, Fabio; Troisi, Salvatore

    2011-07-01

    The study of human equilibrium, also known as postural stability, concerns different research sectors (medicine, kinesiology, biomechanics, robotics, sport) and is usually performed employing motion analysis techniques for recording human movements and posture. A wide range of techniques and methodologies has been developed, but the choice of instrumentation and sensors depends on the requirements of the specific application. Postural stability is a topic of great interest for the maritime community, since ship motions can make the maintenance of the upright stance demanding and difficult, with hazardous consequences for the safety of people onboard. The need to capture the motion of an individual standing on a ship during its daily service does not permit the use of the optical systems commonly employed for human motion analysis. These sensors are not designed to operate in disadvantageous environmental conditions (water, wetness, saltiness) or under suboptimal lighting. The solution proposed in this study consists of a motion acquisition system that can be easily used onboard ships. It makes use of two different methodologies: (I) motion capture with videogrammetry and (II) motion measurement with an Inertial Measurement Unit (IMU). The developed image-based motion capture system, made up of three low-cost, light and compact video cameras, was validated against a commercial optical system and then used for testing the reliability of the inertial sensors. In this paper, the whole process of planning, designing, calibrating, and assessing the accuracy of the motion capture system is reported and discussed. Results from the laboratory tests and preliminary campaigns in the field are presented.

  18. Initial Risk Analysis and Decision Making Framework

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Engel, David W.

    2012-02-01

    Commercialization of new carbon capture simulation initiative (CCSI) technology will include two key elements of risk management, namely, technical risk (will process and plant performance be effective, safe, and reliable) and enterprise risk (can project losses and costs be controlled within the constraints of market demand to maintain profitability and investor confidence). Both of these elements of risk are incorporated into the risk analysis subtask of Task 7. Thus far, this subtask has developed a prototype demonstration tool that quantifies risk based on the expected profitability of expenditures when retrofitting carbon capture technology on a stylized 650 MW pulverized coal electric power generator. The prototype is based on the selection of specific technical and financial factors believed to be important determinants of the expected profitability of carbon capture, subject to uncertainty. The uncertainty surrounding the technical performance and financial variables selected thus far is propagated in a model that calculates the expected profitability of investments in carbon capture and measures risk in terms of variability in expected net returns from these investments. Given the preliminary nature of the results of this prototype, additional work is required to expand the scope of the model to include additional risk factors, additional information on extant and proposed risk factors, the results of a qualitative risk factor elicitation process, and feedback from utilities and other interested parties involved in the carbon capture project. Additional information on proposed distributions of these risk factors will be integrated into a commercial implementation framework for the purpose of a comparative technology investment analysis.
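
    The core of the prototype described above is straightforward to illustrate: uncertain technical and financial inputs are sampled, propagated through a profitability calculation, and summarized as a distribution of net returns. The sketch below is a generic Monte Carlo example of that pattern; the distributions, parameter values, and the simple annualized-return formula are illustrative assumptions, not the CCSI tool's actual model.

    ```python
    # Minimal sketch of Monte Carlo propagation of technical/financial
    # uncertainty into expected net returns for a capture retrofit.
    # All distributions and parameter values are illustrative assumptions.
    import numpy as np

    rng = np.random.default_rng(seed=1)
    n = 100_000

    # Uncertain inputs (hypothetical distributions)
    capture_efficiency = rng.normal(0.90, 0.02, n)                 # fraction of CO2 removed
    energy_penalty     = rng.triangular(0.20, 0.25, 0.32, n)       # fraction of net output lost
    capital_cost       = rng.lognormal(mean=np.log(900e6), sigma=0.15, size=n)  # USD
    co2_price          = rng.uniform(30, 80, n)                    # USD per tonne avoided
    annual_co2_avoided = 3.0e6 * capture_efficiency * (1 - energy_penalty)      # tonnes/yr

    # Simple annualized net return: value of avoided CO2 minus annualized capital
    crf = 0.10                                   # capital recovery factor (assumed)
    net_return = co2_price * annual_co2_avoided - crf * capital_cost

    print(f"expected net return: {net_return.mean()/1e6:8.1f} M$/yr")
    print(f"5th-95th percentile: {np.percentile(net_return, 5)/1e6:8.1f}"
          f" to {np.percentile(net_return, 95)/1e6:8.1f} M$/yr")
    print(f"P(loss) = {np.mean(net_return < 0):.2%}")
    ```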

  19. Properties of Earth's temporarily-captured flybys

    NASA Astrophysics Data System (ADS)

    Fedorets, Grigori; Granvik, Mikael

    2014-11-01

    In addition to the Moon, a population of small temporarily-captured NEOs is predicted to orbit the Earth. The definition of a natural Earth satellite is that it is on an elliptic geocentric orbit within 0.03 au from the Earth. The population is further divided into temporarily-captured orbiters (TCOs, or minimoons, making at least one full revolution around the Earth in a coordinate system co-rotating with the Sun) and temporarily-captured flybys (TCFs), which fail to make a full revolution but are temporarily on an elliptic orbit around the Earth. Only one minimoon has been discovered to date, but it is expected that next-generation surveys will be able to detect these objects regularly. Granvik et al. (2012) performed an extensive analysis of the behaviour of these temporarily-captured objects. One of the main results was that at any given moment there is at least one 1-meter-diameter minimoon in orbit around the Earth. However, the results of Granvik et al. (2012) raised questions concerning the NES population, such as the bimodality of the capture duration distribution and a distinctive lack of test particles within Earth's Hill sphere, which requires investigating the statistical properties of the TCF population as well. In this work we confirm the population characteristics for minimoons described by Granvik et al. (2012), and extend the analysis to TCFs. For the calculations we use a Bulirsch-Stoer integrator implemented in the OpenOrb software package (Granvik et al. 2009). We study, e.g., the capture statistics, residence-time distributions, and steady-state properties of TCFs. Our preliminary results indicate that TCFs may be suitable targets for asteroid-redirect missions. More detailed knowledge of the TCF population will also improve our understanding of the link between temporarily-captured objects and NEOs in general. References: Granvik et al. (2009) MPS 44(12), 1853-1861; Granvik et al. (2012) Icarus 218, 262-277.
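
    The TCO/TCF distinction above reduces to counting revolutions around the Earth in the Sun co-rotating frame during the capture phase. The sketch below illustrates that classification on a synthetic geocentric track; the actual study integrates test asteroids with a Bulirsch-Stoer integrator (OpenOrb), which is not reproduced here.

    ```python
    # Sketch: classify a geocentric trajectory as a temporarily-captured orbiter
    # (TCO, >= 1 full revolution in the Sun co-rotating frame) or a
    # temporarily-captured flyby (TCF). The trajectory below is synthetic; a real
    # study would integrate test asteroids through the Earth-Moon system.
    import numpy as np

    def classify_capture(xy_corot):
        """xy_corot: (N, 2) geocentric positions sampled along the capture phase."""
        ang = np.unwrap(np.arctan2(xy_corot[:, 1], xy_corot[:, 0]))
        revolutions = abs(ang[-1] - ang[0]) / (2 * np.pi)
        return ("TCO" if revolutions >= 1.0 else "TCF"), revolutions

    # Synthetic example: three-quarters of a slowly shrinking loop -> TCF
    t = np.linspace(0, 0.75 * 2 * np.pi, 500)
    r = 0.01 * (1 - 0.2 * t / t[-1])              # au, well inside 0.03 au
    traj = np.column_stack([r * np.cos(t), r * np.sin(t)])
    print(classify_capture(traj))                  # ('TCF', ~0.75)
    ```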

  20. Sample Acquisition Drilling System for the Resource Prospector Mission

    NASA Astrophysics Data System (ADS)

    Zacny, K.; Paulsen, G.; Quinn, J.; Smith, J.; Kleinhenz, J.

    2015-12-01

    The goal of the Lunar Resource Prospector Mission (RPM) is to capture and identify volatile species within the top meter of the lunar regolith. The RPM drill has been designed to 1. Generate cuttings and place them on the surface for analysis by the Near InfraRed Volatiles Spectrometer Subsystem (NIRVSS), and 2. Capture cuttings and transfer them to the Oxygen and Volatile Extraction Node (OVEN) coupled with the Lunar Advanced Volatiles Analysis (LAVA) subsystem. The RPM drill is based on the Mars Icebreaker drill developed for capturing samples of ice and ice-cemented ground on Mars. The drill weighs approximately 10 kg and is rated at ~300 W. It is a rotary-percussive, fully autonomous system designed to capture cuttings for analysis. The drill consists of: 1. Rotary-Percussive Drill Head, 2. Sampling Auger, 3. Brushing station, 4. Z-stage, 5. Deployment stage. To reduce sample handling complexity, the drill auger is designed to capture cuttings as opposed to cores. High sampling efficiency is possible through a dual design of the auger. The lower section has deep, low-pitch flutes for retaining cuttings. The upper section has been designed to efficiently move the cuttings out of the hole. The drill uses a "bite" sampling approach where samples are captured in ~10 cm intervals. The first-generation drill was tested in a Mars chamber as well as in Antarctica and the Arctic. It demonstrated drilling at the 1-1-100-100 level (1 meter in 1 hour with 100 W and 100 N Weight on Bit) in ice, ice-cemented ground, soil, and rocks. The second-generation drill was deployed on a Carnegie Mellon University rover, called Zoe, and tested in the Atacama Desert in 2012. The tests demonstrated fully autonomous sample acquisition and delivery to a carousel. The third-generation drill was tested in NASA GRC's vacuum chamber, VF13, at 10⁻⁵ torr and approximately 200 K. It demonstrated successful capture and transfer of icy samples to a crucible. The drill has been modified and integrated onto the NASA JSC RPM rover. It has been undergoing testing in the lab and in the field during the summer of 2015.

  1. Applications of Photogrammetry for Analysis of Forest Plantations. Preliminary study: Analysis of individual trees

    NASA Astrophysics Data System (ADS)

    Mora, R.; Barahona, A.; Aguilar, H.

    2015-04-01

    This paper presents a method for using high-detail volumetric information, captured with a land-based photogrammetric survey, to obtain information from individual trees. By applying LIDAR analysis techniques, it is possible to measure diameter at breast height, height at first branch (commercial height), basal area, and volume of an individual tree. Given this information it is possible to calculate how much of that tree can be exploited as wood. The main objective is to develop a methodology for successfully surveying one individual tree, capturing every side of the stem using a high-resolution digital camera and reference marks with GPS coordinates. The process is executed for several individuals of two species present in the metropolitan area of San Jose, Costa Rica, Delonix regia (Bojer) Raf. and Tabebuia rosea (Bertol.) DC., each one with different height, stem shape and crown area. Using a photogrammetry suite, all the pictures are aligned and geo-referenced, and a dense point cloud is generated with enough detail to perform the required measurements, as well as a solid tridimensional model for volume measurement. This research opens the way to developing a capture methodology with an airborne camera on close-range UAVs. An airborne platform will make it possible to capture every individual in a forest plantation; furthermore, if the analysis techniques applied in this research are automated, it will be possible to calculate with high precision the exploitation potential of a forest plantation and improve its management.
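
    One of the measurements listed above, diameter at breast height (DBH), can be recovered from the dense point cloud by fitting a circle to the stem points in a thin slice at about 1.3 m above ground. The sketch below shows that idea on a synthetic stem; the algebraic (Kasa) circle fit, the slice thickness, and the synthetic data are illustrative choices, not necessarily those used by the authors.

    ```python
    # Sketch: estimate diameter at breast height (DBH) from a stem point cloud
    # by an algebraic least-squares circle fit on a thin slice at ~1.3 m height.
    import numpy as np

    def fit_circle(xy):
        """Kasa algebraic circle fit: returns (cx, cy, radius)."""
        x, y = xy[:, 0], xy[:, 1]
        A = np.column_stack([2 * x, 2 * y, np.ones_like(x)])
        b = x**2 + y**2
        (cx, cy, c), *_ = np.linalg.lstsq(A, b, rcond=None)
        r = np.sqrt(c + cx**2 + cy**2)
        return cx, cy, r

    def dbh_from_cloud(points, breast_height=1.3, slice_halfwidth=0.05):
        """points: (N, 3) array of x, y, z (z = height above ground, metres)."""
        z = points[:, 2]
        sl = points[np.abs(z - breast_height) < slice_halfwidth]
        _, _, r = fit_circle(sl[:, :2])
        return 2.0 * r

    # Synthetic stem: radius 0.18 m cylinder with noise (illustrative only)
    rng = np.random.default_rng(0)
    theta = rng.uniform(0, 2 * np.pi, 5000)
    z = rng.uniform(0, 3, 5000)
    r = 0.18 + rng.normal(0, 0.005, 5000)
    cloud = np.column_stack([r * np.cos(theta), r * np.sin(theta), z])
    print(f"estimated DBH: {dbh_from_cloud(cloud):.3f} m")   # ~0.36 m
    ```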

  2. Experience patterns: capturing the dynamic nature of a recreation experience

    Treesearch

    R.B., IV Hull; William P. Stewart; Young K. Yi

    1992-01-01

    A recreation experience is not static; it varies over the course of an engagement. Yet, most recreation research operationalizes recreation benefits and outcomes as essentially static in nature (i.e., satisfaction). "Experience patterns" capture the dynamic nature of a recreation experience and thus might prove useful as units of analysis in the management and...

  3. Nanomaterial-based Microfluidic Chips for the Capture and Detection of Circulating Tumor Cells.

    PubMed

    Sun, Duanping; Chen, Zuanguang; Wu, Minhao; Zhang, Yuanqing

    2017-01-01

    Circulating tumor cells (CTCs), a type of cancer cell that spreads from primary or metastatic tumors into the bloodstream, can lead to new, fatal metastases. As a new type of liquid biopsy, CTCs have become a hot pursuit, and detection of CTCs offers the possibility of early diagnosis of cancers, earlier evaluation of chemotherapeutic efficacy and cancer recurrence, and the choice of individually sensitive anti-cancer drugs. The fundamental challenges of capturing and characterizing CTCs are the extremely low number of CTCs in the blood and the intrinsic heterogeneity of CTCs. A series of microfluidic devices have been proposed for the analysis of CTCs with automation capability, precise flow behaviors, and significant advantages over conventional larger-scale systems. This review aims to provide in-depth insights into CTC analysis, including various nanomaterial-based microfluidic chips for the capture and detection of CTCs based on the specific biochemical and physical properties of CTCs. The current developmental trends and promising research directions in the establishment of microfluidic chips for the capture and detection of CTCs are also discussed.

  4. Automated Microfluidic Filtration and Immunocytochemistry Detection System for Capture and Enumeration of Circulating Tumor Cells and Other Rare Cell Populations in Blood.

    PubMed

    Pugia, Michael; Magbanua, Mark Jesus M; Park, John W

    2017-01-01

    Isolation by size using a filter membrane offers an antigen-independent method for capturing rare cells present in blood of cancer patients. Multiple cell types, including circulating tumor cells (CTCs), captured on the filter membrane can be simultaneously identified via immunocytochemistry (ICC) analysis of specific cellular biomarkers. Here, we describe an automated microfluidic filtration method combined with a liquid handling system for sequential ICC assays to detect and enumerate non-hematologic rare cells in blood.

  5. Validation of enhanced kinect sensor based motion capturing for gait assessment

    PubMed Central

    Müller, Björn; Ilg, Winfried; Giese, Martin A.

    2017-01-01

    Optical motion capturing systems are expensive and require substantial dedicated space to be set up. On the other hand, they provide unsurpassed accuracy and reliability. In many situations, however, flexibility is required and the motion capturing system can only be set up temporarily. The Microsoft Kinect v2 sensor is comparatively cheap, and promising results have been published with respect to gait analysis. We here present a motion capturing system that is easy to set up, flexible with respect to the sensor locations, and delivers high accuracy in gait parameters comparable to a gold-standard motion capturing system (VICON). Further, we demonstrate that sensor setups which track the person from one side only are less accurate and should be replaced by two-sided setups. With respect to commonly analyzed gait parameters, especially step width, our system shows higher agreement with the VICON system than previous reports. PMID:28410413

  6. A Dual-Responsive Self-Assembled Monolayer for Specific Capture and On-Demand Release of Live Cells.

    PubMed

    Gao, Xia; Li, Qiang; Wang, Fengchao; Liu, Xuehui; Liu, Dingbin

    2018-06-22

    We report a dual-responsive self-assembled monolayer (SAM) on a well-defined rough gold substrate for dynamic capture and release of live cells. By incorporating an adenosine 5'-triphosphate (ATP) aptamer into the SAM, we can accurately isolate specific cell types and subsequently release captured cells at either the population or desired-group (or even single-cell) level. On one hand, the whole SAM can be disassembled through addition of ATP solution, leading to the entire release of the captured cells from the supported substrate. On the other hand, desired cells can be selectively released by using near-infrared (NIR) light irradiation, with relatively high spatial and temporal precision. The proposed dual-responsive cell capture-and-release system is biologically friendly and is reusable after another round of modification, showing great usefulness in cancer diagnosis and molecular analysis.

  7. Ultrasonic Bolt Gage

    NASA Technical Reports Server (NTRS)

    Gleman, Stuart M. (Inventor); Rowe, Geoffrey K. (Inventor)

    1999-01-01

    An ultrasonic bolt gage is described which uses a cross-correlation algorithm to determine the tension applied to a fastener, such as a bolt. The cross-correlation analysis is preferably performed using a processor operating on a series of captured ultrasonic echo waveforms. The ultrasonic bolt gage is further described as using the captured ultrasonic echo waveforms to perform additional modes of analysis, such as feature recognition. Multiple tension data outputs, therefore, can be obtained from a single data acquisition for increased measurement reliability. In addition, one embodiment of the gage has been described as multi-channel, having a multiplexer for performing a tension analysis on one of a plurality of bolts.
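
    The cross-correlation idea is simple to illustrate: the shift that maximizes the correlation between a reference echo and the current echo gives the change in time of flight, which is then mapped to tension through a calibration. The sketch below uses synthetic waveforms and an assumed linear calibration constant; it is not the patented algorithm.

    ```python
    # Sketch: estimate the time shift between two ultrasonic echo waveforms by
    # cross-correlation, then map the shift to bolt tension with an assumed
    # (illustrative) linear calibration factor.
    import numpy as np

    fs = 100e6                      # sample rate, Hz (assumed)
    t = np.arange(0, 20e-6, 1 / fs)

    def echo(delay_s):
        """Toy echo: a decaying 5 MHz tone burst arriving at `delay_s`."""
        tt = t - delay_s
        return np.where(tt > 0, np.exp(-tt / 2e-6) * np.sin(2 * np.pi * 5e6 * tt), 0.0)

    ref = echo(10.000e-6)           # zero-load reference echo
    meas = echo(10.040e-6)          # echo from the loaded bolt (later arrival)

    xcorr = np.correlate(meas, ref, mode="full")
    lag_samples = np.argmax(xcorr) - (len(ref) - 1)
    delta_t = lag_samples / fs
    print(f"measured delay shift: {delta_t*1e9:.1f} ns")

    # Illustrative linear calibration: tension proportional to delay change.
    K_CAL = 1.2e12                  # N per second of delay shift (assumed)
    print(f"estimated tension: {K_CAL * delta_t / 1e3:.1f} kN")
    ```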

  8. High-Density Dielectrophoretic Microwell Array for Detection, Capture, and Single-Cell Analysis of Rare Tumor Cells in Peripheral Blood.

    PubMed

    Morimoto, Atsushi; Mogami, Toshifumi; Watanabe, Masaru; Iijima, Kazuki; Akiyama, Yasuyuki; Katayama, Koji; Futami, Toru; Yamamoto, Nobuyuki; Sawada, Takeshi; Koizumi, Fumiaki; Koh, Yasuhiro

    2015-01-01

    Development of a reliable platform and workflow to detect and capture a small number of mutation-bearing circulating tumor cells (CTCs) from a blood sample is necessary for the development of noninvasive cancer diagnosis. In this preclinical study, we aimed to develop a capture system for molecular characterization of single CTCs based on high-density dielectrophoretic microwell array technology. Spike-in experiments using lung cancer cell lines were conducted. The microwell array was used to capture spiked cancer cells, and captured single cells were subjected to whole genome amplification followed by sequencing. A high detection rate (70.2%-90.0%) and excellent linear performance (R² = 0.8189-0.9999) were noted between the observed and expected numbers of tumor cells. The detection rate was markedly higher than that obtained using the CellSearch system in a blinded manner, suggesting the superior sensitivity of our system in detecting EpCAM-negative tumor cells. Isolation of single captured tumor cells, followed by detection of EGFR mutations, was achieved using Sanger sequencing. Using a microwell array, we established an efficient and convenient platform for the capture and characterization of single CTCs. The results of a proof-of-principle preclinical study indicated that this platform has potential for the molecular characterization of captured CTCs from patients.
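
    The reported linearity between observed and expected cell counts is an ordinary least-squares check. The sketch below shows the computation on made-up spike-in counts; the values are illustrative, not the study's data.

    ```python
    # Sketch: assessing spike-in linearity by regressing observed vs. expected
    # cell counts and reporting R^2. Counts below are illustrative assumptions.
    import numpy as np

    expected = np.array([5, 10, 50, 100, 500], dtype=float)   # spiked cells
    observed = np.array([4, 9, 38, 85, 421], dtype=float)     # detected on the array

    slope, intercept = np.polyfit(expected, observed, 1)
    pred = slope * expected + intercept
    ss_res = np.sum((observed - pred) ** 2)
    ss_tot = np.sum((observed - observed.mean()) ** 2)
    r2 = 1 - ss_res / ss_tot
    print(f"recovery slope = {slope:.2f}, R^2 = {r2:.4f}")
    ```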

  9. A new electrocardiogram algorithm for diagnosing loss of ventricular capture during cardiac resynchronisation therapy.

    PubMed

    Ganière, Vincent; Domenichini, Giulia; Niculescu, Viviana; Cassagneau, Romain; Defaye, Pascal; Burri, Haran

    2013-03-01

    The prerequisite for cardiac resynchronization therapy (CRT) is ventricular capture, which may be verified by analysis of the surface electrocardiogram (ECG). Few algorithms exist to diagnose loss of ventricular capture. Electrocardiograms from 126 CRT patients were analysed during biventricular (BV), right ventricular (RV), and left ventricular (LV) pacing. An algorithm evaluating QRS narrowing in the limb leads and increasing negativity in lead I to diagnose changes in ventricular capture was devised, prospectively validated, and compared with two existing algorithms. Performance of the algorithm according to ventricular lead position was also assessed. Our algorithm had an accuracy of 88% to correctly identify the changes in ventricular capture (either loss or gain of RV or LV capture). The algorithm had a sensitivity of 94% and a specificity of 96% with an accuracy of 96% for identifying loss of LV capture (the most clinically relevant change), and compared favourably with the existing algorithms. Performance of the algorithms was not significantly affected by RV or LV lead position. A simple two-step algorithm evaluating QRS width in the limb leads and changes in negativity in lead I can accurately diagnose the lead responsible for intermittent loss of ventricular capture in CRT. This simple tool may be of particular use outside the setting of specialized device clinics.
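
    A rule of this shape is easy to express in code. The sketch below is a hypothetical two-feature classifier in the spirit of the described algorithm; the thresholds and the mapping from feature changes to loss of capture are assumptions for illustration, not the validated published algorithm.

    ```python
    # Sketch of a two-step rule for flagging a change in ventricular capture
    # during CRT from two ECG features (QRS width in the limb leads, lead I
    # negativity). Thresholds and interpretations are illustrative assumptions.
    from dataclasses import dataclass

    @dataclass
    class EcgFeatures:
        qrs_width_ms: float       # QRS duration measured in the limb leads
        lead_i_neg_mv: float      # depth of the negative deflection in lead I

    def classify_capture_change(baseline: EcgFeatures, current: EcgFeatures) -> str:
        """Return a coarse interpretation of a change in ventricular capture."""
        narrowing = baseline.qrs_width_ms - current.qrs_width_ms          # step 1
        more_negative_i = current.lead_i_neg_mv - baseline.lead_i_neg_mv  # step 2

        if abs(narrowing) < 10 and abs(more_negative_i) < 0.1:
            return "no change in capture detected"
        if narrowing < -10 and more_negative_i > 0.1:
            return "suspect loss of LV capture (QRS widened, lead I more negative)"
        if narrowing < -10:
            return "suspect loss of ventricular capture (QRS widened)"
        return "suspect gain of capture / resynchronization restored"

    baseline = EcgFeatures(qrs_width_ms=150, lead_i_neg_mv=0.2)
    current  = EcgFeatures(qrs_width_ms=180, lead_i_neg_mv=0.5)
    print(classify_capture_change(baseline, current))
    ```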

  10. Evaluation of trap capture in a geographically closed population of brown treesnakes on Guam

    USGS Publications Warehouse

    Tyrrell, C.L.; Christy, M.T.; Rodda, G.H.; Yackel Adams, A.A.; Ellingson, A.R.; Savidge, J.A.; Dean-Bradley, K.; Bischof, R.

    2009-01-01

    1. Open population mark-recapture analysis of unbounded populations accommodates some types of closure violations (e.g. emigration, immigration). In contrast, closed population analysis of such populations readily allows estimation of capture heterogeneity and behavioural response, but requires crucial assumptions about closure (e.g. no permanent emigration) that are suspect and rarely tested empirically. 2. In 2003, we erected a double-sided barrier to prevent movement of snakes in or out of a 5-ha semi-forested study site in northern Guam. This geographically closed population of >100 snakes was monitored using a series of transects for visual searches and a 13 × 13 trapping array, with the aim of marking all snakes within the site. Forty-five marked snakes were also supplemented into the resident population to quantify the efficacy of our sampling methods. We used the program mark to analyse trap captures (101 occasions), referenced to census data from visual surveys, and quantified heterogeneity, behavioural response, and size bias in trappability. Analytical inclusion of untrapped individuals greatly improved precision in the estimation of some covariate effects. 3. A novel discovery was that trap captures for individual snakes consisted of asynchronous bouts of high capture probability lasting about 7 days (ephemeral behavioural effect). There was modest behavioural response (trap happiness) and significant latent (unexplained) heterogeneity, with small influences on capture success of date, gender, residency status (translocated or not), and body condition. 4. Trapping was shown to be an effective tool for eradicating large brown treesnakes Boiga irregularis (>900 mm snout-vent length, SVL). 5. Synthesis and applications. Mark-recapture modelling is commonly used by ecological managers to estimate populations. However, existing models involve making assumptions about either closure violations or response to capture. Physical closure of our population on a landscape scale allowed us to determine the relative importance of covariates influencing capture probability (body size, trappability periods, and latent heterogeneity). This information was used to develop models in which different segments of the population could be assigned different probabilities of capture, and suggests that modelling of open populations should incorporate easily measured, but potentially overlooked, parameters such as body size or condition.

  11. Objective analysis of image quality of video image capture systems

    NASA Astrophysics Data System (ADS)

    Rowberg, Alan H.

    1990-07-01

    As Picture Archiving and Communication System (PACS) technology has matured, video image capture has become a common way of capturing digital images from many modalities. While digital interfaces, such as those which use the ACR/NEMA standard, will become more common in the future, and are preferred because of the accuracy of image transfer, video image capture will be the dominant method in the short term, and may continue to be used for some time because of the low cost and high speed often associated with such devices. Currently, virtually all installed systems use methods of digitizing the video signal that is produced for display on the scanner viewing console itself. A series of digital test images have been developed for display on either a GE CT9800 or a GE Signa MRI scanner. These images have been captured with each of five commercially available image capture systems, and the resultant images digitally transferred on floppy disk to a PC1286 computer containing Optimast image analysis software. Here the images can be displayed in a comparative manner for visual evaluation, in addition to being analyzed statistically. Each of the images has been designed to support certain tests, including noise, accuracy, linearity, gray scale range, stability, slew rate, and pixel alignment. These image capture systems vary widely in these characteristics, in addition to the presence or absence of other artifacts, such as shading and moire pattern. Other accessories such as video distribution amplifiers and noise filters can also add or modify artifacts seen in the captured images, often giving unusual results. Each image is described, together with the tests which were performed using it. One image contains alternating black and white lines, each one pixel wide, after equilibration strips ten pixels wide. While some systems have a slew rate fast enough to track this correctly, others blur it to an average shade of gray, and do not resolve the lines, or give horizontal or vertical streaking. While many of these results are significant from an engineering standpoint alone, there are clinical implications, and some anatomy or pathology may not be visualized if an image capture system is used improperly.
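
    One way to quantify the slew-rate behaviour described for the alternating-line test image is a simple modulation (Michelson contrast) measure over the captured pattern, as in the sketch below; the synthetic "captured" data and the three-pixel blur standing in for a slow slew rate are illustrative assumptions.

    ```python
    # Sketch: quantify how well a capture system preserves a 1-pixel black/white
    # line pattern by computing the Michelson modulation of the captured region.
    import numpy as np

    def modulation(region):
        """region: 2D array of captured pixel values over the test pattern."""
        profile = region.mean(axis=0)            # average along the lines
        i_max, i_min = profile.max(), profile.min()
        return (i_max - i_min) / (i_max + i_min + 1e-12)

    ideal = np.tile([0.0, 1.0], 128)             # perfect alternating lines
    blurred = np.convolve(ideal, np.ones(3) / 3, mode="same")  # slew-rate-limited capture
    print(f"ideal modulation:   {modulation(ideal[None, :]):.2f}")
    print(f"blurred modulation: {modulation(blurred[None, :]):.2f}")
    ```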

  12. Real-time marker-free motion capture system using blob feature analysis

    NASA Astrophysics Data System (ADS)

    Park, Chang-Joon; Kim, Sung-Eun; Kim, Hong-Seok; Lee, In-Ho

    2005-02-01

    This paper presents a real-time marker-free motion capture system which can reconstruct 3-dimensional human motions. The virtual character of the proposed system mimics the motion of an actor in real time. The proposed system captures human motions by using three synchronized CCD cameras and detects the root and end-effectors of an actor, such as the head, hands, and feet, by exploiting blob feature analysis. The 3-dimensional positions of the end-effectors are then reconstructed and tracked using a Kalman filter. Finally, the positions of the intermediate joints are reconstructed using an anatomically constrained inverse kinematics algorithm. The proposed system was implemented under general lighting conditions, and we confirmed that it could stably reconstruct, in real time, the motions of many people wearing various clothes.
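
    The tracking stage described above is a textbook application of a constant-velocity Kalman filter to the detected blob centroids. The sketch below shows that filter for a single 2D centroid; the frame rate and noise covariances are illustrative assumptions, not the paper's settings.

    ```python
    # Sketch: constant-velocity Kalman filter for tracking a blob centroid.
    # State = [x, y, vx, vy]; measurement = blob centroid [x, y] per frame.
    # Process/measurement noise levels are illustrative assumptions.
    import numpy as np

    dt = 1 / 30.0                                   # frame interval (30 fps assumed)
    F = np.array([[1, 0, dt, 0],
                  [0, 1, 0, dt],
                  [0, 0, 1, 0],
                  [0, 0, 0, 1]], dtype=float)       # state transition
    H = np.array([[1, 0, 0, 0],
                  [0, 1, 0, 0]], dtype=float)       # we observe position only
    Q = 1e-2 * np.eye(4)                            # process noise (assumed)
    R = 4.0 * np.eye(2)                             # measurement noise, px^2 (assumed)

    x = np.zeros(4)                                 # initial state
    P = 100.0 * np.eye(4)                           # initial uncertainty

    def kalman_step(x, P, z):
        # Predict
        x = F @ x
        P = F @ P @ F.T + Q
        # Update with the measured centroid z = [u, v]
        y = z - H @ x
        S = H @ P @ H.T + R
        K = P @ H.T @ np.linalg.inv(S)
        x = x + K @ y
        P = (np.eye(4) - K @ H) @ P
        return x, P

    # Feed in noisy centroids of a blob moving to the right
    rng = np.random.default_rng(0)
    for k in range(60):
        true_pos = np.array([100 + 90 * k * dt, 240.0])
        z = true_pos + rng.normal(0, 2, 2)
        x, P = kalman_step(x, P, z)
    print("filtered position:", x[:2], "velocity (px/s):", x[2:])
    ```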

  13. Radiative-emission analysis in charge-exchange collisions of O6+ with argon, water, and methane

    NASA Astrophysics Data System (ADS)

    Leung, Anthony C. K.; Kirchner, Tom

    2017-04-01

    Processes of electron capture followed by Auger and radiative decay were investigated in slow ion-atom and -molecule collisions. A quantum-mechanical analysis which utilizes the basis generator method within an independent electron model was carried out for collisions of O6+ with Ar, H2O, and CH4 at impact energies of 1.17 and 2.33 keV/amu. At these impact energies, a closure approximation in the spectral representation of the Hamiltonian for molecules was found to be necessary to yield reliable results. Total single-, double-, and triple-electron-capture cross sections obtained show good agreement with previous measurements and calculations using the classical trajectory Monte Carlo method. The corresponding emission spectra from single capture for each collision system are in satisfactory agreement with previous calculations.

  14. (n,γ) Experiments on tin isotopes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Baramsai, B.; Mitchell, G. E.; Walker, C. L.

    2013-04-19

    Neutron capture experiments on highly enriched 117,119Sn isotopes were performed with the DANCE detector array located at the Los Alamos Neutron Science Center. The DANCE detector provides detailed information about the multi-step γ-ray cascade following neutron capture. Analysis of the experimental data provides important information to improve understanding of the neutron capture reaction, including a test of the statistical model, the assignment of spins and parities of neutron resonances, and information concerning the Photon Strength Function (PSF) and Level Density (LD) below the neutron separation energy. Preliminary results for the (n,γ) reaction on 117,119Sn are presented. Resonance spins of the odd-A tin isotopes were almost completely unknown. Resonance spins and parities have been assigned via analysis of the multi-step γ-ray spectra and directional correlations.

  15. A closure test for time-specific capture-recapture data

    USGS Publications Warehouse

    Stanley, T.R.; Burnham, K.P.

    1999-01-01

    The assumption of demographic closure in the analysis of capture-recapture data under closed-population models is of fundamental importance. Yet, little progress has been made in the development of omnibus tests of the closure assumption. We present a closure test for time-specific data that, in principle, tests the null hypothesis of closed-population model M(t) against the open-population Jolly-Seber model as a specific alternative. The test statistic is chi-square distributed and can be decomposed into informative components that can be interpreted to determine the nature of closure violations. The test is most sensitive to permanent emigration and least sensitive to temporary emigration, and is of intermediate sensitivity to permanent or temporary immigration. This test is a versatile tool for testing the assumption of demographic closure in the analysis of capture-recapture data.

  16. Measurement of the radiative capture cross section of the s-process branching points 204Tl and 171Tm at the n_TOF facility (CERN)

    NASA Astrophysics Data System (ADS)

    Casanovas, A.; Domingo-Pardo, C.; Guerrero, C.; Lerendegui-Marco, J.; Calviño, F.; Tarifeño-Saldivia, A.; Dressler, R.; Heinitz, S.; Kivel, N.; Quesada, J. M.; Schumann, D.; Aberle, O.; Alcayne, V.; Andrzejewski, J.; Audouin, L.; Bécares, V.; Bacak, M.; Barbagallo, M.; Bečvář, F.; Bellia, G.; Berthoumieux, E.; Billowes, J.; Bosnar, D.; Brown, A.; Busso, M.; Caamaño, M.; Caballero-Ontanaya, L.; Calviani, M.; Cano-Ott, D.; Cerutti, F.; Chen, Y. H.; Chiaveri, E.; Colonna, N.; Cortés, G.; Cortés-Giraldo, M. A.; Cosentino, L.; Cristallo, S.; Damone, L. A.; Diakaki, M.; Dietz, M.; Dupont, E.; Durán, I.; Eleme, Z.; Fernández-Domínguez, B.; Ferrari, A.; Ferreira, P.; Finocchiaro, P.; Furman, V.; Göbel, K.; Gawlik, A.; Gilardoni, S.; Glodariu, T.; Gonçalves, I. F.; González-Romero, E.; Gunsing, F.; Heyse, J.; Jenkins, D. G.; Käppeler, F.; Kadi, Y.; Katabuchi, T.; Kimura, A.; Kokkoris, M.; Kopatch, Y.; Krtička, M.; Kurtulgil, D.; Ladarescu, I.; Lederer-Woods, C.; Meo, S. Lo; Lonsdale, S. J.; Macina, D.; Martínez, T.; Masi, A.; Massimi, C.; Mastinu, P.; Mastromarco, M.; Matteucci, F.; Maugeri, E. A.; Mazzone, A.; Mendoza, E.; Mengoni, A.; Michalopoulou, V.; Milazzo, P. M.; Mingrone, F.; Musumarra, A.; Negret, A.; Nolte, R.; Ogállar, F.; Oprea, A.; Patronis, N.; Pavlik, A.; Perkowski, J.; Persanti, L.; Porras, I.; Praena, J.; Radeck, D.; Ramos, D.; Rauscher, T.; Reifarth, R.; Rochman, D.; Sabaté-Gilarte, M.; Saxena, A.; Schillebeeckx, P.; Simone, S.; Smith, A. G.; Sosnin, N. V.; Stamatopoulos, A.; Tagliente, G.; Tain, J. L.; Talip, T.; Tassan-Got, L.; Tsinganis, A.; Ulrich, J.; Valenta, S.; Vannini, G.; Variale, V.; Vaz, P.; Ventura, A.; Vlachoudis, V.; Vlastou, R.; Wallner, A.; Woods, P. J.; Wright, T.; Žugec, P.; Köster, U.

    2018-05-01

    The neutron capture cross sections of some unstable nuclei are especially relevant for s-process nucleosynthesis studies. These cross sections are crucial for determining the local abundance pattern, which can yield valuable information on the s-process stellar environment. In this work we describe the neutron capture (n,γ) measurement on two of these nuclei of interest, 204Tl and 171Tm, from target production to the final measurement, performed successfully at the n_TOF facility at CERN in 2014 and 2015. Preliminary results on the ongoing experimental data analysis will also be shown. These results include the first ever experimental observation of capture resonances for these two nuclei.

  17. An Efficient Analysis Methodology for Fluted-Core Composite Structures

    NASA Technical Reports Server (NTRS)

    Oremont, Leonard; Schultz, Marc R.

    2012-01-01

    The primary loading condition in launch-vehicle barrel sections is axial compression, and it is therefore important to understand the compression behavior of any structures, structural concepts, and materials considered in launch-vehicle designs. This understanding will necessarily come from a combination of test and analysis. However, certain potentially beneficial structures and structural concepts do not lend themselves to commonly used simplified analysis methods, and therefore innovative analysis methodologies must be developed if these structures and structural concepts are to be considered. This paper discusses such an analysis technique for the fluted-core sandwich composite structural concept. The presented technique is based on commercially available finite-element codes, and uses shell elements to capture behavior that would normally require solid elements to capture the detailed mechanical response of the structure. The shell thicknesses and offsets using this analysis technique are parameterized, and the parameters are adjusted through a heuristic procedure until this model matches the mechanical behavior of a more detailed shell-and-solid model. Additionally, the detailed shell-and-solid model can be strategically placed in a larger, global shell-only model to capture important local behavior. Comparisons between shell-only models, experiments, and more detailed shell-and-solid models show excellent agreement. The discussed analysis methodology, though only discussed in the context of fluted-core composites, is widely applicable to other concepts.
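
    The parameter-adjustment idea can be illustrated with a toy version: choose an equivalent shell thickness and offset so that simple smeared membrane and bending stiffnesses match target values taken from a detailed model. The sketch below uses a generic optimizer in place of the authors' heuristic procedure, and the stiffness expressions and target values are illustrative assumptions, not the paper's finite-element models.

    ```python
    # Sketch of the parameter-matching idea: adjust equivalent shell thickness and
    # offset until a simple smeared-stiffness model reproduces target axial and
    # bending stiffnesses taken from a detailed model. Values are illustrative.
    import numpy as np
    from scipy.optimize import minimize

    E = 60e9                                   # face-sheet modulus, Pa (assumed)
    target = np.array([1.2e8, 9.0e4])          # [axial EA per width, bending EI per width]

    def smeared_stiffness(params):
        t, z_off = params                      # shell thickness and offset from midplane
        EA = E * t                             # membrane stiffness per unit width
        EI = E * (t**3 / 12.0 + t * z_off**2)  # bending stiffness per unit width
        return np.array([EA, EI])

    def mismatch(params):
        return np.sum(((smeared_stiffness(params) - target) / target) ** 2)

    fit = minimize(mismatch, x0=[2e-3, 20e-3], method="Nelder-Mead")
    t_eq, z_eq = fit.x
    print(f"equivalent thickness = {t_eq*1e3:.2f} mm, offset = {z_eq*1e3:.2f} mm")
    ```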

  18. Modeling and Analysis of Mixed Synchronous/Asynchronous Systems

    NASA Technical Reports Server (NTRS)

    Driscoll, Kevin R.; Madl, Gabor; Hall, Brendan

    2012-01-01

    Practical safety-critical distributed systems must integrate safety-critical and non-critical data in a common platform. Safety-critical systems almost always consist of isochronous components that have synchronous or asynchronous interfaces with other components. Many of these systems also support a mix of synchronous and asynchronous interfaces. This report presents a study on the modeling and analysis of asynchronous, synchronous, and mixed synchronous/asynchronous systems. We build on the SAE Architecture Analysis and Design Language (AADL) to capture architectures for analysis. We present preliminary work targeted at capturing mixed low- and high-criticality data, as well as real-time properties, in a common Model of Computation (MoC). An abstract, but representative, test specimen system was created as the system to be modeled.

  19. Tacit Knowledge Capture and the Brain-Drain at Electrical Utilities

    NASA Astrophysics Data System (ADS)

    Perjanik, Nicholas Steven

    As a consequence of an aging workforce, electric utilities are at risk of losing their most experienced and knowledgeable electrical engineers. In this research, the problem was a lack of understanding of what electric utilities were doing to capture the tacit knowledge or know-how of these engineers. The purpose of this qualitative research study was to explore the tacit knowledge capture strategies currently used in the industry by conducting a case study of 7 U.S. electrical utilities that have demonstrated an industry commitment to improving operational standards. The research question addressed the implemented strategies to capture the tacit knowledge of retiring electrical engineers and technical personnel. The research methodology involved a qualitative embedded case study. The theories used in this study included knowledge creation theory, resource-based theory, and organizational learning theory. Data were collected through one-time interviews with a senior electrical engineer or technician within each utility and a workforce planning or training professional within 2 of the 7 utilities. The analysis included the use of triangulation and content analysis strategies. Ten tacit knowledge capture strategies were identified: (a) formal and informal on-boarding mentorship and apprenticeship programs, (b) formal and informal off-boarding mentorship programs, (c) formal and informal training programs, (d) using lessons learned during training sessions, (e) communities of practice, (f) technology enabled tools, (g) storytelling, (h) exit interviews, (i) rehiring of retirees as consultants, and (j) knowledge risk assessments. This research contributes to social change by offering strategies to capture the know-how needed to ensure operational continuity in the delivery of safe, reliable, and sustainable power.

  20. Laser Capture Microdissection Revisited as a Tool for Transcriptomic Analysis: Application of an Excel-Based qPCR Preparation Software (PREXCEL-Q)

    USDA-ARS?s Scientific Manuscript database

    The ability to reliably analyze cellular and molecular profiles of normal or diseased tissues is frequently obfuscated by the inherent heterogeneous nature of tissues. Laser Capture Microdissection (LCM) is an innovative technique that allows the isolation and enrichment of pure subpopulations of c...

  1. Promoting Collaborative Practice and Reciprocity in Initial Teacher Education: Realising a "Dialogic Space" through Video Capture Analysis

    ERIC Educational Resources Information Center

    Youens, Bernadette; Smethem, Lindsey; Sullivan, Stefanie

    2014-01-01

    This paper explores the potential of video capture to generate a collaborative space for teacher preparation; a space in which traditional hierarchies and boundaries between actors (student teacher, school mentor and university tutor) and knowledge (academic, professional and practical) are disrupted. The study, based in a teacher education…

  2. THE DEVELOPMENT OF IODINE BASED IMPINGER SOLUTIONS FOR THE EFFICIENT CAPTURE OF HG USING DIRECT INJECTION NEBULIZATION - INDUCTIVELY COUPLED PLASMA MASS SPECTROMETRY ANALYSIS

    EPA Science Inventory

    Inductively coupled plasma mass spectrometry (ICP/MS) with direct injection nebulization (DIN) was used to evaluate novel impinger solution compositions capable of capturing elemental mercury (Hg0) in EPA Method 5 type sampling. An iodine-based impinger solution proved to be ver...

  3. Innovative approach for transcriptomic analysis of obligate intracellular pathogen: selective capture of transcribed sequences of Ehrlichia ruminantium

    PubMed Central

    2009-01-01

    Background Whole genome transcriptomic analysis is a powerful approach to elucidate the molecular mechanisms controlling the pathogenesis of obligate intracellular bacteria. However, the major hurdle resides in the low quantity of prokaryotic mRNAs extracted from host cells. Our model Ehrlichia ruminantium (ER), the causative agent of heartwater, is transmitted by the tick Amblyomma variegatum. This bacterium affects wild and domestic ruminants and is present in Sub-Saharan Africa and the Caribbean islands. Because of its strictly intracellular location, which constitutes a limitation for its extensive study, the molecular mechanisms involved in its pathogenicity are still poorly understood. Results We successfully adapted the SCOTS method (Selective Capture of Transcribed Sequences) to the model Rickettsiales ER to capture mRNAs. Southern blots and RT-PCR revealed an enrichment of ER cDNAs and a diminution of ribosomal contaminants after three rounds of capture. qRT-PCR and whole-genome ER microarray hybridizations demonstrated that the SCOTS method introduced only a limited bias in gene expression. Indeed, we confirmed the differential gene expression between poorly and highly expressed genes before and after SCOTS captures. The comparative gene expression obtained from ER microarray data on samples before and after SCOTS at 96 hpi was significantly correlated (R² = 0.7). Moreover, the SCOTS method is crucial for microarray analysis of ER, especially for early time points post-infection. There was low detection of transcripts for untreated samples, whereas 24% and 70.7% were detected for SCOTS samples at 24 and 96 hpi, respectively. Conclusions We conclude that the SCOTS method is of key importance for the transcriptomic analysis of ER and can potentially be used for other Rickettsiales. This study constitutes the first step for further gene expression analyses that will lead to a better understanding of both ER pathogenicity and the adaptation of obligate intracellular bacteria to their environment. PMID:20034374

  4. Scalable Photogrammetric Motion Capture System "mosca": Development and Application

    NASA Astrophysics Data System (ADS)

    Knyaz, V. A.

    2015-05-01

    A wide variety of applications (from industrial to entertainment) needs reliable and accurate 3D information about the motion of an object and its parts. Very often the motion of interest is fast, as in vehicle movement, sports biomechanics, and the animation of cartoon characters. Motion capture systems based on different physical principles are used for these purposes. Vision-based systems have great potential for high accuracy and a high degree of automation owing to progress in image processing and analysis. A scalable, inexpensive motion capture system has been developed as a convenient and flexible tool for solving various tasks requiring 3D motion analysis. It is based on photogrammetric techniques of 3D measurement and provides high-speed image acquisition, high accuracy of 3D measurements, and highly automated processing of captured data. Depending on the application, the system can be easily modified for different working areas from 100 mm to 10 m. The developed motion capture system uses from 2 to 4 technical vision cameras for acquiring video sequences of object motion. All cameras work in synchronization mode at frame rates up to 100 frames per second under the control of a personal computer, providing the possibility of accurate calculation of the 3D coordinates of points of interest. The system was used in a set of different application fields and demonstrated high accuracy and a high level of automation.

  5. Unified Simulation and Analysis Framework for Deep Space Navigation Design

    NASA Technical Reports Server (NTRS)

    Anzalone, Evan; Chuang, Jason; Olsen, Carrie

    2013-01-01

    As the technology that enables advanced deep space autonomous navigation continues to develop and the requirements for such capability continues to grow, there is a clear need for a modular expandable simulation framework. This tool's purpose is to address multiple measurement and information sources in order to capture system capability. This is needed to analyze the capability of competing navigation systems as well as to develop system requirements, in order to determine its effect on the sizing of the integrated vehicle. The development for such a framework is built upon Model-Based Systems Engineering techniques to capture the architecture of the navigation system and possible state measurements and observations to feed into the simulation implementation structure. These models also allow a common environment for the capture of an increasingly complex operational architecture, involving multiple spacecraft, ground stations, and communication networks. In order to address these architectural developments, a framework of agent-based modules is implemented to capture the independent operations of individual spacecraft as well as the network interactions amongst spacecraft. This paper describes the development of this framework, and the modeling processes used to capture a deep space navigation system. Additionally, a sample implementation describing a concept of network-based navigation utilizing digitally transmitted data packets is described in detail. This developed package shows the capability of the modeling framework, including its modularity, analysis capabilities, and its unification back to the overall system requirements and definition.

  6. Martian Magnets Under the Microscope

    NASA Technical Reports Server (NTRS)

    2004-01-01

    NASA's Mars Exploration Rover Spirit acquired this microscopic imager view of its capture magnet on sol 92 (April 6, 2004). Both Spirit and the Mars Exploration Rover Opportunity are equipped with a number of magnets. The capture magnet, as seen here, is stronger than its sidekick, the filter magnet. The weaker filter magnet captures only the most strongly magnetic airborne dust, while the capture magnet picks up all magnetic airborne dust.

    The magnets' primary purpose is to collect the martian magnetic dust so that scientists can analyze it with the rovers' Moessbauer spectrometers. While there is plenty of dust on the surface of Mars, it is difficult to confirm where it came from, and when it was last airborne. Because scientists are interested in learning about the properties of the dust in the atmosphere, they devised this dust-collection experiment.

    The capture magnet is about 4.5 centimeters (1.8 inches) in diameter and is constructed with a central cylinder and three rings, each with alternating orientations of magnetization. Scientists have been monitoring the continual accumulation of dust since the beginning of the mission with panoramic camera and microscopic imager images. They had to wait until enough dust accumulated before they could get a Moessbauer spectrometer analysis. The results of that analysis, performed on sol 92, have not been sent back to Earth yet.

  7. In vivo evaluation of neutron capture therapy effectivity using calcium phosphate-based nanoparticles as Gd-DTPA delivery agent.

    PubMed

    Dewi, Novriana; Mi, Peng; Yanagie, Hironobu; Sakurai, Yuriko; Morishita, Yasuyuki; Yanagawa, Masashi; Nakagawa, Takayuki; Shinohara, Atsuko; Matsukawa, Takehisa; Yokoyama, Kazuhito; Cabral, Horacio; Suzuki, Minoru; Sakurai, Yoshinori; Tanaka, Hiroki; Ono, Koji; Nishiyama, Nobuhiro; Kataoka, Kazunori; Takahashi, Hiroyuki

    2016-04-01

    A more immediate therapeutic impact from current clinical research efforts might be obtained by developing a noninvasive radiation dose-escalation strategy, and neutron capture therapy represents one such novel approach. Furthermore, some recent research on neutron capture therapy has focused on using gadolinium as an alternative or complement to the currently used boron, taking into account several advantages that gadolinium offers. Therefore, in this study, we carried out a feasibility evaluation of both single and multiple injections of a gadolinium-based MRI contrast agent incorporated in calcium phosphate nanoparticles as a neutron capture therapy agent. In vivo evaluation was performed on colon carcinoma Col-26 tumor-bearing mice irradiated at the nuclear reactor facility of the Kyoto University Research Reactor Institute with an average neutron fluence of 1.8 × 10¹² n/cm². Antitumor effectivity was evaluated based on tumor growth suppression assessed until 27 days after neutron irradiation, followed by histopathological analysis of tumor slices. The experimental results showed that tumor growth in irradiated mice injected beforehand with Gd-DTPA-incorporating calcium phosphate-based nanoparticles was suppressed up to four times more than in the non-treated group, supported by the results of the histopathological analysis. The antitumor effectivity observed in tumor-bearing mice after neutron irradiation indicates the possible effectivity of gadolinium-based neutron capture therapy treatment.

  8. Techno-economic assessment of polymer membrane systems for postcombustion carbon capture at coal-fired power plants.

    PubMed

    Zhai, Haibo; Rubin, Edward S

    2013-03-19

    This study investigates the feasibility of polymer membrane systems for postcombustion carbon dioxide (CO2) capture at coal-fired power plants. Using newly developed performance and cost models, our analysis shows that membrane systems configured with multiple stages or steps are capable of meeting capture targets of 90% CO2 removal efficiency and 95+% product purity. A combined driving force design using both compressors and vacuum pumps is most effective for reducing the cost of CO2 avoided. Further reductions in the overall system energy penalty and cost can be obtained by recycling a portion of CO2 via a two-stage, two-step membrane configuration with air sweep to increase the CO2 partial pressure of feed flue gas. For a typical plant with carbon capture and storage, this yielded a 15% lower cost per metric ton of CO2 avoided compared to a plant using a current amine-based capture system. A series of parametric analyses also is undertaken to identify paths for enhancing the viability of membrane-based capture technology.
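
    Comparisons like the one with an amine-based system rest on the standard cost-of-CO2-avoided metric, computed from the levelized cost of electricity and the emission rates of the capture and reference plants. The sketch below shows that formula with made-up numbers; the values are illustrative assumptions, not results from the study.

    ```python
    # Sketch: cost of CO2 avoided from levelized cost of electricity (LCOE) and
    # emission rates of a capture plant vs. a reference plant without capture.
    # All numerical values are illustrative assumptions.

    def cost_of_co2_avoided(lcoe_ref, lcoe_cap, er_ref, er_cap):
        """
        lcoe_*: levelized cost of electricity, $/MWh
        er_*  : CO2 emission rate, tonne CO2 per MWh
        returns $/tonne CO2 avoided
        """
        return (lcoe_cap - lcoe_ref) / (er_ref - er_cap)

    # Reference coal plant vs. plant with 90% capture (assumed values)
    lcoe_ref, er_ref = 60.0, 0.80      # $/MWh, tCO2/MWh
    lcoe_cap, er_cap = 105.0, 0.11     # energy penalty raises LCOE, capture cuts emissions

    print(f"cost of CO2 avoided: "
          f"{cost_of_co2_avoided(lcoe_ref, lcoe_cap, er_ref, er_cap):.1f} $/tCO2")
    ```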

  9. System-wide identification of RNA-binding proteins by interactome capture.

    PubMed

    Castello, Alfredo; Horos, Rastislav; Strein, Claudia; Fischer, Bernd; Eichelbaum, Katrin; Steinmetz, Lars M; Krijgsveld, Jeroen; Hentze, Matthias W

    2013-03-01

    Owing to their preeminent biological functions, the repertoire of expressed RNA-binding proteins (RBPs) and their activity states are highly informative about cellular systems. We have developed a novel and unbiased technique, called interactome capture, for identifying the active RBPs of cultured cells. By making use of in vivo UV cross-linking of RBPs to polyadenylated RNAs, covalently bound proteins are captured with oligo(dT) magnetic beads. After stringent washes, the mRNA interactome is determined by quantitative mass spectrometry (MS). The protocol takes 3 working days for analysis of single proteins by western blotting, and about 2 weeks for the determination of complete cellular mRNA interactomes by MS. The most important advantage of interactome capture over other in vitro and in silico approaches is that only RBPs bound to RNA in a physiological environment are identified. When applied to HeLa cells, interactome capture revealed hundreds of novel RBPs. Interactome capture can also be broadly used to compare different biological states, including metabolic stress, cell cycle, differentiation, development or the response to drugs.

  10. Isolation of circulating tumor cells by a magnesium-embedded filter

    NASA Astrophysics Data System (ADS)

    Liu, Yang; Xu, Tong; Xu, Yucheng; Kang, Dongyang; Xu, Lei; Park, Jungwook; Han-Chieh Chang, Jay; Zhang, Xiaoxiao; Goldkorn, Amir; Tai, Yu-Chong

    2015-10-01

    Circulating tumor cells (CTCs) are rare cancer cells that are shed by tumors into the bloodstream and that can be valuable biomarkers for various types of cancers. However, CTCs captured on a filter cannot be released easily using the existing size-based CTC analysis platforms. To address this limitation, we have developed a novel magnesium (Mg)-embedded cell filter for the capture, release and isolation of CTCs. The CTC filter consists of a thin E-beam-deposited Mg layer embedded between two parylene-C (PA-C) layers with designed slots for filtration and CTC capture. Thin Mg film has proved highly biocompatible and can be etched in saline, PBS, Dulbecco's modified Eagle medium (DMEM), and similar solutions, properties that are of great benefit in dissociating the filter and thus releasing the cells. Finite element method (FEM) analysis of the Mg etching process in DMEM was performed for the structure design. After the filtration process, the filter was submerged in DMEM to facilitate Mg etching. The top PA-C filter pieces break apart from the bottom after the Mg completely dissolves, enabling captured CTCs to detach. The released CTCs can be easily aspirated into a micropipette for further analysis. Thus, the Mg-embedded cell filter provides a new and effective approach for isolating CTCs from the filter, making this a promising new strategy for cancer detection.

  11. Likelihood analysis of spatial capture-recapture models for stratified or class structured populations

    USGS Publications Warehouse

    Royle, J. Andrew; Sutherland, Christopher S.; Fuller, Angela K.; Sun, Catherine C.

    2015-01-01

    We develop a likelihood analysis framework for fitting spatial capture-recapture (SCR) models to data collected on class structured or stratified populations. Our interest is motivated by the necessity of accommodating the problem of missing observations of individual class membership. This is particularly problematic in SCR data arising from DNA analysis of scat, hair or other material, which frequently yields individual identity but fails to identify the sex. Moreover, this can represent a large fraction of the data and, given the typically small sample sizes of many capture-recapture studies based on DNA information, utilization of the data with missing sex information is necessary. We develop the class structured likelihood for the case of missing covariate values, and then we address the scaling of the likelihood so that models with and without class structured parameters can be formally compared regardless of missing values. We apply our class structured model to black bear data collected in New York in which sex could be determined for only 62 of 169 uniquely identified individuals. The models containing sex-specificity of both the intercept of the SCR encounter probability model and the distance coefficient, and including a behavioral response are strongly favored by log-likelihood. Estimated population sex ratio is strongly influenced by sex structure in model parameters illustrating the importance of rigorous modeling of sex differences in capture-recapture models.

  12. Multi-Column Xe/Kr Separation with AgZ-PAN and HZ-PAN

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Greenhalgh, Mitchell Randy; Garn, Troy Gerry; Welty, Amy Keil

    Previous multi-column xenon/krypton separation tests have demonstrated the capability of separating xenon from krypton in a mixed gas feed stream. The results of this initial testing with AgZ-PAN and HZ-PAN indicated that an excellent separation of xenon from krypton could be achieved. Building upon these initial results, a series of additional multi-column tests was performed in FY-16. The purpose of this testing was to scale up the sorbent beds, test a different composition of feed gas, and attempt to improve the accuracy of the analysis of the individual capture columns' compositions. Two Stirling coolers were installed in series to perform this testing. The use of the coolers instead of the cryostat provided two desired improvements: 1) removal of the large dilution due to the internal volume of the cryostat adsorption chamber, and 2) the ability to increase the sorbent bed size for scale-up. The AgZ-PAN sorbent, due to its xenon selectivity, was loaded in the first column to capture the xenon while allowing the krypton to flow through and be routed to a second column containing the HZ-PAN for capture and analysis. The gases captured on both columns were sampled with evacuated sample bombs and subsequently analyzed via GC-MS for both krypton and xenon. The results of these tests can be used to develop the scope of future testing and analysis using this test bed for demonstrating the capture and separation of xenon and krypton using sorbents, for demonstrating desorption and regeneration of the sorbents, and for determining compositions of the desorbed gases. They indicate a need for future desorption studies in order to better quantify co-adsorbed species and final krypton purity.

  13. Parameter-expanded data augmentation for Bayesian analysis of capture-recapture models

    USGS Publications Warehouse

    Royle, J. Andrew; Dorazio, Robert M.

    2012-01-01

    Data augmentation (DA) is a flexible tool for analyzing closed and open population models of capture-recapture data, especially models which include sources of heterogeneity among individuals. The essential concept underlying DA, as we use the term, is based on adding "observations" to create a dataset composed of a known number of individuals. This new (augmented) dataset, which includes the unknown number of individuals N in the population, is then analyzed using a new model that includes a reformulation of the parameter N in the conventional model of the observed (unaugmented) data. In the context of capture-recapture models, we add a set of "all zero" encounter histories which are not, in practice, observable. The model of the augmented dataset is a zero-inflated version of either a binomial or a multinomial base model. Thus, our use of DA provides a general approach for analyzing both closed and open population models of all types. In doing so, this approach provides a unified framework for the analysis of a huge range of models that are treated as unrelated "black boxes" and named procedures in the classical literature. As a practical matter, analysis of the augmented dataset by MCMC is greatly simplified compared to other methods that require specialized algorithms. For example, complex capture-recapture models of an augmented dataset can be fitted with popular MCMC software packages (WinBUGS or JAGS) by providing a concise statement of the model's assumptions that usually involves only a few lines of pseudocode. In this paper, we review the basic technical concepts of data augmentation, and we provide examples of analyses of closed-population models (M0, Mh, distance sampling, and spatial capture-recapture models) and open-population models (Jolly-Seber) with individual effects.
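
    The zero-inflated base model mentioned above is easy to illustrate for the simplest closed-population case, M0: each of the M augmented encounter histories is either a "real" individual (probability ψ) with binomial detections or a structural zero. The sketch below writes out that zero-inflated binomial likelihood and maximizes it directly on simulated data; this is only to show the augmented-data model structure, whereas the paper's own analyses fit such models by MCMC (e.g., in WinBUGS or JAGS).

    ```python
    # Sketch: the zero-inflated binomial likelihood behind data augmentation for
    # a closed-population model M0, maximized directly on simulated data.
    import numpy as np
    from scipy.optimize import minimize
    from scipy.stats import binom

    rng = np.random.default_rng(1)
    N_true, T, p_true = 60, 5, 0.3            # true population size, occasions, detection
    y_all = rng.binomial(T, p_true, N_true)
    y_obs = y_all[y_all > 0]                  # only detected individuals are observed

    M = 300                                   # size of the augmented data set
    y_aug = np.concatenate([y_obs, np.zeros(M - len(y_obs))])  # add all-zero histories

    def neg_log_lik(params):
        psi, p = 1 / (1 + np.exp(-np.asarray(params)))   # logit -> (0, 1)
        like = psi * binom.pmf(y_aug, T, p) + (1 - psi) * (y_aug == 0)
        return -np.sum(np.log(like))

    fit = minimize(neg_log_lik, x0=[0.0, 0.0], method="Nelder-Mead")
    psi_hat, p_hat = 1 / (1 + np.exp(-fit.x))
    print(f"p_hat = {p_hat:.2f}, estimated N = {psi_hat * M:.1f} (true {N_true})")
    ```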

  14. The impact of red light running camera flashes on younger and older drivers' attention and oculomotor control.

    PubMed

    Wright, Timothy J; Vitale, Thomas; Boot, Walter R; Charness, Neil

    2015-12-01

    Recent empirical evidence has suggested that the flashes associated with red light running cameras (RLRCs) distract younger drivers, pulling attention away from the roadway and delaying processing of safety-relevant events. Considering the perceptual and attentional declines that occur with age, older drivers may be especially susceptible to the distracting effects of RLRC flashes, particularly in situations in which the flash is more salient (a bright flash at night compared with the day). The current study examined how age and situational factors potentially influence attention capture by RLRC flashes using covert (cuing effects) and overt (eye movement) indices of capture. We manipulated the salience of the flash by varying its luminance and contrast with respect to the background of the driving scene (either day or night scenes). Results of 2 experiments suggest that simulated RLRC flashes capture observers' attention, but, surprisingly, no age differences in capture were observed. However, an analysis examining early and late eye movements revealed that older adults may have been strategically delaying their eye movements in order to avoid capture. Additionally, older adults took longer to disengage attention following capture, suggesting at least 1 age-related disadvantage in capture situations. Findings have theoretical implications for understanding age differences in attention capture, especially with respect to capture in real-world scenes, and inform future work that should examine how the distracting effects of RLRC flashes influence driver behavior.

  15. The Impact of Red Light Running Camera Flashes on Younger and Older Drivers' Attention and Oculomotor Control

    PubMed Central

    Wright, Timothy J.; Vitale, Thomas; Boot, Walter R; Charness, Neil

    2015-01-01

    Recent empirical evidence suggests that the flashes associated with red light running cameras (RLRCs) distract younger drivers, pulling attention away from the roadway and delaying processing of safety-relevant events. Considering the perceptual and attentional declines that occur with age, older drivers may be especially susceptible to the distracting effects of RLRC flashes, particularly in situations in which the flash is more salient (a bright flash at night compared to the day). The current study examined how age and situational factors potentially influence attention capture by RLRC flashes using covert (cuing effects) and overt (eye movement) indices of capture. We manipulated the salience of the flash by varying its luminance and contrast with respect to the background of the driving scene (either day or night scenes). Results of two experiments suggest that simulated RLRC flashes capture observers' attention, but, surprisingly, no age differences in capture were observed. However, an analysis examining early and late eye movements revealed that older adults may have been strategically delaying their eye movements in order to avoid capture. Additionally, older adults took longer to disengage attention following capture, suggesting at least one age-related disadvantage in capture situations. Findings have theoretical implications for understanding age differences in attention capture, especially with respect to capture in real-world scenes, and inform future work that should examine how the distracting effects of RLRC flashes influence driver behavior. PMID:26479014

  16. Orbit and size distributions for asteroids temporarily captured by the Earth-Moon system

    NASA Astrophysics Data System (ADS)

    Fedorets, Grigori; Granvik, Mikael; Jedicke, Robert

    2017-03-01

    As a continuation of the work by Granvik et al. (2012), we expand the statistical treatment of Earth's temporarily-captured natural satellites from temporarily-captured orbiters (TCOs, i.e., objects which make at least one orbit around the Earth) to the newly redefined subpopulation of temporarily-captured flybys (TCFs). TCFs are objects that, while gravitationally bound, fail to make a complete orbit around the Earth while on a geocentric orbit, but nevertheless approach the Earth within its Hill radius. We follow the trajectories of massless test asteroids through the Earth-Moon system and record the orbital characteristics of those that are temporarily captured. We then carry out a steady-state analysis utilizing the novel NEO population model by Granvik et al. (2016). We also investigate how a quadratic distribution at very small values of e⊙ and i⊙ affects the predicted population statistics of Earth's temporarily-captured natural satellites. The steady-state population in both cases (constant and quadratic number distributions inside the e and i bins) is predicted to contain a slightly reduced number of meter-sized asteroids compared to the values of the previous paper. For the combined TCO/TCF population, we find the largest body constantly present on a geocentric orbit to be on the order of 80 cm in diameter. In the region of phase space where capture is possible, the capture efficiency of TCOs and TCFs is of the order of 10⁻⁶-10⁻⁴. We also find that kilometer-scale asteroids are captured once every 10 Myr.

  17. Large Pilot-Scale Carbon Dioxide (CO2) Capture Project Using Aminosilicone Solvent. Final Scientific/Technical Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hancu, Dan

    GE Global Research has developed, over the last 8 years, a platform of cost-effective CO2 capture technologies based on a non-aqueous aminosilicone solvent (GAP-1m). As demonstrated in previous DOE-funded projects (DE-FE0007502 and DE-FE0013755), the GAP-1m solvent has a higher CO2 working capacity and lower volatility and corrosivity than the benchmark aqueous amine technology. Performance of the GAP-1m solvent was recently demonstrated in a 0.5 MWe pilot at the National Carbon Capture Center, AL, with real flue gas for over 500 hours of operation using a Steam Stripper Column (SSC). The pilot-scale PSTU engineering data were used to (i) update the techno-economic analysis and EH&S assessment, (ii) perform a technology gap analysis, and (iii) conduct the solvent manufacturability and scale-up study.

  18. Low and High Frequency Models of Response Statistics of a Cylindrical Orthogrid Vehicle Panel to Acoustic Excitation

    NASA Technical Reports Server (NTRS)

    Smith, Andrew; LaVerde, Bruce; Teague, David; Gardner, Bryce; Cotoni, Vincent

    2010-01-01

    This presentation further develops the orthogrid vehicle panel work, employing Hybrid Module capabilities to assess both low/mid-frequency and high-frequency models in the VA One simulation environment. The response estimates from three modeling approaches are compared to ground test measurements: (1) a detailed finite element model of the test article, expected to capture both the global panel modes and the local pocket-mode response, but at considerable analysis expense (time and resources); (2) a composite layered construction equivalent global stiffness approximation using SEA, expected to capture the response of the global panel modes only; and (3) an SEA approximation using the periodic subsystem formulation, in which a finite element model of a single periodic cell is used to derive the vibroacoustic properties of the entire periodic structure (modal density, radiation efficiency, etc.), expected to capture the response at various locations on the panel (on the skin and on the ribs) with less analysis expense.

  19. Effects of sampling conditions on DNA-based estimates of American black bear abundance

    USGS Publications Warehouse

    Laufenberg, Jared S.; Van Manen, Frank T.; Clark, Joseph D.

    2013-01-01

    DNA-based capture-mark-recapture techniques are commonly used to estimate American black bear (Ursus americanus) population abundance (N). Although the technique is well established, many questions remain regarding study design. In particular, relationships among N, capture probability of heterogeneity mixtures A and B (pA and pB, respectively, or p, collectively), the proportion of each mixture (π), number of capture occasions (k), and probability of obtaining reliable estimates of N are not fully understood. We investigated these relationships using 1) an empirical dataset of DNA samples for which true N was unknown and 2) simulated datasets with known properties that represented a broader array of sampling conditions. For the empirical data analysis, we used the full closed population with heterogeneity data type in Program MARK to estimate N for a black bear population in Great Smoky Mountains National Park, Tennessee. We systematically reduced the number of those samples used in the analysis to evaluate the effect that changes in capture probabilities may have on parameter estimates. Model-averaged N for females and males were 161 (95% CI = 114–272) and 100 (95% CI = 74–167), respectively (pooled N = 261, 95% CI = 192–419), and the average weekly p was 0.09 for females and 0.12 for males. When we reduced the number of samples of the empirical data, support for heterogeneity models decreased. For the simulation analysis, we generated capture data with individual heterogeneity covering a range of sampling conditions commonly encountered in DNA-based capture-mark-recapture studies and examined the relationships between those conditions and accuracy (i.e., probability of obtaining an estimated N that is within 20% of true N), coverage (i.e., probability that the 95% confidence interval includes true N), and precision (i.e., probability of obtaining a coefficient of variation ≤20%) of estimates using logistic regression. The capture probability for the larger of 2 mixture proportions of the population (i.e., pA or pB, depending on the value of π) was most important for predicting accuracy and precision, whereas capture probabilities of both mixture proportions (pA and pB) were important to explain variation in coverage. Based on sampling conditions similar to parameter estimates from the empirical dataset (pA = 0.30, pB = 0.05, N = 250, π = 0.15, and k = 10), predicted accuracy and precision were low (60% and 53%, respectively), whereas coverage was high (94%). Increasing pB, the capture probability for the predominant but most difficult to capture proportion of the population, was most effective to improve accuracy under those conditions. However, manipulation of other parameters may be more effective under different conditions. In general, the probabilities of obtaining accurate and precise estimates were best when p ≥ 0.2. Our regression models can be used by managers to evaluate specific sampling scenarios and guide development of sampling frameworks or to assess reliability of DNA-based capture-mark-recapture studies.
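
    The two-point heterogeneity mixture described in the abstract can be illustrated with a short simulation. The sketch below is not the authors' Program MARK workflow; it simply generates detection counts under assumed roles for the parameters (here π is taken as the proportion captured with probability pA) using the empirical-like values quoted above.

```python
# Minimal sketch of simulating detection counts under a two-point capture
# heterogeneity mixture: a fraction pi of N animals has per-occasion capture
# probability pA, the remainder pB, over k occasions. Parameter roles are an
# assumption for illustration, not the study's exact specification.
import numpy as np

def simulate_mixture_counts(N=250, pi=0.15, pA=0.30, pB=0.05, k=10, rng=None):
    rng = rng or np.random.default_rng()
    in_A = rng.random(N) < pi                      # mixture membership
    p = np.where(in_A, pA, pB)                     # per-occasion capture probability
    detections = rng.binomial(k, p)                # detections per animal over k occasions
    return detections[detections > 0]              # only detected animals enter the dataset

# Average number of animals ever detected under the empirical-like scenario;
# a low detected fraction is one reason accuracy and precision were predicted to be low.
rng = np.random.default_rng(0)
n_detected = [len(simulate_mixture_counts(rng=rng)) for _ in range(1000)]
print("mean animals detected out of 250:", np.mean(n_detected))
```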

  20. A multicriteria decision analysis model and risk assessment framework for carbon capture and storage.

    PubMed

    Humphries Choptiany, John Michael; Pelot, Ronald

    2014-09-01

    Multicriteria decision analysis (MCDA) has been applied to various energy problems to incorporate a variety of qualitative and quantitative criteria, usually spanning environmental, social, engineering, and economic fields. MCDA and associated methods such as life-cycle assessments and cost-benefit analysis can also include risk analysis to address uncertainties in criteria estimates. One technology now being assessed to help mitigate climate change is carbon capture and storage (CCS). CCS is a new process that captures CO2 emissions from fossil-fueled power plants and injects them into geological reservoirs for storage. It presents a unique challenge to decisionmakers (DMs) due to its technical complexity, range of environmental, social, and economic impacts, variety of stakeholders, and long time spans. The authors have developed a risk assessment model using a MCDA approach for CCS decisions such as selecting between CO2 storage locations and choosing among different mitigation actions for reducing risks. The model includes uncertainty measures for several factors, utility curve representations of all variables, Monte Carlo simulation, and sensitivity analysis. This article uses a CCS scenario example to demonstrate the development and application of the model based on data derived from published articles and publicly available sources. The model allows high-level DMs to better understand project risks and the tradeoffs inherent in modern, complex energy decisions. © 2014 Society for Risk Analysis.
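
    As a rough illustration of the kind of framework described above, the sketch below propagates criterion uncertainty through a weighted-additive MCDA score by Monte Carlo simulation. The criteria, weights, utility curve, and distributions are invented placeholders for illustration and are not taken from the article.

```python
# Illustrative weighted-additive MCDA score with Monte Carlo propagation of
# criterion uncertainty, in the spirit of the CCS decision framework above.
# All criteria, weights, and distributions are hypothetical placeholders.
import numpy as np

rng = np.random.default_rng(42)
weights = {"cost": 0.4, "leakage_risk": 0.35, "public_acceptance": 0.25}

def utility(x):
    """Map a criterion score to utility on [0, 1] (here simply linear/clipped)."""
    return np.clip(x, 0.0, 1.0)

def score_alternative(criteria_dists, n=10_000):
    """Monte Carlo distribution of the weighted-additive MCDA score."""
    total = np.zeros(n)
    for name, (mean, sd) in criteria_dists.items():
        samples = rng.normal(mean, sd, n)          # uncertain criterion estimate
        total += weights[name] * utility(samples)
    return total

site_A = {"cost": (0.6, 0.10), "leakage_risk": (0.8, 0.05), "public_acceptance": (0.5, 0.15)}
site_B = {"cost": (0.7, 0.05), "leakage_risk": (0.6, 0.10), "public_acceptance": (0.7, 0.10)}
a, b = score_alternative(site_A), score_alternative(site_B)
print("P(storage site A outscores site B) =", np.mean(a > b))
```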

  1. The quick acquisition technique for laser communication between LEO and GEO

    NASA Astrophysics Data System (ADS)

    Zhang, Li-zhong; Zhang, Rui-qin; Li, Yong-hao; Meng, Li-xin; Li, Xiao-ming

    2013-08-01

    Sight-axis alignment between two laser communication terminals is accomplished by a quick acquisition operation, which is the prerequisite for establishing a free-space optical communication link. For LEO (Low Earth Orbit)-Ground and LEO-GEO (Geostationary Earth Orbit) links in particular, the Earth periodically blocks the laser path and interrupts the communication, so the effective time for each communication pass is very short (from several minutes to tens of minutes); as a result, the terminals must re-acquire each other to re-establish the link. Based on an analysis of traditional methods, this paper presents a new approach in which an elongated beacon beam replaces the circular beacon beam, so that the original two-dimensional raster-spiral scan is reduced to a one-dimensional scan. This method reduces the setup time and decreases the acquisition failure probability for the LEO-GEO laser communication link. The paper first analyzes the external constraint conditions in the acquisition phase, then establishes the acquisition algorithm models, carries out an optimization analysis of the acquisition-unit parameters, and performs ground validation experiments of the acquisition strategy. The experiments and analysis show that, compared with traditional capture methods, the proposed method shortens the acquisition time by about 40% and reduces the acquisition failure probability by about 30%, making it significant for the LEO-GEO laser communication link.
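
    The advantage of the one-dimensional scan can be seen with a toy count of dwell positions. The sketch below is not the paper's model: the uncertainty cone, beam divergence, and overlap factor are invented placeholders, and the point is only that a full two-axis raster scales roughly quadratically with the number of steps while a one-axis scan with an elongated beacon scales linearly.

```python
# Toy back-of-the-envelope comparison (not the paper's quantitative model) of
# dwell positions needed to cover a pointing-uncertainty cone with a 2-D
# raster/spiral scan versus a 1-D scan using an elongated beacon.
import math

uncertainty = 1000e-6          # uncertainty cone half-angle [rad], assumed
beam = 50e-6                   # circular beacon divergence half-angle [rad], assumed
overlap = 0.9                  # step size as a fraction of the beam width, assumed

steps_1d = math.ceil(2 * uncertainty / (2 * beam * overlap))
steps_2d = steps_1d ** 2       # a full raster over both axes scales quadratically

print(f"1-D scan dwell points: {steps_1d}")
print(f"2-D raster dwell points: {steps_2d}  ({steps_2d // steps_1d}x more)")
```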

  2. Analysis and Modeling of Ground Operations at Hub Airports

    NASA Technical Reports Server (NTRS)

    Atkins, Stephen (Technical Monitor); Andersson, Kari; Carr, Francis; Feron, Eric; Hall, William D.

    2000-01-01

    Building simple and accurate models of hub airports can considerably help one understand airport dynamics, and may provide quantitative estimates of operational airport improvements. In this paper, three models are proposed to capture the dynamics of busy hub airport operations. Two simple queuing models are introduced to capture the taxi-out and taxi-in processes. An integer programming model aimed at representing airline decision-making attempts to capture the dynamics of the aircraft turnaround process. These models can be applied for predictive purposes. They may also be used to evaluate control strategies for improving overall airport efficiency.
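
    The flavor of the simple queuing abstraction mentioned above can be shown with a short simulation. The sketch below is not the paper's calibrated model: pushback rate and runway service time are invented placeholders, and the taxi-out process is reduced to a single FIFO runway queue.

```python
# Minimal sketch of a taxi-out queue: departures push back from the gate,
# join a single FIFO runway queue, and are served one at a time. Rates and
# service times are hypothetical, not calibrated airport data.
import numpy as np

def simulate_taxi_out(n_flights=500, push_rate=0.6, service_time=1.5, rng=None):
    """Return taxi-out (wait + runway occupancy) times, in minutes."""
    rng = rng or np.random.default_rng(0)
    pushbacks = np.cumsum(rng.exponential(1.0 / push_rate, n_flights))  # gate pushback times
    start = np.zeros(n_flights)
    finish = np.zeros(n_flights)
    for i in range(n_flights):
        start[i] = pushbacks[i] if i == 0 else max(pushbacks[i], finish[i - 1])
        finish[i] = start[i] + service_time        # deterministic runway occupancy
    return finish - pushbacks

print("mean taxi-out time [min]:", round(simulate_taxi_out().mean(), 1))
```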

  3. SIMS analysis of extended impact features on LDEF experiment

    NASA Technical Reports Server (NTRS)

    Amari, S.; Foote, J.; Jessberger, E. K.; Simon, C.; Stadermann, F. J.; Swan, P.; Walker, R.; Zinner, E.

    1991-01-01

    Discussed here is the first Secondary Ion Mass Spectrometry (SIMS) analysis of projectile material deposited in extended impact features on Ge wafers from the trailing edge. Although most capture cells lost their plastic film covers, they contain extended impact features that apparently were produced by high-velocity impacts when the plastic foils were still intact. Detailed optical scanning of all bare capture cells from the trailing edge revealed more than 100 impacts. Fifty-eight were selected by scanning electron microscope (SEM) inspection as prime candidates for SIMS analysis. Preliminary SIMS measurements were made on 15 impacts. More than half showed substantial enhancements of Mg, Al, Si, Ca, and Fe in the impact region, indicating micrometeorites as the projectiles.

  4. Single step synthesis of nanostructured boron nitride for boron neutron capture therapy

    NASA Astrophysics Data System (ADS)

    Singh, Bikramjeet; Singh, Paviter; Kumar, Manjeet; Thakur, Anup; Kumar, Akshay

    2015-05-01

    Nanostructured boron nitride (BN) has been successfully synthesized by carbothermic reduction of boric acid (H3BO3). This method is a relatively low-temperature synthesis route and can be used for large-scale production of nanostructured BN. The synthesized nanoparticles have been characterized by X-ray diffraction (XRD), scanning electron microscopy (SEM) and differential thermal analysis (DTA). XRD analysis confirmed the formation of single-phase nanostructured boron nitride. SEM analysis showed that the particles are spherical in shape. DTA analysis showed that the phase is stable up to 900 °C, so the material can be used for high-temperature applications as well as for boron neutron capture therapy (BNCT).

  5. Analysis of the life stages of Cimex lectularius captured within a medical centre suggests that the true numbers of bed bug introductions are under-reported.

    PubMed

    Sheele, J M; Barrett, E; Dash, D; Ridge, G E

    2017-11-01

    Little is known about the epidemiology of bed bugs within the healthcare system, but nymphal stages predominate in natural infestations. This study determined the life stages of bed bugs captured within a medical centre, and found that older bed bugs were more likely to be captured than younger insects. The numbers of first instars, third-fifth instars and adult females captured were significantly different compared with the numbers of each life stage found in a natural infestation (P<0.01). A significant number of early-instar bed bugs introduced into the medical centre may go unnoticed by hospital staff. Copyright © 2017 The Healthcare Infection Society. Published by Elsevier Ltd. All rights reserved.

  6. Dynamic Blowout Risk Analysis Using Loss Functions.

    PubMed

    Abimbola, Majeed; Khan, Faisal

    2018-02-01

    Most risk analysis approaches are static and fail to capture evolving conditions. Blowout, the most feared accident during a drilling operation, is a complex and dynamic event. Traditional risk analysis methods are useful in the early design stage of a drilling operation but fall short during evolving operational decision-making. A new dynamic risk analysis approach is presented to capture evolving situations through dynamic probability and consequence models. The dynamic consequence models, the focus of this study, are developed in terms of loss functions. These models are subsequently integrated with the probability to estimate operational risk, providing a real-time risk analysis. The real-time evolving situation is considered dependent on the changing bottom-hole pressure as drilling progresses. The application of the methodology and models is demonstrated with a case study of an offshore drilling operation evolving to a blowout. © 2017 Society for Risk Analysis.

  7. Military Police Operations and Counterinsurgency

    DTIC Science & Technology

    2008-03-01

    that deceitfully advertised PRUs as assassination squads. In reality most PRU operations ended in the successful capture and prosecution of VCI...and neutralize Kimathi whose capture "virtually ended Mau Mau resistance" (Asprey, 886). This campaign analysis will use the following categories to...for special access programs. 3.10. Employs specialized investigative techniques to include forensic and behavioral sciences and hypnosis. 3.11

  8. Project CAPTURE: using forest inventory and analysis data to prioritize tree species for conservation, management, and restoration

    Treesearch

    Kevin M. Potter; Barbara S. Crane; William W. Hargrove

    2015-01-01

    A variety of threats, most importantly climate change and insect and disease infestation, will increase the likelihood that forest tree species could experience population-level extirpation or species-level extinction during the next century. Project CAPTURE (Conservation Assessment and Prioritization of Forest Trees Under Risk of Extirpation) is a cooperative effort...

  9. EVALUATION OF N-METHYL-N-TERT-BUTYLDIMETHYLSILYLTRIFLUOROACETAMIDE FOR ENVIRONMENTAL ANALYSIS UNDER BOTH EIMS AND ELECTRON CAPTURE NICIMS CONDITIONS AND COMPARISON TO TRIMETHYLSILYL REAGENTS UNDER EIMS

    EPA Science Inventory

    Sewage effluent was analyzed for 3,5,6-trichloropyridinol (TCP) by extracting one liter of water using liquid-liquid extraction and determined by GC/MS operated in the negative ion chemical ionization (electron capture) mode. TCP is the major metabolite of the commonly used insec...

  10. Analysis on Tracking Schedule and Measurements Characteristics for the Spacecraft on the Phase of Lunar Transfer and Capture

    NASA Astrophysics Data System (ADS)

    Song, Young-Joo; Choi, Su-Jin; Ahn, Sang-il; Sim, Eun-Sup

    2014-03-01

    In this work, a preliminary analysis of both the tracking schedule and the measurement characteristics for a spacecraft in the lunar transfer and capture phases is performed. To analyze both, optimized trajectories for the lunar transfer and capture phases are directly adapted from former research, and eleven ground tracking facilities (three Deep Space Network sites, seven Near Earth Network sites, and one Daejeon site) are assumed to support the mission. Under these conceptual mission scenarios, detailed tracking schedules and expected measurement characteristics during critical maneuvers (Trans-Lunar Injection, Lunar Orbit Insertion, and Apoapsis Adjustment Maneuver), especially for the Daejeon station, are successfully analyzed. The predicted measurement variances during the lunar capture phase around the critical maneuvers are found to be on the order of mm/s for the range rates and micro-deg/s for the angular rates, in good agreement with the recommended values of typical measurement modeling accuracies for Deep Space Networks. Although only preliminary navigation accuracy guidelines are provided through this work, it is expected to give practical insights into preparing Korea's future lunar mission, especially for developing the flight dynamics subsystem.

  11. Long-range Self-interacting Dark Matter in the Sun

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, Jing; Liang, Zheng-Liang; Wu, Yue-Liang

    We investigate the implications of the long-range self-interaction on both the self-capture and the annihilation of self-interacting dark matter (SIDM) trapped in the Sun. Our discussion is based on a specific SIDM model in which DM particles self-interact via a light scalar mediator, or Yukawa potential, in the context of quantum mechanics. Within this framework, we calculate the self-capture rate across a broad region of parameter space. While the self-capture rate can be obtained separately in the Born regime with a perturbative method and in the classical limit with the Rutherford formula, our calculation covers the gap in between in a non-perturbative fashion. In addition, the phenomenology of both the Sommerfeld-enhanced s- and p-wave annihilation of the solar SIDM is included in our discussion. Moreover, by combining the analysis of the Super-Kamiokande (SK) data and the observed DM relic density, we constrain the nuclear capture rate of the DM particles in the presence of the dark Yukawa potential. The consequence of the long-range dark force for probing the solar SIDM turns out to be significant if the force carrier is much lighter than the DM particle, and a quantitative analysis is provided.

  12. Long-range Self-interacting Dark Matter in the Sun

    NASA Astrophysics Data System (ADS)

    Chen, Jing; Liang, Zheng-Liang; Wu, Yue-Liang; Zhou, Yu-Feng

    2015-12-01

    We investigate the implications of the long-range self-interaction on both the self-capture and the annihilation of self-interacting dark matter (SIDM) trapped in the Sun. Our discussion is based on a specific SIDM model in which DM particles self-interact via a light scalar mediator, or Yukawa potential, in the context of quantum mechanics. Within this framework, we calculate the self-capture rate across a broad region of parameter space. While the self-capture rate can be obtained separately in the Born regime with a perturbative method and in the classical limit with the Rutherford formula, our calculation covers the gap in between in a non-perturbative fashion. In addition, the phenomenology of both the Sommerfeld-enhanced s- and p-wave annihilation of the solar SIDM is included in our discussion. Moreover, by combining the analysis of the Super-Kamiokande (SK) data and the observed DM relic density, we constrain the nuclear capture rate of the DM particles in the presence of the dark Yukawa potential. The consequence of the long-range dark force for probing the solar SIDM turns out to be significant if the force carrier is much lighter than the DM particle, and a quantitative analysis is provided.

  13. Retinal Image Quality Assessment for Spaceflight-Induced Vision Impairment Study

    NASA Technical Reports Server (NTRS)

    Vu, Amanda Cadao; Raghunandan, Sneha; Vyas, Ruchi; Radhakrishnan, Krishnan; Taibbi, Giovanni; Vizzeri, Gianmarco; Grant, Maria; Chalam, Kakarla; Parsons-Wingerter, Patricia

    2015-01-01

    Long-term exposure to space microgravity poses significant risks for visual impairment. Evidence suggests such vision changes are linked to cephalad fluid shifts, prompting a need to directly quantify microgravity-induced retinal vascular changes. The quality of retinal images used for such vascular remodeling analysis, however, is dependent on imaging methodology. For our exploratory study, we hypothesized that retinal images captured using fluorescein imaging methodologies would be of higher quality in comparison to images captured without fluorescein. A semi-automated image quality assessment was developed using Vessel Generation Analysis (VESGEN) software and MATLAB® image analysis toolboxes. An analysis of ten images found that the fluorescein imaging modality provided a 36% increase in overall image quality (two-tailed p=0.089) in comparison to nonfluorescein imaging techniques.

  14. Target capture enrichment of nuclear SNP markers for massively parallel sequencing of degraded and mixed samples.

    PubMed

    Bose, Nikhil; Carlberg, Katie; Sensabaugh, George; Erlich, Henry; Calloway, Cassandra

    2018-05-01

    DNA from biological forensic samples can be highly fragmented and present in limited quantity. When DNA is highly fragmented, conventional PCR-based Short Tandem Repeat (STR) analysis may fail as primer binding sites may not be present on a single template molecule. Single Nucleotide Polymorphisms (SNPs) can serve as an alternative type of genetic marker for analysis of degraded samples because the targeted variation is a single base. However, conventional PCR-based SNP analysis methods still require intact primer binding sites for target amplification. Recently, probe capture methods for targeted enrichment have shown success in recovering degraded DNA as well as DNA from ancient bone samples using next-generation sequencing (NGS) technologies. The goal of this study was to design and test a probe capture assay targeting forensically relevant nuclear SNP markers for clonal and massively parallel sequencing (MPS) of degraded and limited DNA samples as well as mixtures. A set of 411 polymorphic markers totaling 451 nuclear SNPs (375 SNPs and 36 microhaplotype markers) was selected for the custom probe capture panel. The SNP markers were selected for a broad range of forensic applications including human individual identification, kinship, and lineage analysis as well as for mixture analysis. Performance of the custom SNP probe capture NGS assay was characterized by analyzing read depth and heterozygote allele balance across 15 samples at 25 ng input DNA. Performance thresholds were established based on read depth ≥500X and heterozygote allele balance within ±10% deviation from 50:50, which was observed for 426 out of 451 SNPs. These 426 SNPs were analyzed in size-selected samples (at ≤75 bp, ≤100 bp, ≤150 bp, ≤200 bp, and ≤250 bp) as well as mock degraded samples fragmented to an average of 150 bp. Samples selected for ≤75 bp exhibited 99-100% reportable SNPs across varied DNA amounts and as low as 0.5 ng. Mock degraded samples at 1 ng and 10 ng exhibited >90% reportable SNPs. Finally, two-person male-male mixtures were tested at 10 ng in varying contributor ratios. Overall, 85-100% of alleles unique to the minor contributor were observed at all mixture ratios. Results from these studies using the SNP probe capture NGS system demonstrate proof of concept for application to forensically relevant degraded and mixed DNA samples. Copyright © 2018 Elsevier B.V. All rights reserved.
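
    The reporting thresholds quoted above (read depth ≥500X and heterozygote allele balance within ±10% of 50:50) translate into a simple filter on per-SNP allele counts. The sketch below is illustrative only: the SnpCall record and the example counts are hypothetical and do not reflect the assay's actual output format.

```python
# Minimal sketch of applying the stated reporting thresholds to per-SNP allele
# counts. Data structure and example values are hypothetical placeholders.
from typing import NamedTuple

class SnpCall(NamedTuple):
    name: str
    ref_reads: int
    alt_reads: int

def passes_thresholds(call: SnpCall, min_depth=500, balance_window=0.10) -> bool:
    depth = call.ref_reads + call.alt_reads
    if depth < min_depth:
        return False
    if call.ref_reads and call.alt_reads:                 # heterozygous call
        balance = call.ref_reads / depth
        return abs(balance - 0.5) <= balance_window       # within +/-10% of 50:50
    return True                                           # homozygous: depth check only

calls = [SnpCall("rs_demo1", 520, 480), SnpCall("rs_demo2", 300, 150),
         SnpCall("rs_demo3", 700, 420)]
print([c.name for c in calls if passes_thresholds(c)])    # -> ['rs_demo1']
```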

  15. Diversity of abundance patterns of neutron-capture elements in very metal-poor stars

    NASA Astrophysics Data System (ADS)

    Aoki, Misa; Aoki, Wako; Ishimaru, Yuhri; Wanajo, Shinya

    2014-05-01

    Observations of very metal-poor stars indicate that there are at least two sites of the r-process: the "weak r-process" and the "main r-process". A question is whether these two are well separated or whether there exists a variation in the r-process. We present the results of an abundance analysis of neutron-capture elements in two very metal-poor stars, HD107752 and HD110184, in the Milky Way halo observed with the Subaru Telescope HDS. The abundance patterns show an overabundance of light neutron-capture elements (e.g., Sr, Y), suggestive of the yields of the weak r-process, while heavy neutron-capture elements (e.g., Ba, Eu) are deficient; however, the overabundance of the light elements is not as significant as that previously found in stars representing the weak r-process (e.g., HD122563; Honda et al. 2006). Our study shows diversity in the abundance patterns from light to heavy neutron-capture elements in VMP stars, suggesting a variation in the r-process that may depend on the electron fraction of the environment.

  16. A comparison of radiative capture with decay gamma-ray method in bore hole logging for economic minerals

    USGS Publications Warehouse

    Senftle, F.E.; Moxham, R.M.; Tanner, A.B.

    1972-01-01

    The recent availability of borehole logging sondes employing a source of neutrons and a Ge(Li) detector opens up the possibility of analyzing either decay or capture gamma rays. The most efficient method for a given element can be predicted by calculating the decay-to-capture count ratio for the most prominent peaks in the respective spectra. From a practical point of view, such a calculation must be slanted toward short irradiation and count times at each station in a borehole. A simplified method of computation is shown, and the decay-to-capture count ratio has been calculated and tabulated for the optimum value in the decay mode irrespective of the irradiation time, and also for a ten-minute irradiation time. Based on analysis of a single peak in each spectrum, the results indicate the preferred technique and the best decay or capture peak to observe for those elements of economic interest. © 1972.
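
    The decay-to-capture ratio idea can be illustrated with a simplified, textbook-style estimate based on the standard activation equation. This is not the paper's exact computation: the flux, cross sections, gamma yields, efficiency, and half-life below are placeholder numbers that would be replaced with nuclide-specific data for a real comparison.

```python
# Simplified comparison of expected counts in a prominent prompt-capture line
# versus a decay line, using the standard activation/decay relations.
# All numerical inputs are illustrative placeholders.
import math

def prompt_capture_counts(flux, n_atoms, sigma_cap_cm2, gamma_yield, eff, t_count):
    """Prompt gammas are emitted while the neutron source irradiates the formation."""
    return flux * n_atoms * sigma_cap_cm2 * gamma_yield * eff * t_count

def decay_counts(flux, n_atoms, sigma_act_cm2, gamma_yield, eff,
                 half_life, t_irr, t_count):
    """Delayed gammas: activate for t_irr, then count the decaying product."""
    lam = math.log(2) / half_life
    saturation = 1.0 - math.exp(-lam * t_irr)            # buildup during irradiation
    fraction_decaying = 1.0 - math.exp(-lam * t_count)   # decays occurring during the count
    n_activated = flux * n_atoms * sigma_act_cm2 * saturation / lam
    return n_activated * fraction_decaying * gamma_yield * eff

# Decay-to-capture count ratio for a ten-minute irradiation and ten-minute count.
barn = 1e-24
args = dict(flux=1e7, n_atoms=1e22, eff=1e-3, t_count=600)
cap = prompt_capture_counts(sigma_cap_cm2=5 * barn, gamma_yield=0.3, **args)
dec = decay_counts(sigma_act_cm2=5 * barn, gamma_yield=0.5,
                   half_life=3600, t_irr=600, **args)
print("decay-to-capture count ratio ~ %.2f" % (dec / cap))
```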

  17. Surface modification of a low cost bentonite for post-combustion CO2 capture

    NASA Astrophysics Data System (ADS)

    Chen, Chao; Park, Dong-Wha; Ahn, Wha-Seung

    2013-10-01

    A low cost bentonite was modified with PEI (polyethylenimine) through a physical impregnation method. Bentonite in its natural state and after amine modification were characterized by scanning electron microscopy-energy dispersive X-ray spectroscopy, X-ray diffraction, N2 adsorption-desorption isotherms, and investigated for CO2 capture using a thermogravimetric analysis unit connected to a flow panel. The effect of adsorption temperature, PEI loading and CO2 partial pressure on the CO2 capture performance of the PEI-modified bentonite was examined. A cyclic CO2 adsorption-desorption test was also carried out to assess the stability of PEI-modified bentonite as a CO2 adsorbent. Bentonite in its natural state showed negligible CO2 uptake. After amine modification, the CO2 uptake increased significantly due to CO2 capture by amine species introduced via chemisorption. The PEI-modified bentonites showed high CO2 capture selectivity over N2, and exhibited excellent stability in cyclic CO2 adsorption-desorption runs.

  18. Voltage and pace-capture mapping of linear ablation lesions overestimates chronic ablation gap size.

    PubMed

    O'Neill, Louisa; Harrison, James; Chubb, Henry; Whitaker, John; Mukherjee, Rahul K; Bloch, Lars Ølgaard; Andersen, Niels Peter; Dam, Høgni; Jensen, Henrik K; Niederer, Steven; Wright, Matthew; O'Neill, Mark; Williams, Steven E

    2018-04-26

    Conducting gaps in lesion sets are a major reason for failure of ablation procedures. Voltage mapping and pace-capture have been proposed for intra-procedural identification of gaps. We aimed to compare gap size measured acutely and chronically post-ablation to macroscopic gap size in a porcine model. Intercaval linear ablation was performed in eight Göttingen minipigs with a deliberate gap of ∼5 mm left in the ablation line. Gap size was measured by interpolating ablation contact force values between ablation tags and thresholding at a low force cut-off of 5 g. Bipolar voltage mapping and pace-capture mapping along the length of the line were performed immediately, and at 2 months, post-ablation. Animals were euthanized and gap sizes were measured macroscopically. Voltage thresholds to define scar were determined by receiver operating characteristic analysis as <0.56 mV (acutely) and <0.62 mV (chronically). Taking the macroscopic gap size as gold standard, errors in gap measurements were determined for voltage, pace-capture, and ablation contact force maps. All modalities overestimated chronic gap size, by 1.4 ± 2.0 mm (ablation contact force map), 5.1 ± 3.4 mm (pace-capture), and 9.5 ± 3.8 mm (voltage mapping). Errors in ablation contact force map gap measurements were significantly smaller than for voltage mapping (P = 0.003, Tukey's multiple comparisons test). Chronically, voltage mapping and pace-capture mapping overestimated macroscopic gap size by 11.9 ± 3.7 and 9.8 ± 3.5 mm, respectively. Bipolar voltage and pace-capture mapping overestimate the size of chronic gap formation in linear ablation lesions. The most accurate estimation of chronic gap size was achieved by analysis of catheter-myocardium contact force during ablation.
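
    The contact-force gap measurement can be illustrated in one dimension: interpolate force between ablation tags along the line and report the length of the span that falls below the 5 g cut-off. The tag positions and force values below are invented, and the study's maps were of course built on the actual 3-D ablation tag geometry rather than this 1-D simplification.

```python
# 1-D sketch of estimating gap length by interpolating contact force between
# ablation tags and thresholding at the 5 g cut-off. Positions and forces are
# hypothetical placeholders.
import numpy as np

tag_pos_mm = np.array([0, 4, 8, 12, 16, 20, 24, 28])       # tag positions along the line
tag_force_g = np.array([12, 10, 9, 2, 1, 8, 11, 13])       # contact force at each tag

line = np.linspace(tag_pos_mm.min(), tag_pos_mm.max(), 1000)
force = np.interp(line, tag_pos_mm, tag_force_g)            # linear interpolation along the line

below = force < 5.0                                          # low-force cut-off of 5 g
gap_mm = below.sum() * (line[1] - line[0])
print(f"estimated gap length: {gap_mm:.1f} mm")
```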

  19. Visually driven chaining of elementary swim patterns into a goal-directed motor sequence: a virtual reality study of zebrafish prey capture

    PubMed Central

    Trivedi, Chintan A.; Bollmann, Johann H.

    2013-01-01

    Prey capture behavior critically depends on rapid processing of sensory input in order to track, approach, and catch the target. When using vision, the nervous system faces the problem of extracting relevant information from a continuous stream of input in order to detect and categorize visible objects as potential prey and to select appropriate motor patterns for approach. For prey capture, many vertebrates exhibit intermittent locomotion, in which discrete motor patterns are chained into a sequence, interrupted by short periods of rest. Here, using high-speed recordings of full-length prey capture sequences performed by freely swimming zebrafish larvae in the presence of a single paramecium, we provide a detailed kinematic analysis of first and subsequent swim bouts during prey capture. Using Fourier analysis, we show that individual swim bouts represent an elementary motor pattern. Changes in orientation are directed toward the target on a graded scale and are implemented by an asymmetric tail bend component superimposed on this basic motor pattern. To further investigate the role of visual feedback on the efficiency and speed of this complex behavior, we developed a closed-loop virtual reality setup in which minimally restrained larvae recapitulated interconnected swim patterns closely resembling those observed during prey capture in freely moving fish. Systematic variation of stimulus properties showed that prey capture is initiated within a narrow range of stimulus size and velocity. Furthermore, variations in the delay and location of swim triggered visual feedback showed that the reaction time of secondary and later swims is shorter for stimuli that appear within a narrow spatio-temporal window following a swim. This suggests that the larva may generate an expectation of stimulus position, which enables accelerated motor sequencing if the expectation is met by appropriate visual feedback. PMID:23675322
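
    As a small illustration of the Fourier characterization of an elementary swim bout, the sketch below estimates the dominant tail-beat frequency of a synthetic bout. The damped 30 Hz oscillation and sampling rate are invented stand-ins, not the paper's kinematic data or analysis pipeline.

```python
# Minimal sketch of using an FFT to extract the dominant tail-beat frequency
# of a single swim bout from a tail-angle trace. The trace is synthetic.
import numpy as np

fs = 1000.0                                   # assumed high-speed sampling rate [Hz]
t = np.arange(0, 0.3, 1.0 / fs)               # a ~300 ms bout
tail_angle = np.exp(-t / 0.1) * np.sin(2 * np.pi * 30 * t)   # synthetic damped oscillation

spectrum = np.abs(np.fft.rfft(tail_angle - tail_angle.mean()))
freqs = np.fft.rfftfreq(len(tail_angle), d=1.0 / fs)
print("dominant tail-beat frequency: %.1f Hz" % freqs[np.argmax(spectrum)])
```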

  20. A simple novel device for air sampling by electrokinetic capture

    DOE PAGES

    Gordon, Julian; Gandhi, Prasanthi; Shekhawat, Gajendra; ...

    2015-12-27

    A variety of different sampling devices are currently available to acquire air samples for the study of the microbiome of the air. All have a degree of technical complexity that limits deployment. Here, we evaluate the use of a novel device, which has no technical complexity and is easily deployable. An air-cleaning device powered by electrokinetic propulsion has been adapted to provide a universal method for collecting samples of the aerobiome. Plasma-induced charge in aerosol particles causes propulsion to and capture on a counter-electrode. The flow of ions creates net bulk airflow, with no moving parts. A device and electrode assembly have been re-designed from air-cleaning technology to provide an average air flow of 120 lpm. This compares favorably with current air sampling devices based on physical air pumping. Capture efficiency was determined by comparison with a 0.4 μm polycarbonate reference filter, using fluorescent latex particles in a controlled environment chamber. Performance was compared with the same reference filter method in field studies in three different environments. For 23 common fungal species by quantitative polymerase chain reaction (qPCR), there was 100% sensitivity and apparent specificity of 87%, with the reference filter taken as “gold standard.” Further, bacterial analysis of 16S RNA by amplicon sequencing showed equivalent community structure captured by the electrokinetic device and the reference filter. Unlike other current air sampling methods, capture of particles is determined by charge and so is not controlled by particle mass. We analyzed particle sizes captured from air, without regard to specific analyte, by atomic force microscopy: particles at least as small as 100 nm could be captured from ambient air. This work introduces a very simple plug-and-play device that can sample air at a high-volume flow rate with no moving parts and collect particles down to the sub-micron range. In conclusion, the performance of the device is substantially equivalent to capture by pumping through a filter for microbiome analysis by quantitative PCR and amplicon sequencing.
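
    The sensitivity and apparent-specificity comparison described above amounts to scoring per-species detections from the device against the reference filter taken as the gold standard. The sketch below shows that bookkeeping on an invented detection list; it is not the study's 23-species panel or its data.

```python
# Sketch of sensitivity / apparent specificity of device detections scored
# against the reference filter as gold standard. Species lists are invented.
def sensitivity_specificity(device_hits, filter_hits, all_species):
    tp = sum(1 for s in all_species if s in filter_hits and s in device_hits)
    fn = sum(1 for s in all_species if s in filter_hits and s not in device_hits)
    tn = sum(1 for s in all_species if s not in filter_hits and s not in device_hits)
    fp = sum(1 for s in all_species if s not in filter_hits and s in device_hits)
    return tp / (tp + fn), tn / (tn + fp)

species = [f"fungus_{i}" for i in range(10)]
filter_hits = set(species[:6])                 # detected on the reference filter
device_hits = set(species[:6]) | {"fungus_7"}  # device detects the same plus one extra
sens, spec = sensitivity_specificity(device_hits, filter_hits, species)
print(f"sensitivity = {sens:.0%}, apparent specificity = {spec:.0%}")
```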

  1. A simple novel device for air sampling by electrokinetic capture

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gordon, Julian; Gandhi, Prasanthi; Shekhawat, Gajendra

    A variety of different sampling devices are currently available to acquire air samples for the study of the microbiome of the air. All have a degree of technical complexity that limits deployment. Here, we evaluate the use of a novel device, which has no technical complexity and is easily deployable. An air-cleaning device powered by electrokinetic propulsion has been adapted to provide a universal method for collecting samples of the aerobiome. Plasma-induced charge in aerosol particles causes propulsion to and capture on a counter-electrode. The flow of ions creates net bulk airflow, with no moving parts. A device and electrode assembly have been re-designed from air-cleaning technology to provide an average air flow of 120 lpm. This compares favorably with current air sampling devices based on physical air pumping. Capture efficiency was determined by comparison with a 0.4 μm polycarbonate reference filter, using fluorescent latex particles in a controlled environment chamber. Performance was compared with the same reference filter method in field studies in three different environments. For 23 common fungal species by quantitative polymerase chain reaction (qPCR), there was 100% sensitivity and apparent specificity of 87%, with the reference filter taken as “gold standard.” Further, bacterial analysis of 16S RNA by amplicon sequencing showed equivalent community structure captured by the electrokinetic device and the reference filter. Unlike other current air sampling methods, capture of particles is determined by charge and so is not controlled by particle mass. We analyzed particle sizes captured from air, without regard to specific analyte, by atomic force microscopy: particles at least as small as 100 nm could be captured from ambient air. This work introduces a very simple plug-and-play device that can sample air at a high-volume flow rate with no moving parts and collect particles down to the sub-micron range. In conclusion, the performance of the device is substantially equivalent to capture by pumping through a filter for microbiome analysis by quantitative PCR and amplicon sequencing.

  2. Robust Extraction and Multi-Technique Analysis of Micrometeoroids Captured in Low Earth Orbit

    NASA Technical Reports Server (NTRS)

    Westphal, A. J.; Graham, G. A.; Bench, G.; Brennan, S.; Luening, K.; Pianetta, P.; Keller, L. P.; Flynn, G. J.; Snead, C.; Dominquez, G.

    2003-01-01

    The use of low-density silica aerogel as the primary capture cell technology for the NASA Discovery mission Stardust to Comet Wild-2 [1] is a strong motivation for researchers within the Meteoritics community to develop techniques to handle this material. The unique properties of silica aerogel allow dust particles to be captured at hypervelocity speeds and to remain partially intact. The same unique properties present difficulties in the preparation of particles for analysis. Using tools borrowed from microbiologists, we have developed techniques for robustly extracting captured hypervelocity dust particles and their residues from aerogel collectors [2-3]. It is important not only to refine these extraction techniques but also to develop protocols for analyzing the captured particles. Since Stardust does not return material to Earth until 2006, researchers must either analyze particles that are impacted in the laboratory using light-gas-gun facilities [e.g., 4] or examine aerogel collectors that have been exposed in low-Earth orbit (LEO) [5]. While there are certainly benefits in laboratory shots, i.e., accelerating known compositions of projectiles into aerogel, the LEO-captured particles offer the opportunity to investigate real particles captured under real conditions. The aerogel collectors used in this research are part of the NASA Orbital Debris Collection Experiment that was exposed on the MIR Space Station for 18 months [5]. We have developed the capability at the UCB Space Sciences Laboratory to extract tiny volumes of aerogel that completely contain each impact event, and to mount them on micromachined fixtures so that they can be analyzed with no interfering support (Fig. 1). These aerogel keystones simultaneously bring the terminal particle and the particle track to within 10 μm (15 μg cm⁻²) of the nearest aerogel surface. The extracted aerogel wedges containing both the impact tracks and the captured particles have been characterized using the synchrotron total external reflection X-ray fluorescence (TXRF) microprobe at SSRL, the Nuclear Microprobe at LLNL, synchrotron infrared microscopy at the ALS facility at LBL and the NSLS at BNL, and the Total Reflection X-ray Fluorescence (TXRF) facility at SLAC.

  3. A simple novel device for air sampling by electrokinetic capture.

    PubMed

    Gordon, Julian; Gandhi, Prasanthi; Shekhawat, Gajendra; Frazier, Angel; Hampton-Marcell, Jarrad; Gilbert, Jack A

    2015-12-27

    A variety of different sampling devices are currently available to acquire air samples for the study of the microbiome of the air. All have a degree of technical complexity that limits deployment. Here, we evaluate the use of a novel device, which has no technical complexity and is easily deployable. An air-cleaning device powered by electrokinetic propulsion has been adapted to provide a universal method for collecting samples of the aerobiome. Plasma-induced charge in aerosol particles causes propulsion to and capture on a counter-electrode. The flow of ions creates net bulk airflow, with no moving parts. A device and electrode assembly have been re-designed from air-cleaning technology to provide an average air flow of 120 lpm. This compares favorably with current air sampling devices based on physical air pumping. Capture efficiency was determined by comparison with a 0.4 μm polycarbonate reference filter, using fluorescent latex particles in a controlled environment chamber. Performance was compared with the same reference filter method in field studies in three different environments. For 23 common fungal species by quantitative polymerase chain reaction (qPCR), there was 100% sensitivity and apparent specificity of 87%, with the reference filter taken as "gold standard." Further, bacterial analysis of 16S RNA by amplicon sequencing showed equivalent community structure captured by the electrokinetic device and the reference filter. Unlike other current air sampling methods, capture of particles is determined by charge and so is not controlled by particle mass. We analyzed particle sizes captured from air, without regard to specific analyte, by atomic force microscopy: particles at least as small as 100 nm could be captured from ambient air. This work introduces a very simple plug-and-play device that can sample air at a high-volume flow rate with no moving parts and collect particles down to the sub-micron range. The performance of the device is substantially equivalent to capture by pumping through a filter for microbiome analysis by quantitative PCR and amplicon sequencing.

  4. Capturing Nature's Diversity

    PubMed Central

    Pascolutti, Mauro; Campitelli, Marc; Nguyen, Bao; Pham, Ngoc; Gorse, Alain-Dominique; Quinn, Ronald J.

    2015-01-01

    Natural products are universally recognized to contribute valuable chemical diversity to the design of molecular screening libraries. The analysis undertaken in this work provides a foundation for the generation of fragment screening libraries that capture the diverse range of molecular recognition building blocks embedded within natural products. Physicochemical properties were used to select fragment-sized natural products from a database of known natural products (Dictionary of Natural Products). PCA was used to illustrate the positioning of the fragment subset within the property space of the non-fragment-sized natural products in the dataset. Structural diversity was analysed by three distinct methods: atom function analysis, using pharmacophore fingerprints; atom type analysis, using radial fingerprints; and scaffold analysis. Small pharmacophore triplets, representing the range of chemical features present in natural products that are capable of engaging in molecular interactions with small, contiguous areas of protein binding surfaces, were analysed. We demonstrate that fragment-sized natural products capture more than half of the small pharmacophore triplet diversity observed in non-fragment-sized natural product datasets. Atom type analysis using radial fingerprints was represented by a self-organizing map. We examined the structural diversity of non-flat fragment-sized natural product scaffolds, rich in sp3-configured centres. From these results we demonstrate that 2-ring fragment-sized natural products effectively balance the opposing characteristics of minimal complexity and broad structural diversity when compared to the larger, more complex fragment-like natural products. These naturally-derived fragments could be used as the starting point for the generation of a highly diverse library with the scope for further medicinal chemistry elaboration due to their minimal structural complexity. This study highlights the possibility of capturing a high proportion of the individual molecular interaction motifs embedded within natural products using a fragment screening library spanning 422 structural clusters and comprising approximately 2800 natural products. PMID:25902039
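
    The property-based selection and PCA projection described above can be sketched generically. In the example below the property matrix is random, the fragment cut-off (molecular weight ≤ 300) is an assumption made only for illustration, and the descriptor set is a guess; none of it reproduces the paper's actual descriptors or thresholds.

```python
# Illustrative sketch: filter a compound table by an assumed fragment-size
# cut-off, then project the standardized property matrix onto its first two
# principal components (PCA via SVD). All data are synthetic placeholders.
import numpy as np

rng = np.random.default_rng(7)
# columns: molecular weight, logP, H-bond donors, H-bond acceptors, rotatable bonds
props = np.column_stack([
    rng.normal(400, 150, 500), rng.normal(2.5, 1.5, 500),
    rng.poisson(2, 500), rng.poisson(4, 500), rng.poisson(5, 500),
]).astype(float)

is_fragment = props[:, 0] <= 300.0                      # assumed size cut-off

# PCA via SVD of the standardized property matrix.
z = (props - props.mean(axis=0)) / props.std(axis=0)
_, _, vt = np.linalg.svd(z, full_matrices=False)
scores = z @ vt[:2].T                                    # first two principal components

print("fragment-sized compounds:", int(is_fragment.sum()), "of", len(props))
print("fragment PC1/PC2 centroid:", scores[is_fragment].mean(axis=0).round(2))
```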

  5. New insights on Ba overabundance in open clusters. Evidence for the intermediate neutron-capture process at play?

    NASA Astrophysics Data System (ADS)

    Mishenina, T.; Pignatari, M.; Carraro, G.; Kovtyukh, V.; Monaco, L.; Korotin, S.; Shereta, E.; Yegorova, I.; Herwig, F.

    2015-02-01

    Recently, an increasing number of studies has been devoted to measuring the abundances of neutron-capture elements heavier than iron in stars belonging to Galactic Open Clusters (OCs). OCs span a sizeable range in metallicity (-0.6 ≤ [Fe/H] ≤ +0.4), and they show abundances of light elements similar to disc stars of the same age. A different pattern is observed for heavy elements. A large scatter is observed for Ba, with most OCs showing [Ba/Fe] and [Ba/La] overabundant with respect to the Sun. The origin of this overabundance is not clearly understood. With the goal of providing new observational insights, we determined radial velocities, atmospheric parameters and chemical compositions of 27 giant stars that are members of five OCs: Cr 110, Cr 261, NGC 2477, NGC 2506 and NGC 5822. We used high-resolution spectra obtained with the UVES spectrograph at the European Southern Observatory, Paranal. We perform a detailed spectroscopic analysis of these stars to measure the abundance of up to 22 elements per star. We study the dependence of element abundance on metallicity and age with unprecedented detail, complementing our analysis with data culled from the literature. We confirm the trend of Ba overabundance in OCs, and show its large dispersion for clusters younger than ~4 Gyr. Finally, the implications of our results for stellar nucleosynthesis are discussed. We show in this work that the Ba enrichment compared to other neutron-capture elements in OCs cannot be explained by the contributions from the slow neutron-capture process and the rapid neutron-capture process. Instead, we argue that this anomalous signature can be explained by assuming an additional contribution by the intermediate neutron-capture process.

  6. Pilot-Scale Silicone Process for Low-Cost Carbon Dioxide Capture Preliminary Techno-Economic Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Singh, Surinder; Spiry, Irina; Wood, Benjamin

    This report presents a system and economic analysis for a carbon-capture unit which uses an aminosilicone-based solvent for CO2 capture in a pulverized coal (PC) boiler. The aminosilicone solvent is a 60/40 wt/wt mixture of 3-aminopropyl end-capped polydimethylsiloxane (GAP-1m) with triethylene glycol (TEG) as a co-solvent. For comparison purposes, the report also shows results for a carbon-capture unit based on a conventional approach using monoethanolamine (MEA). The first-year removal cost of CO2 for the aminosilicone-based carbon-capture process is $46.04/ton of CO2 as compared to $60.25/ton of CO2 when MEA is used. The aminosilicone-based process has <77% of the CAPEX of a system using MEA solvent. The lower CAPEX is due to several factors, including the higher working capacity of the aminosilicone solvent compared to MEA, which reduces the solvent flow rate required, reducing equipment sizes. If it is determined that carbon steel can be used in the rich-lean heat exchanger in the carbon capture unit, the first-year removal cost of CO2 decreases to $44.12/ton. The aminosilicone-based solvent has a higher thermal stability than MEA, allowing desorption to be conducted at higher temperatures and pressures, decreasing the number of compressor stages needed. The aminosilicone-based solvent also has a lower vapor pressure, allowing the desorption to be conducted in a continuous-stirred tank reactor versus a more expensive packed column. The aminosilicone-based solvent has a lower heat capacity, which decreases the heat load on the desorber. In summary, the aminosilicone solvent has significant advantages over conventional systems using MEA.

  7. Near-Ultraviolet Observations of CS 29497-030: New Constraints on Neutron-Capture Nucleosynthesis Processes

    NASA Astrophysics Data System (ADS)

    Ivans, Inese I.; Sneden, Christopher; Gallino, Roberto; Cowan, John J.; Preston, George W.

    2005-07-01

    Employing spectra obtained with the new Keck I HIRES near-UV-sensitive detector, we have performed a comprehensive chemical composition analysis of the binary blue metal-poor star CS 29497-030. Abundances for 29 elements and upper limits for an additional seven have been derived, concentrating on elements largely produced by means of neutron-capture nucleosynthesis. Included in our analysis are the two elements that define the termination point of the slow neutron-capture process, lead and bismuth. We determine an extremely high value of [Pb/Fe] = +3.65 ± 0.07 (σ=0.13) from three features, supporting the single-feature result obtained in previous studies. We detect Bi for the first time in a metal-poor star. Our derived Bi/Pb ratio is in accord with those predicted from the most recent FRANEC calculations of the slow neutron-capture process in low-mass asymptotic giant branch (AGB) stars. We find that the neutron-capture elemental abundances of CS 29497-030 are best explained by an AGB model that also includes very significant amounts of pre-enrichment of rapid neutron-capture process material in the protostellar cloud out of which the CS 29497-030 binary system formed. Mass transfer is consistent with the observed [Nb/Zr]~0. Thus, CS 29497-030 is both an r+s and "extrinsic AGB" star. Furthermore, we find that the mass of the AGB model can be further constrained by the abundance of the light odd-element Na. The data presented herein were obtained at the W. M. Keck Observatory, which is operated as a scientific partnership among the California Institute of Technology, the University of California, and NASA. The Observatory was made possible by the generous financial support of the W. M. Keck Foundation.

  8. Radiative neutron capture on 242Pu in the resonance region at the CERN n_TOF-EAR1 facility

    NASA Astrophysics Data System (ADS)

    Lerendegui-Marco, J.; Guerrero, C.; Mendoza, E.; Quesada, J. M.; Eberhardt, K.; Junghans, A. R.; Krtička, M.; Aberle, O.; Andrzejewski, J.; Audouin, L.; Bécares, V.; Bacak, M.; Balibrea, J.; Barbagallo, M.; Barros, S.; Bečvář, F.; Beinrucker, C.; Berthoumieux, E.; Billowes, J.; Bosnar, D.; Brugger, M.; Caamaño, M.; Calviño, F.; Calviani, M.; Cano-Ott, D.; Cardella, R.; Casanovas, A.; Castelluccio, D. M.; Cerutti, F.; Chen, Y. H.; Chiaveri, E.; Colonna, N.; Cortés, G.; Cortés-Giraldo, M. A.; Cosentino, L.; Damone, L. A.; Diakaki, M.; Dietz, M.; Domingo-Pardo, C.; Dressler, R.; Dupont, E.; Durán, I.; Fernández-Domínguez, B.; Ferrari, A.; Ferreira, P.; Finocchiaro, P.; Furman, V.; Göbel, K.; García, A. R.; Gawlik, A.; Glodariu, T.; Gonçalves, I. F.; González-Romero, E.; Goverdovski, A.; Griesmayer, E.; Gunsing, F.; Harada, H.; Heftrich, T.; Heinitz, S.; Heyse, J.; Jenkins, D. G.; Jericha, E.; Käppeler, F.; Kadi, Y.; Katabuchi, T.; Kavrigin, P.; Ketlerov, V.; Khryachkov, V.; Kimura, A.; Kivel, N.; Kokkoris, M.; Leal-Cidoncha, E.; Lederer, C.; Leeb, H.; Lo Meo, S.; Lonsdale, S. J.; Losito, R.; Macina, D.; Marganiec, J.; Martínez, T.; Massimi, C.; Mastinu, P.; Mastromarco, M.; Matteucci, F.; Maugeri, E. A.; Mengoni, A.; Milazzo, P. M.; Mingrone, F.; Mirea, M.; Montesano, S.; Musumarra, A.; Nolte, R.; Oprea, A.; Patronis, N.; Pavlik, A.; Perkowski, J.; Porras, J. I.; Praena, J.; Rajeev, K.; Rauscher, T.; Reifarth, R.; Riego-Perez, A.; Rout, P. C.; Rubbia, C.; Ryan, J. A.; Sabaté-Gilarte, M.; Saxena, A.; Schillebeeckx, P.; Schmidt, S.; Schumann, D.; Sedyshev, P.; Smith, A. G.; Stamatopoulos, A.; Tagliente, G.; Tain, J. L.; Tarifeño-Saldivia, A.; Tassan-Got, L.; Tsinganis, A.; Valenta, S.; Vannini, G.; Variale, V.; Vaz, P.; Ventura, A.; Vlachoudis, V.; Vlastou, R.; Wallner, A.; Warren, S.; Weigand, M.; Weiss, C.; Wolf, C.; Woods, P. J.; Wright, T.; Žugec, P.; n TOF Collaboration

    2018-02-01

    The spent fuel of current nuclear reactors contains fissile plutonium isotopes that can be combined with uranium to make mixed oxide (MOX) fuel. In this way the Pu from spent fuel is used in a new reactor cycle, contributing to the long-term sustainability of nuclear energy. However, an extensive use of MOX fuels, in particular in fast reactors, requires more accurate capture and fission cross sections for some Pu isotopes. In the case of 242Pu there are sizable discrepancies among the existing capture cross-section measurements included in the evaluations (all from the 1970s) resulting in an uncertainty as high as 35% in the fast energy region. Moreover, postirradiation experiments evaluated with JEFF-3.1 indicate an overestimation of 14% in the capture cross section in the fast neutron energy region. In this context, the Nuclear Energy Agency (NEA) requested an accuracy of 8% in this cross section in the energy region between 500 meV and 500 keV. This paper presents a new time-of-flight capture measurement on 242Pu carried out at n_TOF-EAR1 (CERN), focusing on the analysis and statistical properties of the resonance region, below 4 keV. The 242Pu(n ,γ ) reaction on a sample containing 95(4) mg enriched to 99.959% was measured with an array of four C6D6 detectors and applying the total energy detection technique. The high neutron energy resolution of n_TOF-EAR1 and the good statistics accumulated have allowed us to extend the resonance analysis up to 4 keV, obtaining new individual and average resonance parameters from a capture cross section featuring a systematic uncertainty of 5%, fulfilling the request of the NEA.

  9. Neutron radiative capture methods for surface elemental analysis

    USGS Publications Warehouse

    Trombka, J.I.; Senftle, F.; Schmadebeck, R.

    1970-01-01

    Both an accelerator and a 252Cf neutron source have been used to induce characteristic gamma radiation from extended soil samples. To demonstrate the method, measurements of the neutron-induced radiative capture and activation gamma rays have been made with both Ge(Li) and NaI(Tl) detectors. Because of the possible application to space flight geochemical analysis, it is believed that NaI(Tl) detectors must be used. Analytical procedures have been developed to obtain both qualitative and semiquantitative results from an interpretation of the measured NaI(Tl) pulse-height spectrum. Experimental results and the analytic procedure are presented. © 1970.

  10. Global-Local Finite Element Analysis for Thermo-Mechanical Stresses in Bonded Joints

    NASA Technical Reports Server (NTRS)

    Shkarayev, S.; Madenci, Erdogan; Camarda, C. J.

    1997-01-01

    An analysis of adhesively bonded joints using conventional finite elements does not capture the singular behavior of the stress field in regions where two or three dissimilar materials form a junction with or without free edges. However, these regions are characteristic of the bonded joints and are prone to failure initiation. This study presents a method to capture the singular stress field arising from the geometric and material discontinuities in bonded composites. It is achieved by coupling the local (conventional) elements with global (special) elements whose interpolation functions are constructed from the asymptotic solution.

  11. Analysis and Perspective from the Complex Aerospace Systems Exchange (CASE) 2013

    NASA Technical Reports Server (NTRS)

    Jones, Kennie H.; Parker, Peter A.; Detweiler, Kurt N.; McGowan, Anna-Maria R.; Dress, David A.; Kimmel, William M.

    2014-01-01

    NASA Langley Research Center embedded four rapporteurs at the Complex Aerospace Systems Exchange (CASE) held in August 2013 with the objective of capturing the essence of the conference presentations and discussions. CASE was established to provide a discussion forum among chief engineers, program managers, and systems engineers on challenges in the engineering of complex aerospace systems. The meeting consists of invited presentations and panels from industry, academia, and government followed by discussions among attendees. This report presents the major and recurring themes captured throughout the meeting and provides analysis and insights to further the CASE mission.

  12. Measurement of cardiac output using improved chromatographic analysis of sulfur hexafluoride (SF6).

    PubMed

    Klocke, F J; Roberts, D L; Farhi, E R; Naughton, B J; Sekovski, B; Klocke, R A

    1977-06-01

    A constant current variable frequency pulsed electron capture detector has been incorporated into the gas chromatographic analysis of trace amounts of sulfur hexafluoride (SF6) in water and blood. The resulting system offers a broader effective operating range than more conventional electron capture units and has been utilized for measurements of cardiac output employing constant-rate infusion of dissolved SF6. The SF6 technique has been validated against direct volumetric measurements of cardiac output in a canine right-heart bypass preparation and used subsequently for rapidly repeated measurements in conscious animals and man.

  13. Design Rules and Analysis of a Capture Mechanism for Rendezvous between a Space Tether and Payload

    NASA Technical Reports Server (NTRS)

    Sorensen, Kirk F.; Canfield, Stephen L.; Norris, Marshall A.

    2006-01-01

    Momentum-exchange/electrodynamic reboost (MXER) tether systems have been proposed to serve as an "upper stage in space". A MXER tether station would boost spacecraft from low Earth orbit to a high-energy orbit quickly, like a high-thrust rocket. Then, it would slowly rebuild its orbital momentum through electrodynamic thrust, minimizing the use of propellant. One of the primary challenges in developing a momentum-exchange/electrodynamic reboost tether system, as identified by the 2003 MXER Technology Assessment Group, is the development of a mechanism that will enable the processes of capture, carry, and release of a payload by the rotating tether, as required by the MXER tether approach. This paper presents a concept that achieves the desired goals of the capture system. The solution is a multi-DOF (degree-of-freedom) capture mechanism with nearly passive operation that features matching of the capture space to the expected window of capture error, efficient use of mass, and nearly passive actuation during the capture process. This paper describes the proposed capture mechanism concept and provides an evaluation of the concept through a dynamic model and experimental tests performed on a prototype article of the mechanism in a dynamically similar environment. It also develops a set of rules to guide the design of such a capture mechanism based on analytical and experimental analyses. The primary contributions of this paper are a description of the proposed capture mechanism concept, a collection of rules to guide its design, and empirical and model information that can be used to evaluate the capability of the concept.

  14. Neutron Capture Gamma-Ray Libraries for Nuclear Applications

    NASA Astrophysics Data System (ADS)

    Sleaford, B. W.; Firestone, R. B.; Summers, N.; Escher, J.; Hurst, A.; Krticka, M.; Basunia, S.; Molnar, G.; Belgya, T.; Revay, Z.; Choi, H. D.

    2011-06-01

    The neutron capture reaction is useful in identifying and analyzing the gamma-ray spectrum from an unknown assembly as it gives unambiguous information on its composition. This can be done passively or actively where an external neutron source is used to probe an unknown assembly. There are known capture gamma-ray data gaps in the ENDF libraries used by transport codes for various nuclear applications. The Evaluated Gamma-ray Activation file (EGAF) is a new thermal neutron capture database of discrete line spectra and cross sections for over 260 isotopes that was developed as part of an IAEA Coordinated Research Project. EGAF is being used to improve the capture gamma production in ENDF libraries. For medium to heavy nuclei the quasi continuum contribution to the gamma cascades is not experimentally resolved. The continuum contains up to 90% of all the decay energy and is modeled here with the statistical nuclear structure code DICEBOX. This code also provides a consistency check of the level scheme nuclear structure evaluation. The calculated continuum is of sufficient accuracy to include in the ENDF libraries. This analysis also determines new total thermal capture cross sections and provides an improved RIPL database. For higher energy neutron capture there is less experimental data available making benchmarking of the modeling codes more difficult. We are investigating the capture spectra from higher energy neutrons experimentally using surrogate reactions and modeling this with Hauser-Feshbach codes. This can then be used to benchmark CASINO, a version of DICEBOX modified for neutron capture at higher energy. This can be used to simulate spectra from neutron capture at incident neutron energies up to 20 MeV to improve the gamma-ray spectrum in neutron data libraries used for transport modeling of unknown assemblies.

  15. Capture compound mass spectrometry sheds light on the molecular mechanisms of liver toxicity of two Parkinson drugs.

    PubMed

    Fischer, Jenny J; Michaelis, Simon; Schrey, Anna K; Graebner, Olivia Graebner nee; Glinski, Mirko; Dreger, Mathias; Kroll, Friedrich; Koester, Hubert

    2010-01-01

    Capture compound mass spectrometry (CCMS) is a novel technology that helps in understanding the molecular mechanism of the mode of action of small molecules. The Capture Compounds are trifunctional probes: a selectivity function (the drug) interacts with the proteins in a biological sample, a reactivity function (phenylazide) irreversibly forms a covalent bond, and a sorting function (biotin) allows the captured protein(s) to be isolated for mass spectrometric analysis. Tolcapone and entacapone are potent inhibitors of catechol-O-methyltransferase (COMT) for the treatment of Parkinson's disease. We aimed to understand the molecular basis of the difference between the two drugs with respect to side effects. Using Capture Compounds with these drugs as selectivity functions, we were able to unambiguously and reproducibly isolate and identify their known target COMT. Tolcapone Capture Compounds captured five times more proteins than entacapone Capture Compounds. Moreover, tolcapone Capture Compounds isolated mitochondrial and peroxisomal proteins. The major tolcapone-protein interactions occurred with components of the respiratory chain and of fatty acid beta-oxidation. Previously reported symptoms in tolcapone-treated rats suggested that tolcapone might act as a decoupling reagent of the respiratory chain (Haasio et al., 2002b). Our results demonstrate that CCMS is an effective tool for the identification of a drug's potential off-targets. It fills a gap in currently used in vitro screens for drug profiling, which do not contain all the toxicologically relevant proteins. Thereby, CCMS has the potential to fill a technological need in drug safety assessment and helps to reengineer or reject drugs at an early preclinical stage.

  16. On the problem of origin of periodic comets.

    NASA Astrophysics Data System (ADS)

    Guliev, A. S.

    The problem of the origin of periodic comets is viewed under various aspects. A steady growth of the fraction of these comets in the overall population of comets is emphasized. The number of discovered periodic comets with small eccentricities and with a Jacobi constant close to 3 is also growing with time. Comparison of the maximum magnitudes of the same comets in different apparitions at the same elongations, as well as the analysis of exhausted comets, indicates that the age of these objects does not exceed 1000 years. Capture is considered an efficient mechanism for preserving equilibrium over reasonable time intervals. The analysis of the data given by Everhart and calculations of the evolution of cometary orbits reveal the low efficiency of capture. Comparison of the number of well-established capture cases with the corresponding time interval shows that, within the framework of this mechanism, the age of the system of periodic comets would have to be 17000 years, which is most unlikely. Secular variations in the distributions of semimajor axes, inclinations, longitudes of perihelia, and eccentricities of the orbits of periodic comets are analysed. On average, the eccentricities tend to increase, but this conflicts with the capture mechanism. It is concluded that the concept of capture, in its classical and modern versions, is unable to solve the problem of the origin of periodic comets as a whole. Other, more effective sources and mechanisms seem to be also in operation in enlarging the cometary system.
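    For context (standard celestial-mechanics background, not part of the abstract itself), the remark about a Jacobi constant close to 3 is usually phrased through the Tisserand parameter with respect to Jupiter, which approximates the Jacobi constant of the circular restricted three-body problem:

```latex
% Tisserand parameter with respect to Jupiter (semimajor axis a_J); it is
% approximately conserved through a close encounter and is numerically close
% to the Jacobi constant C_J of the circular restricted three-body problem.
T_J = \frac{a_J}{a} + 2\cos i \,\sqrt{\frac{a\,(1 - e^{2})}{a_J}} \;\approx\; C_J
```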

  17. Extended Field Laser Confocal Microscopy (EFLCM): Combining automated Gigapixel image capture with in silico virtual microscopy

    PubMed Central

    Flaberg, Emilie; Sabelström, Per; Strandh, Christer; Szekely, Laszlo

    2008-01-01

    Background Confocal laser scanning microscopy has revolutionized cell biology. However, the technique has major limitations in speed and sensitivity due to the fact that a single laser beam scans the sample, allowing only a few microseconds of signal collection for each pixel. This limitation has been overcome by the introduction of parallel beam illumination techniques in combination with cold CCD camera based image capture. Methods Using the combination of microlens enhanced Nipkow spinning disc confocal illumination together with fully automated image capture and large scale in silico image processing we have developed a system allowing the acquisition, presentation and analysis of maximum resolution confocal panorama images of several gigapixels in size. We call the method Extended Field Laser Confocal Microscopy (EFLCM). Results We show using the EFLCM technique that it is possible to create a continuous confocal multi-colour mosaic from thousands of individually captured images. EFLCM can digitize and analyze histological slides, sections of entire rodent organs and full size embryos. It can also record hundreds of thousands of cultured cells at multiple wavelengths in single event or time-lapse fashion on fixed slides, in live cell imaging chambers or microtiter plates. Conclusion The observer independent image capture of EFLCM allows quantitative measurements of fluorescence intensities and morphological parameters on a large number of cells. EFLCM therefore bridges the gap between the mainly illustrative fluorescence microscopy and purely quantitative flow cytometry. EFLCM can also be used as a high content analysis (HCA) instrument for automated screening processes. PMID:18627634

  18. Capture and X-ray diffraction studies of protein microcrystals in a microfluidic trap array

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lyubimov, Artem Y.; Stanford University, Stanford, CA 94305

    A microfluidic platform has been developed for the capture and X-ray analysis of protein microcrystals, affording a means to improve the efficiency of XFEL and synchrotron experiments. X-ray free-electron lasers (XFELs) promise to enable the collection of interpretable diffraction data from samples that are refractory to data collection at synchrotron sources. At present, however, more efficient sample-delivery methods that minimize the consumption of microcrystalline material are needed to allow the application of XFEL sources to a wide range of challenging structural targets of biological importance. Here, a microfluidic chip is presented in which microcrystals can be captured at fixed, addressable points in a trap array from a small volume (<10 µl) of a pre-existing slurry grown off-chip. The device can be mounted on a standard goniostat for conducting diffraction experiments at room temperature without the need for flash-cooling. Proof-of-principle tests with a model system (hen egg-white lysozyme) demonstrated the high efficiency of the microfluidic approach for crystal harvesting, permitting the collection of sufficient data from only 265 single-crystal still images to permit determination and refinement of the structure of the protein. This work shows that microfluidic capture devices can be readily used to facilitate data collection from protein microcrystals grown in traditional laboratory formats, enabling analysis when cryopreservation is problematic or when only small numbers of crystals are available. Such microfluidic capture devices may also be useful for data collection at synchrotron sources.

  19. Extended Field Laser Confocal Microscopy (EFLCM): combining automated Gigapixel image capture with in silico virtual microscopy.

    PubMed

    Flaberg, Emilie; Sabelström, Per; Strandh, Christer; Szekely, Laszlo

    2008-07-16

    Confocal laser scanning microscopy has revolutionized cell biology. However, the technique has major limitations in speed and sensitivity due to the fact that a single laser beam scans the sample, allowing only a few microseconds of signal collection for each pixel. This limitation has been overcome by the introduction of parallel beam illumination techniques in combination with cold CCD camera based image capture. Using the combination of microlens enhanced Nipkow spinning disc confocal illumination together with fully automated image capture and large scale in silico image processing we have developed a system allowing the acquisition, presentation and analysis of maximum resolution confocal panorama images of several gigapixels in size. We call the method Extended Field Laser Confocal Microscopy (EFLCM). We show using the EFLCM technique that it is possible to create a continuous confocal multi-colour mosaic from thousands of individually captured images. EFLCM can digitize and analyze histological slides, sections of entire rodent organs and full size embryos. It can also record hundreds of thousands of cultured cells at multiple wavelengths in single event or time-lapse fashion on fixed slides, in live cell imaging chambers or microtiter plates. The observer independent image capture of EFLCM allows quantitative measurements of fluorescence intensities and morphological parameters on a large number of cells. EFLCM therefore bridges the gap between the mainly illustrative fluorescence microscopy and purely quantitative flow cytometry. EFLCM can also be used as a high content analysis (HCA) instrument for automated screening processes.

  20. Molecular detection of Leishmania (Leishmania) infantum in phlebotomine sandflies from a visceral leishmaniasis endemic area in northwestern of São Paulo State, Brazil.

    PubMed

    Dos Santos Brighente, Kate Bastos; Cutolo, Andre Antonio; Motoie, Gabriela; da Silva Meira-Strejevitch, Cristina; Pereira-Chioccola, Vera Lucia

    2018-05-01

    This study identified the natural infection rate of Leishmania (Leishmania) infantum in Lutzomyia longipalpis sandflies collected in a neighborhood around a kennel in Dracena, in the northwest of São Paulo state. This region is highly endemic for visceral leishmaniasis in Brazil. Insects were captured during 2-3 nights monthly for 11 months (January-November 2012) using 10 automatic light traps around a kennel in a transition between a periurban and an urban neighborhood. The captures aimed at determining the minimal infection rate (MIR) in the area. A total of 1690 Lu. longipalpis were captured during the study period. Of these, 292 (17.25%) were females; they were grouped in 165 pools containing one to five insects for DNA extraction and PCR analysis. Positive results for L. (L.) infantum in conventional PCR and real-time PCR were obtained in 7.28% (12/165) and 4.85% (8/165) of the analyses, respectively. These data confirm that Lu. longipalpis captured in the study area were infected by L. (L.) infantum. The MIR of sandflies during the 11 months of captures was 4.10%, based on the total of 292 female sandflies collected. A high DNA concentration of L. (L.) infantum was detected in sandflies especially in the kennel, chicken coop and neighboring houses, where a higher abundance of hosts serving as blood sources was present. Copyright © 2018 Elsevier B.V. All rights reserved.
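    For clarity, the minimal infection rate quoted above follows the standard definition of positive pools divided by the total number of specimens tested; a quick check with the numbers reported in the abstract reproduces the figure (up to rounding):

```python
def minimal_infection_rate(positive_pools: int, specimens_tested: int) -> float:
    """MIR as a percentage: positive pools per total number of specimens tested."""
    return 100.0 * positive_pools / specimens_tested

# Values from the abstract: 12 positive pools (conventional PCR) out of
# 292 female Lu. longipalpis tested.
print(f"MIR = {minimal_infection_rate(12, 292):.2f}%")   # about 4.1%, consistent with the reported MIR
```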

  1. Geometric rectification of camera-captured document images.

    PubMed

    Liang, Jian; DeMenthon, Daniel; Doermann, David

    2008-04-01

    Compared to typical scanners, handheld cameras offer convenient, flexible, portable, and non-contact image capture, which enables many new applications and breathes new life into existing ones. However, camera-captured documents may suffer from distortions caused by non-planar document shape and perspective projection, which lead to failure of current OCR technologies. We present a geometric rectification framework for restoring the frontal-flat view of a document from a single camera-captured image. Our approach estimates 3D document shape from texture flow information obtained directly from the image without requiring additional 3D/metric data or prior camera calibration. Our framework provides a unified solution for both planar and curved documents and can be applied in many, especially mobile, camera-based document analysis applications. Experiments show that our method produces results that are significantly more OCR compatible than the original images.

  2. Automated On-tip Affinity Capture Coupled with Mass Spectrometry to Characterize Intact Antibody-Drug Conjugates from Blood

    NASA Astrophysics Data System (ADS)

    Li, Ke Sherry; Chu, Phillip Y.; Fourie-O'Donohue, Aimee; Srikumar, Neha; Kozak, Katherine R.; Liu, Yichin; Tran, John C.

    2018-05-01

    Antibody-drug conjugates (ADCs) present unique challenges for ligand-binding assays primarily due to the dynamic changes of the drug-to-antibody ratio (DAR) distribution in vivo and in vitro. Here, an automated on-tip affinity capture platform with subsequent mass spectrometry analysis was developed to accurately characterize the DAR distribution of ADCs from biological matrices. A variety of elution buffers were tested to offer optimal recovery, with trastuzumab serving as a surrogate to the ADCs. High assay repeatability (CV 3%) was achieved for trastuzumab antibody when captured below the maximal binding capacity of 7.5 μg. Efficient on-tip deglycosylation was also demonstrated in 1 h followed by affinity capture. Moreover, this tip-based platform affords higher throughput for DAR characterization when compared with a well-characterized bead-based method.

  3. A Benefit-Risk Analysis Approach to Capture Regulatory Decision-Making: Multiple Myeloma.

    PubMed

    Raju, G K; Gurumurthi, Karthik; Domike, Reuben; Kazandjian, Dickran; Landgren, Ola; Blumenthal, Gideon M; Farrell, Ann; Pazdur, Richard; Woodcock, Janet

    2018-01-01

    Drug regulators around the world make decisions about drug approvability based on qualitative benefit-risk analysis. In this work, a quantitative benefit-risk analysis approach captures regulatory decision-making about new drugs to treat multiple myeloma (MM). MM assessments have been based on endpoints such as time to progression (TTP), progression-free survival (PFS), and objective response rate (ORR), which differ from a benefit-risk analysis based on overall survival (OS). Twenty-three FDA decisions on MM drugs submitted to the FDA between 2003 and 2016 were identified and analyzed. The benefits and risks were quantified relative to comparators (typically the control arm of the clinical trial) to estimate whether the median benefit-risk was positive or negative. A sensitivity analysis was demonstrated using ixazomib to explore the magnitude of uncertainty. FDA approval decision outcomes were consistent and logical using this benefit-risk framework. © 2017 American Society for Clinical Pharmacology and Therapeutics.

  4. Analysis of daylight performance of solar light pipes influenced by size and shape of sunlight captures

    NASA Astrophysics Data System (ADS)

    Wu, Yanpeng; Jin, Rendong; Zhang, Wenming; Liu, Li; Zou, Dachao

    2009-11-01

    Experimental investigations were carried out on three sunlight capture devices with diameters of 150 mm, 212 mm and 300 mm under sunny, cloudy and overcast conditions, and on two solar light pipes with diameters of 360 mm and 160 mm under sunny conditions. The illuminance at the center of a sunlight capture device is related to its size, but the relationship is not linear. To improve the efficiency of solar light pipes, the structure and performance of the sunlight capture device must be enhanced. For example, in the University of Science and Technology Beijing Gymnasium, the venue of the Beijing 2008 Olympic judo and taekwondo events, 148 solar light pipes with a diameter of 530 mm each were installed, and two sunlight capture devices of different shapes were installed and tested. The measured illuminance on the work plane of the gymnasium shows that the improved sunlight capture devices perform better when their size is increased and the internal surface is machined at the same time, so that refraction increases and the efficiency of the solar light pipes improves. Better supplementary lighting for the gymnasium has thus been achieved.

  5. Integrated Device for Circulating Tumor Cell Capture, Characterization and Lens-Free Microscopy

    DTIC Science & Technology

    2012-08-01

    peripheral blood of breast cancer patients indicates high metastatic potential and increased morbidity. Development of a cost-effective CTC detection and...microfilter platform captures CTC from the cancer patients’ blood cost-effectively, where the larger CTC are preferentially retained on the membrane...development of a cost-effective and high-throughput CTC analysis system would revolutionize the field of CTC detection, prognosis, and therapeutic

  6. Recognition Using Biospecific Interaction Analysis

    DTIC Science & Technology

    1991-08-01

    U.S. Army Chemical Research, Development and Engineering Center, ATTN: SMCCR-SPS-T, Aberdeen Proving Ground, MD 21010-5423. However, the Defense... Capture of mAb 8A3 and Challenge with Entamoeba histolytica Pathogenic (P2) and Nonpathogenic (NP2) Antigenic Preparations... Capture of mAb 8A3 and Challenge with Entamoeba histolytica Pathogenic Antigen (P2) at a Minimal Dilution...

  7. First measurement of the vector analyzing power in muon capture by polarized muonic ³He

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cummings, W.J.; Behr, J.; Bogorad, P.

    1995-09-01

    This paper describes the first measurement of spin observables in nuclear muon capture by ³He. The sensitivity of spin observables to the pseudoscalar coupling is described. The triton asymmetry presented has to be corrected for small systematic effects in order to extract the vector analyzing power. The analysis of these effects is currently underway.

  8. Single step synthesis of nanostructured boron nitride for boron neutron capture therapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Singh, Bikramjeet; Singh, Paviter; Kumar, Akshay, E-mail: akshaykumar.tiet@gmail.com

    2015-05-15

    Nanostructured boron nitride (BN) has been successfully synthesized by carbothermic reduction of boric acid (H3BO3). This method is a relatively low-temperature synthesis route and can be used for large-scale production of nanostructured BN. The synthesized nanoparticles have been characterized by X-ray diffraction (XRD), scanning electron microscopy (SEM) and differential thermal analysis (DTA). XRD analysis confirmed the formation of single-phase nanostructured boron nitride. SEM analysis showed that the particles are spherical in shape. DTA analysis showed that the phase is stable up to 900 °C and that the material can be used for high-temperature applications as well as for boron neutron capture therapy (BNCT).

  9. Proceedings of a workshop on digital mapping techniques; methods for geologic map data capture, management, and publication - June 2 - 5, 1997, Lawrence, Kansas

    USGS Publications Warehouse

    Soller, David R.

    1997-01-01

    Introduction: From June 2-5, 1997, selected technical representatives of the USGS and State geological surveys participated in the 'AASG/USGS Digital Mapping Techniques' workshop in Lawrence, Kansas. The workshop was initiated by the AASG/USGS Data Capture Working Group, and was hosted by the Kansas Geological Survey (KGS). With a focus on methods for data capture and digital map production, the goal was to help move the state surveys and the USGS toward development of more cost-effective, flexible, and useful systems for digital mapping and GIS analysis.

  10. Individual heterogeneity and identifiability in capture-recapture models

    USGS Publications Warehouse

    Link, W.A.

    2004-01-01

    Individual heterogeneity in detection probabilities is a far more serious problem for capture-recapture modeling than has previously been recognized. In this note, I illustrate that population size is not an identifiable parameter under the general closed population mark-recapture model Mh. The problem of identifiability is obvious if the population includes individuals with p_i = 0, but persists even when it is assumed that individual detection probabilities are bounded away from zero. Identifiability may be attained within parametric families of distributions for p_i, but not among parametric families of distributions. Consequently, in the presence of individual heterogeneity in detection probability, capture-recapture analysis is strongly model dependent.
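    As a toy illustration of why heterogeneity matters (a simulation sketch of the general phenomenon, not an analysis from this note), the two-occasion Lincoln-Petersen estimator, which assumes a common detection probability, is biased low when individual detection probabilities vary:

```python
import numpy as np

rng = np.random.default_rng(1)
N = 1000                            # true population size (arbitrary for the sketch)
p = rng.beta(0.5, 2.0, size=N)      # heterogeneous individual detection probabilities

occ1 = rng.random(N) < p            # capture indicators, occasion 1
occ2 = rng.random(N) < p            # capture indicators, occasion 2

n1, n2 = occ1.sum(), occ2.sum()
m2 = (occ1 & occ2).sum()            # individuals caught on both occasions

N_hat = n1 * n2 / m2                # Lincoln-Petersen estimate (assumes constant p)
print(f"true N = {N}, Lincoln-Petersen estimate = {N_hat:.0f}")  # estimate falls well below N
```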

  11. Methods of analysis by the U.S. Geological Survey National Water Quality Laboratory; determination of chlorinated pesticides in aquatic tissue by capillary-column gas chromatography with electron-capture detection

    USGS Publications Warehouse

    Leiker, Thomas J.; Madsen, J.E.; Deacon, J.R.; Foreman, W.T.

    1995-01-01

    A method for the determination of chlorinated organic compounds in aquatic tissue by dual capillary-column gas chromatography with electron-capture detection is described. Whole-body fish or Corbicula tissue is homogenized and Soxhlet extracted; lipids are removed by gel permeation chromatography, and the extract is fractionated using alumina/silica adsorption chromatography. The extracts are analyzed by dissimilar capillary-column gas chromatography with electron-capture detection. The method reporting limits are 5 micrograms per kilogram (μg/kg) for chlorinated compounds, 50 μg/kg for polychlorinated biphenyls, and 200 μg/kg for toxaphene.

  12. Membrane Process to Capture CO{sub 2} from Coal-Fired Power Plant Flue Gas

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Merkel, Tim; Wei, Xiaotong; Firat, Bilgen

    2012-03-31

    This final report describes work conducted for the U.S. Department of Energy National Energy Technology Laboratory (DOE NETL) on development of an efficient membrane process to capture carbon dioxide (CO2) from power plant flue gas (award number DE-NT0005312). The primary goal of this research program was to demonstrate, in a field test, the ability of a membrane process to capture up to 90% of the CO2 in coal-fired flue gas, and to evaluate the potential of a full-scale version of the process to perform this separation with less than a 35% increase in the levelized cost of electricity (LCOE). Membrane Technology and Research (MTR) conducted this project in collaboration with Arizona Public Services (APS), who hosted a membrane field test at their Cholla coal-fired power plant, and the Electric Power Research Institute (EPRI) and WorleyParsons (WP), who performed a comparative cost analysis of the proposed membrane CO2 capture process. The work conducted for this project included membrane and module development, slipstream testing of commercial-sized modules with natural gas and coal-fired flue gas, process design optimization, and a detailed systems and cost analysis of a membrane retrofit to a commercial power plant. The Polaris membrane developed over a number of years by MTR represents a step-change improvement in CO2 permeance compared to previous commercial CO2-selective membranes. During this project, membrane optimization work resulted in a further doubling of the CO2 permeance of the Polaris membrane while maintaining the CO2/N2 selectivity. This is an important accomplishment because increased CO2 permeance directly impacts the membrane skid cost and footprint: a doubling of CO2 permeance halves the skid cost and footprint. In addition to providing high CO2 permeance, flue gas CO2 capture membranes must be stable in the presence of contaminants including SO2. Laboratory tests showed no degradation in Polaris membrane performance during two months of continuous operation in a simulated flue gas environment containing up to 1,000 ppm SO2. A successful slipstream field test at the APS Cholla power plant was conducted with commercial-size Polaris modules during this project. This field test is the first demonstration of stable performance by commercial-sized membrane modules treating actual coal-fired power plant flue gas. Process design studies show that selective recycle of CO2 using a countercurrent membrane module with air as a sweep stream can double the concentration of CO2 in coal flue gas with little energy input. This pre-concentration of CO2 by the sweep membrane reduces the minimum energy of CO2 separation in the capture unit by up to 40% for coal flue gas. Variations of this design may be even more promising for CO2 capture from NGCC flue gas, in which the CO2 concentration can be increased from 4% to 20% by selective sweep recycle. EPRI and WP conducted a systems and cost analysis of a base case MTR membrane CO2 capture system retrofitted to the AEP Conesville Unit 5 boiler. Some of the key findings from this study and a sensitivity analysis performed by MTR include:
    - The MTR membrane process can capture 90% of the CO2 in coal flue gas and produce high-purity CO2 (>99%) ready for sequestration.
    - CO2 recycle to the boiler appears feasible with minimal impact on boiler performance; however, further study by a boiler OEM is recommended.
    - For a membrane process built today using a combination of slight feed compression, permeate vacuum, and current compression equipment costs, the membrane capture process can be competitive with the base case MEA process at 90% CO2 capture from a coal-fired power plant. The incremental LCOE for the base case membrane process is about equal to that of a base case MEA process, within the uncertainty in the analysis.
    - With advanced membranes (5,000 gpu for CO2 and 50 for CO2/N2), operating with no feed compression and low-cost CO2 compression equipment, an incremental LCOE of $33/MWh at 90% capture can be achieved (40% lower than the advanced MEA case).
    - Even with lower cost compression, it appears unlikely that a membrane process using high feed compression (>5 bar) can be competitive with amine absorption, due to the capital cost and energy consumption of this equipment. Similarly, low vacuum pressure (<0.2 bar) cannot be used due to the poor efficiency and high cost of this equipment.
    - High membrane permeance is important to reduce the capital cost and footprint of the membrane unit. CO2/N2 selectivity is less important because it is too costly to generate a pressure ratio at which high selectivity can be useful.
    - A potential cost "sweet spot" exists for use of membrane-based technology if 50-70% CO2 capture is acceptable. There is a minimum in the cost of CO2 avoided per ton that membranes can deliver at 60% CO2 capture, which is 20% lower than the cost at 90% capture. Membranes operating with no feed compression are best suited for lower capture rates.
    - Currently, it appears that the biggest hurdle to the use of membranes for post-combustion CO2 capture is compression equipment cost. An alternative approach is to use sweep membranes in parallel with another CO2 capture technology that does not require feed compression or vacuum equipment. Hybrid designs that utilize sweep membranes for selective CO2 recycle show potential to significantly reduce the minimum energy of CO2 separation.
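    The observation above that doubling the CO2 permeance roughly halves the membrane skid cost and footprint follows from the basic permeation relation (flux = permeance × partial-pressure driving force). The sketch below uses purely illustrative numbers, not values from the report, to show the inverse scaling of required membrane area with permeance:

```python
GPU = 3.35e-10   # 1 gas permeation unit (gpu) in mol / (m^2 * s * Pa)

def membrane_area_m2(co2_flow_mol_s: float, permeance_gpu: float, delta_p_pa: float) -> float:
    """Membrane area needed to pass a given CO2 molar flow: area = flow / (permeance * delta_p)."""
    flux = permeance_gpu * GPU * delta_p_pa    # permeated CO2 flux, mol / (m^2 * s)
    return co2_flow_mol_s / flux

# Hypothetical inputs (not from the MTR/EPRI study):
co2_flow = 2800.0    # mol/s of CO2 to be permeated
delta_p = 1.0e4      # ~0.1 bar average CO2 partial-pressure difference

for perm in (1000, 2000, 4000):   # permeance in gpu
    print(perm, f"{membrane_area_m2(co2_flow, perm, delta_p):,.0f} m^2")
# Doubling the permeance halves the required area, and hence the skid footprint.
```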

  13. Drainage identification analysis and mapping, phase 2.

    DOT National Transportation Integrated Search

    2017-01-01

    Drainage Identification, Analysis and Mapping System (DIAMS) is a computerized database that captures and stores relevant information associated with all aboveground and underground hydraulic structures belonging to the New Jersey Department of T...

  14. Analysis of capture-recapture models with individual covariates using data augmentation

    USGS Publications Warehouse

    Royle, J. Andrew

    2009-01-01

    I consider the analysis of capture-recapture models with individual covariates that influence detection probability. Bayesian analysis of the joint likelihood is carried out using a flexible data augmentation scheme that facilitates analysis by Markov chain Monte Carlo methods, and a simple and straightforward implementation in freely available software. This approach is applied to a study of meadow voles (Microtus pennsylvanicus) in which auxiliary data on a continuous covariate (body mass) are recorded, and it is thought that detection probability is related to body mass. In a second example, the model is applied to an aerial waterfowl survey in which a double-observer protocol is used. The fundamental unit of observation is the cluster of individual birds, and the size of the cluster (a discrete covariate) is used as a covariate on detection probability.
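    As a rough sketch of the zero-augmentation idea described above (not Royle's implementation), the observed encounter histories are padded with all-zero histories up to a large bound M, and a latent inclusion indicator turns the unknown population size into a sum of Bernoulli variables:

```python
import numpy as np
from scipy.stats import bernoulli

def augment(y_obs: np.ndarray, M: int) -> np.ndarray:
    """Pad the n x T matrix of observed encounter histories with all-zero rows up to M rows."""
    n, T = y_obs.shape
    return np.vstack([y_obs, np.zeros((M - n, T), dtype=int)])

def row_loglik(y_i: np.ndarray, z_i: int, p_i: float, psi: float) -> float:
    """Complete-data log-likelihood of one (possibly pseudo-) individual.

    z_i is the latent inclusion indicator, psi its prior probability, and p_i the
    detection probability, which may depend on an individual covariate such as body mass.
    """
    if z_i == 0:
        # a structural zero cannot have been detected
        return -np.inf if y_i.any() else np.log(1.0 - psi)
    return np.log(psi) + bernoulli.logpmf(y_i, p_i).sum()

# In an MCMC sampler over z, psi and the parameters of the detection model, the
# population size is estimated at each iteration as N = z.sum().
```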

  15. Comparative analysis of the effects of electron and hole capture on the power characteristics of a semiconductor quantum-well laser

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sokolova, Z. N., E-mail: Zina.Sokolova@mail.ioffe.ru; Pikhtin, N. A.; Tarasov, I. S.

    The operating characteristics of a semiconductor quantum-well laser calculated using three models are compared. These models are (i) a model not taking into account differences between the electron and hole parameters and using the electron parameters for both types of charge carriers; (ii) a model which does not take into account differences between the electron and hole parameters and uses the hole parameters for both types of charge carriers; and (iii) a model taking into account the asymmetry between the electron and hole parameters. It is shown that, at the same velocity of electron and hole capture into an unoccupied quantum well, the laser characteristics obtained using the three models differ considerably. These differences are due to a difference between the filling of the electron and hole subbands in a quantum well. The electron subband is more occupied than the hole subband. As a result, at the same velocities of electron and hole capture into an empty quantum well, the effective electron-capture velocity is lower than the effective hole-capture velocity. Specifically, it is shown that for the laser structure studied the hole-capture velocity of 5 × 10⁵ cm/s into an empty quantum well and the corresponding electron-capture velocity of 3 × 10⁶ cm/s into an empty quantum well describe the rapid capture of these carriers, at which the light–current characteristic of the laser remains virtually linear up to high pump-current densities. However, an electron-capture velocity of 5 × 10⁵ cm/s and a corresponding hole-capture velocity of 8.4 × 10⁴ cm/s describe the slow capture of these carriers, causing significant sublinearity in the light–current characteristic.

  16. Low Cost, High Capacity Regenerable Sorbent for Carbon Dioxide Capture from Existing Coal-fired Power Plants

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Alptekin, Gokhan; Jayaraman, Ambalavanan; Dietz, Steven

    In this project TDA Research, Inc. (TDA) has developed a new post-combustion carbon capture technology based on a vacuum swing adsorption system that uses a steam purge, and demonstrated its technical feasibility and economic viability in laboratory-scale tests and tests in actual coal-derived flue gas. TDA uses an advanced physical adsorbent to selectively remove CO2 from the flue gas. The sorbent exhibits a much higher affinity for CO2 than N2, H2O or O2, enabling effective CO2 separation from the flue gas. We also carried out a detailed process design and analysis of the new system as part of both sub-critical and super-critical pulverized coal fired power plants. The new technology uses a low cost, high capacity adsorbent that selectively removes CO2 in the presence of moisture at the flue gas temperature without a need for significant cooling of the flue gas or moisture removal. The sorbent is based on a TDA proprietary mesoporous carbon that consists of surface functionalized groups that remove CO2 via physical adsorption. The high surface area and favorable porosity of the sorbent also provide a unique platform to introduce additional functionality, such as active groups to remove trace metals (e.g., Hg, As). In collaboration with the Advanced Power and Energy Program of the University of California, Irvine (UCI), TDA developed system simulation models using Aspen Plus simulation software to assess the economic viability of TDA’s VSA-based post-combustion carbon capture technology. The levelized cost of electricity including the TS&M costs for CO2 is calculated as $116.71/MWh and $113.76/MWh for the TDA system integrated with sub-critical and super-critical pulverized coal fired power plants, much lower than the $153.03/MWh and $147.44/MWh calculated for the corresponding amine-based systems. The cost of CO2 captured for TDA’s VSA-based system is $38.90 and $39.71 per tonne compared to $65.46 and $66.56 per tonne for the amine-based system on a 2011 $ basis, providing 40% lower cost of CO2 captured. In this analysis we have used a sorbent life of 4 years. If a longer sorbent life can be maintained (which is not unreasonable for fixed bed commercial PSA systems), this would lower the cost of CO2 captured by $0.05 per tonne (e.g., to $38.85 and $39.66 per tonne at a 5-year sorbent replacement). These system analysis results suggest that TDA’s VSA-based post-combustion capture technology can substantially improve the power plant’s thermal performance while achieving near zero emissions, including greater than 90% carbon capture. The higher net plant efficiency and lower capital and operating costs result in a substantial reduction in the cost of carbon capture and the cost of electricity for the power plant equipped with TDA’s technology.

  17. Comparative study of the double-K-shell-vacancy production in single- and double-electron-capture decay

    NASA Astrophysics Data System (ADS)

    Ratkevich, S. S.; Gangapshev, A. M.; Gavrilyuk, Yu. M.; Karpeshin, F. F.; Kazalov, V. V.; Kuzminov, V. V.; Panasenko, S. I.; Trzhaskovskaya, M. B.; Yakimenko, S. P.

    2017-12-01

    Background: Double-K-electron capture is a rare nuclear-atomic process in which two K electrons are captured simultaneously from the atomic shell. A "hollow atom" is created as a result of this process. In single-K-shell electron-capture decays, there is a small probability that the second electron in the K shell is excited to an unoccupied level or (mostly) ejected to the continuum. In either case, a double vacancy is created in the K shell. The relaxation of the double-K-shell vacancy, accompanied by the emission of two K-fluorescence photons, makes it possible to perform experimental studies of such rare processes with a large-volume proportional gas chamber. Purpose: The purpose of the present analysis is to estimate the double-K-shell vacancy creation probability per K-shell electron capture, P_KK, of 81Kr, as well as to measure the half-life of 78Kr relative to 2ν2K capture. Method: Time-resolved current pulses from a large low-background proportional counter (LPC), filled with the krypton sample, were used to detect triple coincidences of "shaked" electrons and two fluorescence photons. Results: The number of K-shell vacancies per K-electron capture, produced as a result of the shake-off process, has been measured for the decay of 81Kr. The probability for this decay was found to be P_KK = (5.7 ± 0.8) × 10⁻⁵ with a systematic error of (ΔP_KK)_syst = ±0.4 × 10⁻⁵. For the 78Kr(2ν2K) decay, the comparative study of single- and double-capture decays allowed us to obtain a signal-to-background ratio of up to 15/1. The half-life T1/2(2ν2K, g.s.→g.s.) = [1.9 +1.3/-0.7 (stat) ± 0.3 (syst)] × 10²² y is determined from the analysis of data accumulated over 782 days of live measurements in an experiment that used samples consisting of 170.6 g of 78Kr. Conclusions: The data collected during low-background measurements using the LPC were analyzed to search for rare atomic and nuclear processes. We have determined the experimental P_KK for the EC decay of 81Kr, which is in satisfactory agreement with the Z⁻² dependence of P_KK predicted by Primakoff and Porter. This made it possible to determine more accurately the background contribution in the energy region of interest for the search for 2K capture in 78Kr. The general procedure of data analysis allowed us to determine the half-life of 78Kr relative to the 2ν2K transition with greater statistical accuracy than in our previous works.

  18. Laser capture microdissection: ArcturusXT infrared capture and UV cutting methods.

    PubMed

    Gallagher, Rosa I; Blakely, Steven R; Liotta, Lance A; Espina, Virginia

    2012-01-01

    Laser capture microdissection (LCM) is a technique that allows the precise procurement of enriched cell populations from a heterogeneous tissue under direct microscopic visualization. LCM can be used to harvest the cells of interest directly or can be used to isolate specific cells by ablating the unwanted cells, resulting in histologically enriched cell populations. The fundamental components of laser microdissection technology are (a) visualization of the cells of interest via microscopy, (b) transfer of laser energy to a thermolabile polymer with either the formation of a polymer-cell composite (capture method) or transfer of laser energy via an ultraviolet laser to photovolatilize a region of tissue (cutting method), and (c) removal of cells of interest from the heterogeneous tissue section. Laser energy supplied by LCM instruments can be infrared (810 nm) or ultraviolet (355 nm). Infrared lasers melt thermolabile polymers for cell capture, whereas ultraviolet lasers ablate cells for either removal of unwanted cells or excision of a defined area of cells. LCM technology is applicable to an array of applications including mass spectrometry, DNA genotyping and loss-of-heterozygosity analysis, RNA transcript profiling, cDNA library generation, proteomics discovery, and signal kinase pathway profiling. This chapter describes the unique features of the ArcturusXT laser capture microdissection instrument, which incorporates both infrared capture and ultraviolet cutting technology in one instrument, using a proteomic downstream assay as a model.

  19. Comprehensive comparison of three commercial human whole-exome capture platforms.

    PubMed

    Asan; Xu, Yu; Jiang, Hui; Tyler-Smith, Chris; Xue, Yali; Jiang, Tao; Wang, Jiawei; Wu, Mingzhi; Liu, Xiao; Tian, Geng; Wang, Jun; Wang, Jian; Yang, Huangming; Zhang, Xiuqing

    2011-09-28

    Exome sequencing, which allows the global analysis of protein coding sequences in the human genome, has become an effective and affordable approach to detecting causative genetic mutations in diseases. Currently, there are several commercial human exome capture platforms; however, the relative performances of these have not been characterized sufficiently to know which is best for a particular study. We comprehensively compared three platforms: NimbleGen's Sequence Capture Array and SeqCap EZ, and Agilent's SureSelect. We assessed their performance in a variety of ways, including number of genes covered and capture efficacy. Differences that may affect the choice of platform were that Agilent SureSelect covered approximately 1,100 more genes, while NimbleGen provided better flanking sequence capture. Although all three platforms achieved similar capture specificity of targeted regions, the NimbleGen platforms showed better uniformity of coverage and greater genotype sensitivity at 30- to 100-fold sequencing depth. All three platforms showed similar power in exome SNP calling, including medically relevant SNPs. Compared with genotyping and whole-genome sequencing data, the three platforms achieved a similar accuracy of genotype assignment and SNP detection. Importantly, all three platforms showed similar levels of reproducibility, GC bias and reference allele bias. We demonstrate key differences between the three platforms, particularly the advantages of solution-based capture over array capture and the importance of a large gene target set.

  20. MPCV Exercise Operational Volume Analysis

    NASA Technical Reports Server (NTRS)

    Godfrey, A.; Humphreys, B.; Funk, J.; Perusek, G.; Lewandowski, B. E.

    2017-01-01

    In order to minimize the loss of bone and muscle mass during spaceflight, the Multi-purpose Crew Vehicle (MPCV) will include an exercise device and enough free space within the cabin for astronauts to use the device effectively. The NASA Digital Astronaut Project (DAP) has been tasked with using computational modeling to aid in determining whether or not the available operational volume is sufficient for in-flight exercise. Motion capture data were acquired using a 12-camera Smart DX system (BTS Bioengineering, Brooklyn, NY) while exercisers performed 9 resistive exercises without volume restrictions in a 1g environment. Data were collected from two male subjects, one in the 99th percentile of height and the other in the 50th percentile of height, using between 25 and 60 motion capture markers. Motion capture data were also recorded as a third subject, also near the 50th percentile in height, performed aerobic rowing during a parabolic flight. A motion capture system and algorithms developed previously and presented at last year's HRP-IWS were utilized to collect and process the data from the parabolic flight [1]. These motions were applied to a scaled version of a biomechanical model within the biomechanical modeling software OpenSim [2], and the volume sweeps of the motions were visually assessed against an imported CAD model of the operational volume. Further numerical analysis was performed using Matlab (Mathworks, Natick, MA) and the OpenSim API. This analysis determined the location of every marker in space over the duration of the exercise motion, and the distance of each marker to the nearest surface of the volume. Containment of the exercise motions within the operational volume was determined on a per-exercise and per-subject basis. The orientation of the exerciser and the angle of the footplate were two important factors upon which containment was dependent. Regions where the exercise motion exceeds the bounds of the operational volume have been identified by determining which markers from the motion capture exceed the operational volume and by how much. A credibility assessment of this analysis was performed in accordance with NASA-STD-7009 prior to delivery to the MPCV program.
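    The containment check described above amounts to computing, frame by frame, how far each motion-capture marker lies outside the operational volume. A simplified sketch is given below using a hypothetical axis-aligned box in place of the actual CAD volume processed with OpenSim and Matlab:

```python
import numpy as np

# Hypothetical axis-aligned box standing in for the operational volume (metres)
VOL_MIN = np.array([-0.5, -0.5, 0.0])
VOL_MAX = np.array([ 0.5,  0.5, 2.0])

def containment_report(markers: np.ndarray) -> tuple[bool, float]:
    """markers: array of shape (n_frames, n_markers, 3) of motion-capture positions.

    Returns whether every marker stays inside the box on every frame, and the
    largest distance by which any marker leaves it.
    """
    below = VOL_MIN - markers                       # positive where a lower bound is violated
    above = markers - VOL_MAX                       # positive where an upper bound is violated
    violation = np.maximum(np.maximum(below, above), 0.0)
    excursion = np.linalg.norm(violation, axis=-1)  # distance outside the box, per marker per frame
    return bool((excursion == 0.0).all()), float(excursion.max())

# Example with random data standing in for real marker trajectories
ok, worst = containment_report(np.random.uniform(-0.6, 0.6, size=(100, 40, 3)))
print(ok, worst)
```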

  1. An efficient scan diagnosis methodology according to scan failure mode for yield enhancement

    NASA Astrophysics Data System (ADS)

    Kim, Jung-Tae; Seo, Nam-Sik; Oh, Ghil-Geun; Kim, Dae-Gue; Lee, Kyu-Taek; Choi, Chi-Young; Kim, InSoo; Min, Hyoung Bok

    2008-12-01

    Yield has always been a driving consideration in modern semiconductor fabrication. Statistically, the largest portion of wafer yield loss is due to defective scan failures. This paper presents efficient failure analysis methods for initial yield ramp-up and ongoing products using scan diagnosis. Our analysis shows that more than 60% of the scan failure dies fall into the category of shift mode in very deep submicron (VDSM) devices. However, localization of scan shift mode failures is very difficult in comparison to capture mode failures because they are caused by malfunction of the scan chain. Addressing this biggest challenge, we propose the most suitable analysis method according to scan failure mode (capture / shift) for yield enhancement. For the capture failure mode, this paper describes a method that integrates the scan diagnosis flow and backside probing technology to obtain more accurate candidates. We also describe several unique techniques, such as a bulk back-grinding solution, efficient backside probing and a signal analysis method. Lastly, we introduce a blocked-chain analysis algorithm for efficient analysis of the shift failure mode. In this paper, we contribute to yield enhancement through the combination of the two methods. We confirm the failure candidates with physical failure analysis (PFA) methods. The direct feedback of the defect visualization is useful for mass-producing devices in a shorter time. The experimental data on mass products show that our method produces an average reduction of 13.7% in defective SCAN & SRAM-BIST failure rates and of 18.2% in wafer yield rates.

  2. Transfer and capture into distant retrograde orbits

    NASA Astrophysics Data System (ADS)

    Scott, Christopher J.

    This dissertation utilizes theory and techniques derived from the fields of dynamical systems theory, astrodynamics, celestial mechanics, and fluid mechanics to analyze the phenomenon of satellite capture and interrelated spacecraft transfers in restricted three-body systems. The results extend current knowledge and understanding of capture dynamics in the context of astrodynamics and celestial mechanics. Manifold theory, fast Lyapunov indicator maps, and the classification of space structure facilitate an analysis of the transport of objects from the chaotic reaches of the solar system to the distant retrograde region in the sun-Jupiter system. In contrast to past studies, this dissertation considers the role of the complex lobe structure encompassing stable regions in the circular restricted three-body problem. These structures are shown to be responsible for the phenomenon of sticky orbits and the transport of objects among stable regions. Since permanent capture can only be achieved through a change in energy, fast Lyapunov indicator maps and other methods which reveal the structure of the conservative system are used to discern capture regions and identify the underpinnings of the dynamics. Fast Lyapunov indicator maps provide an accurate classification of orbits of permanent capture and escape, yet monopolize computational resources. In anticipation of a fully three-dimensional analysis in the dissipative system, a new mapping parameter is introduced based on energy degradation and averaged velocity. Although the study specifically addresses the sun-Jupiter system, the qualitative results and devised techniques can be applied throughout the solar system and to capture about extrasolar planets. Extending the analysis beyond the exterior of the stable distant retrograde region fosters the construction of transfer orbits from low-Earth orbit to a stable periodic orbit at the center of the stable distant retrograde region. Key to this analysis is the predictability of collision orbits within the highly chaotic region commonly recognized as a saddle point on the energy manifold. The pragmatic techniques derived from this analysis solve a number of complications apparent in the literature. Notably, a reliable methodology for the construction of an arbitrary number of transfer orbits circumvents the requirement of computing specialized periodic orbits or extensive numerical sampling of the phase space. The procedure provides a complete description of the design space, accessing a wide range of distant retrograde orbit sizes, insertion points, and parking orbit altitudes in an automated manner. The transfers are studied in a similar fashion to periodic orbits, unveiling the intimate relationship among design parameters and phase space structure. An arbitrary number of Earth return periodic orbits can be generated as a by-product. These orbits may be useful for spacecraft that must make a number of passes near the second primary without a reduction in energy. Further analysis of the lobe dynamics and a modification of the transfers to the center of the stable region yield sets of single-impulse transfers to sticky distant retrograde orbits. It is shown that the evolution of the phase space structures with energy corresponds to the variation of capture time and target size. The capture phenomenon is related to the stability characteristics of the unstable periodic orbit and the geometry of the corresponding homoclinic tangle at various energies.
    Future spacecraft with little or no propulsive means may take advantage of these natural trajectories for operations in the region. Temporary capture along a sticky orbit may precede incremental stabilization of the spacecraft by way of a series of small impulsive maneuvers or a low continuous thrust. The requirements of the small stabilization maneuvers are calculated and compared to a direct transfer to the center of the stable region. This mission design may be desirable because any failure in the classic set of maneuvers to the center of the stable region could result in the loss of the spacecraft. A simple low-thrust stabilization method is analyzed in a manner similar to nebular drag. It is shown that stabilization maneuvers initiated within the sticky region can be achieved via a simple control law. Moreover, the sticky region can be used as a staging point for both spiral-in and spiral-out maneuvers. For the spiral-in maneuver, this removes the need for the large initial maneuver required to reach the center of the stable region. It is shown that long stretches of orbits exist within the sticky regions which reliably lead to permanent capture. In the case of spiral-out, the spacecraft is transported to a highly energetic yet stable orbit about the second primary. From there a small maneuver could allow the spacecraft to access other regions of the solar system.

  3. Carbon dioxide capture from atmospheric air using sodium hydroxide spray.

    PubMed

    Stolaroff, Joshuah K; Keith, David W; Lowry, Gregory V

    2008-04-15

    In contrast to conventional carbon capture systems for power plants and other large point sources, the system described in this paper captures CO2 directly from ambient air. This has the advantages that emissions from diffuse sources and past emissions may be captured. The objective of this research is to determine the feasibility of a NaOH spray-based contactor for use in an air capture system by estimating the cost and energy requirements per unit CO2 captured. A prototype system is constructed and tested to measure CO2 absorption, energy use, and evaporative water loss and compared with theoretical predictions. A numerical model of drop collision and coalescence is used to estimate operating parameters for a full-scale system, and the cost of operating the system per unit CO2 captured is estimated. The analysis indicates that CO2 capture from air for climate change mitigation is technically feasible using off-the-shelf technology. Drop coalescence significantly decreases the CO2 absorption efficiency; however, fan and pump energy requirements are manageable. Water loss is significant (20 mol H2O/mol CO2 at 15 °C and 65% RH) but can be lowered by appropriately designing and operating the system. The cost of CO2 capture using NaOH spray (excluding solution recovery and CO2 sequestration, which may be comparable) in the full-scale system is 96 $/ton-CO2 in the base case, and ranges from 53 to 127 $/ton-CO2 under alternate operating parameters and assumptions regarding capital costs and mass transfer rate. The low end of the cost range is reached by a spray with 50 μm mean drop diameter, which is achievable with commercially available spray nozzles.
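
    As a quick illustration of the figures quoted above, the following sketch (Python written for this summary, not part of the original paper) converts the reported molar water loss into a mass ratio and restates the reported cost range; only the standard molecular weights are added, everything else is taken from the abstract.

      # Illustrative arithmetic based on figures quoted in the abstract above.
      M_H2O = 18.015   # g/mol
      M_CO2 = 44.01    # g/mol

      water_loss_molar = 20.0  # mol H2O evaporated per mol CO2 captured (15 °C, 65% RH)
      water_loss_mass = water_loss_molar * M_H2O / M_CO2
      print(f"Water loss: {water_loss_mass:.1f} kg H2O per kg CO2 captured")  # ~8.2

      # Reported contactor cost (excluding solution recovery and CO2 sequestration)
      base_cost = 96.0            # $/ton CO2, base case
      cost_range = (53.0, 127.0)  # $/ton CO2 under alternate assumptions
      print(f"Base-case cost: {base_cost:.0f} $/ton CO2 "
            f"(range {cost_range[0]:.0f}-{cost_range[1]:.0f} $/ton CO2)")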

  4. An algorithm for verifying biventricular capture based on evoked-response morphology.

    PubMed

    Diotallevi, Paolo; Ravazzi, Pier Antonio; Gostoli, Enrico; De Marchi, Giuseppe; Militello, Carmelo; Kraetschmer, Hannes

    2005-01-01

    Cardiac resynchronization therapy relies on consistent beat-by-beat myocardial capture in both ventricles. A pacemaker ensuring right (RV) and left ventricular (LV) capture through reliable capture verification and automatic output adjustment would contribute to patients' safety and quality of life. We studied the feasibility of an algorithm based on evoked-response (ER) morphology for capture verification in both ventricles. RV and LV ER signals were recorded in 20 patients (mean age 72.5 years, range 64.3-80.4 years, 4 females and 16 males) during implantation of biventricular (BiV) pacing systems. Leads of several manufacturers were tested. Pacing and intracardiac electrogram (IEGM) recording were performed using an external pulse generator. IEGM and surface-lead electrocardiogram (ECG) signals were recorded under different pacing conditions for 10 seconds each: RV pacing only, LV pacing only, and BiV pacing with several interventricular delays. Based on morphology characteristics, ERs were classified manually for capture and failure to capture, and the validity of the classification was assessed by reference to the ECG. A total of 3,401 LV- and 3,345 RV-paced events were examined. The sensitivities of the algorithm were 95.6% and 96.1% in the RV and LV, respectively, and the corresponding specificities were 91.4% and 95.2%. The lower sensitivity in the RV was attributed to signal blanking in both channels during BiV pacing with a nonzero interventricular delay. The analysis revealed that the algorithm for identifying capture and failure to capture based on the ER-signal morphology was safe and effective in each ventricle with all leads tested in the study.
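
    The reported sensitivities and specificities follow from scoring the algorithm's beat-by-beat classifications against the ECG reference. A minimal sketch of that scoring is shown below; the confusion-matrix counts are hypothetical and are not data from the study.

      # Sensitivity/specificity of a capture-verification algorithm scored
      # against an ECG reference. The counts below are hypothetical.
      def sensitivity_specificity(tp, fn, tn, fp):
          """Sensitivity = TP/(TP+FN); specificity = TN/(TN+FP)."""
          return tp / (tp + fn), tn / (tn + fp)

      # Hypothetical counts for one ventricle
      tp, fn = 1830, 84    # capture events classified correctly / missed
      tn, fp = 1290, 120   # loss-of-capture events classified correctly / missed

      sens, spec = sensitivity_specificity(tp, fn, tn, fp)
      print(f"sensitivity = {sens:.1%}, specificity = {spec:.1%}")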

  5. 40 CFR Appendix A to Subpart Dddd... - Alternative Procedure To Determine Capture Efficiency From Enclosures Around Hot Presses in the...

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... .2 Response Time Test. Conduct this test once prior to each test series. Introduce zero gas into the... analysis. 3.0 Definitions. 3.1 Capture efficiency (CE). The weight per unit time of SF6 entering the control device divided by the weight per unit time of SF6 released through manifolds at multiple locations within...
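
    The capture-efficiency definition quoted in this excerpt is a ratio of SF6 mass flow rates. A minimal sketch, with hypothetical tracer flow rates, is:

      # CE = (SF6 mass per unit time entering the control device)
      #      / (SF6 mass per unit time released through the manifolds).
      def capture_efficiency(sf6_to_control_device, sf6_released):
          """Both arguments in the same units, e.g. g/min."""
          return sf6_to_control_device / sf6_released

      released = 10.0   # g/min of SF6 released inside the enclosure (hypothetical)
      captured = 9.4    # g/min of SF6 measured entering the control device (hypothetical)
      print(f"CE = {capture_efficiency(captured, released):.1%}")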

  6. 40 CFR Appendix A to Subpart Dddd... - Alternative Procedure To Determine Capture Efficiency From Enclosures Around Hot Presses in the...

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... .2 Response Time Test. Conduct this test once prior to each test series. Introduce zero gas into the... analysis. 3.0 Definitions. 3.1 Capture efficiency (CE). The weight per unit time of SF6 entering the control device divided by the weight per unit time of SF6 released through manifolds at multiple locations within...

  7. 40 CFR Appendix A to Subpart Dddd... - Alternative Procedure To Determine Capture Efficiency From Enclosures Around Hot Presses in the...

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... .2 Response Time Test. Conduct this test once prior to each test series. Introduce zero gas into the... analysis. 3.0 Definitions. 3.1 Capture efficiency (CE). The weight per unit time of SF6 entering the control device divided by the weight per unit time of SF6 released through manifolds at multiple locations within...

  8. 40 CFR Appendix A to Subpart Dddd... - Alternative Procedure To Determine Capture Efficiency From Enclosures Around Hot Presses in the...

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... .2 Response Time Test. Conduct this test once prior to each test series. Introduce zero gas into the... analysis. 3.0 Definitions. 3.1 Capture efficiency (CE). The weight per unit time of SF6 entering the control device divided by the weight per unit time of SF6 released through manifolds at multiple locations within...

  9. Derivation of Rigid Body Analysis Models from Vehicle Architecture Abstractions

    DTIC Science & Technology

    2011-06-17

    models of every type have their basis in some type of physical representation of the design domain. Rather than describing three-dimensional continua of...arrangement, while capturing just enough physical detail to be used as the basis for a meaningful representation of the design, and eventually, analyses that...permit architecture assessment. The design information captured by the abstractions is available at the very earliest stages of the vehicle

  10. Separable Bilayer Microfiltration Device for Viable Label-free Enrichment of Circulating Tumour Cells

    NASA Astrophysics Data System (ADS)

    Zhou, Ming-Da; Hao, Sijie; Williams, Anthony J.; Harouaka, Ramdane A.; Schrand, Brett; Rawal, Siddarth; Ao, Zheng; Brennaman, Randall; Gilboa, Eli; Lu, Bo; Wang, Shuwen; Zhu, Jiyue; Datar, Ram; Cote, Richard; Tai, Yu-Chong; Zheng, Si-Yang

    2014-12-01

    The analysis of circulating tumour cells (CTCs) in cancer patients could provide important information for therapeutic management. Enrichment of viable CTCs could permit performance of functional analyses on CTCs to broaden understanding of metastatic disease. However, this has not been widely accomplished. Addressing this challenge, we present a separable bilayer (SB) microfilter for viable size-based CTC capture. Unlike other single-layer CTC microfilters, the precise gap between the two layers and the architecture of pore alignment result in drastic reduction in mechanical stress on CTCs, capturing them viably. Using multiple cancer cell lines spiked in healthy donor blood, the SB microfilter demonstrated high capture efficiency (78-83%), high retention of cell viability (71-74%), high tumour cell enrichment against leukocytes (1.7-2 × 10^3), and widespread ability to establish cultures post-capture (100% of cell lines tested). In a metastatic mouse model, SB microfilters successfully enriched viable mouse CTCs from 0.4-0.6 mL whole mouse blood samples and established in vitro cultures for further genetic and functional analysis. Our preliminary studies reflect the efficacy of the SB microfilter device to efficiently and reliably enrich viable CTCs in animal model studies, constituting an exciting technology for new insights in cancer research.

  11. Incidence of tuberculous meningitis in France, 2000: a capture-recapture analysis.

    PubMed

    Cailhol, J; Che, D; Jarlier, V; Decludt, B; Robert, J

    2005-07-01

    To estimate the incidence of culture-positive and culture-negative tuberculous meningitis (TBM) in France in 2000. Capture-recapture method using two unrelated sources of data: the tuberculosis (TB) mandatory notification system (MNTB), recording patients treated by anti-tuberculosis drugs, and a survey by the National Reference Centre (NRC) for mycobacterial drug resistance, recording culture-positive TBM. Of 112 cases of TBM reported to the MNTB, 28 culture-positive and 34 culture-negative meningitis cases were validated (17 duplicates, 3 cases from outside France, 21 false notifications, and 9 lost records were excluded). The NRC recorded 31 culture-positive cases, including 21 known by the MNTB. When the capture-recapture method was applied to the reported culture-positive meningitis cases, the estimated number of meningitis cases was 41 and the incidence was 0.7 cases per million. Sensitivity was 75.6% for the NRC, 68.3% for the MNTB, and 92.7% for both systems together. When sensitivity of the MNTB for culture-positive cases was applied to culture-negative meningitis, the total estimated number of culture-negative meningitis cases was 50 and the incidence was 0.85 cases per million. TBM is underestimated in France. Capture-recapture analysis using different sources to better estimate its incidence is of great interest.
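
    The estimate of 41 culture-positive cases is consistent with the standard two-source (Lincoln-Petersen) capture-recapture estimator applied to the counts quoted above; the published analysis may use a refined variant, so the sketch below is illustrative only.

      # Two-source capture-recapture sketch using the culture-positive counts
      # quoted in the abstract (28 MNTB, 31 NRC, 21 in common).
      def lincoln_petersen(n1, n2, m):
          """Estimate total N from two sources of sizes n1, n2 with m cases in common."""
          return n1 * n2 / m

      n_mntb, n_nrc, n_both = 28, 31, 21
      n_hat = lincoln_petersen(n_mntb, n_nrc, n_both)
      print(f"Estimated culture-positive TBM cases: {n_hat:.0f}")       # ~41
      print(f"MNTB sensitivity: {n_mntb / n_hat:.1%}")                   # ~68%
      print(f"NRC sensitivity: {n_nrc / n_hat:.1%}")                     # ~75%
      print(f"Either source: {(n_mntb + n_nrc - n_both) / n_hat:.1%}")   # ~92%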

  12. Long-range Self-interacting Dark Matter in the Sun

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, Jing; State Key Laboratory of Theoretical Physics, Kavli Institute for Theoretical Physics China, Institute of Theoretical Physics, Chinese Academy of Science, Zhong Guan Cun East Street 55#, Beijing, 100190; Liang, Zheng-Liang

    2015-12-10

    We investigate the implications of the long-range self-interaction on both the self-capture and the annihilation of the self-interacting dark matter (SIDM) trapped in the Sun. Our discussion is based on a specific SIDM model in which DM particles self-interact via a light scalar mediator, or Yukawa potential, in the context of quantum mechanics. Within this framework, we calculate the self-capture rate across a broad region of parameter space. While the self-capture rate can be obtained separately in the Born regime with a perturbative method and in the classical limit with the Rutherford formula, our calculation covers the gap between them in a non-perturbative fashion. The phenomenology of both the Sommerfeld-enhanced s- and p-wave annihilation of the solar SIDM is also discussed. Moreover, by combining the analysis of the Super-Kamiokande (SK) data and the observed DM relic density, we constrain the nuclear capture rate of the DM particles in the presence of the dark Yukawa potential. The consequence of the long-range dark force on probing the solar SIDM turns out to be significant if the force-carrier is much lighter than the DM particle, and a quantitative analysis is provided.

  13. Electricity from fossil fuels without CO2 emissions: assessing the costs of carbon dioxide capture and sequestration in U.S. electricity markets.

    PubMed

    Johnson, T L; Keith, D W

    2001-10-01

    The decoupling of fossil-fueled electricity production from atmospheric CO2 emissions via CO2 capture and sequestration (CCS) is increasingly regarded as an important means of mitigating climate change at a reasonable cost. Engineering analyses of CO2 mitigation typically compare the cost of electricity for a base generation technology to that for a similar plant with CO2 capture and then compute the carbon emissions mitigated per unit of cost. It can be hard to interpret mitigation cost estimates from this plant-level approach when a consistent base technology cannot be identified. In addition, neither engineering analyses nor general equilibrium models can capture the economics of plant dispatch. A realistic assessment of the costs of carbon sequestration as an emissions abatement strategy in the electric sector therefore requires a systems-level analysis. We discuss various frameworks for computing mitigation costs and introduce a simplified model of electric sector planning. Results from a "bottom-up" engineering-economic analysis for a representative U.S. North American Electric Reliability Council (NERC) region illustrate how the penetration of CCS technologies and the dispatch of generating units vary with the price of carbon emissions and thereby determine the relationship between mitigation cost and emissions reduction.
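
    The plant-level comparison described above is conventionally summarized as the increase in the cost of electricity divided by the emissions avoided (or its inverse, emissions mitigated per unit cost). A minimal sketch with hypothetical placeholder numbers, not results from the paper:

      # cost of CO2 avoided = (COE_capture - COE_base) / (emissions_base - emissions_capture)
      def cost_of_co2_avoided(coe_base, coe_capture, em_base, em_capture):
          """COE in $/MWh, emissions in t CO2/MWh; returns $/t CO2 avoided."""
          return (coe_capture - coe_base) / (em_base - em_capture)

      coe_base, em_base = 50.0, 0.80        # hypothetical base plant
      coe_capture, em_capture = 75.0, 0.10  # same plant with CO2 capture

      cost = cost_of_co2_avoided(coe_base, coe_capture, em_base, em_capture)
      print(f"Plant-level mitigation cost: {cost:.0f} $/t CO2 avoided")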

  14. Electricity from Fossil Fuels without CO2 Emissions: Assessing the Costs of Carbon Dioxide Capture and Sequestration in U.S. Electricity Markets.

    PubMed

    Johnson, Timothy L; Keith, David W

    2001-10-01

    The decoupling of fossil-fueled electricity production from atmospheric CO2 emissions via CO2 capture and sequestration (CCS) is increasingly regarded as an important means of mitigating climate change at a reasonable cost. Engineering analyses of CO2 mitigation typically compare the cost of electricity for a base generation technology to that for a similar plant with CO2 capture and then compute the carbon emissions mitigated per unit of cost. It can be hard to interpret mitigation cost estimates from this plant-level approach when a consistent base technology cannot be identified. In addition, neither engineering analyses nor general equilibrium models can capture the economics of plant dispatch. A realistic assessment of the costs of carbon sequestration as an emissions abatement strategy in the electric sector therefore requires a systems-level analysis. We discuss various frameworks for computing mitigation costs and introduce a simplified model of electric sector planning. Results from a "bottom-up" engineering-economic analysis for a representative U.S. North American Electric Reliability Council (NERC) region illustrate how the penetration of CCS technologies and the dispatch of generating units vary with the price of carbon emissions and thereby determine the relationship between mitigation cost and emissions reduction.

  15. On the correlation between motion data captured from low-cost gaming controllers and high precision encoders.

    PubMed

    Purkayastha, Sagar N; Byrne, Michael D; O'Malley, Marcia K

    2012-01-01

    Gaming controllers are attractive devices for research due to their onboard sensing capabilities and low-cost. However, a proper quantitative analysis regarding their suitability for use in motion capture, rehabilitation and as input devices for teleoperation and gesture recognition has yet to be conducted. In this paper, a detailed analysis of the sensors of two of these controllers, the Nintendo Wiimote and the Sony Playstation 3 Sixaxis, is presented. The acceleration and angular velocity data from the sensors of these controllers were compared and correlated with computed acceleration and angular velocity data derived from a high resolution encoder. The results show high correlation between the sensor data from the controllers and the computed data derived from the position data of the encoder. From these results, it can be inferred that the Wiimote is more consistent and better suited for motion capture applications and as an input device than the Sixaxis. The applications of the findings are discussed with respect to potential research ventures.
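
    The comparison described above amounts to correlating controller sensor signals with rates differentiated from the encoder position data. A minimal sketch on synthetic signals follows; the sample rate, signal and noise level are assumptions, not values from the paper.

      # Correlate a controller's angular-velocity signal with angular velocity
      # numerically differentiated from encoder positions (all signals synthetic).
      import numpy as np

      fs = 100.0                                   # Hz, assumed common sample rate
      t = np.arange(0, 5, 1 / fs)
      encoder_angle = 0.5 * np.sin(2 * np.pi * 1.0 * t)                 # rad
      encoder_rate = np.gradient(encoder_angle, 1 / fs)                 # rad/s
      controller_rate = encoder_rate + 0.05 * np.random.randn(t.size)   # noisy "gyro"

      r = np.corrcoef(controller_rate, encoder_rate)[0, 1]
      print(f"Pearson correlation r = {r:.3f}")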

  16. Improving the Design of a Conservation Reserve for a Critically Endangered Species

    PubMed Central

    2017-01-01

    Setting aside protected areas is a key strategy for tackling biodiversity loss. Reserve effectiveness depends on the extent to which protected areas capture both known occurrences and areas likely to support the species. We assessed the effectiveness of the existing reserve network for Leadbeater’s Possum (Gymnobelideus leadbeateri) and other forest-dependent species, and compared the existing reserve system to a set of plausible reserve expansion options based on area targets implied in a recent Population Viability Analysis (PVA). The existing Leadbeater’s Reserve and surrounding reserve system captured 7.6% and 29.6% of cumulative habitat suitability, respectively, across the landscape. Expanded reserve scenarios captured 34% to 62% of cumulative habitat suitability. We found acute trade-offs between conserving Leadbeater’s Possum habitat and conserving habitat of other forest-dependent species. Our analysis provides a template for systematically expanding and evaluating reserve expansion options in terms of trade-offs between priority species’ needs. PMID:28121984

  17. Capture-recapture studies for multiple strata including non-markovian transitions

    USGS Publications Warehouse

    Brownie, C.; Hines, J.E.; Nichols, J.D.; Pollock, K.H.; Hestbeck, J.B.

    1993-01-01

    We consider capture-recapture studies where release and recapture data are available from each of a number of strata on every capture occasion. Strata may, for example, be geographic locations or physiological states. Movement of animals among strata occurs with unknown probabilities, and estimation of these unknown transition probabilities is the objective. We describe a computer routine for carrying out the analysis under a model that assumes Markovian transitions and under reduced parameter versions of this model. We also introduce models that relax the Markovian assumption and allow 'memory' to operate (i.e., allow dependence of the transition probabilities on the previous state). For these models, we suggest an analysis based on a conditional likelihood approach. Methods are illustrated with data from a large study on Canada geese (Branta canadensis) banded in three geographic regions. The assumption of Markovian transitions is rejected convincingly for these data, emphasizing the importance of the more general models that allow memory.
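
    Under the Markovian model, the quantities of interest are stratum-to-stratum movement probabilities. The sketch below shows only the simplest building block, maximum-likelihood transition probabilities from observed transition counts assuming perfect detection; the models in the paper are fitted to full release-recapture histories and also involve capture probabilities, and the counts shown are hypothetical.

      # counts[i, j] = animals released in stratum i and next recaptured in stratum j
      import numpy as np

      counts = np.array([[120, 30, 10],
                         [ 25, 90, 15],
                         [  5, 20, 60]], dtype=float)   # hypothetical data

      transition_probs = counts / counts.sum(axis=1, keepdims=True)
      print(np.round(transition_probs, 3))

      # A 'memory' (non-Markovian) model would instead condition these
      # probabilities on the previous stratum as well: P(next | current, previous).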

  18. Isolation and mutational analysis of circulating tumor cells from lung cancer patients with magnetic sifters and biochips†

    PubMed Central

    Earhart, Christopher M.; Hughes, Casey E.; Gaster, Richard S.; Ooi, Chin Chun; Wilson, Robert J.; Zhou, Lisa Y.; Humke, Eric W.; Xu, Lingyun; Wong, Dawson J.; Willingham, Stephen B.; Schwartz, Erich J.; Weissman, Irving L.; Jeffrey, Stefanie S.; Neal, Joel W.; Rohatgi, Rajat; Wakelee, Heather A.; Wang, Shan X.

    2014-01-01

    Detection and characterization of circulating tumor cells (CTCs) may reveal insights into the diagnosis and treatment of malignant disease. Technologies for isolating CTCs developed thus far suffer from one or more limitations, such as low throughput, inability to release captured cells, and reliance on expensive instrumentation for enrichment or subsequent characterization. We report a continuing development of a magnetic separation device, the magnetic sifter, which is a miniature microfluidic chip with a dense array of magnetic pores. It offers high efficiency capture of tumor cells, labeled with magnetic nanoparticles, from whole blood with high throughput and efficient release of captured cells. For subsequent characterization of CTCs, an assay, using a protein chip with giant magnetoresistive nanosensors, has been implemented for mutational analysis of CTCs enriched with the magnetic sifter. The use of these magnetic technologies, which are separate devices, may lead the way to routine preparation and characterization of “liquid biopsies” from cancer patients. PMID:23969419

  19. Transcriptome In Vivo Analysis (TIVA) of spatially defined single cells in intact live mouse and human brain tissue

    PubMed Central

    Lovatt, Ditte; Ruble, Brittani K.; Lee, Jaehee; Dueck, Hannah; Kim, Tae Kyung; Fisher, Stephen; Francis, Chantal; Spaethling, Jennifer M.; Wolf, John A.; Grady, M. Sean; Ulyanova, Alexandra V.; Yeldell, Sean B.; Griepenburg, Julianne C.; Buckley, Peter T.; Kim, Junhyong; Sul, Jai-Yoon; Dmochowski, Ivan J.; Eberwine, James

    2014-01-01

    Transcriptome profiling is an indispensable tool in advancing the understanding of single cell biology, but depends upon methods capable of isolating mRNA at the spatial resolution of a single cell. Current capture methods lack sufficient spatial resolution to isolate mRNA from individual in vivo resident cells without damaging adjacent tissue. Because of this limitation, it has been difficult to assess the influence of the microenvironment on the transcriptome of individual neurons. Here, we engineered a Transcriptome In Vivo Analysis (TIVA)-tag, which upon photoactivation enables mRNA capture from single cells in live tissue. Using the TIVA-tag in combination with RNA-seq to analyze transcriptome variance among single dispersed cells and in vivo resident mouse and human neurons, we show that the tissue microenvironment shapes the transcriptomic landscape of individual cells. The TIVA methodology provides the first noninvasive approach for capturing mRNA from single cells in their natural microenvironment. PMID:24412976

  20. A Combined Experimental/Computational Investigation of a Rocket Based Combined Cycle Inlet

    NASA Technical Reports Server (NTRS)

    Smart, Michael K.; Trexler, Carl A.; Goldman, Allen L.

    2001-01-01

    A rocket based combined cycle inlet geometry has undergone wind tunnel testing and computational analysis with Mach 4 flow at the inlet face. Performance parameters obtained from the wind tunnel tests were the mass capture, the maximum back-pressure, and the self-starting characteristics of the inlet. The CFD analysis supplied a confirmation of the mass capture, the inlet efficiency and the details of the flowfield structure. Physical parameters varied during the test program were cowl geometry, cowl position, body-side bleed magnitude and ingested boundary layer thickness. An optimum configuration was determined for the inlet as a result of this work.

  1. Stock Analysis of Metapenaeus affinis (H.Milne Edwards, 1837) on the North Coast of Central Java, Indonesia

    NASA Astrophysics Data System (ADS)

    Wijaya Saputra, Suradi; Solichin, Anhar; Teguh Taufani, Wiwiet

    2018-02-01

    This research aims at studying the length at first capture (Lc), the length at sexual maturity (Lm), and the stock of Metapenaeus affinis on the north coast of Central Java, Indonesia. The field research activities were conducted from May 2016 to April 2017 using a survey method and direct observations of the catch of the fishing units. The results showed that the total length at first capture is 76.4 mm for male shrimp and 63 mm for females. The total length at first sexual maturity is 116 mm. The Y'/R analysis for male shrimp gives a maximum exploitation rate Emax of 0.595, an exploitation rate E0.1 of 0.521, and an exploitation rate E0.5 of 0.352. For female shrimp, Emax = 0.637, E0.1 = 0.562 and E0.5 = 0.373. To increase production and preserve the shrimp resource, it is ideal if Lc = Lm, i.e. at a total length of 116 mm. In conclusion, increasing the length at first capture will increase Y'/R and B'/R.
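
    The reference points Emax, E0.1 and E0.5 quoted above are conventionally read off the Beverton and Holt (1966) relative yield-per-recruit curve. A sketch of that calculation follows; the Lc/Linf and M/K values are hypothetical, since the fitted growth parameters are not reported in the abstract.

      # Beverton & Holt relative yield per recruit, Y'/R, as a function of the
      # exploitation rate E, for assumed Lc/Linf and M/K.
      import numpy as np

      def relative_yield_per_recruit(E, Lc_over_Linf, M_over_K):
          U = 1.0 - Lc_over_Linf
          m = (1.0 - E) / M_over_K
          return E * U**M_over_K * (1 - 3*U/(1 + m) + 3*U**2/(1 + 2*m) - U**3/(1 + 3*m))

      E = np.linspace(0.01, 0.99, 981)
      ypr = relative_yield_per_recruit(E, Lc_over_Linf=0.5, M_over_K=1.5)   # assumed values

      E_max = E[np.argmax(ypr)]                              # E maximizing Y'/R
      slope = np.gradient(ypr, E)
      E_01 = E[np.argmin(np.abs(slope - 0.1 * slope[0]))]    # slope = 10% of slope at origin
      print(f"Emax ~ {E_max:.2f}, E0.1 ~ {E_01:.2f}")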

  2. Pentafluorobenzyl bromide-A versatile derivatization agent in chromatography and mass spectrometry: I. Analysis of inorganic anions and organophosphates.

    PubMed

    Tsikas, Dimitrios

    2017-02-01

    Pentafluorobenzyl bromide (PFB-Br) is a versatile derivatization agent. It has been widely used in chromatography and mass spectrometry for several decades. The bromide atom is essentially the only leaving group of PFB-Br. It is substituted by a wide spectrum of nucleophiles in aqueous and non-aqueous systems to form derivatives that are electrically neutral, soluble in most organic solvents, generally thermally stable, volatile, strongly electron-capturing and ultraviolet light-absorbing. Because of these greatly favoured physicochemical properties, PFB-Br emerged as an ideal derivatization agent for highly sensitive analysis of endogenous and exogenous substances, including various inorganic and organic anions, by electron capture detection or after electron-capture negative-ion chemical ionization in GC-MS. The present article attempts an appraisal of the utility of PFB-Br in analytical chemistry. It reviews and discusses papers dealing with the use of PFB-Br as the derivatization reagent in the qualitative and quantitative analysis of endogenous and exogenous inorganic anions in various biological samples, notably plasma, urine and saliva. These analytes include nitrite, nitrate, cyanide and dialkyl organophosphates. Special emphasis is given to mass spectrometry-based approaches and stable-isotope dilution techniques. Copyright © 2016 Elsevier B.V. All rights reserved.

  3. Engineering and Economic Analysis of an Advanced Ultra-Supercritical Pulverized Coal Power Plant with and without Post-Combustion Carbon Capture Task 7. Design and Economic Studies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Booras, George; Powers, J.; Riley, C.

    2015-09-01

    This report evaluates the economics and performance of two A-USC PC power plants: Case 1 is a conventionally configured A-USC PC power plant with superior emission controls, but without CO2 removal; Case 2 adds a post-combustion carbon capture (PCC) system to the plant from Case 1, using the design and heat integration strategies from EPRI's 2015 report, "Best Integrated Coal Plant." The capture design basis for this case is "partial," to meet EPA's proposed New Source Performance Standard, which was initially proposed as 500 kg-CO2/MWh (gross) or 1100 lb-CO2/MWh (gross), but modified in August 2015 to 635 kg-CO2/MWh (gross) or 1400 lb-CO2/MWh (gross). This report draws upon the collective experience of consortium members, with EPRI and General Electric leading the study. General Electric provided the steam cycle analysis as well as the steam turbine design and cost estimating. EPRI performed integrated plant performance analysis using EPRI's PC Cost model.

  4. Insights Gained from Forensic Analysis with MELCOR of the Fukushima-Daiichi Accidents.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Andrews, Nathan C.; Gauntt, Randall O.

    Since the accidents at Fukushima-Daiichi, Sandia National Laboratories has been modeling these accident scenarios using the severe accident analysis code, MELCOR. MELCOR is a widely used computer code developed at Sandia National Laboratories since ~1982 for the U.S. Nuclear Regulatory Commission. Insights from the modeling of these accidents are being used to better inform future code development and potentially improved accident management. To date, the need to better capture in-vessel thermal-hydraulics and ex-vessel melt coolability and concrete interactions has led to the implementation of new models. The most recent analyses, presented in this paper, have been in support of the Organization for Economic Cooperation and Development Nuclear Energy Agency's (OECD/NEA) Benchmark Study of the Accident at the Fukushima Daiichi Nuclear Power Station (BSAF) Project. The goal of this project is to accurately capture the source term from all three releases and then model the atmospheric dispersion. In order to do this, a forensic approach is being used in which available plant data and release timings are used to inform the modeled MELCOR accident scenario. For example, containment failures, core slumping events and lower head failure timings are all enforced parameters in these analyses. This approach is fundamentally different from the blind code assessment analysis often used in standard problem exercises. The timings of these events are informed by representative spikes or decreases in plant data. The combination of improvements to the MELCOR source code resulting from previous accident analyses and this forensic approach has allowed Sandia to generate representative and plausible source terms for all three accidents at Fukushima Daiichi out to three weeks after the accident, capturing both early and late releases. In particular, using the source terms developed by MELCOR together with the MACCS software code, which models atmospheric dispersion and deposition, we are able to reasonably capture the deposition of radionuclides to the northwest of the reactor site.

  5. Exercise Sensing and Pose Recovery Inference Tool (ESPRIT) - A Compact Stereo-based Motion Capture Solution For Exercise Monitoring

    NASA Technical Reports Server (NTRS)

    Lee, Mun Wai

    2015-01-01

    Crew exercise is important during long-duration space flight not only for maintaining health and fitness but also for preventing adverse health problems, such as losses in muscle strength and bone density. Monitoring crew exercise via motion capture and kinematic analysis aids understanding of the effects of microgravity on exercise and helps ensure that exercise prescriptions are effective. Intelligent Automation, Inc., has developed ESPRIT to monitor exercise activities, detect body markers, extract image features, and recover three-dimensional (3D) kinematic body poses. The system relies on prior knowledge and modeling of the human body and on advanced statistical inference techniques to achieve robust and accurate motion capture. In Phase I, the company demonstrated motion capture of several exercises, including walking, curling, and dead lifting. Phase II efforts focused on enhancing algorithms and delivering an ESPRIT prototype for testing and demonstration.

  6. The Isolation of Pure Populations of Neurons by Laser Capture Microdissection: Methods and Application in Neuroscience.

    PubMed

    Morris, Renée; Mehta, Prachi

    2018-01-01

    In mammals, the central nervous system (CNS) is composed of various cellular elements, posing a challenge to isolating specific cell types to investigate their expression profiles. As a result, tissue homogenization is not suitable for motor neuron profiling analyses, as these cells represent less than 10% of the total spinal cord cell population. One way to tackle the problem of tissue heterogeneity and obtain meaningful genomic, proteomic, and transcriptomic profiling is to use laser capture microdissection (LCM) technology. In this chapter, we describe protocols for the capture of isolated populations of motor neurons from spinal cord tissue sections and for downstream transcriptomic analysis of motor neurons with RT-PCR. We have also included a protocol for the immunological confirmation that the captured neurons are indeed motor neurons. Although focused on spinal cord motor neurons, these protocols can be easily optimized for the isolation of any CNS neurons.

  7. Monte Carlo analysis of TRX lattices with ENDF/B version 3 data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hardy, J. Jr.

    1975-03-01

    Four TRX water-moderated lattices of slightly enriched uranium rods have been reanalyzed with consistent ENDF/B Version 3 data by means of the full-range Monte Carlo program RECAP. The following measured lattice parameters were studied: ratio of epithermal-to-thermal 238U capture, ratio of epithermal-to-thermal 235U fissions, ratio of 238U captures to 235U fissions, ratio of 238U fissions to 235U fissions, and multiplication factor. In addition to the base calculations, some studies were done to find the sensitivity of the TRX lattice parameters to selected variations of cross section data. Finally, additional experimental evidence is afforded by effective 238U capture integrals for isolated rods. Shielded capture integrals were calculated for 238U metal and oxide rods. These are compared with other measurements. (auth)

  8. Cryo-Scanning Electron Microscopy of Captured Cirrus Ice Particles

    NASA Astrophysics Data System (ADS)

    Magee, N. B.; Boaggio, K.; Bandamede, M.; Bancroft, L.; Hurler, K.

    2016-12-01

    We present the latest collection of high-resolution cryo-scanning electron microscopy images and microanalysis of cirrus ice particles captured by high-altitude balloon (ICE-Ball, see abstracts by K. Boaggio and M. Bandamede). Ice particle images and sublimation residues are derived from particles captured during approximately 15 balloon flights conducted in Pennsylvania and New Jersey over the past 12 months. Measurements include 3D digital elevation model reconstructions of ice particles, and associated statistical analyses of entire particles and particle sub-facets and surfaces. This 3D analysis reveals that the morphologies of most captured ice particles deviate significantly from ideal habits, and display geometric complexity and surface roughness at multiple measurable scales, ranging from hundreds of nanometers to hundreds of microns. The presentation suggests a potential path forward for representing scattering from a realistically complex array of ice particle shapes and surfaces.

  9. Search for two-neutrino double electron capture of 124Xe with XENON100

    NASA Astrophysics Data System (ADS)

    Aprile, E.; Aalbers, J.; Agostini, F.; Alfonsi, M.; Amaro, F. D.; Anthony, M.; Arneodo, F.; Barrow, P.; Baudis, L.; Bauermeister, B.; Benabderrahmane, M. L.; Berger, T.; Breur, P. A.; Brown, A.; Brown, E.; Bruenner, S.; Bruno, G.; Budnik, R.; Bütikofer, L.; Calvén, J.; Cardoso, J. M. R.; Cervantes, M.; Cichon, D.; Coderre, D.; Colijn, A. P.; Conrad, J.; Cussonneau, J. P.; Decowski, M. P.; de Perio, P.; di Gangi, P.; di Giovanni, A.; Diglio, S.; Duchovni, E.; Fei, J.; Ferella, A. D.; Fieguth, A.; Franco, D.; Fulgione, W.; Gallo Rosso, A.; Galloway, M.; Gao, F.; Garbini, M.; Geis, C.; Goetzke, L. W.; Greene, Z.; Grignon, C.; Hasterok, C.; Hogenbirk, E.; Itay, R.; Kaminsky, B.; Kessler, G.; Kish, A.; Landsman, H.; Lang, R. F.; Lellouch, D.; Levinson, L.; Le Calloch, M.; Levy, C.; Lin, Q.; Lindemann, S.; Lindner, M.; Lopes, J. A. M.; Manfredini, A.; Marrodán Undagoitia, T.; Masbou, J.; Massoli, F. V.; Masson, D.; Mayani, D.; Meng, Y.; Messina, M.; Micheneau, K.; Miguez, B.; Molinario, A.; Murra, M.; Naganoma, J.; Ni, K.; Oberlack, U.; Orrigo, S. E. A.; Pakarha, P.; Pelssers, B.; Persiani, R.; Piastra, F.; Pienaar, J.; Piro, M.-C.; Plante, G.; Priel, N.; Rauch, L.; Reichard, S.; Reuter, C.; Rizzo, A.; Rosendahl, S.; Rupp, N.; Dos Santos, J. M. F.; Sartorelli, G.; Scheibelhut, M.; Schindler, S.; Schreiner, J.; Schumann, M.; Scotto Lavina, L.; Selvi, M.; Shagin, P.; Silva, M.; Simgen, H.; Sivers, M. V.; Stein, A.; Thers, D.; Tiseni, A.; Trinchero, G.; Tunnell, C. D.; Wall, R.; Wang, H.; Weber, M.; Wei, Y.; Weinheimer, C.; Wulf, J.; Zhang, Y.; Xenon Collaboration

    2017-02-01

    Two-neutrino double electron capture is a rare nuclear decay where two electrons are simultaneously captured from the atomic shell. For 124Xe this process has not yet been observed and its detection would provide a new reference for nuclear matrix element calculations. We have conducted a search for two-neutrino double electron capture from the K shell of 124Xe using 7636 kg·d of data from the XENON100 dark matter detector. Using a Bayesian analysis we observed no significant excess above background, leading to a lower 90% credibility limit on the half-life T1/2 > 6.5 × 10^20 yr. We have also evaluated the sensitivity of the XENON1T experiment, which is currently being commissioned, and found a sensitivity of T1/2 > 6.1 × 10^22 yr after an exposure of 2 t·yr.

  10. A Versatile Microarray Platform for Capturing Rare Cells

    NASA Astrophysics Data System (ADS)

    Brinkmann, Falko; Hirtz, Michael; Haller, Anna; Gorges, Tobias M.; Vellekoop, Michael J.; Riethdorf, Sabine; Müller, Volkmar; Pantel, Klaus; Fuchs, Harald

    2015-10-01

    Analyses of rare events occurring at extremely low frequencies in body fluids are still challenging. We established a versatile microarray-based platform able to capture single target cells from large background populations. As a use case we chose the challenging application of detecting circulating tumor cells (CTCs) - about one cell in a billion normal blood cells. After incubation with an antibody cocktail, targeted cells are extracted on a microarray in a microfluidic chip. The accessibility of our platform allows for subsequent recovery of targets for further analysis. The microarray facilitates exclusion of false-positive capture events by co-localization, allowing for detection without fluorescent labelling. Analyzing blood samples from cancer patients with our platform reached and partly exceeded gold-standard performance, demonstrating feasibility for clinical application. The clinical researcher's free choice of antibody cocktail, without any need to alter chip manufacturing or the incubation protocol, allows virtually arbitrary targeting of capture species and therefore widespread applications in the biomedical sciences.

  11. Automatic human body modeling for vision-based motion capture system using B-spline parameterization of the silhouette

    NASA Astrophysics Data System (ADS)

    Jaume-i-Capó, Antoni; Varona, Javier; González-Hidalgo, Manuel; Mas, Ramon; Perales, Francisco J.

    2012-02-01

    Human motion capture has a wide variety of applications, and in vision-based motion capture systems a major issue is the human body model and its initialization. We present a computer vision algorithm for building a human body model skeleton in an automatic way. The algorithm is based on the analysis of the human shape. We decompose the body into its main parts by computing the curvature of a B-spline parameterization of the human contour. This algorithm has been applied in a context where the user is standing in front of a camera stereo pair. The process is completed after the user assumes a predefined initial posture so as to identify the main joints and construct the human model. Using this model, the initialization problem of a vision-based markerless motion capture system of the human body is solved.
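
    The decomposition described above rests on the curvature of a B-spline fit to the silhouette contour. A minimal sketch of that computation, using a synthetic ellipse in place of a real silhouette, is given below; curvature extrema would mark candidate limb tips and joints.

      # Fit a periodic B-spline to 2D contour points and evaluate the signed
      # curvature kappa = (x'y'' - y'x'') / (x'^2 + y'^2)^(3/2).
      import numpy as np
      from scipy.interpolate import splprep, splev

      theta = np.linspace(0, 2 * np.pi, 201)             # closed contour (endpoint repeated)
      x, y = 2.0 * np.cos(theta), 1.0 * np.sin(theta)    # synthetic ellipse

      tck, _ = splprep([x, y], s=0, per=True)            # periodic cubic B-spline
      u = np.linspace(0, 1, 500)
      dx, dy = splev(u, tck, der=1)
      ddx, ddy = splev(u, tck, der=2)

      curvature = (dx * ddy - dy * ddx) / (dx**2 + dy**2) ** 1.5
      print("max |curvature| =", float(np.abs(curvature).max()))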

  12. Supplemental materials for the analysis of capture-recapture data for polar bears in Western Hudson Bay, Canada, 1984-2004

    USGS Publications Warehouse

    Regehr, Eric V.; Lunn, Nicholas J.; Amstrup, Steven C.; Stirling, Ian

    2007-01-01

    Regehr and others (2007, Survival and population size of polar bears in western Hudson Bay in relation to earlier sea ice breakup: Journal of Wildlife Management, v. 71, no. 8) evaluated survival in relation to climatic conditions and estimated population size for polar bears (Ursus maritimus) in western Hudson Bay, Canada. Here, we provide supplemental materials for the analyses in Regehr and others (2007). We demonstrate how tag-return data from harvested polar bears were used to adjust estimates of total survival for human-caused mortality. We describe the sex and age composition of the capture and harvest samples and provide results for goodness-of-fit tests applied to capture-recapture models. We also describe the capture-recapture model selection procedure and the structure of the most supported model, which was used to estimate survival and population size.

  13. Synergistic adhesion mechanisms of spider capture silk.

    PubMed

    Guo, Yang; Chang, Zheng; Guo, Hao-Yuan; Fang, Wei; Li, Qunyang; Zhao, Hong-Ping; Feng, Xi-Qiao; Gao, Huajian

    2018-03-01

    It is well known that capture silk, the main sticky component of the orb web of a spider, plays an important role in the spider's ability to capture prey via adhesion. However, the detailed mechanism with which the spider achieves its unparalleled high-adhesion performance remains elusive. In this work, we combine experiments and theoretical analysis to investigate the adhesion mechanisms of spider silk. In addition to the widely recognized adhesion effect of the sticky glue, we reveal a synergistic enhancement mechanism due to the elasticity of silk fibres. A balance between silk stiffness, strength and glue stickiness is crucial to endow the silk with superior adhesion, as well as outstanding energy absorption capacity and structural robustness. The revealed mechanisms deepen our understanding of the working principles of spider silk and suggest guidelines for biomimetic designs of spider-inspired adhesion and capture devices. © 2018 The Author(s).

  14. A photoelectrochemical platform for the capture and release of rare single cells.

    PubMed

    Parker, Stephen G; Yang, Ying; Ciampi, Simone; Gupta, Bakul; Kimpton, Kathleen; Mansfeld, Friederike M; Kavallaris, Maria; Gaus, Katharina; Gooding, J Justin

    2018-06-12

    For many normal and aberrant cell behaviours, it is important to understand the origin of cellular heterogeneity. Although powerful methods for studying cell heterogeneity have emerged, they are more suitable for common rather than rare cells. Exploring the heterogeneity of rare single cells is challenging because these rare cells must be first pre-concentrated and undergo analysis prior to classification and expansion. Here, a versatile capture & release platform consisting of an antibody-modified and electrochemically cleavable semiconducting silicon surface for release of individual cells of interest is presented. The captured cells can be interrogated microscopically and tested for drug responsiveness prior to release and recovery. The capture & release strategy was applied to identify rare tumour cells from whole blood, monitor the uptake of, and response to, doxorubicin and subsequently select cells for single-cell gene expression based on their response to the doxorubicin.

  15. Test of 4-body Theory via Polarized p-T Capture Below 80 keV

    NASA Astrophysics Data System (ADS)

    Canon, R. S.; Gaff, S. J.; Kelley, J. H.; Schreiber, E. C.; Weller, H. R.; Wulf, E. A.; Prior, R. M.; Spraker, M.; Tilley, D. R.

    1998-10-01

    Our previous study of polarized p-d capture at energies below 80 keV revealed the major role played by MEC effects and provided a clean testing ground for state-of-the-art 3-body theory (the "Ay puzzle" remains) (G. Schmid et al., PRL 76, 3088 (1996); PRC 56, 2565 (1997)). Four-body theory is on the threshold (A. Fonseca, W. Glöckle, A. Kievsky, H. Witala; private communication) of being able to make similar ab-initio predictions. The p-T capture reaction is expected to exhibit strong MEC effects at very low energies for reasons similar to those in p-d capture. Preliminary results indicate finite values of Ay(90°) in the 50-80 keV region. These results will be discussed with respect to their implications on the M1 strength present in this reaction. Plans for future measurements and analysis will also be described.

  16. Attention capture without awareness in a non-spatial selection task.

    PubMed

    Oriet, Chris; Pandey, Mamata; Kawahara, Jun-Ichiro

    2017-02-01

    Distractors presented prior to a critical target in a rapid sequence of visually-presented items induce a lag-dependent deficit in target identification, particularly when the distractor shares a task-relevant feature of the target. Presumably, such capture of central attention is important for bringing a target into awareness. The results of the present investigation suggest that greater capture of attention by a distractor is not accompanied by greater awareness of it. Moreover, awareness tends to be limited to superficial characteristics of the target such as colour. The findings are interpreted within the context of a model that assumes sudden increases in arousal trigger selection of information for consolidation in working memory. In this conceptualization, prolonged analysis of distractor items sharing task-relevant features leads to larger target identification deficits (i.e., greater capture) but no increase in awareness. Copyright © 2016 Elsevier Inc. All rights reserved.

  17. A framework for optimization and quantification of uncertainty and sensitivity for developing carbon capture systems

    DOE PAGES

    Eslick, John C.; Ng, Brenda; Gao, Qianwen; ...

    2014-12-31

    Under the auspices of the U.S. Department of Energy’s Carbon Capture Simulation Initiative (CCSI), a Framework for Optimization and Quantification of Uncertainty and Sensitivity (FOQUS) has been developed. This tool enables carbon capture systems to be rapidly synthesized and rigorously optimized, in an environment that accounts for and propagates uncertainties in parameters and models. FOQUS currently enables (1) the development of surrogate algebraic models utilizing the ALAMO algorithm, which can be used for superstructure optimization to identify optimal process configurations, (2) simulation-based optimization utilizing derivative free optimization (DFO) algorithms with detailed black-box process models, and (3) rigorous uncertainty quantification through PSUADE. FOQUS utilizes another CCSI technology, the Turbine Science Gateway, to manage the thousands of simulated runs necessary for optimization and UQ. Thus, this computational framework has been demonstrated for the design and analysis of a solid sorbent based carbon capture system.
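
    The simulation-based optimization capability described above couples derivative-free optimizers to black-box process models. The sketch below shows the general pattern with a stand-in analytic objective and SciPy's Nelder-Mead; it does not use FOQUS, the Turbine Science Gateway, or a real capture flowsheet, and the variable names are purely illustrative.

      # Derivative-free optimization of a black-box "process model".
      import numpy as np
      from scipy.optimize import minimize

      def capture_system_cost(x):
          """Hypothetical black-box objective; a real run would call a flowsheet."""
          sorbent_flow, regen_temp = x                                   # scaled variables
          capture_penalty = (1.0 - np.tanh(2.0 * sorbent_flow)) * 50.0   # missed capture
          energy_cost = 5.0 * regen_temp**2 + 2.0 * sorbent_flow         # utilities
          return capture_penalty + energy_cost

      result = minimize(capture_system_cost, x0=[0.2, 1.0], method="Nelder-Mead")
      print("optimal (scaled) design:", np.round(result.x, 3), "cost:", round(result.fun, 2))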

  18. Design framework for a spectral mask for a plenoptic camera

    NASA Astrophysics Data System (ADS)

    Berkner, Kathrin; Shroff, Sapna A.

    2012-01-01

    Plenoptic cameras are designed to capture different combinations of light rays from a scene, sampling its lightfield. Such camera designs capturing directional ray information enable applications such as digital refocusing, rotation, or depth estimation. Only a few address capturing spectral information from the scene. It has been demonstrated that by modifying a plenoptic camera with a filter array containing different spectral filters inserted in the pupil plane of the main lens, sampling of the spectral dimension of the plenoptic function is performed. As a result, the plenoptic camera is turned into a single-snapshot multispectral imaging system that trades off spatial against spectral information captured with a single sensor. Little work has been performed so far on analyzing the effects of diffraction and aberrations of the optical system on the performance of the spectral imager. In this paper we demonstrate simulation of a spectrally-coded plenoptic camera optical system via wave propagation analysis, evaluate the quality of the spectral measurements captured at the detector plane, and demonstrate opportunities for optimization of the spectral mask for a few sample applications.

  19. A smart core-sheath nanofiber that captures and releases red blood cells from the blood.

    PubMed

    Shi, Q; Hou, J; Zhao, C; Xin, Z; Jin, J; Li, C; Wong, S-C; Yin, J

    2016-01-28

    A smart core-sheath nanofiber for non-adherent cell capture and release is demonstrated. The nanofibers are fabricated by single-spinneret electrospinning of poly(N-isopropylacrylamide) (PNIPAAm), polycaprolactone (PCL) and nattokinase (NK) solution blends. The self-assembly of PNIPAAm and PCL blends during the electrospinning generates the core-sheath PCL/PNIPAAm nanofibers with PNIPAAm as the sheath. The PNIPAAm-based core-sheath nanofibers are switchable between hydrophobicity and hydrophilicity with temperature change and enhance stability in the blood. When the nanofibers come in contact with blood, the NK is released from the nanofibers to resist platelet adhesion on the nanofiber surface, facilitating the direct capture and isolation of red blood cells (RBCs) from the blood above phase-transition temperature of PNIPAAm. Meanwhile, the captured RBCs are readily released from the nanofibers with temperature stimuli in an undamaged manner. The release efficiency of up to 100% is obtained while maintaining cellular integrity and function. This work presents promising nanofibers to effectively capture non-adherent cells and release for subsequent molecular analysis and diagnosis of single cells.

  20. Yemen Country Analysis Brief

    EIA Publications

    2014-01-01

    Update: September 30, 2015 Since the publication of the Yemen Country Analysis Brief in September 2014 below, the situation in Yemen has deteriorated significantly. In September 2014, the Shia Houthi rebel group seized the Yemeni capital of Sanaa. In January 2015, the Houthis captured the presidential palace and other strategic buildings, forcing President Hadi and his ministers to resign and to dissolve parliament. In March 2015, a coalition led by Saudi Arabia began airstrikes on Houthi targets, which are still active as of this publication (September 2015). The report below represents the energy situation in Yemen before the Houthi capture of Sanaa and the rest of Yemen. Lack of data and the halting of nearly all energy sector activity in Yemen do not allow for a full update of the Country Analysis Brief.

  1. The other prey-capture silk: Fibres made by glow-worms (Diptera: Keroplatidae) comprise cross-β-sheet crystallites in an abundant amorphous fraction.

    PubMed

    Walker, Andrew A; Weisman, Sarah; Trueman, Holly E; Merritt, David J; Sutherland, Tara D

    2015-09-01

    Glow-worms (larvae of dipteran genus Arachnocampa) are restricted to moist habitats where they capture flying prey using snares composed of highly extensible silk fibres and sticky mucus droplets. Little is known about the composition or structure of glow-worm snares, or the extent of possible convergence between glow-worm and arachnid capture silks. We characterised Arachnocampa richardsae silk and mucus using X-ray scattering, Fourier transform infrared spectroscopy and amino acid analysis. Silk but not mucus contained crystallites of the cross-β-sheet type, which occur in unrelated insect silks but have not been reported previously in fibres used for prey capture. Mucus proteins were rich in Gly (28.5%) and existed in predominantly a random coil structure, typical of many adhesive proteins. In contrast, the silk fibres were unusually rich in charged and polar residues, particularly Lys (18.1%), which we propose is related to their use in a highly hydrated state. Comparison of X-ray scattering, infrared spectroscopy and amino acid analysis data suggests that silk fibres contain a high fraction of disordered protein. We suggest that in the native hydrated state, silk fibres are capable of extension via deformation of both disordered regions and cross-β-sheet crystallites, and that high extensibility is an adaptation promoting successful prey capture. This study illustrates the rich variety of protein motifs that are available for recruitment into biopolymers, and how convergently evolved materials can nevertheless be based on fundamentally different protein structures. Crown Copyright © 2015. Published by Elsevier Inc. All rights reserved.

  2. Inter-pulse high-resolution gamma-ray spectra using a 14 MeV pulsed neutron generator

    USGS Publications Warehouse

    Evans, L.G.; Trombka, J.I.; Jensen, D.H.; Stephenson, W.A.; Hoover, R.A.; Mikesell, J.L.; Tanner, A.B.; Senftle, F.E.

    1984-01-01

    A neutron generator pulsed at 100 s^-1 was suspended in an artificial borehole containing a 7.7 metric ton mixture of sand, aragonite, magnetite, sulfur, and salt. Two Ge(HP) gamma-ray detectors were used: one in a borehole sonde, and one at the outside wall of the sample tank opposite the neutron generator target. Gamma-ray spectra were collected by the outside detector during each of 10 discrete time windows during the 10 ms period following the onset of gamma-ray build-up after each neutron burst. The sample was measured first when dry and then when saturated with water. In the dry sample, gamma rays due to inelastic neutron scattering, neutron capture, and decay were counted during the first (150 μs) time window. Subsequently only capture and decay gamma rays were observed. In the wet sample, only neutron capture and decay gamma rays were observed. Neutron capture gamma rays dominated the spectrum during the period from 150 to 400 μs after the neutron burst in both samples, but decreased with time much more rapidly in the wet sample. A signal-to-noise-ratio (S/N) analysis indicates that optimum conditions for neutron capture analysis occurred in the 350-800 μs window. A poor S/N in the first 100-150 μs is due to a large background continuum during the first time interval. Time gating can be used to enhance gamma-ray spectra, depending on the nuclides in the target material and the reactions needed to produce them, and should improve the sensitivity of in situ well logging. © 1984.
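
    The window choice described above rests on a signal-to-noise comparison across time gates; a common figure of merit is the net capture counts divided by the square root of the background counts. A sketch with hypothetical count data (not the measured spectra):

      # S/N for candidate time gates, taking S/N = signal / sqrt(background).
      import math

      # (window label, net capture-gamma counts, background-continuum counts)
      windows = [("0-150 us",    5000, 250000),   # inelastic + capture, large continuum
                 ("150-400 us",  4000,  40000),
                 ("350-800 us",  3500,  12000),
                 ("800-2000 us",  900,   6000)]

      for label, signal, background in windows:
          print(f"{label:>12}: S/N = {signal / math.sqrt(background):6.1f}")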

  3. Investigating changes in brain network properties in HIV-associated neurocognitive disease (HAND) using mutual connectivity analysis (MCA)

    NASA Astrophysics Data System (ADS)

    Abidin, Anas Zainul; D'Souza, Adora M.; Nagarajan, Mahesh B.; Wismüller, Axel

    2016-03-01

    About 50% of subjects infected with HIV present deficits in cognitive domains, which are known collectively as HIV associated neurocognitive disorder (HAND). The underlying synaptodendritic damage can be captured using resting state functional MRI, as has been demonstrated by a few earlier studies. Such damage may induce topological changes of brain connectivity networks. We test this hypothesis by capturing the functional interdependence of 90 brain network nodes using a Mutual Connectivity Analysis (MCA) framework with non-linear time series modeling based on Generalized Radial Basis function (GRBF) neural networks. The network nodes are selected based on the regions defined in the Automated Anatomic Labeling (AAL) atlas. Each node is represented by the average time series of the voxels of that region. The resulting networks are then characterized using graph-theoretic measures that quantify various network topology properties at a global as well as at a local level. We tested for differences in these properties in network graphs obtained for 10 subjects (6 male and 4 female, 5 HIV+ and 5 HIV-). Global network properties captured some differences between these subject cohorts, though significant differences were seen only with the clustering coefficient measure. Local network properties, such as local efficiency and the degree of connections, captured significant differences in regions of the frontal lobe, precentral and cingulate cortex amongst a few others. These results suggest that our method can be used to effectively capture differences occurring in brain network connectivity properties revealed by resting-state functional MRI in neurological disease states, such as HAND.
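
    Once the pairwise MCA affinities are thresholded into a network, the quantities mentioned above (clustering coefficient, local efficiency, degree) are standard graph measures. The sketch below computes them on a random affinity matrix with networkx; it stands in for, but does not reproduce, the MCA/GRBF pipeline.

      # Threshold a (here random) pairwise-affinity matrix into a graph and
      # compute clustering coefficient, local efficiency and mean degree.
      import numpy as np
      import networkx as nx

      rng = np.random.default_rng(0)
      n_nodes = 90                                  # e.g. AAL atlas regions
      affinity = rng.random((n_nodes, n_nodes))
      affinity = (affinity + affinity.T) / 2        # symmetrize
      np.fill_diagonal(affinity, 0.0)

      G = nx.from_numpy_array((affinity > 0.8).astype(int))   # simple threshold

      print("mean clustering coefficient:", round(nx.average_clustering(G), 3))
      print("local efficiency:", round(nx.local_efficiency(G), 3))
      print("mean degree:", round(np.mean([d for _, d in G.degree()]), 1))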

  4. Tsetse fly feeding preference as determined by vehicular trapping in Tanzania.

    PubMed

    Gates, D B; Williamson, D L

    1984-06-01

    In eastern Tanzania an electric grid trap carried in the back of a moving pick-up truck was used to capture engorged Glossina morsitans morsitans and G. pallidipes for an analysis of their food sources. Although 12 000 head of domestic cattle represented c. 75% of the animal biomass in the survey area, they provided only 5.6% of the total blood meals, while 74.8% were from warthogs and bushpigs. The percentage of females among the captured flies was 12 and 47 for G. m. morsitans and G. pallidipes, respectively. The incidence of engorged flies captured by this method ranged from 15 to 20% in males and 26 to 42% in females.

  5. Effective capture and release of circulating tumor cells using core-shell Fe3O4@MnO2 nanoparticles

    NASA Astrophysics Data System (ADS)

    Xiao, Liang; He, Zhao-Bo; Cai, Bo; Rao, Lang; Cheng, Long; Liu, Wei; Guo, Shi-Shang; Zhao, Xing-Zhong

    2017-01-01

    Circulating tumor cells (CTCs) have been believed to hold significant insights for cancer diagnosis and therapy. Here, we developed a simple and effective method to capture and release viable CTCs using core-shell Fe3O4@MnO2 nanoparticles. Fe3O4@MnO2 nanoparticles bioconjugated with anti-EpCAM antibody have characteristics of specific recognition, magnetic-driven cell isolation and oxalic acid-assisted cell release. The capture and release efficiency of target cancer cells were ∼83% and ∼55%, respectively. And ∼70% of released cells kept good viability, which could facilitate the subsequent cellular analysis.

  6. Capture of unstable protein complex on the streptavidin-coated single-walled carbon nanotubes

    NASA Astrophysics Data System (ADS)

    Liu, Zunfeng; Voskamp, Patrick; Zhang, Yue; Chu, Fuqiang; Abrahams, Jan Pieter

    2013-04-01

    Purification of unstable protein complexes is a bottleneck for investigating their 3D structure and for protein-protein interaction studies. In this paper, we demonstrate that streptavidin-coated single-walled carbon nanotubes (Strep•SWNT) can be used to capture biotinylated DNA-EcoRI complexes on a 2D surface and in solution, using atomic force microscopy and electrophoresis analysis, respectively. The restriction enzyme EcoRI forms unstable complexes with DNA in the absence of Mg2+. Capturing the EcoRI-DNA complexes on the Strep•SWNT succeeded in the absence of Mg2+, demonstrating that the Strep•SWNT can be used for purifying unstable protein complexes.

  7. Rapid and direct detection of attomole adenosine triphosphate (ATP) by MALDI-MS using rutile titania chips.

    PubMed

    Manikandan, Muthu; Hasan, Nazim; Wu, Hui-Fen

    2012-11-07

    We report rutile titania-based capture of ATP and its application as a MALDI-MS target plate. This chip, when immersed in solutions containing different concentrations of ATP, can capture ATP and lead to its successful detection by MALDI-MS. Surface optimization showed increased capture efficacy for titania surfaces prepared at 900 °C (rutile). We demonstrate the use of this chip as a target plate for direct analysis of the attached ATP by MALDI-MS, down to attomolar concentrations. This chip holds promise for the detection of ATP in environmental samples, where ATP may eventually serve as a pollution indicator in particular environments.

  8. Bat predation on nocturnally migrating birds

    PubMed Central

    Ibáñez, Carlos; Juste, Javier; García-Mudarra, Juan L.; Agirre-Mendi, Pablo T.

    2001-01-01

    Bat predation on birds is a very rare phenomenon in nature. Most documented reports of bird-eating bats refer to tropical bats that occasionally capture resting birds. Millions of small birds concentrate and cross over the world's temperate regions during migration, mainly at night, but no nocturnal predators are known to benefit from this enormous food resource. An analysis of 14,000 fecal pellets of the greater noctule bat (Nyctalus lasiopterus) reveals that this species captures and eats large numbers of migrating passerines, making it the only bat species so far known that regularly preys on birds. The echolocation characteristics and wing morphology of this species strongly suggest that it captures birds in flight. PMID:11493689

  9. Research on particulate filter simulation and regeneration control strategy

    NASA Astrophysics Data System (ADS)

    Dawei, Qu; Jun, Li; Yu, Liu

    2017-03-01

    This paper reports a mathematical model of DPF (Diesel Particulate Filter) particulate collection for a new regeneration control strategy. The strategy consists of three main parts: regeneration-timing capture, a temperature-rise strategy and a regeneration control strategy. For regeneration-timing capture, a multi-level method is put forward based on the combined effect of PM (Particulate Matter) loading, pressure drop and fuel consumption. The temperature-rise strategy defines a global target temperature for all operating conditions. The regeneration control process takes into account particle loading density, temperature and oxygen. Based on analysis of initial overheating, runaway temperature and local hot spots, the final control strategy is established.
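
    A minimal sketch of a multi-level regeneration trigger of the kind described above (signal names, thresholds and the voting logic are illustrative assumptions, not the paper's strategy):

    ```python
    # Sketch: multi-level DPF regeneration trigger combining PM loading,
    # pressure drop and fuel-consumption penalty. All thresholds are
    # illustrative placeholders, not values from the paper.
    def regeneration_level(pm_loading_g_per_l: float,
                           delta_p_kpa: float,
                           fuel_penalty_pct: float) -> int:
        """Return 0 (no action), 1 (schedule regeneration) or 2 (regenerate now)."""
        votes = 0
        if pm_loading_g_per_l > 6.0:    # soot-loading limit (placeholder)
            votes += 1
        if delta_p_kpa > 20.0:          # pressure-drop limit (placeholder)
            votes += 1
        if fuel_penalty_pct > 3.0:      # fuel-consumption penalty limit (placeholder)
            votes += 1
        if votes >= 2:
            return 2
        return 1 if votes == 1 else 0

    print(regeneration_level(6.5, 22.0, 2.0))   # -> 2 (regenerate now)
    ```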

  10. Pi-CO₂ aqueous post-combustion CO₂ capture: Proof of concept through thermodynamic, hydrodynamic, and gas-lift pump modeling

    DOE PAGES

    Blount, G.; Gorensek, M.; Hamm, L.; ...

    2014-12-31

    Partnering in Innovation, Inc. (Pi-Innovation) introduces an aqueous post-combustion carbon dioxide (CO₂) capture system (Pi-CO₂) that offers high market value by directly addressing the primary constraints limiting beneficial re-use markets (lowering parasitic energy costs, reducing delivered cost of capture, eliminating the need for special solvents, etc.). A highly experienced team has completed initial design, modeling, manufacturing verification, and financial analysis for commercial market entry. Coupled thermodynamic and thermal-hydraulic mass transfer modeling results fully support proof of concept. Pi-CO₂ has the potential to lower total cost and risk to levels sufficient to stimulate global demand for CO₂ from local industrial sources.

  11. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ullmann, J. L.; Kawano, T.; Baramsai, B.

    The cross section for neutron capture in the continuum region has been difficult to calculate accurately. Previous results for 238U show that including an M1 scissors-mode contribution to the photon strength function resulted in very good agreement between calculation and measurement. Our paper extends that analysis to 234,236U by using γ-ray spectra measured with the Detector for Advanced Neutron Capture Experiments (DANCE) at the Los Alamos Neutron Science Center to constrain the photon strength function used to calculate the capture cross section. Calculations using a strong scissors-mode contribution reproduced the measured γ-ray spectra and were in excellent agreement with the reported cross sections for all three isotopes.

  12. Why Compare? A Response to Stephen Lawton.

    ERIC Educational Resources Information Center

    Gordon, Liz; Pearce, Diane

    1993-01-01

    Aims to stimulate interest in comparative education policy analysis by critiquing a paper by Stephen Lawton. Such comparative analysis is important in understanding neoliberal education reforms, but more work is needed to provide adequate categories for analysis. Lawton's categories, reformulated here as efficiency, managing and provider capture,…

  13. First Measurement of θ13 From Delayed Neutron Capture on Hydrogen in the Double Chooz Experiment

    DOE PAGES

    Abe, Y.; Aberle, C.; dos Anjos, J. C.; ...

    2013-04-27

    The Double Chooz experiment has determined the value of the neutrino oscillation parameter θ13 from an analysis of inverse beta decay interactions with neutron capture on hydrogen. The analysis presented here uses a three times larger fiducial volume than the standard Double Chooz assessment, which is restricted to a region doped with gadolinium (Gd), yielding an exposure of 113.1 GW-ton-years. The data sample used in this analysis is distinct from that of the Gd analysis, and the systematic uncertainties are also largely independent, with some exceptions, such as the reactor neutrino flux prediction. A combined rate- and energy-dependent fit finds sin²2θ13 = 0.097 ± 0.034 (stat.) ± 0.034 (syst.), excluding the no-oscillation hypothesis at 2.0σ. This result is consistent with previous measurements of sin²2θ13.

  14. A smart core-sheath nanofiber that captures and releases red blood cells from the blood

    NASA Astrophysics Data System (ADS)

    Shi, Q.; Hou, J.; Zhao, C.; Xin, Z.; Jin, J.; Li, C.; Wong, S.-C.; Yin, J.

    2016-01-01

    A smart core-sheath nanofiber for non-adherent cell capture and release is demonstrated. The nanofibers are fabricated by single-spinneret electrospinning of poly(N-isopropylacrylamide) (PNIPAAm), polycaprolactone (PCL) and nattokinase (NK) solution blends. The self-assembly of the PNIPAAm and PCL blends during electrospinning generates core-sheath PCL/PNIPAAm nanofibers with PNIPAAm as the sheath. The PNIPAAm-based core-sheath nanofibers switch between hydrophobicity and hydrophilicity with temperature change and show enhanced stability in blood. When the nanofibers come into contact with blood, the NK is released from the nanofibers to resist platelet adhesion on the nanofiber surface, facilitating the direct capture and isolation of red blood cells (RBCs) from the blood above the phase-transition temperature of PNIPAAm. The captured RBCs are then readily released from the nanofibers by a temperature stimulus in an undamaged manner. A release efficiency of up to 100% is obtained while maintaining cellular integrity and function. This work presents promising nanofibers for effectively capturing non-adherent cells and releasing them for subsequent molecular analysis and diagnosis of single cells.

  15. Electronic capture of patient-reported and clinician-reported outcome measures in an elective orthopaedic setting: a retrospective cohort analysis.

    PubMed

    Malhotra, Karan; Buraimoh, Olatunbosun; Thornton, James; Cullen, Nicholas; Singh, Dishan; Goldberg, Andrew J

    2016-06-20

    To determine whether an entirely electronic system can be used to capture both patient-reported outcomes (electronic Patient-Reported Outcome Measures, ePROMs) and clinician-validated diagnostic and complexity data in an elective surgical orthopaedic outpatient setting. To examine patients' experience of this system and factors impacting their experience. Retrospective analysis of prospectively collected data. Single centre series. Outpatient clinics at an elective foot and ankle unit in the UK. All new adult patients attending elective orthopaedic outpatient clinics over a 32-month period. All patients were invited to complete ePROMs prior to attending their outpatient appointment. At their appointment, those patients who had not completed ePROMs were offered the opportunity to complete them on a tablet device with technical support. Matched diagnostic and complexity data were captured by the treating consultant during the appointment. Capture rates of patient-reported and clinician-reported data. All information technology (IT) failures, language and disability barriers were captured. Patients were asked to rate their experience of using ePROMs. The scoring systems used included EQ-5D-5L, the Manchester-Oxford Foot Questionnaire (MOxFQ) and the Visual Analogue Scale (VAS) pain score. Out of 2534 new patients, 2176 (85.9%) completed ePROMs, of whom 1090 (50.09%) completed ePROMs at home/work prior to their appointment. 31.5% used a mobile (smartphone/tablet) device. Clinician-reported data were captured on 2491 patients (98.3%). The mean patient experience score for using Patient-Reported Outcome Measures (PROMs) was 8.55±1.85 out of 10, and 666 patients (30.61%) left comments. Of patients leaving comments, 214 (32.13%) felt ePROMs did not adequately capture their symptoms, and these patients had significantly lower patient experience scores (p<0.001). This study demonstrates the successful implementation of technology into a service improvement programme. Excellent capture rates of ePROMs and clinician-validated diagnostic data can be achieved within a National Health Service setting. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/

  16. Development of Glycoprotein Capture-Based Label-Free Method for the High-throughput Screening of Differential Glycoproteins in Hepatocellular Carcinoma*

    PubMed Central

    Chen, Rui; Tan, Yexiong; Wang, Min; Wang, Fangjun; Yao, Zhenzhen; Dong, Liwei; Ye, Mingliang; Wang, Hongyang; Zou, Hanfa

    2011-01-01

    A robust, reproducible, and high-throughput method was developed for the relative quantitative analysis of glycoprotein abundances in human serum. Instead of quantifying glycoproteins by glycopeptides, as in conventional quantitative glycoproteomics, glycoproteins were quantified by nonglycosylated peptides derived from the glycoprotein digest. The workflow consists of the capture of glycoproteins from serum samples and the release of nonglycopeptides by trypsin digestion of the captured glycoproteins, followed by two-dimensional liquid chromatography-tandem MS analysis of the released peptides. Protein quantification was achieved by comparing the spectral counts of identified nonglycosylated peptides of glycoproteins between different samples. This method was demonstrated to have almost the same specificity and sensitivity in glycoprotein quantification as capture at the glycopeptide level. The differential abundance of proteins present at levels as low as nanograms per milliliter was quantified with high confidence. The established method was applied to the analysis of human serum samples from healthy people and patients with hepatocellular carcinoma (HCC) to screen for differential glycoproteins in HCC. Thirty-eight glycoproteins were found with substantial concentration changes between normal and HCC serum samples, including α-fetoprotein, the only clinically used marker for HCC diagnosis. The abundance changes of three glycoproteins associated with the development of HCC, i.e. galectin-3 binding protein, insulin-like growth factor binding protein 3, and thrombospondin 1, were further confirmed by enzyme-linked immunosorbent assay. In conclusion, the developed method is an effective approach to quantitatively analyze glycoproteins in human serum and could be further applied in biomarker discovery for HCC and other cancers. PMID:21474793
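
    As a simplified illustration of spectral-count-based relative quantification (a sketch only; the counts are invented and the study's normalization and statistics are omitted), a per-protein ratio between samples can be computed from the spectral counts of its nonglycosylated peptides:

    ```python
    # Sketch: label-free relative quantification by spectral counting.
    # A small pseudocount avoids division by zero; the counts below are
    # hypothetical, not data from the study.
    def spectral_count_ratio(counts_case: int, counts_control: int,
                             pseudocount: float = 0.5) -> float:
        """Ratio of spectral counts (case / control) with a pseudocount."""
        return (counts_case + pseudocount) / (counts_control + pseudocount)

    # Hypothetical nonglycosylated-peptide spectral counts per glycoprotein:
    print(spectral_count_ratio(42, 12))   # elevated in the case sample
    print(spectral_count_ratio(3, 29))    # reduced in the case sample
    ```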

  17. An analysis of characteristics of post-authorisation studies registered on the ENCePP EU PAS Register

    PubMed Central

    Carroll, Robert; Ramagopalan, Sreeram V.; Cid-Ruzafa, Javier; Lambrelli, Dimitra; McDonald, Laura

    2017-01-01

    Background: The objective of this study was to investigate the study design characteristics of Post-Authorisation Studies (PAS) requested by the European Medicines Agency that were recorded on the European Union (EU) PAS Register held by the European Network of Centres for Pharmacoepidemiology and Pharmacovigilance (ENCePP). Methods: We undertook a cross-sectional descriptive analysis of all studies registered on the EU PAS Register as of 18th October 2016. Results: We identified a total of 314 studies on the EU PAS Register, including 81 (26%) finalised, 160 (51%) ongoing and 73 (23%) planned. Of those studies identified, 205 (65%) included risk assessment in their scope, 133 (42%) included drug utilisation and 94 (30%) included effectiveness evaluation. Just over half of the studies (175; 56%) used primary data capture, 135 (43%) used secondary data and 4 (1%) used a hybrid design combining both approaches. Risk assessment and effectiveness studies were more likely to use primary data capture (60% and 85%, respectively, compared with 39% and 14% for secondary data). The converse was true for drug utilisation studies, where 59% used secondary data vs. 39% primary. For type 2 diabetes mellitus, database studies were more commonly used (80% vs 3% chart review, 3% hybrid and 13% primary data capture study designs), whereas for studies in oncology, primary data capture was more likely to be used (85% vs 4% chart review, and 11% database study designs). Conclusions: Results of this analysis show that PAS design varies according to study objectives and therapeutic area. PMID:29188016

  18. Graft Immunocomplex Capture Fluorescence Analysis to Detect Donor-Specific Antibodies and HLA Antigen Complexes in the Allograft.

    PubMed

    Nakamura, Tsukasa; Ushigome, Hidetaka; Watabe, Kiyoko; Imanishi, Yui; Masuda, Koji; Matsuyama, Takehisa; Harada, Shumpei; Koshino, Katsuhiro; Iida, Taku; Nobori, Shuji; Yoshimura, Norio

    2017-04-01

    Immunocomplex capture fluorescence analysis (ICFA) is an attractive method for detecting donor-specific anti-HLA antibodies (DSA) and HLA antigen complexes. Currently, antibody-mediated rejection (AMR) due to DSA is usually diagnosed by C4d deposition and serological DSA detection; however, these findings frequently disagree. Our graft ICFA technique may therefore help establish the diagnosis of AMR. Graft samples were obtained by percutaneous needle biopsy, and the specimen was lysed in PBS with lysis buffer. HLA antigens were then captured by anti-HLA beads, and DSA-HLA complexes, in which the DSA had already reacted with the allograft in vivo, were detected with PE-conjugated anti-human IgG antibodies and analyzed on a Luminex system. A ratio (sample MFI/blank beads MFI) was calculated, with ≥1.0 considered positive. Using graft ICFA, DSA-HLA complexes in the graft were detected at ratios ranging from a weakly positive 1.03 up to 79.27 in a patient with chronic active AMR. Positive graft ICFA also predicted the early phase of AMR (MFI ratio: 1.38) even in patients with no serum DSA. Finally, appropriate therapies for AMR eliminated DSA deposition from allografts (MFI ratios of 0.3 to 0.7). This novel application may detect early-phase or pathologically incomplete cases of AMR, leading to a correct diagnosis and the initiation of appropriate therapies. Moreover, graft ICFA might address a variety of long-standing questions regarding DSA. AMR: Antibody-mediated rejection; DSA: Donor-specific antibodies; ICFA: Immunocomplex capture fluorescence analysis.

  19. Magnetic Capture of a Molecular Biomarker from Synovial Fluid in a Rat Model of Knee Osteoarthritis

    PubMed Central

    Yarmola, Elena G.; Shah, Yash; Arnold, David P.; Dobson, Jon; Allen, Kyle D.

    2015-01-01

    Biomarker development for osteoarthritis (OA) often begins in rodent models, but can be limited by an inability to aspirate synovial fluid from a rodent stifle (similar to the human knee). To address this limitation, we have developed a magnetic nanoparticle-based technology to collect biomarkers from a rodent stifle, termed magnetic capture. Using a common OA biomarker - the c-terminus telopeptide of type II collagen (CTXII) - magnetic capture was optimized in vitro using bovine synovial fluid and then tested in a rat model of knee OA. Anti-CTXII antibodies were conjugated to the surface of superparamagnetic iron oxide-containing polymeric particles. Using these anti-CTXII particles, magnetic capture was able to estimate the level of CTXII in 25 µL aliquots of bovine synovial fluid; and under controlled conditions, this estimate was unaffected by synovial fluid viscosity. Following in vitro testing, anti-CTXII particles were tested in a rat monoiodoacetate model of knee OA. CTXII could be magnetically captured from a rodent stifle without the need to aspirate fluid and showed 10 fold changes in CTXII levels from OA-affected joints relative to contralateral control joints. Combined, these data demonstrate the ability and sensitivity of magnetic capture for post-mortem analysis of OA biomarkers in the rat. PMID:26136062

  20. Magnetic Capture of a Molecular Biomarker from Synovial Fluid in a Rat Model of Knee Osteoarthritis.

    PubMed

    Yarmola, Elena G; Shah, Yash; Arnold, David P; Dobson, Jon; Allen, Kyle D

    2016-04-01

    Biomarker development for osteoarthritis (OA) often begins in rodent models, but can be limited by an inability to aspirate synovial fluid from a rodent stifle (similar to the human knee). To address this limitation, we have developed a magnetic nanoparticle-based technology to collect biomarkers from a rodent stifle, termed magnetic capture. Using a common OA biomarker--the c-terminus telopeptide of type II collagen (CTXII)--magnetic capture was optimized in vitro using bovine synovial fluid and then tested in a rat model of knee OA. Anti-CTXII antibodies were conjugated to the surface of superparamagnetic iron oxide-containing polymeric particles. Using these anti-CTXII particles, magnetic capture was able to estimate the level of CTXII in 25 μL aliquots of bovine synovial fluid; and under controlled conditions, this estimate was unaffected by synovial fluid viscosity. Following in vitro testing, anti-CTXII particles were tested in a rat monoiodoacetate model of knee OA. CTXII could be magnetically captured from a rodent stifle without the need to aspirate fluid and showed tenfold changes in CTXII levels from OA-affected joints relative to contralateral control joints. Combined, these data demonstrate the ability and sensitivity of magnetic capture for post-mortem analysis of OA biomarkers in the rat.

  1. Highly Surface-Active Ca(OH)2 Monolayer as a CO2 Capture Material.

    PubMed

    Özçelik, V Ongun; Gong, Kai; White, Claire E

    2018-03-14

    Greenhouse gas emissions originating from fossil fuel combustion contribute significantly to global warming, and therefore the design of novel materials that efficiently capture CO2 can play a crucial role in solving this challenge. Here, we show that reducing the dimensionality of bulk crystalline portlandite results in a stable monolayer material, named portlandene, that is highly effective at capturing CO2. On the basis of theoretical analysis comprising ab initio quantum mechanical calculations and force-field molecular dynamics simulations, we show that this single-layer phase is robust and maintains its stability even at high temperatures. The chemical activity of portlandene further increases upon defect engineering of its surface using vacancy sites. Defect-containing portlandene is capable of separating CO and CO2 from a syngas (CO/CO2/H2) stream, yet is inert to water vapor. This selective behavior and the associated mechanisms have been elucidated by examining the electronic structure, local charge distribution, and bonding orbitals of portlandene. Additionally, unlike conventional CO2 capture technologies, the regeneration process of portlandene does not require high-temperature heat treatment because it can release the captured CO2 upon application of a mild external electric field, making portlandene an ideal CO2 capture material for both pre- and post-combustion processes.

  2. A Comparative Study of the CO2 Absorption in Some Solvent-Free Alkanolamines and in Aqueous Monoethanolamine (MEA).

    PubMed

    Barzagli, Francesco; Mani, Fabrizio; Peruzzini, Maurizio

    2016-07-05

    The neat secondary amines 2-(methylamino)ethanol, 2-(ethylamino)ethanol, 2-(isopropylamino)ethanol, 2-(benzylamino)ethanol and 2-(butylamino)ethanol react with CO2 at 50-60 °C and room pressure, yielding liquid carbonated species without dilution in any additional solvent. These single-component absorbents have a theoretical CO2 capture capacity of 0.50 mol CO2/mol amine due to the formation of the corresponding amine carbamates and protonated amines, which were identified by 13C NMR analysis. The absorbents were used for CO2 capture (15% and 40% v/v in air) in two series of procedures: (1) batch experiments aimed at investigating the efficiency and rate of CO2 capture; (2) continuous absorption-desorption cycles carried out in packed columns at room pressure, with absorption temperatures of 50-60 °C and desorption temperatures of 100-120 °C. Several amines and experimental setups gave CO2 capture efficiencies greater than 90%. For comparison, 30 wt% aqueous MEA was used for CO2 capture under the same operating conditions described for the solvent-free amines. The potential advantages of solvent-free alkanolamines over aqueous MEA in the CO2 capture process are discussed.
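
    The theoretical 0.50 mol CO2/mol amine capacity is consistent with the usual carbamate stoichiometry, in which two amine molecules are consumed per CO2 molecule (one forming the carbamate, one the protonated amine); written generically for a secondary amine RR'NH:

    ```latex
    % Carbamate route for a secondary amine: two amine molecules per CO2,
    % hence a theoretical loading of 0.5 mol CO2 per mol amine.
    \[
      2\,\mathrm{RR'NH} + \mathrm{CO_2} \;\longrightarrow\; \mathrm{RR'NCOO^{-}} + \mathrm{RR'NH_2^{+}}
    \]
    ```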

  3. Multimodal RNA-seq using single-strand, double-strand, and CircLigase-based capture yields a refined and extended description of the C. elegans transcriptome.

    PubMed

    Lamm, Ayelet T; Stadler, Michael R; Zhang, Huibin; Gent, Jonathan I; Fire, Andrew Z

    2011-02-01

    We have used a combination of three high-throughput RNA capture and sequencing methods to refine and augment the transcriptome map of a well-studied genetic model, Caenorhabditis elegans. The three methods include a standard (non-directional) library preparation protocol relying on cDNA priming and foldback that has been used in several previous studies for transcriptome characterization in this species, and two directional protocols, one involving direct capture of single-stranded RNA fragments and one involving circular-template PCR (CircLigase). We find that each RNA-seq approach shows specific limitations and biases, with the application of multiple methods providing a more complete map than was obtained from any single method. Of particular note in the analysis were substantial advantages of CircLigase-based and ssRNA-based capture for defining sequences and structures of the precise 5' ends (which were lost using the double-strand cDNA capture method). Of the three methods, ssRNA capture was most effective in defining sequences to the poly(A) junction. Using data sets from a spectrum of C. elegans strains and stages and the UCSC Genome Browser, we provide a series of tools, which facilitate rapid visualization and assignment of gene structures.

  4. Use of models to map potential capture of surface water

    USGS Publications Warehouse

    Leake, Stanley A.

    2006-01-01

    The effects of ground-water withdrawals on surface-water resources and riparian vegetation have become important considerations in water-availability studies. Ground water withdrawn by a well initially comes from storage around the well, but with time can eventually increase inflow to the aquifer and (or) decrease natural outflow from the aquifer. This increased inflow and decreased outflow is referred to as “capture.” For a given time, capture can be expressed as a fraction of withdrawal rate that is accounted for as increased rates of inflow and decreased rates of outflow. The time frames over which capture might occur at different locations commonly are not well understood by resource managers. A ground-water model, however, can be used to map potential capture for areas and times of interest. The maps can help managers visualize the possible timing of capture over large regions. The first step in the procedure to map potential capture is to run a ground-water model in steady-state mode without withdrawals to establish baseline total flow rates at all sources and sinks. The next step is to select a time frame and appropriate withdrawal rate for computing capture. For regional aquifers, time frames of decades to centuries may be appropriate. The model is then run repeatedly in transient mode, each run with one well in a different model cell in an area of interest. Differences in inflow and outflow rates from the baseline conditions for each model run are computed and saved. The differences in individual components are summed and divided by the withdrawal rate to obtain a single capture fraction for each cell. Values are contoured to depict capture fractions for the time of interest. Considerations in carrying out the analysis include use of realistic physical boundaries in the model, understanding the degree of linearity of the model, selection of an appropriate time frame and withdrawal rate, and minimizing error in the global mass balance of the model.
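
    A minimal sketch of the post-processing step described above, assuming the baseline and perturbed inflow/outflow totals for each transient run are already available from the model (the function and the numbers are hypothetical and not tied to any particular model code):

    ```python
    # Sketch: capture fraction for one tested model cell. The change in total
    # inflow plus the change in total outflow, divided by the withdrawal rate,
    # gives the fraction of the withdrawal accounted for by capture.
    def capture_fraction(baseline_in, baseline_out, run_in, run_out, q_withdrawal):
        d_inflow = run_in - baseline_in       # increased inflow to the aquifer
        d_outflow = baseline_out - run_out    # decreased natural outflow
        return (d_inflow + d_outflow) / q_withdrawal

    # Hypothetical flow totals for one cell at the time of interest:
    print(capture_fraction(baseline_in=100.0, baseline_out=100.0,
                           run_in=103.0, run_out=95.0, q_withdrawal=10.0))  # 0.8
    ```

    Repeating this for a well placed in each model cell and contouring the resulting fractions yields the capture map described above.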

  5. Toward Capturing Momentary Changes of Heart Rate Variability by a Dynamic Analysis Method

    PubMed Central

    Zhang, Haoshi; Zhu, Mingxing; Zheng, Yue; Li, Guanglin

    2015-01-01

    The analysis of heart rate variability (HRV) has typically been performed on long-term electrocardiography (ECG) recordings (12~24 hours) and short-term recordings (2~5 minutes), which may not capture momentary changes of HRV. In this study, we present a new method to analyze momentary HRV (mHRV). The ECG recordings were segmented into a series of overlapped HRV analysis windows with a window length of 5 minutes and different time increments. The performance of the proposed method in delineating the dynamics of momentary HRV measurement was evaluated with four commonly used time courses of HRV measures on both synthetic time series and real ECG recordings from human subjects and dogs. Our results showed that a smaller time increment could capture more dynamical information on transient changes. Since a very short increment, such as 10 s, produced indented time courses of the four measures, a 1-min time increment (4-min overlap) was suggested for mHRV analysis in this study. ECG recordings from human subjects and dogs were used to further assess the effectiveness of the proposed method. This pilot study demonstrated that the proposed mHRV analysis could provide a more accurate assessment of dynamical changes in cardiac activity than conventional HRV measures (without time overlapping). The proposed method may provide an efficient means of delineating the dynamics of momentary HRV, and further investigation is warranted. PMID:26172953
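
    A minimal sketch of the overlapped-window idea (5-minute windows advanced by a 1-minute increment), computing one common time-domain measure (SDNN) per window; the synthetic RR-interval series and the single measure are illustrative, not the study's four measures:

    ```python
    # Sketch: momentary HRV via overlapped analysis windows. A 5-min window is
    # slid over the RR-interval series with a 1-min increment (4-min overlap),
    # and SDNN is computed per window. The RR series below is synthetic.
    import numpy as np

    def momentary_sdnn(rr_ms, window_s=300, step_s=60):
        rr_ms = np.asarray(rr_ms, dtype=float)
        beat_times = np.cumsum(rr_ms) / 1000.0     # beat times in seconds
        results = []
        t = 0.0
        while t + window_s <= beat_times[-1]:
            mask = (beat_times >= t) & (beat_times < t + window_s)
            if mask.sum() > 1:
                results.append((t, np.std(rr_ms[mask], ddof=1)))
            t += step_s
        return results                              # (window start [s], SDNN [ms])

    rr = 800 + 50 * np.random.default_rng(1).standard_normal(2000)  # ~27 min of beats
    print(momentary_sdnn(rr)[:3])
    ```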

  6. TARGET: Rapid Capture of Process Knowledge

    NASA Technical Reports Server (NTRS)

    Ortiz, C. J.; Ly, H. V.; Saito, T.; Loftin, R. B.

    1993-01-01

    TARGET (Task Analysis/Rule Generation Tool) represents a new breed of tool that blends graphical process flow modeling capabilities with the function of a top-down reporting facility. Since NASA personnel frequently perform tasks that are primarily procedural in nature, TARGET models mission or task procedures and generates hierarchical reports as part of the process capture and analysis effort. Historically, capturing knowledge has proven to be one of the greatest barriers to the development of intelligent systems. Current practice generally requires lengthy interactions between the expert whose knowledge is to be captured and the knowledge engineer whose responsibility is to acquire and represent the expert's knowledge in a useful form. Although much research has been devoted to the development of methodologies and computer software to aid in the capture and representation of some types of knowledge, procedural knowledge has received relatively little attention. In essence, TARGET is one of the first tools of its kind, commercial or institutional, that is designed to support this type of knowledge capture undertaking. This paper will describe the design and development of TARGET for the acquisition and representation of procedural knowledge. The strategies employed by TARGET to support use by knowledge engineers, subject matter experts, programmers and managers will be discussed. This discussion includes the method by which the tool employs its graphical user interface to generate a task hierarchy report. Next, the approach to generate production rules for incorporation in and development of a CLIPS based expert system will be elaborated. TARGET also permits experts to visually describe procedural tasks as a common medium for knowledge refinement by the expert community and knowledge engineer making knowledge consensus possible. The paper briefly touches on the verification and validation issues facing the CLIPS rule generation aspects of TARGET. A description of efforts to support TARGET's interoperability issues on PCs, Macintoshes and UNIX workstations concludes the paper.

  7. Biochar and enhanced phosphate capture: Mapping mechanisms to functional properties.

    PubMed

    Shepherd, Jessica G; Joseph, Stephen; Sohi, Saran P; Heal, Kate V

    2017-07-01

    A multi-technique analysis was performed on a range of biochar materials derived from secondary organic resources and aimed at sustainable recovery and re-use of wastewater phosphorus (P). Our purpose was to identify mechanisms of P capture in biochar and thereby inform its future optimisation as a sustainable P fertiliser. The biochar feedstock comprised pellets of anaerobically digested sewage sludge (PAD) or pellets of the same blended in the ratio 9:1 with ochre sourced from minewater treatment (POCAD), components which have limited alternative economic value. In the present study the feedstocks were pyrolysed at two highest treatment temperatures, 450 and 550 °C. Each of the resulting biochars was repeatedly exposed to a 20 mg l-1 PO4-P solution to produce a parallel set of P-exposed biochars. Biochar exterior and/or interior surfaces were quantitatively characterised using laser-ablation (LA)-ICP-MS, X-ray diffraction, X-ray photo-electron spectroscopy (XPS) and scanning electron microscopy coupled with energy-dispersive X-ray spectroscopy. The results highlighted the general importance of Fe minerals in P capture. XPS analysis of POCAD550 indicated lower oxidation state Fe 2p3 bonding compared to POCAD450, and LA-ICP-MS indicated stronger covariation of Fe and S, even after P exposure. This suggests that low-solubility Fe/S compounds are formed during pyrolysis, are affected by process parameters, and have an impact on P capture. Other data suggested capture roles for aluminium, calcium and silicon. Overall, our analyses suggest that a range of mechanisms for P capture are concurrently active in biochar. We highlight the potential to manipulate these through the choice of form and composition of feedstock as well as pyrolysis processing, so that biochar may be increasingly tailored towards specific functionality. Copyright © 2017 Elsevier Ltd. All rights reserved.

  8. Foot and Ankle Kinematics and Dynamic Electromyography: Quantitative Analysis of Recovery From Peroneal Neuropathy in a Professional Football Player.

    PubMed

    Prasad, Nikhil K; Coleman Wood, Krista A; Spinner, Robert J; Kaufman, Kenton R

    The assessment of neuromuscular recovery after peripheral nerve surgery has typically relied on subjective physical examination. The purpose of this report was to assess the value of gait analysis in documenting recovery quantitatively. A professional football player underwent gait analysis before and after surgery for a peroneal intraneural ganglion cyst causing a left-sided foot drop. Surface electromyography (SEMG) recording and motion parameter acquisition from a computerized motion capture system consisting of 10 infrared cameras were performed simultaneously. A comparison between SEMG recordings before and after surgery showed a progression from disorganized activation in the left tibialis anterior and peroneus longus muscles to temporally appropriate activation for the phase of the gait cycle. Kinematic analysis of ankle motion planes showed resolution from a complete foot drop preoperatively to phase-appropriate dorsiflexion postoperatively. Gait analysis with dynamic SEMG and motion capture complements physical examination when assessing postoperative recovery in athletes.

  9. Tracking B-Cell Repertoires and Clonal Histories in Normal and Malignant Lymphocytes.

    PubMed

    Weston-Bell, Nicola J; Cowan, Graeme; Sahota, Surinder S

    2017-01-01

    Methods for tracking B-cell repertoires and clonal history in normal and malignant B-cells based on immunoglobulin variable region (IGV) gene analysis have developed rapidly with the advent of massive parallel next-generation sequencing (mpNGS) protocols. mpNGS permits a depth of analysis of IGV genes not hitherto feasible, and presents challenges of bioinformatics analysis, which can be readily met by current pipelines. This strategy offers a potential resolution of B-cell usage at a depth that may capture fully the natural state, in a given biological setting. Conventional methods based on RT-PCR amplification and Sanger sequencing are also available where mpNGS is not accessible. Each method offers distinct advantages. Conventional methods for IGV gene sequencing are readily adaptable to most laboratories and provide an ease of analysis to capture salient features of B-cell use. This chapter describes two methods in detail for analysis of IGV genes, mpNGS and conventional RT-PCR with Sanger sequencing.

  10. Analysis of Glycoproteins in Human Serum by Means of Glycospecific Magnetic Bead Separation and LC-MALDI-TOF/TOF Analysis with Automated Glycopeptide Detection

    PubMed Central

    Sparbier, Katrin; Asperger, Arndt; Resemann, Anja; Kessler, Irina; Koch, Sonja; Wenzel, Thomas; Stein, Günter; Vorwerg, Lars; Suckau, Detlev; Kostrzewa, Markus

    2007-01-01

    Comprehensive proteomic analyses require efficient and selective pre-fractionation to facilitate analysis of post-translationally modified peptides and proteins, and automated analysis workflows enabling the detection, identification, and structural characterization of the corresponding peptide modifications. Human serum contains a high number of glycoproteins, spanning several orders of magnitude in concentration. Consequently, isolation and subsequent identification of low-abundance glycoproteins from serum is a challenging task. Selective capture of glycopeptides and glycoproteins was attained by means of magnetic particles specifically functionalized with lectins or boronic acids that bind to various structural motifs. Human serum was incubated with differentially functionalized magnetic micro-particles (lectins or boronic acids), and isolated proteins were digested with trypsin. Subsequently, the resulting complex mixture of peptides and glycopeptides was subjected to LC-MALDI analysis and database searching. In parallel, a second magnetic bead capture was performed at the peptide level to separate intact glycopeptides and analyze both peptide sequence and glycan structure by LC-MALDI. Detection of glycopeptides was achieved by means of a software algorithm that allows extraction and characterization of potential glycopeptide candidates from large LC-MALDI-MS/MS data sets, based on N-glycopeptide-specific fragmentation patterns and characteristic fragment mass peaks. By means of fast and simple glycospecific capture applied in conjunction with extensive LC-MALDI-MS/MS analysis and novel data analysis tools, a high number of low-abundance proteins comprising known or predicted glycosylation sites were identified. According to the specific binding preferences of the different types of beads, complementary results were obtained from the experiments using magnetic ConA, LCA, WGA and boronic acid beads. PMID:17916798

  11. Automated Meteor Detection by All-Sky Digital Camera Systems

    NASA Astrophysics Data System (ADS)

    Suk, Tomáš; Šimberová, Stanislava

    2017-12-01

    We have developed a set of methods to detect meteor light traces captured by all-sky CCD cameras. Operating at small automatic observatories (stations), these cameras create a network spread over a large territory. Image data coming from these stations are merged in one central node. Since a vast amount of data is collected by the stations in a single night, robotic storage and analysis are essential to processing. The proposed methodology is adapted to data from a network of automatic stations equipped with digital fish-eye cameras and includes data capturing, preparation, pre-processing, analysis, and finally recognition of objects in time sequences. In our experiments we utilized real observed data from two stations.

  12. Sherlock Holmes and the proteome--a detective story.

    PubMed

    Righetti, Pier Giorgio; Boschetti, Egisto

    2007-02-01

    The performance of a hexapeptide ligand library in capturing the 'hidden proteome' is illustrated and evaluated. This library, insolubilized on an organic polymer and available under the trade name 'Equalizer Bead Technology', acts by capturing all components of a given proteome, by concentrating rare and very rare proteins, and simultaneously diluting the abundant ones. This results in a proteome of 'normalized' relative abundances, amenable to analysis by MS and any other analytical tool. Examples are given of analysis of human urine and serum, as well as cell and tissue lysates, such as Escherichia coli and Saccharomyces cerevisiae extracts. Another important application is impurity tracking and polishing of recombinant DNA products, especially biopharmaceuticals meant for human consumption.

  13. Successful treatment of capture myopathy in three wild greater sandhill cranes (Grus canadensis tabida).

    PubMed

    Businga, Nancy K; Langenberg, Julie; Carlson, LaVinda

    2007-12-01

    Two adult and 1 juvenile free-flying greater sandhill cranes (Grus canadensis tabida) were diagnosed with capture myopathy after alpha-chloralose baiting and physical capture during a banding and feeding ecologic study. Blood samples were collected for serum biochemical analysis at the time of capture for the 2 adults, and at 24 hours postcapture, at various intervals during treatment, and at the time of release for all 3 birds. Concentrations of creatine kinase, aspartate transaminase, and lactate dehydrogenase were high within 1 hour of capture and peaked approximately 3 days after capture. By days 10-17 after capture, creatine kinase and lactate dehydrogenase concentrations both decreased to within the reference range measured for cranes at capture, but aspartate transaminase concentrations remained 2-5 times higher than the measured reference range. Treatment consisted of corticosteroids, selenium/vitamin E, parenteral fluids, and gavage feedings. Physical therapy consisted of assisting the cranes to stand and walk 2-8 times a day, massaging leg muscles, and moving limbs manually through the range of motion. The adults were released when they were able to stand up independently and were pacing in the pen. The juvenile was released 12 hours after it was able to stand independently but was returned to the pen when it fell and could not rise. It was treated supportively for an additional 3 days and then successfully released. Both adult cranes were observed on their territories with their original mates after release and returned to their territories for the subsequent 8 years, raising chicks most years. After release, the juvenile was observed in a flock of cranes near its natal territory for the next 2 days.

  14. Fecal bacterial communities of wild-captured and stranded green turtles (Chelonia mydas) on the Great Barrier Reef.

    PubMed

    Ahasan, Md Shamim; Waltzek, Thomas B; Huerlimann, Roger; Ariel, Ellen

    2017-12-01

    Green turtles (Chelonia mydas) are endangered marine herbivores that break down food particles, primarily sea grasses, through microbial fermentation. However, the microbial community and its role in health and disease is still largely unexplored. In this study, we investigated and compared the fecal bacterial communities of eight wild-captured green turtles to four stranded turtles in the central Great Barrier Reef regions that include Bowen and Townsville. We used high-throughput sequencing analysis targeting the hypervariable V1-V3 regions of the bacterial 16S rRNA gene. At the phylum level, Firmicutes predominated among wild-captured green turtles, followed by Bacteroidetes and Proteobacteria. In contrast, Proteobacteria (Gammaproteobacteria) was the most significantly dominant phylum among all stranded turtles, followed by Bacteroidetes and Firmicutes. In addition, Fusobacteria was also significantly abundant in stranded turtles. No significant differences were found between the wild-captured turtles in Bowen and Townsville. At the family level, the core bacterial community consisted of 25 families that were identified in both the wild-captured and stranded green turtles, while two unique sets of 14 families each were only found in stranded or wild-captured turtles. The predominance of Bacteroides in all groups indicates the importance of these bacteria in turtle gut health. In terms of bacterial diversity and richness, wild-captured green turtles showed a higher bacterial diversity and richness compared with stranded turtles. The marked differences in the bacterial communities between wild-captured and stranded turtles suggest the possible dysbiosis in stranded turtles in addition to potential causal agents. © FEMS 2017. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  15. Measuring and Validating Neutron Capture Cross Sections Using a Lead Slowing-Down Spectrometer

    NASA Astrophysics Data System (ADS)

    Thompson, Nicholas

    Accurate nuclear data is essential for the modeling, design, and operation of nuclear systems. In this work, the Rensselaer Polytechnic Institute (RPI) Lead Slowing-Down Spectrometer (LSDS) at the Gaerttner Linear Accelerator Center (LINAC) was used to measure neutron capture cross sections and validate capture cross sections in cross section libraries. The RPI LINAC was used to create a fast burst of neutrons in the center of the LSDS, a large cube of high purity lead. A sample and YAP:Ce scintillator were placed in the LSDS, and as neutrons lost energy through scattering interactions with the lead, the scintillator detected capture gammas resulting from neutron capture events in the sample. Samples of silver, gold, cobalt, iron, indium, molybdenum, niobium, nickel, tin, tantalum, and zirconium were measured. Data was collected as a function of time after neutron pulse, or slowing-down time, which is correlated to average neutron energy. An analog and a digital data acquisition system collected data simultaneously, allowing for collection of pulse shape information as well as timing. Collection of digital data allowed for pulse shape analysis after the experiment. This data was then analyzed and compared to Monte Carlo simulations to validate the accuracy of neutron capture cross section libraries. These measurements represent the first time that neutron capture cross sections have been measured using an LSDS in the United States, and the first time tools such as coincidence measurements and pulse height weighting have been applied to measurements of neutron capture cross sections using an LSDS. Significant differences between measurement results and simulation results were found in multiple materials, and some errors in nuclear data libraries have already been identified due to these measurements.
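
    For context, the mapping from slowing-down time to average neutron energy in a lead slowing-down spectrometer is commonly approximated by a relation of the following form, with instrument-specific constants K and t0 obtained by calibration (the constants are not taken from this work):

    ```latex
    % Approximate LSDS slowing-down relation: mean neutron energy as a function
    % of time t after the pulse; K and t_0 are calibration constants.
    \[
      \bar{E}(t) \simeq \frac{K}{\left(t + t_0\right)^{2}}
    \]
    ```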

  16. A novel imaging technique for measuring kinematics of light-weight flexible structures.

    PubMed

    Zakaria, Mohamed Y; Eliethy, Ahmed S; Canfield, Robert A; Hajj, Muhammad R

    2016-07-01

    A new imaging algorithm is proposed to capture the kinematics of flexible, thin, light structures, including frequencies and motion amplitudes, for real-time analysis. The case studied is a thin flexible beam preset at different angles of attack in a wind tunnel. As the angle of attack is increased beyond a critical value, the beam undergoes a static deflection that is followed by limit-cycle oscillations. Imaging analysis of the beam vibrations shows that the motion consists of a superposition of the bending and torsion modes. The proposed algorithm was able to capture the oscillation amplitudes as well as the frequencies of both bending and torsion modes. The analysis results are validated through comparison with measurements from a piezoelectric sensor attached to the beam at its root.

  17. Comparisons of non-Gaussian statistical models in DNA methylation analysis.

    PubMed

    Ma, Zhanyu; Teschendorff, Andrew E; Yu, Hong; Taghia, Jalil; Guo, Jun

    2014-06-16

    As a key regulatory mechanism of gene expression, DNA methylation patterns are widely altered in many complex genetic diseases, including cancer. DNA methylation is naturally quantified by bounded support data; therefore, it is non-Gaussian distributed. In order to capture such properties, we introduce some non-Gaussian statistical models to perform dimension reduction on DNA methylation data. Afterwards, non-Gaussian statistical model-based unsupervised clustering strategies are applied to cluster the data. Comparisons and analysis of different dimension reduction strategies and unsupervised clustering methods are presented. Experimental results show that the non-Gaussian statistical model-based methods are superior to the conventional Gaussian distribution-based method. They are meaningful tools for DNA methylation analysis. Moreover, among several non-Gaussian methods, the one that captures the bounded nature of DNA methylation data reveals the best clustering performance.
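
    As a toy illustration of modeling bounded-support methylation values on (0, 1) (a single beta-distribution fit with SciPy; the paper's non-Gaussian mixture models for dimension reduction and clustering are not reproduced here):

    ```python
    # Sketch: fit a beta distribution to bounded (0, 1) methylation beta-values.
    # The simulated data and the single-component fit are illustrative only.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(42)
    beta_values = rng.beta(a=2.0, b=8.0, size=5000)   # placeholder methylation data

    # Fix loc=0 and scale=1 so the fitted support stays on (0, 1).
    a_hat, b_hat, loc, scale = stats.beta.fit(beta_values, floc=0, fscale=1)
    print(f"fitted shape parameters: a={a_hat:.2f}, b={b_hat:.2f}")
    ```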

  18. Comparisons of Non-Gaussian Statistical Models in DNA Methylation Analysis

    PubMed Central

    Ma, Zhanyu; Teschendorff, Andrew E.; Yu, Hong; Taghia, Jalil; Guo, Jun

    2014-01-01

    As a key regulatory mechanism of gene expression, DNA methylation patterns are widely altered in many complex genetic diseases, including cancer. DNA methylation is naturally quantified by bounded support data; therefore, it is non-Gaussian distributed. In order to capture such properties, we introduce some non-Gaussian statistical models to perform dimension reduction on DNA methylation data. Afterwards, non-Gaussian statistical model-based unsupervised clustering strategies are applied to cluster the data. Comparisons and analysis of different dimension reduction strategies and unsupervised clustering methods are presented. Experimental results show that the non-Gaussian statistical model-based methods are superior to the conventional Gaussian distribution-based method. They are meaningful tools for DNA methylation analysis. Moreover, among several non-Gaussian methods, the one that captures the bounded nature of DNA methylation data reveals the best clustering performance. PMID:24937687

  19. A novel imaging technique for measuring kinematics of light-weight flexible structures

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zakaria, Mohamed Y., E-mail: zakaria@vt.edu; Eliethy, Ahmed S.; Canfield, Robert A.

    2016-07-15

    A new imaging algorithm is proposed to capture the kinematics of flexible, thin, light structures, including frequencies and motion amplitudes, for real-time analysis. The case studied is a thin flexible beam preset at different angles of attack in a wind tunnel. As the angle of attack is increased beyond a critical value, the beam undergoes a static deflection that is followed by limit-cycle oscillations. Imaging analysis of the beam vibrations shows that the motion consists of a superposition of the bending and torsion modes. The proposed algorithm was able to capture the oscillation amplitudes as well as the frequencies of both bending and torsion modes. The analysis results are validated through comparison with measurements from a piezoelectric sensor attached to the beam at its root.

  20. Isolation of Circulating Plasma Cells in Multiple Myeloma Using CD138 Antibody-Based Capture in a Microfluidic Device

    NASA Astrophysics Data System (ADS)

    Qasaimeh, Mohammad A.; Wu, Yichao C.; Bose, Suman; Menachery, Anoop; Talluri, Srikanth; Gonzalez, Gabriel; Fulciniti, Mariateresa; Karp, Jeffrey M.; Prabhala, Rao H.; Karnik, Rohit

    2017-04-01

    The necessity for bone marrow aspiration and the lack of highly sensitive assays to detect residual disease present challenges for effective management of multiple myeloma (MM), a plasma cell cancer. We show that a microfluidic cell capture based on CD138 antigen, which is highly expressed on plasma cells, permits quantitation of rare circulating plasma cells (CPCs) in blood and subsequent fluorescence-based assays. The microfluidic device is based on a herringbone channel design, and exhibits an estimated cell capture efficiency of ~40-70%, permitting detection of <10 CPCs/mL using 1-mL sample volumes, which is difficult using existing techniques. In bone marrow samples, the microfluidic-based plasma cell counts exhibited excellent correlation with flow cytometry analysis. In peripheral blood samples, the device detected a baseline of 2-5 CD138+ cells/mL in healthy donor blood, with significantly higher numbers in blood samples of MM patients in remission (20-24 CD138+ cells/mL), and yet higher numbers in MM patients exhibiting disease (45-184 CD138+ cells/mL). Analysis of CPCs isolated using the device was consistent with serum immunoglobulin assays that are commonly used in MM diagnostics. These results indicate the potential of CD138-based microfluidic CPC capture as a useful ‘liquid biopsy’ that may complement or partially replace bone marrow aspiration.
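
    One illustrative way to relate device counts to blood concentration (a back-of-the-envelope relation, not stated in the abstract) is to divide the number of captured cells N_cap by the product of the capture efficiency η and the processed sample volume V:

    ```latex
    % Illustrative back-calculation of circulating plasma cell concentration
    % from captured-cell counts, capture efficiency and processed volume.
    \[
      C_{\mathrm{CPC}} \approx \frac{N_{\mathrm{cap}}}{\eta \, V}
    \]
    ```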

  1. Molecular characterisation of a mosaicism with a complex chromosome rearrangement: evidence for coincident chromosome healing by telomere capture and neo‐telomere formation

    PubMed Central

    Chabchoub, Elyes; Rodríguez, Laura; Galán, Enrique; Mansilla, Elena; Martínez‐Fernandez, Maria Luisa; Martínez‐Frías, Maria Luisa; Fryns, Jean‐Pierre; Vermeesch, Joris Robert

    2007-01-01

    Background Broken chromosomes must acquire new telomeric "caps" to be structurally stable. Chromosome healing can be mediated either by telomerase through neo-telomere synthesis or by telomere capture. Aim To unravel the mechanism(s) generating complex chromosomal mosaicisms and healing broken chromosomes. Methods G banding, array comparative genomic hybridization (aCGH), fluorescence in-situ hybridisation (FISH) and short tandem repeat (STR) analysis were performed on a girl presenting with mental retardation, facial dysmorphism, urogenital malformations and limb anomalies carrying a complex chromosomal mosaicism. Results & discussion The karyotype showed a de novo chromosome rearrangement with two cell lines: one cell line with a deletion 9pter and one cell line carrying an inverted duplication 9p and a non-reciprocal translocation of a 5pter fragment. aCGH, FISH and STR analysis enabled the deduction of the most likely sequence of events generating this complex mosaic. During embryogenesis, a double-strand break occurred on the paternal chromosome 9. Following mitotic separation of both broken sister chromatids, one acquired a telomere via neo-telomere formation, while the other generated a dicentric chromosome which underwent breakage during anaphase, giving rise to the del inv dup(9) that was subsequently healed by chromosome 5 telomere capture. Conclusion Broken chromosomes can coincidentally be rescued by both telomere capture and neo-telomere synthesis. PMID:17172463

  2. Targeting Human Serum Fucome by an Integrated Liquid-phase Multi Column Platform Operating in “Cascade” to Facilitate Comparative Mass Spectrometric Analysis of Disease-Free and Breast Cancer Sera

    PubMed Central

    Selvaraju, Subhashini; Rassi, Ziad El

    2013-01-01

    A fully integrated platform was developed for capturing and fractionating the human fucome from disease-free and breast cancer sera. It comprised multiple columns operated by HPLC pumps and switching valves for the simultaneous depletion of high-abundance proteins via affinity-based subtraction and the capture of fucosylated glycoproteins via lectin affinity chromatography, followed by fractionation of the captured glycoproteins by reversed-phase chromatography (RPC). Two fucose-specific lectin columns, namely Aleuria aurantia lectin (AAL) and Lotus tetragonolobus agglutinin (LTA), were utilized. The platform allowed the "cascading" of the serum sample from column to column in the liquid phase with no sample manipulation between the various steps. This guaranteed no sample loss and no propagation of experimental biases between the various columns. Finally, the fucome was fractionated by RPC, yielding desalted fractions in a volatile acetonitrile-rich mobile phase, which after vacuum evaporation were subjected to trypsinolysis for LC-MS/MS analysis. This permitted the identification of differentially expressed proteins (DEP) in breast cancer serum, yielding a broad panel of 35 DEP from the combined LTA- and AAL-captured proteins and a narrower panel of 8 DEP that were commonly differentially expressed in both LTA and AAL fractions, which are considered more representative of the cancer-altered fucome. PMID:23533108

  3. Using large-scale Granger causality to study changes in brain network properties in the Clinically Isolated Syndrome (CIS) stage of multiple sclerosis

    NASA Astrophysics Data System (ADS)

    Abidin, Anas Z.; Chockanathan, Udaysankar; DSouza, Adora M.; Inglese, Matilde; Wismüller, Axel

    2017-03-01

    Clinically Isolated Syndrome (CIS) is often considered to be the first neurological episode associated with Multiple Sclerosis (MS). At an early stage the inflammatory demyelination occurring in the CNS can manifest as a change in neuronal metabolism, with multiple asymptomatic white matter lesions detected in clinical MRI. Such damage may induce topological changes of brain networks, which can be captured by advanced functional MRI (fMRI) analysis techniques. We test this hypothesis by capturing the effective relationships of 90 brain regions, defined in the Automated Anatomical Labeling (AAL) atlas, using a large-scale Granger Causality (lsGC) framework. The resulting networks are then characterized using graph-theoretic measures that quantify various network topology properties at a global as well as at a local level. We test for differences in these properties in network graphs obtained for 18 subjects (10 male and 8 female; 9 with CIS and 9 healthy controls). Global network properties, namely modularity and the clustering coefficient, captured trending differences (p<0.1). Additionally, local network properties, such as local efficiency and the strength of connections, captured statistically significant (p<0.01) differences in some regions of the inferior frontal and parietal lobes. We conclude that multivariate analysis of fMRI time-series can reveal interesting information about changes occurring in the brain in early stages of MS.
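    A minimal sketch of the kind of graph-theoretic post-processing described above, assuming the lsGC output is available as a 90x90 connectivity matrix; the thresholding step, the use of the networkx library, and the random example matrix are illustrative assumptions rather than the authors' pipeline:

    ```python
    # Sketch: global graph measures (clustering coefficient, modularity) from a
    # thresholded 90x90 connectivity matrix such as one produced by lsGC.
    import numpy as np
    import networkx as nx

    def global_measures(conn, threshold=0.5):
        """Binarize a weighted connectivity matrix and return global metrics."""
        adj = (np.abs(conn) > threshold).astype(int)
        np.fill_diagonal(adj, 0)                      # ignore self-connections
        G = nx.from_numpy_array(adj)                  # undirected graph for these metrics
        communities = nx.algorithms.community.greedy_modularity_communities(G)
        return {
            "clustering": nx.average_clustering(G),
            "modularity": nx.algorithms.community.modularity(G, communities),
        }

    # Example with a random placeholder matrix (the AAL atlas defines 90 regions)
    rng = np.random.default_rng(0)
    print(global_measures(rng.random((90, 90))))
    ```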

  4. Attitude control analysis of tethered de-orbiting

    NASA Astrophysics Data System (ADS)

    Peters, T. V.; Briz Valero, José Francisco; Escorial Olmos, Diego; Lappas, V.; Jakowski, P.; Gray, I.; Tsourdos, A.; Schaub, H.; Biesbroek, R.

    2018-05-01

    The growing number of satellites and rocket upper stages in low Earth orbit (LEO) has substantially increased the danger of collisions in space. Studies have shown that the problem will continue to grow unless a number of debris objects are removed every year. A typical active debris removal (ADR) mission scenario includes launching an active spacecraft (chaser) which will rendezvous with the inactive target (debris), capture the debris and eventually de-orbit both satellites. Many concepts for capturing the debris while keeping a tether connection between the target and chaser have been investigated, including harpoons, nets, grapples and robotic arms. The paper provides an analysis of the attitude control behaviour for a tethered de-orbiting mission based on the ESA e.Deorbit reference mission, where Envisat is the debris target to be captured by a chaser using a net which is connected to the chaser with a tether. The paper provides novel insight into the feasibility of tethered de-orbiting for the various mission phases, such as stabilization after capture, the de-orbit burn (plus stabilization) and stabilization during the atmospheric pass, highlighting the importance of critical mission parameters such as the tether material. It is shown that selecting an appropriate tether material while using simple controllers can reduce the effort needed for tethered de-orbiting and can safely control the attitude of the debris/chaser stack connected by a tether, without the danger of a collision.

  5. Development of a novel engineered E. coli host cell line platform with improved column capacity performance for ion-exchange chromatography.

    PubMed

    Mukherjee, Rudra Palash; Fruchtl, McKinzie S; Beitle, Robert R; Brune, Ellen M

    2018-02-01

    This article reports on the analysis of an engineered Escherichia coli designed to reduce the host cell protein (HCP) burden on recombinant protein purification by column chromatography. Since downstream purification accounts for a major portion of production costs when using a recombinant platform, minimization of HCPs that are initially captured or otherwise interfere during chromatography will positively impact the entire purification process. Such a strategy, of course, would also require the cell line to grow, and express recombinant proteins, at levels comparable to, or better than, its parent strain. An E. coli strain with a small number of strategic deletions (LTSF06) was transformed to produce three different recombinant biologics to examine growth and expression, and with another model protein to assess growth and the effect of selectively reduced HCPs on target product capture on DEAE ion exchange medium. Cell growth levels were maintained or increased for all constructs, and a significant reduction in HCP adsorption was realized. Indeed, a breakthrough analysis indicated that as a result of reducing adsorption of particular HCPs, a 37% increase in target protein capture was observed. This increase in product capture efficiency was achieved by focusing not on HCPs that co-elute with the recombinant target, but rather on those possessing particular column adsorption and elution characteristics. Copyright © 2017 Elsevier Inc. All rights reserved.

  6. Bench-Scale Silicone Process for Low-Cost CO{sub 2} Capture

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vipperla, Ravikumar; Yee, Michael; Steele, Ray

    This report presents system and economic analysis for a carbon capture unit which uses an amino-silicone solvent for CO{sub 2} capture and sequestration (CCS) in a pulverized coal (PC) boiler. The amino-silicone solvent is based on GAP-1 with tri-ethylene glycol (TEG) as a co-solvent. The report also shows results for a CCS unit based on a conventional approach using mono-ethanol amine (MEA). Models were developed for both processes and used to calculate mass and energy balances. Capital costs and energy penalty were calculated for both systems, as well as the increase in cost of electricity. The amino-silicone solvent-based system demonstrates significant advantages compared to the MEA system.

  7. A Quasi-Static Method for Determining the Characteristics of a Motion Capture Camera System in a "Split-Volume" Configuration

    NASA Technical Reports Server (NTRS)

    Miller, Chris; Mulavara, Ajitkumar; Bloomberg, Jacob

    2001-01-01

    To confidently report any data collected from a video-based motion capture system, its functional characteristics must be determined, namely accuracy, repeatability and resolution. Many researchers have examined these characteristics of motion capture systems, but they used only two cameras, positioned at 90 degrees to each other. Everaert used 4 cameras, but all were aligned along major axes (two in x, one each in y and z). Richards compared the characteristics of different commercially available systems set up in practical configurations, but all cameras viewed a single calibration volume. The purpose of this study was to determine the accuracy, repeatability and resolution of a 6-camera Motion Analysis system in a split-volume configuration using a quasi-static methodology.

  8. Using argument notation to engineer biological simulations with increased confidence

    PubMed Central

    Alden, Kieran; Andrews, Paul S.; Polack, Fiona A. C.; Veiga-Fernandes, Henrique; Coles, Mark C.; Timmis, Jon

    2015-01-01

    The application of computational and mathematical modelling to explore the mechanics of biological systems is becoming prevalent. To significantly impact biological research, notably in developing novel therapeutics, it is critical that the model adequately represents the captured system. Confidence in adopting in silico approaches can be improved by applying a structured argumentation approach, alongside model development and results analysis. We propose an approach based on argumentation from safety-critical systems engineering, where a system is subjected to a stringent analysis of compliance against identified criteria. We show its use in examining the biological information upon which a model is based, identifying model strengths, highlighting areas requiring additional biological experimentation and providing documentation to support model publication. We demonstrate our use of structured argumentation in the development of a model of lymphoid tissue formation, specifically Peyer's Patches. The argumentation structure is captured using Artoo (www.york.ac.uk/ycil/software/artoo), our Web-based tool for constructing fitness-for-purpose arguments, using a notation based on the safety-critical goal structuring notation. We show how argumentation helps in making the design and structured analysis of a model transparent, capturing the reasoning behind the inclusion or exclusion of each biological feature and recording assumptions, as well as pointing to evidence supporting model-derived conclusions. PMID:25589574

  9. Using argument notation to engineer biological simulations with increased confidence.

    PubMed

    Alden, Kieran; Andrews, Paul S; Polack, Fiona A C; Veiga-Fernandes, Henrique; Coles, Mark C; Timmis, Jon

    2015-03-06

    The application of computational and mathematical modelling to explore the mechanics of biological systems is becoming prevalent. To significantly impact biological research, notably in developing novel therapeutics, it is critical that the model adequately represents the captured system. Confidence in adopting in silico approaches can be improved by applying a structured argumentation approach, alongside model development and results analysis. We propose an approach based on argumentation from safety-critical systems engineering, where a system is subjected to a stringent analysis of compliance against identified criteria. We show its use in examining the biological information upon which a model is based, identifying model strengths, highlighting areas requiring additional biological experimentation and providing documentation to support model publication. We demonstrate our use of structured argumentation in the development of a model of lymphoid tissue formation, specifically Peyer's Patches. The argumentation structure is captured using Artoo (www.york.ac.uk/ycil/software/artoo), our Web-based tool for constructing fitness-for-purpose arguments, using a notation based on the safety-critical goal structuring notation. We show how argumentation helps in making the design and structured analysis of a model transparent, capturing the reasoning behind the inclusion or exclusion of each biological feature and recording assumptions, as well as pointing to evidence supporting model-derived conclusions.

  10. Data processing workflows from low-cost digital survey to various applications: three case studies of Chinese historic architecture

    NASA Astrophysics Data System (ADS)

    Sun, Z.; Cao, Y. K.

    2015-08-01

    The paper focuses on the versatility of data processing workflows ranging from BIM-based survey to structural analysis and reverse modeling. In China nowadays, a large number of historic buildings are in need of restoration, reinforcement and renovation, but architects are not well prepared for the shift from the booming AEC industry to architectural preservation. As surveyors working with architects in such projects, we have to develop efficient, low-cost digital survey workflows that are robust to various types of architecture, and to process the captured data for architects. Although laser scanning yields high accuracy in architectural heritage documentation and the workflow is quite straightforward, its cost and limited portability hinder it from being used in projects where budget and efficiency are of prime concern. We therefore integrate Structure from Motion techniques with UAV and total station measurements in data acquisition. The captured data is processed for various purposes, illustrated with three case studies: the first is an as-built BIM for a historic building based on point clouds registered to Ground Control Points; the second concerns structural analysis of a damaged bridge using Finite Element Analysis software; the last relates to parametric automated feature extraction from captured point clouds for reverse modeling and fabrication.

  11. Comparison of the Cloud Morphology Spatial Structure Between Jupiter and Saturn Using JunoCam and Cassini ISS

    NASA Astrophysics Data System (ADS)

    Garland, Justin; Sayanagi, Kunio M.; Blalock, John J.; Gunnarson, Jacob; McCabe, Ryan M.; Gallego, Angelina; Hansen, Candice; Orton, Glenn S.

    2017-10-01

    We present an analysis of the spatial scales contained in the cloud morphology of Jupiter’s southern high latitudes using images captured by JunoCam in 2016 and 2017, and compare them to those on Saturn using images captured by the Imaging Science Subsystem (ISS) on board the Cassini orbiter. For Jupiter, the characteristic spatial scale of cloud morphology as a function of latitude is calculated from images taken in three visible (600-800, 500-600, 420-520 nm) bands and a near-infrared (880-900 nm) band. In particular, we analyze the transition from the banded structure characteristic of Jupiter’s mid-latitudes to the chaotic structure of the polar region. We apply a similar analysis to Saturn using images captured by Cassini ISS. In contrast to Jupiter, Saturn maintains its zonally organized cloud morphology from low latitudes up to the poles, culminating in the cyclonic polar vortices centered at each of the poles. By quantifying the differences in the spatial scales contained in the cloud morphology, our analysis will shed light on the processes that control the banded structures on Jupiter and Saturn. Our work has been supported by the following grants: NASA PATM NNX14AK07G, NASA MUREP NNX15AQ03A, and NSF AAG 1212216.
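    As an illustration of how a characteristic spatial scale can be extracted per latitude band, the sketch below locates the peak of the one-dimensional power spectrum of brightness along the zonal direction; this is an assumed approach shown for illustration only, not the authors' method, and the pixel scale and test signal are invented:

    ```python
    # Sketch: characteristic zonal scale from the peak of the brightness power spectrum.
    import numpy as np

    def characteristic_scale(brightness_row, pixel_scale_km):
        """brightness_row: 1-D brightness profile along a latitude circle."""
        row = brightness_row - brightness_row.mean()
        power = np.abs(np.fft.rfft(row))**2
        freqs = np.fft.rfftfreq(row.size, d=pixel_scale_km)   # cycles per km
        k_peak = freqs[1:][np.argmax(power[1:])]              # skip the DC bin
        return 1.0 / k_peak                                   # wavelength in km

    rng = np.random.default_rng(0)
    x = np.arange(1024)
    row = np.sin(2 * np.pi * x / 128) + 0.3 * rng.standard_normal(x.size)
    print("characteristic scale: %.0f km" % characteristic_scale(row, pixel_scale_km=50))
    ```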

  12. Robust and Accurate Shock Capturing Method for High-Order Discontinuous Galerkin Methods

    NASA Technical Reports Server (NTRS)

    Atkins, Harold L.; Pampell, Alyssa

    2011-01-01

    A simple yet robust and accurate approach for capturing shock waves using a high-order discontinuous Galerkin (DG) method is presented. The method uses the physical viscous terms of the Navier-Stokes equations as suggested by others; however, the proposed formulation of the numerical viscosity is continuous and compact by construction, and does not require the solution of an auxiliary diffusion equation. This work also presents two analyses that guided the formulation of the numerical viscosity and certain aspects of the DG implementation. A local eigenvalue analysis of the DG discretization applied to a shock containing element is used to evaluate the robustness of several Riemann flux functions, and to evaluate algorithm choices that exist within the underlying DG discretization. A second analysis examines exact solutions to the DG discretization in a shock containing element, and identifies a "model" instability that will inevitably arise when solving the Euler equations using the DG method. This analysis identifies the minimum viscosity required for stability. The shock capturing method is demonstrated for high-speed flow over an inviscid cylinder and for an unsteady disturbance in a hypersonic boundary layer. Numerical tests are presented that evaluate several aspects of the shock detection terms. The sensitivity of the results to model parameters is examined with grid and order refinement studies.

  13. fMRI capture of auditory hallucinations: Validation of the two-steps method.

    PubMed

    Leroy, Arnaud; Foucher, Jack R; Pins, Delphine; Delmaire, Christine; Thomas, Pierre; Roser, Mathilde M; Lefebvre, Stéphanie; Amad, Ali; Fovet, Thomas; Jaafari, Nemat; Jardri, Renaud

    2017-10-01

    Our purpose was to validate a reliable method to capture brain activity concomitant with hallucinatory events, which constitute frequent and disabling experiences in schizophrenia. Capturing hallucinations using functional magnetic resonance imaging (fMRI) remains very challenging. We previously developed a method based on a two-steps strategy including (1) multivariate data-driven analysis of per-hallucinatory fMRI recording and (2) selection of the components of interest based on a post-fMRI interview. However, two tests still need to be conducted to rule out critical pitfalls of conventional fMRI capture methods before this two-steps strategy can be adopted in hallucination research: replication of these findings on an independent sample and assessment of the reliability of the hallucination-related patterns at the subject level. To do so, we recruited a sample of 45 schizophrenia patients suffering from frequent hallucinations, 20 schizophrenia patients without hallucinations and 20 matched healthy volunteers; all participants underwent four different experiments. The main findings are (1) high accuracy in reporting unexpected sensory stimuli in an MRI setting; (2) good detection concordance between hypothesis-driven and data-driven analysis methods (as used in the two-steps strategy) when controlled unexpected sensory stimuli are presented; (3) good agreement of the two-steps method with the online button-press approach to capture hallucinatory events; (4) high spatial consistency of hallucinatory-related networks detected using the two-steps method on two independent samples. By validating the two-steps method, we advance toward the possible transfer of such technology to new image-based therapies for hallucinations. Hum Brain Mapp 38:4966-4979, 2017. © 2017 Wiley Periodicals, Inc.
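    As a schematic illustration of the data-driven first step of such a two-steps strategy, the sketch below applies an ICA decomposition to a (time x voxels) fMRI matrix; scikit-learn, the array shapes, the number of components, and the random placeholder data are assumptions for illustration and do not reproduce the authors' implementation:

    ```python
    # Sketch: step (1), a data-driven decomposition of a per-hallucinatory fMRI run.
    import numpy as np
    from sklearn.decomposition import FastICA

    # Placeholder data: 200 time points x 5000 masked voxels
    rng = np.random.default_rng(1)
    fmri_data = rng.standard_normal((200, 5000))

    ica = FastICA(n_components=20, random_state=0, max_iter=500)
    time_courses = ica.fit_transform(fmri_data)   # (200, 20) component time courses
    spatial_maps = ica.components_                # (20, 5000) spatial maps

    # Step (2), schematically: keep the components whose time courses coincide
    # with the hallucination periods reported in the post-fMRI interview.
    ```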

  14. Mind the gaps: what's missing from current economic evaluations of universal HPV vaccination?

    PubMed

    Marsh, Kevin; Chapman, Ruth; Baggaley, Rebecca F; Largeron, Nathalie; Bresse, Xavier

    2014-06-24

    Since the original licensing of human papilloma virus (HPV) vaccination for women, evidence is accumulating of its effectiveness in preventing HPV-related conditions in men, and universal vaccination (vaccinating men and women) is now recommended in some countries. Several models of the cost-effectiveness of universal HPV vaccination have been published, but results have been mixed. This article assesses the extent to which economic studies have captured the range of values associated with universal HPV vaccination, and how this influences estimates of its cost-effectiveness. Eight published economic evaluations of universal HPV vaccination were reviewed to identify which of the values associated with universal HPV vaccination were included in each analysis. Studies of the cost-effectiveness of universal HPV vaccination capture only a fraction of the values generated. Most studies focused on impacts on health and health system cost, and only captured these partially. A range of values is excluded from most studies, including impacts on productivity, patient time and costs, carers and family costs, and broader social values such as the right to access treatment. Further, those studies that attempted to capture these values only did so partially. Decisions to invest in universal HPV vaccination need to be based on a complete assessment of the value that it generates. This is not provided by existing economic evaluations. Further work is required to understand this value. First, research is required to understand how HPV-related health outcomes impact on society including, for instance, their impact on productivity. Second, consideration should be given to alternative approaches to capture this broader set of values in a manner useful to decision-makers, such as multi-criteria decision analysis. Copyright © 2014 Elsevier Ltd. All rights reserved.

  15. Combined Pressure, Temperature Contrast and Surface-Enhanced Separation of Carbon Dioxide for Post-Combustion Carbon Capture

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Zhen; Wong, Michael; Gupta, Mayank

    The Rice University research team developed a hybrid carbon dioxide (CO2) absorption process combining absorber and stripper columns using a high-surface-area ceramic foam gas-liquid contactor for enhanced mass transfer and utilizing waste heat for regeneration. This integrated absorber/desorber arrangement will reduce space requirements, an important factor for retrofitting existing coal-fired power plants with CO2 capture technology. As described in this report, we performed an initial analysis to estimate the technical and economic feasibility of the process. A one-dimensional (1D) CO2 absorption column was fabricated to measure the hydrodynamic and mass transfer characteristics of the ceramic foam. A bench-scale prototype was constructed to implement the complete CO2 separation process and tested to study various aspects of fluid flow in the process. A model was developed to simulate the two-dimensional (2D) fluid flow and optimize the CO2 capture process. Test results were used to develop a final technoeconomic analysis and identify the most appropriate absorbent as well as optimum operating conditions to minimize capital and operating costs. Finally, a technoeconomic study was performed to assess the feasibility of integrating the process into a 600 megawatt electric (MWe) coal-fired power plant. With process optimization, a COE of $82/MWh can be achieved using our integrated absorber/desorber CO2 capture technology, which is very close to DOE's target of no more than a 35% increase in COE with CCS. An environmental, health, and safety (EH&S) assessment of the capture process indicated no significant concern in terms of EH&S effects or legislative compliance.

  16. Aerocapture Performance Analysis for a Neptune-Triton Exploration Mission

    NASA Technical Reports Server (NTRS)

    Starr, Brett R.; Westhelle, Carlos H.; Masciarelli, James P.

    2004-01-01

    A systems analysis has been conducted for a Neptune-Triton Exploration Mission in which aerocapture is used to capture a spacecraft at Neptune. Aerocapture uses aerodynamic drag instead of propulsion to decelerate from the interplanetary approach trajectory to a captured orbit during a single pass through the atmosphere. After capture, propulsion is used to move the spacecraft from the initial captured orbit to the desired science orbit. A preliminary assessment identified that a spacecraft with a lift-to-drag (L/D) ratio of 0.8 was required for aerocapture. Performance analyses of the 0.8 L/D vehicle were performed using a high-fidelity flight simulation within a Monte Carlo executive to determine mission success statistics. The simulation was the Program to Optimize Simulated Trajectories (POST) modified to include Neptune-specific atmospheric and planet models, spacecraft aerodynamic characteristics, and interplanetary trajectory models. To these were added autonomous guidance and pseudo flight controller models. The Monte Carlo analyses incorporated approach trajectory delivery errors, aerodynamic characteristics uncertainties, and atmospheric density variations. Monte Carlo analyses were performed for a reference set of uncertainties and for sets of uncertainties modified to produce increased and reduced atmospheric variability. For the reference uncertainties, the 0.8 L/D flat-bottom ellipsled vehicle achieves 100% successful capture and has a 99.87% probability of attaining the science orbit with a 360 m/s delta-V budget for apoapsis and periapsis adjustment. Monte Carlo analyses were also performed for a guidance system that modulates both bank angle and angle of attack with the reference set of uncertainties. The alpha-and-bank modulation guidance system reduces the 99.87th-percentile delta-V by 173 m/s (48%), to 187 m/s, for the reference set of uncertainties.
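    To illustrate how success statistics and a percentile delta-V budget are typically extracted from Monte Carlo dispersions, here is a toy sketch; the dispersion model, coefficients, and thresholds are invented placeholders and bear no relation to the POST simulation or its results:

    ```python
    # Toy Monte Carlo: map dispersed inputs to a correction delta-V and report statistics.
    import numpy as np

    rng = np.random.default_rng(42)
    n_runs = 2000

    fpa_err   = rng.normal(0.0, 0.2, n_runs)      # entry flight-path angle error, deg
    dens_fact = rng.lognormal(0.0, 0.15, n_runs)  # atmospheric density multiplier
    ld_err    = rng.normal(0.0, 0.02, n_runs)     # L/D uncertainty

    # Invented mapping from dispersions to post-capture correction delta-V (m/s)
    dv = 150 + 300*np.abs(fpa_err) + 400*np.abs(dens_fact - 1) + 800*np.abs(ld_err)

    print("99.87th percentile delta-V: %.0f m/s" % np.percentile(dv, 99.87))
    print("fraction within a 360 m/s budget: %.3f" % np.mean(dv < 360))
    ```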

  17. The use of solar energy can enhance the conversion of carbon dioxide into energy-rich products: stepping towards artificial photosynthesis.

    PubMed

    Aresta, Michele; Dibenedetto, Angela; Angelini, Antonella

    2013-08-13

    The need to cut CO₂ emissions into the atmosphere is pushing scientists and technologists to discover and implement new strategies that may be effective for controlling the atmospheric CO₂ level (and its possible effects on climate change). One option is the capture of CO₂ from power plant flue gases or other industrial processes to avoid it entering the atmosphere. The captured CO₂ can be either disposed of in natural fields (geological cavities, spent gas or oil wells, coal beds, aquifers; even oceans have been proposed) or used as a source of carbon in synthetic processes. In this paper, we present the options for CO₂ utilization and make an analysis of possible solutions for the conversion of large volumes of CO₂, either by combining it with H₂, which must be generated from water, or by directly converting it into fuels by electrolysis in water using solar energy. A CO₂-H₂-based economy may address the issue of reducing the environmental burden of energy production while also saving fossil carbon for future generations. The integration of CO₂ capture and utilization with CO₂ capture and storage would result in a more economically and energetically viable practice of CO₂ capture.

  18. Stellar Neutron Capture Cross Sections of the Lu and Hf Isotopes

    NASA Astrophysics Data System (ADS)

    Wisshak, K.; Voss, F.; Käppeler, F.; Kazakov, L.; Krtička, M.

    2005-05-01

    The neutron capture cross sections of 175,176Lu and 176,177,178,179,180Hf have been measured in the energy range from 3 to 225 keV at the Karlsruhe 3.7 MV Van de Graaff accelerator relative to the gold standard. Neutrons were produced by the 7Li(p,n)7Be reaction and capture events were detected with the Karlsruhe 4πBaF2 detector. The cross section ratios could be determined with uncertainties between 0.9 and 1.8%, about a factor of five more accurate than previous data. A strong population of isomeric states was found in neutron capture on the Hf isotopes, which is only partially explained by CASINO/GEANT simulations based on the known level schemes. Maxwellian-averaged neutron capture cross sections were calculated for thermal energies between kT = 8 keV and 100 keV. Severe differences of up to 40% were found relative to the data of a recent evaluation based on existing experimental results. The new data allow for a much more reliable analysis of the important branching in the s-process synthesis path at 176Lu, which can be interpreted as an s-process thermometer.
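    For reference, the Maxwellian-averaged cross section (MACS) at thermal energy kT mentioned above follows the standard definition (a textbook relation, not quoted from the record itself):

    \[
    \langle\sigma\rangle_{kT} \;=\; \frac{2}{\sqrt{\pi}}\,\frac{1}{(kT)^{2}}\int_{0}^{\infty}\sigma(E)\,E\,e^{-E/kT}\,dE ,
    \]

    which is how differential capture data measured between 3 and 225 keV are folded into stellar reaction rates for s-process analyses.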

  19. S-factor for radiative capture reactions for light nuclei at astrophysical energies

    NASA Astrophysics Data System (ADS)

    Ghasemi, Reza; Sadeghi, Hossein

    2018-06-01

    The astrophysical S-factors of thermonuclear reactions, including radiative capture reactions, and their analysis in the framework of different theoretical models are a main source of information on nuclear processes in stars. We have studied the importance of radiative capture reactions in the framework of a potential model. Investigation of these reactions at astrophysical energies is of great interest for astrophysics and nuclear physics, particularly for developing correct models of stellar burning and evolution. Direct experimental measurements are very difficult, and often impossible, because these reactions occur at low energies. In this paper we calculate radiative capture astrophysical S-factors for nuclei in the mass region A < 17. We calculate the astrophysical factor for the electric dipole (E1), magnetic dipole (M1) and electric quadrupole (E2) transitions using the M3Y potential for both non-resonant and resonant capture. We obtain the parameters of the central and spin-orbit parts of the M3Y potential and the spectroscopic factors for the reaction channels. Good agreement is achieved between the calculated astrophysical S-factors, experimental data and other theoretical methods.
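    For context, the astrophysical S-factor discussed above is conventionally defined by factoring the Coulomb barrier penetration out of the capture cross section (a standard relation, not quoted from the record):

    \[
    S(E) \;=\; \sigma(E)\,E\,\exp(2\pi\eta), \qquad \eta \;=\; \frac{Z_{1}Z_{2}e^{2}}{\hbar v},
    \]

    where the Sommerfeld parameter \(\eta\) depends on the charges of the colliding nuclei and their relative velocity v; this factorization removes the steep low-energy fall-off of \(\sigma(E)\) and makes extrapolation to astrophysical energies better behaved.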

  20. Ubiquitous human upper-limb motion estimation using wearable sensors.

    PubMed

    Zhang, Zhi-Qiang; Wong, Wai-Choong; Wu, Jian-Kang

    2011-07-01

    Human motion capture technologies have been widely used in a broad spectrum of applications, including interactive gaming and learning, animation, film special effects, health care, and navigation. Existing human motion capture techniques, which use structured multiple high-resolution cameras in a dedicated studio, are complicated and expensive. With the rapid development of microsensors-on-chip, human motion capture using wearable microsensors has become an active research topic. Because of the agility of its movement, upper-limb motion estimation has been regarded as the most difficult problem in human motion capture. In this paper, we take the upper limb as our research subject and propose a novel ubiquitous upper-limb motion estimation algorithm, which concentrates on modeling the relationship between upper-arm movement and forearm movement. A link structure with 5 degrees of freedom (DOF) is proposed to model the human upper-limb skeleton structure. Parameters are defined according to the Denavit-Hartenberg convention, forward kinematics equations are derived, and an unscented Kalman filter is deployed to estimate the defined parameters. The experimental results have shown that the proposed upper-limb motion capture and analysis algorithm outperforms other fusion methods and provides accurate results in comparison to the BTS optical motion tracker.
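    A minimal sketch of forward kinematics from Denavit-Hartenberg parameters, of the kind referred to above; the 5-DOF parameter table below is a placeholder assumption and not the link parameters used in the paper:

    ```python
    # Sketch: standard-convention DH transforms chained into a forward-kinematics pose.
    import numpy as np

    def dh_transform(theta, d, a, alpha):
        """Homogeneous transform for one link (standard DH convention)."""
        ct, st = np.cos(theta), np.sin(theta)
        ca, sa = np.cos(alpha), np.sin(alpha)
        return np.array([
            [ct, -st*ca,  st*sa, a*ct],
            [st,  ct*ca, -ct*sa, a*st],
            [0.0,    sa,     ca,    d],
            [0.0,   0.0,    0.0,  1.0],
        ])

    def forward_kinematics(joint_angles, dh_table):
        """Chain link transforms; returns the distal (wrist) pose as a 4x4 matrix."""
        T = np.eye(4)
        for theta, (d, a, alpha) in zip(joint_angles, dh_table):
            T = T @ dh_transform(theta, d, a, alpha)
        return T

    # Placeholder 5-DOF table: (d, a, alpha) per joint; link lengths in metres
    dh_table = [(0, 0, np.pi/2), (0, 0, np.pi/2), (0, 0.30, 0),
                (0, 0, np.pi/2), (0, 0.25, 0)]
    print(forward_kinematics(np.zeros(5), dh_table))
    ```

    In a full pipeline, an unscented Kalman filter would then estimate the joint angles that make this predicted pose consistent with the wearable sensor measurements.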

  1. CO 2 Capture by Cold Membrane Operation with Actual Power Plant Flue Gas

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chaubey, Trapti; Kulkarni, Sudhir; Hasse, David

    The main objective of the project was to develop a post-combustion CO2 capture process based on a hybrid cold-temperature membrane operation. The CO2 in the flue gas from a coal-fired power plant is pre-concentrated to >60% CO2 in the first-stage membrane operation, followed by further liquefaction of the permeate stream to achieve >99% CO2 purity. The aim of the project was based on the DOE program goal of 90% CO2 capture with >95% CO2 purity from pulverized coal (PC) fired power plants at a carbon capture cost of $40/tonne by 2025. The project moves the technology from TRL 4 to TRL 5. The project involved optimization of the Air Liquide commercial 12” PI-1 bundle to improve the bundle productivity by >30% compared to the previous baseline (DE-FE0004278), using computational fluid dynamics (CFD) modeling and bundle testing with synthetic flue gas at a 0.1 MWe bench-scale skid located at the Delaware Research and Technology Center (DRTC). In parallel, the next-generation polyimide-based novel PI-2 membrane was developed, with 10 times the CO2 permeance of the commercial PI-1 membrane. The novel PI-2 membrane was scaled from mini-permeator to 1” permeator and 1” bundle for testing. Bundle development was conducted with a Development Spin Unit (DSU) installed at MEDAL. Air Liquide’s cold membrane technology was demonstrated with real coal-fired flue gas at the National Carbon Capture Center (NCCC) with a 0.3 MWe field-test unit (FTU). The FTU was designed to incorporate testing of two PI-1 commercial membrane bundles (12” or 6” diameter) in parallel or in series. A slip stream was sent to the next-generation PI-2 membrane for testing with real flue gas. The system exceeded performance targets, with stable PI-1 membrane operation for over 500 hours of single-bundle, steady-state testing. The 12” PI-1 bundle exceeded the productivity target by achieving ~600 Nm3/hr against a target of ~455 Nm3/hr at a 90% capture rate. The cost of 90% CO2 capture from a 550 MWe net coal power plant was estimated at between $40 and $45/tonne. A 6” PI-1 bundle exhibited superior bundle performance compared to the 12” PI-1 bundle; however, the carbon capture cost was not lower with the 6” PI-1 bundle due to the higher bundle installed cost. A 1” PI-1 bundle was tested to compare bundles with different length/diameter ratios. This bundle exhibited the lowest performance due to its different fiber winding pattern and increased bundle non-ideality. Several long-term and parametric tests were conducted, with 3,200 hours of total run-time at NCCC. Finally, the new PI-2 membrane fiber was tested at small scale (1” modules) in real flue gas and exhibited up to 10 times the CO2 permeance and slightly lower CO2/N2 selectivity compared with the commercial PI-1 fiber. This corresponded to a projected 4-5 times increase in the productivity per bundle and a potential cost reduction of $3/tonne for CO2 capture, as compared with PI-1. An analytical campaign was conducted to trace different impurities, such as NOx, mercury, arsenic and selenium, in gas and liquid samples through the carbon capture system. An Environmental, Health and Safety (EH&S) analysis was completed to estimate emissions from a 550 MWe net power plant with carbon capture using the cold membrane. A preliminary design and cost analysis was completed for a 550 tpd (~25 MWe) plant to assess the capital investment and carbon capture cost for PI-1 and PI-2 membrane solutions from coal-fired flue gas. A comparison was made with an amine-based solution, showing a significant cost advantage for the membrane at this scale. Additional preliminary design and cost analysis compared coal, natural gas and SMR flue gas for carbon capture at a 550 tpd (~25 MWe) plant.

  2. The Relative Efficiency of Two Strategies for Conducting Cognitive Task Analysis

    ERIC Educational Resources Information Center

    Flynn, Catherine L.

    2012-01-01

    Cognitive task analysis (CTA) has evolved over the past half century to capture the mental decisions and analysis that experts have learned to implement when solving complex problems. Since expertise is largely automated and nonconscious, a variety of observation and interview strategies have been developed to identify the most critical cognitive…

  3. Computerized image analysis for acetic acid induced intraepithelial lesions

    NASA Astrophysics Data System (ADS)

    Li, Wenjing; Ferris, Daron G.; Lieberman, Rich W.

    2008-03-01

    Cervical Intraepithelial Neoplasia (CIN) exhibits certain morphologic features that can be identified during a visual inspection exam. Immature and dysplastic cervical squamous epithelium turns white after application of acetic acid during the exam. The whitening process occurs visually over several minutes and subjectively discriminates between dysplastic and normal tissue. Digital imaging technologies allow us to assist the physician in analyzing the acetic acid induced lesions (acetowhite regions) in a fully automatic way. This paper reports a study designed to measure multiple parameters of the acetowhitening process from two images captured with a digital colposcope. One image is captured before the acetic acid application, and the other is captured after the acetic acid application. The spatial change of the acetowhitening is extracted using color and texture information in the post acetic acid image; the temporal change is extracted from the intensity and color changes between the post acetic acid and pre acetic acid images after automatic alignment. The imaging and data analysis system has been evaluated with a total of 99 human subjects and demonstrates its potential for screening underserved women where access to skilled colposcopists is limited.

  4. Human Factors Virtual Analysis Techniques for NASA's Space Launch System Ground Support using MSFC's Virtual Environments Lab (VEL)

    NASA Technical Reports Server (NTRS)

    Searcy, Brittani

    2017-01-01

    Using virtual environments to assess complex, large-scale human tasks provides timely and cost-effective results to evaluate designs and to reduce operational risks during assembly and integration of the Space Launch System (SLS). NASA's Marshall Space Flight Center (MSFC) uses a suite of tools to conduct integrated virtual analysis during the design phase of the SLS Program. Siemens Jack is a simulation tool that allows engineers to analyze human interaction with CAD designs by placing a digital human model into the environment to test different scenarios and assess the design's compliance with human factors requirements. Engineers at MSFC are using Jack in conjunction with motion capture and virtual reality systems in MSFC's Virtual Environments Lab (VEL). The VEL provides additional capability beyond standalone Jack to record and analyze a person performing a planned task to assemble the SLS at Kennedy Space Center (KSC). The VEL integrates the Vicon Blade motion capture system, Siemens Jack, Oculus Rift, and other virtual tools to perform human factors assessments. By using motion capture and virtual reality, a more accurate breakdown and understanding of how an operator will perform a task can be gained. By virtual analysis, engineers are able to determine whether a specific task can be safely performed by both a 5th-percentile (approx. 5 ft) female and a 95th-percentile (approx. 6 ft 1 in) male. In addition, the analysis helps identify any tools or other accommodations that may be needed to complete the task. These assessments are critical for the safety of ground support engineers and for keeping launch operations on schedule. Motion capture allows engineers to save and examine human movements on a frame-by-frame basis, while virtual reality gives the actor (the person performing a task in the VEL) an immersive view of the task environment. This presentation will discuss the need for human factors analysis for SLS and the benefits of analyzing tasks in NASA MSFC's VEL.

  5. More than Meets the Eye: Age Differences in the Capture and Suppression of Oculomotor Action

    PubMed Central

    Ridderinkhof, K. Richard; Wijnen, Jasper G.

    2011-01-01

    Salient visual stimuli capture attention and reflexively trigger an eye-movement toward their location, regardless of an observer’s intentions. Here we aim to investigate the effect of aging (1) on the extent to which salient yet task-irrelevant stimuli capture saccades, and (2) on the ability to selectively suppress such oculomotor responses. Young and older adults were asked to direct their eyes to a target appearing in a stimulus array. Analysis of overall performance shows that saccades to the target object were disrupted by the appearance of a task-irrelevant abrupt-onset distractor when the location of this distractor did not coincide with that of the target object. Conditional capture function analyses revealed that, compared to young adults, older adults were more susceptible to oculomotor capture and exhibited deficient selective suppression of the responses captured by task-irrelevant distractors. These effects were uncorrelated, suggesting two independent sources of age-related decline. Thus, with advancing age, salient visual distractors become more distracting: in part because they trigger reflexive eye-movements more potently, and in part because of failing top-down control over such reflexes. The fact that these process-specific age effects remained concealed in overall oculomotor performance analyses emphasizes the utility of looking beyond the surface; indeed, there may be more than meets the eye. PMID:22046165

  6. Noise, gain, and capture probability of p-type InAs-GaAs quantum-dot and quantum dot-in-well infrared photodetectors

    NASA Astrophysics Data System (ADS)

    Wolde, Seyoum; Lao, Yan-Feng; Unil Perera, A. G.; Zhang, Y. H.; Wang, T. M.; Kim, J. O.; Schuler-Sandy, Ted; Tian, Zhao-Bing; Krishna, S.

    2017-06-01

    We report experimental results showing how the noise in Quantum-Dot Infrared Photodetectors (QDIPs) and Quantum Dot-in-a-Well (DWELL) detectors varies with the electric field and temperature. At lower temperatures (below ˜100 K), the noise current of both types of detectors is dominated by generation-recombination (G-R) noise, which is consistent with a mechanism of fluctuations driven by the electric field and thermal noise. The noise gain, capture probability, and carrier lifetime for bound-to-continuum or quasi-bound transitions in DWELL and QDIP structures are discussed. The capture probability of the DWELL is found to be more than two times higher than that of the corresponding QDIP. Based on the analysis, structural parameters such as the number of active layers, the surface density of QDs, the carrier capture or relaxation rate, the type of material, and the electric field are identified as optimization parameters to improve the gain of the devices.

  7. Development and operation of a 6LiF:ZnS(Ag)-scintillating plastic capture-gated detector

    NASA Astrophysics Data System (ADS)

    Wilhelm, K.; Nattress, J.; Jovanovic, I.

    2017-01-01

    We report on the design, construction, and operation of a capture-gated neutron detector based on a heterogeneous scintillating structure comprising two scintillator types. A flat, 500 μm thick sheet composed of a mixture of the lithium-6-fluoride capture agent, 6LiF, and zinc sulfide phosphor, ZnS(Ag), is wrapped around scintillating polyvinyl toluene (PVT) in the form of a cylinder. The 6LiF:ZnS(Ag) sheet uses an aluminum foil backing as a support for the scintillating material and as an optical reflector, and its optical properties have been characterized independently. The composite scintillator was tested using 252Cf, DD fusion, 137Cs, and 60Co sources. The intrinsic detection efficiency for neutrons from an unmoderated 252Cf source and the rejection of gammas from 137Cs were measured to be 3.6% and 10^-6, respectively. A figure of merit for pulse shape discrimination of 4.6 was achieved, and capture-gated spectroscopic analysis is demonstrated.
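    For reference, the pulse-shape-discrimination figure of merit quoted above is commonly defined from the separation of the neutron and gamma distributions of the discrimination parameter (a conventional definition, not taken from the record itself):

    \[
    \mathrm{FOM} \;=\; \frac{\lvert \mu_{n} - \mu_{\gamma} \rvert}{\mathrm{FWHM}_{n} + \mathrm{FWHM}_{\gamma}} ,
    \]

    where \(\mu\) and FWHM denote the centroid and full width at half maximum of each population.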

  8. Towards quantitative mass spectrometry-based metabolomics in microbial and mammalian systems.

    PubMed

    Kapoore, Rahul Vijay; Vaidyanathan, Seetharaman

    2016-10-28

    Metabolome analyses are a suite of analytical approaches that enable us to capture changes in the metabolome (small molecular weight components, typically less than 1500 Da) in biological systems. Mass spectrometry (MS) has been widely used for this purpose. The key challenge here is to be able to capture changes in a reproducible and reliable manner that is representative of the events that take place in vivo. Typically, the analysis is carried out in vitro, by isolating the system and extracting the metabolome. MS-based approaches enable us to capture metabolomic changes with high sensitivity and resolution. When developing the technique for different biological systems, there are similarities in the challenges as well as differences that are specific to the system under investigation. Here, we review some of the challenges in capturing quantitative changes in the metabolome with MS-based approaches, primarily in microbial and mammalian systems. This article is part of the themed issue 'Quantitative mass spectrometry'. © 2016 The Author(s).

  9. Bench-Scale Silicone Process for Low-Cost CO{sub 2} Capture

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vipperla, Ravikumar; Yee, Michael; Steele, Ray

    This report presents system and economic analysis for a carbon capture unit which uses an amino-silicone solvent for CO{sub 2} capture and sequestration (CCS) in a pulverized coal (PC) boiler. The amino-silicone solvent is based on GAP-1 with tri-ethylene glycol (TEG) as a co-solvent. For comparison purposes, the report also shows results for a CCS unit based on a conventional approach using mono-ethanol amine (MEA). At a steam temperature of 395 °C (743 °F), the CCS energy penalty for the amino-silicone solvent is only 30.4%, which compares to a 35.9% energy penalty for MEA. The increase in COE for the amino-silicone solvent relative to the non-capture case is between 98% and 103% (depending on the solvent cost), which compares to an ~109% COE increase for MEA. In summary, the amino-silicone solvent has significant advantages over conventional systems using MEA.

  10. Polyethylenimine-incorporated zeolite 13X with mesoporosity for post-combustion CO2 capture

    NASA Astrophysics Data System (ADS)

    Chen, Chao; Kim, Su-Sung; Cho, Won-Seung; Ahn, Wha-Seung

    2015-03-01

    X-type zeolite with mesoporosity (Meso-13X) was prepared using dimethyloctadecyl[3-(trimethoxysilyl)propyl]ammonium chloride as a mesopore-generating agent, and then modified with polyethylenimine (PEI) through a physical impregnation method to form a hybrid material (Meso-13X-PEI). Meso-13X with and without PEI was characterized by X-ray powder diffraction (XRD), N2 adsorption-desorption isotherms at 77 K, scanning electron microscopy (SEM), and thermogravimetric analysis (TGA). Meso-13X-PEI exhibited a higher CO2 capture capacity than PEI-modified zeolite 13X owing to its larger pore volume, which accommodates more amine species inside the pore structure; the mesoporosity also facilitates dispersion of the PEI molecules inside the pore channels. Compared to zeolite 13X, Meso-13X-PEI showed much higher CO2 capture selectivity (against N2) as well as higher CO2 capture capacity at relatively high temperature (e.g. 100 °C) and dilute CO2 concentrations relevant to post-combustion conditions.

  11. Quantitative analysis of aberrant protein glycosylation in liver cancer plasma by AAL-enrichment and MRM mass spectrometry.

    PubMed

    Ahn, Yeong Hee; Shin, Park Min; Kim, Yong-Sam; Oh, Na Ree; Ji, Eun Sun; Kim, Kwang Hoe; Lee, Yeon Jung; Kim, Sung Ho; Yoo, Jong Shin

    2013-11-07

    A lectin-coupled mass spectrometry (MS) approach was employed to quantitatively monitor aberrant protein glycosylation in liver cancer plasma. To do this, we compared the difference in the total protein abundance of a target glycoprotein between hepatocellular carcinoma (HCC) plasmas and hepatitis B virus (HBV) plasmas, as well as the difference in lectin-specific protein glycoform abundance of the target glycoprotein. Capturing the lectin-specific protein glycoforms from a plasma sample was accomplished by using a fucose-specific Aleuria aurantia lectin (AAL) immobilized onto magnetic beads via a biotin-streptavidin conjugate. Following tryptic digestion of both the total plasma and its AAL-captured fraction for each HCC and HBV sample, targeted proteomic mass spectrometry was conducted quantitatively by a multiple reaction monitoring (MRM) technique. From the MRM-based analysis of the total plasmas and AAL-captured fractions, differences between the HCC and HBV plasma groups in fucosylated glycoform levels of target glycoproteins were confirmed to arise both from changes in the total protein abundance of the target proteins and from changes incurred by aberrant fucosylation of target glycoproteins in HCC plasma, even when no significant change occurs in the total protein abundance level. Combining the MRM-based analysis method with the lectin-capturing technique proved to be a successful means of quantitatively investigating aberrant protein glycosylation in cancer plasma samples. Additionally, it was elucidated that the differences between the HCC and control groups in the fucosylated biomarker candidates A1AT and FETUA mainly originated from an increase in fucosylation levels on these target glycoproteins, rather than an increase in their total protein abundance.

  12. A novel Markov Blanket-based repeated-fishing strategy for capturing phenotype-related biomarkers in big omics data.

    PubMed

    Li, Hongkai; Yuan, Zhongshang; Ji, Jiadong; Xu, Jing; Zhang, Tao; Zhang, Xiaoshuai; Xue, Fuzhong

    2016-03-09

    We propose a novel Markov Blanket-based repeated-fishing strategy (MBRFS) in an attempt to increase the power of the existing Markov Blanket method (DASSO-MB) while maintaining its advantages in omics data analysis. Both simulation and real data analyses were conducted to assess its performance in comparison with other methods, including the χ2 test with Bonferroni and B-H adjustment, the least absolute shrinkage and selection operator (LASSO) and DASSO-MB. A series of simulation studies showed that the true discovery rate (TDR) of the proposed MBRFS was always close to zero under the null hypothesis (odds ratio = 1 for each SNP), with excellent stability in all three scenarios: independent phenotype-related SNPs without linkage disequilibrium (LD) around them, correlated phenotype-related SNPs without LD around them, and phenotype-related SNPs with strong LD around them. As expected, under different odds ratios and minor allele frequencies (MAFs), MBRFS always had the best performance in capturing the true phenotype-related biomarkers, with a higher Matthews correlation coefficient (MCC), for all three scenarios above. More importantly, since the proposed MBRFS uses a repeated-fishing strategy, it still captures phenotype-related SNPs with minor effects when such SNPs are non-significant under the χ2 test after Bonferroni multiple correction. Analyses of various real omics data, including GWAS data, DNA methylation data, gene expression data and metabolite data, indicated that the proposed MBRFS always detected reasonable biomarkers. Our proposed MBRFS can accurately capture the true phenotype-related biomarkers with a reduced false negative rate, whether the phenotype-related biomarkers are independent or correlated, as well as in the circumstance that phenotype-related biomarkers are associated with non-phenotype-related ones.
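    As a reminder of the performance metric used above, here is a minimal sketch of the Matthews correlation coefficient computed from confusion-matrix counts; the example counts are invented for illustration:

    ```python
    # Sketch: Matthews correlation coefficient (MCC) from confusion-matrix counts.
    import math

    def mcc(tp, fp, tn, fn):
        num = tp * tn - fp * fn
        den = math.sqrt((tp + fp) * (tp + fn) * (tn + fp) * (tn + fn))
        return num / den if den else 0.0

    # Example: 18 of 20 causal SNPs recovered, 5 false positives among 10,000 null SNPs
    print(round(mcc(tp=18, fp=5, tn=9995, fn=2), 3))
    ```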

  13. Acute psychoactive effects of intravenous ketamine during treatment of mood disorders: Analysis of the Clinician Administered Dissociative State Scale.

    PubMed

    van Schalkwyk, Gerrit I; Wilkinson, Samuel T; Davidson, Larry; Silverman, Wendy K; Sanacora, Gerard

    2018-02-01

    Ketamine has rapid-acting antidepressant effects. Frequently, ketamine administration also causes acute psychoactive effects; in trials, these effects are commonly measured using the Clinician Administered Dissociative State Scale (CADSS). However, the CADSS was not designed for this specific purpose, having been validated in other clinical contexts, and anecdotally does not appear to fully capture ketamine's acute psychoactive effects. Data were obtained from 110 individuals with mood disorders (predominantly major depressive disorder) who underwent intravenous ketamine infusion. An exploratory factor analysis (EFA) was performed on the CADSS, along with an assessment of internal consistency. Qualitative methods were used to conduct in-depth interviews with a subset of these participants to identify key features of the acute ketamine experience, including aspects that may not be captured by the CADSS. The mean total score of the CADSS was low, at 7.7 (SD 9.2). Analysis of internal consistency showed a Cronbach's alpha of 0.74. Five CADSS items had low correlations with the total score. The EFA led to a one-factor solution containing 16 items. Five of the six highest-loading items involved perceptual disturbances, either of time or of sensation. Qualitative analyses of 10 patient narratives revealed two phenomena not captured by the CADSS: disinhibition and a sense of peace. This study was limited by the absence of other ratings of the participants' experience. Findings suggest that the CADSS only partially captures the acute effects of ketamine administration. Further research may seek to validate a revised version of the CADSS that more accurately measures these effects. Copyright © 2017 Elsevier B.V. All rights reserved.
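    A short sketch of the internal-consistency statistic reported above (Cronbach's alpha), computed from a subjects-by-items score matrix; the random data and the assumed 23-item, 0-4 scoring are placeholders, not the study data:

    ```python
    # Sketch: Cronbach's alpha for an item score matrix.
    import numpy as np

    def cronbach_alpha(scores):
        """scores: (n_subjects, n_items) array of item scores."""
        scores = np.asarray(scores, dtype=float)
        k = scores.shape[1]
        item_vars = scores.var(axis=0, ddof=1)
        total_var = scores.sum(axis=1).var(ddof=1)
        return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

    rng = np.random.default_rng(0)
    fake_scores = rng.integers(0, 5, size=(110, 23))   # 110 subjects, 23 items, scored 0-4
    print(round(cronbach_alpha(fake_scores), 2))
    ```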

  14. Cobalt related defect levels in silicon analyzed by temperature- and injection-dependent lifetime spectroscopy

    NASA Astrophysics Data System (ADS)

    Diez, S.; Rein, S.; Roth, T.; Glunz, S. W.

    2007-02-01

    Temperature- and injection-dependent lifetime spectroscopy (TIDLS) is demonstrated as a method to characterize point defects with several energy levels in silicon. An intentionally cobalt-contaminated p-type wafer was investigated by means of lifetime measurements performed at different temperatures up to 151 °C. Two defect energy levels were required to model the lifetime curves on the basis of Shockley-Read-Hall statistics. The detailed analysis is based on the determination of the recently introduced defect parameter solution surface (DPSS) in order to extract the underlying defect parameters. A unique solution has been found for a deep defect level located in the upper band gap half with an energy depth of EC - Et = 0.38 ± 0.01 eV and a corresponding ratio of capture cross sections k = σn/σp = 0.16 within the uncertainty interval 0.06-0.69. Additionally, a deep donor level in the lower band gap half known from the literature could be assigned to a second energy level within the DPSS analysis at Et - EV = 0.41 ± 0.02 eV with a corresponding ratio of capture cross sections k = σn/σp = 16 ± 3. An investigation of the temperature dependence of the capture cross section for electrons suggests that the underlying recombination process of the defect in the lower band gap half is driven by a two-stage cascade capture with an activation energy of ΔE = 52 ± 2 meV. These results show that TIDLS in combination with DPSS analysis is a powerful method to characterize even multiple defect levels that affect the carrier recombination lifetime in parallel.
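    For context, the single-level Shockley-Read-Hall lifetime that underlies such TIDLS modelling has the standard form (a textbook expression; the two-level fit in the record combines two such contributions):

    \[
    \tau_{\mathrm{SRH}} \;=\; \frac{\tau_{p0}\,(n_{0}+n_{1}+\Delta n) \;+\; \tau_{n0}\,(p_{0}+p_{1}+\Delta n)}{n_{0}+p_{0}+\Delta n},
    \qquad k \;\equiv\; \frac{\sigma_{n}}{\sigma_{p}} \;=\; \frac{\tau_{p0}}{\tau_{n0}},
    \]

    with \(n_{1}=N_{C}\exp[-(E_{C}-E_{t})/k_{B}T]\) and \(p_{1}=N_{V}\exp[-(E_{t}-E_{V})/k_{B}T]\), which is how the fitted energy depth and the capture cross-section ratio k enter the temperature and injection dependence of the measured lifetime.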

  15. Composition of the excimer laser-induced plume produced during LASIK refractive surgery

    NASA Astrophysics Data System (ADS)

    Glickman, Randolph D.; Liu, Yun; Mayo, George L.; Baribeau, Alan D.; Starck, Tomy; Bankhead, Tom

    2003-07-01

    Because of concerns about potential hazards to surgical personnel from the plume associated with laser refractive surgery, this study was performed to characterize the composition of such plumes. Filter elements were removed from the smoke evacuator of a VISX S3 excimer laser (filter pore size ~0.3 microns) and from a Mastel Clean Room unit (filter pore size ~0.2 microns) used with a LADARVISION excimer laser. The filters from both laser systems captured the laser-induced plumes from multiple routine LASIK patient procedures. Some filters were processed for scanning electron microscopy, while others were extracted with methanol and chloroform for biochemical analysis. Both the VISX "Final Air" filter and the Mastel "Clean Room" filter captured material that was not observed in filters through which only clean operating room air had been passed. In the VISX system, air flows through the filter unit parallel to the filter matrix. SEM analysis showed these filters captured discrete particles 0.3 to 3.0 microns in size. In the Mastel Clean Room unit, air flows orthogonally through the filter, and the filter matrix was heavily layered with captured debris so that individual particles were not readily distinguished. Amino acid analysis and gel electrophoresis of the extracted material revealed proteinaceous molecules of up to 5000 molecular weight. Such large molecules in the laser plume are not predicted by the existing theory of photochemical ablation. The presence of relatively large biomolecules may constitute a risk of allergenic reactions in personnel exposed to the plume, and also calls into question the precise mechanism of excimer laser photochemical ablation. Supported by the RMG Research Endowment and Research to Prevent Blindness.

  16. Whole exome sequencing is an efficient, sensitive and specific method of mutation detection in osteogenesis imperfecta and Marfan syndrome

    PubMed Central

    McInerney-Leo, Aideen M; Marshall, Mhairi S; Gardiner, Brooke; Coucke, Paul J; Van Laer, Lut; Loeys, Bart L; Summers, Kim M; Symoens, Sofie; West, Jennifer A; West, Malcolm J; Paul Wordsworth, B; Zankl, Andreas; Leo, Paul J; Brown, Matthew A; Duncan, Emma L

    2013-01-01

    Osteogenesis imperfecta (OI) and Marfan syndrome (MFS) are common Mendelian disorders. Both conditions are usually diagnosed clinically, as genetic testing is expensive due to the size and number of potentially causative genes and mutations. However, genetic testing may benefit patients, at-risk family members and individuals with borderline phenotypes, as well as improving genetic counseling and allowing critical differential diagnoses. We assessed whether whole exome sequencing (WES) is a sensitive method for mutation detection in OI and MFS. WES was performed on genomic DNA from 13 participants with OI and 10 participants with MFS who had known mutations, with exome capture followed by massively parallel sequencing of multiplexed samples. Single nucleotide polymorphisms (SNPs) and small indels were called using the Genome Analysis Toolkit (GATK) and annotated with ANNOVAR. CREST, exomeCopy and exomeDepth were used for large deletion detection. Results were compared with the previously known mutation data. Specificity was calculated by screening WES data from a control population of 487 individuals for mutations in COL1A1, COL1A2 and FBN1. The target capture of five exome capture platforms was compared. All 13 mutations in the OI cohort and 9/10 in the MFS cohort were detected (sensitivity = 95.6%), including non-synonymous SNPs, small indels (<10 bp), and a large UTR5/exon 1 deletion. One mutation was not detected by GATK due to strand bias. Specificity was 99.5%. Capture platforms and analysis programs differed considerably in their ability to detect mutations. Consumable costs for WES were low. WES is an efficient, sensitive, specific and cost-effective method for mutation detection in patients with OI and MFS. Careful selection of platform and analysis programs is necessary to maximize success. PMID:24501682

  17. Improved Vote Aggregation Techniques for the Geo-Wiki Cropland Capture Crowdsourcing Game

    NASA Astrophysics Data System (ADS)

    Baklanov, Artem; Fritz, Steffen; Khachay, Michael; Nurmukhametov, Oleg; Salk, Carl; See, Linda; Shchepashchenko, Dmitry

    2016-04-01

    Crowdsourcing is a new approach for solving data processing problems for which conventional methods appear to be inaccurate, expensive, or time-consuming. Nowadays, the development of new crowdsourcing techniques is mostly motivated by so-called Big Data problems, including problems of assessment and clustering for large datasets obtained in aerospace imaging, remote sensing, and even in social network analysis. By involving volunteers from all over the world, the Geo-Wiki project tackles problems of environmental monitoring with applications to flood resilience, biomass data analysis and classification of land cover. For example, the Cropland Capture Game, which is a gamified version of Geo-Wiki, was developed to aid in the mapping of cultivated land, and was used to gather 4.5 million classifications of images of the Earth's surface. More recently, the Picture Pile game, which is a more generalized version of Cropland Capture, aims to identify tree loss over time from pairs of very high resolution satellite images. Despite recent progress in image analysis, the solution to these problems is hard to automate, since human experts still outperform the majority of machine learning algorithms and artificial systems on certain image recognition tasks. The replacement of rare and expensive experts by a team of distributed volunteers seems promising, but this approach leads to challenging questions such as: how can individual opinions be aggregated optimally, how can confidence bounds be obtained, and how can the unreliability of volunteers be dealt with? In this paper, on the basis of several known machine learning techniques, we propose a technical approach to improve the overall performance of the majority voting decision rule used in the Cropland Capture Game. The proposed approach increases the estimated consistency with expert opinion from 77% to 86%.
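
    As a rough sketch of the baseline aggregation rule that the paper sets out to improve (plain majority voting over volunteer labels; the data layout below is assumed, not taken from the paper):

      from collections import Counter

      # Hypothetical volunteer votes per image: True = "contains cropland".
      votes = {
          "img_001": [True, True, False, True],
          "img_002": [False, False, True],
          "img_003": [True, False],            # a tie; needs a convention
      }

      def majority_vote(labels):
          """Return the majority label, or None when the vote is tied."""
          ranked = Counter(labels).most_common()
          if len(ranked) > 1 and ranked[0][1] == ranked[1][1]:
              return None
          return ranked[0][0]

      decisions = {img: majority_vote(v) for img, v in votes.items()}
      print(decisions)  # {'img_001': True, 'img_002': False, 'img_003': None}

    The techniques in the paper replace this unweighted rule with aggregation that accounts for volunteer reliability, which is how the reported consistency with expert opinion rises from 77% to 86%.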

  18. Membrane microfilter device for selective capture, electrolysis and genomic analysis of human circulating tumor cells.

    PubMed

    Zheng, Siyang; Lin, Henry; Liu, Jing-Quan; Balic, Marija; Datar, Ram; Cote, Richard J; Tai, Yu-Chong

    2007-08-31

    This paper presents the development of a parylene membrane microfilter device for single-stage capture and electrolysis of circulating tumor cells (CTCs) in human blood, and the potential of this device to allow genomic analysis. The presence and number of CTCs in blood have recently been demonstrated to provide significant prognostic information for patients with metastatic breast cancer. While finding as few as five CTCs in about 7.5 mL of blood (i.e., on the order of 10^10 blood cells) is clinically significant, detection of CTCs is currently difficult and time-consuming. CTC enrichment is performed by either gradient centrifugation of CTCs based on their buoyant density or magnetic separation of epithelial CTCs, both of which are laborious procedures with variable efficiency, and CTC identification is typically done by trained pathologists through visual observation of stained cytokeratin-positive epithelial CTCs. These processes may take hours, if not days. The work presented here provides a micro-electro-mechanical system (MEMS)-based option to make this process simpler, faster, better and cheaper. We exploited the size difference between CTCs and human blood cells to achieve CTC capture on the filter with approximately 90% recovery within 10 min, which is superior to current approaches. Following capture, we facilitated polymerase chain reaction (PCR)-based genomic analysis by performing on-membrane electrolysis with embedded electrodes reaching each of the 16,000 individual filtering pores. The biggest advantage of this on-membrane in situ cell lysis is its high efficiency, since cells are immobilized, allowing direct contact with the electrodes. As a proof of principle, we show beta-actin gene PCR; the same technology can easily be extended to real-time PCR of CTC-specific transcripts to allow molecular identification of CTCs and their further characterization.

  19. Preliminary system design of a Three Arm Capture Mechanism (TACM) flight demonstration article

    NASA Technical Reports Server (NTRS)

    Schaefer, Otto; Stasi, Bill

    1993-01-01

    The overall objective of the Three Arm Capture Mechanism (TACM) is to serve as a demonstration of capability for capture of objects in space. These objects could be satellites, expended boosters, pieces of debris, etc.; anything of significant size. With this capability we can significantly diminish the danger of major collisions of debris with valuable space assets and with each other, which would otherwise produce many smaller, high velocity pieces of debris which also become concerns. The captured objects would be jettisoned into the atmosphere, relocated in 'parking' orbits, or recovered for disposition or refurbishment. The dollar value of satellites launched into space continues to grow along with the cost of insurance; having a capture capability takes a positive step towards diminishing this added cost. The effort covered is a planning step towards a flight demonstration of the satellite capture capability. Based on the requirement to capture a communication class satellite, its associated booster, or both, a preliminary system definition of a retrieval kit is defined. The objective of the flight demonstration is to demonstrate the techniques proposed to perform the mission and to obtain data on technical issues requiring an in situ space environment. The former especially includes issues such as automated image recognition techniques and control strategies that enable an unmanned vehicle to rendezvous and capture a satellite, contact dynamics between the two bodies, and the flight segment level of automation required to support the mission. A development plan for the operational retrieval capability includes analysis work, computer and ground test simulations, and finally a flight demonstration. A concept to perform a selected mission capturing a precessing communications satellite is described. Further development efforts using analytical tools and laboratory facilities are required prior to reaching the point at which a full commitment to the flight demonstration design can be made.

  20. Ecological Footprint Analysis (EFA) for the Chicago Metropolitan Area: Initial Estimation - slides

    EPA Science Inventory

    Because of its computational simplicity, Ecological Footprint Analysis (EFA) has been extensively deployed for assessing the sustainability of various environmental systems. In general, EFA aims at capturing the impacts of human activity on the environment by computing the amount...

  1. Comparison of the environmental performance of thermal electricity generation with and without geological sequestration of carbon dioxide

    NASA Astrophysics Data System (ADS)

    Bellerive, Nathalie

    The research project hypothesis is that carbon dioxide capture and sequestration (CCS) technologies lead to a significant decrease in global warming impacts but increase the impacts in all other categories studied, because the additional processes used for CO2 capture and sequestration require additional quantities of raw materials and energy. Two other objectives are described in this project. The first is the modeling of an Integrated Gasification Combined Cycle power plant, for which no generic data were available. The second is to select appropriate assumptions regarding electricity production technologies, CO2 capture, compression, transportation by pipeline and, finally, sequestration. Life Cycle Assessment (LCA) was chosen as the analysis method for this research project. LCA is an exhaustive quantitative method used to evaluate the potential environmental impacts associated with a product, a service or an activity from resource extraction to waste elimination. The approach is governed by ISO 14040 through ISO 14049 and is supported by the Society of Environmental Toxicology and Chemistry (SETAC) and the United Nations Environment Programme (UNEP). Two power plants were studied: an Integrated Gasification Combined Cycle (IGCC) power plant and a Natural Gas Combined Cycle (NGCC) power plant. In order to sequester CO2 in a geological formation, it is necessary to extract CO2 from the emission streams. For the IGCC power plant, CO2 was captured before combustion (pre-combustion capture); for the NGCC power plant, capture was done after combustion (post-combustion capture). Once the CO2 was isolated, it was compressed and directed through a 1000 km transportation pipeline running over land and under the sea. It is hypothesized that the power plant is 300 km from the shore and that the sequestration platform is 700 km from the French shore, in the North Sea. The IGCC power plant modeling and the data selection for CO2 capture and sequestration were done using primary data from industry and the Ecoinvent generic database (version 1.2), chosen for its European scope. Finally, technical calculations and the literature were used to complete the data inventory, which was validated by electricity-sector experts to increase data and modeling precision. Results were similar for the IGCC and NGCC power plants using IMPACT 2002+, an impact assessment method. Global warming potential decreased by 67% with the implementation of CO2 capture and sequestration compared to systems without CCS. All other impact categories showed increases of 16% to 116% in relative terms compared to systems without CCS. The main contributor was the additional energy required to operate the CO2 capture and compression facilities; this energy penalty reduced the power plant's overall efficiency and increased the quantity of fossil fuel that had to be extracted and consumed. The increase in the other impacts was mainly due to the additional electricity, the fossil fuel (extraction, treatment and transportation) and the additional emissions generated during power plant operation. A scenario analysis was done to study the sensitivity and variability of uncertain data in the power plant model. Power plant efficiency was the most variable and sensitive parameter, followed by the length of the transportation pipeline and the leakage rate during CO2 sequestration.
    This scenario analysis is noteworthy because the maximum-efficiency scenario with capture (with a short CO2 transportation distance and a low leakage rate) obtained better results on all impact category indicators than the minimum-efficiency scenario without capture. In fact, favourable results on all category indicators were possible in the comparison between the two cases (with and without capture). (Abstract shortened by UMI.)

  2. Towards an Interoperability Ontology for Software Development Tools

    DTIC Science & Technology

    2003-03-01

    The description of feature models was tied to the introduction of the Feature-Oriented Domain Analysis (FODA*) [KANG90] approach in the late eighties... Feature-oriented domain analysis (FODA) is a domain analysis method developed at the Software...ese obstacles was to construct a "pilot" ontology that is extensible. We applied the Feature-Oriented Domain Analysis approach to capture the

  3. Innovative Use of Cr(VI) Plume Depictions and Pump-and-Treat Capture Analysis to Estimate Risks of Contaminant Discharge to Surface Water at Hanford Reactor Areas

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Miller, Chuck W.; Hanson, James P.; Ivarson, Kristine A.

    2015-01-14

    The Hanford Site nuclear reactor operations required large quantities of high-quality cooling water, which was treated with chemicals including sodium dichromate dihydrate for corrosion control. Cooling water leakage, as well as intentional discharge of cooling water to ground during upset conditions, produced extensive groundwater recharge mounds consisting largely of contaminated cooling water and resulted in wide distribution of hexavalent chromium (Cr[VI]) contamination in the unconfined aquifer. The 2013 Cr(VI) groundwater plumes in the 100 Areas cover approximately 6 km2 (1500 acres), primarily in the 100-HR-3 and 100-KR-4 groundwater operable units (OUs). The Columbia River is a groundwater discharge boundary; where the plumes are adjacent to the Columbia River there remains a potential to discharge Cr(VI) to the river at concentrations above water quality criteria. The pump-and-treat systems along the River Corridor are operating with two main goals: 1) protection of the Columbia River, and 2) recovery of contaminant mass. An evaluation of the effectiveness of the pump-and-treat systems was needed to determine whether the Columbia River was protected from contamination, and also to determine where additional system modifications may be needed. In response to this need, a technique for assessing river protection was developed that takes into consideration seasonal migration of the plume and hydraulic performance of the operating well fields. Groundwater contaminant plume maps are generated across the Hanford Site on an annual basis. The assessment technique overlays the annual plume maps and the capture efficiency maps for the various pump-and-treat systems. The river protection analysis technique was prepared for use at the Hanford Site and is described in detail by M.J. Tonkin (2013). Interpolated capture frequency maps, based on mapping dynamic water levels observed in monitoring wells and derived water levels in the vicinity of extraction and injection wells, are developed first. Second, simulated capture frequency maps are developed based on transport modelling results. Both interpolated and simulated capture frequency maps are based on operation of the systems over a full year. These two capture maps are then overlaid on the plume distribution maps for inspection of the relative orientation of the contaminant plumes with respect to the capture frequency. To quantify the relative degree of protection of the river from discharges of Cr(VI) (and conversely, the degree of threat) at any particular location, a systematic method of evaluating and mapping the plume/capture relationship was developed. By comparing the spatial relationship between contaminant plumes and hydraulic capture frequency, an index of relative protectiveness is developed and the results are posted on the combined plume/capture plan-view map. Areas exhibiting lesser degrees of river protection are identified for remedial process optimization actions to control plumes and prevent continuing discharge of Cr(VI) to the river.
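
    As a rough illustration of the overlay idea only (not taken from the Hanford reports; the grids, the water-quality criterion and the index formula below are assumptions), a gridded plume map and a capture-frequency map can be combined cell by cell so that low capture over above-criterion concentrations flags areas of lower protectiveness:

      import numpy as np

      # Hypothetical gridded inputs of identical shape: Cr(VI) concentration (ug/L)
      # and annual hydraulic capture frequency (0..1) for the same cells.
      cr6 = np.array([[5.0, 12.0, 30.0],
                      [8.0, 25.0, 60.0],
                      [2.0, 10.0, 45.0]])
      capture_freq = np.array([[0.9, 0.8, 0.4],
                               [0.9, 0.6, 0.3],
                               [1.0, 0.7, 0.2]])

      WQ_CRITERION = 10.0  # assumed water-quality criterion, ug/L

      # Only cells where the plume exceeds the criterion pose a discharge threat.
      exceeds = cr6 > WQ_CRITERION

      # Simple protectiveness index: the capture frequency where the criterion is
      # exceeded, and 1.0 (fully protective) elsewhere; low values flag candidate
      # areas for remedial process optimization.
      protectiveness = np.where(exceeds, capture_freq, 1.0)
      print(np.round(protectiveness, 2))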

  4. CO2 Capture from the Air: Technology Assessment and Implications for Climate Policy

    NASA Astrophysics Data System (ADS)

    Keith, D. W.

    2002-05-01

    It is physically possible to capture CO2 directly from the air and immobilize it in geological structures. Today, there are no large-scale technologies that achieve air capture at reasonable cost. Yet, strong arguments suggest that it will be comparatively easy to develop practical air capture technologies on the timescales relevant to climate policy [1]. This paper first analyzes the cost of air capture and then assesses the implications for climate policy. We first analyze the lower bound on the cost needed for air capture, describing the thermodynamic and physical limits to the use of energy and land. We then compare the costs of air capture to the cost of capture from combustion exhaust streams. While the intrinsic minimum energy requirement is larger for air capture, we argue that air capture has important structural advantages, such as the reduction of transport costs and the larger potential for economies of scale. These advantages suggest that, in the long run, air capture will be competitive with other methods of achieving deep emissions reductions. We provide a preliminary engineering-economic analysis of an air capture system based on CaO to CaCO3 chemical looping [1]. We analyze the possibility of doing the calcination in a modified pressurized fluidized bed combustor (PFBC) burning coal in a CO2-rich atmosphere with oxygen supplied by an air separation unit. The CaCO3-to-coal ratio would be ~2:1 and the system would be nearly thermally neutral. PFBC systems have been demonstrated at capacities of over 100 MW. Such systems already include CaCO3 injection for sulfur control, and operate at suitable temperatures and pressures for calcination. We assess the potential to recover heat from the dissolution of CaO in order to reduce the overall energy requirements. We analyze the possibility of adapting existing large water/air heat exchangers for use as contacting systems to capture CO2 from the air using the calcium hydroxide solution. The implications of air capture for global climate policy are examined using DIAM [2], a stylized integrated assessment model. We find that air capture can fundamentally alter the temporal dynamics of global warming mitigation. The reason for this is that air capture differs from conventional mitigation in three key aspects. First, it removes emissions from any part of the economy with equal ease or difficulty, so its cost provides an absolute cap on the cost of mitigation. Second, it permits reduction in concentrations faster than the natural carbon cycle: the effects of irreversibility are thus partly alleviated. Third, because it is less coupled with the energy system, air capture may offer stronger economies of scale and smaller adjustment costs than the more conventional mitigation technologies. Air capture limits the total cost of a worst-case climate scenario. In an optimal sequential decision framework with uncertainty, existence of air capture decreases the need for near-term precautionary abatement. Like geoengineering, air capture thus poses a moral hazard. 1. S. Elliott, et al. Compensation of atmospheric CO2 buildup through engineered chemical sinkage. Geophys. Res. Lett., 28:1235-1238, 2001. 2. Minh Ha-Duong, Michael J. Grubb, and Jean-Charles Hourcade. Influence of socioeconomic inertia and uncertainty on optimal CO2-emission abatement. Nature, 390: 270-274, 1997.
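
    As an illustration of the thermodynamic limit referred to above (a standard dilute-separation estimate, not a figure taken from this paper), the minimum work to extract CO2 at atmospheric mole fraction y (roughly 4 x 10^-4) into a pure stream at ambient temperature is approximately

      W_{\min} \approx RT\,\ln\frac{1}{y} \approx (8.314\ \mathrm{J\,mol^{-1}\,K^{-1}})(298\ \mathrm{K})\,\ln\frac{1}{4\times 10^{-4}} \approx 19\ \mathrm{kJ\ per\ mol\ CO_2} \approx 0.44\ \mathrm{GJ\ per\ tonne\ CO_2},

    a small number compared with practical process energies, which is why the argument above turns on structural and engineering factors rather than on the thermodynamic floor alone.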

  5. Identifying sources of heterogeneity in capture probabilities: An example using the Great Tit Parus major

    USGS Publications Warehouse

    Senar, J.C.; Conroy, M.J.; Carrascal, L.M.; Domenech, J.; Mozetich, I.; Uribe, F.

    1999-01-01

    Heterogeneous capture probabilities are a common problem in many capture-recapture studies. Several methods of detecting the presence of such heterogeneity are currently available, and stratification of data has been suggested as the standard method to avoid its effects. However, few studies have tried to identify sources of heterogeneity, or whether there are interactions among sources. The aim of this paper is to suggest an analytical procedure to identify sources of capture heterogeneity. We use data on the sex and age of Great Tits captured in baited funnel traps at two localities differing in average temperature. We additionally use 'recapture' data obtained by videotaping at a feeder (with no associated trap), where tits ringed with different colours were recorded. This allowed us to test whether individuals in different classes (age, sex and condition) are not trapped because of trap shyness or because of a reduced use of the bait. We used logistic regression analysis of the capture probabilities to test for the effects of age, sex, condition, location and 'recapture' method. The results showed a higher recapture probability in the colder locality. Yearling birds (either males or females) had the highest recapture probabilities, followed by adult males, while adult females had the lowest recapture probabilities. There was no effect of the method of 'recapture' (trap or videotape), which suggests that adult females are less often captured in traps not because of trap shyness but because of less dependence on supplementary food. The potential use of this methodological approach in other studies is discussed.
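
    A minimal sketch of the kind of analysis described (logistic regression of recapture on candidate sources of heterogeneity; the data here are simulated with made-up effect sizes purely for illustration):

      import numpy as np
      import pandas as pd
      import statsmodels.formula.api as smf

      rng = np.random.default_rng(1)
      n = 200

      # Simulated per-bird records with the factors considered in the study.
      df = pd.DataFrame({
          "age":      rng.choice(["yearling", "adult"], size=n),
          "sex":      rng.choice(["M", "F"], size=n),
          "locality": rng.choice(["cold", "warm"], size=n),
          "method":   rng.choice(["trap", "video"], size=n),
      })

      # Made-up effects: higher recapture for yearlings and in the cold locality,
      # lower for adult females; no effect of the 'recapture' method.
      logit_p = (-0.3
                 + 0.9 * (df.age == "yearling")
                 + 0.7 * (df.locality == "cold")
                 - 0.8 * ((df.age == "adult") & (df.sex == "F")))
      df["recaptured"] = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))

      # Logistic regression of recapture probability; interactions such as
      # age:sex enter through the * operator in the formula.
      model = smf.logit("recaptured ~ age * sex + locality + method", data=df).fit(disp=False)
      print(model.summary())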

  6. Evaluation of Solid Sorbents as a Retrofit Technology for CO 2 Capture

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sjostrom, Sharon

    2016-06-02

    ADA completed a DOE-sponsored program titled Evaluation of Solid Sorbents as a Retrofit Technology for CO2 Capture under program DE-FE0004343. During this program, sorbents were analyzed for use in a post-combustion CO2 capture process. A supported amine sorbent was selected based upon its superior performance in adsorbing a greater amount of CO2 than the activated carbon sorbents tested. Once the most promising sorbent available at the time was selected, it was characterized and used to create a preliminary techno-economic analysis (TEA). A preliminary 550 MW coal-fired power plant using Illinois #6 bituminous coal was designed with a solid sorbent CO2 capture system using the selected supported amine sorbent, both to facilitate the TEA and to create the necessary framework to scale down the design to a 1 MWe equivalent slipstream pilot facility. The preliminary techno-economic analysis showed promising results and potential for improved performance for CO2 capture compared to conventional MEA systems. As a result, a 1 MWe equivalent solid sorbent system was designed, constructed, and then installed at a coal-fired power plant in Alabama. The pilot was designed to capture 90% of the CO2 from the incoming flue gas at 1 MWe net electrical generating equivalent. Testing was not possible at the design conditions due to changes in sorbent handling characteristics at post-regenerator temperatures that were not properly incorporated into the pilot design. Thus, severe pluggage occurred at nominally 60% of the design sorbent circulation rate with heated sorbent, although no handling issues were noted when the system was operated prior to bringing the regenerator to operating temperature. Testing within the constraints of the pilot plant resulted in 90% capture of the incoming CO2 at a flow rate equivalent of 0.2 to 0.25 MWe net electrical generating equivalent. The reduction in equivalent flow rate at 90% capture was primarily the result of sorbent circulation limitations at operating temperatures combined with pre-loading of the sorbent with CO2 prior to entering the adsorber. Specifically, CO2-rich gas was utilized to convey sorbent from the regenerator to the adsorber. This gas was nominally 45°C below the regenerator temperature during testing. ADA's post-combustion capture system, with modifications to overcome the pilot constraints and with a sorbent having a CO2 working capacity of 15 g CO2/100 g sorbent and a flue-gas contact time of 10 to 15 minutes or less, could provide significant cost and performance benefits compared to an MEA system.

  7. Loss of pace capture on the ablation line: a new marker for complete radiofrequency lesions to achieve pulmonary vein isolation.

    PubMed

    Steven, Daniel; Reddy, Vivek Y; Inada, Keiichi; Roberts-Thomson, Kurt C; Seiler, Jens; Stevenson, William G; Michaud, Gregory F

    2010-03-01

    Catheter ablation procedures for atrial fibrillation (AF) often involve circumferential antral isolation of pulmonary veins (PV). Inability to reliably identify conduction gaps on the ablation line necessitates placing additional lesions within the intended lesion set. This pilot study investigated the relationship between loss of pace capture directly along the ablation line and electrogram criteria for PV isolation (PVI). Using a 3-dimensional anatomic mapping system and irrigated-tip radiofrequency (RF) ablation catheter, lesions were placed in the PV antra to encircle ipsilateral vein pairs until pace capture at 10 mA/2 ms no longer occurred along the line. During ablation, a circular mapping catheter was placed in an ipsilateral PV, but the electrograms were not revealed until loss-of-pace capture. The procedural end point was PVI (entrance and exit block). Thirty patients (57 +/- 12 years; 15 male [50%]) undergoing PVI in 2 centers (3 primary operators) were included (left atrial diameter 40 +/- 4 mm, left ventricular ejection fraction 60 +/- 7%). All patients reached the end points of complete PVI and loss of pace capture. When PV electrograms were revealed after loss of pace capture along the line, PVI was present in 57 of 60 (95%) vein pairs. In the remaining 3 of 60 (5%) PV pairs, further RF applications achieved PVI. The procedure duration was 237 +/- 46 minutes, with a fluoroscopy time of 23 +/- 9 minutes. Analysis of the blinded PV electrograms revealed that even after PVI was achieved, additional sites of pace capture were present on the ablation line in 30 of 60 (50%) of the PV pairs; 10 +/- 4 additional RF lesions were necessary to fully achieve loss of pace capture. After ablation, the electrogram amplitude was lower at unexcitable sites (0.25 +/- 0.15 mV vs. 0.42 +/- 0.32 mV, P < .001), but there was substantial overlap with pace capture sites, suggesting that electrogram amplitude lacks specificity for identifying pace capture sites. Complete loss of pace capture directly along the circumferential ablation line correlates with entrance block in 95% of vein pairs and can be achieved without circular mapping catheter guidance. Thus, pace capture along the ablation line can be used to identify conduction gaps. Interestingly, more RF ablation energy was required to achieve loss of pace capture along the ablation line than for entrance block into PVs. Further study is warranted to determine whether this method results in more durable ablation lesions that reduce recurrence of AF. Copyright 2010 Heart Rhythm Society. Published by Elsevier Inc. All rights reserved.

  8. Trajectory determinations and collection of micrometeoroids on the space station. Report of the Workshop on Micrometeorite Capture Experiments

    NASA Technical Reports Server (NTRS)

    Hoerz, F. (Editor)

    1986-01-01

    Summaries of papers presented at the Workshop on Micrometeorite Capture Experiments are compiled. The goals of the workshop were to define the scientific objectives and the resulting performance requirements of a potential Space Station facility and to identify the major elements of a coherent development program that would generate the desired capabilities within the next decade. Specific topics include cosmic dust and space debris collection techniques, particle trajectory and source determination, and specimen analysis methods.

  9. Smart Hydrogel Particles: Biomarker Harvesting: One-step affinity purification, size exclusion, and protection against degradation

    PubMed Central

    Luchini, Alessandra; Geho, David H.; Bishop, Barney; Tran, Duy; Xia, Cassandra; Dufour, Robert; Jones, Clint; Espina, Virginia; Patanarut, Alexis; Zhu, Weidong; Ross, Mark; Tessitore, Alessandra; Petricoin, Emanuel; Liotta, Lance A.

    2010-01-01

    Disease-associated blood biomarkers exist in exceedingly low concentrations within complex mixtures of high-abundance proteins such as albumin. We have introduced an affinity bait molecule into N-isopropylacrylamide to produce a particle that will perform three independent functions within minutes, in one step, in solution: a) molecular size sieving, b) affinity capture of all solution-phase target molecules, and c) complete protection of harvested proteins from enzymatic degradation. The captured analytes can be readily electroeluted for analysis. PMID:18076201

  10. Design of a MATLAB(registered trademark) Image Comparison and Analysis Tool for Augmentation of the Results of the Ann Arbor Distortion Test

    DTIC Science & Technology

    2016-06-25

    The equipment used in this procedure includes: Ann Arbor distortion tester with 50-line grating reticule, IQeye 720 digital video camera with 12...and import them into MATLAB. In order to digitally capture images of the distortion in an optical sample, an IQeye 720 video camera with a 12... video camera and Ann Arbor distortion tester. Figure 8. Computer interface for capturing images seen by IQeye 720 camera. Once an image was

  11. Analysis of Cadmium Based Neutron Detector Configurations

    NASA Astrophysics Data System (ADS)

    James, Brian; Rees, Lawrence; Czirr, J. Bart

    2012-10-01

    Due to national security concerns pertaining to the smuggling of special nuclear materials and the limited supply of He-3 for use in neutron detectors, there is currently a need for a new kind of neutron detector. Using Monte Carlo techniques, I have studied the neutron capture efficiency of an array of cadmium wedge detectors in the presence of a californium source. By using varying numbers of wedges and comparing their capture ratios, we will be better able to design future detectors.
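
    As a toy illustration of the Monte Carlo idea only (not the authors' simulation, which concerns a californium source and a wedge geometry; the cross-section and thickness below are assumed, order-of-magnitude values for thermal neutrons on cadmium):

      import numpy as np

      rng = np.random.default_rng(0)

      # 1-D toy estimate of capture efficiency in a cadmium layer: sample
      # exponential interaction depths and count neutrons absorbed in the layer.
      SIGMA_CAPTURE = 115.0   # assumed macroscopic capture cross-section, 1/cm
      THICKNESS_CM = 0.05     # assumed cadmium thickness, cm

      n_neutrons = 1_000_000
      depth = rng.exponential(scale=1.0 / SIGMA_CAPTURE, size=n_neutrons)
      efficiency = (depth < THICKNESS_CM).mean()
      print(f"estimated capture efficiency: {efficiency:.3f}")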

  12. Sampling and Analysis of Organic Molecules in the Plumes of Enceladus

    NASA Astrophysics Data System (ADS)

    Monroe, A. A.; Williams, P.; Anbar, A. D.; Tsou, P.

    2012-12-01

    The recent detection of organic molecules in the plumes of Enceladus, which also contain water and nitrogen (Waite et al., 2006; Matson et al., 2007), suggests that the geologically active South polar region contains habitable, subsurface water (McKay et al., 2008). Characterizing these molecules will be a high priority for any future mission to Enceladus. Sample return is highly desirable, but can it capture useful samples at Enceladus? Using Stardust mission parameters for comparison, we consider the survival of complex organic molecules during collection to assess the feasibility of one aspect of a sample return mission. A successful sample return mission must include the capability to capture and recover intact or partly intact molecules of particular astrobiological interest: lipids, amino and nucleic acids, polypeptides, and polynucleotides. The Stardust mission to comet Wild 2 successfully captured amino acids, amines, and PAHs using a combination of aerogel and Al foil (Sandford et al., 2006, 2010). For larger and more fragile molecules, particularly polypeptides and polynucleotides, low collisional damage is achieved by impact on low molecular weight surfaces. A particularly intriguing possibility is a capture surface pre-coated with organic matrices identified as ideal for analysis of various biomolecules using MALDI-MS (matrix-assisted laser desorption/ionization mass spectrometry) (Hillenkamp and Karas, 2007). MALDI is a standard technique with attomole sensitivity, exceptional mass resolution, and (bio)molecular specificity (Vestal, 2011). Capture surfaces appropriate for MALDI-MS analysis could be analyzed directly without post-return manipulation, minimizing post-capture damage to these molecules and the risk of contamination during handling. A hypothetical sample collection encounter speed of ~ 5 km/s corresponds to ~0.13 eV kinetic energy per amu. Studies of molecule survival and fragmentation exist for free hexapeptides impacting hydrocarbon surfaces in this energy range (Gu et al., 1999). Although a significant fraction of polypeptides fragment at these energies, typically only a subset of all the peptide bonds are cleaved, preserving some sequence information (Gu et al., 1999). Molecules encapsulated in ice grains may also be encountered and collected. It has been demonstrated that polypeptides and even nucleic acids can survive ice grain impacts at these energies because ice grain vaporization absorbs much of the impact energy (Aksyonov and Williams, 2001). For either scenario—isolated molecule or ice grain impact—molecules or significant fragments will mostly depart the initial impact surface at low energies and can be collected on adjacent capture surfaces. These preliminary considerations suggest that molecular sample return from Enceladus is feasible and would allow characterization with the full sensitivity and resolving power of modern terrestrial biomolecular mass spectrometry.
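
    As a quick check of the quoted encounter-energy figure (standard kinematics, not an additional result of the abstract), the kinetic energy per atomic mass unit at a ~5 km/s encounter speed is

      E = \tfrac{1}{2} m v^{2} = \tfrac{1}{2}\,(1.66\times 10^{-27}\ \mathrm{kg})\,(5\times 10^{3}\ \mathrm{m\,s^{-1}})^{2} \approx 2.1\times 10^{-20}\ \mathrm{J} \approx 0.13\ \mathrm{eV\ per\ amu}.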

  13. Knowledge-based requirements analysis for automating software development

    NASA Technical Reports Server (NTRS)

    Markosian, Lawrence Z.

    1988-01-01

    We present a new software development paradigm that automates the derivation of implementations from requirements. In this paradigm, informally stated requirements are expressed in a domain-specific requirements specification language. This language is machine-understandable, and requirements expressed in it are captured in a knowledge base. Once the requirements are captured, more detailed specifications and eventually implementations are derived by the system using transformational synthesis. A key characteristic of the process is that the required human intervention is in the form of providing problem- and domain-specific engineering knowledge, not in writing detailed implementations. We describe a prototype system that applies the paradigm in the realm of communication engineering: the prototype automatically generates implementations of buffers following analysis of the requirements on each buffer.

  14. Verification of Orthogrid Finite Element Modeling Techniques

    NASA Technical Reports Server (NTRS)

    Steeve, B. E.

    1996-01-01

    The stress analysis of orthogrid structures, specifically with I-beam sections, is regularly performed using finite elements. Various modeling techniques are often used to simplify the modeling process but still adequately capture the actual hardware behavior. The accuracy of such 'short cuts' is sometimes in question. This report compares three modeling techniques to actual test results from a loaded orthogrid panel. The finite element models include a beam, a shell, and a mixed beam-and-shell element model. Results show that the shell element model performs the best, but that the simpler beam and beam-and-shell element models provide reasonable to conservative results for a stress analysis. When deflection and stiffness are critical, it is important to capture the effect of the orthogrid nodes in the model.

  15. Bad data packet capture device

    DOEpatents

    Chen, Dong; Gara, Alan; Heidelberger, Philip; Vranas, Pavlos

    2010-04-20

    An apparatus and method for capturing data packets for analysis on a network computing system includes a sending node and a receiving node connected by a bi-directional communication link. The sending node sends a data transmission to the receiving node on the bi-directional communication link, and the receiving node receives the data transmission and verifies the data transmission to determine valid data and invalid data and verify retransmissions of invalid data as corresponding valid data. A memory device communicates with the receiving node for storing the invalid data and the corresponding valid data. A computing node communicates with the memory device and receives and performs an analysis of the invalid data and the corresponding valid data received from the memory device.

  16. Analyzing crime scene videos

    NASA Astrophysics Data System (ADS)

    Cunningham, Cindy C.; Peloquin, Tracy D.

    1999-02-01

    Since late 1996 the Forensic Identification Services Section of the Ontario Provincial Police has been actively involved in state-of-the-art image capture and the processing of video images extracted from crime scene videos. The benefits and problems of this technology for video analysis are discussed. All analysis is being conducted on SUN Microsystems UNIX computers, networked to a digital disk recorder that is used for video capture. The primary advantage of this system over traditional frame grabber technology is reviewed. Examples from actual cases are presented and the successes and limitations of this approach are explored. Suggestions to companies implementing security technology plans for various organizations (banks, stores, restaurants, etc.) will be made. Future directions for this work and new technologies are also discussed.

  17. Using Musical Intervals to Demonstrate Superposition of Waves and Fourier Analysis

    ERIC Educational Resources Information Center

    LoPresto, Michael C.

    2013-01-01

    What follows is a description of a demonstration of superposition of waves and Fourier analysis using a set of four tuning forks mounted on resonance boxes and oscilloscope software to create, capture and analyze the waveforms and Fourier spectra of musical intervals.
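
    The same demonstration can be sketched in software (a minimal example with two assumed fork frequencies forming a 3:2 interval; not part of the published activity):

      import numpy as np

      # Superpose two tuning-fork tones and take the Fourier transform of the sum.
      fs = 44_100                       # sample rate, Hz
      t = np.arange(0, 1.0, 1 / fs)     # one second of signal
      f1, f2 = 440.0, 660.0             # A4 and E5, a perfect fifth (3:2)
      waveform = np.sin(2 * np.pi * f1 * t) + np.sin(2 * np.pi * f2 * t)

      spectrum = np.abs(np.fft.rfft(waveform))
      freqs = np.fft.rfftfreq(len(waveform), d=1 / fs)

      # The two largest spectral peaks recover the interval's component frequencies.
      peaks = freqs[np.argsort(spectrum)[-2:]]
      print(np.sort(peaks))             # -> [440. 660.]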

  18. Inferring species interactions through joint mark–recapture analysis

    USGS Publications Warehouse

    Yackulic, Charles B.; Korman, Josh; Yard, Michael D.; Dzul, Maria C.

    2018-01-01

    Introduced species are frequently implicated in declines of native species. In many cases, however, evidence linking introduced species to native declines is weak. Failure to make strong inferences regarding the role of introduced species can hamper attempts to predict population viability and delay effective management responses. For many species, mark-recapture analysis is the most rigorous form of demographic analysis. However, to our knowledge, there are no mark-recapture models that allow for joint modeling of interacting species. Here, we introduce a two-species mark-recapture population model in which the vital rates (and capture probabilities) of one species are allowed to vary in response to the abundance of the other species. We use a simulation study to explore bias and choose an approach to model selection. We then use the model to investigate species interactions between endangered humpback chub (Gila cypha) and introduced rainbow trout (Oncorhynchus mykiss) in the Colorado River between 2009 and 2016. In particular, we test hypotheses about how two environmental factors (turbidity and temperature), intraspecific density dependence, and rainbow trout abundance are related to survival, growth, and capture of juvenile humpback chub. We also project the long-term effects of different rainbow trout abundances on adult humpback chub abundances. Our simulation study suggests this approach has minimal bias under potentially challenging circumstances (i.e., low capture probabilities) that characterized our application, and that model selection using indicator variables could reliably identify the true generating model even when process error was high. When the model was applied to rainbow trout and humpback chub, we identified negative relationships between rainbow trout abundance and the survival, growth, and capture probability of juvenile humpback chub. Effects of interspecific interactions on survival and capture probability were strongly supported, whereas support for the growth effect was weaker. Environmental factors were also identified as important and in many cases stronger than interspecific interactions, and there was still substantial unexplained variation in growth and survival rates. The general approach presented here for combining mark-recapture data for two species is applicable in many other systems and could be modified to model abundance of the invader via other modeling approaches.
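
    A minimal sketch of the kind of interaction term involved (the notation is ours, not the paper's): juvenile humpback chub survival over interval t can be modeled on the logit scale as a function of rainbow trout abundance and the environmental covariates,

      \mathrm{logit}(\varphi_t) = \beta_0 + \beta_1 N^{\mathrm{RBT}}_t + \beta_2\,\mathrm{temperature}_t + \beta_3\,\mathrm{turbidity}_t + \beta_4 N^{\mathrm{HBC}}_t,

    with beta_1 < 0 corresponding to the negative trout effect reported above, and analogous linear predictors for growth and capture probability.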

  19. Feasibility of single-beat full-volume capture real-time three-dimensional echocardiography and auto-contouring algorithm for quantification of left ventricular volume: validation with cardiac magnetic resonance imaging.

    PubMed

    Chang, Sung-A; Lee, Sang-Chol; Kim, Eun-Young; Hahm, Seung-Hee; Jang, Shin Yi; Park, Sung-Ji; Choi, Jin-Oh; Park, Seung Woo; Choe, Yeon Hyeon; Oh, Jae K

    2011-08-01

    With recent developments in echocardiographic technology, a new system using real-time three-dimensional echocardiography (RT3DE) that allows single-beat acquisition of the entire volume of the left ventricle and incorporates algorithms for automated border detection has been introduced. Provided that these techniques are acceptably reliable, three-dimensional echocardiography may be much more useful for clinical practice. The aim of this study was to evaluate the feasibility and accuracy of left ventricular (LV) volume measurements by RT3DE using the single-beat full-volume capture technique. One hundred nine consecutive patients scheduled for cardiac magnetic resonance imaging and RT3DE using the single-beat full-volume capture technique on the same day were recruited. LV end-systolic volume, end-diastolic volume, and ejection fraction were measured using an auto-contouring algorithm from data acquired on RT3DE. The data were compared with the same measurements obtained using cardiac magnetic resonance imaging. Volume measurements on RT3DE with single-beat full-volume capture were feasible in 84% of patients. Both interobserver and intraobserver variability of three-dimensional measurements of end-systolic and end-diastolic volumes showed excellent agreement. Pearson's correlation analysis showed a close correlation of end-systolic and end-diastolic volumes between RT3DE and cardiac magnetic resonance imaging (r = 0.94 and r = 0.91, respectively, P < .0001 for both). Bland-Altman analysis showed reasonable limits of agreement. After application of the auto-contouring algorithm, the rate of successful auto-contouring (cases requiring minimal manual corrections) was <50%. RT3DE using single-beat full-volume capture is an easy and reliable technique to assess LV volume and systolic function in clinical practice. However, the image quality and low frame rate still limit its application for dilated left ventricles, and the automated volume analysis program needs more development to make it clinically efficacious. Copyright © 2011 American Society of Echocardiography. Published by Mosby, Inc. All rights reserved.

  20. Quantitative mass spectrometric analysis of glycoproteins combined with enrichment methods.

    PubMed

    Ahn, Yeong Hee; Kim, Jin Young; Yoo, Jong Shin

    2015-01-01

    Mass spectrometry (MS) has been a core technology for highly sensitive and high-throughput analysis of the enriched glycoproteome, in terms of quantitative assays as well as qualitative profiling of glycoproteins. Because it is widely recognized that aberrant glycosylation of a glycoprotein may be involved in the progression of certain diseases, the development of efficient analysis tools for aberrant glycoproteins is very important for a deeper understanding of the pathological function of the glycoprotein and for new biomarker development. This review first describes glycosylation-targeting enrichment technologies, mainly employing solid-phase extraction methods such as hydrazide capture, lectin-specific capture, and affinity separation techniques based on porous graphitized carbon, hydrophilic interaction chromatography, or immobilized boronic acid. Second, MS-based quantitative analysis strategies coupled with these glycosylation-targeting enrichment technologies, using label-free MS, stable isotope labeling, or targeted multiple reaction monitoring (MRM) MS, are summarized with recently published studies. © 2014 The Authors. Mass Spectrometry Reviews Published by Wiley Periodicals, Inc.

  1. Development of esMOCA Biomechanic, Motion Capture Instrumentation for Biomechanics Analysis

    NASA Astrophysics Data System (ADS)

    Arendra, A.; Akhmad, S.

    2018-01-01

    This study aims to build a motion capture instrument using inertial measurement unit sensors to assist in the analysis of biomechanics. The sensors used are accelerometers and gyroscopes. Orientation estimation is performed by digital motion processing in each sensor node. There are nine sensor nodes attached to the upper limbs, connected to a PC via a wireless sensor network. Kinematic and inverse dynamic models of the upper limb were developed in Simulink SimMechanics. The kinematic model receives streaming data from the sensor nodes mounted on the limbs; its output is the pose of each limb, which is visualized on a display. The inverse dynamic model outputs the reaction force and reaction moment of each joint based on the limb motion input. Model validation in Simulink against a mathematical model from mechanical analysis showed results that did not differ significantly.
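
    A generic sketch of how accelerometer and gyroscope readings are commonly fused into an orientation estimate (a simple one-axis complementary filter, offered as background; it is not claimed to be the esMOCA algorithm):

      import numpy as np

      def complementary_filter(gyro_rate, accel_angle, dt=0.01, alpha=0.98):
          """Blend integrated gyroscope rate (deg/s) with the accelerometer-derived
          angle (deg) to suppress gyro drift and accelerometer noise."""
          angle = accel_angle[0]
          estimates = []
          for rate, acc in zip(gyro_rate, accel_angle):
              angle = alpha * (angle + rate * dt) + (1.0 - alpha) * acc
              estimates.append(angle)
          return np.array(estimates)

      # Synthetic test: a limb segment tilting from 0 to 30 degrees over 5 s,
      # with gyro bias and accelerometer noise added.
      t = np.arange(0, 5, 0.01)
      true_angle = np.linspace(0, 30, t.size)
      gyro = np.gradient(true_angle, 0.01) + 0.5                       # deg/s, biased
      accel = true_angle + np.random.default_rng(0).normal(0, 2.0, t.size)

      est = complementary_filter(gyro, accel)
      print(f"final estimate: {est[-1]:.1f} deg (true value 30.0 deg)")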

  2. An enrichment method based on synergistic and reversible covalent interactions for large-scale analysis of glycoproteins.

    PubMed

    Xiao, Haopeng; Chen, Weixuan; Smeekens, Johanna M; Wu, Ronghu

    2018-04-27

    Protein glycosylation is ubiquitous in biological systems and essential for cell survival. However, the heterogeneity of glycans and the low abundance of many glycoproteins complicate their global analysis. Chemical methods based on reversible covalent interactions between boronic acid and glycans have great potential to enrich glycopeptides, but the binding affinity is typically not strong enough to capture low-abundance species. Here, we develop a strategy using dendrimer-conjugated benzoboroxole to enhance the glycopeptide enrichment. We test the performance of several boronic acid derivatives, showing that benzoboroxole markedly increases glycopeptide coverage from human cell lysates. The enrichment is further improved by conjugating benzoboroxole to a dendrimer, which enables synergistic benzoboroxole-glycan interactions. This robust and simple method is highly effective for sensitive glycoproteomics analysis, especially capturing low-abundance glycopeptides. Importantly, the enriched glycopeptides remain intact, making the current method compatible with mass-spectrometry-based approaches to identify glycosylation sites and glycan structures.

  3. Introducing WISDEM:An Integrated System Modeling for Wind Turbines and Plant (Presentation)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dykes, K.; Graf, P.; Scott, G.

    2015-01-01

    The National Wind Technology Center wind energy systems engineering initiative has developed an analysis platform to leverage its research capabilities toward integrating wind energy engineering and cost models across wind plants. This Wind-Plant Integrated System Design & Engineering Model (WISDEM) platform captures the important interactions between various subsystems to achieve a better understanding of how to improve system-level performance and achieve system-level cost reductions. This work illustrates a few case studies with WISDEM that focus on the design and analysis of wind turbines and plants at different system levels.

  4. Value flow mapping: Using networks to inform stakeholder analysis

    NASA Astrophysics Data System (ADS)

    Cameron, Bruce G.; Crawley, Edward F.; Loureiro, Geilson; Rebentisch, Eric S.

    2008-02-01

    Stakeholder theory has garnered significant interest from the corporate community, but has proved difficult to apply to large government programs. A detailed value flow exercise was conducted to identify the value delivery mechanisms among stakeholders for the current Vision for Space Exploration. We propose a method for capturing stakeholder needs that explicitly recognizes the outcomes required of the value-creating organization. The captured stakeholder needs are then translated into input-output models for each stakeholder, which are then aggregated into a network model. Analysis of this network suggests that benefits are infrequently linked to the root provider of value. Furthermore, it is noted that requirements should not only be written to influence the organization's outputs, but also to influence the propagation of benefit further along the value chain. A number of future applications of this model to systems architecture and requirements analysis are discussed.

  5. Bench Scale Development and Testing of Aerogel Sorbents for CO 2 Capture Final Technical Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Begag, Redouane

    The primary objective of this project was scaling up and evaluating a novel Amine Functionalized Aerogel (AFA) sorbent in a bench-scale fluidized bed reactor. The project team (Aspen Aerogels, University of Akron, ADA-ES, and Longtail Consulting) carried out numerous tests and optimization studies to demonstrate the CO2 capture performance of the AFA sorbent in all its forms: powder, pellet, and bead. The CO2 capture performance targets for the AFA sorbent (all forms) were set at >12 wt.% total and >6 wt.% working CO2 capacity (at 40 °C adsorption / 100–120 °C desorption). The optimized AFA powders exceeded the total CO2 capacity target by more than 30% (14–20 wt.%) and the working CO2 capacity target by an average of 10% (6.6–7.0 wt.%, rising to as high as 9.6 wt.% when desorbed at 120 °C). The University of Akron developed binder formulations, pellet production methods, and post-treatment technology for increased resistance to attrition and flue gas contaminants. In pellet form the AFA total CO2 capacity was ~12 wt.% (over 85% retention of the powder's capacity), and there was less than 13% degradation in CO2 capture capacity after 20 cycles in the presence of 40 ppm SO2. ADA-ES assessed the performance of the AFA powder, pellet, and bead by analyzing sorption isotherms, water uptake, cycling stability, jet cup attrition and crush tests. At bench scale, the hydrodynamic and heat transfer properties of the AFA sorbent pellet under fluidized bed conditions were evaluated at Particulate Solid Research, Inc. (PSRI). After the process design requirements were completed by Longtail Consulting LLC, a techno-economic analysis was performed using guidance from a National Energy Technology Laboratory (NETL) report. That report provides the necessary framework to estimate costs for a temperature-swing post-combustion CO2 capture process on a bituminous coal fired, supercritical steam cycle power plant producing 550 MWe net generation with 90% CO2 capture using a monoethanolamine (MEA) solvent. Using the NETL report as guidance, the designed CO2 capture system was analyzed on a cost basis to determine relative cost estimates between the benchmark MEA system and the AFA sorbent system.

  6. Using timed event sequential data in nursing research.

    PubMed

    Pecanac, Kristen E; Doherty-King, Barbara; Yoon, Ju Young; Brown, Roger; Schiefelbein, Tony

    2015-01-01

    Measuring behavior is important in nursing research, and innovative technologies are needed to capture the "real-life" complexity of behaviors and events. The purpose of this article is to describe the use of timed event sequential data in nursing research and to demonstrate the use of this data in a research study. Timed event sequencing allows the researcher to capture the frequency, duration, and sequence of behaviors as they occur in an observation period and to link the behaviors to contextual details. Timed event sequential data can easily be collected with handheld computers, loaded with a software program designed for capturing observations in real time. Timed event sequential data add considerable strength to analysis of any nursing behavior of interest, which can enhance understanding and lead to improvement in nursing practice.

  7. Constraining the calculation of 234,236,238U(n,γ) cross sections with measurements of the γ-ray spectra at the DANCE facility

    DOE PAGES

    Ullmann, J. L.; Kawano, T.; Baramsai, B.; ...

    2017-08-31

    The cross section for neutron capture in the continuum region has been difficult to calculate accurately. Previous results for 238U show that including an M1 scissors-mode contribution to the photon strength function resulted in very good agreement between calculation and measurement. Our paper extends that analysis to 234,236U by using γ-ray spectra measured with the Detector for Advanced Neutron Capture Experiments (DANCE) at the Los Alamos Neutron Science Center to constrain the photon strength function used to calculate the capture cross section. Calculations using a strong scissors-mode contribution reproduced the measured γ-ray spectra and were in excellent agreement with the reported cross sections for all three isotopes.

  8. Progress on the Europium Neutron-Capture Study using DANCE

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Agvaanluvsan, U; Becker, J A; Macri, R A

    2006-09-05

    The accurate measurement of neutron-capture cross sections of the Eu isotopes is important for many reasons, including nuclear astrophysics and nuclear diagnostics. Neutron capture excitation functions of 151,153Eu targets were measured recently using the 4π γ-ray calorimeter array DANCE, located at the Los Alamos Neutron Science Center, for En = 0.1-100 keV. The progress on the data analysis efforts is given in the present paper. The γ-ray multiplicity distributions for the Eu targets and the Be backing are significantly different. The γ-ray multiplicity distribution is found to be the same for different neutron energies for both 151Eu and 153Eu. The statistical simulation used to model the γ-ray decay cascade is summarized.

  9. Constraining the calculation of 234,236,238U(n,γ) cross sections with measurements of the γ-ray spectra at the DANCE facility

    NASA Astrophysics Data System (ADS)

    Ullmann, J. L.; Kawano, T.; Baramsai, B.; Bredeweg, T. A.; Couture, A.; Haight, R. C.; Jandel, M.; O'Donnell, J. M.; Rundberg, R. S.; Vieira, D. J.; Wilhelmy, J. B.; Krtička, M.; Becker, J. A.; Chyzh, A.; Wu, C. Y.; Mitchell, G. E.

    2017-08-01

    The cross section for neutron capture in the continuum region has been difficult to calculate accurately. Previous results for 238U show that including an M1 scissors-mode contribution to the photon strength function resulted in very good agreement between calculation and measurement. This paper extends that analysis to 234,236U by using γ-ray spectra measured with the Detector for Advanced Neutron Capture Experiments (DANCE) at the Los Alamos Neutron Science Center to constrain the photon strength function used to calculate the capture cross section. Calculations using a strong scissors-mode contribution reproduced the measured γ-ray spectra and were in excellent agreement with the reported cross sections for all three isotopes.

  10. Development of Multifunctional Fluorescent–Magnetic Nanoprobes for Selective Capturing and Multicolor Imaging of Heterogeneous Circulating Tumor Cells

    PubMed Central

    2016-01-01

    Circulating tumor cells (CTC) are highly heterogeneous in nature due to epithelial–mesenchymal transition (EMT), which is the major obstacle for CTC analysis via "liquid biopsy". This article reports the development of a new class of multifunctional fluorescent–magnetic multicolor nanoprobes for targeted capturing and accurate identification of heterogeneous CTC. A facile design approach for the synthesis and characterization of bioconjugated multifunctional nanoprobes that exhibit excellent magnetic properties and emit very bright and photostable multicolor fluorescence at red, green, and blue under 380 nm excitation is reported. Experimental data presented show that the multifunctional multicolor nanoprobes can be used for targeted capture and multicolor fluorescence mapping of heterogeneous CTC and can distinguish targeted CTC from nontargeted cells. PMID:27255574

  11. Final Scientific/Technical Report Carbon Capture and Storage Training Northwest - CCSTNW

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Workman, James

    This report details the activities of the Carbon Capture and Storage Training Northwest (CCSTNW) program from 2009 to 2013. The CCSTNW created, implemented, and provided Carbon Capture and Storage (CCS) training over the period of the program. With the assistance of an expert advisory board, CCSTNW created a curriculum and conducted three short courses, more than three lectures, two symposiums, and a final conference. The program was conducted in five phases: 1) organization, gap analysis, and formation of the advisory board; 2) development of list serves, a website, and tech alerts; 3) a training needs survey; 4) lectures, courses, symposiums, and a conference; 5) evaluation surveys and course evaluations. This program was conducted jointly by the Environmental Outreach and Stewardship Alliance (dba Northwest Environmental Training Center – NWETC) and Pacific Northwest National Laboratory (PNNL).

  12. Experiential and Outdoor Education: The Participant Experience Shared through Mind Maps

    ERIC Educational Resources Information Center

    Jirásek, Ivo; Plevová, Irena; Jirásková, Miroslava; Dvorácková, Adéla

    2016-01-01

    This paper describes an analysis of mind maps capturing the experiences of the participants in an experiential and outdoor education course. The method of mind mapping is usually limited to a quantitative scoring analysis and comparative content analysis of concepts. As a consequence, the visual elements of the information are usually ignored, but…

  13. GeneLab

    NASA Technical Reports Server (NTRS)

    Berrios, Daniel C.; Thompson, Terri G.

    2015-01-01

    NASA GeneLab is expected to capture and distribute omics data, together with the experimental and process conditions most relevant to the research community for statistical and theoretical analysis of NASA's omics data.

  14. A New Method for Computing Three-Dimensional Capture Fraction in Heterogeneous Regional Systems using the MODFLOW Adjoint Code

    NASA Astrophysics Data System (ADS)

    Clemo, T. M.; Ramarao, B.; Kelly, V. A.; Lavenue, M.

    2011-12-01

    Capture is a measure of the impact of groundwater pumping upon groundwater and surface water systems. The computation of capture through analytical or numerical methods has been the subject of articles in the literature for several decades (Bredehoeft et al., 1982). Most recently, Leake et al. (2010) described a systematic way to produce capture maps in three-dimensional systems using a numerical perturbation approach in which capture from streams was computed using unit rate pumping at many locations within a MODFLOW model. The Leake et al. (2010) method advances the current state of computing capture. A limitation stems from the computational demand required by the perturbation approach, wherein days or weeks of computational time might be required to obtain a robust measure of capture. In this paper, we present an efficient method to compute capture in three-dimensional systems based upon adjoint states. The efficiency of the adjoint method will enable uncertainty analysis to be conducted on capture calculations. The USGS and INTERA have collaborated to extend the MODFLOW Adjoint code (Clemo, 2007) to include stream-aquifer interaction and have applied it to one of the examples used in Leake et al. (2010), the San Pedro Basin MODFLOW model. With five layers and 140,800 grid blocks per layer, the San Pedro Basin model provided an ideal example data set to compare the capture computed from the perturbation and the adjoint methods. The capture fraction map produced from the perturbation method for the San Pedro Basin model required significant computational time, and therefore the locations for the pumping wells were limited to 1530 locations in layer 4. The 1530 direct simulations of capture required approximately 76 CPU hours. Had capture been simulated in each grid block in each layer, as is done in the adjoint method, the CPU time would have been on the order of 4 years. The MODFLOW-Adjoint produced the capture fraction map of the San Pedro Basin model at 704,000 grid blocks (140,800 grid blocks x 5 layers) in just 6 minutes. The capture fraction maps from the perturbation and adjoint methods agree closely. The results of this study indicate that the adjoint capture method and its associated computational efficiency will enable scientists and engineers facing water resource management decisions to evaluate the sensitivity and uncertainty of impacts to regional water resource systems as part of groundwater supply strategies. Bredehoeft, J.D., S.S. Papadopulos, and H.H. Cooper Jr, Groundwater: The water budget myth. In Scientific Basis of Water-Resources Management, ed. National Research Council (U.S.), Geophysical Study Committee, 51-57. Washington D.C.: National Academy Press, 1982. Clemo, Tom, MODFLOW-2005 Ground-Water Model-Users Guide to Adjoint State based Sensitivity Process (ADJ), BSU CGISS 07-01, Center for the Geophysical Investigation of the Shallow Subsurface, Boise State University, 2007. Leake, S.A., H.W. Reeves, and J.E. Dickinson, A New Capture Fraction Method to Map How Pumpage Affects Surface Water Flow, Ground Water, 48(5), 670-700, 2010.
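
    The CPU-time comparison quoted above follows from simple arithmetic; the sketch below reproduces it using only the figures stated in the abstract.

    ```python
    # Back-of-the-envelope comparison of the perturbation and adjoint approaches,
    # using only the figures quoted in the abstract.
    n_pert_wells = 1530                    # direct (perturbation) runs in layer 4
    pert_hours = 76.0                      # total CPU hours for those runs
    hours_per_run = pert_hours / n_pert_wells

    n_blocks_total = 140_800 * 5           # all grid blocks over 5 layers
    full_pert_years = n_blocks_total * hours_per_run / (24 * 365)

    adjoint_minutes = 6.0                  # one adjoint solve covers every block

    print(f"perturbation: {hours_per_run * 60:.1f} min per well, "
          f"~{full_pert_years:.1f} CPU-years for all {n_blocks_total:,} blocks")
    print(f"adjoint:      {adjoint_minutes:.0f} minutes for the same coverage")
    ```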

  15. Literature mining, gene-set enrichment and pathway analysis for target identification in Behçet's disease.

    PubMed

    Wilson, Paul; Larminie, Christopher; Smith, Rona

    2016-01-01

    To use literature mining to catalogue Behçet's associated genes, and advanced computational methods to improve the understanding of the pathways and signalling mechanisms that lead to the typical clinical characteristics of Behçet's patients. To extend this technique to identify potential treatment targets for further experimental validation. Text mining methods combined with gene enrichment tools, pathway analysis and causal analysis algorithms. This approach identified 247 human genes associated with Behçet's disease and the resulting disease map, comprising 644 nodes and 19220 edges, captured important details of the relationships between these genes and their associated pathways, as described in diverse data repositories. Pathway analysis has identified how Behçet's associated genes are likely to participate in innate and adaptive immune responses. Causal analysis algorithms have identified a number of potential therapeutic strategies for further investigation. Computational methods have captured pertinent features of the prominent disease characteristics presented in Behçet's disease and have highlighted NOD2, ICOS and IL18 signalling as potential therapeutic strategies.

  16. Capturing tumor heterogeneity and clonal evolution in solid cancers using circulating tumor DNA analysis.

    PubMed

    Perdigones, Nieves; Murtaza, Muhammed

    2017-06-01

    Circulating tumor DNA analysis has emerged as a potential noninvasive alternative to tissue biopsies for tumor genotyping in patients with metastatic cancer. This is particularly attractive in cases where tissue biopsies are contraindicated or repeat genotyping after progression on treatment is required. However, tissue and plasma analysis results are not always concordant and clinical interpretation of discordant results is not completely understood. Discordant results could arise due to analytical limits of assays used for tumor and plasma DNA analysis or due to low overall contribution of tumor-specific DNA in plasma. Once these factors are ruled out, tissue-plasma concordance and quantitative levels of somatic mutations in plasma can capture tumor heterogeneity. During longitudinal follow-up of patients, this feature can be leveraged to track subclonal evolution and to guide combination or sequential adaptive treatment. Here, we summarize recent results evaluating the opportunities and limitations of circulating tumor DNA analysis in the context of tumor heterogeneity and subclonal evolution in patients with advanced cancers. Copyright © 2017 Elsevier Inc. All rights reserved.

  17. Multivariate Analysis of Remains of Molluscan Foods Consumed by Latest Pleistocene and Holocene Humans in Nerja Cave, Málaga, Spain

    NASA Astrophysics Data System (ADS)

    Serrano, Francisco; Guerra-Merchán, Antonio; Lozano-Francisco, Carmen; Vera-Peláez, José Luis

    1997-09-01

    Nerja Cave is a karstic cavity used by humans from Late Paleolithic to post-Chalcolithic times. Remains of molluscan foods in the uppermost Pleistocene and Holocene sediments were studied with cluster analysis and principal components analysis, in both Q- and R-modes. The results from cluster analysis distinguished interval groups mainly in accordance with chronology and distinguished assemblages of species mainly according to habitat. Significant changes in the shellfish diet through time were revealed. In the Late Magdalenian, most molluscs consumed consisted of pulmonate gastropods and species from sandy sea bottoms. The Epipaleolithic diet was more varied and included species from rocky shorelines. From the Neolithic onward most molluscs consumed were from rocky shorelines. From the principal components analysis in Q-mode, the first factor reflected mainly changes in the predominant capture environment, probably because of major paleogeographic changes. The second factor may reflect selective capture along rocky coastlines during certain times. The third factor correlated well with the sea-surface temperature curve in the western Mediterranean (Alboran Sea) during the late Quaternary.

  18. SMART USE OF COMPUTER-AIDED SPERM ANALYSIS (CASA) TO CHARACTERIZE SPERM MOTION

    EPA Science Inventory

    Computer-aided sperm analysis (CASA) has evolved over the past fifteen years to provide an objective, practical means of measuring and characterizing the velocity and pattern of sperm motion. CASA instruments use video frame-grabber boards to capture multiple images of spermato...

  19. Biomechanics Analysis of Combat Sport (Silat) By Using Motion Capture System

    NASA Astrophysics Data System (ADS)

    Zulhilmi Kaharuddin, Muhammad; Badriah Khairu Razak, Siti; Ikram Kushairi, Muhammad; Syawal Abd. Rahman, Mohamed; An, Wee Chang; Ngali, Z.; Siswanto, W. A.; Salleh, S. M.; Yusup, E. M.

    2017-01-01

    ‘Silat’ is a Malay traditional martial art that is practiced at both amateur and professional levels. The intensity of the motion spurs scientific research in biomechanics. The main purpose of this abstract is to present the biomechanics method used in the study of ‘silat’. Using the 3D Depth Camera motion capture system, two subjects each performed ‘Jurus Satu’ in three repetitions. One subject is set as the benchmark for the research. The videos are captured and the data are processed using the 3D Depth Camera server system in the form of 16 3D body-joint coordinates, which are then transformed into displacement, velocity and acceleration components using Microsoft Excel for data calculation and MATLAB software for simulation of the body. The translated data serve as an input to differentiate both subjects’ execution of the ‘Jurus Satu’. Nine primary movements, with the addition of five secondary movements, are observed visually frame by frame from the simulation to identify the exact frame in which each movement takes place. Further analysis differentiates both subjects’ execution by referring to the mean and standard deviation of the joints for each parameter stated. The findings provide useful data on joint kinematic parameters, help improve the execution of ‘Jurus Satu’, and exhibit the process of learning a relatively unknown movement by use of a motion capture system.
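
    A minimal sketch of the kinematics step described above (differentiating joint coordinates into velocity and acceleration) follows; the frame rate, joint count and data are placeholders, and the study's own Excel/MATLAB pipeline is not reproduced here.

    ```python
    import numpy as np

    def joint_kinematics(positions, fps):
        """Velocity and acceleration from 3D joint positions by central differences.

        positions: array of shape (n_frames, n_joints, 3) in metres.
        fps: capture rate in frames per second."""
        dt = 1.0 / fps
        velocity = np.gradient(positions, dt, axis=0)        # m/s
        acceleration = np.gradient(velocity, dt, axis=0)     # m/s^2
        return velocity, acceleration

    # Synthetic example: 100 frames of 16 joints captured at an assumed 30 fps.
    rng = np.random.default_rng(0)
    pos = np.cumsum(rng.normal(scale=0.01, size=(100, 16, 3)), axis=0)
    vel, acc = joint_kinematics(pos, fps=30)
    print(vel.shape, acc.shape)   # (100, 16, 3) (100, 16, 3)
    ```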

  20. Accuracy of human motion capture systems for sport applications; state-of-the-art review.

    PubMed

    van der Kruk, Eline; Reijne, Marco M

    2018-05-09

    Sport research often requires human motion capture of an athlete. Selecting the right system can, however, be labour-intensive and difficult, since manufacturers report specifications determined in set-ups that differ greatly from sport research in terms of volume, environment and motion. The aim of this review is to assist researchers in the selection of a suitable motion capture system for their experimental set-up for sport applications. An open online platform has been initiated to support (sport) researchers in the selection of a system and to enable them to contribute to and update the overview. A systematic review was performed: electronic searches in Scopus, Web of Science and Google Scholar were conducted, and the reference lists of the screened articles were scrutinised to determine human motion capture systems used in academically published studies on sport analysis. An overview of 17 human motion capture systems is provided, reporting the general specifications given by the manufacturer (weight and size of the sensors, maximum capture volume, environmental feasibilities), and calibration specifications as determined in peer-reviewed studies. The accuracy of each system is plotted against the measurement range. The overview and chart can assist researchers in the selection of a suitable measurement system. To increase the robustness of the database and to keep up with technological developments, we encourage researchers to perform an accuracy test prior to their experiment and to add to the chart and the system overview (online, open access).

  1. Relationship between mosquito (Diptera: Culicidae) landing rates on a human subject and numbers captured using CO2-baited light traps.

    PubMed

    Barnard, D R; Knue, G J; Dickerson, C Z; Bernier, U R; Kline, D L

    2011-06-01

    Capture rates of insectary-reared female Aedes albopictus (Skuse), Anopheles quadrimaculatus Say, Culex nigripalpus Theobald, Culex quinquefasciatus Say and Aedes triseriatus (Say) in CDC-type light traps (LT) supplemented with CO2 and using the human landing (HL) collection method were observed in matched-pair experiments in outdoor screened enclosures. Mosquito responses were compared on a catch-per-unit-effort basis using regression analysis with LT and HL as the dependent and independent variables, respectively. The average number of mosquitoes captured in 1 min by LT over a 24-h period was significantly related to the average number captured in 1 min by HL only for Cx. nigripalpus and Cx. quinquefasciatus. Patterns of diel activity indicated by a comparison of the mean response to LT and HL at eight different times in a 24-h period were not superposable for any species. The capture rate efficiency of LT when compared with HL was ≤15% for all mosquitoes except Cx. quinquefasciatus (43%). Statistical models of the relationship between mosquito responses to each collection method indicate that, except for Ae. albopictus, LT and HL capture rates are significantly related only during certain times of the diel period. Estimates of mosquito activity based on observations made between sunset and sunrise were most precise in this regard for An. quadrimaculatus and Cx. nigripalpus, as were those between sunrise and sunset for Cx. quinquefasciatus and Ae. triseriatus.
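
    The catch-per-unit-effort comparison amounts to a linear regression of light-trap (LT) rates on human-landing (HL) rates; the sketch below illustrates the idea with invented numbers, with the slope playing the role of the capture-rate efficiency.

    ```python
    import numpy as np

    # Hypothetical catch-per-unit-effort data (mosquitoes per minute);
    # the values are invented, not taken from the study.
    hl = np.array([0.5, 1.2, 2.0, 3.1, 4.4, 5.0])   # human-landing captures/min
    lt = np.array([0.1, 0.2, 0.3, 0.5, 0.6, 0.8])   # light-trap captures/min

    slope, intercept = np.polyfit(hl, lt, 1)
    r = np.corrcoef(hl, lt)[0, 1]
    print(f"LT = {slope:.2f}*HL + {intercept:.2f},  r^2 = {r**2:.2f}")
    print(f"capture-rate efficiency (slope) ~ {slope * 100:.0f}% of HL")
    ```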

  2. Detection of sepsis in patient blood samples using CD64 expression in a microfluidic cell separation device.

    PubMed

    Zhang, Ye; Li, Wenjie; Zhou, Yun; Johnson, Amanda; Venable, Amanda; Hassan, Ahmed; Griswold, John; Pappas, Dimitri

    2017-12-18

    A microfluidic affinity separation device was developed for the detection of sepsis in critical care patients. An affinity capture method was developed to capture cells based on changes in CD64 expression in a single, simple microfluidic chip for sepsis detection. Both sepsis patient samples and a laboratory CD64+ expression model were used to validate the microfluidic assay. Flow cytometry analysis showed that the chip cell capture had a linear relationship with CD64 expression in laboratory models. The Sepsis Chip detected an increase in upregulated neutrophil-like cells when the upregulated cell population is as low as 10% of total cells spiked into commercially available aseptic blood samples. In a proof of concept study, blood samples obtained from sepsis patients within 24 hours of diagnosis were tested on the chip to further validate its performance. On-chip CD64+ cell capture from 10 patient samples (619 ± 340 cells per chip) was significantly different from control samples (32 ± 11 cells per chip) and healthy volunteer samples (228 ± 95 cells per chip). In addition, the on-chip cell capture has a linear relationship with CD64 expression indicating our approach can be used to measure CD64 expression based on total cell capture on Sepsis Chip. Our method has proven to be sensitive, accurate, rapid, and cost-effective. Therefore, this device is a promising detection platform for neutrophil activation and sepsis diagnosis.

  3. C3 and C4 biomass allocation responses to elevated CO2 and nitrogen: contrasting resource capture strategies

    USGS Publications Warehouse

    White, K.P.; Langley, J.A.; Cahoon, D.R.; Megonigal, J.P.

    2012-01-01

    Plants alter biomass allocation to optimize resource capture. Plant strategy for resource capture may have important implications in intertidal marshes, where soil nitrogen (N) levels and atmospheric carbon dioxide (CO2) are changing. We conducted a factorial manipulation of atmospheric CO2 (ambient and ambient + 340 ppm) and soil N (ambient and ambient + 25 g m-2 year-1) in an intertidal marsh composed of common North Atlantic C3 and C4 species. Estimation of C3 stem turnover was used to adjust aboveground C3 productivity, and fine root productivity was partitioned into C3-C4 functional groups by isotopic analysis. The results suggest that the plants follow resource capture theory. The C3 species increased aboveground productivity under the added N and elevated CO2 treatment (P 2 alone. C3 fine root production decreased with added N (P 2 (P = 0.0481). The C4 species increased growth under high N availability both above- and belowground, but that stimulation was diminished under elevated CO2. The results suggest that the marsh vegetation allocates biomass according to resource capture at the individual plant level rather than for optimal ecosystem viability in regards to biomass influence over the processes that maintain soil surface elevation in equilibrium with sea level.

  4. Vision drives accurate approach behavior during prey capture in laboratory mice

    PubMed Central

    Hoy, Jennifer L.; Yavorska, Iryna; Wehr, Michael; Niell, Cristopher M.

    2016-01-01

    Summary The ability to genetically identify and manipulate neural circuits in the mouse is rapidly advancing our understanding of visual processing in the mammalian brain [1,2]. However, studies investigating the circuitry that underlies complex ethologically-relevant visual behaviors in the mouse have been primarily restricted to fear responses [3–5]. Here, we show that a laboratory strain of mouse (Mus musculus, C57BL/6J) robustly pursues, captures and consumes live insect prey, and that vision is necessary for mice to perform the accurate orienting and approach behaviors leading to capture. Specifically, we differentially perturbed visual or auditory input in mice and determined that visual input is required for accurate approach, allowing maintenance of bearing to within 11 degrees of the target on average during pursuit. While mice were able to capture prey without vision, the accuracy of their approaches and capture rate dramatically declined. To better explore the contribution of vision to this behavior, we developed a simple assay that isolated visual cues and simplified analysis of the visually guided approach. Together, our results demonstrate that laboratory mice are capable of exhibiting dynamic and accurate visually-guided approach behaviors, and provide a means to estimate the visual features that drive behavior within an ethological context. PMID:27773567

  5. Bench-Scale Silicone Process for Low-Cost CO{sub 2} Capture

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wood, Benjamin; Genovese, Sarah; Perry, Robert

    2013-12-31

    A bench-scale system was designed and built to test an aminosilicone-based solvent. A model was built of the bench-scale system and this model was scaled up to model the performance of a carbon capture unit, using aminosilicones, for CO{sub 2} capture and sequestration (CCS) for a pulverized coal (PC) boiler at 550 MW. System and economic analysis for the carbon capture unit demonstrates that the aminosilicone solvent has significant advantages relative to a monoethanolamine (MEA)-based system. The CCS energy penalty for MEA is 35.9% and the energy penalty for the aminosilicone solvent is 30.4% using a steam temperature of 395 °C (743 °F). If the steam temperature is lowered to 204 °C (400 °F), the energy penalty for the aminosilicone solvent is reduced to 29%. The increase in cost of electricity (COE) over the non-capture case for MEA is ~109%, and the increase in COE for the aminosilicone solvent is ~98 to 103%, depending on the solvent cost, at a steam temperature of 395 °C (743 °F). If the steam temperature is lowered to 204 °C (400 °F), the increase in COE for the aminosilicone solvent is reduced to ~95-100%.
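
    The headline figures lend themselves to a quick back-of-envelope check. The sketch below treats each quoted energy penalty as a fractional reduction of a 550 MW net output, which is an assumption made only for illustration.

    ```python
    # Illustrative recomputation of net output under the quoted energy penalties,
    # assuming the penalty applies directly to the 550 MW net output.
    base_net_mw = 550.0

    cases = [
        ("MEA, 395 C steam",           0.359),
        ("aminosilicone, 395 C steam", 0.304),
        ("aminosilicone, 204 C steam", 0.290),
    ]

    for label, penalty in cases:
        net_with_capture = base_net_mw * (1.0 - penalty)
        print(f"{label:28s}: {penalty * 100:4.1f}% penalty -> "
              f"{net_with_capture:.0f} MW net with capture")
    ```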

  6. Emerging trends in Lassa fever: redefining the role of immunoglobulin M and inflammation in diagnosing acute infection.

    PubMed

    Branco, Luis M; Grove, Jessica N; Boisen, Matt L; Shaffer, Jeffrey G; Goba, Augustine; Fullah, Mohammed; Momoh, Mambu; Grant, Donald S; Garry, Robert F

    2011-10-24

    Lassa fever (LF) is a devastating hemorrhagic viral disease that is endemic to West Africa and responsible for thousands of human deaths each year. Analysis of humoral immune responses (IgM and IgG) by antibody-capture ELISA (Ab-capture ELISA) and Lassa virus (LASV) viremia by antigen-capture ELISA (Ag-capture ELISA) in suspected patients admitted to the Kenema Government Hospital (KGH) Lassa Fever Ward (LFW) in Sierra Leone over the past five years is reshaping our understanding of acute LF. Analyses in LF survivors indicated that LASV-specific IgM persists for months to years after initial infection. Furthermore, exposure to LASV appeared to be more prevalent in historically non-endemic areas of West Africa with significant percentages of reportedly healthy donors IgM and IgG positive in LASV-specific Ab-capture ELISA. We found that LF patients who were Ag positive were more likely to die than suspected cases who were only IgM positive. Analysis of metabolic and immunological parameters in Ag positive LF patients revealed a strong correlation between survival and low levels of IL-6, -8, -10, CD40L, BUN, ALP, ALT, and AST. Despite presenting to the hospital with fever and in some instances other symptoms consistent with LF, the profiles of Ag negative IgM positive individuals were similar to those of normal donors and nonfatal (NF) LF cases, suggesting that IgM status cannot necessarily be considered a diagnostic marker of acute LF in suspected cases living in endemic areas of West Africa. Only LASV viremia assessed by Ag-capture immunoassay, nucleic acid detection or virus isolation should be used to diagnose acute LASV infection in West Africans. LASV-specific IgM serostatus cannot be considered a diagnostic marker of acute LF in suspected cases living in endemic areas of West Africa. By applying these criteria, we identified a dysregulated metabolic and pro-inflammatory response profile conferring a poor prognosis in acute LF. In addition to suggesting that the current diagnostic paradigm for acute LF should be reconsidered, these studies present new opportunities for therapeutic interventions based on potential prognostic markers in LF.

  7. Development and Testing of The Lunar Resource Prospector Drill

    NASA Astrophysics Data System (ADS)

    Zacny, K.; Paulsen, G.; Kleinhenz, J.; Smith, J. T.; Quinn, J.

    2017-12-01

    The goal of the Lunar Resource Prospector (RP) mission is to capture and identify volatiles species within the top one meter layer of the lunar surface. The RP drill has been designed to 1. Generate cuttings and place them on the surface for analysis by the Near InfraRed Volatiles Spectrometer Subsystem (NIRVSS), and 2. Capture cuttings and transfer them to the Oxygen and Volatile Extraction Node (OVEN) coupled with the Lunar Advanced Volatiles Analysis (LAVA) subsystem. The RP drill is based on the TRL4 Mars Icebreaker drill and TRL5 LITA drill developed for capturing samples of ice and ice cemented ground on Mars, and represents over a decade of technology development effort. The TRL6 RP drill weighs approximately 15 kg and is rated at just over 500 Watt. The drill consists of: 1. Rotary-Percussive Drill Head, 2. Sampling Auger, 3. Brushing Station, 4. Feed Stage, and 5. Deployment Stage. To reduce sample handling complexity, the drill auger is designed to capture cuttings as opposed to cores. High sampling efficiency is possible through a dual design of the auger. The lower section has deep and low pitch flutes for retaining of cuttings. The upper section has been designed to efficiently move the cuttings out of the hole. The drill uses a "bite" sampling approach where samples are captured in 10 cm depth intervals. The first generation, TRL4 Icebreaker drill was tested in Mars chamber as well as in Antarctica and the Arctic. It demonstrated drilling at 1-1-100-100 level (1 meter in 1 hour with 100 Watt and 100 N Weight on Bit) in ice, ice cemented ground, soil, and rocks. The second generation, TRL5 LITA drill was deployed on a Carnegie Mellon University rover, called Zoe, and tested in Atacama, Antarctica, the Arctic, and Greenland. The tests demonstrated fully autonomous sample acquisition and delivery to a carousel. The modified LITA drill was tested in NASA GRC's lunar vacuum chamber at <10^-5 torr and <200 K. It demonstrated successful capture and transfer of volatile rich frozen samples to a crucible for analysis. The modified LITA drill has also been successfully vibration tested at NASA KSC. The drill was integrated with RP rover at NASA JSC and successfully tested in a lab and in the field, as well as on a large vibration table and steep slope. The latest TRL6 RP drill is currently undergoing testing at NASA GRC lunar chamber facilities.

  8. Near-term deployment of carbon capture and sequestration from biorefineries in the United States.

    PubMed

    Sanchez, Daniel L; Johnson, Nils; McCoy, Sean T; Turner, Peter A; Mach, Katharine J

    2018-05-08

    Capture and permanent geologic sequestration of biogenic CO2 emissions may provide critical flexibility in ambitious climate change mitigation. However, most bioenergy with carbon capture and sequestration (BECCS) technologies are technically immature or commercially unavailable. Here, we evaluate low-cost, commercially ready CO2 capture opportunities for existing ethanol biorefineries in the United States. The analysis combines process engineering, spatial optimization, and lifecycle assessment to consider the technical, economic, and institutional feasibility of near-term carbon capture and sequestration (CCS). Our modeling framework evaluates least cost source-sink relationships and aggregation opportunities for pipeline transport, which can cost-effectively transport small CO2 volumes to suitable sequestration sites; 216 existing US biorefineries emit 45 Mt CO2 annually from fermentation, of which 60% could be captured and compressed for pipeline transport for under $25/tCO2. A sequestration credit, analogous to existing CCS tax credits, of $60/tCO2 could incent 30 Mt of sequestration and 6,900 km of pipeline infrastructure across the United States. Similarly, a carbon abatement credit, analogous to existing tradeable CO2 credits, of $90/tCO2 can incent 38 Mt of abatement. Aggregation of CO2 sources enables cost-effective long-distance pipeline transport to distant sequestration sites. Financial incentives under the low-carbon fuel standard in California and recent revisions to existing federal tax credits suggest a substantial near-term opportunity to permanently sequester biogenic CO2. This financial opportunity could catalyze the growth of carbon capture, transport, and sequestration; improve the lifecycle impacts of conventional biofuels; support development of carbon-negative fuels; and help fulfill the mandates of low-carbon fuel policies across the United States. Copyright © 2018 the Author(s). Published by PNAS.
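
    The screening logic (which sources become economic under a given per-tonne credit) can be illustrated with a toy calculation; every source name, volume and cost below is invented, not a value from the study.

    ```python
    # Toy screening of CO2 sources against a per-tonne credit, in the spirit of
    # the source-sink analysis described above. All numbers are made up.
    sources = [  # (name, Mt CO2/yr, capture + transport + storage cost, $/tCO2)
        ("refinery A", 0.5, 22.0),
        ("refinery B", 0.3, 38.0),
        ("refinery C", 1.1, 55.0),
        ("refinery D", 0.7, 72.0),
    ]

    for credit in (25.0, 60.0, 90.0):
        incented = [(name, mt) for name, mt, cost in sources if cost <= credit]
        total = sum(mt for _, mt in incented)
        print(f"credit ${credit:.0f}/tCO2 -> {total:.1f} Mt/yr from "
              f"{len(incented)} sources")
    ```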

  9. Species characterization and responses of subcortical insects to trap-logs and ethanol in a hardwood biomass plantation: Subcortical insects in hardwood plantations

    DOE PAGES

    Coyle, David R.; Brissey, Courtney L.; Gandhi, Kamal J. K.

    2015-01-02

    1. We characterized subcortical insect assemblages in economically important eastern cottonwood (Populus deltoides Bartr.), sycamore (Platanus occidentalis L.) and sweetgum (Liquidambar styraciflua L.) plantations in the southeastern U.S.A. Furthermore, we compared insect responses between freshly-cut plant material by placing traps directly over cut hardwood logs (trap-logs), traps baited with ethanol lures and unbaited (control) traps. 2. We captured a total of 15 506 insects representing 127 species in four families in 2011 and 2013. Approximately 9% and 62% of total species and individuals, respectively, and 23% and 79% of total Scolytinae species and individuals, respectively, were non-native to North America. 3. We captured more Scolytinae using cottonwood trap-logs compared with control traps in both years, although this was the case with sycamore and sweetgum only in 2013. More woodborers were captured using cottonwood and sweetgum trap-logs compared with control traps in both years, although only with sycamore in 2013. 4. Ethanol was an effective lure for capturing non-native Scolytinae; however, not all non-native species were captured using ethanol lures. Ambrosiophilus atratus (Eichhoff) and Hypothenemus crudiae (Panzer) were captured with both trap-logs and control traps, whereas Coccotrypes distinctus (Motschulsky) and Xyleborus glabratus Eichhoff were only captured on trap-logs. 5. Indicator species analysis revealed that certain scolytines [e.g. Cnestus mutilatus (Blandford) and Xylosandrus crassiusculus (Motschulsky)] showed significant associations with trap-logs or ethanol baits in poplar or sweetgum trap-logs. In general, the species composition of subcortical insects, especially woodboring insects, was distinct among the three tree species and between those associated with trap-logs and control traps.

  10. The system-wide economics of a carbon dioxide capture, utilization, and storage network: Texas Gulf Coast with pure CO2-EOR flood

    NASA Astrophysics Data System (ADS)

    King, Carey W.; Gülen, Gürcan; Cohen, Stuart M.; Nuñez-Lopez, Vanessa

    2013-09-01

    This letter compares several bounding cases for understanding the economic viability of capturing large quantities of anthropogenic CO2 from coal-fired power generators within the Electric Reliability Council of Texas electric grid and using it for pure CO2 enhanced oil recovery (EOR) in the onshore coastal region of Texas along the Gulf of Mexico. All captured CO2 in excess of that needed for EOR is sequestered in saline formations at the same geographic locations as the oil reservoirs but at a different depth. We analyze the extraction of oil from the same set of ten reservoirs within 20- and five-year time frames to describe how the scale of the carbon dioxide capture, utilization, and storage (CCUS) network changes to meet the rate of CO2 demand for oil recovery. Our analysis shows that there is a negative system-wide net present value (NPV) for all modeled scenarios. The system comes close to breakeven economics when capturing CO2 from three coal-fired power plants to produce oil via CO2-EOR over 20 years and assuming no CO2 emissions penalty. The NPV drops when we consider a larger network to produce oil more quickly (21 coal-fired generators with CO2 capture to produce 80% of the oil within five years). Upon applying a CO2 emissions penalty of $60/tCO2 (in 2009 dollars) to fossil fuel emissions to ensure that coal-fired power plants with CO2 capture remain in baseload operation, the system economics drop significantly. We show near profitability for the cash flow of the EOR operations only; however, this situation requires relatively cheap electricity prices during operation.
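
    A minimal net-present-value calculation of the kind underlying the system-wide economics is sketched below; the discount rate, capital cost and annual cash flows are placeholders, not the study's inputs.

    ```python
    # Minimal NPV sketch for a CCUS cash flow; all numbers are placeholders.
    def npv(cash_flows, rate):
        """Net present value of annual cash flows (year 0 first)."""
        return sum(cf / (1.0 + rate) ** t for t, cf in enumerate(cash_flows))

    capex = -2_000.0          # $M spent in year 0
    annual_net = 180.0        # $M/yr: oil revenue minus capture and EOR costs
    flows = [capex] + [annual_net] * 20
    print(f"NPV over 20 yr at 10%: {npv(flows, 0.10):.0f} $M")

    # A CO2 emissions penalty reduces the annual net cash flow:
    flows_with_penalty = [capex] + [annual_net - 60.0] * 20
    print(f"NPV with emissions penalty: {npv(flows_with_penalty, 0.10):.0f} $M")
    ```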

  11. Near-term deployment of carbon capture and sequestration from biorefineries in the United States

    PubMed Central

    Johnson, Nils; McCoy, Sean T.; Turner, Peter A.; Mach, Katharine J.

    2018-01-01

    Capture and permanent geologic sequestration of biogenic CO2 emissions may provide critical flexibility in ambitious climate change mitigation. However, most bioenergy with carbon capture and sequestration (BECCS) technologies are technically immature or commercially unavailable. Here, we evaluate low-cost, commercially ready CO2 capture opportunities for existing ethanol biorefineries in the United States. The analysis combines process engineering, spatial optimization, and lifecycle assessment to consider the technical, economic, and institutional feasibility of near-term carbon capture and sequestration (CCS). Our modeling framework evaluates least cost source–sink relationships and aggregation opportunities for pipeline transport, which can cost-effectively transport small CO2 volumes to suitable sequestration sites; 216 existing US biorefineries emit 45 Mt CO2 annually from fermentation, of which 60% could be captured and compressed for pipeline transport for under $25/tCO2. A sequestration credit, analogous to existing CCS tax credits, of $60/tCO2 could incent 30 Mt of sequestration and 6,900 km of pipeline infrastructure across the United States. Similarly, a carbon abatement credit, analogous to existing tradeable CO2 credits, of $90/tCO2 can incent 38 Mt of abatement. Aggregation of CO2 sources enables cost-effective long-distance pipeline transport to distant sequestration sites. Financial incentives under the low-carbon fuel standard in California and recent revisions to existing federal tax credits suggest a substantial near-term opportunity to permanently sequester biogenic CO2. This financial opportunity could catalyze the growth of carbon capture, transport, and sequestration; improve the lifecycle impacts of conventional biofuels; support development of carbon-negative fuels; and help fulfill the mandates of low-carbon fuel policies across the United States. PMID:29686063

  12. US Spacesuit Knowledge Capture

    NASA Technical Reports Server (NTRS)

    Chullen, Cinda; Thomas, Ken; McMann, Joe; Dolan, Kristi; Bitterly, Rose; Lewis, Cathleen

    2011-01-01

    The ability to learn from both the mistakes and successes of the past is vital to assuring success in the future. Due to the close physical interaction between spacesuit systems and human beings as users, spacesuit technology and usage lends itself rather uniquely to the benefits realized from the skillful organization of historical information; its dissemination; the collection and identification of artifacts; and the education of those in the field. The National Aeronautics and Space Administration (NASA), other organizations and individuals have been performing United States (U.S.) Spacesuit Knowledge Capture since the beginning of space exploration. Avenues used to capture the knowledge have included publication of reports; conference presentations; specialized seminars; and classes usually given by veterans in the field. More recently, the effort has become more concentrated and formalized: a new avenue of spacesuit knowledge capture has been added to the archives, in which current and retired specialists in the field are videotaped presenting technical material specifically for education and preservation of knowledge. With video archiving, all these avenues of learning can now be brought to life with the real experts presenting their wealth of knowledge on screen for future learners to enjoy. Scope and topics of U.S. spacesuit knowledge capture have included lessons learned in spacesuit technology, experience from the Gemini, Apollo, Skylab and Shuttle programs, hardware certification, design, development and other program components, spacesuit evolution and experience, failure analysis and resolution, and aspects of program management. Concurrently, U.S. spacesuit knowledge capture activities have progressed to a level where NASA, the National Air and Space Museum (NASM), Hamilton Sundstrand (HS) and the spacesuit community are now working together to provide a comprehensive closed-looped spacesuit knowledge capture system which includes

  13. Species characterization and responses of subcortical insects to trap-logs and ethanol in a hardwood biomass plantation: Subcortical insects in hardwood plantations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Coyle, David R.; Brissey, Courtney L.; Gandhi, Kamal J. K.

    1. We characterized subcortical insect assemblages in economically important eastern cottonwood (Populus deltoides Bartr.), sycamore (Platanus occidentalis L.) and sweetgum (Liquidambar styraciflua L.) plantations in the southeastern U.S.A. Furthermore, we compared insect responses between freshly-cut plant material by placing traps directly over cut hardwood logs (trap-logs), traps baited with ethanol lures and unbaited (control) traps. 2. We captured a total of 15 506 insects representing 127 species in four families in 2011 and 2013. Approximately 9% and 62% of total species and individuals, respectively, and 23% and 79% of total Scolytinae species and individuals, respectively, were non-native to North America. 3. We captured more Scolytinae using cottonwood trap-logs compared with control traps in both years, although this was the case with sycamore and sweetgum only in 2013. More woodborers were captured using cottonwood and sweetgum trap-logs compared with control traps in both years, although only with sycamore in 2013. 4. Ethanol was an effective lure for capturing non-native Scolytinae; however, not all non-native species were captured using ethanol lures. Ambrosiophilus atratus (Eichhoff) and Hypothenemus crudiae (Panzer) were captured with both trap-logs and control traps, whereas Coccotrypes distinctus (Motschulsky) and Xyleborus glabratus Eichhoff were only captured on trap-logs. 5. Indicator species analysis revealed that certain scolytines [e.g. Cnestus mutilatus (Blandford) and Xylosandrus crassiusculus (Motschulsky)] showed significant associations with trap-logs or ethanol baits in poplar or sweetgum trap-logs. In general, the species composition of subcortical insects, especially woodboring insects, was distinct among the three tree species and between those associated with trap-logs and control traps.

  14. Impact of Domain Analysis on Reuse Methods

    DTIC Science & Technology

    1989-11-06

    return on the investment. The potential negative effects a "bad" domain analysis has on developing systems in the domain also increase the risks of a...importance of domain analysis as part of a software reuse program. A particular goal is to assist in avoiding the potential negative effects of ad hoc or...are specification objects discovered by performing object-oriented analysis. Object-based analysis approaches thus serve to capture a model of reality

  15. A Graph Oriented Approach for Network Forensic Analysis

    ERIC Educational Resources Information Center

    Wang, Wei

    2010-01-01

    Network forensic analysis is a process that analyzes intrusion evidence captured from networked environment to identify suspicious entities and stepwise actions in an attack scenario. Unfortunately, the overwhelming amount and low quality of output from security sensors make it difficult for analysts to obtain a succinct high-level view of complex…

  16. Digital Radiographic Image Processing and Analysis.

    PubMed

    Yoon, Douglas C; Mol, André; Benn, Douglas K; Benavides, Erika

    2018-07-01

    This article describes digital radiographic imaging and analysis from the basics of image capture to examples of some of the most advanced digital technologies currently available. The principles underlying the imaging technologies are described to provide a better understanding of their strengths and limitations. Copyright © 2018 Elsevier Inc. All rights reserved.

  17. Spectrum-Based and Collaborative Network Topology Analysis and Visualization

    ERIC Educational Resources Information Center

    Hu, Xianlin

    2013-01-01

    Networks are of significant importance in many application domains, such as World Wide Web and social networks, which often embed rich topological information. Since network topology captures the organization of network nodes and links, studying network topology is very important to network analysis. In this dissertation, we study networks by…

  18. Valuing vaccines using value of statistical life measures.

    PubMed

    Laxminarayan, Ramanan; Jamison, Dean T; Krupnick, Alan J; Norheim, Ole F

    2014-09-03

    Vaccines are effective tools to improve human health, but resources to pursue all vaccine-related investments are lacking. Benefit-cost and cost-effectiveness analysis are the two major methodological approaches used to assess the impact, efficiency, and distributional consequences of disease interventions, including those related to vaccinations. Childhood vaccinations can have important non-health consequences for productivity and economic well-being through multiple channels, including school attendance, physical growth, and cognitive ability. Benefit-cost analysis would capture such non-health benefits; cost-effectiveness analysis does not. Standard cost-effectiveness analysis may grossly underestimate the benefits of vaccines. A specific willingness-to-pay measure is based on the notion of the value of a statistical life (VSL), derived from trade-offs people are willing to make between fatality risk and wealth. Such methods have been used widely in the environmental and health literature to capture the broader economic benefits of improving health, but reservations remain about their acceptability. These reservations arise mainly because the methods may reflect ability to pay, and hence be discriminatory against the poor. However, willingness-to-pay methods can be made sensitive to income distribution by using appropriate income-sensitive distributional weights. Here, we describe the pros and cons of these methods and how they compare against standard cost-effectiveness analysis using pure health metrics, such as quality-adjusted life years (QALYs) and disability-adjusted life years (DALYs), in the context of vaccine priorities. We conclude that if appropriately used, willingness-to-pay methods will not discriminate against the poor, and they can capture important non-health benefits such as financial risk protection, productivity gains, and economic well-being. Copyright © 2014 Elsevier Ltd. All rights reserved.
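
    A toy example of income-sensitive distributional weighting of willingness-to-pay benefits is sketched below; the group incomes, WTP values, reference income and unit elasticity are all assumptions for illustration.

    ```python
    # Toy distributional weighting of willingness-to-pay (WTP) benefits.
    groups = [  # (name, mean income $, WTP per unit risk reduction $, population)
        ("low income",   2_000,  30.0, 1_000_000),
        ("mid income",   8_000, 120.0, 1_000_000),
        ("high income", 32_000, 480.0, 1_000_000),
    ]
    reference_income = 8_000.0
    elasticity = 1.0          # weight = (reference income / income) ** elasticity

    unweighted = sum(wtp * pop for _, _, wtp, pop in groups)
    weighted = sum((reference_income / inc) ** elasticity * wtp * pop
                   for _, inc, wtp, pop in groups)

    print(f"unweighted aggregate benefit:  ${unweighted / 1e6:,.0f}M")
    print(f"distribution-weighted benefit: ${weighted / 1e6:,.0f}M")
    ```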

  19. Application of laser-capture microdissection to analysis of gene expression in the testis.

    PubMed

    Sluka, Pavel; O'Donnell, Liza; McLachlan, Robert I; Stanton, Peter G

    2008-01-01

    The isolation and molecular analysis of highly purified cell populations from complex, heterogeneous tissues has been a challenge for many years. Spermatogenesis in the testis is a particularly difficult process to study given the unique multiple cellular associations within the seminiferous epithelium, making the isolation of specific cell types difficult. Laser-capture microdissection (LCM) is a recently developed technique that enables the isolation of individual cell populations from complex tissues. This technology has enhanced our ability to directly examine gene expression in enriched testicular cell populations by routine methods of gene expression analysis, such as real-time RT-PCR, differential display, and gene microarrays. The application of LCM has however introduced methodological hurdles that have not been encountered with more conventional molecular analyses of whole tissue. In particular, tissue handling (i.e. fixation, storage, and staining), consumables (e.g. slide choice), staining reagents (conventional H&E vs. fluorescence), extraction methods, and downstream applications have all required re-optimisation to facilitate differential gene expression analysis using the small amounts of material obtained using LCM. This review will discuss three critical issues that are essential for successful procurement of cells from testicular tissue sections: tissue morphology, capture success, and maintenance of molecular integrity. The importance of these issues will be discussed with specific reference to the two most commonly used LCM systems: the Arcturus PixCell IIe and PALM systems. The rat testis will be used as a model, and emphasis will be placed on issues of tissue handling, processing, and staining methods, including the application of fluorescence techniques to assist in the identification of cells of interest for the purposes of mRNA expression analysis.

  20. The Function of Neuroendocrine Cells in Prostate Cancer

    DTIC Science & Technology

    2013-04-01

    integration site. We then performed deep sequencing and aligned reads to the genome. Our analysis revealed that both histological phenotypes are derived from...lentiviral integration site analysis. (B) Laser capture microdissection was performed on individual glands containing both squamous and...lentiviral integration site analysis. LTR: long terminal repeat (viral DNA), PCR: polymerase chain reaction. (D) Venn diagrams depict shared lentiviral

  1. Neutron Activation Analysis of Water - A Review

    NASA Technical Reports Server (NTRS)

    Buchanan, John D.

    1971-01-01

    Recent developments in this field are emphasized. After a brief review of basic principles, topics discussed include sources of neutrons, pre-irradiation physical and chemical treatment of samples, neutron capture and gamma-ray analysis, and selected applications. Applications of neutron activation analysis of water have increased rapidly within the last few years and may be expected to increase in the future.

  2. The utilization of forensic science and criminal profiling for capturing serial killers.

    PubMed

    White, John H; Lester, David; Gentile, Matthew; Rosenbleeth, Juliana

    2011-06-15

    Movies and nightly television shows appear to emphasize highly efficient regimens in forensic science and criminal investigative analysis (profiling) that result in capturing serial killers and other perpetrators of homicide. Although some of the shows are apocryphal and unrealistic, they reflect major advancements that have been made in the fields of forensic science and criminal psychology during the past two decades that have helped police capture serial killers. Some of the advancements are outlined in this paper. In a study of 200 serial killers, we examined the variables that led to police focusing their attention on specific suspects. We developed 12 categories that describe how serial killers come to the attention of the police. The results of the present study indicate that most serial killers are captured as a result of citizens and surviving victims contributing information that resulted in police investigations that led to an arrest. The role of forensic science appears to be important in convicting the perpetrator, but not necessarily in identifying the perpetrator. Copyright © 2011 Elsevier Ireland Ltd. All rights reserved.

  3. Simulation of mercury capture by sorbent injection using a simplified model.

    PubMed

    Zhao, Bingtao; Zhang, Zhongxiao; Jin, Jing; Pan, Wei-Ping

    2009-10-30

    Mercury pollution by fossil fuel combustion or solid waste incineration is becoming a worldwide environmental concern. As an effective control technology, powdered sorbent injection (PSI) has been successfully used for mercury capture from flue gas with advantages of low cost and easy operation. In order to predict the mercury capture efficiency for PSI more conveniently, a simplified model, which is based on the theory of mass transfer, isothermal adsorption and mass balance, is developed in this paper. The comparisons between theoretical results of this model and experimental results by Meserole et al. [F.B. Meserole, R. Chang, T.R. Carrey, J. Machac, C.F.J. Richardson, Modeling mercury removal by sorbent injection, J. Air Waste Manage. Assoc. 49 (1999) 694-704] demonstrate that the simplified model is able to provide good predictive accuracy. Moreover, the effects of key parameters including the mass transfer coefficient, sorbent concentration, sorbent physical property and sorbent adsorption capacity on mercury adsorption efficiency are compared and evaluated. Finally, the sensitivity analysis of impact factors indicates that the injected sorbent concentration plays the most important role in mercury capture efficiency.
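
    A heavily simplified, mass-transfer-limited sketch of sorbent-injection mercury removal follows; it is not the model of the paper (which also treats isothermal adsorption and mass balance), and every parameter value is an illustrative assumption.

    ```python
    import numpy as np

    def hg_removal_efficiency(c_sorbent, k_g, surface_per_mass, residence_s):
        """Toy mass-transfer-limited estimate of mercury removal efficiency.

        c_sorbent: injected sorbent concentration in the gas (g/m^3)
        k_g: gas-phase mass-transfer coefficient (m/s)
        surface_per_mass: external sorbent surface per unit mass (m^2/g)
        residence_s: in-flight residence time (s)
        Assumes adsorption capacity is not limiting, so removal follows
        eta = 1 - exp(-k_g * a * t) with a = c_sorbent * surface_per_mass."""
        a = c_sorbent * surface_per_mass      # sorbent area per gas volume, m^2/m^3
        return 1.0 - np.exp(-k_g * a * residence_s)

    for c in (0.5, 1.0, 2.0, 4.0):            # g sorbent per m^3 of flue gas
        eta = hg_removal_efficiency(c, k_g=0.05, surface_per_mass=0.3,
                                    residence_s=2.0)
        print(f"sorbent {c:3.1f} g/m^3 -> Hg removal ~ {eta * 100:4.1f}%")
    ```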

  4. Electron Beam Analysis of Micrometeoroids Captured in Aerogel as Stardust Analogues

    NASA Technical Reports Server (NTRS)

    Graham, G. A.; Sheffield-Parker, J.; Bradley, P.; Kearsley, A. T.; Dai, Z. R.; Mayo, S. C.; Teslich, N.; Snead, C.; Westphal, A. J.; Ishii, H.

    2005-01-01

    In January 2004, NASA's Stardust spacecraft passed through the tail of Comet 81P/Wild-2. The on-board dust flux monitor instrument indicated that numerous micro- and nanometer-sized cometary dust particles were captured by the dedicated silica aerogel capture cell. The collected cometary particles will be returned to Earth in January 2006. Current Stardust analogues are: (i) Light-gas-gun accelerated individual mineral grains and carbonaceous meteoritic material in aerogels at the Stardust encounter velocity of approximately 6 kilometers per second. (ii) Aerogels exposed in low-Earth orbit (LEO) containing preserved cosmic dust grains. Studies of these impacts offer insight into the potential state of the cometary dust captured by Stardust and the suitability of various analytical techniques. A number of papers have discussed the application of sophisticated synchrotron analytical techniques to analyze Stardust particles. Yet much of the understanding gained on the composition and mineralogy of interplanetary dust particles (IDPs) has come from electron microscopy studies. Here we discuss the application of scanning electron microscopy (SEM) for Stardust during the preliminary phase of post-return investigations.

  5. A Computerized Data-Capture System for Animal Biosafety Level 4 Laboratories

    PubMed Central

    Bente, Dennis A; Friesen, Jeremy; White, Kyle; Koll, Jordan; Kobinger, Gary P

    2011-01-01

    The restrictive nature of an Animal Biosafety Level 4 (ABSL4) laboratory complicates even simple clinical evaluation including data capture. Typically, clinical data are recorded on paper during procedures, faxed out of the ABSL4, and subsequently manually entered into a computer. This system has many disadvantages including transcriptional errors. Here, we describe the development of a highly customizable, tablet-PC-based computerized data-capture system, allowing reliable collection of observational and clinical data from experimental animals in a restrictive biocontainment setting. A multidisciplinary team with skills in containment laboratory animal science, database design, and software engineering collaborated on the development of this system. The goals were to design an easy-to-use and flexible user interface on a touch-screen tablet PC with user-supportable processes for recovery, full auditing capabilities, and cost effectiveness. The system simplifies data capture, reduces the necessary time in an ABSL4 environment, offers timely reporting and review of data, facilitates statistical analysis, reduces potential of erroneous data entry, improves quality assurance of animal care, and advances the use and refinement of humane endpoints. PMID:22330712

  6. Determination of the effective sample thickness via radiative capture

    DOE PAGES

    Hurst, A. M.; Summers, N. C.; Szentmiklosi, L.; ...

    2015-09-14

    Our procedure for determining the effective thickness of non-uniform irregular-shaped samples via radiative capture is described. In this technique, partial γ-ray production cross sections of a compound nucleus produced in a neutron-capture reaction are measured using Prompt Gamma Activation Analysis and compared to their corresponding standardized absolute values. For the low-energy transitions, the measured cross sections are lower than their standard values due to significant photoelectric absorption of the γ rays within the bulk-sample volume itself. Using standard theoretical techniques, the amount of γ-ray self absorption and neutron self shielding can then be calculated by iteratively varying the sample thickness until the observed cross sections converge with the known standards. The overall attenuation provides a measure of the effective sample thickness illuminated by the neutron beam. This procedure is illustrated through radiative neutron capture using powdered oxide samples comprising enriched 186W and 182W from which their tungsten-equivalent effective thicknesses are deduced to be 0.077(3) mm and 0.042(8) mm, respectively.
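
    The iterative idea can be sketched with the photon self-absorption part alone (the full procedure also folds in neutron self-shielding): assume a uniformly emitting slab, compute its average transmission, and solve for the thickness that reproduces the observed-to-standard cross-section ratio. The attenuation coefficient and ratio below are illustrative.

    ```python
    import numpy as np

    def self_absorption(mu, t):
        """Average gamma-ray transmission of a uniformly emitting slab of
        thickness t (cm) with linear attenuation coefficient mu (1/cm)."""
        x = mu * t
        return (1.0 - np.exp(-x)) / x if x > 0 else 1.0

    def effective_thickness(ratio_observed, mu, t_lo=1e-5, t_hi=1.0, tol=1e-7):
        """Bisect for the thickness where self_absorption(mu, t) equals the
        measured low-energy cross section divided by its standard value,
        attributing the whole deficit to photon self-absorption."""
        while t_hi - t_lo > tol:
            t_mid = 0.5 * (t_lo + t_hi)
            if self_absorption(mu, t_mid) > ratio_observed:
                t_lo = t_mid      # too little absorption -> sample is thicker
            else:
                t_hi = t_mid
        return 0.5 * (t_lo + t_hi)

    # Illustrative numbers only (mu for tungsten near 100 keV is of order 80 1/cm).
    t_eff_cm = effective_thickness(ratio_observed=0.80, mu=80.0)
    print(f"effective thickness ~ {t_eff_cm * 10:.3f} mm")
    ```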

  7. Front page or "buried" beneath the fold? Media coverage of carbon capture and storage.

    PubMed

    Boyd, Amanda D; Paveglio, Travis B

    2014-05-01

    Media can affect public views and opinions on science, policy and risk issues. This is especially true of a controversial emerging technology that is relatively unknown. The study presented here employs a media content analysis of carbon capture and storage (CCS), one potential strategy to reduce greenhouse gas emissions. The authors analyzed all mentions of CCS in two leading Canadian national newspapers and two major western regional newspapers from the first article that discussed CCS in 2004 to the end of 2009 (825 articles). An in-depth content analysis was conducted to examine factors relating to risk from CCS, how the technology was portrayed and if coverage was negatively or positively biased. We conclude by discussing the possible impact of media coverage on support or opposition to CCS adoption.

  8. Compact mass spectrometer for plasma discharge ion analysis

    DOEpatents

    Tuszewski, M.G.

    1997-07-22

    A mass spectrometer and methods are disclosed for mass spectrometry which are useful in characterizing a plasma. This mass spectrometer for determining type and quantity of ions present in a plasma is simple, compact, and inexpensive. It accomplishes mass analysis in a single step, rather than the usual two-step process comprised of ion extraction followed by mass filtering. Ions are captured by a measuring element placed in a plasma and accelerated by a known applied voltage. Captured ions are bent into near-circular orbits by a magnetic field such that they strike a collector, producing an electric current. Ion orbits vary with applied voltage and proton mass ratio of the ions, so that ion species may be identified. Current flow provides an indication of quantity of ions striking the collector. 7 figs.
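
    The mass identification rests on the elementary relation r = sqrt(2 m V / q) / B for an ion accelerated through voltage V and bent by magnetic field B; the sketch below evaluates it for a few singly charged species, with field and voltage values chosen for illustration rather than taken from the patent.

    ```python
    import numpy as np

    Q_E = 1.602e-19    # elementary charge, C
    AMU = 1.661e-27    # atomic mass unit, kg

    def orbit_radius_cm(mass_amu, volts, b_tesla, charge=1):
        """Orbit radius r = sqrt(2*m*V/q) / B, returned in centimetres."""
        m = mass_amu * AMU
        q = charge * Q_E
        return np.sqrt(2.0 * m * volts / q) / b_tesla * 100.0

    # Illustrative accelerating voltage and field (not values from the patent).
    for species, mass in [("H+", 1), ("He+", 4), ("Ar+", 40)]:
        r = orbit_radius_cm(mass, volts=100.0, b_tesla=0.1)
        print(f"{species:4s} (A = {mass:2d}): r = {r:5.2f} cm")
    ```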

  9. Studies of mobile dust in scrape-off layer plasmas using silica aerogel collectors

    NASA Astrophysics Data System (ADS)

    Bergsåker, H.; Ratynskaia, S.; Litnovsky, A.; Ogata, D.; Sahle, W.

    2011-08-01

    Dust capture with ultralow density silica aerogel collectors is a new method, which allows time resolved in situ capture of dust particles in the scrape-off layers of fusion devices, without substantially damaging the particles. Particle composition and morphology, particle flux densities and particle velocity distributions can be determined through appropriate analysis of the aerogel surfaces after exposure. The method has been applied in comparative studies of intrinsic dust in the TEXTOR tokamak and in the Extrap T2R reversed field pinch. The analysis methods have been mainly optical microscopy and SEM. The method is shown to be applicable in both devices and the results are tentatively compared between the two plasma devices, which are very different in terms of edge plasma conditions, time scale, geometry and wall materials.

  10. High-frequency health data and spline functions.

    PubMed

    Martín-Rodríguez, Gloria; Murillo-Fort, Carlos

    2005-03-30

    Seasonal variations are highly relevant for health service organization. In general, short-run movements in medical series provide important information for managers in this field when making decisions. Thus, the analysis of the seasonal pattern in high-frequency health data is an appealing task. The aim of this paper is to propose procedures that allow the analysis of the seasonal component in this kind of data by means of spline functions embedded into a structural model. In the proposed method, useful adaptations of the traditional spline formulation are developed, and the resulting procedures are capable of capturing periodic variations, whether deterministic or stochastic, in a parsimonious way. Finally, these methodological tools are applied to a series of daily emergency service demand in order to capture simultaneous seasonal variations with different periods.
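
    One simple way to capture a periodic pattern with splines is a periodic cubic spline through day-of-week averages; the sketch below uses synthetic daily demand data and is only a loose illustration of the structural-model approach described above.

    ```python
    import numpy as np
    from scipy.interpolate import CubicSpline

    # Synthetic daily emergency-department demand with a weekly pattern.
    rng = np.random.default_rng(1)
    days = np.arange(28)
    weekly = 10.0 * np.sin(2 * np.pi * days / 7.0)              # true seasonal shape
    demand = 100.0 + weekly + rng.normal(scale=3.0, size=28)    # observed counts

    # Average by day of week, then interpolate with a periodic cubic spline.
    dow = days % 7
    profile = np.array([demand[dow == d].mean() for d in range(7)])
    knots_x = np.arange(8, dtype=float)              # 0..7, wrapping around
    knots_y = np.append(profile, profile[0])         # enforce periodicity
    seasonal = CubicSpline(knots_x, knots_y, bc_type="periodic")

    print(np.round(seasonal(np.arange(7, dtype=float)), 1))     # fitted weekly pattern
    ```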

  11. Fully-coupled analysis of jet mixing problems. Part 1. Shock-capturing model, SCIPVIS

    NASA Technical Reports Server (NTRS)

    Dash, S. M.; Wolf, D. E.

    1984-01-01

    A computational model, SCIPVIS, is described which predicts the multiple cell shock structure in imperfectly expanded, turbulent, axisymmetric jets. The model spatially integrates the parabolized Navier-Stokes jet mixing equations using a shock-capturing approach in supersonic flow regions and a pressure-split approximation in subsonic flow regions. The regions are coupled using a viscous-characteristic procedure. Turbulence processes are represented via the solution of compressibility-corrected two-equation turbulence models. The formation of Mach discs in the jet and the interactive analysis of the wake-like mixing process occurring behind Mach discs are handled in a rigorous manner. Calculations are presented exhibiting the fundamental interactive processes occurring in supersonic jets and the model is assessed via comparisons with detailed laboratory data for a variety of under- and overexpanded jets.

  12. Linear combination reading program for capture gamma rays

    USGS Publications Warehouse

    Tanner, Allan B.

    1971-01-01

    This program computes a weighting function, Qj, which gives a scalar output value of unity when applied to the spectrum of a desired element and a minimum value (considering statistics) when applied to spectra of materials not containing the desired element. Intermediate values are obtained for materials containing the desired element, in proportion to the amount of the element they contain. The program is written in the BASIC language in a format specific to the Hewlett-Packard 2000A Time-Sharing System, and is an adaptation of an earlier program for linear combination reading for X-ray fluorescence analysis (Tanner and Brinkerhoff, 1971). Following the program is a sample run from a study of the application of the linear combination technique to capture-gamma-ray analysis for calcium (report in preparation).
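
    The BASIC listing itself is not reproduced here, but the underlying idea can be sketched as a constrained least-squares (matched-filter style) calculation: choose weights that give a response of one to the desired element's spectrum while minimizing the response to interfering spectra. A hedged illustration with synthetic spectra follows (the variable names, ridge term and Gaussian peaks are assumptions of this sketch, not part of Tanner's program):

      import numpy as np

      def lc_weights(target, interferents, ridge=1e-6):
          # Weights w minimizing the summed squared response to the interfering
          # spectra subject to w . target == 1 (Lagrange-multiplier solution).
          A = np.asarray(interferents, dtype=float)     # rows = interfering spectra
          s = np.asarray(target, dtype=float)           # spectrum of desired element
          C = A.T @ A + ridge * np.eye(s.size)          # regularized normal matrix
          w = np.linalg.solve(C, s)
          return w / (s @ w)                            # enforce unit response to target

      rng = np.random.default_rng(1)
      channels = np.arange(128)
      calcium = np.exp(-0.5 * ((channels - 60) / 3.0) ** 2)          # synthetic Ca peak
      others = [np.exp(-0.5 * ((channels - c) / 4.0) ** 2) for c in (20, 90, 110)]

      w = lc_weights(calcium, others)
      print(round(float(w @ calcium), 3))               # ~1.0 for the desired element
      print([round(float(w @ o), 3) for o in others])   # near 0 for the interferents

    Applied to a mixed spectrum, the same weights return a value roughly proportional to the amount of the target element present, as described above.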

  13. Probabilistic seismic hazard analysis for a nuclear power plant site in southeast Brazil

    NASA Astrophysics Data System (ADS)

    de Almeida, Andréia Abreu Diniz; Assumpção, Marcelo; Bommer, Julian J.; Drouet, Stéphane; Riccomini, Claudio; Prates, Carlos L. M.

    2018-05-01

    A site-specific probabilistic seismic hazard analysis (PSHA) has been performed for the only nuclear power plant site in Brazil, located 130 km southwest of Rio de Janeiro at Angra dos Reis. Logic trees were developed for both the seismic source characterisation and ground-motion characterisation models, in both cases seeking to capture the appreciable ranges of epistemic uncertainty with relatively few branches. This logic-tree structure allowed the hazard calculations to be performed efficiently while obtaining results that reflect the inevitable uncertainty in long-term seismic hazard assessment in this tectonically stable region. An innovative feature of the study is an additional seismic source zone added to capture the potential contributions of characteristic earthquakes associated with geological faults in the region surrounding the coastal site.
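
    As a schematic illustration of how logic-tree branches enter a hazard calculation (the branch weights and exceedance curves below are invented and are not those of the Angra dos Reis study), the mean hazard curve is the weight-averaged curve over the branch combinations, and fractiles can be read from the weighted distribution of branch curves:

      import numpy as np

      # Hypothetical annual exceedance rates lambda(PGA) from three
      # source/ground-motion branch combinations, on a common PGA grid (g).
      pga = np.array([0.05, 0.1, 0.2, 0.4])
      branch_curves = np.array([
          [1e-2, 3e-3, 6e-4, 8e-5],
          [2e-2, 5e-3, 9e-4, 1e-4],
          [8e-3, 2e-3, 4e-4, 5e-5],
      ])
      weights = np.array([0.5, 0.3, 0.2])      # logic-tree branch weights (sum to 1)

      mean_hazard = weights @ branch_curves    # weighted mean hazard curve

      # Weighted 84th-percentile curve across branches at each PGA level
      order = np.argsort(branch_curves, axis=0)
      sorted_curves = np.take_along_axis(branch_curves, order, axis=0)
      cum_w = np.cumsum(weights[order], axis=0)
      idx = (cum_w >= 0.84).argmax(axis=0)
      p84 = sorted_curves[idx, np.arange(pga.size)]

      for g, m, f in zip(pga, mean_hazard, p84):
          print(f"PGA {g:.2f} g: mean {m:.1e}/yr, 84th percentile {f:.1e}/yr")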

  14. Compact mass spectrometer for plasma discharge ion analysis

    DOEpatents

    Tuszewski, Michel G.

    1997-01-01

    A mass spectrometer and methods for mass spectrometry are disclosed which are useful in characterizing a plasma. This mass spectrometer for determining type and quantity of ions present in a plasma is simple, compact, and inexpensive. It accomplishes mass analysis in a single step, rather than the usual two-step process consisting of ion extraction followed by mass filtering. Ions are captured by a measuring element placed in a plasma and accelerated by a known applied voltage. Captured ions are bent into near-circular orbits by a magnetic field such that they strike a collector, producing an electric current. Ion orbits vary with applied voltage and proton mass ratio of the ions, so that ion species may be identified. Current flow provides an indication of quantity of ions striking the collector.

  15. Adaptive Shape Functions and Internal Mesh Adaptation for Modelling Progressive Failure in Adhesively Bonded Joints

    NASA Technical Reports Server (NTRS)

    Stapleton, Scott; Gries, Thomas; Waas, Anthony M.; Pineda, Evan J.

    2014-01-01

    Enhanced finite elements are elements with an embedded analytical solution that can capture detailed local fields, enabling more efficient, mesh-independent finite element analysis. The shape functions are determined based on the analytical model rather than prescribed. This method was applied to adhesively bonded joints to model joint behavior with one element through the thickness. This study demonstrates two methods of maintaining the fidelity of such elements during adhesive non-linearity and cracking without increasing the mesh needed for an accurate solution. The first method uses adaptive shape functions, where the shape functions are recalculated at each load step based on the softening of the adhesive. The second method is internal mesh adaptation, where cracking of the adhesive within an element is captured by further discretizing the element internally to represent the partially cracked geometry. By keeping mesh adaptations within an element, a finer mesh can be used during the analysis without affecting the global finite element model mesh. Examples are shown which highlight when each method is most effective in reducing the number of elements needed to capture adhesive nonlinearity and cracking. These methods are validated against analogous finite element models utilizing cohesive zone elements.

  16. The cost of carbon capture and storage for natural gas combined cycle power plants.

    PubMed

    Rubin, Edward S; Zhai, Haibo

    2012-03-20

    This paper examines the cost of CO2 capture and storage (CCS) for natural gas combined cycle (NGCC) power plants. Existing studies employ a broad range of assumptions and lack a consistent costing method. This study takes a more systematic approach to analyze plants with an amine-based postcombustion CCS system with 90% CO2 capture. We employ sensitivity analyses together with a probabilistic analysis to quantify costs for plants with and without CCS under uncertainty or variability in key parameters. Results for new baseload plants indicate a likely increase in levelized cost of electricity (LCOE) of $20-32/MWh (constant 2007$) or $22-40/MWh in current dollars. A risk premium for plants with CCS increases these ranges to $23-39/MWh and $25-46/MWh, respectively. Based on current cost estimates, our analysis further shows that a policy to encourage CCS at new NGCC plants via an emission tax or carbon price requires (at 95% confidence) a price of at least $125/t CO2 to ensure NGCC-CCS is cheaper than a plant without CCS. Higher costs are found for nonbaseload plants and CCS retrofits.
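
    The logic behind such a breakeven carbon price can be sketched with simple arithmetic (the figures below are illustrative only, not the paper's probabilistic results): CCS becomes cheaper than paying for emissions once the tax avoided per MWh exceeds the added LCOE.

      def breakeven_carbon_price(delta_lcoe, emis_ref, emis_ccs):
          # Carbon price ($/t CO2) at which the levelized cost of a plant with
          # CCS equals that of the reference plant paying for its emissions.
          # delta_lcoe: added LCOE of CCS ($/MWh); emis_*: t CO2 per MWh.
          return delta_lcoe / (emis_ref - emis_ccs)

      # Hypothetical NGCC-like numbers: +30 $/MWh for CCS, 0.37 vs 0.04 t CO2/MWh
      print(round(breakeven_carbon_price(30.0, 0.37, 0.04), 1), "$/t CO2")

    Costs drawn from the upper end of the uncertainty range push the breakeven price higher, which is the sense in which the paper's $125/t CO2 figure is a 95%-confidence bound.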

  17. A microdosimetric study of ^10B(n,α)^7Li and ^157Gd(n,γ) reactions for neutron capture therapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, C.K.C.; Sutton, M.; Evans, T.M.

    1999-01-01

    This paper presents the microdosimetric analysis for the most interesting cell survival experiment recently performed at the Brookhaven National Laboratory (BNL). In this experiment, the cells were first treated with a gadolinium (Gd) labeled tumor-seeking boronated porphyrin (Gd-BOPP) or with BOPP alone, and then irradiated with thermal neutrons. The resulting cell-survival curves indicate that the ^157Gd(n,γ) reactions are very effective in cell killing. The death of a cell treated with Gd-BOPP was attributed to either the ^10B(n,α)^7Li reactions or the ^157Gd(n,γ) reactions (or both). However, the quantitative relationship between the two types of reaction and the cell-survival fraction was not clear. This paper presents the microdosimetric analysis for the BNL experiment based on the measured experimental parameters, and the results clearly suggest a quantitative relationship between the two types of reaction and the cell survival fraction. The results also suggest new research in gadolinium neutron capture therapy (GdNCT) which may lead to a more practical modality than the boron neutron capture therapy (BNCT) for treating cancers.

  18. Predicting the ultimate potential of natural gas SOFC power cycles with CO2 capture - Part A: Methodology and reference cases

    NASA Astrophysics Data System (ADS)

    Campanari, Stefano; Mastropasqua, Luca; Gazzani, Matteo; Chiesa, Paolo; Romano, Matteo C.

    2016-08-01

    Driven by the search for the highest theoretical efficiency, in recent years several studies have investigated the integration of high temperature fuel cells in natural gas fired power plants, where fuel cells are integrated with simple or modified Brayton cycles and/or with additional bottoming cycles, and CO2 can be separated via chemical or physical separation, oxy-combustion and cryogenic methods. Focusing on Solid Oxide Fuel Cells (SOFC) and following a comprehensive review and analysis of possible plant configurations, this work investigates their theoretical potential efficiency and proposes two ultra-high efficiency plant configurations based on advanced intermediate-temperature SOFCs integrated with a steam turbine or gas turbine cycle. The SOFC works at atmospheric or pressurized conditions and the resulting power plant exceeds 78% LHV efficiency without CO2 capture (as discussed in part A of the work) and 70% LHV efficiency with substantial CO2 capture (part B). The power plants are simulated at the 100 MW scale with a complete set of realistic assumptions about fuel cell (FC) performance, plant components and auxiliaries, presenting detailed energy and material balances together with a second law analysis.

  19. Do numerical rating scales and the Roland-Morris Disability Questionnaire capture changes that are meaningful to patients with persistent back pain?

    PubMed

    Hush, Julia M; Refshauge, Kathryn M; Sullivan, Gerard; De Souza, Lorraine; McAuley, James H

    2010-07-01

    To investigate patients' views about two common outcome measures used for back pain: Numerical Rating Scales for pain and the Roland-Morris Disability Questionnaire. Thirty-six working adults who had previously sought primary care for back pain and who could speak and read English. Eight focus groups were conducted to explore participants' views about the 11-point Numerical Rating Scales and the 24-item Roland-Morris Disability Questionnaire. Each group was led by a facilitator and an interview topic guide was used. Audio recordings of focus groups were transcribed verbatim. Framework analysis was used to chart participants' views and an interpretive analysis performed to explain the findings. Participants reported that neither the Roland-Morris nor the Numerical Rating Scales captured the complex personal experience of pain or relevant changes in their condition. The time-frame of assessment was identified as particularly problematic and the Roland-Morris did not capture relevant functional domains. This study provides empirical data that working adults with persistent back pain consider these clinical outcome measures largely inadequate. These measures currently used for back pain may contribute to misleading conclusions about treatment efficacy and patient recovery.

  20. Using Reconstructed POD Modes as Turbulent Inflow for LES Wind Turbine Simulations

    NASA Astrophysics Data System (ADS)

    Nielson, Jordan; Bhaganagar, Kiran; Juttijudata, Vejapong; Sirisup, Sirod

    2016-11-01

    Currently, in order to get realistic atmospheric effects of turbulence, wind turbine LES simulations require computationally expensive precursor simulations. At times, the precursor simulation is more computationally expensive than the wind turbine simulation. The precursor simulations are important because they capture turbulence in the atmosphere and, as stated above, turbulence impacts the power production estimation. On the other hand, POD analysis has been shown to be capable of capturing turbulent structures. The current study was performed to determine the plausibility of using lower dimension models from POD analysis of LES simulations as turbulent inflow to wind turbine LES simulations. The study will aid the wind energy community by lowering the computational cost of full scale wind turbine LES simulations, while maintaining a high level of turbulent information and being able to quickly apply the turbulent inflow to multi turbine wind farms. This will be done by comparing a pure LES precursor wind turbine simulation with simulations that use reduced POD mode inflow conditions. The study shows the feasibility of using lower dimension models as turbulent inflow for LES wind turbine simulations. Overall, the power production estimation and velocity field of the wind turbine wake are well captured with small errors.
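
    A minimal sketch of the POD step referred to above (POD via the singular value decomposition applied to a synthetic snapshot matrix; the LES coupling and inflow-plane handling are not shown and the array sizes are arbitrary):

      import numpy as np

      rng = np.random.default_rng(2)
      n_points, n_snapshots = 500, 200
      U_snap = rng.normal(size=(n_points, n_snapshots))     # synthetic velocity snapshots

      mean_flow = U_snap.mean(axis=1, keepdims=True)
      fluctuations = U_snap - mean_flow

      # POD via the SVD: left singular vectors are the spatial modes,
      # squared singular values give the energy content of each mode.
      modes, sing_vals, time_coeffs = np.linalg.svd(fluctuations, full_matrices=False)
      energy = sing_vals**2 / np.sum(sing_vals**2)

      # Reduced-order reconstruction with the first r modes, used as inflow
      r = 20
      low_dim = modes[:, :r] @ np.diag(sing_vals[:r]) @ time_coeffs[:r, :]
      inflow = mean_flow + low_dim

      print(f"{energy[:r].sum():.1%} of fluctuation energy captured by {r} modes")

    In the study, a truncated reconstruction of this kind, built from precursor-simulation snapshots, replaces the full precursor field at the wind turbine inlet.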

  1. A Soft, Wearable Microfluidic Device for the Capture, Storage, and Colorimetric Sensing of Sweat

    PubMed Central

    Koh, Ahyeon; Kang, Daeshik; Xue, Yeguang; Lee, Seungmin; Pielak, Rafal M.; Kim, Jeonghyun; Hwang, Taehwan; Min, Seunghwan; Banks, Anthony; Bastien, Philippe; Manco, Megan C.; Wang, Liang; Ammann, Kaitlyn R.; Jang, Kyung-In; Won, Phillip; Han, Seungyong; Ghaffari, Roozbeh; Paik, Ungyu; Slepian, Marvin J.; Balooch, Guive; Huang, Yonggang; Rogers, John A.

    2017-01-01

    Capabilities in health monitoring via capture and quantitative chemical analysis of sweat could complement, or potentially obviate the need for, approaches based on sporadic assessment of blood samples. Established sweat monitoring technologies use simple fabric swatches and are limited to basic analysis in controlled laboratory or hospital settings. We present a collection of materials and device designs for soft, flexible and stretchable microfluidic systems, including embodiments that integrate wireless communication electronics, which can intimately and robustly bond to the surface of skin without chemical and mechanical irritation. This integration defines access points for a small set of sweat glands such that perspiration spontaneously initiates routing of sweat through a microfluidic network and set of reservoirs. Embedded chemical analyses respond in colorimetric fashion to markers such as chloride and hydronium ions, glucose and lactate. Wireless interfaces to digital image capture hardware serve as a means for quantitation. Human studies demonstrated the functionality of this microfluidic device during fitness cycling in a controlled environment and during long-distance bicycle racing in arid, outdoor conditions. The results include quantitative values for sweat rate, total sweat loss, pH and concentration of both chloride and lactate. PMID:27881826

  2. Principal network analysis: identification of subnetworks representing major dynamics using gene expression data

    PubMed Central

    Kim, Yongsoo; Kim, Taek-Kyun; Kim, Yungu; Yoo, Jiho; You, Sungyong; Lee, Inyoul; Carlson, George; Hood, Leroy; Choi, Seungjin; Hwang, Daehee

    2011-01-01

    Motivation: Systems biology attempts to describe complex systems behaviors in terms of dynamic operations of biological networks. However, there is a lack of tools that can effectively decode complex network dynamics over multiple conditions. Results: We present principal network analysis (PNA) that can automatically capture major dynamic activation patterns over multiple conditions and then generate protein and metabolic subnetworks for the captured patterns. We first demonstrated the utility of this method by applying it to a synthetic dataset. The results showed that PNA correctly captured the subnetworks representing dynamics in the data. We further applied PNA to two time-course gene expression profiles collected from (i) MCF7 cells after treatments of HRG at multiple doses and (ii) brain samples of four strains of mice infected with two prion strains. The resulting subnetworks and their interactions revealed network dynamics associated with HRG dose-dependent regulation of cell proliferation and differentiation and early PrPSc accumulation during prion infection. Availability: The web-based software is available at: http://sbm.postech.ac.kr/pna. Contact: dhhwang@postech.ac.kr; seungjin@postech.ac.kr Supplementary information: Supplementary data are available at Bioinformatics online. PMID:21193522

  3. Survey of fishes and environmental conditions in Abbotts Lagoon, Point Reyes National Seashore, California

    USGS Publications Warehouse

    Saiki, M.K.; Martin, B.A.

    2001-01-01

    This study was conducted to gain a better understanding of fishery resources in Abbotts Lagoon, Point Reyes National Seashore. During February/March, May, August, and November 1999, fish were sampled with floating variable-mesh gill nets and small minnow traps from as many as 14 sites in the lagoon. Water temperature, dissolved oxygen, pH, total ammonia (NH3 + NH4+), salinity, turbidity, water depth, and bottom substrate composition were also measured at each site. A total of 2,656 fish represented by eight species was captured during the study. Gill nets captured Sacramento perch, Archoplites interruptus; largemouth bass, Micropterus salmoides; Pacific herring, Clupea pallasi; prickly sculpin, Cottus asper; silver surfperch, Hyperprosopon ellipticum; longfin smelt, Spirinchus thaleichthys; and striped bass, Morone saxatilis; whereas minnow traps captured Sacramento perch; prickly sculpin; and threespine stickleback, Gasterosteus aculeatus. Cluster analysis (Ward's minimum variance method) of fish catch statistics identified two major species assemblages: the first dominated by Sacramento perch and, to a lesser extent, by largemouth bass, and the second dominated by Pacific herring and threespine stickleback. Simple discriminant analysis of environmental variables indicated that salinity contributed the most towards separating the two assemblages.
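
    A sketch of the kind of catch-based clustering described above (Ward's minimum-variance method applied to a site-by-species catch matrix; the counts are invented, not the Abbotts Lagoon data):

      import numpy as np
      from scipy.cluster.hierarchy import linkage, fcluster

      # Hypothetical catch matrix: rows = sampling sites, columns = species counts
      species = ["Sacramento perch", "largemouth bass", "Pacific herring", "stickleback"]
      catch = np.array([
          [40, 12,  1,  0],
          [35,  9,  0,  2],
          [ 2,  0, 50, 30],
          [ 1,  1, 45, 25],
          [30,  8,  2,  1],
      ], dtype=float)

      # Ward's minimum-variance hierarchical clustering of the sites
      Z = linkage(catch, method="ward")
      assemblage = fcluster(Z, t=2, criterion="maxclust")
      print(assemblage)    # e.g. [1 1 2 2 1]: two species assemblages among the sites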

  4. Dynamic analysis of Apollo-Salyut/Soyuz docking

    NASA Technical Reports Server (NTRS)

    Schliesing, J. A.

    1972-01-01

    The use of a docking-system computer program in analyzing the dynamic environment produced by two impacting spacecraft and the attitude control systems is discussed. Performance studies were conducted to determine the mechanism load and capture sensitivity to parametric changes in the initial impact conditions. As indicated by the studies, capture latching is most sensitive to vehicle angular-alignment errors and is least sensitive to lateral-miss error. As shown by load-sensitivity studies, peak loads acting on the Apollo spacecraft are considerably lower than the Apollo design-limit loads.

  5. U.S. Army Chemical Corps Historical Studies, Gas Warfare in World War I: The 5th Division Captures Frapelle, August 1918

    DTIC Science & Technology

    1958-03-01

    [OCR residue of the report cover and table of contents; recoverable items include the series title U.S. Army Chemical Corps Historical Studies, the study title Gas Warfare in World War I: The 5th Division Captures Frapelle, August 1918 (Study Number 7), and contents entries such as "Analysis Concerning the Weight of Shell," "The Cost of Frapelle," and "Medical Department Comments."]

  6. Measurement of the Parity-Violating directional Gamma-ray Asymmetry in Polarized Neutron Capture on ^35Cl

    NASA Astrophysics Data System (ADS)

    Fomin, Nadia

    2012-03-01

    The NPDGamma experiment aims to measure the parity-odd correlation between the neutron spin and the direction of the emitted photon in neutron-proton capture. A parity violating asymmetry (to be measured to 10^-8) from this process can be directly related to the strength of the hadronic weak interaction between nucleons. As part of the commissioning runs on the Fundamental Neutron Physics beamline at the Spallation Neutron Source at ORNL, the gamma-ray asymmetry from the parity-violating capture of cold neutrons on ^35Cl was measured, primarily to check for systematic effects and false asymmetries. The current precision from existing world measurements on this asymmetry is at the level of 10^-6 and we believe we can improve it. The analysis methodology as well as preliminary results will be presented.

  7. In silico screening of carbon-capture materials

    NASA Astrophysics Data System (ADS)

    Lin, Li-Chiang; Berger, Adam H.; Martin, Richard L.; Kim, Jihan; Swisher, Joseph A.; Jariwala, Kuldeep; Rycroft, Chris H.; Bhown, Abhoyjit S.; Deem, Michael W.; Haranczyk, Maciej; Smit, Berend

    2012-07-01

    One of the main bottlenecks to deploying large-scale carbon dioxide capture and storage (CCS) in power plants is the energy required to separate the CO2 from flue gas. For example, near-term CCS technology applied to coal-fired power plants is projected to reduce the net output of the plant by some 30% and to increase the cost of electricity by 60-80%. Developing capture materials and processes that reduce the parasitic energy imposed by CCS is therefore an important area of research. We have developed a computational approach to rank adsorbents for their performance in CCS. Using this analysis, we have screened hundreds of thousands of zeolite and zeolitic imidazolate framework structures and identified many different structures that have the potential to reduce the parasitic energy of CCS by 30-40% compared with near-term technologies.

  8. The Open Cluster Chemical Abundances and Mapping (OCCAM) Survey: Optical Extension for Neutron Capture Elements

    NASA Astrophysics Data System (ADS)

    Melendez, Matthew; O'Connell, Julia; Frinchaboy, Peter M.; Donor, John; Cunha, Katia M. L.; Shetrone, Matthew D.; Majewski, Steven R.; Zasowski, Gail; Pinsonneault, Marc H.; Roman-Lopes, Alexandre; Stassun, Keivan G.; APOGEE Team

    2017-01-01

    The Open Cluster Chemical Abundance & Mapping (OCCAM) survey is a systematic survey of Galactic open clusters using data primarily from the SDSS-III/APOGEE-1 survey. However, neutron capture elements are very limited in the IR region covered by APOGEE. In an effort to fully study detailed Galactic chemical evolution, we are conducting a high resolution (R~60,000) spectroscopic abundance analysis of neutron capture elements for OCCAM clusters in the optical regime to complement the APOGEE results. As part of this effort, we present Ba II, La II, Ce II and Eu II results for a few open clusters without previous abundance measurements using data obtained at McDonald Observatory with the 2.1m Otto Struve telescope and Sandiford Echelle Spectrograph. This work is supported by an NSF AAG grant AST-1311835.

  9. A Quantitative Three-Dimensional Image Analysis Tool for Maximal Acquisition of Spatial Heterogeneity Data.

    PubMed

    Allenby, Mark C; Misener, Ruth; Panoskaltsis, Nicki; Mantalaris, Athanasios

    2017-02-01

    Three-dimensional (3D) imaging techniques provide spatial insight into environmental and cellular interactions and are implemented in various fields, including tissue engineering, but have been restricted by limited quantification tools that misrepresent or underutilize the cellular phenomena captured. This study develops image postprocessing algorithms pairing complex Euclidean metrics with Monte Carlo simulations to quantitatively assess cell and microenvironment spatial distributions while utilizing, for the first time, the entire 3D image captured. Although current methods only analyze a central fraction of presented confocal microscopy images, the proposed algorithms can utilize 210% more cells to calculate 3D spatial distributions that can span a 23-fold longer distance. These algorithms seek to leverage the high sample cost of 3D tissue imaging techniques by extracting maximal quantitative data throughout the captured image.
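
    A minimal sketch of pairing a Euclidean spatial metric with a Monte Carlo reference, in the spirit of the approach described above (a 3D nearest-neighbour distance compared against randomly placed points; the published tool's specific metrics and image-processing steps are not reproduced, and the centroids here are synthetic):

      import numpy as np

      def mean_nn_distance(points):
          # Mean Euclidean distance from each point to its nearest neighbour.
          d = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=-1)
          np.fill_diagonal(d, np.inf)
          return d.min(axis=1).mean()

      rng = np.random.default_rng(3)
      cells = rng.uniform(0, 100, size=(300, 3))      # hypothetical cell centroids (um)

      observed = mean_nn_distance(cells)

      # Monte Carlo baseline: same number of points placed uniformly at random
      baseline = np.array([
          mean_nn_distance(rng.uniform(0, 100, size=(300, 3))) for _ in range(200)
      ])
      p_value = np.mean(baseline <= observed)         # small p suggests clustering
      print(f"observed {observed:.2f} um, random mean {baseline.mean():.2f} um, p={p_value:.2f}")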

  10. Enabling CCS via Low-temperature Geothermal Energy Integration for Fossil-fired Power Generation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Davidson, Casie L.; Heldebrant, D. J.; Bearden, M. D.

    Here, among the key barriers to commercial scale deployment is the cost associated with CO2 capture. This is particularly true for existing large, fossil-fired assets that account for a large fraction of the electricity generation fleet in developed nations, including the U.S. Fitting conventional combustion technologies with CO2 capture systems can carry an energy penalty of thirty percent or more, resulting in an increased price of power to the grid, as well as an overall decrease in net plant output. Taken together with the positive growth in demand for electricity, this implies a need for accelerated capital build-out in the power generation markets to accommodate both demand growth and decreased output at retrofitted plants. In this paper, the authors present the results of a study to assess the potential to use geothermal energy to provide boiler feedwater preheating, capturing efficiency improvements designed to offset the losses associated with CO2 capture. Based on NETL benchmark cases and subsequent analysis of the application using site-specific data from the North Valmy power plant, several cases for CO2 capture were evaluated. These included geothermally assisted MEA capture, CO2BOLs capture, and stand-alone hybrid power generation, compared with a baseline, no-geothermal case. Based on Case 10, and assuming 2.7 MMlb/h of geothermally sourced 150 °C water, the parasitic power load associated with MEA capture could be offset by roughly seven percent, resulting in a small (~1 percent) overall loss to net power generation, but at levelized costs of electricity similar to the no-geothermal CCS case. For the CO2BOLs case, the availability of 150 °C geothermal fluid could allow the facility to not only offset the net power decrease associated with CO2BOLs capture alone, but could increase nameplate capacity by two percent. The geothermally coupled CO2BOLs case also decreases LCOE by 0.75 ¢/kWh relative to the non-hybrid CO2BOLs case, with the improved performance over the MEA case driven by the lower regeneration temperature and associated duty for CO2BOLs relative to MEA.

  11. Elliptic Capture Orbits for Missions to the Near Planets

    NASA Technical Reports Server (NTRS)

    Casal, Federico G.; Swenson, Byron L.; Mascy, Alfred C.

    1968-01-01

    Elliptic capture orbits around Mars and Venus have often been considered as means for reducing arrival and departure energy requirements for two-way missions. It had also generally been feared that the energy savings obtained by capturing a spacecraft into a highly elliptical orbit (rather than a near circular orbit of the same periapsis) would largely be offset by the penalties incurred in aligning the semi-major axis of the ellipse in such a way as to obtain the proper orientation of the departure hyperbola. This paper presents the results of an analysis which takes into consideration the penalties arising from the requirement to match the orientation of the elliptical orbit with the asymptote of the departure hyperbola. The scientific aspects of elliptical orbits around the target planet are discussed, and it is shown that such orbits exhibit characteristics which may be considered advantageous or disadvantageous depending on the purpose of the mission. Alignment of the semi-major axis of the capture ellipse relative to the asymptote of the escape hyperbola was found not to be a critical requirement since the kinetic energy remains high over a substantial portion of the elliptical capture orbit. This means that the escape stage can operate efficiently even when ignited at some angle from the true periapsis point. Considerable freedom in choosing this angle is available at little propulsive cost. The resulting latitude in the choice of angles between arrival and escape asymptotes makes it possible to consider a wide variety of interplanetary transfers and planetary staytimes without the need for separate propulsive maneuvers to realign the capture ellipse before departure. Special consideration has also been given to plane change maneuvers around the planet. These may be required for reasons of orbit dynamics or scientific experimentation and are not uniquely tied to elliptical captures. The sensitivity of the mass of the excursion module to the eccentricity of the capture orbit is discussed and mass-penalty diagrams are presented. It is shown that these penalties do not materially offset the large gains obtained through the use of the elliptical capture mode.
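
    The energetic advantage of an elliptic capture orbit can be sketched directly from the vis-viva equation (a simplified two-body illustration with hypothetical Mars-arrival numbers, not the mission analysis of the paper):

      import math

      MU_MARS = 4.2828e13        # Mars gravitational parameter, m^3/s^2

      def capture_dv(v_inf, r_p, r_a):
          # Delta-v applied at periapsis to go from the arrival hyperbola (excess
          # speed v_inf) into an orbit with periapsis r_p and apoapsis r_a (metres).
          v_hyp = math.sqrt(v_inf**2 + 2.0 * MU_MARS / r_p)        # hyperbolic speed at r_p
          a = 0.5 * (r_p + r_a)                                    # semi-major axis of target orbit
          v_orbit = math.sqrt(MU_MARS * (2.0 / r_p - 1.0 / a))     # vis-viva speed at periapsis
          return v_hyp - v_orbit

      r_p = 3_390e3 + 500e3            # 500 km periapsis altitude above Mars' radius
      v_inf = 3_000.0                  # hypothetical hyperbolic excess speed, m/s

      dv_circular = capture_dv(v_inf, r_p, r_p)           # near-circular capture
      dv_elliptic = capture_dv(v_inf, r_p, 10 * r_p)      # highly elliptical capture
      print(f"circular {dv_circular:.0f} m/s, elliptic {dv_elliptic:.0f} m/s")

    With these assumed numbers the elliptical capture roughly halves the insertion delta-v, which is the kind of saving the paper weighs against the orientation penalties.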

  12. Enabling CCS via Low-temperature Geothermal Energy Integration for Fossil-fired Power Generation

    DOE PAGES

    Davidson, Casie L.; Heldebrant, D. J.; Bearden, M. D.; ...

    2017-08-18

    Here, among the key barriers to commercial scale deployment is the cost associated with CO2 capture. This is particularly true for existing large, fossil-fired assets that account for a large fraction of the electricity generation fleet in developed nations, including the U.S. Fitting conventional combustion technologies with CO2 capture systems can carry an energy penalty of thirty percent or more, resulting in an increased price of power to the grid, as well as an overall decrease in net plant output. Taken together with the positive growth in demand for electricity, this implies a need for accelerated capital build-out in the power generation markets to accommodate both demand growth and decreased output at retrofitted plants. In this paper, the authors present the results of a study to assess the potential to use geothermal energy to provide boiler feedwater preheating, capturing efficiency improvements designed to offset the losses associated with CO2 capture. Based on NETL benchmark cases and subsequent analysis of the application using site-specific data from the North Valmy power plant, several cases for CO2 capture were evaluated. These included geothermally assisted MEA capture, CO2BOLs capture, and stand-alone hybrid power generation, compared with a baseline, no-geothermal case. Based on Case 10, and assuming 2.7 MMlb/h of geothermally sourced 150 °C water, the parasitic power load associated with MEA capture could be offset by roughly seven percent, resulting in a small (~1 percent) overall loss to net power generation, but at levelized costs of electricity similar to the no-geothermal CCS case. For the CO2BOLs case, the availability of 150 °C geothermal fluid could allow the facility to not only offset the net power decrease associated with CO2BOLs capture alone, but could increase nameplate capacity by two percent. The geothermally coupled CO2BOLs case also decreases LCOE by 0.75 ¢/kWh relative to the non-hybrid CO2BOLs case, with the improved performance over the MEA case driven by the lower regeneration temperature and associated duty for CO2BOLs relative to MEA.

  13. The Use of Nanotrap Particles Technology in Capturing HIV-1 Virions and Viral Proteins from Infected Cells

    PubMed Central

    Sampey, Gavin; Shafagati, Nazly; Van Duyne, Rachel; Iordanskiy, Sergey; Kehn-Hall, Kylene; Liotta, Lance; Petricoin, Emanuel; Young, Mary; Lepene, Benjamin; Kashanchi, Fatah

    2014-01-01

    HIV-1 infection results in a chronic but incurable illness since long-term HAART can keep the virus at an undetectable level. However, discontinuation of therapy rapidly increases viral burden. Moreover, patients under HAART frequently develop various metabolic disorders and HIV-associated neuronal disease. Today, the main challenge of HIV-1 research is the elimination of the residual virus in infected individuals. The current HIV-1 diagnostics are largely comprised of serological and nucleic acid based technologies. Our goal is to integrate the nanotrap technology into a standard research tool that will allow sensitive detection of HIV-1 infection. This study demonstrates that the majority of HIV-1 virions in culture supernatants and Tat/Nef proteins spiked in culture medium can be captured by nanotrap particles. To determine the binding affinities of different baits, we incubated target molecules with nanotrap particles at room temperature. After short sequestration, materials were either eluted or remained attached to nanotrap particles prior to analysis. The unique affinity baits of nanotrap particles preferentially bound HIV-1 materials while excluding albumin. A high level of capture of Tat or Tat peptide by NT082 and NT084 particles was measured by western blot (WB). Intracellular Nef protein was captured by NT080, while membrane-associated Nef was captured by NT086 and also detected by WB. Selective capture of HIV-1 particles by NT073 and NT086 was measured by reverse transcriptase assay, while capture of infectious HIV-1 by these nanoparticles was demonstrated by functional transactivation in TZM-bl cells. We also demonstrated specific capture of HIV-1 particles and exosomes containing TAR-RNA in patients' serum by NT086 and NT082 particles, respectively, using specific qRT-PCR. Collectively, our data indicate that certain types of nanotrap particles selectively capture specific HIV-1 molecules, and we propose to use this technology as a platform to enhance HIV-1 detection by concentrating viral proteins and infectious virions from infected samples. PMID:24820173

  14. MotionExplorer: exploratory search in human motion capture data based on hierarchical aggregation.

    PubMed

    Bernard, Jürgen; Wilhelm, Nils; Krüger, Björn; May, Thorsten; Schreck, Tobias; Kohlhammer, Jörn

    2013-12-01

    We present MotionExplorer, an exploratory search and analysis system for sequences of human motion in large motion capture data collections. This special type of multivariate time series data is relevant in many research fields including medicine, sports and animation. Key tasks in working with motion data include analysis of motion states and transitions, and synthesis of motion vectors by interpolation and combination. In the practice of research and application of human motion data, challenges exist in providing visual summaries and drill-down functionality for handling large motion data collections. We find that this domain can benefit from appropriate visual retrieval and analysis support to handle these tasks in the presence of large motion data. To address this need, we developed MotionExplorer together with domain experts as an exploratory search system based on interactive aggregation and visualization of motion states as a basis for data navigation, exploration, and search. Based on an overview-first type of visualization, users are able to search for interesting sub-sequences of motion based on a query-by-example metaphor, and explore search results by details on demand. We developed MotionExplorer in close collaboration with the targeted users, researchers working on human motion synthesis and analysis, and evaluated it through a summative field study. Additionally, we conducted a laboratory design study to substantially improve MotionExplorer towards an intuitive, usable and robust design. MotionExplorer enables the search in human motion capture data with only a few mouse clicks. The researchers unanimously confirm that the system can efficiently support their work.

  15. Proteomic analysis of laser-captured paraffin-embedded tissues: a molecular portrait of head and neck cancer progression.

    PubMed

    Patel, Vyomesh; Hood, Brian L; Molinolo, Alfredo A; Lee, Norman H; Conrads, Thomas P; Braisted, John C; Krizman, David B; Veenstra, Timothy D; Gutkind, J Silvio

    2008-02-15

    Squamous cell carcinoma of the head and neck (HNSCC), the sixth most prevalent cancer among men worldwide, is associated with poor prognosis, which has improved only marginally over the past three decades. A proteomic analysis of HNSCC lesions may help identify novel molecular targets for the early detection, prevention, and treatment of HNSCC. Laser capture microdissection was combined with recently developed techniques for protein extraction from formalin-fixed paraffin-embedded (FFPE) tissues and a novel proteomics platform. Approximately 20,000 cells procured from FFPE tissue sections of normal oral epithelium and well, moderately, and poorly differentiated HNSCC were processed for mass spectrometry and bioinformatic analysis. A large number of proteins expressed in normal oral epithelium and HNSCC, including cytokeratins, intermediate filaments, differentiation markers, and proteins involved in stem cell maintenance, signal transduction, migration, cell cycle regulation, growth and angiogenesis, matrix degradation, and proteins with tumor suppressive and oncogenic potential, were readily detected. Of interest, the relative expression of many of these molecules followed a distinct pattern in normal squamous epithelia and well, moderately, and poorly differentiated HNSCC tumor tissues. Representative proteins were further validated using immunohistochemical studies in HNSCC tissue sections and tissue microarrays. The ability to combine laser capture microdissection and in-depth proteomic analysis of FFPE tissues provided a wealth of information regarding the nature of the proteins expressed in normal squamous epithelium and during HNSCC progression, which may allow the development of novel biomarkers of diagnostic and prognostic value and the identification of novel targets for therapeutic intervention in HNSCC.

  16. Immuno-affinity Capture Followed by TMPP N-Terminus Tagging to Study Catabolism of Therapeutic Proteins.

    PubMed

    Kullolli, Majlinda; Rock, Dan A; Ma, Ji

    2017-02-03

    Characterization of in vitro and in vivo catabolism of therapeutic proteins has increasingly become an integral part of the discovery and development process for novel proteins. Unambiguous and efficient identification of catabolites can not only facilitate accurate understanding of pharmacokinetic profiles of drug candidates, but also enable follow-up protein engineering to generate more catabolically stable molecules with improved properties (pharmacokinetics and pharmacodynamics). Immunoaffinity capture (IC) followed by top-down intact protein analysis using either matrix-assisted laser desorption/ionization or electrospray ionization mass spectrometry has been the primary method of choice for catabolite identification. However, the sensitivity and efficiency of these methods is not always sufficient for characterization of novel proteins from complex biomatrices such as plasma or serum. In this study a novel bottom-up targeted protein workflow was optimized for analysis of proteolytic degradation of therapeutic proteins. Selective and sensitive tagging of the alpha-amine at the N-terminus of proteins of interest was performed by immunoaffinity capture of the therapeutic protein and its catabolites followed by on-bead succinimidyloxycarbonylmethyl tris(2,4,6-trimethoxyphenyl)phosphonium (TMPP) N-terminus tagging (TMPP-NTT). The positively charged hydrophobic TMPP tag facilitates unambiguous sequence identification of all N-terminus peptides from complex tryptic digestion samples via data-dependent liquid chromatography-tandem mass spectrometry. Utility of the workflow was illustrated by definitive analysis of the in vitro catabolic profile of a neurotensin human Fc (NTs-huFc) protein in mouse serum. The results from this study demonstrated that the IC-TMPP-NTT workflow is a simple and efficient method for characterizing catabolite formation in therapeutic proteins.

  17. Strategic Analysis Overview

    NASA Technical Reports Server (NTRS)

    Cirillo, William M.; Earle, Kevin D.; Goodliff, Kandyce E.; Reeves, J. D.; Stromgren, Chel; Andraschko, Mark R.; Merrill, R. Gabe

    2008-01-01

    NASA's Constellation Program employs a strategic analysis methodology to provide an integrated analysis capability for lunar exploration scenarios and to support strategic decision-making regarding those scenarios. The strategic analysis methodology integrates the assessment of the major contributors to strategic objective satisfaction (performance, affordability, and risk) and captures the linkages and feedbacks between all three components. Strategic analysis supports strategic decision making by senior management through comparable analysis of alternative strategies, provision of a consistent set of high level value metrics, and the enabling of cost-benefit analysis. The tools developed to implement the strategic analysis methodology are not element design and sizing tools. Rather, these models evaluate strategic performance using predefined elements, imported into a library from expert-driven design/sizing tools or expert analysis. Specific components of the strategic analysis tool set include scenario definition, requirements generation, mission manifesting, scenario lifecycle costing, crew time analysis, objective satisfaction benefit, risk analysis, and probabilistic evaluation. Results from all components of strategic analysis are evaluated against a set of pre-defined figures of merit (FOMs). These FOMs capture the high-level strategic characteristics of all scenarios and facilitate direct comparison of options. The strategic analysis methodology that is described in this paper has previously been applied to the Space Shuttle and International Space Station Programs and is now being used to support the development of the baseline Constellation Program lunar architecture. This paper will present an overview of the strategic analysis methodology and will present sample results from the application of the strategic analysis methodology to the Constellation Program lunar architecture.

  18. Pilot testing of a membrane system for postcombustion CO 2 capture

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Merkel, Tim; Kniep, Jay; Wei, Xiaotong

    2015-09-30

    This final report summarizes work conducted for the U.S. Department of Energy, National Energy Technology Laboratory (DOE) to scale up an efficient post-combustion CO2 capture membrane process to the small pilot test stage (award number DE-FE0005795). The primary goal of this research program was to design, fabricate, and operate a membrane CO2 capture system to treat coal-derived flue gas containing 20 tonnes CO2/day (20 TPD). Membrane Technology and Research (MTR) conducted this project in collaboration with Babcock and Wilcox (B&W), the Electric Power Research Institute (EPRI), WorleyParsons (WP), the Illinois Sustainable Technology Center (ISTC), Enerkem (EK), and the National Carbon Capture Center (NCCC). In addition to the small pilot design, build and slipstream testing at NCCC, other project efforts included laboratory membrane and module development at MTR, validation field testing on a 1 TPD membrane system at NCCC, boiler modeling and testing at B&W, a techno-economic analysis (TEA) by EPRI/WP, a case study of the membrane technology applied to a ~20 MWe power plant by ISTC, and an industrial CO2 capture test at an Enerkem waste-to-biofuel facility. The 20 TPD small pilot membrane system built in this project successfully completed over 1,000 hours of operation treating flue gas at NCCC. The Polaris™ membranes used on this system demonstrated stable performance, and when combined with over 10,000 hours of operation at NCCC on a 1 TPD system, the risk associated with uncertainty in the durability of postcombustion capture membranes has been greatly reduced. Moreover, next-generation Polaris membranes with higher performance and lower cost were validation tested on the 1 TPD system. The 20 TPD system also demonstrated successful operation of a new low-pressure-drop sweep module that will reduce parasitic energy losses at full scale by as much as 10 MWe. In modeling and pilot boiler testing, B&W confirmed the viability of CO2 recycle to the boiler as envisioned in the MTR process design. The impact of this CO2 recycle on boiler efficiency was quantified and incorporated into a TEA of the membrane capture process applied to a full-scale power plant. As with previous studies, the TEA showed the membrane process to be lower cost than the conventional solvent capture process even at 90% CO2 capture. A sensitivity study indicates that the membrane capture cost decreases significantly if the 90% capture requirement is relaxed. Depending on the process design, a minimum capture cost is achieved at 30-60% capture, values that would meet proposed CO2 emission regulations for coal-fired power plants. In summary, this project has successfully advanced the MTR membrane capture process through small pilot testing (technology readiness level 6). The technology is ready for future scale-up to the 10 MWe size.

  19. Updated (BP3) Technical and Economic Feasibility Study - Electrochemical Membrane for Carbon Dioxide Capture and Power Generation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ghezel-Ayagh, Hossein

    This topical report summarizes the results of an updated Technical & Economic Feasibility Study (T&EFS) which was conducted in Budget Period 3 of the project to evaluate the performance and cost of the Electrochemical Membrane (ECM)-based CO2 capture system. The ECM technology is derived from commercially available inorganic membranes; the same used in FuelCell Energy’s commercial fuel cell power plants and sold under the trade name Direct FuelCell® (DFC®). The ECM stacks are utilized in the Combined Electric Power (generation) And Carbon dioxide Separation (CEPACS) systems which can be deployed as add-ons to conventional power plants (Pulverized Coal, Combined Cycle, etc.) or industrial facilities to simultaneously produce power while capturing >90% of the CO2 from the flue gas. In this study, an ECM-based CEPACS plant was designed to capture and compress >90% of the CO2 (for sequestration or beneficial use) from the flue gas of a reference 550 MW (nominal, net AC) Pulverized Coal (PC) Rankine Cycle (Subcritical steam) power plant. ECM performance was updated based on bench scale ECM stack test results. The system process simulations were performed to generate the CEPACS plant performance estimates. The performance assessment included estimation of the parasitic power consumption for CO2 capture and compression, and the efficiency impact on the PC plant. While the ECM-based CEPACS system for the 550 MW PC plant captures 90% of CO2 from the flue gas, it generates additional (net AC) power after compensating for the auxiliary power requirements of CO2 capture and compression. An equipment list, ECM stacks packaging design, and CEPACS plant layout were developed to facilitate the economic analysis. Vendor quotes were also solicited. The economic feasibility study included estimation of CEPACS plant capital cost, cost of electricity (COE) analyses and estimation of cost per ton of CO2 captured. The incremental COE for the ECM-based CO2 capture is expected to meet U.S. DOE’s target of no more than a 35% increase in COE. This study has indicated that CEPACS systems offer significant benefits with respect to cost, performance, water consumption and emissions to the environment. The realization of these benefits will provide a single solution to carbon dioxide capture in addition to meeting the increasing demand for electricity.

  20. Updated (BP3) Technical and Economic Feasibility Study - Electrochemical Membrane for Carbon Dioxide Capture and Power Generation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ghezel-Ayagh, Hossein

    This topical report summarizes the results of an updated Technical & Economic Feasibility Study (T&EFS) which was conducted in Budget Period 3 of the project to evaluate the performance and cost of the Electrochemical Membrane (ECM)-based CO2 capture system. The ECM technology is derived from commercially available inorganic membranes; the same used in FuelCell Energy’s commercial fuel cell power plants and sold under the trade name Direct FuelCell® (DFC®). The ECM stacks are utilized in the Combined Electric Power (generation) And Carbon dioxide Separation (CEPACS) systems which can be deployed as add-ons to conventional power plants (Pulverized Coal, Combined Cycle, etc.) or industrial facilities to simultaneously produce power while capturing >90% of the CO2 from the flue gas. In this study, an ECM-based CEPACS plant was designed to capture and compress >90% of the CO2 (for sequestration or beneficial use) from the flue gas of a reference 550 MW (nominal, net AC) Pulverized Coal (PC) Rankine Cycle (Subcritical steam) power plant. ECM performance was updated based on bench scale ECM stack test results. The system process simulations were performed to generate the CEPACS plant performance estimates. The performance assessment included estimation of the parasitic power consumption for CO2 capture and compression, and the efficiency impact on the PC plant. While the ECM-based CEPACS system for the 550 MW PC plant captures 90% of CO2 from the flue gas, it generates additional (net AC) power after compensating for the auxiliary power requirements of CO2 capture and compression. An equipment list, ECM stacks packaging design, and CEPACS plant layout were developed to facilitate the economic analysis. Vendor quotes were also solicited. The economic feasibility study included estimation of CEPACS plant capital cost, cost of electricity (COE) analyses and estimation of cost per ton of CO2 captured. The incremental COE for the ECM-based CO2 capture is expected to meet U.S. DOE’s target of no more than a 35% increase in COE. This study has indicated that CEPACS systems offer significant benefits with respect to cost, performance, water consumption and emissions to the environment. The realization of these benefits will provide a single solution to carbon dioxide capture in addition to meeting the increasing demand for electricity.

  1. Reproducibility of combinatorial peptide ligand libraries for proteome capture evaluated by selected reaction monitoring.

    PubMed

    Di Girolamo, Francesco; Righetti, Pier Giorgio; Soste, Martin; Feng, Yuehan; Picotti, Paola

    2013-08-26

    Systems biology studies require the capability to quantify with high precision proteins spanning a broad range of abundances across multiple samples. However, the broad range of protein expression in cells often precludes the detection of low-abundance proteins. Different sample processing techniques can be applied to increase proteome coverage. Among these, combinatorial (hexa)peptide ligand libraries (CPLLs) bound to solid matrices have been used to specifically capture and detect low-abundance proteins in complex samples. To assess whether CPLL capture can be applied in systems biology studies involving the precise quantitation of proteins across a multitude of samples, we evaluated its performance across the whole range of protein abundances in Saccharomyces cerevisiae. We used selected reaction monitoring assays for a set of target proteins covering a broad abundance range to quantitatively evaluate the precision of the approach and its capability to detect low-abundance proteins. Replicated CPLL-isolates showed an average variability of ~10% in the amount of the isolated proteins. The high reproducibility of the technique was not dependent on the abundance of the protein or the amount of beads used for the capture. However, the protein-to-bead ratio affected the enrichment of specific proteins. We did not observe a normalization effect of CPLL beads on protein abundances. However, CPLLs enriched for and depleted specific sets of proteins and thus changed the abundances of proteins from a whole proteome extract. This allowed the identification of ~400 proteins otherwise undetected in an untreated sample, under the experimental conditions used. CPLL capture is thus a useful tool to increase protein identifications in proteomic experiments, but it should be coupled to the analysis of untreated samples, to maximize proteome coverage. Our data also confirms that CPLL capture is reproducible and can be confidently used in quantitative proteomic experiments. Combinatorial hexapeptide ligand libraries (CPLLs) bound to solid matrices have been proposed to specifically capture and detect low-abundance proteins in complex samples. To assess whether the CPLL capture can be confidently applied in systems biology studies involving the precise quantitation of proteins across a broad range of abundances and a multitude of samples, we evaluated its reproducibility and performance features. Using selected reaction monitoring assays for proteins covering the whole range of abundances we show that the technique is reproducible and compatible with quantitative proteomic studies. However, the protein-to-bead ratio affects the enrichment of specific proteins and CPLLs depleted specific sets of proteins from a whole proteome extract. Our results suggest that CPLL-based analyses should be coupled to the analysis of untreated samples, to maximize proteome coverage. Overall, our data confirms the applicability of CPLLs in systems biology research and guides the correct use of this technique. Copyright © 2013 Elsevier B.V. All rights reserved.

  2. Describing content in middle school science curricula

    NASA Astrophysics Data System (ADS)

    Schwarz-Ballard, Jennifer A.

    As researchers and designers, we intuitively recognize differences between curricula and describe them in terms of design strategy: project-based, laboratory-based, modular, traditional, and textbook, among others. We assume that practitioners recognize the differences in how each requires that students use knowledge; however, these intuitive differences have not been captured or systematically described by the existing languages for describing learning goals. In this dissertation I argue that we need new ways of capturing relationships among elements of content, and propose a theory that describes some of the important differences in how students reason in differently designed curricula and activities. Educational researchers and curriculum designers have taken a variety of approaches to laying out learning goals for science. Through an analysis of existing descriptions of learning goals I argue that to describe differences in the understanding students come away with, they need to (1) be specific about the form of knowledge, (2) incorporate both the processes through which knowledge is used and its form, and (3) capture content development across a curriculum. To show the value of inquiry curricula, learning goals need to incorporate distinctions among the variety of ways we ask students to use knowledge. Here I propose the Epistemic Structures Framework as one way to describe differences in students' reasoning that are not captured by existing descriptions of learning goals. The usefulness of the Epistemic Structures framework is demonstrated in the four curriculum case study examples in Part II of this work. The curricula in the case studies represent a range of content coverage, curriculum structure, and design rationale. They serve both to illustrate the Epistemic Structures analysis process and make the case that it does in fact describe learning goals in a way that captures important differences in students' reasoning in differently designed curricula. Describing learning goals in terms of Epistemic Structures provides one way to define what we mean when we talk about "project-based" curricula and demonstrate its "value added" to educators, administrators and policy makers.

  3. The Morphosyntax of the Turkish Causative Construction

    ERIC Educational Resources Information Center

    Key, Gregory

    2013-01-01

    This dissertation is an analysis of the morphosyntax of the Turkish causative construction within the framework of Distributed Morphology (DM). It is an attempt to capture a range of different phenomena in a principled way within this framework. Important aspects of DM for the analysis herein include the syntactic derivation of words; the…

  4. Effects of a Training Package to Improve the Accuracy of Descriptive Analysis Data Recording

    ERIC Educational Resources Information Center

    Mayer, Kimberly L.; DiGennaro Reed, Florence D.

    2013-01-01

    Functional behavior assessment is an important precursor to developing interventions to address a problem behavior. Descriptive analysis, a type of functional behavior assessment, is effective in informing intervention design only if the gathered data accurately capture relevant events and behaviors. We investigated a training procedure to improve…

  5. The Use of Cognitive Task Analysis to Capture Expertise for Tracheal Extubation Training in Anesthesiology

    ERIC Educational Resources Information Center

    Embrey, Karen K.

    2012-01-01

    Cognitive task analysis (CTA) is a knowledge elicitation technique employed for acquiring expertise from domain specialists to support the effective instruction of novices. CTA guided instruction has proven effective in improving surgical skills training for medical students and surgical residents. The standard, current method of teaching clinical…

  6. 77 FR 14712 - Approval and Promulgation of Air Quality Implementation Plans; Massachusetts; Determination of...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-03-13

    ... Massachusetts Department of Environmental Protection performed a missing data analysis for each site with low... Massachusetts missing data analysis used a combination of meteorology and air quality data for ozone monitors... with missing ozone data, the ozone levels, if captured, would have been below the 1997 8-hour ozone...

  7. Aggregation Bias and the Analysis of Necessary and Sufficient Conditions in fsQCA

    ERIC Educational Resources Information Center

    Braumoeller, Bear F.

    2017-01-01

    Fuzzy-set qualitative comparative analysis (fsQCA) has become one of the most prominent methods in the social sciences for capturing causal complexity, especially for scholars with small- and medium-"N" data sets. This research note explores two key assumptions in fsQCA's methodology for testing for necessary and sufficient…

  8. Discovering Reliable Sources of Biochemical Thermodynamic Data to Aid Students' Understanding

    ERIC Educational Resources Information Center

    Méndez, Eduardo; Cerdá, María F.

    2016-01-01

    Students of physical chemistry in biochemical disciplines need biochemical examples to capture the need, not always understood, of a difficult area in their studies. The use of thermodynamic data in the chemical reference state may lead to incorrect interpretations in the analysis of biochemical examples when the analysis does not include relevant…

  9. Latent Class Analysis of Conduct Problems of Elementary Students Receiving Special Education Services

    ERIC Educational Resources Information Center

    Toupin, Jean; Déry, Michèle; Verlaan, Pierrette; Lemelin, Jean-Pascal; Lecocq, Aurélie; Jagiellowicz, Jadwiga

    2016-01-01

    Students with conduct problems (CPs) may present heterogeneity in terms of behavioral manifestations and service needs. Previous studies using Latent Class Analysis (LCA) to capture this heterogeneity have been conducted mostly with community samples and have often applied a narrow definition of CP. Considering this context, this study…

  10. Set of new draft methods for the analysis of organic disinfection by-products, including 551 and 552. Draft report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1993-01-01

    The set of documents discusses the new draft methods (EPA method 551, EPA method 552) for the analysis of disinfection byproducts contained in drinking water. The methods use the techniques of liquid/liquid extraction and gas chromatography with electron capture detection.

  11. Armoured spiderman: morphological and behavioural adaptations of a specialised araneophagous predator (Araneae: Palpimanidae).

    PubMed

    Pekár, Stano; Sobotník, Jan; Lubin, Yael

    2011-07-01

    In a predator-prey system where both intervenients come from the same taxon, one can expect strong selection on behavioural and morphological traits involved in prey capture. For example, in specialised snake-eating snakes, the predator is unaffected by the venom of the prey. We predicted that similar adaptations should have evolved in spider-eating (araneophagous) spiders. We investigated potential and actual prey of two Palpimanus spiders (P. gibbulus, P. orientalis) to support the prediction that these are araneophagous predators. Specific behavioural adaptations were investigated using a high-speed camera during staged encounters with prey, while morphological adaptations were investigated using electron microscopy. Both Palpimanus species captured a wide assortment of spider species from various guilds but also a few insect species. Analysis of the potential prey suggested that Palpimanus is a retreat-invading predator that actively searches for spiders that hide in a retreat. Behavioural capture adaptations include a slow, stealthy approach to the prey followed by a very fast attack. Morphological capture adaptations include scopulae on the forelegs used in grabbing prey body parts, stout forelegs to hold the prey firmly, and an extremely thick cuticle all over the body preventing injury from a counter bite of the prey. Palpimanus overwhelmed prey that was more than 200% larger than itself. In trials with another araneophagous spider, Cyrba algerina (Salticidae), Palpimanus captured C. algerina in more than 90% of cases, independent of the size ratio between the spiders. Evidence indicates that both Palpimanus species possess remarkable adaptations that increase their efficiency in capturing spider prey.

  12. Attentional capture by alcohol-related stimuli may be activated involuntarily by top-down search goals.

    PubMed

    Brown, Chris R H; Duka, Theodora; Forster, Sophie

    2018-04-25

    Previous research has found that the attention of social drinkers is preferentially oriented towards alcohol-related stimuli (attentional capture). This is argued to play a role in escalating craving for alcohol that can result in hazardous drinking. According to incentive theories of drug addiction, the stimuli associated with the drug reward acquire learned incentive salience and grab attention. However, it is not clear whether the mechanism by which this bias is created is a voluntary or an automatic one, although some evidence suggests a stimulus-driven mechanism. Here, we test for the first time whether this attentional capture could reflect an involuntary consequence of a goal-driven mechanism. Across three experiments, participants were given search goals to detect either an alcoholic or a non-alcoholic object (target) in a stream of briefly presented objects unrelated to the target. Prior to the target, a task-irrelevant parafoveal distractor appeared. This could either be congruent or incongruent with the current search goal. Applying a meta-analysis, we combined the results across the three experiments and found consistent evidence of goal-driven attentional capture, whereby alcohol distractors impeded target detection when the search goal was for alcohol. By contrast, alcohol distractors did not interfere with target detection, whilst participants were searching for a non-alcoholic category. A separate experiment revealed that the goal-driven capture effect was not found when participants held alcohol features active in memory but did not intentionally search for them. These findings suggest a strong goal-driven account of attentional capture by alcohol cues in social drinkers.

  13. Whole-exome sequencing for mutation detection in pediatric disorders of insulin secretion: Maturity onset diabetes of the young and congenital hyperinsulinism.

    PubMed

    Johnson, S R; Leo, P J; McInerney-Leo, A M; Anderson, L K; Marshall, M; McGown, I; Newell, F; Brown, M A; Conwell, L S; Harris, M; Duncan, E L

    2018-06-01

    To assess the utility of whole-exome sequencing (WES) for mutation detection in maturity-onset diabetes of the young (MODY) and congenital hyperinsulinism (CHI). MODY and CHI are the two commonest monogenic disorders of glucose-regulated insulin secretion in childhood, with 13 causative genes known for MODY and 10 causative genes identified for CHI. The large number of potential genes makes comprehensive screening using traditional methods expensive and time-consuming. Ten subjects with MODY and five with CHI with known mutations underwent WES using two different exome capture kits (Nimblegen SeqCap EZ Human v3.0 Exome Enrichment Kit, Nextera Rapid Capture Exome Kit). Analysis was blinded to previously identified mutations, and included assessment for large deletions. The target capture of five exome capture technologies was also analyzed using sequencing data from >2800 unrelated samples. Four of five MODY mutations were identified using Nimblegen (including a large deletion in HNF1B). Although targeted, one mutation (in INS) had insufficient coverage for detection. Eleven of eleven mutations (six MODY, five CHI) were identified using Nextera Rapid (including the previously missed mutation). On reconciliation, all mutations concorded with previous data and no additional variants in MODY genes were detected. There were marked differences in the performance of the capture technologies. WES can be useful for screening for MODY/CHI mutations, detecting both point mutations and large deletions. However, capture technologies require careful selection. © 2018 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  14. Pulling out the 1%: Whole-Genome Capture for the Targeted Enrichment of Ancient DNA Sequencing Libraries

    PubMed Central

    Carpenter, Meredith L.; Buenrostro, Jason D.; Valdiosera, Cristina; Schroeder, Hannes; Allentoft, Morten E.; Sikora, Martin; Rasmussen, Morten; Gravel, Simon; Guillén, Sonia; Nekhrizov, Georgi; Leshtakov, Krasimir; Dimitrova, Diana; Theodossiev, Nikola; Pettener, Davide; Luiselli, Donata; Sandoval, Karla; Moreno-Estrada, Andrés; Li, Yingrui; Wang, Jun; Gilbert, M. Thomas P.; Willerslev, Eske; Greenleaf, William J.; Bustamante, Carlos D.

    2013-01-01

    Most ancient specimens contain very low levels of endogenous DNA, precluding the shotgun sequencing of many interesting samples because of cost. Ancient DNA (aDNA) libraries often contain <1% endogenous DNA, with the majority of sequencing capacity taken up by environmental DNA. Here we present a capture-based method for enriching the endogenous component of aDNA sequencing libraries. By using biotinylated RNA baits transcribed from genomic DNA libraries, we are able to capture DNA fragments from across the human genome. We demonstrate this method on libraries created from four Iron Age and Bronze Age human teeth from Bulgaria, as well as bone samples from seven Peruvian mummies and a Bronze Age hair sample from Denmark. Prior to capture, shotgun sequencing of these libraries yielded an average of 1.2% of reads mapping to the human genome (including duplicates). After capture, this fraction increased substantially, with up to 59% of reads mapped to human and enrichment ranging from 6- to 159-fold. Furthermore, we maintained coverage of the majority of regions sequenced in the precapture library. Intersection with the 1000 Genomes Project reference panel yielded an average of 50,723 SNPs (range 3,062–147,243) for the postcapture libraries sequenced with 1 million reads, compared with 13,280 SNPs (range 217–73,266) for the precapture libraries, increasing resolution in population genetic analyses. Our whole-genome capture approach makes it less costly to sequence aDNA from specimens containing very low levels of endogenous DNA, enabling the analysis of larger numbers of samples. PMID:24568772
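    The enrichment factors quoted above are simply the ratio of the endogenous (human-mapping) read fraction after capture to that before capture. A small illustrative calculation, with assumed fractions chosen inside the reported ranges:

```python
def enrichment_fold(pre_endogenous_frac, post_endogenous_frac):
    """Fold enrichment of endogenous (human-mapping) reads after capture."""
    return post_endogenous_frac / pre_endogenous_frac

# Illustrative values in the range reported for the captured libraries:
pre, post = 0.012, 0.59   # 1.2% of reads pre-capture, up to 59% post-capture
print(f"{enrichment_fold(pre, post):.0f}-fold enrichment")  # prints "49-fold enrichment"
```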

  15. Armoured spiderman: morphological and behavioural adaptations of a specialised araneophagous predator (Araneae: Palpimanidae)

    NASA Astrophysics Data System (ADS)

    Pekár, Stano; Šobotník, Jan; Lubin, Yael

    2011-07-01

    In a predator-prey system where both intervenients come from the same taxon, one can expect strong selection on behavioural and morphological traits involved in prey capture. For example, in specialised snake-eating snakes, the predator is unaffected by the venom of the prey. We predicted that similar adaptations should have evolved in spider-eating (araneophagous) spiders. We investigated potential and actual prey of two Palpimanus spiders (P. gibbulus, P. orientalis) to support the prediction that these are araneophagous predators. Specific behavioural adaptations were investigated using a high-speed camera during staged encounters with prey, while morphological adaptations were investigated using electron microscopy. Both Palpimanus species captured a wide assortment of spider species from various guilds but also a few insect species. Analysis of the potential prey suggested that Palpimanus is a retreat-invading predator that actively searches for spiders that hide in a retreat. Behavioural capture adaptations include a slow, stealthy approach to the prey followed by a very fast attack. Morphological capture adaptations include scopulae on the forelegs used in grabbing prey body parts, stout forelegs to hold the prey firmly, and an extremely thick cuticle all over the body preventing injury from a counter bite of the prey. Palpimanus overwhelmed prey that was more than 200% larger than itself. In trials with another araneophagous spider, Cyrba algerina (Salticidae), Palpimanus captured C. algerina in more than 90% of cases, independent of the size ratio between the spiders. Evidence indicates that both Palpimanus species possess remarkable adaptations that increase their efficiency in capturing spider prey.

  16. QUANTITATIVE MASS SPECTROMETRIC ANALYSIS OF GLYCOPROTEINS COMBINED WITH ENRICHMENT METHODS

    PubMed Central

    Ahn, Yeong Hee; Kim, Jin Young; Yoo, Jong Shin

    2015-01-01

    Mass spectrometry (MS) has been a core technology for highly sensitive, high-throughput analysis of the enriched glycoproteome, both for quantitative assays and for qualitative profiling of glycoproteins. Because it is widely recognized that aberrant glycosylation of a glycoprotein may be involved in the progression of disease, the development of efficient analysis tools for aberrant glycoproteins is very important for a deeper understanding of the pathological function of the glycoprotein and for new biomarker development. This review first describes protein glycosylation-targeting enrichment technologies, mainly solid-phase extraction methods such as hydrazide capture, lectin-specific capture, and affinity separation techniques based on porous graphitized carbon, hydrophilic interaction chromatography, or immobilized boronic acid. Second, MS-based quantitative analysis strategies coupled with these enrichment technologies, using label-free MS, stable isotope labeling, or targeted multiple reaction monitoring (MRM) MS, are summarized with recently published studies. © 2014 The Authors. Mass Spectrometry Reviews Published by Wiley Periodicals, Inc. Mass Spectrom Rev 34:148–165, 2015. PMID:24889823

  17. Determination of iodopropynyl butylcarbamate in cosmetic formulations utilizing pulsed splitless injection, gas chromatography with electron capture detector.

    PubMed

    Palmer, Kevin B; LaFon, William; Burford, Mark D

    2017-09-22

    Current analytical methodology for iodopropynyl butylcarbamate (IPBC) analysis focuses on the use of liquid chromatography-mass spectrometry (LC-MS), but the high instrumentation and operator investment required has created the need for a cost-effective alternative methodology. Past publications investigating gas chromatography with electron capture detector (GC-ECD) for IPBC quantitation proved largely unsuccessful, likely due to the preservative's limited thermal stability. Pulsed injection techniques commonly used for trace analysis of thermally labile pharmaceutical compounds were successfully adapted for IPBC analysis, exploiting the selectivity of GC-ECD detection. System optimization and sample preparation improvements resulted in substantial performance and reproducibility gains. Cosmetic formulations preserved with IPBC (50-100 ppm) were solvated in toluene/isopropyl alcohol and quantified over the 0.3-1.3 μg/ml calibration range. The methodology was robust (relative standard deviation 4%), accurate (98% recovery), and sensitive (limit of detection 0.25 ng/ml) for use in routine testing of cosmetic formulation preservation. Copyright © 2017 Elsevier B.V. All rights reserved.

  18. Real-time detection of organic contamination events in water distribution systems by principal components analysis of ultraviolet spectral data.

    PubMed

    Zhang, Jian; Hou, Dibo; Wang, Ke; Huang, Pingjie; Zhang, Guangxin; Loáiciga, Hugo

    2017-05-01

    The detection of organic contaminants in water distribution systems is essential to protect public health from potentially harmful compounds resulting from accidental spills or intentional releases. Existing methods for detecting organic contaminants are based on quantitative analyses such as chemical testing and gas/liquid chromatography, which are time- and reagent-consuming and involve costly maintenance. This study proposes a novel procedure based on discrete wavelet transform and principal component analysis for detecting organic contamination events from ultraviolet spectral data. First, the spectrum of each observation is transformed using a discrete wavelet transform with a coiflet mother wavelet to capture abrupt changes along the wavelength axis. Principal component analysis is then employed to approximate the spectra based on the captured and fused features. The value of Hotelling's T² statistic is calculated and used to detect outliers. An alarm for a contamination event is triggered by sequential Bayesian analysis when outliers appear continuously in several observations. The effectiveness of the proposed procedure is tested on-line using a pilot-scale setup and experimental data.
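    As a rough sketch of the kind of pipeline described (wavelet features, PCA, a Hotelling's T² control limit, and an alarm rule), the snippet below runs on synthetic spectra; the coiflet order, number of components, empirical control limit, and the simplified consecutive-outlier rule standing in for the sequential Bayesian step are all assumptions, not the authors' settings.

```python
import numpy as np
import pywt
from sklearn.decomposition import PCA

def wavelet_features(spectrum, wavelet="coif3", level=3):
    """Represent a UV spectrum by its wavelet coefficients (coiflet family)."""
    return np.concatenate(pywt.wavedec(spectrum, wavelet, level=level))

def hotelling_t2(scores, variances):
    """Hotelling's T^2 for each observation from its PCA scores."""
    return np.sum(scores ** 2 / variances, axis=1)

rng = np.random.default_rng(0)
baseline = rng.normal(size=(200, 256)) * 0.01 + np.linspace(1.0, 0.2, 256)  # clean spectra
test = baseline[:20].copy()
test[10:] += 0.3 * np.exp(-((np.arange(256) - 120) / 10.0) ** 2)            # injected contaminant band

X_train = np.array([wavelet_features(s) for s in baseline])
X_test = np.array([wavelet_features(s) for s in test])

pca = PCA(n_components=5).fit(X_train)
t2_train = hotelling_t2(pca.transform(X_train), pca.explained_variance_)
t2_test = hotelling_t2(pca.transform(X_test), pca.explained_variance_)

limit = np.percentile(t2_train, 99)          # empirical control limit
outliers = t2_test > limit
# Simplified alarm rule (3 consecutive outliers) standing in for the sequential Bayesian step:
alarm = any(outliers[i:i + 3].all() for i in range(len(outliers) - 2))
print("outliers:", outliers.astype(int), "alarm:", alarm)
```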

  19. Estimation of Ground Reaction Forces and Moments During Gait Using Only Inertial Motion Capture

    PubMed Central

    Karatsidis, Angelos; Bellusci, Giovanni; Schepers, H. Martin; de Zee, Mark; Andersen, Michael S.; Veltink, Peter H.

    2016-01-01

    Ground reaction forces and moments (GRF&M) are important measures used as input in biomechanical analysis to estimate joint kinetics, which often are used to infer information for many musculoskeletal diseases. Their assessment is conventionally achieved using laboratory-based equipment that cannot be applied in daily life monitoring. In this study, we propose a method to predict GRF&M during walking, using exclusively kinematic information from fully-ambulatory inertial motion capture (IMC). From the equations of motion, we derive the total external forces and moments. Then, we solve the indeterminacy problem during double stance using a distribution algorithm based on a smooth transition assumption. The agreement between the IMC-predicted and reference GRF&M was categorized over normal walking speed as excellent for the vertical (ρ = 0.992, rRMSE = 5.3%), anterior (ρ = 0.965, rRMSE = 9.4%) and sagittal (ρ = 0.933, rRMSE = 12.4%) GRF&M components and as strong for the lateral (ρ = 0.862, rRMSE = 13.1%), frontal (ρ = 0.710, rRMSE = 29.6%), and transverse GRF&M (ρ = 0.826, rRMSE = 18.2%). Sensitivity analysis was performed on the effect of the cut-off frequency used in the filtering of the input kinematics, as well as the threshold velocities for the gait event detection algorithm. This study was the first to use only inertial motion capture to estimate 3D GRF&M during gait, providing comparable accuracy with optical motion capture prediction. This approach enables applications that require estimation of the kinetics during walking outside the gait laboratory. PMID:28042857
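    Two ingredients of the approach are easy to illustrate: the total external force follows from Newton's law applied to the centre of mass, and the double-stance indeterminacy is resolved with a smooth transition weighting between the two feet. The sketch below shows only those two pieces (no moments, no segment model); the body mass, accelerations, timing, and the smoothstep weight are assumptions, not values from the paper.

```python
import numpy as np

GRAVITY = np.array([0.0, 0.0, -9.81])  # m/s^2, z up

def total_grf(mass, com_acceleration):
    """Newton's law: total ground reaction force = m * (a_com - g), per frame."""
    return mass * (com_acceleration - GRAVITY)

def smooth_transition(t, t_start, duration):
    """Smoothstep weight ramping 0 -> 1 over the double-stance interval,
    a stand-in for the smooth transition assumption used to resolve the
    double-stance indeterminacy."""
    x = np.clip((t - t_start) / duration, 0.0, 1.0)
    return 3 * x**2 - 2 * x**3

# Illustrative double-stance frame: 75 kg subject, COM acceleration from IMC
mass = 75.0
a_com = np.array([0.4, 0.0, -0.6])      # m/s^2 (assumed values)
F_tot = total_grf(mass, a_com)

w = smooth_transition(t=0.05, t_start=0.0, duration=0.12)  # 120 ms double stance
F_leading, F_trailing = w * F_tot, (1.0 - w) * F_tot
print(F_tot, F_leading, F_trailing)
```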

  20. Size and deformability based separation of circulating tumor cells from castrate resistant prostate cancer patients using resettable cell traps.

    PubMed

    Qin, Xi; Park, Sunyoung; Duffy, Simon P; Matthews, Kerryn; Ang, Richard R; Todenhöfer, Tilman; Abdi, Hamid; Azad, Arun; Bazov, Jenny; Chi, Kim N; Black, Peter C; Ma, Hongshen

    2015-05-21

    The enumeration and capture of circulating tumor cells (CTCs) are potentially of great clinical value as they offer a non-invasive means to access tumor materials to diagnose disease and monitor treatment efficacy. Conventional immunoenrichment of CTCs may fail to capture cells with low surface antigen expression. Micropore filtration presents a compelling label-free alternative that enriches CTCs using their biophysical rather than biochemical characteristics. However, this strategy is prone to clogging of the filter microstructure, which dramatically reduces the selectivity after processing large numbers of cells. Here, we use the resettable cell trap (RCT) mechanism to separate cells based on their size and deformability using an adjustable aperture that can be periodically cleared to prevent clogging. After separation, the output sample is stained and analyzed using multi-spectral analysis, which provides a more sensitive and unambiguous method to identify CTC biomarkers than traditional immunofluorescence. We tested the RCT device using blood samples obtained from 22 patients with metastatic castrate-resistant prostate cancer while comparing the results with the established CellSearch® system. The RCT mechanism was able to capture ≥5 CTCs in 18/22 (82%) patients with a mean count of 257 in 7.5 ml of whole blood, while the CellSearch system found ≥5 CTCs in 9/22 (41%) patients with a mean count of 25. The ~10× improvement in the CTC capture rate provides significantly more materials for subsequent analysis of these cells such as immunofluorescence, propagation by tissue culture, and genetic profiling.

  1. Variance associated with walking velocity during force platform gait analysis of a heterogeneous sample of clinically normal dogs.

    PubMed

    Piazza, Alexander M; Binversie, Emily E; Baker, Lauren A; Nemke, Brett; Sample, Susannah J; Muir, Peter

    2017-04-01

    OBJECTIVE To determine whether walking at specific ranges of absolute and relative (V*) velocity would aid efficient capture of gait trial data with low ground reaction force (GRF) variance in a heterogeneous sample of dogs. ANIMALS 17 clinically normal dogs of various breeds, ages, and sexes. PROCEDURES Each dog was walked across a force platform at its preferred velocity, with controlled acceleration within 0.5 m/s². Ranges in V* were created for height at the highest point of the shoulders (withers; WHV*). Variance effects from 8 walking absolute velocity ranges and associated WHV* ranges were examined by means of repeated-measures ANCOVA. RESULTS The individual dog effect provided the greatest contribution to variance. Narrow velocity ranges typically resulted in capture of a smaller percentage of valid trials and were not consistently associated with lower variance. The WHV* range of 0.33 to 0.46 allowed capture of valid trials efficiently, with no significant effects on peak vertical force and vertical impulse. CONCLUSIONS AND CLINICAL RELEVANCE Dogs with severe lameness may be unable to trot or may have a decline in mobility with gait trial repetition. Gait analysis involving evaluation of individual dogs at their preferred absolute velocity, such that dogs are evaluated at a similar V*, may facilitate efficient capture of valid trials without significant effects on GRF. Use of individual velocity ranges derived from a WHV* range of 0.33 to 0.46 can account for heterogeneity and appears suitable for use in clinical trials involving dogs at a walking gait.

  2. Dominant modes of variability in large-scale Birkeland currents

    NASA Astrophysics Data System (ADS)

    Cousins, E. D. P.; Matsuo, Tomoko; Richmond, A. D.; Anderson, B. J.

    2015-08-01

    Properties of variability in large-scale Birkeland currents are investigated through empirical orthogonal function (EOF) analysis of 1 week of data from the Active Magnetosphere and Planetary Electrodynamics Response Experiment (AMPERE). Mean distributions and dominant modes of variability are identified for both the Northern and Southern Hemispheres. Differences in the results from the two hemispheres are observed, which are attributed to seasonal differences in conductivity (the study period occurred near solstice). A universal mean and set of dominant modes of variability are obtained through combining the hemispheric results, and it is found that the mean and first three modes of variability (EOFs) account for 38% of the total observed squared magnetic perturbations (δB²) from both hemispheres. The mean distribution represents a standard Region 1/Region 2 (R1/R2) morphology of currents and EOF 1 captures the strengthening/weakening of the average distribution and is well correlated with the north-south component of the interplanetary magnetic field (IMF). EOF 2 captures a mixture of effects including the expansion/contraction and rotation of the (R1/R2) currents; this mode correlates only weakly with possible external driving parameters. EOF 3 captures changes in the morphology of the currents in the dayside cusp region and is well correlated with the dawn-dusk component of the IMF. The higher-order EOFs capture more complex, smaller-scale variations in the Birkeland currents and appear generally uncorrelated with external driving parameters. The results of the EOF analysis described here are used for describing error covariance in a data assimilation procedure utilizing AMPERE data, as described in a companion paper.
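    EOF analysis of this kind amounts to a singular value decomposition of the time-by-grid anomaly matrix: the right singular vectors are the spatial modes, the scaled left singular vectors are their amplitude time series, and the squared singular values give the variance each mode explains. A minimal sketch on synthetic data (not AMPERE data); the grid, time base, and noise level are arbitrary.

```python
import numpy as np

def eof_analysis(field, n_modes=3):
    """Empirical orthogonal functions of a (time x grid) data matrix.
    Returns the mean map, the leading spatial modes (EOFs), their principal
    component time series, and the fraction of variance each mode explains."""
    mean_map = field.mean(axis=0)
    anomalies = field - mean_map
    U, S, Vt = np.linalg.svd(anomalies, full_matrices=False)
    variance_fraction = S**2 / np.sum(S**2)
    eofs = Vt[:n_modes]                    # spatial patterns
    pcs = U[:, :n_modes] * S[:n_modes]     # amplitude time series
    return mean_map, eofs, pcs, variance_fraction[:n_modes]

# Illustrative synthetic "current density" maps: 500 time samples on a 300-point grid
rng = np.random.default_rng(1)
t = np.linspace(0, 2 * np.pi, 500)[:, None]
grid = np.linspace(0, 1, 300)[None, :]
field = np.sin(t) * np.sin(np.pi * grid) + 0.1 * rng.normal(size=(500, 300))

mean_map, eofs, pcs, var_frac = eof_analysis(field)
print("variance explained by leading modes:", np.round(var_frac, 3))
```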

  3. Horizon: The Portable, Scalable, and Reusable Framework for Developing Automated Data Management and Product Generation Systems

    NASA Astrophysics Data System (ADS)

    Huang, T.; Alarcon, C.; Quach, N. T.

    2014-12-01

    Capture, curation, and analysis are the typical activities performed at any given Earth science data center. Modern data management systems must be adaptable to heterogeneous science data formats, scalable to meet the mission's quality-of-service requirements, and able to manage the life cycle of any given science data product. Designing a scalable data management system doesn't happen overnight; it takes countless hours of refining, refactoring, retesting, and re-architecting. The Horizon data management and workflow framework, developed at the Jet Propulsion Laboratory, is a portable, scalable, and reusable framework for developing high-performance data management and product generation workflow systems that automate data capture, data curation, and data analysis activities. The Data Management and Archive System (DMAS) of NASA's Physical Oceanography Distributed Active Archive Center (PO.DAAC) is its core data infrastructure, handling the capture and distribution of hundreds of thousands of satellite observations each day around the clock; DMAS is an application of the Horizon framework. The NASA Global Imagery Browse Services (GIBS) is the Earth Observing System Data and Information System (EOSDIS) solution for making high-resolution global imagery available to the science communities. The Imagery Exchange (TIE), another application of the Horizon framework, is a core GIBS subsystem responsible for automating data capture and imagery generation to support EOSDIS' 12 distributed active archive centers and 17 Science Investigator-led Processing Systems (SIPS). This presentation discusses our ongoing effort in refining, refactoring, retesting, and re-architecting the Horizon framework to enable data-intensive science and its applications.

  4. Mars Rock Analysis Briefing

    NASA Image and Video Library

    2013-03-12

    Paul Mahaffy (right), principal investigator for Curiosity's Sample Analysis at Mars (SAM) investigation at NASA's Goddard Space Flight Center in Maryland, demonstrates how the SAM instrument drilled and captured rock samples on the surface of Mars at a news conference, Tuesday, March 12, 2013 at NASA Headquarters in Washington. The analysis of the rock sample collected shows ancient Mars could have supported living microbes. Photo Credit: (NASA/Carla Cioffi)

  5. Product Lifecycle Management Architecture: A Model Based Systems Engineering Analysis.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Noonan, Nicholas James

    2015-07-01

    This report is an analysis of the Product Lifecycle Management (PLM) program. The analysis is centered on a need statement generated by a Nuclear Weapons (NW) customer. The need statement captured in this report creates an opportunity for the PLM to provide a robust service as a solution. Lifecycles for both the NW and PLM are analyzed using Model Based System Engineering (MBSE).

  6. Development of Real-Time Image and In Situ Data Analysis at Sea

    DTIC Science & Technology

    1991-10-16

    ...for continuous capture from multiple satellites. The Blackhole System is the analysis machine used either by researchers to process/analyze their... Orbital Tracker and the antenna subsystem was overhauled. THE BLACKHOLE ANALYSIS SYSTEM: A new HP9000/350 workstation was installed at SSOC to perform... [Figure: Scripps Satellite Oceanography Center Blackhole System diagram (analysis machine) - HP 350 workstation, Motorola 68020 CPU, 2 x 512 MB hard disks]

  7. Use of regularized principal component analysis to model anatomical changes during head and neck radiation therapy for treatment adaptation and response assessment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chetvertkov, Mikhail A., E-mail: chetvertkov@wayne

    2016-10-15

    Purpose: To develop standard (SPCA) and regularized (RPCA) principal component analysis models of anatomical changes from daily cone beam CTs (CBCTs) of head and neck (H&N) patients and assess their potential use in adaptive radiation therapy, and for extracting quantitative information for treatment response assessment. Methods: Planning CT images of ten H&N patients were artificially deformed to create “digital phantom” images, which modeled systematic anatomical changes during radiation therapy. Artificial deformations closely mirrored patients’ actual deformations and were interpolated to generate 35 synthetic CBCTs, representing evolving anatomy over 35 fractions. Deformation vector fields (DVFs) were acquired between pCT and synthetic CBCTs (i.e., digital phantoms) and between pCT and clinical CBCTs. Patient-specific SPCA and RPCA models were built from these synthetic and clinical DVF sets. EigenDVFs (EDVFs) having the largest eigenvalues were hypothesized to capture the major anatomical deformations during treatment. Results: Principal component analysis (PCA) models achieve variable results, depending on the size and location of anatomical change. Random changes prevent or degrade PCA’s ability to detect underlying systematic change. RPCA is able to detect smaller systematic changes against the background of random fraction-to-fraction changes and is therefore more successful than SPCA at capturing systematic changes early in treatment. SPCA models were less successful at modeling systematic changes in clinical patient images, which contain a wider range of random motion than synthetic CBCTs, while the regularized approach was able to extract major modes of motion. Conclusions: Leading EDVFs from both PCA approaches have the potential to capture systematic anatomical change during H&N radiotherapy when systematic changes are large enough with respect to random fraction-to-fraction changes. In all cases the RPCA approach appears to be more reliable at capturing systematic changes, enabling dosimetric consequences to be projected once trends are established early in a treatment course, or based on population models.
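    A minimal sketch of the standard (non-regularized) PCA step on a stack of flattened DVFs: each fraction's deformation field becomes one row, and the leading components play the role of eigenDVFs. The DVF dimensions, the synthetic "systematic plus random" motion, and the number of retained modes are assumptions, and the regularized variant is not shown.

```python
import numpy as np
from sklearn.decomposition import PCA

# Hypothetical per-fraction deformation vector fields (DVFs): 35 fractions,
# each a (n_voxels x 3) displacement field flattened into one row.
rng = np.random.default_rng(2)
n_fractions, n_voxels = 35, 5000
systematic = np.linspace(0, 1, n_fractions)[:, None] * rng.normal(size=(1, n_voxels * 3))
random_motion = 0.5 * rng.normal(size=(n_fractions, n_voxels * 3))
dvfs = systematic + random_motion

pca = PCA(n_components=3).fit(dvfs)
eigen_dvfs = pca.components_              # leading modes of anatomical change ("eigenDVFs")
weights = pca.transform(dvfs)             # per-fraction loading on each mode
print("explained variance ratio:", np.round(pca.explained_variance_ratio_, 3))
```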

  8. [The estimation of possibilities for the application of the laser capture microdissection technology for the molecular-genetic expert analysis (genotyping) of human chromosomal DNA].

    PubMed

    Ivanov, P L; Leonov, S N; Zemskova, E Iu

    2012-01-01

    The present study was designed to estimate the possibilities of application of the laser capture microdissection (LCM) technology for the molecular-genetic expert analysis (genotyping) of human chromosomal DNA. The experimental method employed for the purpose was the multiplex multilocus analysis of autosomal DNA polymorphism in the preparations of buccal epitheliocytes obtained by LCM. The key principles of the study were the application of physical methods for contrast enhancement of the micropreparations (such as phase-contrast microscopy and dark-field microscopy) and PCR-compatible cell lysis. Genotyping was carried out with the use of AmpFlSTR MiniFiler™ PCR Amplification Kits (Applied Biosystems, USA). It was shown that the technique employed in the present study ensures reliable genotyping of human chromosomal DNA in pooled preparations containing 10-20 dissected diploid cells each. This result agrees fairly well with the calculated sensitivity of the method. A few practical recommendations are offered.

  9. Modeling abundance using multinomial N-mixture models

    USGS Publications Warehouse

    Royle, Andy

    2016-01-01

    Multinomial N-mixture models are a generalization of the binomial N-mixture models described in Chapter 6 to allow for more complex and informative sampling protocols beyond simple counts. Many commonly used protocols such as multiple observer sampling, removal sampling, and capture-recapture produce a multivariate count frequency that has a multinomial distribution and for which multinomial N-mixture models can be developed. Such protocols typically result in more precise estimates than binomial mixture models because they provide direct information about parameters of the observation process. We demonstrate the analysis of these models in BUGS using several distinct formulations that afford great flexibility in the types of models that can be developed, and we demonstrate likelihood analysis using the unmarked package. Spatially stratified capture-recapture models are one class of models that fall into the multinomial N-mixture framework, and we discuss analysis of stratified versions of classical models such as M_b and M_h, and other classes of models that are only possible to describe within the multinomial N-mixture framework.
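    For a removal protocol, the multinomial cell probabilities and the site-level likelihood (conditional on local abundance N) can be written down directly, as in the hedged sketch below. The counts and the fixed detection probability are invented, and the full N-mixture step of integrating N over a Poisson prior (as done in BUGS or unmarked) is omitted.

```python
import numpy as np
from scipy.special import gammaln

def removal_cell_probs(p, n_passes):
    """Multinomial cell probabilities for a removal design: an individual is
    first captured on pass k with probability p * (1 - p)**(k - 1)."""
    return np.array([p * (1.0 - p) ** (k - 1) for k in range(1, n_passes + 1)])

def log_likelihood(N, p, counts):
    """Multinomial (removal-protocol) log-likelihood for one site, conditional on N."""
    pis = removal_cell_probs(p, len(counts))
    n_caught = counts.sum()
    if N < n_caught:
        return -np.inf
    probs = np.append(pis, 1.0 - pis.sum())           # last cell: never captured
    cells = np.append(counts, N - n_caught)
    return gammaln(N + 1) - gammaln(cells + 1).sum() + (cells * np.log(probs)).sum()

counts = np.array([14, 7, 4])                         # illustrative 3-pass removal counts
grid_N = np.arange(counts.sum(), 101)
ll = [log_likelihood(N, p=0.4, counts=counts) for N in grid_N]
print("MLE of N with detection p fixed at 0.4:", grid_N[int(np.argmax(ll))])
```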

  10. Quantitative Detection of Trace Explosive Vapors by Programmed Temperature Desorption Gas Chromatography-Electron Capture Detector

    PubMed Central

    Field, Christopher R.; Lubrano, Adam; Woytowitz, Morgan; Giordano, Braden C.; Rose-Pehrsson, Susan L.

    2014-01-01

    The direct liquid deposition of solution standards onto sorbent-filled thermal desorption tubes is used for the quantitative analysis of trace explosive vapor samples. The direct liquid deposition method yields a higher fidelity between the analysis of vapor samples and the analysis of solution standards than using separate injection methods for vapors and solutions, i.e., samples collected on vapor collection tubes and standards prepared in solution vials. Additionally, the method can account for instrumentation losses, which makes it ideal for minimizing variability and quantitative trace chemical detection. Gas chromatography with an electron capture detector is an instrumentation configuration sensitive to nitro-energetics, such as TNT and RDX, due to their relatively high electron affinity. However, vapor quantitation of these compounds is difficult without viable vapor standards. Thus, we eliminate the requirement for vapor standards by combining the sensitivity of the instrumentation with a direct liquid deposition protocol to analyze trace explosive vapor samples. PMID:25145416

  11. Quantitative detection of trace explosive vapors by programmed temperature desorption gas chromatography-electron capture detector.

    PubMed

    Field, Christopher R; Lubrano, Adam; Woytowitz, Morgan; Giordano, Braden C; Rose-Pehrsson, Susan L

    2014-07-25

    The direct liquid deposition of solution standards onto sorbent-filled thermal desorption tubes is used for the quantitative analysis of trace explosive vapor samples. The direct liquid deposition method yields a higher fidelity between the analysis of vapor samples and the analysis of solution standards than using separate injection methods for vapors and solutions, i.e., samples collected on vapor collection tubes and standards prepared in solution vials. Additionally, the method can account for instrumentation losses, which makes it ideal for minimizing variability and quantitative trace chemical detection. Gas chromatography with an electron capture detector is an instrumentation configuration sensitive to nitro-energetics, such as TNT and RDX, due to their relatively high electron affinity. However, vapor quantitation of these compounds is difficult without viable vapor standards. Thus, we eliminate the requirement for vapor standards by combining the sensitivity of the instrumentation with a direct liquid deposition protocol to analyze trace explosive vapor samples.

  12. [Development of an original computer program FISHMet: use for molecular cytogenetic diagnosis and genome mapping by fluorescent in situ hybridization (FISH)].

    PubMed

    Iurov, Iu B; Khazatskiĭ, I A; Akindinov, V A; Dovgilov, L V; Kobrinskiĭ, B A; Vorsanova, S G

    2000-08-01

    The original software package FISHMet has been developed and tested to improve the efficiency of diagnosis of hereditary diseases caused by chromosome aberrations and of chromosome mapping by the fluorescent in situ hybridization (FISH) method. The program, which runs under Windows 95, allows the creation and analysis of pseudocolor chromosome images and hybridization signals, and computer analysis and editing of the results of pseudocolor in situ hybridization, including successive superimposition of the initial black-and-white images acquired through fluorescent filters (blue, green, and red) and editing of each image individually or of a summary pseudocolor image in BMP, TIFF, and JPEG formats. Components of the computerized image analysis system (LOMO, Leitz Ortoplan, and Axioplan fluorescence microscopes; COHU 4910 and Sanyo VCB-3512P CCD cameras; Miro-Video, Scion LG-3, and VG-5 image capture boards; and Pentium 100 and Pentium 200 computers) and specialized software for image capture and visualization (Scion Image PC and Video-Cup) were used with good results in the study.

  13. A stochastic evolutionary model generating a mixture of exponential distributions

    NASA Astrophysics Data System (ADS)

    Fenner, Trevor; Levene, Mark; Loizou, George

    2016-02-01

    Recent interest in human dynamics has stimulated the investigation of the stochastic processes that explain human behaviour in various contexts, such as mobile phone networks and social media. In this paper, we extend the stochastic urn-based model proposed in [T. Fenner, M. Levene, G. Loizou, J. Stat. Mech. 2015, P08015 (2015)] so that it can generate mixture models, in particular, a mixture of exponential distributions. The model is designed to capture the dynamics of survival analysis, traditionally employed in clinical trials, reliability analysis in engineering, and more recently in the analysis of large data sets recording human dynamics. The mixture modelling approach, which is relatively simple and well understood, is very effective in capturing heterogeneity in data. We provide empirical evidence for the validity of the model, using a data set of popular search engine queries collected over a period of 114 months. We show that the survival function of these queries is closely matched by the exponential mixture solution for our model.
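    A minimal EM sketch for fitting a two-component exponential mixture, the kind of distribution the urn-based model generates; the synthetic data, initialisation, and iteration count are assumptions rather than anything taken from the paper.

```python
import numpy as np

def fit_exponential_mixture(x, n_iter=200):
    """EM for a two-component exponential mixture
    f(x) = w/m1 * exp(-x/m1) + (1 - w)/m2 * exp(-x/m2)."""
    w, m1, m2 = 0.5, 0.5 * x.mean(), 2.0 * x.mean()   # crude initialisation
    for _ in range(n_iter):
        d1 = w / m1 * np.exp(-x / m1)                  # E-step: component densities
        d2 = (1.0 - w) / m2 * np.exp(-x / m2)
        r = d1 / (d1 + d2)                             # responsibilities
        w = r.mean()                                   # M-step: weight and means
        m1 = np.sum(r * x) / np.sum(r)
        m2 = np.sum((1.0 - r) * x) / np.sum(1.0 - r)
    return w, m1, m2

# Synthetic "survival times" drawn from a known mixture, then recovered by EM
rng = np.random.default_rng(3)
x = np.concatenate([rng.exponential(2.0, 7000), rng.exponential(15.0, 3000)])
print(fit_exponential_mixture(x))   # roughly (0.7, 2.0, 15.0) expected
```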

  14. Discomfort Evaluation of Truck Ingress/Egress Motions Based on Biomechanical Analysis

    PubMed Central

    Choi, Nam-Chul; Lee, Sang Hun

    2015-01-01

    This paper presents a quantitative discomfort evaluation method based on biomechanical analysis results for human body movement, as well as its application to an assessment of the discomfort for truck ingress and egress. In this study, the motions of a human subject entering and exiting truck cabins with different types, numbers, and heights of footsteps were first measured using an optical motion capture system and load sensors. Next, the maximum voluntary contraction (MVC) ratios of the muscles were calculated through a biomechanical analysis of the musculoskeletal human model for the captured motion. Finally, the objective discomfort was evaluated using the proposed discomfort model based on the MVC ratios. To validate this new discomfort assessment method, human subject experiments were performed to investigate the subjective discomfort levels through a questionnaire for comparison with the objective discomfort levels. The validation results showed that the correlation between the objective and subjective discomforts was significant and could be described by a linear regression model. PMID:26067194

  15. An experimental protocol for the definition of upper limb anatomical frames on children using magneto-inertial sensors.

    PubMed

    Ricci, L; Formica, D; Tamilia, E; Taffoni, F; Sparaci, L; Capirci, O; Guglielmelli, E

    2013-01-01

    Motion capture based on magneto-inertial sensors is a technology enabling data collection in unstructured environments, allowing "out of the lab" motion analysis. This technology is a good candidate for motion analysis of children thanks to the reduced weight and size as well as the use of wireless communication that has improved its wearability and reduced its obtrusivity. A key issue in the application of such technology for motion analysis is its calibration, i.e. a process that allows mapping orientation information from each sensor to a physiological reference frame. To date, even if there are several calibration procedures available for adults, no specific calibration procedures have been developed for children. This work addresses this specific issue presenting a calibration procedure for motion capture of thorax and upper limbs on healthy children. Reported results suggest comparable performance with similar studies on adults and emphasize some critical issues, opening the way to further improvements.

  16. Predicting the ultimate potential of natural gas SOFC power cycles with CO2 capture - Part B: Applications

    NASA Astrophysics Data System (ADS)

    Campanari, Stefano; Mastropasqua, Luca; Gazzani, Matteo; Chiesa, Paolo; Romano, Matteo C.

    2016-09-01

    An important advantage of solid oxide fuel cells (SOFC) as future systems for large-scale power generation is the possibility of being efficiently integrated with processes for CO2 capture. Focusing on natural gas power generation, Part A of this work assessed the performance of advanced pressurised and atmospheric plant configurations (SOFC + GT and SOFC + ST, with fuel cell integration within a gas turbine or a steam turbine cycle) without CO2 separation. This Part B paper investigates such power cycles when applied to CO2 capture, proposing two ultra-high efficiency plant configurations based on advanced intermediate-temperature SOFCs with internal reforming and a low-temperature CO2 separation process. The power plants are simulated at the 100 MW scale with a set of realistic assumptions about FC performance, main components and auxiliaries, and show the capability of exceeding 70% LHV efficiency with high CO2 capture (above 80%) and a low specific primary energy consumption for the CO2 avoided (1.1-2.4 MJ kg⁻¹). Detailed results are presented in terms of energy and material balances, and a sensitivity analysis of plant performance is developed vs. FC voltage and fuel utilisation to investigate possible long-term improvements. Options for further improvement of the CO2 capture efficiency are also addressed.
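    The "specific primary energy consumption for CO2 avoided" (SPECCA) quoted above is a standard index comparing a capture plant with a reference plant without capture. The sketch below shows the usual formula with illustrative, assumed efficiencies and emission factors (not the paper's plant data), chosen so the result happens to fall in the reported 1.1-2.4 MJ kg⁻¹ range.

```python
def specca(eta, eta_ref, e_co2, e_co2_ref):
    """Specific primary energy consumption for CO2 avoided (MJ_LHV per kg CO2).
    eta: net LHV electric efficiency (0-1); e_co2: specific emission in
    kg CO2 per kWh of electricity."""
    heat_rate = 3600.0 / eta              # kJ_LHV per kWh_el
    heat_rate_ref = 3600.0 / eta_ref
    return (heat_rate - heat_rate_ref) / (e_co2_ref - e_co2) / 1000.0  # MJ/kg

# Illustrative (assumed) numbers only: a capture plant at 70% LHV vs. the same
# cycle without capture at 76% LHV, natural gas at ~0.20 kg CO2 per kWh_LHV.
e_ref = 0.202 / 0.76                      # kg CO2 / kWh_el, no capture
e_cap = 0.202 / 0.70 * (1.0 - 0.85)       # kg CO2 / kWh_el, 85% capture
print(round(specca(0.70, 0.76, e_cap, e_ref), 2), "MJ per kg CO2 avoided")  # ~1.8
```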

  17. Genetic analysis of Chinese families reveals a novel truncation allele of the retinitis pigmentosa GTPase regulator gene

    PubMed Central

    Hu, Fang; Zeng, Xiang-Yun; Liu, Lin-Lin; Luo, Yao-Ling; Jiang, Yi-Ping; Wang, Hui; Xie, Jing; Hu, Cheng-Quan; Gan, Lin; Huang, Liang

    2014-01-01

    AIM To make comprehensive molecular diagnosis for retinitis pigmentosa (RP) patients in a consanguineous Han Chinese family using next generation sequencing based Capture-NGS screen technology. METHODS A five-generation Han Chinese family diagnosed as non-syndromic X-linked recessive RP (XLRP) was recruited, including four affected males, four obligate female carriers and eleven unaffected family members. Capture-NGS was performed using a custom designed capture panel covers 163 known retinal disease genes including 47 RP genes, followed by the validation of detected mutation using Sanger sequencing in all recruited family members. RESULTS Capture-NGS in one affected 47-year-old male reveals a novel mutation, c.2417_2418insG:p.E806fs, in exon ORF15 of RP GTPase regulator (RPGR) gene results in a frameshift change that results in a premature stop codon and a truncated protein product. The mutation was further validated in three of four affected males and two of four female carriers but not in the other unaffected family members. CONCLUSION We have identified a novel mutation, c.2417_2418insG:p.E806fs, in a Han Chinese family with XLRP. Our findings expand the mutation spectrum of RPGR and the phenotypic spectrum of XLRP in Han Chinese families, and confirms Capture-NGS could be an effective and economic approach for the comprehensive molecular diagnosis of RP. PMID:25349787

  18. NEW NEUTRON-CAPTURE MEASUREMENTS IN 23 OPEN CLUSTERS. I. THE r-PROCESS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Overbeek, Jamie C.; Friel, Eileen D.; Jacobson, Heather R., E-mail: joverbee@indiana.edu

    2016-06-20

    Neutron-capture elements, those with Z > 35, are the least well understood in terms of nucleosynthesis and formation environments. The rapid neutron-capture, or r-process, elements are formed in the environments and/or remnants of massive stars, while the slow neutron-capture, or s-process, elements are primarily formed in low-mass AGB stars. These elements can provide much information about Galactic star formation and enrichment, but observational data are limited. We have assembled a sample of 68 stars in 23 open clusters that we use to probe abundance trends for six neutron-capture elements (Eu, Gd, Dy, Mo, Pr, and Nd) with cluster age and location in the disk of the Galaxy. In order to keep our analysis as homogeneous as possible, we use an automated synthesis fitting program, which also enables us to measure multiple (3–10) lines for each element. We find that the pure r-process elements (Eu, Gd, and Dy) have positive trends with increasing cluster age, while the mixed r- and s-process elements (Mo, Pr, and Nd) have insignificant trends consistent with zero. Pr, Nd, Eu, Gd, and Dy have similar, slight (although mostly statistically significant) gradients of ∼0.04 dex kpc⁻¹. The mixed elements also appear to have nonlinear relationships with R_GC.

  19. Echolocating bats use future-target information for optimal foraging.

    PubMed

    Fujioka, Emyo; Aihara, Ikkyu; Sumiya, Miwa; Aihara, Kazuyuki; Hiryu, Shizuko

    2016-04-26

    When seeing or listening to an object, we aim our attention toward it. While capturing prey, many animal species focus their visual or acoustic attention toward the prey. However, for multiple prey items, the direction and timing of attention for effective foraging remain unknown. In this study, we adopted both experimental and mathematical methodology with microphone-array measurements and mathematical modeling analysis to quantify the attention of echolocating bats that were repeatedly capturing airborne insects in the field. Here we show that bats select rational flight paths to consecutively capture multiple prey items. Microphone-array measurements showed that bats direct their sonar attention not only to the immediate prey but also to the next prey. In addition, we found that a bat's attention in terms of its flight also aims toward the next prey even when approaching the immediate prey. Numerical simulations revealed a possibility that bats shift their flight attention to control suitable flight paths for consecutive capture. When a bat only aims its flight attention toward its immediate prey, it rarely succeeds in capturing the next prey. These findings indicate that bats gain increased benefit by distributing their attention among multiple targets and planning the future flight path based on additional information of the next prey. These experimental and mathematical studies allowed us to observe the process of decision making by bats during their natural flight dynamics.

  20. A Study of Vicon System Positioning Performance.

    PubMed

    Merriaux, Pierre; Dupuis, Yohan; Boutteau, Rémi; Vasseur, Pascal; Savatier, Xavier

    2017-07-07

    Motion capture setups are used in numerous fields. Studies based on motion capture data can be found in biomechanics and in sport and animal science. Clinical studies include gait analysis as well as balance, posture and motor control. Robotic applications encompass object tracking, and everyday applications include entertainment and augmented reality. Still, few studies investigate the positioning performance of motion capture setups. In this paper, we study the positioning performance of one player in marker-based optoelectronic motion capture: the Vicon system. Our protocol includes evaluations of static and dynamic performance. Mean error as well as positioning variability are studied with calibrated ground-truth setups that are not based on other motion capture modalities. We introduce a new setup that enables directly estimating the absolute positioning accuracy for dynamic experiments, contrary to state-of-the-art works that rely on inter-marker distances. The system performs well in static experiments, with a mean absolute error of 0.15 mm and a variability lower than 0.025 mm. Our dynamic experiments were carried out at speeds found in real applications, and our work suggests that the system error is less than 2 mm. We also found that marker size and Vicon sampling rate must be chosen carefully with respect to the speeds encountered in the application in order to reach optimal positioning performance, which can reach 0.3 mm in our dynamic study.

  1. Scanning electron microscope automatic defect classification of process induced defects

    NASA Astrophysics Data System (ADS)

    Wolfe, Scott; McGarvey, Steve

    2017-03-01

    With the integration of high-speed Scanning Electron Microscope (SEM) based Automated Defect Redetection (ADR) in both high-volume semiconductor manufacturing and Research and Development (R&D), the need for reliable SEM Automated Defect Classification (ADC) has grown tremendously in the past few years. In many high-volume manufacturing facilities and R&D operations, defect inspection is performed on E-beam (EB), Bright Field (BF) or Dark Field (DF) defect inspection equipment. A comma-separated value (CSV) file is created by both the patterned and non-patterned defect inspection tools. The defect inspection result file contains a list of the anomalies detected during the inspection tool's examination of each structure, or of the entire wafer surface for non-patterned applications. This file is imported into the Defect Review Scanning Electron Microscope (DRSEM). Following the import, the DRSEM automatically moves the wafer to each defect coordinate and performs ADR. During ADR the DRSEM operates in a reference mode, capturing a SEM image at the exact position of the anomaly's coordinates and a SEM image of a reference location in the center of the wafer. A defect reference image is created by subtracting the defect image from the reference image. The exact coordinates of the defect are calculated from the calculated defect position and the stage coordinate of the anomaly recorded when the high-magnification SEM defect image is captured. The captured SEM image is then processed through DRSEM ADC binning, exported to a Yield Analysis System (YAS), or both. Process engineers, yield analysis engineers or failure analysis engineers manually review the captured images to ensure that either the YAS defect binning or the DRSEM defect binning is classifying the defects accurately. This paper explores the feasibility of using a Hitachi RS4000 Defect Review SEM to perform Automatic Defect Classification, with the objective that the total automated classification accuracy exceed human-based defect classification binning when the defects do not require knowledge of multiple process steps for accurate classification. The implementation of DRSEM ADC has the potential to shorten the time between defect detection and defect classification. Faster defect classification allows rapid response to yield anomalies that would otherwise reduce wafer and/or die yield.
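    The reference-mode redetection described above reduces, at its simplest, to differencing a defect image against a reference image and thresholding. A toy numpy sketch with synthetic 8-bit images (the threshold, image size, and injected "particle" are arbitrary assumptions, not parameters of the tool):

```python
import numpy as np

def defect_mask(defect_img, reference_img, threshold=30):
    """Reference-subtraction defect detection: pixels whose grey level differs
    from the reference image by more than `threshold` are flagged as defect."""
    diff = np.abs(defect_img.astype(np.int32) - reference_img.astype(np.int32))
    return diff > threshold

# Illustrative 8-bit SEM-like images: a flat reference and a copy with a small
# bright particle added at a known location.
rng = np.random.default_rng(4)
reference = rng.integers(100, 120, size=(128, 128), dtype=np.uint8)
defect = reference.copy()
defect[60:64, 60:64] = 220                 # simulated particle

mask = defect_mask(defect, reference)
ys, xs = np.nonzero(mask)
print("defect pixels:", mask.sum(), "centroid:", (ys.mean(), xs.mean()))
```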

  2. Principal coordinate analysis assisted chromatographic analysis of bacterial cell wall collection: A robust classification approach.

    PubMed

    Kumar, Keshav; Cava, Felipe

    2018-04-10

    In the present work, principal coordinate analysis (PCoA) is introduced to develop a robust model for classifying chromatographic data sets of peptidoglycan samples. PCoA captures the heterogeneity present in the data sets by using a dissimilarity matrix as input. Thus, in principle, it can capture even subtle differences in bacterial peptidoglycan composition and can provide a more robust and faster approach for classifying bacterial collections and identifying novel cell wall targets for further biological and clinical studies. The utility of the proposed approach is demonstrated by analysing two different kinds of bacterial collections. The first set comprised peptidoglycan samples belonging to different subclasses of Alphaproteobacteria, whereas the second set, which is more intricate for chemometric analysis, consisted of wild-type Vibrio cholerae strains and mutants having subtle differences in their peptidoglycan composition. The present work proposes a useful approach that can classify chromatographic data sets of peptidoglycan samples having subtle differences, and it suggests that PCoA can be a method of choice in any data analysis workflow. Copyright © 2018 Elsevier Inc. All rights reserved.
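    PCoA is classical multidimensional scaling of a dissimilarity matrix: square the dissimilarities, double-centre them, and take the leading eigenvectors. A minimal sketch with invented "chromatographic profiles" standing in for muropeptide data; the distance choice and sample layout are assumptions.

```python
import numpy as np

def pcoa(dissimilarity, n_axes=2):
    """Principal coordinate analysis (classical MDS) of a dissimilarity matrix."""
    D = np.asarray(dissimilarity, dtype=float)
    n = D.shape[0]
    J = np.eye(n) - np.ones((n, n)) / n          # centring matrix
    B = -0.5 * J @ (D ** 2) @ J                  # double-centred Gower matrix
    eigvals, eigvecs = np.linalg.eigh(B)
    order = np.argsort(eigvals)[::-1]            # largest eigenvalues first
    eigvals, eigvecs = eigvals[order], eigvecs[:, order]
    coords = eigvecs[:, :n_axes] * np.sqrt(np.clip(eigvals[:n_axes], 0, None))
    return coords, eigvals

# Illustrative use: Euclidean distances between hypothetical muropeptide
# chromatographic profiles of five strains (three similar, two different).
rng = np.random.default_rng(5)
profiles = np.vstack([rng.normal(loc=i // 3, scale=0.1, size=20) for i in range(5)])
D = np.linalg.norm(profiles[:, None, :] - profiles[None, :, :], axis=-1)
coords, eigvals = pcoa(D)
print(np.round(coords, 2))
```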

  3. Top-Down Hydrogen-Deuterium Exchange Analysis of Protein Structures Using Ultraviolet Photodissociation.

    PubMed

    Brodie, Nicholas I; Huguet, Romain; Zhang, Terry; Viner, Rosa; Zabrouskov, Vlad; Pan, Jingxi; Petrotchenko, Evgeniy V; Borchers, Christoph H

    2018-03-06

    Top-down hydrogen-deuterium exchange (HDX) analysis using electron capture or transfer dissociation Fourier transform mass spectrometry (FTMS) is a powerful method for the analysis of secondary structure of proteins in solution. The resolution of the method is a function of the degree of fragmentation of backbone bonds in the proteins. While fragmentation is usually extensive near the N- and C-termini, electron capture (ECD) or electron transfer dissociation (ETD) fragmentation methods sometimes lack good coverage of certain regions of the protein, most often in the middle of the sequence. Ultraviolet photodissociation (UVPD) is a recently developed fast-fragmentation technique, which provides extensive backbone fragmentation that can be complementary in sequence coverage to the aforementioned electron-based fragmentation techniques. Here, we explore the application of electrospray ionization (ESI)-UVPD FTMS on an Orbitrap Fusion Lumos Tribrid mass spectrometer to top-down HDX analysis of proteins. We have incorporated UVPD-specific fragment-ion types and fragment-ion mixtures into our isotopic envelope fitting software (HDX Match) for the top-down HDX analysis. We have shown that UVPD data is complementary to ETD, thus improving the overall resolution when used as a combined approach.

  4. Uncertainty Requirement Analysis for the Orbit, Attitude, and Burn Performance of the 1st Lunar Orbit Insertion Maneuver

    NASA Astrophysics Data System (ADS)

    Song, Young-Joo; Bae, Jonghee; Kim, Young-Rok; Kim, Bang-Yeop

    2016-12-01

    In this study, the uncertainty requirements for orbit, attitude, and burn performance were estimated and analyzed for the execution of the 1st lunar orbit insertion (LOI) maneuver of the Korea Pathfinder Lunar Orbiter (KPLO) mission. During the early design phase of the system, associate analysis is an essential design factor as the 1st LOI maneuver is the largest burn that utilizes the onboard propulsion system; the success of the lunar capture is directly affected by the performance achieved. For the analysis, the spacecraft is assumed to have already approached the periselene with a hyperbolic arrival trajectory around the moon. In addition, diverse arrival conditions and mission constraints were considered, such as varying periselene approach velocity, altitude, and orbital period of the capture orbit after execution of the 1st LOI maneuver. The current analysis assumed an impulsive LOI maneuver, and two-body equations of motion were adapted to simplify the problem for a preliminary analysis. Monte Carlo simulations were performed for the statistical analysis to analyze diverse uncertainties that might arise at the moment when the maneuver is executed. As a result, three major requirements were analyzed and estimated for the early design phase. First, the minimum requirements were estimated for the burn performance to be captured around the moon. Second, the requirements for orbit, attitude, and maneuver burn performances were simultaneously estimated and analyzed to maintain the 1st elliptical orbit achieved around the moon within the specified orbital period. Finally, the dispersion requirements on the B-plane aiming at target points to meet the target insertion goal were analyzed and can be utilized as reference target guidelines for a mid-course correction (MCC) maneuver during the transfer. More detailed system requirements for the KPLO mission, particularly for the spacecraft bus itself and for the flight dynamics subsystem at the ground control center, are expected to be prepared and established based on the current results, including a contingency trajectory design plan.
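    Under the simplifying assumptions stated above (impulsive burn, two-body dynamics), the nominal LOI delta-v and the sensitivity of the resulting capture-orbit period to arrival and execution errors can be sketched with a few lines of Monte Carlo. All numbers below (arrival v-infinity, periselene altitude, target period, error sigmas) are illustrative assumptions, not KPLO mission values.

```python
import numpy as np

MU_MOON = 4902.8          # km^3/s^2
R_MOON = 1737.4           # km

def loi_delta_v(v_inf, peri_alt_km, target_period_s):
    """Impulsive delta-v at periselene to go from a hyperbolic arrival
    (excess speed v_inf) to an ellipse with the requested orbital period."""
    r_p = R_MOON + peri_alt_km
    v_hyp = np.sqrt(v_inf**2 + 2.0 * MU_MOON / r_p)
    a_ell = (MU_MOON * target_period_s**2 / (4.0 * np.pi**2)) ** (1.0 / 3.0)
    v_ell = np.sqrt(MU_MOON * (2.0 / r_p - 1.0 / a_ell))
    return v_hyp - v_ell, r_p, v_hyp

def period_after_burn(r_p, v_after):
    """Orbital period of the post-burn ellipse (vis-viva energy -> semi-major axis)."""
    energy = 0.5 * v_after**2 - MU_MOON / r_p
    a = -MU_MOON / (2.0 * energy)
    return 2.0 * np.pi * np.sqrt(a**3 / MU_MOON)

# Monte Carlo with assumed (illustrative) arrival and execution uncertainties
rng = np.random.default_rng(6)
n = 20000
v_inf = rng.normal(0.85, 0.005, n)            # km/s
alt = rng.normal(100.0, 5.0, n)               # km periselene altitude
dv_nom, r_p, v_hyp = loi_delta_v(v_inf, alt, target_period_s=12 * 3600)
dv_exec = dv_nom * rng.normal(1.0, 0.01, n)   # 1% burn magnitude error
periods = period_after_burn(r_p, v_hyp - dv_exec) / 3600.0

print(f"mean delta-v: {dv_nom.mean() * 1000:.0f} m/s, "
      f"fraction within +/-1 h of the 12 h target: {np.mean(np.abs(periods - 12.0) < 1.0):.3f}")
```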

  5. Model Performance Evaluation and Scenario Analysis (MPESA) Tutorial

    EPA Pesticide Factsheets

    The model performance evaluation consists of metrics and model diagnostics. These metrics provide modelers with statistical goodness-of-fit measures that capture magnitude-only, sequence-only, and combined magnitude and sequence errors.
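
    To illustrate the three families of measures, the sketch below computes one magnitude-only metric (bias), one sequence-only metric (rank correlation), and one combined metric (RMSE). These particular formulas are assumptions chosen for demonstration; they are not necessarily the metrics MPESA implements.

      # Illustrative examples of the three kinds of goodness-of-fit measures;
      # the specific formulas are assumptions for demonstration only.
      import numpy as np
      from scipy.stats import spearmanr

      obs = np.array([1.2, 3.4, 2.8, 5.1, 4.0])   # hypothetical observations
      sim = np.array([1.5, 3.0, 3.1, 4.6, 4.4])   # hypothetical model output

      magnitude_error = sim.mean() - obs.mean()          # magnitude only (bias)
      sequence_score, _ = spearmanr(obs, sim)            # sequence only (rank agreement)
      combined_error = np.sqrt(np.mean((sim - obs)**2))  # magnitude and sequence (RMSE)

      print(f"bias = {magnitude_error:.3f}, Spearman rho = {sequence_score:.3f}, "
            f"RMSE = {combined_error:.3f}")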

  6. Automated and connected vehicle implications and analysis.

    DOT National Transportation Integrated Search

    2017-05-01

    Automated and connected vehicles (ACV) and, in particular, autonomous vehicles have captured the interest of the public, industry and transportation authorities. ACVs can significantly reduce accidents, fuel consumption, pollution and the costs o...

  7. A Helitron transposon reconstructed from bats reveals a novel mechanism of genome shuffling in eukaryotes.

    PubMed

    Grabundzija, Ivana; Messing, Simon A; Thomas, Jainy; Cosby, Rachel L; Bilic, Ilija; Miskey, Csaba; Gogol-Döring, Andreas; Kapitonov, Vladimir; Diem, Tanja; Dalda, Anna; Jurka, Jerzy; Pritham, Ellen J; Dyda, Fred; Izsvák, Zsuzsanna; Ivics, Zoltán

    2016-03-02

    Helitron transposons capture and mobilize gene fragments in eukaryotes, but experimental evidence for their transposition is lacking in the absence of an isolated active element. Here we reconstruct Helraiser, an ancient element from the bat genome, and use this transposon as an experimental tool to unravel the mechanism of Helitron transposition. A hairpin close to the 3'-end of the transposon functions as a transposition terminator. However, the 3'-end can be bypassed by the transposase, resulting in transduction of flanking sequences to new genomic locations. Helraiser transposition generates covalently closed circular intermediates, suggestive of a replicative transposition mechanism, which provides a powerful means to disseminate captured transcriptional regulatory signals across the genome. Indeed, we document the generation of novel transcripts by Helitron promoter capture both experimentally and by transcriptome analysis in bats. Our results provide mechanistic insight into Helitron transposition, and its impact on diversification of gene function by genome shuffling.

  8. Wearable Stretch Sensors for Motion Measurement of the Wrist Joint Based on Dielectric Elastomers.

    PubMed

    Huang, Bo; Li, Mingyu; Mei, Tao; McCoul, David; Qin, Shihao; Zhao, Zhanfeng; Zhao, Jianwen

    2017-11-23

    Motion capture of the human body potentially holds great significance for exoskeleton robots, human-computer interaction, sports analysis, rehabilitation research, and many other areas. Dielectric elastomer sensors (DESs) are excellent candidates for wearable human motion capture systems because of their intrinsic softness, light weight, and compliance. In this paper, DESs were applied to measure all component motions of the wrist joint. Five sensors were mounted at different positions on the wrist, each corresponding to one component motion. To find the best positions for mounting the sensors, the distribution of the muscles was analyzed. Even so, the component motions and the deformations of the sensors remain coupled; therefore, a decoupling method was developed. With the decoupling algorithm, all component motions can be measured with a precision of 5°, which meets the requirements of general motion capture systems.
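
    The abstract does not specify the decoupling algorithm, so the sketch below shows one plausible approach: a linear calibration matrix fitted by least squares from readings taken during isolated component motions, then applied to coupled readings. All sensor values are hypothetical.

      # One plausible decoupling approach (linear calibration solved by least squares);
      # the paper's actual algorithm is not given in the abstract, so this is an
      # assumption for illustration only.
      import numpy as np

      # Calibration: record the 5 sensor readings while driving each wrist
      # component motion alone. Rows of S_cal are sensor readings, rows of
      # Q_cal the known joint angles in degrees (all values hypothetical).
      S_cal = np.array([[0.82, 0.10, 0.05, 0.02, 0.01],
                        [0.12, 0.91, 0.08, 0.03, 0.02],
                        [0.05, 0.07, 0.88, 0.10, 0.04],
                        [0.03, 0.02, 0.09, 0.85, 0.06],
                        [0.02, 0.03, 0.04, 0.07, 0.90]])
      Q_cal = np.diag([30.0, 30.0, 30.0, 30.0, 30.0])

      # Fit a decoupling matrix M such that Q ~= S @ M.
      M, *_ = np.linalg.lstsq(S_cal, Q_cal, rcond=None)

      # Apply it to a new, coupled reading to recover the component angles.
      s_new = np.array([0.45, 0.20, 0.10, 0.05, 0.03])
      print("estimated joint angles (deg):", s_new @ M)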

  9. Gold nanoparticle capture within protein crystal scaffolds.

    PubMed

    Kowalski, Ann E; Huber, Thaddaus R; Ni, Thomas W; Hartje, Luke F; Appel, Karina L; Yost, Jarad W; Ackerson, Christopher J; Snow, Christopher D

    2016-07-07

    DNA assemblies have been used to organize inorganic nanoparticles into 3D arrays, with emergent properties arising as a result of nanoparticle spacing and geometry. We report here the use of engineered protein crystals as an alternative approach to biologically mediated assembly of inorganic nanoparticles. The protein crystal's 13 nm diameter pores result in an 80% solvent content and display hexahistidine sequences on their interior. The hexahistidine sequence captures Au25(glutathione)∼17 (nitrilotriacetic acid)∼1 nanoclusters throughout a chemically crosslinked crystal via the coordination of Ni(ii) to both the cluster and the protein. Nanoparticle loading was validated by confocal microscopy and elemental analysis. The nanoparticles may be released from the crystal by exposure to EDTA, which chelates the Ni(ii) and breaks the specific protein/nanoparticle interaction. The integrity of the protein crystals after crosslinking and nanoparticle capture was confirmed by single crystal X-ray crystallography.

  10. Rapid differentiation of Rocky Mountain spotted fever from chickenpox, measles, and enterovirus infections and bacterial meningitis by frequency-pulsed electron capture gas-liquid chromatographic analysis of sera.

    PubMed Central

    Brooks, J B; McDade, J E; Alley, C C

    1981-01-01

    Normal sera and sera from patients with Rocky Mountain spotted fever, chickenpox, enterovirus infections, measles, and Neisseria meningitidis infections were extracted with organic solvents under acidic and basic conditions and then derivatized with trichloroethanol or heptafluorobutyric anhydride-ethanol to form electron-capturing derivatives of organic acids, alcohols, and amines. The derivatives were analyzed by frequency-pulsed electron capture gas-liquid chromatography (FPEC-GLC). There were unique differences in the FPEC-GLC profiles of sera obtained from patients with these respective diseases. With Rocky Mountain spotted fever patients, typical profiles were detected as early as 1 day after onset of disease and before antibody could be detected in the serum. Rapid diagnosis of Rocky Mountain spotted fever by FPEC-GLC could permit early and effective therapy, thus preventing many deaths from this disease. PMID:7276147

  11. Does apparent size capture attention in visual search? Evidence from the Müller-Lyer illusion.

    PubMed

    Proulx, Michael J; Green, Monique

    2011-11-23

    Is perceived size a crucial factor in the bottom-up guidance of attention? Here, a visual search experiment was used to examine whether an irrelevantly longer object can capture attention while participants searched for a vertical target item. The longer object was created by an apparent-size manipulation, the Müller-Lyer illusion; however, all objects contained the same number of pixels. The vertical target was detected more efficiently when it was also perceived as the longer item, as defined by apparent size. Further analysis revealed that the longer Müller-Lyer object received a greater degree of attentional priority than has been reported for other features such as retinal size, luminance contrast, and the abrupt onset of a new object. The present experiment demonstrates for the first time that apparent size can capture attention and thus provide bottom-up guidance on the basis of perceived salience.

  12. Learning and robustness to catch-and-release fishing in a shark social network

    PubMed Central

    Brown, Culum; Planes, Serge

    2017-01-01

    Individuals can play different roles in maintaining connectivity and social cohesion in animal populations and thereby influence population robustness to perturbations. We performed a social network analysis in a reef shark population to assess the vulnerability of the global network to node removal under different scenarios. We found that the network was generally robust to the removal of nodes with high centrality. The network also appeared highly robust to experimental fishing. Individual shark catchability decreased as a function of experience, as revealed by comparing capture frequency and site presence. Altogether, these features suggest that individuals learnt to avoid capture, which ultimately increased network robustness to experimental catch-and-release. Our results also suggest that some caution must be taken when using capture–recapture models, which are often used to assess population size, as their assumptions (such as equal probabilities of capture and recapture) may be violated when individuals learn to escape recapture. PMID:28298593
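
    A sketch of the kind of node-removal robustness check described above, using networkx with a random graph standing in for the shark association network; the graph, centrality measure, and removal fraction are assumptions for illustration.

      # Node-removal robustness check on a stand-in graph (assumption); the real
      # analysis would use the observed shark association network.
      import networkx as nx

      G = nx.erdos_renyi_graph(n=40, p=0.15, seed=1)   # hypothetical social network

      def largest_component_fraction(graph):
          if graph.number_of_nodes() == 0:
              return 0.0
          return max(len(c) for c in nx.connected_components(graph)) / graph.number_of_nodes()

      # Remove the 10% most central nodes (betweenness) and measure cohesion.
      centrality = nx.betweenness_centrality(G)
      targets = sorted(centrality, key=centrality.get, reverse=True)[:len(G) // 10]
      H = G.copy()
      H.remove_nodes_from(targets)

      print("before removal:", largest_component_fraction(G))
      print("after removal: ", largest_component_fraction(H))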

  13. CliniProteus: A flexible clinical trials information management system

    PubMed Central

    Mathura, Venkatarajan S; Rangareddy, Mahendiranath; Gupta, Pankaj; Mullan, Michael

    2007-01-01

    Clinical trials involve multi-site, heterogeneous data generation with complex data input formats and forms. These data should be captured and queried in an integrated fashion to facilitate further analysis. Electronic case-report forms (eCRFs) are gaining popularity because they allow clinical information to be captured rapidly. We have designed and developed an XML-based, flexible clinical trials data management framework in the .NET environment that can be used for efficient design and deployment of eCRFs to collate data and analyze information from multi-site clinical trials. The main components of our system include an XML form designer, a patient registration eForm, reusable eForms, multiple-visit data capture, and consolidated reports. A unique ID is used for tracking the trial, the site of occurrence, the patient, and the year of recruitment. Availability: http://www.rfdn.org/bioinfo/CTMS/ctms.html. PMID:21670796
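
    The abstract states that a single ID encodes the trial, site, patient, and recruitment year, but does not give the exact layout; the format in the sketch below is a hypothetical illustration only.

      # Hypothetical ID layout encoding trial, site, patient, and recruitment year;
      # the actual CliniProteus format is not specified in the abstract.
      def make_record_id(trial: str, site: int, patient: int, year: int) -> str:
          return f"{trial}-{site:02d}-{patient:04d}-{year}"

      def parse_record_id(record_id: str):
          trial, site, patient, year = record_id.split("-")
          return {"trial": trial, "site": int(site),
                  "patient": int(patient), "year": int(year)}

      rid = make_record_id("TRIALX", site=3, patient=57, year=2007)
      print(rid)                    # TRIALX-03-0057-2007
      print(parse_record_id(rid))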

  14. Capture and release of cancer cells using electrospun etchable MnO2 nanofibers integrated in microchannels

    NASA Astrophysics Data System (ADS)

    Liu, Hui-qin; Yu, Xiao-lei; Cai, Bo; You, Su-jian; He, Zhao-bo; Huang, Qin-qin; Rao, Lang; Li, Sha-sha; Liu, Chang; Sun, Wei-wei; Liu, Wei; Guo, Shi-shang; Zhao, Xing-zhong

    2015-03-01

    This paper introduces a cancer cell capture/release microchip based on self-sacrificing MnO2 nanofibers. Through electrospinning, lift-off, and soft-lithography procedures, MnO2 nanofibers are fabricated inside microchannels to enable enrichment and release of cancer cells from liquid samples. The MnO2 nanofiber network, which mimics the extracellular matrix, provides high capture ability with the help of a cancer cell-specific antibody bio-conjugation. Subsequently, an effective and gentle release method is implemented by using a low concentration of oxalic acid to dissolve the MnO2 nanofiber substrate while maintaining high viability of the released cancer cells. It is conceivable that our microchip may have potential for biomedical analysis of circulating tumor cells in biological and clinical oncology research.

  15. Automatically detect and track infrared small targets with kernel Fukunaga-Koontz transform and Kalman prediction.

    PubMed

    Liu, Ruiming; Liu, Erqi; Yang, Jie; Zeng, Yong; Wang, Fanglin; Cao, Yuan

    2007-11-01

    The Fukunaga-Koontz transform (FKT), stemming from principal component analysis (PCA), is used in many pattern recognition and image-processing fields. However, it cannot capture the higher-order statistical properties of natural images, so its detection performance is unsatisfactory. PCA has been extended into kernel PCA in order to capture higher-order statistics, but a kernel FKT (KFKT) had not previously been explicitly proposed or evaluated for detection. To accurately detect potential small targets in infrared images, we first extend FKT into KFKT to capture the higher-order statistical properties of images. We then develop a framework based on Kalman prediction and KFKT that can automatically detect and track small targets. Experimental results show that KFKT outperforms FKT and that the proposed framework can automatically detect and track infrared point targets.
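
    The tracking side of the framework rests on Kalman prediction; the sketch below shows a minimal constant-velocity predict/update step of the kind used to anticipate where to search for the target in the next frame. The noise parameters and measurements are assumptions, and the KFKT detector itself is not shown.

      # Minimal constant-velocity Kalman predict/update step; noise values and the
      # measurement are assumptions, and the KFKT detector is not shown here.
      import numpy as np

      dt = 1.0                                   # frame interval
      F = np.array([[1, 0, dt, 0],               # state: [x, y, vx, vy]
                    [0, 1, 0, dt],
                    [0, 0, 1,  0],
                    [0, 0, 0,  1]], dtype=float)
      H = np.array([[1, 0, 0, 0],
                    [0, 1, 0, 0]], dtype=float)  # only position is measured
      Q = 0.01 * np.eye(4)                       # assumed process noise
      R = 1.0 * np.eye(2)                        # assumed measurement noise

      x = np.array([10.0, 20.0, 1.0, 0.5])       # hypothetical initial state
      P = np.eye(4)

      def predict(x, P):
          return F @ x, F @ P @ F.T + Q

      def update(x, P, z):
          S = H @ P @ H.T + R
          K = P @ H.T @ np.linalg.inv(S)         # Kalman gain
          x = x + K @ (z - H @ x)
          P = (np.eye(4) - K @ H) @ P
          return x, P

      x_pred, P_pred = predict(x, P)             # search the next frame around x_pred[:2]
      x, P = update(x_pred, P_pred, z=np.array([11.2, 20.4]))
      print("predicted position:", x_pred[:2], "updated position:", x[:2])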

  16. Automatically detect and track infrared small targets with kernel Fukunaga-Koontz transform and Kalman prediction

    NASA Astrophysics Data System (ADS)

    Liu, Ruiming; Liu, Erqi; Yang, Jie; Zeng, Yong; Wang, Fanglin; Cao, Yuan

    2007-11-01

    The Fukunaga-Koontz transform (FKT), stemming from principal component analysis (PCA), is used in many pattern recognition and image-processing fields. However, it cannot capture the higher-order statistical properties of natural images, so its detection performance is unsatisfactory. PCA has been extended into kernel PCA in order to capture higher-order statistics, but a kernel FKT (KFKT) had not previously been explicitly proposed or evaluated for detection. To accurately detect potential small targets in infrared images, we first extend FKT into KFKT to capture the higher-order statistical properties of images. We then develop a framework based on Kalman prediction and KFKT that can automatically detect and track small targets. Experimental results show that KFKT outperforms FKT and that the proposed framework can automatically detect and track infrared point targets.

  17. Doped phosphorene for hydrogen capture: A DFT study

    NASA Astrophysics Data System (ADS)

    Zhang, Hong-ping; Hu, Wei; Du, Aijun; Lu, Xiong; Zhang, Ya-ping; Zhou, Jian; Lin, Xiaoyan; Tang, Youhong

    2018-03-01

    Hydrogen capture and storage is at the core of hydrogen energy applications. With its high specific surface area, direct bandgap, and variety of potential applications, phosphorene has attracted much research interest. In this study, density functional theory (DFT) is utilized to study the interactions between doped phosphorenes and hydrogen molecules. The effects of different dopants, both metallic and nonmetallic, on phosphorene/hydrogen interactions are systematically studied through adsorption energies, electron density differences, partial density of states analysis, and Hirshfeld population analysis. Our results indicate that the metallic dopants Pt, Co, and Ni can help to improve the hydrogen capture ability of phosphorene, whereas the nonmetallic dopants have no effect on it. Among the metallic dopants, Pt behaves distinctly in that it can help to dissociate H2 on phosphorene. Suitably doped phosphorene could be a promising candidate for hydrogen storage, with behavior superior to that of an intrinsic graphene sheet.
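
    The comparison across dopants rests on adsorption-energy bookkeeping of the form E_ads = E(doped phosphorene + H2) - E(doped phosphorene) - E(H2), where a more negative value indicates stronger capture. The total energies in the sketch below are placeholders, not the paper's DFT values.

      # Adsorption-energy bookkeeping used to compare dopants:
      #   E_ads = E(doped phosphorene + H2) - E(doped phosphorene) - E(H2)
      # The total energies below are placeholders, not the paper's DFT results.
      def adsorption_energy(e_complex_eV, e_substrate_eV, e_h2_eV):
          return e_complex_eV - e_substrate_eV - e_h2_eV

      hypothetical = {                     # (E_complex, E_substrate, E_H2) in eV
          "Pt-doped": (-1234.95, -1203.35, -31.25),
          "Ni-doped": (-1198.72, -1167.25, -31.25),
          "B-doped":  (-1150.41, -1119.12, -31.25),
      }
      for dopant, energies in hypothetical.items():
          print(f"{dopant}: E_ads = {adsorption_energy(*energies):+.2f} eV")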

  18. Resonant electron capture by aspartame and aspartic acid molecules.

    PubMed

    Muftakhov, M V; Shchukin, P V

    2016-12-30

    Dissociative electron capture processes are key mechanisms in the decomposition of biomolecules, proteins in particular, under interaction with low-energy electrons. Molecules of aspartic acid and aspartame (a modified dipeptide) were studied here to define the impact of the side functional groups on peptide chain decomposition in resonant electron-molecule reactions. The formation and decomposition of negative ions of both aspartame and aspartic acid were studied by negative-ion mass spectrometry under resonant electron capture. The mass spectra obtained were interpreted with the aid of thermochemical analysis based on quantum chemical calculations. The main fragmentation channels of the negative molecular ions were identified, together with the characteristic fragment ions. The COOH fragment of the side chain in aspartic acid is shown to play a key role, as the carboxyl group does in amino acids and aliphatic oligopeptides. Copyright © 2016 John Wiley & Sons, Ltd.

  19. International Alzheimer's Disease Research Portfolio (IADRP) aims to capture global Alzheimer's disease research funding.

    PubMed

    Liggins, Charlene; Snyder, Heather M; Silverberg, Nina; Petanceska, Suzana; Refolo, Lorenzo M; Ryan, Laurie; Carrillo, Maria C

    2014-05-01

    Alzheimer's disease (AD) is a recognized international public health crisis. There is an urgent need for public and private funding agencies around the world to coordinate funding strategies and leverage existing resources to enhance and expand support of AD research. To capture and compare their existing investments in AD research and research-related resources, major funding organizations are starting to use the Common Alzheimer's Disease Research Ontology (CADRO) to categorize their funding information. This information is captured in the International Alzheimer's Disease Research Portfolio (IADRP) for further analysis. As of January 2014, more than fifteen organizations from the US, Canada, Europe, and Australia had contributed their information. The goal of the IADRP project is to enable funding organizations to assess the changing landscape of AD research, coordinate strategies, leverage resources, and avoid duplication of effort. Copyright © 2014. Published by Elsevier Inc.

  20. DENSITY: software for analysing capture-recapture data from passive detector arrays

    USGS Publications Warehouse

    Efford, M.G.; Dawson, D.K.; Robbins, C.S.

    2004-01-01

    A general computer-intensive method is described for fitting spatial detection functions to capture-recapture data from arrays of passive detectors such as live traps and mist nets. The method is used to estimate the population density of 10 species of breeding birds sampled by mist-netting in deciduous forest at Patuxent Research Refuge, Laurel, Maryland, U.S.A., from 1961 to 1972. Total density (9.9 ± 0.6 ha⁻¹, mean ± SE) appeared to decline over time (slope −0.41 ± 0.15 ha⁻¹ y⁻¹). The mean precision of annual estimates for all 10 species pooled was acceptable (CV(D) = 14%). Spatial analysis of closed-population capture-recapture data highlighted deficiencies in non-spatial methodologies. For example, effective trapping area cannot be assumed constant when detection probability is variable. Simulation may be used to evaluate alternative designs for mist net arrays where density estimation is a study goal.
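
    Spatial detection functions in this literature are commonly half-normal in form, with an intercept g0 and a spatial scale sigma; the sketch below evaluates such a function at a few detector distances. The parameter values are assumed for illustration, not estimates from the Patuxent data.

      # Half-normal spatial detection function of the kind fitted to passive-detector
      # capture-recapture data; g0 and sigma are assumed values, not Patuxent estimates.
      import numpy as np

      def half_normal_detection(d_m, g0=0.1, sigma_m=60.0):
          """Per-occasion capture probability at distance d_m from a detector."""
          return g0 * np.exp(-(d_m**2) / (2.0 * sigma_m**2))

      for d in (0, 50, 100, 200):
          print(f"distance {d:3d} m -> capture probability {half_normal_detection(d):.4f}")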
