2016-09-08
A Self-powered thin-film radiation detector using intrinsic...
10.1118/1.4935531. Purpose: We introduce a radiation detection method that relies on high-energy current (HEC) formed by secondary charged particles in the detector material... Keywords: photocurrent, radiation detection, self-powered, thin-film.
Air Force Maintenance Technician Performance Measurement.
1979-12-28
AFIT STUDENT AT: Arizona State Univ. ...highly inflated, or provide incomplete and non-current coverage of maintenance organizations. The performance appraisal method developed relies on subjective...
Alternatives for Jet Engine Control
NASA Technical Reports Server (NTRS)
Leake, R. J.; Sain, M. K.
1976-01-01
Approaches are developed as alternatives to current design methods which rely heavily on linear quadratic and Riccati equation methods. The main alternatives are discussed in two broad categories, local multivariable frequency domain methods and global nonlinear optimal methods.
Lepora, Nathan F; Blomeley, Craig P; Hoyland, Darren; Bracci, Enrico; Overton, Paul G; Gurney, Kevin
2011-11-01
The study of active and passive neuronal dynamics usually relies on a sophisticated array of electrophysiological, staining and pharmacological techniques. We describe here a simple complementary method that recovers many findings of these more complex methods but relies only on a basic patch-clamp recording approach. Somatic short and long current pulses were applied in vitro to striatal medium spiny (MS) and fast spiking (FS) neurons from juvenile rats. The passive dynamics were quantified by fitting two-compartment models to the short current pulse data. Lumped conductances for the active dynamics were then found by compensating this fitted passive dynamics within the current-voltage relationship from the long current pulse data. These estimated passive and active properties were consistent with previous more complex estimations of the neuron properties, supporting the approach. Relationships within the MS and FS neuron types were also evident, including a graduation of MS neuron properties consistent with recent findings about D1 and D2 dopamine receptor expression. Application of the method to simulated neuron data supported the hypothesis that it gives reasonable estimates of membrane properties and gross morphology. Therefore detailed information about the biophysics can be gained from this simple approach, which is useful for both classification of neuron type and biophysical modelling. Furthermore, because these methods rely upon no manipulations to the cell other than patch clamping, they are ideally suited to in vivo electrophysiology. © 2011 The Authors. European Journal of Neuroscience © 2011 Federation of European Neuroscience Societies and Blackwell Publishing Ltd.
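To make the passive-fitting step concrete: the voltage transient that a two-compartment model predicts for a small somatic current step is a sum of two exponentials, which can be fit by nonlinear least squares. A minimal sketch (the function form, units, and initial guesses are assumptions, not the authors' code):

```python
# Generic two-exponential fit of a passive charging transient
# (illustrative sketch, not the paper's fitting procedure).
import numpy as np
from scipy.optimize import curve_fit

def two_compartment_response(t, a1, tau1, a2, tau2):
    """Voltage deflection after a current step at t = 0: the sum of a
    fast (somatic) and a slow (dendritic) charging component."""
    return a1 * (1 - np.exp(-t / tau1)) + a2 * (1 - np.exp(-t / tau2))

def fit_passive(t, v, p0=(5.0, 0.005, 2.0, 0.05)):
    """Least-squares estimates of amplitudes and time constants.
    t in seconds, v in mV; p0 is an assumed initial guess."""
    params, _ = curve_fit(two_compartment_response, t, v, p0=p0)
    return dict(zip(("a1", "tau1", "a2", "tau2"), params))
```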
Gotti, Valeria Bisinoto; Feitosa, Victor Pinheiro; Sauro, Salvatore; Correr-Sobrinho, Lourenço; Correr, Americo Bortolazzo
2014-10-01
To evaluate the effects of an electric current-assisted application on the bond strength and interfacial morphology of self-adhesive resin cements bonded to dentin. Indirect resin composite build-ups were luted to prepared dentin surfaces using two self-adhesive resin cements (RelyX Unicem and BisCem) and an ElectroBond device under 0, 20, or 40 μA electrical current. All specimens were subjected to microtensile bond strength testing and interfacial SEM analysis. The electric current-assisted application induced no change (P > 0.05) in the overall bond strength, although RelyX Unicem showed significantly higher bond strength (P < 0.05) than BisCem. Similarly, no differences were observed in terms of interfacial integrity when using the electrical current applicator.
In-Situ Transfer Standard and Coincident-View Intercomparisons for Sensor Cross-Calibration
NASA Technical Reports Server (NTRS)
Thome, Kurt; McCorkel, Joel; Czapla-Myers, Jeff
2013-01-01
There exist numerous methods for accomplishing on-orbit calibration. Methods include the reflectance-based approach, which relies on measurements of surface and atmospheric properties at the time of a sensor overpass, as well as invariant scene approaches, which rely on knowledge of the temporal characteristics of the site. The current work examines typical cross-calibration methods and discusses their expected uncertainties. Data from the Advanced Land Imager (ALI), Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER), Enhanced Thematic Mapper Plus (ETM+), Moderate Resolution Imaging Spectroradiometer (MODIS), and Thematic Mapper (TM) are used to demonstrate the limits of relative sensor-to-sensor calibration as applied to current sensors, while Landsat-5 TM and Landsat-7 ETM+ are used to evaluate the limits of in situ site characterizations for SI-traceable cross-calibration. The current work examines the difficulties in trending results from cross-calibration approaches, taking into account sampling issues, site-to-site variability, and accuracy of the method. Special attention is given to the differences caused in the cross-comparison of sensors in radiance space as opposed to reflectance space. The results show that cross-calibrations with absolute uncertainties less than 1.5 percent (1 sigma) are currently achievable even for sensors without coincident views.
The Utility of Robust Means in Statistics
ERIC Educational Resources Information Center
Goodwyn, Fara
2012-01-01
Location estimates calculated from heuristic data were examined using traditional and robust statistical methods. The current paper demonstrates the impact outliers have on the sample mean and proposes robust methods to control for outliers in sample data. Traditional methods fail because they rely on the statistical assumptions of normality and…
Management system to a photovoltaic panel based on the measurement of short-circuit currents
NASA Astrophysics Data System (ADS)
Dordescu, M.
2016-12-01
This article is devoted to fundamental issues arising in the operation of a photovoltaic (PV) panel with a view to increased energy efficiency. By measuring the short-circuit current, the prescribed current value corresponding to the maximum power point is determined; loading the panel at this prescribed value extracts the maximum energy possible, justifying the usefulness of this very simple and inexpensive process in practice. The proposed adjustment method is much simpler and more economical than conventional methods that rely on power measurement.
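For orientation: this is the idea commonly known as fractional short-circuit-current maximum power point tracking, in which the operating current is prescribed as a fixed fraction of a momentary short-circuit measurement. A minimal sketch (the fraction k and the controller interface are assumptions, not taken from the paper):

```python
# Fractional short-circuit-current MPPT sketch (illustrative). For many
# silicon cells the maximum-power-point current is roughly proportional
# to the short-circuit current: I_mp ~= k * I_sc with k ~ 0.8-0.9.
K_FRACTION = 0.85  # assumed proportionality constant

def prescribed_current(i_sc: float, k: float = K_FRACTION) -> float:
    """Current setpoint corresponding to the maximum power point."""
    return k * i_sc

def mppt_cycle(measure_i_sc, set_load_current):
    """One control cycle: briefly short the panel to measure I_sc, then
    command the load so the panel operates at the prescribed current."""
    i_sc = measure_i_sc()  # momentary short-circuit measurement
    set_load_current(prescribed_current(i_sc))
```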
Current neurotoxicity and developmental neurotoxicity testing methods for hazard identification rely on in vivo neurobehavior, neurophysiological, and gross pathology of the nervous system. These measures may not be sensitive enough to detect small changes caused by realistic ex...
Comparison of Virtual Oscillator and Droop Control: Preprint
DOE Office of Scientific and Technical Information (OSTI.GOV)
Johnson, Brian B; Rodriguez, Miguel; Dhople, Sairaj
Virtual oscillator control (VOC) and droop control (DC) are two techniques that can be used to ensure synchronization and power sharing of parallel inverters in islanded operation. VOC relies on the implementation of nonlinear Van der Pol oscillator equations in the control system of the inverter, acting upon the time-domain instantaneous inverter current and terminal voltage. On the other hand, DC explicitly computes the active and reactive power produced by the inverter and relies on limited-bandwidth low-pass filters. Even though both methods can be engineered to produce the same steady-state characteristics, their dynamic performances are significantly different. This paper presents analytical and experimental results that aim to compare both methods. It is shown that VOC is inherently faster and enables minimizing the circulating currents. The results are verified using three 120 V, 1 kW inverters.
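As a rough illustration of the class of dynamics VOC implements, here is a generic Van der Pol oscillator perturbed by the measured inverter current (a sketch under assumed placeholder gains, not the preprint's controller):

```python
# Generic Van der Pol oscillator, the nonlinearity at the heart of
# virtual oscillator control (illustrative; EPS and the coupling gain
# are placeholders, not values from the preprint).
import numpy as np

EPS = 1.0  # assumed nonlinearity strength

def van_der_pol_rhs(state, i_inverter, coupling=0.1):
    """Time derivative of the oscillator state [v, w], perturbed by the
    instantaneous inverter current as in VOC-style controllers."""
    v, w = state
    dv = w
    dw = EPS * (1.0 - v**2) * w - v - coupling * i_inverter
    return np.array([dv, dw])

def euler_step(state, i_inverter, dt=1e-5):
    """Advance the virtual oscillator one step; v sets the voltage command."""
    return state + dt * van_der_pol_rhs(state, i_inverter)
```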
A Condition Based Maintenance Approach to Forecasting B-1 Aircraft Parts
2017-03-23
Problem Statement...aimed at making the USAF aware of CBM methods, and recommending which techniques to consider for implementation. Problem Statement: The USAF relies on... problem, this research will seek to highlight common CBM forecasting methods that are well established and evaluate their suitability with current USAF
Comparative effectiveness research methodology using secondary data: A starting user's guide.
Sun, Maxine; Lipsitz, Stuart R
2018-04-01
The use of secondary data, such as claims or administrative data, in comparative effectiveness research has grown tremendously in recent years. We believe that the current review can help investigators relying on secondary data to (1) gain insight into both the methodologies and statistical methods, (2) better understand the necessity of rigorous planning before initiating a comparative effectiveness investigation, and (3) optimize the quality of their investigations. Specifically, we review concepts of adjusted analyses and confounders, methods of propensity score analysis and instrumental variable analysis, risk prediction models (logistic and time-to-event), decision-curve analysis, and the interpretation of the P value and hypothesis testing. Copyright © 2017 Elsevier Inc. All rights reserved.
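As a pointer to what one of the reviewed techniques looks like in code, here is a minimal propensity-score/inverse-probability-weighting sketch (illustrative only; the estimator choice and variable names are assumptions, not taken from the review):

```python
# Minimal inverse-probability-weighting (IPW) estimate of an average
# treatment effect for a binary treatment (illustrative sketch).
import numpy as np
from sklearn.linear_model import LogisticRegression

def ipw_effect(X, treated, outcome):
    """X: (n, p) confounders; treated: (n,) array of 0/1; outcome: (n,)."""
    # Propensity score: modeled probability of treatment given confounders.
    ps = LogisticRegression(max_iter=1000).fit(X, treated).predict_proba(X)[:, 1]
    # Weight each subject by the inverse probability of the arm received.
    w = np.where(treated == 1, 1.0 / ps, 1.0 / (1.0 - ps))
    mean_treated = np.average(outcome[treated == 1], weights=w[treated == 1])
    mean_control = np.average(outcome[treated == 0], weights=w[treated == 0])
    return mean_treated - mean_control
```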
Colorimetric micro-assay for accelerated screening of mould inhibitors
Carol A. Clausen; Vina W. Yang
2013-01-01
Since current standard laboratory methods are time-consuming macro-assays that rely on subjective visual ratings of mould growth, rapid and quantitative laboratory methods are needed to screen potential mould inhibitors for use in and on cellulose-based products. A colorimetric micro-assay has been developed that uses XTT tetrazolium salt to enzymatically assess...
Current methods designed to control and reduce the amount of sulfur dioxide emitted into the atmosphere from coal-fired power plants and factories rely upon the reaction between SO2 and alkaline earth compounds and are called flue gas desulfurization (FGD) processes. Of these met...
Current methods for screening, testing and monitoring endocrine-disrupting chemicals (EDCs) rely relatively substantially upon moderate- to long-term assays that can, in some instances, require significant numbers of animals. Recent developments in the areas of in vitro testing...
Validation of Salivary Immunoassays for Waterborne Infections
Assessments of the health outcomes associated with exposure to fecally-contaminated water and inadequate sanitation and hygiene (WASH) currently rely upon self-reported symptoms and invasive collection of blood and stool samples. However, these methods are limited in their abilit...
Physics of Tokamak Plasma Start-up
NASA Astrophysics Data System (ADS)
Mueller, Dennis
2012-10-01
This tutorial describes and reviews the state of the art in tokamak plasma start-up and its importance to next-step devices such as ITER, a Fusion Nuclear Science Facility, and a tokamak/ST demo. Tokamak plasma start-up includes breakdown of the initial gas, ramp-up of the plasma current to its final value, and the control of plasma parameters during those phases. Tokamaks rely on an inductive component, typically a central solenoid, which has enabled the attainment of the high performance levels that led to the construction of the ITER device. Optimizing the inductive start-up phase continues to be an area of active research, especially with regard to achieving ITER scenarios. A new generation of superconducting tokamaks, EAST and KSTAR, experiments on DIII-D, and operation with JET's ITER-like wall are contributing to this effort. Inductive start-up relies on transformer action to generate a toroidal loop voltage, and successful start-up is determined by gas breakdown, avalanche physics, and plasma-wall interaction. The goal of achieving steady-state tokamak operation has motivated interest in other start-up methods that do not rely on the central solenoid. These include Coaxial Helicity Injection, outer poloidal field coil start-up, and point source helicity injection, which have achieved 200, 150, and 100 kA, respectively, of toroidal current on closed flux surfaces. Other methods, including merging reconnection start-up and Electron Bernstein Wave (EBW) plasma start-up, are being studied on various devices. EBW start-up generates a directed electron channel through wave-particle interaction physics, while the other methods mentioned rely on magnetic helicity injection and magnetic reconnection, which are being modeled and understood using NIMROD code simulations.
Development of a hazard-based method for evaluating the fire safety of passenger trains
DOT National Transportation Integrated Search
1999-01-01
The fire safety of U.S. passenger rail trains currently is addressed through small-scale flammability and smoke emission tests and performance criteria promulgated by the Federal Railroad Administration (FRA). The FRA approach relies heavily on test ...
A simple test procedure for evaluating low temperature crack resistance of asphalt concrete.
DOT National Transportation Integrated Search
2009-11-01
The current means of evaluating the low temperature cracking resistance of HMA relies on extensive test methods that require assumptions about material behaviors and the use of complicated loading equipment. The purpose of this study was to devel...
Nanotechnology-Based Detection and Targeted Therapy in Cancer: Nano-Bio Paradigms and Applications
Mousa, Shaker A.; Bharali, Dhruba J.
2011-01-01
The application of nanotechnology to biomedicine, particularly in cancer diagnosis and treatment, promises to have a profound impact on healthcare. The exploitation of the unique properties of nano-sized particles for cancer therapeutics is most popularly known as nanomedicine. The goals of this review are to discuss the current state of nanomedicine in the field of cancer detection and the subsequent application of nanotechnology to treatment. Current cancer detection methods rely on the patient contacting their provider when they feel ill, or on non-specific screening methods, which unfortunately often result in cancers being detected only after it is too late for effective treatment. Cancer treatment paradigms mainly rely on whole-body treatment with chemotherapy agents, exposing the patient to medications that non-specifically kill rapidly dividing cells, leading to debilitating side effects. In addition, the use of toxic organic solvents/excipients can further hamper the effectiveness of the anticancer drug. Nanomedicine has the potential to increase the specificity of treatment of cancer cells while leaving healthy cells intact through the use of novel nanoparticles. This review discusses the use of nanoparticles such as quantum dots, nanoshells, nanocrystals, nanocells, and dendrimers for the detection and treatment of cancer. Future directions and perspectives of this cutting-edge technology are also discussed. PMID:24212938
A Self-Directed Method for Cell-Type Identification and Separation of Gene Expression Microarrays
Zuckerman, Neta S.; Noam, Yair; Goldsmith, Andrea J.; Lee, Peter P.
2013-01-01
Gene expression analysis is generally performed on heterogeneous tissue samples consisting of multiple cell types. Current methods developed to separate heterogeneous gene expression rely on prior knowledge of the cell-type composition and/or signatures - these are not available in most public datasets. We present a novel method to identify the cell-type composition, signatures, and proportions per sample without need for a priori information. The method was successfully tested on controlled and semi-controlled datasets and performed as accurately as current methods that do require additional information. As such, this method enables the analysis of cell-type specific gene expression using existing large pools of publicly available microarray datasets. PMID:23990767
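For orientation, the underlying deconvolution problem can be written as X ≈ SP, where X is the observed genes-by-samples expression matrix, S holds per-cell-type signatures, and P holds per-sample proportions. The sketch below uses off-the-shelf non-negative matrix factorization as a stand-in; the paper's self-directed method differs in detail:

```python
# Blind separation of mixed expression profiles via NMF (a stand-in
# illustration; not the authors' algorithm).
import numpy as np
from sklearn.decomposition import NMF

def deconvolve(X: np.ndarray, n_cell_types: int):
    """X: non-negative (genes x samples) expression matrix."""
    model = NMF(n_components=n_cell_types, init="nndsvda", max_iter=500)
    S = model.fit_transform(X)            # genes x cell types: signatures
    P = model.components_                 # cell types x samples: raw weights
    P = P / P.sum(axis=0, keepdims=True)  # normalize columns to proportions
    return S, P
```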
DOT National Transportation Integrated Search
2009-11-01
Current practice with regard to designing bridge structures to resist impact loads associated with barge collisions relies upon the use of the American Association of State Highway and Transportation Officials (AASHTO) bridge design specifications....
A proposal: incorporating odonates into stream bioassessments using DNA barcodes
Bioassessment/biomonitoring uses the species found in an ecosystem as a way to measure the health of that ecosystem. Current methods rely mainly on mayflies, stoneflies and caddisflies as indicators for streams and rivers. Odonate larvae are also collected during sampling for bi...
Root removal to improve disease management in replanted Washington red raspberry fields
USDA-ARS?s Scientific Manuscript database
Washington leads the nation in the production of red raspberries for processing. Soilborne pathogens are a production constraint in this $61 million industry with growers relying on preplant soil fumigation for their management. However, current fumigation methods can be ineffective, leading to repl...
RAPID DETECTION METHOD FOR E.COLI, ENTEROCOCCI AND BACTEROIDES IN RECREATIONAL WATER
Current methodology for determining fecal contamination of drinking water sources and recreational waters relies on the time-consuming process of bacterial multiplication and requires at least 24 hours from the time of sampling to the possible determination that the water is unsafe ...
Federal and state agencies responsible for protecting water quality rely mainly on statistically-based methods to assess and manage risks to the nation's streams, lakes and estuaries. Although statistical approaches provide valuable information on current trends in water quality...
THE NEED FOR SPEED-RAPID METHODOLOGIES TO DETERMINE BATHING BEACH WATER QUALITY
Current methods for determining fecal contamination of recreational waters rely on the culture of bacterial indicators and require at least 24 hours to determine whether the water is unsafe for use. By the time monitoring results are available, exposures have already occurred. N...
The Use of Propensity Scores in Mediation Analysis
ERIC Educational Resources Information Center
Jo, Booil; Stuart, Elizabeth A.; MacKinnon, David P.; Vinokur, Amiram D.
2011-01-01
Mediation analysis uses measures of hypothesized mediating variables to test theory for how a treatment achieves effects on outcomes and to improve subsequent treatments by identifying the most efficient treatment components. Most current mediation analysis methods rely on untested distributional and functional form assumptions for valid…
Understanding the biological effects of exposures to chemicals in the environment relies on classical methods and emerging technologies in the areas of genomics, proteomics, and metabonomics. Linkages between the historical and newer toxicological tools are currently being devel...
Three-dimensional Imaging and Scanning: Current and Future Applications for Pathology
Farahani, Navid; Braun, Alex; Jutt, Dylan; Huffman, Todd; Reder, Nick; Liu, Zheng; Yagi, Yukako; Pantanowitz, Liron
2017-01-01
Imaging is vital for the assessment of physiologic and phenotypic details. In the past, biomedical imaging was heavily reliant on analog, low-throughput methods, which would produce two-dimensional images. However, newer, digital, and high-throughput three-dimensional (3D) imaging methods, which rely on computer vision and computer graphics, are transforming the way biomedical professionals practice. 3D imaging has been useful in diagnostic, prognostic, and therapeutic decision-making for the medical and biomedical professions. Herein, we summarize current imaging methods that enable optimal 3D histopathologic reconstruction: Scanning, 3D scanning, and whole slide imaging. Briefly mentioned are emerging platforms, which combine robotics, sectioning, and imaging in their pursuit to digitize and automate the entire microscopy workflow. Finally, both current and emerging 3D imaging methods are discussed in relation to current and future applications within the context of pathology. PMID:28966836
Toward a Robust Method of Presenting a Rich, Interconnected Deceptive Network Topology
2015-03-01
have large financial implications, as peering relationships dictate the cost of each byte that flows between two ASes. In the same vein, net...documented tool that can be replicated and tested in order to improve current topology measurement systems. Secondly, we do not rely upon BGP shenanigans
Designing a Pedagogical Model for Web Engineering Education: An Evolutionary Perspective
ERIC Educational Resources Information Center
Hadjerrouit, Said
2005-01-01
In contrast to software engineering, which relies on relatively well established development approaches, there is a lack of a proven methodology that guides Web engineers in building reliable and effective Web-based systems. Currently, Web engineering lacks process models, architectures, suitable techniques and methods, quality assurance, and a…
Label-free SERS detection of Salmonella Typhimurium on DNA aptamer modified AgNR substrates
USDA-ARS?s Scientific Manuscript database
Salmonella Typhimurium is an important foodborne pathogen which causes gastroenteritis in both humans and animals. Currently available rapid methods have relied on antibodies to offer specific recognition of the pathogen from the background. As a substitute of antibodies, nucleic acid aptamers offer...
USDA-ARS?s Scientific Manuscript database
Contamination by aflatoxin, a toxic metabolite produced by Aspergillus fungi ubiquitous in California almond and pistachio orchards, results in millions of dollars of lost product annually. Current detection of aflatoxin relies on destructive, expensive and time-intensive laboratory-based methods. T...
DOT National Transportation Integrated Search
2011-12-01
Current AASHTO provisions for the conventional load rating of flat slab bridges rely on the equivalent strip method of analysis for determining live load effects; this is generally regarded as overly conservative by many professional engineers. A...
TCR-Vß8 as alternative to animal testing for quantifying active SEE
USDA-ARS?s Scientific Manuscript database
Staphylococcal food poisoning is a result of ingestion of Staphylococcal enterotoxins (SEs) produced by the bacterium Staphylococcus aureus. SEs cause gastroenteritis and also cause activation of T cells and massive cytokine release. A current method for the detection of active SEs relies on its eme...
USDA-ARS?s Scientific Manuscript database
Market demands for cotton varieties with improved fiber properties also call for the development of fast, reliable analytical methods for monitoring fiber development and measuring their properties. Currently, cotton breeders rely on instrumentation that can require significant amounts of sample, w...
DOT National Transportation Integrated Search
2010-01-01
Current AASHTO provisions for the conventional load rating of flat slab bridges rely on the equivalent strip method of analysis for determining live load effects; this is generally regarded as overly conservative by many professional engineers. A...
Removal of Differential Capacitive Interferences in Fast-Scan Cyclic Voltammetry.
Johnson, Justin A; Hobbs, Caddy N; Wightman, R Mark
2017-06-06
Due to its high spatiotemporal resolution, fast-scan cyclic voltammetry (FSCV) at carbon-fiber microelectrodes enables the localized in vivo monitoring of subsecond fluctuations in electroactive neurotransmitter concentrations. In practice, resolution of the analytical signal relies on digital background subtraction for removal of the large current due to charging of the electrical double layer as well as surface faradaic reactions. However, fluctuations in this background current often occur with changes in the electrode state or ionic environment, leading to nonspecific contributions to the FSCV data that confound data analysis. Here, we both explore the origin of such shifts seen with local changes in cations and develop a model to account for their shape. Further, we describe a convolution-based method for removal of the differential capacitive contributions to the FSCV current. The method relies on the use of a small-amplitude pulse made prior to the FSCV sweep that probes the impedance of the system. To predict the nonfaradaic current response to the voltammetric sweep, the step current response is differentiated to provide an estimate of the system's impulse response function and is used to convolute the applied waveform. The generated prediction is then subtracted from the observed current to the voltammetric sweep, removing artifacts associated with electrode impedance changes. The technique is demonstrated to remove selected contributions from changes in the capacitive characteristics of the electrode, both in vitro (i.e., in flow-injection analysis) and in vivo (i.e., during a spreading depression event in an anesthetized rat).
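A numerical sketch of the convolution step described above (array shapes, sampling, and scaling are assumptions; this is not the authors' code): differentiating the measured step response yields an impulse-response estimate, which is convolved with the applied sweep to predict the nonfaradaic current for subtraction.

```python
# Convolution-based prediction of the capacitive (nonfaradaic) FSCV
# current from a small pre-sweep voltage step (illustrative sketch).
import numpy as np

def predict_nonfaradaic(step_current, sweep_voltage, step_amplitude, dt):
    """step_current: current response to a step of height step_amplitude (V);
    sweep_voltage: applied voltammetric waveform; dt: sample interval (s)."""
    # For a linear system i(t) = (h * v)(t), differentiating the step
    # response in time recovers the impulse response h, up to the step height.
    h = np.gradient(step_current, dt) / step_amplitude
    # Convolving h with the applied sweep predicts the nonfaradaic current.
    return np.convolve(h, sweep_voltage)[: len(sweep_voltage)] * dt

def remove_background(measured_sweep_current, predicted):
    """Subtract the prediction, leaving the faradaic (analyte) signal."""
    return measured_sweep_current - predicted
```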
Liu, Tong; Song, Deli; Dong, Jianzeng; Zhu, Pinghui; Liu, Jie; Liu, Wei; Ma, Xiaohai; Zhao, Lei; Ling, Shukuan
2017-01-01
Myocardial fibrosis is an important part of cardiac remodeling that leads to heart failure and death. Myocardial fibrosis results from increased myofibroblast activity and excessive extracellular matrix deposition. Various cells and molecules are involved in this process, providing targets for potential drug therapies. Currently, the main detection methods of myocardial fibrosis rely on serum markers, cardiac magnetic resonance imaging, and endomyocardial biopsy. This review summarizes our current knowledge regarding the pathophysiology, quantitative assessment, and novel therapeutic strategies of myocardial fibrosis. PMID:28484397
Joint Command and Control of Cyber Operations: The Joint Force Cyber Component Command (JFCCC)
2012-05-04
relies so heavily on complex command and control systems and interconnectivity in general, cyber warfare has become a serious topic of interest at the...defensive cyber warfare into current and future operations and plans. In particular, Joint Task Force (JTF) Commanders must develop an optimum method to
Catalytic Assessment: Understanding How MCQs and EVS Can Foster Deep Learning
ERIC Educational Resources Information Center
Draper, Stephen W.
2009-01-01
One technology for education whose adoption is currently expanding rapidly in UK higher education is that of electronic voting systems (EVS). As with all educational technology, whether learning benefits are achieved depends not on the technology but on whether an improved teaching method is introduced with it. EVS inherently relies on the…
Using Oral Exams to Assess Communication Skills in Business Courses
ERIC Educational Resources Information Center
Burke-Smalley, Lisa A.
2014-01-01
Business, like many other fields in higher education, continues to rely largely on conventional testing methods for assessing student learning. In the current article, another evaluation approach--the oral exam--is examined as a means for building and evaluating the professional communication and oral dialogue skills needed and utilized by…
USDA-ARS?s Scientific Manuscript database
Tuberculosis (TB) in elephants is a re-emerging zoonotic disease caused primarily by Mycobacterium tuberculosis. Current methods for screening and diagnosis rely on trunk wash culture, which has serious limitations due to low test sensitivity, slow turn-around time, and variable sample quality. Inn...
Designing a Website for Parents
ERIC Educational Resources Information Center
Schleig, Elisa
2012-01-01
Early childhood educators are aware of the great importance of having parents involved and engaged in their children's education. Although personal contact is still the best, it is not possible to rely on only one method of communication with current and prospective families. A variety of strategies need to be considered to keep the communication…
Background: Numerous indicators have been used to assess the presence of fecal pollution, many relying on molecular methods such as qPCR. One of the targets frequently used, the human-associated Bacteroides 16S rRNA region, has several assays in current usage. These assays vary...
Update on the DNT In Vitro Alternative Methods Project at the USEPA
Current approaches to toxicity testing rely heavily on the use of animals, can cost millions of dollars and can take years to complete for a single chemical. To implement the predictive toxicity testing envisioned in the NAS report on Toxicity Testing in the 21st century, rapid a...
Using Dirichlet Processes for Modeling Heterogeneous Treatment Effects across Sites
ERIC Educational Resources Information Center
Miratrix, Luke; Feller, Avi; Pillai, Natesh; Pati, Debdeep
2016-01-01
Modeling the distribution of site level effects is an important problem, but it is also an incredibly difficult one. Current methods rely on distributional assumptions in multilevel models for estimation. There it is hoped that the partial pooling of site level estimates with overall estimates, designed to take into account individual variation as…
Jayaprakash, Paul T
2015-01-01
Establishing identification during skull-photo superimposition relies on correlating the salient morphological features of an unidentified skull with those of a face-image of a suspected dead individual using image overlay processes. Technical progression in the process of overlay has included the incorporation of video cameras, image-mixing devices, and software that enables real-time vision-mixing. Conceptual transitions have occurred in the superimposition methods that involve 'life-size' images, that achieve orientation of the skull to the posture of the face in the photograph, and that assess the extent of match. A recent report on the reliability of identification using the superimposition method adopted the currently prevalent methods and suggested an increased rate of failures when skulls were compared with related and unrelated face images. The reported reduction in the reliability of the superimposition method prompted a review of the transition in the concepts involved in skull-photo superimposition. The currently prevalent practices (visualizing the superimposed images at less than 'life-size', orienting the skull for matching by relying on cranial and facial landmarks in the frontal plane, and evaluating the match on a morphological basis by relying on mix-mode alone) are the major departures in methodology that may have reduced identification reliability. The need to reassess the reliability of the method incorporating the concepts considered appropriate by practitioners is stressed. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
Nuclear electromagnetic charge and current operators in Chiral EFT
DOE Office of Scientific and Technical Information (OSTI.GOV)
Girlanda, Luca; Marcucci, Laura Elisa; Pastore, Saori
2013-08-01
We describe our method for deriving the nuclear electromagnetic charge and current operators in chiral perturbation theory, based on time-ordered perturbation theory. We then discuss possible strategies for fixing the relevant low-energy constants, from the magnetic moments of the deuteron and of the trinucleons, and from the radiative np capture cross sections, and identify a scheme which, partly relying on Δ resonance saturation, leads to a reasonable pattern of convergence of the chiral expansion.
Determining the near-surface current profile from measurements of the wave dispersion relation
NASA Astrophysics Data System (ADS)
Smeltzer, Benjamin; Maxwell, Peter; Aesøy, Eirik; Ellingsen, Simen
2017-11-01
The current-induced Doppler shifts of waves can yield information about the background mean flow, providing an attractive method of inferring the current profile in the upper layer of the ocean. We present measurements of waves propagating on shear currents in a laboratory water channel, as well as theoretical investigations of inversion techniques for determining the vertical current structure. Spatial and temporal measurements of the free surface profile obtained using a synthetic Schlieren method are analyzed to determine the wave dispersion relation and Doppler shifts as a function of wavelength. The vertical current profile can then be inferred from the Doppler shifts using an inversion algorithm. Most existing algorithms rely on a priori assumptions of the shape of the current profile, and developing a method that uses less stringent assumptions is a focus of this study, allowing for measurement of more general current profiles. The accuracy of current inversion algorithms is evaluated by comparison with measurements of the mean flow profile from particle image velocimetry (PIV), and a discussion of the sensitivity to errors in the Doppler shifts is presented.
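For illustration, a common starting point for such inversions is the weak-shear approximation, in which the Doppler velocity measured at wavenumber k is an exponentially weighted average of the profile, c̃(k) ≈ 2k ∫_{-∞}^{0} U(z) e^{2kz} dz. The sketch below inverts this kernel with a polynomial ansatz for U(z); both the kernel and the basis are assumptions for illustration, not the study's algorithm:

```python
# Least-squares inversion of wave Doppler shifts to a polynomial current
# profile U(z) = sum_n a_n * z**n under the weighted-average approximation
#   c_tilde(k) = 2k * integral_{-inf}^{0} U(z) exp(2kz) dz.
# Illustrative sketch; not the study's inversion algorithm.
import numpy as np
from math import factorial

def design_matrix(k_values, degree):
    # For U(z) = z**n the weighted average evaluates in closed form to
    # factorial(n) * (-1)**n / (2k)**n.
    A = np.empty((len(k_values), degree + 1))
    for j, k in enumerate(k_values):
        for n in range(degree + 1):
            A[j, n] = factorial(n) * (-1.0) ** n / (2.0 * k) ** n
    return A

def invert_profile(k_values, doppler_velocities, degree=2):
    """Fit coefficients a_n from Doppler velocities at several wavenumbers."""
    A = design_matrix(np.asarray(k_values), degree)
    coeffs, *_ = np.linalg.lstsq(A, np.asarray(doppler_velocities), rcond=None)
    return coeffs  # U(z) ~ sum_n coeffs[n] * z**n
```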
NASA Technical Reports Server (NTRS)
Diak, George R.
1989-01-01
Improved techniques for the remote sensing of the land surface energy balance (SEB) and soil moisture would greatly improve prediction of climate and weather, as well as benefit agriculture, hydrology, and many associated fields. Most of the satellite remote sensing methods researched to date rely upon satellite-measured infrared surface temperatures or their time changes as a remote sensing signal. Optimistically, only four or five levels of information (wet to dry) in surface heating/evaporation are discernible by surface temperature methods, and a good understanding of atmospheric conditions is necessary to bring them to this accuracy level. Skin temperature methods were researched, and work was begun on several new methods for the remote sensing of the SEB, some elements of which are applicable to current and retrospective data sources and some of which will rely on instrumentation from the Earth Observing System (EOS) program in the 1990s.
Sina, Abu Ali Ibn; Howell, Sidney; Carrascosa, Laura G; Rauf, Sakandar; Shiddiky, Muhammad J A; Trau, Matt
2014-11-07
We report a simple electrochemical method referred to as "eMethylsorb" for the detection of DNA methylation. The method relies on the base dependent affinity interaction of DNA with gold. The methylation status of DNA is quantified by monitoring the electrochemical current as a function of the relative adsorption level of bisulphite treated DNA samples onto a bare gold electrode. This method can successfully distinguish methylated and unmethylated epigenotypes at single CpG resolution.
Gas turbine coatings eddy current quantitative and qualitative evaluation
NASA Astrophysics Data System (ADS)
Ribichini, Remo; Giolli, Carlo; Scrinzi, Erica
2017-02-01
Gas turbine blades (buckets) are among the most critical and expensive components of the engine. Buckets rely on protective coatings in order to withstand the harsh environment in which they operate. The thickness and the microstructure of coatings during the lifespan of a unit are fundamental to evaluate their fitness for service. A frequency scanning Eddy Current instrument can allow the measurement of the thickness and of physical properties of coatings in a Non-Destructive manner. The method employed relies on the acquisition of impedance spectra and on the inversion of the experimental data to derive the coating properties and structure using some assumptions. This article describes the experimental validation performed on several samples and real components in order to assess the performance of the instrument as a coating thickness gage. The application of the technique to support residual life assessment of serviced buckets is also presented.
A Study of Dim Object Detection for the Space Surveillance Telescope
2013-03-21
ENG-13-M-32 Abstract: Current methods of dim object detection for space surveillance make use of a Gaussian log-likelihood-ratio-test-based...quantitatively comparing the efficacy of two methods for dim object detection, termed in this paper the point detector and the correlator, both of which rely...applications. It is used in national defense for detecting satellites. It is used to detect space debris, which threatens both civilian and
Probability or Reasoning: Current Thinking and Realistic Strategies for Improved Medical Decisions
2017-01-01
A prescriptive model approach in decision making could help achieve better diagnostic accuracy in clinical practice through methods that are less reliant on probabilistic assessments. Various prescriptive measures aimed at regulating factors that influence heuristics and clinical reasoning could support clinical decision-making process. Clinicians could avoid time-consuming decision-making methods that require probabilistic calculations. Intuitively, they could rely on heuristics to obtain an accurate diagnosis in a given clinical setting. An extensive literature review of cognitive psychology and medical decision-making theory was performed to illustrate how heuristics could be effectively utilized in daily practice. Since physicians often rely on heuristics in realistic situations, probabilistic estimation might not be a useful tool in everyday clinical practice. Improvements in the descriptive model of decision making (heuristics) may allow for greater diagnostic accuracy. PMID:29209469
Probability or Reasoning: Current Thinking and Realistic Strategies for Improved Medical Decisions.
Nantha, Yogarabindranath Swarna
2017-11-01
A prescriptive model approach in decision making could help achieve better diagnostic accuracy in clinical practice through methods that are less reliant on probabilistic assessments. Various prescriptive measures aimed at regulating factors that influence heuristics and clinical reasoning could support clinical decision-making process. Clinicians could avoid time-consuming decision-making methods that require probabilistic calculations. Intuitively, they could rely on heuristics to obtain an accurate diagnosis in a given clinical setting. An extensive literature review of cognitive psychology and medical decision-making theory was performed to illustrate how heuristics could be effectively utilized in daily practice. Since physicians often rely on heuristics in realistic situations, probabilistic estimation might not be a useful tool in everyday clinical practice. Improvements in the descriptive model of decision making (heuristics) may allow for greater diagnostic accuracy.
ERIC Educational Resources Information Center
Rodriguez, Christina M.; Cook, Anne E.; Jedrziewski, Chezlie T.
2012-01-01
Objective: Researchers in the child maltreatment field have traditionally relied on explicit self-reports to study factors that may exacerbate physical child abuse risk. The current investigation evaluated an implicit analog task utilizing eye tracking technology to assess both parental attributions of child misbehavior and empathy. Method: Based…
A Data Augmentation Approach to Short Text Classification
ERIC Educational Resources Information Center
Rosario, Ryan Robert
2017-01-01
Text classification typically performs best with large training sets, but short texts are very common on the World Wide Web. Can we use resampling and data augmentation to construct larger texts using similar terms? Several current methods exist for working with short text that rely on using external data and contexts, or workarounds. Our focus is…
In regulating the safety of water under SDWA and the CWA, the EPA makes decisions on what chemical contaminants to regulate and at what levels. To make these decisions the EPA needs hazard identification and dose-response information. Current methods that rely on rodent models fo...
Embedded Reasoning Supporting Aerospace IVHM
2007-01-01
...method (BIT or health assessment algorithm) on which the monitoring diagnostic relies for input information... In the diagram...viewing of the current health state of all monitored subsystems, while also providing a means to probe deeper in the event anomalous operation is...seeks to integrate detection, diagnostic, and prognostic capabilities with a hierarchical diagnostic reasoning architecture into a single
Normal Theory Two-Stage ML Estimator When Data Are Missing at the Item Level
ERIC Educational Resources Information Center
Savalei, Victoria; Rhemtulla, Mijke
2017-01-01
In many modeling contexts, the variables in the model are linear composites of the raw items measured for each participant; for instance, regression and path analysis models rely on scale scores, and structural equation models often use parcels as indicators of latent constructs. Currently, no analytic estimation method exists to appropriately…
Heather E. Golden; Charles R. Lane; Devendra M. Amatya; Karl W. Bandilla; Hadas Raanan Kiperwas Kiperwas; Christopher D. Knightes; Herbert Ssegane
2014-01-01
Geographically isolated wetlands (GIW), depressional landscape features entirely surrounded by upland areas, provide a wide range of ecological functions and ecosystem services for human well-being. Current and future ecosystem management and decision-making rely on a solid scientific understanding of how hydrologic processes affect these important GIW services and...
Online Learner Satisfaction and Collaborative Learning: Evidence from Saudi Arabia
ERIC Educational Resources Information Center
Alkhalaf, Salem; Nguyen, Jeremy; Nguyen, Anne; Drew, Steve
2013-01-01
Despite the considerable potential for e-learning to improve learning outcomes, particularly for female students and students who need to rely on distance learning, feedback from current users of e-learning systems in the Kingdom of Saudi Arabia (KSA) suggests a relatively low level of satisfaction. This study adopts a mixed-methods approach in…
USDA-ARS?s Scientific Manuscript database
The RNA genome of Hop stunt viroid (HSVd) contains five to six nucleotides in a variable (V) domain, called the cachexia expression motif, which is associated with pathogenic and non-pathogenic variants in citrus. Current methods to differentiate HSVd variants rely on lengthy greenhouse biological i...
Automatic identification of bacterial types using statistical imaging methods
NASA Astrophysics Data System (ADS)
Trattner, Sigal; Greenspan, Hayit; Tepper, Gapi; Abboud, Shimon
2003-05-01
The objective of the current study is to develop an automatic tool to identify bacterial types using computer-vision and statistical modeling techniques. Bacteriophage (phage)-typing methods are used to identify and extract representative profiles of bacterial types, such as Staphylococcus aureus. Current systems rely on the subjective reading of plaque profiles by a human expert. This process is time-consuming and prone to errors, especially as technology is enabling an increase in the number of phages used for typing. The statistical methodology presented in this work provides for an automated, objective, and robust analysis of visual data, along with the ability to cope with increasing data volumes.
An Adaptive Cross-Architecture Combination Method for Graph Traversal
DOE Office of Scientific and Technical Information (OSTI.GOV)
You, Yang; Song, Shuaiwen; Kerbyson, Darren J.
2014-06-18
Breadth-First Search (BFS) is widely used in many real-world applications including computational biology, social networks, and electronic design automation. The combination method, using both top-down and bottom-up techniques, is the most effective BFS approach. However, current combination methods rely on trial-and-error and exhaustive search to locate the optimal switching point, which may cause significant runtime overhead. To solve this problem, we design an adaptive method based on regression analysis to predict an optimal switching point for the combination method at runtime within less than 0.1% of the BFS execution time.
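For orientation, here is a skeleton of the direction-optimizing (top-down/bottom-up) BFS that combination methods use; the frontier-size switch test below stands in for the trial-and-error heuristic that the paper replaces with a regression-based prediction, and the constant is illustrative:

```python
# Direction-optimizing BFS skeleton (illustrative; the switching rule and
# ALPHA are assumptions, not the paper's regression-based predictor).
ALPHA = 14.0  # assumed tuning constant for the switch heuristic

def bfs_combined(adj, source):
    """adj: adjacency lists for an undirected graph; returns parent array."""
    n = len(adj)
    parent = [-1] * n
    parent[source] = source
    frontier = [source]
    while frontier:
        frontier_edges = sum(len(adj[v]) for v in frontier)
        unvisited = [v for v in range(n) if parent[v] == -1]
        if frontier_edges > len(unvisited) / ALPHA:
            # Bottom-up: each unvisited vertex scans for a frontier parent.
            in_frontier = [False] * n
            for v in frontier:
                in_frontier[v] = True
            next_frontier = []
            for v in unvisited:
                for u in adj[v]:
                    if in_frontier[u]:
                        parent[v] = u
                        next_frontier.append(v)
                        break
        else:
            # Top-down: expand the frontier along outgoing edges.
            next_frontier = []
            for u in frontier:
                for v in adj[u]:
                    if parent[v] == -1:
                        parent[v] = u
                        next_frontier.append(v)
        frontier = next_frontier
    return parent
```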
Why learning and development can lead to poorer recognition memory.
Hayes, Brett K; Heit, Evan
2004-08-01
Current models of inductive reasoning in children and adults assume a central role for categorical knowledge. A recent paper by Sloutsky and Fisher challenges this assumption, showing that children are more likely than adults to rely on perceptual similarity as a basis for induction, and introduces a more direct method for examining the representations activated during induction. This method has the potential to constrain models of induction in novel ways, although there are still important challenges.
An Investigation of Automatic Change Detection for Topographic Map Updating
NASA Astrophysics Data System (ADS)
Duncan, P.; Smit, J.
2012-08-01
Changes to the landscape are constantly occurring, and it is essential for geospatial and mapping organisations that these changes are regularly detected and captured so that map databases can be updated to reflect the current status of the landscape. The Chief Directorate of National Geospatial Information (CD: NGI), South Africa's national mapping agency, currently relies on manual methods of detecting and capturing these changes. These manual methods are time-consuming and labour-intensive, and rely on the skills and interpretation of the operator. It is therefore necessary to move towards more automated methods in the production process at CD: NGI. The aim of this research is to investigate a methodology for automatic or semi-automatic change detection for the purpose of updating topographic databases. The method investigated detects changes through image classification as well as spatial analysis and is focused on urban landscapes. The major data inputs into this study are high-resolution aerial imagery and existing topographic vector data. Initial results indicate that traditional pixel-based image classification approaches are unsatisfactory for large-scale land-use mapping and that object-oriented approaches hold more promise. Even for object-oriented image classification, however, generalization of techniques on a broad scale has produced inconsistent results. A solution may lie in a hybrid approach of pixel-based and object-oriented techniques.
Coutinho-Abreu, Iliano V.; Zhu, Kun Yan; Ramalho-Ortigao, Marcelo
2009-01-01
Insect-borne diseases cause significant human morbidity and mortality. Current control and preventive methods against vector-borne diseases rely mainly on insecticides. The emergence of insecticide resistance in many disease vectors highlights the necessity of developing new strategies to control these insects. Vector transgenesis and paratransgenesis are novel strategies currently being developed that aim to reduce insect vectorial capacity or to eliminate transmission of pathogens such as Plasmodium sp., Trypanosoma sp., and Dengue virus. Vector transgenesis relies on direct genetic manipulation of disease vectors, making them incapable of functioning as vectors of a given pathogen. Paratransgenesis focuses on utilizing genetically modified insect symbionts to express molecules within the vector that are deleterious to the pathogens they transmit. Despite the many successes achieved in developing such techniques in the last several years, many significant barriers remain and need to be overcome before any of these approaches become a reality. Here, we highlight the current status of these strategies, pointing out advantages and constraints, and also explore issues that need to be resolved before the establishment of transgenesis and paratransgenesis as tools to prevent vector-borne diseases. PMID:19819346
Progress in the molecular diagnosis of Lyme disease.
Ružić-Sabljić, Eva; Cerar, Tjaša
2017-01-01
Current laboratory testing of Lyme borreliosis mostly relies on serological methods with known limitations. Diagnostic modalities enabling direct detection of the pathogen at the onset of clinical signs could overcome some of these limitations. Molecular methods detecting borrelial DNA seem to be the ideal solution, although there are some aspects that need to be considered. Areas covered: This review represents a summary and discussion of published data obtained from literature searches of PubMed and the National Library of Medicine (USA), together with our own experience of the molecular diagnosis of Lyme disease. Expert commentary: Molecular methods are promising and currently serve as supporting diagnostic tests in Lyme borreliosis. Since the field of molecular diagnostics is under rapid development, molecular testing could become an important diagnostic modality.
First demonstration of HF-driven ionospheric currents
NASA Astrophysics Data System (ADS)
Papadopoulos, K.; Chang, C.-L.; Labenski, J.; Wallace, T.
2011-10-01
The first experimental demonstration of HF-driven currents in the ionosphere at low ELF/ULF frequencies without relying on the presence of electrojets is presented. The effect was predicted by theoretical/computational means in a recent letter and given the name Ionospheric Current Drive (ICD). The effect relies on modulated F-region HF heating to generate Magneto-Sonic (MS) waves that drive Hall currents when they reach the E-region. The Hall currents inject ELF waves into the Earth-ionosphere waveguide and helicon and Shear Alfven (SA) waves into the magnetosphere. The proof-of-concept experiments were conducted using the HAARP heater in Alaska under the BRIOCHE program. Waves between 0.1-70 Hz were measured at both near and far sites. The letter discusses the differences between ICD-generated waves and those relying on modulation of electrojets.
Computing singularities of perturbation series
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kvaal, Simen; Jarlebring, Elias; Michiels, Wim
2011-03-15
Many properties of current ab initio approaches to the quantum many-body problem, both perturbational and otherwise, are related to the singularity structure of the Rayleigh-Schroedinger perturbation series. A numerical procedure is presented that in principle computes the complete set of singularities, including the dominant singularity which limits the radius of convergence. The method approximates the singularities as eigenvalues of a certain generalized eigenvalue equation which is solved using iterative techniques. It relies on computation of the action of the Hamiltonian matrix on a vector and does not rely on the terms in the perturbation series. The method can be useful for studying perturbation series of typical systems of moderate size, for fundamental development of resummation schemes, and for understanding the structure of singularities for typical systems. Some illustrative model problems are studied, including a helium-like model with δ-function interactions for which Moeller-Plesset perturbation theory is considered and the radius of convergence found.
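In outline, the singularity structure in question is that of a linearly parametrized Hamiltonian; the conditions below are the standard characterization of the branch points (stated for orientation, not quoted from the paper):

```latex
% For H(\lambda) = H_0 + \lambda V, the energy E(\lambda) is analytic in
% \lambda except at branch points \lambda_c where two eigenvalues coalesce:
\begin{align}
  \det\bigl(H_0 + \lambda V - E\bigr) &= 0, &
  \frac{\partial}{\partial E}\det\bigl(H_0 + \lambda V - E\bigr) &= 0 ,
\end{align}
% a pair of conditions in the two unknowns (\lambda, E) that can be cast
% as a generalized eigenvalue problem. The radius of convergence of the
% Rayleigh-Schroedinger series is |\lambda_c| for the singularity closest
% to the origin (the dominant singularity).
```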
Finite-difference computations of rotor loads
NASA Technical Reports Server (NTRS)
Caradonna, F. X.; Tung, C.
1985-01-01
This paper demonstrates the current and future potential of finite-difference methods for solving real rotor problems which now rely largely on empiricism. The demonstration consists of a simple means of combining existing finite-difference, integral, and comprehensive loads codes to predict real transonic rotor flows. These computations are performed for hover and high-advance-ratio flight. Comparisons are made with experimental pressure data.
Finite-difference computations of rotor loads
NASA Technical Reports Server (NTRS)
Caradonna, F. X.; Tung, C.
1985-01-01
The current and future potential of finite-difference methods for solving real rotor problems which now rely largely on empiricism are demonstrated. The demonstration consists of a simple means of combining existing finite-difference, integral, and comprehensive loads codes to predict real transonic rotor flows. These computations are performed for hover and high-advance-ratio flight. Comparisons are made with experimental pressure data.
ERIC Educational Resources Information Center
Leming, Katie P.
2016-01-01
Previous qualitative research on educational practices designed to improve critical thinking has relied on anecdotal or student self-reports of gains in critical thinking. Unfortunately, student self-report data have been found to be unreliable proxies for measuring critical thinking gains. Therefore, in the current interpretivist study, five…
Mechanical grading of round timber beams
David W. Green; Thomas M. Gorman; James W. Evans; Joseph F. Murphy
2006-01-01
Current procedures used to sort round timber beams into structural grades rely on visual grading methods and property assignments based on modification of clear wood properties. This study provides the technical basis for mechanical grading of 228 mm (9 in.) diameter round timbers. Test results on 225 round Engelmann spruce–alpine fir–lodgepole pine beams demonstrate...
Applications of fuzzy ranking methods to risk-management decisions
NASA Astrophysics Data System (ADS)
Mitchell, Harold A.; Carter, James C., III
1993-12-01
The Department of Energy is making significant improvements to its nuclear facilities as a result of more stringent regulation, internal audits, and recommendations from external review groups. A large backlog of upgrades has resulted. Currently, a prioritization method is being utilized which relies on a matrix of potential consequence and probability of occurrence. The attributes of the potential consequences considered include likelihood, exposure, public health and safety, environmental impact, site personnel safety, public relations, legal liability, and business loss. This paper describes an improved method which utilizes fuzzy multiple attribute decision methods to rank proposed improvement projects.
New Variable Porosity Flow Diverter (VPOD) Stent Design for Treatment of Cerebrovascular Aneurysms
Ionita, Ciprian; Baier, Robert; Rudin, Stephen
2012-01-01
Using flow-diverting stents for intracranial aneurysm repair has been an area of recent active research. While current commercial flow-diverting stents rely on a dense mesh of braided coils for flow diversion, our group has been developing a method to selectively occlude the aneurysm neck without endangering nearby perforator vessels. In this paper, we present a new method of fabricating the low-porosity patch, a key element of such asymmetric vascular stents (AVS). PMID:22254507
Pineda, Angel R; Barrett, Harrison H
2004-02-01
The current paradigm for evaluating detectors in digital radiography relies on Fourier methods. Fourier methods rely on a shift-invariant and statistically stationary description of the imaging system. The theoretical justification for the use of Fourier methods is based on a uniform background fluence and an infinite detector. In practice, the background fluence is not uniform and detector size is finite. We study the effect of stochastic blurring and structured backgrounds on the correlation between Fourier-based figures of merit and Hotelling detectability. A stochastic model of the blurring leads to behavior similar to what is observed by adding electronic noise to the deterministic blurring model. Background structure does away with the shift invariance. Anatomical variation makes the covariance matrix of the data less amenable to Fourier methods by introducing long-range correlations. It is desirable to have figures of merit that can account for all the sources of variation, some of which are not stationary. For such cases, we show that the commonly used figures of merit based on the discrete Fourier transform can provide an inaccurate estimate of Hotelling detectability.
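For reference, the two figures of merit being contrasted are standard (textbook definitions, not specific to this paper); they coincide only when the data covariance is stationary and hence diagonalized by the DFT:

```latex
% Hotelling detectability for a signal \Delta s in noise with covariance K:
\begin{equation}
  \mathrm{SNR}^2_{\mathrm{Hot}} = \Delta s^{\mathsf T} K^{-1}\, \Delta s .
\end{equation}
% Under shift invariance and stationarity, K is diagonalized by the DFT
% and the same quantity reduces to the Fourier-domain form
\begin{equation}
  \mathrm{SNR}^2_{\mathrm{Fourier}} = \sum_{k} \frac{|\Delta S(k)|^2}{N(k)} ,
\end{equation}
% with \Delta S(k) the DFT of the signal and N(k) the noise power spectrum.
% Anatomical variability introduces long-range correlations that break the
% diagonalization, which is the failure mode studied here.
```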
Passive runaway electron suppression in tokamak disruptions
DOE Office of Scientific and Technical Information (OSTI.GOV)
Smith, H. M.; Helander, P.; Boozer, A. H.
2013-07-15
Runaway electrons created in disruptions pose a serious problem for tokamaks with large current. It would be desirable to have a runaway electron suppression method which is passive, i.e., a method that does not rely on an uncertain disruption prediction system. One option is to let the large electric field inherent in the disruption drive helical currents in the wall. This would create ergodic regions in the plasma and increase the runaway losses. Whether these regions appear at a suitable time and place to affect the formation of the runaway beam depends on disruption parameters, such as electron temperature and density. We find that it is difficult to ergodize the central plasma before a beam of runaway current has formed. However, the ergodic outer region will make the Ohmic current profile contract, which can lead to instabilities that yield large runaway electron losses.
NASA Astrophysics Data System (ADS)
Utama, M. Iqbal Bakti; Lu, Xin; Zhan, Da; Ha, Son Tung; Yuan, Yanwen; Shen, Zexiang; Xiong, Qihua
2014-10-01
Patterning two-dimensional materials into specific spatial arrangements and geometries is essential for both fundamental studies of materials and practical applications in electronics. However, the currently available patterning methods generally require etching steps that rely on complicated and expensive procedures. We report here a facile patterning method for atomically thin MoSe2 films using stripping with an SU-8 negative resist layer exposed to electron beam lithography. Additional steps of chemical and physical etching were not necessary in this SU-8 patterning method. The SU-8 patterning was used to define a ribbon channel from a field effect transistor of MoSe2 film, which was grown by chemical vapor deposition. The narrowing of the conduction channel area with SU-8 patterning was crucial in suppressing the leakage current within the device, thereby allowing a more accurate interpretation of the electrical characterization results from the sample. An electrical transport study, enabled by the SU-8 patterning, showed a variable range hopping behavior at high temperatures. Electronic supplementary information (ESI) available: Further experiments on patterning and additional electrical characterization data. See DOI: 10.1039/c4nr03817g
P300 brain computer interface: current challenges and emerging trends
Fazel-Rezai, Reza; Allison, Brendan Z.; Guger, Christoph; Sellers, Eric W.; Kleih, Sonja C.; Kübler, Andrea
2012-01-01
A brain-computer interface (BCI) enables communication without movement based on brain signals measured with electroencephalography (EEG). BCIs usually rely on one of three types of signals: the P300 and other components of the event-related potential (ERP), steady state visual evoked potential (SSVEP), or event related desynchronization (ERD). Although P300 BCIs were introduced over twenty years ago, the past few years have seen a strong increase in P300 BCI research. This closed-loop BCI approach relies on the P300 and other components of the ERP, based on an oddball paradigm presented to the subject. In this paper, we overview the current status of P300 BCI technology, and then discuss new directions: paradigms for eliciting P300s; signal processing methods; applications; and hybrid BCIs. We conclude that P300 BCIs are quite promising, as several emerging directions have not yet been fully explored and could lead to improvements in bit rate, reliability, usability, and flexibility. PMID:22822397
Calculation of AC loss in two-layer superconducting cable with equal currents in the layers
NASA Astrophysics Data System (ADS)
Erdogan, Muzaffer
2016-12-01
A new method for calculating the AC loss of two-layer superconducting power-transmission cables with the commercial software Comsol Multiphysics is proposed, relying on the assumption that the current partitions equally between the layers. Applying the method to a cable composed of two coaxial cylindrical superconducting tubes, the results are in good agreement with the analytical ones of the duoblock model. The method is then applied to a cable composed of a cylindrical copper former surrounded by two coaxial cylindrical layers of superconducting tapes embedded in an insulating medium, in tape-on-tape and tape-on-gap configurations, and the two configurations are compared. Good agreement between the duoblock model and the numerical results is observed for the tape-on-gap cable.
Endophytic Phytoaugmentation: Treating Wastewater and Runoff Through Augmented Phytoremediation
Redfern, Lauren K.
2016-01-01
Limited options exist for efficiently and effectively treating water runoff from agricultural fields and landfills. Traditional treatments include excavation, transport to landfills, incineration, stabilization, and vitrification. In general, treatment options relying on biological methods such as bioremediation can be applied in situ and offer a sustainable remedial option with a lower environmental impact and reduced long-term operating expenses. These methods are generally considered ecologically friendly, particularly when compared to traditional physicochemical cleanup options. Phytoremediation, which relies on plants to take up and/or transform the contaminant of interest, is another alternative treatment method. However, phytoremediation is not widely used, largely due to its low treatment efficiency. Endophytic phytoaugmentation is a variation on phytoremediation that augments the phytoremediating plants with exogenous strains to stimulate the associated plant-microbe interactions and thereby improve remediation efficiency. In this review, we summarize the current knowledge and recent developments in endophytic phytoaugmentation and present potential future applications for this technology. Only a limited number of endophytic phytoaugmentation case studies have been published, and much remains to be done to transition lab-scale results to field applications. Future research needs include large-scale endophytic phytoaugmentation experiments as well as the development of more exhaustive tools for monitoring plant-microbe-pollutant interactions. PMID:27158249
ASTER preflight and inflight calibration and the validation of level 2 products
Thome, K.; Aral, K.; Hook, S.; Kieffer, H.; Lang, H.; Matsunaga, T.; Ono, A.; Palluconi, F. D.; Sakuma, H.; Slater, P.; Takashima, T.; Tonooka, H.; Tsuchida, S.; Welch, R.M.; Zalewski, E.
1998-01-01
This paper describes the preflight and inflight calibration approaches used for the Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER). The system is a multispectral, high-spatial-resolution sensor on the Earth Observing System (EOS) AM-1 platform. Preflight calibration of ASTER uses well-characterized sources to provide calibration, and preflight round-robin exercises to understand biases between the calibration sources of ASTER and other EOS sensors. These round-robins rely on well-characterized, ultra-stable radiometers. An experiment held in Yokohama, Japan, showed that the output from the source used for the visible and near-infrared (VNIR) subsystem of ASTER may be underestimated by 1.5%, but this is still within the 4% specification for the absolute radiometric calibration of these bands. Inflight calibration will rely on vicarious techniques and onboard blackbodies and lamps. Vicarious techniques include ground-reference methods using desert and water sites. A recent joint field campaign gives confidence that these methods currently provide absolute calibration to better than 5%, and indications are that uncertainties less than the required 4% should be achievable at launch. The EOS AM-1 platform will also provide a spacecraft maneuver that will allow ASTER to see the moon, allowing further characterization of the sensor. A method for combining the results of these independent calibrations is presented. The paper also describes the plans for validating the Level 2 data products from ASTER. These plans rely heavily upon field campaigns using methods similar to those used for the ground-reference vicarious calibration methods. © 1998 IEEE.
Robust image matching via ORB feature and VFC for mismatch removal
NASA Astrophysics Data System (ADS)
Ma, Tao; Fu, Wenxing; Fang, Bin; Hu, Fangyu; Quan, Siwen; Ma, Jie
2018-03-01
Image matching underlies many image processing and computer vision problems, such as object recognition and structure from motion. Current methods rely on good feature descriptors and mismatch-removal strategies for detection and matching. In this paper, we propose a robust image-matching approach based on the ORB feature and VFC for mismatch removal. ORB (Oriented FAST and Rotated BRIEF) is an outstanding feature; it performs comparably to SIFT at lower computational cost. VFC (Vector Field Consensus) is a state-of-the-art mismatch-removal method. The experimental results demonstrate that our method is efficient and robust.
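As a concrete illustration of the detect-and-match stage, here is a short OpenCV sketch; VFC is not part of OpenCV, so a RANSAC homography filter stands in for the mismatch-removal step, and the image file names are placeholders:

```python
# Sketch of ORB detection and matching (OpenCV). VFC is not available in OpenCV,
# so RANSAC-based homography filtering stands in for the mismatch-removal stage.
import cv2
import numpy as np

img1 = cv2.imread("scene1.png", cv2.IMREAD_GRAYSCALE)  # hypothetical inputs
img2 = cv2.imread("scene2.png", cv2.IMREAD_GRAYSCALE)

orb = cv2.ORB_create(nfeatures=1000)
kp1, des1 = orb.detectAndCompute(img1, None)
kp2, des2 = orb.detectAndCompute(img2, None)

# Hamming distance suits ORB's binary descriptors; cross-check prunes weak matches.
matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
matches = sorted(matcher.match(des1, des2), key=lambda m: m.distance)

# Stand-in for VFC: keep only matches consistent with a single homography.
src = np.float32([kp1[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
dst = np.float32([kp2[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)
H, mask = cv2.findHomography(src, dst, cv2.RANSAC, ransacReprojThreshold=3.0)
inliers = [m for m, keep in zip(matches, mask.ravel()) if keep]
print(f"{len(inliers)} / {len(matches)} matches kept")
```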
Semi-Supervised Recurrent Neural Network for Adverse Drug Reaction mention extraction.
Gupta, Shashank; Pawar, Sachin; Ramrakhiyani, Nitin; Palshikar, Girish Keshav; Varma, Vasudeva
2018-06-13
Social media is a useful platform to share health-related information due to its vast reach. This makes it a good candidate for public-health monitoring tasks, specifically for pharmacovigilance. We study the problem of extracting Adverse-Drug-Reaction (ADR) mentions from social media, particularly from Twitter. Medical information extraction from social media is challenging, mainly due to the short and highly informal nature of the text compared to more technical and formal medical reports. Current methods for ADR mention extraction rely on supervised learning, which suffers from the scarcity of labeled data. The state-of-the-art method uses deep neural networks, specifically a class of Recurrent Neural Network (RNN) called the Long Short-Term Memory network (LSTM). Deep neural networks, due to their large number of free parameters, rely heavily on large annotated corpora for learning the end task. In the real world, however, it is hard to obtain large labeled datasets, mainly due to the heavy cost of manual annotation. To this end, we propose a novel semi-supervised RNN model that can also leverage the unlabeled data present in abundance on social media. Through experiments we demonstrate the effectiveness of our method, achieving state-of-the-art performance in ADR mention extraction. In this study, we tackle the problem of labeled-data scarcity for ADR mention extraction from social media and propose a novel semi-supervised learning method that can leverage the large unlabeled corpora available in abundance on the web. Through an empirical study, we demonstrate that our proposed method outperforms a fully supervised baseline that relies on a large manually annotated corpus for good performance.
Piot, P.; Behrens, C.; Gerth, C.; ...
2011-09-07
We report on the successful experimental generation of electron bunches with ramped current profiles. The technique relies on impressing nonlinear correlations in the longitudinal phase space using a superconducting radiofrequency linear accelerator operating at two frequencies and a current-enhancing dispersive section. The produced ~700-MeV bunches have peak currents of the order of a kilo-Ampere. Data taken for various accelerator settings demonstrate the versatility of the method and in particular its ability to produce current profiles that have a quasi-linear dependency on the longitudinal (temporal) coordinate. The measured bunch parameters are shown, via numerical simulations, to produce gigavolt-per-meter peak accelerating electric fields with transformer ratios larger than 2 in dielectric-lined waveguides.
Piot, P; Behrens, C; Gerth, C; Dohlus, M; Lemery, F; Mihalcea, D; Stoltz, P; Vogt, M
2012-01-20
We report on the successful experimental generation of electron bunches with ramped current profiles. The technique relies on impressing nonlinear correlations in the longitudinal phase space using a superconducting radio-frequency linear accelerator operating at two frequencies and a current-enhancing dispersive section. The produced ~700-MeV bunches have peak currents of the order of a kilo-Ampère. Data taken for various accelerator settings demonstrate the versatility of the method and, in particular, its ability to produce current profiles that have a quasilinear dependency on the longitudinal (temporal) coordinate. The measured bunch parameters are shown, via numerical simulations, to produce gigavolt-per-meter peak accelerating electric fields with transformer ratios larger than 2 in dielectric-lined waveguides. © 2012 American Physical Society
Huffman, D.D.; Hughes, R.C.; Kelsey, C.A.; Lane, R.; Ricco, A.J.; Snelling, J.B.; Zipperian, T.E.
1986-08-29
Methods of and apparatus for in vivo radiation measurements rely on a MOSFET dosimeter of high radiation sensitivity which operates in both the passive mode to provide an integrated dose detector and active mode to provide an irradiation rate detector. A compensating circuit with a matched unirradiated MOSFET is provided to operate at a current designed to eliminate temperature dependence of the device. Preferably, the MOSFET is rigidly mounted in the end of a miniature catheter and the catheter is implanted in the patient proximate the radiation source.
A Fully Automated Stage for Optical Waveguide Measurements
1993-09-01
method, as in the case of the out-of-plane method, also relies on a certain level of uniformity in the waveguide. Accurate loss measurements over a... The S1227-66BQ has a response from 190 nm to 1000 nm with a peak at 720 nm and a typical radiant sensitivity of 0.35 A/W at the peak wavelength... levels. The current generated in the detector due to incident light is converted to a voltage at the output of the operational amplifier (op-amp)...
Simultaneous confidence sets for several effective doses.
Tompsett, Daniel M; Biedermann, Stefanie; Liu, Wei
2018-04-03
Construction of simultaneous confidence sets for several effective doses currently relies on inverting the Scheffé-type simultaneous confidence band, which is known to be conservative. We develop novel methodology to bring the simultaneous coverage closer to its nominal level, for both two-sided and one-sided simultaneous confidence sets. Our approach is shown to be considerably less conservative than the current method, and is illustrated with an example on modeling the effect of smoking status and serum triglyceride level on the probability of recurrence of a myocardial infarction. © 2018 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
How to know and choose online games: differences between current and potential players.
Teng, Ching-I; Lo, Shao-Kang; Wang, Pe-Cheng
2007-12-01
This study investigated how different adolescent players acquire game information and the criteria they use in choosing online games and found that (1) current players generally use comprehensive information sources more than potential players do; (2) current players rely on free trials and smooth display of motion graphics as choice criteria more than potential players do; (3) potential players rely on the look of advertisements more than current players do; (4) both current and potential players most likely use word-of-mouth and gaming programs on TV as information sources; and (5) endorser attractiveness is ranked the least important among six choice criteria by both current and potential players.
Thermometer use among Mexican immigrant mothers in California.
Schwartz, N; Guendelman, S; English, P
1997-11-01
A community-based household survey was utilized to assess the relationship between thermometer use, home treatment and utilization of health care services. Using a cross-sectional design, the study surveyed 688 low income Mexican origin mothers of children between the ages of 8 and 16 months in San Diego County. Mothers were asked how they determine that their child has fever and how often they use a thermometer. Nearly 40% of low income Mexican mothers interviewed in San Diego county never used a thermometer for determining childhood fever. Approximately two-thirds (64.7%) relied either primarily or exclusively on embodied methods such as visual observation or touch to determine fever in their child. A multivariate logistic regression analysis determined that low education and a separated or divorced marital status decreased the odds of thermometer use, whereas regular contact with the health care system doubled the likelihood of thermometer use. Mothers who relied on embodied methods were more likely to use over-the-counter medications than those who relied on thermometers; however, no significant differences were found between groups using other methods of home treatment. Fever determination modalities can be used to screen for lack of access to care and to provide for other health care needs in a culturally appropriate manner. While clinicians' expectations may include parental experience with temperature taking, current pediatric literature questions the need for home-based thermometer use. Possible alternatives to the traditional rectal thermometer might include digital thermometers and color coded thermometer strips.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Faulconer, D.W
2004-03-15
Certain devices aimed at magnetic confinement of thermonuclear plasma rely on the steady flow of an electric current in the plasma. In view of the dominant place it occupies in both the world magnetic-confinement fusion effort and the author's own activity, the tokamak toroidal configuration is selected as the prototype for discussing the question of how such a current can be maintained. Tokamaks require a stationary toroidal plasma current, traditionally provided by a pulsed magnetic induction which drives the plasma ring as the secondary of a transformer. Since this mechanism is essentially transient, and steady-state fusion reactor operation has manifold advantages, significant effort is now devoted to developing alternate steady-state means of generating toroidal current. These methods are classed under the global heading of 'noninductive current drive' or simply 'current drive', generally, though not exclusively, employing the injection of waves and/or toroidally directed particle beams. In what follows we highlight the physical mechanisms underlying surprisingly various approaches to driving current in a tokamak, downplaying a number of practical and technical issues. When a significant database exists for a given method, its experimental current drive efficiency and future prospects are detailed.
Current trends in endotoxin detection and analysis of endotoxin-protein interactions.
Dullah, Elvina Clarie; Ongkudon, Clarence M
2017-03-01
Endotoxin is a type of pyrogen found in Gram-negative bacteria. Endotoxin can form stable interactions with other biomolecules, making its removal difficult, especially during the production of biopharmaceutical drugs. Preventing endotoxins from contaminating biopharmaceutical products is paramount, as endotoxin contamination, even in small quantities, can result in fever, inflammation, sepsis, tissue damage, and even death. Highly sensitive and accurate detection of endotoxins is key to the development of biopharmaceutical products derived from Gram-negative bacteria. It facilitates the study of the intermolecular interactions of an endotoxin with other biomolecules, and hence the selection of appropriate endotoxin removal strategies. Currently, most researchers rely on the conventional LAL-based endotoxin detection method. However, new methods have been and are being developed to overcome the problems associated with the LAL-based method. This review paper highlights current research trends in endotoxin detection, from conventional methods to newly developed biosensors. Additionally, it provides an overview of the use of electron microscopy, dynamic light scattering (DLS), fluorescence resonance energy transfer (FRET) and docking programs in endotoxin-protein analysis.
Hur, M. S.; Ersfeld, B.; Noble, A.; Suk, H.; Jaroszynski, D. A.
2017-01-01
Ultra-intense, narrow-bandwidth, electromagnetic pulses have become important tools for exploring the characteristics of matter. Modern tuneable high-power light sources, such as free-electron lasers and vacuum tubes, rely on bunching of relativistic or near-relativistic electrons in vacuum. Here we present a fundamentally different method for producing narrow-bandwidth radiation from a broad spectral bandwidth current source, which takes advantage of the inflated radiation impedance close to cut-off in a medium with a plasma-like permittivity. We find that by embedding a current source in this cut-off region, more than an order of magnitude enhancement of the radiation intensity is obtained compared with emission directly into free space. The method suggests a simple and general way to flexibly use broadband current sources to produce broad or narrow bandwidth pulses. As an example, we demonstrate, using particle-in-cell simulations, enhanced monochromatic emission of terahertz radiation using a two-colour pumped current source enclosed by a tapered waveguide. PMID:28071681
NASA Astrophysics Data System (ADS)
Hur, M. S.; Ersfeld, B.; Noble, A.; Suk, H.; Jaroszynski, D. A.
2017-01-01
Ultra-intense, narrow-bandwidth, electromagnetic pulses have become important tools for exploring the characteristics of matter. Modern tuneable high-power light sources, such as free-electron lasers and vacuum tubes, rely on bunching of relativistic or near-relativistic electrons in vacuum. Here we present a fundamentally different method for producing narrow-bandwidth radiation from a broad spectral bandwidth current source, which takes advantage of the inflated radiation impedance close to cut-off in a medium with a plasma-like permittivity. We find that by embedding a current source in this cut-off region, more than an order of magnitude enhancement of the radiation intensity is obtained compared with emission directly into free space. The method suggests a simple and general way to flexibly use broadband current sources to produce broad or narrow bandwidth pulses. As an example, we demonstrate, using particle-in-cell simulations, enhanced monochromatic emission of terahertz radiation using a two-colour pumped current source enclosed by a tapered waveguide.
A Simple Method to Simultaneously Detect and Identify Spikes from Raw Extracellular Recordings.
Petrantonakis, Panagiotis C; Poirazi, Panayiota
2015-01-01
The ability to track, efficiently and reliably, when and which neurons fire in the vicinity of an electrode could revolutionize the neuroscience field. The current bottleneck lies in spike sorting algorithms; existing methods for detecting and discriminating the activity of multiple neurons rely on inefficient, multi-step processing of extracellular recordings. In this work, we show that a single-step processing of raw (unfiltered) extracellular signals is sufficient for both the detection and identification of active neurons, thus greatly simplifying and optimizing the spike sorting approach. The efficiency and reliability of our method is demonstrated in both real and simulated data.
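For a sense of what the conventional first step in such pipelines looks like, here is an illustrative amplitude-threshold spike detector on a raw trace (a classic ingredient of multi-step spike sorting, not the authors' single-step method); the threshold factor and refractory period are common but assumed choices:

```python
# Illustrative amplitude-threshold spike detection on a raw extracellular trace.
import numpy as np

def detect_spikes(signal, fs, k=4.5, refractory_ms=1.0):
    """Return sample indices of threshold crossings on a raw trace.

    The threshold is k times the robust noise estimate sigma = median(|x|)/0.6745,
    a common choice in spike sorting; refractory_ms suppresses double counts.
    """
    sigma = np.median(np.abs(signal)) / 0.6745
    thresh = k * sigma
    above = np.flatnonzero(np.abs(signal) > thresh)
    spikes, last = [], -np.inf
    min_gap = int(refractory_ms * 1e-3 * fs)
    for idx in above:
        if idx - last >= min_gap:
            spikes.append(idx)
            last = idx
    return np.array(spikes)

# Toy usage: one second of noise plus three injected spikes at 24 kHz.
fs = 24000
x = np.random.default_rng(0).normal(0, 1, fs)
x[[5000, 12000, 18000]] += 15.0
print(detect_spikes(x, fs))
```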
Evaluation of Methods for In-Situ Calibration of Field-Deployable Microphone Phased Arrays
NASA Technical Reports Server (NTRS)
Humphreys, William M.; Lockard, David P.; Khorrami, Mehdi R.; Culliton, William G.; McSwain, Robert G.
2017-01-01
Current field-deployable microphone phased arrays for aeroacoustic flight testing require the placement of hundreds of individual sensors over a large area. Depending on the duration of the test campaign, the microphones may be required to stay deployed at the testing site for weeks or even months. This presents a challenge with regard to tracking the response (i.e., sensitivity) of the individual sensors as a function of time in order to evaluate the health of the array. To address this challenge, two different methods for in-situ tracking of microphone responses are described. The first relies on the use of an aerial sound source attached as a payload on a hovering small Unmanned Aerial System (sUAS) vehicle. The second relies on the use of individually excited ground-based sound sources strategically placed throughout the array pattern. Testing of the two methods was performed in microphone array deployments conducted at Fort A.P. Hill in 2015 and at Edwards Air Force Base in 2016. The results indicate that the drift in individual sensor responses can be tracked reasonably well using both methods. Thus, in-situ response tracking methods are useful as a diagnostic tool for monitoring the health of a phased array during long-duration deployments.
Comparison of Spatiotemporal Mapping Techniques for Enormous Etl and Exploitation Patterns
NASA Astrophysics Data System (ADS)
Deiotte, R.; La Valley, R.
2017-10-01
The need to extract, transform, and exploit enormous volumes of spatiotemporal data has exploded with the rise of social media, advanced military sensors, wearables, automotive tracking, etc. However, current methods of spatiotemporal encoding and exploitation simultaneously limit the use of that information and increase computing complexity. Current spatiotemporal encoding methods from Niemeyer and Usher rely on a Z-order space-filling curve, a relative of Peano's 1890 space-filling curve, for spatial hashing, and interleave temporal hashes to generate a spatiotemporal encoding. However, other space-filling curves exist that provide different manifold coverings; these could enable better hashing techniques for spatial data and have the potential to map spatiotemporal data without interleaving. The concatenation of Niemeyer's and Usher's techniques provides a highly efficient space-time index, but other methods have advantages and disadvantages regarding computational cost, efficiency, and utility. This paper explores several such methods using data sets ranging from 1K to 10M observations and provides a comparison of the methods.
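A minimal sketch of the Z-order (Morton) spatial hashing such encodings build on, with an assumed 16-bit quantization per axis; interleaving a temporal hash would follow the same bit-interleaving pattern:

```python
# Minimal Z-order (Morton) encoding: interleave the bits of quantized latitude
# and longitude into one index. The bit depth is an assumption for illustration.

def interleave_bits(x: int, y: int, bits: int = 16) -> int:
    """Interleave bits of x and y (x in even positions) into one Morton code."""
    code = 0
    for i in range(bits):
        code |= ((x >> i) & 1) << (2 * i)
        code |= ((y >> i) & 1) << (2 * i + 1)
    return code

def quantize(value: float, lo: float, hi: float, bits: int = 16) -> int:
    """Map value in [lo, hi) to an integer grid cell index among 2**bits cells."""
    return min(int((value - lo) / (hi - lo) * (1 << bits)), (1 << bits) - 1)

lat, lon = 38.8895, -77.0353  # hypothetical observation
code = interleave_bits(quantize(lon, -180, 180), quantize(lat, -90, 90))
print(f"z-order cell: {code:08x}")
```

Nearby points in space land in numerically nearby cells most of the time, which is what makes the curve useful as a one-dimensional index.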
Vázquez-Rowe, Ian; Iribarren, Diego
2015-01-01
Life-cycle (LC) approaches play a significant role in energy policy making to determine the environmental impacts associated with the choice of energy source. Data envelopment analysis (DEA) can be combined with LC approaches to provide quantitative benchmarks that orientate the performance of energy systems towards environmental sustainability, with different implications depending on the selected LC + DEA method. The present paper examines currently available LC + DEA methods and develops a novel method combining carbon footprinting (CFP) and DEA. Thus, the CFP + DEA method is proposed, a five-step structure including data collection for multiple homogenous entities, calculation of target operating points, evaluation of current and target carbon footprints, and result interpretation. As the current context for energy policy implies an anthropocentric perspective with focus on the global warming impact of energy systems, the CFP + DEA method is foreseen to be the most consistent LC + DEA approach to provide benchmarks for energy policy making. The fact that this method relies on the definition of operating points with optimised resource intensity helps to moderate the concerns about the omission of other environmental impacts. Moreover, the CFP + DEA method benefits from CFP specifications in terms of flexibility, understanding, and reporting.
Vázquez-Rowe, Ian
2015-01-01
Life-cycle (LC) approaches play a significant role in energy policy making to determine the environmental impacts associated with the choice of energy source. Data envelopment analysis (DEA) can be combined with LC approaches to provide quantitative benchmarks that orientate the performance of energy systems towards environmental sustainability, with different implications depending on the selected LC + DEA method. The present paper examines currently available LC + DEA methods and develops a novel method combining carbon footprinting (CFP) and DEA. Thus, the CFP + DEA method is proposed, a five-step structure including data collection for multiple homogenous entities, calculation of target operating points, evaluation of current and target carbon footprints, and result interpretation. As the current context for energy policy implies an anthropocentric perspective with focus on the global warming impact of energy systems, the CFP + DEA method is foreseen to be the most consistent LC + DEA approach to provide benchmarks for energy policy making. The fact that this method relies on the definition of operating points with optimised resource intensity helps to moderate the concerns about the omission of other environmental impacts. Moreover, the CFP + DEA method benefits from CFP specifications in terms of flexibility, understanding, and reporting. PMID:25654136
Casper, T. A.; Meyer, W. H.; Jackson, G. L.; ...
2010-12-08
We are exploring characteristics of ITER startup scenarios in similarity experiments conducted on the DIII-D tokamak. In these experiments, we have validated scenarios for the ITER current ramp up to full current and developed methods to control the plasma parameters to achieve stability. Predictive simulations of ITER startup using 2D free-boundary equilibrium and 1D transport codes rely on accurate estimates of the electron and ion temperature profiles that determine the electrical conductivity and pressure profiles during the current rise. Here we present results of validation studies that apply the transport model used by the ITER team to DIII-D discharge evolution, and comparisons with data from our similarity experiments.
Global disaster satellite communications system for disaster assessment and relief coordination
NASA Technical Reports Server (NTRS)
Leroy, B. E.
1979-01-01
The global communication requirements for disaster assistance are analyzed, and operationally feasible satellite system concepts and the associated system parameters are examined. Some potential problems with the current method of providing disaster assistance are described, along with a scenario for disaster assistance relying on satellite communications. Historical statistics are used with the scenario to assess service requirements. Both present and planned commercially available systems are considered. The associated yearly service costs of a global disaster satellite communication system are estimated.
A simple transformation independent method for outlier definition.
Johansen, Martin Berg; Christensen, Peter Astrup
2018-04-10
Definition and elimination of outliers is a key element for medical laboratories establishing or verifying reference intervals (RIs), especially as the inclusion of just a few outlying observations may seriously affect the determination of the reference limits. Many methods have been developed for the definition of outliers. Several of these methods assume a normal distribution, and the data often require transformation before outlier elimination. We have developed a non-parametric, transformation-independent outlier definition. The new method relies on drawing reproducible histograms, using defined bin sizes above and below the median. The method is compared to the method recommended by CLSI/IFCC, which uses the Box-Cox transformation (BCT) and Tukey's fences for outlier definition. The comparison is done on eight simulated distributions and an indirect clinical dataset. The comparison on simulated distributions shows that, without added outliers, the recommended method generally defines fewer outliers. However, when outliers are added on one side, the proposed method often produces better results. With outliers on both sides, the methods are equally good. Furthermore, we find that the presence of outliers affects the BCT, and subsequently the limits determined by the currently recommended methods; this is especially seen in skewed distributions. The proposed outlier definition reproduced current RI limits on clinical data containing outliers. We find our simple transformation-independent outlier detection method to be as good as or better than the currently recommended methods.
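For reference, a short sketch of the comparison method described above (Box-Cox transformation followed by Tukey's fences), using SciPy; the synthetic data and the 1.5-IQR fence factor are conventional choices, not values from the paper:

```python
# Sketch of the CLSI/IFCC-style comparison method: Box-Cox transform the data,
# then flag outliers with Tukey's fences (Q1 - 1.5*IQR, Q3 + 1.5*IQR).
# This is the comparison method described in the abstract, not the authors'
# histogram-based method.
import numpy as np
from scipy import stats

def tukey_outliers_after_boxcox(values):
    x, _ = stats.boxcox(np.asarray(values, dtype=float))  # requires positive data
    q1, q3 = np.percentile(x, [25, 75])
    iqr = q3 - q1
    lower, upper = q1 - 1.5 * iqr, q3 + 1.5 * iqr
    return (x < lower) | (x > upper)

rng = np.random.default_rng(1)
sample = np.concatenate([rng.lognormal(1.0, 0.3, 200), [40.0, 55.0]])  # 2 outliers
print(sample[tukey_outliers_after_boxcox(sample)])
```

As the abstract notes, the Box-Cox fit itself is influenced by the outliers it is meant to help detect, which is the weakness the histogram-based alternative targets.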
Remote-sensing-based rapid assessment of flood crop loss to support USDA flooding decision-making
NASA Astrophysics Data System (ADS)
Di, L.; Yu, G.; Yang, Z.; Hipple, J.; Shrestha, R.
2016-12-01
Floods often cause significant crop loss in the United States. Timely and objective assessment of flood-related crop loss is very important for crop monitoring and risk management in agricultural and disaster-related decision-making in USDA. Among all flood-related information, crop yield loss is particularly important: decisions on proper mitigation, relief, and monetary compensation rely on it. Currently USDA mostly relies on field surveys to obtain crop loss information and compensate farmers' loss claims. Such methods are expensive, labor-intensive, and time-consuming, especially for a large flood that affects a large geographic area. Recent studies have demonstrated that Earth observation (EO) data are useful for post-flood crop loss assessment over a large geographic area in an objective, timely, accurate, and cost-effective way. There are three stages of flood damage assessment: rapid assessment, early recovery assessment, and in-depth assessment. EO-based flood assessment methods currently rely on the time series of a vegetation index to assess the yield loss. Such methods are suitable for in-depth assessment but less suitable for rapid assessment, since the after-flood vegetation index time series is not yet available. This presentation presents a new EO-based method for the rapid assessment of crop yield loss immediately after a flood event to support USDA flood decision-making. The method is based on historic records of flood severity, flood duration, flood date, crop type, EO-based before- and immediately-after-flood crop conditions, and the corresponding crop yield loss. It hypothesizes that a flood of the same severity occurring at the same phenological stage of a crop will cause similar damage to the crop yield regardless of the flood year. With this hypothesis, a regression-based rapid assessment algorithm can be developed by learning from historic records of flood events and the corresponding crop yield loss. In this study, historic records of MODIS-based flood and vegetation products and USDA/NASS crop type and crop yield data are used to train the regression-based rapid assessment algorithm. Validation of the rapid assessment algorithm indicates it can predict yield loss with 90% accuracy, which is accurate enough to support USDA in flood-related quick response and mitigation.
Mahapatra, Dwarikanath; Schueffler, Peter; Tielbeek, Jeroen A W; Buhmann, Joachim M; Vos, Franciscus M
2013-10-01
Increasing incidence of Crohn's disease (CD) in the Western world has made its accurate diagnosis an important medical challenge. The current reference standard for diagnosis, colonoscopy, is time-consuming and invasive, and magnetic resonance imaging (MRI) has emerged as the preferred noninvasive alternative. Current MRI approaches assess the rate of contrast enhancement and bowel wall thickness, and rely on extensive manual segmentation for accurate analysis. We propose a supervised learning method for the identification and localization of regions in abdominal magnetic resonance images that have been affected by CD. Low-level features like intensity and texture are used with shape asymmetry information to distinguish between diseased and normal regions. Particular emphasis is laid on a novel entropy-based shape asymmetry method and on higher-order statistics like skewness and kurtosis. Multi-scale feature extraction renders the method robust. Experiments on real patient data show that our features achieve a high level of accuracy and perform better than two competing methods.
DeepLoc: prediction of protein subcellular localization using deep learning.
Almagro Armenteros, José Juan; Sønderby, Casper Kaae; Sønderby, Søren Kaae; Nielsen, Henrik; Winther, Ole
2017-11-01
The prediction of eukaryotic protein subcellular localization is a well-studied topic in bioinformatics due to its relevance in proteomics research. Many machine learning methods have been successfully applied in this task, but in most of them, predictions rely on annotation of homologues from knowledge databases. For novel proteins where no annotated homologues exist, and for predicting the effects of sequence variants, it is desirable to have methods for predicting protein properties from sequence information only. Here, we present a prediction algorithm using deep neural networks to predict protein subcellular localization relying only on sequence information. At its core, the prediction model uses a recurrent neural network that processes the entire protein sequence and an attention mechanism identifying protein regions important for the subcellular localization. The model was trained and tested on a protein dataset extracted from one of the latest UniProt releases, in which experimentally annotated proteins follow more stringent criteria than previously. We demonstrate that our model achieves a good accuracy (78% for 10 categories; 92% for membrane-bound or soluble), outperforming current state-of-the-art algorithms, including those relying on homology information. The method is available as a web server at http://www.cbs.dtu.dk/services/DeepLoc. Example code is available at https://github.com/JJAlmagro/subcellular_localization. The dataset is available at http://www.cbs.dtu.dk/services/DeepLoc/data.php. jjalma@dtu.dk. © The Author 2017. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com
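As a rough illustration of the architecture described (a recurrent network over the sequence plus positional attention), here is a minimal PyTorch sketch; the layer sizes, vocabulary handling, and all hyperparameters are assumptions, not the published DeepLoc configuration:

```python
# Minimal sketch of an RNN-with-attention localization classifier in the spirit
# of the approach described (PyTorch; sizes and details are assumptions).
import torch
import torch.nn as nn

class SeqLocalizer(nn.Module):
    def __init__(self, n_amino=21, emb=32, hidden=64, n_classes=10):
        super().__init__()
        self.embed = nn.Embedding(n_amino, emb, padding_idx=0)
        self.rnn = nn.LSTM(emb, hidden, batch_first=True, bidirectional=True)
        self.attn = nn.Linear(2 * hidden, 1)    # scores each residue position
        self.out = nn.Linear(2 * hidden, n_classes)

    def forward(self, seqs):                    # seqs: (batch, length) int tokens
        h, _ = self.rnn(self.embed(seqs))       # (batch, length, 2*hidden)
        w = torch.softmax(self.attn(h), dim=1)  # attention over positions
        context = (w * h).sum(dim=1)            # weighted sum -> (batch, 2*hidden)
        return self.out(context)

model = SeqLocalizer()
toy_batch = torch.randint(1, 21, (4, 120))      # 4 fake sequences, length 120
print(model(toy_batch).shape)                   # torch.Size([4, 10])
```

The attention weights give the per-residue importance scores that such models use to highlight regions relevant to the predicted compartment, e.g. signal peptides at the N-terminus.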
Detection of food intake from swallowing sequences by supervised and unsupervised methods.
Lopez-Meyer, Paulo; Makeyev, Oleksandr; Schuckers, Stephanie; Melanson, Edward L; Neuman, Michael R; Sazonov, Edward
2010-08-01
Studies of food intake and ingestive behavior in free-living conditions most often rely on self-reporting-based methods that can be highly inaccurate. Methods of Monitoring of Ingestive Behavior (MIB) rely on objective measures derived from chewing and swallowing sequences and thus can be used for unbiased study of food intake with free-living conditions. Our previous study demonstrated accurate detection of food intake in simple models relying on observation of both chewing and swallowing. This article investigates methods that achieve comparable accuracy of food intake detection using only the time series of swallows and thus eliminating the need for the chewing sensor. The classification is performed for each individual swallow rather than for previously used time slices and thus will lead to higher accuracy in mass prediction models relying on counts of swallows. Performance of a group model based on a supervised method (SVM) is compared to performance of individual models based on an unsupervised method (K-means) with results indicating better performance of the unsupervised, self-adapting method. Overall, the results demonstrate that highly accurate detection of intake of foods with substantially different physical properties is possible by an unsupervised system that relies on the information provided by the swallowing alone.
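A toy sketch of the supervised-versus-unsupervised comparison using scikit-learn; the per-swallow feature vectors and labels here are synthetic stand-ins for the sensor-derived features used in the study:

```python
# Sketch of a supervised (SVM) group model vs. an unsupervised (K-means)
# individual model on per-swallow feature vectors (synthetic data).
import numpy as np
from sklearn.cluster import KMeans
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 1, (100, 5)), rng.normal(2, 1, (100, 5))])
y = np.array([0] * 100 + [1] * 100)  # 0 = no intake, 1 = food intake (toy labels)

# Supervised group model: SVM trained on labeled swallows.
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
svm = SVC().fit(X_tr, y_tr)
print("SVM accuracy:", svm.score(X_te, y_te))

# Unsupervised individual model: K-means self-adapts with no labels at all.
km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(X)
agreement = max(np.mean(km.labels_ == y), np.mean(km.labels_ != y))
print("K-means agreement with labels:", agreement)
```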
Detection of Food Intake from Swallowing Sequences by Supervised and Unsupervised Methods
Lopez-Meyer, Paulo; Makeyev, Oleksandr; Schuckers, Stephanie; Melanson, Edward L.; Neuman, Michael R.; Sazonov, Edward
2010-01-01
Studies of food intake and ingestive behavior in free-living conditions most often rely on self-reporting-based methods that can be highly inaccurate. Methods of Monitoring of Ingestive Behavior (MIB) rely on objective measures derived from chewing and swallowing sequences and thus can be used for unbiased study of food intake with free-living conditions. Our previous study demonstrated accurate detection of food intake in simple models relying on observation of both chewing and swallowing. This article investigates methods that achieve comparable accuracy of food intake detection using only the time series of swallows and thus eliminating the need for the chewing sensor. The classification is performed for each individual swallow rather than for previously used time slices and thus will lead to higher accuracy in mass prediction models relying on counts of swallows. Performance of a group model based on a supervised method (SVM) is compared to performance of individual models based on an unsupervised method (K-means) with results indicating better performance of the unsupervised, self-adapting method. Overall, the results demonstrate that highly accurate detection of intake of foods with substantially different physical properties is possible by an unsupervised system that relies on the information provided by the swallowing alone. PMID:20352335
Transport Coefficients from Large Deviation Functions
NASA Astrophysics Data System (ADS)
Gao, Chloe; Limmer, David
2017-10-01
We describe a method for computing transport coefficients from the direct evaluation of large deviation functions. This method is general, relying only on equilibrium fluctuations, and is statistically efficient, employing trajectory-based importance sampling. Equilibrium fluctuations of molecular currents are characterized by their large deviation functions, which are scaled cumulant generating functions analogous to the free energy. A diffusion Monte Carlo algorithm is used to evaluate the large deviation functions, from which arbitrary transport coefficients are derivable. We find significant statistical improvement over traditional Green-Kubo based calculations. The systematic and statistical errors of this method are analyzed in the context of specific transport coefficient calculations, including the shear viscosity, interfacial friction coefficient, and thermal conductivity.
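For comparison, the traditional Green-Kubo route mentioned above writes a transport coefficient as the time integral of an equilibrium current autocorrelation function (standard form; the prefactor depends on the coefficient in question):

```latex
% Green-Kubo relation: lambda is the transport coefficient conjugate to the
% microscopic current J. For shear viscosity, J is an off-diagonal pressure
% tensor component and the prefactor is V/(k_B T).
\lambda \propto \int_{0}^{\infty} \langle J(0)\, J(t) \rangle \, dt
```

The statistical difficulty with this route is that the long-time tail of the correlation function is noisy, which is the problem the large-deviation sampling approach is designed to mitigate.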
Cryo-balloon catheter localization in fluoroscopic images
NASA Astrophysics Data System (ADS)
Kurzendorfer, Tanja; Brost, Alexander; Jakob, Carolin; Mewes, Philip W.; Bourier, Felix; Koch, Martin; Kurzidim, Klaus; Hornegger, Joachim; Strobel, Norbert
2013-03-01
Minimally invasive catheter ablation has become the preferred treatment option for atrial fibrillation. Although the standard ablation procedure involves ablation points set by radio-frequency catheters, cryo-balloon catheters have been reported to be even more advantageous in certain cases. As electro-anatomical mapping systems do not support cryo-balloon ablation procedures, X-ray guidance is needed. However, current methods to provide support for cryo-balloon catheters in fluoroscopically guided ablation procedures rely heavily on manual user interaction. To improve this, we propose a first method for automatic cryo-balloon catheter localization in fluoroscopic images based on a blob detection algorithm. Our method is evaluated on 24 clinical images from 17 patients. The method successfully detected the cryo-balloon in 22 out of 24 images, yielding a success rate of 91.6%. The successful localizations achieved an accuracy of 1.00 mm ± 0.44 mm. Even though our method currently fails in 8.4% of the available images, it still offers a significant improvement over manual methods. Furthermore, detecting a landmark point along the cryo-balloon catheter can be a very important step for additional post-processing operations.
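To make the blob-detection step concrete, here is a scikit-image sketch of Laplacian-of-Gaussian blob detection on a synthetic balloon-like disk; the sigma range and threshold are illustrative, not the parameters used in the paper:

```python
# Sketch of Laplacian-of-Gaussian blob detection for a balloon-like structure
# in a synthetic frame (scikit-image; parameters are illustrative only).
import numpy as np
from skimage.draw import disk
from skimage.feature import blob_log

# Toy "fluoroscopy" frame: dark background with one bright balloon-like disk.
frame = np.zeros((256, 256))
rr, cc = disk((128, 140), 20)
frame[rr, cc] = 1.0

# blob_log returns (row, col, sigma); radius ~ sigma * sqrt(2) for a 2-D blob.
blobs = blob_log(frame, min_sigma=5, max_sigma=30, num_sigma=10, threshold=0.1)
for r, c, sigma in blobs:
    print(f"blob at ({r:.0f}, {c:.0f}), radius ~ {sigma * np.sqrt(2):.1f} px")
```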
Hong, Seong Cheol; Murale, Dhiraj P; Jang, Se-Young; Haque, Md Mamunul; Seo, Minah; Lee, Seok; Woo, Deok Ha; Kwon, Junghoon; Song, Chang-Seon; Kim, Yun Kyung; Lee, Jun-Seok
2018-06-22
Avian influenza (AI) causes annual epidemic outbreaks that lead to the destruction of tens of millions of poultry worldwide. The current gold-standard AI diagnosis method is an embryonic egg-based hemagglutination assay followed by immunoblotting or PCR sequencing to confirm subtypes. It requires, however, specialized facilities to handle egg inoculation and incubation, and the subtyping methods rely on costly reagents. Here, we demonstrate the first differential sensing approach to distinguish AI subtypes using a series of cell lines and a fluorescent sensor. Susceptibility to AI virus differs depending on the genetic background of the host cells; we therefore examined cells of different organ origins, and the infection patterns across this panel of cells were utilized for AI virus subtyping. To quantify AI infection, we designed a highly cell-permeable fluorescent superoxide sensor to visualize infection. Though many AI monitoring strategies relying on sophisticated antibodies have been extensively studied, our differential sensing strategy successfully discriminated AI subtypes and demonstrated its utility as a primary screening platform for monitoring large numbers of samples. © 2018 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Electronic field emission models beyond the Fowler-Nordheim one
NASA Astrophysics Data System (ADS)
Lepetit, Bruno
2017-12-01
We propose several quantum mechanical models to describe electronic field emission from first principles. These models allow us to correlate quantitatively the electronic emission current with the electrode surface details at the atomic scale. They all rely on electronic potential energy surfaces obtained from three dimensional density functional theory calculations. They differ by the various quantum mechanical methods (exact or perturbative, time dependent or time independent), which are used to describe tunneling through the electronic potential energy barrier. Comparison of these models between them and with the standard Fowler-Nordheim one in the context of one dimensional tunneling allows us to assess the impact on the accuracy of the computed current of the approximations made in each model. Among these methods, the time dependent perturbative one provides a well-balanced trade-off between accuracy and computational cost.
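For orientation, the standard Fowler-Nordheim expression that these first-principles models go beyond has the familiar textbook form:

```latex
% Standard Fowler-Nordheim current density: F is the surface electric field,
% \phi the work function, and a, b the usual Fowler-Nordheim constants.
J_{\mathrm{FN}} = \frac{a\, F^{2}}{\phi}\,\exp\!\left(-\frac{b\,\phi^{3/2}}{F}\right)
```

This form treats the barrier as one-dimensional and triangular; the models discussed in the abstract replace it with tunneling through potential energy surfaces computed from density functional theory.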
[Bases and methods of suturing].
Vogt, P M; Altintas, M A; Radtke, C; Meyer-Marcotty, M
2009-05-01
If pharmaceutic modulation of scar formation does not improve the quality of the healing process over conventional healing, the surgeon must rely on personal skill and experience. Therefore a profound knowledge of wound healing based on experimental and clinical studies supplemented by postsurgical means of scar management and basic techniques of planning incisions, careful tissue handling, and thorough knowledge of suturing remain the most important ways to avoid abnormal scarring. This review summarizes the current experimental and clinical bases of surgical scar management.
Composite annotations: requirements for mapping multiscale data and models to biomedical ontologies
Cook, Daniel L.; Mejino, Jose L. V.; Neal, Maxwell L.; Gennari, John H.
2009-01-01
Current methods for annotating biomedical data resources rely on simple mappings between data elements and the contents of a variety of biomedical ontologies and controlled vocabularies. Here we point out that such simple mappings are inadequate for large-scale multiscale, multidomain integrative “virtual human” projects. For such integrative challenges, we describe a “composite annotation” schema that is simple yet sufficiently extensible for mapping the biomedical content of a variety of data sources and biosimulation models to available biomedical ontologies. PMID:19964601
Gate-Driven Pure Spin Current in Graphene
NASA Astrophysics Data System (ADS)
Lin, Xiaoyang; Su, Li; Si, Zhizhong; Zhang, Youguang; Bournel, Arnaud; Zhang, Yue; Klein, Jacques-Olivier; Fert, Albert; Zhao, Weisheng
2017-09-01
The manipulation of spin current is a promising solution for low-power devices beyond CMOS. However, conventional methods, such as spin-transfer torque or spin-orbit torque for magnetic tunnel junctions, suffer from large power consumption due to frequent spin-charge conversions. An important challenge is, thus, to realize long-distance transport of pure spin current, together with efficient manipulation. Here, the mechanism of gate-driven pure spin current in graphene is presented. Such a mechanism relies on the electrical gating of carrier-density-dependent conductivity and spin-diffusion length in graphene. The gate-driven feature is adopted to realize the pure spin-current demultiplexing operation, which enables gate-controllable distribution of the pure spin current into graphene branches. Compared with the Elliott-Yafet spin-relaxation mechanism, the D'yakonov-Perel spin-relaxation mechanism results in more appreciable demultiplexing performance. The feature of the pure spin-current demultiplexing operation will allow a number of logic functions to be cascaded without spin-charge conversions and open a route for future ultra-low-power devices.
Effect of seabed roughness on tidal current turbines
NASA Astrophysics Data System (ADS)
Gupta, Vikrant; Wan, Minping
2017-11-01
Tidal current turbines have been shown to have the potential to generate clean energy with negligible environmental impact. These devices, however, operate in regions of high to moderate current where the flow is highly turbulent. Flume tank experiments at IFREMER in Boulogne-Sur-Mer (France) and at the NAFL at the University of Minnesota (US) have shown that the level of turbulence and the boundary layer profile affect a turbine's power output and wake characteristics. A major factor that determines these marine flow characteristics is the seabed roughness. Experiments, however, cannot simulate the high Reynolds number conditions of real marine flows. For that, we rely on numerical simulations. Highly accurate numerical methods for wall-bounded flows, such as DNS, are very expensive: the number of grid points needed to resolve the flow scales as Re^(9/4), where Re is the flow Reynolds number. Numerically affordable RANS methods, by contrast, compromise on accuracy. Wall-modelled LES methods, which provide both accuracy and affordability, have improved tremendously in recent years. We discuss the application of such numerical methods for studying the effect of seabed roughness on marine flow features and their impact on turbine power output and wake characteristics. NSFC, Project Number 11672123.
NASA Astrophysics Data System (ADS)
Webster, Matthew Julian
The ultimate goal of any treatment of cancer is to maximize the likelihood of killing the tumor while minimizing the chance of damaging healthy tissues. One of the most effective ways to accomplish this is through radiation therapy, which must be able to target the tumor volume with high accuracy while minimizing the dose delivered to healthy tissues. A successful method of accomplishing this is brachytherapy, which works by placing the radiation source in very close proximity to the tumor. However, most current applications of brachytherapy rely on the geometric manipulation of isotropic sources, which limits the ability to specifically target the tumor. The purpose of this work is to introduce several types of shielded brachytherapy applicators capable of targeting tumors with much greater accuracy than existing technologies. These applicators rely on the modulation of the dose profile through high-density tungsten-alloy shields to create anisotropic dose distributions. Two classes of applicators have been developed in this work. The first relies on the active motion of the shield to aim a highly directional radiation profile. This allows for very precise control of the dose distribution for treatment, achieving unparalleled dose coverage of the tumor while sparing healthy tissues. This technique has been given the moniker of Dynamic Modulated Brachytherapy (DMBT). The second class of applicators, designed to reduce treatment complexity, uses static applicators. These applicators retain the tungsten shield, but the shield is motionless during treatment. By intelligently designing the shield, significant improvements over current methods have been demonstrated. Although these static applicators fail to match the dosimetric quality of DMBT applicators, their simplified setup and treatment procedure gives them significant appeal. The focus of this work has been to optimize these shield designs, specifically for the treatment of rectal and breast carcinomas. The use of Monte Carlo methods and the development of optimization algorithms have played a prominent role in accomplishing this. The use of shielded applicators, such as the ones described here, is the next logical step in the rapidly evolving field of brachytherapy.
Abrahamson, Joseph P; Zelina, Joseph; Andac, M Gurhan; Vander Wal, Randy L
2016-11-01
The first-order approximation (FOA3) currently employed to estimate BC mass emissions underpredicts BC emissions due to inaccuracies in measuring the low smoke numbers (SNs) produced by modern high-bypass-ratio engines. The recently developed Formation and Oxidation (FOX) method removes the need for, and hence the uncertainty associated with, SNs, instead relying upon engine conditions to predict BC mass. Using the true engine operating conditions from proprietary engine cycle data, an improved FOX (ImFOX) predictive relation is developed. Still, the current methods are not optimized to estimate cruise emissions, nor do they account for the use of alternative jet fuels with reduced aromatic content. Here, improved correlations are developed to predict engine conditions and BC mass emissions at ground level and at cruise altitude. This new ImFOX is paired with a newly developed hydrogen relation to predict emissions from alternative fuels and fuel blends. The ImFOX is designed for rich-quench-lean style combustor technologies employed predominantly in the current aviation fleet.
Renaissance of protein crystallization and precipitation in biopharmaceuticals purification.
Dos Santos, Raquel; Carvalho, Ana Luísa; Roque, A Cecília A
The current chromatographic approaches used in protein purification are not keeping pace with the increasing biopharmaceutical market demand. With the upstream improvements, the bottleneck has shifted towards the downstream process. New approaches rely on Anything-But-Chromatography methodologies and on revisiting former techniques from a bioprocess perspective. Protein crystallization and precipitation methods are already implemented in the downstream processes of diverse therapeutic biological macromolecules, overcoming the current chromatographic bottlenecks. Promising work is under way to implement crystallization and precipitation in the purification pipeline of high-value therapeutic molecules. This review focuses on the role of these two methodologies in current industrial purification processes, and highlights their potential implementation in the purification pipeline of high-value therapeutic molecules, overcoming chromatographic holdups. Copyright © 2016 Elsevier Inc. All rights reserved.
NASA Technical Reports Server (NTRS)
Manning, Robert M.
1990-01-01
A novel method of microwave power conversion to direct current is discussed that relies on a modification of well-known resonant linear relativistic electron accelerator techniques. An analysis is presented that shows how, by establishing a 'slow' electromagnetic field in a waveguide, electrons liberated from an array of field emission cathodes are resonantly accelerated to several times their rest energy, thus establishing an electric current over a large potential difference. Such an approach is not limited to the relatively low frequencies that characterize the operation of rectennas and can, with appropriate waveguide and slow-wave structure design, be employed in the 300 to 600 GHz range, where much smaller transmitting and receiving antennas are needed.
Citizen science: a new direction in canine behavior research.
Hecht, Julie; Spicer Rice, Eleanor
2015-01-01
Researchers increasingly rely on members of the public to contribute to scientific projects--from collecting or identifying to analyzing and disseminating data. The "citizen science" model has proved useful to many thematically distinct fields, such as ornithology, astronomy, and phenology. The recent formalization of citizen science projects addresses technical issues related to volunteer participation--like data quality--so that citizen scientists can make long-standing, meaningful contributions to scientific projects. Since the late 1990s, canine science research has relied with increasing frequency on the participation of the general public, particularly dog owners. These researchers do not typically consider the methods and technical issues that those conducting citizen science projects embrace and continue to investigate. As more canine science studies rely on public input, an in-depth knowledge of the benefits and challenges of citizen science can help produce relevant, high-quality data while increasing the general public's understanding of canine behavior and cognition as well as of the scientific process. We examine the benefits and challenges of current citizen science models in an effort to enhance canine citizen science project preparation, execution, and dissemination. This article is part of a Special Issue entitled: Canine Behavior. Copyright © 2014 Elsevier B.V. All rights reserved.
Barlow, Anders J; Portoles, Jose F; Sano, Naoko; Cumpson, Peter J
2016-10-01
The development of the helium ion microscope (HIM) enables the imaging of both hard, inorganic materials and soft, organic or biological materials. Advantages include outstanding topographical contrast, superior resolution down to <0.5 nm at high magnification, high depth of field, and no need for conductive coatings. The instrument relies on helium atom adsorption and ionization at a cryogenically cooled tip that is atomically sharp. Under ideal conditions this arrangement provides a beam of ions that is stable for days to weeks, with beam currents on the order of picoamperes. Over time, however, this stability is lost as gaseous contamination builds up in the source region, leading to adsorbed atoms of species other than helium, which ultimately results in beam current fluctuations. This manifests itself as horizontal stripe artifacts in HIM images. We investigate post-processing methods to remove these artifacts from HIM images, such as median filtering, Gaussian blurring, fast Fourier transforms, and principal component analysis. We arrive at a simple method for completely removing beam current fluctuation effects from HIM images while maintaining the full integrity of the information within the image.
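The abstract does not spell out the final algorithm, so the following is only a minimal sketch of the simplest line-wise idea consistent with it: because the fluctuation is a slow change in beam current between scan lines, rescaling every row to a common median brightness suppresses the stripes without touching in-row detail.

```python
import numpy as np

def remove_line_stripes(img):
    """Suppress horizontal stripe artifacts caused by beam-current
    fluctuations by rescaling each scan line to a common median level.
    `img` is a 2D array; returns a corrected float copy."""
    img = img.astype(float)
    row_med = np.median(img, axis=1, keepdims=True)   # per-line brightness
    target = np.median(row_med)                        # global reference level
    return img * (target / np.where(row_med == 0, 1.0, row_med))

# Synthetic demo: an image with multiplicative per-row fluctuations.
rng = np.random.default_rng(0)
clean = np.outer(np.ones(128), np.linspace(1.0, 2.0, 128))
striped = clean * (1.0 + 0.2 * rng.standard_normal((128, 1)))
fixed = remove_line_stripes(striped)
print("row-to-row std before:", striped.mean(axis=1).std().round(4))
print("row-to-row std after :", fixed.mean(axis=1).std().round(4))
```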
NASA Astrophysics Data System (ADS)
Laubscher, Markus; Bourquin, Stéphane; Froehly, Luc; Karamata, Boris; Lasser, Theo
2004-07-01
Current spectroscopic optical coherence tomography (OCT) methods rely on a posteriori numerical calculation. We present an experimental alternative for accessing spectroscopic information in OCT without post-processing based on wavelength de-multiplexing and parallel detection using a diffraction grating and a smart pixel detector array. Both a conventional A-scan with high axial resolution and the spectrally resolved measurement are acquired simultaneously. A proof-of-principle demonstration is given on a dynamically changing absorbing sample. The method's potential for fast spectroscopic OCT imaging is discussed. The spectral measurements obtained with this approach are insensitive to scan non-linearities or sample movements.
NASA Technical Reports Server (NTRS)
Takeshita, Riki (Inventor); Hibbard, Terry L. (Inventor)
2001-01-01
Friction plug welding (FPW) is advantageous for friction stir welding (FSW) hole close-outs and weld repairs in 2195 Al-Cu-Li fusion or friction stir welds. Current fusion welding methods for Al-Cu-Li have produced welds containing varied defects, which are found by non-destructive examination both after welding and after proof testing. Current techniques for repairing the typically small (<0.25) defects weaken the weldment, rely heavily on welders' skill, and are costly. Friction plug welding repairs increase strength, ductility and resistance to cracking over the initial weld quality, without requiring much time or operator skill. Friction plug welding while pulling the plug is advantageous because all hardware for performing the weld can be placed on one side of the workpiece.
NASA Astrophysics Data System (ADS)
Lee, Mun Bae; Kwon, Oh-In
2018-04-01
Electrical brain stimulation (EBS) is an invasive electrotherapy technique used to treat neurological disorders of the brain through direct or indirect stimulation with a small electric current. EBS has relied on computational modeling to achieve optimal stimulation effects and to investigate internal activations. Magnetic resonance diffusion weighted imaging (DWI) is widely used for diagnosis and for investigating tissue function in various organs; the apparent diffusion coefficient (ADC) measures the intensity of water diffusion within biological tissues using DWI. By measuring the trace ADC and the magnetic flux density induced by the EBS, we propose a method to extract electrical properties, including the effective extracellular ion concentration (EEIC) and the apparent isotropic conductivity, without any additional current injection. First, the internal current density due to EBS is recovered using one measured component of the magnetic flux density. We then update the EEIC with an iterative scheme, the diffusion weighting J-substitution algorithm, using the recovered current density and the trace ADC. To verify the proposed method, we study an anesthetized canine brain, applying electrical stimulation and visualizing electrical properties including the current density, effective extracellular ion concentration, and effective isotropic conductivity.
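For intuition, here is a compact numerical sketch of the generic J-substitution idea only (a toy, not the authors' DWI-coupled algorithm): alternate a forward potential solve with the pointwise update sigma <- |J_meas| / |grad u| on a 2D grid. The geometry, discretization and values are all illustrative assumptions.

```python
import numpy as np

N, h = 32, 1.0 / 32

def solve_potential(sigma, iters=4000):
    """Iterative solve of div(sigma grad u) = 0 with u=1 on the top row,
    u=0 on the bottom row, insulating sides (crude toy discretization)."""
    u = np.linspace(1.0, 0.0, N)[:, None] * np.ones((1, N))
    for _ in range(iters):
        up, dn = sigma[:-2, 1:-1], sigma[2:, 1:-1]
        lf, rt = sigma[1:-1, :-2], sigma[1:-1, 2:]
        num = (up * u[:-2, 1:-1] + dn * u[2:, 1:-1]
               + lf * u[1:-1, :-2] + rt * u[1:-1, 2:])
        u[1:-1, 1:-1] = num / (up + dn + lf + rt)
        u[:, 0], u[:, -1] = u[:, 1], u[:, -2]      # Neumann side walls
    return u

def grad_mag(u):
    gy, gx = np.gradient(u, h)
    return np.hypot(gx, gy) + 1e-9

# "True" conductivity: background 1.0 plus a conductive blob.
yy, xx = np.mgrid[0:N, 0:N]
sigma_true = 1.0 + 1.5 * ((xx - N / 2)**2 + (yy - N / 2)**2 < (N / 5)**2)
J_meas = sigma_true * grad_mag(solve_potential(sigma_true))  # |J| "data"

sigma = np.ones((N, N))                  # homogeneous initial guess
for _ in range(5):                       # J-substitution iterations
    sigma = J_meas / grad_mag(solve_potential(sigma))
print("max abs error vs. truth:", float(np.abs(sigma - sigma_true).max()))
```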
Beyond the Condom: Frontiers in Male Contraception
Roth, Mara Y.; Amory, John K.
2016-01-01
Nearly half of all pregnancies worldwide are unplanned, despite numerous contraceptive options available. No new contraceptive method has been developed for men since the invention of condom. Nevertheless, more than 25% of contraception worldwide relies on male methods. Therefore, novel effective methods of male contraception are of interest. Herein we review the physiologic basis for both male hormonal and nonhormonal methods of contraception. We review the history of male hormonal contraception development, current hormonal agents in development, as well as the potential risks and benefits of male hormonal contraception options for men. Nonhormonal methods reviewed will include both pharmacological and mechanical approaches in development, with specific focus on methods which inhibit the testicular retinoic acid synthesis and action. Multiple hormonal and nonhormonal methods of male contraception are in the drug development pathway, with the hope that a reversible, reliable, safe method of male contraception will be available to couples in the not too distant future. PMID:26947703
Method For Chemical Sensing Using A Microfabricated Teeter-Totter Resonator
Adkins, Douglas Ray; Heller, Edwin J.; Shul, Randy J.
2004-11-30
A method for sensing a chemical analyte in a fluid stream comprises providing a microfabricated teeter-totter resonator that relies upon a Lorentz force to cause oscillation in a paddle, applying a static magnetic field substantially aligned in-plane with the paddle, energizing a current conductor line on a surface of the paddle with an alternating electrical current to generate the Lorentz force, exposing the resonator to the analyte, and detecting the response of the oscillatory motion of the paddle to the chemical analyte. Preferably, a chemically sensitive coating is disposed on at least one surface of the paddle to enhance the sorption of the analyte by the paddle. The concentration of the analyte in a fluid stream can be determined by measuring the change in the resonant frequency or phase of the teeter-totter resonator as the chemical analyte is added to or removed from the paddle.
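The patent text gives no numbers, so the following is a generic resonant mass-loading estimate under assumed paddle parameters, illustrating the final sentence of the abstract: for an effective stiffness k and effective mass m, f0 = sqrt(k/m)/(2*pi), and a small sorbed mass dm shifts the resonance by df ~ -(f0/2)(dm/m).

```python
import math

# Hedged sketch of resonant mass sensing (generic small-load relation,
# not taken from the patent). All parameter values are assumptions.
k = 50.0          # N/m, assumed effective stiffness of the paddle
m = 2.0e-9        # kg, assumed effective mass of the paddle

f0 = math.sqrt(k / m) / (2 * math.pi)
dm = 1.0e-14      # kg of sorbed analyte, assumed
df = -0.5 * f0 * dm / m          # first-order frequency shift

print(f"resonance f0 = {f0 / 1e3:.2f} kHz")
print(f"shift for a {dm * 1e15:.0f} pg load: {df:.4f} Hz")

# Inverting: estimate the sorbed mass from an observed frequency shift.
dm_est = -2 * m * df / f0
print(f"recovered dm = {dm_est:.3e} kg")
```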
Education for worksite monitors of impaired nurses.
Young, Linda J
2008-01-01
Boards of nursing sponsor programs, including alternative-to-discipline programs, for recovering nurses. These programs rely on worksite monitors, who are often other nurses or supervisors of nurses, to work with recovering nurses when they return to practice. The skills of these monitors vary with respect to understanding the monitor role and recognizing traits of chemical dependency and relapse. To determine the value of program content and the best teaching method for monitors, 17 currently active worksite monitors participated in a study that evaluated content value for two groups, new and experienced monitors, and selected the best method to teach four content topics. Results showed that current content was valued without additions being necessary and that group instruction in urban areas was preferred over one-to-one instruction. Implementation of the study outcomes, however, showed that issues of confidentiality made group instruction unsatisfactory.
In vitro plant tissue culture: means for production of biological active compounds.
Espinosa-Leal, Claudia A; Puente-Garza, César A; García-Lara, Silverio
2018-05-07
Plant tissue culture is an important tool for the continuous production of active compounds, including secondary metabolites and engineered molecules, and novel methods (gene editing, abiotic stress) can improve the technique. Humans have a long history of reliance on plants for a supply of food, shelter and, most importantly, medicine. Current-day pharmaceuticals are typically based on plant-derived metabolites, with new products being discovered constantly. Nevertheless, the consistent and uniform supply of plant pharmaceuticals has often been compromised. One alternative for the production of important plant active compounds is in vitro plant tissue culture, as it assures independence from geographical conditions by eliminating the need to rely on wild plants. Plant transformation also allows the further use of plants for the production of engineered compounds, such as vaccines and multiple pharmaceuticals. This review summarizes the important bioactive compounds currently produced by plant tissue culture and the fundamental methods and plants employed for their production.
A convenient method for large-scale STM mapping of freestanding atomically thin conductive membranes
NASA Astrophysics Data System (ADS)
Uder, B.; Hartmann, U.
2017-06-01
Two-dimensional atomically flat sheets with high flexibility are very attractive as ultrathin membranes but are also inherently challenging for microscopic investigation. We report on a method using Scanning Tunneling Microscopy (STM) under ultra-high vacuum conditions for large-scale mapping of several-micrometer-sized freestanding single- and multilayer graphene membranes. This is achieved by operating the STM at unusual parameters: we found that large-scale scanning of atomically thin membranes delivers valuable results at very high tip-scan speeds combined with high feedback-loop gain and low tunneling currents. The method ultimately relies on the particular behavior of the freestanding membrane in the STM, which is much different from that of a solid substrate.
From thermometric to spectrophotometric kinetic-catalytic methods of analysis. A review.
Cerdà, Víctor; González, Alba; Danchana, Kaewta
2017-05-15
Kinetic-catalytic analytical methods have proved to be very simple and highly sensitive strategies for chemical analysis that rely on simple instrumentation [1,2]. Molecular absorption spectrophotometry is commonly used as the detection technique. However, other detection systems, such as electrochemical or thermometric ones, offer interesting possibilities since they are not affected by the color or turbidity of the samples. This review describes our initial experience with thermometric kinetic-catalytic methods through to our current experience exploiting spectrophotometric flow techniques to automate this kind of reaction, including the use of integrated chips. Procedures for the determination of inorganic and organic species in organic and inorganic matrices are presented. Copyright © 2017 Elsevier B.V. All rights reserved.
Strain-Based Damage Determination Using Finite Element Analysis for Structural Health Management
NASA Technical Reports Server (NTRS)
Hochhalter, Jacob D.; Krishnamurthy, Thiagaraja; Aguilo, Miguel A.
2016-01-01
A damage determination method is presented that relies on in-service strain sensor measurements. The method employs a gradient-based optimization procedure combined with the finite element method for the solution of the forward problem. It is demonstrated that strains measured at a limited number of sensors can be used to accurately determine the location, size, and orientation of damage. Numerical examples are presented to demonstrate the general procedure. This work is motivated by the need to provide structural health management systems with real-time damage characterization. The damage cases investigated herein are characteristic of point-source damage, which can attain critical size during flight. The procedure described can be used to provide prognosis tools with the current damage configuration.
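The general inverse-problem pattern can be sketched compactly. Below, a toy analytic function stands in for the paper's finite element forward solve (the decay form, sensor layout, and parameters are all invented for illustration); damage parameters are then recovered by least-squares minimization of the misfit between "measured" and predicted sensor strains.

```python
import numpy as np
from scipy.optimize import minimize

# Toy stand-in for an FEM forward model: damage at (x, y) with size `a`
# produces a strain concentration that decays with distance from the site.
sensors = np.array([[0.2, 0.2], [0.8, 0.2], [0.5, 0.5],
                    [0.2, 0.8], [0.8, 0.8]])

def forward(params):
    x, y, a = params
    d2 = ((sensors - [x, y]) ** 2).sum(axis=1)
    return a / (1.0 + 25.0 * d2)          # strain at each sensor

truth = np.array([0.63, 0.37, 1.0])
measured = forward(truth) + 1e-3 * np.random.default_rng(1).standard_normal(5)

def misfit(p):                             # least-squares objective
    r = forward(p) - measured
    return 0.5 * (r @ r)

fit = minimize(misfit, x0=[0.5, 0.5, 0.5], method="L-BFGS-B",
               bounds=[(0, 1), (0, 1), (0.1, 2.0)])
print("recovered (x, y, size):", np.round(fit.x, 3))
```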
Neurobehavioral Development of Common Marmoset Monkeys
Schultz-Darken, Nancy; Braun, Katarina M.; Emborg, Marina E.
2016-01-01
Common marmoset (Callithrix jacchus) monkeys are a resource for biomedical research, and their use is predicted to increase owing to the suitability of this species for transgenic approaches. Identification of abnormal neurodevelopment due to genetic modification relies upon comparison with validated patterns of normal behavior defined by unbiased methods. As scientists unfamiliar with nonhuman primate development are interested in applying genomic editing techniques in marmosets, it would benefit the field for investigators to use validated methods of postnatal evaluation that are age and species appropriate. This review aims to analyze currently available data on marmoset physical and behavioral postnatal development, describe the methods used, and discuss next steps to better understand and evaluate normal and abnormal marmoset postnatal neurodevelopment. PMID:26502294
Real time algorithms for sharp wave ripple detection.
Sethi, Ankit; Kemere, Caleb
2014-01-01
Neural activity during sharp wave ripples (SWR), short bursts of coordinated oscillatory activity in the CA1 region of the rodent hippocampus, is implicated in a variety of memory functions from consolidation to recall. Detection of these events in an algorithmic framework has thus far relied on simple thresholding techniques with heuristically derived parameters. This study investigates and improves upon current methods for detecting SWR events in neural recordings. We propose and profile methods to reduce latency in ripple detection, testing the proposed algorithms on simulated ripple data. The findings show that simple real-time algorithms can improve upon existing power thresholding methods and can detect ripple activity with latencies in the range of 10-20 ms.
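For reference, here is a minimal sketch of the kind of baseline power-thresholding detector such studies benchmark against (band edges, smoothing window and threshold multiplier are typical heuristic choices, not values from this paper): band-pass the LFP in the ripple band, smooth the squared signal into a power envelope, and threshold at the mean plus a few standard deviations.

```python
import numpy as np
from scipy.signal import butter, filtfilt

def detect_ripples(lfp, fs, n_sd=3.0):
    """Baseline SWR detector: 150-250 Hz band-pass, ~8 ms power
    envelope, mean + n_sd * SD threshold. Returns a boolean mask."""
    b, a = butter(3, [150 / (fs / 2), 250 / (fs / 2)], btype="band")
    ripple = filtfilt(b, a, lfp)
    w = int(0.008 * fs)
    power = np.convolve(ripple ** 2, np.ones(w) / w, mode="same")
    return power > power.mean() + n_sd * power.std()

# Synthetic demo: noise with a 200 Hz burst injected around t = 0.5 s.
fs = 1500
t = np.arange(0, 1.0, 1 / fs)
lfp = np.random.default_rng(0).standard_normal(t.size)
lfp += 4.0 * (np.abs(t - 0.5) < 0.03) * np.sin(2 * np.pi * 200 * t)
mask = detect_ripples(lfp, fs)
print("first detection at t =", t[mask.argmax()].round(3), "s")
```

Note that an offline detector like this one (filtfilt is non-causal) cannot be used as-is in real time; reducing that latency is precisely the problem the study addresses.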
Host-microbe interactions in distal airways: relevance to chronic airway diseases.
Martin, Clémence; Burgel, Pierre-Régis; Lepage, Patricia; Andréjak, Claire; de Blic, Jacques; Bourdin, Arnaud; Brouard, Jacques; Chanez, Pascal; Dalphin, Jean-Charles; Deslée, Gaetan; Deschildre, Antoine; Gosset, Philippe; Touqui, Lhousseine; Dusser, Daniel
2015-03-01
This article is the summary of a workshop, which took place in November 2013, on the roles of microorganisms in chronic respiratory diseases. Until recently, it was assumed that lower airways were sterile in healthy individuals. However, it has long been acknowledged that microorganisms could be identified in distal airway secretions from patients with various respiratory diseases, including cystic fibrosis (CF) and non-CF bronchiectasis, chronic obstructive pulmonary disease, asthma and other chronic airway diseases (e.g. post-transplantation bronchiolitis obliterans). These microorganisms were sometimes considered as infectious agents that triggered host immune responses and contributed to disease onset and/or progression; alternatively, microorganisms were often considered as colonisers, which were considered unlikely to play roles in disease pathophysiology. These concepts were developed at a time when the identification of microorganisms relied on culture-based methods. Importantly, the majority of microorganisms cannot be cultured using conventional methods, and the use of novel culture-independent methods that rely on the identification of microorganism genomes has revealed that healthy distal airways display a complex flora called the airway microbiota. The present article reviews some aspects of current literature on host-microbe (mostly bacteria and viruses) interactions in healthy and diseased airways, with a special focus on distal airways. Copyright ©ERS 2015.
Kizil, Caghan; Brand, Michael
2011-01-01
The teleost fish Danio rerio (zebrafish) has a remarkable ability to generate newborn neurons in its brain at adult stages of its lifespan-a process called adult neurogenesis. This ability relies on proliferating ventricular progenitors and is in striking contrast to mammalian brains that have rather restricted capacity for adult neurogenesis. Therefore, investigating the zebrafish brain can help not only to elucidate the molecular mechanisms of widespread adult neurogenesis in a vertebrate species, but also to design therapies in humans with what we learn from this teleost. Yet, understanding the cellular behavior and molecular programs underlying different biological processes in the adult zebrafish brain requires techniques that allow manipulation of gene function. As a complementary method to the currently used misexpression techniques in zebrafish, such as transgenic approaches or electroporation-based delivery of DNA, we devised a cerebroventricular microinjection (CVMI)-assisted knockdown protocol that relies on vivo morpholino oligonucleotides, which do not require electroporation for cellular uptake. This rapid method allows uniform and efficient knockdown of genes in the ventricular cells of the zebrafish brain, which contain the neurogenic progenitors. We also provide data on the use of CVMI for growth factor administration to the brain – in our case FGF8, which modulates the proliferation rate of the ventricular cells. In this paper, we describe the CVMI method and discuss its potential uses in zebrafish. PMID:22076157
Advanced image based methods for structural integrity monitoring: Review and prospects
NASA Astrophysics Data System (ADS)
Farahani, Behzad V.; Sousa, Pedro José; Barros, Francisco; Tavares, Paulo J.; Moreira, Pedro M. G. P.
2018-02-01
There is a growing trend in engineering to develop methods for structural integrity monitoring and for characterizing the in-service mechanical behaviour of components. The fast growth in recent years of image processing techniques and image-based sensing for experimental mechanics has brought about a paradigm change in how phenomena are sensed, and several widely applicable optical approaches now play a significant role in supporting experiment. This review describes advanced image-based methods for structural integrity monitoring, focusing on Digital Image Correlation (DIC), Thermoelastic Stress Analysis (TSA), Electronic Speckle Pattern Interferometry (ESPI) and Speckle Pattern Shearing Interferometry (Shearography). These non-contact full-field techniques rely on intensive image processing to measure mechanical behaviour and continue to evolve even as reviews such as this are being written, which justifies a special effort to keep abreast of this progress.
A real-time PCR diagnostic method for detection of Naegleria fowleri.
Madarová, Lucia; Trnková, Katarína; Feiková, Sona; Klement, Cyril; Obernauerová, Margita
2010-09-01
Naegleria fowleri is a free-living amoeba that can cause primary amoebic meningoencephalitis (PAM). While traditional methods for diagnosing PAM still rely on culture, more recent laboratory diagnoses are based on conventional PCR methods; however, only a few real-time PCR assays have been described to date. Here, we describe a real-time PCR-based diagnostic method using hybridization fluorescent labelled probes, with a LightCycler instrument and accompanying software (Roche), targeting the Naegleria fowleri Mp2Cl5 gene sequence. Using this method, no cross-reactivity with other tested epidemiologically relevant prokaryotic and eukaryotic organisms was found. The detection limit of the reaction was 1 copy of the Mp2Cl5 DNA sequence. This assay could become useful in the rapid laboratory assessment of the presence or absence of Naegleria fowleri. Copyright 2009 Elsevier Inc. All rights reserved.
The Effects of Concurrent Verbal and Visual Tasks on Category Learning
ERIC Educational Resources Information Center
Miles, Sarah J.; Minda, John Paul
2011-01-01
Current theories of category learning posit separate verbal and nonverbal learning systems. Past research suggests that the verbal system relies on verbal working memory and executive functioning and learns rule-defined categories; the nonverbal system does not rely on verbal working memory and learns non-rule-defined categories (E. M. Waldron…
Utama, M Iqbal Bakti; Lu, Xin; Zhan, Da; Ha, Son Tung; Yuan, Yanwen; Shen, Zexiang; Xiong, Qihua
2014-11-07
Patterning two-dimensional materials into specific spatial arrangements and geometries is essential for both fundamental studies of materials and practical applications in electronics. However, the currently available patterning methods generally require etching steps that rely on complicated and expensive procedures. We report here a facile patterning method for atomically thin MoSe2 films using stripping with an SU-8 negative resist layer exposed to electron beam lithography. Additional steps of chemical and physical etching were not necessary in this SU-8 patterning method. The SU-8 patterning was used to define a ribbon channel from a field effect transistor of MoSe2 film, which was grown by chemical vapor deposition. The narrowing of the conduction channel area with SU-8 patterning was crucial in suppressing the leakage current within the device, thereby allowing a more accurate interpretation of the electrical characterization results from the sample. An electrical transport study, enabled by the SU-8 patterning, showed a variable range hopping behavior at high temperatures.
DeFelice, Nicholas B.; Johnston, Jill E.; Gibson, Jacqueline MacDonald
2016-01-01
Background: Previous analyses have suggested that unregulated private drinking water wells carry a higher risk of exposure to microbial contamination than regulated community water systems. In North Carolina, ~35% of the state’s population relies on private wells, but the health impact associated with widespread reliance on such unregulated drinking water sources is unknown. Objectives: We estimated the total number of emergency department visits for acute gastrointestinal illness (AGI) attributable to microbial contamination in private wells in North Carolina per year, the costs of those visits, and the potential health benefits of extending regulated water service to households currently relying on private wells for their drinking water. Methods: We developed a population intervention model using 2007–2013 data from all 122 North Carolina emergency departments along with microbial contamination data for all 2,120 community water systems and for 16,138 private well water samples collected since 2008. Results: An estimated 29,400 (95% CI: 26,600, 32,200) emergency department visits per year for acute gastrointestinal illness were attributable to microbial contamination in drinking water, constituting approximately 7.3% (95% CI: 6.6, 7.9%) of all AGI-related visits. Of these attributable cases, 99% (29,200; 95% CI: 26,500, 31,900) were associated with private well contamination. The estimated statewide annual cost of emergency department visits attributable to microbiological contamination of drinking water is 40.2 million USD (95% CI: 2.58 million USD, 193 million USD), of which 39.9 million USD (95% CI: 2.56 million USD, 192 million USD) is estimated to arise from private well contamination. An estimated 2,920 (95% CI: 2,650, 3,190) annual emergency department visits could be prevented by extending community water service to 10% of the population currently relying on private wells. Conclusions: This research provides new evidence that extending regulated community water service to populations currently relying on private wells may decrease the population burden of acute gastrointestinal illness. Citation: DeFelice NB, Johnston JE, Gibson JM. 2016. Reducing emergency department visits for acute gastrointestinal illnesses in North Carolina (USA) by extending community water service. Environ Health Perspect 124:1583–1591; http://dx.doi.org/10.1289/EHP160 PMID:27203131
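For scale, the headline numbers in this abstract are mutually consistent under simple attributable-fraction arithmetic; the sketch below merely re-derives them from the reported figures (the published population intervention model is far more detailed).

```python
# Back-of-envelope arithmetic using only figures quoted in the abstract.
attributable = 29_400          # annual ED visits attributable to water
well_attributable = 29_200     # of which, from private wells

total_agi_visits = attributable / 0.073      # attributable ~7.3% of all AGI
print(f"implied annual AGI visits: {total_agi_visits:,.0f}")
print(f"share from private wells : {well_attributable / attributable:.1%}")

# Extending community water to 10% of well users is reported to avert
# ~2,920 visits/year, i.e. ~10% of the well-attributable visits.
print(f"expected averted visits  : {0.10 * well_attributable:,.0f}")
```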
Eddy-current inversion in the thin-skin limit: Determination of depth and opening for a long crack
NASA Astrophysics Data System (ADS)
Burke, S. K.
1994-09-01
A method for crack size determination using eddy-current nondestructive evaluation is presented for the case of a plate containing an infinitely long crack of uniform depth and uniform crack opening. The approach is based on the approximate solution to Maxwell's equations for nonmagnetic conductors in the limit of small skin depth and relies on least-squares polynomial fits to a normalized coil impedance function as a function of skin depth. The method is straightforward to implement and is relatively insensitive to both systematic and random errors. The procedure requires the computation of two functions: a normalizing function, which depends on both the coil parameters and the skin depth, and a crack-depth function, which depends only on the coil parameters in addition to the crack depth. The practical performance of the method was tested using a set of simulated cracks in the form of electro-discharge machined slots in aluminum alloy plates. The crack depths and crack openings deduced from the eddy-current measurements agree with the actual crack dimensions to within 10% or better. Recommendations concerning the optimum conditions for crack sizing are also made.
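The paper's actual normalized impedance and crack-depth functions are not reproduced in the abstract; the sketch below only illustrates the numerical step it names, a least-squares polynomial fit of a response sampled across skin depths, using an invented stand-in curve.

```python
import numpy as np

# Stand-in response curve (assumed, not the paper's impedance function):
# sample a normalized impedance-like quantity over a skin-depth sweep,
# fit a least-squares polynomial, and interpolate at a measurement point.
skin_depth = np.linspace(0.1, 1.0, 10)            # mm, assumed sweep
z_norm = 1.0 / (1.0 + 2.0 * skin_depth**2)        # illustrative response
z_norm += 0.002 * np.random.default_rng(2).standard_normal(10)

coeffs = np.polyfit(skin_depth, z_norm, deg=3)    # least-squares fit
z_fit = np.polyval(coeffs, 0.45)                  # evaluate at 0.45 mm
print("fitted response at 0.45 mm skin depth:", round(float(z_fit), 4))
```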
NASA Technical Reports Server (NTRS)
1971-01-01
Methods for presterilization cleaning or decontamination of spacecraft hardware to reduce microbial load, without harming materials or spacecraft components, are investigated. Three methods were considered: (1) chemicals in liquid form, relying on physical removal as well as bactericidal or bacteriostatic action; (2) chemicals used in the gaseous phase, relying on bactericidal activity; and (3) mechanical cleaning, relying on physical removal of organisms. These methods were evaluated in terms of their effectiveness in reducing microbial burden and their compatibility with spacecraft hardware. Results show that the chemical methods were effective against spore-forming microorganisms but were harmful to spacecraft materials. Mechanical methods were also effective, with the degree depending upon the type of instrument employed. However, mechanical methods caused equipment-handling problems: vacuum pressure damaged the very thin layered materials used for shielding, and the bristles used in the process caused streaks or abrasions on some spacecraft components.
Sivalingam, Jaichandran; Lam, Alan Tin-Lun; Chen, Hong Yu; Yang, Bin Xia; Chen, Allen Kuan-Liang; Reuveny, Shaul; Loh, Yuin-Han; Oh, Steve Kah-Weng
2016-08-01
In vitro generation of red blood cells (RBCs) from human embryonic stem cells and human induced pluripotent stem cells appears to be a promising alternative approach to circumvent shortages in donor-derived blood supplies for clinical applications. Conventional methods for hematopoietic differentiation of human pluripotent stem cells (hPSC) rely on embryoid body (EB) formation and/or coculture with xenogeneic cell lines. However, most current methods for hPSC expansion and EB formation are not amenable to scale-up to the levels required for large-scale RBC generation. Moreover, differentiation methods that rely on xenogeneic cell lines would face obstacles to future clinical translation. In this study, we report the development of a serum-free, chemically defined, microcarrier-based suspension culture platform for scalable hPSC expansion and EB formation. Improved survival and better-quality EBs generated with the microcarrier-based method resulted in significantly improved mesoderm induction and, when combined with hematopoietic differentiation, at least a 6-fold improvement in hematopoietic precursor expansion, potentially culminating in an 80-fold improvement in the yield of RBC generation compared to a conventional EB-based differentiation method. In addition, we report efficient terminal maturation and generation of mature enucleated RBCs using a coculture system comprising primary human mesenchymal stromal cells. The microcarrier-based platform could prove to be an appealing strategy for future scale-up of hPSC culture, EB generation, and large-scale generation of RBCs under defined and xeno-free conditions.
Kuhn, Alexandre; Ong, Yao Min; Quake, Stephen R; Burkholder, William F
2015-07-08
Like other structural variants, transposable element insertions can be highly polymorphic across individuals. Their functional impact, however, remains poorly understood. Current genome-wide approaches for genotyping insertion-site polymorphisms based on targeted or whole-genome sequencing remain very expensive and can lack accuracy, hence new large-scale genotyping methods are needed. We describe a high-throughput method for genotyping transposable element insertions and other types of structural variants that can be assayed by breakpoint PCR. The method relies on next-generation sequencing of multiplex, site-specific PCR amplification products and read count-based genotype calls. We show that this method is flexible, efficient (it does not require rounds of optimization), cost-effective and highly accurate. This method can benefit a wide range of applications from the routine genotyping of animal and plant populations to the functional study of structural variants in humans.
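A minimal sketch of the read-count-based genotype call described here might look like the following; the exact depth and allele-fraction thresholds are assumptions for illustration, not the paper's calibrated values.

```python
# Read-count genotype caller for an insertion-site polymorphism: count
# breakpoint reads supporting the insertion allele vs. the empty-site
# allele, then call a genotype from the allele fraction.
def call_genotype(ins_reads, ref_reads, min_depth=10):
    depth = ins_reads + ref_reads
    if depth < min_depth:
        return "./."                      # insufficient coverage
    frac = ins_reads / depth
    if frac >= 0.85:
        return "1/1"                      # homozygous insertion
    if frac <= 0.15:
        return "0/0"                      # homozygous reference
    return "0/1"                          # heterozygous

# Illustrative counts (thresholds above are assumed, not the paper's).
for ins, ref in [(42, 1), (19, 23), (0, 35), (3, 4)]:
    print(ins, ref, "->", call_genotype(ins, ref))
```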
Active Thermal Architecture for Cryogenic Optical Instrumentation (ATACOI)
NASA Technical Reports Server (NTRS)
Swenson, Charles; Hunter, Roger C.; Baker, Christopher E.
2018-01-01
The Active Thermal Architecture for Cryogenic Optical Instrumentation (ATACOI) project will demonstrate an advanced thermal control system for CubeSats and enable the use of cryogenic electro-optical instrumentation on small satellite platforms. Specifically, the project focuses on the development of a deployable solar tracking radiator, a rotationally flexible rotary union fluid joint, and a thermal/vibrational isolation system for miniature cryogenic detectors. This technology will represent a significant improvement over the current state of the art for CubeSat thermal control, which generally relies on simple passive and conductive methods.
Chambers, C; Stewart, S; Su, B; Sandy, J; Ireland, A
2013-11-01
Orthodontic treatment, like all aspects of dentistry, exposes the clinician to the risk of malpractice and litigation. Demineralisation of tooth enamel is still one of the main complications of orthodontic treatment, and it is essential that patients are made aware of this risk during the consent process. There are a variety of fluoride delivery systems (mouthrinse, varnish, bonding system, and elastics) that can be used to prevent white spot lesion (WSL) formation. Glass-ionomer bonding cements (GIC) have also been shown to reduce WSL formation and have the benefit of not relying on patient compliance; however, these materials have not found widespread acceptance, possibly owing to their handling characteristics. A number of new technologies, principally fillers and coatings, have recently become available with potential antimicrobial and antibiofilm properties. Coatings can be applied to brackets and wires to prevent bacterial adhesion, although the longevity of these coatings is questionable. A number of methods are available for reducing the incidence of WSL, but they all have limitations. Capitalising on technological advances will enable the production of tailor-made orthodontic brackets and adhesive systems that provide long-term protection against WSL without relying on patient compliance.
Dynamic Routing of Aircraft in the Presence of Adverse Weather Using a POMDP Framework
NASA Technical Reports Server (NTRS)
Balaban, Edward; Roychoudhury, Indranil; Spirkovska, Lilly; Sankararaman, Shankar; Kulkarni, Chetan; Arnon, Tomer
2017-01-01
Each year weather-related airline delays result in hundreds of millions of dollars in additional fuel burn, maintenance, and lost revenue, not to mention passenger inconvenience. The current approaches for aircraft route planning in the presence of adverse weather still mainly rely on deterministic methods. In contrast, this work aims to deal with the problem using a Partially Observable Markov Decision Processes (POMDPs) framework, which allows for reasoning over uncertainty (including uncertainty in weather evolution over time) and results in solutions that are more robust to disruptions. The POMDP-based decision support system is demonstrated on several scenarios involving convective weather cells and is benchmarked against a deterministic planning system with functionality similar to those currently in use or under development.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Serne, R.J.; Wood, M.I.
1990-05-01
This report documents the currently available geochemical data base for release and retardation for actual Hanford Site materials (wastes and/or sediments). The report also recommends specific laboratory tests and presents the rationale for the recommendations. The purpose of this document is threefold: to summarize currently available information, to provide a strategy for generating additional data, and to provide recommendations on specific data collection methods and test matrices. This report outlines a data collection approach that relies on feedback from performance analyses to ascertain when adequate data have been collected. The data collection scheme emphasizes laboratory testing based on empiricism. 196 refs., 4 figs., 36 tabs.
Internal Nano Voids in Yttria-Stabilised Zirconia (YSZ) Powder
Barad, Chen; Shekel, Gal; Shandalov, Michael; Hayun, Hagay; Kimmel, Giora; Shamir, Dror; Gelbstein, Yaniv
2017-01-01
Porous yttria-stabilised zirconia ceramics have been gaining popularity over the years in various fields, such as energy, the environment, and medicine. Although yttria-stabilised zirconia is a well-studied material, voided yttria-stabilised zirconia powder particles have not been demonstrated yet and might play an important role in future technology developments. A sol-gel synthesis accompanied by a freeze-drying process is proposed here as a method of obtaining a sponge-like nano-morphology of embedded faceted voids inside yttria-stabilised zirconia particles. The results rely on the freeze-drying stage as an effective and simple method for generating nano-voided yttria-stabilised zirconia particles without the use of template-assisted additives. PMID:29258227
Sun, Jared H; Twomey, Michele; Tran, Jeffrey; Wallis, Lee A
2012-11-01
Ninety percent of emergency incidents occur in developing countries, and this is only expected to worsen as these nations develop. As a result, governments in developing countries are establishing emergency care systems. However, there is currently no widely usable, objective method to monitor or research the rapid growth of emergency care in the developing world. We analyze current quantitative methods for assessing emergency care in developing countries and propose a more appropriate method. Currently accepted methods for quantitatively assessing the efficacy of emergency care systems cannot be applied in most developing countries because of weak record-keeping infrastructure and the inappropriateness of applying Western-derived coefficients to developing-country conditions. As a result, although emergency care in the developing world is growing rapidly, researchers and clinicians are unable to objectively measure its progress or determine which policies work best in their respective countries. We propose the TEWS methodology, a simple analytical tool suited to low-resource developing countries. By relying on the most basic universal parameters, the simplest calculations and a straightforward protocol, the TEWS methodology allows for widespread analysis of emergency care in the developing world. This could become essential in the establishment and growth of new emergency care systems worldwide.
1992-06-01
processes. It demands commitment and discipline. It relies on people and involves everyone. (DoD TQM Pamphlet (undated), 1) The following are four...export the TQM philosophy to their suppliers, as indicated in their brochure: TQM relies on continuous improvement in DoD's acquired products and services
Summary of nondestructive testing theory and practice
NASA Technical Reports Server (NTRS)
Meister, R. P.; Randall, M. D.; Mitchell, D. K.; Williams, L. P.; Pattee, H. E.
1972-01-01
The ability to fabricate design-critical and man-rated aerospace structures using materials near the limits of their capabilities requires a comprehensive and dependable assurance program. The quality assurance program must rely heavily on nondestructive testing methods for thorough inspection to assess the properties and quality of hardware items. A survey of nondestructive testing methods is presented to provide space program managers, supervisors and engineers who are unfamiliar with this technical area with appropriate insight into the commonly accepted nondestructive testing methods available, along with their interrelationships, uses, advantages and limitations. Primary emphasis is placed on the most common methods: liquid penetrant, magnetic particle, radiography, ultrasonics and eddy current. A number of the newer test techniques, including thermal, acoustic emission, holography, microwaves, eddy-sonic and exo-electron emission, which are beginning to be used in applications of interest to NASA, are also discussed briefly.
A large-scale evaluation of computational protein function prediction
Radivojac, Predrag; Clark, Wyatt T; Ronnen Oron, Tal; Schnoes, Alexandra M; Wittkop, Tobias; Sokolov, Artem; Graim, Kiley; Funk, Christopher; Verspoor, Karin; Ben-Hur, Asa; Pandey, Gaurav; Yunes, Jeffrey M; Talwalkar, Ameet S; Repo, Susanna; Souza, Michael L; Piovesan, Damiano; Casadio, Rita; Wang, Zheng; Cheng, Jianlin; Fang, Hai; Gough, Julian; Koskinen, Patrik; Törönen, Petri; Nokso-Koivisto, Jussi; Holm, Liisa; Cozzetto, Domenico; Buchan, Daniel W A; Bryson, Kevin; Jones, David T; Limaye, Bhakti; Inamdar, Harshal; Datta, Avik; Manjari, Sunitha K; Joshi, Rajendra; Chitale, Meghana; Kihara, Daisuke; Lisewski, Andreas M; Erdin, Serkan; Venner, Eric; Lichtarge, Olivier; Rentzsch, Robert; Yang, Haixuan; Romero, Alfonso E; Bhat, Prajwal; Paccanaro, Alberto; Hamp, Tobias; Kassner, Rebecca; Seemayer, Stefan; Vicedo, Esmeralda; Schaefer, Christian; Achten, Dominik; Auer, Florian; Böhm, Ariane; Braun, Tatjana; Hecht, Maximilian; Heron, Mark; Hönigschmid, Peter; Hopf, Thomas; Kaufmann, Stefanie; Kiening, Michael; Krompass, Denis; Landerer, Cedric; Mahlich, Yannick; Roos, Manfred; Björne, Jari; Salakoski, Tapio; Wong, Andrew; Shatkay, Hagit; Gatzmann, Fanny; Sommer, Ingolf; Wass, Mark N; Sternberg, Michael J E; Škunca, Nives; Supek, Fran; Bošnjak, Matko; Panov, Panče; Džeroski, Sašo; Šmuc, Tomislav; Kourmpetis, Yiannis A I; van Dijk, Aalt D J; ter Braak, Cajo J F; Zhou, Yuanpeng; Gong, Qingtian; Dong, Xinran; Tian, Weidong; Falda, Marco; Fontana, Paolo; Lavezzo, Enrico; Di Camillo, Barbara; Toppo, Stefano; Lan, Liang; Djuric, Nemanja; Guo, Yuhong; Vucetic, Slobodan; Bairoch, Amos; Linial, Michal; Babbitt, Patricia C; Brenner, Steven E; Orengo, Christine; Rost, Burkhard; Mooney, Sean D; Friedberg, Iddo
2013-01-01
Automated annotation of protein function is challenging. As the number of sequenced genomes rapidly grows, the overwhelming majority of protein products can only be annotated computationally. If computational predictions are to be relied upon, it is crucial that the accuracy of these methods be high. Here we report the results from the first large-scale community-based Critical Assessment of protein Function Annotation (CAFA) experiment. Fifty-four methods representing the state-of-the-art for protein function prediction were evaluated on a target set of 866 proteins from eleven organisms. Two findings stand out: (i) today’s best protein function prediction algorithms significantly outperformed widely-used first-generation methods, with large gains on all types of targets; and (ii) although the top methods perform well enough to guide experiments, there is significant need for improvement of currently available tools. PMID:23353650
Nanowire-nanopore transistor sensor for DNA detection during translocation
NASA Astrophysics Data System (ADS)
Xie, Ping; Xiong, Qihua; Fang, Ying; Qing, Quan; Lieber, Charles
2011-03-01
Nanopore sequencing was proposed more than a decade ago as a promising low-cost, high-throughput sequencing technique. Because small ionic current signals are incompatible with fast translocation speeds, and because large-scale integration of nanopores for direct ionic-current sequencing is technically difficult, alternative methods relying on integrated DNA sensors have been proposed, such as capacitive coupling or tunnelling current; none, however, had been experimentally demonstrated. Here we show, for the first time, an amplified sensor signal recorded from a nanowire-nanopore field effect transistor sensor during DNA translocation. Independent multi-channel recording was also demonstrated for the first time. Our results suggest that the signal arises from a highly localized potential change caused by DNA translocation under non-balanced buffer conditions. Given that this method may produce larger signals for smaller nanopores, we hope our experiment can be a starting point for a new generation of nanopore sequencing devices with larger signals, higher bandwidth and large-scale multiplexing capability, finally realizing the ultimate goal of low-cost, high-throughput sequencing.
Jenkins, Cheryl; Chapman, Toni A.; Micallef, Jessica L.; Reynolds, Olivia L.
2012-01-01
Parasitoid detection and identification is a necessary step in the development and implementation of fruit fly biological control strategies employing parasitoid augmentive release. In recent years, DNA-based methods have been used to identify natural enemies of pest species where morphological differentiation is problematic. Molecular techniques also offer a considerable advantage over traditional morphological methods of fruit fly and parasitoid discrimination as well as within-host parasitoid identification, which currently relies on dissection of immature parasitoids from the host, or lengthy and labour-intensive rearing methods. Here we review recent research focusing on the use of molecular strategies for fruit fly and parasitoid detection and differentiation and discuss the implications of these studies on fruit fly management. PMID:26466628
Cho, Il-Hoon; Ku, Seockmo
2017-09-30
The development of novel and high-tech solutions for rapid, accurate, and non-laborious microbial detection methods is imperative to improve the global food supply. Such solutions have begun to address the need for microbial detection that is faster and more sensitive than existing methodologies (e.g., classic culture enrichment methods). Multiple reviews report the technical functions and structures of conventional microbial detection tools. These tools, used to detect pathogens in food and food homogenates, were designed via qualitative analysis methods. The inherent disadvantage of these analytical methods is the necessity for specimen preparation, which is a time-consuming process. While some literature describes the challenges and opportunities to overcome the technical issues related to food industry legal guidelines, there is a lack of reviews of the current trials to overcome technological limitations related to sample preparation and microbial detection via nano and micro technologies. In this review, we primarily explore current analytical technologies, including metallic and magnetic nanomaterials, optics, electrochemistry, and spectroscopy. These techniques rely on the early detection of pathogens via enhanced analytical sensitivity and specificity. In order to introduce the potential combination and comparative analysis of various advanced methods, we also reference a novel sample preparation protocol that uses microbial concentration and recovery technologies. This technology has the potential to expedite the pre-enrichment step that precedes the detection process.
Field induced transient current in one-dimensional nanostructure
NASA Astrophysics Data System (ADS)
Sako, Tokuei; Ishida, Hiroshi
2018-07-01
Field-induced transient current in one-dimensional nanostructures has been studied using a model of an electron confined in a 1D attractive Gaussian potential subjected both to electrodes at the terminals and to an ultrashort pulsed oscillatory electric field with central frequency ω and FWHM pulse width Γ. The time propagation of the electron wave packet has been simulated by integrating the time-dependent Schrödinger equation directly, relying on the second-order symplectic integrator method. The transient current has been calculated as the flux of the probability density of the escaping wave packet emitted from the downstream side of the confining potential. When a static bias field E0 is suddenly applied, the resultant transient current shows an oscillatory decay with time, followed by a minimum structure, before converging to a nearly constant value. The ω-dependence of the integrated transient current induced by the pulsed electric field shows an asymmetric resonance line shape for large Γ, while it shows a fringe pattern on the spectral line profile for small Γ. These observations are rationalized on the basis of the energy-level structure and lifetimes of the quasibound states in the bias-field-modified confining potential obtained by the complex-scaling Fourier grid Hamiltonian method.
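A minimal sketch of this kind of simulation follows, using Strang split-operator propagation (one second-order symplectic scheme for the TDSE) of a packet in a 1D Gaussian well with a suddenly applied static bias, and the probability current j = Im(psi* dpsi/dx) evaluated at a downstream probe point. Atomic units; the well, bias, and grid parameters are illustrative assumptions, not the paper's values.

```python
import numpy as np

N, L, dt, steps = 1024, 200.0, 0.05, 2000
dx = L / N
x = np.linspace(-L / 2, L / 2, N, endpoint=False)
k = 2 * np.pi * np.fft.fftfreq(N, d=dx)

V = -0.3 * np.exp(-x**2 / 8.0) - 0.01 * x     # Gaussian well + static bias
psi = np.exp(-x**2 / 4.0).astype(complex)     # initial bound-like packet
psi /= np.sqrt((np.abs(psi)**2).sum() * dx)   # normalize

half_v = np.exp(-0.5j * dt * V)               # half-step potential kick
kin = np.exp(-0.5j * dt * k**2)               # full kinetic step (m = 1)
probe = np.searchsorted(x, 30.0)              # downstream "electrode"

flux = []
for _ in range(steps):
    psi = half_v * psi                        # Strang splitting: V/2, T, V/2
    psi = np.fft.ifft(kin * np.fft.fft(psi))
    psi = half_v * psi
    dpsi = (psi[probe + 1] - psi[probe - 1]) / (2 * dx)
    flux.append(np.imag(np.conj(psi[probe]) * dpsi))  # j = Im(psi* dpsi/dx)

print("peak transient current (a.u.):", float(np.max(flux)))
```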
Dingwall, Kylie M; Pinkerton, Jennifer; Lindeman, Melissa A
2013-01-31
Achieving culturally fair assessments of cognitive functioning for Aboriginal people is difficult owing to a scarcity of appropriately validated tools for use with this group. As a result, some Aboriginal people with cognitive impairments may lack fair and equitable access to services. The objective of this study was to examine current clinical practice in the Northern Territory regarding cognitive assessment for Aboriginal people, thereby providing some guidance for clinicians new to this practice setting. Qualitative enquiry was used to describe the practice context, reasons for assessment, and current practices in assessing cognition for Aboriginal Australians. Semi-structured interviews were conducted with 22 clinicians working with Aboriginal clients in central and northern Australia. Results pertaining to assessment methods are reported. A range of standardised tests were utilised, with little consistency across clinical practice. Nevertheless, it was recognised that such tests bear severe limitations, requiring some modification and significant caution in their interpretation. Clinicians relied heavily on informal assessment or observation, contextual information and clinical judgement. Cognitive tests developed specifically for Aboriginal people are urgently needed. In the absence of appropriate, validated tests, clinicians have relied on and modified a range of standardised and informal assessments, whilst recognising their severe limitations. Past clinical training has not prepared clinicians adequately for assessing Aboriginal clients, and experience and clinical judgement were considered crucial for fair interpretation of test scores. Interpretation guidelines may assist inexperienced clinicians in considering whether they are achieving fair assessments of cognition for Aboriginal clients.
Developing an Automated Method for Detection of Operationally Relevant Ocean Fronts and Eddies
NASA Astrophysics Data System (ADS)
Rogers-Cotrone, J. D.; Cadden, D. D. H.; Rivera, P.; Wynn, L. L.
2016-02-01
Since the early 1990s, the U.S. Navy has utilized an observation-based process for the identification of frontal systems and eddies. These Ocean Feature Assessments (OFA) rely on trained analysts to identify and position ocean features using satellite-observed sea surface temperatures. Meanwhile, as enhancement and expansion of the Navy's HYbrid Coordinate Ocean Model (HYCOM) and Regional Navy Coastal Ocean Model (RNCOM) domains have proceeded, the Naval Oceanographic Office (NAVO) has provided Tactical Oceanographic Feature Assessments (TOFA) that are based on data-validated model output but also rely on analyst identification of significant features. A recently completed project has migrated OFA production to the ArcGIS-based Acoustic Reach-back Cell Ocean Analysis Suite (ARCOAS), enabling the use of additional observational datasets and significantly decreasing production time; however, it has highlighted inconsistencies inherent in this analyst-based identification process. Current efforts focus on developing an automated method for detecting operationally significant fronts and eddies that integrates model output and observational data on a global scale. Previous attempts to employ techniques from the scientific community have been unable to meet the production tempo at NAVO. Thus, a system is required that incorporates existing techniques (Marr-Hildreth, Okubo-Weiss, etc.) with internally developed feature identification methods derived from model-based physical and acoustic properties. Ongoing expansions to the ARCOAS toolset have shown promising early results.
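One of the classical ingredients named above is well defined and easy to sketch: the Okubo-Weiss parameter W = s_n^2 + s_s^2 - omega^2 computed from a gridded velocity field, where strongly negative W flags vorticity-dominated (eddy-like) regions. The synthetic field and threshold below are illustrative, not NAVO's operational settings.

```python
import numpy as np

def okubo_weiss(u, v, dx, dy):
    """Okubo-Weiss parameter from gridded velocities (axis 0 = y)."""
    du_dy, du_dx = np.gradient(u, dy, dx)
    dv_dy, dv_dx = np.gradient(v, dy, dx)
    s_n = du_dx - dv_dy            # normal strain
    s_s = dv_dx + du_dy            # shear strain
    omega = dv_dx - du_dy          # relative vorticity
    return s_n**2 + s_s**2 - omega**2

# Synthetic demo: a single Gaussian vortex from a streamfunction.
y, x = np.mgrid[-10:10:200j, -10:10:200j]
psi = np.exp(-(x**2 + y**2) / 8.0)
u = -np.gradient(psi, 0.1, axis=0)          # u = -dpsi/dy
v = np.gradient(psi, 0.1, axis=1)           # v =  dpsi/dx
W = okubo_weiss(u, v, 0.1, 0.1)

eddy_mask = W < -0.2 * W.std()              # common-style W0 threshold
print("eddy core cells flagged:", int(eddy_mask.sum()))
```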
Discovering Deeply Divergent RNA Viruses in Existing Metatranscriptome Data with Machine Learning
NASA Astrophysics Data System (ADS)
Rivers, A. R.
2016-02-01
Most sampling of RNA viruses and phages has been directed toward a narrow range of hosts and environments. Several marine metagenomic studies have examined the RNA viral fraction in aquatic samples and found a number of picornaviruses and uncharacterized sequences, but the lack of homology to known protein families has limited the discovery of new RNA viruses. We developed a computational method for identifying RNA viruses that uses the information in the codon transition probabilities of viral sequences to train a classifier. This approach does not rely on homology, yet it has higher information content than other reference-free methods such as tetranucleotide frequency. Training and validation with RefSeq data gave true positive and true negative rates of 99.6% and 99.5% on highly imbalanced validation sets (0.2% viruses) that, like the metatranscriptomes themselves, contain mostly non-viral sequences. To further test the method, a validation dataset of putative RNA virus genomes was compiled from metatranscriptomes by the presence of RNA-dependent RNA polymerase, an essential gene for RNA viruses; the classifier successfully identified 99.4% of those contigs as viral. This approach is currently being extended to screen all metatranscriptome data sequenced at the DOE Joint Genome Institute, presently 4.5 Gb of assembled data from 504 public projects representing a wide range of marine, aquatic and terrestrial environments.
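The featurization idea can be sketched compactly (the study's actual classifier and training corpus are not reproduced here): represent each sequence by its normalized codon-to-codon transition counts, a fixed 4096-dimensional vector, and train any standard classifier on labeled examples. The toy sequences below are random with an arbitrary compositional bias, purely for illustration.

```python
import itertools
import numpy as np
from sklearn.linear_model import LogisticRegression

CODONS = ["".join(c) for c in itertools.product("ACGT", repeat=3)]
INDEX = {c: i for i, c in enumerate(CODONS)}

def transition_features(seq):
    """4096-dim vector of normalized codon transition counts."""
    codons = [seq[i:i + 3] for i in range(0, len(seq) - 2, 3)]
    m = np.zeros((64, 64))
    for a, b in zip(codons, codons[1:]):
        if a in INDEX and b in INDEX:
            m[INDEX[a], INDEX[b]] += 1
    total = m.sum()
    return (m / total).ravel() if total else m.ravel()

# Toy training set: uniform "host-like" vs. composition-biased sequences.
rng = np.random.default_rng(0)
def random_seq(n, p):
    return "".join(rng.choice(list("ACGT"), size=n, p=p))

X, y = [], []
for _ in range(60):
    X.append(transition_features(random_seq(900, [0.25] * 4))); y.append(0)
    X.append(transition_features(random_seq(900, [0.4, 0.1, 0.1, 0.4]))); y.append(1)

clf = LogisticRegression(max_iter=2000).fit(np.array(X), y)
print("training accuracy:", clf.score(np.array(X), y))
```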
Label-free resistive-pulse cytometry.
Chapman, M R; Sohn, L L
2011-01-01
Numerous methods have recently been developed to characterize cells by size, shape, and specific cell-surface markers. Most of these methods rely upon exogenous labeling of the cells and are better suited to large cell populations (>10,000). Here, we review a label-free method of characterizing and screening cells based on the Coulter-counter technique of particle sizing: an individual cell transiting a microchannel (or "pore") causes a downward pulse in the measured DC current across that "pore". The pulse magnitude corresponds to cell size, the pulse width to the transit time needed for the cell to pass through the pore, and the pulse shape to how the cell traverses the pore (i.e., rolling or tumbling). When the pore is functionalized with an antibody specific to a surface epitope of interest, label-free screening for a specific marker is possible, as transient binding between the two results in a longer transit time than when the pore is unfunctionalized or functionalized with a nonspecific antibody. While this method cannot currently compete with traditional technology in terms of throughput, there are a number of applications for which it is better suited than current commercial cytometry systems. Applications include the rapid and nondestructive analysis of small cell populations (<100), which is not possible with current technology, and a platform for true point-of-care clinical diagnostics, owing to the simplicity of the device, low manufacturing costs, and ease of use. Copyright © 2011 Elsevier Inc. All rights reserved.
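The pulse analysis described maps directly onto a few lines of signal processing. The sketch below extracts pulse magnitude (cell size) and width (transit time) from a Coulter-style current trace; the trace and threshold are synthetic stand-ins, not data from the review.

```python
import numpy as np

def find_pulses(current, baseline, thresh):
    """Find downward pulses: magnitude tracks cell size, width (in
    samples) tracks transit time through the pore."""
    below = current < (baseline - thresh)          # inside a pulse
    edges = np.diff(below.astype(int))
    starts = np.where(edges == 1)[0] + 1
    ends = np.where(edges == -1)[0] + 1
    return [{"width_samples": e - s,
             "magnitude": baseline - current[s:e].min()}
            for s, e in zip(starts, ends)]

# Synthetic trace: two cells of different size and transit time.
rng = np.random.default_rng(3)
trace = 100.0 + 0.05 * rng.standard_normal(5000)   # baseline current
trace[1000:1040] -= 1.0                            # small, fast cell
trace[3000:3090] -= 2.5                            # larger, slower cell
for p in find_pulses(trace, 100.0, 0.5):
    print(p)
```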
Probing interferometric parallax with interplanetary spacecraft
NASA Astrophysics Data System (ADS)
Rodeghiero, G.; Gini, F.; Marchili, N.; Jain, P.; Ralston, J. P.; Dallacasa, D.; Naletto, G.; Possenti, A.; Barbieri, C.; Franceschini, A.; Zampieri, L.
2017-07-01
We describe an experimental scenario for testing a novel method to measure the distance and proper motion of astronomical sources. The method is based on multi-epoch observations of amplitude or intensity correlations between separate receiving systems. This technique, called Interferometric Parallax, efficiently exploits phase information that has traditionally been overlooked. The test case we discuss combines amplitude correlations of signals from deep-space interplanetary spacecraft with those from distant galactic and extragalactic radio sources, with the goal of estimating the interplanetary spacecraft distance. Interferometric parallax relies on the detection of wavefront curvature effects in signals collected by pairs of separate receiving systems. The method shows promising potential relative to current techniques when the target is unresolved from the background reference sources. Developments in this field might lead to the construction of an independent, geometrical cosmic distance ladder using a dedicated project and future-generation instruments. We present a conceptual overview supported by numerical estimates of its performance applied to a spacecraft orbiting within the Solar System. Simulations support the feasibility of measurements with a simple and time-saving observational scheme using current facilities.
Broad and Inconsistent Muscle Food Classification Is Problematic for Dietary Guidance in the U.S.
O’Connor, Lauren E.; Campbell, Wayne W.; Woerner, Dale R.; Belk, Keith E.
2017-01-01
Dietary recommendations regarding consumption of muscle foods, such as red meat, processed meat, poultry or fish, largely rely on current dietary intake assessment methods. This narrative review summarizes how U.S. intake values for various types of muscle foods are grouped and estimated via methods that include: (1) food frequency questionnaires; (2) food disappearance data from the U.S. Department of Agriculture Economic Research Service; and (3) dietary recall information from the National Health and Nutrition Examination Survey data. These reported methods inconsistently classify muscle foods into groups, such as those previously listed, which creates discrepancies in estimated intakes. Researchers who classify muscle foods into these groups do not consistently consider nutrient content, which has implications for scientific conclusions and dietary recommendations. Consequently, these factors demonstrate the need for a more universal muscle food classification system. Further specification of this system would improve the accuracy and precision with which researchers can classify muscle foods in nutrition research. Future multidisciplinary collaboration is needed to develop a new classification system via a systematic review protocol of the current literature. PMID:28926963
Broadening the interface bandwidth in simulation based training
NASA Technical Reports Server (NTRS)
Somers, Larry E.
1989-01-01
Currently, most computer-based simulations rely exclusively on computer-generated graphics to create the simulation. When training is involved, the method almost exclusively used to display information to the learner is text displayed on the cathode ray tube. MICROEXPERT Systems is concentrating on broadening the communications bandwidth between the computer and the user by employing a novel approach to video image storage combined with sound and voice output. An expert system is used to combine and control the presentation of analog video, sound, and voice output with computer-based graphics and text. Researchers are currently involved in the development of several graphics-based user interfaces for NASA, the U.S. Army, and the U.S. Navy. Here, the focus is on the human factors considerations, software modules, and hardware components being used to develop these interfaces.
Adoptive Cell Transfer Therapy
Dudley, Mark E.; Rosenberg, Steven A.
2008-01-01
Adoptive cell transfer therapy has developed into a potent and effective treatment for patients with metastatic melanoma. Current application of this therapy relies on the ex vivo generation of highly active, highly avid tumor-reactive lymphocyte cultures from endogenous tumor infiltrating lymphocytes or on the genetic engineering of cells using antigen receptor genes to express de novo tumor antigen recognition. When anti-tumor lymphocyte cultures are administered to autologous patients with high dose interleukin-2 following a lymphodepleting conditioning regimen, the cells can expand in vivo, traffic to tumor, and mediate tumor regression and durable objective clinical responses. Current investigation seeks to improve the methods for generating and administering the lymphocyte cultures, and future clinical trials aim to improve durable response rates and extend the patient populations that are candidates for treatment. PMID:18083376
An Ultrasonographic Periodontal Probe
NASA Astrophysics Data System (ADS)
Bertoncini, C. A.; Hinders, M. K.
2010-02-01
Periodontal disease, commonly known as gum disease, affects millions of people. The current method of detecting periodontal pocket depth is painful, invasive, and inaccurate. As an alternative to manual probing, an ultrasonographic periodontal probe is being developed to use ultrasound echo waveforms to measure periodontal pocket depth, which is the main measure of periodontal disease. Wavelet transforms and pattern classification techniques are implemented in artificial intelligence routines that can automatically detect pocket depth. The main pattern classification technique used here, called a binary classification algorithm, compares test objects with only two possible pocket depth measurements at a time and relies on dimensionality reduction for the final determination. This method correctly identifies up to 90% of the ultrasonographic probe measurements within the manual probe's tolerance.
Optimal 2D-SIM reconstruction by two filtering steps with Richardson-Lucy deconvolution.
Perez, Victor; Chang, Bo-Jui; Stelzer, Ernst Hans Karl
2016-11-16
Structured illumination microscopy relies on reconstruction algorithms to yield super-resolution images. Artifacts can arise in the reconstruction and affect the image quality. Current reconstruction methods involve a parametrized apodization function and a Wiener filter. Empirically tuning the parameters in these functions can minimize artifacts, but such an approach is subjective and produces volatile results. We present a robust and objective method that yields optimal results by two straightforward filtering steps with Richardson-Lucy-based deconvolutions. We provide a resource to identify artifacts in 2D-SIM images by analyzing two main reasons for artifacts, out-of-focus background and a fluctuating reconstruction spectrum. We show how the filtering steps improve images of test specimens, microtubules, yeast and mammalian cells.
Optimal 2D-SIM reconstruction by two filtering steps with Richardson-Lucy deconvolution
NASA Astrophysics Data System (ADS)
Perez, Victor; Chang, Bo-Jui; Stelzer, Ernst Hans Karl
2016-11-01
Structured illumination microscopy relies on reconstruction algorithms to yield super-resolution images. Artifacts can arise in the reconstruction and affect the image quality. Current reconstruction methods involve a parametrized apodization function and a Wiener filter. Empirically tuning the parameters in these functions can minimize artifacts, but such an approach is subjective and produces volatile results. We present a robust and objective method that yields optimal results by two straightforward filtering steps with Richardson-Lucy-based deconvolutions. We provide a resource to identify artifacts in 2D-SIM images by analyzing two main reasons for artifacts, out-of-focus background and a fluctuating reconstruction spectrum. We show how the filtering steps improve images of test specimens, microtubules, yeast and mammalian cells.
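Since both records above center on Richardson-Lucy deconvolution, a minimal 2-D implementation of the classic multiplicative update is sketched below for orientation; the SIM-specific filtering steps of the paper are more involved, and the iteration count and stabilizing constant are illustrative.

```python
import numpy as np
from scipy.signal import fftconvolve

def richardson_lucy(image, psf, n_iter=30):
    """Classic Richardson-Lucy deconvolution (2-D).

    Update rule: estimate <- estimate * [(image / (estimate (*) psf)) (*) psf_flipped],
    where (*) denotes convolution."""
    estimate = np.full_like(image, image.mean(), dtype=float)
    psf_flip = psf[::-1, ::-1]                  # mirrored PSF for the correction step
    for _ in range(n_iter):
        blurred = fftconvolve(estimate, psf, mode="same")
        ratio = image / (blurred + 1e-12)       # small constant avoids division by zero
        estimate *= fftconvolve(ratio, psf_flip, mode="same")
    return estimate
```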
Scalable Track Detection in SAR CCD Images
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chow, James G; Quach, Tu-Thach
Existing methods to detect vehicle tracks in coherent change detection images, a product of combining two synthetic aperture radar images taken at different times of the same scene, rely on simple, fast models to label track pixels. These models, however, are often too simple to capture natural track features such as continuity and parallelism. We present a simple convolutional network architecture consisting of a series of 3-by-3 convolutions to detect tracks. The network is trained end-to-end to learn natural track features entirely from data. The network is computationally efficient and improves the F-score on a standard dataset to 0.988, up from 0.907 obtained by the current state-of-the-art method.
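The abstract specifies only a series of 3-by-3 convolutions trained end-to-end; a minimal PyTorch sketch of such an architecture is given below, with depth and channel widths as illustrative assumptions.

```python
import torch.nn as nn

class TrackNet(nn.Module):
    """Stack of 3x3 convolutions producing per-pixel track logits.

    Depth and channel counts are assumptions; the source only states a
    series of 3-by-3 convolutions trained end-to-end."""
    def __init__(self, channels=16, depth=6):
        super().__init__()
        layers = [nn.Conv2d(1, channels, 3, padding=1), nn.ReLU()]
        for _ in range(depth - 2):
            layers += [nn.Conv2d(channels, channels, 3, padding=1), nn.ReLU()]
        layers.append(nn.Conv2d(channels, 1, 3, padding=1))   # per-pixel logits
        self.net = nn.Sequential(*layers)

    def forward(self, x):      # x: (N, 1, H, W) CCD image
        return self.net(x)     # train with nn.BCEWithLogitsLoss against track masks
```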
Measuring the speed of light with baryon acoustic oscillations.
Salzano, Vincenzo; Dąbrowski, Mariusz P; Lazkoz, Ruth
2015-03-13
In this Letter, we describe a new method that uses baryon acoustic oscillations (BAO) to derive a constraint on the possible variation of the speed of light. The method relies on the fact that there is a simple relation between the maximum of the angular diameter distance (D(A)) and the Hubble function (H) evaluated at the same maximum-condition redshift, which includes the speed of light c. We note the close analogy of the BAO probe with a laboratory experiment: here D(A) plays the role of a standard (cosmological) ruler, and H^{-1}, with the dimension of time, that of a (cosmological) clock. We evaluate whether current or future missions such as Euclid are sensitive enough to detect any variation of c.
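The relation the method exploits follows from the definition of the angular diameter distance: in a flat universe D_A(z) = (1/(1+z)) * c * Integral_0^z dz'/H(z'), so setting dD_A/dz = 0 at the maximum gives D_A(z_M) H(z_M) = c. A quick numeric check in a fiducial flat LCDM model (parameter values assumed for illustration):

```python
import numpy as np

c = 299792.458                       # speed of light, km/s
H0, Om = 70.0, 0.3                   # illustrative flat-LCDM parameters

def H(z):
    return H0 * np.sqrt(Om * (1 + z)**3 + (1 - Om))

z = np.linspace(0.0, 4.0, 4001)
# comoving distance via cumulative trapezoidal integration of c/H
Dc = np.concatenate(([0.0],
                     np.cumsum(np.diff(z) * 0.5 * (c / H(z[1:]) + c / H(z[:-1])))))
Da = Dc / (1 + z)                    # angular diameter distance, Mpc

i = np.argmax(Da)                    # maximum-condition redshift z_M
print(z[i], Da[i] * H(z[i]) / c)     # ratio -> 1, recovering c at the maximum
```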
Food and forensic molecular identification: update and challenges.
Teletchea, Fabrice; Maudet, Celia; Hänni, Catherine
2005-07-01
The need for accurate and reliable methods for animal species identification has steadily increased during past decades, particularly with the recent food scares and the overall crisis of biodiversity primarily resulting from the huge ongoing illegal traffic of endangered species. A relatively new biotechnological field, known as species molecular identification, based on the amplification and analysis of DNA, offers promising solutions. Indeed, despite the fact that retrieval and analysis of DNA in processed products is a real challenge, numerous technically consistent methods are now available and allow the detection of animal species in almost any organic substrate. However, this field is currently facing a turning point and should rely more on knowledge primarily from three fundamental fields--paleogenetics, molecular evolution and systematics.
NASA Technical Reports Server (NTRS)
Kim, Jong Dae (Inventor); Nagarajaiah, Satish (Inventor); Barrera, Enrique V. (Inventor); Dharap, Prasad (Inventor); Zhiling, Li (Inventor)
2010-01-01
The present invention is directed toward devices comprising carbon nanotubes that are capable of detecting displacement, impact, stress, and/or strain in materials, methods of making such devices, methods for sensing/detecting/monitoring displacement, impact, stress, and/or strain via carbon nanotubes, and various applications for such methods and devices. The devices and methods of the present invention all rely on mechanically-induced electronic perturbations within the carbon nanotubes to detect and quantify such stress/strain. Such detection and quantification can rely on techniques which include, but are not limited to, electrical conductivity/conductance and/or resistivity/resistance detection/measurements, thermal conductivity detection/measurements, electroluminescence detection/measurements, photoluminescence detection/measurements, and combinations thereof. All such techniques rely on an understanding of how such properties change in response to mechanical stress and/or strain.
Harnessing Aptamers to Overcome Challenges in Gluten Detection
Miranda-Castro, Rebeca; de-los-Santos-Álvarez, Noemí; Miranda-Ordieres, Arturo J.; Lobo-Castañón, María Jesús
2016-01-01
Celiac disease is a lifelong autoimmune disorder triggered by foods containing gluten, the storage protein in wheat, rye, and barley. The rapidly escalating number of patients diagnosed with this disease poses a great challenge to both the food industry and authorities to guarantee food safety for all. Therefore, intensive efforts are being made to establish minimal disease-eliciting doses of gluten and consequently to improve gluten-free labeling. These efforts depend to a high degree on the availability of methods capable of detecting the protein in food samples at levels as low as possible. Current analytical approaches rely on the use of antibodies as selective recognition elements. With limited sensitivity, these methods exhibit some deficiencies that compromise the accuracy of the obtained results. Aptamers provide an ideal alternative for designing biosensors for fast and selective measurement of gluten in foods. This article highlights the challenges in gluten detection, the current status of the use of aptamers for solving this problem, and what remains to be done to move these systems into commercial applications. PMID:27104578
Applying Formal Methods to NASA Projects: Transition from Research to Practice
NASA Technical Reports Server (NTRS)
Othon, Bill
2009-01-01
NASA project managers attempt to manage risk by relying on mature, well-understood process and technology when designing spacecraft. In the case of crewed systems, the margin for error is even tighter and leads to risk aversion. But as we look to future missions to the Moon and Mars, the complexity of the systems will increase as the spacecraft and crew work together with less reliance on Earth-based support. NASA will be forced to look for new ways to do business. Formal methods technologies can help NASA develop complex but cost effective spacecraft in many domains, including requirements and design, software development and inspection, and verification and validation of vehicle subsystems. To realize these gains, the technologies must be matured and field-tested so that they are proven when needed. During this discussion, current activities used to evaluate FM technologies for Orion spacecraft design will be reviewed. Also, suggestions will be made to demonstrate value to current designers, and mature the technology for eventual use in safety-critical NASA missions.
Handwriting Examination: Moving from Art to Science
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jarman, K.H.; Hanlen, R.C.; Manzolillo, P.A.
In this document, we present a method for validating the premises and methodology of forensic handwriting examination. This method is intuitively appealing because it relies on quantitative measurements currently used qualitatively by FDEs in making comparisons, and it is scientifically rigorous because it exploits the power of multivariate statistical analysis. This approach uses measures of both central tendency and variation to construct a profile for a given individual. (Central tendency and variation are important for characterizing an individual's writing, and both are currently used by FDEs in comparative analyses.) Once constructed, different profiles are then compared for individuality using cluster analysis; they are grouped so that profiles within a group cannot be differentiated from one another based on the measured characteristics, whereas profiles between groups can. The cluster analysis procedure used here exploits the power of multivariate hypothesis testing. The result is not only a profile grouping but also an indication of the statistical significance of the groups generated.
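A minimal sketch of the kind of multivariate hypothesis test underlying such profile grouping is given below: a two-sample Hotelling T-squared comparison of two writers' measurement profiles, assuming equal covariances. The document does not specify the exact test, so this is illustrative only.

```python
import numpy as np
from scipy import stats

def hotelling_t2(X, Y):
    """Two-sample Hotelling T^2 test.

    X, Y: (samples x characteristics) measurement profiles of two writers.
    Returns the F statistic and p-value for the hypothesis that the two
    profiles share a common mean vector."""
    n1, p = X.shape
    n2, _ = Y.shape
    d = X.mean(axis=0) - Y.mean(axis=0)
    S = ((n1 - 1) * np.cov(X, rowvar=False) +
         (n2 - 1) * np.cov(Y, rowvar=False)) / (n1 + n2 - 2)  # pooled covariance
    t2 = (n1 * n2) / (n1 + n2) * d @ np.linalg.solve(S, d)
    f = t2 * (n1 + n2 - p - 1) / (p * (n1 + n2 - 2))          # T^2 -> F conversion
    pval = stats.f.sf(f, p, n1 + n2 - p - 1)
    return f, pval
```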
Faulon, Jean-Loup; Misra, Milind; Martin, Shawn; ...
2007-11-23
Motivation: Identifying protein enzymatic or pharmacological activities is an important area of research in biology and chemistry. Biological and chemical databases are increasingly being populated with linkages between protein sequences and chemical structures. Additionally, there is now sufficient information to apply machine-learning techniques to predict interactions between chemicals and proteins at a genome scale. Current machine-learning techniques use as input either protein sequences and structures or chemical information. We propose here a method to infer protein–chemical interactions using heterogeneous input consisting of both protein sequence and chemical information. Results: Our method relies on expressing proteins and chemicals with a common cheminformatics representation. We demonstrate our approach by predicting whether proteins can catalyze reactions not present in training sets. We also predict whether a given drug can bind a target, in the absence of prior binding information for that drug and target. Lastly, such predictions cannot be made with current machine-learning techniques requiring binding information for individual reactions or individual targets.
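The paper's common cheminformatics representation is not reproduced here; as a simplified stand-in, the sketch below concatenates a protein's amino-acid composition with a Morgan fingerprint of the chemical and feeds the joint vector to a standard classifier. The feature choices and names are assumptions for illustration only.

```python
import numpy as np
from collections import Counter
from rdkit import Chem
from rdkit.Chem import AllChem
from sklearn.svm import SVC

AMINO = "ACDEFGHIKLMNPQRSTVWY"

def pair_features(seq, smiles, n_bits=512):
    """Joint feature vector for a (protein, chemical) pair: amino-acid
    composition + Morgan fingerprint -- a simplified stand-in for the
    paper's shared signature representation."""
    counts = Counter(seq)
    prot = np.array([counts[a] / len(seq) for a in AMINO])
    mol = Chem.MolFromSmiles(smiles)
    fp = AllChem.GetMorganFingerprintAsBitVect(mol, radius=2, nBits=n_bits)
    return np.concatenate([prot, np.array(list(fp), dtype=float)])

# X = np.vstack([pair_features(s, m) for s, m in known_pairs]); y = labels
# SVC(kernel="rbf").fit(X, y) can then score unseen protein-chemical pairs.
```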
Iterative Addition of Kinetic Effects to Cold Plasma RF Wave Solvers
NASA Astrophysics Data System (ADS)
Green, David; Berry, Lee; RF-SciDAC Collaboration
2017-10-01
The hot nature of fusion plasmas requires a wave-vector-dependent conductivity tensor for accurate calculation of wave heating and current drive. Traditional methods for calculating the linear, kinetic full-wave plasma response rely on a spectral method, such that the wave-vector-dependent conductivity fits naturally within the numerical method. These methods have seen much success for application to the well-confined core plasma of tokamaks. However, quantitative prediction for high-power RF antenna designs for fusion applications requires resolving the geometric details of the antenna and other plasma-facing surfaces, for which the Fourier spectral method is ill-suited. An approach to enabling the addition of kinetic effects to the more versatile finite-difference and finite-element cold-plasma full-wave solvers was presented previously, in which an operator-split iterative method was outlined. Here we expand on this approach, examine convergence, and present a simplified kinetic current estimator for rapidly updating the right-hand side of the wave equation with kinetic corrections. This research used resources of the Oak Ridge Leadership Computing Facility at the Oak Ridge National Laboratory, which is supported by the Office of Science of the U.S. Department of Energy under Contract No. DE-AC05-00OR22725.
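A schematic of the operator-split fixed point described above: the cold-plasma solver is reused at each iteration, with the kinetic correction moved to the right-hand side as an effective current. `solve_cold`, `kinetic_current` and `cold_current` are placeholders standing in for the full-wave solve and the conductivity evaluations, not an actual solver API.

```python
def iterate_kinetic(solve_cold, kinetic_current, cold_current, j_antenna,
                    n_iter=20, tol=1e-6):
    """Operator-split iteration sketch: E_{n+1} solves the cold-plasma
    wave equation driven by the antenna current plus the difference
    between kinetic and cold plasma currents evaluated at E_n."""
    E = solve_cold(j_antenna)                        # cold-plasma first guess
    for _ in range(n_iter):
        dj = kinetic_current(E) - cold_current(E)    # kinetic correction current
        E_new = solve_cold(j_antenna + dj)
        if abs(E_new - E).max() < tol * abs(E_new).max():
            return E_new                             # converged
        E = E_new
    return E
```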
Controlled assembly of jammed colloidal shells on fluid droplets.
Subramaniam, Anand Bala; Abkarian, Manouk; Stone, Howard A
2005-07-01
Assembly of colloidal particles on fluid interfaces is a promising technique for synthesizing two-dimensional microcrystalline materials useful in fields as diverse as biomedicine, materials science, mineral flotation and food processing. Current approaches rely on bulk emulsification methods, require further chemical and thermal treatments, and are restrictive with respect to the materials used. The development of methods that exploit the great potential of interfacial assembly for producing tailored materials have been hampered by the lack of understanding of the assembly process. Here we report a microfluidic method that allows direct visualization and understanding of the dynamics of colloidal crystal growth on curved interfaces. The crystals are periodically ejected to form stable jammed shells, which we refer to as colloidal armour. We propose that the energetic barriers to interfacial crystal growth and organization can be overcome by targeted delivery of colloidal particles through hydrodynamic flows. Our method allows an unprecedented degree of control over armour composition, size and stability.
Controlled assembly of jammed colloidal shells on fluid droplets
NASA Astrophysics Data System (ADS)
Subramaniam, Anand Bala; Abkarian, Manouk; Stone, Howard A.
2005-07-01
Assembly of colloidal particles on fluid interfaces is a promising technique for synthesizing two-dimensional microcrystalline materials useful in fields as diverse as biomedicine, materials science, mineral flotation and food processing. Current approaches rely on bulk emulsification methods, require further chemical and thermal treatments, and are restrictive with respect to the materials used. The development of methods that exploit the great potential of interfacial assembly for producing tailored materials have been hampered by the lack of understanding of the assembly process. Here we report a microfluidic method that allows direct visualization and understanding of the dynamics of colloidal crystal growth on curved interfaces. The crystals are periodically ejected to form stable jammed shells, which we refer to as colloidal armour. We propose that the energetic barriers to interfacial crystal growth and organization can be overcome by targeted delivery of colloidal particles through hydrodynamic flows. Our method allows an unprecedented degree of control over armour composition, size and stability.
Li, Qi-Gang; He, Yong-Han; Wu, Huan; Yang, Cui-Ping; Pu, Shao-Yan; Fan, Song-Qing; Jiang, Li-Ping; Shen, Qiu-Shuo; Wang, Xiao-Xiong; Chen, Xiao-Qiong; Yu, Qin; Li, Ying; Sun, Chang; Wang, Xiangting; Zhou, Jumin; Li, Hai-Peng; Chen, Yong-Bin; Kong, Qing-Peng
2017-01-01
Heterogeneity in transcriptional data hampers the identification of differentially expressed genes (DEGs) and understanding of cancer, essentially because current methods rely on cross-sample normalization and/or distribution assumptions, both sensitive to heterogeneous values. Here, we developed a new method, Cross-Value Association Analysis (CVAA), which overcomes this limitation and is more robust to heterogeneous data than the other methods. Applying CVAA to a more complex pan-cancer dataset containing 5,540 transcriptomes discovered numerous new DEGs and many previously rarely explored pathways/processes; some of them were validated, both in vitro and in vivo, to be crucial in tumorigenesis, e.g., alcohol metabolism (ADH1B), chromosome remodeling (NCAPH) and complement system (Adipsin). Together, we present a sharper tool to navigate large-scale expression data and gain new mechanistic insights into tumorigenesis.
Intrinsic Frequency and the Single Wave Biopsy
Petrasek, Danny; Pahlevan, Niema M.; Tavallali, Peyman; Rinderknecht, Derek G.; Gharib, Morteza
2015-01-01
Insulin resistance is the hallmark of classical type II diabetes. In addition, insulin resistance plays a central role in metabolic syndrome, which astonishingly affects 1 out of 3 adults in North America. The insulin resistance state can precede the manifestation of diabetes and hypertension by years. Insulin resistance is correlated with a low-grade inflammatory condition, thought to be induced by obesity as well as other conditions. Currently, the methods to measure and monitor insulin resistance, such as the homeostatic model assessment and the euglycemic insulin clamp, can be impractical, expensive, and invasive. Abundant evidence relates increased pulse pressure, pulse wave velocity (PWV), and vascular dysfunction to insulin resistance. We introduce a potential method of assessing insulin resistance that relies on a novel signal-processing algorithm, the intrinsic frequency method (IFM). The method requires a single pulse pressure wave, thus the term “wave biopsy.” PMID:26183600
DOE Office of Scientific and Technical Information (OSTI.GOV)
D'Ambra, P.; Vassilevski, P. S.
2014-05-30
Adaptive Algebraic Multigrid (or Multilevel) Methods (αAMG) are introduced to improve the robustness and efficiency of classical algebraic multigrid methods in dealing with problems where no a priori knowledge of, or assumptions on, the near-null kernel of the underlying matrix are available. Recently we proposed an adaptive (bootstrap) AMG method, αAMG, aimed at obtaining a composite solver with a desired convergence rate. Each new multigrid component relies on a current (general) smooth vector and exploits pairwise aggregation based on weighted matching in a matrix graph to define a new automatic, general-purpose coarsening process, which we refer to as "the compatible weighted matching". In this work, we present results that broaden the applicability of our method to different finite element discretizations of elliptic PDEs. In particular, we consider systems arising from displacement methods in linear elasticity problems and saddle-point systems that appear in the application of the mixed method to Darcy problems.
Text mining of cancer-related information: review of current status and future directions.
Spasić, Irena; Livsey, Jacqueline; Keane, John A; Nenadić, Goran
2014-09-01
This paper reviews the research literature on text mining (TM) with the aim to find out (1) which cancer domains have been the subject of TM efforts, (2) which knowledge resources can support TM of cancer-related information and (3) to what extent systems that rely on knowledge and computational methods can convert text data into useful clinical information. These questions were used to determine the current state of the art in this particular strand of TM and suggest future directions in TM development to support cancer research. A review of the research on TM of cancer-related information was carried out. A literature search was conducted on the Medline database as well as IEEE Xplore and ACM digital libraries to address the interdisciplinary nature of such research. The search results were supplemented with the literature identified through Google Scholar. A range of studies have proven the feasibility of TM for extracting structured information from clinical narratives such as those found in pathology or radiology reports. In this article, we provide a critical overview of the current state of the art for TM related to cancer. The review highlighted a strong bias towards symbolic methods, e.g. named entity recognition (NER) based on dictionary lookup and information extraction (IE) relying on pattern matching. The F-measure of NER ranges between 80% and 90%, while that of IE for simple tasks is in the high 90s. To further improve the performance, TM approaches need to deal effectively with idiosyncrasies of the clinical sublanguage such as non-standard abbreviations as well as a high degree of spelling and grammatical errors. This requires a shift from rule-based methods to machine learning following the success of similar trends in biological applications of TM. Machine learning approaches require large training datasets, but clinical narratives are not readily available for TM research due to privacy and confidentiality concerns. This issue remains the main bottleneck for progress in this area. In addition, there is a need for a comprehensive cancer ontology that would enable semantic representation of textual information found in narrative reports. Copyright © 2014 The Authors. Published by Elsevier Ireland Ltd. All rights reserved.
Ethier, Jean-François; Dameron, Olivier; Curcin, Vasa; McGilchrist, Mark M; Verheij, Robert A; Arvanitis, Theodoros N; Taweel, Adel; Delaney, Brendan C; Burgun, Anita
2013-01-01
Objective: Biomedical research increasingly relies on the integration of information from multiple heterogeneous data sources. Despite the fact that structural and terminological aspects of interoperability are interdependent and rely on a common set of requirements, current efforts typically address them in isolation. We propose a unified ontology-based knowledge framework to facilitate interoperability between heterogeneous sources, and investigate if using the LexEVS terminology server is a viable implementation method. Materials and methods: We developed a framework based on an ontology, the general information model (GIM), to unify structural models and terminologies, together with relevant mapping sets. This allowed uniform access to these resources within LexEVS to facilitate interoperability by various components and data sources from implementing architectures. Results: Our unified framework has been tested in the context of the EU Framework Program 7 TRANSFoRm project, where it was used to achieve data integration in a retrospective diabetes cohort study. The GIM was successfully instantiated in TRANSFoRm as the clinical data integration model, and necessary mappings were created to support effective information retrieval for software tools in the project. Conclusions: We present a novel, unifying approach to address interoperability challenges in heterogeneous data sources, by representing structural and semantic models in one framework. Systems using this architecture can rely solely on the GIM that abstracts over both the structure and coding. Information models, terminologies and mappings are all stored in LexEVS and can be accessed in a uniform manner (implementing the HL7 CTS2 service functional model). The system is flexible and should reduce the effort needed from data source personnel for implementing and managing the integration. PMID:23571850
In flight image processing on multi-rotor aircraft for autonomous landing
NASA Astrophysics Data System (ADS)
Henry, Richard, Jr.
An estimated $6.4 billion was spent during 2013 on developing drone technology around the world, and spending is expected to double in the next decade. However, drone applications typically require strong pilot skills, safety awareness, responsibility, and adherence to regulations during flight. If the flight control process could be made safer and more reliable in terms of landing, it would be possible to develop a wider range of applications. The objective of this research effort is to describe the design and evaluation of a fully autonomous Unmanned Aerial System (UAS), specifically a four-rotor aircraft, commonly known as a quadcopter, for precise landing applications. Full landing autonomy is achieved by image processing during flight for target recognition, employing the open source library OpenCV. In addition, all imaging data are processed by a single embedded computer that estimates a relative position with respect to the target landing pad. Results show a 67.88% reduction in the average offset error in comparison to the current return-to-launch (RTL) method, which relies only on GPS positioning. The present work validates the need to rely on image processing for precise landing applications rather than on inexact, low-cost commercial GPS positioning alone.
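A minimal OpenCV sketch of the kind of pipeline the abstract describes: segment a dark, high-contrast landing pad, take the largest blob's centroid, and convert the pixel offset into a lateral position via a pinhole camera model. The marker appearance, camera parameters, and function names are assumptions, not the author's implementation.

```python
import cv2
import numpy as np

def pad_offset(frame, altitude_m, fx, fy):
    """Lateral offset (m) of a dark landing pad from the camera axis.

    fx, fy: focal lengths in pixels; altitude_m: height estimate from
    the flight controller (assumed available)."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    _, mask = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY_INV + cv2.THRESH_OTSU)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None                               # pad not in view
    pad = max(contours, key=cv2.contourArea)      # assume pad is the largest dark blob
    m = cv2.moments(pad)
    cx, cy = m["m10"] / m["m00"], m["m01"] / m["m00"]
    h, w = gray.shape
    # pinhole model: metres = pixel offset * altitude / focal length
    dx = (cx - w / 2) * altitude_m / fx
    dy = (cy - h / 2) * altitude_m / fy
    return dx, dy
```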
NASA Astrophysics Data System (ADS)
Rouillon, M.; Taylor, M. P.; Dong, C.
2016-12-01
This research assesses the advantages of integrating field-portable X-ray fluorescence (pXRF) technology to reduce the risk and increase the confidence of decision making for metal-contaminated site assessments. Metal-contaminated sites are often highly heterogeneous and require a high sampling density to accurately characterize the distribution and concentration of contaminants. Current regulatory assessment approaches rely on a small number of samples processed using standard wet-chemistry methods. In New South Wales (NSW), Australia, the current notification trigger for characterizing metal-contaminated sites requires the upper 95% confidence interval of the site mean to equal or exceed the relevant guidelines. The method's low 'minimum' sampling requirements can misclassify sites due to the heterogeneous nature of soil contamination, leading to inaccurate decision making. To address this issue, we propose integrating in-field pXRF analysis with the established sampling method to overcome these sampling limitations. This approach increases the minimum sampling resolution and reduces the 95% CI of the site mean. In-field pXRF analysis at contamination hotspots enhances sample resolution efficiently and without the need to return to the site. In this study, the current and proposed pXRF site assessment methods are compared at five heterogeneous metal-contaminated sites by analysing the spatial distribution of contaminants, the 95% confidence intervals of site means, and the sampling and analysis uncertainty associated with each method. Finally, an analysis of the costs associated with both the current and proposed methods is presented to demonstrate the advantages of incorporating pXRF into metal-contaminated site assessments. The data show that pXRF-integrated site assessments allow faster, cost-efficient characterisation of metal-contaminated sites with greater confidence for decision making.
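The notification trigger described above reduces to a one-sided confidence limit on the site mean, which makes the benefit of extra pXRF samples easy to see: the limit shrinks roughly as 1/sqrt(n). A minimal computation (the readings below are invented for illustration):

```python
import numpy as np
from scipy import stats

def upper_95cl(concentrations):
    """One-sided upper 95% confidence limit of the site mean (t-based)."""
    x = np.asarray(concentrations, dtype=float)
    n = x.size
    se = x.std(ddof=1) / np.sqrt(n)          # standard error of the mean
    return x.mean() + stats.t.ppf(0.95, n - 1) * se

lead_ppm = [310, 450, 1220, 95, 760, 530]    # illustrative pXRF lead readings
print(upper_95cl(lead_ppm))                  # compare against the guideline for the land use
```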
ERIC Educational Resources Information Center
Gage, Nicholas A.; Lewis, Timothy J.; Stichter, Janine P.
2012-01-01
Of the myriad practices currently utilized for students with disabilities, particularly students with or at risk for emotional and/or behavioral disorder (EBD), functional behavior assessment (FBA) is a practice with an emerging solid research base. However, the FBA research base relies on single-subject design (SSD) and synthesis has relied on…
A monoclonal antibody-based ELISA for differential diagnosis of 2009 pandemic H1N1
USDA-ARS?s Scientific Manuscript database
The swine-origin 2009 pandemic H1N1 virus (pdmH1N1) is genetically related to North American swine H1 influenza viruses and unrelated to human seasonal H1 viruses. Currently, specific diagnosis of pdmH1N1 relies on RT-PCR. In order to develop an assay that does not rely on amplification of the viral...
Near Critical/Supercritical Carbon Dioxide Extraction for Treating Contaminated Bilgewater
2000-02-24
Historically, the Navy has relied on gravimetric separation to remove oily contaminants from bilgewater. Most ships contain one...continuously changes the orientation of the separator with respect to gravity, lowering the effectiveness of a separation process that relies on subtle
Teaching and assessment of professional attitudes in UK dental schools - commentary.
Field, J; Ellis, J; Abbas, C; Germain, P
2010-08-01
The General Dental Council expects professionalism to be embedded and assessed throughout the undergraduate dental programme, so curricula need to accommodate these recommendations. A straw poll of UK dental schools provided a basis for understanding the current methods of teaching and assessing professionalism. All respondent schools recognised the importance of professionalism and reported that it was taught and assessed within their curriculum. For most, the methods involved were largely traditional, relying on lectures and seminars taught throughout the course. The most common form of assessment was grading and formative feedback after a clinical encounter. Whilst clinical skills and knowledge can perhaps be readily taught and assessed using traditional methods, those involved in education are challenged to identify and implement effective methods of not only teaching but also assessing professionalism. A variety of standalone methods need to be developed to assess professionalism; this will, in turn, allow the effectiveness of teaching methods to be assessed.
DKIST Adaptive Optics System: Simulation Results
NASA Astrophysics Data System (ADS)
Marino, Jose; Schmidt, Dirk
2016-05-01
The 4 m class Daniel K. Inouye Solar Telescope (DKIST), currently under construction, will be equipped with an ultra high order solar adaptive optics (AO) system. The requirements and capabilities of such a solar AO system are beyond those of any other solar AO system currently in operation, so we must rely on solar AO simulations to estimate and quantify its performance. We present performance estimation results for the DKIST AO system obtained with a new solar AO simulation tool. This simulation tool is a flexible and fast end-to-end solar AO simulator which produces accurate solar AO simulations while taking advantage of current multi-core computer technology. It relies on full imaging simulations of the extended-field Shack-Hartmann wavefront sensor (WFS), which directly includes important secondary effects such as field-dependent distortions and varying contrast of the WFS sub-aperture images.
NASA Astrophysics Data System (ADS)
Topping, David; Alibay, Irfan; Bane, Michael
2017-04-01
To predict the evolving concentration, chemical composition and ability of aerosol particles to act as cloud droplets, we rely on numerical modeling. Mechanistic models attempt to account for the movement of compounds between the gaseous and condensed phases at a molecular level. This 'bottom up' approach is designed to increase our fundamental understanding. However, such models rely on predicting the properties of molecules and subsequent mixtures. For partitioning between the gaseous and condensed phases this includes: saturation vapour pressures; Henry's law coefficients; activity coefficients; diffusion coefficients and reaction rates. Current gas phase chemical mechanisms predict the existence of potentially millions of individual species. Within a dynamic ensemble model, this is often used as justification for neglecting computationally expensive process descriptions. Indeed, even at the single aerosol particle level it has so far been impossible to embed fully coupled representations of process-level knowledge for all possible compounds; models instead rely on heavily parameterised descriptions, which makes it hard to quantify the true sensitivity to uncertainties in molecular properties. Relying on emerging numerical frameworks designed for the changing landscape of high-performance computing (HPC), in this study we focus specifically on the ability to capture activity coefficients in liquid solutions using the UNIFAC method. Activity coefficients are often neglected under the largely untested hypothesis that they are simply too computationally expensive to include in dynamic frameworks. We present results demonstrating increased computational efficiency for a range of typical scenarios, including a profiling of the energy use resulting from reliance on such computations. As the landscape of HPC changes, the latter aspect is important to consider in future applications.
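For orientation, UNIFAC splits ln(gamma_i) into a combinatorial (size/shape) part and a residual (group-interaction) part, with the residual term dominating the computational cost. A sketch of the combinatorial (Staverman-Guggenheim) term is given below, with illustrative r and q parameters; it is not the study's optimized implementation.

```python
import numpy as np

def unifac_combinatorial(x, r, q, z=10.0):
    """Combinatorial part of ln(gamma) in UNIFAC (Staverman-Guggenheim).

    x: mole fractions; r, q: molecular volume and surface-area
    parameters built from group contributions."""
    x, r, q = (np.asarray(v, dtype=float) for v in (x, r, q))
    phi = r * x / np.dot(r, x)        # volume fractions
    theta = q * x / np.dot(q, x)      # surface-area fractions
    l = z / 2 * (r - q) - (r - 1)
    return (np.log(phi / x) + z / 2 * q * np.log(theta / phi)
            + l - phi / x * np.dot(x, l))

# binary mixture with illustrative r, q values
print(unifac_combinatorial([0.4, 0.6], r=[2.57, 0.92], q=[2.59, 1.40]))
```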
Propulsion Trade Studies for Spacecraft Swarm Mission Design
NASA Technical Reports Server (NTRS)
Dono, Andres; Plice, Laura; Mueting, Joel; Conn, Tracie; Ho, Michael
2018-01-01
Spacecraft swarms constitute a challenge from an orbital mechanics standpoint. Traditional mission design involves the application of methodical processes where predefined maneuvers for an individual spacecraft are planned in advance. This approach does not scale to spacecraft swarms consisting of many satellites orbiting in close proximity; non-deterministic maneuvers cannot be preplanned due to the large number of units and the uncertainties associated with their differential deployment and orbital motion. For autonomous small-sat swarms in LEO, we investigate two approaches for controlling the relative motion of a swarm. The first method involves modified miniature phasing maneuvers, where maneuvers are prescribed that cancel the differential delta-V of each CubeSat's deployment vector. The second method relies on artificial potential functions (APFs) to contain the spacecraft within a volumetric boundary and avoid collisions. Performance results and required delta-V budgets are summarized, indicating that each method has advantages and drawbacks for particular applications. The miniature phasing maneuvers are more predictable and sustainable. The APF approach provides a more responsive and distributed performance, but at considerable propellant cost. After considering current state-of-the-art CubeSat propulsion systems, we conclude that the first approach is feasible, but the modified APF method requires too much control authority to be enabled by current propulsion systems.
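A minimal sketch of the APF idea referenced above: each spacecraft follows the negative gradient of a potential combining a containment well centered on the swarm volume and pairwise repulsion from close neighbors. Gains, cutoff distance, and names are illustrative assumptions.

```python
import numpy as np

def apf_velocity(pos, neighbors, center, k_att=1e-3, k_rep=1e3, d0=50.0):
    """Velocity command (m/s) from an artificial potential function:
    quadratic well toward the containment center plus inverse-square
    repulsion from neighbors closer than d0 (m). Gains are illustrative."""
    v = -k_att * (pos - center)                  # containment term
    for p in neighbors:
        d = pos - p
        dist = np.linalg.norm(d)
        if 0 < dist < d0:                        # collision-avoidance term
            v += k_rep * d / dist**3
    return v
```

The repulsive delta-V grows rapidly as spacing shrinks, consistent with the abstract's observation that the APF approach carries a considerable propellant cost.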
Asymmetric nanopore membranes: Single molecule detection and unique transport properties
NASA Astrophysics Data System (ADS)
Bishop, Gregory William
Biological systems rely on the transport properties of transmembrane channels. Such pores can display selective transport by allowing the passage of certain ions or molecules while rejecting others. Recent advances in nanoscale fabrication have allowed the production of synthetic analogs of such channels. Synthetic nanopores (pores with a limiting dimension of 1--100 nm) can be produced in a variety of materials by several different methods. In the Martin group, we have been exploring the track-etch method to produce asymmetric nanopores in thin films of polymeric or crystalline materials. Asymmetric nanopores are of particular interest due to their ability to serve as ion-current rectifiers. This means that when a membrane that contains such a pore or collection of pores is used to separate identical portions of electrolyte solution, the magnitude of the ionic current will depend not only on the magnitude of the applied potential (as expected) but also the polarity. Ion-current rectification is characterized by an asymmetric current--potential response. Here, the interesting transport properties of asymmetric nanopores (ion-current rectification and the related phenomenon of electroosmotic flow rectification) are explored. The effects of pore shape and pore density on these phenomena are investigated. Membranes that contain a single nanopore can serve as platforms for the single-molecule sensing technique known as resistive pulse sensing. The resistive-pulse sensing method is based on the Coulter principle. Thus, the selectivity of the technique is based largely upon size, making the analysis of mixtures by this method difficult in many cases. Here, the surface of a single nanopore membrane is modified with a molecular recognition agent in an attempt to obtain a more selective resistive-pulse sensor for a specific analyte.
Effect of Resin-modified Glass Ionomer Cement Dispensing/Mixing Methods on Mechanical Properties.
Sulaiman, T A; Abdulmajeed, A A; Altitinchi, A; Ahmed, S N; Donovan, T E
2018-03-23
Resin-modified glass ionomer cements (RMGIs) are often used for luting indirect restorations. Hand-mixing traditional cements demands significant time and may be technique sensitive. Efforts have been made by manufacturers to introduce the same cement using different dispensing/mixing methods. It is not known what effects these changes may have on the mechanical properties of the dental cement. The purpose of this study was to evaluate the mechanical properties (diametral tensile strength [DTS], compressive strength [CS], and fracture toughness [FT]) of RMGIs with different dispensing/mixing systems. The RMGI specimens (n=14) -- RelyX Luting (hand mix), RelyX Luting Plus (clicker-hand mix), RelyX Luting Plus (automix) (3M ESPE), GC Fuji PLUS (capsule-automix), and GC FujiCEM 2 (automix) (GC) -- were prepared for each mechanical test -- DTS, CS (ISO 9917-1), and FT (ISO 6872; single-edge V-notched beam method) -- and examined after thermocycling (n=7/subgroup) for 20,000 cycles. Specimens were mounted and loaded with a universal testing machine until failure occurred. Two-/one-way analysis of variance followed by the Tukey honestly significant difference post hoc test was used to analyze data for statistical significance (p<0.05). The interaction effect of dispensing/mixing method and thermocycling was significant only for the CS test of the GC group (p<0.05). The different dispensing/mixing methods had no effect on the DTS of the tested cements. The CS of GC Fuji PLUS was significantly higher than that of the automix version (p<0.05). The FT decreased significantly when switching from RelyX (hand mix) to RelyX Luting Plus (clicker-hand mix) and to RelyX Luting Plus (automix) (p<0.05). Except in the case of the DTS of the GC group and the CS of GC Fuji PLUS, thermocycling significantly reduced the mechanical properties of the RMGI cements (p<0.05). Introducing alternative dispensing/mixing methods for RMGIs to reduce time and technique sensitivity may affect mechanical properties and is brand dependent.
Gopal, Hemavathi; Hassan, Hassan K.; Rodríguez-Pérez, Mario A.; Toé, Laurent D.; Lustigman, Sara; Unnasch, Thomas R.
2012-01-01
Background: Entomological surveys of Simulium vectors are an important component in the criteria used to determine if Onchocerca volvulus transmission has been interrupted and if focal elimination of the parasite has been achieved. However, because infection in the vector population is quite rare in areas where control has succeeded, large numbers of flies need to be examined to certify transmission interruption. Currently, this is accomplished through PCR pool screening of large numbers of flies. The efficiency of this process is limited by the size of the pools that may be screened, which is in turn determined by the constraints imposed by the biochemistry of the assay. The current method of DNA purification from pools of vector black flies relies upon silica adsorption. This method can be applied to screen pools containing a maximum of 50 individuals (for the Latin American vectors) or 100 individuals (for the African vectors). Methodology/Principal Findings: We have evaluated an alternative method of DNA purification for pool screening of black flies which relies upon oligonucleotide capture of Onchocerca volvulus genomic DNA from homogenates prepared from pools of Latin American and African vectors. The oligonucleotide capture assay was shown to reliably detect one O. volvulus infective larva in pools containing 200 African or Latin American flies, representing a two- to four-fold improvement over the conventional assay. The capture assay requires an amount of technical time equivalent to the conventional assay, resulting in a two- to four-fold reduction in labor costs per insect assayed, and reduces reagent costs to $3.81 per pool of 200 flies, or less than $0.02 per insect assayed. Conclusions/Significance: The oligonucleotide capture assay represents a substantial improvement in the procedure used to detect parasite prevalence in the vector population, a major metric employed in the process of certifying the elimination of onchocerciasis. PMID:22724041
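For context on why pool size drives cost: with n pools of k flies each and d positive pools, the standard maximum-likelihood estimate of per-insect prevalence (assuming a perfect assay) is p_hat = 1 - (1 - d/n)^(1/k), so doubling the screenable pool size halves the number of assays per insect examined. A minimal calculation:

```python
def pool_prevalence(n_pools, n_positive, pool_size):
    """MLE of per-insect infection prevalence from pooled PCR results,
    assuming perfect assay sensitivity and specificity."""
    return 1.0 - (1.0 - n_positive / n_pools) ** (1.0 / pool_size)

# e.g. 2 positive pools out of 500 pools of 200 flies each
print(pool_prevalence(500, 2, 200))   # ~2e-05, about 2 infected flies per 100,000
```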
Adapt-Mix: learning local genetic correlation structure improves summary statistics-based analyses
Park, Danny S.; Brown, Brielin; Eng, Celeste; Huntsman, Scott; Hu, Donglei; Torgerson, Dara G.; Burchard, Esteban G.; Zaitlen, Noah
2015-01-01
Motivation: Approaches to identifying new risk loci, training risk prediction models, imputing untyped variants and fine-mapping causal variants from summary statistics of genome-wide association studies are playing an increasingly important role in the human genetics community. Current summary statistics-based methods rely on global ‘best guess’ reference panels to model the genetic correlation structure of the dataset being studied. This approach, especially in admixed populations, has the potential to produce misleading results, ignores variation in local structure and is not feasible when appropriate reference panels are missing or small. Here, we develop a method, Adapt-Mix, that combines information across all available reference panels to produce estimates of local genetic correlation structure for summary statistics-based methods in arbitrary populations. Results: We applied Adapt-Mix to estimate the genetic correlation structure of both admixed and non-admixed individuals using simulated and real data. We evaluated our method by measuring the performance of two summary statistics-based methods: imputation and joint-testing. When using our method as opposed to the current standard of ‘best guess’ reference panels, we observed a 28% decrease in mean-squared error for imputation and a 73.7% decrease in mean-squared error for joint-testing. Availability and implementation: Our method is publicly available in a software package called ADAPT-Mix available at https://github.com/dpark27/adapt_mix. Contact: noah.zaitlen@ucsf.edu PMID:26072481
Kerogen extraction from subterranean oil shale resources
Looney, Mark Dean; Lestz, Robert Steven; Hollis, Kirk; Taylor, Craig; Kinkead, Scott; Wigand, Marcus
2010-09-07
The present invention is directed to methods for extracting a kerogen-based product from subsurface (oil) shale formations, wherein such methods rely on fracturing and/or rubblizing portions of said formations so as to enhance their fluid permeability, and wherein such methods further rely on chemically modifying the shale-bound kerogen so as to render it mobile. The present invention is also directed at systems for implementing at least some of the foregoing methods. Additionally, the present invention is also directed to methods of fracturing and/or rubblizing subsurface shale formations and to methods of chemically modifying kerogen in situ so as to render it mobile.
Kerogen extraction from subterranean oil shale resources
Looney, Mark Dean [Houston, TX; Lestz, Robert Steven [Missouri City, TX; Hollis, Kirk [Los Alamos, NM; Taylor, Craig [Los Alamos, NM; Kinkead, Scott [Los Alamos, NM; Wigand, Marcus [Los Alamos, NM
2009-03-10
The present invention is directed to methods for extracting a kerogen-based product from subsurface (oil) shale formations, wherein such methods rely on fracturing and/or rubblizing portions of said formations so as to enhance their fluid permeability, and wherein such methods further rely on chemically modifying the shale-bound kerogen so as to render it mobile. The present invention is also directed at systems for implementing at least some of the foregoing methods. Additionally, the present invention is also directed to methods of fracturing and/or rubblizing subsurface shale formations and to methods of chemically modifying kerogen in situ so as to render it mobile.
Viger, Mathieu L; Sheng, Wangzhong; McFearin, Cathryn L; Berezin, Mikhail Y; Almutairi, Adah
2013-11-10
Though accurately evaluating the kinetics of release is critical for validating newly designed therapeutic carriers for in vivo applications, few methods yet exist for release measurement in real time and without the need for any sample preparation. Many of the current approaches (e.g. chromatographic methods, absorption spectroscopy, or NMR spectroscopy) rely on isolation of the released material from the loaded vehicles, which require additional sample purification and can lead to loss of accuracy when probing fast kinetics of release. In this study we describe the use of time-resolved fluorescence for in situ monitoring of small molecule release kinetics from biodegradable polymeric drug delivery systems. This method relies on the observation that fluorescent reporters being released from polymeric drug delivery systems possess distinct excited-state lifetime components, reflecting their different environments in the particle suspensions, i.e., confined in the polymer matrices or free in the aqueous environment. These distinct lifetimes enable real-time quantitative mapping of the relative concentrations of dye in each population to obtain precise and accurate temporal information on the release profile of particular carrier/payload combinations. We found that fluorescence lifetime better distinguishes subtle differences in release profiles (e.g. differences associated with dye loading) than conventional steady-state fluorescence measurements, which represent the averaged dye behavior over the entire scan. Given the method's applicability to both hydrophobic and hydrophilic cargo, it could be employed to model the release of any drug-carrier combination. Copyright © 2013 Elsevier B.V. All rights reserved.
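The two-population picture above maps naturally onto a biexponential decay, I(t) = A1*exp(-t/tau1) + A2*exp(-t/tau2), with one lifetime for encapsulated dye and one for free dye. A minimal fitting sketch follows; the lifetime guesses, the assumption that free dye has the longer lifetime, and the use of the amplitude fraction as a release proxy are illustrative, not the paper's exact analysis.

```python
import numpy as np
from scipy.optimize import curve_fit

def biexp(t, a1, tau1, a2, tau2):
    return a1 * np.exp(-t / tau1) + a2 * np.exp(-t / tau2)

def released_fraction(t, decay, tau_free=4.0, tau_bound=1.0):
    """Fit a biexponential to a time-resolved fluorescence decay and
    return the amplitude fraction of the 'free dye' component as a
    proxy for the released population. Lifetime guesses (ns) are
    illustrative starting values for the fit."""
    p0 = (decay.max() / 2, tau_free, decay.max() / 2, tau_bound)
    (a1, tau1, a2, tau2), _ = curve_fit(biexp, t, decay, p0=p0)
    # assumed here: the longer-lifetime component corresponds to free dye
    free, bound = (a1, a2) if tau1 > tau2 else (a2, a1)
    return free / (free + bound)
```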
Wilson, Anna; Goldberg, Tony; Marcquenski, Susan; Olson, Wendy; Goetz, Frederick; Hershberger, Paul; Hart, Lucas M.; Toohey-Kurth, Kathy
2014-01-01
Viral hemorrhagic septicemia virus (VHSV) is a target of surveillance by many state and federal agencies in the United States. Currently, the detection of VHSV relies on virus isolation, which is lethal to fish and indicates only the current infection status. A serological method is required to ascertain prior exposure. Here, we report two serologic tests for VHSV that are nonlethal, rapid, and species independent, a virus neutralization (VN) assay and a blocking enzyme-linked immunosorbent assay (ELISA). The results show that the VN assay had a specificity of 100% and sensitivity of 42.9%; the anti-nucleocapsid-blocking ELISA detected nonneutralizing VHSV antibodies at a specificity of 88.2% and a sensitivity of 96.4%. The VN assay and ELISA are valuable tools for assessing exposure to VHSV.
Defect Detection and Segmentation Framework for Remote Field Eddy Current Sensor Data
2017-01-01
Remote-Field Eddy-Current (RFEC) technology is often used as a Non-Destructive Evaluation (NDE) method to prevent water pipe failures. By analyzing the RFEC data, it is possible to quantify the corrosion present in pipes. Quantifying the corrosion involves detecting defects and extracting their depth and shape. For large sections of pipelines, this can be extremely time-consuming if performed manually. Automated approaches are therefore well motivated. In this article, we propose an automated framework to locate and segment defects in individual pipe segments, starting from raw RFEC measurements taken over large pipelines. The framework relies on a novel feature to robustly detect these defects and a segmentation algorithm applied to the deconvolved RFEC signal. The framework is evaluated using both simulated and real datasets, demonstrating its ability to efficiently segment the shape of corrosion defects. PMID:28984823
Defect Detection and Segmentation Framework for Remote Field Eddy Current Sensor Data.
Falque, Raphael; Vidal-Calleja, Teresa; Miro, Jaime Valls
2017-10-06
Remote-Field Eddy-Current (RFEC) technology is often used as a Non-Destructive Evaluation (NDE) method to prevent water pipe failures. By analyzing the RFEC data, it is possible to quantify the corrosion present in pipes. Quantifying the corrosion involves detecting defects and extracting their depth and shape. For large sections of pipelines, this can be extremely time-consuming if performed manually. Automated approaches are therefore well motivated. In this article, we propose an automated framework to locate and segment defects in individual pipe segments, starting from raw RFEC measurements taken over large pipelines. The framework relies on a novel feature to robustly detect these defects and a segmentation algorithm applied to the deconvolved RFEC signal. The framework is evaluated using both simulated and real datasets, demonstrating its ability to efficiently segment the shape of corrosion defects.
[Evidence-based rehabilitation of mobility after stroke].
Dohle, C; Tholen, R; Wittenberg, H; Quintern, J; Saal, S; Stephan, K M
2016-10-01
Approximately two thirds of stroke patients initially suffer from at least impaired mobility, and various rehabilitation concepts have been proposed. Based on the current literature, which rehabilitation methods can be recommended for improvement of gait, gait velocity, gait distance and balance? A systematic literature search was carried out for randomized clinical studies and reviews with clinically relevant outcome variables, and recommendations were formulated separately for each target variable and for time after stroke. Restoration and improvement of gait function rely on a high number of repetitions of gait movements, which for more severely affected patients is preferentially machine-based. For improvement of gait velocity in less severely affected patients, intensive gait training does not necessarily rely on mechanical support. Gait distance can be improved by aerobic endurance exercises with a cardiovascular effect, which have to be performed in a functional context. Improvement of balance should be achieved by intensive functional gait training. Additional stimulation techniques are only effective when included in a functionally relevant training program. These guidelines not only provide recommendations for action but also provide pathophysiological insights into the functional restoration of stance and gait after stroke.
Terahertz pulsed imaging study of dental caries
NASA Astrophysics Data System (ADS)
Karagoz, Burcu; Altan, Hakan; Kamburoglu, Kıvanç
2015-07-01
Current diagnostic techniques in dentistry rely predominantly on X-rays to monitor dental caries. Terahertz Pulsed Imaging (TPI) has great potential for medical applications since it is a nondestructive imaging method that poses no ionization hazard to biological samples, owing to the low photon energy of THz radiation. Although THz radiation is strongly absorbed by water, teeth can still be investigated in three dimensions. Recent investigations suggest that this method can be used for the early identification of dental diseases and imperfections in the tooth structure without the hazards of techniques that rely on X-rays. We constructed a continuous wave (CW) and time-domain reflection-mode raster-scan THz imaging system that enables us to investigate various teeth samples in two or three dimensions. The samples comprised either slices of individual tooth samples or rows of teeth embedded in wax, and imaging was performed by scanning the sample across the focus of the THz beam. 2D images were generated by acquiring the intensity of the THz radiation at each pixel, while 3D images were generated by collecting the amplitude of the reflected signal at each pixel. After analyzing the measurements in both the spatial and frequency domains, the results show that the THz pulse is sensitive to variations in the structure of the samples, indicating that this method can be useful in detecting the presence of caries.
Yair, Simo; Ofer, Butnaro; Arik, Eisenkraft; Shai, Shrot; Yossi, Rosman; Tzvika, Dushnitsky; Amir, Krivoy
2008-01-01
One of the major challenges in dealing with chemical warfare agent (CWA) dispersal, whether in the battlefield or after a terror act, is decontamination and rehabilitation of any contaminated area. Organophosphates (OPs) are considered to be among the deadliest CWAs to date. Other OPs are used as pesticides in modern agriculture, and are considered environmentally hazardous. Current methods for OP decontamination are either dangerous or insufficiently effective. As a promising solution for this problem, bioremediation--the use of biocomponents for environmental remediation--is a potentially effective, safe, and environment-friendly method. The technology relies on several enzymatic mechanisms, and can be applied in various ways. We will review recent achievements and potential applications, such as biocatalyst-containing foams and an enzymatic sponge, for environmental as well as personal exterior decontamination.
Aero-acoustic Properties of Eroded Airfoils of Compressor Blades for Use in Non-invasive Diagnostics
NASA Astrophysics Data System (ADS)
Drãgan, Valeriu; Grad, Danuţa
2013-09-01
Current techniques for investigating the erosion of turbomachinery rely on visual inspection through borescopy. However, this requires shutting down the power plant to make the assessment, which leads to operational costs and difficulties. This paper aims to provide a method for monitoring the erosion state of a bladed power plant operated in dusty environments, such as deserts, by measuring changes in its acoustic spectrum. The method used for this study is numerical, and the findings suggest that there are significant modifications to both the flow field and the acoustic parameters as the blade becomes progressively eroded. This paves the way for the development of non-invasive, permanent, real-time diagnostics for turbine engines and power plants.
On the present and future of dissolution-DNP
NASA Astrophysics Data System (ADS)
Ardenkjaer-Larsen, Jan Henrik
2016-03-01
Dissolution-DNP is a method to create solutions of molecules with nuclear spin polarization close to unity. The signal enhancement of many orders of magnitude has enabled many new applications, in particular in vivo MR metabolic imaging. The method relies on solid-state dynamic nuclear polarization at low temperature, followed by dissolution to produce a room-temperature solution of highly polarized spins. This work describes the present and future of dissolution-DNP as seen by the author. The article describes some of the current trends in the field and outlines some of the areas where new ideas will make an impact. Most certainly, the future will take unpredicted directions, but hopefully the thoughts presented here will stimulate new ideas that can further advance the field.
Water vapour tomography using GPS phase observations: Results from the ESCOMPTE experiment
NASA Astrophysics Data System (ADS)
Nilsson, T.; Gradinarsky, L.; Elgered, G.
2007-10-01
Global Positioning System (GPS) tomography is a technique for estimating the 3-D structure of atmospheric water vapour using data from a dense local network of GPS receivers. Several current methods utilize estimates of slant wet delays between the GPS satellites and the receivers on the ground, which are difficult to obtain with millimetre accuracy from GPS observations. We present results of applying a new tomographic method to GPS data from the Expérience sur site pour contraindre les modèles de pollution atmosphérique et de transport d'émissions (ESCOMPTE) experiment in southern France. This method does not rely on any slant wet delay estimates; instead it uses the GPS phase observations directly. We show that the wet refractivity profiles estimated by this method are of the same accuracy or better compared with other tomographic methods. The results are in agreement with earlier simulations; for example, the profile information is limited above 4 km.
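To make the inversion concept concrete, the sketch below shows the standard voxel formulation that tomographic methods of this kind build on: each slant observation is modelled as a line integral of wet refractivity through a grid and the field is recovered by damped least squares. This is an illustration under assumptions, not the paper's direct-phase estimator (which avoids slant-delay intermediates); the path-length matrix and noise level here are invented.

```python
# Minimal voxel-based tomography sketch (illustrative; hypothetical geometry).
# y_i = sum_j A_ij * x_j, with A_ij the path length of ray i inside voxel j.
import numpy as np

rng = np.random.default_rng(0)
n_voxels, n_rays = 50, 200

A = rng.uniform(0.0, 1.0, size=(n_rays, n_voxels))   # hypothetical path lengths
x_true = np.exp(-np.linspace(0, 4, n_voxels))        # refractivity decaying with height
y = A @ x_true + rng.normal(0, 0.01, n_rays)         # noisy slant wet delays

lam = 0.1  # Tikhonov damping to stabilise the under-determined inversion
x_hat = np.linalg.solve(A.T @ A + lam * np.eye(n_voxels), A.T @ y)
print("rms error:", np.sqrt(np.mean((x_hat - x_true) ** 2)))
```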
A DERATING METHOD FOR THERAPEUTIC APPLICATIONS OF HIGH INTENSITY FOCUSED ULTRASOUND
Bessonova, O.V.; Khokhlova, V.A.; Canney, M.S.; Bailey, M.R.; Crum, L.A.
2010-01-01
Current methods of determining high intensity focused ultrasound (HIFU) fields in tissue rely on extrapolation of measurements in water assuming linear wave propagation both in water and in tissue. Neglecting nonlinear propagation effects in the derating process can result in significant errors. In this work, a new method based on scaling the source amplitude is introduced to estimate focal parameters of nonlinear HIFU fields in tissue. Focal values of acoustic field parameters in absorptive tissue are obtained from a numerical solution to a KZK-type equation and are compared to those simulated for propagation in water. Focal waveforms, peak pressures, and intensities are calculated over a wide range of source outputs and linear focusing gains. Our modeling indicates that for the high gain sources which are typically used in therapeutic medical applications, the focal field parameters derated with our method agree well with numerical simulation in tissue. The feasibility of the derating method is demonstrated experimentally in excised bovine liver tissue. PMID:20582159
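For orientation, the two derating ideas contrasted in this abstract can be summarised as follows. This is a hedged paraphrase, not the paper's full prescription: conventional linear derating scales the focal value measured in water by the path attenuation, whereas the proposed approach matches nonlinear focal fields by driving the water case at a reduced source amplitude.

```latex
% Linear derating (the conventional approach the abstract critiques):
p_{\mathrm{tissue}}^{\mathrm{focal}} \;\approx\; p_{\mathrm{water}}^{\mathrm{focal}}\, e^{-\alpha L}
% Source-amplitude scaling (the proposed idea, as we read the abstract):
% the tissue field driven at source pressure p_0 is approximated by the
% water field driven at the reduced source pressure
p_0^{\mathrm{water}} \;=\; p_0\, e^{-\alpha L},
% with alpha the tissue absorption coefficient and L the propagation depth.
```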
Identifying disease polymorphisms from case-control genetic association data.
Park, L
2010-12-01
In case-control association studies, it is typical to observe several associated polymorphisms in a gene region. Often the most significantly associated polymorphism is considered to be the disease polymorphism; however, it is not clear whether it is the disease polymorphism or there is more than one disease polymorphism in the gene region. Currently, there is no method that can handle these problems based on the linkage disequilibrium (LD) relationship between polymorphisms. To distinguish real disease polymorphisms from markers in LD, a method that can detect disease polymorphisms in a gene region has been developed. Relying on the LD between polymorphisms in controls, the proposed method utilizes model-based likelihood ratio tests to find disease polymorphisms. This method shows reliable Type I and Type II error rates when sample sizes are large enough, and works better with re-sequenced data. Applying this method to fine mapping using re-sequencing or dense genotyping data would provide important information regarding the genetic architecture of complex traits.
Grundy, H H; Reece, P; Buckley, M; Solazzo, C M; Dowle, A A; Ashford, D; Charlton, A J; Wadsley, M K; Collins, M J
2016-01-01
Gelatine is a component of a wide range of foods. It is manufactured as a by-product of the meat industry from bone and hide, mainly from bovine and porcine sources. Accurate food labelling enables consumers to make informed decisions about the food they buy. Since labelling currently relies heavily on due diligence involving a paper trail, there could be benefits in developing a reliable test method for the consumer industries to establish the species origin of gelatine. We present a method to determine the species origin of gelatines by peptide mass spectrometry. An evaluative comparison is also made with ELISA and PCR technologies. Commercial gelatines were found to contain undeclared species. Furthermore, undeclared bovine peptides were observed in commercial injection matrices. This analytical method could therefore support the food industry in determining the species authenticity of gelatine in foods. Crown Copyright © 2015. Published by Elsevier Ltd. All rights reserved.
Loukotková, Lucie; VonTungeln, Linda S; Vanlandingham, Michelle; da Costa, Gonçalo Gamboa
2018-01-01
According to the World Health Organization, the consumption of tobacco products is the single largest cause of preventable deaths in the world, exceeding the total aggregated number of deaths caused by diseases such as AIDS, tuberculosis, and malaria. An important element in the evaluation of the health risks associated with the consumption of tobacco products is the assessment of internal exposure to the tobacco constituents responsible for their addictive (e.g. nicotine) and carcinogenic (e.g. N-nitrosamines such as NNN and NNK) properties. However, the assessment of the serum levels of these compounds is often challenging from an analytical standpoint, in particular when limited sample volumes are available and low detection limits are required. Currently available analytical methods often rely on complex multi-step sample preparation procedures, which are prone to low analyte recoveries and ex vivo contamination due to the ubiquitous nature of these compounds as background contaminants. In order to circumvent these problems, we report a facile and highly sensitive method for the simultaneous quantification of nicotine, cotinine, NNN, and NNK in serum samples. The method relies on a simple "one pot" liquid-liquid extraction procedure and isotope dilution ultra-high-pressure liquid chromatography (UPLC) with hydrophilic interaction liquid chromatography (HILIC) coupled to tandem mass spectrometry. The method requires only 10 μL of serum and presents limits of quantification of 0.02 nmol (3000 pg/mL) for nicotine, 0.6 pmol (100 pg/mL) for cotinine, 0.05 pmol (10 pg/mL) for NNK, and 0.06 pmol (10 pg/mL) for NNN, making it appropriate for pharmacokinetic evaluations. Published by Elsevier B.V.
Sibley, Christopher D; Peirano, Gisele; Church, Deirdre L
2012-04-01
Clinical microbiology laboratories worldwide have historically relied on phenotypic methods (i.e., culture and biochemical tests) for the detection, identification and characterization of virulence traits (e.g., antibiotic resistance genes, toxins) of human pathogens. However, limitations to the implementation of molecular methods for human infectious disease testing are being rapidly overcome, allowing for the clinical evaluation and implementation of diverse technologies with expanding diagnostic capabilities. The advantages and limitations of molecular techniques including real-time polymerase chain reaction, partial or whole genome sequencing, molecular typing, microarrays, broad-range PCR and multiplexing will be discussed. Finally, terminal restriction fragment length polymorphism (T-RFLP) and deep sequencing are introduced as technologies at the clinical interface with the potential to dramatically enhance our ability to diagnose infectious diseases and better define the epidemiology and microbial ecology of a wide range of complex infections. Copyright © 2012 Elsevier B.V. All rights reserved.
Youngs, Noah; Penfold-Brown, Duncan; Drew, Kevin; Shasha, Dennis; Bonneau, Richard
2013-05-01
Computational biologists have demonstrated the utility of using machine learning methods to predict protein function from an integration of multiple genome-wide data types. Yet, even the best performing function prediction algorithms rely on heuristics for important components of the algorithm, such as choosing negative examples (proteins without a given function) or determining key parameters. The improper choice of negative examples, in particular, can hamper the accuracy of protein function prediction. We present a novel approach for choosing negative examples, using a parameterizable Bayesian prior computed from all observed annotation data, which also generates priors used during function prediction. We incorporate this new method into the GeneMANIA function prediction algorithm and demonstrate improved accuracy of our algorithm over current top-performing function prediction methods on the yeast and mouse proteomes across all metrics tested. Code and Data are available at: http://bonneaulab.bio.nyu.edu/funcprop.html
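The core idea of prior-based negative selection can be sketched compactly. The snippet below is a hedged illustration, not the published GeneMANIA implementation: the annotation data, the co-occurrence scoring rule, and the Beta smoothing parameters are all assumptions introduced for the example.

```python
# Hedged sketch of negative-example selection for protein function prediction:
# derive a prior P(target function | other annotations) from annotation data,
# then treat unannotated proteins with the lowest prior as negatives.
from collections import defaultdict

annotations = {                      # hypothetical protein -> GO-term sets
    "P1": {"GO:A", "GO:B"}, "P2": {"GO:A"}, "P3": {"GO:C"}, "P4": set(),
}
target = "GO:B"

# Co-occurrence counts: how often does `target` accompany each other term?
cooc, count = defaultdict(int), defaultdict(int)
for terms in annotations.values():
    for t in terms:
        count[t] += 1
        if target in terms:
            cooc[t] += 1

def prior(protein, alpha=1.0, beta=2.0):
    """Smoothed prior that `protein` has `target`, from its other annotations."""
    terms = annotations[protein] - {target}
    if not terms:
        return alpha / (alpha + beta)          # uninformed Beta prior mean
    return max((cooc[t] + alpha) / (count[t] + alpha + beta) for t in terms)

unlabeled = [p for p, ts in annotations.items() if target not in ts]
negatives = sorted(unlabeled, key=prior)        # lowest prior first
print(negatives)
```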
Bidirectional composition on lie groups for gradient-based image alignment.
Mégret, Rémi; Authesserre, Jean-Baptiste; Berthoumieu, Yannick
2010-09-01
In this paper, a new formulation based on bidirectional composition on Lie groups (BCL) for parametric gradient-based image alignment is presented. Contrary to conventional approaches, the BCL method takes advantage of the gradients of both template and current image without combining them a priori. Based on this bidirectional formulation, two methods are proposed and their relationship with state-of-the-art gradient-based approaches is fully discussed. The first one, i.e., the BCL method, relies on the compositional framework to provide the minimization of the compensated error with respect to an augmented parameter vector. The second one, the projected BCL (PBCL), corresponds to a close approximation of the BCL approach. A comparative study is carried out dealing with computational complexity, convergence rate and frequency of convergence. Numerical experiments using a conventional benchmark show the performance improvement, especially for asymmetric levels of noise, which is also discussed from a theoretical point of view.
NASA Astrophysics Data System (ADS)
Wong, S. K.; Chan, V. S.; Hinton, F. L.
2001-10-01
The classic solution of the linearized drift kinetic equations in neoclassical transport theory for large-aspect-ratio tokamak flux surfaces relies on the variational principle and the choice of "localized" distribution functions as trial functions [M.N. Rosenbluth et al., Phys. Fluids 15 (1972) 116]. Somewhat unclear in this approach are the nature and origin of the "localization" and whether the results obtained represent the exact leading terms in an asymptotic expansion in the inverse aspect ratio. Using the method of matched asymptotic expansions, we were able to derive the leading approximations to the distribution functions and demonstrated the asymptotic exactness of the existing results. The method is also applied to the calculation of angular momentum transport [M.N. Rosenbluth et al., Plasma Phys. and Contr. Nucl. Fusion Research, 1970, Vol. 1 (IAEA, Vienna, 1971) p. 495] and the current driven by electron cyclotron waves.
Well-tempered metadynamics converges asymptotically.
Dama, James F; Parrinello, Michele; Voth, Gregory A
2014-06-20
Metadynamics is a versatile and capable enhanced sampling method for the computational study of soft matter materials and biomolecular systems. However, over a decade of application and several attempts to give this adaptive umbrella sampling method a firm theoretical grounding prove that a rigorous convergence analysis is elusive. This Letter describes such an analysis, demonstrating that well-tempered metadynamics converges to the final state it was designed to reach and, therefore, that the simple formulas currently used to interpret the final converged state of tempered metadynamics are correct and exact. The results do not rely on any assumption that the collective variable dynamics are effectively Brownian or any idealizations of the hill deposition function; instead, they suggest new, more permissive criteria for the method to be well behaved. The results apply to tempered metadynamics with or without adaptive Gaussians or boundary corrections and whether the bias is stored approximately on a grid or exactly.
Estimating 1 min rain rate distributions from numerical weather prediction
NASA Astrophysics Data System (ADS)
Paulson, Kevin S.
2017-01-01
Internationally recognized prognostic models of rain fade on terrestrial and Earth-space EHF links rely fundamentally on distributions of 1 min rain rates. Currently, in Rec. ITU-R P.837-6, these distributions are generated using the Salonen-Poiares Baptista method where 1 min rain rate distributions are estimated from long-term average annual accumulations provided by numerical weather prediction (NWP). This paper investigates an alternative to this method based on the distribution of 6 h accumulations available from the same NWPs. Rain rate fields covering the UK, produced by the Nimrod network of radars, are integrated to estimate the accumulations provided by NWP, and these are linked to distributions of fine-scale rain rates. The proposed method makes better use of the available data. It is verified on 15 NWP regions spanning the UK, and the extension to other regions is discussed.
Spectral-spatial classification of hyperspectral image using three-dimensional convolution network
NASA Astrophysics Data System (ADS)
Liu, Bing; Yu, Xuchu; Zhang, Pengqiang; Tan, Xiong; Wang, Ruirui; Zhi, Lu
2018-01-01
Recently, hyperspectral image (HSI) classification has become a focus of research. However, the complex structure of an HSI makes feature extraction difficult to achieve. Most current methods build classifiers based on complex handcrafted features computed from the raw inputs. The design of an improved 3-D convolutional neural network (3D-CNN) model for HSI classification is described. This model extracts features from both the spectral and spatial dimensions through the application of 3-D convolutions, thereby capturing the important discrimination information encoded in multiple adjacent bands. The designed model views the HSI cube data altogether without relying on any pre- or postprocessing. In addition, the model is trained in an end-to-end fashion without any handcrafted features. The designed model was applied to three widely used HSI datasets. The experimental results demonstrate that the 3D-CNN-based method outperforms conventional methods even with limited labeled training samples.
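The essential mechanism, a convolution that slides over the spectral axis as well as the two spatial axes, can be shown in a few lines. The sketch below is in the spirit of the abstract only: the layer sizes, kernel shapes, and hyperparameters are assumptions, not the authors' architecture.

```python
# Minimal 3-D CNN sketch for spectral-spatial HSI classification.
# Input patches have shape (batch, 1, bands, height, width); Conv3d kernels
# span several adjacent bands, capturing spectral-spatial correlations.
import torch
import torch.nn as nn

class HSI3DCNN(nn.Module):
    def __init__(self, n_bands=103, patch=7, n_classes=9):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv3d(1, 8, kernel_size=(7, 3, 3), padding=(3, 1, 1)),
            nn.ReLU(),
            nn.Conv3d(8, 16, kernel_size=(7, 3, 3), padding=(3, 1, 1)),
            nn.ReLU(),
        )
        self.classifier = nn.Linear(16 * n_bands * patch * patch, n_classes)

    def forward(self, x):                  # x: (B, 1, bands, H, W)
        z = self.features(x)
        return self.classifier(z.flatten(1))

model = HSI3DCNN()
dummy = torch.randn(2, 1, 103, 7, 7)       # two 7x7 patches, 103 bands
print(model(dummy).shape)                  # -> torch.Size([2, 9])
```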
NASA Technical Reports Server (NTRS)
Rodriquez, Branelle; Shindo, David; Montgomery, Eliza
2013-01-01
The International Space Station (ISS) Program recognizes the risk of microbial contamination in its potable and non-potable water sources. The end of the Space Shuttle Program limited the ability to send up shock kits of biocides in the event of an outbreak. Currently, the United States Orbital Segment water system relies primarily on iodine to mitigate contamination concerns, which has been successful in remediating the small cases of contamination documented. However, a secondary method of disinfection is a necessary investment for future space flight. Over the past year, NASA Johnson Space Center has investigated the development of electrochemically generated systems for use on the ISS. These systems include hydrogen peroxide, ozone, sodium hypochlorite, and peracetic acid. To use these biocides on deployed water systems, NASA must understand the effect these biocides have on current ISS materials prior to proceeding with possible on-orbit applications. This paper will discuss the material testing that was conducted to assess the effects of the biocides on current ISS materials.
NASA Astrophysics Data System (ADS)
Zhang, Junmin; Chen, Zhang
2008-10-01
A new magnetohydrodynamic model for the nozzle arc, emphasizing the interaction of the arc with PTFE (polytetrafluoroethylene) vapour, is established based on the conservation equations. The interruption process of an auto-expansion circuit breaker is simulated numerically by the finite element method, and the influence of PTFE vapour on the arc is analysed with this model. The results reveal that the flow field inside the arc chamber is determined by the arc current, the arcing time, the nozzle arc and the clogging of its thermal boundary. The establishment of quenching pressure relies on both the SF6 gas and the PTFE vapour that absorb arc energy in the nozzle. The PTFE vapour leads to an obvious increase in the pressure of the nozzle arc and a decrease in the arc temperature, but it raises the arc temperature at current zero and slows the rate at which the arc temperature falls as the current decreases.
Mastren, Tara; Radchenko, Valery; Bach, Hong T.; ...
2017-06-01
Rhenium-186g (t1/2 = 3.72 d) is a β– emitting isotope suitable for theranostic applications. Current production methods rely on reactor production by way of the reaction 185Re(n,γ)186gRe, which results in low specific activities, limiting its use for cancer therapy. Production via charged-particle activation of enriched 186W results in a 186gRe product with a much higher specific activity, allowing it to be used more broadly for targeted radiotherapy applications. Furthermore, this addresses the unmet clinical need for more efficient radiotherapeutics.
The Pharmacogenomics of Anti-Hypertensive Therapy.
Padmanabhan, Sandosh; Paul, Laura; Dominczak, Anna F
2010-06-01
Hypertension is a major public health problem, but measures to reduce blood pressure and thus cardiovascular risk are complicated by the high prevalence of treatment resistance, despite the availability of multiple drugs. Drug side-effects contribute considerably to suboptimal blood pressure control. Clinicians must often rely on empirical methods to match patients with effective drug treatment. Hypertension pharmacogenomics seeks to find genetic predictors of response to drugs that lower blood pressure and to translate this knowledge into clinical practice. In this review we summarise the current status of hypertension pharmacogenetics from monogenic hypertension to essential hypertension and discuss the issues that need to be considered in a hypertension pharmacogenomic study.
Environmental Detection of Clandestine Nuclear Weapon Programs
NASA Astrophysics Data System (ADS)
Kemp, R. Scott
2016-06-01
Environmental sensing of nuclear activities has the potential to detect nuclear weapon programs at early stages, deter nuclear proliferation, and help verify nuclear accords. However, no robust system of detection has been deployed to date. This can be variously attributed to high costs, technical limitations in detector technology, simple countermeasures, and uncertainty about the magnitude or behavior of potential signals. In this article, current capabilities and promising opportunities are reviewed. Systematic research in a variety of areas could improve prospects for detecting covert nuclear programs, although the potential for countermeasures suggests long-term verification of nuclear agreements will need to rely on methods other than environmental sensing.
Recurrence Quantification Analysis of Sentence-Level Speech Kinematics
ERIC Educational Resources Information Center
Jackson, Eric S.; Tiede, Mark; Riley, Michael A.; Whalen, D. H.
2016-01-01
Purpose: Current approaches to assessing sentence-level speech variability rely on measures that quantify variability across utterances and use normalization procedures that alter raw trajectory data. The current work tests the feasibility of a less restrictive nonlinear approach--recurrence quantification analysis (RQA)--via a procedural example…
Nutrient Use Efficiency in Bioenergy Cropping Systems: Critical Research Questions
USDA-ARS?s Scientific Manuscript database
Current U.S. plans for energy security rely on converting large areas of cropland from food to biofuel production. Additionally, lands currently considered too marginal for intensive food production may be considered suitable for biofuels production; predominant cropping systems may shift to more va...
Optimization of monopiles for offshore wind turbines.
Kallehave, Dan; Byrne, Byron W; LeBlanc Thilsted, Christian; Mikkelsen, Kristian Kousgaard
2015-02-28
The offshore wind industry currently relies on subsidy schemes to be competitive with fossil-fuel-based energy sources. For the wind industry to survive, it is vital that costs are significantly reduced for future projects. This can be partly achieved by introducing new technologies and partly through optimization of existing technologies and design methods. One of the areas where costs can be reduced is in the support structure, where better designs, cheaper fabrication and quicker installation might all be possible. The prevailing support structure design is the monopile structure, where the simple design is well suited to mass-fabrication, and the installation approach, based on conventional impact driving, is relatively low-risk and robust for most soil conditions. The range of application of the monopile for future wind farms can be extended by using more accurate engineering design methods, specifically tailored to offshore wind industry design. This paper describes how state-of-the-art optimization approaches are applied to the design of current wind farms and monopile support structures and identifies the main drivers where more accurate engineering methods could impact on a next generation of highly optimized monopiles. © 2015 The Author(s) Published by the Royal Society. All rights reserved.
Using DNA chips for identification of tephritid pest species.
Chen, Yen-Hou; Liu, Lu-Yan; Tsai, Wei-Huang; Haymer, David S; Lu, Kuang-Hui
2014-08-01
The ability correctly to identify species in a rapid and reliable manner is critical in many situations. For insects in particular, the primary tools for such identification rely on adult-stage morphological characters. For a number of reasons, however, there is a clear need for alternatives. This paper reports on the development of a new method employing DNA biochip technology for the identification of pest species within the family Tephritidae. The DNA biochip developed and tested here quickly and efficiently identifies and discriminates between several tephritid species, except for some that are members of a complex of closely related taxa and that may in fact not represent distinct biological species. The use of these chips offers a number of potential advantages over current methods. Results can be obtained in less than 5 h using material from any stage of the life cycle and with greater sensitivity than other methods currently available. This technology provides a novel tool for the rapid and reliable identification of several major pest species that may be intercepted in imported fruits or other commodities. The existing chips can also easily be expanded to incorporate additional markers and species as needed. © 2013 Society of Chemical Industry.
Challenges Facing Evidence-Based Prevention: Incorporating an Abductive Theory of Method.
Mason, W Alex; Cogua-Lopez, Jasney; Fleming, Charles B; Scheier, Lawrence M
2018-06-01
Current systems used to determine whether prevention programs are "evidence-based" rely on the logic of deductive reasoning. This reliance has fostered implementation of strategies with explicitly stated evaluation criteria used to gauge program validity and suitability for dissemination. Frequently, investigators resort to the randomized controlled trial (RCT) combined with null hypothesis significance testing (NHST) as a means to rule out competing hypotheses and determine whether an intervention works. The RCT design has achieved success across numerous disciplines but is not without limitations. We outline several issues that question allegiance to the RCT, NHST, and the hypothetico-deductive method of scientific inquiry. We also discuss three challenges to the status of program evaluation including reproducibility, generalizability, and credibility of findings. As an alternative, we posit that extending current program evaluation criteria with principles drawn from an abductive theory of method (ATOM) can strengthen our ability to address these challenges and advance studies of drug prevention. Abductive reasoning involves working from observed phenomena to the generation of alternative explanations for the phenomena and comparing the alternatives to select the best possible explanation. We conclude that an ATOM can help increase the influence and impact of evidence-based prevention for population benefit.
Fail-safe reactivity compensation method for a nuclear reactor
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nygaard, Erik T.; Angelo, Peter L.; Aase, Scott B.
The present invention relates generally to the field of compensation methods for nuclear reactors and, in particular, to a method for fail-safe reactivity compensation in solution-type nuclear reactors. In one embodiment, the fail-safe reactivity compensation method of the present invention augments other control methods for a nuclear reactor. In still another embodiment, the fail-safe reactivity compensation method of the present invention permits one to control a nuclear reaction in a nuclear reactor through a method that does not rely on moving components into or out of a reactor core, nor does the method of the present invention rely on the constant repositioning of control rods within a nuclear reactor in order to maintain a critical state.
Coast, Joanna; Flynn, Terry; Sutton, Eileen; Al-Janabi, Hareth; Vosper, Jane; Lavender, Sarita; Louviere, Jordan; Peters, Tim
2008-10-01
This paper deals with three concerns about the evaluative framework that is currently dominant within health economics. These concerns are: that the evaluative framework is concerned entirely with health; that the evaluative framework has an individualistic focus on patients alone; and that the methods used to estimate 'health' within the current evaluative framework could be improved both in terms of the generation of descriptive systems and in using valuation methods that rely less on people's ability to express their preferences on a cardinal scale. In exploring these issues the Investigating Choice Experiments for Preferences of Older People (ICEPOP) programme has explicitly focused on both the topic of older people and the methods of discrete choice experiments. A capability index has been developed and attributes for an economic measure of end-of-life care are currently being generated, providing the possibility of extending the evaluative framework beyond health alone. A measure of carer's experience and a framework for extending measurement in end-of-life care to loved ones are both also in development, thus extending the evaluative framework beyond the patient alone. Rigorous qualitative methods employing an iterative approach have been developed for use in constructing attributes, and best-worst scaling has been utilized to reduce task complexity and provide insights into heterogeneity. There are a number of avenues for further research in all these areas, but in particular there is need for greater attention to be paid to the theory underlying the evaluative framework within health economics.
NASA Astrophysics Data System (ADS)
Scheele, C. J.; Huang, Q.
2016-12-01
In the past decade, the rise in social media has led to the development of a vast number of social media services and applications. Disaster management represents one such application, leveraging the massive data generated for event detection, response, and recovery. In order to find disaster-relevant social media data, current approaches utilize natural language processing (NLP) methods based on keywords, or machine learning algorithms relying on text only. However, these approaches cannot be perfectly accurate due to the variability and uncertainty in language used on social media. To improve current methods, an enhanced text-mining framework is proposed that incorporates location information from social media and authoritative remote sensing datasets for detecting disaster-relevant social media posts, which are determined by assessing the textual content using common text mining methods and how the post relates spatiotemporally to the disaster event. To assess the framework, geo-tagged Tweets were collected for three different spatial and temporal disaster events: hurricane, flood, and tornado. Remote sensing data and products for each event were then collected using RealEarth™. Both Naive Bayes and Logistic Regression classifiers were used to compare the accuracy within the enhanced text-mining framework. Finally, the accuracies from the enhanced text-mining framework were compared to the current text-only methods for each of the case study disaster events. The results from this study address the need for more authoritative data when using social media in disaster management applications.
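The framework's core idea, combining text features with a spatiotemporal relevance feature, can be illustrated briefly. In this hedged sketch the example posts, labels, and the distance-based proximity feature are all invented; the study used both Naive Bayes and Logistic Regression, of which only the latter is shown.

```python
# Illustrative sketch: classify posts using TF-IDF text features plus a
# proximity-to-event feature (hypothetical data throughout).
import numpy as np
from scipy.sparse import hstack, csr_matrix
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression

texts = ["flooding on main street", "great pizza tonight",
         "water rising near the river", "traffic jam downtown"]
dist_km = np.array([1.0, 80.0, 2.5, 60.0])   # hypothetical distance to event footprint
labels = np.array([1, 0, 1, 0])              # 1 = disaster relevant

vec = TfidfVectorizer()
X_text = vec.fit_transform(texts)
X = hstack([X_text, csr_matrix(1.0 / (1.0 + dist_km)).T])  # append proximity feature

clf = LogisticRegression().fit(X, labels)
print(clf.predict(X))
```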
Allergen-Specific Immunotherapies for Food Allergy
Feuille, Elizabeth
2018-01-01
With rising prevalence of food allergy (FA), allergen-specific immunotherapy (AIT) for FA has become an active area of research in recent years. In AIT, incrementally increasing doses of inciting allergen are given with the goal to increase tolerance, initially through desensitization, which relies on regular exposure to allergen. With prolonged therapy in some subjects, AIT may induce sustained unresponsiveness, in which tolerance is retained after a period of allergen avoidance. Methods of AIT currently under study in humans include oral, sublingual, epicutaneous, and subcutaneous delivery of modified allergenic protein, as well as via DNA-based vaccines encoding allergen with lysosomal-associated membrane protein I. The balance of safety and efficacy varies by type of AIT, as well as by targeted allergen. Age, degree of sensitization, and other comorbidities may affect this balance within an individual patient. More recently, AIT with modified proteins or combined with immunomodulatory therapies has shown promise in making AIT safer and/or more effective. Though methods of AIT are neither currently advised by experts (oral immunotherapy [OIT]) nor widely available, AIT is likely to become a part of recommended management of FA in the coming years. Here, we review and compare methods of AIT currently under study in humans to prepare the practitioner for an exciting new phase in the care of food allergic patients in which improved tolerance to inciting foods will be a real possibility. PMID:29676066
Exploratory factor analysis in Rehabilitation Psychology: a content analysis.
Roberson, Richard B; Elliott, Timothy R; Chang, Jessica E; Hill, Jessica N
2014-11-01
Our objective was to examine the use and quality of exploratory factor analysis (EFA) in articles published in Rehabilitation Psychology. Trained raters examined 66 separate exploratory factor analyses in 47 articles published between 1999 and April 2014. The raters recorded the aim of the EFAs, the distributional statistics, sample size, factor retention method(s), extraction and rotation method(s), and whether the pattern coefficients, structure coefficients, and the matrix of association were reported. The primary use of the EFAs was scale development, but the most widely used extraction and rotation method was principal component analysis, with varimax rotation. When determining how many factors to retain, multiple methods (e.g., scree plot, parallel analysis) were used most often. Many articles did not report enough information to allow for the duplication of their results. EFA relies on authors' choices (e.g., factor retention rules, extraction and rotation methods), and few articles adhered to all of the best practices. The current findings are compared to other empirical investigations into the use of EFA in published research. Recommendations for improving EFA reporting practices in rehabilitation psychology research are provided.
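One of the retention methods the review mentions, parallel analysis, is easy to state in code: retain factors whose observed eigenvalues exceed those obtained from random data of the same shape. The sketch below is a generic implementation of Horn's procedure, not code from the reviewed articles; the percentile and iteration count are conventional defaults.

```python
# Compact sketch of Horn's parallel analysis for factor retention.
import numpy as np

def parallel_analysis(X, n_iter=500, pct=95, seed=0):
    rng = np.random.default_rng(seed)
    n, p = X.shape
    obs = np.sort(np.linalg.eigvalsh(np.corrcoef(X, rowvar=False)))[::-1]
    rand = np.empty((n_iter, p))
    for i in range(n_iter):
        R = rng.normal(size=(n, p))
        rand[i] = np.sort(np.linalg.eigvalsh(np.corrcoef(R, rowvar=False)))[::-1]
    threshold = np.percentile(rand, pct, axis=0)   # per-eigenvalue random baseline
    return int(np.sum(obs > threshold))            # factors to retain

X = np.random.default_rng(1).normal(size=(300, 10))
X[:, :3] += X[:, [0]]                              # induce one correlated block
print("factors to retain:", parallel_analysis(X))
```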
The Anopheles gambiae transcriptome - a turning point for malaria control.
Domingos, A; Pinheiro-Silva, R; Couto, J; do Rosário, V; de la Fuente, J
2017-04-01
Mosquitoes are important vectors of several pathogens and thereby contribute to the spread of diseases, with social, economic and public health impacts. Amongst the approximately 450 species of Anopheles, about 60 are recognized as vectors of human malaria, the most important parasitic disease. In Africa, Anopheles gambiae is the main malaria vector mosquito. Current malaria control strategies are largely focused on drugs and vector control measures such as insecticides and bed-nets. Improvement of current, and the development of new, mosquito-targeted malaria control methods rely on a better understanding of mosquito vector biology. An organism's transcriptome is a reflection of its physiological state and transcriptomic analyses of different conditions that are relevant to mosquito vector competence can therefore yield important information. Transcriptomic analyses have contributed significant information on processes such as blood-feeding parasite-vector interaction, insecticide resistance, and tissue- and stage-specific gene regulation, thereby facilitating the path towards the development of new malaria control methods. Here, we discuss the main applications of transcriptomic analyses in An. gambiae that have led to a better understanding of mosquito vector competence. © 2017 The Royal Entomological Society.
First Higher-Multipole Model of Gravitational Waves from Spinning and Coalescing Black-Hole Binaries
NASA Astrophysics Data System (ADS)
London, Lionel; Khan, Sebastian; Fauchon-Jones, Edward; García, Cecilio; Hannam, Mark; Husa, Sascha; Jiménez-Forteza, Xisco; Kalaghatgi, Chinmay; Ohme, Frank; Pannarale, Francesco
2018-04-01
Gravitational-wave observations of binary black holes currently rely on theoretical models that predict the dominant multipoles (ℓ=2 ,|m |=2 ) of the radiation during inspiral, merger, and ringdown. We introduce a simple method to include the subdominant multipoles to binary black hole gravitational waveforms, given a frequency-domain model for the dominant multipoles. The amplitude and phase of the original model are appropriately stretched and rescaled using post-Newtonian results (for the inspiral), perturbation theory (for the ringdown), and a smooth transition between the two. No additional tuning to numerical-relativity simulations is required. We apply a variant of this method to the nonprecessing PhenomD model. The result, PhenomHM, constitutes the first higher-multipole model of spinning and coalescing black-hole binaries, and currently includes the (ℓ,|m |)=(2 ,2 ),(3 ,3 ),(4 ,4 ),(2 ,1 ),(3 ,2 ),(4 ,3 ) radiative moments. Comparisons with numerical-relativity waveforms demonstrate that PhenomHM is more accurate than dominant-multipole-only models for all binary configurations, and typically improves the measurement of binary properties.
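The leading-order version of the stretch-and-rescale idea can be written down explicitly. The formulas below are our hedged paraphrase of the inspiral mapping described in the abstract; the calibrated PhenomHM transfer functions differ in detail and treat the ringdown separately using perturbation-theory frequencies.

```latex
% Inspiral-regime mapping from the dominant (2,2) multipole to an (l,m)
% multipole: stretch the phase in frequency and rescale the amplitude by
% post-Newtonian (PN) amplitude ratios.
\varphi_{\ell m}(f) \;\approx\; \frac{m}{2}\,\varphi_{22}\!\left(\frac{2f}{m}\right),
\qquad
A_{\ell m}(f) \;\approx\;
  \frac{A^{\rm PN}_{\ell m}(2f/m)}{A^{\rm PN}_{22}(2f/m)}\;
  A_{22}\!\left(\frac{2f}{m}\right).
```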
Centrifugal Compressor Aeroelastic Analysis Code
NASA Astrophysics Data System (ADS)
Keith, Theo G., Jr.; Srivastava, Rakesh
2002-01-01
Centrifugal compressors are very widely used in the turbomachinery industry where low mass flow rates are required. Gas turbine engines for tanks, rotorcraft and small jets rely extensively on centrifugal compressors for rugged and compact design. These compressors experience problems related to flowfield unsteadiness, such as stall flutter, separation at the trailing edge over diffuser guide vanes and tip vortex unsteadiness, leading to rotating stall and surge. Considerable interest exists among small gas turbine engine manufacturers in understanding and eventually eliminating the problems related to centrifugal compressors. The geometric complexity of centrifugal compressor blades and the twisting of the blade passages make linear methods inapplicable. Advanced computational fluid dynamics (CFD) methods are needed for accurate unsteady aerodynamic and aeroelastic analysis of centrifugal compressors. Most present-day industrial turbomachines and small aircraft engines are designed with a centrifugal compressor. With such a large customer base, and NASA Glenn Research Center being the lead center for turbomachines, it is important that adequate emphasis be placed on this area as well. Currently, this activity is not supported under any project at NASA Glenn.
An illustration of new methods in machine condition monitoring, Part I: stochastic resonance
NASA Astrophysics Data System (ADS)
Worden, K.; Antoniadou, I.; Marchesiello, S.; Mba, C.; Garibaldi, L.
2017-05-01
There have been many recent developments in the application of data-based methods to machine condition monitoring. A powerful methodology based on machine learning has emerged, where diagnostics are based on a two-step procedure: extraction of damage-sensitive features, followed by unsupervised learning (novelty detection) or supervised learning (classification). The objective of the current pair of papers is simply to illustrate one state-of-the-art procedure for each step, using synthetic data representative of reality in terms of size and complexity. The first paper in the pair will deal with feature extraction. Although some papers have appeared in the recent past considering stochastic resonance as a means of amplifying damage information in signals, they have largely relied on ad hoc specifications of the resonator used. In contrast, the current paper will adopt a principled optimisation-based approach to the resonator design. The paper will also show that a discrete dynamical system can provide all the benefits of a continuous system, but also provide a considerable speed-up in terms of simulation time in order to facilitate the optimisation approach.
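The amplification mechanism can be demonstrated with a few lines of simulation. The sketch below is a generic stochastic-resonance demonstration, not the paper's optimised resonator: a weak subthreshold sinusoid (standing in for a damage signature) is passed through a quartic double-well system, and the output power at the forcing frequency typically rises with moderate noise before degrading again. All parameter values are assumptions.

```python
# Minimal stochastic-resonance demo: Euler-discretised bistable resonator
# x' = a*x - b*x^3 + A*sin(2*pi*f0*t) + noise, with subthreshold forcing A.
import numpy as np

def resonator_output_power(noise_std, a=1.0, b=1.0, A=0.3, f0=0.01,
                           dt=0.1, n=20000, seed=0):
    rng = np.random.default_rng(seed)
    t = np.arange(n) * dt
    x = np.zeros(n)
    for k in range(n - 1):
        drift = a * x[k] - b * x[k] ** 3 + A * np.sin(2 * np.pi * f0 * t[k])
        x[k + 1] = x[k] + dt * drift + np.sqrt(dt) * noise_std * rng.normal()
    spec = np.abs(np.fft.rfft(x)) ** 2
    freqs = np.fft.rfftfreq(n, dt)
    return spec[np.argmin(np.abs(freqs - f0))]   # power at the signal frequency

for sigma in (0.05, 0.3, 0.6, 1.2):
    print(f"noise {sigma:4.2f} -> signal-band power "
          f"{resonator_output_power(sigma):.3g}")
```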
Summary of Work for Joint Research Interchanges with DARWIN Integrated Product Team 1998
NASA Technical Reports Server (NTRS)
Hesselink, Lambertus
1999-01-01
The intent of Stanford University's SciVis group is to develop technologies that enable comparative analysis and visualization techniques for simulated and experimental flow fields. These techniques would then be made available under the Joint Research Interchange for potential injection into the DARWIN Workspace Environment (DWE). In the past, we have focused on techniques that exploited feature-based comparisons such as shock and vortex extractions. Our current research effort focuses on finding a quantitative comparison of general vector fields based on topological features. Since the method relies on topological information, grid matching and vector alignment are not needed in the comparison. This is often a problem with many data comparison techniques. In addition, since only topology-based information is stored and compared for each field, there is a significant compression of information that enables large databases to be quickly searched. This report will briefly (1) describe current technologies in the area of comparison techniques, (2) describe the theory of our new method and finally (3) summarize a few of the results.
[The role of biotechnology in pharmaceutical drug design].
Gaisser, Sibylle; Nusser, Michael
2010-01-01
Biotechnological methods have become an important tool in pharmaceutical drug research and development. Today approximately 15 % of drug revenues are derived from biopharmaceuticals. The most relevant indications are oncology, metabolic disorders and disorders of the musculoskeletal system. The relevance of biopharmaceuticals can be expected to increase further: currently, more than 25 % of all substances in preclinical testing rely on biotechnology. Products for the treatment of cancer, metabolic disorders and infectious diseases are most important. New therapeutic approaches such as RNA interference play only a minor role in current commercial drug research and development, accounting for 1.5 % of all biological preclinical substances. Investments in sustainable high technology such as biotechnology are of vital importance for a highly developed country like Germany because of its lack of raw materials. Biotechnology helps the pharmaceutical industry to develop new products, processes, methods and services and to improve existing ones. Thus, international competitiveness can be strengthened, new jobs created and existing jobs preserved.
Optical Biopsy: A New Frontier in Endoscopic Detection and Diagnosis
WANG, THOMAS D.; VAN DAM, JACQUES
2007-01-01
Endoscopic diagnosis currently relies on the ability of the operator to visualize abnormal patterns in the image created by light reflected from the mucosal surface of the gastrointestinal tract. Advances in fiber optics, light sources, detectors, and molecular biology have led to the development of several novel methods for tissue evaluation in situ. The term “optical biopsy” refers to methods that use the properties of light to enable the operator to make an instant diagnosis at endoscopy, previously possible only by using histological or cytological analysis. Promising imaging techniques include fluorescence endoscopy, optical coherence tomography, confocal microendoscopy, and molecular imaging. Point detection schemes under development include light scattering and Raman spectroscopy. Such advanced diagnostic methods go beyond standard endoscopic techniques by offering improved image resolution, contrast, and tissue penetration and providing biochemical and molecular information about mucosal disease. This review describes the basic biophysics of light-tissue interactions, assesses the strengths and weaknesses of each method, and examines clinical and preclinical evidence for each approach. PMID:15354274
Measurements of dynamo effect on double-CHI pulse ST plasmas on HIST
NASA Astrophysics Data System (ADS)
Ito, K.; Hanao, T.; Ishihara, M.; Matsumoto, K.; Higashi, T.; Kikuchi, Y.; Fukumoto, N.; Nagata, M.
2011-10-01
Coaxial helicity injection (CHI) is an efficient current-drive method used in spheromak and spherical torus (ST) experiments. An anticipated issue for CHI is achieving good energy confinement, since it relies on magnetic relaxation and dynamo action; this is essentially because CHI cannot drive a dynamo directly inside a closed magnetic flux surface. It is therefore an important issue to investigate the dynamo effect in order to explore CHI current-drive mechanisms in a new approach such as the multi-pulsing CHI method. To study the dynamo model with two-fluid Hall effects, we have started from the generalized Ohm's law. We have measured the MHD dynamo term and the Hall dynamo term separately by using a Mach probe and a Hall probe incorporating 3-axis magnetic pick-up coils. The result shows that the induced electric field due to the MHD dynamo is large enough to sustain the mean toroidal current against resistive decay in the core region. On the other hand, an anti-dynamo effect in the MHD dynamo term is observed in the central open flux column (OFC) region. From the viewpoint of two-fluid theory, the ion diamagnetic drift is opposite to the electron diamagnetic drift, possibly resulting in the anti-dynamo effect. The Hall dynamo may arise from the fluctuating electron diamagnetic current due to the electron density gradient, which is large in the OFC region.
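The decomposition such measurements perform can be made explicit. The expression below is our reconstruction of the standard two-fluid mean-field form of the generalized Ohm's law used in relaxation experiments; the paper's exact notation and sign conventions may differ.

```latex
% Mean-field generalized Ohm's law: tildes denote fluctuating quantities,
% angle brackets denote mean-field averages.
\eta \langle \mathbf{J} \rangle
  \;=\; \langle \mathbf{E} \rangle
  \;+\; \langle \mathbf{V} \rangle \times \langle \mathbf{B} \rangle
  \;+\; \underbrace{\langle \tilde{\mathbf{v}} \times \tilde{\mathbf{b}} \rangle}_{\text{MHD dynamo}}
  \;-\; \underbrace{\frac{1}{n e}\,\langle \tilde{\mathbf{j}} \times \tilde{\mathbf{b}} \rangle}_{\text{Hall dynamo}}
```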
Intercontinental Ballistic Missiles and their Role in Future Nuclear Forces
2017-05-01
they cannot carry nuclear weapons. The B-52 relies entirely on the ALCM, whereas the B-2 currently relies on unguided bombs. A new stealthy bomber...SLBM program within the next five to seven years to maintain SLBM availability into the 2050s and beyond. The new bomb will be used by stealthy...accuracy combination in an ICBM, an SLBM, a guided bomb, or a cruise missile. Similarly, speed of response and in-flight survivability favor ICBMs
Complexities in Ferret Influenza Virus Pathogenesis and Transmission Models.
Belser, Jessica A; Eckert, Alissa M; Tumpey, Terrence M; Maines, Taronna R
2016-09-01
Ferrets are widely employed to study the pathogenicity, transmissibility, and tropism of influenza viruses. However, inherent variations in inoculation methods, sampling schemes, and experimental designs are often overlooked when contextualizing or aggregating data between laboratories, leading to potential confusion or misinterpretation of results. Here, we provide a comprehensive overview of parameters to consider when planning an experiment using ferrets, collecting data from the experiment, and placing results in context with previously performed studies. This review offers information that is of particular importance for researchers in the field who rely on ferret data but do not perform the experiments themselves. Furthermore, this review highlights the breadth of experimental designs and techniques currently available to study influenza viruses in this model, underscoring the wide heterogeneity of protocols currently used for ferret studies while demonstrating the wealth of information which can benefit risk assessments of emerging influenza viruses. Copyright © 2016, American Society for Microbiology. All Rights Reserved.
Wilson, Anna; Goldberg, Tony; Marcquenski, Susan; Olson, Wendy; Goetz, Frederick; Hershberger, Paul; Hart, Lucas
2014-01-01
Viral hemorrhagic septicemia virus (VHSV) is a target of surveillance by many state and federal agencies in the United States. Currently, the detection of VHSV relies on virus isolation, which is lethal to fish and indicates only the current infection status. A serological method is required to ascertain prior exposure. Here, we report two serologic tests for VHSV that are nonlethal, rapid, and species independent, a virus neutralization (VN) assay and a blocking enzyme-linked immunosorbent assay (ELISA). The results show that the VN assay had a specificity of 100% and sensitivity of 42.9%; the anti-nucleocapsid-blocking ELISA detected nonneutralizing VHSV antibodies at a specificity of 88.2% and a sensitivity of 96.4%. The VN assay and ELISA are valuable tools for assessing exposure to VHSV. PMID:24429071
Single-cell proteomics: potential implications for cancer diagnostics.
Gavasso, Sonia; Gullaksen, Stein-Erik; Skavland, Jørn; Gjertsen, Bjørn T
2016-01-01
Single-cell proteomics in cancer is evolving and promises to provide more accurate diagnoses based on detailed molecular features of cells within tumors. This review focuses on technologies that allow for collection of complex data from single cells, but also highlights methods that are adaptable to routine cancer diagnostics. Current diagnostics rely on histopathological analysis, complemented by mutational detection and clinical imaging. Though crucial, the information gained is often not directly transferable to defined therapeutic strategies, and predicting therapy response in a patient is difficult. In cancer, cellular states revealed through perturbed intracellular signaling pathways can identify functional mutations recurrent in cancer subsets. Single-cell proteomics remains to be validated in clinical trials where serial samples before and during treatment can reveal excessive clonal evolution and therapy failure; its use in clinical trials is anticipated to ignite a diagnostic revolution that will better align diagnostics with the current biological understanding of cancer.
Three-dimensional bio-printing: A new frontier in oncology research
Charbe, Nitin; McCarron, Paul A; Tambuwala, Murtaza M
2017-01-01
Current research in oncology deploys methods that rely principally on two-dimensional (2D) mono-cell cultures and animal models. Although these methodologies have led to significant advancement in the development of novel experimental therapeutic agents with promising anticancer activity in the laboratory, clinicians still struggle to manage cancer in the clinical setting. The disappointing translational success is attributable mainly to poor representation and recreation of the cancer microenvironment present in human neoplasia. Three-dimensional (3D) bio-printed models could help to simulate this micro-environment, with recent bio-printing of live human cells demonstrating that effective in vitro replication is achievable. This literature review outlines up-to-date advancements and developments in the use of 3D bio-printed models currently being used in oncology research. These innovative advancements in 3D bio-printing open up a new frontier for oncology research and could herald an era of progressive clinical cancer therapeutics. PMID:28246583
On the Interactions Between Planetary and Mesoscale Dynamics in the Oceans
NASA Astrophysics Data System (ADS)
Grooms, I.; Julien, K. A.; Fox-Kemper, B.
2011-12-01
Multiple-scales asymptotic methods are used to investigate the interaction of planetary and mesoscale dynamics in the oceans. We find three regimes. In the first, the slow, large-scale planetary flow sets up a baroclinically unstable background which leads to vigorous mesoscale eddy generation, but the eddy dynamics do not affect the planetary dynamics. In the second, the planetary flow feels the effects of the eddies, but appears to be unable to generate them. The first two regimes rely on horizontally isotropic large-scale dynamics. In the third regime, large-scale anisotropy, as exists for example in the Antarctic Circumpolar Current and in western boundary currents, allows the large-scale dynamics to both generate and respond to mesoscale eddies. We also discuss how the investigation may be brought to bear on the problem of parameterization of unresolved mesoscale dynamics in ocean general circulation models.
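For readers less familiar with the technique, the sketch below shows the generic shape of a two-scale expansion of the kind this abstract invokes. It is illustrative only; the paper's specific oceanic scalings and the conditions distinguishing the three regimes are not reproduced here.

```latex
% Generic multiple-scales ansatz (illustrative; not the paper's exact scaling).
% Slow planetary coordinates are stretched by a small parameter \epsilon:
%   X = \epsilon x, \qquad T = \epsilon t .
\begin{align*}
  u(x,t) &= U_0(X,T) \;+\; u_0'(x,t;X,T) \;+\; \epsilon\, u_1 \;+\; \mathcal{O}(\epsilon^2),\\
  \partial_T U_0 + (\text{planetary terms}) &= -\,\nabla_X \cdot \overline{u_0'\, u_0'} .
\end{align*}
```

Here the mean planetary flow U_0 evolves on the slow scales, the mesoscale eddy field u_0' fluctuates on the fast scales, and whether the eddy-flux divergence on the right-hand side enters at leading order is what separates regimes in which the eddies do or do not feed back on the planetary flow.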
Graphene Based Ultra-Capacitors for Safer, More Efficient Energy Storage
NASA Technical Reports Server (NTRS)
Roberson, Luke B.; Mackey, Paul J.; Zide, Carson J.
2016-01-01
Current power storage methods must be continuously improved in order to keep up with the increasingly competitive electronics industry. This technological advancement is also essential for the continuation of deep space exploration. Today's energy storage industry relies heavily on the use of dangerous and corrosive chemicals such as lithium and phosphoric acid. These chemicals can prove hazardous to the user if the device is ruptured. Similarly, they can damage the environment if they are disposed of improperly. A safer, more efficient alternative is needed across a wide range of NASA missions. One solution would be a solid-state, carbon-based energy storage device. Carbon is a safer, less environmentally hazardous alternative to current energy storage materials. Using the carbon nanostructure graphene, this idea of safer portable energy becomes possible. Graphene was electrochemically produced in the lab and several coin cell devices were built this summer to create a working prototype of a solid-state graphene battery.
Mass spectrometry and immunoassay: how to measure steroid hormones today and tomorrow.
Taylor, Angela E; Keevil, Brian; Huhtaniemi, Ilpo T
2015-08-01
The recent onslaught of mass spectrometry (MS) to measurements of steroid hormones, including demands that they should be the only acceptable method, has confused clinicians and scientists who have relied for more than 40 years on a variety of immunoassay (IA) methods in steroid hormone measurements. There is little doubt that MS methods with their superior specificity will be the future method of choice in many clinical and research applications of steroid hormone measurement. However, the majority of steroid measurements are currently, and will continue to be, carried out using various types of IAs for several reasons, including their technical ease, cost and availability of commercial reagents. Speedy replacement of all IAs with MS is an unrealistic and unnecessary goal, because the availability of MS measurements is limited by cost, need of expensive equipment, technical demands and lack of commercial applications. Furthermore, IAs have multiple well-known advantages that vindicate their continuing use. The purpose of this article is to elucidate the advantages and limitations of the MS and IA techniques from two angles, i.e. promotion of MS and defence of IA. The purpose of the text is to give the reader an unbiased view about the current state and future trends of steroid analysis and to help him/her choose the correct assay method to serve his/her diagnostic and research needs. © 2015 European Society of Endocrinology.
Development of Methodologies for IV and V of Neural Networks
NASA Technical Reports Server (NTRS)
Taylor, Brian; Darrah, Marjorie
2003-01-01
Non-deterministic systems often rely upon neural network (NN) technology to "learn" to manage flight systems under controlled conditions using carefully chosen training sets. How can these adaptive systems be certified to ensure that they will become increasingly efficient and behave appropriately in real-time situations? The bulk of Independent Verification and Validation (IV&V) research of non-deterministic software control systems such as Adaptive Flight Controllers (AFCs) addresses NNs in well-behaved and constrained environments such as simulations and strict process control. However, neither substantive research nor effective IV&V techniques have been found to address AFCs learning in real-time and adapting to live flight conditions. Adaptive flight control systems offer good extensibility into commercial aviation as well as military aviation and transportation. Consequently, this area of IV&V represents an area of growing interest and urgency. ISR proposes to further the current body of knowledge to meet two objectives: research the current IV&V methods and assess where these methods may be applied toward a methodology for the V&V of neural networks; and identify effective methods for IV&V of NNs that learn in real-time, including developing a prototype test bed for IV&V of AFCs. Currently, no practical method exists. ISR will meet these objectives through the tasks identified and described below. First, ISR will conduct a literature review of current IV&V technology. To do this, ISR will collect the existing body of research on IV&V of non-deterministic systems and neural networks. ISR will also develop the framework for disseminating this information through specialized training. This effort will focus on developing NASA's capability to conduct IV&V of neural network systems and to provide training to meet the increasing need for IV&V expertise in such systems.
NASA Astrophysics Data System (ADS)
Doran-Peterson, Joy; Jangid, Amruta; Brandon, Sarah K.; Decrescenzo-Henriksen, Emily; Dien, Bruce; Ingram, Lonnie O.
Ethanol production by fermentation of lignocellulosic biomass-derived sugars involves a fairly ancient art and an ever-evolving science. Production of ethanol from lignocellulosic biomass is not avant-garde, and wood ethanol plants have been in existence since at least 1915. Most current ethanol production relies on starch- and sugar-based crops as the substrate; however, the limitations of these materials and their competing value as human and animal feed are renewing interest in lignocellulose conversion. Herein, we describe methods for both simultaneous saccharification and fermentation (SSF) and a similar but separate process for partial saccharification and cofermentation (PSCF) of lignocellulosic biomass for ethanol production using yeasts or pentose-fermenting engineered bacteria. These methods are applicable for small-scale preliminary evaluations of ethanol production from a variety of biomass sources.
Decentralized Orchestration of Composite Ogc Web Processing Services in the Cloud
NASA Astrophysics Data System (ADS)
Xiao, F.; Shea, G. Y. K.; Cao, J.
2016-09-01
Current web-based GIS or RS applications generally rely on a centralized structure, which has inherent drawbacks such as single points of failure, network congestion, and data inconsistency. These disadvantages of traditional GISs need to be addressed for new applications on the Internet or Web. Decentralized orchestration offers performance improvements in terms of increased throughput and scalability and lower response time. This paper investigates build-time and runtime issues related to decentralized orchestration of composite geospatial processing services based on the OGC WPS standard specification. A case study of dust storm detection was conducted to evaluate the proposed method, and the experimental results indicate that the method is effective in producing high-quality solutions at a low communication cost for the geospatial processing service composition problem.
Progress in reforming chemical engineering education.
Wankat, Phillip C
2013-01-01
Three successful historical reforms of chemical engineering education were the triumph of chemical engineering over industrial chemistry, the engineering science revolution, and Engineering Criteria 2000. Current attempts to change teaching methods have relied heavily on dissemination of the results of engineering-education research that show superior student learning with active learning methods. Although slow dissemination of education research results is probably a contributing cause of the slowness of reform, two other causes are likely much more significant. First, teaching is the primary interest of only approximately one-half of engineering faculty. Second, the vast majority of engineering faculty have no training in teaching, but trained professors are on average better teachers. Significant progress in reform will occur if organizations with leverage (the National Science Foundation, through CAREER grants, and the Engineering Accreditation Commission of ABET) use that leverage to require faculty to be trained in pedagogy.
Plasmonic biosensor for label-free G-quadruplexes detection
NASA Astrophysics Data System (ADS)
Qiu, Suyan; Zhao, Fusheng; Santos, Greggy M.; Shih, Wei-Chuan
2016-03-01
G-quadruplexes, readily formed by G-rich sequences, potentially occur in over 40% of all human genes; an example is the telomeric DNA, with its G-rich sequence, found at the end of the chromosome. The G-quadruplex structure is thought to serve a diverse set of critical functions in the mammalian genome in transcriptional regulation, DNA replication and genome stability. However, most currently available methods for G-quadruplex identification are restricted to fluorescence techniques that suffer from poor sensitivity. Methods with higher sensitivity that specifically recognize G-quadruplexes are therefore essential. In this study, we demonstrate a label-free plasmonic biosensor for G-quadruplex detection that relies on the advantages of nanoporous gold (NPG) disks, which provide high-density plasmonic hot spots suitable for molecular recognition without the need for a labeling process.
Groseth, Allison; Hoenen, Thomas
2017-01-01
Molecular biology is a broad discipline that seeks to understand biological phenomena at a molecular level, and achieves this through the study of DNA, RNA, proteins, and/or other macromolecules (e.g., those involved in the modification of these substrates). Consequently, it relies on the availability of a wide variety of methods that deal with the collection, preservation, inactivation, separation, manipulation, imaging, and analysis of these molecules. As such, the state of the art in the field of ebolavirus molecular biology research (and that of all other viruses) is largely intertwined with, if not driven by, advancements in the technical methodologies available for these kinds of studies. Here we review the current state of our knowledge regarding ebolavirus biology and emphasize the associated methods that made these discoveries possible.
Arrayed antibody library technology for therapeutic biologic discovery.
Bentley, Cornelia A; Bazirgan, Omar A; Graziano, James J; Holmes, Evan M; Smider, Vaughn V
2013-03-15
Traditional immunization and display antibody discovery methods rely on competitive selection amongst a pool of antibodies to identify a lead. While this approach has led to many successful therapeutic antibodies, targets have been limited to proteins which are easily purified. In addition, selection driven discovery has produced a narrow range of antibody functionalities focused on high affinity antagonism. We review the current progress in developing arrayed protein libraries for screening-based, rather than selection-based, discovery. These single molecule per microtiter well libraries have been screened in multiplex formats against both purified antigens and directly against targets expressed on the cell surface. This facilitates the discovery of antibodies against therapeutically interesting targets (GPCRs, ion channels, and other multispanning membrane proteins) and epitopes that have been considered poorly accessible to conventional discovery methods. Copyright © 2013. Published by Elsevier Inc.
Experimental characterization of the weld pool flow in a TIG configuration
NASA Astrophysics Data System (ADS)
Stadler, M.; Masquère, M.; Freton, P.; Franceries, X.; Gonzalez, J. J.
2014-11-01
The Tungsten Inert Gas (TIG) welding process relies on heat transfer between the plasma and the work piece, leading to a metallic weld pool. A combination of different forces produces movement on the molten pool surface. One of our aims is to determine the velocity on the weld pool surface. This provides a set of data that leads to a deeper comprehension of the flow behavior and allows us to validate numerical models used to study TIG parameters. In this paper, two diagnostic methods based on high-speed imaging for determining the velocity of an AISI 304L stainless steel molten pool are presented. Application of the two methods to a metallic weld pool under helium with a current intensity of 100 A yields velocity values around 0.70 m/s, in good agreement with values reported in the literature.
Fast Three-Dimensional Method of Modeling Atomic Oxygen Undercutting of Protected Polymers
NASA Technical Reports Server (NTRS)
Snyder, Aaron; Banks, Bruce A.
2002-01-01
A method is presented to model atomic oxygen erosion of protected polymers in low Earth orbit (LEO). Undercutting of protected polymers by atomic oxygen occurs in LEO due to the presence of scratch, crack or pin-window defects in the protective coatings. As a means of providing a better understanding of undercutting processes, a fast method of modeling atomic-oxygen undercutting of protected polymers has been developed. Current simulation methods often rely on computationally expensive ray-tracing procedures to track the surface-to-surface movement of individual "atoms." The method introduced in this paper replaces slow individual particle approaches by substituting a model that utilizes both a geometric configuration-factor technique, which governs the diffuse transport of atoms between surfaces, and an efficient telescoping series algorithm, which rapidly integrates the cumulative effects stemming from the numerous atomic oxygen events occurring at the surfaces of an undercut cavity. This new method facilitates the systematic study of three-dimensional undercutting by allowing rapid simulations to be made over a wide range of erosion parameters.
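The closing description, a configuration-factor transport step whose repeated application is summed in closed form rather than ray-traced atom by atom, can be illustrated with a short sketch. The code below is a hypothetical toy, not NASA's model: `F`, `s`, and `b` are invented stand-ins for the configuration-factor matrix, the re-emission (survival) probabilities, and the flux entering through the coating defect.

```python
import numpy as np

# Illustrative sketch (not NASA's code): diffuse transport between N cavity
# surface elements via configuration factors, with the infinite sequence of
# re-emission events summed in closed form.
#
# F[i, j] = configuration (view) factor from element j to element i
# s[j]    = probability that an atom arriving at element j is re-emitted
#           (rather than reacting and eroding the polymer)
# b[i]    = defect-entry flux that first lands on element i

def total_arrival_flux(F, s, b):
    """Total flux arriving at each element over all scattering generations.

    One generation of diffuse re-emission maps a flux x to F @ (s * x), so
    the cumulative arrival flux is b + (FS) b + (FS)^2 b + ...
    = (I - FS)^{-1} b, the closed-form sum of the geometric series.
    """
    N = len(b)
    FS = F * s  # scale column j of F by the survival probability s[j]
    return np.linalg.solve(np.eye(N) - FS, b)

# Toy 3-element cavity: all entering flux first hits element 0.
F = np.array([[0.0, 0.5, 0.3],
              [0.6, 0.0, 0.5],
              [0.2, 0.3, 0.0]])
s = np.array([0.9, 0.9, 0.9])   # 10% of arrivals erode the surface
b = np.array([1.0, 0.0, 0.0])
print(total_arrival_flux(F, s, b))
```

Summing the series in one linear solve, instead of following individual "atoms" from surface to surface, is the kind of shortcut that makes this style of simulation fast.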
Semantic distance-based creation of clusters of pharmacovigilance terms and their evaluation.
Dupuch, Marie; Grabar, Natalia
2015-04-01
Pharmacovigilance is the activity related to the collection, analysis and prevention of adverse drug reactions (ADRs) induced by drugs or biologics. The detection of adverse drug reactions is performed using statistical algorithms and groupings of ADR terms from the MedDRA (Medical Dictionary for Drug Regulatory Activities) terminology. Standardized MedDRA Queries (SMQs) are the groupings which have become a standard for assisting the retrieval and evaluation of MedDRA-coded ADR reports worldwide. Currently 84 SMQs have been created, while several important safety topics are not yet covered. Creation of SMQs is a long and tedious process performed by experts. It relies on manual analysis of MedDRA in order to find all the relevant terms to be included in a SMQ. Our objective is to propose an automatic method for assisting the creation of SMQs using the clustering of terms which are semantically similar. The experimental method relies on a specific semantic resource, and also on semantic distance algorithms and clustering approaches. We perform several experiments in order to define the optimal parameters. Our results show that the proposed method can assist the creation of SMQs and make this process faster and more systematic. The average performance of the method is a precision of 59% and a recall of 26%. The correlation of the results obtained is 0.72 against the medical doctors' judgments and 0.78 against the medical coders' judgments. These results and additional evaluation indicate that the generated clusters can be efficiently used for the detection of pharmacovigilance signals, as they provide better signal detection than the existing SMQs. Copyright © 2014. Published by Elsevier Inc.
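As a concrete illustration of the clustering step, the sketch below applies agglomerative clustering to a precomputed pairwise semantic distance matrix. The terms and distances are invented, and the paper's specific semantic resource and distance algorithms are not reproduced here.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import squareform

# Minimal sketch, assuming a pairwise semantic distance matrix D over
# MedDRA terms has already been computed from the semantic resource.
terms = ["nausea", "vomiting", "hepatic failure", "jaundice", "dizziness"]
D = np.array([[0.0, 0.1, 0.9, 0.8, 0.4],
              [0.1, 0.0, 0.9, 0.8, 0.4],
              [0.9, 0.9, 0.0, 0.2, 0.9],
              [0.8, 0.8, 0.2, 0.0, 0.9],
              [0.4, 0.4, 0.9, 0.9, 0.0]])  # hypothetical distances

Z = linkage(squareform(D), method="average")       # agglomerative clustering
labels = fcluster(Z, t=0.5, criterion="distance")  # cut the tree at 0.5
for term, lab in zip(terms, labels):
    print(lab, term)   # gastrointestinal vs. hepatic terms cluster apart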
Is anthelmintic resistance a concern for the control of human soil-transmitted helminths?
Vercruysse, Jozef; Albonico, Marco; Behnke, Jerzy M.; Kotze, Andrew C.; Prichard, Roger K.; McCarthy, James S.; Montresor, Antonio; Levecke, Bruno
2011-01-01
The major human soil-transmitted helminths (STH), Ascaris lumbricoides, hookworms (Necator americanus and Ancylostoma duodenale) and Trichuris trichiura have a marked impact on human health in many parts of the world. Current efforts to control these parasites rely predominantly on periodic mass administration of anthelmintic drugs to school age children and other at-risk groups. After many years of use of these same drugs for controlling roundworms in livestock, high levels of resistance have developed, threatening the sustainability of these livestock industries in some locations. Hence, the question arises as to whether this is likely to also occur in the human STH, thereby threatening our ability to control these parasites. This is particularly important because of the recent increase in mass control programmes, relying almost exclusively on benzimidazole anthelmintics. It will be important to ensure that resistance is detected as it emerges in order to allow the implementation of mitigation strategies, such as use of drug combinations, to ensure that the effectiveness of the few existing anthelmintic drugs is preserved. In this review we address these issues by firstly examining the efficacy of anthelmintics against the human STH, and assessing whether there are any indications to date that resistance has emerged. We then consider the factors that influence the effect of current drug-use patterns in selecting for resistant parasite populations. We describe the tools currently available for resistance monitoring (field-based coprological methods), and those under development (in vitro bioassays and molecular tests), and highlight confounding factors that need to be taken into account when interpreting such resistance-monitoring data. We then highlight means to ensure that the currently available tools are used correctly, particularly with regard to study design, and we set appropriate drug-efficacy thresholds. Finally, we make recommendations for monitoring drug efficacy in the field, as components of control programmes, in order to maximise the ability to detect drug resistance, and if it arises to change control strategy and prevent the spread of resistance. PMID:24533260
Aerodynamic Classification of Swept-Wing Ice Accretion
NASA Technical Reports Server (NTRS)
Broeren, Andy; Diebold, Jeff; Bragg, Mike
2013-01-01
The continued design, certification and safe operation of swept-wing airplanes in icing conditions rely on the advancement of computational and experimental simulation methods for higher fidelity results over an increasing range of aircraft configurations and performance, and icing conditions. The current state-of-the-art in icing aerodynamics is mainly built upon a comprehensive understanding of two-dimensional geometries that does not currently exist for fundamentally three-dimensional geometries such as swept wings. The purpose of this report is to describe what is known of iced-swept-wing aerodynamics and to identify the type of research that is required to improve the current understanding. Following the method used in a previous review of iced-airfoil aerodynamics, this report proposes a classification of swept-wing ice accretion into four groups based upon unique flowfield attributes. These four groups are: ice roughness, horn ice, streamwise ice, and spanwise-ridge ice. For all of the proposed ice-shape classifications, relatively little is known about the three-dimensional flowfield and even less about the effect of Reynolds number and Mach number on these flowfields. The classifications and supporting data presented in this report can serve as a starting point as new research explores swept-wing aerodynamics with ice shapes. As further results are available, it is expected that these classifications will need to be updated and revised.
28 CFR 35.139 - Direct threat.
Code of Federal Regulations, 2011 CFR
2011-07-01
... public entity must make an individualized assessment, based on reasonable judgment that relies on current medical knowledge or on the best available objective evidence, to ascertain: the nature, duration, and...
Current deflection NDE for pipeline inspection and monitoring
NASA Astrophysics Data System (ADS)
Jarvis, Rollo; Cawley, Peter; Nagy, Peter B.
2016-02-01
Failure of oil and gas pipelines can often be catastrophic; therefore, routine inspection for time-dependent degradation is essential. In-line inspection is the most common method used; however, this requires the insertion and retrieval of an inspection tool that is propelled by the fluid in the pipe and risks becoming stuck, so alternative methods must often be employed. This work investigates the applicability of a non-destructive evaluation technique for both the detection and growth monitoring of defects, particularly corrosion under insulation. This relies on injecting an electric current along the pipe and indirectly measuring the deflection of current around defects from perturbations in the orthogonal components of the induced magnetic flux density. An array of three orthogonally oriented anisotropic magnetoresistive sensors has been used to measure the magnetic flux density surrounding a 6'' schedule-40 steel pipe carrying 2 A quasi-DC axial current. A finite element model has been developed that predicts the perturbations in magnetic flux density caused by current deflection, which has been validated by experimental results. Measurements of the magnetic flux density at 50 mm lift-off from the pipe surface are stable and repeatable to the order of 100 pT, which suggests that defect detection or monitoring growth of corrosion-type defects may be possible with a feasible magnitude of injected current. Magnetic signals are additionally incurred by changes in the wall thickness of the pipe due to manufacturing tolerances, and material property variations. If a monitoring scheme using baseline subtraction is employed, then the sensitivity to defects can be improved while avoiding false calls.
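The magnitudes quoted above can be put in context with a one-line Ampère's-law estimate: the baseline azimuthal field of the injected axial current at the sensor position, compared against the stated 100 pT repeatability. The pipe outer diameter (roughly 168 mm for 6'' schedule-40) is a standard value, not taken from the paper.

```python
import numpy as np

# Back-of-the-envelope scale check for the measurement described above:
# baseline azimuthal field of an axial current, from Ampere's law,
# versus the quoted 100 pT measurement stability.
mu0 = 4e-7 * np.pi       # vacuum permeability (T m / A)
I = 2.0                  # injected quasi-DC current (A)
r_pipe = 0.168 / 2       # 6" schedule-40 outer radius, ~84 mm (standard value)
lift_off = 0.050         # sensor lift-off (m)
r = r_pipe + lift_off

B = mu0 * I / (2 * np.pi * r)   # azimuthal flux density at the sensor
print(f"baseline field: {B * 1e6:.2f} uT")          # ~3 uT
print(f"resolvable fraction: {100e-12 / B:.1e}")    # ~3e-5 of baseline
```

On this estimate the sensors resolve changes of a few parts in 10^5 of the baseline field, which is why small deflections of the injected current around defects are plausibly detectable.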
ParticleCall: A particle filter for base calling in next-generation sequencing systems
2012-01-01
Background Next-generation sequencing systems are capable of rapid and cost-effective DNA sequencing, thus enabling routine sequencing tasks and taking us one step closer to personalized medicine. Accuracy and lengths of their reads, however, are yet to surpass those provided by the conventional Sanger sequencing method. This motivates the search for computationally efficient algorithms capable of reliable and accurate detection of the order of nucleotides in short DNA fragments from the acquired data. Results In this paper, we consider Illumina’s sequencing-by-synthesis platform, which relies on reversible terminator chemistry, and describe the acquired signal by reformulating its mathematical model as a Hidden Markov Model. Relying on this model and sequential Monte Carlo methods, we develop a parameter estimation and base calling scheme called ParticleCall. ParticleCall is tested on a data set obtained by sequencing phiX174 bacteriophage using Illumina’s Genome Analyzer II. The results show that the developed base calling scheme is significantly more computationally efficient than the best performing unsupervised method currently available, while achieving the same accuracy. Conclusions The proposed ParticleCall provides more accurate calls than Illumina’s base calling algorithm, Bustard. At the same time, ParticleCall is significantly more computationally efficient than other recent schemes with similar performance, rendering it more feasible for high-throughput sequencing data analysis. Improvement of base calling accuracy will have immediate beneficial effects on the performance of downstream applications such as SNP and genotype calling. ParticleCall is freely available at https://sourceforge.net/projects/particlecall. PMID:22776067
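To make the sequential Monte Carlo machinery concrete, here is a generic bootstrap particle filter for a hidden Markov model. It is a minimal sketch, not ParticleCall itself: the `init`, `transition`, and `likelihood` functions and the toy linear-Gaussian example stand in for the paper's reversible-terminator signal model.

```python
import numpy as np

# Generic bootstrap particle filter: propagate, weight, resample, estimate.
rng = np.random.default_rng(0)

def particle_filter(observations, n_particles, init, transition, likelihood):
    particles = init(n_particles)
    estimates = []
    for y in observations:
        particles = transition(particles)            # propagate hidden state
        w = likelihood(y, particles)                 # weight by observation
        w /= w.sum()
        idx = rng.choice(n_particles, n_particles, p=w)   # resample
        particles = particles[idx]
        estimates.append(particles.mean())           # posterior-mean estimate
    return np.array(estimates)

# Toy linear-Gaussian model (stand-in for the base-calling signal model).
xs = particle_filter(
    observations=np.array([0.2, 0.4, 0.9, 1.1]),
    n_particles=500,
    init=lambda n: rng.normal(0.0, 1.0, n),
    transition=lambda p: 0.9 * p + rng.normal(0.0, 0.1, p.shape),
    likelihood=lambda y, p: np.exp(-0.5 * ((y - p) / 0.2) ** 2),
)
print(xs)
```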
Toward Understanding the Heterogeneity in OCD: Evidence from narratives in adult patients
Van Schalkwyk, Gerrit I; Bhalla, Ish P; Griepp, Matthew; Kelmendi, Benjamin; Davidson, Larry; Pittenger, Christopher
2015-01-01
Background Current attempts at understanding the heterogeneity in OCD have relied on quantitative methods. The results of such work point towards a dimensional structure for OCD. Existing qualitative work in OCD has focused on understanding specific aspects of the OCD experience in greater depth. However, qualitative methods are also of potential value in furthering our understanding of OCD heterogeneity by allowing for open-ended exploration of the OCD experience and correlating identified subtypes with patient narratives. Aims We explored variations in patients’ experience prior to, during, and immediately after performing their compulsions. Method Semi-structured interviews were conducted with 20 adults with OCD, followed by inductive thematic analysis. Participant responses were not analyzed within the context of an existing theoretical framework, and themes were labeled descriptively. Results The previously described dichotomy of ‘anxiety’ versus ‘incompleteness’ emerged organically during narrative analysis. In addition, we found that some individuals with OCD utilize their behaviors as a way to cope with stress and anxiety more generally. Other participants did not share this experience and denied finding any comfort in their OC behaviors. The consequences of attention difficulties were highlighted, with some participants describing how difficulty focusing on a task could influence the need for it to be repeated multiple times. Conclusions The extent to which patients use OCD as a coping mechanism is a relevant distinction with potential implications for treatment engagement. Patients may experience ambivalence about suppressing behaviors that they have come to rely upon for management of stress and anxiety, even if these behaviors represent symptoms of a psychiatric illness. PMID:25855685
Microencapsulation and Electrostatic Processing Method
NASA Technical Reports Server (NTRS)
Morrison, Dennis R. (Inventor); Mosier, Benjamin (Inventor)
2000-01-01
Methods are provided for forming spherical multilamellar microcapsules having alternating hydrophilic and hydrophobic liquid layers, surrounded by flexible, semi-permeable hydrophobic or hydrophilic outer membranes which can be tailored specifically to control the diffusion rate. The methods of the invention rely on low shear mixing and liquid-liquid diffusion process and are particularly well suited for forming microcapsules containing both hydrophilic and hydrophobic drugs. These methods can be carried out in the absence of gravity and do not rely on density-driven phase separation, mechanical mixing or solvent evaporation phases. The methods include the process of forming, washing and filtering microcapsules. In addition, the methods contemplate coating microcapsules with ancillary coatings using an electrostatic field and free fluid electrophoresis of the microcapsules. The microcapsules produced by such methods are particularly useful in the delivery of pharmaceutical compositions.
Wiuf, Carsten; Schaumburg-Müller Pallesen, Jonatan; Foldager, Leslie; Grove, Jakob
2016-08-01
In many areas of science it is customary to perform many, potentially millions, of tests simultaneously. To gain statistical power it is common to group tests based on a priori criteria such as predefined regions or by sliding windows. However, it is not straightforward to choose grouping criteria and the results might depend on the chosen criteria. Methods that summarize, or aggregate, test statistics or p-values without relying on a priori criteria are therefore desirable. We present a simple method to aggregate a sequence of stochastic variables, such as test statistics or p-values, into fewer variables without assuming a priori defined groups. We provide different ways to evaluate the significance of the aggregated variables based on theoretical considerations and resampling techniques, and show that under certain assumptions the FWER is controlled in the strong sense. Validity of the method was demonstrated using simulations and real data analyses. Our method may be a useful supplement to standard procedures relying on evaluation of test statistics individually. Moreover, by being agnostic and not relying on predefined selected regions, it might be a practical alternative to conventionally used methods of aggregation of p-values over regions. The method is implemented in Python and freely available online (through GitHub, see the Supplementary information).
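A minimal illustration of this kind of group-free aggregation, assuming independent tests, is to combine adjacent p-values with Fisher's method. This is a simplified stand-in for the authors' scheme, not their code.

```python
import numpy as np
from scipy.stats import chi2

# Aggregate a sequence of p-values into half as many by combining adjacent
# pairs with Fisher's method (valid under independence; illustrative only).
def aggregate_pairs(pvals):
    pvals = np.asarray(pvals, dtype=float)
    if len(pvals) % 2:                       # pad odd-length sequences
        pvals = np.append(pvals, 1.0)
    pairs = pvals.reshape(-1, 2)
    stat = -2 * np.log(pairs).sum(axis=1)    # Fisher combination statistic
    return chi2.sf(stat, df=4)               # chi^2 with 2k = 4 dof

p = np.array([0.80, 0.52, 0.003, 0.01, 0.47, 0.91])
print(aggregate_pairs(p))   # the middle pair aggregates to a small p-value
```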
A novel approach to enhance antibody sensitivity and specificity by peptide cross-linking
Namiki, Takeshi; Valencia, Julio C.; Hall, Matthew D.; Hearing, Vincent J.
2008-01-01
Most current techniques employed to improve antigen-antibody signals in western blotting and in immunohistochemistry rely on sample processing prior to staining (e.g. microwaving) or on the use of a more robust reporter (e.g. a secondary antibody with biotin-streptavidin). We have developed and optimized a new approach intended to stabilize the complexes formed between antigens and their respective primary antibodies by cupric ions at high pH. This technique improves the affinity and lowers cross-reactivity with non-specific bands for ∼20% of antibodies tested (5/25). Here we report that this method can enhance antigen-antibody specificity and can improve the utility of some poorly reactive primary antibodies. PMID:18801330
Shu, Lisa L; Mazar, Nina; Gino, Francesca; Ariely, Dan; Bazerman, Max H
2012-09-18
Many written forms required by businesses and governments rely on honest reporting. Proof of honest intent is typically provided through a signature at the end of, e.g., tax returns or insurance policy forms. Still, people sometimes cheat to advance their financial self-interests, at great cost to society. We test an easy-to-implement method to discourage dishonesty: signing at the beginning rather than at the end of a self-report, thereby reversing the order of the current practice. Using laboratory and field experiments, we find that signing before, rather than after, the opportunity to cheat makes ethics salient when they are needed most and significantly reduces dishonesty.
Uncovering text mining: A survey of current work on web-based epidemic intelligence
Collier, Nigel
2012-01-01
Real world pandemics such as SARS 2002, as well as popular fiction like the movie Contagion, graphically depict the health threat of a global pandemic and the key role of epidemic intelligence (EI). While EI relies heavily on established indicator sources, a new class of methods based on event alerting from unstructured digital Internet media is rapidly becoming acknowledged within the public health community. At the heart of automated information gathering systems is a technology called text mining. My contribution here is to provide an overview of the role that text mining technology plays in detecting epidemics and to synthesise my existing research on the BioCaster project. PMID:22783909
The NASA Hydrogen Energy Systems Technology study - A summary
NASA Technical Reports Server (NTRS)
Laumann, E. A.
1976-01-01
This study is concerned with: hydrogen use, alternatives and comparisons, hydrogen production, factors affecting application, and technology requirements. Two scenarios for future use are explained. One is called the reference hydrogen use scenario and assumes continued historic uses of hydrogen along with additional use for coal gasification and liquefaction, consistent with the Ford technical fix baseline (1974) projection. The expanded scenario relies on the nuclear electric economy (1973) energy projection and assumes the addition of limited new uses such as experimental hydrogen-fueled aircraft, some mixing with natural gas, and energy storage by utilities. Current uses and supply of hydrogen are described, and the technological requirements for developing new methods of hydrogen production are discussed.
Statistical Methods Applied to Gamma-ray Spectroscopy Algorithms in Nuclear Security Missions
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fagan, Deborah K.; Robinson, Sean M.; Runkle, Robert C.
2012-10-01
In a wide range of nuclear security missions, gamma-ray spectroscopy is a critical research and development priority. One particularly relevant challenge is the interdiction of special nuclear material, for which gamma-ray spectroscopy supports the goals of detecting and identifying gamma-ray sources. This manuscript examines the existing set of spectroscopy methods, attempts to categorize them by the statistical methods on which they rely, and identifies methods that have yet to be considered. Our examination shows that current methods effectively estimate the effect of counting uncertainty but in many cases do not address larger, significantly more complex sources of decision uncertainty. We thus explore the premise that significantly improving algorithm performance requires greater coupling between the problem physics that drives data acquisition and the statistical methods that analyze such data. Untapped statistical methods, such as Bayesian Model Averaging and hierarchical and empirical Bayes methods, have the potential to reduce decision uncertainty by more rigorously and comprehensively incorporating all sources of uncertainty. We expect that application of such methods will demonstrate progress in meeting the needs of nuclear security missions by improving on the existing numerical infrastructure for which these analyses have not been conducted.
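As a toy illustration of one of the "untapped" methods named above, the sketch below applies Bayesian model averaging to a single gross-count measurement with two candidate hypotheses. The count rates and priors are invented; a real spectroscopic application would average over full spectral models rather than scalar rates.

```python
from scipy.stats import poisson

# Hedged sketch: posterior model probabilities for two explanations of one
# gross-count measurement, background only vs. background plus source.
counts = 142                       # observed gross counts in a dwell
models = {
    "background only":   120.0,    # expected counts under each hypothesis
    "background+source": 150.0,
}
prior = {m: 0.5 for m in models}   # equal prior model probabilities

evidence = {m: poisson.pmf(counts, lam) * prior[m] for m, lam in models.items()}
total = sum(evidence.values())
posterior = {m: e / total for m, e in evidence.items()}
print(posterior)   # posterior model probabilities, usable as BMA weights
```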
Saucedo-Espinosa, Mario A.; Lapizco-Encinas, Blanca H.
2016-01-01
Current monitoring is a well-established technique for the characterization of electroosmotic (EO) flow in microfluidic devices. This method relies on monitoring the time response of the electric current when a test buffer solution is displaced by an auxiliary solution using EO flow. In this scheme, each solution has a different ionic concentration (and electric conductivity). The difference in the ionic concentration of the two solutions defines the dynamic time response of the electric current and, hence, the current signal to be measured: larger concentration differences result in larger measurable signals. A small concentration difference is needed, however, to avoid dispersion at the interface between the two solutions, which can result in undesired pressure-driven flow that conflicts with the EO flow. Additional challenges arise as the conductivity of the test solution decreases, leading to a reduced electric current signal that may be masked by noise during the measuring process, making it difficult to estimate the EO mobility accurately. This contribution presents a new scheme for current monitoring that employs multiple channels arranged in parallel, producing an increase in the signal-to-noise ratio of the electric current to be measured and increasing the estimation accuracy. The use of this parallel approach is particularly useful for estimating the EO mobility in systems where low-conductivity mediums are required, such as insulator-based dielectrophoresis devices. PMID:27375813
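For context, the standard current-monitoring estimate works as follows: the electroosmotic velocity is the channel length divided by the time for the current to reach its new plateau (the displacement time), and the mobility follows from v = μ_eo·E with E = V/L. The numbers below are illustrative, not from the paper.

```python
# Worked example of the standard current-monitoring mobility estimate.
L = 0.01          # channel length (m)
V = 1000.0        # applied voltage (V)
dt = 25.0         # measured current-plateau (displacement) time (s)

v_eo = L / dt                 # electroosmotic velocity (m/s)
mu_eo = v_eo * L / V          # mu = v / E = v * L / V   (m^2 / (V s))
print(f"mu_eo = {mu_eo:.2e} m^2/(V s)")   # 4.00e-09 for these numbers
```

The paper's contribution sits on top of this relation: averaging the current signal over parallel channels raises the signal-to-noise ratio of the plateau-time measurement, and hence of μ_eo.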
Gubhaju, Bina
2009-12-01
The association between education level and fertility, contraceptive behavior and method choice has been extensively researched, but little is known about how the education differential between husbands and wives in Nepal may influence the choice of specific methods. Data collected from currently married, nonpregnant women aged 15-49 in the Nepal Demographic and Health Surveys of 1996, 2001 and 2006 were analyzed to identify shifts in the education levels of husbands and wives and the influence of those shifts on couples' current contraceptive method use over the past decade. Multinomial logistic regression models assessed associations between method choice and each partner's education level, the education differential between partners and a combined education measure. Although the wife's education level was associated with the type of method used by the couple, the husband's education level had more influence on the use of male sterilization and condoms. For example, men with any secondary or higher education were more likely than those with none to rely on either of these methods (relative risk ratios, 1.6-2.1). Furthermore, couples in which the husband had at least six more years of education than the wife also showed increased reliance on male sterilization or condoms (1.6-1.8). Differences in the use of any method of family planning by education level have narrowed considerably in the past decade, although differentials remain in the use of some methods. A better understanding of how wives' and husbands' relative educational attainment affects decisions on their contraceptive choices is needed, particularly when both education levels and contraceptive use are increasing.
Code of Federal Regulations, 2011 CFR
2011-04-01
... the health or safety of others, the agency must make an individualized assessment, based on reasonable judgment that relies on current medical knowledge or on the best available objective evidence to ascertain...
Code of Federal Regulations, 2010 CFR
2010-04-01
... the health or safety of others, the agency must make an individualized assessment, based on reasonable judgment that relies on current medical knowledge or on the best available objective evidence to ascertain...
Demand for Long-Term Care Insurance in China.
Wang, Qun; Zhou, Yi; Ding, Xinrui; Ying, Xiaohua
2017-12-22
The aim of this study was to estimate willingness to pay (WTP) for long-term care insurance (LTCI) and to explore the determinants of demand for LTCI in China. We collected data from a household survey conducted in Qinghai and Zhejiang on a sample of 1842 households. We relied on contingent valuation methods to elicit the demand for LTCI and random effects logistic regression to analyze the factors associated with the demand for LTCI. Complementarily, we used document analysis to compare the LTCI designed in this study and the current LTCI policies in the pilot cities. More than 90% of the respondents expressed their willingness to buy LTCI. The median WTP for LTCI was estimated at 370.14 RMB/year, accounting for 2.29% of average annual per capita disposable income. Price, age, education status, and income were significantly associated with demand for LTCI. Most pilot cities were found to mainly rely on Urban Employees Basic Medical Insurance funds as the financing source for LTCI. Considering that financing is one of the greatest challenges in the development of China's LTCI, we suggest that policy makers consider individual contribution as an important and possible option as a source of financing for LTCI.
Kocot, Kevin M; Citarella, Mathew R; Moroz, Leonid L; Halanych, Kenneth M
2013-01-01
Molecular phylogenetics relies on accurate identification of orthologous sequences among the taxa of interest. Most orthology inference programs available for use in phylogenomics rely on small sets of pre-defined orthologs from model organisms or phenetic approaches such as all-versus-all sequence comparisons followed by Markov graph-based clustering. Such approaches have high sensitivity but may erroneously include paralogous sequences. We developed PhyloTreePruner, a software utility that uses a phylogenetic approach to refine orthology inferences made using phenetic methods. PhyloTreePruner checks single-gene trees for evidence of paralogy and generates a new alignment for each group containing only sequences inferred to be orthologs. Importantly, PhyloTreePruner takes into account support values on the tree and avoids unnecessarily deleting sequences in cases where a weakly supported tree topology incorrectly indicates paralogy. A test of PhyloTreePruner on a dataset generated from 11 completely sequenced arthropod genomes identified 2,027 orthologous groups sampled for all taxa. Phylogenetic analysis of the concatenated supermatrix yielded a generally well-supported topology that was consistent with the current understanding of arthropod phylogeny. PhyloTreePruner is freely available from http://sourceforge.net/projects/phylotreepruner/.
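The pruning logic can be sketched as follows, assuming leaf names of the form `taxon|sequence_id`. This is a simplified Python illustration of the idea using ete3, not the released PhyloTreePruner code, and the support cutoff is a placeholder.

```python
from ete3 import Tree

# Simplified sketch of support-aware paralog pruning (illustrative only).
SUPPORT_CUTOFF = 0.9

def prune_paralogs(newick):
    tree = Tree(newick)
    # 1. Collapse weakly supported internal nodes into polytomies, so a
    #    shaky topology cannot on its own be taken as evidence of paralogy.
    weak = [n for n in tree.traverse()
            if not n.is_leaf() and not n.is_root()
            and n.support < SUPPORT_CUTOFF]
    for node in weak:
        node.delete()
    # 2. Keep the largest subtree containing at most one sequence per taxon.
    best = []
    for node in tree.traverse():
        leaves = node.get_leaf_names()
        taxa = [name.split("|")[0] for name in leaves]
        if len(taxa) == len(set(taxa)) and len(leaves) > len(best):
            best = leaves          # all taxa unique: putative orthologs
    return best

newick = "((A|1:1,B|1:1)0.95:1,((A|2:1,C|1:1)0.99:1,B|2:1)0.40:1);"
print(prune_paralogs(newick))      # e.g. ['A|1', 'B|1']
```

Step 1 is what distinguishes this style of pruning from naive tree-based filtering: a low-support bipartition is collapsed before it can trigger the deletion of sequences.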
2013-01-01
Influenza virus-like particle (VLP) vaccines are one of the most promising ways to respond to the threat of future influenza pandemics. VLPs are composed of viral antigens but lack nucleic acids, making them non-infectious and limiting the risk of recombination with wild-type strains. By taking advantage of advancements in cell culture technologies, the process from strain identification to manufacturing has the potential to be completed rapidly and easily at large scales. A close review of current research on influenza VLPs makes it evident that the development of quantification methods has been consistently overlooked. VLP quantification at all stages of the production process has been left to rely on current influenza quantification methods (e.g., the hemagglutination assay (HA), the single radial immunodiffusion assay (SRID), NA enzymatic activity assays, Western blot, and electron microscopy). These are analytical methods developed decades ago for influenza virions and final bulk influenza vaccines. Although these methods are time-consuming and cumbersome, they have been sufficient for the characterization of final purified material. Nevertheless, these analytical methods are impractical for in-line process monitoring because VLP concentrations in crude samples generally fall outside their range of detection. This consequently impedes the development of robust influenza-VLP production and purification processes. Thus, the development of functional process analytical techniques, applicable at every stage during production and compatible with different production platforms, is greatly needed to assess, optimize and exploit the full potential of novel manufacturing platforms. PMID:23642219
Changes in Adult Child Caregiver Networks
ERIC Educational Resources Information Center
Szinovacz, Maximiliane E.; Davey, Adam
2007-01-01
Purpose: Caregiving research has typically relied on cross-sectional data that focus on the primary caregiver. This approach neglects the dynamic and systemic character of caregiver networks. Our analyses addressed changes in adult child care networks over a 2-year period. Design and Methods: The study relied on pooled data from Waves 1 through 5…
Federal Register 2010, 2011, 2012, 2013, 2014
2013-01-16
..., scientific data or information relied on to support the adequacy of water treatment methods, treatment monitoring results, water testing results, and scientific data or information relied on to support any... recommendations in the Sprout Guides to test spent irrigation water; several comments supported expanded testing...
A Novel Approach to Rotorcraft Damage Tolerance
NASA Technical Reports Server (NTRS)
Forth, Scott C.; Everett, Richard A.; Newman, John A.
2002-01-01
Damage-tolerance methodology is positioned to replace safe-life methodologies for designing rotorcraft structures. The argument for implementing a damage-tolerance method comes from the fundamental fact that rotorcraft structures typically fail by fatigue cracking. Therefore, if technology permits prediction of fatigue-crack growth in structures, a damage-tolerance method should deliver the most accurate prediction of component life. Implementing damage-tolerance (DT) into high-cycle-fatigue (HCF) components will require a shift from traditional DT methods that rely on detecting an initial flaw with nondestructive inspection (NDI) methods. The rapid accumulation of cycles in a HCF component will result in a design based on a traditional DT method that is either impractical because of frequent inspections, or because the design will be too heavy to operate efficiently. Furthermore, once a HCF component develops a detectable propagating crack, the remaining fatigue life is short, sometimes less than one flight hour, which does not leave sufficient time for inspection. Therefore, designing a HCF component will require basing the life analysis on an initial flaw that is undetectable with current NDI technology.
A Case for Sustainability Pedagogical Content Knowledge in Multicultural Teacher Education
ERIC Educational Resources Information Center
Perry, Robin K.
2013-01-01
If education is going to offer a remedy for rather than exacerbate the problem of the ecological and cultural crisis currently being faced, teacher learning must be at the forefront of the discussion. Current efforts to educate for sustainability rely upon teachers who are knowledgeable, skilled, and committed agents for change. The same is true…
Female College Students' Perceptions of Organ Donation
ERIC Educational Resources Information Center
Boland, Kathleen; Baker, Kerrie
2010-01-01
The current process of organ donation in the U.S. relies on the premise of altruism or voluntary consent. Yet, human organs available for donation and transplant do not meet current demands. The literature has suggested that college students, who represent a large group of potential healthy organ donors, often are not part of donor pools. Before…
The Impact of Current Economic Crisis on Community Colleges
ERIC Educational Resources Information Center
Okpala, Comfort O.; Hopson, Linda; Okpala, Amon O.
2011-01-01
The focus of the study was to examine the impact of the recession on (1) community college funding, (2) community college student support services, and (3) on student enrollment. This study relied on data from document analysis and interview of community college personnel and students. The current crisis has resulted in a steep budget reduction to…
Selecting a Superintendent in a Tight Market: How the Current Superintendent Can Help
ERIC Educational Resources Information Center
Kersten, Thomas
2009-01-01
Selecting a new district leader is always a challenge for school board members. A poor decision can lead to difficulties for everyone associated with the school district including the newly appointed superintendent. By relying on the wisdom and experience of the current superintendent, boards of education enhance their chances of selecting the…
Controlled longitudinal emittance blow-up using band-limited phase noise in CERN PSB
NASA Astrophysics Data System (ADS)
Quartullo, D.; Shaposhnikova, E.; Timko, H.
2017-07-01
Controlled longitudinal emittance blow-up (from 1 eVs to 1.4 eVs) for LHC beams in the CERN PS Booster is currently achieved using sinusoidal phase modulation of a dedicated high-harmonic RF system. In 2021, after the LHC injectors upgrade, 3 eVs should be extracted to the PS. Although the current method may satisfy the new requirements, it relies on improvements in the low-level RF system. In this paper another method of blow-up was considered, namely the injection of band-limited phase noise into the main RF system (h=1), never tried in the PSB but already used in the CERN SPS and LHC under different conditions (longer cycles). This technique, which lowers the peak line density and therefore the impact of intensity effects in the PSB and the PS, can also be complementary to the present method. The longitudinal space charge, dominant in the PSB, causes significant synchrotron frequency shifts with intensity, and its effect should be taken into account. Another complication arises from the interaction of the phase loop with the injected noise, since both act on the RF phase. All these elements were studied in simulations of the PSB cycle with the BLonD code, and the required blow-up was achieved.
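A band-limited phase-noise waveform of the kind described can be generated by band-pass filtering white noise around the synchrotron-frequency band, as sketched below. The band edges, sample rate, and RMS amplitude are placeholders, not PSB parameters.

```python
import numpy as np
from scipy.signal import butter, filtfilt

# Sketch: band-limited phase noise for injection into the h=1 RF phase.
fs = 10_000.0                # sample rate of the noise waveform (Hz)
f_lo, f_hi = 600.0, 900.0    # placeholder band around the synchrotron freq.
rms_deg = 1.0                # desired RMS phase amplitude (degrees)

rng = np.random.default_rng(1)
white = rng.normal(size=100_000)
b, a = butter(4, [f_lo / (fs / 2), f_hi / (fs / 2)], btype="bandpass")
phase = filtfilt(b, a, white)        # zero-phase band-pass filtered noise
phase *= rms_deg / phase.std()       # scale to the requested RMS amplitude
print(phase[:5])
```

In practice the band must track the intensity-shifted synchrotron frequency through the cycle, which is one reason the space-charge-induced frequency shifts mentioned above matter.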
Searching for the full symphony of black hole binary mergers
NASA Astrophysics Data System (ADS)
Harry, Ian; Bustillo, Juan Calderón; Nitz, Alex
2018-01-01
Current searches for the gravitational-wave signature of compact binary mergers rely on matched-filtering data from interferometric observatories with sets of modeled gravitational waveforms. These searches currently use model waveforms that do not include the higher-order mode content of the gravitational-wave signal. Higher-order modes are important for many compact binary mergers and their omission reduces the sensitivity to such sources. In this work we explore the sensitivity loss incurred from omitting higher-order modes. We present a new method for searching for compact binary mergers using waveforms that include higher-order mode effects, and evaluate the sensitivity increase that using our new method would allow. We find that, when evaluating sensitivity at a constant false-alarm rate, and when including the fact that signal-consistency tests can reject some signals that include higher-order mode content, we observe a sensitivity increase of up to a factor of 2 in volume for high mass ratio, high total-mass systems. For systems with equal mass, or with total mass ~50 M⊙, we see more modest sensitivity increases, <10%, which indicates that the existing search is already performing well. Our new search method is also directly applicable in searches for generic compact binaries.
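The core operation of such searches is the frequency-domain matched filter, ρ(t) = 4 Re ∫ s̃(f) h̃*(f) / S_n(f) e^{2πift} df. The sketch below implements this with FFTs on toy data; the template, injected signal, and flat noise PSD are invented, and real searches add higher-order-mode templates, PSD estimation, and signal-consistency tests on top of this.

```python
import numpy as np

# Frequency-domain matched filter on toy data: a 100 Hz Gaussian-envelope
# template injected at half amplitude into unit-variance white noise.
fs = 4096.0
dt = 1.0 / fs
t = np.arange(0, 4, dt)
template = np.sin(2 * np.pi * 100 * t) * np.exp(-((t - 2.0) ** 2) / 0.01)

rng = np.random.default_rng(2)
data = 0.5 * np.roll(template, 1000) + rng.normal(0.0, 1.0, t.size)

stilde = np.fft.rfft(data) * dt          # continuous-convention spectra
htilde = np.fft.rfft(template) * dt
psd = np.full(htilde.size, 2.0 * dt)     # one-sided PSD of unit-variance noise

df = fs / t.size
rho = 2.0 * fs * np.fft.irfft(stilde * htilde.conj() / psd)  # 4 Re sum via irfft
sigma = np.sqrt(4.0 * df * np.sum(np.abs(htilde) ** 2 / psd))
snr = np.abs(rho) / sigma
print(snr.argmax(), snr.max())           # peak near sample 1000, SNR ~ 8
```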
Evaluating groundwater flow using passive electrical measurements
NASA Astrophysics Data System (ADS)
Voytek, E.; Revil, A.; Singha, K.
2016-12-01
Accurate quantification of groundwater flow patterns, both in magnitude and direction, is a necessary component of evaluating any hydrologic system. Groundwater flow patterns are often determined using a dense network of wells or piezometers, which can be limited by logistical or regulatory constraints. The self-potential (SP) method, a passive geophysical technique that relies on currents generated by water movement through porous materials, is a re-emerging alternative or addition to traditional piezometer networks. Naturally generated currents can be measured as voltage differences at the ground surface using only two electrodes, or a more complex electrode array. While the association between SP measurements and groundwater flow was observed as early as the 1890s, the method has seen a resurgence in hydrology since the governing equations were refined in the 1980s. The method can be used to analyze hydrologic processes at various temporal and spatial scales. Here we present the results of multiple SP surveys collected at multiple scales (1 to 10s of meters). Single SP grid surveys are used to evaluate flow patterns through arctic hillslopes at a discrete point in time. Additionally, a coupled groundwater and electrical model is used to analyze multiple SP data sets to evaluate seasonal changes in groundwater flow through an alpine meadow.
Masked Ballot Voting for Receipt-Free Online Elections
NASA Astrophysics Data System (ADS)
Wen, Roland; Buckland, Richard
To prevent bribery and coercion attacks on voters, current online election schemes rely on strong physical assumptions during the election. We introduce Masked Ballot, an online voting scheme that mitigates these attacks while using a more practical assumption: untappable channels are available but only before the election. During the election voters cast ballots over completely public channels without relying on untappable channels, anonymous channels or trusted devices. Masked Ballot performs only the voting part of an election and is designed to integrate with counting schemes that compute the final election result.
Development of a fast and efficient method for hepatitis A virus concentration from green onion.
Zheng, Yan; Hu, Yuan
2017-11-01
Hepatitis A virus (HAV) can cause serious liver disease and even death. HAV outbreaks are associated with the consumption of raw or minimally processed produce, making it a major public health concern. Infections have occurred despite the fact that an effective HAV vaccine is available. Development of a rapid and sensitive HAV detection method is necessary for the investigation of an HAV outbreak. Detection of HAV is complicated by the lack of a reliable culture method. In addition, due to the low infectious dose of HAV, these methods must be very sensitive. Current methods rely on efficient sample preparation and concentration steps followed by sensitive molecular detection techniques. Using green onions, which have been implicated in several recent HAV outbreaks, as a representative produce, this study developed a method of capturing virus particles with carboxyl-derivatized magnetic beads. Carboxyl beads, like antibody-coated beads or cationic beads, detect HAV at a level as low as 100 pfu/25 g of green onions. RNA from virus concentrated in this manner can be released by heat shock (98°C for 5 min) for molecular detection without sacrificing sensitivity. Bypassing the RNA extraction procedure saves time and removes multiple manipulation steps, which makes large-scale HAV screening possible. In addition, the inclusion of beef extract and pectinase rather than NP40 in the elution buffer improved HAV liberation from the food matrix by nearly 10-fold over current methods. The method proposed in this study provides a promising tool to improve food risk assessment and protect public health. Published by Elsevier B.V.
Atmospheric-pressure ionization and fragmentation of peptides by solution-cathode glow discharge
DOE Office of Scientific and Technical Information (OSTI.GOV)
Schwartz, Andrew J.; Shelley, Jacob T.; Walton, Courtney L.
Modern “-omics” (e.g., proteomics, glycomics, metabolomics, etc.) analyses rely heavily on electrospray ionization and tandem mass spectrometry to determine the structural identity of target species. Unfortunately, these methods are limited to specialized mass spectrometry instrumentation. Here, a novel approach is described that enables ionization and controlled, tunable fragmentation of peptides at atmospheric pressure. In the new source, a direct-current plasma is sustained between a tapered metal rod and a flowing sample-containing solution. As the liquid stream contacts the electrical discharge, peptides from the solution are volatilized, ionized, and fragmented. At high discharge currents (e.g., 70 mA), electrospray-like spectra are observed, dominated by singly and doubly protonated molecular ions. At lower currents (35 mA), many peptides exhibit extensive fragmentation, with a-, b-, c-, x-, and y-type ion series present as well as complex fragments, such as d-type ions, not previously observed with atmospheric-pressure dissociation. Though the mechanism of fragmentation is currently unclear, observations indicate it could result from the interaction of peptides with gas-phase radicals or ultraviolet radiation generated within the plasma.
Boland, Mary Regina; Jacunski, Alexandra; Lorberbaum, Tal; Romano, Joseph D; Moskovitch, Robert; Tatonetti, Nicholas P
2016-01-01
Small molecules are indispensable to modern medical therapy. However, their use may lead to unintended, negative medical outcomes commonly referred to as adverse drug reactions (ADRs). These effects vary widely in mechanism, severity, and populations affected, making ADR prediction and identification important public health concerns. Current methods rely on clinical trials and postmarket surveillance programs to find novel ADRs; however, clinical trials are limited by small sample size, whereas postmarket surveillance methods may be biased and inherently leave patients at risk until sufficient clinical evidence has been gathered. Systems pharmacology, an emerging interdisciplinary field combining network and chemical biology, provides important tools to uncover and understand ADRs and may mitigate the drawbacks of traditional methods. In particular, network analysis allows researchers to integrate heterogeneous data sources and quantify the interactions between biological and chemical entities. Recent work in this area has combined chemical, biological, and large-scale observational health data to predict ADRs in both individual patients and global populations. In this review, we explore the rapid expansion of systems pharmacology in the study of ADRs. We enumerate the existing methods and strategies and illustrate progress in the field with a model framework that incorporates crucial data elements, such as diet and comorbidities, known to modulate ADR risk. Using this framework, we highlight avenues of research that may currently be underexplored, representing opportunities for future work. © 2015 Wiley Periodicals, Inc.
Current and Future Technologies for Microbiological Decontamination of Cereal Grains.
Los, Agata; Ziuzina, Dana; Bourke, Paula
2018-06-01
Cereal grains are the most important staple foods for mankind worldwide. Constantly increasing annual production and yields are matched by demand for cereals, which is expected to rise drastically with global population growth. A critical food safety and quality issue is to minimize the microbiological contamination of grains, as it affects cereals both quantitatively and qualitatively. Microorganisms present in cereals can affect the safety, quality, and functional properties of grains. Some molds have the potential to produce harmful mycotoxins and pose a serious health risk for consumers. Therefore, it is essential to reduce cereal grain contamination to the minimum to ensure safety for both human and animal consumption. Current production of cereals relies heavily on pesticide inputs; however, numerous harmful effects on human health and the environment highlight the need for more sustainable pest management and agricultural methods. This review evaluates microbiological risks, as well as currently used and potential technologies for microbiological decontamination of cereal grains. © 2018 Institute of Food Technologists®.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Agustsson, Ronald
In this project, RadiaBeam Technologies was tasked with developing a novel solution for cost-effective quench protection based on fast expansion of the normal zone. By inductively coupling a strong electromagnetic pulse via a resonant LC circuit, we attempted to demonstrate accelerated normal-zone propagation. The AC field induces currents in the superconducting layer with a current density exceeding the critical current density, Jc. This creates a large normal zone, uniformly distributing the dissipation through the magnet body. The method does not rely on thermal heating of the conductor, thus enabling nearly instantaneous protection. Through the course of the Phase II project, RadiaBeam Technologies continued extensive numerical modeling of the inductive quench system, re-designed and built several iterations of the proof-of-concept (POC) system for testing, and observed evidence of a transient partial quench being induced. However, the final device was not fabricated. This was a consequence of the fundamentally complex nature of the energy extraction process and the challenges associated even with demonstrating the proof of concept in a benchtop device.
Organic electrochemical transistors for cell-based impedance sensing
NASA Astrophysics Data System (ADS)
Rivnay, Jonathan; Ramuz, Marc; Leleux, Pierre; Hama, Adel; Huerta, Miriam; Owens, Roisin M.
2015-01-01
Electrical impedance sensing of biological systems, especially cultured epithelial cell layers, is now a common technique to monitor cell motion, morphology, and cell layer/tissue integrity for high throughput toxicology screening. Existing methods to measure electrical impedance most often rely on a two electrode configuration, where low frequency signals are challenging to obtain for small devices and for tissues with high resistance, due to low current. Organic electrochemical transistors (OECTs) are conducting polymer-based devices, which have been shown to efficiently transduce and amplify low-level ionic fluxes in biological systems into electronic output signals. In this work, we combine OECT-based drain current measurements with simultaneous measurement of more traditional impedance sensing using the gate current to produce complex impedance traces, which show low error at both low and high frequencies. We apply this technique in vitro to a model epithelial tissue layer and show that the data can be fit to an equivalent circuit model yielding trans-epithelial resistance and cell layer capacitance values in agreement with literature. Importantly, the combined measurement allows for low biases across the cell layer, while still maintaining good broadband signal.
Particle tracing modeling of ion fluxes at geosynchronous orbit
Brito, Thiago V.; Woodroffe, Jesse; Jordanova, Vania K.; ...
2017-10-31
The initial results of a coupled MHD/particle tracing method to evaluate particle fluxes in the inner magnetosphere are presented. This setup is capable of capturing the earthward particle acceleration process resulting from dipolarization events in the tail region of the magnetosphere. During the period of study, the MHD code was able to capture a dipolarization event, and the particle tracing algorithm was able to capture the effects of these disturbances and calculate proton fluxes in the nightside geosynchronous orbit region. The simulation captured dispersionless injections as well as the energy dispersion signatures that are frequently observed by satellites at geosynchronous orbit. Currently, ring current models rely on Maxwellian-type distributions based on either empirical flux values or sparse satellite data for their boundary conditions close to geosynchronous orbit. In spite of some differences in intensity and timing, the setup presented here is able to capture substorm injections, which represents an improvement and offers a way of coupling these ring current models with MHD codes through the use of boundary conditions.
Feng, Jie; Yee, Rebecca; Zhang, Shuo; Tian, Lili; Shi, Wanliang; Zhang, Wen-Hong; Zhang, Ying
2018-01-01
Antibiotic-resistant bacteria have caused huge concerns and demand innovative approaches for their prompt detection. Current antimicrobial susceptibility tests (AST) rely on the growth of the organisms, which takes 1-2 days for fast-growing organisms and several weeks for slow-growing organisms. Here, we show for the first time the utility of the SYBR Green I/propidium iodide (PI) viability assay for rapidly identifying antibiotic resistance in less than 30 min for major antibiotic-resistant, fast-growing bacteria, such as Staphylococcus aureus, Escherichia coli, Klebsiella pneumoniae, and Acinetobacter baumannii, for bactericidal and bacteriostatic agents, and in 16 h for extremely rapid detection of drug resistance to isoniazid and pyrazinamide in slow-growing Mycobacterium tuberculosis. The SYBR Green I/PI assay generated rapid and robust results in concordance with traditional AST methods. This novel growth-independent methodology changes the concept of current growth-based AST and may revolutionize drug susceptibility testing for all cells of prokaryotic and eukaryotic origin and, subject to further clinical validation, may play a major role in saving lives and improving patient outcomes.
TMEM150C/Tentonin3 Is a Regulator of Mechano-gated Ion Channels.
Anderson, Evan O; Schneider, Eve R; Matson, Jon D; Gracheva, Elena O; Bagriantsev, Sviatoslav N
2018-04-17
Neuronal mechano-sensitivity relies on mechano-gated ion channels, but pathways regulating their activity remain poorly understood. TMEM150C was proposed to mediate mechano-activated current in proprioceptive neurons. Here, we studied functional interaction of TMEM150C with mechano-gated ion channels from different classes (Piezo2, Piezo1, and the potassium channel TREK-1) using two independent methods of mechanical stimulation. We found that TMEM150C significantly prolongs the duration of the mechano-current produced by all three channels, decreases apparent activation threshold in Piezo2, and induces persistent current in Piezo1. We also show that TMEM150C is co-expressed with Piezo2 in trigeminal neurons, expanding its role beyond proprioceptors. Finally, we cloned TMEM150C from the trigeminal neurons of the tactile-foraging domestic duck and showed that it functions similarly to the mouse ortholog, demonstrating evolutionary conservation among vertebrates. Our studies reveal TMEM150C as a general regulator of mechano-gated ion channels from different classes. Copyright © 2018 The Author(s). Published by Elsevier Inc. All rights reserved.
Burriel-Valencia, Jordi; Puche-Panadero, Ruben; Martinez-Roman, Javier; Sapena-Bano, Angel; Pineda-Sanchez, Manuel
2018-01-06
The aim of this paper is to introduce a new methodology for the fault diagnosis of induction machines working in the transient regime, when time-frequency analysis tools are used. The proposed method relies on the use of the optimized Slepian window for performing the short time Fourier transform (STFT) of the stator current signal. It is shown that for a given sequence length of finite duration, the Slepian window has the maximum concentration of energy, greater than can be reached with a gated Gaussian window, which is usually used as the analysis window. In this paper, the use and optimization of the Slepian window for fault diagnosis of induction machines is theoretically introduced and experimentally validated through the test of a 3.15-MW induction motor with broken bars during the start-up transient. The theoretical analysis and the experimental results show that the use of the Slepian window can highlight the fault components in the current's spectrogram with a significant reduction of the required computational resources.
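As a rough illustration of the windowing idea in this abstract (not the authors' code), the sketch below computes a spectrogram of a simulated stator current using a Slepian (DPSS) analysis window; the sampling rate, supply frequency, and fault sideband are invented for the example.

```python
# Sketch: STFT of a stator-current signal with a Slepian (DPSS) window.
# Signal parameters (50 Hz supply, 35 Hz "fault" sideband) are assumptions.
import numpy as np
from scipy.signal import stft
from scipy.signal.windows import dpss

fs = 10_000                                 # sampling rate, Hz (assumed)
t = np.arange(0, 5, 1 / fs)                 # 5 s transient (assumed)
current = np.sin(2 * np.pi * 50 * t) + 0.05 * np.sin(2 * np.pi * 35 * t)

nperseg = 4096
# First DPSS (Slepian) sequence with time-bandwidth product NW = 2.5;
# it maximizes energy concentration within the chosen bandwidth.
window = dpss(nperseg, NW=2.5)

f, tau, Zxx = stft(current, fs=fs, window=window, nperseg=nperseg)
spectrogram = np.abs(Zxx) ** 2              # fault sidebands show up here
```

The same call works with a Gaussian window for comparison; the claimed advantage is the Slepian window's tighter energy concentration at a fixed window length.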
Estimation of capacities on Florida freeways.
DOT National Transportation Integrated Search
2014-09-01
Current capacity estimates within Florida's travel time reliability tools rely on the Highway Capacity Manual (HCM 2010) to estimate capacity under various conditions. Field measurements show that the capacities of Florida freeways are noticeably...
Richens, Joanna L; Urbanowicz, Richard A; Lunt, Elizabeth AM; Metcalf, Rebecca; Corne, Jonathan; Fairclough, Lucy; O'Shea, Paul
2009-01-01
Chronic obstructive pulmonary disease (COPD) is a treatable and preventable disease state, characterised by progressive airflow limitation that is not fully reversible. Although COPD is primarily a disease of the lungs there is now an appreciation that many of the manifestations of disease are outside the lung, leading to the notion that COPD is a systemic disease. Currently, diagnosis of COPD relies on largely descriptive measures to enable classification, such as symptoms and lung function. Here the limitations of existing diagnostic strategies of COPD are discussed and systems biology approaches to diagnosis that build upon current molecular knowledge of the disease are described. These approaches rely on new 'label-free' sensing technologies, such as high-throughput surface plasmon resonance (SPR), that we also describe. PMID:19386108
Convolutionless Nakajima-Zwanzig equations for stochastic analysis in nonlinear dynamical systems.
Venturi, D; Karniadakis, G E
2014-06-08
Determining the statistical properties of stochastic nonlinear systems is of major interest across many disciplines. Currently, there are no general efficient methods to deal with this challenging problem that involves high dimensionality, low regularity and random frequencies. We propose a framework for stochastic analysis in nonlinear dynamical systems based on goal-oriented probability density function (PDF) methods. The key idea stems from techniques of irreversible statistical mechanics, and it relies on deriving evolution equations for the PDF of quantities of interest, e.g. functionals of the solution to systems of stochastic ordinary and partial differential equations. Such quantities could be low-dimensional objects in infinite dimensional phase spaces. We develop the goal-oriented PDF method in the context of the time-convolutionless Nakajima-Zwanzig-Mori formalism. We address the question of approximation of reduced-order density equations by multi-level coarse graining, perturbation series and operator cumulant resummation. Numerical examples are presented for stochastic resonance and stochastic advection-reaction problems.
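For orientation, the generic shape of the two reduced-order equations contrasted in this abstract can be sketched as below; the kernel and generator symbols are placeholders of ours, not the authors' notation.

```latex
% Memory (Nakajima--Zwanzig) versus convolutionless form for a reduced
% PDF p(a,t) of a quantity of interest; K, G, f, g are generic symbols.
\begin{align*}
  \frac{\partial p(a,t)}{\partial t} &= \int_0^t \mathcal{K}(t-s)\,p(a,s)\,\mathrm{d}s + f(a,t)
    && \text{(Nakajima--Zwanzig, with memory)}\\
  \frac{\partial p(a,t)}{\partial t} &= \mathcal{G}(t)\,p(a,t) + g(a,t)
    && \text{(time-convolutionless form)}
\end{align*}
```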
Towards the estimation of effect measures in studies using respondent-driven sampling.
Rotondi, Michael A
2014-06-01
Respondent-driven sampling (RDS) is an increasingly common sampling technique to recruit hidden populations. Statistical methods for RDS are not straightforward due to the correlation between individual outcomes and subject weighting; thus, analyses are typically limited to estimation of population proportions. This manuscript applies the method of variance estimates recovery (MOVER) to construct confidence intervals for effect measures such as risk difference (difference of proportions) or relative risk in studies using RDS. To illustrate the approach, MOVER is used to construct confidence intervals for differences in the prevalence of demographic characteristics between an RDS study and convenience study of injection drug users. MOVER is then applied to obtain a confidence interval for the relative risk between education levels and HIV seropositivity and current infection with syphilis, respectively. This approach provides a simple method to construct confidence intervals for effect measures in RDS studies. Since it only relies on a proportion and appropriate confidence limits, it can also be applied to previously published manuscripts.
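To make the MOVER construction concrete, here is a minimal sketch for a difference of two proportions; the recovery-of-variance formulas follow the standard MOVER form, and the input limits are illustrative (in an RDS study they would come from an RDS-specific variance estimator).

```python
# Sketch of MOVER for p1 - p2: combine separate confidence limits for each
# proportion into limits for the difference. Inputs are made-up examples.
import math

def mover_diff(p1, l1, u1, p2, l2, u2):
    """MOVER confidence interval for p1 - p2, recovering variance
    estimates from the individual confidence limits (l, u)."""
    lower = (p1 - p2) - math.sqrt((p1 - l1) ** 2 + (u2 - p2) ** 2)
    upper = (p1 - p2) + math.sqrt((u1 - p1) ** 2 + (p2 - l2) ** 2)
    return lower, upper

# Example: prevalence 0.40 (CI 0.32-0.48) vs 0.25 (CI 0.18-0.33)
print(mover_diff(0.40, 0.32, 0.48, 0.25, 0.18, 0.33))
```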
Biggs, Caroline I; Edmondson, Steve; Gibson, Matthew I
2015-01-01
Carbohydrate arrays are a vital tool in studying infection, probing the mechanisms of bacterial, viral and toxin adhesion and the development of new treatments, by mimicking the structure of the glycocalyx. Current methods rely on the formation of monolayers of carbohydrates that have been chemically modified with a linker, such as an amine, biotin, lipid or thiol, to enable interaction with a functionalised surface. Thiol-addition to gold to form self-assembled monolayers is perhaps the simplest method for immobilisation, as thiolated glycans are readily accessible from reducing carbohydrates in a single step, but it is limited to gold surfaces. Here we have developed a quick and versatile methodology that enables thiolated carbohydrates to be immobilised as monolayers directly onto acrylate-functional glass slides via a 'thiol-ene'/Michael-type reaction. By combining the ease of thiol chemistry with glass slides, which are compatible with microarray scanners, this offers a cost-effective and useful method to assemble arrays.
A permutation testing framework to compare groups of brain networks.
Simpson, Sean L; Lyday, Robert G; Hayasaka, Satoru; Marsh, Anthony P; Laurienti, Paul J
2013-01-01
Brain network analyses have moved to the forefront of neuroimaging research over the last decade. However, methods for statistically comparing groups of networks have lagged behind. These comparisons have great appeal for researchers interested in gaining further insight into complex brain function and how it changes across different mental states and disease conditions. Current comparison approaches generally either rely on a summary metric or on mass-univariate nodal or edge-based comparisons that ignore the inherent topological properties of the network, yielding little power and failing to make network level comparisons. Gleaning deeper insights into normal and abnormal changes in complex brain function demands methods that take advantage of the wealth of data present in an entire brain network. Here we propose a permutation testing framework that allows comparing groups of networks while incorporating topological features inherent in each individual network. We validate our approach using simulated data with known group differences. We then apply the method to functional brain networks derived from fMRI data.
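The relabeling logic behind any permutation test is easy to sketch; the toy below permutes a per-network summary metric between two groups, whereas the paper's framework permutes whole networks while retaining their topology, which this sketch does not reproduce. All numbers are simulated.

```python
# Minimal two-group permutation test on a per-network summary metric.
import numpy as np

rng = np.random.default_rng(42)
group_a = rng.normal(0.30, 0.05, 20)   # e.g., clustering coefficient/subject
group_b = rng.normal(0.34, 0.05, 20)

observed = group_b.mean() - group_a.mean()
pooled = np.concatenate([group_a, group_b])

n_perm, count = 10_000, 0
for _ in range(n_perm):
    perm = rng.permutation(pooled)      # shuffle group labels
    diff = perm[20:].mean() - perm[:20].mean()
    if abs(diff) >= abs(observed):
        count += 1
print(f"two-sided permutation p ≈ {count / n_perm:.4f}")
```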
Technical Aspects of Fecal Microbial Transplantation (FMT).
Bhutiani, N; Schucht, J E; Miller, K R; McClave, Stephen A
2018-06-09
Fecal microbial transplantation (FMT) has become established as an effective therapeutic modality in the treatment of antibiotic-refractory recurrent Clostridium difficile colitis. A number of formulations and methods of delivery of FMT are currently available, each with distinct advantages. This review aims to review donor and patient selection for FMT as well as procedural aspects of FMT to help guide clinical practice. FMT can be obtained in fresh, frozen, lyophilized, and capsule-based formulations for delivery by oral ingestion, nasoenteric tube, colonoscopy, or enema (depending on the formulation used). Choosing the optimal method relies heavily on patient-related factors, including underlying pathology and severity of illness. As potential applications for FMT expand, careful donor screening and patient selection are critical to minimizing risk to patients and physicians. FMT represents an excellent therapeutic option for treatment of recurrent Clostridium difficile colitis and holds promise as a possible treatment modality in a variety of other conditions. The wide array of delivery methods allows for its application in various disease states in both the inpatient and outpatient setting.
Morgan, Sonya J; Pullon, Susan R H; Macdonald, Lindsay M; McKinlay, Eileen M; Gray, Ben V
2017-06-01
Case study research is a comprehensive method that incorporates multiple sources of data to provide detailed accounts of complex research phenomena in real-life contexts. However, current models of case study research do not particularly distinguish the unique contribution that observation data can make. Observation methods have the potential to reach beyond other methods that rely largely or solely on self-report. This article describes the distinctive characteristics of case study observational research, a modified form of Yin's 2014 model of case study research, which the authors used in a study exploring interprofessional collaboration in primary care. In this approach, observation data are positioned as the central component of the research design. Case study observational research offers a promising approach for researchers in a wide range of health care settings seeking more complete understandings of complex topics, where contextual influences are of primary concern. Future research is needed to refine and evaluate the approach.
Temporal enhancement of two-dimensional color doppler echocardiography
NASA Astrophysics Data System (ADS)
Terentjev, Alexey B.; Settlemier, Scott H.; Perrin, Douglas P.; del Nido, Pedro J.; Shturts, Igor V.; Vasilyev, Nikolay V.
2016-03-01
Two-dimensional color Doppler echocardiography is widely used for assessing blood flow inside the heart and blood vessels. Currently, frame acquisition time for this method varies from tens to hundreds of milliseconds, depending on Doppler sector parameters. This leads to low frame rates in the resulting video sequences, on the order of tens of Hz, which is insufficient for some diagnostic purposes, especially in pediatrics. In this paper, we present a new approach for reconstruction of 2D color Doppler cardiac images that increases the frame rate to hundreds of Hz. This approach relies on a modified method of frame reordering originally applied to real-time 3D echocardiography. There are no previous publications describing application of this method to 2D color Doppler data. The approach has been tested on several in-vivo cardiac 2D color Doppler datasets with approximate durations of 30 s and a native frame rate of 15 Hz. The resulting image sequences had frame rates equivalent to 500 Hz.
Efficient Strategies for Estimating the Spatial Coherence of Backscatter
Hyun, Dongwoon; Crowley, Anna Lisa C.; Dahl, Jeremy J.
2017-01-01
The spatial coherence of ultrasound backscatter has been proposed to reduce clutter in medical imaging, to measure the anisotropy of the scattering source, and to improve the detection of blood flow. These techniques rely on correlation estimates that are obtained using computationally expensive strategies. In this study, we assess existing spatial coherence estimation methods and propose three computationally efficient modifications: a reduced kernel, a downsampled receive aperture, and the use of an ensemble correlation coefficient. The proposed methods are implemented in simulation and in vivo studies. Reducing the kernel to a single sample improved computational throughput and improved axial resolution. Downsampling the receive aperture was found to have negligible effect on estimator variance, and improved computational throughput by an order of magnitude for a downsample factor of 4. The ensemble correlation estimator demonstrated lower variance than the currently used average correlation. Combining the three methods, the throughput was improved 105-fold in simulation with a downsample factor of 4 and 20-fold in vivo with a downsample factor of 2. PMID:27913342
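One of the three modifications, the ensemble correlation coefficient, can be sketched as pooling covariance and variance terms across the ensemble before forming the ratio, rather than averaging per-measurement coefficients; whether this matches the authors' exact estimator is an assumption, and the data below are synthetic.

```python
# Sketch of an ensemble correlation estimator over paired aperture signals.
import numpy as np

def ensemble_correlation(x, y):
    """x, y: (n_ensemble, n_samples) arrays of paired signals."""
    xc = x - x.mean(axis=1, keepdims=True)
    yc = y - y.mean(axis=1, keepdims=True)
    num = np.sum(xc * yc)                         # pooled covariance
    den = np.sqrt(np.sum(xc ** 2) * np.sum(yc ** 2))
    return num / den

rng = np.random.default_rng(7)
common = rng.normal(size=(50, 64))                # shared signal component
x = common + 0.5 * rng.normal(size=(50, 64))
y = common + 0.5 * rng.normal(size=(50, 64))
print(ensemble_correlation(x, y))                 # lower-variance estimate
```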
Mapping regional livelihood benefits from local ecosystem services assessments in rural Sahel.
Malmborg, Katja; Sinare, Hanna; Enfors Kautsky, Elin; Ouedraogo, Issa; Gordon, Line J
2018-01-01
Most current approaches to landscape scale ecosystem service assessments rely on detailed secondary data. This type of data is seldom available in regions with high levels of poverty and strong local dependence on provisioning ecosystem services for livelihoods. We develop a method to extrapolate results from a previously published village scale ecosystem services assessment to a higher administrative level, relevant for land use decision making. The method combines remote sensing (using a hybrid classification method) and interviews with community members. The resulting landscape scale maps show the spatial distribution of five different livelihood benefits (nutritional diversity, income, insurance/saving, material assets and energy, and crops for consumption) that illustrate the strong multifunctionality of the Sahelian landscapes. The maps highlight the importance of a diverse set of sub-units of the landscape in supporting Sahelian livelihoods. We see a large potential in using the resulting type of livelihood benefit maps for guiding future land use decisions in the Sahel.
Measuring carbon in forests: current status and future challenges.
Brown, Sandra
2002-01-01
To accurately and precisely measure the carbon in forests is gaining global attention as countries seek to comply with agreements under the UN Framework Convention on Climate Change. Established methods for measuring carbon in forests exist, and are best based on permanent sample plots laid out in a statistically sound design. Measurements on trees in these plots can be readily converted to aboveground biomass using either biomass expansion factors or allometric regression equations. A compilation of existing root biomass data for upland forests of the world generated a significant regression equation that can be used to predict root biomass based on aboveground biomass only. Methods for measuring coarse dead wood have been tested in many forest types, but the methods could be improved if a non-destructive tool for measuring the density of dead wood was developed. Future measurements of carbon storage in forests may rely more on remote sensing data, and new remote data collection technologies are in development.
Nielsen, H Bjørn; Almeida, Mathieu; Juncker, Agnieszka Sierakowska; Rasmussen, Simon; Li, Junhua; Sunagawa, Shinichi; Plichta, Damian R; Gautier, Laurent; Pedersen, Anders G; Le Chatelier, Emmanuelle; Pelletier, Eric; Bonde, Ida; Nielsen, Trine; Manichanh, Chaysavanh; Arumugam, Manimozhiyan; Batto, Jean-Michel; Quintanilha Dos Santos, Marcelo B; Blom, Nikolaj; Borruel, Natalia; Burgdorf, Kristoffer S; Boumezbeur, Fouad; Casellas, Francesc; Doré, Joël; Dworzynski, Piotr; Guarner, Francisco; Hansen, Torben; Hildebrand, Falk; Kaas, Rolf S; Kennedy, Sean; Kristiansen, Karsten; Kultima, Jens Roat; Léonard, Pierre; Levenez, Florence; Lund, Ole; Moumen, Bouziane; Le Paslier, Denis; Pons, Nicolas; Pedersen, Oluf; Prifti, Edi; Qin, Junjie; Raes, Jeroen; Sørensen, Søren; Tap, Julien; Tims, Sebastian; Ussery, David W; Yamada, Takuji; Renault, Pierre; Sicheritz-Ponten, Thomas; Bork, Peer; Wang, Jun; Brunak, Søren; Ehrlich, S Dusko
2014-08-01
Most current approaches for analyzing metagenomic data rely on comparisons to reference genomes, but the microbial diversity of many environments extends far beyond what is covered by reference databases. De novo segregation of complex metagenomic data into specific biological entities, such as particular bacterial strains or viruses, remains a largely unsolved problem. Here we present a method, based on binning co-abundant genes across a series of metagenomic samples, that enables comprehensive discovery of new microbial organisms, viruses and co-inherited genetic entities and aids assembly of microbial genomes without the need for reference sequences. We demonstrate the method on data from 396 human gut microbiome samples and identify 7,381 co-abundance gene groups (CAGs), including 741 metagenomic species (MGS). We use these to assemble 238 high-quality microbial genomes and identify affiliations between MGS and hundreds of viruses or genetic entities. Our method provides the means for comprehensive profiling of the diversity within complex metagenomic samples.
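The core idea of binning co-abundant genes can be caricatured in a few lines: group genes whose abundance profiles correlate strongly across samples. The greedy grouping and threshold below are simplifications of the published canopy-style clustering, and the data are simulated.

```python
# Toy co-abundance binning: genes from the same organism rise and fall
# together across samples, so high abundance correlation groups them.
import numpy as np

def co_abundance_groups(abundance, r_min=0.9):
    """abundance: (n_genes, n_samples); returns lists of gene indices."""
    corr = np.corrcoef(abundance)
    unassigned = set(range(abundance.shape[0]))
    groups = []
    while unassigned:
        seed = unassigned.pop()
        members = [seed] + [g for g in list(unassigned) if corr[seed, g] >= r_min]
        unassigned -= set(members)
        groups.append(members)
    return groups

rng = np.random.default_rng(3)
base = rng.lognormal(size=(5, 30))                  # 5 latent "organisms"
genes = np.repeat(base, 4, axis=0) * rng.lognormal(sigma=0.1, size=(20, 30))
print([len(g) for g in co_abundance_groups(genes)])  # roughly 4 genes/group
```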
A Bayesian Approach to Determination of F, D, and Z Values Used in Steam Sterilization Validation.
Faya, Paul; Stamey, James D; Seaman, John W
2017-01-01
For manufacturers of sterile drug products, steam sterilization is a common method used to provide assurance of the sterility of manufacturing equipment and products. The validation of sterilization processes is a regulatory requirement and relies upon the estimation of key resistance parameters of microorganisms. Traditional methods have relied upon point estimates for the resistance parameters. In this paper, we propose a Bayesian method for estimation of the well-known D_T, z, and F_0 values that are used in the development and validation of sterilization processes. A Bayesian approach allows the uncertainty about these values to be modeled using probability distributions, thereby providing a fully risk-based approach to measures of sterility assurance. An example is given using the survivor curve and fraction negative methods for estimation of resistance parameters, and we present a means by which a probabilistic conclusion can be made regarding the ability of a process to achieve a specified sterility criterion. LAY ABSTRACT: For manufacturers of sterile drug products, steam sterilization is a common method used to provide assurance of the sterility of manufacturing equipment and products. The validation of sterilization processes is a regulatory requirement and relies upon the estimation of key resistance parameters of microorganisms. Traditional methods have relied upon point estimates for the resistance parameters. In this paper, we propose a Bayesian method for estimation of the critical process parameters that are evaluated in the development and validation of sterilization processes. A Bayesian approach allows the uncertainty about these parameters to be modeled using probability distributions, thereby providing a fully risk-based approach to measures of sterility assurance. An example is given using the survivor curve and fraction negative methods for estimation of resistance parameters, and we present a means by which a probabilistic conclusion can be made regarding the ability of a process to achieve a specified sterility criterion. © PDA, Inc. 2017.
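For readers unfamiliar with these parameters, the classical point-estimate calculations that the Bayesian approach generalizes look like the sketch below; the temperatures, times, and counts are illustrative only.

```python
# Classical D-value (survivor curve) and F0 (accumulated lethality).
import numpy as np

def d_value(t_minutes, n0, n_survivors):
    """Decimal reduction time: exposure time per 1-log drop in survivors."""
    return t_minutes / (np.log10(n0) - np.log10(n_survivors))

def f0(temps_c, dt_minutes, z=10.0, t_ref=121.1):
    """Accumulated lethality F0 in equivalent minutes at 121.1 °C,
    using the z-value to weight time spent at each temperature."""
    return float(np.sum(dt_minutes * 10.0 ** ((np.array(temps_c) - t_ref) / z)))

print(d_value(12.0, 1e6, 1e2))                       # D-value, minutes
print(f0([110, 115, 121, 121, 118], dt_minutes=1))   # F0 for a 5-min profile
```

In the Bayesian version described above, D_T, z, and F_0 become distributions rather than the single numbers returned here.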
Certified randomness in quantum physics.
Acín, Antonio; Masanes, Lluis
2016-12-07
The concept of randomness plays an important part in many disciplines. On the one hand, the question of whether random processes exist is fundamental for our understanding of nature. On the other, randomness is a resource for cryptography, algorithms and simulations. Standard methods for generating randomness rely on assumptions about the devices that are often not valid in practice. However, quantum technologies enable new methods for generating certified randomness, based on the violation of Bell inequalities. These methods are referred to as device-independent because they do not rely on any modelling of the devices. Here we review efforts to design device-independent randomness generators and the associated challenges.
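As a sense of scale for device-independent certification, the sketch below evaluates one commonly quoted analytic min-entropy bound as a function of the observed CHSH value S; attributing this bound (Pironio et al., Nature 2010) is our addition, not the review's.

```python
# Min-entropy certified per run from a CHSH violation S, under the
# analytic bound H >= 1 - log2(1 + sqrt(2 - S^2/4)) for 2 < S <= 2*sqrt(2).
import math

def min_entropy_per_run(S):
    if S <= 2:
        return 0.0                        # no violation, nothing certified
    S = min(S, 2 * math.sqrt(2))          # clip at the Tsirelson bound
    return 1 - math.log2(1 + math.sqrt(2 - S ** 2 / 4))

print(min_entropy_per_run(2.5))           # partial violation
print(min_entropy_per_run(2 * 2 ** 0.5))  # maximal violation -> 1 bit/run
```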
Probing the Higgs self coupling via single Higgs production at the LHC
Degrassi, G.; Giardino, P. P.; Maltoni, F.; ...
2016-12-16
Here, we propose a method to determine the trilinear Higgs self-coupling that is alternative to the direct measurement of Higgs pair production total cross sections and differential distributions. The method relies on the effects that electroweak loops featuring an anomalous trilinear coupling would imprint on single Higgs production at the LHC. We first calculate these contributions to all the phenomenologically relevant Higgs production (ggF, VBF, WH, ZH, t$\bar{t}$H) and decay (γγ, WW*/ZZ* → 4f, b$\bar{b}$, ττ) modes at the LHC and then estimate the sensitivity to the trilinear coupling via a one-parameter fit to the single Higgs measurements at the LHC at 8 TeV. We find that the bounds on the self-coupling are already competitive with those from Higgs pair production and will be further improved in the current and next LHC runs.
Quantitation of Localized 31P Magnetic Resonance Spectra Based on the Reciprocity Principle
NASA Astrophysics Data System (ADS)
Kreis, R.; Slotboom, J.; Pietz, J.; Jung, B.; Boesch, C.
2001-04-01
There is a need for absolute quantitation methods in 31P magnetic resonance spectroscopy, because none of the phosphorus-containing metabolites is necessarily constant in pathology. Here, a method for absolute quantitation of in vivo 31P MR spectra that provides reproducible metabolite contents in institutional or standard units is described. It relies on the reciprocity principle, i.e., the proportionality between the B1 field map and the map of reception strength for a coil with identical relative current distributions in receive and transmit mode. Cerebral tissue contents of 31P metabolites were determined in a predominantly white matter-containing location in healthy subjects. The results are in good agreement with the literature and the interexamination coefficient of variance is better than that in most previous studies. A gender difference found for some of the 31P metabolites may be explained by different voxel composition.
Speckle-modulating optical coherence tomography in living mice and humans.
Liba, Orly; Lew, Matthew D; SoRelle, Elliott D; Dutta, Rebecca; Sen, Debasish; Moshfeghi, Darius M; Chu, Steven; de la Zerda, Adam
2017-06-20
Optical coherence tomography (OCT) is a powerful biomedical imaging technology that relies on the coherent detection of backscattered light to image tissue morphology in vivo. As a consequence, OCT is susceptible to coherent noise (speckle noise), which imposes significant limitations on its diagnostic capabilities. Here we show speckle-modulating OCT (SM-OCT), a method based purely on light manipulation that virtually eliminates speckle noise originating from a sample. SM-OCT accomplishes this by creating and averaging an unlimited number of scans with uncorrelated speckle patterns without compromising spatial resolution. Using SM-OCT, we reveal small structures in the tissues of living animals, such as the inner stromal structure of a live mouse cornea, the fine structures inside the mouse pinna, and sweat ducts and Meissner's corpuscle in the human fingertip skin-features that are otherwise obscured by speckle noise when using conventional OCT or OCT with current state of the art speckle reduction methods.
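The statistical principle behind averaging uncorrelated speckle patterns is easy to demonstrate in isolation: for fully developed speckle, contrast falls roughly as 1/√N with N averaged realizations. The simulation below illustrates only this statistics, not the optical implementation of SM-OCT.

```python
# Averaging N uncorrelated speckle realizations reduces speckle contrast
# (std/mean) by about 1/sqrt(N); intensities are exponentially distributed.
import numpy as np

rng = np.random.default_rng(0)
n_realizations, n_pixels = 100, 10_000
speckle = rng.exponential(scale=1.0, size=(n_realizations, n_pixels))

for n in (1, 10, 100):
    avg = speckle[:n].mean(axis=0)
    contrast = avg.std() / avg.mean()
    print(f"N={n:3d}  contrast ≈ {contrast:.3f}  (1/sqrt(N) = {1/np.sqrt(n):.3f})")
```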
A generic, cost-effective, and scalable cell lineage analysis platform
Biezuner, Tamir; Spiro, Adam; Raz, Ofir; Amir, Shiran; Milo, Lilach; Adar, Rivka; Chapal-Ilani, Noa; Berman, Veronika; Fried, Yael; Ainbinder, Elena; Cohen, Galit; Barr, Haim M.; Halaban, Ruth; Shapiro, Ehud
2016-01-01
Advances in single-cell genomics enable commensurate improvements in methods for uncovering lineage relations among individual cells. Current sequencing-based methods for cell lineage analysis depend on low-resolution bulk analysis or rely on extensive single-cell sequencing, which is not scalable and could be biased by functional dependencies. Here we show an integrated biochemical-computational platform for generic single-cell lineage analysis that is retrospective, cost-effective, and scalable. It consists of a biochemical-computational pipeline that inputs individual cells, produces targeted single-cell sequencing data, and uses it to generate a lineage tree of the input cells. We validated the platform by applying it to cells sampled from an ex vivo grown tree and analyzed its feasibility landscape by computer simulations. We conclude that the platform may serve as a generic tool for lineage analysis and thus pave the way toward large-scale human cell lineage discovery. PMID:27558250
Greenhouse Gas Analysis by GC/MS
NASA Astrophysics Data System (ADS)
Bock, E. M.; Easton, Z. M.; Macek, P.
2015-12-01
Current methods to analyze greenhouse gases rely on dedicated complex, multiple-column, multiple-detector gas chromatographs. A novel method was developed in partnership with Shimadzu for simultaneous quantification of carbon dioxide (CO2), methane (CH4), and nitrous oxide (N2O) in environmental gas samples. Gas bulbs were used to make custom standard mixtures by injecting small volumes of pure analyte into the nitrogen-filled bulb. The resulting calibration curves were validated using a certified gas standard. The use of GC/MS systems to perform this analysis has the potential to move the analysis of greenhouse gases from expensive, custom GC systems to standard single-quadrupole GC/MS systems that are available in most laboratories and have a wide variety of applications beyond greenhouse gas analysis. Additionally, use of mass spectrometry can provide confirmation of the identity of target analytes and will assist in the identification of unknown peaks should they be present in the chromatogram.
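The calibration-curve step described here is a straight linear fit of detector response against known concentration, inverted to quantify unknowns; the sketch below uses invented concentrations and peak areas as placeholders.

```python
# Linear calibration: fit peak area vs standard concentration, then invert.
import numpy as np

conc_ppm = np.array([0.5, 1.0, 2.0, 5.0, 10.0])          # standards (assumed)
peak_area = np.array([1.1e4, 2.0e4, 4.2e4, 1.05e5, 2.1e5])

slope, intercept = np.polyfit(conc_ppm, peak_area, 1)    # calibration line

def quantify(area):
    """Concentration of an unknown from its integrated peak area."""
    return (area - intercept) / slope

print(quantify(6.3e4))   # ppm of, e.g., N2O in an unknown sample
```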
Beck, John J; Willett, Denis S; Gee, Wai S; Mahoney, Noreen E; Higbee, Bradley S
2016-12-14
Contamination by aflatoxin, a toxic metabolite produced by Aspergillus fungi ubiquitous in California almond and pistachio orchards, results in millions of dollars of lost product annually. Current detection of aflatoxin relies on destructive, expensive, and time-intensive laboratory-based methods. To explore an alternative method for the detection of general fungal growth, volatile emission profiles of almonds at varying humidities were sampled using both static SPME and dynamic needle-trap SPE followed by benchtop and portable GC-MS analysis. Despite the portable SPE/GC-MS system detecting fewer volatiles than the benchtop system, both systems resolved humidity treatments and identified potential fungal biomarkers at extremely low water activity levels. This ability to resolve humidity levels suggests that volatile profiles from germinating fungal spores could be used to create an early warning, nondestructive, portable detection system of fungal growth.
A practical and catalyst-free trifluoroethylation reaction of amines using trifluoroacetic acid
NASA Astrophysics Data System (ADS)
Andrews, Keith G.; Faizova, Radmila; Denton, Ross M.
2017-06-01
Amines are a fundamentally important class of biologically active compounds and the ability to manipulate their physicochemical properties through the introduction of fluorine is of paramount importance in medicinal chemistry. Current synthesis methods for the construction of fluorinated amines rely on air and moisture sensitive reagents that require special handling or harsh reductants that limit functionality. Here we report practical, catalyst-free, reductive trifluoroethylation reactions of free amines exhibiting remarkable functional group tolerance. The reactions proceed in conventional glassware without rigorous exclusion of either moisture or oxygen, and use trifluoroacetic acid as a stable and inexpensive fluorine source. The new methods provide access to a wide range of medicinally relevant functionalized tertiary β-fluoroalkylamine cores, either through direct trifluoroethylation of secondary amines or via a three-component coupling of primary amines, aldehydes and trifluoroacetic acid. A reduction of in situ-generated silyl ester species is proposed to account for the reductive selectivity observed.
AEG-1 promoter-mediated imaging of prostate cancer
Bhatnagar, Akrita; Wang, Yuchuan; Mease, Ronnie C.; Gabrielson, Matthew; Sysa, Polina; Minn, Il; Green, Gilbert; Simmons, Brian; Gabrielson, Kathleen; Sarkar, Siddik; Fisher, Paul B.; Pomper, Martin G.
2014-01-01
We describe a new imaging method for detecting prostate cancer, whether localized or disseminated and metastatic to soft tissues and bone. The method relies on the use of imaging reporter genes under the control of the promoter of AEG-1 (MTDH), which is selectively active only in malignant cells. Through systemic, nanoparticle-based delivery of the imaging construct, lesions can be identified through bioluminescence imaging and single photon emission-computed tomography in the PC3-ML murine model of prostate cancer at high sensitivity. This approach is applicable for the detection of prostate cancer metastases, including bone lesions for which there is no current reliable agent for non-invasive clinical imaging. Further, the approach compares favorably to accepted and emerging clinical standards, including positron emission tomography with [18F]fluorodeoxyglucose and [18F]sodium fluoride. Our results offer a preclinical proof of concept that rationalizes clinical evaluation in patients with advanced prostate cancer. PMID:25145668
Rural sewage treatment processing in Yongjia County, Zhejiang Province
NASA Astrophysics Data System (ADS)
Wang, W. H.; Kuan, T. H.
2016-08-01
Issues regarding water pollution in rural areas of China have garnered increased attention over the years. Further discussion on the circumstances and results of existing domestic sewage treatment methods may serve as an appropriate reference in solving these important issues. This article explored the current conditions of water contamination in rural areas of China, introduced the characteristics and effects of applicable sewage treatment technology, and summarized the results of the planning, installation, and operation of rural sewage treatment facilities in Yongjia County in Zhejiang Province. However, relying on a single technical design rule is not adequate for solving the practical problems that these villages face. Instead, methods of planning rural sewage treatment should be adapted to better suit local conditions and different residential forms. It is crucial, ultimately, for any domestic sewage treatment system in a rural area to be commissioned, engineered, and maintained by a market-oriented professional company.
Molecular methods for septicemia diagnosis.
Marco, Francesc
2017-11-01
Septicemia remains a major cause of hospital mortality. Blood culture remains the best approach to identify the etiological microorganisms when a bloodstream infection is suspected, but it takes a long time because it relies on bacterial or fungal growth. The introduction in clinical microbiology laboratories of matrix-assisted laser desorption ionization time-of-flight mass spectrometry technology, DNA hybridization, microarrays, and rapid PCR-based tests significantly reduces the time to results. Tests for direct detection in whole blood samples are highly desirable because of their potential to identify bloodstream pathogens without waiting for blood cultures to become positive. Nonetheless, the limitations of current molecular diagnostic methods are substantial. This article reviews these new molecular approaches (LightCycler SeptiFast, Magicplex sepsis real time, Septitest, VYOO, PCR/ESI-MS analysis, T2Candida). Copyright © 2017 Elsevier España, S.L.U. and Sociedad Española de Enfermedades Infecciosas y Microbiología Clínica. All rights reserved.
Lessons from the recent rise in use of female sterilization in Malawi.
Jacobstein, Roy
2013-03-01
Although female sterilization is the most widely used modern contraceptive method in the world, most family planning programs in Africa have had difficulty providing it. Malawi, however, despite daunting constraints, has made female sterilization widely and equitably accessible, thereby increasing method choice and helping its citizens better meet their reproductive intentions. Ten percent of currently married Malawian women of reproductive age rely on female sterilization for contraceptive protection, compared with less than 2 percent across Africa, and demand to limit births now exceeds demand to space births. Malawi's female sterilization prevalence surpasses that of some high-resource countries. Key service-delivery factors enabling this achievement include supportive policies, strong public-private partnerships, and mobile services delivered at no cost by dedicated providers. Challenges remain, but Malawi's achievement offers lessons for other countries with low availability of female sterilization and similar resource constraints. © 2013 The Population Council, Inc.
Rahnama, P; Hidarnia, A; Shokravi, F A; Kazemnejad, A; Montazeri, A; Najorkolaei, F R; Saburi, A
2013-09-01
Many couples in the Islamic Republic of Iran rely on coital withdrawal for contraception. The purpose of this cross-sectional study was to use the theory of planned behaviour to explore factors that influence withdrawal users' intent to switch to oral contraception (OC). Participants were 336 sexually active, married women who were current users of withdrawal, recruited from 5 public family planning clinics in Tehran. A questionnaire included measures of the theory of planned behaviour: attitude (behavioural beliefs, outcome evaluations), subjective norms (normative beliefs, motivation to comply), perceived behaviour control, past behaviour and behavioural intention. Linear regression analyses showed that past behaviour, perceived behaviour control, attitude and subjective norms accounted for the highest percentage of the total variance observed for intention to use OC (36%). Beliefs-based family planning education and counselling should be designed for users of the withdrawal method.
Su, Zhangli
2016-01-01
Combinatorial patterns of histone modifications are key indicators of different chromatin states. Most of the current approaches rely on the usage of antibodies to analyze combinatorial histone modifications. Here we detail an antibody-free method named MARCC (Matrix-Assisted Reader Chromatin Capture) to enrich combinatorial histone modifications. The combinatorial patterns are enriched on native nucleosomes extracted from cultured mammalian cells and prepared by micrococcal nuclease digestion. Such enrichment is achieved by recombinant chromatin-interacting protein modules, or so-called reader domains, which can bind in a combinatorial modification-dependent manner. The enriched chromatin can be quantified by western blotting or mass spectrometry for the co-existence of histone modifications, while the associated DNA content can be analyzed by qPCR or next-generation sequencing. Altogether, MARCC provides a reproducible, efficient and customizable solution to enrich and analyze combinatorial histone modifications. PMID:26131849
Annual banned-substance review: Analytical approaches in human sports drug testing.
Thevis, Mario; Kuuranne, Tiia; Geyer, Hans
2018-01-01
Several high-profile revelations concerning anti-doping rule violations over the past 12 months have outlined the importance of tackling prevailing challenges and reducing the limitations of the current anti-doping system. At this time, the necessity to enhance, expand, and improve analytical test methods in response to the substances outlined in the World Anti-Doping Agency's (WADA) Prohibited List represents an increasingly crucial task for modern sports drug-testing programs. The ability to improve analytical testing methods often relies on the expedient application of novel information regarding superior target analytes for sports drug-testing assays, drug elimination profiles, alternative test matrices, together with recent advances in instrumental developments. This annual banned-substance review evaluates literature published between October 2016 and September 2017 offering an in-depth evaluation of developments in these arenas and their potential application to substances reported in WADA's 2017 Prohibited List. Copyright © 2017 John Wiley & Sons, Ltd.
NASA Astrophysics Data System (ADS)
Jiang, Wenqian; Zeng, Bo; Yang, Zhou; Li, Gang
2018-01-01
In non-intrusive load monitoring, load disaggregation can reflect the running state of each load, which helps users reduce unnecessary energy costs. In the context of demand-side management measures based on time-of-use (TOU) pricing, a residential load influence analysis method for TOU pricing based on non-intrusive load monitoring data is proposed in this paper. Relying on classification of residential loads from their current signals, the types of user equipment and the self-elasticities and cross-elasticities across different time periods can be obtained. Tests with actual household load data show that, under TOU pricing, the operation of some equipment is shifted to lower-price hours: electricity use during peak-price periods is reduced, while use during cheaper periods increases, with a certain regularity.
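The elasticity bookkeeping implied here is the standard arc-elasticity definition: the relative change in demand in period i divided by the relative change in price in period j (self-elasticity when i = j, cross-elasticity otherwise). The figures in the sketch are invented.

```python
# Self- and cross-elasticity of demand with respect to TOU price changes.
def elasticity(dq_over_q, dp_over_p):
    """Elasticity = relative change in demand / relative change in price."""
    return dq_over_q / dp_over_p

# Peak price rises 20%, peak demand falls 8%   -> self-elasticity
print(elasticity(-0.08, 0.20))   # -0.4
# Peak price rises 20%, off-peak demand rises 3% -> cross-elasticity
print(elasticity(0.03, 0.20))    #  0.15
```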
2010-01-01
In clinical neurology, a comprehensive understanding of consciousness has been regarded as an abstract concept - best left to philosophers. However, times are changing and the need to clinically assess consciousness is increasingly becoming a real-world, practical challenge. Current methods for evaluating altered levels of consciousness are highly reliant on either behavioural measures or anatomical imaging. While these methods have some utility, estimates of misdiagnosis are worrisome (as high as 43%) - clearly this is a major clinical problem. The solution must involve objective, physiologically based measures that do not rely on behaviour. This paper reviews recent advances in physiologically based measures that enable better evaluation of consciousness states (coma, vegetative state, minimally conscious state, and locked in syndrome). Based on the evidence to-date, electroencephalographic and neuroimaging based assessments of consciousness provide valuable information for evaluation of residual function, formation of differential diagnoses, and estimation of prognosis. PMID:20113490
Surface plasmon resonance optical cavity enhanced refractive index sensing.
Giorgini, A; Avino, S; Malara, P; Gagliardi, G; Casalino, M; Coppola, G; Iodice, M; Adam, P; Chadt, K; Homola, J; De Natale, P
2013-06-01
We report on a method for surface plasmon resonance (SPR) refractive index sensing based on direct time-domain measurements. An optical resonator is built around an SPR sensor, and its photon lifetime is measured as a function of the loss induced by refractive index variations. The method does not rely on any spectroscopic analysis or direct intensity measurement. Time-domain measurements are practically immune to light intensity fluctuations and thus lead to high resolution. A proof-of-concept experiment is carried out in which the sensor response to liquid samples of different refractive indices is measured. The refractive index resolution of the current system, extrapolated from the reproducibility of cavity-decay time determinations over 133 s, is found to be about 10^-5 RIU. The possibility of long-term averaging suggests that measurements with a resolution better than 10^-7 RIU/√Hz are within reach.
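The time-domain readout amounts to fitting the cavity decay to I(t) = I0·exp(-t/τ) and tracking τ as the SPR loss changes; the sketch below does this on synthetic data standing in for a photodetector trace, with all parameters assumed.

```python
# Fit a cavity ring-down trace to extract the photon lifetime tau.
import numpy as np
from scipy.optimize import curve_fit

def ring_down(t, i0, tau):
    return i0 * np.exp(-t / tau)

t = np.linspace(0, 5e-6, 500)                        # 5 us trace (assumed)
rng = np.random.default_rng(1)
signal = ring_down(t, 1.0, 8e-7) + rng.normal(0, 0.01, t.size)

popt, _ = curve_fit(ring_down, t, signal, p0=(1.0, 1e-6))
print(f"photon lifetime tau = {popt[1]:.2e} s")      # shortens as loss grows
```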
Error Mitigation for Short-Depth Quantum Circuits
NASA Astrophysics Data System (ADS)
Temme, Kristan; Bravyi, Sergey; Gambetta, Jay M.
2017-11-01
Two schemes are presented that mitigate the effect of errors and decoherence in short-depth quantum circuits. The size of the circuits for which these techniques can be applied is limited by the rate at which the errors in the computation are introduced. Near-term applications of early quantum devices, such as quantum simulations, rely on accurate estimates of expectation values to become relevant. Decoherence and gate errors lead to wrong estimates of the expectation values of observables used to evaluate the noisy circuit. The two schemes we discuss are deliberately simple and do not require additional qubit resources, so as to be as practically relevant as possible in current experiments. The first method, extrapolation to the zero-noise limit, successively cancels powers of the noise perturbation by an application of Richardson's deferred approach to the limit. The second method cancels errors by resampling randomized circuits according to a quasiprobability distribution.
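The first scheme lends itself to a compact illustration. The sketch below implements Richardson extrapolation to the zero-noise limit for expectation values measured at several amplified noise rates; the quadratic noise model standing in for the experiment is an assumption for demonstration only.

# Sketch: Richardson zero-noise extrapolation. Expectation values are
# measured at scaled noise rates c_k * r (e.g. by pulse stretching) and
# combined so the leading powers of the noise cancel.
import numpy as np

def richardson_zero_noise(scale_factors, expectations):
    """Combine <O>(c_k) so that terms r, r^2, ..., r^(n-1) cancel.

    Coefficients gamma_k solve: sum_k gamma_k = 1 and
    sum_k gamma_k c_k^j = 0 for j = 1..n-1 (a Vandermonde system).
    """
    c = np.asarray(scale_factors, dtype=float)
    n = c.size
    A = np.vander(c, n, increasing=True).T  # rows: c^0, c^1, ...
    b = np.zeros(n); b[0] = 1.0
    gamma = np.linalg.solve(A, b)
    return gamma @ np.asarray(expectations, dtype=float)

# Mock noisy expectation value <O>(c*r) = 1 - 0.3*(c*r) + 0.05*(c*r)**2
true_val, r = 1.0, 1.0
noisy = lambda c: true_val - 0.3 * c * r + 0.05 * (c * r) ** 2
scales = [1.0, 1.5, 2.0]
print(richardson_zero_noise(scales, [noisy(c) for c in scales]))  # ~1.0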
Radionuclide-fluorescence Reporter Gene Imaging to Track Tumor Progression in Rodent Tumor Models
Volpe, Alessia; Man, Francis; Lim, Lindsay; Khoshnevisan, Alex; Blower, Julia; Blower, Philip J.; Fruhwirth, Gilbert O.
2018-01-01
Metastasis is responsible for most cancer deaths. Despite extensive research, the mechanistic understanding of the complex processes governing metastasis remains incomplete. In vivo models are paramount for metastasis research, but require refinement. Tracking spontaneous metastasis by non-invasive in vivo imaging is now possible, but remains challenging as it requires long-term observation and high sensitivity. We describe a longitudinal combined radionuclide and fluorescence whole-body in vivo imaging approach for tracking tumor progression and spontaneous metastasis. This reporter gene methodology employs the sodium iodide symporter (NIS) fused to a fluorescent protein (FP). Cancer cells are engineered to stably express NIS-FP followed by selection based on fluorescence-activated cell sorting. Corresponding tumor models are established in mice. NIS-FP expressing cancer cells are tracked non-invasively in vivo at the whole-body level by positron emission tomography (PET) using the NIS radiotracer [18F]BF4-. PET is currently the most sensitive in vivo imaging technology available at this scale and enables reliable and absolute quantification. Current methods either rely on large cohorts of animals that are euthanized for metastasis assessment at varying time points, or rely on barely quantifiable 2D imaging. The advantages of the described method are: (i) highly sensitive non-invasive in vivo 3D PET imaging and quantification, (ii) automated PET tracer production, (iii) a significant reduction in required animal numbers due to repeat imaging options, (iv) the acquisition of paired data from subsequent imaging sessions providing better statistical data, and (v) the intrinsic option for ex vivo confirmation of cancer cells in tissues by fluorescence microscopy or cytometry. In this protocol, we describe all steps required for routine NIS-FP-afforded non-invasive in vivo cancer cell tracking using PET/CT and ex vivo confirmation of in vivo results. This protocol has applications beyond cancer research whenever in vivo localization, expansion and long-term monitoring of a cell population is of interest. PMID:29608157
Estimation of Regional Carbon Balance from Atmospheric Observations
NASA Astrophysics Data System (ADS)
Denning, S.; Uliasz, M.; Skidmore, J.
2002-12-01
Variations in the concentration of CO2 and other trace gases in time and space contain information about sources and sinks at regional scales. Several methods have been developed to quantitatively extract this information from atmospheric measurements. Mass-balance techniques depend on the ability to repeatedly sample the same mass of air, which involves careful attention to airmass trajectories. Inverse and adjoint techniques rely on decomposition of the source field into quasi-independent "basis functions" that are propagated through transport models and then used to synthesize optimal linear combinations that best match observations. A recently proposed method for regional flux estimation from continuous measurements at tall towers relies on time-mean vertical gradients, and requires careful trajectory analysis to map the estimates onto regional ecosystems. Each of these techniques is likely to be applied to measurements made during the North American Carbon Program. We have also explored the use of Bayesian synthesis inversion at regional scales, using a Lagrangian particle dispersion model driven by mesoscale transport fields. Influence functions were calculated for each hypothetical observation in a realistic diurnally-varying flow. These influence functions were then treated as basis functions for the purpose of separate inversions for daytime photosynthesis and 24-hour mean ecosystem respiration. Our results highlight the importance of estimating CO2 fluxes through the lateral boundaries of the model. Respiration fluxes were well constrained by one or two hypothetical towers, regardless of inflow fluxes. Time-varying assimilation fluxes were less well constrained, and much more dependent on knowledge of inflow fluxes. The small net difference between respiration and photosynthesis was the most difficult to determine, being extremely sensitive to knowledge of inflow fluxes. Finally, we explored the feasibility of directly incorporating mid-day concentration values measured at surface-layer flux towers in global inversions for regional surface fluxes. We found that such data would substantially improve the observational constraint on current carbon cycle models, especially if applied selectively to a well-designed subset of the current network of flux towers.
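A minimal sketch of the Bayesian synthesis inversion step described above, using the standard linear-Gaussian posterior; the influence functions, flux components, and error levels are random illustrative stand-ins, not values from the study.

# Sketch: Bayesian synthesis inversion for regional fluxes. Each column of
# H is an influence ("basis") function: the concentration footprint at the
# towers from a unit flux in one source region/process.
import numpy as np

rng = np.random.default_rng(0)
n_obs, n_src = 50, 3                 # tower CO2 samples, flux components
H = rng.random((n_obs, n_src))       # transport-model influence functions
x_true = np.array([2.0, -5.0, 1.0])  # e.g. respiration, photosynthesis, inflow
y = H @ x_true + rng.normal(0, 0.2, n_obs)   # observed mixing ratios

x_prior = np.zeros(n_src)
B = np.diag([10.0, 10.0, 10.0])      # prior flux covariance
R = 0.2 ** 2 * np.eye(n_obs)         # observation-error covariance

# Posterior mean/covariance of the standard linear-Gaussian inversion
P = np.linalg.inv(H.T @ np.linalg.inv(R) @ H + np.linalg.inv(B))
x_post = x_prior + P @ H.T @ np.linalg.inv(R) @ (y - H @ x_prior)
print("posterior fluxes:", x_post, " (truth:", x_true, ")")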
Scalable cultivation of human pluripotent stem cells on chemically-defined surfaces
NASA Astrophysics Data System (ADS)
Hsiung, Michael Chi-Wei
Human stem cells (SCs) are classified as self-renewing cells with great potential for therapeutic applications due to their ability to differentiate along any major cell lineage in the human body. Despite their restorative potential, widespread use of SCs is hampered by demanding culture-control requirements. Along with the need for strictly xeno-free environments to sustain growth in culture, current methods for growing human pluripotent stem cells (hPSCs) rely on platforms which impede large-scale cultivation and therapeutic delivery. Hence, progress towards development of large-scale culture systems is severely hindered. In a concentrated effort to develop a scheme that can serve as a model precursor for large-scale SC propagation in clinical use, we have explored methods for cultivating hPSCs on completely defined surfaces. We discuss novel approaches with the potential to go beyond the limitations presented by current methods. In particular, we studied the cultivation of human embryonic stem cells (hESCs) and human induced pluripotent stem cells (hiPSCs) on surfaces which underwent synthetic or chemical modification. Current methods for hPSCs rely on animal-based extracellular matrices (ECMs), such as mouse embryonic fibroblast (MEF) feeder layers and murine sarcoma cell-derived substrates, to facilitate their growth. While these layers or coatings can be used to maximize the output of hPSC production, they cannot be considered for clinical use because they risk introducing foreign pathogens into culture. We have identified and developed conditions for a completely defined, xeno-free substrate for culturing hPSCs. By utilizing coupling chemistry, we can functionalize ester groups on a given surface and conjugate synthetic peptides containing the arginine-glycine-aspartic acid (RGD) motif, known for its role in cell adhesion. This method offers advantages over traditional hPSC culture by keeping the modified substrata free of xenogenic components, and it can be scaled up in adherent microcarrier culture. To treat a major organ such as the heart or kidney, producing large quantities of clinical-grade pluripotent cells is a necessity for cell-based therapy. Here we apply our approach to spherical beads, or microcarriers, for large-scale cultivation of hPSCs in a stirred-suspension bioreactor. Stem cells seeded on microcarriers and cultivated for multiple six-day passages in stirred-suspension bioreactors remained viable (≥90%) and increased by an average of 25.0+/-7.2-fold in concentration. The cells maintained their expression of the pluripotency markers POU5F1 and NANOG as assessed by RT-PCR and quantitative PCR. These findings support the development of a flexible, cost-effective method for the generation of pluripotent cells which can be repurposed and utilized for cell therapies. This work also aims to promote exploration of different methods of surface modification to develop new tactics for culturing hPSCs that achieve higher fold growth while maintaining overall therapeutic potential.
Polyphony: superposition independent methods for ensemble-based drug discovery.
Pitt, William R; Montalvão, Rinaldo W; Blundell, Tom L
2014-09-30
Structure-based drug design is an iterative process, following cycles of structural biology, computer-aided design, synthetic chemistry and bioassay. In favorable circumstances, this process can lead to hundreds of protein-ligand crystal structures. In addition, molecular dynamics simulations are increasingly being used to further explore the conformational landscape of these complexes. Currently, methods capable of analyzing ensembles of crystal structures and MD trajectories are limited and usually rely upon least-squares superposition of coordinates. Novel methodologies are described for the analysis of multiple structures of a protein. Statistical approaches that rely upon residue equivalence, but not superposition, are developed. Tasks that can be performed include the identification of hinge regions, allosteric conformational changes and transient binding sites. The approaches are tested on crystal structures of CDK2 and other CMGC protein kinases and a simulation of p38α. Known relationships between interactions and conformational changes are highlighted, and new ones are revealed. A transient but druggable allosteric pocket in CDK2 is predicted to occur under the CMGC insert. Furthermore, an evolutionarily conserved conformational link from the location of this pocket, via the αEF-αF loop, to phosphorylation sites on the activation loop is discovered. New methodologies are described and validated for the superposition-independent conformational analysis of large collections of structures or simulation snapshots of the same protein. The methodologies are encoded in a Python package called Polyphony, which is released as open source to accompany this paper [http://wrpitt.bitbucket.org/polyphony/].
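The following sketch illustrates the general idea of superposition-independent ensemble analysis (not Polyphony's actual statistics): internal residue-residue distances are invariant to rotation and translation, so their variance across an ensemble flags flexible regions without any least-squares fitting. The toy ensemble is synthetic.

# Sketch: superposition-free variability analysis via internal distances.
import numpy as np

def internal_distance_variability(ensemble):
    """ensemble: (n_models, n_residues, 3) CA coordinates.
    Returns, per residue, the mean variance of its distances to all residues."""
    diff = ensemble[:, :, None, :] - ensemble[:, None, :, :]
    dist = np.linalg.norm(diff, axis=-1)        # (n_models, n_res, n_res)
    var = dist.var(axis=0)                      # variance over models
    return var.mean(axis=1)                     # per-residue score

# Toy ensemble: 10 models, 30 residues; residues 20-29 swing as a rigid arm
rng = np.random.default_rng(1)
base = rng.random((30, 3)) * 10
models = np.stack([base + rng.normal(0, 0.05, base.shape) for _ in range(10)])
models[:, 20:, 0] += np.linspace(0, 5, 10)[:, None]   # progressive shift
score = internal_distance_variability(models)
print("most variable residues:", np.argsort(score)[-5:])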
Timescale Correlation between Marine Atmospheric Exposure and Accelerated Corrosion Testing
NASA Technical Reports Server (NTRS)
Montgomery, Eliza L.; Calle, Luz Marina; Curran, Jerone C.; Kolody, Mark R.
2011-01-01
Evaluation of metal-based structures has long relied on atmospheric exposure test sites to determine corrosion resistance in marine environments. Traditional accelerated corrosion testing relies on mimicking the exposure conditions, often incorporating salt spray and ultraviolet (UV) radiation, and exposing the metal to continuous or cyclic conditions of the corrosive environment. Their success for correlation to atmospheric exposure is often a concern when determining the timescale to which the accelerated tests can be related. Accelerated laboratory testing, which often focuses on the electrochemical reactions that occur during corrosion conditions, has yet to be universally accepted as a useful tool in predicting the long term service life of a metal despite its ability to rapidly induce corrosion. Although visual and mass loss methods of evaluating corrosion are the standard and their use is imperative, a method that correlates timescales from atmospheric exposure to accelerated testing would be very valuable. This work uses surface chemistry to interpret the chemical changes occurring on low carbon steel during atmospheric and accelerated corrosion conditions with the objective of finding a correlation between its accelerated and long-term corrosion performance. The current results of correlating data from marine atmospheric exposure conditions at the Kennedy Space Center beachside corrosion test site, alternating seawater spray, and immersion in typical electrochemical laboratory conditions, will be presented. Key words: atmospheric exposure, accelerated corrosion testing, alternating seawater spray, marine, correlation, seawater, carbon steel, long-term corrosion performance prediction, X-ray photoelectron spectroscopy.
Tracking of electrochemical impedance of batteries
NASA Astrophysics Data System (ADS)
Piret, H.; Granjon, P.; Guillet, N.; Cattin, V.
2016-04-01
This paper presents an evolutionary battery impedance estimation method, which can be easily embedded in vehicles or nomad devices. The proposed method not only allows an accurate frequency-domain impedance estimation, but also a tracking of its temporal evolution, contrary to classical electrochemical impedance spectroscopy methods. Taking into account constraints of cost and complexity, we propose to use the existing current-control electronics to perform an evolutionary frequency-domain estimation of the electrochemical impedance. The developed method uses a simple wideband input signal, and relies on a recursive local average of Fourier transforms. The averaging is controlled by a single parameter, managing a trade-off between tracking and estimation performance. This normalized parameter allows the behavior of the proposed estimator to be correctly adapted to variations of the impedance. The advantage of the proposed method is twofold: the method is easy to embed into a simple electronic circuit, and the battery impedance estimator is evolutionary. The ability of the method to monitor the impedance over time is demonstrated on a simulator, and on a real lithium-ion battery, on which a repeatability study is carried out. The experiments reveal good tracking results, and estimation performance as accurate as the usual laboratory approaches.
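A minimal sketch of the estimator structure described above: a recursive, exponentially weighted average of Fourier transforms of current/voltage frames, with a single forgetting factor playing the role of the tracking-versus-accuracy parameter. The battery model and signal levels are illustrative assumptions.

# Sketch: evolutionary impedance tracking, Z(f) = S_vi / S_ii with
# recursively averaged cross/auto spectra.
import numpy as np

def track_impedance(i_frames, v_frames, alpha=0.1):
    """i_frames, v_frames: (n_frames, frame_len) wideband excitation/response."""
    S_vi = S_ii = 0.0
    for i_t, v_t in zip(i_frames, v_frames):
        I = np.fft.rfft(i_t)
        V = np.fft.rfft(v_t)
        S_vi = (1 - alpha) * S_vi + alpha * V * np.conj(I)   # cross spectrum
        S_ii = (1 - alpha) * S_ii + alpha * I * np.conj(I)   # auto spectrum
        yield S_vi / (S_ii + 1e-30)                          # Z(f) estimate

# Toy battery: R0 + (R1 || C1), excited by white-noise current
fs, n = 1000, 1024
freqs = np.fft.rfftfreq(n, 1 / fs)
Z_true = 0.05 + 0.02 / (1 + 2j * np.pi * freqs * 0.02 * 10)  # R1=0.02, C1=10
rng = np.random.default_rng(0)
i_frames = rng.normal(size=(200, n))
v_frames = [np.fft.irfft(np.fft.rfft(f) * Z_true, n) for f in i_frames]
for Z_hat in track_impedance(i_frames, v_frames):
    pass  # in practice, monitor Z_hat over time
print("final |Z| error:", np.max(np.abs(np.abs(Z_hat) - np.abs(Z_true))[1:]))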
NASA Astrophysics Data System (ADS)
Huang, Xingguo; Sun, Hui
2018-05-01
Gaussian beam is an important complex geometrical optics technology for modeling seismic wave propagation and diffraction in subsurface regions with complex geological structure. Current methods for Gaussian beam modeling rely on dynamic ray tracing and evanescent wave tracking. However, the dynamic ray tracing method is based on the paraxial ray approximation, and the evanescent wave tracking method cannot describe strongly evanescent fields. This leads to inaccuracy of the computed wave fields in regions with strongly inhomogeneous media. To address this problem, we compute Gaussian beam wave fields using the complex phase obtained by directly solving the complex eikonal equation. In this method, the fast marching method, which is widely used for phase calculation, is combined with a Gauss-Newton optimization algorithm to obtain the complex phase at the regular grid points. The main theoretical challenge in combining this method with Gaussian beam modeling is to handle the irregular boundary near the curved central ray. To cope with this challenge, we present a non-uniform finite-difference operator and a modified fast marching method. The numerical results confirm the proposed approach.
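For orientation, the sketch below implements the building block named above: a first-order fast marching solver for the real eikonal equation |grad T| = 1/v on a regular grid. The paper's extension to the complex eikonal, the Gauss-Newton step, and the irregular-boundary operator are not reproduced here.

# Sketch: first-order fast marching (heap-based Dijkstra-like sweep).
import heapq
import numpy as np

def fast_march(v, src, h=1.0):
    """Travel times T from src for a 2D speed map v, grid spacing h."""
    ny, nx = v.shape
    T = np.full((ny, nx), np.inf)
    T[src] = 0.0
    done = np.zeros((ny, nx), bool)
    heap = [(0.0, src)]
    while heap:
        t, (i, j) = heapq.heappop(heap)
        if done[i, j]:
            continue
        done[i, j] = True
        for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            a, b = i + di, j + dj
            if not (0 <= a < ny and 0 <= b < nx) or done[a, b]:
                continue
            tx = min(T[a, b - 1] if b > 0 else np.inf,
                     T[a, b + 1] if b < nx - 1 else np.inf)
            ty = min(T[a - 1, b] if a > 0 else np.inf,
                     T[a + 1, b] if a < ny - 1 else np.inf)
            f = h / v[a, b]
            if abs(tx - ty) < f:     # two-sided quadratic update
                t_new = 0.5 * (tx + ty + np.sqrt(2 * f * f - (tx - ty) ** 2))
            else:                    # one-sided update
                t_new = min(tx, ty) + f
            if t_new < T[a, b]:
                T[a, b] = t_new
                heapq.heappush(heap, (t_new, (a, b)))
    return T

T = fast_march(np.ones((50, 50)), (25, 25))
yy, xx = np.mgrid[0:50, 0:50]
print("max error vs Euclidean distance:", np.abs(T - np.hypot(yy - 25, xx - 25)).max())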
A rapid and reliable PCR method for genotyping the ABO blood group. II: A2 and O2 alleles.
O'Keefe, D S; Dobrovic, A
1996-01-01
PCR permits direct genotyping of individuals at the ABO locus. Several methods have been reported for genotyping ABO that rely on differentiating the A, B, and O alleles at specific base substitutions. However, the O allele as defined by serology comprises at least two alleles (O1 and O2) at the molecular level, and most current ABO genotyping methods only take into account the O1 allele. Determining the presence of the O2 allele is critical, as this not-infrequent allele would be mistyped as an A or a B allele by standard PCR typing methods. Furthermore, none of the methods to date distinguish between the A1 and A2 alleles, even though 10% of all white persons are blood group A2. We have developed a method for genotyping the ABO locus that takes the O2 and A2 alleles into account. Typing for A2 and O2 by diagnostic restriction enzyme digestion is a sensitive, nonradioactive assay that provides a convenient method useful for forensic and paternity testing and for clarifying anomalous serological results.
Jeong, Woo Chul; Chauhan, Munish; Sajib, Saurav Z K; Kim, Hyung Joong; Serša, Igor; Kwon, Oh In; Woo, Eung Je
2014-09-07
Magnetic resonance electrical impedance tomography (MREIT) is an MRI method that enables mapping of internal conductivity and/or current density via measurements of magnetic flux density signals. MREIT measures only the z-component of the magnetic flux density B = (Bx, By, Bz) induced by external current injection. The measurement noise in Bz complicates recovery of magnetic flux density maps, resulting in lower-quality conductivity and current-density maps. We present a new method for more accurate measurement of the spatial gradient of the magnetic flux density (∇Bz). The method relies on the use of multiple radio-frequency receiver coils and an interleaved multi-echo pulse sequence that acquires multiple sampling points within each repetition time. The noise level of the measured magnetic flux density Bz depends on the decay rate of the signal magnitude, the injection current duration, and the coil sensitivity map. The proposed method uses three key steps. The first step is to determine a representative magnetic flux density gradient from multiple receiver coils by using a weighted combination and by denoising the measured noisy data. The second step is to optimize the magnetic flux density gradient by using multi-echo magnetic flux densities at each pixel in order to reduce the noise level of ∇Bz, and the third step is to remove a random noise component from the recovered ∇Bz by solving an elliptic partial differential equation in a region of interest. Numerical simulation experiments using a cylindrical phantom model with included regions of low MRI signal-to-noise ratio ('defects') verified the proposed method. Experimental results using a real phantom, which included three different kinds of anomalies, demonstrated that the proposed method reduced the noise level of the measured magnetic flux density. The quality of the conductivity maps recovered using denoised ∇Bz data showed that the proposed method reduced the conductivity noise level up to 3-4 times in each anomaly region in comparison to the conventional method.
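A minimal sketch of the first step only, combining per-coil estimates with inverse-variance weights; the coil count, noise levels, and the 1D stand-in for ∇Bz are illustrative assumptions.

# Sketch: inverse-variance weighted combination of noisy per-coil estimates,
# which suppresses noise relative to any single coil.
import numpy as np

rng = np.random.default_rng(3)
true_grad = np.sin(np.linspace(0, np.pi, 256))        # 1D stand-in for grad Bz
sigma = np.array([0.05, 0.08, 0.20, 0.12])            # per-coil noise levels
coils = np.stack([true_grad + rng.normal(0, s, 256) for s in sigma])

w = 1.0 / sigma**2
combined = (w[:, None] * coils).sum(0) / w.sum()       # weighted combination
for name, est in [("best single coil", coils[0]), ("combined", combined)]:
    print(name, "RMSE:", np.sqrt(np.mean((est - true_grad) ** 2)))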
Oxide materials for spintronic device applications
NASA Astrophysics Data System (ADS)
Prestgard, Megan Campbell
Spintronic devices are currently being researched as next-generation alternatives to traditional electronics. Electronics, which utilize the charge-carrying capabilities of electrons to store information, are fundamentally limited not only by size constraints, but also by limits on current flow and by degradation due to electromigration. Spintronic devices are able to overcome these limitations, as they store information in the spin of electrons, rather than their charge. By using spin rather than charge, these current-limiting shortcomings can be overcome. However, for spintronic devices to be fully implemented in the current technology industry, their capabilities must be improved. Spintronic device operation relies on the movement and manipulation of spin-polarized electrons, in which there are three main processes that must be optimized in order to maximize device efficiencies. These spin-related processes are: the injection of spin-polarized electrons, the transport and manipulation of these carriers, and the detection of spin-polarized currents. In order to enhance the rate of spin-polarized injection, research has focused on the use of alternative methods to enhance injection beyond that of a simple ferromagnetic metal/semiconductor injector interface. These alternatives include the use of oxide-based tunnel barriers and the modification of semiconductors and insulators for use as ferromagnetic injector materials. The transport of spin-polarized carriers relies heavily on the optimization of materials' properties in order to enhance the carrier mobility and to quench spin-orbit coupling (SOC). However, a certain degree of SOC is necessary in order to allow for electric-field, gate-controlled manipulation of spin currents. Spin detection can be performed via both optical and electrical techniques. Electrical detection relies on the conversion between spin and charge currents via SOC and is often the preferred method for device-based applications. This dissertation presents experimental results on the use of oxides for fulfilling the three spintronic device requirements. In the case of spin injection, the study of dilute magnetic dielectrics (DMDs) shows the importance of doping on the magnetic properties of the resulting tunnel barriers. The study of spin transport in ZnO has shown that, even at room temperature, the spin diffusion length is relatively long, on the order of 100 nm. These studies have also probed the spin relaxation mechanics in ZnO and have shown that Dyakonov-Perel spin relaxation, operating according to Fermi-Dirac statistics, is the dominant spin relaxation mechanism in zinc oxide. Finally, spin detection in ZnO has shown that, similar to other semiconductors, by modifying the resistivity of the ZnO thin films, the spin Hall angle (SHA) can be enhanced to nearly that of metals. This is possible by enhancing extrinsic SOC due to skew-scattering from impurities as well as phonons. In addition, thermal spin injection has also been detected using ZnO, and these results support the independently measured inverse spin-Hall effect studies. The work presented herein illustrates that oxide materials have the potential to enhance spintronic device performance in all processes pertinent to spintronic applications.
Integrated Scenario Modeling of NSTX Advanced Plasma Configurations
NASA Astrophysics Data System (ADS)
Kessel, Charles; Synakowski, Edward
2003-10-01
The Spherical Torus will provide an attractive fusion energy source if it can demonstrate the following major features: high elongation and triangularity, 100% non-inductive current with a credible path to high bootstrap fractions, non-solenoidal startup and current rampup, high beta with stabilization of RWM instabilities, and sufficiently high energy confinement. NSTX has specific experimental milestones to examine these features, and integrated scenario modeling is helping to understand how these configurations might be produced and what tools are needed to access this operating space. Simulations with the Tokamak Simulation Code (TSC), CURRAY, and JSOLVER/BALMSC/PEST2 have identified fully non-inductively sustained, high beta plasmas that rely on strong plasma shaping accomplished with a PF coil modification, off-axis current drive from Electron Bernstein Waves (EBW), flexible on-axis heating and CD from High Harmonic Fast Wave (HHFW) and Neutral Beam Injection (NBI), and density control. Ideal MHD stability shows that with wall stabilization through plasma rotation and/or RWM feedback coils, a beta of 40% is achievable, with 100% non-inductive current sustained for 4 current diffusion times. Experimental data and theory are combined to produce a best extrapolation to these regimes, which is continuously improved as the discharges approach these parameters, and theoretical/computational methods expand. Further investigations and development for integrated scenario modeling on NSTX is discussed.
NASA Astrophysics Data System (ADS)
Hoy, Erik P.; Mazziotti, David A.; Seideman, Tamar
2017-11-01
Can an electronic device be constructed using only a single molecule? Since this question was first asked by Aviram and Ratner in the 1970s [Chem. Phys. Lett. 29, 277 (1974)], the field of molecular electronics has exploded with significant experimental advancements in the understanding of the charge transport properties of single molecule devices. Efforts to explain the results of these experiments and identify promising new candidate molecules for molecular devices have led to the development of numerous new theoretical methods including the current standard theoretical approach for studying single molecule charge transport, i.e., the non-equilibrium Green's function formalism (NEGF). By pairing this formalism with density functional theory (DFT), a wide variety of transport problems in molecular junctions have been successfully treated. For some systems though, the conductance and current-voltage curves predicted by common DFT functionals can be several orders of magnitude above experimental results. In addition, since density functional theory relies on approximations to the exact exchange-correlation functional, the predicted transport properties can show significant variation depending on the functional chosen. As a first step to addressing this issue, the authors have replaced density functional theory in the NEGF formalism with a 2-electron reduced density matrix (2-RDM) method, creating a new approach known as the NEGF-RDM method. 2-RDM methods provide a more accurate description of electron correlation compared to density functional theory, and they have lower computational scaling compared to wavefunction based methods of similar accuracy. Additionally, 2-RDM methods are capable of capturing static electron correlation which is untreatable by existing NEGF-DFT methods. When studying dithiol alkane chains and dithiol benzene in model junctions, the authors found that the NEGF-RDM predicts conductances and currents that are 1-2 orders of magnitude below those of B3LYP and M06 DFT functionals. This suggests that the NEGF-RDM method could be a viable alternative to NEGF-DFT for molecular junction calculations.
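For context, both NEGF-DFT and the NEGF-RDM approach ultimately evaluate a Landauer-type current from a transmission function. The sketch below integrates the Landauer formula for a toy Lorentzian transmission through a single molecular level; it illustrates the formalism only and involves no 2-RDM machinery.

# Sketch: Landauer current I(V) = (2e/h) * integral T(E) [f_L - f_R] dE
# for a single level at eps0 coupled to leads with broadening gamma.
import numpy as np

e = 1.602176634e-19      # C
h = 6.62607015e-34       # J s
kT = 0.025               # eV, room temperature

def fermi(E, mu):
    return 1.0 / (1.0 + np.exp((E - mu) / kT))

def current(V, eps0=0.5, gamma=0.05):
    """Bias V in volts; level at eps0 eV, lead coupling gamma eV."""
    E = np.linspace(-2.0, 2.0, 4001)                  # energy grid, eV
    dE = E[1] - E[0]
    T = gamma**2 / ((E - eps0) ** 2 + gamma**2)       # Lorentzian T(E)
    integrand = T * (fermi(E, +V / 2) - fermi(E, -V / 2))
    return (2 * e / h) * integrand.sum() * dE * e     # eV -> J conversion

for V in (0.1, 0.5, 1.0, 1.5):
    print(f"V = {V:.1f} V  ->  I = {current(V):.3e} A")

The current rises sharply once the bias window reaches the level at 0.5 eV, the generic resonant-tunneling behavior such junction calculations probe.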
Avonto, Cristina; Chittiboyina, Amar G; Rua, Diego; Khan, Ikhlas A
2015-12-01
Skin sensitization is an important toxicological end-point in the risk assessment of chemical allergens. Because of the complexity of the biological mechanisms associated with skin sensitization, integrated approaches combining different chemical, biological and in silico methods are recommended to replace conventional animal tests. Chemical methods are intended to characterize the potential of a sensitizer to induce earlier molecular initiating events. The presence of an electrophilic mechanistic domain is considered one of the essential chemical features to covalently bind to the biological target and induce further haptenation processes. Current in chemico assays rely on the quantification of unreacted model nucleophiles after incubation with the candidate sensitizer. In the current study, a new fluorescence-based method, 'HTS-DCYA assay', is proposed. The assay aims at the identification of reactive electrophiles based on their chemical reactivity toward a model fluorescent thiol. The reaction workflow enabled the development of a High Throughput Screening (HTS) method to directly quantify the reaction adducts. The reaction conditions have been optimized to minimize solubility issues, oxidative side reactions and increase the throughput of the assay while minimizing the reaction time, which are common issues with existing methods. Thirty-six chemicals previously classified with LLNA, DPRA or KeratinoSens™ were tested as a proof of concept. Preliminary results gave an estimated 82% accuracy, 78% sensitivity, 90% specificity, comparable to other in chemico methods such as Cys-DPRA. In addition to validated chemicals, six natural products were analyzed and a prediction of their sensitization potential is presented for the first time. Copyright © 2015 Elsevier Inc. All rights reserved.
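The reported performance figures follow from a confusion matrix against the reference classifications. The counts below are hypothetical, chosen only to show the arithmetic behind values of roughly 82%/78%/90%.

# Sketch: sensitivity, specificity and accuracy from a confusion matrix.
tp, fn = 14, 4   # sensitizers: correctly / incorrectly called
tn, fp = 9, 1    # non-sensitizers: correctly / incorrectly called

sensitivity = tp / (tp + fn)                 # ~0.78
specificity = tn / (tn + fp)                 # 0.90
accuracy = (tp + tn) / (tp + fn + tn + fp)   # ~0.82
print(f"sensitivity={sensitivity:.1%} specificity={specificity:.1%} "
      f"accuracy={accuracy:.1%}")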
Modularity of Protein Folds as a Tool for Template-Free Modeling of Structures.
Vallat, Brinda; Madrid-Aliste, Carlos; Fiser, Andras
2015-08-01
Predicting the three-dimensional structure of proteins from their amino acid sequences remains a challenging problem in molecular biology. While the current structural coverage of proteins is almost exclusively provided by template-based techniques, the modeling of the remaining protein sequences increasingly requires template-free methods. However, template-free modeling methods are much less reliable and are usually applicable only to smaller proteins, leaving much room for improvement. We present here a novel computational method that uses a library of supersecondary structure fragments, known as Smotifs, to model protein structures. The library of Smotifs has saturated over time, providing a theoretical foundation for efficient modeling. The method relies on weak sequence signals from remotely related protein structures to create a library of Smotif fragments specific to the target protein sequence. This Smotif library is exploited in a fragment assembly protocol to sample decoys, which are assessed by a composite scoring function. Since the Smotif fragments are larger than the ones used in other fragment-based methods, the proposed modeling algorithm, SmotifTF, can employ exhaustive sampling during decoy assembly. SmotifTF successfully predicts the overall fold of the target proteins in about 50% of the test cases and performs competitively compared to other state-of-the-art prediction methods, especially when the sequence signal to remote homologs is diminishing. Smotif-based modeling is complementary to current prediction methods and provides a promising direction in addressing the structure prediction problem, especially when targeting larger proteins for modeling.
Battery Capacity Fading Estimation Using a Force-Based Incremental Capacity Analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Samad, Nassim A.; Kim, Youngki; Siegel, Jason B.
2016-05-27
Traditionally, health monitoring techniques in lithium-ion batteries rely on voltage and current measurements. A novel method of using a mechanical rather than electrical signal in the incremental capacity analysis (ICA) method is introduced in this paper. This method derives the incremental capacity curves based on measured force (ICF) instead of voltage (ICV). The force is measured on the surface of a cell under compression in a fixture that replicates a battery pack assembly and preloading. The analysis is performed on data collected from cycling encased prismatic lithium-ion nickel-manganese-cobalt oxide (NMC) cells. For the NMC chemistry, the ICF method can complement or replace the ICV method for the following reasons. The identified ICV peaks are centered around 40% of state of charge (SOC), while the peaks of the ICF method are centered around 70% of SOC, indicating that the ICF can be used more often because it is more likely that an electric vehicle (EV) or a plug-in hybrid electric vehicle (PHEV) will traverse the 70% SOC range than the 40% SOC range. In addition, the signal-to-noise ratio (SNR) of the force signal is four times larger than that of the voltage signal using laboratory-grade sensors. The proposed ICF method is shown to achieve 0.42% accuracy in capacity estimation during a low C-rate constant-current discharge. Future work will investigate the application of the capacity estimation technique under charging and operation at high C-rates by addressing the transient behavior of force, so that an online methodology for capacity estimation can be developed.
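A minimal sketch of the incremental-capacity computation, with the force-based ICF = dQ/dF computed exactly like the usual voltage-based ICV = dQ/dV; the discharge record is synthetic, with plateau positions chosen to echo the 40%/70% SOC peak locations noted above.

# Sketch: incremental capacity curves from a constant-current discharge log.
import numpy as np

def incremental_capacity(q, x):
    """dQ/dx; x is voltage for ICV, fixture force for ICF. Real logged
    signals would be low-pass filtered before differentiation."""
    return np.gradient(q, x)

# Synthetic 1C discharge: throughput, cell voltage, fixture force
q = np.linspace(0.0, 5.0, 2000)                          # Ah
v = 4.1 - 0.15 * q + 0.03 * np.tanh(4 * (q - 3.0))       # V, plateau ~40% SOC
f = 900.0 - 60.0 * q + 15.0 * np.tanh(3 * (q - 1.5))     # N, plateau ~70% SOC

icv = incremental_capacity(q, v)
icf = incremental_capacity(q, f)
soc = 100 * (1 - q / q[-1])
print(f"ICV peak near {soc[np.argmax(np.abs(icv))]:.0f}% SOC")
print(f"ICF peak near {soc[np.argmax(np.abs(icf))]:.0f}% SOC")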
Divertor target shape optimization in realistic edge plasma geometry
NASA Astrophysics Data System (ADS)
Dekeyser, W.; Reiter, D.; Baelmans, M.
2014-07-01
Tokamak divertor design for next-step fusion reactors relies heavily on numerical simulations of the plasma edge. Currently, the design process is mainly carried out in a forward approach, where the designer is strongly guided by experience and physical intuition in proposing divertor shapes, which are then thoroughly assessed by numerical computations. On the other hand, automated design methods based on optimization have proven very successful in the related field of aerodynamic design. By recasting design objectives and constraints in the framework of a mathematical optimization problem, efficient forward-adjoint based algorithms can be used to automatically compute the divertor shape which performs best with respect to the selected edge plasma model and design criteria. In past years, we have extended these methods to automated divertor target shape design, using somewhat simplified edge plasma models and geometries. In this paper, we build on and extend previous work to apply these shape optimization methods for the first time in a more realistic, single-null edge plasma and divertor geometry, as commonly used in current divertor design studies. In a case study with JET-like parameters, we show that the so-called one-shot method is very effective in solving divertor target design problems. Furthermore, by detailed shape sensitivity analysis we demonstrate that the method already at its present state provides physically plausible trends, allowing a divertor design with an almost perfectly uniform power load to be achieved for our particular choice of edge plasma model and design criteria.
Utility installation review system : implementation report.
DOT National Transportation Integrated Search
2009-03-01
Each year, the Texas Department of Transportation (TxDOT) issues thousands of approvals that enable new : utility installations to occupy the state right of way (ROW). The current utility installation review process : relies on the physical delivery ...
End-user interest in geotechnical data management systems.
DOT National Transportation Integrated Search
2008-12-01
In conducting geotechnical site investigations, large volumes of subsurface information and associated test data : are generated. The current practice relies on paper-based filing systems that are often difficult and cumbersome : to access by users. ...
Volumetric change of silts following cyclic loading.
DOT National Transportation Integrated Search
2013-06-01
Estimating the settlement of adjacent structures during pile installation in silts is a challenging problem for : practicing engineers. The current state-of-practice relies primarily on local case studies and monitoring efforts, such as : inclinomete...
Interpolation methods and the accuracy of lattice-Boltzmann mesh refinement
Guzik, Stephen M.; Weisgraber, Todd H.; Colella, Phillip; ...
2013-12-10
A lattice-Boltzmann model to solve the equivalent of the Navier-Stokes equations on adaptively refined grids is presented. A method for transferring information across interfaces between different grid resolutions was developed following established techniques for finite-volume representations. This new approach relies on a space-time interpolation and solving constrained least-squares problems to ensure conservation. The effectiveness of this method at maintaining the second-order accuracy of lattice-Boltzmann is demonstrated through a series of benchmark simulations and detailed mesh refinement studies. These results exhibit smaller solution errors and improved convergence when compared with similar approaches relying only on spatial interpolation. Examples highlighting the mesh adaptivity of this method are also provided.
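The conservation idea behind the interface transfer can be shown in one dimension: fit interpolation coefficients by least squares, but constrain the fit so the average over the coarse cell matches the coarse value, via a Lagrange-multiplier (KKT) system. The quadratic basis and sample values below are illustrative; the actual scheme is a space-time interpolation in higher dimensions.

# Sketch: conservation-constrained least-squares interpolation (1D).
import numpy as np

def constrained_fit(x_pts, u_pts, coarse_avg, half_width=0.5):
    """min ||A c - u||^2  s.t.  (cell average of quadratic) = coarse_avg."""
    A = np.vander(x_pts, 3, increasing=True)      # basis [1, x, x^2]
    # Average of 1, x, x^2 over [-h, h]: 1, 0, h^2/3
    g = np.array([1.0, 0.0, half_width**2 / 3.0])
    # KKT system: [[2A^T A, g], [g^T, 0]] [c, lam] = [2A^T u, coarse_avg]
    K = np.zeros((4, 4))
    K[:3, :3] = 2 * A.T @ A
    K[:3, 3] = g
    K[3, :3] = g
    rhs = np.concatenate([2 * A.T @ u_pts, [coarse_avg]])
    return np.linalg.solve(K, rhs)[:3]

# Neighboring coarse-cell samples and the conserved value of the center cell
x = np.array([-1.0, -0.5, 0.5, 1.0])
u = np.array([2.0, 1.4, 1.1, 1.3])
c = constrained_fit(x, u, coarse_avg=1.2)
print("coefficients:", c, "cell average:", c[0] + c[2] * 0.25 / 3)  # = 1.2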
[Transsexualism as an interdisciplinary phenomenon].
Bilikiewicz, Adam; Gromska, Jadwiga
2005-01-01
The authors express their criticism on the currently prevailing Polish diagnostic and therapeutic criteria of transsexualism. Relying on their clinical experience and expertise (opinions for the court) as well as current literature, they point to the necessity of a discussion between specialists from various medical fields (psychiatry, sexology, urology, surgery, endocrinology, genetics) and humanistic sciences (psychology, sociology, law, ethics) on this interdisciplinary phenomenon.
Perinetti, Giuseppe; Contardo, Luca
2017-01-01
Current evidence on the reliability of growth indicators in the identification of the pubertal growth spurt, and on the efficiency of functional treatment for skeletal Class II malocclusion whose timing relies on such indicators, is highly controversial. Regarding growth indicators, the hand-and-wrist (including the sole middle phalanx of the third finger) maturation method and standing height recording appear to be the most reliable. Other methods are subject to controversy or have been shown to be unreliable. Main sources of controversy include the use of single stages instead of ossification events and diagnostic reliability conjecturally based on correlation analyses. Regarding evidence on the efficiency of functional treatment, a more favorable response is seen in skeletal Class II patients treated during the pubertal growth spurt, even though large individual variation in responsiveness remains. Main sources of controversy include the design of clinical trials, the definition of Class II malocclusion, and the lack of inclusion of skeletal maturity among the prognostic factors. While no growth indicator may be considered fully diagnostically reliable in the identification of the pubertal growth spurt, their use may still be recommended for increasing the efficiency of functional treatment for skeletal Class II malocclusion. PMID:28168195
A review of current challenges for the identification of gemstones
NASA Astrophysics Data System (ADS)
Shigley, James E.
2008-01-01
A variety of treated and synthetic gem materials are encountered today in the jewelry marketplace in increasing quantities. Although normally entering the market with correct information, in some cases these materials are sold with incorrect or inaccurate information on their identity. In some cases, they exhibit appearances that correspond closely to those of valuable untreated, natural gemstones. Although they can display certain distinctive gemological characteristics, some treated and synthetic gem materials can be difficult for jewelers to recognize, especially when these individuals lack gemological training and access to standard gem-testing methods and equipment. In such instances, testing by a professional gemological laboratory may be required. Accurate gem identification and complete information disclosure are essential in the jewelry trade to maintain both the commercial value of natural gemstones and the confidence of consumers who are considering gemstone purchases. The goal of most current gemological research is to provide practical means of gem identification for jewelers and gemologists to help ensure integrity in the international gemstone trade. To support this goal, research on gem materials increasingly relies upon characterization with modern analytical tools such as chemical analysis, various spectroscopy methods, and other scientific techniques.
Assessing and Valuing Historical Geospatial Data for Decisions
NASA Astrophysics Data System (ADS)
Sylak-Glassman, E.; Gallo, J.
2016-12-01
We will present a method for assessing the use and valuation of historical geospatial data and information products derived from Earth observations (EO). Historical data are widely used in the establishment of baseline reference cases, time-series analysis, and Earth system modeling. Historical geospatial data are used in diverse application areas, such as risk assessment in the insurance and reinsurance industry, disaster preparedness and response planning, historical demography, land-use change analysis, and paleoclimate research, among others. Establishing the current value of previously collected data, often from EO systems that are no longer operating, is difficult, since the costs associated with their preservation, maintenance, and dissemination are current, while the costs associated with their original collection are sunk. Understanding their current use and value can aid in funding decisions about the data management infrastructure and workforce allocation required to maintain their availability. Using a value-tree framework to trace the application of data from EO systems, sensors, networks, and surveys to weighted key Federal objectives, we are able to estimate the relative contribution of individual EO systems, sensors, networks, and surveys to meeting those objectives. The analysis relies on a modified Delphi method to elicit relative levels of reliance on individual EO data inputs, including historical data, from subject matter experts. This results in the identification of a representative portfolio of all EO data used to meet key Federal objectives. Because historical data are collected in conjunction with all other EO data within a weighted framework, their contribution to meeting key Federal objectives can be specifically identified and evaluated in relation to other EO data. The results of this method could be applied to better understanding and projecting the long-term value of data from current and future EO systems.
Searching for Exoplanets using Artificial Intelligence
NASA Astrophysics Data System (ADS)
Pearson, Kyle Alexander; Palafox, Leon; Griffith, Caitlin Ann
2017-10-01
In the last decade, over a million stars were monitored to detect transiting planets. The large volume of data obtained from current and future missions (e.g. Kepler, K2, TESS and LSST) requires automated methods to detect the signature of a planet. Manual interpretation of potential exoplanet candidates is labor intensive and subject to human error, the results of which are difficult to quantify. Here we present a new method of detecting exoplanet candidates in large planetary search projects which, unlike current methods, uses a neural network. Neural networks, also called "deep learning" or "deep nets", are a state-of-the-art machine learning technique designed to give a computer perception of a specific problem by training it to recognize patterns. Unlike past transit detection algorithms, the deep net learns to characterize the data instead of relying on hand-coded metrics that humans perceive as the most representative. Exoplanet transits have different shapes, as a result of, e.g., the planet's and stellar atmosphere and the transit geometry. Thus, a simple template does not suffice to capture the subtle details, especially if the signal is below the noise or strong systematics are present. Current false-positive rates from the Kepler data are estimated around 12.3% for Earth-like planets, and there has been no study of the false-negative rates. It is therefore important to ask how the properties of current algorithms affect the results of the Kepler mission and of future missions such as TESS, which flies next year. These uncertainties affect the fundamental research derived from such missions, including the discovery of habitable planets, estimates of their occurrence rates, and our understanding of the nature and evolution of planetary systems.
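A minimal sketch of the idea, with numpy standing in for a deep-learning framework: train a small fully connected network on labeled synthetic light curves rather than matching a hand-coded template. The architecture, transit model, and noise levels are illustrative assumptions, not the paper's pipeline.

# Sketch: a learned transit detector on synthetic light curves.
import numpy as np

rng = np.random.default_rng(0)
n, L = 2000, 64                      # samples, points per light curve

def light_curve(has_transit):
    flux = 1.0 + rng.normal(0, 0.002, L)
    if has_transit:
        i0 = rng.integers(10, 40)
        flux[i0:i0 + 8] -= 0.004     # box-shaped dip near the noise floor
    return flux

y = rng.integers(0, 2, n)
X = np.array([light_curve(t) for t in y])
X = (X - X.mean(1, keepdims=True)) / X.std(1, keepdims=True)  # normalize

# One-hidden-layer network, logistic output, full-batch gradient descent
W1 = rng.normal(0, 0.1, (L, 32)); b1 = np.zeros(32)
w2 = rng.normal(0, 0.1, 32); b2 = 0.0
for epoch in range(300):
    H = np.maximum(X @ W1 + b1, 0.0)                 # ReLU features
    p = 1 / (1 + np.exp(-(H @ w2 + b2)))             # transit probability
    g = (p - y) / n                                  # dLoss/dlogit (BCE)
    w2 -= 0.5 * H.T @ g; b2 -= 0.5 * g.sum()
    gH = np.outer(g, w2) * (H > 0)                   # backprop to hidden layer
    W1 -= 0.5 * X.T @ gH; b1 -= 0.5 * gH.sum(0)
print("training accuracy:", ((p > 0.5) == y).mean())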
Gravitational waveforms for neutron star binaries from binary black hole simulations
NASA Astrophysics Data System (ADS)
Barkett, Kevin; Scheel, Mark; Haas, Roland; Ott, Christian; Bernuzzi, Sebastiano; Brown, Duncan; Szilagyi, Bela; Kaplan, Jeffrey; Lippuner, Jonas; Muhlberger, Curran; Foucart, Francois; Duez, Matthew
2016-03-01
Gravitational waves from binary neutron star (BNS) and black-hole/neutron star (BHNS) inspirals are primary sources for detection by the Advanced Laser Interferometer Gravitational-Wave Observatory. The tidal forces acting on the neutron stars induce changes in the phase evolution of the gravitational waveform, and these changes can be used to constrain the nuclear equation of state. Current methods of generating BNS and BHNS waveforms rely on either computationally challenging full 3D hydrodynamical simulations or approximate analytic solutions. We introduce a new method for computing inspiral waveforms for BNS/BHNS systems by adding the post-Newtonian (PN) tidal effects to full numerical simulations of binary black holes (BBHs), effectively replacing the non-tidal terms in the PN expansion with BBH results. Comparing a waveform generated with this method against a full hydrodynamical simulation of a BNS inspiral yields a phase difference of < 1 radian over ~ 15 orbits. The numerical phase accuracy required of BNS simulations to measure the accuracy of the method we present here is estimated as a function of the tidal deformability parameter λ.
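A minimal sketch of the hybridization step: keep the amplitude and phase of a (here synthetic) BBH waveform and add only a tidal phase correction. The coefficient and v^10 scaling in delta_phi_tidal are placeholders for the paper's PN expressions, and the chirp is a toy model, not NR data.

# Sketch: adding a PN-style tidal phase to a BBH-like waveform.
import numpy as np

def delta_phi_tidal(v, lam, coeff=-1.0):
    """Hypothetical leading-order tidal phase vs PN velocity v = (pi*M*f)^(1/3).
    Real coefficients depend on the masses and the deformability lambda."""
    return coeff * lam * v ** 10

def hybrid_waveform(amp_bbh, phase_bbh, freq, lam, M=1.0):
    """Keep BBH amplitude/phase (the non-tidal content); add tidal dephasing."""
    v = (np.pi * M * freq) ** (1.0 / 3.0)
    return amp_bbh * np.exp(1j * (phase_bbh + delta_phi_tidal(v, lam)))

# Toy chirp in geometric units (M = 1)
t = np.linspace(-1000.0, -10.0, 5000)
freq = 0.02 * (-t / 1000.0) ** (-3.0 / 8.0)           # chirping frequency
phase_bbh = np.cumsum(2 * np.pi * freq) * (t[1] - t[0])
amp_bbh = 1e-2 * freq ** (2.0 / 3.0)
h = hybrid_waveform(amp_bbh, phase_bbh, freq, lam=500.0)
v_end = (np.pi * freq[-1]) ** (1.0 / 3.0)
print("tidal phase at end of inspiral (rad):", delta_phi_tidal(v_end, 500.0))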
User-guided segmentation for volumetric retinal optical coherence tomography images
Yin, Xin; Chao, Jennifer R.; Wang, Ruikang K.
2014-01-01
Despite the existence of automatic segmentation techniques, trained graders still rely on manual segmentation to provide retinal layers and features from clinical optical coherence tomography (OCT) images for accurate measurements. To bridge the gap between this time-consuming need of manual segmentation and currently available automatic segmentation techniques, this paper proposes a user-guided segmentation method to perform the segmentation of retinal layers and features in OCT images. With this method, by interactively navigating three-dimensional (3-D) OCT images, the user first manually defines user-defined (or sketched) lines at regions where the retinal layers appear very irregular for which the automatic segmentation method often fails to provide satisfactory results. The algorithm is then guided by these sketched lines to trace the entire 3-D retinal layer and anatomical features by the use of novel layer and edge detectors that are based on robust likelihood estimation. The layer and edge boundaries are finally obtained to achieve segmentation. Segmentation of retinal layers in mouse and human OCT images demonstrates the reliability and efficiency of the proposed user-guided segmentation method. PMID:25147962
PMU-Aided Voltage Security Assessment for a Wind Power Plant
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jiang, Huaiguang; Zhang, Yingchen; Zhang, Jun Jason
2015-10-05
Because wind power penetration levels in electric power systems are continuously increasing, voltage stability is a critical issue for maintaining power system security and operation. The traditional methods to analyze voltage stability can be classified into two categories: dynamic and steady-state. Dynamic analysis relies on time-domain simulations of faults at different locations; however, this method needs to exhaust faults at all locations to find the security region for voltage at a single bus. With widely deployed phasor measurement units (PMUs), the Thevenin equivalent matrix can be calculated from the voltage and current information collected by the PMUs. This paper proposes a method based on a Thevenin equivalent matrix to identify system locations that will have the greatest impact on the voltage at the wind power plant's point of interconnection. The number of dynamic voltage stability analysis runs is greatly reduced by using the proposed method. The numerical results demonstrate the feasibility, effectiveness, and robustness of the proposed approach for voltage security assessment for a wind power plant.
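A minimal sketch of the underlying estimation step: recover a Thevenin equivalent (source voltage E and impedance Zth) seen from a bus by complex least squares over successive PMU voltage/current phasors, V_k = E - Zth*I_k. The phasor values are synthetic placeholders; the paper's matrix construction over many buses is not reproduced.

# Sketch: Thevenin equivalent from PMU phasor snapshots.
import numpy as np

def thevenin_from_pmu(V, I):
    """V, I: complex phasor arrays from successive PMU snapshots."""
    A = np.column_stack([np.ones_like(I), -I])   # unknowns [E, Zth]
    x, *_ = np.linalg.lstsq(A, V, rcond=None)
    return x[0], x[1]

rng = np.random.default_rng(2)
E_true, Z_true = 1.02 + 0.0j, 0.01 + 0.08j       # per-unit values
I = (0.5 + 0.4 * rng.random(20)) * np.exp(1j * 0.2 * rng.random(20))
V = E_true - Z_true * I + 1e-4 * (rng.normal(size=20) + 1j * rng.normal(size=20))
E_hat, Z_hat = thevenin_from_pmu(V, I)
print(f"E = {E_hat:.4f}, Zth = {Z_hat:.4f}")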
A Buoyancy-based Method of Determining Fat Levels in Drosophila.
Hazegh, Kelsey E; Reis, Tânia
2016-11-01
Drosophila melanogaster is a key experimental system in the study of fat regulation. Numerous techniques currently exist to measure levels of stored fat in Drosophila, but most are expensive and/or laborious and have clear limitations. Here, we present a method to quickly and cheaply determine organismal fat levels in L3 Drosophila larvae. The technique relies on the differences in density between fat and lean tissues and allows for rapid detection of fat and lean phenotypes. We have verified the accuracy of this method by comparison to body fat percentage as determined by neutral lipid extraction and gas chromatography coupled with mass spectrometry (GCMS). We furthermore outline detailed protocols for the collection and synchronization of larvae as well as relevant experimental recipes. The technique presented below overcomes the major shortcomings in the most widely used lipid quantitation methods and provides a powerful way to quickly and sensitively screen L3 larvae for fat regulation phenotypes while maintaining the integrity of the larvae. This assay has wide applications for the study of metabolism and fat regulation using Drosophila.
A risk analysis for production processes with disposable bioreactors.
Merseburger, Tobias; Pahl, Ina; Müller, Daniel; Tanner, Markus
2014-01-01
Quality management systems are, as a rule, tightly defined systems that conserve existing processes and therefore guarantee compliance with quality standards. But maintaining quality also includes introducing new, enhanced production methods and making use of the latest findings of bioscience. The advances in biotechnology and single-use manufacturing methods for producing new drugs especially impose new challenges on quality management, as quality standards have not yet been set. New methods to ensure patient safety have to be established, as it is insufficient to rely only on current rules. A concept of qualification, validation, and manufacturing procedures based on risk management needs to be established and realized in pharmaceutical production. The chapter starts with an introduction to the regulatory background of the manufacture of medicinal products. It then continues with key methods of risk management. Hazards associated with the production of medicinal products with single-use equipment are described, with a focus on bioreactors, storage containers, and connecting devices. The hazards are subsequently evaluated, and criteria for risk evaluation are presented. This chapter concludes with aspects of industrial application of quality risk management.
Bias-Free Chemically Diverse Test Sets from Machine Learning.
Swann, Ellen T; Fernandez, Michael; Coote, Michelle L; Barnard, Amanda S
2017-08-14
Current benchmarking methods in quantum chemistry rely on databases that are built using a chemist's intuition. It is not fully understood how diverse or representative these databases truly are. Multivariate statistical techniques like archetypal analysis and K-means clustering have previously been used to summarize large sets of nanoparticles; however, molecules are more diverse and not as easily characterized by descriptors. In this work, we compare three sets of descriptors based on the one-, two-, and three-dimensional structure of a molecule. Using data from the NIST Computational Chemistry Comparison and Benchmark Database and machine learning techniques, we demonstrate the functional relationship between these structural descriptors and the electronic energy of molecules. Archetypes and prototypes found with topological or Coulomb matrix descriptors can be used to identify smaller, statistically significant test sets that better capture the diversity of chemical space. We apply this same method to find a diverse subset of organic molecules to demonstrate how the methods can easily be reapplied to individual research projects. Finally, we use our bias-free test sets to assess the performance of density functional theory and quantum Monte Carlo methods.
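As an illustration of the prototype idea, the sketch below clusters molecular descriptor vectors with K-means and keeps the molecule nearest each cluster center as a diversity-preserving test set. Random vectors stand in for Coulomb-matrix or topological descriptors.

# Sketch: diversity-preserving test-set selection via K-means prototypes.
import numpy as np
from scipy.cluster.vq import kmeans2

rng = np.random.default_rng(0)
descriptors = rng.normal(size=(500, 20))       # 500 molecules, 20 features
k = 12                                         # target test-set size

centers, labels = kmeans2(descriptors, k, minit="++")
prototypes = [
    int(np.argmin(np.linalg.norm(descriptors - c, axis=1))) for c in centers
]
print("representative molecule indices:", sorted(prototypes))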
Chang, Yuqing; Yang, Bo; Zhao, Xue; Linhardt, Robert J.
2012-01-01
A quantitative and highly sensitive method for the analysis of glycosaminoglycan (GAG)-derived disaccharides is presented that relies on capillary electrophoresis (CE) with laser-induced fluorescence (LIF) detection. This method enables complete separation of seventeen GAG-derived disaccharides in a single run. Unsaturated disaccharides were derivatized with 2-aminoacridone (AMAC) to improve sensitivity. The limit of detection was at the attomole level, about 100-fold more sensitive than traditional CE-ultraviolet detection. A CE separation timetable was developed to achieve complete resolution and shorten analysis time. The RSDs of migration time and peak area at both low and high concentrations of unsaturated disaccharides were less than 2.7% and 3.2%, respectively, demonstrating that the method is reproducible. The analysis was successfully applied to cultured Chinese hamster ovary cell samples for determination of GAG disaccharides. The current method simplifies the GAG extraction steps and reduces the inaccuracy in calculating ratios of heparin/heparan sulfate to chondroitin sulfate/dermatan sulfate that arises when a single sample must be analyzed in separate runs. PMID:22609076
Measuring the local mobility of graphene on semiconductors
NASA Astrophysics Data System (ADS)
Zhong, Haijian; Liu, Zhenghui; Wang, Jianfeng; Pan, Anlian; Xu, Gengzhao; Xu, Ke
2018-04-01
Mobility is an important parameter for gauging the performance of graphene devices and is usually measured by FET or Hall methods that rely on insulating substrates. However, these methods are not applicable to graphene on semiconductors, because some current inevitably crosses the junction and flows through the semiconductor rather than traversing only the graphene sheet. Here we demonstrate a method for measuring the local mobility of graphene on gallium nitride that combines Kelvin probe force microscopy (KPFM) and conductive atomic force microscopy (C-AFM). The carrier density, related to Fermi-level shifts in the graphene, is acquired from KPFM. The local mobility is then calculated from the carrier mean free path, which follows from the effective contact area fitted to local I-V curves of the graphene/GaN junction measured by C-AFM. Our method can probe an arbitrary region of the graphene, can be applied to other semiconductor substrates, and introduces no damage. These results will benefit current efforts to integrate graphene into various semiconductor devices.
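The final conversion from mean free path and carrier density to mobility can be sketched with a common semiclassical graphene expression, sigma = (2e^2/h) k_F lambda with k_F = sqrt(pi n); this relation is an assumption for illustration and may differ from the exact expression used in the paper.

import numpy as np

E = 1.602176634e-19   # elementary charge, C
H = 6.62607015e-34    # Planck constant, J s

def graphene_mobility(n_cm2, mfp_nm):
    """Semiclassical estimate: sigma = (2e^2/h) k_F mfp, mu = sigma / (n e)."""
    n = n_cm2 * 1e4               # carrier density, m^-2
    mfp = mfp_nm * 1e-9           # carrier mean free path, m
    k_f = np.sqrt(np.pi * n)      # Fermi wavevector, m^-1
    sigma = (2 * E**2 / H) * k_f * mfp
    return sigma / (n * E) * 1e4  # mobility, cm^2 V^-1 s^-1

# e.g. n = 5e12 cm^-2 and a 40 nm mean free path give roughly 1.5e3 cm^2/Vs
print(graphene_mobility(5e12, 40))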
Lindstedt, Bjørn-Arne; Heir, Even; Gjernes, Elisabet; Vardund, Traute; Kapperud, Georg
2003-01-01
Background: The ability to react early to possible outbreaks of Escherichia coli O157:H7 and to trace possible sources relies on the availability of highly discriminatory and reliable techniques. Methods that are fast and have the potential for complete automation are needed for this important pathogen. Methods: In all, 73 isolates of Shiga toxin-producing E. coli O157 (STEC) were used in this study. The two available fully sequenced STEC genomes were scanned for tandemly repeated stretches of DNA, which were evaluated as polymorphic markers for isolate identification. Results: The 73 E. coli isolates displayed 47 distinct patterns, and the MLVA assay was capable of high discrimination between the E. coli O157 strains. The assay was fast, and all steps can be automated. Conclusion: The findings demonstrate a novel, highly discriminatory molecular typing method for the important pathogen E. coli O157 that is fast and robust and offers many advantages over current methods. PMID:14664722
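The genome-scanning step can be illustrated with a naive search for perfect tandem repeats; this toy sketch is not the pipeline the authors used, and real tools also handle imperfect repeats.

def find_tandem_repeats(seq, unit_min=2, unit_max=9, min_copies=3):
    # Return (start, unit, copies) for perfect tandem repeats in seq,
    # greedily keeping the longest repeat tract found at each position.
    hits, i = [], 0
    while i < len(seq):
        best = None
        for u in range(unit_min, unit_max + 1):
            unit = seq[i:i + u]
            if len(unit) < u:
                break
            copies = 1
            while seq[i + copies * u:i + (copies + 1) * u] == unit:
                copies += 1
            if copies >= min_copies and (best is None or copies * u > len(best[1]) * best[2]):
                best = (i, unit, copies)
        if best:
            hits.append(best)
            i = best[0] + len(best[1]) * best[2]
        else:
            i += 1
    return hits

print(find_tandem_repeats("ACGTACGTACGTTTTTTTGCA"))  # [(0, 'ACGT', 3), (12, 'TT', 3)]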
High-resolution sulfur isotopes in ice cores identify large stratospheric volcanic eruptions
NASA Astrophysics Data System (ADS)
Burke, Andrea; Sigl, Michael; Adkins, Jess; Paris, Guillaume; McConnell, Joe
2016-04-01
The record of the volcanic forcing of climate over the past 2500 years is reconstructed primarily from sulfate concentrations in ice cores. Of particular interest are stratospheric eruptions, as these afford sulfate aerosols the longest residence time and largest dispersion in the atmosphere, and thus the greatest impact on radiative forcing. Identification of stratospheric eruptions currently relies on the successful matching of the same volcanic sulfate peak in ice cores from both the Northern and Southern hemispheres (a "bipolar event"). These are interpreted to reflect the global distribution of sulfur aerosols by the stratospheric winds. Despite its recent success, this method relies on precise and accurate dating of ice cores in order to distinguish between a true 'bipolar event' and two separate eruptions that occurred in close temporal succession. Sulfur isotopes can be used to distinguish between these two scenarios, since stratospheric sulfur aerosols are exposed to UV radiation, which imparts a mass-independent fractionation (Baroni et al., 2007). Mass-independent fractionation of sulfate in ice cores thus offers a novel method of fingerprinting stratospheric eruptions and thereby refining the historic record of explosive volcanism and its forcing of climate. Here we present new high-resolution (sub-annual) sulfur isotope data from the Tunu ice core in Greenland over seven eruptions. Sulfur isotopes were measured by MC-ICP-MS, which substantially reduces sample size requirements and allows high temporal resolution from a single ice core. We demonstrate the efficacy of the method on recent, well-known eruptions (including Pinatubo and Katmai/Novarupta), and then apply it to unidentified sulfate peaks, allowing us to identify new stratospheric eruptions. Baroni, M., Thiemens, M. H., Delmas, R. J., & Savarino, J. (2007). Mass-independent sulfur isotopic compositions in stratospheric volcanic eruptions. Science, 315(5808), 84-87. http://doi.org/10.1126/science.1131754
Forecasting Daily Patient Outflow From a Ward Having No Real-Time Clinical Data
Tran, Truyen; Luo, Wei; Phung, Dinh; Venkatesh, Svetha
2016-01-01
Background: Modeling patient flow is crucial in understanding resource demand and prioritization. We study patient outflow from an open ward in an Australian hospital, where currently bed allocation is carried out by a manager relying on past experiences and looking at demand. Automatic methods that provide a reasonable estimate of total next-day discharges can aid in efficient bed management. The challenges in building such methods lie in dealing with large amounts of discharge noise introduced by the nonlinear nature of hospital procedures, and the nonavailability of real-time clinical information in wards. Objective: Our study investigates different models to forecast the total number of next-day discharges from an open ward having no real-time clinical data. Methods: We compared 5 popular regression algorithms to model total next-day discharges: (1) autoregressive integrated moving average (ARIMA), (2) the autoregressive moving average with exogenous variables (ARMAX), (3) k-nearest neighbor regression, (4) random forest regression, and (5) support vector regression. While the ARIMA model relied on the past 3 months of discharges, nearest-neighbor forecasting used the median of similar past discharges to estimate the next-day discharge. In addition, the ARMAX model used the day of the week and the number of patients currently in the ward as exogenous variables. For the random forest and support vector regression models, we designed a predictor set of 20 patient features and 88 ward-level features. Results: Our data consisted of 12,141 patient visits over 1826 days. Forecasting quality was measured using mean forecast error, mean absolute error, symmetric mean absolute percentage error, and root mean square error. When compared with a moving average prediction model, all 5 models demonstrated superior performance, with random forest achieving a 22.7% improvement in mean absolute error across all days in the year 2014. Conclusions: In the absence of clinical information, our study recommends using patient-level and ward-level data in predicting next-day discharges. Random forest and support vector regression models are able to use all available features from such data, resulting in superior performance over traditional autoregressive methods. An intelligent estimate of available beds in wards plays a crucial role in relieving access block in emergency departments. PMID:27444059
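A minimal next-day forecast with the ARIMA baseline can be sketched as follows; the model order and the synthetic counts are illustrative assumptions, not the paper's configuration.

import numpy as np
import pandas as pd
from statsmodels.tsa.arima.model import ARIMA

# Synthetic daily discharge counts with a weekly cycle, standing in for the
# ward's historical series.
rng = np.random.default_rng(1)
y = pd.Series(20 + 5 * np.sin(2 * np.pi * np.arange(365) / 7) + rng.normal(0, 3, 365))

fit = ARIMA(y, order=(7, 0, 1)).fit()  # order chosen for illustration only
print(fit.forecast(steps=1))           # estimated total next-day discharges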
NASA Astrophysics Data System (ADS)
Jorge, Marco G.; Brennand, Tracy A.
2017-07-01
Relict drumlin and mega-scale glacial lineation (positive relief, longitudinal subglacial bedforms - LSBs) morphometry has been used as a proxy for paleo ice-sheet dynamics. LSB morphometric inventories have relied on manual mapping, which is slow and subjective and thus potentially difficult to reproduce. Automated methods are faster and reproducible, but previous methods for LSB semi-automated mapping have not been highly successful. Here, two new object-based methods for the semi-automated extraction of LSBs (footprints) from digital terrain models are compared in a test area in the Puget Lowland, Washington, USA. As segmentation procedures to create LSB-candidate objects, the normalized closed contour method relies on the contouring of a normalized local relief model addressing LSBs on slopes, and the landform elements mask method relies on the classification of landform elements derived from the digital terrain model. For identifying which LSB-candidate objects correspond to LSBs, both methods use the same LSB operational definition: a ruleset encapsulating expert knowledge, published morphometric data, and the morphometric range of LSBs in the study area. The normalized closed contour method was separately applied to four different local relief models, two computed in moving windows and two hydrology-based. Overall, the normalized closed contour method outperformed the landform elements mask method, and it performed best on a hydrology-based relief model derived from a multiple-direction flow-routing algorithm. For an assessment of its transferability, the normalized closed contour method was evaluated on a second area, the Chautauqua drumlin field, Pennsylvania and New York, USA, where it performed better than in the Puget Lowland. A broad comparison to previous methods suggests that the normalized closed contour method may be the most capable method to date, but more development is required.
The ethical tightrope: politics of intimacy and consensual method in sexuality research.
Zago, Luiz F; Holmes, Dave
2015-06-01
This paper seeks to analyze the construction of ethics in sexuality research in which qualitative methods are employed in the field of social sciences. Analyses are based on a bibliographic review of current discussions on research methods of queer theory and on the authors' own experiences of past research on sexuality. The article offers a theoretical perspective on the ways ethnography and in-depth interviews become methods that can rely on a consensual method and create a politics of intimacy between the researchers and research participants. The politics of intimacy may contribute to the production of a politically engaged knowledge while escaping from the moral matrix that usually governs the relationship between researchers and research participants. It is argued here that the researcher's sexed and gendered body matters for fieldwork; that the consensual method among participants may be employed in sexuality research as a fruitful tool; and that the relationships created among researchers and participants can pose a challenge to predetermined ethical guidelines in research. As a result, discussions problematize the existence of a politics of intimacy in sexuality research that is characterized by ethical relations among research participants. © 2014 John Wiley & Sons Ltd.
Implicit Plasma Kinetic Simulation Using The Jacobian-Free Newton-Krylov Method
NASA Astrophysics Data System (ADS)
Taitano, William; Knoll, Dana; Chacon, Luis
2009-11-01
The use of fully implicit time integration methods in kinetic simulation is still an area of algorithmic research. A brute-force approach to simultaneously including the field equations and the particle distribution function would result in an intractable linear algebra problem. A number of algorithms have been put forward which rely on an extrapolation in time. They can be thought of as linearly implicit methods or one-step Newton methods. However, issues related to the time accuracy of these methods still remain. We are pursuing a route to implicit plasma kinetic simulation which eliminates extrapolation, eliminates phase-space from the linear algebra problem, and converges the entire nonlinear system within a time step. We accomplish all this using the Jacobian-Free Newton-Krylov algorithm. The original research along these lines considered particle methods to advance the distribution function [1]. In the current research we are advancing the Vlasov equations on a grid. Results will be presented which highlight algorithmic details for single-species electrostatic problems and coupled ion-electron electrostatic problems. [1] H. J. Kim, L. Chacón, G. Lapenta, "Fully implicit particle in cell algorithm," 47th Annual Meeting of the Division of Plasma Physics, Oct. 24-28, 2005, Denver, CO
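The Jacobian-free step can be illustrated on a toy implicit time step; a minimal sketch using SciPy's Newton-Krylov solver, not the authors' Vlasov code, with the model problem, grid, and tolerances as assumptions.

import numpy as np
from scipy.optimize import newton_krylov

# One backward-Euler step of a toy nonlinear diffusion equation u_t = (u^3)_xx,
# solved as F(u_new) = 0 without ever forming the Jacobian.
n, dt = 64, 1e-3
dx = 1.0 / n
x = np.arange(n) * dx
u_old = 1.0 + 0.5 * np.sin(2 * np.pi * x)

def lap(v):  # periodic second difference
    return (np.roll(v, -1) - 2 * v + np.roll(v, 1)) / dx**2

def residual(u_new):
    return u_new - u_old - dt * lap(u_new**3)

u_new = newton_krylov(residual, u_old, f_tol=1e-8)
print(np.abs(residual(u_new)).max())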
Recent Advances in Active Infrared Thermography for Non-Destructive Testing of Aerospace Components.
Ciampa, Francesco; Mahmoodi, Pooya; Pinto, Fulvio; Meo, Michele
2018-02-16
Active infrared thermography is a fast and accurate non-destructive evaluation technique that is of particular relevance to the aerospace industry for the inspection of aircraft and helicopters' primary and secondary structures, aero-engine parts, spacecraft components and its subsystems. This review provides an exhaustive summary of most recent active thermographic methods used for aerospace applications according to their physical principle and thermal excitation sources. Besides traditional optically stimulated thermography, which uses external optical radiation such as flashes, heaters and laser systems, novel hybrid thermographic techniques are also investigated. These include ultrasonic stimulated thermography, which uses ultrasonic waves and the local damage resonance effect to enhance the reliability and sensitivity to micro-cracks, eddy current stimulated thermography, which uses cost-effective eddy current excitation to generate induction heating, and microwave thermography, which uses electromagnetic radiation at the microwave frequency bands to provide rapid detection of cracks and delamination. All these techniques are here analysed and numerous examples are provided for different damage scenarios and aerospace components in order to identify the strength and limitations of each thermographic technique. Moreover, alternative strategies to current external thermal excitation sources, here named as material-based thermography methods, are examined in this paper. These novel thermographic techniques rely on thermoresistive internal heating and offer a fast, low power, accurate and reliable assessment of damage in aerospace composites.
Mass Function of Galaxy Clusters in Relativistic Inhomogeneous Cosmology
NASA Astrophysics Data System (ADS)
Ostrowski, Jan J.; Buchert, Thomas; Roukema, Boudewijn F.
The current cosmological model (ΛCDM) with the underlying FLRW metric relies on the assumption of local isotropy, hence homogeneity of the Universe. Difficulties arise when one attempts to justify this model as an average description of the Universe from first principles of general relativity, since in general, the Einstein tensor built from the averaged metric is not equal to the averaged stress-energy tensor. In this context, the discrepancy between these quantities is called "cosmological backreaction" and has been the subject of scientific debate among cosmologists and relativists for more than 20 years. Here we present one of the methods to tackle this problem, i.e. averaging the scalar parts of the Einstein equations, together with its application, the cosmological mass function of galaxy clusters.
Kelly, Benjamin J; Fitch, James R; Hu, Yangqiu; Corsmeier, Donald J; Zhong, Huachun; Wetzel, Amy N; Nordquist, Russell D; Newsom, David L; White, Peter
2015-01-20
While advances in genome sequencing technology make population-scale genomics a possibility, current approaches for analysis of these data rely on parallelization strategies with limited scalability, complex implementations, and poor reproducibility. Churchill, a balanced regional parallelization strategy, overcomes these challenges, fully automating the multiple steps required to go from raw sequencing reads to variant discovery. Through implementation of novel deterministic parallelization techniques, Churchill allows computationally efficient analysis of a high-depth whole genome sample in less than two hours. The method is highly scalable, enabling full analysis of the 1000 Genomes raw sequence dataset in a week using cloud resources. http://churchill.nchri.org/.
Feng, Liang; Wang, Wei; Yao, Hang-Ping; Zhou, Jianwei; Zhang, Ruiwen; Wang, Ming-Hai
2015-01-01
Targeting receptor tyrosine kinases by therapeutic monoclonal antibodies and antibody-drug conjugates has met with tremendous success in clinical oncology. Currently, numerous therapeutic monoclonal antibodies are under preclinical development. The potential for moving candidate antibodies into clinical trials relies heavily on therapeutic efficacy validated by human tumor xenografts in mice. Here we describe methods used to determine therapeutic efficacy of monoclonal antibodies or antibody-drug conjugates specific to human receptor tyrosine kinase using human tumor xenografts in mice as the model. The end point of the study is to determine whether treatment of tumor-bearing mice with a monoclonal antibody or antibody-drug conjugates results in significant delay of tumor growth.
Navigating the changing learning landscape: perspective from bioinformatics.ca
Ouellette, B. F. Francis
2013-01-01
With the advent of YouTube channels in bioinformatics, open platforms for problem solving in bioinformatics, active web forums in computing analyses and online resources for learning to code or use a bioinformatics tool, the more traditional continuing education bioinformatics training programs have had to adapt. Bioinformatics training programs that solely rely on traditional didactic methods are being superseded by these newer resources. Yet such face-to-face instruction is still invaluable in the learning continuum. Bioinformatics.ca, which hosts the Canadian Bioinformatics Workshops, has blended more traditional learning styles with current online and social learning styles. Here we share our growing experiences over the past 12 years and look toward what the future holds for bioinformatics training programs. PMID:23515468
Quantitative ultrasonic evaluation of mechanical properties of engineering materials
NASA Technical Reports Server (NTRS)
Vary, A.
1978-01-01
Current progress in the application of ultrasonic techniques to nondestructive measurement of mechanical strength properties of engineering materials is reviewed. Even where conventional NDE techniques have shown that a part is free of overt defects, advanced NDE techniques should be available to confirm the material properties assumed in the part's design. There are many instances where metallic, composite, or ceramic parts may be free of critical defects while still being susceptible to failure under design loads due to inadequate or degraded mechanical strength. This must be considered in any failure prevention scheme that relies on fracture analysis. This review will discuss the availability of ultrasonic methods that can be applied to actual parts to assess their potential susceptibility to failure under design conditions.
LDRD final report : mesoscale modeling of dynamic loading of heterogeneous materials
DOE Office of Scientific and Technical Information (OSTI.GOV)
Robbins, Joshua; Dingreville, Remi Philippe Michel; Voth, Thomas Eugene
2013-12-01
Material response to dynamic loading is often dominated by microstructure (grain structure, porosity, inclusions, defects). An example critically important to Sandia's mission is dynamic strength of polycrystalline metals where heterogeneities lead to localization of deformation and loss of shear strength. Microstructural effects are of broad importance to the scientific community and several institutions within DoD and DOE; however, current models rely on inaccurate assumptions about mechanisms at the sub-continuum or mesoscale. Consequently, there is a critical need for accurate and robust methods for modeling heterogeneous material response at this lower length scale. This report summarizes work performed as part of an LDRD effort (FY11 to FY13; project number 151364) to meet these needs.
Body Temperature Measurements for Metabolic Phenotyping in Mice
Meyer, Carola W.; Ootsuka, Youichirou; Romanovsky, Andrej A.
2017-01-01
Endothermic organisms rely on tightly balanced energy budgets to maintain a regulated body temperature and body mass. Metabolic phenotyping of mice, therefore, often includes the recording of body temperature. Thermometry in mice is conducted at various sites, using various devices and measurement practices, ranging from single-time probing to continuous temperature imaging. Whilst there is broad agreement that body temperature data is of value, procedural considerations of body temperature measurements in the context of metabolic phenotyping are missing. Here, we provide an overview of the various methods currently available for gathering body temperature data from mice. We explore the scope and limitations of thermometry in mice, with the hope of assisting researchers in the selection of appropriate approaches, and conditions, for comprehensive mouse phenotypic analyses. PMID:28824441
Forensic 3D Scene Reconstruction
DOE Office of Scientific and Technical Information (OSTI.GOV)
LITTLE,CHARLES Q.; PETERS,RALPH R.; RIGDON,J. BRIAN
Traditionally law enforcement agencies have relied on basic measurement and imaging tools, such as tape measures and cameras, in recording a crime scene. A disadvantage of these methods is that they are slow and cumbersome. The development of a portable system that can rapidly record a crime scene with current camera imaging, 3D geometric surface maps, and contribute quantitative measurements such as accurate relative positioning of crime scene objects, would be an asset to law enforcement agents in collecting and recording significant forensic data. The purpose of this project is to develop a feasible prototype of a fast, accurate, 3D measurement and imaging system that would support law enforcement agents to quickly document and accurately record a crime scene.
Shu, Lisa L.; Mazar, Nina; Gino, Francesca; Ariely, Dan; Bazerman, Max H.
2012-01-01
Many written forms required by businesses and governments rely on honest reporting. Proof of honest intent is typically provided through signature at the end of, e.g., tax returns or insurance policy forms. Still, people sometimes cheat to advance their financial self-interests—at great costs to society. We test an easy-to-implement method to discourage dishonesty: signing at the beginning rather than at the end of a self-report, thereby reversing the order of the current practice. Using laboratory and field experiments, we find that signing before—rather than after—the opportunity to cheat makes ethics salient when they are needed most and significantly reduces dishonesty. PMID:22927408
An update on 'dose calibrator' settings for nuclides used in nuclear medicine.
Bergeron, Denis E; Cessna, Jeffrey T
2018-06-01
Most clinical measurements of radioactivity, whether for therapeutic or imaging nuclides, rely on commercial re-entrant ionization chambers ('dose calibrators'). The National Institute of Standards and Technology (NIST) maintains a battery of representative calibrators and works to link calibration settings ('dial settings') to primary radioactivity standards. Here, we provide a summary of NIST-determined dial settings for 22 radionuclides. We collected previously published dial settings and determined some new ones using either the calibration curve method or the dialing-in approach. The dial settings with their uncertainties are collected in a comprehensive table. In general, current manufacturer-provided calibration settings give activities that agree with National Institute of Standards and Technology standards to within a few percent.
Statistical Analysis of Protein Ensembles
NASA Astrophysics Data System (ADS)
Máté, Gabriell; Heermann, Dieter
2014-04-01
As 3D protein-configuration data piles up, there is an ever-increasing need for well-defined, mathematically rigorous analysis approaches, especially since the vast majority of currently available methods rely heavily on heuristics. We propose an analysis framework which stems from topology, the field of mathematics which studies properties preserved under continuous deformations. First, we calculate a barcode representation of the molecules employing computational topology algorithms. Bars in this barcode represent different topological features. Molecules are compared through their barcodes by statistically determining the difference in the set of their topological features. As a proof-of-principle application, we analyze a dataset compiled from ensembles of different proteins, obtained from the Ensemble Protein Database. We demonstrate that our approach correctly detects the different protein groupings.
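The barcode computation can be sketched with an off-the-shelf persistent homology package; a hedged example using ripser (an assumption, since the authors do not name their software) on a synthetic point cloud standing in for atomic coordinates.

import numpy as np
from ripser import ripser  # third-party persistent homology package

# Synthetic noisy circle standing in for one molecule's coordinates;
# its single loop should show up as one long bar in dimension 1.
rng = np.random.default_rng(2)
theta = rng.uniform(0, 2 * np.pi, 100)
pts = np.c_[np.cos(theta), np.sin(theta)] + rng.normal(0, 0.05, (100, 2))

dgms = ripser(pts, maxdim=1)['dgms']  # one (birth, death) pair per bar
h1 = dgms[1]
print("longest H1 bar length:", (h1[:, 1] - h1[:, 0]).max())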
Novel concept for driving the linear compressor of a micro-miniature split Stirling cryogenic cooler
NASA Astrophysics Data System (ADS)
Maron, V.; Veprik, A.; Finkelstein, L.; Vilenchik, H.; Ziv, I.; Pundak, N.
2009-05-01
New methods of carrying out homeland security and antiterrorist operations call for the development of a new generation of mechanically cooled, portable, battery-powered infrared imagers, relying on micro-miniature Stirling cryogenic coolers of rotary or linear types. Since split Stirling linearly driven micro-miniature cryogenic coolers have inherently longer life spans, lower vibration export, and better aural stealth as compared to their rotary driven rivals, they are more suitable for the above applications. The performance of such cryogenic coolers depends strongly on the efficacy of their electronic drivers. In a traditional approach, the PWM power electronics produce a fixed-frequency tonal driving voltage/current, the magnitude of which is modulated via a PID control law so as to maintain the desired focal plane array temperature. The disadvantage of such drivers is that they draw high ripple current from the system's power bus. This results in the need for an oversized DC power supply (battery packs) and power electronic components, low efficiency due to excessive conductive losses, and high residual electromagnetic interference, which in turn degrades the performance of other systems connected to the same power bus. Without either an active line filter or large and heavy passive filtering, other electronics cannot be powered from the same power bus unless they incorporate heavy filtering at their inputs. The authors present the results of a feasibility study towards developing a novel "pumping" driver consuming essentially constant instantaneous battery power/current without making use of an active or passive filter. In the tested setup, the driver relies on a bidirectional controllable bridge, invertible with the driving frequency, and a fast regulated DC/DC converter which maintains a constant level of current consumed from the DC power supply and thus operates in input current control mode. From the experimental results, the steady-state power consumed by the linear compressor remains the same as with the traditional sine-wave driver, the voltage and current drawn from the battery pack are essentially free of low-frequency ripple (without any kind of filtering), and the overall coefficient of performance of the driver exceeds 94% over the entire working range of supply voltages. Such a driver needs no sine-forming PWM stage and has reduced power peaks in all power-conversion components.
NASA Astrophysics Data System (ADS)
Murray, S.; Guerra, J. A.
2017-12-01
One essential component of operational space weather forecasting is the prediction of solar flares. Early flare forecasting work focused on statistical methods based on historical flaring rates, but more complex machine learning methods have been developed in recent years. A multitude of flare forecasting methods are now available; however, it is still unclear which of these methods performs best, and none are substantially better than climatological forecasts. Current operational space weather centres cannot rely on automated methods, and generally use statistical forecasts with a little human intervention. Space weather researchers are increasingly looking towards methods used in terrestrial weather to improve current forecasting techniques. Ensemble forecasting has been used in numerical weather prediction for many years as a way to combine different predictions in order to obtain a more accurate result. It has proved useful in areas such as magnetospheric modelling and coronal mass ejection arrival analysis, but has not yet been implemented in operational flare forecasting. Here we construct ensemble forecasts for major solar flares by linearly combining the full-disk probabilistic forecasts from a group of operational forecasting methods (ASSA, ASAP, MAG4, MOSWOC, NOAA, and Solar Monitor). Forecasts from each method are weighted by a factor that accounts for the method's ability to predict previous events, and several performance metrics (both probabilistic and categorical) are considered. The results provide space weather forecasters with a set of parameters (combination weights, thresholds) that allow them to select the most appropriate values for constructing the 'best' ensemble forecast probability value, according to the performance metric of their choice. In this way different forecasts can be made to fit different end-user needs.
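A linear-combination ensemble with performance-based weights can be sketched as follows; a hedged example that fits nonnegative weights summing to one by minimizing the Brier score on past events, with all probabilities and outcomes synthetic stand-ins.

import numpy as np
from scipy.optimize import minimize

# p[i, j]: flare probability issued by method j for past window i;
# y[i]: 1 if a flare occurred, else 0. Both are synthetic here.
rng = np.random.default_rng(3)
p = rng.uniform(0, 1, (200, 6))            # six methods, as in the paper
y = rng.integers(0, 2, 200).astype(float)

def brier(w):                              # mean squared probability error
    return np.mean((p @ w - y) ** 2)

cons = ({'type': 'eq', 'fun': lambda w: w.sum() - 1.0},)
res = minimize(brier, np.full(6, 1 / 6), bounds=[(0, 1)] * 6, constraints=cons)
print("weights:", np.round(res.x, 3), "ensemble Brier:", round(brier(res.x), 4))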
Puzon, Geoffrey J; Lancaster, James A; Wylie, Jason T; Plumb, Jason J
2009-09-01
Rapid detection of pathogenic Naegleria fowleri in water distribution networks is critical for water utilities. Current detection methods rely on sampling drinking water followed by culturing and molecular identification of purified strains. This culture-based method takes an extended amount of time (days), detects both nonpathogenic and pathogenic species, and does not account for N. fowleri cells associated with pipe wall biofilms. In this study, a total DNA extraction technique coupled with a real-time PCR method using primers specific for N. fowleri was developed and validated. The method readily detected N. fowleri without preculturing; the lowest detection limits for N. fowleri cells spiked into biofilm were one cell (66% detection rate) and five cells (100% detection rate). For drinking water, the detection limits were five cells (66% detection rate) and 10 cells (100% detection rate). By comparison, culture-based methods were less sensitive for detection of cells spiked into both biofilm (66% detection for <10 cells) and drinking water (0% detection for <10 cells). In mixed cultures of N. fowleri and nonpathogenic Naegleria, the method identified N. fowleri in 100% of all replicates, whereas tests with the current consensus primers detected N. fowleri in only 5% of all replicates. Application of the new method to drinking water and pipe wall biofilm samples obtained from a distribution network enabled the detection of N. fowleri in under 6 h, versus 3+ days for the culture-based method. Further, comparison of the real-time PCR data from the field samples and the standard curves enabled an approximation of N. fowleri cell numbers in the biofilm and drinking water. The use of such a method will further aid water utilities in detecting and managing the persistence of N. fowleri in water distribution networks.
Microfluidics-based, time-resolved mechanical phenotyping of cells using high-speed imaging
NASA Astrophysics Data System (ADS)
Belotti, Yuri; Conneely, Michael; Huang, Tianjun; McKenna, Stephen; Nabi, Ghulam; McGloin, David
2017-07-01
We demonstrate a single channel hydrodynamic stretching microfluidic device that relies on high-speed imaging to allow repeated dynamic cell deformation measurements. Experiments on prostate cancer cells suggest richer data than current approaches.
Contractor evaluations in the contractor selection process.
DOT National Transportation Integrated Search
2014-04-01
The current contractor evaluation system in use within the Kentucky Transportation Cabinet is based on the contractor evaluation system developed as part of SPR 212-00 "Quality Based Prequalification of Contractors." This system relies on average per...
Sex and hemisphere differences when mentally rotating meaningful and meaningless stimuli.
Rilea, Stacy L
2008-05-01
The purpose of the current study was to investigate the influence of stimulus type and sex on strategy use and hemispheric processing during the mental rotation task. Participants included 67 right-handed men and women who completed three mental rotation tasks, all presented bilaterally. Participants rotated human stick figures, alphanumeric stimuli, and a two-dimensional (2D) meaningless object. No hemispheric differences were observed when rotating human stick figures, suggesting that men and women may rely on the same strategy. A left hemisphere advantage was observed in women when rotating alphanumeric stimuli, suggesting they may be relying on a verbal strategy, whereas no hemispheric differences were observed for men. Finally, inconsistent with predictions, no hemisphere differences were observed when rotating two-dimensional objects. The findings from the current study suggest that both the meaningfulness and the type of stimulus presented may influence strategy use differently for men and women.
Mao, Longfei; Verwoerd, Wynand S
2013-10-01
Synechocystis sp. PCC 6803 has been considered a promising biocatalyst for electricity generation in recent microbial fuel cell research. However, its innate maximum current production potential and the underlying metabolic pathways supporting high current output are still unknown. This is mainly because the high-current-production phenotype results from the interaction among hundreds of reactions in the metabolism, and it is impossible for reductionist methods to characterize the pathway selection in such a metabolic state. In this study, we employed the computational metabolic techniques of flux balance analysis and flux variability analysis to estimate the maximum current outputs of Synechocystis sp. PCC 6803 in five electron transfer cases, namely, ferredoxin- and plastoquinol-dependent electron transfer under photoautotrophic cultivation, and NADH-dependent mediated electron transfer under photoautotrophic, heterotrophic, and mixotrophic conditions. In these five modes, the maximum current outputs were computed as 0.198, 0.7918, 0.198, 0.4652, and 0.4424 A gDW⁻¹, respectively. Comparison of the five operational modes suggests that plastoquinol-/c-type cytochrome-targeted electricity generation has the advantage of liberating the highest current output achievable for Synechocystis sp. PCC 6803. On the other hand, the analysis indicates that electricity generation dependent on the currency metabolite NADH can rely on a number of reactions from different pathways and is thus more robust against environmental perturbations.
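The FBA/FVA workflow can be sketched with COBRApy; a hedged example in which the model file name and the electron-export reaction id are assumptions for illustration, not names from the paper.

from cobra.io import read_sbml_model
from cobra.flux_analysis import flux_variability_analysis

# Hypothetical SBML file for a Synechocystis sp. PCC 6803 reconstruction.
model = read_sbml_model("synechocystis_pcc6803.xml")

# FBA: maximize a (hypothetical) electron-export reaction standing in for
# current production instead of the usual biomass objective.
model.objective = "EX_electron_e"  # assumed reaction id
print("max current-proxy flux:", model.optimize().objective_value)

# FVA: flux ranges of all reactions while holding 95% of the optimum,
# revealing which pathways can vary while sustaining high current.
print(flux_variability_analysis(model, fraction_of_optimum=0.95).head())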
NASA Astrophysics Data System (ADS)
Tsalamengas, John L.
2018-07-01
We study plane-wave electromagnetic scattering by radially and strongly inhomogeneous dielectric cylinders at oblique incidence. The method of analysis relies on an exact reformulation of the underlying field equations as a first-order 4 × 4 system of differential equations and on the ability to restate the associated initial-value problem in the form of a system of coupled linear Volterra integral equations of the second kind. The integral equations so derived are discretized via a sophisticated variant of the Nyström method. The proposed method yields results accurate up to machine precision without relying on approximations. Numerical results and case studies ably demonstrate the efficiency and high accuracy of the algorithms.
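The Volterra second-kind discretization at the heart of such schemes can be sketched for the scalar case; a minimal trapezoidal Nyström marcher (the paper uses a more sophisticated variant on a 4 × 4 system).

import numpy as np

def solve_volterra(f, K, a, b, n):
    # Solve u(t) = f(t) + integral_a^t K(t, s) u(s) ds on [a, b]
    # by marching with trapezoidal (Nystrom) quadrature on n subintervals.
    t = np.linspace(a, b, n + 1)
    h = (b - a) / n
    u = np.empty(n + 1)
    u[0] = f(t[0])
    for i in range(1, n + 1):
        rhs = f(t[i]) + 0.5 * h * K(t[i], t[0]) * u[0]
        rhs += h * sum(K(t[i], t[j]) * u[j] for j in range(1, i))
        u[i] = rhs / (1.0 - 0.5 * h * K(t[i], t[i]))
    return t, u

# Test problem with known solution u(t) = exp(t): u(t) = 1 + integral_0^t u(s) ds.
t, u = solve_volterra(lambda t: 1.0, lambda t, s: 1.0, 0.0, 1.0, 200)
print(abs(u[-1] - np.e))  # small O(h^2) discretization error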
Thermal Transients Excite Neurons through Universal Intramembrane Mechanoelectrical Effects
NASA Astrophysics Data System (ADS)
Plaksin, Michael; Shapira, Einat; Kimmel, Eitan; Shoham, Shy
2018-01-01
Modern advances in neurotechnology rely on effectively harnessing physical tools and insights towards remote neural control, thereby creating major new scientific and therapeutic opportunities. Specifically, rapid temperature pulses were shown to increase membrane capacitance, causing capacitive currents that explain neural excitation, but the underlying biophysics is not well understood. Here, we show that an intramembrane thermal-mechanical effect, wherein the phospholipid bilayer undergoes axial narrowing and lateral expansion, accurately predicts a potentially universal thermal capacitance increase rate of ~0.3%/°C. This capacitance increase and concurrent changes in the surface-charge-related fields lead to predictable excitatory ionic displacement currents. The new MechanoElectrical Thermal Activation theory's predictions are in excellent agreement with multiple experimental results and indirect estimates of latent biophysical quantities. Our results further highlight the role of electro-mechanics in neural excitation; they may also help illuminate subthreshold and novel physical cellular effects, and could potentially lead to advanced new methods for neural control.
Oxide Heteroepitaxy for Flexible Optoelectronics.
Bitla, Yugandhar; Chen, Ching; Lee, Hsien-Chang; Do, Thi Hien; Ma, Chun-Hao; Qui, Le Van; Huang, Chun-Wei; Wu, Wen-Wei; Chang, Li; Chiu, Po-Wen; Chu, Ying-Hao
2016-11-30
The emerging technological demands for flexible and transparent electronic devices have compelled researchers to look beyond current silicon-based electronics. However, fabrication of devices with superior performance on conventional flexible substrates is constrained by the trade-off between processing temperature and device performance. Here, we propose an alternative strategy to circumvent this issue via the heteroepitaxial growth of transparent conducting oxides (TCO) on the flexible mica substrate, with performance comparable to that of their rigid counterparts. With the examples of ITO and AZO as a case study, a strong emphasis is laid upon the growth of flexible yet epitaxial TCO, relying on muscovite's superior properties compared to those of conventional flexible substrates and its compatibility with present fabrication methods. Besides excellent optoelectro-mechanical properties, these heterostructures provide an additional functionality of high-temperature stability, normally lacking in the current state-of-the-art transparent flexitronics. These epitaxial TCO electrodes, with good chemical and thermal stability as well as mechanical durability, can significantly contribute to the field of flexible, light-weight, and portable smart electronics.
Manufacture of tumor- and virus-specific T lymphocytes for adoptive cell therapies
Wang, X; Rivière, I
2015-01-01
Adoptive transfer of tumor-infiltrating lymphocytes (TILs) and genetically engineered T lymphocytes expressing chimeric antigen receptors (CARs) or conventional alpha/beta T-cell receptors (TCRs), collectively termed adoptive cell therapy (ACT), is an emerging novel strategy to treat cancer patients. Application of ACT has been constrained by the ability to isolate and expand functional tumor-reactive T cells. The transition of ACT from a promising experimental regimen to an established standard of care treatment relies largely on the establishment of safe, efficient, robust and cost-effective cell manufacturing protocols. The manufacture of cellular products under current good manufacturing practices (cGMPs) has a critical role in the process. Herein, we review current manufacturing methods for the large-scale production of clinical-grade TILs, virus-specific and genetically modified CAR or TCR transduced T cells in the context of phase I/II clinical trials as well as the regulatory pathway to get these complex personalized cellular products to the clinic. PMID:25721207
Bambus 2: scaffolding metagenomes.
Koren, Sergey; Treangen, Todd J; Pop, Mihai
2011-11-01
Sequencing projects increasingly target samples from non-clonal sources. In particular, metagenomics has enabled scientists to begin to characterize the structure of microbial communities. The software tools developed for assembling and analyzing sequencing data for clonal organisms are, however, unable to adequately process data derived from non-clonal sources. We present a new scaffolder, Bambus 2, to address some of the challenges encountered when analyzing metagenomes. Our approach relies on a combination of a novel method for detecting genomic repeats and algorithms that analyze assembly graphs to identify biologically meaningful genomic variants. We compare our software to current assemblers using simulated and real data. We demonstrate that the repeat detection algorithms have higher sensitivity than current approaches without sacrificing specificity. In metagenomic datasets, the scaffolder avoids false joins between distantly related organisms while obtaining long-range contiguity. Bambus 2 represents a first step toward automated metagenomic assembly. Bambus 2 is open source and available from http://amos.sf.net. mpop@umiacs.umd.edu. Supplementary data are available at Bioinformatics online.
Support for linguistic macrofamilies from weighted sequence alignment
Jäger, Gerhard
2015-01-01
Computational phylogenetics is in the process of revolutionizing historical linguistics. Recent applications have shed new light on controversial issues, such as the location and time depth of language families and the dynamics of their spread. So far, these approaches have been limited to single-language families because they rely on a large body of expert cognacy judgments or grammatical classifications, which is currently unavailable for most language families. The present study pursues a different approach. Starting from raw phonetic transcription of core vocabulary items from very diverse languages, it applies weighted string alignment to track both phonetic and lexical change. Applied to a collection of ∼1,000 Eurasian languages and dialects, this method, combined with phylogenetic inference, leads to a classification in excellent agreement with established findings of historical linguistics. Furthermore, it provides strong statistical support for several putative macrofamilies contested in current historical linguistics. In particular, there is a solid signal for the Nostratic/Eurasiatic macrofamily. PMID:26403857
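The weighted alignment step can be illustrated with a Needleman-Wunsch scorer whose substitution weights depend on sound classes; the weights below are toy values (the study learns its weights from data).

def weighted_align(a, b, sub, gap=-1.0):
    # Global alignment score with substitution function sub(x, y) and a
    # linear gap penalty, via the standard Needleman-Wunsch recursion.
    m, n = len(a), len(b)
    F = [[0.0] * (n + 1) for _ in range(m + 1)]
    for i in range(1, m + 1):
        F[i][0] = i * gap
    for j in range(1, n + 1):
        F[0][j] = j * gap
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            F[i][j] = max(F[i - 1][j - 1] + sub(a[i - 1], b[j - 1]),
                          F[i - 1][j] + gap,
                          F[i][j - 1] + gap)
    return F[m][n]

# Toy sound-class weights: identical phones 1.0, same class 0.5, else -1.0.
classes = {'p': 'stop', 'b': 'stop', 't': 'stop', 's': 'fric', 'z': 'fric'}
def sub(x, y):
    if x == y:
        return 1.0
    return 0.5 if classes.get(x) == classes.get(y) else -1.0

print(weighted_align("pzt", "bst", sub))  # 2.0: two same-class matches + one identity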
Deterministic Creation of Macroscopic Cat States
Lombardo, Daniel; Twamley, Jason
2015-01-01
Despite current technological advances, observing quantum mechanical effects outside of the nanoscopic realm is extremely challenging. For this reason, the observation of such effects on larger scale systems is currently one of the most attractive goals in quantum science. Many experimental protocols have been proposed for both the creation and observation of quantum states on macroscopic scales, in particular in the field of optomechanics. The majority of these proposals, however, rely on performing measurements, making them probabilistic. In this work we develop a completely deterministic method of macroscopic quantum state creation. We study the prototypical optomechanical Membrane In The Middle model and show that by controlling the membrane's opacity, and through careful choice of the optical cavity's initial state, we can deterministically create and grow the spatial extent of the membrane's position into a large cat state. It is found that by using a Bose-Einstein condensate as a membrane, high-fidelity cat states with spatial separations of up to ∼300 nm can be achieved. PMID:26345157
Evaluating scholarship productivity in COAMFTE-accredited PhD programs.
Jared DuPree, W; White, Mark B; Meredith, William H; Ruddick, Lindsay; Anderson, Michael P
2009-04-01
Due to an increasing trend among states to cut higher education funds, many universities are relying more on private donations and federal funding to keep programs afloat. Scholarship productivity in general has become an integral factor in terms of universities granting tenure to faculty, allocating resources, and supporting program goals due to the fact that more research in a particular area tends to increase the likelihood that one will obtain funding from federal, state, and private sources. In the past, ranking systems have also been used to evaluate programs. However, most ranking systems use methodologies that do not quantify research productivity or evaluate factors that match current university trends. The purpose of this article is to explore current scholarship productivity trends among COAMFTE-accredited doctoral programs through the use of several evaluation methods. Specifically, productivity was examined in regard to the following areas: (a) family therapy journal publications; (b) family science journal publications; (c) historic journal publication trends; and (d) recent journal publication trends.
Cluster detection methods applied to the Upper Cape Cod cancer data.
Ozonoff, Al; Webster, Thomas; Vieira, Veronica; Weinberg, Janice; Ozonoff, David; Aschengrau, Ann
2005-09-15
A variety of statistical methods have been suggested to assess the degree and/or the location of spatial clustering of disease cases. However, there is relatively little in the literature devoted to comparison and critique of different methods. Most of the available comparative studies rely on simulated data rather than real data sets. We have chosen three methods currently used for examining spatial disease patterns: the M-statistic of Bonetti and Pagano; the Generalized Additive Model (GAM) method as applied by Webster; and Kulldorff's spatial scan statistic. We apply these statistics to analyze breast cancer data from the Upper Cape Cancer Incidence Study using three different latency assumptions. The three different latency assumptions produced three different spatial patterns of cases and controls. For the 20-year latency assumption, all three methods generally concur. However, for the 15-year and no-latency assumptions, the methods produce different results when testing for global clustering. Comparative analyses of real data sets by different statistical methods provide insight into directions for further research. We suggest a research program designed around examining real data sets to guide focused investigation of relevant features using simulated data, for the purpose of understanding how to interpret statistical methods applied to epidemiological data with a spatial component.
Rueda, Sylvia; Fathima, Sana; Knight, Caroline L; Yaqub, Mohammad; Papageorghiou, Aris T; Rahmatullah, Bahbibi; Foi, Alessandro; Maggioni, Matteo; Pepe, Antonietta; Tohka, Jussi; Stebbing, Richard V; McManigle, John E; Ciurte, Anca; Bresson, Xavier; Cuadra, Meritxell Bach; Sun, Changming; Ponomarev, Gennady V; Gelfand, Mikhail S; Kazanov, Marat D; Wang, Ching-Wei; Chen, Hsiang-Chou; Peng, Chun-Wei; Hung, Chu-Mei; Noble, J Alison
2014-04-01
This paper presents the evaluation results of the methods submitted to Challenge US: Biometric Measurements from Fetal Ultrasound Images, a segmentation challenge held at the IEEE International Symposium on Biomedical Imaging 2012. The challenge was set to compare and evaluate current fetal ultrasound image segmentation methods. It consisted of automatically segmenting fetal anatomical structures to measure standard obstetric biometric parameters, from 2D fetal ultrasound images taken on fetuses at different gestational ages (21 weeks, 28 weeks, and 33 weeks) and with varying image quality to reflect data encountered in real clinical environments. Four independent sub-challenges were proposed, according to the objects of interest measured in clinical practice: abdomen, head, femur, and whole fetus. Five teams participated in the head sub-challenge and two teams in the femur sub-challenge, including one team who tackled both. Nobody attempted the abdomen and whole fetus sub-challenges. The challenge goals were two-fold and the participants were asked to submit the segmentation results as well as the measurements derived from the segmented objects. Extensive quantitative (region-based, distance-based, and Bland-Altman measurements) and qualitative evaluation was performed to compare the results from a representative selection of current methods submitted to the challenge. Several experts (three for the head sub-challenge and two for the femur sub-challenge), with different degrees of expertise, manually delineated the objects of interest to define the ground truth used within the evaluation framework. For the head sub-challenge, several groups produced results that could be potentially used in clinical settings, with comparable performance to manual delineations. The femur sub-challenge had inferior performance to the head sub-challenge due to the fact that it is a harder segmentation problem and that the techniques presented relied more on the femur's appearance.
Predicting Lameness in Sheep Activity Using Tri-Axial Acceleration Signals
Barwick, Jamie; Lamb, David; Dobos, Robin; Schneider, Derek; Welch, Mitchell; Trotter, Mark
2018-01-01
Simple Summary Monitoring livestock farmed under extensive conditions is challenging, particularly when observing animal behaviour at an individual level. Lameness is a disease symptom whose detection has traditionally relied on visual inspection to identify animals with an abnormal walking pattern. More recently, accelerometer sensors have been used in other livestock industries to detect lame animals. These devices are able to record changes in activity intensity, allowing us to differentiate between a grazing, walking, and resting animal. In this study, grazing, standing, walking, and lame walking were accurately detected from an ear-attached sensor. With further development, this classification algorithm could be linked with an automatic livestock monitoring system to provide real-time information on individual health status, something that is not practically possible under current extensive livestock production systems. Abstract Lameness is a clinical symptom associated with a number of sheep diseases around the world, having adverse effects on weight gain, fertility, and lamb birth weight, and increasing the risk of secondary diseases. Current methods to identify lame animals rely on labour-intensive visual inspection. The aim of the current study was to determine the ability of a collar-, leg-, and ear-attached tri-axial accelerometer to discriminate between sound and lame gait movement in sheep. Data were separated into 10 s mutually exclusive behaviour epochs and subjected to Quadratic Discriminant Analysis (QDA). Initial analysis showed high misclassification of lame grazing events as sound grazing and standing across all deployment modes. The final classification model, which included lame walking and all sound activity classes, yielded a prediction accuracy for lame locomotion of 82%, 35%, and 87% for the ear, collar, and leg deployments, respectively. Misclassification of sound walking as lame walking within the leg accelerometer dataset highlights the superiority of an ear mode of attachment for the classification of lame gait characteristics based on time-series accelerometer data. PMID:29324700
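A schematic Python version of the classification pipeline: each 10 s epoch of tri-axial acceleration is summarised by simple statistics and fed to QDA. The 12.5 Hz sampling rate, the feature set, and the random placeholder epochs and labels are assumptions for illustration, so the printed cross-validated accuracy is only chance level here.

import numpy as np
from sklearn.discriminant_analysis import QuadraticDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

def epoch_features(acc):
    # acc: (n_samples, 3) accelerometer epoch -> per-axis mean/std plus
    # magnitude statistics, a common movement-intensity summary.
    mag = np.linalg.norm(acc, axis=1)
    return np.concatenate([acc.mean(0), acc.std(0), [mag.mean(), mag.std()]])

rng = np.random.default_rng(0)
# 200 placeholder epochs of 10 s at an assumed 12.5 Hz (125 samples each);
# labels 0-3 stand for grazing, standing, walking, lame walking.
X = np.vstack([epoch_features(rng.normal(size=(125, 3))) for _ in range(200)])
y = rng.integers(0, 4, size=200)
qda = QuadraticDiscriminantAnalysis()
print(cross_val_score(qda, X, y, cv=5).mean())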
Benefits and Limitations of DNA Barcoding and Metabarcoding in Herbal Product Authentication
Raclariu, Ancuta Cristina; Heinrich, Michael; Ichim, Mihael Cristin
2017-01-01
Abstract Introduction Herbal medicines play an important role globally in the health care sector and in industrialised countries they are often considered as an alternative to mono-substance medicines. Current quality and authentication assessment methods rely mainly on morphology and analytical phytochemistry-based methods detailed in pharmacopoeias. Herbal products however are often highly processed with numerous ingredients, and even if these analytical methods are accurate for quality control of specific lead or marker compounds, they are of limited suitability for the authentication of biological ingredients. Objective To review the benefits and limitations of DNA barcoding and metabarcoding in complementing current herbal product authentication. Method Recent literature relating to DNA-based authentication of medicinal plants, herbal medicines and products is summarised to provide a basic understanding of how DNA barcoding and metabarcoding can be applied to this field. Results Different methods of quality control and authentication have varying resolution and usefulness along the value chain of these products. DNA barcoding can be used for authenticating products based on single herbal ingredients and DNA metabarcoding for assessment of species diversity in processed products, and both methods should be used in combination with appropriate hyphenated chemical methods for quality control. Conclusions DNA barcoding and metabarcoding have potential in the context of quality control of both well and poorly regulated supply systems. Standardisation of protocols for DNA barcoding and DNA sequence-based identification are necessary before DNA-based biological methods can be implemented as routine analytical approaches and approved by the competent authorities for use in regulated procedures. © 2017 The Authors. Phytochemical Analysis Published by John Wiley & Sons Ltd. PMID:28906059
Terrestrial laser scanning-based bridge structural condition assessment : InTrans project reports.
DOT National Transportation Integrated Search
2016-05-01
Objective, accurate, and fast assessment of a bridge's structural condition is critical to the timely assessment of safety risks. Current practices for bridge condition assessment rely on visual observations and manual interpretation of reports a...
Ethier, Jean-François; Dameron, Olivier; Curcin, Vasa; McGilchrist, Mark M; Verheij, Robert A; Arvanitis, Theodoros N; Taweel, Adel; Delaney, Brendan C; Burgun, Anita
2013-01-01
Biomedical research increasingly relies on the integration of information from multiple heterogeneous data sources. Despite the fact that structural and terminological aspects of interoperability are interdependent and rely on a common set of requirements, current efforts typically address them in isolation. We propose a unified ontology-based knowledge framework to facilitate interoperability between heterogeneous sources, and investigate whether using the LexEVS terminology server is a viable implementation method. We developed a framework based on an ontology, the general information model (GIM), to unify structural models and terminologies, together with relevant mapping sets. This allowed uniform access to these resources within LexEVS to facilitate interoperability by various components and data sources from implementing architectures. Our unified framework has been tested in the context of the EU Framework Program 7 TRANSFoRm project, where it was used to achieve data integration in a retrospective diabetes cohort study. The GIM was successfully instantiated in TRANSFoRm as the clinical data integration model, and the necessary mappings were created to support effective information retrieval for software tools in the project. We present a novel, unifying approach to address interoperability challenges in heterogeneous data sources, by representing structural and semantic models in one framework. Systems using this architecture can rely solely on the GIM, which abstracts over both structure and coding. Information models, terminologies and mappings are all stored in LexEVS and can be accessed in a uniform manner (implementing the HL7 CTS2 service functional model). The system is flexible and should reduce the effort needed from data source personnel for implementing and managing the integration.
Mozalevskis, Antons; Manzanares-Laya, Sandra; García de Olalla, Patricia; Moreno, Antonio; Jacques-Aviñó, Constanza; Caylà, Joan A
2015-11-01
The evidence that supports the preventive effect of combination antiretroviral treatment (cART) on HIV sexual transmission suggested the so-called 'treatment as prevention' (TAP) strategy as a promising tool for slowing down HIV transmission. As the messages and attitudes towards condom use in the context of TAP appear to be somewhat confusing, the aim here is to assess whether relying on cART alone to prevent HIV transmission can currently be recommended from the public health perspective. A review is made of the literature on the effects of the TAP strategy on HIV transmission and the epidemiology of other sexually transmitted infections (STIs) in the cART era, and of recommendations from public health institutions on TAP as of February 2014. The evolution of HIV and other STIs in Barcelona from 2007 to 2012 has also been analysed. Given that the widespread use of cART has coincided with an increasing incidence of HIV and other STIs, mainly amongst men who have sex with men, a combination of diversified prevention methods should always be considered and recommended in counselling. An informed decision on whether to stop using condoms should only be made by partners within stable couples, and after receiving all the up-to-date information regarding TAP. From the public health perspective, primary prevention should be a priority; therefore relying on cART alone is not a sufficient strategy to prevent new HIV infections and other STIs. Copyright © 2014 Elsevier España, S.L.U. y Sociedad Española de Enfermedades Infecciosas y Microbiología Clínica. All rights reserved.
Sola, J; Braun, F; Muntane, E; Verjus, C; Bertschi, M; Hugon, F; Manzano, S; Benissa, M; Gervaix, A
2016-08-01
Pneumonia remains the worldwide leading cause of mortality in children under the age of five, with 1.4 million deaths every year. Unfortunately, in low-resource settings, very limited diagnostic support aids are provided to point-of-care practitioners. The current UNICEF/WHO case management algorithm relies on the use of a chronometer to manually count breath rates in pediatric patients: there is thus a major need for more sophisticated tools to diagnose pneumonia that increase the sensitivity and specificity of breath-rate-based algorithms. These tools should be low cost and adapted to practitioners with limited training. In this work, a novel concept of an unsupervised tool for the diagnosis of childhood pneumonia is presented. The concept relies on the automated analysis of respiratory sounds as recorded by a point-of-care electronic stethoscope. By identifying the presence of auscultation sounds at different chest locations, this diagnostic tool is intended to estimate a pneumonia likelihood score. After presenting the overall architecture of an algorithm to estimate pneumonia scores, the importance of a robust unsupervised method to identify the inspiratory and expiratory phases of a respiratory cycle is highlighted. Based on data from an ongoing study involving pediatric pneumonia patients, a first algorithm to segment respiratory sounds is suggested. The unsupervised algorithm relies on a Mel-frequency filter bank, a two-step Gaussian Mixture Model (GMM) description of the data, and a final Hidden Markov Model (HMM) interpretation of inspiratory-expiratory sequences. Finally, illustrative results on the first recruited patients are provided. The presented algorithm opens the door to a new family of unsupervised respiratory sound analyzers that could improve future versions of case management algorithms for the diagnosis of pneumonia in low-resource settings.
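The front end of such a segmenter can be sketched compactly in Python: frame the stethoscope signal, take band energies through a crude Mel-spaced filter bank, and let a two-component Gaussian mixture propose frame labels as candidate inspiratory/expiratory phases. All parameter values and the random stand-in signal are illustrative, and the second GMM stage and the HMM interpretation of the published pipeline are omitted.

import numpy as np
from sklearn.mixture import GaussianMixture

def mel_band_energies(x, fs, n_fft=512, hop=256, n_bands=20):
    # Log energies in Mel-spaced bands (rectangular rather than the usual
    # triangular filters, to keep the sketch short).
    mel = lambda f: 2595.0 * np.log10(1.0 + f / 700.0)
    imel = lambda m: 700.0 * (10.0 ** (m / 2595.0) - 1.0)
    edges = imel(np.linspace(mel(50.0), mel(fs / 2.0), n_bands + 2))
    bins = np.floor(edges / fs * n_fft).astype(int)
    frames = np.lib.stride_tricks.sliding_window_view(x, n_fft)[::hop]
    spec = np.abs(np.fft.rfft(frames * np.hanning(n_fft), axis=1)) ** 2
    E = np.stack([spec[:, bins[i]:bins[i + 2] + 1].sum(axis=1)
                  for i in range(n_bands)], axis=1)
    return np.log(E + 1e-10)

fs = 8000
x = np.random.default_rng(1).normal(size=fs * 10)  # stand-in for a recording
feats = mel_band_energies(x, fs)
states = GaussianMixture(n_components=2, random_state=0).fit_predict(feats)
print(states[:20])  # candidate phase labels, one per 32 ms frame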
A Method for Improving Temporal and Spatial Resolution of Carbon Dioxide Emissions
NASA Astrophysics Data System (ADS)
Gregg, J. S.; Andres, R. J.
2003-12-01
Using United States data, a method is developed to estimate the monthly consumption of solid, liquid and gaseous fossil fuels for each state in the union. This technique employs monthly sales data to estimate the relative monthly proportions of the total annual national fossil fuel use. These proportions are then used to estimate the total monthly carbon dioxide emissions for each state. To assess the success of this technique, the results are compared with data obtained from other, independent methods. To evaluate the temporal component, the resulting national time series is compared to the model produced by the Carbon Dioxide Information Analysis Center (CDIAC) and the current model being developed by T. J. Blasing and C. Broniak at the Oak Ridge National Laboratory (ORNL). The University of North Dakota (UND) method fits well temporally with the results of the CDIAC and current ORNL research. To evaluate the spatial component, the individual state results are compared to the annual state totals calculated by ORNL. Using ordinary least squares regression, the annual state totals of this method are plotted against the ORNL data. This allows a direct comparison of estimates in the form of ordered pairs against an ideal one-to-one correspondence line, and allows for easy detection of outliers in the results obtained by this estimation method. Analyzing the residuals of the linear regression model for each type of fuel permits an improved understanding of the strengths and shortcomings of the spatial component of this estimation technique. Spatially, the model is successful when compared to the current ORNL research. The primary advantages of this method are its ease of implementation and universal applicability. In general, this technique compares favorably to more labor-intensive methods that rely on more detailed data, which are generally not available for most countries in the world. The methodology used here will be applied to other nations to better understand their sub-annual cycles and sub-national spatial distributions of carbon dioxide emissions from fossil fuel consumption. Better understanding of these cycles will lead to better models for predicting and responding to the global environmental changes currently observed and anticipated.
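The core downscaling step reduces to proportional allocation, as in the short Python sketch below; the sales figures and emissions total are made-up numbers, and the closing regression mirrors, on synthetic data, the ordinary-least-squares comparison against independent estimates described above.

import numpy as np

annual_total = 1500.0                        # annual CO2 total, arbitrary units
monthly_sales = np.array([120, 110, 105, 95, 90, 85,
                          88, 92, 96, 104, 112, 118], dtype=float)
shares = monthly_sales / monthly_sales.sum() # relative monthly proportions
monthly_emissions = annual_total * shares    # downscaled monthly estimates
assert np.isclose(monthly_emissions.sum(), annual_total)

# Comparison in the spirit of the abstract: regress these estimates against
# an independent (here synthetic) series and check slope/intercept against
# the one-to-one ideal.
independent = monthly_emissions * (1 + 0.05 * np.random.default_rng(4).normal(size=12))
slope, intercept = np.polyfit(independent, monthly_emissions, 1)
print(f"slope {slope:.2f}, intercept {intercept:.1f} (ideal: 1, 0)")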
Vineyard management in virtual reality: autonomous control of a transformable drone
NASA Astrophysics Data System (ADS)
Griffiths, H.; Shen, H.; Li, N.; Rojas, S.; Perkins, N.; Liu, M.
2017-05-01
Grape vines are susceptible to many diseases. Routine scouting is critically important to keep vineyards in healthy condition. Currently, scouting relies on experienced farm workers to inspect acres of land while arduously filling out reports to document crop health conditions. This process is both labor- and time-consuming. Using drones to assist farm workers in scouting has great potential to improve the efficiency of vineyard management. Due to the complexity of grape farm disease detection, drones are normally used to detect suspicious areas and help farm workers prioritize scouting activities. Operations still rely heavily on humans for further inspection to be certain about the health conditions of the vines. This paper introduces an autonomous transition flight control method for a transformable drone, which is suited to a future virtual presence of humans for further inspection of suspicious areas. The transformable drone adopts a tilt-rotor mechanism to automatically switch between hover and horizontal flight modes, following commands from virtual reality devices held in the ground control station. The conceptual design and transformation dynamics of the drone are first discussed, followed by a model predictive control system developed to automatically control the transition flight. A simulation is also provided to show the effectiveness of the proposed control system.
Xu, Hanfu; O'Brochta, David A.
2015-01-01
Genetic technologies based on transposon-mediated transgenesis, along with several recently developed genome-editing technologies, have become the methods of choice for genetically manipulating many organisms. The silkworm, Bombyx mori, is a Lepidopteran insect of great economic importance because of its use in silk production and because it is a valuable model insect that has greatly enhanced our understanding of the biology of insects, including many agricultural pests. In the past 10 years, great advances have been achieved in the development of genetic technologies in B. mori, including transposon-based technologies that rely on piggyBac-mediated transgenesis and genome-editing technologies that rely on protein- or RNA-guided modification of chromosomes. The successful development and application of these technologies has not only facilitated a better understanding of B. mori and its use as a silk production system, but also provided valuable experience that has contributed to the development of similar technologies in non-model insects. This review summarizes the technologies currently available for use in B. mori, their application to the study of gene function, and their use in genetically modifying B. mori for biotechnology applications. The challenges, solutions and future prospects associated with the development and application of genetic technologies in B. mori are also discussed. PMID:26108630
NASA Astrophysics Data System (ADS)
Mat Jafri, Mohd. Zubir; Abdulbaqi, Hayder Saad; Mutter, Kussay N.; Mustapha, Iskandar Shahrim; Omar, Ahmad Fairuz
2017-06-01
A brain tumour is an abnormal growth of tissue in the brain. Most tumour volume measurements are carried out manually by the radiographer and radiologist without the aid of automated software. This manual method is time-consuming and may give inaccurate results. Treatment, diagnosis, and the signs and symptoms of brain tumours depend mainly on the tumour volume and its location. In this paper, an approach is proposed to improve volume measurement of brain tumours, together with a new method to determine brain tumour location. The current study presents a hybrid method that combines two techniques. One is hidden Markov random field-expectation maximization (HMRF-EM), which produces an initial classification of the image. The other employs thresholding, which enables the final segmentation. In this method, the tumour volume is calculated using voxel dimension measurements. The brain tumour location was determined accurately in T2-weighted MRI images using a new algorithm. According to the results, this process proved more useful than the manual method. Thus, it provides the possibility of calculating the volume and determining the location of a brain tumour.
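The volume step the abstract mentions reduces to counting labelled voxels and multiplying by the voxel dimensions. A minimal Python sketch follows; the voxel spacing and the toy mask are assumed values standing in for the scanner geometry and the hybrid segmentation output.

import numpy as np

def tumour_volume_ml(mask, spacing_mm=(0.9, 0.9, 5.0)):
    # mask: boolean 3-D segmentation; spacing_mm: voxel size along each
    # axis in millimetres (assumed here, read from the DICOM in practice).
    voxel_mm3 = float(np.prod(spacing_mm))
    return mask.sum() * voxel_mm3 / 1000.0   # mm^3 -> millilitres

mask = np.zeros((256, 256, 24), dtype=bool)
mask[100:120, 110:140, 8:14] = True         # toy "tumour" region
print(f"{tumour_volume_ml(mask):.1f} ml")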
Methods of Reverberation Mapping. I. Time-lag Determination by Measures of Randomness
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chelouche, Doron; Pozo-Nuñez, Francisco; Zucker, Shay, E-mail: doron@sci.haifa.ac.il, E-mail: francisco.pozon@gmail.com, E-mail: shayz@post.tau.ac.il
A class of methods for measuring time delays between astronomical time series, based on measures of randomness or complexity of the data, is introduced in the context of quasar reverberation mapping. Several distinct statistical estimators are considered that rely neither on polynomial interpolations of the light curves nor on their stochastic modeling, and do not require binning in correlation space. Methods based on von Neumann's mean-square successive-difference estimator are found to be superior to those using other estimators. An optimized von Neumann scheme is formulated, which better handles sparsely sampled data and outperforms current implementations of discrete correlation function methods. This scheme is applied to existing reverberation data of varying quality, and consistency with previously reported time delays is found. In particular, the size-luminosity relation of the broad-line region in quasars is recovered with a scatter comparable to that obtained by other works, yet with fewer assumptions made concerning the process underlying the variability. The proposed method for time-lag determination is particularly relevant for irregularly sampled time series, and in cases where the process underlying the variability cannot be adequately modeled.
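The core of the von Neumann scheme fits in a few lines of Python: for each trial delay, shift the second light curve, merge the two normalised series in time order, and score the merged curve by its mean-square successive difference; the delay minimising this randomness measure is the lag estimate. This is a bare-bones sketch on synthetic curves, without the optimizations of the paper, and the flux normalisation shown is one simple choice.

import numpy as np

def von_neumann(t, f):
    # Mean-square successive difference of the series ordered in time.
    order = np.argsort(t)
    return np.mean(np.diff(f[order]) ** 2)

def estimate_lag(t1, f1, t2, f2, trial_lags):
    f1n = (f1 - f1.mean()) / f1.std()   # crude flux normalisation
    f2n = (f2 - f2.mean()) / f2.std()
    scores = [von_neumann(np.concatenate([t1, t2 - lag]),
                          np.concatenate([f1n, f2n]))
              for lag in trial_lags]
    return trial_lags[int(np.argmin(scores))]

rng = np.random.default_rng(0)
signal = lambda t: np.sin(t / 15.0) + 0.5 * np.sin(t / 41.0)
t1, t2 = np.sort(rng.uniform(0, 200, 80)), np.sort(rng.uniform(0, 200, 80))
f1 = signal(t1) + rng.normal(scale=0.05, size=80)
f2 = signal(t2 - 12.0) + rng.normal(scale=0.05, size=80)  # true lag: 12
print(estimate_lag(t1, f1, t2, f2, np.arange(0.0, 30.0, 0.5)))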
Emerging technologies for pediatric and adult trauma care.
Moulton, Steven L; Haley-Andrews, Stephanie; Mulligan, Jane
2010-06-01
Current Emergency Medical Service protocols rely on provider-directed care for evaluation, management and triage of injured patients from the field to a trauma center. New methods to quickly diagnose, support and coordinate the movement of trauma patients from the field to the most appropriate trauma center are in development. These methods will enhance trauma care and promote trauma system development. Recent advances in machine learning, statistical methods, device integration and wireless communication are giving rise to new methods for vital sign data analysis and a new generation of transport monitors. These monitors will collect and synchronize exponentially growing amounts of vital sign data with electronic patient care information. The application of advanced statistical methods to these complex clinical data sets has the potential to reveal many important physiological relationships and treatment effects. Several emerging technologies are converging to yield a new generation of smart sensors and tightly integrated transport monitors. These technologies will assist prehospital providers in quickly identifying and triaging the most severely injured children and adults to the most appropriate trauma centers. They will enable the development of real-time clinical support systems of increasing complexity, able to provide timelier, more cost-effective, autonomous care.
An Efficient Statistical Method to Compute Molecular Collisional Rate Coefficients
NASA Astrophysics Data System (ADS)
Loreau, Jérôme; Lique, François; Faure, Alexandre
2018-01-01
Our knowledge about the “cold” universe often relies on molecular spectra. A general property of such spectra is that the energy level populations are rarely at local thermodynamic equilibrium. Solving the radiative transfer thus requires the availability of collisional rate coefficients with the main colliding partners over the temperature range ∼10–1000 K. These rate coefficients are notoriously difficult to measure and expensive to compute. In particular, very few reliable collisional data exist for inelastic collisions involving reactive radicals or ions. In this Letter, we explore the use of a fast quantum statistical method to determine molecular collisional excitation rate coefficients. The method is benchmarked against accurate (but costly) rigid-rotor close-coupling calculations. For collisions proceeding through the formation of a strongly bound complex, the method is found to be highly satisfactory up to room temperature. Its accuracy decreases with decreasing potential well depth and with increasing temperature, as expected. This new method opens the way to the determination of accurate inelastic collisional data involving key reactive species such as H3+, H2O+, and H3O+, for which exact quantum calculations are currently not feasible.
Peptide Identification by Database Search of Mixture Tandem Mass Spectra*
Wang, Jian; Bourne, Philip E.; Bandeira, Nuno
2011-01-01
In high-throughput proteomics, the development of computational methods and novel experimental strategies often rely on each other. In certain areas, mass spectrometry methods for data acquisition are ahead of computational methods to interpret the resulting tandem mass spectra. In particular, although there are numerous situations in which a mixture tandem mass spectrum can contain fragment ions from two or more peptides, nearly all database search tools still make the assumption that each tandem mass spectrum comes from one peptide. Common examples include mixture spectra from co-eluting peptides in complex samples, spectra generated from data-independent acquisition methods, and spectra from peptides with complex post-translational modifications. We propose a new database search tool (MixDB) that is able to identify mixture tandem mass spectra from more than one peptide. We show that peptides can be reliably identified with up to 95% accuracy from mixture spectra while considering only 0.01% of all possible peptide pairs (a four-order-of-magnitude speedup). Comparison with current database search methods indicates that our approach has better or comparable sensitivity and precision at identifying single-peptide spectra while simultaneously being able to identify 38% more peptides from mixture spectra at significantly higher precision. PMID:21862760
iTemplate: A template-based eye movement data analysis approach.
Xiao, Naiqi G; Lee, Kang
2018-02-08
Current eye movement data analysis methods rely on defining areas of interest (AOIs). Because AOIs are created and modified manually, variances in their size, shape, and location are unavoidable. These variances affect not only the consistency of the AOI definitions, but also the validity of the eye movement analyses based on the AOIs. To reduce the variance introduced by AOI creation and modification and to achieve a procedure that processes eye movement data with high precision and efficiency, we propose a template-based eye movement data analysis method. Using a linear transformation algorithm, this method registers the eye movement data from each individual stimulus to a template. Thus, users only need to create one set of AOIs for the template in order to analyze eye movement data, rather than creating a unique set of AOIs for each individual stimulus. This change greatly reduces the error caused by the variance of manually created AOIs and boosts the efficiency of the data analysis. Furthermore, this method can help researchers prepare eye movement data for advanced analysis approaches, such as iMap. We have developed software (iTemplate) with a graphic user interface to make this analysis method available to researchers.
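One plausible reading of the linear registration step is a least-squares affine fit, sketched below in Python: a few landmark correspondences between a stimulus and the template determine the transform, and every fixation is then mapped through it. The landmark and fixation coordinates are invented for illustration and are not part of the published tool.

import numpy as np

def fit_affine(src, dst):
    # Least-squares 2-D affine transform from (N, 2) matching landmarks.
    A = np.hstack([src, np.ones((len(src), 1))])  # homogeneous coordinates
    M, *_ = np.linalg.lstsq(A, dst, rcond=None)
    return M.T                                    # 2 x 3 matrix

def apply_affine(M, pts):
    # Map (N, 2) points through the fitted transform.
    return np.hstack([pts, np.ones((len(pts), 1))]) @ M.T

stim_landmarks = np.array([[10.0, 20.0], [200.0, 30.0], [100.0, 180.0]])
tmpl_landmarks = np.array([[12.0, 25.0], [205.0, 28.0], [98.0, 185.0]])
M = fit_affine(stim_landmarks, tmpl_landmarks)
fixations = np.array([[50.0, 60.0], [120.0, 90.0]])
print(apply_affine(M, fixations))  # fixation coordinates in template space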
NASA Astrophysics Data System (ADS)
Shimada, M.; Shimada, J.; Tsunashima, K.; Aoyama, C.
2017-12-01
Methane hydrate is anticipated to be an unconventional natural gas energy resource. Two types of methane hydrate are known to exist, based on their settings: the "shallow" type and the "sand layer" type. The shallow type is considered advantageous due to its high purity and simpler exploration. However, few methods have been developed for its extraction. Currently, heating and depressurization are used to collect sand-layer methane hydrate, but these methods are still under examination and not yet implemented. This is probably because fossil fuel is used for the extraction process instead of natural energy. It is necessary to utilize natural energy instead of relying on fossil fuel, and sunlight is believed to be the most significant alternative. Solar power generation is commonly used to harness sunlight, but this process incurs substantial energy loss, since the solar energy converted to electricity must then be converted back to heat energy. A new method is devised to accelerate the decomposition of methane hydrate with direct sunlight delivered through optical fibers. The authors will present details of this new method to collect methane hydrate with direct sunlight exposure.
Deductive Derivation and Turing-Computerization of Semiparametric Efficient Estimation
Frangakis, Constantine E.; Qian, Tianchen; Wu, Zhenke; Diaz, Ivan
2015-01-01
Summary Researchers often seek robust inference for a parameter through semiparametric estimation. Efficient semiparametric estimation currently requires theoretical derivation of the efficient influence function (EIF), which can be a challenging and time-consuming task. If this task can be computerized, it can save considerable human effort, which can be transferred, for example, to the design of new studies. Although the EIF is, in principle, a derivative, simple numerical differentiation to calculate the EIF by a computer masks the EIF’s functional dependence on the parameter of interest. For this reason, the standard approach to obtaining the EIF relies on the theoretical construction of the space of scores under all possible parametric submodels. This process currently depends on the correctness of conjectures about these spaces, and the correct verification of such conjectures. The correct guessing of such conjectures, though successful in some problems, is a nondeductive process, i.e., it is not guaranteed to succeed (e.g., it is not computerizable), and the verification of conjectures is generally susceptible to mistakes. We propose a method that can deductively produce semiparametric locally efficient estimators. The proposed method is computerizable, meaning that it needs neither conjecturing nor otherwise theoretically deriving the functional form of the EIF, and is guaranteed to produce the desired estimates even for complex parameters. The method is demonstrated through an example. PMID:26237182
Deductive derivation and turing-computerization of semiparametric efficient estimation.
Frangakis, Constantine E; Qian, Tianchen; Wu, Zhenke; Diaz, Ivan
2015-12-01
Researchers often seek robust inference for a parameter through semiparametric estimation. Efficient semiparametric estimation currently requires theoretical derivation of the efficient influence function (EIF), which can be a challenging and time-consuming task. If this task can be computerized, it can save considerable human effort, which can be transferred, for example, to the design of new studies. Although the EIF is, in principle, a derivative, simple numerical differentiation to calculate the EIF by a computer masks the EIF's functional dependence on the parameter of interest. For this reason, the standard approach to obtaining the EIF relies on the theoretical construction of the space of scores under all possible parametric submodels. This process currently depends on the correctness of conjectures about these spaces, and the correct verification of such conjectures. The correct guessing of such conjectures, though successful in some problems, is a nondeductive process, i.e., it is not guaranteed to succeed (e.g., it is not computerizable), and the verification of conjectures is generally susceptible to mistakes. We propose a method that can deductively produce semiparametric locally efficient estimators. The proposed method is computerizable, meaning that it needs neither conjecturing nor otherwise theoretically deriving the functional form of the EIF, and is guaranteed to produce the desired estimates even for complex parameters. The method is demonstrated through an example. © 2015, The International Biometric Society.
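The remark that the EIF is, in principle, a derivative can be made concrete for simple functionals: perturb a discrete distribution toward a point mass at each observation and differentiate numerically. The Python sketch below does this for the mean, whose known EIF is x minus its expectation; it illustrates the pointwise numerical differentiation discussed above (which works for evaluation even though it masks the functional dependence) and is not the authors' deductive algorithm.

import numpy as np

def numerical_eif(psi, x, eps=1e-5):
    # Gateaux-derivative approximation: d/d(eps) psi((1-eps) P + eps d_i)
    # at eps = 0, for the empirical distribution P on the points x.
    n = len(x)
    p = np.full(n, 1.0 / n)
    out = np.empty(n)
    for i in range(n):
        delta = np.zeros(n)
        delta[i] = 1.0
        out[i] = (psi((1 - eps) * p + eps * delta, x) - psi(p, x)) / eps
    return out

mean_fn = lambda p, x: np.sum(p * x)   # the mean as a functional of (p, x)
x = np.array([1.0, 2.0, 4.0, 9.0])
print(numerical_eif(mean_fn, x))       # numerically: x - x.mean()
print(x - x.mean())                    # known efficient influence function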
NASA Astrophysics Data System (ADS)
Casalegno, Mosè; Bernardi, Andrea; Raos, Guido
2013-07-01
Numerical approaches can provide useful information about the microscopic processes underlying photocurrent generation in organic solar cells (OSCs). Among them, the Kinetic Monte Carlo (KMC) method is conceptually the simplest, but computationally the most intensive. A less demanding alternative is potentially represented by so-called Master Equation (ME) approaches, where the equations describing particle dynamics rely on the mean-field approximation and their solution is attained numerically, rather than stochastically. The description of charge separation dynamics, the treatment of electrostatic interactions and numerical stability are some of the key issues which have prevented the application of these methods to OSC modelling, despite their successes in the study of charge transport in disordered systems. Here we describe a three-dimensional ME approach to photocurrent generation in OSCs which attempts to deal with these issues. The reliability of the proposed method is tested against reference KMC simulations of bilayer heterojunction solar cells. Comparison of the current-voltage curves shows that the model approximates the exact result well for most devices. The largest deviations in current densities are mainly due to the adoption of the mean-field approximation for electrostatic interactions. The presence of deep traps, in devices characterized by strong energy disorder, may also affect result quality. Comparison of the simulation times reveals that the ME algorithm runs, on average, one order of magnitude faster than KMC.
NASA Technical Reports Server (NTRS)
Pliutau, Denis; Prasad, Narashimha S.
2013-01-01
Current approaches to satellite observation data storage and distribution implement separate visualization and data access methodologies, which often necessitates time-consuming data ordering and coding for applications requiring both visual representation and data handling and modeling capabilities. We describe an approach we implemented for a data-encoded web map service based on storing numerical data within server map tiles, with subsequent client-side data manipulation and map color rendering. The approach relies on storing data using the lossless Portable Network Graphics (PNG) image format, which is natively supported by web browsers, allowing on-the-fly browser rendering and modification of the map tiles. The method is easy to implement using existing software libraries and has the advantage of easy client-side map color modification, as well as spatial subsetting with physical parameter range filtering. The method is demonstrated for the ASTER-GDEM elevation model and selected MODIS data products, and represents an alternative to the currently used storage and data access methods. One additional benefit is the provision of multiple levels of averaging, owing to the need to generate map tiles at varying resolutions for various map magnification levels. We suggest that such a merged data and mapping approach may be a viable alternative to existing static storage and data access methods for a wide array of combined simulation, data access and visualization purposes.
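The storage idea is easy to demonstrate in Python with Pillow: pack each numerical value into the 8-bit channels of a PNG so any PNG reader, including a browser, recovers it losslessly. The two-channel packing, value range, and file name below are illustrative choices, not the service's actual tile format.

import numpy as np
from PIL import Image

rng = np.random.default_rng(2)
elev = rng.integers(0, 9000, size=(256, 256))        # elevations in metres
hi, lo = np.divmod(elev.astype(np.uint16), 256)      # split into two bytes
rgb = np.dstack([hi, lo, np.zeros_like(hi)]).astype(np.uint8)
Image.fromarray(rgb, "RGB").save("tile.png")         # lossless PNG tile

decoded = np.asarray(Image.open("tile.png")).astype(np.uint16)
restored = decoded[..., 0] * 256 + decoded[..., 1]
assert (restored == elev).all()                      # exact round trip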
Photojunction field-effect transistor based on a colloidal quantum dot absorber channel layer.
Adinolfi, Valerio; Kramer, Illan J; Labelle, André J; Sutherland, Brandon R; Hoogland, S; Sargent, Edward H
2015-01-27
The performance of photodetectors is judged via high responsivity, fast speed of response, and low background current. Many previously reported photodetectors based on size-tuned colloidal quantum dots (CQDs) have relied either on photodiodes, which, since they are primary photocarrier devices, lack gain; or photoconductors, which provide gain but at the expense of slow response (due to delayed charge carrier escape from sensitizing centers) and an inherent dark current vs responsivity trade-off. Here we report a photojunction field-effect transistor (photoJFET), which provides gain while breaking prior photoconductors' response/speed/dark current trade-off. This is achieved by ensuring that, in the dark, the channel is fully depleted due to a rectifying junction between a deep-work-function transparent conductive top contact (MoO3) and a moderately n-type CQD film (iodine treated PbS CQDs). We characterize the rectifying behavior of the junction and the linearity of the channel characteristics under illumination, and we observe a 10 μs rise time, a record for a gain-providing, low-dark-current CQD photodetector. We prove, using an analytical model validated using experimental measurements, that for a given response time the device provides a two-orders-of-magnitude improvement in photocurrent-to-dark-current ratio compared to photoconductors. The photoJFET, which relies on a junction gate-effect, enriches the growing family of CQD photosensitive transistors.
Prevalidation of an Acute Inhalation Toxicity Test Using the EpiAirway In Vitro Human Airway Model
Jackson, George R.; Maione, Anna G.; Klausner, Mitchell
2018-01-01
Abstract Introduction: Knowledge of acute inhalation toxicity potential is important for establishing safe use of chemicals and consumer products. Inhalation toxicity testing and classification procedures currently accepted within worldwide government regulatory systems rely primarily on tests conducted in animals. The goal of the current work was to develop and prevalidate a nonanimal (in vitro) test for determining acute inhalation toxicity using the EpiAirway™ in vitro human airway model as a potential alternative for currently accepted animal tests. Materials and Methods: The in vitro test method exposes EpiAirway tissues to test chemicals for 3 hours, followed by measurement of tissue viability as the test endpoint. Fifty-nine chemicals covering a broad range of toxicity classes, chemical structures, and physical properties were evaluated. The in vitro toxicity data were utilized to establish a prediction model to classify the chemicals into categories corresponding to the currently accepted Globally Harmonized System (GHS) and the Environmental Protection Agency (EPA) system. Results: The EpiAirway prediction model identified in vivo rat-based GHS Acute Inhalation Toxicity Category 1–2 and EPA Acute Inhalation Toxicity Category I–II chemicals with 100% sensitivity and specificity of 43.1% and 50.0%, for GHS and EPA acute inhalation toxicity systems, respectively. The sensitivity and specificity of the EpiAirway prediction model for identifying GHS specific target organ toxicity-single exposure (STOT-SE) Category 1 human toxicants were 75.0% and 56.5%, respectively. Corrosivity and electrophilic and oxidative reactivity appear to be the predominant mechanisms of toxicity for the most highly toxic chemicals. Conclusions: These results indicate that the EpiAirway test is a promising alternative to the currently accepted animal tests for acute inhalation toxicity. PMID:29904643
Prevalidation of an Acute Inhalation Toxicity Test Using the EpiAirway In Vitro Human Airway Model.
Jackson, George R; Maione, Anna G; Klausner, Mitchell; Hayden, Patrick J
2018-06-01
Introduction: Knowledge of acute inhalation toxicity potential is important for establishing safe use of chemicals and consumer products. Inhalation toxicity testing and classification procedures currently accepted within worldwide government regulatory systems rely primarily on tests conducted in animals. The goal of the current work was to develop and prevalidate a nonanimal ( in vitro ) test for determining acute inhalation toxicity using the EpiAirway™ in vitro human airway model as a potential alternative for currently accepted animal tests. Materials and Methods: The in vitro test method exposes EpiAirway tissues to test chemicals for 3 hours, followed by measurement of tissue viability as the test endpoint. Fifty-nine chemicals covering a broad range of toxicity classes, chemical structures, and physical properties were evaluated. The in vitro toxicity data were utilized to establish a prediction model to classify the chemicals into categories corresponding to the currently accepted Globally Harmonized System (GHS) and the Environmental Protection Agency (EPA) system. Results: The EpiAirway prediction model identified in vivo rat-based GHS Acute Inhalation Toxicity Category 1-2 and EPA Acute Inhalation Toxicity Category I-II chemicals with 100% sensitivity and specificity of 43.1% and 50.0%, for GHS and EPA acute inhalation toxicity systems, respectively. The sensitivity and specificity of the EpiAirway prediction model for identifying GHS specific target organ toxicity-single exposure (STOT-SE) Category 1 human toxicants were 75.0% and 56.5%, respectively. Corrosivity and electrophilic and oxidative reactivity appear to be the predominant mechanisms of toxicity for the most highly toxic chemicals. Conclusions: These results indicate that the EpiAirway test is a promising alternative to the currently accepted animal tests for acute inhalation toxicity.
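The reported sensitivity and specificity figures are plain confusion-matrix arithmetic over a binary call; the Python sketch below shows the computation with invented viability values and an assumed 50%-of-control viability cut-off, purely for illustration.

import numpy as np

viability = np.array([5, 12, 35, 60, 80, 90, 18, 70])         # % of control
in_vivo_pos = np.array([1, 1, 0, 0, 0, 0, 1, 0], dtype=bool)  # e.g. GHS Cat 1-2
pred_pos = viability < 50            # call "toxic" when viability is low

tp = np.sum(pred_pos & in_vivo_pos)
tn = np.sum(~pred_pos & ~in_vivo_pos)
sensitivity = tp / in_vivo_pos.sum()
specificity = tn / (~in_vivo_pos).sum()
print(f"sensitivity {sensitivity:.2f}, specificity {specificity:.2f}")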
Schwartz, Andrew J.; Walton, Courtney L.; Williams, Kelsey L.; Hieftje, Gary M.
2016-01-01
Modern “-omics” (e.g., proteomics, glycomics, metabolomics, etc.) analyses rely heavily on electrospray ionization and tandem mass spectrometry to determine the structural identity of target species. Unfortunately, these methods are limited to specialized mass spectrometry instrumentation. Here, a novel approach is described that enables ionization and controlled, tunable fragmentation of peptides at atmospheric pressure. In the new source, a direct-current plasma is sustained between a tapered metal rod and a flowing sample-containing solution. As the liquid stream contacts the electrical discharge, peptides from the solution are volatilized, ionized, and fragmented. At high discharge currents (e.g., 70 mA), electrospray-like spectra are observed, dominated by singly and doubly protonated molecular ions. At lower currents (35 mA), many peptides exhibit extensive fragmentation, with a-, b-, c-, x-, and y-type ion series present as well as complex fragments, such as d-type ions, not previously observed with atmospheric-pressure dissociation. Though the mechanism of fragmentation is currently unclear, observations indicate it could result from the interaction of peptides with gas-phase radicals or ultraviolet radiation generated within the plasma. PMID:28451101
Retinal Imaging Techniques for Diabetic Retinopathy Screening
Goh, James Kang Hao; Cheung, Carol Y.; Sim, Shaun Sebastian; Tan, Pok Chien; Tan, Gavin Siew Wei; Wong, Tien Yin
2016-01-01
Due to the increasing prevalence of diabetes mellitus, demand for diabetic retinopathy (DR) screening platforms is steeply increasing. Early detection and treatment of DR are key public health interventions that can greatly reduce the likelihood of vision loss. Current DR screening programs typically employ retinal fundus photography, which relies on skilled readers for manual DR assessment. However, this is labor-intensive and suffers from inconsistency across sites. Hence, there has been a recent proliferation of automated retinal image analysis software that may potentially alleviate this burden cost-effectively. Furthermore, current screening programs based on 2-dimensional fundus photography do not effectively screen for diabetic macular edema (DME). Optical coherence tomography is becoming increasingly recognized as the reference standard for DME assessment and can potentially provide a cost-effective solution for improving DME detection in large-scale DR screening programs. Current screening techniques are also unable to image the peripheral retina and require pharmacological pupil dilation; ultra-widefield imaging and confocal scanning laser ophthalmoscopy, which address these drawbacks, possess great potential. In this review, we summarize the current DR screening methods using various retinal imaging techniques, and also outline future possibilities. Advances in retinal imaging techniques can potentially transform the management of patients with diabetes, providing savings in health care costs and resources. PMID:26830491
Retinal Imaging Techniques for Diabetic Retinopathy Screening.
Goh, James Kang Hao; Cheung, Carol Y; Sim, Shaun Sebastian; Tan, Pok Chien; Tan, Gavin Siew Wei; Wong, Tien Yin
2016-02-01
Due to the increasing prevalence of diabetes mellitus, demand for diabetic retinopathy (DR) screening platforms is steeply increasing. Early detection and treatment of DR are key public health interventions that can greatly reduce the likelihood of vision loss. Current DR screening programs typically employ retinal fundus photography, which relies on skilled readers for manual DR assessment. However, this is labor-intensive and suffers from inconsistency across sites. Hence, there has been a recent proliferation of automated retinal image analysis software that may potentially alleviate this burden cost-effectively. Furthermore, current screening programs based on 2-dimensional fundus photography do not effectively screen for diabetic macular edema (DME). Optical coherence tomography is becoming increasingly recognized as the reference standard for DME assessment and can potentially provide a cost-effective solution for improving DME detection in large-scale DR screening programs. Current screening techniques are also unable to image the peripheral retina and require pharmacological pupil dilation; ultra-widefield imaging and confocal scanning laser ophthalmoscopy, which address these drawbacks, possess great potential. In this review, we summarize the current DR screening methods using various retinal imaging techniques, and also outline future possibilities. Advances in retinal imaging techniques can potentially transform the management of patients with diabetes, providing savings in health care costs and resources. © 2016 Diabetes Technology Society.
2010-03-01
The United States Air Force relies heavily on computer networks to transmit vast amounts of information throughout its organizations and with agencies... Chapter II provides background information on the current technologies that
Collection of empirical data for assessing 800MHz coverage models
DOT National Transportation Integrated Search
2004-12-01
Wireless communications plays an important role in KDOT operations. Currently, decisions pertaining to KDOT's 800MHz radio system are made on the basis of coverage models that rely on antenna and terrain characteristics to model the coverage. W...
DOT National Transportation Integrated Search
2004-10-01
Communications in current railroad operations rely heavily on voice communications. Radio congestion impairs roadway workers' ability to communicate effectively with dispatchers at the Central Traffic Control Center and has adverse consequences for...
Carroll, Dustin; Howard, Diana; Zhu, Haining; Paumi, Christian M; Vore, Mary; Bondada, Subbarao; Liang, Ying; Wang, Chi; St Clair, Daret K
2016-08-01
Cellular redox balance plays a significant role in the regulation of hematopoietic stem-progenitor cell (HSC/MPP) self-renewal and differentiation. Unregulated changes in cellular redox homeostasis are associated with the onset of most hematological disorders. However, accurate measurement of the redox state in stem cells is difficult because of the scarcity of HSC/MPPs. Glutathione (GSH) constitutes the most abundant pool of cellular antioxidants. Thus, GSH metabolism may play a critical role in hematological disease onset and progression. A major limitation to studying GSH metabolism in HSC/MPPs has been the inability to quantitatively measure GSH concentrations in small numbers of HSC/MPPs. Current methods used to measure GSH levels not only require large numbers of cells, but also rely on the chemical/structural modification or enzymatic recycling of GSH, and are therefore likely to accurately measure only total glutathione content. Here, we describe the validation of a sensitive method for the direct and simultaneous quantitation of both oxidized and reduced GSH via liquid chromatography followed by tandem mass spectrometry (LC-MS/MS) in HSC/MPPs isolated from bone marrow. The lower limit of quantitation (LLOQ) was determined to be 5.0 ng/mL for GSH and 1.0 ng/mL for GSSG, with lower limits of detection at 0.5 ng/mL for both glutathione species. Standard addition analysis utilizing mouse bone marrow shows that this method is both sensitive and accurate, with reproducible analyte recovery. The method combines a simple extraction with a platform for high-throughput analysis, allowing efficient determination of GSH/GSSG concentrations in mouse HSC/MPP populations, in chemotherapeutic treatment conditions in cell culture, and in normal/leukemia patient samples. The data implicate the importance of modulation of the GSH/GSSG redox couple in stem cell-related diseases. Copyright © 2016 Elsevier Inc. All rights reserved.
Generic dynamical phase transition in one-dimensional bulk-driven lattice gases with exclusion
NASA Astrophysics Data System (ADS)
Lazarescu, Alexandre
2017-06-01
Dynamical phase transitions are crucial features of the fluctuations of statistical systems, corresponding to boundaries between qualitatively different mechanisms of maintaining unlikely values of dynamical observables over long periods of time. They manifest themselves in the form of non-analyticities in the large deviation function of those observables. In this paper, we look at bulk-driven exclusion processes with open boundaries. It is known that the standard asymmetric simple exclusion process exhibits a dynamical phase transition in the large deviations of the current of particles flowing through it. That phase transition has been described thanks to specific calculation methods relying on the model being exactly solvable, but more general methods have also been used to describe the extreme large deviations of that current, far from the phase transition. We extend those methods to a large class of models based on the ASEP, where we add arbitrary spatial inhomogeneities in the rates and short-range potentials between the particles. We show that, as for the regular ASEP, the large deviation function of the current scales differently with the size of the system if one considers very high or very low currents, pointing to the existence of a dynamical phase transition between those two regimes: high-current large deviations are extensive in the system size, and the typical states associated with them are Coulomb gases, which are highly correlated; low-current large deviations do not depend on the system size, and the typical states associated with them are anti-shocks, consistent with hydrodynamic behaviour. Finally, we illustrate our results numerically on a simple example, and we interpret the transition in terms of the current pushing beyond its maximal hydrodynamic value, as well as relate it to the appearance of Tracy-Widom distributions in the relaxation statistics of such models.
Dictionary-driven prokaryotic gene finding.
Shibuya, Tetsuo; Rigoutsos, Isidore
2002-06-15
Gene identification, also known as gene finding or gene recognition, is among the important problems of molecular biology that have been receiving increasing attention with the advent of large scale sequencing projects. Previous strategies for solving this problem can be categorized into essentially two schools of thought: one school employs sequence composition statistics, whereas the other relies on database similarity searches. In this paper, we propose a new gene identification scheme that combines the best characteristics from each of these two schools. In particular, our method determines gene candidates among the ORFs that can be identified in a given DNA strand through the use of the Bio-Dictionary, a database of patterns that covers essentially all of the currently available sample of the natural protein sequence space. Our approach relies entirely on the use of redundant patterns as the agents on which the presence or absence of genes is predicated and does not employ any additional evidence, e.g. ribosome-binding site signals. The Bio-Dictionary Gene Finder (BDGF), the algorithm's implementation, is a single computational engine able to handle the gene identification task across distinct archaeal and bacterial genomes. The engine exhibits performance that is characterized by simultaneous very high values of sensitivity and specificity, and a high percentage of correctly predicted start sites. Using a collection of patterns derived from an old (June 2000) release of the Swiss-Prot/TrEMBL database that contained 451 602 proteins and fragments, we demonstrate our method's generality and capabilities through an extensive analysis of 17 complete archaeal and bacterial genomes. Examples of previously unreported genes are also shown and discussed in detail.
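The ORF-enumeration step that precedes pattern scoring is simple to sketch in Python; the fragment below scans the three forward reading frames for start-to-stop stretches. The minimum length is an assumed parameter, the reverse strand is ignored for brevity, and the Bio-Dictionary pattern matching itself is not reproduced.

STOPS = {"TAA", "TAG", "TGA"}

def orfs(seq, min_codons=30):
    # Return (start, end) spans of ORFs on the forward strand: first ATG
    # to the next in-frame stop, in each of the three reading frames.
    seq = seq.upper()
    found = []
    for frame in range(3):
        start = None
        for i in range(frame, len(seq) - 2, 3):
            codon = seq[i:i + 3]
            if codon == "ATG" and start is None:
                start = i
            elif codon in STOPS and start is not None:
                if (i - start) // 3 >= min_codons:
                    found.append((start, i + 3))
                start = None
    return found

print(orfs("CC" + "ATG" + "GCT" * 40 + "TAA"))  # one ORF in frame 2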
Shear-wave velocity profiling according to three alternative approaches: A comparative case study
NASA Astrophysics Data System (ADS)
Dal Moro, G.; Keller, L.; Al-Arifi, N. S.; Moustafa, S. S. R.
2016-11-01
This paper compares three different methodologies that can be used to analyze surface-wave propagation and so obtain the vertical shear-wave velocity (VS) profile. The three presented methods (currently still quite unconventional) are characterized by different field procedures and data processing. The first methodology is an evolution of the classical Multi-channel Analysis of Surface Waves (MASW), here accomplished by jointly considering Rayleigh and Love waves (analyzed according to the Full Velocity Spectrum approach) and the Horizontal-to-Vertical Spectral Ratio (HVSR). The second method is based on the joint analysis of the HVSR curve together with the Rayleigh-wave dispersion determined via Miniature Array Analysis of Microtremors (MAAM), a passive methodology that relies on a small number (4 to 6) of vertical geophones deployed along a small circle (for common near-surface applications the radius usually ranges from 0.6 to 5 m). Finally, the third approach is based on active data acquired by a single 3-component geophone and relies on the joint inversion of the group-velocity spectra of the radial and vertical components of the Rayleigh waves, together with the Radial-to-Vertical Spectral Ratio (RVSR). The results of the analyses performed with these approaches (completely different both in terms of field procedures and data analysis) appear extremely consistent, thus mutually validating their performances. Pros and cons of each approach are summarized both in terms of computational aspects and with respect to practical considerations regarding the specific character of the pertinent field procedures.
Passive versus active hazard detection and avoidance systems
NASA Astrophysics Data System (ADS)
Neveu, D.; Mercier, G.; Hamel, J.-F.; Simard Bilodeau, V.; Woicke, S.; Alger, M.; Beaudette, D.
2015-06-01
Upcoming planetary exploration missions will require advanced guidance, navigation and control technologies to reach landing sites with high precision and safety. Various technologies are currently in development to meet that goal. Some rely on passive sensors and benefit from the low mass and power of such solutions, while others rely on active sensors and benefit from improved robustness and accuracy. This paper presents two different hazard detection and avoidance (HDA) system design approaches. The first architecture relies only on a camera as the passive HDA sensor, while the second relies, in addition, on a Lidar as the active HDA sensor. Both options share an innovative hazard map fusion algorithm aimed at identifying the safest landing locations. This paper presents the simulation tools and reports the closed-loop software simulation results obtained with each design option. The paper also reports on the Monte Carlo simulation campaign used to assess the robustness of each design option. The performance of the two design options is compared in terms of criteria such as percentage of success, mean distance to nearest hazard, etc. The applicability of each design option to planetary exploration missions is also discussed.
Simulated Prosthetic Vision: The Benefits of Computer-Based Object Recognition and Localization.
Macé, Marc J-M; Guivarch, Valérian; Denis, Grégoire; Jouffrais, Christophe
2015-07-01
Clinical trials with blind patients implanted with a visual neuroprosthesis showed that even the simplest tasks were difficult to perform with the limited vision restored by current implants. Simulated prosthetic vision (SPV) is a powerful tool to investigate the putative functions of the upcoming generations of visual neuroprostheses. Recent studies based on SPV showed that several generations of implants will be required before usable vision is restored. However, none of these studies relied on advanced image processing. High-level image processing could significantly reduce the amount of information required to perform visual tasks and help restore visuomotor behaviors, even with current low-resolution implants. In this study, we simulated a prosthetic vision device based on object localization in the scene. We evaluated the usability of this device for object recognition, localization, and reaching. We showed that a very low number of electrodes (e.g., nine) is sufficient to restore visually guided reaching movements with fair timing (10 s) and high accuracy. In addition, performance, both in terms of accuracy and speed, was comparable with 9 and 100 electrodes. Extraction of high-level information (object recognition and localization) from video images could drastically enhance the usability of current visual neuroprostheses. We suggest that this method, that is, localization of targets of interest in the scene, may restore various visuomotor behaviors. This method could prove functional on current low-resolution implants. The main limitation resides in the reliability of the vision algorithms, which are improving rapidly. Copyright © 2015 International Center for Artificial Organs and Transplantation and Wiley Periodicals, Inc.
Analysis of decentralized variable structure control for collective search by mobile robots
NASA Astrophysics Data System (ADS)
Goldsmith, Steven Y.; Feddema, John T.; Robinett, Rush D., III
1998-10-01
This paper presents an analysis of a decentralized coordination strategy for organizing and controlling a team of mobile robots performing collective search. The alpha-beta coordination strategy is a family of collective search algorithms that allow teams of communicating robots to implicitly coordinate their search activities through a division of labor based on self-selected roles. In an alpha-beta team, alpha agents are motivated to improve their status by exploring new regions of the search space. Beta agents are conservative, and rely on the alpha agents to provide advanced information on favorable regions of the search space. An agent selects its current role dynamically based on its current status value relative to the current status values of the other team members. Status is determined by some function of the agent's sensor readings, and is generally a measurement of source intensity at the agent's current location. Variations on the decision rules determining alpha and beta behavior produce different versions of the algorithm that lead to different global properties. The alpha-beta strategy is based on a simple finite-state machine that implements a form of Variable Structure Control (VSC). The VSC system changes the dynamics of the collective system by abruptly switching at defined states to alternative control laws. In VSC, Lyapunov's direct method is often used to design control surfaces which guide the system to a given goal. We introduce the alpha-beta algorithm and present an analysis of the equilibrium point and the global stability of the alpha-beta algorithm based on Lyapunov's method.
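A toy Python rendition of the role-selection rule helps fix ideas: each robot evaluates its status (source intensity at its location), self-selects alpha when its status is at or above the team median, and beta otherwise; alphas explore while betas move toward the best-known location. The median threshold, the random-walk exploration, and the intensity model are illustrative stand-ins rather than the paper's decision rules.

import numpy as np

rng = np.random.default_rng(3)
source = np.array([8.0, 3.0])                   # unknown emission source
intensity = lambda p: 1.0 / (1.0 + np.sum((p - source) ** 2))

pos = rng.uniform(0, 10, size=(6, 2))           # six robots, random start
for step in range(50):
    status = np.array([intensity(p) for p in pos])
    best = pos[status.argmax()].copy()          # best-known location
    alpha = status >= np.median(status)         # self-selected roles
    pos[alpha] += rng.normal(scale=0.5, size=(alpha.sum(), 2))  # explore
    pos[~alpha] += 0.3 * (best - pos[~alpha])   # betas converge on best

status = np.array([intensity(p) for p in pos])
print("best robot at", np.round(pos[status.argmax()], 2), "source at", source)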
Benefits and Limitations of DNA Barcoding and Metabarcoding in Herbal Product Authentication.
Raclariu, Ancuta Cristina; Heinrich, Michael; Ichim, Mihael Cristin; de Boer, Hugo
2018-03-01
Herbal medicines play an important role globally in the health care sector and in industrialised countries they are often considered as an alternative to mono-substance medicines. Current quality and authentication assessment methods rely mainly on morphology and analytical phytochemistry-based methods detailed in pharmacopoeias. Herbal products however are often highly processed with numerous ingredients, and even if these analytical methods are accurate for quality control of specific lead or marker compounds, they are of limited suitability for the authentication of biological ingredients. To review the benefits and limitations of DNA barcoding and metabarcoding in complementing current herbal product authentication. Recent literature relating to DNA based authentication of medicinal plants, herbal medicines and products are summarised to provide a basic understanding of how DNA barcoding and metabarcoding can be applied to this field. Different methods of quality control and authentication have varying resolution and usefulness along the value chain of these products. DNA barcoding can be used for authenticating products based on single herbal ingredients and DNA metabarcoding for assessment of species diversity in processed products, and both methods should be used in combination with appropriate hyphenated chemical methods for quality control. DNA barcoding and metabarcoding have potential in the context of quality control of both well and poorly regulated supply systems. Standardisation of protocols for DNA barcoding and DNA sequence-based identification are necessary before DNA-based biological methods can be implemented as routine analytical approaches and approved by the competent authorities for use in regulated procedures. © 2017 The Authors. Phytochemical Analysis Published by John Wiley & Sons Ltd. © 2017 The Authors. Phytochemical Analysis Published by John Wiley & Sons Ltd.
Permanent spin currents in cavity-qubit systems
NASA Astrophysics Data System (ADS)
Kulkarni, Manas; Hein, Sven M.; Kapit, Eliot; Aron, Camille
2018-02-01
In a recent experiment [P. Roushan et al., Nat. Phys. 13, 146 (2017), 10.1038/nphys3930], a spin current in an architecture of three superconducting qubits was produced during a few microseconds by creating synthetic magnetic fields. The lifetime of the current was set by the typical dissipative mechanisms that occur in those systems. We propose a scheme for the generation of permanent currents, even in the presence of such imperfections, and scalable to larger system sizes. It relies on striking a subtle balance between multiple nonequilibrium drives and the dissipation mechanisms, in order to engineer and stimulate chiral excited states which can carry current.
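As a textbook-style illustration of how a synthetic flux produces a chiral current in a three-site ring, consider the single-excitation toy model below. This is a closed-system sketch, not the driven-dissipative scheme of the paper; the uniform Peierls phase and the bond-current convention are assumptions.

```python
import numpy as np

def chiral_current_three_site(phi, J=1.0):
    """Toy model: three sites on a ring threaded by synthetic flux phi, with
    complex hoppings J*exp(i*phi/3) on each bond; returns the bond current
    in the single-excitation ground state."""
    t = J * np.exp(1j * phi / 3.0)
    H = np.array([[0, t, np.conj(t)],
                  [np.conj(t), 0, t],
                  [t, np.conj(t), 0]])
    _, vecs = np.linalg.eigh(H)
    psi = vecs[:, 0]                     # single-excitation ground state
    # current on the 0-1 bond (sign convention is a choice)
    return 2.0 * np.imag(t * np.conj(psi[0]) * psi[1])

# time-reversal symmetry at phi = 0 forces zero current; flux makes it chiral
for phi in (0.0, np.pi / 2):
    print(phi, chiral_current_three_site(phi))
```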
Spectral methods on arbitrary grids
NASA Technical Reports Server (NTRS)
Carpenter, Mark H.; Gottlieb, David
1995-01-01
Stable and spectrally accurate numerical methods are constructed on arbitrary grids for partial differential equations. These new methods are equivalent to conventional spectral methods but do not rely on specific grid distributions. Specifically, we show how to implement Legendre Galerkin, Legendre collocation, and Laguerre Galerkin methodology on arbitrary grids.
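A generic building block for collocation on non-standard grids is the polynomial differentiation matrix on arbitrary nodes, sketched below via barycentric Lagrange interpolation. This illustrates only the arbitrary-grid ingredient; the stability machinery of the paper is not reproduced here.

```python
import numpy as np

def diff_matrix(x):
    """Polynomial differentiation matrix on arbitrary nodes x, built from
    barycentric Lagrange interpolation weights."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    w = np.array([1.0 / np.prod(x[j] - np.delete(x, j)) for j in range(n)])
    D = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            if i != j:
                D[i, j] = (w[j] / w[i]) / (x[i] - x[j])
        D[i, i] = -D[i].sum()   # rows of D annihilate constants
    return D

# sanity check: differentiates x**2 exactly on five arbitrary points
x = np.array([-0.9, -0.3, 0.1, 0.4, 0.8])
print(diff_matrix(x) @ x**2)   # approximately 2*x
```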
Computational Methods for Analyzing Health News Coverage
ERIC Educational Resources Information Center
McFarlane, Delano J.
2011-01-01
Researchers that investigate the media's coverage of health have historically relied on keyword searches to retrieve relevant health news coverage, and manual content analysis methods to categorize and score health news text. These methods are problematic. Manual content analysis methods are labor intensive, time consuming, and inherently…
Tunnel Field-Effect Transistors in 2-D Transition Metal Dichalcogenide Materials
NASA Astrophysics Data System (ADS)
Ilatikhameneh, Hesameddin; Tan, Yaohua; Novakovic, Bozidar; Klimeck, Gerhard; Rahman, Rajib; Appenzeller, Joerg
2015-12-01
In this work, the performance of Tunnel Field-Effect Transistors (TFETs) based on two-dimensional Transition Metal Dichalcogenide (TMD) materials is investigated by atomistic quantum transport simulations. One of the major challenges of TFETs is their low ON-currents. TFETs based on 2D materials can have tight gate control and high electric fields at the tunnel junction, and can in principle generate high ON-currents along with a sub-threshold swing smaller than 60 mV/dec. Our simulations reveal that high-performance TMD TFETs not only require good gate control, but also rely on the choice of the right channel material with optimum band gap, effective mass and source/drain doping level. Unlike previous works, a full-band atomistic tight-binding method is used self-consistently with the 3D Poisson equation to simulate ballistic quantum transport in these devices. The effect of the choice of TMD material on the performance of the device and its transfer characteristics is discussed. Moreover, the criteria for high ON-currents are explained with a simple analytic model, showing the related fundamental factors. Finally, the subthreshold swing and energy-delay of these TFETs are compared with those of conventional CMOS devices.
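Simple analytic criteria of this kind are commonly expressed through a WKB estimate of band-to-band tunneling transmission through a triangular barrier. The sketch below is a generic illustration of that estimate, not the authors' model; the formula choice, screening length, and parameter values are assumptions.

```python
import math

HBAR = 1.054571817e-34  # J*s
Q = 1.602176634e-19     # J per eV
M0 = 9.1093837015e-31   # electron rest mass, kg

def wkb_transmission(eg_ev, dphi_ev, m_eff, lam_nm):
    """Triangular-barrier WKB estimate of band-to-band tunneling probability:
    T = exp(-4*lam*sqrt(2*m)*Eg^{3/2} / (3*hbar*(Eg + dPhi))).
    eg_ev: band gap (eV); dphi_ev: tunneling energy window (eV);
    m_eff: tunneling mass in units of m0; lam_nm: screening length (nm)."""
    eg, dphi = eg_ev * Q, dphi_ev * Q
    lam, mstar = lam_nm * 1e-9, m_eff * M0
    expo = -4.0 * lam * math.sqrt(2.0 * mstar) * eg ** 1.5 / (3.0 * HBAR * (eg + dphi))
    return math.exp(expo)

# smaller gap and lighter mass raise the ON-current ceiling (illustrative values)
print(wkb_transmission(eg_ev=1.0, dphi_ev=0.3, m_eff=0.3, lam_nm=2.0))
print(wkb_transmission(eg_ev=0.6, dphi_ev=0.3, m_eff=0.15, lam_nm=2.0))
```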
Use of whole genome sequencing in surveillance of drug resistant tuberculosis.
McNerney, Ruth; Zignol, Matteo; Clark, Taane G
2018-05-01
The threat of resistance to anti-tuberculosis drugs is of global concern. Current efforts to monitor resistance rely on phenotypic testing, where cultured bacteria are exposed to critical concentrations of the drugs. Capacity for such testing is low in TB-endemic countries. Drug resistance is caused by mutations in the Mycobacterium tuberculosis genome, and whole genome sequencing to detect these mutations offers an alternative means of assessing resistance. Areas covered: The challenges of assessing TB drug resistance are discussed. Progress in elucidating the M. tuberculosis resistome and evidence of the accuracy of next generation sequencing for detecting resistance is reviewed. Expert commentary: There are considerable advantages to using next generation sequencing for TB drug resistance surveillance. Accuracy is high for detecting resistance to the major first-line drugs but is currently lower for the second-line drugs, due to our incomplete knowledge of resistance-causing mutations. Given the advances in sequencing technology and the opportunity to replace phenotypic drug susceptibility testing with safer and more cost-effective methods, it would appear that the question is when to implement. Current bottlenecks are sample extraction to allow whole genome sequencing directly from sputum, and the lack of bioinformatics expertise in some TB-endemic countries.
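At its core, sequence-based resistance prediction is a lookup of called variants against a curated mutation catalog. The Python sketch below shows the idea with two well-known example entries (katG S315T for isoniazid, rpoB S450L for rifampicin); a real surveillance pipeline would use a much larger curated catalog and annotated VCF input, so treat this as a toy illustration.

```python
# Minimal sketch: flag known resistance mutations in a list of called variants.
RESISTANCE_CATALOG = {
    ("rpoB", "S450L"): "rifampicin",
    ("katG", "S315T"): "isoniazid",
}

def predict_resistance(variants):
    """variants: iterable of (gene, amino-acid change) tuples from WGS calls."""
    hits = {}
    for gene, change in variants:
        drug = RESISTANCE_CATALOG.get((gene, change))
        if drug:
            hits.setdefault(drug, []).append(f"{gene} {change}")
    return hits

print(predict_resistance([("rpoB", "S450L"), ("gyrA", "A90V")]))
# -> {'rifampicin': ['rpoB S450L']}
```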
Negative Example Selection for Protein Function Prediction: The NoGO Database
Youngs, Noah; Penfold-Brown, Duncan; Bonneau, Richard; Shasha, Dennis
2014-01-01
Negative examples – genes that are known not to carry out a given protein function – are rarely recorded in genome and proteome annotation databases, such as the Gene Ontology database. Negative examples are required, however, for several of the most powerful machine learning methods for integrative protein function prediction. Most protein function prediction efforts have relied on a variety of heuristics for the choice of negative examples. Determining the accuracy of methods for negative example prediction is itself a non-trivial task, given that the Open World Assumption as applied to gene annotations rules out many traditional validation metrics. We present a rigorous comparison of these heuristics, utilizing a temporal holdout, and a novel evaluation strategy for negative examples. We add to this comparison several algorithms adapted from Positive-Unlabeled learning scenarios in text-classification, which are the current state of the art methods for generating negative examples in low-density annotation contexts. Lastly, we present two novel algorithms of our own construction, one based on empirical conditional probability, and the other using topic modeling applied to genes and annotations. We demonstrate that our algorithms achieve significantly fewer incorrect negative example predictions than the current state of the art, using multiple benchmarks covering multiple organisms. Our methods may be applied to generate negative examples for any type of method that deals with protein function, and to this end we provide a database of negative examples in several well-studied organisms, for general use (The NoGO database, available at: bonneaulab.bio.nyu.edu/nogo.html). PMID:24922051
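To illustrate the empirical-conditional-probability idea, the sketch below scores each unannotated gene by the average empirical probability of the target term given the gene's other annotations, and proposes the lowest-scoring genes as negatives. This is a simplification of the idea described above, not the NoGO implementation; the averaging rule is an assumption.

```python
import itertools
from collections import Counter

def conditional_prob_negatives(annotations, target_go, k=100):
    """annotations: dict gene -> set of GO terms. Returns the k genes with the
    lowest average empirical P(target_go | term) over their annotations."""
    pair, single = Counter(), Counter()
    for terms in annotations.values():
        for t in terms:
            single[t] += 1
        for a, b in itertools.permutations(terms, 2):
            pair[(a, b)] += 1   # ordered co-annotation counts

    scores = {}
    for gene, terms in annotations.items():
        if target_go in terms:
            continue  # already a positive example
        probs = [pair[(t, target_go)] / single[t] for t in terms if single[t]]
        scores[gene] = sum(probs) / len(probs) if probs else 0.0
    return sorted(scores, key=scores.get)[:k]
```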
Frankiewicz, Mikołaj; Połom, Wojciech; Matuszewski, Marcin
2018-01-01
Great advances in medical research concerning methods of contraception have been achieved in recent years; however, more than 25% of couples worldwide still rely on condoms, a method with poor efficacy. Even though there is a spectrum of 11 different contraceptive methods for women, only 4 are commonly used by men (condoms, periodic abstinence, withdrawal and vasectomy). In this review, advances in current, state-of-the-art hormonal and non-hormonal male contraceptive methods are presented and evaluated, and potential novel targets that warrant greater research are highlighted. A comprehensive literature search without a time limit was performed using the Medline database in May 2017. The terms 'male contraception' in conjunction with 'reversible inhibition of sperm under guidance' (RISUG), 'hormonal', 'non-hormonal', 'vasectomy' or 'testosterone' were used. The articles were limited to those published in English, Polish or French. There are various contraceptives currently available to regulate male fertility. Vasectomy is still the most effective permanent form of male contraception, with a failure rate lower than 1%. Reversible, non-hormonal methods of male contraception, like reversible inhibition of sperm under guidance, are very promising and close to being introduced onto the market. With regard to hormonal contraception, the use of testosterone injections has been widely studied, yet these often harbor undesirable side effects and require further development. Despite continuous efforts worldwide, it seems that several more years of research are needed to provide safe, effective and affordable male contraceptives that will allow both men and women to participate fully in family planning.
Lu, Bingxin; Leong, Hon Wai
2016-02-01
Genomic islands (GIs) are clusters of functionally related genes acquired by lateral genetic transfer (LGT), and they are present in many bacterial genomes. GIs are extremely important for bacterial research, because they not only promote genome evolution but also contain genes that enhance adaptation and enable antibiotic resistance. Many methods have been proposed to predict GIs, but most rely on either annotations or comparisons with other closely related genomes, and hence cannot be easily applied to new genomes. As the number of newly sequenced bacterial genomes rapidly increases, there is a need for methods that detect GIs based solely on the sequence of a single genome. In this paper, we propose a novel method, GI-SVM, to predict GIs given only the unannotated genome sequence. GI-SVM is based on a one-class support vector machine (SVM), utilizing composition bias in terms of k-mer content. In our evaluations on three real genomes, GI-SVM achieves higher recall than current methods without much loss of precision. Besides, GI-SVM allows flexible parameter tuning to obtain optimal results for each genome. In short, GI-SVM provides a more sensitive method for researchers interested in a first-pass detection of GIs in newly sequenced genomes.
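The underlying idea (a one-class SVM over k-mer composition of genome windows) can be sketched in a few lines of Python with scikit-learn. This is a generic sketch of the approach, not the GI-SVM implementation; the window size, step, kernel, and ν value are illustrative assumptions.

```python
from itertools import product

import numpy as np
from sklearn.svm import OneClassSVM

def kmer_profile(seq, k=4):
    """Normalized k-mer frequency vector for a DNA window."""
    index = {"".join(p): i for i, p in enumerate(product("ACGT", repeat=k))}
    v = np.zeros(len(index))
    for i in range(len(seq) - k + 1):
        j = index.get(seq[i:i + k])
        if j is not None:
            v[j] += 1
    total = v.sum()
    return v / total if total else v

def flag_islands(genome, window=5000, step=2500, nu=0.1, k=4):
    """One-class SVM over sliding-window k-mer profiles; windows the model
    treats as outliers (compositionally atypical) are candidate islands."""
    starts = range(0, max(len(genome) - window, 1), step)
    X = np.array([kmer_profile(genome[s:s + window], k) for s in starts])
    labels = OneClassSVM(kernel="rbf", nu=nu, gamma="scale").fit_predict(X)
    return [s for s, lab in zip(starts, labels) if lab == -1]
```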
Nema, Shubham; Hasan, Whidul; Bhargava, Anamika; Bhargava, Yogesh
2016-09-15
Behavioural neuroscience relies on software-driven methods for behavioural assessment, but the field lacks cost-effective, robust, open-source software for behavioural analysis. Here we propose a novel method, which we call ZebraTrack. It comprises a cost-effective imaging setup for distraction-free behavioural acquisition, automated tracking using the open-source ImageJ software, and a workflow for extraction of behavioural endpoints. Our ImageJ algorithm is capable of giving users control at key steps while maintaining automation in tracking, without requiring the installation of external plugins. We validated this method by testing novelty-induced anxiety behaviour in adult zebrafish. Our results, in agreement with established findings, showed that during state anxiety zebrafish exhibit reduced distance travelled, increased thigmotaxis and more freezing events. Furthermore, we propose a method to represent both the spatial and temporal distribution of choice-based behaviour, which is currently not possible with simple videograms. The ZebraTrack method is simple and economical, yet robust enough to give results comparable with those obtained from costly proprietary software such as Ethovision XT. We have developed and validated a novel cost-effective method for behavioural analysis of adult zebrafish using open-source ImageJ software. Copyright © 2016 Elsevier B.V. All rights reserved.
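Once a trajectory has been tracked, the endpoints named above reduce to simple computations on the (x, y) time series. The Python sketch below shows one way to derive them; the thresholds and the centred circular-arena geometry are illustrative assumptions, not the published ZebraTrack parameters.

```python
import numpy as np

def behavioural_endpoints(xy, fps, arena_radius, wall_frac=0.8, freeze_speed=0.5):
    """xy: (T, 2) tracked positions, centred on the arena middle.
    Returns distance travelled, thigmotaxis fraction, and freezing-event count."""
    xy = np.asarray(xy, dtype=float)
    step = np.linalg.norm(np.diff(xy, axis=0), axis=1)
    speed = step * fps

    distance_travelled = step.sum()
    # thigmotaxis: fraction of frames spent in the outer band of the arena
    r = np.linalg.norm(xy, axis=1)
    thigmotaxis = float(np.mean(r > wall_frac * arena_radius))
    # freezing events: onsets of runs of frames below the speed threshold
    # (a minimum-duration criterion, omitted here, is common in practice)
    frozen = speed < freeze_speed
    freezing_events = int(np.sum(frozen[1:] & ~frozen[:-1]) + frozen[0])
    return distance_travelled, thigmotaxis, freezing_events
```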
A New Method for Analyzing Near-Field Faraday Probe Data in Hall Thrusters
NASA Technical Reports Server (NTRS)
Huang, Wensheng; Shastry, Rohit; Herman, Daniel A.; Soulas, George C.; Kamhawi, Hani
2013-01-01
This paper presents a new method for analyzing near-field Faraday probe data obtained from Hall thrusters. Traditional methods spawned from far-field Faraday probe analysis rely on assumptions that are not applicable to near-field Faraday probe data. In particular, arbitrary choices for the point of origin and limits of integration have made interpretation of the results difficult. The new method, called iterative pathfinding, uses the evolution of the near-field plume with distance to provide feedback for determining the location of the point of origin. Although still susceptible to the choice of integration limits, this method presents a systematic approach to determining the origin point for calculating the divergence angle. The iterative pathfinding method is applied to near-field Faraday probe data taken in a previous study from the NASA-300M and NASA-457Mv2 Hall thrusters. Since these two thrusters use centrally mounted cathodes, the current density associated with the cathode plume is removed before applying iterative pathfinding. A procedure is presented for removing the cathode plume. The results of the analysis are compared to far-field probe analysis results. This paper ends with checks on the validity of the new method and discussions on the implications of the results.
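A toy reduction of the pathfinding idea: follow the peak of the current-density profile across axial stations and extrapolate that path back to the axis to locate a virtual origin, then compute a charge-flux-weighted divergence angle from it. The Python sketch below is a heavily simplified, non-iterative illustration under these assumptions, not the authors' algorithm.

```python
import numpy as np

def virtual_origin(z_planes, r, j):
    """z_planes: (Nz,) axial positions; r: (Nr,) radial grid;
    j: (Nz, Nr) ion current density. Fits a line to the peak radius versus z
    and extrapolates it to r = 0 to estimate an origin point on the axis."""
    r_peak = r[np.argmax(j, axis=1)]              # peak radius at each station
    slope, intercept = np.polyfit(z_planes, r_peak, 1)
    return -intercept / slope                     # z where the peak path meets r = 0

def divergence_angle(z0, z_plane, r, j_plane):
    """Charge-flux-weighted divergence angle (degrees) at one axial station,
    measured from the estimated origin z0."""
    w = j_plane * 2.0 * np.pi * r                 # annular current weighting
    theta = np.arctan2(r, z_plane - z0)
    return np.degrees(np.sum(w * theta) / np.sum(w))
```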
Quantifying natural delta variability using a multiple-point geostatistics prior uncertainty model
NASA Astrophysics Data System (ADS)
Scheidt, Céline; Fernandes, Anjali M.; Paola, Chris; Caers, Jef
2016-10-01
We address the question of quantifying uncertainty associated with autogenic pattern variability in a channelized transport system by means of a modern geostatistical method. This question has considerable relevance for practical subsurface applications as well, particularly those related to uncertainty quantification relying on Bayesian approaches. Specifically, we show how the autogenic variability in a laboratory experiment can be represented and reproduced by a multiple-point geostatistical prior uncertainty model. The latter geostatistical method requires selection of a limited set of training images from which a possibly infinite set of geostatistical model realizations, mimicking the training image patterns, can be generated. To that end, we investigate two methods to determine how many training images and what training images should be provided to reproduce natural autogenic variability. The first method relies on distance-based clustering of overhead snapshots of the experiment; the second method relies on a rate of change quantification by means of a computer vision algorithm termed the demon algorithm. We show quantitatively that with either training image selection method, we can statistically reproduce the natural variability of the delta formed in the experiment. In addition, we study the nature of the patterns represented in the set of training images as a representation of the "eigenpatterns" of the natural system. The eigenpatterns in the training image sets display patterns consistent with previous physical interpretations of the fundamental modes of this type of delta system: a highly channelized, incisional mode; a poorly channelized, depositional mode; and an intermediate mode between the two.
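The first selection method can be sketched as clustering on pairwise image distances and keeping representative snapshots. The Python fragment below uses Euclidean pixel distance and a greedy farthest-point medoid selection as simplifying assumptions standing in for the clustering used in the study.

```python
import numpy as np

def select_training_images(snapshots, n_train):
    """snapshots: (N, H, W) array of overhead images.
    Returns indices of n_train representative snapshots."""
    X = snapshots.reshape(len(snapshots), -1).astype(float)
    D = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=2)  # pairwise distances

    # greedy farthest-point selection: start from the most central snapshot,
    # then repeatedly add the snapshot farthest from the current set,
    # maximizing coverage of the observed pattern variability
    medoids = [int(np.argmin(D.sum(axis=0)))]
    while len(medoids) < n_train:
        gap = D[:, medoids].min(axis=1)
        medoids.append(int(np.argmax(gap)))
    return medoids
```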
NASA Astrophysics Data System (ADS)
Ristau, Henry
Many tasks in smart environments can be implemented using message-based communication paradigms that decouple applications in time, space, synchronization and semantics. Current solutions for decoupled message-based communication either do not support message processing, and thus semantic decoupling, or rely on clearly defined network structures. In this paper we present ASP, a novel concept for such communication that can operate directly on neighbor relations between brokers and does not rely on a homogeneous addressing scheme or on anything more than simple link-layer communication. We show by simulation that ASP performs well in a heterogeneous scenario with mobile nodes and decreases network and processor load significantly compared to message flooding.
NASA Astrophysics Data System (ADS)
Riah, Zoheir; Sommet, Raphael; Nallatamby, Jean C.; Prigent, Michel; Obregon, Juan
2004-05-01
We present in this paper a set of coherent tools for noise characterization and physics-based analysis of noise in semiconductor devices. This noise toolbox relies on a low-frequency noise measurement setup with special high-current capabilities, thanks to an accurate and original calibration. It also relies on a simulation tool based on the drift-diffusion equations and linear perturbation theory, associated with the Green's function technique. This physics-based noise simulator has been implemented successfully in the Scilab environment and is specifically dedicated to HBTs. Some results are given and compared to those existing in the literature.
Boehlen, Anne; Henneberger, Christian; Erchova, Irina
2013-01-01
The temporal lobe is well known for its oscillatory activity associated with exploration, navigation, and learning. Intrinsic membrane potential oscillations (MPOs) and resonance of stellate cells (SCs) in layer II of the entorhinal cortex are thought to contribute to network oscillations and thereby to the encoding of spatial information. Generation of both MPOs and resonance relies on the expression of specific voltage-dependent ion currents such as the hyperpolarization-activated cation current (IH), the persistent sodium current (INaP), and the noninactivating muscarine-modulated potassium current (IM). However, the differential contributions of these currents remain a matter of debate. We therefore examined how they modify neuronal excitability near threshold and generation of near-threshold MPOs and resonance in vitro. We found that resonance mainly relied on IH and was reduced by IH blockers and modulated by cAMP and an IM enhancer but that neither of the currents exhibited full control over MPOs in these cells. As previously reported, IH controlled a theta-frequency component of MPOs such that blockade of IH resulted in fewer regular oscillations that retained low-frequency components and high peak amplitude. However, pharmacological inhibition and augmentation of IM also affected MPO frequencies and amplitudes. In contrast to other cell types, inhibition of INaP did not result in suppression of MPOs but only in a moderation of their properties. We reproduced the experimentally observed effects in a single-compartment stochastic model of SCs, providing further insight into the interactions between different ionic conductances. PMID:23076110
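Subthreshold resonance of the kind measured here is conventionally quantified from the impedance profile under a swept-sine (ZAP) current injection. The Python sketch below shows that generic computation; the frequency window and the Q definition are illustrative assumptions, not the study's analysis pipeline.

```python
import numpy as np

def impedance_profile(i_inj, v_resp, dt):
    """Impedance is the ratio of the Fourier transforms of the voltage response
    and the injected ZAP current; the resonance frequency is the impedance peak.
    Returns (resonance frequency, resonance strength Q)."""
    freqs = np.fft.rfftfreq(len(i_inj), dt)
    Z = np.abs(np.fft.rfft(v_resp) / np.fft.rfft(i_inj))
    band = (freqs > 0.5) & (freqs < 20.0)   # theta-range window (assumption)
    f_res = freqs[band][np.argmax(Z[band])]
    q_factor = Z[band].max() / Z[band][0]   # peak over low-frequency impedance
    return f_res, q_factor
```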
Petascale turbulence simulation using a highly parallel fast multipole method on GPUs
NASA Astrophysics Data System (ADS)
Yokota, Rio; Barba, L. A.; Narumi, Tetsu; Yasuoka, Kenji
2013-03-01
This paper reports large-scale direct numerical simulations of homogeneous-isotropic fluid turbulence, achieving sustained performance of 1.08 petaflop/s on GPU hardware using single precision. The simulations use a vortex particle method to solve the Navier-Stokes equations, with a highly parallel fast multipole method (FMM) as numerical engine, and match the current record in mesh size for this application, a cube of 4096³ computational points solved with a spectral method. The standard numerical approach used in this field is the pseudo-spectral method, relying on the FFT algorithm as the numerical engine. The particle-based simulations presented in this paper quantitatively match the kinetic energy spectrum obtained with a pseudo-spectral method, using a trusted code. In terms of parallel performance, weak scaling results show the FMM-based vortex method achieving 74% parallel efficiency on 4096 processes (one GPU per MPI process, 3 GPUs per node of the TSUBAME-2.0 system). The FFT-based spectral method is able to achieve just 14% parallel efficiency on the same number of MPI processes (using only CPU cores), due to the all-to-all communication pattern of the FFT algorithm. The calculation time for one time step was 108 s for the vortex method and 154 s for the spectral method, under these conditions. Computing with 69 billion particles, this work exceeds by an order of magnitude the largest vortex-method calculations to date.
Assessing the motor carrier industry and its segments : current and prospective issues, 2006.
DOT National Transportation Integrated Search
2006-04-01
This report updates the Motor Carrier Industry Profile: An Update 2004-2005 to reflect recent changes in the motor carrier industry. This document relies heavily on industry trade journals, company annual reports and other industry surveys as sources...
A spatially explicit model for estimating risks of pesticide exposure on bird populations
Product Description (FY17 Key Product): Current ecological risk assessment for pesticides under FIFRA relies on risk quotients (RQs), which suffer from significant methodological shortcomings. For example, RQs do not integrate adverse effects arising from multiple demographic pr...
Stochastic Modeling and Global Warming Trend Extraction For Ocean Acoustic Travel Times.
1995-01-06
consideration and that these models can not currently be relied upon by themselves to predict global warming . Experimental data is most certainly needed, not...only to measure global warming itself, but to help improve the ocean model themselves. (AN)
Newcastle disease: current vaccine research
USDA-ARS?s Scientific Manuscript database
Newcastle disease (ND) is one of the most important infectious diseases that affect poultry due to its devastating economic impact and world-wide distribution and contribution towards malnutrition in countries that rely on production of village chickens as a source of animal protein. Besides biosec...
Predicting wind-driven waves in small reservoirs
USDA-ARS?s Scientific Manuscript database
The earthen levees commonly used for irrigation reservoirs are subjected to significant embankment erosion due to wind-generated waves. The design of bank protection measures relies on adequate prediction of wave characteristics based on wind conditions and fetch length. Current formulations are ba...
Sustainable approaches to control postharvest diseases of apples
USDA-ARS?s Scientific Manuscript database
Long term storage of apples faces challenges in maintaining fruit quality and reducing losses from postharvest diseases. Currently, the apple industry relies mainly on synthetic fungicides to control postharvest decays. However, the limitations to fungicides such as the development of resistance i...
Rotational Stiffness of Precast Beam-Column Connection using Finite Element Method
NASA Astrophysics Data System (ADS)
Hashim, N.; Agarwal, J.
2018-04-01
Current design practice in structural analysis is to assume connections are either pinned or rigid; however, this assumption cannot be relied upon for safety against collapse, because in service the actual connection behaves differently, undergoing partial rotation. This situation may lead to different reactions and consequently affect design results and other frame responses. In precast concrete structures, connections play an important part in ensuring the safety of the whole structure. Thus, investigating the actual connection behavior by constructing the moment-rotation relationship is significant. The finite element (FE) method is chosen for modeling a 3-dimensional beam-column connection. The model is built with symmetry to reduce analysis time. Results demonstrate that the precast billet connection is categorized as a semi-rigid connection with an initial rotational stiffness S_ini of 23,138 kNm/rad. This is distinctly different from the assumption of a pinned or rigid connection used in design practice. Validation was performed by comparison with a mathematical equation, and the small differences obtained support the conclusion that modeling the precast billet connection with the FE method is acceptable.
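The initial rotational stiffness itself is just the initial slope of the moment-rotation curve. A minimal Python sketch, assuming FE output sampled as arrays and an illustrative 20% fit window:

```python
import numpy as np

def initial_rotational_stiffness(moment, rotation, fit_frac=0.2):
    """Estimate S_ini (kNm/rad) as the least-squares slope over the initial
    portion of a moment-rotation curve.
    moment: (N,) moments (kNm); rotation: (N,) rotations (rad), from FE output."""
    n = max(2, int(len(moment) * fit_frac))
    slope = np.polyfit(rotation[:n], moment[:n], 1)[0]
    return slope  # compare against pinned/rigid classification boundaries
```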
Fast online deconvolution of calcium imaging data
Zhou, Pengcheng; Paninski, Liam
2017-01-01
Fluorescent calcium indicators are a popular means for observing the spiking activity of large neuronal populations, but extracting the activity of each neuron from raw fluorescence calcium imaging data is a nontrivial problem. We present a fast online active set method to solve this sparse non-negative deconvolution problem. Importantly, the algorithm progresses through each time series sequentially from beginning to end, thus enabling real-time online estimation of neural activity during the imaging session. Our algorithm is a generalization of the pool adjacent violators algorithm (PAVA) for isotonic regression and inherits its linear-time computational complexity. We gain remarkable increases in processing speed: more than one order of magnitude compared to currently employed state of the art convex solvers relying on interior point methods. Unlike these approaches, our method can exploit warm starts; therefore optimizing model hyperparameters only requires a handful of passes through the data. A minor modification can further improve the quality of activity inference by imposing a constraint on the minimum spike size. The algorithm enables real-time simultaneous deconvolution of O(10^5) traces of whole-brain larval zebrafish imaging data on a laptop. PMID:28291787
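The pooling idea that generalizes PAVA fits in a short sketch. The Python fragment below is a simplified single-pass illustration for an AR(1) model (the l1-penalty handling and the final non-negativity clamp are simplifications); it is not the released implementation.

```python
import numpy as np

def oasis_ar1(y, g, lam=0.0):
    """Sketch of pool-adjacent-violators-style deconvolution for an AR(1) model:
        min 0.5*||c - y||^2 + lam*sum(s)   s.t.  s_t = c_t - g*c_{t-1} >= 0
    Adjacent "pools" that violate the constraint are merged as we sweep forward."""
    T = len(y)
    mu = np.full(T, lam * (1.0 - g))        # l1 penalty folded into the data
    mu[-1] = lam
    ytil = np.asarray(y, dtype=float) - mu

    pools = []                              # each pool: [value v, weight w, start t, length l]
    for t in range(T):
        pools.append([ytil[t], 1.0, t, 1])
        # merge while the constraint v_{i+1} >= g^{l_i} * v_i is violated
        while len(pools) > 1 and pools[-1][0] < g ** pools[-2][3] * pools[-2][0]:
            v2, w2, _, l2 = pools.pop()
            v1, w1, t1, l1 = pools.pop()
            f = g ** l1
            w = w1 + f * f * w2
            v = (w1 * v1 + f * w2 * v2) / w  # weighted merge of pool values
            pools.append([v, w, t1, l1 + l2])

    c = np.empty(T)
    for v, w, t, l in pools:
        c[t:t + l] = max(v, 0.0) * g ** np.arange(l)  # clamp enforces c >= 0
    s = np.empty(T)
    s[0] = c[0]
    s[1:] = c[1:] - g * c[:-1]
    return c, s
```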
A Pilot Study to Directly Measure the Dynamical Masses of ULIRGs at Intermediate Redshifts
NASA Astrophysics Data System (ADS)
Rothberg, Barry
2012-02-01
We propose a pilot study to use the Calcium II Triplet stellar absorption lines (rest-frame 0.85 microns) in conjunction with publicly available, high-resolution rest-frame optical HST imaging to directly measure the dynamical masses (M_dyn) and estimate central black hole masses (M_BH) in a small sample of intermediate redshift ULIRGs (0.4 < z < 1.0). It is the same method we have used to measure M_dyn and M_BH in local ULIRGs, and has successfully shown that these systems are statistically indistinguishable from nearby (z < 0.4) QSOs. At 0.4 < z < 1.0, the star-formation rates, gas fractions, and (presumably) masses, are believed to be significantly higher than in the local universe. However, mass is a critical parameter in most galaxy scaling relations, and current methods to estimate mass at intermediate redshifts rely heavily on unproven assumptions. Using stellar velocity dispersions is a straightforward method of measuring M_dyn, and we will use it to: 1) confirm higher masses at 0.4 < z < 1.0; and 2) provide a calibration for other techniques.
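The step from a measured velocity dispersion to a dynamical mass is a standard virial-style estimate, M_dyn = k·σ²·R_eff/G, where the structure constant k is model-dependent. A minimal sketch, with k ~ 5 as a common assumption for spheroids:

```python
G = 4.301e-3  # gravitational constant in pc * (km/s)^2 / Msun

def dynamical_mass(sigma_kms, reff_pc, k=5.0):
    """Virial-style dynamical mass from a stellar velocity dispersion:
    M_dyn = k * sigma^2 * R_eff / G (the constant k is an assumption)."""
    return k * sigma_kms ** 2 * reff_pc / G

# e.g., sigma = 200 km/s and R_eff = 2 kpc give ~9e10 Msun
print(f"{dynamical_mass(200.0, 2000.0):.3e} Msun")
```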
Askari, Sina; Zhang, Mo; Won, Deborah S
2010-01-01
Current methods for assessing the efficacy of treatments for Parkinson's disease (PD) rely on physician rated scores. These methods pose three major shortcomings: 1) the subjectivity of the assessments, 2) the lack of precision on the rating scale (6 discrete levels), and 3) the inability to assess symptoms except under very specific conditions and/or for very specific tasks. To address these shortcomings, a portable system was developed to continuously monitor Parkinsonian symptoms with quantitative measures based on electrical signals from muscle activity (EMG). Here, we present the system design and the implementation of methods for system validation. This system was designed to provide continuous measures of tremor, rigidity, and bradykinesia which are related to the neurophysiological source without the need for multiple bulky experimental apparatuses, thus allowing more precise, quantitative indicators of the symptoms which can be measured during practical daily living tasks. This measurement system has the potential to improve the diagnosis of PD as well as the evaluation of PD treatments, which is an important step in the path to improving PD treatments.
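One plausible quantitative tremor measure from such EMG recordings is the relative spectral power in the parkinsonian tremor band. The Python sketch below illustrates that computation; the 4-6 Hz band, the crude rectification step, and the Welch parameters are assumptions for illustration, not the system's published algorithm.

```python
import numpy as np
from scipy.signal import welch

def tremor_band_power(emg, fs, band=(4.0, 6.0)):
    """Relative power of the rectified EMG envelope in the tremor band.
    emg: raw EMG samples; fs: sampling rate (Hz)."""
    envelope = np.abs(emg - np.mean(emg))       # crude rectified envelope
    f, pxx = welch(envelope, fs=fs, nperseg=min(len(envelope), 4 * int(fs)))
    in_band = (f >= band[0]) & (f <= band[1])
    return np.trapz(pxx[in_band], f[in_band]) / np.trapz(pxx, f)
```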
NASA Astrophysics Data System (ADS)
Mahan, Matthew
Microbial keratitis (MK) is an infection of the cornea by pathogenic organisms that causes inflammation and irritation. It can lead to full or partial blindness if left untreated. Current clinical treatment methods rely on high frequency application of topical drugs which are subject to the issues of patient compliance and microbial resistance. In this work, gold nanoparticles (AuNP) were proposed as an alternative treatment method in light-based therapies. Particle formulation methods were investigated and assessed using transmission electron microscopy (TEM) and ultraviolet/visible spectroscopy (UV-Vis). AuNP of 20 nm diameter were used as platforms to attach monoclonal antibodies anti-FLAG or anti-F1 to enhance their cell-targeting ability as well as polyethylene glycol to reduce non-specific binding and protein adsorption. These functionalized particles were qualitatively assessed using UV-Vis. The antibody-functionalized AuNP were then assessed for their ability to attach directly to Pseudomonas aeruginosa, expressing FLAG peptide, or Aspergillus fumigatus, expressing the F1 receptor. Attachment was imaged using dark field microscopy, transmission electron microscopy, and fluorescence microscopy.