Recurrence Quantification Analysis of Sentence-Level Speech Kinematics
ERIC Educational Resources Information Center
Jackson, Eric S.; Tiede, Mark; Riley, Michael A.; Whalen, D. H.
2016-01-01
Purpose: Current approaches to assessing sentence-level speech variability rely on measures that quantify variability across utterances and use normalization procedures that alter raw trajectory data. The current work tests the feasibility of a less restrictive nonlinear approach--recurrence quantification analysis (RQA)--via a procedural example…
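For readers unfamiliar with RQA, the following minimal sketch (hypothetical values, not the study's code) embeds a one-dimensional kinematic trajectory, thresholds the pairwise-distance matrix into a recurrence plot, and computes two standard RQA measures, recurrence rate and determinism. The embedding dimension, delay, and radius are illustrative assumptions.

```python
import numpy as np

def embed(x, dim=3, tau=5):
    """Time-delay embedding of a 1-D trajectory."""
    n = len(x) - (dim - 1) * tau
    return np.column_stack([x[i * tau:i * tau + n] for i in range(dim)])

def recurrence_plot(x, dim=3, tau=5, radius=0.15):
    """Thresholded pairwise-distance matrix of the embedded trajectory."""
    e = embed(np.asarray(x, dtype=float), dim, tau)
    d = np.linalg.norm(e[:, None, :] - e[None, :, :], axis=-1)
    return d <= radius * d.max()   # radius as a fraction of the largest distance

def run_lengths(diag):
    """Lengths of consecutive True runs along one diagonal."""
    lengths, count = [], 0
    for v in diag:
        if v:
            count += 1
        else:
            if count:
                lengths.append(count)
            count = 0
    if count:
        lengths.append(count)
    return lengths

def rqa_measures(rp, lmin=2):
    n = rp.shape[0]
    rec_rate = rp[~np.eye(n, dtype=bool)].mean()      # %REC: density of recurrent points
    diag_points = lmin_points = 0
    for k in range(1, n):                             # upper-triangle diagonals (matrix is symmetric)
        for run in run_lengths(np.diagonal(rp, offset=k)):
            diag_points += run
            if run >= lmin:
                lmin_points += run
    determinism = lmin_points / diag_points if diag_points else 0.0   # %DET
    return rec_rate, determinism

# toy stand-in for a repetitive articulatory trajectory: a slightly noisy oscillation
t = np.linspace(0, 8 * np.pi, 400)
signal = np.sin(t) + 0.05 * np.random.default_rng(0).normal(size=t.size)
rec, det = rqa_measures(recurrence_plot(signal))
print(f"%REC = {100 * rec:.1f}, %DET = {100 * det:.1f}")
```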
Richens, Joanna L; Urbanowicz, Richard A; Lunt, Elizabeth AM; Metcalf, Rebecca; Corne, Jonathan; Fairclough, Lucy; O'Shea, Paul
2009-01-01
Chronic obstructive pulmonary disease (COPD) is a treatable and preventable disease state, characterised by progressive airflow limitation that is not fully reversible. Although COPD is primarily a disease of the lungs there is now an appreciation that many of the manifestations of disease are outside the lung, leading to the notion that COPD is a systemic disease. Currently, diagnosis of COPD relies on largely descriptive measures to enable classification, such as symptoms and lung function. Here the limitations of existing diagnostic strategies of COPD are discussed and systems biology approaches to diagnosis that build upon current molecular knowledge of the disease are described. These approaches rely on new 'label-free' sensing technologies, such as high-throughput surface plasmon resonance (SPR), that we also describe. PMID:19386108
Lepora, Nathan F; Blomeley, Craig P; Hoyland, Darren; Bracci, Enrico; Overton, Paul G; Gurney, Kevin
2011-11-01
The study of active and passive neuronal dynamics usually relies on a sophisticated array of electrophysiological, staining and pharmacological techniques. We describe here a simple complementary method that recovers many findings of these more complex methods but relies only on a basic patch-clamp recording approach. Somatic short and long current pulses were applied in vitro to striatal medium spiny (MS) and fast spiking (FS) neurons from juvenile rats. The passive dynamics were quantified by fitting two-compartment models to the short current pulse data. Lumped conductances for the active dynamics were then found by compensating this fitted passive dynamics within the current-voltage relationship from the long current pulse data. These estimated passive and active properties were consistent with previous more complex estimations of the neuron properties, supporting the approach. Relationships within the MS and FS neuron types were also evident, including a graduation of MS neuron properties consistent with recent findings about D1 and D2 dopamine receptor expression. Application of the method to simulated neuron data supported the hypothesis that it gives reasonable estimates of membrane properties and gross morphology. Therefore detailed information about the biophysics can be gained from this simple approach, which is useful for both classification of neuron type and biophysical modelling. Furthermore, because these methods rely upon no manipulations to the cell other than patch clamping, they are ideally suited to in vivo electrophysiology. © 2011 The Authors. European Journal of Neuroscience © 2011 Federation of European Neuroscience Societies and Blackwell Publishing Ltd.
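To make the passive-fit step concrete, a two-compartment passive response to a short current pulse decays as a sum of two exponentials. The sketch below (synthetic data and parameter values, not the paper's recordings) fits such a double exponential to a voltage transient with a standard least-squares routine.

```python
import numpy as np
from scipy.optimize import curve_fit

def double_exp(t, a1, tau1, a2, tau2):
    """Passive two-compartment decay: sum of a slow and a fast exponential."""
    return a1 * np.exp(-t / tau1) + a2 * np.exp(-t / tau2)

# hypothetical voltage transient after a short somatic current pulse (mV, ms)
t = np.linspace(0, 60, 600)
rng = np.random.default_rng(0)
v = double_exp(t, a1=6.0, tau1=12.0, a2=3.0, tau2=1.5) + rng.normal(0, 0.05, t.size)

p0 = (5.0, 10.0, 2.0, 1.0)                       # rough initial guesses
(a1, tau1, a2, tau2), _ = curve_fit(double_exp, t, v, p0=p0)
print(f"slow: {a1:.2f} mV, tau {tau1:.1f} ms | fast: {a2:.2f} mV, tau {tau2:.1f} ms")
```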
Alternatives for Jet Engine Control
NASA Technical Reports Server (NTRS)
Leake, R. J.; Sain, M. K.
1976-01-01
Approaches are developed as alternatives to current design methods which rely heavily on linear quadratic and Riccati equation methods. The main alternatives are discussed in two broad categories, local multivariable frequency domain methods and global nonlinear optimal methods.
In-Situ Transfer Standard and Coincident-View Intercomparisons for Sensor Cross-Calibration
NASA Technical Reports Server (NTRS)
Thome, Kurt; McCorkel, Joel; Czapla-Myers, Jeff
2013-01-01
There exist numerous methods for accomplishing on-orbit calibration. Methods include the reflectance-based approach relying on measurements of surface and atmospheric properties at the time of a sensor overpass as well as invariant scene approaches relying on knowledge of the temporal characteristics of the site. The current work examines typical cross-calibration methods and discusses the expected uncertainties of the methods. Data from the Advanced Land Imager (ALI), Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER), Enhanced Thematic Mapper Plus (ETM+), Moderate Resolution Imaging Spectroradiometer (MODIS), and Thematic Mapper (TM) are used to demonstrate the limits of relative sensor-to-sensor calibration as applied to current sensors, while Landsat-5 TM and Landsat-7 ETM+ are used to evaluate the limits of in situ site characterizations for SI-traceable cross calibration. The current work also examines the difficulties in trending results from cross-calibration approaches, taking into account sampling issues, site-to-site variability, and accuracy of the method. Special attention is given to the differences caused in the cross-comparison of sensors in radiance space as opposed to reflectance space. The results show that cross calibrations with absolute uncertainties less than 1.5 percent (1 sigma) are currently achievable even for sensors without coincident views.
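The radiance-space versus reflectance-space distinction comes down to normalizing the measured radiance by the band solar irradiance, the Earth-Sun distance, and the solar zenith angle. The sketch below (all numbers hypothetical) applies the standard top-of-atmosphere reflectance conversion and contrasts the two cross-calibration ratios for a pair of near-coincident observations.

```python
import numpy as np

def toa_reflectance(radiance, esun, d_au, sza_deg):
    """Convert at-sensor radiance (W m-2 sr-1 um-1) to top-of-atmosphere reflectance."""
    return np.pi * radiance * d_au**2 / (esun * np.cos(np.radians(sza_deg)))

# hypothetical near-coincident observations of the same site by two sensors
L_a, esun_a, sza_a = 78.3, 1536.0, 32.0      # sensor A band radiance, solar irradiance, solar zenith
L_b, esun_b, sza_b = 80.1, 1562.0, 34.5      # sensor B band values
d = 0.9983                                    # Earth-Sun distance (AU)

ratio_radiance = L_a / L_b
ratio_reflectance = (toa_reflectance(L_a, esun_a, d, sza_a) /
                     toa_reflectance(L_b, esun_b, d, sza_b))
print(f"radiance-space ratio    {ratio_radiance:.3f}")
print(f"reflectance-space ratio {ratio_reflectance:.3f}")
```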
Microfluidics-based, time-resolved mechanical phenotyping of cells using high-speed imaging
NASA Astrophysics Data System (ADS)
Belotti, Yuri; Conneely, Michael; Huang, Tianjun; McKenna, Stephen; Nabi, Ghulam; McGloin, David
2017-07-01
We demonstrate a single channel hydrodynamic stretching microfluidic device that relies on high-speed imaging to allow repeated dynamic cell deformation measurements. Experiments on prostate cancer cells suggest richer data than current approaches.
Development of a hazard-based method for evaluating the fire safety of passenger trains
DOT National Transportation Integrated Search
1999-01-01
The fire safety of U.S. passenger rail trains currently is addressed through small-scale flammability and smoke emission tests and performance criteria promulgated by the Federal Railroad Administration (FRA). The FRA approach relies heavily on test ...
Sustainable approaches to control postharvest diseases of apples
USDA-ARS?s Scientific Manuscript database
Long term storage of apples faces challenges in maintaining fruit quality and reducing losses from postharvest diseases. Currently, the apple industry relies mainly on synthetic fungicides to control postharvest decays. However, the limitations to fungicides such as the development of resistance i...
ERIC Educational Resources Information Center
Claes, Ellen; Hooghe, Marc
2017-01-01
Citizenship education has evolved substantially in recent decades, with a rapid proliferation of education forms and approaches. The currently available evaluation studies, however, do not allow us to determine what kind of approach can be considered as a best practice for schools and education systems. In this article, we rely on the results of a…
USE OF MACROACTIVITY APPROACH TO ASSESS DERMAL EXPOSURE
Currently, data on children's exposures and activities are very limited and insufficient to support quantitative assessments that do not rely heavily on major default assumptions as substitutes for missing information (Cohen Hubal et al. 2000a, b). Cohen Hubal et al. (2000a, b...
DOT National Transportation Integrated Search
2015-06-01
Diverse states like Virginia, with a mix of urban, suburban, and rural environments and transportation systems, cannot rely on a single approach to increasing transportation sustainability, but require an understanding of what has worked and what mig...
Federal and state agencies responsible for protecting water quality rely mainly on statistically-based methods to assess and manage risks to the nation's streams, lakes and estuaries. Although statistical approaches provide valuable information on current trends in water quality...
Outlook: directed development: catalysing a global biotech industry.
Sun, Anthony; Perkins, Tom
2005-09-01
Governments are increasingly relying on directed development tools or proactive public-policy approaches to stimulate scientific and economic development for their biotechnology industries. This article will discuss the four main tools of directed development in biotechnology and the lessons learned from current global efforts utilizing these tools.
Designing a Pedagogical Model for Web Engineering Education: An Evolutionary Perspective
ERIC Educational Resources Information Center
Hadjerrouit, Said
2005-01-01
In contrast to software engineering, which relies on relatively well established development approaches, there is a lack of a proven methodology that guides Web engineers in building reliable and effective Web-based systems. Currently, Web engineering lacks process models, architectures, suitable techniques and methods, quality assurance, and a…
Evaluating a push-pull strategy for management of Drosophila suzukii Matsumura in red raspberry
USDA-ARS?s Scientific Manuscript database
Drosophila suzukii Matsumura is a serious pest of small fruits and cherries that lays its eggs in ripe and ripening fruit. Current management strategies rely on an unsustainable schedule of foliar applications of chemical insecticides. Alternative approaches to suppressing oviposition are under inv...
A Condition Based Maintenance Approach to Forecasting B-1 Aircraft Parts
2017-03-23
Problem Statement ...aimed at making the USAF aware of CBM methods, and recommending which techniques to consider for implementation. The USAF relies on... problem, this research will seek to highlight common CBM forecasting methods that are well established and evaluate their suitability with current USAF
Wilderness solitude: Beyond the social-spatial perspective
Steven J. Hollenhorst; Christopher D. Jones
2001-01-01
The current scholarly and management approach to wilderness solitude has relied on substitute measures such as crowding and privacy to measure solitude. Lackluster findings have been only partially explained by additional social-spatial factors such as encounter norms, displacement, product shift, and rationalization. Missing from the discussion has been an exploration...
Adoptive cell therapy for sarcoma
Mata, Melinda; Gottschalk, Stephen
2015-01-01
Current therapy for sarcomas, though effective in treating local disease, is often ineffective for patients with recurrent or metastatic disease. To improve outcomes, novel approaches are needed and cell therapy has the potential to meet this need since it does not rely on the cytotoxic mechanisms of conventional therapies. The recent successes of T-cell therapies for hematological malignancies have led to renewed interest in exploring cell therapies for solid tumors such as sarcomas. In this review, we will discuss current cell therapies for sarcoma with special emphasis on genetic approaches to improve the effector function of adoptively transferred cells. PMID:25572477
Mexico’s Drug War and Its Unintended Regional Consequences
2013-03-01
multiple approaches are designed to solve the problem. Analysis of the current strategic environment, relying on the environmental assessment model...current environmental assessment, this paper will provide a brief description of a more desired environment and also a problem statement that depicts...the 1980s the U.S. focused its counter drug efforts in Peru and Bolivia, then the world’s leaders in coca leaf supply. In the meantime, Colombian
Deep Learning for Brain MRI Segmentation: State of the Art and Future Directions.
Akkus, Zeynettin; Galimzianova, Alfiia; Hoogi, Assaf; Rubin, Daniel L; Erickson, Bradley J
2017-08-01
Quantitative analysis of brain MRI is routine for many neurological diseases and conditions and relies on accurate segmentation of structures of interest. Deep learning-based segmentation approaches for brain MRI are gaining interest due to their self-learning and generalization ability over large amounts of data. As the deep learning architectures are becoming more mature, they gradually outperform previous state-of-the-art classical machine learning algorithms. This review aims to provide an overview of current deep learning-based segmentation approaches for quantitative brain MRI. First we review the current deep learning architectures used for segmentation of anatomical brain structures and brain lesions. Next, the performance, speed, and properties of deep learning approaches are summarized and discussed. Finally, we provide a critical assessment of the current state and identify likely future developments and trends.
Formula Funding, the Delaware Study, and the University of North Carolina
ERIC Educational Resources Information Center
Carrigan, Sarah D.
2008-01-01
Public higher education has relied on a variety of funding structures since the 1950s. Layzell (2007) describes five general approaches in contemporary use in the United States. "Incremental (baseline) budgeting" uses the current year budget as the base and then makes adjustments to account for expected changes in activities, revenues,…
Learning Progressions as Evolving Tools in Joint Enterprises for Educational Improvement
ERIC Educational Resources Information Center
Penuel, William R.
2015-01-01
In their article, "Using Learning Progressions to Design Vertical Scales that Support Coherent Inferences about Student Growth," Briggs and Peck (this issue) argue that an important goal of assessment should be "to support coherent and actionable inferences of growth." They suggest that current approaches to test design rely on…
Using Oral Exams to Assess Communication Skills in Business Courses
ERIC Educational Resources Information Center
Burke-Smalley, Lisa A.
2014-01-01
Business, like many other fields in higher education, continues to rely largely on conventional testing methods for assessing student learning. In the current article, another evaluation approach--the oral exam--is examined as a means for building and evaluating the professional communication and oral dialogue skills needed and utilized by…
USDA-ARS?s Scientific Manuscript database
Magnaporthe oryzae, the rice blast pathogen, causes significant annual yield loss of rice worldwide. Currently, the most effective disease control approach is deployment of host resistance through introduction of resistance (R) genes into elite cultivars. The function of each R gene relies on the sp...
Update on the DNT In Vitro Alternative Methods Project at the USEPA
Current approaches to toxicity testing rely heavily on the use of animals, can cost millions of dollars and can take years to complete for a single chemical. To implement the predictive toxicity testing envisioned in the NAS report on Toxicity Testing in the 21st century, rapid a...
Social Networking Sites, Literacy, and the Authentic Identity Problem
ERIC Educational Resources Information Center
Kimmons, Royce
2014-01-01
Current interest in social media for educational purposes has led many to consider the importance of literacy development in online spaces (e.g., new media literacies, digital literacies, etc.). Relying heavily upon New Literacy Studies (NLS) as a base, these approaches treat literacy expansively to include socio-cultural factors beyond mere skill…
Don, Rob; Ioset, Jean-Robert
2014-01-01
The Drugs for Neglected Diseases initiative (DNDi) has defined and implemented an early discovery strategy over the last few years, in fitting with its virtual R&D business model. This strategy relies on a medium- to high-throughput phenotypic assay platform to expedite the screening of compound libraries accessed through its collaborations with partners from the pharmaceutical industry. We review the pragmatic approaches used to select compound libraries for screening against kinetoplastids, taking into account screening capacity. The advantages, limitations and current achievements in identifying new quality series for further development into preclinical candidates are critically discussed, together with attractive new approaches currently under investigation.
Developing health care workforces for uncertain futures.
Gorman, Des
2015-04-01
Conventional approaches to health care workforce planning are notoriously unreliable. In part, this is due to the uncertainty of the future health milieu. An approach to health care workforce planning that accommodates this uncertainty is not only possible but can also generate intelligence on which planning and consequent development can be reliably based. Drawing on the experience of Health Workforce New Zealand, the author outlines some of the approaches being used in New Zealand. Instead of relying simply on health care data, which provides a picture of current circumstances in health systems, the author argues that workforce planning should rely on health care intelligence--looking beyond the numbers to build understanding of how to achieve desired outcomes. As health care systems throughout the world respond to challenges such as reform efforts, aging populations of patients and providers, and maldistribution of physicians (to name a few), New Zealand's experience may offer a model for rethinking workforce planning to truly meet health care needs.
A Data Augmentation Approach to Short Text Classification
ERIC Educational Resources Information Center
Rosario, Ryan Robert
2017-01-01
Text classification typically performs best with large training sets, but short texts are very common on the World Wide Web. Can we use resampling and data augmentation to construct larger texts using similar terms? Several current methods exist for working with short text that rely on using external data and contexts, or workarounds. Our focus is…
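As a minimal illustration of the resampling idea described above (not the author's actual method), the sketch below builds augmented variants of a short text by randomly swapping words for similar terms drawn from a hand-built lookup table; in practice the table would come from word embeddings or external corpora.

```python
import random

# hypothetical similar-term lookup; a real system would derive this from embeddings or a thesaurus
SIMILAR = {
    "great": ["excellent", "fantastic"],
    "bad":   ["poor", "terrible"],
    "phone": ["handset", "device"],
    "slow":  ["sluggish", "laggy"],
}

def augment(text, n_variants=3, p_swap=0.5, seed=0):
    """Create extra training examples by swapping words for similar terms."""
    rng = random.Random(seed)
    tokens = text.split()
    variants = []
    for _ in range(n_variants):
        new = [rng.choice(SIMILAR[t]) if t in SIMILAR and rng.random() < p_swap else t
               for t in tokens]
        variants.append(" ".join(new))
    return variants

print(augment("great phone but slow screen"))
```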
A Communication Model for Teaching a Course in Mass Media and Society.
ERIC Educational Resources Information Center
Crumley, Wilma; Stricklin, Michael
Many professors of mass media and society courses have relied on a teaching model implying that students are sponges soaking up information. A more appropriate model invites concern with an active audience, transaction, the interpersonal mass media mix, a general systems approach, and process and change--in other words, utilization of current and…
Online Learner Satisfaction and Collaborative Learning: Evidence from Saudi Arabia
ERIC Educational Resources Information Center
Alkhalaf, Salem; Nguyen, Jeremy; Nguyen, Anne; Drew, Steve
2013-01-01
Despite the considerable potential for e-learning to improve learning outcomes, particularly for female students and students who need to rely on distance learning, feedback from current users of e-learning systems in the Kingdom of Saudi Arabia (KSA) suggests a relatively low level of satisfaction. This study adopts a mixed-methods approach in…
ERIC Educational Resources Information Center
Van den Broeck, Anja; Lens, Willy; De Witte, Hans; Van Coillie, Hermina
2013-01-01
The current study compares the quantitative and the qualitative viewpoints on work motivation by relying on Self-Determination Theory's differentiation between autonomous and controlled motivation. Specifically, we employed a person-centered approach to identify workers' naturally occurring motivational profiles and compared them in terms of…
2009-01-01
being done, in part, in response to Executive Order 13327, which mandates a pragmatic and consistent approach to Federal agency management of real...move forward. The U.S. Army Research and Development Center, Construction Engineering Research Laboratory was tasked with surveying a number of...assessment in use within USACE. (All rely on a deficiency-based approach, i.e., deviations from standards or from known benchmarks, to inspection.); (2
Renaissance of protein crystallization and precipitation in biopharmaceuticals purification.
Dos Santos, Raquel; Carvalho, Ana Luísa; Roque, A Cecília A
The current chromatographic approaches used in protein purification are not keeping pace with the increasing biopharmaceutical market demand. With the upstream improvements, the bottleneck shifted towards the downstream process. New approaches rely on Anything But Chromatography methodologies and on revisiting former techniques with a bioprocess perspective. Protein crystallization and precipitation methods are already implemented in the downstream process of diverse therapeutic biological macromolecules, overcoming the current chromatographic bottlenecks. Promising work is being developed in order to implement crystallization and precipitation in the purification pipeline of high value therapeutic molecules. This review focuses on the role of these two methodologies in current industrial purification processes, and highlights their potential implementation in the purification pipeline of high value therapeutic molecules, overcoming chromatographic holdups. Copyright © 2016 Elsevier Inc. All rights reserved.
Towards large-scale, human-based, mesoscopic neurotechnologies.
Chang, Edward F
2015-04-08
Direct human brain recordings have transformed the scope of neuroscience in the past decade. Progress has relied upon currently available neurophysiological approaches in the context of patients undergoing neurosurgical procedures for medical treatment. While this setting has provided precious opportunities for scientific research, it also has presented significant constraints on the development of new neurotechnologies. A major challenge now is how to achieve high-resolution spatiotemporal neural recordings at a large scale. By narrowing the gap between current approaches, new directions tailored to the mesoscopic (intermediate) scale of resolution may overcome the barriers towards safe and reliable human-based neurotechnology development, with major implications for advancing both basic research and clinical translation. Copyright © 2015 Elsevier Inc. All rights reserved.
Intelligent Automation Approach for Improving Pilot Situational Awareness
NASA Technical Reports Server (NTRS)
Spirkovska, Lilly
2004-01-01
Automation in the aviation domain has been increasing for the past two decades. Pilot reaction to automation varies from highly favorable to highly critical depending on both the pilot's background and how effectively the automation is implemented. We describe a user-centered approach for automation that considers the pilot's tasks and his needs related to accomplishing those tasks. Further, we augment rather than replace how the pilot currently fulfills his goals, relying on redundant displays that offer the pilot an opportunity to build trust in the automation. Our prototype system automates the interpretation of hydraulic system faults of the UH-60 helicopter. We describe the problem with the current system and our methodology for resolving it.
Xu, Wei
2014-01-01
This paper first discusses the major inefficiencies faced in current human factors and ergonomics (HFE) approaches: (1) delivering an optimal end-to-end user experience (UX) to users of a solution across its solution lifecycle stages; (2) strategically influencing the product business and technology capability roadmaps from a UX perspective and (3) proactively identifying new market opportunities and influencing the platform architecture capabilities on which the UX of end products relies. In response to these challenges, three case studies are presented to demonstrate how enhanced ergonomics design approaches have effectively addressed the challenges faced in current HFE approaches. Then, the enhanced ergonomics design approaches are conceptualised by a user-experience ecosystem (UXE) framework, from a UX ecosystem perspective. Finally, evidence supporting the UXE, the advantage and the formalised process for executing UXE and methodological considerations are discussed. Practitioner Summary: This paper presents enhanced ergonomics approaches to product design via three case studies to effectively address current HFE challenges by leveraging a systematic end-to-end UX approach, UX roadmaps and emerging UX associated with prioritised user needs and usages. Thus, HFE professionals can be more strategic, creative and influential.
An Investigation of Automatic Change Detection for Topographic Map Updating
NASA Astrophysics Data System (ADS)
Duncan, P.; Smit, J.
2012-08-01
Changes to the landscape are constantly occurring and it is essential for geospatial and mapping organisations that these changes are regularly detected and captured, so that map databases can be updated to reflect the current status of the landscape. The Chief Directorate of National Geospatial Information (CD: NGI), South Africa's national mapping agency, currently relies on manual methods of detecting changes and capturing these changes. These manual methods are time consuming and labour intensive, and rely on the skills and interpretation of the operator. It is therefore necessary to move towards more automated methods in the production process at CD: NGI. The aim of this research is to investigate a methodology for automatic or semi-automatic change detection for the purpose of updating topographic databases. The method investigated for detecting changes is through image classification as well as spatial analysis and is focussed on urban landscapes. The major data input into this study is high resolution aerial imagery and existing topographic vector data. Initial results indicate that traditional pixel-based image classification approaches are unsatisfactory for large scale land-use mapping and that object-oriented approaches hold more promise. Even in the case of object-oriented image classification, generalization of techniques on a broad scale has provided inconsistent results. A solution may lie with a hybrid approach of pixel and object-oriented techniques.
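To make the pixel-based baseline concrete, here is a minimal sketch (synthetic rasters, not CD: NGI's production workflow) that classifies two co-registered single-band images with unsupervised k-means and flags pixels whose class label changes between epochs; an object-oriented workflow would instead segment the image into objects before classification.

```python
import numpy as np
from sklearn.cluster import KMeans

def classify(image, n_classes=4, seed=0):
    """Unsupervised pixel-based classification of a single-band raster."""
    km = KMeans(n_clusters=n_classes, n_init=10, random_state=seed)
    labels = km.fit_predict(image.reshape(-1, 1))
    # reorder labels by cluster mean so classes are comparable across dates
    order = np.argsort(km.cluster_centers_.ravel())
    remap = np.empty_like(order)
    remap[order] = np.arange(n_classes)
    return remap[labels].reshape(image.shape)

rng = np.random.default_rng(1)
epoch1 = rng.normal(100, 20, (64, 64))        # hypothetical first-epoch raster
epoch2 = epoch1.copy()
epoch2[20:30, 20:30] += 60                    # hypothetical new built-up area

change_mask = classify(epoch1) != classify(epoch2)
print(f"changed pixels: {change_mask.sum()} of {change_mask.size}")
```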
Coutinho-Abreu, Iliano V.; Zhu, Kun Yan; Ramalho-Ortigao, Marcelo
2009-01-01
Insect-borne diseases cause significant human morbidity and mortality. Current control and preventive methods against vector-borne diseases rely mainly on insecticides. The emergence of insecticide resistance in many disease vectors highlights the necessity to develop new strategies to control these insects. Vector transgenesis and paratransgenesis are novel strategies, currently being developed, that aim to reduce insect vectorial capacity or to eliminate transmission of pathogens such as Plasmodium sp., Trypanosoma sp., and Dengue virus. Vector transgenesis relies on direct genetic manipulation of disease vectors making them incapable of functioning as vectors of a given pathogen. Paratransgenesis focuses on utilizing genetically modified insect symbionts to express molecules within the vector that are deleterious to pathogens they transmit. Despite the many successes achieved in developing such techniques in the last several years, many significant barriers remain and need to be overcome before any of these approaches become a reality. Here, we highlight the current status of these strategies, pointing out advantages and constraints, and also explore issues that need to be resolved before the establishment of transgenesis and paratransgenesis as tools to prevent vector-borne diseases. PMID:19819346
Kocot, Kevin M; Citarella, Mathew R; Moroz, Leonid L; Halanych, Kenneth M
2013-01-01
Molecular phylogenetics relies on accurate identification of orthologous sequences among the taxa of interest. Most orthology inference programs available for use in phylogenomics rely on small sets of pre-defined orthologs from model organisms or phenetic approaches such as all-versus-all sequence comparisons followed by Markov graph-based clustering. Such approaches have high sensitivity but may erroneously include paralogous sequences. We developed PhyloTreePruner, a software utility that uses a phylogenetic approach to refine orthology inferences made using phenetic methods. PhyloTreePruner checks single-gene trees for evidence of paralogy and generates a new alignment for each group containing only sequences inferred to be orthologs. Importantly, PhyloTreePruner takes into account support values on the tree and avoids unnecessarily deleting sequences in cases where a weakly supported tree topology incorrectly indicates paralogy. A test of PhyloTreePruner on a dataset generated from 11 completely sequenced arthropod genomes identified 2,027 orthologous groups sampled for all taxa. Phylogenetic analysis of the concatenated supermatrix yielded a generally well-supported topology that was consistent with the current understanding of arthropod phylogeny. PhyloTreePruner is freely available from http://sourceforge.net/projects/phylotreepruner/.
NASA Technical Reports Server (NTRS)
Barghouty, A. F.
2014-01-01
Accurate estimates of electron-capture cross sections at energies relevant to the modeling of the transport, acceleration, and interaction of energetic neutral atoms (ENA) in space (approximately a few MeV per nucleon), and especially for multi-electron ions, must rely on detailed, but computationally expensive, quantum-mechanical descriptions of the collision process. Kuang's semi-classical approach is an elegant and efficient way to arrive at these estimates. Motivated by ENA modeling efforts for space applications, we shall briefly present this approach along with sample applications and report on current progress.
The manager's role in marketing. The Health Care Group.
1991-06-01
With the impending reductions in physician reimbursements, the key to a practice's ongoing vitality will be its ability to increase volume and gain greater market share. Traditionally, most doctors have relied on word-of-mouth referrals from current patients and physicians to bring in new patients. In today's health care environment, however, this approach to practice building is not enough to assure growth.
Manipulating perceptual parameters in a continuous performance task.
Shalev, Nir; Humphreys, Glyn; Demeyere, Nele
2018-02-01
Sustained attention (SA) is among the most studied faculties of human cognition, and thought to be crucial for many aspects of behavior. Measuring SA often relies on performance on a continuous, low-demanding task. Such continuous performance tasks (CPTs) have many variations, and sustained attention is typically estimated based on variability in reaction times. While relying on reaction times may be useful in some cases, it can pose a challenge when working with clinical populations. To increase interpersonal variability in task parameters that do not rely on speed, researchers have increased demands for memory and response inhibition. These approaches, however, may be confounded when used to assess populations that suffer from multiple cognitive deficits. In the current study, we propose a new approach for increasing task variability by increasing the attentional demands. In order to do so, we created a new variation of a CPT - a masked version, where inattention is more likely to cause misidentifying a target. After establishing that masking indeed decreases target detection, we further investigated which task parameter may influence response biases. To do so, we contrasted two versions of the CPT with different target/distractor ratio. We then established how perceptual parameters can be controlled independently in a CPT. Following the experimental manipulations, we tested the MCCPT with aging controls and chronic stroke patients to assure the task can be used with target populations. The results confirm the MCCPT as a task providing high sensitivity without relying on reaction speed, and feasible for patients.
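One accuracy-based way to quantify sensitivity without reaction times, consistent with the signal-detection framing of a masked CPT (though not necessarily the exact index used in the study), is d' computed from hits and false alarms. The sketch below uses a log-linear correction to avoid infinite z-scores; the counts are hypothetical.

```python
from scipy.stats import norm

def d_prime(hits, misses, false_alarms, correct_rejections):
    """Accuracy-based sensitivity index; the +0.5/+1 correction avoids infinite z-scores."""
    hit_rate = (hits + 0.5) / (hits + misses + 1)
    fa_rate = (false_alarms + 0.5) / (false_alarms + correct_rejections + 1)
    return norm.ppf(hit_rate) - norm.ppf(fa_rate)

# hypothetical counts from one masked CPT block
print(f"d' = {d_prime(hits=42, misses=8, false_alarms=12, correct_rejections=138):.2f}")
```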
van Rooij, Antonius J; Van Looy, Jan; Billieux, Joël
2017-07-01
Some people have serious problems controlling their Internet and video game use. The DSM-5 now includes a proposal for 'Internet Gaming Disorder' (IGD) as a condition in need of further study. Various studies aim to validate the proposed diagnostic criteria for IGD and multiple new scales have been introduced that cover the suggested criteria. Using a structured approach, we demonstrate that IGD might be better interpreted as a formative construct, as opposed to the current practice of conceptualizing it as a reflective construct. Incorrectly approaching a formative construct as a reflective one causes serious problems in scale development, including: (i) incorrect reliance on item-to-total scale correlation to exclude items and incorrectly relying on indices of inter-item reliability that do not fit the measurement model (e.g., Cronbach's α); (ii) incorrect interpretation of composite or mean scores that assume all items are equal in contributing value to a sum score; and (iii) biased estimation of model parameters in statistical models. We show that these issues are impacting current validation efforts through two recent examples. A reinterpretation of IGD as a formative construct has broad consequences for current validation efforts and provides opportunities to reanalyze existing data. We discuss three broad implications for current research: (i) composite latent constructs should be defined and used in models; (ii) item exclusion and selection should not rely on item-to-total scale correlations; and (iii) existing definitions of IGD should be enriched further. © 2016 The Authors. Psychiatry and Clinical Neurosciences © 2016 Japanese Society of Psychiatry and Neurology.
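For reference, the inter-item reliability index the authors argue is misapplied to formative constructs, Cronbach's α, summarizes covariation among items as below; the 4-item response matrix is hypothetical and serves only to show what the coefficient measures.

```python
import numpy as np

def cronbach_alpha(items):
    """items: respondents x items matrix of scores."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars.sum() / total_var)

# hypothetical responses to a 4-item scale (rows = respondents)
scores = np.array([[3, 4, 3, 4],
                   [1, 2, 1, 1],
                   [4, 5, 4, 4],
                   [2, 2, 3, 2],
                   [5, 4, 5, 5]])
print(f"Cronbach's alpha = {cronbach_alpha(scores):.2f}")
```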
P300 brain computer interface: current challenges and emerging trends
Fazel-Rezai, Reza; Allison, Brendan Z.; Guger, Christoph; Sellers, Eric W.; Kleih, Sonja C.; Kübler, Andrea
2012-01-01
A brain-computer interface (BCI) enables communication without movement based on brain signals measured with electroencephalography (EEG). BCIs usually rely on one of three types of signals: the P300 and other components of the event-related potential (ERP), steady state visual evoked potential (SSVEP), or event related desynchronization (ERD). Although P300 BCIs were introduced over twenty years ago, the past few years have seen a strong increase in P300 BCI research. This closed-loop BCI approach relies on the P300 and other components of the ERP, based on an oddball paradigm presented to the subject. In this paper, we overview the current status of P300 BCI technology, and then discuss new directions: paradigms for eliciting P300s; signal processing methods; applications; and hybrid BCIs. We conclude that P300 BCIs are quite promising, as several emerging directions have not yet been fully explored and could lead to improvements in bit rate, reliability, usability, and flexibility. PMID:22822397
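A minimal sketch of the oddball/ERP-averaging logic underlying a P300 BCI (synthetic single-channel data, not a real BCI pipeline): epochs locked to rare target stimuli are averaged so the P300 stands out against averaged non-target epochs.

```python
import numpy as np

fs = 250                                        # sampling rate (Hz), hypothetical
rng = np.random.default_rng(0)
n_samples = fs * 60
eeg = rng.normal(0, 5, n_samples)               # one synthetic EEG channel (uV)

# oddball paradigm: rare targets get a simulated P300 peaking ~300 ms after onset
onsets = np.arange(fs, n_samples - fs, int(0.5 * fs))
is_target = rng.random(onsets.size) < 0.2
p300 = 8 * np.exp(-0.5 * ((np.arange(fs) / fs - 0.3) / 0.05) ** 2)
for onset in onsets[is_target]:
    eeg[onset:onset + fs] += p300

def average_erp(signal, events, window=fs):
    """Average 1-second epochs locked to the given event samples."""
    return np.stack([signal[e:e + window] for e in events]).mean(axis=0)

target_erp = average_erp(eeg, onsets[is_target])
nontarget_erp = average_erp(eeg, onsets[~is_target])
peak_ms = 1000 * target_erp.argmax() / fs
print(f"target-vs-nontarget peak difference {target_erp.max() - nontarget_erp.max():.1f} uV "
      f"at ~{peak_ms:.0f} ms")
```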
Crystal S. Stonesifer; Dave Calkin; Matthew P. Thompson; Keith D. Stockmann
2016-01-01
Large airtanker use is widespread in wildfire suppression in the United States. The current approach to nationally dispatching the fleet of federal contract airtankers relies on filling requests for airtankers to achieve suppression objectives identified by fire managers at the incident level. In general, demand is met if resources are available, and the...
Passive versus active hazard detection and avoidance systems
NASA Astrophysics Data System (ADS)
Neveu, D.; Mercier, G.; Hamel, J.-F.; Simard Bilodeau, V.; Woicke, S.; Alger, M.; Beaudette, D.
2015-06-01
Upcoming planetary exploration missions will require advanced guidance, navigation and control technologies to reach landing sites with high precision and safety. Various technologies are currently in development to meet that goal. Some technologies rely on passive sensors and benefit from the low mass and power of such solutions while others rely on active sensors and benefit from an improved robustness and accuracy. This paper presents two different hazard detection and avoidance (HDA) system design approaches. The first architecture relies only on a camera as the passive HDA sensor while the second relies, in addition, on a Lidar as the active HDA sensor. Both options use in common an innovative hazard map fusion algorithm aiming at identifying the safest landing locations. This paper presents the simulation tools and reports the closed-loop software simulation results obtained using each design option. The paper also reports the Monte Carlo simulation campaign that was used to assess the robustness of each design option. The performance of each design option is compared against each other in terms of performance criteria such as percentage of success, mean distance to nearest hazard, etc. The applicability of each design option to planetary exploration missions is also discussed.
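As a rough illustration of the hazard-map fusion step common to both architectures (the maps and weights below are invented, not the paper's algorithm), the sketch combines a camera-derived and a lidar-derived hazard map into a single map and selects the safest cell.

```python
import numpy as np

def fuse_hazard_maps(maps, weights):
    """Weighted fusion of per-sensor hazard maps (0 = safe, 1 = hazardous)."""
    stacked = np.stack(maps)
    w = np.asarray(weights, dtype=float)[:, None, None]
    return (w * stacked).sum(axis=0) / w.sum()

rng = np.random.default_rng(0)
camera_map = rng.random((50, 50))               # hypothetical shadow/texture-based hazard scores
lidar_map = rng.random((50, 50))                # hypothetical slope/roughness-based hazard scores

fused = fuse_hazard_maps([camera_map, lidar_map], weights=[0.4, 0.6])
row, col = np.unravel_index(fused.argmin(), fused.shape)
print(f"safest landing cell: ({row}, {col}), fused hazard {fused[row, col]:.2f}")
```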
Fixism and conservation science.
Robert, Alexandre; Fontaine, Colin; Veron, Simon; Monnet, Anne-Christine; Legrand, Marine; Clavel, Joanne; Chantepie, Stéphane; Couvet, Denis; Ducarme, Frédéric; Fontaine, Benoît; Jiguet, Frédéric; le Viol, Isabelle; Rolland, Jonathan; Sarrazin, François; Teplitsky, Céline; Mouchet, Maud
2017-08-01
The field of biodiversity conservation has recently been criticized as relying on a fixist view of the living world in which existing species constitute at the same time targets of conservation efforts and static states of reference, which is in apparent disagreement with evolutionary dynamics. We reviewed the prominent role of species as conservation units and the common benchmark approach to conservation that aims to use past biodiversity as a reference to conserve current biodiversity. We found that the species approach is justified by the discrepancy between the time scales of macroevolution and human influence and that biodiversity benchmarks are based on reference processes rather than fixed reference states. Overall, we argue that the ethical and theoretical frameworks underlying conservation research are based on macroevolutionary processes, such as extinction dynamics. Current species, phylogenetic, community, and functional conservation approaches constitute short-term responses to short-term human effects on these reference processes, and these approaches are consistent with evolutionary principles. © 2016 Society for Conservation Biology.
NASA Astrophysics Data System (ADS)
Nony, Laurent; Bocquet, Franck; Para, Franck; Loppacher, Christian
2016-09-01
A combined experimental and theoretical approach to the coupling between frequency-shift (Δ f ) , damping, and tunneling current (It) in combined noncontact atomic force microscopy/scanning tunneling microscopy using quartz tuning forks (QTF)-based probes is reported. When brought into oscillating tunneling conditions, the tip located at the QTF prong's end radiates an electromagnetic field which couples to the QTF prong motion via its piezoelectric tensor and loads its electrodes by induction. Our approach explains how those It-related effects ultimately modify the Δ f and the damping measurements. This paradigm to the origin of the coupling between It and the nc-AFM regular signals relies on both the intrinsic piezoelectric nature of the quartz constituting the QTF and its electrodes design.
Simultaneous confidence sets for several effective doses.
Tompsett, Daniel M; Biedermann, Stefanie; Liu, Wei
2018-04-03
Construction of simultaneous confidence sets for several effective doses currently relies on inverting the Scheffé type simultaneous confidence band, which is known to be conservative. We develop novel methodology to make the simultaneous coverage closer to its nominal level, for both two-sided and one-sided simultaneous confidence sets. Our approach is shown to be considerably less conservative than the current method, and is illustrated with an example on modeling the effect of smoking status and serum triglyceride level on the probability of the recurrence of a myocardial infarction. © 2018 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
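For context, the conservative baseline the abstract refers to, inverting a Scheffé-type simultaneous band for the linear predictor of a logistic dose-response model, can be sketched as follows. The coefficient estimates and covariance matrix are hypothetical, and the authors' improved method is not reproduced here.

```python
import numpy as np
from scipy.stats import chi2

# assume a fitted logistic model: logit P(x) = b0 + b1*x, with estimated covariance cov(b)
b = np.array([-3.2, 1.1])                       # hypothetical coefficient estimates
cov = np.array([[0.30, -0.08],
                [-0.08, 0.025]])                # hypothetical covariance matrix

crit = np.sqrt(chi2.ppf(0.95, df=2))            # Scheffé critical value for two parameters

def band(x):
    """Simultaneous band for the linear predictor over the dose grid x."""
    X = np.column_stack([np.ones_like(x), x])
    eta = X @ b
    se = np.sqrt(np.einsum('ij,jk,ik->i', X, cov, X))
    return eta - crit * se, eta + crit * se

def ed_confidence_set(p, grid):
    """Doses whose band covers logit(p): a conservative simultaneous set for ED_p."""
    target = np.log(p / (1 - p))
    lo, hi = band(grid)
    return grid[(lo <= target) & (target <= hi)]

grid = np.linspace(0, 8, 2001)
for p in (0.25, 0.50, 0.75):
    s = ed_confidence_set(p, grid)
    print(f"ED{int(100 * p)}: point {(np.log(p / (1 - p)) - b[0]) / b[1]:.2f}, "
          f"set [{s.min():.2f}, {s.max():.2f}]")
```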
Next generation DRM: cryptography or forensics?
NASA Astrophysics Data System (ADS)
Robert, Arnaud
2009-02-01
Current content protection systems rely primarily on applied cryptographic techniques but there is an increased use of forensic solutions in images, music and video distribution alike. The two approaches differ significantly, both in terms of technology and in terms of strategy, and thus it begs the question: will one approach take over in the long run, and if so which one? Discussing the evolution of both cryptographic and forensic solutions, we conclude that neither approach is ideal for all constituents, and that in the video space at least they will continue to co-exist for the foreseeable future - even if this may not be the case for other media types. We also analyze shortcomings of these approaches, and suggest that new solutions are necessary in this still emerging marketplace.
Postdoctoral Fellow | Center for Cancer Research
The lab is interested in understanding the regulation of RNA localization by cancer-associated proteins and the contribution of localized RNAs to tumor progression. The work relies on a variety of cell biological, microscopical and biochemical approaches in 2D and 3D cell culture systems. Some of the current projects aim to investigate the effect of the mechanical properties of the extracellular matrix on RNA localization, and the coupling between RNA localization and translation using single-molecule imaging approaches. This research program is funded by the NIH Intramural Research Program and is supported by state-of-the-art facilities on the NIH campus.
Health care quality: from data to accountability.
Darby, M
1998-08-01
The many audiences for information about the quality of health care have different and sometimes conflicting interests and priorities. This is reflected in the diversity of current efforts to use health care data to identify, measure, and demonstrate quality. The author surveys three of these approaches in depth: (1) the professional approach, which relies on the actions of private-sector accreditation groups, trade associations and health plans, hospitals, and other providers to assure quality; (2) the market-driven approach, which relies on the use of quality data by health care purchasers and consumers in choosing plans and providers; and (3) the public-sector approach, which relies on the regulatory, oversight, and purchasing actions of government at the federal, state, and local levels to assure quality. The author concludes that efforts to measure and report the quality of health care invariably confront a variety of technical and political issues. Several observers maintain that it is more important for participants in quality issues to reach consensus on the issues than to reach technical perfection in the way the data are handled. Important obstacles in the technical realm include inadequate investment in sufficiently sophisticated and compatible information systems and the fact that where such systems are in place, they generally cannot be linked. But efforts, both technical and legal, are under way to overcome these obstacles. Even so, some of the issues of health care quality will remain moving targets because of constant changes in the health care environment and in technology. The author closes with the hope that the various actors within the health care industry may coordinate their efforts in dealing with these issues.
Antimicrobial Nanomaterials Derived from Natural Products—A Review
Wang, Ji; Vermerris, Wilfred
2016-01-01
Modern medicine has relied heavily on the availability of effective antibiotics to manage infections and enable invasive surgery. With the emergence of antibiotic-resistant bacteria, novel approaches are necessary to prevent the formation of biofilms on sensitive surfaces such as medical implants. Advances in nanotechnology have resulted in novel materials and the ability to create novel surface topographies. This review article provides an overview of advances in the fabrication of antimicrobial nanomaterials that are derived from biological polymers or that rely on the incorporation of natural compounds with antimicrobial activity in nanofibers made from synthetic materials. The availability of these novel materials will contribute to ensuring that the current level of medical care can be maintained as more bacteria are expected to develop resistance against existing antibiotics. PMID:28773379
PaPrBaG: A machine learning approach for the detection of novel pathogens from NGS data
NASA Astrophysics Data System (ADS)
Deneke, Carlus; Rentzsch, Robert; Renard, Bernhard Y.
2017-01-01
The reliable detection of novel bacterial pathogens from next-generation sequencing data is a key challenge for microbial diagnostics. Current computational tools usually rely on sequence similarity and often fail to detect novel species when closely related genomes are unavailable or missing from the reference database. Here we present the machine learning based approach PaPrBaG (Pathogenicity Prediction for Bacterial Genomes). PaPrBaG overcomes genetic divergence by training on a wide range of species with known pathogenicity phenotype. To that end we compiled a comprehensive list of pathogenic and non-pathogenic bacteria with human host, using various genome metadata in conjunction with a rule-based protocol. A detailed comparative study reveals that PaPrBaG has several advantages over sequence similarity approaches. Most importantly, it always provides a prediction whereas other approaches discard a large number of sequencing reads with low similarity to currently known reference genomes. Furthermore, PaPrBaG remains reliable even at very low genomic coverages. Combining PaPrBaG with existing approaches further improves prediction results.
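A simplified sketch of the general strategy, reads → k-mer features → a classifier trained on genomes of known phenotype, is shown below. The feature choice, labels, and forest settings are invented stand-ins, not PaPrBaG's actual implementation, and the synthetic reads carry no real signal.

```python
from itertools import product
import numpy as np
from sklearn.ensemble import RandomForestClassifier

K = 3
KMERS = ["".join(p) for p in product("ACGT", repeat=K)]

def kmer_profile(read):
    """Relative k-mer frequencies of one sequencing read."""
    counts = dict.fromkeys(KMERS, 0)
    for i in range(len(read) - K + 1):
        kmer = read[i:i + K]
        if kmer in counts:
            counts[kmer] += 1
    total = max(sum(counts.values()), 1)
    return np.array([counts[k] / total for k in KMERS])

# synthetic stand-in training reads with labels (1 = pathogenic source, 0 = non-pathogenic)
rng = np.random.default_rng(0)
reads = ["".join(rng.choice(list("ACGT"), 150)) for _ in range(200)]
labels = rng.integers(0, 2, 200)

X = np.vstack([kmer_profile(r) for r in reads])
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, labels)
print("P(pathogenic) for a new read:", clf.predict_proba(kmer_profile(reads[0])[None, :])[0, 1])
```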
Multiparametric Breast MRI of Breast Cancer
Rahbar, Habib; Partridge, Savannah C.
2015-01-01
Breast MRI has increased in popularity over the past two decades due to evidence for its high sensitivity for cancer detection. Current clinical MRI approaches rely on the use of a dynamic contrast enhanced (DCE-MRI) acquisition that facilitates morphologic and semi-quantitative kinetic assessments of breast lesions. The use of more functional and quantitative parameters, such as pharmacokinetic features from high temporal resolution DCE-MRI, apparent diffusion coefficient (ADC) and intravoxel incoherent motion (IVIM) on diffusion weighted MRI, and choline concentrations on MR spectroscopy, hold promise to broaden the utility of MRI and improve its specificity. However, due to wide variations in approach among centers for measuring these parameters and the considerable technical challenges, robust multicenter data supporting their routine use is not yet available, limiting current applications of many of these tools to research purposes. PMID:26613883
[Recurrent chronic parotiditis in childhood: An update of the literature].
Donoso-Hofer, Francisca; Gutiérrez Díaz, Rodrigo; Ortiz Cárdenas, Rodrigo; Osorio Herrera, Gustavo; Landaeta Mendoza, Mirtha
2017-01-01
Recurrent childhood chronic parotiditis (RCCP) is a relevant pathology. Its diagnosis is mainly clinical, but it relies on imaging tests. The current treatment approach is diverse. The aim of this article is to update the clinical features, complementary tests, etiopathogenic models and therapeutic protocols of this disease. A bibliographic search was performed in PUBMED using the free terms and MESH terms: RCCP, recurrent parotiditis, chronic parotiditis and parotiditis. The filters used were human patients, up to 18 years old, with abstract. In SCIELO the free terms included were Parotiditis and chronic. Articles published in English, Spanish or Portuguese until 2017 were included. In PUBMED, 119 articles were found and 44 were included. The exclusion of the remaining articles was due to language, access to the article, or absence of a relationship between the article and the proposed review. In SCIELO, 6 articles were found, of which 5 were selected. Multidisciplinary assessment of patients with RCCP is considered the appropriate approach. Its diagnosis is clinical but it relies on imaging tests, such as echography and sialography. The current treatment approach is conservative, and the best available evidence supports the use of sialendoscopy with irrigation and administration of antibiotics and/or corticosteroids via the parotid duct. However, adequate results may also be obtained with intraglandular lavage using physiological solutions, without the need for a sialendoscope.
Font adaptive word indexing of modern printed documents.
Marinai, Simone; Marino, Emanuele; Soda, Giovanni
2006-08-01
We propose an approach for the word-level indexing of modern printed documents which are difficult to recognize using current OCR engines. By means of word-level indexing, it is possible to retrieve the position of words in a document, enabling queries involving proximity of terms. Web search engines implement this kind of indexing, allowing users to retrieve Web pages on the basis of their textual content. Nowadays, digital libraries hold collections of digitized documents that can be retrieved either by browsing the document images or relying on appropriate metadata assembled by domain experts. Word indexing tools would therefore increase the access to these collections. The proposed system is designed to index homogeneous document collections by automatically adapting to different languages and font styles without relying on OCR engines for character recognition. The approach is based on three main ideas: the use of Self Organizing Maps (SOM) to perform unsupervised character clustering, the definition of one suitable vector-based word representation whose size depends on the word aspect-ratio, and the run-time alignment of the query word with indexed words to deal with broken and touching characters. The most appropriate applications are for processing modern printed documents (17th to 19th centuries) where current OCR engines are less accurate. Our experimental analysis addresses six data sets containing documents ranging from books of the 17th century to contemporary journals.
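The character-clustering step relies on a Self-Organizing Map. The compact sketch below (random bitmaps stand in for character images, and the grid size, learning rate, and neighborhood schedule are illustrative choices, not the paper's settings) shows the unsupervised training loop and how glyphs are then assigned to map units.

```python
import numpy as np

def train_som(data, grid=(8, 8), epochs=20, lr0=0.5, sigma0=2.0, seed=0):
    """Minimal Self-Organizing Map; data rows are feature vectors (e.g. character bitmaps)."""
    rng = np.random.default_rng(seed)
    h, w = grid
    weights = rng.random((h * w, data.shape[1]))
    coords = np.array([(i, j) for i in range(h) for j in range(w)], dtype=float)
    n_steps = epochs * len(data)
    step = 0
    for _ in range(epochs):
        for x in rng.permutation(data):
            lr = lr0 * (1 - step / n_steps)                      # decaying learning rate
            sigma = sigma0 * (1 - step / n_steps) + 0.5          # shrinking neighborhood
            bmu = np.argmin(((weights - x) ** 2).sum(axis=1))    # best matching unit
            dist = ((coords - coords[bmu]) ** 2).sum(axis=1)
            neigh = np.exp(-dist / (2 * sigma ** 2))[:, None]
            weights += lr * neigh * (x - weights)
            step += 1
    return weights

# hypothetical 5x5 character bitmaps flattened to 25-dimensional vectors
rng = np.random.default_rng(1)
glyphs = (rng.random((300, 25)) > 0.6).astype(float)
som = train_som(glyphs)
clusters = np.argmin(((glyphs[:, None, :] - som[None, :, :]) ** 2).sum(-1), axis=1)
print("map unit assigned to first glyph:", clusters[0])
```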
The road less taken: modularization and waterways as a domestic disaster response mechanism.
Donahue, Donald A; Cunnion, Stephen O; Godwin, Evelyn A
2013-01-01
Preparedness scenarios project the need for significant healthcare surge capacity. Current planning draws heavily from the military model, leveraging deployable infrastructure to augment or replace extant capabilities. This approach would likely prove inadequate in a catastrophic disaster, as the military model relies on forewarning and an extended deployment cycle. Local equipping for surge capacity is prohibitively costly while movement of equipment can be subject to a single point of failure. Translational application of maritime logistical techniques and an ancient mode of transportation can provide a robust and customizable approach to disaster relief for greater than 90 percent of the American population.
Predicting ESI/MS Signal Change for Anions in Different Solvents.
Kruve, Anneli; Kaupmees, Karl
2017-05-02
LC/ESI/MS is a technique widely used for qualitative and quantitative analysis in various fields. However, quantification is currently possible only for compounds for which standard substances are available, as the ionization efficiencies of different compounds in the ESI source differ by orders of magnitude. In this paper we present an approach for quantitative LC/ESI/MS analysis without standard substances. This approach relies on accurately predicting the ionization efficiencies in the ESI source based on a model that uses physicochemical parameters of analytes. Furthermore, the model has been made transferable between different mobile phases and instrument setups by using a suitable set of calibration compounds. This approach has been validated both in flow injection and chromatographic mode with gradient elution.
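To illustrate the idea of standard-free quantification, the sketch below fits a linear ionization-efficiency model on physicochemical descriptors and uses it to turn a measured peak area into an estimated concentration. The descriptors, training values, and the single-factor response model are deliberately simplified assumptions, not the authors' actual model or parameters.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# hypothetical training set: descriptors vs measured log ionization efficiency
# columns: [pKa, logP, charge-delocalization index]
X_train = np.array([[4.2, 1.3, 0.42],
                    [9.8, 3.1, 0.18],
                    [2.5, 0.2, 0.66],
                    [7.1, 2.4, 0.30],
                    [5.6, 1.9, 0.51]])
logIE_train = np.array([2.1, 3.4, 1.2, 2.9, 2.4])

model = LinearRegression().fit(X_train, logIE_train)

# predict ionization efficiency for an analyte lacking a standard, then convert
# its measured peak area into an estimated concentration (response = IE * concentration)
analyte = np.array([[6.0, 2.2, 0.35]])
logIE_pred = model.predict(analyte)[0]
peak_area = 4.8e5
concentration = peak_area / 10 ** logIE_pred
print(f"predicted logIE {logIE_pred:.2f}, estimated concentration {concentration:.3g}")
```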
Compositional Solution Space Quantification for Probabilistic Software Analysis
NASA Technical Reports Server (NTRS)
Borges, Mateus; Pasareanu, Corina S.; Filieri, Antonio; d'Amorim, Marcelo; Visser, Willem
2014-01-01
Probabilistic software analysis aims at quantifying how likely a target event is to occur during program execution. Current approaches rely on symbolic execution to identify the conditions to reach the target event and try to quantify the fraction of the input domain satisfying these conditions. Precise quantification is usually limited to linear constraints, while only approximate solutions can be provided in general through statistical approaches. However, statistical approaches may fail to converge to an acceptable accuracy within a reasonable time. We present a compositional statistical approach for the efficient quantification of solution spaces for arbitrarily complex constraints over bounded floating-point domains. The approach leverages interval constraint propagation to improve the accuracy of the estimation by focusing the sampling on the regions of the input domain containing the sought solutions. Preliminary experiments show significant improvement on previous approaches both in results accuracy and analysis time.
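A stripped-down sketch of the statistical side of such an analysis is shown below. The constraint and domain are invented, and a uniform grid stratification stands in for the interval-constraint-propagation step that focuses sampling in the actual approach.

```python
import numpy as np

def satisfies(x, y):
    """Hypothetical path condition over two bounded floating-point inputs."""
    return (np.sin(x) + 0.3 * y > 0.8) & (x * y < 2.0)

def estimate_fraction(bounds, n_strata=4, samples_per_cell=5000, seed=0):
    """Stratified Monte Carlo estimate of the fraction of the domain satisfying the constraint."""
    rng = np.random.default_rng(seed)
    (x_lo, x_hi), (y_lo, y_hi) = bounds
    xs = np.linspace(x_lo, x_hi, n_strata + 1)
    ys = np.linspace(y_lo, y_hi, n_strata + 1)
    total = 0.0
    for i in range(n_strata):
        for j in range(n_strata):
            x = rng.uniform(xs[i], xs[i + 1], samples_per_cell)
            y = rng.uniform(ys[j], ys[j + 1], samples_per_cell)
            total += satisfies(x, y).mean() / n_strata ** 2    # cells have equal volume
    return total

print(f"estimated satisfying fraction: {estimate_fraction([(0, 3), (-1, 1)]):.4f}")
```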
Probability or Reasoning: Current Thinking and Realistic Strategies for Improved Medical Decisions
Nantha, Yogarabindranath Swarna
2017-11-01
A prescriptive model approach in decision making could help achieve better diagnostic accuracy in clinical practice through methods that are less reliant on probabilistic assessments. Various prescriptive measures aimed at regulating factors that influence heuristics and clinical reasoning could support clinical decision-making process. Clinicians could avoid time-consuming decision-making methods that require probabilistic calculations. Intuitively, they could rely on heuristics to obtain an accurate diagnosis in a given clinical setting. An extensive literature review of cognitive psychology and medical decision-making theory was performed to illustrate how heuristics could be effectively utilized in daily practice. Since physicians often rely on heuristics in realistic situations, probabilistic estimation might not be a useful tool in everyday clinical practice. Improvements in the descriptive model of decision making (heuristics) may allow for greater diagnostic accuracy. PMID:29209469
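For contrast with the heuristic route, the kind of explicit probabilistic calculation the article says clinicians tend to avoid can be as simple as an odds-form Bayes update; the pretest probability and likelihood ratio below are hypothetical.

```python
def post_test_probability(pretest_prob, likelihood_ratio):
    """Bayes' rule expressed in odds form."""
    pre_odds = pretest_prob / (1 - pretest_prob)
    post_odds = pre_odds * likelihood_ratio
    return post_odds / (1 + post_odds)

# hypothetical example: 20% pretest probability, positive test with LR+ of 8
print(f"post-test probability: {post_test_probability(0.20, 8):.2f}")
```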
Rothfuss, Michael A; Unadkat, Jignesh V; Gimbel, Michael L; Mickle, Marlin H; Sejdić, Ervin
2017-03-01
Totally implantable wireless ultrasonic blood flowmeters provide direct-access chronic vessel monitoring in hard-to-reach places without using wired bedside monitors or imaging equipment. Although wireless implantable Doppler devices are accurate for most applications, device miniaturization and implant lifetime remain largely underdeveloped. We review past and current approaches to miniaturization and implant lifetime extension for wireless implantable Doppler devices and propose approaches to reduce device size and maximize implant lifetime for the next generation of devices. Additionally, we review current and past approaches to accurate blood flow measurements. This review points toward relying on increased levels of monolithic customization and integration to reduce size. Meanwhile, recommendations to maximize implant lifetime should include alternative sources of power, such as transcutaneous wireless power, that stand to extend lifetime indefinitely. Coupling together the results will pave the way for ultra-miniaturized totally implantable wireless blood flow monitors for truly chronic implantation. Copyright © 2016 World Federation for Ultrasound in Medicine & Biology. Published by Elsevier Inc. All rights reserved.
First demonstration of HF-driven ionospheric currents
NASA Astrophysics Data System (ADS)
Papadopoulos, K.; Chang, C.-L.; Labenski, J.; Wallace, T.
2011-10-01
The first experimental demonstration of HF-driven currents in the ionosphere at low ELF/ULF frequencies without relying on the presence of electrojets is presented. The effect was predicted by theoretical/computational means in a recent letter and given the name Ionospheric Current Drive (ICD). The effect relies on modulated F-region HF heating to generate Magneto-Sonic (MS) waves that drive Hall currents when they reach the E-region. The Hall currents inject ELF waves into the Earth-Ionosphere waveguide and helicon and Shear Alfven (SA) waves into the magnetosphere. The proof-of-concept experiments were conducted using the HAARP heater in Alaska under the BRIOCHE program. Waves between 0.1 and 70 Hz were measured at both near and far sites. The letter discusses the differences between ICD-generated waves and those relying on modulation of electrojets.
Collins, Kathleen; Nilsen, Timothy W
2013-08-01
Current investigation of RNA transcriptomes relies heavily on the use of retroviral reverse transcriptases. It is well known that these enzymes have many limitations because of their intrinsic properties. This commentary highlights the recent biochemical characterization of a new family of reverse transcriptases, those encoded by group II intron retrohoming elements. The novel properties of these enzymes endow them with the potential to revolutionize how we approach RNA analyses.
Canino-Rodríguez, José M; García-Herrero, Jesús; Besada-Portas, Juan; Ravelo-García, Antonio G; Travieso-González, Carlos; Alonso-Hernández, Jesús B
2015-03-04
The limited efficiency of current air traffic systems will require a next-generation Smart Air Traffic System (SATS) that relies on current technological advances. This challenge entails a transition toward a new paradigm of navigation and air-traffic procedures, in which pilots and air traffic controllers perform and coordinate their activities according to new roles and technological supports. The design of new Human-Computer Interactions (HCI) for performing these activities is a key element of SATS. However, efforts to develop such tools need to be informed by a parallel characterization of hypothetical air traffic scenarios compatible with current ones. This paper focuses on airborne HCI within SATS, where cockpit inputs come from aircraft navigation systems, the surrounding traffic situation, controllers' indications, etc. The HCI is thus intended to enhance situation awareness and decision-making in the cockpit. Our approach considers SATS as a large-scale distributed system operating under uncertainty in a dynamic environment; therefore, a multi-agent-systems-based approach is well suited for modeling such an environment. We demonstrate that current methodologies for designing multi-agent systems are a useful tool to characterize HCI. We specifically illustrate how the selected methodological approach provides enough guidelines to obtain a cockpit HCI design that complies with future SATS specifications.
Canino-Rodríguez, José M.; García-Herrero, Jesús; Besada-Portas, Juan; Ravelo-García, Antonio G.; Travieso-González, Carlos; Alonso-Hernández, Jesús B.
2015-01-01
The limited efficiency of current air traffic systems will require a next-generation Smart Air Traffic System (SATS) that relies on current technological advances. This challenge entails a transition toward a new paradigm of navigation and air-traffic procedures, in which pilots and air traffic controllers perform and coordinate their activities according to new roles and technological supports. The design of new Human-Computer Interactions (HCI) for performing these activities is a key element of SATS. However, efforts to develop such tools need to be informed by a parallel characterization of hypothetical air traffic scenarios compatible with current ones. This paper focuses on airborne HCI within SATS, where cockpit inputs come from aircraft navigation systems, the surrounding traffic situation, controllers' indications, etc. The HCI is thus intended to enhance situation awareness and decision-making in the cockpit. Our approach considers SATS as a large-scale distributed system operating under uncertainty in a dynamic environment; therefore, a multi-agent-systems-based approach is well suited for modeling such an environment. We demonstrate that current methodologies for designing multi-agent systems are a useful tool to characterize HCI. We specifically illustrate how the selected methodological approach provides enough guidelines to obtain a cockpit HCI design that complies with future SATS specifications. PMID:25746092
A Gendered Approach to Science Ethics for US and UK Physicists.
Ecklund, Elaine Howard; Di, Di
2017-02-01
Some research indicates that women professionals, when compared to men, may be more ethical in the workplace. Existing literature that discusses gender and ethics is confined to the for-profit business sector and primarily to a US context. In particular, there is little attention paid to gender and ethics in science professions in a global context. This represents a significant gap, as science is a rapidly growing and global professional sector, as well as one with ethically ambiguous areas. Adopting an international comparative perspective, this paper relies on 121 semi-structured interviews with US and UK academic physicists to examine how physicists perceive the impact of gender on science ethics. Findings indicate that some US and UK physicists believe that female scientists handle ethical issues within science in a feminine way, whereas their male colleagues approach ethics in a masculine way. Some of these physicists further claim that these different approaches to science ethics lead to male and female scientists' different levels of competitiveness in academic physics. In both the US and the UK, there are "gender-blind" physicists, who do not think gender is related to professional ethics. Relying on physicists' nuanced descriptions, this paper contributes to the current understanding of gender and science and engineering ethics.
Heat Transfer Analysis in Wire Bundles for Aerospace Vehicles
NASA Technical Reports Server (NTRS)
Rickman, S. L.; Iamello, C. J.
2016-01-01
Design of wiring for aerospace vehicles relies on an understanding of "ampacity" which refers to the current carrying capacity of wires, either individually or in wire bundles. Designers rely on standards to derate allowable current flow to prevent exceedance of wire temperature limits due to resistive heat dissipation within the wires or wire bundles. These standards often add considerable margin and are based on empirical data. Commercial providers are taking an aggressive approach to wire sizing which challenges the conventional wisdom of the established standards. Thermal modeling of wire bundles may offer significant mass reduction in a system if the technique can be generalized to produce reliable temperature predictions for arbitrary bundle configurations. Thermal analysis has been applied to the problem of wire bundles wherein any or all of the wires within the bundle may carry current. Wire bundles present analytical challenges because the heat transfer path from conductors internal to the bundle is tortuous, relying on internal radiation and thermal interface conductance to move the heat from within the bundle to the external jacket where it can be carried away by convective and radiative heat transfer. The problem is further complicated by the dependence of wire electrical resistivity on temperature. Reduced heat transfer out of the bundle leads to higher conductor temperatures and, hence, increased resistive heat dissipation. Development of a generalized wire bundle thermal model is presented and compared with test data. The steady state heat balance for a single wire is derived and extended to the bundle configuration. The generalized model includes the effects of temperature-varying resistance, internal radiation and thermal interface conductance, external radiation, and temperature-varying convective relief from the free surface. The sensitivity of the response to uncertainties in key model parameters is explored using Monte Carlo analysis.
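As a rough illustration of the single-wire steady-state heat balance mentioned above, the sketch below balances resistive dissipation with temperature-dependent resistance against convective and radiative relief from the free surface; all material and environmental values are placeholders, and bundle effects (internal radiation, interface conductance) are omitted.

```python
# Sketch: steady-state temperature of a single current-carrying wire.
# I^2*R(T) = h*A*(T - T_env) + eps*sigma*A*(T^4 - T_env^4),  R(T) = R0*(1 + alpha*(T - T0))
from scipy.optimize import brentq

SIGMA = 5.670e-8          # Stefan-Boltzmann constant [W m^-2 K^-4]
I     = 10.0              # current [A] (placeholder)
R0    = 0.02              # resistance at T0 for the modeled length [ohm] (placeholder)
ALPHA = 0.0039            # temperature coefficient of resistivity [1/K] (copper-like)
T0    = 293.15            # reference temperature [K]
H     = 10.0              # convective coefficient [W m^-2 K^-1] (placeholder)
EPS   = 0.8               # jacket emissivity (placeholder)
AREA  = 0.005             # free surface area of the modeled length [m^2] (placeholder)
T_ENV = 293.15            # environment temperature [K]

def residual(T):
    dissipation = I**2 * R0 * (1.0 + ALPHA * (T - T0))
    relief = H * AREA * (T - T_ENV) + EPS * SIGMA * AREA * (T**4 - T_ENV**4)
    return dissipation - relief

# Bracket the root between ambient and a generous upper bound and solve.
T_wire = brentq(residual, T_ENV, 2000.0)
print(f"steady-state wire temperature ~ {T_wire - 273.15:.1f} C")
```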
Nasal juvenile angiofibroma: Current perspectives with emphasis on management.
López, Fernando; Triantafyllou, Asterios; Snyderman, Carl H; Hunt, Jennifer L; Suárez, Carlos; Lund, Valerie J; Strojan, Primož; Saba, Nabil F; Nixon, Iain J; Devaney, Kenneth O; Alobid, Isam; Bernal-Sprekelsen, Manuel; Hanna, Ehab Y; Rinaldo, Alessandra; Ferlito, Alfio
2017-05-01
Juvenile angiofibroma is an uncommon, benign, locally aggressive vascular tumor. It is found almost exclusively in young men. Common presenting symptoms include nasal obstruction and epistaxis. More advanced tumors may present with facial swelling and visual or neurological disturbances. The evaluation of patients with juvenile angiofibroma relies on diagnostic imaging. Preoperative biopsy is not recommended. The mainstay of treatment is resection combined with preoperative embolization. Endoscopic surgery is the approach of choice in early stages, whereas, in advanced stages, open or endoscopic approaches are feasible in expert hands. Postoperative radiotherapy (RT) or stereotactic radiosurgery seem valuable in long-term control of juvenile angiofibroma, particularly those that extend to anatomically critical areas unsuitable for complete resection. Chemotherapy and hormone therapy are ineffective. The purpose of the present review was to update current aspects of knowledge related to this rare and challenging disease. © 2017 Wiley Periodicals, Inc. Head Neck 39: 1033-1045, 2017. © 2017 Wiley Periodicals, Inc.
Caso, Francesco; Costa, Luisa; Del Puente, Antonio; Di Minno, Matteo Nicola Dario; Lupoli, Gelsy; Scarpa, Raffaele; Peluso, Rosario
2015-01-01
Spondyloarthritis represents a heterogeneous group of articular inflammatory diseases that share common genetic, clinical and radiological features. The therapy of spondyloarthritis aims mainly at improving patients’ quality of life, controlling articular inflammation, preventing structural joint damage and preserving the functional abilities, autonomy and social participation of patients. Among available treatments, traditional disease-modifying antirheumatic drugs have been demonstrated to be effective in the management of peripheral arthritis; moreover, in the last decade, biological therapies have improved the approach to spondyloarthritis. In patients with axial spondyloarthritis, tumor necrosis factor α inhibitors are currently the only effective therapy in patients for whom conventional therapy with nonsteroidal anti-inflammatory drugs has failed. The aim of this review is to summarize the current experience and evidence about the pharmacological approach in spondyloarthritis patients. PMID:26568809
Enhancement of the MODIS Snow and Ice Product Suite Utilizing Image Segmentation
NASA Technical Reports Server (NTRS)
Tilton, James C.; Hall, Dorothy K.; Riggs, George A.
2006-01-01
A problem has been noticed with the current MODIS Snow and Ice Product in that fringes of certain snow fields are labeled as "cloud" whereas close inspection of the data indicates that the correct labeling is a non-cloud category such as snow or land. This occurs because the current MODIS Snow and Ice Product generation algorithm relies solely on the MODIS Cloud Mask Product for the labeling of image pixels as cloud. It is proposed here that information obtained from image segmentation can be used to determine when it is appropriate to override the cloud indication from the cloud mask product. Initial tests show that this approach can significantly reduce the cloud "fringing" in the modified snow cover labeling. More comprehensive testing is required to determine whether or not this approach consistently improves the accuracy of the snow and ice product.
Vázquez-Rowe, Ian; Iribarren, Diego
2015-01-01
Life-cycle (LC) approaches play a significant role in energy policy making to determine the environmental impacts associated with the choice of energy source. Data envelopment analysis (DEA) can be combined with LC approaches to provide quantitative benchmarks that orientate the performance of energy systems towards environmental sustainability, with different implications depending on the selected LC + DEA method. The present paper examines currently available LC + DEA methods and develops a novel method combining carbon footprinting (CFP) and DEA. Thus, the CFP + DEA method is proposed, a five-step structure including data collection for multiple homogenous entities, calculation of target operating points, evaluation of current and target carbon footprints, and result interpretation. As the current context for energy policy implies an anthropocentric perspective with focus on the global warming impact of energy systems, the CFP + DEA method is foreseen to be the most consistent LC + DEA approach to provide benchmarks for energy policy making. The fact that this method relies on the definition of operating points with optimised resource intensity helps to moderate the concerns about the omission of other environmental impacts. Moreover, the CFP + DEA method benefits from CFP specifications in terms of flexibility, understanding, and reporting.
Vázquez-Rowe, Ian
2015-01-01
Life-cycle (LC) approaches play a significant role in energy policy making to determine the environmental impacts associated with the choice of energy source. Data envelopment analysis (DEA) can be combined with LC approaches to provide quantitative benchmarks that orientate the performance of energy systems towards environmental sustainability, with different implications depending on the selected LC + DEA method. The present paper examines currently available LC + DEA methods and develops a novel method combining carbon footprinting (CFP) and DEA. Thus, the CFP + DEA method is proposed, a five-step structure including data collection for multiple homogenous entities, calculation of target operating points, evaluation of current and target carbon footprints, and result interpretation. As the current context for energy policy implies an anthropocentric perspective with focus on the global warming impact of energy systems, the CFP + DEA method is foreseen to be the most consistent LC + DEA approach to provide benchmarks for energy policy making. The fact that this method relies on the definition of operating points with optimised resource intensity helps to moderate the concerns about the omission of other environmental impacts. Moreover, the CFP + DEA method benefits from CFP specifications in terms of flexibility, understanding, and reporting. PMID:25654136
Current and Emerging Therapies for Lupus Nephritis
Parikh, Samir V.
2016-01-01
The introduction of corticosteroids and later, cyclophosphamide dramatically improved survival in patients with proliferative lupus nephritis, and combined administration of these agents became the standard-of-care treatment for this disease. However, treatment failures were still common and the rate of progression to ESRD remained unacceptably high. Additionally, treatment was associated with significant morbidity. Therefore, as patient survival improved, the goals for advancing lupus nephritis treatment shifted to identifying therapies that could improve long-term renal outcomes and minimize treatment-related toxicity. Unfortunately, progress has been slow and the current approaches to the management of lupus nephritis continue to rely on high-dose corticosteroids plus a broad-spectrum immunosuppressive agent. Over the past decade, an improved understanding of lupus nephritis pathogenesis fueled several clinical trials of novel drugs, but none have been found to be superior to the combination of a cytotoxic agent and corticosteroids. Despite these trial failures, efforts to translate mechanistic advances into new treatment approaches continue. In this review, we discuss current therapeutic strategies for lupus nephritis, briefly review recent advances in understanding the pathogenesis of this disease, and describe emerging approaches developed on the basis of these advances that promise to improve upon the standard-of-care lupus nephritis treatments. PMID:27283496
NASA Technical Reports Server (NTRS)
Manning, Robert M.
1990-01-01
A novel method of microwave power conversion to direct current is discussed that relies on a modification of well known resonant linear relativistic electron accelerator techniques. An analysis is presented that shows how, by establishing a 'slow' electromagnetic field in a waveguide, electrons liberated from an array of field emission cathodes, are resonantly accelerated to several times their rest energy, thus establishing an electric current over a large potential difference. Such an approach is not limited to the relatively low frequencies that characterize the operation of rectennas, and can, with appropriate waveguide and slow wave structure design, be employed in the 300 to 600 GHz range where much smaller transmitting and receiving antennas are needed.
Computing singularities of perturbation series
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kvaal, Simen; Jarlebring, Elias; Michiels, Wim
2011-03-15
Many properties of current ab initio approaches to the quantum many-body problem, both perturbational and otherwise, are related to the singularity structure of the Rayleigh-Schroedinger perturbation series. A numerical procedure is presented that in principle computes the complete set of singularities, including the dominant singularity which limits the radius of convergence. The method approximates the singularities as eigenvalues of a certain generalized eigenvalue equation which is solved using iterative techniques. It relies on computation of the action of the Hamiltonian matrix on a vector and does not rely on the terms in the perturbation series. The method can be useful for studying perturbation series of typical systems of moderate size, for fundamental development of resummation schemes, and for understanding the structure of singularities for typical systems. Some illustrative model problems are studied, including a helium-like model with δ-function interactions for which Moeller-Plesset perturbation theory is considered and the radius of convergence found.
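The specific generalized eigenvalue equation is not reproduced here, but the key computational ingredient, namely iterative eigensolvers that require only the action of the Hamiltonian on a vector, can be sketched as follows with a generic matrix-free operator; the tridiagonal "Hamiltonian" is a placeholder, not the model from the paper.

```python
# Sketch: compute a few eigenvalues of a large operator using only its action on
# vectors (matrix-free), the same ingredient the singularity search relies on.
import numpy as np
from scipy.sparse.linalg import LinearOperator, eigs

n = 2000
diag = np.arange(1, n + 1, dtype=float)        # placeholder tridiagonal model operator

def matvec(v):
    # Action H @ v without ever forming H explicitly.
    out = diag * v
    out[:-1] += 0.5 * v[1:]
    out[1:]  += 0.5 * v[:-1]
    return out

H = LinearOperator((n, n), matvec=matvec, dtype=float)
vals = eigs(H, k=4, which="LM", return_eigenvectors=False)   # a few largest eigenvalues
print(np.sort(vals.real))
```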
Marchand, Jérémy; Martineau, Estelle; Guitton, Yann; Dervilly-Pinel, Gaud; Giraudeau, Patrick
2017-02-01
Multi-dimensional NMR is an appealing approach for dealing with the challenging complexity of biological samples in metabolomics. This article describes how spectroscopists have recently challenged their imagination in order to make 2D NMR a powerful tool for quantitative metabolomics, based on innovative pulse sequences combined with meticulous analytical chemistry approaches. Clever time-saving strategies have also been explored to make 2D NMR a high-throughput tool for metabolomics, relying on alternative data acquisition schemes such as ultrafast NMR. Currently, much work is aimed at drastically boosting the NMR sensitivity thanks to hyperpolarisation techniques, which have been used in combination with fast acquisition methods and could greatly expand the application potential of NMR metabolomics. Copyright © 2016 Elsevier Ltd. All rights reserved.
Cima, Igor; Wen Yee, Chay; Iliescu, Florina S; Phyo, Wai Min; Lim, Kiat Hon; Iliescu, Ciprian; Tan, Min Han
2013-01-01
This review will cover the recent advances in label-free approaches to isolate and manipulate circulating tumor cells (CTCs). In essence, label-free approaches do not rely on antibodies or biological markers for labeling the cells of interest, but enrich them using the differential physical properties intrinsic to cancer and blood cells. We will discuss technologies that isolate cells based on their biomechanical and electrical properties. Label-free approaches to analyze CTCs have been recently invoked as a valid alternative to "marker-based" techniques, because classical epithelial and tumor markers are lost on some CTC populations and there is no comprehensive phenotypic definition for CTCs. We will highlight the advantages and drawbacks of these technologies and the status on their implementation in the clinics.
Clegg, Paul S; Tavacoli, Joe W; Wilde, Pete J
2016-01-28
Multiple emulsions have great potential for application in food science as a means to reduce fat content or for controlled encapsulation and release of actives. However, neither production nor stability is straightforward. Typically, multiple emulsions are prepared via two emulsification steps and a variety of approaches have been deployed to give long-term stability. It is well known that multiple emulsions can be prepared in a single step by harnessing emulsion inversion, although the resulting emulsions are usually short lived. Recently, several contrasting methods have been demonstrated which give rise to stable multiple emulsions via one-step production processes. Here we review the current state of microfluidic, polymer-stabilized and particle-stabilized approaches; these rely on phase separation, the role of electrolyte and the trapping of solvent with particles respectively.
A novel approach to enhance antibody sensitivity and specificity by peptide cross-linking
Namiki, Takeshi; Valencia, Julio C.; Hall, Matthew D.; Hearing, Vincent J.
2008-01-01
Most current techniques employed to improve antigen-antibody signals in western blotting and in immunohistochemistry rely on sample processing prior to staining (e.g. microwaving) or using a more robust reporter (e.g. a secondary antibody with biotin-streptavidin). We have developed and optimized a new approach intended to stabilize the complexes formed between antigens and their respective primary antibodies by cupric ions at high pH. This technique improves the affinity and lowers cross-reactivity with non-specific bands of ∼20% of antibodies tested (5/25). Here we report that this method can enhance antigen-antibody specificity and can improve the utility of some poorly reactive primary antibodies. PMID:18801330
Fast and sensitive detection of an oscillating charge
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bian, X.; Hasko, D. G.; Milne, W. I.
We investigate the high-frequency operation of a percolation field effect transistor to monitor microwave-excited single trapped charge. Readout is accomplished by measuring the effect of the polarization field associated with the oscillating charge on the AC signal generated in the channel due to charge pumping. This approach is sensitive to the relative phase between the polarization field and the pumped current, which is different from the conventional approach relying on the amplitude only. Therefore, despite the very small influence of the single oscillating trapped electron, a large signal can be detected. Experimental results show large improvement in both signal-to-noise ratio and measurement bandwidth.
Introduction to Global Urban Climatology
NASA Astrophysics Data System (ADS)
Varquez, A. C. G.; Kanda, M.; Kawano, N.; Darmanto, N. S.; Dong, Y.
2016-12-01
Urban heat island (UHI) is a widely investigated phenomenon in the field of urban climate characterized by the warming of urban areas relative to their surrounding rural environs. Being able to understand the mechanism behind the UHI formation of a city and distinguish its impact from that of global climate change is indispensable when identifying adaptation and mitigation strategies. However, the lack of UHI studies in many cities, especially in developing countries, makes it difficult to generalize the mechanism of UHI formation. Thus, there is a pressing demand for studies that focus on the simultaneous analyses of UHI and its trends throughout the world. Hence, we propose a subfield of urban climatology, called "global urban climatology" (GUC), which mainly focuses on the uniform understanding of urban climates across all cities, globally. By using globally applicable methodologies to quantify and compare urban heat islands of cities with diverse backgrounds, including their geography, climate, socio-demography, and other factors, a universal understanding of the mechanisms underlying the formation of the phenomenon can be established. The implementation of GUC involves the use of globally acquired historical observation networks, gridded meteorological parameters from climate models, and global geographic information system datasets; the construction of a distributed urban parameter database; and the development of techniques necessary to model the urban climate. Research under GUC can be categorized into three approaches. The collaborative approach (1st) relies on the collection of data from micro-scale experiments conducted worldwide with the aid or development of professional social networking platforms; the analytical approach (2nd) relies on the use of global weather station datasets and their corresponding objectively analysed global outputs; and the numerical approach (3rd) relies on the global estimation of high-resolution urban-representative parameters as inputs to global weather modelling. The GUC concept, the pathways through which GUC assessments can be undertaken, and current implementations are introduced. Acknowledgment: This research was supported by the Environment Research and Technology Development Fund (S-14) of the Ministry of the Environment, Japan.
A Probabilistic Approach to Network Event Formation from Pre-Processed Waveform Data
NASA Astrophysics Data System (ADS)
Kohl, B. C.; Given, J.
2017-12-01
The current state of the art for seismic event detection still largely depends on signal detection at individual sensor stations, including picking accurate arrival times and correctly identifying phases, and relying on fusion algorithms to associate individual signal detections to form event hypotheses. But increasing computational capability has enabled progress toward the objective of fully utilizing body-wave recordings in an integrated manner to detect events without the necessity of previously recorded ground truth events. In 2011-2012 Leidos (then SAIC) operated a seismic network to monitor activity associated with geothermal field operations in western Nevada. We developed a new association approach for detecting and quantifying events by probabilistically combining pre-processed waveform data to deal with noisy data and clutter at local distance ranges. The ProbDet algorithm maps continuous waveform data into continuous conditional probability traces using a source model (e.g., Brune earthquake or Mueller-Murphy explosion) to map frequency content and an attenuation model to map amplitudes. Event detection and classification is accomplished by combining the conditional probabilities from the entire network using a Bayesian formulation. This approach was successful in producing a high-Pd, low-Pfa automated bulletin for a local network, and preliminary tests with regional and teleseismic data show that it has promise for global seismic and nuclear monitoring applications. The approach highlights several features that we believe are essential to achieving low-threshold automated event detection: it minimizes the use of individual seismic phase detections (in traditional techniques, errors in signal detection, timing, feature measurement, and initial phase identification compound and propagate into errors in event formation); it provides a formalized framework that utilizes information from non-detecting stations; it utilizes source information, in particular the spectral characteristics of events of interest; it is entirely model-based, i.e., it does not rely on a priori ground-truth events, which is particularly important for nuclear monitoring; and it does not rely on individualized signal detection thresholds, because it is the network solution that matters.
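A highly simplified sketch of combining per-station conditional probability traces into a network-level detection statistic; the source and attenuation models that produce the station traces in ProbDet are replaced here by synthetic probability time series, and the station-independence assumption is an illustrative simplification.

```python
# Sketch: Bayesian fusion of per-station probability traces into a network log-odds
# trace. p[k, t] = per-station posterior probability of an event at time t given that
# station's waveform (with common prior `prior`). Traces below are synthetic.
import numpy as np

rng = np.random.default_rng(0)
n_stations, n_times = 5, 600
prior = 1e-3                                     # prior probability of an event per time bin

# Background near the prior, with a coherent bump at t ~ 300 on all stations.
p = np.full((n_stations, n_times), prior) + 0.002 * rng.random((n_stations, n_times))
p[:, 295:305] = 0.6

# Non-detecting stations still contribute (through the (1 - p) terms), which is what
# distinguishes this from associating individual station triggers.
log_odds = np.log(prior / (1 - prior)) + np.sum(
    np.log(p / (1 - p)) - np.log(prior / (1 - prior)), axis=0)

detections = np.flatnonzero(log_odds > 0.0)      # posterior odds favor "event"
print("candidate event time bins:", detections)
```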
The currency and tempo of extinction.
Regan, H M; Lupia, R; Drinnan, A N; Burgman, M A
2001-01-01
This study examines estimates of extinction rates for the current purported biotic crisis and from the fossil record. Studies that compare current and geological extinctions sometimes use metrics that confound different sources of error and reflect different features of extinction processes. The per taxon extinction rate is a standard measure in paleontology that avoids some of the pitfalls of alternative approaches. Extinction rates reported in the conservation literature are rarely accompanied by measures of uncertainty, despite many elements of the calculations being subject to considerable error. We quantify some of the most important sources of uncertainty and carry them through the arithmetic of extinction rate calculations using fuzzy numbers. The results emphasize that estimates of current and future rates rely heavily on assumptions about the tempo of extinction and on extrapolations among taxa. Available data are unlikely to be useful in measuring magnitudes or trends in current extinction rates.
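A toy illustration of carrying uncertainty through a per-taxon extinction rate calculation; interval arithmetic is used here as a crude stand-in for the fuzzy numbers in the study, and all counts and durations are invented.

```python
# Sketch: per-taxon extinction rate E / (S * dt) with interval-valued inputs
# (lower/upper bounds), a simplified stand-in for fuzzy-number propagation.
def interval_rate(extinctions, standing_taxa, interval_myr):
    """Each argument is a (low, high) pair; returns (low, high) rate per taxon per Myr."""
    e_lo, e_hi = extinctions
    s_lo, s_hi = standing_taxa
    t_lo, t_hi = interval_myr
    # The rate is increasing in E and decreasing in S and dt, so bounds pair up directly.
    return e_lo / (s_hi * t_hi), e_hi / (s_lo * t_lo)

# Invented numbers: 30-80 recorded extinctions among 8000-12000 taxa over 0.5-1.0 Myr.
lo, hi = interval_rate((30, 80), (8000, 12000), (0.5, 1.0))
print(f"per-taxon extinction rate: {lo:.5f} to {hi:.5f} per taxon per Myr")
```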
Beyond Fine Tuning: Adding capacity to leverage few labels
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hodas, Nathan O.; Shaffer, Kyle J.; Yankov, Artem
2017-12-09
In this paper we present a technique to train neural network models on small amounts of data. Current methods for training neural networks on small amounts of rich data typically rely on strategies such as fine-tuning a pre-trained neural network or the use of domain-specific hand-engineered features. Here we take the approach of treating network layers, or entire networks, as modules and combine pre-trained modules with untrained modules, to learn the shift in distributions between data sets. The central impact of using a modular approach comes from adding new representations to a network, as opposed to replacing representations via fine-tuning. Using this technique, we are able to surpass results using standard fine-tuning transfer learning approaches, and we are also able to significantly increase performance over such approaches when using smaller amounts of data.
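A minimal PyTorch-style sketch of the modular idea: a frozen pre-trained module provides one representation, a small untrained module adds new capacity, and only the new parts are trained on the small dataset; the layer sizes and the placeholder backbone are illustrative, not the architecture used in the paper.

```python
# Sketch: add capacity alongside a frozen pre-trained module instead of fine-tuning it.
import torch
import torch.nn as nn

class ModularNet(nn.Module):
    def __init__(self, pretrained: nn.Module, feat_dim: int, n_classes: int):
        super().__init__()
        self.pretrained = pretrained
        for p in self.pretrained.parameters():    # keep the learned representation fixed
            p.requires_grad = False
        self.new_module = nn.Sequential(          # untrained module: new capacity
            nn.Linear(feat_dim, 64), nn.ReLU(), nn.Linear(64, 64), nn.ReLU())
        self.head = nn.Linear(feat_dim + 64, n_classes)

    def forward(self, x):
        old = self.pretrained(x)                  # frozen representation
        new = self.new_module(old)                # learns the shift between data sets
        return self.head(torch.cat([old, new], dim=-1))

# Placeholder "pre-trained" feature extractor; in practice this would carry loaded weights.
backbone = nn.Sequential(nn.Linear(32, 128), nn.ReLU(), nn.Linear(128, 128))
model = ModularNet(backbone, feat_dim=128, n_classes=3)
opt = torch.optim.Adam([p for p in model.parameters() if p.requires_grad], lr=1e-3)
loss = nn.CrossEntropyLoss()(model(torch.randn(8, 32)), torch.randint(0, 3, (8,)))
loss.backward()
opt.step()
```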
Shear-wave velocity profiling according to three alternative approaches: A comparative case study
NASA Astrophysics Data System (ADS)
Dal Moro, G.; Keller, L.; Al-Arifi, N. S.; Moustafa, S. S. R.
2016-11-01
The paper intends to compare three different methodologies which can be used to analyze surface-wave propagation, thus eventually obtaining the vertical shear-wave velocity (VS) profile. The three presented methods (currently still quite unconventional) are characterized by different field procedures and data processing. The first methodology is a sort of evolution of the classical Multi-channel Analysis of Surface Waves (MASW) here accomplished by jointly considering Rayleigh and Love waves (analyzed according to the Full Velocity Spectrum approach) and the Horizontal-to-Vertical Spectral Ratio (HVSR). The second method is based on the joint analysis of the HVSR curve together with the Rayleigh-wave dispersion determined via Miniature Array Analysis of Microtremors (MAAM), a passive methodology that relies on a small number (4 to 6) of vertical geophones deployed along a small circle (for the common near-surface application the radius usually ranges from 0.6 to 5 m). Finally, the third considered approach is based on the active data acquired by a single 3-component geophone and relies on the joint inversion of the group-velocity spectra of the radial and vertical components of the Rayleigh waves, together with the Radial-to-Vertical Spectral Ratio (RVSR). The results of the analyses performed while considering these approaches (completely different both in terms of field procedures and data analysis) appear extremely consistent thus mutually validating their performances. Pros and cons of each approach are summarized both in terms of computational aspects as well as with respect to practical considerations regarding the specific character of the pertinent field procedures.
Huhman, Marian; Quick, Brian L; Payne, Laura
2016-05-01
A primary objective of health care reform is to provide affordable and quality health insurance to individuals. Currently, promotional efforts have been moderately successful in registering older, more mature adults yet comparatively less successful in registering younger adults. With this challenge in mind, we conducted extensive formative research to better understand the attitudes, subjective norms, and perceived behavioral control of community college students. More specifically, we examined how each relates to their intentions to enroll in a health insurance plan, maintain their current health insurance plan, and talk with their parents about their parents having health insurance. In doing so, we relied on the revised reasoned action approach advanced by Fishbein and his associates (Fishbein & Ajzen, 2010; Yzer, 2012, 2013). Results showed that the constructs predicted intentions to enroll in health insurance for those with no insurance and for those with government-sponsored insurance and intentions to maintain insurance for those currently insured. Our study demonstrates the applicability of the revised reasoned action framework within this context and is discussed with an emphasis on the practical and theoretical contributions.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Knio, Omar
2017-05-05
The current project develops a novel approach that uses a probabilistic description to capture the current state of knowledge about the computational solution. To effectively spread the computational effort over multiple nodes, the global computational domain is split into many subdomains. Computational uncertainty in the solution translates into uncertain boundary conditions for the equation system to be solved on those subdomains, and many independent, concurrent subdomain simulations are used to account for this boundary condition uncertainty. By relying on the fact that solutions on neighboring subdomains must agree with each other, a more accurate estimate for the global solution can be achieved. Statistical approaches in this update process make it possible to account for the effect of system faults in the probabilistic description of the computational solution, and the associated uncertainty is reduced through successive iterations. By combining all of these elements, the probabilistic reformulation allows splitting the computational work over very many independent tasks for good scalability, while being robust to system faults.
Endoscopic ultrasound-guided techniques for diagnosing pancreatic mass lesions: Can we do better?
Storm, Andrew C; Lee, Linda S
2016-01-01
The diagnostic approach to a possible pancreatic mass lesion relies first upon various non-invasive imaging modalities, including computed tomography, ultrasound, and magnetic resonance imaging techniques. Once a suspect lesion has been identified, tissue acquisition for characterization of the lesion is often paramount in developing an individualized therapeutic approach. Given the high prevalence and mortality associated with pancreatic cancer, an ideal approach to diagnosing pancreatic mass lesions would be safe, highly sensitive, and reproducible across various practice settings. Tools, in addition to radiologic imaging, currently employed in the initial evaluation of a patient with a pancreatic mass lesion include serum tumor markers, endoscopic retrograde cholangiopancreatography, and endoscopic ultrasound-guided fine needle aspiration (EUS-FNA). EUS-FNA has grown to become the gold standard in tissue diagnosis of pancreatic lesions. PMID:27818584
Panthier, Frédéric; Lareyre, Fabien; Audouin, Marie; Raffort, Juliette
2018-03-01
Pelvi-ureteric junction obstruction corresponds to an impairment of urinary transport that can lead to renal dysfunction if not treated. Several mechanisms can cause the obstruction of the ureter including intrinsic factors or extrinsic factors such as the presence of crossing vessels. The treatment of the disease relies on surgical approaches, pyeloplasty being the standard reference. The technique consists in removing the pathologic ureteric segment and renal pelvis and transposing associated crossing vessels if present. The vascular anatomy of the pelvi-ureteric junction is complex and varies among individuals, and this can impact on the disease development and its surgical treatment. In this review, we summarize current knowledge on vascular anatomic variations in the pelvi-ureteric junction. Based on anatomic characteristics, we discuss implications for surgical approaches during pyeloplasty and vessel transposition.
Bilayer insulator tunnel barriers for graphene-based vertical hot-electron transistors
NASA Astrophysics Data System (ADS)
Vaziri, S.; Belete, M.; Dentoni Litta, E.; Smith, A. D.; Lupina, G.; Lemme, M. C.; Östling, M.
2015-07-01
Vertical graphene-based device concepts that rely on quantum mechanical tunneling are intensely being discussed in the literature for applications in electronics and optoelectronics. In this work, the carrier transport mechanisms in semiconductor-insulator-graphene (SIG) capacitors are investigated with respect to their suitability as electron emitters in vertical graphene base transistors (GBTs). Several dielectric materials as tunnel barriers are compared, including dielectric double layers. Using bilayer dielectrics, we experimentally demonstrate significant improvements in the electron injection current by promoting Fowler-Nordheim tunneling (FNT) and step tunneling (ST) while suppressing defect mediated carrier transport. High injected tunneling current densities approaching 103 A cm-2 (limited by series resistance), and excellent current-voltage nonlinearity and asymmetry are achieved using a 1 nm thick high quality dielectric, thulium silicate (TmSiO), as the first insulator layer, and titanium dioxide (TiO2) as a high electron affinity second layer insulator. We also confirm the feasibility and effectiveness of our approach in a full GBT structure which shows dramatic improvement in the collector on-state current density with respect to the previously reported GBTs. The device design and the fabrication scheme have been selected with future CMOS process compatibility in mind. This work proposes a bilayer tunnel barrier approach as a promising candidate to be used in high performance vertical graphene-based tunneling devices.
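For reference, the textbook Fowler-Nordheim tunneling current density that the bilayer barrier is designed to promote can be written as below (symbols: barrier height φ_B, oxide field E_ox, effective mass m*); this standard form is quoted from the general literature, not from the paper itself.

```latex
% Standard Fowler-Nordheim tunneling current density (general-literature form)
J_{\mathrm{FN}} = \frac{q^{3} E_{\mathrm{ox}}^{2}}{8\pi h \,\varphi_{B}}
\exp\!\left(-\frac{8\pi \sqrt{2 m^{*}}\,\varphi_{B}^{3/2}}{3 h q\, E_{\mathrm{ox}}}\right)
```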
2016-09-08
A self-powered thin-film radiation detection method is introduced that relies on high-energy current (HEC) formed by secondary charged particles in the detector material, without requiring an external bias (keywords: photocurrent, radiation detection, self-powered, thin-film; DOI: 10.1118/1.4935531).
Dynamic Routing of Aircraft in the Presence of Adverse Weather Using a POMDP Framework
NASA Technical Reports Server (NTRS)
Balaban, Edward; Roychoudhury, Indranil; Spirkovska, Lilly; Sankararaman, Shankar; Kulkarni, Chetan; Arnon, Tomer
2017-01-01
Each year weather-related airline delays result in hundreds of millions of dollars in additional fuel burn, maintenance, and lost revenue, not to mention passenger inconvenience. The current approaches for aircraft route planning in the presence of adverse weather still mainly rely on deterministic methods. In contrast, this work aims to deal with the problem using a Partially Observable Markov Decision Processes (POMDPs) framework, which allows for reasoning over uncertainty (including uncertainty in weather evolution over time) and results in solutions that are more robust to disruptions. The POMDP-based decision support system is demonstrated on several scenarios involving convective weather cells and is benchmarked against a deterministic planning system with functionality similar to those currently in use or under development.
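A compact sketch of the kind of uncertainty reasoning a POMDP planner performs at each step: a belief over discrete weather-cell states is propagated with a transition model and updated with a noisy observation. The states, transition matrix, and observation likelihoods here are all illustrative and not taken from the cited work.

```python
# Sketch: one Bayes-filter step over weather-cell states, the belief update that sits
# inside a POMDP-based routing loop. All probabilities are illustrative.
import numpy as np

states = ["dissipating", "steady", "intensifying"]
T = np.array([[0.70, 0.25, 0.05],     # P(next state | current state)
              [0.20, 0.60, 0.20],
              [0.05, 0.25, 0.70]])
# P(observation | state) for a coarse radar-reflectivity reading {low, med, high}.
O = np.array([[0.70, 0.25, 0.05],
              [0.20, 0.60, 0.20],
              [0.05, 0.25, 0.70]])

belief = np.array([0.3, 0.5, 0.2])    # current belief over the weather cell's state

def belief_update(belief, obs_idx):
    predicted = T.T @ belief                    # propagate through weather dynamics
    posterior = O[:, obs_idx] * predicted       # weight by observation likelihood
    return posterior / posterior.sum()

belief = belief_update(belief, obs_idx=2)       # observed "high" reflectivity
print(dict(zip(states, np.round(belief, 3))))
# A planner would now choose the routing action maximizing expected value under this belief.
```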
DOE Office of Scientific and Technical Information (OSTI.GOV)
Serne, R.J.; Wood, M.I.
1990-05-01
This report documents the currently available geochemical data base for release and retardation for actual Hanford Site materials (wastes and/or sediments). The report also recommends specific laboratory tests and presents the rationale for the recommendations. The purpose of this document is threefold: to summarize currently available information, to provide a strategy for generating additional data, and to provide recommendations on specific data collection methods and test matrices. This report outlines a data collection approach that relies on feedback from performance analyses to ascertain when adequate data have been collected. The data collection scheme emphasizes laboratory testing based on empiricism. 196 refs., 4 figs., 36 tabs.
Norris, Edmund J; Coats, Joel R
2017-01-29
Every year, approximately 700,000 people die from complications associated with etiologic disease agents transmitted by mosquitoes. While insecticide-based vector control strategies are important for the management of mosquito-borne diseases, insecticide-resistance and other logistical hurdles may lower the efficacy of this approach, especially in developing countries. Repellent technologies represent another fundamental aspect of preventing mosquito-borne disease transmission. Among these technologies, spatial repellents are promising alternatives to the currently utilized contact repellents and may significantly aid in the prevention of mosquito-borne disease if properly incorporated into integrated pest management approaches. As their deployment would not rely on prohibitively expensive or impractical novel accessory technologies and resources, they have potential utility in developing countries where the burden of mosquito-borne disease is most prevalent. This review aims to describe the history of various repellent technologies, highlight the potential of repellent technologies in preventing the spread of mosquito-borne disease, and discuss currently known mechanisms that confer resistance to current contact and spatial repellents, which may lead to the failures of these repellents. In the subsequent section, current and future research projects aimed at exploring long-lasting non-pyrethroid spatial repellent molecules along with new paradigms and rationale for their development will be discussed.
Norris, Edmund J.; Coats, Joel R.
2017-01-01
Every year, approximately 700,000 people die from complications associated with etiologic disease agents transmitted by mosquitoes. While insecticide-based vector control strategies are important for the management of mosquito-borne diseases, insecticide-resistance and other logistical hurdles may lower the efficacy of this approach, especially in developing countries. Repellent technologies represent another fundamental aspect of preventing mosquito-borne disease transmission. Among these technologies, spatial repellents are promising alternatives to the currently utilized contact repellents and may significantly aid in the prevention of mosquito-borne disease if properly incorporated into integrated pest management approaches. As their deployment would not rely on prohibitively expensive or impractical novel accessory technologies and resources, they have potential utility in developing countries where the burden of mosquito-borne disease is most prevalent. This review aims to describe the history of various repellent technologies, highlight the potential of repellent technologies in preventing the spread of mosquito-borne disease, and discuss currently known mechanisms that confer resistance to current contact and spatial repellents, which may lead to the failures of these repellents. In the subsequent section, current and future research projects aimed at exploring long-lasting non-pyrethroid spatial repellent molecules along with new paradigms and rationale for their development will be discussed. PMID:28146066
Petroleum Scarcity and Public Health: Considerations for Local Health Departments
Parker, Cindy L.; Caine, Virginia A.; McKee, Mary; Shirley, Lillian M.; Links, Jonathan M.
2011-01-01
Recognition of petroleum as a finite global resource has spurred increasing interest in the intersection between petroleum scarcity and public health. Local health departments represent a critical yet highly vulnerable component of the public health infrastructure. These frontline agencies currently face daunting resource constraints and rely heavily on petroleum for vital population-based health services. Against this backdrop, petroleum scarcity may necessitate reconfiguring local public health service approaches. We describe the anticipated impacts of petroleum scarcity on local health departments, recommend the use of the 10 Essential Public Health Services as a framework for examining attendant operational challenges and potential responses to them, and describe approaches that local health departments and their stakeholders could consider as part of timely planning efforts. PMID:21778471
Cover estimation and payload location using Markov random fields
NASA Astrophysics Data System (ADS)
Quach, Tu-Thach
2014-02-01
Payload location is an approach to find the message bits hidden in steganographic images, but not necessarily their logical order. Its success relies primarily on the accuracy of the underlying cover estimators and can be improved if more estimators are used. This paper presents an approach based on Markov random field to estimate the cover image given a stego image. It uses pairwise constraints to capture the natural two-dimensional statistics of cover images and forms a basis for more sophisticated models. Experimental results show that it is competitive against current state-of-the-art estimators and can locate payload embedded by simple LSB steganography and group-parity steganography. Furthermore, when combined with existing estimators, payload location accuracy improves significantly.
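A much-simplified sketch of the residual idea behind payload location: a cover estimate is computed for each stego image (here a plain neighbor-mean filter stands in for the Markov random field estimator), disagreements between the stego image and its estimated cover are accumulated over many images, and frequently disagreeing pixels are flagged as likely payload locations.

```python
# Sketch: locate payload-carrying pixels by accumulating, over many stego images, the
# positions where the stego value disagrees with an estimated cover value. The
# neighbor-mean estimator is a crude stand-in for the MRF model described above.
import numpy as np
from scipy.ndimage import uniform_filter

rng = np.random.default_rng(1)
h, w = 64, 64
yy, xx = np.mgrid[0:h, 0:w]

def random_cover():
    # Smooth synthetic cover so the simple estimator is reasonably accurate.
    p1, p2 = rng.uniform(0, 2 * np.pi, size=2)
    return np.rint(128 + 50 * np.sin(xx / 11 + p1) + 50 * np.cos(yy / 13 + p2)).astype(int)

def embed_lsb(cover, mask):
    stego = cover.copy()
    bits = rng.integers(0, 2, size=mask.sum())
    stego[mask] = (stego[mask] & ~1) | bits            # LSB replacement in the payload region
    return stego

payload_mask = np.zeros((h, w), dtype=bool)
payload_mask[10:20, 10:40] = True                      # same embedding locations in every image

votes = np.zeros((h, w))
for _ in range(200):
    stego = embed_lsb(random_cover(), payload_mask)
    cover_est = np.rint(uniform_filter(stego.astype(float), size=3)).astype(int)
    votes += (cover_est != stego)                      # disagreement with the estimated cover

print("mean votes inside payload region :", votes[payload_mask].mean())
print("mean votes outside payload region:", votes[~payload_mask].mean())
```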
Statistical Analysis of Protein Ensembles
NASA Astrophysics Data System (ADS)
Máté, Gabriell; Heermann, Dieter
2014-04-01
As 3D protein-configuration data are piling up, there is an ever-increasing need for well-defined, mathematically rigorous analysis approaches, especially since the vast majority of the currently available methods rely heavily on heuristics. We propose an analysis framework which stems from topology, the field of mathematics which studies properties preserved under continuous deformations. First, we calculate a barcode representation of the molecules employing computational topology algorithms. Bars in this barcode represent different topological features. Molecules are compared through their barcodes by statistically determining the difference in the set of their topological features. As a proof-of-principle application, we analyze a dataset compiled of ensembles of different proteins, obtained from the Ensemble Protein Database. We demonstrate that our approach correctly detects the different protein groupings.
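A short sketch of the pipeline described above, assuming the third-party ripser package for the barcode computation and a two-sample Kolmogorov-Smirnov test as one concrete way to statistically compare the sets of topological features; the point clouds below are synthetic stand-ins for protein configurations, and the statistical test used in the paper may differ.

```python
# Sketch: compare two conformational ensembles through their persistence barcodes.
# Requires the third-party packages ripser and scipy; point clouds are synthetic.
import numpy as np
from ripser import ripser
from scipy.stats import ks_2samp

rng = np.random.default_rng(0)

def bar_lengths(points, dim=1):
    dgm = ripser(points, maxdim=dim)["dgms"][dim]      # persistence diagram in dimension `dim`
    finite = dgm[np.isfinite(dgm[:, 1])]
    return finite[:, 1] - finite[:, 0]                 # bar lengths (persistence of features)

def ensemble_lengths(n_structures, noise):
    # Toy "ensemble": noisy 3D circles of 60 points, mimicking a recurring loop feature.
    lengths = []
    for _ in range(n_structures):
        t = rng.uniform(0, 2 * np.pi, 60)
        pts = np.c_[np.cos(t), np.sin(t), np.zeros_like(t)] + noise * rng.normal(size=(60, 3))
        lengths.append(bar_lengths(pts))
    return np.concatenate(lengths)

ens_a = ensemble_lengths(10, noise=0.05)               # tight ensemble
ens_b = ensemble_lengths(10, noise=0.25)               # floppier ensemble
stat, pval = ks_2samp(ens_a, ens_b)                    # do the feature sets differ?
print(f"KS statistic = {stat:.3f}, p-value = {pval:.2e}")
```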
How to know and choose online games: differences between current and potential players.
Teng, Ching-I; Lo, Shao-Kang; Wang, Pe-Cheng
2007-12-01
This study investigated how different adolescent players acquire game information and the criteria they use in choosing online games and found that (1) current players generally use comprehensive information sources more than potential players do; (2) current players rely on free trials and smooth display of motion graphics as choice criteria more than potential players do; (3) potential players rely on the look of advertisements more than current players do; (4) both current and potential players most likely use word-of-mouth and gaming programs on TV as information sources; and (5) endorser attractiveness is ranked the least important among six choice criteria by both current and potential players.
Clinical challenges in thyroid disease: Time for a new approach?
Juby, A G; Hanly, M G; Lukaczer, D
2016-05-01
Thyroid disease is common, and the prevalence is rising. Traditional diagnosis and monitoring rely on thyroid stimulating hormone (TSH) levels. This does not always result in improvement of hypothyroid symptoms, to the disappointment of both patients and physicians. A non-traditional therapeutic approach would include evaluation of GI function as well as a dietary history and micronutrient evaluation. This approach also includes assessment of thyroid peroxidase (TPO) antibodies, T3, T4, and reverse T3 levels, and in some cases may require specific T3 supplementation in addition to standard T4 therapy. Both high and low TSH levels on treatment are associated with particular medical risks: in the case of high TSH these are primarily cardiac, whereas for low TSH they predominantly concern bone health. This article discusses these important clinical issues in more detail, with some practical tips, especially for an approach to the "non-responders" to the current traditional therapeutic approach. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
Bambus 2: scaffolding metagenomes.
Koren, Sergey; Treangen, Todd J; Pop, Mihai
2011-11-01
Sequencing projects increasingly target samples from non-clonal sources. In particular, metagenomics has enabled scientists to begin to characterize the structure of microbial communities. The software tools developed for assembling and analyzing sequencing data for clonal organisms are, however, unable to adequately process data derived from non-clonal sources. We present a new scaffolder, Bambus 2, to address some of the challenges encountered when analyzing metagenomes. Our approach relies on a combination of a novel method for detecting genomic repeats and algorithms that analyze assembly graphs to identify biologically meaningful genomic variants. We compare our software to current assemblers using simulated and real data. We demonstrate that the repeat detection algorithms have higher sensitivity than current approaches without sacrificing specificity. In metagenomic datasets, the scaffolder avoids false joins between distantly related organisms while obtaining long-range contiguity. Bambus 2 represents a first step toward automated metagenomic assembly. Bambus 2 is open source and available from http://amos.sf.net. Contact: mpop@umiacs.umd.edu. Supplementary data are available at Bioinformatics online.
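As a rough illustration of graph-based repeat flagging (not the actual Bambus 2 algorithm), the sketch below marks contigs whose read coverage and graph degree are well above the assembly-wide medians, which is the intuition behind many repeat detectors; the contigs, coverages, and thresholds are invented.

```python
# Sketch: flag likely repeat contigs in an assembly graph by coverage and degree.
# This illustrates the intuition only; Bambus 2's actual repeat detection differs.
import statistics
import networkx as nx

g = nx.Graph()
# Invented contigs with read coverage; edges are mate-pair / overlap links.
contigs = {"c1": 30, "c2": 28, "c3": 31, "c4": 95, "c5": 29, "c6": 33}
g.add_nodes_from((name, {"cov": cov}) for name, cov in contigs.items())
g.add_edges_from([("c1", "c2"), ("c2", "c3"), ("c4", "c1"), ("c4", "c3"),
                  ("c4", "c5"), ("c4", "c6"), ("c5", "c6")])

median_cov = statistics.median(contigs.values())
median_deg = statistics.median(dict(g.degree()).values())

repeats = [n for n in g
           if g.nodes[n]["cov"] > 2.5 * median_cov and g.degree(n) > 1.5 * median_deg]
print("likely repeats:", repeats)   # c4: high coverage and many neighbors

# A scaffolder would set such nodes aside before ordering and orienting the rest.
```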
Bambus 2: scaffolding metagenomes
Koren, Sergey; Treangen, Todd J.; Pop, Mihai
2011-01-01
Motivation: Sequencing projects increasingly target samples from non-clonal sources. In particular, metagenomics has enabled scientists to begin to characterize the structure of microbial communities. The software tools developed for assembling and analyzing sequencing data for clonal organisms are, however, unable to adequately process data derived from non-clonal sources. Results: We present a new scaffolder, Bambus 2, to address some of the challenges encountered when analyzing metagenomes. Our approach relies on a combination of a novel method for detecting genomic repeats and algorithms that analyze assembly graphs to identify biologically meaningful genomic variants. We compare our software to current assemblers using simulated and real data. We demonstrate that the repeat detection algorithms have higher sensitivity than current approaches without sacrificing specificity. In metagenomic datasets, the scaffolder avoids false joins between distantly related organisms while obtaining long-range contiguity. Bambus 2 represents a first step toward automated metagenomic assembly. Availability: Bambus 2 is open source and available from http://amos.sf.net. Contact: mpop@umiacs.umd.edu Supplementary Information: Supplementary data are available at Bioinformatics online. PMID:21926123
Creating value in health care through big data: opportunities and policy implications.
Roski, Joachim; Bo-Linn, George W; Andrews, Timothy A
2014-07-01
Big data has the potential to create significant value in health care by improving outcomes while lowering costs. Big data's defining features include the ability to handle massive data volume and variety at high velocity. New, flexible, and easily expandable information technology (IT) infrastructure, including so-called data lakes and cloud data storage and management solutions, make big-data analytics possible. However, most health IT systems still rely on data warehouse structures. Without the right IT infrastructure, analytic tools, visualization approaches, work flows, and interfaces, the insights provided by big data are likely to be limited. Big data's success in creating value in the health care sector may require changes in current polices to balance the potential societal benefits of big-data approaches and the protection of patients' confidentiality. Other policy implications of using big data are that many current practices and policies related to data use, access, sharing, privacy, and stewardship need to be revised. Project HOPE—The People-to-People Health Foundation, Inc.
Support for linguistic macrofamilies from weighted sequence alignment
Jäger, Gerhard
2015-01-01
Computational phylogenetics is in the process of revolutionizing historical linguistics. Recent applications have shed new light on controversial issues, such as the location and time depth of language families and the dynamics of their spread. So far, these approaches have been limited to single-language families because they rely on a large body of expert cognacy judgments or grammatical classifications, which is currently unavailable for most language families. The present study pursues a different approach. Starting from raw phonetic transcription of core vocabulary items from very diverse languages, it applies weighted string alignment to track both phonetic and lexical change. Applied to a collection of ∼1,000 Eurasian languages and dialects, this method, combined with phylogenetic inference, leads to a classification in excellent agreement with established findings of historical linguistics. Furthermore, it provides strong statistical support for several putative macrofamilies contested in current historical linguistics. In particular, there is a solid signal for the Nostratic/Eurasiatic macrofamily. PMID:26403857
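To make "weighted string alignment" concrete, here is a small Needleman-Wunsch-style global alignment with a toy sound-similarity weight table; in the study the weights are estimated from data, whereas the symbols and scores below are invented for illustration.

```python
# Sketch: weighted global alignment of two phonetically transcribed words.
# The similarity weights are invented; in practice they are learned from the data.
def align(a, b, weights, gap=-1.0):
    n, m = len(a), len(b)
    score = [[0.0] * (m + 1) for _ in range(n + 1)]
    for i in range(1, n + 1):
        score[i][0] = i * gap
    for j in range(1, m + 1):
        score[0][j] = j * gap
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            w = weights.get((a[i - 1], b[j - 1]),
                            weights.get((b[j - 1], a[i - 1]), -0.5))   # default mismatch
            score[i][j] = max(score[i - 1][j - 1] + w,
                              score[i - 1][j] + gap,
                              score[i][j - 1] + gap)
    return score[n][m]

# Toy sound weights: identical symbols score high, related sounds score mildly.
w = {("h", "h"): 1.0, ("a", "a"): 1.0, ("n", "n"): 1.0, ("d", "d"): 1.0,
     ("t", "t"): 1.0, ("d", "t"): 0.6, ("a", "o"): 0.4}

print(align("hand", "hant", w))   # cognate-like pair: high similarity score
print(align("hand", "mano", w))   # unrelated-looking pair: much weaker signal
```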
A unified framework for mesh refinement in random and physical space
DOE Office of Scientific and Technical Information (OSTI.GOV)
Li, Jing; Stinis, Panos
In recent work we have shown how an accurate reduced model can be utilized to perform mesh refinement in random space. That work relied on the explicit knowledge of an accurate reduced model which is used to monitor the transfer of activity from the large to the small scales of the solution. Since this is not always available, we present in the current work a framework which shares the merits and basic idea of the previous approach but does not require an explicit knowledge of a reduced model. Moreover, the current framework can be applied for refinement in both random and physical space. In this manuscript we focus on the application to random space mesh refinement. We study examples of increasing difficulty (from ordinary to partial differential equations) which demonstrate the efficiency and versatility of our approach. We also provide some results from the application of the new framework to physical space mesh refinement.
Wallace, T.J.; Torre, T.; Grob, M.; Yu, J.; Avital, I.; Brücher, BLDM; Stojadinovic, A.; Man, Y.G.
2014-01-01
Prostate cancer is the most commonly diagnosed non-cutaneous neoplasm in men in the United States and the second leading cause of cancer mortality. One in 7 men will be diagnosed with prostate cancer during their lifetime. As a result, monitoring treatment response is of vital importance. The cornerstone of current approaches to monitoring treatment response remains the prostate-specific antigen (PSA). However, with the limitations of PSA come challenges in our ability to monitor treatment success. Defining PSA response differs depending on the individual treatment rendered, potentially making it difficult for those not trained in urologic oncology to understand. Furthermore, standard treatment response criteria do not apply to prostate cancer, further complicating the issue of treatment response. Historically, prostate cancer has been difficult to image, and no single modality has been consistently relied upon to measure treatment response. However, with newer imaging modalities and advances in our understanding and utilization of specific biomarkers, the future for monitoring treatment response in prostate cancer looks bright. PMID:24396494
Mahapatra, Dwarikanath; Schueffler, Peter; Tielbeek, Jeroen A W; Buhmann, Joachim M; Vos, Franciscus M
2013-10-01
Increasing incidence of Crohn's disease (CD) in the Western world has made its accurate diagnosis an important medical challenge. The current reference standard for diagnosis, colonoscopy, is time-consuming and invasive while magnetic resonance imaging (MRI) has emerged as the preferred noninvasive procedure over colonoscopy. Current MRI approaches assess rate of contrast enhancement and bowel wall thickness, and rely on extensive manual segmentation for accurate analysis. We propose a supervised learning method for the identification and localization of regions in abdominal magnetic resonance images that have been affected by CD. Low-level features like intensity and texture are used with shape asymmetry information to distinguish between diseased and normal regions. Particular emphasis is laid on a novel entropy-based shape asymmetry method and higher-order statistics like skewness and kurtosis. Multi-scale feature extraction renders the method robust. Experiments on real patient data show that our features achieve a high level of accuracy and perform better than two competing methods.
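A minimal sketch of the kind of low-level and higher-order statistical features mentioned (intensity statistics, skewness, kurtosis) is shown below, computed per candidate region with NumPy/SciPy. The feature set, the synthetic voxel values, and the omission of texture and shape-asymmetry terms are simplifications for illustration, not the method of the paper.

```python
import numpy as np
from scipy.stats import skew, kurtosis

# Illustrative per-region feature vector: mean/std intensity plus the
# higher-order statistics (skewness, kurtosis) mentioned in the abstract.
# Region extraction, texture, and shape-asymmetry features are omitted here.
def region_features(intensities):
    x = np.asarray(intensities, dtype=float).ravel()
    return np.array([x.mean(), x.std(), skew(x), kurtosis(x)])

rng = np.random.default_rng(0)
normal_region = rng.normal(100.0, 5.0, size=500)    # synthetic "normal" voxels
diseased_region = rng.gamma(2.0, 30.0, size=500)    # synthetic "enhancing" voxels
print(region_features(normal_region))
print(region_features(diseased_region))
```

Feature vectors of this form would then be fed to a supervised classifier trained on labelled regions.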
Enhancing the USDA Global Crop Assessment Decision Support System Using SMAP Soil Moisture Data
NASA Astrophysics Data System (ADS)
Bolten, J. D.; Mladenova, I. E.; Crow, W. T.; Reynolds, C. A.
2016-12-01
The Foreign Agricultural Service (FAS) is a subdivision of the U.S. Department of Agriculture (USDA) that is charged with providing information on current and expected crop supply and demand estimates. Knowledge of the amount of water in the root zone is an essential source of information for crop analysts, as it governs crop development and growth, which in turn determine the end-of-season yields. USDA FAS currently relies on root zone soil moisture (RZSM) estimates generated using the modified two-layer Palmer Model (PM). The PM is a simple water-balance hydrologic model that is driven by daily precipitation observations and minimum and maximum temperature data. These forcing data are based on ground meteorological station measurements from the World Meteorological Organization (WMO) and gridded weather data from the former U.S. Air Force Weather Agency (AFWA), currently called the U.S. Air Force 557th Weather Wing. The PM was extended by adding a data assimilation (DA) unit that provides the opportunity to routinely ingest satellite-based soil moisture observations. This allows us to adjust for precipitation-related inaccuracies and enhance the quality of the PM soil moisture estimates. The current operational DA system is based on a 1-D Ensemble Kalman Filter approach and relies on observations obtained from the Soil Moisture and Ocean Salinity (SMOS) mission. Our talk will demonstrate the value of assimilating two satellite products (i.e., a passive and an active product) and discuss work being done in preparation for ingesting soil moisture observations from the Soil Moisture Active Passive (SMAP) mission.
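A minimal sketch of a 1-D ensemble Kalman filter analysis step of the kind referred to is shown below: each ensemble member's modelled root-zone soil moisture is nudged toward a perturbed satellite observation. The ensemble size, moisture values, and error variances are illustrative only.

```python
import numpy as np

# One 1-D EnKF analysis step: update an ensemble of modelled soil-moisture
# states with a single satellite observation. All values are illustrative.
def enkf_update(ensemble, obs, obs_var, rng):
    ens = np.asarray(ensemble, dtype=float)
    p = ens.var(ddof=1)                      # forecast error variance
    k = p / (p + obs_var)                    # Kalman gain (scalar state, scalar obs)
    perturbed = obs + rng.normal(0.0, np.sqrt(obs_var), size=ens.shape)
    return ens + k * (perturbed - ens)

rng = np.random.default_rng(42)
forecast = rng.normal(0.20, 0.03, size=30)   # modelled volumetric soil moisture
analysis = enkf_update(forecast, obs=0.27, obs_var=0.02**2, rng=rng)
print(forecast.mean(), "->", analysis.mean())
```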
Dual-Frequency Piezoelectric Transducers for Contrast Enhanced Ultrasound Imaging
Martin, K. Heath; Lindsey, Brooks D.; Ma, Jianguo; Lee, Mike; Li, Sibo; Foster, F. Stuart; Jiang, Xiaoning; Dayton, Paul A.
2014-01-01
For many years, ultrasound has provided clinicians with an affordable and effective imaging tool for applications ranging from cardiology to obstetrics. Development of microbubble contrast agents over the past several decades has enabled ultrasound to distinguish between blood flow and surrounding tissue. Current clinical practices using microbubble contrast agents rely heavily on user training to evaluate degree of localized perfusion. Advances in separating the signals produced from contrast agents versus surrounding tissue backscatter provide unique opportunities for specialized sensors designed to image microbubbles with higher signal to noise and resolution than previously possible. In this review article, we describe the background principles and recent developments of ultrasound transducer technology for receiving signals produced by contrast agents while rejecting signals arising from soft tissue. This approach relies on transmitting at a low-frequency and receiving microbubble harmonic signals at frequencies many times higher than the transmitted frequency. Design and fabrication of dual-frequency transducers and the extension of recent developments in transducer technology for dual-frequency harmonic imaging are discussed. PMID:25375755
From Information Society to Knowledge Society: The Ontology Issue
NASA Astrophysics Data System (ADS)
Roche, Christophe
2002-09-01
The information society, virtual enterprises, and e-business rely more and more on communication and knowledge sharing between heterogeneous actors. But no communication is possible, and all the more so no co-operation or collaboration, if those actors do not share the same, or at least a compatible, meaning for the terms they use. Ontology, understood as an agreed vocabulary of common terms and meanings, is a solution to that problem. Nevertheless, although there is quite a lot of experience in using ontologies, several barriers still stand in the way of their real-world use. As a matter of fact, it is very difficult to build, reuse and share ontologies. We claim that the ontology problem requires a multidisciplinary approach based on sound epistemological, logical and linguistic principles. This article presents the Ontological Knowledge Station (OK Station©), a software environment for building and using ontologies which relies on such principles. The OK Station is currently being used in several industrial applications.
Some Aspects of Advanced Tokamak Modeling in DIII-D
NASA Astrophysics Data System (ADS)
St John, H. E.; Petty, C. C.; Murakami, M.; Kinsey, J. E.
2000-10-01
We extend previous work (M. Murakami et al., General Atomics Report GA-A23310 (1999)) on time-dependent DIII-D advanced tokamak simulations by introducing theoretical confinement models rather than relying on power-balance-derived transport coefficients. We explore using NBCD and off-axis ECCD together with a self-consistent aligned bootstrap current, driven by the internal transport barrier dynamics generated with the GLF23 confinement model, to shape the hollow current profile and to maintain MHD-stable conditions. Our theoretical modeling approach uses measured DIII-D initial conditions to start the simulations in a smooth, consistent manner. This mitigates the troublesome long-lived perturbations in the ohmic current profile that are normally caused by inconsistent initial data. To achieve this goal, our simulation uses a sequence of time-dependent eqdsks generated autonomously by the EFIT MHD equilibrium code in analyzing experimental data to supply the history for the simulation.
Model-Driven Useware Engineering
NASA Astrophysics Data System (ADS)
Meixner, Gerrit; Seissler, Marc; Breiner, Kai
User-oriented hardware and software development relies on a systematic development process based on a comprehensive analysis focusing on the users' requirements and preferences. Such a development process calls for the integration of numerous disciplines, from psychology and ergonomics to computer sciences and mechanical engineering. Hence, a correspondingly interdisciplinary team must be equipped with suitable software tools to allow it to handle the complexity of a multimodal and multi-device user interface development approach. An abstract, model-based development approach seems to be adequate for handling this complexity. This approach comprises different levels of abstraction requiring adequate tool support. Thus, in this chapter, we present the current state of our model-based software tool chain. We introduce the use model as the core model of our model-based process, transformation processes, and a model-based architecture, and we present different software tools that provide support for creating and maintaining the models or performing the necessary model transformations.
Bidirectional composition on lie groups for gradient-based image alignment.
Mégret, Rémi; Authesserre, Jean-Baptiste; Berthoumieu, Yannick
2010-09-01
In this paper, a new formulation based on bidirectional composition on Lie groups (BCL) for parametric gradient-based image alignment is presented. Contrary to conventional approaches, the BCL method takes advantage of the gradients of both the template and the current image without combining them a priori. Based on this bidirectional formulation, two methods are proposed and their relationship with state-of-the-art gradient-based approaches is fully discussed. The first one, i.e., the BCL method, relies on the compositional framework to provide the minimization of the compensated error with respect to an augmented parameter vector. The second one, the projected BCL (PBCL), corresponds to a close approximation of the BCL approach. A comparative study is carried out dealing with computational complexity, convergence rate, and frequency of convergence. Numerical experiments using a conventional benchmark show a performance improvement, especially for asymmetric levels of noise, which is also discussed from a theoretical point of view.
Social Isolation in Later Life: Extending the Conversation.
Weldrick, Rachel; Grenier, Amanda
2018-03-01
As Canada's population continues to age, social isolation among older people is a growing concern and national-level priority. Although much is known about individual-level risks and negative health outcomes associated with social isolation in later life, the impact of life course trajectories and the more collective experiences are seldom considered. Current definitions and program responses tend to rely on individualized approaches to social isolation. Here, we argue that the conversation be extended to consider the social and cultural aspects of social isolation among older people. Specifically, we suggest that definitions and approaches consider three particular dimensions: temporal factors, spatial factors, and the relationship between social isolation and exclusion. Doing so, we argue, would result in a more inclusive approach to social isolation in late life, and the development of capacity to address social isolation among a wide range of older people, particularly the needs of vulnerable or marginalized groups.
Agyei, Dominic; Tsopmo, Apollinaire; Udenigwe, Chibuike C
2018-06-01
There are emerging advancements in the strategies used for the discovery and development of food-derived bioactive peptides because of their multiple food and health applications. Bioinformatics and peptidomics are two computational and analytical techniques that have the potential to speed up the development of bioactive peptides from bench to market. Structure-activity relationships observed in peptides form the basis for bioinformatics and in silico prediction of bioactive sequences encrypted in food proteins. Peptidomics, on the other hand, relies on "hyphenated" (liquid chromatography-mass spectrometry-based) techniques for the detection, profiling, and quantitation of peptides. Together, bioinformatics and peptidomics approaches provide a low-cost and effective means of predicting, profiling, and screening bioactive protein hydrolysates and peptides from food. This article discusses the basis, strengths, and limitations of bioinformatics and peptidomics approaches currently used for the discovery and analysis of food-derived bioactive peptides.
MetaSort untangles metagenome assembly by reducing microbial community complexity
Ji, Peifeng; Zhang, Yanming; Wang, Jinfeng; Zhao, Fangqing
2017-01-01
Most current approaches to analyse metagenomic data rely on reference genomes. Novel microbial communities extend far beyond the coverage of reference databases and de novo metagenome assembly from complex microbial communities remains a great challenge. Here we present a novel experimental and bioinformatic framework, metaSort, for effective construction of bacterial genomes from metagenomic samples. MetaSort provides a sorted mini-metagenome approach based on flow cytometry and single-cell sequencing methodologies, and employs new computational algorithms to efficiently recover high-quality genomes from the sorted mini-metagenome, using the original metagenome as a complement. Through extensive evaluations, we demonstrated that metaSort has an excellent and unbiased performance on genome recovery and assembly. Furthermore, we applied metaSort to an unexplored microflora colonizing the surface of marine kelp and successfully recovered 75 high-quality genomes at one time. This approach will greatly improve access to microbial genomes from complex or novel communities. PMID:28112173
Unbiased approaches to biomarker discovery in neurodegenerative diseases
Chen-Plotkin, Alice S.
2014-01-01
Neurodegenerative diseases such as Alzheimer’s disease, Parkinson’s disease, amyotrophic lateral sclerosis, and frontotemporal dementia have several important features in common. They are progressive, they affect a relatively inaccessible organ, and we have no disease-modifying therapies for them. For these brain-based diseases, current diagnosis and evaluation of disease severity rely almost entirely on clinical examination, which may only be a rough approximation of disease state. Thus, the development of biomarkers – objective, relatively easily measured and precise indicators of pathogenic processes – could improve patient care and accelerate therapeutic discovery. Yet existing, rigorously tested neurodegenerative disease biomarkers are few, and even fewer biomarkers have translated into clinical use. To find new biomarkers for these diseases, an unbiased, high-throughput screening approach may be needed. In this review, I will describe the potential utility of such an approach to biomarker discovery, using Parkinson’s disease as a case example. PMID:25442938
Expanding the metabolic engineering toolbox with directed evolution.
Abatemarco, Joseph; Hill, Andrew; Alper, Hal S
2013-12-01
Cellular systems can be engineered into factories that produce high-value chemicals from renewable feedstock. Such an approach requires an expanded toolbox for metabolic engineering. Recently, protein engineering and directed evolution strategies have started to play a growing and critical role within metabolic engineering. This review focuses on the various ways in which directed evolution can be applied in conjunction with metabolic engineering to improve product yields. Specifically, we discuss the application of directed evolution on both catalytic and non-catalytic traits of enzymes, on regulatory elements, and on whole genomes in a metabolic engineering context. We demonstrate how the goals of metabolic pathway engineering can be achieved in part through evolving cellular parts as opposed to traditional approaches that rely on gene overexpression and deletion. Finally, we discuss the current limitations in screening technology that hinder the full implementation of a metabolic pathway-directed evolution approach. Copyright © 2013 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Trojan Horse Antibiotics—A Novel Way to Circumvent Gram-Negative Bacterial Resistance?
Tillotson, Glenn S.
2016-01-01
Antibiotic resistance has emerged as a major global health problem. In particular, gram-negative species pose a significant clinical challenge as bacteria develop or acquire more resistance mechanisms. Often, these bacteria possess multiple resistance mechanisms, thus nullifying most of the major classes of drugs. Novel approaches to this issue are urgently required. However, the challenges of developing new agents are immense. Introducing novel agents is fraught with hurdles, thus adapting known antibiotic classes by altering their chemical structure could be a way forward. A chemical addition to existing antibiotics known as a siderophore could be a solution to the gram-negative resistance issue. Siderophore molecules rely on the bacterial innate need for iron ions and thus can utilize a Trojan Horse approach to gain access to the bacterial cell. The current approaches to using this potential method are reviewed. PMID:27773991
Linking temporal medical records using non-protected health information data.
Bonomi, Luca; Jiang, Xiaoqian
2017-01-01
Modern medical research relies on multi-institutional collaborations which enhance knowledge discovery and data reuse. While these collaborations allow researchers to perform analytics otherwise impossible on individual datasets, they often pose significant challenges in the data integration process. Due to the lack of a unique identifier, data integration solutions often have to rely on patients' protected health information (PHI). In many situations, such information cannot leave the institutions or must be strictly protected. Furthermore, the presence of noisy values for these attributes may result in poor overall utility. While much research has been done to address these challenges, most of the current solutions are designed for a static setting without considering the temporal information of the data (e.g., EHRs). In this work, we propose a novel approach that uses non-PHI for linking patient longitudinal data. Specifically, our technique captures the diagnosis dependencies using patterns which are shown to provide important indications for linking patient records. Our solution can be used as a standalone technique to perform temporal record linkage using non-protected health information data, or it can be combined with Privacy Preserving Record Linkage (PPRL) solutions when protected health information is available. In this case, our approach can resolve ambiguities in the results. Experimental evaluations on real datasets demonstrate the effectiveness of our technique.
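To make the idea of linking on diagnosis dependencies concrete, the sketch below represents each longitudinal record as a set of ordered diagnosis-code transitions and scores candidate links by Jaccard similarity. This is a simplified stand-in for the pattern model described in the abstract, and all codes and patient identifiers are made up.

```python
# Simplified illustration: link longitudinal records by comparing their sets of
# ordered diagnosis transitions (a stand-in for the paper's dependency patterns).
def transitions(codes):
    return {(a, b) for a, b in zip(codes, codes[1:])}

def jaccard(s, t):
    return len(s & t) / len(s | t) if s | t else 0.0

# Hypothetical diagnosis sequences from two institutions (codes are made up).
record_a = ["E11", "I10", "N18", "I50"]
candidates = {"p1": ["E11", "I10", "N18", "I50", "I48"],
              "p2": ["J45", "J30", "J45"]}

scores = {pid: jaccard(transitions(record_a), transitions(seq))
          for pid, seq in candidates.items()}
print(max(scores, key=scores.get), scores)
```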
Evaluating for impact: what type of data can assist a health promoting school approach?
Joyce, Andrew; Dabrowski, Anna; Aston, Ruth; Carey, Gemma
2017-04-01
There is debate within the health promoting school (HPS) movement on whether schools should monitor health behaviour outcomes as part of an evaluation or rely more on process type measures, such as changes to school policies and the physical and social environment which yield information about (in)effective implementation. The debate is often framed around ideological considerations of the role of schools and there is little empirical work on how these indicators of effective implementation can influence change at a policy and practice level in real world settings. Information has potentially powerful effects in motivating a change process, but this will vary according to the type of information and the type of organizational culture into which it is presented. The current predominant model relies on process data, policy and environmental audit monitoring and benchmarking approaches, and there is little evidence of whether this engages school communities. Theoretical assertions on the importance of monitoring data to motivate change need to be empirically tested and, in doing so, we can learn which types of data influence adoption of HPS in which types of school and policy contexts. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
Two Surface Temperature Retrieval Methods Compared Over Agricultural Lands
NASA Technical Reports Server (NTRS)
French, Andrew N.; Schmugge, Thomas J.; Jacob, Frederic; Ogawa, Kenta; Houser, Paul R. (Technical Monitor)
2002-01-01
Accurate, spatially distributed surface temperatures are required for modeling evapotranspiration (ET) over agricultural fields under wide ranging conditions, including stressed and unstressed vegetation. Modeling approaches that use surface temperature observations, however, have the burden of estimating surface emissivities. Emissivity estimation, the subject of much recent research, is facilitated by observations in multiple thermal infrared bands. But it is nevertheless a difficult task. Using observations from a multiband thermal sensor, the Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER), estimated surface emissivities and temperatures are retrieved in two different ways: the temperature emissivity separation approach (TES) and the normalized emissivity approach (NEM). Both rely upon empirical relationships, but the assumed relationships are different. TES relies upon a relationship between the minimum spectral emissivity and the range of observed emissivities. NEM relies upon an assumption that at least one thermal band has a pre-determined emissivity (close to 1.0). The benefits and consequences of each approach will be demonstrated for two different landscapes: one in central Oklahoma, USA and another in southern New Mexico.
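A minimal sketch of the NEM step described (assume one band has a fixed, near-unity emissivity, invert the Planck function for temperature, then back out the other bands' emissivities) is given below. The wavelengths and radiances are illustrative, and atmospheric correction is ignored.

```python
import numpy as np

C1 = 1.191042e8    # W um^4 m^-2 sr^-1 (2*h*c^2, wavelength in micrometres)
C2 = 1.4387752e4   # um K            (h*c/k_B)

def planck_rad(wl_um, T):
    """Blackbody spectral radiance (W m^-2 sr^-1 um^-1)."""
    return C1 / (wl_um**5 * (np.exp(C2 / (wl_um * T)) - 1.0))

def inv_planck(wl_um, L):
    """Temperature giving spectral radiance L at wavelength wl_um."""
    return C2 / (wl_um * np.log(C1 / (wl_um**5 * L) + 1.0))

def nem(wavelengths_um, radiances, eps_assumed=0.99):
    # Per-band temperature under the assumed near-unity emissivity,
    # then take the maximum as the surface temperature (NEM step).
    T_bands = inv_planck(wavelengths_um, radiances / eps_assumed)
    T_s = T_bands.max()
    # Back out band emissivities consistent with that temperature.
    eps = radiances / planck_rad(wavelengths_um, T_s)
    return T_s, eps

# Illustrative ASTER-like thermal bands (um) and synthetic surface-leaving radiances.
wl = np.array([8.3, 8.65, 9.1, 10.6, 11.3])
true_T, true_eps = 300.0, np.array([0.96, 0.97, 0.95, 0.99, 0.98])
L_obs = true_eps * planck_rad(wl, true_T)
print(nem(wl, L_obs))
```

TES differs in that, after a similar first guess, it constrains the emissivity spectrum through an empirical relation between the minimum emissivity and the spectral contrast rather than fixing one band's value.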
Surgical Management of Perineural Spread of Head and Neck Cancers.
Solares, C Arturo; Mason, Eric; Panizza, Benedict J
2016-04-01
The surgical management of perineural spread of head and neck cancers has become an integral part in the contemporary treatment of this pathology. We now understand that tumour spreads within the epineurium and in a continuous fashion. We also can rely on the accuracy of magnetic resonance neurography in detecting and defining the extent of disease. With modern skull base techniques and a greater understanding of the anatomy in this region, specific operations can be designed to help eradicate disease. We review the current approaches and techniques used that enable us to better obtain tumour free margins and hence improve survival.
An Adaptive Cross-Architecture Combination Method for Graph Traversal
DOE Office of Scientific and Technical Information (OSTI.GOV)
You, Yang; Song, Shuaiwen; Kerbyson, Darren J.
2014-06-18
Breadth-First Search (BFS) is widely used in many real-world applications including computational biology, social networks, and electronic design automation. The combination method, using both top-down and bottom-up techniques, is the most effective BFS approach. However, current combination methods rely on trial-and-error and exhaustive search to locate the optimal switching point, which may cause significant runtime overhead. To solve this problem, we design an adaptive method based on regression analysis to predict an optimal switching point for the combination method at runtime within less than 0.1% of the BFS execution time.
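The top-down/bottom-up combination can be sketched as follows: expand the frontier top-down while it is small, and switch to bottom-up (scanning undiscovered vertices for any neighbour in the frontier) once the frontier grows past a threshold. The fixed threshold here is a placeholder for the regression-based predictor proposed in the paper, and the graph is a toy example.

```python
# Direction-optimizing BFS sketch: top-down while the frontier is small,
# bottom-up once it grows past a threshold. The fixed `alpha` switch is a
# placeholder for the regression-based predictor described in the abstract.
def hybrid_bfs(adj, source, alpha=0.05):
    n = len(adj)
    dist = {source: 0}
    frontier = {source}
    while frontier:
        level = dist[next(iter(frontier))]
        if len(frontier) < alpha * n:                       # top-down step
            nxt = {v for u in frontier for v in adj[u] if v not in dist}
        else:                                               # bottom-up step
            nxt = {v for v in range(n) if v not in dist
                   and any(u in frontier for u in adj[v])}
        for v in nxt:
            dist[v] = level + 1
        frontier = nxt
    return dist

# Small undirected example graph as adjacency lists.
adj = {0: [1, 2], 1: [0, 3], 2: [0, 3], 3: [1, 2, 4], 4: [3]}
print(hybrid_bfs(adj, 0))
```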
A New Approach of Designing Superalloys for Low Density
NASA Technical Reports Server (NTRS)
MacKay, Rebecca A.; Gabb, Timothy P.; Smialek, James L.; Nathal, Michael V.
2010-01-01
New low-density single-crystal (LDS) alloys have been developed for turbine blade applications, which have the potential for significant improvements in the thrust-to-weight ratio over current production superalloys. An innovative alloying strategy was used to achieve alloy density reductions, high-temperature creep resistance, microstructural stability, and cyclic oxidation resistance. The alloy design relies on molybdenum as a potent, lower-density solid-solution strengthener in the nickel-based superalloy. Low alloy density was also achieved with modest rhenium levels and the absence of tungsten. Microstructural, physical, mechanical, and environmental testing demonstrated the feasibility of this new LDS superalloy design.
Robust image matching via ORB feature and VFC for mismatch removal
NASA Astrophysics Data System (ADS)
Ma, Tao; Fu, Wenxing; Fang, Bin; Hu, Fangyu; Quan, Siwen; Ma, Jie
2018-03-01
Image matching is at the base of many image processing and computer vision problems, such as object recognition or structure from motion. Current methods rely on good feature descriptors and mismatch-removal strategies for detection and matching. In this paper, we propose a robust image matching approach based on ORB features and VFC for mismatch removal. ORB (Oriented FAST and Rotated BRIEF) is an outstanding feature descriptor; it offers performance comparable to SIFT at lower cost. VFC (Vector Field Consensus) is a state-of-the-art mismatch-removal method. The experimental results demonstrate that our method is efficient and robust.
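A minimal OpenCV sketch of the detection-and-matching half of such a pipeline is shown below. Since VFC is not available in OpenCV, RANSAC-based homography estimation is used here as a stand-in mismatch-removal step, and the image file names are placeholders.

```python
import cv2
import numpy as np

# ORB feature matching with a mismatch-removal step. RANSAC homography
# filtering stands in for VFC, which is not part of OpenCV.
img1 = cv2.imread("scene_a.png", cv2.IMREAD_GRAYSCALE)   # placeholder file names
img2 = cv2.imread("scene_b.png", cv2.IMREAD_GRAYSCALE)

orb = cv2.ORB_create(nfeatures=2000)
kp1, des1 = orb.detectAndCompute(img1, None)
kp2, des2 = orb.detectAndCompute(img2, None)

matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
matches = sorted(matcher.match(des1, des2), key=lambda m: m.distance)

src = np.float32([kp1[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
dst = np.float32([kp2[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)
H, mask = cv2.findHomography(src, dst, cv2.RANSAC, 3.0)
inliers = [m for m, keep in zip(matches, mask.ravel()) if keep]
print(len(matches), "raw matches,", len(inliers), "after mismatch removal")
```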
Broadening the interface bandwidth in simulation based training
NASA Technical Reports Server (NTRS)
Somers, Larry E.
1989-01-01
Currently most computer based simulations rely exclusively on computer generated graphics to create the simulation. When training is involved, the method almost exclusively used to display information to the learner is text displayed on the cathode ray tube. MICROEXPERT Systems is concentrating on broadening the communications bandwidth between the computer and user by employing a novel approach to video image storage combined with sound and voice output. An expert system is used to combine and control the presentation of analog video, sound, and voice output with computer based graphics and text. Researchers are currently involved in the development of several graphics based user interfaces for NASA, the U.S. Army, and the U.S. Navy. Here, the focus is on the human factors considerations, software modules, and hardware components being used to develop these interfaces.
Calculation of AC loss in two-layer superconducting cable with equal currents in the layers
NASA Astrophysics Data System (ADS)
Erdogan, Muzaffer
2016-12-01
A new method for calculating the AC loss of two-layer superconducting power transmission cables with the commercial software Comsol Multiphysics, relying on the assumption of equal partition of the current between the layers, is proposed. Applying the method to a cable composed of two coaxial cylindrical superconducting tubes, the results are in good agreement with the analytical results of the duoblock model. The method is also applied to a cable composed of a cylindrical copper former surrounded by two coaxial cylindrical layers of superconducting tapes embedded in an insulating medium, and the AC losses for tape-on-tape and tape-on-gap configurations are compared. A good agreement between the duoblock model and the numerical results for the tape-on-gap cable is observed.
Biological fabrication of cellulose fibers with tailored properties
NASA Astrophysics Data System (ADS)
Natalio, Filipe; Fuchs, Regina; Cohen, Sidney R.; Leitus, Gregory; Fritz-Popovski, Gerhard; Paris, Oskar; Kappl, Michael; Butt, Hans-Jürgen
2017-09-01
Cotton is a promising basis for wearable smart textiles. Current approaches that rely on fiber coatings suffer from function loss during wear. We present an approach that allows biological incorporation of exogenous molecules into cotton fibers to tailor the material’s functionality. In vitro model cultures of upland cotton (Gossypium hirsutum) are incubated with 6-carboxyfluorescein-glucose and dysprosium-1,4,7,10-tetraazacyclododecane-1,4,7,10-tetraacetic acid-glucose, where the glucose moiety acts as a carrier capable of traveling from the vascular connection to the outermost cell layer of the ovule epidermis, becoming incorporated into the cellulose fibers. This yields fibers with unnatural properties such as fluorescence or magnetism. Combining biological systems with the appropriate molecular design offers numerous possibilities to grow functional composite materials and implements a material-farming concept.
Design and manufacturing challenges of optogenetic neural interfaces: a review
NASA Astrophysics Data System (ADS)
Goncalves, S. B.; Ribeiro, J. F.; Silva, A. F.; Costa, R. M.; Correia, J. H.
2017-08-01
Optogenetics is a relatively new technology to achieve cell-type specific neuromodulation with millisecond-scale temporal precision. Optogenetic tools are being developed to address neuroscience challenges, and to improve the knowledge about brain networks, with the ultimate aim of catalyzing new treatments for brain disorders and diseases. To reach this ambitious goal the implementation of mature and reliable engineered tools is required. The success of optogenetics relies on optical tools that can deliver light into the neural tissue. Objective/Approach: Here, the design and manufacturing approaches available to the scientific community are reviewed, and current challenges to accomplish appropriate scalable, multimodal and wireless optical devices are discussed. Significance: Overall, this review aims at presenting a helpful guidance to the engineering and design of optical microsystems for optogenetic applications.
Personality Diagnosis for Personalized eHealth Services
NASA Astrophysics Data System (ADS)
Cortellese, Fabio; Nalin, Marco; Morandi, Angelica; Sanna, Alberto; Grasso, Floriana
In this paper we present two different approaches to personality diagnosis, for the provision of innovative personalized services, as used in a case study where diabetic patients were supported in the improvement of physical activity in their daily life. The first approach presented relies on a static clustering of the population, with a specific motivation strategy designed for each cluster. The second approach relies on a dynamic population clustering, making use of recommendation systems and algorithms, like Collaborative Filtering. We discuss the pros and cons of each approach and a possible combination of the two, as the most promising solution for this and other personalization services in eHealth.
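For the dynamic-clustering variant, a user-based collaborative filtering step can be sketched as below: cosine similarity between patients' activity profiles is used to recommend the motivation strategy that worked for similar patients. The profile matrix and strategy labels are entirely hypothetical.

```python
import numpy as np

# User-based collaborative filtering sketch: recommend for a new patient the
# motivation strategy that worked best for the most similar existing patients.
# The activity-profile matrix and strategy outcomes are entirely hypothetical.
profiles = np.array([[5, 1, 0, 2],     # patient A: weekly counts of 4 activity types
                     [4, 0, 1, 3],     # patient B
                     [0, 4, 5, 1]])    # patient C
strategy_scores = np.array([[0.9, 0.2],   # how well strategies 1 and 2 worked
                            [0.8, 0.3],
                            [0.1, 0.9]])

def cosine(u, v):
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

new_patient = np.array([5, 0, 1, 2])
weights = np.array([cosine(new_patient, p) for p in profiles])
predicted = weights @ strategy_scores / weights.sum()   # weighted average of outcomes
print("predicted strategy scores:", predicted, "-> pick strategy", predicted.argmax() + 1)
```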
CABINS: Case-based interactive scheduler
NASA Technical Reports Server (NTRS)
Miyashita, Kazuo; Sycara, Katia
1992-01-01
In this paper we discuss the need for interactive factory schedule repair and improvement, and we identify case-based reasoning (CBR) as an appropriate methodology. Case-based reasoning is the problem-solving paradigm that relies on a memory of past problem-solving experiences (cases) to guide current problem solving. Cases similar to the current case are retrieved from the case memory, and similarities and differences of the current case to past cases are identified. Then a best case is selected, and its repair plan is adapted to fit the current problem description. If a repair solution fails, an explanation for the failure is stored along with the case in memory, so that the user can avoid repeating similar failures in the future. So far we have identified a number of repair strategies and tactics for factory scheduling and have implemented a part of our approach in a prototype system, called CABINS. As future work, we are going to scale up CABINS to evaluate its usefulness in a real manufacturing environment.
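The retrieve-and-reuse loop described can be sketched as a nearest-case lookup over simple schedule-repair cases; the feature encoding and repair tactics below are illustrative placeholders, not the CABINS representation.

```python
import numpy as np

# Toy case-based retrieval for schedule repair: find the stored case whose
# feature vector is closest to the current problem and reuse its repair tactic.
# Features and tactics are illustrative placeholders, not CABINS's own.
case_base = [
    {"features": np.array([0.8, 0.1, 3.0]), "repair": "swap_adjacent_jobs"},
    {"features": np.array([0.2, 0.9, 1.0]), "repair": "move_to_alternate_machine"},
    {"features": np.array([0.5, 0.5, 5.0]), "repair": "delay_low_priority_job"},
]

def retrieve(current, cases):
    dists = [np.linalg.norm(current - c["features"]) for c in cases]
    return cases[int(np.argmin(dists))]

current_problem = np.array([0.75, 0.2, 2.5])   # e.g. tardiness, idle time, queue length
best = retrieve(current_problem, case_base)
print("reuse repair tactic:", best["repair"])
```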
DOE Office of Scientific and Technical Information (OSTI.GOV)
Faulconer, D.W
2004-03-15
Certain devices aimed at magnetic confinement of thermonuclear plasma rely on the steady flow of an electric current in the plasma. In view of the dominant place it occupies in both the world magnetic-confinement fusion effort and the author's own activity, the tokamak toroidal configuration is selected as prototype for discussing the question of how such a current can be maintained. Tokamaks require a stationary toroidal plasma current, this being traditionally provided by a pulsed magnetic induction which drives the plasma ring as the secondary of a transformer. Since this mechanism is essentially transient, and steady-state fusion reactor operation has manifold advantages, significant effort is now devoted to developing alternate steady-state means of generating toroidal current. These methods are classed under the global heading of 'noninductive current drive' or simply 'current drive', generally, though not exclusively, employing the injection of waves and/or toroidally directed particle beams. In what follows we highlight the physical mechanisms underlying surprisingly various approaches to driving current in a tokamak, downplaying a number of practical and technical issues. When a significant data base exists for a given method, its experimental current drive efficiency and future prospects are detailed.
Extraction of sandy bedforms features through geodesic morphometry
NASA Astrophysics Data System (ADS)
Debese, Nathalie; Jacq, Jean-José; Garlan, Thierry
2016-09-01
State-of-the-art echosounders reveal fine-scale details of mobile sandy bedforms, which are commonly found on continental shelves. At present, their dynamics are still far from being completely understood. These bedforms are a serious threat to navigation security, anthropic structures and activities, placing emphasis on research breakthroughs. Bedform geometries and their dynamics are closely linked; therefore, one approach is to develop semi-automatic tools aiming at extracting their structural features from bathymetric datasets. Current approaches mimic manual processes or rely on morphological simplification of bedforms. The 1D and 2D approaches cannot address the wide ranges of both types and complexities of bedforms. In contrast, this work attempts to follow a 3D global semi-automatic approach based on a bathymetric TIN. The currently extracted primitives are the salient ridge and valley lines of the sand structures, i.e., waves and mega-ripples. The main difficulty is eliminating the ripples that are found to heavily overprint any observations. To this end, an anisotropic filter that is able to discard these structures while still enhancing the wave ridges is proposed. The second part of the work addresses the semi-automatic interactive extraction and 3D augmented display of the main line structures. The proposed protocol also allows geoscientists to interactively insert topological constraints.
Propulsion Trade Studies for Spacecraft Swarm Mission Design
NASA Technical Reports Server (NTRS)
Dono, Andres; Plice, Laura; Mueting, Joel; Conn, Tracie; Ho, Michael
2018-01-01
Spacecraft swarms constitute a challenge from an orbital mechanics standpoint. Traditional mission design involves the application of methodical processes where predefined maneuvers for an individual spacecraft are planned in advance. This approach does not scale to spacecraft swarms consisting of many satellites orbiting in close proximity; non-deterministic maneuvers cannot be preplanned due to the large number of units and the uncertainties associated with their differential deployment and orbital motion. For autonomous small-sat swarms in LEO, we investigate two approaches for controlling the relative motion of a swarm. The first method involves modified miniature phasing maneuvers, where maneuvers are prescribed that cancel the differential delta-V of each CubeSat's deployment vector. The second method relies on artificial potential functions (APFs) to contain the spacecraft within a volumetric boundary and avoid collisions. Performance results and required delta-V budgets are summarized, indicating that each method has advantages and drawbacks for particular applications. The mini phasing maneuvers are more predictable and sustainable. The APF approach provides a more responsive and distributed performance, but at considerable propellant cost. After considering current state-of-the-art CubeSat propulsion systems, we conclude that the first approach is feasible, but the modified APF method requires too much control authority to be enabled by current propulsion systems.
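A minimal sketch of the APF idea (attraction toward a reference point to keep the swarm bounded, plus pairwise repulsion for collision avoidance) is shown below. The gains, safety distance, and point-mass dynamics are illustrative only and ignore orbital mechanics and delta-V constraints.

```python
import numpy as np

# Artificial potential function sketch for a swarm of point masses:
# attraction toward a reference point bounds the swarm, pairwise repulsion
# inside a safety radius avoids collisions. Purely illustrative dynamics.
def apf_accel(positions, k_att=0.1, k_rep=1.0, d_safe=5.0, ref=np.zeros(3)):
    acc = -k_att * (positions - ref)                 # attractive term
    for i in range(len(positions)):
        for j in range(len(positions)):
            if i == j:
                continue
            d = positions[i] - positions[j]
            r = np.linalg.norm(d)
            if r < d_safe:                           # repulsive term inside safety radius
                acc[i] += k_rep * (1.0 / r - 1.0 / d_safe) * d / r**3
    return acc

rng = np.random.default_rng(1)
pos = rng.normal(0.0, 10.0, size=(6, 3))             # six spacecraft, metres from reference
vel = np.zeros_like(pos)
for _ in range(200):                                  # crude semi-implicit integration, dt = 1 s
    vel += apf_accel(pos) * 1.0
    pos += vel * 1.0
print("max distance from reference:", np.linalg.norm(pos, axis=1).max())
```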
Genetic testing for Lynch syndrome: family communication and motivation.
Leenen, Celine H M; Heijer, Mariska den; van der Meer, Conny; Kuipers, Ernst J; van Leerdam, Monique E; Wagner, Anja
2016-01-01
Current genetic counselling practice for Lynch syndrome (LS) relies on diagnosed index patients to inform their biological family about LS, referred to as the family-mediated approach. The objective of this study was to evaluate this approach and to identify factors influencing the uptake of genetic testing for LS. In 59 mutation carriers, 70 non-carriers, and 16 non-tested relatives, socio-demographic characteristics, family communication regarding LS, experiences and attitudes towards the family-mediated approach, and motivations for genetic testing were assessed. The majority of all respondents (73 %) were satisfied with the family-mediated approach. Nevertheless, 59 % of the respondents experienced informing a family member, and 57 % being informed by a family member, as burdensome. Non-tested respondents differed from tested respondents in that they were younger, less closely related to the index patient, and a lower proportion had children. The most important reasons for declining genetic testing were (1) anticipating problems with life insurance and mortgage, (2) being content with life as it is, and (3) not experiencing any physical complaints. In conclusion, the majority of respondents consider the current family-mediated information procedure acceptable, although the provision of information on LS by relatives may be burdensome. Special attention should be paid to communication of LS to more distant relatives.
Luchins, Daniel
2012-01-01
The quality improvement model currently used in medicine and mental health was adopted from industry, where it developed out of early 20th-century efforts to apply a positivist/quantitative agenda to improving manufacturing. This article questions the application of this model to mental health care. It argues that (1) developing "operational definitions" for something as value-laden as "quality" risks conflating two realms, what we measure with what we value; (2) when measurements that are tied to individuals are aggregated to establish benchmarks and goals, unwarranted mathematical assumptions are made; (3) choosing clinical outcomes is problematic; (4) there is little relationship between process measures and clinical outcomes; and (5) since changes in quality indices do not relate to improved clinical care, management's reliance on such indices provides an illusory sense of control. An alternative model is the older, skill-based/qualitative approach to knowing, which relies on "implicit/ expert" knowledge. These two approaches offer a series of contrasts: quality versus excellence, competence versus expertise, management versus leadership, extrinsic versus intrinsic rewards. The article concludes that we need not totally dispense with the current quality improvement model, but rather should balance quantitative efforts with the older qualitative approach in a mixed methods model.
Activity-based exploitation of Full Motion Video (FMV)
NASA Astrophysics Data System (ADS)
Kant, Shashi
2012-06-01
Video has been a game-changer in how US forces are able to find, track and defeat their adversaries. With millions of minutes of video being generated from an increasing number of sensor platforms, the DOD has stated that the rapid increase in video is overwhelming its analysts. The manpower required to view and garner useable information from the flood of video is unaffordable, especially in light of current fiscal restraints. "Search" within full-motion video has traditionally relied on human tagging of content and on video metadata to provide filtering and locate segments of interest in the context of an analyst query. We utilize a novel machine-vision-based approach to index FMV, using object recognition and tracking, and event and activity detection. This approach enables FMV exploitation in real time, as well as a forensic look-back within archives. It can help get the most information out of video sensor collection, focus the attention of overburdened analysts, form connections in activity over time, and conserve national fiscal resources in exploiting FMV.
Ramsingh, Brigit
2014-07-01
Following the Second World War, the Food and Agriculture Organization (FAO) and the World Health Organization (WHO) teamed up to construct an International Codex Alimentarius (or 'food code') which emerged in 1963. The Codex Committee on Food Hygiene (CCFH) was charged with the task of developing microbial hygiene standards, although it found itself embroiled in debate with the WHO over the nature these standards should take. The WHO was increasingly relying upon the input of biometricians and especially the International Commission on Microbial Specifications for Foods (ICMSF) which had developed statistical sampling plans for determining the microbial counts in the final end products. The CCFH, however, was initially more focused on a qualitative approach which looked at the entire food production system and developed codes of practice as well as more descriptive end-product specifications which the WHO argued were 'not scientifically correct'. Drawing upon historical archival material (correspondence and reports) from the WHO and FAO, this article examines this debate over microbial hygiene standards and suggests that there are many lessons from history which could shed light upon current debates and efforts in international food safety management systems and approaches.
Novel anti-microbial therapies for dental plaque-related diseases.
Allaker, Robert P; Douglas, C W Ian
2009-01-01
Control of dental plaque-related diseases has traditionally relied on non-specific removal of plaque by mechanical means. As our knowledge of oral disease mechanisms increases, future treatment is likely to be more targeted, for example at small groups of organisms, single species or at key virulence factors they produce. The aim of this review is to consider the current status as regards novel treatment approaches. Maintenance of oral hygiene often includes use of chemical agents; however, increasing problems of resistance to synthetic antimicrobials have encouraged the search for alternative natural products. Plants are the source of more than 25% of prescription and over-the-counter preparations, and the potential of natural agents for oral prophylaxis will therefore be considered. Targeted approaches may be directed at the black-pigmented anaerobes associated with periodontitis. Such pigments provide an opportunity for targeted phototherapy with high-intensity monochromatic light. Studies to date have demonstrated selective killing of Porphyromonas gingivalis and Prevotella intermedia in biofilms. Functional inhibition approaches, including the use of protease inhibitors, are also being explored to control periodontitis. Replacement therapy by which a resident pathogen is replaced with a non-pathogenic bacteriocin-producing variant is currently under development with respect to Streptococcus mutans and dental caries.
Rosenberg, Lena; Nygård, Louise
2017-12-01
Most research on learning in the field of dementia has studied teaching approaches, while little is known about learning as experienced and enacted by people with dementia. The aim was to explore the lived experience of learning and maintaining knowledge related to technology among people with mild to moderate stage dementia. Seven persons with dementia were interviewed in-depth, and data were analyzed with a phenomenological approach. The participants positioned themselves on a continuum from 'Updating and expanding is not for me' to 'Updating and expanding is really for me'. They used different ways of learning in their everyday life - relying on their habituated repertoire of actions, on other people or on technology itself, or belonging to a learning context. We have much to gain from a better understanding of how people with dementia strive to learn and maintain their skills and knowledge related to technology. This is particularly important as they seem to use other approaches than those employed in current teaching methods. The necessity of learning stands out particularly when it comes to the interaction with the current multitude and ever-changing designs of technologies, including assistive technologies developed specifically to support people with dementia.
NASA Astrophysics Data System (ADS)
Harris, Courtney K.; Wiberg, Patricia L.
1997-09-01
Modeling shelf sediment transport rates and bed reworking depths is problematic when the wave and current forcing conditions are not precisely known, as is usually the case when long-term sedimentation patterns are of interest. Two approaches to modeling sediment transport under such circumstances are considered. The first relies on measured or simulated time series of flow conditions to drive model calculations. The second approach uses as model input probability distribution functions of bottom boundary layer flow conditions developed from wave and current measurements. Sediment transport rates, frequency of bed resuspension by waves and currents, and bed reworking calculated using the two methods are compared at the mid-shelf STRESS (Sediment TRansport on Shelves and Slopes) site on the northern California continental shelf. Current, wave and resuspension measurements at the site are used to generate model inputs and test model results. An 11-year record of bottom wave orbital velocity, calculated from surface wave spectra measured by the National Data Buoy Center (NDBC) Buoy 46013 and verified against bottom tripod measurements, is used to characterize the frequency and duration of wave-driven transport events and to estimate the joint probability distribution of wave orbital velocity and period. A 109-day record of hourly current measurements 10 m above bottom is used to estimate the probability distribution of bottom boundary layer current velocity at this site and to develop an auto-regressive model to simulate current velocities for times when direct measurements of currents are not available. Frequency of transport, the maximum volume of suspended sediment, and average flux calculated using measured wave and simulated current time series agree well with values calculated using measured time series. A probabilistic approach is more amenable to calculations over time scales longer than existing wave records, but it tends to underestimate net transport because it does not capture the episodic nature of transport events. Both methods enable estimates to be made of the uncertainty in transport quantities that arise from an incomplete knowledge of the specific timing of wave and current conditions. 1997 Elsevier Science Ltd
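The auto-regressive simulation of near-bottom currents mentioned can be sketched as a first-order AR model fitted to a measured speed series and then used to generate synthetic records of arbitrary length. The "measured" series, the AR order, and all parameters below are illustrative only.

```python
import numpy as np

# Fit a first-order auto-regressive (AR(1)) model to a measured current-speed
# series and simulate a synthetic record. The "measured" series here is
# synthetic and the AR order is illustrative.
rng = np.random.default_rng(7)
measured = 0.15 + 0.05 * np.sin(np.arange(2616) * 2 * np.pi / 12.4) \
           + rng.normal(0.0, 0.02, 2616)              # hourly speeds (m/s), ~109 days

x = measured - measured.mean()
phi = (x[1:] @ x[:-1]) / (x[:-1] @ x[:-1])            # lag-1 AR coefficient
sigma_e = np.sqrt(np.var(x[1:] - phi * x[:-1]))       # innovation standard deviation

def simulate(n, phi, sigma_e, mean, rng):
    out = np.empty(n)
    out[0] = mean
    for t in range(1, n):
        out[t] = mean + phi * (out[t - 1] - mean) + rng.normal(0.0, sigma_e)
    return out

synthetic = simulate(24 * 365, phi, sigma_e, measured.mean(), rng)
print("AR(1) phi =", round(phi, 3), "| simulated mean speed =", round(synthetic.mean(), 3))
```

A probabilistic calculation would then combine such simulated current series with the joint distribution of wave orbital velocity and period to estimate transport statistics.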
Subramanian, Savitha; Naimoli, Joseph; Matsubayashi, Toru; Peters, David H
2011-12-14
There is widespread agreement on the need for scaling up in the health sector to achieve the Millennium Development Goals (MDGs). But many countries are not on track to reach the MDG targets. The dominant approach used by global health initiatives promotes uniform interventions and targets, assuming that specific technical interventions tested in one country can be replicated across countries to rapidly expand coverage. Yet countries scale up health services and progress against the MDGs at very different rates. Global health initiatives need to take advantage of what has been learned about scaling up. A systematic literature review was conducted to identify conceptual models for scaling up health in developing countries, with the articles assessed according to the practical concerns of how to scale up, including the planning, monitoring and implementation approaches. We identified six conceptual models for scaling up in health based on experience with expanding pilot projects and diffusion of innovations. They place importance on paying attention to enhancing organizational, functional, and political capabilities through experimentation and adaptation of strategies in addition to increasing the coverage and range of health services. These scaling up approaches focus on fostering sustainable institutions and the constructive engagement between end users and the provider and financing organizations. The current approaches to scaling up health services to reach the MDGs are overly simplistic and not working adequately. Rather than relying on blueprint planning and raising funds, an approach characteristic of current global health efforts, experience with alternative models suggests that more promising pathways involve "learning by doing" in ways that engage key stakeholders, uses data to address constraints, and incorporates results from pilot projects. Such approaches should be applied to current strategies to achieve the MDGs.
NASA Astrophysics Data System (ADS)
Ruuskanen, J.; Stenvall, A.; Lahtinen, V.; Pardo, E.
2017-02-01
Superconducting magnets are the most expensive series of components produced for the Large Hadron Collider (LHC) at the European Organization for Nuclear Research (CERN). When developing such magnets beyond state-of-the-art technology, one possible option is to use high-temperature superconductors (HTS), which can tolerate much higher magnetic fields than low-temperature superconductors (LTS) while simultaneously carrying high current densities. Significant cost reductions due to decreased prototype construction needs can be achieved by careful modelling of the magnets. Simulations are used, e.g., for designing magnets fulfilling the field quality requirements of the beampipe, and for ensuring adequate protection by studying the losses occurring during charging and discharging. We model the hysteresis losses and the magnetic field nonlinearity in the beampipe as a function of the magnet's current. These simulations rely on the minimum magnetic energy variation principle, with optimization algorithms provided by the open-source optimization library IPOPT (Interior Point OPTimizer). We utilize this methodology to investigate a research and development accelerator magnet prototype made of REBCO Roebel cable. The applicability of this approach, when the magnetic field dependence of the superconductor's critical current density is considered, is discussed. We also scrutinize the influence of the necessary modelling decisions one needs to make with this approach. The results show that different decisions can lead to notably different results, and experiments are required to study the electromagnetic behaviour of such magnets further.
Combating Daesh: A Socially Unconventional Strategy
2015-06-01
...ultimately destroy Daesh, yet afraid to mire itself in another Middle Eastern conflict, the United States is relying on a minimalist strategy through military partnerships and air support. This research contends that this fairly conventional approach is...
Air Force Maintenance Technician Performance Measurement.
1979-12-28
AFIT student at: Arizona State Univ. ...highly inflated, or provided incomplete and non-current coverage of maintenance organizations. The performance appraisal method developed relies on subjective...
Gotti, Valeria Bisinoto; Feitosa, Victor Pinheiro; Sauro, Salvatore; Correr-Sobrinho, Lourenço; Correr, Americo Bortolazzo
2014-10-01
To evaluate the effects of an electric current-assisted application on the bond strength and interfacial morphology of self-adhesive resin cements bonded to dentin. Indirect resin composite build-ups were luted to prepared dentin surfaces using two self-adhesive resin cements (RelyX Unicem and BisCem) and an ElectroBond device under 0, 20, or 40 μA of electrical current. All specimens were submitted to the microtensile bond strength test and to interfacial SEM analysis. The electric current-assisted application induced no change (P > 0.05) in the overall bond strength, although RelyX Unicem showed significantly higher bond strength (P < 0.05) than BisCem. Similarly, no differences were observed in terms of interfacial integrity when using the electrical current applicator.
Mayhew, Terry M; Lucocq, John M
2011-03-01
Various methods for quantifying cellular immunogold labelling on transmission electron microscope thin sections are currently available. All rely on sound random sampling principles and are applicable to single immunolabelling across compartments within a given cell type or between different experimental groups of cells. Although methods are also available to test for colocalization in double/triple immunogold labelling studies, so far, these have relied on making multiple measurements of gold particle densities in defined areas or of inter-particle nearest neighbour distances. Here, we present alternative two-step approaches to codistribution and colocalization assessment that merely require raw counts of gold particles in distinct cellular compartments. For assessing codistribution over aggregate compartments, initial statistical evaluation involves combining contingency table and chi-squared analyses to provide predicted gold particle distributions. The observed and predicted distributions allow testing of the appropriate null hypothesis, namely, that there is no difference in the distribution patterns of proteins labelled by different sizes of gold particle. In short, the null hypothesis is that of colocalization. The approach for assessing colabelling recognises that, on thin sections, a compartment is made up of a set of sectional images (profiles) of cognate structures. The approach involves identifying two groups of compartmental profiles that are unlabelled and labelled for one gold marker size. The proportions in each group that are also labelled for the second gold marker size are then compared. Statistical analysis now uses a 2 × 2 contingency table combined with the Fisher exact probability test. Having identified double labelling, the profiles can be analysed further in order to identify characteristic features that might account for the double labelling. In each case, the approach is illustrated using synthetic and/or experimental datasets and can be refined to correct observed labelling patterns to specific labelling patterns. These simple and efficient approaches should be of more immediate utility to those interested in codistribution and colocalization in multiple immunogold labelling investigations.
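Both steps can be run directly from raw gold counts with SciPy: a chi-squared test on the compartment-by-marker contingency table for codistribution, and Fisher's exact test on the 2 x 2 profile table for colabelling. The counts below are synthetic and only illustrate the two tests named in the abstract.

```python
from scipy.stats import chi2_contingency, fisher_exact

# Step 1: codistribution across compartments. Rows = gold marker sizes,
# columns = cellular compartments; values are raw particle counts (synthetic).
counts = [[120, 30, 10],    # 10-nm gold over e.g. mitochondria / ER / cytosol
          [100, 35, 15]]    # 6-nm gold over the same compartments
chi2, p_codist, dof, expected = chi2_contingency(counts)
print("chi-squared p =", p_codist)   # large p: no evidence against colocalization

# Step 2: colabelling of individual compartment profiles. Rows = profiles
# unlabelled/labelled for marker 1; columns = unlabelled/labelled for marker 2.
table = [[40, 5],
         [8, 22]]
odds, p_colab = fisher_exact(table)
print("Fisher exact p =", p_colab)   # small p: labelling by the two markers is associated
```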
Markovian master equations for quantum thermal machines: local versus global approach
NASA Astrophysics Data System (ADS)
Hofer, Patrick P.; Perarnau-Llobet, Martí; Miranda, L. David M.; Haack, Géraldine; Silva, Ralph; Bohr Brask, Jonatan; Brunner, Nicolas
2017-12-01
The study of quantum thermal machines, and more generally of open quantum systems, often relies on master equations. Two approaches are mainly followed. On the one hand, there is the widely used, but often criticized, local approach, where machine sub-systems locally couple to thermal baths. On the other hand, in the more established global approach, thermal baths couple to global degrees of freedom of the machine. There has been debate as to which of these two conceptually different approaches should be used in situations out of thermal equilibrium. Here we compare the local and global approaches against an exact solution for a particular class of thermal machines. We consider thermodynamically relevant observables, such as heat currents, as well as the quantum state of the machine. Our results show that the use of a local master equation is generally well justified. In particular, for weak inter-system coupling, the local approach agrees with the exact solution, whereas the global approach fails for non-equilibrium situations. For intermediate coupling, the local and the global approach both agree with the exact solution and for strong coupling, the global approach is preferable. These results are backed by detailed derivations of the regimes of validity for the respective approaches.
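For readers unfamiliar with the local approach mentioned above, the sketch below (an illustrative assumption, not the authors' model) builds a local Lindblad master equation in QuTiP for two weakly coupled qubits attached to hot and cold baths, computes the steady state, and evaluates the heat current from the hot bath as Tr(H D_hot[ρ]); all parameter values are arbitrary.

```python
import numpy as np
from qutip import sigmam, sigmaz, qeye, tensor, steadystate

# two resonant qubits with weak exchange coupling (arbitrary illustrative values)
eps1, eps2 = 1.0, 1.0       # qubit splittings
g = 0.05                    # weak inter-qubit coupling (local regime)
gamma = 0.02                # bath coupling rate
n_hot, n_cold = 1.0, 0.1    # thermal occupations of hot / cold baths

sm1 = tensor(sigmam(), qeye(2))
sm2 = tensor(qeye(2), sigmam())
sz1 = tensor(sigmaz(), qeye(2))
sz2 = tensor(qeye(2), sigmaz())

H = 0.5*eps1*sz1 + 0.5*eps2*sz2 + g*(sm1.dag()*sm2 + sm2.dag()*sm1)

# local Lindblad dissipators: each qubit couples to its own thermal bath
c_ops = [np.sqrt(gamma*(n_hot + 1))*sm1, np.sqrt(gamma*n_hot)*sm1.dag(),
         np.sqrt(gamma*(n_cold + 1))*sm2, np.sqrt(gamma*n_cold)*sm2.dag()]

rho_ss = steadystate(H, c_ops)

def dissipator(c, rho):
    return c*rho*c.dag() - 0.5*(c.dag()*c*rho + rho*c.dag()*c)

# heat current from the hot bath: Q_hot = Tr(H D_hot[rho_ss])
D_hot = dissipator(c_ops[0], rho_ss) + dissipator(c_ops[1], rho_ss)
Q_hot = (H*D_hot).tr().real
print("steady-state heat current from hot bath:", Q_hot)
```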
Current and future resources for functional metagenomics
Lam, Kathy N.; Cheng, Jiujun; Engel, Katja; Neufeld, Josh D.; Charles, Trevor C.
2015-01-01
Functional metagenomics is a powerful experimental approach for studying gene function, starting from the extracted DNA of mixed microbial populations. A functional approach relies on the construction and screening of metagenomic libraries—physical libraries that contain DNA cloned from environmental metagenomes. The information obtained from functional metagenomics can help in future annotations of gene function and serve as a complement to sequence-based metagenomics. In this Perspective, we begin by summarizing the technical challenges of constructing metagenomic libraries and emphasize their value as resources. We then discuss libraries constructed using the popular cloning vector, pCC1FOS, and highlight the strengths and shortcomings of this system, alongside possible strategies to maximize existing pCC1FOS-based libraries by screening in diverse hosts. Finally, we discuss the known bias of libraries constructed from human gut and marine water samples, present results that suggest bias may also occur for soil libraries, and consider factors that bias metagenomic libraries in general. We anticipate that discussion of current resources and limitations will advance tools and technologies for functional metagenomics research. PMID:26579102
NASA Astrophysics Data System (ADS)
Özer, Ahmet Özkan
2016-04-01
An infinite dimensional model for a three-layer active constrained layer (ACL) beam, consisting of a piezoelectric elastic layer at the top and an elastic host layer at the bottom constraining a viscoelastic layer in the middle, is obtained for clamped-free boundary conditions by using a thorough variational approach. The Rao-Nakra thin compliant layer approximation is adopted to model the sandwich structure, and the electrostatic approach (magnetic effects are ignored) is assumed for the piezoelectric layer. Instead of voltage actuation, the piezoelectric layer is proposed to be activated by a charge (or current) source. We show that the closed-loop system with all-mechanical feedback is uniformly exponentially stable. Our result is the outcome of a compact perturbation argument and a unique continuation result for the spectral problem which relies on the multipliers method. Finally, the modeling methodology of the paper is generalized to multilayer ACL beams, and the uniform exponential stabilizability result is established analogously.
The Mechanics of Single Cell and Collective Migration of Tumor Cells
Lintz, Marianne; Muñoz, Adam; Reinhart-King, Cynthia A.
2017-01-01
Metastasis is a dynamic process in which cancer cells navigate the tumor microenvironment, largely guided by external chemical and mechanical cues. Our current understanding of metastatic cell migration has relied primarily on studies of single cell migration, most of which have been performed using two-dimensional (2D) cell culture techniques and, more recently, using three-dimensional (3D) scaffolds. However, the current paradigm focused on single cell movements is shifting toward the idea that collective migration is likely one of the primary modes of migration during metastasis of many solid tumors. Not surprisingly, the mechanics of collective migration differ significantly from single cell movements. As such, techniques must be developed that enable in-depth analysis of collective migration, and those for examining single cell migration should be adopted and modified to study collective migration to allow for accurate comparison of the two. In this review, we will describe engineering approaches for studying metastatic migration, both single cell and collective, and how these approaches have yielded significant insight into the mechanics governing each process. PMID:27814431
Non-Markovian electron dynamics in nanostructures coupled to dissipative contacts
NASA Astrophysics Data System (ADS)
Novakovic, B.; Knezevic, I.
2013-02-01
In quasiballistic semiconductor nanostructures, carrier exchange between the active region and dissipative contacts is the mechanism that governs relaxation. In this paper, we present a theoretical treatment of transient quantum transport in quasiballistic semiconductor nanostructures, which is based on the open system theory and valid on timescales much longer than the characteristic relaxation time in the contacts. The approach relies on a model interaction between the current-limiting active region and the contacts, given in the scattering-state basis. We derive a non-Markovian master equation for the irreversible evolution of the active region's many-body statistical operator by coarse-graining the exact dynamical map over the contact relaxation time. In order to obtain the response quantities of a nanostructure under bias, such as the potential and the charge and current densities, the non-Markovian master equation must be solved numerically together with the Schrödinger, Poisson, and continuity equations. We discuss how to numerically solve this coupled system of equations and illustrate the approach on the example of a silicon n-i-n diode.
An illustration of new methods in machine condition monitoring, Part I: stochastic resonance
NASA Astrophysics Data System (ADS)
Worden, K.; Antoniadou, I.; Marchesiello, S.; Mba, C.; Garibaldi, L.
2017-05-01
There have been many recent developments in the application of data-based methods to machine condition monitoring. A powerful methodology based on machine learning has emerged, where diagnostics are based on a two-step procedure: extraction of damage-sensitive features, followed by unsupervised learning (novelty detection) or supervised learning (classification). The objective of the current pair of papers is simply to illustrate one state-of-the-art procedure for each step, using synthetic data representative of reality in terms of size and complexity. The first paper in the pair will deal with feature extraction. Although some papers have appeared in the recent past considering stochastic resonance as a means of amplifying damage information in signals, they have largely relied on ad hoc specifications of the resonator used. In contrast, the current paper will adopt a principled optimisation-based approach to the resonator design. The paper will also show that a discrete dynamical system can provide all the benefits of a continuous system, but also provide a considerable speed-up in terms of simulation time in order to facilitate the optimisation approach.
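As a toy counterpart to the feature-extraction step discussed above, the sketch below simulates a discrete (Euler-stepped) overdamped bistable resonator driven by a weak periodic signal plus noise and estimates the output signal-to-noise ratio at the drive frequency. It is a generic, ad hoc illustration of stochastic resonance rather than the optimisation-based resonator design of the paper; all parameter values are arbitrary, and whether the SNR actually peaks at intermediate noise depends on those values.

```python
import numpy as np

# discrete bistable resonator driven by a weak periodic "feature" signal buried in noise
rng = np.random.default_rng(0)
n, dt = 20000, 0.01
t = np.arange(n) * dt
signal = 0.1 * np.sin(2*np.pi*0.5*t)          # weak drive, below the potential barrier

def resonator_output(noise_std):
    x = np.zeros(n)
    for k in range(n - 1):
        # overdamped bistable dynamics x' = x - x^3 + forcing + noise (Euler-Maruyama step)
        x[k+1] = x[k] + dt*(x[k] - x[k]**3 + signal[k]) + np.sqrt(dt)*noise_std*rng.standard_normal()
    return x

def snr_at_drive(x, f_drive=0.5):
    spec = np.abs(np.fft.rfft(x - x.mean()))**2
    freqs = np.fft.rfftfreq(n, dt)
    i = np.argmin(np.abs(freqs - f_drive))
    background = np.median(spec[max(i - 20, 1):i + 20])   # robust local background estimate
    return spec[i] / background

for sigma in (0.1, 0.5, 1.5):
    print(f"noise std {sigma}: SNR at drive frequency {snr_at_drive(resonator_output(sigma)):.1f}")
```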
Non-mammalian models in behavioral neuroscience: consequences for biological psychiatry
Maximino, Caio; Silva, Rhayra Xavier do Carmo; da Silva, Suéllen de Nazaré Santos; Rodrigues, Laís do Socorro dos Santos; Barbosa, Hellen; de Carvalho, Tayana Silva; Leão, Luana Ketlen dos Reis; Lima, Monica Gomes; Oliveira, Karen Renata Matos; Herculano, Anderson Manoel
2015-01-01
Current models in biological psychiatry focus on a handful of model species, and the majority of work relies on data generated in rodents. However, in the same sense that a comparative approach to neuroanatomy allows for the identification of patterns of brain organization, the inclusion of other species and an adoption of comparative viewpoints in behavioral neuroscience could also lead to increases in knowledge relevant to biological psychiatry. Specifically, this approach could help to identify conserved features of brain structure and behavior, as well as to understand how variation in gene expression or developmental trajectories relates to variation in brain and behavior pertinent to psychiatric disorders. To achieve this goal, the current focus on mammalian species must be expanded to include other species, including non-mammalian taxa. In this article, we review behavioral neuroscientific experiments in non-mammalian species, including traditional “model organisms” (zebrafish and Drosophila) as well as in other species which can be used as “reference.” The application of these domains in biological psychiatry and their translational relevance is considered. PMID:26441567
A microcontroller-based lock-in amplifier for sub-milliohm resistance measurements.
Bengtsson, Lars E
2012-07-01
This paper presents a novel approach to the design of a digital ohmmeter with a resolution of <60 μΩ based on a general-purpose microcontroller and a high-impedance instrumentation amplifier only. The design uses two digital I/O-pins to alternate the current through the sample resistor and, combined with a proper firmware routine, the design is a lock-in detector that discriminates any signal that is out of phase/frequency with the reference signal. This makes it possible to selectively detect the μV drop across sample resistors down to 55.6 μΩ using only the current that can be supplied by the digital output pins of a microcontroller. This is achieved without the need for an external reference signal generator and does not rely on the processing power of a digital signal processor.
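The lock-in principle used here can be sketched in a few lines: multiply the measured voltage by the square-wave reference that drives the excitation current and average, so that any component uncorrelated with the reference integrates towards zero. The simulation below is illustrative only; the drive current, noise levels and sample resistance are assumed values, not taken from the paper.

```python
import numpy as np

# simulated lock-in detection of a sub-milliohm resistance (all values assumed)
rng = np.random.default_rng(1)
fs, f_ref, T = 10_000, 100, 10.0                     # sample rate (Hz), reference frequency (Hz), duration (s)
t = np.arange(int(fs * T)) / fs
reference = np.sign(np.sin(2*np.pi*f_ref*t))         # square-wave excitation from the I/O pins

R_sample, I_drive = 60e-6, 20e-3                     # 60 uOhm sample, 20 mA drive current
v_signal = R_sample * I_drive * reference            # ~1.2 uV square wave across the sample
noise = 20e-6*rng.standard_normal(t.size) + 20e-6*np.sin(2*np.pi*50*t)   # amplifier noise + mains pickup

measured = v_signal + noise

# lock-in demodulation: multiply by the reference and low-pass filter (here: a plain average)
v_est = np.mean(measured * reference)
print(f"estimated resistance: {v_est / I_drive * 1e6:.1f} uOhm (true value 60.0)")
```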
Defect Detection and Segmentation Framework for Remote Field Eddy Current Sensor Data.
Falque, Raphael; Vidal-Calleja, Teresa; Miro, Jaime Valls
2017-10-06
Remote-Field Eddy-Current (RFEC) technology is often used as a Non-Destructive Evaluation (NDE) method to prevent water pipe failures. By analyzing the RFEC data, it is possible to quantify the corrosion present in pipes. Quantifying the corrosion involves detecting defects and extracting their depth and shape. For large sections of pipelines, this can be extremely time-consuming if performed manually. Automated approaches are therefore well motivated. In this article, we propose an automated framework to locate and segment defects in individual pipe segments, starting from raw RFEC measurements taken over large pipelines. The framework relies on a novel feature to robustly detect these defects and a segmentation algorithm applied to the deconvolved RFEC signal. The framework is evaluated using both simulated and real datasets, demonstrating its ability to efficiently segment the shape of corrosion defects.
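As a generic illustration of the segmentation stage only (not the novel feature or the deconvolution used in the paper), the fragment below thresholds a synthetic, already pre-processed 1D RFEC-like trace and labels connected above-threshold regions as candidate defects; all numbers are invented.

```python
import numpy as np
from scipy import ndimage

# synthetic, pre-processed 1D RFEC-like trace with two "defect" responses (made-up data)
rng = np.random.default_rng(2)
signal = 0.05 * rng.standard_normal(1000)
signal[200:230] += 1.5 * np.hanning(30)
signal[600:640] += 0.8 * np.hanning(40)

# flag samples well above the background (robust MAD threshold) and group them into connected regions
med = np.median(signal)
threshold = med + 5 * np.median(np.abs(signal - med))
labels, n_defects = ndimage.label(signal > threshold)
for i in range(1, n_defects + 1):
    idx = np.where(labels == i)[0]
    print(f"candidate defect {i}: samples {idx[0]}-{idx[-1]}, peak amplitude {signal[idx].max():.2f}")
```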
The U.S. Earthquake Prediction Program
Wesson, R.L.; Filson, J.R.
1981-01-01
There are two distinct motivations for earthquake prediction. The mechanistic approach aims to understand the processes leading to a large earthquake. The empirical approach is governed by the immediate need to protect lives and property. With our current lack of knowledge about the earthquake process, future progress cannot be made without gathering a large body of measurements. These are required not only for the empirical prediction of earthquakes, but also for the testing and development of hypotheses that further our understanding of the processes at work. The earthquake prediction program is basically a program of scientific inquiry, but one which is motivated by social, political, economic, and scientific reasons. It is a pursuit that cannot rely on empirical observations alone nor can it be carried out solely on a blackboard or in a laboratory. Experiments must be carried out in the real Earth.
Protein Folding Using a Vortex Fluidic Device.
Britton, Joshua; Smith, Joshua N; Raston, Colin L; Weiss, Gregory A
2017-01-01
Essentially all biochemistry and most molecular biology experiments require recombinant proteins. However, large, hydrophobic proteins typically aggregate into insoluble and misfolded species, and are directed into inclusion bodies. Current techniques to fold proteins recovered from inclusion bodies rely on denaturation followed by dialysis or rapid dilution. Such approaches can be time consuming, wasteful, and inefficient. Here, we describe rapid protein folding using a vortex fluidic device (VFD). This process uses mechanical energy introduced into thin films to rapidly and efficiently fold proteins. With the VFD in continuous flow mode, large volumes of protein solution can be processed per day with 100-fold reductions in both folding times and buffer volumes.
Deflection Missions for Asteroid 2011 AG5
NASA Technical Reports Server (NTRS)
Grebow, Daniel; Landau, Damon; Bhaskaran, Shyam; Chodas, Paul; Chesley, Steven; Yeomans, Don; Petropoulos, Anastassios; Sims, Jon
2012-01-01
The recently discovered asteroid 2011 AG5 currently has a 1-in-500 chance of impacting Earth in 2040. In this paper, we discuss the potential of future observations of the asteroid and their effects on the asteroid's orbital uncertainty. Various kinetic impactor mission scenarios, relying on both conventional chemical as well as solar-electric propulsion, are presented for deflecting the course of the asteroid safely away from Earth. The times for the missions range from pre-keyhole passage (pre-2023), and up to five years prior to the 2040 Earth close approach. We also include a brief discussion on terminal guidance, and contingency options for mission planning.
Advances in bioartificial liver assist devices.
Patzer, J F
2001-11-01
Rapid advances in development of bioartificial liver assist devices (BLADs) are exciting clinical interest in the application of BLAD technology for support of patients with acute liver failure. Four devices (Circe Biomedical HepatAssist, Vitagen ELAD, Gerlach BELS, and Excorp Medical BLSS) that rely on hepatocytes cultured in hollow-fiber membrane technology are currently in various stages of clinical evaluation. Several alternative approaches for culture and perfusion of hepatocytes have been evaluated in preclinical, large animal models of liver failure, or at a laboratory scale. Engineering design issues with respect to xenotransplantation, BLAD perfusion, hepatocyte functionality and culture maintenance, and ultimate distribution of a BLAD to a clinical site are delineated.
A Simple Method to Simultaneously Detect and Identify Spikes from Raw Extracellular Recordings.
Petrantonakis, Panagiotis C; Poirazi, Panayiota
2015-01-01
The ability to track when and which neurons fire in the vicinity of an electrode, in an efficient and reliable manner can revolutionize the neuroscience field. The current bottleneck lies in spike sorting algorithms; existing methods for detecting and discriminating the activity of multiple neurons rely on inefficient, multi-step processing of extracellular recordings. In this work, we show that a single-step processing of raw (unfiltered) extracellular signals is sufficient for both the detection and identification of active neurons, thus greatly simplifying and optimizing the spike sorting approach. The efficiency and reliability of our method is demonstrated in both real and simulated data.
Note: A microfluidic freezer based on evaporative cooling of atomized aqueous microdroplets
DOE Office of Scientific and Technical Information (OSTI.GOV)
Song, Jin; Kim, Dohyun, E-mail: dohyun.kim@mju.ac.kr; Chung, Minsub
2015-01-15
We report for the first time water-based evaporative cooling integrated into a microfluidic chip for temperature control and freezing of biological solution. We opt for water as a nontoxic, effective refrigerant. Aqueous solutions are atomized in our device and evaporation of microdroplets under vacuum removes heat effectively. We achieve rapid cooling (−5.1 °C/s) and a low freezing temperature (−14.1 °C). Using this approach, we demonstrate freezing of deionized water and protein solution. Our simple, yet effective cooling device may improve many microfluidic applications currently relying on external power-hungry instruments for cooling and freezing.
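A rough energy balance illustrates why evaporating a small fraction of atomized water can freeze a sample: the latent heat of vaporization is roughly an order of magnitude larger than the heat that must be removed to cool and solidify the same mass of water. The numbers below are textbook constants and an assumed 1 μL sample, not figures reported in the paper.

```python
# back-of-envelope energy balance for evaporative freezing of a ~1 uL aqueous sample
c_water = 4.18    # J/(g K), specific heat of liquid water
L_fus   = 334.0   # J/g, latent heat of fusion
L_vap   = 2450.0  # J/g, latent heat of vaporization near room temperature

m_sample = 1e-3               # g (about 1 uL of water)
dT = 20.0 - (-14.1)           # cool from 20 C down to the reported freezing temperature

q_removed    = m_sample * (c_water * dT + L_fus)   # heat to extract from the sample
m_evaporated = q_removed / L_vap                   # refrigerant water that must evaporate
print(f"heat to remove: {q_removed:.2f} J, water evaporated: {m_evaporated * 1e3:.2f} mg")
```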
NASA Astrophysics Data System (ADS)
Laubscher, Markus; Bourquin, Stéphane; Froehly, Luc; Karamata, Boris; Lasser, Theo
2004-07-01
Current spectroscopic optical coherence tomography (OCT) methods rely on a posteriori numerical calculation. We present an experimental alternative for accessing spectroscopic information in OCT without post-processing based on wavelength de-multiplexing and parallel detection using a diffraction grating and a smart pixel detector array. Both a conventional A-scan with high axial resolution and the spectrally resolved measurement are acquired simultaneously. A proof-of-principle demonstration is given on a dynamically changing absorbing sample. The method's potential for fast spectroscopic OCT imaging is discussed. The spectral measurements obtained with this approach are insensitive to scan non-linearities or sample movements.
Structure, Intent and Conformance Monitoring in ATC
NASA Technical Reports Server (NTRS)
Reynolds, Tom G.; Histon, Jonathan M.; Davison, Hayley J.; Hansman, R. John
2004-01-01
In field studies of current Air Traffic Control operations, it is found that controllers rely on underlying airspace structure to reduce the complexity of the planning and conformance monitoring tasks. The structure appears to influence the controller's working mental model through abstractions that reduce the apparent cognitive complexity. These structure-based abstractions are useful for the controller's key tasks of planning, implementing, monitoring, and evaluating tactical situations. In addition, the structure-based abstractions appear to be important in the maintenance of Situation Awareness. The process of conformance monitoring is analyzed in more detail and an approach to conformance monitoring which utilizes both the structure-based abstractions and intent is presented.
ODISEES: Ontology-Driven Interactive Search Environment for Earth Sciences
NASA Technical Reports Server (NTRS)
Rutherford, Matthew T.; Huffer, Elisabeth B.; Kusterer, John M.; Quam, Brandi M.
2015-01-01
This paper discusses the Ontology-driven Interactive Search Environment for Earth Sciences (ODISEES) project currently being developed to aid researchers attempting to find usable data among an overabundance of closely related data. ODISEES' ontological structure relies on a modular, adaptable concept modeling approach, which allows the domain to be modeled more or less as it is without worrying about terminology or external requirements. In the model, variables are individually assigned semantic content based on the characteristics of the measurements they represent, allowing intuitive discovery and comparison of data without requiring the user to sift through large numbers of data sets and variables to find the desired information.
Development and Current Status of Skull-Image Superimposition - Methodology and Instrumentation.
Lan, Y
1992-12-01
This article presents a review of the literature and an evaluation of the development and application of skull-image superimposition technology - both instrumentation and methodology - contributed by a number of scholars since 1935. Along with a comparison of the methodologies involved in the two superimposition techniques - photographic and video - the author characterized the techniques in action and the recent advances in computer image superimposition processing technology. The major disadvantage of conventional approaches is their reliance on subjective interpretation. Through painstaking comparison and analysis, computer image processing technology can make more conclusive identifications by directly testing and evaluating the various programmed indices. Copyright © 1992 Central Police University.
Why is the VLT Very Efficient?
NASA Astrophysics Data System (ADS)
Comerón, F.
2009-09-01
The operations model of the ESO Very Large Telescope (VLT) heavily relies on a full-scale implementation of Service Mode observing. In this contribution we review the main features of ESO's approach to Service Mode at the VLT, we outline the advantages offered by this mode, and the challenges faced when implementing it given the wide diversity of instrumentation and instrument modes currently available at the VLT and the VLT Interferometer (VLTI). We give special emphasis to the part of this challenge directly derived from the evolution of the atmospheric conditions, which drive the short-term scheduling of the different scientific programmes competing for the available time.
Separability of Lexical and Morphological Knowledge: Evidence from Language Minority Children
Shahar-Yames, Daphna; Eviatar, Zohar; Prior, Anat
2018-01-01
Lexical and morphological knowledge of school-aged children are correlated with each other, and are often difficult to distinguish. One reason for this might be that many tasks currently used to assess morphological knowledge require children to inflect or derive real words in the language, thus recruiting their vocabulary knowledge. The current study investigated the possible separability of lexical and morphological knowledge using two complementary approaches. First, we examined the correlations between vocabulary and four morphological tasks tapping different aspects of morphological processing and awareness, and using either real-word or pseudo-word stimuli. Thus, we tested the hypothesis that different morphological tasks recruit lexical knowledge to various degrees. Second, we compared the Hebrew vocabulary and morphological knowledge of 5th grade language minority speaking children to that of their native speaking peers. This comparison allows us to ask whether reduced exposure to the societal language might differentially influence vocabulary and morphological knowledge. The results demonstrate that indeed different morphological tasks rely on lexical knowledge to varying degrees. In addition, language minority students had significantly lower performance in vocabulary and in morphological tasks that recruited vocabulary knowledge to a greater extent. In contrast, both groups performed similarly in abstract morphological tasks with a lower vocabulary load. These results demonstrate that lexical and morphological knowledge may rely on partially separable learning mechanisms, and highlight the importance of distinguishing between these two linguistic components. PMID:29515486
Discovering Deeply Divergent RNA Viruses in Existing Metatranscriptome Data with Machine Learning
NASA Astrophysics Data System (ADS)
Rivers, A. R.
2016-02-01
Most sampling of RNA viruses and phages has been directed toward a narrow range of hosts and environments. Several marine metagenomic studies have examined the RNA viral fraction in aquatic samples and found a number of picornaviruses and uncharacterized sequences. The lack of homology to known protein families has limited the discovery of new RNA viruses. We developed a computational method for identifying RNA viruses that relies on information in the codon transition probabilities of viral sequences to train a classifier. This approach does not rely on homology, but it has higher information content than other reference-free methods such as tetranucleotide frequency. Training and validation with RefSeq data gave true positive and true negative rates of 99.6% and 99.5% on the highly imbalanced validation sets (0.2% viruses) that, like the metatranscriptomes themselves, contain mostly non-viral sequences. To further test the method, a validation dataset of putative RNA virus genomes was identified in metatranscriptomes by the presence of RNA-dependent RNA polymerase, an essential gene for RNA viruses. The classifier successfully identified 99.4% of those contigs as viral. This approach is currently being extended to screen all metatranscriptome data sequenced at the DOE Joint Genome Institute, presently 4.5 Gb of assembled data from 504 public projects representing a wide range of marine, aquatic and terrestrial environments.
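The core idea, representing a sequence by its codon-to-codon transition probabilities and feeding that vector to a classifier, can be sketched as follows. This is an illustrative assumption, not the authors' pipeline: it uses a single fixed reading frame, toy training sequences, and a logistic-regression stand-in for whatever classifier was actually trained.

```python
import numpy as np
from itertools import product
from sklearn.linear_model import LogisticRegression

CODONS = ["".join(c) for c in product("ACGT", repeat=3)]
IDX = {c: i for i, c in enumerate(CODONS)}

def codon_transition_features(seq):
    """Flattened 64x64 matrix of normalized transitions between consecutive codons (frame 0 only)."""
    counts = np.zeros((64, 64))
    codons = [seq[i:i + 3] for i in range(0, len(seq) - 2, 3)]
    for a, b in zip(codons, codons[1:]):
        if a in IDX and b in IDX:
            counts[IDX[a], IDX[b]] += 1
    total = counts.sum()
    return (counts / total).ravel() if total else counts.ravel()

# toy labelled contigs: y = 1 for viral, 0 for non-viral (real training would use RefSeq sequences)
train_seqs = ["ATGGCTAAAGCT" * 100, "GCGCGAGGCGCA" * 100, "ATGGCAAAAGCA" * 100, "GCCCGAGGAGCA" * 100]
y = np.array([1, 0, 1, 0])
X = np.array([codon_transition_features(s) for s in train_seqs])

clf = LogisticRegression(max_iter=1000).fit(X, y)
print(clf.predict([codon_transition_features("ATGGCTAAAGCT" * 80)]))   # expected class: 1
```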
DOE Office of Scientific and Technical Information (OSTI.GOV)
Thakur, Gautam S; Bhaduri, Budhendra L; Piburn, Jesse O
Geospatial intelligence has traditionally relied on the use of archived and unvarying data for planning and exploration purposes. In consequence, the tools and methods that are architected to provide insight and generate projections only rely on such datasets. Although this approach has proven effective in several cases, such as land use identification and route mapping, it has severely restricted the ability of researchers to inculcate current information in their work. This approach is inadequate in scenarios requiring real-time information to act and to adjust in ever changing dynamic environments, such as evacuation and rescue missions. In this work, we propose PlanetSense, a platform for geospatial intelligence that is built to harness the existing power of archived data and add to that the dynamics of real-time streams, seamlessly integrated with sophisticated data mining algorithms and analytics tools for generating operational intelligence on the fly. The platform has four main components: i) GeoData Cloud - a data architecture for storing and managing disparate datasets; ii) a mechanism to harvest real-time streaming data; iii) a data analytics framework; iv) presentation and visualization through a web interface and RESTful services. Using two case studies, we underpin the necessity of our platform in modeling ambient population and building occupancy at scale.
Comparing New Zealand's 'Middle Out' health information technology strategy with other OECD nations.
Bowden, Tom; Coiera, Enrico
2013-05-01
Implementation of efficient, universally applied, computer to computer communications is a high priority for many national health systems. As a consequence, much effort has been channelled into finding ways in which a patient's previous medical history can be made accessible when needed. A number of countries have attempted to share patients' records, with varying degrees of success. While most efforts to create record-sharing architectures have relied upon government-provided strategy and funding, New Zealand has taken a different approach. Like most British Commonwealth nations, New Zealand has a 'hybrid' publicly/privately funded health system. However its information technology infrastructure and automation has largely been developed by the private sector, working closely with regional and central government agencies. Currently the sector is focused on finding ways in which patient records can be shared amongst providers across three different regions. New Zealand's healthcare IT model combines government contributed funding, core infrastructure, facilitation and leadership with private sector investment and skills and is being delivered via a set of controlled experiments. The net result is a 'Middle Out' approach to healthcare automation. 'Middle Out' relies upon having a clear, well-articulated health-reform strategy and a determination by both public and private sector organisations to implement useful healthcare IT solutions by working closely together. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.
Implementing system simulation of C3 systems using autonomous objects
NASA Technical Reports Server (NTRS)
Rogers, Ralph V.
1987-01-01
The basis of all conflict recognition in simulation is a common frame of reference. Synchronous discrete-event simulation relies on the fixed points in time as the basic frame of reference. Asynchronous discrete-event simulation relies on fixed-points in the model space as the basic frame of reference. Neither approach provides sufficient support for autonomous objects. The use of a spatial template as a frame of reference is proposed to address these insufficiencies. The concept of a spatial template is defined and an implementation approach offered. Discussed are the uses of this approach to analyze the integration of sensor data associated with Command, Control, and Communication systems.
Presence for design: conveying atmosphere through video collages.
Keller, I; Stappers, P J
2001-04-01
Product designers use imagery for inspiration in their creative design process. To support creativity, designers apply many tools and techniques, which often rely on their ability to be inspired by found and previously made visual material and to experience the atmosphere of the user environment. Computer tools and developments in VR offer perspectives to support this kind of imagery and presence in the design process. But currently these possibilities come at too high a technological overhead and price to be usable in the design practice. This article proposes an expressive and technically lightweight approach using the possibilities of VR and computer tools, by creating a sketchy environment using video collages. Instead of relying on highly realistic or even "hyperreal" graphics, these video collages use lessons learned from theater and cinema to get a sense of atmosphere across. Product designers can use these video collages to reexperience their observations in the environment in which a product is to be used, and to communicate this atmosphere to their colleagues and clients. For user-centered design, video collages can also provide an environmental context for concept testing with prospective user groups.
Hong, Seong Cheol; Murale, Dhiraj P; Jang, Se-Young; Haque, Md Mamunul; Seo, Minah; Lee, Seok; Woo, Deok Ha; Kwon, Junghoon; Song, Chang-Seon; Kim, Yun Kyung; Lee, Jun-Seok
2018-06-22
Avian influenza (AI) causes annual epidemic outbreaks that have led to the destruction of tens of millions of poultry worldwide. The current gold-standard AI diagnosis method is an embryonic egg-based hemagglutination assay followed by immunoblotting or PCR sequencing to confirm subtypes. It requires, however, specialized facilities to handle egg inoculation and incubation, and the subtyping methods rely on costly reagents. Here, we demonstrated the first differential sensing approach to distinguish AI subtypes using a series of cell lines and a fluorescent sensor. Susceptibility to AI virus differs depending on the genetic background of the host cells. Thus, we examined cells of different organ origin, and the infection patterns against a panel of cells were utilized for AI virus subtyping. To quantify AI infection, we designed a highly cell-permeable fluorescent superoxide sensor to visualize infection. Though many AI monitoring strategies relying on sophisticated antibodies have been extensively studied, our differential sensing strategy successfully discriminated AI subtypes and was demonstrated to be a useful primary screening platform for monitoring a large number of samples. © 2018 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
NASA Astrophysics Data System (ADS)
Casalegno, Mosè; Bernardi, Andrea; Raos, Guido
2013-07-01
Numerical approaches can provide useful information about the microscopic processes underlying photocurrent generation in organic solar cells (OSCs). Among them, the Kinetic Monte Carlo (KMC) method is conceptually the simplest, but computationally the most intensive. A less demanding alternative is potentially represented by so-called Master Equation (ME) approaches, where the equations describing particle dynamics rely on the mean-field approximation and their solution is attained numerically, rather than stochastically. The description of charge separation dynamics, the treatment of electrostatic interactions and numerical stability are some of the key issues which have prevented the application of these methods to OSC modelling, despite their successes in the study of charge transport in disordered systems. Here we describe a three-dimensional ME approach to photocurrent generation in OSCs which attempts to deal with these issues. The reliability of the proposed method is tested against reference KMC simulations on bilayer heterojunction solar cells. Comparison of the current-voltage curves shows that the model well approximates the exact result for most devices. The largest deviations in current densities are mainly due to the adoption of the mean-field approximation for electrostatic interactions. The presence of deep traps, in devices characterized by strong energy disorder, may also affect result quality. Comparison of the simulation times reveals that the ME algorithm runs, on the average, one order of magnitude faster than KMC.
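To make the contrast concrete, the sketch below sets up a minimal one-dimensional, single-carrier mean-field master equation with nearest-neighbour Miller-Abrahams hopping rates and integrates it numerically. It is a generic illustration of the ME idea only; the paper's three-dimensional model additionally treats charge separation, electrostatic interactions and recombination, and every parameter value here is invented.

```python
import numpy as np
from scipy.integrate import solve_ivp

# 1D chain of hopping sites with Gaussian energetic disorder (all values invented)
rng = np.random.default_rng(3)
n_sites = 50
site_E = 0.1 * rng.standard_normal(n_sites)      # site energies (eV)
kT, nu0, field = 0.025, 1e12, 0.01               # thermal energy (eV), attempt rate (1/s), energy drop per site (eV)

def ma_rate(dE):
    """Miller-Abrahams rate (tunnelling prefactor absorbed into nu0)."""
    return nu0 * np.exp(-dE / kT) if dE > 0 else nu0

W = np.zeros((n_sites, n_sites))                 # W[i, j]: rate for a hop i -> j
for i in range(n_sites - 1):
    dE = (site_E[i + 1] - field * (i + 1)) - (site_E[i] - field * i)
    W[i, i + 1] = ma_rate(dE)
    W[i + 1, i] = ma_rate(-dE)

def master_eq(t, p):
    # dp_i/dt = sum_j (W[j, i] p_j - W[i, j] p_i): linear, non-interacting mean-field dynamics
    return W.T @ p - W.sum(axis=1) * p

p0 = np.zeros(n_sites)
p0[0] = 1.0                                      # carrier injected at one end of the chain
sol = solve_ivp(master_eq, (0.0, 1e-9), p0, method="LSODA")
print("mean carrier position after 1 ns:", float(np.arange(n_sites) @ sol.y[:, -1]))
```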
NASA Astrophysics Data System (ADS)
Topping, David; Alibay, Irfan; Bane, Michael
2017-04-01
To predict the evolving concentration, chemical composition and ability of aerosol particles to act as cloud droplets, we rely on numerical modeling. Mechanistic models attempt to account for the movement of compounds between the gaseous and condensed phases at a molecular level. This 'bottom up' approach is designed to increase our fundamental understanding. However, such models rely on predicting the properties of molecules and subsequent mixtures. For partitioning between the gaseous and condensed phases this includes: saturation vapour pressures; Henry's law coefficients; activity coefficients; diffusion coefficients and reaction rates. Current gas phase chemical mechanisms predict the existence of potentially millions of individual species. Within a dynamic ensemble model, this can often be used as justification for neglecting computationally expensive process descriptions. Indeed, we cannot yet quantify the true sensitivity to uncertainties in molecular properties: even at the single aerosol particle level, it has been impossible to embed fully coupled representations of process-level knowledge for all possible compounds, and heavily parameterised descriptions are typically relied upon instead. Relying on emerging numerical frameworks, and designed for the changing landscape of high-performance computing (HPC), in this study we focus specifically on the ability to capture activity coefficients in liquid solutions using the UNIFAC method. Activity coefficients are often neglected, with the largely untested hypothesis that they are simply too computationally expensive to include in dynamic frameworks. We present results demonstrating increased computational efficiency for a range of typical scenarios, including a profiling of the energy use resulting from reliance on such computations. As the landscape of HPC changes, the latter aspect is important to consider in future applications.
Exploiting metamaterials, plasmonics and nanoantennas concepts in silicon photonics
NASA Astrophysics Data System (ADS)
Rodríguez-Fortuño, Francisco J.; Espinosa-Soria, Alba; Martínez, Alejandro
2016-12-01
The interaction of light with subwavelength metallic nano-structures is at the heart of different current scientific hot topics, namely plasmonics, metamaterials and nanoantennas. Research in these disciplines during the last decade has given rise to new, powerful concepts providing an unprecedented degree of control over light manipulation at the nanoscale. However, only recently have these concepts been used to increase the capabilities of light processing in current photonic integrated circuits (PICs), which traditionally rely only on dielectric materials with element sizes larger than the light wavelength. Amongst the different PIC platforms, silicon photonics is expected to become mainstream, since manufacturing using well-established CMOS processes enables the mass production of low-cost PICs. In this review we discuss the benefits of introducing recent concepts arisen from the fields of metamaterials, plasmonics and nanoantennas into a silicon photonics integrated platform. We review existing works in this direction and discuss how this hybrid approach can lead to the improvement of current PICs enabling novel and disruptive applications in photonics.
The physical and empirical basis for a specific clear-air turbulence risk index
NASA Technical Reports Server (NTRS)
Keller, J. L.
1986-01-01
The fundamental emphasis of this research was to develop a technique which would be a significant improvement over those currently used for flight planning to avoid clear air turbulence (CAT). The technique should, ideally, be both quantitative in determining potential intensity and specific in locating regions of relatively high risk. Furthermore, it should not rely on specialized data but be functional using the currently available rawinsonde observation (raob) system. Encouraging results documented in an earlier investigation were considered compelling enough to warrant a closer look into the possibilities of a Specific Clear Air Turbulence Risk (SCATR) index approach to the clear air turbulence problem. Unlike that research, which considered sustained periods of flight in light to moderate clear air turbulence, this study focuses on several cases of documented severe CAT. Results of these case studies suggest that a SCATR index is not an unrealizable goal and that uses of such an index, even in its current prototype level of development, are also apparent.
Chen, Z; Lönnberg, T; Lahesmaa, R
2013-08-01
Current knowledge of helper T cell differentiation largely relies on data generated from mouse studies. To develop therapeutic strategies combating human diseases, understanding the molecular mechanisms by which human naïve T cells differentiate to functionally distinct T helper (Th) subsets, as well as studies on human differentiated Th cell subsets, is particularly valuable. Systems biology approaches provide a holistic view of the processes of T helper differentiation, enable discovery of new factors and pathways involved and generation of new hypotheses to be tested to improve our understanding of human Th cell differentiation and immune-mediated diseases. Here, we summarize studies where high-throughput systems biology approaches have been applied to human primary T cells. These studies reveal new factors and signalling pathways influencing T cell differentiation towards distinct subsets, important for immune regulation. Such information provides new insights into T cell biology and into targeting the immune system for therapeutic interventions. © 2013 John Wiley & Sons Ltd.
NASA Astrophysics Data System (ADS)
Vallam, P.; Qin, X. S.
2017-07-01
Flooding risk is increasing in many parts of the world and may worsen under climate change conditions. The accuracy of predicting flooding risk relies on reasonable projection of meteorological data (especially rainfall) at the local scale. The current statistical downscaling approaches face the difficulty of projecting multi-site climate information for future conditions while conserving spatial information. This study presents a combined Long Ashton Research Station Weather Generator (LARS-WG) stochastic weather generator and multi-site rainfall simulator RainSim (CLWRS) approach to investigate flow regimes under future conditions in the Kootenay Watershed, Canada. To understand the uncertainty effect stemming from different scenarios, the climate output is fed into a hydrologic model. The results showed different variation trends of annual peak flows (in 2080-2099) based on different climate change scenarios and demonstrated that the hydrological impact would be driven by the interaction between snowmelt and peak flows. The proposed CLWRS approach is useful where there is a need for projection of potential climate change scenarios.
Quantifying Cancer Risk from Radiation.
Keil, Alexander P; Richardson, David B
2017-12-06
Complex statistical models fitted to data from studies of atomic bomb survivors are used to estimate the human health effects of ionizing radiation exposures. We describe and illustrate an approach to estimate population risks from ionizing radiation exposure that relaxes many assumptions about radiation-related mortality. The approach draws on developments in methods for causal inference. The results offer a different way to quantify radiation's effects and show that conventional estimates of the population burden of excess cancer at high radiation doses are driven strongly by projecting outside the range of current data. Summary results obtained using the proposed approach are similar in magnitude to those obtained using conventional methods, although estimates of radiation-related excess cancers differ for many age, sex, and dose groups. At low doses relevant to typical exposures, the strength of evidence in data is surprisingly weak. Statements regarding human health effects at low doses rely strongly on the use of modeling assumptions. © 2017 Society for Risk Analysis.
Morgan, Sonya J; Pullon, Susan R H; Macdonald, Lindsay M; McKinlay, Eileen M; Gray, Ben V
2017-06-01
Case study research is a comprehensive method that incorporates multiple sources of data to provide detailed accounts of complex research phenomena in real-life contexts. However, current models of case study research do not particularly distinguish the unique contribution observation data can make. Observation methods have the potential to reach beyond other methods that rely largely or solely on self-report. This article describes the distinctive characteristics of case study observational research, a modified form of Yin's 2014 model of case study research, which the authors used in a study exploring interprofessional collaboration in primary care. In this approach, observation data are positioned as the central component of the research design. Case study observational research offers a promising approach for researchers in a wide range of health care settings seeking more complete understandings of complex topics, where contextual influences are of primary concern. Future research is needed to refine and evaluate the approach.
Temporal enhancement of two-dimensional color doppler echocardiography
NASA Astrophysics Data System (ADS)
Terentjev, Alexey B.; Settlemier, Scott H.; Perrin, Douglas P.; del Nido, Pedro J.; Shturts, Igor V.; Vasilyev, Nikolay V.
2016-03-01
Two-dimensional color Doppler echocardiography is widely used for assessing blood flow inside the heart and blood vessels. Currently, frame acquisition time for this method varies from tens to hundreds of milliseconds, depending on Doppler sector parameters. This leads to low frame rates in the resulting video sequences, on the order of tens of Hz, which is insufficient for some diagnostic purposes, especially in pediatrics. In this paper, we present a new approach for reconstruction of 2D color Doppler cardiac images, which results in the frame rate being increased to hundreds of Hz. This approach relies on a modified method of frame reordering originally applied to real-time 3D echocardiography. There are no previous publications describing application of this method to 2D color Doppler data. The approach has been tested on several in-vivo cardiac 2D color Doppler datasets with approximate duration of 30 sec and native frame rate of 15 Hz. The resulting image sequences had equivalent frame rates of 500 Hz.
Next generation capacity building for the GEOSS community - an European approach
NASA Astrophysics Data System (ADS)
Bye, B. L.
2016-12-01
The Group on Earth Observations embarked on its next 10-year phase with an ambition to streamline and further develop its achievements in building the Global Earth Observing System of Systems (GEOSS). The NextGEOSS project evolves the European vision of GEOSS data exploitation for innovation and business, relying on the three main pillars of engaging communities, delivering technological developments and advocating the use of GEOSS, in order to support the creation and deployment of Earth observation based innovative research activities and commercial services. In this presentation we will present the new integrated approach to capacity building, engaging the various actors involved in the entire value-chain from data providers to decision-makers. A presentation of the general approach together with concrete pilot cases will be included. In this work it will be shown how we integrate new technological development and societal change, enabling GEO and GEOSS to adapt to the current environment. The result is important for better decision-making and better use of our limited resources to manage our planet.
Classification review of dental adhesive systems: from the IV generation to the universal type
Sofan, Eshrak; Sofan, Afrah; Palaia, Gaspare; Tenore, Gianluca; Romeo, Umberto; Migliau, Guido
2017-01-01
Summary Adhesive dentistry has undergone great progress in the last decades. In light of minimally invasive dentistry, this new approach promotes a more conservative cavity design, which relies on the effectiveness of current enamel-dentine adhesives. Adhesive dentistry began in 1955 with Buonocore's work on the benefits of acid etching. With changing technologies, dental adhesives have evolved from no-etch to total-etch (4th and 5th generation) to self-etch (6th, 7th and 8th generation) systems. Currently, bonding to dental substrates is based on three different strategies: 1) etch-and-rinse, 2) self-etch and 3) the resin-modified glass-ionomer approach, which possesses the unique property of self-adherence to the tooth tissue. More recently, a new family of dentin adhesives has been introduced (universal or multi-mode adhesives), which may be used either as etch-and-rinse or as self-etch adhesives. The purpose of this article is to review the literature on the current knowledge of each adhesive system according to the classifications that have been advocated by many authorities for most operative/restorative procedures. The insights from several valuable studies that have contributed to the understanding of bonding to various substrates help clinicians to choose the appropriate dentin bonding agents for optimal clinical outcomes. PMID:28736601
Digital Repositories and the Question of Data Usefulness
NASA Astrophysics Data System (ADS)
Hughes, J. S.; Downs, R. R.
2017-12-01
The advent of ISO standards for trustworthy long-term digital repositories provides both a set of principles to develop long-term data repositories and the instruments to assess them for trustworthiness. Such mandatory high-level requirements are broad enough to be achievable, to some extent, by many scientific data centers, archives, and other repositories. But the requirement that the data be useful in the future, the requirement that is usually considered to be most relevant to the value of the repository for its user communities, largely remains subject to various interpretations and misunderstanding. However, current and future users will be relying on repositories to preserve and disseminate the data and information needed to discover, understand, and utilize these resources to support their research, learning, and decision-making objectives. Therefore, further study is needed to determine the approaches that can be adopted by repositories to make data useful to future communities of users. This presentation will describe approaches for enabling scientific data and related information, such as software, to be useful for current and potential future user communities and will present the methodology chosen to make one science discipline's data useful for both current and future users. The method uses an ontology-based information model to define and capture the information necessary to make the data useful for contemporary and future users.
Primary health care attributes and responses to intimate partner violence in Spain.
Goicolea, Isabel; Mosquera, Paola; Briones-Vozmediano, Erica; Otero-García, Laura; García-Quinto, Marta; Vives-Cases, Carmen
This study provides an overview of the perceptions of primary care professionals on how the current primary health care (PHC) attributes in Spain could influence health-related responses to intimate partner violence (IPV). A qualitative study was conducted using semi-structured interviews with 160 health professionals working in 16 PHC centres in Spain. Data were analysed using a qualitative content analysis. Four categories emerged from the interview analysis: those committed to the PHC approach, but with difficulties implementing it; community work relying on voluntarism; multidisciplinary team work or professionals who work together?; and continuity of care hindered by heavy work load. Participants felt that person-centred care as well as other attributes of the PHC approach facilitated detecting IPV and a better response to the problem. However, they also pointed out that the current management of the health system (workload, weak supervision and little feedback, misdistribution of human and material resources, etc.) does not facilitate the sustainability of such an approach. There is a gap between the theoretical attributes of PHC and the "reality" of how these attributes are managed in everyday work, and how this influences IPV care. Copyright © 2017 SESPAS. Publicado por Elsevier España, S.L.U. All rights reserved.
NASA Technical Reports Server (NTRS)
Pliutau, Denis; Prasad, Narasimha S
2013-01-01
Studies were performed to carry out semi-empirical validation of a new measurement approach we propose for molecular mixing ratio determination. The approach is based on relative measurements in bands of O2 and other molecules and as such may be best described as cross-band relative absorption (CoBRA). The current validation studies rely upon well-verified and established theoretical and experimental databases, satellite data assimilations and modeling codes such as HITRAN, the line-by-line radiative transfer model (LBLRTM), and the modern-era retrospective analysis for research and applications (MERRA). The approach holds promise for atmospheric mixing ratio measurements of CO2 and a variety of other molecules currently under investigation for several future satellite lidar missions. One of the advantages of the method is a significant reduction of the temperature sensitivity uncertainties, which is illustrated with application to the ASCENDS mission for the measurement of CO2 mixing ratios (XCO2). Additional advantages of the method include the possibility to closely match cross-band weighting function combinations, which is harder to achieve using conventional differential absorption techniques, and the potential for additional corrections for water vapor and other interferences without using data from numerical weather prediction (NWP) models.
Risk Management and Physical Modelling for Mountainous Natural Hazards
NASA Astrophysics Data System (ADS)
Lehning, Michael; Wilhelm, Christian
Population growth and climate change cause rapid changes in mountainous regions, resulting in increased risks of floods, avalanches, debris flows and other natural hazards. Xevents are of particular concern, since attempts to protect against them result in exponentially growing costs. In this contribution, we suggest an integral risk management approach to dealing with natural hazards that occur in mountainous areas. Using the example of a mountain pass road, which can be protected from the danger of an avalanche by engineering (galleries) and/or organisational (road closure) measures, we show the advantage of an optimal combination of both versus the traditional approach, which is to rely solely on engineering structures. Organisational measures become especially important for Xevents because engineering structures cannot be designed for those events. However, organisational measures need a reliable and objective forecast of the hazard. Therefore, we further suggest that such forecasts should be developed using physical numerical modelling. We present the status of current approaches to using physical modelling to predict snow cover stability for avalanche warnings and peak runoff from mountain catchments for flood warnings. While detailed physical models can already predict peak runoff reliably, they are only used to support avalanche warnings. With increased process knowledge and computer power, current developments should lead to an enhanced role for detailed physical models in natural mountain hazard prediction.
The Effects of Concurrent Verbal and Visual Tasks on Category Learning
ERIC Educational Resources Information Center
Miles, Sarah J.; Minda, John Paul
2011-01-01
Current theories of category learning posit separate verbal and nonverbal learning systems. Past research suggests that the verbal system relies on verbal working memory and executive functioning and learns rule-defined categories; the nonverbal system does not rely on verbal working memory and learns non-rule-defined categories (E. M. Waldron…
Solutions for data integration in functional genomics: a critical assessment and case study.
Smedley, Damian; Swertz, Morris A; Wolstencroft, Katy; Proctor, Glenn; Zouberakis, Michael; Bard, Jonathan; Hancock, John M; Schofield, Paul
2008-11-01
The torrent of data emerging from the application of new technologies to functional genomics and systems biology can no longer be contained within the traditional modes of data sharing and publication, with the consequence that data is being deposited in, distributed across and disseminated through an increasing number of databases. The resulting fragmentation poses serious problems for the model organism community, which increasingly relies on data mining and computational approaches that require gathering of data from a range of sources. In the light of these problems, the European Commission has funded a coordination action, CASIMIR (coordination and sustainability of international mouse informatics resources), with a remit to assess the technical and social aspects of database interoperability that currently prevent the full realization of the potential of data integration in mouse functional genomics. In this article, we assess the current problems with interoperability, with particular reference to mouse functional genomics, and critically review the technologies that can be deployed to overcome them. We describe a typical use-case where an investigator wishes to gather data on variation, genomic context and metabolic pathway involvement for genes discovered in a genome-wide screen. We go on to develop an automated approach involving an in silico experimental workflow tool, Taverna, using web services, BioMart and MOLGENIS technologies for data retrieval. Finally, we focus on the current impediments to adopting such an approach in a wider context, and strategies to overcome them.
NASA Astrophysics Data System (ADS)
Tian, Heng; Chen, GuanHua
2013-10-01
Going beyond the limitations of our earlier works [X. Zheng, F. Wang, C.Y. Yam, Y. Mo, G.H. Chen, Phys. Rev. B 75, 195127 (2007); X. Zheng, G.H. Chen, Y. Mo, S.K. Koo, H. Tian, C.Y. Yam, Y.J. Yan, J. Chem. Phys. 133, 114101 (2010)], we propose, in this manuscript, a new alternative approach to simulate time-dependent quantum transport phenomena from first principles. This new practical approach, which still retains the formal exactness of the HEOM framework, does not rely on any intractable parametrization scheme or on the pole structure of the Fermi distribution function and can thus be seamlessly incorporated into first-principles simulations, treating the transient response of an open electronic system to an external bias voltage at both zero and finite temperatures on an equal footing. The salient features of this approach are surveyed, and its time complexity is analysed. As a proof of principle of this approach, simulation of the transient current of a one-dimensional tight-binding chain, driven by direct external voltages, is demonstrated.
Analytic Steering: Inserting Context into the Information Dialog
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bohn, Shawn J.; Calapristi, Augustin J.; Brown, Shyretha D.
2011-10-23
An analyst’s intrinsic domain knowledge is a primary asset in almost any analysis task. Unstructured text analysis systems that apply unsupervised content analysis approaches can be more effective if they can leverage this domain knowledge in a manner that augments the information discovery process without obfuscating new or unexpected content. Current unsupervised approaches rely upon the prowess of the analyst to submit the right queries or observe generalized document and term relationships from ranked or visual results. We propose a new approach which allows the user to control or steer the analytic view within the unsupervised space. This process is controlled through the data characterization process via user-supplied context in the form of a collection of key terms. We show that steering with an appropriate choice of key terms can provide better relevance to the analytic domain and still enable the analyst to uncover unexpected relationships. This paper discusses cases where various analytic steering approaches can provide enhanced analysis results and cases where analytic steering can have a negative impact on the analysis process.
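A minimal sketch of the steering idea, assuming a simple term-weighting scheme that is only a stand-in for the paper's data characterization process: analyst-supplied key terms up-weight the corresponding features before unsupervised clustering.

```python
# Sketch of "steering" an unsupervised text view with analyst-supplied key terms.
# The boosting scheme is an illustrative stand-in, not the authors' exact algorithm.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.cluster import KMeans

docs = ["reactor coolant pump failure report", "quarterly budget and travel summary",
        "pump seal inspection findings", "conference travel reimbursement policy"]
key_terms = {"pump", "coolant", "seal"}   # analyst-supplied context
boost = 3.0

vec = TfidfVectorizer()
X = vec.fit_transform(docs).toarray()
for term in key_terms & set(vec.vocabulary_):
    X[:, vec.vocabulary_[term]] *= boost   # steer the feature space toward the analyst's context

labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)
print(labels)   # documents about the steered topic now group together
```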
NASA Technical Reports Server (NTRS)
Todling, Ricardo; Diniz, F. L. R.; Takacs, L. L.; Suarez, M. J.
2018-01-01
Many hybrid data assimilation systems currently used for NWP employ some form of dual-analysis system approach. Typically a hybrid variational analysis is responsible for creating initial conditions for high-resolution forecasts, and an ensemble analysis system is responsible for creating sample perturbations used to form the flow-dependent part of the background error covariance required in the hybrid analysis component. In many of these, the two analysis components employ different methodologies, e.g., variational and ensemble Kalman filter. In such cases, it is not uncommon to have observations treated rather differently between the two analysis components; recentering of the ensemble analysis around the hybrid analysis is used to compensate for such differences. Furthermore, in many cases, the hybrid variational high-resolution system implements some type of four-dimensional approach, whereas the underlying ensemble system relies on a three-dimensional approach, which again introduces discrepancies in the overall system. Connected to these is the expectation that one can reliably estimate observation impact on forecasts issued from hybrid analyses by using an ensemble approach based on the underlying ensemble strategy of dual-analysis systems. Just the realization that the ensemble analysis makes substantially different use of observations as compared to its hybrid counterpart should serve as enough evidence of the implausibility of such an expectation. This presentation assembles numerous pieces of anecdotal evidence to illustrate the fact that hybrid dual-analysis systems must, at the very minimum, strive for consistent use of the observations in both analysis sub-components. More simply, this work suggests that hybrid systems can reliably be constructed without the need to employ a dual-analysis approach. In practice, the idea of relying on a single analysis system is appealing from a cost-maintenance perspective. More generally, single-analysis systems avoid contradictions such as having to choose one sub-component to generate performance diagnostics for another, possibly not fully consistent, component.
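The flow-dependent part of the background error covariance mentioned above is commonly built as a weighted blend of a static covariance and a localized ensemble covariance; the sketch below shows that generic construction (not the specific GEOS/GMAO implementation) with made-up dimensions and weights.

```python
# Generic illustration of a hybrid background-error covariance,
#   B_hybrid = beta_s * B_static + beta_e * (L * B_ensemble),
# the standard construction behind hybrid variational analyses.
import numpy as np

def hybrid_covariance(B_static, ensemble_perturbations, localization, beta_s=0.5, beta_e=0.5):
    """ensemble_perturbations: (n_members, n_state) array of deviations from the ensemble mean."""
    n_members = ensemble_perturbations.shape[0]
    B_ens = ensemble_perturbations.T @ ensemble_perturbations / (n_members - 1)
    return beta_s * B_static + beta_e * (localization * B_ens)  # Schur (elementwise) localization

n = 50
members = np.random.default_rng(0).standard_normal((20, n))
members -= members.mean(axis=0)                       # center the perturbations
dist = np.abs(np.subtract.outer(np.arange(n), np.arange(n)))
L = np.exp(-(dist / 10.0) ** 2)                       # simple Gaussian localization
B = hybrid_covariance(np.eye(n), members, L)
print(B.shape)
```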
Changes in Adult Child Caregiver Networks
ERIC Educational Resources Information Center
Szinovacz, Maximiliane E.; Davey, Adam
2007-01-01
Purpose: Caregiving research has typically relied on cross-sectional data that focus on the primary caregiver. This approach neglects the dynamic and systemic character of caregiver networks. Our analyses addressed changes in adult child care networks over a 2-year period. Design and Methods: The study relied on pooled data from Waves 1 through 5…
Longmore, Monica A.; Johnson, Wendi L.; Manning, Wendy D.; Giordano, Peggy C.
2012-01-01
This study relies on survey (N = 704) and in-depth qualitative (N = 100) interviews (Toledo Adolescent Relationship Study) to examine individual, partner, and relationship barriers and facilitators to HIV testing in a sample of young adults. Consistent with the public health goal of routine testing, nearly 40% of respondents had an HIV test within the context of their current sexual relationship, and women were significantly more likely to have tested within the current relationship than were men. For women, it was both their own risky behavior and their partners’ characteristics and related relationship dynamics that distinguished testers from non-testers. In contrast, for men their own risky behavior was the most salient factor influencing their odds of being tested. These results showcase gender-specific approaches to best promote sexual health, i.e., routine HIV testing among young adults. PMID:22489753
Core conditions for alpha heating attained in direct-drive inertial confinement fusion
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bose, A.; Woo, K. M.; Betti, R.
It is shown that direct-drive implosions on the OMEGA laser have achieved core conditions that would lead to significant alpha heating at incident energies available on the National Ignition Facility (NIF) scale. The extrapolation of the experimental results from OMEGA to NIF energy assumes only that the implosion hydrodynamic efficiency is unchanged at higher energies. This approach is independent of the uncertainties in the physical mechanisms that degrade implosions on OMEGA, and relies solely on a volumetric scaling of the experimentally observed core conditions. It is estimated that the current best-performing OMEGA implosion [Regan et al., Phys. Rev. Lett. 117, 025001 (2016)] extrapolated to a 1.9 MJ laser driver with the same illumination configuration and laser-target coupling would produce 125 kJ of fusion energy with similar levels of alpha heating to those observed in the current highest performing indirect-drive NIF implosions.
Bounding species distribution models
Stohlgren, T.J.; Jarnevich, C.S.; Esaias, W.E.; Morisette, J.T.
2011-01-01
Species distribution models are increasing in popularity for mapping suitable habitat for species of management concern. Many investigators now recognize that extrapolations of these models with geographic information systems (GIS) might be sensitive to the environmental bounds of the data used in their development, yet there is no recommended best practice for "clamping" model extrapolations. We relied on two commonly used modeling approaches: classification and regression tree (CART) and maximum entropy (Maxent) models, and we tested a simple alteration of the model extrapolations, bounding extrapolations to the maximum and minimum values of primary environmental predictors, to provide a more realistic map of suitable habitat of hybridized Africanized honey bees in the southwestern United States. Findings suggest that multiple models of bounding, and the most conservative bounding of species distribution models, like those presented here, should probably replace the unbounded or loosely bounded techniques currently used. © 2011 Current Zoology.
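The bounding (or clamping) step itself is simple to express; the sketch below clamps projection-region predictors to the training-data envelope before the fitted model is applied, with toy predictor values standing in for the real climate layers.

```python
# Minimal sketch of "bounding" a species distribution model: predictor values in the
# projection region are clamped to the min/max observed in the training data before
# the fitted model is applied (illustrative; not the exact CART/Maxent workflow).
import numpy as np

def clamp_predictors(X_project, X_train):
    lo, hi = X_train.min(axis=0), X_train.max(axis=0)
    return np.clip(X_project, lo, hi)

X_train = np.array([[10.0, 200.0], [25.0, 800.0], [18.0, 450.0]])   # e.g. temperature, precipitation
X_project = np.array([[30.0, 100.0], [12.0, 900.0]])                # conditions outside training bounds
print(clamp_predictors(X_project, X_train))   # values pulled back into the environmental envelope
```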
Bounding Species Distribution Models
NASA Technical Reports Server (NTRS)
Stohlgren, Thomas J.; Jarnevich, Catherine S.; Morisette, Jeffrey T.; Esaias, Wayne E.
2011-01-01
Species distribution models are increasing in popularity for mapping suitable habitat for species of management concern. Many investigators now recognize that extrapolations of these models with geographic information systems (GIS) might be sensitive to the environmental bounds of the data used in their development, yet there is no recommended best practice for "clamping" model extrapolations. We relied on two commonly used modeling approaches: classification and regression tree (CART) and maximum entropy (Maxent) models, and we tested a simple alteration of the model extrapolations, bounding extrapolations to the maximum and minimum values of primary environmental predictors, to provide a more realistic map of suitable habitat of hybridized Africanized honey bees in the southwestern United States. Findings suggest that multiple models of bounding, and the most conservative bounding of species distribution models, like those presented here, should probably replace the unbounded or loosely bounded techniques currently used [Current Zoology 57 (5): 642-647, 2011].
Core conditions for alpha heating attained in direct-drive inertial confinement fusion
Bose, A.; Woo, K. M.; Betti, R.; ...
2016-07-07
It is shown that direct-drive implosions on the OMEGA laser have achieved core conditions that would lead to significant alpha heating at incident energies available on the National Ignition Facility (NIF) scale. The extrapolation of the experimental results from OMEGA to NIF energy assumes only that the implosion hydrodynamic efficiency is unchanged at higher energies. This approach is independent of the uncertainties in the physical mechanisms that degrade implosions on OMEGA, and relies solely on a volumetric scaling of the experimentally observed core conditions. It is estimated that the current best-performing OMEGA implosion [Regan et al., Phys. Rev. Lett. 117, 025001 (2016)] extrapolated to a 1.9 MJ laser driver with the same illumination configuration and laser-target coupling would produce 125 kJ of fusion energy with similar levels of alpha heating to those observed in the current highest performing indirect-drive NIF implosions.
Reinventing solid state electronics: Harnessing quantum confinement in bismuth thin films
NASA Astrophysics Data System (ADS)
Gity, Farzan; Ansari, Lida; Lanius, Martin; Schüffelgen, Peter; Mussler, Gregor; Grützmacher, Detlev; Greer, J. C.
2017-02-01
Solid state electronics relies on the intentional introduction of impurity atoms or dopants into a semiconductor crystal and/or the formation of junctions between different materials (heterojunctions) to create rectifiers, potential barriers, and conducting pathways. With these building blocks, switching and amplification of electrical currents and voltages are achieved. As miniaturisation continues to ultra-scaled transistors with critical dimensions on the order of ten atomic lengths, the concept of doping to form junctions fails and forming heterojunctions becomes extremely difficult. Here, it is shown that neither the introduction of dopant atoms nor a heterojunction is required to achieve the fundamental electronic function of current rectification. Ideal diode behavior or rectification is achieved solely by manipulation of quantum confinement using approximately 2 nm thick films consisting of a single atomic element, the semimetal bismuth. Crucially for nanoelectronics, this approach enables room temperature operation.
Libyan National Health Services: The Need to Move to Management-by-Objectives
El Taguri, A; Elkhammas, EA; Bakoush, O; Ashammakhi, N; Baccoush, M; Betilmal, I
2008-01-01
In the last four decades, there has been a substantial horizontal expansion of health services in Libya. This resulted in improvements in morbidity and mortality, particularly those related to infectious diseases. However, measures such as the national performance gap indicator reveal an underperforming health system. In this article, we discuss aspects related to the Libyan health system and its current status, including areas of weakness. Overcoming current failures and further improvement are unlikely to occur spontaneously without proper planning. Defining community health problems, identifying unmet needs, surveying resources to meet them, establishing SMART (specific, measurable, achievable, realistic and time-specific) objectives, and projecting administrative action to accomplish the proposed programs are a must. The health system should rely on newer approaches such as management-by-objectives and risk-management rather than the prevailing crisis-management attitude. PMID:21499467
Core conditions for alpha heating attained in direct-drive inertial confinement fusion.
Bose, A; Woo, K M; Betti, R; Campbell, E M; Mangino, D; Christopherson, A R; McCrory, R L; Nora, R; Regan, S P; Goncharov, V N; Sangster, T C; Forrest, C J; Frenje, J; Gatu Johnson, M; Glebov, V Yu; Knauer, J P; Marshall, F J; Stoeckl, C; Theobald, W
2016-07-01
It is shown that direct-drive implosions on the OMEGA laser have achieved core conditions that would lead to significant alpha heating at incident energies available on the National Ignition Facility (NIF) scale. The extrapolation of the experimental results from OMEGA to NIF energy assumes only that the implosion hydrodynamic efficiency is unchanged at higher energies. This approach is independent of the uncertainties in the physical mechanisms that degrade implosions on OMEGA, and relies solely on a volumetric scaling of the experimentally observed core conditions. It is estimated that the current best-performing OMEGA implosion [Regan et al., Phys. Rev. Lett. 117, 025001 (2016); 10.1103/PhysRevLett.117.025001] extrapolated to a 1.9 MJ laser driver with the same illumination configuration and laser-target coupling would produce 125 kJ of fusion energy with similar levels of alpha heating to those observed in the current highest performing indirect-drive NIF implosions.
Matrices and scaffolds for drug delivery in dental, oral and craniofacial tissue engineering
Moioli, Eduardo K.; Clark, Paul A.; Xin, Xuejun; Lal, Shan; Mao, Jeremy J.
2010-01-01
Current treatments for diseases and trauma of dental, oral and craniofacial (DOC) structures rely on durable materials such as amalgam and synthetic materials, or autologous tissue grafts. A paradigm shift has taken place to utilize tissue engineering and drug delivery approaches towards the regeneration of these structures. Several prototypes of DOC structures have been regenerated, such as the temporomandibular joint (TMJ) condyle, cranial sutures, tooth structures and periodontium components. However, many challenges remain when taking into consideration the high demand for esthetics of DOC structures, the complex environment and yet minimal scar formation in the oral cavity, and the need for accommodating multiple tissue phenotypes. This review highlights recent advances in the regeneration of DOC structures, including the tooth, periodontium, TMJ, cranial sutures and implant dentistry, with specific emphasis on controlled release of signaling cues for stem cells, biomaterial matrices and scaffolds, and integrated tissue engineering approaches. PMID:17499385
AEG-1 promoter-mediated imaging of prostate cancer
Bhatnagar, Akrita; Wang, Yuchuan; Mease, Ronnie C.; Gabrielson, Matthew; Sysa, Polina; Minn, Il; Green, Gilbert; Simmons, Brian; Gabrielson, Kathleen; Sarkar, Siddik; Fisher, Paul B.; Pomper, Martin G.
2014-01-01
We describe a new imaging method for detecting prostate cancer, whether localized or disseminated and metastatic to soft tissues and bone. The method relies on the use of imaging reporter genes under the control of the promoter of AEG-1 (MTDH), which is selectively active only in malignant cells. Through systemic, nanoparticle-based delivery of the imaging construct, lesions can be identified through bioluminescence imaging and single photon emission-computed tomography in the PC3-ML murine model of prostate cancer at high sensitivity. This approach is applicable for the detection of prostate cancer metastases, including bone lesions for which there is no current reliable agent for non-invasive clinical imaging. Further, the approach compares favorably to accepted and emerging clinical standards, including positron emission tomography with [18F]fluorodeoxyglucose and [18F]sodium fluoride. Our results offer a preclinical proof of concept that rationalizes clinical evaluation in patients with advanced prostate cancer. PMID:25145668
Forensic archaeology and anthropology: An Australian perspective.
Oakley, Kate
2005-09-01
Forensic archaeology is an extremely powerful investigative discipline and, in combination with forensic anthropology, can provide a wealth of evidentiary information to police investigators and the forensic community. The re-emergence of forensic archaeology and anthropology within Australia relies on its diversification and cooperation with established forensic medical organizations, law enforcement forensic service divisions, and national forensic boards. This presents a unique opportunity to develop a new multidisciplinary approach to forensic archaeology/anthropology within Australia as we hold a unique set of environmental, social, and cultural conditions that diverge from overseas models and require different methodological approaches. In the current world political climate, more forensic techniques are being applied at scenes of mass disasters, genocide, and terrorism. This provides Australian forensic archaeology/anthropology with a unique opportunity to develop multidisciplinary models with contributions from psychological profiling, ballistics, sociopolitics, cultural anthropology, mortuary technicians, post-blast analysis, fire analysis, and other disciplines from the world of forensic science.
Molecular methods for septicemia diagnosis.
Marco, Francesc
2017-11-01
Septicemia remains a major cause of hospital mortality. Blood culture remains the best approach to identify the etiological microorganisms when a bloodstream infection is suspected, but it takes a long time because it relies on bacterial or fungal growth. The introduction in clinical microbiology laboratories of matrix-assisted laser desorption ionization time-of-flight mass spectrometry technology, DNA hybridization, microarrays or rapid PCR-based tests significantly reduces the time to results. Tests for direct detection in whole blood samples are highly desirable because of their potential to identify bloodstream pathogens without waiting for blood cultures to become positive. Nonetheless, limitations of current molecular diagnostic methods are substantial. This article reviews these new molecular approaches (LightCycler SeptiFast, Magicplex sepsis real time, Septitest, VYOO, PCR/ESI-MS analysis, T2Candida). Copyright © 2017 Elsevier España, S.L.U. and Sociedad Española de Enfermedades Infecciosas y Microbiología Clínica. All rights reserved.
Droplet Microfluidics for Compartmentalized Cell Lysis and Extension of DNA from Single-Cells
NASA Astrophysics Data System (ADS)
Zimny, Philip; Juncker, David; Reisner, Walter
Current single cell DNA analysis methods suffer from (i) bias introduced by the need for molecular amplification and (ii) limited ability to sequence repetitive elements, resulting in (iii) an inability to obtain information regarding long range genomic features. Recent efforts to circumvent these limitations rely on techniques for sensing single molecules of DNA extracted from single-cells. Here we demonstrate a droplet microfluidic approach for encapsulation and biochemical processing of single-cells inside alginate microparticles. In our approach, single-cells are first packaged inside the alginate microparticles followed by cell lysis, DNA purification, and labeling steps performed off-chip inside this microparticle system. The alginate microparticles are then introduced inside a micro/nanofluidic system where the alginate is broken down via a chelating buffer, releasing long DNA molecules which are then extended inside nanofluidic channels for analysis via standard mapping protocols.
Biological fabrication of cellulose fibers with tailored properties.
Natalio, Filipe; Fuchs, Regina; Cohen, Sidney R; Leitus, Gregory; Fritz-Popovski, Gerhard; Paris, Oskar; Kappl, Michael; Butt, Hans-Jürgen
2017-09-15
Cotton is a promising basis for wearable smart textiles. Current approaches that rely on fiber coatings suffer from function loss during wear. We present an approach that allows biological incorporation of exogenous molecules into cotton fibers to tailor the material's functionality. In vitro model cultures of upland cotton (Gossypium hirsutum) are incubated with 6-carboxyfluorescein-glucose and dysprosium-1,4,7,10-tetraazacyclododecane-1,4,7,10-tetraacetic acid-glucose, where the glucose moiety acts as a carrier capable of traveling from the vascular connection to the outermost cell layer of the ovule epidermis, becoming incorporated into the cellulose fibers. This yields fibers with unnatural properties such as fluorescence or magnetism. Combining biological systems with the appropriate molecular design offers numerous possibilities to grow functional composite materials and implements a material-farming concept. Copyright © 2017 The Authors, some rights reserved; exclusive licensee American Association for the Advancement of Science. No claim to original U.S. Government Works.
1992-06-01
processes. It demands commitment and discipline. It relies on people and involves everyone. (DoD TQM Pamphlet (undated), 1) The following are four...export the TQM philosophy to their suppliers, as indicated in their brochure: TQM relies on continuous improvement in DoD’s acquired products and services
Active management of food allergy: an emerging concept.
Anagnostou, Katherine; Stiefel, Gary; Brough, Helen; du Toit, George; Lack, Gideon; Fox, Adam T
2015-04-01
IgE-mediated food allergies are common and currently there is no cure. Traditionally, management has relied upon patient education, food avoidance and the provision of an emergency medication plan. Despite this, food allergy can significantly impact on quality of life. Therefore, in recent years, evolving research has explored alternative management strategies. A more active approach to management is being adopted, which includes early introduction of potentially allergenic foods, anticipatory testing, active monitoring, desensitisation to food allergens and active risk management. This review will discuss these areas in turn. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://group.bmj.com/group/rights-licensing/permissions.
The Interfamilial Principle and the Harvest Festival.
Cherkassky, Lisa
2016-03-01
It is widely accepted that younger children can act as saviour siblings by donating cord blood or bone marrow to their gravely-ill brothers or sisters. However, it is under dispute whether these procedures are in the best interests of the child. This article suggests that parents may be relying on a thinly-veiled interfamilial approach, where the wider benefit to the whole family is used to justify the procedure to the Human Tissue Authority in the United Kingdom. This article suggests that the merging of familial interests to validate a non-therapeutic bone marrow harvest on a child forces altruism in a patient too young to understand, rendering the harvests unlawful under current law.
Proteoglycomics: Recent Progress and Future Challenges
Ly, Mellisa; Laremore, Tatiana N.
2010-01-01
Proteoglycomics is a systematic study of the structure, expression, and function of proteoglycans, a posttranslationally modified subset of the proteome. Although relying on the established technologies of proteomics and glycomics, proteoglycomics research requires unique approaches for elucidating structure–function relationships of both proteoglycan components, the glycosaminoglycan chain and the core protein. This review discusses our current understanding of the structure and function of proteoglycans, major players in development, normal physiology, and disease. A brief outline of proteoglycomic sample preparation and analysis is provided along with examples of several recent proteoglycomic studies. Unique challenges in the characterization of the glycosaminoglycan component of proteoglycans are discussed, with emphasis on the many analytical tools used and the types of information they provide. PMID:20450439
Semihierarchical quantum repeaters based on moderate lifetime quantum memories
NASA Astrophysics Data System (ADS)
Liu, Xiao; Zhou, Zong-Quan; Hua, Yi-Lin; Li, Chuan-Feng; Guo, Guang-Can
2017-01-01
The construction of large-scale quantum networks relies on the development of practical quantum repeaters. Many approaches have been proposed with the goal of outperforming the direct transmission of photons, but most of them are inefficient or difficult to implement with current technology. Here, we present a protocol that uses a semihierarchical structure to improve the entanglement distribution rate while reducing the requirement of memory time to a range of tens of milliseconds. This protocol can be implemented with a fixed distance of elementary links and fixed requirements on quantum memories, which are independent of the total distance. This configuration is especially suitable for scalable applications in large-scale quantum networks.
Fourier transform spectrometer controller for partitioned architectures
NASA Astrophysics Data System (ADS)
Tamas-Selicean, D.; Keymeulen, D.; Berisford, D.; Carlson, R.; Hand, K.; Pop, P.; Wadsworth, W.; Levy, R.
The current trend in spacecraft computing is to integrate applications of different criticality levels on the same platform using no separation. This approach increases the complexity of the development, verification and integration processes, with an impact on the whole system life cycle. Researchers at ESA and NASA advocated for the use of partitioned architecture to reduce this complexity. Partitioned architectures rely on platform mechanisms to provide robust temporal and spatial separation between applications. Such architectures have been successfully implemented in several industries, such as avionics and automotive. In this paper we investigate the challenges of developing and the benefits of integrating a scientific instrument, namely a Fourier Transform Spectrometer, in such a partitioned architecture.
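Temporal partitioning of the kind described here can be pictured as a static major frame of fixed execution windows; the toy schedule below uses hypothetical partition names and window lengths, whereas a real system (e.g., an ARINC 653-style kernel) enforces the budgets in the platform itself.

```python
# Toy illustration of temporal partitioning: a static major frame is divided into fixed
# windows and each application runs only inside its own window (hypothetical values).
MAJOR_FRAME = [            # (partition, window length in ms)
    ("FTS_instrument", 40),
    ("housekeeping", 10),
    ("telemetry", 20),
    ("spare", 30),
]

def run_major_frame(frame, handlers):
    t = 0
    for partition, window_ms in frame:
        handlers[partition](budget_ms=window_ms, start_ms=t)   # partition may not exceed its budget
        t += window_ms
    return t   # total major-frame length

handlers = {name: (lambda budget_ms, start_ms, n=name:
                   print(f"{start_ms:3d} ms: {n} runs for {budget_ms} ms"))
            for name, _ in MAJOR_FRAME}
print("major frame =", run_major_frame(MAJOR_FRAME, handlers), "ms")
```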
The role of exercise in amyotrophic lateral sclerosis.
Chen, Amy; Montes, Jacqueline; Mitsumoto, Hiroshi
2008-08-01
Amyotrophic lateral sclerosis (ALS) is a neurodegenerative disease affecting the motor nervous system. It causes progressive and cumulative physical disabilities in patients, and leads to eventual death due to respiratory muscle failure. The disease is diverse in its presentation, course, and progression. We do not yet fully understand the cause or causes of the disease, nor the mechanisms for its progression; thus, we lack effective means for treating this disease. Currently, we rely on a multidisciplinary approach to symptomatically manage and care for patients who have ALS. In this article, the authors review the literature on the role of exercise in patients who have ALS, and briefly compare what is known about exercise in other neuromuscular diseases.
Normal Theory Two-Stage ML Estimator When Data Are Missing at the Item Level
Savalei, Victoria; Rhemtulla, Mijke
2017-01-01
In many modeling contexts, the variables in the model are linear composites of the raw items measured for each participant; for instance, regression and path analysis models rely on scale scores, and structural equation models often use parcels as indicators of latent constructs. Currently, no analytic estimation method exists to appropriately handle missing data at the item level. Item-level multiple imputation (MI), however, can handle such missing data straightforwardly. In this article, we develop an analytic approach for dealing with item-level missing data—that is, one that obtains a unique set of parameter estimates directly from the incomplete data set and does not require imputations. The proposed approach is a variant of the two-stage maximum likelihood (TSML) methodology, and it is the analytic equivalent of item-level MI. We compare the new TSML approach to three existing alternatives for handling item-level missing data: scale-level full information maximum likelihood, available-case maximum likelihood, and item-level MI. We find that the TSML approach is the best analytic approach, and its performance is similar to item-level MI. We recommend its implementation in popular software and its further study. PMID:29276371
Normal Theory Two-Stage ML Estimator When Data Are Missing at the Item Level.
Savalei, Victoria; Rhemtulla, Mijke
2017-08-01
In many modeling contexts, the variables in the model are linear composites of the raw items measured for each participant; for instance, regression and path analysis models rely on scale scores, and structural equation models often use parcels as indicators of latent constructs. Currently, no analytic estimation method exists to appropriately handle missing data at the item level. Item-level multiple imputation (MI), however, can handle such missing data straightforwardly. In this article, we develop an analytic approach for dealing with item-level missing data-that is, one that obtains a unique set of parameter estimates directly from the incomplete data set and does not require imputations. The proposed approach is a variant of the two-stage maximum likelihood (TSML) methodology, and it is the analytic equivalent of item-level MI. We compare the new TSML approach to three existing alternatives for handling item-level missing data: scale-level full information maximum likelihood, available-case maximum likelihood, and item-level MI. We find that the TSML approach is the best analytic approach, and its performance is similar to item-level MI. We recommend its implementation in popular software and its further study.
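Under a multivariate normal assumption, the two-stage logic can be sketched as follows: stage 1 obtains saturated item-level means and covariances from the incomplete items by EM, and stage 2 maps them onto the composites, whose model is then fitted to the implied moments. The generic EM routine and the toy composites below are illustrative, not the authors' software.

```python
# Sketch of the two-stage idea for item-level missing data (multivariate normal assumption).
import numpy as np

def em_saturated(Y, n_iter=200):
    """EM estimates of the saturated mean/covariance with missing values coded as NaN."""
    n, p = Y.shape
    mu = np.nanmean(Y, axis=0)
    sigma = np.diag(np.nanvar(Y, axis=0))
    for _ in range(n_iter):
        sum_y, sum_yy = np.zeros(p), np.zeros((p, p))
        for i in range(n):
            obs = ~np.isnan(Y[i])
            mis = ~obs
            y_hat = Y[i].copy()
            cov_add = np.zeros((p, p))
            if mis.any():
                S_oo = sigma[np.ix_(obs, obs)]
                S_mo = sigma[np.ix_(mis, obs)]
                reg = S_mo @ np.linalg.solve(S_oo, np.eye(obs.sum()))
                y_hat[mis] = mu[mis] + reg @ (Y[i][obs] - mu[obs])        # conditional mean
                cov_add[np.ix_(mis, mis)] = sigma[np.ix_(mis, mis)] - reg @ S_mo.T
            sum_y += y_hat
            sum_yy += np.outer(y_hat, y_hat) + cov_add
        mu = sum_y / n
        sigma = sum_yy / n - np.outer(mu, mu)
    return mu, sigma

# Stage 2: composites are linear in the items, c = A y, so the implied moments are
# mu_c = A mu and Sigma_c = A Sigma A'; the structural model is then fitted to these.
rng = np.random.default_rng(0)
Y = rng.standard_normal((300, 6))
Y[rng.random(Y.shape) < 0.2] = np.nan            # 20% item-level missingness
A = np.kron(np.eye(2), np.ones((1, 3)) / 3.0)    # two scale scores, each the mean of 3 items
mu, sigma = em_saturated(Y)
mu_c, sigma_c = A @ mu, A @ sigma @ A.T
print(mu_c, sigma_c, sep="\n")
```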
Text mining of cancer-related information: review of current status and future directions.
Spasić, Irena; Livsey, Jacqueline; Keane, John A; Nenadić, Goran
2014-09-01
This paper reviews the research literature on text mining (TM) with the aim to find out (1) which cancer domains have been the subject of TM efforts, (2) which knowledge resources can support TM of cancer-related information and (3) to what extent systems that rely on knowledge and computational methods can convert text data into useful clinical information. These questions were used to determine the current state of the art in this particular strand of TM and suggest future directions in TM development to support cancer research. A review of the research on TM of cancer-related information was carried out. A literature search was conducted on the Medline database as well as IEEE Xplore and ACM digital libraries to address the interdisciplinary nature of such research. The search results were supplemented with the literature identified through Google Scholar. A range of studies have proven the feasibility of TM for extracting structured information from clinical narratives such as those found in pathology or radiology reports. In this article, we provide a critical overview of the current state of the art for TM related to cancer. The review highlighted a strong bias towards symbolic methods, e.g. named entity recognition (NER) based on dictionary lookup and information extraction (IE) relying on pattern matching. The F-measure of NER ranges between 80% and 90%, while that of IE for simple tasks is in the high 90s. To further improve the performance, TM approaches need to deal effectively with idiosyncrasies of the clinical sublanguage such as non-standard abbreviations as well as a high degree of spelling and grammatical errors. This requires a shift from rule-based methods to machine learning following the success of similar trends in biological applications of TM. Machine learning approaches require large training datasets, but clinical narratives are not readily available for TM research due to privacy and confidentiality concerns. This issue remains the main bottleneck for progress in this area. In addition, there is a need for a comprehensive cancer ontology that would enable semantic representation of textual information found in narrative reports. Copyright © 2014 The Authors. Published by Elsevier Ireland Ltd.. All rights reserved.
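The dictionary-lookup style of NER that the review finds dominant can be illustrated in a few lines; the mini-lexicon below is hypothetical, whereas real systems draw on large terminologies and must handle the abbreviations and misspellings the review highlights.

```python
# Toy dictionary-lookup NER over a clinical-style sentence (hypothetical mini-lexicon).
import re

LEXICON = {
    "adenocarcinoma": "Histology",
    "squamous cell carcinoma": "Histology",
    "tamoxifen": "Drug",
    "her2": "Biomarker",
}
# Longer terms first so multiword entries win inside the alternation.
pattern = re.compile(r"\b(" + "|".join(sorted(map(re.escape, LEXICON), key=len, reverse=True)) + r")\b",
                     re.IGNORECASE)

report = "Biopsy confirmed adenocarcinoma; HER2 positive, started on tamoxifen."
for m in pattern.finditer(report):
    print(m.group(1), "->", LEXICON[m.group(1).lower()])
```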
Hot Carrier-Based Near-Field Thermophotovoltaic Energy Conversion.
St-Gelais, Raphael; Bhatt, Gaurang Ravindra; Zhu, Linxiao; Fan, Shanhui; Lipson, Michal
2017-03-28
Near-field thermophotovoltaics (NFTPV) is a promising approach for direct conversion of heat to electrical power. This technology relies on the drastic enhancement of radiative heat transfer (compared to conventional blackbody radiation) that occurs when objects at different temperatures are brought to deep subwavelength distances (typically <100 nm) from each other. Achieving such radiative heat transfer between a hot object and a photovoltaic (PV) cell could allow direct conversion of heat to electricity with a greater efficiency than using current solid-state technologies (e.g., thermoelectric generators). One of the main challenges in the development of this technology, however, is its incompatibility with conventional silicon PV cells. Thermal radiation is weak at frequencies larger than the ∼1.1 eV bandgap of silicon, such that PV cells with lower excitation energies (typically 0.4-0.6 eV) are required for NFTPV. Using low bandgap III-V semiconductors to circumvent this limitation, as proposed in most theoretical works, is challenging and therefore has never been achieved experimentally. In this work, we show that hot carrier PV cells based on Schottky junctions between silicon and metallic films could provide an attractive solution for achieving high efficiency NFTPV electricity generation. Hot carrier science is currently an important field of research and several approaches are investigated for increasing the quantum efficiency (QE) of hot carrier generation beyond conventional Fowler model predictions. If the Fowler limit can indeed be overcome, we show that hot carrier-based NFTPV systems, after optimization of their thermal radiation spectrum, could allow electricity generation with up to 10-30% conversion efficiencies and 10-500 W/cm2 generated power densities (at 900-1500 K temperatures). We also discuss how the unique properties of thermal radiation in the extreme near-field are especially well suited for investigating recently proposed approaches for high QE hot carrier junctions. We therefore expect our work to be of interest for the field of hot carrier science and, by relying solely on conventional thin film materials, to provide a path for the experimental demonstration of NFTPV energy conversion.
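For orientation, the conventional Fowler prediction that the authors hope to surpass gives a quantum yield that grows as the square of the photon energy above the Schottky barrier; the barrier height and prefactor in this sketch are illustrative assumptions.

```python
# Fowler estimate of hot-carrier quantum yield over a Schottky barrier,
#   Y ~ C * (E_photon - phi_B)**2 / E_photon  for E_photon > phi_B.
# Barrier height and prefactor below are illustrative assumptions.
def fowler_yield(photon_energy_eV, barrier_eV=0.5, prefactor=1.0):
    if photon_energy_eV <= barrier_eV:
        return 0.0
    return prefactor * (photon_energy_eV - barrier_eV) ** 2 / photon_energy_eV

for e in (0.4, 0.6, 0.8, 1.0):   # near-field thermal photons are concentrated at low energies
    print(e, fowler_yield(e))
```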
Spatial abstraction for autonomous robot navigation.
Epstein, Susan L; Aroor, Anoop; Evanusa, Matthew; Sklar, Elizabeth I; Parsons, Simon
2015-09-01
Optimal navigation for a simulated robot relies on a detailed map and explicit path planning, an approach problematic for real-world robots that are subject to noise and error. This paper reports on autonomous robots that rely on local spatial perception, learning, and commonsense rationales instead. Despite realistic actuator error, learned spatial abstractions form a model that supports effective travel.
Statistical significance of trace evidence matches using independent physicochemical measurements
NASA Astrophysics Data System (ADS)
Almirall, Jose R.; Cole, Michael; Furton, Kenneth G.; Gettinby, George
1997-02-01
A statistical approach to the significance of glass evidence is proposed using independent physicochemical measurements and chemometrics. Traditional interpretation of the significance of trace evidence matches or exclusions relies on qualitative descriptors such as 'indistinguishable from,' 'consistent with,' 'similar to' etc. By performing physical and chemical measurements which are independent of one another, the significance of object exclusions or matches can be evaluated statistically. One of the problems with this approach is that the human brain is excellent at recognizing and classifying patterns and shapes but performs less well when that object is represented by a numerical list of attributes. Chemometrics can be employed to group similar objects using clustering algorithms and provide statistical significance in a quantitative manner. This approach is enhanced when population databases exist or can be created and the data in question can be evaluated given these databases. Since the selection of the variables used and their pre-processing can greatly influence the outcome, several different methods could be employed in order to obtain a more complete picture of the information contained in the data. Presently, we report on the analysis of glass samples using refractive index measurements and the quantitative analysis of the concentrations of the metals: Mg, Al, Ca, Fe, Mn, Ba, Sr, Ti and Zr. The extension of this general approach to fiber and paint comparisons also is discussed. This statistical approach should not replace the current interpretative approaches to trace evidence matches or exclusions but rather yields an additional quantitative measure. The lack of sufficient general population databases containing the needed physicochemical measurements and the potential for confusion arising from statistical analysis currently hamper this approach, and ways of overcoming these obstacles are presented.
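One way to realize the proposed chemometric grouping is to standardize the refractive index and elemental concentrations and apply hierarchical clustering; the fragment values below are invented for illustration, and a casework implementation would add the population databases and significance measures discussed above.

```python
# Sketch of the chemometric grouping step: standardize the measurements, then cluster
# fragments so that "matches" correspond to membership in the same cluster (toy data).
import numpy as np
from scipy.cluster.hierarchy import fcluster, linkage

# columns: RI, Mg, Al, Ca, Fe, Mn, Ba, Sr, Ti, Zr (concentrations in arbitrary units)
fragments = np.array([
    [1.5181, 2.1, 0.8, 8.5, 0.30, 0.02, 0.15, 0.05, 0.04, 0.01],
    [1.5183, 2.0, 0.8, 8.4, 0.31, 0.02, 0.16, 0.05, 0.04, 0.01],   # likely same source as row 0
    [1.5222, 3.5, 1.9, 7.1, 0.55, 0.04, 0.02, 0.11, 0.09, 0.03],   # different source
])
z = (fragments - fragments.mean(axis=0)) / fragments.std(axis=0)   # standardize each variable
labels = fcluster(linkage(z, method="ward"), t=2, criterion="maxclust")
print(labels)   # e.g. [1 1 2]: the first two fragments group together
```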
Future Issues and Perspectives in the Evaluation of Social Development.
ERIC Educational Resources Information Center
Marsden, David; Oakley, Peter
1991-01-01
An instrumental/technocratic approach to evaluation of social development relies on primarily quantitative methods. An interpretive approach resists claims to legitimacy and authority of "experts" and questions existing interpretations. The latter approach is characterized by cultural relativism and subjectivity. (SK)
Automated visualization of rule-based models
Tapia, Jose-Juan; Faeder, James R.
2017-01-01
Frameworks such as BioNetGen, Kappa and Simmune use “reaction rules” to specify biochemical interactions compactly, where each rule specifies a mechanism such as binding or phosphorylation and its structural requirements. Current rule-based models of signaling pathways have tens to hundreds of rules, and these numbers are expected to increase as more molecule types and pathways are added. Visual representations are critical for conveying rule-based models, but current approaches to show rules and interactions between rules scale poorly with model size. Also, inferring design motifs that emerge from biochemical interactions is an open problem, so current approaches to visualize model architecture rely on manual interpretation of the model. Here, we present three new visualization tools that constitute an automated visualization framework for rule-based models: (i) a compact rule visualization that efficiently displays each rule, (ii) the atom-rule graph that conveys regulatory interactions in the model as a bipartite network, and (iii) a tunable compression pipeline that incorporates expert knowledge and produces compact diagrams of model architecture when applied to the atom-rule graph. The compressed graphs convey network motifs and architectural features useful for understanding both small and large rule-based models, as we show by application to specific examples. Our tools also produce more readable diagrams than current approaches, as we show by comparing visualizations of 27 published models using standard graph metrics. We provide an implementation in the open source and freely available BioNetGen framework, but the underlying methods are general and can be applied to rule-based models from the Kappa and Simmune frameworks also. We expect that these tools will promote communication and analysis of rule-based models and their eventual integration into comprehensive whole-cell models. PMID:29131816
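The atom-rule graph idea, one node set for rules and one for the structural features they read or write, can be sketched with a small bipartite network; the toy rules below are hypothetical and are not drawn from a published BioNetGen model.

```python
# Minimal illustration of an atom-rule style bipartite graph built with networkx.
import networkx as nx

G = nx.DiGraph()
rules = {
    "R1_bind":          {"reads": ["EGFR.ecd"],     "writes": ["EGFR.dimer"]},
    "R2_phosphorylate": {"reads": ["EGFR.dimer"],   "writes": ["EGFR.Y1068~P"]},
    "R3_recruit":       {"reads": ["EGFR.Y1068~P"], "writes": ["Grb2.SH2_bound"]},
}
for rule, deps in rules.items():
    G.add_node(rule, bipartite="rule")
    for atom in deps["reads"]:
        G.add_node(atom, bipartite="atom")
        G.add_edge(atom, rule)      # atom is required context for the rule
    for atom in deps["writes"]:
        G.add_node(atom, bipartite="atom")
        G.add_edge(rule, atom)      # rule produces or modifies the atom

print(nx.is_bipartite(G))            # True: rules and atoms form the two node sets
print(list(nx.topological_sort(G)))  # a rough regulatory ordering for this acyclic toy example
```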
DOE Office of Scientific and Technical Information (OSTI.GOV)
Granderson, Jessica; Touzani, Samir; Taylor, Cody
Trustworthy savings calculations are critical to convincing regulators of both the cost-effectiveness of energy efficiency program investments and their ability to defer supply-side capital investments. Today’s methods for measurement and verification (M&V) of energy savings constitute a significant portion of the total costs of energy efficiency programs. They also require time-consuming data acquisition. A spectrum of savings calculation approaches is used, with some relying more heavily on measured data and others relying more heavily on estimated, modeled, or stipulated data. The rising availability of “smart” meters and devices that report near-real time data, combined with new analytical approaches to quantifying savings, offers potential to conduct M&V more quickly and at lower cost, with comparable or improved accuracy. Commercial energy management and information systems (EMIS) technologies are beginning to offer M&V capabilities, and program administrators want to understand how they might assist programs in quickly and accurately measuring energy savings. This paper presents the results of recent testing of the ability to use automation to streamline some parts of M&V. We detail metrics to assess the performance of these new M&V approaches, and a framework to compute the metrics. We also discuss the accuracy, cost, and time trade-offs between more traditional M&V and these emerging streamlined methods that use high-resolution energy data and automated computational intelligence. Finally, we discuss the potential evolution of M&V and early results of pilots currently underway to incorporate M&V automation into ratepayer-funded programs and professional implementation and evaluation practice.
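Typical accuracy metrics for such automated baseline models are the normalized mean bias error and the coefficient of variation of the RMSE, as used in ASHRAE Guideline 14; the sketch below uses toy daily consumption data and the common convention of one model parameter in the degrees of freedom.

```python
# Two widely used whole-building M&V accuracy metrics for a baseline model's
# predictions over a reporting period (toy data; n-1 degrees of freedom assumed).
import numpy as np

def nmbe(measured, predicted):
    measured, predicted = np.asarray(measured, float), np.asarray(predicted, float)
    return 100.0 * (measured - predicted).sum() / ((len(measured) - 1) * measured.mean())

def cv_rmse(measured, predicted):
    measured, predicted = np.asarray(measured, float), np.asarray(predicted, float)
    rmse = np.sqrt(((measured - predicted) ** 2).sum() / (len(measured) - 1))
    return 100.0 * rmse / measured.mean()

measured = [120.0, 135.0, 128.0, 150.0, 142.0]    # e.g. daily kWh
predicted = [118.0, 138.0, 125.0, 149.0, 145.0]
print(f"NMBE = {nmbe(measured, predicted):.2f}%  CV(RMSE) = {cv_rmse(measured, predicted):.2f}%")
```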
Mechanistic species distribution modelling as a link between physiology and conservation.
Evans, Tyler G; Diamond, Sarah E; Kelly, Morgan W
2015-01-01
Climate change conservation planning relies heavily on correlative species distribution models that estimate future areas of occupancy based on environmental conditions encountered in present-day ranges. The approach benefits from rapid assessment of vulnerability over a large number of organisms, but can have poor predictive power when transposed to novel environments and reveals little in the way of causal mechanisms that define changes in species distribution or abundance. Having conservation planning rely largely on this single approach also increases the risk of policy failure. Mechanistic models that are parameterized with physiological information are expected to be more robust when extrapolating distributions to future environmental conditions and can identify physiological processes that set range boundaries. Implementation of mechanistic species distribution models requires knowledge of how environmental change influences physiological performance, and because this information is currently restricted to a comparatively small number of well-studied organisms, use of mechanistic modelling in the context of climate change conservation is limited. In this review, we propose that the need to develop mechanistic models that incorporate physiological data presents an opportunity for physiologists to contribute more directly to climate change conservation and advance the field of conservation physiology. We begin by describing the prevalence of species distribution modelling in climate change conservation, highlighting the benefits and drawbacks of both mechanistic and correlative approaches. Next, we emphasize the need to expand mechanistic models and discuss potential metrics of physiological performance suitable for integration into mechanistic models. We conclude by summarizing other factors, such as the need to consider demography, limiting broader application of mechanistic models in climate change conservation. Ideally, modellers, physiologists and conservation practitioners would work collaboratively to build models, interpret results and consider conservation management options, and articulating this need here may help to stimulate collaboration.
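In its simplest form, the mechanistic route replaces the correlative fit with a physiological performance curve evaluated over candidate environments; the curve shape, parameters and threshold below are illustrative assumptions rather than measured values.

```python
# Cartoon of a mechanistic SDM step: occupancy is predicted wherever a physiological
# performance curve exceeds a viability threshold (all parameters are illustrative).
import numpy as np

def thermal_performance(T, T_opt=22.0, breadth=6.0):
    return np.exp(-((T - T_opt) / breadth) ** 2)    # relative performance in [0, 1]

site_temperatures = np.array([8.0, 15.0, 21.0, 27.0, 34.0])   # e.g. projected mean summer temps
suitable = thermal_performance(site_temperatures) > 0.5
print(dict(zip(site_temperatures.tolist(), suitable.tolist())))
```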
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pražnikar, Jure; University of Primorska; Turk, Dušan, E-mail: dusan.turk@ijs.si
2014-12-01
The maximum-likelihood free-kick target, which calculates model error estimates from the work set and a randomly displaced model, proved superior in the accuracy and consistency of refinement of crystal structures compared with the maximum-likelihood cross-validation target, which calculates error estimates from the test set and the unperturbed model. The refinement of a molecular model is a computational procedure by which the atomic model is fitted to the diffraction data. The commonly used target in the refinement of macromolecular structures is the maximum-likelihood (ML) function, which relies on the assessment of model errors. The current ML functions rely on cross-validation. They utilize phase-error estimates that are calculated from a small fraction of diffraction data, called the test set, that are not used to fit the model. An approach has been developed that uses the work set to calculate the phase-error estimates in the ML refinement from simulating the model errors via the random displacement of atomic coordinates. It is called ML free-kick refinement as it uses the ML formulation of the target function and is based on the idea of freeing the model from the model bias imposed by the chemical energy restraints used in refinement. This approach for the calculation of error estimates is superior to the cross-validation approach: it reduces the phase error and increases the accuracy of molecular models, is more robust, provides clearer maps and may use a smaller portion of data for the test set for the calculation of Rfree or may leave it out completely.
Optimization of monopiles for offshore wind turbines.
Kallehave, Dan; Byrne, Byron W; LeBlanc Thilsted, Christian; Mikkelsen, Kristian Kousgaard
2015-02-28
The offshore wind industry currently relies on subsidy schemes to be competitive with fossil-fuel-based energy sources. For the wind industry to survive, it is vital that costs are significantly reduced for future projects. This can be partly achieved by introducing new technologies and partly through optimization of existing technologies and design methods. One of the areas where costs can be reduced is in the support structure, where better designs, cheaper fabrication and quicker installation might all be possible. The prevailing support structure design is the monopile structure, where the simple design is well suited to mass-fabrication, and the installation approach, based on conventional impact driving, is relatively low-risk and robust for most soil conditions. The range of application of the monopile for future wind farms can be extended by using more accurate engineering design methods, specifically tailored to offshore wind industry design. This paper describes how state-of-the-art optimization approaches are applied to the design of current wind farms and monopile support structures and identifies the main drivers where more accurate engineering methods could impact on a next generation of highly optimized monopiles. © 2015 The Author(s) Published by the Royal Society. All rights reserved.
Recurrence Quantification Analysis of Sentence-Level Speech Kinematics.
Jackson, Eric S; Tiede, Mark; Riley, Michael A; Whalen, D H
2016-12-01
Current approaches to assessing sentence-level speech variability rely on measures that quantify variability across utterances and use normalization procedures that alter raw trajectory data. The current work tests the feasibility of a less restrictive nonlinear approach, recurrence quantification analysis (RQA), via a procedural example and subsequent analysis of kinematic data. To test the feasibility of RQA, lip aperture (i.e., the Euclidean distance between lip-tracking sensors) was recorded for 21 typically developing adult speakers during production of a simple utterance. The utterance was produced in isolation and in carrier structures differing just in length or in length and complexity. Four RQA indices were calculated: percent recurrence (%REC), percent determinism (%DET), stability (MAXLINE), and stationarity (TREND). Percent determinism (%DET) decreased only for the most linguistically complex sentence; MAXLINE decreased as a function of linguistic complexity but increased for the longer-only sentence; TREND decreased as a function of both length and linguistic complexity. This research note demonstrates the feasibility of using RQA as a tool to compare speech variability across speakers and groups. RQA offers promise as a technique to assess effects of potential stressors (e.g., linguistic or cognitive factors) on the speech production system.
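Two of the indices, %REC and %DET, are straightforward to compute from a recurrence plot; the sketch below uses an arbitrary embedding, radius and test signal rather than the study's settings.

```python
# Minimal sketch of two RQA indices on a 1-D kinematic signal such as lip aperture:
# %REC is the density of recurrent points and %DET the share of recurrent points lying
# on diagonal lines of length >= 2 (embedding and radius are arbitrary choices here).
import numpy as np

def rqa_rec_det(x, dim=3, delay=5, radius=0.2, min_line=2):
    x = np.asarray(x, float)
    n = len(x) - (dim - 1) * delay
    emb = np.column_stack([x[i * delay: i * delay + n] for i in range(dim)])   # time-delay embedding
    dists = np.linalg.norm(emb[:, None, :] - emb[None, :, :], axis=-1)
    R = (dists <= radius * x.std()).astype(int)
    np.fill_diagonal(R, 0)                       # exclude the line of identity
    rec = R.sum() / (n * n - n)

    diag_points = 0
    for k in range(1, n):                        # scan upper diagonals for line structures
        run = 0
        for v in np.append(np.diag(R, k), 0):
            if v:
                run += 1
            else:
                if run >= min_line:
                    diag_points += run
                run = 0
    det = 2 * diag_points / R.sum() if R.sum() else 0.0
    return 100 * rec, 100 * det

t = np.linspace(0, 4 * np.pi, 400)
print(rqa_rec_det(np.sin(t) + 0.05 * np.random.default_rng(0).standard_normal(400)))
```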
Let Your Fingers Do the Walking.
ERIC Educational Resources Information Center
Pecchenino, Eve Hill; St. John, Jeanne
1983-01-01
An approach relying on the principles of acupuncture and emphasizing relaxation and trust has promoted psychomotor, health, communication-cognition, and affective skills in handicapped students. The approach has involved staff, parents, and community members. (CL)
Polyphony: superposition independent methods for ensemble-based drug discovery.
Pitt, William R; Montalvão, Rinaldo W; Blundell, Tom L
2014-09-30
Structure-based drug design is an iterative process, following cycles of structural biology, computer-aided design, synthetic chemistry and bioassay. In favorable circumstances, this process can lead to hundreds of protein-ligand crystal structures. In addition, molecular dynamics simulations are increasingly being used to further explore the conformational landscape of these complexes. Currently, methods capable of the analysis of ensembles of crystal structures and MD trajectories are limited and usually rely upon least squares superposition of coordinates. Novel methodologies are described for the analysis of multiple structures of a protein. Statistical approaches that rely upon residue equivalence, but not superposition, are developed. Tasks that can be performed include the identification of hinge regions, allosteric conformational changes and transient binding sites. The approaches are tested on crystal structures of CDK2 and other CMGC protein kinases and a simulation of p38α. Known relationships between interactions and conformational changes are highlighted, and new ones are revealed. A transient but druggable allosteric pocket in CDK2 is predicted to occur under the CMGC insert. Furthermore, an evolutionarily conserved conformational link from the location of this pocket, via the αEF-αF loop, to phosphorylation sites on the activation loop is discovered. New methodologies are described and validated for the superimposition-independent conformational analysis of large collections of structures or simulation snapshots of the same protein. The methodologies are encoded in a Python package called Polyphony, which is released as open source to accompany this paper [http://wrpitt.bitbucket.org/polyphony/].
Multimode-Optical-Fiber Imaging Probe
NASA Technical Reports Server (NTRS)
Jackson, Deborah
2000-01-01
Currently, endoscopic surgery uses single-mode fiber bundles to obtain in vivo image information inside orifices of the body. This limits their use to the larger natural bodily orifices and to surgical procedures where there is plenty of room for manipulation. The knee joint, for example, can be easily viewed with a fiber optic viewer, but joints in the finger cannot. However, there are a host of smaller orifices where fiber endoscopy would play an important role if a cost-effective fiber probe were developed with small enough dimensions (< 250 microns). Examples of beneficiaries of micro-endoscopes are the treatment of the Eustachian tube of the middle ear, the breast ducts, tear ducts, coronary arteries, fallopian tubes, as well as the treatment of salivary duct parotid disease, and the neuro endoscopy of the ventricles and spinal canal. To solve this problem, this work describes an approach for recovering images from tightly confined spaces using multimode fibers and analytically demonstrates that the concept is sound. The proof of concept draws upon earlier works that concentrated on image recovery after two-way transmission through a multimode fiber as well as work that demonstrated the recovery of images after one-way transmission through a multimode fiber. Both relied on generating a phase conjugated wavefront which was predistorted with the characteristics of the fiber. The described approach also relies on generating a phase conjugated wavefront, but utilizes two fibers to capture the image at some intermediate point (accessible by the fibers, but which is otherwise visually inaccessible).
Multimode-Optical-Fiber Imaging Probe
NASA Technical Reports Server (NTRS)
Jackson, Deborah
1999-01-01
Currently, endoscopic surgery uses single-mode fiber bundles to obtain in vivo image information inside the orifices of the body. This limits their use to the larger natural orifices and to surgical procedures where there is plenty of room for manipulation. The knee joint, for example, can be easily viewed with a fiber optic viewer, but joints in the finger cannot. However, there are a host of smaller orifices where fiber endoscopy would play an important role if a cost-effective fiber probe were developed with small enough dimensions (less than or equal to 250 microns). Examples of beneficiaries of micro-endoscopes are the treatment of the Eustachian tube of the middle ear, the breast ducts, tear ducts, coronary arteries, fallopian tubes, as well as the treatment of salivary duct parotid disease, and the neuro endoscopy of the ventricles and spinal canal. This work describes an approach for recovering images from tightly confined spaces using multimode fibers. The concept draws upon earlier works that concentrated on image recovery after two-way transmission through a multimode fiber as well as work that demonstrated the recovery of images after one-way transmission through a multimode fiber. Both relied on generating a phase conjugated wavefront, which was predistorted with the characteristics of the fiber. The approach described here also relies on generating a phase conjugated wavefront, but utilizes two fibers to capture the image at some intermediate point (accessible by the fibers, but which is otherwise visually inaccessible).
A comparative study of radiofrequency antennas for Helicon plasma sources
NASA Astrophysics Data System (ADS)
Melazzi, D.; Lancellotti, V.
2015-04-01
Since Helicon plasma sources can efficiently couple power and generate high-density plasma, they have received interest also as spacecraft propulsive devices, among other applications. In order to maximize the power deposited into the plasma, it is necessary to assess the performance of the radiofrequency (RF) antenna that drives the discharge, as typical plasma parameters (e.g. the density) are varied. For this reason, we have conducted a comparative analysis of three Helicon sources which feature different RF antennas, namely, the single-loop, the Nagoya type-III and the fractional helix. These antennas are compared in terms of input impedance and induced current density; in particular, the real part of the impedance constitutes a measure of the antenna ability to couple power into the plasma. The results presented in this work have been obtained through a full-wave approach which (being hinged on the numerical solution of a system of integral equations) allows computing the antenna current and impedance self-consistently. Our findings indicate that certain combinations of plasma parameters can indeed maximize the real part of the input impedance and, thus, the deposited power, and that one of the three antennas analyzed performs best for a given plasma. Furthermore, unlike other strategies which rely on approximate antenna models, our approach enables us to reveal that the antenna current density is not spatially uniform, and that a correlation exists between the plasma parameters and the spatial distribution of the current density.
Anatomy integration blueprint: A fourth-year musculoskeletal anatomy elective model.
Lazarus, Michelle D; Kauffman, Gordon L; Kothari, Milind J; Mosher, Timothy J; Silvis, Matthew L; Wawrzyniak, John R; Anderson, Daniel T; Black, Kevin P
2014-01-01
Current undergraduate medical school curricular trends focus on both vertical integration of clinical knowledge into the traditionally basic science-dedicated curricula and increasing basic science education in the clinical years. This latter type of integration is more difficult and less reported on than the former. Here, we present an outline of a course wherein the primary learning and teaching objective is to integrate basic science anatomy knowledge with clinical education. The course was developed through collaboration by a multi-specialist course development team (composed of both basic scientists and physicians) and was founded in current adult learning theories. The course was designed to be widely applicable to multiple future specialties, using current published reports regarding the topics and clinical care areas relying heavily on anatomical knowledge regardless of specialist focus. To this end, the course focuses on the role of anatomy in the diagnosis and treatment of frequently encountered musculoskeletal conditions. Our iterative implementation and action research approach to this course development has yielded a curricular template for anatomy integration into clinical years. Key components for successful implementation of these types of courses, including content topic sequence, the faculty development team, learning approaches, and hidden curricula, were developed. We also report preliminary feedback from course stakeholders and lessons learned through the process. The purpose of this report is to enhance the current literature regarding basic science integration in the clinical years of medical school. © 2014 American Association of Anatomists.
Towards a whole-cell modeling approach for synthetic biology
NASA Astrophysics Data System (ADS)
Purcell, Oliver; Jain, Bonny; Karr, Jonathan R.; Covert, Markus W.; Lu, Timothy K.
2013-06-01
Despite rapid advances over the last decade, synthetic biology lacks the predictive tools needed to enable rational design. Unlike established engineering disciplines, the engineering of synthetic gene circuits still relies heavily on experimental trial-and-error, a time-consuming and inefficient process that slows down the biological design cycle. This reliance on experimental tuning is because current modeling approaches are unable to make reliable predictions about the in vivo behavior of synthetic circuits. A major reason for this lack of predictability is that current models view circuits in isolation, ignoring the vast number of complex cellular processes that impinge on the dynamics of the synthetic circuit and vice versa. To address this problem, we present a modeling approach for the design of synthetic circuits in the context of cellular networks. Using the recently published whole-cell model of Mycoplasma genitalium, we examined the effect of adding genes into the host genome. We also investigated how codon usage correlates with gene expression and found agreement with existing experimental results. Finally, we successfully implemented a synthetic Goodwin oscillator in the whole-cell model. We provide an updated software framework for the whole-cell model that lays the foundation for the integration of whole-cell models with synthetic gene circuit models. This software framework is made freely available to the community to enable future extensions. We envision that this approach will be critical to transforming the field of synthetic biology into a rational and predictive engineering discipline.
NASA Technical Reports Server (NTRS)
Pliutau, Denis; Prasad, Narashimha S.
2013-01-01
Current approaches to satellite observation data storage and distribution implement separate visualization and data access methodologies, which often creates a need for time-consuming data ordering and coding for applications requiring both visual representation and data handling and modeling capabilities. We describe an approach we implemented for a data-encoded web map service based on storing numerical data within server map tiles and subsequent client-side data manipulation and map color rendering. The approach relies on storing data using the lossless Portable Network Graphics (PNG) image format, which is natively supported by web browsers, allowing on-the-fly browser rendering and modification of the map tiles. The method is easy to implement using existing software libraries and has the advantage of easy client-side map color modification, as well as spatial subsetting with physical parameter range filtering. This method is demonstrated for the ASTER-GDEM elevation model and selected MODIS data products and represents an alternative to the currently used storage and data access methods. One additional benefit is that multiple levels of averaging are provided because map tiles must be generated at varying resolutions for the different map magnification levels. We suggest that such a merged data and mapping approach may be a viable alternative to existing static storage and data access methods for a wide array of combined simulation, data access and visualization purposes.
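As a rough illustration of the tile-encoding idea (the exact byte layout used by the authors is not specified here, so the two-channel packing below is only an assumption), numerical values can be quantised and split across the lossless PNG colour channels on the server, then reassembled in the client before rendering:

```python
import numpy as np
from PIL import Image

def encode_tile(data, vmin, vmax, path):
    """Pack a 2-D float field into the R and G channels of a lossless PNG tile."""
    scaled = np.clip((data - vmin) / (vmax - vmin), 0.0, 1.0)
    counts = np.round(scaled * 65535).astype(np.uint16)   # 16-bit quantisation
    rgb = np.zeros(data.shape + (3,), dtype=np.uint8)
    rgb[..., 0] = counts >> 8                              # high byte -> red channel
    rgb[..., 1] = counts & 0xFF                            # low byte  -> green channel
    Image.fromarray(rgb, mode="RGB").save(path, format="PNG")

def decode_tile(path, vmin, vmax):
    """Recover the numerical field from a data-encoded PNG tile (e.g. in the client)."""
    rgb = np.asarray(Image.open(path).convert("RGB"), dtype=np.uint16)
    counts = (rgb[..., 0] << 8) | rgb[..., 1]
    return vmin + (counts / 65535.0) * (vmax - vmin)

# Example: encode a synthetic elevation tile and read it back
tile = np.random.uniform(-100.0, 4000.0, size=(256, 256))
encode_tile(tile, -500.0, 9000.0, "tile_z5_x10_y12.png")
recovered = decode_tile("tile_z5_x10_y12.png", -500.0, 9000.0)
```

Because the quantised values survive PNG compression exactly, the client can re-colour, filter by value range, or subset the tile without another round trip to the server.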
Interteaching: An Evidence-Based Approach to Instruction
ERIC Educational Resources Information Center
Brown, Thomas Wade; Killingsworth, Kenneth; Alavosius, Mark P.
2014-01-01
This paper describes "interteaching" as an evidence-based method of instruction. Instructors often rely on more traditional approaches, such as lectures, as means to deliver instruction. Despite high usage, these methods are ineffective at achieving desirable academic outcomes. We discuss an innovative approach to delivering instruction…
Often, human health risk assessments have relied on qualitative approaches for hazard identification to integrate evidence across multiple studies to conclude whether particular hazards exist. However, quantitative approaches for evidence integration, including the application o...
Modeling Time-Dependent Association in Longitudinal Data: A Lag as Moderator Approach
ERIC Educational Resources Information Center
Selig, James P.; Preacher, Kristopher J.; Little, Todd D.
2012-01-01
We describe a straightforward, yet novel, approach to examine time-dependent association between variables. The approach relies on a measurement-lag research design in conjunction with statistical interaction models. We base arguments in favor of this approach on the potential for better understanding the associations between variables by…
The Role of Probability in Developing Learners' Models of Simulation Approaches to Inference
ERIC Educational Resources Information Center
Lee, Hollylynne S.; Doerr, Helen M.; Tran, Dung; Lovett, Jennifer N.
2016-01-01
Repeated sampling approaches to inference that rely on simulations have recently gained prominence in statistics education, and probabilistic concepts are at the core of this approach. In this approach, learners need to develop a mapping among the problem situation, a physical enactment, computer representations, and the underlying randomization…
Markerless 3D motion capture for animal locomotion studies
Sellers, William Irvin; Hirasaki, Eishi
2014-01-01
Obtaining quantitative data describing the movements of animals is an essential step in understanding their locomotor biology. Outside the laboratory, measuring animal locomotion often relies on video-based approaches, and analysis is hampered by difficulties in calibration and the often limited availability of possible camera positions. It is also usually restricted to two dimensions, which is often an undesirable over-simplification given the essentially three-dimensional nature of many locomotor performances. In this paper we demonstrate a fully three-dimensional approach based on 3D photogrammetric reconstruction using multiple, synchronised video cameras. This approach allows full calibration based on the separation of the individual cameras and will work fully automatically with completely unmarked and undisturbed animals. As such it has the potential to revolutionise work carried out on free-ranging animals in sanctuaries and zoological gardens, where ad hoc approaches are essential and access within enclosures is often severely restricted. The paper demonstrates the effectiveness of video-based 3D photogrammetry with examples from primates and birds, as well as discussing the current limitations of this technique and illustrating the accuracies that can be obtained. All the software required is open source, so this can be a very cost-effective approach and provides a methodology for obtaining data in situations where other approaches would be completely ineffective. PMID:24972869
Danchin, Antoine; Ouzounis, Christos; Tokuyasu, Taku; Zucker, Jean-Daniel
2018-07-01
Science and engineering rely on the accumulation and dissemination of knowledge to make discoveries and create new designs. Discovery-driven genome research rests on knowledge passed on via gene annotations. In response to the deluge of sequencing big data, standard annotation practice employs automated procedures that rely on majority rules. We argue this hinders progress through the generation and propagation of errors, leading investigators into blind alleys. More subtly, this inductive process discourages the discovery of novelty, which remains essential in biological research and reflects the nature of biology itself. Annotation systems, rather than being repositories of facts, should be tools that support multiple modes of inference. By combining deduction, induction and abduction, investigators can generate hypotheses when accurate knowledge is extracted from model databases. A key stance is to depart from 'the sequence tells the structure tells the function' fallacy, placing function first. We illustrate our approach with examples of critical or unexpected pathways, using MicroScope to demonstrate how tools can be implemented following the principles we advocate. We end with a challenge to the reader. © 2018 The Authors. Microbial Biotechnology published by John Wiley & Sons Ltd and Society for Applied Microbiology.
Quantum speed limits: from Heisenberg’s uncertainty principle to optimal quantum control
NASA Astrophysics Data System (ADS)
Deffner, Sebastian; Campbell, Steve
2017-11-01
One of the most widely known building blocks of modern physics is Heisenberg’s indeterminacy principle. Among the different statements of this fundamental property of the full quantum mechanical nature of physical reality, the uncertainty relation for energy and time has a special place. Its interpretation and its consequences have inspired continued research efforts for almost a century. In its modern formulation, the uncertainty relation is understood as setting a fundamental bound on how fast any quantum system can evolve. In this topical review we describe important milestones, such as the Mandelstam-Tamm and the Margolus-Levitin bounds on the quantum speed limit, and summarise recent applications in a variety of current research fields—including quantum information theory, quantum computing, and quantum thermodynamics amongst several others. To bring order and to provide an access point into the many different notions and concepts, we have grouped the various approaches into the minimal time approach and the geometric approach, where the former relies on quantum control theory, and the latter arises from measuring the distinguishability of quantum states. Due to the volume of the literature, this topical review can only present a snapshot of the current state-of-the-art and can never be fully comprehensive. Therefore, we highlight but a few works hoping that our selection can serve as a representative starting point for the interested reader.
A new approach to simulating collisionless dark matter fluids
NASA Astrophysics Data System (ADS)
Hahn, Oliver; Abel, Tom; Kaehler, Ralf
2013-09-01
Recently, we have shown how current cosmological N-body codes already follow the fine grained phase-space information of the dark matter fluid. Using a tetrahedral tessellation of the three-dimensional manifold that describes perfectly cold fluids in six-dimensional phase space, the phase-space distribution function can be followed throughout the simulation. This allows one to project the distribution function into configuration space to obtain highly accurate densities, velocities and velocity dispersions. Here, we exploit this technique to show first steps on how to devise an improved particle-mesh technique. At its heart, the new method thus relies on a piecewise linear approximation of the phase-space distribution function rather than the usual particle discretization. We use pseudo-particles that approximate the masses of the tetrahedral cells up to quadrupolar order as the locations for cloud-in-cell (CIC) deposit instead of the particle locations themselves as in standard CIC deposit. We demonstrate that this modification already gives much improved stability and more accurate dynamics of the collisionless dark matter fluid at high force and low mass resolution. We demonstrate the validity and advantages of this method with various test problems as well as hot/warm dark matter simulations which have been known to exhibit artificial fragmentation. This completely unphysical behaviour is much reduced in the new approach. The current limitations of our approach are discussed in detail and future improvements are outlined.
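For readers unfamiliar with the baseline scheme being modified, the sketch below shows a standard cloud-in-cell (CIC) deposit of point masses onto a periodic mesh; in the approach described above, the deposited points would be pseudo-particles representing each tetrahedral phase-space cell up to quadrupolar order rather than the simulation particles themselves (function and variable names are illustrative):

```python
import numpy as np

def cic_deposit(positions, masses, ngrid, boxsize):
    """Standard cloud-in-cell deposit of point masses onto a periodic 3-D mesh.

    positions : (N, 3) array of coordinates in [0, boxsize)
    masses    : (N,)   array of particle (or pseudo-particle) masses
    Returns the density field on an ngrid^3 mesh.
    """
    rho = np.zeros((ngrid, ngrid, ngrid))
    cell = boxsize / ngrid
    x = positions / cell                      # positions in grid units
    i0 = np.floor(x).astype(int)              # lower neighbouring cell index
    f = x - i0                                # fractional offset within that cell
    for dx in (0, 1):
        for dy in (0, 1):
            for dz in (0, 1):
                # trilinear weight for this neighbour offset
                w = (np.where(dx, f[:, 0], 1 - f[:, 0])
                     * np.where(dy, f[:, 1], 1 - f[:, 1])
                     * np.where(dz, f[:, 2], 1 - f[:, 2]))
                idx = (i0 + np.array([dx, dy, dz])) % ngrid
                np.add.at(rho, (idx[:, 0], idx[:, 1], idx[:, 2]), w * masses)
    return rho / cell**3                      # convert mass per cell to density
```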
Saucedo-Espinosa, Mario A.; Lapizco-Encinas, Blanca H.
2016-01-01
Current monitoring is a well-established technique for the characterization of electroosmotic (EO) flow in microfluidic devices. This method relies on monitoring the time response of the electric current when a test buffer solution is displaced by an auxiliary solution using EO flow. In this scheme, each solution has a different ionic concentration (and electric conductivity). The difference in the ionic concentration of the two solutions defines the dynamic time response of the electric current and, hence, the current signal to be measured: larger concentration differences result in larger measurable signals. A small concentration difference is needed, however, to avoid dispersion at the interface between the two solutions, which can result in undesired pressure-driven flow that conflicts with the EO flow. Additional challenges arise as the conductivity of the test solution decreases, leading to a reduced electric current signal that may be masked by noise during the measuring process, making accurate estimation of the EO mobility difficult. This contribution presents a new scheme for current monitoring that employs multiple channels arranged in parallel, producing an increase in the signal-to-noise ratio of the electric current to be measured and increasing the estimation accuracy. The use of this parallel approach is particularly useful for estimating the EO mobility in systems where low-conductivity media are required, such as insulator-based dielectrophoresis devices. PMID:27375813
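The mobility estimate itself follows from a simple kinematic relation commonly used in current-monitoring analyses: the plateau time is the time for the auxiliary solution to traverse a channel of length L under the field E = V/L, so the mobility is roughly L squared divided by the product of voltage and plateau time. A minimal sketch (parameter names are illustrative):

```python
def eo_mobility(channel_length_m, applied_voltage_V, displacement_time_s):
    """Estimate electroosmotic mobility from a current-monitoring run.

    v_EO = L / dt and E = V / L, hence mu_EO = L**2 / (V * dt), where dt is the
    time for the current to reach its new plateau, i.e. for the auxiliary
    solution to completely displace the test solution.
    """
    return channel_length_m**2 / (applied_voltage_V * displacement_time_s)

# Example: 1 cm channel, 500 V applied, 25 s to reach the plateau
mu = eo_mobility(0.01, 500.0, 25.0)   # ~8e-9 m^2 V^-1 s^-1
```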
DOE Office of Scientific and Technical Information (OSTI.GOV)
Langrish, T.A.G.; Harvey, A.C.
2000-01-01
A model of a well-mixed fluidized-bed dryer within a process flowsheeting package (SPEEDUP(TM)) has been developed and applied to a parameter sensitivity study, a steady-state controllability analysis and an optimization study. This approach is more general and would be more easily applied to a complex flowsheet than one which relied on stand-alone dryer modeling packages. The simulation has shown that industrial data may be fitted to the model outputs with sensible values of unknown parameters. For this case study, the parameter sensitivity study has found that the heat loss from the dryer and the critical moisture content of the material have the greatest impact on the dryer operation at the current operating point. An optimization study has demonstrated the dominant effect of the heat loss from the dryer on the current operating cost and the current operating conditions, and substantial cost savings (around 50%) could be achieved with a well-insulated and airtight dryer, for the specific case studied here.
Feng, Jie; Yee, Rebecca; Zhang, Shuo; Tian, Lili; Shi, Wanliang; Zhang, Wen-Hong; Zhang, Ying
2018-01-01
Antibiotic-resistant bacteria have caused serious concern and demand innovative approaches for their prompt detection. Current antimicrobial susceptibility tests (AST) rely on the growth of the organisms, which takes 1-2 days for fast-growing organisms and several weeks for slow-growing organisms. Here, we show for the first time the utility of the SYBR Green I/propidium iodide (PI) viability assay for rapidly identifying antibiotic resistance in less than 30 min for major antibiotic-resistant, fast-growing bacteria, such as Staphylococcus aureus, Escherichia coli, Klebsiella pneumoniae, and Acinetobacter baumannii, for bactericidal and bacteriostatic agents, and in 16 h for extremely rapid detection of resistance to isoniazid and pyrazinamide in slow-growing Mycobacterium tuberculosis. The SYBR Green I/PI assay generated rapid and robust results in concordance with traditional AST methods. This novel growth-independent methodology changes the concept of the current growth-based AST and may revolutionize current drug susceptibility testing for all cells of prokaryotic and eukaryotic origin and, subject to further clinical validation, may play a major role in saving lives and improving patient outcomes.
Non-linear heterogeneous FE approach for FRP strengthened masonry arches
NASA Astrophysics Data System (ADS)
Bertolesi, Elisa; Milani, Gabriele; Fedele, Roberto
2015-12-01
A fast and reliable non-linear heterogeneous FE approach specifically conceived for the analysis of FRP-reinforced masonry arches is presented. The proposed approach relies on the reduction of mortar joints to interfaces exhibiting a non-linear holonomic behavior, with a discretization of bricks by means of four-noded elastic elements. The FRP reinforcement is modeled by means of truss elements with elastic-brittle behavior, where the peak tensile strength is estimated by means of a consolidated approach provided by the Italian guidelines CNR-DT200 on masonry strengthening with fiber materials, in which the delamination of the strip from the support is taken into account. The model is validated against some recent experimental results on circular masonry arches reinforced at both the intrados and the extrados. Some sensitivity analyses are conducted varying the peak tensile strength of the trusses representing the FRP reinforcement.
A Bayesian Approach to Determination of F, D, and Z Values Used in Steam Sterilization Validation.
Faya, Paul; Stamey, James D; Seaman, John W
2017-01-01
For manufacturers of sterile drug products, steam sterilization is a common method used to provide assurance of the sterility of manufacturing equipment and products. The validation of sterilization processes is a regulatory requirement and relies upon the estimation of key resistance parameters of microorganisms. Traditional methods have relied upon point estimates for the resistance parameters. In this paper, we propose a Bayesian method for estimation of the well-known DT, z, and F0 values that are used in the development and validation of sterilization processes. A Bayesian approach allows the uncertainty about these values to be modeled using probability distributions, thereby providing a fully risk-based approach to measures of sterility assurance. An example is given using the survivor curve and fraction negative methods for estimation of resistance parameters, and we present a means by which a probabilistic conclusion can be made regarding the ability of a process to achieve a specified sterility criterion. LAY ABSTRACT: For manufacturers of sterile drug products, steam sterilization is a common method used to provide assurance of the sterility of manufacturing equipment and products. The validation of sterilization processes is a regulatory requirement and relies upon the estimation of key resistance parameters of microorganisms. Traditional methods have relied upon point estimates for the resistance parameters. In this paper, we propose a Bayesian method for estimation of the critical process parameters that are evaluated in the development and validation of sterilization processes. A Bayesian approach allows the uncertainty about these parameters to be modeled using probability distributions, thereby providing a fully risk-based approach to measures of sterility assurance. An example is given using the survivor curve and fraction negative methods for estimation of resistance parameters, and we present a means by which a probabilistic conclusion can be made regarding the ability of a process to achieve a specified sterility criterion. © PDA, Inc. 2017.
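For orientation, the deterministic relations underlying these parameters are the standard ones: F0 integrates the lethal rate 10^((T - 121.1 C)/z) over the cycle, and dividing F0 by the reference D-value gives the delivered spore log reduction. The sketch below only evaluates these point-estimate formulas; the Bayesian treatment proposed in the paper would place probability distributions on D and z instead:

```python
import numpy as np

def f0(time_min, temp_C, z=10.0, t_ref=121.1):
    """Physical lethality F0 (min): integral of 10**((T - T_ref)/z) over the cycle."""
    t = np.asarray(time_min, dtype=float)
    rate = 10.0 ** ((np.asarray(temp_C, dtype=float) - t_ref) / z)
    return float(np.sum(0.5 * (rate[1:] + rate[:-1]) * np.diff(t)))   # trapezoid rule

def log_reduction(F, D_ref):
    """Spore log reduction delivered by lethality F, given D at the reference temperature."""
    return F / D_ref

# Example: a crude triangular 30-min profile peaking at 121.1 C
t = np.linspace(0.0, 30.0, 301)
T = 121.1 - 20.0 * np.abs(t - 15.0) / 15.0
F = f0(t, T)
print(F, "equivalent min at 121.1 C ->", log_reduction(F, D_ref=1.5), "log reduction")
```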
Atmospheric-pressure ionization and fragmentation of peptides by solution-cathode glow discharge
DOE Office of Scientific and Technical Information (OSTI.GOV)
Schwartz, Andrew J.; Shelley, Jacob T.; Walton, Courtney L.
Modern “-omics” (e.g., proteomics, glycomics, metabolomics, etc.) analyses rely heavily on electrospray ionization and tandem mass spectrometry to determine the structural identity of target species. Unfortunately, these methods are limited to specialized mass spectrometry instrumentation. Here, a novel approach is described that enables ionization and controlled, tunable fragmentation of peptides at atmospheric pressure. In the new source, a direct-current plasma is sustained between a tapered metal rod and a flowing sample-containing solution. As the liquid stream contacts the electrical discharge, peptides from the solution are volatilized, ionized, and fragmented. At high discharge currents (e.g., 70 mA), electrospray-like spectra are observed, dominated by singly and doubly protonated molecular ions. At lower currents (35 mA), many peptides exhibit extensive fragmentation, with a-, b-, c-, x-, and y-type ion series present as well as complex fragments, such as d-type ions, not previously observed with atmospheric-pressure dissociation. Though the mechanism of fragmentation is currently unclear, observations indicate it could result from the interaction of peptides with gas-phase radicals or ultraviolet radiation generated within the plasma.
Atmospheric-pressure ionization and fragmentation of peptides by solution-cathode glow discharge
Schwartz, Andrew J.; Shelley, Jacob T.; Walton, Courtney L.; ...
2016-06-27
Modern “-omics” (e.g., proteomics, glycomics, metabolomics, etc.) analyses rely heavily on electrospray ionization and tandem mass spectrometry to determine the structural identity of target species. Unfortunately, these methods are limited to specialized mass spectrometry instrumentation. Here, a novel approach is described that enables ionization and controlled, tunable fragmentation of peptides at atmospheric pressure. In the new source, a direct-current plasma is sustained between a tapered metal rod and a flowing sample-containing solution. As the liquid stream contacts the electrical discharge, peptides from the solution are volatilized, ionized, and fragmented. At high discharge currents (e.g., 70 mA), electrospray-like spectra are observed, dominated by singly and doubly protonated molecular ions. At lower currents (35 mA), many peptides exhibit extensive fragmentation, with a-, b-, c-, x-, and y-type ion series present as well as complex fragments, such as d-type ions, not previously observed with atmospheric-pressure dissociation. Though the mechanism of fragmentation is currently unclear, observations indicate it could result from the interaction of peptides with gas-phase radicals or ultraviolet radiation generated within the plasma.
Is it time to reassess current safety standards for glyphosate-based herbicides?
Blumberg, Bruce; Antoniou, Michael N; Benbrook, Charles M; Carroll, Lynn; Colborn, Theo; Everett, Lorne G; Hansen, Michael; Landrigan, Philip J; Lanphear, Bruce P; Mesnage, Robin; vom Saal, Frederick S; Welshons, Wade V; Myers, John Peterson
2017-01-01
Use of glyphosate-based herbicides (GBHs) increased ∼100-fold from 1974 to 2014. Additional increases are expected due to widespread emergence of glyphosate-resistant weeds, increased application of GBHs, and preharvest uses of GBHs as desiccants. Current safety assessments rely heavily on studies conducted over 30 years ago. We have considered information on GBH use, exposures, mechanisms of action, toxicity and epidemiology. Human exposures to glyphosate are rising, and a number of in vitro and in vivo studies challenge the basis for the current safety assessment of glyphosate and GBHs. We conclude that current safety standards for GBHs are outdated and may fail to protect public health or the environment. To improve safety standards, the following are urgently needed: (1) human biomonitoring for glyphosate and its metabolites; (2) prioritisation of glyphosate and GBHs for hazard assessments, including toxicological studies that use state-of-the-art approaches; (3) epidemiological studies, especially of occupationally exposed agricultural workers, pregnant women and their children and (4) evaluations of GBHs in commercially used formulations, recognising that herbicide mixtures likely have effects that are not predicted by studying glyphosate alone. PMID:28320775
Espiritu, Michael J; Cabalteja, Chino C; Sugai, Christopher K; Bingham, Jon-Paul
2014-01-01
Bioactive peptides from Conus venom contain a natural abundance of post-translational modifications that affect their chemical diversity, structural stability, and neuroactive properties. These modifications have continually presented hurdles in their identification and characterization. Early endeavors in their analysis relied on classical biochemical techniques that have led to the progressive development and use of novel proteomic-based approaches. The critical importance of these post-translationally modified amino acids and their specific assignment cannot be understated, having impact on their folding, pharmacological selectivity, and potency. Such modifications at an amino acid level may also provide additional insight into the advancement of conopeptide drugs in the quest for precise pharmacological targeting. To achieve this end, a concerted effort between the classical and novel approaches is needed to completely elucidate the role of post-translational modifications in conopeptide structure and dynamics. This paper provides a reflection in the advancements observed in dealing with numerous and multiple post-translationally modified amino acids within conotoxins and conopeptides and provides a summary of the current techniques used in their identification.
A Plasmonic Mass Spectrometry Approach for Detection of Small Nutrients and Toxins
NASA Astrophysics Data System (ADS)
Wu, Shu; Qian, Linxi; Huang, Lin; Sun, Xuming; Su, Haiyang; Gurav, Deepanjali D.; Jiang, Mawei; Cai, Wei; Qian, Kun
2018-07-01
Nutriology relies on advanced analytical tools to study the molecular compositions of food and provide key information on sample quality/safety. Small nutrients detection is challenging due to the high diversity and broad dynamic range of molecules in food samples, and a further issue is to track low abundance toxins. Herein, we developed a novel plasmonic matrix-assisted laser desorption/ionization mass spectrometry (MALDI MS) approach to detect small nutrients and toxins in complex biological emulsion samples. Silver nanoshells (SiO2@Ag) with optimized structures were used as matrices and achieved direct analysis of 6 nL of human breast milk without any enrichment or separation. We performed identification and quantitation of small nutrients and toxins with limit-of-detection down to 0.4 pmol (for melamine) and reaction time shortened to minutes, which is superior to the conventional biochemical method currently in use. The developed approach contributes to the near-future application of MALDI MS in a broad field and personalized design of plasmonic materials for real-case bio-analysis.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Din, Alif
2016-08-15
The theory of positive-ion collection by a probe immersed in a low-pressure plasma was reviewed and extended by Allen et al. [Proc. Phys. Soc. 70, 297 (1957)]. The numerical computations for cylindrical and spherical probes in a sheath region were presented by F. F. Chen [J. Nucl. Energy C 7, 41 (1965)]. Here, in this paper, the sheath and presheath solutions for a cylindrical probe are matched through a numerical matching procedure to yield a “matched” potential profile or “M solution.” The solution based on the Bohm-criterion approach, the “B solution,” is discussed for this particular problem. The cylindrical probe characteristics obtained from the correct potential profile (M solution) and from the approximate Bohm-criterion approach differ. This raises questions about the correctness of cylindrical probe theories relying only on the Bohm-criterion approach. Also, the comparison between theoretical and experimental ion current characteristics shows that in an argon plasma the ion motion towards the probe is almost radial.
Guidelines for Bacteriophage Product Certification.
Fauconnier, Alan
2018-01-01
Following decades in the wilderness, bacteriophage therapy is now appearing as a credible antimicrobial strategy. However, this reemerging therapy does not rekindle without raising sensitive regulatory concerns. Indeed, whereas the European regulatory framework has basically been implemented to handle ready-to-use pharmaceuticals produced on a large scale, bacteriophage therapy relies on a dynamic approach requiring regulation of personalized medicine, which is nonexistent at present. Because of this, no guidelines are currently available for addressing the scientific and regulatory issues specifically related to phage therapy medicinal products (PTMPs). Pending the implementation of an appropriate regulatory framework and the development of ensuing guidelines, several avenues that might lead to PTMP regulatory compliance are explored here. Insights might come from the multi-strain dossier approach set up for particular animal vaccines, from the homologous group concept developed for allergen products, or from the licensing process for veterinary autogenous vaccines. Depending on national legislation, customized preparations prescribed as magistral formulas or used on a named-patient basis are possible regulatory approaches to be considered. However, these schemes are not optimal and should thus be regarded as transitional.
Biotechnology applied to fish reproduction: tools for conservation.
de Siqueira-Silva, Diógenes Henrique; Saito, Taiju; Dos Santos-Silva, Amanda Pereira; da Silva Costa, Raphael; Psenicka, Martin; Yasui, George Shigueki
2018-04-29
This review discusses the new biotechnological tools that are emerging and show promise for conservation and enhancement of fish production, mainly regarding the endangered and the most economically important species. Two main techniques, in particular, are available to avoid extinction of endangered fish species and to improve the production of commercial species. Germ cell transplantation technology includes a number of approaches that have been studied, such as the transplantation of embryo-to-embryo blastomeres, embryo-to-embryo differentiated PGCs, larva-to-larva and embryo differentiated PGCs, transplantation of spermatogonia from adults to larvae or between adults, and oogonia transplantation. However, the success of germ cell transplantation relies on the prior sterilization of fish, which can be performed at different stages of fish development by means of several protocols that have been tested in order to achieve the best approach to produce a sterile fish. Among them, fish hybridization and triploidization, germline gene knockdown, hyperthermia, and chemical treatment deserve attention based on the important results achieved thus far. This review covers currently used technologies and knowledge about surrogate technology and fish sterilization, discussing the stronger and the weaker points of each approach.
Vazquez-Anderson, Jorge; Mihailovic, Mia K.; Baldridge, Kevin C.; Reyes, Kristofer G.; Haning, Katie; Cho, Seung Hee; Amador, Paul; Powell, Warren B.
2017-01-01
Current approaches to design efficient antisense RNAs (asRNAs) rely primarily on a thermodynamic understanding of RNA–RNA interactions. However, these approaches depend on structure predictions and have limited accuracy, arguably due to overlooking important cellular environment factors. In this work, we develop a biophysical model to describe asRNA–RNA hybridization that incorporates in vivo factors using large-scale experimental hybridization data for three model RNAs: a group I intron, CsrB and a tRNA. A unique element of our model is the estimation of the availability of the target region to interact with a given asRNA using a differential entropic consideration of suboptimal structures. We showcase the utility of this model by evaluating its prediction capabilities in four additional RNAs: a group II intron, Spinach II, 2-MS2 binding domain and glgC 5΄ UTR. Additionally, we demonstrate the applicability of this approach to other bacterial species by predicting sRNA–mRNA binding regions in two newly discovered, though uncharacterized, regulatory RNAs. PMID:28334800
Estimating Causal Effects in Mediation Analysis Using Propensity Scores
ERIC Educational Resources Information Center
Coffman, Donna L.
2011-01-01
Mediation is usually assessed by a regression-based or structural equation modeling (SEM) approach that we refer to as the classical approach. This approach relies on the assumption that there are no confounders that influence both the mediator, "M", and the outcome, "Y". This assumption holds if individuals are randomly…
The Future of Computer-Based Toxicity Prediction: Mechanism-Based Models vs. Information Mining Approaches
When we speak of computer-based toxicity prediction, we are generally referring to a broad array of approaches which rely primarily upon chemical structure ...
ERIC Review Faculty Evaluation: A Response to Competing Values.
ERIC Educational Resources Information Center
Redmon, Kent D.
1999-01-01
Reviews the literature on faculty evaluation to define how different purposes and competing values have produced two approaches. A procedural approach relies upon both self-evaluation and appraisals by peers, administrators, and students. The developmental approach rests upon teaching portfolios, dossiers, and self-evaluations. Contains 23…
The risk of collapse in abandoned mine sites: the issue of data uncertainty
NASA Astrophysics Data System (ADS)
Longoni, Laura; Papini, Monica; Brambilla, Davide; Arosio, Diego; Zanzi, Luigi
2016-04-01
Ground collapses over abandoned underground mines constitute a new environmental risk in the world. The high risk associated with subsurface voids, together with lack of knowledge of the geometric and geomechanical features of mining areas, makes abandoned underground mines one of the current challenges for countries with a long mining history. In this study, a stability analysis of the Montevecchia marl mine is performed in order to validate a general approach that takes into account the poor local information and the variability of the input data. The collapse risk was evaluated through a numerical approach that, starting with some simplifying assumptions, is able to provide an overview of the collapse probability. The final result is an easily accessible, transparent summary graph that shows the collapse probability. This approach may be useful for public administrators called upon to manage this environmental risk. The approach tries to simplify this complex problem in order to achieve a rough risk assessment, but, since it relies on just a small amount of information, any final user should be aware that a comprehensive and detailed risk scenario can be generated only through more exhaustive investigations.
Artificial intelligence in nanotechnology.
Sacha, G M; Varona, P
2013-11-15
During the last decade there has been increasing use of artificial intelligence tools in nanotechnology research. In this paper we review some of these efforts in the context of interpreting scanning probe microscopy, the study of biological nanosystems, the classification of material properties at the nanoscale, theoretical approaches and simulations in nanoscience, and generally in the design of nanodevices. Current trends and future perspectives in the development of nanocomputing hardware that can boost artificial-intelligence-based applications are also discussed. Convergence between artificial intelligence and nanotechnology can shape the path for many technological developments in the field of information sciences that will rely on new computer architectures and data representations, hybrid technologies that use biological entities and nanotechnological devices, bioengineering, neuroscience and a large variety of related disciplines.
Kelly, Benjamin J; Fitch, James R; Hu, Yangqiu; Corsmeier, Donald J; Zhong, Huachun; Wetzel, Amy N; Nordquist, Russell D; Newsom, David L; White, Peter
2015-01-20
While advances in genome sequencing technology make population-scale genomics a possibility, current approaches for analysis of these data rely upon parallelization strategies that have limited scalability, complex implementation and lack reproducibility. Churchill, a balanced regional parallelization strategy, overcomes these challenges, fully automating the multiple steps required to go from raw sequencing reads to variant discovery. Through implementation of novel deterministic parallelization techniques, Churchill allows computationally efficient analysis of a high-depth whole genome sample in less than two hours. The method is highly scalable, enabling full analysis of the 1000 Genomes raw sequence dataset in a week using cloud resources. http://churchill.nchri.org/.
Optimal 2D-SIM reconstruction by two filtering steps with Richardson-Lucy deconvolution.
Perez, Victor; Chang, Bo-Jui; Stelzer, Ernst Hans Karl
2016-11-16
Structured illumination microscopy relies on reconstruction algorithms to yield super-resolution images. Artifacts can arise in the reconstruction and affect the image quality. Current reconstruction methods involve a parametrized apodization function and a Wiener filter. Empirically tuning the parameters in these functions can minimize artifacts, but such an approach is subjective and produces volatile results. We present a robust and objective method that yields optimal results by two straightforward filtering steps with Richardson-Lucy-based deconvolutions. We provide a resource to identify artifacts in 2D-SIM images by analyzing two main reasons for artifacts, out-of-focus background and a fluctuating reconstruction spectrum. We show how the filtering steps improve images of test specimens, microtubules, yeast and mammalian cells.
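For context, the core Richardson-Lucy iteration referred to above is the standard multiplicative update shown below; this is only the textbook kernel, not the authors' full two-step 2D-SIM reconstruction:

```python
import numpy as np
from scipy.signal import fftconvolve

def richardson_lucy(image, psf, n_iter=30, eps=1e-12):
    """Basic Richardson-Lucy deconvolution of a 2-D image with a known PSF.

    Each iteration compares the observed image with the current estimate blurred
    by the PSF and corrects the estimate multiplicatively.
    """
    estimate = np.full_like(image, image.mean(), dtype=float)
    psf_mirror = psf[::-1, ::-1]                            # flipped PSF for the correction step
    for _ in range(n_iter):
        blurred = fftconvolve(estimate, psf, mode="same")
        ratio = image / (blurred + eps)                     # data / model comparison
        estimate *= fftconvolve(ratio, psf_mirror, mode="same")
    return estimate
```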
Optimal 2D-SIM reconstruction by two filtering steps with Richardson-Lucy deconvolution
NASA Astrophysics Data System (ADS)
Perez, Victor; Chang, Bo-Jui; Stelzer, Ernst Hans Karl
2016-11-01
Structured illumination microscopy relies on reconstruction algorithms to yield super-resolution images. Artifacts can arise in the reconstruction and affect the image quality. Current reconstruction methods involve a parametrized apodization function and a Wiener filter. Empirically tuning the parameters in these functions can minimize artifacts, but such an approach is subjective and produces volatile results. We present a robust and objective method that yields optimal results by two straightforward filtering steps with Richardson-Lucy-based deconvolutions. We provide a resource to identify artifacts in 2D-SIM images by analyzing two main reasons for artifacts, out-of-focus background and a fluctuating reconstruction spectrum. We show how the filtering steps improve images of test specimens, microtubules, yeast and mammalian cells.
Beyond cysteine: recent developments in the area of targeted covalent inhibition.
Mukherjee, Herschel; Grimster, Neil P
2018-05-29
Over the past decade targeted covalent inhibitors have undergone a renaissance due to the clinical validation and regulatory approval of several small molecule therapeutics that are designed to irreversibly modify their target protein. Invariably, these compounds rely on the serendipitous placement of a cysteine residue proximal to the small molecule binding site; while this strategy has afforded numerous successes, it necessarily limits the number of proteins that can be targeted by this approach. This drawback has led several research groups to develop novel methodologies that target non-cysteine residues for covalent modification. Herein, we survey the current literature of warheads that covalently modify non-cysteine amino acids in proteins. Copyright © 2018 Elsevier Ltd. All rights reserved.
Artificial intelligence in nanotechnology
NASA Astrophysics Data System (ADS)
Sacha, G. M.; Varona, P.
2013-11-01
During the last decade there has been increasing use of artificial intelligence tools in nanotechnology research. In this paper we review some of these efforts in the context of interpreting scanning probe microscopy, the study of biological nanosystems, the classification of material properties at the nanoscale, theoretical approaches and simulations in nanoscience, and generally in the design of nanodevices. Current trends and future perspectives in the development of nanocomputing hardware that can boost artificial-intelligence-based applications are also discussed. Convergence between artificial intelligence and nanotechnology can shape the path for many technological developments in the field of information sciences that will rely on new computer architectures and data representations, hybrid technologies that use biological entities and nanotechnological devices, bioengineering, neuroscience and a large variety of related disciplines.
NASA Astrophysics Data System (ADS)
Chang, C. L.; Chen, C. Y.; Sung, C. C.; Liou, D. H.
This study presents a novel fuel sensor-less control scheme for a liquid feed fuel cell system that does not rely on a fuel concentration sensor. The proposed approach simplifies the design and reduces the cost and complexity of a liquid feed fuel cell system, and is especially suited to portable power sources, for which volume and weight are important. During the reaction of a fuel cell, the cell's operating characteristics, such as potential, current and power, are measured to control the supply of fuel and regulate its concentration to optimize performance. Experiments were conducted to verify that the fuel sensor-less control algorithm is effective in the liquid feed fuel cell system.
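The abstract does not state the control law, so the following perturb-and-observe loop is purely a hypothetical illustration of how measured stack power alone could steer the fuel supply; the instrument functions are placeholders, not a real device API:

```python
def sensorless_fuel_control(read_cell_power, set_pump_rate, rate=1.0,
                            step=0.05, rate_min=0.2, rate_max=3.0, cycles=100):
    """Hypothetical perturb-and-observe loop: nudge the fuel pump rate and keep the
    perturbation direction only while the measured stack power improves.

    read_cell_power() and set_pump_rate() are placeholders for real instrumentation.
    """
    set_pump_rate(rate)
    last_power = read_cell_power()
    direction = +1
    for _ in range(cycles):
        rate = min(max(rate + direction * step, rate_min), rate_max)
        set_pump_rate(rate)
        power = read_cell_power()
        if power < last_power:        # performance dropped, so reverse the perturbation
            direction = -direction
        last_power = power
    return rate
```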
A Unified Approach for Reporting ARM Measurement Uncertainties Technical Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Campos, E; Sisterson, DL
The Atmospheric Radiation Measurement (ARM) Climate Research Facility is observationally based, and quantifying the uncertainty of its measurements is critically important. With over 300 widely differing instruments providing over 2,500 datastreams, concise expression of measurement uncertainty is quite challenging. The ARM Facility currently provides data and supporting metadata (information about the data or data quality) to its users through a number of sources. Because the continued success of the ARM Facility depends on the known quality of its measurements, the Facility relies on instrument mentors and the ARM Data Quality Office (DQO) to ensure, assess, and report measurement quality. Therefore, an easily accessible, well-articulated estimate of ARM measurement uncertainty is needed.
Neurobehavioral Development of Common Marmoset Monkeys
Schultz-Darken, Nancy; Braun, Katarina M.; Emborg, Marina E.
2016-01-01
Common marmoset (Callithrix jacchus) monkeys are a resource for biomedical research and their use is predicted to increase due to the suitability of this species for transgenic approaches. Identification of abnormal neurodevelopment due to genetic modification relies upon comparison with validated patterns of normal behavior defined by unbiased methods. As scientists unfamiliar with nonhuman primate development are interested in applying genomic editing techniques in marmosets, it would be beneficial to the field for investigators to use validated methods of postnatal evaluation that are age and species appropriate. This review aims to analyze currently available data on marmoset physical and behavioral postnatal development, describe the methods used, and discuss next steps to better understand and evaluate marmoset normal and abnormal postnatal neurodevelopment. PMID:26502294
Body Temperature Measurements for Metabolic Phenotyping in Mice
Meyer, Carola W.; Ootsuka, Youichirou; Romanovsky, Andrej A.
2017-01-01
Endothermic organisms rely on tightly balanced energy budgets to maintain a regulated body temperature and body mass. Metabolic phenotyping of mice, therefore, often includes the recording of body temperature. Thermometry in mice is conducted at various sites, using various devices and measurement practices, ranging from single-time probing to continuous temperature imaging. Whilst there is broad agreement that body temperature data is of value, procedural considerations of body temperature measurements in the context of metabolic phenotyping are missing. Here, we provide an overview of the various methods currently available for gathering body temperature data from mice. We explore the scope and limitations of thermometry in mice, with the hope of assisting researchers in the selection of appropriate approaches, and conditions, for comprehensive mouse phenotypic analyses. PMID:28824441
Regulation of alcohol marketing: a global view.
Casswell, Sally; Maxwell, Anna
2005-09-01
The marketing of alcohol produces a new challenge for policy development internationally, in part because of the increase in the use of new, unmeasured technologies. Many of these new developments are, as yet, relatively invisible in the policy arena. New approaches in branding, the utilization of marketing opportunities via branded events and new products provide additional complexity to attempts to monitor and to restrict the impact of marketing on young people and other vulnerable groups. Current attempts to restrict marketing globally, which rely primarily on voluntary codes and focus on traditional media, are inadequate to these challenges. A new statutory framework is required to enable the monitoring and control of the full marketing mix in ways which match the sophistication of the marketing efforts themselves.
An update on 'dose calibrator' settings for nuclides used in nuclear medicine.
Bergeron, Denis E; Cessna, Jeffrey T
2018-06-01
Most clinical measurements of radioactivity, whether for therapeutic or imaging nuclides, rely on commercial re-entrant ionization chambers ('dose calibrators'). The National Institute of Standards and Technology (NIST) maintains a battery of representative calibrators and works to link calibration settings ('dial settings') to primary radioactivity standards. Here, we provide a summary of NIST-determined dial settings for 22 radionuclides. We collected previously published dial settings and determined some new ones using either the calibration curve method or the dialing-in approach. The dial settings with their uncertainties are collected in a comprehensive table. In general, current manufacturer-provided calibration settings give activities that agree with National Institute of Standards and Technology standards to within a few percent.
Automation of the novel object recognition task for use in adolescent rats
Silvers, Janelle M.; Harrod, Steven B.; Mactutus, Charles F.; Booze, Rosemarie M.
2010-01-01
The novel object recognition task is gaining popularity for its ability to test a complex behavior which relies on the integrity of memory and attention systems without placing undue stress upon the animal. While the task places few requirements upon the animal, it traditionally requires the experimenter to observe the test phase directly and record behavior. This approach can severely limit the number of subjects which can be tested in a reasonable period of time, as training and testing occur on the same day and span several hours. The current study was designed to test the feasibility of automation of this task for adolescent rats using standard activity chambers, with the goals of increased objectivity, flexibility, and throughput of subjects. PMID:17719091
Ji, Yanqing; Ying, Hao; Farber, Margo S.; Yen, John; Dews, Peter; Miller, Richard E.; Massanari, R. Michael
2014-01-01
Discovering unknown adverse drug reactions (ADRs) in postmarketing surveillance as early as possible is of great importance. The current approach to postmarketing surveillance primarily relies on spontaneous reporting. It is a passive surveillance system and is limited by gross underreporting (<10% reporting rate), latency, and inconsistent reporting. We propose a novel team-based intelligent agent software system approach for proactively monitoring and detecting potential ADRs of interest using electronic patient records. We designed such a system and named it ADRMonitor. The intelligent agents, operating on computers located in different places, are capable of continuously and autonomously collaborating with each other and assisting the human users (e.g., the Food and Drug Administration (FDA), drug safety professionals, and physicians). The agents should enhance current systems and accelerate early ADR identification. To evaluate the performance of the ADRMonitor with respect to the current spontaneous reporting approach, we conducted simulation experiments on identification of ADR signal pairs (i.e., potential links between drugs and apparent adverse reactions) under various conditions. The experiments involved over 275 000 simulated patients created on the basis of more than 1000 real patients treated by the drug cisapride that was on the market for seven years until its withdrawal by the FDA in 2000 due to serious ADRs. Healthcare professionals utilizing the spontaneous reporting approach and the ADRMonitor were separately simulated by decision-making models derived from a general cognitive decision model called the fuzzy recognition-primed decision (RPD) model that we recently developed. The quantitative simulation results show that 1) the number of true ADR signal pairs detected by the ADRMonitor is 6.6 times higher than that by the spontaneous reporting strategy; 2) the ADR detection rate of the ADRMonitor agents with even moderate decision-making skills is five times higher than that of spontaneous reporting; and 3) as the number of patient cases increases, ADRs could be detected significantly earlier by the ADRMonitor. PMID:20007038
Exploratory Causal Analysis in Bivariate Time Series Data
NASA Astrophysics Data System (ADS)
McCracken, James M.
Many scientific disciplines rely on observational data of systems for which it is difficult (or impossible) to implement controlled experiments, and data analysis techniques are required for identifying causal information and relationships directly from observational data. This need has led to the development of many different time series causality approaches and tools including transfer entropy, convergent cross-mapping (CCM), and Granger causality statistics. In this thesis, the existing time series causality method of CCM is extended by introducing a new method called pairwise asymmetric inference (PAI). It is found that CCM may provide counter-intuitive causal inferences for simple dynamics with strong intuitive notions of causality, and the CCM causal inference can be a function of physical parameters that are seemingly unrelated to the existence of a driving relationship in the system. For example, a CCM causal inference might alternate between ''voltage drives current'' and ''current drives voltage'' as the frequency of the voltage signal is changed in a series circuit with a single resistor and inductor. PAI is introduced to address both of these limitations. Many of the current approaches in the time series causality literature are not computationally straightforward to apply, do not follow directly from assumptions of probabilistic causality, depend on assumed models for the time series generating process, or rely on embedding procedures. A new approach, called causal leaning, is introduced in this work to avoid these issues. The leaning is found to provide causal inferences that agree with intuition for both simple systems and more complicated empirical examples, including space weather data sets. The leaning may provide a clearer interpretation of the results than those from existing time series causality tools. A practicing analyst can explore the literature to find many proposals for identifying drivers and causal connections in time series data sets, but little research exists on how these tools compare to each other in practice. This work introduces and defines exploratory causal analysis (ECA) to address this issue along with the concept of data causality in the taxonomy of causal studies introduced in this work. The motivation is to provide a framework for exploring potential causal structures in time series data sets. ECA is used on several synthetic and empirical data sets, and it is found that all of the tested time series causality tools agree with each other (and intuitive notions of causality) for many simple systems but can provide conflicting causal inferences for more complicated systems. It is proposed that such disagreements between different time series causality tools during ECA might provide deeper insight into the data than could be found otherwise.
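As a concrete point of reference for one of the standard tools named above, a lag-1 Granger-style comparison can be written in a few lines; this is only an illustration and does not reproduce the PAI or leaning measures developed in the thesis:

```python
import numpy as np

def granger_improvement(x, y, lag=1):
    """Crude lag-1 Granger-style check: does adding past x reduce the residual
    variance of a linear prediction of y?"""
    y_t, y_lag, x_lag = y[lag:], y[:-lag], x[:-lag]
    # restricted model: y_t ~ y_{t-1}
    A1 = np.column_stack([y_lag, np.ones_like(y_lag)])
    r1 = y_t - A1 @ np.linalg.lstsq(A1, y_t, rcond=None)[0]
    # full model: y_t ~ y_{t-1} + x_{t-1}
    A2 = np.column_stack([y_lag, x_lag, np.ones_like(y_lag)])
    r2 = y_t - A2 @ np.linalg.lstsq(A2, y_t, rcond=None)[0]
    return 1.0 - np.var(r2) / np.var(r1)    # > 0 suggests past x helps predict y

# Example: x drives y with a one-step delay, so the first score should exceed the second
rng = np.random.default_rng(0)
x = rng.standard_normal(500)
y = 0.8 * np.roll(x, 1) + 0.2 * rng.standard_normal(500)
print(granger_improvement(x, y), granger_improvement(y, x))
```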
Knowledge-driven lead discovery.
Pirard, Bernard
2005-11-01
Virtual screening encompasses several computational approaches which have proven valuable for identifying novel leads. These approaches rely on available information. Herein, we review recent successful applications of virtual screening. The extension of virtual screening methodologies to target families is also briefly discussed.
Air pollution exposure prediction approaches used in air pollution epidemiology studies
Epidemiological studies of the health effects of air pollution have traditionally relied upon surrogates of personal exposures, most commonly ambient concentration measurements from central-site monitors. However, this approach may introduce exposure prediction errors and miscla...
Analog track angle error displays improve simulated GPS approach performance
DOT National Transportation Integrated Search
1996-01-01
Pilots flying non-precision instrument approaches traditionally rely on a course deviation indicator (CDI) analog display of cross track error (XTE) information. The new generation of GPS based area navigation (RNAV) receivers can also compute accura...
ASTER preflight and inflight calibration and the validation of level 2 products
Thome, K.; Aral, K.; Hook, S.; Kieffer, H.; Lang, H.; Matsunaga, T.; Ono, A.; Palluconi, F. D.; Sakuma, H.; Slater, P.; Takashima, T.; Tonooka, H.; Tsuchida, S.; Welch, R.M.; Zalewski, E.
1998-01-01
This paper describes the preflight and inflight calibration approaches used for the Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER). The system is a multispectral, high-spatial resolution sensor on the Earth Observing System's (EOS) AM-1 platform. Preflight calibration of ASTER uses well-characterized sources to provide calibration and preflight round-robin exercises to understand biases between the calibration sources of ASTER and other EOS sensors. These round-robins rely on well-characterized, ultra-stable radiometers. An experiment held in Yokohama, Japan, showed that the output from the source used for the visible and near-infrared (VNIR) subsystem of ASTER may be underestimated by 1.5%, but this is still within the 4% specification for the absolute, radiometric calibration of these bands. Inflight calibration will rely on vicarious techniques and onboard blackbodies and lamps. Vicarious techniques include ground-reference methods using desert and water sites. A recent joint field campaign gives confidence that these methods currently provide absolute calibration to better than 5%, and indications are that uncertainties less than the required 4% should be achievable at launch. The EOS AM-1 platform will also provide a spacecraft maneuver that will allow ASTER to see the moon, allowing further characterization of the sensor. A method for combining the results of these independent calibration results is presented. The paper also describes the plans for validating the Level 2 data products from ASTER. These plans rely heavily upon field campaigns using methods similar to those used for the ground-reference, vicarious calibration methods. © 1998 IEEE.
Integrated Scenario Modeling of NSTX Advanced Plasma Configurations
NASA Astrophysics Data System (ADS)
Kessel, Charles; Synakowski, Edward
2003-10-01
The Spherical Torus will provide an attractive fusion energy source if it can demonstrate the following major features: high elongation and triangularity, 100% non-inductive current with a credible path to high bootstrap fractions, non-solenoidal startup and current rampup, high beta with stabilization of RWM instabilities, and sufficiently high energy confinement. NSTX has specific experimental milestones to examine these features, and integrated scenario modeling is helping to understand how these configurations might be produced and what tools are needed to access this operating space. Simulations with the Tokamak Simulation Code (TSC), CURRAY, and JSOLVER/BALMSC/PEST2 have identified fully non-inductively sustained, high beta plasmas that rely on strong plasma shaping accomplished with a PF coil modification, off-axis current drive from Electron Bernstein Waves (EBW), flexible on-axis heating and CD from High Harmonic Fast Wave (HHFW) and Neutral Beam Injection (NBI), and density control. Ideal MHD stability shows that with wall stabilization through plasma rotation and/or RWM feedback coils, a beta of 40% is achievable, with 100% non-inductive current sustained for 4 current diffusion times. Experimental data and theory are combined to produce a best extrapolation to these regimes, which is continuously improved as the discharges approach these parameters, and theoretical/computational methods expand. Further investigations and development for integrated scenario modeling on NSTX are discussed.
Robust Decision Making Approach to Managing Water Resource Risks (Invited)
NASA Astrophysics Data System (ADS)
Lempert, R.
2010-12-01
The IPCC and US National Academies of Science have recommended iterative risk management as the best approach for water management and many other types of climate-related decisions. Such an approach does not rely on a single set of judgments at any one time but rather actively updates and refines strategies as new information emerges. In addition, the approach emphasizes that a portfolio of different types of responses, rather than any single action, often provides the best means to manage uncertainty. Implementing an iterative risk management approach can however prove difficult in actual decision support applications. This talk will suggest that robust decision making (RDM) provides a particularly useful set of quantitative methods for implementing iterative risk management. This RDM approach is currently being used in a wide variety of water management applications. RDM employs three key concepts that differentiate it from most types of probabilistic risk analysis: 1) characterizing uncertainty with multiple views of the future (which can include sets of probability distributions) rather than a single probabilistic best-estimate, 2) employing a robustness rather than an optimality criterion to assess alternative policies, and 3) organizing the analysis with a vulnerability and response option framework, rather than a predict-then-act framework. This talk will summarize the RDM approach, describe its use in several different types of water management applications, and compare the results to those obtained with other methods.
ERIC Educational Resources Information Center
Gage, Nicholas A.; Lewis, Timothy J.; Stichter, Janine P.
2012-01-01
Of the myriad practices currently utilized for students with disabilities, particularly students with or at risk for emotional and/or behavioral disorder (EBD), functional behavior assessment (FBA) is a practice with an emerging solid research base. However, the FBA research base relies on single-subject design (SSD) and synthesis has relied on…
A monoclonal antibody-based ELISA for differential diagnosis of 2009 pandemic H1N1
USDA-ARS?s Scientific Manuscript database
The swine-origin 2009 pandemic H1N1 virus (pdmH1N1) is genetically related to North American swine H1 influenza viruses and unrelated to human seasonal H1 viruses. Currently, specific diagnosis of pdmH1N1 relies on RT-PCR. In order to develop an assay that does not rely on amplification of the viral...
Near Critical/Supercritical Carbon Dioxide Extraction for Treating Contaminated Bilgewater
2000-02-24
Historically, the Navy has relied on gravimetric separation to remove oily contaminants from bilgewater. Most ships contain one... continuously changes the orientation of the separator with respect to gravity, lowering the effectiveness of a separation process that relies on subtle...
HEADROOM APPROACH TO DEVICE DEVELOPMENT: CURRENT AND FUTURE DIRECTIONS.
Girling, Alan; Lilford, Richard; Cole, Amanda; Young, Terry
2015-01-01
The headroom approach to medical device development relies on the estimation of a value-based price ceiling at different stages of the development cycle. Such price ceilings delineate the commercial opportunities for new products in many healthcare systems. We apply a simple model to obtain critical business information as the product proceeds along a development pathway, and indicate some future directions for the development of the approach. The approach embeds health economic modelling in the supply-side development cycle for new products. The headroom can be used: initially as a 'reality check' on the viability of the device in the healthcare market; to support product development decisions using a real options approach; and to contribute to a pricing policy which respects uncertainties in the reimbursement outlook. The headroom provides a unifying thread for business decisions along the development cycle for a new product. Over the course of the cycle attitudes to uncertainty will evolve, based on the timing and manner in which new information accrues. Within this framework the developmental value of new information can justify the costs of clinical trials and other evidence-gathering activities. Headroom can function as a simple shared tool to parties in commercial negotiations around individual products or groups of products. The development of similar approaches in other contexts holds promise for more rational planning of service provision.
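For orientation, the headroom idea can be stated compactly. Under a cost-effectiveness threshold \(\lambda\) (the payer's willingness to pay per quality-adjusted life year), a hedged, back-of-envelope form of the value-based price ceiling is

\[
\text{Headroom} \;\approx\; \lambda \,\Delta\mathrm{QALY} \;+\; \Delta C_{\text{savings}},
\]

where \(\Delta\mathrm{QALY}\) is the plausible incremental health gain of the device over current care and \(\Delta C_{\text{savings}}\) the expected downstream cost offsets. The additive form is a textbook simplification rather than the paper's own model, which tracks how these estimates, and attitudes to their uncertainty, evolve along the development cycle.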
Neural mechanisms of cue-approach training
Bakkour, Akram; Lewis-Peacock, Jarrod A.; Poldrack, Russell A.; Schonberg, Tom
2016-01-01
Biasing choices may prove a useful way to implement behavior change. Previous work has shown that a simple training task (the cue-approach task), which does not rely on external reinforcement, can robustly influence choice behavior by biasing choice toward items that were targeted during training. In the current study, we replicate previous behavioral findings and explore the neural mechanisms underlying the shift in preferences following cue-approach training. Given recent successes in the development and application of machine learning techniques to task-based fMRI data, which have advanced understanding of the neural substrates of cognition, we sought to leverage the power of these techniques to better understand neural changes during cue-approach training that subsequently led to a shift in choice behavior. Contrary to our expectations, we found that machine learning techniques applied to fMRI data during non-reinforced training were unsuccessful in elucidating the neural mechanism underlying the behavioral effect. However, univariate analyses during training revealed that the relationship between BOLD and choices for Go items increases as training progresses compared to choices of NoGo items primarily in lateral prefrontal cortical areas. This new imaging finding suggests that preferences are shifted via differential engagement of task control networks that interact with value networks during cue-approach training. PMID:27677231
DKIST Adaptive Optics System: Simulation Results
NASA Astrophysics Data System (ADS)
Marino, Jose; Schmidt, Dirk
2016-05-01
The 4 m class Daniel K. Inouye Solar Telescope (DKIST), currently under construction, will be equipped with an ultra high order solar adaptive optics (AO) system. The requirements and capabilities of such a solar AO system are beyond those of any other solar AO system currently in operation. We must rely on solar AO simulations to estimate and quantify its performance. We present performance estimation results of the DKIST AO system obtained with a new solar AO simulation tool. This simulation tool is a flexible and fast end-to-end solar AO simulator which produces accurate solar AO simulations while taking advantage of current multi-core computer technology. It relies on full imaging simulations of the extended field Shack-Hartmann wavefront sensor (WFS), which directly includes important secondary effects such as field dependent distortions and varying contrast of the WFS sub-aperture images.
Simple Heuristic Approach to Introduction of the Black-Scholes Model
ERIC Educational Resources Information Center
Yalamova, Rossitsa
2010-01-01
A heuristic approach to explaining the Black-Scholes option pricing model in undergraduate classes is described. The approach draws upon the method of protocol analysis to encourage students to "think aloud" so that their mental models can be surfaced. It also relies upon extensive visualizations to communicate relationships that are…
Co-Teaching in Student Teaching of an Elementary Education Program
ERIC Educational Resources Information Center
Chang, Sau Hou
2018-01-01
Successful co-teaching relies on essential elements and different approaches. However, few studies have examined these essential elements and approaches in student teaching. The objective of this study was to examine how teacher candidates and cooperating teachers used the essential co-teaching elements and co-teaching approaches.…
Healthcare financing: approaches and trends in India.
Bajpai, Vikas; Saraya, Anoop
2010-01-01
Despite the importance of healthcare for the well-being of society, there is little public debate in India on issues relating to it. The 'human capital approach' to finance healthcare largely relies on private investment in health, while the 'human development approach' envisages the State as the guarantor of preventive as well as curative care to achieve universalization of healthcare. The prevailing health indices of India and challenges in the field of public health require a human development approach to healthcare. On the eve of independence, India adopted the human development approach, with the report of the Bhore Committee emphasizing the role of the State in the development and provision of healthcare. However, more recently, successive governments have moved towards the human capital approach. Instead of increasing state spending on health and expanding the public health infrastructure, the government has been relying more and more on the private sector. The public-private partnership has been touted as the new-age panacea for the ills of the Indian healthcare system. This approach has led to a stagnation of public health indices and a decrease in the access of the poor to healthcare.
Application of nanotechnology in treatment of leishmaniasis: A Review.
Akbari, Maryam; Oryan, Ahmad; Hatam, Gholamreza
2017-08-01
Leishmaniasis is a neglected tropical disease caused by a protozoan species of the genus Leishmania affecting mostly the developing countries. The disease, with a current mortality rate of 50,000 deaths per year, threatens approximately 350 million people in more than 90 countries all over the world. Cutaneous, mucocutaneous and visceral leishmaniasis are the most frequent forms of the disease. Chemotherapy still relies on the use of pentavalent antimonials, amphotericin B, liposomal amphotericin B and miltefosine. Treatment of leishmaniasis has remained insufficient since the current antileishmanial agents have several limitations including low efficacy, toxicity, adverse side effects, drug-resistance, length of treatment and costliness. Consequently, there is an immediate requirement to search for new antileishmanial compounds. New drug delivery devices transport antileishmanial drugs specifically to the target cells while minimizing toxic effects on normal cells. This study attempts to present a comprehensive overview of different approaches of nanotechnology in treatment of leishmaniasis. Copyright © 2017 Elsevier B.V. All rights reserved.
The host immune response to gastrointestinal nematode infection in sheep.
McRae, K M; Stear, M J; Good, B; Keane, O M
2015-12-01
Gastrointestinal nematode infection represents a major threat to the health, welfare and productivity of sheep populations worldwide. Infected lambs have a reduced ability to absorb nutrients from the gastrointestinal tract, resulting in morbidity and occasional mortality. The current chemo-dominant approach to nematode control is considered unsustainable due to the increasing incidence of anthelmintic resistance. In addition, there is growing consumer demand for food products from animals not subjected to chemical treatment. Future mechanisms of nematode control must rely on alternative, sustainable strategies such as vaccination or selective breeding of resistant animals. Such strategies take advantage of the host's natural immune response to nematodes. The ability to resist gastrointestinal nematode infection is considered to be dependent on the development of a protective acquired immune response, although the precise immune mechanisms involved in initiating this process remain to be fully elucidated. In this study, current knowledge on the innate and acquired host immune response to gastrointestinal nematode infection in sheep and the development of immunity is reviewed. © 2015 John Wiley & Sons Ltd.
Current management of sarcoidosis I: pulmonary, cardiac, and neurologic manifestations.
West, Sterling G
2018-05-01
Sarcoidosis is a systemic disease characterized by noncaseating granulomatous inflammation of multiple organ systems. Pulmonary, cardiac, and neurologic involvements have the worst prognosis. Current recommendations for the therapeutic management and follow-up of sarcoidosis involving these critical organs will be reviewed. In those sarcoidosis patients requiring immunosuppressive therapy, corticosteroids are used first at varying doses depending on the presenting manifestation. Patients with symptomatic pulmonary, cardiac, or neurologic involvement will be maintained on corticosteroids for at least a year. Many require a second immunosuppressive agent with methotrexate used most commonly. Anti-tumor necrosis factor agents, especially infliximab, are effective and recommendations for their use have been proposed. Evidence-based treatment guidelines do not exist for most sarcoidosis clinical manifestations. Therefore, clinical care of these patients must rely on expert opinion. Patients are best served by a multidisciplinary approach to their care. Future research to identify environmental triggers, genetic associations, biomarkers for treatment response, and where to position new steroid-sparing immunosuppressive agents is warranted.
Harnessing Aptamers to Overcome Challenges in Gluten Detection
Miranda-Castro, Rebeca; de-los-Santos-Álvarez, Noemí; Miranda-Ordieres, Arturo J.; Lobo-Castañón, María Jesús
2016-01-01
Celiac disease is a lifelong autoimmune disorder triggered by foods containing gluten, the storage protein in wheat, rye, and barley. The rapidly escalating number of patients diagnosed with this disease poses a great challenge to both food industry and authorities to guarantee food safety for all. Therefore, intensive efforts are being made to establish minimal disease-eliciting doses of gluten and consequently to improve gluten-free labeling. These efforts depend to a high degree on the availability of methods capable of detecting the protein in food samples at levels as low as possible. Current analytical approaches rely on the use of antibodies as selective recognition elements. With limited sensitivity, these methods exhibit some deficiencies that compromise the accuracy of the obtained results. Aptamers provide an ideal alternative for designing biosensors for fast and selective measurement of gluten in foods. This article highlights the challenges in gluten detection, the current status of the use of aptamers for solving this problem, and what remains to be done to move these systems into commercial applications. PMID:27104578
Alternative to Chemotherapy—The Unmet Demand against Leishmaniasis
Didwania, Nicky; Shadab, Md.; Sabur, Abdus; Ali, Nahid
2017-01-01
Leishmaniasis is a neglected protozoan disease that mainly affects the tropical as well as subtropical countries of the world. The primary option to control the disease still relies on chemotherapy. However, a hindrance to treatments owing to the emergence of drug-resistant parasites, enormous side effects of the drugs, their high cost, and requirement of long course hospitalization has added to the existing problems of the leishmaniasis containment program. This review highlights the prospects of immunotherapy and/or immunochemotherapy to address the limitations of current treatment measures for leishmaniasis. In addition to the progress in alternate therapeutic strategies, the possibility of and advances in developing preventive measures against the disease are pointed out. The review highlights our recent understanding of the protective immunology that can be exploited to develop an effective vaccine against leishmaniasis. Moreover, an update is provided on the approaches that have evolved over recent years, predominantly focused on overcoming the current challenges in developing immunotherapeutic as well as prophylactic antileishmanial vaccines. PMID:29312309
Handwriting Examination: Moving from Art to Science
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jarman, K.H.; Hanlen, R.C.; Manzolillo, P.A.
In this document, we present a method for validating the premises and methodology of forensic handwriting examination. This method is intuitively appealing because it relies on quantitative measurements currently used qualitatively by FDEs in making comparisons, and it is scientifically rigorous because it exploits the power of multivariate statistical analysis. This approach uses measures of both central tendency and variation to construct a profile for a given individual. (Central tendency and variation are important for characterizing an individual's writing, and both are currently used by FDEs in comparative analyses). Once constructed, different profiles are then compared for individuality using cluster analysis; they are grouped so that profiles within a group cannot be differentiated from one another based on the measured characteristics, whereas profiles between groups can. The cluster analysis procedure used here exploits the power of multivariate hypothesis testing. The result is not only a profile grouping but also an indication of statistical significance of the groups generated.
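The profile-then-cluster idea can be illustrated with a short sketch. The handwriting measurements, the Ward-linkage clustering, and the distance cutoff below are assumptions standing in for the paper's unspecified characteristics and its multivariate hypothesis-testing procedure; the sketch only shows the general shape of building per-writer profiles from central tendency and variation and then grouping indistinguishable profiles.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

# Hypothetical input: rows are writing samples, columns are quantitative
# handwriting measurements (slant, letter height, spacing, ...). Each writer
# contributes several samples; writer labels are used only for grouping.
rng = np.random.default_rng(1)
writers = np.repeat(np.arange(10), 5)            # 10 writers, 5 samples each
samples = rng.normal(loc=writers[:, None], scale=0.3, size=(50, 4))

# One profile per writer: central tendency and variation of each measurement
profiles = np.array([
    np.concatenate([samples[writers == w].mean(axis=0),
                    samples[writers == w].std(axis=0)])
    for w in np.unique(writers)
])

# Group profiles; writers in the same cluster cannot be distinguished on these
# measurements at the chosen cutoff (an assumed stand-in for the paper's
# multivariate hypothesis-testing criterion).
Z = linkage(profiles, method="ward")
groups = fcluster(Z, t=2.0, criterion="distance")
print(groups)
```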
Faulon, Jean-Loup; Misra, Milind; Martin, Shawn; ...
2007-11-23
Motivation: Identifying protein enzymatic or pharmacological activities is an important area of research in biology and chemistry. Biological and chemical databases are increasingly being populated with linkages between protein sequences and chemical structures. Additionally, there is now sufficient information to apply machine-learning techniques to predict interactions between chemicals and proteins at a genome scale. Current machine-learning techniques use as input either protein sequences and structures or chemical information. We propose here a method to infer protein–chemical interactions using heterogeneous input consisting of both protein sequence and chemical information. Results: Our method relies on expressing proteins and chemicals with a common cheminformatics representation. We demonstrate our approach by predicting whether proteins can catalyze reactions not present in training sets. We also predict whether a given drug can bind a target, in the absence of prior binding information for that drug and target. Lastly, such predictions cannot be made with current machine-learning techniques requiring binding information for individual reactions or individual targets.
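A generic version of the underlying idea, training one classifier on combined protein and chemical descriptors so that it can score pairs never seen together, can be sketched as follows. The random feature vectors, the hidden interaction rule, and the SVM choice are illustrative assumptions; the paper's actual method uses a shared signature (cheminformatics) representation rather than arbitrary descriptors.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split

# Hypothetical precomputed descriptors for each entity (fixed-length vectors).
rng = np.random.default_rng(2)
prot_feats = rng.standard_normal((40, 16))   # 40 proteins
chem_feats = rng.standard_normal((60, 16))   # 60 chemicals

# Training pairs: concatenate the two descriptors; the "interacts" label here
# follows a hidden toy rule so the classifier has something to learn.
pairs = [(int(rng.integers(40)), int(rng.integers(60))) for _ in range(500)]
X = np.array([np.concatenate([prot_feats[p], chem_feats[c]]) for p, c in pairs])
y = np.array([int(prot_feats[p] @ chem_feats[c] > 0) for p, c in pairs])

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
clf = SVC(kernel="rbf").fit(X_tr, y_tr)
print("held-out accuracy:", clf.score(X_te, y_te))
```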
Basu, Kisalaya; Pak, Maxwell
2016-01-01
Recently, the emphasis on health human resources (HHR) planning has shifted away from a utilization-based approach toward a needs-based one in which planning is based on the projected health needs of the population. However, needs-based models that are currently in use rely on a definition of 'needs' that include only the medical circumstances of individuals and not personal preferences or other socio-economic factors. We examine whether planning based on such a narrow definition will maximize social welfare. We show that, in a publicly funded healthcare system, if the planner seeks to meet the aggregate need without taking utilization into consideration, then oversupply of HHR is likely because 'needs' do not necessarily translate into 'usage.' Our result suggests that HHR planning should track the healthcare system as access gradually improves because, even if health care is fully accessible, individuals may not fully utilize it to the degree prescribed by their medical circumstances. Copyright © 2014 John Wiley & Sons, Ltd.
Identifying problem and compulsive gamblers.
van Es, R.
2000-01-01
OBJECTIVE: To present a meta-analysis of current research on the prevalence, identification, and treatment of problem and compulsive gamblers. QUALITY OF EVIDENCE: Problem and compulsive gambling was not a socio-scientific concern until the last two decades. Hence research on this topic is limited. The summary and analysis for this paper relied on computer searches of journal and news abstracts in addition to direct contact with organizations addressing the identification and treatment of compulsive gamblers. MAIN MESSAGE: An estimated 5% of those who gamble run into problems. About 1% of those who gamble are predicted to experience serious problems. Successful treatment of problem and compulsive gambling continues to be a challenge. Although cognitive therapy has been the favoured approach, a combination of several therapeutic approaches is advocated. CONCLUSIONS: Problem and compulsive gambling can present a real health threat. As with other addictions, treatment strategies continue to be a baffling social problem. Aware and informed physicians can have a pivotal role in the difficult process of identifying, acknowledging, and remediating problem and compulsive gambling. PMID:10907572
Gazestani, Vahid H; Salavati, Reza
2015-01-01
Trypanosoma brucei is a vector-borne parasite with an intricate life cycle that can cause serious diseases in humans and animals. This pathogen relies on fine regulation of gene expression to respond and adapt to variable environments, with implications in transmission and infectivity. However, the involved regulatory elements and their mechanisms of action are largely unknown. Here, benefiting from a new graph-based approach for finding functional regulatory elements in RNA (GRAFFER), we have predicted 88 new RNA regulatory elements that are potentially involved in the gene regulatory network of T. brucei. We show that many of these newly predicted elements are responsive to both transcriptomic and proteomic changes during the life cycle of the parasite. Moreover, we found that 11 of the predicted elements strikingly resemble previously identified regulatory elements for the parasite. Additionally, comparison with previously predicted motifs on T. brucei suggested the superior performance of our approach based on the current limited knowledge of regulatory elements in T. brucei.
NASA Astrophysics Data System (ADS)
Huang, Yanhui; Zhao, He; Wang, Yixing; Ratcliff, Tyree; Breneman, Curt; Brinson, L. Catherine; Chen, Wei; Schadler, Linda S.
2017-08-01
It has been found that doping dielectric polymers with a small amount of nanofiller or molecular additive can stabilize the material under a high field and lead to increased breakdown strength and lifetime. Choosing appropriate fillers is critical to optimizing the material performance, but current research largely relies on experimental trial and error. The employment of computer simulations for nanodielectric design is rarely reported. In this work, we propose a multi-scale modeling approach that employs ab initio, Monte Carlo, and continuum scales to predict the breakdown strength and lifetime of polymer nanocomposites based on the charge trapping effect of the nanofillers. The charge transfer, charge energy relaxation, and space charge effects are modeled in respective hierarchical scales by distinctive simulation techniques, and these models are connected together for high fidelity and robustness. The preliminary results show good agreement with the experimental data, suggesting its promise for use in the computer aided material design of high performance dielectrics.
Sussman, Steven
2013-01-01
In this review, I examine the definition, etiology, measurement, prevention and treatment of workaholism, based on a systematic search of the literature. While there is some debate regarding the parameters of the concept, viewed as a negative consequential addiction, workaholism involves excessive time spent working, preoccupation with work to the exclusion of other life domains, loss of control over the parameters of one’s work and disenchantment with work, and negative social, emotional, and health consequences. The etiology of workaholism is not clear but may pertain to persons with compulsive personality traits, who are driven to work harder than that demanded from work contexts, and who have learned to place work as a main means of gratification compared to other lifestyle alternatives. Most measurement approaches rely on self-report questionnaires, tested primarily with convenience samples. Refinement of current assessments is ongoing. Prevention and treatment implications are discussed, which include intra- and extra-personal level approaches. Finally, limitations of the work completed in this arena are mentioned and needed future research directions are suggested. PMID:24273685
Kautsky, Ulrik; Lindborg, Tobias; Valentin, Jack
2013-05-01
This is an overview of the strategy used to describe the effects of a potential release from a radioactive waste repository on human exposure and future environments. It introduces a special issue of AMBIO, in which 13 articles show ways of understanding and characterizing the future. The study relies mainly on research performed in the context of a recent safety report concerning a repository for spent nuclear fuel in Sweden (the so-called SR-Site project). The development of a good understanding of on-site processes and acquisition of site-specific data facilitated the development of new approaches for assessment of surface ecosystems. A systematic and scientifically coherent methodology utilizes the understanding of the current spatial and temporal dynamics as an analog for future conditions. We conclude that future ecosystems can be inferred from a few variables and that this multidisciplinary approach is relevant in a much wider context than radioactive waste.
SONAR Discovers RNA-Binding Proteins from Analysis of Large-Scale Protein-Protein Interactomes.
Brannan, Kristopher W; Jin, Wenhao; Huelga, Stephanie C; Banks, Charles A S; Gilmore, Joshua M; Florens, Laurence; Washburn, Michael P; Van Nostrand, Eric L; Pratt, Gabriel A; Schwinn, Marie K; Daniels, Danette L; Yeo, Gene W
2016-10-20
RNA metabolism is controlled by an expanding, yet incomplete, catalog of RNA-binding proteins (RBPs), many of which lack characterized RNA binding domains. Approaches to expand the RBP repertoire to discover non-canonical RBPs are currently needed. Here, HaloTag fusion pull down of 12 nuclear and cytoplasmic RBPs followed by quantitative mass spectrometry (MS) demonstrates that proteins interacting with multiple RBPs in an RNA-dependent manner are enriched for RBPs. This motivated SONAR, a computational approach that predicts RNA binding activity by analyzing large-scale affinity precipitation-MS protein-protein interactomes. Without relying on sequence or structure information, SONAR identifies 1,923 human, 489 fly, and 745 yeast RBPs, including over 100 human candidate RBPs that contain zinc finger domains. Enhanced CLIP confirms RNA binding activity and identifies transcriptome-wide RNA binding sites for SONAR-predicted RBPs, revealing unexpected RNA binding activity for disease-relevant proteins and DNA binding proteins. Copyright © 2016 Elsevier Inc. All rights reserved.
Capturing the genetic makeup of the active microbiome in situ.
Singer, Esther; Wagner, Michael; Woyke, Tanja
2017-09-01
More than any other technology, nucleic acid sequencing has enabled microbial ecology studies to be complemented with the data volumes necessary to capture the extent of microbial diversity and dynamics in a wide range of environments. In order to truly understand and predict environmental processes, however, the distinction between active, inactive and dead microbial cells is critical. Also, experimental designs need to be sensitive toward varying population complexity and activity, and temporal as well as spatial scales of process rates. There are a number of approaches, including single-cell techniques, which were designed to study in situ microbial activity and that have been successively coupled to nucleic acid sequencing. The exciting new discoveries regarding in situ microbial activity provide evidence that future microbial ecology studies will indispensably rely on techniques that specifically capture members of the microbiome active in the environment. Herein, we review those currently used activity-based approaches that can be directly linked to shotgun nucleic acid sequencing, evaluate their relevance to ecology studies, and discuss future directions.
Towards the estimation of effect measures in studies using respondent-driven sampling.
Rotondi, Michael A
2014-06-01
Respondent-driven sampling (RDS) is an increasingly common sampling technique to recruit hidden populations. Statistical methods for RDS are not straightforward due to the correlation between individual outcomes and subject weighting; thus, analyses are typically limited to estimation of population proportions. This manuscript applies the method of variance estimates recovery (MOVER) to construct confidence intervals for effect measures such as risk difference (difference of proportions) or relative risk in studies using RDS. To illustrate the approach, MOVER is used to construct confidence intervals for differences in the prevalence of demographic characteristics between an RDS study and convenience study of injection drug users. MOVER is then applied to obtain a confidence interval for the relative risk between education levels and HIV seropositivity and current infection with syphilis, respectively. This approach provides a simple method to construct confidence intervals for effect measures in RDS studies. Since it only relies on a proportion and appropriate confidence limits, it can also be applied to previously published manuscripts.
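Since MOVER is essentially a "square-and-add" combination of the two per-group confidence limits, a minimal sketch is easy to give. The Wilson limits used below assume simple random sampling and are only a stand-in; an actual RDS analysis would substitute RDS-adjusted proportion estimates and their corresponding limits before applying the same combination step.

```python
import math

def wilson_ci(k, n, z=1.96):
    """95% Wilson score interval for a simple binomial proportion k/n."""
    p = k / n
    denom = 1 + z**2 / n
    centre = (p + z**2 / (2 * n)) / denom
    half = z * math.sqrt(p * (1 - p) / n + z**2 / (4 * n**2)) / denom
    return centre - half, centre + half

def mover_risk_difference(k1, n1, k2, n2):
    """MOVER confidence interval for p1 - p2 built from the two per-group CIs."""
    p1, p2 = k1 / n1, k2 / n2
    l1, u1 = wilson_ci(k1, n1)
    l2, u2 = wilson_ci(k2, n2)
    d = p1 - p2
    lower = d - math.sqrt((p1 - l1) ** 2 + (u2 - p2) ** 2)
    upper = d + math.sqrt((u1 - p1) ** 2 + (p2 - l2) ** 2)
    return d, (lower, upper)

# Toy comparison: 45/120 vs 30/150 with a given characteristic
print(mover_risk_difference(45, 120, 30, 150))
```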
Enhancing and expanding intersectional research for climate change adaptation in agrarian settings.
Thompson-Hall, Mary; Carr, Edward R; Pascual, Unai
2016-12-01
Most current approaches focused on vulnerability, resilience, and adaptation to climate change frame gender and its influence in a manner out-of-step with contemporary academic and international development research. The tendency to rely on analyses of the sex-disaggregated gender categories of 'men' and 'women' as sole or principal divisions explaining the abilities of different people within a group to adapt to climate change illustrates this problem. This framing of gender persists in spite of established bodies of knowledge that show how roles and responsibilities that influence a person's ability to deal with climate-induced and other stressors emerge at the intersection of diverse identity categories, including but not limited to gender, age, seniority, ethnicity, marital status, and livelihoods. Here, we provide a review of relevant literature on this topic and argue that approaching vulnerability to climate change through intersectional understandings of identity can help improve adaptation programming, project design, implementation, and outcomes.
Hyun, Kyung-A; Koo, Gi-Bang; Han, Hyunju; Sohn, Joohyuk; Choi, Wonshik; Kim, Seung-Il; Jung, Hyo-Il; Kim, You-Sun
2016-04-26
The dissemination of circulating tumor cells (CTCs) requires the Epithelial-to-Mesenchymal transition (EMT), in which cells lose their epithelial characteristics and acquire more mesenchymal-like phenotypes. Current isolation of CTCs relies on affinity-based approaches reliant on the expression of Epithelial Cell Adhesion Molecule (EpCAM). Here we show EMT-induced breast cancer cells maintained in prolonged mammosphere culture conditions possess increased EMT markers and cancer stem cell markers, as well as reduced cell mass and size by quantitative phase microscopy; however, EpCAM expression is dramatically decreased in these cells. Moreover, CTCs isolated from breast cancer patients using a label-free microfluidic flow fractionation device had differing expression patterns of EpCAM, indicating that affinity approaches reliant on EpCAM expression may underestimate CTC number and potentially miss critical subpopulations. Further characterization of CTCs, including low-EpCAM populations, using this technology may improve detection techniques and cancer diagnosis, ultimately improving cancer treatment.
NASA Astrophysics Data System (ADS)
Maharbiz, Michel M.
2017-05-01
The emerging field of bioelectronic medicine seeks methods for deciphering and modulating electrophysiological activity in the body to attain therapeutic effects at target organs. Current approaches to interfacing with peripheral nerves and muscles rely heavily on wires, creating problems for chronic use, while emerging wireless approaches lack the size scalability necessary to interrogate small-diameter nerves. Furthermore, conventional electrode-based technologies lack the capability to record from nerves with high spatial resolution or to record independently from many discrete sites within a nerve bundle. We recently demonstrated (Seo et al., arXiV, 2013; Seo et al., Neuron, 2016) "neural dust," a wireless and scalable ultrasonic backscatter system for powering and communicating with implanted bioelectronics. There, we showed that ultrasound is effective at delivering power to mm-scale devices in tissue; likewise, passive, battery-less communication using backscatter enabled high-fidelity transmission of electromyogram (EMG) and electroneurogram (ENG) signals from anesthetized rats. In this talk, I will review recent developments from my group and collaborators in this area.
Malpeli, Katherine C.; Chirico, Peter G.
2014-01-01
The Central African Republic (CAR), a country with rich diamond deposits and a tumultuous political history, experienced a government takeover by the Seleka rebel coalition in 2013. It is within this context that we developed and implemented a geospatial approach for assessing the lootability of high value-to-weight resource deposits, using the case of diamonds in CAR as an example. According to current definitions of lootability, or the vulnerability of deposits to exploitation, CAR's two major diamond deposits are similarly lootable. However, using this geospatial approach, we demonstrate that the deposits experience differing political geographic, spatial location, and cultural geographic contexts, rendering the eastern deposits more lootable than the western deposits. The patterns identified through this detailed analysis highlight the geographic complexities surrounding the issue of conflict resources and lootability, and speak to the importance of examining these topics at the sub-national scale, rather than relying on national-scale statistics.
A permutation testing framework to compare groups of brain networks.
Simpson, Sean L; Lyday, Robert G; Hayasaka, Satoru; Marsh, Anthony P; Laurienti, Paul J
2013-01-01
Brain network analyses have moved to the forefront of neuroimaging research over the last decade. However, methods for statistically comparing groups of networks have lagged behind. These comparisons have great appeal for researchers interested in gaining further insight into complex brain function and how it changes across different mental states and disease conditions. Current comparison approaches generally either rely on a summary metric or on mass-univariate nodal or edge-based comparisons that ignore the inherent topological properties of the network, yielding little power and failing to make network level comparisons. Gleaning deeper insights into normal and abnormal changes in complex brain function demands methods that take advantage of the wealth of data present in an entire brain network. Here we propose a permutation testing framework that allows comparing groups of networks while incorporating topological features inherent in each individual network. We validate our approach using simulated data with known group differences. We then apply the method to functional brain networks derived from fMRI data.
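The label-permutation backbone of such a comparison is straightforward to sketch. The per-subject summary metric, group sizes, and effect size below are toy assumptions, and the published framework goes beyond this by building topological features of each network into the test statistic rather than collapsing everything to one number.

```python
import numpy as np

def permutation_pvalue(metric_a, metric_b, n_perm=10000, rng=None):
    """Two-sided permutation p-value for a difference in group means of a
    per-network summary metric (e.g., global efficiency per subject)."""
    if rng is None:
        rng = np.random.default_rng(0)
    pooled = np.concatenate([metric_a, metric_b])
    n_a = len(metric_a)
    observed = abs(metric_a.mean() - metric_b.mean())
    count = 0
    for _ in range(n_perm):
        perm = rng.permutation(pooled)        # shuffle group labels
        diff = abs(perm[:n_a].mean() - perm[n_a:].mean())
        count += diff >= observed
    return (count + 1) / (n_perm + 1)

# Toy per-subject metrics for two groups of brain networks
rng = np.random.default_rng(3)
group_a = rng.normal(0.30, 0.05, size=20)
group_b = rng.normal(0.34, 0.05, size=22)
print(permutation_pvalue(group_a, group_b))
```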
MobileODT: a case study of a novel approach to an mHealth-based model of sustainable impact
Mink, Jonah
2016-01-01
A persistent challenge facing global health actors is ensuring that time-bound interventions are ultimately adopted and integrated into local health systems for long term health system strengthening and capacity building. This level of sustainability is rarely achieved with current models of global health intervention that rely on continuous injection of resources or persistent external presence on the ground. Presented here is a case study of a flipped approach to creating capacity and adoption through an engagement strategy centered around an innovative mHealth device and connected service. Through an impact-oriented business model, this mHealth solution engages stakeholders in a cohesive and interdependent network by appealing to the pain points for each actor throughout the health system. This particular intervention centers around the MobileODT, Inc. Enhanced Visual Assessment (EVA) System for enhanced visualization. While focused on challenges to cervical cancer screening and treatment services, the lessons learned are offered as a model for lateral translation into adjacent health condition verticals. PMID:28293590
Probing noncommutative theories with quantum optical experiments
NASA Astrophysics Data System (ADS)
Dey, Sanjib; Bhat, Anha; Momeni, Davood; Faizal, Mir; Ali, Ahmed Farag; Dey, Tarun Kumar; Rehman, Atikur
2017-11-01
One of the major difficulties of modern science underlies at the unification of general relativity and quantum mechanics. Different approaches towards such theory have been proposed. Noncommutative theories serve as the root of almost all such approaches. However, the identification of the appropriate passage to quantum gravity is suffering from the inadequacy of experimental techniques. It is beyond our ability to test the effects of quantum gravity thorough the available scattering experiments, as it is unattainable to probe such high energy scale at which the effects of quantum gravity appear. Here we propose an elegant alternative scheme to test such theories by detecting the deformations emerging from the noncommutative structures. Our protocol relies on the novelty of an opto-mechanical experimental setup where the information of the noncommutative oscillator is exchanged via the interaction with an optical pulse inside an optical cavity. We also demonstrate that our proposal is within the reach of current technology and, thus, it could uncover a feasible route towards the realization of quantum gravitational phenomena thorough a simple table-top experiment.
NASA Astrophysics Data System (ADS)
Sartori, Martina; Schiavo, Stefano; Fracasso, Andrea; Riccaboni, Massimo
2017-12-01
The paper investigates how the topological features of the virtual water (VW) network and the size of the associated VW flows are likely to change over time, under different socio-economic and climate scenarios. We combine two alternative models of network formation (a stochastic and a fitness model, used to describe the structure of VW flows) with a gravity model of trade to predict the intensity of each bilateral flow. This combined approach is superior to existing methodologies in its ability to replicate the observed features of VW trade. The insights from the models are used to forecast future VW flows in 2020 and 2050, under different climatic scenarios, and compare them with future water availability. Results suggest that the current trend of VW exports is not sustainable for all countries. Moreover, our approach highlights that some VW importers might be exposed to "imported water stress" as they rely heavily on imports from countries whose water use is unsustainable.
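As a reminder of the functional form involved (hedged; the paper's exact covariates and estimation strategy may differ), a gravity specification for the volume of virtual water \(W_{ij}\) traded from country \(i\) to country \(j\) typically looks like

\[
W_{ij} \;=\; \beta_0\,\frac{(\mathrm{GDP}_i)^{\beta_1}\,(\mathrm{GDP}_j)^{\beta_2}}{D_{ij}^{\beta_3}},
\]

with \(D_{ij}\) a distance or trade-cost term, so that the network-formation model decides which links exist while the gravity equation assigns an intensity to each existing link.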
A robust close-range photogrammetric target extraction algorithm for size and type variant targets
NASA Astrophysics Data System (ADS)
Nyarko, Kofi; Thomas, Clayton; Torres, Gilbert
2016-05-01
The Photo-G program conducted by Naval Air Systems Command at the Atlantic Test Range in Patuxent River, Maryland, uses photogrammetric analysis of large amounts of real-world imagery to characterize the motion of objects in a 3-D scene. Current approaches involve several independent processes including target acquisition, target identification, 2-D tracking of image features, and 3-D kinematic state estimation. Each process has its own inherent complications and corresponding degrees of both human intervention and computational complexity. One approach being explored for automated target acquisition relies on exploiting the pixel intensity distributions of photogrammetric targets, which tend to be patterns with bimodal intensity distributions. The bimodal distribution partitioning algorithm utilizes this distribution to automatically deconstruct a video frame into regions of interest (ROI) that are merged and expanded to target boundaries, from which ROI centroids are extracted to mark target acquisition points. This process has proved to be scale, position and orientation invariant, as well as fairly insensitive to global uniform intensity disparities.
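A stripped-down version of the bimodal splitting step can be sketched as follows. The Otsu-style threshold, the connected-component labelling, and the toy frame are stand-ins: the Photo-G algorithm's ROI merging and expansion to target boundaries are not reproduced here, only the idea of cutting a bimodal intensity histogram and taking region centroids as acquisition points.

```python
import numpy as np
from scipy import ndimage

def otsu_threshold(frame):
    """Pick the intensity cut that best separates a bimodal histogram."""
    hist, edges = np.histogram(frame, bins=256)
    centers = (edges[:-1] + edges[1:]) / 2
    total = hist.sum()
    best_t, best_var = centers[0], -1.0
    for i in range(1, 256):
        w0, w1 = hist[:i].sum(), hist[i:].sum()
        if w0 == 0 or w1 == 0:
            continue
        m0 = (hist[:i] * centers[:i]).sum() / w0
        m1 = (hist[i:] * centers[i:]).sum() / w1
        between = w0 * w1 * (m0 - m1) ** 2 / total**2
        if between > best_var:
            best_var, best_t = between, centers[i]
    return best_t

def acquisition_points(frame):
    """Threshold, label connected regions of interest, return their centroids."""
    mask = frame > otsu_threshold(frame)
    labels, n = ndimage.label(mask)
    return ndimage.center_of_mass(mask, labels, range(1, n + 1))

# Toy frame: two bright square targets on a dark background
frame = np.zeros((100, 100)) + 10
frame[20:30, 20:30] = 200
frame[60:72, 55:67] = 200
print(acquisition_points(frame))
```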
Palladium-catalysed anti-Markovnikov selective oxidative amination
NASA Astrophysics Data System (ADS)
Kohler, Daniel G.; Gockel, Samuel N.; Kennemur, Jennifer L.; Waller, Peter J.; Hull, Kami L.
2018-03-01
In recent years, the synthesis of amines and other nitrogen-containing motifs has been a major area of research in organic chemistry because they are widely represented in biologically active molecules. Current strategies rely on a multistep approach and require one reactant to be activated prior to the carbon-nitrogen bond formation. This leads to a reaction inefficiency and functional group intolerance. As such, a general approach to the synthesis of nitrogen-containing compounds from readily available and benign starting materials is highly desirable. Here we present a palladium-catalysed oxidative amination reaction in which the addition of the nitrogen occurs at the less-substituted carbon of a double bond, in what is known as anti-Markovnikov selectivity. Alkenes are shown to react with imides in the presence of a palladate catalyst to generate the terminal imide through trans-aminopalladation. Subsequently, olefin isomerization occurs to afford the thermodynamically favoured products. Both the scope of the transformation and mechanistic investigations are reported.
Non-invasive pressure difference estimation from PC-MRI using the work-energy equation
Donati, Fabrizio; Figueroa, C. Alberto; Smith, Nicolas P.; Lamata, Pablo; Nordsletten, David A.
2015-01-01
Pressure difference is an accepted clinical biomarker for cardiovascular disease conditions such as aortic coarctation. Currently, measurements of pressure differences in the clinic rely on invasive techniques (catheterization), prompting development of non-invasive estimates based on blood flow. In this work, we propose a non-invasive estimation procedure deriving pressure difference from the work-energy equation for a Newtonian fluid. Spatial and temporal convergence is demonstrated on in silico Phase Contrast Magnetic Resonance Image (PC-MRI) phantoms with steady and transient flow fields. The method is also tested on an image dataset generated in silico from a 3D patient-specific Computational Fluid Dynamics (CFD) simulation and finally evaluated on a cohort of 9 subjects. The performance is compared to existing approaches based on steady and unsteady Bernoulli formulations as well as the pressure Poisson equation. The new technique shows good accuracy, robustness to noise, and robustness to the image segmentation process, illustrating the potential of this approach for non-invasive pressure difference estimation. PMID:26409245
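For orientation, the unsteady Bernoulli estimate that such methods are usually benchmarked against can be written, along a path from point 1 to point 2 and neglecting viscous losses, as

\[
p_1 - p_2 \;=\; \tfrac{1}{2}\rho\left(\lvert\mathbf{v}_2\rvert^{2} - \lvert\mathbf{v}_1\rvert^{2}\right) \;+\; \rho\int_{1}^{2}\frac{\partial \mathbf{v}}{\partial t}\cdot\mathrm{d}\boldsymbol{\ell}.
\]

The work-energy formulation used in the paper instead balances pressure work against kinetic-energy flux, transient kinetic-energy change, and viscous dissipation integrated over the imaged vessel volume; the exact form is given in the paper and is not reproduced here.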
Water Resources Management and Hydrologic Design Under Uncertain Climate Change Scenarios
NASA Astrophysics Data System (ADS)
Teegavarapu, R. S.
2008-05-01
The impact of climate change on hydrologic design and management of water resource systems could be one of the important challenges faced by future practicing hydrologists and water resources managers. Many water resources managers currently rely on the historical hydrological data and adaptive real-time operations without consideration of the impact of climate change on major inputs influencing the behavior of hydrologic systems and the operating rules. Issues such as risk, reliability and robustness of water resources systems under different climate change scenarios were addressed in the past. However, water resources management with the decision maker's preferences attached to climate change has never been dealt with. This presentation discusses issues related to impacts of climate change on water resources management and application of a soft-computing approach, fuzzy set theory, for climate-sensitive management of water resources systems. A real-life case study example is presented to illustrate the applicability of soft-computing approach for handling the decision maker's preferences in accepting or rejecting the magnitude and direction of climate change.
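A minimal illustration of how fuzzy membership functions can encode a decision maker's preferences is sketched below. The release variable, the triangular memberships, and the max-min compromise rule are assumptions chosen for brevity; they stand in for, rather than reproduce, the formulation used in the case study.

```python
import numpy as np

def triangular(x, a, b, c):
    """Triangular fuzzy membership function peaking at b on support [a, c]."""
    return np.clip(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0, 1.0)

# Hypothetical preference for a reservoir release (m^3/s) under a drier-future
# climate scenario: the decision maker prefers releases near 40 m^3/s.
releases = np.linspace(0, 100, 101)
preference = triangular(releases, a=20, b=40, c=70)

# Intersect (min) with a second membership expressing supply reliability,
# then pick the release with the highest degree of joint satisfaction.
reliability = triangular(releases, a=30, b=60, c=100)
compromise = np.minimum(preference, reliability)
best_release = releases[np.argmax(compromise)]
print(best_release)
```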
Epistemic and aleatory uncertainty in the study of dynamic human-water systems
NASA Astrophysics Data System (ADS)
Di Baldassarre, Giuliano; Brandimarte, Luigia; Beven, Keith
2016-04-01
Here we discuss epistemic and aleatory uncertainty in the study of dynamic human-water systems (e.g. socio-hydrology), which is one of the main topics of Panta Rhei, the current scientific decade of the International Association of Hydrological Sciences (IAHS). In particular, we identify three types of lack of understanding: (i) known unknowns, which are things we know we don't know; (ii) unknown unknowns, which are things we don't know we don't know; and (iii) wrong assumptions, things we think we know, but we actually don't know. We posit that a better understanding of human-water interactions and feedbacks can help coping with wrong assumptions and known unknowns. Moreover, being aware of the existence of unknown unknowns, and their potential capability to generate surprises or black swans, suggest the need to rely more on bottom-up approaches, based on social vulnerabilities and possibilities of failures, and less on top-down approaches, based on optimization and quantitative predictions.
Iera, Jaclyn A; Jenkins, Lisa M Miller; Kajiyama, Hiroshi; Kopp, Jeffrey B; Appella, Daniel H
2010-11-15
Inhibitors for protein-protein interactions are challenging to design, in part due to the unique and complex architectures of each protein's interaction domain. Most approaches to develop inhibitors for these interactions rely on rational design, which requires prior structural knowledge of the target and its ligands. In the absence of structural information, a combinatorial approach may be the best alternative to finding inhibitors of a protein-protein interaction. Current chemical libraries, however, consist mostly of molecules designed to inhibit enzymes. In this manuscript, we report the synthesis and screening of a library based on an N-acylated polyamine (NAPA) scaffold that we designed to have specific molecular features necessary to inhibit protein-protein interactions. Screens of the library identified a member with favorable binding properties toward the HIV viral protein R (Vpr), a regulatory protein involved in numerous interactions with other proteins critical for viral replication. Published by Elsevier Ltd.
Systems engineering interfaces: A model based approach
NASA Astrophysics Data System (ADS)
Fosse, E.; Delp, C. L.
The engineering of interfaces is a critical function of the discipline of Systems Engineering. Included in interface engineering are instances of interaction. Interfaces provide the specifications of the relevant properties of a system or component that can be connected to other systems or components while instances of interaction are identified in order to specify the actual integration to other systems or components. Current Systems Engineering practices rely on a variety of documents and diagrams to describe interface specifications and instances of interaction. The SysML[1] specification provides a precise model based representation for interfaces and interface instance integration. This paper will describe interface engineering as implemented by the Operations Revitalization Task using SysML, starting with a generic case and culminating with a focus on a Flight System to Ground Interaction. The reusability of the interface engineering approach presented as well as its extensibility to more complex interfaces and interactions will be shown. Model-derived tables will support the case studies shown and are examples of model-based documentation products.
NASA Astrophysics Data System (ADS)
Jarujareet, Ungkarn; Amarit, Rattasart; Sumriddetchkajorn, Sarun
2016-11-01
Realizing that current microfluidic chip fabrication techniques are time consuming and labor intensive, and that they always leave leftover material after chip fabrication, this research work proposes an innovative approach for rapid microfluidic chip production. The key idea relies on a combination of a widely-used inkjet printing method and a heat-based polymer curing technique with electronic-mechanical control, thus eliminating the need for masking and molds compared to typical microfluidic fabrication processes. In addition, as the appropriate amount of polymer is utilized during printing, there is much less material wasted. Our inkjet-based microfluidic printer can print out the desired microfluidic chip pattern directly onto a heated glass surface, where the printed polymer is suddenly cured. Our proof-of-concept demonstration for widely-used single-flow channel, Y-junction, and T-junction microfluidic chips shows that the whole microfluidic chip fabrication process requires only 3 steps with a fabrication time of 6 minutes.
NASA Technical Reports Server (NTRS)
Thome, Kurtis; Gubbels, Timothy; Barnes, Robert
2011-01-01
The Climate Absolute Radiance and Refractivity Observatory (CLARREO) plans to observe climate change trends over decadal time scales to determine the accuracy of climate projections. The project relies on spaceborne earth observations of SI-traceable variables sensitive to key decadal change parameters. The mission includes a reflected solar instrument retrieving at-sensor reflectance over the 320 to 2300 nm spectral range with 500-m spatial resolution and 100-km swath. Reflectance is obtained from the ratio of measurements of the earth's surface to those while viewing the sun, relying on a calibration approach that retrieves reflectance with uncertainties less than 0.3%. The calibration is predicated on heritage hardware, reduction of sensor complexity, adherence to detector-based calibration standards, and an ability to simulate in the laboratory on-orbit sources in both size and brightness to provide the basis of a transfer to orbit of the laboratory calibration including a link to absolute solar irradiance measurements. The Climate Absolute Radiance and Refractivity Observatory (CLARREO) mission addresses the need to observe high-accuracy, long-term climate change trends and to use decadal change observations as the most critical method to determine the accuracy of climate change projections such as those in the IPCC Report. A rigorously known accuracy of both decadal change observations as well as climate projections is critical in order to enable sound policy decisions. The CLARREO Project will implement a spaceborne earth observation mission designed to provide rigorous SI traceable observations (i.e., radiance, reflectance, and refractivity) that are sensitive to a wide range of key decadal change variables, including: 1) Surface temperature and atmospheric temperature profile; 2) Atmospheric water vapor profile; 3) Far infrared water vapor greenhouse; 4) Aerosol properties and anthropogenic aerosol direct radiative forcing; 5) Total and spectral solar irradiance; 6) Broadband reflected and emitted radiative fluxes; 7) Cloud properties; and 8) Surface albedo. There are two methods the CLARREO mission will rely on to achieve these critical decadal change benchmarks: direct and reference inter-calibration. A quantitative analysis of the strengths and weaknesses of the two methods has led to the recommended CLARREO mission approach. The project consists of two satellites launched into 90-degree, precessing orbits separated by 90 degrees. The instrument suite on each spacecraft includes one emitted infrared spectrometer, two reflected solar spectrometers dividing the spectrum from ultraviolet through near infrared, and one global navigation receiver for radio occultation. The measurements will be acquired for a period of three years minimum, with a five-year lifetime goal, enabling follow-on missions to extend the climate record over the decades needed to understand climate change. The current work concentrates on the reflected solar instrument, giving an overview of its design and calibration approach. The calibration description includes the approach to achieving an SI-traceable system on orbit. The calibration overview is followed by a preliminary error budget based on techniques currently in place at the National Institute of Standards and Technology (NIST).
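As a rough, standard (not CLARREO-specific) statement of how a ratio-type reflectance retrieval works, the at-sensor reflectance factor in a band can be written

\[
\rho_\lambda \;\approx\; \frac{\pi\,L_\lambda}{E_{0,\lambda}\,\cos\theta_s},
\]

where \(L_\lambda\) is the earth-viewing radiance, \(E_{0,\lambda}\) the solar spectral irradiance, and \(\theta_s\) the solar zenith angle. Because the same optics and detectors view both the earth and the sun, many instrument gain terms cancel in the ratio, which is the general reasoning behind pursuing reflectance uncertainties below 0.3%.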
NASA Astrophysics Data System (ADS)
Alam, Muhammad
2014-03-01
The discovery of dye-sensitized and bulk heterojunction (BHJ) solar cells in the early 1990s introduced a new class of PV technology that relies on (i) distributed photogeneration of excitons, (ii) dissociation of excitons into free carriers by the heterojunction between two organic semiconductors (OSC), and (iii) collection of free carriers through electron and hole transport layers. The success of the approach is undisputed: the highest efficiency OPV cells have all relied on variants of the BHJ approach. Yet, three concerns related to the use of a pair of OSCs, namely, low Voc, process sensitivity, and reliability, suggest that the technology may never achieve efficiency-variability-reliability metrics comparable to inorganic solar cells. This encourages a reconsideration of the prospects of single-semiconductor OPV (SS-OPV), a system presumably doomed by the exciton bottleneck. In this talk, we use an inverted SS-OPV to demonstrate how the historical SS-OPV experiments may have been misinterpreted. No one disputes the signature of excitons in polymer under narrowband excitation, but our experiments show that exciton dissociation need not be a bottleneck for OPV under broadband solar illumination. We demonstrate that an alternate collection-limited theory consistently interprets the classical and new experiments, and resolves puzzles such as efficiency loss with increasing light intensity and voltage-dependent reverse photo-current. The theory and experiments suggest a new ``perovskite-like'' strategy to improve the efficiency-variability-reliability of organic solar cells. The work was supported by the Columbia DOE-EFRC (DE-SC0001085) and NSF-NCN (EEC-0228390).
Identifying poor performance among doctors in NHS organizations.
Locke, Rachel; Scallan, Samantha; Leach, Camilla; Rickenbach, Mark
2013-10-01
The aim was to account for the means by which poor performance among career doctors is identified by National Health Service organizations, whether the tools are considered effective, and how these processes may be strengthened in the light of revalidation and the requirement for doctors to demonstrate their fitness to practice. This study sought to look beyond the 'doctor as individual'; as well as considering the typical approaches to managing the practice of an individual, the systems within which the doctor is working were reviewed, as these are also relevant to standards of performance. A qualitative review was undertaken consisting of a literature review of current practice, a policy review of current documentation from 15 trusts in one deanery locality, and 14 semi-structured interviews with respondents with an overview of processes in use. The framework for the analysis of the data considered tools at three levels: individual, team and organizational. Tools are, in the main, reactive--with an individual focus. They rely on colleagues and others to speak out, so their effectiveness is hindered by a reluctance to do so. Tools can lack an evidence base for their use, and there is limited linking of data across contexts and tools. There is more work to be done in evaluating current tools and developing stronger processes. Linkage between data sources needs to be improved and proactive tools at the organizational level need further development to help with the early identification of performance issues. This would also assist in balancing a wider systems approach with a current overemphasis on individual doctors. © 2012 John Wiley & Sons Ltd.
Nevers, Meredith B.; Whitman, Richard L.
2011-01-01
Efforts to improve public health protection in recreational swimming waters have focused on obtaining real-time estimates of water quality. Current monitoring techniques rely on the time-intensive culturing of fecal indicator bacteria (FIB) from water samples, but rapidly changing FIB concentrations result in management errors that lead to the public being exposed to high FIB concentrations (type II error) or beaches being closed despite acceptable water quality (type I error). Empirical predictive models may provide a rapid solution, but their effectiveness at improving health protection has not been adequately assessed. We sought to determine if emerging monitoring approaches could effectively reduce risk of illness exposure by minimizing management errors. We examined four monitoring approaches (inactive, current protocol, a single predictive model for all beaches, and individual models for each beach) with increasing refinement at 14 Chicago beaches using historical monitoring and hydrometeorological data and compared management outcomes using different standards for decision-making. Predictability (R2) of FIB concentration improved with model refinement at all beaches but one. Predictive models did not always reduce the number of management errors and therefore the overall illness burden. Use of a Chicago-specific single-sample standard, rather than the widely used default of 235 E. coli CFU/100 ml, together with predictive modeling resulted in the greatest number of open beach days without any increase in public health risk. These results emphasize that emerging monitoring approaches such as empirical models are not equally applicable at all beaches, and combining monitoring approaches may expand beach access.
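A hedged sketch of the error bookkeeping described above, assuming synthetic predicted and observed E. coli concentrations; only the 235 CFU/100 ml default standard comes from the abstract.

```python
import numpy as np

# Compare beach management decisions made from a model prediction against
# the (later) culture result and tally the two error types. Data are toy.
def management_errors(predicted, observed, standard=235.0):
    predicted = np.asarray(predicted, dtype=float)
    observed = np.asarray(observed, dtype=float)
    closed = predicted >= standard          # decision made from the model
    exceeded = observed >= standard         # what the culture later showed
    type_i = np.sum(closed & ~exceeded)     # closed despite acceptable water
    type_ii = np.sum(~closed & exceeded)    # open despite high FIB levels
    return int(type_i), int(type_ii)

# Synthetic model predictions vs. observed E. coli (CFU/100 ml)
pred = [120, 300, 80, 500, 200]
obs = [90, 150, 260, 480, 100]
print(management_errors(pred, obs))  # (1, 1)
```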
THE FUTURE OF TOXICOLOGY-PREDICTIVE TOXICOLOGY: AN EXPANDED VIEW OF CHEMICAL TOXICITY
A chemistry approach to predictive toxicology relies on structure-activity relationship (SAR) modeling to predict biological activity from chemical structure. Such approaches have proven capabilities when applied to well-defined toxicity end points or regions of chemical space. T...
Iterative Addition of Kinetic Effects to Cold Plasma RF Wave Solvers
NASA Astrophysics Data System (ADS)
Green, David; Berry, Lee; RF-SciDAC Collaboration
2017-10-01
The hot nature of fusion plasmas requires a wave vector dependent conductivity tensor for accurate calculation of wave heating and current drive. Traditional methods for calculating the linear, kinetic full-wave plasma response rely on a spectral method such that the wave vector dependent conductivity fits naturally within the numerical method. These methods have seen much success for application to the well-confined core plasma of tokamaks. However, quantitative prediction of high power RF antenna designs for fusion applications has meant a requirement of resolving the geometric details of the antenna and other plasma facing surfaces, for which the Fourier spectral method is ill-suited. An approach to enabling the addition of kinetic effects to the more versatile finite-difference and finite-element cold-plasma full-wave solvers was presented in earlier work, where an operator-split iterative method was outlined. Here we expand on this approach, examine convergence, and present a simplified kinetic current estimator for rapidly updating the right-hand side of the wave equation with kinetic corrections. This research used resources of the Oak Ridge Leadership Computing Facility at the Oak Ridge National Laboratory, which is supported by the Office of Science of the U.S. Department of Energy under Contract No. DE-AC05-00OR22725.
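A schematic sketch, not the RF-SciDAC implementation, of the operator-split fixed-point structure described above: the cold-plasma solve is repeated with a kinetic current correction fed back into the right-hand side. Both operators are placeholder callables standing in for whatever discretization is in use.

```python
import numpy as np

def iterate_kinetic_correction(solve_cold_wave, kinetic_current,
                               j_antenna, tol=1e-6, max_iter=50):
    """solve_cold_wave(rhs) -> E field; kinetic_current(E) -> J_kin.

    The loop only illustrates the fixed-point structure of the method:
    solve, estimate the kinetic current from the field, update the RHS,
    and repeat until the field stops changing.
    """
    e_field = solve_cold_wave(j_antenna)
    j_kin = np.zeros_like(e_field)
    for _ in range(max_iter):
        j_kin = kinetic_current(e_field)
        e_new = solve_cold_wave(j_antenna + j_kin)
        if np.linalg.norm(e_new - e_field) <= tol * np.linalg.norm(e_new):
            return e_new, j_kin
        e_field = e_new
    return e_field, j_kin

# Trivial stand-ins just to exercise the loop; not physical operators.
e_field, j_kin = iterate_kinetic_correction(lambda rhs: 0.5 * rhs,
                                            lambda E: 0.1 * E,
                                            j_antenna=np.ones(4))
print(e_field)   # converges to ~0.53 per component for these toy operators
```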
Top-down approach for the direct characterization of low molecular weight heparins using LC-FT-MS.
Li, Lingyun; Zhang, Fuming; Zaia, Joseph; Linhardt, Robert J
2012-10-16
Low molecular weight heparins (LMWHs) are structurally complex, heterogeneous, polydisperse, and highly negatively charged mixtures of polysaccharides. The direct characterization of LMWH is a major challenge for currently available analytical technologies. Electrospray ionization (ESI) liquid chromatography-mass spectrometry (LC-MS) is a powerful tool for the characterization of complex biological samples in the fields of proteomics, metabolomics, and glycomics. LC-MS has been applied to the analysis of heparin oligosaccharides, separated by size exclusion, reversed-phase ion-pairing chromatography, and chip-based amide hydrophilic interaction chromatography (HILIC). However, there have been limited applications of ESI-LC-MS for the direct characterization of intact LMWHs (top-down analysis) due to their structural complexity, low ionization efficiency, and sulfate loss. Here we present a simple and reliable HILIC-Fourier transform (FT)-ESI-MS platform to characterize and compare two currently marketed LMWH products using the top-down approach, requiring no special sample preparation steps. This HILIC system relies on cross-linked diol rather than amide chemistry, affording highly resolved chromatographic separations using a relatively high percentage of acetonitrile in the mobile phase, resulting in stable and high-efficiency ionization. Bioinformatics software (GlycReSoft 1.0) was used to automatically assign structures within 5-ppm mass accuracy.
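A small helper illustrating the stated 5-ppm mass-accuracy criterion for structure assignment; the m/z values are arbitrary examples, not LMWH data.

```python
# ppm error between an observed and a theoretical m/z; masses are invented.
def ppm_error(observed_mz, theoretical_mz):
    return (observed_mz - theoretical_mz) / theoretical_mz * 1e6

obs, theo = 1153.4867, 1153.4912
err = ppm_error(obs, theo)
print(f"{err:.1f} ppm, within tolerance: {abs(err) <= 5.0}")
```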
Bioinformatic approaches to augment study of epithelial-to-mesenchymal transition in lung cancer
Beck, Tim N.; Chikwem, Adaeze J.; Solanki, Nehal R.
2014-01-01
Bioinformatic approaches are intended to provide systems level insight into the complex biological processes that underlie serious diseases such as cancer. In this review we describe current bioinformatic resources, and illustrate how they have been used to study a clinically important example: epithelial-to-mesenchymal transition (EMT) in lung cancer. Lung cancer is the leading cause of cancer-related deaths and is often diagnosed at advanced stages, leading to limited therapeutic success. While EMT is essential during development and wound healing, pathological reactivation of this program by cancer cells contributes to metastasis and drug resistance, both major causes of death from lung cancer. Challenges of studying EMT include its transient nature, its molecular and phenotypic heterogeneity, and the complicated networks of rewired signaling cascades. Given the biology of lung cancer and the role of EMT, it is critical to better align the two in order to advance the impact of precision oncology. This task relies heavily on the application of bioinformatic resources. Besides summarizing recent work in this area, we use four EMT-associated genes, TGF-β (TGFB1), NEDD9/HEF1, β-catenin (CTNNB1) and E-cadherin (CDH1), as exemplars to demonstrate the current capacities and limitations of probing bioinformatic resources to inform hypothesis-driven studies with therapeutic goals. PMID:25096367
Comparing mechanistic and empirical approaches to modeling the thermal niche of almond
NASA Astrophysics Data System (ADS)
Parker, Lauren E.; Abatzoglou, John T.
2017-09-01
Delineating locations that are thermally viable for cultivating high-value crops can help to guide land use planning, agronomics, and water management. Three modeling approaches were used to identify the potential distribution and key thermal constraints on almond cultivation across the southwestern United States (US), including two empirical species distribution models (SDMs)—one using commonly used bioclimatic variables (traditional SDM) and the other using more physiologically relevant climate variables (nontraditional SDM)—and a mechanistic model (MM) developed using published thermal limitations from field studies. While models showed comparable results over the majority of the domain, including over existing croplands with high almond density, the MM suggested the greatest potential for the geographic expansion of almond cultivation, with frost susceptibility and insufficient heat accumulation being the primary thermal constraints in the southwestern US. The traditional SDM over-predicted almond suitability in locations shown by the MM to be limited by frost, whereas the nontraditional SDM showed greater agreement with the MM in these locations, indicating that incorporating physiologically relevant variables in SDMs can improve predictions. Finally, opportunities for geographic expansion of almond cultivation under current climatic conditions in the region may be limited, suggesting that increasing production may rely on agronomical advances and densifying current almond plantations in existing locations.
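An illustrative sketch of a rule-based mechanistic suitability check of the kind the MM uses; the frost and heat-accumulation thresholds below are hypothetical placeholders, not the published almond limits used in the study.

```python
import numpy as np

def thermally_suitable(daily_tmin, daily_tmean,
                       max_spring_frost_days=2,      # hypothetical threshold
                       min_growing_degree_days=900,  # hypothetical, base 10 C
                       gdd_base=10.0):
    """Flag a site as suitable if frost days stay below a cap and growing
    degree days exceed a floor; both limits are illustrative only."""
    daily_tmin = np.asarray(daily_tmin, dtype=float)
    daily_tmean = np.asarray(daily_tmean, dtype=float)
    frost_days = int(np.sum(daily_tmin < 0.0))
    gdd = float(np.sum(np.maximum(daily_tmean - gdd_base, 0.0)))
    return frost_days <= max_spring_frost_days and gdd >= min_growing_degree_days

# Invented daily series for one spring season
tmin = [-2.0, 1.0, 3.0] + [5.0] * 120
tmean = [4.0, 8.0, 10.0] + [18.0] * 120
print(thermally_suitable(tmin, tmean))  # True for these toy inputs
```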
Making connections for life: an in vivo map of the yeast interactome.
Kast, Juergen
2008-10-01
Proteins are the true workhorses of any cell. To carry out specific tasks, they frequently bind other molecules in their surroundings. Due to their structural complexity and flexibility, the most diverse array of interactions is seen with other proteins. The different geometries and affinities available for such interactions typically bestow specific functions on proteins. Having available a map of protein-protein interactions is therefore of enormous importance for any researcher interested in gaining insight into biological systems at the level of cells and organisms. In a recent report, a novel approach has been employed that relies on the spontaneous folding of complementary enzyme fragments fused to two different proteins to test whether these interact in their actual cellular context [Tarassov et al., Science 320, 1465-1470 (2008)]. Genome-wide application of this protein-fragment complementation assay has resulted in the first map of the in vivo interactome of Saccharomyces cerevisiae. The current data show striking similarities but also significant differences to those obtained using other large-scale approaches for the same task. This warrants a general discussion of the current state of affairs of protein-protein interaction studies and foreseeable future trends, highlighting their significance for a variety of applications and their potential to revolutionize our understanding of the architecture and dynamics of biological systems.
Critical Care and Personalized or Precision Medicine: Who needs whom?
Sugeir, Shihab; Naylor, Stephen
2018-02-01
The current paradigm of modern healthcare is a reactive response to patient symptoms, subsequent diagnosis and corresponding treatment of the specific disease(s). This approach is predicated on methodologies first espoused by the Cnidean School of Medicine approximately 2,500 years ago. More recently, escalating healthcare costs and relatively poor disease treatment outcomes have fomented a rethink in how we carry out medical practices. This has led to the emergence of "P-Medicine" in the form of Personalized and Precision Medicine. The terms are used interchangeably, but in fact there are significant differences in the way they are implemented. The former relies on an "N-of-1" model whereas the latter uses a "1-in-N" model. Personalized Medicine is still in a fledgling and evolutionary phase and there has been much debate over its current status and future prospects. A confounding factor has been the sudden development of Precision Medicine, which has currently captured the imagination of policymakers responsible for modern healthcare systems. There is some confusion over the terms Personalized versus Precision Medicine. Here we attempt to define the key differences and working definitions of each P-Medicine approach, as well as a taxonomic relationship tree. Finally, we discuss the impact of Personalized and Precision Medicine on the practice of Critical Care Medicine (CCM). Practitioners of CCM have been participating in Personalized Medicine unknowingly, as they take the protocols of sepsis, mechanical ventilation, and daily awakening trials and apply them to each individual patient. However, the immediate next step for CCM should be an active development of Precision Medicine. This developmental process should break down the silos of modern medicine and create a multidisciplinary approach between clinicians and basic/translational scientists. Copyright © 2017 Elsevier Inc. All rights reserved.
Mol, Ben W; Bossuyt, Patrick M; Sunkara, Sesh K; Garcia Velasco, Juan A; Venetis, Christos; Sakkas, Denny; Lundin, Kersti; Simón, Carlos; Taylor, Hugh S; Wan, Robert; Longobardi, Salvatore; Cottell, Evelyn; D'Hooghe, Thomas
2018-06-01
Although most medical treatments are designed for the average patient with a one-size-fits-all approach, they may not benefit all. Better understanding of the function of genes, proteins, and metabolites, and of personal and environmental factors, has led to a call for personalized medicine. Personalized reproductive medicine is still in its infancy, without clear guidance on treatment aspects that could be personalized and on trial design to evaluate personalized treatment effect and benefit-harm balance. While the rationale for a personalized approach often relies on retrospective analyses of large observational studies or real-world data, solid evidence of superiority of a personalized approach will come from randomized trials comparing outcomes and safety between a personalized and a one-size-fits-all strategy. A more efficient, targeted randomized trial design may recruit only patients or couples for whom the personalized approach would differ from the previous, standard approach. Multiple monocenter studies using the same study protocol (allowing future meta-analysis) might reduce the major center effect associated with multicenter studies. In certain cases, single-arm observational studies can generate the necessary evidence for a personalized approach. This review describes each of the main segments of patient care in assisted reproductive technologies treatment, addressing which aspects could be personalized, emphasizing current evidence and relevant study design. Copyright © 2018 The Authors. Published by Elsevier Inc. All rights reserved.
The PNEI holistic approach in coloproctology.
Pescatori, M; Podzemny, V; Pescatori, L C; Dore, M P; Bassotti, G
2015-05-01
The psycho-neuroendocrine-immune approach relies on the concept of considering diseases from a holistic point of view: the various components (psyche, nervous system, endocrine system, and immune system) control the diseased organ/apparatus and in turn are influenced by a feedback mechanism. In this article, we will consider the psycho-neuroendocrine-immune approach to coloproctological disorders, by providing clinical cases and discussing them in light of this approach.
Federal Register 2010, 2011, 2012, 2013, 2014
2011-07-18
... received. Table of Contents I. Introduction A. Statutory Framework B. Consultations C. Approach to Drafting.... Generally B. Consistency With CFTC Approach IV. Paperwork Reduction Act A. Summary of Collections of... that may rely on security-based swaps to manage risk and reduce volatility. C. Approach to Drafting the...
Adaptive Role Playing Games: An Immersive Approach for Problem Based Learning
ERIC Educational Resources Information Center
Sancho, Pilar; Moreno-Ger, Pablo; Fuentes-Fernandez, Ruben; Fernandez-Manjon, Baltasar
2009-01-01
In this paper we present a general framework, called NUCLEO, for the application of socio-constructive educational approaches in higher education. The underlying pedagogical approach relies on an adaptation model in order to improve group dynamics, as this has been identified as one of the key features in the success of collaborative learning…
Anderson, Melinda C; Arehart, Kathryn H; Souza, Pamela E
2018-02-01
Current guidelines for adult hearing aid fittings recommend the use of a prescriptive fitting rationale with real-ear verification that considers the audiogram for the determination of frequency-specific gain and ratios for wide dynamic range compression. However, the guidelines lack recommendations for how other common signal-processing features (e.g., noise reduction, frequency lowering, directional microphones) should be considered during the provision of hearing aid fittings and fine-tunings for adult patients. The purpose of this survey was to identify how audiologists make clinical decisions regarding common signal-processing features for hearing aid provision in adults. An online survey was sent to audiologists across the United States. The 22 survey questions addressed four primary topics including demographics of the responding audiologists, factors affecting selection of hearing aid devices, the approaches used in the fitting of signal-processing features, and the strategies used in the fine-tuning of these features. A total of 251 audiologists who provide hearing aid fittings to adults completed the electronically distributed survey. The respondents worked in a variety of settings including private practice, physician offices, university clinics, and hospitals/medical centers. Data analysis was based on a qualitative analysis of the question responses. The survey results for each of the four topic areas (demographics, device selection, hearing aid fitting, and hearing aid fine-tuning) are summarized descriptively. Survey responses indicate that audiologists vary in the procedures they use in fitting and fine-tuning based on the specific feature, such that the approaches used for the fitting of frequency-specific gain differ from other types of features (i.e., compression time constants, frequency lowering parameters, noise reduction strength, directional microphones, feedback management). Audiologists commonly rely on prescriptive fitting formulas and probe microphone measures for the fitting of frequency-specific gain and rely on manufacturers' default settings and recommendations for both the initial fitting and the fine-tuning of signal-processing features other than frequency-specific gain. The survey results are consistent with a lack of published protocols and guidelines for fitting and adjusting signal-processing features beyond frequency-specific gain. To streamline current practice, a transparent evidence-based tool that enables clinicians to prescribe the setting of other features from individual patient characteristics would be desirable. American Academy of Audiology
Options for veterinary drug analysis using mass spectrometry.
Le Bizec, Bruno; Pinel, Gaud; Antignac, Jean-Philippe
2009-11-13
Several classes of chemical compounds, exhibiting many different chemical properties, are classified under the generic term of "veterinary drugs", among which are the antimicrobial medicines such as antibiotics or dyes, and drugs exhibiting growth-promoting properties like steroids, beta-agonist compounds, thyrostats or growth hormones. For food safety purposes, the use of these substances in animal breeding has been subject to strict regulation within the European Union for more than 15 years. Systems of control have therefore been set up within the same period of time to ensure compliance with the regulation. The current strategy relies on targeted analytical approaches focusing on the detection of residues of the administered compounds or their metabolites in different kinds of feed, food or biological matrices. While screening methods, which provide rapid discrimination between compliant and suspect samples, may be based on several techniques such as immunoassays or mass spectrometry, confirmatory methods mainly rely on the latter, which provides adequate specificity and sensitivity for unambiguous identification of the target analytes in biological matrices at trace level. The present article reviews the main mass spectrometric strategies, from the very first, nonetheless still efficient, single MS and multidimensional and high-resolution MS through to advanced isotope ratio MS. Several applications in the field of residue analysis illustrate each of these approaches and focus on the balance between issues related to the compounds of interest (chemistry, matrix, concentration, ...) and the wide range of mass spectrometric technical possibilities, from the choice of the ionization conditions (EI, NCI, PCI, reagent gases, ESI+, ESI-), to the mass analyzers (single quadrupole, triple quadrupole, ion traps, time-of-flight, magnetic sectors, isotope ratio mass spectrometer) and corresponding acquisition modes (full scan, LR-SIM, HR-SIM, SRM, precursor scan, ...). All the displayed strategies, from sample preparation to MS analysis, including potential derivatization steps and chromatographic separation parameters, are discussed in that context. Besides the advantages of each strategy, the main issues associated with such MS approaches are discussed, with an emphasis not only on such critical points as ion suppression and resolution, but also on the adequacy of the current regulation regarding the evolution of the technology. Finally, future trends which may lead to strong and positive impacts in the field of residue analysis are presented, including the latest developments and improvements in chromatography and software dedicated to signal acquisition and data analysis.
NASA Astrophysics Data System (ADS)
Scheele, C. J.; Huang, Q.
2016-12-01
In the past decade, the rise in social media has led to the development of a vast number of social media services and applications. Disaster management represents one of such applications leveraging massive data generated for event detection, response, and recovery. In order to find disaster relevant social media data, current approaches utilize natural language processing (NLP) methods based on keywords, or machine learning algorithms relying on text only. However, these approaches cannot be perfectly accurate due to the variability and uncertainty in language used on social media. To improve current methods, the enhanced text-mining framework is proposed to incorporate location information from social media and authoritative remote sensing datasets for detecting disaster relevant social media posts, which are determined by assessing the textual content using common text mining methods and how the post relates spatiotemporally to the disaster event. To assess the framework, geo-tagged Tweets were collected for three different spatial and temporal disaster events: hurricane, flood, and tornado. Remote sensing data and products for each event were then collected using RealEarthTM. Both Naive Bayes and Logistic Regression classifiers were used to compare the accuracy within the enhanced text-mining framework. Finally, the accuracies from the enhanced text-mining framework were compared to the current text-only methods for each of the case study disaster events. The results from this study address the need for more authoritative data when using social media in disaster management applications.
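A toy sketch of the text-only baseline the framework improves on, using the two classifiers named above; the tweets and labels are invented, and the spatiotemporal matching against remote sensing footprints is not shown.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

# Classify posts as disaster-relevant (1) or not (0) from text alone.
tweets = [
    "flood waters rising near the river, roads closed",
    "great coffee this morning",
    "tornado touched down west of town, debris everywhere",
    "watching a movie tonight",
]
relevant = [1, 0, 1, 0]

for clf in (MultinomialNB(), LogisticRegression(max_iter=1000)):
    model = make_pipeline(TfidfVectorizer(), clf)
    model.fit(tweets, relevant)
    print(type(clf).__name__, model.predict(["storm surge flooding downtown"]))
```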
Fogleman, Sarah; Santana, Casey; Bishop, Casey; Miller, Alyssa; Capco, David G
2016-01-01
Thousands of mothers are at risk of transmitting mitochondrial diseases to their offspring each year, with the most severe form of these diseases being fatal [1]. With no cure, transmission prevention is the only current hope for decreasing the disease incidence. Current methods of prevention rely on low mutant maternal mitochondrial DNA levels, while those with levels close to or above threshold (>60%) are still at a very high risk of transmission [2]. Two novel approaches may offer hope for preventing and treating mitochondrial disease: mitochondrial replacement therapy, and CRISPR/Cas9. Mitochondrial replacement therapy has emerged as a promising tool that has the potential to prevent transmission in patients with higher mutant mitochondrial loads. This method is the subject of many ethical concerns due to its use of a donor embryo to transplant the patient’s nuclear DNA; however, it has ultimately been approved for use in the United Kingdom and was recently declared ethically permissible by the FDA. The leading-edge CRISPR/Cas9 technology exploits the principles of bacterial immune function to target and remove specific sequences of mutated DNA. This may have potential in treating individuals with disease caused by mutant mitochondrial DNA. As the technology progresses, it is important that the ethical considerations herein emerge and become more established. The purpose of this review is to discuss current research surrounding the procedure and efficacy of the techniques, compare the ethical concerns of each approach, and look into the future of mitochondrial gene replacement therapy. PMID:27725916
DOE Office of Scientific and Technical Information (OSTI.GOV)
Schnack, Dalton D.
Final technical report for research performed by Dr. Thomas G. Jenkins in collaboration with Professor Dalton D. Schnack on SciDAC Cooperative Agreement: Center for Wave Interactions with Magnetohydrodynamics, DE-FC02-06ER54899, for the period of 8/15/06 - 8/14/11. This report centers on the Slow MHD physics campaign work performed by Dr. Jenkins while at UW-Madison and then at Tech-X Corporation. To make progress on the problem of how RF-induced currents affect magnetic island evolution in toroidal plasmas, a set of research approaches is outlined. Three approaches can be addressed in parallel. These are: (1) analytically prescribe an additional term in Ohm's law to model the effect of localized ECCD current drive; (2) introduce an additional evolution equation for the Ohm's law source term, establishing an RF source 'box' where information from the RF code couples to the fluid evolution; and (3) carry out a more rigorous analytic calculation treating the additional RF terms in a closure problem. These approaches rely on reinvigorating the computational modeling of resistive and neoclassical tearing modes with present-day versions of the numerical tools. For the RF community, the relevant action item is that RF ray-tracing codes need to be modified so that general three-dimensional spatial information can be obtained. Further, interface efforts between the two codes require work, as does an assessment of the numerical stability properties of the procedures to be used.
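One common schematic way to write approach (1), an Ohm's law carrying an analytically prescribed RF-driven current source; this is a generic form, not necessarily the exact closure adopted in the report.

```latex
% Schematic only: the ECCD contribution J_RF is subtracted from the total
% current before the resistive term is applied.
\mathbf{E} + \mathbf{v}\times\mathbf{B} \;=\; \eta\,\bigl(\mathbf{J} - \mathbf{J}_{\mathrm{RF}}\bigr)
```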
2011-01-01
Measuring forest degradation and related forest carbon stock changes is more challenging than measuring deforestation since degradation implies changes in the structure of the forest and does not entail a change in land use, making it less easily detectable through remote sensing. Although we anticipate the use of the IPCC guidance under the United Nations Framework Convention on Climate Change (UNFCCC), there is no single method for monitoring forest degradation for the case of REDD+ policy. In this review paper we highlight that the choice depends upon a number of factors including the type of degradation, available historical data, capacities and resources, and the potentials and limitations of various measurement and monitoring approaches. Current degradation rates can be measured through field data (i.e. multi-date national forest inventories and permanent sample plot data, commercial forestry data sets, proxy data from domestic markets) and/or remote sensing data (i.e. direct mapping of canopy and forest structural changes or indirect mapping through modelling approaches), with the combination of techniques providing the best options. Developing countries frequently lack consistent historical field data for assessing past forest degradation, and so must rely more on remote sensing approaches mixed with current field assessments of carbon stock changes. Historical degradation estimates will have larger uncertainties as it will be difficult to determine their accuracy. However, improving monitoring capacities for systematic forest degradation estimates today will help reduce uncertainties even for historical estimates. PMID:22115360
Towards a Framework for Developing Semantic Relatedness Reference Standards
Pakhomov, Serguei V.S.; Pedersen, Ted; McInnes, Bridget; Melton, Genevieve B.; Ruggieri, Alexander; Chute, Christopher G.
2010-01-01
Our objective is to develop a framework for creating reference standards for functional testing of computerized measures of semantic relatedness. Currently, research on computerized approaches to semantic relatedness between biomedical concepts relies on reference standards created for specific purposes using a variety of methods for their analysis. In most cases, these reference standards are not publicly available and the published information provided in manuscripts that evaluate computerized semantic relatedness measurement approaches is not sufficient to reproduce the results. Our proposed framework is based on the experiences of the medical informatics and computational linguistics communities and addresses practical and theoretical issues with creating reference standards for semantic relatedness. We demonstrate the use of the framework on a pilot set of 101 medical term pairs rated for semantic relatedness by 13 medical coding experts. While the reliability of this particular reference standard is in the “moderate” range, we show that using clustering and factor analyses offers a data-driven approach to finding systematic differences among raters and identifying groups of potential outliers. We test two ontology-based measures of relatedness and provide both the reference standard containing individual ratings and the R program used to analyze the ratings as open-source. Currently, these resources are intended to be used to reproduce and compare results of studies involving computerized measures of semantic relatedness. Our framework may be extended to the development of reference standards in other research areas in medical informatics including automatic classification, information retrieval from medical records and vocabulary/ontology development. PMID:21044697
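A sketch of the rater-screening idea described above, assuming synthetic ratings (rows are raters, columns are term pairs) rather than the 13-expert data set; it flags annotators whose ratings correlate poorly with everyone else's.

```python
import numpy as np

rng = np.random.default_rng(0)
consensus = rng.uniform(0, 10, size=40)            # latent "true" relatedness
ratings = consensus + rng.normal(0, 1.0, size=(5, 40))
ratings[4] = rng.uniform(0, 10, size=40)           # one rater answering at random

corr = np.corrcoef(ratings)                        # rater-by-rater correlations
mean_agreement = (corr.sum(axis=1) - 1.0) / (corr.shape[0] - 1)
for i, m in enumerate(mean_agreement):
    flag = "  <- potential outlier" if m < 0.5 else ""
    print(f"rater {i}: mean r = {m:.2f}{flag}")
```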
One commonly used approach to CSO pollution abatement is to rely on a storm-event based design of storage-tank volume to capture CSO for pump-back and/or bleed-back (gravity flow) to the existing WWTP for treatment. However, this approach may not be by itself the most economical...
Traditionally, human health risk assessments have relied on qualitative approaches for hazard identification, often using the Hill criteria and weight of evidence determinations to integrate data from multiple studies. Recently, the National Research Council has recommended the ...
Use of risk quotient and probabilistic approaches to assess risks of pesticides to birds
When conducting ecological risk assessments for pesticides, the United States Environmental Protection Agency typically relies upon the risk quotient (RQ). This approach is intended to be conservative in nature, making assumptions related to exposure and effects that are intended...
DOT National Transportation Integrated Search
2017-12-01
Traditionally, highway agencies relied mainly on a man-entry approach for assessing in-service conditions of their culverts. This direct approach left many drainage structures inaccessible and uninspected. This is because a large number of drain...
Nutrient Use Efficiency in Bioenergy Cropping Systems: Critical Research Questions
USDA-ARS?s Scientific Manuscript database
Current U.S. plans for energy security rely on converting large areas of cropland from food to biofuel production. Additionally, lands currently considered too marginal for intensive food production may be considered suitable for biofuels production; predominant cropping systems may shift to more va...
Bring It to the Pitch: Combining Video and Movement Data to Enhance Team Sport Analysis.
Stein, Manuel; Janetzko, Halldor; Lamprecht, Andreas; Breitkreutz, Thorsten; Zimmermann, Philipp; Goldlucke, Bastian; Schreck, Tobias; Andrienko, Gennady; Grossniklaus, Michael; Keim, Daniel A
2018-01-01
Analysts in professional team sport regularly perform analysis to gain strategic and tactical insights into player and team behavior. Goals of team sport analysis regularly include identification of weaknesses of opposing teams, or assessing performance and improvement potential of a coached team. Current analysis workflows are typically based on the analysis of team videos. Also, analysts can rely on techniques from Information Visualization, to depict e.g., player or ball trajectories. However, video analysis is typically a time-consuming process, where the analyst needs to memorize and annotate scenes. In contrast, visualization typically relies on an abstract data model, often using abstract visual mappings, and is not directly linked to the observed movement context anymore. We propose a visual analytics system that tightly integrates team sport video recordings with abstract visualization of underlying trajectory data. We apply appropriate computer vision techniques to extract trajectory data from video input. Furthermore, we apply advanced trajectory and movement analysis techniques to derive relevant team sport analytic measures for region, event and player analysis in the case of soccer analysis. Our system seamlessly integrates video and visualization modalities, enabling analysts to draw on the advantages of both analysis forms. Several expert studies conducted with team sport analysts indicate the effectiveness of our integrated approach.
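A minimal sketch of the kind of movement measure derived from extracted trajectories, such as per-player distance covered and average speed; the coordinates and frame rate below are invented.

```python
import numpy as np

def distance_and_speed(xy, fps=25.0):
    """Total distance (m) and mean speed (m/s) from sampled pitch positions."""
    xy = np.asarray(xy, dtype=float)             # shape (n_frames, 2), metres
    step = np.linalg.norm(np.diff(xy, axis=0), axis=1)
    total = step.sum()
    duration = (len(xy) - 1) / fps
    return total, total / duration

# Toy track for one player over four frames
track = [(0.0, 0.0), (0.2, 0.1), (0.5, 0.1), (0.9, 0.3)]
print(distance_and_speed(track, fps=25.0))
```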
Single and Multiple Microphone Noise Reduction Strategies in Cochlear Implants
Azimi, Behnam; Hu, Yi; Friedland, David R.
2012-01-01
To restore hearing sensation, cochlear implants deliver electrical pulses to the auditory nerve by relying on sophisticated signal processing algorithms that convert acoustic inputs to electrical stimuli. Although individuals fitted with cochlear implants perform well in quiet, in the presence of background noise, the speech intelligibility of cochlear implant listeners is more susceptible to background noise than that of normal hearing listeners. Traditionally, to increase performance in noise, single-microphone noise reduction strategies have been used. More recently, a number of approaches have suggested that speech intelligibility in noise can be improved further by making use of two or more microphones, instead. Processing strategies based on multiple microphones can better exploit the spatial diversity of speech and noise because such strategies rely mostly on spatial information about the relative position of competing sound sources. In this article, we identify and elucidate the most significant theoretical aspects that underpin single- and multi-microphone noise reduction strategies for cochlear implants. More analytically, we focus on strategies of both types that have been shown to be promising for use in current-generation implant devices. We present data from past and more recent studies, and furthermore we outline the direction that future research in the area of noise reduction for cochlear implants could follow. PMID:22923425
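A hedged sketch of the simplest multi-microphone idea mentioned above, a two-microphone delay-and-sum beamformer; the geometry, sampling rate, and signals are toy values, and practical implant processing is considerably more elaborate.

```python
import numpy as np

def delay_and_sum(mic1, mic2, delay_samples):
    """Shift mic2 by an integer sample delay and average with mic1, so a
    source from the steered direction adds coherently while uncorrelated
    noise partially cancels."""
    aligned = np.roll(mic2, delay_samples)
    return 0.5 * (mic1 + aligned)

fs = 16000
t = np.arange(0, 0.01, 1 / fs)
target = np.sin(2 * np.pi * 440 * t)
delay = 3                                   # inter-microphone travel time, samples
mic1 = target + 0.3 * np.random.randn(t.size)
mic2 = np.roll(target, -delay) + 0.3 * np.random.randn(t.size)
out = delay_and_sum(mic1, mic2, delay)
# Residual noise drops by roughly sqrt(2) after averaging the two channels.
print(np.std(mic1 - target), np.std(out - target))
```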
Byrne, Greg; Feighery, Conleth F
2015-01-01
Historically the diagnosis of celiac disease has relied upon clinical, serological, and histological evidence. In recent years the use of sensitive serological methods has meant an increase in the diagnosis of celiac disease. The heterogeneous nature of the disorder presents a challenge in the study and diagnosis of the disease with patients varying from subclinical or latent disease to patients with overt symptoms. Furthermore the related gluten-sensitive disease dermatitis herpetiformis, while distinct in some respects, shares clinical and serological features with celiac disease. Here we summarize current best practice for the diagnosis of celiac disease and briefly discuss newer approaches. The advent of next-generation assays for diagnosis and newer clinical protocols may result in more sensitive screening and ultimately the possible replacement of the intestinal biopsy as the gold standard for celiac disease diagnosis.
Importance of Calibration/Validation Traceability for Multi-Sensor Imaging Spectrometry Applications
NASA Technical Reports Server (NTRS)
Thome, K.
2017-01-01
Knowledge of calibration traceability is essential for ensuring the quality of data products relying on multiple sensors, and this is especially true for imaging spectrometers. The current work discusses the expected impact that imaging spectrometers have in ensuring radiometric traceability for both multispectral and hyperspectral products. The Climate Absolute Radiance and Refractivity Observatory Pathfinder mission is used to show the role that high-accuracy imaging spectrometers can play in understanding test sites used for vicarious calibration of sensors. The associated Solar, Lunar for Absolute Reflectance Imaging Spectroradiometer calibration demonstration system is used to illustrate recent advances in laboratory radiometric calibration approaches that will allow the use of imaging spectrometers as calibration standards as well as ensure the consistency of the multiple imaging spectrometers expected to be on orbit in the next decade.
Allergic reactions to Anisakis found in fish.
Nieuwenhuizen, Natalie E; Lopata, Andreas L
2014-08-01
The food-borne parasite Anisakis is an important hidden food allergen. Anisakis is a parasitic nematode which has a third-stage larval form that infects mainly fish, and ingestion of contaminated seafood can result in severe allergic reactions. Symptoms experienced due to exposure to this parasite include gastrointestinal disorders, urticaria, dermatitis, asthma and even anaphylaxis. Accurate prevalence data of allergic sensitisation to Anisakis are difficult to estimate due to the lack of well-designed population-based studies. Current diagnostic approaches rely on the detection of serum IgE antibodies to allergenic proteins, which however demonstrate considerable immunological cross-reactivity to other invertebrate allergens. While exposure to this parasite seems to increase due to the increasing consumption of seafood worldwide, the immunology of infection and allergic sensitization is not fully understood.
MaxSynBio - Avenues towards creating cells from the bottom up.
Schwille, Petra; Spatz, Joachim; Landfester, Katharina; Bodenschatz, Eberhard; Herminghaus, Stephan; Sourjik, Victor; Erb, Tobias; Bastiaens, Philippe; Lipowsky, Reinhard; Hyman, Anthony; Dabrock, Peter; Baret, Jean-Christophe; Vidakovic-Koch, Tanja; Bieling, Peter; Dimova, Rumiana; Mutschler, Hannes; Robinson, Tom; Tang, Dora; Wegner, Seraphine; Sundmacher, Kai
2018-05-11
A large Max Planck-based German research consortium ('MaxSynBio') was formed to investigate living systems from a fundamental perspective. The research program of MaxSynBio relies solely on the bottom-up approach to Synthetic Biology. MaxSynBio focuses on the detailed analysis and understanding of essential processes of life, via their modular reconstitution in minimal synthetic systems. The ultimate goal is to construct a basic living unit entirely from non-living components. The fundamental insights gained from the activities in MaxSynBio can eventually be utilized for establishing a new generation of biotechnological processes, which would be based on synthetic cell constructs that replace natural cells currently used in conventional biotechnology. © 2018 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Forecasting Flare Activity Using Deep Convolutional Neural Networks
NASA Astrophysics Data System (ADS)
Hernandez, T.
2017-12-01
Current operational flare forecasting relies on human morphological analysis of active regions and the persistence of solar flare activity through time (i.e. that the Sun will continue to do what it is doing right now: flaring or remaining calm). In this talk we present the results of applying deep Convolutional Neural Networks (CNNs) to the problem of solar flare forecasting. CNNs operate by training a set of tunable spatial filters that, in combination with neural layer interconnectivity, allow CNNs to automatically identify significant spatial structures predictive for classification and regression problems. We will start by discussing the applicability and success rate of the approach, the advantages it has over non-automated forecasts, and how mining our trained neural network provides a fresh look into the mechanisms behind magnetic energy storage and release.
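A toy sketch of a CNN of the general kind described in the talk, mapping an image to a binary flare/no-flare probability; the input size, layer widths, and random "magnetogram" batch are placeholders, not the presented architecture.

```python
import torch
from torch import nn

class FlareCNN(nn.Module):
    """Two convolutional blocks feeding a single logit for P(flare)."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 8, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(8, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.head = nn.Linear(16 * 16 * 16, 1)    # sized for 64x64 inputs

    def forward(self, x):
        x = self.features(x)
        return self.head(x.flatten(1))            # logit; sigmoid gives P(flare)

model = FlareCNN()
fake_magnetograms = torch.randn(4, 1, 64, 64)      # stand-in input batch
print(torch.sigmoid(model(fake_magnetograms)).shape)  # torch.Size([4, 1])
```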
The Development of Eating Behavior - Biology and Context
Gahagan, Sheila
2012-01-01
Eating is necessary for survival, gives great pleasure and can be perturbed leading to undernutrition, overnutrition and eating disorders. The development of feeding in humans relies on complex interplay between homeostatic mechanisms; neural reward systems; and child motor, sensory and socio-emotional capability. Furthermore, parenting, social influences and the food environment influence the development of eating behavior. The rapid expansion of new knowledge in this field, from basic science to clinical and community-based research, is expected to lead to urgently needed research in support of effective, evidence-based prevention and treatment strategies for undernutrition, overnutrition and eating disorders in early childhood. Using a biopsychosocial approach, this review covers current knowledge of the development of eating behavior from the brain to the individual child, taking into account important contextual influences. PMID:22472944
Pereira, Suzanne; Névéol, Aurélie; Kerdelhué, Gaétan; Serrot, Elisabeth; Joubert, Michel; Darmoni, Stéfan J
2008-11-06
To assist with the development of a French online quality-controlled health gateway (CISMeF), an automatic indexing tool assigning MeSH descriptors to medical text in French was created. The French Multi-Terminology Indexer (F-MTI) relies on a multi-terminology approach involving four prominent medical terminologies and the mappings between them. In this paper, we compare lemmatization and stemming as methods to process French medical text for indexing. We also evaluate the multi-terminology approach implemented in F-MTI. The indexing strategies were assessed on a corpus of 18,814 resources indexed manually. There is little difference in the indexing performance when lemmatization or stemming is used. However, the multi-terminology approach outperforms indexing relying on a single terminology in terms of recall. F-MTI will soon be used in the CISMeF production environment and in a Health MultiTerminology Server in French.
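A quick illustration of the stemming half of the comparison, using NLTK's French Snowball stemmer on arbitrary French medical words; lemmatization, the other strategy tested, would instead map inflected forms to dictionary lemmas and is not shown here.

```python
from nltk.stem.snowball import FrenchStemmer

# Strip French inflectional endings before terminology lookup.
stemmer = FrenchStemmer()
for term in ["infections", "pulmonaires", "chroniques", "traitement"]:
    print(term, "->", stemmer.stem(term))
```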
Rawstron, Andy C; Kreuzer, Karl-Anton; Soosapilla, Asha; Spacek, Martin; Stehlikova, Olga; Gambell, Peter; McIver-Brown, Neil; Villamor, Neus; Psarra, Katherina; Arroz, Maria; Milani, Raffaella; de la Serna, Javier; Cedena, M Teresa; Jaksic, Ozren; Nomdedeu, Josep; Moreno, Carol; Rigolin, Gian Matteo; Cuneo, Antonio; Johansen, Preben; Johnsen, Hans E; Rosenquist, Richard; Niemann, Carsten Utoft; Kern, Wolfgang; Westerman, David; Trneny, Marek; Mulligan, Stephen; Doubek, Michael; Pospisilova, Sarka; Hillmen, Peter; Oscier, David; Hallek, Michael; Ghia, Paolo; Montserrat, Emili
2018-01-01
The diagnostic criteria for CLL rely on morphology and immunophenotype. Current approaches have limitations affecting reproducibility and there is no consensus on the role of new markers. The aim of this project was to identify reproducible criteria and consensus on markers recommended for the diagnosis of CLL. ERIC/ESCCA members classified 14 of 35 potential markers as "required" or "recommended" for CLL diagnosis, consensus being defined as >75% and >50% agreement, respectively. An approach to validate "required" markers using normal peripheral blood was developed. Responses were received from 150 participants with a diagnostic workload >20 CLL cases per week in 23/150 (15%), 5-20 in 82/150 (55%), and <5 cases per week in 45/150 (30%). The consensus for "required" diagnostic markers included: CD19, CD5, CD20, CD23, Kappa, and Lambda. "Recommended" markers potentially useful for differential diagnosis were: CD43, CD79b, CD81, CD200, CD10, and ROR1. Reproducible criteria for component reagents were assessed retrospectively in 14,643 cases from 13 different centers and showed >97% concordance with current approaches. A pilot study to validate staining quality was completed in 11 centers. Markers considered as "required" for the diagnosis of CLL by the participants in this study (CD19, CD5, CD20, CD23, Kappa, and Lambda) are consistent with current diagnostic criteria and practice. Importantly, a reproducible approach to validate and apply these markers in individual laboratories has been identified. Finally, a consensus "recommended" panel of markers to refine diagnosis in borderline cases (CD43, CD79b, CD81, CD200, CD10, and ROR1) has been defined and will be prospectively evaluated. © 2017 International Clinical Cytometry Society. © 2017 The Authors. Cytometry Part B: Clinical Cytometry published by Wiley Periodicals, Inc. on behalf of International Clinical Cytometry Society.
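A small sketch of the consensus rule stated above (>75% agreement for "required", >50% for "recommended"); the vote percentages in the example are invented, not the survey's actual tallies.

```python
def classify_marker(percent_agreement):
    """Map an agreement percentage to the consensus category."""
    if percent_agreement > 75:
        return "required"
    if percent_agreement > 50:
        return "recommended"
    return "not retained"

# Hypothetical agreement levels for three markers
for marker, pct in {"CD19": 98, "CD200": 62, "CD38": 41}.items():
    print(marker, classify_marker(pct))
```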
Medical home implementation: a sensemaking taxonomy of hard and soft best practices.
Hoff, Timothy
2013-12-01
The patient-centered medical home (PCMH) model of care is currently a central focus of U.S. health system reform, but less is known about the model's implementation in the practice of everyday primary care. Understanding its implementation is key to ensuring the approach's continued support and success nationally. This article addresses this gap through a qualitative examination of the best practices associated with PCMH implementation for older adult patients in primary care. I used a multicase, comparative study design that relied on a sensemaking approach and fifty-one in-depth interviews with physicians, nurses, and clinic support staff working in six accredited medical homes located in various geographic areas. My emphasis was on gaining descriptive insights into the staff's experiences delivering medical home care to older adult patients in particular and then analyzing how these experiences shaped the staff's thinking, learning, and future actions in implementing medical home care. I found two distinct taxonomies of implementation best practices, which I labeled "hard" and "soft" because of their differing emphasis and content. Hard implementation practices are normative activities and structural interventions that align well with existing national standards for medical home care. Soft best practices are more relational in nature and derive from the existing practice social structure and everyday interactions between staff and patients. Currently, external stakeholders are less apt to recognize, encourage, or incentivize soft best practices. The results suggest that there may be no standardized, one-size-fits-all approach to making medical home implementation work, particularly for special patient populations such as the elderly. My study also raises the issue of broadening current PCMH assessments and reward systems to include implementation practices that contain heavy social and relational components of care, in addition to the emphasis now placed on building structural supports for medical home work. Further study of these softer implementation practices and a continued call for qualitative methodological approaches that gain insight into everyday practice behavior are warranted. © 2013 Milbank Memorial Fund.
Novel concept for driving the linear compressor of a micro-miniature split Stirling cryogenic cooler
NASA Astrophysics Data System (ADS)
Maron, V.; Veprik, A.; Finkelstein, L.; Vilenchik, H.; Ziv, I.; Pundak, N.
2009-05-01
New methods of carrying out homeland security and antiterrorist operations call for the development of a new generation of mechanically cooled, portable, battery-powered infrared imagers, relying on micro-miniature Stirling cryogenic coolers of rotary or linear types. Since split Stirling linearly driven micro-miniature cryogenic coolers have inherently longer life spans, low vibration export and better aural stealth as compared to their rotary driven rivals, they are more suitable for the above applications. The performance of such cryogenic coolers depends strongly on the efficacy of their electronic drivers. In a traditional approach, the PWM power electronics produce a fixed-frequency tonal driving voltage/current, the magnitude of which is modulated via a PID control law so as to maintain the desired focal plane array temperature. The disadvantage of such drivers is that they draw high ripple current from the system's power bus. This results in the need for an oversized DC power supply (battery packs) and power electronic components, low efficiency due to excessive conductive losses, and high residual electromagnetic interference, which in turn degrades the performance of other systems connected to the same power bus. Without either an active line filter or large and heavy passive filtering, other electronics cannot be powered from the same power bus, unless they incorporate heavy filtering at their inputs. The authors present the results of a feasibility study towards developing a novel "pumping" driver consuming essentially constant instant battery power/current without making use of an active or passive filter. In the tested setup, the driver relies on a bidirectional controllable bridge, invertible with the driving frequency, and a fast regulated DC/DC converter which maintains a constant level of current consumed from the DC power supply and thus operates in input current control mode. From the experimental results, the steady-state power consumed by the linear compressor remains the same as compared with the traditional sine wave driver, the voltage and current drawn from the battery pack are essentially free of low frequency ripple (this without use of any kind of filtering), and the overall coefficient of performance of the driver is in excess of 94% over the entire working range of supply voltages. Such a driver is free of a sine-forming PWM stage and has reduced power peaks in all power-conversion components.
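A schematic sketch of the conventional control law described above: a PID update converting focal-plane-temperature error into a drive-amplitude command; the gains, temperatures, and time step are illustrative only.

```python
def pid_update(error_k, integral, prev_error, kp=2.0, ki=0.5, kd=0.1, dt=0.1):
    """One PID step: returns the drive-amplitude command and updated integral."""
    integral += error_k * dt
    derivative = (error_k - prev_error) / dt
    command = kp * error_k + ki * integral + kd * derivative
    return command, integral

setpoint_k, measured_k = 77.0, 80.5          # kelvin, invented values
command, integral = pid_update(measured_k - setpoint_k, integral=0.0, prev_error=4.0)
print(round(command, 2))                     # larger error -> larger drive amplitude
```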
NASA Astrophysics Data System (ADS)
Isotta Cristofori, Elena; Demarchi, Alessandro; Facello, Anna; Cámaro, Walther; Hermosilla, Fernando; López, Jaime
2016-04-01
The study and validation of tidal current patterns relies on the combination of several data sources such as numerical weather prediction models, hydrodynamic models, weather stations, current drifters and remote sensing observations. The assessment of the accuracy and reliability of the produced patterns and the communication of results, including an easy to understand visualization of data, is crucial for a variety of stakeholders including decision-makers. The wide availability of geospatial equipment such as GPS, current drifters and aerial photogrammetry allows data to be collected in the field using mobile and portable devices with relatively limited effort in terms of time and economic resources. These real-time measurements are essential in order to validate the models and specifically to assess the skill of the model during critical environmental conditions. Moreover, the considerable development in remote sensing technologies, cartographic services and GPS applications has enabled the creation of Geographic Information Systems (GIS) capable of storing, analyzing, managing and integrating spatial or geographical information with hydro-meteorological data. This contribution of information and geospatial technologies can benefit many decision-makers, including high-level athletes. While the numerical approach, commonly used to validate models with in-situ data, is more familiar to scientific users, high-level sport users are not familiar with numerical representations of data. Therefore the integration of data collected in the field into a GIS allows the analysis to be visualized immediately on geographic maps. This visualization represents a particularly effective way to communicate current-pattern assessment results and uncertainty in information, leading to increased confidence in the forecast. The aim of this paper is to present the methodology set up in collaboration with the Austrian Sailing Federation for the study of tidal current patterns of Guanabara Bay, venue for the sailing competitions of the Rio 2016 Olympic Games. The methodology relies on the integration of a consistent amount of data collected in the field, hydrodynamic model output, cartography and "key-signs" visible on the water into a GIS, proving to be particularly useful to simplify the final information, to help the learning process and to improve the decision making.
Chemical Calculations; An Audiotutorial Approach.
ERIC Educational Resources Information Center
Lower, Stephen K.
An audiotutorial approach to problem-solving in college chemistry relying upon audio tapes is available. The program is designed to increase the teacher's effectiveness by providing individualized attention to student difficulties related to problem-solving. Problem solutions are recorded on audio tapes (designed for use with Sony TC-160 cassettes…
Neighborhood size of training data influences soil map disaggregation
USDA-ARS?s Scientific Manuscript database
Soil class mapping relies on the ability of sample locations to represent portions of the landscape with similar soil types; however, most digital soil mapping (DSM) approaches intersect sample locations with one raster pixel per covariate layer regardless of pixel size. This approach does not take ...
An Alternative Approach to Zero Tolerance Policies.
ERIC Educational Resources Information Center
Ilg, Timothy J.; Russo, Charles J.
2001-01-01
School officials should adopt no-tolerance policies that require educators' discretion in punishing misbehaving students (based on due process and fundamental fairness), rather than relying on the zero-tolerance approach, which fails to differentiate among different levels of offenses. Even disruptive students deserve due process and appropriate…
Schwartz, Andrew J.; Walton, Courtney L.; Williams, Kelsey L.; Hieftje, Gary M.
2016-01-01
Modern “-omics” (e.g., proteomics, glycomics, metabolomics, etc.) analyses rely heavily on electrospray ionization and tandem mass spectrometry to determine the structural identity of target species. Unfortunately, these methods are limited to specialized mass spectrometry instrumentation. Here, a novel approach is described that enables ionization and controlled, tunable fragmentation of peptides at atmospheric pressure. In the new source, a direct-current plasma is sustained between a tapered metal rod and a flowing sample-containing solution. As the liquid stream contacts the electrical discharge, peptides from the solution are volatilized, ionized, and fragmented. At high discharge currents (e.g., 70 mA), electrospray-like spectra are observed, dominated by singly and doubly protonated molecular ions. At lower currents (35 mA), many peptides exhibit extensive fragmentation, with a-, b-, c-, x-, and y-type ion series present as well as complex fragments, such as d-type ions, not previously observed with atmospheric-pressure dissociation. Though the mechanism of fragmentation is currently unclear, observations indicate it could result from the interaction of peptides with gas-phase radicals or ultraviolet radiation generated within the plasma. PMID:28451101
Is it time to reassess current safety standards for glyphosate-based herbicides?
Vandenberg, Laura N; Blumberg, Bruce; Antoniou, Michael N; Benbrook, Charles M; Carroll, Lynn; Colborn, Theo; Everett, Lorne G; Hansen, Michael; Landrigan, Philip J; Lanphear, Bruce P; Mesnage, Robin; Vom Saal, Frederick S; Welshons, Wade V; Myers, John Peterson
2017-06-01
Use of glyphosate-based herbicides (GBHs) increased ∼100-fold from 1974 to 2014. Additional increases are expected due to widespread emergence of glyphosate-resistant weeds, increased application of GBHs, and preharvest uses of GBHs as desiccants. Current safety assessments rely heavily on studies conducted over 30 years ago. We have considered information on GBH use, exposures, mechanisms of action, toxicity and epidemiology. Human exposures to glyphosate are rising, and a number of in vitro and in vivo studies challenge the basis for the current safety assessment of glyphosate and GBHs. We conclude that current safety standards for GBHs are outdated and may fail to protect public health or the environment. To improve safety standards, the following are urgently needed: (1) human biomonitoring for glyphosate and its metabolites; (2) prioritisation of glyphosate and GBHs for hazard assessments, including toxicological studies that use state-of-the-art approaches; (3) epidemiological studies, especially of occupationally exposed agricultural workers, pregnant women and their children and (4) evaluations of GBHs in commercially used formulations, recognising that herbicide mixtures likely have effects that are not predicted by studying glyphosate alone. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/.
Ethier, Jean-François; Dameron, Olivier; Curcin, Vasa; McGilchrist, Mark M; Verheij, Robert A; Arvanitis, Theodoros N; Taweel, Adel; Delaney, Brendan C; Burgun, Anita
2013-01-01
Objective: Biomedical research increasingly relies on the integration of information from multiple heterogeneous data sources. Despite the fact that structural and terminological aspects of interoperability are interdependent and rely on a common set of requirements, current efforts typically address them in isolation. We propose a unified ontology-based knowledge framework to facilitate interoperability between heterogeneous sources, and investigate if using the LexEVS terminology server is a viable implementation method. Materials and methods: We developed a framework based on an ontology, the general information model (GIM), to unify structural models and terminologies, together with relevant mapping sets. This allowed a uniform access to these resources within LexEVS to facilitate interoperability by various components and data sources from implementing architectures. Results: Our unified framework has been tested in the context of the EU Framework Program 7 TRANSFoRm project, where it was used to achieve data integration in a retrospective diabetes cohort study. The GIM was successfully instantiated in TRANSFoRm as the clinical data integration model, and necessary mappings were created to support effective information retrieval for software tools in the project. Conclusions: We present a novel, unifying approach to address interoperability challenges in heterogeneous data sources, by representing structural and semantic models in one framework. Systems using this architecture can rely solely on the GIM that abstracts over both the structure and coding. Information models, terminologies and mappings are all stored in LexEVS and can be accessed in a uniform manner (implementing the HL7 CTS2 service functional model). The system is flexible and should reduce the effort needed from data sources personnel for implementing and managing the integration. PMID:23571850
NASA Astrophysics Data System (ADS)
Redonnet, S.; Ben Khelil, S.; Bulté, J.; Cunha, G.
2017-09-01
With the objective of aircraft noise mitigation, we here address the numerical characterization of the aeroacoustics of a simplified nose landing gear (NLG), through the use of advanced simulation and signal processing techniques. To this end, the NLG noise physics is first simulated through an advanced hybrid approach, which relies on Computational Fluid Dynamics (CFD) and Computational AeroAcoustics (CAA) calculations. Compared to more traditional hybrid methods (e.g. those relying on the use of an Acoustic Analogy), and although it is applied here with some approximations (e.g. in the design of the CFD-CAA interface), the present approach does not rely on restrictive assumptions (e.g. an equivalent noise source, a homogeneous propagation medium), which allows more realism to be incorporated into the prediction. In a second step, the outputs coming from such CFD-CAA hybrid calculations are processed through both traditional and advanced post-processing techniques, making it possible to further investigate the NLG's noise source mechanisms. Among other things, this work highlights how advanced computational methodologies are now mature enough to not only simulate realistic problems of airframe noise emission, but also to investigate their underlying physics.
Old and new challenges in Parkinson's disease therapeutics.
Pires, Ana O; Teixeira, F G; Mendes-Pinheiro, B; Serra, Sofia C; Sousa, Nuno; Salgado, António J
2017-09-01
Parkinson's disease (PD) is a neurodegenerative disorder characterized by the degeneration of dopaminergic neurons and/or loss of neuronal projections in several dopaminergic networks. Current treatments for idiopathic PD rely mainly on the use of pharmacologic agents to improve the motor symptomatology of PD patients. Nevertheless, PD so far remains an incurable disease. Therefore, it is of utmost importance to establish new therapeutic strategies for PD treatment. Over the last 20 years, several molecular, gene and cell/stem-cell therapeutic approaches have been developed with the aim of counteracting or retarding PD progression. The scope of this review is to provide an overview of PD-related therapies and major breakthroughs achieved within this field. In order to do so, this review will start by focusing on PD characterization and current treatment options, covering thereafter molecular, gene and cell/stem cell-based therapies that are currently being studied in animal models of PD or have recently been tested in clinical trials. Among stem cell-based therapies, those using MSCs as possible disease-modifying agents for PD therapy and, specifically, the MSC secretome's contribution to meeting the clinical challenge of counteracting or retarding PD progression, will be more deeply explored. Copyright © 2017 Elsevier Ltd. All rights reserved.
Eddy-current inversion in the thin-skin limit: Determination of depth and opening for a long crack
NASA Astrophysics Data System (ADS)
Burke, S. K.
1994-09-01
A method for crack size determination using eddy-current nondestructive evaluation is presented for the case of a plate containing an infinitely long crack of uniform depth and uniform crack opening. The approach is based on the approximate solution to Maxwell's equations for nonmagnetic conductors in the limit of small skin depth and relies on least-squares polynomial fits to a normalized coil impedance function as a function of skin depth. The method is straightforward to implement and is relatively insensitive to both systematic and random errors. The procedure requires the computation of two functions: a normalizing function, which depends both on the coil parameters and the skin depth, and a crack-depth function, which depends only on the coil parameters in addition to the crack depth. The practical performance of the method was tested using a set of simulated cracks in the form of electro-discharge machined slots in aluminum alloy plates. The crack depths and crack openings deduced from the eddy-current measurements agree with the actual crack dimensions to within 10% or better. Recommendations concerning the optimum conditions for crack sizing are also made.
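A minimal sketch of the kind of least-squares polynomial fitting the method relies on, assuming NumPy; the impedance values, skin depths and polynomial degree are hypothetical and only illustrate how a normalized impedance function would be fitted against skin depth.

```python
import numpy as np

# Hypothetical measurements: normalized coil impedance at several skin depths
# (set by the excitation frequency); values and degree are illustrative only.
skin_depth_mm = np.array([0.10, 0.15, 0.20, 0.25, 0.30])
z_norm = np.array([1.42, 1.31, 1.22, 1.15, 1.09])

# Least-squares polynomial fit of normalized impedance versus skin depth, the
# kind of fit used to build the normalizing and crack-depth functions.
coeffs = np.polyfit(skin_depth_mm, z_norm, deg=2)
fit = np.poly1d(coeffs)

# Evaluate toward the thin-skin limit (small skin depth).
print(fit(0.05))
```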
Harvesting energy from the natural vibration of human walking.
Yang, Weiqing; Chen, Jun; Zhu, Guang; Yang, Jin; Bai, Peng; Su, Yuanjie; Jing, Qingsheng; Cao, Xia; Wang, Zhong Lin
2013-12-23
The triboelectric nanogenerator (TENG), a unique technology for harvesting ambient mechanical energy based on the triboelectric effect, has been proven to be a cost-effective, simple, and robust approach for self-powered systems. However, a general challenge is that the output current is usually low. Here, we demonstrated a rationally designed TENG with integrated rhombic gridding, which greatly improved the total current output owing to the structurally multiplied unit cells connected in parallel. With the hybridization of both the contact-separation mode and sliding electrification mode among nanowire arrays and nanopores fabricated onto the surfaces of two contact plates, the newly designed TENG produces an open-circuit voltage up to 428 V and a short-circuit current of 1.395 mA, with a peak power density of 30.7 W/m^2. Relying on the TENG, a self-powered backpack was developed with a vibration-to-electric energy conversion efficiency up to 10.62(±1.19)%. It was also demonstrated as a direct power source for instantaneously lighting 40 commercial light-emitting diodes by harvesting the vibration energy from natural human walking. The newly designed TENG can be a mobile power source for field engineers, explorers, and disaster-relief workers.
A Data-Driven Approach to Interactive Visualization of Power Grids
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhu, Jun
Driven by emerging industry standards, electric utilities and grid coordination organizations are eager to seek advanced tools to assist grid operators in performing mission-critical tasks and to enable them to make quick and accurate decisions. The emerging field of visual analytics holds tremendous promise for improving the business practices in today's electric power industry. The conducted investigation, however, has revealed that the existing commercial power grid visualization tools heavily rely on human designers, hindering users' ability to discover. Additionally, for a large grid, it is very labor-intensive and costly to build and maintain the pre-designed visual displays. This project proposes a data-driven approach to overcome the common challenges. The proposed approach relies on developing powerful data manipulation algorithms to create visualizations based on the characteristics of empirically or mathematically derived data. The resulting visual presentations emphasize what the data is rather than how the data should be presented, thus fostering comprehension and discovery. Furthermore, the data-driven approach formulates visualizations on-the-fly. It does not require a visualization design stage, completely eliminating or significantly reducing the cost of building and maintaining visual displays. The research and development (R&D) conducted in this project is mainly divided into two phases. The first phase (Phase I & II) focuses on developing data-driven techniques for visualization of the power grid and its operation. Various data-driven visualization techniques were investigated, including pattern recognition for auto-generation of one-line diagrams, fuzzy-model-based rich data visualization for situational awareness, etc. The R&D conducted during the second phase (Phase IIB) focuses on enhancing the prototyped data-driven visualization tool based on the gathered requirements and use cases. The goal is to evolve the prototyped tool developed during the first phase into a commercial-grade product. We will use one of the identified application areas as an example to demonstrate how research results achieved in this project are successfully utilized to address an emerging industry need. In summary, the data-driven visualization approach developed in this project has proven to be promising for building the next-generation power grid visualization tools. Application of this approach has resulted in a state-of-the-art commercial tool currently being leveraged by more than 60 utility organizations in North America and Europe.
Metabolomics and Personalized Medicine.
Koen, Nadia; Du Preez, Ilse; Loots, Du Toit
2016-01-01
Current clinical practice strongly relies on the prognosis, diagnosis, and treatment of diseases using methods determined and averaged for the specific diseased cohort/population. Although this approach works well for most patients, misdiagnosis, treatment failure, relapse, and adverse drug effects are common occurrences in many individuals, which subsequently hamper the control and eradication of a number of diseases. These occurrences can be explained by individual variation in the genome, transcriptome, proteome, and metabolome of a patient. Various "omics" approaches have investigated the influence of these factors on a molecular level, with the intention of developing personalized approaches to disease diagnosis and treatment. Metabolomics, the newest addition to the "omics" domain and the closest to the observed phenotype, reflects changes occurring at all molecular levels, as well as influences resulting from other internal and external factors. By comparing the metabolite profiles of two or more disease phenotypes, metabolomics can be applied to identify biomarkers related to the perturbation being investigated. These biomarkers can, in turn, be used to develop personalized prognostic, diagnostic, and treatment approaches, and can also be applied to the monitoring of disease progression, treatment efficacy, predisposition to drug-related side effects, and potential relapse. In this review, we discuss the contributions that metabolomics has made, and can potentially still make, towards the field of personalized medicine. © 2016 Elsevier Inc. All rights reserved.
Irizarry, Kristopher J. L.; Bryant, Doug; Kalish, Jordan; Eng, Curtis; Schmidt, Peggy L.; Barrett, Gini; Barr, Margaret C.
2016-01-01
Many endangered captive populations exhibit reduced genetic diversity resulting in health issues that impact reproductive fitness and quality of life. Numerous cost-effective genomic sequencing and genotyping technologies provide unparalleled opportunities for incorporating genomics knowledge in the management of endangered species. Genomic data, such as sequence data, transcriptome data, and genotyping data, provide critical information about a captive population that, when leveraged correctly, can be utilized to maximize population genetic variation while simultaneously reducing unintended introduction or propagation of undesirable phenotypes. Current approaches aimed at managing endangered captive populations utilize species survival plans (SSPs) that rely upon mean kinship estimates to maximize genetic diversity while simultaneously avoiding artificial selection in the breeding program. However, as genomic resources increase for each endangered species, the potential knowledge available for management also increases. Unlike model organisms in which considerable scientific resources are used to experimentally validate genotype-phenotype relationships, endangered species typically lack the necessary sample sizes and economic resources required for such studies. Even so, in the absence of experimentally verified genetic discoveries, genomics data still provides value. In fact, bioinformatics and comparative genomics approaches offer mechanisms for translating these raw genomics data sets into integrated knowledge that enables an informed approach to endangered species management. PMID:27376076
de Nazelle, Audrey; Arunachalam, Saravanan; Serre, Marc L
2010-08-01
States in the USA are required to demonstrate future compliance with criteria air pollutant standards by using both air quality monitors and model outputs. In the case of ozone, the demonstration tests are designed to rely heavily on measured values, due to their perceived objectivity and enforceable quality. Weight given to numerical models is diminished by integrating them in the calculations only in a relative sense. For unmonitored locations, the EPA has suggested the use of a spatial interpolation technique to assign current values. We demonstrate that this approach may lead to erroneous assignments of nonattainment and may make it difficult for States to establish future compliance. We propose a method that combines different sources of information to map air pollution, using the Bayesian Maximum Entropy (BME) Framework. The approach gives precedence to measured values and integrates modeled data as a function of model performance. We demonstrate this approach in North Carolina, using the State's ozone monitoring network in combination with outputs from the Multiscale Air Quality Simulation Platform (MAQSIP) modeling system. We show that the BME data integration approach, compared to a spatial interpolation of measured data, improves the accuracy and the precision of ozone estimations across the state.
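The sketch below is a deliberately crude stand-in for the data-fusion idea, not the BME framework itself: monitor values are interpolated and the model output is blended in with a weight tied to its error variance, so precedence goes to measurements. All coordinates, concentrations and the weighting scheme are hypothetical.

```python
import numpy as np

# Hypothetical monitor locations (km), ozone readings (ppb), and model output.
monitors_xy = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0]])
monitors_o3 = np.array([62.0, 71.0, 58.0])
model_o3_at_target = 66.0      # model prediction at the unmonitored target cell
model_error_var = 25.0         # crude summary of model performance (ppb^2)

def estimate(target_xy):
    # Inverse-distance interpolation of measured values (precedence to data).
    d = np.linalg.norm(monitors_xy - target_xy, axis=1)
    w_obs = 1.0 / np.maximum(d, 1e-6) ** 2
    obs_interp = np.sum(w_obs * monitors_o3) / np.sum(w_obs)
    obs_var = 1.0 / np.sum(w_obs)  # rough precision proxy for the interpolation

    # Precision-weighted blend: the worse the model performance, the less weight
    # its output receives relative to the interpolated measurements.
    w = np.array([1.0 / obs_var, 1.0 / model_error_var])
    return np.average([obs_interp, model_o3_at_target], weights=w)

print(estimate(np.array([5.0, 5.0])))
```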
NASA Astrophysics Data System (ADS)
Sivalingam, Udhayaraj; Wels, Michael; Rempfler, Markus; Grosskopf, Stefan; Suehling, Michael; Menze, Bjoern H.
2016-03-01
In this paper, we present a fully automated approach to coronary vessel segmentation, which involves calcification or soft plaque delineation in addition to accurate lumen delineation, from 3D Cardiac Computed Tomography Angiography data. Adequately virtualizing the coronary lumen plays a crucial role for simulating blood flow by means of fluid dynamics, while additionally identifying the outer vessel wall in the case of arteriosclerosis is a prerequisite for further plaque compartment analysis. Our method is a hybrid approach complementing Active Contour Model-based segmentation with an external image force that relies on a Random Forest Regression model generated off-line. The regression model provides a strong estimate of the distance to the true vessel surface for every surface candidate point, taking into account 3D wavelet-encoded contextual image features, which are aligned with the current surface hypothesis. The associated external image force is integrated in the objective function of the active contour model, such that the overall segmentation approach benefits from the advantages associated with snakes and from the ones associated with machine learning-based regression alike. This yields an integrated approach achieving competitive results on a publicly available benchmark data collection (Rotterdam segmentation challenge).
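A minimal sketch of the regression component described above, assuming scikit-learn; the features, training data and the external_force helper are hypothetical stand-ins (a real system would use 3D wavelet-encoded context features aligned with the current surface hypothesis, not random numbers).

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)

# Hypothetical off-line training set: context features sampled around candidate
# surface points and the signed distance of each candidate to the true surface.
features = rng.normal(size=(2000, 32))         # stand-in for wavelet context features
distance_to_surface = rng.normal(size=2000)    # annotated signed distances (mm)

forest = RandomForestRegressor(n_estimators=50, random_state=0)
forest.fit(features, distance_to_surface)

def external_force(candidate_features):
    """Estimated distance to the true vessel surface for each candidate point;
    the active contour is pushed along its normal by this estimate."""
    return forest.predict(candidate_features)

print(external_force(features[:3]))
```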
Luck, Jeff; Hagigi, Fred; Parker, Louise E; Yano, Elizabeth M; Rubenstein, Lisa V; Kirchner, JoAnn E
2009-09-28
Collaborative care models for depression in primary care are effective and cost-effective, but difficult to spread to new sites. Translating Initiatives for Depression into Effective Solutions (TIDES) is an initiative to promote evidence-based collaborative care in the U.S. Veterans Health Administration (VHA). Social marketing applies marketing techniques to promote positive behavior change. Described in this paper, TIDES used a social marketing approach to foster national spread of collaborative care models. The approach relied on a sequential model of behavior change and explicit attention to audience segmentation. Segments included VHA national leadership, Veterans Integrated Service Network (VISN) regional leadership, facility managers, frontline providers, and veterans. TIDES communications, materials and messages targeted each segment, guided by an overall marketing plan. Depression collaborative care based on the TIDES model was adopted by VHA as part of the new Primary Care Mental Health Initiative and associated policies. It is currently in use in more than 50 primary care practices across the United States, and continues to spread, suggesting success for its social marketing-based dissemination strategy. Development, execution and evaluation of the TIDES marketing effort shows that social marketing is a promising approach for promoting implementation of evidence-based interventions in integrated healthcare systems.
Robustly detecting differential expression in RNA sequencing data using observation weights
Zhou, Xiaobei; Lindsay, Helen; Robinson, Mark D.
2014-01-01
A popular approach for comparing gene expression levels between (replicated) conditions of RNA sequencing data relies on counting reads that map to features of interest. Within such count-based methods, many flexible and advanced statistical approaches now exist and offer the ability to adjust for covariates (e.g. batch effects). Often, these methods include some sort of ‘sharing of information’ across features to improve inferences in small samples. It is important to achieve an appropriate tradeoff between statistical power and protection against outliers. Here, we study the robustness of existing approaches for count-based differential expression analysis and propose a new strategy based on observation weights that can be used within existing frameworks. The results suggest that outliers can have a global effect on differential analyses. We demonstrate the effectiveness of our new approach with real data and simulated data that reflects properties of real datasets (e.g. dispersion-mean trend) and develop an extensible framework for comprehensive testing of current and future methods. In addition, we explore the origin of such outliers, in some cases highlighting additional biological or technical factors within the experiment. Further details can be downloaded from the project website: http://imlspenticton.uzh.ch/robinson_lab/edgeR_robust/. PMID:24753412
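A toy illustration of observation weights for count data, assuming NumPy; it is not the algorithm proposed in the paper. Residuals from a robust location estimate stand in for the model-based residuals a real count-based method would use, and the Huber-style weighting simply shows how an outlying replicate is down-weighted.

```python
import numpy as np

# Counts for one gene across replicates; the last replicate is an outlier.
counts = np.array([18, 22, 20, 19, 240], dtype=float)

# Pearson-style residuals from a robust location estimate; real count-based
# methods compute residuals from a fitted negative-binomial model instead.
mu = np.median(counts)
resid = (counts - mu) / np.sqrt(mu)

# Huber-style weights: typical observations keep weight 1, outliers are
# down-weighted so they cannot dominate the downstream fit.
k = 1.345
weights = np.where(np.abs(resid) <= k, 1.0, k / np.abs(resid))

weighted_mean = np.sum(weights * counts) / np.sum(weights)
print(weights.round(3), round(weighted_mean, 1))
```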
Inertial mass sensing with low Q-factor vibrating microcantilevers
NASA Astrophysics Data System (ADS)
Adhikari, S.
2017-10-01
Mass sensing using micromechanical cantilever oscillators has been established as a promising approach. The scientific principle underpinning this technique is the shift in the resonance frequency caused by the additional mass in the dynamic system. This approach relies on the fact that the Q-factor of the underlying oscillator is high enough so that it does not significantly affect the resonance frequencies. We consider the case when the Q-factor is low to the extent that the effect of damping is prominent. It is shown that the mass sensing can be achieved using a shift in the damping factor. We prove that the shift in the damping factor is of the same order as that of the resonance frequency. Based on this crucial observation, three new approaches have been proposed, namely, (a) mass sensing using frequency shifts in the complex plane, (b) mass sensing from damped free vibration response in the time domain, and (c) mass sensing from the steady-state response in the frequency domain. Explicit closed-form expressions relating absorbed mass with changes in the measured dynamic properties have been derived. The rationale behind each new method has been explained using non-dimensional graphical illustrations. The new mass sensing approaches using damped dynamic characteristics can expand the current horizon of micromechanical sensing by incorporating a wide range of additional measurements.
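A small numerical illustration of the central observation, with hypothetical cantilever parameters (not taken from the paper): for a damped mass-spring oscillator, an absorbed mass shifts the damping factor by roughly the same relative amount as it shifts the resonance frequency, so either quantity can be used for sensing.

```python
import math

# Hypothetical low-Q microcantilever parameters -- illustrative only.
k = 0.5        # stiffness, N/m
m = 2.0e-12    # effective mass, kg
c = 2.0e-7     # damping coefficient, kg/s (deliberately low Q)

def modal_properties(mass):
    """Natural frequency, damping factor and damped frequency of an m-c-k oscillator."""
    wn = math.sqrt(k / mass)
    zeta = c / (2.0 * math.sqrt(k * mass))
    wd = wn * math.sqrt(1.0 - zeta ** 2)
    return wn, zeta, wd

dm = 1.0e-14   # absorbed mass, kg
wn0, z0, _ = modal_properties(m)
wn1, z1, _ = modal_properties(m + dm)

# Both relative shifts are of order dm / (2 m), so the damping factor carries
# the same mass information as the resonance frequency.
print((wn0 - wn1) / wn0, (z0 - z1) / z0)
```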
ERIC Educational Resources Information Center
Janor, Hawati; Rahim, Ruzita Abdul; Rahman, Aisyah Abdul; Auzairy, Noor Azryani; Hashim, Noor Azuan; Yusof, Muhamad Zain
2013-01-01
The student-centered learning (SCL) approach is an approach to education that focuses on learners and their needs, rather than relying upon the teacher's input. The present paper examines how the SCL approach is integrated as a learner-centered paradigm into finance courses offered at a business school in a research university in Malaysia.…
ERIC Educational Resources Information Center
Drabinová, Adéla; Martinková, Patrícia
2017-01-01
In this article we present a general approach not relying on item response theory models (non-IRT) to detect differential item functioning (DIF) in dichotomous items in the presence of guessing. The proposed nonlinear regression (NLR) procedure for DIF detection is an extension of a method based on logistic regression. As a non-IRT approach, NLR can…
Interpolation methods and the accuracy of lattice-Boltzmann mesh refinement
Guzik, Stephen M.; Weisgraber, Todd H.; Colella, Phillip; ...
2013-12-10
A lattice-Boltzmann model to solve the equivalent of the Navier-Stokes equations on adaptively refined grids is presented. A method for transferring information across interfaces between different grid resolutions was developed following established techniques for finite-volume representations. This new approach relies on a space-time interpolation and solving constrained least-squares problems to ensure conservation. The effectiveness of this method at maintaining the second-order accuracy of lattice-Boltzmann is demonstrated through a series of benchmark simulations and detailed mesh refinement studies. These results exhibit smaller solution errors and improved convergence when compared with similar approaches relying only on spatial interpolation. Examples highlighting the mesh adaptivity of this method are also provided.
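A toy equality-constrained least-squares solve in the spirit of the conservation-preserving interpolation described above, assuming NumPy; the grid values, the quadratic interpolant and the specific conservation constraint are all hypothetical and only illustrate the linear-algebra pattern (a KKT system), not the actual space-time scheme.

```python
import numpy as np

# Hypothetical 1-D coarse-grid samples and fine-grid points inside a refined patch.
x_coarse = np.array([0.0, 1.0, 2.0, 3.0])
f_coarse = np.array([1.0, 1.8, 2.1, 1.7])
x_fine = np.linspace(0.5, 2.5, 5)

# Quadratic interpolant: minimize ||A c - f_coarse|| subject to B c = total,
# where B c sums the interpolant at the fine points (a stand-in conservation constraint).
A = np.vander(x_coarse, 3)
B = np.vander(x_fine, 3).sum(axis=0, keepdims=True)
total = np.array([f_coarse.sum() * len(x_fine) / len(x_coarse)])

# KKT system for the equality-constrained least-squares problem.
n = A.shape[1]
kkt = np.block([[A.T @ A, B.T], [B, np.zeros((1, 1))]])
rhs = np.concatenate([A.T @ f_coarse, total])
c = np.linalg.solve(kkt, rhs)[:n]

print(np.vander(x_fine, 3) @ c)   # fine-grid values satisfying the constraint
```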
Assessing first-order emulator inference for physical parameters in nonlinear mechanistic models
Hooten, Mevin B.; Leeds, William B.; Fiechter, Jerome; Wikle, Christopher K.
2011-01-01
We present an approach for estimating physical parameters in nonlinear models that relies on an approximation to the mechanistic model itself for computational efficiency. The proposed methodology is validated and applied in two different modeling scenarios: (a) Simulation and (b) lower trophic level ocean ecosystem model. The approach we develop relies on the ability to predict right singular vectors (resulting from a decomposition of computer model experimental output) based on the computer model input and an experimental set of parameters. Critically, we model the right singular vectors in terms of the model parameters via a nonlinear statistical model. Specifically, we focus our attention on first-order models of these right singular vectors rather than the second-order (covariance) structure.
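A rough sketch of the emulator idea under stated assumptions (NumPy and scikit-learn; a random forest stands in for the nonlinear statistical model, and the ensemble is synthetic): decompose the computer-model output with an SVD, regress the right singular vectors on the input parameters, and reuse the regression to approximate runs at new parameter values.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(1)

# Hypothetical ensemble: 40 model runs, each a 200-point output field, driven
# by 3 physical parameters (all values illustrative).
theta = rng.uniform(0, 1, size=(40, 3))
grid = np.linspace(0, 2 * np.pi, 200)
Y = np.array([np.sin(grid * (1 + t[0])) * t[1] + t[2] for t in theta]).T  # 200 x 40

# Decompose the ensemble output and keep the leading modes.
U, s, Vt = np.linalg.svd(Y, full_matrices=False)
k = 3
V = Vt[:k].T   # 40 x k right-singular-vector weights, one row per run

# Statistical model: predict the weights from the model parameters.
models = [RandomForestRegressor(random_state=0).fit(theta, V[:, j]) for j in range(k)]

def emulate(theta_new):
    """Cheap approximation to a full model run at new parameter values."""
    v_new = np.array([m.predict(theta_new[None])[0] for m in models])
    return U[:, :k] @ (s[:k] * v_new)

print(emulate(np.array([0.3, 0.7, 0.2]))[:5])
```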
Adaptive management in the context of barriers in European freshwater ecosystems.
Birnie-Gauvin, Kim; Tummers, Jeroen S; Lucas, Martyn C; Aarestrup, Kim
2017-12-15
Many natural habitats have been modified to accommodate the presence of humans and their needs. Infrastructures - such as hydroelectric dams, weirs, culverts and bridges - are now a common occurrence in streams and rivers across the world. As a result, freshwater ecosystems have been altered extensively, affecting both biological and geomorphological components of the habitats. Many fish species rely on these freshwater ecosystems to complete their lifecycles, and the presence of barriers has been shown to reduce their ability to migrate and sustain healthy populations. In the long run, barriers may have severe repercussions on population densities and dynamics of aquatic animal species. There is currently an urgent need to address these issues with adequate conservation approaches. Adaptive management provides a relevant approach to managing barriers in freshwater ecosystems as it addresses the uncertainties of dealing with natural systems and accommodates future unexpected events, though this approach may not be suitable in all instances. A literature search on this subject yielded virtually no output. Hence, we propose a step-by-step guide for implementing adaptive management, which could be used to manage freshwater barriers. Copyright © 2017 Elsevier Ltd. All rights reserved.
Smirnova, Alexandra; deCamp, Linda; Chowell, Gerardo
2017-05-02
Deterministic and stochastic methods relying on early case incidence data for forecasting epidemic outbreaks have received increasing attention during the last few years. In mathematical terms, epidemic forecasting is an ill-posed problem due to instability of parameter identification and limited available data. While previous studies have largely estimated the time-dependent transmission rate by assuming specific functional forms (e.g., exponential decay) that depend on a few parameters, here we introduce a novel approach for the reconstruction of nonparametric time-dependent transmission rates by projecting onto a finite subspace spanned by Legendre polynomials. This approach enables us to effectively forecast future incidence cases, a clear advantage over recovering the transmission rate at finitely many grid points within the interval where the data are currently available. In our approach, we compare three regularization algorithms: variational (Tikhonov's) regularization, truncated singular value decomposition (TSVD), and modified TSVD, in order to determine the stabilizing strategy that is most effective in terms of reliability of forecasting from limited data. We illustrate our methodology using simulated data as well as case incidence data for various epidemics including the 1918 influenza pandemic in San Francisco and the 2014-2015 Ebola epidemic in West Africa.
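A condensed sketch of the projection-plus-regularization idea, assuming NumPy; the noisy rate data, expansion degree and regularization weight are hypothetical, and Tikhonov regularization is shown where TSVD or modified TSVD would be alternative stabilizing strategies.

```python
import numpy as np
from numpy.polynomial import legendre

rng = np.random.default_rng(2)

# Hypothetical noisy observations of a time-dependent transmission rate.
t = np.linspace(-1, 1, 60)                       # time rescaled to [-1, 1]
beta_true = 0.6 + 0.3 * np.exp(-2 * (t + 0.5) ** 2)
obs = beta_true + 0.05 * rng.normal(size=t.size)

# Design matrix of the first few Legendre polynomials evaluated at t.
degree = 5
A = legendre.legvander(t, degree)                # 60 x (degree + 1)

# Tikhonov-regularized least squares for the expansion coefficients.
lam = 1e-2
coef = np.linalg.solve(A.T @ A + lam * np.eye(degree + 1), A.T @ obs)

beta_hat = A @ coef                               # smooth reconstructed rate
print(np.round(coef, 3))
```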
From head to tail: new models and approaches in primate functional anatomy and biomechanics.
Organ, Jason M; Deleon, Valerie B; Wang, Qian; Smith, Timothy D
2010-04-01
This special issue of The Anatomical Record (AR) is based on interest generated by a symposium at the 2008 annual meeting of the American Association of Anatomists (AAA) at Experimental Biology, entitled "An Evolutionary Perspective on Human Anatomy." The development of this volume in turn provided impetus for a Biological Anthropology Mini-Meeting, organized by members of the AAA for the 2010 Experimental Biology meeting in Anaheim, California. The research presented in these pages reflects the themes of these symposia and provides a snapshot of the current state of primate functional anatomy and biomechanics research. The 17 articles in this special issue utilize new models and/or approaches to study long-standing questions about the evolution of our closest relatives, including soft-tissue dissection and microanatomical techniques, experimental approaches to morphology, kinematic and kinetic biomechanics, high-resolution computed tomography, and Finite Element Analysis (FEA). This volume continues a close historical association between the disciplines of anatomy and biological anthropology: anatomists benefit from an understanding of the evolutionary history of our modern form, and biological anthropologists rely on anatomical principles to make informed evolutionary inferences about our closest relatives. (c) 2010 Wiley-Liss, Inc.
Current approaches to exploit actinomycetes as a source of novel natural products.
Genilloud, Olga; González, Ignacio; Salazar, Oscar; Martín, Jesus; Tormo, José Rubén; Vicente, Francisca
2011-03-01
For decades, microbial natural products have been one of the major sources of novel drugs for pharmaceutical companies, and today all evidence suggests that novel molecules with potential therapeutic applications are still waiting to be discovered from these natural sources, especially from actinomycetes. Any appropriate exploitation of the chemical diversity of these microbial sources relies on proper understanding of their biological diversity and other related key factors that maximize the possibility of successful identification of novel molecules. Without doubt, the discovery of platensimycin has shown that microbial natural products can continue to deliver novel scaffolds if appropriate tools are put in place to reveal them in a cost-effective manner. Whereas today innovative technologies involving exploitation of uncultivated environmental diversity, together with chemical biology and in silico approaches, are seeing rapid development in natural products research, maximization of the chances of exploiting chemical diversity from microbial collections is still essential for novel drug discovery. This work provides an overview of the integrated approaches developed at the former Basic Research Center of Merck Sharp and Dohme in Spain to exploit the diversity and biosynthetic potential of actinomycetes, and includes some examples of those that were successfully applied to the discovery of novel antibiotics.
A hybrid model for computing nonthermal ion distributions in a long mean-free-path plasma
NASA Astrophysics Data System (ADS)
Tang, Xianzhu; McDevitt, Chris; Guo, Zehua; Berk, Herb
2014-10-01
Non-thermal ions, especially the suprathermal ones, are known to make a dominant contribution to a number of important physical processes such as the fusion reactivity in controlled fusion, the ion heat flux, and, in the case of a tokamak, the ion bootstrap current. Evaluating the deviation from a local Maxwellian distribution of these non-thermal ions can be a challenging task in the context of a global plasma fluid model that evolves the plasma density, flow, and temperature. Here we describe a hybrid model for coupling such a constrained kinetic calculation to global plasma fluid models. The key ingredient is a non-perturbative treatment of the tail ions where the ion Knudsen number approaches or surpasses order unity. This can be sharply contrasted with the standard Chapman-Enskog approach, which relies on a perturbative treatment that is frequently invalidated. The accuracy of our coupling scheme is controlled by the precise criteria for matching the non-perturbative kinetic model to perturbative solutions in both configuration space and velocity space. Although our specific application examples will be drawn from laboratory controlled fusion experiments, the general approach is applicable to space and astrophysical plasmas as well. Work supported by DOE.
Electromagnetic structure of few-nucleon ground states
Marcucci, Laura E.; Gross, Franz L.; Peña, M. T.; ...
2016-01-08
Experimental form factors of the hydrogen and helium isotopes, extracted from an up-to-date global analysis of cross sections and polarization observables measured in elastic electron scattering from these systems, are compared to predictions obtained in three different theoretical approaches: the first is based on realistic interactions and currents, including relativistic corrections (labeled as the conventional approach); the second relies on a chiral effective field theory description of the strong and electromagnetic interactions in nuclei (labeled ChiEFT); the third utilizes a fully relativistic treatment of nuclear dynamics as implemented in the covariant spectator theory (labeled CST). Furthermore, for momentum transfers below Q < 5 fm^-1 there is satisfactory agreement between experimental data and theoretical results in all three approaches. Conversely, at Q > 5 fm^-1, particularly in the case of the deuteron, a relativistic treatment of the dynamics, as is done in the CST, is necessary. The experimental data on the deuteron A structure function extend to Q ~ 12 fm^-1, and the close agreement between these data and the CST results suggests that, even in this extreme kinematical regime, there is no evidence for new effects coming from quark and gluon degrees of freedom at short distances.
Liba, Amir; Wanagat, Jonathan
2014-11-01
Complex diseases such as heart disease, stroke, cancer, and aging are the primary causes of death in the US. These diseases cause heterogeneous conditions among cells, conditions that cannot be measured in tissue homogenates and require single cell approaches. Protein levels within tissues are currently assayed using various molecular biology techniques (e.g., Western blots) that rely on milligram to gram quantities of tissue homogenates, or immunofluorescent (IF) techniques that are limited by spectral overlap. Tissue homogenate studies lack references to tissue structure and mask signals from individual or rare cellular events. Novel techniques are required to bring protein measurement sensitivity to the single cell level and offer spatiotemporal resolution and scalability. We are developing a novel approach to protein quantification by exploiting the inherently low concentration of rare earth elements (REE) in biological systems. By coupling REE-antibody immunolabeling of cells with laser capture microdissection (LCM) and ICP-QQQ, we are achieving multiplexed protein measurement in histological sections of single cells. This approach will add to evolving single cell techniques and our ability to understand cellular heterogeneity in complex biological systems and diseases.
Bioinspired Infrared Sensing Materials and Systems.
Shen, Qingchen; Luo, Zhen; Ma, Shuai; Tao, Peng; Song, Chengyi; Wu, Jianbo; Shang, Wen; Deng, Tao
2018-05-11
Bioinspired engineering offers a promising alternative approach to accelerating the development of many man-made systems. Next-generation infrared (IR) sensing systems can also benefit from such a nature-inspired approach. The inherently compact and uncooled operation of biological IR sensing systems provides ample inspiration for the engineering of portable and high-performance artificial IR sensing systems. This review overviews the current understanding of the biological IR sensing systems, most of which are thermal-based IR sensors that rely on either a bolometer-like or a photomechanic sensing mechanism. The existing efforts inspired by the biological IR sensing systems and possible future bioinspired approaches in the development of new IR sensing systems are also discussed in the review. Besides these biological IR sensing systems, other biological systems that do not have IR sensing capabilities but can help advance the development of engineered IR sensing systems are also discussed, and the related engineering efforts are overviewed as well. Further efforts in understanding biological IR sensing systems, learning from the integration of multiple functions in biological systems, and reducing barriers to maximize multidisciplinary collaborations are needed to move this research field forward. © 2018 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Life cycle approaches to sustainable consumption: a critical review.
Hertwich, Edgar G
2005-07-01
The 2002 World Summit for Sustainable Development in Johannesburg called for a comprehensive set of programs focusing on sustainable consumption and production. According to world leaders, these programs should rely on life cycle assessment (LCA) to promote sustainable patterns of production and consumption. Cleaner production is a well-established activity, and it uses LCA. UNEP, the European Union, and a number of national organizations have now begun to work on sustainable consumption. In developing sustainable consumption policies and activities, the use of LCA presents interesting opportunities that are not yet well understood by policy makers. This paper reviews how life cycle approaches, primarily based on input-output analysis, have been used in the area of sustainable consumption: to inform policy making, select areas of action, identify which lifestyles are more sustainable, advise consumers, and evaluate the effectiveness of sustainable consumption measures. Information on consumption patterns usually comes from consumer expenditure surveys. Different study designs and a better integration with consumer research can provide further interesting insights. Life-cycle approaches still need to be developed and tested. Current research is mostly descriptive; policy makers, however, require more strategic analysis addressing their decision options, including scenario analysis and backcasting.
Vazquez-Anderson, Jorge; Mihailovic, Mia K; Baldridge, Kevin C; Reyes, Kristofer G; Haning, Katie; Cho, Seung Hee; Amador, Paul; Powell, Warren B; Contreras, Lydia M
2017-05-19
Current approaches to design efficient antisense RNAs (asRNAs) rely primarily on a thermodynamic understanding of RNA-RNA interactions. However, these approaches depend on structure predictions and have limited accuracy, arguably due to overlooking important cellular environment factors. In this work, we develop a biophysical model to describe asRNA-RNA hybridization that incorporates in vivo factors using large-scale experimental hybridization data for three model RNAs: a group I intron, CsrB and a tRNA. A unique element of our model is the estimation of the availability of the target region to interact with a given asRNA using a differential entropic consideration of suboptimal structures. We showcase the utility of this model by evaluating its prediction capabilities in four additional RNAs: a group II intron, Spinach II, 2-MS2 binding domain and glgC 5΄ UTR. Additionally, we demonstrate the applicability of this approach to other bacterial species by predicting sRNA-mRNA binding regions in two newly discovered, though uncharacterized, regulatory RNAs. © The Author(s) 2017. Published by Oxford University Press on behalf of Nucleic Acids Research.
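The following is a simplified, purely illustrative proxy for the notion of target-region availability (it is not the differential entropic measure used in the paper): the fraction of suboptimal secondary structures in which a candidate target region is completely unpaired. The dot-bracket structures and indices are hypothetical.

```python
# Hypothetical ensemble of suboptimal secondary structures in dot-bracket notation.
suboptimal_structures = [
    "....((((....))))....",
    "..((((......))))....",
    "((((....))))........",
]

def availability(structures, start, end):
    """Fraction of structures in which positions [start, end) are all unpaired."""
    free = sum(all(ch == "." for ch in s[start:end]) for s in structures)
    return free / len(structures)

# Availability of the first four nucleotides as a candidate asRNA target region.
print(availability(suboptimal_structures, 0, 4))
```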
Struct2Net: a web service to predict protein–protein interactions using a structure-based approach
Singh, Rohit; Park, Daniel; Xu, Jinbo; Hosur, Raghavendra; Berger, Bonnie
2010-01-01
Struct2Net is a web server for predicting interactions between arbitrary protein pairs using a structure-based approach. Prediction of protein–protein interactions (PPIs) is a central area of interest and successful prediction would provide leads for experiments and drug design; however, the experimental coverage of the PPI interactome remains inadequate. We believe that Struct2Net is the first community-wide resource to provide structure-based PPI predictions that go beyond homology modeling. Also, most web-resources for predicting PPIs currently rely on functional genomic data (e.g. GO annotation, gene expression, cellular localization, etc.). Our structure-based approach is independent of such methods and only requires the sequence information of the proteins being queried. The web service allows multiple querying options, aimed at maximizing flexibility. For the most commonly studied organisms (fly, human and yeast), predictions have been pre-computed and can be retrieved almost instantaneously. For proteins from other species, users have the option of getting a quick-but-approximate result (using orthology over pre-computed results) or having a full-blown computation performed. The web service is freely available at http://struct2net.csail.mit.edu. PMID:20513650
New Approaches to Capture High Frequency Agricultural Dynamics in Africa through Mobile Phones
NASA Astrophysics Data System (ADS)
Evans, T. P.; Attari, S.; Plale, B. A.; Caylor, K. K.; Estes, L. D.; Sheffield, J.
2015-12-01
Crop failure early warning systems relying on remote sensing constitute a new critical resource to assess areas where food shortages may arise, but there is a disconnect between the patterns of crop production on the ground and the environmental and decision-making dynamics that led to a particular crop production outcome. In Africa many governments use mid-growing season household surveys to get an on-the-ground assessment of current agricultural conditions. These efforts, however, are cost-prohibitive over large scales and only offer a one-time snapshot at a particular time point. They also rely on farmers to recall past decisions, and recall may be imperfect when a farmer answers retrospectively about a decision made several months earlier (e.g. quantity of seed planted). We introduce a novel mobile-phone based approach to acquire information from farmers over large spatial extents, at high frequency and at relatively low cost compared to household survey approaches. This system makes compromises in the number of questions that can feasibly be asked of a respondent (compared to household interviews), but the benefit of capturing weekly data from farmers is very exciting. We present data gathered from farmers in Kenya and Zambia to understand key dimensions of agricultural decision making such as choice of seed variety/planting date, frequency and timing of weeding/fertilizing, and coping strategies such as pursuing off-farm labor. A particularly novel aspect of this work is week-by-week reporting from farmers of their expected end-of-season harvest. Farmers themselves can serve as sentinels of crop failure in this system. Farmers' responses to drought are driven as much by their expectations of looming crop failure, which may differ from assessments gleaned from remote sensing. This work is one piece of a larger design to link farmers to high-density meteorological data in Africa as an additional tool to improve crop failure early warning systems and understand adaptation to climate variability.
An Alternative Approach to Human Servicing of Crewed Earth Orbiting Spacecraft
NASA Technical Reports Server (NTRS)
Mularski, John R.; Alpert, Brian K.
2017-01-01
As crewed spacecraft have grown larger and more complex, they have come to rely on spacewalks, or Extravehicular Activities (EVA), for mission success and crew safety. Typically, these spacecraft maintain all of the hardware and trained personnel needed to perform an EVA on-board at all times. Maintaining this capability requires volume and up-mass for storage of EVA hardware, crew time for ground and on-orbit training, and on-orbit maintenance of EVA hardware. This paper proposes an alternative methodology, utilizing launch on-need hardware and crew to provide EVA capability for space stations in Earth orbit after assembly complete, in the same way that one would call a repairman to fix something at their home. This approach would reduce ground training requirements, save Intravehicular Activity (IVA) crew time in the form of EVA hardware maintenance and on-orbit training, and lead to more efficient EVAs because they would be performed by specialists with detailed knowledge and training stemming from their direct involvement in the development of the EVA. The on-orbit crew would then be available to focus on the immediate response to the failure as well as the day-to-day operations of the spacecraft and payloads. This paper will look at how current unplanned EVAs are conducted, including the time required for preparation, and offer alternatives for future spacecraft. As this methodology relies on the on-time and on-need launch of spacecraft, any space station that utilized this approach would need a robust transportation system including more than one launch vehicle capable of carrying crew. In addition, the fault tolerance of the space station would be an important consideration in how much time was available for EVA preparation after the failure. Each future program would have to weigh the risk of on-time launch against the increase in available crew time for the main objective of the spacecraft.
An Alternative Approach to Human Servicing of Manned Earth Orbiting Spacecraft
NASA Technical Reports Server (NTRS)
Mularski, John; Alpert, Brian
2011-01-01
As manned spacecraft have grown larger and more complex, they have come to rely on spacewalks or Extravehicular Activities (EVA) for both mission success and crew safety. Typically these spacecraft maintain all of the hardware and trained personnel needed to perform an EVA on-board at all times. Maintaining this capability requires volume and up-mass for storage of EVA hardware, crew time for ground and on-orbit training, and on-orbit maintenance of EVA hardware . This paper proposes an alternative methodology to utilize launch-on-need hardware and crew to provide EVA capability for space stations in Earth orbit after assembly complete, in the same way that most people would call a repairman to fix something at their home. This approach would not only reduce ground training requirements and save Intravehicular Activity (IVA) crew time in the form of EVA hardware maintenance and on-orbit training, but would also lead to more efficient EVAs because they would be performed by specialists with detailed knowledge and training stemming from their direct involvement in the development of the EVA. The on-orbit crew would then be available to focus on the immediate response to the failure as well as the day-to-day operations of the spacecraft and payloads. This paper will look at how current ISS unplanned EVAs are conducted, including the time required for preparation, and offer alternatives for future spacecraft utilizing lessons learned from ISS. As this methodology relies entirely on the on-time and on-need launch of spacecraft, any space station that utilized this approach would need a robust transportation system including more than one launch vehicle capable of carrying crew. In addition the fault tolerance of the space station would be an important consideration in how much time was available for EVA preparation after the failure. Each future program would have to weigh the risk of on-time launch against the increase in available crew time for the main objective of the spacecraft.
Automated Urban Travel Interpretation: A Bottom-up Approach for Trajectory Segmentation
Das, Rahul Deb; Winter, Stephan
2016-01-01
Understanding travel behavior is critical for effective urban planning as well as for enabling various context-aware service provisions to support mobility as a service (MaaS). Both applications rely on the sensor traces generated by travellers' smartphones. These traces can be used to interpret travel modes, both for generating automated travel diaries and for real-time travel mode detection. Current approaches segment a trajectory by certain criteria, e.g., a drop in speed. However, these criteria are heuristic, and, thus, existing approaches are subjective and involve significant vagueness and uncertainty in activity transitions in space and time. Also, segmentation approaches are not suited to real-time interpretation of open-ended segments, and cannot cope with the frequent gaps in the location traces. To address these challenges, a novel state-based bottom-up approach is proposed. This approach assumes a fixed atomic segment of a homogeneous state, instead of an event-based segment, and a progressive iteration until a new state is found. The research investigates how an atomic state-based approach can be developed in such a way that it can work in real-time, near-real-time and offline modes and in different environmental conditions with their varying quality of sensor traces. The results show that the proposed bottom-up model outperforms the existing event-based segmentation models in terms of adaptivity, flexibility, accuracy and richness in information delivery pertinent to automated travel behavior interpretation. PMID:27886053
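The segmentation idea above lends itself to a short illustration. The following is a minimal sketch of the atomic, state-based, bottom-up idea, assuming each GPS fix already carries a speed estimate; the state labels, thresholds and window length are illustrative assumptions, not the authors' parameters.

```python
from dataclasses import dataclass

# Minimal sketch of state-based, bottom-up segmentation: each fixed-length
# atomic window is labelled with a homogeneous state, and consecutive windows
# keep extending the current segment until a new state appears.
# The state rules (speed thresholds) are illustrative only.

@dataclass
class Segment:
    state: str
    start: int   # index of first fix in the segment
    end: int     # index of last fix in the segment

def atomic_state(speeds):
    """Label one atomic window of speed samples (m/s) with a coarse state."""
    mean_speed = sum(speeds) / len(speeds)
    if mean_speed < 0.5:
        return "stationary"
    if mean_speed < 2.5:
        return "walk"
    return "vehicle"

def segment_bottom_up(speeds, window=5):
    """Progressively grow segments from fixed atomic windows of `window` fixes."""
    segments = []
    for start in range(0, len(speeds) - window + 1, window):
        state = atomic_state(speeds[start:start + window])
        if segments and segments[-1].state == state:
            segments[-1].end = start + window - 1      # extend current segment
        else:
            segments.append(Segment(state, start, start + window - 1))
    return segments

if __name__ == "__main__":
    trace = [0.1] * 20 + [1.4] * 30 + [9.0] * 25     # stationary -> walk -> vehicle
    for seg in segment_bottom_up(trace):
        print(seg)
```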
Environmental health risk assessments of chemical mixtures that rely on component approaches often begin by grouping the chemicals of concern according to toxicological similarity. Approaches that assume dose addition typically are used for groups of similarly-acting chemicals an...
Multi-Sensory Input in the Non-Academic ESL Classroom.
ERIC Educational Resources Information Center
Bassano, Sharron
Teaching approaches for adult English as a second language students with little previous formal education or native-language literacy cannot rely on traditional written materials. For students who cannot be reached through the written word, approaches must be devised that engage other channels of perception. Classroom activities are suggested…
Navigating Uncertainty and Responsibility: Understanding Inequality in the Senior-Year Transition
ERIC Educational Resources Information Center
Silver, Blake R.; Roksa, Josipa
2017-01-01
Relying on interviews with 62 college seniors, this study explores students' experiences with uncertainty and responsibility as they approach graduation. Notable differences between first-generation and continuing-generation students emerged in relation to: (a) how seniors experienced responsibility and commitment as they approached graduation,…
ERIC Educational Resources Information Center
Smith, Calvin
2008-01-01
In this paper an approach to the writing of evaluation questions is outlined and developed which focuses attention on the question of the effectiveness of an educational design for bringing about the learning it is intended to facilitate. The approach develops from the idea that all educational designs rely on instructional alignment, implicitly…
Looking to Near Peers to Guide Student Discussions about Race
ERIC Educational Resources Information Center
Kaplowitz, Donna Rich; Lee, Jasmine A.; Seyka, Sheri L.
2018-01-01
The authors describe a promising approach to engaging high school students in intergroup dialogues, relying on "near peers"--in this case, local college students--to facilitate a series of classroom discussions about racial identity, differences, and opportunities to connect. Early results suggest that the approach had significant…
A Model-Driven Approach to Teaching Concurrency
ERIC Educational Resources Information Center
Carro, Manuel; Herranz, Angel; Marino, Julio
2013-01-01
We present an undergraduate course on concurrent programming where formal models are used in different stages of the learning process. The main practical difference with other approaches lies in the fact that the ability to develop correct concurrent software relies on a systematic transformation of formal models of inter-process interaction (so…
Promoting Process-Oriented Listening Instruction in the ESL Classroom
ERIC Educational Resources Information Center
Nguyen, Huong; Abbott, Marilyn L.
2016-01-01
When teaching listening, second language instructors tend to rely on product-oriented approaches that test learners' abilities to identify words and answer comprehension questions, but this does little to help learners improve upon their listening skills (e.g., Vandergrift & Goh, 2012). To address this issue, alternative approaches that guide…
Can Creativity Predict Cognitive Reserve?
ERIC Educational Resources Information Center
Palmiero, Massimiliano; Di Giacomo, Dina; Passafiume, Domenico
2016-01-01
Cognitive reserve relies on the ability to effectively cope with aging and brain damage by using alternate processes to approach tasks when standard approaches are no longer available. In this study, the issue if creativity can predict cognitive reserve has been explored. Forty participants (mean age: 61 years) filled out: the Cognitive Reserve…
Jobs and Productivity. National Issues Forum.
ERIC Educational Resources Information Center
Cohen, Richard; And Others
Rising unemployment and declining industrial productivity are major problems in the United States today. Four different strategies have been proposed for dealing with these problems. The free market approach promises economic prosperity by reducing the government's role and relying on the private sector. Advocates of this approach feel that it…
Measuring Intervention Effectiveness: The Benefits of an Item Response Theory Approach
ERIC Educational Resources Information Center
McEldoon, Katherine; Cho, Sun-Joo; Rittle-Johnson, Bethany
2012-01-01
Assessing the effectiveness of educational interventions relies on quantifying differences between intervention groups over time in a between-within design. Binary outcome variables (e.g., correct responses versus incorrect responses) are often assessed. Widespread approaches use percent correct on assessments, and repeated measures analysis of…
Sugar beet cell wall protein confers fungal and pest resistance in genetically engineered plants
USDA-ARS?s Scientific Manuscript database
Sugar beet biomass and sugar yield are reduced by diseases caused by microbial pathogens and insect pest infestations. Since disease and pest control measures continue to rely on harmful chemical fungicides and insecticides, biotechnological approaches offer an alternate approach for disease and pe...
The Flipped Class: Experience in a University Business Communication Course
ERIC Educational Resources Information Center
Sherrow, Tammy; Lang, Brenda; Corbett, Rod
2016-01-01
Business, like many other programs in higher education, continues to rely largely on traditional classroom environments. In this article, another approach to teaching and learning, the flipped classroom, is explored. After a review of relevant literature, the authors present their experience with the flipped classroom approach to teaching and…
Intercontinental Ballistic Missiles and their Role in Future Nuclear Forces
2017-05-01
they cannot carry nuclear weapons. The B-52 relies entirely on the ALCM, whereas the B-2 currently relies on unguided bombs. A new stealthy bomber...SLBM program within the next five to seven years to maintain SLBM availability into the 2050s and beyond. ...The new bomb will be used by stealthy...accuracy combination in an ICBM, an SLBM, a guided bomb, or a cruise missile. Similarly, speed of response and in-flight survivability favor ICBMs
Site Specific Probable Maximum Precipitation Estimates and Professional Judgement
NASA Astrophysics Data System (ADS)
Hayes, B. D.; Kao, S. C.; Kanney, J. F.; Quinlan, K. R.; DeNeale, S. T.
2015-12-01
State and federal regulatory authorities currently rely upon the US National Weather Service Hydrometeorological Reports (HMRs) to determine probable maximum precipitation (PMP) estimates (i.e., rainfall depths and durations) for estimating flooding hazards for relatively broad regions in the US. PMP estimates for the contributing watersheds upstream of vulnerable facilities are used to estimate riverine flooding hazards, while site-specific estimates for small watersheds are appropriate for individual facilities such as nuclear power plants. The HMRs are often criticized for their limitations on basin size, questionable applicability in regions affected by orographic effects, lack of consistent methods, and generally for their age. HMR-51, which provides generalized PMP estimates for the United States east of the 105th meridian, was published in 1978 and is sometimes perceived as overly conservative. The US Nuclear Regulatory Commission (NRC) is currently reviewing several flood hazard evaluation reports that rely on commercially developed site-specific PMP estimates. As such, NRC has recently investigated key areas of expert judgement via a generic audit and one in-depth site-specific review as they relate to identifying and quantifying actual and potential storm moisture sources, determining storm transposition limits, and adjusting available moisture during storm transposition. Though much of the approach reviewed was considered a logical extension of the HMRs, two key points of expert judgement stood out for further in-depth review. The first relates primarily to small storms and the use of a heuristic for storm representative dew point adjustment, developed for the Electric Power Research Institute by North American Weather Consultants in 1993, in order to harmonize historic storms for which only 12-hour dew point data were available with more recent storms in a single database. The second issue relates to the use of climatological averages for spatially interpolating 100-year dew point values rather than a more gauge-based approach. Site-specific reviews demonstrated that both issues had the potential to lower the PMP estimate significantly by affecting the in-place and transposed moisture maximization value and, in turn, the final controlling storm for a given basin size and PMP estimate.
Validation of Commercial Fiber Optic Components for Aerospace Environments
NASA Technical Reports Server (NTRS)
Ott, Melanie N.
2005-01-01
Full qualification of commercial photonic parts, as defined by the Military specification system in the past, is not feasible. Due to changes in the photonic components industry and in the Military specification system that NASA had relied upon so heavily in the past, an approach to technology validation of commercial off-the-shelf parts had to be devised. This approach involves knowledge of system requirements, environmental requirements and failure modes of the particular components under consideration. Synthesizing these criteria together with the major known failure modes to formulate a test plan is an effective way of establishing knowledge-based "qualification". Although this does not provide the type of reliability assurance that the Military specification system did in the past, it is an approach that allows for increased risk mitigation. The information presented will introduce the audience to the technology validation approach that is currently applied at NASA for the usage of commercial-off-the-shelf (COTS) fiber optic components for space flight environments. The focus will be on how to establish technology validation criteria for commercial fiber products such that continued reliable performance is assured under the harsh environmental conditions of typical missions. The goal of this presentation is to provide the audience with an approach to formulating a COTS qualification test plan for these devices. Examples from past NASA missions will be discussed.
2009-01-01
Collaborative care models for depression in primary care are effective and cost-effective, but difficult to spread to new sites. Translating Initiatives for Depression into Effective Solutions (TIDES) is an initiative to promote evidence-based collaborative care in the U.S. Veterans Health Administration (VHA). Social marketing applies marketing techniques to promote positive behavior change. As described in this paper, TIDES used a social marketing approach to foster national spread of collaborative care models. TIDES social marketing approach: The approach relied on a sequential model of behavior change and explicit attention to audience segmentation. Segments included VHA national leadership, Veterans Integrated Service Network (VISN) regional leadership, facility managers, frontline providers, and veterans. TIDES communications, materials and messages targeted each segment, guided by an overall marketing plan. Results: Depression collaborative care based on the TIDES model was adopted by VHA as part of the new Primary Care Mental Health Initiative and associated policies. It is currently in use in more than 50 primary care practices across the United States, and continues to spread, suggesting success for its social marketing-based dissemination strategy. Discussion and conclusion: Development, execution and evaluation of the TIDES marketing effort show that social marketing is a promising approach for promoting implementation of evidence-based interventions in integrated healthcare systems. PMID:19785754
A Conceptual Modeling Approach for OLAP Personalization
NASA Astrophysics Data System (ADS)
Garrigós, Irene; Pardillo, Jesús; Mazón, Jose-Norberto; Trujillo, Juan
Data warehouses rely on multidimensional models in order to provide decision makers with appropriate structures to intuitively analyze data with OLAP technologies. However, data warehouses may be very large, and multidimensional structures become increasingly complex to understand at a glance. Even if a departmental data warehouse (also known as a data mart) is used, these structures would still be too complex. As a consequence, acquiring the required information is more costly than expected and decision makers using OLAP tools may get frustrated. In this context, current approaches for data warehouse design are focused on deriving a unique OLAP schema for all analysts from their previously stated information requirements, which is not enough to lighten the complexity of the decision-making process. To overcome this drawback, we argue for personalizing multidimensional models for OLAP technologies according to the continuously changing user characteristics, context, requirements and behaviour. In this paper, we present a novel approach to personalizing OLAP systems at the conceptual level based on the underlying multidimensional model of the data warehouse, a user model and a set of personalization rules. The great advantage of our approach is that a personalized OLAP schema is provided for each decision maker, contributing to better satisfying their specific analysis needs. Finally, we show the applicability of our approach through a sample scenario based on our CASE tool for data warehouse development.
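To make the rule-based personalization idea concrete, the following is a toy sketch in which a set of rules inspects a simple user model and trims a multidimensional schema accordingly. The schema, user model, rule format and rule contents are invented for illustration and do not reflect the authors' metamodel or CASE tool.

```python
# Toy sketch of conceptual-level OLAP personalization: a set of rules inspects
# a user model and hides or keeps parts of a multidimensional schema.
# Schema, user model and rules are invented for illustration.

schema = {
    "fact": "Sales",
    "measures": ["amount", "quantity", "discount"],
    "dimensions": {"Time": ["year", "quarter", "month", "day"],
                   "Product": ["category", "brand", "item"],
                   "Store": ["country", "region", "city"]},
}

user = {"role": "regional_manager", "region": "EMEA", "detail_level": "medium"}

def rule_hide_fine_grain(schema, user):
    """If the user prefers a medium detail level, drop the finest level of each dimension."""
    if user.get("detail_level") == "medium":
        for dim, levels in schema["dimensions"].items():
            schema["dimensions"][dim] = levels[:-1]
    return schema

def rule_limit_measures(schema, user):
    """Regional managers only see monetary measures in this toy policy."""
    if user.get("role") == "regional_manager":
        schema["measures"] = [m for m in schema["measures"] if m != "quantity"]
    return schema

def personalize(schema, user, rules):
    """Apply each rule to a copy of the schema, yielding a per-user OLAP schema."""
    personalized = {**schema,
                    "dimensions": {d: list(l) for d, l in schema["dimensions"].items()},
                    "measures": list(schema["measures"])}
    for rule in rules:
        personalized = rule(personalized, user)
    return personalized

print(personalize(schema, user, [rule_hide_fine_grain, rule_limit_measures]))
```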
Michelin, Paul; Kasprzak, Kevin; Dacher, Jean Nicolas; Lefebvre, Valentin; Duparc, Fabrice
2015-08-01
In the literature, shoulder ultrasound (US) protocols rely on the widely accepted anatomical concept of the infraspinatus tendon (IST) running parallel and posterior to the supraspinatus tendon (SST). To assess the IST, authors currently recommend placing the transducer posteroinferior to the acromion; however, the examination of the anterosuperior part of the IST remains problematic. The aim of our study was to apply recent anatomical knowledge to propose a simple protocol to assess the IST over its entire width, including its anterosuperior margin. Six non-diseased shoulders from four cadavers were assessed in the hyperextended internal rotation (HIR) position with an anterosuperolateral US approach, followed by dissection. Twelve healthy volunteers underwent a similar US examination of the shoulder. The IST is a thin, wide, strap-like tendon. The HIR position exposed the largest area of the IST beyond the acromion; the combined anterosuperolateral US approach enabled imaging of the IST over its entire width with transverse and longitudinal views. The anterosuperior margin of the IST was distinguishable from the SST. The anterosuperolateral US approach in the HIR position enables an accurate assessment of the IST, including in the transverse plane. The limit between the SST and IST appears more clearly. • The hyperextended internal rotation of the shoulder brings the infraspinatus tendon forward. • The infraspinatus tendon is visible with an anterosuperolateral ultrasound approach. • The anterosuperior margin of the infraspinatus tendon is visible with this technique.
Vu, Dai Long; Ranglová, Karolína; Hájek, Jan; Hrouzek, Pavel
2018-05-01
Quantification of selenated amino-acids currently relies on methods employing inductively coupled plasma mass spectrometry (ICP-MS). Although very accurate, these methods do not allow the simultaneous determination of standard amino-acids, hampering the comparison of the content of selenated versus non-selenated species such as methionine (Met) and selenomethionine (SeMet). This paper reports two approaches for the simultaneous quantification of Met and SeMet. In the first approach, standard enzymatic hydrolysis employing Protease XIV was applied for the preparation of samples. The second approach utilized methanesulfonic acid (MA) for the hydrolysis of samples, either in a reflux system or in a microwave oven, followed by derivatization with diethyl ethoxymethylenemalonate. The prepared samples were then analyzed by multiple reaction monitoring high performance liquid chromatography tandem mass spectrometry (MRM-HPLC-MS/MS). Both approaches provided platforms for the accurate determination of selenium/sulfur substitution rate in Met. Moreover the second approach also provided accurate simultaneous quantification of Met and SeMet with a low limit of detection, low limit of quantification and wide linearity range, comparable to the commonly used gas chromatography mass spectrometry (GC-MS) method or ICP-MS. The novel method was validated using certified reference material in conjunction with the GC-MS reference method. Copyright © 2018. Published by Elsevier B.V.
NASA Astrophysics Data System (ADS)
Rouillon, M.; Taylor, M. P.; Dong, C.
2016-12-01
This research assesses the advantages of integrating field portable X-ray fluorescence (pXRF) technology into metal-contaminated site assessments to reduce risk and increase confidence in decision making. Metal-contaminated sites are often highly heterogeneous and require a high sampling density to accurately characterize the distribution and concentration of contaminants. Current regulatory assessment approaches rely on a small number of samples processed using standard wet-chemistry methods. In New South Wales (NSW), Australia, the current notification trigger for characterizing metal-contaminated sites requires the upper 95% confidence interval of the site mean to equal or exceed the relevant guidelines. The method's low 'minimum' sampling requirements can misclassify sites due to the heterogeneous nature of soil contamination, leading to inaccurate decision making. To address this issue, we propose integrating infield pXRF analysis with the established sampling method to overcome sampling limitations. This approach increases the minimum sampling resolution and reduces the 95% CI of the site mean. Infield pXRF analysis at contamination hotspots enhances sample resolution efficiently and without the need to return to the site. In this study, the current and proposed pXRF site assessment methods are compared at five heterogeneous metal-contaminated sites by analysing the spatial distribution of contaminants, the 95% confidence intervals of site means, and the sampling and analysis uncertainty associated with each method. Finally, an analysis of the costs associated with both the current and proposed methods is presented to demonstrate the advantages of incorporating pXRF into metal-contaminated site assessments. The data show that pXRF-integrated site assessments allow faster, more cost-efficient characterisation of metal-contaminated sites with greater confidence for decision making.
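Since the notification trigger above compares the upper 95% confidence limit of the site mean against a guideline, a minimal sketch of that decision statistic is shown below using a one-sided Student's t interval. The concentrations and guideline value are made up for illustration; they are not from the study. Adding samples, as the pXRF approach allows, shrinks the standard error and hence the upper confidence limit.

```python
import numpy as np
from scipy import stats

# Minimal sketch of the decision statistic described above: the upper 95%
# confidence limit (UCL) of the site-mean concentration is compared with a
# guideline value. Concentrations and the guideline are made up.

def upper_95_ucl(concentrations):
    x = np.asarray(concentrations, dtype=float)
    n = x.size
    se = x.std(ddof=1) / np.sqrt(n)
    t_crit = stats.t.ppf(0.95, df=n - 1)          # one-sided 95% critical value
    return x.mean() + t_crit * se

lead_mg_kg = [180, 220, 640, 95, 310, 150, 870, 60, 410, 205]   # hypothetical pXRF results
guideline = 300.0                                               # hypothetical trigger (mg/kg)

ucl = upper_95_ucl(lead_mg_kg)
print(f"site mean = {np.mean(lead_mg_kg):.0f} mg/kg, 95% UCL = {ucl:.0f} mg/kg")
print("exceeds guideline" if ucl >= guideline else "below guideline")
```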
Spatial redistribution of nano-particles using electrokinetic micro-focuser
NASA Astrophysics Data System (ADS)
Garcia, Daniel E.; Silva, Aleidy; Ho, Chih-Ming
2007-09-01
Current microfabrication technologies rely on top-down, photolithographic techniques that are ultimately limited by the wavelength of light. While systems for nanofabrication do exist, they frequently suffer from high costs and slow processing times, creating a need for a new manufacturing paradigm. The combination of top-down and bottom-up fabrication approaches in device construction creates a new paradigm in micro- and nano-manufacturing. The prerequisite for the realization of this manufacturing paradigm is the manipulation of molecules in a deterministic and controlled manner. The use of AC electrokinetic forces, such as dielectrophoresis (DEP) and AC electroosmosis, is a promising technology for manipulating nano-sized particles in a parallel fashion. A three-electrode micro-focusing system was designed to exploit these forces in order to control the spatial distribution of nano-particles in different frequency ranges. Thus far, we have demonstrated the ability to concentrate 40 nm and 300 nm diameter particles using a 50 μm diameter focusing system. AC electroosmotic motion of the nano-particles was observed at low frequencies (in the range of 30 Hz to 1 kHz). By using different frequencies and changing the ground location, we have manipulated the nano-particles into circular band structures with different widths, and focused the nanoparticles into circular spots with different diameters. Currently, we are in the process of optimizing the operation parameters (e.g. frequency and AC voltages) by using the technique of particle image velocimetry (PIV). In the future, the design of different electrode geometries and the numerical simulation of the electric field distribution will be carried out to manipulate the nano-particles into a variety of geometries.
A fast optimization approach for treatment planning of volumetric modulated arc therapy.
Yan, Hui; Dai, Jian-Rong; Li, Ye-Xiong
2018-05-30
Volumetric modulated arc therapy (VMAT) is widely used in clinical practice. It not only significantly reduces treatment time, but also produces high-quality treatment plans. Current optimization approaches rely heavily on stochastic algorithms, which are time-consuming and less repeatable. In this study, a novel approach is proposed to provide a highly efficient optimization algorithm for VMAT treatment planning. A progressive sampling strategy is employed for the beam arrangement of VMAT planning. Initial, equally spaced beams are added to the plan at a coarse sampling resolution. Fluence-map optimization and leaf sequencing are performed for these beams. Then, the coefficients of the fluence-map optimization algorithm are adjusted according to the known fluence maps of these beams. In the next round the sampling resolution is doubled and more beams are added. This process continues until the total number of beams is reached. The performance of the VMAT optimization algorithm was evaluated using three clinical cases and compared to that of a commercial planning system. The dosimetric quality of the VMAT plans is equal to or better than that of the corresponding IMRT plans for the three clinical cases. The maximum dose to critical organs is reduced considerably for VMAT plans compared to IMRT plans, especially in the head and neck case. The total number of segments and monitor units is reduced for VMAT plans. For the three clinical cases, VMAT optimization takes less than 5 min using the proposed approach, 3-4 times less than the commercial system. The proposed VMAT optimization algorithm is able to produce high-quality VMAT plans efficiently and consistently. It presents a new way to accelerate the current optimization process of VMAT planning.
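The progressive sampling loop described above can be sketched in outline as follows. The helper functions are hypothetical stubs standing in for the real fluence-map optimization, leaf sequencing and coefficient adjustment; only the control flow (optimize the coarse beam set, adapt the coefficients, double the angular resolution until the target beam count is reached) mirrors the text.

```python
import random

# Skeleton of the progressive beam-sampling strategy described above.
# optimize_fluence, leaf_sequence and adjust_coefficients are hypothetical
# stubs; only the overall control flow follows the description.

def optimize_fluence(angle, coefficients):
    return [random.random() * coefficients["smoothness"] for _ in range(10)]

def leaf_sequence(fluence_map):
    return sorted(fluence_map)                     # placeholder for real sequencing

def adjust_coefficients(fluence_maps, coefficients):
    mean_intensity = sum(sum(m) for m in fluence_maps.values()) / len(fluence_maps)
    return {"smoothness": 1.0 / (1.0 + mean_intensity)}   # illustrative update

def progressive_vmat_planning(total_beams=64, initial_beams=8):
    beam_angles = [i * 360.0 / initial_beams for i in range(initial_beams)]
    fluence_maps, coefficients = {}, {"smoothness": 1.0}
    while True:
        for angle in beam_angles:                  # optimize the current coarse set
            fluence_maps[angle] = optimize_fluence(angle, coefficients)
            leaf_sequence(fluence_maps[angle])
        coefficients = adjust_coefficients(fluence_maps, coefficients)
        if len(beam_angles) >= total_beams:
            return fluence_maps
        # double the sampling resolution: add a beam midway between neighbours
        spacing = 360.0 / len(beam_angles)
        beam_angles = sorted(set(beam_angles + [(a + spacing / 2) % 360 for a in beam_angles]))

if __name__ == "__main__":
    plan = progressive_vmat_planning()
    print(f"{len(plan)} beams optimized")
```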
Towards sustainable and renewable systems for electrochemical energy storage.
Tarascon, Jean-Marie
2008-01-01
Renewable energy sources and electric automotive transportation are popular topics in our belated energy-conscious society, placing electrochemical energy management as one of the major technological developments for this new century. Besides efficiency, any new storage technologies will have to provide advantages in terms of cost and environmental footprint and thus rely on sustainable materials that can be processed at low temperature. To meet such challenges future devices will require inspiration from living organisms and rely on either bio-inspired or biomimetic approaches.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Biedermann, G. W.; McGuinness, H. J.; Rakholia, A. V.
Here, we demonstrate matter-wave interference in a warm vapor of rubidium atoms. Established approaches to light-pulse atom interferometry rely on laser cooling to concentrate a large ensemble of atoms into a velocity class resonant with the atom optical light pulse. In our experiment, we show that clear interference signals may be obtained without laser cooling. This effect relies on the Doppler selectivity of the atom interferometer resonance. Lastly, this interferometer may be configured to measure accelerations, and we demonstrate that multiple interferometers may be operated simultaneously by addressing multiple velocity classes.
Systems biology impact on antiepileptic drug discovery.
Margineanu, Doru Georg
2012-02-01
Systems biology (SB), a recent trend in bioscience research to consider the complex interactions in biological systems from a holistic perspective, sees disease as a disturbed network of interactions rather than an alteration of single molecular component(s). SB-based network pharmacology replaces the prevailing focus on specific drug-receptor interactions, and the corollary of rational design of "magic bullets", with the search for multi-target drugs that would act on biological networks as "magic shotguns". Epilepsy being a multi-factorial, polygenic and dynamic pathology, the SB approach appears particularly fitting and promising for antiepileptic drug (AED) discovery. In fact, long before the advent of SB, AED discovery already involved some SB-like elements. A reported SB project aiming to find new drug targets in epilepsy relies on a relational database that integrates clinical information, recordings from deep electrodes and 3D brain imagery with histology and molecular biology data on modified expression of specific genes in the brain regions displaying spontaneous epileptic activity. Since hitting a single target does not treat complex diseases, a proper pharmacological promiscuity might impart on an AED the merit of being multi-potent. However, multi-target drug discovery entails the complicated task of optimizing multiple activities of compounds, while having to balance drug-like properties and to control unwanted effects. Specific design tools for this new approach in drug discovery are only beginning to emerge, but computational methods making reliable in silico predictions of poly-pharmacology have appeared, and their progress might be quite rapid. The current move away from reductionism towards network pharmacology makes it reasonable to expect that a proper integration of the intrinsic complexity of epileptic pathology into AED discovery might result in literally anti-epileptic drugs. Copyright © 2011 Elsevier B.V. All rights reserved.
Proactive Fault Tolerance for HPC with Xen Virtualization
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nagarajan, Arun Babu; Mueller, Frank; Engelmann, Christian
2007-01-01
with thousands of processors. At such large counts of compute nodes, faults are becoming commonplace. Current techniques to tolerate faults focus on reactive schemes to recover from faults and generally rely on a checkpoint/restart mechanism. Yet, in today's systems, node failures can often be anticipated by detecting a deteriorating health status. Instead of a reactive scheme for fault tolerance (FT), we are promoting a proactive one where processes automatically migrate from unhealthy nodes to healthy ones. Our approach relies on operating system virtualization techniques exemplified by, but not limited to, Xen. This paper contributes an automatic and transparent mechanism for proactive FT for arbitrary MPI applications. It leverages virtualization techniques combined with health monitoring and load-based migration. We exploit Xen's live migration mechanism for a guest operating system (OS) to migrate an MPI task from a health-deteriorating node to a healthy one without stopping the MPI task during most of the migration. Our proactive FT daemon orchestrates the tasks of health monitoring, load determination and initiation of guest OS migration. Experimental results demonstrate that live migration hides migration costs and limits the overhead to only a few seconds, making it an attractive approach to realize FT in HPC systems. Overall, our enhancements make proactive FT a valuable asset for long-running MPI applications that is complementary to reactive FT using full checkpoint/restart schemes, since checkpoint frequencies can be reduced as fewer unanticipated failures are encountered. In the context of OS virtualization, we believe that this is the first comprehensive study of proactive fault tolerance where live migration is actually triggered by health monitoring.
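The monitor-and-migrate loop that such a daemon implements can be sketched schematically as follows. The health source, threshold and node bookkeeping are illustrative assumptions rather than the paper's daemon, and the `xm migrate --live` invocation is shown only as a stand-in for the classic Xen toolstack's live-migration command that the approach builds on.

```python
import subprocess
import time

# Schematic sketch of the proactive fault-tolerance loop described above:
# poll node health, and when a node's health deteriorates past a threshold,
# live-migrate its guest (and the MPI task inside it) to a healthy spare.
# Sensor values, thresholds and the read_health stub are illustrative.

TEMPERATURE_LIMIT_C = 75.0

def read_health(node):
    """Hypothetical stub standing in for lm_sensors/IPMI health monitoring."""
    return {"cpu_temp_c": 68.0, "fan_rpm": 4200}

def pick_spare(spares):
    """Choose the coolest spare node as the migration target."""
    return min(spares, key=lambda n: read_health(n)["cpu_temp_c"])

def migrate_guest(guest, target):
    # Live-migrate the guest OS without stopping it (classic Xen toolstack call).
    subprocess.run(["xm", "migrate", "--live", guest, target], check=True)

def monitor(nodes, guests, spares, poll_seconds=10):
    while True:
        for node in list(nodes):
            if read_health(node)["cpu_temp_c"] > TEMPERATURE_LIMIT_C:
                target = pick_spare(spares)
                migrate_guest(guests[node], target)
                spares.remove(target)
                spares.append(node)               # reclaim the node once it recovers
        time.sleep(poll_seconds)
```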
Ducrotoy, Jean-Paul; Dauvin, Jean-Claude
2008-01-01
Megatidal estuaries such as the Seine and the Somme (North-Western France) are rather well delimited and human impacts on them are well understood. Since the middle of the 19th Century, there has been a slow but irreversible degradation of the state of these English Channel estuaries. However, current conservation and restoration strategies tend to freeze habitats in a particular state, their status being defined, most often, through a patrimonial or utilitarian approach. Connectedness between biotopes (sensu habitat+community) has a tendency to be neglected, especially with regard to main ecological gradients, i.e., salinity. In this paper, evaluation methodologies are proposed with the intention of assessing changes to ecosystem functions, under anthropogenic disturbance, controlled or otherwise. The Seine (a heavily industrialised ecosystem) is compared to the Somme (considered here for its pseudo-natural features) in order to discriminate between oceanic processes (siltation and plugging of estuaries) and anthropogenic influences. Preservation and restoration of habitats rely on a robust scientific methodology. The multi-scale approach adopted in the projects presented here relies on sensitive socio-ecological assessment procedures, tools for evaluating ecological quality, and well-built monitoring programmes based upon pertinent indicators. Such managerial tools were used to refine strategies and make them compatible with the sustainable co-development of resources in a European context. This paper demonstrates how scientists were able to acquire and apply knowledge in the field of rehabilitation and restoration. Jointly with managers and policy-makers, they have brought scientific information and socio-economics together in order to answer questions about the restoration of sites or habitats and to anticipate future propositions in the spirit of Integrated Coastal Zone Management (ICZM).
DOE Office of Scientific and Technical Information (OSTI.GOV)
Morton, April M; McManamay, Ryan A; Nagle, Nicholas N
As urban areas continue to grow and evolve in a world of increasing environmental awareness, the need for high-resolution, spatially explicit estimates of energy and water demand has become increasingly important. Though current modeling efforts mark significant progress in the effort to better understand the spatial distribution of energy and water consumption, many are provided at a coarse spatial resolution or rely on techniques which depend on detailed region-specific data sources that are not publicly available for many parts of the U.S. Furthermore, many existing methods do not account for errors in input data sources and may therefore not accurately reflect inherent uncertainties in model outputs. We propose an alternative and more flexible Monte-Carlo simulation approach to high-resolution residential and commercial electricity and water consumption modeling that relies primarily on publicly available data sources. The method's flexible data requirements and statistical framework ensure that the model is both applicable to a wide range of regions and reflective of uncertainties in model results. Key words: Energy Modeling, Water Modeling, Monte-Carlo Simulation, Uncertainty Quantification
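The core Monte-Carlo idea, propagating input uncertainty through to the demand estimate, can be illustrated with the short sketch below. The inputs, distributions and numbers are invented for illustration; they are not the model's actual data sources or parameters.

```python
import numpy as np

# Minimal sketch of the Monte-Carlo idea: uncertain inputs (here, a building
# count and a per-building electricity use rate) are sampled repeatedly so the
# output demand estimate carries an uncertainty range instead of a single
# number. Distributions and values are invented.

rng = np.random.default_rng(42)
n_draws = 10_000

# Input uncertainty: census-style building count and survey-style use rate.
buildings = rng.normal(loc=320, scale=15, size=n_draws).clip(min=0)
kwh_per_building = rng.lognormal(mean=np.log(900), sigma=0.25, size=n_draws)

block_demand_mwh = buildings * kwh_per_building / 1000.0

low, mid, high = np.percentile(block_demand_mwh, [5, 50, 95])
print(f"monthly demand ~ {mid:.0f} MWh (90% interval {low:.0f}-{high:.0f} MWh)")
```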
Tobón, Diana P.; Jayaraman, Srinivasan
2017-01-01
The last few years have seen a proliferation of wearable electrocardiogram (ECG) devices in the market, with applications in fitness tracking, patient monitoring, athletic performance assessment, stress and fatigue detection, and biometrics, to name a few. The majority of these applications rely on the computation of the heart rate (HR) and the so-called heart rate variability (HRV) index via time-, frequency-, or non-linear-domain approaches. Wearable/portable devices, however, are highly susceptible to artifacts, particularly those resulting from movement. These artifacts can hamper HR/HRV measurement and thus pose a serious threat to cardiac monitoring applications. While current solutions rely on ECG enhancement as a pre-processing step prior to HR/HRV calculation, existing artifact removal algorithms still perform poorly under extremely noisy scenarios. To overcome this limitation, we take an alternate approach and propose the use of a spectro-temporal ECG signal representation that we show separates cardiac components from artifacts. More specifically, by quantifying the rate of change of ECG spectral components over time, we show that heart rate estimates can be reliably obtained even in extremely noisy signals, thus bypassing the need for ECG enhancement. With such HR measurements in hand, we then propose a new noise-robust HRV index termed MD-HRV (modulation-domain HRV), computed as the standard deviation of the obtained HR values. Experiments with synthetic ECG signals corrupted at various signal-to-noise levels, as well as recorded noisy signals, show the proposed measure outperforming several HRV benchmark parameters computed after wavelet-based enhancement. These findings suggest that the proposed HR measures and derived MD-HRV metric are well suited for ambulant cardiac monitoring applications, particularly those involving intense movement (e.g., elite athletic training). PMID:29255653
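A simplified version of this pipeline (track the dominant spectral component in the cardiac band over time, convert it to HR, and take the standard deviation of those HR estimates as MD-HRV) is sketched below. The synthetic signal, band limits and spectrogram settings are illustrative assumptions, not the authors' exact processing chain.

```python
import numpy as np
from scipy import signal

# Minimal sketch of the spectro-temporal idea: instead of enhancing the ECG,
# track the dominant spectral peak in the cardiac band over time, read HR
# from it, and take the standard deviation of those HR estimates as the
# modulation-domain HRV index (MD-HRV).

fs = 250.0                                    # sampling rate (Hz)
t = np.arange(0, 60, 1 / fs)
hr_hz = 1.2 + 0.05 * np.sin(2 * np.pi * 0.1 * t)       # slowly varying ~72 bpm rhythm
ecg_like = np.cos(2 * np.pi * np.cumsum(hr_hz) / fs) + 0.8 * np.random.randn(t.size)

f, seg_times, Sxx = signal.spectrogram(ecg_like, fs=fs, nperseg=2048, noverlap=1024)
band = (f >= 0.7) & (f <= 3.0)                # plausible cardiac band

hr_estimates = 60.0 * f[band][np.argmax(Sxx[band, :], axis=0)]
md_hrv = np.std(hr_estimates)

print(f"mean HR = {hr_estimates.mean():.1f} bpm, MD-HRV = {md_hrv:.2f} bpm")
```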
ERIC Educational Resources Information Center
Sakurai, Yusuke; Pyhältö, Kirsi; Lindblom-Ylänne, Sari
2014-01-01
This article is based on a study which investigated whether Chinese international students at a university in Finland are more likely to rely on a Surface approach to learning and dismiss a Deep approach than are other international students in the same university educational context. In responding to a survey, students' scores with respect to the…
28 CFR 35.139 - Direct threat.
Code of Federal Regulations, 2011 CFR
2011-07-01
... public entity must make an individualized assessment, based on reasonable judgment that relies on current medical knowledge or on the best available objective evidence, to ascertain: the nature, duration, and...
A review of group ICA for fMRI data and ICA for joint inference of imaging, genetic, and ERP data
Calhoun, Vince D.; Liu, Jingyu; Adalı, Tülay
2009-01-01
Independent component analysis (ICA) has become an increasingly utilized approach for analyzing brain imaging data. In contrast to the widely used general linear model (GLM) that requires the user to parameterize the data (e.g. the brain's response to stimuli), ICA, by relying upon a general assumption of independence, allows the user to be agnostic regarding the exact form of the response. In addition, ICA is intrinsically a multivariate approach, and hence each component provides a grouping of brain activity into regions that share the same response pattern thus providing a natural measure of functional connectivity. There are a wide variety of ICA approaches that have been proposed, in this paper we focus upon two distinct methods. The first part of this paper reviews the use of ICA for making group inferences from fMRI data. We provide an overview of current approaches for utilizing ICA to make group inferences with a focus upon the group ICA approach implemented in the GIFT software. In the next part of this paper, we provide an overview of the use of ICA to combine or fuse multimodal data. ICA has proven particularly useful for data fusion of multiple tasks or data modalities such as single nucleotide polymorphism (SNP) data or event-related potentials. As demonstrated by a number of examples in this paper, ICA is a powerful and versatile data-driven approach for studying the brain. PMID:19059344
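As a concrete, hedged illustration of the group-inference idea, the sketch below applies temporal-concatenation group ICA to simulated two-subject data using scikit-learn's FastICA. It only hints at the full GIFT pipeline (subject-level PCA reduction and back-reconstruction are omitted), and the data are simulated.

```python
import numpy as np
from sklearn.decomposition import FastICA

# Minimal sketch of temporal-concatenation group ICA on simulated data:
# each subject's (time x voxel) matrix is stacked in time and unmixed with
# FastICA, giving group time courses and shared spatial maps.

rng = np.random.default_rng(0)
n_time, n_voxels, n_subjects, n_components = 120, 500, 2, 3

spatial_sources = rng.standard_normal((n_components, n_voxels))
subject_data = [rng.standard_normal((n_time, n_components)) @ spatial_sources
                + 0.5 * rng.standard_normal((n_time, n_voxels))
                for _ in range(n_subjects)]

group = np.vstack(subject_data)                    # temporal concatenation: (2*T) x V

ica = FastICA(n_components=n_components, random_state=0)
time_courses = ica.fit_transform(group)            # (2*T) x components
spatial_maps = ica.mixing_.T                        # components x voxels

print("time courses:", time_courses.shape, "spatial maps:", spatial_maps.shape)
```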
Development of Mathematical Literacy: Results of an Empirical Study
ERIC Educational Resources Information Center
Kaiser, Gabriele; Willander, Torben
2005-01-01
In the paper the results of an empirical study, which has evaluated the development of mathematical literacy in an innovative teaching programme, are presented. The theoretical approach of mathematical literacy relies strongly on applications and modelling and the study follows the approach of R. Bybee, who develops a theoretical concept of…
Hermeneutic-Narrative Approach to Career Counselling: An Alternative to Postmodernism
ERIC Educational Resources Information Center
Thrif, Erin; Amundson, Norman
2005-01-01
Postmodern approaches to career counselling have been suggested as a viable alternative to traditional career theories that rely on modernist assumptions. However, some of the assumptions that underlie postmodernism may prove to be unhelpful to career development practice in the long run. In this article we examine critiques of postmodern…
Ability-Growth Interactions in the Acquisition of a Complex Skill: A Spline-Modeling Approach
ERIC Educational Resources Information Center
Schuelke, Matthew J.
2010-01-01
While investigating how the relationship of abilities and skill acquisition changes over the course of training, researchers have unknowingly obscured the very relationship they sought to examine by relying on analyses that focused on attainment and did not model acquisition. Although more recent approaches have modeled acquisition independently…
ERIC Educational Resources Information Center
Cattaneo, Matias D.; Titiunik, Rocío; Vazquez-Bare, Gonzalo
2017-01-01
The regression discontinuity (RD) design is a popular quasi-experimental design for causal inference and policy evaluation. The most common inference approaches in RD designs employ "flexible" parametric and nonparametric local polynomial methods, which rely on extrapolation and large-sample approximations of conditional expectations…
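To illustrate the local polynomial idea at the heart of these approaches, the sketch below fits separate local linear regressions on each side of the cutoff within a bandwidth and takes the difference of the fitted values at the cutoff as the effect estimate. The simulated data, uniform kernel and bandwidth choice are illustrative, not the authors' estimator or recommendations.

```python
import numpy as np

# Minimal sketch of local linear estimation in an RD design: fit a line on
# each side of the cutoff using only observations within bandwidth h, and
# take the difference of the two fitted values at the cutoff.

rng = np.random.default_rng(1)
n, cutoff, h = 2000, 0.0, 0.3
x = rng.uniform(-1, 1, n)                              # running variable
treated = (x >= cutoff).astype(float)
y = 0.5 * x + 0.8 * treated + rng.normal(0, 0.5, n)    # true jump = 0.8

def side_intercept(xs, ys):
    """Local linear fit (uniform kernel): value predicted at the cutoff."""
    coeffs = np.polyfit(xs - cutoff, ys, deg=1)
    return np.polyval(coeffs, 0.0)

left = (x < cutoff) & (x > cutoff - h)
right = (x >= cutoff) & (x < cutoff + h)
rd_estimate = side_intercept(x[right], y[right]) - side_intercept(x[left], y[left])
print(f"estimated discontinuity at the cutoff: {rd_estimate:.2f}")
```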
Turning around Failing Organizations: Insights for Educational Leaders
ERIC Educational Resources Information Center
Murphy, Joseph
2010-01-01
Purpose: In this article, we review the literature from the organizational sciences to develop a grounded narrative of turnaround in education. Approach: The approach is a review of literature. We employ an integrated process to unpack and make sense of the turnaround literature from the organizational sciences. We rely on strategies appropriate…
ERIC Educational Resources Information Center
Chen, Chung-Yang; Hong, Ya-Chun; Chen, Pei-Chi
2014-01-01
Software development relies heavily on teamwork; determining how to streamline this collaborative development is an essential training subject in computer and software engineering education. A team process known as the meetings-flow (MF) approach has recently been introduced in software capstone projects in engineering programs at various…
The Crystallization of Work Values in Adolescence: A Sociocultural Approach.
ERIC Educational Resources Information Center
Krau, Edgar
1987-01-01
Investigated crystallization of work values in adolescence through the normative approach which relies on conception of value enculturation. Results from 913 ninth- and twelfth-graders from Jewish, Arab, and Catholic monastic schools. Supported hypothesis that source of work values is the subculture of the social group of affiliation, which has…
Critical Methods in Longitudinal Research with Latino Immigrant Families
ERIC Educational Resources Information Center
Díaz, Yethzèll; Denner, Jill; Ortiz, Eloy
2017-01-01
We have an ethical and a scientific imperative to do research that reflects the views and learning experiences of historically marginalized groups. Most studies that use a critical methodological approach rely on qualitative data. This article describes how a critical approach to recruitment, data collection, and retention can help to ensure that…
Understanding Genetic Toxicity Through Data Mining: The ...
This paper demonstrates the usefulness of representing a chemical by its structural features and the use of these features to profile a battery of tests rather than relying on a single toxicity test of a given chemical. This paper presents data mining/profiling methods applied in a weight-of-evidence approach to assess potential for genetic toxicity, and to guide the development of intelligent testing strategies.
Approach for computing 1D fracture density: application to fracture corridor characterization
NASA Astrophysics Data System (ADS)
Viseur, Sophie; Chatelée, Sebastien; Akriche, Clement; Lamarche, Juliette
2016-04-01
Fracture density is an important parameter for characterizing fractured reservoirs. Many stochastic simulation algorithms that generate fracture networks indeed rely on the determination of a fracture density on volumes (P30) to populate the reservoir zones with individual fracture surfaces. However, only the 1D fracture density (P10) is available from subsurface data, so it is important to be able to estimate this quantity accurately. In this paper, a novel approach is proposed to estimate fracture density from scan-line or well data. This method relies on regression, hypothesis testing and clustering techniques. The objective of the proposed approach is to highlight zones where fracture densities are statistically very different or similar. This technique has been applied to both synthetic and real case studies. These studies concern fracture corridors, which are particular tectonic features that are generally difficult to characterize from subsurface data. These tectonic features are still not well known and studies must be conducted to better understand their internal spatial organization and variability. The presented synthetic cases aim to show the ability of the approach to extract known features. The real case study illustrates how this approach allows the internal spatial organization of fracture corridors to be characterized.
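For orientation, the sketch below computes P10 along a synthetic scan line in sliding windows and flags windows with anomalously high density, as a crude stand-in for the regression/hypothesis-testing/clustering workflow described above. Fracture positions, the window length and the flagging rule are invented for illustration.

```python
import numpy as np

# Minimal sketch of 1D fracture density (P10) along a scan line: count
# fracture intersections per unit length in fixed windows, then flag windows
# whose density departs strongly from the line-wide average.

fracture_positions_m = np.sort(np.concatenate([
    np.random.default_rng(3).uniform(0, 40, 60),       # background fracturing
    np.random.default_rng(4).uniform(45, 50, 80),       # a dense corridor
]))
line_length_m, window_m = 60.0, 5.0

edges = np.arange(0.0, line_length_m + window_m, window_m)
counts, _ = np.histogram(fracture_positions_m, bins=edges)
p10 = counts / window_m                                  # fractures per metre

background = p10.mean()
for start, density in zip(edges[:-1], p10):
    flag = "corridor?" if density > 2 * background else ""
    print(f"{start:4.0f}-{start + window_m:4.0f} m : P10 = {density:4.1f} /m {flag}")
```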
THE FUTURE OF TOXICOLOGY-PREDICTIVE TOXICOLOGY ...
A chemistry approach to predictive toxicology relies on structure−activity relationship (SAR) modeling to predict biological activity from chemical structure. Such approaches have proven capabilities when applied to well-defined toxicity end points or regions of chemical space. These approaches are less well-suited, however, to the challenges of global toxicity prediction, i.e., to predicting the potential toxicity of structurally diverse chemicals across a wide range of end points of regulatory and pharmaceutical concern. New approaches that have the potential to significantly improve capabilities in predictive toxicology are elaborating the “activity” portion of the SAR paradigm. Recent advances in two areas of endeavor are particularly promising. Toxicity data informatics relies on standardized data schema, developed for particular areas of toxicological study, to facilitate data integration and enable relational exploration and mining of data across both historical and new areas of toxicological investigation. Bioassay profiling refers to large-scale high-throughput screening approaches that use chemicals as probes to broadly characterize biological response space, extending the concept of chemical “properties” to the biological activity domain. The effective capture and representation of legacy and new toxicity data into mineable form and the large-scale generation of new bioassay data in relation to chemical toxicity, both employing chemical stru
Rethinking antibiotic research and development: World War II and the penicillin collaborative.
Quinn, Roswell
2013-03-01
Policy leaders and public health experts may be overlooking effective ways to stimulate innovative antibiotic research and development. I analyzed archival resources concerning the US government's efforts to produce penicillin during World War II, which demonstrate how much science policy can differ from present approaches. By contrast to current attempts to invigorate commercial participation in antibiotic development, the effort to develop the first commercially produced antibiotic did not rely on economic enticements or the further privatization of scientific resources. Rather, this extremely successful scientific and, ultimately, commercial endeavor was rooted in government stewardship, intraindustry cooperation, and the open exchange of scientific information. For policymakers facing the problem of stimulating antibiotic research and development, the origins of the antibiotic era offer a template for effective policy solutions that concentrate primarily on scientific rather than commercial goals.
Understanding adolescents' sleep patterns and school performance: a critical appraisal.
Wolfson, Amy R; Carskadon, Mary A
2003-12-01
The present paper reviews and critiques studies assessing the relation between sleep patterns, sleep quality, and school performance of adolescents attending middle school, high school, and/or college. The majority of studies relied on self-report, yet the researchers approached the question with different designs and measures. Specifically, studies looked at (1) sleep/wake patterns and usual grades, (2) school start time and phase preference in relation to sleep habits and quality and academic performance, and (3) sleep patterns and classroom performance (e.g., examination grades). The findings strongly indicate that self-reported shortened total sleep time, erratic sleep/wake schedules, late bed and rise times, and poor sleep quality are negatively associated with academic performance for adolescents from middle school through the college years. Limitations of the current published studies are also discussed in detail in this review.
Advanced image based methods for structural integrity monitoring: Review and prospects
NASA Astrophysics Data System (ADS)
Farahani, Behzad V.; Sousa, Pedro José; Barros, Francisco; Tavares, Paulo J.; Moreira, Pedro M. G. P.
2018-02-01
There is a growing trend in engineering to develop methods for structural integrity monitoring and characterization of the in-service mechanical behaviour of components. The fast growth in recent years of image processing techniques and image-based sensing for experimental mechanics has brought about a paradigm change in how phenomena are sensed. Hence, several widely applicable optical approaches are playing a significant role in support of experiment. The current review describes advanced image-based methods for structural integrity monitoring, and focuses on methods such as Digital Image Correlation (DIC), Thermoelastic Stress Analysis (TSA), Electronic Speckle Pattern Interferometry (ESPI) and Speckle Pattern Shearing Interferometry (Shearography). These non-contact full-field techniques rely on intensive image processing methods to measure mechanical behaviour, and evolve even as reviews such as this are being written, which justifies a special effort to keep abreast of this progress.
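The core operation behind DIC, matching a small reference subset to its displaced position in a deformed image, can be sketched as below. The images are synthetic random speckle, the matching criterion is a plain zero-normalized cross-correlation, and subpixel interpolation and strain computation, which real DIC requires, are omitted.

```python
import numpy as np

# Minimal sketch of the core DIC operation: take a small subset from a
# reference image, search a window in the deformed image, and pick the offset
# with the highest zero-normalized cross-correlation (ZNCC).

def zncc(a, b):
    a = a - a.mean()
    b = b - b.mean()
    denom = np.sqrt((a ** 2).sum() * (b ** 2).sum())
    return (a * b).sum() / denom if denom else 0.0

rng = np.random.default_rng(7)
reference = rng.random((120, 120))
true_shift = (3, -2)                                   # rows, cols
deformed = np.roll(reference, true_shift, axis=(0, 1))

r0, c0, size, search = 50, 50, 21, 6                   # subset corner, size, search radius
subset = reference[r0:r0 + size, c0:c0 + size]

best, best_score = (0, 0), -1.0
for dr in range(-search, search + 1):
    for dc in range(-search, search + 1):
        candidate = deformed[r0 + dr:r0 + dr + size, c0 + dc:c0 + dc + size]
        score = zncc(subset, candidate)
        if score > best_score:
            best_score, best = score, (dr, dc)

print(f"recovered displacement {best}, ZNCC = {best_score:.3f} (true {true_shift})")
```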
Towards thermodynamics of universal horizons in Einstein-æther theory.
Berglund, Per; Bhattacharyya, Jishnu; Mattingly, David
2013-02-15
Holography grew out of black hole thermodynamics, which relies on the causal structure and general covariance of general relativity. In Einstein-æther theory, a generally covariant theory with a dynamical timelike unit vector, every solution breaks local Lorentz invariance, thereby grossly modifying the causal structure of gravity. However, there are still absolute causal boundaries, called "universal horizons," which are not Killing horizons yet obey a first law of black hole mechanics and must have an entropy if they do not violate a generalized second law. We couple a scalar field to the timelike vector and show via the tunneling approach that the universal horizon radiates as a blackbody at a fixed temperature, even if the scalar field equations also violate local Lorentz invariance. This suggests that the class of holographic theories may be much broader than currently assumed.
When Null Hypothesis Significance Testing Is Unsuitable for Research: A Reassessment.
Szucs, Denes; Ioannidis, John P A
2017-01-01
Null hypothesis significance testing (NHST) has several shortcomings that are likely contributing factors behind the widely debated replication crisis of (cognitive) neuroscience, psychology, and biomedical science in general. We review these shortcomings and suggest that, after sustained negative experience, NHST should no longer be the default, dominant statistical practice of all biomedical and psychological research. If theoretical predictions are weak we should not rely on all or nothing hypothesis tests. Different inferential methods may be most suitable for different types of research questions. Whenever researchers use NHST they should justify its use, and publish pre-study power calculations and effect sizes, including negative findings. Hypothesis-testing studies should be pre-registered and optimally raw data published. The current statistics lite educational approach for students that has sustained the widespread, spurious use of NHST should be phased out.
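As a concrete illustration of the pre-study power calculation recommended above (the specific numbers are conventional defaults, not recommendations from the paper), the snippet below computes the per-group sample size needed to detect a medium standardized effect in a two-sample t-test using statsmodels.

```python
from statsmodels.stats.power import TTestIndPower

# Small illustration of a pre-study power calculation: the sample size per
# group needed to detect a medium standardized effect (Cohen's d = 0.5) with
# 80% power at alpha = 0.05 in a two-sample t-test.

analysis = TTestIndPower()
n_per_group = analysis.solve_power(effect_size=0.5, alpha=0.05, power=0.80,
                                   alternative="two-sided")
print(f"required sample size per group: {n_per_group:.0f}")
```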
Nitrite Biosensing via Selective Enzymes—A Long but Promising Route
Almeida, M. Gabriela; Serra, Alexandra; Silveira, Celia M.; Moura, Jose J.G.
2010-01-01
The last decades have witnessed a steady increase of the social and political awareness for the need of monitoring and controlling environmental and industrial processes. In the case of nitrite ion, due to its potential toxicity for human health, the European Union has recently implemented a number of rules to restrict its level in drinking waters and food products. Although several analytical protocols have been proposed for nitrite quantification, none of them enable a reliable and quick analysis of complex samples. An alternative approach relies on the construction of biosensing devices using stable enzymes, with both high activity and specificity for nitrite. In this paper we review the current state-of-the-art in the field of electrochemical and optical biosensors using nitrite reducing enzymes as biorecognition elements and discuss the opportunities and challenges in this emerging market. PMID:22163541
Single-Molecule Plasmon Sensing: Current Status and Future Prospects
2017-01-01
Single-molecule detection has long relied on fluorescent labeling with high quantum-yield fluorophores. Plasmon-enhanced detection circumvents the need for labeling by allowing direct optical detection of weakly emitting and completely nonfluorescent species. This review focuses on recent advances in single molecule detection using plasmonic metal nanostructures as a sensing platform, particularly using a single particle–single molecule approach. In the past decade two mechanisms for plasmon-enhanced single-molecule detection have been demonstrated: (1) by plasmonically enhancing the emission of weakly fluorescent biomolecules, or (2) by monitoring shifts of the plasmon resonance induced by single-molecule interactions. We begin with a motivation regarding the importance of single molecule detection, and advantages plasmonic detection offers. We describe both detection mechanisms and discuss challenges and potential solutions. We finalize by highlighting the exciting possibilities in analytical chemistry and medical diagnostics. PMID:28762723
Griffith, James W; Sumner, Jennifer A; Raes, Filip; Barnhofer, Thorsten; Debeer, Elise; Hermans, Dirk
2012-12-01
Autobiographical memory is a multifaceted construct that is related to psychopathology and other difficulties in functioning. Across many studies, a variety of methods have been used to study autobiographical memory. The relationship between overgeneral autobiographical memory (OGM) and psychopathology has been of particular interest, and many studies of this cognitive phenomenon rely on the Autobiographical Memory Test (AMT) to assess it. In this paper, we examine several methodological approaches to studying autobiographical memory, and focus primarily on methodological and psychometric considerations in OGM research. We pay particular attention to what is known about the reliability, validity, and methodological variations of the AMT. The AMT has adequate psychometric properties, but there is great variability in methodology across studies that use it. Methodological recommendations and suggestions for future studies are presented. Copyright © 2011 Elsevier Ltd. All rights reserved.
Omnify: Investigating the Visibility and Effectiveness of Copyright Monitors
NASA Astrophysics Data System (ADS)
Potharaju, Rahul; Seibert, Jeff; Fahmy, Sonia; Nita-Rotaru, Cristina
The arms race between copyright agencies and P2P users is an ongoing and evolving struggle. On the one hand, content providers are using several techniques to stealthily find unauthorized distribution of copyrighted work in order to deal with the problem of Internet piracy. On the other hand, P2P users are relying increasingly on blacklists and anonymization methods in order to avoid detection. In this work, we propose a number of techniques to reveal copyright monitors' current approaches and evaluate their effectiveness. We apply these techniques on data we collected from more than 2.75 million BitTorrent swarms containing 71 million IP addresses. We provide strong evidence that certain nodes are indeed copyright monitors, show that monitoring is a world-wide phenomenon, and devise a methodology for generating blacklists for paranoid and conservative P2P users.
Localization from Visual Landmarks on a Free-Flying Robot
NASA Technical Reports Server (NTRS)
Coltin, Brian; Fusco, Jesse; Moratto, Zack; Alexandrov, Oleg; Nakamura, Robert
2016-01-01
We present the localization approach for Astrobee, a new free-flying robot designed to navigate autonomously on board the International Space Station (ISS). Astrobee will conduct experiments in microgravity, as well as assist astronauts and ground controllers. Astrobee replaces the SPHERES robots currently operating on the ISS, which were limited to a small operating cube because their localization system relied on triangulation from ultrasonic transmitters. Astrobee localizes with only monocular vision and an IMU, enabling it to traverse the entire US segment of the station. Features detected on a previously built map, optical flow information, and IMU readings are all integrated into an extended Kalman filter (EKF) to estimate the robot pose. We introduce several modifications to the filter to make it more robust to noise. Finally, we extensively evaluate the behavior of the filter on a two-dimensional testing surface.
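As a rough illustration of the sensor-fusion scheme described above, and not Astrobee's actual flight software, the sketch below runs a one-dimensional Kalman-style filter that propagates a position/velocity state with IMU-like acceleration readings and corrects it with occasional map-feature position fixes; the state model, noise levels, and measurement schedule are all invented.

```python
import numpy as np

dt = 0.1
F = np.array([[1.0, dt], [0.0, 1.0]])      # constant-velocity state transition
B = np.array([[0.5 * dt**2], [dt]])        # effect of an acceleration (IMU) input
H = np.array([[1.0, 0.0]])                 # measurement: position fix from map features
Q = 1e-3 * np.eye(2)                        # process noise (hypothetical)
R = np.array([[0.05]])                      # measurement noise (hypothetical)

x = np.zeros((2, 1))                        # state: [position, velocity]
P = np.eye(2)

def predict(x, P, accel):
    x = F @ x + B * accel                   # propagate with the IMU acceleration reading
    P = F @ P @ F.T + Q
    return x, P

def update(x, P, z):
    y = z - H @ x                           # innovation from a visual position fix
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)          # Kalman gain
    return x + K @ y, (np.eye(2) - K @ H) @ P

rng = np.random.default_rng(1)
true_pos, true_vel = 0.0, 0.0
for k in range(200):
    accel = 0.2 * np.sin(0.05 * k)
    true_vel += accel * dt
    true_pos += true_vel * dt
    x, P = predict(x, P, accel + rng.normal(0, 0.05))            # noisy IMU reading
    if k % 10 == 0:                                              # sparse visual fixes
        x, P = update(x, P, np.array([[true_pos + rng.normal(0, 0.2)]]))

print(x[0, 0], true_pos)   # estimated vs. true position
```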
Arrayed antibody library technology for therapeutic biologic discovery.
Bentley, Cornelia A; Bazirgan, Omar A; Graziano, James J; Holmes, Evan M; Smider, Vaughn V
2013-03-15
Traditional immunization and display antibody discovery methods rely on competitive selection amongst a pool of antibodies to identify a lead. While this approach has led to many successful therapeutic antibodies, targets have been limited to proteins which are easily purified. In addition, selection driven discovery has produced a narrow range of antibody functionalities focused on high affinity antagonism. We review the current progress in developing arrayed protein libraries for screening-based, rather than selection-based, discovery. These single molecule per microtiter well libraries have been screened in multiplex formats against both purified antigens and directly against targets expressed on the cell surface. This facilitates the discovery of antibodies against therapeutically interesting targets (GPCRs, ion channels, and other multispanning membrane proteins) and epitopes that have been considered poorly accessible to conventional discovery methods. Copyright © 2013. Published by Elsevier Inc.
Rethinking Antibiotic Research and Development: World War II and the Penicillin Collaborative
2013-01-01
Policy leaders and public health experts may be overlooking effective ways to stimulate innovative antibiotic research and development. I analyzed archival resources concerning the US government’s efforts to produce penicillin during World War II, which demonstrate how much science policy can differ from present approaches. By contrast to current attempts to invigorate commercial participation in antibiotic development, the effort to develop the first commercially produced antibiotic did not rely on economic enticements or the further privatization of scientific resources. Rather, this extremely successful scientific and, ultimately, commercial endeavor was rooted in government stewardship, intraindustry cooperation, and the open exchange of scientific information. For policymakers facing the problem of stimulating antibiotic research and development, the origins of the antibiotic era offer a template for effective policy solutions that concentrate primarily on scientific rather than commercial goals. PMID:22698031
When Null Hypothesis Significance Testing Is Unsuitable for Research: A Reassessment
Szucs, Denes; Ioannidis, John P. A.
2017-01-01
Null hypothesis significance testing (NHST) has several shortcomings that are likely contributing factors behind the widely debated replication crisis of (cognitive) neuroscience, psychology, and biomedical science in general. We review these shortcomings and suggest that, after sustained negative experience, NHST should no longer be the default, dominant statistical practice of all biomedical and psychological research. If theoretical predictions are weak, we should not rely on all-or-nothing hypothesis tests. Different inferential methods may be most suitable for different types of research questions. Whenever researchers use NHST they should justify its use, and publish pre-study power calculations and effect sizes, including negative findings. Hypothesis-testing studies should be pre-registered and, optimally, their raw data published. The current statistics-lite educational approach for students that has sustained the widespread, spurious use of NHST should be phased out. PMID:28824397
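One of the recommendations above, publishing pre-study power calculations, is easy to make concrete. The sketch below uses the statsmodels power module to ask how many participants per group a two-sample t test needs to detect a hypothetical standardized effect of d = 0.5 at alpha = 0.05 with 80% power; the effect size and design are assumptions chosen purely for illustration.

```python
from statsmodels.stats.power import TTestIndPower

# Pre-study power calculation for a two-sample t test (illustrative numbers).
analysis = TTestIndPower()
n_per_group = analysis.solve_power(effect_size=0.5,   # hypothesized Cohen's d
                                   alpha=0.05,         # two-sided significance level
                                   power=0.80,         # desired power
                                   alternative='two-sided')
print(round(n_per_group))   # ~64 participants per group

# Conversely: the power actually achieved if only 30 per group can be recruited.
print(analysis.solve_power(effect_size=0.5, nobs1=30, alpha=0.05,
                           power=None, alternative='two-sided'))
```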
Wang, Yuchen; Newman, Maureen R; Benoit, Danielle S W
2018-06-01
Impaired fracture healing is a major clinical problem that can lead to patient disability, prolonged hospitalization, and significant financial burden. Although the majority of fractures heal using standard clinical practices, approximately 10% suffer from delayed unions or non-unions. A wide range of factors contribute to the risk of nonunion, including internal factors, such as patient age, gender, and comorbidities, and external factors, such as the location and extent of injury. Current clinical approaches to treat nonunions include bone grafts and low-intensity pulsed ultrasound (LIPUS), which achieve clinical success only in select patients due to limitations including donor morbidities (grafts) and the necessity of fracture reduction (LIPUS), respectively. To date, therapeutic approaches for bone regeneration rely heavily on protein-based growth factors such as INFUSE, an FDA-approved scaffold for delivery of bone morphogenetic protein 2 (BMP-2). Small-molecule modulators and RNAi therapeutics are under development to circumvent challenges associated with traditional growth factors. While preclinical studies have shown promise, drug delivery has become a major hurdle stalling clinical translation. Therefore, this review surveys current therapies employed to stimulate fracture healing preclinically and clinically, with a focus on drug delivery systems for growth factors, parathyroid hormone (PTH), small molecules, and RNAi therapeutics, as well as recent advances and the future promise of fracture-targeted drug delivery. Copyright © 2018 Elsevier B.V. All rights reserved.
An adaptive ARX model to estimate the RUL of aluminum plates based on its crack growth
NASA Astrophysics Data System (ADS)
Barraza-Barraza, Diana; Tercero-Gómez, Víctor G.; Beruvides, Mario G.; Limón-Robles, Jorge
2017-01-01
A wide variety of Condition-Based Maintenance (CBM) techniques deal with the problem of predicting the time to an asset fault. Most statistical approaches rely on historical failure data that might not be available in several practical situations. To address this issue, practitioners might require self-starting approaches that consider only the available knowledge about the current degradation process and the asset operating context to update the prognostic model. Some authors use autoregressive (AR) models for this purpose, which are adequate when the asset operating context is constant; however, if it is variable, the accuracy of the models can be affected. In this paper, three autoregressive models with exogenous variables (ARX) were constructed, and their capability to estimate the remaining useful life (RUL) of a process was evaluated using the aluminum crack-growth problem as a case study. An existing stochastic model of aluminum crack growth was implemented and used to assess the RUL estimation performance of the proposed ARX models through extensive Monte Carlo simulations. Point and interval estimations were made based only on individual history, behavior, operating conditions, and failure thresholds. Both analytic and bootstrapping techniques were used in the estimation process. Finally, by including recursive parameter estimation and a forgetting factor, the ARX methodology adapts to changing operating conditions and maintains its focus on the current degradation level of an asset.
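The recursive parameter estimation with a forgetting factor mentioned at the end of the abstract can be sketched as recursive least squares applied to an ARX regressor. The model order, forgetting factor, and simulated degradation data below are illustrative assumptions, not the paper's models or data.

```python
import numpy as np

rng = np.random.default_rng(2)

# Simulate a first-order ARX process  y_k = a*y_{k-1} + b*u_{k-1} + noise,
# whose coefficient `a` drifts to mimic a changing operating context.
n = 500
u = rng.normal(0, 1, n)                       # exogenous input (e.g., load level)
y = np.zeros(n)
for k in range(1, n):
    a = 0.80 if k < 250 else 0.92             # drifting dynamics
    y[k] = a * y[k - 1] + 0.5 * u[k - 1] + rng.normal(0, 0.05)

# Recursive least squares with forgetting factor lam.
lam = 0.98
theta = np.zeros(2)                           # estimates of [a, b]
P = 1000.0 * np.eye(2)                        # large initial covariance
for k in range(1, n):
    phi = np.array([y[k - 1], u[k - 1]])      # ARX regressor
    K = (P @ phi) / (lam + phi @ P @ phi)     # gain
    theta = theta + K * (y[k] - phi @ theta)  # parameter update
    P = (P - np.outer(K, phi) @ P) / lam      # covariance update with forgetting

print(theta)   # tracks roughly [0.92, 0.5] after the simulated shift
```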
Role of automation in the ACRV operations
NASA Technical Reports Server (NTRS)
Sepahban, S. F.
1992-01-01
The Assured Crew Return Vehicle (ACRV) will provide the Space Station Freedom with a contingency means of return to Earth (1) of one disabled crew member during medical emergencies, (2) of all crew members in case of accidents or failures of SSF systems, and (3) in case of interruption of Space Shuttle flights. A wide range of vehicle configurations and system approaches are currently under study. The program requirements focus on minimizing life-cycle costs by ensuring simple operations and built-in reliability and maintainability. The ACRV philosophy of embedded operations is based on maximum use of existing facilities, resources, and processes, while minimizing the interfaces and impacts to the Space Shuttle and Freedom programs. A preliminary integrated operations concept based on this philosophy and covering the ground, flight, mission support, and landing and recovery operations has been produced. To implement the ACRV operations concept, the underlying approach has been to rely on vehicle autonomy and automation to the extent possible. Candidate functions and processes that may benefit from current or near-term automation and robotics technologies are identified. These include, but are not limited to, built-in automated ground tests and checkouts; use of the Freedom and Orbiter remote manipulator systems for ACRV berthing; automated passive monitoring and performance trend analysis; and periodic active checkouts during dormant periods. The major ACRV operations concept issues as they relate to the use of automation are discussed.
Contaminants in blood cultures: importance, implications, interpretation and prevention.
Dargère, S; Cormier, H; Verdon, R
2018-04-03
Despite the development of new microbiologic technologies, blood cultures (BCs) remain the first-line tool for the diagnosis of bloodstream infections. Their diagnostic value may be affected when a microorganism of questionable significance is isolated, for example coagulase-negative staphylococci, Bacillus spp., viridans group streptococci, Corynebacterium spp., Propionibacterium spp. and Micrococcus spp. In such cases, making a correct diagnosis of pathogenicity (vs. contamination) is challenging. The aim of this review is to describe the current ways of dealing with the problem of BC contaminants (BCCs) and to provide practical suggestions to decrease BCC rates. PubMed electronic databases and existing reviews were searched up to December 2017 to retrieve relevant publications related to the topic. This review describes the burden of BCC and analyses the main current issues and controversies in interpreting the occurrence of potential BC contaminants. It focuses on the best-described approaches for deciding whether a BCC is present and discusses the different strategies of prevention in adults. Each institution should have an efficient policy to prevent BCC, emphasizing the importance of following guidelines for prescribing and collecting BCs. Training of healthcare workers should focus on the detrimental influence on patient care and highlight the work and costs due to contaminants. The accurate differentiation of a contaminant from a true pathogen relies on a multidisciplinary approach and the clinical judgement of experienced practitioners. Copyright © 2018 European Society of Clinical Microbiology and Infectious Diseases. Published by Elsevier Ltd. All rights reserved.
Bloomfield, Sally F.; Carling, Philip C.; Exner, Martin
2017-01-01
Hygiene procedures for hands, surfaces and fabrics are central to preventing the spread of infection in settings including healthcare, food production, catering, agriculture, public settings, and home and everyday life. They are used in situations including hand hygiene, clinical procedures, decontamination of environmental surfaces, respiratory hygiene, food handling, laundry hygiene, toilet hygiene and so on. Although the principles are common to all, the approaches currently used in different settings are inconsistent. A particular concern is the use of inconsistent terminology, which is misleading, especially to the people we need to communicate with, such as the public or cleaning professionals. This paper reviews the data on current approaches, alongside new insights into developing hygiene procedures. Using these data, we propose a more scientifically grounded framework for developing procedures that maximize protection against infection, based on consistent principles and terminology, and applicable across all settings. A key feature is the use of test models which assess the state of surfaces after treatment rather than product performance alone. This allows procedures that rely on removal of microbes to be compared with those employing chemical or thermal inactivation. This makes it possible to ensure that a consistent "safety target level" is achieved regardless of the type of procedure used, and allows us to deliver maximum health benefit whilst ensuring prudent use of antimicrobial agents, detergents, water and energy. PMID:28670508
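The "safety target level" idea, comparing removal-based and inactivation-based procedures by the state of the surface after treatment, can be expressed as a log10 reduction in organisms recovered from a test surface. The counts, procedure names, and the 4-log target below are invented for illustration.

```python
import math

def log_reduction(cfu_before, cfu_after):
    """Log10 reduction in colony-forming units recovered from a test surface."""
    return math.log10(cfu_before / cfu_after)

# Hypothetical test-model results for two procedures applied to the same surface.
detergent_wiping = log_reduction(cfu_before=1e6, cfu_after=5e3)   # removal-based
disinfectant = log_reduction(cfu_before=1e6, cfu_after=50)        # inactivation-based

safety_target = 4.0   # illustrative target log10 reduction, not a published value
for name, lr in [("detergent wiping", detergent_wiping), ("disinfectant", disinfectant)]:
    print(f"{name}: {lr:.1f} log10 reduction, meets target: {lr >= safety_target}")
```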
Adapt-Mix: learning local genetic correlation structure improves summary statistics-based analyses
Park, Danny S.; Brown, Brielin; Eng, Celeste; Huntsman, Scott; Hu, Donglei; Torgerson, Dara G.; Burchard, Esteban G.; Zaitlen, Noah
2015-01-01
Motivation: Approaches to identifying new risk loci, training risk prediction models, imputing untyped variants and fine-mapping causal variants from summary statistics of genome-wide association studies are playing an increasingly important role in the human genetics community. Current summary statistics-based methods rely on global ‘best guess’ reference panels to model the genetic correlation structure of the dataset being studied. This approach, especially in admixed populations, has the potential to produce misleading results, ignores variation in local structure and is not feasible when appropriate reference panels are missing or small. Here, we develop a method, Adapt-Mix, that combines information across all available reference panels to produce estimates of local genetic correlation structure for summary statistics-based methods in arbitrary populations. Results: We applied Adapt-Mix to estimate the genetic correlation structure of both admixed and non-admixed individuals using simulated and real data. We evaluated our method by measuring the performance of two summary statistics-based methods: imputation and joint-testing. When using our method as opposed to the current standard of ‘best guess’ reference panels, we observed a 28% decrease in mean-squared error for imputation and a 73.7% decrease in mean-squared error for joint-testing. Availability and implementation: Our method is publicly available in a software package called ADAPT-Mix available at https://github.com/dpark27/adapt_mix. Contact: noah.zaitlen@ucsf.edu PMID:26072481
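A much-simplified sketch of the panel-combination idea: estimate a local correlation (LD) matrix from each reference panel and form a convex combination whose weight is chosen to best match the correlation structure of the study sample. The simulation model, the grid search over a single weight, and all parameters are invented for illustration and do not reproduce the published Adapt-Mix algorithm.

```python
import numpy as np

rng = np.random.default_rng(3)

def simulate_panel(n_individuals, n_snps, rho):
    """Hypothetical genotype dosages with AR(1) correlation between nearby SNPs."""
    idx = np.arange(n_snps)
    cov = rho ** np.abs(np.subtract.outer(idx, idx))
    return rng.multivariate_normal(np.zeros(n_snps), cov, size=n_individuals)

def ld_matrix(g):
    """Correlation (LD) matrix across SNP columns."""
    return np.corrcoef(g, rowvar=False)

# Two reference panels with different local correlation structure.
panel_a = simulate_panel(500, 20, rho=0.8)
panel_b = simulate_panel(500, 20, rho=0.3)
# An "admixed" study sample: a mixture of the two structures.
study = np.vstack([simulate_panel(150, 20, rho=0.8), simulate_panel(100, 20, rho=0.3)])

R_a, R_b, R_study = ld_matrix(panel_a), ld_matrix(panel_b), ld_matrix(study)

# Choose the convex-combination weight that best matches the study LD matrix.
weights = np.linspace(0, 1, 101)
errors = [np.sum((w * R_a + (1 - w) * R_b - R_study) ** 2) for w in weights]
print(f"best weight on panel A: {weights[int(np.argmin(errors))]:.2f}")
```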
Camalier, Corrie R; Wang, Alice Y; McIntosh, Lindsey G; Park, Sohee; Neimat, Joseph S
2017-03-01
Computational and theoretical accounts hypothesize the basal ganglia play a supramodal "gating" role in the maintenance of working memory representations, especially in preservation from distractor interference. There are currently two major limitations to this account. The first is that supporting experiments have focused exclusively on the visuospatial domain, leaving questions as to whether such "gating" is domain-specific. The second is that current evidence relies on correlational measures, as it is extremely difficult to causally and reversibly manipulate subcortical structures in humans. To address these shortcomings, we examined non-spatial, auditory working memory performance during reversible modulation of the basal ganglia, an approach afforded by deep brain stimulation of the subthalamic nucleus. We found that subthalamic nucleus stimulation impaired auditory working memory performance, specifically in the group tested in the presence of distractors, even though the distractors were predictable and completely irrelevant to the encoding of the task stimuli. This study provides key causal evidence that the basal ganglia act as a supramodal filter in working memory processes, further adding to our growing understanding of their role in cognition. Copyright © 2017 Elsevier Ltd. All rights reserved.
Rieder, Florian; Kessler, Sean; Sans, Miquel
2012-01-01
Fibrosis is a serious condition complicating chronic inflammatory processes affecting the intestinal tract. Advances in this field that rely on human studies have been slow and seriously restricted by practical and logistic reasons. As a consequence, well-characterized animal models of intestinal fibrosis have emerged as logical and essential systems to better define and understand the pathophysiology of fibrosis. In point of fact, animal models allow the execution of mechanistic studies as well as the implementation of clinical trials with novel, pathophysiology-based therapeutic approaches. This review provides an overview of the currently available animal models of intestinal fibrosis, taking into consideration the methods of induction, key characteristics of each model, and underlying mechanisms. Currently available models will be classified into seven categories: spontaneous, gene-targeted, chemical-, immune-, bacteria-, and radiation-induced as well as postoperative fibrosis. Each model will be discussed in regard to its potential to create research opportunities to gain insights into the mechanisms of intestinal fibrosis and stricture formation and assist in the development of effective and specific antifibrotic therapies. PMID:22878121
Measuring residual renal function for hemodialysis adequacy: Is there an easier option?
Davenport, Andrew
2017-10-01
Most patients starting hemodialysis (HD) have residual renal function. As such, there has been increased interest in starting patients with less frequent and shorter dialysis session times. However, for this incremental approach to be successful, patients require regular monitoring of residual renal function, so that as residual renal function declines, the amount of HD is appropriately increased. Currently most dialysis centers rely on interdialytic urine collections. However, many patients find these inconvenient, and there may be marked intrapatient variability due to compliance issues. Thus, alternative markers of residual renal function are required for routine clinical practice. Currently three middle-sized molecules, cystatin C, β2-microglobulin, and β-trace protein, have been investigated as potential endogenous markers of glomerular filtration. Although none is ideal, combinations of these markers have been proposed to provide a more accurate estimation of glomerular clearance, and in particular cutoffs for minimal residual renal function. However, in patients with low levels of residual renal function it remains unclear whether the benefits of residual renal function apply equally to glomerular filtration and tubular function. © 2017 International Society for Hemodialysis.
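To make marker-based estimation concrete, the sketch below fits a hypothetical regression of measured residual clearance on serum cystatin C and β2-microglobulin and applies an illustrative cutoff for minimal residual renal function; all data, coefficients, and the cutoff are invented and are not the published estimating equations.

```python
import numpy as np

rng = np.random.default_rng(4)

# Hypothetical training data: serum markers (mg/L) vs. measured residual clearance (mL/min).
n = 120
true_clearance = rng.uniform(0, 8, n)
cystatin_c = 6.0 - 0.4 * true_clearance + rng.normal(0, 0.3, n)
beta2_microglobulin = 30.0 - 2.5 * true_clearance + rng.normal(0, 2.0, n)

# Ordinary least squares: clearance ~ 1 + cystatin C + beta2-microglobulin.
X = np.column_stack([np.ones(n), cystatin_c, beta2_microglobulin])
coef, *_ = np.linalg.lstsq(X, true_clearance, rcond=None)

def estimated_clearance(cys_c, b2m):
    return coef @ np.array([1.0, cys_c, b2m])

# Apply an illustrative cutoff for "minimal residual renal function".
patient_estimate = estimated_clearance(cys_c=5.1, b2m=22.0)
print(f"estimated clearance: {patient_estimate:.1f} mL/min,"
      f" above 2 mL/min cutoff: {patient_estimate > 2.0}")
```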
Early Breakdown of Area-Law Entanglement at the Many-Body Delocalization Transition
NASA Astrophysics Data System (ADS)
Devakul, Trithep; Singh, Rajiv R. P.
2015-10-01
We introduce the numerical linked cluster expansion as a controlled numerical tool for the study of the many-body localization transition in a disordered system with continuous nonperturbative disorder. Our approach works directly in the thermodynamic limit, in any spatial dimension, and does not rely on any finite-size scaling procedure. We study the onset of many-body delocalization through the breakdown of area-law entanglement in a generic many-body eigenstate. By looking for initial signs of an instability of the localized phase, we obtain a value for the critical disorder that is higher than current best estimates from finite-size studies and that we believe should be a lower bound for the true value. This implies that most current methods tend to overestimate the extent of the localized phase due to finite-size effects, which make the localized phase appear stable at small length scales. We also study the mobility edge in these systems as a function of energy density, and we find that our conclusion is the same at all examined energies.
Rajão, Daniela S.; Pérez, Daniel R.
2018-01-01
Influenza virus infections pose a significant threat to public health due to annual seasonal epidemics and occasional pandemics. Influenza is also associated with significant economic losses in animal production. The most effective way to prevent influenza infections is through vaccination. Current vaccine programs rely heavily on the vaccine's ability to stimulate neutralizing antibody responses to the hemagglutinin (HA) protein. One of the biggest challenges to an effective vaccination program lies in the fact that influenza viruses are ever-changing, leading to antigenic drift that results in escape from earlier immune responses. Efforts toward overcoming these challenges aim at improving the strength and/or breadth of the immune response. Novel vaccine technologies, the so-called universal vaccines, focus on stimulating better cross-protection against many or all influenza strains. However, the vaccine platforms and manufacturing technologies being tested to improve vaccine efficacy are heterogeneous across species and/or tailored to either epidemic or pandemic influenza. Here, we discuss current vaccines to protect humans and animals against influenza, highlighting the challenges to effective and uniform novel vaccination strategies and approaches. PMID:29467737
A Framework to Understand Extreme Space Weather Event Probability.
Jonas, Seth; Fronczyk, Kassandra; Pratt, Lucas M
2018-03-12
An extreme space weather event has the potential to disrupt or damage infrastructure systems and technologies that many societies rely on for economic and social well-being. Space weather events occur regularly, but extreme events are less frequent, with a small number of historical examples over the last 160 years. During the past decade, published works have (1) examined the physical characteristics of the extreme historical events and (2) discussed the probability or return rate of select extreme geomagnetic disturbances, including the 1859 Carrington event. Here we present initial findings on a unified framework approach to visualize space weather event probability, using a Bayesian model average, in the context of historical extreme events. We present disturbance storm time (Dst) probability (a proxy for geomagnetic disturbance intensity) across multiple return periods and discuss parameters of interest to policymakers and planners in the context of past extreme space weather events. We discuss the current state of these analyses, their utility to policymakers and planners, the current limitations when compared to other hazards, and several gaps that need to be filled to enhance space weather risk assessments. © 2018 Society for Risk Analysis.
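As a schematic of the probability-framework idea, and not the authors' actual model, the sketch below averages the exceedance probability of a large |Dst| threshold over two hypothetical tail models with fixed weights, then converts the annual exceedance probability into a return period; all distributions, parameters, storm rates, and weights are invented.

```python
import math

# Hypothetical tail models for storm magnitude |Dst| (nT), conditional on a storm occurring.
def p_exceed_exponential(x, mu=100.0, beta=60.0):
    """P(|Dst| > x) under an exponential tail above mu (invented parameters)."""
    return math.exp(-(x - mu) / beta)

def p_exceed_pareto(x, mu=100.0, alpha=4.0):
    """P(|Dst| > x) under a power-law tail above mu (invented parameters)."""
    return (x / mu) ** (-alpha)

threshold = 850.0        # Carrington-class disturbance level, for illustration only
storms_per_year = 10.0   # hypothetical rate of storms with |Dst| > 100 nT

# Bayesian-model-average style weights (invented, e.g. reflecting past fit quality).
weights = {"exponential": 0.4, "pareto": 0.6}
p_storm = (weights["exponential"] * p_exceed_exponential(threshold)
           + weights["pareto"] * p_exceed_pareto(threshold))

# Annual exceedance probability assuming Poisson storm occurrence.
p_annual = 1.0 - math.exp(-storms_per_year * p_storm)
print(f"annual probability: {p_annual:.4f}, return period: {1.0 / p_annual:.0f} years")
```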
[The role of biotechnology in pharmaceutical drug design].
Gaisser, Sibylle; Nusser, Michael
2010-01-01
Biotechnological methods have become an important tool in pharmaceutical drug research and development. Today approximately 15% of drug revenues are derived from biopharmaceuticals. The most relevant indications are oncology, metabolic disorders and disorders of the musculoskeletal system. For the future it can be expected that the relevance of biopharmaceuticals will increase further. Currently, more than 25% of all substances in preclinical testing rely on biotechnology. Products for the treatment of cancer, metabolic disorders and infectious diseases are the most important. New therapeutic approaches such as RNA interference still play only a minor role in current commercial drug research and development, accounting for 1.5% of all biological preclinical substances. Investments in sustainable high technology such as biotechnology are of vital importance for a highly developed country like Germany because of its lack of raw materials. Biotechnology helps the pharmaceutical industry to develop new products, processes, methods and services and to improve existing ones. Thus, international competitiveness can be strengthened, new jobs can be created and existing jobs preserved.
[Interventional therapy of acute myocardial infarction].
Zahn, R; Zeymer, U
2008-09-01
Currently an acute myocardial infarction is differentiated into ST-elevation myocardial infarction (STEMI) or non-ST-elevation myocardial infarction (NSTEMI). However, there is another classification, the acute coronary syndromes (ACS), which is more important in clinical practice, because all recommendations in the guidelines of the cardiac societies concerning invasive strategies rely on it. Here one differentiates an ACS with ST-elevation (STE-ACS = STEMI) from an ACS without ST-elevation (NSTE-ACS). The latter is further divided into NSTE-ACS with or without high risk. In patients with NSTE-ACS with high risk, an early invasive strategy is recommended within 72 h after diagnosis. In patients with NSTE-ACS without high risk, a more conservative approach can be pursued. In STE-ACS patients, primary angioplasty is the reperfusion therapy of choice if it can be performed in a timely fashion, within 2 h after diagnosis, at an interventional centre with experienced interventionalists and short "door-to-balloon" times. In Germany this goal is achievable almost everywhere. Therefore, the most important current task is to establish local networks to reach this goal.
Time-Dependent Thomas-Fermi Approach for Electron Dynamics in Metal Clusters
NASA Astrophysics Data System (ADS)
Domps, A.; Reinhard, P.-G.; Suraud, E.
1998-06-01
We propose a time-dependent Thomas-Fermi approach to the (nonlinear) dynamics of many-fermion systems. The approach relies on a hydrodynamical picture describing the system in terms of collective flow. We investigate in particular an application to electron dynamics in metal clusters. We make extensive comparisons with fully fledged quantal dynamical calculations and find overall good agreement. The approach thus provides a reliable and inexpensive scheme to study the electronic response of large metal clusters.
Towards a framework for developing semantic relatedness reference standards.
Pakhomov, Serguei V S; Pedersen, Ted; McInnes, Bridget; Melton, Genevieve B; Ruggieri, Alexander; Chute, Christopher G
2011-04-01
Our objective is to develop a framework for creating reference standards for functional testing of computerized measures of semantic relatedness. Currently, research on computerized approaches to semantic relatedness between biomedical concepts relies on reference standards created for specific purposes using a variety of methods for their analysis. In most cases, these reference standards are not publicly available, and the published information provided in manuscripts that evaluate computerized semantic relatedness measurement approaches is not sufficient to reproduce the results. Our proposed framework is based on the experiences of the medical informatics and computational linguistics communities and addresses practical and theoretical issues with creating reference standards for semantic relatedness. We demonstrate the use of the framework on a pilot set of 101 medical term pairs rated for semantic relatedness by 13 medical coding experts. While the reliability of this particular reference standard is in the "moderate" range, we show that using clustering and factor analyses offers a data-driven approach to finding systematic differences among raters and identifying groups of potential outliers. We test two ontology-based measures of relatedness and provide both the reference standard containing individual ratings and the R program used to analyze the ratings as open source. Currently, these resources are intended to be used to reproduce and compare results of studies involving computerized measures of semantic relatedness. Our framework may be extended to the development of reference standards in other research areas in medical informatics, including automatic classification, information retrieval from medical records, and vocabulary/ontology development. Copyright © 2010 Elsevier Inc. All rights reserved.
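A small sketch of the kind of data-driven rater analysis described above: compute pairwise agreement between raters' relatedness scores and cluster the agreement structure to flag potential outlier raters. The ratings are simulated here, and the original analysis was carried out in R rather than Python.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import squareform

rng = np.random.default_rng(5)

# Simulated ratings: 13 raters scoring 101 term pairs on a 1-10 relatedness scale.
true_relatedness = rng.uniform(1, 10, 101)
ratings = np.vstack(
    [np.clip(true_relatedness + rng.normal(0, noise, 101), 1, 10)
     for noise in [1.0] * 10 + [4.0] * 3]        # three noisier "outlier" raters
)

# Pairwise agreement: correlation of rank-transformed scores between raters.
ranks = np.argsort(np.argsort(ratings, axis=1), axis=1)
corr = np.corrcoef(ranks)

# Hierarchical clustering on correlation distance to find rater subgroups.
dist = squareform(1 - corr, checks=False)
clusters = fcluster(linkage(dist, method="average"), t=2, criterion="maxclust")
print(clusters)   # the noisier raters tend to separate from the consistent majority
```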
Hepatitis-Associated Liver Cancer: Gaps and Opportunities to Improve Care
McMahon, Brian; Block, Timothy; Cohen, Chari; Evans, Alison A.; Hosangadi, Anu; London, W. Thomas; Sherman, Morris
2016-01-01
The global burden of hepatocellular carcinoma (HCC; primary liver cancer) is increasing. HCC is often unaccompanied by clear symptomatology, causing patients to be unaware of their disease. Moreover, effective treatment for those with advanced disease is lacking. As such, effective surveillance and early detection of HCC are essential. However, current screening and surveillance guidelines are not being fully implemented. Some at-risk populations fall outside of the guidelines, and patients who are screened are often not diagnosed at an early enough stage for treatment to be effective. From March 17 to 19, 2015, the Hepatitis B Foundation sponsored a workshop to identify gaps and limitations in current approaches to the detection and treatment of HCC and to define research priorities and opportunities for advocacy. In this Commentary, we summarize areas for further research and action that were discussed throughout the workshop to improve the recognition of liver disease generally, improve the recognition of liver cancer risk, and improve the recognition that screening for HCC makes a life-saving difference. Participants agreed that primary prevention of HCC relies on prevention and treatment of viral hepatitis and other underlying etiologies. Earlier diagnosis (secondary prevention) needs to be substantially improved. Areas for attention include increasing practitioner awareness, better definition of at-risk populations, and improved performance of screening approaches (ultrasound, biomarkers for detection, risk stratification, targeted therapies). The heterogeneous nature of HCC makes it unlikely that a single therapeutic agent will be universally effective. Medical management will benefit from the development of new, targeted treatment approaches. PMID:26626106
Capturing the genetic makeup of the active microbiome in situ
Singer, Esther; Wagner, Michael; Woyke, Tanja
2017-06-02
More than any other technology, nucleic acid sequencing has enabled microbial ecology studies to be complemented with the data volumes necessary to capture the extent of microbial diversity and dynamics in a wide range of environments. In order to truly understand and predict environmental processes, however, the distinction between active, inactive and dead microbial cells is critical. Also, experimental designs need to be sensitive toward varying population complexity and activity, and temporal as well as spatial scales of process rates. There are a number of approaches, including single-cell techniques, which were designed to study in situ microbial activity and that have been successively coupled to nucleic acid sequencing. The exciting new discoveries regarding in situ microbial activity provide evidence that future microbial ecology studies will indispensably rely on techniques that specifically capture members of the microbiome active in the environment. Herein, we review those currently used activity-based approaches that can be directly linked to shotgun nucleic acid sequencing, evaluate their relevance to ecology studies, and discuss future directions.
Dynamical Systems Theory: Application to Pedagogy
NASA Astrophysics Data System (ADS)
Abraham, Jane L.
Theories of learning affect how cognition is viewed, and this subsequently leads to the style of pedagogical practice that is used in education. Traditionally, educators have relied on a variety of theories on which to base pedagogy. Behavioral learning theories influenced the teaching/learning process for over 50 years. In the 1960s, the information processing approach brought the mind back into the learning process. The current emphasis on constructivism integrates the views of Piaget, Vygotsky, and cognitive psychology. Additionally, recent scientific advances have allowed researchers to shift attention to biological processes in cognition. The problem is that these theories do not provide an integrated approach to understanding principles responsible for differences among students in cognitive development and learning ability. Dynamical systems theory offers a unifying theoretical framework to explain the wider context in which learning takes place and the processes involved in individual learning. This paper describes how principles of Dynamic Systems Theory can be applied to cognitive processes of students, the classroom community, motivation to learn, and the teaching/learning dynamic giving educational psychologists a framework for research and pedagogy.
Object tracking on mobile devices using binary descriptors
NASA Astrophysics Data System (ADS)
Savakis, Andreas; Quraishi, Mohammad Faiz; Minnehan, Breton
2015-03-01
With the growing ubiquity of mobile devices, advanced applications are relying on computer vision techniques to provide novel experiences for users. Currently, few tracking approaches take into consideration the resource constraints on mobile devices. Designing efficient tracking algorithms and optimizing performance for mobile devices can result in better and more efficient tracking for applications such as augmented reality. In this paper, we use binary descriptors, including Fast Retina Keypoint (FREAK), Oriented FAST and Rotated BRIEF (ORB), Binary Robust Independent Elementary Features (BRIEF), and Binary Robust Invariant Scalable Keypoints (BRISK), to obtain real-time tracking performance on mobile devices. We consider both Google's Android and Apple's iOS operating systems to implement our tracking approach. The Android implementation is done using Android's Native Development Kit (NDK), which gives the performance benefits of using native code as well as access to legacy libraries. The iOS implementation was created using both the native Objective-C and the C++ programming languages. We also introduce simplified versions of the BRIEF and BRISK descriptors that improve processing speed without compromising tracking accuracy.
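A minimal sketch of the binary-descriptor matching step such trackers build on, using OpenCV's ORB implementation with Hamming-distance brute-force matching; the image filenames are placeholders, and the trackers described in the paper add keypoint filtering and motion handling on top of this step.

```python
import cv2

# Placeholder image paths; replace with consecutive frames from a camera or video.
frame_prev = cv2.imread("frame_000.png", cv2.IMREAD_GRAYSCALE)
frame_curr = cv2.imread("frame_001.png", cv2.IMREAD_GRAYSCALE)

orb = cv2.ORB_create(nfeatures=500)                 # binary descriptor (ORB)
kp1, des1 = orb.detectAndCompute(frame_prev, None)
kp2, des2 = orb.detectAndCompute(frame_curr, None)

# Hamming distance is the natural metric for binary descriptors.
matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
matches = sorted(matcher.match(des1, des2), key=lambda m: m.distance)

# Displacements of the best matches approximate the object's frame-to-frame motion.
for m in matches[:10]:
    (x1, y1), (x2, y2) = kp1[m.queryIdx].pt, kp2[m.trainIdx].pt
    print(f"({x1:.0f},{y1:.0f}) -> ({x2:.0f},{y2:.0f}), Hamming dist {m.distance:.0f}")
```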
The Nurse Project: an analysis for nurses to take back our work.
Rankin, Janet M
2009-12-01
This paper challenges nurses to join together as a collective in order to facilitate ongoing analysis of the issues that arise for nurses and patients when nursing care is harnessed for health care efficiencies. It is a call for nurses to respond with a collective strategy through which we can 'talk back' and 'act back' to the powerful rationality of current thinking and practices. The paper uses examples from an institutional ethnographic (IE) research project to demonstrate how dominant approaches to understanding nursing position nurses to overlook how we activate practices of reform that reorganize how we nurse. The paper then describes two classroom strategies taken from my work with students in undergraduate and graduate programs. The teaching strategies I describe rely on the theoretical framework that underpin the development of an IE analysis. Taken into the classroom (or into other venues of nursing activism) the tools of IE can be adapted to inform a pedagogical approach that supports nurses to develop an alternate analysis to what is happening in our work.
Jasin, Maria; Haber, James E
2016-08-01
DNA double-strand breaks (DSBs) are dangerous lesions that, if not properly repaired, can lead to genomic change or cell death. Organisms have developed several pathways and many factors devoted to repairing DSBs; repair broadly occurs either by homologous recombination, which relies on an identical or homologous sequence to template repair, or by nonhomologous end-joining. Much of our understanding of these repair mechanisms has come from the study of induced DNA cleavage by site-specific endonucleases. In addition to their biological role, these cellular pathways can be co-opted for gene editing to study gene function or for gene therapy and other applications. While the first gene editing experiments were done more than 20 years ago, the recent discovery of RNA-guided endonucleases has simplified approaches developed over the years to make gene editing an approach that is available to the entire biomedical research community. Here, we review DSB repair mechanisms and site-specific cleavage systems that have provided insight into these mechanisms and led to the current gene editing revolution. Copyright © 2016. Published by Elsevier B.V.
Oxime Ether Lipids as Transfection Agents: Assembly and Complexation with siRNA.
Puri, Anu; Zampino, Serena; Viard, Mathias; Shapiro, Bruce A
2017-01-01
RNAi-based therapeutic approaches to combat cancer and other diseases are currently an area of great interest. However, practical applications of this approach rely on optimal tools to carry and deliver siRNA to the desired site. Oxime ether lipids (OELs) are one class of molecules among the various carriers being examined for siRNA delivery. OELs, relatively new candidates, belong to a class of non-glycerol-based lipids and have begun to claim their place as siRNA delivery carriers in the field of RNAi therapy. The chemical synthesis of OELs is considered relatively simple, with the ability to modify the functionalities as desired. OEL-siRNA complexes can be assembled in the presence of serum-containing buffers (or cell culture media), and recent data from our and other groups have demonstrated that OELs are viable carriers for siRNA delivery in cell culture systems. In this chapter, we provide the details of experimental protocols routinely used in our laboratory to examine OEL-siRNA complexes, including their assembly, stability, and transfection efficiencies.
NASA Technical Reports Server (NTRS)
Hornung, Steven D.; Biesinger, Paul; Kirsch, Mike; Beeson, Harold; Leuders, Kathy
1999-01-01
The NASA White Sands Test Facility (WSTF) has developed an entirely aqueous final cleaning and verification process to replace the current chlorofluorocarbon (CFC) 113 based process. This process has been accepted for final cleaning and cleanliness verification of WSTF ground support equipment. The aqueous process relies on ultrapure water at 50 C (323 K) and ultrasonic agitation for removal of organic compounds and particulate. Cleanliness is verified by determining the total organic carbon (TOC) content and by filtration with particulate counting. The effectiveness of the aqueous methods for detecting hydrocarbon contamination and particulate was compared to the accepted CFC 113 sampling procedures. Testing with known contaminants, such as hydraulic fluid and cutting and lubricating oils, was performed to establish a correlation between aqueous TOC and CFC 113 nonvolatile residue (NVR). Particulate sampling was performed on cleaned batches of hardware that were randomly separated and sampled by the two methods. This paper presents the approach and results, and discusses the issues in establishing the equivalence of aqueous sampling to CFC 113 sampling, while describing the approach for implementing aqueous techniques on Space Shuttle Propulsion hardware.
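The correlation step described above, relating aqueous TOC readings to CFC 113 NVR for known contaminant loadings, amounts to a simple calibration regression. The paired values below are invented stand-ins, not WSTF data.

```python
import numpy as np
from scipy import stats

# Hypothetical paired measurements on coupons spiked with known contaminants:
# CFC 113 nonvolatile residue (mg/ft^2) vs. aqueous total organic carbon (ppm).
nvr = np.array([0.5, 1.0, 2.0, 3.5, 5.0, 7.5, 10.0])
toc = np.array([0.9, 1.7, 3.2, 5.8, 8.1, 12.3, 16.0])

fit = stats.linregress(nvr, toc)
print(f"TOC ~ {fit.slope:.2f} * NVR + {fit.intercept:.2f}  (r = {fit.rvalue:.3f})")

# Acceptance decision: convert a measured TOC value back to an equivalent NVR level.
measured_toc = 4.0
equivalent_nvr = (measured_toc - fit.intercept) / fit.slope
print(f"equivalent NVR: {equivalent_nvr:.1f} mg/ft^2")
```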