Personomics: The Missing Link in the Evolution from Precision Medicine to Personalized Medicine.
Ziegelstein, Roy C
2017-10-16
Clinical practice guidelines have been developed for many common conditions based on data from randomized controlled trials. When medicine is informed solely by clinical practice guidelines, however, the patient is not treated as an individual, but rather as a member of a group. Precision medicine, as defined herein, characterizes unique biological characteristics of the individual, or of specimens obtained from an individual, to tailor diagnostics and therapeutics to a specific patient. These unique biological characteristics are defined by the tools of precision medicine: genomics, proteomics, metabolomics, epigenomics, pharmacogenomics, and other "-omics." Personalized medicine, as defined herein, uses additional information about the individual derived from knowing the patient as a person. These unique personal characteristics are defined by a tool known as personomics, which takes into account an individual's personality, preferences, values, goals, health beliefs, social support network, financial resources, and unique life circumstances that affect how and when a given health condition will manifest in that person and how that condition will respond to treatment. In this paradigm, precision medicine may be considered a necessary step in the evolution of medical care to personalized medicine, with personomics as the missing link.
Ogden R. Lindsley and the historical development of precision teaching
Potts, Lisa; Eshleman, John W.; Cooper, John O.
1993-01-01
This paper presents the historical developments of precision teaching, a technological offshoot of radical behaviorism and free-operant conditioning. The sequence progresses from the scientific precursors of precision teaching and the beginnings of precision teaching to principal developments since 1965. Information about the persons, events, and accomplishments presented in this chronology was compiled in several ways. Journals, books, and conference presentations provided the essential information. The most important source for this account was Ogden Lindsley himself, because Lindsley and his students established the basic practices that define precision teaching. PMID:22478145
Methods for evaluating stream, riparian, and biotic conditions
William S. Platts; Walter F. Megahan; G. Wayne Minshall
1983-01-01
This report develops a standard way of measuring stream, riparian, and biotic conditions and evaluates the validity of the measurements recommended. Accuracy and precision of most measurements are defined. This report will be of value to those persons documenting, monitoring, or predicting stream conditions and their biotic resources, especially those related to...
Perception of time under conditions of brief weightlessness
NASA Technical Reports Server (NTRS)
Lebedev, V. I.; Chekidra, I. F.; Kolosov, I. A.
1975-01-01
Results of experiments under conditions of brief weightlessness confirmed the theoretical concept that time perception depends on a person's emotional state. The time test, together with other methods, can be used to precisely define the emotional state of subjects in stress situations.
Lim, Chun Ping; Mai, Phuong Nguyen Quoc; Roizman Sade, Dan; Lam, Yee Cheong; Cohen, Yehuda
2016-01-01
The life of bacteria is governed by the physical dimensions of life at the microscale, which is dominated by fast diffusion and flow at low Reynolds numbers. Microbial biofilms are structurally and functionally heterogeneous, and their development is suggested to be interactively related to their microenvironments. In this study, we were guided by the challenging requirements of precise tools and engineered procedures to achieve reproducible experiments at high spatial and temporal resolutions. Here, we developed a robust precision-engineering approach allowing real-time, high-content imaging and quantification of biofilm behaviour under well-controlled flow conditions. Through the merging of engineering and microbial ecology, we present a rigorous methodology to quantify biofilm development at resolutions of a single micrometre and a single minute, using a newly developed flow cell. We designed and fabricated a high-precision flow cell to create defined and reproducible flow conditions. We applied high-content confocal laser scanning microscopy and developed image quantification using a model biofilm of a defined opportunistic strain, Pseudomonas putida OUS82. We observed complex patterns in the early events of biofilm formation, which were followed by total dispersal. These patterns were closely related to the flow conditions. These biofilm behavioural phenomena were found to be highly reproducible, despite the heterogeneous nature of biofilm. PMID:28721252
A numerical method of detecting singularity
NASA Technical Reports Server (NTRS)
Laporte, M.; Vignes, J.
1978-01-01
A numerical method is reported which determines a value C for the degree of conditioning of a matrix. This value is C = 0 for a singular matrix and has progressively larger values for matrices which are increasingly well-conditioned. The value reaches C = Cmax (a maximum defined by the precision of the computer) when the matrix is perfectly well conditioned.
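The conditioning value described can be sketched with a modern analogue (an assumption, not the paper's algorithm): the reciprocal condition number from a singular value decomposition, which is 0 for a singular matrix and grows toward 1 as the matrix becomes perfectly conditioned.

```python
import numpy as np

def conditioning_indicator(a: np.ndarray) -> float:
    """Return a value that is ~0 for a singular matrix and grows as the
    matrix becomes better conditioned (reciprocal 2-norm condition number)."""
    singular_values = np.linalg.svd(a, compute_uv=False)
    smallest, largest = singular_values[-1], singular_values[0]
    if largest == 0.0:  # the zero matrix
        return 0.0
    return smallest / largest  # ~0 for singular, 1 for perfectly conditioned

singular = np.array([[1.0, 2.0], [2.0, 4.0]])  # rank 1, hence singular
identity = np.eye(2)                           # perfectly conditioned
print(conditioning_indicator(singular))        # ~0.0
print(conditioning_indicator(identity))        # 1.0
```

The sketch maps singularity to 0 and perfect conditioning to 1, mirroring the paper's C = 0 and C = Cmax endpoints.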
Profiling Systems Using the Defining Characteristics of Systems of Systems (SoS)
2010-02-01
…exhaust and emissions system, gas engine, heating and air conditioning system, fuel system, regenerative braking system, safety system… To overcome the limitations of these fuzzy scales, measurement scales are often divided into a relatively small number of disjoint categories, so that the… precision is not justified. This lack of precision can typically be addressed by breaking the measurement scale into a set of categories, the use of…
NASA Astrophysics Data System (ADS)
Qian, Elaine A.; Wixtrom, Alex I.; Axtell, Jonathan C.; Saebi, Azin; Jung, Dahee; Rehak, Pavel; Han, Yanxiao; Moully, Elamar Hakim; Mosallaei, Daniel; Chow, Sylvia; Messina, Marco S.; Wang, Jing Yang; Royappa, A. Timothy; Rheingold, Arnold L.; Maynard, Heather D.; Král, Petr; Spokoyny, Alexander M.
2017-04-01
The majority of biomolecules are intrinsically atomically precise, an important characteristic that enables rational engineering of their recognition and binding properties. However, imparting a similar precision to hybrid nanoparticles has been challenging because of the inherent limitations of existing chemical methods and building blocks. Here we report a new approach to form atomically precise and highly tunable hybrid nanomolecules with well-defined three-dimensionality. Perfunctionalization of atomically precise clusters with pentafluoroaryl-terminated linkers produces size-tunable rigid cluster nanomolecules. These species are amenable to facile modification with a variety of thiol-containing molecules and macromolecules. Assembly proceeds at room temperature within hours under mild conditions, and the resulting nanomolecules exhibit high stabilities because of their full covalency. We further demonstrate how these nanomolecules grafted with saccharides can exhibit dramatically improved binding affinity towards a protein. Ultimately, the developed strategy allows the rapid generation of precise molecular assemblies to investigate multivalent interactions.
Multisensory Self-Motion Compensation During Object Trajectory Judgments
Dokka, Kalpana; MacNeilage, Paul R.; DeAngelis, Gregory C.; Angelaki, Dora E.
2015-01-01
Judging object trajectory during self-motion is a fundamental ability for mobile organisms interacting with their environment. This fundamental ability requires the nervous system to compensate for the visual consequences of self-motion in order to make accurate judgments, but the mechanisms of this compensation are poorly understood. We comprehensively examined both the accuracy and precision of observers' ability to judge object trajectory in the world when self-motion was defined by vestibular, visual, or combined visual–vestibular cues. Without decision feedback, subjects demonstrated no compensation for self-motion that was defined solely by vestibular cues, partial compensation (47%) for visually defined self-motion, and significantly greater compensation (58%) during combined visual–vestibular self-motion. With decision feedback, subjects learned to accurately judge object trajectory in the world, and this generalized to novel self-motion speeds. Across conditions, greater compensation for self-motion was associated with decreased precision of object trajectory judgments, indicating that self-motion compensation comes at the cost of reduced discriminability. Our findings suggest that the brain can flexibly represent object trajectory relative to either the observer or the world, but a world-centered representation comes at the cost of decreased precision due to the inclusion of noisy self-motion signals. PMID:24062317
Ductile and brittle transition behavior of titanium alloys in ultra-precision machining.
Yip, W S; To, S
2018-03-02
Titanium alloys are extensively applied in biomedical industries due to their excellent material properties. However, they are recognized as difficult-to-cut materials due to their low thermal conductivity, which adds complexity to their deformation mechanisms and restricts precise production. This paper presents a new observation about the removal regime of titanium alloys. The experimental results, including the chip formation, thrust force signal and surface profile, showed that there was a critical cutting distance for achieving better surface integrity of the machined surface. The machined areas with better surface roughness were located before a clear transition point, defined as the ductile-to-brittle transition. The machined area in the brittle region displayed fracture deformation, showing cracks on the surface edge. The relationship between depth of cut and the ductile-to-brittle transition behavior of titanium alloys in ultra-precision machining (UPM) was also revealed in this study: the ductile-to-brittle transition occurred mainly at relatively small depths of cut. This study is the first to define the ductile-to-brittle transition behavior of titanium alloys in UPM, contributing information on ductile machining as an optimal machining condition for the precise production of titanium alloys.
Disc valve for sampling erosive process streams
Mrochek, J.E.; Dinsmore, S.R.; Chandler, E.W.
1986-01-07
A four-port disc valve is described for sampling erosive, high-temperature process streams. A rotatable disc defining opposed first and second sampling cavities rotates between fixed faceplates defining flow passageways positioned to be alternately in axial alignment with the first and second cavities. Silicon carbide inserts and liners composed of α-silicon carbide are provided in the faceplates and in the sampling cavities to limit erosion while providing lubricity for smooth and precise operation under harsh process conditions. 1 fig.
Zhang, Chi; Zhang, Ge; Chen, Ke-ji; Lu, Ai-ping
2016-04-01
The development of an effective classification method for human health conditions is essential for precise diagnosis and delivery of tailored therapy to individuals. The contemporary classification of disease systems has properties that limit its information content and usability. Chinese medicine pattern classification has been incorporated into disease classification, and this integrated classification method became more precise because of the increased understanding of molecular mechanisms. However, we still face the complexity of diseases and patterns in the classification of health conditions. With continuing advances in omics methodologies and instrumentation, we propose a new classification approach, molecular module classification, which applies molecular modules to classifying human health status. This initiative would precisely define health status, provide accurate diagnoses, optimize therapeutics and improve new drug discovery strategies. There would then be no current disease diagnosis or disease pattern classification; in the future, a new medicine based on this classification, molecular module medicine, could redefine health status and reshape clinical practice.
Search guidance is proportional to the categorical specificity of a target cue.
Schmidt, Joseph; Zelinsky, Gregory J
2009-10-01
Visual search studies typically assume the availability of precise target information to guide search, often a picture of the exact target. However, search targets in the real world are often defined categorically and with varying degrees of visual specificity. In five target preview conditions we manipulated the availability of target visual information in a search task for common real-world objects. Previews were: a picture of the target, an abstract textual description of the target, a precise textual description, an abstract + colour textual description, or a precise + colour textual description. Guidance generally increased as information was added to the target preview. We conclude that the information used for search guidance need not be limited to a picture of the target. Although generally less precise, to the extent that visual information can be extracted from a target label and loaded into working memory, this information too can be used to guide search.
A new Ultra Precision Interferometer for absolute length measurements down to cryogenic temperatures
NASA Astrophysics Data System (ADS)
Schödel, R.; Walkov, A.; Zenker, M.; Bartl, G.; Meeß, R.; Hagedorn, D.; Gaiser, C.; Thummes, G.; Heltzel, S.
2012-09-01
A new Ultra Precision Interferometer (UPI) was built at the Physikalisch-Technische Bundesanstalt. Like its precursor, the precision interferometer, it was designed for highly precise absolute length measurements of prismatic bodies, e.g. gauge blocks, under well-defined temperature and pressure conditions, making use of phase-stepping imaging interferometry. The UPI offers a number of enhanced features; e.g., it is designed for much better lateral resolution and better temperature stability. In addition to the original concept, the UPI is equipped with an external measurement pathway (EMP) in which a prismatic body can alternatively be placed. The temperature of the EMP can be controlled over a much wider range than the temperature of the interferometer's main chamber. An appropriate cryostat system, a precision temperature measurement system and improved imaging interferometry were established to permit absolute length measurements down to cryogenic temperatures, demonstrated here for the first time. Results of such measurements are important for studying the thermal expansion of materials from room temperature down to less than 10 K.
WORK-UP OF STILLBIRTH: A REVIEW OF THE EVIDENCE
SILVER, Robert M.; VARNER, Michael W.; REDDY, Uma; GOLDENBERG, Robert; PINAR, Halit; CONWAY, Deborah; BUKOWSKI, Radek; CARPENTER, Marshall; HOGUE, Carol; WILLINGER, Marian; DUDLEY, Donald; SAADE, George; STOLL, Barbara
2009-01-01
Despite improvements in antenatal and intrapartum care, stillbirth, defined as in utero fetal death at 20 weeks of gestation or greater, remains an important, largely unstudied, and poignant problem in obstetrics. Over 26,000 stillbirths were reported in the US in 2001. Although several conditions have been linked to stillbirth, it is difficult to define the precise etiology in many cases. This paper reviews known and suspected causes of stillbirth including genetic abnormalities, infection, fetal-maternal hemorrhage, and a variety of medical conditions in the mother. The proportion of stillbirths that have a diagnostic explanation is higher in centers that conduct a defined and systematic evaluation. Recommended diagnostic tests for stillbirth are discussed. The on-going work of the NICHD Stillbirth Collaborative Research Network, a consortium of 5 academic centers in the United States that are studying the scope and causes of stillbirth, is presented. PMID:17466694
The use of radar imagery for surface water investigations
NASA Technical Reports Server (NTRS)
Bryan, M. L.
1981-01-01
The paper is concerned with the interpretation of hydrologic features using L-band (HH) imagery collected by aircraft and Seasat systems. Areas of research needed to more precisely define the accuracy and repeatability of measurements related to the conditions of surfaces and boundaries of fresh water bodies are identified. These include: the definition of shoreline, the nature of variations in surface roughness across a water body and along streams and lake shores, and the separation of ambiguous conditions which appear similar to lakes.
Hospital information system: reusability, designing, modelling, recommendations for implementing.
Huet, B
1998-01-01
The aims of this paper are to specify some essential conditions for building reuse models for hospital information systems (HIS) and to present an application for hospital clinical laboratories. Reusability is a general trend in software; however, reuse can involve a greater or lesser part of the design, classes and programs, so a project involving reusability must be precisely defined. The introduction surveys trends in software, the stakes of reuse models for HIS, and the special use case constituted by a HIS. The three main parts of this paper are: 1) designing a reuse model (which objects are common to several information systems?); 2) a reuse model for hospital clinical laboratories (a genspec object model is presented for all laboratories: biochemistry, bacteriology, parasitology, pharmacology, ...); 3) recommendations for generating plug-compatible software components (a reuse model can be implemented as a framework; concrete factors that increase reusability are presented). In conclusion, reusability is a subtle exercise whose project must be carefully defined in advance.
A Note on "Accuracy" and "Precision"
ERIC Educational Resources Information Center
Stallings, William M.; Gillmore, Gerald M.
1971-01-01
Advocates the use of "precision" rather than "accuracy" in defining reliability. These terms are consistently differentiated in certain sciences. A review of the psychological and measurement literature reveals, however, interchangeable usage of the terms in defining reliability. (Author/GS)
Reference condition approach to restoration planning
Nestler, J.M.; Theiling, C.H.; Lubinski, S.J.; Smith, D.L.
2010-01-01
Ecosystem restoration planning requires quantitative rigor to evaluate alternatives, define end states, report progress and perform environmental benefits analysis (EBA). Unfortunately, existing planning frameworks are, at best, semi-quantitative. In this paper, we: (1) describe a quantitative restoration planning approach based on a comprehensive but simple mathematical framework that can be used to effectively apply knowledge and evaluate alternatives, (2) use the approach to derive a simple but precisely defined lexicon based on the reference condition concept and allied terms and (3) illustrate the approach with an example from the Upper Mississippi River System (UMRS) using hydrologic indicators. The approach supports the development of a scalable restoration strategy that, in theory, can be expanded to ecosystem characteristics such as hydraulics, geomorphology, habitat and biodiversity. We identify three reference condition types, best achievable condition (ABAC), measured magnitude (MMi, which can be determined at one or many times and places) and desired future condition (ADFC), that, when used with the mathematical framework, provide a complete system of accounts useful for goal-oriented system-level management and restoration. Published in 2010 by John Wiley & Sons, Ltd.
Isolation and Characterization of Precise Dye/Dendrimer Ratios
Dougherty, Casey A.; Furgal, Joseph C.; van Dongen, Mallory A.; Goodson, Theodore; Banaszak Holl, Mark M.; Manono, Janet; DiMaggio, Stassi
2014-01-01
Fluorescent dyes are commonly conjugated to nanomaterials for imaging applications using stochastic synthesis conditions that result in a Poisson distribution of dye/particle ratios and therefore a broad range of photophysical and biodistribution properties. We report the isolation and characterization of generation 5 poly(amidoamine) (G5 PAMAM) dendrimer samples containing 1, 2, 3, and 4 fluorescein (FC) or 6-carboxytetramethylrhodamine succinimidyl ester (TAMRA) dyes per polymer particle. For the fluorescein case, this was achieved by stochastically functionalizing dendrimer with a cyclooctyne 'click' ligand, separation into samples containing precisely defined 'click' ligand/particle ratios using reverse-phase high performance liquid chromatography (rp-HPLC), followed by reaction with excess azide-functionalized fluorescein dye. For the TAMRA samples, stochastically functionalized dendrimer was directly separated into precise dye/particle ratios using rp-HPLC. These materials were characterized using 1H and 19F NMR, rp-HPLC, UV-Vis and fluorescence spectroscopy, lifetime measurements, and MALDI. PMID:24604830
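The Poisson spread of dye/particle ratios that motivates the separation can be illustrated with a short simulation (a sketch; the assumed mean of 1.5 dyes per particle is illustrative, not a value from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)

# Stochastic conjugation: each particle receives a Poisson-distributed
# number of dyes, so a single synthesis yields a mixture of ratios.
mean_dyes = 1.5  # assumed average labelling density (hypothetical)
counts = rng.poisson(mean_dyes, size=10_000)  # dyes per particle

values, freq = np.unique(counts, return_counts=True)
for v, f in zip(values, freq):
    print(f"{v} dye(s)/particle: {f / len(counts):.1%}")
```

Even at a modest mean labelling density, the simulation shows a sizeable fraction of unlabelled particles alongside multiply labelled ones, which is why a post-synthesis separation into precise ratios is needed.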
Criteria for the use of regression analysis for remote sensing of sediment and pollutants
NASA Technical Reports Server (NTRS)
Whitlock, C. H.; Kuo, C. Y.; Lecroy, S. R.
1982-01-01
An examination of limitations, requirements, and precision of the linear multiple-regression technique for quantification of marine environmental parameters is conducted. Both environmental and optical physics conditions have been defined for which an exact solution to the signal response equations is of the same form as the multiple regression equation. Various statistical parameters are examined to define a criteria for selection of an unbiased fit when upwelled radiance values contain error and are correlated with each other. Field experimental data are examined to define data smoothing requirements in order to satisfy the criteria of Daniel and Wood (1971). Recommendations are made concerning improved selection of ground-truth locations to maximize variance and to minimize physical errors associated with the remote sensing experiment.
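The multiple-regression step referred to above can be sketched as an ordinary least-squares fit of a water parameter against upwelled radiance in several bands (synthetic data; the three bands and coefficient values are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic upwelled radiance in three bands (predictors) and a
# sediment concentration generated from a known linear model plus noise.
n_samples = 200
radiance = rng.uniform(0.0, 1.0, size=(n_samples, 3))
true_coeffs = np.array([2.0, -1.0, 0.5])  # assumed for illustration
sediment = radiance @ true_coeffs + 3.0 + rng.normal(0.0, 0.05, n_samples)

# Ordinary least squares: augment with an intercept column and solve.
design = np.column_stack([np.ones(n_samples), radiance])
coeffs, *_ = np.linalg.lstsq(design, sediment, rcond=None)
print("intercept, band coefficients:", np.round(coeffs, 2))
```

With noisy, mutually correlated radiance values the recovered coefficients drift from the true ones, which is the bias problem the criteria in the abstract are meant to guard against.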
DILEMMAS IN THE ATTITUDE TOWARDS SUICIDE.
Carasevici, B
2016-01-01
Although apparently easy to define, the suicidal act or attempt raises complex and difficult problems due to the multitude of conditions and situations that can lead to it. In all cases, the definition of suicide has always centred on a person's intention to deliberately cause his or her own death in an active manner. Defining suicide has been, in turn, the temptation of philosophers, sociologists, theologians, psychologists and psychiatrists. From an epistemological point of view, suicide is an open concept without precise borders, yet not incoherent. Scientists have constantly tried to establish evaluation criteria for suicidal acts, but these are variable. One can even assume that there is an infinity of combinations of characteristics that would legitimize the label of suicide, although none of them can be particularized in any way. Not even death itself represents a necessary condition for the evaluation of an act as suicide.
Software-defined microwave photonic filter with high reconfigurable resolution
Wei, Wei; Yi, Lilin; Jaouën, Yves; Hu, Weisheng
2016-01-01
Microwave photonic filters (MPFs) are of great interest in radio frequency systems since they provide prominent flexibility on microwave signal processing. Although filter reconfigurability and tunability have been demonstrated repeatedly, it is still difficult to control the filter shape with very high precision. Thus the MPF application is basically limited to signal selection. Here we present a polarization-insensitive single-passband arbitrary-shaped MPF with ~GHz bandwidth based on stimulated Brillouin scattering (SBS) in optical fibre. For the first time the filter shape, bandwidth and central frequency can all be precisely defined by software with ~MHz resolution. The unprecedented multi-dimensional filter flexibility offers new possibilities to process microwave signals directly in optical domain with high precision thus enhancing the MPF functionality. Nanosecond pulse shaping by implementing precisely defined filters is demonstrated to prove the filter superiority and practicability. PMID:27759062
Hackland, James O S; Frith, Tom J R; Thompson, Oliver; Marin Navarro, Ana; Garcia-Castro, Martin I; Unger, Christian; Andrews, Peter W
2017-10-10
Defects in neural crest development have been implicated in many human disorders, but information about human neural crest formation mostly depends on extrapolation from model organisms. Human pluripotent stem cells (hPSCs) can be differentiated into in vitro counterparts of the neural crest, and some of the signals known to induce neural crest formation in vivo are required during this process. However, the protocols in current use tend to produce variable results, and there is no consensus as to the precise signals required for optimal neural crest differentiation. Using a fully defined culture system, we have now found that the efficient differentiation of hPSCs to neural crest depends on precise levels of BMP signaling, which are vulnerable to fluctuations in endogenous BMP production. We present a method that controls for this phenomenon and could be applied to other systems where endogenous signaling can also affect the outcome of differentiation protocols. Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.
Eliminating barriers to personalized medicine
2014-01-01
With the emergence of high-throughput discovery platforms, robust preclinical small-animal models, and efficient clinical trial pipelines, it is becoming possible to envision a time when the treatment of human neurologic diseases will become personalized. The emergence of precision medicine will require the identification of subgroups of patients most likely to respond to specific biologically based therapies. This stratification only becomes possible when the determinants that contribute to disease heterogeneity become more fully elucidated. This review discusses the defining factors that underlie disease heterogeneity relevant to the potential for individualized brain tumor (optic pathway glioma) treatments arising in the common single-gene cancer predisposition syndrome, neurofibromatosis type 1 (NF1). In this regard, NF1 is posited as a model genetic condition to establish a workable paradigm for actualizing precision therapeutics for other neurologic disorders. PMID:24975854
A Model for Axial Magnetic Bearings Including Eddy Currents
NASA Technical Reports Server (NTRS)
Kucera, Ladislav; Ahrens, Markus
1996-01-01
This paper presents an analytical method of modelling eddy currents inside axial bearings. The problem is solved by dividing an axial bearing into elementary geometric forms, solving the Maxwell equations for these simplified geometries, defining boundary conditions and combining the geometries. The final result is an analytical solution for the flux, from which the impedance and the force of an axial bearing can be derived. Several impedance measurements have shown that the analytical solution can fit the measured data with a precision of approximately 5%.
2011-06-01
…to build a membership fact. The atom definition also defines the precise order of the pieces; each argument has a label (D) and a type (E). … Figure 2 shows the inference rule editor (B: name; E: rule premises; F: rule conclusions). … created using this specific rule: one premise in the rule premises list (E), which represents a list of fact conditions that need to be found in the fact…
Self-defining memories during exposure to music in Alzheimer's disease.
El Haj, Mohamad; Antoine, Pascal; Nandrino, Jean Louis; Gély-Nargeot, Marie-Christine; Raffard, Stéphane
2015-10-01
Research suggests that exposure to music may enhance autobiographical recall in Alzheimer's Disease (AD) patients. This study investigated whether exposure to music could enhance the production of self-defining memories, that is, memories that contribute to self-discovery, self-understanding, and identity in AD patients. Twenty-two mild-stage AD patients and 24 healthy controls were asked to produce autobiographical memories in silence, while listening to researcher-chosen music, and to their own-chosen music. AD patients showed better autobiographical recall when listening to their own-chosen music than to researcher-chosen music or than in silence. More precisely, they produced more self-defining memories during exposure to their own-chosen music than to researcher-chosen music or during silence. Additionally, AD patients produced more self-defining memories than autobiographical episodes or personal-semantics during exposure to their own-chosen music. This pattern contrasted with the poor production of self-defining memories during silence or during exposure to researcher-chosen music. Healthy controls did not seem to enjoy the same autobiographical benefits or the same self-defining memory enhancement in the self-chosen music condition. Poor production of self-defining memories, as observed in AD, may somehow be alleviated by exposure to self-chosen music.
Report of the first Nimbus-7 SMMR Experiment Team Workshop
NASA Technical Reports Server (NTRS)
Campbell, W. J.; Gloersen, P.
1983-01-01
Preliminary results on sea ice, and techniques for calculating sea ice concentration and multiyear fraction from the microwave radiances obtained with the Nimbus-7 SMMR, were presented. From these results, it is evident that the groups used different and independent approaches in deriving sea ice emissivities and algorithms, which precluded precise comparison of their results. A common set of sea ice emissivities was defined for all groups to use for a subsequent, more careful comparison of results from the various sea ice parameter algorithms. To this end, three different geographical areas in two different time intervals were defined as typifying SMMR beam-filling conditions for first-year sea ice, multiyear sea ice, and open water, to be used for determining the required microwave emissivities.
Lindborg, Beth A; Brekke, John H; Vegoe, Amanda L; Ulrich, Connor B; Haider, Kerri T; Subramaniam, Sandhya; Venhuizen, Scott L; Eide, Cindy R; Orchard, Paul J; Chen, Weili; Wang, Qi; Pelaez, Francisco; Scott, Carolyn M; Kokkoli, Efrosini; Keirstead, Susan A; Dutton, James R; Tolar, Jakub; O'Brien, Timothy D
2016-07-01
Tissue organoids are a promising technology that may accelerate development of the societal and NIH mandate for precision medicine. Here we describe a robust and simple method for generating cerebral organoids (cOrgs) from human pluripotent stem cells by using a chemically defined hydrogel material and chemically defined culture medium. By using no additional neural induction components, cOrgs appeared on the hydrogel surface within 10-14 days, and under static culture conditions, they attained sizes up to 3 mm in greatest dimension by day 28. Histologically, the organoids showed neural rosette and neural tube-like structures and evidence of early corticogenesis. Immunostaining and quantitative reverse-transcription polymerase chain reaction demonstrated protein and gene expression representative of forebrain, midbrain, and hindbrain development. Physiologic studies showed responses to glutamate and depolarization in many cells, consistent with neural behavior. The method of cerebral organoid generation described here facilitates access to this technology, enables scalable applications, and provides a potential pathway to translational applications where defined components are desirable. Tissue organoids are a promising technology with many potential applications, such as pharmaceutical screens and development of in vitro disease models, particularly for human polygenic conditions where animal models are insufficient. This work describes a robust and simple method for generating cerebral organoids from human induced pluripotent stem cells by using a chemically defined hydrogel material and chemically defined culture medium. This method, by virtue of its simplicity and use of defined materials, greatly facilitates access to cerebral organoid technology, enables scalable applications, and provides a potential pathway to translational applications where defined components are desirable. ©AlphaMed Press.
Precise Interval Timer for Software Defined Radio
NASA Technical Reports Server (NTRS)
Pozhidaev, Aleksey (Inventor)
2014-01-01
A precise digital fractional interval timer for software defined radios that vary their waveform on a packet-by-packet basis. The timer allows for a variable-length preamble in the RF packet and allows the boundaries of the TDMA (Time Division Multiple Access) slots of an SDR receiver to be adjusted based on the reception of the RF packet of interest.
Robust Flight Path Determination for Mars Precision Landing Using Genetic Algorithms
NASA Technical Reports Server (NTRS)
Bayard, David S.; Kohen, Hamid
1997-01-01
This paper documents the application of genetic algorithms (GAs) to the problem of robust flight path determination for Mars precision landing. The robust flight path problem is defined here as the determination of the flight path which delivers a low-lift open-loop controlled vehicle to its desired final landing location while minimizing the effect of perturbations due to uncertainty in the atmospheric model and entry conditions. The genetic algorithm was capable of finding solutions which reduced the landing error from 111 km RMS radial (open-loop optimal) to 43 km RMS radial (optimized with respect to perturbations) using 200 hours of computation on an Ultra-SPARC workstation. Further reduction in the landing error is possible by going to closed-loop control which can utilize the GA optimized paths as nominal trajectories for linearization.
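The optimization loop described above can be sketched in miniature. The code below is an illustrative toy, not the JPL implementation: the trajectory simulation is replaced by a simple stand-in objective whose "landing error" is a distance from a hypothetical target perturbed by random noise, mirroring how the GA averages fitness over perturbed entry conditions.

```python
import random

random.seed(0)

def rms_error(genome, n_perturb=20):
    # Toy stand-in for the entry-trajectory simulation: the "landing error"
    # is the squared distance of the genome from a hypothetical target,
    # averaged over random perturbations (entry-condition uncertainty).
    target = [0.3, -0.7, 0.5]
    total = 0.0
    for _ in range(n_perturb):
        err = sum((g - t + random.gauss(0, 0.05)) ** 2
                  for g, t in zip(genome, target))
        total += err
    return (total / n_perturb) ** 0.5

def evolve(pop_size=30, generations=40):
    # Random initial population of 3-parameter control vectors.
    pop = [[random.uniform(-1, 1) for _ in range(3)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=rms_error)
        parents = pop[: pop_size // 2]           # truncation selection
        children = []
        for _ in range(pop_size - len(parents)):
            a, b = random.sample(parents, 2)
            child = [(x + y) / 2 + random.gauss(0, 0.02)  # crossover + mutation
                     for x, y in zip(a, b)]
            children.append(child)
        pop = parents + children
    return min(pop, key=rms_error)

best = evolve()
```

Averaging the fitness over perturbation draws is what makes the resulting path "robust" in the sense of the abstract: the GA is rewarded for solutions whose error stays small across the uncertainty distribution, not just at the nominal conditions.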
NASA Astrophysics Data System (ADS)
Belo, João Filipe; Greenberg, Michael; Igarashi, Atsushi; Pierce, Benjamin C.
Manifest contracts track precise properties by refining types with predicates; e.g., {x : Int | x > 0} denotes the positive integers. Contracts and polymorphism make a natural combination: programmers can give strong contracts to abstract types, precisely stating pre- and post-conditions while hiding implementation details. For example, an abstract type of stacks might specify that the pop operation has input type {x : α Stack | not (empty x)}. We formalize this combination by defining FH, a polymorphic calculus with manifest contracts, and establishing fundamental properties including type soundness and relational parametricity. Our development relies on a significant technical improvement over earlier presentations of contracts: instead of introducing a denotational model to break a problematic circularity between typing, subtyping, and evaluation, we develop the metatheory of contracts in a completely syntactic fashion, omitting subtyping from the core system and recovering it post facto as a derived property.
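To make the refinement-type idea concrete, here is a hedged sketch of a run-time contract check in the style of {x : Int | x > 0}. All names are illustrative; FH itself is a typed calculus with statically tracked refinements, not a library, so this only mimics the dynamic-check half of the story.

```python
class ContractError(Exception):
    """Raised when a value fails its refinement predicate."""

def refine(base, pred, label):
    # A "manifest contract": a base type refined by a predicate.
    # Checking happens when a value crosses the contract boundary.
    def check(value):
        if not isinstance(value, base) or not pred(value):
            raise ContractError(f"value {value!r} violates {label}")
        return value
    return check

# {x : Int | x > 0} -- the positive integers
pos_int = refine(int, lambda x: x > 0, "{x : Int | x > 0}")
ok = pos_int(3)   # passes the check and is returned unchanged
```

In FH the analogous checks are casts inserted by the type system, and the parametricity result guarantees that clients of an abstract type cannot observe anything the contract does not expose.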
The parametrization of radio source coordinates in VLBI and its impact on the CRF
NASA Astrophysics Data System (ADS)
Karbon, Maria; Heinkelmann, Robert; Mora-Diaz, Julian; Xu, Minghui; Nilsson, Tobias; Schuh, Harald
2016-04-01
Celestial radio sources in the celestial reference frame (CRF) catalog are usually divided into three categories: defining, special handling, and others. The defining sources are those used for the datum realization of the CRF, i.e., they are included in the No-Net-Rotation (NNR) constraints that maintain the axis orientation of the frame, and each is modeled with a single set of constant coordinates. At the current level of precision, the choice of defining sources has a significant effect on the coordinates. For the ICRF2, 295 sources were chosen as defining sources based on their geometrical distribution, statistical properties, and stability. The number of defining sources is a compromise between the reliability of the datum, which increases with the number of sources, and the noise introduced by each source; the optimal number is thus a trade-off between reliability, geometry, and precision. In the ICRF2, only 39 sources were sorted into the special handling group because they show large fluctuations in their positions; they are therefore excluded from the NNR conditions, and their positions are normally estimated for each VLBI session rather than as global parameters. All remaining sources are classified as others. This is the largest group, containing sources that have not shown any very problematic behavior but still do not fulfill the requirements for defining sources. A large fraction of these unstable sources nevertheless show other favorable characteristics, e.g., large flux density (brightness) and a long observing history, so it would be advantageous to include them in the NNR condition; their instability currently prevents this. If the coordinate model of these sources were extended, they could be used for the NNR condition as well.
Studies show that the behavior of each source can vary dramatically in time, so each source would have to be modeled individually. The sheer number of sources, more than 600 in our study, sets practical limitations, however. We therefore use the multivariate adaptive regression splines (MARS) procedure to parametrize the source coordinates, as it allows a great deal of automation by combining recursive partitioning and spline fitting in an optimal way: the algorithm finds the ideal knot positions for the splines and thus the best number of polynomial pieces to fit the data. We compare linear and cubic splines determined by MARS with manually determined linear splines and assess their impact on the CRF. Within this work we try to answer the following questions: How can we find optimal criteria for the definition of the defining and unstable sources? What are the best polynomials for the individual categories? How much can we improve the CRF by extending the parametrization of the sources?
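The knot-selection idea at the heart of MARS can be illustrated in a few lines. The sketch below is a deliberate simplification, not the full MARS algorithm: it scans candidate knot positions, fits an independent least-squares line on each side (rather than continuous hinge functions), and keeps the knot that minimizes the residual error. The synthetic "source position" series is hypothetical, with a slope change at t = 5.

```python
def linfit(xs, ys):
    # Ordinary least-squares line y = a + b*x.
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sxx if sxx else 0.0
    return my - b * mx, b

def sse(xs, ys):
    a, b = linfit(xs, ys)
    return sum((y - (a + b * x)) ** 2 for x, y in zip(xs, ys))

def best_knot(xs, ys):
    # Scan candidate knots; keep at least 2 points per segment.
    best = None
    for i in range(2, len(xs) - 2):
        err = sse(xs[:i], ys[:i]) + sse(xs[i:], ys[i:])
        if best is None or err < best[1]:
            best = (xs[i], err)
    return best[0]

# Synthetic piecewise-linear series with a break at t = 5
ts = [t * 0.5 for t in range(21)]    # 0.0 .. 10.0
pos = [0.1 * t if t <= 5 else 0.5 + 0.4 * (t - 5) for t in ts]
knot = best_knot(ts, pos)            # recovers a knot near t = 5
```

Full MARS generalizes this to many knots and interacting hinge terms, with forward selection and backward pruning, which is what makes it attractive for automating the parametrization of several hundred sources.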
On the precision of experimentally determined protein folding rates and φ-values
De Los Rios, Miguel A.; Muralidhara, B.K.; Wildes, David; Sosnick, Tobin R.; Marqusee, Susan; Wittung-Stafshede, Pernilla; Plaxco, Kevin W.; Ruczinski, Ingo
2006-01-01
φ-Values, a relatively direct probe of transition-state structure, are an important benchmark in both experimental and theoretical studies of protein folding. Recently, however, significant controversy has emerged regarding the reliability with which φ-values can be determined experimentally: because φ is a ratio of differences between experimental observables, it is extremely sensitive to errors in those observations when the differences are small. Here we address this issue directly by performing blind, replicate measurements in three laboratories. By monitoring within- and between-laboratory variability, we have determined the precision with which folding rates and φ-values are measured using generally accepted laboratory practices and under conditions typical of our laboratories. We find that, unless the change in free energy associated with the probing mutation is quite large, the precision of φ-values is relatively poor when determined using rates extrapolated to the absence of denaturant. In contrast, when we employ rates estimated at nonzero denaturant concentrations or assume that the slopes of the chevron arms (mf and mu) are invariant upon mutation, the precision of our estimates of φ is significantly improved. Nevertheless, the reproducibility we thus obtain still compares poorly with the confidence intervals typically reported in the literature. This discrepancy appears to arise due to differences in how precision is calculated, the dependence of precision on the number of data points employed in defining a chevron, and interlaboratory sources of variability that may have been largely ignored in the prior literature. PMID:16501226
Precision Learning Assessment: An Alternative to Traditional Assessment Techniques.
ERIC Educational Resources Information Center
Caltagirone, Paul J.; Glover, Christopher E.
1985-01-01
A continuous and curriculum-based assessment method, Precision Learning Assessment (PLA), which integrates precision teaching and norm-referenced techniques, was applied to a math computation curriculum for 214 third graders. The resulting districtwide learning curves defining average annual progress through the computation curriculum provided…
Sensor-based precision fertilization for field crops
USDA-ARS?s Scientific Manuscript database
From the development of the first viable variable-rate fertilizer systems in the upper Midwest USA, precision agriculture is now approaching three decades old. Early precision fertilization practice relied on laboratory analysis of soil samples collected on a spatial pattern to define the nutrient-s...
Submillihertz magnetic spectroscopy performed with a nanoscale quantum sensor
NASA Astrophysics Data System (ADS)
Schmitt, Simon; Gefen, Tuvia; Stürner, Felix M.; Unden, Thomas; Wolff, Gerhard; Müller, Christoph; Scheuer, Jochen; Naydenov, Boris; Markham, Matthew; Pezzagna, Sebastien; Meijer, Jan; Schwarz, Ilai; Plenio, Martin; Retzker, Alex; McGuinness, Liam P.; Jelezko, Fedor
2017-05-01
Precise timekeeping is critical to metrology, forming the basis by which standards of time, length, and fundamental constants are determined. Stable clocks are particularly valuable in spectroscopy because they define the ultimate frequency precision that can be reached. In quantum metrology, the qubit coherence time defines the clock stability, from which the spectral linewidth and frequency precision are determined. We demonstrate a quantum sensing protocol in which the spectral precision goes beyond the sensor coherence time and is limited by the stability of a classical clock. Using this technique, we observed a precision in frequency estimation scaling in time T as T^(-3/2) for classical oscillating fields. The narrow-linewidth magnetometer based on single spins in diamond is used to sense nanoscale magnetic fields with an intrinsic frequency resolution of 607 microhertz, which is eight orders of magnitude narrower than the qubit coherence time.
Impulsivity modulates performance under response uncertainty in a reaching task.
Tzagarakis, C; Pellizzer, G; Rogers, R D
2013-03-01
We sought to explore the interaction of the impulsivity trait with response uncertainty. To this end, we used a reaching task (Pellizzer and Hedges in Exp Brain Res 150:276-289, 2003) in which a motor response direction was cued at different levels of uncertainty (1 cue, i.e., no uncertainty, 2 cues, or 3 cues). Data from 95 healthy adults (54 F, 41 M) were analysed. Impulsivity was measured using the Barratt Impulsiveness Scale version 11 (BIS-11). The behavioural variables recorded were reaction time (RT), errors of commission (referred to as 'early errors'), and errors of precision. Data analysis employed generalised linear mixed models and generalised additive mixed models. For the early errors, there was an interaction of impulsivity with uncertainty and gender, with increased errors for high impulsivity in the one-cue condition for women and the three-cue condition for men. There was no effect of impulsivity on precision errors or RT. However, analysis of the effect of RT and impulsivity on precision errors showed a different pattern for high versus low impulsives in the high-uncertainty (3-cue) condition. In addition, there was a significant early-error speed-accuracy trade-off for women, primarily under low uncertainty, and a 'reverse' speed-accuracy trade-off for men under high uncertainty. These results extend those of past studies of impulsivity, helping to define it as a behavioural trait that modulates speed-versus-accuracy response styles depending on environmental constraints, and highlight once more the importance of gender in the interplay of personality and behaviour.
ERIC Educational Resources Information Center
Couch, Richard W.
Precision teaching (PT) is an approach to the science of human behavior that focuses on precise monitoring of carefully defined behaviors in an attempt to construct an environmental analysis of that behavior and its controlling variables. A variety of subjects have been used with PT, ranging in academic objectives from beginning reading to college…
Eliminating barriers to personalized medicine: learning from neurofibromatosis type 1.
Gutmann, David H
2014-07-29
With the emergence of high-throughput discovery platforms, robust preclinical small-animal models, and efficient clinical trial pipelines, it is becoming possible to envision a time when the treatment of human neurologic diseases will become personalized. The emergence of precision medicine will require the identification of subgroups of patients most likely to respond to specific biologically based therapies. This stratification only becomes possible when the determinants that contribute to disease heterogeneity become more fully elucidated. This review discusses the defining factors that underlie disease heterogeneity relevant to the potential for individualized brain tumor (optic pathway glioma) treatments arising in the common single-gene cancer predisposition syndrome, neurofibromatosis type 1 (NF1). In this regard, NF1 is posited as a model genetic condition to establish a workable paradigm for actualizing precision therapeutics for other neurologic disorders. © 2014 American Academy of Neurology.
NASA Astrophysics Data System (ADS)
Gawronek, Pelagia; Makuch, Maria
2017-12-01
Classical measurements of railway bridge stability, in the context of determining the vertical displacements of the structure, have consisted of precise leveling of the girders and trigonometric leveling of control points fixed to the girders' surface. The structural elements measured in these two ways in fact belonged to the same vertical planes. Height measurements of the structure were carried out during periodic structural stability tests and during static load tests of the bridge under a train. The specific nature of displacement measurements, the type of structure measured, and the surveying conditions on the railway determined the height-measurement methodology. The article presents the agreement between the vertical displacements of a steel railway bridge determined by the two measurement methods. In conclusion, the authors propose an optimal approach for determining the vertical displacements of girders using precise and trigonometric leveling, in terms of accuracy, safety, and economy of measurement.
Johnson-Chavarria, Eric M.; Agrawal, Utsav; Tanyeri, Melikhan; Kuhlman, Thomas E.
2014-01-01
We report an automated microfluidic-based platform for single cell analysis that allows for cell culture in free solution with the ability to control the cell growth environment. Using this approach, cells are confined by the sole action of gentle fluid flow, thereby enabling non-perturbative analysis of cell growth away from solid boundaries. In addition, the single cell microbioreactor allows for precise and time-dependent control over cell culture media, with the combined ability to observe the dynamics of non-adherent cells over long time scales. As a proof-of-principle demonstration, we used the platform to observe dynamic cell growth, gene expression, and intracellular diffusion of repressor proteins while precisely tuning the cell growth environment. Overall, this microfluidic approach enables the direct observation of cellular dynamics with exquisite control over environmental conditions, which will be useful for quantifying the behaviour of single cells in well-defined media. PMID:24836754
Pascoal, Lívia Maia; Lopes, Marcos Venícios de Oliveira; Chaves, Daniel Bruno Resende; Beltrão, Beatriz Amorim; da Silva, Viviane Martins; Monteiro, Flávia Paula Magalhães
2015-01-01
OBJECTIVE: to analyze the accuracy of the defining characteristics of the Impaired gas exchange nursing diagnosis in children with acute respiratory infection. METHOD: open prospective cohort study conducted with 136 children monitored for a consecutive period of at least six days and not more than ten days. An instrument based on the defining characteristics of the Impaired gas exchange diagnosis and on literature addressing pulmonary assessment was used to collect data. The accuracy means of all the defining characteristics under study were computed. RESULTS: the Impaired gas exchange diagnosis was present in 42.6% of the children in the first assessment. Hypoxemia was the characteristic that presented the best measures of accuracy. Abnormal breathing presented high sensitivity, while restlessness, cyanosis, and abnormal skin color showed high specificity. All the characteristics presented negative predictive values of 70% and cyanosis stood out by its high positive predictive value. CONCLUSION: hypoxemia was the defining characteristic that presented the best predictive ability to determine Impaired gas exchange. Studies of this nature enable nurses to minimize variability in clinical situations presented by the patient and to identify more precisely the nursing diagnosis that represents the patient's true clinical condition. PMID:26155010
Simplicity constraints: A 3D toy model for loop quantum gravity
NASA Astrophysics Data System (ADS)
Charles, Christoph
2018-05-01
In loop quantum gravity, tremendous progress has been made using the Ashtekar-Barbero variables. These variables, defined in a gauge fixing of the theory, correspond to a parametrization of the solutions of the so-called simplicity constraints. Their geometrical interpretation is, however, unsatisfactory, as they do not constitute a space-time connection. This point could be resolved by using a full Lorentz connection or, equivalently, the self-dual Ashtekar variables. Doing so, however, leads to simplicity constraints or reality conditions that are notoriously difficult to implement in the quantum theory. In this paper we explore the possibility of using completely degenerate actions to impose such constraints at the quantum level in the context of canonical quantization. To do so, we define a simpler model, in 3D, with similar constraints by extending the phase space to include an independent vielbein. We define the classical model and show that a precise quantum theory can be defined out of it by gauge unfixing, completely equivalent to the standard 3D Euclidean quantum gravity. We discuss possible future explorations around this model, which could serve as a stepping stone toward defining full-fledged covariant loop quantum gravity.
Technical performance of lactate biosensors and a test-strip device during labour.
Luttkus, A K; Fotopoulou, C; Sehouli, J; Stupin, J; Dudenhausen, J W
2010-04-01
Lactate in fetal blood has a high diagnostic power to detect fetal compromise due to hypoxia, as lactate allows an estimation of the duration and intensity of metabolic acidemia. Biosensor technology allows an instantaneous diagnosis of fetal compromise in the delivery room. The goal of the current investigation is to define the preanalytical and analytical biases of this technology under routine conditions in a labour ward, in comparison to test-strip technology, which measures lactate alone. Three lactate biosensors (RapidLab 865, Siemens Medical Solutions Diagnostics, Bad Nauheim, Germany; Radiometer ABL625 and ABL700, Radiometer, Copenhagen, Denmark) and one test-strip device (Lactate Pro, Oxford Instruments, UK) were evaluated for precision in serial and repetitive measurements in over 1350 samples of fetal whole blood. The coefficient of variation (CV) and the standard deviation (SD) were calculated. The average value of all three biosensors was defined as an artificial reference value (refval). Blood tonometry was performed in order to test the quality of the respiratory parameters and to simulate conditions of fetal hypoxia (pO2: 10 and 20 mmHg). Serial measurements on all biosensors showed a CV between 1.55 and 3.16% with an SD from 0.042 to 0.053 mmol/L; the test-strip device (Lactate Pro) showed an SD of 0.117 mmol/L and a CV of 3.99%. Compared to the reference value (refval), ABL625 showed the closest agreement at -0.1%, while Siemens RapidLab 865 showed an overestimation of +8.9%, ABL700 an underestimation of -6.2%, and Lactate Pro of -3.7%. For routine use, all tested biosensors show sufficient precision; the test-strip device shows a slightly higher standard deviation. A direct comparison of measured lactate values from the various devices needs to be interpreted with caution, as each method detects different lactate concentrations.
Furthermore, the 40 min process of tonometry led to an increase of SD and coefficient of variation in all devices. This results in the important preanalytical finding that the precision of replicated measurements worsens significantly with time. The clinician should be aware of the type of analyser used and of preanalytical biases before making clinical decisions on the basis of lactate values.
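For reference, the SD and CV quoted in precision studies like this one are related by CV = 100 x SD / mean. The sketch below makes the computation explicit using hypothetical replicate lactate readings (mmol/L), not data from the study.

```python
def mean_sd_cv(values):
    # Sample standard deviation (n - 1 denominator) and CV in percent.
    m = sum(values) / len(values)
    var = sum((v - m) ** 2 for v in values) / (len(values) - 1)
    sd = var ** 0.5
    return m, sd, 100.0 * sd / m

# Hypothetical replicate readings of one fetal blood sample
readings = [2.10, 2.18, 2.05, 2.12, 2.15]
m, sd, cv = mean_sd_cv(readings)
```

Because CV scales the SD by the mean, it lets devices be compared across samples with different lactate levels, which is why the abstract reports both measures.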
An Aristotelian Account of Minimal Chemical Life
NASA Astrophysics Data System (ADS)
Bedau, Mark A.
2010-12-01
This paper addresses the open philosophical and scientific problem of explaining and defining life. This problem is controversial, and there is nothing approaching a consensus about what life is. This raises a philosophical meta-question: Why is life so controversial and so difficult to define? This paper proposes that we can attribute a significant part of the controversy over life to use of a Cartesian approach to explaining life, which seeks necessary and sufficient conditions for being an individual living organism, out of the context of other organisms and the abiotic environment. The Cartesian approach contrasts with an Aristotelian approach to explaining life, which considers life only in the whole context in which it actually exists, looks at the characteristic phenomena involving actual life, and seeks the deepest and most unified explanation for those phenomena. The phenomena of life might be difficult to delimit precisely, but it certainly includes life's characteristic hallmarks, borderline cases, and puzzles. The Program-Metabolism-Container (PMC) model construes minimal chemical life as a functionally integrated triad of chemical systems, which are identified as the Program, Metabolism, and Container. Rasmussen diagrams precisely depict the functional definition of minimal chemical life. The PMC model illustrates the Aristotelian approach to life, because it explains eight of life's hallmarks, one of life's borderline cases (the virus), and two of life's puzzles.
Eigenvalues of the Laplacian of a graph
NASA Technical Reports Server (NTRS)
Anderson, W. N., Jr.; Morley, T. D.
1971-01-01
Let G be a finite undirected graph with no loops or multiple edges. The Laplacian matrix of G, Δ(G), is defined by Δ_ii = degree of vertex i and Δ_ij = -1 if there is an edge between vertex i and vertex j. The structure of the graph G is related to the eigenvalues of Δ(G); in particular, it is proved that all the eigenvalues of Δ(G) are nonnegative, less than or equal to the number of vertices, and less than or equal to twice the maximum vertex degree. Precise conditions for equality are given.
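The definition lends itself to a direct check. The minimal sketch below builds Δ(G) for the triangle K3, chosen because its Laplacian spectrum {0, 3, 3} is known in closed form, and verifies two consequences of the definition: each row sums to zero (so 0 is always an eigenvalue, with the all-ones eigenvector) and the trace equals the sum of degrees, i.e., the sum of the eigenvalues.

```python
def laplacian(n, edges):
    # Delta_ii = deg(i); Delta_ij = -1 if {i, j} is an edge, else 0.
    L = [[0] * n for _ in range(n)]
    for i, j in edges:
        L[i][j] = L[j][i] = -1
        L[i][i] += 1
        L[j][j] += 1
    return L

L3 = laplacian(3, [(0, 1), (1, 2), (0, 2)])   # triangle K3
trace = sum(L3[i][i] for i in range(3))       # sum of eigenvalues = 2|E|
row_sums = [sum(row) for row in L3]           # each 0 => eigenvalue 0 exists
# Spectrum of K3's Laplacian is {0, 3, 3}: every eigenvalue is nonnegative,
# at most n = 3 vertices, and at most 2 * max degree = 4, as the paper states.
```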
NASA Astrophysics Data System (ADS)
Niedermeier, Dennis; Voigtländer, Jens; Siebert, Holger; Desai, Neel; Shaw, Raymond; Chang, Kelken; Krueger, Steven; Schumacher, Jörg; Stratmann, Frank
2017-11-01
Turbulence - cloud droplet interaction processes have been investigated primarily through numerical simulation and field measurements over the last ten years. However, only in the laboratory can we be confident in our knowledge of the initial and boundary conditions, and only there can we measure for extended times under statistically stationary and repeatable conditions. The newly built turbulent wind tunnel LACIS-T (Turbulent Leipzig Aerosol Cloud Interaction Simulator) is therefore an ideal facility for pursuing a mechanistic understanding of these processes. Within the tunnel we are able to adjust precisely controlled turbulent temperature and humidity fields so as to achieve supersaturation levels that allow detailed investigations of the interactions between cloud microphysical processes (e.g., cloud droplet activation) and the turbulent flow, under well-defined and reproducible laboratory conditions. We will present the fundamental operating principle, first results from ongoing characterization efforts, numerical simulations, as well as first droplet activation experiments.
Validating internal controls for quantitative plant gene expression studies.
Brunner, Amy M; Yakovlev, Igor A; Strauss, Steven H
2004-08-18
Real-time reverse transcription PCR (RT-PCR) has greatly improved the ease and sensitivity of quantitative gene expression studies. However, accurate measurement of gene expression with this method relies on the choice of a valid reference for data normalization. Studies rarely verify that gene expression levels for reference genes are adequately consistent among the samples used, nor compare alternative genes to assess which are most reliable for the experimental conditions analyzed. Using real-time RT-PCR to study the expression of 10 poplar (genus Populus) housekeeping genes, we demonstrate a simple method for determining the degree of stability of gene expression over a set of experimental conditions. Based on a traditional method for analyzing the stability of varieties in plant breeding, it defines measures of gene expression stability from analysis of variance (ANOVA) and linear regression. We found that the potential internal control genes differed widely in their expression stability over the different tissues, developmental stages and environmental conditions studied. Our results support that quantitative comparisons of candidate reference genes are an important part of real-time RT-PCR studies that seek to precisely evaluate variation in gene expression. The method we demonstrated facilitates statistical and graphical evaluation of gene expression stability. Selection of the best reference gene for a given set of experimental conditions should enable detection of biologically significant changes in gene expression that are too small to be revealed by less precise methods, or when highly variable reference genes are unknowingly used in real-time RT-PCR experiments.
Base line estimation using single passes of laser data
NASA Technical Reports Server (NTRS)
Dunn, P. J.; Torrence, M.; Smith, D. E.; Kolenkiewicz, R.
1979-01-01
The laser data of the GEOS 3 satellite passes observed by four stations at Greenbelt (Maryland), Bermuda, Grand Turk Island (Bahamas) and Patrick Air Force Base (Florida), were employed to determine precise interstation base lines and relative heights in short orbital arcs of no more than 12-min duration. No more than five arcs of data are required to define the interstation base lines to 30-cm precision. Base lines running parallel to the orbital motion can be defined to submeter precision from a single short arc of data. Combining arcs of different orbital geometry in a common adjustment of two or more stations relative to the base station helps to compensate for weak base line definition in any single arc. This technique can be used for tracking such spacecraft as Lageos, a high-altitude retroreflector-carrying satellite designed for precise laser ranging studies.
Atomically precise edge chlorination of nanographenes and its application in graphene nanoribbons
Tan, Yuan-Zhi; Yang, Bo; Parvez, Khaled; Narita, Akimitsu; Osella, Silvio; Beljonne, David; Feng, Xinliang; Müllen, Klaus
2013-01-01
Chemical functionalization is one of the most powerful and widely used strategies to control the properties of nanomaterials, particularly in the field of graphene. However, the ill-defined structure of the present functionalized graphene inhibits atomically precise structural characterization and structure-correlated property modulation. Here we present a general edge chlorination protocol for atomically precise functionalization of nanographenes at different scales from 1.2 to 3.4 nm and its application in graphene nanoribbons. The well-defined edge chlorination is unambiguously confirmed by X-ray single-crystal analysis, which also discloses the characteristic non-planar molecular shape and detailed bond lengths of chlorinated nanographenes. Chlorinated nanographenes and graphene nanoribbons manifest enhanced solution processability associated with decreases in the optical band gap and frontier molecular orbital energy levels, exemplifying the structure-correlated property modulation by precise edge chlorination. PMID:24212200
Universal recovery map for approximate Markov chains.
Sutter, David; Fawzi, Omar; Renner, Renato
2016-02-01
A central question in quantum information theory is to determine how well lost information can be reconstructed. Crucially, the corresponding recovery operation should perform well without knowing the information to be reconstructed. In this work, we show that the quantum conditional mutual information measures the performance of such recovery operations. More precisely, we prove that the conditional mutual information I(A:C|B) of a tripartite quantum state ρ_ABC can be bounded from below by its distance to the closest recovered state R_{B→BC}(ρ_AB), where the C-part is reconstructed from the B-part only and the recovery map R_{B→BC} merely depends on ρ_BC. One particular application of this result implies the equivalence between two different approaches to define topological order in quantum systems.
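Written out, a bound of the kind described in the abstract takes the fidelity form common to Fawzi-Renner-type results; the precise constant and distance measure used in the paper itself should be checked against the original:

```latex
I(A:C|B)_{\rho} \;\geq\; -2\,\log_2 F\!\left(\rho_{ABC},\; \mathcal{R}_{B\to BC}(\rho_{AB})\right)
```

where F denotes the fidelity between the original state and the state recovered by acting on the B-part alone, and the map R_{B→BC} depends only on ρ_BC. When I(A:C|B) = 0 (a quantum Markov chain), the bound forces exact recoverability, which is the limiting case the "approximate Markov chain" terminology generalizes.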
Preheat Measurements for Supernova Hydrodynamics Experiments
NASA Astrophysics Data System (ADS)
Krauland, Christine; Kuranz, Carolyn; Drake, Paul; Grosskopf, Mike; Campbell, Duncan
2007-11-01
The use of multi-kilojoule, ns lasers to launch shock waves has become a standard method for initiating hydrodynamic experiments in the field of Laboratory Astrophysics. However, the intense laser ablation that creates moving plasma also leads to the production of unwanted energetic x-rays and suprathermal electrons, both of which can be sources of material preheating. In principle, this preheat can alter the conditions of the experimental setup prior to the desired experiment actually taking place. At the University of Michigan, ongoing Rayleigh-Taylor instability experiments are defined by precise initial conditions, and potential deformation due to preheat could greatly affect their accuracy. An experiment devised and executed in an attempt to assess the preheat in this specific case will be presented, along with the quantitative analysis of the data obtained.
Compositional redistribution during casting of Hg(0.8)Cd(0.2)Te alloys
NASA Technical Reports Server (NTRS)
Su, Ching-Hua; Perry, G. L. E.; Szofran, F. R.; Lehoczky, S. L.
1986-01-01
A series of Hg(0.8)Cd(0.2)Te ingots was cast both vertically and horizontally under well-defined thermal conditions by using a two-zone furnace with isothermal heat-pipe liners. The main objective of the experiments was to establish correlations between casting parameters and compositional redistribution and to develop ground-based data for a proposed flight experiment of casting of Hg(1-x)Cd(x)Te alloys under reduced gravity conditions. The compositional variations along the axial and radial directions were determined by precision density measurements, infrared transmission spectra, and X-ray energy dispersion spectrometry. Comparison between the experimental results and a numerical simulation of the solidification process of Hg(0.8)Cd(0.2)Te is described.
Chen, Yi-Dao; Huang, Shiang-Fu; Wang, Hung-Ming
2015-01-01
To precisely and faithfully perform cell-based drug chemosensitivity assays, a well-defined and biologically relevant culture condition is required. For the former, a perfusion microbioreactor system capable of providing a stable culture condition was adopted. For the latter, however, little is known about the impact of culture models on the physiology and chemosensitivity assay results of primary oral cavity cancer cells. To address these issues, experiments were performed. Results showed that a minor environmental pH change could significantly affect the metabolic activity of cells, demonstrating the importance of a stable culture condition for such assays. Moreover, the culture models could also significantly influence the metabolic activity and proliferation of cells. Furthermore, the choice of culture models might lead to different outcomes of chemosensitivity assays. Compared with the similar test based on tumor-level assays, the spheroid model could overestimate the drug resistance of cells to cisplatin, whereas the 2D and 3D culture models might overestimate the chemosensitivity of cells to such an anticancer drug. In this study, the 3D culture models with the same cell density as that in tumor samples showed chemosensitivity assay results comparable to those of the tumor-level assays. Overall, this study has provided some fundamental information for establishing a precise and faithful drug chemosensitivity assay. PMID:25654105
Montague, Shelby A; Baker, Bruce S
2016-01-01
An animal's ability to learn and to form memories is essential for its survival. The fruit fly has proven to be a valuable model system for studies of learning and memory. One learned behavior in fruit flies is courtship conditioning. In Drosophila courtship conditioning, male flies learn not to court females during training with an unreceptive female. A male retains a memory of this training and for several hours decreases courtship when subsequently paired with any female. Courtship conditioning is a unique learning paradigm; it uses a positive-valence stimulus, a female fly, to teach a male to decrease an innate behavior, courtship of the female. As such, courtship conditioning is not clearly categorized as either appetitive or aversive conditioning. The mushroom body (MB) region in the fruit fly brain is important for several types of memory; however, the precise subsets of intrinsic and extrinsic MB neurons necessary for courtship conditioning are unknown. Here, we disrupted synaptic signaling by driving a shibire-ts effector in precise subsets of MB neurons, defined by a collection of split-GAL4 drivers. Out of 75 lines tested, 32 showed defects in courtship conditioning memory. Surprisingly, we did not have any hits in the γ lobe Kenyon cells, a region previously implicated in courtship conditioning memory. We did find that several γ lobe extrinsic neurons were necessary for courtship conditioning memory. Overall, our memory hits in the dopaminergic neurons (DANs) and the mushroom body output neurons were more consistent with results from appetitive memory assays than aversive memory assays. For example, protocerebral anterior medial DANs were necessary for courtship memory, similar to appetitive memory, while protocerebral posterior lateral 1 (PPL1) DANs, important for aversive memory, were not needed. Overall, our results indicate that the MB circuits necessary for courtship conditioning memory coincide with circuits necessary for appetitive memory.
PMID:27764141
Chronic obstructive pulmonary disease: knowing what we mean, meaning what we say.
Joshi, J M
2008-01-01
Chronic obstructive pulmonary disease (COPD) is defined in several different ways using different criteria based on symptoms, physiological impairment and pathological abnormalities. While some use COPD to mean smoking-related chronic airway disease, others include all disorders causing chronic airway obstruction. When COPD is used as a broad descriptive term, specific disorders that cause chronic airway obstruction remain under-diagnosed and the prevalence estimates vary considerably. The lack of agreement over the precise terminology and classification of COPD has resulted in widespread confusion. Terminology includes definition, diagnostic criteria, and a system for staging severity. Recently, COPD has been defined more clearly and is diagnosed using precise criteria that include tobacco smoking of more than 10 pack-years, symptoms, and airway obstruction on spirometry. A multi-dimensional severity grading system, the BODE (body mass index, obstruction, dyspnoea, and exercise tolerance) index, has been designed to assess the respiratory and systemic expressions of COPD. This review proposes that the broad group of chronic disorders of the airways (with or without airway obstruction) be called chronic airway disease (CAD). The term COPD should be used exclusively for tobacco-smoking-related chronic airway disease. Chronic airway obstruction or obstructive lung disease may be used to define those conditions with airway obstruction caused by factors other than tobacco smoking. The aetiology may be appended to the label, for example, chronic airway obstruction/obstructive lung disease associated with bronchiectasis, associated with obliterative bronchiolitis, or due to biomass fuel/occupational exposure.
Precision mechatronics based on high-precision measuring and positioning systems and machines
NASA Astrophysics Data System (ADS)
Jäger, Gerd; Manske, Eberhard; Hausotte, Tino; Mastylo, Rostyslav; Dorozhovets, Natalja; Hofmann, Norbert
2007-06-01
Precision mechatronics is defined in this paper as the science and engineering of a new generation of high-precision systems and machines. Nanomeasuring and nanopositioning engineering represent important fields of precision mechatronics, and nanometrology is described as today's limit of precision engineering. The problem of how to design nanopositioning machines with the smallest possible uncertainties is discussed. The integration of several optical and tactile nanoprobes makes the 3D nanopositioning machine suitable for various tasks, such as long-range scanning probe microscopy, mask and wafer inspection, nanotribology, nanoindentation, and free-form surface measurement, as well as measurement of microoptics, precision molds, microgears, ring gauges and small holes.
Estimation of methane emission rate changes using age-defined waste in a landfill site.
Ishii, Kazuei; Furuichi, Toru
2013-09-01
Long term methane emissions from landfill sites are often predicted by first-order decay (FOD) models, in which the default coefficients of the methane generation potential and the methane generation rate given by the Intergovernmental Panel on Climate Change (IPCC) are usually used. However, previous studies have demonstrated the large uncertainty in these coefficients because they are derived from a calibration procedure under ideal steady-state conditions, not actual landfill site conditions. In this study, the coefficients in the FOD model were estimated by a new approach that considers region-specific conditions, in order to predict long term methane generation more precisely. In this approach, age-defined waste samples, which had been under actual landfill site conditions, were collected in Hokkaido, Japan (a cold region), and the time series data on the samples' methane generation potential were used to estimate the coefficients in the FOD model. The degradation coefficients were 0.0501/y and 0.0621/y, and the methane generation potentials were 214.4 mL/g-wet waste and 126.7 mL/g-wet waste, for paper and food waste respectively. These coefficients were compared with the default coefficients given by the IPCC. Although the degradation coefficient for food waste was smaller than the default value, the other coefficients were within the range of the default coefficients. With these new coefficients to calculate methane generation, the long term methane emissions from the landfill site were estimated at 1.35×10^4 m^3-CH4, which corresponds to approximately 2.53% of the total carbon dioxide emissions in the city (5.34×10^5 t-CO2/y). Copyright © 2013 Elsevier Ltd. All rights reserved.
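For illustration, the FOD model itself is compact. The sketch below uses the coefficients estimated in this study; the function names and the cumulative form are my own phrasing of the standard first-order decay equations, not code from the paper.

```python
import math

# Estimated coefficients from the study: degradation rate k (1/y) and
# methane generation potential L0 (mL CH4 per g wet waste).
COEFFS = {
    "paper": {"k": 0.0501, "L0": 214.4},
    "food":  {"k": 0.0621, "L0": 126.7},
}

def cumulative_methane(waste: str, t_years: float) -> float:
    """Cumulative CH4 (mL per g wet waste) generated after t_years:
    G(t) = L0 * (1 - exp(-k*t))."""
    c = COEFFS[waste]
    return c["L0"] * (1.0 - math.exp(-c["k"] * t_years))

def methane_rate(waste: str, t_years: float) -> float:
    """Instantaneous CH4 generation rate (mL per g wet waste per year):
    g(t) = L0 * k * exp(-k*t)."""
    c = COEFFS[waste]
    return c["L0"] * c["k"] * math.exp(-c["k"] * t_years)
```

For paper waste, for example, roughly 63% of the generation potential (about 136 mL/g) is released within the first 20 years, and the rate decays monotonically thereafter.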
The Role of Astro-Geodetic in Precise Guidance of Long Tunnels
NASA Astrophysics Data System (ADS)
Mirghasempour, M.; Jafari, A. Y.
2015-12-01
One of the prime tasks in tunnel surveying is guiding the excavation of a long tunnel from different directions so that all headings finally meet at a specific place. Because of its particular conditions, this kind of underground surveying differs from surveying on the ground in several respects: poor geometry of underground traverses; low-precision measurements of direction and length due to conditions such as refraction; and differences in gravity (both in value and in direction) between an underground point and the corresponding point on the ground. To solve these problems, astro-geodesy, a branch of geodesy, can help surveying engineers. In this article, the role of astronomy is defined in two respects. 1- Azimuth determination of directions from the entrance and exit networks of the tunnel, and calibration of gyro-theodolites used in underground traverses: by astronomical methods, the azimuth of a direction can be determined with an accuracy of 0.5 arcseconds, whereas no present-day gyroscope can measure azimuth to this accuracy; for instance, the accuracy of the most precise gyroscope (Gyromat 5000) is 1.2 cm over a distance of one kilometre (2.4 arcseconds). Furthermore, the calibration methods discussed in this article have significant effects on underground traverses. 2- Relating the height of the entrance point to that of the exit point is problematic and time consuming. For example, in a 3 km long tunnel (on the Arak-Khoram Abad freeway), relating the entrance point to the exit point requires about 90 km of levelling. Another example of this tedious and time-consuming levelling is the Kerman tunnel: the tunnel is 36 km long, but transferring the entrance point height to the exit point requires 150 km of levelling. This paper proposes astro-geodesy, with vertical deflections determined by the digital zenith camera system TZK2-D, as the solution to this difficulty.
These two elements make it possible to define the geoid profile along the tunnel azimuth at the entrance and exit of the tunnel, so that surveying engineers can transfer the entrance point height to the exit point in the easiest way.
Equality Matters: The Critical Implications of Precisely Defining Equality
ERIC Educational Resources Information Center
Faulkner, Valerie; Walkowiak, Temple; Cain, Chris; Lee, Carrie
2016-01-01
Equality is such an important concept for children to develop. In this article it is argued that a precise definition is needed to ensure that students are provided with a consistent "picture" of what it is that equality really means.
[Darwin versus Marx? Reflections on a book by Giovanni Jervis].
Cavallaro, Luigi
2012-01-01
Giovanni Jervis' 2002 book Individualismo e cooperazione. Psicologia della politica [Individualism and Cooperation: Psychology of Politics] is the outcome of a critical reflection begun by the author at the end of the 1970s in order to explore the manifestations and problems of cooperation between individuals, and to identify some "universal" psychological factors that could define the role of psychology within politics and constitute an "objective foundation" of any human culture. Although Jervis was, so to speak, favoring Darwin against Marx, it is argued that several of his arguments actually favor the inevitable "historicity" of individuals, owing to the social conditioning they are subjected to from birth: too often, certain "universalistic" approaches transmit, together with scientific advances (or even without them), clearly identifiable ideological motives linked to precise and well-defined historical and economic interests.
Yilmaz, B.; Kaban, S.; Akcay, B. K.
2015-01-01
In this study, simple, fast and reliable cyclic voltammetry, linear sweep voltammetry, square wave voltammetry and differential pulse voltammetry methods were developed and validated for the determination of etodolac in pharmaceutical preparations. The proposed methods were based on the electrochemical oxidation of etodolac at a platinum electrode in acetonitrile solution containing 0.1 M lithium perchlorate. A well-defined oxidation peak was observed at 1.03 V. The calibration curves were linear for etodolac over the concentration range of 2.5-50 μg/ml for the linear sweep, square wave and differential pulse voltammetry methods. Intra- and inter-day precision values for etodolac were less than 4.69%, and accuracy (relative error) was better than 2.00%. The mean recovery of etodolac was 100.6% for pharmaceutical preparations. No interference was found from three tablet excipients under the selected assay conditions. The methods developed in this study are accurate and precise and can be easily applied to Etol, Tadolak and Etodin tablets as pharmaceutical preparations. PMID:26664057
Pekala, Ronald J
2016-01-01
Wickramasekera II (2015) has penned a comprehensive and thoughtful review article demonstrating how empathy is intimately involved in the psychology and neurophysiology of hypnosis and the self. Hypnosis is a very "mental" or subjective phenomenon for both the client and the research participant. To better assess the mind of the client/participant during hypnosis, it is my belief that we need to generate more "precise" phenomenological descriptors of the mind during hypnosis and related empathic conditions, as Wickramasekera II (2015) has suggested in his article. Although any phenomenological methodology will have its limits and disadvantages, noetics (as defined in the article below) can help us better understand hypnosis, empathic involvement theory, and the brain/mind/behavior interface. By quantifying the mind in a comprehensive manner, just as the brain is comprehensively quantified via fMRI and qEEG technologies, noetic analysis can help us more precisely assess the mind and relate it to the brain and human behavior and experience.
Verification of spectrophotometric method for nitrate analysis in water samples
NASA Astrophysics Data System (ADS)
Kurniawati, Puji; Gusrianti, Reny; Dwisiwi, Bledug Bernanti; Purbaningtias, Tri Esti; Wiyantoko, Bayu
2017-12-01
The aim of this research was to verify a spectrophotometric method for analyzing nitrate in water samples, using the APHA 2012 Section 4500 NO3-B method. The verification parameters were linearity, method detection limit, limit of quantitation, level of linearity, accuracy and precision. Linearity was assessed using 0 to 50 mg/L nitrate standard solutions, and the correlation coefficient of the linear regression of the standard calibration was 0.9981. The method detection limit (MDL) was 0.1294 mg/L and the limit of quantitation (LOQ) was 0.4117 mg/L. The level of linearity (LOL) was 50 mg/L, and nitrate concentrations of 10 to 50 mg/L were linear at a confidence level of 99%. Accuracy was determined through the recovery value, which was 109.1907%. Precision was evaluated as the percent relative standard deviation (%RSD) of repeatability, which was 1.0886%. The tested performance criteria showed that the method was verified under the laboratory conditions.
Patient similarity for precision medicine: a systematic review.
Parimbelli, E; Marini, S; Sacchi, L; Bellazzi, R
2018-06-01
Evidence-based medicine is the most prevalent paradigm adopted by physicians. Clinical practice guidelines typically define a set of recommendations together with eligibility criteria that restrict their applicability to a specific group of patients. The ever-growing size and availability of health-related data are currently challenging the broad definitions of guideline-defined patient groups. Precision medicine leverages genetic, phenotypic, or psychosocial characteristics to provide precise identification of patient subsets for treatment targeting. Defining a patient similarity measure is thus an essential step in allowing stratification of patients into clinically meaningful subgroups. The present review investigates the use of patient similarity as a tool to enable precision medicine. 279 articles were analyzed along four dimensions: data types considered, clinical domains of application, data analysis methods, and translational stage of findings. Cancer-related research employing molecular profiling and standard data analysis techniques such as clustering constitutes the majority of the retrieved studies. Chronic and psychiatric diseases follow as the second most represented clinical domains. Interestingly, almost one quarter of the studies analyzed presented a novel methodology, with the most advanced employing data integration strategies and being portable to different clinical domains. Integration of such techniques into decision support systems constitutes an interesting trend for future research. Copyright © 2018. Published by Elsevier Inc.
Timeliner: Automating Procedures on the ISS
NASA Technical Reports Server (NTRS)
Brown, Robert; Braunstein, E.; Brunet, Rick; Grace, R.; Vu, T.; Zimpfer, Doug; Dwyer, William K.; Robinson, Emily
2002-01-01
Timeliner has been developed as a tool to automate procedural tasks. These tasks may be sequential tasks that would typically be performed by a human operator, or precisely ordered sequencing tasks that allow autonomous execution of a control process. The Timeliner system includes elements for compiling and executing sequences that are defined in the Timeliner language. The Timeliner language was specifically designed to allow easy definition of scripts that provide sequencing and control of complex systems. The execution environment provides real-time monitoring and control based on the commands and conditions defined in the Timeliner language. The Timeliner sequence control may be preprogrammed, compiled from Timeliner "scripts," or it may consist of real-time, interactive inputs from system operators. In general, the Timeliner system lowers the workload for mission or process control operations. In a mission environment, scripts can be used to automate spacecraft operations including autonomous or interactive vehicle control, performance of preflight and post-flight subsystem checkouts, or handling of failure detection and recovery. Timeliner may also be used for mission payload operations, such as stepping through pre-defined procedures of a scientific experiment.
Optimization of deformation monitoring networks using finite element strain analysis
NASA Astrophysics Data System (ADS)
Alizadeh-Khameneh, M. Amin; Eshagh, Mehdi; Jensen, Anna B. O.
2018-04-01
An optimal design of a geodetic network can fulfill the requested precision and reliability of the network and decrease the expense of its execution by removing unnecessary observations. The role of optimal design is highlighted in deformation monitoring networks because of the repeatability of these networks. The core design problem is how to define the precision and reliability criteria. This paper proposes a solution in which the precision criterion is defined based on the precision of the deformation parameters, i.e., the precision of strain and differential rotations. A strain analysis can be performed to obtain information about the possible deformation of a deformable object. In this study, we split an area into a number of three-dimensional finite elements with the help of the Delaunay triangulation and performed the strain analysis on each element. According to the obtained precision of the deformation parameters in each element, the precision criterion of displacement detection at each network point is then determined. The developed criterion is implemented to optimize the observations from the Global Positioning System (GPS) in the Skåne monitoring network in Sweden. The network was established in 1989 and straddles the Tornquist zone, one of the most active faults in southern Sweden. The numerical results show that 17 out of the 21 possible GPS baseline observations are sufficient to detect displacements of at least 3 mm at each network point.
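The study performs the strain analysis on three-dimensional finite elements; as a minimal illustration of the idea, the sketch below computes the constant strain of a single two-dimensional linear triangle from nodal displacements. The function and its notation are mine, not the paper's.

```python
def triangle_strain(coords, disp):
    """Constant strain of a linear (CST) triangular element.

    coords: three (x, y) node coordinates; disp: three (ux, uy) nodal
    displacements. Returns (eps_xx, eps_yy, gamma_xy), the in-plane
    normal strains and the engineering shear strain.
    """
    (x1, y1), (x2, y2), (x3, y3) = coords
    # det = twice the signed area of the triangle.
    det = (x2 - x1) * (y3 - y1) - (x3 - x1) * (y2 - y1)
    # Shape-function derivatives: b_i = dN_i/dx, c_i = dN_i/dy.
    b = [(y2 - y3) / det, (y3 - y1) / det, (y1 - y2) / det]
    c = [(x3 - x2) / det, (x1 - x3) / det, (x2 - x1) / det]
    eps_xx = sum(bi * ux for bi, (ux, _) in zip(b, disp))
    eps_yy = sum(ci * uy for ci, (_, uy) in zip(c, disp))
    gamma_xy = sum(ci * ux + bi * uy
                   for bi, ci, (ux, uy) in zip(b, c, disp))
    return eps_xx, eps_yy, gamma_xy
```

Propagating the covariance of the nodal displacements through these linear expressions yields the precision of the strain parameters, which is the quantity the paper turns into its design criterion.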
Pneumococcal infections at Hajj: current knowledge gaps.
Ridda, Iman; King, Catherine; Rashid, Harunor
2014-01-01
Hajj attendance increases the risk of respiratory infections including pneumonia. Streptococcus pneumoniae is a frequently identified pathogen, found in about 10% of respiratory tract samples of symptomatic Hajj pilgrims; and at least 20% of these isolates are penicillin resistant. However, the burden of pneumococcal disease at Hajj is not precisely defined at serotypic level, and it is postulated that due to intense mixing of pilgrims the distribution of pneumococcal serotypes at Hajj could be different from pilgrims' country of origin or of Saudi Arabia. In Saudi Arabia, the most prevalent pneumococcal serotypes are 23F, 6B, 19F, 18C, 4, 14, and 19A, and 90% of the serotypes are covered by 13-valent pneumococcal conjugate vaccine (PCV-13) as well as 23-valent pneumococcal polysaccharide vaccine (PPV-23). However, due to lack of Hajj-specific data, the Saudi Arabian Ministry of Health has not yet recommended pneumococcal vaccine for pilgrims, and the immunisation recommendation and uptake vary greatly across countries. As at least one third of Hajj pilgrims are 'at risk' of pneumococcal disease either by virtue of age or pre-existing medical conditions, consideration should be given to vaccinating high risk pilgrims against pneumococcal disease. Other preventive measures such as smoking cessation, pollution reduction and vaccinations against influenza and pertussis should also be considered. Precisely defining the epidemiology of pneumococcal disease to identify an optimum vaccination schedule for Hajj pilgrims is a current research priority.
Precision grip responses to unexpected rotational perturbations scale with axis of rotation.
De Gregorio, Michael; Santos, Veronica J
2013-04-05
It has been established that rapid, pulse-like increases in precision grip forces ("catch-up responses") are elicited by unexpected translational perturbations and that response latency and strength scale according to the direction of linear slip relative to the hand as well as gravity. To determine if catch-up responses are elicited by unexpected rotational perturbations and are strength-, axis-, and/or direction-dependent, we imposed step torque loads about each of two axes which were defined relative to the subject's hand: the distal-proximal axis away from and towards the subject's palm, and the grip axis which connects the two fingertips. Precision grip responses were dominated initially by passive mechanics and then by active, unimodal catch-up responses. First dorsal interosseous activity, marking the start of the catch-up response, began 71-89 ms after the onset of perturbation. The onset latency, shape, and duration (217-231 ms) of the catch-up response were not affected by the axis, direction, or magnitude of the rotational perturbation, while strength was scaled by axis of rotation and slip conditions. Rotations about the grip axis that tilted the object away from the palm and induced rotational slip elicited stronger catch-up responses than rotations about the distal-proximal axis that twisted the object between the digits. To our knowledge, this study is the first to investigate grip responses to unexpected torque loads and to show characteristic, yet axis-dependent, catch-up responses for conditions other than pure linear slip. Copyright © 2013 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Senkerik, Roman; Zelinka, Ivan; Davendra, Donald; Oplatkova, Zuzana
2010-06-01
This research deals with the optimization of the control of chaos by means of evolutionary algorithms. The work aims to explain how to use evolutionary algorithms (EAs) and how to properly define an advanced targeting cost function (CF) that secures very fast and precise stabilization of the desired state for any initial conditions. The one-dimensional logistic equation was used as a model of a deterministic chaotic system. The evolutionary algorithm Self-Organizing Migrating Algorithm (SOMA) was used in four versions. For each version, repeated simulations were conducted to outline the effectiveness and robustness of the method and the targeting CF.
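As an illustration of the kind of targeting cost function described here, the sketch below evaluates a simple CF for the logistic map. The proportional feedback, the activation window, and all constants are illustrative assumptions, not the paper's actual formulation (which is optimized by SOMA).

```python
import random

R = 4.0                  # fully chaotic logistic map x_{n+1} = R*x*(1-x)
X_STAR = 1.0 - 1.0 / R   # unstable fixed point x* = 0.75
DELTA = 0.05             # control is switched on only near x* (OGY-style)

def cost(F, n_steps=300, n_ics=10, seed=0):
    """Targeting-style CF: mean |x_n - x*| over trajectories started from
    several random initial conditions (the 'any initial conditions' demand)."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_ics):
        x = rng.uniform(0.05, 0.95)
        err = 0.0
        for _ in range(n_steps):
            if abs(x - X_STAR) < DELTA:
                # small perturbation F*(x - x*) applied near the target
                x = R * x * (1.0 - x) + F * (x - X_STAR)
            else:
                x = R * x * (1.0 - x)       # free chaotic iteration
            x = min(max(x, 0.0), 1.0)       # keep the state in [0, 1]
            err += abs(x - X_STAR)
        total += err / n_steps
    return total / n_ics
```

An evolutionary algorithm would then search over the control parameters to minimize this cost; for instance, cost(2.0) comes out much smaller than the uncontrolled cost(0.0), since a gain of F = 2 makes the fixed point locally superstable.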
Analytical evaluation of ILM sensors, volume 1
NASA Technical Reports Server (NTRS)
Kirk, R. J.
1975-01-01
The functional requirements and operating environment constraints are defined for an independent landing monitor (ILM) which provides the flight crew with an independent assessment of the operation of the primary automatic landing system. The capabilities of radars, TV, forward-looking infrared radiometers, multilateration, microwave radiometers, interferometers, and nuclear sensing concepts to meet the ILM requirements are analyzed. The most critical need for the ILM appears in the landing sequence from 1000 to 2000 meters from threshold through rollout. Of the sensing concepts analyzed, the following show potential of becoming feasible ILMs: redundant microwave landing systems, precision approach radar, airborne triangulation radar, multilateration with radar altimetry, and nuclear sensing.
NASA Astrophysics Data System (ADS)
George, J.; Irkens, M.; Neumann, S.; Scherer, U. W.; Srivastava, A.; Sinha, D.; Fink, D.
2006-03-01
It has long been common practice to follow the ion track-etching process in thin foils via conductometry, i.e., by measuring the electrical current which passes through the etched track once the track breakthrough condition has been achieved. The major disadvantage of this approach, namely the absence of any major detectable signal before breakthrough, can be avoided by examining the track-etching process capacitively. This method allows one to determine precisely not only the breakthrough point before it is reached, but also the length of any non-transient track. Combining capacitive and conductive etching allows one to control the etching process perfectly. Examples and possible applications are given.
Toward precision medicine and health: Opportunities and challenges in allergic diseases.
Galli, Stephen Joseph
2016-05-01
Precision medicine (also called personalized, stratified, or P4 medicine) can be defined as the tailoring of preventive measures and medical treatments to the characteristics of each patient to obtain the best clinical outcome for each person while ideally also enhancing the cost-effectiveness of such interventions for patients and society. Clearly, the best clinical outcome for allergic diseases is not to get them in the first place. To emphasize the importance of disease prevention, a critical component of precision medicine can be referred to as precision health, which is defined herein as the use of all available information pertaining to specific subjects (including family history, individual genetic and other biometric information, and exposures to risk factors for developing or exacerbating disease), as well as features of their environments, to sustain and enhance health and prevent the development of disease. In this article I will provide a personal perspective on how the precision health-precision medicine approach can be applied to the related goals of preventing the development of allergic disorders and providing the most effective diagnosis, disease monitoring, and care for those with these prevalent diseases. I will also mention some of the existing and potential challenges to achieving these ambitious goals. Copyright © 2016 American Academy of Allergy, Asthma & Immunology. Published by Elsevier Inc. All rights reserved.
Adiabatic invariants in stellar dynamics. 1: Basic concepts
NASA Technical Reports Server (NTRS)
Weinberg, Martin D.
1994-01-01
The adiabatic criterion, widely used in astronomical dynamics, is based on the harmonic oscillator. It asserts that the change in action under a slowly varying perturbation is exponentially small. Recent mathematical results that precisely define the conditions for invariance show that this model does not apply in general. In particular, a slowly varying perturbation may cause significant evolution of stellar dynamical systems even if its time scale is longer than any internal orbital time scale. This additional 'heating' may have serious implications for the evolution of star clusters and dwarf galaxies, which are subject to long-term environmental forces. The mathematical developments leading to these results are reviewed, and the conditions for applicability to, and further implications for, stellar systems are discussed. Companion papers present a computational method for a general time-dependent disturbance and a detailed example.
Experiments to assess preheat in blast-wave-driven instability experiments
NASA Astrophysics Data System (ADS)
Krauland, Christine; Drake, Paul; Kuranz, Carolyn; Grosskopf, Michael; Boehly, Tom
2009-11-01
The use of multi-kilojoule, ns lasers to launch shock waves has become a standard method for initiating hydrodynamic experiments in Laboratory Astrophysics. However, the intense laser ablation that creates moving plasma also leads to the production of unwanted energetic x-rays and suprathermal electrons, both of which can be sources of material preheating. In principle, this preheat can alter the conditions of the experimental setup prior to the occurrence of the intended dynamics. At the University of Michigan, ongoing Rayleigh-Taylor instability experiments are defined by precise initial conditions, and potential deformation due to preheat could greatly affect their accuracy. An experiment devised and executed in an attempt to assess the preheat in this specific case will be presented, along with the quantitative analysis of the data obtained and comparison with 2D simulations.
Validating internal controls for quantitative plant gene expression studies
Brunner, Amy M; Yakovlev, Igor A; Strauss, Steven H
2004-01-01
Background Real-time reverse transcription PCR (RT-PCR) has greatly improved the ease and sensitivity of quantitative gene expression studies. However, accurate measurement of gene expression with this method relies on the choice of a valid reference for data normalization. Studies rarely verify that gene expression levels for reference genes are adequately consistent among the samples used, or compare alternative genes to assess which are most reliable for the experimental conditions analyzed. Results Using real-time RT-PCR to study the expression of 10 poplar (genus Populus) housekeeping genes, we demonstrate a simple method for determining the degree of stability of gene expression over a set of experimental conditions. Based on a traditional method for analyzing the stability of varieties in plant breeding, it defines measures of gene expression stability from analysis of variance (ANOVA) and linear regression. We found that the potential internal control genes differed widely in their expression stability over the different tissues, developmental stages and environmental conditions studied. Conclusion Our results indicate that quantitative comparisons of candidate reference genes are an important part of real-time RT-PCR studies that seek to precisely evaluate variation in gene expression. The method we demonstrated facilitates statistical and graphical evaluation of gene expression stability. Selection of the best reference gene for a given set of experimental conditions should enable detection of biologically significant changes in gene expression that are too small to be revealed by less precise methods, or when highly variable reference genes are unknowingly used in real-time RT-PCR experiments. PMID:15317655
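The stability screening described above can be sketched in a few lines. This is a deliberately simplified stand-in for the paper's ANOVA- and regression-based measures: here a gene's stability is scored only by the variance of its log-expression values across samples, and the gene names and values are invented for illustration:

```python
from statistics import pvariance

def stability_rank(expression):
    """Rank candidate reference genes by expression stability.

    expression: dict mapping gene name -> list of log-expression values
    measured across tissues/conditions. Lower variance = more stable.
    """
    return sorted(expression, key=lambda g: pvariance(expression[g]))

# Hypothetical log-expression values for three candidate reference genes
# across five tissue/condition samples.
candidates = {
    "UBQ": [5.1, 5.0, 5.2, 5.1, 5.0],   # nearly constant
    "ACT": [5.5, 6.0, 5.2, 6.3, 5.8],   # moderately variable
    "TUB": [4.0, 6.5, 5.0, 7.2, 3.8],   # highly variable
}
ranked = stability_rank(candidates)
print(ranked[0])  # most stable candidate: UBQ
```

The actual method additionally separates main effects from gene-by-condition interaction, which a plain variance cannot do.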
Comparison of working efficiency of terrestrial laser scanner in day and night conditions
NASA Astrophysics Data System (ADS)
Arslan, A. E.; Kalkan, K.
2013-10-01
Terrestrial laser scanning is a popular and widely used technique for scanning existing objects, documenting historical sites and items, and remodeling them if and when needed. The ability of laser scanners to collect thousands of point measurements per second makes them an invaluable tool in many areas, from engineering to historical reconstruction. There are many scanners on the market with different technical specifications; among the most important are range and sensitivity to illumination. In this study, we determine the optimal working times of a laser scanner and test the scanner's consistency with its specification sheet. To this end, a series of GNSS measurements, tied to the national reference network, was carried out at Istanbul Technical University to determine precise positions of the target points and the scanner, which makes it possible to define a precise distance between the scanner and the targets. These ground surveys were used for calibration and registration purposes. Two scan campaigns were conducted, at 12 a.m. and 11 p.m., to compare the working efficiency of the laser scanner under different illumination conditions, and the targets were measured with a handheld spectroradiometer to determine their reflective characteristics. The results obtained are compared and their accuracies analyzed.
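Once scanner and target coordinates are expressed in the same reference frame, the scanner-to-target reference distance is a plain 3-D Euclidean distance. A minimal sketch with invented local grid coordinates (the actual surveyed values are not given in the abstract):

```python
import math

def distance(p, q):
    """3-D Euclidean distance between two surveyed points (metres)."""
    return math.dist(p, q)

# Hypothetical local grid coordinates (E, N, h) in metres, from a
# GNSS survey tied to the national reference network.
scanner = (1000.000, 2000.000, 50.000)
target = (1030.000, 2040.000, 50.000)
print(round(distance(scanner, target), 3))  # 50.0
```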
NASA Astrophysics Data System (ADS)
Jansons, Adam Wayne
Colloidal nanocrystals offer new and improved performance in applications as well as less environmental impact when compared to traditional device fabrication methods. The important properties that enable improved applications are a direct result of nanocrystal structure. While there have been many great advances in the production of colloidal nanocrystals over the past three decades, precise, atomic-level control of the size, composition, and structure of the inorganic core remains challenging. Rather than dictate these material aspects through traditional synthetic routes, this dissertation details the development and exploitation of a colloidal nanocrystal synthetic method inspired by polymerization reactions. Living polymerization reactions offer precise control of polymer size and structure and have tremendously advanced polymer science, allowing the intuitive production of polymers and block co-polymers of well-defined molecular weights. Similarly, living nanocrystal synthetic methods allow an enhanced level of structural control, granting the synthesis of binary, doped, and core/shell nanocrystals of well-defined size, composition, and structure. This improved control in turn grants enhanced nanocrystal property performance and deepens our understanding of structure/property relationships. This dissertation defines living nanocrystal growth and demonstrates the potential of the living methods in the colloidal production of oxide nanocrystals. After a brief introduction, living growth is defined and discussed in the context of synthetic prerequisites, attributes, and outcomes. Living growth is also compared to more traditional colloidal nanocrystal synthetic methods. The following chapters then demonstrate the precise control living approaches offer in three separate studies; the first highlights sub-nanometer control of nanocrystal size from 2-22+ nm in diameter. 
Next, improved control of nanocrystal composition is illustrated by incorporating several transition metal dopants into an oxide nanocrystal matrix at near the thermodynamically allowed compositions. Additionally, precise radial dopant placement is demonstrated, which has striking implications for material properties. The radial position of tin in tin-doped indium oxide nanocrystals and the resulting differences in the localized surface plasmon resonance are discussed. Finally, future opportunities are reviewed. This dissertation includes previously published co-authored material.
On the radius of habitable planets
NASA Astrophysics Data System (ADS)
Alibert, Y.
2014-01-01
Context. The conditions that a planet must fulfill to be habitable are not precisely known. However, it is comparatively easier to define conditions under which a planet is very likely not habitable. Finding such conditions is important as it can help select, in an ensemble of potentially observable planets, which ones should be observed in greater detail for characterization studies. Aims: Assuming, as on the Earth, that the presence of a C-cycle is a necessary condition for long-term habitability, we derive, as a function of the planetary mass, a radius above which a planet is likely not habitable. We compute the maximum radius a planet can have to fulfill two constraints: surface conditions compatible with the existence of liquid water, and no ice layer at the bottom of a putative global ocean. We demonstrate that, above a given radius, these two constraints cannot be met. Methods: We compute internal structure models of planets, using a five-layer model (core, inner mantle, outer mantle, ocean, and atmosphere), for different masses and compositions of the planets (in particular, the Fe/Si ratio of the planet). Results: Our results show that for planets in the super-Earth mass range (1-12 M⊕), the maximum radius that a planet with a composition similar to that of the Earth can have varies between 1.7 and 2.2 R⊕. This radius is reduced when considering planets with higher Fe/Si ratios and when taking radiation into account when computing the gas envelope structure. Conclusions: These results can be used to infer, from radius and mass determinations using high-precision transit observations like those that will soon be performed by the CHaracterizing ExOPlanet Satellite (CHEOPS), which planets are very likely not habitable, and therefore which ones should be considered as best targets for further habitability studies.
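The quoted range can be turned into a rough screening rule. The sketch below linearly interpolates the stated endpoints (1.7 R⊕ at 1 M⊕ to 2.2 R⊕ at 12 M⊕) for an Earth-like composition; this linear form is an assumption made purely for illustration, not the paper's actual model curves:

```python
def max_habitable_radius(mass):
    """Crude linear interpolation of the quoted maximum radius
    (1.7 R_earth at 1 M_earth up to 2.2 R_earth at 12 M_earth)
    for an Earth-like composition; the paper's curves differ in detail."""
    if not 1.0 <= mass <= 12.0:
        raise ValueError("super-Earth range only (1-12 M_earth)")
    return 1.7 + (2.2 - 1.7) * (mass - 1.0) / (12.0 - 1.0)

def likely_not_habitable(mass, radius):
    """True if the planet exceeds the maximum radius compatible with
    liquid water and no ice layer at the base of a global ocean."""
    return radius > max_habitable_radius(mass)

print(likely_not_habitable(5.0, 2.5))  # True: too large for its mass
print(likely_not_habitable(5.0, 1.5))  # False: may warrant follow-up
```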
DOE Office of Scientific and Technical Information (OSTI.GOV)
Prabhakaran, Venkateshkumar; Johnson, Grant E.; Wang, Bingbing
2016-11-07
Molecular-level understanding of electrochemical processes occurring at electrode-electrolyte interfaces (EEI) is key to the rational development of high-performance and sustainable electrochemical technologies. This article reports the development and first application of solid-state in situ electrochemical probes to study redox and catalytic processes occurring at well-defined EEI generated using soft-landing of mass- and charge-selected cluster ions (SL). In situ electrochemical probes with excellent mass transfer properties are fabricated using carefully-designed nanoporous ionic liquid membranes. SL enables deposition of pure active species that are not obtainable with other techniques onto electrode surfaces with precise control over charge state, composition, and kinetic energy. SL is, therefore, a unique tool for studying fundamental processes occurring at EEI. For the first time using an aprotic electrochemical probe, the effect of charge state (PMo12O403-/2-) and the contribution of building blocks of Keggin polyoxometalate (POM) clusters to redox processes are characterized by populating EEI with novel POM anions generated by electrospray ionization and gas phase dissociation. Additionally, a proton conducting electrochemical probe has been developed to characterize the reactive electrochemistry (oxygen reduction activity) of bare Pt clusters (Pt40 ~1 nm diameter), thus demonstrating the capability of the probe for studying reactions in controlled gaseous environments. The newly developed in situ electrochemical probes combined with ion SL provide a versatile method to characterize the EEI in solid-state redox systems and reactive electrochemistry at precisely-defined conditions. This capability will advance molecular-level understanding of processes occurring at EEI that are critical to many energy-related technologies.
Wagner, Wolfgang; Feldmann, Robert E; Seckinger, Anja; Maurer, Martin H; Wein, Frederik; Blake, Jonathon; Krause, Ulf; Kalenka, Armin; Bürgers, Heinrich F; Saffrich, Rainer; Wuchter, Patrick; Kuschinsky, Wolfgang; Ho, Anthony D
2006-04-01
Mesenchymal stem cells (MSC) raise high hopes in clinical applications. However, the lack of common standards and a precise definition of MSC preparations remains a major obstacle in research and application of MSC. Whereas surface antigen markers have failed to precisely define this population, a combination of proteomic data and microarray data provides a new dimension for the definition of MSC preparations. In our continuing effort to characterize MSC, we have analyzed the differential transcriptome and proteome expression profiles of MSC preparations isolated from human bone marrow under two different expansion media (BM-MSC-M1 and BM-MSC-M2). In proteomics, 136 protein spots were unambiguously identified by MALDI-TOF-MS and corresponding cDNA spots were selected on our "Human Transcriptome cDNA Microarray." Combination of datasets revealed a correlation in differential gene expression and protein expression of BM-MSC-M1 vs BM-MSC-M2. Genes involved in metabolism were more highly expressed in BM-MSC-M1, whereas genes involved in development, morphogenesis, extracellular matrix, and differentiation were more highly expressed in BM-MSC-M2. Interchanging culture conditions for 8 days revealed that differential expression was retained in several genes whereas it was altered in others. Our results have provided evidence that homogeneous BM-MSC preparations can reproducibly be isolated under standardized conditions, whereas culture conditions exert a prominent impact on transcriptome, proteome, and cellular organization of BM-MSC.
NASA Astrophysics Data System (ADS)
Field, M. Paul; Romaniello, Stephen; Gordon, Gwyneth W.; Anbar, Ariel D.; Herrmann, Achim; Martinez-Boti, Miguel A.; Anagnostou, Eleni; Foster, Gavin L.
2014-05-01
MC-ICP-MS has dramatically improved the analytical throughput for high-precision radiogenic and non-traditional isotope ratio measurements compared to TIMS. The generation of large data sets, however, remains hampered by the tedious manual drip chromatography required for sample purification. A new, automated chromatography system reduces this laboratory bottleneck and expands the utility of high-precision isotope analyses in applications where large data sets are required: geochemistry, forensic anthropology, nuclear forensics, medical research and food authentication. We have developed protocols to automate ion exchange purification for several isotopic systems (B, Ca, Fe, Cu, Zn, Sr, Cd, Pb and U) using the new prepFAST-MC™ (ESI, Omaha, Nebraska). The system is not only inert (all-fluoropolymer flow paths), but is also very flexible and can easily accommodate different resins, samples, and reagent types. When programmed, precise and accurate user-defined volumes and flow rates are implemented to automatically load samples, wash the column, condition the column and elute fractions. Unattended, the automated, low-pressure ion exchange chromatography system can process up to 60 samples overnight. Excellent reproducibility, reliability and recovery, with low blank and carry-over, have been demonstrated for samples in a variety of different matrices, giving accurate and precise isotopic ratios within analytical error for several isotopic systems (B, Ca, Fe, Cu, Zn, Sr, Cd, Pb and U). This illustrates the potential of the new prepFAST-MC™ as a powerful tool in radiogenic and non-traditional isotope research.
2002-12-01
34th Annual Precise Time and Time Interval (PTTI) Meeting, p. 243. IEEE-1588™ Standard for a Precision Clock Synchronization Protocol for ... In cyclic systems, timing is periodic and is usually defined by the characteristics of a cyclic network or bus; ... incommensurate, timing schedules for each device are easily implemented. In addition, synchronization accuracy depends on the accuracy of the common ...
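Although only fragments of the record above survive, the timestamped exchange that IEEE-1588 standardizes is well defined: the master's Sync message departs at t1 and arrives at the slave at t2, and the slave's Delay_Req departs at t3 and arrives back at the master at t4. Under the protocol's symmetric-path assumption, the slave's clock offset and the mean path delay follow directly; the timestamp values below are invented for illustration:

```python
def ptp_offset_and_delay(t1, t2, t3, t4):
    """Offset and mean path delay from one IEEE-1588 delay
    request-response exchange (symmetric-path assumption).
    t1: master sends Sync, t2: slave receives Sync,
    t3: slave sends Delay_Req, t4: master receives Delay_Req."""
    mean_path_delay = ((t2 - t1) + (t4 - t3)) / 2
    offset = (t2 - t1) - mean_path_delay
    return offset, mean_path_delay

# Slave clock 100 us ahead of the master, 10 us one-way delay (all in us):
# Sync sent at master time 0 arrives at slave time 110; Delay_Req sent at
# slave time 200 arrives at master time 110.
offset, delay = ptp_offset_and_delay(0, 110, 200, 110)
print(offset, delay)  # 100.0 10.0
```

The slave then corrects its clock by subtracting the computed offset.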
Precision Cleaning and Protection of Coated Optical Components for NIF Small Optics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Phelps, Jim
The purpose of this procedure shall be to define the precision cleaning of finished, coated, small optical components for NIF at Lawrence Livermore National Laboratories. The term “small optical components” includes coated optics that are set into simple mounts, as well as coated, un-mounted optics.
Clayton, Hilary M.
2015-01-01
The study of animal movement commonly requires the segmentation of continuous data streams into individual strides. The use of forceplates and foot-mounted accelerometers readily allows the detection of the foot-on and foot-off events that define a stride. However, when relying on optical methods such as motion capture, there is a lack of validated, robust, universally applicable stride event detection methods. To date, no method has been validated for movement on a circle, and algorithms are commonly specific to front/hind limbs or gait. In this study, we aimed to develop and validate kinematic stride segmentation methods applicable to movement on a straight line and a circle at walk and trot, which rely exclusively on a single, dorsal hoof marker. The advantage of such marker placement is its robustness to marker loss and occlusion. Eight horses walked and trotted on a straight line and in a circle over an array of multiple forceplates. Kinetic events were detected based on the vertical force profile and used as the reference values. Kinematic events were detected based on displacement, velocity or acceleration signals of the dorsal hoof marker, depending on the algorithm, using (i) defined thresholds associated with derived movement signals or (ii) specific events in the derived movement signals. Method comparison was performed by calculating limits of agreement, accuracy, between-horse precision and within-horse precision based on differences between the kinetic and kinematic events. In addition, we examined the effect of force thresholds ranging from 50 to 150 N on the timings of kinetic events. The two approaches resulted in very good and comparable performance: of the 3,074 processed footfall events, 95% of individual foot-on and foot-off events differed by no more than 26 ms from the kinetic event, with average accuracy between −11 and 10 ms and average within- and between-horse precision ≤8 ms.
While the event-based method may be less likely to suffer from scaling effects, on soft ground the threshold-based method may prove more valuable. While we found that the use of velocity thresholds for foot-on detection results in biased event estimates for the foot on the inside of the circle at trot, adjusting thresholds for this condition negated the effect. For the final four algorithms, we found no noteworthy bias between conditions or between front- and hind-foot timings. Different force thresholds in the range of 50 to 150 N had the greatest systematic effect on foot-off estimates in the hind limbs (up to 16 ms on average per condition), greater than the effect on foot-on estimates or foot-off estimates in the forelimbs (up to ±7 ms on average per condition). PMID:26157641
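A threshold-based detector of the kind compared above can be sketched as follows. The threshold value, minimum inter-event gap, and speed trace are invented for illustration, and the study's actual algorithms include further conditions:

```python
def detect_events(speed, threshold, min_gap):
    """Indices where the hoof-marker speed falls below `threshold`
    (candidate foot-on events), skipping detections closer than
    `min_gap` samples to the previous one."""
    events, last = [], -min_gap
    for i in range(1, len(speed)):
        if speed[i - 1] >= threshold > speed[i] and i - last >= min_gap:
            events.append(i)
            last = i
    return events

# Synthetic hoof speed trace (m/s): swing (fast) / stance (near zero).
trace = [3.0, 2.5, 0.1, 0.05, 0.1, 2.8, 3.1, 0.2, 0.1, 0.05, 2.9]
print(detect_events(trace, 0.5, 3))  # [2, 7]
```

An event-based variant would instead look for characteristic features of the signal itself (e.g., a local extremum), making it less sensitive to how the signal is scaled.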
Stone, William J.
1986-01-01
A zero-home locator includes a fixed phototransistor switch and a moveable actuator including two symmetrical, opposed wedges, each wedge defining a point at which switching occurs. The zero-home location is the average of the positions of the points defined by the wedges.
Stone, W.J.
1983-10-31
A zero-home locator includes a fixed phototransistor switch and a moveable actuator including two symmetrical, opposed wedges, each wedge defining a point at which switching occurs. The zero-home location is the average of the positions of the points defined by the wedges.
Precision segmented reflector, figure verification sensor
NASA Technical Reports Server (NTRS)
Manhart, Paul K.; Macenka, Steve A.
1989-01-01
The Precision Segmented Reflector (PSR) program currently under way at the Jet Propulsion Laboratory is a test bed and technology demonstration program designed to develop and study the structural and material technologies required for lightweight, precision segmented reflectors. A Figure Verification Sensor (FVS), designed to monitor the active control system of the segments, is described; a best-fit surface is defined; and the image or wavefront quality of the assembled array of reflecting panels is assessed.
3D reconstruction optimization using imagery captured by unmanned aerial vehicles
NASA Astrophysics Data System (ADS)
Bassie, Abby L.; Meacham, Sean; Young, David; Turnage, Gray; Moorhead, Robert J.
2017-05-01
Because unmanned air vehicles (UAVs) are emerging as an indispensable image acquisition platform in precision agriculture, it is vitally important that researchers understand how to optimize UAV camera payloads for analysis of surveyed areas. In this study, imagery captured by a Nikon RGB camera attached to a Precision Hawk Lancaster was used to survey an agricultural field from six different altitudes ranging from 45.72 m (150 ft.) to 121.92 m (400 ft.). After collecting imagery, two different software packages (MeshLab and AgiSoft) were used to measure predetermined reference objects within six three-dimensional (3-D) point clouds (one per altitude scenario). In-silico measurements were then compared to actual reference object measurements, as recorded with a tape measure. Deviations of in-silico measurements from actual measurements were recorded as Δx, Δy, and Δz. The average measurement deviation in each coordinate direction was then calculated for each of the six flight scenarios. Results from MeshLab vs. AgiSoft offered insight into the effectiveness of GPS-defined point cloud scaling in comparison to user-defined point cloud scaling. In three of the six flight scenarios flown, MeshLab's 3D imaging software (user-defined scale) was able to measure object dimensions from 50.8 to 76.2 cm (20-30 inches) with greater than 93% accuracy. The largest average deviation in any flight scenario from actual measurements was 14.77 cm (5.82 in.). Analysis of the point clouds in AgiSoft (GPS-defined scale) yielded even smaller Δx, Δy, and Δz than the MeshLab measurements in over 75% of the flight scenarios. The precision of these results is satisfactory for a wide variety of precision agriculture applications focused on differentiating and identifying objects using remote imagery.
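The deviation statistics reported above reduce to averaging per-axis differences between point-cloud and tape-measure values. A minimal sketch with invented measurements (the study's actual values are not reproduced in the abstract):

```python
def mean_deviations(in_silico, actual):
    """Average per-axis absolute deviation (cm) between point-cloud
    measurements and tape-measure ground truth."""
    n = len(in_silico)
    return tuple(
        sum(abs(m[axis] - a[axis]) for m, a in zip(in_silico, actual)) / n
        for axis in range(3)
    )

# Hypothetical (x, y, z) object dimensions in cm for three references.
measured = [(51.0, 20.3, 30.1), (76.0, 25.5, 40.2), (60.5, 22.0, 35.0)]
truth = [(50.8, 20.0, 30.0), (76.2, 25.0, 40.0), (61.0, 22.0, 35.5)]
dx, dy, dz = mean_deviations(measured, truth)
print(round(dx, 2), round(dy, 2), round(dz, 2))  # 0.3 0.27 0.27
```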
Shape Evolution of Detached Bridgman Crystals Grown in Microgravity
NASA Technical Reports Server (NTRS)
Volz, M. P.; Mazuruk, K.
2015-01-01
In detached (or dewetted) Bridgman crystal growth, a gap exists between the growing crystal and the crucible wall. In microgravity, the parameters that influence the existence of a stable gap are the growth angle of the solidifying crystal, the contact angle between the melt and the crucible wall, and the pressure difference across the meniscus. During actual crystal growth, the initial crystal radius will not have the precise value required for stable detached growth. Beginning with a crystal diameter that differs from the stable value, numerical calculations are used to analyze the transient crystal growth process. Depending on the initial conditions and growth parameters, the crystal shape will either evolve towards attachment at the crucible wall, towards a stable gap width, or inwards towards eventual collapse of the meniscus. Dynamic growth stability is observed only when the sum of the growth and contact angles exceeds 180 degrees.
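The stability criterion stated in the last sentence can be written directly as a predicate; the angle values below are invented for illustration:

```python
def dynamically_stable_detached(growth_angle_deg, contact_angle_deg):
    """Criterion quoted above: detached Bridgman growth is dynamically
    stable only when growth angle + contact angle exceeds 180 degrees."""
    return growth_angle_deg + contact_angle_deg > 180.0

print(dynamically_stable_detached(30.0, 160.0))  # True
print(dynamically_stable_detached(12.0, 120.0))  # False
```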
Design and Evaluation of a Proxy-Based Monitoring System for OpenFlow Networks.
Taniguchi, Yoshiaki; Tsutsumi, Hiroaki; Iguchi, Nobukazu; Watanabe, Kenzi
2016-01-01
Software-Defined Networking (SDN) has attracted attention along with the popularization of cloud environment and server virtualization. In SDN, the control plane and the data plane are decoupled so that the logical topology and routing control can be configured dynamically depending on network conditions. To obtain network conditions precisely, a network monitoring mechanism is necessary. In this paper, we focus on OpenFlow which is a core technology to realize SDN. We propose, design, implement, and evaluate a network monitoring system for OpenFlow networks. Our proposed system acts as a proxy between an OpenFlow controller and OpenFlow switches. Through experimental evaluations, we confirm that our proposed system can capture packets and monitor traffic information depending on administrator's configuration. In addition, we show that our proposed system does not cause significant degradation of overall network performance.
Insomnia and the performance of US workers: results from the America insomnia survey.
Kessler, Ronald C; Berglund, Patricia A; Coulouvrat, Catherine; Hajak, Goeran; Roth, Thomas; Shahly, Victoria; Shillington, Alicia C; Stephenson, Judith J; Walsh, James K
2011-09-01
To estimate the prevalence and associations of broadly defined (i.e., meeting full ICD-10, DSM-IV, or RDC/ICSD-2 inclusion criteria) insomnia with work performance, net of comorbid conditions, in the America Insomnia Survey (AIS). Cross-sectional telephone survey of a national sample of 7,428 employed health plan subscribers (ages 18+); no interventions. Broadly defined insomnia was assessed with the Brief Insomnia Questionnaire (BIQ). Work absenteeism and presenteeism (low on-the-job work performance, defined in the metric of lost workday equivalents) were assessed with the WHO Health and Work Performance Questionnaire (HPQ). Regression analysis examined associations between insomnia and HPQ scores controlling for 26 comorbid conditions based on self-report and medical/pharmacy claims records. The estimated prevalence of insomnia was 23.2%. Insomnia was significantly associated with lost work performance due to presenteeism (χ² (1) = 39.5, P < 0.001) but not absenteeism (χ² (1) = 3.2, P = 0.07), with an annualized individual-level association of insomnia with presenteeism equivalent to 11.3 days of lost work performance. This estimate decreased to 7.8 days when controls were introduced for comorbid conditions. The individual-level human capital value of this net estimate was $2,280. If we provisionally assume these estimates generalize to the total US workforce, they are equivalent to annualized population-level estimates of 252.7 million days and $63.2 billion. Insomnia is associated with substantial workplace costs. Although experimental studies suggest some of these costs could be recovered with insomnia disease management programs, effectiveness trials are needed to obtain precise estimates of the return on investment of such interventions from the employer perspective.
Dickie, Ben R; Banerji, Anita; Kershaw, Lucy E; McPartlin, Andrew; Choudhury, Ananya; West, Catharine M; Rose, Chris J
2016-10-01
To improve the accuracy and precision of tracer kinetic model parameter estimates for use in dynamic contrast enhanced (DCE) MRI studies of solid tumors. Quantitative DCE-MRI requires an estimate of precontrast T1, which is obtained prior to fitting a tracer kinetic model. As T1 mapping and tracer kinetic signal models are both a function of precontrast T1, it was hypothesized that its joint estimation would improve the accuracy and precision of both precontrast T1 and tracer kinetic model parameters. Accuracy and/or precision of two-compartment exchange model (2CXM) parameters were evaluated for standard and joint fitting methods in well-controlled synthetic data and for 36 bladder cancer patients. Methods were compared under a number of experimental conditions. In synthetic data, joint estimation led to statistically significant improvements in the accuracy of estimated parameters in 30 of 42 conditions (improvements between 1.8% and 49%). Reduced accuracy was observed in 7 of the remaining 12 conditions. Significant improvements in precision were observed in 35 of 42 conditions (between 4.7% and 50%). In clinical data, significant improvements in precision were observed in 18 of 21 conditions (between 4.6% and 38%). Accuracy and precision of DCE-MRI parameter estimates are improved when signal models are fit jointly rather than sequentially. Magn Reson Med 76:1270-1281, 2016. © 2015 Wiley Periodicals, Inc.
Outline of a new approach to the analysis of complex systems and decision processes.
NASA Technical Reports Server (NTRS)
Zadeh, L. A.
1973-01-01
Development of a conceptual framework for dealing with systems which are too complex or too ill-defined to admit of precise quantitative analysis. The approach outlined is based on the premise that the key elements in human thinking are not numbers, but labels of fuzzy sets - i.e., classes of objects in which the transition from membership to nonmembership is gradual rather than abrupt. The approach in question has three main distinguishing features - namely, the use of so-called 'linguistic' variables in place of or in addition to numerical variables, the characterization of simple relations between variables by conditional fuzzy statements, and the characterization of complex relations by fuzzy algorithms.
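As a concrete illustration of these ideas (not code from the paper, which predates such tooling), a linguistic variable can be modeled as a set of fuzzy terms with graded membership functions; a conditional fuzzy statement then fires to the degree its antecedent is satisfied. The variable name and term parameters below are invented:

```python
def triangular(a, b, c):
    """Membership function of a fuzzy set with support (a, c), peak at b.
    Membership rises linearly from a to b and falls from b to c, so the
    transition from membership to nonmembership is gradual, not abrupt."""
    def mu(x):
        if x <= a or x >= c:
            return 0.0
        return (x - a) / (b - a) if x <= b else (c - x) / (c - b)
    return mu

# Hypothetical linguistic variable "temperature" with two fuzzy terms.
cool = triangular(5.0, 15.0, 25.0)
warm = triangular(15.0, 25.0, 35.0)

# The conditional fuzzy statement "IF temperature is warm THEN fan is
# fast" fires to degree warm(x); at x = 20 both terms apply equally.
x = 20.0
print(cool(x), warm(x))  # 0.5 0.5
```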
NASA Technical Reports Server (NTRS)
Frew, A. M.; Eisenhut, D. F.; Farrenkopf, R. L.; Gates, R. F.; Iwens, R. P.; Kirby, D. K.; Mann, R. J.; Spencer, D. J.; Tsou, H. S.; Zaremba, J. G.
1972-01-01
The precision pointing control system (PPCS) is an integrated system for precision attitude determination and orientation of gimbaled experiment platforms. The PPCS concept configures the system to perform orientation of up to six independent gimbaled experiment platforms to design goal accuracy of 0.001 degrees, and to operate in conjunction with a three-axis stabilized earth-oriented spacecraft in orbits ranging from low altitude (200-2500 n.m., sun synchronous) to 24 hour geosynchronous, with a design goal life of 3 to 5 years. The system comprises two complementary functions: (1) attitude determination where the attitude of a defined set of body-fixed reference axes is determined relative to a known set of reference axes fixed in inertial space; and (2) pointing control where gimbal orientation is controlled, open-loop (without use of payload error/feedback) with respect to a defined set of body-fixed reference axes to produce pointing to a desired target.
Application of Millisecond Pulsar Timing to the Long-Term Stability of Clock Ensembles
NASA Technical Reports Server (NTRS)
Foster, Roger S.; Matsakis, Demetrios N.
1996-01-01
We review the application of millisecond pulsars to define a precise long-term standard and positional reference system in a nearly inertial reference frame. We quantify the current timing precision of the best millisecond pulsars and define the precise time and time interval (PTTI) accuracy and stability required to enable time transfer via pulsars. Pulsars may prove useful as independent standards to examine decade-long timing stability and provide an independent natural system within which to calibrate any new, perhaps vastly improved, atomic time scale. Since pulsar stability appears to be related to the lifetime of the pulsar, the new millisecond pulsar J1713+0747 is projected to have a 100-day accuracy equivalent to a single HP5071 cesium standard. Over the last five years, dozens of new millisecond pulsars have been discovered. A few of the new millisecond pulsars may have even better timing properties.
Using Ultrasonic Speckle Velocimetry to Detect Fluid Instabilities in a Surfactant Solution
NASA Astrophysics Data System (ADS)
Bice, Jason E.
Rheometry is a leading technology used to define material properties of multi-phase viscoelastic fluid-like materials, such as the shear modulus and viscosity. However, traditional rheometry relies on the mechanical response of a rotating or oscillating rotor of various geometries, an approach that does not allow any spatial or temporal quantification of the material characteristics. Further, the setup operates under the assumption of a uniform and homogeneous flow. Thus, only qualitative deductions can be made when a complex fluid displays inhomogeneous behavior, such as wall slip or shear banding. Because of this limitation, non-intrusive imaging is required to define and quantify the behavior that occurs in a complex fluid under shear. This thesis outlines the design, fabrication, and experimental examples of an adapted ultrasonic speckle velocimetry device, which enables spatial and temporal resolution of inhomogeneous fluid behavior using ultrasound acoustics. For the experimental example, a commercial surfactant mixture (hair shampoo) was tested to show the utility and precision of ultrasonic speckle velocimetry.
Farajollahi, Farid; Seidenstücker, Axel; Altintoprak, Klara; Walther, Paul; Ziemann, Paul; Plettl, Alfred; Marti, Othmar; Wege, Christina; Gliemann, Hartmut
2018-04-13
Nanoporous membranes are of increasing interest for many applications, such as molecular filters, biosensors, nanofluidic logic and energy conversion devices. To meet high-quality standards, e.g., in molecular separation processes, membranes with well-defined pores in terms of pore diameter and chemical properties are required. However, the preparation of membranes with narrow pore diameter distributions is still challenging. In the work presented here, we demonstrate a strategy, a "pore-in-pore" approach, where the conical pores of a solid state membrane produced by a multi-step top-down lithography procedure are used as a template to insert precisely-formed biomolecular nanodiscs with exactly defined inner and outer diameters. These nanodiscs, which are the building blocks of tobacco mosaic virus-derived particles, consist of coat proteins, which self-assemble under defined experimental conditions with a stabilizing short RNA. We demonstrate that the insertion of the nanodiscs can be driven either by diffusion due to a concentration gradient or by applying an electric field along the cross-section of the solid state membrane. It is found that the electrophoresis-driven insertion is significantly more effective than the insertion via the concentration gradient.
Atila, Alptug; Yilmaz, Bilal
2015-01-01
In this study, simple, fast and reliable cyclic voltammetry (CV), linear sweep voltammetry (LSV), square wave voltammetry (SWV) and differential pulse voltammetry (DPV) methods were developed and validated for the determination of bosentan in pharmaceutical preparations. The proposed methods were based on the electrochemical oxidation of bosentan at a platinum electrode in acetonitrile solution containing 0.1 M TBAClO4. A well-defined oxidation peak was observed at 1.21 V. The calibration curves were linear for bosentan over the concentration range of 5-40 µg/mL for the LSV method and 5-35 µg/mL for the SWV and DPV methods. Intra- and inter-day precision values for bosentan were less than 4.92%, and accuracy (relative error) was better than 6.29%. The mean recovery of bosentan was 100.7% for pharmaceutical preparations. No interference was found from two tablet excipients at the selected assay conditions. The methods developed in this study are accurate, precise and can be easily applied to Tracleer and Diamond tablets as pharmaceutical preparations. PMID:25901151
Nielsen, Alec A K; Segall-Shapiro, Thomas H; Voigt, Christopher A
2013-12-01
Cells use regulatory networks to perform computational operations to respond to their environment. Reliably manipulating such networks would be valuable for many applications in biotechnology; for example, in having genes turn on only under a defined set of conditions or implementing dynamic or temporal control of expression. Still, building such synthetic regulatory circuits remains one of the most difficult challenges in genetic engineering and as a result they have not found widespread application. Here, we review recent advances that address the key challenges in the forward design of genetic circuits. First, we look at new design concepts, including the construction of layered digital and analog circuits, and new approaches to control circuit response functions. Second, we review recent work to apply part mining and computational design to expand the number of regulators that can be used together within one cell. Finally, we describe new approaches to obtain precise gene expression and to reduce context dependence that will accelerate circuit design by more reliably balancing regulators while reducing toxicity. Copyright © 2013. Published by Elsevier Ltd.
Decision making under uncertainty: a quasimetric approach.
N'Guyen, Steve; Moulin-Frier, Clément; Droulez, Jacques
2013-01-01
We propose a new approach for solving a class of discrete decision making problems under uncertainty with positive cost. This issue concerns multiple and diverse fields such as engineering, economics, artificial intelligence, cognitive science and many others. Basically, an agent has to choose a single action or a series of actions from a set of options, without knowing their consequences for certain. Schematically, two main approaches have been followed: either the agent learns which option to choose in a given situation by trial and error, or the agent already has some knowledge of the possible consequences of its decisions, this knowledge generally being expressed as a conditional probability distribution. In the latter case, several optimal or suboptimal methods have been proposed to exploit this uncertain knowledge in various contexts. In this work, we follow a different approach, based on the geometric intuition of distance. More precisely, we define a goal-independent quasimetric structure on the state space, taking into account both the cost function and the transition probabilities. We then compare precision and computation time with classical approaches.
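The goal-independent quasimetric described above can be sketched as a dynamic program: for every ordered pair of states, take the minimal expected cost of travelling from one to the other, combining the cost function with the transition probabilities. The Python below is a hedged illustration only (the state space, action set, and convergence scheme are assumptions, not the authors' implementation); note that the resulting distance is asymmetric, which is exactly what distinguishes a quasimetric from a metric.

```python
def quasimetric(states, actions, P, cost, iters=200):
    """Goal-independent quasimetric: d[x][y] = minimal expected cost to reach y from x.

    P[x][a] is a dict mapping successor states to transition probabilities;
    cost(x, a) is the (positive) cost of taking action a in state x.
    Values start at 0 and converge upward (valid for goals reachable from x).
    """
    d = {x: {y: 0.0 for y in states} for x in states}
    for _ in range(iters):
        for goal in states:
            for x in states:
                if x == goal:
                    continue  # d(goal, goal) stays 0 by definition
                d[x][goal] = min(
                    cost(x, a) + sum(p * d[s2][goal] for s2, p in P[x][a].items())
                    for a in actions
                )
    return d

# Toy example: from A, the single action reaches B half the time and stays at A
# otherwise; from B, it returns to A with certainty. All actions cost 1.
P = {"A": {"go": {"A": 0.5, "B": 0.5}}, "B": {"go": {"A": 1.0}}}
d = quasimetric(["A", "B"], ["go"], P, lambda s, a: 1.0)
# d["A"]["B"] converges to 2.0 (expected number of attempts), d["B"]["A"] to 1.0
```

The asymmetry d(A, B) ≠ d(B, A) induced by the transition probabilities is why a quasimetric, rather than a metric, is the natural structure here.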
Techniques and Methods for Testing the Postural Function in Healthy and Pathological Subjects
Paillard, Thierry; Noé, Frédéric
2015-01-01
The different techniques and methods employed as well as the different quantitative and qualitative variables measured in order to objectify postural control are often chosen without taking into account the population studied, the objective of the postural test, and the environmental conditions. For these reasons, the aim of this review was to present and justify the different testing techniques and methods with their different quantitative and qualitative variables to make it possible to precisely evaluate each sensory, central, and motor component of the postural function according to the experiment protocol under consideration. The main practical and technological methods and techniques used in evaluating postural control were explained and justified according to the experimental protocol defined. The main postural conditions (postural stance, visual condition, balance condition, and test duration) were also analyzed. Moreover, the mechanistic exploration of the postural function often requires implementing disturbing postural conditions by using motor disturbance (mechanical disturbance), sensory stimulation (sensory manipulation), and/or cognitive disturbance (cognitive task associated with maintaining postural balance) protocols. Each type of disturbance was tackled in order to facilitate understanding of subtle postural control mechanisms and the means to explore them. PMID:26640800
Diversity in times of adversity: probabilistic strategies in microbial survival games.
Wolf, Denise M; Vazirani, Vijay V; Arkin, Adam P
2005-05-21
Population diversification strategies are ubiquitous among microbes, encompassing random phase-variation (RPV) of pathogenic bacteria, viral latency as observed in some bacteriophage and HIV, and the non-genetic diversity of bacterial stress responses. Precise conditions under which these diversification strategies confer an advantage have not been well defined. We develop a model of population growth conditioned on dynamical environmental and cellular states. Transitions among cellular states, in turn, may be biased by possibly noisy readings of the environment from cellular sensors. For various types of environmental dynamics and cellular sensor capability, we apply game-theoretic analysis to derive the evolutionarily stable strategy (ESS) for an organism and determine when that strategy is diversification. We find that: (1) RPV, effecting a sort of Parrondo paradox wherein random alternations between losing strategies produce a winning strategy, is selected when transitions between different selective environments cannot be sensed, (2) optimal RPV cell switching rates are a function of environmental lifecycle asymmetries and environmental autocorrelation, (3) probabilistic diversification upon entering a new environment is selected when sensors can detect environmental transitions but have poor precision in identifying new environments, and (4) in the presence of excess additive noise, low-pass filtering is required for evolutionary stability. We show that even when RPV is not the ESS, it may minimize growth rate variance and the risk of extinction due to 'unlucky' environmental dynamics.
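The Parrondo-like advantage of random phase-variation noted in finding (1) can be illustrated with a minimal two-phenotype, two-environment simulation. This is a hedged toy model, not the authors' game-theoretic model: the growth factors, the strictly alternating environment, and the mean-field switching rule are illustrative assumptions. Each pure (non-switching) lineage shrinks in an alternating environment, yet random switching between the two losing phenotypes outgrows either pure strategy.

```python
def final_population(switch_rate, growth=(2.0, 0.05), periods=50):
    """Two phenotypes in an environment that alternates every step.

    Phenotype i thrives (factor 2.0) in environment i and suffers (factor 0.05)
    in the other. After each growth step, a fraction `switch_rate` of each
    phenotype switches to the other (deterministic mean-field update of RPV).
    """
    a, b = 1.0, 1.0  # initial populations of phenotype 1 and 2
    for step in range(2 * periods):
        g1, g2 = growth if step % 2 == 0 else growth[::-1]
        a, b = a * g1, b * g2                              # selection
        s = switch_rate
        a, b = (1 - s) * a + s * b, s * a + (1 - s) * b    # random phase variation
    return a + b

# Without switching, each lineage shrinks by 2.0 * 0.05 = 0.1 per environmental
# cycle; a modest switching rate yields a strictly larger final population.
```

Even though both phenotypes are "losing strategies" on their own here, alternating randomly between them produces the better outcome, which is the Parrondo paradox the abstract invokes.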
Precision Medicine: The New Frontier in Idiopathic Pulmonary Fibrosis.
Brownell, Robert; Kaminski, Naftali; Woodruff, Prescott G; Bradford, Williamson Z; Richeldi, Luca; Martinez, Fernando J; Collard, Harold R
2016-06-01
Precision medicine is defined by the National Institutes of Health's Precision Medicine Initiative Working Group as an approach to disease treatment that takes into account individual variability in genes, environment, and lifestyle. There has been increased interest in applying the concept of precision medicine to idiopathic pulmonary fibrosis, in particular to search for genetic and molecular biomarker-based profiles (so-called endotypes) that identify mechanistically distinct disease subgroups. The relevance of precision medicine to idiopathic pulmonary fibrosis is yet to be established, but we believe that it holds great promise to provide targeted and highly effective therapies to patients. In this manuscript, we describe the field's nascent efforts in genetic/molecular endotype identification and how environmental and behavioral subgroups may also be relevant to disease management.
A three dimensional scaffold with precise micro-architecture and surface micro-textures
Mata, Alvaro; Kim, Eun Jung; Boehm, Cynthia A.; Fleischman, Aaron J.; Muschler, George F.; Roy, Shuvo
2013-01-01
A three-dimensional (3D) structure comprising precisely defined microarchitecture and surface micro-textures, designed to present specific physical cues to cells and tissues, may provide an efficient scaffold in a variety of tissue engineering and regenerative medicine applications. We report a fabrication technique based on microfabrication and soft lithography that permits the development of 3D scaffolds with both precisely engineered architecture and tailored surface topography. The scaffold fabrication technique consists of three key steps starting with microfabrication of a mold using an epoxy-based photoresist (SU-8), followed by dual-sided molding of a single layer of polydimethylsiloxane (PDMS) using a mechanical jig for precise motion control; and finally, alignment, stacking, and adhesion of multiple PDMS layers to achieve a 3D structure. This technique was used to produce 3D Texture and 3D Smooth PDMS scaffolds, where the surface topography comprised 10 μm-diameter/height posts and smooth surfaces, respectively. The potential utility of the 3D microfabricated scaffolds, and the role of surface topography, were subsequently investigated in vitro with a combined heterogeneous population of adult human stem cells and their resultant progenitor cells, collectively termed connective tissue progenitors (CTPs), under conditions promoting the osteoblastic phenotype. Examination of bone-marrow derived CTPs cultured on the 3D Texture scaffold for 9 days revealed cell growth in three dimensions and increased cell numbers compared to those on the 3D Smooth scaffold. Furthermore, expression of alkaline phosphatase mRNA was higher on the 3D Texture scaffold, while osteocalcin mRNA expression was comparable for both types of scaffolds. PMID:19524292
A simulation study of control and display requirements for zero-experience general aviation pilots
NASA Technical Reports Server (NTRS)
Stewart, Eric C.
1993-01-01
The purpose of this simulation study was to define the basic human factor requirements for operating an airplane in all weather conditions. The basic human factors requirements are defined as those for an operator who is a complete novice for airplane operations but who is assumed to have automobile driving experience. These operators thus have had no piloting experience or training of any kind. The human factor requirements are developed for a practical task which includes all of the basic maneuvers required to go from one airport to another airport in limited visibility conditions. The task was quite demanding, including following a precise path with climbing and descending turns while simultaneously changing airspeed. The ultimate goal of this research is to increase the utility of general aviation airplanes - that is, to make them a practical mode of transportation for a much larger segment of the general population. This can be accomplished by reducing the training and proficiency requirements of pilots while improving the level of safety. It is believed that advanced technologies such as fly-by-wire (or fly-by-light) and head-up pictorial displays can be of much greater benefit to the general aviation pilot than to the full-time, professional pilot.
Bernstein, Leslie R; Trahiotis, Constantine
2016-11-01
This study assessed whether audiometrically-defined "slight" or "hidden" hearing losses might be associated with degradations in binaural processing as measured in binaural detection experiments employing interaurally delayed signals and maskers. Thirty-one listeners participated, all having no greater than slight hearing losses (i.e., no thresholds greater than 25 dB HL). Across the 31 listeners and consistent with the findings of Bernstein and Trahiotis [(2015). J. Acoust. Soc. Am. 138, EL474-EL479] binaural detection thresholds at 500 Hz and 4 kHz increased with increasing magnitude of interaural delay, suggesting a loss of precision of coding with magnitude of interaural delay. Binaural detection thresholds were consistently found to be elevated for listeners whose absolute thresholds at 4 kHz exceeded 7.5 dB HL. No such elevations were observed in conditions having no binaural cues available to aid detection (i.e., "monaural" conditions). Partitioning and analyses of the data revealed that those elevated thresholds (1) were more attributable to hearing level than to age and (2) result from increased levels of internal noise. The data suggest that listeners whose high-frequency monaural hearing status would be classified audiometrically as being normal or "slight loss" may exhibit substantial and perceptually meaningful losses of binaural processing.
Golda, Rachel L; Golda, Mark D; Hayes, Jacqueline A; Peterson, Tawnya D; Needoba, Joseph A
2017-05-01
Laboratory investigations of physiological processes in phytoplankton require precise control of experimental conditions. Chemostats customized to control and maintain stable pH levels (pHstats) are ideally suited for investigations of the effects of pH on phytoplankton physiology, for example in the context of ocean acidification. Here we designed and constructed a simple, flexible pHstat system and demonstrated its operational capabilities under laboratory culture conditions. In particular, the system is useful for simulating natural cyclic pH variability within aquatic ecosystems, such as diel fluctuations that result from metabolic activity or tidal mixing in estuaries. The pHstat system operates in two modes: (1) static/set point pH, which maintains pH at a constant level, or (2) dynamic pH, which generates regular, sinusoidal pH fluctuations by systematically varying pH according to user-defined parameters. The pHstat is self-regulating through the use of interchangeable electronically controlled reagent or gas-mediated pH-modification manifolds, both of which feature flow regulation by solenoid valves. Although effective pH control was achieved using both liquid reagent additions and gas-mediated methods, the liquid manifold exhibited tighter control (±0.03 pH units) of the desired pH than the gas manifold (±0.10 pH units). The precise control provided by this pHstat system, as well as its operational flexibility, will facilitate studies that examine responses by marine microbiota to fluctuations in pH in aquatic ecosystems. Copyright © 2017 Elsevier B.V. All rights reserved.
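The two operating modes described above can be sketched in a few lines: a sinusoidal setpoint generator for the dynamic mode and a deadband rule deciding which solenoid-controlled manifold to open. This is a hypothetical sketch, not the authors' control firmware; the parameter names and values are illustrative assumptions, with the deadband taken from the reported ±0.03 pH reagent-manifold precision.

```python
import math

def ph_setpoint(t_hours, mean_ph=8.1, amplitude=0.3, period_h=24.0):
    """Dynamic-mode target: a diel sinusoidal pH fluctuation (illustrative values).

    Static/set-point mode is simply the special case amplitude = 0.
    """
    return mean_ph + amplitude * math.sin(2 * math.pi * t_hours / period_h)

def solenoid_command(measured_ph, target_ph, deadband=0.03):
    """Deadband (bang-bang) decision for the pH-modification manifold."""
    if measured_ph > target_ph + deadband:
        return "acid"   # open the acid (or CO2) valve to lower pH
    if measured_ph < target_ph - deadband:
        return "base"   # open the base valve to raise pH
    return "hold"       # within tolerance: keep all valves closed

# At t = 6 h (a quarter of the 24 h period) the target peaks at 8.1 + 0.3 = 8.4
```

A real controller would add anti-chatter timing around the valve actuation, but the setpoint-plus-deadband structure is the core of both operating modes.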
Crowther, Caroline A; Aghajafari, Fariba; Askie, Lisa M; Asztalos, Elizabeth V; Brocklehurst, Peter; Bubner, Tanya K; Doyle, Lex W; Dutta, Sourabh; Garite, Thomas J; Guinn, Debra A; Hallman, Mikko; Hannah, Mary E; Hardy, Pollyanna; Maurel, Kimberly; Mazumder, Premasish; McEvoy, Cindy; Middleton, Philippa F; Murphy, Kellie E; Peltoniemi, Outi M; Peters, Dawn; Sullivan, Lisa; Thom, Elizabeth A; Voysey, Merryn; Wapner, Ronald J; Yelland, Lisa; Zhang, Sasha
2012-02-12
The aim of this individual participant data (IPD) meta-analysis is to assess whether the effects of repeat prenatal corticosteroid treatment given to women at risk of preterm birth to benefit their babies are modified in a clinically meaningful way by factors related to the women or the trial protocol. The Prenatal Repeat Corticosteroid International IPD Study Group: assessing the effects using the best level of Evidence (PRECISE) Group will conduct an IPD meta-analysis. The PRECISE International Collaborative Group was formed in 2010 and data collection commenced in 2011. Eleven trials with up to 5,000 women and 6,000 infants are eligible for the PRECISE IPD meta-analysis. The primary study outcomes for the infants will be serious neonatal outcome (defined by the PRECISE International IPD Study Group as one of death (foetal, neonatal or infant); severe respiratory disease; severe intraventricular haemorrhage (grade 3 and 4); chronic lung disease; necrotising enterocolitis; serious retinopathy of prematurity; and cystic periventricular leukomalacia); use of respiratory support (defined as mechanical ventilation or continuous positive airways pressure or other respiratory support); and birth weight (Z-scores). For the children, the primary study outcomes will be death or any neurological disability (however defined by trialists at childhood follow up and may include developmental delay or intellectual impairment (developmental quotient or intelligence quotient more than one standard deviation below the mean), cerebral palsy (abnormality of tone with motor dysfunction), blindness (for example, corrected visual acuity worse than 6/60 in the better eye) or deafness (for example, hearing loss requiring amplification or worse)). For the women, the primary outcome will be maternal sepsis (defined as chorioamnionitis; pyrexia after trial entry requiring the use of antibiotics; puerperal sepsis; intrapartum fever requiring the use of antibiotics; or postnatal pyrexia). 
Data analyses are expected to commence in 2011 with results publicly available in 2012.
Curriculum Development and Alignment in Radiologic Technology.
ERIC Educational Resources Information Center
Dowd, Steven B.
Before developing a curriculum for radiologic technology, one must first attempt to define the term "curriculum." The term is not easy to define precisely, although it does imply the necessity of a master plan that outlines institutional philosophy and goals, course descriptions, description of competency-based evaluation, performance objectives,…
Evaluating the Skill of Students to Determine Soil Morphology Characteristics
ERIC Educational Resources Information Center
Post, Donald F.; Parikh, Sanjai J.; Papp, Rae Ann; Ferriera, Laerta
2006-01-01
Precise and accurate pedon descriptions prepared by field scientists using standard techniques with defined terminology and methodology are essential in describing soil pedons. The accuracy of field measurements generally are defined in terms of how well they agree with objective criteria (e.g., laboratory analysis), such as mechanical analysis…
Precision agricultural systems: a model of integrative science and technology
USDA-ARS?s Scientific Manuscript database
In the world of science research, long gone are the days when investigations are done in isolation. More often than not, science funding starts with one or more well-defined challenges or problems, judged by society as high-priority and needing immediate attention. As such, problems are not defined...
An absolute calibration system for millimeter-accuracy APOLLO measurements
NASA Astrophysics Data System (ADS)
Adelberger, E. G.; Battat, J. B. R.; Birkmeier, K. J.; Colmenares, N. R.; Davis, R.; Hoyle, C. D.; Huang, L. R.; McMillan, R. J.; Murphy, T. W., Jr.; Schlerman, E.; Skrobol, C.; Stubbs, C. W.; Zach, A.
2017-12-01
Lunar laser ranging provides a number of leading experimental tests of gravitation, important in our quest to unify general relativity and the standard model of physics. The Apache Point Observatory Lunar Laser-ranging Operation (APOLLO) has for years achieved median range precision at the ∼2 mm level. Yet residuals in model-measurement comparisons are an order-of-magnitude larger, raising the question of whether the ranging data are not nearly as accurate as they are precise, or if the models are incomplete or ill-conditioned. This paper describes a new absolute calibration system (ACS) intended both as a tool for exposing and eliminating sources of systematic error, and also as a means to directly calibrate ranging data in situ. The system consists of a high-repetition-rate (80 MHz) laser emitting short (< 10 ps) pulses that are locked to a cesium clock. In essence, the ACS delivers photons to the APOLLO detector at exquisitely well-defined time intervals as a 'truth' input against which APOLLO's timing performance may be judged and corrected. Preliminary analysis indicates no inaccuracies in APOLLO data beyond the ∼3 mm level, suggesting that historical APOLLO data are of high quality and motivating continued work on model capabilities. The ACS provides the means to deliver APOLLO data both accurate and precise below the 2 mm level.
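Because the ACS pulses are emitted at 80 MHz, its photons should arrive on a comb of teeth spaced 1/(80 MHz) = 12.5 ns apart; the deviation of each recorded timestamp from the nearest comb tooth exposes timing error in the detection chain. The check below is a hedged sketch of that idea only (the function and its interface are illustrative, not APOLLO's actual analysis code):

```python
def comb_residuals(timestamps_ns, rep_rate_hz=80e6):
    """Residual of each timestamp from the nearest tooth of the ACS pulse comb.

    With an 80 MHz repetition rate the comb spacing is 1e9 / 80e6 = 12.5 ns.
    Residuals are folded into the interval (-spacing/2, +spacing/2]; systematic
    offsets or drifts in these residuals indicate timing-system error.
    """
    spacing = 1e9 / rep_rate_hz  # 12.5 ns between comb teeth
    return [((t + spacing / 2) % spacing) - spacing / 2 for t in timestamps_ns]

# A photon at exactly 12.5 ns sits on a tooth (residual 0); one recorded at
# 25.1 ns is 0.1 ns late relative to the second tooth.
```

Aggregating such residuals over many photons is what allows a 'truth' input at known intervals to both expose and, in principle, correct systematic timing errors.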
NASA Technical Reports Server (NTRS)
Abdelwahab, Mahmood; Biesiadny, Thomas J.; Silver, Dean
1987-01-01
An uncertainty analysis was conducted to determine the bias and precision errors and total uncertainty of measured turbojet engine performance parameters. The engine tests were conducted as part of the Uniform Engine Test Program which was sponsored by the Advisory Group for Aerospace Research and Development (AGARD). With the same engines, support hardware, and instrumentation, performance parameters were measured twice, once during tests conducted in test cell number 3 and again during tests conducted in test cell number 4 of the NASA Lewis Propulsion Systems Laboratory. The analysis covers 15 engine parameters, including engine inlet airflow, engine net thrust, and engine specific fuel consumption measured at a high rotor speed of 8875 rpm. Measurements were taken at three flight conditions defined by the following engine inlet pressure, engine inlet total temperature, and engine ram ratio: (1) 82.7 kPa, 288 K, 1.0, (2) 82.7 kPa, 288 K, 1.3, and (3) 20.7 kPa, 288 K, 1.3. In terms of bias, precision, and uncertainty magnitudes, there were no differences between most measurements made in test cells number 3 and 4. The magnitude of the errors increased for both test cells as engine pressure level decreased. Also, the level of the bias error was two to three times larger than that of the precision error.
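The abstract distinguishes bias (systematic) error from precision (random) error. A common way to combine the two into a total uncertainty, used in AGARD/ASME-style test uncertainty analyses, is the root-sum-square of the bias limit and the t-scaled precision index of the mean. The sketch below follows that convention as an illustration; the specific combination rule and the coverage factor t ≈ 2 are assumptions, since the abstract does not state them.

```python
import math
import statistics

def total_uncertainty(bias_limit, samples, t=2.0):
    """Combine a bias limit with the precision of repeated measurements.

    precision = t * (sample standard deviation of the mean); the total
    uncertainty is the root-sum-square of the bias and precision terms.
    """
    s_mean = statistics.stdev(samples) / math.sqrt(len(samples))
    precision = t * s_mean
    total = math.sqrt(bias_limit ** 2 + precision ** 2)
    return precision, total

# Example (hypothetical numbers): five repeat readings with a 0.3-unit bias limit.
p, u = total_uncertainty(0.3, [10.0, 10.2, 9.8, 10.1, 9.9])
# Here the bias term (0.3) is roughly twice the precision term, the same
# qualitative pattern the engine-test analysis reports.
```

With these numbers, the total uncertainty is dominated by the bias contribution, which illustrates why reducing systematic error matters more than further repetition in such tests.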
Kohlmeier, Martin; De Caterina, Raffaele; Ferguson, Lynnette R; Görman, Ulf; Allayee, Hooman; Prasad, Chandan; Kang, Jing X; Nicoletti, Carolina Ferreira; Martinez, J Alfredo
2016-01-01
Nutrigenetics considers the influence of individual genetic variation on differences in response to dietary components, nutrient requirements and predisposition to disease. Nutrigenomics involves the study of interactions between the genome and diet, including how nutrients affect the transcription and translation process plus subsequent proteomic and metabolomic changes, and also differences in response to dietary factors based on the individual genetic makeup. Personalized characteristics such as age, gender, physical activity, physiological state and social status, and special conditions such as pregnancy and risk of disease can inform dietary advice that more closely meets individual needs. Precision nutrition has a promising future in treating the individual according to their phenotype and genetic characteristics, aimed at both the treatment and prevention of disease. However, many aspects are still in progress and remain as challenges for the future of nutrition. The integration of the human genotype and microbiome needs to be better understood. Further advances in data interpretation tools are also necessary, so that information obtained through newer tests and technologies can be properly transferred to consumers. Indeed, precision nutrition will integrate genetic data with phenotypic, social, cultural and personal preferences and lifestyle factors to provide more individualized nutrition, while considering public health perspectives, in which ethical, legal and policy aspects need to be defined and implemented. © 2016 S. Karger AG, Basel.
The European Society for Medical Oncology (ESMO) Precision Medicine Glossary.
Yates, L R; Seoane, J; Le Tourneau, C; Siu, L L; Marais, R; Michiels, S; Soria, J C; Campbell, P; Normanno, N; Scarpa, A; Reis-Filho, J S; Rodon, J; Swanton, C; Andre, F
2018-01-01
Precision medicine is rapidly evolving within the field of oncology and has brought many new concepts and terminologies that are often poorly defined when first introduced, which may subsequently lead to miscommunication within the oncology community. The European Society for Medical Oncology (ESMO) recognises these challenges and is committed to support the adoption of precision medicine in oncology. To add clarity to the language used by oncologists and basic scientists within the context of precision medicine, the ESMO Translational Research and Personalised Medicine Working Group has developed a standardised glossary of relevant terms. Relevant terms for inclusion in the glossary were identified via an ESMO member survey conducted in Autumn 2016, and by the ESMO Translational Research and Personalised Medicine Working Group members. Each term was defined by experts in the field, discussed and, if necessary, modified by the Working Group before reaching consensus approval. A literature search was carried out to determine which of the terms, 'precision medicine' and 'personalised medicine', is most appropriate to describe this field. A total of 43 terms are included in the glossary, grouped into five main themes: (i) mechanisms of decision, (ii) characteristics of molecular alterations, (iii) tumour characteristics, (iv) clinical trials and statistics and (v) new research tools. The glossary classes 'precision medicine' and 'personalised medicine' as technically interchangeable, but the term 'precision medicine' is favoured as it more accurately reflects the highly precise nature of new technologies that permit base pair resolution dissection of cancer genomes and is less likely to be misinterpreted. The ESMO Precision Medicine Glossary provides a resource to facilitate consistent communication in this field by clarifying and raising awareness of the language employed in cancer research and oncology practice.
The glossary will be a dynamic entity, undergoing expansion and refinement over the coming years. © The Author 2017. Published by Oxford University Press on behalf of the European Society for Medical Oncology.
A Process for the Representation of openEHR ADL Archetypes in OWL Ontologies.
Porn, Alex Mateus; Peres, Leticia Mara; Didonet Del Fabro, Marcos
2015-01-01
ADL is a formal language for expressing archetypes, independent of standards or domain. However, its specification is not precise enough regarding the specialization and semantics of archetypes, which leads to implementation difficulties and few available tools. Archetypes may be implemented using other languages such as XML or OWL, increasing integration with Semantic Web tools. Exchanging and transforming data can be better implemented with semantically oriented models, for example using OWL, a language defined by the W3C for defining and instantiating Web ontologies. OWL permits the user to define significant, detailed, precise and consistent distinctions among classes, properties and relations, ensuring greater consistency of knowledge than ADL techniques. This paper presents a process for representing openEHR ADL archetypes in OWL ontologies. The process consists of converting ADL archetypes into OWL ontologies and validating the resulting ontologies using mutation testing.
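The conversion step described in this abstract can be sketched as follows. This is a minimal, dependency-free illustration of mapping an archetype-like term dictionary into OWL class and property axioms in Turtle syntax; the archetype and attribute names are invented for illustration, and a real converter would parse ADL and honour the full openEHR reference model.

```python
# Minimal sketch of one step of an ADL -> OWL conversion: a (toy, hand-written)
# archetype term dictionary becomes OWL/Turtle class and property axioms.
# All names below are illustrative, not taken from the paper.

def archetype_to_turtle(archetype_id, parent, terms):
    """Emit Turtle for one archetype: an OWL class plus one datatype
    property per constrained attribute."""
    lines = [
        "@prefix ehr: <http://example.org/openehr#> .",
        "@prefix owl: <http://www.w3.org/2002/07/owl#> .",
        "@prefix rdfs: <http://www.w3.org/2000/01/rdf-schema#> .",
        "",
        f"ehr:{archetype_id} a owl:Class ;",
        f"    rdfs:subClassOf ehr:{parent} .",
    ]
    for attr, label in terms.items():
        lines += [
            "",
            f"ehr:{attr} a owl:DatatypeProperty ;",
            f"    rdfs:domain ehr:{archetype_id} ;",
            f'    rdfs:label "{label}" .',
        ]
    return "\n".join(lines)

# Toy archetype: a blood-pressure observation with two constrained attributes.
ttl = archetype_to_turtle(
    "BloodPressureObservation", "Observation",
    {"systolic": "systolic pressure", "diastolic": "diastolic pressure"},
)
print(ttl)
```

Consistency checking of the resulting ontology (the validation step the paper addresses with mutation testing) would then be delegated to a standard OWL reasoner.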
Molecular diagnosis and precision medicine in allergy management.
Riccio, Anna Maria; De Ferrari, Laura; Chiappori, Alessandra; Ledda, Sabina; Passalacqua, Giovanni; Melioli, Giovanni; Canonica, Giorgio Walter
2016-11-01
Precision medicine (PM) can be defined as a structural model aimed at customizing healthcare, with medical decisions/products tailored to an individual patient at a highly detailed level. In this sense, allergy diagnostics based on molecular allergen components makes it possible to accurately define the patient's IgE repertoire. The availability of highly specialized singleplexed and multiplexed platforms supports allergists with an advanced diagnostic armamentarium. The therapeutic intervention, driven by the standard diagnostic approach but further supported by these innovative tools, may result, for instance, in a more appropriate prescription of allergen immunotherapy (AIT). The phenotyping of patients, which may have relevant effects on the treatment strategy, could also take advantage of molecular allergy diagnosis.
Precision Medicine: The New Frontier in Idiopathic Pulmonary Fibrosis
Brownell, Robert; Kaminski, Naftali; Woodruff, Prescott G.; Bradford, Williamson Z.; Richeldi, Luca; Martinez, Fernando J.
2016-01-01
Precision medicine is defined by the National Institute of Health’s Precision Medicine Initiative Working Group as an approach to disease treatment that takes into account individual variability in genes, environment, and lifestyle. There has been increased interest in applying the concept of precision medicine to idiopathic pulmonary fibrosis, in particular to search for genetic and molecular biomarker-based profiles (so called endotypes) that identify mechanistically distinct disease subgroups. The relevance of precision medicine to idiopathic pulmonary fibrosis is yet to be established, but we believe that it holds great promise to provide targeted and highly effective therapies to patients. In this manuscript, we describe the field’s nascent efforts in genetic/molecular endotype identification and how environmental and behavioral subgroups may also be relevant to disease management. PMID:26991475
Kim, Young Eun; Kim, Yu-na; Kim, Jung A.; Kim, Ho Min; Jung, Yongwon
2015-01-01
Supramolecular protein assemblies offer novel nanoscale architectures with molecular precision and unparalleled functional diversity. A key challenge, however, is to create precise nano-assemblies of functional proteins with both defined structures and a controlled number of protein-building blocks. Here we report a series of supramolecular green fluorescent protein oligomers that are assembled in precise polygonal geometries and prepared in a monodisperse population. Green fluorescent protein is engineered to be self-assembled in cells into oligomeric assemblies that are natively separated at single-protein resolution by surface charge manipulation, affording monodisperse protein (nano)polygons from dimer to decamer. Several functional proteins are multivalently displayed on the oligomers with controlled orientations. Spatial arrangements of protein oligomers and displayed functional proteins are directly visualized by transmission electron microscopy. By employing our functional protein assemblies, we provide experimental insight into multivalent protein–protein interactions and tools to manipulate receptor clustering on live cell surfaces. PMID:25972078
Xie, Weizhen; Zhang, Weiwei
2017-09-01
Negative emotion sometimes enhances memory (higher accuracy and/or vividness, e.g., flashbulb memories). The present study investigates whether it is the qualitative (precision) or quantitative (the probability of successful retrieval) aspect of memory that drives these effects. In a visual long-term memory task, observers memorized colors (Experiment 1a) or orientations (Experiment 1b) of sequentially presented everyday objects under negative, neutral, or positive emotions induced with International Affective Picture System images. In a subsequent test phase, observers reconstructed objects' colors or orientations using the method of adjustment. We found that mnemonic precision was enhanced under the negative condition relative to the neutral and positive conditions. In contrast, the probability of successful retrieval was comparable across the emotion conditions. Furthermore, the boost in memory precision was associated with elevated subjective feelings of remembering (vividness and confidence) and metacognitive sensitivity in Experiment 2. Altogether, these findings suggest a novel precision-based account for emotional memories. Copyright © 2017 Elsevier B.V. All rights reserved.
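The separation of memory precision from the probability of successful retrieval that this study relies on is commonly done with a mixture model fitted to adjustment errors: with some probability the response is a von Mises distribution around the target (its concentration is the precision), otherwise a uniform random guess. The sketch below simulates and fits such a model; the parameter values and the crude grid-search fit are invented for illustration, and the study's actual fitting procedure may differ.

```python
# Sketch of a Zhang & Luck (2008)-style mixture model separating retrieval
# probability from mnemonic precision in a method-of-adjustment task.
# All parameter values are invented for illustration.
import math
import random

def log_i0(kappa):
    # log of the modified Bessel function I0, via a truncated power series
    # (adequate for kappa <= 20)
    return math.log(sum((kappa / 2) ** (2 * k) / math.factorial(k) ** 2
                        for k in range(25)))

def log_likelihood(errors, p_mem, kappa):
    # mixture: with prob p_mem, a von Mises (mean 0, concentration kappa)
    # response around the target; otherwise a uniform random guess
    guess = 1 / (2 * math.pi)
    norm = 2 * math.pi * math.exp(log_i0(kappa))
    return sum(math.log(p_mem * math.exp(kappa * math.cos(e)) / norm
                        + (1 - p_mem) * guess) for e in errors)

# Simulate one observer: 70% of trials retrieved with precision kappa = 8,
# 30% pure guesses.
random.seed(0)
errors = ([random.vonmisesvariate(0, 8) for _ in range(700)]
          + [random.uniform(-math.pi, math.pi) for _ in range(300)])
errors = [((e + math.pi) % (2 * math.pi)) - math.pi for e in errors]  # wrap

# Crude grid-search maximum-likelihood fit of (P(retrieval), kappa).
best = max(((p / 20, k) for p in range(1, 20) for k in range(1, 21)),
           key=lambda pk: log_likelihood(errors, pk[0], pk[1]))
print(f"estimated P(retrieval) = {best[0]:.2f}, precision kappa = {best[1]}")
```

The study's finding corresponds, in these terms, to negative emotion raising the concentration parameter while leaving the mixture weight unchanged.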
Construct Definition Using Cognitively Based Evidence: A Framework for Practice
ERIC Educational Resources Information Center
Ketterlin-Geller, Leanne R.; Yovanoff, Paul; Jung, EunJu; Liu, Kimy; Geller, Josh
2013-01-01
In this article, we highlight the need for a precisely defined construct in score-based validation and discuss the contribution of cognitive theories to accurately and comprehensively defining the construct. We propose a framework for integrating cognitively based theoretical and empirical evidence to specify and evaluate the construct. We apply…
Defining and Quantifying Potentially Discriminatory Questions in Employment Interviewing.
ERIC Educational Resources Information Center
Springston, Jeffery K.; Keyton, Joann
A study determined what constitutes an illegal pre-employment question, reviewed current laws and literature on the subject, and determined the prevalence of illegal questions asked by organizations. Except in the case of specific statutory law, there is no precise way to define what constitutes an illegal question; however, state and federal…
Consumer Education: A Position on the State of the Art.
ERIC Educational Resources Information Center
Richardson, Lee; And Others
Including the introduction, this document is a collection of seven short papers that discuss facets of consumer education (CE). The Introduction defines CE and lists five assumptions used throughout the report (e.g., CE is generally understood, but not precisely defined enough for the people implementing it to have a uniform understanding; schools…
The Cauchy Problem in Local Spaces for the Complex Ginzburg-Landau EquationII. Contraction Methods
NASA Astrophysics Data System (ADS)
Ginibre, J.; Velo, G.
We continue the study of the initial value problem for the complex Ginzburg-Landau equation
FlexibleSUSY-A spectrum generator generator for supersymmetric models
NASA Astrophysics Data System (ADS)
Athron, Peter; Park, Jae-hyeon; Stöckinger, Dominik; Voigt, Alexander
2015-05-01
We introduce FlexibleSUSY, a Mathematica and C++ package, which generates a fast, precise C++ spectrum generator for any SUSY model specified by the user. The generated code is designed with both speed and modularity in mind, making it easy to adapt and extend with new features. The model is specified by supplying the superpotential, gauge structure and particle content in a SARAH model file; specific boundary conditions e.g. at the GUT, weak or intermediate scales are defined in a separate FlexibleSUSY model file. From these model files, FlexibleSUSY generates C++ code for self-energies, tadpole corrections, renormalization group equations (RGEs) and electroweak symmetry breaking (EWSB) conditions and combines them with numerical routines for solving the RGEs and EWSB conditions simultaneously. The resulting spectrum generator is then able to solve for the spectrum of the model, including loop-corrected pole masses, consistent with user specified boundary conditions. The modular structure of the generated code allows for individual components to be replaced with an alternative if available. FlexibleSUSY has been carefully designed to grow as alternative solvers and calculators are added. Predefined models include the MSSM, NMSSM, E6SSM, USSM, R-symmetric models and models with right-handed neutrinos.
NASA Technical Reports Server (NTRS)
2003-01-01
We propose a multifunctional X-ray facility for the Materials, Biotechnology and Life Sciences Programs to visualize the formation and behavior dynamics of materials, biomaterials, and living organisms, tissues and cells. The facility will combine X-ray topography, phase micro-imaging and scattering capabilities with sample units installed on the goniometer. This should allow, for the first time, monitoring under well defined conditions, in situ and in real time: the creation of imperfections during growth of semiconductor, metal, dielectric and biomacromolecular crystals and films; high-precision diffraction from crystals within a wide range of temperatures and vapor, melt and solution conditions; internal morphology and changes in living organisms, tissues and cells; diffraction from biominerals, nanotubes and particles; and radiation damage, also under controlled formation/life conditions. The system will include an ultrabright X-ray source, X-ray mirror, monochromator, image-recording unit, detectors, and a multipurpose diffractometer that fully accommodates and integrates furnaces and samples with other experimental environments. The easily adjustable laboratory and flight versions will allow monitoring of processes under terrestrial and microgravity conditions. The flight version can be made available using a microsource combined with multilayer or capillary optics.
Barbieri, Christopher E; Chinnaiyan, Arul M; Lerner, Seth P; Swanton, Charles; Rubin, Mark A
2017-02-01
Biomarker-driven cancer therapy, also referred to as precision oncology, has received increasing attention for its promise of improving patient outcomes by defining subsets of patients more likely to respond to various therapies. In this collaborative review article, we examine recent literature regarding biomarker-driven therapeutics in urologic oncology, to better define the state of the field, explore the current evidence supporting utility of this approach, and gauge potential for the future. We reviewed relevant literature, with a particular focus on recent studies about targeted therapy, predictors of response, and biomarker development. The recent advances in molecular profiling have led to a rapid expansion of potential biomarkers and predictive information for patients with urologic malignancies. Across disease states, distinct molecular subtypes of cancers have been identified, with the potential to inform choices of management strategy. Biomarkers predicting response to standard therapies (such as platinum-based chemotherapy) are emerging. In several malignancies (particularly renal cell carcinoma and castration-resistant prostate cancer), targeted therapy against commonly altered signaling pathways has emerged as standard of care. Finally, targeted therapy against alterations present in rare patients (less than 2%) across diseases has the potential to drastically alter patterns of care and choices of therapeutic options. Precision medicine has the highest potential to impact the care of patients. Prospective studies in the setting of clinical trials and standard of care therapy will help define reliable predictive biomarkers and new therapeutic targets leading to real improvement in patient outcomes. Precision oncology uses molecular information (DNA and RNA) from the individual and the tumor to match the right patient with the right treatment. 
Tremendous strides have been made in defining the molecular underpinnings of urologic malignancies and understanding how these predict response to treatment-this represents the future of urologic oncology. Copyright © 2016 European Association of Urology. Published by Elsevier B.V. All rights reserved.
Barbieri, Christopher E.; Chinnaiyan, Arul M.; Lerner, Seth P.; Swanton, Charles; Rubin, Mark A.
2016-01-01
Context Biomarker-driven cancer therapy, also referred to as precision oncology, has received increasing attention for its promise of improving patient outcomes by defining subsets of patients more likely to respond to various therapies. Objective In this collaborative review article, we examine recent literature regarding biomarker-driven therapeutics in urologic oncology, to better define the state of the field, explore the current evidence supporting utility of this approach, and gauge potential for the future. Evidence acquisition We reviewed relevant literature, with a particular focus on recent studies about targeted therapy, predictors of response, and biomarker development. Evidence synthesis The recent advances in molecular profiling have led to a rapid expansion of potential biomarkers and predictive information for patients with urologic malignancies. Across disease states, distinct molecular subtypes of cancers have been identified, with the potential to inform choices of management strategy. Biomarkers predicting response to standard therapies (such as platinum-based chemotherapy) are emerging. In several malignancies (particularly renal cell carcinoma and castration-resistant prostate cancer), targeted therapy against commonly altered signaling pathways has emerged as standard of care. Finally, targeted therapy against alterations present in rare patients (less than 2%) across diseases has the potential to drastically alter patterns of care and choices of therapeutic options. Conclusions Precision medicine has the highest potential to impact the care of patients. Prospective studies in the setting of clinical trials and standard of care therapy will help define reliable predictive biomarkers and new therapeutic targets leading to real improvement in patient outcomes. Patient summary Precision oncology uses molecular information (DNA and RNA) from the individual and the tumor to match the right patient with the right treatment. 
Tremendous strides have been made in defining the molecular underpinnings of urologic malignancies and understanding how these predict response to treatment—this represents the future of urologic oncology. PMID:27567210
Russo, Russell R; Burn, Matthew B; Ismaily, Sabir K; Gerrie, Brayden J; Han, Shuyang; Alexander, Jerry; Lenherr, Christopher; Noble, Philip C; Harris, Joshua D; McCulloch, Patrick C
2017-09-07
Accurate measurements of knee and hip motion are required for management of musculoskeletal pathology. The purpose of this investigation was to compare three techniques for measuring motion at the hip and knee. The authors hypothesized that digital photography would be equivalent in accuracy and show higher precision compared to the other two techniques. Using infrared motion capture analysis as the reference standard, hip flexion/abduction/internal rotation/external rotation and knee flexion/extension were measured using visual estimation, goniometry, and photography on 10 fresh frozen cadavers. These measurements were performed by three physical therapists and three orthopaedic surgeons. Accuracy was defined by the difference from the reference standard, while precision was defined by the proportion of measurements within either 5° or 10°. Analysis of variance (ANOVA), t-tests, and chi-squared tests were used. Although two statistically significant differences were found in measurement accuracy between the three techniques, neither of these differences met clinical significance (difference of 1.4° for hip abduction and 1.7° for the knee extension). Precision of measurements was significantly higher for digital photography than: (i) visual estimation for hip abduction and knee extension, and (ii) goniometry for knee extension only. There was no clinically significant difference in measurement accuracy between the three techniques for hip and knee motion. Digital photography only showed higher precision for two joint motions (hip abduction and knee extension). Overall digital photography shows equivalent accuracy and near-equivalent precision to visual estimation and goniometry.
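The study's operational definitions, accuracy as the deviation from the motion-capture reference standard and precision as the proportion of measurements within 5° or 10° of it, can be sketched numerically. The data below are invented for illustration only and do not reproduce the study's results.

```python
# Sketch of the accuracy/precision definitions used in the study: accuracy as
# deviation from a reference standard (here, a simulated motion-capture value)
# and precision as the proportion of measurements within a tolerance of it.
# All numbers are invented for illustration.

def accuracy(measured, reference):
    """Mean signed difference from the reference standard, in degrees."""
    return sum(m - r for m, r in zip(measured, reference)) / len(measured)

def precision(measured, reference, tolerance):
    """Proportion of measurements within +/- tolerance of the reference."""
    within = sum(abs(m - r) <= tolerance for m, r in zip(measured, reference))
    return within / len(measured)

# Toy data: six raters measuring knee flexion on one specimen.
reference = [120.0] * 6                                  # motion-capture value
goniometer = [118.0, 123.0, 116.0, 121.0, 126.0, 119.0]
photograph = [119.5, 121.0, 120.5, 118.5, 122.0, 120.0]

for name, vals in [("goniometry", goniometer), ("photography", photograph)]:
    print(f"{name}: accuracy {accuracy(vals, reference):+.1f} deg, "
          f"within 5 deg {precision(vals, reference, 5):.0%}")
```

Note how the two metrics can diverge: a technique can be accurate on average yet imprecise, which is exactly the distinction the study draws between the three measurement techniques.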
DOE Office of Scientific and Technical Information (OSTI.GOV)
Prabhakaran, Venkateshkumar; Johnson, Grant E.; Wang, Bingbing
Molecular-level understanding of electrochemical processes occurring at electrode-electrolyte interfaces (EEI) is key to the rational development of high-performance and sustainable electrochemical technologies. This article reports the development and first application of solid-state in situ electrochemical probes to study redox and catalytic processes occurring at well-defined EEI generated using soft-landing of mass- and charge-selected cluster ions (SL). In situ electrochemical probes with excellent mass transfer properties are fabricated using carefully-designed nanoporous ionic liquid membranes. SL enables deposition of pure active species that are not obtainable with other techniques onto electrode surfaces with precise control over charge state, composition, and kinetic energy. SL is, therefore, a unique tool for studying fundamental processes occurring at EEI. For the first time using an aprotic electrochemical probe, the effect of charge state (PMo12O40^3-/2-) and the contribution of building blocks of Keggin polyoxometalate (POM) clusters to redox processes are characterized by populating EEI with novel POM anions generated by electrospray ionization and gas-phase dissociation. Additionally, a proton-conducting electrochemical probe has been developed to characterize the reactive electrochemistry (oxygen reduction activity) of bare Pt clusters (Pt40, ~1 nm diameter), thus demonstrating the capability of the probe for studying reactions in controlled gaseous environments. The newly developed in situ electrochemical probes combined with ion SL provide a versatile method to characterize the EEI in solid-state redox systems and reactive electrochemistry at precisely-defined conditions. This capability will advance molecular-level understanding of processes occurring at EEI that are critical to many energy-related technologies.
Transport phenomena of graded sediments in tidal environments
NASA Astrophysics Data System (ADS)
Bonaldo, Davide; Dall'Angelo, Chiara; di Silvio, Giampaolo
2010-05-01
A long-term morphodynamic model simulating the ontogenesis and evolution of a tidal lagoon has been undergoing continuous improvement in order to enrich its predictive ability and to assess the relative importance of different factors, of both natural and anthropogenic origin, in defining the equilibrium configuration of such systems. A significant step forward in this direction is achieved by introducing the possibility of extending the analysis from uniform to graded sediments. In the latter case the representation of long-term phenomena is conceptually the same as for a sediment characterized by a single granulometric class, as far as the temporal averaging and the splitting of the transport into a dispersive component (mainly given by tidal action) and a Eulerian residual convective component (resulting from rivers, long-shore currents, and asymmetry between flood and ebb flow fields) are concerned. The horizontal sediment budget, however, is now coupled with a sediment budget among the different granulometric classes in the bottom, and precisely in a "mixing layer" whose thickness has to be properly defined. This enhancement of the model allows, besides a more precise description of the morphodynamic processes, a number of further investigations. First, it makes it possible to study the effect of the initial stratigraphic conditions on the genesis and evolution of the tidal basin, thus obtaining some information about the persistence of "geological memory" in the system. Another matter, of environmental rather than strictly morphodynamic interest, concerns the possibility of creating "auxiliary classes" among the grain-size classes in order to label and track contaminated sediments, providing a prediction tool and decisional support in case of environmental accidents.
Such sediment tracking could also be used to distinguish sediments according to their fluvial or maritime origin, thereby defining a criterion for the classification of the various morphological features found within the system. A sensitivity analysis of the main parameters is under way.
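The coupling of the horizontal sediment budget with a per-class budget in a mixing layer, as described in this abstract, is commonly formalized with a Hirano-type active-layer equation. The form below is a generic textbook sketch, not necessarily the authors' exact model:

```latex
% Schematic Hirano-type active-layer (mixing-layer) budget for grain-size
% class k; a generic form, not the authors' exact formulation.
(1-\lambda)\,\frac{\partial}{\partial t}\bigl(\delta_m F_k\bigr)
  + (1-\lambda)\, f_k^{I}\,\frac{\partial}{\partial t}\bigl(z_b-\delta_m\bigr)
  + \nabla\cdot\mathbf{q}_{s,k} = 0,
\qquad \sum_k F_k = 1,
```

where $\lambda$ is the bed porosity, $\delta_m$ the mixing-layer thickness, $F_k$ the fraction of class $k$ in the mixing layer, $f_k^{I}$ the class fraction at the layer's lower interface, $z_b$ the bed elevation, and $\mathbf{q}_{s,k}$ the fractional sediment flux. The "auxiliary classes" mentioned above would simply be extra $F_k$ entries that share the transport properties of a parent class but are tracked separately.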
Precision and repeatability of the Optotrak 3020 motion measurement system.
States, R A; Pappas, E
2006-01-01
Several motion analysis systems are used by researchers to quantify human motion and to perform accurate surgical procedures. The Optotrak 3020 is one of these systems, and despite its widespread use there is no published information on its precision and repeatability. We used a repeated measures design to evaluate the precision and repeatability of the Optotrak 3020 by measuring distance and angle in three sessions, at four distances and under three conditions (motion, static vertical, and static tilted). Precision and repeatability were found to be excellent for both angle and distance, although they decreased with increasing distance from the sensors and with tilt from the plane of the sensors. Motion did not have a significant effect on the precision of the measurements. In conclusion, the measurement error of the Optotrak is minimal. Further studies are needed to evaluate its precision and repeatability under human motion conditions.
Avila, Irene; Lin, Shih-Chieh
2014-03-01
The survival of animals depends critically on prioritizing responses to motivationally salient stimuli. While it is generally believed that motivational salience increases decision speed, the quantitative relationship between motivational salience and decision speed, measured by reaction time (RT), remains unclear. Here we show that the neural correlate of motivational salience in the basal forebrain (BF), defined independently of RT, is coupled with faster and also more precise decision speed. In rats performing a reward-biased simple RT task, motivational salience was encoded by BF bursting response that occurred before RT. We found that faster RTs were tightly coupled with stronger BF motivational salience signals. Furthermore, the fraction of RT variability reflecting the contribution of intrinsic noise in the decision-making process was actively suppressed in faster RT distributions with stronger BF motivational salience signals. Artificially augmenting the BF motivational salience signal via electrical stimulation led to faster and more precise RTs and supports a causal relationship. Together, these results not only describe for the first time, to our knowledge, the quantitative relationship between motivational salience and faster decision speed, they also reveal the quantitative coupling relationship between motivational salience and more precise RT. Our results further establish the existence of an early and previously unrecognized step in the decision-making process that determines both the RT speed and variability of the entire decision-making process and suggest that this novel decision step is dictated largely by the BF motivational salience signal. Finally, our study raises the hypothesis that the dysregulation of decision speed in conditions such as depression, schizophrenia, and cognitive aging may result from the functional impairment of the motivational salience signal encoded by the poorly understood noncholinergic BF neurons.
Huelle, Jan O; Katz, Toam; Druchkiv, Vasyl; Pahlitzsch, Milena; Steinberg, Johannes; Richard, Gisbert; Linke, Stephan J
2014-11-01
To provide the first clinical data determining the feasibility, quality and precision of intraoperative wavefront aberrometry (IWA)-based refraction in patients with cataract. IWA refraction was recorded at 7 defined measurement points during standardised cataract surgery in 74 eyes of 74 consecutive patients (mean age 69±11.3 years). Precision and measurement quality were evaluated by the 'limits of agreement' approach, regression analysis, correlation analysis, analysis of variance (ANOVA) and ORs for predicting measurement failure. Wavefront map (WFM) quality was objectivised and compared with the Pentacam Nuclear Staging analysis. Out of 814 IWA measurement attempts, 462 WFMs could be obtained. The most successful readings (n=63) were achieved in aphakia with viscoelastic. The highest (50.63%, SD 20.23) and lowest (29.19%, SD 13.94) quality of WFMs across all measurement points were found after clear corneal incision and in pseudophakia with viscoelastic, respectively. High consistency across repeated measures was found for mean spherical equivalent (SE) differences in aphakia (-0.01 D) and pseudophakia (-0.01 D), but ranges were wide (limits of agreement +0.69 D and -0.72 D; +1.53 D and -1.54 D, respectively). With increasing WFM quality, higher precision in measurements was observed. This is the first report addressing the quality and reproducibility of IWA in a large sample. IWA refraction in aphakia, for instance, appears to be reliable once stable and pressurised anterior chamber conditions are achieved. More efforts are required to improve the precision and quality of measurements before IWA can be used to guide the surgical refractive plan in cataract surgery. Published by the BMJ Publishing Group Limited.
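The 'limits of agreement' approach used in this study to judge consistency of repeated readings is the Bland-Altman computation: mean paired difference plus or minus 1.96 standard deviations. A minimal sketch, with invented spherical-equivalent values rather than the study's data:

```python
# Sketch of the Bland-Altman 'limits of agreement' computation: the mean
# difference between paired repeated measurements +/- 1.96 SD of the
# differences. Values below are invented for illustration.
import math

def limits_of_agreement(first, second):
    diffs = [a - b for a, b in zip(first, second)]
    n = len(diffs)
    mean = sum(diffs) / n
    sd = math.sqrt(sum((d - mean) ** 2 for d in diffs) / (n - 1))
    return mean, mean - 1.96 * sd, mean + 1.96 * sd

# Paired spherical-equivalent readings (dioptres) from two repeated sweeps.
sweep1 = [0.25, -0.50, 0.00, 0.75, -0.25, 0.50, -1.00, 0.25]
sweep2 = [0.50, -0.25, -0.25, 0.50, -0.50, 0.75, -0.75, 0.00]
mean_diff, lower, upper = limits_of_agreement(sweep1, sweep2)
print(f"mean difference {mean_diff:+.2f} D, "
      f"limits of agreement [{lower:+.2f}, {upper:+.2f}] D")
```

As in the study, a near-zero mean difference with wide limits indicates consistency on average but poor per-measurement precision.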
Equity and Value in 'Precision Medicine'.
Gray, Muir; Lagerberg, Tyra; Dombrádi, Viktor
2017-04-01
Precision medicine carries huge potential in the treatment of many diseases, particularly those with high-penetrance monogenic underpinnings. However, precision medicine through genomic technologies also has ethical implications. We will define allocative, personal, and technical value ('triple value') in healthcare and how this relates to equity. Equity is here taken to be implicit in the concept of triple value in countries that have publicly funded healthcare systems. It will be argued that precision medicine risks concentrating resources on those who already experience greater access to healthcare and power in society, nationally as well as globally. Healthcare payers, clinicians, and patients must all be involved in optimising the potential of precision medicine, without reducing equity. Throughout, the discussion will refer to the NHS RightCare Programme, a national initiative aiming to improve value and equity in the context of NHS England.
Production of zinc oxide nanowire powder with precisely defined morphology
NASA Astrophysics Data System (ADS)
Mičová, Júlia; Remeš, Zdeněk; Chan, Yu-Ying
2017-12-01
Interest in zinc oxide is increasing thanks to its unique chemical and physical properties. Our attention has focused on the preparation of powders of 1D ZnO nanowire nanostructures with precisely defined morphology, including characterization of size (length and diameter) and shape by scanning electron microscopy (SEM). We have compared the SEM results with the dynamic light scattering (DLS) technique and found that the SEM method gives more accurate results. We propose a process for transforming ZnO nanowires on substrates into ZnO nanowire powder by ultrasonic peeling into a colloid followed by lyophilization. This method of mass production of ZnO nanowire powder has several advantages: simplicity, cost effectiveness, scalability, and environmental friendliness.
Junker, Astrid; Muraya, Moses M.; Weigelt-Fischer, Kathleen; Arana-Ceballos, Fernando; Klukas, Christian; Melchinger, Albrecht E.; Meyer, Rhonda C.; Riewe, David; Altmann, Thomas
2015-01-01
Detailed and standardized protocols for plant cultivation in environmentally controlled conditions are an essential prerequisite to conduct reproducible experiments with precisely defined treatments. Setting up appropriate and well defined experimental procedures is thus crucial for the generation of solid evidence and indispensable for successful plant research. Non-invasive and high throughput (HT) phenotyping technologies offer the opportunity to monitor and quantify performance dynamics of several hundreds of plants at a time. Compared to small scale plant cultivations, HT systems have much higher demands, from a conceptual and a logistic point of view, on experimental design, as well as the actual plant cultivation conditions, and the image analysis and statistical methods for data evaluation. Furthermore, cultivation conditions need to be designed that elicit plant performance characteristics corresponding to those under natural conditions. This manuscript describes critical steps in the optimization of procedures for HT plant phenotyping systems. Starting with the model plant Arabidopsis, HT-compatible methods were tested, and optimized with regard to growth substrate, soil coverage, watering regime, experimental design (considering environmental inhomogeneities) in automated plant cultivation and imaging systems. As revealed by metabolite profiling, plant movement did not affect the plants' physiological status. Based on these results, procedures for maize HT cultivation and monitoring were established. Variation of maize vegetative growth in the HT phenotyping system did match well with that observed in the field. The presented results outline important issues to be considered in the design of HT phenotyping experiments for model and crop plants. It thereby provides guidelines for the setup of HT experimental procedures, which are required for the generation of reliable and reproducible data of phenotypic variation for a broad range of applications. 
PMID:25653655
Revisiting Ferguson's Defining Cases of Diglossia
ERIC Educational Resources Information Center
Snow, Don
2013-01-01
While the defining cases of diglossia offered in Charles Ferguson's 1959 article have long been useful as vehicles for introducing this important form of societal multilingualism, they are also problematic in that they differ from each other in a number of significant ways. This article proposes a modified and more precise framework in which…
Defining the Essence of a University: Lessons from Higher Education Branding
ERIC Educational Resources Information Center
Waeraas, Arild; Solbakk, Marianne N.
2009-01-01
Branding is a phenomenon that has become increasingly common in higher education over the last few years. It entails defining the essence of what a university "is", what it "stands for", and what it is going to be known for, requiring precision and consistency in the formulations as well as internal commitment to the brand.…
Stress-Triggered Phase Separation Is an Adaptive, Evolutionarily Tuned Response
DOE Office of Scientific and Technical Information (OSTI.GOV)
Riback, Joshua A.; Katanski, Christopher D.; Kear-Scott, Jamie L.
In eukaryotic cells, diverse stresses trigger coalescence of RNA-binding proteins into stress granules. In vitro, stress-granule-associated proteins can demix to form liquids, hydrogels, and other assemblies lacking fixed stoichiometry. Observing these phenomena has generally required conditions far removed from physiological stresses. We show that poly(A)-binding protein (Pab1 in yeast), a defining marker of stress granules, phase separates and forms hydrogels in vitro upon exposure to physiological stress conditions. Other RNA-binding proteins depend upon low-complexity regions (LCRs) or RNA for phase separation, whereas Pab1’s LCR is not required for demixing, and RNA inhibits it. Based on unique evolutionary patterns, we create LCR mutations, which systematically tune its biophysical properties and Pab1 phase separation in vitro and in vivo. Mutations that impede phase separation reduce organism fitness during prolonged stress. Poly(A)-binding protein thus acts as a physiological stress sensor, exploiting phase separation to precisely mark stress onset, a broadly generalizable mechanism.
Patient-centred outcomes research: perspectives of patient stakeholders.
Chhatre, Sumedha; Gallo, Joseph J; Wittink, Marsha; Schwartz, J Sanford; Jayadevappa, Ravishankar
2017-11-01
To elicit patient stakeholders' experience and perspectives about patient-centred care. Qualitative. A large urban healthcare system. Four patient stakeholders who are prostate cancer survivors. Experience and perspectives of patient stakeholders regarding patient-centred care and treatment decisions. Our patient stakeholders represented a diverse socio-demographic group. The patient stakeholders identified engagement and dialogue with physicians as crucial elements of a patient-centred care model. The degree of patient-centred care was observed to depend on the situation. High-severity conditions warranted a higher level of patient involvement, compared to mild conditions. They agreed that patient-centred care should not mean that patients can demand inappropriate treatments. An important attribute of the patient-centred outcomes research model is the involvement of stakeholders. However, we have limited knowledge about the experience of patient stakeholders in patient-centred outcomes research. Our study indicates that patient stakeholders offer a unique perspective as researchers and policy-makers aim to precisely define patient-centred research and care.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Detwiler, Russell L.; Glass, Robert J.; Pringle, Scott E.
Understanding of single- and multi-phase flow and transport in fractures can be greatly enhanced through experimentation in transparent systems (analogs or replicas) where light transmission techniques yield quantitative measurements of aperture, solute concentration, and phase saturation fields. Here we quantify aperture field measurement error and demonstrate the influence of this error on the results of flow and transport simulations (hypothesized experimental results) through saturated and partially saturated fractures. We find that precision and accuracy can be balanced to greatly improve the technique, and we present a measurement protocol to obtain a minimum-error field. Simulation results show an increased sensitivity to error as we move from flow to transport and from saturated to partially saturated conditions. Significant sensitivity under partially saturated conditions results in differences in channeling and multiple-peaked breakthrough curves. These results emphasize the critical importance of defining and minimizing error for studies of flow and transport in single fractures.
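As context for the light transmission technique described above, aperture is commonly recovered from transmitted intensity through a Beer-Lambert-type relation. The following minimal sketch assumes a dyed fluid with a known absorptivity `mu` obtained from calibration; the function and variable names are illustrative and not taken from the paper.

```python
import math

def aperture_from_transmission(i, i0, mu):
    """Estimate local fracture aperture b from transmitted light intensity.

    Beer-Lambert attenuation: I = I0 * exp(-mu * b), so b = ln(I0 / I) / mu.
    i  : transmitted intensity at a pixel
    i0 : incident (reference) intensity at the same pixel
    mu : absorbance per unit length of the dyed fluid (from calibration)
    """
    return math.log(i0 / i) / mu
```

In practice the inversion is applied per pixel over the full intensity field, and the calibration of `mu` is one of the precision/accuracy trade-offs the abstract refers to.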
The Investigation of Ghost Fluid Method for Simulating the Compressible Two-Medium Flow
NASA Astrophysics Data System (ADS)
Lu, Hai Tian; Zhao, Ning; Wang, Donghong
2016-06-01
In this paper, we investigate the conservation error of the two-dimensional compressible two-medium flow simulated by the front tracking method. As improved versions of the original ghost fluid method, the modified ghost fluid method and the real ghost fluid method are selected to define the interface boundary conditions, respectively, to show their different effects on the conservation error. A Riemann problem is constructed along the normal direction of the interface in the front tracking method, with the goal of obtaining an efficient procedure to track the explicit sharp interface precisely. The corresponding Riemann solutions are also used directly in these improved ghost fluid methods. Extensive numerical examples including the Sod shock tube and the shock-bubble interaction are tested to calculate the conservation error. It is found that these two ghost fluid methods perform differently for different initial conditions of the flow field, and conclusions are drawn to suggest the best choice of combination.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dill, Eric D.; Folmer, Jacob C.W.; Martin, James D.
A series of simulations was performed to enable interpretation of the material and physical significance of the parameters defined in the Kolmogorov, Johnson and Mehl, and Avrami (KJMA) rate expression commonly used to describe phase-boundary-controlled reactions of condensed matter. The parameters k, n, and t0 are shown to be highly correlated, which, if unaccounted for, seriously challenges mechanistic interpretation. It is demonstrated that rate measurements exhibit an intrinsic uncertainty without precise knowledge of the location and orientation of nucleation with respect to the free volume into which it grows. More significantly, it is demonstrated that the KJMA rate constant k is highly dependent on sample size. However, under the simulated conditions of slow nucleation relative to crystal growth, sample volume and sample anisotropy correction affords a means to eliminate the experimental-condition dependence of the KJMA rate constant k, producing the material-specific parameter, the velocity of the phase boundary, vpb.
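For reference, one common parameterization of the KJMA (Avrami) expression discussed above is X(t) = 1 − exp(−k (t − t0)^n); conventions for k and n vary across the literature, so this is a sketch of one assumed form rather than the exact expression used in the paper.

```python
import math

def kjma_fraction(t, k, n, t0):
    """Transformed fraction X(t) under one common KJMA (Avrami) form.

    k  : rate constant, n : Avrami exponent, t0 : induction (onset) time.
    Returns 0 before onset; approaches 1 as the transformation completes.
    """
    if t <= t0:
        return 0.0
    return 1.0 - math.exp(-k * (t - t0) ** n)
```

Fitting (k, n, t0) jointly to a single transformation curve illustrates the correlation the authors describe: quite different parameter triples can produce nearly indistinguishable X(t) curves over a finite time window.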
Huang, Ling; Holtzinger, Audrey; Jagan, Ishaan; BeGora, Michael; Lohse, Ines; Ngai, Nicholas; Nostro, Cristina; Wang, Rennian; Muthuswamy, Lakshmi B.; Crawford, Howard C.; Arrowsmith, Cheryl; Kalloger, Steve E.; Renouf, Daniel J.; Connor, Ashton A; Cleary, Sean; Schaeffer, David F.; Roehrl, Michael; Tsao, Ming-Sound; Gallinger, Steven; Keller, Gordon; Muthuswamy, Senthil K.
2016-01-01
There are few in vitro models of exocrine pancreas development and primary human pancreatic adenocarcinoma (PDAC). We establish three-dimensional culture conditions to induce the differentiation of human pluripotent stem cells (PSCs) into exocrine progenitor organoids that form ductal and acinar structures in culture and in vivo. Expression of mutant KRAS or TP53 in progenitor organoids induces mutation-specific phenotypes in culture and in vivo. Expression of TP53R175H induced cytosolic SOX9 localization. In patient tumors bearing TP53 mutations, SOX9 was cytoplasmic and associated with mortality. Culture conditions are also defined for clonal generation of tumor organoids from freshly resected PDAC. Tumor organoids maintain the differentiation status, histoarchitecture, phenotypic heterogeneity of the primary tumor, and retain patient-specific physiologic changes including hypoxia, oxygen consumption, epigenetic marks, and differential sensitivity to EZH2 inhibition. Thus, pancreatic progenitor organoids and tumor organoids can be used to model PDAC and for drug screening to identify precision therapy strategies. PMID:26501191
NASA Astrophysics Data System (ADS)
Mekanik, Abolghasem; Soleimani, Mohsen
2007-11-01
The effect of wind on natural-draught cooling towers involves very complex physics. The fluid flow and temperature distribution around and in a single dry-cooling tower and two adjacent (tandem and side-by-side) dry-cooling towers under cross-wind are studied numerically in the present work. Cross-wind can significantly reduce the cooling efficiency of natural-draught dry-cooling towers, and adjacent towers can affect the cooling efficiency of both. In this paper we present a complex computational model involving more than 750,000 finite volume cells under precisely defined boundary conditions. Since the flow is turbulent, the standard k-ɛ turbulence model is used. The numerical results are used to estimate the heat transfer between the radiators of the tower and the air surrounding it. The numerical simulation explains the main reason for the decline of the thermodynamic performance of a dry-cooling tower under cross-wind. In this paper, the incompressible fluid flow is simulated, and the flow is assumed steady and three-dimensional.
Dynamical observation and detailed description of catalysts under strong metal–support interaction
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhang, Shuyi; Plessow, Philipp N.; Willis, Joshua J.
2016-06-09
Understanding the structures of catalysts under realistic conditions with atomic precision is crucial to design better materials for challenging transformations. Under reducing conditions, certain reducible supports migrate onto supported metallic particles and create strong metal–support states that drastically change the reactivity of the systems. The details of this process are still unclear and preclude its thorough exploitation. Here, we report an atomic description of a palladium/titania (Pd/TiO 2) system by combining state-of-the-art in situ transmission electron microscopy and density functional theory (DFT) calculations with structurally defined materials, in which we visualize the formation of the overlayers at the atomic scale under atmospheric pressure and high temperature. We show that an amorphous reduced titania layer is formed at low temperatures, and that crystallization of the layer into either mono- or bilayer structures is dictated by the reaction environment and predicted by theory. Moreover, it occurs in combination with a dramatic reshaping of the metallic surface facets.
Reduction to Outside the Atmosphere and Statistical Tests Used in Geneva Photometry
NASA Technical Reports Server (NTRS)
Rufener, F.
1984-01-01
Conditions for creating a precise photometric system are investigated. The analytical and discriminatory potential of a photometric system results from the localization of the passbands in the spectrum; it also, however, depends critically on the precision attained. This precision is the result of two different types of precautions. Two procedures that contribute efficiently to achieving greater precision are examined; these two methods are known as hardware-related precision and software-related precision.
A Neumann boundary term for gravity
NASA Astrophysics Data System (ADS)
Krishnan, Chethan; Raju, Avinash
2017-05-01
The Gibbons-Hawking-York (GHY) boundary term makes the Dirichlet problem for gravity well-defined, but no such general term seems to be known for Neumann boundary conditions. In this paper, we view Neumann not as fixing the normal derivative of the metric (“velocity”) at the boundary, but as fixing the functional derivative of the action with respect to the boundary metric (“momentum”). This leads directly to a new boundary term for gravity: the trace of the extrinsic curvature with a specific dimension-dependent coefficient. In three dimensions, this boundary term reduces to a “one-half” GHY term noted in the literature previously, and we observe that our action translates precisely to the Chern-Simons action with no extra boundary terms. In four dimensions, the boundary term vanishes, giving a natural Neumann interpretation to the standard Einstein-Hilbert action without boundary terms. We argue that in light of AdS/CFT, ours is a natural approach for defining a “microcanonical” path integral for gravity in the spirit of the (pre-AdS/CFT) work of Brown and York.
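For context, the standard Dirichlet setup that the paper modifies can be sketched as follows; conventions, signs, and normalizations vary across the literature, so this is a schematic summary rather than the paper's exact expressions.

```latex
S_{\mathrm{Dir}} \;=\; \frac{1}{2\kappa}\int_{M} d^{D}x\,\sqrt{-g}\,R
\;+\; \frac{1}{\kappa}\oint_{\partial M} d^{D-1}x\,\sqrt{h}\,K,
\qquad
\delta S_{\mathrm{Dir}} \;=\; \text{(bulk EOM)}
\;+\; \frac{1}{2\kappa}\oint_{\partial M} d^{D-1}x\,\sqrt{h}\,\pi^{ab}\,\delta h_{ab},
```

where $h_{ab}$ is the induced boundary metric, $K$ the trace of the extrinsic curvature, and $\pi^{ab} \propto -(K^{ab} - K h^{ab})$ the conjugate boundary momentum (up to sign convention). The Dirichlet problem fixes $\delta h_{ab} = 0$ at the boundary; the Neumann problem of the paper instead fixes $\pi^{ab}$, which is what leads to replacing the GHY term by a boundary term proportional to $\sqrt{h}\,K$ with a dimension-dependent coefficient.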
Understanding the Molecular Mechanisms of the Interplay Between Herbal Medicines and Gut Microbiota.
Xu, Jun; Chen, Hu-Biao; Li, Song-Lin
2017-09-01
Herbal medicines (HMs) are much appreciated for their significant contribution to human survival and reproduction by remedial and prophylactic management of diseases. Defining the scientific basis of HMs will substantiate their value and promote their modernization. Ever-increasing evidence suggests that gut microbiota plays a crucial role in HM therapy by complicated interplay with HM components. This interplay includes such activities as: gut microbiota biotransforming HM chemicals into metabolites that harbor different bioavailability and bioactivity/toxicity from their precursors; HM chemicals improving the composition of gut microbiota, consequently ameliorating its dysfunction as well as associated pathological conditions; and gut microbiota mediating the interactions (synergistic and antagonistic) between the multiple chemicals in HMs. More advanced experimental designs are recommended for future study, such as overall chemical characterization of gut microbiota-metabolized HMs, direct microbial analysis of HM-targeted gut microbiota, and precise gut microbiota research model development. The outcomes of such research can further elucidate the interactions between HMs and gut microbiota, thereby opening a new window for defining the scientific basis of HMs and for guiding HM-based drug discovery. © 2016 Wiley Periodicals, Inc.
NASA Astrophysics Data System (ADS)
Swindle, T. D.
2014-12-01
Our knowledge of the noble gas abundances and isotopic compositions in the Martian crust and atmosphere comes from two sources: measurements of meteorites from Mars and in situ measurements by spacecraft. Measurements by the Viking landers had large uncertainties, but were precise enough to tie the meteorites to Mars. Hence most of the questions we have are currently defined by meteorite measurements. Curiosity's SAM has confirmed that the Ar isotopic composition of the atmosphere is highly fractionated, presumably representing atmospheric loss that can now be modeled with more confidence. A more difficult trait to explain is the fact that the ratio of Kr/Xe in nakhlites, chassignites and ALH84001 is distinct from the atmospheric ratio, as defined by measurements from shergottites. This discrepancy has been suggested to be a result of atmosphere/groundwater/rock interaction, polar clathrate formation, or perhaps local temperature conditions. More detailed atmospheric measurements, along with targeted simulation experiments, will be needed to make full use of this anomaly.
Dawood, M K; Liew, T H; Lianto, P; Hong, M H; Tripathy, S; Thong, J T L; Choi, W K
2010-05-21
We report a simple and cost-effective method for the synthesis of large-area, precisely located silicon nanocones from nanowires. The nanowires were obtained from our interference lithography and catalytic etching (IL-CE) method. We found that porous silicon was formed near the Au catalyst during the fabrication of the nanowires. The porous silicon exhibited enhanced oxidation ability when exposed to atmospheric conditions or in a wet oxidation ambient. Well-located nanocones with uniform sharpness resulted when these oxidized nanowires were etched in 10% HF. Nanocones of different heights were obtained by varying the doping concentration of the silicon wafers. We believe this is a novel method of producing large-area, low-cost, well-defined nanocones from nanowires, in terms of control of both the location and the shape of the nanocones. A wide range of potential applications of the nanocone array can be found: as a master copy for nanoimprinted polymer substrates for possible biomedical research; as a candidate for making sharp probes for scanning probe nanolithography; or as a building block for field-emitting tips or photodetectors in electronic/optoelectronic applications.
Breakdown Conditioning Characteristics of Precision-Surface-Treatment-Electrode in Vacuum
NASA Astrophysics Data System (ADS)
Kato, Katsumi; Fukuoka, Yuji; Inagawa, Yukihiko; Saitoh, Hitoshi; Sakaki, Masayuki; Okubo, Hitoshi
Breakdown (BD) characteristics in vacuum are strongly dependent on the electrode surface condition, such as the surface roughness. Therefore, in order to develop a high-voltage vacuum circuit breaker, it is important to optimize the surface treatment process. This paper discusses the effect of precision surface treatment of the electrode on breakdown conditioning characteristics under a non-uniform electric field in vacuum. Experimental results reveal that the electrode surface treatment affects the conditioning process, especially the BD voltage and the BD field strength at the initial stage of conditioning.
Exact solution and precise asymptotics of a Fisher-KPP type front
NASA Astrophysics Data System (ADS)
Berestycki, Julien; Brunet, Éric; Derrida, Bernard
2018-01-01
The present work concerns a version of the Fisher-KPP equation where the nonlinear term is replaced by a saturation mechanism, yielding a free boundary problem with mixed conditions. Following an idea proposed in Brunet and Derrida (2015 J. Stat. Phys. 161 801), we show that the Laplace transform of the initial condition is directly related to some functional of the front position μt . We then obtain precise asymptotics of the front position by means of singularity analysis. In particular, we recover the so-called Ebert and van Saarloos correction (Ebert and van Saarloos 2000 Physica D 146 1), we obtain an additional term of order log t /t in this expansion, and we give precise conditions on the initial condition for those terms to be present.
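For context, the front-position expansion referred to above has, up to convention-dependent constants, the following form (a hedged summary of the standard result, not a quotation of the paper's precise statement):

```latex
\mu_t \;=\; 2t \;-\; \frac{3}{2}\ln t \;+\; C
\;-\; \frac{3\sqrt{\pi}}{\sqrt{t}} \;+\; O\!\left(\frac{\ln t}{t}\right),
```

where $2t - \tfrac{3}{2}\ln t$ is Bramson's logarithmic shift, the $t^{-1/2}$ term is the Ebert and van Saarloos correction, and the $\ln t/t$ term is the additional order addressed in the abstract; the constant $C$ depends on the initial condition.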
NASA Technical Reports Server (NTRS)
Naasz, Bo J.; Burns, Richard D.; Gaylor, David; Higinbotham, John
2004-01-01
A sample mission sequence is defined for a low earth orbit demonstration of Precision Formation Flying (PFF). Various guidance navigation and control strategies are discussed for use in the PFF experiment phases. A sample PFF experiment is implemented and tested in a realistic Hardware-in-the-Loop (HWIL) simulation using the Formation Flying Test Bed (FFTB) at NASA's Goddard Space Flight Center.
Navigation Performance of Global Navigation Satellite Systems in the Space Service Volume
NASA Technical Reports Server (NTRS)
Force, Dale A.
2013-01-01
This paper extends the results I reported at this year's ION International Technical Meeting on multi-constellation GNSS coverage by showing how the use of multi-constellation GNSS improves Geometric Dilution of Precision (GDOP). Originally developed to provide position, navigation, and timing for terrestrial users, GPS has found increasing use in space for precision orbit determination, precise time synchronization, real-time spacecraft navigation, and three-axis attitude control of Earth-orbiting satellites. With additional Global Navigation Satellite Systems (GNSS) coming into service (GLONASS, Galileo, and Beidou) and the development of Satellite-Based Augmentation Services, it is possible to obtain improved precision by using evolving multi-constellation receivers. The Space Service Volume is formally defined as the volume of space between three thousand kilometers altitude and geosynchronous altitude (approximately 36,500 km), with the volume below three thousand kilometers defined as the Terrestrial Service Volume (TSV). The USA has established signal requirements for the Space Service Volume (SSV) as part of the GPS Capability Development Documentation (CDD). Diplomatic efforts are underway to extend Space Service Volume commitments to the other Position, Navigation, and Timing (PNT) service providers in an effort to assure that all space users will benefit from the enhanced capabilities of interoperating GNSS services in the space domain.
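The GDOP metric discussed above has a standard textbook definition: from unit vectors pointing at each visible satellite, build the geometry matrix H and take GDOP = sqrt(trace((HᵀH)⁻¹)). The sketch below illustrates that definition and why adding satellites from extra constellations cannot worsen GDOP; it is an illustrative computation, not the paper's software.

```python
import numpy as np

def gdop(sat_dirs):
    """Geometric Dilution of Precision from receiver-to-satellite directions.

    Each row of the geometry matrix H is [-ux, -uy, -uz, 1]
    (three position unknowns plus the receiver clock bias).
    GDOP = sqrt(trace((H^T H)^-1)).
    """
    u = np.asarray(sat_dirs, dtype=float)
    u /= np.linalg.norm(u, axis=1, keepdims=True)   # normalize line-of-sight vectors
    H = np.hstack([-u, np.ones((len(u), 1))])
    return float(np.sqrt(np.trace(np.linalg.inv(H.T @ H))))
```

Because each extra satellite adds a positive-semidefinite term to HᵀH, the trace of its inverse, and hence GDOP, can only decrease or stay the same as more constellations become visible.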
Desired Precision in Multi-Objective Optimization: Epsilon Archiving or Rounding Objectives?
NASA Astrophysics Data System (ADS)
Asadzadeh, M.; Sahraei, S.
2016-12-01
Multi-objective optimization (MO) aids the decision-making process in water resources engineering and design problems. One of the main goals of solving a MO problem is to archive a set of solutions that is well distributed across a wide range of all the design objectives. Modern MO algorithms use the epsilon dominance concept to define a mesh with a pre-defined grid-cell size (often called epsilon) in the objective space and archive at most one solution in each grid cell. Epsilon can be set to the desired precision level of each objective function to make sure that the difference between each pair of archived solutions is meaningful. This epsilon archiving process is computationally expensive in problems that have quick-to-evaluate objective functions. This research explores the applicability of a similar but computationally more efficient approach to respect the desired precision level of all objectives in the solution archiving process. In this alternative approach each objective function is rounded to the desired precision level before comparing any new solution to the set of archived solutions that already have rounded objective function values. This alternative solution archiving approach is compared to the epsilon archiving approach in terms of efficiency and quality of archived solutions for solving mathematical test problems and hydrologic model calibration problems.
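The rounding-based archiving idea described above can be sketched as follows: snap each objective to its desired precision grid, then apply ordinary Pareto dominance to the rounded vectors, keeping at most one solution per grid cell. This is a minimal illustrative sketch (minimization assumed; function names are mine, not the authors'), not the algorithm evaluated in the paper.

```python
import math

def cell(objs, eps):
    # Round each objective down to its desired precision grid (minimization).
    return tuple(math.floor(o / e) for o, e in zip(objs, eps))

def weakly_dominates(a, b):
    # a weakly dominates b if a is no worse in every objective.
    return all(x <= y for x, y in zip(a, b))

def archive_insert(archive, objs, eps):
    """Try to insert a solution using rounded objectives.

    archive: dict mapping rounded grid cell -> original objective vector.
    Returns True if the solution was archived.
    """
    key = cell(objs, eps)
    if any(weakly_dominates(k, key) for k in archive):
        return False                      # dominated, or duplicate cell: reject
    for k in [k for k in archive if weakly_dominates(key, k)]:
        del archive[k]                    # drop archived cells the newcomer dominates
    archive[key] = objs
    return True
```

Two solutions whose objectives differ by less than the desired precision land in the same cell and only one is kept, mirroring the epsilon-archiving guarantee at lower bookkeeping cost.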
Self-Assembled Monolayers for Dental Implants
Correa-Uribe, Alejandra
2018-01-01
Implant-based therapy is a mature approach to restore the health of patients affected by edentulism. Thousands of dental implants have been placed each year since their introduction in the 1980s. However, implantology faces challenges that require new research strategies, such as new support therapies for a world population with a continuous increase in life expectancy, control of periodontal status, and new bioactive surfaces for implants. The present review is focused on self-assembled monolayers (SAMs) for dental implant materials as a nanoscale-processing approach to modify titanium surfaces. SAMs represent an easy, accurate, and precise approach to modify surface properties. These are stable, well-defined, and well-organized organic structures that allow the chemical properties of the interface to be controlled at the molecular scale. The ability to control the composition and properties of SAMs precisely through synthesis (the synthetic chemistry of organic compounds with a wide range of functional groups is well established and in general very simple, with many compounds commercially available), combined with simple methods for patterning their functional groups on appliances of complex geometry, makes them a good system for fundamental studies of the interactions between surfaces, proteins, and cells, as well as for engineering surfaces to develop new biomaterials. PMID:29552036
Active inference and epistemic value.
Friston, Karl; Rigoli, Francesco; Ognibene, Dimitri; Mathys, Christoph; Fitzgerald, Thomas; Pezzulo, Giovanni
2015-01-01
We offer a formal treatment of choice behavior based on the premise that agents minimize the expected free energy of future outcomes. Crucially, the negative free energy or quality of a policy can be decomposed into extrinsic and epistemic (or intrinsic) value. Minimizing expected free energy is therefore equivalent to maximizing extrinsic value or expected utility (defined in terms of prior preferences or goals), while maximizing information gain or intrinsic value (or reducing uncertainty about the causes of valuable outcomes). The resulting scheme resolves the exploration-exploitation dilemma: Epistemic value is maximized until there is no further information gain, after which exploitation is assured through maximization of extrinsic value. This is formally consistent with the Infomax principle, generalizing formulations of active vision based upon salience (Bayesian surprise) and optimal decisions based on expected utility and risk-sensitive (Kullback-Leibler) control. Furthermore, as with previous active inference formulations of discrete (Markovian) problems, ad hoc softmax parameters become the expected (Bayes-optimal) precision of beliefs about, or confidence in, policies. This article focuses on the basic theory, illustrating the ideas with simulations. A key aspect of these simulations is the similarity between precision updates and dopaminergic discharges observed in conditioning paradigms.
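The decomposition described in this abstract is often written as follows; notation varies across the active-inference literature, so this is one common hedged form rather than the paper's exact expression:

```latex
G(\pi) \;=\;
-\underbrace{\mathbb{E}_{Q(o\mid\pi)}\!\left[\ln P(o)\right]}_{\text{extrinsic value}}
\;-\;
\underbrace{\mathbb{E}_{Q(o\mid\pi)}\!\left[
D_{\mathrm{KL}}\!\big(Q(s\mid o,\pi)\,\big\|\,Q(s\mid\pi)\big)\right]}_{\text{epistemic value}},
```

so minimizing the expected free energy $G(\pi)$ over policies simultaneously maximizes expected utility (outcomes matching the prior preferences $\ln P(o)$) and expected information gain about hidden states $s$, which is the formal version of the exploration-exploitation resolution described above.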
A method to accelerate creation of plasma etch recipes using physics and Bayesian statistics
NASA Astrophysics Data System (ADS)
Chopra, Meghali J.; Verma, Rahul; Lane, Austin; Willson, C. G.; Bonnecaze, Roger T.
2017-03-01
Next generation semiconductor technologies like high density memory storage require precise 2D and 3D nanopatterns. Plasma etching processes are essential to achieving the nanoscale precision required for these structures. Current plasma process development methods rely primarily on iterative trial and error or factorial design of experiment (DOE) to define the plasma process space. Here we evaluate the efficacy of the software tool Recipe Optimization for Deposition and Etching (RODEo) against standard industry methods at determining the process parameters of a high density O2 plasma system with three case studies. In the first case study, we demonstrate that RODEo is able to predict etch rates more accurately than a regression model based on a full factorial design while using 40% fewer experiments. In the second case study, we demonstrate that RODEo performs significantly better than a full factorial DOE at identifying optimal process conditions to maximize anisotropy. In the third case study we experimentally show how RODEo maximizes etch rates while using half the experiments of a full factorial DOE method. With enhanced process predictions and more accurate maps of the process space, RODEo reduces the number of experiments required to develop and optimize plasma processes.
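RODEo itself is proprietary, but the general pattern it represents, replacing factorial DOE with a Bayesian model that is updated after each experiment and used to choose the next one, can be sketched with Bayesian linear regression. Everything below (the linear etch-rate model, the prior precision `alpha`, the noise precision `beta`) is an assumed toy setup for illustration, not the physics-based model in the paper.

```python
import numpy as np

def bayes_linear_posterior(X, y, alpha=1.0, beta=25.0):
    """Posterior over weights of a linear etch-rate model y ~ N(Xw, 1/beta),
    with prior w ~ N(0, I/alpha). Returns posterior mean and covariance."""
    S = np.linalg.inv(alpha * np.eye(X.shape[1]) + beta * X.T @ X)
    m = beta * S @ X.T @ y
    return m, S

def predictive_variance(x, S, beta=25.0):
    # Predictive uncertainty at a candidate recipe x; a sequential DOE loop
    # would run the most uncertain candidate next instead of a full factorial.
    return float(x @ S @ x + 1.0 / beta)
```

After each new (recipe, etch-rate) observation the posterior tightens, so the predictive variance at and near the measured recipe drops, which is why such loops typically need fewer experiments than a full factorial design.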
Comparative OCT imaging of the human esophagus: How well can we localize the muscularis mucosae?
NASA Astrophysics Data System (ADS)
Cilesiz, Inci F.; Fockens, Paul; Kerindongo, Raphaela P.; Faber, Dirk J.; Tytgat, Guido N. J.; ten Kate, Febo; van Leeuwen, Ton G. J. M.
2002-06-01
Early diagnosis of esophageal cancer limited to the mucosa will allow for local endoscopic treatment and improve prognosis. We compared OCT images of healthy human esophageal tissue from two systems operating at 800 and 1275 nm with histology to investigate which wavelength was best suited for detailed OCT imaging of the esophageal wall, and to localize the muscularis mucosae. Within an hour of surgical resection, an esophageal specimen was cleaned of excess blood and soaked in formalin for a minimum of 48 hours. In order to precisely localize the different layers of the esophageal wall on an OCT image, well-defined structures within the esophageal wall were sought. Following OCT imaging, the specimen was prepared for routine histology. We observed that our 1275 nm system with 12-micrometer resolution was superior in terms of penetration. Compared to histology, the 4-micrometer resolution of our 800 nm system made fine details more visible. Using either system, a minimally trained eye could recognize the muscularis mucosae as a hypo-reflective layer. Although different conditions may apply in vivo, our ex vivo study paves the path to precise interpretation of OCT images of the esophageal wall.
Cable Effects Study. Tangents, Rabbit Holes, Dead Ends, and Valuable Results
Ardelean, Emil V.; Babuška, Vít; Goodding, James C.; ...
2014-08-04
Lessons learned during a study on the effects that electrical power and signal wiring harness cables introduce on the dynamic response of precision spacecraft are presented, along with the most significant results. Our study was a three-year effort to discover a set of practical approaches for updating well-defined dynamic models of harness-free structures where the cable type, position, and tie-down method are known. Although cables are found on every satellite, the focus was on precision, low-damping, and very flexible structures. Obstacles encountered, classified as tangents, rabbit holes, and dead ends, offer practical lessons for structural dynamics research. The paper traces the historical, experiential progression of the project, describing how the obstacles affected it. Methods were developed to estimate cable properties. Problems were encountered because of the flexible, highly damped nature of cables. A beam was used as a test article to validate experimentally derived cable properties and to refine the assumptions regarding boundary conditions. Furthermore, a spacecraft bus-like panel with cables attached was designed, and finite element models were developed and validated through experiment. Various paths were investigated at each stage before a consistent test and analysis methodology was developed.
Words matter: Recommendations for clarifying coral disease nomenclature and terminology
Rogers, Caroline S.
2010-01-01
Coral diseases have caused significant losses on Caribbean reefs and are becoming a greater concern in the Pacific. Progress in coral disease research requires collaboration and communication among experts from many different disciplines. The lack of consistency in the use of terms and names in the recent scientific literature reflects the absence of an authority for naming coral diseases, a lack of consensus on the meaning of even some of the most basic terms as they apply to corals, and imprecision in the use of descriptive words. The lack of consensus partly reflects the complexity of this newly emerging field of research. Establishment of a nomenclature committee under the Coral Disease and Health Consortium (CDHC) could lead to more standardized definitions and could promote use of appropriate medical terminology for describing and communicating disease conditions in corals. This committee could also help to define disease terminology unique to corals where existing medical terminology is not applicable. These efforts will help scientists communicate with one another and with the general public more effectively. Scientists can immediately begin to reduce some of the confusion simply by explicitly defining the words they are using. In addition, digital photographs can be posted on the CDHC website and included in publications to document the macroscopic (gross) signs of the conditions observed on coral colonies along with precisely written characterizations and descriptions.
Aperture alignment in autocollimator-based deflectometric profilometers
DOE Office of Scientific and Technical Information (OSTI.GOV)
Geckeler, R. D., E-mail: Ralf.Geckeler@ptb.de; Just, A.; Kranz, O.
2016-05-15
During the last ten years, deflectometric profilometers have become indispensable tools for the precision form measurement of optical surfaces. They have proven to be especially suitable for characterizing beam-shaping optical surfaces for x-ray beamline applications at synchrotrons and free electron lasers. Deflectometric profilometers use surface slope (angle) to assess topography and utilize commercial autocollimators for the contactless slope measurement. For this purpose, the autocollimator beam is deflected by a movable optical square (or pentaprism) towards the surface, where a co-moving aperture limits and defines the beam footprint. In this paper, we focus on the precise and reproducible alignment of the aperture relative to the autocollimator’s optical axis. Its alignment needs to be maintained while it is scanned across the surface under test. The reproducibility of the autocollimator’s measuring conditions during calibration and during its use in the profilometer is of crucial importance to providing precise and traceable angle metrology. In the first part of the paper, we present the aperture alignment procedure developed at the Advanced Light Source, Lawrence Berkeley National Laboratory, USA, for the use of their deflectometric profilometers. In the second part, we investigate the topic further by providing extensive ray tracing simulations and calibrations of a commercial autocollimator performed at the Physikalisch-Technische Bundesanstalt, Germany, for evaluating the effects of the positioning of the aperture on the autocollimator’s angle response. The investigations which we performed are crucial for reaching fundamental metrological limits in deflectometric profilometry.
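The slope-to-topography step described above can be illustrated with a short numerical sketch. This is not the authors' code: the trapezoidal integration scheme and the synthetic parabolic surface are assumptions for illustration only. A deflectometric profilometer records surface slope at each scan position; the height profile then follows by cumulative integration.

```python
import numpy as np

# Sketch of slope-based profilometry: integrate measured surface slopes
# (radians) over scan positions x (meters) to recover the height profile.
# Trapezoidal cumulative integration; all data are synthetic.
def height_from_slope(x, slope):
    steps = 0.5 * (slope[1:] + slope[:-1]) * np.diff(x)
    return np.concatenate(([0.0], np.cumsum(steps)))

# Synthetic check: a parabolic surface h(x) = c*x**2 has slope 2*c*x.
x = np.linspace(0.0, 0.5, 501)   # 0.5 m scan, 1 mm sampling
c = 1e-3
h = height_from_slope(x, 2.0 * c * x)
```

For a linearly varying slope the trapezoidal rule is exact, so the recovered profile matches c*x**2 to machine precision; real measurements would add angle noise and the aperture-alignment errors the paper analyzes.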
Bonding by Hydroxide-Catalyzed Hydration and Dehydration
NASA Technical Reports Server (NTRS)
Gwo, Dz-Hung
2008-01-01
A simple, inexpensive method for bonding solid objects exploits hydroxide-catalyzed hydration and dehydration to form silicate-like networks in thin surface and interfacial layers between the objects. The method can be practiced at room temperature or over a wide range of temperatures. The method was developed especially to enable the formation of precise, reliable bonds between precise optical components. The bonds thus formed exhibit the precision and transparency of bonds formed by the conventional optical-contact method and the strength and reliability of high-temperature frit bonds. The method also lends itself to numerous non-optical applications in which there are requirements for precise bonds and/or requirements for bonds, whether precise or imprecise, that can reliably withstand severe environmental conditions. Categories of such non-optical applications include forming composite materials, coating substrates, forming laminate structures, and preparing objects of defined geometry and composition. The method is applicable to materials that either (1) can form silicate-like networks in the sense that they have silicate-like molecular structures that are extensible into silicate-like networks or (2) can be chemically linked to silicate-like networks by means of hydroxide-catalyzed hydration and dehydration. When hydrated, a material of either type features surface hydroxyl (-OH) groups. In this method, a silicate-like network that bonds two substrates can be formed either by a bonding material alone or by the bonding material together with material from either or both of the substrates. Typically, an aqueous hydroxide bonding solution is dispensed and allowed to flow between the mating surfaces by capillary action. If the surface figures of the substrates do not match precisely, bonding could be improved by including a filling material in the bonding solution. 
Preferably, the filling material should include at least one ingredient that can be hydrated to have exposed hydroxyl groups and that can be chemically linked, by hydroxide catalysis, to a silicate-like network. The silicate-like network could be generated in situ from the filling material and/or substrate material, or could be originally present in the bonding material.
Insomnia and the Performance of US Workers: Results from the America Insomnia Survey
Kessler, Ronald C.; Berglund, Patricia A.; Coulouvrat, Catherine; Hajak, Goeran; Roth, Thomas; Shahly, Victoria; Shillington, Alicia C.; Stephenson, Judith J.; Walsh, James K.
2011-01-01
Study Objectives: To estimate the prevalence and associations of broadly defined (i.e., meeting full ICD-10, DSM-IV, or RDC/ICSD-2 inclusion criteria) insomnia with work performance net of comorbid conditions in the America Insomnia Survey (AIS). Design/Setting: Cross-sectional telephone survey. Participants: National sample of 7,428 employed health plan subscribers (ages 18+). Interventions: None. Measurements and Results: Broadly defined insomnia was assessed with the Brief Insomnia Questionnaire (BIQ). Work absenteeism and presenteeism (low on-the-job work performance, defined in the metric of lost workday equivalents) were assessed with the WHO Health and Work Performance Questionnaire (HPQ). Regression analysis examined associations between insomnia and HPQ scores controlling for 26 comorbid conditions based on self-report and medical/pharmacy claims records. The estimated prevalence of insomnia was 23.2%. Insomnia was significantly associated with lost work performance due to presenteeism (χ²₁ = 39.5, P < 0.001) but not absenteeism (χ²₁ = 3.2, P = 0.07), with an annualized individual-level association of insomnia with presenteeism equivalent to 11.3 days of lost work performance. This estimate decreased to 7.8 days when controls were introduced for comorbid conditions. The individual-level human capital value of this net estimate was $2,280. If we provisionally assume these estimates generalize to the total US workforce, they are equivalent to annualized population-level estimates of 252.7 million days and $63.2 billion. Conclusions: Insomnia is associated with substantial workplace costs. Although experimental studies suggest some of these costs could be recovered with insomnia disease management programs, effectiveness trials are needed to obtain precise estimates of the return on investment of such interventions from the employer perspective. Citation: Kessler RC; Berglund PA; Coulouvrat C; Hajak G; Roth T; Shahly V; Shillington AC; Stephenson JJ; Walsh JK. Insomnia and the performance of US workers: results from the America Insomnia Survey. SLEEP 2011;34(9):1161-1171. PMID:21886353
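The scaling arithmetic behind such population-level figures can be sketched as follows. This is a hypothetical illustration, not the AIS analysis: the workforce size, prevalence, per-case lost days, and dollar value of a lost workday are made-up inputs.

```python
# Scale an individual-level burden estimate to a workforce:
# cases = workforce * prevalence; total lost days = cases * days per case;
# total cost = total lost days * human-capital value of one workday.
def population_burden(workforce, prevalence, lost_days_per_case, day_value):
    cases = workforce * prevalence
    total_days = cases * lost_days_per_case
    return total_days, total_days * day_value

# Illustrative inputs only (not the study's exact parameters):
days, cost = population_burden(140e6, 0.232, 7.8, 292.0)
# About 253 million lost workday equivalents and roughly $74 billion per year
# under these assumed inputs.
```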
Venne, Gabriel; Rasquinha, Brian J; Pichora, David; Ellis, Randy E; Bicknell, Ryan
2015-07-01
Preoperative planning and intraoperative navigation technologies have each been shown separately to be beneficial for optimizing screw and baseplate positioning in reverse shoulder arthroplasty (RSA) but to date have not been combined. This study describes development of a system for performing computer-assisted RSA glenoid baseplate and screw placement, including preoperative planning, intraoperative navigation, and postoperative evaluation, and compares this system with a conventional approach. We used a custom-designed system allowing computed tomography (CT)-based preoperative planning, intraoperative navigation, and postoperative evaluation. Five orthopedic surgeons defined common preoperative plans on 3-dimensional CT reconstructed cadaveric shoulders. Each surgeon performed 3 computer-assisted and 3 conventional simulated procedures. The 3-dimensional CT reconstructed postoperative units were digitally matched to the preoperative model for evaluation of entry points, end points, and angulations of screws and baseplate. Values were used to find accuracy and precision of the 2 groups with respect to the defined placement. Statistical analysis was performed by t tests (α = .05). Comparison of the groups revealed no difference in accuracy or precision of screws or baseplate entry points (P > .05). Accuracy and precision were improved with use of navigation for end points and angulations of 3 screws (P < .05). Accuracy of the inferior screw showed a trend of improvement with navigation (P > .05). Navigated baseplate end point precision was improved (P < .05), with a trend toward improved accuracy (P > .05). We conclude that CT-based preoperative planning and intraoperative navigation allow improved accuracy and precision for screw placement and precision for baseplate positioning with respect to a predefined placement compared with conventional techniques in RSA. Copyright © 2015 Journal of Shoulder and Elbow Surgery Board of Trustees. Published by Elsevier Inc. 
All rights reserved.
Wang, Chenglin; Tang, Yunchao; Zou, Xiangjun; Luo, Lufeng; Chen, Xiong
2017-01-01
Recognition and matching of litchi fruits are critical steps for litchi harvesting robots to successfully grasp litchi. However, due to the randomness of litchi growth, such as clustered growth with an uncertain number of fruits and random occlusion by leaves, branches, and other fruits, recognition and matching of the fruit become a challenge. Therefore, this study first defined mature litchi fruit as three clustered categories. Then an approach for recognition and matching of clustered mature litchi fruit was developed based on litchi color images acquired by binocular charge-coupled device (CCD) color cameras. The approach mainly included three steps: (1) calibration of the binocular color cameras and litchi image acquisition; (2) segmentation of litchi fruits using four kinds of supervised classifiers, and recognition of the pre-defined categories of clustered litchi fruit using a pixel threshold method; and (3) matching the recognized clustered fruit using a geometric center-based matching method. The experimental results showed that the proposed recognition method could be robust against the influences of varying illumination and occlusion conditions and precisely recognize clustered litchi fruit. In the tested 432 clustered litchi fruits, the highest and lowest average recognition rates were 94.17% and 92.00% under sunny back-lighting and partial occlusion, and sunny front-lighting and non-occlusion conditions, respectively. From 50 pairs of tested images, the highest and lowest matching success rates were 97.37% and 91.96% under sunny back-lighting and non-occlusion, and sunny front-lighting and partial occlusion conditions, respectively. PMID:29112177
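Step (3), the geometric center-based matching, can be sketched roughly as follows. This is a hypothetical reconstruction, not the paper's implementation: the rectified-stereo assumption, the thresholds, and the centroid data are all illustrative. In a rectified binocular pair, matched fruit centroids should lie on nearly the same image row, with the left-image x-coordinate larger by the disparity.

```python
# Pair fruit-cluster centroids across a rectified stereo image pair.
# left/right: lists of (x, y) centroids in pixels.
def match_centroids(left, right, max_dy=5.0, max_disparity=120.0):
    matches = []
    for i, (xl, yl) in enumerate(left):
        best, best_dy = None, float("inf")
        for j, (xr, yr) in enumerate(right):
            dy = abs(yl - yr)        # epipolar (row) agreement
            disparity = xl - xr      # positive and bounded for valid matches
            if dy <= max_dy and 0.0 < disparity <= max_disparity and dy < best_dy:
                best, best_dy = j, dy
        if best is not None:
            matches.append((i, best))
    return matches

left = [(320.0, 110.0), (450.0, 240.0)]
right = [(280.0, 111.0), (412.0, 239.0)]
pairs = match_centroids(left, right)   # [(0, 0), (1, 1)]
```

The per-match disparity (xl - xr) would then feed standard stereo triangulation to give the fruit's 3-D grasp position.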
Green polymer chemistry: The role of Candida antarctica lipase B in polymer functionalization
NASA Astrophysics Data System (ADS)
Castano Gil, Yenni Marcela
The synthesis of functional polymers with well-defined structure, end-group fidelity, and physico-chemical properties useful for biomedical applications has proven challenging. Chemo-enzymatic methods are an alternative strategy to increase the diversity of functional groups in polymeric materials. Specifically, enzyme-catalyzed polymer functionalization carried out under solventless conditions is a great advancement in the design of green processes for biomedical applications, where the toxicity of solvents and catalyst residues needs to be considered. Enzymes offer several distinct advantages, including high efficiency, catalyst recyclability, and mild reaction conditions. This research aimed to precisely functionalize polymers using two methods: enzyme-catalyzed functionalization via polymerization and chemo-enzymatic functionalization of pre-made polymers for drug delivery. In the first method, well-defined poly(caprolactone)s were generated using alkyne-based initiating systems catalyzed by CALB. Propargyl alcohol and 4-dibenzocyclooctynol (DIBO) were shown to efficiently initiate the ring-opening polymerization of epsilon-caprolactone under metal-free conditions and yielded polymers with Mn ~4 to 24 kDa and relatively narrow molecular mass distributions. In the second method, we present quantitative enzyme-catalyzed transesterification of vinyl esters and ethyl esters with poly(ethylene glycol)s (PEGs) that will serve as building blocks for dendrimer synthesis, followed by introducing a new process for the exclusive gamma-conjugation of folic acid. Specifically, fluorescein-acrylate was enzymatically conjugated with PEG. Additionally, halo-ester functionalized PEGs were successfully prepared by the transesterification of alkyl halo-esters with PEGs. 1H and 13C NMR spectroscopy, SEC, and MALDI-ToF mass spectrometry confirmed the structure and purity of the products.
Precision Medicine and Men's Health.
Mata, Douglas A; Katchi, Farhan M; Ramasamy, Ranjith
2017-07-01
Precision medicine can greatly benefit men's health by helping to prevent, diagnose, and treat prostate cancer, benign prostatic hyperplasia, infertility, hypogonadism, and erectile dysfunction. For example, precision medicine can facilitate the selection of men at high risk for prostate cancer for targeted prostate-specific antigen screening and chemoprevention administration, as well as assist in identifying men who are resistant to medical therapy for prostatic hyperplasia, who may instead require surgery. Precision medicine-trained clinicians can also let couples know whether their specific cause of infertility should be bypassed by sperm extraction and in vitro fertilization to prevent abnormalities in their offspring. Though precision medicine's role in the management of hypogonadism has yet to be defined, it could be used to identify biomarkers associated with individual patients' responses to treatment so that appropriate therapy can be prescribed. Last, precision medicine can improve erectile dysfunction treatment by identifying genetic polymorphisms that regulate response to medical therapies and by aiding in the selection of patients for further cardiovascular disease screening.
Insulin Resistance: Regression and Clustering
Yoon, Sangho; Assimes, Themistocles L.; Quertermous, Thomas; Hsiao, Chin-Fu; Chuang, Lee-Ming; Hwu, Chii-Min; Rajaratnam, Bala; Olshen, Richard A.
2014-01-01
In this paper we try to define insulin resistance (IR) precisely for a group of Chinese women. Our definition deliberately does not depend upon body mass index (BMI) or age, although in other studies, with particular random effects models quite different from models used here, BMI accounts for a large part of the variability in IR. We accomplish our goal through application of Gauss mixture vector quantization (GMVQ), a technique for clustering that was developed for application to lossy data compression. Defining data come from measurements that play major roles in medical practice. A precise statement of what the data are is in Section 1. Their family structures are described in detail. They concern levels of lipids and the results of an oral glucose tolerance test (OGTT). We apply GMVQ to residuals obtained from regressions of outcomes of an OGTT and lipids on functions of age and BMI that are inferred from the data. A bootstrap procedure developed for our family data supplemented by insights from other approaches leads us to believe that two clusters are appropriate for defining IR precisely. One cluster consists of women who are IR, and the other of women who seem not to be. Genes and other features are used to predict cluster membership. We argue that prediction with “main effects” is not satisfactory, but prediction that includes interactions may be. PMID:24887437
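The regress-then-cluster idea in this record can be illustrated with a minimal numpy sketch. The data are synthetic, and a plain least-squares fit with a two-means loop stands in for the random-effects regressions and Gauss mixture vector quantization (GMVQ) actually used in the study.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200
age = rng.uniform(30, 60, n)
# Outcome = age trend + a latent two-group shift (the "IR" analogue) + noise.
group = (rng.random(n) < 0.5).astype(float)
outcome = 0.05 * age + 3.0 * group + rng.normal(0.0, 0.3, n)

# Regress the outcome on age, then cluster the residuals (the study's key
# move: the clustering deliberately does not depend on age or BMI directly).
X = np.column_stack([np.ones(n), age])
beta, *_ = np.linalg.lstsq(X, outcome, rcond=None)
resid = outcome - X @ beta

# Two-means clustering of the residuals into two putative clusters.
centers = np.array([resid.min(), resid.max()])
for _ in range(50):
    labels = (np.abs(resid - centers[1]) < np.abs(resid - centers[0])).astype(int)
    centers = np.array([resid[labels == 0].mean(), resid[labels == 1].mean()])
```

With the synthetic shift of 3.0, the two recovered cluster centers end up roughly 3 apart, mirroring the study's conclusion that two clusters suffice to separate the groups.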
A new multi-scale geomorphological landscape GIS for the Netherlands
NASA Astrophysics Data System (ADS)
Weerts, Henk; Kosian, Menne; Baas, Henk; Smit, Bjorn
2013-04-01
At present, the Cultural Heritage Agency of the Netherlands is developing a nationwide landscape Geographical Information System (GIS). In this new conceptual approach, the Agency puts together several multi-scale landscape classifications in a GIS. The natural physical landscapes lie at the basis of this GIS, because these landscapes provide the natural boundary conditions for anthropogenic use. At the local scale a nationwide digital geomorphological GIS is available in the Netherlands. This map, which was originally mapped at 1:50,000 from the late 1970s to the 1990s, is based on geomorphometrical (observable and measurable in the field), geomorphological, lithological, and geochronological criteria. When used at a national scale, the legend of this comprehensive geomorphological map is very complex, which hampers its use in e.g. planning practice or predictive archaeology. At the national scale several landscape classifications have been in use in the Netherlands since the early 1950s, typically ranging in the order of 10-15 landscape units for the entire country. A widely used regional predictive archaeological classification has 13 archaeo-landscapes. All these classifications have been defined "top-down" and their actual content and boundaries have only been broadly defined. Thus, these classifications have little or no meaning at a local scale. We have tried to combine the local scale with the national scale. To do so, we first defined national physical geographical regions based on the new 2010 national geological map 1:500,000. We also made sure there was a reference with the European LANMAP2 classification. We arrived at 20 landscape units at the national scale, based on (1) genesis, (2) large-scale geomorphology, (3) lithology of the shallow sub-surface, and (4) age. These criteria were chosen because the genesis of the landscape largely determines its (scale of) morphology and lithology, which in turn determine hydrological conditions. Altogether, they define the natural boundary conditions for anthropogenic use. All units have been defined, mapped, and described based on these criteria. This enables the link with the European LANMAP2 GIS. The unit "Till-plateau sand region", for instance, runs deep into Germany and even Poland. At the local scale, the boundaries of the national units can be defined and precisely mapped by linking them to the 1:50,000 geomorphological map polygons. Each national unit consists of a typical assemblage of local geomorphological units. So, the newly developed natural physical landscape map layer can be used from the local to the European scale.
Analysis of Distribution of Vector-Borne Diseases Using Geographic Information Systems.
Nihei, Naoko
2017-01-01
The distribution of vector-borne diseases is changing on a global scale owing to issues involving natural environments, socioeconomic conditions, and border disputes, among others. Geographic information systems (GIS) provide an important method of establishing a prompt and precise understanding of local data on disease outbreaks, from which disease eradication programs can be established. Having first defined GIS as a combination of GPS, RS, and GIS, we showed the processes through which these technologies were being introduced into our research. GIS-derived geographical information attributes were interpreted in terms of point, area, line, spatial epidemiology, risk, and development for generating the vector dynamic models associated with the spread of the disease. The need for interdisciplinary scientific and administrative collaboration in the use of GIS to control infectious diseases is highly warranted.
6 CFR 7.28 - Automatic declassification.
Code of Federal Regulations, 2010 CFR
2010-01-01
... Classification Appeals Panel (ISCAP) for approval. (d) Declassification guides that narrowly and precisely define... years after the date of its original classification with the exception of specific information exempt...
Chapple, Iain L C; Mealey, Brian L; Van Dyke, Thomas E; Bartold, P Mark; Dommisch, Henrik; Eickholz, Peter; Geisinger, Maria L; Genco, Robert J; Glogauer, Michael; Goldstein, Moshe; Griffin, Terrence J; Holmstrup, Palle; Johnson, Georgia K; Kapila, Yvonne; Lang, Niklaus P; Meyle, Joerg; Murakami, Shinya; Plemons, Jacqueline; Romito, Giuseppe A; Shapira, Lior; Tatakis, Dimitris N; Teughels, Wim; Trombelli, Leonardo; Walter, Clemens; Wimmer, Gernot; Xenoudi, Pinelopi; Yoshie, Hiromasa
2018-06-01
Periodontal health is defined by absence of clinically detectable inflammation. There is a biological level of immune surveillance that is consistent with clinical gingival health and homeostasis. Clinical gingival health may be found in a periodontium that is intact, i.e. without clinical attachment loss or bone loss, and on a reduced periodontium in either a non-periodontitis patient (e.g. in patients with some form of gingival recession or following crown lengthening surgery) or in a patient with a history of periodontitis who is currently periodontally stable. Clinical gingival health can be restored following treatment of gingivitis and periodontitis. However, the treated and stable periodontitis patient with current gingival health remains at increased risk of recurrent periodontitis, and accordingly, must be closely monitored. Two broad categories of gingival diseases include non-dental plaque biofilm-induced gingival diseases and dental plaque-induced gingivitis. Non-dental plaque biofilm-induced gingival diseases include a variety of conditions that are not caused by plaque and usually do not resolve following plaque removal. Such lesions may be manifestations of a systemic condition or may be localized to the oral cavity. Dental plaque-induced gingivitis has a variety of clinical signs and symptoms, and both local predisposing factors and systemic modifying factors can affect its extent, severity, and progression. Dental plaque-induced gingivitis may arise on an intact periodontium or on a reduced periodontium in either a non-periodontitis patient or in a currently stable "periodontitis patient" i.e. successfully treated, in whom clinical inflammation has been eliminated (or substantially reduced). A periodontitis patient with gingival inflammation remains a periodontitis patient (Figure 1), and comprehensive risk assessment and management are imperative to ensure early prevention and/or treatment of recurrent/progressive periodontitis. 
Precision dental medicine defines a patient-centered approach to care, and therefore, creates differences in the way in which a "case" of gingival health or gingivitis is defined for clinical practice as opposed to epidemiologically in population prevalence surveys. Thus, case definitions of gingival health and gingivitis are presented for both purposes. While gingival health and gingivitis have many clinical features, case definitions are primarily predicated on presence or absence of bleeding on probing. Here we classify gingival health and gingival diseases/conditions, along with a summary table of diagnostic features for defining health and gingivitis in various clinical situations. © 2018 American Academy of Periodontology and European Federation of Periodontology.
1974-01-01
General agreement seems to be developing that the geophysical system should be defined in terms of a large number of points..."A Laser-Interferometer System for the Absolute Determination of the Acceleration due to Gravity," In Proc. Int. Conf. on Precision Measurement... The ratio of the plasmaspheric to the total time-delays due to free
Self-Assembly of Hierarchical DNA Nanotube Architectures with Well-Defined Geometries.
Jorgenson, Tyler D; Mohammed, Abdul M; Agrawal, Deepak K; Schulman, Rebecca
2017-02-28
An essential motif for the assembly of biological materials such as actin at the scale of hundreds of nanometers and beyond is a network of one-dimensional fibers with well-defined geometry. Here, we demonstrate the programmed organization of DNA filaments into micron-scale architectures where component filaments are oriented at preprogrammed angles. We assemble L-, T-, and Y-shaped DNA origami junctions that nucleate two or three micron-length DNA nanotubes at high yields. The angles between the nanotubes mirror the angles between the templates on the junctions, demonstrating that nanoscale structures can control precisely how micron-scale architectures form. The ability to precisely program filament orientation could allow the assembly of complex filament architectures in two and three dimensions, including circuit structures, bundles, and extended materials.
Scale invariance of the η-deformed AdS5 × S5 superstring, T-duality and modified type II equations
NASA Astrophysics Data System (ADS)
Arutyunov, G.; Frolov, S.; Hoare, B.; Roiban, R.; Tseytlin, A. A.
2016-02-01
We consider the ABF background underlying the η-deformed AdS5 × S5 sigma model. This background fails to satisfy the standard IIB supergravity equations, which indicates that the corresponding sigma model is not Weyl invariant, i.e. does not define a critical string theory in the usual sense. We argue that the ABF background should still define a UV finite theory on a flat 2d world-sheet, implying that the η-deformed model is scale invariant. This property follows from the formal relation via T-duality between the η-deformed model and the one defined by an exact type IIB supergravity solution that has 6 isometries, albeit broken by a linear dilaton. We find that the ABF background satisfies candidate type IIB scale invariance conditions which for the R-R field strengths are of the second order in derivatives. Surprisingly, we also find that the ABF background obeys an interesting modification of the standard IIB supergravity equations that are first order in derivatives of R-R fields. These modified equations explicitly depend on Killing vectors of the ABF background and, although not universal, they imply the universal scale invariance conditions. Moreover, we show that it is precisely the non-isometric dilaton of the T-dual solution that leads, after T-duality, to modification of the type II equations from their standard form. We conjecture that the modified equations should follow from κ-symmetry of the η-deformed model. All our observations apply also to η-deformations of AdS3 × S3 × T4 and AdS2 × S2 × T6 models.
Quantum interval-valued probability: Contextuality and the Born rule
NASA Astrophysics Data System (ADS)
Tai, Yu-Tsung; Hanson, Andrew J.; Ortiz, Gerardo; Sabry, Amr
2018-05-01
We present a mathematical framework based on quantum interval-valued probability measures to study the effect of experimental imperfections and finite precision measurements on defining aspects of quantum mechanics such as contextuality and the Born rule. While foundational results such as the Kochen-Specker and Gleason theorems are valid in the context of infinite precision, they fail to hold in general in a world with limited resources. Here we employ an interval-valued framework to establish bounds on the validity of those theorems in realistic experimental environments. In this way, not only can we quantify the idea of finite-precision measurement within our theory, but we can also suggest a possible resolution of the Meyer-Mermin debate on the impact of finite-precision measurement on the Kochen-Specker theorem.
Precision Medicine in Myelodysplastic Syndromes and Leukemias: Lessons from Sequential Mutations.
Nazha, Aziz; Sekeres, Mikkael A
2017-01-14
Precision medicine can be simply defined as the identification of personalized treatment that matches patient-specific clinical and genomic characteristics. Since the completion of the Human Genome Project in 2003, significant advances have been made in our understanding of the genetic makeup of diseases, especially cancers. The identification of somatic mutations that can drive cancer has led to the development of therapies that specifically target the abnormal proteins derived from these mutations. This has led to a paradigm shift in our treatment methodology. Although some success has been achieved in targeting some genetic abnormalities, several challenges and limitations exist when applying precision-medicine concepts in leukemia and myelodysplastic syndromes. We review the current understanding of genomics in myelodysplastic syndromes (MDS) and leukemias and the limitations of precision-medicine concepts in MDS.
Wu, Bing; Zhao, Yinghe; Nan, Haiyan; Yang, Ziyi; Zhang, Yuhan; Zhao, Huijuan; He, Daowei; Jiang, Zonglin; Liu, Xiaolong; Li, Yun; Shi, Yi; Ni, Zhenhua; Wang, Jinlan; Xu, Jian-Bin; Wang, Xinran
2016-06-08
Precise assembly of semiconductor heterojunctions is the key to realizing many optoelectronic devices. By exploiting the strong and tunable van der Waals (vdW) forces between graphene and organic small molecules, we demonstrate layer-by-layer epitaxy of ultrathin organic semiconductors and heterostructures with unprecedented precision, with a well-defined number of layers and self-limited characteristics. We further demonstrate organic p-n heterojunctions with molecularly flat interfaces, which exhibit excellent rectifying behavior and photovoltaic responses. The self-limited organic molecular beam epitaxy (SLOMBE) is generically applicable to many layered small-molecule semiconductors and may lead to advanced organic optoelectronic devices beyond bulk heterojunctions.
Optimal combination of illusory and luminance-defined 3-D surfaces: A role for ambiguity.
Hartle, Brittney; Wilcox, Laurie M; Murray, Richard F
2018-04-01
The shape of the illusory surface in stereoscopic Kanizsa figures is determined by the interpolation of depth from the luminance edges of adjacent inducing elements. Despite ambiguity in the position of illusory boundaries, observers reliably perceive a coherent three-dimensional (3-D) surface. However, this ambiguity may contribute additional uncertainty to the depth percept beyond what is expected from measurement noise alone. We evaluated the intrinsic ambiguity of illusory boundaries by using a cue-combination paradigm to measure the reliability of depth percepts elicited by stereoscopic illusory surfaces. We assessed the accuracy and precision of depth percepts using 3-D Kanizsa figures relative to luminance-defined surfaces. The location of the surface peak was defined by illusory boundaries, luminance-defined edges, or both. Accuracy and precision were assessed using a depth-discrimination paradigm. A maximum likelihood linear cue combination model was used to evaluate the relative contribution of illusory and luminance-defined signals to the perceived depth of the combined surface. Our analysis showed that the standard deviation of depth estimates was consistent with an optimal cue combination model, but the points of subjective equality indicated that observers consistently underweighted the contribution of illusory boundaries. This systematic underweighting may reflect a combination rule that attributes additional intrinsic ambiguity to the location of the illusory boundary. Although previous studies show that illusory and luminance-defined contours share many perceptual similarities, our model suggests that ambiguity plays a larger role in the perceptual representation of illusory contours than of luminance-defined contours.
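The maximum-likelihood linear combination rule tested in this study has a standard closed form: each cue is weighted by its inverse variance, and the combined variance is below either single-cue variance. A minimal sketch with illustrative numbers (not the study's data):

```python
# Reliability-weighted (maximum-likelihood) cue combination.
# estimates: single-cue depth estimates; sigmas: their standard deviations.
def combine_cues(estimates, sigmas):
    weights = [1.0 / s**2 for s in sigmas]
    total = sum(weights)
    combined = sum(w * e for w, e in zip(weights, estimates)) / total
    return combined, (1.0 / total) ** 0.5

# E.g. a noisier illusory-boundary cue and a sharper luminance-defined cue:
depth, sigma = combine_cues([10.0, 12.0], [2.0, 1.0])
```

Here the combined estimate (11.6) sits closer to the more reliable luminance cue, and the combined standard deviation (≈0.89) falls below the best single cue's (1.0); the systematic underweighting of illusory boundaries reported above corresponds to weights shifted away from this optimum.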
Precision medicine: In need of guidance and surveillance.
Lin, Jian-Zhen; Long, Jun-Yu; Wang, An-Qiang; Zheng, Ying; Zhao, Hai-Tao
2017-07-28
Precision medicine, currently a hotspot in mainstream medicine, has been strongly promoted in recent years. With rapid technological development, such as next-generation sequencing, and fierce competition in molecular targeted drug exploitation, precision medicine represents an advance in science and technology; it also fulfills needs in public health care. The clinical translation and application of precision medicine - especially in the prevention and treatment of tumors - is far from satisfactory; however, the aims of precision medicine deserve approval. This medical approach is currently in its infancy; it has promising prospects, but it needs to overcome a number of problems and deficiencies. It is expected that in addition to conventional symptoms and signs, precision medicine will define disease in terms of the underlying molecular characteristics and other environmental susceptibility factors. These expectations should be realized by constructing a novel data network that integrates clinical data from individual patients and their personal genomic backgrounds with existing research on the molecular makeup of diseases. In addition, multi-omics analysis and multi-disciplinary collaboration will become crucial elements in precision medicine. Precision medicine deserves strong support, and its development demands directed momentum. We propose three kinds of impetus (research, application, and collaboration impetus) to provide such directed momentum toward promoting precision medicine and accelerating its clinical translation and application.
Evaluating the validity of using unverified indices of body condition
Schamber, J.L.; Esler, Daniel N.; Flint, Paul L.
2009-01-01
Condition indices are commonly used in an attempt to link body condition of birds to ecological variables of interest, including demographic attributes such as survival and reproduction. Most indices are based on body mass adjusted for structural body size, calculated as simple ratios or as residuals from regressions. However, condition indices are often applied without confirming their predictive value (i.e., without being validated against measured values of fat and protein), a practice we term 'unverified' use. We evaluated the ability of a number of unverified indices frequently found in the literature to predict absolute and proportional levels of fat and protein across five species of waterfowl. Among the indices we considered, those accounting for body size never predicted absolute protein more precisely than body mass alone; however, some indices improved the predictability of fat, although the form of the best index varied by species. Further, the gain in precision from using a condition index to predict either absolute or percent fat was minimal (increase in r² ≤ 0.13), and in many cases model fit was actually reduced. Our data agree with previous assertions that the assumption that indices provide more precise indicators of body condition than body mass alone is often invalid. We strongly discourage the use of unverified indices, because subjectively selecting indices likely does little to improve precision and might in fact decrease predictability relative to using body mass alone.
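The residual-based indices discussed above, and the validation step the authors argue is routinely skipped, are easy to state concretely. A sketch with generic code (not the authors' analysis; any data fed to it would be hypothetical):

```python
import numpy as np

def residual_condition_index(mass, size):
    """'Unverified' condition index: residuals from an ordinary
    least-squares regression of body mass on a structural size measure
    (e.g. wing chord or tarsus length)."""
    slope, intercept = np.polyfit(size, mass, 1)
    return mass - (slope * size + intercept)

def r_squared(predictor, fat):
    """Verification step: proportion of variance in measured fat
    explained by a candidate predictor (index or raw body mass)."""
    slope, intercept = np.polyfit(predictor, fat, 1)
    pred = slope * predictor + intercept
    ss_res = np.sum((fat - pred) ** 2)
    ss_tot = np.sum((fat - fat.mean()) ** 2)
    return 1.0 - ss_res / ss_tot
```

Comparing `r_squared(index, fat)` against `r_squared(mass, fat)` is exactly the comparison whose gain the authors found to be minimal (≤ 0.13).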
Time lapse video recordings of highly purified human hematopoietic progenitor cells in culture.
Denkers, I A; Dragowska, W; Jaggi, B; Palcic, B; Lansdorp, P M
1993-05-01
Major hurdles in studies of stem cell biology include the low frequency and heterogeneity of human hematopoietic precursor cells in bone marrow and the difficulty of directly studying the effect of various culture conditions and growth factors on such cells. We have adapted the cell analyzer imaging system for monitoring and recording the morphology of limited numbers of cells under various culture conditions. Hematopoietic progenitor cells with a CD34+ CD45RAlo CD71lo phenotype were purified from previously frozen organ donor bone marrow by fluorescence activated cell sorting. Cultures of such cells were analyzed with the imaging system composed of an inverted microscope contained in an incubator, a video camera, an optical memory disk recorder and a computer-controlled motorized microscope XYZ precision stage. Fully computer-controlled video images at defined XYZ positions were captured at selected time intervals and recorded at a predetermined sequence on an optical memory disk. In this study, the cell analyzer system was used to obtain descriptions and measurements of hematopoietic cell behavior, like cell motility, cell interactions, cell shape, cell division, cell cycle time and cell size changes under different culture conditions.
Russo, Russell R; Burn, Matthew B; Ismaily, Sabir K; Gerrie, Brayden J; Han, Shuyang; Alexander, Jerry; Lenherr, Christopher; Noble, Philip C; Harris, Joshua D; McCulloch, Patrick C
2018-03-01
Accurate measurements of shoulder and elbow motion are required for the management of musculoskeletal pathology. The purpose of this investigation was to compare three techniques for measuring motion. The authors hypothesized that digital photography would be equivalent in accuracy and show higher precision compared to the other two techniques. Using infrared motion capture analysis as the reference standard, shoulder flexion/abduction/internal rotation/external rotation and elbow flexion/extension were measured using visual estimation, goniometry, and digital photography on 10 fresh frozen cadavers. These measurements were performed by three physical therapists and three orthopaedic surgeons. Accuracy was defined by the difference from the reference standard (motion capture analysis), while precision was defined by the proportion of measurements within the authors' definition of clinical significance (10° for all motions except for elbow extension where 5° was used). Analysis of variance (ANOVA), t-tests, and chi-squared tests were used. Although statistically significant differences were found in measurement accuracy between the three techniques, none of these differences met the authors' definition of clinical significance. Precision of the measurements was significantly higher for both digital photography (shoulder abduction [93% vs. 74%, p < 0.001], shoulder internal rotation [97% vs. 83%, p = 0.001], and elbow flexion [93% vs. 65%, p < 0.001]) and goniometry (shoulder abduction [92% vs. 74%, p < 0.001] and shoulder internal rotation [94% vs. 83%, p = 0.008]) than visual estimation. Digital photography was more precise than goniometry for measurements of elbow flexion only [93% vs. 76%, p < 0.001]. There was no clinically significant difference in measurement accuracy between the three techniques for shoulder and elbow motion. 
Digital photography showed higher measurement precision compared to visual estimation for shoulder abduction, shoulder internal rotation, and elbow flexion. However, digital photography was only more precise than goniometry for measurements of elbow flexion. Overall digital photography shows equivalent accuracy to visual estimation and goniometry, but with higher precision than visual estimation. Copyright © 2017. Published by Elsevier B.V.
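The precision metric used in this study, the proportion of measurements falling within a clinically significant margin of the motion-capture reference, can be operationalized directly. A sketch assuming per-measurement comparison against the reference standard (the readings below are hypothetical):

```python
def precision_proportion(measurements, reference, threshold=10.0):
    """Share of measurements within `threshold` degrees of the reference
    standard (10 degrees for most motions, 5 for elbow extension)."""
    within = [m for m in measurements if abs(m - reference) <= threshold]
    return len(within) / len(measurements)

# Hypothetical shoulder-abduction readings against a 100-degree reference:
p = precision_proportion([90, 95, 105, 120], 100)  # 3 of 4 within 10 degrees
```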
High-Precision 40Ar/39Ar dating of the Deccan Traps
NASA Astrophysics Data System (ADS)
Sprain, C. J.; Renne, P. R.; Richards, M. A.; Self, S.; Vanderkluysen, L.; Pande, K.; Morgan, L. E.; Cosca, M. A.
2015-12-01
The Deccan Traps (DT) have been strongly implicated over the past thirty years as a potential cause of the mass extinctions at the Cretaceous-Paleogene boundary (KPB). While a broad coincidence between the DT eruptions and the KPB is increasingly clear, variables such as tempo, volume of eruptions, and amount of associated climate-modifying volatiles, are too poorly constrained to properly assess causality. In order to appropriately test whether the DT played a role in the mass extinctions a high-precision geochronologic framework defining the timing and tempo of volcanic eruptions is needed. Recent high-precision U/Pb dating of zircons from inferred paleosols (red boles) and melt segregation horizons is the only available geochronology of the DT that is sufficiently precise to resolve age differences of less than 100 ka (Schoene et al., 2015). While this technique can achieve high-precision dates for individual zircon crystals, protracted age distributions may not include the actual eruption age. Moreover, the applicability of U/Pb dating in the DT is limited as suitable material is only sporadically present and therefore the technique is unlikely to achieve the resolution necessary to assess the tempo of DT eruptions. To mitigate these limitations, we present new high-precision 40Ar/39Ar ages for plagioclase separated from the lava flows sampled from each of ten chemostratigraphically-defined formations within the Western Ghats. Multiple (N = 1-4) plateau ages from each sample and detailed neutron fluence monitoring during irradiation yield ages with precision commonly better than 100 ka (1 sigma). Results provide the first precise location of the KPB within the DT eruption sequence, which approximately coincides with major changes in eruption frequency, flow-field volumes, extent of crustal contamination, and degree of fractionation. 
Collectively, these results suggest that a state shift occurred in the DT magma system within ~50 ka of the Chicxulub impact, consistent with transient effects of seismic energy associated with the impact. Further, our new data invalidate the concept of three discrete eruption pulses in the Western Ghats (Chenet et al., 2007, 2009; Keller et al., 2008) and rather indicate only a sharp increase in mean volumetric eruption rates near the KPB.
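Pooling multiple plateau ages per sample into a single high-precision age is conventionally done with an inverse-variance weighted mean. A generic sketch of that calculation (the ages below are hypothetical, not the study's data):

```python
import numpy as np

def weighted_mean_age(ages, sigmas):
    """Inverse-variance weighted mean of plateau ages (Ma) and its
    pooled 1-sigma uncertainty."""
    ages = np.asarray(ages, float)
    w = 1.0 / np.asarray(sigmas, float) ** 2
    mean = np.sum(w * ages) / np.sum(w)
    sigma = 1.0 / np.sqrt(np.sum(w))
    return mean, sigma

# Hypothetical plateau ages for one flow, each with 0.1 Ma (1 sigma) precision:
mean, sigma = weighted_mean_age([66.3, 66.1, 66.2], [0.1, 0.1, 0.1])
# Pooling N equally precise ages shrinks the uncertainty by sqrt(N).
```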
Fraguas, D; Díaz-Caneja, C M; State, M W; O'Donovan, M C; Gur, R E; Arango, C
2017-01-01
Personalized or precision medicine is predicated on the assumption that the average response to treatment is not necessarily representative of the response of each individual. A commitment to personalized medicine demands an effort to bring evidence-based medicine and personalized medicine closer together. The use of relatively homogeneous groups, defined using a priori criteria, may constitute a promising initial step for developing more accurate risk-prediction models with which to advance the development of personalized evidence-based medicine approaches to heterogeneous syndromes such as schizophrenia. However, this can lead to a paradoxical situation in the field of psychiatry. Since there has been a tendency to loosely define psychiatric disorders as ones without a known aetiology, the discovery of an aetiology for psychiatric syndromes (e.g. 22q11.2 deletion syndrome in some cases of schizophrenia), while offering a path toward more precise treatments, may also lead to their reclassification away from psychiatry. We contend that psychiatric disorders with a known aetiology should not be removed from the field of psychiatry. This knowledge should be used instead to guide treatment, inasmuch as psychotherapies, pharmacotherapies and other treatments can all be valid approaches to mental disorders. The translation of the personalized clinical approach inherent to psychiatry into evidence-based precision medicine can lead to the development of novel treatment options for mental disorders and improve outcomes.
Choi, Jungil; Xue, Yeguang; Xia, Wei; Ray, Tyler R; Reeder, Jonathan T; Bandodkar, Amay J; Kang, Daeshik; Xu, Shuai; Huang, Yonggang; Rogers, John A
2017-07-25
During periods of activity, sweat glands produce pressures associated with osmotic effects to drive liquid to the surface of the skin. The magnitudes of these pressures may provide insights into physiological health, the intensity of physical exertion, psychological stress factors, and/or other information of interest, yet they are currently unknown due to the absence of a means for non-invasive measurement. This paper introduces a thin, soft, wearable microfluidic system that mounts onto the surface of the skin to enable precise and routine measurements of the secretory fluidic pressures generated at the surface of the skin by eccrine sweat glands (surface SPSG, or s-SPSG) at nearly any location on the body. These platforms incorporate an arrayed collection of unit cells, each of which includes an opening to the skin, an inlet through which sweat can flow, a capillary bursting valve (CBV) with a unique bursting pressure (BP), a corresponding microreservoir to receive sweat, and an outlet to the surrounding ambient to allow release of backpressure. The BPs systematically span the physiologically relevant range, enabling a measurement precision approximately defined by the ratio of the range to the number of unit cells. Human studies demonstrate measurements of s-SPSG under different conditions and from various regions of the body. Average values in healthy young adults lie between 2.4 and 2.9 kPa. Sweat associated with vigorous exercise has s-SPSGs that are somewhat higher than those associated with sedentary activity. For all conditions, the forearm and lower back tend to yield the highest and lowest s-SPSGs, respectively.
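The readout principle described above, valves burst in order of bursting pressure, so the sweat pressure is bracketed by the highest burst BP and the lowest intact BP, can be sketched as follows (a generic illustration under assumed valve values, not the device's actual calibration):

```python
def pressure_bracket(bursting_pressures, n_burst):
    """Read out an s-SPSG from a CBV array: valves whose bursting
    pressure (BP) lies below the secretion pressure burst, so the
    pressure lies between the highest burst BP and lowest intact BP."""
    bps = sorted(bursting_pressures)
    low = bps[n_burst - 1] if n_burst > 0 else 0.0
    high = bps[n_burst] if n_burst < len(bps) else float("inf")
    return low, high

# Hypothetical 10-valve array with BPs evenly spanning 1.0-3.7 kPa:
bps = [1.0 + 0.3 * i for i in range(10)]
low, high = pressure_bracket(bps, 6)  # six valves burst
# The bracket width (~0.3 kPa) is roughly range / number of unit cells,
# matching the precision scaling stated in the abstract.
```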
Periodic three-body orbits with vanishing angular momentum in the Jacobi-Poincaré ‘strong’ potential
NASA Astrophysics Data System (ADS)
Dmitrašinović, V.; Petrović, Luka V.; Šuvakov, Milovan
2017-10-01
Moore (1993 Phys. Rev. Lett. 70 3675) and Montgomery (2005 Ergod. Theor. Dynam. Syst. 25 921-947) have argued that planar periodic orbits of three bodies moving in the Jacobi-Poincaré, or ‘strong’, pairwise potential ∑_{i>j} −1/r_ij² can have all possible topologies. Here we search systematically for such orbits with vanishing angular momentum and find 24 topologically distinct orbits, 22 of which are new, in a small section of the allowed phase space, with a tendency to overcrowd due to overlapping initial conditions. The topologies of these 24 orbits belong to three algebraic sequences defined as functions of an integer n = 0, 1, 2, …. Each sequence extends to n → ∞, but the separation of initial conditions for orbits with n ≥ 10 becomes practically impossible with a numerical precision of 16 decimal places. Nevertheless, even with a precision of 16 decimals, it is clear that in each sequence both the orbit’s initial angle φ_n and its period T_n approach finite values in the asymptotic limit n → ∞. Two of the three sequences overlap in the sense that their initial angles φ occupy the same segment on the circle and their asymptotic values φ_∞ are (very) close to each other. The actions of these orbits rise linearly with the index n that describes the orbit’s topology, in agreement with the Newtonian case. We show that this behaviour is consistent with the assumption of analyticity of the action as a function of period.
System for precise position registration
Sundelin, Ronald M.; Wang, Tong
2005-11-22
An apparatus for enabling accurate retention of a precise position, such as for reacquisition of a microscopic spot or feature having a size of 0.1 mm or less, on broad-area surfaces after non-in-situ processing. The apparatus includes a sample and sample holder. The sample holder includes a base and three support posts. Two of the support posts interact with a cylindrical hole and a U-groove in the sample to establish the location of one point on the sample and a line through the sample. Simultaneous contact of the third support post with the surface of the sample defines a plane through the sample. All points of the sample are therefore uniquely defined by the sample and sample holder. The position registration system of the current invention provides accuracy, as measured by x, y repeatability, of at least 140 µm.
Precision Medicine: From Science To Value.
Ginsburg, Geoffrey S; Phillips, Kathryn A
2018-05-01
Precision medicine is making an impact on patients, health care delivery systems, and research participants in ways that were only imagined fifteen years ago when the human genome was first sequenced. Discovery of disease-causing and drug-response genetic variants has accelerated, while adoption into clinical medicine has lagged. We define precision medicine and the stakeholder community required to enable its integration into research and health care. We explore the intersection of data science, analytics, and precision medicine in the formation of health systems that carry out research in the context of clinical care and that optimize the tools and information used to deliver improved patient outcomes. We provide examples of real-world impact and conclude with a policy and economic agenda necessary for the adoption of this new paradigm of health care both in the United States and globally.
NASA Astrophysics Data System (ADS)
Soriano, Diogo C.; Santos, Odair V. dos; Suyama, Ricardo; Fazanaro, Filipe I.; Attux, Romis
2018-03-01
This work has a twofold aim: (a) to analyze an alternative approach for computing the conditional Lyapunov exponent (λcmax) for evaluating the synchronization stability between nonlinear oscillators without solving the classical variational equations for the synchronization-error dynamical system. In this first framework, an analytic reference value for λcmax is also provided in the context of the Duffing master-slave scenario and precisely evaluated by the proposed numerical approach; (b) to apply this technique to the study of synchronization stability in chaotic Hindmarsh-Rose (HR) neuronal models under uni- and bi-directional resistive coupling and different excitation bias; this part of the study also considered the root mean square synchronization error, information-theoretic measures, and asymmetric transfer entropy in order to offer better insight into the synchronization phenomenon. In particular, statistical and information-theoretic measures were able to capture the increase in similarity between the neuronal oscillators just after a critical coupling value, in accordance with the behavior of the largest conditional Lyapunov exponent. On the other hand, transfer entropy was able to detect the influence of the emitter neuron even in a weak coupling condition, i.e., under an increasing conditional Lyapunov exponent and an apparent desynchronization tendency. In the performed set of numerical simulations, the synchronization measures were also evaluated over a two-dimensional parameter space defined by the neuronal coupling (emitter to receiver neuron) and the (receiver) excitation current. This analysis is repeated for different feedback couplings as well as for different (emitter) excitation currents, revealing interesting characteristics of the attained synchronization region and the conditions that facilitate the emergence of synchronous behavior. 
These results provide a more detailed numerical insight into the underlying behavior of an HR neuron in the excitation and coupling space, in accordance with some general findings concerning HR coupling topologies. As a perspective, besides the synchronization overview from different standpoints, we hope that the proposed numerical approach for conditional Lyapunov exponent evaluation can outline a valuable strategy for studying neuronal stability, especially when realistic models are considered, for which analytical treatment or even Jacobian evaluation can be laborious or impracticable.
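Estimating λcmax without the variational equations, as in aim (a), is commonly done with a two-slave method: two copies of the slave system receive the identical drive signal, and the average exponential growth rate of their separation, with periodic renormalisation, estimates the exponent. A generic sketch using a simple, linearly stable slave (an illustration of the technique, not the paper's Duffing or HR equations):

```python
import numpy as np

def conditional_lyapunov(step_slave, drive, x0, d0=1e-8, dt=0.01, n=20000):
    """Estimate the largest conditional Lyapunov exponent by evolving two
    slave copies under the same drive and renormalising their separation."""
    a = np.array(x0, float)
    b = a + d0 / np.sqrt(len(a))          # perturbed copy at distance d0
    total = 0.0
    for k in range(n):
        u = drive(k * dt)                 # common driving signal
        a = step_slave(a, u, dt)
        b = step_slave(b, u, dt)
        d = np.linalg.norm(a - b)
        total += np.log(d / d0)
        b = a + (b - a) * (d0 / d)        # reset separation to d0
    return total / (n * dt)

# Stable slave x' = -x + u (Euler step), driven by a sine: the exponent
# is negative, i.e. the slave synchronises to the drive.
step = lambda x, u, dt: x + dt * (-x + u)
lam = conditional_lyapunov(step, lambda t: np.array([np.sin(t)]), [0.5])
```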
Linked Orders Improve Safety in Scheduling and Administration of Chemotherapeutic Agents
Whipple, Nancy; Boulware, Joy; Danca, Kala; Boyarin, Kirill; Ginsberg, Eliot; Poon, Eric; Sweet, Micheal; Schade, Sue; Rogala, Jennifer
2010-01-01
The pharmacologic treatment for cancer must adhere to complex, finely orchestrated treatment plans, including not only chemotherapy medications, but pre/post-hydration, anti-emetics, anti-anxiety, and other medications that are given before, during and after chemotherapy doses. The treatment plans specify the medications and dictate precise dosing, frequency, and timing. This is a challenge to most Computerized Physician Order Entry (CPOE), Pharmacy and Electronic Medication Administration record (eMAR) Systems. Medications are scheduled on specific dates, referred to as chemo days, from the onset of the treatment, and precisely timed on the designated chemo day. For patients enrolled in research protocols, the adherence to the defined schedule takes on additional import, since variation is a violation of the protocol. If the oncologist determines that medications must be administered outside the defined constraints, the patient must be un-enrolled from the protocol and the course of therapy is re-written. Pharmacy and eMAR systems utilized in processing chemotherapy medications must be able to support the intricate relationships between each drug defined in the treatment plans. PMID:21347104
The economic case for precision medicine.
Gavan, Sean P; Thompson, Alexander J; Payne, Katherine
2018-01-01
Introduction: The advancement of precision medicine into routine clinical practice has been highlighted as an agenda for national and international health care policy. A principal barrier to this advancement is in meeting the requirements of the payer or reimbursement agency for health care. This special report aims to explain the economic case for precision medicine, by accounting for the explicit objectives defined by decision-makers responsible for the allocation of limited health care resources. Areas covered: The framework of cost-effectiveness analysis, a method of economic evaluation, is used to describe how precision medicine can, in theory, exploit identifiable patient-level heterogeneity to improve population health outcomes and the relative cost-effectiveness of health care. Four case studies are used to illustrate potential challenges when demonstrating the economic case for a precision medicine in practice. Expert commentary: The economic case for a precision medicine should be considered at an early stage during its research and development phase. Clinical and economic evidence can be generated iteratively and should be aligned with the objectives and requirements of decision-makers. Programmes of further research, to demonstrate the economic case for a precision medicine, can be prioritized by the extent to which they reduce the uncertainty expressed by decision-makers.
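The cost-effectiveness framework mentioned under 'Areas covered' reduces, at its core, to two standard quantities: the incremental cost-effectiveness ratio (ICER) and the net monetary benefit (NMB) at the payer's willingness-to-pay threshold. A minimal sketch (all cost and QALY figures are hypothetical):

```python
def icer(cost_new, effect_new, cost_old, effect_old):
    """Incremental cost-effectiveness ratio: extra cost per extra unit
    of health outcome (e.g. per QALY gained)."""
    return (cost_new - cost_old) / (effect_new - effect_old)

def net_monetary_benefit(cost, effect, threshold):
    """NMB at a willingness-to-pay threshold; the strategy with the
    highest NMB is the cost-effective choice."""
    return threshold * effect - cost

# Hypothetical stratified (precision) strategy vs. a treat-all comparator:
r = icer(12000, 6.0, 10000, 5.8)  # ~10000 per QALY gained
```

A precision strategy is adopted when its ICER falls below the threshold, equivalently when its NMB exceeds the comparator's.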
Precision medicine in cardiology.
Antman, Elliott M; Loscalzo, Joseph
2016-10-01
The cardiovascular research and clinical communities are ideally positioned to address the epidemic of noncommunicable causes of death, as well as advance our understanding of human health and disease, through the development and implementation of precision medicine. New tools will be needed for describing the cardiovascular health status of individuals and populations, including 'omic' data, exposome and social determinants of health, the microbiome, behaviours and motivations, patient-generated data, and the array of data in electronic medical records. Cardiovascular specialists can build on their experience and use precision medicine to facilitate discovery science and improve the efficiency of clinical research, with the goal of providing more precise information to improve the health of individuals and populations. Overcoming the barriers to implementing precision medicine will require addressing a range of technical and sociopolitical issues. Health care under precision medicine will become a more integrated, dynamic system, in which patients are no longer a passive entity on whom measurements are made, but instead are central stakeholders who contribute data and participate actively in shared decision-making. Many traditionally defined diseases have common mechanisms; therefore, elimination of a siloed approach to medicine will ultimately pave the path to the creation of a universal precision medicine environment.
Accuracy and Precision of Silicon Based Impression Media for Quantitative Areal Texture Analysis
Goodall, Robert H.; Darras, Laurent P.; Purnell, Mark A.
2015-01-01
Areal surface texture analysis is becoming widespread across a diverse range of applications, from engineering to ecology. In many studies silicon based impression media are used to replicate surfaces, and the fidelity of replication defines the quality of data collected. However, while different investigators have used different impression media, the fidelity of surface replication has not been subjected to quantitative analysis based on areal texture data. Here we present the results of an analysis of the accuracy and precision with which different silicon based impression media of varying composition and viscosity replicate rough and smooth surfaces. Both accuracy and precision vary greatly between different media. High viscosity media tested show very low accuracy and precision, and most other compounds showed either the same pattern, or low accuracy and high precision, or low precision and high accuracy. Of the media tested, mid viscosity President Jet Regular Body and low viscosity President Jet Light Body (Coltène Whaledent) are the only compounds to show high levels of accuracy and precision on both surface types. Our results show that data acquired from different impression media are not comparable, supporting calls for greater standardisation of methods in areal texture analysis. PMID:25991505
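Accuracy and precision as used above can be operationalized per texture parameter: accuracy as the deviation of the mean replica value from the value measured on the original surface, and precision as the spread across repeated replicas. A generic sketch (hypothetical roughness values, not the study's data):

```python
import numpy as np

def accuracy_and_precision(replica_values, reference):
    """Accuracy error: |mean replica value - original-surface value|.
    Precision: sample standard deviation across repeated replicas."""
    v = np.asarray(replica_values, float)
    return abs(v.mean() - reference), v.std(ddof=1)

# Hypothetical areal-roughness values from three replicas of one surface:
acc_err, prec = accuracy_and_precision([10.0, 10.2, 9.8], reference=10.1)
```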
Kerner, Berit; North, Kari E; Fallin, M Daniele
2010-01-01
Participants analyzed actual and simulated longitudinal data from the Framingham Heart Study for various metabolic and cardiovascular traits. The genetic information incorporated into these investigations ranged from selected single-nucleotide polymorphisms to genome-wide association arrays. Genotypes were incorporated using a broad range of methodological approaches including conditional logistic regression, linear mixed models, generalized estimating equations, linear growth curve estimation, growth modeling, growth mixture modeling, population attributable risk fraction based on survival functions under the proportional hazards models, and multivariate adaptive splines for the analysis of longitudinal data. The specific scientific questions addressed by these different approaches also varied, ranging from a more precise definition of the phenotype, bias reduction in control selection, estimation of effect sizes and genotype associated risk, to direct incorporation of genetic data into longitudinal modeling approaches and the exploration of population heterogeneity with regard to longitudinal trajectories. The group reached several overall conclusions: 1) The additional information provided by longitudinal data may be useful in genetic analyses. 2) The precision of the phenotype definition as well as control selection in nested designs may be improved, especially if traits demonstrate a trend over time or have strong age-of-onset effects. 3) Analyzing genetic data stratified for high-risk subgroups defined by a unique development over time could be useful for the detection of rare mutations in common multi-factorial diseases. 4) Estimation of the population impact of genomic risk variants could be more precise. The challenges and computational complexity demanded by genome-wide single-nucleotide polymorphism data were also discussed. PMID:19924713
Magnetic Ordering in Gold Nanoclusters
DOE Office of Scientific and Technical Information (OSTI.GOV)
Agrachev, Mikhail; Antonello, Sabrina; Dainese, Tiziano
2017-06-12
Here, several research groups have observed magnetism in monolayer-protected gold-cluster samples, but the results were often contradictory and thus a clear understanding of this phenomenon is still missing. We used Au25(SCH2CH2Ph)18^0, which is a paramagnetic cluster that can be prepared with atomic precision and whose structure is known precisely. Previous magnetometry studies only detected paramagnetism. We used samples representing a range of crystallographic orders and studied their magnetic behaviors by electron paramagnetic resonance (EPR). As a film, Au25(SCH2CH2Ph)18^0 displays paramagnetic behavior but, at low temperature, ferromagnetic interactions are detectable. One or a few single crystals undergo physical reorientation with the applied field and display ferromagnetism, as detected through hysteresis experiments. A large collection of microcrystals is magnetic even at room temperature and shows distinct paramagnetic, superparamagnetic, and ferromagnetic behaviors. Simulation of the EPR spectra shows that both spin-orbit coupling and crystal distortion are important to determine the observed magnetic behaviors. DFT calculations carried out on single-cluster and periodic models predict values of spin-orbit coupling and crystal-splitting effects in agreement with the EPR-derived quantities. Magnetism in gold nanoclusters is thus demonstrated to be the outcome of a very delicate balance of factors. To obtain reproducible results, the samples must be (i) controlled for composition and thus be monodisperse with atomic precision, (ii) of known charge state, and (iii) well defined also in terms of crystallinity and experimental conditions. This study highlights the efficacy of EPR spectroscopy to provide a molecular understanding of these phenomena.
Surface Participation Effects in Titanium Nitride and Niobium Resonators
NASA Astrophysics Data System (ADS)
Dove, Allison; Kreikebaum, John Mark; Livingston, William; Delva, Remy; Qiu, Yanjie; Lolowang, Reinhard; Ramasesh, Vinay; O'Brien, Kevin; Siddiqi, Irfan
Improving the coherence time of superconducting qubits requires a precise understanding of the location and density of surface defects. Superconducting microwave resonators are commonly used for quantum state readout and are a versatile testbed to systematically characterize materials properties as a function of device geometry and fabrication method. We report on sputter deposited titanium nitride and niobium on silicon coplanar waveguide resonators patterned using reactive ion etches to define the device geometry. We discuss the impact of different growth conditions (temperature and electrical bias) and processing techniques on the internal quality factor (Q) of these devices. In particular, to investigate the effect of surface participation, we use a Bosch process to etch many-micron-deep trenches in the silicon substrate and quantify the impact of etch depth and profile on the internal Q. This research was supported by the ARO.
Cooper, Dan M; Radom-Aizik, Shlomit
2015-08-01
NIH Director Francis Collins noted that the Common Fund initiative would lead to unprecedented insights into the mechanisms responsible for the health effects of physical activity. He noted: “Armed with this knowledge, researchers and clinicians may one day be able to define optimal physical activity recommendations for people at various stages of life, as well as develop precisely targeted regimens for individuals with particular health needs.” Given the ominous burden of physical inactivity-related diseases and conditions in otherwise healthy children, and the growing number of children who survive chronic diseases in whom we know little about what constitutes healthy exercise, it is essential that the community of child health researchers develop compelling strategies and proposals in response to the unique opportunity offered through the Common Fund mechanism.
NASA Technical Reports Server (NTRS)
Bryant, N. A.; Zobrist, A. L.; Walker, R. E.; Gokhman, B.
1985-01-01
Performance requirements regarding geometric accuracy have been defined in terms of end product goals, but until recently no precise details have been given concerning the conditions under which that accuracy is to be achieved. In order to achieve higher spatial and spectral resolutions, the Thematic Mapper (TM) sensor was designed to image in both forward and reverse mirror sweeps in two separate focal planes. Both hardware and software have been augmented and changed during the course of the Landsat TM developments to achieve improved geometric accuracy. An investigation has been conducted to determine if the TM meets the National Map Accuracy Standards for geometric accuracy at larger scales. It was found that TM imagery, in terms of geometry, has come close to, and in some cases exceeded, its stringent specifications.
Investigation of breadboard temperature profiling system for SSME fuel preburner diagnostics
NASA Technical Reports Server (NTRS)
Shirley, J. A.
1986-01-01
The feasibility of measuring temperatures in the space shuttle main engine (SSME) fuel preburner using spontaneous Raman scattering from molecular hydrogen was studied. Laser radiation is transmitted to the preburner through a multimode optical fiber. Backscattered Raman-shifted light is collected and focused into a second fiber, which connects to a remotely located spectrograph and a multichannel optical detector. Optics collimate and focus laser light from the transmitter fiber, defining the probe volume. The high-pressure, high-temperature preburner environment was simulated by a heated pressure cell. Temperatures determined from the distribution of Q-branch rovibrational transitions demonstrate precision and accuracy of 3%. It is indicated that preburner temperatures can be determined with 5% accuracy, with spatial resolution of less than 1 cm and temporal resolution of 10 ms, at nominal preburner operating conditions.
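Temperature extraction from a distribution of Raman line intensities is commonly done with a Boltzmann plot: with intensities I_J proportional to g_J exp(-E_J/kT), a straight-line fit of ln(I/g) against level energy E has slope -1/(kT). The sketch below illustrates that generic idea on noise-free synthetic data; it is not the instrument analysis used in the study, and the level energies and degeneracies are invented:

```python
import math

# Generic Boltzmann-plot temperature fit on synthetic, noise-free data.
# Level energies E (J) and degeneracies g are invented; with intensities
# I_J proportional to g_J * exp(-E_J / kT), the slope of ln(I/g) vs E
# equals -1/(kT).

k_B = 1.380649e-23          # Boltzmann constant, J/K
T_true = 800.0              # synthetic "preburner" temperature, K

E = [0.0, 1e-20, 2e-20, 3e-20]
g = [1, 3, 5, 7]
I = [gi * math.exp(-Ei / (k_B * T_true)) for gi, Ei in zip(g, E)]

# ordinary least-squares slope of y = ln(I/g) against x = E
x = E
y = [math.log(Ii / gi) for Ii, gi in zip(I, g)]
n = len(x)
xbar, ybar = sum(x) / n, sum(y) / n
slope = (sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y))
         / sum((xi - xbar) ** 2 for xi in x))
T_fit = -1.0 / (k_B * slope)
print(round(T_fit, 1))      # recovers 800.0 K on noise-free data
```

With real, noisy line intensities, the scatter about the fitted line is what sets the quoted precision.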
On the complete and partial integrability of non-Hamiltonian systems
NASA Astrophysics Data System (ADS)
Bountis, T. C.; Ramani, A.; Grammaticos, B.; Dorizzi, B.
1984-11-01
The methods of singularity analysis are applied to several third-order non-Hamiltonian systems of physical significance, including the Lotka-Volterra equations, the three-wave interaction, and the Rikitake dynamo model. Complete integrability is defined and new completely integrable systems are discovered by means of the Painlevé property. In all these cases we obtain integrals which reduce the equations either to a final quadrature or to an irreducible second-order ordinary differential equation (ODE) solved by Painlevé transcendents. Relaxing the Painlevé property, we find many partially integrable cases whose movable singularities are poles at leading order, with ln(t - t0) terms entering at higher orders. In an Nth-order generalized Rössler model, a precise relation is established between the partial fulfillment of the Painlevé conditions and the existence of N - 2 integrals of the motion.
Prospects for Precision Neutrino Cross Section Measurements
DOE Office of Scientific and Technical Information (OSTI.GOV)
Harris, Deborah A.
2016-01-28
The need for precision cross section measurements is more urgent now than ever before, given the central role neutrino oscillation measurements play in the field of particle physics. The definition of precision is something worth considering, however. In order to build the best model for an oscillation experiment, cross section measurements should span a broad range of energies, neutrino interaction channels, and target nuclei. Precision might better be defined not in the final uncertainty associated with any one measurement but rather with the breadth of measurements that are available to constrain models. Current experience shows that models are better constrained by 10 measurements across different processes and energies with 10% uncertainties than by one measurement of one process on one nucleus with a 1% uncertainty. This article describes the current status of and future prospects for the field of precision cross section measurements considering the metric of how many processes, energies, and nuclei have been studied.
DOE Office of Scientific and Technical Information (OSTI.GOV)
DeAngelis, K.M.; Gladden, J.G.; Allgaier, M.
2010-03-01
Producing cellulosic biofuels from plant material has recently emerged as a key U.S. Department of Energy goal. For this technology to be commercially viable on a large scale, it is critical to make production cost efficient by streamlining both the deconstruction of lignocellulosic biomass and fuel production. Many natural ecosystems efficiently degrade lignocellulosic biomass and harbor enzymes that, when identified, could be used to increase the efficiency of commercial biomass deconstruction. However, ecosystems most likely to yield relevant enzymes, such as tropical rain forest soil in Puerto Rico, are often too complex for enzyme discovery using current metagenomic sequencing technologies. One potential strategy to overcome this problem is to selectively cultivate the microbial communities from these complex ecosystems on biomass under defined conditions, generating less complex biomass-degrading microbial populations. To test this premise, we cultivated microbes from Puerto Rican soil or green waste compost under precisely defined conditions in the presence of dried ground switchgrass (Panicum virgatum L.) or lignin, respectively, as the sole carbon source. Phylogenetic profiling of the two feedstock-adapted communities using SSU rRNA gene amplicon pyrosequencing or phylogenetic microarray analysis revealed that the adapted communities were significantly simplified compared to the natural communities from which they were derived. Several members of the lignin-adapted and switchgrass-adapted consortia are related to organisms previously characterized as biomass degraders, while others were from less well-characterized phyla. The decrease in complexity of these communities makes them good candidates for metagenomic sequencing and will likely enable the reconstruction of a greater number of full length genes, leading to the discovery of novel lignocellulose-degrading enzymes adapted to feedstocks and conditions of interest.
Coenen, M
1998-03-01
Veterinary control of husbandry with regard to animal welfare requires an assessment of the nutritional status of farm animals. The situation of greatest importance is suspected undernutrition. A prolonged shortfall in nutrient and energy supply results in mobilisation of body fat as well as body protein. Protein depletion in particular entails a loss of capacity in several essential functions, e.g. of the immune system or the respiratory tract. Undernutrition is often classified as stress, but the typical parameters of stress-related reactions offer insufficient information to evaluate a case of undernutrition. A useful tool for assessing the nutritional status of an animal is the amount of body fat determined by sonographic measurement. Processes related to reproduction are rather sensitive to a reduction of body fat, although they are less expensive from an energetic point of view than exercise or milk production. Measuring body fat offers the opportunity to describe the degree of undernutrition and to judge whether a malnourished animal is harmed according to the definitions of animal welfare. However, the equipment and experience needed for sonographic methods are often not available to the veterinarians responsible for official control of husbandry. Alternatively, the visual and manual procedures for examining defined body areas, mainly related to back fat thickness and well known as body condition scoring, can be used. The body condition scoring systems defined for cows, sheep, and horses have been tested in various experiments with regard to accuracy and reproducibility. They fully meet the precision demanded to evaluate body fat and, in consequence, the nutritional status of an animal.
A comprehensive molecular cytogenetic analysis of chromosome rearrangements in gibbons
Capozzi, Oronzo; Carbone, Lucia; Stanyon, Roscoe R.; Marra, Annamaria; Yang, Fengtang; Whelan, Christopher W.; de Jong, Pieter J.; Rocchi, Mariano; Archidiacono, Nicoletta
2012-01-01
Chromosome rearrangements in small apes are up to 20 times more frequent than in most mammals. Because of their complexity, the full extent of chromosome evolution in these hominoids is not yet fully documented. However, previous work with array painting, BAC-FISH, and selective sequencing in two of the four karyomorphs has shown that high-resolution methods can precisely define chromosome breakpoints and map the complex flow of evolutionary chromosome rearrangements. Here we use these tools to precisely define the rearrangements that have occurred in the remaining two karyomorphs, genera Symphalangus (2n = 50) and Hoolock (2n = 38). This research provides the most comprehensive insight into the evolutionary origins of the chromosome rearrangements involved in transforming the small ape genome. Bioinformatics analyses of the human–gibbon synteny breakpoints revealed association with transposable elements and segmental duplications, providing some insight into the mechanisms that might have promoted rearrangements in small apes. In the near future, the comparison of gibbon genome sequences will provide novel insights to test hypotheses concerning the mechanisms of chromosome evolution. The precise definition of synteny block boundaries and orientation, chromosomal fusions, and centromere repositioning events presented here will facilitate genome sequence assembly for these close relatives of humans. PMID:22892276
Veraart, Jelle; Sijbers, Jan; Sunaert, Stefan; Leemans, Alexander; Jeurissen, Ben
2013-11-01
Linear least squares estimators are widely used in diffusion MRI for the estimation of diffusion parameters. Although adding proper weights is necessary to increase the precision of these linear estimators, there is no consensus on how to practically define them. In this study, the impact of the commonly used weighting strategies on the accuracy and precision of linear diffusion parameter estimators is evaluated and compared with the nonlinear least squares estimation approach. Simulation and real data experiments were done to study the performance of the weighted linear least squares estimators with weights defined by (a) the squares of the respective noisy diffusion-weighted signals; and (b) the squares of the predicted signals, which are reconstructed from a previous estimate of the diffusion model parameters. The negative effect of weighting strategy (a) on the accuracy of the estimator was surprisingly high. Multi-step weighting strategies yield better performance and, in some cases, even outperformed the nonlinear least squares estimator. If proper weighting strategies are applied, the weighted linear least squares approach shows high performance characteristics in terms of accuracy/precision and may even be preferred over nonlinear estimation methods. Copyright © 2013 Elsevier Inc. All rights reserved.
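Weighting strategy (a) from the abstract, weights equal to the squares of the noisy diffusion-weighted signals, can be sketched for the simplest mono-exponential signal model, fit in log space as ln S = ln S0 - b*D. The b-values, parameters, and noise level below are illustrative, not those of the paper:

```python
import numpy as np

# Weighted linear least squares for a mono-exponential diffusion decay
# S = S0 * exp(-b * D), fit in log space: ln S = ln S0 - b * D.
# Weights are the squares of the noisy signals (strategy (a) above).
# b-values, parameters, and noise level are illustrative.

rng = np.random.default_rng(0)
b = np.array([0.0, 200.0, 400.0, 600.0, 800.0, 1000.0])  # s/mm^2
S0_true, D_true = 1000.0, 1.5e-3                         # a.u., mm^2/s
S = S0_true * np.exp(-b * D_true) + rng.normal(0.0, 5.0, b.size)

A = np.column_stack([np.ones_like(b), -b])   # design matrix for [ln S0, D]
y = np.log(S)
W = np.diag(S ** 2)                          # WLLS weights w_i = S_i^2

beta = np.linalg.solve(A.T @ W @ A, A.T @ W @ y)
S0_hat, D_hat = np.exp(beta[0]), beta[1]
print(S0_hat, D_hat)
```

Strategy (b) would replace `W` with the squares of predicted signals from a previous parameter estimate, iterating this solve; the paper's finding is that such multi-step weighting behaves better at realistic noise levels.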
Automatic seed selection for segmentation of liver cirrhosis in laparoscopic sequences
NASA Astrophysics Data System (ADS)
Sinha, Rahul; Marcinczak, Jan Marek; Grigat, Rolf-Rainer
2014-03-01
For computer aided diagnosis based on laparoscopic sequences, image segmentation is one of the basic steps which define the success of all further processing. However, many image segmentation algorithms require prior knowledge which is given by interaction with the clinician. We propose an automatic seed selection algorithm for segmentation of liver cirrhosis in laparoscopic sequences which assigns each pixel a probability of being cirrhotic liver tissue or background tissue. Our approach is based on a trained classifier using SIFT and RGB features with PCA. Due to the unique illumination conditions in laparoscopic sequences of the liver, a very low dimensional feature space can be used for classification via logistic regression. The methodology is evaluated on 718 cirrhotic liver and background patches that are taken from laparoscopic sequences of 7 patients. Using a linear classifier we achieve a precision of 91% in a leave-one-patient-out cross-validation. Furthermore, we demonstrate that with logistic probability estimates, seeds with high certainty of being cirrhotic liver tissue can be obtained. For example, our precision of liver seeds increases to 98.5% if only seeds with more than 95% probability of being liver are used. Finally, these automatically selected seeds can be used as priors in Graph Cuts which is demonstrated in this paper.
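The seed-selection idea, keeping only pixels whose logistic probability of being cirrhotic liver exceeds a high threshold, can be sketched on synthetic data. A single invented 1-D feature stands in for the paper's SIFT/RGB features with PCA:

```python
import numpy as np

# Synthetic stand-in for seed selection: one invented 1-D feature separates
# "cirrhotic liver" from "background" patches; plain gradient-descent
# logistic regression supplies probability estimates, and only pixels with
# p > 0.95 are kept as high-certainty liver seeds.

rng = np.random.default_rng(1)
x_liver = rng.normal(2.0, 0.5, 200)
x_back = rng.normal(-2.0, 0.5, 200)
X = np.concatenate([x_liver, x_back])
t = np.concatenate([np.ones(200), np.zeros(200)])  # 1 = liver, 0 = background

w, b = 0.0, 0.0
for _ in range(2000):
    p = 1.0 / (1.0 + np.exp(-(w * X + b)))
    w -= 0.1 * np.mean((p - t) * X)    # gradient of the logistic loss
    b -= 0.1 * np.mean(p - t)

p = 1.0 / (1.0 + np.exp(-(w * X + b)))
seeds = p > 0.95                    # high-certainty liver seeds
precision = np.mean(t[seeds])       # fraction of seeds that are truly liver
print(int(seeds.sum()), precision)
```

As in the paper, raising the probability threshold trades seed count for seed precision; the surviving seeds can then serve as priors for Graph Cuts.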
Validation of an automated system for aliquoting of HIV-1 Env-pseudotyped virus stocks.
Schultz, Anke; Germann, Anja; Fuss, Martina; Sarzotti-Kelsoe, Marcella; Ozaki, Daniel A; Montefiori, David C; Zimmermann, Heiko; von Briesen, Hagen
2018-01-01
The standardized assessments of HIV-specific immune responses are of main interest in the preclinical and clinical stages of HIV-1 vaccine development. In this regard, HIV-1 Env-pseudotyped viruses play a central role for the evaluation of neutralizing antibody profiles and are produced according to Good Clinical Laboratory Practice- (GCLP-) compliant manual and automated procedures. To further improve and complete the automated production cycle, an automated system for aliquoting HIV-1 pseudovirus stocks has been implemented. The automation platform consists of a modified Tecan-based system including a robot platform for handling racks containing 48 cryovials, a Decapper, a tubing pump and a safety device consisting of ultrasound sensors for online liquid level detection of each individual cryovial. With the aim of aliquoting the HIV-1 pseudoviruses in an automated manner under GCLP-compliant conditions, a validation plan was developed in which the acceptance criteria (accuracy, precision, specificity, and robustness) were defined and summarized. By passing the validation experiments described in this article, the automated system for aliquoting has been successfully validated. This allows the standardized and operator-independent distribution of small-scale and bulk amounts of HIV-1 pseudovirus stocks with a precise and reproducible outcome to support upcoming clinical vaccine trials. PMID:29300769
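Acceptance criteria such as accuracy and precision are typically checked on replicate measurements. The sketch below is a generic illustration with hypothetical aliquot volumes and limits, not the validated criteria of the study:

```python
# Hypothetical acceptance check: bias (%) against a target volume and
# precision (%CV) across replicate aliquots, compared with invented limits.

def validate(measured, target, max_bias_pct=10.0, max_cv_pct=15.0):
    """Return (bias %, CV %, passed) for a list of replicate measurements."""
    n = len(measured)
    mean = sum(measured) / n
    var = sum((m - mean) ** 2 for m in measured) / (n - 1)  # sample variance
    cv = 100.0 * var ** 0.5 / mean           # precision as coefficient of variation
    bias = 100.0 * abs(mean - target) / target  # accuracy as % deviation
    return bias, cv, (bias <= max_bias_pct and cv <= max_cv_pct)

bias, cv, ok = validate([0.98, 1.01, 0.99, 1.02, 1.00], target=1.0)  # mL
print(round(bias, 3), round(cv, 3), ok)
```

In a GCLP setting, the limits themselves are fixed in the validation plan before the experiments are run.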
Spatio-temporal conditional inference and hypothesis tests for neural ensemble spiking precision
Harrison, Matthew T.; Amarasingham, Asohan; Truccolo, Wilson
2014-01-01
The collective dynamics of neural ensembles create complex spike patterns with many spatial and temporal scales. Understanding the statistical structure of these patterns can help resolve fundamental questions about neural computation and neural dynamics. Spatio-temporal conditional inference (STCI) is introduced here as a semiparametric statistical framework for investigating the nature of precise spiking patterns from collections of neurons that is robust to arbitrarily complex and nonstationary coarse spiking dynamics. The main idea is to focus statistical modeling and inference, not on the full distribution of the data, but rather on families of conditional distributions of precise spiking given different types of coarse spiking. The framework is then used to develop families of hypothesis tests for probing the spatio-temporal precision of spiking patterns. Relationships among different conditional distributions are used to improve multiple hypothesis testing adjustments and to design novel Monte Carlo spike resampling algorithms. Of special note are algorithms that can locally jitter spike times while still preserving the instantaneous peri-stimulus time histogram (PSTH) or the instantaneous total spike count from a group of recorded neurons. The framework can also be used to test whether first-order maximum entropy models with possibly random and time-varying parameters can account for observed patterns of spiking. STCI provides a detailed example of the generic principle of conditional inference, which may be applicable in other areas of neurostatistical analysis. PMID:25380339
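A minimal version of the jitter idea, resampling each spike uniformly within a fixed window so that per-window counts (a coarse PSTH) are preserved exactly while fine timing is scrambled, can be sketched as follows. The window length and spike times are invented, and the paper's algorithms additionally preserve richer quantities such as the instantaneous PSTH across a neuron group:

```python
import random

# Window ("interval") jitter: each spike is re-drawn uniformly within its own
# fixed window, so per-window spike counts (a coarse PSTH) are preserved
# exactly while fine temporal structure is destroyed. Window length and
# spike times are invented.

def interval_jitter(spikes, window=0.025, seed=0):
    """Return a surrogate spike train with window-level counts preserved."""
    rng = random.Random(seed)
    out = []
    for t in spikes:
        w0 = (t // window) * window        # left edge of the spike's window
        out.append(w0 + rng.random() * window)
    return sorted(out)

train = [0.0101, 0.0120, 0.0430, 0.0860]   # seconds
surrogate = interval_jitter(train)

orig_counts = [int(t // 0.025) for t in train]
surr_counts = [int(t // 0.025) for t in surrogate]
print(sorted(orig_counts) == sorted(surr_counts))  # True: coarse PSTH intact
```

Comparing a precise-timing statistic on the data against its distribution over many such surrogates is the basic form of the hypothesis tests described above.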
Chokron, Sylvie; Dutton, Gordon N.
2016-01-01
Cerebral visual impairment (CVI) has become the primary cause of visual impairment and blindness in children in industrialized countries. Its prevalence has increased sharply, due to increased survival rates of children who sustain severe neurological conditions during the perinatal period. Improved diagnosis has probably contributed to this increase. As in adults, the nature and severity of CVI in children relate to the cause, location and extent of damage to the brain. In the present paper, we define CVI and how this impacts on visual function. We then define developmental coordination disorder (DCD) and discuss the link between CVI and DCD. The neuroanatomical correlates and aetiologies of DCD are also presented in relationship with CVI as well as the consequences of perinatal asphyxia (PA) and preterm birth on the occurrence and nature of DCD and CVI. This paper underlines why there are both clinical and theoretical reasons to disentangle CVI and DCD, and to categorize the features with more precision. In order to offer the most appropriate rehabilitation, we propose a systematic and rapid evaluation of visual function in at-risk children who have survived preterm birth or PA whether or not they have been diagnosed with cerebral palsy or DCD. PMID:27757087
Cancer stem cells: a systems biology view of their role in prognosis and therapy.
Mertins, Susan D
2014-04-01
Evidence has accumulated that characterizes highly tumorigenic cancer cells residing in heterogeneous populations. The accepted term for such a subpopulation is cancer stem cells (CSCs). While many questions still remain about their precise role in the origin, progression, and drug resistance of tumors, it is clear they exist. In this review, a current understanding of the nature of CSC, their potential usefulness in prognosis, and the need to target them will be discussed. In particular, separate studies now suggest that the CSC is plastic in its phenotype, toggling between tumorigenic and nontumorigenic states depending on both intrinsic and extrinsic conditions. Because of this, a static view of gene and protein levels defined by correlations may not be sufficient to either predict disease progression or aid in the discovery and development of drugs to molecular targets leading to cures. Quantitative dynamic modeling, a bottom up systems biology approach whereby signal transduction pathways are described by differential equations, may offer a novel means to overcome the challenges of oncology today. In conclusion, the complexity of CSCs can be captured in mathematical models that may be useful for selecting molecular targets, defining drug action, and predicting sensitivity or resistance pathways for improved patient outcomes.
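The bottom-up dynamic-modeling approach described here can be illustrated with the smallest possible example: two cell states, stem-like (S) and non-stem (N), interconverting at constant rates, capturing the phenotype "toggling" mentioned above. The rates and initial fractions are hypothetical; real CSC models couple many signal-transduction equations:

```python
# Two-state toy model of cancer stem cell (S) / non-stem (N) plasticity:
#   dS/dt = -k_sn * S + k_ns * N
#   dN/dt =  k_sn * S - k_ns * N
# Rates and initial fractions are hypothetical; simple forward-Euler stepping.

def simulate(s0=0.1, n0=0.9, k_sn=0.02, k_ns=0.01, dt=0.01, steps=100_000):
    s, n = s0, n0
    for _ in range(steps):
        ds = -k_sn * s + k_ns * n
        s, n = s + dt * ds, n - dt * ds   # total cell fraction is conserved
    return s, n

s, n = simulate()
# The population relaxes to the equilibrium stem fraction k_ns / (k_sn + k_ns),
# regardless of the starting composition.
print(round(s, 4), round(n, 4))
```

This is why a static snapshot of marker levels can mislead: killing only the S compartment simply lets the dynamics regenerate it, which a model like this makes explicit.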
High-frequency promoter firing links THO complex function to heavy chromatin formation.
Mouaikel, John; Causse, Sébastien Z; Rougemaille, Mathieu; Daubenton-Carafa, Yves; Blugeon, Corinne; Lemoine, Sophie; Devaux, Frédéric; Darzacq, Xavier; Libri, Domenico
2013-11-27
The THO complex is involved in transcription, genome stability, and messenger ribonucleoprotein (mRNP) formation, but its precise molecular function remains enigmatic. Under heat shock conditions, THO mutants accumulate large protein-DNA complexes that alter the chromatin density of target genes (heavy chromatin), defining a specific biochemical facet of THO function and a powerful tool of analysis. Here, we show that heavy chromatin distribution is dictated by gene boundaries and that the gene promoter is necessary and sufficient to convey THO sensitivity in these conditions. Single-molecule fluorescence in situ hybridization measurements show that heavy chromatin formation correlates with an unusually high firing pace of the promoter with more than 20 transcription events per minute. Heavy chromatin formation closely follows the modulation of promoter firing and strongly correlates with polymerase occupancy genome wide. We propose that the THO complex is required for tuning the dynamic of gene-nuclear pore association and mRNP release to the same high pace of transcription initiation. Copyright © 2013 The Authors. Published by Elsevier Inc. All rights reserved.
Huang, Ling; Holtzinger, Audrey; Jagan, Ishaan; BeGora, Michael; Lohse, Ines; Ngai, Nicholas; Nostro, Cristina; Wang, Rennian; Muthuswamy, Lakshmi B; Crawford, Howard C; Arrowsmith, Cheryl; Kalloger, Steve E; Renouf, Daniel J; Connor, Ashton A; Cleary, Sean; Schaeffer, David F; Roehrl, Michael; Tsao, Ming-Sound; Gallinger, Steven; Keller, Gordon; Muthuswamy, Senthil K
2015-11-01
There are few in vitro models of exocrine pancreas development and primary human pancreatic adenocarcinoma (PDAC). We establish three-dimensional culture conditions to induce the differentiation of human pluripotent stem cells into exocrine progenitor organoids that form ductal and acinar structures in culture and in vivo. Expression of mutant KRAS or TP53 in progenitor organoids induces mutation-specific phenotypes in culture and in vivo. Expression of TP53(R175H) induces cytosolic SOX9 localization. In patient tumors bearing TP53 mutations, SOX9 was cytoplasmic and associated with mortality. We also define culture conditions for clonal generation of tumor organoids from freshly resected PDAC. Tumor organoids maintain the differentiation status, histoarchitecture and phenotypic heterogeneity of the primary tumor and retain patient-specific physiological changes, including hypoxia, oxygen consumption, epigenetic marks and differences in sensitivity to inhibition of the histone methyltransferase EZH2. Thus, pancreatic progenitor organoids and tumor organoids can be used to model PDAC and for drug screening to identify precision therapy strategies.
Desilication of ZSM-5 zeolites for mesoporosity development using microwave irradiation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hasan, Zubair; Jun, Jong Won; Kim, Chul-Ung
2015-01-15
Highlights: • Microwaves have beneficial effects on the desilication of zeolites. • Mesopores produced with microwaves have a narrow pore-size distribution. • Advantages and disadvantages of various desilicating agents are also reported. - Abstract: Mesoporous ZSM-5 zeolite was obtained by desilication in alkaline solutions with microwave (MW) and conventional electric (CE) heating under hydrothermal conditions. Both methods were effective in the production of mesoporous zeolites; however, MW was more efficient than CE, as it led to well-defined mesopores with relatively small sizes and a narrow size distribution within a short treatment time. Moreover, the mesoporous ZSM-5 obtained through this method was effective in producing less bulky products from an acid-catalyzed reaction, specifically the butylation of phenol. Finally, various bases were found to have advantages and disadvantages in desilication. NaOH was the most reactive; however, macroporosity could develop easily under severe conditions. Ammonia water was weakly reactive; however, it could be used to precisely control the pore architecture, and no ion exchange is needed for acid catalysis. Organic amines such as ethylenediamine can also be used in desilication.
NASA Astrophysics Data System (ADS)
Xiang, Huazhong; Guo, Hang; Fu, Dongxiang; Zheng, Gang; Zhuang, Songlin; Chen, JiaBi; Wang, Cheng; Wu, Jie
2018-05-01
To precisely measure the whole-surface characterization of freeform progressive addition lenses (PALs), considering multi-optical-axis conditions is becoming particularly important. Spherical power and astigmatism (cylinder) measurements for freeform PALs using a Hartmann-Shack wavefront sensor (HSWFS) are proposed herein. Conversion formulas for the optical performance results were provided as HSWFS Zernike polynomial expansions. For each selected zone, the studied PALs were placed and tilted to simulate the multi-optical-axis conditions. The results for two tested PALs were analyzed using MATLAB programs and represented as contour plots of the spherical equivalent and cylinder over the whole surface. The proposed experimental setup provides high accuracy as well as the possibility of choosing 12 lines and 193 measurement-zone positions on the entire surface. This approach to PAL analysis is potentially an efficient and useful method to objectively evaluate optical performance, in which the full lens surface is defined and expressed as contour plots of power in the different regions of the lens (i.e., the distance, progressive, and near regions) for regions of interest.
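Converting second-order Zernike coefficients into the familiar sphere/cylinder quantities is usually done with the standard power-vector formulas (spherical equivalent M and cross-cylinders J0, J45); the paper's own conversion formulas are not reproduced here, and the coefficient values below are invented:

```python
import math

# Standard power-vector conversion from second-order Zernike coefficients
# c(2,0), c(2,2), c(2,-2) (in micrometres, over a pupil of radius r_mm in
# millimetres) to spherical equivalent M and cross-cylinders J0, J45 in
# dioptres. Coefficient values are invented for illustration.

def zernike_to_power(c20, c22, c2m2, r_mm):
    M = -4.0 * math.sqrt(3.0) * c20 / r_mm ** 2
    J0 = -2.0 * math.sqrt(6.0) * c22 / r_mm ** 2
    J45 = -2.0 * math.sqrt(6.0) * c2m2 / r_mm ** 2
    cyl = -2.0 * math.hypot(J0, J45)       # negative-cylinder convention
    sph = M - cyl / 2.0
    return M, sph, cyl

M, sph, cyl = zernike_to_power(c20=-1.0, c22=0.2, c2m2=0.0, r_mm=3.0)
print(round(M, 3), round(sph, 3), round(cyl, 3))
```

Evaluating such a conversion zone by zone over the lens surface is what produces the contour plots of spherical equivalent and cylinder described in the abstract.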
NASA Astrophysics Data System (ADS)
Wartmann, David; Rothbauer, Mario; Kuten, Olga; Barresi, Caterina; Visus, Carmen; Felzmann, Thomas; Ertl, Peter
2015-09-01
The combination of microfabrication-based technologies with cell biology has laid the foundation for the development of advanced in vitro diagnostic systems capable of evaluating cell cultures under defined, reproducible and standardizable measurement conditions. In the present review we describe recent lab-on-a-chip developments for cell analysis and how these methodologies could improve standard quality control in the field of manufacturing cell-based vaccines for clinical purposes. In particular, we highlight the regulatory requirements for advanced cell therapy applications, using dendritic cell-based cancer vaccines as an example, to describe the tangible advantages of microfluidic devices, which overcome most of the challenges associated with the automation, miniaturization and integration of cell-based assays. As its main advantage, lab-on-a-chip technology allows for precise regulation of culturing conditions while simultaneously monitoring cell-relevant parameters using embedded sensory systems. State-of-the-art lab-on-a-chip platforms for the in vitro assessment of cell cultures and their potential future applications for cell therapies and cancer immunotherapy are discussed in the present review.
Role of diode lasers in oro-facial pain management.
Javed, F; Kellesarian, S V; Romanos, G E
2017-01-01
With the increasing use of low level laser therapy (LLLT) in clinical dentistry, the aim of the present study was to assess the effectiveness of diode lasers in the management of orofacial pain. Indexed databases were searched without language and time restrictions up to and including July 2016 using different combinations of the following key words: oral, low level laser therapy, dental, pain, diode lasers, discomfort and analgesia. From the literature reviewed it is evident that LLLT is effective compared to traditional procedures in the management of orofacial pain associated with soft tissue and hard tissue conditions such as premalignant lesions, gingival conditions and dental extractions. However, it remains to be determined which particular wavelength will produce the most favorable and predictable outcome in terms of pain reduction. It is highly recommended that further randomized controlled trials with well-defined control groups be performed to determine the precise wavelengths of diode lasers for the management of orofacial pain. Within the limits of the present review, it is concluded that diode laser therapy is more effective in the management of orofacial pain than traditional procedures.
Building Cre Knockin Rat Lines Using CRISPR/Cas9.
Ma, Yuanwu; Zhang, Lianfeng; Huang, Xingxu
2017-01-01
The conditional gene inactivation strategy helps researchers to study gene functions that are critical in embryogenesis or in defined tissues in adulthood. The Cre/loxP system is widely used for conditional gene inactivation/activation in cells or organisms. Cre knockin animal lines are essential for gene expression or inactivation in a spatially and temporally restricted manner. However, generating a Cre knockin line by the traditional approach is laborious. Recently, the clustered regularly interspaced short palindromic repeats/CRISPR-associated protein 9 (CRISPR/Cas9) system has been proven to be a simple and efficient genome-editing tool. We have used the CRISPR/Cas9 system to generate rat strains that carry Cre genes in different targeted gene loci by direct delivery of gRNAs/Cas9/donors into fertilized eggs. Here, we describe a stepwise procedure for the generation of Cre knockin rats, including target site selection, RNA preparation, construction of the template donor, pronuclear injection, and genotyping of the precise Cre insertion in F0 rats. Taken together, the establishment of a Cre knockin line can be achieved within 6 weeks.
NASA Astrophysics Data System (ADS)
Borecki, M.; Prus, P.; Korwin-Pawlowski, M. L.; Rychlik, A.; Kozubel, W.
2017-08-01
Modern rims and wheels are tested at the design and production stages. Tests can be performed in laboratory conditions and on the road. In the laboratory, complex and costly equipment is used, for example wheel balancers and impact testers. Modern wheel balancers are equipped with electronic and electro-mechanical units that enable touch-less measurement of dimensions, including precision measurement of radial and lateral wheel run-out, automatic positioning and application of the counterweights, and vehicle wheel set monitoring - tread wear, drift angles and run-out unbalance. Those tests are performed by on-wheel-axis measurements with laser distance meters. The impact tester enables dropping of weights from a defined height onto a wheel. The test criteria are the loss of tire pressure and the generation of cracks in the wheel without direct impact of the falling weights. In the present paper, a set-up composed of three accelerometers, a temperature sensor and a pressure sensor is examined as the basis of a wheel tester. The sensor set-up configuration, on-line diagnostics and signal transmission are discussed.
Liu, Shiau-Hua; Dosher, Barbara Anne; Lu, Zhong-Lin
2009-06-01
Multiple attributes of a single object are often processed more easily than attributes of different objects, a phenomenon associated with object attention. Here we investigate the influence of two factors, judgment frames and judgment precision, on dual-object report deficits as an index of object attention. [Han, S., Dosher, B., & Lu, Z.-L. (2003). Object attention revisited: Identifying mechanisms and boundary conditions. Psychological Science, 14, 598-604] predicted that consistency of the frame for judgments about two separate objects could reduce or eliminate the expression of object attention limitations. The current studies examine the effects of judgment frames and of task precision in orientation identification and find that dual-object report deficits within one feature are indeed affected modestly by the congruency of the judgments and more substantially by the required precision of the judgments. The observed dual-object deficits affected contrast thresholds for incongruent frame conditions and for high precision judgments and reduced psychometric asymptotes. These dual-object deficits reflect a combined effect of multiplicative noise and external noise exclusion in dual-object conditions, both related to the effects of attention on the tuning of perceptual templates. These results have implications for the modification of object attention theory and for understanding limitations on concurrent tasks.
Bravo, G; Bragança, S; Arezes, P M; Molenbroek, J F M; Castellucci, H I
2018-05-22
Despite offering many benefits, the direct manual anthropometric measurement method can be problematic due to its vulnerability to measurement errors. The purpose of this literature review was to determine whether or not currently published anthropometric studies of school children, related to ergonomics, mentioned or evaluated precision, reliability or accuracy in the direct manual measurement method. Two bibliographic databases, and the bibliographic references of all the selected papers, were used to find relevant published papers in the fields considered in this study. Forty-six (46) studies met the criteria previously defined for this literature review. However, only ten (10) studies mentioned at least one of the analyzed variables, and none evaluated all of them. Only reliability was assessed, by three papers. Moreover, regarding the factors that affect precision, reliability and accuracy, the reviewed papers presented large differences. This was particularly clear in the instruments used for the measurements, which were not consistent across the studies. Additionally, it was also clear that there was a lack of information regarding the evaluators' training and the procedures for anthropometric data collection, which are assumed to be the most important issues affecting precision, reliability and accuracy. Based on the review of the literature, it was possible to conclude that the considered anthropometric studies had not focused their attention on the analysis of the precision, reliability and accuracy of manual measurement methods. Hence, with the aim of avoiding measurement errors and misleading data, anthropometric studies should put more effort and care into testing measurement error and defining the procedures used to collect anthropometric data.
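Intra-observer precision in direct manual anthropometry is commonly summarized with the technical error of measurement (TEM), a metric largely absent from the studies reviewed above. The following is an illustrative sketch only; the function names and the repeated stature values are hypothetical, not taken from any reviewed study:

```python
import math

def technical_error_of_measurement(trial_a, trial_b):
    """Intra-observer TEM for one measurer's paired repeats:
    sqrt(sum of squared differences / 2n), in the measurement's units."""
    if len(trial_a) != len(trial_b) or not trial_a:
        raise ValueError("need two equal-length, non-empty measurement series")
    sq_diffs = [(a - b) ** 2 for a, b in zip(trial_a, trial_b)]
    return math.sqrt(sum(sq_diffs) / (2 * len(trial_a)))

def relative_tem(trial_a, trial_b):
    """TEM as a percentage of the grand mean, comparable across dimensions."""
    grand_mean = (sum(trial_a) + sum(trial_b)) / (2 * len(trial_a))
    return 100.0 * technical_error_of_measurement(trial_a, trial_b) / grand_mean

# Hypothetical repeated stature measurements (cm) of five school children
trial1 = [132.1, 140.4, 128.9, 135.0, 142.2]
trial2 = [132.5, 140.0, 129.3, 135.2, 141.8]
```

Relative TEM expresses the error as a percentage of the mean, so precision can be compared across body dimensions of very different magnitudes.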
Newman, John H; Rich, Stuart; Abman, Steven H; Alexander, John H; Barnard, John; Beck, Gerald J; Benza, Raymond L; Bull, Todd M; Chan, Stephen Y; Chun, Hyung J; Doogan, Declan; Dupuis, Jocelyn; Erzurum, Serpil C; Frantz, Robert P; Geraci, Mark; Gillies, Hunter; Gladwin, Mark; Gray, Michael P; Hemnes, Anna R; Herbst, Roy S; Hernandez, Adrian F; Hill, Nicholas S; Horn, Evelyn M; Hunter, Kendall; Jing, Zhi-Cheng; Johns, Roger; Kaul, Sanjay; Kawut, Steven M; Lahm, Tim; Leopold, Jane A; Lewis, Greg D; Mathai, Stephen C; McLaughlin, Vallerie V; Michelakis, Evangelos D; Nathan, Steven D; Nichols, William; Page, Grier; Rabinovitch, Marlene; Rich, Jonathan; Rischard, Franz; Rounds, Sharon; Shah, Sanjiv J; Tapson, Victor F; Lowy, Naomi; Stockbridge, Norman; Weinmann, Gail; Xiao, Lei
2017-06-15
The Division of Lung Diseases of the NHLBI and the Cardiovascular Medical Education and Research Fund held a workshop to discuss how to leverage the anticipated scientific output from the recently launched "Redefining Pulmonary Hypertension through Pulmonary Vascular Disease Phenomics" (PVDOMICS) program to develop newer approaches to pulmonary vascular disease. PVDOMICS is a collaborative, protocol-driven network to analyze all patient populations with pulmonary hypertension to define novel pulmonary vascular disease (PVD) phenotypes. Stakeholders, including basic, translational, and clinical investigators; clinicians; patient advocacy organizations; regulatory agencies; and pharmaceutical industry experts, joined to discuss the application of precision medicine to PVD clinical trials. Recommendations were generated for discussion of research priorities in line with NHLBI Strategic Vision Goals that include: (1) A national effort, involving all the stakeholders, should seek to coordinate biosamples and biodata from all funded programs to a web-based repository so that information can be shared and correlated with other research projects. Example programs sponsored by NHLBI include PVDOMICS, Pulmonary Hypertension Breakthrough Initiative, the National Biological Sample and Data Repository for PAH, and the National Precision Medicine Initiative. (2) A task force to develop a master clinical trials protocol for PVD to apply precision medicine principles to future clinical trials. Specific features include: (a) adoption of smaller clinical trials that incorporate biomarker-guided enrichment strategies, using adaptive and innovative statistical designs; and (b) development of newer endpoints that reflect well-defined and clinically meaningful changes. 
(3) Development of updated and systematic variables in imaging, hemodynamic, cellular, genomic, and metabolic tests that will help precisely identify individual and shared features of PVD and serve as the basis of novel phenotypes for therapeutic interventions.
Does precision decrease with set size?
Mazyar, Helga; van den Berg, Ronald; Ma, Wei Ji
2012-01-01
The brain encodes visual information with limited precision. Contradictory evidence exists as to whether the precision with which an item is encoded depends on the number of stimuli in a display (set size). Some studies have found evidence that precision decreases with set size, but others have reported constant precision. These groups of studies differed in two ways. The studies that reported a decrease used displays with heterogeneous stimuli and tasks with a short-term memory component, while the ones that reported constancy used homogeneous stimuli and tasks that did not require short-term memory. To disentangle the effects of heterogeneity and short-term memory involvement, we conducted two main experiments. In Experiment 1, stimuli were heterogeneous, and we compared a condition in which target identity was revealed before the stimulus display with one in which it was revealed afterward. In Experiment 2, target identity was fixed, and we compared heterogeneous and homogeneous distractor conditions. In both experiments, we compared an optimal-observer model in which precision is constant with set size with one in which it depends on set size. We found that precision decreases with set size when the distractors are heterogeneous, regardless of whether short-term memory is involved, but not when they are homogeneous. This suggests that heterogeneity, not short-term memory, is the critical factor. In addition, we found that precision exhibits variability across items and trials, which may partly be caused by attentional fluctuations. PMID:22685337
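The two competing accounts contrasted above can be caricatured as a one-parameter family: encoding precision J(N) = J1 · N^(−α), where α = 0 gives constant precision and α > 0 gives precision that declines with set size. A minimal simulation sketch, with illustrative parameter values and function names not drawn from the study, and simple Gaussian noise standing in for the circular noise used in the actual modeling:

```python
import math
import random
import statistics

def precision_at_set_size(j1, alpha, set_size):
    """Power-law precision model: J(N) = J1 * N**(-alpha).
    alpha = 0 recovers the constant-precision account; alpha > 0 means
    each item is encoded less precisely as set size grows."""
    return j1 * set_size ** (-alpha)

def simulate_error_sd(j1, alpha, set_size, trials=5000, seed=1):
    """Standard deviation of simulated estimation errors, drawn with
    Gaussian encoding noise of variance 1/J."""
    rng = random.Random(seed)
    sd = math.sqrt(1.0 / precision_at_set_size(j1, alpha, set_size))
    return statistics.stdev(rng.gauss(0.0, sd) for _ in range(trials))
```

Under this sketch, the set-size-dependent model (α > 0) predicts larger estimation errors at set size 8 than at set size 1, while the constant model (α = 0) predicts no change, which is the distinction the experiments test.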
Wrap spring clutch syringe ram and frit mixer
Simpson, Frank B.
2006-07-25
A wrap spring clutch syringe ram pushes at least one syringe with virtually instantaneous starting and stopping, and with constant motion at a defined velocity during the intervening push. The wrap spring clutch syringe ram includes an electric motor, a computer, a flywheel, a wrap spring clutch, a precision lead screw, a slide platform, syringe reservoirs, a mixing chamber, and a reaction incubation tube. The electric motor drives the flywheel, and the wrap spring clutch couples the precision lead screw to the flywheel when the computer enables a solenoid of the wrap spring clutch. The precision lead screw drives a precision slide, which causes the syringes to supply a portion of solution into the mixing chamber and the incubation tube. The wrap spring clutch syringe ram is designed to enable the quantitative study of solution-phase chemical and biochemical reactions, particularly those reactions that occur on the subsecond time scale.
Methods for the Precise Locating and Forming of Arrays of Curved Features into a Workpiece
Gill, David Dennis; Keeler, Gordon A.; Serkland, Darwin K.; Mukherjee, Sayan D.
2008-10-14
Methods are described for manufacturing high precision arrays of curved features (e.g. lenses) in the surface of a workpiece, utilizing orthogonal sets of inter-fitting locating grooves to mate a workpiece to a workpiece holder mounted on the spindle face of a rotating machine tool. The matching inter-fitting groove sets in the workpiece and the chuck allow the workpiece to be precisely and non-kinematically indexed to locations defined in two orthogonal directions perpendicular to the turning axis of the machine tool. At each location on the workpiece a curved feature can then be machined on-center to create arrays of curved features on the workpiece. The averaging effect of the corresponding sets of inter-fitting grooves provides for precise repeatability in determining the relative locations of the centers of each of the curved features in an array.
Precision Medicine: From Science to Value
Ginsburg, Geoffrey S; Phillips, Kathryn A
2018-01-01
Precision medicine is poised to have an impact on patients, health care delivery systems and research participants in ways that were only imagined 15 years ago when the human genome was first sequenced. While discovery using genome-based technologies has accelerated, these have only begun to be adopted into clinical medicine. Here we define precision medicine and the stakeholder ecosystem required to enable its integration into research and health care. We explore the intersection of data science, analytics and precision medicine in creating a learning health system that carries out research in the context of clinical care and at the same time optimizes the tools and information used to deliver improved patient outcomes. We provide examples of real-world impact, and conclude with a policy and economic agenda that will be necessary for the adoption of this new paradigm of health care both in the United States and globally. PMID:29733705
Ma, Shu-Ching; Li, Yu-Chi; Yui, Mei-Shu
2014-01-01
Background Workplace bullying is a prevalent problem in contemporary workplaces that has adverse effects on both the victims of bullying and organizations. With the rapid development of computer technology in recent years, there is an urgent need to determine whether item response theory–based computerized adaptive testing (CAT) can be applied to measure exposure to workplace bullying. Objective The purpose of this study was to evaluate the relative efficiency and measurement precision of a CAT-based test for hospital nurses compared to traditional nonadaptive testing (NAT). Under the preliminary condition of a single domain derived from the scale, a CAT module bullying scale model with polytomously scored items is provided as an example for evaluation purposes. Methods A total of 300 nurses were recruited and responded to the 22-item Negative Acts Questionnaire-Revised (NAQ-R). All NAT (or CAT-selected) items were calibrated with the Rasch rating scale model, and all respondents were randomly selected for a comparison of the advantages of CAT and NAT in efficiency and precision by paired t tests and the area under the receiver operating characteristic curve (AUROC). Results The NAQ-R is a unidimensional construct that can be applied to measure exposure to workplace bullying through CAT-based administration. Nursing measures derived from both tests (CAT and NAT) were highly correlated (r=.97) and their measurement precisions were not statistically different (P=.49), as expected. CAT required fewer items than NAT (an efficiency gain of 32%), suggesting a reduced burden for respondents. There were significant differences in work tenure between the 2 groups (bullied and nonbullied) at a cutoff point of 6 years at 1 worksite. An AUROC of 0.75 (95% CI 0.68-0.79) with logits greater than –4.2 (or >30 in summation) was defined as indicating a high likelihood of being bullied in the workplace.
Conclusions With CAT-based administration of the NAQ-R for nurses, their burden was substantially reduced without compromising measurement precision. PMID:24534113
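The AUROC reported above is equivalent to the probability that a randomly chosen bullied nurse's measure exceeds that of a randomly chosen non-bullied nurse (the Mann-Whitney interpretation). A small sketch of that computation, using hypothetical logit measures rather than the study's data:

```python
def auroc(pos_scores, neg_scores):
    """Area under the ROC curve via the Mann-Whitney statistic: the
    probability that a random positive case scores above a random
    negative case, counting ties as one half."""
    wins = 0.0
    for p in pos_scores:
        for n in neg_scores:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(pos_scores) * len(neg_scores))

# Hypothetical NAQ-R person measures (logits): bullied vs non-bullied nurses
bullied = [-3.8, -2.9, -4.0, -1.5]
not_bullied = [-5.1, -3.5, -4.4, -5.9]
```

An AUROC of 0.5 indicates no discrimination and 1.0 perfect separation, so the study's 0.75 sits between chance and perfect classification.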
Siebers, Jeffrey V
2008-04-04
Monte Carlo (MC) is rarely used for IMRT plan optimization outside of research centres due to the extensive computational resources or long computation times required to complete the process. Time can be reduced by degrading the statistical precision of the MC dose calculation used within the optimization loop. However, this eventually introduces optimization convergence errors (OCEs). This study determines the statistical noise levels tolerated during MC-IMRT optimization under the condition that the optimized plan has OCEs <100 cGy (1.5% of the prescription dose) for MC-optimized IMRT treatment plans.Seven-field prostate IMRT treatment plans for 10 prostate patients are used in this study. Pre-optimization is performed for deliverable beams with a pencil-beam (PB) dose algorithm. Further deliverable-based optimization proceeds using: (1) MC-based optimization, where dose is recomputed with MC after each intensity update or (2) a once-corrected (OC) MC-hybrid optimization, where a MC dose computation defines beam-by-beam dose correction matrices that are used during a PB-based optimization. Optimizations are performed with nominal per beam MC statistical precisions of 2, 5, 8, 10, 15, and 20%. Following optimizer convergence, beams are re-computed with MC using 2% per beam nominal statistical precision and the 2 PTV and 10 OAR dose indices used in the optimization objective function are tallied. For both the MC-optimization and OC-optimization methods, statistical equivalence tests found that OCEs are less than 1.5% of the prescription dose for plans optimized with nominal statistical uncertainties of up to 10% per beam. The achieved statistical uncertainty in the patient for the 10% per beam simulations from the combination of the 7 beams is ~3% with respect to maximum dose for voxels with D>0.5D(max). 
The MC dose computation time for the OC-optimization is only 6.2 minutes on a single 3 GHz processor, with results clinically equivalent to high-precision MC computations.
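The trade-off exploited in the study, degrading per-beam statistical precision to save computation, follows from the 1/√N scaling of Monte Carlo noise: halving the statistical uncertainty requires four times as many particle histories. A toy illustration of that scaling (the uniform-random "dose" tally below is a stand-in, not the study's transport code):

```python
import random
import statistics

def mc_estimate(n_histories, rng):
    """Toy Monte Carlo 'dose' tally: the mean of n random deposition events."""
    return statistics.fmean(rng.random() for _ in range(n_histories))

def empirical_sigma(n_histories, n_repeats=200, seed=0):
    """Spread of repeated MC estimates at a fixed history count."""
    rng = random.Random(seed)
    return statistics.stdev(mc_estimate(n_histories, rng) for _ in range(n_repeats))

# Quadrupling the number of histories should roughly halve the noise (1/sqrt(N))
sigma_1k = empirical_sigma(1_000)
sigma_4k = empirical_sigma(4_000)
```

This is why relaxing the per-beam precision from 2% to 10%, as tolerated above, cuts the required histories by roughly a factor of 25.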
Understanding the Effectiveness of Performance Management Practices
2010-03-01
clarity and precision focus on defining accomplishments and practices. Jones (1995) researched organizational transformation in the Monsanto Company... Performance management in a changing context: Monsanto pioneers a competency-based, developmental approach. Human Resource Management, 34(3), 425-442
Knock-Outs, Stick-Outs, Cut-Outs: Clipping Paths Separate Objects from Background.
ERIC Educational Resources Information Center
Wilson, Bradley
1998-01-01
Outlines a six-step process that allows computer operators, using Photoshop software, to create "knock-outs" to precisely define the path that will serve to separate the object from the background. (SR)
Yokoo, Takeshi; Serai, Suraj D; Pirasteh, Ali; Bashir, Mustafa R; Hamilton, Gavin; Hernando, Diego; Hu, Houchun H; Hetterich, Holger; Kühn, Jens-Peter; Kukuk, Guido M; Loomba, Rohit; Middleton, Michael S; Obuchowski, Nancy A; Song, Ji Soo; Tang, An; Wu, Xinhuai; Reeder, Scott B; Sirlin, Claude B
2018-02-01
Purpose To determine the linearity, bias, and precision of hepatic proton density fat fraction (PDFF) measurements by using magnetic resonance (MR) imaging across different field strengths, imager manufacturers, and reconstruction methods. Materials and Methods This meta-analysis was performed in accordance with Preferred Reporting Items for Systematic Reviews and Meta-Analyses guidelines. A systematic literature search identified studies that evaluated the linearity and/or bias of hepatic PDFF measurements by using MR imaging (hereafter, MR imaging-PDFF) against PDFF measurements by using colocalized MR spectroscopy (hereafter, MR spectroscopy-PDFF) or the precision of MR imaging-PDFF. The quality of each study was evaluated by using the Quality Assessment of Studies of Diagnostic Accuracy 2 tool. De-identified original data sets from the selected studies were pooled. Linearity was evaluated by using linear regression between MR imaging-PDFF and MR spectroscopy-PDFF measurements. Bias, defined as the mean difference between MR imaging-PDFF and MR spectroscopy-PDFF measurements, was evaluated by using Bland-Altman analysis. Precision, defined as the agreement between repeated MR imaging-PDFF measurements, was evaluated by using a linear mixed-effects model, with field strength, imager manufacturer, reconstruction method, and region of interest as random effects. Results Twenty-three studies (1679 participants) were selected for linearity and bias analyses and 11 studies (425 participants) were selected for precision analyses. MR imaging-PDFF was linear with MR spectroscopy-PDFF (R² = 0.96). Regression slope (0.97; P < .001) and mean Bland-Altman bias (-0.13%; 95% limits of agreement: -3.95%, 3.40%) indicated minimal underestimation by using MR imaging-PDFF. MR imaging-PDFF was precise at the region-of-interest level, with repeatability and reproducibility coefficients of 2.99% and 4.12%, respectively.
Field strength, imager manufacturer, and reconstruction method each had minimal effects on reproducibility. Conclusion MR imaging-PDFF has excellent linearity, bias, and precision across different field strengths, imager manufacturers, and reconstruction methods. © RSNA, 2017. Online supplemental material is available for this article. An earlier incorrect version of this article appeared online. This article was corrected on October 2, 2017.
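The bias and 95% limits of agreement quoted above come from a standard Bland-Altman analysis: the mean of the paired differences, and that mean plus or minus 1.96 standard deviations of the differences. A minimal sketch with hypothetical paired PDFF values (not data from the meta-analysis):

```python
import statistics

def bland_altman(method_a, method_b):
    """Bias (mean paired difference) and 95% limits of agreement
    (bias +/- 1.96 SD of the differences) between two methods."""
    diffs = [a - b for a, b in zip(method_a, method_b)]
    bias = statistics.fmean(diffs)
    sd = statistics.stdev(diffs)
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

# Hypothetical paired fat-fraction measurements (%): MR imaging vs spectroscopy
mri_pdff = [4.8, 11.9, 20.5, 7.1, 15.0]
mrs_pdff = [5.0, 12.3, 20.1, 7.6, 15.2]
```

A bias near zero with narrow limits of agreement, as in the pooled analysis above, indicates the two methods can be used interchangeably over the measured range.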
Clinical and Mechanistic Insights into the Genetics of Cardiomyopathy
Burke, Michael A.; Cook, Stuart A.; Seidman, Jonathan G.; Seidman, Christine E.
2018-01-01
Over the last quarter-century, there has been tremendous progress in genetics research that has defined molecular causes for cardiomyopathies. More than a thousand mutations have been identified in many genes with varying ontologies, therein indicating the diverse molecules and pathways that cause hypertrophic, dilated, restrictive, and arrhythmogenic cardiomyopathies. Translation of this research to the clinic via genetic testing can precisely group affected patients according to molecular etiology, and identify individuals without evidence of disease who are at high risk for developing cardiomyopathy. These advances provide insights into the earliest manifestations of cardiomyopathy and help to define the molecular pathophysiological basis for cardiac remodeling. Although these efforts remain incomplete, new genomic technologies and analytic strategies provide unparalleled opportunities to fully explore the genetic architecture of cardiomyopathies. Such data hold the promise that mutation-specific pathophysiology will uncover novel therapeutic targets, and herald the beginning of precision therapy for cardiomyopathy patients. PMID:28007147
Combinatorial Gata2 and Sca1 expression defines hematopoietic stem cells in the bone marrow niche
Suzuki, Norio; Ohneda, Osamu; Minegishi, Naoko; Nishikawa, Mitsuo; Ohta, Takayuki; Takahashi, Satoru; Engel, James Douglas; Yamamoto, Masayuki
2006-01-01
The interaction between stem cells and their supportive microenvironment is critical for their maintenance, function, and survival. Whereas hematopoietic stem cells (HSCs) are among the best characterized of tissue stem cells, their site of residence (referred to as the niche) in the adult bone marrow has not been precisely defined. In this study, we found that a Gata2 promoter directs activity in all HSCs. We show that HSCs can be isolated efficiently from bone marrow cells by following Gata2-directed GFP fluorescence, and that they can also be monitored in vivo. Each individual GFP-positive cell lay in a G0/G1 cell cycle state, in intimate contact with osteoblasts beside the endosteum, at the edge of the bone marrow. We conclude that the HSC niche is composed of solitary cells and that adult bone marrow HSCs are not clustered. PMID:16461905
Time-calibrated Milankovitch cycles for the late Permian.
Wu, Huaichun; Zhang, Shihong; Hinnov, Linda A; Jiang, Ganqing; Feng, Qinglai; Li, Haiyan; Yang, Tianshui
2013-01-01
An important innovation in the geosciences is the astronomical time scale. The astronomical time scale is based on Milankovitch-forced stratigraphy that has been calibrated to astronomical models of paleoclimate forcing; it is defined for much of the Cenozoic and Mesozoic. For the Palaeozoic era, however, astronomical forcing has not been widely explored because of a lack of high-precision geochronology or astronomical modelling. Here we report Milankovitch cycles from late Permian (Lopingian) strata at Meishan and Shangsi, South China, time calibrated by recent high-precision U-Pb dating. The evidence extends empirical knowledge of Earth's astronomical parameters to before 250 million years ago. Observed obliquity and precession terms support a 22-h length-of-day. The reconstructed astronomical time scale indicates a 7.793-million-year duration for the Lopingian epoch, when strong 405-kyr cycles constrain astronomical modelling. This is the first significant advance in defining the Palaeozoic astronomical time scale, anchored to absolute time, bridging the Palaeozoic-Mesozoic transition.
Snyder, David A; Montelione, Gaetano T
2005-06-01
An important open question in the field of NMR-based biomolecular structure determination is how best to characterize the precision of the resulting ensemble of structures. Typically, the RMSD, as minimized in superimposing the ensemble of structures, is the preferred measure of precision. However, the presence of poorly determined atomic coordinates and of multiple "RMSD-stable domains" (locally well-defined regions that are not aligned in global superimpositions) complicates RMSD calculations. In this paper, we present a method, based on a novel, structurally defined order parameter, for identifying a set of core atoms to use in determining superimpositions for RMSD calculations. In addition, we present a method for deciding whether to partition that core atom set into RMSD-stable domains and, if so, how to partition it. We demonstrate our algorithm and its application in calculating statistically sound RMSD values by applying it to a set of NMR-derived structural ensembles, superimposing each RMSD-stable domain (or the entire core atom set, where appropriate) found in each protein structure under consideration. A parameter calculated by our algorithm using a novel, kurtosis-based criterion, the epsilon-value, is a measure of the precision of the superimposition that complements the RMSD. In addition, we compare our algorithm with previously described algorithms for determining core atom sets. The methods presented in this paper for biomolecular structure superimposition are quite general, and have application in many areas of structural bioinformatics and structural biology.
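The RMSD minimized in superimposing an ensemble is conventionally obtained by centering both coordinate sets and applying the SVD-based Kabsch rotation; the paper's contribution lies in choosing which atoms enter this calculation, not in the superposition itself. A minimal sketch of that underlying superposition step (not the authors' algorithm):

```python
import numpy as np

def kabsch_rmsd(P, Q):
    """RMSD between two (n, 3) coordinate sets after optimal rigid-body
    superposition: center both sets, then rotate P onto Q with the
    SVD-based Kabsch rotation."""
    P = P - P.mean(axis=0)                   # remove translation
    Q = Q - Q.mean(axis=0)
    H = P.T @ Q                              # 3x3 covariance of the point sets
    U, S, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))   # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T  # optimal proper rotation
    return float(np.sqrt(((P @ R.T - Q) ** 2).sum() / len(P)))
```

Two conformers that differ only by a rigid-body motion give an RMSD of zero under this procedure, which is why poorly determined atoms and mobile domains, rather than overall placement, dominate ensemble RMSD values.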
Ishida, Junichi; Saitoh, Masakazu; Doehner, Wolfram; von Haehling, Stephan; Anker, Markus; Anker, Stefan D; Springer, Jochen
2017-07-01
Cachexia is defined as a complex metabolic syndrome, associated with underlying illness, that is characterized by the loss of body weight through muscle and fat wasting. Sarcopenia is defined as the ageing-related loss of muscle mass in health and disease, which may not affect body weight. As millions of patients are in cachectic or sarcopenic states, both conditions contribute to large numbers of deaths worldwide. A number of treatments have been proposed for cachexia and sarcopenia, but these are either in the preclinical stage or in clinical trials and hence not available to the general population. Particularly in cachexia there is a massive problem of recruiting patients for trials, and also with follow-up, due to the seriousness of the disease. This underlines the importance of well-characterized animal models. Obviously, most of the widely used cachexia and sarcopenia animal models have limitations in the reproducibility of the condition, and novel models are warranted in this context. A key development in the field is that more forms of these conditions have attracted researchers' interest. In cardiac cachexia, technical issues that limit the precision and reproducibility of surgical heart failure models have been overcome by combining surgery with transgenic mouse models or salt-sensitive rat models. Fatigue is the most pronounced symptom of cachexia and may be caused by reduced cardiac function independent of the underlying disease. Sarcopenia models often suffer from the use of young animals, due to the limited availability and very high cost of aged animals. This review will focus on rodent models designed to mimic cachexia and sarcopenia, including co-morbidities such as cancer, heart failure, and other diseases and conditions. Copyright © 2017 Elsevier B.V. All rights reserved.
Web mining for topics defined by complex and precise predicates
NASA Astrophysics Data System (ADS)
Lee, Ching-Cheng; Sampathkumar, Sushma
2004-04-01
The enormous growth of the World Wide Web has made it important to perform resource discovery efficiently for any given topic. Several new techniques have been proposed in recent years for this kind of topic-specific web mining, among them a key technique called focused crawling, which is able to crawl topic-specific portions of the web without having to explore all pages. Most existing research on focused crawling considers a simple topic definition that typically consists of one or more keywords connected by an OR operator. However, this kind of simple topic definition may result in too many irrelevant pages in which the same keyword appears in the wrong context. In this research we explore new strategies for crawling topic-specific portions of the web using complex and precise predicates. A complex predicate allows the user to precisely specify a topic using Boolean operators such as "AND", "OR" and "NOT". Our work concentrates, first, on defining a format to specify this kind of complex topic definition and, second, on devising a strategy to crawl the topic-specific portions of the web defined by the complex predicate, efficiently and with minimal overhead. Our new crawl strategy improves the performance of topic-specific web crawling by reducing the number of irrelevant pages crawled. To demonstrate the effectiveness of this approach, we have built a complete focused crawler called "Eureka" with complex predicate support, and a search engine that indexes and supports end-user searches on the crawled pages.
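A complex predicate of the kind described can be represented as a nested structure of Boolean operators and keywords and evaluated recursively against the terms found on a page. A minimal sketch (the nested-tuple format is an illustrative assumption, not the paper's actual specification):

```python
def matches(predicate, page_terms):
    """Recursively evaluate a nested Boolean predicate against the set of
    terms found on a page. A bare string matches if the term is present."""
    if isinstance(predicate, str):
        return predicate in page_terms
    op, *args = predicate
    if op == "AND":
        return all(matches(a, page_terms) for a in args)
    if op == "OR":
        return any(matches(a, page_terms) for a in args)
    if op == "NOT":
        return not matches(args[0], page_terms)
    raise ValueError(f"unknown operator: {op}")

# "Pages about soccer that are not about video games"
topic = ("AND", "soccer", ("NOT", "video games"))
```

A crawler using such a predicate can discard pages where a keyword appears only in an excluded context, which is precisely how complex predicates reduce the number of irrelevant pages crawled.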
NASA GRAIL Spacecraft in Science Collection Phase Artist Concept
2012-03-27
An artist's depiction of the twin spacecraft that comprise NASA's GRAIL mission. During the GRAIL mission science phase, the spacecraft Ebb and Flow transmit radio signals precisely defining the distance between them as they orbit the Moon in formation.
Panama Canal Fog Navigation Study : Candidate System Definition
DOT National Transportation Integrated Search
1984-01-01
A candidate system for solving fog navigation problems in the Panama Canal is defined. The vessel monitoring subsystem is a shore-based, all-weather, precision ranging system with ranging accuracies of 9 feet (2 standard deviations, 95 percent).
Precision medicine for advanced prostate cancer
Mullane, Stephanie A.; Van Allen, Eliezer M.
2016-01-01
Purpose of review: Precision cancer medicine, the use of genomic profiling of patient tumors at the point-of-care to inform treatment decisions, is rapidly changing treatment strategies across cancer types. Precision medicine for advanced prostate cancer may identify new treatment strategies and change clinical practice. In this review, we discuss the potential and challenges of precision medicine in advanced prostate cancer. Recent findings: Although primary prostate cancers do not harbor highly recurrent targetable genomic alterations, recent reports on the genomics of metastatic castration-resistant prostate cancer have shown multiple targetable alterations in castration-resistant prostate cancer metastatic biopsies. Therapeutic implications include targeting prevalent DNA repair pathway alterations with PARP-1 inhibition in genomically defined subsets of patients, among other genomically stratified targets. In addition, multiple recent efforts have demonstrated the promise of liquid tumor profiling (e.g., profiling circulating tumor cells or cell-free tumor DNA) and highlighted the necessary steps to scale these approaches in prostate cancer. Summary: Although still in the initial phase of precision medicine for prostate cancer, there is extraordinary potential for clinical impact. Efforts to overcome current scientific and clinical barriers will enable widespread use of precision medicine approaches for advanced prostate cancer patients. PMID:26909474
Precision medicine for advanced prostate cancer.
Mullane, Stephanie A; Van Allen, Eliezer M
2016-05-01
Precision cancer medicine, the use of genomic profiling of patient tumors at the point-of-care to inform treatment decisions, is rapidly changing treatment strategies across cancer types. Precision medicine for advanced prostate cancer may identify new treatment strategies and change clinical practice. In this review, we discuss the potential and challenges of precision medicine in advanced prostate cancer. Although primary prostate cancers do not harbor highly recurrent targetable genomic alterations, recent reports on the genomics of metastatic castration-resistant prostate cancer have shown multiple targetable alterations in castration-resistant prostate cancer metastatic biopsies. Therapeutic implications include targeting prevalent DNA repair pathway alterations with PARP-1 inhibition in genomically defined subsets of patients, among other genomically stratified targets. In addition, multiple recent efforts have demonstrated the promise of liquid tumor profiling (e.g., profiling circulating tumor cells or cell-free tumor DNA) and highlighted the necessary steps to scale these approaches in prostate cancer. Although still in the initial phase of precision medicine for prostate cancer, there is extraordinary potential for clinical impact. Efforts to overcome current scientific and clinical barriers will enable widespread use of precision medicine approaches for advanced prostate cancer patients.
The Lcn2-engineered HEK-293 cells show senescence under stressful condition
Bahmani, Bahareh; Amiri, Fatemeh; Mohammadi Roushandeh, Amaneh; Bahadori, Marzie; Harati, Mozhgan Dehghan; Habibi Roudkenar, Mehryar
2015-01-01
Objective(s): The Lipocalin2 (Lcn2) gene is highly expressed in response to various types of cellular stress. The precise role of Lcn2 has not been fully understood yet; however, it plays a key role in controlling vital cellular processes such as proliferation, apoptosis, and metabolism. Recently it was shown that Lcn2 decreases senescence and increases proliferation of mesenchymal stem cells (MSCs) with finite life span under either normal or oxidative stress conditions. However, the effects of Lcn2 on immortal cell lines with infinite proliferation are not completely defined. Materials and Methods: HEK-293 cells were transfected with recombinant pcDNA3.1 containing the Lcn2 fragment (pcDNA3.1-Lcn2). Expression of lipocalin2 in transfected cells was evaluated by RT-PCR, real-time RT-PCR, and ELISA. Different cell groups were treated with H2O2, and the WST-1 assay was performed to determine their proliferation rate. Senescence was studied by β-galactosidase and Giemsa staining as well as by evaluation of the expression of senescence-related genes by real-time RT-PCR. Results: Lcn2 increased cell proliferation under normal culture conditions, while proliferation slightly decreased under oxidative stress. This decrease was further found to be attributable to senescence. Conclusion: Our findings indicate that under harmful conditions, the Lcn2 gene is responsible for the regulation of cell survival through senescence. PMID:26124931
Billiard, Michel
2007-10-01
Defining the precise nosological limits of narcolepsy and idiopathic hypersomnia is an ongoing process dating back to the first description of the two conditions. The most recent step forward was taken during the preparation of the second edition of the "International Classification of Sleep Disorders", published in June 2005. Appointed by Dr Emmanuel Mignot, the Task Force on "Hypersomnias of central origin, not due to a circadian rhythm sleep disorder, sleep related breathing disorder, or other causes of disturbed nocturnal sleep" thoroughly revisited the nosology of narcolepsy and of idiopathic hypersomnia. Narcolepsy is now divided into three entities: narcolepsy with cataplexy, narcolepsy without cataplexy, and narcolepsy due to a medical condition; idiopathic hypersomnia is divided into two entities: idiopathic hypersomnia with long sleep time and idiopathic hypersomnia without long sleep time. Nevertheless there are still a number of pending issues. What are the limits of narcolepsy without cataplexy? Is there a continuum in the pathophysiology of narcolepsy with and without cataplexy? Should sporadic and familial forms of narcolepsy with cataplexy appear as subgroups in the classification? Are idiopathic hypersomnia with long sleep time and idiopathic hypersomnia without long sleep time two forms of the same condition or two different conditions? Is there a pathophysiological relationship between narcolepsy without cataplexy and idiopathic hypersomnia without long sleep time?
Staley, Dennis; Kean, Jason W.; Cannon, Susan H.; Schmidt, Kevin M.; Laber, Jayme L.
2012-01-01
Rainfall intensity–duration (ID) thresholds are commonly used to predict the temporal occurrence of debris flows and shallow landslides. Typically, thresholds are subjectively defined as the upper limit of peak rainstorm intensities that do not produce debris flows and landslides, or as the lower limit of peak rainstorm intensities that initiate debris flows and landslides. In addition, peak rainstorm intensities are often used to define thresholds, as data regarding the precise timing of debris flows and associated rainfall intensities are usually not available, and rainfall characteristics are often estimated from distant gauging locations. Here, we attempt to improve the performance of existing threshold-based predictions of post-fire debris-flow occurrence by utilizing data on the precise timing of debris flows relative to rainfall intensity, and develop an objective method to define the threshold intensities. We objectively defined the thresholds by maximizing the number of correct predictions of debris flow occurrence while minimizing the rate of both Type I (false positive) and Type II (false negative) errors. We identified that (1) there were statistically significant differences between peak storm and triggering intensities, (2) the objectively defined threshold model presents a better balance between predictive success, false alarms and failed alarms than previous subjectively defined thresholds, (3) thresholds based on measurements of rainfall intensity over shorter duration (≤60 min) are better predictors of post-fire debris-flow initiation than longer duration thresholds, and (4) the objectively defined thresholds were exceeded prior to the recorded time of debris flow at frequencies similar to or better than subjective thresholds. Our findings highlight the need to better constrain the timing and processes of initiation of landslides and debris flows for future threshold studies. 
In addition, the methods used to define rainfall thresholds in this study represent a computationally simple means of deriving critical values for other studies of nonlinear phenomena characterized by thresholds.
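The objective threshold selection described above can be sketched in a few lines (a simplified illustration with synthetic storm data; the skill score here is the true skill statistic, hit rate minus false-alarm rate, which is one simple way to balance Type I and Type II errors, not necessarily the authors' exact objective function):

```python
# Hedged sketch: objectively choosing a rainfall-intensity threshold by
# sweeping candidate values and maximizing the true skill statistic.
# Storm intensities and debris-flow outcomes below are synthetic.

def best_threshold(storms):
    """storms: list of (peak_intensity_mm_per_h, debris_flow_occurred)."""
    best = None
    for t in sorted({i for i, _ in storms}):
        tp = sum(1 for i, y in storms if i >= t and y)       # hits
        fn = sum(1 for i, y in storms if i < t and y)        # failed alarms
        fp = sum(1 for i, y in storms if i >= t and not y)   # false alarms
        tn = sum(1 for i, y in storms if i < t and not y)
        tpr = tp / (tp + fn) if tp + fn else 0.0
        fpr = fp / (fp + tn) if fp + tn else 0.0
        skill = tpr - fpr  # true skill statistic
        if best is None or skill > best[0]:
            best = (skill, t)
    return best[1]

storms = [(5, False), (8, False), (12, True), (15, True), (9, False), (20, True)]
print(best_threshold(storms))  # 12: separates flow-producing from benign storms
```

With real data, the same sweep would be run separately for each rainfall duration (e.g. 15, 30, 60 min) to compare short- and long-duration thresholds.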
Self-assembled DNA tetrahedral optofluidic lasers with precise and tunable gain control.
Chen, Qiushu; Liu, Huajie; Lee, Wonsuk; Sun, Yuze; Zhu, Dan; Pei, Hao; Fan, Chunhai; Fan, Xudong
2013-09-07
We have applied self-assembled DNA tetrahedral nanostructures for the precise and tunable control of the gain in an optofluidic fluorescence resonance energy transfer (FRET) laser. By adjusting the ratio of the donor and the acceptor attached to the tetrahedral vertices, 3.8 times reduction in the lasing threshold and 28-fold enhancement in the lasing efficiency were demonstrated. This work takes advantage of the self-recognition and self-assembly capabilities of biomolecules with well-defined structures and addressability, enabling nano-engineering of the laser down to the molecular level.
Frischer, Robert; Penhaker, Marek; Krejcar, Ondrej; Kacerovsky, Marian; Selamat, Ali
2014-01-01
Precise temperature measurement is essential in a wide range of applications in the medical environment; however, for the problem of temperature measurement inside a simple incubator, neither a simple nor a low-cost solution has been proposed yet. Given that standard temperature sensors do not satisfy the necessary expectations, the problem is not measuring temperature, but rather achieving the desired sensitivity. In response, this paper introduces a novel hardware design, as well as its implementation, that increases measurement sensitivity in defined temperature intervals at low cost. PMID:25494352
Adaptive Pre-FFT Equalizer with High-Precision Channel Estimator for ISI Channels
NASA Astrophysics Data System (ADS)
Yoshida, Makoto
We present an attractive approach for OFDM transmission using an adaptive pre-FFT equalizer, which can select an ICI reduction mode according to channel conditions, and a degenerated-inverse-matrix-based channel estimator (DIME), which uses a cyclic sinc-function matrix uniquely determined by the transmitted subcarriers. In addition to simulation results, the proposed system with an adaptive pre-FFT equalizer and DIME has been laboratory tested using a software defined radio (SDR)-based test bed. The simulation and experimental results demonstrate that the system, at a rate of more than 100 Mbps, can provide a bit error rate of less than 10^-3 on a fast multi-path fading channel with a moving velocity of more than 200 km/h and a delay spread of 1.9 µs (a maximum delay path of 7.3 µs) in the 5-GHz band.
Production of Isolated Giant Unilamellar Vesicles under High Salt Concentrations
Stein, Hannah; Spindler, Susann; Bonakdar, Navid; Wang, Chun; Sandoghdar, Vahid
2017-01-01
The cell membrane forms a dynamic and complex barrier between the living cell and its environment. However, studying it in vivo is difficult because it consists of a wide variety of lipids and proteins and is continuously reorganized by the cell. Therefore, membrane model systems with precisely controlled composition are used to investigate fundamental interactions of membrane components under well-defined conditions. Giant unilamellar vesicles (GUVs) offer a powerful model system for the cell membrane, but many previous studies have been performed in unphysiologically low ionic strength solutions, which might lead to altered membrane properties, protein stability, and lipid-protein interactions. In the present work, we give an overview of existing methods for GUV production and present our efforts on forming single, free-floating vesicles up to several tens of μm in diameter, at high yield, in various buffer solutions with physiological ionic strength and pH. PMID:28243205
On Maximal Hard-Core Thinnings of Stationary Particle Processes
NASA Astrophysics Data System (ADS)
Hirsch, Christian; Last, Günter
2018-02-01
The present paper studies existence and distributional uniqueness of subclasses of stationary hard-core particle systems arising as thinnings of stationary particle processes. These subclasses are defined by natural maximality criteria. We investigate two specific criteria, one related to the intensity of the hard-core particle process, the other one being a local optimality criterion on the level of realizations. In fact, the criteria are equivalent under suitable moment conditions. We show that stationary hard-core thinnings satisfying such criteria exist and are frequently distributionally unique. More precisely, distributional uniqueness holds in subcritical and barely supercritical regimes of continuum percolation. Additionally, based on the analysis of a specific example, we argue that fluctuations in grain sizes can play an important role for establishing distributional uniqueness at high intensities. Finally, we provide a family of algorithmically constructible approximations whose volume fractions are arbitrarily close to the maximum.
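The local optimality criterion mentioned above can be illustrated with a toy sketch (purely illustrative, not the paper's construction): a greedy hard-core thinning retains points so that the result is "maximal" in the sense that no deleted point could be re-added without violating the hard-core distance r.

```python
# Illustrative sketch only: a greedy hard-core thinning of a planar point
# sample. Every pair of kept points is at least distance r apart, and no
# removed point can be put back without breaking that property.
import random

def greedy_hardcore_thinning(points, r):
    """Keep each point (in the given order) unless it lies within
    distance r of an already kept point."""
    kept = []
    for p in points:
        if all((p[0] - q[0])**2 + (p[1] - q[1])**2 >= r * r for q in kept):
            kept.append(p)
    return kept

rng = random.Random(42)
pts = [(rng.random(), rng.random()) for _ in range(200)]
thinned = greedy_hardcore_thinning(pts, 0.1)
print(len(thinned), "of", len(pts), "points retained")
```

The paper's question is much finer than this sketch: among all such maximal thinnings of a stationary particle process, which ones exist as stationary processes and when are they distributionally unique.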
Construction of Orthonormal Wavelets Using Symbolic Algebraic Methods
NASA Astrophysics Data System (ADS)
Černá, Dana; Finěk, Václav
2009-09-01
Our contribution is concerned with the solution of systems of nonlinear algebraic equations arising from the computation of scaling coefficients of orthonormal wavelets with compact support, specifically Daubechies wavelets, symmlets, coiflets, and generalized coiflets. These wavelets are defined as solutions of equation systems which are partly linear and partly nonlinear. The idea of the presented methods consists in replacing the equations for scaling coefficients by equations for scaling moments. This enables us to eliminate some quadratic conditions in the original system and thereby simplify it. The simplified system is solved with the aid of the Gröbner basis method. The advantage of our approach is that in some cases it provides all possible solutions, and these solutions can be computed to arbitrary precision. For small systems, we are even able to find explicit solutions. The computation was carried out with the symbolic algebra software Maple.
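The smallest such system can be illustrated in Python with sympy rather than Maple (a hedged sketch, not the authors' computation; the four-tap Daubechies conditions below use the sum-equals-2 normalization):

```python
# Hedged illustration: the four-tap Daubechies scaling coefficients solve a
# partly linear, partly nonlinear system. A lexicographic Groebner basis
# triangularizes it, after which all exact solutions can be read off.
from sympy import symbols, groebner, solve, sqrt, simplify

h0, h1, h2, h3 = symbols('h0 h1 h2 h3', real=True)
eqs = [
    h0 + h1 + h2 + h3 - 2,   # partition of unity (linear; normalization sum = 2)
    h0 - h1 + h2 - h3,       # vanishing 0th wavelet moment (linear)
    -h1 + 2*h2 - 3*h3,       # vanishing 1st wavelet moment (linear)
    h0*h2 + h1*h3,           # orthogonality (the quadratic condition)
]

G = groebner(eqs, h0, h1, h2, h3, order='lex')   # triangular form of the system
sols = solve(eqs, [h0, h1, h2, h3], dict=True)   # both exact solutions

# One solution is the classical D4 filter
#   h = ((1+sqrt(3))/4, (3+sqrt(3))/4, (3-sqrt(3))/4, (1-sqrt(3))/4);
# the other is its mirror image.
```

Larger filters (more vanishing moments) add more quadratic orthogonality conditions, which is where eliminating quadratics via scaling moments, as the abstract describes, pays off.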
Extreme disorder in an ultrahigh-affinity protein complex
NASA Astrophysics Data System (ADS)
Borgia, Alessandro; Borgia, Madeleine B.; Bugge, Katrine; Kissling, Vera M.; Heidarsson, Pétur O.; Fernandes, Catarina B.; Sottini, Andrea; Soranno, Andrea; Buholzer, Karin J.; Nettels, Daniel; Kragelund, Birthe B.; Best, Robert B.; Schuler, Benjamin
2018-03-01
Molecular communication in biology is mediated by protein interactions. According to the current paradigm, the specificity and affinity required for these interactions are encoded in the precise complementarity of binding interfaces. Even proteins that are disordered under physiological conditions or that contain large unstructured regions commonly interact with well-structured binding sites on other biomolecules. Here we demonstrate the existence of an unexpected interaction mechanism: the two intrinsically disordered human proteins histone H1 and its nuclear chaperone prothymosin-α associate in a complex with picomolar affinity, but fully retain their structural disorder, long-range flexibility and highly dynamic character. On the basis of closely integrated experiments and molecular simulations, we show that the interaction can be explained by the large opposite net charge of the two proteins, without requiring defined binding sites or interactions between specific individual residues. Proteome-wide sequence analysis suggests that this interaction mechanism may be abundant in eukaryotes.
Liquid crystalline spinning of spider silk
NASA Astrophysics Data System (ADS)
Vollrath, Fritz; Knight, David P.
2001-03-01
Spider silk has outstanding mechanical properties despite being spun at close to ambient temperatures and pressures using water as the solvent. The spider achieves this feat of benign fibre processing by judiciously controlling the folding and crystallization of the main protein constituents, and by adding auxiliary compounds, to create a composite material of defined hierarchical structure. Because the `spinning dope' (the material from which silk is spun) is liquid crystalline, spiders can draw it during extrusion into a hardened fibre using minimal forces. This process involves an unusual internal drawdown within the spider's spinneret that is not seen in industrial fibre processing, followed by a conventional external drawdown after the dope has left the spinneret. Successful copying of the spider's internal processing and precise control over protein folding, combined with knowledge of the gene sequences of its spinning dopes, could permit industrial production of silk-based fibres with unique properties under benign conditions.
HSF1 critically attunes proteotoxic stress sensing by mTORC1 to combat stress and promote growth.
Su, Kuo-Hui; Cao, Junyue; Tang, Zijian; Dai, Siyuan; He, Yishu; Sampson, Stephen Byers; Benjamin, Ivor J; Dai, Chengkai
2016-05-01
To cope with proteotoxic stress, cells attenuate protein synthesis. However, the precise mechanisms underlying this fundamental adaptation remain poorly defined. Here we report that mTORC1 acts as an immediate cellular sensor of proteotoxic stress. Surprisingly, the multifaceted stress-responsive kinase JNK constitutively associates with mTORC1 under normal growth conditions. On activation by proteotoxic stress, JNK phosphorylates both RAPTOR at S863 and mTOR at S567, causing partial disintegration of mTORC1 and subsequent translation inhibition. Importantly, HSF1, the central player in the proteotoxic stress response (PSR), preserves mTORC1 integrity and function by inactivating JNK, independently of its canonical transcriptional action. Thereby, HSF1 translationally augments the PSR. Beyond promoting stress resistance, this intricate HSF1-JNK-mTORC1 interplay, strikingly, regulates cell, organ and body sizes. Thus, these results illuminate a unifying mechanism that controls stress adaptation and growth.
Modelling and tuning for a time-delayed vibration absorber with friction
NASA Astrophysics Data System (ADS)
Zhang, Xiaoxu; Xu, Jian; Ji, Jinchen
2018-06-01
This paper presents an integrated analytical and experimental study of the modelling and tuning of a time-delayed vibration absorber (TDVA) with friction. In system modelling, the paper first applies the method of averaging to obtain the frequency response function (FRF), and then uses the derived FRF to evaluate the fitness of different friction models. After the determination of the system model, the obtained FRF is employed to evaluate the vibration absorption performance with respect to the tunable parameters. A significant feature of the TDVA with friction is that its stability depends on the excitation parameters. To ensure the stability of the time-delayed control, the paper defines a sufficient condition for stability estimation. Experimental measurements show that the dynamic response of the TDVA with friction can be accurately predicted, and the time-delayed control precisely achieved, using the modelling and tuning technique provided in this paper.
Modeling and Sound Insulation Performance Analysis of Two Honeycomb-hole Coatings
NASA Astrophysics Data System (ADS)
Ye, H. F.; Tao, M.; Zhang, W. Z.
2018-05-01
During a sound transmission loss test in the standing-wave tube, the unavoidable reflected wave from the termination of the downstream tube affects the precise measurement of the sound transmission loss (TL). However, this can be solved by defining non-reflecting boundary conditions when modelling with the finite element method. The model has been validated by comparison with the analytical method. Based on the present model, the sound insulation performance of two types of honeycomb-hole coatings has been analyzed and discussed. Parameter changes play an important role in the sound insulation performance of honeycomb-hole coatings, and the negative-Poisson's-ratio honeycomb-hole coating has better sound insulation performance at particular frequencies. Finally, it is concluded that sound insulation performance results from various factors, including impedance changes and waveform transformation.
On the relationship between topological and geometric defects.
Griffin, Sinéad M; Spaldin, Nicola A
2017-08-31
The study of topology in solids is undergoing a renaissance following renewed interest in the properties of ferroic domain walls as well as recent discoveries regarding skyrmionic lattices. Each of these systems possess a property that is 'protected' in a symmetry sense, and is defined rigorously using a branch of mathematics known as topology. In this article we review the formal definition of topological defects as they are classified in terms of homotopy theory, and discuss the precise symmetry-breaking conditions that lead to their formation. We distinguish topological defects from defects that arise from the details of the stacking or structure of the material but are not protected by symmetry, and we propose the term 'geometric defects' to describe the latter. We provide simple material examples of both topological and geometric defect types, and discuss the implications of the classification on the resulting material properties.
Time to rethink the neural mechanisms of learning and memory
Gallistel, Charles R.; Balsam, Peter D
2014-01-01
Most studies in the neurobiology of learning assume that the underlying learning process is a pairing-dependent change in synaptic strength that requires repeated experience of events presented in close temporal contiguity. However, much learning is rapid and does not depend on temporal contiguity, which has never been precisely defined. These points are well illustrated by studies showing that temporal relationships between events are rapidly learned, even over long delays, and that this knowledge governs the form and timing of behavior. The speed with which anticipatory responses emerge in conditioning paradigms is determined by the information that cues provide about the timing of rewards. The challenge for understanding the neurobiology of learning is to understand the mechanisms in the nervous system that encode information from even a single experience, the nature of the memory mechanisms that can encode quantities such as time, and how the brain can flexibly perform computations based on this information. PMID:24309167
Hydrocarbon-fuel/combustion-chamber-liner materials compatibility
NASA Technical Reports Server (NTRS)
Gage, Mark L.
1990-01-01
Results of material compatibility experiments using hydrocarbon fuels in contact with copper-based combustion chamber liner materials are presented. Mil-Spec RP-1, n-dodecane, propane, and methane fuels were tested in contact with OFHC, NASA-Z, and ZrCu coppers. Two distinct test methods were employed. Static tests, in which copper coupons were exposed to fuel for long durations at constant temperature and pressure, provided compatibility data in a precisely controlled environment. Dynamic tests, using the Aerojet Carbothermal Test Facility, provided fuel and copper compatibility data under realistic booster engine service conditions. Tests were conducted using very pure grades of each fuel and fuels to which a contaminant, e.g., ethylene or methyl mercaptan, was added to define the role played by fuel impurities. Conclusions are reached as to degradation mechanisms and effects, methods for the elimination of these mechanisms, selection of copper alloy combustion chamber liners, and hydrocarbon fuel purchase specifications.
Design criteria for flightpath and airspeed control for the approach and landing of STOL aircraft
NASA Technical Reports Server (NTRS)
Franklin, J. A.; Innis, R. C.; Hardy, G. H.; Stephenson, J. D.
1982-01-01
A flight research program was conducted to assess requirements for flightpath and airspeed control for glide-slope tracking during a precision approach and for flare control, particularly as applied to powered-lift, short takeoff and landing (STOL) aircraft. Ames Research Center's Augmentor Wing Research Aircraft was used to fly approaches on a 7.5 deg glide slope to landings on a 30 X 518 m (100 X 1700 ft) STOL runway. The dominant aircraft response characteristics determined were flightpath overshoot, flightpath-airspeed coupling, and initial flightpath response time. The significant contribution to control of the landing flare using pitch attitude was the short-term flightpath response. The limiting condition for initial flightpath response time for flare control with thrust was also identified. It is possible to define flying-qualities design criteria for glide-slope and flare control based on the aforementioned response characteristics.
Catalysis by clusters with precise numbers of atoms
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tyo, Eric C.; Vajda, Stefan
2015-07-03
Clusters that contain only a small number of atoms can exhibit unique and often unexpected properties. The clusters are of particular interest in catalysis because they can act as individual active sites, and minor changes in size and composition, such as the addition or removal of a single atom, can have a substantial influence on the activity and selectivity of a reaction. Here we review recent progress in the synthesis, characterization and catalysis of well-defined sub-nanometre clusters. We examine work on size-selected supported clusters in ultra-high vacuum environments and under realistic reaction conditions, and explore the use of computational methods to provide a mechanistic understanding of their catalytic properties. We also highlight the potential of size-selected clusters to provide insights into important catalytic processes and their use in the development of novel catalytic systems.
Testing of CMA-2000 Microwave Landing System (MLS) airborne receiver
NASA Astrophysics Data System (ADS)
Labreche, L.; Murfin, A. J.
1989-09-01
Microwave landing system (MLS) is a precision approach and landing guidance system which provides position information and various air to ground data. Position information is provided on a wide coverage sector and is determined by an azimuth angle measurement, an elevation angle measurement, and a range measurement. MLS performance standards and testing of the MLS airborne receiver is mainly governed by Technical Standard Order TSO-C104 issued by the Federal Aviation Administration. This TSO defines detailed test procedures for use in determining the required performance under standard and stressed conditions. It also imposes disciplines on software development and testing procedures. Testing performed on the CMA-2000 MLS receiver and methods used in its validation are described. A computer automated test system has been developed to test for compliance with RTCA/DO-177 Minimum Operation Performance Standards. Extensive software verification and traceability tests designed to ensure compliance with RTCA/DO-178 are outlined.
2016-01-01
Purpose: The purpose of this research forum article is to provide an overview of a collection of invited articles on the topic "specific language impairment (SLI) in children with concomitant health conditions or nonmainstream language backgrounds." Topics include SLI, attention-deficit/hyperactivity disorder, autism spectrum disorder, cochlear implants, bilingualism, and dialectal language learning contexts. Method: The topic is timely due to current debates about the diagnosis of SLI. An overarching comparative conceptual framework is provided for comparisons of SLI with other clinical conditions. Comparisons of SLI in children with low-normal or normal nonverbal IQ illustrate the unexpected outcomes of 2 × 2 comparison designs. Results: Comparative studies reveal unexpected relationships among speech, language, cognitive, and social dimensions of children's development as well as precise ways to identify children with SLI who are bilingual or dialect speakers. Conclusions: The diagnosis of SLI is essential for elucidating possible causal pathways of language impairments, risks for language impairments, assessments for identification of language impairments, linguistic dimensions of language impairments, and long-term outcomes. Although children's language acquisition is robust under high levels of risk, unexplained individual variations in language acquisition lead to persistent language impairments. PMID:26502218
Self-assembly of crystalline nanotubes from monodisperse amphiphilic diblock copolypeptoid tiles
Sun, Jing; Jiang, Xi; Lund, Reidar; ...
2016-03-28
The folding and assembly of sequence-defined polymers into precisely ordered nanostructures promises a class of well-defined biomimetic architectures with specific function. Amphiphilic diblock copolymers are known to self-assemble in water to form a variety of nanostructured morphologies including spheres, disks, cylinders, and vesicles. In all of these cases, the predominant driving force for assembly is the formation of a hydrophobic core that excludes water, whereas the hydrophilic blocks are solvated and extend into the aqueous phase. However, such polymer systems typically have broad molar mass distributions and lack the purity and sequence-defined structure often associated with biologically derived polymers. Here, we demonstrate that purified, monodisperse amphiphilic diblock copolypeptoids, with chemically distinct domains that are congruent in size and shape, can behave like molecular tile units that spontaneously assemble into hollow, crystalline nanotubes in water. The nanotubes consist of stacked, porous crystalline rings, and are held together primarily by side-chain van der Waals interactions. The peptoid nanotubes form without a central hydrophobic core, chirality, a hydrogen bond network, and electrostatic or π-π interactions. These results demonstrate the remarkable structure-directing influence of n-alkane and ethyleneoxy side chains in polymer self-assembly. More broadly, this work suggests that flexible, low-molecular-weight sequence-defined polymers can serve as molecular tile units that can assemble into precision supramolecular architectures.
Data-driven approach for assessing utility of medical tests using electronic medical records.
Skrøvseth, Stein Olav; Augestad, Knut Magne; Ebadollahi, Shahram
2015-02-01
To precisely define the utility of tests in a clinical pathway through data-driven analysis of the electronic medical record (EMR). The information content was defined in terms of the entropy of the expected value of the test related to a given outcome. A kernel density classifier was used to estimate the necessary distributions. To validate the method, we used data from the EMR of the gastrointestinal department at a university hospital. Blood tests from patients undergoing gastrointestinal surgery were analyzed with respect to a second surgery within 30 days of the index surgery. The information content is clearly reflected in the patient pathway for certain combinations of tests and outcomes. C-reactive protein tests coupled to anastomosis leakage, a severe complication, show a clear pattern of information gain through the patient trajectory, with the greatest gain from the test 3-4 days after the index surgery. We have defined the information content in a data-driven and information-theoretic way such that the utility of a test can be precisely defined. The results reflect clinical knowledge. In the cases we examined, the tests carry little negative impact. The general approach can be expanded to cases that carry a substantial negative impact, such as certain radiological techniques. Copyright © 2014 The Authors. Published by Elsevier Inc. All rights reserved.
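The entropy-based utility measure can be illustrated with a toy calculation (synthetic counts; the authors' method additionally uses kernel density estimates over continuous test values, which this discrete sketch omits):

```python
# Hedged sketch: scoring a test's "information content" as the expected
# reduction in outcome entropy after seeing the test result (mutual
# information). The CRP/reoperation counts below are synthetic.
from math import log2

def entropy(counts):
    n = sum(counts)
    return -sum(c / n * log2(c / n) for c in counts if c)

def information_gain(joint):
    """joint[r][o]: count of patients with test result r and outcome o."""
    total = sum(sum(row) for row in joint)
    outcome_counts = [sum(row[o] for row in joint) for o in range(len(joint[0]))]
    h_prior = entropy(outcome_counts)                          # before the test
    h_post = sum(sum(row) / total * entropy(row) for row in joint)  # after
    return h_prior - h_post

# rows: CRP elevated / CRP normal; columns: reoperation / no reoperation
joint = [[8, 12], [2, 78]]
print(information_gain(joint))  # expected bits of information from the CRP result
```

Computing this quantity day by day along the patient trajectory would reproduce the kind of pattern the abstract describes, with the gain peaking where the test is most informative.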
Self-assembly of crystalline nanotubes from monodisperse amphiphilic diblock copolypeptoid tiles
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sun, Jing; Jiang, Xi; Lund, Reidar
The folding and assembly of sequence-defined polymers into precisely ordered nanostructures promises a class of well-defined biomimetic architectures with specific function. Amphiphilic diblock copolymers are known to self-assemble in water to form a variety of nanostructured morphologies including spheres, disks, cylinders, and vesicles. In all of these cases, the predominant driving force for assembly is the formation of a hydrophobic core that excludes water, whereas the hydrophilic blocks are solvated and extend into the aqueous phase. However, such polymer systems typically have broad molar mass distributions and lack the purity and sequence-defined structure often associated with biologically derived polymers. Here, we demonstrate that purified, monodisperse amphiphilic diblock copolypeptoids, with chemically distinct domains that are congruent in size and shape, can behave like molecular tile units that spontaneously assemble into hollow, crystalline nanotubes in water. The nanotubes consist of stacked, porous crystalline rings, and are held together primarily by side-chain van der Waals interactions. The peptoid nanotubes form without a central hydrophobic core, chirality, a hydrogen bond network, and electrostatic or π-π interactions. These results demonstrate the remarkable structure-directing influence of n-alkane and ethyleneoxy side chains in polymer self-assembly. More broadly, this work suggests that flexible, low-molecular-weight sequence-defined polymers can serve as molecular tile units that can assemble into precision supramolecular architectures.
NASA Astrophysics Data System (ADS)
Ciofu, C.; Stan, G.
2016-08-01
The paper analyzes the positioning precision of an elephant's trunk robotic arm whose joints are driven by wires of variable length during operation. The considered 5-degrees-of-freedom robotic arm has a particular joint structure that makes inner actuation with a wire-driven mechanism possible. We analyze solely the change in wire length caused by inner winding and unwinding on the joints for certain values of the rotational angles. Variations in wire length entail joint angular displacements. We analyze positioning precision using equations from the inverse kinematics of the elephant's trunk robotic arm. The angular displacements of the joints enter the computational method after partial differentiation of the positioning equations. We obtain variations in wire length of about tenths of micrometers. These variations produce angular displacements on the order of minutes of sexagesimal degree and thus define the positioning precision of elephant's trunk robotic arms. The analytical method is used to determine how the design structure of an elephant's trunk robotic arm with inner wire actuation affects positioning precision, so that designers can make suitable decisions on the accuracy specification limits of the robotic arm.
Asymptotic structure of the Einstein-Maxwell theory on AdS3
NASA Astrophysics Data System (ADS)
Pérez, Alfredo; Riquelme, Miguel; Tempo, David; Troncoso, Ricardo
2016-02-01
The asymptotic structure of AdS spacetimes in the context of General Relativity coupled to the Maxwell field in three spacetime dimensions is analyzed. Although the fall-off of the fields is relaxed with respect to that of Brown and Henneaux, the variation of the canonical generators associated with the asymptotic Killing vectors can be shown to be finite once required to span the Lie derivative of the fields. The corresponding surface integrals then acquire explicit contributions from the electromagnetic field, and become well-defined provided they fulfill suitable integrability conditions, implying that the leading terms of the asymptotic form of the electromagnetic field are functionally related. Consequently, for a generic choice of boundary conditions, the asymptotic symmetries are broken down to R ⊗ U(1) ⊗ U(1). Nonetheless, requiring compatibility of the boundary conditions with one of the asymptotic Virasoro symmetries singles out the set to be characterized by an arbitrary function of a single variable, whose precise form depends on the choice of the chiral copy. Remarkably, requiring the asymptotic symmetries to contain the full conformal group selects a very special set of boundary conditions that is labeled by a unique constant parameter, so that the algebra of the canonical generators is given by the direct sum of two copies of the Virasoro algebra with the standard central extension and U(1). This special set of boundary conditions makes the energy spectrum of electrically charged rotating black holes well-behaved.
Functional precision cancer medicine-moving beyond pure genomics.
Letai, Anthony
2017-09-08
The essential job of precision medicine is to match the right drugs to the right patients. In cancer, precision medicine has been nearly synonymous with genomics. However, sobering recent studies have generally shown that most patients with cancer who receive genomic testing do not benefit from a genomic precision medicine strategy. Although some call the entire project of precision cancer medicine into question, I suggest instead that the tools employed must be broadened. Instead of relying exclusively on big data measurements of initial conditions, we should also acquire highly actionable functional information by perturbing-for example, with cancer therapies-viable primary tumor cells from patients with cancer.
Coughlin, Justin G; Yu, Zhongjie; Elliott, Emily M
2017-07-30
Nitrogen oxides, or NOx (NOx = NO + NO2), play an important role in air quality, atmospheric chemistry, and climate. The isotopic compositions of anthropogenic and natural NO2 sources are wide-ranging, and they can be used to constrain sources of ambient NO2 and the associated atmospheric deposition of nitrogen compounds. While passive sample collection of NO2 isotopes has been used in field studies to determine NOx source influences on atmospheric deposition, this approach has not been evaluated for accuracy or precision under different environmental conditions. The efficacy of passive sampler collection of NO2 for NO2 isotopes was evaluated under varied temperature and relative humidity (RH) conditions in a dynamic flux chamber. The precision and accuracy of the filter collection of NO2 as nitrite (NO2-) for isotopic analysis were determined using a reference NO2 gas tank and through inter-calibration with a modified EPA Method 7. The bacterial denitrifier method was used to convert 20 μM of collected NO2- or nitrate (NO3-) into N2O and was carried out on an Isoprime continuous-flow isotope ratio mass spectrometer. δ15N-NO2 values determined from passive NO2 collection, in conditions of 11-34 °C and 1-78% RH, have an overall accuracy and precision of ±2.1‰, and an individual run precision of ±0.6‰. δ18O-NO2 values obtained from passive NO2 sampler collection, under the same conditions, have an overall precision of ±1.3‰. Suitable conditions for passive sampler collection of NO2 isotopes are environments ranging from 11 to 34 °C and 1 to 78% RH. The passive NO2 isotope measurement technique provides an accurate method to determine variations in atmospheric δ15N-NO2 values and a precise method for determining atmospheric δ18O-NO2 values.
The ability to measure NO2 isotopes over spatial gradients at the same temporal resolution provides a unique perspective on the extent and seasonality of fluctuations in atmospheric NO2 isotopic compositions. Copyright © 2017 John Wiley & Sons, Ltd.
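The δ15N and δ18O values reported above follow standard delta notation: the per-mil deviation of a sample's isotope ratio from a reference standard. A minimal sketch (the air-N2 reference ratio is the conventional 15N/14N value; the sample ratio is invented for illustration):

```python
def delta_permil(r_sample, r_standard):
    """Delta value in per mil (‰): relative deviation of an isotope
    ratio (e.g. 15N/14N) from a reference standard."""
    return (r_sample / r_standard - 1.0) * 1000.0

# Air N2 is the conventional standard for nitrogen isotopes.
R_AIR = 0.0036765          # 15N/14N of atmospheric N2
print(delta_permil(0.0036950, R_AIR))   # a hypothetical enriched sample
```

A sample with the same ratio as the standard has δ = 0‰ by construction; enrichment in the heavy isotope gives positive values.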
Latest developments on documentary film ``The State of the Unit: The Kilogram''
NASA Astrophysics Data System (ADS)
Young, Amy
2013-03-01
This presentation shows the recent developments in the documentary film project ``The State of the Unit.'' The film, to be completed Fall 2013, looks at historical and current efforts to define precisely the unit of mass.
Human factors research on performance-based navigation instrument procedures for NextGEN
DOT National Transportation Integrated Search
2012-10-14
Area navigation (RNAV) and required navigation performance (RNP) are key components of performance-based navigation (PBN). Instrument procedures that use RNAV and RNP can have more flexible and precise paths than conventional routes that are defined ...
Driver behavior at rail-highway grade crossings : a signal detection theory analysis
DOT National Transportation Integrated Search
1996-01-01
Signal Detection Theory (SDT) is often used in studies of sensory psychology and perception to describe laboratory experiments in which subjects are asked to detect small changes in very well-controlled, precisely defined stimuli such as the intensity...
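The core SDT sensitivity statistic can be sketched in a few lines: d′ is the difference of the z-transformed hit and false-alarm rates. The driver-detection figures below are hypothetical, not taken from the report:

```python
from statistics import NormalDist

def d_prime(hit_rate, false_alarm_rate):
    """Detection-theory sensitivity: separation of the signal and noise
    distributions in standard-deviation units."""
    z = NormalDist().inv_cdf   # inverse standard-normal CDF (z-score)
    return z(hit_rate) - z(false_alarm_rate)

# Hypothetical grade-crossing data: drivers detect an approaching train
# 90% of the time, with a 20% false-alarm rate.
print(round(d_prime(0.90, 0.20), 2))
```

A d′ of zero means the observer cannot distinguish signal from noise at all; larger values mean better discrimination independent of response bias.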
6 CFR 7.28 - Automatic declassification.
Code of Federal Regulations, 2011 CFR
2011-01-01
... years after the date of its original classification with the exception of specific information exempt... information whenever the information exempted does not identify a confidential human source or human... Classification Appeals Panel (ISCAP) for approval. (d) Declassification guides that narrowly and precisely define...
6 CFR 7.28 - Automatic declassification.
Code of Federal Regulations, 2014 CFR
2014-01-01
... years after the date of its original classification with the exception of specific information exempt... information whenever the information exempted does not identify a confidential human source or human... Classification Appeals Panel (ISCAP) for approval. (d) Declassification guides that narrowly and precisely define...
6 CFR 7.28 - Automatic declassification.
Code of Federal Regulations, 2013 CFR
2013-01-01
... years after the date of its original classification with the exception of specific information exempt... information whenever the information exempted does not identify a confidential human source or human... Classification Appeals Panel (ISCAP) for approval. (d) Declassification guides that narrowly and precisely define...
6 CFR 7.28 - Automatic declassification.
Code of Federal Regulations, 2012 CFR
2012-01-01
... years after the date of its original classification with the exception of specific information exempt... information whenever the information exempted does not identify a confidential human source or human... Classification Appeals Panel (ISCAP) for approval. (d) Declassification guides that narrowly and precisely define...
Diffusion tensor tracking of neuronal fiber pathways in the living human brain
NASA Astrophysics Data System (ADS)
Lori, Nicolas Francisco
2001-11-01
The technique of diffusion tensor tracking (DTT) is described, in which diffusion tensor magnetic resonance imaging (DT-MRI) data are processed to allow the visualization of white matter (WM) tracts in a living human brain. To illustrate the methods, a detailed description is given of the physics of DT-MRI, the structure of the DT-MRI experiment, the computer tools that were developed to visualize WM tracts, the anatomical consistency of the obtained WM tracts, and the accuracy and precision of DTT using computer simulations. When presenting the physics of DT-MRI, a completely quantum-mechanical view of DT-MRI is given where some of the results are new. Examples of anatomical tracts viewed using DTT are presented, including the genu and the splenium of the corpus callosum, the ventral pathway with its amygdala connection highlighted, the geniculo-calcarine tract separated into anterior and posterior parts, the geniculo-calcarine tract defined using functional magnetic resonance imaging (MRI), and U-fibers. In the simulation, synthetic DT-MRI data were constructed that would be obtained for a cylindrical WM tract with a helical trajectory surrounded by gray matter. Noise was then added to the synthetic DT-MRI data, and DTT trajectories were calculated using the noisy data (realistic tracks). Simulated DTT errors were calculated as the vector distance between the realistic tracks and the ideal trajectory. The simulation tested the effects of a comprehensive set of experimental conditions, including voxel size, data sampling, data averaging, type of tract tissue, tract diameter and type of tract trajectory. Simulated DTT accuracy and precision were typically below the voxel dimension, and precision was compatible with the experimental results.
Fast EEG spike detection via eigenvalue analysis and clustering of spatial amplitude distribution
NASA Astrophysics Data System (ADS)
Fukami, Tadanori; Shimada, Takamasa; Ishikawa, Bunnoshin
2018-06-01
Objective. In the current study, we tested a proposed method for fast spike detection in electroencephalography (EEG). Approach. We performed eigenvalue analysis in two-dimensional space spanned by gradients calculated from two neighboring samples to detect high-amplitude negative peaks. We extracted the spike candidates by imposing restrictions on parameters regarding spike shape and eigenvalues reflecting detection characteristics of individual medical doctors. We subsequently performed clustering, classifying detected peaks by considering the amplitude distribution at 19 scalp electrodes. Clusters with a small number of candidates were excluded. We then defined a score for eliminating spike candidates for which the pattern of detected electrodes differed from the overall pattern in a cluster. Spikes were detected by setting the score threshold. Main results. Based on visual inspection by a psychiatrist experienced in EEG, we evaluated the proposed method using two statistical measures of precision and recall with respect to detection performance. We found that precision and recall exhibited a trade-off relationship. The average recall value was 0.708 in eight subjects with the score threshold that maximized the F-measure, with 58.6 ± 36.2 spikes per subject. Under this condition, the average precision was 0.390, corresponding to a false positive rate 2.09 times higher than the true positive rate. Analysis of the required processing time revealed that, using a general-purpose computer, our method could be used to perform spike detection in 12.1% of the recording time. The process of narrowing down spike candidates based on shape occupied most of the processing time. Significance. Although the average recall value was comparable with that of other studies, the proposed method significantly shortened the processing time.
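The score threshold above was chosen to maximize the F-measure, the harmonic mean of precision and recall. A quick check with the averages reported in the abstract (precision 0.390, recall 0.708):

```python
def f_measure(precision, recall):
    """F1 score: harmonic mean of precision and recall."""
    return 2 * precision * recall / (precision + recall)

# Averages reported in the abstract at the F-maximizing threshold.
print(round(f_measure(0.390, 0.708), 3))
```

Because the two measures trade off against each other, maximizing F picks the threshold where neither collapses.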
NASA Astrophysics Data System (ADS)
Ma, Lin
2017-11-01
This paper develops a method for precisely determining the tension of an inclined cable with unknown boundary conditions. First, the nonlinear motion equation of an inclined cable is derived, and a numerical model of the motion of the cable is proposed using the finite difference method. The proposed numerical model includes the sag-extensibility, flexural stiffness, inclination angle and rotational stiffness at two ends of the cable. Second, the influence of the dynamic parameters of the cable on its frequencies is discussed in detail, and a method for precisely determining the tension of an inclined cable is proposed based on the derivatives of the eigenvalues of the matrices. Finally, a multiparameter identification method is developed that can simultaneously identify multiple parameters, including the rotational stiffness at two ends. This scheme is applicable to inclined cables with varying sag, varying flexural stiffness and unknown boundary conditions. Numerical examples indicate that the method provides good precision. Because the parameters of cables other than tension (e.g., the flexural stiffness and rotational stiffness at the ends) are not accurately known in practical engineering, the multiparameter identification method could further improve the accuracy of cable tension measurements.
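As a baseline for the frequency-based identification described above, the zeroth-order taut-string relation links tension to the fundamental frequency; the paper's contribution is precisely the corrections (sag-extensibility, flexural stiffness, end rotational stiffness) that this naive sketch omits. The cable parameters below are invented for illustration:

```python
def tension_taut_string(f1_hz, length_m, mass_per_m):
    """Zeroth-order cable tension from the fundamental frequency of an
    ideal taut string: f_n = (n / (2 L)) * sqrt(T / m), so
    T = 4 m L^2 f_1^2. Sag, flexural stiffness and boundary stiffness
    (which the paper identifies) are deliberately omitted here."""
    return 4.0 * mass_per_m * length_m**2 * f1_hz**2

# Hypothetical stay cable: 100 m long, 50 kg/m, fundamental at 1.0 Hz.
print(tension_taut_string(1.0, 100.0, 50.0), "N")  # 2.0 MN for these inputs
```

Because tension scales with the square of the frequency, small frequency errors are amplified, which is why the refined multiparameter identification matters in practice.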
Simple Perfusion Apparatus (SPA) for Manipulation, Tracking and Study of Oocytes and Embryos
Angione, Stephanie L.; Oulhen, Nathalie; Brayboy, Lynae M.; Tripathi, Anubhav; Wessel, Gary M.
2016-01-01
Objective: To develop and implement a device and protocol for oocyte analysis at the single-cell level. The device must be capable of high-resolution imaging, temperature control, and perfusion of media, drugs, sperm, and immunolabeling reagents, all at defined flow rates. Each oocyte and resultant embryo must remain spatially separated and defined. Design: Experimental laboratory study. Setting: University and academic center for reproductive medicine. Patients/Animals: Women with eggs retrieved for ICSI cycles; adult female FVBN and B6C3F1 mouse strains; sea stars. Intervention: Real-time, longitudinal imaging of oocytes following fluorescent labeling, insemination, and viability tests. Main outcome measure(s): Cell and embryo viability, immunolabeling efficiency, live-cell endocytosis quantitation, and precise metrics of fertilization and embryonic development. Results: Single oocytes were longitudinally imaged through significant changes in media, markers, endocytosis quantitation, and development, all under precise microfluidic control. Cells remained viable, enclosed, and separate for precision measurements, repeatability, and imaging. Conclusions: We engineered a simple device to load, visualize, experiment on, and effectively record individual oocytes and embryos without loss of cells. Prolonged incubation capabilities permit longitudinal studies without the need for transfer and the attendant risk of losing cells. This simple perfusion apparatus (SPA) provides careful, precise, and flexible handling of precious samples, facilitating clinical in vitro fertilization approaches. PMID:25450296
Current Trends in Satellite Laser Ranging
NASA Technical Reports Server (NTRS)
Pearlman, M. R.; Appleby, G. M.; Kirchner, G.; McGarry, J.; Murphy, T.; Noll, C. E.; Pavlis, E. C.; Pierron, F.
2010-01-01
Satellite Laser Ranging (SLR) techniques are used to accurately measure the distance from ground stations to retroreflectors on satellites and the Moon. SLR is one of the fundamental techniques that define the International Terrestrial Reference Frame (ITRF), which is the basis upon which we measure many aspects of global change over space, time, and evolving technology. It is one of the fundamental techniques that define, at a precision of a few mm, the origin and scale of the ITRF. Laser ranging provides precision orbit determination and instrument calibration/validation for satellite-borne altimeters for the better understanding of sea level change, ocean dynamics, ice budget, and terrestrial topography. Laser ranging is also a tool to study the dynamics of the Moon and fundamental constants. Many of the GNSS satellites now carry retroreflectors for improved orbit determination, harmonization of reference frames, and in-orbit co-location and system performance validation. The GNSS constellations will be the means of making the reference frame available to worldwide users. Data and products from these measurements support key aspects of the GEOSS 10-Year Implementation Plan adopted on February 16, 2005. The ITRF has been identified as a key contribution of the JAG to GEOSS, and the ILRS has made a major contribution to its development since its foundation. The ILRS delivers weekly additional realizations that are accumulated sequentially to extend the ITRF and the Earth Orientation Parameter (EOP) series with daily resolution. Additional products are currently under development, such as precise satellite orbits, EOP with daily availability, and low-degree gravitational harmonics for studies of Earth dynamics and kinematics. SLR technology continues to evolve toward the next generation of laser ranging systems as programmatic requirements become more stringent.
Ranging accuracy is improving as higher-repetition-rate, narrower-pulse lasers and faster detectors are implemented. Automation and pass interleaving at some stations are already expanding temporal coverage. Web-based safety keys are allowing the SLR network stations to range to optically vulnerable satellites. Some stations are experimenting with two-wavelength operation as a means of better understanding atmospheric refraction, and with very low-power lasers to improve eye-safety conditions. New retroreflector designs are improving the signal link and enabling daylight ranging. Dramatic improvements have also been made in lunar ranging with the new APOLLO site in New Mexico, USA and the upgraded lunar station "MEO" in Grasse,
High precision AlGaAsSb ridge-waveguide etching by in situ reflectance monitored ICP-RIE
NASA Astrophysics Data System (ADS)
Tran, N. T.; Breivik, Magnus; Patra, S. K.; Fimland, Bjørn-Ove
2014-05-01
GaSb-based semiconductor diode lasers are promising candidates for light sources working in the mid-infrared wavelength region of 2-5 μm. Using edge emitting lasers with ridge-waveguide structure, light emission with good beam quality can be achieved. Fabrication of the ridge waveguide requires precise etch stop control for optimal laser performance. Simulation results are presented that show the effect of increased confinement in the waveguide when the etch depth is well-defined. In situ reflectance monitoring with a 675 nm-wavelength laser was used to determine the etch stop with high accuracy. Based on the simulations of laser reflectance from a proposed sample, the etching process can be controlled to provide an endpoint depth precision within +/- 10 nm.
Ground control requirements for precision processing of ERTS images
Burger, Thomas C.
1973-01-01
With the successful flight of the ERTS-1 satellite, orbital-height images are available for precision processing into products such as 1:1,000,000-scale photomaps and enlargements up to 1:250,000 scale. In order to keep positional error below 100 meters, control points for the precision processing must be carefully selected and clearly identifiable on photos in both X and Y. Coordinates of selected control points measured on existing 7½- and 15-minute standard maps provide sufficient accuracy for any space imaging system thus far defined. This procedure references the points to accepted horizontal and vertical datums. Maps as small as 1:250,000 scale can be used as source material for coordinates, but to maintain the desired accuracy, maps of 1:100,000 and larger scale should be used when available.
Beyond Precision: Issues of Morality and Decision Making in Minimizing Collateral Casualties
2003-04-28
...possible contributions from moral judgment and decision making. As Fuller himself said, laws "can create the conditions essential for a rational..." Program in Arms Control, Disarmament, and...
Hemeda, Hatim; Giebel, Bernd; Wagner, Wolfgang
2014-02-01
Culture media for therapeutic cell preparations, such as mesenchymal stromal cells (MSCs), usually comprise serum additives. Traditionally, fetal bovine serum is supplemented in basic research and in most clinical trials. Within the past years, many laboratories have adapted their culture conditions to human platelet lysate (hPL), which further stimulates proliferation and expansion of MSCs. Particularly with regard to clinical application, human alternatives to fetal bovine serum are clearly to be preferred. hPL is generated from human platelet units by disruption of the platelet membrane, which is commonly performed by repeated freeze-thaw cycles. Such culture supplements are notoriously ill-defined, and many parameters contribute to batch-to-batch variation in hPL, such as different amounts of plasma, a broad range of growth factors, and donor-specific effects. The plasma components of hPL necessitate the addition of anticoagulants such as heparins to prevent gelatinization of hPL medium, and their concentration must be standardized. Labels used to describe hPL, such as "xenogen-free," "animal-free" and "serum-free," are not used consistently in the literature and may be misleading if not critically assessed. Further analysis of the precise composition of relevant growth factors, attachment factors, microRNAs and exosomes will pave the way for optimized and defined culture conditions. The use of hPL has advantages and disadvantages that must be taken into account, because the choice of cell culture additive has a major impact on cell preparations. Copyright © 2014 International Society for Cellular Therapy. Published by Elsevier Inc. All rights reserved.
Accuracy of GPS time transfer verified by closure around the world
NASA Technical Reports Server (NTRS)
Lewandowski, Wlodimierz W.; Petit, Gerard; Thomas, Claudine
1992-01-01
The precision of time transfer over intercontinental distances by the Global Positioning System common-view method, using measurements of ionospheric delays, precise ephemerides provided by the Defense Mapping Agency (DMA) and a consistent set of antenna coordinates, reaches 3 to 4 ns for a single 13-minute measurement, and decreases to 2 ns when averaging several measurements over the period of one day. It is thought that even this level of precision can be bettered by improving the ionospheric measurements, the ephemerides of satellites, and the antenna coordinates. In the same conditions, an estimation of the accuracy is attained by using three intercontinental links encircling the Earth to establish a closure condition; the three independent links should add to zero. We have computed such a closure condition over a period of 13 months using data recorded at the Paris Observatory, at the Communications Research Laboratory in Tokyo, and at the National Institute for Standards and Technology in Boulder, Colorado. The closure condition is verified to within a few nanoseconds, but a bias, varying with time, can be detected.
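The closure condition described above is arithmetically simple: summing the three measured link differences around the loop should give zero, and any residual bounds the combined systematic error of the three links. The nanosecond values below are illustrative, not the published results:

```python
def closure(link_ab, link_bc, link_ca):
    """Closure of three time-transfer links around the world.
    Each argument is a measured clock difference (A-B, B-C, C-A);
    for perfect links the three sum to zero."""
    return link_ab + link_bc + link_ca

# Hypothetical one-day common-view averages in nanoseconds
# (Paris-Tokyo, Tokyo-Boulder, Boulder-Paris).
print(closure(12.4, -5.1, -6.0), "ns residual")
```

The residual mixes the systematic errors of all three links, which is why a few-nanosecond closure supports the accuracy claim without pinpointing which link contributes most.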
Play-fairway analysis for geothermal exploration: Examples from the Great Basin, western USA
DOE Office of Scientific and Technical Information (OSTI.GOV)
Siler, Drew L; Faulds, James E
2013-10-27
Elevated permeability within fault systems provides pathways for circulation of geothermal fluids. Future geothermal development depends on precise and accurate location of such fluid flow pathways in order to both accurately assess geothermal resource potential and increase drilling success rates. The collocation of geologic characteristics that promote permeability in a given geothermal system defines the geothermal ‘fairway’, the location(s) where upflow zones are probable and where exploration efforts including drilling should be focused. We define the geothermal fairway as the collocation of 1) fault zones that are ideally oriented for slip or dilation under ambient stress conditions, 2) areas with a high spatial density of fault intersections, and 3) lithologies capable of supporting dense interconnected fracture networks. Areas in which these characteristics are concomitant with both elevated temperature and fluids are probable upflow zones where economic-scale, sustainable temperatures and flow rates are most likely to occur. Employing a variety of surface and subsurface data sets, we test this ‘play-fairway’ exploration methodology on two Great Basin geothermal systems: the actively producing Brady’s geothermal system and a ‘greenfield’ geothermal prospect at Astor Pass, NV. These analyses, based on 3D structural and stratigraphic framework models, reveal subsurface characteristics of each system well beyond the scope of standard exploration methods. At Brady’s, the geothermal fairways we define correlate well with successful production wells and pinpoint several drilling targets for maintaining or expanding production in the field. In addition, hot-dry wells within the Brady’s geothermal field lie outside our defined geothermal fairways. At Astor Pass, our play-fairway analysis provides a data-based conceptual model of fluid flow within the geothermal system and indicates several targets for exploration drilling.
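Criterion 1) above, faults ideally oriented for slip under the ambient stress, is commonly quantified by a slip-tendency ratio (shear over normal traction on the fault plane). The abstract does not name its exact criterion, so the Morris-style measure and the stress values below are assumptions for illustration only:

```python
import math

def traction(stress, n):
    """Traction vector t = S·n for a symmetric 3x3 stress tensor (MPa)."""
    return [sum(stress[i][j] * n[j] for j in range(3)) for i in range(3)]

def slip_tendency(stress, n):
    """Slip tendency Ts = shear traction / normal traction on a fault
    plane with unit normal n. Faults with higher Ts are closer to
    frictional failure under the ambient stress field."""
    t = traction(stress, n)
    sigma_n = sum(t[i] * n[i] for i in range(3))       # normal component
    tau = math.sqrt(max(sum(ti * ti for ti in t) - sigma_n**2, 0.0))
    return tau / sigma_n

# Illustrative principal stresses along the axes:
# vertical 60 MPa, horizontals 40 and 25 MPa; a fault dipping 60 degrees.
S = [[40.0, 0.0, 0.0], [0.0, 25.0, 0.0], [0.0, 0.0, 60.0]]
n = [0.0, math.sin(math.radians(60)), math.cos(math.radians(60))]
print(round(slip_tendency(S, n), 3))
```

In a play-fairway workflow, this kind of per-fault score would be one layer combined with fault-intersection density and lithology to map the fairway.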
Quantifying Cyber-Resilience Against Resource-Exhaustion Attacks
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fink, Glenn A.; Griswold, Richard L.; Beech, Zachary W.
2014-07-11
Resilience in the information sciences is notoriously difficult to define, much less to measure. But in mechanical engineering, the resilience of a substance is mathematically defined as the area under the stress vs. strain curve. We took inspiration from mechanics in an attempt to define resilience precisely for information systems. We first examine the meaning of resilience in language and engineering terms and then translate these definitions to the information sciences. We then tested our definitions of resilience on a very simple problem in networked queuing systems. We discuss lessons learned and make recommendations for using this approach in future work.
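The mechanical definition borrowed above, resilience as the area under the stress vs. strain curve, can be sketched with a trapezoid-rule integral; the linear-elastic curve below is illustrative, not data from the report:

```python
def resilience(stress_strain):
    """Area under a stress-strain curve by the trapezoid rule: the
    mechanical definition of resilience the paper takes as inspiration.
    Input: list of (strain, stress) points in increasing strain order."""
    area = 0.0
    for (e0, s0), (e1, s1) in zip(stress_strain, stress_strain[1:]):
        area += 0.5 * (s0 + s1) * (e1 - e0)
    return area

# Linear-elastic ramp: stress = 200 * strain, sampled up to strain 0.1.
# Analytic area: 0.5 * 0.1 * 20 = 1.0.
curve = [(e / 100, 200 * e / 100) for e in range(11)]
print(resilience(curve))
```

The analogy in the paper replaces stress and strain with load and degradation measures for an information system; the integral form is what carries over.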
Precision of computer vision systems for real-time inspection of contact wire wear in railways
NASA Astrophysics Data System (ADS)
Borromeo, Susana; Aparicio, Jose L.
2005-02-01
This paper studies techniques to improve the precision of systems for measuring the wear of the contact wire in railways. The problem of wear measurement, characterized by important determining factors such as sampling rate and auscultation conditions, is studied in detail, and different solutions for resolving it are examined. Issues related to image acquisition and image processing are discussed, including the type of illumination and sensors employed, the image-processing hardware, and the image-processing algorithms. After analyzing each factor that influences the precision of the measurement system, a set of solutions is proposed to optimize the conditions under which the inspection can be carried out.
Teleman localization of Hochschild homology in a singular setting
NASA Astrophysics Data System (ADS)
Brasselet, J.-P.; Legrand, A.
2009-09-01
The aim of this paper is to generalize the Hochschild-Kostant-Rosenberg theorem to the case of singular varieties, more precisely, to manifolds with boundary and to varieties with isolated singularities. In these situations, we define suitable algebras of functions and study the localization of the corresponding Hochschild homology. The tool we use is the Teleman localization process. In the case of isolated singularities, the closed Hochschild homology corresponds to the intersection complex which relates the objects defined here to intersection homology.
Revisiting the pH Effect on Orthophosphate Control of Plumbosolvency
Although solubility models for Pb(II) have largely been successful for giving corrosion control treatment guidance for over 2 decades, very little systematic research has been done to precisely define plumbosolvency responses to changes in pH, carbonate and phosphate concentratio...
After Behaviourism, Navigationism?
ERIC Educational Resources Information Center
Moran, Sean
2008-01-01
Two previous articles in this journal advocate the greater use of a behaviourist methodology called "Precision Teaching" (PT). From a position located within virtue ethics, this article argues that the technical feat of raising narrowly defined performance in mathematics and other subjects is not sufficient justification for the…
Design of Mechanisms for Deployable, Optical Instruments: Guidelines for Reducing Hysteresis
NASA Technical Reports Server (NTRS)
Lake, Mark S.; Hachkowski, M. Roman
2000-01-01
This paper is intended to facilitate the development of deployable, optical instruments by providing a rational approach for the design, testing, and qualification of high-precision (i.e., low-hysteresis) deployment mechanisms for these instruments. Many of the guidelines included herein come directly from the field of optomechanical engineering, and are, therefore, neither newly developed guidelines, nor are they uniquely applicable to the design of high-precision deployment mechanisms. This paper is to be regarded as a guide to design and not a set of NASA requirements, except as may be defined in formal project specifications. Furthermore, due to the rapid pace of advancement in the field of precision deployment, this paper should be regarded as a preliminary set of guidelines. However, it is expected that this paper, with revisions as experience may indicate to be desirable, might eventually form the basis for a set of uniform design requirements for high-precision deployment mechanisms on future NASA space-based science instruments.
Personalized medicine and chronic obstructive pulmonary disease.
Wouters, E F M; Wouters, B B R A F; Augustin, I M L; Franssen, F M E
2017-05-01
The current review summarizes ongoing developments in personalized medicine and precision medicine in chronic obstructive pulmonary disease (COPD). Our current approach is far from personalized management algorithms, as current recommendations for COPD are largely based on a reductionist disease description operationally defined by the results of spirometry. Besides precision medicine developments, a personalized medicine approach in COPD is described based on a holistic view of the patient, considering illness as the consequence of dynamic interactions within and between multiple interacting and self-adjusting systems. Pulmonary rehabilitation is described as a model of personalized medicine. Largely based on the current understanding of inflammatory processes in COPD, targeted interventions in COPD are reviewed. Augmentation therapy for α1-antitrypsin deficiency is described as a model of precision medicine in COPD, based on a profound understanding of the related genetic endotype. Future developments of precision medicine in COPD require identification of relevant endotypes combined with proper identification of the phenotypes involved in the complex and heterogeneous manifestations of COPD.
Jia, Mochen; Liu, Guofeng; Sun, Zhen; Fu, Zuoling; Xu, Weiguo
2018-02-05
Absolute temperature sensitivity (S_a) reflects the precision of sensors based on the same mechanism, whereas relative temperature sensitivity (S_r) is used to compare sensors based on different mechanisms. For fluorescence intensity ratio (FIR) thermometry based on two thermally coupled energy levels of one rare-earth (RE) ion, we define a new ratio as the temperature-sensing parameter; it can vary greatly with temperature in some circumstances and can yield higher S_a without changing S_r. We further discuss the conditions under which each of these two forms of the temperature-sensing parameter achieves higher S_a for biomedical temperature sensing. Based on the new ratio as the temperature-sensing parameter, the S_a and S_r of BaTiO3:0.01%Pr3+,8%Yb3+ nanoparticles at 313 K reach as high as 0.1380 K^-1 and 1.23% K^-1, respectively. Similarly, the S_a and S_r of BaTiO3:1%Er3+,3%Yb3+ nanoparticles at 313 K are as high as 0.0413 K^-1 and 1.05% K^-1, respectively. By flexibly choosing between the two ratios as the temperature-sensing parameter, higher S_a can be obtained at the target temperature, which means higher precision for FIR thermometers.
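For the conventional two-level FIR scheme underlying this work, both sensitivities follow directly from the Boltzmann form of the intensity ratio. A minimal sketch (the energy gap and prefactor below are illustrative assumptions, not values from the paper):

```python
import math

K_B = 8.617333e-5  # Boltzmann constant, eV/K

def fir_ratio(T, delta_E, C=1.0):
    # Conventional FIR of two thermally coupled levels: R(T) = C * exp(-dE / (kB * T))
    return C * math.exp(-delta_E / (K_B * T))

def sensitivities(T, delta_E, C=1.0):
    R = fir_ratio(T, delta_E, C)
    S_r = delta_E / (K_B * T ** 2)  # relative sensitivity (1/R)(dR/dT), K^-1
    S_a = R * S_r                   # absolute sensitivity dR/dT, K^-1
    return S_a, S_r

# Illustrative gap of ~0.087 eV (roughly 700 cm^-1, typical of coupled RE levels)
S_a, S_r = sensitivities(313.0, 0.087)
```

At 313 K this gives S_r on the order of 1% K^-1, consistent in magnitude with the values reported above; note that in this form S_r depends only on the gap and temperature, which is why a redefined ratio can raise S_a while leaving S_r unchanged.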
Yang, Cui; Heinze, Julia; Helmert, Jens; Weitz, Juergen; Reissfelder, Christoph; Mees, Soeren Torge
2017-12-01
Distractions such as phone calls are common during laparoscopic surgery in many operating rooms. The aim of this single-centre, prospective study was to assess whether laparoscopic performance is impaired by intraoperative phone calls in novice surgeons. From October 2015 to June 2016, 30 novice surgeons (medical students) underwent a laparoscopic surgery training curriculum including two validated tasks (peg transfer, precision cutting) until achieving a defined level of proficiency. For testing, participants were required to perform these tasks under three conditions: no distraction (control) and two standardised distractions in the form of phone calls requiring a response (mild and strong distraction). Task performance was evaluated by analysing the time and accuracy of the tasks and the response to the phone call. In peg transfer (the easy task), mild distraction did not worsen performance significantly, while strong distraction was linked to error and inefficiency, with significantly deteriorated performance (P < 0.05). Precision cutting (the difficult task) was not slowed down by mild distraction, but surgical and cognitive errors increased significantly when participants were distracted (P < 0.05). Compared to mild distraction, participants reported more severe subjective disturbance when diverted by strong distraction (P < 0.05). Our data reveal that phone call distractions impair laparoscopic performance under certain circumstances. To ensure patient safety, phone calls should be avoided as far as possible in operating rooms.
Theoretical and Applied Research in the Field of Higher Geodesy Conducted in Rzeszow
NASA Astrophysics Data System (ADS)
Kadaj, Roman; Świętoń, Tomasz
2016-06-01
Important qualitative changes have taken place in Polish geodesy over the last few years, related to the application of new techniques and technologies and to the introduction of European reference frames in Poland. A new reference station network, ASG-EUPOS, was created, together with Internet services that support precise positioning and allow fast establishment of precise hybrid networks. New, accurate satellite networks became the basis of new definitions in the field of reference systems. At the same time, a need arose for new software enabling geodetic works to be executed under the new technical conditions. The authors had the opportunity to participate in these undertakings, also under the aegis of GUGiK, by creating methods, algorithms, and the necessary software tools. In this way the automatic postprocessing module (APPS) in the POZGEO service, a part of the ASG-EUPOS system, came into being; it is an entirely Polish product that works in the Trimble environment. Universal software for transformation between the PL-ETRF89, PL-ETRF2000, and PULKOWO'42 reference systems, as well as defined coordinate systems, was created (TRANSPOL v. 2.06) and published as an open product. An essential functional element of the program is the quasi-geoid model PL-geoid-2011, which was elaborated by adjustment (calibration) of the global quasi-geoid model EGM2008 to 570 geodetic (satellite-levelling) points. These and other studies are briefly described in this paper.
Maduri, Rodolfo; Viaroli, Edoardo; Levivier, Marc; Daniel, Roy T; Messerer, Mahmoud
2017-01-01
Cranioplasty is considered a simple reconstructive procedure, usually performed in a single stage. In some clinical conditions, such as in children with multifocal flap osteolysis, it can represent a surgical challenge. In these patients, the partially resorbed autologous flap should be removed and replaced with a custom-made prosthesis that precisely matches the expected bone defect. We describe the technique used for a navigated cranioplasty in a 3-year-old child with multifocal autologous flap osteolysis. We decided to perform a cranioplasty using a custom-made hydroxyapatite porous ceramic flap. The prosthesis was produced with an epoxy resin 3D skull model of the patient, which included a removable flap corresponding to the planned cranioplasty. Preoperatively, a CT scan of the 3D skull model was performed without the removable flap. The CT scan images of the 3D skull model were merged with the preoperative 3D CT scan of the patient and navigated during the cranioplasty to precisely define the cranioplasty margins. After removal of the autologous resorbed flap, the hydroxyapatite prosthesis matched the skull defect perfectly. The anatomical result was excellent. Thus, the implementation of cranioplasty with image-merge navigation of a 3D skull model may improve cranioplasty accuracy, allowing precise anatomic reconstruction in complex skull defect cases. © 2017 S. Karger AG, Basel.
Pre-Test Assessment of the Upper Bound of the Drag Coefficient Repeatability of a Wind Tunnel Model
NASA Technical Reports Server (NTRS)
Ulbrich, N.; L'Esperance, A.
2017-01-01
A new method is presented that computes a pre-test estimate of the upper bound of the drag coefficient repeatability of a wind tunnel model. This upper bound is a conservative estimate of the precision error of the drag coefficient. For clarity, precision error contributions associated with the measurement of the dynamic pressure are analyzed separately from those associated with the measurement of the aerodynamic loads. The upper bound is computed by using information about the model, the tunnel conditions, and the balance, in combination with an estimate of the expected output variations as input. The model information consists of the reference area and an assumed angle of attack. The tunnel conditions are described by the Mach number and the total pressure or unit Reynolds number. The balance inputs are the partial derivatives of the axial and normal force with respect to all balance outputs. Finally, an empirical output variation of 1.0 microV/V is used to relate both random instrumentation and angle measurement errors to the precision error of the drag coefficient. Results of the analysis are reported by plotting the upper bound of the precision error versus the tunnel conditions. The analysis shows that the influence of the dynamic pressure measurement error on the precision error of the drag coefficient is often small when compared with the influence of errors associated with the load measurements. Consequently, the sensitivities of the axial and normal force gages of the balance have a significant influence on the overall magnitude of the drag coefficient's precision error. Therefore, results of the error analysis can be used for balance selection purposes, as the drag prediction characteristics of balances of similar size and capacities can objectively be compared. Data from two wind tunnel models and three balances are used to illustrate the assessment of the precision error of the drag coefficient.
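The separation of dynamic-pressure and load contributions can be sketched with first-order error propagation. This is a simplified, hypothetical reconstruction that combines only an axial-load term and a dynamic-pressure term in quadrature, not the full balance-output analysis of the paper:

```python
import math

def drag_coeff(A, q, S):
    # C_D = A / (q * S): axial force A, dynamic pressure q, reference area S
    return A / (q * S)

def drag_coeff_precision(A, q, S, sigma_A, sigma_q):
    # First-order propagation of independent precision errors in A and q:
    # sigma_CD^2 = (dCD/dA * sigma_A)^2 + (dCD/dq * sigma_q)^2
    dCd_dA = 1.0 / (q * S)
    dCd_dq = -A / (q ** 2 * S)
    return math.sqrt((dCd_dA * sigma_A) ** 2 + (dCd_dq * sigma_q) ** 2)
```

Keeping the two terms separate mirrors the paper's approach of analyzing dynamic-pressure errors apart from load-measurement errors; the paper's finding is that the load term typically dominates.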
Attentional priority determines working memory precision.
Klyszejko, Zuzanna; Rahmati, Masih; Curtis, Clayton E
2014-12-01
Visual working memory is a system used to hold information actively in mind for a limited time. The number of items and the precision with which we can store information has limits that define its capacity. How much control do we have over the precision with which we store information when faced with these severe capacity limitations? Here, we tested the hypothesis that rank-ordered attentional priority determines the precision of multiple working memory representations. We conducted two psychophysical experiments that manipulated the priority of multiple items in a two-alternative forced choice task (2AFC) with distance discrimination. In Experiment 1, we varied the probabilities with which memorized items were likely to be tested. To generalize the effects of priority beyond simple cueing, in Experiment 2, we manipulated priority by varying monetary incentives contingent upon successful memory for items tested. Moreover, we illustrate our hypothesis using a simple model that distributed attentional resources across items with rank-ordered priorities. Indeed, we found evidence in both experiments that priority affects the precision of working memory in a monotonic fashion. Our results demonstrate that representations of priority may provide a mechanism by which resources can be allocated to increase the precision with which we encode and briefly store information. Copyright © 2014 Elsevier Ltd. All rights reserved.
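The resource-allocation idea can be illustrated with a toy model. The proportional split and the square-root precision law below are assumptions chosen for illustration, not the authors' fitted model:

```python
def precision_by_priority(priorities):
    # Split a fixed attentional resource pool in proportion to each item's
    # priority, and assume memory precision grows as the square root of
    # the resource share (illustrative power law).
    total = float(sum(priorities))
    return [(p / total) ** 0.5 for p in priorities]

# Three memorized items cued with test probabilities 0.6, 0.3, 0.1
precisions = precision_by_priority([0.6, 0.3, 0.1])
```

Any monotonic mapping from share to precision reproduces the key qualitative prediction tested in both experiments: precision falls off monotonically with rank-ordered priority.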
Nanomaterials for Cancer Precision Medicine.
Wang, Yilong; Sun, Shuyang; Zhang, Zhiyuan; Shi, Donglu
2018-04-01
Medical science has recently advanced to the point where diagnosis and therapeutics can be carried out with high precision, even at the molecular level. A new field of "precision medicine" has consequently emerged with specific clinical implications and challenges that can be well-addressed by newly developed nanomaterials. Here, a nanoscience approach to precision medicine is provided, with a focus on cancer therapy, based on a new concept of "molecularly-defined cancers." "Next-generation sequencing" is introduced to identify the oncogene that is responsible for a class of cancers. This new approach is fundamentally different from all conventional cancer therapies that rely on diagnosis of the anatomic origins where the tumors are found. To treat cancers at molecular level, a recently developed "microRNA replacement therapy" is applied, utilizing nanocarriers, in order to regulate the driver oncogene, which is the core of cancer precision therapeutics. Furthermore, the outcome of the nanomediated oncogenic regulation has to be accurately assessed by the genetically characterized, patient-derived xenograft models. Cancer therapy in this fashion is a quintessential example of precision medicine, presenting many challenges to the materials communities with new issues in structural design, surface functionalization, gene/drug storage and delivery, cell targeting, and medical imaging. © 2018 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Wang, Kai; Luo, Ying
2013-07-08
As one important category of biological molecules on the cell surface and in the extracellular matrix (ECM), glycosaminoglycans (GAGs) have been widely studied for biomedical applications. With the understanding that the biological functions of GAGs are driven by the complex dynamics of physiological and pathological processes, methodologies are desired to allow the elucidation of cell-GAG interactions with molecular level precision. In this study, a microtiter plate-based system was devised through a new surface modification strategy involving polydopamine (PDA) and GAG molecules functionalized with hydrazide chemical groups. A small library of GAGs including hyaluronic acid (with different molecular weights), heparin, and chondroitin sulfate was successfully immobilized via defined binding sites onto the microtiter plate surface under facile aqueous conditions. The methodology then allowed parallel studies of the GAG-modified surfaces in a high-throughput format. The results show that immobilized GAGs possess distinct properties to mediate protein adsorption, cell adhesion, and inflammatory responses, with each property showing dependence on the type and molecular weight of specific GAG molecules. The PDA-assisted immobilization of hydrazide-functionalized GAGs allows biomimetic attachment of GAG molecules and retains their bioactivity, providing a new methodology to systematically probe fundamental cell-GAG interactions to modulate the bioactivity and biocompatibility of biomaterials.
Chemically defined, ultrasoft PDMS elastomers with selectable elasticity for mechanobiology
Heinrichs, Viktor; Dieluweit, Sabine; Stellbrink, Jörg; Pyckhout-Hintzen, Wim; Hersch, Nils; Richter, Dieter
2018-01-01
Living animal cells are strongly influenced by the mechanical properties of their environment. To model physiological conditions, ultrasoft cell culture substrates, in some instances with an elasticity (Young's modulus) of only 1 kPa, are mandatory. Due to their long shelf life, PDMS-based elastomers are a popular choice. However, uncertainty about additives in commercial formulations and difficulties in reaching very soft materials limit their use. Here, we produced silicone elastomers from a few chemically defined and commercially available substances. The elastomers exhibited elasticities in the range from 1 kPa to 55 kPa. In detail, a high-molecular-weight (155 kg/mol), vinyl-terminated linear silicone was crosslinked with a multifunctional (f = 51) crosslinker (a copolymer of dimethyl siloxane and hydrosilane) by a platinum catalyst. The following strategies towards ultrasoft materials were explored: sparse crosslinking, swelling with inert silicone polymers, and, finally, the deliberate introduction of dangling ends into the network (inhibition). Rheological experiments at very low frequencies provided precise viscoelastic characterization. All strategies enabled tuning of stiffness, with the lowest stiffness of ~1 kPa reached by inhibition; this system was also the most practical to use. The biocompatibility of the materials was tested using primary cortical neurons from rats. Even after several days of cultivation, no adverse effects were found. PMID:29624610
A robust definition of South Asian monsoon onset and retreat
NASA Astrophysics Data System (ADS)
Walker, J. M.; Bordoni, S.
2017-12-01
In this study, we revisit one of the major outstanding problems in the monsoon literature: defining the onset and retreat of the South Asian summer monsoon (SASM). The SASM rainy season, which provides essential water resources to densely populated and rapidly growing countries in South Asia, begins with a dramatic increase in rainfall and an abrupt reversal in near-surface winds, and concludes with a more gradual transition at season's end. Many different measures of SASM onset and retreat have been developed for specific applications, but there is no widely accepted and broadly applicable objective definition. Existing definitions generally rely upon thresholds, posing challenges such as sensitivity to threshold selection and susceptibility to false onsets due to transient weather conditions. In this study, we use the large-scale atmospheric moisture budget to define an SASM onset and retreat index that captures the seasonal transitions in both precipitation and circulation. Our use of change point detection eliminates the need for thresholds, provides a precise characterization of the timescales and stages of the SASM, and allows straightforward comparison across different datasets and climate models. This robust and flexible methodology is ideal for studying variability and trends in monsoon timing, as well as comparing model performance and assessing future SASM changes in climate simulations.
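A threshold-free change-point step of the kind described can be sketched as a two-segment least-squares fit over a seasonal time series. This is a minimal stand-in for the full moisture-budget index, with an idealized series as the assumed input:

```python
def change_point(x):
    # Return the index k that best splits x into two constant-mean segments
    # (minimum total squared error). No threshold is involved, so the result
    # is insensitive to threshold selection and transient false onsets.
    n = len(x)
    best_k, best_cost = 1, float("inf")
    for k in range(1, n):
        left, right = x[:k], x[k:]
        m1 = sum(left) / len(left)
        m2 = sum(right) / len(right)
        cost = (sum((v - m1) ** 2 for v in left)
                + sum((v - m2) ** 2 for v in right))
        if cost < best_cost:
            best_cost, best_k = cost, k
    return best_k

# Idealized dry-to-wet transition: onset detected at the jump
onset = change_point([1.0] * 20 + [8.0] * 20)  # -> 20
```

Running the same detector backwards from season's end (or on the reversed series) gives a retreat index in the same way, which is what makes the definition symmetric across onset and retreat.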
Deliquescence and efflorescence of small particles.
McGraw, Robert; Lewis, Ernie R
2009-11-21
We examine size-dependent deliquescence/efflorescence phase transformation for particles down to several nanometers in size. Thermodynamic properties of inorganic salt particles, coated with aqueous solution layers of varying thickness and surrounded by vapor, are analyzed. A thin layer criterion (TLC) is introduced to define a limiting deliquescence relative humidity (RH(D)) for small particles. This requires: (1) equality of chemical potentials between salt in an undissolved core, and thin adsorbed solution layer, and (2) equality of chemical potentials between water in the thin layer and vapor phase. The usual bulk deliquescence conditions are recovered in the limit of large dry particle size. Nanosize particles are found to deliquesce at relative humidity just below the RH(D) on crossing a nucleation barrier, located at a critical solution layer thickness. This barrier vanishes precisely at the RH(D) defined by the TLC. Concepts and methods from nucleation theory including the kinetic potential, self-consistent nucleation theory, nucleation theorems, and the Gibbs dividing surface provide theoretical foundation and point to unifying features of small particle deliquescence/efflorescence processes. These include common thermodynamic area constructions, useful for interpretation of small particle water uptake measurements, and a common free-energy surface, with constant RH cross sections describing deliquescence and efflorescence related through the nucleation theorem.
Qiang, Weiguang; Wu, Qinqin; Zhou, Fuxiang; Xie, Conghua; Wu, Changping; Zhou, Yunfeng
2014-03-07
Mammalian telomeres are protected by the shelterin complex, which contains the six core proteins POT1, TPP1, TIN2, TRF1, TRF2 and RAP1. TPP1, formerly known as TINT1, PTOP, and PIP1, is a key factor that regulates telomerase recruitment and activity; in addition, TPP1 is required to mediate shelterin assembly and stabilize telomeres. Previous work found that TPP1 expression was elevated in radioresistant cells and that overexpression of TPP1 led to radioresistance and telomere lengthening in telomerase-positive cells. However, the exact effects and mechanism of TPP1 on radiosensitivity remain to be precisely defined in ALT cells. Here we report the phenotypes of conditional deletion of TPP1 from human osteosarcoma U2OS cells, which use the ALT pathway to extend their telomeres. TPP1 deletion resulted in telomere shortening, increased apoptosis, and enhanced radiation sensitivity. Together, our findings show that TPP1 plays a vital role in telomere maintenance and protection and establish an intimate relationship between TPP1, telomeres, and the cellular response to ionizing radiation, although the specific mechanism remains to be defined. Copyright © 2014 Elsevier Inc. All rights reserved.
Precision agriculture in large-scale mechanized farming
USDA-ARS?s Scientific Manuscript database
Precision agriculture involves a great deal of technologies and requires additional investments of money and time, but it can be practiced at different levels depending on the specific field and crop conditions and the resources and technology services available to the farmer. If practiced properly,...
Proteomic Analysis of Arsenic-Induced Oxidative Stress in Human Epidermal Keratinocytes
Chronic exposure to inorganic arsenic (IAs) has been associated with the development of several human cancers, including those found in the skin, lung, urinary bladder, liver, prostate and kidney. The precise mechanisms by which arsenic causes cancer are unknown. Defining the mod...
If cannot define and quantify Ecosystem Services consistently and systematically - we might be lost!
Imagine that every industrial sector, firm, municipality and state reported and classified their production using different definitions and units - Gross Domestic Product (GDP) would be impossible to calculate! This is precisely the difficult situation in which we find ourselves...
Microholography of Living Organisms.
ERIC Educational Resources Information Center
Solem, Johndale C.; Baldwin, George C.
1982-01-01
By using intense pulsed coherent x-ray sources it will be possible to obtain magnified three-dimensional images of living elementary biological structures at precisely defined instants. Discussed are sources/geometrics for x-ray holography, x-radiation interactions, factors affecting resolution, recording the hologram, high-intensity holography,…
Adaptive electron beam shaping using a photoemission gun and spatial light modulator
DOE Office of Scientific and Technical Information (OSTI.GOV)
Maxson, Jared; Lee, Hyeri; Bartnik, Adam C.; Kiefer, Jacob; Bazarov, Ivan
2015-02-01
The need for precisely defined beam shapes in photoelectron sources has been well established. In this paper, we use a spatial light modulator and simple shaping algorithm to create arbitrary, detailed transverse laser shapes with high fidelity. We transmit this shaped laser to the photocathode of a high voltage dc gun. Using beam currents where space charge is negligible, and using an imaging solenoid and fluorescent viewscreen, we show that the resultant beam shape preserves these detailed features with similar fidelity. Next, instead of transmitting a shaped laser profile, we use an active feedback on the unshaped electron beam image to create equally accurate and detailed shapes. We demonstrate that this electron beam feedback has the added advantage of correcting for electron optical aberrations, yielding shapes without skew. The method may serve to provide precisely defined electron beams for low current target experiments, space-charge dominated beam commissioning, as well as for online adaptive correction of photocathode quantum efficiency degradation.
Leukaemia cell of origin identified by chromatin landscape of bulk tumour cells
George, Joshy; Uyar, Asli; Young, Kira; Kuffler, Lauren; Waldron-Francis, Kaiden; Marquez, Eladio; Ucar, Duygu; Trowbridge, Jennifer J.
2016-01-01
The precise identity of a tumour's cell of origin can influence disease prognosis and outcome. Reliably defining a tumour's cell of origin from primary, bulk tumour cell samples has been a challenge. Here we use a well-defined model of MLL-rearranged acute myeloid leukaemia (AML) to demonstrate that transforming haematopoietic stem cells (HSCs) and multipotent progenitors results in more aggressive AML than transforming committed progenitor cells. Transcriptome profiling reveals a gene expression signature broadly distinguishing stem cell-derived versus progenitor cell-derived AML, including genes involved in immune escape, extravasation and small GTPase signal transduction. However, whole-genome profiling of open chromatin reveals precise and robust biomarkers reflecting each cell of origin tested, from bulk AML tumour cell sampling. We find that bulk AML tumour cells exhibit distinct open chromatin loci that reflect the transformed cell of origin, and suggest that open chromatin patterns may be leveraged as prognostic signatures in human AML. PMID:27397025
Evaluating structural pattern recognition for handwritten math via primitive label graphs
NASA Astrophysics Data System (ADS)
Zanibbi, Richard; Mouchère, Harold; Viard-Gaudin, Christian
2013-01-01
Currently, structural pattern recognizer evaluations compare graphs of detected structure to target structures (i.e. ground truth) using recognition rates, recall and precision for object segmentation, classification and relationships. In document recognition, these target objects (e.g. symbols) are frequently comprised of multiple primitives (e.g. connected components, or strokes for online handwritten data), but current metrics do not characterize errors at the primitive level, from which object-level structure is obtained. Primitive label graphs are directed graphs defined over primitives and primitive pairs. We define new metrics obtained by Hamming distances over label graphs, which allow classification, segmentation and parsing errors to be characterized separately, or using a single measure. Recall and precision for detected objects may also be computed directly from label graphs. We illustrate the new metrics by comparing a new primitive-level evaluation to the symbol-level evaluation performed for the CROHME 2012 handwritten math recognition competition. A Python-based set of utilities for evaluating, visualizing and translating label graphs is publicly available.
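The core metric reduces to a Hamming count over node and edge labels. A small sketch, under the assumption that a label graph is stored as a dict keyed by primitives and primitive pairs (the stroke ids and labels below are hypothetical):

```python
def hamming_label_distance(truth, detected):
    # truth/detected: dicts mapping primitives (e.g. stroke ids) and primitive
    # pairs (relations) to labels. Disagreements and missing entries both count,
    # so classification, segmentation and parsing errors accumulate in one sum.
    keys = set(truth) | set(detected)
    return sum(1 for k in keys if truth.get(k) != detected.get(k))

truth = {"s1": "x", "s2": "x", ("s1", "s2"): "same-symbol"}
detected = {"s1": "x", "s2": "y", ("s1", "s2"): "right-of"}
d = hamming_label_distance(truth, detected)  # -> 2
```

Restricting the key set to nodes only, or to edges only, separates classification errors from segmentation and relation errors, which is the decomposition the paper exploits.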
Atomically precise graphene nanoribbon heterojunctions from a single molecular precursor
NASA Astrophysics Data System (ADS)
Nguyen, Giang D.; Tsai, Hsin-Zon; Omrani, Arash A.; Marangoni, Tomas; Wu, Meng; Rizzo, Daniel J.; Rodgers, Griffin F.; Cloke, Ryan R.; Durr, Rebecca A.; Sakai, Yuki; Liou, Franklin; Aikawa, Andrew S.; Chelikowsky, James R.; Louie, Steven G.; Fischer, Felix R.; Crommie, Michael F.
2017-11-01
The rational bottom-up synthesis of atomically defined graphene nanoribbon (GNR) heterojunctions represents an enabling technology for the design of nanoscale electronic devices. Synthetic strategies used thus far have relied on the random copolymerization of two electronically distinct molecular precursors to yield GNR heterojunctions. Here we report the fabrication and electronic characterization of atomically precise GNR heterojunctions prepared through late-stage functionalization of chevron GNRs obtained from a single precursor. Post-growth excitation of fully cyclized GNRs induces cleavage of sacrificial carbonyl groups, resulting in atomically well-defined heterojunctions within a single GNR. The GNR heterojunction structure was characterized using bond-resolved scanning tunnelling microscopy, which enables chemical bond imaging at T = 4.5 K. Scanning tunnelling spectroscopy reveals that band alignment across the heterojunction interface yields a type II heterojunction, in agreement with first-principles calculations. GNR heterojunction band realignment proceeds over a distance less than 1 nm, leading to extremely large effective fields.
NASA Technical Reports Server (NTRS)
Hibbard, William L.; Dyer, Charles R.; Paul, Brian E.
1994-01-01
The VIS-AD data model integrates metadata about the precision of values, including missing data indicators and the way that arrays sample continuous functions, with the data objects of a scientific programming language. The data objects of this data model form a lattice, ordered by the precision with which they approximate mathematical objects. We define a similar lattice of displays and study visualization processes as functions from data lattices to display lattices. Such functions can be applied to visualize data objects of all data types and are thus polymorphic.
Jayakody, Chatura; Hull-Ryde, Emily A
2016-01-01
Well-defined quality control (QC) processes are used to determine whether a certain procedure or action conforms to a widely accepted standard and/or set of guidelines, and are important components of any laboratory quality assurance program (Popa-Burke et al., J Biomol Screen 14: 1017-1030, 2009). In this chapter, we describe QC procedures useful for monitoring the accuracy and precision of laboratory instrumentation, most notably automated liquid dispensers. Two techniques, gravimetric QC and photometric QC, are highlighted in this chapter. When used together, these simple techniques provide a robust process for evaluating liquid handler accuracy and precision, and critically underpin high-quality research programs.
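The gravimetric procedure reduces to converting replicate dispensed masses to volumes and summarizing accuracy (mean bias) and precision (coefficient of variation). A minimal sketch; the water density default is an assumption for room temperature, not a value from the chapter:

```python
def gravimetric_qc(masses_mg, target_ul, density_mg_per_ul=0.998):
    # Convert each weighed dispense (mg) to a volume (uL), then report
    # accuracy as percent deviation of the mean from target and
    # precision as the coefficient of variation (CV, %).
    volumes = [m / density_mg_per_ul for m in masses_mg]
    n = len(volumes)
    mean = sum(volumes) / n
    accuracy_pct = 100.0 * (mean - target_ul) / target_ul
    var = sum((v - mean) ** 2 for v in volumes) / (n - 1)  # sample variance
    cv_pct = 100.0 * var ** 0.5 / mean
    return accuracy_pct, cv_pct
```

A photometric QC run yields the same two summary statistics from absorbance-derived volumes, which is why the two techniques cross-check each other when used together.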
Wöstheinrich, K; Schmidt, P C
2000-06-01
The instrumentation and validation of a laboratory-scale fluidized bed apparatus is described. For continuous control of the process, the apparatus is instrumented with sensors for temperature, relative humidity (RH), and air velocity. Conditions of the inlet air, fluidizing air, product, and exhaust air were determined. The temperature sensors were calibrated at 0.0 degree C and 99.9 degrees C. The calibration of the humidity sensors covered the range from 12% RH to 98% RH using saturated electrolyte solutions. The calibration of the anemometer took place in a wind tunnel at defined air velocities. The calibrations yielded satisfactory results concerning sensitivity and precision. To evaluate the reproducibility of the process, 15 batches of granules were prepared under identical conditions. The influence of the type of pump used for delivering the granulating liquid was investigated. Particle size distribution, bulk density, and tapped density were determined. Granules were tableted on a rotary press at four different compression force levels, followed by determination of tablet properties such as weight, crushing strength, and disintegration time. The apparatus was found to produce granules with good reproducibility with respect to granule and tablet properties.
Temporal Control over Transient Chemical Systems using Structurally Diverse Chemical Fuels.
Chen, Jack L-Y; Maiti, Subhabrata; Fortunati, Ilaria; Ferrante, Camilla; Prins, Leonard J
2017-08-25
The next generation of adaptive, intelligent chemical systems will rely on a continuous supply of energy to maintain the functional state. Such systems will require chemical methodology that provides precise control over the energy dissipation process and, thus, the lifetime of the transiently activated function. This manuscript reports on the use of structurally diverse chemical fuels to control the lifetime of two different systems under dissipative conditions: transient signal generation and the transient formation of self-assembled aggregates. The energy stored in the fuels is dissipated at different rates by an enzyme, making the lifetime of the active system dependent on the chemical structure of the fuel. In the case of transient signal generation, it is shown that different chemical fuels can be used to generate a vast range of signal profiles, allowing temporal control over two orders of magnitude. Regarding self-assembly under dissipative conditions, the ability to control the lifetime using different fuels turns out to be particularly important, as stable aggregates are formed only at well-defined surfactant/fuel ratios, meaning that temporal control cannot be achieved by simply changing the fuel concentration. © 2017 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.
Bravi, Riccardo; Del Tongo, Claudia; Cohen, Erez James; Dalle Mura, Gabriele; Tognetti, Alessandro; Minciacchi, Diego
2014-06-01
The ability to perform isochronous movements while listening to a rhythmic auditory stimulus requires a flexible process that integrates timing information with movement. Here, we explored how non-temporal and temporal characteristics of an auditory stimulus (presence, interval occupancy, and tempo) affect motor performance. These characteristics were chosen on the basis of their ability to modulate the precision and accuracy of synchronized movements. Subjects participated in sessions in which they performed sets of repeated isochronous wrist flexion-extensions under various conditions defined by these characteristics. Kinematic parameters were evaluated during each session, and temporal parameters were analyzed. To isolate the effects of the auditory stimulus, we minimized all other sensory information that could interfere with its perception or affect the performance of repeated isochronous movements. The present study shows that the distinct characteristics of an auditory stimulus significantly influence isochronous movements by altering their duration. The results provide evidence for an adaptable control of timing in the audio-motor coupling for isochronous movements. This flexibility would make plausible the use of different encoding strategies to adapt audio-motor coupling to specific tasks.
Yang, Guanxue; Wang, Lin; Wang, Xiaofan
2017-06-07
Reconstruction of the networks underlying complex systems is one of the most crucial problems in many areas of engineering and science. In this paper, rather than identifying the parameters of complex systems governed by pre-defined models, or taking polynomial and rational functions as prior information for subsequent model selection, we put forward a general framework for nonlinear causal network reconstruction from time series with limited observations. Obtaining multi-source datasets through a data-fusion strategy, we propose a novel method to handle the nonlinearity and directionality of complex networked systems: group lasso nonlinear conditional Granger causality. Specifically, our method can exploit different sets of radial basis functions to approximate the nonlinear interactions between each pair of nodes and integrates sparsity into grouped variable selection. The performance of our approach is first assessed with two types of simulated datasets, from a nonlinear vector autoregressive model and from nonlinear dynamic models, and then verified on the benchmark datasets from DREAM3 Challenge 4. The effects of data size and noise intensity are also discussed. All of the results demonstrate that the proposed method performs better in terms of a higher area under the precision-recall curve.
Wireless inertial measurement of head kinematics in freely-moving rats
Pasquet, Matthieu O.; Tihy, Matthieu; Gourgeon, Aurélie; Pompili, Marco N.; Godsil, Bill P.; Léna, Clément; Dugué, Guillaume P.
2016-01-01
While miniature inertial sensors offer a promising means for precisely detecting, quantifying and classifying animal behaviors, versatile inertial sensing devices adapted for small, freely-moving laboratory animals are still lacking. We developed a standalone and cost-effective platform for performing high-rate wireless inertial measurements of head movements in rats. Our system is designed to enable real-time bidirectional communication between the headborne inertial sensing device and third party systems, which can be used for precise data timestamping and low-latency motion-triggered applications. We illustrate the usefulness of our system in diverse experimental situations. We show that our system can be used for precisely quantifying motor responses evoked by external stimuli, for characterizing head kinematics during normal behavior and for monitoring head posture under normal and pathological conditions obtained using unilateral vestibular lesions. We also introduce and validate a novel method for automatically quantifying behavioral freezing during Pavlovian fear conditioning experiments, which offers superior performance in terms of precision, temporal resolution and efficiency. Thus, this system precisely acquires movement information in freely-moving animals, and can enable objective and quantitative behavioral scoring methods in a wide variety of experimental situations. PMID:27767085
Development and validity of an instrumented handbike: initial results of propulsion kinetics.
van Drongelen, Stefan; van den Berg, Jos; Arnet, Ursina; Veeger, Dirkjan H E J; van der Woude, Lucas H V
2011-11-01
To develop an instrumented handbike system to measure the forces applied to the handgrip during handbiking. A 6-degrees-of-freedom force sensor was built into the handgrip of an attach-unit handbike, together with two optical encoders to measure the orientation of the handgrip and crank in space. Linearity, precision, and percent error were determined for static and dynamic tests. High linearity was demonstrated for both the static and the dynamic condition (r=1.01). Precision was high under the static condition (standard deviation of 0.2 N); however, the precision decreased with higher loads during the dynamic condition. Percent error values were between 0.3 and 5.1%. This is the first instrumented handbike system that can register 3-dimensional forces. It can be concluded that the instrumented handbike system allows for an accurate force analysis based on the forces registered at the handlebars. Copyright © 2011 IPEM. Published by Elsevier Ltd. All rights reserved.
Precision about the automatic emotional brain.
Vuilleumier, Patrik
2015-01-01
The question of automaticity in emotion processing has been debated under different perspectives in recent years. Satisfying answers to this issue will require a better definition of automaticity in terms of relevant behavioral phenomena, ecological conditions of occurrence, and a more precise mechanistic account of the underlying neural circuits.
Drift in Neural Population Activity Causes Working Memory to Deteriorate Over Time.
Schneegans, Sebastian; Bays, Paul M
2018-05-23
Short-term memories are thought to be maintained in the form of sustained spiking activity in neural populations. Decreases in recall precision observed with increasing number of memorized items can be accounted for by a limit on total spiking activity, resulting in fewer spikes contributing to the representation of each individual item. Longer retention intervals likewise reduce recall precision, but it is unknown what changes in population activity produce this effect. One possibility is that spiking activity becomes attenuated over time, such that the same mechanism accounts for both effects of set size and retention duration. Alternatively, reduced performance may be caused by drift in the encoded value over time, without a decrease in overall spiking activity. Human participants of either sex performed a variable-delay cued recall task with a saccadic response, providing a precise measure of recall latency. Based on a spike integration model of decision making, if the effects of set size and retention duration are both caused by decreased spiking activity, we would predict a fixed relationship between recall precision and response latency across conditions. In contrast, the drift hypothesis predicts no systematic changes in latency with increasing delays. Our results show both an increase in latency with set size, and a decrease in response precision with longer delays within each set size, but no systematic increase in latency for increasing delay durations. These results were quantitatively reproduced by a model based on a limited neural resource in which working memories drift rather than decay with time. SIGNIFICANCE STATEMENT Rapid deterioration over seconds is a defining feature of short-term memory, but what mechanism drives this degradation of internal representations? Here, we extend a successful population coding model of working memory by introducing possible mechanisms of delay effects. 
We show that a decay in neural signal over time predicts that the time required for memory retrieval will increase with delay, whereas a random drift in the stored value predicts no effect of delay on retrieval time. Testing these predictions in a multi-item memory task with an eye movement response, we identified drift as a key mechanism of memory decline. These results provide evidence for a dynamic spiking basis for working memory, in contrast to recent proposals of activity-silent storage. Copyright © 2018 Schneegans and Bays.
Cost Implications of Value-Based Pricing for Companion Diagnostic Tests in Precision Medicine.
Zaric, Gregory S
2016-07-01
Many interpretations of personalized medicine, also referred to as precision medicine, include discussions of companion diagnostic tests that allow drugs to be targeted to those individuals who are most likely to benefit or that allow treatment to be designed in a way such that individuals who are unlikely to benefit do not receive treatment. Many authors have commented on the clinical and competitive implications of companion diagnostics, but there has been relatively little formal analysis of the cost implications of companion diagnostics, although cost reduction is often cited as a significant benefit of precision medicine. We investigate the potential impact on costs of precision medicine implemented through the use of companion diagnostics. We develop a framework in which the costs of companion diagnostic tests are determined by considerations of profit maximization and cost effectiveness. We analyze four scenarios that are defined by the incremental cost-effectiveness ratio of the new drug in the absence of a companion diagnostic test. We find that, in most scenarios, precision medicine strategies based on companion diagnostics should be expected to lead to increases in costs in the short term and that costs would fall only in a limited number of situations.
Dylla, Daniel P.; Megison, Susan D.
2015-01-01
Objective. We compared the precision of a search strategy designed specifically to retrieve randomized controlled trials (RCTs) and systematic reviews of RCTs with search strategies designed for broader purposes. Methods. We designed an experimental search strategy that automatically revised searches up to five times by using increasingly restrictive queries as long as at least 50 citations were retrieved. We compared the ability of the experimental and alternative strategies to retrieve studies relevant to 312 test questions. The primary outcome, search precision, was defined for each strategy as the proportion of relevant, high-quality citations among the first 50 citations retrieved. Results. The experimental strategy had the highest median precision (5.5%; interquartile range [IQR]: 0%–12%), followed by the narrow strategy of the PubMed Clinical Queries (4.0%; IQR: 0%–10%). The experimental strategy found the most high-quality citations (median 2; IQR: 0–6) and was the strategy most likely to find at least one high-quality citation (73% of searches; 95% confidence interval 68%–78%). All comparisons were statistically significant. Conclusions. The experimental strategy performed best on all outcomes, although all strategies had low precision. PMID:25922798
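The precision outcome above is a simple proportion over the top-ranked results. As an illustrative sketch (the function name and citation IDs are hypothetical, not data from the study), precision among the first k retrieved citations can be computed as:

```python
def precision_at_k(retrieved, relevant, k=50):
    """Fraction of the first k retrieved citations that are relevant."""
    top_k = retrieved[:k]
    if not top_k:
        return 0.0
    hits = sum(1 for citation in top_k if citation in relevant)
    return hits / len(top_k)

# Hypothetical example: 50 retrieved citation IDs, 3 of which are
# relevant, high-quality studies.
retrieved = [f"pmid{i}" for i in range(50)]
relevant = {"pmid4", "pmid17", "pmid33"}
p = precision_at_k(retrieved, relevant)  # 3 hits in the top 50 -> 0.06
```

Reporting the median of this quantity over many test questions, as the study does, summarizes a strategy's typical yield per screened batch.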
A Strategy for DoD Manufacturing Science and Technology R and D in Precision Fabrication
1994-01-01
Bibliography. Appendix A: Progress Since the 1991 Plan. Appendix B: Why "Precision". Appendix C … precision fabrication R&D. Appendix A summarizes progress in precision fabrication R&D since the previous plan was prepared in 1991. Appendix B … a lathe's power consumption may indicate worn bearings. Detecting and acting on this condition can prevent costly spindle damage and associated machine down…
Balancing Flexible Constraints and Measurement Precision in Computerized Adaptive Testing
ERIC Educational Resources Information Center
Moyer, Eric L.; Galindo, Jennifer L.; Dodd, Barbara G.
2012-01-01
Managing test specifications--both multiple nonstatistical constraints and flexibly defined constraints--has become an important part of designing item selection procedures for computerized adaptive tests (CATs) in achievement testing. This study compared the effectiveness of three procedures: constrained CAT, flexible modified constrained CAT,…
A Better Model for Management Training
ERIC Educational Resources Information Center
Bobele, H. Kenneth; Buchanan, Peter J.
1976-01-01
Greater precision in appraising training needs, greater clarity in defining training objectives, and an emphasis on a practical, skills-oriented approach to management development can result from using Henry Mintzberg's model which describes managerial work in terms of 6 job characteristics and 10 interpersonal, informational, or decisional roles.…
AN OVERVIEW OF COASTAL ENVIRONMENTAL HEALTH INDICATORS
Discussions of the coastal environment and its health can be improved by more precise use of terms and clarification of the relationship, if any, between the health of ecosystems and the risks to human health. Ecosystem health is seldom defined and, in any case, has to be regarde...
Henson, Mary P.; Bergstedt, Roger A.; Adams, Jean V.
2003-01-01
The ability to predict when sea lampreys (Petromyzon marinus) will metamorphose from the larval phase to the parasitic phase is essential to the operation of the sea lamprey control program. During the spring of 1994, two populations of sea lamprey larvae from two rivers were captured, measured, weighed, implanted with coded wire tags, and returned to the same sites in the streams from which they were taken. Sea lampreys were recovered in the fall, after metamorphosis would have occurred, and checked for the presence of a tag. When the spring data were compared to the fall data it was found that the minimum requirements (length ≥ 120 mm, weight ≥ 3 g, and condition factor ≥ 1.50) suggested for metamorphosis did define a pool of larvae capable of metamorphosing. However, logistic regressions that relate the probability of metamorphosis to size are necessary to predict metamorphosis in a population. The data indicated, based on cross-validation, that weight measurements alone predicted metamorphosis with greater precision than length or condition factor in both the Marengo and Amnicon rivers. Based on the Akaike Information Criterion, weight alone was a better predictor in the Amnicon River, but length and condition factor combined predicted metamorphosis better in the Marengo River. There would be no additional cost if weight alone were used instead of length. However, if length and weight were measured the gain in predictive power would not be enough to justify the additional cost.
Watts, Sarah E; Turnell, Adrienne; Kladnitski, Natalie; Newby, Jill M; Andrews, Gavin
2015-04-01
This study had three aims: first, to examine the efficacy of CBT versus treatment-as-usual (TAU) in the treatment of anxiety and depressive disorders; second, to examine how TAU is defined in TAU control groups for those disorders; and third, to explore whether the type of TAU condition influences the estimate of the effects of CBT. A systematic search of the Cochrane Central Register of Controlled Trials, PsycINFO, and CINAHL was conducted. Forty-eight studies of CBT for depressive or anxiety disorders (n=6926) that specified that their control group received TAU were identified. Most (45/48) provided an explanation of the TAU group; however, there was significant heterogeneity among TAU conditions. The meta-analysis showed medium effects favoring CBT over TAU for both anxiety (g=0.69, 95% CI 0.47-0.92, p<0.001, n=1318) and depression (g=0.70, 95% CI 0.49-0.90, p<0.001, n=5054), with differential effects observed across TAU conditions. CBT is superior to TAU, and the size of the effect of CBT compared with TAU depends on the nature of the TAU condition. The term TAU is used in different ways and should be described more precisely. The four key details to be reported can be thought of as "who, what, how many, and any additional treatments?" Copyright © 2014 Elsevier B.V. All rights reserved.
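The pooled effect sizes reported here are standardized mean differences (Hedges' g). As a minimal sketch of the standard formula (the function and the numbers fed to it are illustrative, not data from this review), g is Cohen's d scaled by a small-sample correction factor:

```python
from math import sqrt

def hedges_g(m1, sd1, n1, m2, sd2, n2):
    """Standardized mean difference with Hedges' small-sample correction."""
    # Pooled standard deviation across the two groups
    s_pooled = sqrt(((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2) / (n1 + n2 - 2))
    d = (m1 - m2) / s_pooled               # Cohen's d
    j = 1 - 3 / (4 * (n1 + n2 - 2) - 1)    # correction factor (< 1)
    return j * d

# Illustrative groups: means 10 vs 8, common SD 4, 50 per arm.
g = hedges_g(10.0, 4.0, 50, 8.0, 4.0, 50)  # d = 0.5, shrunk slightly by j
```

The correction matters most in small trials; at 50 participants per arm it changes d by less than 1%.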
Precision wood particle feedstocks
Dooley, James H; Lanning, David N
2013-07-30
Wood particles having fibers aligned in a grain, wherein: the wood particles are characterized by a length dimension (L) aligned substantially parallel to the grain, a width dimension (W) normal to L and aligned cross grain, and a height dimension (H) normal to W and L; the L.times.H dimensions define two side surfaces characterized by substantially intact longitudinally arrayed fibers; the W.times.H dimensions define two cross-grain end surfaces characterized individually as aligned either normal to the grain or oblique to the grain; the L.times.W dimensions define two substantially parallel top and bottom surfaces; and, a majority of the W.times.H surfaces in the mixture of wood particles have end checking.
Design and Validation of High Data Rate Ka-Band Software Defined Radio for Small Satellite
NASA Technical Reports Server (NTRS)
Xia, Tian
2016-01-01
The Design and Validation of High Data Rate Ka-Band Software Defined Radio for Small Satellite project will develop a novel Ka-band software defined radio (SDR) capable of establishing high data rate inter-satellite links with a throughput of 500 megabits per second (Mb/s) and providing millimeter ranging precision. The system will be designed to operate with high performance and reliability that is robust against various interference effects and network anomalies. The Ka-band radio resulting from this work will improve upon state-of-the-art Ka-band radios in terms of dimensional size, mass, and power dissipation, which limit their use in small satellites.
Unambiguous UML Composite Structures: The OMEGA2 Experience
NASA Astrophysics Data System (ADS)
Ober, Iulian; Dragomir, Iulia
Starting from version 2.0, UML introduced hierarchical composite structures, which are a very expressive way of defining complex software architectures, but which have a very loosely defined semantics in the standard. In this paper we propose a set of consistency rules that ensure UML composite structures are unambiguous and can be given a precise semantics. Our primary application of the static consistency rules defined in this paper is within the OMEGA UML profile [6], but these rules are general and applicable to other hierarchical component models based on the same concepts, such as MARTE GCM or SysML. The rule set has been formalized in OCL and is currently used in the OMEGA UML compiler.
NASA Astrophysics Data System (ADS)
Li, Junye; Hu, Jinglei; Wang, Binyu; Sheng, Liang; Zhang, Xinming
2018-03-01
To investigate the effect of abrasive flow polishing on variable-diameter pipe parts, high-precision dispensing needles were taken as the research object and the polishing process was simulated numerically. The distributions of dynamic pressure and turbulent viscosity of the abrasive flow field inside the needle were analyzed under different volume-fraction conditions. The comparative analysis demonstrated the effectiveness of abrasive-grain polishing of high-precision dispensing needles: controlling the volume fraction of silicon carbide changes the viscosity characteristics of the abrasive flow during polishing, so that the polishing quality achieved by the abrasive grains can be controlled.
Hans-Erik Andersen; Stephen E. Reutebuch; Robert J. McGaughey
2006-01-01
The development of remote sensing technologies increases the potential to support more precise, efficient, and ecologically-sensitive approaches to forest resource management. One of the primary requirements of precision forest management is accurate and detailed 3D spatial data relating to the type and condition of forest stands and characteristics of the underlying...
Ar/Ar Dating Independent of Monitor Standard Ages
NASA Astrophysics Data System (ADS)
Boswell, S.; Hemming, S. R.
2015-12-01
Because the reported age of an analyzed sample is dependent on the age of the co-irradiated monitor standard(s), Ar/Ar dating is a relative dating technique. There is disagreement at the 1% scale in the age of commonly used monitor standards, and there is a great need to improve the inter-laboratory calibrations. Additionally, new approaches and insights are needed to meet the challenge of bringing the Ar/Ar chronometer to the highest possible precision and accuracy. In this spirit, we present a conceptual framework for Ar/Ar dating that does not depend on the age of monitor standards, but only on the K content of a solid standard. The concept is demonstrated by introducing a re-expressed irradiation parameter (JK) that depends on the ratio of 39ArK to 40Ar* rather than the 40Ar*/39ArK ratio. JK is equivalent to the traditional irradiation parameter J and is defined as JK = (39Ar/40K) • (λ/λe). The ultimate precision and accuracy of the method will depend on how precisely and accurately the 39Ar and 40K can be estimated, and will require isotope dilution measurements of both from the same aliquot. We are testing the workability of our technique at the 1% level by measuring weighed and irradiated hornblende and biotite monitor standards using GLO-1 glauconite to define a calibration curve for argon signals versus abundance.
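Because JK is stated to be equivalent to the traditional irradiation parameter J, the familiar Ar/Ar age equation t = (1/λ) ln(1 + J·R), with R = 40Ar*/39ArK, should carry over with JK in place of J. A minimal sketch under that assumption (the decay constant is the commonly used Steiger and Jäger value; the measured JK and R values are purely illustrative):

```python
from math import log

# Total 40K decay constant, 1/yr (Steiger & Jaeger, 1977 convention)
LAMBDA_TOTAL = 5.543e-10

def arar_age(jk, r):
    """Ar/Ar age (years) from the re-expressed irradiation parameter JK
    and the measured 40Ar*/39ArK ratio R, assuming the standard age
    equation t = (1/lambda) * ln(1 + J * R) with JK in the role of J."""
    return log(1.0 + jk * r) / LAMBDA_TOTAL

# Illustrative values only (not from the abstract):
jk = 0.0080
r = 12.0
t = arar_age(jk, r)  # on the order of 1.7e8 yr for these inputs
```

In this framing the accuracy of t inherits directly from how well 39Ar and 40K are measured, which is the calibration burden the abstract describes.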
Ceciliason, Ann-Sofie; Andersson, M Gunnar; Lindström, Anders; Sandler, Håkan
2018-02-01
This study's objective is to assess the accuracy and precision of estimating the postmortem interval (PMI) for decomposing human remains discovered in indoor settings. Data were collected prospectively from 140 forensic cases with a known date of death, scored according to the Total Body Score (TBS) scale at the post-mortem examination. In our model setting, it is estimated that, in cases with or without the presence of blowfly larvae, approximately 45% or 66%, respectively, of the variance in TBS can be derived from Accumulated Degree-Days (ADD). The precision in estimating ADD/PMI from TBS is, in our setting, moderate to low. However, dividing the cases into defined subgroups suggests the possibility of increasing the precision of the model. Our findings also suggest a significant seasonal difference with a concomitant influence on TBS in the complete data set, possibly driven by insect activity occurring mainly during summer. PMI may be underestimated in cases with desiccation. Likewise, the effect of insect activity needs to be evaluated to avoid overestimating the PMI. Our data sample indicates that the scoring method might need to be slightly modified to better reflect indoor decomposition, especially in cases with insect infestations and/or extensive desiccation. When applying TBS in an indoor setting, the model requires distinct inclusion criteria and a defined population. Copyright © 2017 Elsevier B.V. All rights reserved.
Impact of orbit modeling on DORIS station position and Earth rotation estimates
NASA Astrophysics Data System (ADS)
Štěpánek, Petr; Rodriguez-Solano, Carlos Javier; Hugentobler, Urs; Filler, Vratislav
2014-04-01
The high precision of estimated station coordinates and Earth rotation parameters (ERPs) obtained from satellite geodetic techniques rests on the precise determination of the satellite orbit. This paper analyzes the impact of different orbit parameterizations on the accuracy of station coordinates and ERPs derived from DORIS observations. In a series of experiments, DORIS data from the complete year 2011 were processed with different orbit model settings. First, the impact of precise modeling of the non-conservative forces on geodetic parameters was compared with results obtained with an empirical-stochastic modeling approach. Second, the temporal spacing of drag scaling parameters was tested. Third, the impact of estimating once-per-revolution harmonic accelerations in the cross-track direction was analyzed. Fourth, two approaches for handling solar radiation pressure (SRP) were compared: adjusting an SRP scaling parameter or fixing it at pre-defined values. Our analyses confirm that the empirical-stochastic orbit modeling approach, which requires neither satellite attitude information nor macro models, yields accuracy comparable to the dynamical model with precise non-conservative force modeling for most of the monitored station parameters. However, the dynamical orbit model reduces the RMS values of the estimated rotation pole coordinates by 17% for the x-pole and 12% for the y-pole. The experiments show that adjusting atmospheric drag scaling parameters every 30 min is appropriate for DORIS solutions. Moreover, adjusting a cross-track once-per-revolution empirical parameter increases the RMS of the estimated Earth rotation pole coordinates. With recent data, however, it was not possible to confirm the previously reported high annual variation in the estimated geocenter z-translation series, nor its mitigation by fixing the SRP parameters at pre-defined values.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1978-06-01
Following a planning period during which the Lawrence Livermore Laboratory and the Department of Defense managing sponsor, the USAF Materials Laboratory, agreed on work statements, the Department of Defense Tri-Service Precision Machine-Tool Program began in February 1978. Milestones scheduled for the first quarter have been met. Tasks and manpower requirements for two basic projects, precision-machining commercialization (PMC) and a machine-tool task force (MTTF), were defined. Progress by PMC includes: (1) documentation of existing precision machine-tool technology by initiation and compilation of a bibliography containing several hundred entries; (2) identification of the problems and needs of precision turning-machine builders and of precision turning-machine users interested in developing high-precision machining capability; and (3) organization of the schedule and content of the first seminar, to be held in October 1978, which will bring together representatives from the machine-tool and optics communities to address the problems and begin the process of high-precision machining commercialization. Progress by MTTF includes: (1) planning for the organization of a team effort of approximately 60 to 80 international experts to contribute in various ways to project objectives, namely, to summarize state-of-the-art cutting-machine-tool technology and to identify areas where future R and D should prove technically and economically profitable; (2) preparation of a comprehensive plan to achieve those objectives; and (3) preliminary arrangements for a plenary session, also in October, when the task force will meet to formalize the details for implementing the plan.
A novel planar flow cell for studies of biofilm heterogeneity and flow-biofilm interactions
Zhang, Wei; Sileika, Tadas S.; Chen, Cheng; Liu, Yang; Lee, Jisun; Packman, Aaron I.
2012-01-01
Biofilms are microbial communities growing on surfaces, and are ubiquitous in nature, in bioreactors, and in human infection. Coupling between physical, chemical, and biological processes is known to regulate the development of biofilms; however, current experimental systems do not provide sufficient control of environmental conditions to enable detailed investigations of these complex interactions. We developed a novel planar flow cell that supports biofilm growth under complex two-dimensional fluid flow conditions. This device provides precise control of flow conditions and can be used to create well-defined physical and chemical gradients that significantly affect biofilm heterogeneity. Moreover, the top and bottom of the flow chamber are transparent, so biofilm growth and flow conditions are fully observable using non-invasive confocal microscopy and high-resolution video imaging. To demonstrate the capability of the device, we observed the growth of Pseudomonas aeruginosa biofilms under imposed flow gradients. We found a positive relationship between patterns of fluid velocity and biofilm biomass because of faster microbial growth under conditions of greater local nutrient influx, but this relationship eventually reversed because high hydrodynamic shear leads to the detachment of cells from the surface. These results reveal that flow gradients play a critical role in the development of biofilm communities. By providing new capability for observing biofilm growth, solute and particle transport, and net chemical transformations under user-specified environmental gradients, this new planar flow cell system has broad utility for studies of environmental biotechnology and basic biofilm microbiology, as well as applications in bioreactor design, environmental engineering, biogeochemistry, geomicrobiology, and biomedical research. PMID:21656713
Discrete square root filtering - A survey of current techniques.
NASA Technical Reports Server (NTRS)
Kaminskii, P. G.; Bryson, A. E., Jr.; Schmidt, S. F.
1971-01-01
Current techniques in square root filtering are surveyed and related by applying a duality association. Four efficient square root implementations are suggested, and compared with three common conventional implementations in terms of computational complexity and precision. It is shown that the square root computational burden should not exceed the conventional by more than 50% in most practical problems. An examination of numerical conditioning predicts that the square root approach can yield twice the effective precision of the conventional filter in ill-conditioned problems. This prediction is verified in two examples.
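One classic member of the family surveyed here is Potter's square-root measurement update, which propagates a factor S of the covariance (P = S Sᵀ) instead of P itself, so the reconstructed covariance stays symmetric and non-negative even in ill-conditioned problems. A minimal pure-Python sketch for a scalar measurement follows; the specific matrices used in the usage note are illustrative:

```python
from math import sqrt

def potter_update(S, h, r):
    """Potter's square-root measurement update for one scalar measurement.

    S is a square root of the covariance (P = S S^T), h the measurement
    vector, and r the measurement noise variance.  Returns an updated
    factor S+ such that S+ S+^T equals the conventionally updated
    covariance P - P h h^T P / (h^T P h + r), without forming P.
    """
    n = len(S)
    # f = S^T h
    f = [sum(S[i][j] * h[i] for i in range(n)) for j in range(n)]
    alpha = sum(fi * fi for fi in f) + r          # innovation variance
    beta = 1.0 / (alpha + sqrt(alpha * r))        # Potter's scalar gain
    sf = [sum(S[i][j] * f[j] for j in range(n)) for i in range(n)]  # S f
    # S+ = S - beta * (S f) f^T, a rank-one correction of the factor
    return [[S[i][j] - beta * sf[i] * f[j] for j in range(n)]
            for i in range(n)]

# Illustrative 2-state example: lower-triangular factor, measure state 0.
S_plus = potter_update([[1.0, 0.0], [0.5, 0.1]], [1.0, 0.0], 0.25)
```

Because only S is stored, the effective precision roughly doubles relative to propagating P directly, which is the behavior the survey's ill-conditioned examples exhibit.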
Evaluation of the concentration and bioactivity of adenovirus vectors for gene therapy.
Mittereder, N; March, K L; Trapnell, B C
1996-01-01
Development of adenovirus vectors as potential therapeutic agents for multiple applications of in vivo human gene therapy has resulted in numerous preclinical and clinical studies. However, lack of standardization of the methods for quantifying the physical concentration and functionally active fraction of virions in these studies has often made comparison between various studies difficult or impossible. This study was therefore carried out to define the variables for quantification of the concentration of adenovirus vectors. The methods for evaluation of total virion concentration included electron microscopy and optical absorbance. The methods for evaluation of the concentration of functional virions included detection of gene transfer (transgene transfer and expression) and the plaque assay on 293 cells. Enumeration of total virion concentration by optical absorbance was found to be a precise procedure, but accuracy was dependent on physical disruption of the virion to eliminate artifacts from light scattering and also on a correct value for the extinction coefficient. Both biological assays for enumerating functional virions were highly dependent on the assay conditions and in particular the time of virion adsorption and adsorption volume. Under optimal conditions, the bioactivity of the vector, defined as the fraction of total virions which leads to detected target cell infection, was determined to be 0.10 in the plaque assay and 0.29 in the gene transfer assay. This difference is most likely due to the fact that detection by gene transfer requires only measurement of levels of transgene expression in the infected cell whereas plaque formation is dependent on a series of biological events of much greater complexity. These results show that the exact conditions for determination of infectious virion concentration and bioactivity of recombinant adenovirus vectors are critical and must be standardized for comparability. 
These observations may be very useful in comparison of data from different preclinical and clinical studies and may also have important implications for how adenovirus vectors can optimally be used in human gene therapy. PMID:8892868
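The bioactivity ratio described above is a simple computation; a minimal sketch, assuming the commonly used conversion of about 1.1e12 particles/mL per A260 unit for SDS-disrupted adenovirus (the study's own extinction coefficient may differ):

```python
def particle_concentration(a260, dilution):
    # Total virion concentration from optical absorbance of the disrupted
    # virion, using an assumed conversion factor (hypothetical here; the
    # correct extinction coefficient is critical, per the abstract).
    return a260 * dilution * 1.1e12

def bioactivity(functional_titer, a260, dilution):
    # Fraction of total virions scored as functional by a given assay.
    return functional_titer / particle_concentration(a260, dilution)

total = particle_concentration(a260=0.5, dilution=10)  # ~5.5e12 particles/mL
print(bioactivity(5.5e11, 0.5, 10))  # ≈ 0.10, the plaque-assay fraction above
```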
DOE Office of Scientific and Technical Information (OSTI.GOV)
He, Hongkun; Zhong, Mingjiang; Adzima, Brian
2013-03-20
Poly(ionic liquid)s (PILs) are an important class of technologically relevant materials. However, characterization of well-defined polyionic materials remains a challenge. Herein, we have developed a simple and versatile gel permeation chromatography (GPC) methodology for molecular weight (MW) characterization of PILs with a variety of anions. PILs with narrow MW distributions were synthesized via atom transfer radical polymerization, and the MWs obtained from GPC were further confirmed via nuclear magnetic resonance end group analysis.
Defining space around conducting polymers: reversible protonic doping of a canopied polypyrrole.
Lee, Dongwhan; Swager, Timothy M
2003-06-11
A canopy-shaped pyrrole derivative 2 was prepared, in which a sterically demanding pendant group is juxtaposed to the pyrrole fragment to minimize interstrand pi-pi stacking interactions in the resulting polymer. Anodic polymerization of 2 afforded highly conductive poly(2), the electronic structure of which was probed by various spectroelectrochemical techniques. A limited charge delocalization within poly(2) translates into a well-defined conductivity profile, properties important for resistivity-based sensing. Notably, the bulk conductivity was precisely modulated by a rapid and reversible deprotonation and reprotonation of the polymer backbone.
Amarenco, G
2014-09-01
The Bristol Stool Chart (BSC) allows patients to identify their stool form using seven different images with accompanying written descriptors. Stool form was found to correlate better than stool frequency with whole-gut transit as measured by a radio-opaque marker study. This score is widely used to verify the presence of constipation and to evaluate the therapeutic impact of various treatments. In our clinical practice, we were struck by the ease and precision with which patients report their stool form, suggesting that they routinely, even daily, inspect their stools. We wanted to clarify the goals of such behavior. Two questionnaires were given to healthy volunteers. Q1 was ostensibly presented as verifying the sensitivity of a French version of the BSC in a healthy population: it asked about any difficulty in understanding the pictures and written descriptors, and about exhaustive analysis of stool form and bowel condition by means of the BSC. All subjects with a history of anorectal disorders or specific treatment for bowel dysfunction were excluded. After Q1 was completed, Q2 was given to the subjects. Q2 was designed to clarify the patient's goals when looking at his or her stools and the frequency of such inspection, with a final question on the subject's opinion of this behavior in terms of bother, shame, or metaphysical interrogation. Eighty-five healthy subjects were recruited (42 female and 43 male). Mean age was 37.2 (sd = 15.7). Mean BSC score was 2.07 (sd = 1.05) (2.07 for females and 1.81 for males, P = 0.22). Subjects reported only one stool-form category in 40% of cases, 2 categories in 31%, 3 in 19%, and 4 in 10%. Constipation, defined as category 1 or 2, was found in 17% (23% of females, 12% of males, P = 0.075). The precision of the BSC was rated excellent by 68%, moderate by 18%, and poor by 14%. The BSC was considered easy to use by 75%.
Inspection of feces was systematic for 37% of subjects, occurred for every second stool in 20%, every third in 13%, and 1 to 4 times per month for 30%. The goal of inspection was "routine" in 54% and tracking down any pathological condition ("self-examination") in 46%. Eighty percent of the subjects reported no shame or specific reticence, and only 17% had questioned the real rationale of such an inspection. The BSC is a useful tool widely used in routine practice, helping with the diagnosis of constipation and the monitoring of different therapeutic strategies. There are no psychological barriers or metaphysical inconveniences to its use, but it seems legitimate to probe the hidden reasons for a behavior whose unconscious purposes reflect the intimate nature of human beings. Copyright © 2014 Elsevier Masson SAS. All rights reserved.
Nakata, Maho; Braams, Bastiaan J; Fujisawa, Katsuki; Fukuda, Mituhiro; Percus, Jerome K; Yamashita, Makoto; Zhao, Zhengji
2008-04-28
The reduced density matrix (RDM) method, a variational calculation based on the second-order reduced density matrix, is applied to the ground-state energies and dipole moments of 57 different states of atoms and molecules, and to the ground-state energies and 2-RDM elements of the Hubbard model. We explore the well-known N-representability conditions (P, Q, and G) together with the more recent and much stronger T1 and T2' conditions; the T2' condition was recently rederived and implies the T2 condition. Using these N-representability conditions, we can usually recover from 100% to 101% of the correlation energy, an accuracy similar to that of CCSD(T) and even better for high-spin states or anionic systems where CCSD(T) fails. Highly accurate calculations are carried out by handling equality constraints and/or by developing multiple-precision arithmetic in the semidefinite programming (SDP) solver. Results show that handling equality constraints correctly improves the accuracy by 0.1 to 0.6 mhartree. Additionally, improvements from replacing the T2 condition with the T2' condition are typically 0.1-0.5 mhartree. The newly developed multiple-precision version of the SDP solver yields extraordinarily accurate energies for the one-dimensional Hubbard model and the Be atom, giving at least 16 significant digits where double-precision calculations give only two to eight. It also provides physically meaningful results for the Hubbard model in the high-correlation limit.
Schrader, Thorsten; Münter, Klaus; Kleine-Ostmann, Thomas; Schmid, Ernst
2008-12-01
The production of spindle disturbances in FC2 cells, a human-hamster hybrid (A(L)) cell line, by non-ionizing radiation was studied using an electromagnetic field with a field strength of 90 V/m at a frequency of 835 MHz. Under the given experimental conditions, slide-flask cultures were exposed at room temperature in a microTEM (transversal electromagnetic field) cell, which allows optimal experimental conditions for small samples of biological material. Numerical calculations suggest that specific absorption rates of up to 60 mW/kg are reached at maximum field exposure. All exposure field parameters, whether measured or calculated, are precisely defined and, for the first time, traceable to the standards of the SI system of physical units. Compared with coincident negative controls, the results of two independently performed experiments suggest that exposure for 0.5 to 2 h to an electric field strength of 90 V/m acts as a spindle-disturbing agent, as indicated predominantly by the appearance of spindle disturbances at the ana- and telophase stages of cell division (especially lagging and non-disjunction of single chromosomes). The spindle disturbances do not change the fraction of mitotic cells with increasing exposure time up to 2 h. Given the experimental conditions applied, an influence of temperature as a confounding parameter for spindle disturbances can be excluded.
NASA Astrophysics Data System (ADS)
Bessa, Filipa; Rossano, Claudia; Nourisson, Delphine; Gambineri, Simone; Marques, João Carlos; Scapini, Felicita
2013-01-01
Environmental and human controls are widely accepted as the main structuring forces of macrofauna communities on sandy beaches. A population of the talitrid amphipod Talitrus saltator (Montagu, 1808) was investigated on an exposed sandy beach on the Atlantic coast of Portugal (Leirosa beach) to estimate orientation capabilities and endogenous rhythms under conditions of recent changes in the landscape (artificial reconstruction of the foredune) and beach morphodynamics (stabilization against erosion from the sea). We tested sun orientation of talitrids on the beach and recorded their locomotor activity rhythms under constant conditions in the laboratory. The orientation data were analysed with circular statistics and multiple regression models adapted to angular distributions, to highlight the main factors and variables influencing the variation of orientation. The talitrids used the sun compass and visual cues (landscape and sun visibility) to orient, and the precision of orientation varied with the tidal regime (rising or ebbing tides). A well-defined free-running rhythm (circadian, with an additional bimodal, likely tidal, rhythmicity) was highlighted in this population. This indicates a stable behavioural adaptation on a beach that has undergone artificial stabilization of the dune through nourishment actions over a decade. Monitoring the conditions of such dynamic environments and the resilience capacity of the inhabiting macroinfauna is a main challenge for sandy beach ecologists.
New method of processing heat treatment experiments with numerical simulation support
NASA Astrophysics Data System (ADS)
Kik, T.; Moravec, J.; Novakova, I.
2017-08-01
This work describes the benefits of combining modern software for numerical simulation of welding processes with laboratory research. A new method of processing heat treatment experiments is proposed that yields relevant input data for numerical simulations of the heat treatment of large parts. Using experiments on small test samples, it is now possible to simulate cooling conditions comparable with the cooling of larger parts. Results from this testing method make the boundary conditions of the real cooling process more accurate, and can also be used to improve software databases and optimize computational models. The aim is to refine the computation of temperature fields for large hardening parts, based on a new method for determining the temperature dependence of the heat transfer coefficient into the hardening medium for a particular material, a defined maximal thickness of the processed part, and given cooling conditions. The paper also compares standard heat transfer coefficient data with data modified according to the newly suggested methodology, and their influence on the simulation results: even small changes mainly affect the distributions of temperature, metallurgical phases, hardness, and stress. The experiment also yields not only input data and data enabling optimization of the computational model, but verification data as well. The greatest advantage of the described method is its independence of the type of cooling medium used.
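The temperature-dependent heat transfer coefficient mentioned above can be estimated from a measured cooling curve; a minimal sketch under a lumped-capacitance assumption (a simplification, not necessarily the authors' exact procedure), with all probe parameters hypothetical:

```python
import numpy as np

def htc_from_cooling_curve(t, T, T_medium, mass, c_p, area):
    # Lumped-capacitance energy balance: m*c_p*dT/dt = -h(T)*A*(T - T_medium),
    # so h at each measured temperature follows from the cooling rate.
    dTdt = np.gradient(T, t)
    return -mass * c_p * dTdt / (area * (T - T_medium))

# Hypothetical probe: exponential cooling generated with a constant true h,
# which the estimator should recover at every temperature.
t = np.linspace(0.0, 60.0, 601)                  # s
h_true, m, cp, A, T_inf = 500.0, 0.1, 460.0, 0.006, 20.0
tau = m * cp / (h_true * A)                      # lumped time constant
T = T_inf + (850.0 - T_inf) * np.exp(-t / tau)   # simulated cooling curve
h_est = htc_from_cooling_curve(t, T, T_inf, m, cp, A)
```

With real quench data, `h_est` plotted against `T` gives the temperature dependence of the coefficient that the simulation database needs.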
ERIC Educational Resources Information Center
Berry, Marva L.
2012-01-01
Expectations for nonprofit organizations (NPOs) continue to increase. Additionally, it is difficult to successfully carry out the mission of organizations while dealing with decreased funding and reduced staffing. NPOs need to be operationally consistent and precise to achieve pre-defined measures of success. Many factors impact…
Technical Snobbery Versus Clear Communicating.
ERIC Educational Resources Information Center
Ransone, R. K.
Jargon, when used properly, defines precisely and concisely the concepts peculiar to a profession. Within a profession, it meets the criteria for clear, brief, specific communication. When used outside that profession, however, it tries to impress rather than to express. Engineers and other professionals need to be taught when--and when not--to…
Description: Studies have shown that diesel exhaust particles (DEP) worsen respiratory diseases including allergic asthma. The adjuvant effects of DEP in the airways have been widely reported; however, the precise determinants and mechanisms of these effects are ill-defined. S...
Officer Training Research and Implications for Executive Training.
ERIC Educational Resources Information Center
Haverland, Edgar M.
A pragmatic approach to the problem of training military supervisors of technical personnel is suggested for executive training. In the end-product system performance point of view, the job is defined and structured by detailed task description. Training involves the statement of precise and specific objectives. (author/ly)
Accountability: Its Implications for Provincial and State Governments.
ERIC Educational Resources Information Center
Kolesar, H.
This paper examines the implications of the accountability concept for provincial or State authorities. Accountability is defined as a concomitant of an agreement between two parties. The author suggests that, in education, agreements between parties have lacked preciseness and clarity, making it extremely difficult to assess performance and to…
Intern Perceptions of Dialect and Regionalism
ERIC Educational Resources Information Center
O'Hara, Hunter
2005-01-01
Interns at The University of Tampa investigate how perceptions of dialect and regionalism may impact the learning environment and more precisely, the learner. Regionalism is defined as a belief that one's region of origin is a primary determinant of the quality of one's standards of living, social forms, customary beliefs, levels of…
FORMED: Bringing Formal Methods to the Engineering Desktop
2016-02-01
integrates formal verification into software design and development by precisely defining semantics for a restricted subset of the Unified Modeling...input-output contract satisfaction and absence of null pointer dereferences. Subject terms: Formal Methods, Software Verification, Model-Based...Domain specific languages (DSLs) drive both implementation and formal verification
Sources of the Medical Vocabulary.
ERIC Educational Resources Information Center
Butler, Roy F.
1980-01-01
In an attempt to determine as precisely as possible just how much of medical vocabulary is derived from every source, the vocabulary defined in the 24th edition of "Dorland's Illustrated Medical Dictionary" was analyzed. Results indicate that medical vocabulary is relying increasingly upon the Greek and Latin languages as the sources of…
Automatic Syllabification in English: A Comparison of Different Algorithms
ERIC Educational Resources Information Center
Marchand, Yannick; Adsett, Connie R.; Damper, Robert I.
2009-01-01
Automatic syllabification of words is challenging, not least because the syllable is not easy to define precisely. Consequently, no accepted standard algorithm for automatic syllabification exists. There are two broad approaches: rule-based and data-driven. The rule-based method effectively embodies some theoretical position regarding the…
Clarifying the Consensus Definition of Validity
ERIC Educational Resources Information Center
Newton, Paul E.
2012-01-01
The 1999 "Standards for Educational and Psychological Testing" defines validity as the degree to which evidence and theory support the interpretations of test scores entailed by proposed uses of tests. Although quite explicit, there are ways in which this definition lacks precision, consistency, and clarity. The history of validity has taught us…
ERIC Educational Resources Information Center
Rheingold, Harriet L.
During an experiment on infant behavior, it was observed that young children shared toys with their mothers; therefore, a series of research studies was designed to investigate this phenomenon. The general purposes of the research were: (1) to define sharing more precisely and develop objective measures of frequency and duration for it, (2) to…
ERIC Educational Resources Information Center
Fraser, Bruce
The present paper reviews recent research in the area of nonstandard English: the major results to date, the significance of this research for education, and suggestions for further research. The notion of "standard" English resists precise definition; there is not a simple set of linguistic features which can be said to define it. The term…
Problems for the Average Adult in Understanding Medical Language.
ERIC Educational Resources Information Center
Crismore, Avon
Like legal language, medical language is a private language, a separate stratum containing some words specially defined for medical purposes, some existing only in the medical vocabulary, and some adding precision or solemnity. These characteristics often cause a breakdown in patient-doctor communication. Analysis of data obtained from prototype…
Federal Register 2010, 2011, 2012, 2013, 2014
2012-11-16
... Grant of Exclusive License: Veterinary Vaccines for Rift Valley Fever Virus AGENCY: Centers for Disease... territories other than Africa, in the field of use of veterinary vaccines, to practice the inventions listed... precisely defined attenuated vaccine constructs that contain complete deletions of critical virulence...
Federal Register 2010, 2011, 2012, 2013, 2014
2012-11-16
... Grant of Co-Exclusive License: Veterinary Vaccines for Rift Valley Fever Virus AGENCY: Centers for... veterinary vaccines, to practice the inventions listed in the patent applications referred to below to... generation of precisely defined attenuated vaccine constructs that contain complete deletions of critical...
Instrument-induced spatial crosstalk deconvolution algorithm
NASA Technical Reports Server (NTRS)
Wright, Valerie G.; Evans, Nathan L., Jr.
1986-01-01
An algorithm has been developed which reduces the effects of (deconvolves) instrument-induced spatial crosstalk in satellite image data by several orders of magnitude where highly precise radiometry is required. The algorithm is based upon radiance transfer ratios which are defined as the fractional bilateral exchange of energy betwen pixels A and B.
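In matrix form, radiance transfer ratios define a linear mixing of neighbouring pixels, and deconvolution amounts to inverting that system; a minimal sketch with a hypothetical 3-pixel transfer matrix (not the instrument's actual ratios):

```python
import numpy as np

def deconvolve_crosstalk(observed, transfer):
    # transfer[i, j] is the radiance transfer ratio: the fraction of
    # pixel j's energy registered at pixel i (diagonal close to 1).
    # Observed = transfer @ true, so recover true by solving the system.
    return np.linalg.solve(transfer, observed)

# Hypothetical example: 2% bilateral exchange between adjacent pixels.
R = np.array([[0.98, 0.02, 0.00],
              [0.02, 0.96, 0.02],
              [0.00, 0.02, 0.98]])
true = np.array([100.0, 50.0, 80.0])
obs = R @ true                       # crosstalk-contaminated radiances
print(deconvolve_crosstalk(obs, R))  # recovers [100., 50., 80.]
```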
Disciplinarity and the Job Search, 1995.
ERIC Educational Resources Information Center
Vandenberg, Peter
The way job positions in English studies are conceptualized, advertised, applied for, and awarded is defined by the conventional contours of literary study. The precision with which the "Job Information List" breaks down literature positions by national and historical categories reflects the desire of a great many departments to hire and…
USDA-ARS?s Scientific Manuscript database
Precision-based agricultural application of insecticide relies on a non-random distribution of pests; tarnished plant bugs (Lygus lineolaris) are known to prefer vigorously growing patches of cotton. Management zones for various crops have been readily defined using NDVI (Normalized Difference Vege...
NASA Astrophysics Data System (ADS)
Furlong, Cosme; Yokum, Jeffrey S.; Pryputniewicz, Ryszard J.
2002-06-01
Sensitivity, accuracy, and precision characteristics of quantitative optical metrology techniques, and specifically of optoelectronic holography based on fiber optics and high-spatial- and high-digital-resolution cameras, are discussed in this paper. It is shown that sensitivity, accuracy, and precision depend on both the effective determination of optical phase and the effective characterization of the illumination-observation conditions. These characteristics are investigated with the aid of National Institute of Standards and Technology (NIST) traceable gages, demonstrating the applicability of quantitative optical metrology techniques to the constantly increasing needs of the study and development of emerging technologies.
Burghardt, N S; Bauer, E P
2013-09-05
Selective serotonin reuptake inhibitors (SSRIs) are widely used for the treatment of a spectrum of anxiety disorders, yet paradoxically they may increase symptoms of anxiety when treatment is first initiated. Despite extensive research over the past 30 years focused on SSRI treatment, the precise mechanisms by which SSRIs exert these opposing acute and chronic effects on anxiety remain unknown. By testing the behavioral effects of SSRI treatment on Pavlovian fear conditioning, a well characterized model of emotional learning, we have the opportunity to identify how SSRIs affect the functioning of specific brain regions, including the amygdala, bed nucleus of the stria terminalis (BNST) and hippocampus. In this review, we first define different stages of learning involved in cued and context fear conditioning and describe the neural circuits underlying these processes. We examine the results of numerous rodent studies investigating how acute SSRI treatment modulates fear learning and relate these effects to the known functions of serotonin in specific brain regions. With these findings, we propose a model by which acute SSRI administration, by altering neural activity in the extended amygdala and hippocampus, enhances both acquisition and expression of cued fear conditioning, but impairs the expression of contextual fear conditioning. Finally, we review the literature examining the effects of chronic SSRI treatment on fear conditioning in rodents and describe how downregulation of N-methyl-d-aspartate (NMDA) receptors in the amygdala and hippocampus may mediate the impairments in fear learning and memory that are reported. While long-term SSRI treatment effectively reduces symptoms of anxiety, their disruptive effects on fear learning should be kept in mind when combining chronic SSRI treatment and learning-based therapies, such as cognitive behavioral therapy. Copyright © 2013 IBRO. Published by Elsevier Ltd. All rights reserved.
van Abswoude, Femke; Nuijen, Nienke B; van der Kamp, John; Steenbergen, Bert
2018-06-01
A large pool of evidence supports the beneficial effect of an external focus of attention on motor skill performance in adults. In children, this effect has been studied less and results are inconclusive. Importantly, individual differences are often not taken into account. We investigated the role of working memory, conscious motor control, and task-specific focus preferences on performance with an internal and external focus of attention in children. Twenty-five children practiced a golf putting task in both an internal focus condition and external focus condition. Performance was defined as the average distance toward the hole in 3 blocks of 10 trials. Task-specific focus preference was determined by asking how much effort it took to apply the instruction in each condition. In addition, working memory capacity and conscious motor control were assessed. Children improved performance in both the internal focus condition and external focus condition (ηp² = .47), with no difference between conditions (ηp² = .01). Task-specific focus preference was the only factor moderately related to the difference between performance with an internal focus and performance with an external focus (r = .56), indicating better performance for the preferred instruction in Block 3. Children can benefit from instruction with both an internal and external focus of attention to improve short-term motor performance. Individual, task-specific focus preference influenced the effect of the instructions, with children performing better with their preferred focus. The results highlight that individual differences are a key factor in the effectiveness of instructions on children's motor performance. The precise mechanisms underpinning this effect warrant further research.
Reliable low precision simulations in land surface models
NASA Astrophysics Data System (ADS)
Dawson, Andrew; Düben, Peter D.; MacLeod, David A.; Palmer, Tim N.
2017-12-01
Weather and climate models must continue to increase in both resolution and complexity in order that forecasts become more accurate and reliable. Moving to lower numerical precision may be an essential tool for coping with the demand for ever increasing model complexity in addition to increasing computing resources. However, there have been some concerns in the weather and climate modelling community over the suitability of lower precision for climate models, particularly for representing processes that change very slowly over long time-scales. These processes are difficult to represent using low precision due to time increments being systematically rounded to zero. Idealised simulations are used to demonstrate that a model of deep soil heat diffusion that fails when run in single precision can be modified to work correctly using low precision, by splitting up the model into a small higher precision part and a low precision part. This strategy retains the computational benefits of reduced precision whilst preserving accuracy. This same technique is also applied to a full complexity land surface model, resulting in rounding errors that are significantly smaller than initial condition and parameter uncertainties. Although lower precision will present some problems for the weather and climate modelling community, many of the problems can likely be overcome using a straightforward and physically motivated application of reduced precision.
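The splitting strategy described above can be illustrated with a toy accumulation: a small increment that rounds to zero when added to a large float32 state survives when the slowly varying anomaly is accumulated separately and recombined with the large base at higher precision. A minimal sketch (not the land surface model itself):

```python
import numpy as np

def naive_step(state, increment, n):
    # Accumulate directly in float32: increments far below the state's
    # machine epsilon are systematically rounded to zero.
    s = np.float32(state)
    inc = np.float32(increment)
    for _ in range(n):
        s = np.float32(s + inc)
    return float(s)

def split_step(state, increment, n):
    # Split the state: keep the large, slowly varying part fixed and
    # accumulate the small anomaly separately in low precision, then
    # recombine in a small high-precision step.
    base = np.float32(state)
    anomaly = np.float32(0.0)
    inc = np.float32(increment)
    for _ in range(n):
        anomaly = np.float32(anomaly + inc)
    return float(np.float64(base) + np.float64(anomaly))

print(naive_step(1.0, 1e-8, 10_000))  # 1.0 — every increment is lost
print(split_step(1.0, 1e-8, 10_000))  # ~1.0001 — the drift is preserved
```

The recombination is the only part that needs higher precision, which is why the strategy retains most of the computational benefit of reduced precision.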
Busse, Harald; Thomas, Michael; Seiwerts, Matthias; Moche, Michael; Busse, Martin W; von Salis-Soglio, Georg; Kahn, Thomas
2008-01-01
To implement a PC-based morphometric analysis platform and to evaluate the feasibility and precision of MRI measurements of glenohumeral translation. Using a vertically open 0.5T MRI scanner, the shoulders of 10 healthy subjects were scanned in apprehension (AP) and in neutral position (NP), respectively. Surface models of the humeral head (HH) and the glenoid cavity (GC) were created from segmented MR images by three readers. Glenohumeral translation was determined by the projection point of the manually fitted HH center on the GC plane defined by the two main principal axes of the GC model. Positional precision, given as mean (extreme value at 95% confidence level), was 0.9 (1.8) mm for the HH center and 0.7 (1.6) mm for the GC centroid; angular GC precision was 1.3 degrees (2.3 degrees) for the normal and about 4 degrees (7 degrees) for the anterior and superior coordinate axes. The two-dimensional (2D) precision of the HH projection point was 1.1 (2.2) mm. A significant HH translation between AP and NP was found. Despite the limited quality of the underlying model data, our PC-based analysis platform allows a precise morphometric analysis of the glenohumeral joint. The software is easily extendable and may potentially be used for an objective evaluation of therapeutic measures.
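The projection step described above, defining the GC plane by the two main principal axes and projecting the fitted HH center onto it, can be sketched as follows (hypothetical synthetic data, not the study's surface models):

```python
import numpy as np

def gc_plane_axes(points):
    # The two leading principal axes of the glenoid-cavity point cloud
    # (via SVD of the centred points) span the GC plane.
    centroid = points.mean(axis=0)
    _, _, vt = np.linalg.svd(points - centroid)
    return centroid, vt[0], vt[1]

def project_center(center, centroid, ax1, ax2):
    # 2-D coordinates of the fitted humeral-head centre in the GC plane.
    d = center - centroid
    return np.array([d @ ax1, d @ ax2])

# Hypothetical planar point cloud and an offset "humeral head" centre.
rng = np.random.default_rng(0)
gc = np.column_stack([rng.normal(0, 10, 200),
                      rng.normal(0, 5, 200),
                      np.zeros(200)])
centroid, ax1, ax2 = gc_plane_axes(gc)
proj = project_center(centroid + np.array([3.0, 4.0, 5.0]), centroid, ax1, ax2)
```

Glenohumeral translation between two positions is then the difference of the corresponding `proj` vectors.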
Atomically Precise Interfaces from Non-stoichiometric Deposition
NASA Astrophysics Data System (ADS)
Nie, Yuefeng; Zhu, Ye; Lee, Che-Hui; Kourkoutis, Lena; Mundy, Julia; Junquera, Javier; Ghosez, Philippe; Baek, David; Sung, Suk Hyun; Xi, Xiaoxing; Shen, Kyle; Muller, David; Schlom, Darrell
2015-03-01
Complex oxide heterostructures display some of the most chemically abrupt, atomically precise interfaces, which is advantageous when constructing new interface phases with emergent properties by juxtaposing incompatible ground states. One might assume that atomically precise interfaces result from stoichiometric growth. Here we show that the most precise control is, however, obtained by using deliberate and specific non-stoichiometric growth conditions. For the precise growth of Srn+1TinO3n+1 Ruddlesden-Popper (RP) phases, stoichiometric deposition leads to the loss of the first RP rock-salt double layer, but growing with a strontium-rich surface layer restores the bulk stoichiometry and ordering of the subsurface RP structure. Our results dramatically expand the materials that can be prepared in epitaxial heterostructures with precise interface control--from just the n = ∞ end members (perovskites) to the entire RP homologous series--enabling the exploration of novel quantum phenomena at a richer variety of oxide interfaces.
Atomically precise interfaces from non-stoichiometric deposition
NASA Astrophysics Data System (ADS)
Nie, Y. F.; Zhu, Y.; Lee, C.-H.; Kourkoutis, L. F.; Mundy, J. A.; Junquera, J.; Ghosez, Ph.; Baek, D. J.; Sung, S.; Xi, X. X.; Shen, K. M.; Muller, D. A.; Schlom, D. G.
2014-08-01
Complex oxide heterostructures display some of the most chemically abrupt, atomically precise interfaces, which is advantageous when constructing new interface phases with emergent properties by juxtaposing incompatible ground states. One might assume that atomically precise interfaces result from stoichiometric growth. Here we show that the most precise control is, however, obtained by using deliberate and specific non-stoichiometric growth conditions. For the precise growth of Srn+1TinO3n+1 Ruddlesden-Popper (RP) phases, stoichiometric deposition leads to the loss of the first RP rock-salt double layer, but growing with a strontium-rich surface layer restores the bulk stoichiometry and ordering of the subsurface RP structure. Our results dramatically expand the materials that can be prepared in epitaxial heterostructures with precise interface control—from just the n=∞ end members (perovskites) to the entire RP homologous series—enabling the exploration of novel quantum phenomena at a richer variety of oxide interfaces.
NASA Astrophysics Data System (ADS)
Monna, F.; Loizeau, J.-L.; Thomas, B. A.; Guéguen, C.; Favarger, P.-Y.
1998-08-01
One of the factors limiting the precision of inductively coupled plasma mass spectrometry is counting statistics, which depend upon acquisition time and ion fluxes. In the present study, the precision of isotopic measurements of Pb and Sr is examined. The measurement time is optimally shared between the isotopes, using a mathematical simulation, to provide the lowest theoretical analytical error. Different algorithms of mass bias correction are also taken into account and evaluated in terms of improvement in overall precision. Several experiments allow a comparison of real conditions with theory. The present method significantly improves the precision, regardless of the instrument used. However, the benefit is greater for instruments whose original precision is close to that predicted by counting statistics. Additionally, the procedure is flexible enough to be easily adapted to other problems, such as isotope dilution.
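The optimal sharing of acquisition time follows directly from Poisson counting statistics: for an isotope ratio, the relative variance is a sum of 1/(count rate × dwell time) terms, which is minimized by allocating time in proportion to 1/sqrt(count rate). A minimal sketch with hypothetical count rates (the paper's simulation is more detailed):

```python
import math

def optimal_time_shares(rates, total_time):
    # Minimising sum(1/(r_i * t_i)) subject to sum(t_i) = total_time
    # (Lagrange multipliers) gives t_i proportional to 1/sqrt(r_i):
    # the weaker isotope gets the larger share of the dwell time.
    w = [1.0 / math.sqrt(r) for r in rates]
    s = sum(w)
    return [total_time * wi / s for wi in w]

def ratio_rsd(rates, times):
    # Relative standard deviation of the isotope ratio N_a/N_b
    # predicted by counting statistics alone.
    var_rel = sum(1.0 / (r * t) for r, t in zip(rates, times))
    return math.sqrt(var_rel)

rates = [2.0e5, 5.0e3]            # counts per second (hypothetical)
T = 60.0                          # total acquisition time, seconds
opt = optimal_time_shares(rates, T)
equal = [T / 2, T / 2]
print(ratio_rsd(rates, opt), ratio_rsd(rates, equal))
```

The printed pair shows the optimal split beating an equal split, the gain being largest when one isotope is much less abundant than the other.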
Keep calm and carry on: Mental disorder is not more "organic" than any other medical condition.
Micoulaud-Franchi, J A; Quiles, C; Masson, M
2017-10-01
Psychiatry as a discipline should no longer be grounded in the dualistic opposition between organic and mental disorders. This non-dualistic position, refusing the partition along functional versus organic lines, is in line with Jean Delay, and with Robert Spitzer, who wanted to include in the definition of mental disorder discussed by the DSM-III task force the statement that "mental disorders are a subset of medical disorders". However, it is interesting to note that Spitzer and colleagues ingeniously introduced the definition of "mental disorder" in the DSM-III with the following statement: "there is no satisfactory definition that specifies precise boundaries for the concept "mental disorder" (also true for such concepts as physical disorder and mental and physical health)". Indeed, it is as difficult to define what "mental disorders" are as it is to define what constitutes a "physical disorder". The problem is not the words "mental" or "organic" but the word "disorder". Along these lines, Wakefield has proposed a useful "harmful dysfunction" analysis of mental disorder, which raises the issue of the dualistic opposition between organic and mental disorders and situates the debate instead between the biological/physiological and the social. The paper provides a brief analysis of this shift in the question of what a mental disorder is, and demonstrates that a mental disorder is no more "organic" than any other medical condition. While establishing a dichotomy between organic medicine and psychiatry is no longer intellectually tenable, the solution is not to reduce psychiatric and non-psychiatric disorders to the level of "organic disorders" but rather to continue to adopt both a critical and clinically pertinent approach to what constitutes a "disorder" in medicine. Copyright © 2017 L'Encéphale, Paris. Published by Elsevier Masson SAS. All rights reserved.
Aagten-Murphy, David; Cappagli, Giulia; Burr, David
2014-03-01
Expert musicians are able to time their actions accurately and consistently during a musical performance. We investigated how musical expertise influences the ability to reproduce auditory intervals and how this generalises across different techniques and sensory modalities. We first compared various reproduction strategies and interval lengths, to examine the effects in general and to optimise experimental conditions for testing the effect of music, and found that the effects were robust and consistent across different paradigms. Focussing on a 'ready-set-go' paradigm, subjects reproduced time intervals drawn from distributions varying in total length (176, 352 or 704 ms) or in the number of discrete intervals within the total length (3, 5, 11 or 21 discrete intervals). Overall, Musicians performed more veridically than Non-Musicians, and all subjects reproduced auditory-defined intervals more accurately than visually-defined intervals. However, Non-Musicians, particularly with visual stimuli, consistently exhibited a substantial and systematic regression towards the mean interval. When subjects judged intervals from distributions of longer total length they tended to regress more towards the mean, while the ability to discriminate between discrete intervals within the distribution had little influence on subject error. These results are consistent with a Bayesian model that minimizes reproduction errors by incorporating a central-tendency prior weighted by the subject's own temporal precision relative to the current distribution of intervals. Finally, a strong correlation was observed between the duration of formal musical training and total reproduction errors in both modalities (accounting for 30% of the variance). Taken together, these results demonstrate that formal musical training improves temporal reproduction, and that this improvement transfers from audition to vision. They further demonstrate the flexibility of sensorimotor mechanisms in adapting to different task conditions to minimise temporal estimation errors. © 2013.
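The precision-weighted central-tendency idea described above can be sketched as a simple Gaussian prior-likelihood fusion (a minimal illustration, not the authors' fitted model; all parameter values are invented):

```python
# Minimal sketch (not the paper's model): Bayesian fusion of a central-tendency
# prior with a noisy sensory measurement. The reproduced interval is pulled
# toward the mean of the stimulus distribution in proportion to how noisy the
# observer's own timing is relative to the spread of the distribution.

def reproduce(stimulus_ms, prior_mean_ms, prior_sd_ms, sensory_sd_ms):
    """Posterior mean of a Gaussian prior times a Gaussian likelihood."""
    w_prior = sensory_sd_ms**2 / (sensory_sd_ms**2 + prior_sd_ms**2)
    return w_prior * prior_mean_ms + (1.0 - w_prior) * stimulus_ms

# A precise observer (low sensory noise) regresses little toward the mean;
# a noisy observer regresses strongly -- the pattern reported for Non-Musicians,
# particularly with visual stimuli.
precise = reproduce(704.0, prior_mean_ms=528.0, prior_sd_ms=100.0, sensory_sd_ms=20.0)
noisy   = reproduce(704.0, prior_mean_ms=528.0, prior_sd_ms=100.0, sensory_sd_ms=100.0)
```

With equal sensory and prior spread, the reproduction lands halfway between the stimulus and the distribution mean; as sensory precision improves, it converges on the veridical interval.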
Medicalization and overdiagnosis: different but alike.
Hofmann, Bjørn
2016-06-01
Medicalization is frequently defined as a process by which some non-medical aspects of human life come to be considered as medical problems. Overdiagnosis, on the other hand, is most often defined as diagnosing a biomedical condition that, in the absence of testing, would not cause symptoms or death in the person's lifetime. Medicalization and overdiagnosis are related concepts, as both expand the extension of the concept of disease. They are both often used normatively to critique unwarranted or contested expansion of medicine and to address health services that are considered to be unnecessary, futile, or even harmful. However, there are important differences between the concepts, as not all cases of overdiagnosis are medicalization and not all cases of medicalization are overdiagnosis. The objective of this article is to clarify the differences between medicalization and overdiagnosis. It demonstrates how the subject matter of medicalization has traditionally been non-medical (social or cultural everyday life) phenomena, while the subject matter of overdiagnosis has been biological or biomolecular conditions or processes acknowledged as potentially harmful. They also refer to different types of uncertainty: medicalization is concerned with indeterminacy, while overdiagnosis is concerned with lack of prognostic knowledge. Medicalization deals with sickness (the sick role), while overdiagnosis deals with disease. Despite these differences, medicalization and overdiagnosis are becoming more alike. Medicalization is expanding, encompassing the more "technical" aspects of overdiagnosis, while overdiagnosis is becoming more ideologized. Moreover, with new trends in modern medicine, such as P4 (preventive, predictive, personal, and participatory) medicine, medicalization will become all-encompassing, while overdiagnosis may more or less dissolve. In the end they may converge in some total "iatrogenization." In doing so, the concepts may lose their precision and critical sting.
Lacruz, Maria Elena; Kluttig, Alexander; Kuss, Oliver; Tiller, Daniel; Medenwald, Daniel; Nuding, Sebastian; Greiser, Karin Halina; Frantz, Stefan; Haerting, Johannes
2017-01-18
Precise blood pressure (BP) measurements are central to the diagnosis of hypertension in clinical and epidemiological studies. The purpose of this study was to quantify the variability in BP associated with arm side, body position, and successive measurements in the setting of a population-based observational study. Additionally, we aimed to evaluate the influence of different measurement conditions on the prevalence of hypertension. The sample included 967 men and 812 women aged 45 to 83 years at baseline. BP was measured according to a standardized protocol with oscillometric devices, including three sitting measurements at the left arm, one simultaneous supine measurement at both arms, and four supine measurements at the arm with the higher BP. Hypertension was defined as systolic BP (SBP) ≥140 mmHg and/or diastolic BP (DBP) ≥90 mmHg. Variability in SBP and DBP was analysed with sex-stratified linear covariance pattern models. We found that, overall, no mean BP differences were associated with arm side, but substantially higher DBP (and, for men, also higher SBP) was observed in the sitting than in the supine position, and there was a clear BP decline over consecutive measurements. Accordingly, the prevalence of hypertension depends strongly on the number and scheme of BP measurements taken to calculate the index values. Thus, BP measurements should only be compared between studies applying equal measurement conditions and index calculation. Moreover, the first BP measurement should not be used to define hypertension, since it overestimates BP. The mean of the second and third measurements offers the advantage of better reproducibility over single measurements.
NASA Astrophysics Data System (ADS)
Huss, E.; Laabs, B. J.; Leonard, E. M.; Licciardi, J. M.; Plummer, M. A.; Caffee, M. W.
2012-12-01
The timing of glaciation and the changes in climate that occurred both during and after the Last Glacial Maximum (LGM) in the Rocky Mountains are not well defined. Given the sensitivity of mountain glaciers to factors such as temperature, precipitation, and solar radiation, reconstructions of the history and extent of paleo-glaciers can be used to infer paleoclimate. Pine Creek Valley, located in the Absaroka Mountains in southwestern Montana, is an ideal setting for this type of research because it was occupied by a discrete valley glacier, the extent of which is precisely known during the LGM. To determine the pace and timing of ice retreat in this valley, glacially polished bedrock surfaces along the path of deglaciation were sampled at several points for cosmogenic 10Be surface exposure dating. The ages obtained range from 17.9 ± 0.8 to 13.2 ± 0.5 ka. When combined with the reconstructed ice extent during the LGM and subsequent deglaciation, these ages yield maximum and minimum retreat rates of 3.1 m/yr and 1.1 m/yr, respectively. These values constrain how long it took the glacier to retreat into a well-defined cirque from the terminal moraines. Paleoclimate conditions for the LGM were estimated using a two-dimensional, numerical, combined energy and mass balance and ice flow model. Previous qualitative inferences of paleoclimate in southern Montana indicate climate during the local LGM was colder and drier than modern values. If precipitation values were held constant or reduced for the Pine Creek glacier, the model suggests a temperature depression of at least 8°C.
Precise estimates of gaming-related harm should guide regulation of gaming.
Starcevic, Vladan; Billieux, Joël
2018-06-13
Regulation of gaming is largely based on the perception of gaming-related harm. This perception varies from one country to another and does not necessarily correspond to the real gaming-related harm. It is argued that there is a crucial need to define and assess domains of this harm in order to introduce policies that regulate gaming. Such policies would ideally be targeted at individuals at risk for problematic gaming and would be based more on educational efforts than on restrictive measures. The role of gaming industry in the regulation of gaming would depend on the more precise estimates of gaming-related harm.
Computer aided flexible envelope designs
NASA Technical Reports Server (NTRS)
Resch, R. D.
1975-01-01
Computer aided design methods are presented for the design and construction of strong, lightweight structures that require complex and precise geometric definition. The first method, for flexible structures, is a unique system of modeling folded plate structures and space frames; it makes it possible to continuously vary the geometry of a space frame to produce large, clear spans with curvature. The second method deals with developable surfaces, where both folding and bending are explored within the constraints of available building materials, so that minimal distortion yields maximum design capability. Alternative inexpensive fabrication techniques are being developed to achieve computer-defined enclosures that are extremely lightweight and mathematically highly precise.
Mladenovic, Zorica; Vranes, Danijela; Obradovic, Slobodan; Dzudovic, Boris; Angelkov Ristic, Andjelka; Ratkovic, Nenad; Jovic, Zoran; Spasic, Marijan; Maric Kocijancic, Jelena; Djruic, Predrag
2018-06-04
Unicuspid aortic valve (UAV) is a rare congenital anomaly of the aortic valve associated with faster progression of valvular dysfunction and aortic dilatation, and with the need for more frequent follow-up and precise evaluation. An asymptomatic 35-year-old man had an abnormal systolic-diastolic murmur over the aortic valve during a routine examination. Initial diagnostics with transthoracic echocardiography (TTE) suggested a bicuspid aortic valve, while three-dimensional transesophageal echocardiography (3D TEE) and multidetector computed tomography defined a unicuspid, unicommissural aortic valve with moderate aortic stenosis and regurgitation. This case report confirms that 3D TEE gives us the opportunity for early, improved and precise diagnosis of UAV. © 2018 Wiley Periodicals, Inc.
Molecular Imaging and Precision Medicine in Dementia and Movement Disorders.
Mallik, Atul K; Drzezga, Alexander; Minoshima, Satoshi
2017-01-01
Precision medicine (PM) has been defined as "prevention and treatment strategies that take individual variability into account." Molecular imaging (MI) is an ideally suited tool for PM approaches to neurodegenerative dementia and movement disorders (MD). Here we review PM approaches and discuss how they may be applied to neurodegenerative dementias and MD. With ongoing major therapeutic research initiatives that include the use of molecular imaging, we look forward to established interventions targeted to specific molecular pathophysiology, and we expect the potential benefit of MI PM approaches in neurodegenerative dementia and MD will only increase. Copyright © 2016 Elsevier Inc. All rights reserved.
Early Huntington's Disease Affects Movements in Transformed Sensorimotor Mappings
ERIC Educational Resources Information Center
Boulet, C.; Lemay, M.; Bedard, M.A.; Chouinard, M.J.; Chouinard, S.; Richer, F.
2005-01-01
This study examined the effect of transformed visual feedback on movement control in Huntington's disease (HD). Patients in the early stages of HD and controls performed aiming movements towards peripheral targets on a digitizing tablet and emphasizing precision. In a baseline condition, HD patients were slower but showed few precision problems in…
Feasibility of the Precise Energy Calibration for Fast Neutron Spectrometers
NASA Astrophysics Data System (ADS)
Gaganov, V. V.; Usenko, P. L.; Kryzhanovskaja, M. A.
2017-12-01
Computational studies aimed at improving the accuracy of measurements performed using neutron generators with a tritium target were performed. A measurement design yielding an extremely narrow peak in the energy spectrum of DT neutrons was found. The presence of such a peak establishes the conditions for precise energy calibration of fast-neutron spectrometers.
Raymond, Mark R; Clauser, Brian E; Furman, Gail E
2010-10-01
The use of standardized patients to assess communication skills is now an essential part of assessing a physician's readiness for practice. To improve the reliability of communication scores, it has become increasingly common in recent years to use statistical models to adjust ratings provided by standardized patients. This study employed ordinary least squares regression to adjust ratings, and then used generalizability theory to evaluate the impact of these adjustments on score reliability and the overall standard error of measurement. In addition, conditional standard errors of measurement were computed for both observed and adjusted scores to determine whether the improvements in measurement precision were uniform across the score distribution. Results indicated that measurement was generally less precise for communication ratings toward the lower end of the score distribution, and the improvement in measurement precision afforded by statistical modeling varied slightly across the score distribution, such that the most improvement occurred in the upper-middle range of the score scale. Possible reasons for these patterns in measurement precision are discussed, as are the limitations of the statistical models used for adjusting performance ratings.
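The rating-adjustment step described above can be sketched as a least-squares removal of rater stringency (an illustrative toy, not the study's actual model; the raters and scores below are invented):

```python
import numpy as np

# Illustrative sketch (values invented): remove standardized-patient (rater)
# stringency from communication ratings with ordinary least squares.
# The model is  rating = grand_mean + rater_effect + residual,
# and the adjusted score subtracts the estimated rater effect, then re-centers
# so the overall mean is unchanged.

raters  = np.array([0, 0, 0, 1, 1, 1])              # two raters, three examinees each
ratings = np.array([7.0, 8.0, 9.0, 4.0, 5.0, 6.0])  # rater 1 is systematically harsher

X = np.column_stack([np.ones_like(ratings), (raters == 1).astype(float)])
beta, *_ = np.linalg.lstsq(X, ratings, rcond=None)   # [intercept, rater-1 effect]
rater_effect = np.where(raters == 1, beta[1], 0.0)
adjusted = ratings - rater_effect + rater_effect.mean()  # re-center on grand mean
```

After adjustment, examinees seen by the harsh and lenient raters with the same relative standing receive the same score, while the grand mean of the ratings is preserved.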
DNA Nanotechnology for Precise Control over Drug Delivery and Gene Therapy.
Angell, Chava; Xie, Sibai; Zhang, Liangfang; Chen, Yi
2016-03-02
Nanomedicine has been growing exponentially due to its enhanced drug targeting and reduced drug toxicity. It uses the interactions where nanotechnological components and biological systems communicate with each other to facilitate the delivery performance. At this scale, the physiochemical properties of delivery systems strongly affect their capacities. Among current delivery systems, DNA nanotechnology shows many advantages because of its unprecedented engineering abilities. Through molecular recognition, DNA nanotechnology can be used to construct a variety of nanostructures with precisely controllable size, shape, and surface chemistry, which can be appreciated in the delivery process. In this review, different approaches that are currently used for the construction of DNA nanostructures are reported. Further, the utilization of these DNA nanostructures with the well-defined parameters for the precise control in drug delivery and gene therapy is discussed. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Ultrathin conformal devices for precise and continuous thermal characterization of human skin
Webb, R. Chad; Bonifas, Andrew P.; Behnaz, Alex; Zhang, Yihui; Yu, Ki Jun; Cheng, Huanyu; Shi, Mingxing; Bian, Zuguang; Liu, Zhuangjian; Kim, Yun-Soung; Yeo, Woon-Hong; Park, Jae Suk; Song, Jizhou; Li, Yuhang; Huang, Yonggang; Gorbach, Alexander M.; Rogers, John A.
2013-01-01
Precision thermometry of the skin can, together with other measurements, provide clinically relevant information about cardiovascular health, cognitive state, malignancy and many other important aspects of human physiology. Here, we introduce an ultrathin, compliant skin-like sensor/actuator technology that can pliably laminate onto the epidermis to provide continuous, accurate thermal characterizations that are unavailable with other methods. Examples include non-invasive spatial mapping of skin temperature with millikelvin precision, and simultaneous quantitative assessment of tissue thermal conductivity. Such devices can also be implemented in ways that reveal the time-dynamic influence of blood flow and perfusion on these properties. Experimental and theoretical studies establish the underlying principles of operation, and define engineering guidelines for device design. Evaluation of subtle variations in skin temperature associated with mental activity, physical stimulation and vasoconstriction/dilation along with accurate determination of skin hydration through measurements of thermal conductivity represent some important operational examples. PMID:24037122
Holographic photolysis of caged neurotransmitters
Lutz, Christoph; Otis, Thomas S.; DeSars, Vincent; Charpak, Serge; DiGregorio, David A.; Emiliani, Valentina
2009-01-01
Stimulation of light-sensitive chemical probes has become a powerful tool for the study of dynamic signaling processes in living tissue. Classically, this approach has been constrained by limitations of lens-based and point-scanning illumination systems. Here we describe a novel microscope configuration that incorporates a nematic liquid crystal spatial light modulator (LC-SLM) to generate holographic patterns of illumination. This microscope can produce illumination spots of variable size and number and patterns shaped to precisely match user-defined elements in a specimen. Using holographic illumination to photolyse caged glutamate in brain slices, we demonstrate that shaped excitation on segments of neuronal dendrites and simultaneous, multi-spot excitation of different dendrites enables precise spatial and rapid temporal control of glutamate receptor activation. By allowing the excitation volume shape to be tailored precisely, the holographic microscope provides an extremely flexible method for activation of various photosensitive proteins and small molecules. PMID:19160517
Design of Measure and Control System for Precision Pesticide Deploying Dynamic Simulating Device
NASA Astrophysics Data System (ADS)
Liang, Yong; Liu, Pingzeng; Wang, Lu; Liu, Jiping; Wang, Lang; Han, Lei; Yang, Xinxin
A measurement and control system for a precision pesticide-deployment simulating device is designed in order to study pesticide deployment technology. The system can simulate every state of practical pesticide deployment and carry out precise, simultaneous measurement of every factor affecting pesticide deployment effectiveness. The hardware and software incorporate a modular structural design: the system is divided into distinct hardware and software function modules, and corresponding modules are developed accordingly. The module interfaces are uniformly defined, which simplifies module interconnection, enhances the system's universality, development efficiency, and reliability, and makes the program easy to extend and maintain. Some of the hardware and software modules can be readily adapted to other measurement and control systems. The paper introduces the design of the special numerical control system, the main module of the information acquisition system, and the speed acquisition module in order to explain the module design process.
Detecting and Characterizing Semantic Inconsistencies in Ported Code
NASA Technical Reports Server (NTRS)
Ray, Baishakhi; Kim, Miryung; Person, Suzette J.; Rungta, Neha
2013-01-01
Adding similar features and bug fixes often requires porting program patches from reference implementations and adapting them to target implementations. Porting errors may result from faulty adaptations or inconsistent updates. This paper investigates (1) the types of porting errors found in practice, and (2) how to detect and characterize potential porting errors. Analyzing version histories, we define five categories of porting errors, including incorrect control- and data-flow, code redundancy, inconsistent identifier renamings, etc. Leveraging this categorization, we design a static control- and data-dependence analysis technique, SPA, to detect and characterize porting inconsistencies. Our evaluation on code from four open-source projects shows that SPA can detect porting inconsistencies with 65% to 73% precision and 90% recall, and identify inconsistency types with 58% to 63% precision and 92% to 100% recall. In a comparison with two existing error detection tools, SPA improves precision by 14 to 17 percentage points.
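The precision and recall figures quoted for SPA follow the standard definitions; a minimal sketch with invented counts (not the paper's actual confusion matrix):

```python
# Sketch of the reported metrics (counts invented for illustration): precision
# and recall from true positives, false positives, and false negatives, as used
# to evaluate an inconsistency detector such as SPA.

def precision_recall(tp, fp, fn):
    precision = tp / (tp + fp)   # fraction of flagged items that are real errors
    recall = tp / (tp + fn)      # fraction of real errors that were flagged
    return precision, recall

# e.g. 65 correctly flagged inconsistencies, 35 false alarms, 7 misses
p, r = precision_recall(tp=65, fp=35, fn=7)
```

A tool that flags more candidates generally trades precision for recall, which is why the paper reports both.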
Wu, Jun; Hu, Xie-he; Chen, Sheng; Chu, Jian
2003-01-01
The closed-loop stability of finite-precision realizations was investigated for digital controllers implemented in block-floating-point format. The controller coefficient perturbation resulting from the use of a finite word length (FWL) block-floating-point representation scheme was analyzed. A block-floating-point FWL closed-loop stability measure was derived that considers both dynamic range and precision. To facilitate the design of optimal finite-precision controller realizations, a computationally tractable block-floating-point FWL closed-loop stability measure was then introduced, and a method of computing the value of this measure for a given controller realization was developed. The optimal controller realization is defined as the solution that maximizes the corresponding measure, and a numerical optimization approach was adopted to solve the resulting optimal realization problem. A numerical example was used to illustrate the design procedure and to compare the optimal controller realization with the initial realization.
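The underlying question the FWL stability measure quantifies can be illustrated with a toy fixed-point example (plant, gain, and word lengths all invented; this is not the paper's block-floating-point scheme or measure):

```python
import numpy as np

# Illustrative sketch (system matrices invented): check whether rounding the
# coefficients of a digital state-feedback gain to a given word length keeps
# the closed-loop poles inside the unit circle -- the question an FWL
# closed-loop stability measure quantifies.

A = np.array([[1.0, 0.1],
              [0.0, 1.0]])         # discrete-time plant (double integrator)
B = np.array([[0.005], [0.1]])
K = np.array([[12.147, 0.662]])    # nominal stabilizing state-feedback gain

def quantize(x, frac_bits):
    """Round to a fixed number of fractional bits (fixed-point coefficients)."""
    scale = 2.0 ** frac_bits
    return np.round(x * scale) / scale

def stable(gain):
    poles = np.linalg.eigvals(A - B @ gain)
    return bool(np.all(np.abs(poles) < 1.0))

fine   = stable(quantize(K, 10))   # 10 fractional bits: tiny perturbation
coarse = stable(quantize(K, 1))    # 1 fractional bit: large perturbation
```

With 10 fractional bits the perturbed poles stay inside the unit circle, while 1 fractional bit pushes them outside; a stability measure ranks realizations by how much coefficient perturbation they can tolerate before this happens.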
NASA Technical Reports Server (NTRS)
Leskovar, B.; Turko, B.
1977-01-01
The development of a high-precision time interval digitizer is described. The time digitizer is a 10 psec resolution stopwatch covering a range of up to 340 msec. The measured time interval is determined as the separation between the leading edges of a pair of pulses applied externally to the start and stop inputs of the digitizer. Employing an interpolation technique and a 50 MHz high-precision master oscillator, the equivalent of a 100 GHz clock frequency standard is achieved. The absolute accuracy and stability of the digitizer are determined by the external 50 MHz master oscillator, which serves as the standard time marker. The start and stop pulses are fast signals with 1 nsec rise times, conforming to the Nuclear Instrument Module (NIM) standard and detected by means of tunnel diode discriminators. The firing levels of the discriminators define the start and stop points between which the time interval is digitized.
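The interpolation scheme can be sketched arithmetically: a coarse counter measures whole 20 ns clock periods, and interpolators resolve the fractional times from the start and stop edges to the next clock edge (a simplified model with an invented start phase, not the instrument's circuitry):

```python
# Sketch of the interpolation idea (numbers illustrative): a coarse counter
# clocked at 50 MHz (20 ns period) counts whole clock periods, and two
# interpolators measure the fractional times from the start and stop pulses
# to the next clock edge, giving an effective resolution far below one period.

CLOCK_PERIOD_PS = 20_000          # 50 MHz master oscillator -> 20 ns period
LSB_PS = 10                       # 10 ps interpolator resolution (~"100 GHz")

def measure(interval_ps):
    """Digitize an interval: coarse count plus start/stop interpolation."""
    start_phase = 7_337           # phase of start pulse within a clock period
    t_stop = (start_phase + interval_ps) % CLOCK_PERIOD_PS
    n_periods = (start_phase + interval_ps) // CLOCK_PERIOD_PS
    # interpolators quantize the edge-to-clock fractions to the 10 ps LSB
    frac_start = round((CLOCK_PERIOD_PS - start_phase) / LSB_PS) * LSB_PS
    frac_stop = round((CLOCK_PERIOD_PS - t_stop) / LSB_PS) * LSB_PS
    return n_periods * CLOCK_PERIOD_PS + frac_start - frac_stop
```

The coarse count carries the full 340 ms range while the interpolators carry the 10 ps resolution, so the result is accurate to within one interpolator LSB regardless of interval length.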
A low-cost programmable pulse generator for physiology and behavior
Sanders, Joshua I.; Kepecs, Adam
2014-01-01
Precisely timed experimental manipulations of the brain and its sensory environment are often employed to reveal principles of brain function. While complex and reliable pulse trains for temporal stimulus control can be generated with commercial instruments, contemporary options remain expensive and proprietary. We have developed Pulse Pal, an open source device that allows users to create and trigger software-defined trains of voltage pulses with high temporal precision. Here we describe Pulse Pal’s circuitry and firmware, and characterize its precision and reliability. In addition, we supply online documentation with instructions for assembling, testing and installing Pulse Pal. While the device can be operated as a stand-alone instrument, we also provide application programming interfaces in several programming languages. As an inexpensive, flexible and open solution for temporal control, we anticipate that Pulse Pal will be used to address a wide range of instrumentation timing challenges in neuroscience research. PMID:25566051
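The software-defined pulse trains that such a device triggers are fully specified by a few timing parameters; a minimal sketch of how those parameters expand into output edges (an illustration of the concept, not Pulse Pal's firmware or API):

```python
# Minimal sketch (not Pulse Pal firmware): the parameters defining a simple
# monophasic pulse train -- pulse width, period, train duration, onset delay --
# expanded into onset/offset times, as a programmable pulse generator would
# schedule one output channel.

def pulse_train(width_ms, period_ms, duration_ms, delay_ms=0.0):
    """Return a list of (onset, offset) times in ms for one output channel."""
    assert width_ms < period_ms, "pulse must fit inside its period"
    edges = []
    t = delay_ms
    while t + width_ms <= delay_ms + duration_ms:
        edges.append((t, t + width_ms))
        t += period_ms
    return edges

# 2 ms pulses at 10 Hz (100 ms period) for half a second
train = pulse_train(width_ms=2.0, period_ms=100.0, duration_ms=500.0)
```

On real hardware these times would be rendered with the device's clock resolution and jitter, which is what the paper characterizes.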
A Road Map for Precision Medicine in the Epilepsies
2015-01-01
Summary Technological advances have paved the way for accelerated genomic discovery and are bringing precision medicine clearly into view. Epilepsy research in particular is well-suited to serve as a model for the development and deployment of targeted therapeutics in precision medicine because of the rapidly expanding genetic knowledge base in epilepsy, the availability of good in vitro and in vivo model systems to efficiently study the biological consequences of genetic mutations, the ability to turn these models into effective drug screening platforms, and the establishment of collaborative research groups. Moving forward, it is critical that we strengthen these collaborations, particularly through integrated research platforms to provide robust analyses both for accurate personal genome analysis and gene and drug discovery. Similarly, the implementation of clinical trial networks will allow the expansion of patient sample populations with genetically defined epilepsy so that drug discovery can be translated into clinical practice. PMID:26416172
Muraro, A; Lemanske, R F; Castells, M; Torres, M J; Khan, D; Simon, H-U; Bindslev-Jensen, C; Burks, W; Poulsen, L K; Sampson, H A; Worm, M; Nadeau, K C
2017-07-01
This consensus document summarizes the current knowledge on the potential for precision medicine in food allergy, drug allergy, and anaphylaxis under the auspices of the PRACTALL collaboration platform. PRACTALL is a joint effort of the European Academy of Allergy and Clinical Immunology and the American Academy of Allergy, Asthma and Immunology, which aims to synchronize the European and American approaches to allergy care. Precision medicine is an emerging approach for disease treatment based on disease endotypes, which are phenotypic subclasses associated with specific mechanisms underlying the disease. Although significant progress has been made in defining endotypes for asthma, definitions of endotypes for food and drug allergy or for anaphylaxis lag behind. Progress has been made in discovery of biomarkers to guide a precision medicine approach to treatment of food and drug allergy, but further validation and quantification of these biomarkers are needed to allow their translation into practice in the clinical management of allergic disease. © 2017 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
Uncertainty Analysis of Instrument Calibration and Application
NASA Technical Reports Server (NTRS)
Tripp, John S.; Tcheng, Ping
1999-01-01
Experimental aerodynamic researchers require estimated precision and bias uncertainties of measured physical quantities, typically at 95 percent confidence levels. Uncertainties of final computed aerodynamic parameters are obtained by propagation of individual measurement uncertainties through the defining functional expressions. In this paper, rigorous mathematical techniques are extended to determine the precision and bias uncertainties of any instrument-sensor system. Through this analysis, instrument uncertainties determined through calibration are now expressed as functions of the corresponding measurement for linear and nonlinear univariate and multivariate processes. Treatment of correlated measurement precision error is developed. During laboratory calibration, calibration standard uncertainties are assumed to be an order of magnitude less than those of the instrument being calibrated; often, calibration standards do not satisfy this assumption. This paper applies rigorous statistical methods for inclusion of calibration standard uncertainty and covariance due to the order of their application. The effects of mathematical modeling error on calibration bias uncertainty are quantified. The effects of experimental design on uncertainty are analyzed. The importance of replication is emphasized, and techniques for estimating both bias and precision uncertainties using replication are developed. Statistical tests for stationarity of calibration parameters over time are obtained.
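The propagation step mentioned above is conventionally the first-order root-sum-square rule; a generic sketch with numerical partial derivatives (the example quantity and uncertainty values are invented, and this is not the paper's specific formulation):

```python
import math

# Generic sketch (not the paper's method): propagate individual measurement
# standard uncertainties through a defining functional expression using the
# usual first-order (root-sum-square) rule, with partial derivatives taken
# by forward finite differences. Assumes independent inputs.

def propagate(f, values, sigmas, h=1e-6):
    """First-order uncertainty of f(*values) given per-input sigmas."""
    var = 0.0
    for i, s in enumerate(sigmas):
        bumped = list(values)
        bumped[i] += h
        dfdx = (f(*bumped) - f(*values)) / h   # numerical partial derivative
        var += (dfdx * s) ** 2
    return math.sqrt(var)

# Example: dynamic pressure q = 0.5 * rho * v**2 from density and velocity
q = lambda rho, v: 0.5 * rho * v**2
u_q = propagate(q, values=[1.225, 30.0], sigmas=[0.005, 0.2])
```

Correlated inputs, the paper's main complication, would add covariance cross-terms to the variance sum.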
ERIC Educational Resources Information Center
Rabinak, Christine A.; Orsini, Caitlin A.; Zimmerman, Joshua M.; Maren, Stephen
2009-01-01
The basolateral complex (BLA) and central nucleus (CEA) of the amygdala play critical roles in associative learning, including Pavlovian conditioning. However, the precise role for these structures in Pavlovian conditioning is not clear. Recent work in appetitive conditioning paradigms suggests that the amygdala, particularly the BLA, has an…
On an Additive Semigraphoid Model for Statistical Networks With Application to Pathway Analysis.
Li, Bing; Chun, Hyonho; Zhao, Hongyu
2014-09-01
We introduce a nonparametric method for estimating non-Gaussian graphical models based on a new statistical relation called additive conditional independence, which is a three-way relation among random vectors that resembles the logical structure of conditional independence. Additive conditional independence allows us to use one-dimensional kernels regardless of the dimension of the graph, which not only avoids the curse of dimensionality but also simplifies computation. It also gives rise to a parallel structure to the Gaussian graphical model that replaces the precision matrix by an additive precision operator. The estimators derived from additive conditional independence cover the recently introduced nonparanormal graphical model as a special case, but outperform it when the Gaussian copula assumption is violated. We compare the new method with existing ones by simulations and in genetic pathway analysis.
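The Gaussian case that the additive precision operator generalizes can be sketched directly: in a Gaussian graphical model, a zero entry of the precision matrix (inverse covariance) encodes conditional independence (a background illustration with simulated data, not the paper's estimator):

```python
import numpy as np

# Background sketch for the Gaussian case: simulate a chain X1 -> X2 -> X3.
# X1 and X3 are marginally correlated but conditionally independent given X2,
# so the (0, 2) entry of the precision matrix should be (near) zero while
# the (0, 1) entry is not.

rng = np.random.default_rng(0)
n = 200_000
x1 = rng.standard_normal(n)
x2 = 0.8 * x1 + rng.standard_normal(n)
x3 = 0.8 * x2 + rng.standard_normal(n)

cov = np.cov(np.vstack([x1, x2, x3]))   # rows are variables
prec = np.linalg.inv(cov)               # precision matrix: zeros = missing edges
```

The paper's additive precision operator plays the same edge-encoding role without assuming joint Gaussianity.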
Smith, P; Kronvall, G
2015-07-01
The influence of test conditions on the precision of disc diffusion data was examined by analysing multilaboratory data sets generated after incubation at 35 °C for 18 h, at 28 °C for 24 h, and at 22 °C for 24 h and 48 h. Analyses of these data sets demonstrated that precision was significantly and progressively decreased as the test temperature was reduced from 35 to 22 °C. Analysis of the data obtained at 22 °C also showed that precision was inversely related to the time of incubation. The temperature- and time-related decreases in precision were not related to differences in the mean zone sizes of the data sets obtained under these test conditions. Analysis of the zone data obtained at 28 and 22 °C as single-laboratory sets demonstrated that reductions of incubation temperature resulted in significant increases in both intralaboratory and interlaboratory variation. Increases in incubation time at 22 °C were, however, associated with statistically significant increases in interlaboratory variation but not with any significant increase in intralaboratory variation. The significance of these observations for establishing the acceptable limits of precision of data sets that can be used for setting valid epidemiological cut-off values is discussed. © 2014 John Wiley & Sons Ltd.
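The intra- versus interlaboratory split discussed above amounts to partitioning the total variance of zone diameters into within-lab and between-lab components; a toy sketch with invented zone diameters (not the study's data or its statistical model):

```python
import statistics

# Illustrative sketch (zone diameters in mm invented): split the variability of
# multilaboratory disc diffusion data into a within-laboratory component (mean
# of per-lab variances) and a between-laboratory component (variance of the
# lab means) -- the two quantities compared across incubation conditions.

labs = {
    "lab_a": [24, 25, 26, 25],
    "lab_b": [28, 29, 28, 27],
    "lab_c": [22, 23, 21, 22],
}

lab_means = {k: statistics.mean(v) for k, v in labs.items()}
within = statistics.mean(statistics.pvariance(v) for v in labs.values())
between = statistics.pvariance(lab_means.values())
```

In this toy data set the labs agree internally (small `within`) but sit at different levels (large `between`), the pattern the study associates with lowered incubation temperature.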
Xing, Li; Hang, Yijun; Xiong, Zhi; Liu, Jianye; Wan, Zhong
2016-01-01
This paper describes a disturbance acceleration adaptive estimate and correction approach for an attitude reference system (ARS) so as to improve the attitude estimate precision under vehicle movement conditions. The proposed approach depends on a Kalman filter, where the attitude error, the gyroscope zero offset error and the disturbance acceleration error are estimated. By switching the filter decay coefficient of the disturbance acceleration model in different acceleration modes, the disturbance acceleration is adaptively estimated and corrected, and then the attitude estimate precision is improved. The filter was tested in three different disturbance acceleration modes (non-acceleration, vibration-acceleration and sustained-acceleration mode, respectively) by digital simulation. Moreover, the proposed approach was tested in a kinematic vehicle experiment as well. Using the designed simulations and kinematic vehicle experiments, it has been shown that the disturbance acceleration of each mode can be accurately estimated and corrected. Moreover, compared with the complementary filter, the experimental results have explicitly demonstrated the proposed approach further improves the attitude estimate precision under vehicle movement conditions. PMID:27754469
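The mode-switching idea in the abstract above — use a different decay coefficient for the first-order disturbance-acceleration model depending on the detected acceleration mode — can be sketched as follows. This is a hedged illustration, not the authors' filter: the thresholds, coefficients and helper names (`select_decay`, `update_disturbance`) are assumptions, and the real ARS embeds this logic in a full Kalman filter with attitude-error and gyroscope zero-offset states.

```python
import numpy as np

G = 9.81  # gravity magnitude, m/s^2

def select_decay(accel_meas, vib_thresh=0.5, sustained_thresh=2.0):
    """Choose a first-order (Gauss-Markov) decay coefficient for the
    disturbance-acceleration state from the detected motion mode.
    Thresholds (m/s^2) and coefficients are illustrative only."""
    dev = abs(np.linalg.norm(accel_meas) - G)
    if dev < vib_thresh:          # non-acceleration mode: trust gravity
        return 0.99
    if dev < sustained_thresh:    # vibration-acceleration mode
        return 0.9
    return 0.5                    # sustained-acceleration mode

def update_disturbance(d_est, accel_meas, grav_pred, decay):
    """Decay the previous disturbance-acceleration estimate and blend in
    the measured specific-force residual (measurement minus predicted
    gravity direction)."""
    residual = accel_meas - grav_pred
    return decay * np.asarray(d_est) + (1.0 - decay) * residual
```

A larger residual between measured and predicted specific force pushes the filter toward a shorter correlation time, so sustained accelerations are tracked quickly while gravity remains the attitude reference at rest.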
Property-Based Software Engineering Measurement
NASA Technical Reports Server (NTRS)
Briand, Lionel; Morasca, Sandro; Basili, Victor R.
1995-01-01
Little theory exists in the field of software system measurement. Concepts such as complexity, coupling, cohesion or even size are very often subject to interpretation and appear to have inconsistent definitions in the literature. As a consequence, there is little guidance provided to the analyst attempting to define proper measures for specific problems. Many controversies in the literature are simply misunderstandings and stem from the fact that some people talk about different measurement concepts under the same label (complexity is the most common case). There is a need to define unambiguously the most important measurement concepts used in the measurement of software products. One way of doing so is to define precisely what mathematical properties characterize these concepts regardless of the specific software artifacts to which these concepts are applied. Such a mathematical framework could generate a consensus in the software engineering community and provide a means for better communication among researchers, better guidelines for analysis, and better evaluation methods for commercial static analyzers for practitioners. In this paper, we propose a mathematical framework which is generic, because it is not specific to any particular software artifact, and rigorous, because it is based on precise mathematical concepts. This framework defines several important measurement concepts (size, length, complexity, cohesion, coupling). It is not intended to be complete or fully objective; other frameworks could have been proposed and different choices could have been made. However, we believe that the formalism and properties we introduce are convenient and intuitive. In addition, we have reviewed the literature on this subject and compared it with our work. This framework contributes constructively to a firmer theoretical ground of software measurement.
NASA Astrophysics Data System (ADS)
Zheng, M.; Zhu, M.; Wang, Y.; Xu, C.; Yang, H.
2018-04-01
As the headstream of the Yellow River, the Yangtze River and the Lantsang River, located in the hinterland of the Qinghai-Tibet Plateau, Qinghai province is hugely significant for the ecosystem as well as for ecological security and sustainable development in China. With the accomplishment of the first national geographic condition census, frequent monitoring has begun. The classification indicators of the census and monitoring data are highly correlated with the Technical Criterion for Ecosystem Status Evaluation released by the Ministry of Environmental Protection in 2015. Based on three years of geographic conditions data (2014-2016), Landsat-8 images and thematic data (water resources, pollution emissions, meteorological data, soil erosion, etc.), this paper presents a multi-year, high-precision evaluation and spatiotemporal change analysis of the eco-environment status of Qinghai province on the basis of the Technical Criterion for Ecosystem Status Evaluation. Unlike the evaluation implemented by the environmental protection department, the evaluation unit in this paper is the town rather than the county. The evaluation results show that the eco-environment status in Qinghai is generally in fine condition, with significant regional differences. Eco-environment status evaluation based on national geographic conditions census and monitoring data can improve both temporal and spatial precision. An eco-environment status assessment with high spatial precision and multiple indices is a key basis for environmental protection decision-making.
Lake Chapala change detection using time series
NASA Astrophysics Data System (ADS)
López-Caloca, Alejandra; Tapia-Silva, Felipe-Omar; Escalante-Ramírez, Boris
2008-10-01
Lake Chapala is the largest natural lake in Mexico. It suffers from a hydrological imbalance caused by diminishing intakes from the Lerma River, pollution of those inflows, loss of native vegetation, and solid waste. This article presents a study that determines with high precision the extent of the reduction in both surface area and volume of Lake Chapala over the period from 1990 to 2007, monitored through satellite images. Image segmentation was achieved with a Markov random field model, extending the application towards edge detection. This allows the lake's limits to be adequately defined as well as the identification of new zones within the lake, both changes pertaining to Lake Chapala. Detected changes are related to a hydrological balance study based on measuring variables such as storage volumes, evapotranspiration and water balance. Results show that the changes in Lake Chapala establish fragile conditions that pose a future risk. Rehabilitation of the lake requires a hydrologic balance in its banks and aquifers.
NASA Astrophysics Data System (ADS)
Xie, Xiaobin; Gao, Guanhui; Kang, Shendong; Lei, Yanhua; Pan, Zhengyin; Shibayama, Tamaki; Cai, Lintao
2017-06-01
Being able to precisely control the morphologies of noble metallic nanostructures is of essential significance for promoting the surface-enhanced Raman scattering (SERS) effect. Herein, we demonstrate an overgrowth strategy for synthesizing Au@M (M = Au, Ag, Pd, Pt) core-shell heterogeneous nanocrystals with an orientated structural evolution and highly improved properties by using Au nanorods as seeds. With the same reaction condition system applied, we obtain four well-designed heterostructures with diverse shapes, including Au concave nanocuboids (Au CNs), Au@Ag face-centered-cubic crystallized nanopeanuts, Au@Pd porous nanocuboids and Au@Pt nanotrepangs. Subsequently, the exact overgrowth mechanism of the above heterostructural building blocks is further analysed via the systematic optimization of a series of fabrications. Remarkably, the well-defined Au CNs and Au@Ag nanopeanuts both exhibit highly promoted SERS activity. We expect this facile strategy to enable the fabrication of multimetallic heterogeneous nanostructures with a high SERS effect and catalytic activity.
Generalized quantum theory of recollapsing homogeneous cosmologies
NASA Astrophysics Data System (ADS)
Craig, David; Hartle, James B.
2004-06-01
A sum-over-histories generalized quantum theory is developed for homogeneous minisuperspace type A Bianchi cosmological models, focusing on the particular example of the classically recollapsing Bianchi type-IX universe. The decoherence functional for such universes is exhibited. We show how the probabilities of decoherent sets of alternative, coarse-grained histories of these model universes can be calculated. We consider in particular the probabilities for classical evolution defined by a suitable coarse graining. For a restricted class of initial conditions and coarse grainings we exhibit the approximate decoherence of alternative histories in which the universe behaves classically and those in which it does not. For these situations we show that the probability is near unity for the universe to recontract classically if it expands classically. We also determine the relative probabilities of quasiclassical trajectories for initial states of WKB form, recovering for such states a precise form of the familiar heuristic “J·dΣ” rule of quantum cosmology, as well as a generalization of this rule to generic initial states.
Generation of the SCN1A epilepsy mutation in hiPS cells using the TALEN technique
NASA Astrophysics Data System (ADS)
Chen, Wanjuan; Liu, Jingxin; Zhang, Longmei; Xu, Huijuan; Guo, Xiaogang; Deng, Sihao; Liu, Lipeng; Yu, Daiguan; Chen, Yonglong; Li, Zhiyuan
2014-06-01
Human induced pluripotent stem cells (iPSC) can be used to understand the pathological mechanisms of human disease. These cells are a promising source for cell-replacement therapy. However, such studies require genetically defined conditions. Such genetic manipulations can be performed using the novel Transcription Activator-Like Effector Nucleases (TALENs), which generate site-specific double-strand DNA breaks (DSBs) with high efficiency and precision. Combining the TALEN and iPSC methods, we developed two iPS cell lines by generating the point mutation A5768G in the SCN1A gene, which encodes the voltage-gated sodium channel Nav1.1 α subunit. The engineered iPSC maintained pluripotency and successfully differentiated into neurons with normal functional characteristics. The two cell lines differ exclusively at the epilepsy-susceptibility variant. The ability to robustly introduce disease-causing point mutations in normal hiPS cell lines can be used to generate a human cell model for studying epileptic mechanisms and for drug screening.
Accuracy of dynamical-decoupling-based spectroscopy of Gaussian noise
NASA Astrophysics Data System (ADS)
Szańkowski, Piotr; Cywiński, Łukasz
2018-03-01
The fundamental assumption of dynamical-decoupling-based noise spectroscopy is that the coherence decay rate of a qubit (or qubits) driven with a sequence of many pulses is well approximated by the environmental noise spectrum evaluated on the frequency comb defined by the sequence. Here we investigate the precise conditions under which this commonly used spectroscopic approach is quantitatively correct. To this end we focus on two representative examples of spectral densities: the long-tailed Lorentzian and the finite-ranged Gaussian, both expected to be encountered when using the qubit for nanoscale nuclear resonance imaging. We have found that, in contrast to the Lorentzian spectrum, for which the corrections to the standard spectroscopic formulas can easily be made negligible, spectra with finite range are more challenging to reconstruct accurately. For a Gaussian line shape of the environmental spectral density, direct application of standard dynamical-decoupling-based spectroscopy leads to erroneous attribution of long-tail behavior to the reconstructed spectrum. Fortunately, artifacts such as this can be completely avoided with a simple extension of the standard reconstruction method.
Quantifying reflexivity in financial markets: Toward a prediction of flash crashes
NASA Astrophysics Data System (ADS)
Filimonov, Vladimir; Sornette, Didier
2012-05-01
We introduce a measure of activity of financial markets that provides direct access to their level of endogeneity. This measure quantifies how much of the price change activity is due to endogenous feedback processes, as opposed to exogenous news. For this, we calibrate the self-excited conditional Poisson Hawkes model, which combines in a natural and parsimonious way exogenous influences with self-excited dynamics, to the E-mini S&P 500 futures contracts traded on the Chicago Mercantile Exchange from 1998 to 2010. We find that the level of endogeneity has increased significantly over this period: the fraction of price changes resulting from revealed exogenous information fell from about 70% in 1998 to less than 30% since 2007. Analogous to nuclear plant safety measures concerned with avoiding “criticality,” our measure provides a direct quantification of the distance of the financial market from a critical state, defined precisely as the limit of diverging trading activity in the absence of any external driving.
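The endogeneity measure here is the branching ratio n of the Hawkes model: the average number of events each event triggers. In the subcritical immigration-branching picture, n also equals the fraction of events that are not exogenous, which the following toy simulation illustrates. The cascade construction and parameter values are illustrative assumptions, not the paper's calibration, which fits the Hawkes model to actual trade data.

```python
import numpy as np

def branching_endogeneity(n=0.7, immigrants=5000, seed=1):
    """Toy immigration-branching view of a self-excited Hawkes process:
    each exogenous 'immigrant' event triggers a Poisson(n) number of
    offspring, each offspring does the same, and so on (subcritical for
    n < 1). The endogeneity -- the branching ratio n -- is recovered as
    the fraction of all events that are NOT exogenous."""
    rng = np.random.default_rng(seed)
    total = 0
    for _ in range(immigrants):
        pending = 1                    # the immigrant itself
        while pending:
            pending -= 1
            total += 1
            pending += rng.poisson(n)  # events triggered by this event
    return 1.0 - immigrants / total
```

With n = 0.7 the recovered fraction comes back close to 0.7, since each immigrant generates on average 1/(1 - n) events in total; as n approaches 1, trading activity diverges, the "critical" limit the abstract refers to.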
A critical literature review of health economic evaluations of rotavirus vaccination
Aballéa, Samuel; Millier, Aurélie; Quilici, Sibilia; Caroll, Stuart; Petrou, Stavros; Toumi, Mondher
2013-01-01
Two licensed vaccines are available to prevent rotavirus gastroenteritis (RVGE) in infants. A worldwide critical review of economic evaluations of these vaccines was conducted. The objective was to describe differences in methodologies, assumptions and inputs and to determine the key factors driving differences in conclusions. 68 economic evaluations were reviewed. RV vaccination was found to be cost-effective in developing countries, while conclusions varied between studies in developed countries. Many studies found that vaccination was likely to be cost-effective under some scenarios, such as lower-price scenarios, inclusion of herd protection, and/or adoption of a societal perspective. Other reasons for variability included uncertainty around the incidence of healthcare visits and lack of consensus on quality of life (QoL) valuation for infants and caregivers. New evidence on real-world vaccination effectiveness, new ways of modeling herd protection and assessments of QoL in children could help define more precisely the conditions under which RV vaccination would be cost-effective in developed countries. PMID:23571226
Capovilla, G; Lorenzetti, M E; Montagnini, A; Borgatti, R; Piccinelli, P; Giordano, L; Accorsi, P; Caudana, R
2001-05-01
Seckel's syndrome is a rare form of primordial dwarfism, characterized by a peculiar facial appearance. In the past, this condition was overdiagnosed, and most attention was given to the facial and skeletal features in order to define more precise diagnostic criteria. The presence of mental retardation and neurologic signs is one of the peculiar features of this syndrome, but only recently were rare cases of malformation of cortical development described, as documented by magnetic resonance imaging (MRI). Here, we present three new cases of Seckel's syndrome showing different malformations of cortical development (one gyral hypoplasia, one macrogyria and partial corpus callosum agenesis, and one bilateral opercular macrogyria). We hypothesize that the different clinical expressions of our patients could be explained by the different types of malformation of cortical development. We believe that MRI studies should be performed in malformative syndromes, given the possible correlations between the type and extent of the lesion and the clinical picture of each individual case.
Zero Boil-Off Tank (ZBOT) Experiment
NASA Technical Reports Server (NTRS)
Mcquillen, John
2016-01-01
The Zero-Boil-Off Tank (ZBOT) experiment has been developed as a small-scale ISS experiment aimed at delineating important fluid flow, heat and mass transport, and phase change phenomena that affect cryogenic storage tank pressurization and pressure control in microgravity. The experiments use a transparent simulant low-boiling-point fluid (PnP) in a sealed transparent Dewar to study and quantify: (a) fluid flow and thermal stratification during pressurization; (b) mixing, thermal destratification, depressurization, and jet-ullage penetration during pressure control by jet mixing. The experiment will provide valuable microgravity empirical two-phase data associated with the above-mentioned physical phenomena through highly accurate local wall and fluid temperature and pressure measurements, full-field phase-distribution and flow visualization. Moreover, the experiments are performed under tightly controlled and definable heat transfer boundary conditions to provide reliable high-fidelity data and precise input as required for validation and verification of state-of-the-art two-phase CFD models developed as part of this research and by other groups in the international scientific and cryogenic fluid management communities.
Real-Time Hazard Detection and Avoidance Demonstration for a Planetary Lander
NASA Technical Reports Server (NTRS)
Epp, Chirold D.; Robertson, Edward A.; Carson, John M., III
2014-01-01
The Autonomous Landing Hazard Avoidance Technology (ALHAT) Project is chartered to develop and mature to a Technology Readiness Level (TRL) of six an autonomous system combining guidance, navigation and control with terrain sensing and recognition functions for crewed, cargo, and robotic planetary landing vehicles. In addition to precision landing close to a pre-mission defined landing location, the ALHAT System must be capable of autonomously identifying and avoiding surface hazards in real-time to enable a safe landing under any lighting conditions. This paper provides an overview of the recent results of the ALHAT closed loop hazard detection and avoidance flight demonstrations on the Morpheus Vertical Testbed (VTB) at the Kennedy Space Center, including results and lessons learned. This effort is also described in the context of a technology path in support of future crewed and robotic planetary exploration missions based upon the core sensing functions of the ALHAT system: Terrain Relative Navigation (TRN), Hazard Detection and Avoidance (HDA), and Hazard Relative Navigation (HRN).
Seemann, Ralf; Brinkmann, Martin; Pfohl, Thomas; Herminghaus, Stephan
2012-01-01
Droplet based microfluidics is a rapidly growing interdisciplinary field of research combining soft matter physics, biochemistry and microsystems engineering. Its applications range from fast analytical systems or the synthesis of advanced materials to protein crystallization and biological assays for living cells. Precise control of droplet volumes and reliable manipulation of individual droplets such as coalescence, mixing of their contents, and sorting in combination with fast analysis tools allow us to perform chemical reactions inside the droplets under defined conditions. In this paper, we will review available drop generation and manipulation techniques. The main focus of this review is not to be comprehensive and explain all techniques in great detail but to identify and shed light on similarities and underlying physical principles. Since geometry and wetting properties of the microfluidic channels are crucial factors for droplet generation, we also briefly describe typical device fabrication methods in droplet based microfluidics. Examples of applications and reaction schemes which rely on the discussed manipulation techniques are also presented, such as the fabrication of special materials and biophysical experiments.
Sahani, Dushyant; D'souza, Roy; Kadavigere, Rajagopal; Hertl, Martin; McGowan, Jennifer; Saini, Sanjay; Mueller, Peter R
2004-01-01
Liver transplantation from a living donor involves removal of part of the donor liver in a fashion that does not endanger its vascular supply or metabolic function. The radiologist plays an important role in evaluation of the living donor to define the conditions under which graft donation is contraindicated and to identify anatomic variations that may alter the surgical approach. In the past, diagnostic work-up of the donor involved costly and invasive tests. Currently, dynamic contrast material-enhanced computed tomography and magnetic resonance (MR) imaging are the imaging tests performed, each of which has advantages and limitations. MR imaging performed with liver-specific and extravascular contrast agents may be used as a single imaging test for comprehensive noninvasive evaluation of living liver transplant donors. MR imaging provides valuable information about variations in the vascular and biliary anatomy and allows evaluation of the hepatic parenchyma for diffuse or focal abnormalities. Copyright RSNA, 2004
A numerical procedure for transient free surface seepage through fracture networks
NASA Astrophysics Data System (ADS)
Jiang, Qinghui; Ye, Zuyang; Zhou, Chuangbing
2014-11-01
A parabolic variational inequality (PVI) formulation is presented for the transient free surface seepage problem defined for a whole fracture network. Because the seepage faces are specified as Signorini-type conditions, the PVI formulation can effectively eliminate the singularity of spillpoints that evolve with time. By introducing a continuous penalty function to replace the original Heaviside function, a finite element procedure based on the PVI formulation is developed to predict the transient free surface response in the fracture network. The effects of the penalty parameter on the solution precision are analyzed. A relative error formula for evaluating the flow losses at steady state caused by the penalty parameter is obtained. To validate the proposed method, three typical examples are solved. The solutions for the first example are compared with the experimental results. The results from the last two examples further demonstrate that the orientation, extent and density of fractures significantly affect the free surface seepage behavior in the fracture network.
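The substitution of a continuous penalty function for the Heaviside function can be illustrated with a simple ramp approximation. This is only a sketch of the general idea: the paper's actual penalty function may differ, and `eps` here stands for the illustrative penalty parameter whose precision trade-off the abstract analyzes.

```python
import numpy as np

def smoothed_heaviside(h, eps):
    """Continuous ramp approximation of the Heaviside function:
    0 for h <= 0, h/eps for 0 < h < eps, 1 for h >= eps.
    Smaller eps approaches the exact step behaviour at the free
    surface, at the cost of a stiffer (worse-conditioned) system."""
    return np.clip(np.asarray(h, dtype=float) / eps, 0.0, 1.0)
```

In a finite element assembly, such a function lets the permeability (or the residual term enforcing the Signorini condition) vary smoothly across the evolving free surface instead of jumping discontinuously, which is what removes the spillpoint singularity.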
Time to rethink the neural mechanisms of learning and memory.
Gallistel, Charles R; Balsam, Peter D
2014-02-01
Most studies in the neurobiology of learning assume that the underlying learning process is a pairing-dependent change in synaptic strength that requires repeated experience of events presented in close temporal contiguity. However, much learning is rapid and does not depend on temporal contiguity, which has never been precisely defined. These points are well illustrated by studies showing that the temporal relations between events are rapidly learned, even over long delays, and that this knowledge governs the form and timing of behavior. The speed with which anticipatory responses emerge in conditioning paradigms is determined by the information that cues provide about the timing of rewards. The challenge for understanding the neurobiology of learning is to understand the mechanisms in the nervous system that encode information from even a single experience, the nature of the memory mechanisms that can encode quantities such as time, and how the brain can flexibly perform computations based on this information. Copyright © 2013 Elsevier Inc. All rights reserved.
PHB Biosynthesis Counteracts Redox Stress in Herbaspirillum seropedicae.
Batista, Marcelo B; Teixeira, Cícero S; Sfeir, Michelle Z T; Alves, Luis P S; Valdameri, Glaucio; Pedrosa, Fabio de Oliveira; Sassaki, Guilherme L; Steffens, Maria B R; de Souza, Emanuel M; Dixon, Ray; Müller-Santos, Marcelo
2018-01-01
The ability of bacteria to produce polyhydroxyalkanoates such as poly(3-hydroxybutyrate) (PHB) provides a carbon storage molecule that can be mobilized under demanding physiological conditions. However, the precise function of PHB in cellular metabolism has not been clearly defined. In order to determine the impact of PHB production on global physiology, we have characterized the properties of a ΔphaC1 mutant strain of the diazotrophic bacterium Herbaspirillum seropedicae. The absence of PHB in the mutant strain not only perturbs redox balance and increases oxidative stress, but also influences the activity of the redox-sensing Fnr transcription regulators, resulting in significant changes in expression of the cytochrome c branch of the electron transport chain. The synthesis of PHB is itself dependent on the Fnr1 and Fnr3 proteins, resulting in a cyclic dependency that couples synthesis of PHB with redox regulation. Transcriptional profiling of the ΔphaC1 mutant reveals that the loss of PHB synthesis affects the expression of many genes, including approximately 30% of the Fnr regulon.
NASA Technical Reports Server (NTRS)
Sprowls, D. O.; Bucci, R. J.; Ponchel, B. M.; Brazill, R. L.; Bretz, P. E.
1984-01-01
A technique is demonstrated for accelerated stress corrosion testing of high-strength aluminum alloys. The method offers better precision and shorter exposure times than traditional pass/fail procedures. The approach uses data from tension tests performed on replicate groups of smooth specimens after various lengths of exposure to static stress. The breaking strength measures the degradation in the load-carrying ability of the test specimen due to environmental attack. Analysis of breaking load data by extreme value statistics enables the calculation of survival probabilities and a statistically defined threshold stress applicable to the specific test conditions. A fracture mechanics model is given which quantifies the depth of attack in the stress-corroded specimen by an effective flaw size calculated from the breaking stress and the material's strength and fracture toughness properties. Comparisons are made with experimental results from three tempers of 7075 alloy plate tested by the breaking load method and by traditional tests of statically loaded smooth tension bars and conventional precracked specimens.
[IVF and endometriosis, oocyte donation and fertility preservation].
d'Argent, Emmanuelle Mathieu; Antoine, Jean-Marie
2017-12-01
Endometriosis is a common condition, causing pain and infertility. In infertile women with superficial peritoneal endometriosis and patent tubes, laparoscopy is recommended, followed by ovarian stimulation alone or in combination with intrauterine insemination. In cases of ovarian or deep endometriosis, the indications for surgery and assisted reproductive technologies remain to be defined precisely. In vitro fertilization is generally proposed after the failure of up to three inseminations, directly for ovarian or deep endometriosis, or in case of an associated infertility factor, mainly male. Before ovarian stimulation for in vitro fertilization, pretreatment with a GnRH agonist for 2 to 6 months or a combined contraceptive for 6 to 8 weeks would improve the pregnancy rate. Egg donation is effective in patients with advanced ovarian failure or lack of ovarian response to stimulation. Fertility preservation, especially by oocyte vitrification, must be proposed preventively to women with endometriosis at risk of ovarian failure who have no immediate wish for pregnancy. Copyright © 2017 Elsevier Masson SAS. All rights reserved.
The Microscope Space Mission and the In-Orbit Calibration Plan for its Instrument
NASA Astrophysics Data System (ADS)
Levy, Agnès; Touboul, Pierre; Rodrigues, Manuel; Hardy, Émilie; Métris, Gilles; Robert, Alain
2015-01-01
The MICROSCOPE space mission aims at testing the Equivalence Principle (EP) with an accuracy of 10⁻¹⁵. This principle, a cornerstone of the theory of General Relativity, states the equivalence between gravitational and inertial mass. The test is based on the precise measurement of a gravitational signal by a differential electrostatic accelerometer comprising two cylindrical test masses made of different materials. The accelerometers constitute the payload accommodated on board a drag-free micro-satellite whose attitude is controlled to be either inertial or rotating about the normal to the orbital plane. The acceleration estimates used for the EP test are disturbed by the instrument's physical parameters and by the instrument environment conditions on board the satellite. These parameters are partially measured in ground tests or during integration of the instrument into the satellite (alignment). Nevertheless, the ground evaluations are not sufficient with respect to the EP test accuracy objectives. An in-orbit calibration is therefore needed to characterize them finely. The calibration process for each parameter has been defined.
[The use nicergoline in the treatment of diabetes mellitus].
Popova, I V; Karpenko, A A
Presented herein is a literature review aimed at investigating the appropriateness and possibility of using nicergoline (Sermion) for the treatment of patients suffering from diabetes mellitus. The analysis includes the most clinically significant results of scientific studies. The material reviewed was retrieved using the key words 'nicergoline', 'sermion', and 'diabetes mellitus' (with their respective Russian equivalents) in databases including Medline, PubMed, ScienceDirect, PMC and Cochrane, as well as the archives of both Russian and foreign journals and guidelines (clinical guidelines on rendering medical care to patients with diabetes mellitus, selected lectures on endocrinology). A broad spectrum of action and the absence of significant side effects have made it possible to use this drug in various pathological conditions. At the same time, because of the limited experience with nicergoline in vascular disease and an insufficient number of completed studies, the precise role of this therapeutic agent in clinical practice has not yet been conclusively defined. Special attention is given to the analysis of the efficacy of nicergoline in atherosclerosis and diabetes mellitus.
The Use of Scale-Dependent Precision to Increase Forecast Accuracy in Earth System Modelling
NASA Astrophysics Data System (ADS)
Thornes, Tobias; Duben, Peter; Palmer, Tim
2016-04-01
At the current pace of development, it may be decades before the 'exa-scale' computers needed to resolve individual convective clouds in weather and climate models become available to forecasters, and such machines will incur very high power demands. But resolution could be improved today by switching to more efficient, 'inexact' hardware with which variables can be represented in 'reduced precision'. Currently, all numbers in our models are represented as double-precision floating-point numbers - each requiring 64 bits of memory - to minimise rounding errors, regardless of spatial scale. Yet observational and modelling constraints mean that values of atmospheric variables are inevitably known less precisely on smaller scales, suggesting that this may be a waste of computer resources. More accurate forecasts might therefore be obtained by taking a scale-selective approach whereby the precision of variables is gradually decreased at smaller spatial scales to optimise the overall efficiency of the model. To study the effect of reducing precision to different levels on multiple spatial scales, we here introduce a new model atmosphere developed by extending the Lorenz '96 idealised system to encompass three tiers of variables - representing large-, medium- and small-scale features - for the first time. In this chaotic but computationally tractable system, the 'true' state can be defined by explicitly resolving all three tiers. The abilities of low-resolution (single-tier) double-precision models and similar-cost high-resolution (two-tier) mixed-precision models to produce accurate forecasts of this 'truth' are compared. The high-resolution models outperform the low-resolution ones even when small-scale variables are resolved in half precision (16 bits). This suggests that using scale-dependent levels of precision in more complicated real-world Earth System models could allow forecasts to be made at higher resolution and with improved accuracy.
If adopted, this new paradigm would represent a revolution in numerical modelling that could be of great benefit to the world.
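The scale-selective idea above can be illustrated with a minimal sketch, assuming the standard single-tier Lorenz '96 equations (the paper's three-tier extension and mixed-precision hardware emulation are far more elaborate): the state is rounded to 16-bit half precision before each step, and the resulting error stays small relative to the O(1) dynamics of the model.

```python
import numpy as np

def lorenz96_step(x, f=8.0, dt=0.01):
    """One Euler step of the single-tier Lorenz '96 system:
    dx_i/dt = (x_{i+1} - x_{i-2}) * x_{i-1} - x_i + F."""
    d = (np.roll(x, -1) - np.roll(x, 2)) * np.roll(x, 1) - x + f
    return x + dt * d

def step_reduced(x, dtype=np.float16):
    """Same step, but with the state first rounded to reduced precision,
    mimicking 16-bit ('half precision') storage of small-scale variables."""
    return lorenz96_step(x.astype(dtype).astype(np.float64))

x = np.full(8, 0.1)                     # toy state with values of order 1
err = np.max(np.abs(lorenz96_step(x) - step_reduced(x)))
# err is tiny compared with the O(1) amplitude of the dynamics
```

The point of the sketch is only that half-precision rounding of a well-scaled variable costs a relative error of roughly 1e-3, which the paper argues is below the uncertainty of small-scale atmospheric variables anyway.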
Bonaretti, Serena; Vilayphiou, Nicolas; Chan, Caroline Mai; Yu, Andrew; Nishiyama, Kyle; Liu, Danmei; Boutroy, Stephanie; Ghasem-Zadeh, Ali; Boyd, Steven K.; Chapurlat, Roland; McKay, Heather; Shane, Elizabeth; Bouxsein, Mary L.; Black, Dennis M.; Majumdar, Sharmila; Orwoll, Eric S.; Lang, Thomas F.; Khosla, Sundeep; Burghardt, Andrew J.
2017-01-01
Introduction HR-pQCT is increasingly used to assess bone quality, fracture risk and anti-fracture interventions. The contribution of the operator has not been adequately accounted for in measurement precision. Operators acquire a 2D projection (“scout view image”) and define the region to be scanned by positioning a “reference line” on a standard anatomical landmark. In this study, we (i) evaluated the contribution of positioning variability to in vivo measurement precision, (ii) measured intra- and inter-operator positioning variability, and (iii) tested whether custom training software led to superior reproducibility in new operators compared to experienced operators. Methods To evaluate the operator's contribution to in vivo measurement precision, we compared precision errors calculated in 64 co-registered and non-co-registered scan-rescan images. To quantify operator variability, we developed software that simulates the positioning process of the scanner’s software. Eight experienced operators positioned reference lines on scout view images designed to test intra- and inter-operator reproducibility. Finally, we developed modules for training and evaluation of reference line positioning. We enrolled 6 new operators to participate in a common training, followed by the same reproducibility experiments performed by the experienced group. Results In vivo precision errors were up to three-fold greater (Tt.BMD and Ct.Th) when variability in scan positioning was included. Inter-operator precision errors were significantly greater than short-term intra-operator precision (p<0.001). Newly trained operators achieved intra-operator reproducibility comparable to experienced operators, and lower inter-operator reproducibility (p<0.001). Precision errors were significantly greater for the radius than for the tibia. Conclusion Operator reference line positioning contributes significantly to in vivo measurement precision, and its contribution is significantly greater for multi-operator datasets.
Inter-operator variability can be significantly reduced using a systematic training platform, now available online (http://webapps.radiology.ucsf.edu/refline/). PMID:27475931
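Short-term precision errors of the kind reported above are conventionally summarised as a root-mean-square coefficient of variation over repeat scans. The sketch below shows that common formulation; the study's exact computation (e.g. its handling of co-registered versus non-co-registered images) may differ.

```python
import numpy as np

def rms_cv_percent(repeat_measurements):
    """Root-mean-square coefficient of variation across subjects,
    a common short-term precision-error summary (in %).
    Each element of repeat_measurements is one subject's repeat scans."""
    cvs = []
    for m in repeat_measurements:
        m = np.asarray(m, float)
        cvs.append(m.std(ddof=1) / m.mean())   # per-subject CV
    return 100.0 * np.sqrt(np.mean(np.square(cvs)))
```

With this summary, a positioning error that shifts the scanned region between visits inflates every subject's CV, which is exactly the three-fold effect the abstract reports for Tt.BMD and Ct.Th.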
Contemporary approaches to neural circuit manipulation and mapping: focus on reward and addiction
Saunders, Benjamin T.; Richard, Jocelyn M.; Janak, Patricia H.
2015-01-01
Tying complex psychological processes to precisely defined neural circuits is a major goal of systems and behavioural neuroscience. This is critical for understanding adaptive behaviour, and also how neural systems are altered in states of psychopathology, such as addiction. Efforts to relate psychological processes relevant to addiction to activity within defined neural circuits have been complicated by neural heterogeneity. Recent advances in technology allow for manipulation and mapping of genetically and anatomically defined neurons, which when used in concert with sophisticated behavioural models, have the potential to provide great insight into neural circuit bases of behaviour. Here we discuss contemporary approaches for understanding reward and addiction, with a focus on midbrain dopamine and cortico-striato-pallidal circuits. PMID:26240425
Addiction recovery: its definition and conceptual boundaries.
White, William L
2007-10-01
The addiction field's failure to achieve consensus on a definition of "recovery" from severe and persistent alcohol and other drug problems undermines clinical research, compromises clinical practice, and muddles the field's communications to service constituents, allied service professionals, the public, and policymakers. This essay discusses 10 questions critical to the achievement of such a definition and offers a working definition of recovery that attempts to meet the criteria of precision, inclusiveness, exclusiveness, measurability, acceptability, and simplicity. The key questions explore who has professional and cultural authority to define recovery, the defining ingredients of recovery, the boundaries (scope and depth) of recovery, and temporal benchmarks of recovery (when recovery begins and ends). The process of defining recovery touches on some of the most controversial issues within the addictions field.
Precision of FLEET Velocimetry Using High-speed CMOS Camera Systems
NASA Technical Reports Server (NTRS)
Peters, Christopher J.; Danehy, Paul M.; Bathel, Brett F.; Jiang, Naibo; Calvert, Nathan D.; Miles, Richard B.
2015-01-01
Femtosecond laser electronic excitation tagging (FLEET) is an optical measurement technique that permits quantitative velocimetry of unseeded air or nitrogen using a single laser and a single camera. In this paper, we seek to determine the fundamental precision of the FLEET technique using high-speed complementary metal-oxide semiconductor (CMOS) cameras. Also, we compare the performance of several different high-speed CMOS camera systems for acquiring FLEET velocimetry data in air and nitrogen free-jet flows. The precision was defined as the standard deviation of a set of several hundred single-shot velocity measurements. Methods of enhancing the precision of the measurement were explored, such as row-wise digital binning of the signal in adjacent pixels (similar in concept to on-sensor binning, but done in post-processing) and increasing the time delay between successive exposures. These techniques generally improved precision; however, binning provided the greatest improvement to the un-intensified camera systems, which had low signal-to-noise ratios. When binning row-wise by 8 pixels (about the thickness of the tagged region) and using an inter-frame delay of 65 μs, precisions of 0.5 m/s in air and 0.2 m/s in nitrogen were achieved. The camera comparison included a pco.dimax HD, a LaVision Imager scientific CMOS (sCMOS) and a Photron FASTCAM SA-X2, along with a two-stage LaVision High Speed IRO intensifier. Excluding the LaVision Imager sCMOS, the cameras were tested with and without intensification and with both short and long inter-frame delays. Use of intensification and a longer inter-frame delay generally improved precision. Overall, the Photron FASTCAM SA-X2 exhibited the best performance in terms of greatest precision and highest signal-to-noise ratio, primarily because it had the largest pixels.
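As the abstract defines it, precision is simply the standard deviation of many single-shot velocity measurements, and digital binning is a post-processing sum over groups of adjacent rows. A minimal sketch of both (the variable names and the pixel-to-metre calibration are illustrative, not taken from the paper):

```python
import numpy as np

def velocity_precision(displacements_px, dt_s, px_to_m):
    """Precision as defined in the paper: the standard deviation of a
    set of single-shot velocity measurements (displacement / delay)."""
    v = np.asarray(displacements_px, float) * px_to_m / dt_s
    return v.std(ddof=1)

def bin_rows(image, n=8):
    """Digital (post-processing) binning: sum the signal in groups of n
    adjacent rows to raise SNR, analogous to on-sensor binning."""
    h = (image.shape[0] // n) * n            # drop any incomplete group
    return image[:h].reshape(-1, n, image.shape[1]).sum(axis=1)
```

Binning by about the thickness of the tagged line (8 rows here) trades spatial resolution along the rows for signal, which is why it helped the noisier un-intensified cameras most.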
Development of an aerial counting system in oil palm plantations
NASA Astrophysics Data System (ADS)
Zulyma Miserque Castillo, Jhany; Laverde Diaz, Rubbermaid; Rueda Guzmán, Claudia Leonor
2016-07-01
This paper proposes the development of an aerial counting system capable of capturing, processing and analyzing images of an oil palm plantation to register the number of cultivated palms. It begins with a study of the available UAV technologies to define the most appropriate model according to the project needs. As a result, a DJI Phantom 2 Vision+ is used to capture pictures that are processed by photogrammetry software to create orthomosaics of the areas of interest, which are handled by the developed software to calculate the number of palms contained in them. The implemented algorithm uses a sliding window technique on image pyramids to generate candidate windows, an LBP descriptor to model the texture of the picture, a logistic regression model to classify the windows and a non-maximum suppression algorithm to refine the decision. The system was tested on different images than the ones used for training and for establishing the set point. The system showed a 95.34% detection rate with 97.83% precision for mature palms and a 79.26% detection rate with 97.53% precision for young palms, giving an F1 score of 0.97 for mature palms and 0.87 for the small ones. The results are satisfactory, yielding the census and high-quality images from which it is possible to obtain more information about the area of interest. All this is achieved through a low-cost system capable of working even in cloudy conditions.
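Two of the pipeline's generic building blocks, the image pyramid with sliding windows and greedy non-maximum suppression, can be sketched as follows (a simplified illustration; the paper's LBP descriptor and logistic-regression classifier would plug in between these two stages):

```python
import numpy as np

def image_pyramid(img, scale=1.5, min_size=64):
    """Yield successively downscaled copies of img (nearest-neighbour
    subsampling for brevity) until it falls below min_size."""
    while min(img.shape[:2]) >= min_size:
        yield img
        h, w = int(img.shape[0] / scale), int(img.shape[1] / scale)
        rows = (np.arange(h) * scale).astype(int)
        cols = (np.arange(w) * scale).astype(int)
        img = img[rows][:, cols]

def sliding_windows(img, win=64, step=16):
    """Yield (x, y, patch) candidate windows over one pyramid level."""
    for y in range(0, img.shape[0] - win + 1, step):
        for x in range(0, img.shape[1] - win + 1, step):
            yield x, y, img[y:y + win, x:x + win]

def non_max_suppression(boxes, scores, iou_thr=0.3):
    """Greedy NMS: keep the highest-scoring boxes, drop boxes that
    overlap a kept box by more than iou_thr. Boxes are [x1, y1, x2, y2]."""
    boxes = np.asarray(boxes, float)
    order = np.argsort(scores)[::-1]
    keep = []
    while order.size:
        i = order[0]
        keep.append(i)
        xx1 = np.maximum(boxes[i, 0], boxes[order[1:], 0])
        yy1 = np.maximum(boxes[i, 1], boxes[order[1:], 1])
        xx2 = np.minimum(boxes[i, 2], boxes[order[1:], 2])
        yy2 = np.minimum(boxes[i, 3], boxes[order[1:], 3])
        inter = np.maximum(0, xx2 - xx1) * np.maximum(0, yy2 - yy1)
        area_i = (boxes[i, 2] - boxes[i, 0]) * (boxes[i, 3] - boxes[i, 1])
        areas = (boxes[order[1:], 2] - boxes[order[1:], 0]) * \
                (boxes[order[1:], 3] - boxes[order[1:], 1])
        iou = inter / (area_i + areas - inter)
        order = order[1:][iou <= iou_thr]
    return keep
```

In a detector of this kind, each window surviving classification becomes a scored box, and NMS collapses the cluster of overlapping detections around each palm crown into a single count.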
NASA Astrophysics Data System (ADS)
He, Guang'an; Chen, Rui; Lu, Shushen; Jiang, Chengchun; Liu, Hong; Wang, Chuan
2015-11-01
The predicted significant increase in the manufacture and use of engineered nanoparticles (ENPs) will cause their inevitable release into the environment, and the potential harmful effects of ENPs have been confirmed. As a representative ENP, the sedimentation behavior of nano-titanium dioxide (n-TiO2) should be better understood in order to control its environmental risk. In this study, an experimental methodology was established to set the sampling area and sampling time for n-TiO2 sedimentation. In addition, we defined a quasi-stable state and a precise index, i.e., the sedimentation efficiency (SE) at this state, to describe n-TiO2 sedimentation behavior. Both alternative concentration determination and conventional size measurement were applied to evaluate the sedimentation behavior of n-TiO2 with fulvic acid. Results showed that the sedimentation behavior described by SE was more precise than, and in disagreement with, that predicted by particle size. Moreover, sedimentation experiments with salicylic acid (SA), under an electric field and at different water temperatures, or with sulfosalicylic acid under light irradiation, were also performed. When the total organic carbon concentration of SA, the voltage of the working electrodes, or the water temperature increased, or the wavelength of the light source decreased, the SE of n-TiO2 increased and n-TiO2 showed a tendency to settle in water. These findings may be important for deepening the understanding of the environmental behavior of n-TiO2 and for exploring the sedimentation behavior of other ENPs.
Quantifying biodegradable organic matter in polluted water on the basis of coulombic yield.
Liu, Yuan; Tuo, Ai-Xue; Jin, Xiao-Jun; Li, Xiang-Zhong; Liu, Hong
2018-01-01
Biodegradable organic matter (BOM) in polluted water plays a key role in various biological purification technologies. The five-day biochemical oxygen demand (BOD5) index is often used to determine the amount of BOM. However, standard BOD5 assays, centering on dissolved oxygen detection, have long testing times and often show severe deviation (error ≥ 15%). In the present study, the coulombic yield (Q) of a bio-electrochemical degradation process was determined, and a new index for BOM quantification was proposed. The Q value represents the quantity of electrons transferred from BOM to oxygen, and the corresponding index was defined as BOMQ. By revealing the Q-BOM stoichiometric relationship, we were able to perform a BOMQ assay on a microbial fuel cell-based technical platform. Experimental results verified that 5-500 mg L⁻¹ of BOMQ in artificial wastewater samples could be directly obtained without calibration in several to dozens of hours, with less than 5% error. Moreover, the BOMQ assay remained accurate and precise over a wide range of optimized operational conditions. A ratio of approximately 1.0 between the values of BOMQ and BOD5 for artificial and real wastewater samples was observed. The rapidity, accuracy, and precision of the measurement results are supported by a solid theoretical foundation. Thus, BOMQ is a promising water quality index for quantifying BOM in polluted water. Copyright © 2017 Elsevier B.V. All rights reserved.
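The coulombic-yield idea admits a back-of-envelope sketch: integrate the cell current over time to get Q, then convert the transferred electrons to an oxygen-equivalent concentration via Faraday's constant (1 mol of O2 accepts 4 mol of electrons, i.e. 8 g of O2 per mol of electrons). This is an illustrative conversion assuming complete electron capture; the paper's calibrated Q-BOM stoichiometry may differ.

```python
import numpy as np

F = 96485.0  # Faraday constant, C per mol of electrons

def coulombic_yield(current_a, dt_s):
    """Q = sum(I * dt): total charge passed, in coulombs
    (rectangle rule over uniformly sampled current)."""
    return float(np.sum(current_a) * dt_s)

def bom_q_mg_per_l(q_coulombs, sample_volume_l):
    """Convert charge to an oxygen-equivalent BOM concentration (mg/L):
    mol e- = Q / F, and each mol e- corresponds to 8 g of O2."""
    return (q_coulombs / F) * 8.0 * 1000.0 / sample_volume_l
```

For example, 1 mA sustained over 1000 s delivers Q = 1 C, which for a 50 mL sample corresponds to an oxygen demand of roughly 1.66 mg/L under these assumptions.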
Yadav, Nand K; Raghuvanshi, Ashish; Sharma, Gajanand; Beg, Sarwar; Katare, Om P; Nanda, Sanju
2016-03-01
The current studies entail systematic quality by design (QbD)-based development of a simple, precise, cost-effective and stability-indicating high-performance liquid chromatography method for the estimation of ketoprofen. The analytical target profile was defined and critical analytical attributes (CAAs) were selected. Chromatographic separation was accomplished with isocratic, reversed-phase chromatography using a C-18 column, pH 6.8 phosphate buffer-methanol (50 : 50 v/v) as the mobile phase at a flow rate of 1.0 mL/min and UV detection at 258 nm. Systematic optimization of the chromatographic method was performed using a central composite design, evaluating theoretical plates and peak tailing as the CAAs. The method was validated as per International Conference on Harmonization guidelines, demonstrating high sensitivity and specificity, with linearity ranging between 0.05 and 250 µg/mL, a detection limit of 0.025 µg/mL and a quantification limit of 0.05 µg/mL. Precision was demonstrated by a relative standard deviation of 1.21%. Stress degradation studies performed using acid, base, peroxide, thermal and photolytic methods helped in identifying the degradation products in the proniosome delivery systems. The results successfully demonstrated the utility of QbD for optimizing the chromatographic conditions and developing a highly sensitive liquid chromatographic method for ketoprofen. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
Experimental Study on the Precise Orbit Determination of the BeiDou Navigation Satellite System
He, Lina; Ge, Maorong; Wang, Jiexian; Wickert, Jens; Schuh, Harald
2013-01-01
The regional service of the Chinese BeiDou satellite navigation system is now in operation with a constellation including five Geostationary Earth Orbit satellites (GEO), five Inclined Geosynchronous Orbit (IGSO) satellites and four Medium Earth Orbit (MEO) satellites. Besides the standard positioning service with positioning accuracy of about 10 m, both precise relative positioning and precise point positioning are already demonstrated. As is well known, precise orbit and clock determination is essential in enhancing precise positioning services. To improve the satellite orbits of the BeiDou regional system, we concentrate on the impact of the tracking geometry and the involvement of MEOs, and on the effect of integer ambiguity resolution as well. About seven weeks of data collected at the BeiDou Experimental Test Service (BETS) network is employed in this experimental study. Several tracking scenarios are defined, various processing schemata are designed and carried out; and then, the estimates are compared and analyzed in detail. The results show that GEO orbits, especially the along-track component, can be significantly improved by extending the tracking network in China along longitude direction, whereas IGSOs gain more improvement if the tracking network extends in latitude. The involvement of MEOs and ambiguity-fixing also make the orbits better. PMID:23529116
NASA Astrophysics Data System (ADS)
Yang, Ying; Liu, Xiaobao; Wang, Jieci; Jing, Jiliang
2018-03-01
We study how reflecting boundaries can improve the precision of quantum phase estimation for a uniformly accelerated atom in a fluctuating electromagnetic field. We find that, without a boundary, the precision decreases as the acceleration increases. With a reflecting boundary present, the precision depends on the atomic polarization, position and acceleration, and can be effectively enhanced compared to the boundary-free case if appropriate conditions are chosen. In particular, with two parallel reflecting boundaries, we obtain the optimal precision for parallel atomic polarization and a special distance between the two boundaries, as if the atom were shielded from the fluctuations.
Minutes of TOPEX/POSEIDON Science Working Team Meeting and Ocean Tides Workshop
NASA Technical Reports Server (NTRS)
Fu, Lee-Lueng (Editor)
1995-01-01
This third TOPEX/POSEIDON Science Working Team meeting was held on December 4, 1994 to review progress in defining ocean tide models, precision Earth orbits, and various science algorithms. A related workshop on ocean tides convened to select the best models to be used by scientists in the Geophysical Data Records.
ERIC Educational Resources Information Center
Pickel, Andreas
2012-01-01
The social sciences rely on assumptions of a unified self for their explanatory logics. Recent work in the new multidisciplinary field of social neuroscience challenges precisely this unproblematic character of the subjective self as basic, well-defined entity. If disciplinary self-insulation is deemed unacceptable, the philosophical challenge…
Being Precise about Lexical Vagueness. York Papers in Linguistics., No. 6.
ERIC Educational Resources Information Center
Leech, Geoffrey N.
This paper accepts Labov's (1973) criticisms of the categorial approach, i.e., the view that linguistic units are categories which are discrete, invariant, qualitatively distinct, conjunctively defined, and composed of atomic primes, and follows Labov in attempting to develop a non-categorial (or fuzzy-categorial) approach to lexical semantics,…
Early-Years Teachers' Concept Images and Concept Definitions: Triangles, Circles, and Cylinders
ERIC Educational Resources Information Center
Tsamir, Pessia; Tirosh, Dina; Levenson, Esther; Barkai, Ruthi; Tabach, Michal
2015-01-01
This study investigates practicing early-years teachers' concept images and concept definitions for triangles, circles, and cylinders. Teachers were requested to define each figure and then to identify various examples and non-examples of the figure. Teachers' use of correct and precise mathematical language and reference to critical and…
Negotiating Social Membership in the Contemporary World
ERIC Educational Resources Information Center
Hagan, Jacqueline
2006-01-01
One of the defining characteristics of the late 20th and early 21st centuries is the increasing importance of international migration, an epoch Castles and Miller term the "age of migration." The precise size of the international migrant population is unknown. Much of this movement--such as unauthorized and other irregular flows--is not…
40 CFR 53.58 - Operational field precision and blank test.
Code of Federal Regulations, 2011 CFR
2011-07-01
... samplers are also subject to a test for possible deposition of particulate matter on inactive filters... deposition is defined as the mass of material inadvertently deposited on a sample filter that is stored in a... electrical power to accommodate three test samplers are required. (2) Teflon sample filters, as specified in...
Arithmetic Abilities in Children with Developmental Dyslexia: Performance on French ZAREKI-R Test
ERIC Educational Resources Information Center
De Clercq-Quaegebeur, Maryse; Casalis, Séverine; Vilette, Bruno; Lemaitre, Marie-Pierre; Vallée, Louis
2018-01-01
A high comorbidity between reading and arithmetic disabilities has already been reported. The present study aims at identifying more precisely patterns of arithmetic performance in children with developmental dyslexia, defined with severe and specific criteria. By means of a standardized test of achievement in mathematics ("Calculation and…
Quality in Early Education Classrooms: Definitions, Gaps, and Systems
ERIC Educational Resources Information Center
Pianta, Robert; Downer, Jason; Hamre, Bridget
2016-01-01
Parents, professionals, and policymakers agree that quality is crucial for early education. But precise, consistent, and valid definitions of quality have been elusive. In this article, Robert Pianta, Jason Downer, and Bridget Hamre tackle the questions of how to define quality, how to measure it, and how to ensure that more children experience…
ERIC Educational Resources Information Center
Lu, Owen H. T.; Huang, Anna Y. Q.; Huang, Jeff C. H.; Lin, Albert J. Q.; Ogata, Hiroaki; Yang, Stephen J. H.
2018-01-01
Blended learning combines online digital resources with traditional classroom activities and enables students to attain higher learning performance through well-defined interactive strategies involving online and traditional learning activities. Learning analytics is a conceptual framework and is a part of our Precision education used to analyze…
ERIC Educational Resources Information Center
Kearns, Jacqueline Farmer; Towles-Reeves, Elizabeth; Kleinert, Harold L.; Kleinert, Jane O'Regan; Thomas, Megan Kleine-Kracht
2011-01-01
Little research has precisely defined the population of students participating in alternate assessments based on alternate academic achievement standards (AA-AAAS). Therefore, the purpose of this article is twofold: (a) explicate the findings of a multistate study examining the characteristics of the population of students participating in…
A Combined Photochemical and Multicomponent Reaction Approach to Precision Oligomers.
Konrad, Waldemar; Bloesser, Fabian R; Wetzel, Katharina S; Boukis, Andreas C; Meier, Michael A R; Barner-Kowollik, Christopher
2018-03-07
We introduce the convergent synthesis of linear monodisperse sequence-defined oligomers through a unique approach, combining the Passerini three-component reaction (P-3CR) and a Diels-Alder (DA) reaction based on photocaged dienes. A set of oligomers is prepared resting on a Passerini linker unit carrying an isocyano group for chain extension by P-3CR and a maleimide moiety for photoenol conjugation, enabling a modular approach for chain growth. Monodisperse oligomers are accessible in a stepwise fashion by switching between both reaction types. Employing sebacic acid as a core unit allows the synthesis of a library of symmetric sequence-defined oligomers. The oligomers consist of alternating P-3CR and photoblocks with molecular weights up to 3532.16 g mol⁻¹, demonstrating the successful switching from P-3CR to photoenol conjugation. In-depth characterization was carried out, including size-exclusion chromatography (SEC), high-resolution electrospray ionization mass spectrometry (ESI-MS) and NMR spectroscopy, evidencing the monodisperse nature of the precision oligomers. © 2018 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.
Flat-Lens Focusing of Electron Beams in Graphene
Tang, Yang; Cao, Xiyuan; Guo, Ran; Zhang, Yanyan; Che, Zhiyuan; Yannick, Fouodji T.; Zhang, Weiping; Du, Junjie
2016-01-01
Coupling electron beams carrying information into electronic units is fundamental in microelectronics. This requires precision manipulation of electron beams through a coupler with a good focusing ability. In graphene, the focusing of wide electron beams has been successfully demonstrated by a circular p-n junction. However, it is not favorable for information coupling since the focal length is so small that the focal spot lies inside the circular gated region, rather than in the background region. Here, we demonstrate that an array of gate-defined quantum dots, which has gradually changing lattice spacing in the direction transverse to propagation, can focus electrons outside itself, providing a possibility to make a coupler in graphene. The focusing effect can be understood as due to the gradient change of effective refractive indices, which are defined by the local energy band in a periodic potential. Strong focusing can be achieved by suitably choosing the lattice gradient and the layer number in the incident direction, offering an effective solution to the precision manipulation of electron beams with a wide electron energy range and high angular tolerance. PMID:27628099
Kinesthetic information facilitates saccades towards proprioceptive-tactile targets.
Voudouris, Dimitris; Goettker, Alexander; Mueller, Stefanie; Fiehler, Katja
2016-05-01
Saccades to somatosensory targets have longer latencies and are less accurate and precise than saccades to visual targets. Here we examined how different somatosensory information influences the planning and control of saccadic eye movements. Participants fixated a central cross and initiated a saccade as fast as possible in response to a tactile stimulus that was presented to either the index or the middle fingertip of their unseen left hand. In a static condition, the hand remained at a target location for the entire block of trials and the stimulus was presented at a fixed time after an auditory tone. Therefore, the target location was derived only from proprioceptive and tactile information. In a moving condition, the hand was first actively moved to the same target location and the stimulus was then presented immediately. Thus, in the moving condition additional kinesthetic information about the target location was available. We found shorter saccade latencies in the moving compared to the static condition, but no differences in accuracy or precision of saccadic endpoints. In a second experiment, we introduced variable delays after the auditory tone (static condition) or after the end of the hand movement (moving condition) in order to reduce the predictability of the moment of the stimulation and to allow more time to process the kinesthetic information. Again, we found shorter latencies in the moving compared to the static condition but no improvement in saccade accuracy or precision. In a third experiment, we showed that the shorter saccade latencies in the moving condition cannot be explained by the temporal proximity between the relevant event (auditory tone or end of hand movement) and the moment of the stimulation. Our findings suggest that kinesthetic information facilitates planning, but not control, of saccadic eye movements to proprioceptive-tactile targets. Copyright © 2016 Elsevier Ltd. All rights reserved.
Feasibility of precise navigation in high and low latitude regions under scintillation conditions
NASA Astrophysics Data System (ADS)
Juan, José Miguel; Sanz, Jaume; González-Casado, Guillermo; Rovira-Garcia, Adrià; Camps, Adriano; Riba, Jaume; Barbosa, José; Blanch, Estefania; Altadill, David; Orus, Raul
2018-02-01
Scintillation is one of the most challenging problems in Global Navigation Satellite Systems (GNSS) navigation. This phenomenon appears when the radio signal passes through ionospheric irregularities. These irregularities represent rapid changes in the refractive index and, depending on their size, they can also produce diffractive effects that affect the signal amplitude and can eventually produce cycle slips. In this work, we show that the scintillation effects on the GNSS signal are quite different at low and high latitudes. For low-latitude receivers, the main effects, from the point of view of precise navigation, are the increase of the carrier phase noise (measured by σϕ) and the fading of the signal intensity (measured by S4), which can produce cycle slips in the GNSS signal. With several examples, we show that the detection of these cycle slips is the most challenging problem for precise navigation; if these cycle slips are detected, precise navigation can be achieved in these regions under scintillation conditions. For high-latitude receivers the situation differs. In this region the size of the irregularities is typically larger than the Fresnel length, so the main effects are related to the fast change in the refractive index associated with the fast movement of the irregularities (which can reach velocities of up to several km/s). Consequently, the main effect on the GNSS signals is a fast fluctuation of the carrier phase (large σϕ), but with only a moderate fade in the amplitude (moderate S4). Therefore, as shown through several examples, fluctuations at high latitude usually do not produce cycle slips, and their effect on the ionosphere-free combination is quite limited; in general, precise navigation can also be achieved during strong scintillation conditions.
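The two indices mentioned, S4 for amplitude scintillation and σϕ for phase scintillation, have standard definitions that can be sketched directly (receiver-specific detrending and windowing details are omitted here):

```python
import numpy as np

def s4_index(intensity):
    """Amplitude scintillation index: normalised standard deviation of
    signal intensity, S4 = sqrt((<I^2> - <I>^2) / <I>^2)."""
    i = np.asarray(intensity, float)
    return np.sqrt((np.mean(i**2) - np.mean(i)**2) / np.mean(i)**2)

def sigma_phi(phase_rad):
    """Phase scintillation index: standard deviation of the (detrended)
    carrier phase, in radians."""
    p = np.asarray(phase_rad, float)
    return np.std(p - p.mean())

s4_index([0.5, 1.5])   # intensity swinging +/-50% about its mean gives S4 = 0.5
```

In the abstract's terms, low-latitude events show large S4 with deep intensity fades, while high-latitude events show large σϕ with only moderate S4.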
Hollingworth, Andrew; Hwang, Seongmin
2013-10-19
We examined the conditions under which a feature value in visual working memory (VWM) recruits visual attention to matching stimuli. Previous work has suggested that VWM supports two qualitatively different states of representation: an active state that interacts with perceptual selection and a passive (or accessory) state that does not. An alternative hypothesis is that VWM supports a single form of representation, with the precision of feature memory controlling whether or not the representation interacts with perceptual selection. The results of three experiments supported the dual-state hypothesis. We established conditions under which participants retained a relatively precise representation of a particular colour. If the colour was immediately task relevant, it reliably recruited attention to matching stimuli. However, if the colour was not immediately task relevant, it failed to interact with perceptual selection. Feature maintenance in VWM is not necessarily equivalent to feature-based attentional selection.
High-precision buffer circuit for suppression of regenerative oscillation
NASA Technical Reports Server (NTRS)
Tripp, John S.; Hare, David A.; Tcheng, Ping
1995-01-01
Precision analog signal conditioning electronics have been developed for wind tunnel model attitude inertial sensors. This application requires low-noise, stable, microvolt-level DC performance and a high-precision buffered output. Capacitive loading of the operational amplifier output stages due to the wind tunnel analog signal distribution facilities caused regenerative oscillation and consequent rectification bias errors. Oscillation suppression techniques commonly used in audio applications were inadequate to maintain the performance requirements for the measurement of attitude for wind tunnel models. Feedback control theory is applied to develop a suppression technique based on a known compensation (snubber) circuit, which provides superior oscillation suppression with high output isolation and preserves the low-noise low-offset performance of the signal conditioning electronics. A practical design technique is developed to select the parameters for the compensation circuit to suppress regenerative oscillation occurring when typical shielded cable loads are driven.
Absolute marine gravimetry with matter-wave interferometry.
Bidel, Y; Zahzam, N; Blanchard, C; Bonnin, A; Cadoret, M; Bresson, A; Rouxel, D; Lequentrec-Lalancette, M F
2018-02-12
Measuring gravity from an aircraft or a ship is essential in geodesy, geophysics, mineral and hydrocarbon exploration, and navigation. Today, only relative sensors are available for onboard gravimetry. This is a major drawback because of the calibration and drift estimation procedures, which lead to important operational constraints. Atom interferometry is a promising technology for obtaining an onboard absolute gravimeter. But, despite the high performance obtained in static conditions, no precise measurements had been reported in dynamic conditions. Here, we present absolute gravity measurements from a ship with a sensor based on atom interferometry. Despite rough sea conditions, we obtained precision below 10⁻⁵ m s⁻². The atom gravimeter was also compared with a commercial spring gravimeter and showed better performance. This demonstration opens the way to the next generation of inertial sensors (accelerometer, gyroscope) based on atom interferometry, which should provide high-precision absolute measurements from a moving platform.
Achieving the Heisenberg limit in quantum metrology using quantum error correction.
Zhou, Sisi; Zhang, Mengzhen; Preskill, John; Jiang, Liang
2018-01-08
Quantum metrology has many important applications in science and technology, ranging from frequency spectroscopy to gravitational wave detection. Quantum mechanics imposes a fundamental limit on measurement precision, called the Heisenberg limit, which can be achieved for noiseless quantum systems, but is not achievable in general for systems subject to noise. Here we study how measurement precision can be enhanced through quantum error correction, a general method for protecting a quantum system from the damaging effects of noise. We find a necessary and sufficient condition for achieving the Heisenberg limit using quantum probes subject to Markovian noise, assuming that noiseless ancilla systems are available, and that fast, accurate quantum processing can be performed. When the sufficient condition is satisfied, a quantum error-correcting code can be constructed that suppresses the noise without obscuring the signal; the optimal code, achieving the best possible precision, can be found by solving a semidefinite program.
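The two precision limits the abstract contrasts can be stated concretely: with N probes, uncorrelated strategies reach the standard quantum limit, where the phase uncertainty scales as 1/sqrt(N), while the Heisenberg limit scales as 1/N. A minimal sketch of the scaling:

```python
import math

def precision_scaling(n_probes: int):
    """Idealized phase-estimation precision with N probes.

    Standard quantum limit (uncorrelated probes): delta_phi ~ 1/sqrt(N).
    Heisenberg limit (optimally entangled probes; per the abstract,
    recoverable under Markovian noise when the error-correction
    condition holds): delta_phi ~ 1/N.
    """
    return 1.0 / math.sqrt(n_probes), 1.0 / n_probes

for n in (1, 100, 10_000):
    sql, hl = precision_scaling(n)
    print(f"N={n:>6}: SQL ~ {sql:.4f}, Heisenberg ~ {hl:.6f}")
```

The quadratic gap between the two scalings is what makes the error-correction condition in the paper valuable: it determines when the 1/N scaling survives noise.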
NASA Astrophysics Data System (ADS)
Tugores, M. Pilar; Iglesias, Magdalena; Oñate, Dolores; Miquel, Joan
2016-02-01
In the Mediterranean Sea, the European anchovy (Engraulis encrasicolus) plays a key role in both ecological and economic terms. Ensuring stock sustainability requires crucial information, such as the species' spatial distribution and unbiased abundance and precision estimates, so that management strategies can be defined (e.g. fishing quotas, temporal closure areas or marine protected areas, MPAs). Furthermore, estimating the precision of global abundance at different sampling intensities can be used to optimise survey design. Geostatistics provides a priori unbiased estimates of the spatial structure, global abundance and precision for autocorrelated data. However, its application to non-Gaussian data complicates the analysis and can reduce robustness and unbiasedness. The present study applied intrinsic geostatistics in two dimensions in order to (i) analyse the spatial distribution of anchovy in Spanish Western Mediterranean waters during the species' recruitment season, (ii) produce distribution maps, (iii) estimate global abundance and its precision, (iv) analyse the effect of changing the sampling intensity on the precision of global abundance estimates and (v) evaluate the effects of several methodological options on the robustness of all the analysed parameters. The results suggested that while the spatial structure was usually non-robust to the tested methodological options when working with the original dataset, it became more robust for the transformed datasets (especially the log-backtransformed dataset). The global abundance was always highly robust, and the global precision was highly or moderately robust to most of the methodological options, except for data transformation.
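The geostatistical workflow described rests on estimating the spatial structure from an empirical variogram. A minimal sketch of the classical (Matheron) estimator, using generic coordinates and values rather than the study's actual acoustic data:

```python
import numpy as np

def empirical_semivariogram(coords, values, lag_edges):
    """Classical (Matheron) estimator:
        gamma(h) = 1 / (2 N(h)) * sum over pairs at lag h of (z_i - z_j)^2
    coords: (n, 2) sample positions; values: (n,) observations (e.g. a
    log-backtransformed abundance); lag_edges: distance bin edges.
    """
    coords = np.asarray(coords, dtype=float)
    values = np.asarray(values, dtype=float)
    dists = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
    sqdiff = (values[:, None] - values[None, :]) ** 2
    iu = np.triu_indices(len(values), k=1)   # count each pair once
    dists, sqdiff = dists[iu], sqdiff[iu]
    gamma = []
    for lo, hi in zip(lag_edges[:-1], lag_edges[1:]):
        in_bin = (dists >= lo) & (dists < hi)
        gamma.append(0.5 * sqdiff[in_bin].mean() if in_bin.any() else np.nan)
    return np.array(gamma)

# Four samples along a transect with a perfect linear trend: the
# semivariance grows with the lag, i.e. nearby samples are more alike.
g = empirical_semivariogram([[0, 0], [1, 0], [2, 0], [3, 0]],
                            [0, 1, 2, 3], [0.5, 1.5, 2.5, 3.5])
print(g)  # gamma = 0.5, 2.0, 4.5
```

Fitting a model to this curve is what yields the spatial structure, kriged maps and the global estimation variance (precision) that the abstract refers to.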
We evaluated the number of sites that would yield relatively precise estimates of physical, chemical, and biological condition for six raftable rivers 100-200 km long and 20-120 m wide. We used a probability design to select 20 sites on each of two rivers in Washington and four ...
A concept analysis of forensic risk.
Kettles, A M
2004-08-01
Forensic risk is a term used in relation to many forms of clinical practice, such as assessment, intervention and management. Rarely is the term defined in the literature and as a concept it is multifaceted. Concept analysis is a method for exploring and evaluating the meaning of words. It gives precise definitions, both theoretical and operational, for use in theory, clinical practice and research. A concept analysis provides a logical basis for defining terms through providing defining attributes, case examples (model, contrary, borderline, related), antecedents and consequences and the implications for nursing. Concept analysis helps us to refine and define a concept that derives from practice, research or theory. This paper will use the strategy of concept analysis to find a working definition for the concept of forensic risk. In conclusion, the historical background and literature are reviewed using concept analysis to bring the term into focus and to define it more clearly. Forensic risk is found to derive both from forensic practice and from risk theory. A proposed definition of forensic risk is given.
Jiang, Biao; Jia, Yan; He, Congfen
2018-05-11
Traditional skincare involves the subjective classification of skin into 4 categories (oily, dry, mixed, and neutral) prior to skin treatment. Following the development of noninvasive methods in skin and skin imaging technology, scientists have developed efficacy-based skincare products based on the physiological characteristics of skin under different conditions. Currently, the emergence of skinomics and systems biology has facilitated the development of precision skincare. In this article, the evolution of skincare based on the physiological states of the skin (from traditional skincare and efficacy-based skincare to precision skincare) is described. In doing so, we highlight skinomics and systems biology, with particular emphasis on the importance of skin lipidomics and microbiomes in precision skincare. The emerging trends of precision skincare are anticipated. © 2018 Wiley Periodicals, Inc.
NASA Technical Reports Server (NTRS)
Burns, III, William Wesley (Inventor); Wilson, Thomas George (Inventor)
1978-01-01
This invention provides a method and apparatus for determining a precise switching sequence for the power switching elements of electric power delivery systems of the on-off switching type and which enables extremely fast transient response, precise regulation and highly stable operation. The control utilizes the values of the power delivery system power handling network components, a desired output characteristic, a system timing parameter, and the externally imposed operating conditions to determine where steady state operations should be in order to yield desired output characteristics for the given system specifications. The actual state of the power delivery system is continuously monitored and compared to a state-space boundary which is derived from the desired equilibrium condition, and from the information obtained from this comparison, the system is moved to the desired equilibrium condition in one cycle of switching control. Since the controller continuously monitors the power delivery system's externally imposed operating conditions, a change in the conditions is immediately sensed and a new equilibrium condition is determined and achieved, again in a single cycle of switching control.
de Maat, Gijs Eduard; Vigano, Giorgio; Mariani, Massimo Alessandro; Natour, Ehsan
2017-05-19
The ascending aorta is an uncommon site for non-infective thrombus. In non-aneurysmal and non-atherosclerotic vessels this condition is extremely rare, yet it represents a source of potential cerebral and peripheral embolic events. Currently, there is no consensus in the guidelines on how to treat a free-floating thrombus in the ascending aorta; we therefore present our decision-making process and therapeutic strategy. A healthy 48-year-old man was admitted to hospital with acute abdominal pain. A CT scan showed a right renal embolism in the presence of a defect in the distal ascending aorta suggestive of thrombus. After heart team discussion, the patient was scheduled for surgery and successfully underwent emergent thrombus removal. Owing to multiple aortic wall insertions, the ascending aorta was also replaced. The patient's recovery was uneventful, and histological examination showed no signs of connective tissue disorders of the aortic wall while confirming the thrombotic nature of the mass. We present a patient with a floating thrombus in the ascending aorta who underwent ascending aorta replacement. While the angio-CT scan led to a prompt diagnosis, intraoperative epi-aortic echocardiography allowed the precise location of the thrombus to be defined, minimizing operative risk. This case demonstrates that multidisciplinary heart team discussion is essential to defining a successful strategy, and that surgical treatment is feasible with specific tools such as epi-aortic echocardiography.
Nailfold Capillaroscopy Within and Beyond the Scope of Connective Tissue Diseases.
Lambova, Sevdalina Nikolova; Muller-Ladner, Ulf
2018-04-20
Nailfold capillaroscopy is a noninvasive instrumental method for morphological analysis of the nutritive capillaries in the nailfold area. In rheumatology, it is the method of choice among instrumental modalities for the differential diagnosis between primary and secondary Raynaud's phenomenon (RP) in systemic rheumatic diseases. RP is a common diagnostic problem in rheumatology, and defining the proper diagnosis is a prerequisite for administration of the appropriate treatment. Thus, nailfold capillaroscopic examination is of crucial importance for the everyday practice of rheumatologists and is currently gaining increasing attention. The most specific capillaroscopic changes are observed in Systemic Sclerosis (SSc). Owing to the high prevalence of capillaroscopic changes in this clinical entity and their early appearance, they can be used for early and very early diagnosis of the disease. More recently, "scleroderma"-type capillaroscopic changes have been defined as a diagnostic criterion in the new EULAR/ACR classification criteria for SSc, together with the presence of scleroderma-related autoantibodies, RP, telangiectasia and other clinical signs. Capillaroscopic changes in other connective tissue diseases and in rheumatic-like conditions, such as those in diabetes mellitus (e.g., diabetic stiff-hand syndrome) and paraneoplastic syndromes associated with microvascular pathology, should be interpreted properly in order to reach a precise diagnosis in the shortest possible differential diagnostic process. Copyright © Bentham Science Publishers.
Toward an Application Guide for Safety Integrity Level Allocation in Railway Systems.
Ouedraogo, Kiswendsida Abel; Beugin, Julie; El-Koursi, El-Miloudi; Clarhaut, Joffrey; Renaux, Dominique; Lisiecki, Frederic
2018-02-02
The article presents the development of an application guide based on feedback and comments from various railway actors on their practices of SIL allocation to railway safety-related functions. The initial generic methodology for SIL allocation has been updated for application to railway rolling stock safety-related functions in order to resolve issues with applying the SIL concept. The various actors dealing with railway SIL allocation problems are the intended audience of the methodology; its principles are summarized in this article, with a focus on the modifications and clarifications made in order to establish a practical guide for railway safety authorities. The methodology is based on the flowchart formalism used in the CSM (common safety method) European regulation. It starts with the use of quantitative safety requirements, particularly tolerable hazard rates (THR), to which apportioning rules are applied. On the one hand, the rules concern classical logical combinations of safety-related functions preventing hazard occurrence. On the other hand, to take into account technical conditions (last safety weak link, functional dependencies, technological complexity, etc.), specific rules implicitly used in existing practices are defined for readjusting some THR values. The SIL allocation process based on apportioned and validated THR values is finally illustrated through the example of "emergency brake" subsystems. Some specific SIL allocation rules are also defined and illustrated. © 2018 Society for Risk Analysis.
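For orientation, the THR-to-SIL correspondence such methodologies build on can be sketched using the generic IEC 61508 / EN 50129 bands (an assumption for illustration; the article's guide adds its own readjustment rules on top of this mapping):

```python
def sil_from_thr(thr_per_hour: float) -> int:
    """Map a tolerable hazard rate (per hour) to a SIL band, following the
    generic IEC 61508 / EN 50129 convention for continuous-mode functions:
        1e-9 <= THR < 1e-8  -> SIL 4
        1e-8 <= THR < 1e-7  -> SIL 3
        1e-7 <= THR < 1e-6  -> SIL 2
        1e-6 <= THR < 1e-5  -> SIL 1
    """
    bands = [(1e-8, 4), (1e-7, 3), (1e-6, 2), (1e-5, 1)]
    for upper_bound, sil in bands:
        if thr_per_hour < upper_bound:
            return sil
    return 0  # THR too lax to demand any SIL

# Apportioning example (OR combination): if failure of either of two
# independent functions causes the hazard, their failure rates add, so
# each function receives half of the hazard's THR.
thr_hazard = 1e-8                      # per hour
thr_each = thr_hazard / 2.0
print(sil_from_thr(thr_each))          # the tighter apportioned THR lands in SIL 4
```

The readjustment rules described in the abstract (last safety weak link, functional dependencies, technological complexity) would then shift some of the apportioned THR values before this final mapping is applied.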
Li, Xiaohong; Blount, Patricia L; Vaughan, Thomas L; Reid, Brian J
2011-02-01
Aside from primary prevention, early detection remains the most effective way to decrease mortality associated with the majority of solid cancers. Previous cancer screening models are largely based on classification of at-risk populations into three conceptually defined groups (normal, cancer without symptoms, and cancer with symptoms). Unfortunately, this approach has achieved limited successes in reducing cancer mortality. With advances in molecular biology and genomic technologies, many candidate somatic genetic and epigenetic "biomarkers" have been identified as potential predictors of cancer risk. However, none have yet been validated as robust predictors of progression to cancer or shown to reduce cancer mortality. In this Perspective, we first define the necessary and sufficient conditions for precise prediction of future cancer development and early cancer detection within a simple physical model framework. We then evaluate cancer risk prediction and early detection from a dynamic clonal evolution point of view, examining the implications of dynamic clonal evolution of biomarkers and the application of clonal evolution for cancer risk management in clinical practice. Finally, we propose a framework to guide future collaborative research between mathematical modelers and biomarker researchers to design studies to investigate and model dynamic clonal evolution. This approach will allow optimization of available resources for cancer control and intervention timing based on molecular biomarkers in predicting cancer among various risk subsets that dynamically evolve over time.
NASA Astrophysics Data System (ADS)
Wood, Michael J.; Aristizabal, Felipe; Coady, Matthew; Nielson, Kent; Ragogna, Paul J.; Kietzig, Anne-Marie
2018-02-01
The production of millimetric liquid droplets has importance in a wide range of applications both in the laboratory and industrially. As such, much effort has been put forth to devise methods to generate these droplets on command in a manner which results in high diameter accuracy and precision, well-defined trajectories followed by successive droplets and low oscillations in droplet shape throughout their descents. None of the currently employed methods of millimetric droplet generation described in the literature adequately addresses all of these desired droplet characteristics. The reported methods invariably involve the cohesive separation of the desired volume of liquid from the bulk supply in the same step that separates the single droplet from the solid generator. We have devised a droplet generation device which separates the desired volume of liquid within a tee-apparatus in a step prior to the generation of the droplet which has yielded both high accuracy and precision of the diameters of the final droplets produced. Further, we have engineered a generating tip with extreme antiwetting properties which has resulted in reduced adhesion forces between the liquid droplet and the solid tip. This has yielded the ability to produce droplets of low mass without necessitating different diameter generating tips or the addition of surfactants to the liquid, well-defined droplet trajectories, and low oscillations in droplet volume. The trajectories and oscillations of the droplets produced have been assessed and presented quantitatively in a manner that has been lacking in the current literature.
Bosman, Michel; Zhang, Lei; Duan, Huigao; Tan, Shu Fen; Nijhuis, Christian A.; Qiu, Cheng-Wei; Yang, Joel K. W.
2014-01-01
Lithography provides the precision to pattern large arrays of metallic nanostructures with varying geometries, enabling systematic studies and discoveries of new phenomena in plasmonics. However, surface plasmon resonances experience more damping in lithographically-defined structures than in chemically-synthesized nanoparticles of comparable geometries. Grain boundaries, surface roughness, substrate effects, and adhesion layers have been reported as causes of plasmon damping, but it is difficult to isolate these effects. Using monochromated electron energy-loss spectroscopy (EELS) and numerical analysis, we demonstrate an experimental technique that allows the study of these effects individually, to significantly reduce the plasmon damping in lithographically-defined structures. We introduce a method of encapsulated annealing that preserves the shape of polycrystalline gold nanostructures, while their grain-boundary density is reduced. We demonstrate enhanced Q-factors in lithographically-defined nanostructures, with intrinsic damping that matches the theoretical Drude damping limit. PMID:24986023
2016-01-01
Information is a precise concept that can be defined mathematically, but its relationship to what we call ‘knowledge’ is not always made clear. Furthermore, the concepts ‘entropy’ and ‘information’, while deeply related, are distinct and must be used with care, something that is not always achieved in the literature. In this elementary introduction, the concepts of entropy and information are laid out one by one, explained intuitively, but defined rigorously. I argue that a proper understanding of information in terms of prediction is key to a number of disciplines beyond engineering, such as physics and biology. PMID:26857663
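The rigorous definition the abstract refers to is Shannon's: entropy measures the average unpredictability of an outcome, in bits. A minimal illustration:

```python
import math

def shannon_entropy(probabilities):
    """Shannon entropy H(X) = -sum_x p(x) log2 p(x), in bits: the average
    number of yes/no questions needed to pin down the outcome."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# A fair coin is maximally unpredictable (1 bit); a biased coin carries
# less entropy because its outcome is partly predictable in advance.
print(shannon_entropy([0.5, 0.5]))  # 1.0
print(shannon_entropy([0.9, 0.1]))  # ~0.469
```

Information, in the prediction-centered view the abstract argues for, is then the reduction in this uncertainty that an observation provides about another variable.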
NASA Astrophysics Data System (ADS)
Sondag, Andrea; Dittus, Hansjörg
2016-08-01
The Weak Equivalence Principle (WEP) is at the basis of General Relativity, the best theory of gravitation today. It has been and still is tested with different methods and accuracies. In this paper, an overview of tests of the Weak Equivalence Principle done in the past, developed in the present and planned for the future is given. The best result up to now is derived from the data of torsion balance experiments by Schlamminger et al. (2008). An intuitive test of the WEP consists of comparing the accelerations of two free-falling test masses of different composition. This has been carried out by Kuroda & Mio (1989, 1990), with the most precise result to date for this setup. There is still more potential in this method, especially with a longer free-fall time and sensors with a higher resolution. Providing a free-fall time of 4.74 s (9.3 s using the catapult), the drop tower of the Center of Applied Space Technology and Microgravity (ZARM) at the University of Bremen is a perfect facility for further improvements. In 2001, a free-fall experiment with highly sensitive SQUID (Superconducting QUantum Interference Device) sensors tested the WEP with an accuracy of 10⁻⁷ (Nietzsche, 2001). Under optimal conditions, one could reach an accuracy of 10⁻¹³ with this setup (Vodel et al., 2001). A description of this experiment and its results is given in the next part of this paper. For the free fall of macroscopic test masses, it is important to start with precisely defined starting conditions concerning the positions and velocities of the test masses. An Electrostatic Positioning System (EPS) has been developed for this purpose. It is described in the last part of this paper.
Post-orogenic subsidence and uplift of the Carpathian belt: An integrated approach
NASA Astrophysics Data System (ADS)
Bertotti, G.; Matenco, L.; Drijkonigen, G.; Krijgsman, W.; Tarapoanca, M.; Panea, I.; Vasiliev, I.; Milea, M.; Cloetingh, S.
2003-04-01
Pliocene to Quaternary sequences, several hundred metres thick, outcropping along the Carpathian front dip steeply away from the mountain belt towards the Carpathian foredeep. They overlie the Carpathian fold-and-thrust belt and document that, following the main contractional stages, the orogenic wedge first subsided and was then uplifted. Uplift occurred coeval with substantial subsidence in the adjacent basin to the east, the Focsani Depression. To define the precise kinematics of these movements, and thereby constrain vertical movements taking place in the "wrong" place at the "wrong" time, the Netherlands Research Center for Integrated Solid Earth Science has launched a large campaign of geological and geophysical investigation. The main components of the project are as follows: 1) acquisition of nearly 100 km of seismic data designed to image the uppermost hundred metres of the Earth's crust, thereby making a precise connection between features visible in industry lines and at the surface; 2) paleomagnetic investigations to constrain the age of the poorly dated continental to lacustrine sediments; 3) a seismic experiment designed to detect 3-D effects on 2-D acquisition; 4) structural work to determine the stress/strain conditions during subsidence and subsequent uplift. At a larger scale, these activities are embedded in the effort made by ISES and connected groups to precisely constrain the kinematics of the Pannonian-Carpathian system. Seismic acquisition was performed during the summer of 2002 and was technically very successful, thanks also to the effort of the prospecting company Prospectiunii SA. Lines have been processed and are currently being interpreted. The most apparent feature is the lack of localized deformation, demonstrating that subsidence and tilting affected areas of several tens of kilometres and are not related to single faults.
Sampling for paleomagnetic studies has been carried out in 2002 along the same section where seismic acquisition took place. Preliminary measurements show good analytical results and will therefore produce relevant results in the coming months.
Tang, Tao; Stevenson, R Jan; Infante, Dana M
2016-10-15
Regional variation in both natural environment and human disturbance can influence performance of ecological assessments. In this study we calculated 5 types of benthic diatom multimetric indices (MMIs) with 3 different approaches to account for variation in ecological assessments. We used: site groups defined by ecoregions or diatom typologies; the same or different sets of metrics among site groups; and unmodeled or modeled MMIs, where models accounted for natural variation in metrics within site groups by calculating an expected reference condition for each metric and each site. We used data from the USEPA's National Rivers and Streams Assessment to calculate the MMIs and evaluate changes in MMI performance. MMI performance was evaluated with indices of precision, bias, responsiveness, sensitivity and relevancy which were respectively measured as MMI variation among reference sites, effects of natural variables on MMIs, difference between MMIs at reference and highly disturbed sites, percent of highly disturbed sites properly classified, and relation of MMIs to human disturbance and stressors. All 5 types of MMIs showed considerable discrimination ability. Using different metrics among ecoregions sometimes reduced precision, but it consistently increased responsiveness, sensitivity, and relevancy. Site specific metric modeling reduced bias and increased responsiveness. Combined use of different metrics among site groups and site specific modeling significantly improved MMI performance irrespective of site grouping approach. Compared to ecoregion site classification, grouping sites based on diatom typologies improved precision, but did not improve overall performance of MMIs if we accounted for natural variation in metrics with site specific models. We conclude that using different metrics among ecoregions and site specific metric modeling improve MMI performance, particularly when used together. 
Applications of these MMI approaches in ecological assessments introduced a tradeoff with assessment consistency when metrics differed across site groups, but they justified the convenient and consistent use of ecoregions. Copyright © 2016 Elsevier B.V. All rights reserved.
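Simple versions of three of the five performance indices described (precision, responsiveness, sensitivity) can be sketched; the formulas below are illustrative assumptions, not necessarily the study's exact definitions:

```python
import statistics

def mmi_performance(ref_scores, disturbed_scores, impairment_threshold):
    """Illustrative index-performance measures.

    precision      : spread of the MMI among least-disturbed reference sites
    responsiveness : standardized separation of reference vs disturbed means
    sensitivity    : fraction of disturbed sites correctly classified as
                     impaired (score below the threshold)
    """
    precision = statistics.stdev(ref_scores)
    responsiveness = (statistics.mean(ref_scores)
                      - statistics.mean(disturbed_scores)) / precision
    sensitivity = (sum(s < impairment_threshold for s in disturbed_scores)
                   / len(disturbed_scores))
    return precision, responsiveness, sensitivity

# Hypothetical scores on a 0-100 MMI scale.
prec, resp, sens = mmi_performance([82, 78, 85, 80, 79],
                                   [55, 48, 62, 71, 50],
                                   impairment_threshold=70)
print(prec, resp, sens)
```

Bias and relevancy, the remaining two measures, additionally require regressing the MMI on natural covariates and on disturbance/stressor gradients, which is where the site-specific metric models described in the abstract enter.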
How absolute is zero? An evaluation of historical and current definitions of malaria elimination.
Cohen, Justin M; Moonen, Bruno; Snow, Robert W; Smith, David L
2010-07-22
Decisions to eliminate malaria from all or part of a country involve a complex set of factors, and this complexity is compounded by ambiguity surrounding some of the key terminology, most notably "control" and "elimination." It is impossible to forecast resource and operational requirements accurately if endpoints have not been defined clearly, yet even during the Global Malaria Eradication Program, debate raged over the precise definition of "eradication." Analogous deliberations regarding the meaning of "elimination" and "control" are basically nonexistent today despite these terms' core importance to programme planning. To advance the contemporary debate about these issues, this paper presents a historical review of commonly used terms, including control, elimination, and eradication, to help contextualize current understanding of these concepts. The review has been supported by analysis of the underlying mathematical concepts on which these definitions are based through simple branching process models that describe the proliferation of malaria cases following importation. Through this analysis, the importance of pragmatic definitions that are useful for providing malaria control and elimination programmes with a practical set of strategic milestones is emphasized, and it is argued that current conceptions of elimination in particular fail to achieve these requirements. To provide all countries with precise targets, new conceptual definitions are suggested to more precisely describe the old goals of "control" - here more exactly named "controlled low-endemic malaria" - and "elimination." Additionally, it is argued that a third state, called "controlled non-endemic malaria," is required to describe the epidemiological condition in which endemic transmission has been interrupted, but malaria resulting from onwards transmission from imported infections continues to occur at a sufficiently high level that elimination has not been achieved. 
Finally, guidelines are discussed for deriving the separate operational definitions and metrics that will be required to make these concepts relevant, measurable, and achievable for a particular environment.
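The branching-process reasoning behind "controlled non-endemic malaria" can be illustrated with a toy simulation (a generic sketch, not the authors' specific model): each imported case seeds a transmission chain whose secondary cases are Poisson-distributed with mean R_eff; when R_eff < 1, every chain dies out, with expected total size 1/(1 - R_eff).

```python
import math
import random

def poisson(lam: float, rng: random.Random) -> int:
    """Knuth's Poisson sampler (adequate for small means)."""
    threshold, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= rng.random()
        if p <= threshold:
            return k
        k += 1

def chain_size(r_eff: float, rng: random.Random, cap: int = 100_000) -> int:
    """Total cases in the transmission chain seeded by one imported case:
    a branching process with Poisson(R_eff) secondary cases per case."""
    total = active = 1
    while active and total < cap:
        offspring = sum(poisson(r_eff, rng) for _ in range(active))
        total += offspring
        active = offspring
    return total

# Subcritical transmission (R_eff < 1): chains are finite, with expected
# size 1/(1 - R_eff); malaria then persists only through repeated importation.
rng = random.Random(42)
mean_size = sum(chain_size(0.5, rng) for _ in range(2000)) / 2000
print(mean_size)  # close to the theoretical value 2.0
```

This is precisely the epidemiological state the abstract distinguishes from elimination: onward transmission occurs, but each imported infection generates only a finite, self-limiting chain.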
Indulski, J A
1997-01-01
The World Bank in its document entitled 'Investing in Health' (1993) states that the health status of the population, including the working population, and working conditions in individual countries depend essentially on the value of gross national product per capita. Attitudes towards the role and objectives of occupational medicine have changed significantly over the last three decades. The high priority given to primary prevention reflects the mainstream of a new approach to preventive measures. Advancements in technology, production and services, the common use of computers and the flattening of work organisation structures have brought about the need for workers' active participation in planning activities and shaping working conditions in their own enterprise. At the same time, workers are required to possess much higher qualifications facilitating their participation in applying new technologies and using new information systems, which has resulted in fierce competition on the labour market. In countries in political, social and economic transition, the conditions for introducing a new system of sustained development, described by Gustavsen at the 25th International Congress on Occupational Health, have not yet been established. A procedure-based system involving negotiations between employers and workers' representatives failed to improve working conditions because the roles of the state, employers and trade unions had not been precisely defined. It is expected that further health promotion at worksites in these countries will depend mainly on economic progress and the reformed system of education.
GPR Mapping of a Buried Paleolithic Landscape Near Kathu, South Africa
NASA Astrophysics Data System (ADS)
Papadimitrios, K. S.; Balduino-Sollaci, C.; Vaughn, S.; Edwards, S.; Bank, C. G.; Chazan, M.
2014-12-01
The Bestwood site, located near Kathu, South Africa, is a 30 ha sand-filled valley where tens of thousands of artifacts (hand axes and stone tools) dating from 700 kya to 1 Mya mark a previously inhabited Stone Age site. By running close to 10 km of GPR lines and two 20 m × 20 m grids with a 200 MHz antenna, we were able to sample the subsurface of the area effectively and thus recreate the paleolandscape. Raw radargrams were viewed and processed with the MATLAB and GPR-SLICE programs. Both the raw and processed radargrams showed a distinct boundary as the GPR was moved from gravel hill conditions to sandy valley conditions. This boundary was interpreted as a subsurface interface between sand on top and gravel layers below. A sinuous, meandering depression within that boundary, which progressed down the valley, was interpreted as a paleochannel incised into the local gravels. Within the 20 m × 20 m grids, two 1 m × 1 m test pits were dug on the assumed banks of the paleochannel. The pits served both to ground-truth the inferred transition from sand to gravel and to help define a precise velocity of radar propagation for the sand layer. Our results aid archaeologists in understanding the interactions of early humans with the local landscape.
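The velocity calibration from the test pits follows from simple two-way travel-time geometry: a reflector ground-truthed at depth d and appearing at two-way time t gives v = 2d/t. A small sketch with hypothetical numbers (the actual pit depths and travel times are not given in the abstract):

```python
def radar_velocity(depth_m: float, twt_ns: float) -> float:
    """Radar velocity from a ground-truthed reflector: the pulse travels
    down to the interface and back, so v = 2 * depth / two-way time."""
    return 2.0 * depth_m / (twt_ns * 1e-9)  # m/s

# Hypothetical numbers: a sand/gravel contact dug at 1.2 m depth,
# appearing at 16 ns two-way travel time on the radargram.
v = radar_velocity(1.2, 16.0)
print(f"{v / 1e8:.2f} x 10^8 m/s")  # i.e. 0.15 m/ns, plausible for dry sand
```

With the sand velocity pinned down this way, travel times across the whole survey convert directly to depths, which is what allows the buried paleolandscape to be mapped.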