Overcoming gaps and bottlenecks to advance precision agriculture
USDA-ARS's Scientific Manuscript database
Maintaining a clear understanding of the technology gaps, knowledge needs, and training bottlenecks is required for improving adoption of precision agriculture. As an industry, precision agriculture embraces tools, methods, and practices that are constantly changing, requiring industry, education, a...
Station-Keeping Requirements for Astronomical Imaging with Constellations of Free-Flying Collectors
NASA Technical Reports Server (NTRS)
Allen, Ronald J.
2004-01-01
The requirements on station-keeping for constellations of free-flying collectors coupled as (future) imaging arrays in space for astrophysics applications are discussed. The typical knowledge precision required in the plane of the array depends on the angular size of the targets of interest; it is generally at a level of tens of centimeters for typical stellar targets, becoming of order centimeters only for the widest attainable fields of view. In the "piston" direction, perpendicular to the array, the typical knowledge precision required depends on the bandwidth of the signal, and is at a level of tens of wavelengths for narrow approx. 1% signal bands, becoming of order one wavelength only for the broadest bandwidths expected to be useful. The significance of this result is that, at this level of precision, it may be possible to provide the necessary knowledge of array geometry without the use of signal photons, thereby allowing observations of faint targets. "Closure-phase" imaging is a technique which has been very successfully applied to surmount instabilities owing to equipment and to the atmosphere, and which appears to be directly applicable to space imaging arrays where station-keeping drifts play the same role as (slow) atmospheric and equipment instabilities.
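As a sketch of the closure-phase idea invoked above (standard interferometry notation, not taken from the report): each collector i contributes a piston phase error ψ_i to every baseline it forms, and summing the observed baseline phases around a closed triangle cancels these collector-based terms:

\[ \phi_{ij}^{\rm obs} = \phi_{ij}^{\rm true} + (\psi_i - \psi_j), \qquad \Phi_{ijk} \equiv \phi_{ij}^{\rm obs} + \phi_{jk}^{\rm obs} + \phi_{ki}^{\rm obs} = \phi_{ij}^{\rm true} + \phi_{jk}^{\rm true} + \phi_{ki}^{\rm true}. \]

Slow station-keeping drifts thus drop out of the closure phase exactly as slow atmospheric and equipment phase errors do in ground-based arrays.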
Practice innovation: the need for nimble data platforms to implement precision oncology care.
Elfiky, Aymen; Zhang, Dongyang; Krishnan Nair, Hari K
2015-01-01
Given the drive toward personalized, value-based, and coordinated cancer care delivery, modern knowledge-based practice is being shaped within the context of an increasingly technology-driven healthcare landscape. The ultimate promise of 'precision medicine' is predicated on taking advantage of the range of new capabilities for integrating disease- and individual-specific data to define new taxonomies as part of a systems-based knowledge network. Specifically, with cancer being a constantly evolving complex disease process, proper care of an individual will require the ability to seamlessly integrate multi-dimensional 'omic' and clinical data. Importantly, however, the challenges of curating knowledge from multiple dynamic data sources and translating to practice at the point-of-care highlight parallel needs. As patients, caregivers, and their environments become more proactive in clinical care and management, practical success of precision medicine is equally dependent on the development of proper infrastructures for evolving data integration, platforms for knowledge representation in a clinically-relevant context, and implementation within a provider's work-life and workflow.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sharp, J.K.
1997-11-01
This seminar describes a process and methodology that uses structured natural language to enable the construction of precise information requirements directly from users, experts, and managers. The main focus of this natural language approach is to create the precise information requirements and to do it in such a way that the business and technical experts are fully accountable for the results. These requirements can then be implemented using appropriate tools and technology. This requirement set is also a universal learning tool because it has all of the knowledge that is needed to understand a particular process (e.g., expense vouchers, project management, budget reviews, tax, laws, machine function).
EOS-AM precision pointing verification
NASA Technical Reports Server (NTRS)
Throckmorton, A.; Braknis, E.; Bolek, J.
1993-01-01
The Earth Observing System (EOS) AM mission requires tight pointing knowledge to meet scientific objectives, in a spacecraft with low frequency flexible appendage modes. As the spacecraft controller reacts to various disturbance sources and as the inherent appendage modes are excited by this control action, verification of precision pointing knowledge becomes particularly challenging for the EOS-AM mission. As presently conceived, this verification includes a complementary set of multi-disciplinary analyses, hardware tests and real-time computer-in-the-loop simulations, followed by collection and analysis of hardware test and flight data, and supported by a comprehensive database repository for validated program values.
The Nab Spectrometer, Precision Field Mapping, and Associated Systematic Effects
NASA Astrophysics Data System (ADS)
Fry, Jason; Nab Collaboration
2017-09-01
The Nab experiment will make precision measurements of a, the e-ν correlation parameter, and b, the Fierz interference term, in neutron beta decay, aiming to deliver an independent determination of the ratio λ = G_A/G_V to sensitively test CKM unitarity. Nab utilizes a novel, long asymmetric spectrometer to measure the proton TOF and electron energy. We extract a from the slope of the measured TOF distribution for different electron energies. A reliable relation of the measured proton TOF to a requires detailed knowledge of the effective proton path length, which in turn imposes further requirements on the precision of the magnetic fields in the Nab spectrometer. The Nab spectrometer, magnetometry, and associated systematics will be discussed.
Effective environmental policy decisions benefit from stream habitat information that is accurate, precise, and relevant. The recent National Wadeable Streams Assessment (NWSA) carried out by the U.S. EPA required physical habitat information sufficiently comprehensive to facilit...
[Precision medicine : a required approach for the general internist].
Waeber, Gérard; Cornuz, Jacques; Gaspoz, Jean-Michel; Guessous, Idris; Mooser, Vincent; Perrier, Arnaud; Simonet, Martine Louis
2017-01-18
The general internist cannot be a passive bystander of the anticipated medical revolution induced by precision medicine, which aims to improve the prediction and/or the clinical course of disease in an individual by integrating all biological, genetic, environmental, phenotypic and psychosocial knowledge of that person. In this article, national and international initiatives in the field of precision medicine are discussed, as well as the financial and ethical limitations of personalized medicine. The question is not whether precision medicine will be part of everyday life, but rather how to integrate the general internist early into multidisciplinary teams to ensure optimal information and a shared decision-making process with patients and individuals.
Canceling the Gravity Gradient Phase Shift in Atom Interferometry.
D'Amico, G; Rosi, G; Zhan, S; Cacciapuoti, L; Fattori, M; Tino, G M
2017-12-22
Gravity gradients represent a major obstacle in high-precision measurements by atom interferometry. Controlling their effects to the required stability and accuracy imposes very stringent requirements on the relative positioning of freely falling atomic clouds, as in the case of precise tests of Einstein's equivalence principle. We demonstrate a new method to exactly compensate the effects introduced by gravity gradients in a Raman-pulse atom interferometer. By shifting the frequency of the Raman lasers during the central π pulse, it is possible to cancel the initial position- and velocity-dependent phase shift produced by gravity gradients. We apply this technique to simultaneous interferometers positioned along the vertical direction and demonstrate a new method for measuring local gravity gradients that does not require precise knowledge of the relative position between the atomic clouds. Based on this method, we also propose an improved scheme to determine the Newtonian gravitational constant G towards the 10 ppm relative uncertainty.
Canceling the Gravity Gradient Phase Shift in Atom Interferometry
NASA Astrophysics Data System (ADS)
D'Amico, G.; Rosi, G.; Zhan, S.; Cacciapuoti, L.; Fattori, M.; Tino, G. M.
2017-12-01
Gravity gradients represent a major obstacle in high-precision measurements by atom interferometry. Controlling their effects to the required stability and accuracy imposes very stringent requirements on the relative positioning of freely falling atomic clouds, as in the case of precise tests of Einstein's equivalence principle. We demonstrate a new method to exactly compensate the effects introduced by gravity gradients in a Raman-pulse atom interferometer. By shifting the frequency of the Raman lasers during the central π pulse, it is possible to cancel the initial position- and velocity-dependent phase shift produced by gravity gradients. We apply this technique to simultaneous interferometers positioned along the vertical direction and demonstrate a new method for measuring local gravity gradients that does not require precise knowledge of the relative position between the atomic clouds. Based on this method, we also propose an improved scheme to determine the Newtonian gravitational constant G towards the 10 ppm relative uncertainty.
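A compact sketch of the compensation described above, in conventional Raman-interferometer notation (symbols are generic, not taken from the paper): to leading order, a vertical gravity gradient Γ_zz adds a phase that depends on the initial position z_0 and velocity v_0 of the cloud,

\[ \Delta\phi_{\rm GG} \simeq k_{\rm eff}\,\Gamma_{zz}\,T^2\,(z_0 + v_0 T), \]

and shifting the effective wave vector of the central π pulse by

\[ \Delta k \simeq \tfrac{1}{2}\,k_{\rm eff}\,\Gamma_{zz}\,T^2 \]

(i.e., a frequency jump Δν = cΔk/2π of the Raman lasers) cancels this position- and velocity-dependent term, which is why the relative position of the two clouds no longer needs to be known precisely.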
The Estimation of Precisions in the Planning of Uas Photogrammetric Surveys
NASA Astrophysics Data System (ADS)
Passoni, D.; Federici, B.; Ferrando, I.; Gagliolo, S.; Sguerso, D.
2018-05-01
The Unmanned Aerial System (UAS) is widely used in photogrammetric surveys both of structures and of small areas. Geomatics focuses attention on the metric quality of the final products of the survey, creating several 3D modelling applications from UAS images. As is widely known, the quality of the results derives from the quality of the image acquisition phase, which needs an a priori estimation of the expected precisions. The planning phase is typically managed using dedicated tools, adapted from the traditional aerial-photogrammetric flight plan, but a UAS flight has features completely different from a traditional one. Hence, the use of UAS for photogrammetric applications today requires deeper knowledge in planning. The basic idea of this research is to provide a drone photogrammetric flight planning tool that considers the required metric precisions, given a priori the classical parameters of photogrammetric planning: flight altitude, overlaps and the geometric parameters of the camera. The created "office suite" allows a realistic planning of a photogrammetric survey, starting from an approximate knowledge of the Digital Surface Model (DSM) and the effective attitude parameters, which change along the route. The planning products are the overlap of the images, the Ground Sample Distance (GSD) and the precision on each pixel, taking into account the real geometry. The different tested procedures, the obtained results and the solution proposed for the a priori estimation of the precisions in the particular case of UAS surveys are reported here.
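For reference, the planning quantities named above follow from standard photogrammetric relations (conventional symbols, not the paper's notation): with pixel pitch p, focal length f, and flying height H above the surface,

\[ {\rm GSD} = \frac{p\,H}{f}, \qquad \sigma_{XY} \approx \sigma_{px}\,{\rm GSD}, \qquad \sigma_Z \approx \frac{H}{B}\,\sigma_{px}\,{\rm GSD}, \]

where σ_px is the image measurement precision in pixels and B is the stereo base. For example, p = 4.8 μm, f = 8.8 mm and H = 100 m give GSD ≈ 5.5 cm. This is why the approximate DSM and the attitude variations along the route matter: they change the effective H, and hence the GSD and precision, pixel by pixel.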
Strategies for implementation of an effective pharmacogenomics program in pharmacy education.
Rao, U Subrahmanyeswara; Mayhew, Susan L; Rao, Prema S
2015-07-01
Sequencing of the human genome and the evidence correlating specific genetic variations to diseases have opened up the potential of genomics for more effective and less harmful interventions in human diseases. A wealth of pharmacogenomics knowledge is in place for the practice of precision medicine. However, this knowledge is not fully realized in clinical practice. One reason for this impasse is the lack of in-depth understanding of the potential of pharmacogenomics among healthcare professionals. Pharmacists are point-of-care providers and are expected to advise clinicians on matters relating to the implementation of pharmacogenomics in patient care. However, current pharmacogenomics instruction in pharmacy schools fails to produce pharmacists with the required knowledge or practical training in this discipline. In this perspective, we provide several strategies to overcome the limitations faced by pharmacy schools. Once these strategies are implemented, pharmacy schools will produce precision medicine-ready pharmacists.
[Precision Nursing: Individual-Based Knowledge Translation].
Chiang, Li-Chi; Yeh, Mei-Ling; Su, Sui-Lung
2016-12-01
U.S. President Obama announced a new era of precision medicine in the Precision Medicine Initiative (PMI). This initiative aims to accelerate the progress of personalized medicine in light of individual requirements for prevention and treatment in order to improve the state of individual and public health. The recent and dramatic development of large-scale biologic databases (such as the human genome sequence), powerful methods for characterizing patients (such as genomics, microbiome, diverse biomarkers, and even pharmacogenomics), and computational tools for analyzing big data are maximizing the potential benefits of precision medicine. Nursing science should follow and keep pace with this trend in order to develop empirical knowledge and expertise in the area of personalized nursing care. Nursing scientists must encourage, examine, and put into practice innovative research on precision nursing in order to provide evidence-based guidance to clinical practice. The applications in personalized precision nursing care include: explanations of personalized information such as the results of genetic testing; patient advocacy and support; anticipation of results and treatment; ongoing chronic monitoring; and support for shared decision-making throughout the disease trajectory. Further, attention must focus on the family and the ethical implications of taking a personalized approach to care. Nurses will need to embrace the paradigm shift to precision nursing and work collaboratively across disciplines to provide the optimal personalized care to patients. If realized, the full potential of precision nursing will provide the best chance for good health for all.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sweany, Melinda
2017-10-01
This is a high-risk effort to leverage knowledge gained from previous work, which focused on detector development leading to better energy resolution and reconstruction errors. This work seeks to enable applications that require precise elemental characterization of materials, such as chemical munitions remediation, offering the potential to close current detection gaps.
Correlating the cold flow and melting properties of fatty acid methyl ester (FAME) mixtures
USDA-ARS's Scientific Manuscript database
Fatty acid methyl ester (FAME) mixtures derived from plant oils or animal fats are used to make biodiesel, lubricants, surfactants, plasticizers, ink solvents, paint strippers and other products. Processing requires a precise knowledge of the physico-chemical properties of mixtures with diverse and ...
Hoffman, James M; Dunnenberger, Henry M; Kevin Hicks, J; Caudle, Kelly E; Whirl Carrillo, Michelle; Freimuth, Robert R; Williams, Marc S; Klein, Teri E; Peterson, Josh F
2016-07-01
To move beyond a select few genes/drugs, the successful adoption of pharmacogenomics into routine clinical care requires a curated and machine-readable database of pharmacogenomic knowledge suitable for use in an electronic health record (EHR) with clinical decision support (CDS). Recognizing that EHR vendors do not yet provide a standard set of CDS functions for pharmacogenetics, the Clinical Pharmacogenetics Implementation Consortium (CPIC) Informatics Working Group is developing and systematically incorporating a set of EHR-agnostic implementation resources into all CPIC guidelines. These resources illustrate how to integrate pharmacogenomic test results in clinical information systems with CDS to facilitate the use of patient genomic data at the point of care. Based on our collective experience creating existing CPIC resources and implementing pharmacogenomics at our practice sites, we outline principles to define the key features of future knowledge bases and discuss the importance of these knowledge resources for pharmacogenomics and ultimately precision medicine.
Williams, Marc S; Buchanan, Adam H; Davis, F Daniel; Faucett, W Andrew; Hallquist, Miranda L G; Leader, Joseph B; Martin, Christa L; McCormick, Cara Z; Meyer, Michelle N; Murray, Michael F; Rahm, Alanna K; Schwartz, Marci L B; Sturm, Amy C; Wagner, Jennifer K; Williams, Janet L; Willard, Huntington F; Ledbetter, David H
2018-05-01
Health care delivery is increasingly influenced by the emerging concepts of precision health and the learning health care system. Although not synonymous with precision health, genomics is a key enabler of individualized care. Delivering patient-centered, genomics-informed care based on individual-level data in the current national landscape of health care delivery is a daunting challenge. Problems to overcome include data generation, analysis, storage, and transfer; knowledge management and representation for patients and providers at the point of care; process management; and outcomes definition, collection, and analysis. Development, testing, and implementation of a genomics-informed program requires multidisciplinary collaboration and building the concepts of precision health into a multilevel implementation framework. Using the principles of a learning health care system provides a promising solution. This article describes the implementation of population-based genomic medicine in an integrated learning health care system: a working example of a precision health program.
Correcting for time-dependent field inhomogeneities in a time orbiting potential magnetic trap
NASA Astrophysics Data System (ADS)
Fallon, Adam; Berl, Seth; Sackett, Charles
2017-04-01
Many experiments use a Time Orbiting Potential (TOP) magnetic trap to confine a Bose-Einstein condensate. An advantage of the TOP trap is that it is relatively insensitive to deviations and errors in the magnetic field. However, precision experiments using the trapped atoms often do require the rotating field to be well characterized. For instance, precision spectroscopy requires accurate knowledge of both the field magnitude and the field direction relative to the polarization of a probe laser beam. We have developed an RF spectroscopic technique to measure the magnitude of the field at arbitrary times within the TOP trap rotation period. From the time-variation mapped out, various imperfections can be isolated and measured, including asymmetries in the applied trap field and static environmental fields. By compensating for these imperfections, field control at the 10 mG level or better is achievable for a bias field of 10 G or more. This should help enable more precision experiments using trapped condensates, including precision measurements of tune-out wavelengths and possibly parity-violation measurements. Supported by the National Science Foundation, the Jefferson Scholars Foundation, and NASA.
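For context, the RF technique rests on the textbook Zeeman resonance condition (not a detail from the abstract): adjacent magnetic sublevels are split by

\[ f_{\rm res}(t) = \frac{g_F\,\mu_B\,|B(t)|}{h}, \]

which for ^87Rb in F = 2 (g_F = 1/2) is about 0.70 MHz per gauss. Pulsing the RF at a chosen phase of the TOP rotation and locating the resonance therefore maps |B(t)| across the rotation period; a 10 mG field offset corresponds to roughly 7 kHz in resonance frequency.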
Learning receptor positions from imperfectly known motions
NASA Technical Reports Server (NTRS)
Ahumada, Albert J., Jr.; Mulligan, Jeffrey B.
1990-01-01
An algorithm is described for learning image interpolation functions for sensor arrays whose sensor positions are somewhat disordered. The learning is based on failures of translation invariance, so it does not require knowledge of the images being presented to the visual system. Previously reported implementations of the method assumed the visual system to have precise knowledge of the translations. It is demonstrated that translation estimates computed from the imperfectly interpolated images can have enough accuracy to allow the learning process to converge to a correct interpolation.
Beckmann, Jacques S; Lew, Daniel
2016-12-19
This era of groundbreaking scientific developments in high-resolution, high-throughput technologies is allowing the cost-effective collection and analysis of huge, disparate datasets on individual health. Proper data mining and translation of the vast datasets into clinically actionable knowledge will require the application of clinical bioinformatics. These developments have triggered multiple national initiatives in precision medicine-a data-driven approach centering on the individual. However, clinical implementation of precision medicine poses numerous challenges. Foremost, precision medicine needs to be contrasted with the powerful and widely used practice of evidence-based medicine, which is informed by meta-analyses or group-centered studies from which mean recommendations are derived. This "one size fits all" approach can provide inadequate solutions for outliers. Such outliers, which are far from an oddity as all of us fall into this category for some traits, can be better managed using precision medicine. Here, we argue that it is necessary and possible to bridge between precision medicine and evidence-based medicine. This will require worldwide and responsible data sharing, as well as regularly updated training programs. We also discuss the challenges and opportunities for achieving clinical utility in precision medicine. We project that, through collection, analyses and sharing of standardized medically relevant data globally, evidence-based precision medicine will shift progressively from therapy to prevention, thus leading eventually to improved, clinician-to-patient communication, citizen-centered healthcare and sustained well-being.
40 CFR 799.6786 - TSCA water solubility: Generator column method.
Code of Federal Regulations, 2014 CFR
2014-07-01
... quantitative) analysis of solvent extract in paragraph (c)(3)(iv) of this section. The design of the generator.... Finally, the design of most chemical tests and many ecological and health tests requires precise knowledge..., molality, and mole fraction. For example, to convert from weight/volume to molarity molecular mass is...
40 CFR 799.6786 - TSCA water solubility: Generator column method.
Code of Federal Regulations, 2013 CFR
2013-07-01
... quantitative) analysis of solvent extract in paragraph (c)(3)(iv) of this section. The design of the generator.... Finally, the design of most chemical tests and many ecological and health tests requires precise knowledge..., molality, and mole fraction. For example, to convert from weight/volume to molarity molecular mass is...
40 CFR 799.6786 - TSCA water solubility: Generator column method.
Code of Federal Regulations, 2012 CFR
2012-07-01
... quantitative) analysis of solvent extract in paragraph (c)(3)(iv) of this section. The design of the generator.... Finally, the design of most chemical tests and many ecological and health tests requires precise knowledge..., molality, and mole fraction. For example, to convert from weight/volume to molarity molecular mass is...
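The unit conversion the excerpts truncate is routine; as a worked example with generic values (not from the regulation text): a weight/volume concentration c_wv in g/L divided by the molecular mass M in g/mol gives molarity,

\[ c = \frac{c_{wv}}{M}, \]

so 5.0 g/L of a solute with M = 58.44 g/mol corresponds to 5.0 / 58.44 ≈ 0.086 mol/L.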
An overview of NASA's ASCENDS Mission's Lidar Measurement Requirements
NASA Astrophysics Data System (ADS)
Abshire, J. B.; Browell, E. V.; Menzies, R. T.; Lin, B.; Spiers, G. D.; Ismail, S.
2014-12-01
The objectives of NASA's ASCENDS mission are to improve the knowledge of global CO2 sources and sinks by precisely measuring the tropospheric column abundance of atmospheric CO2 and O2. The mission will use a continuously operating nadir-pointed integrated path differential absorption (IPDA) lidar in a polar orbit. The lidar offers a number of important new capabilities and will measure atmospheric CO2 globally over a wide range of challenging conditions, including at night, at high latitudes, through hazy and thin cloud conditions, and to cloud tops. The laser source enables a measurement of range, so that the absorption path length to the scattering surface will be always accurately known. The lidar approach also measures consistently in a nadir-zenith path and the narrow laser linewidth allows weighting the measurement to the lower troposphere. Using these measurements with atmospheric and flux models will allow improved estimates of CO2 fluxes and hence better understanding of the processes that exchange CO2 between the surface and atmosphere. The ASCENDS formulation team has developed a preliminary set of requirements for the lidar measurements. These were developed based on experience gained from the numerous ASCENDS airborne campaigns that have used different candidate lidar measurement techniques. They also take into account the complexity of making precise measurement of atmospheric gas columns when viewing the Earth from space. Some of the complicating factors are the widely varying reflectance and topographic heights of the Earth's land and ocean surfaces, the variety of cloud types, and the degree of cloud and aerosol absorption and scattering in the atmosphere. The requirements address the precision and bias in the measured column mixing ratio, the dynamic range of the expected surface reflected signal, the along-track sampling resolution, measurements made through thin clouds, measurements to forested and slope surfaces, range precision, measurements to cloud tops, knowledge of the laser spot position, and off-nadir pointing. These requirements are independent of the measurement approach, and are consistent with the initial mission simulation studies performed by the formulation team. This presentation will summarize the requirements along with examples that have guided their selection.
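As a sketch of the measurement behind these requirements (conventional IPDA notation, not taken from the abstract): with on-line and off-line received powers P_on, P_off and transmitted energies E_on, E_off, the two-way differential absorption optical depth of the column is

\[ {\rm DAOD} = \frac{1}{2}\,\ln\!\left(\frac{P_{\rm off}\,E_{\rm on}}{P_{\rm on}\,E_{\rm off}}\right), \]

and the column-average CO2 dry-air mixing ratio follows by dividing DAOD by the integral of the weighting function over the laser-measured path length, which is why range knowledge and the dynamic range of the surface-reflected signal both appear explicitly in the requirements.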
The Cook Agronomy Farm LTAR: Knowledge Intensive Precision Agro-ecology
NASA Astrophysics Data System (ADS)
Huggins, D. R.
2015-12-01
Drowning in data and starving for knowledge, agricultural decision makers require evidence-based information to guide sustainable intensification. The agro-ecological footprint of the Cook Agronomy Farm (CAF) Long-Term Agro-ecosystem Research (LTAR) site is embedded within 9.4 million ha of diverse land uses, primarily cropland (2.9 million ha) and rangeland (5.3 million ha), that span a wide annual precipitation gradient (150 mm through 1400 mm) with diverse social and natural capital. Sustainable intensification hinges on the development and adoption of precision agro-ecological practices that rely on meaningful spatio-temporal data relevant to land use decisions at within-field to regional scales. Specifically, the CAF LTAR will provide the scientific foundation (socio-economic and bio-physical) for enhancing decision support for precision and conservation agriculture and synergistic cropping system intensification and diversification. Long- and short-term perspectives that recognize and assess trade-offs in ecosystem services inherent in any land use decision will be considered so as to promote the development of more sustainable agricultural systems. Presented will be current and future CAF LTAR research efforts required for the development of sustainable agricultural systems, including cropping system cycles and flows of nutrients, water, carbon, greenhouse gases and other biotic and abiotic factors. Evaluation criteria and metrics associated with long-term agro-ecosystem provisioning, supporting, and regulating services will be emphasized.
Schmidt, Susanne; Seiberl, Wolfgang; Schwirtz, Ansgar
2015-01-01
Ergonomic design requirements are needed to develop optimum vehicle interfaces for the driver. The majority of current specifications consider only anthropometric conditions and subjective evaluations of comfort. This paper examines specific biomechanical aspects to improve the current ergonomic requirements. A study involving 40 subjects was therefore carried out to gain more knowledge about steering movement while driving a car. Five different shoulder-elbow joint configurations were analyzed using a driving simulator to find the optimum posture for driving with respect to steering precision and steering velocity. To this end, a 20 s precision test and a test of maximum steering velocity over a range of 90° of steering motion were conducted. The results show that driving precision, as well as maximum steering velocity, is significantly increased in mid-positions (elbow angles of 95° and 120°) compared to more flexed (70°) or extended (145° and 160°) postures. We conclude that driver safety can be enhanced by implementing these data in the automotive design process, because faster and highly precise steering can be important during evasive actions and in accident situations. In addition, subjective comfort ratings, analyzed with questionnaires, confirmed the experimental results.
NASA Astrophysics Data System (ADS)
Zhou, Y.; Ambarish, C. V.; Gruenke, R.; Jaeckel, F. T.; Kripps, K. L.; McCammon, D.; Morgan, K. M.; Wulf, D.; Zhang, S.; Adams, J. S.; Bandler, S. R.; Chervenak, J. A.; Datesman, A. M.; Eckart, M. E.; Ewin, A. J.; Finkbeiner, F. M.; Kelley, R. L.; Kilbourne, C. A.; Miniussi, A. R.; Porter, F. S.; Sadleir, J. E.; Sakai, K.; Smith, S. J.; Wakeham, N. A.; Wassell, E. J.; Yoon, W.
2018-05-01
We have specialized astronomical applications for X-ray microcalorimeters with superconducting transition edge sensors (TESs) that require exceptionally good TES performance, but which operate in the small-signal regime. We have therefore begun a program to carefully characterize the entire transition surface of TESs with and without the usual zebra stripes to see if there are reproducible local "sweet spots" where the performance is much better than average. These measurements require precise knowledge of the circuit parameters. Here, we show how the Shapiro effect can be used to precisely calibrate the value of the shunt resistor. We are also investigating the effects of stress and external magnetic fields to better understand reproducibility problems.
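The calibration idea rests on the AC Josephson relation (a textbook result; the numbers here are illustrative): under microwave irradiation at frequency f, Shapiro steps appear at quantized voltages

\[ V_n = \frac{n\,h f}{2e} \approx n \times 2.068\ \mu{\rm V} \times \left(\frac{f}{\rm GHz}\right), \]

so at f = 5 GHz the steps are spaced by ≈ 10.34 μV. Reading off the bias-current spacing ΔI_b between steps in the shunt-dominated bias circuit then fixes the shunt resistance, R_sh ≈ (hf/2e)/ΔI_b, against a quantum-accurate voltage scale.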
Is There Space for the Objective Force?
2003-04-07
force through the combination of precision weapons and knowledge-based warfare. Army forces will survive through information dominance, provided by a ... Objective Forces. Space-based systems will be foundational building blocks for the Objective Force to achieve information dominance and satellite ... communications required for information dominance across a distributed battlefield? Second, what exists to provide the Objective Force information
Utilizing AI in Temporal, Spatial, and Resource Scheduling
NASA Technical Reports Server (NTRS)
Stottler, Richard; Kalton, Annaka; Bell, Aaron
2006-01-01
Aurora is a software system enabling the rapid, easy solution of complex scheduling problems involving spatial and temporal constraints among operations and scarce resources (such as equipment, workspace, and human experts). Although developed for use in the International Space Station Processing Facility, Aurora is flexible enough that it can be easily customized for application to other scheduling domains and adapted as the requirements change or become more precisely known over time. Aurora's scheduling module utilizes artificial-intelligence (AI) techniques to make scheduling decisions on the basis of domain knowledge, including knowledge of constraints and their relative importance, interdependencies among operations, and possibly frequent changes in governing schedule requirements. Unlike many other scheduling software systems, Aurora focuses on resource requirements and temporal scheduling in combination. For example, Aurora can accommodate a domain requirement to schedule two subsequent operations to locations adjacent to a shared resource. The graphical interface allows the user to quickly visualize the schedule and perform changes reflecting additional knowledge or alterations in the situation. For example, the user might drag the activity corresponding to the start of operations to reflect a late delivery.
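A minimal sketch of the kind of combined temporal/resource reasoning described, in plain Python; the task names, fields and greedy strategy are invented for illustration and are not Aurora's actual algorithm or API:

```python
# Toy greedy scheduler: respects precedence (temporal) constraints and a
# single scarce resource per task. Illustrative only, not Aurora.
from dataclasses import dataclass, field

@dataclass
class Task:
    name: str
    duration: int                               # time units
    resource: str                               # scarce resource required
    after: list = field(default_factory=list)   # prerequisite task names

def schedule(tasks):
    start, finish, busy_until, done = {}, {}, {}, set()
    while len(done) < len(tasks):
        progressed = False
        for t in tasks:
            if t.name in done or any(p not in done for p in t.after):
                continue
            # earliest start: all prerequisites finished AND resource free
            s = max([finish[p] for p in t.after] + [busy_until.get(t.resource, 0)])
            start[t.name], finish[t.name] = s, s + t.duration
            busy_until[t.resource] = s + t.duration
            done.add(t.name)
            progressed = True
        if not progressed:
            raise ValueError("cycle in precedence constraints")
    return start

tasks = [Task("unpack", 2, "crew"),
         Task("inspect", 3, "crew", after=["unpack"]),
         Task("test", 2, "rig", after=["inspect"])]
print(schedule(tasks))  # {'unpack': 0, 'inspect': 2, 'test': 5}
```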
Open-Loop Flight Testing of COBALT Navigation and Sensor Technologies for Precise Soft Landing
NASA Technical Reports Server (NTRS)
Carson, John M., III; Restrepo, Caroline I.; Seubert, Carl R.; Amzajerdian, Farzin; Pierrottet, Diego F.; Collins, Steven M.; O'Neal, Travis V.; Stelling, Richard
2017-01-01
An open-loop flight test campaign of the NASA COBALT (CoOperative Blending of Autonomous Landing Technologies) payload was conducted onboard the Masten Xodiac suborbital rocket testbed. The payload integrates two complementary sensor technologies that together provide a spacecraft with knowledge during planetary descent and landing to precisely navigate and softly touchdown in close proximity to targeted surface locations. The two technologies are the Navigation Doppler Lidar (NDL), for high-precision velocity and range measurements, and the Lander Vision System (LVS) for map-relative state estimates. A specialized navigation filter running onboard COBALT fuses the NDL and LVS data in real time to produce a very precise Terrain Relative Navigation (TRN) solution that is suitable for future, autonomous planetary landing systems that require precise and soft landing capabilities. During the open-loop flight campaign, the COBALT payload acquired measurements and generated a precise navigation solution, but the Xodiac vehicle planned and executed its maneuvers based on an independent, GPS-based navigation solution. This minimized the risk to the vehicle during the integration and testing of the new navigation sensing technologies within the COBALT payload.
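As a toy illustration of this kind of sensor fusion, here is a one-dimensional linear Kalman filter combining a map-relative position measurement (standing in for LVS) with range and velocity measurements (standing in for NDL). All matrices and numbers are invented; this is generic textbook fusion, not the COBALT flight filter:

```python
# 1-D Kalman filter fusing a coarse position fix with precise range and
# velocity measurements. Illustrative sketch only.
import numpy as np

dt = 0.1
F = np.array([[1, dt], [0, 1]])          # constant-velocity motion model
Q = np.diag([1e-4, 1e-3])                # process noise
H = np.array([[1, 0],                    # LVS-like: map-relative position
              [1, 0],                    # NDL-like: range to surface
              [0, 1]])                   # NDL-like: velocity
R = np.diag([4.0, 0.04, 1e-4])           # LVS coarser than NDL

x = np.array([1000.0, -30.0])            # altitude (m), descent rate (m/s)
P = np.diag([100.0, 1.0])

def step(x, P, z):
    x, P = F @ x, F @ P @ F.T + Q        # predict
    S = H @ P @ H.T + R                  # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)       # Kalman gain
    x = x + K @ (z - H @ x)              # update with fused measurements
    P = (np.eye(2) - K @ H) @ P
    return x, P

z = np.array([997.2, 997.0, -29.9])      # one epoch of measurements
x, P = step(x, P, z)
print(x)                                  # fused altitude and descent rate
```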
A primer on precision medicine informatics.
Sboner, Andrea; Elemento, Olivier
2016-01-01
In this review, we describe key components of a computational infrastructure for a precision medicine program that is based on clinical-grade genomic sequencing. Specific aspects covered in this review include software components and hardware infrastructure, reporting, integration into Electronic Health Records for routine clinical use and regulatory aspects. We emphasize informatics components related to reproducibility and reliability in genomic testing, regulatory compliance, traceability and documentation of processes, integration into clinical workflows, privacy requirements, prioritization and interpretation of results to report based on clinical needs, rapidly evolving knowledge base of genomic alterations and clinical treatments and return of results in a timely and predictable fashion. We also seek to differentiate between the use of precision medicine in germline and cancer.
Sliding mode control of magnetic suspensions for precision pointing and tracking applications
NASA Technical Reports Server (NTRS)
Misovec, Kathleen M.; Flynn, Frederick J.; Johnson, Bruce G.; Hedrick, J. Karl
1991-01-01
A recently developed nonlinear control method, sliding mode control, is examined as a means of advancing the achievable performance of space-based precision pointing and tracking systems that use nonlinear magnetic actuators. Analytic results indicate that sliding mode control improves performance compared to linear control approaches. In order to realize these performance improvements, precise knowledge of the plant is required. Additionally, the interaction of an estimating scheme and the sliding mode controller has not been fully examined in the literature. Estimation schemes were designed for use with this sliding mode controller that do not seriously degrade system performance. The authors designed and built a laboratory testbed to determine the feasibility of utilizing sliding mode control in these types of applications. Using this testbed, experimental verification of the authors' analyses is ongoing.
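A generic statement of the sliding mode law for context (standard formulation, not the paper's specific design): for tracking error e, one defines a sliding surface and drives the state onto it,

\[ s = \dot e + \lambda e, \qquad u = u_{\rm eq} - K\,{\rm sat}(s/\varphi), \]

where u_eq is the equivalent control computed from the (here, magnetic-actuator) plant model, K enforces the reaching condition s·ṡ ≤ −η|s|, and the boundary-layer width φ trades chattering against precision. The dependence of u_eq on the plant model is exactly why the abstract notes that precise knowledge of the plant is required to realize the performance gains.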
Facial trauma: general principles of management.
Hollier, Larry H; Sharabi, Safa E; Koshy, John C; Stal, Samuel
2010-07-01
Facial fractures are common problems encountered by the plastic surgeon. Although ubiquitous in nature, their optimal treatment requires precise knowledge of the most recent evidence-based and technologically advanced recommendations. This article discusses a variety of contemporary issues regarding facial fractures, including physical and radiologic diagnosis, treatment pearls and caveats, and the role of various synthetic materials and plating technologies for optimal facial fracture fixation.
RDF SKETCH MAPS - KNOWLEDGE COMPLEXITY REDUCTION FOR PRECISION MEDICINE ANALYTICS.
Thanintorn, Nattapon; Wang, Juexin; Ersoy, Ilker; Al-Taie, Zainab; Jiang, Yuexu; Wang, Duolin; Verma, Megha; Joshi, Trupti; Hammer, Richard; Xu, Dong; Shin, Dmitriy
2016-01-01
Realization of precision medicine ideas requires significant research effort to be able to spot subtle differences in complex diseases at the molecular level and to develop personalized therapies. It is especially important in many cases of highly heterogeneous cancers. Precision diagnostics and therapeutics of such diseases demand interrogation of vast amounts of biological knowledge coupled with novel analytic methodologies. For instance, pathway-based approaches can shed light on the way tumorigenesis takes place in individual patient cases and pinpoint novel drug targets. However, comprehensive analysis of hundreds of pathways and thousands of genes creates a combinatorial explosion, which is challenging for medical practitioners to handle at the point of care. Here we extend our previous work on mapping clinical omics data to curated Resource Description Framework (RDF) knowledge bases to derive influence diagrams of interrelationships of biomarker proteins, diseases and signal transduction pathways for personalized theranostics. We present RDF Sketch Maps, a computational method to reduce knowledge complexity for precision medicine analytics. The method of RDF Sketch Maps is inspired by the way a sketch artist conveys only important visual information and discards other unnecessary details. In our case, we compute and retain only so-called RDF Edges, places with highly important diagnostic and therapeutic information. To do this we utilize 35 maps of human signal transduction pathways, obtained by transforming 300 KEGG maps into a highly processable RDF knowledge base. We have demonstrated the potential clinical utility of RDF Sketch Maps in hematopoietic cancers, including analysis of pathways associated with Hairy Cell Leukemia (HCL) and Chronic Myeloid Leukemia (CML), where we achieved up to 20-fold reduction in the number of biological entities to be analyzed while retaining the most likely important entities. In experiments with pathways associated with HCL, a generated RDF Sketch Map of the top 30% of paths retained important information about signaling cascades leading to activation of the proto-oncogene BRAF, which is usually associated with a different cancer, melanoma. Recent reports of successful treatments of HCL patients with the BRAF-targeted drug vemurafenib support the validity of the RDF Sketch Maps findings. We therefore believe that RDF Sketch Maps will be invaluable for hypothesis generation for precision diagnostics and therapeutics as well as drug repurposing studies.
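A minimal sketch of the path-ranking idea behind the "top 30% of paths" reduction, in plain Python; the graph, weights and scoring here are invented toy content, not the paper's RDF data or scoring method:

```python
# Toy path ranking: enumerate signaling paths from a biomarker node to a
# disease node, score each by the product of edge confidences, keep the top
# fraction. Illustrative only.
graph = {                                 # edge -> confidence weight
    "BIOMARKER": {"RAS": 0.9, "PI3K": 0.6},
    "RAS": {"BRAF": 0.8},
    "PI3K": {"AKT": 0.7},
    "BRAF": {"MEK": 0.9},
    "MEK": {"DISEASE": 0.8},
    "AKT": {"DISEASE": 0.5},
}

def paths(node, goal, seen=()):
    if node == goal:
        yield [node]
    for nxt in graph.get(node, {}):
        if nxt not in seen:
            for rest in paths(nxt, goal, seen + (node,)):
                yield [node] + rest

def score(path):                          # product of edge weights
    s = 1.0
    for a, b in zip(path, path[1:]):
        s *= graph[a][b]
    return s

ranked = sorted(paths("BIOMARKER", "DISEASE"), key=score, reverse=True)
keep = ranked[:max(1, int(0.3 * len(ranked)))]   # retain the top 30%
print(keep)
```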
Precise measurement of the performance of thermoelectric modules
NASA Astrophysics Data System (ADS)
Díaz-Chao, Pablo; Muñiz-Piniella, Andrés; Selezneva, Ekaterina; Cuenat, Alexandre
2016-08-01
The potential exploitation of thermoelectric modules in mass-market applications such as exhaust gas heat recovery in combustion engines requires accurate knowledge of their performance. Further expansion of the market will also require confidence in the results provided by suppliers to end-users. However, large variations in performance and maximum operating point are observed for identical modules when tested by different laboratories. Here, we present the first metrological study of the impact of mounting and testing procedures on the precision of thermoelectric module measurement. Variability in the electrical output due to mechanical pressure or the type of thermal interface material is quantified for the first time. The respective contributions of the temperature difference and the mean temperature to the variation in the output performance are quantified. The contribution of these factors to the total uncertainties in module characterisation is detailed.
Evaluation of SAPHIRE: an automated approach to indexing and retrieving medical literature.
Hersh, W.; Hickam, D. H.; Haynes, R. B.; McKibbon, K. A.
1991-01-01
An analysis of SAPHIRE, an experimental information retrieval system featuring automated indexing and natural language retrieval, was performed on MEDLINE references using data previously generated for a MEDLINE evaluation. Compared with searches performed by novice and expert physicians using MEDLINE, SAPHIRE achieved comparable recall and precision. While its combined recall and precision performance did not equal the level of librarians, SAPHIRE did achieve a significantly higher level of absolute recall. SAPHIRE has other potential advantages over existing MEDLINE systems. Its natural language interface does not require knowledge of MeSH, and it provides relevance ranking of retrieved references. PMID:1807718
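For reference, the two reported metrics are defined in the standard way (not specific to SAPHIRE):

\[ {\rm recall} = \frac{TP}{TP + FN}, \qquad {\rm precision} = \frac{TP}{TP + FP}, \]

so a search returning 20 references of which 8 are relevant, out of 10 relevant references in the database, has precision 0.4 and recall 0.8.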
Solar axion search technique with correlated signals from multiple detectors
Xu, Wenqin; Elliott, Steven R.
2017-01-25
The coherent Bragg scattering of photons converted from solar axions inside crystals would boost the signal for axion-photon coupling, enhancing experimental sensitivity for these hypothetical particles. Knowledge of the scattering angle of solar axions with respect to the crystal lattice is required to make theoretical predictions of signal strength. Hence, both the lattice axis angle within a crystal and the absolute angle between the crystal and the Sun must be known. In this paper, we examine how the experimental sensitivity changes with respect to various experimental parameters. We also demonstrate that, in a multiple-crystal setup, knowledge of the relative axis orientation between multiple crystals can improve the experimental sensitivity, or equivalently, relax the precision on the absolute solar angle measurement. However, if absolute angles of all crystal axes are measured, we find that a precision of 2°–4° will suffice for an energy resolution of σ_E = 0.04E and a flat background. Lastly, we also show that, given a minimum number of detectors, a signal model averaged over angles can substitute for precise crystal angular measurements, with some loss of sensitivity.
Last Glacial Maximum Salinity Reconstruction
NASA Astrophysics Data System (ADS)
Homola, K.; Spivack, A. J.
2016-12-01
It has been previously demonstrated that salinity can be reconstructed from sediment porewater. The goal of our study is to reconstruct high-precision salinity during the Last Glacial Maximum (LGM). Salinity is usually determined at high precision via conductivity, which requires a larger volume of water than can be extracted from a sediment core, or via chloride titration, which yields lower than ideal precision. It has been demonstrated for water column samples that high-precision density measurements can be used to determine salinity at the precision of a conductivity measurement using the equation of state of seawater. However, water column seawater has a relatively constant composition, in contrast to porewater, where variations from standard seawater composition occur. These deviations, which affect the equation of state, must be corrected for through precise measurements of each ion's concentration and knowledge of its apparent partial molar density in seawater. We have developed a density-based method for determining porewater salinity that requires only 5 mL of sample, achieving density precisions of 10^-6 g/mL. We have applied this method to porewater samples extracted from long cores collected along a N-S transect across the western North Atlantic (R/V Knorr cruise KN223). Density was determined to a precision of 2.3x10^-6 g/mL, which translates to a salinity uncertainty of 0.002 g/kg if the effect of differences in composition is well constrained. Concentrations of anions (Cl- and SO4^2-) and cations (Na+, Mg^2+, Ca^2+, and K+) were measured. To correct salinities at the precision required to unravel LGM Meridional Overturning Circulation, our ion precisions must be better than 0.1% for SO4^2-/Cl- and Mg^2+/Na+, and 0.4% for Ca^2+/Na+ and K+/Na+. Alkalinity, pH and Dissolved Inorganic Carbon of the porewater were determined to precisions better than 4% when ratioed to Cl-, and used to calculate HCO3^- and CO3^2-. Apparent partial molar densities in seawater were determined experimentally. We compare the high-precision salinity profiles determined using our new method to profiles determined from traditional chloride titrations of parallel samples. Our technique provides a more accurate reconstruction of past salinity, informing questions of water mass composition and distribution during the LGM.
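A minimal sketch of the density-to-salinity inversion described, assuming the gsw (TEOS-10, GSW-Python) package; the values are invented for illustration and this is not the cruise-processing code:

```python
# Invert the TEOS-10 equation of state for Absolute Salinity given a
# high-precision density measurement, via bisection. Illustrative sketch.
import gsw

def salinity_from_density(rho_meas, CT, p, lo=30.0, hi=40.0, tol=1e-8):
    # gsw.rho(SA, CT, p): in-situ density from Absolute Salinity (g/kg),
    # Conservative Temperature (deg C), and pressure (dbar).
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if gsw.rho(mid, CT, p) < rho_meas:
            lo = mid          # density increases monotonically with salinity
        else:
            hi = mid
    return 0.5 * (lo + hi)

SA = salinity_from_density(rho_meas=1027.5, CT=10.0, p=0.0)
print(f"SA = {SA:.4f} g/kg")
```

Bisection works here because density is monotonic in salinity at fixed temperature and pressure; the composition corrections discussed in the abstract would be applied to the measured density before this inversion.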
Precision Attitude Control for the BETTII Balloon-Borne Interferometer
NASA Technical Reports Server (NTRS)
Benford, Dominic J.; Fixsen, Dale J.; Rinehart, Stephen
2012-01-01
The Balloon Experimental Twin Telescope for Infrared Interferometry (BETTII) is an 8-meter baseline far-infrared interferometer to fly on a high altitude balloon. Operating at wavelengths of 30-90 microns, BETTII will obtain spatial and spectral information on science targets at angular resolutions down to less than half an arcsecond, a capability unmatched by other far-infrared facilities. This requires attitude control at a level of less than a tenth of an arcsecond, a great challenge for a lightweight balloon-borne system. We have designed a precision attitude determination system to provide gondola attitude knowledge at a level of 2 milliarcseconds at rates up to 100 Hz, with accurate absolute attitude determination at the half-arcsecond level at rates of up to 10 Hz. A multi-stage control system involving rigid body motion and tip-tilt-piston correction provides precision pointing stability at the level required for the far-infrared instrument to perform its spatial/spectral interferometry in open-loop control. We present key aspects of the design of the attitude determination and control system and its development status.
Requirements and Solutions for Personalized Health Systems.
Blobel, Bernd; Ruotsalainen, Pekka; Lopez, Diego M; Oemig, Frank
2017-01-01
Organizational, methodological and technological paradigm changes enable a precise, personalized, predictive, preventive and participative approach to health and social services, supported by multiple actors from different domains at diverse levels of knowledge and skills. Interoperability has to advance beyond Information and Communication Technologies (ICT) concerns to include the real-world business domains and their processes, as well as the individual context of all actors involved. The paper introduces and compares personalized health definitions, summarizes requirements and principles for pHealth systems, and considers intelligent interoperability. It addresses knowledge representation and harmonization, decision intelligence, and usability as crucial issues in pHealth. On this basis, a system-theoretical, ontology-based, policy-driven reference architecture model for open and intelligent pHealth ecosystems and its transformation into an appropriate ICT design and implementation is proposed.
Mars Transportation Environment Definition Document
NASA Technical Reports Server (NTRS)
Alexander, M. (Editor)
2001-01-01
This document provides a compilation of environments knowledge about the planet Mars. Information is divided into three categories: (1) interplanetary space environments (environments required by the technical community to travel to and from Mars); (2) atmospheric environments (environments needed to aerocapture, aerobrake, or use aeroassist for precision trajectories down to the surface); and (3) surface environments (environments needed to have robots or explorers survive and work on the surface).
Precision medicine at the crossroads.
Olson, Maynard V
2017-10-11
There are bioethical, institutional, economic, legal, and cultural obstacles to creating the robust, precompetitive data resource that will be required to advance the vision of "precision medicine," the ability to use molecular data to target therapies to patients for whom they offer the most benefit at the least risk. Creation of such an "information commons" was the central recommendation of the 2011 report Toward Precision Medicine issued by a committee of the National Research Council of the USA (Committee on a Framework for Development of a New Taxonomy of Disease; National Research Council. Toward precision medicine: building a knowledge network for biomedical research and a new taxonomy of disease. 2011). In this commentary, I review the rationale for creating an information commons and the obstacles to doing so; then, I endorse a path forward based on the dynamic consent of research subjects interacting with researchers through trusted mediators. I assert that the advantages of the proposed system overwhelm alternative ways of handling data on the phenotypes, genotypes, and environmental exposures of individual humans; hence, I argue that its creation should be the central policy objective of early efforts to make precision medicine a reality.
Self-Knowledge and Risk in Stratified Medicine
Hordern, Joshua
2017-01-01
This article considers why and how self-knowledge is important to communication about risk and behaviour change by arguing for four claims. First, it is doubtful that genetic knowledge should properly be called ‘self-knowledge’ when its ordinary effects on self-motivation and behaviour change seem so slight. Second, temptations towards a reductionist, fatalist, construal of persons’ futures through a ‘molecular optic’ should be resisted. Third, any plausible effort to change people's behaviour must engage with cultural self-knowledge, values and beliefs, catalysed by the communication of genetic risk. For example, while a Judaeo-Christian notion of self-knowledge is distinctively theological, people's self-knowledge is plural in its insight and sources. Fourth, self-knowledge is found in compassionate, if tense, communion which yields freedom from determinism even amidst suffering. Stratified medicine thus offers a newly precise kind of humanising health care through societal solidarity with the riskiest. However, stratification may also mean that molecularly unstratified, ‘B’ patients’ experience involves accentuated suffering and disappointment, a concern requiring further research. PMID:28517991
A novel knowledge-based potential for RNA 3D structure evaluation
NASA Astrophysics Data System (ADS)
Yang, Yi; Gu, Qi; Zhang, Ben-Gong; Shi, Ya-Zhou; Shao, Zhi-Gang
2018-03-01
Ribonucleic acids (RNAs) play a vital role in biology, and knowledge of their three-dimensional (3D) structure is required to understand their biological functions. Recently structural prediction methods have been developed to address this issue, but a series of RNA 3D structures are generally predicted by most existing methods. Therefore, the evaluation of the predicted structures is generally indispensable. Although several methods have been proposed to assess RNA 3D structures, the existing methods are not precise enough. In this work, a new all-atom knowledge-based potential is developed for more accurately evaluating RNA 3D structures. The potential not only includes local and nonlocal interactions but also fully considers the specificity of each RNA by introducing a retraining mechanism. Based on extensive test sets generated from independent methods, the proposed potential correctly distinguished the native state and ranked near-native conformations to effectively select the best. Furthermore, the proposed potential precisely captured RNA structural features such as base-stacking and base-pairing. Comparisons with existing potential methods show that the proposed potential is very reliable and accurate in RNA 3D structure evaluation. Project supported by the National Science Foundation of China (Grants Nos. 11605125, 11105054, 11274124, and 11401448).
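For context, knowledge-based potentials of this general type are conventionally derived from the inverse Boltzmann relation (the generic form, not necessarily the paper's exact formulation):

\[ u(x) = -k_B T \,\ln\!\frac{P_{\rm obs}(x)}{P_{\rm ref}(x)}, \qquad E = \sum_{\rm interactions} u(x), \]

where P_obs is the frequency of a geometric feature x (e.g., an atom-pair distance or a stacking geometry) in known native structures and P_ref is its frequency in a reference state; lower total E should then mark more native-like conformations, which is the basis for ranking decoys as described.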
NIAC Phase I Study Final Report on Large Ultra-Lightweight Photonic Muscle Space Structures
NASA Technical Reports Server (NTRS)
Ritter, Joe
2016-01-01
The research goal is to develop new tools supporting NASA's mission of understanding the Cosmos by developing cost-effective solutions that yield a leap in performance and science data. 'Maikalani' in Hawaiian translates to, "knowledge we gain from the cosmos." Missions like Hubble have fundamentally changed humanity's view of the cosmos. Last year's Nobel Prize in physics was a result of astronomical discoveries. $9B-class JWST-size (6.5 meter diameter) space telescopes, when launched, are anticipated to rewrite our knowledge of physics. Here we report on a neoteric meta-material telescope mirror technology designed to enable a factor of 100 or more reduction in areal density, a factor of 100 reduction in telescope production and launch costs, as well as other advantages; a leap to enable missions to image the cosmos in unprecedented detail, with the associated gain in knowledge. Whether terahertz, visible or X-ray, reflectors used for high quality electromagnetic imaging require shape accuracy (surface figure) far better than 1 wavelength (lambda) of the incident photons, more typically lambda/10 or better. Imaging visible light therefore requires mirror surfaces that approximate a desired curve (e.g. a sphere or paraboloid) with smooth shape deviation of less than approximately 1/1000 the diameter of a human hair. This requires either thick high-modulus material like glass or metal, or actuators to control mirror shape. During Phase I our team studied a novel solution to this systems-level mass/shape tradespace requirement, both to advance the innovative space technology concept and also to help NASA and other agencies meet current operational and future mission requirements. Extreme and revolutionary NASA imaging missions such as the Terrestrial Planet Imager (TPI) require lightweight mirrors with minimum diameters of 20 to 40 meters. For reference, NASA's great achievement, the Hubble Space Telescope, is only 2.4 meters in diameter. What is required is a way to make large inexpensive deployable mirrors where the cost is measured in millions, not billions like current efforts. For example, we seek an interim goal within 10 years of a Hubble-size (2.4 m) primary mirror weighing 1 pound at a cost of 10K in materials. Described here is a technology using thin ultra-lightweight materials whose shape can be controlled simply with a beam of light, allowing imaging with incredibly low mass yet precisely shaped mirrors. These "Photonic Muscle" (OCCAM) substrates will eventually make precision control of giant ultra-lightweight space apertures (mirrors) possible. This technology is poised to create a revolution in remote sensing by making large ultra-lightweight space telescopes a fiscal and material reality over the next decade.
Gao, Qinqin; Tang, Jiaqi; Li, Na; Liu, Bailin; Zhang, Mengshu; Sun, Miao; Xu, Zhice
2018-02-01
It is widely accepted that placental ischemia is central in the evolution of hypertension in pregnancy. Many studies and reviews have targeted placental ischemia to explain mechanisms for initiating pregnancy hypertension. The placenta is rich in blood vessels, which are the basis for developing placental ischemia. However, is the physiology of placental vessels the same as that of nonplacental vessels? What is the pathophysiology of placental vessels in development of pregnancy hypertension? This review aims to provide a comprehensive summary of special features of placental vascular regulations and the pathophysiological changes linked to preeclamptic conditions. Interestingly, some popular theories or accepted concepts could be based on our limited knowledge and evidence regarding placental vascular physiology, pharmacology and pathophysiology. New views raised could offer interesting ideas for future investigation of mechanisms as well as targets for pregnancy hypertension.
NASA Astrophysics Data System (ADS)
Tauscher, Keith; Rapetti, David; Burns, Jack O.; Switzer, Eric
2018-02-01
The sky-averaged (global) highly redshifted 21 cm spectrum from neutral hydrogen is expected to appear in the VHF range of ∼20–200 MHz and its spectral shape and strength are determined by the heating properties of the first stars and black holes, by the nature and duration of reionization, and by the presence or absence of exotic physics. Measurements of the global signal would therefore provide us with a wealth of astrophysical and cosmological knowledge. However, the signal has not yet been detected because it must be seen through strong foregrounds weighted by a large beam, instrumental calibration errors, and ionospheric, ground, and radio-frequency-interference effects, which we collectively refer to as "systematics." Here, we present a signal extraction method for global signal experiments that uses Singular Value Decomposition of "training sets" to produce systematics basis functions specifically suited to each observation. Instead of requiring precise absolute knowledge of the systematics, our method effectively requires precise knowledge of how the systematics can vary. After calculating eigenmodes for the signal and systematics, we perform a weighted least-squares fit of the corresponding coefficients and select the number of modes to include by minimizing an information criterion. We compare the performance of the signal extraction when minimizing various information criteria and find that minimizing the Deviance Information Criterion most consistently yields unbiased fits. The methods used here are built into our widely applicable, publicly available Python package, pylinex, which analytically calculates constraints on signals and systematics from given data, errors, and training sets.
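A toy numpy sketch of the pipeline described: SVD of a training set to get systematics basis modes, then a weighted least-squares fit for signal plus systematics coefficients. The data, templates and fixed mode count are invented, and this is plain numpy, not the pylinex API:

```python
# Training-set SVD + weighted least-squares signal extraction. Toy sketch.
import numpy as np

rng = np.random.default_rng(0)
nchan, ntrain = 200, 500
x = np.linspace(-1, 1, nchan)

# Training set of plausible systematics realizations (invented model)
training = np.array([(50 + 10*rng.standard_normal()) * x**2
                     + (5 + rng.standard_normal()) * x for _ in range(ntrain)])

# SVD of the (mean-subtracted) training set gives systematics eigenmodes
_, _, Vt = np.linalg.svd(training - training.mean(axis=0), full_matrices=False)
nmodes = 2                         # in practice chosen via an info criterion
basis = Vt[:nmodes].T              # (nchan, nmodes) systematics basis

signal_shape = np.exp(-x**2 / 0.02)          # known signal template
A = np.hstack([signal_shape[:, None], basis])  # signal + systematics design

noise_std = 0.1
data = 0.5*signal_shape + 48*x**2 + 5.5*x + noise_std*rng.standard_normal(nchan)

W = np.eye(nchan) / noise_std**2             # inverse noise covariance
coef = np.linalg.solve(A.T @ W @ A, A.T @ W @ data)
print("signal amplitude estimate:", coef[0])  # ~0.5
```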
Lüscher, T F
2014-08-01
The publication of scientific manuscripts is an essential part of the research process and of the attempt to produce novel knowledge: only what is published exists. The aim of research is to produce reproducible and sustainable knowledge. Reproducible knowledge is based on precise observation, the use of modern methodologies, and appropriate statistical analysis. As a consequence, it must be the intention of any scientist to report the truth and nothing but the truth. This principle requires precision and honesty. Deviation from such behavior may lead to scientific misconduct, which encompasses the use of inappropriate methods and/or statistics, double publication of data, and sloppy data presentation and processing, up to data massaging, manipulation, data theft, or fabrication. Famous examples can be found throughout the history of research, but such behavior appears to have recently become more common, possibly owing to excessive competition and the crucial role of grants in scientific productivity, funding, and promotion. Accordingly, in the training of researchers it seems essential to emphasize the importance of precise data acquisition and analysis to ascertain reproducible data. Similarly, it must be ensured that data sets are published only once, that authors have contributed technically and/or intellectually in an important manner, and that the work of other scientists is cited appropriately. Editors and reviewers should carefully assess the quality of submitted manuscripts. In fact, it is the aim of the peer review process to ensure, as much as possible, that the quality of submitted manuscripts meets current methodological as well as ethical standards.
Relationship Between Climate Change Impact, Migration and Socioeconomic Development
NASA Astrophysics Data System (ADS)
Sann Oo, Kyaw
2016-06-01
Geospatial data are available in raster and vector formats, and some are available as open data. The techniques and tools to handle these data are also available as open source. Although the data are free of charge, the knowledge needed to use them is limited among those not trained in the field. The data and technology should therefore be promoted so that they can be put to use where they are needed in developing countries. Before open data can be used, they must be verified against local knowledge to become usable information, and valuable data resources, for local people. Developing countries whose economies are based on agriculture require more information about precise weather data and the weather variations caused by climate change for their socioeconomic development. This study found that rural-to-urban migration occurs in agriculture-based developing countries such as Myanmar when the agricultural economy is affected by unpredictable climate change impacts. Knowledge sharing using open data resources with less-educated local people is one workable solution for developing the country's agricultural economy. Moreover, the study will seek ways to reduce rural-to-urban migration.
The Science and Art of Eyebrow Transplantation by Follicular Unit Extraction
Gupta, Jyoti; Kumar, Amrendra; Chouhan, Kavish; Ariganesh, C; Nandal, Vinay
2017-01-01
Eyebrows constitute a very important and prominent feature of the face. With growing awareness, eyebrow transplantation has become a popular procedure. However, though the eyebrow covers only a small area, transplantation requires considerable precision and knowledge of anatomy, brow design, and extraction and implantation techniques. This article gives a comprehensive view of eyebrow transplantation with special emphasis on the follicular unit extraction technique, which has become the most popular technique. PMID:28852290
Integrating Disparate Information
2009-04-21
are intended to encapsulate some loosely articulated notions about the unknowns. The purpose of this paper is to propose a framework that is able to...show how each of these terms can be made precise, so that each reflects a distinct meaning. To construct our framework, we use a basic scenario upon...practice, namely our proposed framework, is the novel aspect of this paper. To appreciate all this, we require of the reader a knowledge of the calculus of
Precision Pointing Control to and Accurate Target Estimation of a Non-Cooperative Vehicle
NASA Technical Reports Server (NTRS)
VanEepoel, John; Thienel, Julie; Sanner, Robert M.
2006-01-01
In 2004, NASA began investigating a robotic servicing mission for the Hubble Space Telescope (HST). Such a mission would not only require estimates of the HST attitude and rates in order to achieve capture by the proposed Hubble Robotic Vehicle (HRV), but also precision control to achieve the desired rate and maintain the orientation to successfully dock with HST. To generalize the situation, HST is the target vehicle and HRV is the chaser. This work presents a nonlinear approach for estimating the body rates of a non-cooperative target vehicle, and coupling this estimation to a control scheme. Non-cooperative in this context relates to the target vehicle no longer having the ability to maintain attitude control or transmit attitude knowledge.
Precision agriculture: Data to knowledge decision
USDA-ARS?s Scientific Manuscript database
From the development of the first viable variable-rate fertilizer systems in the upper Midwest USA, precision agriculture is now about two decades old. In that time, new technologies have come into play, but the overall goal of using spatial data to create actionable knowledge that can then be used ...
Accurate State Estimation and Tracking of a Non-Cooperative Target Vehicle
NASA Technical Reports Server (NTRS)
Thienel, Julie K.; Sanner, Robert M.
2006-01-01
Autonomous space rendezvous scenarios require knowledge of the target vehicle state in order to safely dock with the chaser vehicle. Ideally, the target vehicle state information is derived from telemetered data, or with the use of known tracking points on the target vehicle. However, if the target vehicle is non-cooperative and does not have the ability to maintain attitude control, or transmit attitude knowledge, the docking becomes more challenging. This work presents a nonlinear approach for estimating the body rates of a non-cooperative target vehicle, and coupling this estimation to a tracking control scheme. The approach is tested with the robotic servicing mission concept for the Hubble Space Telescope (HST). Such a mission would not only require estimates of the HST attitude and rates, but also precision control to achieve the desired rate and maintain the orientation to successfully dock with HST.
Curigliano, Giuseppe
2014-01-01
The recognition that cancer is a 'spectrum' of diseases, and that medical oncologists should achieve 'convergence' from 'divergence' to treat cancer patients was the main theme of the 2014 European Society of Medical Oncology (ESMO) Congress. The meeting assembled 19,859 participants from nearly 134 countries worldwide. The educational content was centered on precision medicine in cancer care, from mutational burden to the immunome, through the epigenome and the proteome. Precision medicine has been defined as the tailoring of medical treatment to the characteristics of an individual patient. Knowing an individual's genomics has created a remarkable and unprecedented opportunity to improve medical treatment and develop preventative strategies to preserve health. Clinical oncologists across the range of diseases recognise that for precision medicine to take hold, it will require intensive, rigorous validation that these new approaches do indeed improve patient outcomes. Not all molecular alterations are predictive of response to a specific targeted treatment nor are they all druggable, raising issues of cost-benefit, validation of specific biomarkers, and of managing patient expectations. Addressing all these issues will be essential for the medical community to embrace any given opportunities. Along with it, it will also require educational programmes that squarely address the knowledge chasm that currently exists for practicing physicians. The promise of genomic and precision medicine has created greater demands for both those providing the scientific expertise-bioinformatics, statisticians, molecular biologists-and those delivering clinical care-physicians, nurses, psychologists-to the patients. This ESMO 2014 report will highlight the major findings of this outstanding meeting.
Decentralized Adaptive Control For Robots
NASA Technical Reports Server (NTRS)
Seraji, Homayoun
1989-01-01
Precise knowledge of dynamics not required. Proposed scheme for control of multijointed robotic manipulator calls for independent control subsystem for each joint, consisting of proportional/integral/derivative feedback controller and position/velocity/acceleration feedforward controller, both with adjustable gains. Independent joint controller compensates for unpredictable effects, gravitation, and dynamic coupling between motions of joints, while forcing joints to track reference trajectories. Scheme amenable to parallel processing in distributed computing system wherein each joint controlled by relatively simple algorithm on dedicated microprocessor.
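A minimal single-joint rendering of the scheme is sketched below: a PID feedback term plus a position/velocity/acceleration feedforward term, with one independent controller per joint. The gain values and class interface are illustrative assumptions, not the published adaptation laws.

```python
# One independent controller per joint, as described above: PID feedback
# plus position/velocity/acceleration feedforward, all gains adjustable.
class JointController:
    def __init__(self, kp, ki, kd, f_pos, f_vel, f_acc, dt):
        self.kp, self.ki, self.kd = kp, ki, kd   # PID feedback gains
        self.f = (f_pos, f_vel, f_acc)           # feedforward gains
        self.dt, self.integral, self.prev_err = dt, 0.0, 0.0

    def torque(self, ref_pos, ref_vel, ref_acc, pos):
        err = ref_pos - pos
        self.integral += err * self.dt
        deriv = (err - self.prev_err) / self.dt
        self.prev_err = err
        feedback = self.kp * err + self.ki * self.integral + self.kd * deriv
        feedforward = (self.f[0] * ref_pos + self.f[1] * ref_vel
                       + self.f[2] * ref_acc)
        return feedback + feedforward

# Each joint runs its own simple controller, e.g. on a dedicated processor.
controllers = [JointController(50, 5, 10, 1.0, 0.5, 0.1, 0.001) for _ in range(6)]
```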
Le Floc’h, Simon; Tracqui, Philippe; Finet, Gérard; Gharib, Ahmed M.; Maurice, Roch L.; Cloutier, Guy; Pettigrew, Roderic I.
2016-01-01
It is now recognized that prediction of the vulnerable coronary plaque rupture requires not only an accurate quantification of fibrous cap thickness and necrotic core morphology but also a precise knowledge of the mechanical properties of plaque components. Indeed, such knowledge would allow a precise evaluation of the peak cap-stress amplitude, which is known to be a good biomechanical predictor of plaque rupture. Several studies have been performed to reconstruct a Young’s modulus map from strain elastograms. It seems that the main issue for improving such methods does not rely on the optimization algorithm itself, but rather on preconditioning requiring the best estimation of the plaque components’ contours. The present theoretical study was therefore designed to develop: 1) a preconditioning model to extract the plaque morphology in order to initiate the optimization process, and 2) an approach combining a dynamic segmentation method with an optimization procedure to highlight the modulogram of the atherosclerotic plaque. This methodology, based on the continuum mechanics theory prescribing the strain field, was successfully applied to seven intravascular ultrasound coronary lesion morphologies. The reconstructed cap thickness, necrotic core area, calcium area, and the Young’s moduli of the calcium, necrotic core, and fibrosis were obtained with mean relative errors of 12%, 4% and 1%, 43%, 32%, and 2%, respectively. PMID:19164080
Integration of Temporal and Ordinal Information During Serial Interception Sequence Learning
Gobel, Eric W.; Sanchez, Daniel J.; Reber, Paul J.
2011-01-01
The expression of expert motor skills typically involves learning to perform a precisely timed sequence of movements (e.g., language production, music performance, athletic skills). Research examining incidental sequence learning has previously relied on a perceptually-cued task that gives participants exposure to repeating motor sequences but does not require timing of responses for accuracy. Using a novel perceptual-motor sequence learning task, learning a precisely timed cued sequence of motor actions is shown to occur without explicit instruction. Participants learned a repeating sequence through practice and showed sequence-specific knowledge via a performance decrement when switched to an unfamiliar sequence. In a second experiment, the integration of representation of action order and timing sequence knowledge was examined. When either action order or timing sequence information was selectively disrupted, performance was reduced to levels similar to completely novel sequences. Unlike prior sequence-learning research that has found timing information to be secondary to learning action sequences, when the task demands require accurate action and timing information, an integrated representation of these types of information is acquired. These results provide the first evidence for incidental learning of fully integrated action and timing sequence information in the absence of an independent representation of action order, and suggest that this integrative mechanism may play a material role in the acquisition of complex motor skills. PMID:21417511
Motor skill depends on knowledge of facts
Stanley, Jason; Krakauer, John W.
2013-01-01
Those in 20th century philosophy, psychology, and neuroscience who have discussed the nature of skilled action have, for the most part, accepted the view that being skilled at an activity is independent of knowing facts about that activity, i.e., that skill is independent of knowledge of facts. In this paper we question this view of motor skill. We begin by situating the notion of skill in historical and philosophical context. We use the discussion to explain and motivate the view that motor skill depends upon knowledge of facts. This conclusion seemingly contradicts well-known results in cognitive science. It is natural, on the face of it, to take the case of H.M., the seminal case in cognitive neuroscience that led to the discovery of different memory systems, as providing powerful evidence for the independence of knowledge and skill acquisition. After all, H.M. seems to show that motor learning is retained even when previous knowledge about the activity has been lost. Improvements in skill generally require increased precision of selected actions, which we call motor acuity. Motor acuity may indeed not require propositional knowledge and has direct parallels with perceptual acuity. We argue, however, that reflection on the specifics of H.M.'s case, as well as other research on the nature of skill, indicates that learning to become skilled at a motor task, for example tennis, depends also on knowledge-based selection of the right actions. Thus skilled activity requires both acuity and knowledge, with both increasing with practice. The moral of our discussion ranges beyond debates about motor skill; we argue that it undermines any attempt to draw a distinction between practical and theoretical activities. While we will reject the independence of skill and knowledge, our discussion leaves open several different possible relations between knowledge and skill. Deciding between them is a task to be resolved by future research. PMID:24009571
Combining rules, background knowledge and change patterns to maintain semantic annotations
Cardoso, Silvio Domingos; Chantal, Reynaud-Delaître; Da Silveira, Marcos; Pruski, Cédric
2017-01-01
Knowledge Organization Systems (KOS) play a key role in enriching biomedical information in order to make it machine-understandable and shareable. This is done by annotating medical documents, or more specifically, associating concept labels from KOS with pieces of digital information, e.g., images or texts. However, the dynamic nature of KOS may impact the annotations, thus creating a mismatch between the evolved concept and the associated information. To solve this problem, methods to maintain the quality of the annotations are required. In this paper, we define a framework based on rules, background knowledge and change patterns to drive the annotation adaptation process. We evaluate the proposed approach experimentally in realistic case studies and demonstrate the overall performance of our approach in different KOS considering the precision, recall, F1-score and AUC value of the system. PMID:29854115
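For reference, the figures of merit reported above can be computed as in the following generic sketch (not the authors' evaluation code); `true` and `pred` are assumed to be sets of (document, concept) annotation pairs.

```python
# Precision, recall, and F1 over sets of (document, concept) pairs.
def prf1(true, pred):
    tp = len(true & pred)                       # correctly maintained annotations
    precision = tp / len(pred) if pred else 0.0
    recall = tp / len(true) if true else 0.0
    f1 = (2 * precision * recall / (precision + recall)
          if precision + recall else 0.0)
    return precision, recall, f1

print(prf1({("doc1", "C001"), ("doc2", "C002")},
           {("doc1", "C001"), ("doc2", "C003")}))   # (0.5, 0.5, 0.5)
```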
Digital sun sensor multi-spot operation.
Rufino, Giancarlo; Grassi, Michele
2012-11-28
The operation and testing of a multi-spot digital sun sensor for precise sun-line determination is described. The image-forming system consists of an opaque mask with multiple pinhole apertures producing multiple, simultaneous, spot-like images of the sun on the focal plane. The sun-line precision can be improved by averaging multiple simultaneous measurements. Nevertheless, sensor operation over a wide field of view requires acquiring and processing images in which the number of sun spots and their intensity levels vary widely. To this end, a reliable and robust image acquisition procedure based on a variable shutter time has been adopted, as well as a calibration function that also exploits knowledge of the sun-spot array size. The main focus of the present paper is the experimental validation of the wide-field-of-view operation of the sensor using a sensor prototype and a laboratory test facility. Results demonstrate that high measurement precision can be maintained also for large off-boresight angles.
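The averaging idea is easy to demonstrate: each pinhole spot yields an independent estimate of the same sun angle, and the mean of N such estimates has roughly 1/sqrt(N) of the single-spot scatter. The focal length, spot count, and noise level in the sketch below are assumed toy values, not the prototype's parameters.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy model of the multi-spot idea: each pinhole projects a spot whose
# centroid offset encodes the same sun angle; averaging N independent
# centroids cuts the random error by ~1/sqrt(N).
true_angle_deg = 12.0
focal_len_px = 1000.0
n_spots = 25

offset = focal_len_px * np.tan(np.radians(true_angle_deg))
measured = offset + rng.normal(scale=0.5, size=n_spots)   # 0.5 px centroid noise

angle_each = np.degrees(np.arctan(measured / focal_len_px))
print(f"single-spot scatter: {angle_each.std():.4f} deg")
print(f"averaged estimate:  {angle_each.mean():.4f} deg (true {true_angle_deg} deg)")
```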
Artificial Intelligence in Precision Cardiovascular Medicine.
Krittanawong, Chayakrit; Zhang, HongJu; Wang, Zhen; Aydar, Mehmet; Kitai, Takeshi
2017-05-30
Artificial intelligence (AI) is a field of computer science that aims to mimic human thought processes, learning capacity, and knowledge storage. AI techniques have been applied in cardiovascular medicine to explore novel genotypes and phenotypes in existing diseases, improve the quality of patient care, enable cost-effectiveness, and reduce readmission and mortality rates. Over the past decade, several machine-learning techniques have been used for cardiovascular disease diagnosis and prediction. Each problem requires some degree of understanding of the problem, in terms of cardiovascular medicine and statistics, to apply the optimal machine-learning algorithm. In the near future, AI will result in a paradigm shift toward precision cardiovascular medicine. The potential of AI in cardiovascular medicine is tremendous; however, ignorance of the challenges may overshadow its potential clinical impact. This paper gives a glimpse of AI's application in cardiovascular clinical care and discusses its potential role in facilitating precision cardiovascular medicine. Copyright © 2017 American College of Cardiology Foundation. Published by Elsevier Inc. All rights reserved.
Imprecision in the Era of Precision Medicine in Non-Small Cell Lung Cancer
Sundar, Raghav; Chénard-Poirier, Maxime; Collins, Dearbhaile Catherine; Yap, Timothy A.
2017-01-01
Over the past decade, major advances have been made in the management of advanced non-small cell lung cancer (NSCLC). There has been a particular focus on the identification and targeting of putative driver aberrations, which has propelled NSCLC to the forefront of precision medicine. Several novel molecularly targeted agents have now achieved regulatory approval, while many others are currently in late-phase clinical trial testing. These antitumor therapies have significantly impacted the clinical outcomes of advanced NSCLC and provided patients with much hope for the future. Despite this, multiple deficiencies still exist in our knowledge of this complex disease, and further research is urgently required to overcome these critical issues. This review traces the path undertaken by the different therapeutics assessed in NSCLC and the impact of precision medicine in this disease. We also discuss the areas of “imprecision” that still exist in NSCLC and the modern hypothesis-testing studies being conducted to address these key challenges. PMID:28443282
Microwave, Millimeter, Submillimeter, and Far Infrared Spectral Databases
NASA Technical Reports Server (NTRS)
Pearson, J. C.; Pickett, H. M.; Drouin, B. J.; Chen, P.; Cohen, E. A.
2002-01-01
The spectrum of most known astrophysical molecules is derived from transitions between a few hundred to a few hundred thousand energy levels populated at room temperature. In the microwave and millimeter-wave regions, spectroscopy is almost always performed with traditional microwave techniques. In the submillimeter and far infrared, microwave techniques become progressively more technologically challenging, and infrared techniques become more widely employed as the wavelength gets shorter. Infrared techniques are typically one to two orders of magnitude less precise, but they do generate all the strong features in the spectrum. With microwave techniques, it is generally impossible and rarely necessary to measure every single transition of a molecular species, so careful fitting of quantum mechanical Hamiltonians to the measured transitions is required to produce the complete spectral picture of the molecule required by astronomers. The fitting process produces the most precise data possible and is required to interpret heterodyne observations. The drawback of traditional microwave techniques is that precise knowledge of the band origins of low-lying excited states is rarely gained. The fitting of data interpolates well over the range of quantum numbers where there is laboratory data, but extrapolation is almost never precise. The majority of high-resolution spectroscopic data is at millimeter or longer wavelengths, and a very limited number of molecules have ever been studied with microwave techniques at wavelengths shorter than 0.3 millimeters. The situation with infrared techniques is similarly dire in the submillimeter and far infrared because the blackbody sources used compete with a very significant thermal background, making the signal-to-noise poor. Regardless of the technique used, the data must be archived in a way useful for the interpretation of observations.
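The Hamiltonian-fitting step can be shown in miniature for the simplest case, a linear molecule, where the J → J+1 transition frequency is nu(J) = 2B(J+1) − 4D(J+1)³ and is linear in the constants B and D; a least-squares fit to a few lines recovers the constants and lets one predict unmeasured transitions. The line list below is synthetic, with constants of CO-like magnitude.

```python
import numpy as np

# The "careful fitting" step in miniature: for a linear molecule the
# J -> J+1 rotational transition has frequency
#   nu(J) = 2*B*(J+1) - 4*D*(J+1)**3
# which is linear in B and D, so a least-squares fit recovers them.
# The frequencies below are synthetic, not real laboratory data.
J = np.arange(0, 8)
B_true, D_true = 57635.96, 0.1835                 # MHz (CO-like magnitudes)
nu_meas = 2 * B_true * (J + 1) - 4 * D_true * (J + 1) ** 3

A = np.column_stack([2 * (J + 1), -4 * (J + 1) ** 3])
B_fit, D_fit = np.linalg.lstsq(A, nu_meas, rcond=None)[0]
print(B_fit, D_fit)   # recovers B and D; extrapolate with the same formula
```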
Distal Tracheal Resection and Reconstruction: State of the Art and Lessons Learned.
Mathisen, Douglas
2018-05-01
Tracheal disease is an infrequent problem requiring surgery. A high index of suspicion is necessary to diagnose these problems correctly. Primary concerns are safe control and assessment of the airway, familiarity with the principles of airway surgery, preserving the tracheal blood supply, and avoiding anastomotic tension. A precise, reproducible anastomotic technique must be mastered. The operation requires close cooperation with a knowledgeable anesthesia team. The surgeon must understand how to achieve the least tension on the anastomosis to avoid complications. It is advisable to examine the airway before discharge to check for normal healing and airway patency. Copyright © 2018 Elsevier Inc. All rights reserved.
Paranoia.Ada: A diagnostic program to evaluate Ada floating-point arithmetic
NASA Technical Reports Server (NTRS)
Hjermstad, Chris
1986-01-01
Many essential software functions in the mission-critical computer resource application domain depend on floating-point arithmetic. Numerically intensive functions associated with the Space Station project, such as ephemeris generation or the implementation of Kalman filters, are likely to employ the floating-point facilities of Ada. Paranoia.Ada appears to be a valuable program to ensure that Ada environments and their underlying hardware exhibit the precision and correctness required to satisfy mission computational requirements. As a diagnostic tool, Paranoia.Ada reveals many essential characteristics of an Ada floating-point implementation. Equipped with such knowledge, programmers need not tremble before the complex task of floating-point computation.
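Paranoia.Ada itself is Ada code; the sketch below renders just one of the simplest probes of its kind, the determination of machine epsilon, in Python to show the flavor of such diagnostics (radix and rounding checks follow the same bisection pattern).

```python
# Paranoia probes many properties of an FP implementation; the simplest is
# the machine epsilon. A Python rendering of that one probe -- Paranoia.Ada
# itself is Ada code.
eps = 1.0
while 1.0 + eps / 2 > 1.0:
    eps /= 2
print(eps)   # 2.220446049250313e-16 for IEEE-754 binary64

import sys
assert eps == sys.float_info.epsilon   # agrees with the platform's report
```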
Aufderheide, Helge; Rudolf, Lars; Gross, Thilo; Lafferty, Kevin D.
2013-01-01
Recent attempts to predict the response of large food webs to perturbations have revealed that in larger systems increasingly precise information on the elements of the system is required. Thus, the effort needed for good predictions grows quickly with the system's complexity. Here, we show that not all elements need to be measured equally well, suggesting that a more efficient allocation of effort is possible. We develop an iterative technique for determining an efficient measurement strategy. In model food webs, we find that it is most important to precisely measure the mortality and predation rates of long-lived, generalist, top predators. Prioritizing the study of such species will make it easier to understand the response of complex food webs to perturbations.
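The underlying logic can be sketched with a toy press-perturbation calculation: the response of a community with Jacobian J to a sustained perturbation is −inv(J) @ press, and jittering each Jacobian entry in turn shows how unevenly the prediction error depends on which element is mismeasured. The 4-species Jacobian below is an arbitrary stable example, not a fitted food web.

```python
import numpy as np

# Press-perturbation response: -inv(J) @ press. Perturb each entry of J in
# turn and see how far the predicted response moves -- entries differ widely
# in influence, so measurement effort should go where sensitivity is highest.
J = np.array([[-1.0, -0.5,  0.0, -0.3],
              [ 0.4, -1.2, -0.6,  0.0],
              [ 0.0,  0.5, -0.9, -0.4],
              [ 0.2,  0.0,  0.3, -1.1]])
press = np.array([0.1, 0.0, 0.0, 0.0])
base = -np.linalg.solve(J, press)

sens = np.zeros_like(J)
for i in range(4):
    for j in range(4):
        Jp = J.copy()
        Jp[i, j] += 0.01                      # small error in one element
        sens[i, j] = np.linalg.norm(-np.linalg.solve(Jp, press) - base)
print(np.round(sens / sens.max(), 2))         # relative importance of each entry
```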
Exploring Flavor Physics with Lattice QCD
NASA Astrophysics Data System (ADS)
Du, Daping; Fermilab/MILC Collaborations Collaboration
2016-03-01
The Standard Model has been a very good description of subatomic particle physics. In the search for physics beyond the Standard Model in the context of flavor physics, it is important to sharpen our probes using some gold-plated processes (such as B rare decays), which requires knowledge of the input parameters, such as the Cabibbo-Kobayashi-Maskawa (CKM) matrix elements and other nonperturbative quantities, with sufficient precision. Lattice QCD is so far the only first-principles method that can compute these quantities with competitive and systematically improvable precision using state-of-the-art simulation techniques. I will discuss the recent progress of lattice QCD calculations of some of these nonperturbative quantities and their applications in flavor physics. I will also discuss the implications and future perspectives of these calculations in flavor physics.
Smíd, Michal; Ferda, Jirí; Baxa, Jan; Cech, Jakub; Hájek, Tomás; Kreuzberg, Boris; Rokyta, Richard
2010-04-01
Precise determination of the aortic annulus size constitutes an integral part of the preoperative evaluation prior to aortic valve replacement and enables the estimation of the size of the prosthesis to be implanted. Knowledge of the size of the ascending aorta is required in the preoperative analysis, and monitoring of its dilation enables precise timing of the operation. Our goal was to compare the precision of measurement of the aortic annulus and ascending aorta using magnetic resonance (MR), multidetector-row computed tomography (MDCT), transthoracic echocardiography (TTE), and transoesophageal echocardiography (TEE) in patients with degenerative aortic stenosis. A total of 15 patients scheduled for aortic valve replacement were enrolled into this prospective study. TTE was performed in all patients and was supplemented with TEE, CT and MR in the majority of patients. The values obtained were compared with perioperative measurements. For the measurement of the aortic annulus, MR was found to be the most precise technique, followed by MDCT, TTE, and TEE. For the measurement of the ascending aorta, MR again was found to be the most precise technique, followed by MDCT, TEE, and TTE. In our study, magnetic resonance was found to be the most precise technique for the measurement of the aortic annulus and ascending aorta in patients with severe degenerative aortic stenosis. Copyright (c) 2010 Elsevier Ireland Ltd. All rights reserved.
Heliostat kinematic system calibration using uncalibrated cameras
NASA Astrophysics Data System (ADS)
Burisch, Michael; Gomez, Luis; Olasolo, David; Villasante, Cristobal
2017-06-01
The efficiency of the solar field greatly depends on the ability of the heliostats to precisely reflect solar radiation onto a central receiver. Controlling the heliostats with such precision requires accurate knowledge of the motion of each of them, modeled as a kinematic system. Determining the parameters of this system for each heliostat through a calibration system is crucial for the efficient operation of the solar field. For small heliostats, being able to perform such a calibration in a fast and automatic manner is imperative, as the solar field may contain tens or even hundreds of thousands of them. A calibration system that can rapidly recalibrate a whole solar field would also reduce costs. Heliostats are generally designed to provide stability over a long period of time; if this requirement can be relaxed and any resulting errors compensated by adapting the parameters of a model, the cost of the heliostats can be reduced. The presented method describes such an automatic calibration system using uncalibrated cameras rigidly attached to each heliostat. The cameras are used to observe targets spread throughout the solar field; based on this, the kinematic system of each heliostat can be estimated with high precision. A comparison of this approach with similar solutions shows the viability of the proposed solution.
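The shape of the estimation problem is conveyed by the deliberately simplified sketch below: a two-axis model with unknown angular offsets is fitted by least squares to many target observations. The pure-offset model and noise level are assumptions; the paper's kinematic parameterization is richer.

```python
import numpy as np
from scipy.optimize import least_squares

# Simplified calibration: a two-axis kinematic model with unknown offsets
# maps commanded angles to the directions in which targets are observed;
# fitting the offsets over many observations recovers them.
rng = np.random.default_rng(3)
true_offsets = np.array([0.8, -0.5])                # degrees, azimuth & elevation

cmd = rng.uniform(-60, 60, size=(100, 2))           # commanded az/el angles
obs = cmd + true_offsets + rng.normal(scale=0.05, size=cmd.shape)

residual = lambda p: (cmd + p - obs).ravel()
fit = least_squares(residual, x0=np.zeros(2))
print(fit.x)                                        # ~ [0.8, -0.5]
```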
Duewer, D L; Lalonde, S A; Aubin, R A; Fourney, R M; Reeder, D J
1998-05-01
Knowledge of the expected uncertainty in restriction fragment length polymorphism (RFLP) measurements is required for confident exchange of such data among different laboratories. The total measurement uncertainty among all Technical Working Group for DNA Analysis Methods laboratories has previously been characterized and found to be acceptably small. Casework cell line control measurements provided by six Royal Canadian Mounted Police (RCMP) and 30 U.S. commercial, local, state, and Federal forensic laboratories enable quantitative determination of the within-laboratory precision and among-laboratory concordance components of measurement uncertainty typical of both sets of laboratories. Measurement precision is the same in the two countries for DNA fragments of size 1000 base pairs (bp) to 10,000 bp. However, the measurement concordance among the RCMP laboratories is clearly superior to that within the U.S. forensic community. This result is attributable to the use of a single analytical protocol in all RCMP laboratories. Concordance among U.S. laboratories cannot be improved through simple mathematical adjustments. Community-wide efforts focused on improved concordance may be the most efficient mechanism for further reduction of among-laboratory RFLP measurement uncertainty, should the resources required to fully evaluate potential cross-jurisdictional matches become burdensome as the number of RFLP profiles on record increases.
Castaneda, Christian; Nalley, Kip; Mannion, Ciaran; Bhattacharyya, Pritish; Blake, Patrick; Pecora, Andrew; Goy, Andre; Suh, K Stephen
2015-01-01
As research laboratories and clinics collaborate to achieve precision medicine, both communities are required to understand mandated electronic health/medical record (EHR/EMR) initiatives that will be fully implemented in all clinics in the United States by 2015. Stakeholders will need to evaluate current record keeping practices and optimize and standardize methodologies to capture nearly all information in digital format. Collaborative efforts from academic and industry sectors are crucial to achieving higher efficacy in patient care while minimizing costs. Currently existing digitized data and information are present in multiple formats and are largely unstructured. In the absence of a universally accepted management system, departments and institutions continue to generate silos of information. As a result, invaluable and newly discovered knowledge is difficult to access. To accelerate biomedical research and reduce healthcare costs, clinical and bioinformatics systems must employ common data elements to create structured annotation forms enabling laboratories and clinics to capture sharable data in real time. Conversion of these datasets to knowable information should be a routine institutionalized process. New scientific knowledge and clinical discoveries can be shared via integrated knowledge environments defined by flexible data models and extensive use of standards, ontologies, vocabularies, and thesauri. In the clinical setting, aggregated knowledge must be displayed in user-friendly formats so that physicians, non-technical laboratory personnel, nurses, data/research coordinators, and end-users can enter data, access information, and understand the output. The effort to connect astronomical numbers of data points, including '-omics'-based molecular data, individual genome sequences, experimental data, patient clinical phenotypes, and follow-up data is a monumental task. Roadblocks to this vision of integration and interoperability include ethical, legal, and logistical concerns. Ensuring data security and protection of patient rights while simultaneously facilitating standardization is paramount to maintaining public support. The capabilities of supercomputing need to be applied strategically. A standardized, methodological implementation must be applied to developed artificial intelligence systems with the ability to integrate data and information into clinically relevant knowledge. Ultimately, the integration of bioinformatics and clinical data in a clinical decision support system promises precision medicine and cost effective and personalized patient care.
[The pathology of salivary glands. Tumors of the salivary glands].
Mahy, P; Reychler, H
2006-01-01
The management of benign and malignant neoplasms of the salivary glands requires precise knowledge of tumor histogenesis and classification as well as surgical skills. Pleomorphic adenoma and Warthin's tumor are the most frequent tumors of the parotid glands, while the probability of malignant tumors is higher in the other glands, especially the sublingual and minor salivary glands. These malignant salivary gland tumors are rare and necessitate multidisciplinary staging and management in close collaboration with the pathologist and the radiation oncologist.
Precision GPS ephemerides and baselines
NASA Technical Reports Server (NTRS)
1992-01-01
The required accuracy of Global Positioning System (GPS) satellite position knowledge varies with the application. Relative positioning of receiver locations on the ground to infer Earth's tectonic plate motion requires the most accurate knowledge of the GPS satellite orbits. Research directed toward improving and evaluating the accuracy of GPS satellite orbits was conducted at the University of Texas Center for Space Research (CSR). Understanding and modeling the forces acting on the satellites was a major focus of the research. Other aspects of orbit determination, such as the reference frame, time system, measurement modeling, and parameterization, were also investigated. Gravitational forces were modeled by truncated versions of extant gravity fields such as the Goddard Earth Model (GEM-L2), GEM-T1, and TEG-2, and by third-body perturbations due to the Sun and Moon. Nongravitational forces considered were solar radiation pressure and perturbations due to thermal venting and thermal imbalance. At the GPS satellite orbit accuracy level required for crustal dynamics applications, models for the nongravitational perturbations play a critical role, since the gravitational forces are well understood and are modeled adequately for GPS satellite orbits.
Orthonormal filters for identification in active control systems
NASA Astrophysics Data System (ADS)
Mayer, Dirk
2015-12-01
Many active noise and vibration control systems require models of the control paths. When the controlled system changes slightly over time, adaptive digital filters for the identification of the models are useful. This paper investigates a special class of adaptive digital filters: orthonormal filter banks offer the robust and simple adaptation of the widely applied finite impulse response (FIR) filters, but at a lower model order, which is important when considering implementation on embedded systems. However, the filter banks require prior knowledge about the resonance frequencies and damping of the structure. This knowledge must be assumed to be of limited precision, since uncertainties in the structural parameters exist in many practical systems. In this work, a procedure using a number of training systems to find the fixed parameters of the filter banks is applied. The effect of uncertainties in the prior knowledge on the model error is examined both with a basic example and in an experiment. Furthermore, the possibility of compensating for imprecise prior knowledge with a higher filter order is investigated. Comparisons with FIR filters are also made in order to assess the possible advantages of the orthonormal filter banks. Numerical and experimental investigations show that significantly lower computational effort can be achieved by the filter banks under certain conditions.
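A Laguerre filter bank, the simplest member of the orthonormal family discussed above, can be built in a few lines: a single pole encodes the prior knowledge of the dominant dynamics, and only the output weights are identified. The pole value, the example plant, and the one-shot least-squares fit (standing in for a running LMS adaptation) below are illustrative assumptions.

```python
import numpy as np
from scipy.signal import lfilter

# Laguerre filter bank: first section sqrt(1-a^2)/(1 - a z^-1), each later
# section multiplies by the all-pass (-a + z^-1)/(1 - a z^-1). The pole a
# carries the (imprecise) prior knowledge of the plant dynamics.
a = 0.6                                   # assumed prior pole, |a| < 1
n_filters, n = 8, 2000
rng = np.random.default_rng(4)
u = rng.normal(size=n)                    # excitation signal

outputs = []
x = lfilter([np.sqrt(1 - a**2)], [1, -a], u)
for _ in range(n_filters):
    outputs.append(x)
    x = lfilter([-a, 1], [1, -a], x)      # cascade one all-pass section
Phi = np.array(outputs).T                 # regressor matrix

# "Plant" to identify (arbitrary stable example), then fit the weights.
y = lfilter([0.2, 0.1], [1, -1.1, 0.57], u)
w = np.linalg.lstsq(Phi, y, rcond=None)[0]
print(np.round(w, 3))
```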
Lepage, D; Parratte, B; Tatu, L; Vuiller, F; Monnier, G
2005-12-01
Hypertonia of the upper limb due to spasticity causes pronation of the forearm and flexion of the wrist and fingers. Nowadays this spasticity is often treated with injections of botulinum toxin and sometimes with selective fascicular neurotomy. To perform this microsurgical technique correctly, precise knowledge of the extramuscular nerve branching is necessary in order to better select the motor branches that supply the muscles involved in spasticity. The same knowledge is required for botulinum toxin injections, which must be made as near as possible to the zones where the intramuscular nerve endings are densest, which is also where neuromuscular junctions are most numerous. It is therefore necessary to know these zones better, but current knowledge of them remains imprecise. The muscles of the anterior compartment of 30 forearms were dissected, first macroscopically, then microscopically, to study the extra- and intramuscular nerve supply and the distribution of the terminal nerve ramifications. The results were then linked to surface topographical landmarks to indicate the precise location of the motor branches for each muscle, with the aim of proposing appropriate surgical approaches for selective neurotomies. For each muscle, the zones with the highest density of nerve endings were then divided into segments, thus determining the optimal zones for botulinum toxin injections.
H3Africa: current perspectives
Mulder, Nicola; Abimiku, Alash’le; Adebamowo, Sally N; de Vries, Jantina; Matimba, Alice; Olowoyo, Paul; Ramsay, Michele; Skelton, Michelle; Stein, Dan J
2018-01-01
Precision medicine is being enabled in high-income countries by the growing availability of health data, increasing knowledge of the genetic determinants of disease and variation in response to treatment (pharmacogenomics), and the decreasing costs of data generation, which promote routine application of genomic technologies in the health sector. However, there is uncertainty about the feasibility of applying precision medicine approaches in low- and middle-income countries, due to the lack of population-specific knowledge, skills, and resources. The Human Heredity and Health in Africa (H3Africa) initiative was established to drive new research into the genetic and environmental basis for human diseases of relevance to Africans as well as to build capacity for genomic research on the continent. Precision medicine requires this capacity, in addition to reference data on local populations, and skills to analyze and interpret genomic data from the bedside. The H3Africa consortium is collectively processing samples and data for over 70,000 participants across the continent, accompanied in most cases by rich clinical information on a variety of non-communicable and infectious diseases. These projects are increasingly providing novel insights into the genetic basis of diseases in indigenous populations, insights that have the potential to drive the development of new diagnostics and treatments. The consortium has also invested significant resources into establishing high-quality biorepositories in Africa, a bioinformatic network, and a strong training program that has developed skills in genomic data analysis and interpretation among bioinformaticians, wet-lab researchers, and health-care professionals. Here, we describe the current perspectives of the H3Africa consortium and how it can contribute to making precision medicine in Africa a reality. PMID:29692621
Towards precision medicine; a new biomedical cosmology.
Vegter, M W
2018-02-10
Precision Medicine has become a common label for data-intensive and patient-driven biomedical research. Its intended future is reflected in endeavours such as the Precision Medicine Initiative in the USA. This article addresses the question whether it is possible to discern a new 'medical cosmology' in Precision Medicine, a concept that was developed by Nicholas Jewson to describe comprehensive transformations involving various dimensions of biomedical knowledge and practice, such as vocabularies, the roles of patients and physicians and the conceptualisation of disease. Subsequently, I will elaborate my assessment of the features of Precision Medicine with the help of Michel Foucault, by exploring how precision medicine involves a transformation along three axes: the axis of biomedical knowledge, of biomedical power and of the patient as a self. Patients are encouraged to become the managers of their own health status, while the medical domain is reframed as a data-sharing community, characterised by changing power relationships between providers and patients, producers and consumers. While the emerging Precision Medicine cosmology may surpass existing knowledge frameworks; it obscures previous traditions and reduces research-subjects to mere data. This in turn, means that the individual is both subjected to the neoliberal demand to share personal information, and at the same time has acquired the positive 'right' to become a member of the data-sharing community. The subject has to constantly negotiate the meaning of his or her data, which can either enable self-expression, or function as a commanding Superego.
Current knowledge of microRNA-mediated regulation of drug metabolism in humans.
Nakano, Masataka; Nakajima, Miki
2018-05-01
Understanding the factors causing inter- and intra-individual differences in drug metabolism potencies is required for the practice of personalized or precision medicine, as well as for the promotion of efficient drug development. The expression of drug-metabolizing enzymes is controlled by transcriptional regulation by nuclear receptors and transcription factors, epigenetic regulation such as DNA methylation and histone acetylation, and post-translational modification. In addition to such regulation mechanisms, recent studies revealed that microRNAs (miRNAs), endogenous ~22-nucleotide non-coding RNAs that regulate gene expression through the translational repression and degradation of mRNAs, significantly contribute to post-transcriptional regulation of drug-metabolizing enzymes. Areas covered: This review summarizes the current knowledge regarding miRNA-dependent regulation of drug-metabolizing enzymes and transcription factors and its physiological and clinical significance. We also describe recent advances in miRNA-dependent regulation research, showing that the presence of pseudogenes, single-nucleotide polymorphisms, and RNA editing affects miRNA targeting. Expert opinion: It is an unwavering fact that miRNAs are critical factors causing inter- and intra-individual differences in the expression of drug-metabolizing enzymes. Consideration of miRNA-dependent regulation would be a helpful tool for optimizing personalized and precision medicine.
Rudimentary Cleaning Compared to Level 300A
NASA Technical Reports Server (NTRS)
Arpin, Christina Y. Pina; Stoltzfus, Joel
2012-01-01
A study was done to characterize the cleanliness level achievable when using a rudimentary cleaning process, and results were compared to JPR 5322.1G Level 300A. While it is not ideal to clean in a shop environment, some situations (e.g., field combat operations) require oxygen system hardware to be maintained and cleaned to prevent a fire hazard, even though it cannot be sent back to a precision cleaning facility. This study measured the effectiveness of basic shop cleaning. Initially, three items representing parts of an oxygen system were contaminated: a metal plate, valve body, and metal oxygen bottle. The contaminants chosen were those most likely to be introduced to the system during normal use: oil, lubricant, metal shavings/powder, sand, fingerprints, tape, lip balm, and hand lotion. The cleaning process used hot water, soap, various brushes, gaseous nitrogen, water nozzle, plastic trays, scouring pads, and a controlled shop environment. Test subjects were classified into three groups: technical professionals having an appreciation for oxygen hazards; professional precision cleaners; and a group with no previous professional knowledge of oxygen or precision cleaning. Three test subjects were in each group, and each was provided with standard cleaning equipment, a cleaning procedure, and one of each of the three test items to clean. The results indicated that the achievable cleanliness level was independent of the technical knowledge or proficiency of the personnel cleaning the items. Results also showed that achieving a Level 300 particle count was more difficult than achieving a Level A nonvolatile residue amount.
Mapping the knowledge utilization field in nursing from 1945 to 2004: a bibliometric analysis.
Scott, Shannon D; Profetto-McGrath, Joanne; Estabrooks, Carole A; Winther, Connie; Wallin, Lars; Lavis, John N
2010-12-01
The field of knowledge utilization has been hampered by several issues including: the synonymous use of multiple terms with little attempt at definition precision; an overexamination of knowledge utilization as product, rather than a process; and a lack of progress to cross disciplinary boundaries to advance knowledge development. In order to address the challenges and current knowledge gaps in the knowledge utilization field in nursing, a comprehensive picture of the current state of the field is required. Bibliometric analyses were used to map knowledge utilization literature in nursing as an international field of study, and to identify the structure of its scientific community. Analyses of bibliographic data for 433 articles from the period 1945-2004 demonstrated three trends: (1) there has been significant recent growth and interest in this field, (2) the structure of the scientific knowledge utilization community is evolving, and (3) the Web of Science does not index the majority of journals where this literature is published. In order to enhance the accessibility and profile of this literature, and nursing's scientific literature at large, we encourage the International Academy of Nursing Editors to work collaboratively to increase the number of journals indexed in the Web of Science. ©2010 Sigma Theta Tau International.
Shi, Longxiang; Li, Shijian; Yang, Xiaoran; Qi, Jiaheng; Pan, Gang; Zhou, Binbin
2017-01-01
With the explosion of healthcare information, there has been a tremendous amount of heterogeneous textual medical knowledge (TMK), which plays an essential role in healthcare information systems. Existing works for integrating and utilizing the TMK mainly focus on establishing straightforward connections and pay less attention to making computers interpret and retrieve knowledge correctly and quickly. In this paper, we explore a novel model to organize and integrate the TMK into conceptual graphs. We then employ a framework to automatically retrieve knowledge in knowledge graphs with high precision. In order to perform reasonable inference on knowledge graphs, we propose a contextual inference pruning algorithm to achieve efficient chain inference. Our algorithm achieves a better inference result with precision and recall of 92% and 96%, respectively, which avoids most meaningless inferences. In addition, we implement two prototypes and provide services, and the results show our approach is practical and effective. PMID:28299322
Precise Determination of the Zero-Gravity Surface Figure of a Mirror without Gravity-Sag Modeling
NASA Technical Reports Server (NTRS)
Bloemhof, Eric E.; Lam, Jonathan C.; Feria, V. Alfonso; Chang, Zensheu
2007-01-01
The zero-gravity surface figure of optics used in spaceborne astronomical instruments must be known to high accuracy, but earthbound metrology is typically corrupted by gravity sag. Generally, inference of the zero-gravity surface figure from a measurement made under normal gravity requires finite-element analysis (FEA), and for accurate results the mount forces must be well characterized. We describe how to infer the zero-gravity surface figure very precisely using the alternative classical technique of averaging pairs of measurements made with the direction of gravity reversed. We show that mount forces as well as gravity must be reversed between the two measurements and discuss how the St. Venant principle determines when a reversed mount force may be considered to be applied at the same place in the two orientations. Our approach requires no finite-element modeling and no detailed knowledge of mount forces other than the fact that they reverse and are applied at the same point in each orientation. If mount schemes are suitably chosen, zero-gravity optical surfaces may be inferred much more simply and more accurately than with FEA.
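The cancellation at the heart of the technique is one line of algebra, reproduced numerically below with a synthetic one-dimensional "surface map": with gravity and mount forces exactly reversed, the sag term flips sign while the true figure does not, so the average of the two measurements is the zero-gravity figure.

```python
import numpy as np

# The two-orientation trick:
#   m_up   = figure + sag
#   m_down = figure - sag
#   (m_up + m_down) / 2 = figure
# Synthetic 1-D "surface maps" for illustration only.
x = np.linspace(-1, 1, 101)
figure = 40e-9 * np.cos(3 * np.pi * x)        # true zero-g figure error (m)
sag = 150e-9 * (1 - x**2)                     # gravity + mount deflection (m)

m_up, m_down = figure + sag, figure - sag
recovered = 0.5 * (m_up + m_down)
print(np.allclose(recovered, figure))          # True -- sag cancels exactly
```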
[Proton imaging applications for proton therapy: state of the art].
Amblard, R; Floquet, V; Angellier, G; Hannoun-Lévi, J M; Hérault, J
2015-04-01
Proton therapy allows highly precise tumour-volume irradiation with a low dose delivered to healthy tissues. The steep dose gradients observed and the high treatment conformity require precise knowledge of the proton range in matter and of the target volume position relative to the beam. Proton imaging thus allows an improvement in treatment accuracy and, thereby, in treatment quality. Initially suggested in 1963, radiographic imaging with protons is still not used in clinical routine. The principal difficulty is the lack of spatial resolution induced by the multiple Coulomb scattering of protons by nuclei. Moreover, its realization for all clinical locations requires relatively high energies not previously considered for clinical routine. Abandoned for some time in favor of X-ray technologies, proton imaging methods are back in the news because of the increase in proton radiation therapy centers around the world. This article presents a non-exhaustive state of the art in proton imaging. Copyright © 2015 Société française de radiothérapie oncologique (SFRO). Published by Elsevier SAS. All rights reserved.
NASA Technical Reports Server (NTRS)
Brink, Jeffrey S.
2005-01-01
The space shuttle Aft Propulsion System (APS) pod requires precision alignment to be installed onto the orbiter deck. The Ground Support Equipment (GSE) used to perform this task cannot be manipulated along a single Cartesian axis without causing motion along the other Cartesian axes. As a result, manipulations required to achieve a desired motion are not intuitive. My study calculated the joint angles required to align the APS pod, using reverse kinematic analysis techniques. Knowledge of these joint angles will allow the ground support team to align the APS pod more safely and efficiently. An uncertainty analysis was also performed to estimate the accuracy associated with this approach and to determine whether any inexpensive modifications can be made to further improve accuracy.
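The flavor of such a reverse-kinematics computation is shown below for the closed-form planar two-link case, an assumption chosen purely for illustration; the actual GSE has more axes and the cross-axis coupling described above.

```python
import numpy as np

# Closed-form inverse kinematics for a planar two-link mechanism: solve for
# the joint angles that place the end point at a desired (x, y).
def ik_two_link(x, y, l1, l2):
    c2 = (x**2 + y**2 - l1**2 - l2**2) / (2 * l1 * l2)
    t2 = np.arccos(np.clip(c2, -1.0, 1.0))            # elbow angle
    t1 = np.arctan2(y, x) - np.arctan2(l2 * np.sin(t2), l1 + l2 * np.cos(t2))
    return t1, t2

t1, t2 = ik_two_link(1.2, 0.5, 1.0, 1.0)
# Forward check: the angles reproduce the requested position.
x = np.cos(t1) + np.cos(t1 + t2)
y = np.sin(t1) + np.sin(t1 + t2)
print(round(x, 3), round(y, 3))                        # 1.2 0.5
```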
[Anatomy of the liver: what you need to know].
Lafortune, M; Denys, A; Sauvanet, A; Schmidt, S
2007-01-01
A precise knowledge of arterial, portal, hepatic and biliary anatomical variations is mandatory when a liver intervention is planned. However, only certain variations need to be looked for when a given intervention is planned. The basic liver anatomy as well as the most relevant malformations are detailed.
What Is Trust? Ethics and Risk Governance in Precision Medicine and Predictive Analytics.
Adjekum, Afua; Ienca, Marcello; Vayena, Effy
2017-12-01
Trust is a ubiquitous term used in emerging technology (e.g., Big Data, precision medicine), innovation policy, and governance literatures in particular. But what exactly is trust? Even though trust is considered a critical requirement for the successful deployment of precision medicine initiatives, there is a need for further conceptualization of what qualifies as trust and what factors might establish and sustain trust in precision medicine, predictive analytics, and large-scale biology. These new fields of 21st century medicine and health often deal with "futures", and hence trust gains a temporal and ever-present quality, for both the present and the futures anticipated by new technologies and predictive analytics. We address these conceptual gaps, which have important practical implications for the way we govern risk and unknowns associated with emerging technologies in biology, medicine, and health broadly. We provide an in-depth conceptual analysis and an operative definition of trust dynamics in precision medicine. In addition, we identify three main types of "trust facilitators": (1) technical, (2) ethical, and (3) institutional. This three-dimensional framework is necessary for building and maintaining trust in 21st century knowledge-based innovations in which governments and publics invest for progressive societal change, development, and sustainable prosperity. Importantly, we analyze, identify, and deliberate on the dimensions of precision medicine and large-scale biology that have carved out trust as a tool pertinent to their success. Moving forward, we propose a "points to consider" list on how best to enhance trust in precision medicine and predictive analytics.
NASA Technical Reports Server (NTRS)
Kawa, Stephan R.; Baker, David Frank; Schuh, Andrew E.; Abshire, James Brice; Browell, Edward V.; Michalak, Anna M.
2012-01-01
The NASA ASCENDS mission (Active Sensing of CO2 Emissions over Nights, Days, and Seasons) is envisioned as the next generation of dedicated, space-based CO2 observing systems, currently planned for launch in about the year 2022. Recommended by the US National Academy of Sciences Decadal Survey, active (lidar) sensing of CO2 from space has several potentially significant advantages over current and planned passive CO2 instruments that promise to advance CO2 measurement capability and carbon cycle understanding into the next decade. Assessment and testing of possible lidar instrument technologies indicates that such sensors are more than feasible; however, the measurement precision and accuracy requirements remain at unprecedented levels of stringency. It is therefore important to quantitatively and consistently evaluate the measurement capabilities and requirements for the prospective active system in the context of advancing our knowledge of carbon flux distributions and their dependence on underlying physical processes. This amounts to establishing minimum requirements for precision, relative accuracy, spatial/temporal coverage and resolution, vertical information content, and interferences, and possibly the tradeoffs among these parameters, while at the same time framing a mission that can be implemented within a constrained budget. Here, we present results of observing system simulation studies, commissioned by the ASCENDS Science Requirements Definition Team, for a range of possible mission implementation options that are intended to substantiate science measurement requirements for a laser-based CO2 space instrument.
Data and Time Transfer Using SONET Radio
NASA Technical Reports Server (NTRS)
Graceffo, Gary M.
1996-01-01
The need for precise knowledge of time and frequency has become ubiquitous throughout our society. Astronomy, navigation, and high-speed wide-area networks are a few of the many consumers of this type of information. The Global Positioning System (GPS) has the potential to be the most comprehensive source of precise timing information developed to date; however, the introduction of selective availability has made it difficult for many users to recover this information from GPS with the precision required for today's systems. The system described in this paper is a Synchronous Optical NETwork (SONET) Radio Data and Time Transfer System. Its objective is to provide precise time and frequency information to a variety of end users using a two-way data and time-transfer system. Although time and frequency transfers have been done for many years, this system is unique in that the time and frequency information is embedded into existing communications traffic. This eliminates the need to make the transfer of time and frequency information a dedicated function of the communications system. SONET has been selected as the transport format from which precise time is derived because of its high data rates and its increasing acceptance throughout the industry. This paper details a proof-of-concept initiative to perform embedded time and frequency transfers using SONET Radio.
Localizing on-scalp MEG sensors using an array of magnetic dipole coils.
Pfeiffer, Christoph; Andersen, Lau M; Lundqvist, Daniel; Hämäläinen, Matti; Schneiderman, Justin F; Oostenveld, Robert
2018-01-01
Accurate estimation of the neural activity underlying magnetoencephalography (MEG) signals requires co-registration, i.e., determination of the position and orientation of the sensors with respect to the head. In modern MEG systems, an array of hundreds of low-Tc SQUID sensors is used to localize a set of small, magnetic dipole-like (head-position indicator, HPI) coils that are attached to the subject's head. With accurate prior knowledge of the positions and orientations of the sensors with respect to one another, the HPI coils can be localized with high precision, and thereby the positions of the sensors in relation to the head. With advances in magnetic field sensing technologies, e.g., high-Tc SQUIDs and optically pumped magnetometers (OPM), that require less extreme operating temperatures than low-Tc SQUID sensors, on-scalp MEG is on the horizon. To utilize the full potential of on-scalp MEG, flexible sensor arrays are preferable. Conventional co-registration is impractical for such systems, as the relative positions and orientations of the sensors to each other are subject-specific and hence not known a priori. Herein, we present a method for co-registration of on-scalp MEG sensors. We propose to invert the conventional co-registration approach and localize the sensors relative to an array of HPI coils on the subject's head. We show that, given accurate prior knowledge of the positions of the HPI coils with respect to one another, the sensors can be localized with high precision. We simulated our method with realistic parameters and layouts for sensor and coil arrays. Results indicate that co-registration is possible with sub-millimeter accuracy, but the performance strongly depends upon a number of factors. Accurate calibration of the coils and precise determination of the positions and orientations of the coils with respect to one another are crucial. Finally, we propose methods to tackle practical challenges to further improve the method.
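The inverted localization step can be sketched compactly: with known coil positions and dipole moments, a sensor's position follows from a nonlinear least-squares fit to the signals it records from each coil. This is a simplified sketch under stated assumptions (a fixed, known sensor normal; no orientation or gain estimation, which the full method would also need):

```python
import numpy as np
from scipy.optimize import least_squares

MU0 = 4e-7 * np.pi

def dipole_field(r_sensor, r_coil, m):
    """Field of a point magnetic dipole with moment m at r_coil, seen at r_sensor."""
    d = r_sensor - r_coil
    dn = np.linalg.norm(d)
    return MU0 / (4 * np.pi) * (3 * d * (d @ m) / dn**5 - m / dn**3)

def localize_sensor(coil_pos, coil_moments, measured, x0):
    """Fit the sensor position to the measured per-coil signals.

    Assumes the sensor measures the field component along a known normal n;
    a real implementation would estimate orientation and calibration too.
    """
    n = np.array([0.0, 0.0, 1.0])
    def residuals(p):
        return [dipole_field(p, rc, m) @ n - b
                for rc, m, b in zip(coil_pos, coil_moments, measured)]
    return least_squares(residuals, x0).x
```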
Kuperstein, Inna; Grieco, Luca; Cohen, David P A; Thieffry, Denis; Zinovyev, Andrei; Barillot, Emmanuel
2015-03-01
Several decades of molecular biology research have delivered a wealth of detailed descriptions of molecular interactions in normal and tumour cells. This knowledge has been functionally organised and assembled into dedicated biological pathway resources that serve as an invaluable tool, not only for structuring the information about molecular interactions but also for making it available for biological, clinical and computational studies. With the advent of high-throughput molecular profiling of tumours, close to complete molecular catalogues of mutations, gene expression and epigenetic modifications are available and require adequate interpretation. Taking into account the information about the biological signalling machinery of cells may help to better interpret molecular profiles of tumours, and making sense of these descriptions requires biological pathway resources for functional interpretation of the data. In this review, we describe the available biological pathway resources and their characteristics in terms of construction mode, focus, aims and paradigms of biological knowledge representation. We present a new resource that is focused on cancer-related signalling, the Atlas of Cancer Signalling Networks. We briefly discuss current approaches for data integration, visualisation and analysis using biological networks, such as pathway scoring, guilt-by-association and network propagation. We then illustrate with several examples the added value of data interpretation in the context of biological networks and demonstrate that it may help in the analysis of high-throughput data such as mutation, gene expression or small interfering RNA screens and can guide patient stratification. Finally, we discuss perspectives for improving precision medicine using biological network resources and tools, which may help bring precision oncology into general clinical practice.
McCoy, A B; Wright, A; Krousel-Wood, M; Thomas, E J; McCoy, J A; Sittig, D F
2015-01-01
Clinical knowledge bases of problem-medication pairs are necessary for many informatics solutions that improve patient safety, such as clinical summarization. However, developing these knowledge bases can be challenging. We sought to validate a previously developed crowdsourcing approach for generating a knowledge base of problem-medication pairs in a large, non-university health care system with a widely used, commercially available electronic health record. We first retrieved medications and problems entered in the electronic health record by clinicians during routine care over a six-month study period. Following the previously published approach, we calculated the link frequency and link ratio for each pair, then identified a threshold cutoff for estimated problem-medication pair appropriateness through clinician review; problem-medication pairs meeting the threshold were included in the resulting knowledge base. We selected 50 medications and their gold standard indications to compare the resulting knowledge base to the pilot knowledge base developed previously and to determine its recall and precision. The resulting knowledge base contained 26,912 pairs, had a recall of 62.3% and a precision of 87.5%, and outperformed the pilot knowledge base containing 11,167 pairs from the previous study, which had a recall of 46.9% and a precision of 83.3%. We validated the crowdsourcing approach for generating a knowledge base of problem-medication pairs in a large non-university health care system with a widely used, commercially available electronic health record, indicating that the approach may be generalizable across healthcare settings and clinical systems. Further research is necessary to better evaluate the knowledge, to compare crowdsourcing with other approaches, and to evaluate whether incorporating the knowledge into electronic health records improves patient outcomes.
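The co-occurrence statistics can be sketched as follows. The exact definitions of link frequency and link ratio are given in the cited prior work; the ones below (pair count, and pair count normalized by the medication's total problem links) are illustrative assumptions, as is the numeric threshold standing in for the clinician-reviewed cutoff:

```python
from collections import Counter

def build_knowledge_base(patient_records, ratio_threshold=0.05):
    """Derive candidate problem-medication pairs from EHR co-occurrences.

    patient_records: iterable of (problems, medications) per patient.
    Assumed definitions: link frequency = co-occurrence count of a pair;
    link ratio = that count divided by the medication's total links.
    """
    pair_counts, med_links = Counter(), Counter()
    for problems, medications in patient_records:
        for med in medications:
            med_links[med] += len(problems)
            for prob in problems:
                pair_counts[(prob, med)] += 1
    return {pair: count / med_links[pair[1]]
            for pair, count in pair_counts.items()
            if count / med_links[pair[1]] >= ratio_threshold}
```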
PINT, A Modern Software Package for Pulsar Timing
NASA Astrophysics Data System (ADS)
Luo, Jing; Ransom, Scott M.; Demorest, Paul; Ray, Paul S.; Stovall, Kevin; Jenet, Fredrick; Ellis, Justin; van Haasteren, Rutger; Bachetti, Matteo; NANOGrav PINT developer team
2018-01-01
Pulsar timing, first developed decades ago, has provided an extremely wide range of knowledge about our universe. It has been responsible for many important discoveries, such as the discovery of the first exoplanet and the orbital period decay of double neutron star systems. Currently, pulsar timing is the leading technique for detecting low-frequency (about 10^-9 Hz) gravitational waves (GWs), using an array of pulsars as the detectors. To achieve this goal, high-precision pulsar timing data, at roughly the nanosecond level, are required. Most high-precision pulsar timing data are analyzed using the widely adopted software TEMPO/TEMPO2. But for a robust and credible GW detection, it is important to have independent software that can cross-check the results. In this poster we present the new-generation pulsar timing software PINT. This package provides a robust system for cross-checking high-precision timing results, completely independent of TEMPO and TEMPO2. In addition, PINT is designed to be easy to extend and modify, through a flexible code architecture and a modern programming language, Python, with modern technology and libraries.
Estimation of satellite position, clock and phase bias corrections
NASA Astrophysics Data System (ADS)
Henkel, Patrick; Psychas, Dimitrios; Günther, Christoph; Hugentobler, Urs
2018-05-01
Precise point positioning with integer ambiguity resolution requires precise knowledge of satellite position, clock and phase bias corrections. In this paper, a method for the estimation of these parameters with a global network of reference stations is presented. The method processes uncombined and undifferenced measurements on an arbitrary number of frequencies, such that the obtained satellite position, clock and bias corrections can be used for any type of differenced and/or combined measurements. We perform a clustering of reference stations. The clustering ensures common satellite visibility within each cluster and enables efficient fixing of the double-difference ambiguities within each cluster. Additionally, the double-difference ambiguities between the reference stations of different clusters are fixed. We use an integer decorrelation for ambiguity fixing in dense global networks. The performance of the proposed method is analysed with both simulated Galileo measurements on E1 and E5a and real GPS measurements from the IGS network. We defined 16 clusters and obtained satellite position, clock and phase bias corrections with a precision of better than 2 cm.
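For readers unfamiliar with the differencing step, the sketch below forms double-difference carrier phases from an array of undifferenced measurements. The reference indices are arbitrary choices, and the cancellation noted in the comments holds exactly for clock terms and, for biases, to the extent they are common-mode:

```python
import numpy as np

def double_difference(phase, ref_sat=0, ref_sta=0):
    """Form double differences from an (n_stations, n_satellites) phase array.

    DD[i, j] = (phase[i, j] - phase[i, ref_sat]) - (phase[ref_sta, j] - phase[ref_sta, ref_sat]);
    satellite and receiver clock offsets cancel, leaving integer ambiguities
    (plus geometry and atmosphere) to be fixed.
    """
    sd = phase - phase[:, [ref_sat]]   # single difference across satellites
    dd = sd - sd[[ref_sta], :]         # difference again across stations
    # Drop the all-zero reference row/column.
    return np.delete(np.delete(dd, ref_sat, axis=1), ref_sta, axis=0)
```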
Heliostat calibration using attached cameras and artificial targets
NASA Astrophysics Data System (ADS)
Burisch, Michael; Sanchez, Marcelino; Olarra, Aitor; Villasante, Cristobal
2016-05-01
The efficiency of the solar field depends greatly on the ability of the heliostats to precisely reflect solar radiation onto a central receiver. Controlling the heliostats with such precision requires accurate knowledge of the motion of each of them. The motion of each heliostat can be described by a set of parameters, most notably its position and axis configuration. These parameters have to be determined individually for each heliostat during a calibration process. With the ongoing development of small-sized heliostats, the ability to perform such a calibration automatically becomes more and more crucial, as possibly hundreds of thousands of heliostats are involved. Furthermore, efficiency becomes an important factor, as small-sized heliostats potentially have to be recalibrated far more often due to the limited stability of their components. In the following we present an automatic calibration procedure using cameras attached to each heliostat, which observe different targets spread throughout the solar field. Based on a number of observations of these targets under different heliostat orientations, the parameters describing the heliostat motion can be estimated with high precision.
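The estimation step reduces to a nonlinear least-squares fit; the following sketch assumes a user-supplied forward model (a placeholder here, since the paper's kinematic model is not reproduced) that predicts where a target should appear in the heliostat camera for a given parameter vector and orientation:

```python
import numpy as np
from scipy.optimize import least_squares

def calibrate_heliostat(observations, predict_target_pixel, p0):
    """Estimate heliostat motion parameters from camera observations.

    observations: list of (orientation, observed_pixel) pairs, where
    observed_pixel is a length-2 image coordinate of a known target.
    predict_target_pixel(params, orientation): hypothetical forward model.
    """
    def residuals(params):
        return np.concatenate([predict_target_pixel(params, ori) - pix
                               for ori, pix in observations])
    return least_squares(residuals, p0).x
```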
ERIC Educational Resources Information Center
National Alliance of Business, Inc., Washington, DC.
CertainTeed's Precision Strike training program was designed to close the gap between the current state of its workforce and where that workforce needed to be to compete successfully in global markets. Precision Strike included Skills and Knowledge in Lifelong Learning (SKILL) customized, computerized lessons in basic skills, one-on-one…
The Trojan Horse Method in nuclear astrophysics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Spitaleri, C.; Mukhamedzhanov, A. M.; Blokhintsev, L. D.
2011-12-15
The study of energy production and nucleosynthesis in stars requires increasingly precise knowledge of the nuclear reaction rates at the energies of interest. To overcome the experimental difficulties arising from the small cross sections at those energies and from the presence of electron screening, the Trojan Horse Method has been introduced. The method provides a valid alternative path to measure unscreened low-energy cross sections of reactions between charged particles, and to retrieve information on the electron screening potential when ultra-low-energy direct measurements are available.
Motion measurement for synthetic aperture radar
DOE Office of Scientific and Technical Information (OSTI.GOV)
Doerry, Armin W.
Synthetic Aperture Radar (SAR) measures radar soundings from a set of locations, typically along the flight path of the radar platform vehicle. Optimal focusing requires precise knowledge of the sounding source locations in 3-D space with respect to the target scene. Even data-driven focusing techniques (i.e., autofocus) require some degree of initial fidelity in the measurements of the motion of the radar. These requirements may be quite stringent, especially for fine resolutions, long ranges, and low velocities. The principal instrument for measuring motion is typically an Inertial Measurement Unit (IMU), but these instruments have inherently limited precision and accuracy. The question is "How good does an IMU need to be for a SAR across its performance space?" This report analytically relates IMU specifications to parametric requirements for SAR.
KnowLife: a versatile approach for constructing a large knowledge graph for biomedical sciences.
Ernst, Patrick; Siu, Amy; Weikum, Gerhard
2015-05-14
Biomedical knowledge bases (KBs) have become important assets in the life sciences. Prior work on KB construction has three major limitations. First, most biomedical KBs are manually built and curated, and cannot keep up with the rate at which new findings are published. Second, for automatic information extraction (IE), the text genre of choice has been scientific publications, neglecting sources like health portals and online communities. Third, most prior work on IE has focused on the molecular level or chemogenomics only, like protein-protein interactions or gene-drug relationships, or solely addresses highly specific topics such as drug effects. We address these three limitations with a versatile and scalable approach to automatic KB construction. Using a small number of seed facts for distant supervision of pattern-based extraction, we harvest a huge number of facts in an automated manner without requiring any explicit training. We extend previous techniques for pattern-based IE with confidence statistics, and we combine this recall-oriented stage with logical reasoning for consistency constraint checking to achieve high precision. To our knowledge, this is the first method that uses consistency checking for biomedical relations. Our approach can be easily extended to incorporate additional relations and constraints. We ran extensive experiments not only on scientific publications but also on encyclopedic health portals and online communities, creating different KBs based on different configurations. We assess the size and quality of each KB in terms of number of facts and precision. The best-configured KB, KnowLife, contains more than 500,000 facts at a precision of 93% for 13 relations covering genes, organs, diseases, symptoms, treatments, as well as environmental and lifestyle risk factors. KnowLife is a large knowledge base for health and life sciences, automatically constructed from different Web sources. As a unique feature, KnowLife is harvested from different text genres such as scientific publications, health portals, and online communities. Thus, it has the potential to serve as a one-stop portal for a wide range of relations and use cases. To showcase its breadth and usefulness, we make the KnowLife KB accessible through the health portal (http://knowlife.mpi-inf.mpg.de).
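The distant-supervision stage can be caricatured in a few lines. The pattern support count below is a crude stand-in for KnowLife's confidence statistics, and the consistency-reasoning stage is omitted entirely (the regex handling assumes Python 3.7+, where re.escape leaves angle brackets unescaped):

```python
import re
from collections import Counter

def harvest(sentences, seed_facts, min_support=2):
    """Toy distant supervision: seed (e1, e2) facts -> patterns -> new pairs."""
    patterns = Counter()
    for s in sentences:                       # 1. learn patterns from seed matches
        for e1, e2 in seed_facts:
            if e1 in s and e2 in s:
                patterns[s.replace(e1, "<X>").replace(e2, "<Y>")] += 1
    frequent = [p for p, c in patterns.items() if c >= min_support]
    harvested = set()
    for s in sentences:                       # 2. apply patterns to all text
        for p in frequent:
            rx = re.escape(p).replace("<X>", "(\\w[\\w ]*)").replace("<Y>", "(\\w[\\w ]*)")
            m = re.fullmatch(rx, s)
            if m:
                harvested.add(m.groups())     # includes the seeds themselves
    return harvested
```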
Sentiment classification technology based on Markov logic networks
NASA Astrophysics Data System (ADS)
He, Hui; Li, Zhigang; Yao, Chongchong; Zhang, Weizhe
2016-07-01
With diverse online media emerging, there is growing interest in the sentiment classification problem. At present, text sentiment classification mainly relies on supervised machine learning methods, which exhibit a degree of domain dependency. On the basis of Markov logic networks (MLNs), this study proposes a cross-domain, multi-task text sentiment classification method rooted in transfer learning. Through many-to-one knowledge transfer, labeled text sentiment classification knowledge was successfully transferred to other domains, and the precision of sentiment classification in the target domain was improved. The experimental results revealed the following: (1) the MLN-based model demonstrated higher precision than individual single-domain learning models; (2) multi-task transfer learning based on Markov logic networks could acquire more knowledge than within-domain learning alone. The cross-domain text sentiment classification model can significantly improve the precision and efficiency of text sentiment classification.
Arthroscopic approach and anatomy of the hip.
Aprato, Alessandro; Giachino, Matteo; Masse, Alessandro
2016-01-01
Hip arthroscopy has gained popularity in the orthopedic community, and a precise assessment of its indications, techniques and results is continually being pursued. In this chapter the principal standard entry portals for the central and peripheral compartments are discussed. The description starts from the superficial landmarks for portal placement and continues with the deep layers. For each entry point an illustration of the main structures encountered is provided, and the principal structures at risk for the different portals are accurately examined. The articular anatomical description is carried out from the arthroscope's point of view and sub-divided into central and peripheral compartments. The two compartments are systematically analyzed and the accessible articular areas for each portal explained. Moreover, some anatomical variations that can be found in the normal hip are reported. Anatomical knowledge of the hip joint, along with a precise notion of the structures encountered with the arthroscope, is an essential requirement for safe and successful surgery. Level of evidence: V.
Knowledge Modeling in Prior Art Search
NASA Astrophysics Data System (ADS)
Graf, Erik; Frommholz, Ingo; Lalmas, Mounia; van Rijsbergen, Keith
This study explores the benefits of integrating knowledge representations in prior art patent retrieval. Key to the introduced approach is the utilization of human judgment available in the form of classifications assigned to patent documents. The paper first outlines in detail how a methodology for the extraction of knowledge from such a hierarchical classification system can be established. It then investigates potential ways of integrating this knowledge with existing Information Retrieval paradigms in a scalable and flexible manner. Finally, based on these integration strategies, the effectiveness in terms of recall and precision is evaluated in the context of a prior art search task for European patents. As a result of this evaluation it can be established that, in general, the proposed knowledge expansion techniques are particularly beneficial to recall and, with respect to optimizing field retrieval settings, further result in significant precision gains.
Investigation of outside visual cues required for low speed and hover
NASA Technical Reports Server (NTRS)
Hoh, R. H.
1985-01-01
Knowledge of the visual cues required for the performance of stabilized hover in VTOL aircraft is a prerequisite for the development of both cockpit displays and ground-based simulation systems. Attention is given here to the viability of experimental flight test techniques as the basis for identifying essential external cues in aggressive and precise low-speed and hovering tasks. The analysis and flight test program employed a helicopter and a pilot wearing lenses that could be electronically fogged, where the primary variables were field of view, large-object 'macrotexture', and fine-detail 'microtexture', in six different fields of view. Fundamental metrics are proposed for the quantification of the visual field, to allow comparisons between tests, simulations, and aircraft displays.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bechtstedt, U.; Hacker, U.; Maier, R.
1995-02-01
The past decades have seen a tremendous development in nuclear, medium-, and high-energy physics. This advance was in great part promoted by the availability of newer and more powerful instruments. Over time, these instruments grew in size as well as in sophistication and precision. Nearly all these devices had one fundamental thing in common: magnetic fields produced with currents and iron. The precision demanded by the new experiments and machines brought magnet technology to new frontiers, requiring the utmost accuracy in magnetic fields. The complex properties of iron challenged innumerable physicists in the attempt to force the magnetic fields into the desired shape. Experience and analytical insight were the pillars for coping with those problems, and only a few mastered the skills and were in addition able to communicate their intricate knowledge. It was a fortuitous situation that the authors got to know Klaus Halbach, who belonged to those few and who shared his knowledge, contributing largely to the successful completion of two large instruments built at the Forschungszentrum Juelich, KFA, for nuclear and medium-energy physics. In one case the efforts went into the large spectrometer named BIG KARL, whose design phase started in the early 70s. In the second case the work started in the early 80s with the task of building a high-precision 2.5 GeV proton accelerator for cooled, stored, and extracted beams, known as COSY-Juelich.
Orbit Determination Strategy and Simulation Performance for OSIRIS-REx Proximity Operations
NASA Technical Reports Server (NTRS)
Leonard, Jason M.; Antreasian, Peter G.; Jackman, Coralie D.; Page, Brian; Wibben, Daniel R.; Moreau, Michael C.
2017-01-01
The Origins Spectral Interpretation Resource Identification Security Regolith Explorer (OSIRIS-REx) is a NASA New Frontiers mission to the near-Earth asteroid Bennu that will rendezvous in 2018, create a comprehensive and detailed set of observations over several years, collect a regolith sample, and return the sample to Earth in 2023. The Orbit Determination (OD) team is a sub-section of the Flight Dynamics System responsible for generating precise reconstructions and predictions of the spacecraft trajectory. The OD team processes radiometric data, LIDAR, as well as center-finding and landmark-based Optical Navigation images throughout the proximity operations phase to estimate and predict the spacecraft location to within several meters. Stringent knowledge requirements stress the OD team's concept of operations and procedures to produce verified and consistent high-quality solutions for observation planning, maneuver planning, and onboard sequencing. This paper provides insight into the OD concept of operations and summarizes the OD performance expected during the approach and early proximity operation phases, based on our pre-encounter knowledge of Bennu. Strategies and methods used to compare and evaluate predicted and reconstructed solutions are detailed. High-fidelity operational tests during early 2017 will stress the team's concept of operations and ability to produce precise OD solutions with minimal turn-around delay.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fetterly, K; Mathew, V
Purpose: Transcatheter aortic valve replacement (TAVR) procedures provide a method to implant a prosthetic aortic valve via a minimally invasive, catheter-based procedure. TAVR procedures require the use of interventional fluoroscopy c-arm projection angles which are aligned with the aortic valve plane to minimize prosthetic valve positioning error due to x-ray imaging parallax. The purpose of this work is to calculate the continuous range of interventional fluoroscopy c-arm projection angles which are aligned with the aortic valve plane from a single planar image of a valvuloplasty balloon inflated across the aortic valve. Methods: Computational methods to measure the 3D angular orientation of the aortic valve were developed. Required inputs include a planar x-ray image of a known valvuloplasty balloon inflated across the aortic valve and specifications of the x-ray imaging geometry from the DICOM header of the image. A priori knowledge of the species-specific typical range of aortic orientation is required to specify the sign of the angle of the long axis of the balloon with respect to the x-ray beam. The methods were validated ex vivo and in a live pig. Results: Ex-vivo experiments demonstrated that the angular orientation of a stationary inflated valvuloplasty balloon can be measured with precision better than 1 degree. In-vivo pig experiments demonstrated that cardiac motion contributed to measurement variability, with precision better than 3 degrees. Error in specification of the x-ray geometry directly influences measurement accuracy. Conclusion: This work demonstrates that the 3D angular orientation of the aortic valve can be calculated precisely from a planar image of a valvuloplasty balloon inflated across the aortic valve and known x-ray geometry. This method could be used to determine appropriate c-arm angular projections during TAVR procedures to minimize x-ray imaging parallax and thereby minimize prosthetic valve positioning errors.
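The final geometric step admits a compact sketch. Under a hypothetical angle convention (the clinical RAO/LAO and cranial/caudal sign conventions must be checked against the actual system), the beam direction for primary angle a and secondary angle b is d = (sin a cos b, -sin b, cos a cos b) in patient coordinates; requiring d.n = 0 for the valve-plane unit normal n gives one secondary angle per primary angle:

```python
import numpy as np

def aligned_secondary_angle(primary_deg, n):
    """Secondary (cranial/caudal) angle placing the beam in the valve plane.

    Solves (nx sin a + nz cos a) cos b - ny sin b = 0 for b,
    under the hypothetical beam-direction convention stated above.
    """
    a = np.radians(primary_deg)
    return np.degrees(np.arctan2(n[0] * np.sin(a) + n[2] * np.cos(a), n[1]))

# Example: sweep the aligned-projection curve for an assumed valve normal.
n = np.array([0.30, 0.85, 0.43])
n /= np.linalg.norm(n)
for a in range(-30, 31, 10):
    print(a, round(aligned_secondary_angle(a, n), 1))
```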
Can you trust the parametric standard errors in nonlinear least squares? Yes, with provisos.
Tellinghuisen, Joel
2018-04-01
Questions about the reliability of parametric standard errors (SEs) from nonlinear least squares (LS) algorithms have led to a general mistrust of these precision estimators that is often unwarranted. The importance of non-Gaussian parameter distributions is illustrated by converting linear models to nonlinear ones by substituting e^A, ln A, and 1/A for a linear parameter a. Monte Carlo (MC) simulations characterize parameter distributions in more complex cases, including when data have varying uncertainty and should be weighted but weights are neglected. This situation leads to loss of precision and erroneous parametric SEs, as is illustrated for the Lineweaver-Burk analysis of enzyme kinetics data and the analysis of isothermal titration calorimetry data. Non-Gaussian parameter distributions are generally asymmetric and biased. However, when the parametric SE is <10% of the magnitude of the parameter, both the bias and the asymmetry can usually be ignored. Sometimes nonlinear estimators can be redefined to give more normal distributions and better convergence properties. Variable data uncertainty, or heteroscedasticity, can sometimes be handled by data transforms but more generally requires weighted LS, which in turn requires knowledge of the data variance. Parametric SEs are rigorously correct in linear LS under the usual assumptions, and are a trustworthy approximation in nonlinear LS provided they are sufficiently small, a condition favored by the abundant, precise data routinely collected in many modern instrumental methods.
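A small Monte Carlo in the spirit of the paper's simulations makes the point concrete: the LS estimate of a linear slope a is Gaussian, but reporting A = ln a yields a biased, skewed estimator once the relative SE is no longer small. The data size and noise level below are arbitrary choices:

```python
import numpy as np

rng = np.random.default_rng(1)
x = np.linspace(1.0, 10.0, 10)
a_true, sigma, n_trials = 2.0, 5.0, 100_000

# LS slope of y = a*x through the origin: a_hat = (x.y)/(x.x), exactly Gaussian here.
a_hat = np.array([x @ (a_true * x + rng.normal(0.0, sigma, x.size)) / (x @ x)
                  for _ in range(n_trials)])
A_hat = np.log(a_hat)          # nonlinear reparameterization A = ln a

rel_se = a_hat.std() / a_hat.mean()
print(f"relative SE of a: {rel_se:.1%}")                               # ~13%, i.e. >10%
print(f"mean a_hat: {a_hat.mean():.4f} (true {a_true})")               # essentially unbiased
print(f"mean A_hat: {A_hat.mean():.4f} vs ln(2) = {np.log(2.0):.4f}")  # visible bias
```

With sigma reduced so that the relative SE drops below 10%, the bias in A_hat shrinks toward negligibility, matching the paper's rule of thumb.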
NASA Astrophysics Data System (ADS)
Hong, Haibo; Yin, Yuehong; Chen, Xing
2016-11-01
Despite the rapid development of computer science and information technology, an efficient human-machine integrated enterprise information system for designing complex mechatronic products has yet to be fully realized, partly because of inharmonious communication among collaborators. One challenge in human-machine integration is therefore how to establish an appropriate knowledge management (KM) model to support the integration and sharing of heterogeneous product knowledge. To address the diversity of design knowledge, this article proposes an ontology-based model that provides an unambiguous and normative representation of knowledge. First, an ontology-based human-machine integrated design framework is described, and the corresponding ontologies and sub-ontologies are established according to different purposes and scopes. Second, a similarity-calculation-based ontology integration method composed of ontology mapping and ontology merging is introduced, and an ontology-search-based knowledge sharing method is then developed. Finally, a case study on the human-machine integrated design of a large ultra-precision grinding machine demonstrates the effectiveness of the method.
Metadynamic metainference: Enhanced sampling of the metainference ensemble using metadynamics
Bonomi, Massimiliano; Camilloni, Carlo; Vendruscolo, Michele
2016-01-01
Accurate and precise structural ensembles of proteins and macromolecular complexes can be obtained with metainference, a recently proposed Bayesian inference method that integrates experimental information with prior knowledge and deals with all sources of errors in the data as well as with sample heterogeneity. The study of complex macromolecular systems, however, requires extensive conformational sampling, which represents a separate challenge. To address this challenge and to generate structural ensembles exhaustively and efficiently, we combine metainference with metadynamics and illustrate the application of the combined approach to the calculation of the free energy landscape of the alanine dipeptide.
On HQET and NRQCD operators of dimension 8 and above
Gunawardana, Ayesh; Paz, Gil
2017-07-27
Effective field theories such as Heavy Quark Effective Theory (HQET) and Non-Relativistic Quantum Chromo(Electro)dynamics, NRQCD (NRQED), are indispensable tools for controlling the effects of the strong interaction. Increasing experimental precision requires knowledge of higher-dimensional operators. We present a general method that allows for an easy construction of HQET or NRQCD (NRQED) operators that contain two heavy-quark or non-relativistic fields and any number of covariant derivatives. As an application of our method, we list these terms in the 1/M^4 NRQCD Lagrangian, where M is the mass of the spin-half field.
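Schematically, the operators in question are heavy-field bilinears dressed with covariant derivatives. A generic form at this order (the coefficient c_X and the tensor T, built from the metric, the velocity v and spin matrices, depend on the operator basis actually chosen in the paper) is

```latex
\mathcal{L}_{1/M^4} \supset \frac{c_X}{M^4}\,
  \bar{h}_v\, iD^{\mu_1} iD^{\mu_2} iD^{\mu_3} iD^{\mu_4} iD^{\mu_5}\, h_v\;
  T_{\mu_1\mu_2\mu_3\mu_4\mu_5}
```

where h_v is the heavy field of velocity v; counting each covariant derivative as one unit of mass dimension gives a dimension-8 operator, suppressed by 1/M^4 relative to the leading term.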
Continued Development of a Precision Cryogenic Dilatometer for the James Webb Space Telescope
NASA Technical Reports Server (NTRS)
Karlmann, Paul B.; Dudik, Matthew J.; Halverson, Peter G.; Levine, Marie; Marcin, Martin; Peters, Robert D.; Shaklan, Stuart; VanBuren, David
2004-01-01
As part of the James Webb Space Telescope (JWST) materials working group, a novel cryogenic dilatometer was designed and built at NASA Jet Propulsion Laboratory to help address stringent coefficient of thermal expansion (CTE) knowledge requirements. Previously reported results and error analysis have estimated a CTE measurement accuracy for ULE of 1.7 ppb/K with a 20K thermal load and 0.1 ppb/K with a 280K thermal load. Presented here is a further discussion of the cryogenic dilatometer system and a description of recent work including system modifications and investigations.
A new initiative on precision medicine.
Collins, Francis S; Varmus, Harold
2015-02-26
President Obama has announced a research initiative that aims to accelerate progress toward a new era of precision medicine, with a near-term focus on cancers and a longer-term aim to generate knowledge applicable to the whole range of health and disease.
Development and Evaluation of a Diagnostic Documentation Support System using Knowledge Processing
NASA Astrophysics Data System (ADS)
Makino, Kyoko; Hayakawa, Rumi; Terai, Koichi; Fukatsu, Hiroshi
In this paper, we introduce a system which supports the creation of diagnostic reports. Diagnostic reports are documents in which radiologists describe the presence or absence of abnormalities in inspection images, such as CT and MRI, and summarize a patient's state and disease. Our system flags insufficiencies in reports created by younger doctors, using knowledge processing based on a medical knowledge dictionary. These indications cover not only clerical errors: the system also analyzes the purpose of the inspection and determines whether a comparison with a former inspection is required, or whether the description is incomplete. We verified the system on actual data comprising 2,233 report pairs, each pair consisting of a report written by a younger doctor and the check result of that report by an experienced doctor. The verification showed that the string-analysis rules for detecting clerical errors and sentence wordiness achieved a recall of over 90% and a precision of over 75%. Moreover, the rules based on the medical knowledge dictionary for detecting a missing required comparison with a former inspection and shortages in the description relative to the inspection purpose achieved a recall of over 70%. From these results, we confirmed that our system contributes to improving the quality of diagnostic reports. We expect that it can comprehensively support diagnostic documentation by cooperating with the interface that refers to inspection images or past reports.
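A toy version of one knowledge-based check conveys the idea: rules pair a condition on the inspection purpose with a requirement on the report body, and a report is flagged when the purpose triggers a rule the body does not satisfy. The rule contents here are invented for illustration, not taken from the system's medical knowledge dictionary:

```python
import re

# (pattern on inspection purpose, required pattern in report body, message)
RULES = [
    (r"follow[- ]?up|post-?operative", r"compar(ed|ison)",
     "purpose implies follow-up, but no comparison with a former inspection"),
    (r"metastas", r"(lymph node|liver|lung)",
     "metastasis work-up without mention of common target sites"),
]

def check_report(purpose, body):
    """Return messages for every rule whose requirement the report misses."""
    return [msg for p_rx, b_rx, msg in RULES
            if re.search(p_rx, purpose, re.I) and not re.search(b_rx, body, re.I)]

print(check_report("Follow-up of lung nodule", "No abnormality detected."))
```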
NASA Astrophysics Data System (ADS)
Ferri, F.; CMS Collaboration
2016-04-01
The precise determination of the mass, the width and the couplings of the particle discovered in 2012 with a mass around 125 GeV is of paramount importance for clarifying the nature of this particle, in particular for establishing precisely whether it is a Standard Model Higgs boson. In several new-physics scenarios the Higgs boson may behave differently from the Standard Model one, or may not be unique, i.e., there may be more than one Higgs boson. In order to achieve the precision needed to discriminate between different models, the energy resolution, the scale uncertainty and the position resolution for electrons and photons are required to be as good as possible. The CMS scintillating lead-tungstate electromagnetic calorimeter (ECAL) was built as a precise tool with exceptional energy resolution and very good position resolution, which improved over the years as knowledge of the detector grew. Moreover, since most of the lead-tungstate scintillation light is emitted within about 25 ns, the ECAL can be used to accurately determine the time of flight of photons. We present the current performance of the CMS ECAL, with special emphasis on its impact on the measurement of the properties of the Higgs boson and on searches for new physics.
Packaging films for electronic and space-related hardware
NASA Astrophysics Data System (ADS)
Shon, E. M.; Hamberg, O.
1985-08-01
Flexible packaging films are used to bag and/or wrap precision-cleaned electronic or space hardware to protect it from environmental degradation during shipping and storage. Selection of packaging films depends on a knowledge of product requirements and packaging film characteristics. The literature presently available on protective packaging films has been updated to include new materials and to amplify space-related applications. Presently available packaging film materials are compared across their various characteristics: electrostatic discharge (ESD) control, flame retardancy, water vapor transmission rate, particulate shedding, molecular contamination, and transparency. The tradeoff between product requirements and the characteristics of the available packaging films is discussed. Selection considerations are given for the application of specific materials in space-hardware-related applications. Applications for intimate, environmental, and electrostatic protective packaging are discussed.
Matter power spectrum and the challenge of percent accuracy
DOE Office of Scientific and Technical Information (OSTI.GOV)
Schneider, Aurel; Teyssier, Romain; Potter, Doug
2016-04-01
Future galaxy surveys require one percent precision in the theoretical knowledge of the power spectrum over a large range including very nonlinear scales. While this level of accuracy is easily obtained in the linear regime with perturbation theory, it represents a serious challenge for small scales where numerical simulations are required. In this paper we quantify the precision of present-day N-body methods, identifying the main potential error sources from the set-up of initial conditions to the measurement of the final power spectrum. We directly compare three widely used N-body codes, Ramses, Pkdgrav3, and Gadget3, which represent three main discretisation techniques: the particle-mesh method, the tree method, and a hybrid combination of the two. For standard run parameters, the codes agree to within one percent at k ≤ 1 h Mpc^-1 and to within three percent at k ≤ 10 h Mpc^-1. We also consider the bispectrum and show that the reduced bispectra agree at the sub-percent level for k ≤ 2 h Mpc^-1. In a second step, we quantify potential errors due to initial conditions, box size, and resolution using an extended suite of simulations performed with our fastest code, Pkdgrav3. We demonstrate that the simulation box size should not be smaller than L = 0.5 h^-1 Gpc to avoid systematic finite-volume effects (while much larger boxes are required to beat down the statistical sample variance). Furthermore, a maximum particle mass of M_p = 10^9 h^-1 M_⊙ is required to conservatively obtain one percent precision of the matter power spectrum. As a consequence, numerical simulations covering large survey volumes of upcoming missions such as DES, LSST, and Euclid will need more than a trillion particles to reproduce clustering properties at the targeted accuracy.
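For orientation, the quantity being compared can be estimated from a simulated density cube with a few lines of FFT code. This is a deliberately naive sketch: real code-comparison pipelines also deconvolve the mass-assignment window, subtract shot noise and bin more carefully:

```python
import numpy as np

def power_spectrum(delta, box_size, n_bins=32):
    """Spherically averaged P(k) of a periodic overdensity cube.

    delta: (n, n, n) overdensity field; box_size in h^-1 Mpc.
    Returns bin-center wavenumbers and the binned power estimates.
    """
    n = delta.shape[0]
    dk = np.fft.fftn(delta) * (box_size / n) ** 3      # continuum FT convention
    pk = np.abs(dk) ** 2 / box_size ** 3
    k = 2 * np.pi * np.fft.fftfreq(n, d=box_size / n)
    kmag = np.sqrt(sum(np.meshgrid(k**2, k**2, k**2, indexing="ij")))
    bins = np.linspace(0.0, kmag.max(), n_bins + 1)
    idx = np.digitize(kmag.ravel(), bins)
    sums = np.bincount(idx, weights=pk.ravel(), minlength=n_bins + 2)[1:n_bins + 1]
    counts = np.bincount(idx, minlength=n_bins + 2)[1:n_bins + 1]
    centers = 0.5 * (bins[1:] + bins[:-1])
    return centers, np.where(counts > 0, sums / np.maximum(counts, 1), np.nan)
```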
Detecting misinformation and knowledge conflicts in relational data
NASA Astrophysics Data System (ADS)
Levchuk, Georgiy; Jackobsen, Matthew; Riordan, Brian
2014-06-01
Information fusion is required for many mission-critical intelligence analysis tasks. Using knowledge extracted from various sources, including entities, relations, and events, intelligence analysts respond to commanders' information requests, integrate facts into summaries of current situations, augment existing knowledge with inferred information, make predictions about the future, and develop action plans. However, information fusion solutions often fail because of conflicting and redundant knowledge contained in multiple sources. Most knowledge conflicts in the past were due to translation errors and reporter bias, and thus could be managed. Current and future intelligence analysis, especially in denied areas, must deal with open-source data processing, where there is a much greater presence of intentional misinformation. In this paper, we describe a model for detecting conflicts in multi-source textual knowledge. Our model is based on constructing semantic graphs representing patterns of multi-source knowledge conflicts and anomalies, and detecting these conflicts by matching pattern graphs against a data graph constructed using soft co-reference between entities and events in multiple sources. The conflict detection process maintains the uncertainty throughout all phases, providing full traceability and enabling incremental updates of the detection results as new knowledge or modifications to previously analyzed information are obtained. Detected conflicts are presented to analysts for further investigation. In an experimental study with the SYNCOIN dataset, our algorithms achieved perfect conflict detection in the ideal situation (no missing data) while producing 82% recall and 90% precision in a realistic noise situation (15% missing attributes).
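The pattern-matching core can be illustrated with a toy graph query; the example relations and the hard (rather than soft, uncertainty-weighted) co-reference edge are simplifications of the described model:

```python
import networkx as nx
from networkx.algorithms import isomorphism

# Data graph: entity/event mentions from multiple sources, edges typed by relation.
data = nx.DiGraph()
data.add_edge("reportA:leader_X", "cell_1", rel="commands")
data.add_edge("reportB:leader_X", "cell_2", rel="commands")
data.add_edge("reportA:leader_X", "reportB:leader_X", rel="coref")

# Conflict pattern: two co-referent mentions commanding different cells.
pattern = nx.DiGraph()
pattern.add_edge("m1", "c1", rel="commands")
pattern.add_edge("m2", "c2", rel="commands")
pattern.add_edge("m1", "m2", rel="coref")

matcher = isomorphism.DiGraphMatcher(
    data, pattern, edge_match=lambda a, b: a["rel"] == b["rel"])
for mapping in matcher.subgraph_isomorphisms_iter():
    print("potential conflict:", mapping)   # data-node -> pattern-node assignment
```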
Swarm- Validation of Star Tracker and Accelerometer Data
NASA Astrophysics Data System (ADS)
Schack, Peter; Schlicht, Anja; Pail, Roland; Gruber, Thomas
2016-08-01
The ESA Swarm mission is designed to advance studies of the magnetosphere, thermosphere and gravity field. To succeed in this task, precise knowledge of the orientation of the Swarm satellites is required, together with knowledge of the external forces acting on the satellites. The key sensors providing this information are the star trackers and the accelerometers. Based on star tracker studies conducted by the Technical University of Denmark (DTU), we found interesting patterns in the inter-boresight angles on all three satellites, which are partly induced by temperature variations. Additionally, structures of horizontal stripes appear to be caused by the unique distribution of observed stars on the charge-coupled devices of the star trackers. Our accelerometer analyses focus on spikes and pulses in the observations. These short-term events on Swarm might originate from electrical processes introduced by sunlight illuminating the nadir foil. Comparisons to GOCE and GRACE are included.
Jóźwik, Jagoda; Kałużna-Czaplińska, Joanna
2016-01-01
Currently, analysis of various human body fluids is one of the most essential and promising approaches to enable the discovery of biomarkers or pathophysiological mechanisms for disorders and diseases. Analysis of these fluids is challenging due to their complex composition and unique characteristics. Development of new analytical methods in this field has made it possible to analyze body fluids with higher selectivity, sensitivity, and precision. The composition and concentration of analytes in body fluids are most often determined by chromatography-based techniques. There is no doubt that proper use of knowledge that comes from a better understanding of the role of body fluids requires the cooperation of scientists of diverse specializations, including analytical chemists, biologists, and physicians. This article summarizes current knowledge about the application of different chromatographic methods in analyses of a wide range of compounds in human body fluids in order to diagnose certain diseases and disorders.
NASA Astrophysics Data System (ADS)
Bloom, A. Anthony; Lauvaux, Thomas; Worden, John; Yadav, Vineet; Duren, Riley; Sander, Stanley P.; Schimel, David S.
2016-12-01
Understanding the processes controlling terrestrial carbon fluxes is one of the grand challenges of climate science. Carbon cycle process controls are readily studied at local scales, but integrating local knowledge across extremely heterogeneous biota, landforms and climate space has proven to be extraordinarily challenging. Consequently, top-down or integral flux constraints at process-relevant scales are essential to reducing process uncertainty. Future satellite-based estimates of greenhouse gas fluxes - such as CO2 and CH4 - could potentially provide the constraints needed to resolve biogeochemical process controls at the required scales. Our analysis is focused on Amazon wetland CH4 emissions, which constitute a scientifically crucial and methodologically challenging case study. We quantitatively derive the observing system (OS) requirements for testing wetland CH4 emission hypotheses at a process-relevant scale. To distinguish between hypothesized hydrological and carbon controls on Amazon wetland CH4 production, a satellite mission will need to resolve monthly CH4 fluxes at a ~333 km resolution and with a ≤10 mg CH4 m-2 day-1 flux precision. We simulate a range of low-earth orbit (LEO) and geostationary orbit (GEO) CH4 OS configurations to evaluate the ability of these approaches to meet the CH4 flux requirements. Conventional LEO and GEO missions resolve monthly ~333 km Amazon wetland fluxes at a 17.0 and 2.7 mg CH4 m-2 day-1 median uncertainty level. Improving LEO CH4 measurement precision by
Precision Airdrop (Largage de precision)
2005-12-01
…the point from various compass headings. As the tests are conducted, the resultant… rate. This approach avoids including a magnetic compass for the heading reference, which has difficulties due to local changes in the magnetic field…
NASA Astrophysics Data System (ADS)
Lachat, E.; Landes, T.; Grussenmeyer, P.
2017-08-01
Handheld 3D scanners can be used to complete large-scale models with the acquisition of occluded areas or small artefacts. This may be of interest for digitization projects in the field of Cultural Heritage, where detailed areas may require specific treatment. Such sensors have the advantage of being easily portable in the field and easily usable even without particular expertise. In this paper, the Freestyle3D handheld scanner launched on the market in 2015 by FARO is investigated. Different experiments are described, covering various topics such as the influence of range or color on the measurements, but also the precision achieved when digitizing geometric primitives. These laboratory experiments are complemented by acquisitions performed on engraved and sculpted stone blocks. This practical case study helps investigate which acquisition protocol is most appropriate and leads to precise results. The produced point clouds are compared to photogrammetric surveys to assess their accuracy.
Biomanufacturing: a US-China National Science Foundation-sponsored workshop.
Sun, Wei; Yan, Yongnian; Lin, Feng; Spector, Myron
2006-05-01
A recent US-China National Science Foundation-sponsored workshop on biomanufacturing reviewed the state-of-the-art of an array of new technologies for producing scaffolds for tissue engineering, providing precision multi-scale control of material, architecture, and cells. One broad category of such techniques has been termed solid freeform fabrication. The techniques in this category include: stereolithography, selected laser sintering, single- and multiple-nozzle deposition and fused deposition modeling, and three-dimensional printing. The precise and repetitive placement of material and cells in a three-dimensional construct at the micrometer length scale demands computer control. These novel computer-controlled scaffold production techniques, when coupled with computer-based imaging and structural modeling methods for the production of the templates for the scaffolds, define an emerging field of computer-aided tissue engineering. In formulating the questions that remain to be answered and discussing the knowledge required to further advance the field, the Workshop provided a basis for recommendations for future work.
Optimized merging of search coil and fluxgate data for MMS
NASA Astrophysics Data System (ADS)
Fischer, David; Magnes, Werner; Hagen, Christian; Dors, Ivan; Chutter, Mark W.; Needell, Jerry; Torbert, Roy B.; Le Contel, Olivier; Strangeway, Robert J.; Kubin, Gernot; Valavanoglou, Aris; Plaschke, Ferdinand; Nakamura, Rumi; Mirioni, Laurent; Russell, Christopher T.; Leinweber, Hannes K.; Bromund, Kenneth R.; Le, Guan; Kepko, Lawrence; Anderson, Brian J.; Slavin, James A.; Baumjohann, Wolfgang
2016-11-01
The Magnetospheric Multiscale mission (MMS) targets the characterization of fine-scale current structures in the Earth's tail and magnetopause. The high speed of these structures, when traversing one of the MMS spacecraft, creates magnetic field signatures that cross the sensitive frequency bands of both search coil and fluxgate magnetometers. Higher data quality for analysis of these events can be achieved by combining data from both instrument types, using from each sensor the frequency bands with the best sensitivity and signal-to-noise ratio. This can be achieved by a model-based frequency compensation approach, which requires precise knowledge of instrument gain and phase properties. We discuss relevant aspects of the instrument design and the ground calibration activities, describe the model development and explain the application to in-flight data. Finally, we show the precision of this method by comparison of in-flight data. It confirms unity gain and a time difference of less than 100 µs between the different magnetometer instruments.
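A stripped-down version of the merging step (after gain/phase compensation, which is the heart of the actual method and is omitted here) is a complementary pair of spectral weights with a crossover frequency; the 4 Hz value below is a hypothetical choice:

```python
import numpy as np

def merge_magnetometers(fluxgate, searchcoil, fs, f_cross=4.0):
    """Blend two co-located measurements of one field component.

    fluxgate: accurate at low frequencies; searchcoil: sensitive at high
    frequencies; fs: sampling rate in Hz. Both arrays are assumed already
    calibrated to the same units and time base.
    """
    n = len(fluxgate)
    f = np.fft.rfftfreq(n, d=1.0 / fs)
    w = 1.0 / (1.0 + (f / f_cross) ** 4)      # low-pass weight for the fluxgate
    spectrum = w * np.fft.rfft(fluxgate) + (1.0 - w) * np.fft.rfft(searchcoil)
    return np.fft.irfft(spectrum, n)
```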
How Bright is the Proton? A Precise Determination of the Photon Parton Distribution Function
DOE Office of Scientific and Technical Information (OSTI.GOV)
Manohar, Aneesh; Nason, Paolo; Salam, Gavin P.; Zanderighi, Giulia
2016-12-09
It has become apparent in recent years that it is important, notably for a range of physics studies at the Large Hadron Collider, to have accurate knowledge of the distribution of photons in the proton. We show how the photon parton distribution function (PDF) can be determined in a model-independent manner, using electron-proton (ep) scattering data, in effect viewing the ep → e + X process as an electron scattering off the photon field of the proton. To this end, we consider an imaginary, beyond the Standard Model process with a flavor-changing photon-lepton vertex. We write its cross section in two ways: one in terms of proton structure functions, the other in terms of a photon distribution. Requiring their equivalence yields the photon distribution as an integral over proton structure functions. As a result of the good precision of ep data, we constrain the photon PDF at the level of 1%-2% over a wide range of momentum fractions.
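A schematic form of the resulting master formula may help fix ideas. This sketch, written from the description above, omits the mass-suppressed terms, the scheme-conversion piece, and the precise integration limits, all of which are given in the paper:

```latex
x f_{\gamma/p}(x,\mu^2) \;\simeq\; \frac{1}{2\pi\,\alpha(\mu^2)}
  \int_x^1 \frac{dz}{z} \int^{\mu^2} \frac{dQ^2}{Q^2}\, \alpha^2(Q^2)
  \left[\, z\,p_{\gamma q}(z)\, F_2\!\left(\tfrac{x}{z},Q^2\right)
        - z^2\, F_L\!\left(\tfrac{x}{z},Q^2\right) \right],
\qquad
p_{\gamma q}(z) = \frac{1+(1-z)^2}{z},
```

where F2 and FL are the proton structure functions measured in ep scattering, so the photon PDF is indeed expressed as an integral over them.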
Arthroscopic approach and anatomy of the hip
Aprato, Alessandro; Giachino, Matteo; Masse, Alessandro
2016-01-01
Background: Hip arthroscopy has gained popularity in the orthopedic community, and the assessment of its indications, techniques, and results is constantly being refined. Methods: In this chapter the principal standard entry portals for the central and peripheral compartments are discussed. The description starts from the superficial landmarks for portal placement and continues with the deep layers. For each entry point an illustration of the main structures encountered is provided, and the principal structures at risk for the different portals are examined in detail. The articular anatomical description is carried out from the arthroscope's point of view and subdivided into central and peripheral compartments. The two compartments are systematically analyzed and the articular areas accessible from each portal explained. Moreover, some anatomical variations that can be found in the normal hip are reported. Conclusion: Anatomical knowledge of the hip joint, along with a precise notion of the structures encountered with the arthroscope, is an essential requirement for safe and successful surgery. Level of evidence: V. PMID:28066735
Ship navigation using Navstar GPS - An application study
NASA Technical Reports Server (NTRS)
Mohan, S. N.
1982-01-01
Ocean current measurement applications in physical oceanography require knowledge of inertial ship velocity to a precision of 1-2 cm/sec over a typical five-minute averaging interval. The navigation accuracy must be commensurate with the data precision obtainable from shipborne acoustic profilers used in sensing ocean currents. The Navstar Global Positioning System is viewed as a step toward user technological simplification, extended coverage availability, and enhanced performance accuracy and reliability over the existing systems, namely Loran-C, Transit, and Omega. Error analyses have shown the possibility of attaining the 1-2 cm/sec accuracy during active GPS coverage at a data rate of four position fixes per minute under varying sea states. This paper presents results of data validation exercises leading to the design of an experiment at sea deploying both a GPS y-set and a direct Doppler measurement system as the autonomous navigation system, used in conjunction with an acoustic Doppler as the sensor for ocean current measurement.
NASA Astrophysics Data System (ADS)
Aguilar, M.; Ali Cavasonza, L.; Ambrosi, G.; Arruda, L.; Attig, N.; Aupetit, S.; Azzarello, P.; Bachlechner, A.; Barao, F.; Barrau, A.; Barrin, L.; Bartoloni, A.; Basara, L.; Başeǧmez-du Pree, S.; Battarbee, M.; Battiston, R.; Becker, U.; Behlmann, M.; Beischer, B.; Berdugo, J.; Bertucci, B.; Bindel, K. F.; Bindi, V.; Boella, G.; de Boer, W.; Bollweg, K.; Bonnivard, V.; Borgia, B.; Boschini, M. J.; Bourquin, M.; Bueno, E. F.; Burger, J.; Cadoux, F.; Cai, X. D.; Capell, M.; Caroff, S.; Casaus, J.; Castellini, G.; Cervelli, F.; Chae, M. J.; Chang, Y. H.; Chen, A. I.; Chen, G. M.; Chen, H. S.; Cheng, L.; Chou, H. Y.; Choumilov, E.; Choutko, V.; Chung, C. H.; Clark, C.; Clavero, R.; Coignet, G.; Consolandi, C.; Contin, A.; Corti, C.; Creus, W.; Crispoltoni, M.; Cui, Z.; Dai, Y. M.; Delgado, C.; Della Torre, S.; Demakov, O.; Demirköz, M. B.; Derome, L.; Di Falco, S.; Dimiccoli, F.; Díaz, C.; von Doetinchem, P.; Dong, F.; Donnini, F.; Duranti, M.; D'Urso, D.; Egorov, A.; Eline, A.; Eronen, T.; Feng, J.; Fiandrini, E.; Finch, E.; Fisher, P.; Formato, V.; Galaktionov, Y.; Gallucci, G.; García, B.; García-López, R. J.; Gargiulo, C.; Gast, H.; Gebauer, I.; Gervasi, M.; Ghelfi, A.; Giovacchini, F.; Goglov, P.; Gómez-Coral, D. M.; Gong, J.; Goy, C.; Grabski, V.; Grandi, D.; Graziani, M.; Guo, K. H.; Haino, S.; Han, K. C.; He, Z. H.; Heil, M.; Hoffman, J.; Hsieh, T. H.; Huang, H.; Huang, Z. C.; Huh, C.; Incagli, M.; Ionica, M.; Jang, W. Y.; Jinchi, H.; Kang, S. C.; Kanishev, K.; Kim, G. N.; Kim, K. S.; Kirn, Th.; Konak, C.; Kounina, O.; Kounine, A.; Koutsenko, V.; Krafczyk, M. S.; La Vacca, G.; Laudi, E.; Laurenti, G.; Lazzizzera, I.; Lebedev, A.; Lee, H. T.; Lee, S. C.; Leluc, C.; Li, H. S.; Li, J. Q.; Li, J. Q.; Li, Q.; Li, T. X.; Li, W.; Li, Y.; Li, Z. H.; Li, Z. Y.; Lim, S.; Lin, C. H.; Lipari, P.; Lippert, T.; Liu, D.; Liu, Hu; Lordello, V. D.; Lu, S. Q.; Lu, Y. S.; Luebelsmeyer, K.; Luo, F.; Luo, J. Z.; Lv, S. S.; Machate, F.; Majka, R.; Mañá, C.; Marín, J.; Martin, T.; Martínez, G.; Masi, N.; Maurin, D.; Menchaca-Rocha, A.; Meng, Q.; Mikuni, V. M.; Mo, D. C.; Morescalchi, L.; Mott, P.; Nelson, T.; Ni, J. Q.; Nikonov, N.; Nozzoli, F.; Oliva, A.; Orcinha, M.; Palmonari, F.; Palomares, C.; Paniccia, M.; Pauluzzi, M.; Pensotti, S.; Pereira, R.; Picot-Clemente, N.; Pilo, F.; Pizzolotto, C.; Plyaskin, V.; Pohl, M.; Poireau, V.; Putze, A.; Quadrani, L.; Qi, X. M.; Qin, X.; Qu, Z. Y.; Räihä, T.; Rancoita, P. G.; Rapin, D.; Ricol, J. S.; Rosier-Lees, S.; Rozhkov, A.; Rozza, D.; Sagdeev, R.; Sandweiss, J.; Saouter, P.; Schael, S.; Schmidt, S. M.; Schulz von Dratzig, A.; Schwering, G.; Seo, E. S.; Shan, B. S.; Shi, J. Y.; Siedenburg, T.; Son, D.; Song, J. W.; Sun, W. H.; Tacconi, M.; Tang, X. W.; Tang, Z. C.; Tao, L.; Tescaro, D.; Ting, Samuel C. C.; Ting, S. M.; Tomassetti, N.; Torsti, J.; Türkoǧlu, C.; Urban, T.; Vagelli, V.; Valente, E.; Vannini, C.; Valtonen, E.; Vázquez Acosta, M.; Vecchi, M.; Velasco, M.; Vialle, J. P.; Vitale, V.; Vitillo, S.; Wang, L. Q.; Wang, N. H.; Wang, Q. L.; Wang, X.; Wang, X. Q.; Wang, Z. X.; Wei, C. C.; Weng, Z. L.; Whitman, K.; Wienkenhöver, J.; Wu, H.; Wu, X.; Xia, X.; Xiong, R. Q.; Xu, W.; Yan, Q.; Yang, J.; Yang, M.; Yang, Y.; Yi, H.; Yu, Y. J.; Yu, Z. Q.; Zeissler, S.; Zhang, C.; Zhang, J.; Zhang, J. H.; Zhang, S. D.; Zhang, S. W.; Zhang, Z.; Zheng, Z. M.; Zhu, Z. Q.; Zhuang, H. L.; Zhukov, V.; Zichichi, A.; Zimmermann, N.; Zuccon, P.; AMS Collaboration
2016-12-01
Knowledge of the rigidity dependence of the boron to carbon flux ratio (B/C) is important in understanding the propagation of cosmic rays. The precise measurement of the B/C ratio from 1.9 GV to 2.6 TV, based on 2.3 million boron and 8.3 million carbon nuclei collected by AMS during the first 5 years of operation, is presented. The detailed variation with rigidity of the B/C spectral index is reported for the first time. The B/C ratio does not show any significant structures, in contrast to many cosmic ray models that require such structures at high rigidities. Remarkably, above 65 GV, the B/C ratio is well described by a single power law R^Δ with index Δ = -0.333 ± 0.014 (fit) ± 0.005 (syst), in good agreement with the Kolmogorov theory of turbulence, which predicts Δ = -1/3 asymptotically.
Developing Statistical Knowledge for Teaching during Design-Based Research
ERIC Educational Resources Information Center
Groth, Randall E.
2017-01-01
Statistical knowledge for teaching is not precisely equivalent to statistics subject matter knowledge. Teachers must know how to make statistics understandable to others as well as understand the subject matter themselves. This dual demand on teachers calls for the development of viable teacher education models. This paper offers one such model,…
Scientific progress: Knowledge versus understanding.
Dellsén, Finnur
2016-04-01
What is scientific progress? On Alexander Bird's epistemic account of scientific progress, an episode in science is progressive precisely when there is more scientific knowledge at the end of the episode than at the beginning. Using Bird's epistemic account as a foil, this paper develops an alternative understanding-based account on which an episode in science is progressive precisely when scientists grasp how to correctly explain or predict more aspects of the world at the end of the episode than at the beginning. This account is shown to be superior to the epistemic account by examining cases in which knowledge and understanding come apart. In these cases, it is argued that scientific progress matches increases in scientific understanding rather than accumulations of knowledge. In addition, considerations having to do with minimalist idealizations, pragmatic virtues, and epistemic value all favor this understanding-based account over its epistemic counterpart. Copyright © 2016 Elsevier Ltd. All rights reserved.
Topics in inference and decision-making with partial knowledge
NASA Technical Reports Server (NTRS)
Safavian, S. Rasoul; Landgrebe, David
1990-01-01
Two essential elements needed in the process of inference and decision-making are prior probabilities and likelihood functions. When both of these components are known accurately and precisely, the Bayesian approach provides a consistent and coherent solution to the problems of inference and decision-making. In many situations, however, either one or both of the above components may not be known, or at least may not be known precisely. This problem of partial knowledge about prior probabilities and likelihood functions is addressed. There are at least two ways to cope with this lack of precise knowledge: robust methods, and interval-valued methods. First, ways of modeling imprecision and indeterminacies in prior probabilities and likelihood functions are examined; then how imprecision in the above components carries over to the posterior probabilities is examined. Finally, the problem of decision making with imprecise posterior probabilities and the consequences of such actions are addressed. Application areas where the above problems may occur are in statistical pattern recognition problems, for example, the problem of classification of high-dimensional multispectral remote sensing image data.
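A minimal sketch of the interval-valued approach mentioned above: when only bounds on the prior are known, the posterior can itself be bounded by sweeping the prior over its interval (a simple robust-Bayes treatment). The two-class likelihood values and the prior interval are invented for illustration:

```python
# Bound the posterior of class 1 when the prior is known only to an interval.
import numpy as np

def posterior(prior1, lik1, lik2):
    """Posterior probability of class 1 for a two-class problem."""
    return prior1 * lik1 / (prior1 * lik1 + (1 - prior1) * lik2)

lik1, lik2 = 0.6, 0.2                # p(x | class1), p(x | class2) at observed x
priors = np.linspace(0.3, 0.7, 101)  # prior for class 1, imprecisely known
post = posterior(priors, lik1, lik2)
print(f"posterior for class 1 lies in [{post.min():.3f}, {post.max():.3f}]")
```

If the resulting posterior interval straddles the decision threshold, the imprecision in the prior genuinely matters for the decision; otherwise the decision is robust to it.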
Simple piezoelectric-actuated mirror with 180 kHz servo bandwidth.
Briles, Travis C; Yost, Dylan C; Cingöz, Arman; Ye, Jun; Schibli, Thomas R
2010-05-10
We present a high bandwidth piezoelectric-actuated mirror for length stabilization of an optical cavity. The actuator displays a transfer function with a flat amplitude response and greater than 135° phase margin up to 200 kHz, allowing a 180 kHz unity gain frequency to be achieved in a closed servo loop. To the best of our knowledge, this actuator has achieved the largest servo bandwidth for a piezoelectric transducer (PZT). The actuator should be very useful in a wide variety of applications requiring precision control of optical lengths, including laser frequency stabilization, optical interferometers, and optical communications. (c) 2010 Optical Society of America.
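For readers unfamiliar with the servo terminology, the sketch below shows how a unity-gain frequency and phase margin can be read off a loop transfer function with scipy; the single-pole loop model and its numbers are assumptions for illustration, not the actuator measured in the paper:

```python
# Read gain crossover and phase margin off an assumed open-loop model.
import numpy as np
from scipy import signal

K, f_pole = 2 * np.pi * 180e3, 1.2e6           # assumed loop gain and parasitic pole (Hz)
sys = signal.TransferFunction([K], [1 / (2 * np.pi * f_pole), 1, 0])  # K / (s (s/wp + 1))

f = np.logspace(3, 6.5, 2000)                  # 1 kHz .. ~3 MHz
w, mag_db, phase_deg = signal.bode(sys, 2 * np.pi * f)

i = np.argmin(np.abs(mag_db))                  # gain crossover: |G| = 0 dB
print(f"unity-gain frequency ~ {w[i] / (2 * np.pi) / 1e3:.0f} kHz, "
      f"phase margin ~ {180 + phase_deg[i]:.0f} deg")
```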
Human Germline Mutation and the Erratic Evolutionary Clock
Przeworski, Molly
2016-01-01
Our understanding of the chronology of human evolution relies on the “molecular clock” provided by the steady accumulation of substitutions on an evolutionary lineage. Recent analyses of human pedigrees have called this understanding into question by revealing unexpectedly low germline mutation rates, which imply that substitutions accrue more slowly than previously believed. Translating mutation rates estimated from pedigrees into substitution rates is not as straightforward as it may seem, however. We dissect the steps involved, emphasizing that dating evolutionary events requires not “a mutation rate” but a precise characterization of how mutations accumulate in development in males and females—knowledge that remains elusive. PMID:27760127
Breast Imaging in the Era of Big Data: Structured Reporting and Data Mining.
Margolies, Laurie R; Pandey, Gaurav; Horowitz, Eliot R; Mendelson, David S
2016-02-01
The purpose of this article is to describe structured reporting and the development of large databases for use in data mining in breast imaging. The results of millions of breast imaging examinations are reported with structured tools based on the BI-RADS lexicon. Much of these data are stored in accessible media. Robust computing power creates great opportunity for data scientists and breast imagers to collaborate to improve breast cancer detection and optimize screening algorithms. Data mining can create knowledge, but the questions asked and their complexity require extremely powerful and agile databases. New data technologies can facilitate outcomes research and precision medicine.
NASA Technical Reports Server (NTRS)
Numata, Kenji; Alalusi, Mazin; Stolpner, Lew; Margaritis, Georgios; Camp, Jordan; Krainak, Michael
2014-01-01
We describe the characteristics of the planar-waveguide external cavity diode laser (PW-ECL). To the best of our knowledge, it is the first butterfly-packaged 1064 nm semiconductor laser that is stable enough to be locked to an external frequency reference. We evaluated its performance from the viewpoint of precision experiments. Using a hyperfine absorption line of iodine, we suppressed its frequency noise by a factor of up to 10^4 at 10 mHz. The PW-ECL's compactness and low cost make it a candidate to replace traditional Nd:YAG nonplanar ring oscillators and fiber lasers in applications that require a single longitudinal mode.
Robust Requirements Tracing via Internet Search Technology: Improving an IV and V Technique. Phase 2
NASA Technical Reports Server (NTRS)
Hayes, Jane; Dekhtyar, Alex
2004-01-01
There are three major objectives to this phase of the work. (1) Improvement of Information Retrieval (IR) methods for Independent Verification and Validation (IV&V) requirements tracing. Information Retrieval methods are typically developed for very large document collections (on the order of millions to tens of millions of documents or more), and therefore most successfully used methods somewhat sacrifice precision and recall in order to achieve efficiency. At the same time, typical IR systems treat all user queries as independent of each other and assume that the relevance of documents to queries is subjective for each user. The IV&V requirements tracing problem has a much smaller data set to operate on, even for large software development projects; the set of queries is predetermined by the high-level specification document, and individual requirements considered as query input to IR methods are not necessarily independent of each other. Namely, knowledge about the links for one requirement may be helpful in determining the links of another requirement. Finally, while the final decision on the exact form of the traceability matrix still belongs to the IV&V analyst, his/her decisions are much less arbitrary than those of an Internet search engine user. All this suggests that the information available to us in the framework of the IV&V tracing problem can be successfully leveraged to enhance standard IR techniques, which in turn would lead to increased recall and precision. We developed several new methods during Phase II. (2) IV&V requirements tracing IR toolkit. Based on the methods developed in Phase I and their improvements developed in Phase II, we built a toolkit of IR methods for IV&V requirements tracing. The toolkit has been integrated, at the data level, with SAIC's SuperTracePlus (STP) tool. (3) Toolkit testing. We tested the methods included in the IV&V requirements tracing IR toolkit on a number of projects.
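Since the abstract centers on adapting vector-space IR to tracing, a minimal baseline may help fix the idea. The sketch below ranks candidate design elements against each requirement by TF-IDF cosine similarity; the toy requirement and design texts are invented, and this is a generic baseline rather than the toolkit's actual methods:

```python
# Candidate-link generation for requirements tracing via TF-IDF similarity.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

requirements = [
    "The system shall log all telemetry packets to persistent storage.",
    "The system shall reject commands with invalid checksums.",
]
design_elements = [
    "Telemetry handler writes each received packet to the flight recorder.",
    "Command validator computes the checksum and discards corrupt commands.",
    "Thermal control loop adjusts heater duty cycle.",
]

vec = TfidfVectorizer(stop_words="english")
tfidf = vec.fit_transform(requirements + design_elements)
req_v, des_v = tfidf[: len(requirements)], tfidf[len(requirements):]

sims = cosine_similarity(req_v, des_v)
for i, row in enumerate(sims):
    ranked = sorted(enumerate(row), key=lambda t: -t[1])
    print(f"requirement {i}: candidate links (design idx, score) {ranked}")
```

The analyst then vets the ranked candidates, which is where the paper's feedback-between-requirements idea goes beyond this baseline.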
Personalized or Precision Medicine? The Example of Cystic Fibrosis
Marson, Fernando A. L.; Bertuzzo, Carmen S.; Ribeiro, José D.
2017-01-01
The advent of the knowledge on human genetics, by the identification of disease-associated variants, culminated in the understanding of human variability. With the genetic knowledge, the specificity of the clinical phenotype and the drug response of each individual were understood. Using the cystic fibrosis (CF) as an example, the new terms that emerged such as personalized medicine and precision medicine can be characterized. The genetic knowledge in CF is broad and the presence of a monogenic disease caused by mutations in the CFTR gene enables the phenotype–genotype association studies (including the response to drugs), considering the wide clinical and laboratory spectrum dependent on the mutual action of genotype, environment, and lifestyle. Regarding the CF disease, personalized medicine is the treatment directed at the symptoms, and this treatment is adjusted depending on the patient’s phenotype. However, more recently, the term precision medicine began to be widely used, although its correct application and understanding are still vague and poorly characterized. In precision medicine, we understand the individual as a response to the interrelation between environment, lifestyle, and genetic factors, which enabled the advent of new therapeutic models, such as conventional drugs adjustment by individual patient dosage and drug type and response, development of new drugs (read through, broker, enhancer, stabilizer, and amplifier compounds), genome editing by homologous recombination, zinc finger nucleases, TALEN (transcription activator-like effector nuclease), CRISPR-Cas9 (clustered regularly interspaced short palindromic repeats-CRISPR-associated endonuclease 9), and gene therapy. Thus, we introduced the terms personalized medicine and precision medicine based on the CF. PMID:28676762
Chuong, Kim H.; Mack, David R.; Stintzi, Alain
2018-01-01
Abstract Healthcare institutions face widespread challenges of delivering high-quality and cost-effective care, while keeping up with rapid advances in biomedical knowledge and technologies. Moreover, there is increased emphasis on developing personalized or precision medicine targeted to individuals or groups of patients who share a certain biomarker signature. Learning healthcare systems (LHS) have been proposed for integration of research and clinical practice to fill major knowledge gaps, improve care, reduce healthcare costs, and provide precision care. To date, much discussion in this context has focused on the potential of human genomic data, and not yet on human microbiome data. Rapid advances in human microbiome research suggest that profiling of, and interventions on, the human microbiome can provide substantial opportunity for improved diagnosis, therapeutics, risk management, and risk stratification. In this study, we discuss a potential role for microbiome science in LHSs. We first review the key elements of LHSs, and discuss possibilities of Big Data and patient engagement. We then consider potentials and challenges of integrating human microbiome research into clinical practice as part of an LHS. With rapid growth in human microbiome research, patient-specific microbial data will begin to contribute in important ways to precision medicine. Hence, we discuss how patient-specific microbial data can help guide therapeutic decisions and identify novel effective approaches for precision care of inflammatory bowel disease. To the best of our knowledge, this expert analysis makes an original contribution with new insights poised at the emerging intersection of LHSs, microbiome science, and postgenomics medicine. PMID:28282257
Humans Have Precise Knowledge of Familiar Geographical Slants
ERIC Educational Resources Information Center
Stigliani, Anthony; Li, Zhi; Durgin, Frank H.
2013-01-01
Whereas maps primarily represent the 2-dimensional layout of the environment, people are also aware of the 3-dimensional layout of their environment. An experiment conducted on a small college campus tested whether the remembered slants of familiar paths were precisely represented. Three measures of slant (verbal, manual, and pictorial) were…
A content review of precision agriculture courses in the United States and Canada
USDA-ARS?s Scientific Manuscript database
Knowledge of what precision agriculture (PA) content is currently taught in North America will help build a better understanding for what PA instructors should incorporate into their classes in the future. The University of Missouri partnered with several universities throughout the nation on a USDA...
Ciarleglio, Anita E; Ma, Carolyn
2017-09-01
The precision medicine initiative brought forth by President Barack Obama in 2015 is an important step on the journey to truly personalized medicine. A broad knowledge and understanding of the implications of the pharmacogenomic literature will be critical to the achievement of this goal. While a great amount of data has been published in the areas of pharmacogenomics and pharmacogenetics, there are still relatively few instances in which the need for clinical intervention can be stated without doubt, and which are widely accepted and practiced by the medical community. As our knowledge base rapidly expands, issues such as insurance reimbursement for genetic testing and education of the health care workforce will be paramount to achieving the goal of precision medicine for all patients.
On the Anisotropic Mechanical Properties of Selective Laser-Melted Stainless Steel.
Hitzler, Leonhard; Hirsch, Johann; Heine, Burkhard; Merkel, Markus; Hall, Wayne; Öchsner, Andreas
2017-09-26
The thorough description of the peculiarities of additively manufactured (AM) structures represents a current challenge for aspiring freeform fabrication methods, such as selective laser melting (SLM). These methods have an immense advantage in the fast fabrication (no special tooling or moulds required) of components, geometrical flexibility in their design, and efficiency when only small quantities are required. However, designs demand precise knowledge of the material properties, which in the case of additively manufactured structures are anisotropic and, under certain circumstances, inhomogeneous in nature. Furthermore, these characteristics are highly dependent on the fabrication settings. In this study, the anisotropic tensile properties of selective laser-melted stainless steel (1.4404, 316L) are investigated: the Young's modulus ranged from 148 to 227 GPa, the ultimate tensile strength from 512 to 699 MPa, and the breaking elongation from 12% to 43%. The results were compared to related studies in order to classify the influence of the fabrication settings. Furthermore, the influence of the chosen raw material was addressed by comparing deviations in the directional dependencies resulting from differing microstructural developments during manufacture. Stainless steel was found to possess its maximum strength at a 45° layer-versus-loading offset, which is precisely where AlSi10Mg was previously reported to be at its weakest.
Light distribution properties in spinal cord for optogenetic stimulation (Conference Presentation)
NASA Astrophysics Data System (ADS)
Gąsecka, Alicja; Bahdine, Mohamed; Lapointe, Nicolas; Rioux, Veronique; Perez-Sanchez, Jimena; Bonin, Robert P.; De Koninck, Yves; Côté, Daniel
2016-03-01
Optogenetics is currently one of the most popular techniques in neuroscience. It enables cell-selective and temporally precise control of neuronal activity. Good spatial control of the stimulated area and minimal tissue damage require specific knowledge of light scattering properties. Light propagation in cell cultures and brain tissue is relatively well documented and allows for precise and reliable delivery of light to the neurons. In spinal cord, light must pass through highly organized white matter before reaching cell bodies present in grey matter; this heterogeneous structure makes it difficult to predict the propagation pattern. In this work we investigate the light distribution properties through mouse and monkey spinal cord. Light propagation depends on fiber orientation, leading to a shallower penetration profile in the direction perpendicular to the fibers and lower attenuation in the direction parallel to the fibers. Additionally, the use of different illumination wavelengths results in variations of the attenuation coefficient. Next, we use Monte-Carlo simulation to study light transport. The model gives a full 3-D simulation of light distribution in spinal cord and takes into account the different scattering properties related to fiber orientation. These studies are important to estimate the minimum optical irradiance required at the fiber tip to effectively excite optogenetic proteins in a desired region of the spinal cord.
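The standard ingredient of such Monte-Carlo simulations is photon random-walk transport with Henyey-Greenstein scattering. The sketch below assumes a single homogeneous, isotropic medium with invented coefficients; modeling white matter's anisotropy would require direction-dependent coefficients, as in the paper:

```python
# Minimal Monte-Carlo photon transport with Henyey-Greenstein scattering.
import numpy as np

rng = np.random.default_rng(0)
mu_a, mu_s, g = 0.2, 10.0, 0.9   # absorption, scattering (1/mm), anisotropy (assumed)

def sample_hg(g):
    """Sample the cosine of the scattering angle from Henyey-Greenstein."""
    s = (1 - g * g) / (1 - g + 2 * g * rng.random())
    return (1 + g * g - s * s) / (2 * g)

def rotate(d, ct, phi):
    """Standard MCML-style update: deflect direction d by acos(ct), azimuth phi."""
    st = np.sqrt(max(0.0, 1.0 - ct * ct))
    if abs(d[2]) > 0.99999:                      # nearly parallel to the z-axis
        return np.array([st * np.cos(phi), st * np.sin(phi), np.sign(d[2]) * ct])
    den = np.sqrt(1.0 - d[2] ** 2)
    return np.array([
        st * (d[0] * d[2] * np.cos(phi) - d[1] * np.sin(phi)) / den + d[0] * ct,
        st * (d[1] * d[2] * np.cos(phi) + d[0] * np.sin(phi)) / den + d[1] * ct,
        -st * np.cos(phi) * den + d[2] * ct,
    ])

def mean_depth(n_photons=2000, w_min=1e-3):
    depths = []
    for _ in range(n_photons):
        pos, d, w = np.zeros(3), np.array([0.0, 0.0, 1.0]), 1.0
        while w > w_min:
            pos = pos - np.log(1.0 - rng.random()) / (mu_a + mu_s) * d
            w *= mu_s / (mu_a + mu_s)            # absorption handled by weighting
            d = rotate(d, sample_hg(g), 2 * np.pi * rng.random())
        depths.append(pos[2])
    return np.mean(depths)

print(f"mean photon termination depth ~ {mean_depth():.2f} mm")
```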
Canoe: An Autonomous Infrastructure-Free Indoor Navigation System.
Dong, Kai; Wu, Wenjia; Ye, Haibo; Yang, Ming; Ling, Zhen; Yu, Wei
2017-04-30
The development of the Internet of Things (IoT) has accelerated research in indoor navigation systems, a majority of which rely on adequate wireless signals and sources. Nonetheless, deploying such a system requires periodic site-survey, which is time consuming and labor intensive. To address this issue, in this paper we present Canoe, an indoor navigation system that considers shopping mall scenarios. In our system, we do not assume any prior knowledge, such as floor-plan or the shop locations, access point placement or power settings, historical RSS measurements or fingerprints, etc. Instead, Canoe requires only that the shop owners collect and publish RSS values at the entrances of their shops and can direct a consumer to any of these shops by comparing the observed RSS values. The locations of the consumers and the shops are estimated using maximum likelihood estimation. In doing this, the direction of the target shop relative to the current orientation of the consumer can be precisely computed, such that the direction that a consumer should move can be determined. We have conducted extensive simulations using a real-world dataset. Our experiments in a real shopping mall demonstrate that if 50% of the shops publish their RSS values, Canoe can precisely navigate a consumer within 30 s, with an error rate below 9%.
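A minimal sketch of the likelihood-comparison step: an observed RSS vector is matched against the vectors published at shop entrances under an assumed i.i.d. Gaussian noise model. Shop names, RSS values, and the noise level are all invented for illustration; the paper's full method goes on to estimate coordinates and directions:

```python
# Match an observed RSS vector against published shop-entrance fingerprints.
import numpy as np

published = {                      # dBm readings of visible APs per entrance (assumed)
    "shop_A": np.array([-48.0, -71.0, -60.0]),
    "shop_B": np.array([-63.0, -55.0, -70.0]),
    "shop_C": np.array([-75.0, -68.0, -52.0]),
}
observed = np.array([-50.0, -69.0, -61.0])
sigma = 4.0                        # assumed RSS noise standard deviation (dB)

def log_likelihood(obs, ref, sigma):
    return -0.5 * np.sum(((obs - ref) / sigma) ** 2)

scores = {name: log_likelihood(observed, ref, sigma) for name, ref in published.items()}
best = max(scores, key=scores.get)
print(f"most likely nearby entrance: {best}  (log-likelihoods: {scores})")
```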
Pretorius, Etheresia
2017-01-01
The latest statistics from the 2016 heart disease and stroke statistics update show that cardiovascular disease is the leading global cause of death, currently accounting for more than 17.3 million deaths per year. Type II diabetes is also on the rise, with out-of-control numbers. To address these pandemics, we need to treat patients using an individualized patient care approach, but simultaneously gather data to support the precision medicine initiative. Last year the NIH announced the precision medicine initiative to generate novel knowledge regarding diseases, with a near-term focus on cancers, followed by a longer-term aim applicable to a whole range of health applications and diseases. The focus of this paper is to suggest a combined effort between the latest precision medicine initiative, researchers, and clinicians, whereby novel techniques could immediately make a difference in patient care while, in the long term, adding to the knowledge for use in precision medicine. We discuss the intricate relationship between individualized patient care and precision medicine and the current thoughts regarding which data are actually suitable for precision medicine data gathering. The uses of viscoelastic techniques in precision medicine are discussed, and how these techniques might give novel perspectives on the success of treatment regimes for cardiovascular patients is explored. Thrombo-embolic stroke, rheumatoid arthritis, and type II diabetes are used as examples of diseases where precision medicine and a patient-orientated approach can possibly be implemented. In conclusion, it is suggested that only if all role players work together, embracing a new way of thinking in treating and managing cardiovascular disease and diabetes, will we be able to adequately address these out-of-control conditions. Copyright © Bentham Science Publishers; For any queries, please email epub@benthamscience.org.
Cobb, Joshua N; Declerck, Genevieve; Greenberg, Anthony; Clark, Randy; McCouch, Susan
2013-04-01
More accurate and precise phenotyping strategies are necessary to empower high-resolution linkage mapping and genome-wide association studies and for training genomic selection models in plant improvement. Within this framework, the objective of modern phenotyping is to increase the accuracy, precision and throughput of phenotypic estimation at all levels of biological organization while reducing costs and minimizing labor through automation, remote sensing, improved data integration and experimental design. Much like the efforts to optimize genotyping during the 1980s and 1990s, designing effective phenotyping initiatives today requires multi-faceted collaborations between biologists, computer scientists, statisticians and engineers. Robust phenotyping systems are needed to characterize the full suite of genetic factors that contribute to quantitative phenotypic variation across cells, organs and tissues, developmental stages, years, environments, species and research programs. Next-generation phenotyping generates significantly more data than previously and requires novel data management, access and storage systems, increased use of ontologies to facilitate data integration, and new statistical tools for enhancing experimental design and extracting biologically meaningful signal from environmental and experimental noise. To ensure relevance, the implementation of efficient and informative phenotyping experiments also requires familiarity with diverse germplasm resources, population structures, and target populations of environments. Today, phenotyping is quickly emerging as the major operational bottleneck limiting the power of genetic analysis and genomic prediction. The challenge for the next generation of quantitative geneticists and plant breeders is not only to understand the genetic basis of complex trait variation, but also to use that knowledge to efficiently synthesize twenty-first century crop varieties.
Baneyx, Audrey; Charlet, Jean; Jaulent, Marie-Christine
2007-01-01
Pathologies and acts are classified in thesauri to help physicians code their activity. In practice, the use of thesauri is not sufficient to reduce variability in coding, and thesauri are not suitable for computer processing. We think the automation of the coding task requires a conceptual modeling of medical items: an ontology. Our task is to help lung specialists code acts and diagnoses with software that represents the medical knowledge of the specialty concerned as an ontology. The objective of the reported work was to build an ontology of pulmonary diseases dedicated to the coding process. To carry out this objective, we developed a precise methodological process for the knowledge engineer to build various types of medical ontologies. This process is based on the need to express precisely in natural language the meaning of each concept using differential semantics principles. A differential ontology is a hierarchy of concepts and relationships organized according to their similarities and differences. Our main research hypothesis is to apply natural language processing tools to corpora to develop the resources needed to build the ontology. We consider two corpora, one composed of patient discharge summaries and the other a teaching book. We propose to combine two approaches to enrich the ontology building: (i) a method which consists of building terminological resources through distributional analysis and (ii) a method based on the observation of corpus sequences in order to reveal semantic relationships. Our ontology currently includes 1550 concepts, and the software implementing the coding process is still under development. Results show that the proposed approach is operational and indicate that the combination of these methods and the comparison of the resulting terminological structures give interesting clues to a knowledge engineer for the building of an ontology.
NASA Astrophysics Data System (ADS)
Pino, Abdiel O.; Pladellorens, Josep
2014-07-01
A means of facilitating the transfer of optical inspection knowledge and skills from academic institutions and their research partners into Panamanian optics research groups is described. The process involves the creation of an Integrated Knowledge Group Research (IKGR) partnership, led by the Polytechnic University of Panama with the support of SENACYT and the Optics and Optometry Department, Polytechnic University of Catalonia. This paper describes the development of the knowledge transfer project "Implementation of a method of optical inspection of low cost for improving the surface quality of rolled material of metallic and nonmetallic industrial use"; this project develops a method for measuring surface quality using texture analysis of the speckle pattern formed on the surface to be characterized. The project is designed to address the shortage of key skills in the field of precision engineering for optical applications. The main issues encountered during the knowledge transfer teaching and learning activities are discussed, and the outcomes from the first four months of knowledge transfer activities are described. In summary, the results demonstrate how the Integrated Knowledge Group Research and this new approach to knowledge transfer have been effective in addressing the engineering skills gap in precision optics for the manufacturing industrial sector.
Tool simplifies machining of pipe ends for precision welding
NASA Technical Reports Server (NTRS)
Matus, S. T.
1969-01-01
Single tool prepares a pipe end for precision welding by simultaneously performing internal machining, end facing, and bevel cutting to specification standards. The machining operation requires only one milling adjustment, can be performed quickly, and produces the high quality pipe-end configurations required to ensure precision-welded joints.
Precise time and time interval applications to electric power systems
NASA Technical Reports Server (NTRS)
Wilson, Robert E.
1992-01-01
There are many applications of precise time and time interval (frequency) in operating modern electric power systems. Many generators and customer loads are operated in parallel. The reliable transfer of electrical power to the consumer partly depends on measuring power system frequency consistently in many locations. The internal oscillators in the widely dispersed frequency measuring units must be syntonized. Elaborate protection and control systems guard the high voltage equipment from short and open circuits. For the highest reliability of electric service, engineers need to study all control system operations. Precise timekeeping networks aid in the analysis of power system operations by synchronizing the clocks on recording instruments. Utility engineers want to reproduce events that caused loss of service to customers. Precise timekeeping networks can synchronize protective relay test-sets. For dependable electrical service, all generators and large motors must remain close to speed synchronism. The stable response of a power system to perturbations is critical to continuity of electrical service. Research shows that measurement of the power system state vector can aid in the monitoring and control of system stability. If power system operators know that a lightning storm is approaching a critical transmission line or transformer, they can modify operating strategies. Knowledge of the location of a short circuit fault can speed the re-energizing of a transmission line. One fault location technique requires clocks synchronized to one microsecond. Current research seeks to find out if one microsecond timekeeping can aid and improve power system control and operation.
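As a worked example of why microsecond synchronization matters, the sketch below implements classic two-ended traveling-wave fault location from GPS-synchronized arrival times; the line length, wave speed, and timestamps are assumed values for illustration, not data from the paper:

```python
# Two-ended traveling-wave fault location from synchronized timestamps.
line_length_km = 100.0
v = 2.9e5                 # wave propagation speed, km/s (roughly 0.97 c, assumed)
t1 = 12.000_150           # wave arrival time at terminal 1 (s, GPS-synchronized)
t2 = 12.000_250           # wave arrival time at terminal 2 (s)

# Distance from terminal 1. Note the sensitivity to timing: a 1 microsecond
# clock error shifts the estimate by v * 1e-6 / 2, roughly 0.15 km.
d1 = (line_length_km + v * (t1 - t2)) / 2.0
print(f"fault located {d1:.1f} km from terminal 1")
```

With the numbers above, the 100 µs arrival-time difference places the fault 35.5 km from terminal 1, and the comment makes explicit why the one-microsecond synchronization requirement quoted in the abstract is needed for sub-kilometer location.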
NASA Technical Reports Server (NTRS)
Lemoine, F. G.; Zelensky, N. P.; Luthcke, S. B.; Rowlands, D. D.; Beckley, B. D.; Klosko, S. M.
2006-01-01
Launched in the summer of 1992, TOPEX/POSEIDON (T/P) was a joint mission between NASA and the Centre National d'Etudes Spatiales (CNES), the French Space Agency, to make precise radar altimeter measurements of the ocean surface. After 13 remarkably successful years of mapping the ocean surface, T/P lost its ability to maneuver and was decommissioned in January 2006. T/P revolutionized the study of the Earth's oceans by vastly exceeding pre-launch estimates of the surface height accuracy recoverable from radar altimeter measurements. The precision orbit lies at the heart of the altimeter measurement, providing the reference frame from which the radar altimeter measurements are made. The expected quality of orbit knowledge had limited the measurement accuracy expectations of past altimeter missions, and it still remains a major component in the error budget of all altimeter missions. This paper describes critical improvements made to the T/P orbit time series over the 13 years of precise orbit determination (POD) provided by the GSFC Space Geodesy Laboratory. The POD improvements from the pre-launch T/P radial orbit accuracy expectation and mission requirement of 13 cm to an expected accuracy of about 1.5 cm with today's latest orbits are discussed. The latest orbits, with 1.5 cm RMS radial accuracy, represent a significant improvement over the 2.0 cm accuracy orbits currently available on the T/P Geophysical Data Record (GDR) altimeter product.
Precise orbits of the Lunar Reconnaissance Orbiter from radiometric tracking data
NASA Astrophysics Data System (ADS)
Löcher, Anno; Kusche, Jürgen
2018-02-01
Since 2009, the Lunar Reconnaissance Orbiter (LRO) has acquired images and altimetric profiles of the lunar surface. Assembling these data into maps and terrain models requires precise knowledge of the spacecraft trajectory. In this contribution, we present 5 years of LRO orbits from radiometric data processed with a software tailored to this mission. The presented orbits are the first independent validation of the LRO science orbits from NASA and are available for public use. A key feature of our processing is the elaborate treatment of model and observation errors by empirical parameters and an adaptive data weighting by variance component estimation. The quality of the resulting orbits is assessed by analyzing overlapping arcs. For our solution based on arcs of 2.5 days, such analysis yields a mean error of 2.81 m in total position and 0.11 m in radial direction. It is shown that this result greatly benefits from the adaptive data weighting, which reduces the error by 2.54 and 0.13 m, respectively. Unfortunately, the precision achieved varies strongly, depending on the view onto the orbital ellipse, which changes with the lunar cycle. To mitigate this dependency, the arc length was extended in steps up to 10.5 days, leading in the best case to a further improvement of 0.80 m.
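The adaptive data weighting mentioned above is typically done by iterative variance component estimation. A minimal sketch for two observation groups in a toy linear least-squares problem follows; the design matrices and noise levels are invented, and the update is a standard Helmert/Förstner-style scheme rather than the authors' actual software:

```python
# Iterative variance component estimation (VCE) for two observation groups.
import numpy as np

rng = np.random.default_rng(1)
x_true = np.array([1.0, -2.0])
A1, A2 = rng.normal(size=(50, 2)), rng.normal(size=(80, 2))
l1 = A1 @ x_true + 0.5 * rng.normal(size=50)    # group 1: true sigma = 0.5
l2 = A2 @ x_true + 2.0 * rng.normal(size=80)    # group 2: true sigma = 2.0

s2 = [1.0, 1.0]                                  # initial variance components
for _ in range(20):
    N = A1.T @ A1 / s2[0] + A2.T @ A2 / s2[1]    # combined normal matrix
    x = np.linalg.solve(N, A1.T @ l1 / s2[0] + A2.T @ l2 / s2[1])
    Ninv = np.linalg.inv(N)
    for i, (A, l) in enumerate([(A1, l1), (A2, l2)]):
        v = A @ x - l                            # group residuals
        r = A.shape[0] - np.trace(Ninv @ (A.T @ A)) / s2[i]  # group redundancy
        s2[i] = float(v @ v / r)                 # updated variance component
print("estimated sigmas:", np.sqrt(s2))          # converges toward [0.5, 2.0]
```

Each iteration re-weights the groups by their estimated variances, so poorly fitting data are automatically down-weighted in the next solution, which is the effect credited with the error reduction above.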
Assessment of physical activity of the human body considering the thermodynamic system.
Hochstein, Stefan; Rauschenberger, Philipp; Weigand, Bernhard; Siebert, Tobias; Schmitt, Syn; Schlicht, Wolfgang; Převorovská, Světlana; Maršík, František
2016-01-01
Correctly dosed physical activity is the basis of a vital and healthy life, but its measurement remains rather empirical, resulting in limited individual and custom activity recommendations. Very accurate three-dimensional models of the cardiovascular system do exist; however, they require the numeric solution of the Navier-Stokes equations for the flow in blood vessels. These models are suitable for research on cardiac diseases but are computationally very expensive. Direct measurements are expensive and often not applicable outside laboratories. This paper offers a new approach to assessing physical activity using thermodynamic systems and their leading quantity, entropy production, as a compromise between computation time and precise prediction of pressure, volume, and flow variables in blood vessels. Based on a simplified (one-dimensional) model of the cardiovascular system of the human body, we develop and evaluate a setup calculating the entropy production of the heart to determine the intensity of human physical activity more precisely than previous parameters, e.g. the frequently used energy considerations. The knowledge resulting from precise real-time physical activity measurement provides the basis for an intelligent human-technology interaction, allowing the degree of physical activity to be steadily adjusted to the actual individual performance level and thus improving training and activity recommendations.
A Solar Aspect System for the HEROES Mission
NASA Technical Reports Server (NTRS)
Christe, Steven; Shih, Albert; Rodriguez, Marcello; Gregory, Kyle; Cramer, Alexander; Edgerton, Melissa; Gaskin, Jessica; O'Connor, Brian; Sobey, Alexander
2014-01-01
A new Solar Aspect System (SAS) has been developed to provide the ability to observe the Sun with an existing balloon payload, HERO (short for High Energy Replicated Optics). Developed under the HEROES program (High Energy Replicated Optics to Explore the Sun), the SAS provides solar pointing knowledge in pitch, yaw, and roll. The required precision of these measurements must be better than the HEROES X-ray resolution of approximately 20 arcsec Full Width at Half Maximum (FWHM) so as not to degrade the image resolution. The SAS consists of two separate systems: the Pitch-Yaw Aspect System (PYAS) and the Roll Aspect System (RAS). The PYAS functions by projecting an image of the Sun onto a screen with precision fiducials. A CCD camera takes an image of these fiducials, and an automated algorithm determines the location of the Sun as well as the locations of the fiducials. The spacing between fiducials is unique and allows each to be identified, so that the location of the Sun on the screen can be precisely determined. The RAS functions by imaging the Earth's horizon in opposite directions using a silvered prism imaged by a CCD camera. The design and first results on the performance of these systems during the HEROES flight, which occurred in September 2013, are presented here.
NASA Astrophysics Data System (ADS)
Maity, Arnab; Padhi, Radhakant; Mallaram, Sanjeev; Mallikarjuna Rao, G.; Manickavasagam, M.
2016-10-01
A new nonlinear optimal and explicit guidance law is presented in this paper for launch vehicles propelled by solid motors. It can ensure very high terminal precision despite not having exact knowledge of the thrust-time curve a priori. The work was motivated by its use for a carrier launch vehicle in a hypersonic mission, which demands an extremely narrow terminal accuracy window for the launch vehicle for successful initiation of operation of the hypersonic vehicle. The proposed explicit guidance scheme, which computes the optimal guidance command online, ensures the required stringent final conditions with high precision at the injection point. A key feature of the proposed guidance law is an innovative extension of the recently developed model predictive static programming guidance with flexible final time. A penalty function approach is also followed to meet the input and output inequality constraints throughout the vehicle trajectory. The guidance law has been successfully validated in nonlinear six degree-of-freedom simulation studies, with an inner-loop autopilot designed as well, which significantly enhances confidence in its usefulness. In addition to excellent nominal results, the proposed guidance has been found to be robust in perturbed cases as well.
NASA Astrophysics Data System (ADS)
Moreenthaler, George W.; Khatib, Nader; Kim, Byoungsoo
2003-08-01
For two decades now, the use of Remote Sensing/Precision Agriculture to improve farm yields while reducing the use of polluting chemicals and the limited water supply has been a major goal. With world population growing exponentially, arable land being consumed by urbanization, and an unfavorable farm economy, farm efficiency must increase to meet future food requirements and to make farming a sustainable, profitable occupation. "Precision Agriculture" refers to a farming methodology that applies nutrients and moisture only where and when they are needed in the field. The real goal is to increase farm profitability by identifying the additional treatments of chemicals and water that increase revenues more than they increase costs and do not exceed pollution standards (constrained optimization). Even though the economic and environmental benefits appear to be great, Remote Sensing/Precision Agriculture has not grown as rapidly as early advocates envisioned. Technology for a successful Remote Sensing/Precision Agriculture system is now in place, but other needed factors have been missing. Commercial satellite systems can now image the Earth (multi-spectrally) with a resolution as fine as 2.5 m. Precision variable dispensing systems using GPS are now available and affordable. Crop models that predict yield as a function of soil, chemical, and irrigation parameter levels have been developed. Personal computers and internet access are now in place in most farm homes and can provide a mechanism for periodically disseminating advice on what quantities of water and chemicals are needed in specific regions of each field. Several processes have been selected that fuse the disparate sources of information on the current and historic states of the crop and soil, and the remaining resource levels available, with the critical decisions that farmers are required to make. These are done in a way that is easy for the farmer to understand and profitable to implement. A "Constrained Optimization Algorithm" to further improve these processes will be presented. The objective function of the model will be used to maximize the farmer's profit by increasing yields while decreasing environmental damage and the application of costly treatments. This model will incorporate information from Remote Sensing, from in-situ weather sources, from soil history, and from tacit farmer knowledge of the relative productivity of selected "Management Zones" of the farm, to provide incremental advice throughout the growing season on the optimum usage of water and chemical treatments.
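As a minimal sketch of the constrained-optimization idea (not the authors' model), the snippet below maximizes a linearized per-zone profit subject to a pollution cap using scipy's linear programming; all coefficients and limits are invented:

```python
# Choose per-zone water and nitrogen applications to maximize profit
# under a pollution cap, with an assumed linearized yield response.
from scipy.optimize import linprog

# Decision variables: [water_zone1, N_zone1, water_zone2, N_zone2]
net_gain = [0.8, 1.5, 0.5, 1.2]          # $ per unit input (price x marginal yield - cost)
c = [-g for g in net_gain]               # linprog minimizes, so negate

A_ub = [[0, 1, 0, 1]]                    # pollution cap: total nitrogen <= 120 units
b_ub = [120]
bounds = [(0, 200), (0, 100), (0, 200), (0, 100)]  # per-zone input limits

res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds)
print("optimal inputs:", res.x, " profit gain: $", -res.fun)
```

In a realistic version, the marginal-yield coefficients would vary by management zone and come from the crop models and remote-sensing inputs described above.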
Four of Ogden Lindsley's Unpublished Presentation Summaries
ERIC Educational Resources Information Center
Starlin, Clay M.
2010-01-01
As one might expect from the founder of Precision Teaching, Ogden Lindsley was precise in creating the summaries of his presentations. Ogden took the opportunity to share his latest thinking on a topic in his presentations. However, because the presentation summaries have not been published, many people have missed the advantage of this knowledge.…
Designing a Constraint Based Parser for Sanskrit
NASA Astrophysics Data System (ADS)
Kulkarni, Amba; Pokar, Sheetal; Shukl, Devanand
Verbal understanding (śābdabodha) of any utterance requires the knowledge of how words in that utterance are related to each other. Such knowledge is usually available in the form of cognition of grammatical relations. Generative grammars describe how a language codes these relations. Thus the knowledge of what information various grammatical relations convey is available from the generation point of view and not the analysis point of view. In order to develop a parser based on any grammar one should then know precisely the semantic content of the grammatical relations expressed in a language string, the clues for extracting these relations, and finally whether these relations are expressed explicitly or implicitly. Based on the design principles that emerge from this knowledge, we model the parser as finding a directed tree, given a graph with nodes representing the words and edges representing the possible relations between them. Further, we also use the Mīmāṃsā constraint of ākāṅkṣā (expectancy) to rule out non-solutions and sannidhi (proximity) to prioritize the solutions. We have implemented a parser based on these principles, and its performance was found to be satisfactory, giving us the confidence to extend its functionality to handle complex sentences.
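One concrete way to realize "finding a directed tree over a graph of candidate relations" is a maximum spanning arborescence. A minimal sketch with networkx follows, where the toy sentence, relation labels, and edge weights (standing in for expectancy and proximity scores) are all invented:

```python
# Parse as a maximum spanning arborescence over weighted candidate relations.
import networkx as nx

G = nx.DiGraph()
# Candidate relations for a toy sentence "Rama reads a book";
# higher weight = more plausible relation.
G.add_edge("reads", "Rama", weight=0.9, label="agent")
G.add_edge("reads", "book", weight=0.8, label="object")
G.add_edge("book", "a", weight=0.7, label="determiner")
G.add_edge("Rama", "a", weight=0.1, label="determiner")   # implausible rival edge

parse = nx.maximum_spanning_arborescence(G, attr="weight")
for head, dep in parse.edges():
    print(f"{head} --{G[head][dep]['label']}--> {dep}")
```

The arborescence constraint enforces exactly what the paper's tree model requires: every word gets one head, and the rival low-weight edge is discarded.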
The ethical duty to preserve the quality of scientific information
NASA Astrophysics Data System (ADS)
Arattano, Massimo; Gatti, Albertina; Eusebio, Elisa
2016-04-01
The commitment to communicate and disseminate the knowledge acquired during his or her professional activity is certainly one of the ethical duties of the geologist. Nowadays, however, in the Internet era, the spreading of knowledge involves potential risks that the geologist should be aware of. These risks require careful analysis aimed at mitigating their effects. The Internet may in fact help spread (e.g. through websites like Wikipedia) information that is badly or even incorrectly presented. The final result can be an impediment to the diffusion of knowledge and a reduction of its effectiveness, which is precisely the opposite of the goal that a geologist should pursue. Specific criteria for recognizing incorrect or inadequate information would therefore be extremely useful. Their development and application might avoid, or at least reduce, the above-mentioned risk. Ideally, such criteria could also be used to develop algorithms that automatically verify the quality of information available across the Internet. A possible criterion is presented here for the quality control of knowledge and scientific information. An example of its application in the field of geology is provided, to verify and correct a piece of information available on the Internet. The proposed criterion could also be used to simplify scientific information and increase its informative efficacy.
Code of Federal Regulations, 2013 CFR
2013-10-01
... programs is to advance scientific and technical knowledge and apply that knowledge to the extent necessary... are directed toward objectives for which the work or methods cannot be precisely described in advance... to encourage the best sources from the scientific and industrial community to become involved in the...
Code of Federal Regulations, 2012 CFR
2012-10-01
... programs is to advance scientific and technical knowledge and apply that knowledge to the extent necessary... are directed toward objectives for which the work or methods cannot be precisely described in advance... to encourage the best sources from the scientific and industrial community to become involved in the...
Code of Federal Regulations, 2014 CFR
2014-10-01
... programs is to advance scientific and technical knowledge and apply that knowledge to the extent necessary... are directed toward objectives for which the work or methods cannot be precisely described in advance... to encourage the best sources from the scientific and industrial community to become involved in the...
A Reaction to Mazoue's Deconstructed Campus
ERIC Educational Resources Information Center
Shrock, Sharon A.
2012-01-01
Mazoue's ("J Comput High Educ," 2012) article, "The Deconstructed Campus," is examined from the perspective of instructional design practice. Concerns center on: the knowledge base of precision instruction; the differential effectiveness of teaching procedural as opposed to declarative knowledge; the reliance on assessment of online learning; and…
Integrating Multiple On-line Knowledge Bases for Disease-Lab Test Relation Extraction.
Zhang, Yaoyun; Soysal, Ergin; Moon, Sungrim; Wang, Jingqi; Tao, Cui; Xu, Hua
2015-01-01
A computable knowledge base containing relations between diseases and lab tests would be a great resource for many biomedical informatics applications. This paper describes our initial step towards establishing a comprehensive knowledge base of disease-lab test relations utilizing three public on-line resources. LabTestsOnline, MedlinePlus and Wikipedia are integrated to create a freely available, computable disease-lab test knowledge base. Disease and lab test concepts are identified using MetaMap, and relations between diseases and lab tests are determined based on source-specific rules. Experimental results demonstrate a high precision for relation extraction, with Wikipedia achieving the highest precision of 87%. Combining the three sources reached a recall of 51.40% when compared with a subset of disease-lab test relations extracted from a reference book. Moreover, we found additional disease-lab test relations from the on-line resources, indicating that they are complementary to existing reference books for building a comprehensive disease-lab test relation knowledge base.
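A minimal sketch of the merge-and-evaluate step described above: relation sets from the three sources are unioned and then scored against a reference set. The disease-lab test pairs below are invented for illustration:

```python
# Union relations from several sources and score against a reference set.
sources = {
    "LabTestsOnline": {("diabetes", "hba1c"), ("anemia", "cbc")},
    "MedlinePlus":    {("diabetes", "fasting glucose"), ("anemia", "cbc")},
    "Wikipedia":      {("diabetes", "hba1c"), ("hypothyroidism", "tsh")},
}
reference = {("diabetes", "hba1c"), ("diabetes", "fasting glucose"),
             ("anemia", "cbc"), ("hypothyroidism", "tsh"),
             ("hepatitis", "alt")}

combined = set().union(*sources.values())
tp = combined & reference                       # true positives
precision = len(tp) / len(combined)
recall = len(tp) / len(reference)
print(f"precision={precision:.2f} recall={recall:.2f}")
```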
Simmons, Michael; Singhal, Ayush; Lu, Zhiyong
2018-01-01
The key question of precision medicine is whether it is possible to find clinically actionable granularity in diagnosing disease and classifying patient risk. The advent of next generation sequencing and the widespread adoption of electronic health records (EHRs) have provided clinicians and researchers a wealth of data and made possible the precise characterization of individual patient genotypes and phenotypes. Unstructured text — found in biomedical publications and clinical notes — is an important component of genotype and phenotype knowledge. Publications in the biomedical literature provide essential information for interpreting genetic data. Likewise, clinical notes contain the richest source of phenotype information in EHRs. Text mining can render these texts computationally accessible and support information extraction and hypothesis generation. This chapter reviews the mechanics of text mining in precision medicine and discusses several specific use cases, including database curation for personalized cancer medicine, patient outcome prediction from EHR-derived cohorts, and pharmacogenomic research. Taken as a whole, these use cases demonstrate how text mining enables effective utilization of existing knowledge sources and thus promotes increased value for patients and healthcare systems. Text mining is an indispensable tool for translating genotype-phenotype data into effective clinical care that will undoubtedly play an important role in the eventual realization of precision medicine. PMID:27807747
Simulation Study of Effects of the Blind Deconvolution on Ultrasound Image
NASA Astrophysics Data System (ADS)
He, Xingwu; You, Junchen
2018-03-01
Ultrasonic image restoration is an essential subject in medical ultrasound imaging. However, without sufficiently precise system knowledge, traditional image restoration methods based on prior knowledge of the system often fail to improve image quality. In this paper, we use simulated ultrasound images to assess the effectiveness of blind deconvolution for ultrasound image restoration. Experimental results demonstrate that blind deconvolution can be applied to ultrasound image restoration and achieves satisfactory results without precise prior knowledge, compared with traditional restoration methods. Even with an inaccurate, small initial PSF, the results show that blind deconvolution improves the overall quality of ultrasound images, with better SNR and image resolution. We also report the time consumption of these methods, which shows no significant increase on a GPU platform.
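Blind deconvolution in this setting is commonly implemented as an alternating Richardson-Lucy scheme that refines the image and the PSF in turn; the abstract does not specify the algorithm, so the following is a generic sketch under that assumption (equal-shaped 2-D float arrays, iteration counts invented):

```python
import numpy as np
from scipy.signal import fftconvolve

def blind_richardson_lucy(observed, psf, n_outer=10, n_inner=5, eps=1e-12):
    """Alternating blind Richardson-Lucy deconvolution.
    `observed` and `psf` are 2-D float arrays of the same shape;
    `psf` is a rough initial guess (e.g. a small Gaussian)."""
    image = np.full_like(observed, observed.mean())
    for _ in range(n_outer):
        for _ in range(n_inner):                   # refine PSF, image fixed
            ratio = observed / (fftconvolve(image, psf, mode="same") + eps)
            psf *= fftconvolve(ratio, image[::-1, ::-1], mode="same") / (image.sum() + eps)
            psf = np.clip(psf, 0, None)
            psf /= psf.sum() + eps                 # keep PSF normalized
        for _ in range(n_inner):                   # refine image, PSF fixed
            ratio = observed / (fftconvolve(image, psf, mode="same") + eps)
            image *= fftconvolve(ratio, psf[::-1, ::-1], mode="same")
    return image, psf
```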
Evaluating abundance estimate precision and the assumptions of a count-based index for small mammals
Wiewel, A.S.; Adams, A.A.Y.; Rodda, G.H.
2009-01-01
Conservation and management of small mammals requires reliable knowledge of population size. We investigated the precision of mark-recapture and removal abundance estimates generated from live-trapping and snap-trapping data collected at sites on Guam (n = 7), Rota (n = 4), Saipan (n = 5), and Tinian (n = 3), in the Mariana Islands. We also evaluated a common index, captures per unit effort (CPUE), as a predictor of abundance. In addition, we evaluated the cost and time associated with implementing live-trapping and snap-trapping and compared species-specific capture rates of selected live- and snap-traps. For all species, mark-recapture estimates were consistently more precise than removal estimates based on coefficients of variation and 95% confidence intervals. The predictive utility of CPUE was poor but improved with increasing sampling duration. Nonetheless, modeling of sampling data revealed that underlying assumptions critical to application of an index of abundance, such as constant capture probability across space, time, and individuals, were not met. Although snap-trapping was cheaper and faster than live-trapping, the time difference was negligible when site preparation time was considered. Rattus diardii spp. captures were greatest in Haguruma live-traps (Standard Trading Co., Honolulu, HI) and Victor snap-traps (Woodstream Corporation, Lititz, PA), whereas Suncus murinus and Mus musculus captures were greatest in Sherman live-traps (H. B. Sherman Traps, Inc., Tallahassee, FL) and Museum Special snap-traps (Woodstream Corporation). Although snap-trapping and CPUE may have utility after validation against more rigorous methods, validation should occur across the full range of study conditions. Resources required for this level of validation would likely be better allocated towards implementing rigorous and robust methods.
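Estimator precision here is judged by the coefficient of variation; as an illustration of the kind of calculation involved (numbers hypothetical, not the study's data), the Chapman form of the Lincoln-Petersen mark-recapture estimator gives:

```python
import math

def chapman_estimate(n1, n2, m2):
    """Chapman estimator: n1 animals marked on occasion 1, n2 captured on
    occasion 2, m2 of which were marked. Returns (N_hat, SE)."""
    n_hat = (n1 + 1) * (n2 + 1) / (m2 + 1) - 1
    var = ((n1 + 1) * (n2 + 1) * (n1 - m2) * (n2 - m2)
           / ((m2 + 1) ** 2 * (m2 + 2)))
    return n_hat, math.sqrt(var)

n_hat, se = chapman_estimate(n1=60, n2=55, m2=20)   # hypothetical counts
print(f"N = {n_hat:.0f}, SE = {se:.0f}, CV = {se / n_hat:.2f}")
```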
NASA Astrophysics Data System (ADS)
Waisberg, Idel; Dexter, Jason; Gillessen, Stefan; Pfuhl, Oliver; Eisenhauer, Frank; Plewa, Phillip M.; Bauböck, Michi; Jimenez-Rosales, Alejandra; Habibi, Maryam; Ott, Thomas; von Fellenberg, Sebastiano; Gao, Feng; Widmann, Felix; Genzel, Reinhard
2018-05-01
Astrometric and spectroscopic monitoring of individual stars orbiting the supermassive black hole in the Galactic Center offers a promising way to detect general relativistic effects. While low-order effects are expected to be detected following the periastron passage of S2 in Spring 2018, detecting higher-order effects due to black hole spin will require the discovery of closer stars. In this paper, we set out to determine the requirements such a star would have to satisfy to allow the detection of black hole spin. We focus on the instrument GRAVITY, which saw first light in 2016 and is expected to achieve astrometric accuracies of 10-100 μas. For an observing campaign with duration T years, total observations N_obs, astrometric precision σ_x, and normalized black hole spin χ, we find that a_orb (1 - e²)^{3/4} ≲ 300 R_S (T/4 yr)^{1/2} (N_obs/120)^{1/4} (10 μas/σ_x)^{1/2} (χ/0.9)^{1/2} is needed. For χ = 0.9 and a potential observing campaign with σ_x = 10 μas, 30 observations yr⁻¹, and duration 4-10 yr, we expect ~0.1 star with K < 19 satisfying this constraint, based on current knowledge of the stellar population in the central 1 arcsec. We also propose a method through which GRAVITY could potentially measure radial velocities with precision ~50 km s⁻¹. If the astrometric precision can be maintained, adding radial velocity information increases the expected number of stars by roughly a factor of 2. While we focus on GRAVITY, the results can also be scaled to parameters relevant for future extremely large telescopes.
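The scaling relation above can be evaluated directly as a plausibility check; a small sketch with the paper's fiducial values as defaults (campaign parameters in the second call are assumed, not from the paper):

```python
# Maximum orbit size (in Schwarzschild radii, via a_orb (1-e^2)^(3/4))
# that still allows a spin detection, per the scaling relation above.
def max_orbit_size_RS(T_yr=4.0, N_obs=120, sigma_x_uas=10.0, chi=0.9):
    return (300.0 * (T_yr / 4.0) ** 0.5 * (N_obs / 120.0) ** 0.25
            * (10.0 / sigma_x_uas) ** 0.5 * (chi / 0.9) ** 0.5)

print(max_orbit_size_RS())                      # 300 R_S at the fiducial values
print(max_orbit_size_RS(T_yr=10, N_obs=300))    # a longer, denser campaign relaxes the limit
```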
Frelin-Labalme, Anne-Marie; Ledoux, Xavier
2017-01-01
Objective: Small animal image-guided irradiators have recently been developed to mimic the delivery techniques of clinical radiotherapy. A dosemeter adapted to millimetric beams of medium-energy X-rays is therefore required. This work presents the characterization of a dosemeter prototype for this application. Methods: A scintillating optical fibre dosemeter (called DosiRat) was implemented to perform real-time dose measurements with the dedicated small animal X-RAD® 225Cx irradiator (Precision X-Ray, Inc., North Branford, CT). Its sensitivity, stem effect, stability, linearity and measurement precision were determined in large-field conditions for three beam qualities consistent with small animal irradiation and imaging parameters. Results: DosiRat demonstrated good sensitivity and stability, excellent air kerma and air kerma rate linearity, and good repeatability for air kerma rates >1 mGy s⁻¹. The stem effect was found to be negligible. DosiRat showed limited precision for low air kerma rate measurements (<1 mGy s⁻¹), typical of imaging protocols. A positive energy dependence was found, which can be accounted for by calibrating the dosemeter at the required beam qualities. Conclusion: The dosimetric performance of DosiRat is very promising. Extensive studies of its energy dependence are still required, and further developments will reduce the dosemeter size to enable millimetric-beam dosimetry and small animal in vivo dosimetry. Advances in knowledge: Among existing point dosemeters, very few are dedicated to both medium-energy X-rays and millimetric beams. Our work demonstrates that scintillating fibre dosemeters are suitable and promising tools for real-time dose measurements in the small animal field of interest. PMID:27556813
Coherent anti-Stokes Raman spectroscopic modeling for combustion diagnostics
NASA Technical Reports Server (NTRS)
Hall, R. J.
1983-01-01
The status of modelling the coherent anti-Stokes Raman spectroscopy (CARS) spectra of molecules important in combustion, such as N2, H2O, and CO2, is reviewed. It is shown that accurate modelling generally requires highly precise knowledge of line positions and reasonable estimates of Raman linewidths, and the sources of these data are discussed. The CARS technique and theory are reviewed, and the status of modelling the phenomenon of collisional narrowing at pressures well above atmospheric for N2, H2O, and CO2 is described. It is shown that good agreement with experiment can be achieved using either the Gordon rotational diffusion model or phenomenological models for inelastic energy-transfer rates.
iCLIP: Protein–RNA interactions at nucleotide resolution
Huppertz, Ina; Attig, Jan; D’Ambrogio, Andrea; Easton, Laura E.; Sibley, Christopher R.; Sugimoto, Yoichiro; Tajnik, Mojca; König, Julian; Ule, Jernej
2014-01-01
RNA-binding proteins (RBPs) are key players in the post-transcriptional regulation of gene expression. Precise knowledge about their binding sites is therefore critical to unravel their molecular function and to understand their role in development and disease. Individual-nucleotide resolution UV crosslinking and immunoprecipitation (iCLIP) identifies protein–RNA crosslink sites on a genome-wide scale. The high resolution and specificity of this method are achieved by an intramolecular cDNA circularization step that enables analysis of cDNAs that truncate at the protein–RNA crosslink sites. Here, we describe the improved iCLIP protocol and discuss critical optimization and control experiments that are required when applying the method to new RBPs. PMID:24184352
Cicero, Raúl; Criales, José Luis; Cardoso, Manuel
2009-01-01
The impressive development of computed tomography (CT) techniques, such as three-dimensional helical CT, produces spatial images of the thoracic skeleton. At the beginning of the 16th century, Leonardo da Vinci drew the thorax osseum with great precision. These drawings show an outstanding similarity to the images obtained by three-dimensional helical CT. The cumbersome task of the Renaissance genius is a prime example of the careful study of human anatomy. Modern imaging techniques require perfect anatomic knowledge of the human body in order to generate exact interpretations of images. Leonardo's example is alive for anybody devoted to modern imaging studies.
NASA Technical Reports Server (NTRS)
Numata, Kenji; Alalusi, Mazin; Stolpner, Lew; Margaritis, Georgios; Camp, Jordan B.; Krainak, Michael A.
2014-01-01
We describe the characteristics of the planar-waveguide external cavity diode laser (PW-ECL). To the best of our knowledge, it is the first butterfly-packaged 1064-nm semiconductor laser that is stable enough to be locked to an external frequency reference. We evaluated its performance from the viewpoint of precision experiments. In particular, using a hyperfine absorption line of iodine, we suppressed its frequency noise by a factor of up to 10^4 at 10 mHz. The PW-ECL's compactness and low cost make it a candidate to replace traditional Nd:YAG non-planar ring oscillators and fiber lasers in applications that require a single longitudinal mode.
Optimally designing games for behavioural research
Rafferty, Anna N.; Zaharia, Matei; Griffiths, Thomas L.
2014-01-01
Computer games can be motivating and engaging experiences that facilitate learning, leading to their increasing use in education and behavioural experiments. For these applications, it is often important to make inferences about the knowledge and cognitive processes of players based on their behaviour. However, designing games that provide useful behavioural data is a difficult task that typically requires significant trial and error. We address this issue by creating a new formal framework that extends optimal experiment design, as used in statistics, to apply to game design. In this framework, we use Markov decision processes to model players' actions within a game, and then make inferences about the parameters of a cognitive model from these actions. Using a variety of concept learning games, we show that in practice this method can predict which games will result in better estimates of the parameters of interest. The best games require only half as many players to attain the same level of precision. PMID:25002821
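Optimal experiment design of the kind extended here picks the design whose expected data most reduce posterior uncertainty about the parameters of interest; the following toy sketch (a one-parameter Bayesian example with an invented response model, far simpler than the authors' MDP framework) illustrates the core computation:

```python
import numpy as np

thetas = np.linspace(0.1, 0.9, 9)            # candidate values of a skill parameter
prior = np.ones_like(thetas) / thetas.size   # uniform prior

def expected_posterior_var(difficulty):
    """Expected posterior variance of theta after one binary outcome,
    under a toy success model p(success) = theta * (1 - difficulty)."""
    p_success = thetas * (1 - difficulty)
    var = 0.0
    for p_out in (p_success, 1 - p_success):     # success / failure branches
        marginal = (p_out * prior).sum()
        posterior = p_out * prior / marginal
        mean = (thetas * posterior).sum()
        var += marginal * ((thetas - mean) ** 2 * posterior).sum()
    return var

designs = [0.2, 0.5, 0.8]                        # candidate game difficulties
best = min(designs, key=expected_posterior_var)  # most informative design
print(f"most informative design difficulty: {best}")
```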
Active Collision Avoidance for Planetary Landers
NASA Technical Reports Server (NTRS)
Rickman, Doug; Hannan, Mike; Srinivasan, Karthik
2014-01-01
Present-day robotic missions to other planets require precise, a priori knowledge of the terrain to pre-determine a landing spot that is safe. Landing sites can be miles from the mission objective, or mission objectives may be tailored to suit landing sites. Future robotic exploration missions should be capable of autonomously identifying a safe landing target within a specified target area selected by mission requirements. Such an autonomous landing system must (1) 'see' the surface, (2) identify a target, and (3) land the vehicle. Recent advances in radar technology have resulted in small, lightweight, low-power radars that are used for collision avoidance and cruise control systems in automobiles. Such radar systems can be adapted for use as active hazard avoidance systems for planetary landers. The focus of this CIF proposal is to leverage earlier work on collision avoidance systems for MSFC's Mighty Eagle lander and evaluate the use of automotive radar systems for collision avoidance in planetary landers.
Designing learning management system interoperability in semantic web
NASA Astrophysics Data System (ADS)
Anistyasari, Y.; Sarno, R.; Rochmawati, N.
2018-01-01
The extensive adoption of learning management systems (LMS) has put the focus on the interoperability requirement. Interoperability is the ability of different computer systems, applications or services to communicate, share and exchange data, information, and knowledge in a precise, effective and consistent way. Semantic web technology and the use of ontologies can provide the required computational semantics and interoperability for the automation of tasks in an LMS. The purpose of this study is to design learning management system interoperability in the semantic web, which has not yet been investigated deeply. Moodle is utilized to design the interoperability: several database tables of Moodle are enhanced and some features are added. The semantic web interoperability is provided by exploiting an ontology of content materials. The ontology is further utilized as a search tool to match users' queries against available courses. It is concluded that LMS interoperability in the semantic web is feasible.
Department of Defense Precise Time and Time Interval program improvement plan
NASA Technical Reports Server (NTRS)
Bowser, J. R.
1981-01-01
The United States Naval Observatory is responsible for ensuring uniformity in precise time and time interval operations including measurements, the establishment of overall DOD requirements for time and time interval, and the accomplishment of objectives requiring precise time and time interval with minimum cost. An overview of the objectives, the approach to the problem, the schedule, and a status report, including significant findings relative to organizational relationships, current directives, principal PTTI users, and future requirements as currently identified by the users are presented.
ERIC Educational Resources Information Center
Luck, Joe D.; Fulton, John P.; Rees, Jennifer
2015-01-01
Three Precision Agriculture Data Management workshops regarding yield monitor data were conducted in 2014, reaching 62 participants. Post-workshop surveys (n = 58) indicated 73% of respondents experienced a moderate to significant increase in knowledge related to yield monitor data usage. Another 72% reported that they planned to utilize best…
Precision agriculture and information technology
Daniel L. Schmoldt
2001-01-01
As everyone knows, knowledge (along with its less-sophisticated sibling, information) is power. For a long time, detailed knowledge (in agriculture) has been generally inaccessible or was prohibitively expensive to acquire. Advances in electronics, communications, and software over the past several decades have removed those earlier impediments. Inexpensive sensors and...
Steady-state low thermal resistance characterization apparatus: The bulk thermal tester
DOE Office of Scientific and Technical Information (OSTI.GOV)
Burg, Brian R.; Kolly, Manuel; Blasakis, Nicolas
The reliability of microelectronic devices is largely dependent on electronic packaging, which includes heat removal. Appropriate packaging design therefore necessitates precise knowledge of the relevant material properties, including thermal resistance and thermal conductivity. Thin materials and high-conductivity layers make thermal characterization challenging. A steady-state measurement technique is presented and evaluated with the purpose of characterizing samples with a thermal resistance below 100 mm² K/W. It is based on the heat flow meter bar approach, made up of two copper blocks, and relies exclusively on temperature measurements from thermocouples. The importance of thermocouple calibration is emphasized in order to obtain accurate temperature readings. An in-depth error analysis, based on Gaussian error propagation, is carried out. An error sensitivity analysis highlights the importance of precise knowledge of the thermal interface materials required for the measurements. Reference measurements on Mo samples reveal a measurement uncertainty in the range of 5%, and the most accurate measurements are obtained at high heat fluxes. Measurement techniques for homogeneous bulk samples, layered materials, and protruding cavity samples are discussed. Ultimately, a comprehensive overview of a steady-state thermal characterization technique is provided, evaluating the accuracy of sample measurements with thermal resistances well below state-of-the-art setups. Accurate characterization of materials used in heat removal applications, such as electronic packaging, will enable more efficient designs and ultimately contribute to energy savings.
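As an illustration of the heat-flow-meter-bar evaluation and the Gaussian error propagation mentioned above, a minimal sketch with assumed values (not the paper's calibration data): the heat flux is inferred from the temperature gradient along the copper bar, and the sample's areal thermal resistance from the temperature drop across it.

```python
import math

k_cu = 390.0      # W/(m K), copper conductivity (assumed)
dx = 10e-3        # m, thermocouple spacing in the meter bar (assumed)
dT_bar = 5.0      # K, temperature drop along the bar
dT_sample = 0.8   # K, temperature drop across the sample
u_T = 0.05        # K, per-thermocouple uncertainty after calibration

q = k_cu * dT_bar / dx        # heat flux through the bar, W/m^2
R = dT_sample / q             # areal thermal resistance, m^2 K/W

# Gaussian propagation: each temperature difference comes from two
# thermocouples, so its variance is 2 * u_T^2.
u_dT = math.sqrt(2) * u_T
u_R = R * math.sqrt((u_dT / dT_sample) ** 2 + (u_dT / dT_bar) ** 2)
print(f"R = {R * 1e6:.1f} mm^2 K/W +/- {u_R / R * 100:.1f}%")   # R = 4.1 mm^2 K/W +/- 9.0%
```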
Cardi, Teodoro; D’Agostino, Nunzio; Tripodi, Pasquale
2017-01-01
With modern agriculture facing a predicted increase in population and general environmental change, securing high-quality food remains a major challenge. Vegetable crops include a large number of species, characterized by multiple geographical origins, large genetic variability and diverse reproductive features. Due to their nutritional value, they have an important place in the human diet. In recent years, many crop genomes have been sequenced, permitting the identification of genes and superior alleles associated with desirable traits. Furthermore, innovative biotechnological approaches make it possible to take a step forward towards the development of new, improved cultivars harboring precise genome modifications. Sequence-based knowledge coupled with advanced biotechnologies is supporting the widespread application of new plant breeding techniques to enhance success in the modification and transfer of useful alleles into target varieties. The clustered regularly interspaced short palindromic repeats (CRISPR)/CRISPR-associated protein 9 system, zinc-finger nucleases, and transcription activator-like effector nucleases represent the main methods available for plant genome engineering through targeted modifications. Such technologies, however, require efficient transformation protocols as well as extensive genomic resources and accurate knowledge before they can be efficiently exploited in practical breeding programs. In this review, we survey the state of the art regarding the availability of such scientific and technological resources in various groups of vegetables, describe the genome editing results obtained so far, and discuss the implications for future applications. PMID:28275380
NASA Astrophysics Data System (ADS)
Swanson, C.; Jandovitz, P.; Cohen, S. A.
2017-10-01
Knowledge of the full x-ray energy distribution function (XEDF) emitted from a plasma over a large dynamic range of energies can yield valuable insights into the electron energy distribution function (EEDF) of that plasma and the dynamic processes that create it. X-ray pulse-height detectors such as Amptek's X-123 Fast SDD with a silicon nitride window can detect x-rays in the range of 200 eV to hundreds of keV. However, extracting the EEDF from this measurement requires precise knowledge of the detector's response function. This response function, including the energy-scale calibration, the window transmission function, and the resolution function, can be measured directly. We describe measurements of this function using x-rays from a mono-energetic electron beam in a purpose-built gas-target x-ray tube. Large-Z effects such as line radiation, nuclear charge screening, and polarizational bremsstrahlung are discussed.
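A detector response of this kind is often modeled as an energy-dependent window transmission followed by Gaussian resolution smearing; the following schematic forward model (all curves and parameters invented) shows the mapping that must be inverted to recover the EEDF:

```python
import numpy as np

E = np.linspace(0.2, 30.0, 600)            # photon energy grid, keV
true_xedf = np.exp(-E / 3.0)               # toy bremsstrahlung-like tail

transmission = np.exp(-(0.15 / E) ** 3)    # toy thin-window transmission curve
sigma_keV = 0.05 + 0.01 * np.sqrt(E)       # toy energy-dependent resolution

# Smear each true-spectrum bin with its own Gaussian resolution kernel.
measured = np.zeros_like(E)
for i, e0 in enumerate(E):
    kernel = np.exp(-0.5 * ((E - e0) / sigma_keV[i]) ** 2)
    kernel /= kernel.sum()
    measured += true_xedf[i] * transmission[i] * kernel
```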
NASA Technical Reports Server (NTRS)
Roth, Don J.; Cosgriff, Laura M.; Harder, Bryan; Zhu, Dongming; Martin, Richard E.
2013-01-01
This study investigates the applicability of a novel noncontact, single-sided terahertz electromagnetic measurement method for measuring thickness in dielectric coating systems having either dielectric or conductive substrate materials. The method does not require knowledge of the velocity of terahertz waves in the coating material. The dielectric coatings ranged from approximately 300 to 1400 μm in thickness. First, the terahertz method was validated on a bulk dielectric sample to determine its ability to precisely measure thickness and density variation. Then, the method was studied on simulated coating systems. One simulated coating consisted of layered thin paper samples of varying thicknesses on a ceramic substrate. Another simulated coating system consisted of adhesive-backed Teflon adhered to conducting and dielectric substrates. Alumina samples coated with a ceramic adhesive layer were also investigated. Finally, the method was studied for thickness measurement of actual thermal barrier coatings (TBCs) on ceramic substrates. The unique aspects and limitations of this method for thickness measurements are discussed.
Toward precision medicine in Alzheimer's disease.
Reitz, Christiane
2016-03-01
In Western societies, Alzheimer's disease (AD) is the most common form of dementia and the sixth leading cause of death. In recent years, the concept of precision medicine, an approach to disease prevention and treatment that is personalized to an individual's specific pattern of genetic variability, environment, and lifestyle factors, has emerged. While for some diseases, in particular select cancers and a few monogenic disorders such as cystic fibrosis, significant advances in precision medicine have been made over the past years, for most other diseases precision medicine is only in its beginnings. To advance the application of precision medicine to a wider spectrum of disorders, governments around the world are starting to launch Precision Medicine Initiatives, major efforts to generate the extensive scientific knowledge needed to integrate the model of precision medicine into everyday clinical practice. In this article we summarize the state of precision medicine in AD, review major obstacles to its development, and discuss its benefits in this highly prevalent, clinically and pathologically complex disease.
[Classification in medicine. An introductory reflection on its aim and object].
Giere, W
2007-07-01
Human beings are born with the ability to recognize Gestalt and to classify. However, all classifications depend on their circumstances and intentions. There is no ultimate classification, and there is no single correct classification in medicine either. Examples of classifications of diagnoses, symptoms and procedures are discussed. The path to gaining knowledge and the basic difference between collecting data (patient file) and sorting data (register) are illustrated using the BAIK information model. The model also shows how the doctor can profit from an active electronic patient file that automatically offers other information relevant to the current decision and saves time. 'Without classification no new knowledge, no new knowledge through classification': this paradox will eventually be resolved, since a change of paradigms requires overcoming the currently valid classification system in medicine as well. Finally, more precise recommendations are given on how doctors can be freed from the burden of the need to classify, and how the whole health system can gain much more valid data through coordinated use of IT, without limiting doctors' freedom and creativity, all while saving money.
Lanier, Wendy E.; Bailey, Larissa L.; Muths, Erin L.
2016-01-01
Conservation of imperiled species often requires knowledge of vital rates and population dynamics. However, these can be difficult to estimate for rare species and small populations. This problem is further exacerbated when individuals are not available for detection during some surveys due to limited access, delaying surveys and creating mismatches between breeding behavior and survey timing. Here we use simulations to explore the impacts of this issue using four hypothetical boreal toad (Anaxyrus boreas boreas) populations, representing combinations of logistical access (accessible, inaccessible) and breeding behavior (synchronous, asynchronous). We examine the bias and precision of survival and breeding probability estimates generated by survey designs that differ in effort and timing for these populations. Our findings indicate that the logistical access of a site and mismatch between breeding behavior and survey design can greatly limit the ability to obtain accurate and precise estimates of survival and breeding probabilities. Simulations similar to those we have performed can help researchers determine an optimal survey design for their system before initiating sampling efforts.
Liu, Yi
2016-06-01
With social and economic development, people's lifestyles have changed, accompanied by the problem of population aging. The spectrum of disease has varied accordingly, leading to complicated and varied wound aetiologies and to the formation of innumerable, changing acute and chronic wounds. It is therefore hard for a single discipline to meet the requirement for multidisciplinary knowledge and technique in the diagnosis and treatment of some extraordinary agent wounds. An extraordinary agent wound is caused by an uncommon or rare etiological factor; its specialty lies in the unique mechanism of wound formation, and many disciplines are involved in its diagnosis and management. A unification of multiple disciplines is needed to integrate the relevant theory and technique to care for the wound, giving consideration to both the symptom and the aetiology. The primary diseases that induce the uncommon agent wound should be targeted and treated effectively; meanwhile, comprehensive treatment combining multiple new wound management techniques should be carried out to realize the objective of precise treatment.
Psychoanalytic application and psychoanalytic integrity.
O'Neill, Sylvia
2005-02-01
In this article, the author offers an analysis of psychoanalytic application, defined as the breaking of new conceptual ground in some field of knowledge whereby the new idea is conceived, and later articulated, with the aid of reference to analogous phenomena in psychoanalysis. It requires apt analogy based on competent understanding of the applied field and of psychoanalysis. Only when the relevant differences between the applied and psychoanalytic fields are grasped can the extent of certain parallels emerge. The thinking by analogy that comprises psychoanalytic application may be intuitive and implicit, but should be susceptible of explicit theoretical elaboration that specifies, precisely, the point(s) of correspondence between psychoanalysis and the applied field in relation to a precise specification of their relevant differences. Applied psychotherapy at the interface of the internal and external worlds (historically rooted in casework) is employed as a model. By analogy with Donnet's concept of the analytic site, the author proposes the concept of the psychodynamic (case)work site, and elaborates it for that applied field in order to elucidate the proposed principles of psychoanalytic application.
Aguilar, M; Ali Cavasonza, L; Ambrosi, G; Arruda, L; Attig, N; Aupetit, S; Azzarello, P; Bachlechner, A; Barao, F; Barrau, A; Barrin, L; Bartoloni, A; Basara, L; Başeğmez-du Pree, S; Battarbee, M; Battiston, R; Becker, U; Behlmann, M; Beischer, B; Berdugo, J; Bertucci, B; Bindel, K F; Bindi, V; Boella, G; de Boer, W; Bollweg, K; Bonnivard, V; Borgia, B; Boschini, M J; Bourquin, M; Bueno, E F; Burger, J; Cadoux, F; Cai, X D; Capell, M; Caroff, S; Casaus, J; Castellini, G; Cervelli, F; Chae, M J; Chang, Y H; Chen, A I; Chen, G M; Chen, H S; Cheng, L; Chou, H Y; Choumilov, E; Choutko, V; Chung, C H; Clark, C; Clavero, R; Coignet, G; Consolandi, C; Contin, A; Corti, C; Creus, W; Crispoltoni, M; Cui, Z; Dai, Y M; Delgado, C; Della Torre, S; Demakov, O; Demirköz, M B; Derome, L; Di Falco, S; Dimiccoli, F; Díaz, C; von Doetinchem, P; Dong, F; Donnini, F; Duranti, M; D'Urso, D; Egorov, A; Eline, A; Eronen, T; Feng, J; Fiandrini, E; Finch, E; Fisher, P; Formato, V; Galaktionov, Y; Gallucci, G; García, B; García-López, R J; Gargiulo, C; Gast, H; Gebauer, I; Gervasi, M; Ghelfi, A; Giovacchini, F; Goglov, P; Gómez-Coral, D M; Gong, J; Goy, C; Grabski, V; Grandi, D; Graziani, M; Guo, K H; Haino, S; Han, K C; He, Z H; Heil, M; Hoffman, J; Hsieh, T H; Huang, H; Huang, Z C; Huh, C; Incagli, M; Ionica, M; Jang, W Y; Jinchi, H; Kang, S C; Kanishev, K; Kim, G N; Kim, K S; Kirn, Th; Konak, C; Kounina, O; Kounine, A; Koutsenko, V; Krafczyk, M S; La Vacca, G; Laudi, E; Laurenti, G; Lazzizzera, I; Lebedev, A; Lee, H T; Lee, S C; Leluc, C; Li, H S; Li, J Q; Li, J Q; Li, Q; Li, T X; Li, W; Li, Y; Li, Z H; Li, Z Y; Lim, S; Lin, C H; Lipari, P; Lippert, T; Liu, D; Liu, Hu; Lordello, V D; Lu, S Q; Lu, Y S; Luebelsmeyer, K; Luo, F; Luo, J Z; Lv, S S; Machate, F; Majka, R; Mañá, C; Marín, J; Martin, T; Martínez, G; Masi, N; Maurin, D; Menchaca-Rocha, A; Meng, Q; Mikuni, V M; Mo, D C; Morescalchi, L; Mott, P; Nelson, T; Ni, J Q; Nikonov, N; Nozzoli, F; Oliva, A; Orcinha, M; Palmonari, F; Palomares, C; Paniccia, M; Pauluzzi, M; Pensotti, S; Pereira, R; Picot-Clemente, N; Pilo, F; Pizzolotto, C; Plyaskin, V; Pohl, M; Poireau, V; Putze, A; Quadrani, L; Qi, X M; Qin, X; Qu, Z Y; Räihä, T; Rancoita, P G; Rapin, D; Ricol, J S; Rosier-Lees, S; Rozhkov, A; Rozza, D; Sagdeev, R; Sandweiss, J; Saouter, P; Schael, S; Schmidt, S M; Schulz von Dratzig, A; Schwering, G; Seo, E S; Shan, B S; Shi, J Y; Siedenburg, T; Son, D; Song, J W; Sun, W H; Tacconi, M; Tang, X W; Tang, Z C; Tao, L; Tescaro, D; Ting, Samuel C C; Ting, S M; Tomassetti, N; Torsti, J; Türkoğlu, C; Urban, T; Vagelli, V; Valente, E; Vannini, C; Valtonen, E; Vázquez Acosta, M; Vecchi, M; Velasco, M; Vialle, J P; Vitale, V; Vitillo, S; Wang, L Q; Wang, N H; Wang, Q L; Wang, X; Wang, X Q; Wang, Z X; Wei, C C; Weng, Z L; Whitman, K; Wienkenhöver, J; Wu, H; Wu, X; Xia, X; Xiong, R Q; Xu, W; Yan, Q; Yang, J; Yang, M; Yang, Y; Yi, H; Yu, Y J; Yu, Z Q; Zeissler, S; Zhang, C; Zhang, J; Zhang, J H; Zhang, S D; Zhang, S W; Zhang, Z; Zheng, Z M; Zhu, Z Q; Zhuang, H L; Zhukov, V; Zichichi, A; Zimmermann, N; Zuccon, P
2016-12-02
Knowledge of the rigidity dependence of the boron to carbon flux ratio (B/C) is important in understanding the propagation of cosmic rays. The precise measurement of the B/C ratio from 1.9 GV to 2.6 TV, based on 2.3 million boron and 8.3 million carbon nuclei collected by AMS during the first 5 years of operation, is presented. The detailed variation with rigidity of the B/C spectral index is reported for the first time. The B/C ratio does not show any significant structures in contrast to many cosmic ray models that require such structures at high rigidities. Remarkably, above 65 GV, the B/C ratio is well described by a single power law R^{Δ} with index Δ=-0.333±0.014(fit)±0.005(syst), in good agreement with the Kolmogorov theory of turbulence which predicts Δ=-1/3 asymptotically.
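Fitting a single power law R^Δ reduces to linear regression in log-log space; a minimal sketch on synthetic data generated with Δ = -1/3 (not AMS measurements):

```python
import numpy as np

rng = np.random.default_rng(0)
R = np.geomspace(65, 2600, 30)                       # rigidity grid, GV
bc = 0.3 * R ** (-1.0 / 3.0) * rng.normal(1.0, 0.02, R.size)  # synthetic B/C with 2% scatter

# Power law y = A * R^Delta  <=>  log y = Delta * log R + log A
slope, intercept = np.polyfit(np.log(R), np.log(bc), 1)
print(f"fitted Delta = {slope:.3f}")                 # recovers ~ -0.333
```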
Sensing qualitative events to control manipulation
NASA Astrophysics Data System (ADS)
Pook, Polly K.; Ballard, Dana H.
1992-11-01
Dexterous robotic hands have numerous sensors distributed over a flexible high-degree-of-freedom framework. Control of these hands often relies on a detailed task description that is either specified a priori or computed on-line from sensory feedback. Such controllers are complex and may use unnecessary precision. In contrast, one can incorporate plan cues that provide a contextual backdrop in order to simplify the control task. To demonstrate, a Utah/MIT dexterous hand mounted on a Puma 760 arm flips a plastic egg, using the finger tendon tensions as the sole control signal. The completion of each subtask, such as picking up the spatula, finding the pan, and sliding the spatula under the egg, is detected by sensing tension states. The strategy depends on the task context but does not require precise positioning knowledge. We term this qualitative manipulation to draw a parallel with qualitative vision strategies. The approach is to design closed-loop programs that detect significant events to control manipulation but ignore inessential details. The strategy is generalized by analyzing the robot state dynamics during teleoperated hand actions to reveal the essential features that control each action.
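The tension-state strategy can be pictured as a small state machine over discretized tension levels; a toy sketch (thresholds and transition labels invented, not the authors' controller):

```python
def tension_state(tension_newtons):
    """Map a raw tendon tension reading to a qualitative state."""
    if tension_newtons < 0.5:
        return "free"        # fingers unloaded
    if tension_newtons < 3.0:
        return "contact"     # light touch, e.g. spatula resting on the pan
    return "grasp"           # object firmly held

def detect_event(prev_state, new_state):
    """Return a subtask-completion event for a qualitative state transition."""
    transitions = {("free", "grasp"): "picked up spatula",
                   ("grasp", "contact"): "spatula touching pan"}
    return transitions.get((prev_state, new_state))

print(detect_event(tension_state(0.2), tension_state(4.1)))   # picked up spatula
```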
Too Much to Count On: Impaired Very Small Numbers in Corticobasal Degeneration
ERIC Educational Resources Information Center
Halpern, Casey; Clark, Robin; Moore, Peachie; Cross, Katy; Grossman, Murray
2007-01-01
Patients with corticobasal degeneration (CBD) have calculation impairments. This study examined whether impaired number knowledge depends on verbal mediation. We focused particularly on knowledge of very small numbers, where there is a precise relationship between a cardinality and its number concept, but little hypothesized role for verbal…
Xu, Rong; Li, Li; Wang, QuanQiu
2013-01-01
Motivation: Systems approaches to studying phenotypic relationships among diseases are emerging as an active area of research for both novel disease gene discovery and drug repurposing. Currently, systematic study of disease phenotypic relationships on a phenome-wide scale is limited because large-scale machine-understandable disease–phenotype relationship knowledge bases are often unavailable. Here, we present an automatic approach to extract disease–manifestation (D-M) pairs (one specific type of disease–phenotype relationship) from the wide body of published biomedical literature. Data and Methods: Our method leverages external knowledge and limits the amount of human effort required. For the text corpus, we used 119 085 682 MEDLINE sentences (21 354 075 citations). First, we used D-M pairs from existing biomedical ontologies as prior knowledge to automatically discover D-M–specific syntactic patterns. We then extracted additional pairs from MEDLINE using the learned patterns. Finally, we analysed correlations between disease manifestations and disease-associated genes and drugs to demonstrate the potential of this newly created knowledge base in disease gene discovery and drug repurposing. Results: In total, we extracted 121 359 unique D-M pairs with a high precision of 0.924. Among the extracted pairs, 120 419 (99.2%) have not been captured in existing structured knowledge sources. We have shown that disease manifestations correlate positively with both disease-associated genes and drug treatments. Conclusions: The main contribution of our study is the creation of a large-scale and accurate D-M phenotype relationship knowledge base. This unique knowledge base, when combined with existing phenotypic, genetic and proteomic datasets, can have profound implications in our deeper understanding of disease etiology and in rapid drug repurposing. Availability: http://nlp.case.edu/public/data/DMPatternUMLS/ Contact: rxx@case.edu PMID:23828786
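Pattern-based relation extraction of this sort bootstraps from seed pairs: sentences containing a known pair yield lexical patterns, which then extract new pairs. A toy sketch (sentences and seed pairs invented, far simpler than the paper's MEDLINE-scale pipeline):

```python
import re

seed_pairs = {("asthma", "wheezing")}                 # prior knowledge from an ontology
sentences = [
    "Patients with asthma often present with wheezing.",
    "Patients with migraine often present with photophobia.",
]

# Learn a lexical pattern from each sentence containing a known pair.
patterns = set()
for disease, manifestation in seed_pairs:
    for s in sentences:
        if disease in s and manifestation in s:
            patterns.add(re.escape(s)
                         .replace(re.escape(disease), "(?P<disease>\\w+)")
                         .replace(re.escape(manifestation), "(?P<manifestation>\\w+)"))

# Apply the learned patterns to extract new pairs from the corpus.
extracted = set()
for p in patterns:
    for s in sentences:
        hit = re.match(p, s)
        if hit:
            extracted.add((hit["disease"], hit["manifestation"]))
print(extracted)   # includes the new pair ('migraine', 'photophobia')
```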
Fuiko, R; Kotten, B; Zettl, R; Ritschl, P
2004-03-01
Kinematic and pointing procedures are used for non-image-based navigated implantation in total knee replacement. Pointing procedures require exact knowledge of the landmarks. In this anatomical study, landmarks were defined and repeatedly referenced; precision and reproducibility were evaluated by means of inter- and intra-observer studies. The axes of the femur and tibia were calculated using the landmarks. The specific landmarks of 30 femurs and 27 tibias were palpated by three surgeons and digitised by means of a photogrammetric system, as used intra-operatively. The recorded data were statistically evaluated. The specific landmarks can be referenced with great precision. The vectors that influence the implant position show a mean femoral deviation of 0.9 mm and a mean tibial deviation of 1.0 mm. The repeatability for each observer was 1.5 mm at the femur and 1.0 mm at the tibia. The calculated long axes of the femur and tibia thus reach a precision of 0.1 degrees (min-max: 0-0.9 degrees) at the femur and 0.2 degrees (0-1.1 degrees) at the tibia. The short axes at the distal femur and proximal tibia exhibit an average deviation of 0.7 to 1.9 degrees (0-11.3 degrees). Long axes (mechanical axes) can be determined exactly, but the precision of the short axes (rotational axes) is unsatisfactory, although palpation of the landmarks was accurate. Therefore, palpation of more than one rotational axis at the femur and tibia is mandatory and should be visualized on the monitor during surgery.
della Croce, U; Cappozzo, A; Kerrigan, D C
1999-03-01
Human movement analysis using stereophotogrammetry is based on the reconstruction of the instantaneous laboratory position of selected bony anatomical landmarks (ALs). For this purpose, knowledge of each AL's position in the relevant bone-embedded frame is required. Because ALs are not points but relatively large and curved areas, their identification by palpation or other means is subject to both intra- and inter-examiner variability. In addition, the local position of ALs, as reconstructed using an ad hoc experimental procedure (AL calibration), is affected by photogrammetric errors. The intra- and inter-examiner precision with which the local positions of pelvis and lower limb palpable bony ALs can be identified and reconstructed was experimentally assessed. Six examiners and two subjects participated in the study. Intra- and inter-examiner precision (RMS distance from the mean position) was in the range of 6-21 mm and 13-25 mm, respectively. Propagation of the imprecision of ALs to the orientation of bone-embedded anatomical frames and to hip, knee and ankle joint angles was assessed. Results showed that this imprecision may distort joint angle versus time functions to the extent that information relative to angular movements in the range of 10 degrees or lower may be concealed. Bone geometry parameters estimated using the same data showed that the relevant precision does not allow for reliable bone geometry description. These findings, together with those relative to skin movement artefacts reported elsewhere, raise the human movement analyst's awareness of the possible limitations involved in 3D movement analysis using stereophotogrammetry and call for improvements of the relevant experimental protocols.
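The precision metric used here, the RMS distance of repeated landmark reconstructions from their mean position, is straightforward to compute; a minimal sketch with illustrative coordinates (not the study's data):

```python
import numpy as np

def rms_from_mean(points):
    """points: (n_repetitions, 3) array of reconstructed AL positions, mm.
    Returns the RMS distance from the mean position."""
    centered = points - points.mean(axis=0)
    return np.sqrt((np.linalg.norm(centered, axis=1) ** 2).mean())

trials = np.array([[102.1, 55.3, 12.0],     # repeated calibrations of one AL
                   [104.0, 54.1, 13.2],
                   [101.5, 56.0, 11.4]])
print(f"intra-examiner precision: {rms_from_mean(trials):.1f} mm")
```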
Two-phase strategy of controlling motor coordination determined by task performance optimality.
Shimansky, Yury P; Rand, Miya K
2013-02-01
A quantitative model of optimal coordination between hand transport and grip aperture has been derived in our previous studies of reach-to-grasp movements without utilizing explicit knowledge of the optimality criterion or motor plant dynamics. The model's utility for experimental data analysis has been demonstrated. Here we show how to generalize this model for a broad class of reaching-type, goal-directed movements. The model allows for measuring the variability of motor coordination and studying its dependence on movement phase. The experimentally found characteristics of that dependence imply that execution noise is low and does not affect motor coordination significantly. From those characteristics it is inferred that the cost of neural computations required for information acquisition and processing is included in the criterion of task performance optimality as a function of precision demand for state estimation and decision making. The precision demand is an additional optimized control variable that regulates the amount of neurocomputational resources activated dynamically. It is shown that an optimal control strategy in this case comprises two different phases. During the initial phase, the cost of neural computations is significantly reduced at the expense of reducing the demand for their precision, which results in speed-accuracy tradeoff violation and significant inter-trial variability of motor coordination. During the final phase, neural computations and thus motor coordination are considerably more precise to reduce the cost of errors in making a contact with the target object. The generality of the optimal coordination model and the two-phase control strategy is illustrated on several diverse examples.
[Some reflections on evidenced-based medicine, precision medicine, and big data-based research].
Tang, J L; Li, L M
2018-01-10
Evidence-based medicine remains the best paradigm for medical practice. However, evidence alone does not make decisions; decisions must also consider available resources and the values of people. Evidence shows that most of those treated with blood pressure-lowering, cholesterol-lowering, glucose-lowering and anti-cancer drugs do not benefit in terms of prevented severe complications such as cardiovascular events and deaths. This implies that diagnosis and treatment in modern medicine are in many circumstances imprecise. It has become a dream to identify and treat only the few who will respond to a treatment. Precision medicine has thus come into being. Precision medicine is, however, not a new idea, and it cannot rely solely on gene sequencing as initially proposed. Neither is the large-cohort, multi-factorial approach a new idea; in fact it has been used widely since the 1950s. Since its very beginning, medicine has never stopped searching for more precise diagnostic and therapeutic methods, and it has already made achievements at various levels of our understanding and knowledge, such as vaccines, blood transfusion, imaging, and cataract surgery. Genetic biotechnology is not the only path to precision but merely a new method. Most genes are found to be only weakly associated with disease and are thus unlikely to lead to great improvements in diagnostic and therapeutic precision. The traditional multi-factorial approach, embracing big data and incorporating genetic factors, is probably the most realistic way ahead for precision medicine. Big data boasts possession of the total population and large sample sizes, and claims that correlation can displace causation; these are seriously misleading concepts. Science has never had to observe the totality in order to draw a valid conclusion; a large sample size is required only when the anticipated effect is small and clinically less meaningful; and emphasis on correlation over causation is equivalent to rejecting the scientific principles and methods of epidemiology, a call to give up the assurance of validity in scientific research that will inevitably lead to futile interventions. Furthermore, in proving the effectiveness of an intervention, analyses of real-world big data cannot displace the randomized controlled trial. We express doubts and critiques in this article on precision medicine and big data, merely hoping to stimulate discussion of their true potential.
SPARQL Query Re-writing Using Partonomy Based Transformation Rules
NASA Astrophysics Data System (ADS)
Jain, Prateek; Yeh, Peter Z.; Verma, Kunal; Henson, Cory A.; Sheth, Amit P.
Often the information present in a spatial knowledge base is represented at a different level of granularity and abstraction than the query constraints. For querying ontologies containing spatial information, the precise relationships between spatial entities have to be specified in the basic graph pattern of a SPARQL query, which can result in long and complex queries. We present a novel approach to help users intuitively write SPARQL queries to query spatial data, rather than relying on knowledge of the ontology structure. Our framework re-writes queries using transformation rules that exploit part-whole relations between geographical entities to address mismatches between query constraints and the knowledge base. Our experiments were performed on completely third-party datasets and queries: evaluations were performed on the Geonames dataset using questions from the National Geographic Bee serialized into SPARQL, and on the British Administrative Geography Ontology using questions from a popular trivia website. These experiments demonstrate high precision in retrieval of results and ease in writing queries.
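A partonomy-based re-write can be as simple as replacing a direct spatial property with a transitive property path; a schematic sketch (rule, query, and prefixes invented for illustration, using SPARQL 1.1 property-path syntax, not the paper's rule set):

```python
# The user writes a direct constraint; the rule expands it so that cities
# nested inside intermediate administrative parts still match.
original = """SELECT ?city WHERE {
  ?city gn:parentFeature dbp:England .
}"""

# Transformation rule: parentFeature may hold through any chain of parts.
rewritten = original.replace(
    "gn:parentFeature dbp:England",
    "gn:parentFeature/gn:parentFeature* dbp:England")
print(rewritten)
```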
Enhancing Knowledge Flow in a Health Care Context: A Mobile Computing Approach
Souza, Diego Da Silva; de Lima, Patrícia Zudio; da Silveira, Pedro C; de Souza, Jano Moreira
2014-01-01
Background: Advances in mobile computing and wireless communication have allowed people to interact and exchange knowledge almost anywhere. These technologies support Medicine 2.0, where health knowledge flows among all involved people (eg, patients, caregivers, doctors, and patients' relatives). Objective: Our paper proposes a knowledge-sharing environment that takes advantage of mobile computing and contextual information to support knowledge sharing among participants within a health care community (ie, from patients to health professionals). This software environment enables knowledge exchange using peer-to-peer (P2P) mobile networks based on users' profiles, and it facilitates face-to-face interactions among people with similar health interests, needs, or goals. Methods: First, we reviewed and analyzed relevant scientific articles and software apps to determine the current state of knowledge flow within health care. Although no proposal was capable of addressing every aspect of the Medicine 2.0 paradigm, a list of requirements was compiled. Using this requirement list and our previous works, a knowledge-sharing environment was created integrating Mobile Exchange of Knowledge (MEK) and the Easy to Deploy Indoor Positioning System (EDIPS), and a twofold qualitative evaluation was performed: we analyzed the efficiency and reliability of the knowledge that the integrated MEK-EDIPS tool provided to users according to their topics of interest, and then performed a proof of concept with health professionals to determine the feasibility and usefulness of using this solution in a real-world scenario. Results: Using MEK, we reached 100% precision and 80% recall in the exchange of files within the peer-to-peer network. The mechanism that facilitated face-to-face interactions was evaluated by the difference between the location indicated by the EDIPS tool and the actual location of the people involved in the knowledge exchange; the average distance error was <6.28 m for an indoor environment. The usability and usefulness of the tool were assessed by questioning a sample of 18 health professionals: 94% (17/18) agreed that the integrated MEK-EDIPS tool provides greater interaction among all participants (eg, patients, caregivers, doctors, and patients' relatives), most considered it extremely important in the health scenario, 72% (13/18) believed it could increase the knowledge flow in a health environment, and 67% (12/18) would recommend its use. Conclusions: The integrated MEK-EDIPS tool provides more services than any other software tool analyzed in this paper and seems to be the best alternative for supporting health knowledge flow within the Medicine 2.0 paradigm. PMID:25427923
Action Priority: Early Neurophysiological Interaction of Conceptual and Motor Representations
Koester, Dirk; Schack, Thomas
2016-01-01
Handling our everyday life, we often react manually to verbal requests or instructions, but the functional interrelations of motor control and language, especially their neurophysiological basis, are not yet fully understood. Here, we investigated whether specific motor representations for grip types interact neurophysiologically with conceptual information, that is, when reading nouns. Participants performed lexical decisions and, for words, executed a grasp-and-lift task on objects of different sizes involving precision or power grips while the electroencephalogram was recorded. Nouns could denote objects that require either a precision or a power grip and could thus be (in)congruent with the performed grasp. In a control block, participants pointed at the objects instead of grasping them. The main result revealed an event-related potential (ERP) interaction of grip type and conceptual information that was not present for pointing. Incongruent compared to congruent conditions elicited an increased positivity (100–200 ms after noun onset). Grip-type effects were obtained in response-locked analyses of the grasping ERPs (100–300 ms at left anterior electrodes). These findings attest that grip type and conceptual information are functionally related when planning a grasping action, but such an interaction could not be detected for pointing. Generally, the results suggest that control of behaviour can be modulated by task demands; conceptual noun information (i.e., associated action knowledge) may gain processing priority if the task requires a complex motor response. PMID:27973539
Evaluation of response variables in computer-simulated virtual cataract surgery
NASA Astrophysics Data System (ADS)
Söderberg, Per G.; Laurell, Carl-Gustaf; Simawi, Wamidh; Nordqvist, Per; Skarman, Eva; Nordh, Leif
2006-02-01
We have developed a virtual reality (VR) simulator for phacoemulsification (phaco) surgery. The current work aimed at evaluating the precision of the estimation of response variables identified for measuring the performance of VR phaco surgery. We identified 31 response variables measuring the overall procedure, the foot pedal technique, the phacoemulsification technique, erroneous manipulation, and damage to ocular structures. In total, 8 medical or optometry students with a good knowledge of ocular anatomy and physiology but naive to cataract surgery performed three sessions each of VR phaco surgery. For measurement, the surgical procedure was divided into a sculpting phase and an evacuation phase. The 31 response variables were measured for each phase in all three sessions. The variance components for individuals, and for iterations of sessions within individuals, were estimated with an analysis of variance assuming a hierarchical model. The consequences of the estimated variability for sample size requirements were determined. Generally, there was more variability for iterated sessions within individuals for measurements of the sculpting phase than for the evacuation phase. This resulted in larger required sample sizes for detecting differences between independent groups, or change within a group, for the sculpting phase compared with the evacuation phase. It is concluded that several of the identified response variables can be measured with sufficient precision for the evaluation of VR phaco surgery.
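Once variance components are estimated, the required sample size per independent group follows from the usual two-sample normal approximation applied to subject means; a minimal sketch with invented variance numbers (not the study's estimates):

```python
import math

def n_per_group(delta, s2_between, s2_within, n_sessions=3,
                z_alpha=1.96, z_power=0.84):
    """Sample size per group to detect a mean difference `delta` between
    independent groups, when each subject's score is the mean of
    `n_sessions` sessions (5% two-sided alpha, 80% power)."""
    var_subject_mean = s2_between + s2_within / n_sessions
    return math.ceil(2 * var_subject_mean * (z_alpha + z_power) ** 2 / delta ** 2)

# Larger within-subject (session) variance, as seen for the sculpting phase,
# drives the required sample size up.
print(n_per_group(delta=5.0, s2_between=40.0, s2_within=90.0))   # 44
print(n_per_group(delta=5.0, s2_between=40.0, s2_within=30.0))   # 32
```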
Kotora, Joseph G
2015-01-01
Emergency healthcare providers are required to care for victims of Chemical, Biological, Radiologic, Nuclear, and Explosive (CBRNE) agents. However, US emergency departments are often ill prepared to manage CBRNE casualties. Most providers lack adequate knowledge or experience in the areas of patient decontamination, hospital-specific disaster protocols, interagency familiarization, and the available supply of necessary medical equipment and medications. This study evaluated the CBRNE preparedness of physicians, nurses, and midlevel providers in an urban tertiary care emergency department. This retrospective observational survey study used a previously constructed questionnaire instrument. A total of 205 e-mail invitations were sent to 191 eligible providers through an online survey distribution tool (Survey Monkey®). Respondents were enrolled from February 1, 2014 to March 15, 2014. Simple frequencies of correct answers were used to determine the level of preparedness of each group. Cronbach's coefficient α was used to validate the precision of the study instrument. Finally, validity coefficients and analysis of variance (ANOVA) were used to determine the strength of correlation between demographic variables, as well as the variation between individual responses. Fifty-nine providers responded to the questionnaire (31.14 percent response rate). The overall frequency of correct answers was 66.26 percent, indicating a relatively poor level of CBRNE preparedness. The study instrument lacked precision and reliability (coefficient α = 0.4050). Significant correlations were found between the frequency of correct answers and the respondents' gender, practice experience, and previous experience with a CBRNE incident. Significant variance exists between how providers believe casualties should be decontaminated, which drugs should be administered, and the interpretation of facility-specific protocols. Emergency care providers are inadequately prepared to manage CBRNE incidents. Furthermore, a valid and precise instrument capable of measuring preparedness needs to be developed. Standardized educational curricula that consider healthcare providers' genders, occupations, and experience levels may assist in closing the knowledge gaps between providers and reinforce emergency departments' CBRNE preparedness.
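Cronbach's coefficient α is computed from the item variances and the variance of the total score; a minimal sketch with invented response data (random answers, so α comes out near zero, echoing the low reliability reported above):

```python
import numpy as np

def cronbach_alpha(scores):
    """scores: (n_respondents, k_items) array of item scores."""
    k = scores.shape[1]
    item_vars = scores.var(axis=0, ddof=1).sum()
    total_var = scores.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars / total_var)

rng = np.random.default_rng(1)
scores = rng.integers(0, 2, size=(59, 20))     # 59 respondents, 20 binary items (invented)
print(f"alpha = {cronbach_alpha(scores):.2f}") # near zero for uncorrelated items
```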
From Whiteboard to Model: A Preliminary Analysis (Preprint)
2007-01-01
models built for process control emphasize precision, and so on. One's available knowledge and data also influence the modeling task. Domains such as ... chemical engineering and circuit design are knowledge-rich, which enables a realistic expression of the entities and relationships within the ... index into the others so that the modeler can move about freely with ease. And third, no matter the richness of knowledge or data about an
The philosophy of 'unity of knowledge and action' in interventional neuroradiology teaching.
Lv, Xianli; Wu, Zhongxue
2018-06-01
Despite the continuing emphasis on the importance of clinical skills, these skills do not appear to be improving and may actually be declining. The 'unity of knowledge and action' is a medicine directed precisely at this disease: it helps practitioners learn from failures and successes, learn from the mistakes of their predecessors, and institute behaviour that prevents repetition of these mistakes.
ERIC Educational Resources Information Center
Gilbert, Melissa C.
2014-01-01
This study examined the productive disposition of pre-algebra students who demonstrated similar knowledge of the focal content but varied in other academic behaviors expected in the Common Core State Standards for Mathematics (CCSSM). Specifically, the study considered students' attention to precision when critiquing a peer's work. The…
NASA Technical Reports Server (NTRS)
Bryant, Nevin A.; Logan, Thomas L.; Zobrist, Albert L.
2006-01-01
Improvements to the automated co-registration and change detection software package AFIDS (Automatic Fusion of Image Data System) have recently completed development for, and validation by, NGA/GIAT. The improvements involve the integration of the AFIDS ultra-fine gridding technique for horizontal displacement compensation with the recently evolved use of Rational Polynomial Functions/Coefficients (RPFs/RPCs) for indexing image raster pixel positions to latitude/longitude. Mapping and orthorectification (correction for elevation effects) of satellite imagery defies exact projective solutions because the data are not obtained from a single point (like a camera), but as a continuous process along the orbital path. Standard image processing techniques can apply approximate solutions, but advances in the state of the art had to be made for precision change-detection and time-series applications where relief offsets become a controlling factor. The earlier AFIDS procedure required the availability of a camera model and knowledge of the satellite platform ephemerides. The recent design advances connect the spacecraft sensor Rational Polynomial Function, a deductively developed model, with the AFIDS ultra-fine grid, an inductively developed representation of the relationship of raster pixel position to latitude/longitude. As a result, RPCs can be updated by AFIDS, a situation often necessary due to the accuracy limits of spacecraft navigation systems. An example of precision change detection will be presented from Quickbird.
The Undiagnosed Diseases Network: Accelerating Discovery about Health and Disease.
Ramoni, Rachel B; Mulvihill, John J; Adams, David R; Allard, Patrick; Ashley, Euan A; Bernstein, Jonathan A; Gahl, William A; Hamid, Rizwan; Loscalzo, Joseph; McCray, Alexa T; Shashi, Vandana; Tifft, Cynthia J; Wise, Anastasia L
2017-02-02
Diagnosis at the edges of our knowledge calls upon clinicians to be data driven, cross-disciplinary, and collaborative in unprecedented ways. Exact disease recognition, an element of the concept of precision in medicine, requires new infrastructure that spans geography, institutional boundaries, and the divide between clinical care and research. The National Institutes of Health (NIH) Common Fund supports the Undiagnosed Diseases Network (UDN) as an exemplar of this model of precise diagnosis. Its goals are to forge a strategy to accelerate the diagnosis of rare or previously unrecognized diseases, to improve recommendations for clinical management, and to advance research, especially into disease mechanisms. The network will achieve these objectives by evaluating patients with undiagnosed diseases, fostering a breadth of expert collaborations, determining best practices for translating the strategy into medical centers nationwide, and sharing findings, data, specimens, and approaches with the scientific and medical communities. Building the UDN has already brought insights to human and medical geneticists. The initial focus has been on establishing common protocols for institutional review boards and data sharing, creating protocols for referring and evaluating patients, and providing DNA sequencing, metabolomic analysis, and functional studies in model organisms. By extending this precision diagnostic model nationally, we strive to meld clinical and research objectives, improve patient outcomes, and contribute to medical science. Copyright © 2017 American Society of Human Genetics. All rights reserved.
A Method for Precision Closed-Loop Irrigation Using a Modified PID Control Algorithm
NASA Astrophysics Data System (ADS)
Goodchild, Martin; Kühn, Karl; Jenkins, Malcolm; Burek, Kazimierz; Dutton, Andrew
2016-04-01
The benefits of closed-loop irrigation control have been demonstrated in grower trials which show the potential for improved crop yields and resource usage. Managing water use by controlling irrigation in response to soil moisture changes to meet crop water demands is a popular approach but requires knowledge of closed-loop control practice. In theory, to obtain precise closed-loop control of a system it is necessary to characterise every component in the control loop to derive the appropriate controller parameters, i.e. proportional, integral & derivative (PID) parameters in a classic PID controller. In practice this is often difficult to achieve. Empirical methods are employed to estimate the PID parameters by observing how the system performs under open-loop conditions. In this paper we present a modified PID controller, with a constrained integral function, that delivers excellent regulation of soil moisture by supplying the appropriate amount of water to meet the needs of the plant during the diurnal cycle. Furthermore, the modified PID controller responds quickly to changes in environmental conditions, including rainfall events which can result in controller windup, under-watering, and plant stress conditions. The experimental work successfully demonstrates the functionality of a constrained integral PID controller that delivers robust and precise irrigation control. Data from a coir-substrate strawberry growing trial are also presented, illustrating soil moisture control and the ability to match water delivery to solar radiation.
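The paper's controller details are not given in this abstract, so the following is only a generic sketch of a PID loop with a constrained (clamped) integral term, the anti-windup idea the authors describe; all gains, limits, and soil-moisture numbers are illustrative.

```python
class ConstrainedPID:
    """PID controller with a clamped (constrained) integral term, a simple
    anti-windup measure in the spirit of the modified controller described
    above. Gains and limits are illustrative, not the authors' values."""
    def __init__(self, kp, ki, kd, i_min, i_max):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.i_min, self.i_max = i_min, i_max
        self.integral = 0.0
        self.prev_error = None

    def update(self, setpoint, measured, dt):
        error = setpoint - measured
        self.integral += error * dt
        # Constrain the integral so rainfall-driven offsets cannot wind it up
        self.integral = min(max(self.integral, self.i_min), self.i_max)
        derivative = 0.0 if self.prev_error is None else (error - self.prev_error) / dt
        self.prev_error = error
        u = self.kp * error + self.ki * self.integral + self.kd * derivative
        return max(u, 0.0)  # irrigation valves cannot extract water

pid = ConstrainedPID(kp=2.0, ki=0.1, kd=0.5, i_min=-5.0, i_max=5.0)
print(pid.update(setpoint=0.35, measured=0.30, dt=60.0))  # target vs measured VWC
```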
Probability and possibility-based representations of uncertainty in fault tree analysis.
Flage, Roger; Baraldi, Piero; Zio, Enrico; Aven, Terje
2013-01-01
Expert knowledge is an important source of input to risk analysis. In practice, experts might be reluctant to characterize their knowledge and the related (epistemic) uncertainty using precise probabilities. The theory of possibility allows for imprecision in probability assignments. The associated possibilistic representation of epistemic uncertainty can be combined with, and transformed into, a probabilistic representation; in this article, we show this with reference to a simple fault tree analysis. We apply an integrated (hybrid) probabilistic-possibilistic computational framework for the joint propagation of the epistemic uncertainty on the values of the (limiting relative frequency) probabilities of the basic events of the fault tree, and we use possibility-probability (probability-possibility) transformations for propagating the epistemic uncertainty within purely probabilistic and possibilistic settings. The results of the different approaches (hybrid, probabilistic, and possibilistic) are compared with respect to the representation of uncertainty about the top event (limiting relative frequency) probability. Both the rationale underpinning the approaches and the computational efforts they require are critically examined. We conclude that the approaches relevant in a given setting depend on the purpose of the risk analysis, and that further research is required to make the possibilistic approaches operational in a risk analysis context. © 2012 Society for Risk Analysis.
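One common way to move between the two settings mentioned above is the Dubois-Prade probability-to-possibility transformation; the sketch below implements it for a discrete distribution (tie handling simplified). This is a textbook transformation, not necessarily the exact one used in the article.

```python
import numpy as np

def prob_to_poss(p):
    """Dubois-Prade transformation of a discrete probability distribution
    into a possibility distribution: for values sorted by decreasing
    probability, pi_i is the tail sum of probabilities from i onward."""
    p = np.asarray(p, dtype=float)
    order = np.argsort(p)[::-1]                  # indices by decreasing probability
    pi = np.empty_like(p)
    pi[order] = np.cumsum(p[order][::-1])[::-1]  # tail sums, mapped back
    return pi

p = np.array([0.5, 0.3, 0.15, 0.05])
print(prob_to_poss(p))   # -> [1.0, 0.5, 0.2, 0.05]
```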
Material Choice for spindle of machine tools
NASA Astrophysics Data System (ADS)
Gouasmi, S.; Merzoug, B.; Abba, G.; Kherredine, L.
2012-02-01
The requirements of contemporary industry and the rapid development of modern science impose restrictions on the majority of machine elements; the resulting financial constraints can be satisfied by a better output of the production equipment. Requirements concerning the design, strength, and correct operation of the product call for the development of increasingly precise parts, and therefore the use of increasingly capable machine tools [5]. The precision of machining and the output of machine tools are generally determined by the precision of rotation of the spindle: the higher this precision, the more the obtained dimensions fall within the tolerance zone and the more shape defects are minimized. During the development of the machine tool, the spindle, which by definition is a rotating shaft receiving and transmitting the rotational movement to the work piece or the cutting tool, must be designed according to certain optimal parameters to ensure the required precision. This study is devoted to the choice of a spindle material fulfilling the imposed requirements of precision.
Cheng, Lijun; Schneider, Bryan P
2016-01-01
Background: Cancer has been extensively characterized on the basis of genomics. Integrating genetic information about cancers with data on how the cancers respond to target-based therapy can help optimize cancer treatment. Objective: The increasing use of sequencing technology in cancer research and clinical practice has enormously advanced our understanding of cancer mechanisms, and cancer precision medicine is becoming a reality. Although off-label drug usage is a common practice in treating cancer, it suffers from the lack of a knowledge base for proper cancer drug selection. This eminent need has become even more apparent considering the upcoming genomics data. Methods: In this paper, a personalized-medicine knowledge base is constructed by integrating various cancer drugs, drug-target databases, and knowledge sources for proper selection of cancer drugs and their targets. Based on the knowledge base, a bioinformatics approach for cancer drug selection in precision medicine is developed. It integrates personal molecular profile data, including copy number variation, mutation, and gene expression. Results: By analyzing data from the 85 triple negative breast cancer (TNBC) patients in The Cancer Genome Atlas, we have shown that 71.7% of the TNBC patients have FDA-approved drug targets, and 51.7% of the patients have more than one drug target. Sixty-five drug targets are identified as TNBC treatment targets and 85 candidate drugs are recommended. Many existing TNBC candidate targets, such as Poly (ADP-Ribose) Polymerase 1 (PARP1), Cell division protein kinase 6 (CDK6), epidermal growth factor receptor, etc., were identified. On the other hand, we found some additional targets that are not yet fully investigated in TNBC, such as Gamma-Glutamyl Hydrolase (GGH), Thymidylate Synthetase (TYMS), Protein Tyrosine Kinase 6 (PTK6), Topoisomerase (DNA) I, Mitochondrial (TOP1MT), Smoothened, Frizzled Class Receptor (SMO), etc. Our additional analysis of target and drug selection strategy is also fully supported by the drug screening data on TNBC cell lines in the Cancer Cell Line Encyclopedia. Conclusions: The proposed bioinformatics approach lays a foundation for cancer precision medicine. It supplies the much-needed knowledge base for off-label cancer drug usage in clinics. PMID:27107440
NASA Astrophysics Data System (ADS)
McGibbney, L. J.; Jiang, Y.; Burgess, A. B.
2017-12-01
Big Earth observation data have been produced, archived and made available online, but discovering the right data in a manner that precisely and efficiently satisfies user needs presents a significant challenge to the Earth Science (ES) community. An emerging trend in the information retrieval community is to utilize knowledge graphs to assist users in quickly finding desired information across knowledge sources. This is particularly prevalent within the fields of social media and complex multimodal information processing, to name but a few; however, building a domain-specific knowledge graph is labour-intensive and hard to keep up-to-date. In this work, we update our progress on the Earth Science Knowledge Graph (ESKG) project, an ESIP-funded testbed project which provides an automatic approach to building a dynamic knowledge graph for ES to improve interdisciplinary data discovery by leveraging implicit, latent knowledge present across several U.S. Federal Agencies, e.g. NASA, NOAA and USGS. ESKG strengthens ties between observations and user communities by: 1) developing a knowledge graph derived from various sources, e.g. Web pages, Web Services, etc., via natural language processing and knowledge extraction techniques; 2) allowing users to traverse, explore, query, reason and navigate ES data via knowledge graph interaction. ESKG has the potential to revolutionize the way in which ES communities interact with ES data in the open world through the entity, spatial and temporal linkages and characteristics that make it up. This project enables the advancement of ESIP collaboration areas, including both Discovery and Semantic Technologies, by putting graph information right at our fingertips in an interactive, modern manner and reducing the effort of constructing ontologies. To demonstrate the ESKG concept, we will demonstrate the use of our framework across NASA JPL's PO.DAAC, NOAA's Earth Observation Requirements Evaluation System (EORES) and various USGS systems.
Salvador-Carulla, L; Lukersmith, S; Sullivan, W
2017-04-01
Guideline methods to develop recommendations dedicate most of their effort to organising discovery and corroboration knowledge following the evidence-based medicine (EBM) framework. Guidelines typically use a single dimension of information, and generally discard contextual evidence, formal expert knowledge, and consumers' experiences in the process. In recognition of the limitations of guidelines in complex cases, complex interventions, and systems research, there has been significant effort to develop new tools, guides, resources and structures to use alongside EBM methods of guideline development. In addition to these advances, a new framework based on the philosophy of science is required. Guidelines should be defined as implementation decision-support tools for improving the decision-making process in real-world practice, and not only as a procedure to optimise the knowledge base of scientific discovery and corroboration. A shift from the EBM pyramid of corroboration of evidence to a broader multi-domain perspective, graphically depicted as a 'Greek temple', could be considered. This model takes into account the different stages of scientific knowledge (discovery, corroboration and implementation); the sources of knowledge relevant to guideline development (experimental, observational, contextual, expert-based and experiential); their underlying inference mechanisms (deduction, induction, abduction, means-end inferences); and a more precise definition of evidence and related terms. The applicability of this broader approach is presented for the development of the Canadian Consensus Guidelines for the Primary Care of People with Developmental Disabilities.
NASA Technical Reports Server (NTRS)
Pang, Yong; Lefskky, Michael; Sun, Guoqing; Ranson, Jon
2011-01-01
A spaceborne lidar mission could serve multiple scientific purposes including remote sensing of ecosystem structure, carbon storage, terrestrial topography and ice sheet monitoring. The measurement requirements of these different goals will require compromises in sensor design. Footprint diameters that would be larger than optimal for vegetation studies have been proposed. Some spaceborne lidar mission designs include the possibility that a lidar sensor would share a platform with another sensor, which might require off-nadir pointing at angles of up to 16°. To resolve multiple mission goals and sensor requirements, detailed knowledge of the sensitivity of sensor performance to these aspects of mission design is required. This research used a radiative transfer model to investigate the sensitivity of forest height estimates to footprint diameter, off-nadir pointing and their interaction over a range of forest canopy properties. An individual-based forest model was used to simulate stands of mixed conifer forest in the Tahoe National Forest (Northern California, USA) and stands of deciduous forests in the Bartlett Experimental Forest (New Hampshire, USA). Waveforms were simulated for stands generated by a forest succession model using footprint diameters of 20 m to 70 m. Off-nadir angles of 0° to 16° were considered for a 25 m footprint diameter. Footprint diameters in the range of 25 m to 30 m were optimal for estimates of maximum forest height (R² of 0.95 and RMSE of 3 m). As expected, the contribution of vegetation height to the vertical extent of the waveform decreased with larger footprints, while the contribution of terrain slope increased. Precision of estimates decreased with an increasing off-nadir pointing angle, but off-nadir pointing had less impact on height estimates in deciduous forests than in coniferous forests. When pointing off-nadir, the decrease in precision was dependent on the local incidence angle (the angle between the off-nadir beam and a line normal to the terrain surface), which in turn depends on the off-nadir pointing angle, terrain slope, and the difference between the laser pointing azimuth and terrain aspect; the effect was larger when the sensor was aligned with the terrain azimuth, but when aspect and azimuth were opposed, there was virtually no effect on R² or RMSE. A second effect of off-nadir pointing is that the laser beam intersects individual crowns and the canopy as a whole from a different angle, which had a distinct effect on the precision of lidar estimates of height, decreasing R² and increasing RMSE, although the effect was most pronounced for coniferous crowns.
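For reference, the local incidence angle defined parenthetically above is commonly computed from the quantities the authors list via a spherical-trigonometry relation of the following standard form (symbols assumed, not the paper's notation: θ off-nadir pointing angle, s terrain slope, φ_b beam azimuth, φ_a terrain aspect):

```latex
% Local incidence angle i between an off-nadir beam and the terrain normal:
\cos i = \cos\theta \,\cos s + \sin\theta \,\sin s \,\cos(\phi_b - \phi_a)
```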
One-Reason Decision Making Unveiled: A Measurement Model of the Recognition Heuristic
ERIC Educational Resources Information Center
Hilbig, Benjamin E.; Erdfelder, Edgar; Pohl, Rudiger F.
2010-01-01
The fast-and-frugal recognition heuristic (RH) theory provides a precise process description of comparative judgments. It claims that, in suitable domains, judgments between pairs of objects are based on recognition alone, whereas further knowledge is ignored. However, due to the confound between recognition and further knowledge, previous…
Research on the tool holder mode in high speed machining
NASA Astrophysics Data System (ADS)
Zhenyu, Zhao; Yongquan, Zhou; Houming, Zhou; Xiaomei, Xu; Haibin, Xiao
2018-03-01
High speed machining technology can improve processing efficiency and precision while also reducing processing cost, and is therefore widely regarded in the industry. With the extensive application of high-speed machining technology, high-speed tool systems place ever higher requirements on the tool chuck. At present, several new kinds of chucks are used in high-speed precision machining, including heat-shrink tool-holders, high-precision spring chucks, hydraulic tool-holders, and three-rib deformation chucks. Among them, the heat-shrink tool-holder has the advantages of high precision, high clamping force, high bending rigidity, good dynamic balance, etc., and is widely used. It is therefore of great significance to research the new requirements on the machining tool system. In order to meet the requirements of high-speed precision machining technology, this paper reviews the common tool-holder technologies for high-precision machining, proposes how to correctly select a tool clamping system in practice, and analyzes the characteristics and existing problems of tool clamping systems.
Bit-Grooming: Shave Your Bits with Razor-sharp Precision
NASA Astrophysics Data System (ADS)
Zender, C. S.; Silver, J.
2017-12-01
Lossless compression can reduce climate data storage by 30-40%. Further reduction requires lossy compression that also reduces precision. Fortunately, geoscientific models and measurements generate false precision (scientifically meaningless data bits) that can be eliminated without sacrificing scientifically meaningful data. We introduce Bit Grooming, a lossy compression algorithm that removes the bloat due to false precision: those bits and bytes beyond the meaningful precision of the data. Bit Grooming is statistically unbiased, applies to all floating point numbers, and is easy to use. Bit Grooming reduces geoscience data storage requirements by 40-80%. We compared Bit Grooming to competitors Linear Packing, Layer Packing, and GRIB2/JPEG2000. The other compression methods have the edge in terms of compression, but Bit Grooming is the most accurate and certainly the most usable and portable. Bit Grooming provides flexible and well-balanced solutions to the trade-offs among compression, accuracy, and usability required by lossy compression. Geoscientists could reduce their long-term storage costs, and show leadership in the elimination of false precision, by adopting Bit Grooming.
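A minimal sketch of the core bit-manipulation idea, assuming float32 data and a common heuristic for how many mantissa bits correspond to a number of significant decimal digits; special values and several refinements of the published algorithm are ignored. Alternating "shave" (zero the trailing bits) and "set" (raise them to one) is what keeps the quantization statistically unbiased.

```python
import numpy as np

def bit_groom(data, nsd=3):
    """Simplified Bit Grooming for float32 arrays: quantize mantissa bits
    beyond ~nsd significant decimal digits, alternately shaving bits to zero
    and setting them to one so the mean quantization error stays ~unbiased."""
    keep = int(np.ceil(nsd * np.log2(10))) + 1   # explicit mantissa bits kept
    drop = 23 - keep                             # float32 has 23 explicit bits
    if drop <= 0:
        return data.astype(np.float32)
    ints = data.astype(np.float32).view(np.uint32)
    shave_mask = np.uint32((0xFFFFFFFF << drop) & 0xFFFFFFFF)
    set_mask = np.uint32((1 << drop) - 1)
    out = ints.copy()
    out[0::2] &= shave_mask          # shave: round mantissa toward zero
    out[1::2] |= set_mask            # set: round mantissa away from zero
    return out.view(np.float32)

x = np.linspace(0.1, 1.0, 8).astype(np.float32)
print(bit_groom(x, nsd=3))
```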
NASA Astrophysics Data System (ADS)
Morgenthaler, George; Khatib, Nader; Kim, Byoungsoo
Providing farmers with information to improve their crop's vigor has been a major topic of interest. With world population growing exponentially, arable land being consumed by urbanization, and an unfavorable farm economy, the efficiency of farming must increase to meet future food requirements and to make farming a sustainable occupation for the farmer. "Precision Agriculture" refers to a farming methodology that applies nutrients and moisture only where and when they are needed in the field. The goal is to increase farm revenue by increasing crop yield and decreasing applications of costly chemical and water treatments. In addition, this methodology will decrease the environmental costs of farming, i.e., reduce air, soil, and water pollution. Remote Sensing/Precision Agriculture has not grown as rapidly as early advocates envisioned. Technology for a successful Remote Sensing/Precision Agriculture system is now available. Commercial satellite systems can produce multi-spectral images of the Earth with a resolution of approximately 2.5 m. Variable-precision dispensing systems using GPS are available and affordable. Crop models that predict yield as a function of soil, chemical, and irrigation parameter levels have been formulated. Personal computers and internet access are in place in most farm homes and can provide a mechanism to periodically disseminate, e.g. bi-weekly, advice on what quantities of water and chemicals are needed in individual regions of the field. What is missing is a model that fuses the disparate sources of information on the current states of the crop and soil, and the remaining resource levels available, with the decisions farmers are required to make. This must be a product that is easy for the farmer to understand and to implement. A "Constrained Optimization Feedback Control Model" to fill this void will be presented. The objective function of the model will be used to maximize the farmer's profit by increasing yields while decreasing environmental costs and decreasing application of costly treatments. This model will incorporate information from remote sensing, in-situ weather sources, soil measurements, crop models, and tacit farmer knowledge of the relative productivity of the selected control regions of the farm to provide incremental advice throughout the growing season on water and chemical treatments. Genetic and meta-heuristic algorithms will be used to solve the constrained optimization problem, which possesses complex constraints and a non-linear objective function.
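To make the optimization setup concrete, here is a toy sketch in the spirit described: a meta-heuristic (SciPy's differential evolution) maximizing a non-linear profit objective under a budget constraint. The yield-response curve, prices, penalty, and budget are all invented stand-ins, not the paper's model.

```python
import numpy as np
from scipy.optimize import differential_evolution, NonlinearConstraint

# Toy stand-in for the feedback-control model: choose water w and fertilizer f
# for one control region to maximize profit = revenue(yield) - input costs
# - environmental penalty. All coefficients below are illustrative assumptions.
def neg_profit(x):
    w, f = x
    yield_t = 4.0 * (1 - np.exp(-1.2 * w)) * (1 - np.exp(-0.8 * f))  # saturating response
    revenue = 180.0 * yield_t                 # $/tonne
    cost = 55.0 * w + 70.0 * f                # water and chemical costs
    env_penalty = 25.0 * max(f - 1.5, 0.0)    # leaching penalty above a threshold
    return -(revenue - cost - env_penalty)    # minimize the negative profit

budget = NonlinearConstraint(lambda x: 55.0 * x[0] + 70.0 * x[1], 0.0, 150.0)
res = differential_evolution(neg_profit, bounds=[(0, 3), (0, 3)],
                             constraints=(budget,), seed=0)
print(res.x, -res.fun)   # optimal (w, f) and the resulting profit
```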
Biosensors for spatiotemporal detection of reactive oxygen species in cells and tissues.
Erard, Marie; Dupré-Crochet, Sophie; Nüße, Oliver
2018-05-01
Redox biology has become a major issue in numerous areas of physiology. Reactive oxygen species (ROS) have a broad range of roles from signal transduction to growth control and cell death. To understand the nature of these roles, accurate measurement of the reactive compounds is required. An increasing number of tools for ROS detection is available; however, the specificity and sensitivity of these tools are often insufficient. Furthermore, their specificity has been rarely evaluated in complex physiological conditions. Many ROS probes are sensitive to environmental conditions in particular pH, which may interfere with ROS detection and cause misleading results. Accurate detection of ROS in physiology and pathophysiology faces additional challenges concerning the precise localization of the ROS and the timing of their production and disappearance. Certain ROS are membrane permeable, and certain ROS probes move across cells and organelles. Targetable ROS probes such as fluorescent protein-based biosensors are required for accurate localization. Here we analyze these challenges in more detail, provide indications on the strength and weakness of current tools for ROS detection, and point out developments that will provide improved ROS detection methods in the future. There is no universal method that fits all situations in physiology and cell biology. A detailed knowledge of the ROS probes is required to choose the appropriate method for a given biological problem. The knowledge of the shortcomings of these probes should also guide the development of new sensors.
Precision manufacturing for clinical-quality regenerative medicines.
Williams, David J; Thomas, Robert J; Hourd, Paul C; Chandra, Amit; Ratcliffe, Elizabeth; Liu, Yang; Rayment, Erin A; Archer, J Richard
2012-08-28
Innovations in engineering applied to healthcare make a significant difference to people's lives. Market growth is guaranteed by demographics. Regulation and requirements for good manufacturing practice-extreme levels of repeatability and reliability-demand high-precision process and measurement solutions. Emerging technologies using living biological materials add complexity. This paper presents some results of work demonstrating the precision automated manufacture of living materials, particularly the expansion of populations of human stem cells for therapeutic use as regenerative medicines. The paper also describes quality engineering techniques for precision process design and improvement, and identifies the requirements for manufacturing technology and measurement systems evolution for such therapies.
Significance and integration of molecular diagnostics in the framework of veterinary practice.
Aranaz, Alicia
2015-01-01
The field of molecular diagnostics in veterinary practice is rapidly evolving. An array of molecular techniques of different complexity is available to facilitate the fast and specific diagnosis of animal diseases. The choice of the adequate technique depends on the mission and attributions of the laboratory and requires knowledge both of the underlying molecular biology and of its limitations. The ability to quickly detect pathogens and their characteristics allows precise decision-making and targeted measures such as prophylaxis, appropriate therapy, and biosafety plans to control disease outbreaks. In practice, taking advantage of the huge amount of data that can be obtained using molecular techniques highlights the need for collaboration between veterinarians in the laboratory and practitioners.
Calorimetric determination of the thermoneutral potential for Li/BrCl in SOCl2 (BCX) cells
NASA Technical Reports Server (NTRS)
Darcy, Eric C.; Kalu, Eric E.; White, Ralph E.
1991-01-01
Proliferation of lithium cells into large modular battery packs is projected for future space applications. Assuring battery design safety while maintaining high energy density requires accurate and precise knowledge of the thermal parameters of the battery cell. Specifically, the thermoneutral potential was determined using heat conduction calorimetry on Li/BrCl in SOCl2 (BCX) DD-cells and compared to measurements obtained on Li/SOCl2 D-cells. Over the range of 20 to 60 °C, the Li/BCX cells were found to have a thermoneutral potential significantly higher (near 4.0 volts) than that of the Li/SOCl2 cells tested. The higher heat generation measured during discharge reflects the higher electrochemical polarization observed with the BCX cells.
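The calorimetric extraction of the thermoneutral potential rests on the standard heat-balance relation below, with side reactions and entropic subtleties lumped into the measured heat rate q at discharge current I; this is the textbook form, assumed here rather than quoted from the paper:

```latex
% Heat-conduction calorimetry relation for the thermoneutral potential
% E_tn (= -\Delta H / nF), from measured heat rate q, cell voltage E_cell,
% and discharge current I:
q = I\,(E_{tn} - E_{cell})
\quad\Longrightarrow\quad
E_{tn} = E_{cell} + \frac{q}{I}
```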
Multiple-Frame Detection of Subpixel Targets in Thermal Image Sequences
NASA Technical Reports Server (NTRS)
Thompson, David R.; Kremens, Robert
2013-01-01
The new technology in this approach combines the subpixel detection information from multiple frames of a sequence to achieve a more sensitive detection result, using only the information found in the images themselves. It is taken as a constraint that the method is automated, robust, and computationally feasible for field networks with constrained computation and data rates. This precludes simply downloading a video stream for pixel-wise co-registration on the ground. It is also important that this method not require precise knowledge of sensor position or direction, because such information is often not available. It is also assumed that the scene in question is approximately planar, which is appropriate for a high-altitude airborne or orbital view.
iCLIP: protein-RNA interactions at nucleotide resolution.
Huppertz, Ina; Attig, Jan; D'Ambrogio, Andrea; Easton, Laura E; Sibley, Christopher R; Sugimoto, Yoichiro; Tajnik, Mojca; König, Julian; Ule, Jernej
2014-02-01
RNA-binding proteins (RBPs) are key players in the post-transcriptional regulation of gene expression. Precise knowledge about their binding sites is therefore critical to unravel their molecular function and to understand their role in development and disease. Individual-nucleotide resolution UV crosslinking and immunoprecipitation (iCLIP) identifies protein-RNA crosslink sites on a genome-wide scale. The high resolution and specificity of this method are achieved by an intramolecular cDNA circularization step that enables analysis of cDNAs that truncate at the protein-RNA crosslink sites. Here, we describe the improved iCLIP protocol and discuss critical optimization and control experiments that are required when applying the method to new RBPs. Copyright © 2013 The Authors. Published by Elsevier Inc. All rights reserved.
Standard model of knowledge representation
NASA Astrophysics Data System (ADS)
Yin, Wensheng
2016-09-01
Knowledge representation is the core of artificial intelligence research. Knowledge representation methods include predicate logic, semantic networks, computer programming languages, databases, mathematical models, graphics languages, natural language, etc. To establish the intrinsic link between the various knowledge representation methods, a unified knowledge representation model is necessary. Based on ontology, system theory, and control theory, a standard model of knowledge representation that reflects the change of the objective world is proposed. The model is composed of input, processing, and output. This knowledge representation method does not contradict traditional knowledge representation methods: it can express knowledge in multivariate and multidimensional terms, it can also express process knowledge, and at the same time it has a strong problem-solving ability. In addition, the standard model of knowledge representation provides a way to address problems of imprecise and inconsistent knowledge.
Cobalt: Development and Maturation of GN&C Technologies for Precision Landing
NASA Technical Reports Server (NTRS)
Carson, John M.; Restrepo, Carolina; Seubert, Carl; Amzajerdian, Farzin
2016-01-01
The CoOperative Blending of Autonomous Landing Technologies (COBALT) instrument is a terrestrial test platform for development and maturation of guidance, navigation and control (GN&C) technologies for precision landing. The project is developing a third-generation Langley Research Center (LaRC) navigation doppler lidar (NDL) for ultra-precise velocity and range measurements, which will be integrated and tested with the Jet Propulsion Laboratory (JPL) lander vision system (LVS) for terrain relative navigation (TRN) position estimates. These technologies together provide precise navigation knowledge that is critical for a controlled and precise touchdown. The COBALT hardware will be integrated in 2017 into the GN&C subsystem of the Xodiac rocket-propulsive vertical test bed (VTB) developed by Masten Space Systems, and two terrestrial flight campaigns will be conducted: one open-loop (i.e., passive) and one closed-loop (i.e., active).
Image subsampling and point scoring approaches for large-scale marine benthic monitoring programs
NASA Astrophysics Data System (ADS)
Perkins, Nicholas R.; Foster, Scott D.; Hill, Nicole A.; Barrett, Neville S.
2016-07-01
Benthic imagery is an effective tool for quantitative description of ecologically and economically important benthic habitats and biota. The recent development of autonomous underwater vehicles (AUVs) allows surveying of spatial scales that were previously unfeasible. However, an AUV collects a large number of images, the scoring of which is time and labour intensive. There is a need to optimise the way that subsamples of imagery are chosen and scored to gain meaningful inferences for ecological monitoring studies. We examine the trade-off between the number of images selected within transects and the number of random points scored within images on the percent cover of target biota, the typical output of such monitoring programs. We also investigate the efficacy of various image selection approaches, such as systematic or random, on the bias and precision of cover estimates. We use simulated biotas that have varying size, abundance and distributional patterns. We find that a relatively small sampling effort is required to minimise bias. An increased precision for groups that are likely to be the focus of monitoring programs is best gained through increasing the number of images sampled rather than the number of points scored within images. For rare species, sampling using point count approaches is unlikely to provide sufficient precision, and alternative sampling approaches may need to be employed. The approach by which images are selected (simple random sampling, regularly spaced etc.) had no discernible effect on mean and variance estimates, regardless of the distributional pattern of biota. Field validation of our findings is provided through Monte Carlo resampling analysis of a previously scored benthic survey from temperate waters. We show that point count sampling approaches are capable of providing relatively precise cover estimates for candidate groups that are not overly rare. The amount of sampling required, in terms of both the number of images and number of points, varies with the abundance, size and distributional pattern of target biota. Therefore, we advocate either the incorporation of prior knowledge or the use of baseline surveys to establish key properties of intended target biota in the initial stages of monitoring programs.
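The image-versus-point trade-off can be illustrated with a few lines of simulation: for a fixed total of 1000 scored points, spreading effort across more images beats scoring more points per image whenever between-image variability dominates. Everything below (cover level, patchiness, transect size) is synthetic, not the paper's data.

```python
import numpy as np

rng = np.random.default_rng(2)

def survey_cover(cover_map, n_images, n_points):
    """Simulate scoring a transect: pick n_images frames at random, score
    n_points random points per frame, return estimated percent cover."""
    frames = rng.choice(len(cover_map), size=n_images, replace=False)
    hits = [rng.random(n_points) < cover_map[f] for f in frames]
    return np.mean(hits) * 100

# Patchy biota: true per-image cover varies around 10%
cover_map = np.clip(rng.normal(0.10, 0.08, size=500), 0, 1)

for n_img, n_pts in [(20, 50), (100, 10)]:   # same 1000 points, split differently
    est = [survey_cover(cover_map, n_img, n_pts) for _ in range(2000)]
    print(n_img, "images x", n_pts, "points: SD =", round(float(np.std(est)), 2))
```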
The Daniel K. Inouye College of Pharmacy Scripts
Ciarleglio, Anita E; Ma, Carolyn
2017-01-01
The precision medicine initiative brought forth by President Barack Obama in 2015 is an important step on the journey to truly personalized medicine. A broad knowledge and understanding of the implications of the pharmacogenomic literature will be critical to the achievement of this goal. While a great amount of data has been published in the areas of pharmacogenomics and pharmacogenetics, there are still relatively few instances in which the need for clinical intervention can be stated without doubt, and which are widely accepted and practiced by the medical community. As our knowledge base rapidly expands, issues such as insurance reimbursement for genetic testing and education of the health care workforce will be paramount to achieving the goal of precision medicine for all patients. PMID:28900583
Benchmarking of Neutron Flux Parameters at the USGS TRIGA Reactor in Lakewood, Colorado
NASA Astrophysics Data System (ADS)
Alzaabi, Osama E.
The USGS TRIGA Reactor (GSTR) located at the Denver Federal Center in Lakewood, Colorado provides opportunities for Colorado School of Mines students to do experimental research in the field of neutron activation analysis. The scope of this thesis is to obtain precise knowledge of the neutron flux parameters at the GSTR. The Colorado School of Mines Nuclear Physics group intends to develop several research projects at the GSTR, which require precise knowledge of neutron fluxes and energy distributions in several irradiation locations. The fuel burn-up of the new GSTR fuel configuration and the thermal neutron flux of the core were recalculated, since the GSTR core configuration had been changed with the addition of two new fuel elements. An MCNP software package was used to incorporate the burn-up of reactor fuel and to determine the neutron flux at different irradiation locations and at the flux monitoring bores. These simulation results were compared with neutron activation analysis results using activated diluted gold wires. A well-calibrated and stable germanium detector setup as well as fourteen samplers were designed and built to achieve accuracy in the measurement of the neutron flux. Furthermore, the flux monitoring bores of the GSTR core were used for the first time to measure the neutron flux experimentally and to compare it to the MCNP simulation. In addition, International Atomic Energy Agency (IAEA) standard materials were used along with USGS national standard materials in a previously well-calibrated irradiation location to benchmark the simulation, the germanium detector calibration, and sample measurements against international standards.
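Gold-wire flux monitoring of this kind rests on the standard activation relation; written with assumed symbols (N number of 197Au target atoms, σ effective capture cross section, λ decay constant of 198Au, t_i irradiation time, t_d decay time before counting), the flux follows from the measured activity A as:

```latex
% Standard activation relation for a gold-wire flux monitor:
A = \varphi\,\sigma\,N\,\bigl(1 - e^{-\lambda t_i}\bigr)\,e^{-\lambda t_d}
\quad\Longrightarrow\quad
\varphi = \frac{A}{\sigma\,N\,\bigl(1 - e^{-\lambda t_i}\bigr)\,e^{-\lambda t_d}}
```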
Ehrnsperger, H; Wagner, K; Mehnert, H
1975-10-01
In an unselected group of 100 employees of a large company, knowledge regarding body weight, nutrition, and nicotine abuse was studied, and eating habits as well as the related principles of upbringing were ascertained. Nearly all respondents knew their body weight. Half of them were able to indicate their ideal weight with a precision of +/- 10%. The proportion of considerably overweight persons in the group increased with higher age and female sex. Among the respondents with secondary education, more normal-weight persons were found. The test persons were well informed about the disadvantages of overweight and, in two thirds of cases, checked their body weight at least once a week. They stated that they had encountered the term "diet" above all in the mass media. About half of them were able to define the term "calorie" approximately precisely. The respondents, however, often underestimated the required caloric quantity, apparently without drawing conclusions. The caloric content of alcoholic drinks was underestimated; beer ranked first among the consumed alcoholic drinks. Only one fifth of the test persons "reward" their children with sweets and insist that they eat up their dinner. About half of the respondents were smokers; one fourth had never smoked, while another fourth had given up smoking. Mostly cigarettes were consumed, although nearly all test persons regarded inhalation smoking as harmful. The mere knowledge of factors that have a detrimental effect on health (e.g. overweight, smoking) seems insufficient to lead a healthy life. Since the test persons are, however, willing to do something for their health, it would be useful to gather adipose persons and smokers into groups to be treated by behaviour therapy and positively motivated.
Analysis of flood modeling through innovative geomatic methods
NASA Astrophysics Data System (ADS)
Zazo, Santiago; Molina, José-Luis; Rodríguez-Gonzálvez, Pablo
2015-05-01
A suitable assessment and management of the exposure level to natural flood risks necessarily requires an exhaustive knowledge of the terrain. This study, aimed primarily at evaluating flood risk, first assesses the suitability of an innovative technique, called Reduced Cost Aerial Precision Photogrammetry (RC-APP), based on an ultra-light motorized aircraft (ULM, Ultra-Light Motor) together with the hybridization of reduced-cost sensors, for the acquisition of geospatial information. RC-APP is found to be a more accurate and precise, more economical, and less time-consuming geomatic technique, and is applied here in river engineering for geometric modeling and flood risk assessment. Through the application of RC-APP, a high spatial resolution image (orthophoto of 2.5 cm) and a Digital Elevation Model (DEM) of 0.10 m mesh size and high point density (about 100 points/m2), with altimetric accuracy of -0.02 ± 0.03 m, have been obtained. These products have provided a detailed knowledge of the terrain, afterwards used for the hydraulic simulation, which has allowed a better definition of the inundated area, with important implications for flood risk assessment and management. In this sense, it should be noted that the achieved spatial resolution of the DEM is 0.10 m, which is especially interesting and useful in hydraulic simulations with 2D software. According to the results, the developed methodology and technology allow for a more accurate riverbed representation compared with other traditional techniques such as Light Detection and Ranging (LiDAR), with a Root-Mean-Square Error (RMSE) of ± 0.50 m; this comparison has revealed that RC-APP has an error one order of magnitude lower than the LiDAR method. Consequently, this technique arises as an efficient and appropriate tool, especially in areas with high exposure to flood risk. In hydraulic terms, the degree of detail achieved in the 3D model has allowed a significant increase in the knowledge of hydraulic variables in natural waterways.
NASA Technical Reports Server (NTRS)
Lindensmith, Chris A.; Briggs, H. Clark; Beregovski, Yuri; Feria, V. Alfonso; Goullioud, Renaud; Gursel, Yekta; Hahn, Inseob; Kinsella, Gary; Orzewalla, Matthew; Phillips, Charles
2006-01-01
SIM PlanetQuest (SIM) is a large optical interferometer for making microarcsecond measurements of the positions of stars and for detecting Earth-sized planets around nearby stars. To achieve this precision, SIM requires stability of optical components to tens of picometers per hour. The combination of SIM's large size (9 meter baseline) and the high stability requirement makes it difficult and costly to measure all aspects of system performance on the ground. To reduce risks and costs, and to allow for a design with fewer intermediate testing stages, the SIM project is developing an integrated thermal, mechanical and optical modeling process that will allow predictions of the system performance to be made at the required high precision. This modeling process uses commercial, off-the-shelf tools and has been validated against experimental results at the precision of the SIM performance requirements. This paper presents a description of the model development, some of the models, and their validation in the Thermo-Opto-Mechanical (TOM3) testbed, which includes full-scale brassboard optical components and the metrology to test them at the SIM performance requirement levels.
High-precision arithmetic in mathematical physics
Bailey, David H.; Borwein, Jonathan M.
2015-05-12
For many scientific calculations, particularly those involving empirical data, IEEE 32-bit floating-point arithmetic produces results of sufficient accuracy, while for other applications IEEE 64-bit floating-point is more appropriate. But for some very demanding applications, even higher levels of precision are often required. This article discusses the challenge of high-precision computation in the context of mathematical physics, and highlights what facilities are required to support future computation, in light of emerging developments in computer architecture.
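As a concrete taste of such facilities, the sketch below uses the mpmath Python library (one arbitrary-precision option among several; the article itself may discuss others) on a classic example where 64-bit arithmetic is inadequate:

```python
# Arbitrary-precision arithmetic with mpmath; the computation is illustrative.
from mpmath import mp, mpf, exp, pi, sqrt

mp.dps = 50                       # 50 decimal digits of working precision
# Ramanujan's near-integer: exp(pi*sqrt(163)) misses an integer by ~7.5e-13,
# a gap that 64-bit doubles cannot resolve at this magnitude (~2.6e17).
x = exp(pi * sqrt(mpf(163)))
print(x)                          # ...999999999999.25... at full precision
print(float(x) % 1.0)             # the 64-bit float loses the fractional part
```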
Poutiainen, Pekka; Jaronen, Merja; Quintana, Francisco J.; Brownell, Anna-Liisa
2016-01-01
Non-invasive molecular imaging techniques can enhance diagnosis to achieve successful treatment, as well as reveal underlying pathogenic mechanisms in disorders such as multiple sclerosis (MS). The combination of advanced multimodal imaging techniques and increased knowledge of the MS disease mechanism allows monitoring of both the neuronal network and the therapeutic outcome, and provides tools with which to discover novel therapeutic targets. Diverse imaging modalities provide reliable diagnostic and prognostic platforms to better achieve precision medicine. Traditionally, magnetic resonance imaging (MRI) has been considered the gold standard in MS research and diagnosis. However, positron emission tomography (PET) imaging can provide functional information on molecular biology in detail even prior to anatomic changes, allowing close follow-up of disease progression and treatment response. Recent findings support three major neuroinflammation components in MS: astrogliosis, cytokine elevation, and significant changes in specific proteins, which offer a great variety of specific targets for imaging purposes. Although imaging of astrocyte function is still a young field in need of suitable imaging ligands, recent studies have shown that inflammation and astrocyte activation are related to progression of MS. MS is a complex disease which requires understanding of disease mechanisms for successful treatment. PET is a precise non-invasive imaging method for biochemical functions and has the potential to enhance early and accurate diagnosis for precision therapy of MS. In this review we focus on the modulation of different receptor systems and the inflammatory aspect of MS, especially the activation of glial cells, summarize recent findings of PET imaging in MS, and present the most potent targets for new biomarkers, with the main focus on experimental MS research. PMID:27695400
3D Modelling and Printing Technology to Produce Patient-Specific 3D Models.
Birbara, Nicolette S; Otton, James M; Pather, Nalini
2017-11-10
A comprehensive knowledge of mitral valve (MV) anatomy is crucial in the assessment of MV disease. While the use of three-dimensional (3D) modelling and printing in MV assessment has undergone early clinical evaluation, the precision and usefulness of this technology requires further investigation. This study aimed to assess and validate 3D modelling and printing technology to produce patient-specific 3D MV models. A prototype method for MV 3D modelling and printing was developed from computed tomography (CT) scans of a plastinated human heart. Mitral valve models were printed using four 3D printing methods and validated to assess precision. Cardiac CT and 3D echocardiography imaging data of four MV disease patients was used to produce patient-specific 3D printed models, and 40 cardiac health professionals (CHPs) were surveyed on the perceived value and potential uses of 3D models in a clinical setting. The prototype method demonstrated submillimetre precision for all four 3D printing methods used, and statistical analysis showed a significant difference (p<0.05) in precision between these methods. Patient-specific 3D printed models, particularly using multiple print materials, were considered useful by CHPs for preoperative planning, as well as other applications such as teaching and training. This study suggests that, with further advances in 3D modelling and printing technology, patient-specific 3D MV models could serve as a useful clinical tool. The findings also highlight the potential of this technology to be applied in a variety of medical areas within both clinical and educational settings. Copyright © 2017 Australian and New Zealand Society of Cardiac and Thoracic Surgeons (ANZSCTS) and the Cardiac Society of Australia and New Zealand (CSANZ). Published by Elsevier B.V. All rights reserved.
Safa, Alireza; Abdolmalaki, Reza Yazdanpanah; Shafiee, Saeed; Sadeghi, Behzad
2018-06-01
In the field of nanotechnology, there is a growing demand for precision control and manipulation of devices able to interact with complex and unstructured environments at the micro/nano-scale. As a result, ultrahigh-precision positioning stages have become a key requirement of nanotechnology. In this paper, linear piezoelectric ceramic motors (LPCMs) are adopted to drive micro/nanopositioning stages, since they achieve high precision and are versatile enough to be implemented over a wide range of applications. In establishing a control scheme for such manipulation systems, the presence of friction, parameter uncertainties, and external disturbances prevents the systems from providing the desired positioning accuracy. The work in this paper focuses on the development of a control framework that addresses these issues, using the nonsingular terminal sliding mode technique for the precise position tracking problem of an LPCM-driven positioning stage with friction, uncertain parameters, and external disturbances. The developed control algorithm exhibits the following two attractive features. First, upper bounds of system uncertainties/perturbations are adaptively estimated in the proposed controller; thus, prior knowledge about uncertainty/disturbance bounds is not necessary. Second, the discontinuous signum function is transferred to the time derivative of the control input and the continuous control signal is obtained after integration; consequently, the chattering phenomenon, which presents a major handicap to the implementation of conventional sliding mode control in real applications, is alleviated without deteriorating the robustness of the system. The stability of the controlled system is analyzed, and the convergence of the position tracking error to zero is analytically proven. The proposed control strategy is experimentally validated and compared to existing control approaches. Copyright © 2018 ISA. Published by Elsevier Ltd. All rights reserved.
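The abstract does not give the control law, so the block below only sketches the generic nonsingular terminal sliding surface and the integrated switching term that this family of controllers uses to avoid chattering; the symbols and structure are assumed from the standard literature, not taken from the paper:

```latex
% Generic nonsingular terminal sliding surface (e = position error,
% beta > 0, p and q odd integers with 1 < p/q < 2):
s = e + \frac{1}{\beta}\,\dot{e}^{\,p/q}
% Chattering alleviation: the switching (signum) term acts on the time
% derivative of the control, so the applied control u is continuous
% (u_{eq} is the equivalent control, K an adaptively estimated gain):
\dot{u}_{sw} = -K\,\mathrm{sgn}(s), \qquad u = u_{eq} + \int_0^t \dot{u}_{sw}\,d\tau
```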
The emerging science of precision medicine and pharmacogenomics for Parkinson's disease.
Payami, Haydeh
2017-08-01
Current therapies for Parkinson's disease are problematic because they are symptomatic and have adverse effects. New drugs have failed in clinical trials because of inadequate efficacy. At the core of the problem is trying to make one drug work for all Parkinson's disease patients, when we know this premise is wrong because (1) Parkinson's disease is not a single disease, and (2) no two individuals have the same biological makeup. Precision medicine is the goal to strive for, but we are only at the beginning stages of building the infrastructure for one of the most complex projects in the history of science, and it will be a long time before Parkinson's disease reaps the benefits. Pharmacogenomics, a cornerstone of precision medicine, has already proven successful for many conditions and could also propel drug discovery and improve treatment for Parkinson's disease. To make progress in the pharmacogenomics of Parkinson's disease, we need to change course from small inconclusive candidate gene studies to large-scale, rigorously planned genome-wide studies that capture the nuclear genome and the microbiome. Pharmacogenomic studies must use homogeneous subtypes of Parkinson's disease or apply the brute force of statistical power to overcome heterogeneity, which will require large sample sizes achievable only via internet-based methods and electronic databases. Large-scale pharmacogenomic studies, together with biomarker discovery efforts, will yield the knowledge necessary to design clinical trials with precision to alleviate confounding by disease heterogeneity and interindividual variability in drug response, two of the major impediments to successful drug discovery and effective treatment. © 2017 International Parkinson and Movement Disorder Society.
Alwan, Wisam; Nestle, Frank O
2015-01-01
Psoriasis is a common, chronic inflammatory skin disease associated with multi-system manifestations including arthritis and obesity. Our knowledge of the aetiology of the condition, including the key genomic, immune and environmental factors, has led to the development of targeted, precision therapies that alleviate patient morbidity. This article reviews the key pathophysiological pathways and therapeutic targets and highlights future areas of interest in psoriasis research.
Technical Note: The determination of enclosed water volume in large flexible-wall mesocosms "KOSMOS"
NASA Astrophysics Data System (ADS)
Czerny, J.; Schulz, K. G.; Krug, S. A.; Ludwig, A.; Riebesell, U.
2013-03-01
The volume of water enclosed inside flexible-wall mesocosm bags is hard to estimate using geometrical calculations and can be strongly variable among bags of the same dimensions. Here we present a method for precise water volume determination in mesocosms using salinity as a tracer. Knowledge of the precise volume of water enclosed allows establishment of exactly planned treatment concentrations and calculation of elemental budgets.
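For a freshwater (near-zero salinity) addition, the underlying salt mass balance takes the simple form below; S_1 and S_2 denote the (well-mixed) salinity before and after the addition, and a brine-addition variant follows analogously. This form is assumed from the method description, not quoted from the note:

```latex
% Salt mass balance: adding a known volume V_add of fresh water to the
% enclosed volume V dilutes salinity from S_1 to S_2:
V\,S_1 = (V + V_{add})\,S_2
\quad\Longrightarrow\quad
V = V_{add}\,\frac{S_2}{S_1 - S_2}
```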
Muraro, Antonella; Lemanske, Robert F; Hellings, Peter W; Akdis, Cezmi A; Bieber, Thomas; Casale, Thomas B; Jutel, Marek; Ong, Peck Y; Poulsen, Lars K; Schmid-Grendelmeier, Peter; Simon, Hans-Uwe; Seys, Sven F; Agache, Ioana
2016-05-01
In this consensus document we summarize the current knowledge on major asthma, rhinitis, and atopic dermatitis endotypes under the auspices of the PRACTALL collaboration platform. PRACTALL is an initiative of the European Academy of Allergy and Clinical Immunology and the American Academy of Allergy, Asthma & Immunology aiming to harmonize the European and American approaches to best allergy practice and science. Precision medicine is of broad relevance for the management of asthma, rhinitis, and atopic dermatitis in the context of a better selection of treatment responders, risk prediction, and design of disease-modifying strategies. Progress has been made in profiling the type 2 immune response-driven asthma. The endotype driven approach for non-type 2 immune response asthma, rhinitis, and atopic dermatitis is lagging behind. Validation and qualification of biomarkers are needed to facilitate their translation into pathway-specific diagnostic tests. Wide consensus between academia, governmental regulators, and industry for further development and application of precision medicine in management of allergic diseases is of utmost importance. Improved knowledge of disease pathogenesis together with defining validated and qualified biomarkers are key approaches to precision medicine. Copyright © 2016 The Authors. Published by Elsevier Inc. All rights reserved.
Lantada, Andrés Díaz; Hengsbach, Stefan; Bade, Klaus
2017-10-16
In this study we present the combination of a math-based design strategy with direct laser writing as a high-precision technology for promoting solid free-form fabrication of multi-scale biomimetic surfaces. Results show a remarkable control of surface topography and wettability properties. Different examples of surfaces inspired by the lotus leaf, which to our knowledge are obtained for the first time following a computer-aided design with this degree of precision, are presented. Design and manufacturing strategies towards microfluidic systems whose fluid-driving capabilities are obtained simply by promoting a design-controlled wettability of their surfaces are also discussed and illustrated by means of conceptual proofs. In our experience, the synergies between the presented computer-aided design strategy and the capabilities of direct laser writing, supported by innovative writing strategies to increase final part size while maintaining high precision, constitute a relevant step forward towards materials and devices with design-controlled, multi-scale, micro-structured surfaces for advanced functionalities. To our knowledge, the surface geometry of the lotus leaf, which has relevant industrial applications thanks to its hydrophobic and self-cleaning behavior, has not previously been adequately modeled and manufactured in an additive way with the degree of precision that we present here.
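As a purely hypothetical illustration of a math-based two-scale design, the sketch below builds a lotus-like height map as a superposition of coarse papilla-scale bumps and fine sub-micrometre roughness; all dimensions and functional forms are invented, not the authors' models.

```python
import numpy as np

# Hypothetical two-scale, math-based lotus-like surface: an array of smooth
# bumps (papillae) plus a fine sinusoidal roughness. Units are micrometres;
# all amplitudes and periods are illustrative assumptions.
x, y = np.meshgrid(np.linspace(0, 100, 512), np.linspace(0, 100, 512))

papillae = 8.0 * (np.sin(2 * np.pi * x / 20) * np.sin(2 * np.pi * y / 20)) ** 8
roughness = 0.4 * np.sin(2 * np.pi * x / 2.5) * np.sin(2 * np.pi * y / 2.5)
z = papillae + roughness        # height map, exportable to a laser-writing path

print(z.shape, round(float(z.max()), 2))
```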
Precision and Fast Wavelength Tuning of a Dynamically Phase-Locked Widely-Tunable Laser
NASA Technical Reports Server (NTRS)
Numata, Kenji; Chen, Jeffrey R.; Wu, Stewart T.
2012-01-01
We report a precision and fast wavelength tuning technique demonstrated for a digital-supermode distributed Bragg reflector laser. The laser was dynamically offset-locked to a frequency-stabilized master laser using an optical phase-locked loop, enabling precision fast tuning to and from any frequencies within a 40-GHz tuning range. The offset frequency noise was suppressed to the statically offset-locked level in less than 40 μs upon each frequency switch, allowing the laser to retain the absolute frequency stability of the master laser. This technique satisfies stringent requirements for gas sensing lidars and enables other applications that require such well-controlled precision fast tuning.
Computational Modeling for Language Acquisition: A Tutorial With Syntactic Islands.
Pearl, Lisa S; Sprouse, Jon
2015-06-01
Given the growing prominence of computational modeling in the acquisition research community, we present a tutorial on how to use computational modeling to investigate learning strategies that underlie the acquisition process. This is useful for understanding both typical and atypical linguistic development. We provide a general overview of why modeling can be a particularly informative tool and some general considerations when creating a computational acquisition model. We then review a concrete example of a computational acquisition model for complex structural knowledge referred to as syntactic islands. This includes an overview of syntactic islands knowledge, a precise definition of the acquisition task being modeled, the modeling results, and how to meaningfully interpret those results in a way that is relevant for questions about knowledge representation and the learning process. Computational modeling is a powerful tool that can be used to understand linguistic development. The general approach presented here can be used to investigate any acquisition task and any learning strategy, provided both are precisely defined.
The landscape of precision cancer medicine clinical trials in the United States.
Roper, Nitin; Stensland, Kristian D; Hendricks, Ryan; Galsky, Matthew D
2015-05-01
Advances in tumor biology and multiplex genomic analysis have ushered in the era of precision cancer medicine. Little is currently known, however, about the landscape of prospective "precision cancer medicine" clinical trials in the U.S. We identified all adult interventional cancer trials registered on ClinicalTrials.gov between September 2005 and May 2013. Trials were classified as "precision cancer medicine" if a genomic alteration in a predefined set of 88 genes was required for enrollment. Baseline characteristics were ascertained for each trial. Of the initial 18,797 trials identified, 9094 (48%) were eligible for inclusion: 684 (8%) were classified as precision cancer medicine trials and 8410 (92%) were non-precision cancer medicine trials. Compared with non-precision cancer medicine trials, precision cancer medicine trials were significantly more likely to be phase II [RR 1.19 (1.10-1.29), p<0.001], multi-center [RR 1.18 (1.11-1.26), p<0.001], open-label [RR 1.04 (1.02-1.07), p=0.005] and involve breast [RR 4.03 (3.49-4.52), p<0.001], colorectal [RR 1.62 (1.22-2.14), p=0.002] and skin [RR 1.98 (1.55-2.54), p<0.001] cancers. Precision medicine trials required 38 unique genomic alterations for enrollment. The proportion of precision cancer medicine trials compared to the total number of trials increased from 3% in 2006 to 16% in 2013. The proportion of adult cancer clinical trials in the U.S. requiring a genomic alteration for enrollment has increased substantially over the past several years. However, such trials still represent a small minority of studies performed within the cancer clinical trials enterprise and include a small subset of putatively "actionable" alterations. Copyright © 2015 Elsevier Ltd. All rights reserved.
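The bracketed relative risks are ratios of proportions between the two groups of trials; a sketch of the computation using hypothetical counts (the study's raw counts are not reproduced here):

```python
def relative_risk(exposed_events, exposed_total, control_events, control_total):
    """Risk ratio of an attribute between two groups of trials."""
    return (exposed_events / exposed_total) / (control_events / control_total)

# Hypothetical counts: phase II trials among precision vs. non-precision trials
print(round(relative_risk(400, 684, 4140, 8410), 2))  # ~1.19
```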
Common knowledge: Now you have it, now you don't?
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fagin, R.; Halpern, J.Y.; Moses, Y.
The notion of common knowledge, where everyone knows, everyone knows that everyone knows, etc., has proven to be fundamental in various disciplines, including Philosophy, Artificial Intelligence, Economics, and Psychology. This key notion was first studied by the philosopher David Lewis in the context of conventions. Lewis pointed out that in order for something to be a convention, it must in fact be common knowledge among the members of a group. (For example, the convention that green means "go" and red means "stop" is presumably common knowledge among the drivers in our society.) Common knowledge also arises in discourse understanding. Suppose Ann asks Bob "What did you think of the movie?" referring to a showing of Monkey Business they have just seen. Not only must Ann and Bob both know that "the movie" refers to Monkey Business, but Ann must know that Bob knows, Bob must know that Ann knows that Bob knows, and so on. In fact, by a closer analysis of this situation, it can be shown that there must be common knowledge of what movie is meant in order for Bob to answer the question appropriately. Finally, common knowledge also turns out to be a prerequisite for agreement and coordinated action in distributed systems. This is precisely what makes it such a crucial notion in the analysis of interacting groups of agents. On the other hand, in practical settings common knowledge is impossible to achieve. This puts us in a somewhat paradoxical situation, in that we claim both that common knowledge is a prerequisite for agreement and coordinated action and that it cannot be attained. We discuss two answers to this paradox: modeling the world with a coarser granularity, and relaxing the requirements for coordination.
The development of composite materials for spacecraft precision reflector panels
NASA Technical Reports Server (NTRS)
Tompkins, Stephen S.; Bowles, David E.; Funk, Joan G.; Towell, Timothy W.; Lavoie, J. A.
1990-01-01
One of the critical technology needs for the large precision reflectors required for future astrophysics and optical communications is in the area of structural materials. Therefore, a major focus of the Precision Segmented Reflector Program at NASA is to develop lightweight composite reflector panels from durable, environmentally stable materials for space which maintain both the surface figure and the surface accuracy necessary for space telescope applications. Results from the materials research and development program at NASA Langley Research Center are discussed. Advanced materials that meet the reflector panel requirements are identified. Thermal, mechanical, and durability properties of candidate materials after exposure to simulated space environments are compared to the baseline material.
DyKOSMap: A framework for mapping adaptation between biomedical knowledge organization systems.
Dos Reis, Julio Cesar; Pruski, Cédric; Da Silveira, Marcos; Reynaud-Delaître, Chantal
2015-06-01
Knowledge Organization Systems (KOS) and their associated mappings play a central role in several decision support systems. However, by virtue of knowledge evolution, KOS entities are modified over time, impacting mappings and potentially invalidating them. This requires semi-automatic methods to keep such semantic correspondences up to date as the KOS evolves. We define a complete and original framework based on formal heuristics that drives the adaptation of KOS mappings. Our approach takes into account the definition of established mappings, the evolution of the KOS, and the possible changes that can be applied to mappings. This study experimentally evaluates the proposed heuristics and the entire framework on realistic case studies borrowed from the biomedical domain, using official mappings between several biomedical KOSs. We demonstrate the overall performance of the approach over biomedical datasets of different characteristics and sizes. Our findings reveal the effectiveness, in terms of precision, recall, and F-measure, of the suggested heuristics and of the methods defining the framework to adapt mappings affected by KOS evolution. The obtained results contribute to and improve the quality of mappings over time. The proposed framework can adapt mappings largely automatically, thus facilitating the maintenance task. The implemented algorithms and tools support and minimize the work of users in charge of KOS mapping maintenance. Copyright © 2015 Elsevier Inc. All rights reserved.
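For reference, the reported precision, recall, and F-measure compare adapted mappings against a reference set of valid ones; a minimal sketch with hypothetical mapping pairs:

```python
def precision_recall_f1(suggested, reference):
    """Evaluate adapted mappings against a reference set of valid mappings."""
    tp = len(suggested & reference)
    precision = tp / len(suggested)
    recall = tp / len(reference)
    f1 = 2 * precision * recall / (precision + recall)
    return precision, recall, f1

# Hypothetical mapping sets; each mapping is a (source concept, target concept) pair
suggested = {("C001", "D010"), ("C002", "D020"), ("C003", "D031")}
reference = {("C001", "D010"), ("C002", "D020"), ("C004", "D040")}
print(precision_recall_f1(suggested, reference))  # (0.667, 0.667, 0.667)
```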
Using hyperspectral data in precision farming applications
USDA-ARS?s Scientific Manuscript database
Precision farming practices such as variable rate applications of fertilizer and agricultural chemicals require accurate field variability mapping. This chapter investigated the value of hyperspectral remote sensing in providing useful information for five applications of precision farming: (a) Soil...
UAS remote sensing for precision agriculture: An independent assessment
USDA-ARS?s Scientific Manuscript database
Small Unmanned Aircraft Systems (sUAS) are recognized as potentially important remote-sensing platforms for precision agriculture. However, research is required to determine which sensors and data processing methods are required to use sUAS in an efficient and cost-effective manner. Oregon State U...
Mazzocco, Michèle M M; Feigenson, Lisa; Halberda, Justin
2011-01-01
The Approximate Number System (ANS) is a primitive mental system of nonverbal representations that supports an intuitive sense of number in human adults, children, infants, and other animal species. The numerical approximations produced by the ANS are characteristically imprecise and, in humans, this precision gradually improves from infancy to adulthood. Throughout development, wide ranging individual differences in ANS precision are evident within age groups. These individual differences have been linked to formal mathematics outcomes, based on concurrent, retrospective, or short-term longitudinal correlations observed during the school age years. However, it remains unknown whether this approximate number sense actually serves as a foundation for these school mathematics abilities. Here we show that ANS precision measured at preschool, prior to formal instruction in mathematics, selectively predicts performance on school mathematics at 6 years of age. In contrast, ANS precision does not predict non-numerical cognitive abilities. To our knowledge, these results provide the first evidence for early ANS precision, measured before the onset of formal education, predicting later mathematical abilities.
Double-trap measurement of the proton magnetic moment at 0.3 parts per billion precision.
Schneider, Georg; Mooser, Andreas; Bohman, Matthew; Schön, Natalie; Harrington, James; Higuchi, Takashi; Nagahama, Hiroki; Sellner, Stefan; Smorra, Christian; Blaum, Klaus; Matsuda, Yasuyuki; Quint, Wolfgang; Walz, Jochen; Ulmer, Stefan
2017-11-24
Precise knowledge of the fundamental properties of the proton is essential for our understanding of atomic structure as well as for precise tests of fundamental symmetries. We report on a direct high-precision measurement of the magnetic moment μp of the proton in units of the nuclear magneton μN. The result, μp = 2.79284734462 (±0.00000000082) μN, has a fractional precision of 0.3 parts per billion, improves the previous best measurement by a factor of 11, and is consistent with the currently accepted value. This was achieved with the use of an optimized double-Penning trap technique. Provided a similar measurement of the antiproton magnetic moment can be performed, this result will enable a test of the fundamental symmetry between matter and antimatter in the baryonic sector at the 10^-10 level. Copyright © 2017, American Association for the Advancement of Science.
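The quoted fractional precision follows directly from the reported value and uncertainty; a one-line check:

```python
# Fractional precision implied by the quoted value and its uncertainty
mu_p = 2.79284734462   # in units of the nuclear magneton
sigma = 0.00000000082
print(sigma / mu_p * 1e9)  # ~0.29 parts per billion
```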
NASA Astrophysics Data System (ADS)
Neumann, Jay; Parlato, Russell; Tracy, Gregory; Randolph, Max
2015-09-01
Focal plane alignment for large-format arrays and faster optical systems requires enhanced precision methodology and stability over temperature. The increase in focal plane array size continues to drive alignment capability. Depending on the optical system, focal plane flatness of less than 25 μm (0.001") is required over transition temperatures from ambient to cooled operating temperatures. The focal plane flatness requirement must also be maintained in airborne or launch vibration environments. This paper addresses the challenge of detector integration into the focal plane module and housing assemblies, the methodology to reduce error terms during integration, and the evaluation of thermal effects. The driving factors influencing alignment accuracy include: datum transfers, material effects over temperature, alignment stability over test, adjustment precision, and traceability to NIST standards. The FPA module design and alignment methodology reduce the error terms by minimizing measurement transfers to the housing. In the design, selecting materials with matched coefficients of thermal expansion minimizes both the physical shift over temperature and the stress induced into the detector. When required, the co-registration of focal planes and filters can achieve submicron relative positioning by applying precision equipment, interferometry, and piezoelectric positioning stages. All measurements and characterizations maintain traceability to NIST standards. The metrology characterizes the equipment's accuracy, repeatability, and precision of the measurements.
Sports genetics moving forward: lessons learned from medical research.
Mattsson, C Mikael; Wheeler, Matthew T; Waggott, Daryl; Caleshu, Colleen; Ashley, Euan A
2016-03-01
Sports genetics can take advantage of lessons learned from human disease genetics. By righting past mistakes and increasing scientific rigor, we can magnify the breadth and depth of knowledge in the field. We present an outline of challenges facing sports genetics in the light of experiences from medical research. Sports performance is complex, resulting from a combination of a wide variety of different traits and attributes. Improving sports genetics will foremost require analyses based on detailed phenotyping. To find widely valid, reproducible common variants associated with athletic phenotypes, study sample sizes must be dramatically increased. One paradox is that in order to confirm relevance, replications in specific populations must be undertaken. Family studies of athletes may facilitate the discovery of rare variants with large effects on athletic phenotypes. The complexity of the human genome, combined with the complexity of athletic phenotypes, will require additional metadata and biological validation to identify a comprehensive set of genes involved. Analysis of personal genetic and multiomic profiles contributes to our conceptualization of precision medicine; the same will be the case in precision sports science. In the refinement of sports genetics it is essential to evaluate similarities and differences between sexes and among ethnicities. Sports genetics to date has been hampered by small sample sizes and biased methodology, which can lead to erroneous associations and overestimation of effect sizes. Consequently, currently available genetic tests based on these inherently limited data cannot predict athletic performance with any accuracy. Copyright © 2016 the American Physiological Society.
Chen, Yang; Ren, Xiaofeng; Zhang, Guo-Qiang; Xu, Rong
2013-01-01
Visual information is a crucial aspect of medical knowledge. Building a comprehensive medical image base, in the spirit of the Unified Medical Language System (UMLS), would greatly benefit patient education and self-care. However, collection and annotation of such a large-scale image base is challenging. To combine visual object detection techniques with medical ontology to automatically mine web photos and retrieve a large number of disease manifestation images with minimal manual labeling effort. As a proof of concept, we first learnt five organ detectors on three detection scales for eyes, ears, lips, hands, and feet. Given a disease, we used information from the UMLS to select affected body parts, ran the pretrained organ detectors on web images, and combined the detection outputs to retrieve disease images. Compared with a supervised image retrieval approach that requires training images for every disease, our ontology-guided approach exploits shared visual information of body parts across diseases. In retrieving 2220 web images of 32 diseases, we reduced manual labeling effort to 15.6% while improving the average precision by 3.9% from 77.7% to 81.6%. For 40.6% of the diseases, we improved the precision by 10%. The results confirm the concept that the web is a feasible source for automatic disease image retrieval for health image database construction. Our approach requires a small amount of manual effort to collect complex disease images, and to annotate them by standard medical ontology terms.
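A schematic sketch of the ontology-guided retrieval idea (detector and ontology lookup are stubbed with hypothetical names; a real system would run trained organ detectors and query the UMLS):

```python
# Hypothetical stub: a real system derives affected parts from the UMLS
AFFECTED_PARTS = {"conjunctivitis": ["eye"], "athlete's foot": ["foot"]}

def detect_parts(image):
    """Pretend organ detector: returns the body parts found in the image."""
    return image["parts"]  # stand-in for pretrained detector output

def retrieve_disease_images(disease, web_images):
    wanted = set(AFFECTED_PARTS[disease])
    return [img for img in web_images if wanted & set(detect_parts(img))]

images = [{"url": "a.jpg", "parts": ["eye"]}, {"url": "b.jpg", "parts": ["hand"]}]
print(retrieve_disease_images("conjunctivitis", images))  # keeps a.jpg only
```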
In search of a principled theory of the 'value' of knowledge.
Castelfranchi, Cristiano
2016-01-01
A principled theory of the value/utility of information and knowledge (K) is still missing. Such a theory would require a theory of the centrality of goals in minds, and of the role of K relative to goals and their dynamics. The value of K is a notion relative to goal value. Information/K is precisely a resource, a means, and the value of a means depends on the value of its possible functions and uses. The claim of this paper is that Ks have a value and utility, and can be more or less 'precious'; they have a cost and imply some risks; they can be not only useful but negative and dangerous. We also examine the 'quality' of this resource, its reliability, and its crucial role in goal processing: activating goals, abandoning them, choosing, planning, formulating intentions, deciding to act. 'Relevance theory', information theory, epistemic utility theory, etc., are not enough to provide a theory of the value/utility of K. Nor is truthfulness 'the' value of K: even true information can be noxious for the subject.
NASA Astrophysics Data System (ADS)
McPhaden, Michael
2010-10-01
It is critical to recognize the benefits and limitations of scientific knowledge, particularly when it comes to predicting hazards. I agree with G. J. Wasserburg that AGU should help scientists communicate their work accurately and understandably so it can provide the greatest value to society. This objective is explicit in AGU's new strategic plan (http://www.agu.org/about/strategic_plan.shtml) and is consistent with our vision of both advancing and communicating Earth and space science to ensure a sustainable future. We as a community have an obligation to increase the role of science in informing policy to mitigate the impacts of natural disasters. Such efforts require an open exchange of ideas and information and a clear understanding of the limitations of our knowledge. In response to Flavio Dobran, I agree that scientists are not above the law and, like all citizens, must be held accountable for their actions. However, laws and lawmakers must also recognize what science can and cannot do. We cannot yet reliably predict precisely when earthquakes will occur.
Test bench for measurements of NOvA scintillator properties at JINR
NASA Astrophysics Data System (ADS)
Velikanova, D. S.; Antoshkin, A. I.; Anfimov, N. V.; Samoylov, O. B.
2018-04-01
The NOvA experiment was built to study oscillation parameters, the mass hierarchy, the CP-violation phase in the lepton sector, and the θ23 octant, via νe appearance and νμ disappearance modes in both neutrino and antineutrino beams. These scientific goals require good knowledge of the basic properties of the NOvA scintillator. A new test bench was constructed and upgraded at JINR. The main goal of this bench is to measure scintillator properties (for solid and liquid scintillators), namely α/β discrimination and Birks coefficients for protons and other hadrons (quenching factors). This knowledge will be crucial for recovering the energy of the hadronic part of neutrino interactions with scintillator nuclei. α/β discrimination was performed on the first version of the bench for LAB-based and NOvA scintillators. It was repeated on the upgraded version of the bench with higher statistics and precision. A preliminary result for proton quenching factors was obtained. A technical description of both versions of the bench and current results of the measurements and analysis are presented in this work.
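For orientation, Birks' law relates scintillation light yield to stopping power, dL/dx = S·(dE/dx)/(1 + kB·dE/dx), so the quenching factor falls as dE/dx rises; a sketch with a generic, literature-scale kB (not a NOvA result):

```python
def birks_light_yield(dE_dx, S=1.0, kB=0.012):
    """Birks' law: light yield per unit path, quenched at high stopping power.

    kB in cm/MeV; the value here is a generic plastic-scintillator-scale number.
    """
    return S * dE_dx / (1.0 + kB * dE_dx)

# Quenching factor relative to the unquenched response S * dE/dx
for dE_dx in (2.0, 20.0, 200.0):  # MeV/cm, roughly MIP to proton/alpha scale
    print(dE_dx, round(birks_light_yield(dE_dx) / dE_dx, 3))
```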
Evidence and resources to implement Pharmacogenetic Knowledge for Precision Medicine
Caudle, Kelly E.; Gammal, Roseann S.; Whirl-Carrillo, Michelle; Hoffman, James M.; Relling, Mary V.; Klein, Teri E.
2016-01-01
Purpose: Implementation of pharmacogenetics into clinical practice has been relatively slow despite substantial scientific progress over the last decade. One barrier that inhibits uptake of pharmacogenetics into routine clinical practice is the lack of knowledge of how to translate a genetic test into a clinical action based on current evidence. The purpose of this paper is to describe the current state of pharmacogenetic evidence and evidence-based resources that facilitate the uptake of pharmacogenetics into clinical practice. Summary: Controversy exists over the required evidence threshold needed for routine clinical implementation of pharmacogenetics. Large randomized controlled trials are not clinically feasible or necessary for many pharmacogenetic applications. Online resources exist, like the Clinical Pharmacogenetics Implementation Consortium (CPIC) and the Pharmacogenomics Knowledgebase (PharmGKB), that provide freely available, evidence-based resources facilitating the translation of genetic laboratory test results into actionable prescribing recommendations for specific drugs. Conclusion: Resources provided by organizations such as CPIC and PharmGKB that use standardized approaches to evaluate the literature and provide clinical guidance are essential for the implementation of pharmacogenetics into routine clinical practice. PMID:27864205
An efficient deterministic-probabilistic approach to modeling regional groundwater flow: 1. Theory
Yen, Chung-Cheng; Guymon, Gary L.
1990-01-01
An efficient probabilistic model is developed and cascaded with a deterministic model for predicting water table elevations in regional aquifers. The objective is to quantify model uncertainty where precise estimates of water table elevations may be required. The probabilistic model is based on the two-point probability method, which only requires prior knowledge of the uncertain variables' means and coefficients of variation. The two-point estimate method is theoretically developed and compared with the Monte Carlo simulation method. The results of comparisons using hypothetical deterministic problems indicate that the two-point estimate method is only generally valid for linear problems where the coefficients of variation of uncertain parameters (for example, storage coefficient and hydraulic conductivity) are small. The two-point estimate method may be applied to slightly nonlinear problems with good results, provided coefficients of variation are small. In such cases, the two-point estimate method is much more efficient than the Monte Carlo method provided the number of uncertain variables is less than eight.
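The two-point estimate admits a compact sketch (a Rosenblueth-style 2^n-point version of our own; the "head" function below is a stand-in for the groundwater model, not the paper's code):

```python
from itertools import product
from statistics import mean

def two_point_estimate(model, means, cvs):
    """Mean and std of a model output from 2**n evaluations at mu +/- sigma.

    means, cvs: per-input means and coefficients of variation (sigma/mu).
    """
    sigmas = [m * c for m, c in zip(means, cvs)]
    outputs = [model([m + s * sign for m, s, sign in zip(means, sigmas, signs)])
               for signs in product((-1.0, 1.0), repeat=len(means))]
    mu = mean(outputs)
    var = mean((y - mu) ** 2 for y in outputs)
    return mu, var ** 0.5

# Stand-in "head" model, nearly linear in two uncertain hydraulic parameters
head = lambda x: 10.0 + 2.0 * x[0] - 0.5 * x[1]
print(two_point_estimate(head, means=[1.0, 4.0], cvs=[0.1, 0.1]))  # (10.0, ~0.283)
```

For this linear stand-in the result matches the exact propagated standard deviation, illustrating why the method works best when nonlinearity and coefficients of variation are small.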
Squizzato, Silvano; Park, Young Mi; Buso, Nicola; Gur, Tamer; Cowley, Andrew; Li, Weizhong; Uludag, Mahmut; Pundir, Sangya; Cham, Jennifer A; McWilliam, Hamish; Lopez, Rodrigo
2015-07-01
The European Bioinformatics Institute (EMBL-EBI-https://www.ebi.ac.uk) provides free and unrestricted access to data across all major areas of biology and biomedicine. Searching and extracting knowledge across these domains requires a fast and scalable solution that addresses the requirements of domain experts as well as casual users. We present the EBI Search engine, referred to here as 'EBI Search', an easy-to-use fast text search and indexing system with powerful data navigation and retrieval capabilities. API integration provides access to analytical tools, allowing users to further investigate the results of their search. The interconnectivity that exists between data resources at EMBL-EBI provides easy, quick and precise navigation and a better understanding of the relationship between different data types including sequences, genes, gene products, proteins, protein domains, protein families, enzymes and macromolecular structures, together with relevant life science literature. © The Author(s) 2015. Published by Oxford University Press on behalf of Nucleic Acids Research.
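As a usage illustration, EBI Search is exposed through a REST interface; the endpoint pattern below follows the public EMBL-EBI documentation as we recall it, so treat the URL, parameters, and response fields as assumptions to verify before relying on them:

```python
import requests

# Assumed endpoint pattern for EBI Search's REST interface; verify against
# current EMBL-EBI documentation before use.
BASE = "https://www.ebi.ac.uk/ebisearch/ws/rest"

def ebi_search(domain, query, size=5):
    resp = requests.get(f"{BASE}/{domain}",
                        params={"query": query, "size": size, "format": "json"})
    resp.raise_for_status()
    return [hit["id"] for hit in resp.json()["entries"]]

print(ebi_search("uniprot", "p53 AND organism:human"))
```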
Frederickson, Reese
2016-09-01
When veterinary pathologists testify as expert witnesses in animal cruelty trials, they may find themselves in an intimidating and unfamiliar environment. The legal rules are clouded in mystery, the lawyers dwell on mundane details, and the witness's words are extracted with precision by a verbal scalpel. An unprepared expert witness can feel ungrounded and stripped of confidence. The goal of this article is to lift the veil of mystery and give the veterinary pathologist the tools to be a knowledgeable and confident expert witness before and during testimony. This article discusses the types of expert witnesses, disclosure requirements and the importance of a good report, the legal basics of expert testimony, and how to be an effective expert witness. The article references Minnesota law; however, the laws are similar in most jurisdictions and based on the same constitutional requirements, and the concepts presented are applicable in nearly every courtroom.(1). © The Author(s) 2016.
Flight evaluation of a computer aided low-altitude helicopter flight guidance system
NASA Technical Reports Server (NTRS)
Swenson, Harry N.; Jones, Raymond D.; Clark, Raymond
1993-01-01
The Flight Systems Development branch of the U.S. Army's Avionics Research and Development Activity (AVRADA) and NASA Ames Research Center developed for flight testing a Computer Aided Low-Altitude Helicopter Flight (CALAHF) guidance system. The system includes a trajectory-generation algorithm which uses dynamic programming and a helmet-mounted display (HMD) presentation of a pathway-in-the-sky, a phantom aircraft, and flight-path vector/predictor guidance symbology. The trajectory-generation algorithm uses knowledge of the global mission requirements, a digital terrain map, aircraft performance capabilities, and precision navigation information to determine a trajectory between mission waypoints that seeks valleys to minimize threat exposure. This system was developed and evaluated through extensive use of piloted simulation and has demonstrated a 'pilot centered' concept of automated and integrated navigation and terrain mission planning flight guidance. The system has shown a significant improvement in pilot situational awareness and mission effectiveness, as well as a decrease in the training and proficiency time required for near-terrain, nighttime, adverse-weather missions.
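A toy dynamic-programming sweep over a terrain "threat cost" grid conveys the valley-seeking idea (grid, costs, and allowed moves are our illustration, not the CALAHF algorithm):

```python
def min_cost_path(cost):
    """Cheapest left-to-right traversal, moving to an adjacent row each step."""
    rows, cols = len(cost), len(cost[0])
    best = [row[:] for row in cost]
    for j in range(1, cols):                 # sweep columns left to right
        for i in range(rows):
            prev = [best[k][j - 1] for k in (i - 1, i, i + 1) if 0 <= k < rows]
            best[i][j] += min(prev)          # cheapest way to reach cell (i, j)
    return min(best[i][cols - 1] for i in range(rows))

threat = [[3, 4, 4],
          [1, 9, 2],
          [2, 1, 5]]   # low numbers = valleys / low threat exposure
print(min_cost_path(threat))  # 1 + 1 + 2 = 4
```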
NASA Astrophysics Data System (ADS)
Martin, Chris; Ben-Yakar, Adela
2016-11-01
Ultrafast laser surgery of tissue requires precise knowledge of the tissue's optical properties to control the extent of subsurface ablation. Here, we present a method to determine the scattering lengths, ℓs, and fluence thresholds, Fth, in multilayered and turbid tissue by finding the input energies required to initiate ablation at various depths in each tissue layer. We validated the method using tissue-mimicking phantoms and applied it to porcine vocal folds, which consist of an epithelial (ep) layer and a superficial lamina propria (SLP) layer. Across five vocal fold samples, we found ℓs,ep = 51.0 ± 3.9 μm, Fth,ep = 1.78 ± 0.08 J/cm2, ℓs,SLP = 26.5 ± 1.6 μm, and Fth,SLP = 1.14 ± 0.12 J/cm2. Our method can enable personalized determination of tissue optical properties in a clinical setting, leading to less patient-to-patient variability and more favorable outcomes in operations, such as femto-LASIK surgery.
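Assuming simple exponential (Beer-Lambert-style) attenuation of fluence with depth, which is consistent with the method's logic though the paper's exact model may differ, the threshold pulse energy grows as exp(depth/ℓs), so a linear fit of ln(E_th) versus depth recovers ℓs; a sketch with invented numbers:

```python
import numpy as np

# Hypothetical threshold energies measured at increasing depths in one layer
depth_um = np.array([0.0, 20.0, 40.0, 60.0])
E_th_uJ = np.array([0.50, 0.74, 1.10, 1.63])  # grows ~exp(depth / l_s)

slope, intercept = np.polyfit(depth_um, np.log(E_th_uJ), 1)
l_s = 1.0 / slope                  # scattering length, in um
E_surface = np.exp(intercept)      # threshold energy extrapolated to depth 0
print(l_s, E_surface)              # ~51 um, ~0.5 uJ
```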
Feltus, F Alex
2014-06-01
Understanding the control of any trait optimally requires the detection of causal genes, gene interaction, and mechanism of action to discover and model the biochemical pathways underlying the expressed phenotype. Functional genomics techniques, including RNA expression profiling via microarray and high-throughput DNA sequencing, allow for the precise genome localization of biological information. Powerful genetic approaches, including quantitative trait locus (QTL) and genome-wide association study mapping, link phenotype with genome positions, yet genetics is less precise in localizing the relevant mechanistic information encoded in DNA. The coupling of salient functional genomic signals with genetically mapped positions is an appealing approach to discover meaningful gene-phenotype relationships. Techniques used to define this genetic-genomic convergence comprise the field of systems genetics. This short review will address an application of systems genetics where RNA profiles are associated with genetically mapped genome positions of individual genes (eQTL mapping) or as gene sets (co-expression network modules). Both approaches can be applied for knowledge independent selection of candidate genes (and possible control mechanisms) underlying complex traits where multiple, likely unlinked, genomic regions might control specific complex traits. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
Identification of elastic, dielectric, and piezoelectric constants in piezoceramic disks.
Perez, Nicolas; Andrade, Marco A B; Buiochi, Flavio; Adamowski, Julio C
2010-12-01
Three-dimensional modeling of piezoelectric devices requires a precise knowledge of piezoelectric material parameters. The commonly used piezoelectric materials belong to the 6mm symmetry class, which have ten independent constants. In this work, a methodology to obtain precise material constants over a wide frequency band through finite element analysis of a piezoceramic disk is presented. Given an experimental electrical impedance curve and a first estimate for the piezoelectric material properties, the objective is to find the material properties that minimize the difference between the electrical impedance calculated by the finite element method and that obtained experimentally by an electrical impedance analyzer. The methodology consists of four basic steps: experimental measurement, identification of vibration modes and their sensitivity to material constants, a preliminary identification algorithm, and final refinement of the material constants using an optimization algorithm. The application of the methodology is exemplified using a hard lead zirconate titanate piezoceramic. The same methodology is applied to a soft piezoceramic. The errors in the identification of each parameter are statistically estimated in both cases, and are less than 0.6% for elastic constants, and less than 6.3% for dielectric and piezoelectric constants.
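At its core, the refinement step is least-squares minimization of the gap between measured and simulated impedance curves; a sketch with the finite element solve replaced by a toy one-resonance model (our stand-in, not the authors' formulation):

```python
import numpy as np
from scipy.optimize import minimize

freqs = np.linspace(1.0e5, 3.0e5, 200)  # Hz

def toy_impedance(params, f):
    """Stand-in for the FEM solve: one resonance shaped by two 'constants'."""
    fr, scale = params
    return scale / np.sqrt((1.0 - (f / fr) ** 2) ** 2 + 0.01)

measured = toy_impedance((2.0e5, 50.0), freqs)  # pretend experimental curve

def misfit(params):
    return np.sum((np.log(toy_impedance(params, freqs)) - np.log(measured)) ** 2)

fit = minimize(misfit, x0=np.array([1.7e5, 30.0]), method="Nelder-Mead")
print(fit.x)  # recovers approximately [2.0e5, 50.0]
```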
Ageing and inflammation in the male reproductive tract.
Frungieri, M B; Calandra, R S; Bartke, A; Matzkin, M E
2018-05-08
Ageing is usually characterised by a mild chronic proinflammatory state. Given the tight association between the two processes, the phenomenon has recently been termed inflammageing. Inflammation in the male reproductive tract is frequently linked with bacterial or virus infections but also with a broad range of noninfectious processes. Prostatitis, epididymitis, and orchitis, among others, can lead to infertility. However, in spite of the inflammation theory of disease, chronic inflammation in the male urogenital system does not always cause symptoms. With advancing age, inflammatory processes are commonly observed in the male reproductive tract. Nevertheless, the incidence of inflammation in reproductive organs and ducts varies greatly among elderly men. Inflammageing is considered a predictor of pathogenesis and the development of age-related diseases. This article briefly summarises the current state of knowledge on inflammageing in the male reproductive tract. Yet the precise aetiology of inflammageing in the male urogenital system, and its potential contribution not only to infertility but, most importantly, to adverse health outcomes, remains almost unknown. Thus, further investigations are required to elucidate the precise cross-links between inflammation and male reproductive senescence, and to establish the impact of anti-inflammatory drug treatments on older men's general health status. © 2018 Blackwell Verlag GmbH.
Real-Space Mapping of Surface Trap States in CIGSe Nanocrystals Using 4D Electron Microscopy.
Bose, Riya; Bera, Ashok; Parida, Manas R; Adhikari, Aniruddha; Shaheen, Basamat S; Alarousu, Erkki; Sun, Jingya; Wu, Tom; Bakr, Osman M; Mohammed, Omar F
2016-07-13
Surface trap states in copper indium gallium selenide semiconductor nanocrystals (NCs), which serve as undesirable channels for nonradiative carrier recombination, remain a great challenge impeding the development of solar and optoelectronics devices based on these NCs. In order to design efficient passivation techniques to minimize these trap states, a precise knowledge about the charge carrier dynamics on the NCs surface is essential. However, selective mapping of surface traps requires capabilities beyond the reach of conventional laser spectroscopy and static electron microscopy; it can only be accessed by using a one-of-a-kind, second-generation four-dimensional scanning ultrafast electron microscope (4D S-UEM) with subpicosecond temporal and nanometer spatial resolutions. Here, we precisely map the collective surface charge carrier dynamics of copper indium gallium selenide NCs as a function of the surface trap states before and after surface passivation in real space and time using S-UEM. The time-resolved snapshots clearly demonstrate that the density of the trap states is significantly reduced after zinc sulfide (ZnS) shelling. Furthermore, the removal of trap states and elongation of carrier lifetime are confirmed by the increased photocurrent of the self-biased photodetector fabricated using the shelled NCs.
Lee, Kit-Hang; Fu, Denny K C; Leong, Martin C W; Chow, Marco; Fu, Hing-Choi; Althoefer, Kaspar; Sze, Kam Yim; Yeung, Chung-Kwong; Kwok, Ka-Wai
2017-12-01
Bioinspired robotic structures comprising soft actuation units have attracted increasing research interest. Taking advantage of its inherent compliance, soft robots can assure safe interaction with external environments, provided that precise and effective manipulation could be achieved. Endoscopy is a typical application. However, previous model-based control approaches often require simplified geometric assumptions on the soft manipulator, but which could be very inaccurate in the presence of unmodeled external interaction forces. In this study, we propose a generic control framework based on nonparametric and online, as well as local, training to learn the inverse model directly, without prior knowledge of the robot's structural parameters. Detailed experimental evaluation was conducted on a soft robot prototype with control redundancy, performing trajectory tracking in dynamically constrained environments. Advanced element formulation of finite element analysis is employed to initialize the control policy, hence eliminating the need for random exploration in the robot's workspace. The proposed control framework enabled a soft fluid-driven continuum robot to follow a 3D trajectory precisely, even under dynamic external disturbance. Such enhanced control accuracy and adaptability would facilitate effective endoscopic navigation in complex and changing environments.
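A minimal sketch of a nonparametric, online, local inverse model in the spirit described (a k-nearest-neighbour scheme of our own; note the paper initializes its policy from finite element analysis rather than the random exploration used here purely for brevity):

```python
import numpy as np

class LocalInverseModel:
    """Online store of (task-space move, actuation) pairs; local k-NN inverse."""
    def __init__(self, k=4):
        self.k, self.moves, self.actions = k, [], []

    def update(self, delta_x, action):
        self.moves.append(np.asarray(delta_x))
        self.actions.append(np.asarray(action))

    def query(self, desired_dx):
        d = np.linalg.norm(np.array(self.moves) - desired_dx, axis=1)
        idx = np.argsort(d)[:self.k]          # the k most similar past moves
        w = 1.0 / (d[idx] + 1e-6)             # weight nearer neighbours more
        return np.average(np.array(self.actions)[idx], axis=0, weights=w)

model = LocalInverseModel()
rng = np.random.default_rng(0)
for _ in range(200):                          # explore with random actuations
    a = rng.uniform(-1, 1, size=2)
    dx = np.array([2.0 * a[0], a[0] + a[1]])  # unknown "robot" forward mapping
    model.update(dx, a)
print(model.query(np.array([0.5, 0.5])))      # ~[0.25, 0.25]
```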
49 CFR 383.111 - Required knowledge.
Code of Federal Regulations, 2011 CFR
2011-10-01
49 CFR 383.111 (Title 49 Transportation; Commercial Driver's License Standards; Requirements and Penalties; Required Knowledge and Skills): Required knowledge. (a) All CMV operators must have knowledge of the following 20 general areas: (1) Safe operations regulations...
49 CFR 383.111 - Required knowledge.
Code of Federal Regulations, 2010 CFR
2010-10-01
49 CFR 383.111 (Title 49 Transportation; Commercial Driver's License Standards; Requirements and Penalties; Required Knowledge and Skills): Required knowledge. All commercial motor vehicle operators must have knowledge of the following general areas: (a) Safe operations...
THz Spectroscopy and Spectroscopic Database for Astrophysics
NASA Technical Reports Server (NTRS)
Pearson, John C.; Drouin, Brian J.
2006-01-01
Molecule specific astronomical observations rely on precisely determined laboratory molecular data for interpretation. The Herschel Heterodyne Instrument for the Far Infrared, a suite of SOFIA instruments, and ALMA are each well placed to expose the limitations of available molecular physics data and spectral line catalogs. Herschel and SOFIA will observe at high spectral resolution over the entire far infrared range. Accurate data to previously unimagined frequencies, including infrared ro-vibrational and ro-torsional bands, will be required for interpretation of the observations. Planned ALMA observations with a very small beam will reveal weaker emission features requiring accurate knowledge of higher quantum numbers and additional vibrational states. Historically, laboratory spectroscopy has been at the forefront of submillimeter technology development, but now astronomical receivers have an enormous capability advantage. Additionally, rotational spectroscopy is a relatively mature field attracting little interest from students and funding agencies. Molecular database maintenance is tedious and difficult to justify as research. This severely limits funding opportunities even though databases require the same level of expertise as research. We report the application of some relatively new receiver technology in a simple solid state THz spectrometer that has the performance required to collect the laboratory data required by astronomical observations. Further detail on the lack of preparation for upcoming missions by the JPL spectral line catalog is given.
Manufacturing plastic injection optical molds
NASA Astrophysics Data System (ADS)
Bourque, David
2008-08-01
ABCO Tool & Die, Inc. is a mold manufacturer specializing in the manufacturing of plastic injection molds for molded optical parts. The purpose of this presentation is to explain the concepts and procedures required to build a mold that produces precision optical parts. Optical molds can produce a variety of molded parts ranging from safety eyewear to sophisticated military lens parts, which must meet precise optical specifications. The manufacturing of these molds begins with the design engineering of precision optical components. The mold design and the related optical inserts are determined based upon the specific optical criteria and optical surface geometry. The mold manufacturing techniques will be based upon the optical surface geometry requirements and specific details. Manufacturing processes used will be specific to prescribed geometrical surface requirements of the molded part. The combined efforts result in a robust optical mold which can produce molded parts that meet the most precise optical specifications.
Cheng, Lijun; Schneider, Bryan P; Li, Lang
2016-07-01
Cancer has been extensively characterized on the basis of genomics. Integrating genetic information about cancers with data on how the cancers respond to target-based therapy helps to optimize cancer treatment. The increasing usage of sequencing technology in cancer research and clinical practice has enormously advanced our understanding of cancer mechanisms, and cancer precision medicine is becoming a reality. Although off-label drug usage is a common practice in treating cancer, it suffers from the lack of a knowledge base for proper cancer drug selection. This eminent need has become even more apparent considering the upcoming genomics data. In this paper, a personalized medicine knowledge base is constructed by integrating various cancer drugs, drug-target databases, and knowledge sources for the proper selection of cancer drugs and their targets. Based on the knowledge base, a bioinformatics approach for cancer drug selection in precision medicine is developed. It integrates personal molecular profile data, including copy number variation, mutation, and gene expression. By analyzing data from 85 triple negative breast cancer (TNBC) patients in The Cancer Genome Atlas, we have shown that 71.7% of the TNBC patients have FDA-approved drug targets, and 51.7% of the patients have more than one drug target. Sixty-five drug targets are identified as TNBC treatment targets and 85 candidate drugs are recommended. Many existing TNBC candidate targets, such as Poly (ADP-Ribose) Polymerase 1 (PARP1), Cell Division Protein Kinase 6 (CDK6), epidermal growth factor receptor, etc., were identified. On the other hand, we found some additional targets that are not yet fully investigated in TNBC, such as Gamma-Glutamyl Hydrolase (GGH), Thymidylate Synthetase (TYMS), Protein Tyrosine Kinase 6 (PTK6), Topoisomerase (DNA) I, Mitochondrial (TOP1MT), Smoothened, Frizzled Class Receptor (SMO), etc. Our target and drug selection strategy is also fully supported by the drug screening data on TNBC cell lines in the Cancer Cell Line Encyclopedia. The proposed bioinformatics approach lays a foundation for cancer precision medicine and supplies a much-needed knowledge base for off-label cancer drug usage in clinics. © The Author 2016. Published by Oxford University Press on behalf of the American Medical Informatics Association. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
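The core selection step reduces to intersecting a patient's altered genes with a drug-target knowledge base; a schematic sketch (the gene-drug pairs are illustrative examples, not the paper's curated knowledge base):

```python
# Toy drug-target table; a real knowledge base would be curated from
# FDA labels, drug-target databases, and the literature.
KNOWLEDGE_BASE = {
    "PARP1": ["olaparib"],
    "CDK6": ["palbociclib"],
    "EGFR": ["erlotinib"],
}

def candidate_drugs(patient_alterations):
    """Map a patient's altered genes to actionable targets and drugs."""
    return {g: KNOWLEDGE_BASE[g] for g in patient_alterations
            if g in KNOWLEDGE_BASE}

patient = ["PARP1", "TP53", "CDK6"]   # e.g., mutated or amplified genes
print(candidate_drugs(patient))       # PARP1 and CDK6 are actionable here
```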
Garrido, P; Aldaz, A; Vera, R; Calleja, M A; de Álava, E; Martín, M; Matías-Guiu, X; Palacios, J
2018-04-01
Precision medicine is an emerging approach for disease treatment and prevention that takes into account individual variability in genes, environment, and lifestyle for each person. Precision medicine is transforming clinical and biomedical research, as well as health care itself, from a conceptual as well as a methodological viewpoint, providing extraordinary opportunities to improve public health and lower the costs of the healthcare system. However, the implementation of precision medicine poses ethical-legal, regulatory, organizational, and knowledge-related challenges. Without a national strategy, precision medicine, which will be implemented one way or another, could take place without the appropriate planning needed to guarantee technical quality and equal access of all citizens to the best practices, thereby violating the rights of patients and professionals and jeopardizing the solvency of the healthcare system. With this paper from the Spanish Societies of Medical Oncology, Pathology, and Hospital Pharmacy, we highlight the need to institute a consensual national strategy for the development of precision medicine in our country, review the national and international context, comment on the opportunities and challenges for implementing precision medicine, and outline the objectives of a national strategy on precision medicine in cancer.
Computation as the mechanistic bridge between precision medicine and systems therapeutics.
Hansen, J; Iyengar, R
2013-01-01
Over the past 50 years, like molecular cell biology, medicine and pharmacology have been driven by a reductionist approach. The focus on individual genes and cellular components as disease loci and drug targets has been a necessary step in understanding the basic mechanisms underlying tissue/organ physiology and drug action. Recent progress in genomics and proteomics, as well as advances in other technologies that enable large-scale data gathering and computational approaches, is providing new knowledge of both normal and disease states. Systems-biology approaches enable integration of knowledge from different types of data for precision medicine and systems therapeutics. In this review, we describe recent studies that contribute to these emerging fields and discuss how together these fields can lead to a mechanism-based therapy for individual patients.
Characterization of the Nimbus-7 SBUV radiometer for the long-term monitoring of stratospheric ozone
NASA Technical Reports Server (NTRS)
Cebula, Richard P.; Park, H.; Heath, D. F.
1988-01-01
Precise knowledge of in-orbit sensitivity change is critical for the successful monitoring of stratospheric ozone by satellite-based remote sensors. This paper evaluates those aspects of the in-flight operation that influence the long-term stability of the upper stratospheric ozone measurements made by the Nimbus-7 SBUV spectroradiometer and chronicles methods used to maintain the long-term albedo calibration of this UV sensor. It is shown that the instrument's calibration for the ozone measurement, the albedo calibration, has been maintained over the first 6 yr of operation to an accuracy of approximately ±2 percent. The instrument's wavelength calibration is shown to drift linearly with time. The knowledge of the SBUV wavelength assignment is maintained to a 0.02-nm precision.
Optimal MEMS device for mobility and zeta potential measurements using DC electrophoresis.
Karam, Pascal R; Dukhin, Andrei; Pennathur, Sumita
2017-05-01
We have developed a novel microchannel geometry that allows us to perform simple DC electrophoresis to measure the electrophoretic mobility and zeta potential of analytes and particles. In standard capillary geometries, mobility measurements using DC fields are difficult to perform. Specifically, measurements in open capillaries require knowledge of the hard-to-measure and often dynamic wall surface potential. Although measurements in closed capillaries eliminate this requirement, the measurements must be performed at infinitesimally small regions of zero flow where the pressure-driven flow completely cancels the electroosmotic flow (Komagata planes). Furthermore, applied DC fields lead to electrode polarization, further questioning the reliability and accuracy of the measurement. In contrast, our geometry expands the Komagata planes and moves them to where velocity gradients are at a minimum, so that knowledge of the precise location of a Komagata plane is not necessary. Additionally, our microfluidic device prevents electrode polarization because of fluid recirculation around the electrodes. We fabricated our device using standard MEMS fabrication techniques and performed electrophoretic mobility measurements on 500 nm fluorescently tagged polystyrene particles at various buffer concentrations. Results are comparable to those from two different commercial dynamic light scattering based particle sizing instruments. We conclude with guidelines to further develop this robust electrophoretic tool that allows for facile and efficient particle characterization. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
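For context on turning such a measurement into a zeta potential, the Smoluchowski relation ζ = μη/ε applies in the thin-double-layer limit (textbook physics, not a detail of this device; the sample numbers are ours):

```python
# Smoluchowski: zeta = mobility * viscosity / permittivity (thin double layer)
eps0 = 8.854e-12          # vacuum permittivity, F/m
eps_r = 78.4              # relative permittivity of water at 25 C
eta = 0.89e-3             # viscosity of water at 25 C, Pa*s
mobility = -3.0e-8        # m^2/(V*s), e.g. velocity / field from the DC run

zeta = mobility * eta / (eps0 * eps_r)
print(zeta * 1e3, "mV")   # ~ -38 mV
```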
Precision Linear Actuator for Space Interferometry Mission (SIM) Siderostat Pointing
NASA Technical Reports Server (NTRS)
Cook, Brant; Braun, David; Hankins, Steve; Koenig, John; Moore, Don
2008-01-01
'SIM PlanetQuest will exploit the classical measuring tool of astrometry (interferometry) with unprecedented precision to make dramatic advances in many areas of astronomy and astrophysics'(1). In order to obtain interferometric data, two large steerable mirrors, or siderostats, are used to direct starlight into the interferometer. A gimbaled mechanism actuated by linear actuators was chosen to meet the unprecedented pointing and angle tracking requirements of SIM. A group of JPL engineers designed, built, and tested a linear ballscrew actuator capable of performing submicron incremental steps over 10 years of continuous operation. Precise, zero-backlash, closed-loop pointing control requirements led the team to implement a ballscrew actuator with a direct-drive DC motor and a precision piezo brake. Motor control commutation using feedback from a precision linear encoder on the ballscrew output produced an unexpected incremental step size of 20 nm over a range of 120 mm, yielding a dynamic range of 6,000,000:1. The results prove that linear nanometer positioning requires no gears, levers, or hydraulic converters. Along the way many lessons have been learned and will subsequently be shared.
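The quoted dynamic range is simply the travel divided by the step size:

```python
# Dynamic range implied by 20 nm steps over 120 mm of travel
print(round(120e-3 / 20e-9))  # 6,000,000 -> the quoted 6,000,000:1
```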
[Medical big data and precision medicine: prospects of epidemiology].
Song, J; Hu, Y H
2016-08-10
Since the development of high-throughput technologies, electronic medical record systems, and big data technology, the value of medical data has attracted increasing attention. At the same time, the proposal of the Precision Medicine Initiative opens up new prospects for medical big data. As a methods-oriented discipline, epidemiology focuses on exploiting the resources of existing big data and promoting the integration of translational research and knowledge so as to completely unlock the "black box" of the exposure-disease continuum. It also seeks to accelerate the realization of the ultimate goal of precision medicine. The overall purpose, however, is to translate the evidence from scientific research into improved health for the people.
Precision and fast wavelength tuning of a dynamically phase-locked widely-tunable laser.
Numata, Kenji; Chen, Jeffrey R; Wu, Stewart T
2012-06-18
We report a precision and fast wavelength tuning technique demonstrated for a digital-supermode distributed Bragg reflector laser. The laser was dynamically offset-locked to a frequency-stabilized master laser using an optical phase-locked loop, enabling precision fast tuning to and from any frequencies within a ~40-GHz tuning range. The offset frequency noise was suppressed to the statically offset-locked level in less than ~40 μs upon each frequency switch, allowing the laser to retain the absolute frequency stability of the master laser. This technique satisfies stringent requirements for gas sensing lidars and enables other applications that require such well-controlled precision fast tuning.
Mars Surface Ionizing Radiation Environment: Need for Validation
NASA Technical Reports Server (NTRS)
Wilson, J. W.; Kim, M. Y.; Clowdsley, M. S.; Heinbockel, J. H.; Tripathi, R. K.; Singleterry, R. C.; Shinn, J. L.; Suggs, R.
1999-01-01
Protection against the hazards from exposure to ionizing radiation remains an unresolved issue in the Human Exploration and Development of Space (HEDS) enterprise [1]. The major uncertainty is the lack of data on biological response to galactic cosmic ray (GCR) exposures, but even a full understanding of the physical interaction of GCR with shielding and body tissues is not yet available, and this has a potentially large impact on mission costs. "The general opinion is that the initial flights should be short-stay missions performed as fast as possible (so-called 'Sprint' missions) to minimize crew exposure to the zero-g and space radiation environment, to ease requirements on system reliability, and to enhance the probability of mission success." The short-stay missions tend to have long transit times and may not be the best option due to the relatively long exposure to zero-g and ionizing radiation. On the other hand, the short-transit missions tend to have long stays on the surface, requiring an adequate knowledge of the surface radiation environment to estimate risks and to design shield configurations. Our knowledge of the surface environment is theoretically based and suffers from an incomplete understanding of the physical interactions of GCR with the Martian atmosphere, Martian surface, and intervening shield materials. An important component of Mars surface robotic exploration is the opportunity to test our understanding of the Mars surface environment. The Mars surface environment is generated by the interaction of Galactic Cosmic Rays (GCR) and Solar Particle Events (SPEs) with the Mars atmosphere and Mars surface materials. In these interactions, multiply charged ions are reduced in size and secondary particles, including neutrons, are generated. Upon impact with the Martian surface, the character of the interactions changes as a result of the differing nuclear constituents of the surface materials. The surface environment includes many neutrons diffusing from the Martian surface; especially prominent are energetic neutrons with energies up to a few hundred MeV. Testing of these computational results is first supported by ongoing experiments at the Brookhaven National Laboratory, but equally important is validation, to the extent possible, by measurements on the Martian surface. Such measurements are limited by the power and weight requirements of the specific mission, and simplified instrumentation by necessity lacks the full discernment of particle type and spectra that is possible with laboratory experimental equipment. Yet the surface measurements are precise and a necessary requisite to validate our understanding of the surface environment. At the very minimum, the surface measurements need to provide some spectral information on the neutron environment. Of absolute necessity is precise knowledge of the detector response functions for absolute comparisons between the computational model of the surface environment and the detector measurements on the surface.
SIM Planetquest Science and Technology: A Status Report
NASA Technical Reports Server (NTRS)
Edberg, Stephen J.; Laskin, Robert A.; Marr, James C., IV; Unwin, Stephen C.; Shao, Michael
2007-01-01
Optical interferometry will open new vistas for astronomy over the next decade. The Space Interferometry Mission (SIM-PlanetQuest), operating unfettered by the Earth's atmosphere, will offer unprecedented astrometric precision that promises the discovery of Earth-analog extra-solar planets as well as a wealth of important astrophysics. Results from SIM will permit the determination of stellar masses to accuracies of 2% or better for objects ranging from brown dwarfs through main sequence stars to evolved white dwarfs, neutron stars, and black holes. Studies of star clusters will yield age determinations and internal dynamics. Microlensing measurements will present the mass spectrum of the Milky Way internal to the Sun while proper motion surveys will show the Sun's orbital radius and speed. Studies of the Galaxy's halo component and companion dwarf galaxies permit the determination of the Milky Way's mass distribution, including its Dark Matter component and the mass distribution and Dark Matter component of the Local Group. Cosmology benefits from precision (1-2%) determination of distances to Cepheid and RR Lyrae standard candles. The emission mechanism of supermassive black holes will be investigated. Finally, radio and optical celestial reference frames will be tied together by an improvement of two orders of magnitude. Optical interferometers present severe technological challenges. The Jet Propulsion Laboratory, with the support of Lockheed Martin Advanced Technology Center (LM ATC) and Northrop Grumman Space Technology (NGST), has addressed these challenges with a technology development program that is now complete. The requirements for SIM have been satisfied, based on outside peer review, using a series of laboratory tests and appropriate computer simulations: laser metrology systems perform with 10 picometer precision; mechanical vibrations have been controlled to nanometers, demonstrating orders of magnitude disturbance rejection; and knowledge of component positions throughout the whole test assembly has been demonstrated to the required picometer level. Technology transfer to the SIM flight team is now well along.
Oxidative Stress, Unfolded Protein Response, and Apoptosis in Developmental Toxicity
Kupsco, Allison; Schlenk, Daniel
2016-01-01
Physiological development requires precise spatiotemporal regulation of cellular and molecular processes. Disruption of these key events can generate developmental toxicity in the form of teratogenesis or mortality. The mechanisms behind many developmental toxicants remain unknown. While recent work has focused on the unfolded protein response (UPR), oxidative stress, and apoptosis in the pathogenesis of disease, few studies have addressed their relationship in developmental toxicity. Redox regulation, the UPR, and apoptosis are essential for physiological development and can be disturbed by a variety of endogenous and exogenous toxicants to generate lethality and diverse malformations. This review examines the current knowledge of the role of oxidative stress, the UPR, and apoptosis in physiological development as well as in developmental toxicity, focusing on studies and advances in vertebrate model systems. PMID:26008783
Design and manufacturing challenges of optogenetic neural interfaces: a review
NASA Astrophysics Data System (ADS)
Goncalves, S. B.; Ribeiro, J. F.; Silva, A. F.; Costa, R. M.; Correia, J. H.
2017-08-01
Optogenetics is a relatively new technology for achieving cell-type-specific neuromodulation with millisecond-scale temporal precision. Optogenetic tools are being developed to address neuroscience challenges and to improve knowledge about brain networks, with the ultimate aim of catalyzing new treatments for brain disorders and diseases. To reach this ambitious goal, the implementation of mature and reliable engineered tools is required. The success of optogenetics relies on optical tools that can deliver light into the neural tissue. Objective/Approach: Here, the design and manufacturing approaches available to the scientific community are reviewed, and current challenges in accomplishing appropriately scalable, multimodal, and wireless optical devices are discussed. Significance: Overall, this review aims to present helpful guidance for the engineering and design of optical microsystems for optogenetic applications.
A unifying model of the role of the infralimbic cortex in extinction and habits
Taylor, Jane R.; Chandler, L. Judson
2014-01-01
The infralimbic prefrontal cortex (IL) has been shown to be critical for the regulation of flexible behavior, but its precise function remains unclear. This region has been shown to be critical for the acquisition, consolidation, and expression of extinction learning, leading many to hypothesize that IL suppresses behavior as part of a “stop” network. However, this framework is at odds with IL function in habitual behavior in which the IL has been shown to be required for the expression and acquisition of ongoing habitual behavior. Here, we will review the current state of knowledge of IL anatomy and function in behavioral flexibility and provide a testable framework for a single IL mechanism underlying its function in both extinction and habit learning. PMID:25128534
'Ethos' Enabling Organisational Knowledge Creation
NASA Astrophysics Data System (ADS)
Matsudaira, Yoshito
This paper examines knowledge creation in relation to improvements on the production line in the manufacturing department of Nissan Motor Company, and aims to clarify the embodied knowledge observed in the actions of organisational members who enable knowledge creation. For that purpose, this study adopts an approach that adds first-, second-, and third-person viewpoints to the theory of knowledge creation. The embodied knowledge observed in the actions of organisational members who enable knowledge creation is the continued practice of 'ethos' (in Greek), founded in the Nissan Production Way as an ethical basis. Ethos is an intangible knowledge asset for knowledge-creating companies. Substantiated analysis classifies ethos into three categories: the individual, the team, and the organisation. This indicates the precise actions of the organisational members in each category during the knowledge creation process. This research succeeds in showing the indispensability of ethos, a new concept of knowledge assets that enables knowledge creation, for future knowledge-based management in the knowledge society.
Precision Requirements for Space-based XCO2 Data
NASA Technical Reports Server (NTRS)
Miller, C. E.; Crisp, D.; DeCola, P. C.; Olsen, S. C.; Randerson, J. T.; Rayner, P.; Jacob, D.J.; Jones, D.; Suntharalingam, P.
2005-01-01
Precision requirements have been determined for the column-averaged CO2 dry air mole fraction (X(sub CO2)) data products to be delivered by the Orbiting Carbon Observatory (OCO). These requirements result from an assessment of the amplitude and spatial gradients of X(sub CO2), the relationship between X(sub CO2) precision and surface CO2 flux uncertainties calculated from inversions of the X(sub CO2) data, and the effects of X(sub CO2) biases on CO2 flux inversions. Observing system simulation experiments and synthesis inversion modeling demonstrate that the OCO mission design and sampling strategy provide the means to achieve the X(sub CO2) precision requirements. The impact of X(sub CO2) biases on CO2 flux uncertainties depends on their spatial and temporal extent, since CO2 sources and sinks are inferred from regional-scale X(sub CO2) gradients. Simulated OCO sampling of the TRACE-P CO2 fields shows the ability of X(sub CO2) data to constrain CO2 flux inversions over Asia and to distinguish regional fluxes from India and China.
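To see qualitatively how X(sub CO2) measurement precision propagates into flux uncertainty in a linear synthesis inversion, the toy sketch below evaluates the standard Bayesian posterior covariance (K^T Se^-1 K + Sa^-1)^-1. The Jacobian, prior, and noise levels are invented for illustration and bear no relation to the actual OCO system.

```python
# Toy illustration (not the OCO system): for a linear model
# xco2_obs = K @ fluxes + noise, the posterior flux covariance is
# (K^T Se^-1 K + Sa^-1)^-1, so flux uncertainty scales with XCO2 precision.
import numpy as np

rng = np.random.default_rng(0)
n_obs, n_regions = 200, 8
K = rng.uniform(0.0, 0.5, size=(n_obs, n_regions))   # invented Jacobian (ppm per flux unit)

def posterior_flux_sigma(xco2_sigma_ppm, prior_sigma=2.0):
    Se_inv = np.eye(n_obs) / xco2_sigma_ppm**2       # measurement precision
    Sa_inv = np.eye(n_regions) / prior_sigma**2      # prior flux uncertainty
    S_post = np.linalg.inv(K.T @ Se_inv @ K + Sa_inv)
    return np.sqrt(np.diag(S_post)).mean()

for sigma in (0.3, 1.0, 3.0):                        # candidate XCO2 precisions, ppm
    print(f"XCO2 sigma {sigma} ppm -> mean flux sigma {posterior_flux_sigma(sigma):.3f}")
```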
The Genomic Data Commons Launches
The NCI Genomic Data Commons is a next generation knowledge network that enables the access, analysis, and submission of cancer genomic data. The GDC facilitates data sharing and promotes precision medicine in oncology.
NASA Astrophysics Data System (ADS)
Uchill, Joseph H.; Assadi, Amir H.
2003-01-01
The advent of the internet has opened a host of new and exciting questions in the science and mathematics of information organization and data mining. In particular, a highly ambitious promise of the internet is to bring the bulk of human knowledge to everyone with access to a computer network, providing a democratic medium for sharing and communicating knowledge regardless of the language of the communication. The development of sharing and communication of knowledge via transfer of digital files is the first crucial achievement in this direction. Nonetheless, available solutions to numerous ancillary problems remain far from satisfactory. Among such outstanding problems are the first few fundamental questions that have been responsible for the emergence and rapid growth of the new field of Knowledge Engineering, namely, classification of forms of data, their effective organization, extraction of knowledge from massive distributed data sets, and the design of fast, effective search engines. The precision of machine learning algorithms in classification and recognition of image data (e.g., those scanned from books and other printed documents) is still far from human performance and speed in similar tasks. Discriminating the many forms of ASCII data from each other is not as difficult, in view of the emerging universal standards for file formats. Nonetheless, most past and relatively recent human knowledge has yet to be transformed and saved in such machine-readable formats. In particular, an outstanding problem in knowledge engineering is the organization and management, with precision comparable to human performance, of knowledge in the form of images of documents that broadly belong to either text, image, or a blend of both. It has been shown that the effectiveness of OCR is intertwined with the success of language and font recognition.
Phase estimation without a priori phase knowledge in the presence of loss
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kolodynski, Jan; Demkowicz-Dobrzanski, Rafal
2010-11-15
We find the optimal scheme for quantum phase estimation in the presence of loss when no a priori knowledge on the estimated phase is available. We prove analytically an explicit lower bound on estimation uncertainty, which shows that, as a function of the number of probes, quantum precision enhancement amounts at most to a constant factor improvement over classical strategies.
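For orientation, the asymptotic bound for lossy interferometric phase estimation is often written in the related literature as Delta-phi >= sqrt((1 - eta) / (eta * N)) for photon survival probability eta, i.e., shot-noise 1/sqrt(N) scaling improved only by a constant factor. The sketch below compares that assumed formula with the classical baseline; it is drawn from the wider literature, not quoted from this paper.

```python
# Illustrative comparison (assumed formulas, not taken from the paper):
# classical shot-noise scaling vs. an asymptotic lossy quantum bound.
import math

def shot_noise(n):                 # classical baseline, 1/sqrt(N)
    return 1.0 / math.sqrt(n)

def lossy_bound(n, eta):           # assumed asymptotic bound with loss
    return math.sqrt((1.0 - eta) / (eta * n))

for n in (10, 100, 1000):
    print(n, round(shot_noise(n), 4), round(lossy_bound(n, eta=0.8), 4))
```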
Quantum Metrology Assisted by Abstention
NASA Astrophysics Data System (ADS)
Gendra, B.; Ronco-Bonvehi, E.; Calsamiglia, J.; Muñoz-Tapia, R.; Bagan, E.
2013-03-01
The main goal of quantum metrology is to obtain accurate values of physical parameters using quantum probes. In this context, we show that abstention, i.e., the possibility of getting an inconclusive answer at readout, can drastically improve the measurement precision and even lead to a change in its asymptotic behavior, from the shot-noise to the Heisenberg scaling. We focus on phase estimation and quantify the required amount of abstention for a given precision. We also develop analytical tools to obtain the asymptotic behavior of the precision and required rate of abstention for arbitrary pure states.
Liu, Wen P; Azizian, Mahdi; Sorger, Jonathan; Taylor, Russell H; Reilly, Brian K; Cleary, Kevin; Preciado, Diego
2014-03-01
To our knowledge, this is the first reported cadaveric feasibility study of a master-slave-assisted cochlear implant procedure in the otolaryngology-head and neck surgery field using the da Vinci Si system (da Vinci Surgical System; Intuitive Surgical, Inc). We describe the surgical workflow adaptations using a minimally invasive system and image guidance integrating intraoperative cone beam computed tomography through augmented reality. The objective was to test the feasibility of da Vinci Si-assisted cochlear implant surgery with augmented reality, with visualization of critical structures and facilitation of a precise cochleostomy for electrode insertion. This was a cadaveric case study of bilateral cochlear implant approaches conducted at Intuitive Surgical Inc, Sunnyvale, California. Bilateral cadaveric mastoidectomies, posterior tympanostomies, and cochleostomies were performed using the da Vinci Si system on a single adult human donor cadaveric specimen. Outcomes were radiographic confirmation of successful cochleostomies, placement of a phantom cochlear implant wire, and visual confirmation of critical anatomic structures (facial nerve, cochlea, and round window) in augmented stereoendoscopy. With a mean surgical time of 160 minutes per side, complete bilateral cochlear implant procedures were successfully performed with no violation of critical structures, notably the facial nerve, chorda tympani, sigmoid sinus, dura, or ossicles. Augmented reality image overlay of the facial nerve, round window position, and basal turn of the cochlea was precise. Postoperative cone beam computed tomography scans confirmed successful placement of the phantom implant electrode array into the basal turn of the cochlea. To our knowledge, this is the first study in the otolaryngology-head and neck surgery literature examining the use of master-slave-assisted cochleostomy with augmented reality for cochlear implants using the da Vinci Si system. The described system for cochleostomy has the potential to improve the surgeon's confidence, as well as surgical safety, efficiency, and precision, by filtering tremor. The integration of augmented reality may be valuable for surgeons dealing with complex cases of congenital anatomic abnormality, for revision cochlear implants with distorted anatomy and poorly pneumatized mastoids, and as a method of interactive teaching. Further research into the cost-benefit ratio of da Vinci Si-assisted otologic surgery, as well as refinement of the proposed workflow, is required before considering clinical studies.
Ferguson, Lynnette R; De Caterina, Raffaele; Görman, Ulf; Allayee, Hooman; Kohlmeier, Martin; Prasad, Chandan; Choi, Myung Sook; Curi, Rui; de Luis, Daniel Antonio; Gil, Ángel; Kang, Jing X; Martin, Ron L; Milagro, Fermin I; Nicoletti, Carolina Ferreira; Nonino, Carla Barbosa; Ordovas, Jose Maria; Parslow, Virginia R; Portillo, María P; Santos, José Luis; Serhan, Charles N; Simopoulos, Artemis P; Velázquez-Arellano, Antonio; Zulet, Maria Angeles; Martinez, J Alfredo
2016-01-01
Diversity in the genetic profile between individuals and specific ethnic groups affects nutrient requirements, metabolism and response to nutritional and dietary interventions. Indeed, individuals respond differently to lifestyle interventions (diet, physical activity, smoking, etc.). The sequencing of the human genome and subsequent increased knowledge regarding human genetic variation is contributing to the emergence of personalized nutrition. These advances in genetic science are raising numerous questions regarding the ways in which precision nutrition can contribute solutions to emerging problems in public health, by reducing the risk and prevalence of nutrition-related diseases. Current views on personalized nutrition encompass omics technologies (nutrigenomics, transcriptomics, epigenomics, foodomics, metabolomics, metagenomics, etc.), functional food development and challenges related to legal and ethical aspects, application in clinical practice, and population scope, in terms of guidelines and epidemiological factors. In this context, precision nutrition can be considered as occurring at three levels: (1) conventional nutrition based on general guidelines for population groups by age, gender and social determinants; (2) individualized nutrition that adds phenotypic information about the person's current nutritional status (e.g. anthropometry, biochemical and metabolic analysis, physical activity, among others), and (3) genotype-directed nutrition based on rare or common gene variation. Research and appropriate translation into medical practice and dietary recommendations must be based on a solid foundation of knowledge derived from studies on nutrigenetics and nutrigenomics. A scientific society, such as the International Society of Nutrigenetics/Nutrigenomics (ISNN), internationally devoted to the study of nutrigenetics/nutrigenomics, can indeed serve the commendable roles of (1) promoting science and favoring scientific communication and (2) permanently working as a 'clearing house' to prevent disqualifying logical jumps, correct or stop unwarranted claims, and prevent the creation of unwarranted expectations in patients and in the general public. In this statement, we are focusing on the scientific aspects of disciplines covering nutrigenetics and nutrigenomics issues. Genetic screening and the ethical, legal, social and economic aspects will be dealt with in subsequent statements of the Society. © 2016 S. Karger AG, Basel.
-Omic and Electronic Health Record Big Data Analytics for Precision Medicine.
Wu, Po-Yen; Cheng, Chih-Wen; Kaddi, Chanchala D; Venugopalan, Janani; Hoffman, Ryan; Wang, May D
2017-02-01
Rapid advances of high-throughput technologies and wide adoption of electronic health records (EHRs) have led to fast accumulation of -omic and EHR data. These voluminous complex data contain abundant information for precision medicine, and big data analytics can extract such knowledge to improve the quality of healthcare. In this paper, we present -omic and EHR data characteristics, associated challenges, and data analytics including data preprocessing, mining, and modeling. To demonstrate how big data analytics enables precision medicine, we provide two case studies, including identifying disease biomarkers from multi-omic data and incorporating -omic information into EHRs. Big data analytics is able to address -omic and EHR data challenges for a paradigm shift toward precision medicine. Big data analytics makes sense of -omic and EHR data to improve healthcare outcomes, and it has a long-lasting societal impact.
High Precision Prediction of Functional Sites in Protein Structures
Buturovic, Ljubomir; Wong, Mike; Tang, Grace W.; Altman, Russ B.; Petkovic, Dragutin
2014-01-01
We address the problem of assigning biological function to solved protein structures. Computational tools play a critical role in identifying potential active sites and informing screening decisions for further lab analysis. A critical parameter in the practical application of computational methods is the precision, or positive predictive value. Precision measures the level of confidence the user should have in a particular computed functional assignment. Low precision annotations lead to futile laboratory investigations and waste scarce research resources. In this paper we describe an advanced version of the protein function annotation system FEATURE, which achieved 99% precision and average recall of 95% across 20 representative functional sites. The system uses a Support Vector Machine classifier operating on the microenvironment of physicochemical features around an amino acid. We also compared performance of our method with state-of-the-art sequence-level annotator Pfam in terms of precision, recall and localization. To our knowledge, no other functional site annotator has been rigorously evaluated against these key criteria. The software and predictive models are incorporated into the WebFEATURE service at http://feature.stanford.edu/wf4.0-beta. PMID:24632601
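The headline numbers translate directly into confusion-matrix arithmetic; here is a minimal sketch with invented counts chosen to reproduce them:

```python
# Minimal sketch of the headline metrics: precision (positive predictive
# value) and recall from a confusion matrix. Counts below are invented.
def precision_recall(tp: int, fp: int, fn: int):
    precision = tp / (tp + fp)    # fraction of predicted sites that are real
    recall = tp / (tp + fn)       # fraction of real sites that are found
    return precision, recall

p, r = precision_recall(tp=99, fp=1, fn=5)
print(f"precision = {p:.2f}, recall = {r:.2f}")   # 0.99, 0.95
```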
Gas Chromatograph Mass Spectrometer
NASA Technical Reports Server (NTRS)
Wey, Chowen
1995-01-01
A gas chromatograph/mass spectrometer (GC/MS) is used to measure and identify combustion species present in trace concentrations. This advanced extractive diagnostic method measures down to parts per billion (ppb) and differentiates between different types of hydrocarbons. It is applicable for petrochemical, waste incinerator, diesel transportation, and electric utility companies in accurately monitoring the types of hydrocarbon emissions generated by fuel combustion, in order to meet stricter environmental requirements. Other potential applications include manufacturing processes requiring precise detection of toxic gaseous chemicals, biomedical applications requiring precise identification of accumulative gaseous species, and gas utility operations requiring high-sensitivity leak detection.
Decision making under uncertainty: a quasimetric approach.
N'Guyen, Steve; Moulin-Frier, Clément; Droulez, Jacques
2013-01-01
We propose a new approach for solving a class of discrete decision making problems under uncertainty with positive cost. This issue concerns multiple and diverse fields such as engineering, economics, artificial intelligence, cognitive science and many others. Basically, an agent has to choose a single action or a series of actions from a set of options, without knowing their consequences for sure. Schematically, two main approaches have been followed: either the agent learns which option is the correct one to choose in a given situation by trial and error, or the agent already has some knowledge of the possible consequences of his decisions, this knowledge generally being expressed as a conditional probability distribution. In the latter case, several optimal or suboptimal methods have been proposed to exploit this uncertain knowledge in various contexts. In this work, we propose a different approach, based on the geometric intuition of distance. More precisely, we define a goal-independent quasimetric structure on the state space, taking into account both the cost function and the transition probability. We then compare precision and computation time with classical approaches.
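A minimal sketch of one way such a quasimetric can be realized: define d(x, y) as the minimal expected cumulative cost to reach y from x, and compute it by value iteration per goal state. The toy MDP below is invented, and the paper's actual construction may differ in detail.

```python
# Sketch: a goal-independent quasimetric d(x, y) = minimal expected cumulative
# cost to reach y from x, via value iteration on a tiny invented MDP.
import itertools

# transitions[state][action] = list of (next_state, probability); cost 1 per step
transitions = {
    0: {"a": [(1, 0.9), (0, 0.1)]},
    1: {"a": [(2, 0.8), (0, 0.2)]},
    2: {"a": [(2, 0.7), (0, 0.3)]},
}

def expected_cost_to(goal, n_iter=200):
    v = {s: 0.0 if s == goal else 1e9 for s in transitions}
    for _ in range(n_iter):
        for s in transitions:
            if s == goal:
                continue
            v[s] = min(1.0 + sum(p * v[ns] for ns, p in succ)
                       for succ in transitions[s].values())
    return v

d = {(x, y): expected_cost_to(y)[x] for x, y in itertools.product(transitions, repeat=2)}
# Asymmetric by construction: a quasimetric, not a metric.
print(round(d[(0, 2)], 3), round(d[(2, 0)], 3))   # e.g. 2.639 vs 3.333
```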
Estimating maneuvers for precise relative orbit determination using GPS
NASA Astrophysics Data System (ADS)
Allende-Alba, Gerardo; Montenbruck, Oliver; Ardaens, Jean-Sébastien; Wermuth, Martin; Hugentobler, Urs
2017-01-01
Precise relative orbit determination is an essential element for the generation of science products from distributed instrumentation of formation flying satellites in low Earth orbit. According to the mission profile, the required formation is typically maintained and/or controlled by executing maneuvers. In order to generate consistent and precise orbit products, a maneuver handling strategy is mandatory to avoid discontinuities or precision degradation before, during, and after maneuver execution. Precise orbit determination offers the possibility of maneuver estimation in an adjustment of single-satellite trajectories using GPS measurements. However, a consistent formulation of a precise relative orbit determination scheme requires the implementation of a maneuver estimation strategy which can be used, in addition, to improve the precision of maneuver estimates by drawing upon differential GPS measurements. The present study introduces a method for precise relative orbit determination based on a reduced-dynamic batch processing of differential GPS pseudorange and carrier phase measurements, which includes maneuver estimation as part of the relative orbit adjustment. The proposed method has been validated using flight data from space missions with different rates of maneuvering activity, including the GRACE, TanDEM-X and PRISMA missions. The results show the feasibility of obtaining precise relative orbits without degradation in the vicinity of maneuvers as well as improved maneuver estimates that can be used for better maneuver planning in flight dynamics operations.
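The estimation principle can be illustrated in one dimension: augment the batch least-squares state with an impulsive delta-v parameter and fit it jointly with the initial conditions from noisy position data. This is only a toy, not the reduced-dynamic differential-GPS processing described in the paper.

```python
# Toy sketch of maneuver estimation inside a batch least-squares fit:
# a 1-D free-drift trajectory with one impulsive delta-v at a known epoch,
# estimated jointly with the initial position/velocity. All values invented.
import numpy as np

rng = np.random.default_rng(1)
t = np.linspace(0.0, 100.0, 101)
t_m = 40.0                               # known maneuver epoch
x0, v0, dv = 5.0, 0.2, 0.05              # truth: position, velocity, delta-v

pos_true = x0 + v0 * t + dv * np.clip(t - t_m, 0.0, None)
obs = pos_true + rng.normal(0.0, 0.01, t.size)

# Design matrix: columns for x0, v0, and the post-maneuver velocity ramp
A = np.column_stack([np.ones_like(t), t, np.clip(t - t_m, 0.0, None)])
est, *_ = np.linalg.lstsq(A, obs, rcond=None)
print("x0, v0, dv estimates:", np.round(est, 4))
```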
Calculus domains modelled using an original Boolean algebra based on polygons
NASA Astrophysics Data System (ADS)
Oanta, E.; Panait, C.; Raicu, A.; Barhalescu, M.; Axinte, T.
2016-08-01
Analytical and numerical computer based models require analytical definitions of the calculus domains. The paper presents a method to model a calculus domain based on a Boolean algebra which uses solid and hollow polygons. The general calculus relations of the geometrical characteristics that are widely used in mechanical engineering are tested using several shapes of the calculus domain, in order to draw conclusions regarding the most effective methods to discretize the domain. The paper also tests the results of several commercial CAD software applications that are able to compute the geometrical characteristics, and interesting conclusions are drawn. The tests also targeted the accuracy of the results vs. the number of nodes on the curved boundary of the cross section. The study required the development of an original program consisting of more than 1700 lines of computer code. In comparison with other calculus methods, discretization using convex polygons is a simpler approach. Moreover, this method does not lead to very large numbers as the spline approximation did, which required special software packages offering arbitrary precision. The knowledge resulting from this study may be used to develop complex computer based models in engineering.
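A minimal sketch of the solid/hollow polygon idea: the shoelace formula yields signed areas, so a hole traversed clockwise subtracts automatically. The vertex lists are invented examples, not the paper's test cases.

```python
# Sketch: geometric characteristics from the shoelace formula, with hollow
# polygons contributing negative (clockwise) signed area.
def area_and_centroid(pts):
    """Signed area and centroid of a simple polygon (CCW positive)."""
    a = cx = cy = 0.0
    n = len(pts)
    for i in range(n):
        x0, y0 = pts[i]
        x1, y1 = pts[(i + 1) % n]
        cross = x0 * y1 - x1 * y0
        a += cross
        cx += (x0 + x1) * cross
        cy += (y0 + y1) * cross
    a *= 0.5
    return a, (cx / (6.0 * a), cy / (6.0 * a))

solid = [(0, 0), (4, 0), (4, 4), (0, 4)]   # CCW -> positive area (16)
hole = [(1, 1), (1, 3), (3, 3), (3, 1)]    # CW  -> negative area (-4)
total = area_and_centroid(solid)[0] + area_and_centroid(hole)[0]
print(total)   # 12
```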
Testing the Standard Model by precision measurement of the weak charges of quarks
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ross Young; Roger Carlini; Anthony Thomas
In a global analysis of the latest parity-violating electron scattering measurements on nuclear targets, we demonstrate a significant improvement in the experimental knowledge of the weak neutral-current lepton-quark interactions at low energy. The precision of this new result, combined with earlier atomic parity-violation measurements, limits the magnitude of possible contributions from physics beyond the Standard Model, setting a model-independent lower bound on the scale of new physics at ~1 TeV.
Testing the standard model by precision measurement of the weak charges of quarks.
Young, R D; Carlini, R D; Thomas, A W; Roche, J
2007-09-21
In a global analysis of the latest parity-violating electron scattering measurements on nuclear targets, we demonstrate a significant improvement in the experimental knowledge of the weak neutral-current lepton-quark interactions at low energy. The precision of this new result, combined with earlier atomic parity-violation measurements, places tight constraints on the size of possible contributions from physics beyond the standard model. Consequently, this result improves the lower bound on the scale of relevant new physics to approximately 1 TeV.
Knowledge-guided fuzzy logic modeling to infer cellular signaling networks from proteomic data
Liu, Hui; Zhang, Fan; Mishra, Shital Kumar; Zhou, Shuigeng; Zheng, Jie
2016-01-01
Modeling of signaling pathways is crucial for understanding and predicting cellular responses to drug treatments. However, canonical signaling pathways curated from literature are seldom context-specific and thus can hardly predict cell type-specific response to external perturbations; purely data-driven methods also have drawbacks such as limited biological interpretability. Therefore, hybrid methods that can integrate prior knowledge and real data for network inference are highly desirable. In this paper, we propose a knowledge-guided fuzzy logic network model to infer signaling pathways by exploiting both prior knowledge and time-series data. In particular, the dynamic time warping algorithm is employed to measure the goodness of fit between experimental and predicted data, so that our method can model temporally-ordered experimental observations. We evaluated the proposed method on a synthetic dataset and two real phosphoproteomic datasets. The experimental results demonstrate that our model can uncover drug-induced alterations in signaling pathways in cancer cells. Compared with existing hybrid models, our method can model feedback loops so that the dynamical mechanisms of signaling networks can be uncovered from time-series data. By calibrating generic models of signaling pathways against real data, our method supports precise predictions of context-specific anticancer drug effects, which is an important step towards precision medicine. PMID:27774993
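Since dynamic time warping is the goodness-of-fit measure named in the abstract, here is the generic DTW recurrence as a short sketch; it is the textbook algorithm, not necessarily the authors' exact implementation.

```python
# Minimal dynamic-time-warping sketch: alignment cost between a predicted and
# an observed time series that may be temporally warped relative to each other.
import numpy as np

def dtw_distance(a, b):
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]

predicted = np.sin(np.linspace(0, 3, 30))
observed = np.sin(np.linspace(0.2, 3.2, 35))   # shifted/resampled version
print(round(dtw_distance(predicted, observed), 4))
```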
The economic case for precision medicine.
Gavan, Sean P; Thompson, Alexander J; Payne, Katherine
2018-01-01
Introduction: The advancement of precision medicine into routine clinical practice has been highlighted as an agenda for national and international health care policy. A principal barrier to this advancement is in meeting the requirements of the payer or reimbursement agency for health care. This special report aims to explain the economic case for precision medicine, by accounting for the explicit objectives defined by decision-makers responsible for the allocation of limited health care resources. Areas covered: The framework of cost-effectiveness analysis, a method of economic evaluation, is used to describe how precision medicine can, in theory, exploit identifiable patient-level heterogeneity to improve population health outcomes and the relative cost-effectiveness of health care. Four case studies are used to illustrate potential challenges when demonstrating the economic case for a precision medicine in practice. Expert commentary: The economic case for a precision medicine should be considered at an early stage during its research and development phase. Clinical and economic evidence can be generated iteratively and should be in alignment with the objectives and requirements of decision-makers. Programmes of further research, to demonstrate the economic case of a precision medicine, can be prioritized by the extent that they reduce the uncertainty expressed by decision-makers.
Strategies for In situ and Sample Return Analyses
NASA Astrophysics Data System (ADS)
Papanastassiou, D. A.
2006-12-01
There is general agreement that planetary exploration proceeds from orbital reconnaissance of a planet, to surface and near-surface in situ exploration, to sample return missions, which bring back samples for investigation in terrestrial laboratories using the panoply of state-of-the-art analytical techniques. The applicable techniques may depend on the nature of the returned material, and complementary, multidisciplinary techniques can be used to best advantage. High precision techniques also serve to provide the "ground truth" and calibrate past and future orbital and in situ measurements on a planet. It is also recognized that returned samples may continue to be analyzed by novel techniques as the techniques become developed, in part to address specific characteristics of returned samples. There are geophysical measurements, such as those of the moment of inertia of a planet, seismic activity, and surface morphology, that depend on orbital and in situ science. Other characteristics, such as isotopic ages and isotopic compositions (e.g., initial Sr and Nd) as indicators of planetary mantle or crust evolution and sample provenance, require returned samples. In situ analyses may be useful for preliminary characterization and for optimization of sample selection for sample return. In situ analyses by Surveyor on the Moon helped identify the major element chemistry of lunar samples and the need for high precision mass spectrometry (e.g., for Rb-Sr ages, based on extremely low alkali contents). The discussion of in situ investigations vs. investigations on returned samples must be directly related to available instrumentation and to instrumentation that can be developed in the foreseeable future. The discussion of choices is not a philosophical issue but a very practical one: what precision is required for key investigations, and what instrumentation meets or exceeds the required precision? This must be applied to potential in situ instruments and to laboratory instruments. Age determinations and the use of isotopes for deciphering planetary evolution are viewed as off-limits for in situ determination, as they require: a) typically, high precision mass spectrometry (at 0.01% and below); b) the determination of parent-daughter element ratios at least at the percent level; c) the measurement of coexisting minerals (for internal isochron determinations); d) low contamination (e.g., for U-Pb and Pb-Pb); and e) removal of adhering phases and contaminants not related to the samples to be analyzed. Total K-Ar age determinations are subject to fewer requirements and may be feasible in situ, but in the absence of neutron activation, as required for 39Ar-40Ar, the expected precision is at the level of ~20%, with trapped Ar in the samples introducing further uncertainty. Precision of 20% for K-Ar may suffice to address some key cratering rate uncertainties on Mars, especially as applicable to the Middle Amazonian (1). For in situ work, the key issues, which must be addressed for all measurements, are: what precision is required, and are instruments available at the required precision levels? These issues must be addressed many years before a mission gets defined. Low precision instruments on several in situ missions that do not address key scientific questions may, in their sum, be more expensive than a sample return mission.
In summary, all missions should undergo similar intense scrutiny with regard to desired science and feasibility, based on available instrumentation (with demonstrated and known capabilities) and cost. 1. P. T. Doran et al. (2004) Earth Sci. Rev. 67, 313-337.
Precise attitude control of the Stanford relativity satellite.
NASA Technical Reports Server (NTRS)
Bull, J. S.; Debra, D. B.
1973-01-01
A satellite being designed by Stanford University to measure (with extremely high precision) the effect of General Relativity is described. Specifically, the satellite will measure two relativistic precessions predicted by the theory: the geodetic effect (6.9 arcsec/yr), due solely to motion about the earth, and the motional effect (0.05 arcsec/yr), due to rotation of the earth. The gyro design requirements, including the requirement for precise attitude control and a dynamic model for attitude control synthesis, are discussed. Closed-loop simulation of the satellite's natural dynamics on an analog computer is described.
Panel 3: Genetics and Precision Medicine of Otitis Media.
Lin, Jizhen; Hafrén, Hena; Kerschner, Joseph; Li, Jian-Dong; Brown, Steve; Zheng, Qing Y; Preciado, Diego; Nakamura, Yoshihisa; Huang, Qiuhong; Zhang, Yan
2017-04-01
Objective The objective is to perform a comprehensive review of the literature up to 2015 on the genetics and precision medicine relevant to otitis media. Data Sources PubMed database of the National Library of Medicine. Review Methods Two subpanels were formed comprising experts in the genetics and precision medicine of otitis media. Each of the panels reviewed the literature in their respective fields and wrote draft reviews. The reviews were shared with all panel members, and a merged draft was created. The entire panel met at the 18th International Symposium on Recent Advances in Otitis Media in June 2015, discussed the review, and refined the content. A final draft was made, circulated, and approved by the panel members. Conclusion Many genes relevant to otitis media have been identified in the last 4 years, advancing our knowledge regarding the predisposition of the middle ear mucosa to commensals and pathogens. Advances include mutant animal models and clinical studies. Many signaling pathways are involved in the predisposition to otitis media. Implications for Practice New knowledge on the genetic background relevant to otitis media forms the basis of novel potential interventions, including potential new ways to treat otitis media.
NASA Astrophysics Data System (ADS)
Liu, Benjamin M.; Abebe, Yitayew; McHugh, Oloro V.; Collick, Amy S.; Gebrekidan, Brhane; Steenhuis, Tammo S.
This study highlights two highly degraded watersheds in the semi-arid Amhara region of Ethiopia where integrated water resource management activities were carried out to decrease dependence on food aid through improved management of ‘green’ water. While top-down approaches require precise and centrally available knowledge to deal with the uncertainty in engineering design of watershed management projects, bottom-up approaches can succeed without such information by making extensive use of stakeholder knowledge. This approach works best in conjunction with the development of leadership confidence within local communities. These communities typically face a number of problems, most notably poverty, that prevent them from fully investing in the protection of their natural resources, so an integrated management system is needed to suitably address the interrelated problems. Many different implementing agencies were brought together in the two study watersheds to address water scarcity, crop production, and soil erosion, but the cornerstone was enabling local potential through the creation and strengthening of community watershed management organizations. Leadership training and the reinforcement of stakeholder feedback as a fundamental activity led to increased ownership and willingness to take on new responsibilities. A series of small short term successes ranging from micro-enterprise cooperatives to gully rehabilitation have resulted in the pilot communities becoming confident of their own capabilities and proud to share their successes and knowledge with other communities struggling with natural resource degradation.
Fini, M. Elizabeth; Schwartz, Stephen G.; Gao, Xiaoyi; Jeong, Shinwu; Patel, Nitin; Itakura, Tatsuo; Price, Marianne O.; Price, Francis W.; Varma, Rohit; Stamer, W. Daniel
2016-01-01
Elevation of intraocular pressure (IOP) due to therapeutic use of glucocorticoids is called steroid-induced ocular hypertension (SIOH); this can lead to steroid-induced glaucoma (SIG). Glucocorticoids initiate signaling cascades ultimately affecting the expression of hundreds of genes; this provides the potential for a highly personalized pharmacological response. Studies attempting to define genetic risk factors were undertaken early in the history of glucocorticoid use; however, the scientific tools available at that time were limited and progress stalled. In contrast, significant advances were made over the ensuing years in defining disease pathophysiology. As the genomics age emerged, it appeared the time was right to renew investigation into genetics. Pharmacogenomics is an unbiased discovery approach, not requiring an underlying hypothesis, and provides a way to pinpoint clinically significant genes and pathways that could not have been discovered any other way. Results of the first genome-wide association study to identify polymorphisms associated with SIOH, and follow-up on two novel genes linked to the disorder, GPR158 and HCG22, are discussed in the second half of the article. However, knowledge of the genetic variants determining response to steroids in the eye also has value in its own right as a predictive and diagnostic tool. This article concludes with a discussion of how the Precision Medicine Initiative®, announced by U.S. President Obama in his 2015 State of the Union address, is beginning to touch the practice of ophthalmology. It is argued that SIOH/SIG may provide one of the next opportunities for effective application of precision medicine. PMID:27666015
Science 101: How Do Atomic Clocks Work?
ERIC Educational Resources Information Center
Science and Children, 2008
2008-01-01
You might be wondering why in the world we need such precise measures of time. Well, many systems we use everyday, such as Global Positioning Systems, require precise synchronization of time. This comes into play in telecommunications and wireless communications, also. For purely scientific reasons, we can use precise measurement of time to…
Peer Assessment with Online Tools to Improve Student Modeling
ERIC Educational Resources Information Center
Atkins, Leslie J.
2012-01-01
Introductory physics courses often require students to develop precise models of phenomena and represent these with diagrams, including free-body diagrams, light-ray diagrams, and maps of field lines. Instructors expect that students will adopt a certain rigor and precision when constructing these diagrams, but we want that rigor and precision to…
NASA Technical Reports Server (NTRS)
Agnes, Gregory S.; Waldman, Jeff; Hughes, Richard; Peterson, Lee D.
2015-01-01
NASA's proposed Surface Water Ocean Topography (SWOT) mission, scheduled to launch in 2020, would provide critical information about Earth's oceans, ocean circulation, fresh water storage, and river discharge. The mission concept calls for a dual-antenna Ka-band radar interferometer instrument, known as KaRIn, that would map the height of water globally along two 50 km wide swaths. The KaRIn antennas, which would be separated by 10 meters on either side of the spacecraft, would need to be precisely deployable in order to meet demanding pointing requirements. Consequently, an effort was undertaken to design, build, and prototype a precision deployable mast for the KaRIn instrument. Each mast was 4.5 m long with a required dilatation stability of 2.5 microns over 3 minutes and a required minimum first mode of 7 Hz. Deployment repeatability had to be better than +/- 7 arcsec in all three rotation directions. Overall mass could not exceed 41.5 kg including any actuators and thermal blanketing. This set of requirements meant the boom had to be three times lighter and two orders of magnitude more precise than the existing state of the art for deployable booms.
Analytical difficulties facing today's regulatory laboratories: issues in method validation.
MacNeil, James D
2012-08-01
The challenges facing analytical laboratories today are not unlike those faced in the past, although both the degree of complexity and the rate of change have increased. Challenges such as development and maintenance of expertise, maintenance and updating of equipment, and the introduction of new test methods have always been familiar themes for analytical laboratories, but international guidelines for laboratories involved in the import and export testing of food require management of such changes in a context which includes quality assurance, accreditation, and method validation considerations. Decisions as to when a change in a method requires re-validation of the method, or on the design of a validation scheme for a complex multi-residue method, require a well-considered strategy, based on current knowledge of international guidance documents and regulatory requirements, as well as the laboratory's quality system requirements. Validation demonstrates that a method is 'fit for purpose', so the requirement for validation should be assessed in terms of the intended use of a method and, in the case of change or modification of a method, whether that change or modification may affect a previously validated performance characteristic. In general, method validation involves method scope, calibration-related parameters, method precision, and recovery. Any method change which may affect method scope or any performance parameters will require re-validation. Some typical situations involving change in methods are discussed and a decision process proposed for selection of appropriate validation measures. © 2012 John Wiley & Sons, Ltd.
NASA Technical Reports Server (NTRS)
Smith, R. L.; Lyubomirsky, A. S.
1981-01-01
Two techniques were analyzed. The first is a representation using Chebyshev expansions in three-dimensional cells. The second technique employs a temporary file for storing the components of the nonspherical gravity force. Computer storage requirements and relative CPU time requirements are presented. The Chebyshev gravity representation can provide a significant reduction in CPU time in precision orbit calculations, but at the cost of a large amount of direct-access storage space, which is required for a global model.
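A minimal sketch of how the first technique's evaluation step might look: a per-cell 3-D Chebyshev expansion evaluated with NumPy. The coefficients are random stand-ins; a real model would fit them to the nonspherical gravity force within each cell.

```python
# Sketch: evaluating a per-cell 3-D Chebyshev expansion of one gravity-force
# component. Coefficients are invented stand-ins for a fitted gravity model.
import numpy as np
from numpy.polynomial import chebyshev as C

rng = np.random.default_rng(2)
deg = 6
coeffs = rng.normal(size=(deg + 1, deg + 1, deg + 1))   # one cell's coefficients

def force_component(x, y, z):
    """Evaluate the expansion at points scaled to the cell's [-1, 1]^3 cube."""
    return C.chebval3d(x, y, z, coeffs)

print(force_component(0.1, -0.5, 0.9))
```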
Analysis of generic reentry vehicle flight dynamics
NASA Astrophysics Data System (ADS)
Metsker, Yu.; Weinand, K.; Geulen, G.; Haidn, O. J.
2018-06-01
The knowledge of reentry vehicle (RV) flight characteristics regarding geometrical shape, dimensions, and mechanical properties is essential for precise prediction of their flight trajectory, impact point, and possible deviations arising from simulation uncertainties. Estimating the flight characteristics of existing RVs requires both the body dimensions and the mechanical properties of the objects. While the outer dimensions of a vehicle can be specified by comparatively simple and reliable means, e.g., photo and video material, the estimation of mechanical properties is subject to higher uncertainties. Within this study, a generic medium range ballistic missile (MRBM) RV was examined for several modifications of center of gravity (CoG) position, weight, moment of inertia, and initial reentry flight states. Combinations of these variables with constant aerodynamic properties yielding maximal lateral accelerations are determined. Based on these, potential evasion maneuver capabilities of the RV are described.
A combined microphone and camera calibration technique with application to acoustic imaging.
Legg, Mathew; Bradley, Stuart
2013-10-01
We present a calibration technique for an acoustic imaging microphone array, combined with a digital camera. Computer vision and acoustic time of arrival data are used to obtain microphone coordinates in the camera reference frame. Our new method allows acoustic maps to be plotted onto the camera images without the need for additional camera alignment or calibration. Microphones and cameras may be placed in an ad-hoc arrangement and, after calibration, the coordinates of the microphones are known in the reference frame of a camera in the array. No prior knowledge of microphone positions, inter-microphone spacings, or air temperature is required. This technique is applied to a spherical microphone array and a mean difference of 3 mm was obtained between the coordinates obtained with this calibration technique and those measured using a precision mechanical method.
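The time-of-arrival part of the calibration can be illustrated with a toy nonlinear least-squares fit for a single microphone from known source positions; the geometry and noise are invented, and the paper additionally folds in computer-vision constraints and treats temperature as unknown.

```python
# Toy sketch: recover one microphone's coordinates from acoustic time-of-
# arrival at known source positions, via nonlinear least squares.
import numpy as np
from scipy.optimize import least_squares

C_SOUND = 343.0                                    # m/s (assumed known here)
rng = np.random.default_rng(3)
sources = rng.uniform(-1.0, 1.0, size=(12, 3))     # known event positions
mic_true = np.array([0.3, -0.2, 0.5])

toa = np.linalg.norm(sources - mic_true, axis=1) / C_SOUND
toa += rng.normal(0.0, 1e-5, toa.size)             # a few mm equivalent noise

def residuals(mic):
    return np.linalg.norm(sources - mic, axis=1) / C_SOUND - toa

fit = least_squares(residuals, x0=np.zeros(3))
print("recovered mic position:", np.round(fit.x, 4))
```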
Liu, Kaijun; Fang, Binji; Wu, Yi; Li, Ying; Jin, Jun; Tan, Liwen; Zhang, Shaoxiang
2013-09-01
Anatomical knowledge of the larynx region is critical for understanding laryngeal disease and performing required interventions. Virtual reality is a useful method for surgical education and simulation. Here, we assembled segmented cross-section slices of the larynx region from the Chinese Visible Human dataset. The laryngeal structures were precisely segmented manually as 2D images, then reconstructed and displayed as 3D images in the virtual reality Dextrobeam system. Using visualization of and interaction with the virtual reality modeling language model, a digital laryngeal anatomy course was constructed using HTML and JavaScript. The volumetric larynx models can display an arbitrary section of the model and provide a virtual dissection function. This networked teaching system for digital laryngeal anatomy can be read remotely, displayed locally, and manipulated interactively.
Spatio-Temporal Patterning in Primary Motor Cortex at Movement Onset.
Best, Matthew D; Suminski, Aaron J; Takahashi, Kazutaka; Brown, Kevin A; Hatsopoulos, Nicholas G
2017-02-01
Voluntary movement initiation involves the engagement of large populations of motor cortical neurons around movement onset. Despite knowledge of the temporal dynamics that lead to movement, the spatial structure of these dynamics across the cortical surface remains unknown. In data from 4 rhesus macaques, we show that the timing of attenuation of beta frequency local field potential oscillations, a correlate of locally activated cortex, forms a spatial gradient across primary motor cortex (MI). We show that these spatio-temporal dynamics are recapitulated in the engagement order of ensembles of MI neurons. We demonstrate that these patterns are unique to movement onset and suggest that movement initiation requires a precise spatio-temporal sequential activation of neurons in MI. © The Author 2016. Published by Oxford University Press. All rights reserved.
Ronaldson, Patrick T; Davis, Thomas P
2012-01-01
The blood–brain barrier (BBB) is the most significant obstacle to effective CNS drug delivery. It possesses structural and biochemical features (i.e., tight-junction protein complexes and influx and efflux transporters) that restrict xenobiotic permeation. Pathophysiological stressors (i.e., peripheral inflammatory pain) can alter BBB tight junctions and transporters, which leads to drug-permeation changes. This is especially critical for opioids, which require precise CNS concentrations to be safe and effective analgesics. Recent studies have identified molecular targets (i.e., endogenous transporters and intracellular signaling systems) that can be exploited for optimization of CNS drug delivery. This article summarizes current knowledge in this area and emphasizes those targets that present the greatest opportunity for controlling drug permeation and/or drug transport across the BBB in an effort to achieve optimal CNS opioid delivery. PMID:22468221
Applications of the Trojan Horse method in nuclear astrophysics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Spitaleri, Claudio, E-mail: spitaleri@lns.infn.it
2015-02-24
The study of energy production in stars and the related nucleosynthesis processes requires increasingly precise knowledge of nuclear reaction cross sections and reaction rates at the relevant interaction energies. In order to overcome the experimental difficulties arising from the small cross sections involved in charged-particle-induced reactions at astrophysical energies, and from the presence of electron screening, it was necessary to introduce indirect methods. Through these methods it is possible to measure cross sections at very small energies and to retrieve information on the electron screening effect when ultra-low-energy direct measurements are available. The Trojan Horse Method (THM) is an indirect technique to determine the bare-nucleus astrophysical S-factor for reactions between charged particles at astrophysical energies. The basic theory of the THM is discussed for the non-resonant case.
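For context, the standard textbook definitions at play: the astrophysical S-factor factors the Coulomb-barrier penetration out of the cross section, sigma(E) = S(E) exp(-2*pi*eta) / E, with Sommerfeld parameter eta = Z1*Z2*alpha*sqrt(mu*c^2 / (2E)). The sketch below implements these generic formulas with illustrative inputs; none of the numbers are from the paper.

```python
# Textbook definitions: Sommerfeld parameter and astrophysical S-factor.
import math

ALPHA = 1.0 / 137.036          # fine-structure constant

def sommerfeld_eta(z1, z2, mu_c2_mev, e_cm_mev):
    """eta = Z1*Z2*alpha*(c/v) with v from the c.m. energy E = mu*v^2/2."""
    return z1 * z2 * ALPHA * math.sqrt(mu_c2_mev / (2.0 * e_cm_mev))

def s_factor(sigma_barn, z1, z2, mu_c2_mev, e_cm_mev):
    """S(E) in MeV*barn from a cross section in barn at c.m. energy E (MeV)."""
    eta = sommerfeld_eta(z1, z2, mu_c2_mev, e_cm_mev)
    return sigma_barn * e_cm_mev * math.exp(2.0 * math.pi * eta)

# Illustrative: a p + d-like system (Z1 = Z2 = 1, reduced mass ~625 MeV) at 50 keV
print(s_factor(1e-9, 1, 1, 625.0, 0.05))
```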
Simulation evaluation of TIMER, a time-based, terminal air traffic, flow-management concept
NASA Technical Reports Server (NTRS)
Credeur, Leonard; Capron, William R.
1989-01-01
A description of a time-based, extended terminal area ATC concept called Traffic Intelligence for the Management of Efficient Runway scheduling (TIMER) and the results of a fast-time evaluation are presented. The TIMER concept is intended to bridge the gap between today's ATC system and a future automated time-based ATC system. It integrates en route metering, fuel-efficient cruise and profile descents, and terminal time-based sequencing and spacing, together with computer-generated controller aids, to improve delivery precision for fuller use of runway capacity. Simulation results identify and show the effects and interactions of key variables such as horizon-of-control location, delivery time error at both the metering fix and the runway threshold, aircraft separation requirements, delay discounting, wind, aircraft heading and speed errors, and knowledge of final approach speed.
A hierarchical Bayesian method for vibration-based time domain force reconstruction problems
NASA Astrophysics Data System (ADS)
Li, Qiaofeng; Lu, Qiuhai
2018-05-01
Traditional force reconstruction techniques require prior knowledge on the force nature to determine the regularization term. When such information is unavailable, the inappropriate term is easily chosen and the reconstruction result becomes unsatisfactory. In this paper, we propose a novel method to automatically determine the appropriate q as in ℓq regularization and reconstruct the force history. The method incorporates all to-be-determined variables such as the force history, precision parameters and q into a hierarchical Bayesian formulation. The posterior distributions of variables are evaluated by a Metropolis-within-Gibbs sampler. The point estimates of variables and their uncertainties are given. Simulations of a cantilever beam and a space truss under various loading conditions validate the proposed method in providing adaptive determination of q and better reconstruction performance than existing Bayesian methods.
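As a rough illustration of the approach (not the authors' implementation), the sketch below runs a Metropolis-within-Gibbs sampler on a toy linear system: the noise precision gets a conjugate Gibbs update, while the force history and the exponent q are updated with random-walk Metropolis steps. The prior scale lam is held fixed for brevity; all problem sizes and values are assumptions.

    import numpy as np

    rng = np.random.default_rng(0)

    # Toy linear system y = H f + noise; H stands in for the structure's
    # impulse-response matrix and f for the discretized force history.
    n, m = 40, 25
    H = rng.standard_normal((n, m))
    f_true = np.zeros(m); f_true[[5, 12]] = [1.0, -0.7]
    y = H @ f_true + 0.05 * rng.standard_normal(n)

    lam = 5.0  # scale of the l_q prior, held fixed here for brevity

    def log_post(f, q, beta):
        resid = y - H @ f
        return (0.5 * n * np.log(beta) - 0.5 * beta * resid @ resid
                - lam * np.sum(np.abs(f) ** q))

    f, q, beta = np.zeros(m), 2.0, 1.0
    q_draws = []
    for it in range(6000):
        # Gibbs step: the noise precision has a conjugate Gamma conditional
        resid = y - H @ f
        beta = rng.gamma(1.0 + 0.5 * n, 1.0 / (1.0 + 0.5 * resid @ resid))
        # Metropolis step on the force history (random-walk proposal)
        f_prop = f + 0.05 * rng.standard_normal(m)
        if np.log(rng.uniform()) < log_post(f_prop, q, beta) - log_post(f, q, beta):
            f = f_prop
        # Metropolis step on the norm exponent q, kept inside (0.5, 2]
        q_prop = float(np.clip(q + 0.1 * rng.standard_normal(), 0.5, 2.0))
        if np.log(rng.uniform()) < log_post(f, q_prop, beta) - log_post(f, q, beta):
            q = q_prop
        if it >= 2000:
            q_draws.append(q)

    print("posterior mean of q:", round(float(np.mean(q_draws)), 2))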
Attaining the Photometric Precision Required by Future Dark Energy Projects
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stubbs, Christopher
2013-01-21
This report outlines our progress towards achieving the high-precision astronomical measurements needed to derive improved constraints on the nature of the Dark Energy. Our approach to obtaining higher precision flux measurements has three basic components: 1) determination of the optical transmission of the atmosphere, 2) mapping out the instrumental photon sensitivity function vs. wavelength, calibrated by referencing the measurements to the known sensitivity curve of a high-precision silicon photodiode, and 3) using the self-consistency of the spectra of stars to achieve precise color calibrations.
Hout, Michael C; Goldinger, Stephen D
2015-01-01
When people look for things in the environment, they use target templates (mental representations of the objects they are attempting to locate) to guide attention and to assess incoming visual input as potential targets. However, unlike laboratory participants, searchers in the real world rarely have perfect knowledge regarding the potential appearance of targets. In seven experiments, we examined how the precision of target templates affects the ability to conduct visual search. Specifically, we degraded template precision in two ways: 1) by contaminating searchers' templates with inaccurate features, and 2) by introducing extraneous features to the template that were unhelpful. We recorded eye movements to allow inferences regarding the relative extents to which attentional guidance and decision-making are hindered by template imprecision. Our findings support a dual-function theory of the target template and highlight the importance of examining template precision in visual search.
-Omic and Electronic Health Records Big Data Analytics for Precision Medicine
Wu, Po-Yen; Cheng, Chih-Wen; Kaddi, Chanchala D.; Venugopalan, Janani; Hoffman, Ryan; Wang, May D.
2017-01-01
Objective: Rapid advances of high-throughput technologies and wide adoption of electronic health records (EHRs) have led to fast accumulation of -omic and EHR data. These voluminous complex data contain abundant information for precision medicine, and big data analytics can extract such knowledge to improve the quality of health care. Methods: In this article, we present -omic and EHR data characteristics, associated challenges, and data analytics including data pre-processing, mining, and modeling. Results: To demonstrate how big data analytics enables precision medicine, we provide two case studies, including identifying disease biomarkers from multi-omic data and incorporating -omic information into EHR. Conclusion: Big data analytics is able to address -omic and EHR data challenges for a paradigm shift towards precision medicine. Significance: Big data analytics makes sense of -omic and EHR data to improve healthcare outcomes, and it has a long-lasting societal impact. PMID:27740470
Commissioning Procedures for Mechanical Precision and Accuracy in a Dedicated LINAC
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ballesteros-Zebadua, P.; Larrga-Gutierrez, J. M.; Garcia-Garduno, O. A.
2008-08-11
Mechanical precision measurements are fundamental procedures for the commissioning of a dedicated LINAC. At our Radioneurosurgery Unit, these procedures can serve as quality assurance routines that allow verification of the equipment's geometrical accuracy and precision. In this work mechanical tests were performed for gantry and table rotation, obtaining mean associated uncertainties of 0.3 mm and 0.71 mm, respectively. Using an anthropomorphic phantom and a series of localized surface markers, the isocenter accuracy was shown to be smaller than 0.86 mm for radiosurgery procedures and 0.95 mm for fractionated treatments with mask. All uncertainties were below tolerances. The highest contribution to mechanical variations is due to table rotation, so it is important to correct variations using a localization frame with printed overlays. Knowledge of the mechanical precision allows the statistical errors to be considered in the treatment planning volume margins.
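As a simple illustration of how such terms can be combined (assuming independent contributions; the commissioning analysis itself may combine terms differently):

    import math

    def combined_uncertainty_mm(*components_mm):
        # Root-sum-square of independent 1-sigma mechanical terms
        return math.sqrt(sum(u * u for u in components_mm))

    # Gantry and table rotation uncertainties quoted above
    print(f"{combined_uncertainty_mm(0.3, 0.71):.2f} mm")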
Observing System Simulations for ASCENDS: Synthesizing Science Measurement Requirements (Invited)
NASA Astrophysics Data System (ADS)
Kawa, S. R.; Baker, D. F.; Schuh, A. E.; Crowell, S.; Rayner, P. J.; Hammerling, D.; Michalak, A. M.; Wang, J. S.; Eluszkiewicz, J.; Ott, L.; Zaccheo, T.; Abshire, J. B.; Browell, E. V.; Moore, B.; Crisp, D.
2013-12-01
The measurement of atmospheric CO2 from space using active (lidar) sensing techniques has several potentially significant advantages in comparison to current and planned passive CO2 instruments. Application of this new technology aims to advance CO2 measurement capability and carbon cycle science into the next decade. The NASA Active Sensing of Carbon Emissions, Nights, Days, and Seasons (ASCENDS) mission has been recommended by the US National Academy of Sciences Decadal Survey for the next generation of space-based CO2 observing systems. ASCENDS is currently planned for launch in 2022. Several possible lidar instrument approaches have been demonstrated in airborne campaigns and the results indicate that such sensors are quite feasible. Studies are now underway to evaluate performance requirements for space mission implementation. Satellite CO2 observations must be highly precise and unbiased in order to accurately infer global carbon source/sink fluxes. Measurement demands are likely to further increase in the wake of GOSAT, OCO-2, and enhanced ground-based in situ and remote sensing CO2 data. The objective of our work is to quantitatively and consistently evaluate the measurement capabilities and requirements for ASCENDS in the context of advancing our knowledge of carbon flux distributions and their dependence on underlying physical processes. Considerations include requirements for precision, relative accuracy, spatial/temporal coverage and resolution, vertical information content, interferences, and possibly the tradeoffs among these parameters, while at the same time framing a mission that can be implemented within a constrained budget. Here, we attempt to synthesize the results of observing system simulation studies, commissioned by the ASCENDS Science Requirements Definition Team, into a coherent set of mission performance guidelines. A variety of forward and inverse model frameworks are employed to reduce the potential dependence of the results on model specifics. Sensitivity to key instrument design variables is explored and quantified. Global random error measurement scenarios show significant improvement in resolving CO2 fluxes and reducing uncertainties for expected lidar instrument error levels. The improvement beyond that expected for OCO-2 with random errors only, however, is limited for regions where passive sampling is not limited by lack of sunlight or heavy cloud cover. Simulations including prospective systematic (bias) errors, which are expected to be smaller for the lidar system, provide guidance for instrument design requirements as well as reinforcing the priority for a comprehensive calibration/validation component to the mission. The necessity of including coincident lidar measurements of the O2 column, in order to normalize the CO2 column to dry air mole fraction, will also be discussed. The results indicate that within reasonable technological assumptions for the system performance, high measurement quality and quantity can be obtained that will fulfill the nominal ASCENDS objectives and provide substantial improvement in our knowledge of global carbon cycle processes.
NASA Astrophysics Data System (ADS)
Haagmans, G. G.; Verhagen, S.; Voûte, R. L.; Verbree, E.
2017-09-01
Since GPS tends to fail for indoor positioning purposes, alternative methods such as indoor positioning systems (IPS) based on Bluetooth low energy (BLE) are developing rapidly. Generally, IPS are deployed in environments full of obstacles such as furniture, walls, people, and electronics that influence signal propagation. The major factor influencing system performance, and the key to optimal positioning results, is the geometry of the beacons. The geometry is limited by the infrastructure that can be deployed (number of beacons, base stations, and tags), which leads to the following challenge: given a limited number of beacons, where should they be placed in a specified indoor environment such that the geometry contributes to optimal positioning results? This paper proposes a statistical model that selects the optimal configuration satisfying the user requirements in terms of precision. The model requires the definition of a chosen 3D space (in our case 7 × 10 × 6 m), the number of beacons, possible user-tag locations, and a performance threshold (e.g., required precision). For any given set of beacon and receiver locations, the precision and the internal and external reliability can be determined in advance. As validation, the modeled precision has been compared with observed precision results, measured with a BlooLoc IPS at a chosen set of user-tag locations for a given geometric configuration. Ultimately, the model is able to select the optimal geometric configuration out of millions of possible configurations based on a performance threshold (e.g., required precision).
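A minimal sketch of how precision can be predicted from geometry alone, assuming a plain range-error model (RSSI-specific biases ignored): the formal covariance of a least-squares fix is σ²(AᵀA)⁻¹, with the rows of A being unit line-of-sight vectors. The room dimensions follow the abstract, but the beacon layouts and the 1 m ranging noise are assumptions.

    import numpy as np

    def position_precision(beacons, tag, sigma_range=1.0):
        """Formal standard deviations (x, y, z) of a least-squares range fix.

        beacons     : (k, 3) array of candidate beacon positions [m]
        tag         : (3,) user-tag position [m]
        sigma_range : assumed ranging noise, 1-sigma [m]
        """
        diff = beacons - tag
        A = diff / np.linalg.norm(diff, axis=1, keepdims=True)  # unit line-of-sight rows
        Q = sigma_range ** 2 * np.linalg.inv(A.T @ A)           # covariance of the fix
        return np.sqrt(np.diag(Q))

    # Illustrative 7 x 10 x 6 m room: compare two 4-beacon configurations
    corners_floor = np.array([[0, 0, 0], [7, 0, 0], [0, 10, 0], [7, 10, 0]], float)
    spread_3d     = np.array([[0, 0, 0], [7, 0, 6], [0, 10, 6], [7, 10, 0]], float)
    tag = np.array([3.5, 5.0, 1.5])
    print("floor corners:", position_precision(corners_floor, tag))
    print("3-D spread   :", position_precision(spread_3d, tag))

On this toy geometry the all-on-the-floor layout gives a much larger vertical standard deviation than the height-diverse layout, which is precisely the kind of geometry effect the model is meant to rank.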
Trattner, Sigal; Cheng, Bin; Pieniazek, Radoslaw L.; Hoffmann, Udo; Douglas, Pamela S.; Einstein, Andrew J.
2014-01-01
Purpose: Effective dose (ED) is a widely used metric for comparing ionizing radiation burden between different imaging modalities, scanners, and scan protocols. In computed tomography (CT), ED can be estimated by performing scans on an anthropomorphic phantom in which metal-oxide-semiconductor field-effect transistor (MOSFET) solid-state dosimeters have been placed to enable organ dose measurements. Here a statistical framework is established to determine the sample size (number of scans) needed for estimating ED to a desired precision and confidence, for a particular scanner and scan protocol, subject to practical limitations. Methods: The statistical scheme involves solving equations which minimize the sample size required for estimating ED to desired precision and confidence. It is subject to a constrained variation of the estimated ED and solved using the Lagrange multiplier method. The scheme incorporates measurement variation introduced both by MOSFET calibration, and by variation in MOSFET readings between repeated CT scans. Sample size requirements are illustrated on cardiac, chest, and abdomen–pelvis CT scans performed on a 320-row scanner and chest CT performed on a 16-row scanner. Results: Sample sizes for estimating ED vary considerably between scanners and protocols. Sample size increases as the required precision or confidence is higher and also as the anticipated ED is lower. For example, for a helical chest protocol, for 95% confidence and 5% precision for the ED, 30 measurements are required on the 320-row scanner and 11 on the 16-row scanner when the anticipated ED is 4 mSv; these sample sizes are 5 and 2, respectively, when the anticipated ED is 10 mSv. Conclusions: Applying the suggested scheme, it was found that even at modest sample sizes, it is feasible to estimate ED with high precision and a high degree of confidence. As CT technology develops enabling ED to be lowered, more MOSFET measurements are needed to estimate ED with the same precision and confidence. PMID:24694150
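The paper's constrained Lagrange-multiplier scheme is more involved; the sketch below shows only the textbook large-sample version of the same idea, choosing the smallest n whose confidence-interval half-width meets the relative precision target. The coefficient-of-variation values are assumed for illustration, not taken from the study.

    import math
    from statistics import NormalDist

    def sample_size(cv, precision=0.05, confidence=0.95):
        """Smallest n with z * cv / sqrt(n) <= precision (large-sample rule).

        cv        : coefficient of variation of a single ED estimate
        precision : allowed relative half-width of the confidence interval
        """
        z = NormalDist().inv_cdf(0.5 + confidence / 2.0)
        return math.ceil((z * cv / precision) ** 2)

    # Illustrative only: a noisier (lower-dose) protocol needs more scans
    for cv in (0.07, 0.14):
        print(f"CV = {cv:.2f} -> n = {sample_size(cv)}")

With these assumed CVs the rule reproduces the qualitative behavior reported above: lower anticipated ED (larger relative noise) drives the required number of scans up sharply.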
Zhu, Haixin; Zhou, Xianfeng; Su, Fengyu; Tian, Yanqing; Ashili, Shashanka; Holl, Mark R; Meldrum, Deirdre R
2012-10-01
We report a novel method for wafer-level, high-throughput optical chemical sensor patterning, with precise control of the sensor volume and the capability of producing arbitrary microscale patterns. Monomeric oxygen (O2) and pH optical probes were polymerized with 2-hydroxyethyl methacrylate (HEMA) and acrylamide (AM) to form spin-coatable and further crosslinkable polymers. A micro-patterning method based on micro-fabrication techniques (photolithography, wet chemical processing, and reactive ion etching) was developed to miniaturize the sensor film onto glass substrates in arbitrary sizes and shapes. The sensitivity of the fabricated micro-patterns was characterized under various oxygen concentrations and pH values. A process for the spatial integration of two sensors (oxygen and pH) on the same substrate surface was also developed, and preliminary fabrication and characterization results are presented. To the best of our knowledge, this is the first time that poly(2-hydroxyethyl methacrylate)-co-poly(acrylamide) (PHEMA-co-PAM)-based sensors have been patterned and integrated at the wafer level with micron-scale precision using microfabrication techniques. The developed methods provide a feasible way to miniaturize and integrate optical chemical sensor systems and can be applied to any lab-on-a-chip system, especially biological micro-systems requiring optical sensing of single or multiple analytes.
Enhanced Photoelectrochemical Performance of Cuprous Oxide/Graphene Nanohybrids
2017-01-01
Combination of an oxide semiconductor with a highly conductive nanocarbon framework (such as graphene or carbon nanotubes) is an attractive avenue to assemble efficient photoelectrodes for solar fuel generation. To fully exploit the possible synergies of the hybrid formation, however, precise knowledge of these systems is required to allow rational design and morphological engineering. In this paper, we present the controlled electrochemical deposition of nanocrystalline p-Cu2O on the surface of different graphene substrates. The developed synthetic protocol allowed tuning of the morphological features of the hybrids as deduced from electron microscopy. (Photo)electrochemical measurements (including photovoltammetry, electrochemical impedance spectroscopy, and photocurrent transient analysis) demonstrated better performance for the 2D-graphene-containing photoelectrodes compared to the bare Cu2O films, the enhanced performance being rooted in suppressed charge carrier recombination. To elucidate the precise role of graphene, comparative studies were performed with carbon nanotube (CNT) films and 3D graphene foams. These studies revealed, after allowing for the effect of increased surface area, that the 3D graphene substrate outperformed the other two nanocarbons. Its interconnected structure facilitated effective charge separation and transport, leading to better harvesting of the generated photoelectrons. These hybrid assemblies are shown to be potentially attractive candidates in photoelectrochemical energy conversion schemes, such as CO2 reduction. PMID:28460518
Automated coregistration of MTI spectral bands
NASA Astrophysics Data System (ADS)
Theiler, James P.; Galbraith, Amy E.; Pope, Paul A.; Ramsey, Keri A.; Szymanski, John J.
2002-08-01
In the focal plane of a pushbroom imager, a linear array of pixels is scanned across the scene, building up the image one row at a time. For the Multispectral Thermal Imager (MTI), each of fifteen different spectral bands has its own linear array. These arrays are pushed across the scene together, but since each band's array is at a different position on the focal plane, a separate image is produced for each band. The standard MTI data products (LEVEL1B_R_COREG and LEVEL1B_R_GEO) resample these separate images to a common grid and produce coregistered multispectral image cubes. The coregistration software employs a direct "dead reckoning" approach. Every pixel in the calibrated image is mapped to an absolute position on the surface of the earth, and these are resampled to produce an undistorted coregistered image of the scene. To do this requires extensive information regarding the satellite position and pointing as a function of time, the precise configuration of the focal plane, and the distortion due to the optics. These must be combined with knowledge about the position and altitude of the target on the rotating ellipsoidal earth. We will discuss the direct approach to MTI coregistration, as well as more recent attempts to tweak the precision of the band-to-band registration using correlations in the imagery itself.
Blom, Philip Stephen; Marcillo, Omar Eduardo
2016-12-05
A method is developed to apply acoustic tomography methods to a localized network of infrasound arrays with the intention of monitoring the atmospheric state in the region around the network using non-local sources, without requiring knowledge of the precise source location or the non-local atmospheric state. Closely spaced arrays provide a means to estimate phase velocities of signals that can place limiting bounds on certain characteristics of the atmosphere. Larger spacing between such clusters provides a means to estimate celerity from propagation times along multiple unique stratospherically or thermospherically ducted propagation paths and to compute more precise estimates of the atmospheric state. In order to avoid the commonly encountered complex, multimodal distributions for parametric atmosphere descriptions and to maximize the computational efficiency of the method, an optimal parametrization framework is constructed. This framework identifies the ideal combination of parameters for tomography studies in specific regions of the atmosphere, and statistical model selection analysis shows that high-quality corrections to the middle-atmosphere winds can be obtained using as few as three parameters. Lastly, comparison of the resulting estimates for synthetic data sets shows qualitative agreement between the middle-atmosphere winds and those estimated from infrasonic traveltime observations.
Automatic seed selection for segmentation of liver cirrhosis in laparoscopic sequences
NASA Astrophysics Data System (ADS)
Sinha, Rahul; Marcinczak, Jan Marek; Grigat, Rolf-Rainer
2014-03-01
For computer aided diagnosis based on laparoscopic sequences, image segmentation is one of the basic steps which define the success of all further processing. However, many image segmentation algorithms require prior knowledge which is given by interaction with the clinician. We propose an automatic seed selection algorithm for segmentation of liver cirrhosis in laparoscopic sequences which assigns each pixel a probability of being cirrhotic liver tissue or background tissue. Our approach is based on a trained classifier using SIFT and RGB features with PCA. Due to the unique illumination conditions in laparoscopic sequences of the liver, a very low dimensional feature space can be used for classification via logistic regression. The methodology is evaluated on 718 cirrhotic liver and background patches that are taken from laparoscopic sequences of 7 patients. Using a linear classifier we achieve a precision of 91% in a leave-one-patient-out cross-validation. Furthermore, we demonstrate that with logistic probability estimates, seeds with high certainty of being cirrhotic liver tissue can be obtained. For example, our precision of liver seeds increases to 98.5% if only seeds with more than 95% probability of being liver are used. Finally, these automatically selected seeds can be used as priors in Graph Cuts which is demonstrated in this paper.
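A minimal sketch of the classification stage described above, with synthetic stand-in features in place of the real SIFT/RGB descriptors: PCA followed by logistic regression, keeping seeds only when the posterior probability of liver tissue exceeds a threshold, mirroring the reported precision/recall trade-off. All data shapes and distributions here are assumptions.

    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.linear_model import LogisticRegression
    from sklearn.pipeline import make_pipeline

    rng = np.random.default_rng(1)

    # Stand-in data: rows are patch feature vectors (in the paper, SIFT + RGB
    # statistics); label 1 = cirrhotic liver tissue, 0 = background.
    X = np.vstack([rng.normal(0.0, 1.0, (350, 20)), rng.normal(1.2, 1.0, (368, 20))])
    y = np.r_[np.zeros(350), np.ones(368)]

    clf = make_pipeline(PCA(n_components=5), LogisticRegression(max_iter=1000))
    clf.fit(X, y)

    # Keep only seeds whose posterior probability of "liver" exceeds 0.95;
    # raising the threshold trades seed count for seed precision.
    p_liver = clf.predict_proba(X)[:, 1]
    seeds = np.flatnonzero(p_liver > 0.95)
    print(f"{seeds.size} high-certainty liver seeds out of {len(y)} patches")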
Transcutaneous vaccination via laser microporation
Weiss, Richard; Hessenberger, Michael; Kitzmüller, Sophie; Bach, Doris; Weinberger, Esther E.; Krautgartner, Wolf D.; Hauser-Kronberger, Cornelia; Malissen, Bernard; Boehler, Christof; Kalia, Yogeshvar N.; Thalhamer, Josef; Scheiblhofer, Sandra
2012-01-01
Driven by constantly increasing knowledge about skin immunology, vaccine delivery via the cutaneous route has recently gained renewed interest. Considering its richness in immunocompetent cells, targeting antigens to the skin is considered to be more effective than intramuscular or subcutaneous injection. However, circumvention of the superficial layer of the skin, the stratum corneum, represents the major challenge for cutaneous immunization. An optimal delivery method has to be effective and reliable, but also highly adaptable to specific demands; it should avoid the use of hypodermic needles and the requirement for specially trained healthcare workers. The P.L.E.A.S.E.® (Precise Laser Epidermal System) device employed in this study for the creation of aqueous micropores in the skin fulfills these prerequisites by combining the precision of its laser scanning technology with the flexibility to vary the number, density, and depth of the micropores in a user-friendly manner. We investigated the potential of transcutaneous immunization via laser-generated micropores for the induction of specific immune responses and compared the outcomes to conventional subcutaneous injection. By targeting different layers of the skin we were able to bias the polarization of T cells, which could be modulated by the addition of adjuvants. The P.L.E.A.S.E.® device represents a highly effective and versatile platform for transcutaneous vaccination. PMID:22750193
Crowdsourcing for error detection in cortical surface delineations.
Ganz, Melanie; Kondermann, Daniel; Andrulis, Jonas; Knudsen, Gitte Moos; Maier-Hein, Lena
2017-01-01
With the recent trend toward big data analysis, neuroimaging datasets have grown substantially in the past years. While larger datasets potentially offer important insights for medical research, one major bottleneck is the amount of medical-expert time required to validate automatic processing results. To address this issue, the goal of this paper was to assess whether anonymous nonexperts from an online community can perform quality control of MR-based cortical surface delineations derived by an automatic algorithm. So-called knowledge workers from an online crowdsourcing platform were asked to annotate errors in automatic cortical surface delineations on 100 central, coronal slices of MR images. On average, annotations for 100 images were obtained in less than an hour. When using expert annotations as reference, the crowd on average achieves a sensitivity of 82% and a precision of 42%. Merging multiple annotations per image significantly improves the sensitivity of the crowd (up to 95%), but leads to a decrease in precision (as low as 22%). Our experiments show that the detection of errors in automatic cortical surface delineations by anonymous untrained workers is feasible. Future work will focus on further increasing the sensitivity of our method, such that the error detection tasks can be handled exclusively by the crowd and expert resources can be focused on error correction.
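A small sketch of the merging step on toy data: per-worker binary annotations are combined by a vote threshold, and a union merge (threshold 1) raises sensitivity at the cost of precision, as in the experiments above. The worker error rates below are invented for illustration.

    import numpy as np

    def merge_annotations(votes, min_votes=1):
        # votes: (workers, locations) boolean array of per-worker error marks.
        # min_votes=1 is a union merge (higher sensitivity); larger values
        # require agreement between workers and favour precision instead.
        return np.sum(votes, axis=0) >= min_votes

    def sensitivity_precision(pred, truth):
        tp = np.sum(pred & truth)
        return tp / max(truth.sum(), 1), tp / max(pred.sum(), 1)

    # Toy example: 3 workers mark errors at 20 candidate locations
    rng = np.random.default_rng(2)
    truth = rng.uniform(size=20) < 0.3          # expert reference
    votes = np.array([(truth & (rng.uniform(size=20) < 0.8))
                      | (rng.uniform(size=20) < 0.1) for _ in range(3)])
    for k in (1, 2, 3):
        sens, prec = sensitivity_precision(merge_annotations(votes, k), truth)
        print(f"min_votes={k}: sensitivity={sens:.2f} precision={prec:.2f}")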
The Promise of Personalized Medicine
... to more precise tools, from less to more quantitative data, to inform yourself and increase your knowledge. Klose: How would you describe your own approach to research? Dr. Zerhouni: I'm sort of a hybrid ...
Foundation of a Knowledge Representation System for Image Understanding.
1980-10-01
This is useful for holding the system together, for computing similarity between objects, for quickly retrieving desired information in as detailed a ...mined by how much precision is needed to carry through the current computation. In Section 2, we discuss the OVS system itself, its structure and how ... 2.0 OVS SYSTEM. Our goal here is to present the computational constraints involved in the design of a knowledge representation system which is
Tacit Knowledge of Caring and Embodied Selfhood
Kontos, Pia C.; Naglie, Gary
2013-01-01
The tacit knowledge paradigm is gaining recognition as an important source of knowledge that informs clinical decision-making. However, it is limited by an exclusive focus on knowledge acquired through clinical practice, and a consequent neglect of the primordial and socio-cultural significance of embodied selfhood, precisely what provides the foundational structure of tacit knowledge of caring and facilitates its manifestation. Drawing on findings from a qualitative study of forty-three dementia care practitioners in Ontario, Canada that utilized research-based drama and focus group methodology, we argue that embodied selfhood is fundamental to tacit knowledge of caring. Results are analyzed drawing upon the theoretical precepts of embodied selfhood that are rooted in Merleau-Ponty’s (1962) reconceptualization of perception and Bourdieu’s (1977, 1990) notion of habitus. We conclude with a call for further exploration of the body as a site of the production of tacit knowledge. PMID:19392935
Situation awareness system for Canada
NASA Astrophysics Data System (ADS)
Hill, Andrew
1999-07-01
Situation awareness encompasses knowledge of orders and plans, together with current knowledge of friendly force actions. Knowing where you are and being able to transmit that information in near real time to other friendly forces provides the ability to exercise precise command and control over those forces. With current voice-based command and control, between 40 and 60 percent of Combat Net Radio traffic relates to location reporting of some sort, and commanders at Battle Group level and below spend, on average, 40 percent of their total time performing position- and navigation-related functions. The need to rapidly transfer own-force location information throughout a force and to process the received information quickly, accurately, and reliably provides the rationale for an automated situation awareness system. This paper describes the Situation Awareness System (SAS) being developed by Computing Devices Canada for the Canadian Department of National Defence as a component of the Position Determination and Navigation for Land Forces program. The SAS is being integrated with the Iris Tactical Command, Control, Communications System, which is also being developed by Computing Devices. The SAS software provides a core operating environment onto which command and control functionality can be easily added to produce general and specialist battlefield management systems.
GNSS satellite transmit power and its impact on orbit determination
NASA Astrophysics Data System (ADS)
Steigenberger, Peter; Thoelert, Steffen; Montenbruck, Oliver
2018-06-01
Antenna thrust is a small acceleration acting on Global Navigation Satellite System satellites caused by the transmission of radio navigation signals. Knowledge of the transmit power and the mass of the satellites is required for the computation of this effect. The actual transmit power can be obtained from measurements with a high-gain antenna together with knowledge of the properties of the transmit and receive antennas as well as losses along the propagation path. Transmit power measurements for different types of GPS, GLONASS, Galileo, and BeiDou-2 satellites were taken with a 30-m dish antenna of the German Aerospace Center (DLR) located at its ground station in Weilheim. For GPS, total L-band transmit power levels of 50-240 W were obtained, 20-135 W for GLONASS, 95-265 W for Galileo, and 130-185 W for BeiDou-2. The transmit power usually differs only slightly among individual spacecraft within one satellite block. The GLONASS-M satellites are an exception: six subgroups with different transmit power levels could be identified. Considering the antenna thrust in precise orbit determination of GNSS satellites decreases the orbital radius by 1-27 mm depending on the transmit power, the satellite mass, and the orbital period.
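For a sense of scale, a back-of-the-envelope sketch (values assumed, not taken from the paper): the antenna-thrust acceleration is a = P/(mc), and a first-order estimate of the resulting radius change for a near-circular orbit with fixed angular momentum is a/n², with n the mean motion.

    C = 299_792_458.0      # speed of light [m/s]
    MU = 3.986004418e14    # Earth's gravitational parameter [m^3/s^2]

    def radial_shift_mm(power_w, mass_kg, radius_m):
        """Antenna thrust a = P/(m c); for a near-circular orbit with fixed
        angular momentum the radius shifts by roughly a / n^2 (n = mean motion)."""
        a = power_w / (mass_kg * C)
        n2 = MU / radius_m ** 3
        return 1e3 * a / n2

    # Assumed GPS-like numbers: 150 W radiated, 1630 kg, 26,560 km orbit radius
    print(f"{radial_shift_mm(150.0, 1630.0, 26_560e3):.0f} mm")

With these assumed values the estimate lands near 14 mm, comfortably inside the 1-27 mm range quoted above.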
Investigation of Space Interferometer Control Using Imaging Sensor Output Feedback
NASA Technical Reports Server (NTRS)
Leitner, Jesse A.; Cheng, Victor H. L.
2003-01-01
Numerous space interferometry missions are planned for the next decade to verify different enabling technologies towards very-long-baseline interferometry to achieve high-resolution imaging and high-precision measurements. These objectives will require coordinated formations of spacecraft separately carrying optical elements comprising the interferometer. High-precision sensing and control of the spacecraft and the interferometer-component payloads are necessary to deliver sub-wavelength accuracy to achieve the scientific objectives. For these missions, the primary scientific product of interferometer measurements may be the only source of data available at the precision required to maintain the spacecraft and interferometer-component formation. A concept is studied for detecting the interferometer's optical configuration errors based on information extracted from the interferometer sensor output. It enables precision control of the optical components, and, in cases of space interferometers requiring formation flight of spacecraft that comprise the elements of a distributed instrument, it enables the control of the formation-flying vehicles because independent navigation or ranging sensors cannot deliver the high-precision metrology over the entire required geometry. Since the concept can act on the quality of the interferometer output directly, it can detect errors outside the capability of traditional metrology instruments, and provide the means needed to augment the traditional instrumentation to enable enhanced performance. Specific analyses performed in this study include the application of signal-processing and image-processing techniques to solve the problems of interferometer aperture baseline control, interferometer pointing, and orientation of multiple interferometer aperture pairs.
GABA-Mediated Presynaptic Inhibition Is Required for Precision of Long-Term Memory
ERIC Educational Resources Information Center
Cullen, Patrick K.; Dulka, Brooke N.; Ortiz, Samantha; Riccio, David C.; Jasnow, Aaron M.
2014-01-01
Though much attention has been given to the neural structures that underlie the long-term consolidation of contextual memories, little is known about the mechanisms responsible for the maintenance of memory precision. Here, we demonstrate a rapid time-dependent decline in memory precision in GABA [subscript B(1a)] receptor knockout mice. First, we…
Baynam, Gareth; Bowman, Faye; Lister, Karla; Walker, Caroline E; Pachter, Nicholas; Goldblatt, Jack; Boycott, Kym M; Gahl, William A; Kosaki, Kenjiro; Adachi, Takeya; Ishii, Ken; Mahede, Trinity; McKenzie, Fiona; Townshend, Sharron; Slee, Jennie; Kiraly-Borri, Cathy; Vasudevan, Anand; Hawkins, Anne; Broley, Stephanie; Schofield, Lyn; Verhoef, Hedwig; Groza, Tudor; Zankl, Andreas; Robinson, Peter N; Haendel, Melissa; Brudno, Michael; Mattick, John S; Dinger, Marcel E; Roscioli, Tony; Cowley, Mark J; Olry, Annie; Hanauer, Marc; Alkuraya, Fowzan S; Taruscio, Domenica; Posada de la Paz, Manuel; Lochmüller, Hanns; Bushby, Kate; Thompson, Rachel; Hedley, Victoria; Lasko, Paul; Mina, Kym; Beilby, John; Tifft, Cynthia; Davis, Mark; Laing, Nigel G; Julkowska, Daria; Le Cam, Yann; Terry, Sharon F; Kaufmann, Petra; Eerola, Iiro; Norstedt, Irene; Rath, Ana; Suematsu, Makoto; Groft, Stephen C; Austin, Christopher P; Draghia-Akli, Ruxandra; Weeramanthri, Tarun S; Molster, Caron; Dawkins, Hugh J S
2017-01-01
Public health relies on technologies to produce and analyse data, as well as effectively develop and implement policies and practices. An example is the public health practice of epidemiology, which relies on computational technology to monitor the health status of populations, identify disadvantaged or at risk population groups and thereby inform health policy and priority setting. Critical to achieving health improvements for the underserved population of people living with rare diseases is early diagnosis and best care. In the rare diseases field, the vast majority of diseases are caused by destructive but previously difficult to identify protein-coding gene mutations. The reduction in cost of genetic testing and advances in the clinical use of genome sequencing, data science and imaging are converging to provide more precise understandings of the 'person-time-place' triad. That is: who is affected (people); when the disease is occurring (time); and where the disease is occurring (place). Consequently, we are witnessing a paradigm shift in public health policy and practice towards 'precision public health'. Patient and stakeholder engagement has informed the need for a national public health policy framework for rare diseases. The engagement approach in different countries has produced highly comparable outcomes and objectives. Knowledge and experience sharing across the international rare diseases networks and partnerships has informed the development of the Western Australian Rare Diseases Strategic Framework 2015-2018 (RD Framework) and Australian government health briefings on the need for a National plan. The RD Framework is guiding the translation of genomic and other technologies into the Western Australian health system, leading to greater precision in diagnostic pathways and care, and is an example of how a precision public health framework can improve health outcomes for the rare diseases population. Five vignettes are used to illustrate how policy decisions provide the scaffolding for translation of new genomics knowledge, and catalyze transformative change in delivery of clinical services. The vignettes presented here are from an Australian perspective and are not intended to be comprehensive, but rather to provide insights into how a new and emerging 'precision public health' paradigm can improve the experiences of patients living with rare diseases, their caregivers and families. The conclusion is that genomic public health is informed by the individual and family needs, and the population health imperatives of an early and accurate diagnosis; which is the portal to best practice care. Knowledge sharing is critical for public health policy development and improving the lives of people living with rare diseases.
Protocol for Communication Networking for Formation Flying
NASA Technical Reports Server (NTRS)
Jennings, Esther; Okino, Clayton; Gao, Jay; Clare, Loren
2009-01-01
An application-layer protocol and a network architecture have been proposed for data communications among multiple autonomous spacecraft that are required to fly in a precise formation in order to perform scientific observations. The protocol could also be applied to other autonomous vehicles operating in formation, including robotic aircraft, robotic land vehicles, and robotic underwater vehicles. A group of spacecraft or other vehicles to which the protocol applies could be characterized as a precision-formation-flying (PFF) network, and each vehicle could be characterized as a node in the PFF network. In order to support precise formation flying, it would be necessary to establish a corresponding communication network, through which the vehicles could exchange position and orientation data and formation-control commands. The communication network must enable communication during early phases of a mission, when little positional knowledge is available. Particularly during early mission phases, the distances among vehicles may be so large that communication could be achieved only by relaying across multiple links. The large distances and need for omnidirectional coverage would limit communication links to operation at low bandwidth during these mission phases. Once the vehicles were in formation and distances were shorter, the communication network would be required to provide high-bandwidth, low-jitter service to support tight formation-control loops. The proposed protocol and architecture, intended to satisfy the aforementioned and other requirements, are based on a standard layered-reference-model concept. The proposed application protocol would be used in conjunction with conventional network, data-link, and physical-layer protocols. The proposed protocol includes the ubiquitous Institute of Electrical and Electronics Engineers (IEEE) 802.11 medium access control (MAC) protocol to be used in the data-link layer. In addition to its widespread and proven use in diverse local-area networks, this protocol offers both (1) a random-access mode needed for the early PFF deployment phase and (2) a time-bounded-services mode needed during PFF-maintenance operations. Switching between these two modes could be controlled by upper-layer entities using standard link-management mechanisms. Because the early deployment phase of a PFF mission can be expected to involve multihop relaying to achieve network connectivity, the proposed protocol includes the open shortest path first (OSPF) network protocol that is commonly used in the Internet. Each spacecraft in a PFF network would be in one of seven distinct states as the mission evolved from initial deployment, through coarse formation, and into precise formation. Reconfiguration of the formation to perform different scientific observations would also cause state changes among the network nodes. The application protocol provides for recognition and tracking of the seven states for each node and for protocol changes under specified conditions to adapt the network and satisfy communication requirements associated with the current PFF mission phase. Except during early deployment, when peer-to-peer random-access discovery methods would be used, the application protocol provides for operation in a centralized manner.
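A schematic sketch of the per-node state tracking and MAC-mode switching described above. The abstract names only the deployment, coarse-formation, and precise-formation phases, so the seven state labels below are placeholders, and the state-to-mode mapping is an assumption rather than the mission's actual table.

    from enum import Enum, auto

    class NodeState(Enum):
        # Hypothetical labels for the seven per-node states; the abstract does
        # not enumerate all seven, so these names are placeholders.
        DEPLOYED = auto()
        DISCOVERING = auto()
        RELAYING = auto()
        COARSE_FORMATION = auto()
        PRECISE_FORMATION = auto()
        RECONFIGURING = auto()
        SAFE_HOLD = auto()

    def mac_mode(state: NodeState) -> str:
        # Mirror the protocol's split between 802.11 random access (sparse,
        # early phases) and time-bounded service (tight formation control).
        if state in (NodeState.COARSE_FORMATION, NodeState.PRECISE_FORMATION,
                     NodeState.RECONFIGURING):
            return "time-bounded"
        return "random-access"

    for s in NodeState:
        print(f"{s.name:18s} -> {mac_mode(s)}")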
Spacecraft Alignment Determination and Control for Dual Spacecraft Precision Formation Flying
NASA Technical Reports Server (NTRS)
Calhoun, Philip; Novo-Gradac, Anne-Marie; Shah, Neerav
2017-01-01
Many proposed formation flying missions seek to advance the state of the art in spacecraft science imaging by utilizing precision dual spacecraft formation flying to enable a virtual space telescope. Using precision dual spacecraft alignment, very long focal lengths can be achieved by locating the optics on one spacecraft and the detector on the other. Proposed science missions include astrophysics concepts with spacecraft separations from 1000 km to 25,000 km, such as the Milli-Arc-Second Structure Imager (MASSIM) and the New Worlds Observer, and Heliophysics concepts for solar coronagraphs and X-ray imaging with smaller separations (50 m to 500 m). All of these proposed missions require advances in guidance, navigation, and control (GNC) for precision formation flying. In particular, very precise astrometric alignment control and estimation is required for precise inertial pointing of the virtual space telescope to enable science imaging orders of magnitude better than can be achieved with conventional single spacecraft instruments. This work develops design architectures, algorithms, and performance analysis of proposed GNC systems for precision dual spacecraft astrometric alignment. These systems employ a variety of GNC sensors and actuators, including laser-based alignment and ranging systems, optical imaging sensors (e.g. guide star telescope), inertial measurement units (IMU), as well as microthruster and precision stabilized platforms. A comprehensive GNC performance analysis is given for a Heliophysics dual-spacecraft PFF imaging mission concept.
A proposed method for wind velocity measurement from space
NASA Technical Reports Server (NTRS)
Censor, D.; Levine, D. M.
1980-01-01
An investigation was made of the feasibility of making wind velocity measurements from space by monitoring the apparent change in the refractive index of the atmosphere induced by motion of the air. The physical principle is the same as that resulting in the phase changes measured in the Fizeau experiment. It is proposed that this phase change could be measured using a three-cornered arrangement of satellite-borne source and reflectors, around which two laser beams propagate in opposite directions. It is shown that even though the velocity of the satellites is much larger than the wind velocity, factors such as change in satellite position and Doppler shifts can be taken into account in a reasonable manner and the Fizeau phase measured. This phase measurement yields an average wind velocity along the ray path through the atmosphere. The method requires neither high accuracy for satellite position or velocity, nor precise knowledge of the refractive index or its gradient in the atmosphere. However, the method intrinsically yields wind velocity integrated along the ray path; hence to obtain higher spatial resolution, inversion techniques are required.
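A commonly quoted first-order expression for the effect is Δφ = 4πLv(n² − 1)/(λc), obtained by applying the Fresnel drag coefficient (1 − 1/n²) to the two counter-propagating beams; the sketch below evaluates it with assumed path length, wavelength, and wind speed, purely to show the order of magnitude involved.

    import math

    def fizeau_phase_rad(path_m, wind_ms, wavelength_m=532e-9, n_air=1.000293):
        # First-order phase difference between counter-propagating beams in a
        # medium moving at speed v: 4*pi*L*v*(n^2 - 1) / (lambda * c).
        c = 299_792_458.0
        return (4.0 * math.pi * path_m * wind_ms * (n_air ** 2 - 1.0)
                / (wavelength_m * c))

    # Assumed numbers: 200 km of air along the ray path, 10 m/s average wind
    print(f"{fizeau_phase_rad(2e5, 10.0):.0f} rad")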
Potter, Matthew E; Aswegen, Sivan V; Gibson, Emma K; Silverwood, Ian P; Raja, Robert
2016-07-14
The increased demand for bulk hydrocarbons necessitates research into increasingly sustainable, energy-efficient catalytic processes. Owing to intricately designed structure-property correlations, SAPO-34 has become established as a promising material for low-temperature ethanol dehydration to ethylene. However, further optimization of this process requires precise knowledge of the reaction mechanism at a molecular level. To achieve this, a range of spectroscopic characterization techniques is required to probe both the interaction with the active site and the wider role of the framework. To this end we employ a combination of in situ infrared and neutron scattering techniques to elucidate the influence of the surface ethoxy species in the activation of both diethyl ether and ethanol, towards the improved formation of ethylene at low temperatures. The combined conclusion of these studies is that the formation of ethylene is the rate-determining step, which is of fundamental importance for the development of this process and the introduction of bio-ethanol as a viable feedstock for ethylene production.
Digital LED Pixels: Instructions for use and a characterization of their properties.
Jones, Pete R; Garcia, Sara E; Nardini, Marko
2016-12-01
This article details how to control light-emitting diodes (LEDs) using an ordinary desktop computer. By combining digitally addressable LEDs with an off-the-shelf microcontroller (Arduino), multiple LEDs can be controlled independently and with a high degree of temporal, chromatic, and luminance precision. The proposed solution is safe (can be powered by a 5-V battery), tested (has been used in published research), inexpensive (∼$60, plus $2 per LED), highly interoperable (can be controlled by any type of computer/operating system via a USB or Bluetooth connection), requires no prior knowledge of electrical engineering (components simply require plugging together), and uses widely available components for which established help forums already exist. Matlab code is provided, including a 'minimal working example' suitable for beginners. Properties of the recommended LEDs are also characterized, including their response time, luminance profile, and color gamut. Based on these, it is shown that the LEDs are highly stable in terms of both luminance and chromaticity, and do not suffer from the warm-up, chromatic shift, and slow-response issues associated with traditional CRT and LCD monitor technology.
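The article's own examples are in Matlab; as an illustration of the same serial-control idea in Python (pyserial), the sketch below assumes a hypothetical "index,R,G,B" line protocol that the microcontroller firmware would have to implement, and an assumed port name and baud rate.

    import serial  # pyserial

    def set_pixel(port, index, r, g, b):
        # The "index,R,G,B" line protocol is hypothetical: the firmware on the
        # microcontroller must parse the same format and update the LED chain.
        port.write(f"{index},{r},{g},{b}\n".encode("ascii"))
        port.flush()

    # Port name and baud rate are assumptions; adjust for the actual setup
    with serial.Serial("/dev/ttyUSB0", 115200, timeout=1) as arduino:
        set_pixel(arduino, 0, 255, 0, 0)   # first LED: full red
        set_pixel(arduino, 1, 0, 0, 255)   # second LED: full blue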
Image Tiling for Profiling Large Objects
NASA Technical Reports Server (NTRS)
Venkataraman, Ajit; Schock, Harold; Mercer, Carolyn R.
1992-01-01
Three-dimensional surface measurements of large objects are required in a variety of industrial processes. The nature of these measurements is changing as optical instruments are beginning to replace conventional contact probes scanned over the objects. A common characteristic of optical surface profilers is the trade-off between measurement accuracy and field of view. In order to measure a large object with high accuracy, multiple views are required, and an accurate transformation between the different views is needed to bring about their registration. In this paper, we demonstrate how the transformation parameters can be obtained precisely by choosing control points which lie in the overlapping regions of the images. A good starting point for the transformation parameters is obtained from knowledge of the scanner position. The selection of the control points is independent of the object geometry. By successively recording multiple views and obtaining transformations with respect to a single coordinate system, a complete physical model of an object can be obtained. Since all data are in the same coordinate system, they can be used for building automatic models of free-form surfaces.
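One standard way to obtain the view-to-view transformation from matched control points is the Kabsch/Procrustes solution sketched below; this is a generic method consistent with the description above, not necessarily the authors' exact estimator.

    import numpy as np

    def rigid_transform(P, Q):
        """Least-squares R, t with R @ P[i] + t ~= Q[i] (Kabsch/Procrustes);
        P, Q are (k, 3) matched control points from the overlap of two views."""
        cp, cq = P.mean(axis=0), Q.mean(axis=0)
        Hm = (P - cp).T @ (Q - cq)
        U, _, Vt = np.linalg.svd(Hm)
        D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])  # no reflection
        R = Vt.T @ D @ U.T
        return R, cq - R @ cp

    # Toy check: recover a known rotation and translation from 4 control points
    rng = np.random.default_rng(3)
    P = rng.uniform(size=(4, 3))
    th = 0.3
    R_true = np.array([[np.cos(th), -np.sin(th), 0.0],
                       [np.sin(th),  np.cos(th), 0.0],
                       [0.0, 0.0, 1.0]])
    Q = P @ R_true.T + np.array([0.1, -0.2, 0.05])
    R, t = rigid_transform(P, Q)
    print(np.allclose(R, R_true), np.round(t, 3))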
Universal electronics for miniature and automated chemical assays.
Urban, Pawel L
2015-02-21
This minireview discusses universal electronic modules (generic programmable units) and their use by analytical chemists to construct inexpensive, miniature or automated devices. Recently, open-source platforms have gained considerable popularity among tech-savvy chemists because their implementation often does not require expert knowledge and investment of funds. Thus, chemistry students and researchers can easily start implementing them after a few hours of reading tutorials and trial-and-error. Single-board microcontrollers and micro-computers such as Arduino, Teensy, Raspberry Pi or BeagleBone enable collecting experimental data with high precision as well as efficient control of electric potentials and actuation of mechanical systems. They are readily programmed using high-level languages, such as C, C++, JavaScript or Python. They can also be coupled with mobile consumer electronics, including smartphones as well as teleinformatic networks. More demanding analytical tasks require fast signal processing. Field-programmable gate arrays enable efficient and inexpensive prototyping of high-performance analytical platforms, thus becoming increasingly popular among analytical chemists. This minireview discusses the advantages and drawbacks of universal electronic modules, considering their application in prototyping and manufacture of intelligent analytical instrumentation.
Implementation of a dynamic data entry system for the PHENIX gas system
NASA Astrophysics Data System (ADS)
Hagiwara, Masako
2003-10-01
The PHENIX detector at the BNL RHIC facility uses multiple detector technologies that require a precise gas delivery system, including flammable gases that require additional monitoring. During operation of the detector, it is crucial to maintain stable and safe operating conditions by carefully monitoring flows, pressures, and various other gas properties. These systems are monitored during running periods on a continuous basis. For the most part, these records were kept by hand, filling out a paper logsheet every four hours. A dynamic data entry system was needed to replace the paper logsheets. The solution created was to use a PDA or laptop computer with a wireless connection to enter the data directly into a MySQL database. The system uses PHP to dynamically create and update the data entry pages. The data entered can be viewed in graphs as well as tables. As a result, the data recorded will be easily accessible during PHENIX's next running period. It also allows for long term archiving, making the data available during the analysis phase, providing knowledge of the operating conditions of the gas system.
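As a rough sketch of the logsheet replacement (the actual system used PHP pages writing to MySQL; the table and column names below are hypothetical, with SQLite standing in for the database):

    import sqlite3, time

    # SQLite stands in for the MySQL backend; schema and names are made up
    db = sqlite3.connect("gas_log.db")
    db.execute("""CREATE TABLE IF NOT EXISTS readings (
        ts REAL, channel TEXT, flow_sccm REAL, pressure_torr REAL)""")

    def log_reading(channel, flow_sccm, pressure_torr):
        # One row per reading, replacing a line on the four-hourly paper logsheet
        db.execute("INSERT INTO readings VALUES (?, ?, ?, ?)",
                   (time.time(), channel, flow_sccm, pressure_torr))
        db.commit()

    log_reading("drift-chamber", 150.2, 745.8)
    print(db.execute("SELECT COUNT(*) FROM readings").fetchone()[0], "rows logged")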
Statistical analysis of CSP plants by simulating extensive meteorological series
NASA Astrophysics Data System (ADS)
Pavón, Manuel; Fernández, Carlos M.; Silva, Manuel; Moreno, Sara; Guisado, María V.; Bernardos, Ana
2017-06-01
The feasibility analysis of any power plant project needs an estimation of the amount of energy it will be able to deliver to the grid during its lifetime. To achieve this, its feasibility study requires precise knowledge of the solar resource over a long-term period. In concentrating solar power (CSP) projects, financing institutions typically require several statistical probability-of-exceedance scenarios for the expected electric energy output. Currently, the industry assumes a correlation between probabilities of exceedance of annual Direct Normal Irradiance (DNI) and energy yield. In this work, this assumption is tested by simulating the energy yield of CSP plants using as input a 34-year series of measured meteorological parameters and solar irradiance. The results of this work show that, even if some correspondence between the probabilities of exceedance of annual DNI values and energy yields is found, the intra-annual distribution of DNI may significantly affect this correlation. This result highlights the need for standardized procedures for the elaboration of DNI time series representative of a given probability of exceedance of annual DNI.
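A minimal sketch of the exceedance statistics involved, on a stand-in annual series (the real study uses 34 years of measured data): Pxx is the annual value exceeded in xx% of years, read from the empirical distribution.

    import numpy as np

    def exceedance(annual_values, p):
        # Pxx: the annual value exceeded in xx% of years (empirical estimate)
        return np.percentile(annual_values, 100 - p)

    # Stand-in for a 34-year annual DNI series [kWh/m^2/yr]
    rng = np.random.default_rng(4)
    dni = rng.normal(2100.0, 120.0, size=34)
    for p in (50, 90, 99):
        print(f"P{p}: {exceedance(dni, p):.0f} kWh/m2/yr")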
Tsukamoto, S; Hoshino, H; Tamura, T
2008-01-01
This paper describes an indoor behavioral monitoring system for improving the quality of life in ordinary houses. It employs a device that uses weak radio waves to transmit the collected data, and it is designed so that a user can install it without technical knowledge or additional construction work. This study focuses on determining the usage statistics of home electric appliances by using an electromagnetic field sensor as the detection device. The usage of a home appliance is determined by measuring the electromagnetic field observable in the area near the appliance. It is assumed that these usage statistics can provide information about the indoor behavior of a subject. Since the sensor is not direction sensitive and does not require precise positioning or wiring, it can be easily installed in ordinary houses by the end users. Several simple tests were performed to evaluate the practicability of the sensor unit. The results indicate that the proposed system could be useful for collecting the usage statistics of home appliances. PMID:19415135
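A hedged sketch of the kind of post-processing such a sensor implies: threshold the field magnitude into an on/off state and accumulate usage statistics. The threshold and sampling interval are assumptions, not the paper's method.

```python
import numpy as np

def usage_events(emf, threshold, dt=1.0):
    """Turn an EMF magnitude series into switch times and total on-time.

    emf: 1-D array sampled every dt seconds near one appliance;
    threshold separates the appliance's on-state field from background.
    """
    on = emf > threshold
    switches = np.flatnonzero(np.diff(on.astype(int)))  # state-change indices
    return switches * dt, on.sum() * dt

times, on_seconds = usage_events(np.array([0.1, 0.1, 2.3, 2.4, 0.2]), 1.0)
print(times, on_seconds)   # two switch events, 2 s of on-time
```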
NASA Astrophysics Data System (ADS)
Mills, Cameron; Tiwari, Vaibhav; Fairhurst, Stephen
2018-05-01
The observation of gravitational wave signals from binary black hole and binary neutron star mergers has established the field of gravitational wave astronomy. Future networks of gravitational wave detectors are expected to possess great potential for probing various aspects of astronomy. An important consideration for successive improvement of current detectors, or for establishment of new sites, is knowledge of the minimum number of detectors required to perform precision astronomy. We attempt to answer this question by assessing the ability of future detector networks to detect and localize binary neutron star mergers on the sky. Good localization ability is crucial for many of the scientific goals of gravitational wave astronomy, such as electromagnetic follow-up, measuring the properties of compact binaries throughout cosmic history, and cosmology. We find that although two detectors at improved sensitivity are sufficient to substantially increase the number of observed signals, at least three detectors of comparable sensitivity are required to localize the majority of signals, typically to within around 10 deg2, adequate for follow-up with most wide-field-of-view optical telescopes.
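Sky localization in such networks rests on timing triangulation between sites. As a back-of-the-envelope sketch, assuming the common approximation that a detector's arrival-time accuracy is sigma_t ≈ 1/(2·pi·rho·sigma_f), with rho the signal-to-noise ratio and sigma_f the effective signal bandwidth (the numbers below are assumptions):

```python
import numpy as np

rho = 10.0       # signal-to-noise ratio (assumed)
sigma_f = 100.0  # effective signal bandwidth, Hz (assumed)

sigma_t = 1.0 / (2.0 * np.pi * rho * sigma_f)   # timing accuracy per detector
print(f"timing accuracy ~ {sigma_t * 1e3:.2f} ms")

# Over a ~3000 km baseline (light travel time ~10 ms), a ~0.16 ms timing
# error corresponds to roughly degree-scale resolution along that baseline,
# which is why at least three non-collinear sites are needed for small patches.
```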
Collaborative knowledge acquisition for the design of context-aware alert systems.
Joffe, Erel; Havakuk, Ofer; Herskovic, Jorge R; Patel, Vimla L; Bernstam, Elmer Victor
2012-01-01
To present a framework for combining implicit knowledge acquisition from multiple experts with machine learning and to evaluate this framework in the context of anemia alerts. Five internal medicine residents reviewed 18 anemia alerts, while 'talking aloud'. They identified features that were reviewed by two or more physicians to determine appropriate alert level, etiology and treatment recommendation. Based on these features, data were extracted from 100 randomly-selected anemia cases for a training set and an additional 82 cases for a test set. Two staff internists assigned an alert level, etiology and treatment recommendation before and after reviewing the entire electronic medical record. The training set of 118 cases (100 plus 18) and the test set of 82 cases were explored using RIDOR and JRip algorithms. The feature set was sufficient to assess 93% of anemia cases (intraclass correlation for alert level before and after review of the records by internists 1 and 2 were 0.92 and 0.95, respectively). High-precision classifiers were constructed to identify low-level alerts (precision p=0.87, recall R=0.4), iron deficiency (p=1.0, R=0.73), and anemia associated with kidney disease (p=0.87, R=0.77). It was possible to identify low-level alerts and several conditions commonly associated with chronic anemia. This approach may reduce the number of clinically unimportant alerts. The study was limited to anemia alerts. Furthermore, clinicians were aware of the study hypotheses potentially biasing their evaluation. Implicit knowledge acquisition, collaborative filtering and machine learning were combined automatically to induce clinically meaningful and precise decision rules.
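The precision and recall figures quoted are the standard classifier metrics; computed from labeled cases they are one call each (the labels below are hypothetical).

```python
from sklearn.metrics import precision_score, recall_score

# Hypothetical labels: 1 = case is a low-level alert, 0 = not.
y_true = [1, 0, 1, 1, 0, 1, 0, 0, 1, 1]
y_pred = [1, 0, 0, 1, 0, 0, 1, 0, 0, 1]

print(precision_score(y_true, y_pred))  # fraction of flagged cases that are correct
print(recall_score(y_true, y_pred))     # fraction of true cases that are found
```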
Sine-Bar Attachment For Machine Tools
NASA Technical Reports Server (NTRS)
Mann, Franklin D.
1988-01-01
Sine-bar attachment for collets, spindles, and chucks helps machinists set up quickly for angular cuts that require greater precision than provided by graduations of machine tools. Machinist uses attachment to index head or carriage of milling machine or lathe relative to table or turning axis of tool. Attachment accurate to 1 minute of arc, depending on length of sine bar and precision of gauge blocks in setup. Attachment installs quickly and easily on almost any type of lathe or mill. Requires no special clamps or fixtures, and eliminates many trial-and-error measurements. More stable than improvised setups and not readily jarred out of position.
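The underlying sine-bar relation is simple enough to state exactly: the gauge-block stack height h that tilts a bar of roll spacing L to angle theta is h = L·sin(theta). A small helper, purely illustrative:

```python
import math

def gauge_stack_height(sine_bar_length, angle_deg):
    """Gauge-block stack that tilts a sine bar to the requested angle.

    Standard sine-bar relation: h = L * sin(theta), where L is the
    center-to-center spacing of the bar's rolls.
    """
    return sine_bar_length * math.sin(math.radians(angle_deg))

# A 5-inch sine bar set to 30 degrees needs a 2.5000-inch stack.
print(f"{gauge_stack_height(5.0, 30.0):.4f} in")
```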
High level continuity for coordinate generation with precise controls
NASA Technical Reports Server (NTRS)
Eiseman, P. R.
1982-01-01
Coordinate generation techniques with precise local controls have been derived and analyzed for continuity requirements up to both the first and second derivatives, and have been projected to higher level continuity requirements from the established pattern. The desired local control precision was obtained when a family of coordinate surfaces could be uniformly distributed without a consequent creation of flat spots on the coordinate curves transverse to the family. Relative to the uniform distribution, the family could be redistributed from an a priori distribution function or from a solution adaptive approach, both without distortion from the underlying transformation which may be independently chosen to fit a nontrivial geometry and topology.
[Value of the space perception test for evaluation of the aptitude for precision work in geodesy].
Remlein-Mozolewska, G
1982-01-01
The visual spatial localization ability of geodesy and cartography employees, and of pupils training for that profession, was examined. The examination was based on work duration and the time of its performance. A correlation was demonstrated between localization ability and the precision of the hand movements required in everyday work: the better the movement precision, the more efficient the visual spatial localization. Length of employment was not significant. The test proved highly useful in geodesy for qualifying workers for posts requiring good manual dexterity.
Access to finance from different finance provider types: Farmer knowledge of the requirements.
Wulandari, Eliana; Meuwissen, Miranda P M; Karmana, Maman H; Oude Lansink, Alfons G J M
2017-01-01
Analysing farmer knowledge of the requirements of finance providers can provide valuable insights to policy makers about ways to improve farmers' access to finance. This study compares farmer knowledge of the requirements to obtain finance with the actual requirements set by different finance provider types, and investigates the relation between demographic and socioeconomic factors and farmer knowledge of finance requirements. We use a structured questionnaire to collect data from a sample of finance providers and farmers in Java Island, Indonesia. We find that the most important requirements to acquire finance vary among different finance provider types. We also find that farmers generally have little knowledge of the requirements, which are important to each type of finance provider. Awareness campaigns are needed to increase farmer knowledge of the diversity of requirements among the finance provider types.
Garrido, Pilar; Aldaz, Azucena; Calleja, Miguel Ángel; De Álava, Enrique; Lamas, María Jesús; Martín, Miguel; Matías-Guiu, Xavier; Palacios, José; Vera, Ruth
2017-11-01
Precision medicine is an emerging approach to disease treatment and prevention that takes into account individual variability in genes, environment, and lifestyle. Precision medicine is transforming clinical and biomedical research, as well as health care itself, from both a conceptual and a methodological viewpoint, providing extraordinary opportunities to improve public health and lower the costs of the healthcare system. However, the implementation of precision medicine poses ethical-legal, regulatory, organizational, and knowledge-related challenges. Without a national strategy, precision medicine, which will be implemented one way or another, could proceed without the planning needed to guarantee technical quality and equal access of all citizens to best practices, violating the rights of patients and professionals and jeopardizing the solvency of the healthcare system. With this paper from the Spanish Societies of Medical Oncology (SEOM), Pathology (SEAP), and Hospital Pharmacy (SEFH), we highlight the need to institute a consensual national strategy for the development of precision medicine in our country, review the national and international context, comment on the opportunities and challenges for implementing precision medicine, and outline the objectives of a national strategy on precision medicine in cancer.
An Archaeoastronomical Adventure.
ERIC Educational Resources Information Center
Russo, Richard
1997-01-01
Describes investigations in archaeoastronomy that combine modern archaeology with the mathematical precision of practical astronomy. Helps students develop an understanding of a society's astronomical systems which can lead to a knowledge of their religion, art, mathematics, writings, calendar, myths, and agricultural practices. (JRH)
Quantitative Determination of Isotope Ratios from Experimental Isotopic Distributions
Kaur, Parminder; O’Connor, Peter B.
2008-01-01
Isotope variability due to natural processes provides important information for studying a variety of complex natural phenomena, from the origins of a particular sample to the traces of biochemical reaction mechanisms. These measurements require high-precision determination of the isotope ratios of the element involved. Isotope ratio mass spectrometers (IRMS) are widely employed for such high-precision analysis, but they have limitations. This work aims at overcoming the limitations inherent to IRMS by estimating the elemental isotopic abundance from the experimental isotopic distribution. In particular, a computational method has been derived that allows the calculation of 13C/12C ratios from whole isotopic distributions, given certain caveats, and these calculations are applied to several cases to demonstrate their utility. The limitations of the method in terms of the required number of ions and S/N ratio are discussed. For high-precision estimates of the isotope ratios, this method requires very precise measurement of the experimental isotopic distribution abundances, free from any artifacts introduced by noise, sample heterogeneity, or other experimental sources. PMID:17263354
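To make the idea concrete, under the simplifying assumption that carbon alone feeds the M+1 peak, the 13C/12C ratio falls straight out of the first two isotopologue abundances. This is a sketch of the principle, not the paper's full method:

```python
def c13_c12_ratio(i_m0, i_m1, n_carbons):
    """Estimate 13C/12C from the monoisotopic (M) and M+1 peak areas.

    For a molecule with n carbons and per-atom 13C probability p,
    I(M+1)/I(M) ~= n * p / (1 - p), so the 13C/12C ratio p/(1-p)
    is (I(M+1)/I(M)) / n.  Assumes carbon dominates the M+1 peak
    (contributions from 2H, 15N, 17O, etc. are neglected).
    """
    return (i_m1 / i_m0) / n_carbons

# Hypothetical molecule with 50 carbons and an M+1/M area ratio of 0.55:
print(c13_c12_ratio(1.0, 0.55, 50))   # ~0.011, near the natural ~0.0112
```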
Characterization of spacecraft and environmental disturbances on a SmallSat
NASA Technical Reports Server (NTRS)
Johnson, Thomas A.; Nguyen, Dung Phu Chi; Cuda, Vince; Freesland, Doug
1994-01-01
The objective of this study is to model the on-orbit vibration environment encountered by a SmallSat. Vibration control issues are common to the Earth observing, imaging, and microgravity communities. A spacecraft may contain dozens of support systems and instruments, each a potential source of vibration. The quality of payload data depends on constraining vibration so that parasitic disturbances do not affect the payload's pointing or microgravity requirement. In practice, payloads are designed around existing flight hardware, in many cases with unspecified vibration performance. Thus, payload designers require a thorough knowledge of existing mechanical devices and their associated disturbance levels. This study evaluates a SmallSat mission and seeks to answer basic questions concerning on-orbit vibration. Payloads were considered from the Earth observing, microgravity, and imaging communities. Candidate payload requirements were matched to spacecraft bus resources of present-day SmallSats. From the set of candidate payloads, the representative payload GLAS (Geoscience Laser Altimeter System) was selected. The requirements of GLAS were considered very stringent for the 150-500 kg class of payloads. Once the payload was selected, a generic SmallSat was designed to accommodate the payload requirements (weight, size, power, etc.). This study seeks to characterize the on-orbit vibration environment of a SmallSat designed for this type of mission and to determine whether a SmallSat can provide the precision pointing and jitter control required for Earth observing payloads.
A review of satellite time-transfer technology: Accomplishments and future applications
NASA Technical Reports Server (NTRS)
Cooper, R. S.; Chi, A. R.
1978-01-01
The research accomplishments by NASA in meeting the needs of the space program for precise time in satellite tracking are presented. As a major user of precise time signals for clock synchronization of its worldwide satellite tracking networks, the agency provides much of the necessary impetus for the development of stable frequency sources and time synchronization technology. The timing precision required for both satellite tracking and space science experiments has tightened at a rate of about one order of magnitude per decade, from 1 millisecond in the 1950s, to 100 microseconds during the Apollo era in the 1960s, to 10 microseconds in the 1970s. For the Tracking and Data Relay Satellite System, satellite timing requirements will be extended to 1 microsecond and below. These requirements are driven by spacecraft autonomy and data packetization.
Precision injection molding of freeform optics
NASA Astrophysics Data System (ADS)
Fang, Fengzhou; Zhang, Nan; Zhang, Xiaodong
2016-08-01
Precision injection molding is the most efficient mass production technology for manufacturing plastic optics. Applications of plastic optics in the fields of imaging, illumination, and concentration involve a variety of complex surface forms, developing from conventional plano and spherical surfaces to aspheric and freeform surfaces. This demands high optical quality with high form accuracy and low residual stresses, which challenges both the machining of optical tool inserts and the precision injection molding process. The present paper reviews recent progress in mold tool machining and precision injection molding, with more emphasis on the latter. The challenges and future development trends are also discussed.
1991-05-01
[...] the problem of the frequency drift is still open. In this context, the cavity pulling has drawn a lot of attention. Today, to our knowledge, [...] long-term maser frequency drift associated with the cavity pulling is a well-known subject due to the high level of precision obtainable in principle by [...] microprocessors. The frequency pulling due to microwave ΔM = ±1 transitions (Ramsey pulling) has been analyzed and shown to be important. Status of [...]
NASA Technical Reports Server (NTRS)
Bakhshiyan, B. T.; Nazirov, R. R.; Elyasberg, P. E.
1980-01-01
The problem of selecting the optimal filtering algorithm and the optimal composition of the measurements is examined, assuming that the precise values of the mathematical expectation and the error covariance matrix are unknown. It is demonstrated that the optimal filtering algorithm may be used to refine some parameters (for example, parameters of the gravitational field) after preliminary determination of the orbital elements by a simpler processing method (for example, the method of least squares).
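As a hedged sketch of the simpler baseline the abstract mentions, a linearized weighted least-squares correction to the orbital elements can be written in a few lines (generic, not the paper's algorithm):

```python
import numpy as np

def wls(H, y, R):
    """Weighted least squares for residuals y ~= H @ dx + noise with cov R.

    Returns the correction dx = (H^T R^-1 H)^-1 H^T R^-1 y and its
    formal covariance.
    """
    W = np.linalg.inv(R)
    N = H.T @ W @ H                       # normal matrix
    dx = np.linalg.solve(N, H.T @ W @ y)
    return dx, np.linalg.inv(N)

# Toy 2-parameter fit with three measurements of equal weight:
H = np.array([[1.0, 0.0], [1.0, 1.0], [1.0, 2.0]])
y = np.array([0.1, 1.9, 4.2])
dx, cov = wls(H, y, np.eye(3) * 0.01)
print(dx)
```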
NASA Astrophysics Data System (ADS)
Rakotondravohitra, Laza
2013-04-01
Current and future neutrino oscillation experiments depend on precise knowledge of neutrino-nucleus cross sections. MINERvA is a neutrino scattering experiment at Fermilab, designed to make precision measurements of low energy neutrino and antineutrino cross sections on a variety of different materials (plastic scintillator, C, Fe, Pb, He and H2O). In order to make these measurements, it is crucial that the detector is carefully calibrated. This talk will describe how MINERvA uses muons from upstream neutrino interactions as a calibration source to convert electronics output to absolute energy deposition.
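A minimal sketch of such a muon-based energy-scale fit: a generic one-parameter least squares, assuming the expected deposition per hit is known from the muon's path length and dE/dx. This is illustrative only, not MINERvA's actual procedure, and the numbers are invented.

```python
import numpy as np

# Hypothetical per-channel calibration data: raw ADC counts for muon hits
# and the expected energy deposition for each hit (from path length * dE/dx).
adc_counts = np.array([410.0, 395.0, 428.0, 402.0, 417.0])
expected_mev = np.array([1.95, 1.88, 2.04, 1.91, 1.99])

# One-parameter least squares through the origin: mev ~= gain * adc.
gain = np.sum(adc_counts * expected_mev) / np.sum(adc_counts**2)
print(f"gain = {gain * 1e3:.2f} MeV per 1000 ADC counts")
```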
NASA Astrophysics Data System (ADS)
Aguilar, M.; Aisa, D.; Alpat, B.; Alvino, A.; Ambrosi, G.; Andeen, K.; Arruda, L.; Attig, N.; Azzarello, P.; Bachlechner, A.; Barao, F.; Barrau, A.; Barrin, L.; Bartoloni, A.; Basara, L.; Battarbee, M.; Battiston, R.; Bazo, J.; Becker, U.; Behlmann, M.; Beischer, B.; Berdugo, J.; Bertucci, B.; Bigongiari, G.; Bindi, V.; Bizzaglia, S.; Bizzarri, M.; Boella, G.; de Boer, W.; Bollweg, K.; Bonnivard, V.; Borgia, B.; Borsini, S.; Boschini, M. J.; Bourquin, M.; Burger, J.; Cadoux, F.; Cai, X. D.; Capell, M.; Caroff, S.; Casaus, J.; Cascioli, V.; Castellini, G.; Cernuda, I.; Cerreta, D.; Cervelli, F.; Chae, M. J.; Chang, Y. H.; Chen, A. I.; Chen, H.; Cheng, G. M.; Chen, H. S.; Cheng, L.; Chou, H. Y.; Choumilov, E.; Choutko, V.; Chung, C. H.; Clark, C.; Clavero, R.; Coignet, G.; Consolandi, C.; Contin, A.; Corti, C.; Gil, E. Cortina; Coste, B.; Creus, W.; Crispoltoni, M.; Cui, Z.; Dai, Y. M.; Delgado, C.; Della Torre, S.; Demirköz, M. B.; Derome, L.; Di Falco, S.; Di Masso, L.; Dimiccoli, F.; Díaz, C.; von Doetinchem, P.; Donnini, F.; Du, W. J.; Duranti, M.; D'Urso, D.; Eline, A.; Eppling, F. J.; Eronen, T.; Fan, Y. Y.; Farnesini, L.; Feng, J.; Fiandrini, E.; Fiasson, A.; Finch, E.; Fisher, P.; Galaktionov, Y.; Gallucci, G.; García, B.; García-López, R.; Gargiulo, C.; Gast, H.; Gebauer, I.; Gervasi, M.; Ghelfi, A.; Gillard, W.; Giovacchini, F.; Goglov, P.; Gong, J.; Goy, C.; Grabski, V.; Grandi, D.; Graziani, M.; Guandalini, C.; Guerri, I.; Guo, K. H.; Haas, D.; Habiby, M.; Haino, S.; Han, K. C.; He, Z. H.; Heil, M.; Hoffman, J.; Hsieh, T. H.; Huang, Z. C.; Huh, C.; Incagli, M.; Ionica, M.; Jang, W. Y.; Jinchi, H.; Kanishev, K.; Kim, G. N.; Kim, K. S.; Kirn, Th.; Kossakowski, R.; Kounina, O.; Kounine, A.; Koutsenko, V.; Krafczyk, M. S.; La Vacca, G.; Laudi, E.; Laurenti, G.; Lazzizzera, I.; Lebedev, A.; Lee, H. T.; Lee, S. C.; Leluc, C.; Levi, G.; Li, H. L.; Li, J. Q.; Li, Q.; Li, Q.; Li, T. X.; Li, W.; Li, Y.; Li, Z. H.; Li, Z. Y.; Lim, S.; Lin, C. H.; Lipari, P.; Lippert, T.; Liu, D.; Liu, H.; Lolli, M.; Lomtadze, T.; Lu, M. J.; Lu, S. Q.; Lu, Y. S.; Luebelsmeyer, K.; Luo, J. Z.; Lv, S. S.; Majka, R.; Mañá, C.; Marín, J.; Martin, T.; Martínez, G.; Masi, N.; Maurin, D.; Menchaca-Rocha, A.; Meng, Q.; Mo, D. C.; Morescalchi, L.; Mott, P.; Müller, M.; Ni, J. Q.; Nikonov, N.; Nozzoli, F.; Nunes, P.; Obermeier, A.; Oliva, A.; Orcinha, M.; Palmonari, F.; Palomares, C.; Paniccia, M.; Papi, A.; Pauluzzi, M.; Pedreschi, E.; Pensotti, S.; Pereira, R.; Picot-Clemente, N.; Pilo, F.; Piluso, A.; Pizzolotto, C.; Plyaskin, V.; Pohl, M.; Poireau, V.; Postaci, E.; Putze, A.; Quadrani, L.; Qi, X. M.; Qin, X.; Qu, Z. Y.; Räihä, T.; Rancoita, P. G.; Rapin, D.; Ricol, J. S.; Rodríguez, I.; Rosier-Lees, S.; Rozhkov, A.; Rozza, D.; Sagdeev, R.; Sandweiss, J.; Saouter, P.; Sbarra, C.; Schael, S.; Schmidt, S. M.; von Dratzig, A. Schulz; Schwering, G.; Scolieri, G.; Seo, E. S.; Shan, B. S.; Shan, Y. H.; Shi, J. Y.; Shi, X. Y.; Shi, Y. M.; Siedenburg, T.; Son, D.; Spada, F.; Spinella, F.; Sun, W.; Sun, W. H.; Tacconi, M.; Tang, C. P.; Tang, X. W.; Tang, Z. C.; Tao, L.; Tescaro, D.; Ting, Samuel C. C.; Ting, S. M.; Tomassetti, N.; Torsti, J.; Türkoǧlu, C.; Urban, T.; Vagelli, V.; Valente, E.; Vannini, C.; Valtonen, E.; Vaurynovich, S.; Vecchi, M.; Velasco, M.; Vialle, J. P.; Vitale, V.; Vitillo, S.; Wang, L. Q.; Wang, N. H.; Wang, Q. L.; Wang, R. S.; Wang, X.; Wang, Z. X.; Weng, Z. L.; Whitman, K.; Wienkenhöver, J.; Wu, H.; Wu, X.; Xia, X.; Xie, M.; Xie, S.; Xiong, R. Q.; Xin, G. M.; Xu, N. 
S.; Xu, W.; Yan, Q.; Yang, J.; Yang, M.; Ye, Q. H.; Yi, H.; Yu, Y. J.; Yu, Z. Q.; Zeissler, S.; Zhang, J. H.; Zhang, M. T.; Zhang, X. B.; Zhang, Z.; Zheng, Z. M.; Zhuang, H. L.; Zhukov, V.; Zichichi, A.; Zimmermann, N.; Zuccon, P.; Zurbach, C.; AMS Collaboration
2015-05-01
A precise measurement of the proton flux in primary cosmic rays with rigidity (momentum/charge) from 1 GV to 1.8 TV is presented based on 300 million events. Knowledge of the rigidity dependence of the proton flux is important in understanding the origin, acceleration, and propagation of cosmic rays. We present the detailed variation with rigidity of the flux spectral index for the first time. The spectral index progressively hardens at high rigidities.
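The "spectral index" here is the local power-law slope gamma in flux ∝ R^gamma; extracting it from a tabulated flux is essentially a derivative on log-log axes. The numbers below are synthetic, not AMS data:

```python
import numpy as np

# Synthetic flux ~ R^gamma with gamma = -2.7, plus 5% scatter (illustrative).
rigidity = np.logspace(0, 3, 30)   # 1 GV to 1 TV
flux = rigidity**-2.7 * (1 + 0.05 * np.random.default_rng(1).normal(size=30))

# Local spectral index: gamma(R) = d(log flux) / d(log R).
gamma = np.gradient(np.log(flux), np.log(rigidity))
print(gamma.mean())   # ~ -2.7; hardening shows up as gamma rising with R
```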
NASA Astrophysics Data System (ADS)
Mazarico, E.; Goossens, S. J.; Barker, M. K.; Neumann, G. A.; Zuber, M. T.; Smith, D. E.
2017-12-01
Two recent NASA missions to the Moon, the Lunar Reconnaissance Orbiter (LRO) and the Gravity Recovery and Interior Laboratory (GRAIL), have obtained highly accurate information about the lunar shape and gravity field. These global geodetic datasets resolve long-standing issues with mission planning; the tidal lock of the Moon long prevented collection of accurate gravity measurements over the farside and degraded the precise positioning of topographic data. We describe key datasets and results from the LRO and GRAIL missions that are directly relevant to future lunar missions. SmallSat and CubeSat missions especially would benefit from these recent improvements, as they are typically more resource-constrained. Even with limited radio tracking data, accurate knowledge of topography and gravity enables precise orbit determination (OD) (e.g., limiting the scope of geolocation and co-registration tasks) and long-term predictions of altitude (e.g., dramatically reducing uncertainties in impact time). With one S-band tracking pass per day, LRO OD now routinely achieves total position knowledge better than 10 meters and radial position knowledge around 0.5 meter. Other tracking data, such as Laser Ranging from Earth-based SLR stations, can further support OD. We also show how altimetry can be used to substantially improve orbit reconstruction with the accurate topographic maps now available from Lunar Orbiter Laser Altimeter (LOLA) data. We present new results with SELENE extended mission and LRO orbits processed with direct altimetry measurements. With even a simple laser altimeter onboard, high-quality OD can be achieved for future missions because of the datasets acquired by LRO and GRAIL, without the need for regular radio contact. Onboard processing of altimetric ranges would bring high-quality real-time position knowledge to support autonomous operation. We also describe why optical ranging transponders are ideal payloads for future lunar missions, as they can address both communication and navigation needs with few resources.
NASA Technical Reports Server (NTRS)
Gordon, T. E.
1995-01-01
The mirror assembly of the AXAF observatory consists of four concentric, confocal, Wolter Type 1 telescopes. Each telescope includes two conical grazing incidence mirrors, a paraboloid followed by a hyperboloid. Fabrication of these state-of-the-art optics is now complete, with predicted performance that surpasses the goals of the program. The fabrication of these optics, whose size and requirements exceed those of any previous x-ray mirrors, was a challenging task requiring precision engineering in many different forms; virtually all of the equipment used for this effort demanded it. Accurate metrology required deterministic support of the mirrors in order to model the gravity distortions which will not be present on orbit. The primary axial instrument, known as the Precision Metrology Station (PMS), was a unique scanning Fizeau interferometer. After metrology was complete, the optics were placed in specially designed Glass Support Fixtures (GSFs) for installation on the Automated Cylindrical Grinder/Polishers (ACG/Ps). The GSFs were custom molded for each mirror element to match the shape of the outer surface and thereby minimize distortions of the inner surface. The final performance of the telescope is expected to far exceed the original goals and expectations of the program.
The Challenges of Precision Medicine in COPD.
Cazzola, Mario; Calzetta, Luigino; Rogliani, Paola; Matera, Maria Gabriella
2017-08-01
Pheno-/endotyping chronic obstructive pulmonary disease (COPD) is important because it provides patients with precise and personalized medicine. The central concept of precision medicine is to take individual variability into account when making management decisions. Precision medicine should ensure that patients get the right treatment at the right dose at the right time, with minimum harmful consequences and maximum efficacy. Ideally, we should search for genetic and molecular biomarker-based profiles. Given the clinical complexity of COPD, it seems likely that a panel of several biomarkers will be required to characterize pathogenetic factors and their course over time. The need for biomarkers to guide the clinical care of individuals with COPD and to enhance the possibilities of success in drug development is clear and urgent, but biomarker development is tremendously challenging and expensive, and translation of research efforts to date has been largely ineffective. Furthermore, the development of personalized treatments will require a much more detailed understanding of the clinical and biological heterogeneity of COPD. We are therefore still far from being able to apply precision medicine in COPD; the treatable-traits and FEV1-free approaches are attempts at precision medicine in COPD that must still be considered quite unsophisticated.
Extracting genetic alteration information for personalized cancer therapy from ClinicalTrials.gov
Xu, Jun; Lee, Hee-Jin; Zeng, Jia; Wu, Yonghui; Zhang, Yaoyun; Huang, Liang-Chin; Johnson, Amber; Holla, Vijaykumar; Bailey, Ann M; Cohen, Trevor; Meric-Bernstam, Funda; Bernstam, Elmer V
2016-01-01
Objective: Clinical trials investigating drugs that target specific genetic alterations in tumors are important for promoting personalized cancer therapy. The goal of this project is to create a knowledge base of cancer treatment trials with annotations about genetic alterations from ClinicalTrials.gov. Methods: We developed a semi-automatic framework that combines advanced text-processing techniques with manual review to curate genetic alteration information in cancer trials. The framework consists of a document classification system to identify cancer treatment trials from ClinicalTrials.gov and an information extraction system to extract gene and alteration pairs from the Title and Eligibility Criteria sections of clinical trials. By applying the framework to trials at ClinicalTrials.gov, we created a knowledge base of cancer treatment trials with genetic alteration annotations. We then evaluated each component of the framework against manually reviewed sets of clinical trials and generated descriptive statistics of the knowledge base. Results and Discussion: The automated cancer treatment trial identification system achieved a high precision of 0.9944. Together with the manual review process, it identified 20 193 cancer treatment trials from ClinicalTrials.gov. The automated gene-alteration extraction system achieved a precision of 0.8300 and a recall of 0.6803. After validation by manual review, we generated a knowledge base of 2024 cancer trials that are labeled with specific genetic alteration information. Analysis of the knowledge base revealed the trend of increased use of targeted therapy for cancer, as well as top frequent gene-alteration pairs of interest. We expect this knowledge base to be a valuable resource for physicians and patients who are seeking information about personalized cancer therapy. PMID:27013523
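For flavor, the extraction component's core task, pulling gene-alteration pairs such as "EGFR L858R" out of eligibility text, can be caricatured with a single regular expression. This is a toy pattern, far simpler than the paper's system, which uses gene dictionaries and many more variant formats:

```python
import re

# Toy extractor: a gene symbol followed by a protein-style variant token.
PATTERN = re.compile(r"\b([A-Z][A-Z0-9]{1,9})\s+([A-Z]\d{1,4}[A-Z])\b")

text = "Patients must have EGFR L858R or BRAF V600E mutated tumors."
print(PATTERN.findall(text))   # [('EGFR', 'L858R'), ('BRAF', 'V600E')]
```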
Ferguson, Michael A.D.; Messier, François
1997-01-01
Aboriginal peoples want their ecological knowledge used in the management of wildlife populations. To accomplish this, management agencies will need regional summaries of aboriginal knowledge about long-term changes in the distribution and abundance of wildlife populations and ecological factors that influence those changes. Between 1983 and 1994, we developed a method for collecting Inuit knowledge about historical changes in a caribou (Rangifer tarandus) population on southern Baffin Island from c. 1900 to 1994. Advice from Inuit allowed us to collect and interpret their oral knowledge in culturally appropriate ways. Local Hunters and Trappers Associations (HTAs) and other Inuit identified potential informants to maximize the spatial and temporal scope of the study. In the final interview protocol, each informant (i) established his biographical map and time line, (ii) described changes in caribou distribution and density during his life, and (iii) discussed ecological factors that may have caused changes in caribou populations. Personal and parental observations of caribou distribution and abundance were reliable and precise. Inuit who had hunted caribou during periods of scarcity provided more extensive information than those hunters who had hunted mainly ringed seals (Phoca hispida); nevertheless, seal hunters provided information about coastal areas where caribou densities were insufficient for the needs of caribou hunters. The wording of our questions influenced the reliability of informants' answers; leading questions were especially problematic. We used only information that we considered reliable after analyzing the wording of both questions and answers from translated transcripts. This analysis may have excluded some reliable information because informants tended to understate certainty in their recollections. We tried to retain the accuracy and precision inherent in Inuit oral traditions; comparisons of information from several informants and comparisons with published and archival historical reports indicate that we retained these qualities of Inuit knowledge.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pellin, M. J.; Veryovkin, I. V.; Levine, J.
2010-01-01
There are four generally mutually exclusive requirements that plague many mass spectrometric measurements of trace constituents: (1) the small size (limited by the depth probed) of many interesting materials requires high useful yields to simply detect some trace elements, (2) the low concentrations of interesting elements require efficient discrimination from isobaric interferences, (3) it is often necessary to measure the depth distribution of elements with high surface and low bulk contributions, and (4) many applications require precise isotopic analysis. Resonant ionization mass spectrometry has made dramatic progress in addressing these difficulties over the past five years.
AbdelMalik, Philip; Boulos, Maged N Kamel; Jones, Ray
2008-01-01
Background The "place-consciousness" of public health professionals is on the rise as spatial analyses and Geographic Information Systems (GIS) are rapidly becoming key components of their toolbox. However, "place" is most useful at its most precise, granular scale – which increases identification risks, thereby clashing with privacy issues. This paper describes the views and requirements of public health professionals in Canada and the UK on privacy issues and spatial data, as collected through a web-based survey. Methods Perceptions on the impact of privacy were collected through a web-based survey administered between November 2006 and January 2007. The survey targeted government, non-government and academic GIS labs and research groups involved in public health, as well as public health units (Canada), ministries, and observatories (UK). Potential participants were invited to participate through personally addressed, standardised emails. Results Of 112 invitees in Canada and 75 in the UK, 66 and 28 participated in the survey, respectively. The completion proportion for Canada was 91%, and 86% for the UK. No response differences were observed between the two countries. Ninety three percent of participants indicated a requirement for personally identifiable data (PID) in their public health activities, including geographic information. Privacy was identified as an obstacle to public health practice by 71% of respondents. The overall self-rated median score for knowledge of privacy legislation and policies was 7 out of 10. Those who rated their knowledge of privacy as high (at the median or above) also rated it significantly more severe as an obstacle to research (P < 0.001). The most critical cause cited by participants in both countries was bureaucracy. Conclusion The clash between PID requirements – including granular geography – and limitations imposed by privacy and its associated bureaucracy require immediate attention and solutions, particularly given the increasing utilisation of GIS in public health. Solutions include harmonization of privacy legislation with public health requirements, bureaucratic simplification, increased multidisciplinary discourse, education, and development of toolsets, algorithms and guidelines for using and reporting on disaggregate data. PMID:18471295
Robotics: A New Challenge For Industrial Arts.
ERIC Educational Resources Information Center
Lovedahl, Gerald G.
1983-01-01
The author argues that jobs in the future will depend less on manual skill and more on perceptual aptitude, formal knowledge, and precision. Industrial arts classes must include robotics in their curriculum if they intend to reflect accurately American industry. (Author/SSH)
Gene regulation knowledge commons: community action takes care of DNA binding transcription factors
Tripathi, Sushil; Vercruysse, Steven; Chawla, Konika; Christie, Karen R.; Blake, Judith A.; Huntley, Rachael P.; Orchard, Sandra; Hermjakob, Henning; Thommesen, Liv; Lægreid, Astrid; Kuiper, Martin
2016-01-01
A large gap remains between the amount of knowledge in the scientific literature and the fraction that gets curated into standardized databases, despite many curation initiatives. Yet the availability of comprehensive knowledge in databases is crucial for exploiting existing background knowledge, both for designing follow-up experiments and for interpreting new experimental data. Structured resources also underpin the computational integration and modeling of regulatory pathways, which further aids our understanding of regulatory dynamics. We argue that cooperation between the scientific community and professional curators can increase the capacity to capture precise knowledge from the literature. We demonstrate this with a project in which we mobilize biological domain experts who curate large amounts of DNA binding transcription factors, and show that they, although new to the field of curation, can make valuable contributions by harvesting reported knowledge from scientific papers. Such community curation can enhance the scientific epistemic process. Database URL: http://www.tfcheckpoint.org PMID:27270715
Fundamentals of endoscopic surgery: creation and validation of the hands-on test.
Vassiliou, Melina C; Dunkin, Brian J; Fried, Gerald M; Mellinger, John D; Trus, Thadeus; Kaneva, Pepa; Lyons, Calvin; Korndorffer, James R; Ujiki, Michael; Velanovich, Vic; Kochman, Michael L; Tsuda, Shawn; Martinez, Jose; Scott, Daniel J; Korus, Gary; Park, Adrian; Marks, Jeffrey M
2014-03-01
The Fundamentals of Endoscopic Surgery™ (FES) program consists of online materials and didactic and skills-based tests. All components were designed to measure the skills and knowledge required to perform safe flexible endoscopy. The purpose of this multicenter study was to evaluate the reliability and validity of the hands-on component of the FES examination, and to establish the pass score. Expert endoscopists identified the critical skill set required for flexible endoscopy. These skills were then modeled in a virtual reality simulator (GI Mentor™ II, Simbionix™ Ltd., Airport City, Israel) to create five tasks and metrics. Scores were designed to measure both speed and precision. Validity evidence was assessed by correlating performance with self-reported endoscopic experience (surgeons and gastroenterologists [GIs]). Internal consistency of each test task was assessed using Cronbach's alpha. Test-retest reliability was determined by having the same participant perform the test a second time and comparing the scores. Passing scores were determined by a contrasting-groups methodology and use of receiver operating characteristic curves. A total of 160 participants (17% GIs) performed the simulator test. Scores on the five tasks showed good internal consistency reliability, and all had significant correlations with endoscopic experience. Total FES scores correlated 0.73 with participants' level of endoscopic experience, providing evidence of their validity, and their internal consistency reliability (Cronbach's alpha) was 0.82. Test-retest reliability was assessed in 11 participants, and the intraclass correlation was 0.85. The passing score was determined and is estimated to have a sensitivity (true positive rate) of 0.81 and a 1-specificity (false positive rate) of 0.21. The FES hands-on skills test examines the basic procedural components required to perform safe flexible endoscopy. It meets the rigorous standards of reliability and validity required for high-stakes examinations and, together with the knowledge component, may help contribute to the definition and determination of competence in endoscopy.
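The internal consistency statistic used here is Cronbach's alpha; computing it from a participants-by-tasks score matrix takes only a few lines (generic formula, not the study's code).

```python
import numpy as np

def cronbach_alpha(scores):
    """Cronbach's alpha for a (participants x items) score matrix.

    alpha = k/(k-1) * (1 - sum of item variances / variance of total score)
    """
    scores = np.asarray(scores, dtype=float)
    k = scores.shape[1]
    item_var = scores.var(axis=0, ddof=1).sum()
    total_var = scores.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_var / total_var)

# Hypothetical scores: 3 participants, 5 simulator tasks.
print(cronbach_alpha([[7, 8, 6, 7, 8], [5, 6, 5, 6, 5], [9, 9, 8, 9, 9]]))
```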
Clinical professional governance for detailed clinical models.
Goossen, William; Goossen-Baremans, Anneke
2013-01-01
This chapter describes the need for Detailed Clinical Models for contemporary electronic health systems, data exchange, and data reuse. It starts with an explanation of the components related to Detailed Clinical Models, with a brief summary of knowledge representation, including terminologies representing clinically relevant "things" in the real world, and information models that abstract these in order to let computers process data about them. Next, Detailed Clinical Models are defined and their purpose is described. The work builds on existing developments around the world and culminates in current work to create a technical specification at the level of the International Organization for Standardization. The core components of properly expressed Detailed Clinical Models are illustrated, including clinical knowledge and context, data element specification, code bindings to terminologies, and meta-information about authors and versioning, among others. Detailed Clinical Models to date are heavily based on user requirements and specify the conceptual and logical levels of modelling. This is not precise enough for specific implementations, which require an additional step; however, it allows Detailed Clinical Models to serve as specifications for many different kinds of implementations. Examples of Detailed Clinical Models are presented both in text and in Unified Modelling Language. Detailed Clinical Models can be positioned in health information architectures, where they serve at the most detailed granular level. The chapter ends with examples of projects that create and deploy Detailed Clinical Models. All have in common that they can often reuse materials from earlier projects, and that strict governance of these models is essential to use them safely in health care information and communication technology. Clinical validation is one point of such governance, and model testing another. The Plan-Do-Check-Act cycle can be applied for governance of Detailed Clinical Models. Finally, collections of clinical models require a repository in which they can be stored, searched, and maintained. Governance of Detailed Clinical Models is required at local, national, and international levels.
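As a purely illustrative rendering (not the ISO specification), the core components named above, a data element with its units and terminology bindings, can be pictured as a small structure. The class shape and field names are assumptions; the example codes are the standard heart-rate concepts in SNOMED CT and LOINC.

```python
from dataclasses import dataclass, field
from typing import Dict, Optional

@dataclass
class DataElement:
    """One data element of a Detailed Clinical Model (illustrative only)."""
    name: str
    value_type: str                      # e.g. "quantity" or "coded text"
    unit: Optional[str] = None
    code_bindings: Dict[str, str] = field(default_factory=dict)  # terminology -> code

heart_rate = DataElement(
    name="heart rate",
    value_type="quantity",
    unit="beats/min",
    code_bindings={"SNOMED CT": "364075005", "LOINC": "8867-4"},
)
```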
Systems and synthetic biology approaches to alter plant cell walls and reduce biomass recalcitrance
Kalluri, Udaya C.; Yin, Hengfu; Yang, Xiaohan; ...
2014-11-03
Fine-tuning plant cell wall properties to render plant biomass more amenable to biofuel conversion is a colossal challenge. A deep knowledge of the biosynthesis and regulation of plant cell wall and a high-precision genome engineering toolset are the two essential pillars of efforts to alter plant cell walls and reduce biomass recalcitrance. The past decade has seen a meteoric rise in use of transcriptomics and high-resolution imaging methods resulting in fresh insights into composition, structure, formation and deconstruction of plant cell walls. Subsequent gene manipulation approaches, however, commonly include ubiquitous mis-expression of a single candidate gene in a host that carries an intact copy of the native gene. The challenges posed by pleiotropic and unintended changes resulting from such an approach are moving the field towards synthetic biology approaches. Finally, synthetic biology builds on a systems biology knowledge base and leverages high-precision tools for high-throughput assembly of multigene constructs and pathways, precision genome editing and site-specific gene stacking, silencing and/or removal. Here, we summarize the recent breakthroughs in biosynthesis and remodelling of major secondary cell wall components, assess the impediments in obtaining a systems-level understanding and explore the potential opportunities in leveraging synthetic biology approaches to reduce biomass recalcitrance.
Zhu, Zhou; Ihle, Nathan T; Rejto, Paul A; Zarrinkar, Patrick P
2016-06-13
Genome-scale functional genomic screens across large cell line panels provide a rich resource for discovering tumor vulnerabilities that can lead to the next generation of targeted therapies. Analysis of these data has typically focused on identifying genes whose knockdown enhances response in various pre-defined genetic contexts, an approach limited by biological complexity as well as the incompleteness of our knowledge. We thus introduce a complementary data mining strategy to identify genes with exceptional sensitivity in subsets, or outlier groups, of cell lines, allowing an unbiased analysis without any a priori assumption about the underlying biology of dependency. Genes with outlier features are strongly and specifically enriched with those known to be associated with cancer and relevant biological processes, despite no a priori knowledge being used to drive the analysis. Identification of exceptional responders (outliers) may lead not only to new candidates for therapeutic intervention, but also to tumor indications and response biomarkers for companion precision medicine strategies. Several tumor suppressors have an outlier sensitivity pattern, supporting and generalizing the notion that tumor suppressors can play context-dependent oncogenic roles. The novel application of outlier analysis described here demonstrates a systematic and data-driven analytical strategy for deciphering large-scale functional genomic data for oncology target and precision medicine discoveries.
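One common way to operationalize "exceptional sensitivity" is a robust modified z-score per gene across the cell line panel; a sketch under that assumption (the cutoff and scoring direction are illustrative, not the paper's exact statistic):

```python
import numpy as np

def outlier_lines(sensitivity, z_cut=3.5):
    """Flag cell lines exceptionally sensitive to one gene's knockdown.

    sensitivity: 1-D array, one score per cell line (lower = more sensitive).
    A median/MAD-based modified z-score keeps a small outlier group from
    inflating the spread estimate.
    """
    med = np.median(sensitivity)
    mad = np.median(np.abs(sensitivity - med))
    z = 0.6745 * (sensitivity - med) / mad
    return np.flatnonzero(z < -z_cut)     # indices of exceptional responders

print(outlier_lines(np.array([0.01, 0.02, -0.01, 0.0, -0.95, 0.03])))  # -> [4]
```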
Star Tracker Performance Estimate with IMU
NASA Technical Reports Server (NTRS)
Aretskin-Hariton, Eliot D.; Swank, Aaron J.
2015-01-01
A software tool for estimating cross-boresight error of a star tracker combined with an inertial measurement unit (IMU) was developed to support trade studies for the Integrated Radio and Optical Communication project (iROC) at the National Aeronautics and Space Administration Glenn Research Center. Typical laser communication systems, such as the Lunar Laser Communication Demonstration (LLCD) and the Laser Communication Relay Demonstration (LCRD), use a beacon to locate ground stations. iROC is investigating the use of beaconless precision laser pointing to enable laser communication at Mars orbits and beyond. Precision attitude knowledge is essential to the iROC mission to enable high-speed steering of the optical link. The preliminary concept to achieve this precision attitude knowledge is to use star trackers combined with an IMU. The Star Tracker Accuracy (STAcc) software was developed to rapidly assess the capabilities of star tracker and IMU configurations. STAcc determines the overall cross-boresight error of a star tracker with an IMU given the characteristic parameters: quantum efficiency, aperture, apparent star magnitude, exposure time, field of view, photon spread, detector pixels, spacecraft slew rate, maximum stars used for quaternion estimation, and IMU angular random walk. This paper discusses the supporting theory used to construct STAcc, verification of the program and sample results.
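A rough stand-in for the kind of budget such a tool evaluates (an assumed error model, not STAcc itself): star-tracker cross-boresight noise averages down with the number of centroided stars, while the IMU's angular random walk grows between tracker updates.

```python
import numpy as np

def attitude_error_arcsec(centroid_err_arcsec, n_stars, arw_deg_rt_hr, gap_s):
    """Combined attitude knowledge error between star-tracker updates.

    Tracker term: per-star centroid error averaged over n_stars.
    IMU term: angular random walk (deg/sqrt(hr)) integrated over gap_s
    seconds, converted to arcsec.  Terms are combined in quadrature.
    """
    tracker = centroid_err_arcsec / np.sqrt(n_stars)
    arw = arw_deg_rt_hr * 3600.0 * np.sqrt(gap_s / 3600.0)
    return np.hypot(tracker, arw)

# 5 arcsec centroids, 25 stars, 0.01 deg/rt-hr ARW, 10 s update gap:
print(attitude_error_arcsec(5.0, 25, 0.01, 10.0))   # ~2 arcsec, illustrative
```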
Scale and the evolutionarily based approximate number system: an exploratory study
NASA Astrophysics Data System (ADS)
Delgado, Cesar; Jones, M. Gail; You, Hye Sun; Robertson, Laura; Chesnutt, Katherine; Halberda, Justin
2017-05-01
Crosscutting concepts such as scale, proportion, and quantity are recognised by U.S. science standards as a potential vehicle for students to integrate their scientific and mathematical knowledge; yet, U.S. students and adults trail their international peers in scale and measurement estimation. Culturally based knowledge of scale such as measurement units may be built on evolutionarily-based systems of number such as the approximate number system (ANS), which processes approximate representations of numerical magnitude. ANS is related to mathematical achievement in pre-school and early elementary students, but there is little research on ANS among older students or in science-related areas such as scale. Here, we investigate the relationship between ANS precision in public school U.S. seventh graders and their accuracy estimating the length of standard units of measurement in SI and U.S. customary units. We also explored the relationship between ANS and science and mathematics achievement. Accuracy estimating the metre was positively and significantly related to ANS precision. Mathematics achievement, science achievement, and accuracy estimating other units were not significantly related to ANS. We thus suggest that ANS precision may be related to mathematics understanding beyond arithmetic, beyond the early school years, and to the crosscutting concepts of scale, proportion, and quantity.
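ANS precision is conventionally summarized by a Weber fraction w, and the standard psychophysical model predicts the error rate on a numerosity comparison from w alone. A textbook-style sketch, not the study's analysis code:

```python
from math import erfc, sqrt

def p_error(n1, n2, w):
    """Predicted error rate when comparing numerosities n1 and n2.

    Standard ANS model: internal estimates are Gaussian with s.d. w*n,
    so the chance of ordering them wrongly is
    0.5 * erfc(|n1 - n2| / (sqrt(2) * w * sqrt(n1**2 + n2**2))).
    """
    return 0.5 * erfc(abs(n1 - n2) / (sqrt(2.0) * w * sqrt(n1**2 + n2**2)))

# A sharper ANS (smaller w) errs less on a 10-vs-12 trial:
print(p_error(10, 12, 0.15), p_error(10, 12, 0.30))   # ~0.20 vs ~0.34
```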
de Vries, Reinout E; Bakker-Pieper, Angelique; Oostenveld, Wyneke
2010-09-01
PURPOSE: The purpose of this study was to investigate the relations between leaders' communication styles and charismatic leadership, human-oriented leadership (leader's consideration), task-oriented leadership (leader's initiating structure), and leadership outcomes. METHODOLOGY: A survey was conducted among 279 employees of a governmental organization. The following six main communication styles were operationalized: verbal aggressiveness, expressiveness, preciseness, assuredness, supportiveness, and argumentativeness. Regression analyses were employed to test three main hypotheses. FINDINGS: In line with expectations, the study showed that charismatic and human-oriented leadership are mainly communicative, while task-oriented leadership is significantly less communicative. The communication styles were strongly and differentially related to knowledge sharing behaviors, perceived leader performance, satisfaction with the leader, and subordinate's team commitment. Multiple regression analyses showed that the leadership styles mediated the relations between the communication styles and leadership outcomes. However, leader's preciseness explained variance in perceived leader performance and satisfaction with the leader above and beyond the leadership style variables. IMPLICATIONS: This study offers potentially invaluable input for leadership training programs by showing the importance of leader's supportiveness, assuredness, and preciseness when communicating with subordinates. ORIGINALITY/VALUE: Although one of the core elements of leadership is interpersonal communication, this study is one of the first to use a comprehensive communication styles instrument in the study of leadership.
Quantum metrology and estimation of Unruh effect
Wang, Jieci; Tian, Zehua; Jing, Jiliang; Fan, Heng
2014-01-01
We study quantum metrology for a pair of entangled Unruh-DeWitt detectors when one of them is accelerated and coupled to a massless scalar field. Compared with previous schemes, our model requires only local interaction and avoids the use of cavities in the probe state preparation process. We show that the probe state preparation and the interaction between the accelerated detector and the external field have significant effects on the value of the quantum Fisher information, and correspondingly set different ultimate limits on the precision of the estimation of the Unruh effect. We find that the precision of the estimation can be improved by a larger effective coupling strength and a longer interaction time. Alternatively, there is a range of the detector's energy gap that provides better precision. Thus we may adjust those parameters to attain a higher precision in the estimation. We also find that an extremely high acceleration is not required in the quantum metrology process. PMID:25424772
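The figure of merit throughout is the quantum Fisher information, which bounds the estimation precision through the quantum Cramér-Rao inequality. In standard textbook form (a general statement, not a result of this paper):

```latex
% Quantum Cramer-Rao bound for M independent repetitions:
\mathrm{Var}(\hat{\theta}) \;\ge\; \frac{1}{M\,F_Q(\theta)},
% and for a pure probe state |\psi_\theta\rangle the quantum Fisher
% information reduces to
F_Q(\theta) \;=\; 4\left( \langle \partial_\theta \psi \,|\, \partial_\theta \psi \rangle
  - \big| \langle \psi_\theta \,|\, \partial_\theta \psi \rangle \big|^{2} \right).
```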
Concepts and analysis for precision segmented reflector and feed support structures
NASA Technical Reports Server (NTRS)
Miller, Richard K.; Thomson, Mark W.; Hedgepeth, John M.
1990-01-01
Several issues surrounding the design of a large (20-meter diameter) Precision Segmented Reflector are investigated. The concerns include development of a reflector support truss geometry that will permit deployment into the required doubly-curved shape without significant member strains. For deployable and erectable reflector support trusses, the reduction of structural redundancy was analyzed to achieve reduced weight and complexity for the designs. The stiffness and accuracy of such reduced-member trusses, however, were found to be affected to an unexpected degree. The Precision Segmented Reflector designs were developed with performance requirements that represent the Reflector application. A novel deployable sunshade concept was developed, and a detailed parametric study of various feed support structural concepts was performed. The results of the detailed study reveal what may be the most desirable feed support structure geometry for Precision Segmented Reflector/Large Deployable Reflector applications.
Establishment of National Gravity Base Network of Iran
NASA Astrophysics Data System (ADS)
Hatam Chavari, Y.; Bayer, R.; Hinderer, J.; Ghazavi, K.; Sedighi, M.; Luck, B.; Djamour, Y.; Le Moign, N.; Saadat, R.; Cheraghi, H.
2009-04-01
A gravity base network is a set of benchmarks uniformly distributed across a country at which the absolute gravity values are known to the best accessible accuracy. The gravity at the benchmark stations is either measured directly with absolute devices or transferred by gravity-difference measurements made with gravimeters from known stations. To decrease the accumulation of random measuring errors arising from these transfers, the number of base stations distributed across the country should be as small as possible. This is feasible if the stations are selected near the national airports: although long distances apart, they can then be reached quickly and measured with a gravimeter carried by airplane between the stations. To illustrate the importance of such a network, various applications of a gravity base network are first reviewed. A gravity base network is the reference frame required for establishing 1st-, 2nd- and 3rd-order gravity networks. Such a gravity network is used for the following purposes: a. Mapping the structure of the upper crust in geological maps; the required accuracy for the measured gravity values is about 0.2 to 0.4 mGal. b. Oil and mineral exploration; the required accuracy is about 5 µGal. c. Geotechnical studies in mining areas for exploring underground cavities, as well as archaeological studies; the required accuracy is about 5 µGal or better. d. Subsurface water resource exploration and mapping of the crustal layers that absorb it; an accuracy at the same level as the previous applications is required here too. e. Studying the tectonics of the Earth's crust; repeated precise gravity measurements at the network stations can assist in identifying systematic height changes, and an accuracy of the order of 5 µGal or better is required. f. Studying volcanoes and their evolution; repeated precise gravity measurements at the network stations can provide valuable information on the gradual upward movement of lava. g. Producing precise mean gravity anomalies for precise geoid determination; replacing precise spirit levelling with GPS levelling based on a precise geoid model is one of the forthcoming applications of the precise geoid. A gravity base network of 28 stations was established over Iran. The stations were built mainly on bedrock. All stations were measured with an FG5 absolute gravimeter, for at least 12 hours at each station, to obtain an accuracy of a few microgals. Several stations were remeasured several times in recent years to estimate the gravity changes.
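The design argument for keeping the number of stations small follows from how independent tie errors accumulate along a chain of gravimeter transfers; as a minimal sketch, for n ties with equal, independent standard error σ_tie:

```latex
\sigma_{\text{chain}} \;=\; \sqrt{\sum_{i=1}^{n} \sigma_{\text{tie},i}^{2}} \;=\; \sqrt{n}\,\sigma_{\text{tie}},
```

so halving the number of chained ties between absolute benchmarks reduces the accumulated random error by a factor of about √2.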
Efficient exploration of cosmology dependence in the EFT of LSS
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cataneo, Matteo; Foreman, Simon; Senatore, Leonardo, E-mail: matteoc@dark-cosmology.dk, E-mail: sfore@stanford.edu, E-mail: senatore@stanford.edu
2017-04-18
The most effective use of data from current and upcoming large scale structure (LSS) and CMB observations requires the ability to predict the clustering of LSS with very high precision. The Effective Field Theory of Large Scale Structure (EFTofLSS) provides an instrument for performing analytical computations of LSS observables with the required precision in the mildly nonlinear regime. In this paper, we develop efficient implementations of these computations that allow for an exploration of their dependence on cosmological parameters. They are based on two ideas. First, once an observable has been computed with high precision for a reference cosmology, for a new cosmology the same can easily be obtained with comparable precision just by adding the difference in that observable, evaluated with much less precision. Second, most cosmologies of interest are sufficiently close to the Planck best-fit cosmology that observables can be obtained from a Taylor expansion around the reference cosmology. These ideas are implemented for the matter power spectrum at two loops and are released as public codes. When applied to cosmologies that are within 3σ of the Planck best-fit model, the first method evaluates the power spectrum in a few minutes on a laptop, with results that have 1% or better precision, while with the Taylor expansion the same quantity is instantly generated with similar precision. Finally, the ideas and codes we present may easily be extended for other applications or higher-precision results.
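A minimal numerical sketch of the two ideas, with toy stand-ins for the slow high-precision and fast low-precision evaluations (the function names and the single parameter A are ours, not the released codes):

```python
import numpy as np

# Toy stand-ins (assumptions, not the released codes): a "high precision" but
# slow observable and a cheaper, slightly biased "low precision" version.
def P_high(k, A):
    return A * np.exp(-k) / (1.0 + k**2)          # slow, accurate

def P_low(k, A):
    return 1.01 * P_high(k, A)                    # fast, ~1% systematic error

k = np.linspace(0.05, 0.3, 6)                     # wavenumbers, arbitrary units
A_ref, A_new = 2.10, 2.18                         # reference vs nearby cosmology

# Idea 1: P_high(new) ~ P_high(ref) + [P_low(new) - P_low(ref)];
# the correction term is small, so its lower precision barely matters.
approx1 = P_high(k, A_ref) + (P_low(k, A_new) - P_low(k, A_ref))

# Idea 2: Taylor expansion around the reference cosmology, with the
# derivative computed once (here by a finite difference) and reused.
dP_dA = (P_high(k, A_ref + 1e-4) - P_high(k, A_ref)) / 1e-4
approx2 = P_high(k, A_ref) + dP_dA * (A_new - A_ref)

exact = P_high(k, A_new)
print(np.max(np.abs(approx1 / exact - 1.0)))      # ~4e-4: well under 1%
print(np.max(np.abs(approx2 / exact - 1.0)))      # essentially exact here
```

The design point is that both shortcuts only ever evaluate the expensive computation once, at the reference cosmology.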
NASA Astrophysics Data System (ADS)
Ni, Wei-Tou; Han, Sen; Jin, Tao
2016-11-01
With the LIGO announcement of the first direct detection of gravitational waves (GWs), GW astronomy was formally ushered into our age. After one hundred years of theoretical investigation and fifty years of experimental endeavor, this is a historical landmark not just for physics and astronomy, but also for industry and manufacturing. The challenge and opportunity for industry is precision and innovative manufacturing at large size: production of large and homogeneous optical components, optical diagnosis of large components, high-reflectance dielectric coating of large mirrors, manufacturing of components for ultrahigh vacuum of large volume, manufacturing of high-attenuation vibration isolation systems, production of high-power, high-stability single-frequency lasers, production of high-resolution positioning systems, etc. In this talk, we address these requirements and the methods to satisfy them. Optical diagnosis of large optical components requires a large phase-shifting interferometer; the 1.06 μm phase-shifting interferometer for testing LIGO optics and the recently built 24" phase-shifting interferometer in Chengdu, China are examples. High-quality mirrors are crucial for laser-interferometric GW detection, as they are for ring laser gyroscopes, high-precision laser stabilization via optical cavities, quantum optomechanics, cavity quantum electrodynamics and vacuum birefringence measurement. There are stringent requirements on the substrate materials and coating methods. For cryogenic GW interferometers, appropriate coatings on sapphire or silicon are required for good thermal and homogeneity properties. Large ultrahigh-vacuum components and a high-attenuation vibration system, together with an efficient metrology system, are required and will be addressed. For space interferometry, drag-free technology and weak-light manipulation technology are a must. Drag-free technology is well developed. Weak-light phase locking has been demonstrated in the laboratory, while weak-light manipulation technology still needs development.
Sakurai Prize: The Future of Higgs Physics
NASA Astrophysics Data System (ADS)
Dawson, Sally
2017-01-01
The discovery of the Higgs boson relied critically on precision calculations. The quantum contributions from the Higgs boson to the W and top quark masses suggested, long before the Higgs discovery, that a Standard Model Higgs boson should have a mass in the 100-200 GeV range. The experimental extraction of Higgs properties requires normalization to the predicted Higgs production and decay rates, for which higher order corrections are also essential. As Higgs physics becomes a mature subject, more and more precise calculations will be required. If there is new physics at high scales, it will contribute to the predictions, and precision Higgs physics will be a window onto physics beyond the Standard Model.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kertesz, Vilmos; Weiskittel, Taylor M.; Vavek, Marissa
2016-06-22
Currently, the absolute quantitation aspects of droplet-based surface sampling for thin tissue analysis using a fully automated autosampler/HPLC-ESI-MS/MS system are not fully evaluated. Knowledge of extraction efficiency and its reproducibility is required to judge the potential of the method for absolute quantitation of analytes from thin tissue sections. Methods: Adjacent thin tissue sections of propranolol-dosed mouse brain (10-μm-thick), kidney (10-μm-thick) and liver (8-, 10-, 16- and 24-μm-thick) were obtained. The absolute concentration of propranolol was determined in tissue punches from serial sections using standard bulk tissue extraction protocols and subsequent HPLC separations and tandem mass spectrometric analysis. These values were used to determine propranolol extraction efficiency from the tissues with the droplet-based surface sampling approach. Results: Extraction efficiency of propranolol from 10-μm-thick brain, kidney and liver thin tissues using droplet-based surface sampling varied between ~45-63%. Extraction efficiency decreased from ~65% to ~36% as liver thickness increased from 8 μm to 24 μm. Randomly selecting half of the samples as standards, the precision and accuracy of propranolol concentrations obtained for the other half of the samples as quality control metrics were determined. The resulting precision (±15%) and accuracy (±3%) values were within acceptable limits. In conclusion, comparative quantitation of adjacent mouse thin tissue sections of different organs and of various thicknesses by droplet-based surface sampling and by bulk extraction of tissue punches showed that extraction efficiency was incomplete using the former method, and that it depended on the organ and tissue thickness. However, once extraction efficiency was determined and applied, the droplet-based approach provided the required quantitation accuracy and precision for assay validations. This means that once the extraction efficiency is calibrated for a given tissue type and drug, the droplet-based approach provides a non-labor-intensive and high-throughput means to acquire spatially resolved quantitative analyses of multiple samples of the same type.
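The comparative quantitation step reduces to a simple ratio; a minimal sketch with hypothetical concentrations (the numbers below are invented for illustration):

```python
# Extraction efficiency = droplet-sampled amount / bulk-extraction amount;
# once calibrated, droplet results are corrected by that factor.
c_bulk_ng_per_mg = 41.0      # hypothetical propranolol level from tissue punch
c_droplet_ng_per_mg = 22.0   # hypothetical level seen by droplet sampling

efficiency = c_droplet_ng_per_mg / c_bulk_ng_per_mg   # ~0.54, cf. ~45-63%
corrected = c_droplet_ng_per_mg / efficiency          # recovers the bulk value
print(f"extraction efficiency ~ {efficiency:.0%}, corrected {corrected:.1f} ng/mg")
```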
Horvitz-Thompson survey sample methods for estimating large-scale animal abundance
Samuel, M.D.; Garton, E.O.
1994-01-01
Large-scale surveys to estimate animal abundance can be useful for monitoring population status and trends, for measuring responses to management or environmental alterations, and for testing ecological hypotheses about abundance. However, large-scale surveys may be expensive and logistically complex. To ensure resources are not wasted on unattainable targets, the goals and uses of each survey should be specified carefully, and alternative methods for addressing these objectives should always be considered. During survey design, the importance of each survey error component (spatial design, proportion of detected animals, precision in detection) should be considered carefully to produce a complete statistically based survey. Failure to address these three survey components may produce population estimates that are inaccurate (biased low), have unrealistic precision (too precise) and do not satisfactorily meet the survey objectives. Optimum survey design requires trade-offs in these sources of error relative to the costs of sampling plots and detecting animals on plots, considerations that are specific to the spatial logistics and survey methods. The Horvitz-Thompson estimators provide a comprehensive framework for considering all three survey components during the design and analysis of large-scale wildlife surveys. Problems of spatial and temporal (especially survey-to-survey) heterogeneity in detection probabilities have received little consideration, but failure to account for heterogeneity produces biased population estimates. The goal of producing unbiased population estimates is in conflict with the increased variation from heterogeneous detection in the population estimate. One solution to this conflict is to use an MSE-based approach to achieve a balance between bias reduction and increased variation. Further research is needed to develop methods that address spatial heterogeneity in detection, evaluate the effects of temporal heterogeneity on survey objectives and optimize decisions related to survey bias and variance. Finally, managers and researchers involved in the survey design process must realize that obtaining the best survey results requires an interactive and recursive process of survey design, execution, analysis and redesign. Survey refinements will be possible as further knowledge is gained on the actual abundance and distribution of the population and on the most efficient techniques for detecting animals.
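As a hedged illustration of the Horvitz-Thompson framework described above (all plot data invented): each detected count is expanded by the inverse of its overall observation probability, the product of the plot's inclusion probability and the detection probability on that plot.

```python
import numpy as np

# Hypothetical survey: a sample of plots drawn with known inclusion
# probabilities pi_i, and animals on a plot detected with probability p_i.
counts_detected = np.array([12, 5, 0, 9, 3])    # animals detected per plot
pi = np.array([0.10, 0.10, 0.05, 0.10, 0.05])   # plot inclusion probabilities
p_detect = np.array([0.8, 0.8, 0.7, 0.8, 0.7])  # per-plot detection probabilities

# Horvitz-Thompson-type estimate of total abundance: weight each observation
# by 1 / (pi_i * p_i), its overall probability of entering the data.
tau_hat = np.sum(counts_detected / (pi * p_detect))
print(f"estimated total abundance: {tau_hat:.0f}")
```

Heterogeneity in p_detect that is ignored (e.g., a single pooled detection rate) is exactly what produces the bias the abstract warns about.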
An informatics research agenda to support precision medicine: seven key areas
Avillach, Paul; Benham-Hutchins, Marge; Breitenstein, Matthew K; Crowgey, Erin L; Hoffman, Mark A; Jiang, Xia; Madhavan, Subha; Mattison, John E; Nagarajan, Radhakrishnan; Ray, Bisakha; Shin, Dmitriy; Visweswaran, Shyam; Zhao, Zhongming; Freimuth, Robert R
2016-01-01
The recent announcement of the Precision Medicine Initiative by President Obama has brought precision medicine (PM) to the forefront for healthcare providers, researchers, regulators, innovators, and funders alike. As technologies continue to evolve and datasets grow in magnitude, a strong computational infrastructure will be essential to realize PM’s vision of improved healthcare derived from personal data. In addition, informatics research and innovation affords a tremendous opportunity to drive the science underlying PM. The informatics community must lead the development of technologies and methodologies that will increase the discovery and application of biomedical knowledge through close collaboration between researchers, clinicians, and patients. This perspective highlights seven key areas that are in need of further informatics research and innovation to support the realization of PM. PMID:27107452
Reasoning and Data Representation in a Health and Lifestyle Support System.
Hanke, Sten; Kreiner, Karl; Kropf, Johannes; Scase, Marc; Gossy, Christian
2017-01-01
Case-based reasoning and data interpretation is an artificial intelligence approach that capitalizes on past experience to solve current problems, and this can be used as a method for practical intelligent systems. Case-based data reasoning is able to provide decision support for experts and clinicians in health systems as well as lifestyle systems. In this project we focused on developing a solution for healthy ageing considering daily activities, nutrition and cognitive activities. The data analysis of the reasoner followed state-of-the-art guidelines from clinical practice. Guidelines provide a general framework to guide clinicians, and require substantial background knowledge to become operational; this is precisely the kind of information recorded in practice cases, so cases complement guidelines very well and help to interpret them. It is expected that interest in case-based reasoning systems in the health domain will continue to grow.
Distributed Wireless Monitoring System for Ullage and Temperature in Wine Barrels
Zhang, Wenqi; Skouroumounis, George K.; Monro, Tanya M.; Taylor, Dennis K.
2015-01-01
This paper presents a multipurpose and low-cost sensor for the simultaneous monitoring of temperature and ullage of wine in barrels during two of the most important stages of winemaking, fermentation and maturation. The distributed sensor subsystem is embedded within the bung of the barrel, runs on a battery for a period of at least 12 months, and costs around $27 AUD for all parts. In addition, software was designed which allows for the remote transmission and easy visual interpretation of the data for the winemaker. Early warning signals can be sent when the temperature or ullage deviates from a winemaker's expectations so remedial action can be taken, such as when topping is required or the barrels should be moved to a cooler cellar location. Such knowledge of a wine's properties or storage conditions allows for more precise control of the final wine quality. PMID:26266410
Remote control of the industry processes. POWERLINK protocol application
NASA Astrophysics Data System (ADS)
Wóbel, A.; Paruzel, D.; Paszkiewicz, B.
2017-08-01
Present technological developments enable the use of solutions characterized by a lower failure rate and greater working precision, making it possible to obtain the most efficient production, high production speed and reliability of individual components. The main scope of this article is the application of the POWERLINK protocol for communication with a B&R controller over Ethernet for recording process parameters. This enables control of the production cycle using an internal network connected to an industrial PC. Knowledge of the most important production parameters in real time allows a failure to be detected immediately after it occurs. For this purpose, a diagnostic station based on a B&R X20CP1301 controller was built to record measurement data such as the pressure, the temperature on both sides of the valve, and the torque required to change the valve setting. The use of the POWERLINK protocol allows the transmission of status information every 200 μs.
Yan, Hongping; Wang, Cheng; McCarn, Allison R; Ade, Harald
2013-04-26
A practical and accurate method to obtain the index of refraction, especially the decrement δ, across the carbon 1s absorption edge is demonstrated. The combination of absorption spectra scaled to the Henke atomic scattering factor database, the use of the doubly subtractive Kramers-Kronig relations, and high precision specular reflectivity measurements from thin films allow the notoriously difficult-to-measure δ to be determined with high accuracy. No independent knowledge of the film thickness or density is required. High confidence interpolation between relatively sparse measurements of δ across an absorption edge is achieved. Accurate optical constants determined by this method are expected to greatly improve the simulation and interpretation of resonant soft x-ray scattering and reflectivity data. The method is demonstrated using poly(methyl methacrylate) and should be extendable to all organic materials.
Development of a cerebral circulation model for the automatic control of brain physiology.
Utsuki, T
2015-01-01
In various clinical guidelines of brain injury, intracranial pressure (ICP), cerebral blood flow (CBF) and brain temperature (BT) are essential targets for precise management for brain resuscitation. In addition, the integrated automatic control of BT, ICP, and CBF is required for improving therapeutic effects and reducing medical costs and staff burden. Thus, a new model of cerebral circulation was developed in this study for integrative automatic control. With this model, the CBF and cerebral perfusion pressure of a normal adult male were regionally calculated according to cerebrovascular structure, blood viscosity, blood distribution, CBF autoregulation, and ICP. The analysis results were consistent with physiological knowledge already obtained with conventional studies. Therefore, the developed model is potentially available for the integrative control of the physiological state of the brain as a reference model of an automatic control system, or as a controlled object in various control simulations.
Weak scratch detection and defect classification methods for a large-aperture optical element
NASA Astrophysics Data System (ADS)
Tao, Xian; Xu, De; Zhang, Zheng-Tao; Zhang, Feng; Liu, Xi-Long; Zhang, Da-Peng
2017-03-01
Surface defects on optics cause optical failure and heavy losses to the optical system, so surface defects on optics must be carefully inspected. This paper proposes a coarse-to-fine strategy for detecting weak scratches in complicated dark-field images. First, all possible scratches are detected based on bionic vision. Then, each possible scratch is precisely positioned and connected into a complete scratch by the LSD and a priori knowledge. Finally, multiple scratches of various types can be detected in dark-field images. To classify defects and pollutants, a classification method based on GIST features is proposed. Many real dark-field images are used as experimental images. The results show that this method can detect multiple types of weak scratches in complex images and that defects can be correctly distinguished even in the presence of interference. This method satisfies the real-time and accuracy requirements for the detection of surface defects.
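A hedged sketch of such a coarse-to-fine pass using standard OpenCV primitives; the paper's bionic-vision saliency stage and LSD detector are substituted here with top-hat filtering and probabilistic Hough line extraction, and the file name is hypothetical:

```python
import cv2
import numpy as np

img = cv2.imread("darkfield.png", cv2.IMREAD_GRAYSCALE)  # hypothetical file
assert img is not None, "provide a dark-field image"

# Coarse stage: top-hat filtering enhances thin bright structures (candidate
# scratches) against the dark background, then Otsu picks a global threshold.
blur = cv2.GaussianBlur(img, (5, 5), 0)
kernel = cv2.getStructuringElement(cv2.MORPH_RECT, (15, 15))
tophat = cv2.morphologyEx(blur, cv2.MORPH_TOPHAT, kernel)
_, mask = cv2.threshold(tophat, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)

# Fine stage: extract line segments; a generous maxLineGap connects collinear
# fragments into complete scratches, mimicking the connection step above.
segments = cv2.HoughLinesP(mask, 1, np.pi / 180, threshold=30,
                           minLineLength=20, maxLineGap=10)
print(0 if segments is None else len(segments), "candidate scratch segments")
```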
Macroscale delivery systems for molecular and cellular payloads
NASA Astrophysics Data System (ADS)
Kearney, Cathal J.; Mooney, David J.
2013-11-01
Macroscale drug delivery (MDD) devices are engineered to exert spatiotemporal control over the presentation of a wide range of bioactive agents, including small molecules, proteins and cells. In contrast to systemically delivered drugs, MDD systems act as a depot of drug localized to the treatment site, which can increase drug effectiveness while reducing side effects and confer protection to labile drugs. In this Review, we highlight the key advantages of MDD systems, describe their mechanisms of spatiotemporal control and provide guidelines for the selection of carrier materials. We also discuss the combination of MDD technologies with classic medical devices to create multifunctional MDD devices that improve integration with host tissue, and the use of MDD technology in tissue-engineering strategies to direct cell behaviour. As our ever-expanding knowledge of human biology and disease provides new therapeutic targets that require precise control over their application, the importance of MDD devices in medicine is expected to increase.
OSIRIS-REx Contamination Control Strategy and Implementation
NASA Astrophysics Data System (ADS)
Dworkin, J. P.; Adelman, L. A.; Ajluni, T.; Andronikov, A. V.; Aponte, J. C.; Bartels, A. E.; Beshore, E.; Bierhaus, E. B.; Brucato, J. R.; Bryan, B. H.; Burton, A. S.; Callahan, M. P.; Castro-Wallace, S. L.; Clark, B. C.; Clemett, S. J.; Connolly, H. C.; Cutlip, W. E.; Daly, S. M.; Elliott, V. E.; Elsila, J. E.; Enos, H. L.; Everett, D. F.; Franchi, I. A.; Glavin, D. P.; Graham, H. V.; Hendershot, J. E.; Harris, J. W.; Hill, S. L.; Hildebrand, A. R.; Jayne, G. O.; Jenkens, R. W.; Johnson, K. S.; Kirsch, J. S.; Lauretta, D. S.; Lewis, A. S.; Loiacono, J. J.; Lorentson, C. C.; Marshall, J. R.; Martin, M. G.; Matthias, L. L.; McLain, H. L.; Messenger, S. R.; Mink, R. G.; Moore, J. L.; Nakamura-Messenger, K.; Nuth, J. A.; Owens, C. V.; Parish, C. L.; Perkins, B. D.; Pryzby, M. S.; Reigle, C. A.; Righter, K.; Rizk, B.; Russell, J. F.; Sandford, S. A.; Schepis, J. P.; Songer, J.; Sovinski, M. F.; Stahl, S. E.; Thomas-Keprta, K.; Vellinga, J. M.; Walker, M. S.
2018-02-01
OSIRIS-REx will return pristine samples of carbonaceous asteroid Bennu. This article describes how pristine was defined based on expectations of Bennu and on a realistic understanding of what is achievable with a constrained schedule and budget, and how that definition flowed to requirements and implementation. To return a pristine sample, the OSIRIS-REx spacecraft sampling hardware was maintained at level 100 A/2 and <180 ng/cm2 of amino acids and hydrazine on the sampler head through precision cleaning, control of materials, and vigilance. Contamination is further characterized via witness material exposed to the spacecraft assembly and testing environment as well as in space. This characterization provided knowledge of the expected background and will be used in conjunction with archived spacecraft components for comparison with the samples when they are delivered to Earth for analysis. Most of all, the cleanliness of the OSIRIS-REx spacecraft was achieved through communication among scientists, engineers, managers, and technicians.
Whisker growth studies under conditions which resemble those available on an orbiting space station
NASA Technical Reports Server (NTRS)
Hobbs, Herman H.
1992-01-01
Minimal funding was provided by NASA with one designated 'mission' being the clear demonstration of the relevance of previously supported whisker growth studies to microgravity research. While in one sense this work has shown the converse, namely, that ambient gravitational fields as high as 1 Earth normal have no relevance to growth of whiskers by hydrogen reduction of metal halides, a case is made that this does not demonstrate lack of relevance to microgravity research. On the contrary, the driving forces for this growth are precisely those which must be understood in order to understand growth in microgravity. The results described suggest that knowledge gained from this work may be highly fundamental to our understanding of the genesis of metal crystals. Time and money ran out before this work could be considered complete. At least another year's study and analysis will be required before publications could be justified.
Actin Engine in Immunological Synapse
Piragyte, Indre
2012-01-01
T cell activation and function require physical contact with antigen presenting cells at a specialized junctional structure known as the immunological synapse. Once formed, the immunological synapse leads to sustained T cell receptor-mediated signalling and stabilized adhesion. High resolution microscopy has indeed had a great impact on understanding the function and dynamic structure of the immunological synapse. Recent research is now moving towards understanding the mechanical part of the immune system, expanding our knowledge of mechanosensitivity, force generation, and the biophysics of cell-cell interaction. The actin cytoskeleton plays an indispensable role in the adaptive immune system, allowing it to be dynamic and precise at the same time. The regulation of this mechanical engine seems complicated and overlapping, but it enables cells to be very sensitive to external signals such as surface rigidity. In this review, we focus on actin regulators and how immune cells regulate the dynamic actin rearrangement process to drive the formation of the immunological synapse. PMID:22916042
A Review on Investigation and Assessment of Path Loss Models in Urban and Rural Environment
NASA Astrophysics Data System (ADS)
Maurya, G. R.; Kokate, P. A.; Lokhande, S. K.; Shrawankar, J. A.
2017-08-01
This paper aims at providing the researcher with a clear knowledge of path loss (PL). The important data have been extracted from the literature and presented in a clear and precise manner. Only a limited number of studies identify PL at FM frequencies; the majority identify PL with telephone (cellular) frequencies as the source. In this paper the PL in urban and rural areas of different places due to various factors like buildings, trees, antenna height, forest, etc. has been studied. The studies were grouped by common parameters like frequency, model and location, segregated by these parameters in tabular format, and compared by frequency, location and best-fit model in that table. A scatter chart was drawn to make the comparisons clearer and more understandable. However, location-specific PL models are required to investigate RF propagation in the identified terrain.
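For concreteness, the kind of model such comparisons are fitted against is the log-distance path loss model; a minimal sketch with parameter choices of our own (path loss exponent n and reference distance d0 are assumptions, not values from the review):

```python
import numpy as np

def log_distance_path_loss(d_m, f_mhz, n=3.0, d0_m=100.0):
    """PL(d) = FSPL(d0, f) + 10 n log10(d / d0), anchored at free-space loss
    at the reference distance d0 (FSPL in dB with d in km and f in MHz)."""
    fspl_d0 = 32.44 + 20.0 * np.log10(d0_m / 1000.0) + 20.0 * np.log10(f_mhz)
    return fspl_d0 + 10.0 * n * np.log10(d_m / d0_m)

# Example: a 900 MHz link with an urban-like exponent n = 3, at 1 km.
print(f"{log_distance_path_loss(1000.0, 900.0):.1f} dB")  # ~101.5 dB
```

The exponent n is what the terrain-specific fits in such surveys effectively estimate: roughly 2 in free space, 3-5 in built-up areas.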
Measuring Ionization in Highly Compressed, Near-Degenerate Plasmas
NASA Astrophysics Data System (ADS)
Doeppner, Tilo; Kraus, D.; Neumayer, P.; Bachmann, B.; Collins, G. W.; Divol, L.; Kritcher, A.; Landen, O. L.; Pak, A.; Weber, C.; Fletcher, L.; Glenzer, S. H.; Falcone, R. W.; Saunders, A.; Chapman, D.; Baggott, R.; Gericke, D. O.; Yi, A.
2016-10-01
A precise knowledge of ionization at a given temperature and density is required to accurately model the compressibility and heat capacity of materials at extreme conditions. We use x-ray Thomson scattering to characterize the plasma conditions in plastic and beryllium capsules near stagnation in implosion experiments at the National Ignition Facility. We expect the capsules to be compressed more than 20x, with electron densities approaching 10^25 cm^-3, corresponding to a Fermi energy of 170 eV. Zinc Heα x-rays (9 keV) scattered at 120° off the plasma yield high sensitivity to K-shell ionization, while at the same time constraining density and temperature. We will discuss recent results in the context of ionization potential depression at these extreme conditions. This work was performed under the auspices of the US Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344.
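The quoted Fermi energy follows directly from the free-electron-gas formula; a quick check of the numbers:

```python
import numpy as np

# E_F = (hbar^2 / 2 m_e) (3 pi^2 n_e)^(2/3) for a free electron gas.
hbar = 1.054571817e-34   # J s
m_e  = 9.1093837015e-31  # kg
eV   = 1.602176634e-19   # J

n_e = 1.0e25 * 1.0e6     # 1e25 cm^-3 converted to m^-3
E_F = hbar**2 / (2.0 * m_e) * (3.0 * np.pi**2 * n_e) ** (2.0 / 3.0)
print(f"E_F = {E_F / eV:.0f} eV")   # ~170 eV, matching the abstract
```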
Time to rethink the neural mechanisms of learning and memory
Gallistel, Charles R.; Balsam, Peter D
2014-01-01
Most studies in the neurobiology of learning assume that the underlying learning process is a pairing-dependent change in synaptic strength that requires repeated experience of events presented in close temporal contiguity. However, much learning is rapid and does not depend on temporal contiguity, which has never been precisely defined. These points are well illustrated by studies showing that temporal relationships between events are rapidly learned, even over long delays, and that this knowledge governs the form and timing of behavior. The speed with which anticipatory responses emerge in conditioning paradigms is determined by the information that cues provide about the timing of rewards. The challenge for understanding the neurobiology of learning is to understand the mechanisms in the nervous system that encode information from even a single experience, the nature of the memory mechanisms that can encode quantities such as time, and how the brain can flexibly perform computations based on this information. PMID:24309167
Southern Hemisphere and deep-sea warming led deglacial atmospheric CO2 rise and tropical warming.
Stott, Lowell; Timmermann, Axel; Thunell, Robert
2007-10-19
Establishing what caused Earth's largest climatic changes in the past requires a precise knowledge of both the forcing and the regional responses. We determined the chronology of high- and low-latitude climate change at the last glacial termination by radiocarbon dating benthic and planktonic foraminiferal stable isotope and magnesium/calcium records from a marine core collected in the western tropical Pacific. Deep-sea temperatures warmed by approximately 2 degrees C between 19 and 17 thousand years before the present (ky B.P.), leading the rise in atmospheric CO2 and tropical-surface-ocean warming by approximately 1000 years. The cause of this deglacial deep-water warming does not lie within the tropics, nor can its early onset between 19 and 17 ky B.P. be attributed to CO2 forcing. Increasing austral-spring insolation combined with sea-ice albedo feedbacks appear to be the key factors responsible for this warming.
Airborne gravimetry, altimetry, and GPS navigation errors
NASA Technical Reports Server (NTRS)
Colombo, Oscar L.
1992-01-01
Proper interpretation of airborne gravimetry and altimetry requires good knowledge of aircraft trajectory. Recent advances in precise navigation with differential GPS have made it possible to measure gravity from the air with accuracies of a few milligals, and to obtain altimeter profiles of terrain or sea surface correct to one decimeter. These developments are opening otherwise inaccessible regions to detailed geophysical mapping. Navigation with GPS presents some problems that grow worse with increasing distance from a fixed receiver: the effect of errors in tropospheric refraction correction, GPS ephemerides, and the coordinates of the fixed receivers. Ionospheric refraction and orbit error complicate ambiguity resolution. Optimal navigation should treat all error sources as unknowns, together with the instantaneous vehicle position. To do so, fast and reliable numerical techniques are needed: efficient and stable Kalman filter-smoother algorithms, together with data compression and, sometimes, the use of simplified dynamics.
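A minimal one-dimensional constant-velocity Kalman filter, as a sketch of the filter machinery referred to above (all noise levels are invented; a production navigation filter would add the tropospheric, ephemeris and ambiguity terms as extra states):

```python
import numpy as np

dt = 1.0
F = np.array([[1.0, dt], [0.0, 1.0]])        # state transition: [pos, vel]
H = np.array([[1.0, 0.0]])                   # we observe position only
Q = 1e-3 * np.eye(2)                         # process noise covariance
R = np.array([[4.0]])                        # measurement noise covariance

x = np.array([0.0, 1.0])                     # initial state estimate
P = np.eye(2)                                # initial state covariance

rng = np.random.default_rng(0)
for k in range(1, 11):
    z = k * dt + rng.normal(0.0, 2.0)        # noisy position measurement
    # Predict
    x = F @ x
    P = F @ P @ F.T + Q
    # Update
    y = z - H @ x                            # innovation
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)           # Kalman gain
    x = x + K @ y
    P = (np.eye(2) - K @ H) @ P

print(f"final position estimate: {x[0]:.2f} (truth: 10.00)")
```

A smoother, as mentioned in the abstract, would run a backward pass over the stored (x, P) history to refine past states with later data.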
Saterbak, Ann; Moturu, Anoosha; Volz, Tracy
2018-03-01
Rice University's bioengineering department incorporates written, oral, and visual communication instruction into its undergraduate curriculum to aid student learning and to prepare students to communicate their knowledge and discoveries precisely and persuasively. In a tissue culture lab course, we used a self- and peer-review tool called Calibrated Peer Review™ (CPR) to diagnose student learning gaps in visual communication skills on a poster assignment. We then designed an active learning intervention that required students to practice the visual communication skills that needed improvement and used CPR to measure the changes. After the intervention, we observed that students performed significantly better in their ability to develop high quality graphs and tables that represent experimental data. Based on these outcomes, we conclude that guided task practice, collaborative learning, and calibrated peer review can be used to improve engineering students' visual communication skills.
Neven, Sylvie
2016-01-01
In the Middle Ages and the premodern period knowledge of alchemical practices and materials was transmitted via collections of recipes often grouped concomitantly with art-technological instructions. In both alchemy and chemical technology particular importance is placed on artisanal and craft practices. Both are concerned with the description of colours. Both require procedures involving precise and specifically defined actions, prescriptions and ingredients. Assuming that alchemical and artistic texts have the same textual format, this raises the question: were they produced, diffused and read by the same people? This paper investigates the authorship and the context of production behind a sample of German alchemical manuscripts dating from the fourteenth to the sixteenth century. It scrutinizes their process of production, compilation and dissemination. This paper also sheds light on the various types of marginalia, and correlates them with their diverse functions. It thus delivers significant information about the readers and users of these manuscripts.
Nas transgenic mouse line allows visualization of Notch pathway activity in vivo.
Souilhol, Céline; Cormier, Sarah; Monet, Marie; Vandormael-Pournin, Sandrine; Joutel, Anne; Babinet, Charles; Cohen-Tannoudji, Michel
2006-06-01
The Notch signaling pathway plays multiple and important roles in mammals. However, several aspects of its action, in particular, the precise mapping of its sites of activity, remain unclear. To address this issue, we generated a transgenic line carrying a construct consisting of a nls-lacZ reporter gene under the control of a minimal promoter and multiple RBP-Jkappa binding sites. Here we show that this transgenic line, which we termed NAS (for Notch Activity Sensor), displays an expression profile that is consistent with current knowledge on Notch activity sites in mice, even though it may not report on all these sites. Moreover, we observe that NAS transgene expression is abolished in a RBP-Jkappa-deficient background, indicating that it indeed requires Notch/RBP-Jkappa signaling pathway activity. Thus, the NAS transgenic line constitutes a valuable and versatile tool to gain further insights into the complex and various functions of the Notch signaling pathway.
The influence of train leakage currents on the LEP dipole field
NASA Astrophysics Data System (ADS)
Bravin, E.; Brun, G.; Dehning, B.; Drees, A.; Galbraith, P.; Geitz, M.; Henrichsen, K.; Koratzinos, M.; Mugnai, G.; Tonutti, M.
The determination of the mass and the width of the Z boson at CERN's LEP accelerator, an e+e- storage ring with a circumference of approximately 27 km, imposes heavy demands on the knowledge of the LEP counter-rotating electron and positron beam energies. The precision required is of the order of 1 MeV or ≈ 20 ppm. Due to its size, the LEP collider is influenced by various macroscopic and regional factors such as the position of the moon or seasonal changes of the rainfall in the area, as reported earlier. A new and not less surprising effect on the LEP energy was observed in 1995: railroad trains in the Geneva region perturb the dipole field. A parasitic flow of electricity, originating from the trains, travels along the LEP vacuum chamber, affecting the LEP dipole field. An account of the phenomenon with its explanation substantiated by dedicated measurements is presented.
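A quick consistency check on the quoted figures: at the Z pole each LEP beam runs near m_Z/2 ≈ 45.6 GeV, so

```latex
\frac{\Delta E}{E_{\text{beam}}} \;\approx\; \frac{1\ \text{MeV}}{45.6\ \text{GeV}} \;\approx\; 2.2\times 10^{-5} \;\approx\; 20\ \text{ppm}.
```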
A review of satellite time transfer technology - Accomplishments and future applications
NASA Technical Reports Server (NTRS)
Cooper, R. S.; Chi, A. R.
1979-01-01
A brief review of the research accomplishments by NASA in meeting the needs of the space program for precise time in satellite tracking is presented. As a major user of precise time signals for clock synchronization of NASA's worldwide satellite tracking networks, the agency provided much of the necessary impetus for the development of stable frequency sources and time synchronization technology. The precision in time required for both satellite tracking and space science experiments has increased at a rate of about 1 order of magnitude per decade from 1 ms in the 1950's to 100 microsec during the Apollo era in the 1960's to 10 microsec in the 1970's. In the 1980's, when the Tracking and Data Relay Satellite System (TDRSS) comes into operation, satellite timing requirements will be extended to 1 microsec and below. These requirements are needed for spacecraft autonomy and data packeting which are now in active planning stages.
Precision, accuracy, and efficiency of four tools for measuring soil bulk density or strength.
Richard E. Miller; John Hazard; Steven Howes
2001-01-01
Monitoring soil compaction is time consuming. A desire for speed and lower costs, however, must be balanced with the appropriate precision and accuracy required of the monitoring task. We compared three core samplers and a cone penetrometer for measuring soil compaction after clearcut harvest on a stone-free and a stony soil. Precision (i.e., consistency) of each tool...
49 CFR 383.117 - Requirements for passenger endorsement.
Code of Federal Regulations, 2011 CFR
2011-10-01
... COMMERCIAL DRIVER'S LICENSE STANDARDS; REQUIREMENTS AND PENALTIES Required Knowledge and Skills § 383.117... following additional knowledge and skills test requirements. (a) Knowledge test. All applicants for the... procedures not otherwise specified. (b) Skills test. To obtain a passenger endorsement applicable to a...
Climate forcings and feedbacks
NASA Technical Reports Server (NTRS)
Hansen, James
1993-01-01
Global temperature has increased significantly during the past century. Understanding the causes of observed global temperature change is impossible in the absence of adequate monitoring of changes in global climate forcings and radiative feedbacks. Climate forcings are changes imposed on the planet's energy balance, such as a change of incoming sunlight or a human-induced change of surface properties due to deforestation. Radiative feedbacks are radiative changes induced by climate change, such as alteration of cloud properties or the extent of sea ice. Monitoring of global climate forcings and feedbacks, if sufficiently precise and long-term, can provide a very strong constraint on the interpretation of observed temperature change. Such monitoring is essential to eliminate uncertainties about the relative importance of various climate change mechanisms, including tropospheric sulfate aerosols from the burning of coal and oil, smoke from slash-and-burn agriculture, changes of solar irradiance, changes of several greenhouse gases, and many other mechanisms. The considerable variability of observed temperature, together with evidence that a substantial portion of this variability is unforced, indicates that observations of climate forcings and feedbacks must be continued for decades. Since the climate system responds to the time integral of the forcing, a further requirement is that the observations be carried out continuously. However, precise observations of forcings and feedbacks will also be able to provide valuable conclusions on shorter time scales. For example, knowledge of the climate forcing by increasing CFCs relative to the forcing by changing ozone is important to policymakers, as is information on the forcing by CO2 relative to the forcing by sulfate aerosols. It will also be possible to obtain valuable tests of climate models on short time scales, if there is precise monitoring of all forcings and feedbacks during and after events such as a large volcanic eruption or an El Nino.
A Simulated Geochemical Rover Mission to the Taurus-Littrow Valley of the Moon
NASA Technical Reports Server (NTRS)
Korotev, Randy L.; Haskin, Larry A.; Jolliff, Bradley L.
1995-01-01
We test the effectiveness of using an alpha-backscatter, alpha-proton, X-ray spectrometer on a remotely operated rover to analyze soils and provide geologically useful information about the Moon during a simulated mission to a hypothetical site resembling the Apollo 17 landing site. On the mission, 100 soil samples are "analyzed" for major elements at moderate analytical precision (e.g., typical relative sample standard deviation from counting statistics: Si[11%], Al[18%], Fe[6%], Mg[20%], Ca[5%]). Simulated compositions of soils are generated by combining compositions of components representing the major lithologies occurring at the site in known proportions. Simulated analyses are generated by degrading the simulated compositions according to the expected analytical precision of the analyzer. Compositions obtained from the simulated analyses are modeled by least squares mass balance as mixtures of the components, and the relative proportions of those components as predicted by the model are compared with the actual proportions used to generate the simulated composition. Boundary conditions of the modeling exercise are that all important lithologic components of the regolith are known and are represented by model components, and that the compositions of these components are well known. The effect of having the capability of determining one incompatible element at moderate precision (25%) is compared with the effect of lacking this capability. We discuss likely limitations and ambiguities that would be encountered, but conclude that much of our knowledge about the Apollo 17 site (based on the returned samples) regarding the distribution and relative abundances of lithologies in the regolith could be obtained. This success requires, however, that at least one incompatible element be determined.
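A minimal sketch of the least-squares mass-balance unmixing described above, with invented compositions: the columns of A are end-member lithologies in a few oxides (wt%), b is a simulated soil analysis, and a non-negativity constraint keeps the proportions physical.

```python
import numpy as np
from scipy.optimize import nnls

# Hypothetical end-member compositions (wt%); rows are oxides, columns are
# [mare-basalt-like, highland-like, KREEP-like] components.
A = np.array([[45.0, 41.0, 48.0],    # SiO2
              [10.0, 27.0, 15.0],    # Al2O3
              [19.0,  5.0, 10.0],    # FeO
              [11.0, 15.0, 11.0]])   # CaO
b = np.array([44.5, 16.5, 14.2, 12.4])   # simulated soil analysis

x, resid = nnls(A, b)                # non-negative least-squares proportions
x = x / x.sum()                      # normalize to mixing fractions
print(np.round(x, 2), f"residual norm {resid:.2f}")
```

In the paper's exercise, the recovered fractions x would then be compared with the known proportions used to generate the simulated soil.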
Sturgeon, Catharine; Hill, Robert; Hortin, Glen L; Thompson, Douglas
2010-01-01
There is increasing pressure to provide cost-effective healthcare based on “best practice.” Consequently, new biomarkers are only likely to be introduced into routine clinical biochemistry departments if they are supported by a strong evidence base and if the results will improve patient management and outcome. This requires convincing evidence of the benefits of introducing the new test, ideally reflected in fewer hospital admissions, fewer additional investigations and/or fewer clinic visits. Carefully designed audit and cost-benefit studies in relevant patient groups must demonstrate that introducing the biomarker delivers an improved and more effective clinical pathway. From the laboratory perspective, pre-analytical requirements must be thoroughly investigated at an early stage. Good stability of the biomarker in relevant physiological matrices is essential to avoid the need for special processing. Absence of specific timing requirements for sampling and knowledge of the effect of medications that might be used to treat the patients in whom the biomarker will be measured is also highly desirable. Analytically, automation is essential in modern high-throughput clinical laboratories. Assays must therefore be robust, fulfilling standard requirements for linearity on dilution, precision and reproducibility, both within- and between-run. Provision of measurements by a limited number of specialized reference laboratories may be most appropriate, especially when a new biomarker is first introduced into routine practice. PMID:21137030
Independent Component Analysis applied to Ground-based observations
NASA Astrophysics Data System (ADS)
Martins-Filho, Walter; Griffith, Caitlin; Pearson, Kyle; Waldmann, Ingo; Alvarez-Candal, Alvaro; Zellem, Robert Thomas
2018-01-01
Transit measurements of Jovian-sized exoplanetary atmospheres allow one to study the composition of exoplanets, largely independent of the planet's temperature profile. However, measurements of hot-Jupiter transits must achieve a level of accuracy in the flux sufficient to determine the spectral modulation of the exoplanetary atmosphere. To accomplish this level of precision, we need to separate systematic errors, and, for ground-based measurements, the effects of Earth's atmosphere, from the signal due to the exoplanet, which is several orders of magnitude smaller. The effects of the terrestrial atmosphere and some of the time-dependent systematic errors of ground-based transit measurements are treated mainly by dividing the host star by a reference star at each wavelength and time step of the transit. Recently, Independent Component Analysis (ICA) has been used to remove systematic effects from the raw data of space-based observations (Waldmann, 2014, 2012; Morello et al., 2016, 2015). ICA is a statistical method born from the ideas of blind-source separation studies, which can be used to de-trend several independent source signals of a data set (Hyvarinen and Oja, 2000). This technique requires no additional prior knowledge of the data set. In addition, this technique has the advantage of requiring no reference star. Here we apply ICA to ground-based photometry of the exoplanet XO-2b recorded by the 61" Kuiper Telescope and compare the results of the ICA to those of a previous analysis from Zellem et al. (2015), which does not use ICA. We also simulate the effects of various conditions (concerning the systematic errors, noise and the stability of the object on the detector) to determine the conditions under which ICA can be used with high precision to extract the light curve of exoplanetary photometry measurements.
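A hedged, self-contained sketch of ICA-based de-trending on synthetic photometry (the transit depth, trend and noise levels are invented; scikit-learn's FastICA stands in for the cited implementations):

```python
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(42)
t = np.linspace(-2.0, 2.0, 400)                      # hours from mid-transit

# Toy data: a box-like transit plus an airmass-like trend and a shared
# instrumental signal, all mixed into the target flux.
transit = 1.0 - 0.01 * ((t > -1.0) & (t < 1.0))
trend = 1.0 + 0.004 * (t - t.min()) ** 1.5           # slowly varying systematic
common = 0.002 * np.sin(2.0 * np.pi * t / 0.7)       # common-mode signal
flux = transit * trend + common + rng.normal(0, 5e-4, t.size)

# Several comparison light curves sharing the same systematics (no transit).
comps = np.stack([trend + common + rng.normal(0, 5e-4, t.size)
                  for _ in range(4)])

# ICA separates statistically independent components from the mixed curves;
# the component most correlated with the injected transit is identified.
X = np.vstack([flux, comps]).T                       # samples x channels
sources = FastICA(n_components=3, random_state=0).fit_transform(X)
best = np.argmax([abs(np.corrcoef(s, transit)[0, 1]) for s in sources.T])
print(f"transit-like component: {best}")
```

Note how no reference-star division is needed: the comparison channels only supply the statistics from which the independent components are learned.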
Zubkov, Mikhail; Stait-Gardner, Timothy; Price, William S
2014-06-01
Precise NMR diffusion measurements require detailed knowledge of the cumulative dephasing effect caused by the numerous gradient pulses present in most NMR pulse sequences. This effect, which ultimately manifests itself as the diffusion-related NMR signal attenuation, is usually described by the b-value or, in the case of multidirectional diffusion weighting, the b-matrix, the latter being common in diffusion-weighted NMR imaging. Neglecting some of the gradient pulses introduces an error in the calculated diffusion coefficient reaching in some cases 100% of the expected value. Therefore, ensuring the b-matrix calculation includes all the known gradient pulses leads to significant error reduction. Calculation of the b-matrix for simple gradient waveforms is rather straightforward, yet it grows cumbersome when complexly shaped and/or numerous gradient pulses are introduced. Making three broad assumptions about the gradient pulse arrangement in a sequence results in an efficient framework for the calculation of b-matrices, as well as providing some insight into optimal gradient pulse placement. The framework allows the diffusion-sensitizing effect of complexly shaped gradient waveforms to be accounted for with modest computational time and power. This is achieved by using the b-matrix elements of the simple unmodified pulse sequence and minimising the integration of the complexly shaped gradient waveform in the modified sequence. Such re-evaluation of the b-matrix elements retains all the analytical relevance of the straightforward approach, yet at least halves the amount of symbolic integration required. The application of the framework is demonstrated with the evaluation of the expression describing the diffusion-sensitizing effect caused by different bipolar gradient pulse modules. Copyright © 2014 Elsevier Inc. All rights reserved.
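A minimal numerical sketch of the underlying b-value calculation for a single gradient direction, checked against the analytic pulsed-gradient spin-echo result b = γ²g²δ²(Δ − δ/3). The effective-gradient convention is used (the second lobe is negated to account for the 180° pulse), and all timings are invented:

```python
import numpy as np

gamma = 2.675e8                 # 1H gyromagnetic ratio, rad s^-1 T^-1
g, delta, Delta, TE = 0.1, 2e-3, 10e-3, 15e-3   # T/m, s, s, s (made up)

t = np.linspace(0.0, TE, 15001)
grad = np.zeros_like(t)
grad[(t >= 1e-3) & (t < 1e-3 + delta)] = g               # first lobe
grad[(t >= 1e-3 + Delta) & (t < 1e-3 + Delta + delta)] = -g  # effective 2nd lobe

dt = t[1] - t[0]
q = gamma * np.cumsum(grad) * dt     # q(t) = gamma * integral of g(t') dt'
b_num = np.sum(q**2) * dt            # b = integral of q(t)^2 dt over the echo
b_ana = gamma**2 * g**2 * delta**2 * (Delta - delta / 3.0)
print(f"numerical b = {b_num:.3e} s/m^2, analytic b = {b_ana:.3e} s/m^2")
```

The two values agree to within the discretization error of the crude cumulative sum; the paper's framework is essentially about organizing this integration symbolically when many such lobes overlap.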
Independent Component Analysis applied to Ground-based observations
NASA Astrophysics Data System (ADS)
Martins-Filho, Walter; Griffith, Caitlin Ann; Pearson, Kyle; Waldmann, Ingo; Alvarez-Candal, Alvaro; Zellem, Robert
2017-10-01
Transit measurements of Jovian-sized exoplanetary atmospheres allow one to study the composition of exoplanets, largely independent of the planet's temperature profile. However, measurements of hot-Jupiter transits must achieve a high level of accuracy in the flux to determine the spectral modulation of the exoplanetary atmosphere. To accomplish this level of precision, we need to separate systematic errors, and, for ground-based measurements, the effects of Earth's atmosphere, from the signal due to the exoplanet, which is several orders of magnitude smaller. The effects of the terrestrial atmosphere and some of the time-dependent systematic errors of ground-based transit measurements are treated mainly by dividing the host star by a reference star at each wavelength and time step of the transit. Recently, Independent Component Analysis (ICA) has been used to remove systematic effects from the raw data of space-based observations (Waldmann, 2014, 2012; Morello et al., 2016, 2015). ICA is a statistical method born from the ideas of blind-source separation studies, which can be used to de-trend several independent source signals of a data set (Hyvarinen and Oja, 2000). This technique requires no additional prior knowledge of the data set. In addition, this technique has the advantage of requiring no reference star. Here we apply ICA to ground-based photometry of the exoplanet XO-2b recorded by the 61” Kuiper Telescope and compare the results of the ICA to those of a previous analysis from Zellem et al. (2015), which does not use ICA. We also simulate the effects of various conditions (concerning the systematic errors, noise, and the stability of the object on the detector) to determine the conditions under which ICA can be used with high precision to extract the light curve of exoplanetary photometry measurements.
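A toy sketch of the core idea: several simultaneously recorded raw light curves are treated as linear mixtures of independent sources (the transit plus a systematic trend), which ICA can unmix without a reference star. The signal shapes, mixing weights, and noise level below are invented for illustration; this is not the authors' pipeline.

```python
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(0)
t = np.linspace(-0.1, 0.1, 500)
transit = 1.0 - 0.01 * (np.abs(t) < 0.04).astype(float)   # toy box transit
airmass = 0.005 * (t - t.min())                           # slow systematic ramp
noise = lambda: 5e-4 * rng.standard_normal(t.size)

# Three "observations" mixing the same underlying sources differently
X = np.stack([transit + 0.8 * airmass + noise(),
              transit + 1.2 * airmass + noise(),
              transit + 0.5 * airmass + noise()], axis=1)

ica = FastICA(n_components=2, random_state=0)
sources = ica.fit_transform(X)     # columns ~ transit signal and systematic
print(sources.shape)               # (500, 2)
```

In practice the recovered components must still be identified (e.g., by correlation with auxiliary data) and rescaled, since ICA returns sources only up to sign and amplitude.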
Spinal motor control system incorporates an internal model of limb dynamics.
Shimansky, Y P
2000-10-01
The existence and utilization of an internal representation of the controlled object is one of the most important features of the functioning of neural motor control systems. This study demonstrates that this property already exists at the level of the spinal motor control system (SMCS), which is capable of generating motor patterns for reflex rhythmic movements, such as locomotion and scratching, without the aid of peripheral afferent feedback, but substantially modifies the generated activity in response to peripheral afferent stimuli. The SMCS is presented as an optimal control system whose optimality requires that it incorporate an internal model (IM) of the controlled object's dynamics. A novel functional mechanism for the integration of peripheral sensory signals with the corresponding predictive output from the IM, the summation of information precision (SIP), is proposed. In contrast to other models in which the correction of the internal representation of the controlled object's state is based on the calculation of a mismatch between the internal and external information sources, the SIP mechanism merges the information from these sources in order to optimize the precision of the controlled object's state estimate. It is demonstrated, based on scratching in decerebrate cats as an example of the spinal control of goal-directed movements, that the results of computer modeling agree with the experimental observations related to the SMCS's reactions to phasic and tonic peripheral afferent stimuli. It is also shown that the functional requirements imposed by the mathematical model of the SMCS comply with the current knowledge about the related properties of spinal neuronal circuitry. The crucial role of the spinal presynaptic inhibition mechanism in the neuronal implementation of SIP is elucidated. Important differences between the IM and a state predictor employed for compensating for a neural reflex time delay are discussed.
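One plausible reading of SIP is minimum-variance (precision-weighted) fusion of the sensed and predicted state estimates. The sketch below is this generic inverse-variance estimator, offered as an illustration of the idea rather than the paper's exact formulation:

```python
def sip_fuse(x_sensor, var_sensor, x_model, var_model):
    """Precision-weighted fusion of an afferent measurement with an
    internal-model prediction (minimum-variance linear combination)."""
    w_s = 1.0 / var_sensor           # precision = inverse variance
    w_m = 1.0 / var_model
    x = (w_s * x_sensor + w_m * x_model) / (w_s + w_m)
    var = 1.0 / (w_s + w_m)          # fused precision is the sum of precisions
    return x, var

# The fused variance is always smaller than either input variance:
print(sip_fuse(1.2, 0.04, 1.0, 0.09))   # -> (~1.139, ~0.028)
```

This "summation of precisions" behavior, rather than a mismatch (error) signal, is what distinguishes the SIP account from classical error-correction models.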
Knowledge representation in fuzzy logic
NASA Technical Reports Server (NTRS)
Zadeh, Lotfi A.
1989-01-01
The author presents a summary of the basic concepts and techniques underlying the application of fuzzy logic to knowledge representation. He then describes a number of examples relating to its use as a computational system for dealing with uncertainty and imprecision in the context of knowledge, meaning, and inference. It is noted that one of the basic aims of fuzzy logic is to provide a computational framework for knowledge representation and inference in an environment of uncertainty and imprecision. In such environments, fuzzy logic is effective when the solutions need not be precise and/or it is acceptable for a conclusion to have a dispositional rather than categorical validity. The importance of fuzzy logic derives from the fact that there are many real-world applications which fit these conditions, especially in the realm of knowledge-based systems for decision-making and control.
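As a concrete flavour of fuzzy knowledge representation, a dispositional statement such as "the room is warm" can be encoded as a graded membership function over temperature. The trapezoidal shape and its breakpoints below are illustrative choices, not taken from Zadeh's paper:

```python
def trapmf(x, a, b, c, d):
    """Trapezoidal membership function, a standard fuzzy-logic primitive."""
    if x <= a or x >= d:
        return 0.0
    if b <= x <= c:
        return 1.0
    return (x - a) / (b - a) if x < b else (d - x) / (d - c)

# "Warm" as a fuzzy set over temperature in degrees C (illustrative bounds)
for temp in (15, 20, 24, 30):
    print(temp, trapmf(temp, 17, 21, 26, 29))   # 0.0, 0.75, 1.0, 0.0
```

Membership grades between 0 and 1 are what let conclusions carry the dispositional rather than categorical validity the abstract describes.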
Patella instability: building bridges across the ocean a historic review.
Arendt, Elizabeth A; Dejour, David
2013-02-01
The diagnosis of and treatment for musculoskeletal disease and injuries have seen an explosion of new knowledge. More precise imaging, correlative injury anatomy, and more focused physical examination features, among others, have led to this upsurge of current insight. Crucial to this knowledge revolution is the expansion of international knowledge, which is aided by the adoption of a universal scientific language, the electronic transfer of information, and personal communication between surgeons and scientists across national boundaries. One area where this is particularly evident is in our knowledge and treatment of patellofemoral disorders. This article reviews the developments in the management of patellar dislocations by tracing their historical roots. This is not meant to be a comprehensive review, but rather to give current readers a "historical memory" upon which to judge and interpret our present-day bridge of knowledge. Level of evidence V.
Development of the One Centimeter Accuracy Geoid Model of Latvia for GNSS Measurements
NASA Astrophysics Data System (ADS)
Balodis, J.; Silabriedis, G.; Haritonova, D.; Kaļinka, M.; Janpaule, I.; Morozova, K.; Jumāre, I.; Mitrofanovs, I.; Zvirgzds, J.; Kaminskis, J.; Liepiņš, I.
2015-11-01
There is an urgent necessity for a highly accurate and reliable geoid model to enable prompt determination of normal heights with the use of GNSS coordinate determination, due to the high precision requirements in geodesy, building, and high-precision road construction. Additionally, the Latvian height system is in the process of transition from BAS-77 (Baltic Height System) to the EVRS2007 system. The accuracy of the geoid model must approach a precision of about ∼1 cm in anticipation of the Baltic Rail and other large projects. The use of all available and verified data sources is planned, including an enlarged set of GNSS/levelling data, gravimetric measurement data, and, additionally, vertical deflection measurements over the territory of Latvia. The work is proceeding stepwise; in this article, the issue of GNSS reference network stability is discussed. In order to achieve a ∼1 cm precision geoid, it is required to have a homogeneous high-precision GNSS network as a basis for ellipsoidal height determination at the GNSS/levelling points. Both the LatPos and EUPOS®-Riga networks have been examined in this article.
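The quantity such a model delivers is the geoid undulation N that ties GNSS ellipsoidal heights h to levelled normal heights H, so a ~1 cm geoid presumes both heights are known to comparable precision at the GNSS/levelling points. A minimal sketch of the relation and a simple error budget (all numbers illustrative):

```python
import math

# Geoid (quasigeoid) undulation: N = h - H
h_gnss = 36.512           # ellipsoidal height from GNSS, m
H_level = 12.847          # normal height from precise levelling, m
N = h_gnss - H_level
print(f"N = {N:.3f} m")

# Independent GNSS and levelling errors combine in quadrature;
# e.g., 5 mm GNSS and 3 mm levelling stay within a 1 cm target:
sigma_N = math.hypot(0.005, 0.003)
print(f"sigma_N ~ {sigma_N * 1000:.1f} mm")   # ~5.8 mm
```

This is why the stability of the GNSS reference network, which sets the quality of h, is examined first.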
[Precision and personalized medicine].
Sipka, Sándor
2016-10-01
The author describes the concept of "personalized medicine" and the newly introduced "precision medicine". "Precision medicine" applies the terms "phenotype", "endotype", and "biomarker" in order to characterize the various diseases more precisely. Using "biomarkers", a homogeneous type of disease (a "phenotype") can be divided into subgroups called "endotypes" requiring different forms of treatment and financing. The good results of "precision medicine" have become especially apparent in relation to allergic and autoimmune diseases. The application of this new way of thinking is going to be necessary in Hungary, too, in the near future for the participants, controllers, and financing boards of healthcare. Orv. Hetil., 2016, 157(44), 1739-1741.
ERIC Educational Resources Information Center
Siegler, Robert S.; Braithwaite, David W.
2016-01-01
In this review, we attempt to integrate two crucial aspects of numerical development: learning the magnitudes of individual numbers and learning arithmetic. Numerical magnitude development involves gaining increasingly precise knowledge of increasing ranges and types of numbers: from non-symbolic to small symbolic numbers, from smaller to larger…
NASA Astrophysics Data System (ADS)
Mawet, D.; Absil, O.; Montagnier, G.; Riaud, P.; Surdej, J.; Ducourant, C.; Augereau, J.-C.; Röttinger, S.; Girard, J.; Krist, J.; Stapelfeldt, K.
2012-08-01
Context. Most exoplanet imagers consist of ground-based adaptive optics coronagraphic cameras which are currently limited in contrast, sensitivity and astrometric precision, but advantageously observe in the near-infrared window (1-5 μm). Because of these practical limitations, our current observational aim of detecting and characterizing planets puts heavy constraints on target selection, observing strategies, data reduction, and follow-up. Most surveys so far have thus targeted young systems (1-100 Myr) to catch the putative remnant thermal radiation of giant planets, which peaks in the near-infrared. They also favor systems in the solar neighborhood (d < 80 pc), which eases angular resolution requirements but also ensures a good knowledge of the distance and proper motion, which are critical to secure the planet status and enable subsequent characterization. Aims: Because of their youth, it is very tempting to target the nearby star forming regions, which are typically twice as far as the bulk of objects usually combed for planets by direct imaging. Probing these interesting reservoirs sets additional constraints that we review in this paper by presenting the planet search that we initiated in 2008 around the disk-bearing T Tauri star IM Lup, which is part of the Lupus star forming region (140-190 pc). Methods: We show and discuss why age determination, the choice of evolutionary model for both the central star and the planet, precise knowledge of the host star proper motion, relative or absolute (between different instruments) astrometric accuracy (including plate scale calibration), and patience are the key ingredients for exoplanet searches around more distant young stars. Results: Unfortunately, most of the time, precision and perseverance do not pay off: we discovered a candidate companion around IM Lup in 2008, which we report here to be an unbound background object. We nevertheless review in detail the lessons learned from our endeavor, and additionally present the best detection limits ever calculated for IM Lup. We also accessorily report on the successful use of innovative data reduction techniques, such as damped-LOCI and iterative roll subtraction. Based on the ESO observing programs 380.C-0910, 084.C-0444, 287.C-5040; and HST observing program 10177.
Libertus, Melissa E.; Feigenson, Lisa; Halberda, Justin
2013-01-01
Previous research has found a relationship between individual differences in children’s precision when nonverbally approximating quantities and their school mathematics performance. School mathematics performance emerges from both informal (e.g., counting) and formal (e.g., knowledge of mathematics facts) abilities. It remains unknown whether approximation precision relates to both of these types of mathematics abilities. In the present study we assessed the precision of numerical approximation in 85 3- to 7-year-old children four times over a span of two years. Additionally, at the last time point, we tested children’s informal and formal mathematics abilities using the Test of Early Mathematics Ability (TEMA-3; Ginsburg & Baroody, 2003). We found that children’s numerical approximation precision correlated with and predicted their informal, but not formal, mathematics abilities when controlling for age and IQ. These results add to our growing understanding of the relationship between an unlearned, non-symbolic system of quantity representation and the system of mathematical reasoning that children come to master through instruction. PMID:24076381
Libertus, Melissa E; Feigenson, Lisa; Halberda, Justin
2013-12-01
Previous research has found a relationship between individual differences in children's precision when nonverbally approximating quantities and their school mathematics performance. School mathematics performance emerges from both informal (e.g., counting) and formal (e.g., knowledge of mathematics facts) abilities. It remains unknown whether approximation precision relates to both of these types of mathematics abilities. In the current study, we assessed the precision of numerical approximation in 85 3- to 7-year-old children four times over a span of 2years. In addition, at the final time point, we tested children's informal and formal mathematics abilities using the Test of Early Mathematics Ability (TEMA-3). We found that children's numerical approximation precision correlated with and predicted their informal, but not formal, mathematics abilities when controlling for age and IQ. These results add to our growing understanding of the relationship between an unlearned nonsymbolic system of quantity representation and the system of mathematics reasoning that children come to master through instruction. Copyright © 2013 Elsevier Inc. All rights reserved.
Huang, Linda; Fernandes, Helen; Zia, Hamid; Tavassoli, Peyman; Rennert, Hanna; Pisapia, David; Imielinski, Marcin; Sboner, Andrea; Rubin, Mark A; Kluk, Michael; Elemento, Olivier
2017-05-01
This paper describes the Precision Medicine Knowledge Base (PMKB; https://pmkb.weill.cornell.edu ), an interactive online application for collaborative editing, maintenance, and sharing of structured clinical-grade cancer mutation interpretations. PMKB was built using the Ruby on Rails Web application framework. Leveraging existing standards such as the Human Genome Variation Society variant description format, we implemented a data model that links variants to tumor-specific and tissue-specific interpretations. Key features of PMKB include support for all major variant types, standardized authentication, distinct user roles including high-level approvers, and detailed activity history. A REpresentational State Transfer (REST) application-programming interface (API) was implemented to query the PMKB programmatically. At the time of writing, PMKB contains 457 variant descriptions with 281 clinical-grade interpretations. The EGFR, BRAF, KRAS, and KIT genes are associated with the largest numbers of interpretable variants. PMKB's interpretations have been used in over 1500 AmpliSeq tests and 750 whole-exome sequencing tests. The interpretations are accessed either directly via the Web interface or programmatically via the existing API. An accurate and up-to-date knowledge base of genomic alterations of clinical significance is critical to the success of precision medicine programs. The open-access, programmatically accessible PMKB represents an important attempt at creating such a resource in the field of oncology. The PMKB was designed to help collect and maintain clinical-grade mutation interpretations and facilitate reporting for clinical cancer genomic testing. The PMKB was also designed to enable the creation of clinical cancer genomics automated reporting pipelines via an API. © The Author 2016. Published by Oxford University Press on behalf of the American Medical Informatics Association.
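The abstract states that PMKB exposes a REST API for programmatic access. The sketch below shows what such a query might look like; the endpoint path, parameters, and response structure are assumptions for illustration only, not documented PMKB routes, so consult the actual API documentation before use.

```python
import requests

BASE = "https://pmkb.weill.cornell.edu"

def get_interpretations(gene):
    """Fetch clinical-grade interpretations for a gene (hypothetical route)."""
    resp = requests.get(f"{BASE}/api/interpretations",   # assumed endpoint
                        params={"gene": gene}, timeout=30)
    resp.raise_for_status()
    return resp.json()

# Example: EGFR is among the most heavily annotated genes per the abstract.
# print(get_interpretations("EGFR"))
```

A programmatic interface like this is what enables the automated reporting pipelines mentioned in the conclusion.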
Collaborative knowledge acquisition for the design of context-aware alert systems
Joffe, Erel; Havakuk, Ofer; Herskovic, Jorge R; Patel, Vimla L
2012-01-01
Objective To present a framework for combining implicit knowledge acquisition from multiple experts with machine learning and to evaluate this framework in the context of anemia alerts. Materials and Methods Five internal medicine residents reviewed 18 anemia alerts, while ‘talking aloud’. They identified features that were reviewed by two or more physicians to determine appropriate alert level, etiology and treatment recommendation. Based on these features, data were extracted from 100 randomly-selected anemia cases for a training set and an additional 82 cases for a test set. Two staff internists assigned an alert level, etiology and treatment recommendation before and after reviewing the entire electronic medical record. The training set of 118 cases (100 plus 18) and the test set of 82 cases were explored using RIDOR and JRip algorithms. Results The feature set was sufficient to assess 93% of anemia cases (intraclass correlation for alert level before and after review of the records by internists 1 and 2 were 0.92 and 0.95, respectively). High-precision classifiers were constructed to identify low-level alerts (precision p=0.87, recall R=0.4), iron deficiency (p=1.0, R=0.73), and anemia associated with kidney disease (p=0.87, R=0.77). Discussion It was possible to identify low-level alerts and several conditions commonly associated with chronic anemia. This approach may reduce the number of clinically unimportant alerts. The study was limited to anemia alerts. Furthermore, clinicians were aware of the study hypotheses potentially biasing their evaluation. Conclusion Implicit knowledge acquisition, collaborative filtering and machine learning were combined automatically to induce clinically meaningful and precise decision rules. PMID:22744961
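The reported per-condition figures combine through the standard harmonic mean, F1 = 2PR/(P+R); a small sketch computing F1 for the three classifiers from the precision and recall values quoted above:

```python
def f1(p, r):
    """Harmonic mean of precision and recall."""
    return 2 * p * r / (p + r)

for name, p, r in [("low-level alerts", 0.87, 0.40),
                   ("iron deficiency", 1.00, 0.73),
                   ("kidney disease", 0.87, 0.77)]:
    print(f"{name}: F1 = {f1(p, r):.2f}")   # 0.55, 0.84, 0.82
```

The low-level-alert classifier's modest F1 despite high precision reflects its low recall, consistent with the paper's aim of suppressing unimportant alerts rather than catching every one.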
Griffon, N; Schuers, M; Dhombres, F; Merabti, T; Kerdelhué, G; Rollin, L; Darmoni, S J
2016-08-02
Despite international initiatives like Orphanet, it remains difficult to find up-to-date information about rare diseases. The aim of this study is to propose an exhaustive set of queries for PubMed based on terminological knowledge and to evaluate it against the queries based on expertise provided by the most frequently used resource in Europe: Orphanet. Four rare disease terminologies (MeSH, OMIM, HPO and HRDO) were manually mapped to each other, permitting the automatic creation of expanded terminological queries for rare diseases. For 30 rare diseases, 30 citations retrieved by the Orphanet expert query and/or the query based on terminological knowledge were assessed for relevance by two independent reviewers unaware of the query's origin. An adjudication procedure was used to resolve any discrepancy. Precision, relative recall and F-measure were all computed. For each Orphanet rare disease (n = 8982), there was a corresponding terminological query, in contrast with only 2284 queries provided by Orphanet. Only 553 citations were evaluated due to queries with 0 or only a few hits. There were no significant differences between the Orpha query and the terminological query in terms of precision, respectively 0.61 vs 0.52 (p = 0.13). Nevertheless, terminological queries retrieved more citations more often than Orpha queries (0.57 vs. 0.33; p = 0.01). Interestingly, Orpha queries seemed to retrieve older citations than terminological queries (p < 0.0001). The terminological queries proposed in this study are now available for all rare diseases. They may be a useful tool for both precision- and recall-oriented literature searches.
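Queries of this kind can be issued programmatically against PubMed through NCBI E-utilities, for example via Biopython's Entrez module. The term expansion below is an invented illustration in the spirit of the study, not one of the paper's actual mapped queries:

```python
from Bio import Entrez

Entrez.email = "you@example.org"   # required by NCBI; placeholder address

# Illustrative expanded query combining a MeSH heading with text synonyms
query = ('"Marfan Syndrome"[MeSH] OR "Marfan syndrome"[tiab] '
         'OR "MFS1"[tiab]')
handle = Entrez.esearch(db="pubmed", term=query, retmax=30)
record = Entrez.read(handle)
print(record["Count"], record["IdList"][:5])
```

Mapping several terminologies onto one another, as the authors did, amounts to generating such OR-expansions automatically for every rare disease.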
Huang, Linda; Fernandes, Helen; Zia, Hamid; Tavassoli, Peyman; Rennert, Hanna; Pisapia, David; Imielinski, Marcin; Sboner, Andrea; Rubin, Mark A; Kluk, Michael
2017-01-01
Objective: This paper describes the Precision Medicine Knowledge Base (PMKB; https://pmkb.weill.cornell.edu), an interactive online application for collaborative editing, maintenance, and sharing of structured clinical-grade cancer mutation interpretations. Materials and Methods: PMKB was built using the Ruby on Rails Web application framework. Leveraging existing standards such as the Human Genome Variation Society variant description format, we implemented a data model that links variants to tumor-specific and tissue-specific interpretations. Key features of PMKB include support for all major variant types, standardized authentication, distinct user roles including high-level approvers, and detailed activity history. A REpresentational State Transfer (REST) application-programming interface (API) was implemented to query the PMKB programmatically. Results: At the time of writing, PMKB contains 457 variant descriptions with 281 clinical-grade interpretations. The EGFR, BRAF, KRAS, and KIT genes are associated with the largest numbers of interpretable variants. PMKB’s interpretations have been used in over 1500 AmpliSeq tests and 750 whole-exome sequencing tests. The interpretations are accessed either directly via the Web interface or programmatically via the existing API. Discussion: An accurate and up-to-date knowledge base of genomic alterations of clinical significance is critical to the success of precision medicine programs. The open-access, programmatically accessible PMKB represents an important attempt at creating such a resource in the field of oncology. Conclusion: The PMKB was designed to help collect and maintain clinical-grade mutation interpretations and facilitate reporting for clinical cancer genomic testing. The PMKB was also designed to enable the creation of clinical cancer genomics automated reporting pipelines via an API. PMID:27789569
Liu, Hu-Chen; Liu, Long; Lin, Qing-Lian; Liu, Nan
2013-06-01
The two most important issues for expert systems are the acquisition of domain experts' professional knowledge and the representation and reasoning of the knowledge rules that have been identified. First, during expert knowledge acquisition, the domain expert panel often demonstrates different experience and knowledge from one another and produces different types of knowledge information, such as complete and incomplete, precise and imprecise, and known and unknown, because of its cross-functional and multidisciplinary nature. Second, as a promising tool for knowledge representation and reasoning, fuzzy Petri nets (FPNs) still suffer from a couple of deficiencies. The parameters in current FPN models cannot accurately represent increasingly complex knowledge-based systems, and the rules in most existing knowledge inference frameworks cannot be dynamically adjusted according to propositions' variation, as in human cognition and thinking. In this paper, we present a knowledge acquisition and representation approach using the fuzzy evidential reasoning approach and dynamic adaptive FPNs to solve the problems mentioned above. As illustrated by the numerical example, the proposed approach can well capture experts' diverse experience, enhance the knowledge representation power, and reason over rule-based knowledge more intelligently.
Development of a 0.5m clear aperture Cassegrain type collimator telescope
NASA Astrophysics Data System (ADS)
Ekinci, Mustafa; Selimoǧlu, Özgür
2016-07-01
A collimator is an optical instrument used to evaluate the performance of high-precision instruments, especially space-borne high resolution telescopes. The optical quality of the collimator telescope needs to be better than that of the instrument to be measured. This requirement makes the collimator telescope a very precise instrument, with high quality mirrors and a stable structure that keeps it operational under the specified conditions. In order to achieve the precision requirements and to ensure repeatability of the mounts for polishing and metrology, opto-mechanical principles are applied to the mirror mounts. The Finite Element Method is utilized to simulate gravity effects, integration errors and temperature variations. Finite element analysis results for the deformed optical surfaces are imported into the optical domain by using Zernike polynomials to evaluate the design against the specified WFE requirements. Both mirrors are aspheric and made from Zerodur for its stability and near-zero CTE; M1 is further light-weighted. Optical quality measurements of the mirrors are achieved by using custom-made CGHs on an interferometric test setup. The spider of the Cassegrain collimator telescope has a flexural adjustment mechanism driven by precise micrometers to overcome tilt errors originating from the finite stiffness of the structure and from integration errors. The collimator telescope is assembled and alignment methods are proposed.
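A minimal sketch of the FEM-to-optics step described above: projecting a surface deformation map onto a few low-order Zernike terms by least squares. The sampling, basis normalization, and coefficient values are invented for illustration and are far simpler than a real WFE analysis:

```python
import numpy as np

rng = np.random.default_rng(1)
r = np.sqrt(rng.uniform(0, 1, 2000))        # sample points on the unit disk
th = rng.uniform(0, 2 * np.pi, 2000)
x, y = r * np.cos(th), r * np.sin(th)

# Unnormalized Zernike basis: piston, tilt (x), tip (y), defocus (2r^2 - 1)
A = np.stack([np.ones_like(r), x, y, 2 * r**2 - 1], axis=1)

w_true = np.array([0.0, 5e-9, -3e-9, 12e-9])          # "true" sag, metres
sag = A @ w_true + 1e-10 * rng.standard_normal(r.size)  # toy FEA output

coeffs, *_ = np.linalg.lstsq(A, sag, rcond=None)
print(np.round(coeffs * 1e9, 2), "nm")      # recovers ~[0, 5, -3, 12]
```

The fitted coefficients can then be fed to the optical design model as surface perturbations, which is the role Zernike polynomials play in the workflow above.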
A method for exploring implicit concept relatedness in biomedical knowledge network.
Bai, Tian; Gong, Leiguang; Wang, Ye; Wang, Yan; Kulikowski, Casimir A; Huang, Lan
2016-07-19
Biomedical information and knowledge, structural and non-structural, stored in different repositories can be semantically connected to form a hybrid knowledge network. How to compute relatedness between concepts and discover valuable but implicit information or knowledge from it effectively and efficiently is of paramount importance for precision medicine, and a major challenge facing the biomedical research community. In this study, a hybrid biomedical knowledge network is constructed by linking concepts across multiple biomedical ontologies as well as non-structural biomedical knowledge sources. To discover implicit relatedness between concepts in ontologies for which potentially valuable relationships (implicit knowledge) may exist, we developed a Multi-Ontology Relatedness Model (MORM) within the knowledge network, for which a relatedness network (RN) is defined and computed across multiple ontologies using a formal inference mechanism of set-theoretic operations. Semantic constraints are designed and implemented to prune the search space of the relatedness network. Experiments to test examples of several biomedical applications have been carried out, and the evaluation of the results showed an encouraging potential of the proposed approach to biomedical knowledge discovery.
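A minimal set-theoretic relatedness measure between two concepts, illustrative of the kind of operations such a model composes (the paper's MORM is considerably richer than a single Jaccard index over neighbor sets):

```python
def jaccard(neighbors_a, neighbors_b):
    """Set-overlap relatedness of two concepts' neighborhoods."""
    a, b = set(neighbors_a), set(neighbors_b)
    return len(a & b) / len(a | b) if (a or b) else 0.0

# Toy concept neighborhoods gathered from different ontologies/sources
tamoxifen = {"ER", "BRCA1", "breast neoplasm", "CYP2D6"}
raloxifene = {"ER", "osteoporosis", "breast neoplasm"}
print(jaccard(tamoxifen, raloxifene))   # 2 shared / 5 total = 0.4
```

Semantic constraints of the kind the authors describe would prune which neighbor sets are even compared, keeping the relatedness network tractable.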
Hadron production and neutrino beams
NASA Astrophysics Data System (ADS)
Guglielmi, A.
2006-11-01
Precise measurements of the neutrino mixing parameters in oscillation experiments at accelerators require new high-intensity and high-purity neutrino beams. Ancillary hadron-production measurements are then needed as inputs to precise calculations of neutrino beams and of atmospheric neutrino fluxes.
Forming Mandrels for X-Ray Mirror Substrates
NASA Technical Reports Server (NTRS)
Blake, Peter N.; Saha, Timo; Zhang, Will; O'Dell, Stephen; Kester, Thomas; Jones, William
2011-01-01
Precision forming mandrels are one element in X-ray mirror development at NASA. The current mandrel fabrication process is capable of meeting the allocated precision requirements for a 5 arcsec telescope. A manufacturing plan is outlined for a large IXO-scale program.
Precise time technology for selected Air Force systems: Present status and future requirements
NASA Technical Reports Server (NTRS)
Yannoni, N. F.
1981-01-01
Precise time and time interval (PTTI) technology is becoming increasingly significant to Air Force operations as digital techniques find expanded utility in military missions. Timing has a key role in system function as well as in navigation. A survey of the PTTI needs of several Air Force systems is presented. Current technology supporting these needs is reviewed, and new requirements are emphasized for systems as they transfer from initial development to final operational deployment.
Peer Assessment with Online Tools to Improve Student Modeling
NASA Astrophysics Data System (ADS)
Atkins, Leslie J.
2012-11-01
Introductory physics courses often require students to develop precise models of phenomena and represent these with diagrams, including free-body diagrams, light-ray diagrams, and maps of field lines. Instructors expect that students will adopt a certain rigor and precision when constructing these diagrams, but we want that rigor and precision to be an aid to sense-making rather than meeting seemingly arbitrary requirements set by the instructor. By giving students the authority to develop their own models and establish requirements for their diagrams, the sense that these are arbitrary requirements diminishes and students are more likely to see modeling as a sense-making activity. The practice of peer assessment can help students take ownership; however, it can be difficult for instructors to manage. Furthermore, it is not without risk: students can be reluctant to critique their peers, they may view this as the job of the instructor, and there is no guarantee that students will employ greater rigor and precision as a result of peer assessment. In this article, we describe one approach for peer assessment that can establish norms for diagrams in a way that is student driven, where students retain agency and authority in assessing and improving their work. We show that such an approach does indeed improve students' diagrams and abilities to assess their own work, without sacrificing students' authority and agency.
Toward Precision Healthcare: Context and Mathematical Challenges
Colijn, Caroline; Jones, Nick; Johnston, Iain G.; Yaliraki, Sophia; Barahona, Mauricio
2017-01-01
Precision medicine refers to the idea of delivering the right treatment to the right patient at the right time, usually with a focus on a data-centered approach to this task. In this perspective piece, we use the term “precision healthcare” to describe the development of precision approaches that bridge from the individual to the population, taking advantage of individual-level data, but also taking the social context into account. These problems give rise to a broad spectrum of technical, scientific, policy, ethical and social challenges, and new mathematical techniques will be required to meet them. To ensure that the science underpinning “precision” is robust, interpretable and well-suited to meet the policy, ethical and social questions that such approaches raise, the mathematical methods for data analysis should be transparent, robust, and able to adapt to errors and uncertainties. In particular, precision methodologies should capture the complexity of data, yet produce tractable descriptions at the relevant resolution while preserving intelligibility and traceability, so that they can be used by practitioners to aid decision-making. Through several case studies in this domain of precision healthcare, we argue that this vision requires the development of new mathematical frameworks, both in modeling and in data analysis and interpretation. PMID:28377724
49 CFR 383.110 - General requirement.
Code of Federal Regulations, 2011 CFR
2011-10-01
... STANDARDS; REQUIREMENTS AND PENALTIES Required Knowledge and Skills § 383.110 General requirement. All drivers of CMVs must have the knowledge and skills necessary to operate a CMV safely as contained in this subpart. The specific types of items that a State must include in the knowledge and skills tests that it...
DOT National Transportation Integrated Search
1983-01-01
A variety of measurements are sensitive to alcoholism; some may be applicable to screening programs, but more precise knowledge of sensitivity and specificity would help to select a minimal test battery. This study assessed the sensitivity of some te...
Measurement and estimation of performance characteristics (i.e., precision, bias, performance range, interferences and sensitivity) are often neglected in the development and use of new biological sampling methods. However, knowledge of this information is critical in enabling p...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Johnson, John Asher; Cargile, Phillip A.; Sinukoff, Evan
We present stellar and planetary properties for 1305 Kepler Objects of Interest hosting 2025 planet candidates observed as part of the California-Kepler Survey. We combine spectroscopic constraints, presented in Paper I, with stellar interior modeling to estimate stellar masses, radii, and ages. Stellar radii are typically constrained to 11%, compared to 40% when only photometric constraints are used. Stellar masses are constrained to 4%, and ages are constrained to 30%. We verify the integrity of the stellar parameters through comparisons with asteroseismic studies and Gaia parallaxes. We also recompute planetary radii for 2025 planet candidates. Because knowledge of planetary radii is often limited by uncertainties in stellar size, we improve the uncertainties in planet radii from typically 42% to 12%. We also leverage improved knowledge of stellar effective temperature to recompute incident stellar fluxes for the planets, now precise to 21%, compared to a factor of two when derived from photometry.
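Why stellar radii floor planet radii: the transit depth d gives Rp/Rs = sqrt(d), so fractional errors add in quadrature as (s_Rp/Rp)^2 = (s_Rs/Rs)^2 + (s_d/2d)^2. A small sketch; the 5% depth uncertainty is an assumed illustrative value, while the 40% and 11% stellar constraints are from the abstract:

```python
import math

def frac_sigma_rp(frac_sigma_rs, frac_sigma_depth):
    """Fractional planet-radius error from stellar-radius and depth errors."""
    return math.hypot(frac_sigma_rs, 0.5 * frac_sigma_depth)

for s_rs in (0.40, 0.11):   # photometric-only vs spectroscopic constraints
    print(f"s_Rs = {s_rs:.0%} -> s_Rp = {frac_sigma_rp(s_rs, 0.05):.1%}")
```

The output (roughly 40% versus 11%) mirrors the abstract's improvement in typical planet-radius uncertainties from 42% to 12%.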
Graph-based signal integration for high-throughput phenotyping
2012-01-01
Background Electronic Health Records aggregated in Clinical Data Warehouses (CDWs) promise to revolutionize Comparative Effectiveness Research and suggest new avenues of research. However, the effectiveness of CDWs is diminished by the lack of properly labeled data. We present a novel approach that integrates knowledge from the CDW, the biomedical literature, and the Unified Medical Language System (UMLS) to perform high-throughput phenotyping. In this paper, we automatically construct a graphical knowledge model and then use it to phenotype breast cancer patients. We compare the performance of this approach to using MetaMap when labeling records. Results MetaMap's overall accuracy at identifying breast cancer patients was 51.1% (n=428); recall=85.4%, precision=26.2%, and F1=40.1%. Our unsupervised graph-based high-throughput phenotyping had accuracy of 84.1%; recall=46.3%, precision=61.2%, and F1=52.8%. Conclusions We conclude that our approach is a promising alternative for unsupervised high-throughput phenotyping. PMID:23320851
The NCI Genomic Data Commons as an engine for precision medicine.
Jensen, Mark A; Ferretti, Vincent; Grossman, Robert L; Staudt, Louis M
2017-07-27
The National Cancer Institute Genomic Data Commons (GDC) is an information system for storing, analyzing, and sharing genomic and clinical data from patients with cancer. The recent high-throughput sequencing of cancer genomes and transcriptomes has produced a big data problem that precludes many cancer biologists and oncologists from gleaning knowledge from these data regarding the nature of malignant processes and the relationship between tumor genomic profiles and treatment response. The GDC aims to democratize access to cancer genomic data and to foster the sharing of these data to promote precision medicine approaches to the diagnosis and treatment of cancer.
Bringing mirrors to rest: grating concepts for ultra-precise interferometry
NASA Astrophysics Data System (ADS)
Kroker, Stefanie; Kley, Ernst-Bernhard; Tünnermann, Andreas
2015-02-01
Experiments in the field of high-precision metrology, such as the detection of gravitational waves, are crucially limited by the thermal fluctuations of the optical components. In this contribution we present the current state of knowledge on high contrast gratings (HCGs) as low-noise elements for gravitational wave interferometers. We discuss how the properties of HCGs can be tailored such that, besides highly reflective mirrors, diffractive beam splitters can also be realized. Further, we show the impact of such gratings on the sensitivity of future gravitational wave detectors, which can pave the way for the new field of gravitational wave astronomy.
Aguilar, M; Aisa, D; Alpat, B; Alvino, A; Ambrosi, G; Andeen, K; Arruda, L; Attig, N; Azzarello, P; Bachlechner, A; Barao, F; Barrau, A; Barrin, L; Bartoloni, A; Basara, L; Battarbee, M; Battiston, R; Bazo, J; Becker, U; Behlmann, M; Beischer, B; Berdugo, J; Bertucci, B; Bigongiari, G; Bindi, V; Bizzaglia, S; Bizzarri, M; Boella, G; de Boer, W; Bollweg, K; Bonnivard, V; Borgia, B; Borsini, S; Boschini, M J; Bourquin, M; Burger, J; Cadoux, F; Cai, X D; Capell, M; Caroff, S; Casaus, J; Cascioli, V; Castellini, G; Cernuda, I; Cerreta, D; Cervelli, F; Chae, M J; Chang, Y H; Chen, A I; Chen, H; Cheng, G M; Chen, H S; Cheng, L; Chou, H Y; Choumilov, E; Choutko, V; Chung, C H; Clark, C; Clavero, R; Coignet, G; Consolandi, C; Contin, A; Corti, C; Cortina Gil, E; Coste, B; Creus, W; Crispoltoni, M; Cui, Z; Dai, Y M; Delgado, C; Della Torre, S; Demirköz, M B; Derome, L; Di Falco, S; Di Masso, L; Dimiccoli, F; Díaz, C; von Doetinchem, P; Donnini, F; Du, W J; Duranti, M; D'Urso, D; Eline, A; Eppling, F J; Eronen, T; Fan, Y Y; Farnesini, L; Feng, J; Fiandrini, E; Fiasson, A; Finch, E; Fisher, P; Galaktionov, Y; Gallucci, G; García, B; García-López, R; Gargiulo, C; Gast, H; Gebauer, I; Gervasi, M; Ghelfi, A; Gillard, W; Giovacchini, F; Goglov, P; Gong, J; Goy, C; Grabski, V; Grandi, D; Graziani, M; Guandalini, C; Guerri, I; Guo, K H; Haas, D; Habiby, M; Haino, S; Han, K C; He, Z H; Heil, M; Hoffman, J; Hsieh, T H; Huang, Z C; Huh, C; Incagli, M; Ionica, M; Jang, W Y; Jinchi, H; Kanishev, K; Kim, G N; Kim, K S; Kirn, Th; Kossakowski, R; Kounina, O; Kounine, A; Koutsenko, V; Krafczyk, M S; La Vacca, G; Laudi, E; Laurenti, G; Lazzizzera, I; Lebedev, A; Lee, H T; Lee, S C; Leluc, C; Levi, G; Li, H L; Li, J Q; Li, Q; Li, Q; Li, T X; Li, W; Li, Y; Li, Z H; Li, Z Y; Lim, S; Lin, C H; Lipari, P; Lippert, T; Liu, D; Liu, H; Lolli, M; Lomtadze, T; Lu, M J; Lu, S Q; Lu, Y S; Luebelsmeyer, K; Luo, J Z; Lv, S S; Majka, R; Mañá, C; Marín, J; Martin, T; Martínez, G; Masi, N; Maurin, D; Menchaca-Rocha, A; Meng, Q; Mo, D C; Morescalchi, L; Mott, P; Müller, M; Ni, J Q; Nikonov, N; Nozzoli, F; Nunes, P; Obermeier, A; Oliva, A; Orcinha, M; Palmonari, F; Palomares, C; Paniccia, M; Papi, A; Pauluzzi, M; Pedreschi, E; Pensotti, S; Pereira, R; Picot-Clemente, N; Pilo, F; Piluso, A; Pizzolotto, C; Plyaskin, V; Pohl, M; Poireau, V; Postaci, E; Putze, A; Quadrani, L; Qi, X M; Qin, X; Qu, Z Y; Räihä, T; Rancoita, P G; Rapin, D; Ricol, J S; Rodríguez, I; Rosier-Lees, S; Rozhkov, A; Rozza, D; Sagdeev, R; Sandweiss, J; Saouter, P; Sbarra, C; Schael, S; Schmidt, S M; Schulz von Dratzig, A; Schwering, G; Scolieri, G; Seo, E S; Shan, B S; Shan, Y H; Shi, J Y; Shi, X Y; Shi, Y M; Siedenburg, T; Son, D; Spada, F; Spinella, F; Sun, W; Sun, W H; Tacconi, M; Tang, C P; Tang, X W; Tang, Z C; Tao, L; Tescaro, D; Ting, Samuel C C; Ting, S M; Tomassetti, N; Torsti, J; Türkoğlu, C; Urban, T; Vagelli, V; Valente, E; Vannini, C; Valtonen, E; Vaurynovich, S; Vecchi, M; Velasco, M; Vialle, J P; Vitale, V; Vitillo, S; Wang, L Q; Wang, N H; Wang, Q L; Wang, R S; Wang, X; Wang, Z X; Weng, Z L; Whitman, K; Wienkenhöver, J; Wu, H; Wu, X; Xia, X; Xie, M; Xie, S; Xiong, R Q; Xin, G M; Xu, N S; Xu, W; Yan, Q; Yang, J; Yang, M; Ye, Q H; Yi, H; Yu, Y J; Yu, Z Q; Zeissler, S; Zhang, J H; Zhang, M T; Zhang, X B; Zhang, Z; Zheng, Z M; Zhuang, H L; Zhukov, V; Zichichi, A; Zimmermann, N; Zuccon, P; Zurbach, C
2015-05-01
A precise measurement of the proton flux in primary cosmic rays with rigidity (momentum/charge) from 1 GV to 1.8 TV is presented based on 300 million events. Knowledge of the rigidity dependence of the proton flux is important in understanding the origin, acceleration, and propagation of cosmic rays. We present the detailed variation with rigidity of the flux spectral index for the first time. The spectral index progressively hardens at high rigidities.
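The flux spectral index gamma is the local log-log slope of the flux, gamma = d(log Phi)/d(log R), and "progressive hardening" means gamma increases with rigidity. A toy sketch with a synthetic power law (not AMS data; the hardening scale and amplitudes are invented):

```python
import numpy as np

R = np.logspace(0.5, 3, 40)                       # rigidity, GV
Phi = 1.0e4 * R**-2.7 * (1 + R / 300.0)**0.13     # toy hardening near 300 GV

logR, logPhi = np.log(R), np.log(Phi)
gamma = np.gradient(logPhi, logR)                 # local spectral index
print(f"gamma at 10 GV: {gamma[np.argmin(abs(R - 10))]:.2f}, "
      f"at 1 TV: {gamma[np.argmin(abs(R - 1000))]:.2f}")   # ~-2.70 vs ~-2.60
```

Measuring this slope as a detailed function of rigidity, rather than fitting a single power law, is the novelty the abstract highlights.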
Development and validation of the AFIT scene and sensor emulator for testing (ASSET)
NASA Astrophysics Data System (ADS)
Young, Shannon R.; Steward, Bryan J.; Gross, Kevin C.
2017-05-01
ASSET is a physics-based model used to generate synthetic data sets of wide field of view (WFOV) electro-optical and infrared (EO/IR) sensors with realistic radiometric properties, noise characteristics, and sensor artifacts. It was developed to meet the need for applications where precise knowledge of the underlying truth is required but is impractical to obtain for real sensors. For example, due to accelerating advances in imaging technology, the volume of data available from WFOV EO/IR sensors has drastically increased over the past several decades, and as a result, there is a need for fast, robust, automatic detection and tracking algorithms. Evaluation of these algorithms is difficult for objects that traverse a wide area (100-10,000 km) because obtaining accurate truth for the full object trajectory often requires costly instrumentation. Additionally, tracking and detection algorithms perform differently depending on factors such as the object kinematics, environment, and sensor configuration. A variety of truth data sets spanning these parameters are needed for thorough testing, which is often cost prohibitive. The use of synthetic data sets for algorithm development allows for full control of scene parameters with full knowledge of truth. However, in order for analysis using synthetic data to be meaningful, the data must be truly representative of real sensor collections. ASSET aims to provide a means of generating such representative data sets for WFOV sensors operating in the visible through thermal infrared. The work reported here describes the ASSET model, as well as provides validation results from comparisons to laboratory imagers and satellite data (e.g. Landsat-8).
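A minimal flavour of radiometrically motivated synthetic imagery: a scene expressed in photoelectrons, degraded by Poisson shot noise, Gaussian read noise, and quantization. This is a generic sensor-noise sketch with invented parameters, far simpler than ASSET's actual sensor models:

```python
import numpy as np

rng = np.random.default_rng(42)
scene_e = rng.uniform(200.0, 2000.0, size=(128, 128))  # mean e- per pixel

read_noise_e = 15.0      # rms read noise, electrons
gain_e_per_dn = 2.0      # conversion gain, e- per digital number
bias_dn = 100            # detector bias offset

frame_e = rng.poisson(scene_e) + read_noise_e * rng.standard_normal(scene_e.shape)
frame_dn = np.clip(np.round(frame_e / gain_e_per_dn) + bias_dn, 0, 16383)
print(frame_dn.mean(), frame_dn.std())
```

Because the noiseless scene is known exactly, any detection or tracking algorithm run on such frames can be scored against perfect truth, which is the core value proposition of the model.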
Özdemir, Vural; Kolker, Eugene
2016-02-01
Nutrition is central to sustenance of good health, not to mention its role as a cultural object that brings together or draws lines among societies. Undoubtedly, understanding the future paths of nutrition science in the current era of Big Data remains firmly on science, technology, and innovation strategy agendas around the world. Nutrigenomics, the confluence of nutrition science with genomics, brought about a new focus on and legitimacy for "variability science" (i.e., the study of mechanisms of person-to-person and population differences in response to food, and the ways in which food variably impacts the host, for example, nutrient-related disease outcomes). Societal expectations, both public and private, and claims over genomics-guided and individually-tailored precision diets continue to proliferate. While the prospects of nutrition science, and nutrigenomics in particular, are established, there is a need to integrate the efforts in four Big Data domains that are naturally allied--agrigenomics, nutrigenomics, nutriproteomics, and nutrimetabolomics--that address complementary variability questions pertaining to individual differences in response to food-related environmental exposures. The joint use of these four omics knowledge domains, coined as Precision Nutrition 4.0 here, has sadly not been realized to date, but the potentials for such integrated knowledge innovation are enormous. Future personalized nutrition practices would benefit from a seamless planning of life sciences funding, research, and practice agendas from "farm to clinic to supermarket to society," and from "genome to proteome to metabolome." Hence, this innovation foresight analysis explains the already existing potentials waiting to be realized, and suggests ways forward for innovation in both technology and ethics foresight frames on precision nutrition. We propose the creation of a new Precision Nutrition Evidence Barometer for periodic, independent, and ongoing retrieval, screening, and aggregation of the relevant life sciences data. For innovation in Big Data ethics oversight, we suggest "nested governance" wherein the processes of knowledge production are made transparent in the continuum from life sciences and social sciences to humanities, and where each innovation actor reports to another accountability and transparency layer: scientists to ethicists, and ethicists to scholars in the emerging field of ethics-of-ethics. Such nested innovation ecosystems offer safety against innovation blind spots, calibrate visible/invisible power differences in the cultures of science or ethics, and ultimately, reducing the risk of "paper values"--what people say--and "real values"--what innovation actors actually do. We are optimistic that the convergence of nutrigenomics with nutriproteomics, nutrimetabolomics, and agrigenomics can build a robust, sustainable, and trustworthy precision nutrition 4.0 agenda, as articulated in this Big Data and ethics foresight analysis.
The Role of Genetics in Advancing Precision Medicine for Alzheimer's Disease-A Narrative Review.
Freudenberg-Hua, Yun; Li, Wentian; Davies, Peter
2018-01-01
Alzheimer's disease (AD) is the most common type of dementia, which has a substantial genetic component. AD affects predominantly older people. Accordingly, the prevalence of dementia has been rising as the population ages. To date, there are no effective interventions that can cure or halt the progression of AD. The only available treatments are the management of certain symptoms and consequences of dementia. The current state-of-the-art medical care for AD comprises three simple principles: prevent the preventable, achieve early diagnosis, and manage the manageable symptoms. This review provides a summary of the current state of knowledge of risk factors for AD, biological diagnostic testing, and prospects for treatment. Special emphasis is given to recent advances in genetics of AD and the way genomic data may support prevention, early intervention, and development of effective pharmacological treatments. Mutations in the APP, PSEN1 , and PSEN2 genes cause early onset Alzheimer's disease (EOAD) that follows a Mendelian inheritance pattern. For late onset Alzheimer's disease (LOAD), APOE4 was identified as a major risk allele more than two decades ago. Population-based genome-wide association studies of late onset AD have now additionally identified common variants at roughly 30 genetic loci. Furthermore, rare variants (allele frequency <1%) that influence the risk for LOAD have been identified in several genes. These genetic advances have broadened our insights into the biological underpinnings of AD. Moreover, the known genetic risk variants could be used to identify presymptomatic individuals at risk for AD and support diagnostic assessment of symptomatic subjects. Genetic knowledge may also facilitate precision medicine. The goal of precision medicine is to use biological knowledge and other health information to predict individual disease risk, understand disease etiology, identify disease subcategories, improve diagnosis, and provide personalized treatment strategies. We discuss the potential role of genetics in advancing precision medicine for AD along with its ethical challenges. We outline strategies to implement genomics into translational clinical research that will not only improve accuracy of dementia diagnosis, thus enabling more personalized treatment strategies, but may also speed up the discovery of novel drugs and interventions.
The Role of Genetics in Advancing Precision Medicine for Alzheimer’s Disease—A Narrative Review
Freudenberg-Hua, Yun; Li, Wentian; Davies, Peter
2018-01-01
Alzheimer’s disease (AD) is the most common type of dementia, which has a substantial genetic component. AD affects predominantly older people. Accordingly, the prevalence of dementia has been rising as the population ages. To date, there are no effective interventions that can cure or halt the progression of AD. The only available treatments are the management of certain symptoms and consequences of dementia. The current state-of-the-art medical care for AD comprises three simple principles: prevent the preventable, achieve early diagnosis, and manage the manageable symptoms. This review provides a summary of the current state of knowledge of risk factors for AD, biological diagnostic testing, and prospects for treatment. Special emphasis is given to recent advances in genetics of AD and the way genomic data may support prevention, early intervention, and development of effective pharmacological treatments. Mutations in the APP, PSEN1, and PSEN2 genes cause early onset Alzheimer’s disease (EOAD) that follows a Mendelian inheritance pattern. For late onset Alzheimer’s disease (LOAD), APOE4 was identified as a major risk allele more than two decades ago. Population-based genome-wide association studies of late onset AD have now additionally identified common variants at roughly 30 genetic loci. Furthermore, rare variants (allele frequency <1%) that influence the risk for LOAD have been identified in several genes. These genetic advances have broadened our insights into the biological underpinnings of AD. Moreover, the known genetic risk variants could be used to identify presymptomatic individuals at risk for AD and support diagnostic assessment of symptomatic subjects. Genetic knowledge may also facilitate precision medicine. The goal of precision medicine is to use biological knowledge and other health information to predict individual disease risk, understand disease etiology, identify disease subcategories, improve diagnosis, and provide personalized treatment strategies. We discuss the potential role of genetics in advancing precision medicine for AD along with its ethical challenges. We outline strategies to implement genomics into translational clinical research that will not only improve accuracy of dementia diagnosis, thus enabling more personalized treatment strategies, but may also speed up the discovery of novel drugs and interventions. PMID:29740579
Ab Initio Computation of Dynamical Properties: Pressure Broadening
NASA Astrophysics Data System (ADS)
Wiesenfeld, Laurent; Drouin, Brian
2014-06-01
Rotational spectroscopy of polar molecules is the main observational tool in many areas of astrophysics for gases of low densities (n ˜ 10^2 - 10^8 cm^-3). Spectral line shapes in astrophysical media are largely dominated by turbulence-induced Doppler effects, and natural line broadening is negligible. However, line broadening remains an important tool for denser gases, like planetary upper atmospheres. Understanding the excitation schemes of polar molecules requires knowledge of the excitation transfer rates due to collisional excitation between the polar molecule and the ambient gas, usually H2. Transport properties in ionized media also require a precise knowledge of momentum transfer rates by elastic collisions. Direct absolute experiments assessing the theoretically computed cross sections and energy/momentum transfer rates are scarce. The best way is to measure not individual scattering events but rather the global effect of the buffer gas, through the pressure broadening cross sections, whose magnitude can be measured without any scaling parameters. At low temperatures, both elastic and inelastic scattering amplitudes are tested. At higher temperature, depending on the interaction strength, only inelastic scattering cross sections are shown to play a significant role [1,2]. Thanks to advances in computer capabilities, it has become practical to compute spectral line parameters from ab initio quantum chemistry. In particular, the theory of rotational line broadening is readily incorporated into quantum scattering theory, like close-coupling schemes. The only approximations used in the computation are the isolated collision/isolated line approximations. We compute the non-binding interaction potential with high-precision quantum chemistry and fit the resulting ab initio points onto a suitable functional. We have recently computed several such systems for molecules in an H2 buffer gas: H2O [3], H2CO [4], HCO+ [5]. Detailed computations taking into account the ortho or para state of H2 were performed, at temperatures ranging typically from 10 K to 100 K. Reliable results are found that compare favorably to experiments. In particular, the water-molecular hydrogen system has been thoroughly computed and successfully tested experimentally [6]. New projects consider other simple molecules as well as heavier systems, relevant for cometary comae and planetary upper atmospheres.
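An order-of-magnitude link between a collisional cross section and the pressure-broadening half-width is gamma ~ n <v> sigma / (2 pi) in the impact approximation. The sketch below is this back-of-the-envelope relation with illustrative values, not the close-coupling calculation the abstract describes:

```python
import numpy as np

kB = 1.380649e-23                     # Boltzmann constant, J/K
T = 50.0                              # gas temperature, K
n = 1e22                              # H2 perturber density, m^-3
mu = 1.6e-27 * (18 * 2) / (18 + 2)    # approx. H2O-H2 reduced mass, kg
v_mean = np.sqrt(8 * kB * T / (np.pi * mu))   # mean relative speed
sigma = 1e-18                         # broadening cross section, m^2 (typical scale)

gamma_hz = n * v_mean * sigma / (2 * np.pi)   # Lorentzian HWHM, Hz
print(f"<v> = {v_mean:.0f} m/s, HWHM ~ {gamma_hz / 1e6:.2f} MHz")
```

Comparing measured half-widths against such predictions, with no free scaling parameter, is what makes pressure broadening a clean test of the computed cross sections.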
Evaluating gold standard corpora against gene/protein tagging solutions and lexical resources
2013-01-01
Motivation The identification of protein and gene names (PGNs) from the scientific literature requires semantic resources: Terminological and lexical resources deliver the term candidates into PGN tagging solutions and the gold standard corpora (GSC) train them to identify term parameters and contextual features. Ideally all three resources, i.e. corpora, lexica and taggers, cover the same domain knowledge, and thus support identification of the same types of PGNs and cover all of them. Unfortunately, none of the three serves as a predominant standard and for this reason it is worth exploring, how these three resources comply with each other. We systematically compare different PGN taggers against publicly available corpora and analyze the impact of the included lexical resource in their performance. In particular, we determine the performance gains through false positive filtering, which contributes to the disambiguation of identified PGNs. Results In general, machine learning approaches (ML-Tag) for PGN tagging show higher F1-measure performance against the BioCreative-II and Jnlpba GSCs (exact matching), whereas the lexicon based approaches (LexTag) in combination with disambiguation methods show better results on FsuPrge and PennBio. The ML-Tag solutions balance precision and recall, whereas the LexTag solutions have different precision and recall profiles at the same F1-measure across all corpora. Higher recall is achieved with larger lexical resources, which also introduce more noise (false positive results). The ML-Tag solutions certainly perform best, if the test corpus is from the same GSC as the training corpus. As expected, the false negative errors characterize the test corpora and – on the other hand – the profiles of the false positive mistakes characterize the tagging solutions. Lex-Tag solutions that are based on a large terminological resource in combination with false positive filtering produce better results, which, in addition, provide concept identifiers from a knowledge source in contrast to ML-Tag solutions. Conclusion The standard ML-Tag solutions achieve high performance, but not across all corpora, and thus should be trained using several different corpora to reduce possible biases. The LexTag solutions have different profiles for their precision and recall performance, but with similar F1-measure. This result is surprising and suggests that they cover a portion of the most common naming standards, but cope differently with the term variability across the corpora. The false positive filtering applied to LexTag solutions does improve the results by increasing their precision without compromising significantly their recall. The harmonisation of the annotation schemes in combination with standardized lexical resources in the tagging solutions will enable their comparability and will pave the way for a shared standard. PMID:24112383
46 CFR 11.713 - Requirements for maintaining current knowledge of waters to be navigated.
Code of Federal Regulations, 2010 CFR
2010-10-01
... 46 Shipping 1 2010-10-01 2010-10-01 false Requirements for maintaining current knowledge of waters... § 11.713 Requirements for maintaining current knowledge of waters to be navigated. (a) If a first class... current knowledge of the route. Persons using this method of re-familiarization shall certify, when...
Van Hecke, Wim; Sijbers, Jan; De Backer, Steve; Poot, Dirk; Parizel, Paul M; Leemans, Alexander
2009-07-01
Although many studies are starting to use voxel-based analysis (VBA) methods to compare diffusion tensor images between healthy and diseased subjects, it has been demonstrated that VBA results depend heavily on parameter settings and implementation strategies, such as the applied coregistration technique, smoothing kernel width, statistical analysis, etc. In order to investigate the effect of different parameter settings and implementations on the accuracy and precision of the VBA results quantitatively, ground truth knowledge regarding the underlying microstructural alterations is required. To address the lack of such a gold standard, simulated diffusion tensor data sets are developed, which can model an array of anomalies in the diffusion properties of a predefined location. These data sets can be employed to evaluate the numerous parameters that characterize the pipeline of a VBA algorithm and to compare the accuracy, precision, and reproducibility of different post-processing approaches quantitatively. We are convinced that the use of these simulated data sets can improve the understanding of how different diffusion tensor image post-processing techniques affect the outcome of VBA. In turn, this may possibly lead to a more standardized and reliable evaluation of diffusion tensor data sets of large study groups with a wide range of white matter altering pathologies. The simulated DTI data sets will be made available online (http://www.dti.ua.ac.be).
Armbruster, W. Scott
2014-01-01
Plant reproduction by means of flowers has long been thought to promote the success and diversification of angiosperms. It remains unclear, however, how this success has come about. Do flowers, and their capacity for specialized functions, increase speciation rates or decrease extinction rates? Is floral specialization fundamental or incidental to the diversification? Some studies suggest that the conclusions we draw about the role of flowers in the diversification and increased phenotypic disparity (phenotypic diversity) of angiosperms depend on the system. For orchids, for example, specialized pollination may have increased speciation rates, in part because in most orchids pollen is packed in discrete units, so that pollination is precise enough to contribute to reproductive isolation. In most plants, however, granular pollen results in low realized pollination precision, and thus key innovations involving flowers more likely reflect reduced extinction rates combined with opportunities for the evolution of greater phenotypic disparity and the occupation of new niches. Understanding the causes and consequences of the evolution of specialized flowers requires knowledge of both the selective regimes and the potential fitness trade-offs of using more than one pollinator functional group. The study of floral function and flowering-plant diversification remains a vibrant evolutionary field. Application of new methods, from measuring natural selection to estimating speciation rates, holds much promise for improving our understanding of the relationship between floral specialization and evolutionary success. PMID:24790124
The Deep Underground Neutrino Experiment: The precision era of neutrino physics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kemp, E.
The last decade was remarkable for neutrino physics. In particular, the phenomenon of neutrino flavor oscillations has been firmly established by a series of independent measurements. All parameters of the neutrino mixing are now known, and we have the elements to plan a judicious exploration of the new scenarios opened by these recent advances. With precise measurements, we can test the three-neutrino paradigm, the neutrino mass hierarchy, and charge conjugation parity (CP) asymmetry in the lepton sector. Future long-baseline experiments are considered a fundamental tool to deepen our knowledge of electroweak interactions. The Deep Underground Neutrino Experiment (DUNE) will detect a broadband neutrino beam from Fermilab in an underground massive liquid argon time-projection chamber at an L/E of about 10^3 km GeV^-1 to reach good sensitivity for CP-phase measurements and the determination of the mass hierarchy. The dimensions and depth of the far detector also create an excellent opportunity to look for rare signals, such as proton decay as a probe of baryon number violation, as well as supernova neutrino bursts, broadening the scope of the experiment to astrophysics and associated impacts in cosmology. In this paper, we discuss the physics motivations and the main experimental features of the DUNE project required to reach its scientific goals.
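To make the quoted L/E scale concrete, recall the textbook two-flavor approximation of the oscillation probability (shown for illustration only; DUNE's sensitivity studies use the full three-flavor treatment with matter effects):

    \[
    P(\nu_\mu \to \nu_e) \simeq \sin^2 2\theta \,
      \sin^2\!\left( \frac{1.27\,\Delta m^2[\mathrm{eV}^2]\;L[\mathrm{km}]}{E[\mathrm{GeV}]} \right)
    \]

The first oscillation maximum sits at 1.27 Δm² L/E = π/2, i.e. L/E ≈ 500 km/GeV for the atmospheric splitting Δm² ≈ 2.5 × 10⁻³ eV², consistent in order of magnitude with the L/E quoted above; the broadband beam covers the region around this maximum.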
Zeng, Rong; Wang, Wei; Zhao, Haijian; Fei, Yang; Wang, Zhiguo
2015-01-01
The narrow gap in HbA(1c) mass fraction between "normal" (<6.0%) and "diabetes" (≥6.5%) necessitates tight control of inter-assay standardization, assay precision, and trueness. This survey was initiated to obtain knowledge of the current state of internal quality control (IQC) practice for HbA(1c) in China and to identify the most appropriate quality specifications. IQC data for HbA(1c) from 331 institutions participating in the national proficiency testing (PT) programs in China were evaluated against four levels of quality specifications, and the percentages of laboratories meeting each quality requirement were calculated to identify the most appropriate quality specification for HbA(1c) control materials in China. The IQC data varied vastly among the 331 clinical laboratories. Measurements of the control materials covered a wide range, from 4.52% to 12.24% (inter-quartile range), and there were significant differences among the CVs of the different methods, including LPLC, CE-HPLC, AC-HPLC, immunoturbidimetry, and others. Among the four main methods, CE-HPLC and AC-HPLC achieved better precision. Overall, laboratory performance for HbA(1c) has yet to be improved; clinical laboratories in China should improve their performance by adopting stricter imprecision criteria.
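For concreteness, the pass-rate evaluation described above amounts to the following sketch; the control results and the CV limit are invented examples, not the survey's data or the actual Chinese quality specifications.

    # Sketch: score laboratories' HbA1c internal QC against an allowable
    # imprecision specification and compute the pass rate.
    import statistics

    def cv_percent(values):
        """Inter-assay coefficient of variation, in percent."""
        return 100.0 * statistics.stdev(values) / statistics.mean(values)

    labs = {                       # hypothetical monthly control results (%)
        "lab_A": [6.1, 6.0, 6.2, 6.1, 6.0],
        "lab_B": [6.3, 5.8, 6.6, 6.0, 6.4],
        "lab_C": [6.1, 6.1, 6.0, 6.2, 6.1],
    }

    LIMIT = 3.0                    # example quality specification: CV <= 3.0%
    passing = {lab: cv_percent(v) <= LIMIT for lab, v in labs.items()}
    rate = 100.0 * sum(passing.values()) / len(passing)
    print(f"{rate:.0f}% of laboratories meet the CV <= {LIMIT}% specification")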
Zhu, Haixin; Zhou, Xianfeng; Su, Fengyu; Tian, Yanqing; Ashili, Shashanka; Holl, Mark R.; Meldrum, Deirdre R.
2012-01-01
We report a novel method for wafer-level, high-throughput optical chemical sensor patterning, with precise control of the sensor volume and the capability of producing arbitrary microscale patterns. Monomeric oxygen (O2) and pH optical probes were polymerized with 2-hydroxyethyl methacrylate (HEMA) and acrylamide (AM) to form spin-coatable and further crosslinkable polymers. A micro-patterning method based on micro-fabrication techniques (photolithography, wet chemical processing, and reactive ion etching) was developed to miniaturize the sensor film onto glass substrates in arbitrary sizes and shapes. The sensitivity of the fabricated micro-patterns was characterized under various oxygen concentrations and pH values. A process for spatial integration of two sensors (oxygen and pH) on the same substrate surface was also developed, and preliminary fabrication and characterization results are presented. To the best of our knowledge, this is the first time that poly(2-hydroxyethyl methacrylate)-co-poly(acrylamide) (PHEMA-co-PAM)-based sensors have been patterned and integrated at the wafer level with micron-scale precision control using microfabrication techniques. The developed methods provide a feasible way to miniaturize and integrate optical chemical sensor systems and can be applied to any lab-on-a-chip system, especially biological microsystems requiring optical sensing of single or multiple analytes. PMID:23175599
Spectral optimization and uncertainty quantification in combustion modeling
NASA Astrophysics Data System (ADS)
Sheen, David Allan
Reliable simulations of reacting flow systems require a well-characterized, detailed chemical model as a foundation. Accuracy of such a model can be assured, in principle, by a multi-parameter optimization against a set of experimental data. However, the inherent uncertainties in the rate evaluations and experimental data leave a model still characterized by some finite kinetic rate parameter space. Without a careful analysis of how this uncertainty space propagates into the model's predictions, those predictions can at best be trusted only qualitatively. In this work, the Method of Uncertainty Minimization using Polynomial Chaos Expansions (MUM-PCE) is proposed to quantify these uncertainties. In this method, the uncertainty in the rate parameters of the as-compiled model is quantified. Then, the model is subjected to a rigorous multi-parameter optimization, as well as a consistency-screening process. Lastly, the uncertainty of the optimized model is calculated using an inverse spectral optimization technique and then propagated into a range of simulation conditions. An as-compiled, detailed H2/CO/C1-C4 kinetic model is combined with a set of ethylene combustion data to serve as an example.

The idea that the hydrocarbon oxidation model should be understood and developed in a hierarchical fashion has been a major driving force in kinetics research for decades. How this hierarchical strategy works at a quantitative level, however, has never been addressed. In this work, we use ethylene and propane combustion as examples and explore the question of hierarchical model development quantitatively. MUM-PCE is utilized to quantify the amount of information that a particular combustion experiment, and thereby each data set, contributes to the model. This knowledge is applied to explore the relationships among the combustion chemistry of hydrogen/carbon monoxide, ethylene, and larger alkanes.

Frequently, new data become available, and it is desirable to know the effect that including these data has on the optimized model. Two cases are considered here. In the first, a study of H2/CO mass burning rates was recently published in which the experimentally obtained results could not be reconciled with any extant H2/CO oxidation model. It is shown that an optimized H2/CO model can be developed that reproduces the results of the new experimental measurements. In addition, the high precision of the new experiments provides a strong constraint on the reaction rate parameters of the chemistry model, manifested in a significant improvement in the precision of simulations. In the second case, species time histories were measured during n-heptane oxidation behind reflected shock waves. The highly precise nature of these measurements is expected to impose critical constraints on chemical kinetic models of hydrocarbon combustion. The results show that while an as-compiled, prior reaction model of n-alkane combustion can be accurate in its prediction of the detailed species profiles, the kinetic parameter uncertainty in the model remains too large to obtain a precise prediction of the data. Constraining the prior model against the species time histories within the measurement uncertainties led to notable improvements in the precision of model predictions against the species data as well as the global combustion properties considered. Lastly, we show that while the capability of the multispecies measurement presents a step-change in our precise knowledge of the chemical processes in hydrocarbon combustion, accurate data on global combustion properties are still necessary to predict fuel combustion reliably.
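The idea behind the polynomial chaos propagation can be conveyed with a minimal non-intrusive sketch: expand a model response in probabilists' Hermite polynomials of a standard-normal germ, fit the coefficients by regression, and read the mean and variance off the coefficients. The "model" below is a cheap stand-in function and the parameter distribution is assumed; this is not the dissertation's MUM-PCE implementation.

    # Non-intrusive polynomial chaos for one uncertain rate parameter.
    import numpy as np
    from numpy.polynomial import hermite_e as He
    from math import factorial

    def model(log_k):
        """Stand-in for an expensive simulation (e.g., an ignition delay)."""
        return np.exp(-0.5 * log_k) + 0.1 * log_k**2

    mu, sigma, deg = 1.0, 0.3, 4          # ln k ~ N(mu, sigma^2), PCE order 4
    xi = np.random.default_rng(0).standard_normal(200)  # standard-normal germ
    y = model(mu + sigma * xi)

    Psi = He.hermevander(xi, deg)         # Psi[i, k] = He_k(xi_i)
    c, *_ = np.linalg.lstsq(Psi, y, rcond=None)         # least-squares fit

    mean = c[0]                           # E[He_0] = 1; higher terms average out
    var = sum(c[k]**2 * factorial(k) for k in range(1, deg + 1))  # E[He_k^2] = k!
    print(f"prediction: {mean:.4f} +/- {np.sqrt(var):.4f} (1 sigma)")

Tightening sigma, as the high-precision burning-rate and species-time-history data effectively do, shrinks the propagated prediction variance directly.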
[Knowledge identification and management in a surgery department].
Rodríguez-Montes, José Antonio
2006-08-01
The hospital is an enterprise in which the surgery department represents a specific healthcare unit. The purpose of the surgery department, like that of any other enterprise, is assumed to be indefinite survival; to that end, it must be able to achieve and maintain a competitive advantage in the long term. Nevertheless, each surgery department, like each enterprise, can define precisely what these terms mean in its own context, the main source of an enterprise's competitive advantage being its knowledge stock. Knowledge is recognized as the basis of competitive success among institutions. This article presents the concept and classification of knowledge and discusses how it should be identified, inventoried, and managed. Special emphasis is placed on healthcare activity, since this sector has certain characteristics distinguishing it from other sectors of economic and business activity.
Warner, Jeremy L; Rioth, Matthew J; Mandl, Kenneth D; Mandel, Joshua C; Kreda, David A; Kohane, Isaac S; Carbone, Daniel; Oreto, Ross; Wang, Lucy; Zhu, Shilin; Yao, Heming; Alterovitz, Gil
2016-07-01
Precision cancer medicine (PCM) will require ready access to genomic data within the clinical workflow and tools to assist clinical interpretation and enable decisions. Since most electronic health record (EHR) systems do not yet provide such functionality, we developed an EHR-agnostic, clinico-genomic mobile app to demonstrate several features that will be needed for point-of-care conversations. Our prototype, called Substitutable Medical Applications and Reusable Technology (SMART)® PCM, visualizes genomic information in real time, comparing a patient's diagnosis-specific somatic gene mutations detected by PCR-based hotspot testing to a population-level set of comparable data. The initial prototype works for patient specimens with zero or one detected mutation. Genomics extensions were created for the Health Level Seven® Fast Healthcare Interoperability Resources (FHIR)® standard; otherwise, the prototype is a normal SMART on FHIR app. The PCM prototype can rapidly present a visualization that compares a patient's somatic genomic alterations against a distribution built from more than 3000 patients, along with context-specific links to external knowledge bases. Initial evaluation by oncologists provided important feedback about the prototype's strengths and weaknesses. We added several requested enhancements and successfully demonstrated the app at the inaugural American Society of Clinical Oncology Interoperability Demonstration; we have also begun to expand the visualization capabilities to include cancer specimens with multiple mutations. PCM is open-source software for clinicians to present the individual patient within the population-level spectrum of cancer somatic mutations. The app can be implemented on any SMART on FHIR-enabled EHR, and future versions of PCM should be able to evolve in parallel with external knowledge bases. © The Author 2016. Published by Oxford University Press on behalf of the American Medical Informatics Association. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
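The underlying data access of a SMART on FHIR app such as PCM can be sketched as a plain FHIR REST query. The server URL, patient id, and token below are hypothetical placeholders (a real SMART launch obtains the token via OAuth2), and the genomics extensions described above are omitted for brevity.

    # Sketch: retrieve a patient's laboratory Observations from a FHIR server.
    import requests

    FHIR_BASE = "https://fhir.example.org"   # hypothetical endpoint
    token = "..."                            # from the SMART OAuth2 launch flow

    resp = requests.get(
        f"{FHIR_BASE}/Observation",
        params={"patient": "123", "category": "laboratory"},
        headers={"Authorization": f"Bearer {token}",
                 "Accept": "application/fhir+json"},
    )
    for entry in resp.json().get("entry", []):
        obs = entry["resource"]
        print(obs.get("code", {}).get("text"), obs.get("valueString"))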
Synergies between optical and physical variables in intercepting parabolic targets
Gómez, José; López-Moliner, Joan
2013-01-01
Interception requires precise estimation of time-to-contact (TTC) information. A long-standing view posits that all relevant information for extracting TTC is available in the angular variables that result from the projection of distal objects onto the retina. The different timing models rooted in this tradition have consequently relied on combining visual angle and its rate of expansion in different ways, with tau being the best-known solution for TTC. The generalization of these models to timing parabolic trajectories is not straightforward. For example, these combinations rely on isotropic expansion and usually assume first-order information only, neglecting acceleration. As a consequence, no optical formulations have been put forward so far that specify the TTC of parabolic targets with sufficient accuracy. Only recently have context-dependent physical variables been shown to play an important role in TTC estimation: known physical size and gravity can adequately explain observed data for linear and free-falling trajectories, respectively. Yet a full timing model for specifying parabolic TTC has remained elusive. We here derive two formulations that specify TTC for parabolic ball trajectories. The first extends previous models, in which known size is combined with thresholding visual angle or its rate of expansion, to the case of fly balls; to use this model efficiently, observers need to recover the 3D radial velocity component of the trajectory, which conveys the isotropic expansion. The second uses knowledge of size and gravity combined with the ball's visual angle and elevation angle. Taking into account the noise due to sensory measurements, we simulate the expected performance of these models in terms of accuracy and precision. While the model that combines expansion information and size knowledge is more efficient late in the trajectory, the second is shown to be efficient throughout the flight. PMID:23720614
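For reference, the classical first-order estimate referred to above is tau, the ratio of the target's visual angle to its rate of expansion:

    \[
    \mathrm{TTC} \approx \tau = \frac{\theta}{\dot{\theta}}
    \]

This holds for small angles and a constant closing speed with isotropic expansion, which is precisely what a parabolic trajectory violates; hence the two formulations above supplement the optics with known size, the 3D radial velocity, or gravity.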
A semi-supervised learning framework for biomedical event extraction based on hidden topics.
Zhou, Deyu; Zhong, Dayou
2015-05-01
Scientists have devoted decades of effort to understanding the interactions between proteins and RNA production. This information can enrich current knowledge of drug reactions and of the development of certain diseases. Nevertheless, the lack of explicit structure in the life-science literature, one of the most important sources of this information, prevents computer-based systems from accessing it. Therefore, biomedical event extraction, the automatic acquisition of knowledge about molecular events from research articles, has recently attracted community-wide efforts. Most approaches are based on statistical models and require large-scale annotated corpora to estimate the models' parameters precisely; however, such corpora are usually difficult to obtain in practice. Employing un-annotated data through semi-supervised learning for biomedical event extraction is therefore a feasible solution that is attracting increasing interest. In this paper, a semi-supervised learning framework based on hidden topics for biomedical event extraction is presented. In this framework, sentences in the un-annotated corpus are automatically assigned event annotations based on their distances to sentences in the annotated corpus. More specifically, not only the structures of the sentences but also the hidden topics embedded in them are used to describe the distance. The sentences and their newly assigned event annotations, together with the annotated corpus, are then employed for training. Experiments were conducted on the multi-level event extraction corpus, a gold-standard corpus. The experimental results show that the proposed framework achieves an improvement of more than 2.2% in F-score on biomedical event extraction when compared to the state-of-the-art approach. The results suggest that by incorporating un-annotated data, the proposed framework indeed improves the performance of the state-of-the-art event extraction system, and that the similarity between sentences can be precisely described by their hidden topics and structures. Copyright © 2015 Elsevier B.V. All rights reserved.
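The distance-based annotation transfer can be illustrated with a toy sketch that represents sentences by hidden-topic mixtures and copies the nearest annotated sentence's event type. It uses scikit-learn's LDA, omits the sentence-structure features the framework also employs, and all sentences, labels, and sizes are invented.

    # Toy sketch: topic-space nearest-neighbour transfer of event annotations.
    import numpy as np
    from sklearn.feature_extraction.text import CountVectorizer
    from sklearn.decomposition import LatentDirichletAllocation

    annotated = ["MEK phosphorylates ERK in stimulated cells",
                 "IL-2 expression requires NF-kB binding"]
    labels = ["Phosphorylation", "Gene_expression"]
    unannotated = ["RAF phosphorylates MEK upon activation"]

    vec = CountVectorizer()
    X = vec.fit_transform(annotated + unannotated)
    lda = LatentDirichletAllocation(n_components=2, random_state=0)
    topics = lda.fit_transform(X)             # per-sentence topic mixtures

    ann, unk = topics[:len(annotated)], topics[len(annotated):]
    for sent, t in zip(unannotated, unk):
        dists = np.linalg.norm(ann - t, axis=1)    # distance in topic space
        print(sent, "->", labels[int(dists.argmin())])

The transferred annotations would then be pooled with the gold corpus for training, as described above.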
On the topographic bias and density distribution in modelling the geoid and orthometric heights
NASA Astrophysics Data System (ADS)
Sjöberg, Lars E.
2018-03-01
It is well known that success in precise determination of the gravimetric geoid height (N) and the orthometric height (H) relies on knowledge of the topographic mass distribution. We show that the residual topographic bias due to imprecise information on the topographic density is practically the same for N and H, but with opposite signs. This result is demonstrated both for the Helmert orthometric height and for a more precise orthometric height derived by analytical continuation of the external geopotential to the geoid. It leads to the conclusion that precise gravimetric geoid heights cannot be validated by GNSS-levelling geoid heights in mountainous regions with respect to the errors caused by incorrect modelling of the topographic mass distribution, because this uncertainty is hidden in the difference between the two geoid estimators.
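As a first-order illustration (a planar Bouguer-shell approximation, offered here as an assumption-laden sketch rather than the paper's rigorous derivation), a topographic density error δρ at a station of height H biases the two quantities with nearly equal magnitude and opposite signs:

    \[
    \delta N \approx -\,\frac{2\pi G\,\delta\rho\,H^{2}}{\gamma},
    \qquad
    \delta H \approx +\,\frac{2\pi G\,\delta\rho\,H^{2}}{\bar{g}}
    \]

with G the gravitational constant, γ normal gravity, and ḡ mean gravity along the plumb line. The sum N + H, which is what GNSS-levelling effectively observes, is therefore nearly insensitive to δρ, which is the cancellation described above.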
Study of parameters in precision optical glass molding
NASA Astrophysics Data System (ADS)
Ni, Ying; Wang, Qin-hua; Yu, Jing-chi
2010-10-01
Precision glass compression molding is an attractive alternative to traditional manufacturing techniques for producing small precision optics in large volumes, owing to its lower cost, faster time to market, and environmental friendliness. In order to study the relationship between the surface figures of molded lenses and molding process parameters such as temperature, pressure, heating rate, and cooling rate, we present glass compression molding experiments that use the same low-Tg (transition temperature) glass material to produce two different kinds of aspheric lenses under different molding process parameters. The results identify the major factors influencing the surface figure of molded lenses and the practical ranges of these parameters; with this knowledge, suitable molding parameters can readily be selected for aspheric lenses with diameters from 10 mm to 30 mm.