Morris, William K; Vesk, Peter A; McCarthy, Michael A; Bunyavejchewin, Sarayudh; Baker, Patrick J
2015-01-01
Despite benefits for precision, ecologists rarely use informative priors. One reason that ecologists may prefer vague priors is the perception that informative priors reduce accuracy. To date, no ecological study has empirically evaluated data-derived informative priors' effects on precision and accuracy. To determine the impacts of priors, we evaluated mortality models for tree species using data from a forest dynamics plot in Thailand. Half the models used vague priors, and the remaining half had informative priors. We found precision was greater when using informative priors, but effects on accuracy were more variable. In some cases, prior information improved accuracy, while in others, it was reduced. On average, models with informative priors were no more or less accurate than models without. Our analyses provide a detailed case study on the simultaneous effect of prior information on precision and accuracy and demonstrate that when priors are specified appropriately, they lead to greater precision without systematically reducing model accuracy. PMID:25628867
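The precision gain from an informative prior can be illustrated with a minimal conjugate sketch (an illustrative assumption only; the study's actual tree-mortality models are not simple Beta-Binomial models, and all numbers below are hypothetical):

```python
# Illustrative sketch: conjugate Beta-Binomial posteriors for a
# mortality probability, comparing a vague (flat) prior with a
# data-derived informative prior. Posterior standard deviation is
# used as an inverse measure of precision.

def beta_posterior(alpha, beta, deaths, survivors):
    """Return posterior (alpha, beta) for a Beta prior updated with binomial data."""
    return alpha + deaths, beta + survivors

def beta_sd(alpha, beta):
    """Standard deviation of a Beta(alpha, beta) distribution."""
    var = alpha * beta / ((alpha + beta) ** 2 * (alpha + beta + 1))
    return var ** 0.5

deaths, survivors = 8, 92  # hypothetical census data

vague = beta_posterior(1, 1, deaths, survivors)          # flat prior
informative = beta_posterior(8, 92, deaths, survivors)   # prior from earlier data

# The informative prior yields a narrower (more precise) posterior.
assert beta_sd(*informative) < beta_sd(*vague)
```

The accuracy question the abstract raises is separate: a narrower posterior centered in the wrong place is precise but inaccurate, which is why the study evaluates both properties at once.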
AMCP Partnership Forum: Managing Care in the Wave of Precision Medicine.
2018-05-23
Precision medicine, the customization of health care to an individual's genetic profile while accounting for biomarkers and lifestyle, has increasingly been adopted by health care stakeholders to guide the development of treatment options, improve treatment decision making, provide more patient-centered care, and better inform coverage and reimbursement decisions. Despite these benefits, key challenges prevent its broader use and adoption. On December 7-8, 2017, the Academy of Managed Care Pharmacy convened a group of stakeholders to discuss these challenges and provide recommendations to facilitate broader adoption and use of precision medicine across health care settings. These stakeholders represented the pharmaceutical industry, clinicians, patient advocacy, private payers, device manufacturers, health analytics, information technology, academia, and government agencies. Throughout the 2-day forum, participants discussed evidence requirements for precision medicine, including consistent ways to measure the utility and validity of precision medicine tests and therapies, limitations of traditional clinical trial designs, and limitations of value assessment framework methods. They also highlighted the challenges with evidence collection and data silos in precision medicine. A lack of interoperability within and across health systems is hindering clinical advances. Current medical coding systems also cannot account for the heterogeneity of many diseases, preventing health systems from having a complete understanding of their patient population to inform resource allocation. Challenges that payers face, such as evidence limitations, in informing coverage and reimbursement decisions in precision medicine, as well as legal and regulatory barriers that inhibit more widespread data sharing, were also identified. While a broad range of perspectives was shared throughout the forum, participants reached consensus across 2 overarching areas.
First, there is a greater need for common definitions, thresholds, and standards to guide evidence generation in precision medicine. Second, current information silos are preventing the sharing of valuable data. Collaboration among stakeholders is needed to support better information sharing, awareness, and education of precision medicine for patients. The recommendations brought forward by this diverse group of experts provide a set of solutions to spur widespread use and application of precision medicine. Taken together, successful adoption and use of precision medicine will require input and collaboration from all sectors of health care, especially patients. DISCLOSURES This AMCP Partnership Forum and the development of the proceedings document were supported by Amgen, Foundation Medicine, Genentech, Gilead, MedImpact, National Pharmaceutical Council, Precision for Value, Sanofi, Takeda, and Xcenda.
Dahlstrom, Michael F; Dudo, Anthony; Brossard, Dominique
2012-01-01
Studies that investigate how the mass media cover risk issues often assume that certain characteristics of content are related to specific risk perceptions and behavioral intentions. However, these relationships have seldom been empirically assessed. This study tests the influence of three message-level media variables--risk precision information, sensational information, and self-efficacy information--on perceptions of risk, individual worry, and behavioral intentions toward a pervasive health risk. Results suggest that more precise risk information leads to increased risk perceptions and that the effect of sensational information is moderated by risk precision information. Greater self-efficacy information is associated with greater intention to change behavior, but none of the variables influence individual worry. The results provide a quantitative understanding of how specific characteristics of informational media content can influence individuals' responses to health threats of a global and uncertain nature. © 2011 Society for Risk Analysis.
Ogden R. Lindsley and the historical development of precision teaching
Potts, Lisa; Eshleman, John W.; Cooper, John O.
1993-01-01
This paper presents the historical developments of precision teaching, a technological offshoot of radical behaviorism and free-operant conditioning. The sequence progresses from the scientific precursors of precision teaching and the beginnings of precision teaching to principal developments since 1965. Information about the persons, events, and accomplishments presented in this chronology was compiled in several ways. Journals, books, and conference presentations provided the essential information. The most important source for this account was Ogden Lindsley himself, because Lindsley and his students established the basic practices that define precision teaching. PMID:22478145
Using hyperspectral data in precision farming applications
USDA-ARS?s Scientific Manuscript database
Precision farming practices such as variable rate applications of fertilizer and agricultural chemicals require accurate field variability mapping. This chapter investigated the value of hyperspectral remote sensing in providing useful information for five applications of precision farming: (a) Soil...
Williams, Marc S; Buchanan, Adam H; Davis, F Daniel; Faucett, W Andrew; Hallquist, Miranda L G; Leader, Joseph B; Martin, Christa L; McCormick, Cara Z; Meyer, Michelle N; Murray, Michael F; Rahm, Alanna K; Schwartz, Marci L B; Sturm, Amy C; Wagner, Jennifer K; Williams, Janet L; Willard, Huntington F; Ledbetter, David H
2018-05-01
Health care delivery is increasingly influenced by the emerging concepts of precision health and the learning health care system. Although not synonymous with precision health, genomics is a key enabler of individualized care. Delivering patient-centered, genomics-informed care based on individual-level data in the current national landscape of health care delivery is a daunting challenge. Problems to overcome include data generation, analysis, storage, and transfer; knowledge management and representation for patients and providers at the point of care; process management; and outcomes definition, collection, and analysis. Development, testing, and implementation of a genomics-informed program requires multidisciplinary collaboration and building the concepts of precision health into a multilevel implementation framework. Using the principles of a learning health care system provides a promising solution. This article describes the implementation of population-based genomic medicine in an integrated learning health care system-a working example of a precision health program.
NASA Astrophysics Data System (ADS)
Turner, Alexander J.; Jacob, Daniel J.; Benmergui, Joshua; Brandman, Jeremy; White, Laurent; Randles, Cynthia A.
2018-06-01
Anthropogenic methane emissions originate from a large number of fine-scale and often transient point sources. Satellite observations of atmospheric methane columns are an attractive approach for monitoring these emissions but have limitations from instrument precision, pixel resolution, and measurement frequency. Dense observations will soon be available in both low-Earth and geostationary orbits, but the extent to which they can provide fine-scale information on methane sources has yet to be explored. Here we present an observing system simulation experiment (OSSE) to assess the capabilities of different satellite observing system configurations. We conduct a 1-week WRF-STILT simulation to generate methane column footprints at 1.3 × 1.3 km² spatial resolution and hourly temporal resolution over a 290 × 235 km² domain in the Barnett Shale, a major oil and gas field in Texas with a large number of point sources. We sub-sample these footprints to match the observing characteristics of the recently launched TROPOMI instrument (7 × 7 km² pixels, 11 ppb precision, daily frequency), the planned GeoCARB instrument (2.7 × 3.0 km² pixels, 4 ppb precision, nominal twice-daily frequency), and other proposed observing configurations. The information content of the various observing systems is evaluated using the Fisher information matrix and its eigenvalues. We find that a week of TROPOMI observations should provide information on temporally invariant emissions at ~30 km spatial resolution. GeoCARB should provide information on temporally invariant emissions at ~2-7 km spatial resolution depending on sampling frequency (hourly to daily). Improvements to instrument precision yield greater increases in information content than improved sampling frequency. A precision better than 6 ppb is critical for GeoCARB to achieve fine resolution of emissions. Transient emissions would be missed with either TROPOMI or GeoCARB.
An aspirational high-resolution geostationary instrument with 1.3 × 1.3 km² pixel resolution, hourly return time, and 1 ppb precision would effectively constrain the temporally invariant emissions in the Barnett Shale at the kilometer scale and provide some information on hourly variability of sources.
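The trade-off between precision and sampling frequency can be sketched with a toy linear Gaussian observing system (an illustrative assumption, not the paper's WRF-STILT footprints): for observations y = Kx + noise with noise covariance σ²I, the Fisher information matrix is F = KᵀK/σ², so halving the noise quadruples the information while doubling the number of observations only doubles it.

```python
import numpy as np

# Toy Fisher-information sketch for a linear Gaussian observing system
# y = K x + noise, where x holds emissions and K the column footprints.
# The eigenvalues of F = K^T K / sigma^2 measure how many emission
# modes the observations constrain.

rng = np.random.default_rng(0)
K = rng.standard_normal((50, 10))   # 50 observations, 10 emission elements

def fisher_info(K, sigma):
    """Fisher information matrix for noise covariance sigma^2 * I."""
    return K.T @ K / sigma**2

base = fisher_info(K, sigma=2.0)
better_precision = fisher_info(K, sigma=1.0)           # halve the noise
more_samples = fisher_info(np.vstack([K, K]), sigma=2.0)  # double the sampling

# Halving the noise quadruples the information; doubling the
# observations only doubles it -- consistent with the finding that
# precision gains outweigh frequency gains.
assert np.allclose(better_precision, 4 * base)
assert np.allclose(more_samples, 2 * base)
```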
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gastelum, Zoe N.; Henry, Michael J.; Burtner, IV, E. R.
The International Atomic Energy Agency (IAEA) is interested in increasing the ability of IAEA safeguards inspectors to access information that would improve their situational awareness on the job. A mobile information platform could potentially provide access to information, analytics, and technical and logistical support to inspectors in the field, as well as providing regular updates to analysts at IAEA Headquarters in Vienna or at satellite offices. To demonstrate the potential capability of such a system, Pacific Northwest National Laboratory (PNNL) implemented a number of example capabilities within a PNNL-developed precision information environment (PIE), using a tablet as a mobile information platform. PNNL's safeguards proof-of-concept PIE is intended to demonstrate novel applications of mobile information platforms to international safeguards use cases; demonstrate proof-of-principle capability implementation; and provide "vision" for capabilities that could be implemented. This report documents the lessons learned from this two-year development activity for the Precision Information Environment for International Safeguards (PIE-IS), describing the developed capabilities, technical challenges, and considerations for future development, so that developers working on a similar system for the IAEA or other safeguards agencies might benefit from our work.
[What kind of information do German health information pamphlets provide on mammography screening?].
Kurzenhäuser, Stephanie
2003-02-01
To make an informed decision on participation in mammography screening, women have to be educated about all the risks and benefits of the procedure in a manner that is detailed and understandable. But an analysis of 27 German health pamphlets on mammography screening shows that many relevant pieces of information about the benefits, the risks, and especially the meaning of screening results are only insufficiently communicated. Many statements were presented narratively rather than as precise statistics. Depending on content, 17 to 62% of the quantifiable statements were actually given as numerical data. To provide comprehensive information and to avoid misunderstandings, it is necessary to supplement the currently available health pamphlets and make the information on mammography screening more precise.
Millisecond-timescale local network coding in the rat primary somatosensory cortex.
Eldawlatly, Seif; Oweiss, Karim G
2011-01-01
Correlation among neocortical neurons is thought to play an indispensable role in mediating sensory processing of external stimuli. The role of temporal precision in this correlation has been hypothesized to enhance information flow along sensory pathways. Its role in mediating the integration of information at the output of these pathways, however, remains poorly understood. Here, we examined spike timing correlation between simultaneously recorded layer V neurons within and across columns of the primary somatosensory cortex of anesthetized rats during unilateral whisker stimulation. We used Bayesian statistics and information theory to quantify the causal influence between the recorded cells with millisecond precision. For each stimulated whisker, we inferred stable, whisker-specific, dynamic Bayesian networks over many repeated trials, with network similarity of 83.3±6% within whisker, compared to only 50.3±18% across whiskers. These networks further provided information about whisker identity that was approximately 6 times higher than what was provided by the latency to first spike and 13 times higher than what was provided by the spike count of individual neurons examined separately. Furthermore, prediction of individual neurons' precise firing conditioned on knowledge of putative pre-synaptic cell firing was 3 times higher than predictions conditioned on stimulus onset alone. Taken together, these results suggest the presence of a temporally precise network coding mechanism that integrates information across neighboring columns within layer V about vibrissa position and whisking kinetics to mediate whisker movement by motor areas innervated by layer V.
The better part of not knowing: Virtuous ignorance.
Kominsky, Jonathan F; Langthorne, Philip; Keil, Frank C
2016-01-01
Suppose you are presented with 2 informants who have provided answers to the same question. One provides a precise and confident answer, and the other says that they do not know. If you were asked which of these 2 informants was more of an expert, intuitively you would select the informant who provided the certain answer over the ignorant informant. However, for cases in which precise information is practically or actually unknowable (e.g., the number of leaves on all the trees in the world), certainty and confidence indicate a lack of competence, while expressions of ignorance may indicate greater expertise. In 3 experiments, we investigated whether children and adults are able to use this "virtuous ignorance" as a cue to expertise. Experiment 1 found that adults and children older than 9 years selected confident informants for knowable information and ignorant informants for unknowable information. However, 5-6-year-olds overwhelmingly favored the confident informant, even when such certainty was completely implausible. In Experiment 2 we replicated the results of Experiment 1 with a new set of items focused on predictions about the future, rather than numerical information. In Experiment 3, we demonstrated that 5-8-year-olds and adults are both able to distinguish between knowable and unknowable items when asked how difficult the information would be to acquire, but those same children failed to reject the precise and confident informant for unknowable items. We suggest that children have difficulty integrating information about the knowability of particular facts into their evaluations of expertise. (PsycINFO Database Record (c) 2015 APA, all rights reserved).
Benchmark dose and the three Rs. Part I. Getting more information from the same number of animals.
Slob, Wout
2014-08-01
Evaluating dose-response data using the Benchmark dose (BMD) approach rather than by the no observed adverse effect (NOAEL) approach implies a considerable step forward from the perspective of the Reduction, Replacement, and Refinement, three Rs, in particular the R of reduction: more information is obtained from the same number of animals, or, vice versa, similar information may be obtained from fewer animals. The first part of this twin paper focusses on the former, the second on the latter aspect. Regarding the former, the BMD approach provides more information from any given dose-response dataset in various ways. First, the BMDL (= BMD lower confidence bound) provides more information by its more explicit definition. Further, as compared to the NOAEL approach the BMD approach results in more statistical precision in the value of the point of departure (PoD), for deriving exposure limits. While part of the animals in the study do not directly contribute to the numerical value of a NOAEL, all animals are effectively used and do contribute to a BMDL. In addition, the BMD approach allows for combining similar datasets for the same chemical (e.g., both sexes) in a single analysis, which further increases precision. By combining a dose-response dataset with similar historical data for other chemicals, the precision can even be substantially increased. Further, the BMD approach results in more precise estimates for relative potency factors (RPFs, or TEFs). And finally, the BMD approach is not only more precise, it also allows for quantification of the precision in the BMD estimate, which is not possible in the NOAEL approach.
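The added precision of a benchmark dose can be sketched under a simple one-hit dose-response model (an assumed form for illustration, not Slob's full methodology; the slope values below are hypothetical): extra risk at dose d is ER(d) = 1 − exp(−b·d), so the benchmark dose for a given benchmark response (BMR) has a closed form.

```python
import math

# Minimal benchmark-dose sketch under a one-hit model:
# extra risk ER(d) = 1 - exp(-b*d), so BMD = -ln(1 - BMR) / b.

def bmd_one_hit(b, bmr=0.10):
    """Dose at which extra risk equals bmr, for a fitted slope b."""
    return -math.log(1.0 - bmr) / b

b_hat = 0.020    # hypothetical fitted slope (per mg/kg/day)
b_upper = 0.025  # hypothetical upper confidence bound on the slope

bmd = bmd_one_hit(b_hat)      # point estimate of the benchmark dose
bmdl = bmd_one_hit(b_upper)   # lower bound on the BMD: the steepest
                              # plausible slope reaches the BMR at the
                              # smallest dose
assert bmdl < bmd
```

Unlike a NOAEL, which is tied to the tested dose levels, the BMDL here is a continuous function of the estimated slope and its uncertainty, which is what lets every animal's data contribute to its precision.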
Precision medicine for psychopharmacology: a general introduction.
Shin, Cheolmin; Han, Changsu; Pae, Chi-Un; Patkar, Ashwin A
2016-07-01
Precision medicine is an emerging medical model that can provide accurate diagnoses and tailored therapeutic strategies for patients based on data pertaining to genes, microbiomes, environment, family history and lifestyle. Here, we provide basic information about precision medicine and newly introduced concepts, such as the precision medicine ecosystem and big data processing, and omics technologies including pharmacogenomics, pharmacometabolomics, pharmacoproteomics, pharmacoepigenomics, connectomics and exposomics. The authors review the current state of omics in psychiatry and the future direction of psychopharmacology as it moves towards precision medicine. Expert commentary: Advances in precision medicine have been facilitated by achievements in multiple fields, including large-scale biological databases, powerful methods for characterizing patients (such as genomics, proteomics, metabolomics, diverse cellular assays, and even social networks and mobile health technologies), and computer-based tools for analyzing large amounts of data.
Opportunities for the Cardiovascular Community in the Precision Medicine Initiative.
Shah, Svati H; Arnett, Donna; Houser, Steven R; Ginsburg, Geoffrey S; MacRae, Calum; Mital, Seema; Loscalzo, Joseph; Hall, Jennifer L
2016-01-12
The Precision Medicine Initiative recently announced by President Barack Obama seeks to move the field of precision medicine more rapidly into clinical care. Precision medicine revolves around the concept of integrating individual-level data including genomics, biomarkers, lifestyle and other environmental factors, wearable device physiological data, and information from electronic health records to ultimately provide better clinical care to individual patients. The Precision Medicine Initiative as currently structured will primarily fund efforts in cancer genomics with longer-term goals of advancing precision medicine to all areas of health, and will be supported through creation of a 1 million person cohort study across the United States. This focused effort on precision medicine provides scientists, clinicians, and patients within the cardiovascular community an opportunity to work together boldly to advance clinical care; the community needs to be aware and engaged in the process as it progresses. This article provides a framework for potential involvement of the cardiovascular community in the Precision Medicine Initiative, while highlighting significant challenges for its successful implementation. © 2016 American Heart Association, Inc.
Is There Space for the Objective Force?
2003-04-07
force through the combination of precision weapons and knowledge-based warfare. Army forces will survive through information dominance, provided by a... Objective Forces. Space-based systems will be foundational building blocks for the Objective Force to achieve information dominance and satellite... communications required for information dominance across a distributed battlefield? Second, what exists to provide the Objective Force information
Terrain matching image pre-process and its format transform in autonomous underwater navigation
NASA Astrophysics Data System (ADS)
Cao, Xuejun; Zhang, Feizhou; Yang, Dongkai; Yang, Bogang
2007-06-01
Underwater passive navigation technology is one of the important development directions in modern navigation. With the advantages of autonomy, stealth at sea, jamming resistance, and high precision, passive navigation fully meets practical navigation requirements and has therefore become a standard navigation method for underwater vehicles, attracting considerable attention from researchers in the field. Underwater passive navigation can provide accurate navigation information, such as location and speed, alongside a primary Inertial Navigation System (INS) over long periods. With the development of micro-electronics technology, AUV navigation is now dominated by INS assisted by other methods such as terrain-matching navigation, which can sustain navigation over long periods, correct the errors of the INS, and allow an AUV to remain submerged without surfacing periodically. With terrain-matching navigation, assisted by digital charts and ocean geographical characteristic sensors, underwater image matching provides higher location precision, satisfying the requirements of an underwater, long-term, high-precision, all-weather navigation system for Autonomous Underwater Vehicles. Terrain-assisted navigation (TAN) relies directly on image (map) information to assist the primary navigation system along a path appointed in advance. In TAN, a factor as important as system operation is the precision and practicability of the stored images and the database that produces the image data; if the data used for characterization are unsuitable, the navigation precision of the system will be low.
Compared with terrain-matching assisted navigation, image-matching navigation is a high-precision, low-cost assisted navigation method whose matching precision directly determines the final precision of the integrated navigation system. Image-matching assisted navigation spatially registers two underwater scene images of the same scene, acquired by two different sensors, in order to determine the relative displacement between them. In this way, the vehicle's location can be obtained in a reference image with known geographic relations, and the precise location information from image matching is transmitted to the INS to eliminate its location error and greatly enhance the navigation precision of the vehicle. Digital image data analysis and processing for image matching in underwater passive navigation are therefore important. For underwater geographic data analysis, we focus on the acquisition, processing, analysis, expression, and measurement of database information. These analyses constitute an important part of underwater terrain matching and help characterize the seabed terrain of navigation areas, so that the most advantageous seabed terrain districts and a dependable navigation algorithm can be selected, improving the precision and reliability of the terrain-assisted navigation system. This paper describes the pre-processing and format transformation of digital images for underwater image matching. The terrain of navigation areas requires further study to provide reliable terrain-characteristic and coverage data for navigation. By enabling sea-route selection, danger-district prediction, and navigation-algorithm analysis, TAN can achieve higher location precision and probability, providing technological support for image matching in underwater passive navigation.
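The core operation of image-matching assisted navigation, estimating the relative displacement between a sensed image and a reference map, can be sketched with exhaustive normalized cross-correlation (an illustrative baseline, not the paper's specific matching algorithm; the synthetic "seabed" data below are made up):

```python
import numpy as np

# Estimate the (row, col) offset of a sensed patch inside a reference
# (fiducial) image by maximizing the normalized cross-correlation.

def match_offset(reference, patch):
    """Return the offset of `patch` in `reference` with the highest
    normalized cross-correlation score."""
    pr, pc = patch.shape
    p = (patch - patch.mean()) / patch.std()
    best, best_rc = -np.inf, (0, 0)
    for r in range(reference.shape[0] - pr + 1):
        for c in range(reference.shape[1] - pc + 1):
            w = reference[r:r + pr, c:c + pc]
            w = (w - w.mean()) / (w.std() + 1e-12)
            score = (p * w).mean()
            if score > best:
                best, best_rc = score, (r, c)
    return best_rc

rng = np.random.default_rng(1)
seabed = rng.standard_normal((60, 60))      # synthetic reference map
patch = seabed[17:33, 23:39] + 0.1 * rng.standard_normal((16, 16))

# The recovered offset is the displacement fed back to the INS.
assert match_offset(seabed, patch) == (17, 23)
```

In practice the search would be windowed around the INS-predicted position rather than exhaustive, which is what keeps the correction loop fast enough for real-time navigation.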
Precision Farming and Precision Pest Management: The Power of New Crop Production Technologies
Strickland, R. Mack; Ess, Daniel R.; Parsons, Samuel D.
1998-01-01
The use of new technologies including Geographic Information Systems (GIS), the Global Positioning System (GPS), Variable Rate Technology (VRT), and Remote Sensing (RS) is gaining acceptance in the present high-technology, precision agricultural industry. GIS provides the ability to link multiple data values for the same geo-referenced location, and provides the user with a graphical visualization of such data. When GIS is coupled with GPS and RS, management decisions can be applied in a more precise "micro-managed" manner by using VRT techniques. Such technology holds the potential to reduce agricultural crop production costs as well as crop and environmental damage. PMID:19274236
Simmons, Michael; Singhal, Ayush; Lu, Zhiyong
2018-01-01
The key question of precision medicine is whether it is possible to find clinically actionable granularity in diagnosing disease and classifying patient risk. The advent of next generation sequencing and the widespread adoption of electronic health records (EHRs) have provided clinicians and researchers a wealth of data and made possible the precise characterization of individual patient genotypes and phenotypes. Unstructured text — found in biomedical publications and clinical notes — is an important component of genotype and phenotype knowledge. Publications in the biomedical literature provide essential information for interpreting genetic data. Likewise, clinical notes contain the richest source of phenotype information in EHRs. Text mining can render these texts computationally accessible and support information extraction and hypothesis generation. This chapter reviews the mechanics of text mining in precision medicine and discusses several specific use cases, including database curation for personalized cancer medicine, patient outcome prediction from EHR-derived cohorts, and pharmacogenomic research. Taken as a whole, these use cases demonstrate how text mining enables effective utilization of existing knowledge sources and thus promotes increased value for patients and healthcare systems. Text mining is an indispensable tool for translating genotype-phenotype data into effective clinical care that will undoubtedly play an important role in the eventual realization of precision medicine. PMID:27807747
NASA Astrophysics Data System (ADS)
Li, Da; Cheung, Chifai; Zhao, Xing; Ren, Mingjun; Zhang, Juan; Zhou, Liqiu
2016-10-01
Autostereoscopy-based three-dimensional (3D) digital reconstruction has been widely applied in medical science, entertainment, design, industrial manufacturing, precision measurement and many other areas. The 3D digital model of the target can be reconstructed from a series of two-dimensional (2D) images acquired by the autostereoscopic system, which consists of multiple lenses and can provide information about the target from multiple angles. This paper presents a generalized and precise autostereoscopic 3D digital reconstruction method based on Direct Extraction of Disparity Information (DEDI), which can be applied to any autostereoscopic system and provides accurate 3D reconstruction results through an error-elimination process based on statistical analysis. The feasibility of the DEDI method has been successfully verified through a series of optical 3D digital reconstruction experiments on different autostereoscopic systems. The method is highly efficient, performing direct full 3D digital model construction through a tomography-like operation upon every depth plane while excluding defocused information. With the focused information processed by the DEDI method, the 3D digital model of the target can be directly and precisely formed along the axial direction with the depth information.
Airborne and satellite remote sensors for precision agriculture
USDA-ARS's Scientific Manuscript database
Remote sensing provides an important source of information to characterize soil and crop variability for both within-season and after-season management despite the availability of numerous ground-based soil and crop sensors. Remote sensing applications in precision agriculture have been steadily inc...
USDA-ARS's Scientific Manuscript database
Agricultural research increasingly is expected to provide precise, quantitative information with an explicit geographic coverage. Limited availability of continuous daily meteorological records often constrains efforts to provide such information through integrated use of simulation models, spatial ...
Attentional priority determines working memory precision.
Klyszejko, Zuzanna; Rahmati, Masih; Curtis, Clayton E
2014-12-01
Visual working memory is a system used to hold information actively in mind for a limited time. The number of items and the precision with which we can store information have limits that define its capacity. How much control do we have over the precision with which we store information when faced with these severe capacity limitations? Here, we tested the hypothesis that rank-ordered attentional priority determines the precision of multiple working memory representations. We conducted two psychophysical experiments that manipulated the priority of multiple items in a two-alternative forced-choice (2AFC) distance discrimination task. In Experiment 1, we varied the probabilities with which memorized items were likely to be tested. To generalize the effects of priority beyond simple cueing, in Experiment 2 we manipulated priority by varying monetary incentives contingent upon successful memory for the items tested. Moreover, we illustrate our hypothesis using a simple model that distributes attentional resources across items with rank-ordered priorities. Indeed, we found evidence in both experiments that priority affects the precision of working memory in a monotonic fashion. Our results demonstrate that representations of priority may provide a mechanism by which resources can be allocated to increase the precision with which we encode and briefly store information. Copyright © 2014 Elsevier Ltd. All rights reserved.
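The rank-ordered resource model described in this abstract can be caricatured in a few lines. The proportional-allocation rule, the linear precision link, and the priority values below are illustrative assumptions for this sketch, not the model actually fitted in the paper:

```python
# Toy resource model: each memorized item's precision grows with its share of a
# fixed attentional resource, and shares follow the rank-ordered priorities.
def allocate(priorities, total_resource=1.0):
    """Split a fixed resource across items in proportion to their priorities."""
    s = sum(priorities)
    return [total_resource * p / s for p in priorities]

def precision(resource, k=10.0):
    # Assumed monotonic link: memory precision (1/variance) scales with resource.
    return k * resource

priorities = [0.5, 0.3, 0.2]   # e.g., hypothetical cued test probabilities for 3 items
prec = [precision(r) for r in allocate(priorities)]
print([round(p, 2) for p in prec])
```

Under this sketch, higher-priority items receive a larger share of the fixed resource and are therefore stored more precisely, matching the monotonic priority effect the experiments report.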
Toward precision medicine and health: Opportunities and challenges in allergic diseases.
Galli, Stephen Joseph
2016-05-01
Precision medicine (also called personalized, stratified, or P4 medicine) can be defined as the tailoring of preventive measures and medical treatments to the characteristics of each patient to obtain the best clinical outcome for each person while ideally also enhancing the cost-effectiveness of such interventions for patients and society. Clearly, the best clinical outcome for allergic diseases is not to get them in the first place. To emphasize the importance of disease prevention, a critical component of precision medicine can be referred to as precision health, which is defined herein as the use of all available information pertaining to specific subjects (including family history, individual genetic and other biometric information, and exposures to risk factors for developing or exacerbating disease), as well as features of their environments, to sustain and enhance health and prevent the development of disease. In this article I will provide a personal perspective on how the precision health-precision medicine approach can be applied to the related goals of preventing the development of allergic disorders and providing the most effective diagnosis, disease monitoring, and care for those with these prevalent diseases. I will also mention some of the existing and potential challenges to achieving these ambitious goals. Copyright © 2016 American Academy of Allergy, Asthma & Immunology. Published by Elsevier Inc. All rights reserved.
A Two-Step Bayesian Approach for Propensity Score Analysis: Simulations and Case Study.
Kaplan, David; Chen, Jianshen
2012-07-01
A two-step Bayesian propensity score approach is introduced that incorporates prior information in the propensity score equation and outcome equation without the problems associated with simultaneous Bayesian propensity score approaches. The corresponding variance estimators are also provided. The two-step Bayesian propensity score is provided for three methods of implementation: propensity score stratification, weighting, and optimal full matching. Three simulation studies and one case study are presented to elaborate the proposed two-step Bayesian propensity score approach. Results of the simulation studies reveal that greater precision in the propensity score equation yields better recovery of the frequentist-based treatment effect. A slight advantage is shown for the Bayesian approach in small samples. Results also reveal that greater precision around the wrong treatment effect can lead to seriously distorted results. However, greater precision around the correct treatment effect parameter yields quite good results, with slight improvement seen with greater precision in the propensity score equation. A comparison of coverage rates for the conventional frequentist approach and proposed Bayesian approach is also provided. The case study reveals that credible intervals are wider than frequentist confidence intervals when priors are non-informative.
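As a rough illustration of why precision in the propensity score equation matters, here is a minimal non-Bayesian sketch of propensity score weighting on simulated data. The data-generating model, its coefficients, and the gradient-ascent logistic fit are assumptions invented for the example; this is not the paper's two-step Bayesian estimator:

```python
import math
import random

random.seed(0)
n = 5000

# Simulated observational data: covariate x confounds both treatment t and outcome y.
x = [random.gauss(0, 1) for _ in range(n)]
def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))
t = [1 if random.random() < sigmoid(0.8 * xi) else 0 for xi in x]
y = [2.0 * ti + 1.5 * xi + random.gauss(0, 1) for ti, xi in zip(t, x)]  # true effect = 2.0

# The naive difference in means is biased upward, because x raises both t and y.
naive = (sum(yi for yi, ti in zip(y, t) if ti) / sum(t)
         - sum(yi for yi, ti in zip(y, t) if not ti) / (n - sum(t)))

# Step 1: fit a logistic-regression propensity model e(x) = P(T=1 | x)
# by full-batch gradient ascent on the log-likelihood.
b, w = 0.0, 0.0
for _ in range(400):
    gb = gw = 0.0
    for xi, ti in zip(x, t):
        r = ti - sigmoid(b + w * xi)
        gb += r
        gw += r * xi
    b += 2.0 * gb / n
    w += 2.0 * gw / n

# Step 2: inverse-probability-weighted (IPW) estimate of the treatment effect.
e = [sigmoid(b + w * xi) for xi in x]
ate = (sum(ti * yi / ei for ti, yi, ei in zip(t, y, e)) / n
       - sum((1 - ti) * yi / (1 - ei) for ti, yi, ei in zip(t, y, e)) / n)

print(round(naive, 2), round(ate, 2))
```

Because the propensity model here is correctly specified, the weighted estimate recovers the true effect of 2.0 while the naive contrast stays confounded; a poorly estimated propensity equation would propagate error into the effect estimate, which is the sensitivity the simulations in the abstract probe.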
76 FR 52956 - Proposed Information Collection Activity; Comment Request
Federal Register 2010, 2011, 2012, 2013, 2014
2011-08-24
... (HPBS) Transmission File Layouts for HPBS Work Measures OMB No.: 0970-0230 Description: There is no... provide us less precise information on States' performance. The Transmission File Layouts form provides... requested to transmit similar files. We are not requesting any changes to the Transmission File Layouts form...
GNSS, Satellite Altimetry and Formosat-3/COSMIC for Determination of Ionosphere Parameters
NASA Astrophysics Data System (ADS)
Mahdi Alizadeh Elizei, M.; Schuh, Harald; Schmidt, Michael; Todorova, Sonya
The dispersion of the ionosphere with respect to microwave signals allows gaining information about the parameters of this medium in terms of the electron density (Ne) or the Total Electron Content (TEC). In the last decade space geodetic techniques, such as the Global Navigation Satellite System (GNSS), satellite altimetry missions, and Low Earth Orbiting (LEO) satellites, have turned into a promising tool for remote sensing of the ionosphere. The dual-frequency GNSS observations provide the main input data for development of Global Ionosphere Maps (GIM). However, the GNSS stations are heterogeneously distributed, with large gaps particularly over the sea surface, which lowers the precision of the GIM over these areas. Conversely, dual-frequency satellite altimetry missions provide information about the ionosphere precisely above the sea surface. In addition, LEO satellites such as Formosat-3/COSMIC (F-3/C) provide well-distributed information about the ionosphere around the world. In this study we developed GIMs of VTEC from a combination of GNSS, satellite altimetry and F-3/C data with a temporal resolution of 2 hours and a spatial resolution of 5 degrees in longitude and 2.5 degrees in latitude. The combined GIMs provide a more homogeneous global coverage and higher precision and reliability than the results of each individual technique.
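The precision gain from combining independent techniques can be illustrated with a toy inverse-variance weighting of VTEC estimates at a single grid point. The TECU values and uncertainties below are hypothetical, and the real GIM combination is a far richer spatio-temporal adjustment than this one-point sketch:

```python
# Inverse-variance combination of independent VTEC estimates at one grid point.
# Each entry is (value in TECU, 1-sigma uncertainty), e.g. from GNSS, altimetry, F-3/C.
def combine(estimates):
    """Return the minimum-variance weighted mean and its formal uncertainty."""
    w = [1.0 / s ** 2 for _, s in estimates]
    value = sum(wi * vi for (vi, _), wi in zip(estimates, w)) / sum(w)
    sigma = (1.0 / sum(w)) ** 0.5
    return value, sigma

vtec, sigma = combine([(18.2, 2.0), (17.5, 1.5), (19.0, 3.0)])
print(round(vtec, 2), round(sigma, 2))
```

The combined uncertainty is always smaller than that of the best single technique, which is the sense in which merging GNSS, altimetry and occultation data yields higher precision where any one technique is weak.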
Precision Time Protocol-Based Trilateration for Planetary Navigation
NASA Technical Reports Server (NTRS)
Murdock, Ron
2015-01-01
Progeny Systems Corporation has developed a high-fidelity, field-scalable, non-Global Positioning System (GPS) navigation system that offers precision localization over communications channels. The system is bidirectional, providing position information to both base and mobile units. It is the first-ever wireless use of the Institute of Electrical and Electronics Engineers (IEEE) Precision Time Protocol (PTP) in a bidirectional trilateration navigation system. The innovation provides a precise and reliable navigation capability to support traverse-path planning systems and other mapping applications, and it establishes a core infrastructure for long-term lunar and planetary occupation. Mature technologies are integrated to provide navigation capability and to support data and voice communications on the same network. On Earth, the innovation is particularly well suited for use in unmanned aerial vehicles (UAVs), as it offers a non-GPS precision navigation and location service for use in GPS-denied environments. Its bidirectional capability provides real-time location data to the UAV operator and to the UAV. This approach optimizes assisted GPS techniques and can be used to determine the presence of GPS degradation, spoofing, or jamming.
Informed spectral analysis: audio signal parameter estimation using side information
NASA Astrophysics Data System (ADS)
Fourer, Dominique; Marchand, Sylvain
2013-12-01
Parametric models are of great interest for representing and manipulating sounds. However, the quality of the resulting signals depends on the precision of the parameters. When the signals are available, these parameters can be estimated, but the presence of noise decreases the resulting precision of the estimation. Furthermore, the Cramér-Rao bound shows the minimal error reachable with the best estimator, which can be insufficient for demanding applications. These limitations can be overcome by using the coding approach, which consists in directly transmitting the parameters with the best precision using the minimal bitrate. However, this approach does not take advantage of the information provided by the estimation from the signal and may require a larger bitrate and a loss of compatibility with existing file formats. The purpose of this article is to propose a compromise approach, called the 'informed approach,' which combines analysis with (coded) side information in order to increase the precision of parameter estimation using a lower bitrate than pure coding approaches, the audio signal being known. Thus, the analysis problem is presented in a coder/decoder configuration where the side information is computed and inaudibly embedded into the mixture signal at the coder. At the decoder, the extra information is extracted and is used to assist the analysis process. This study proposes applying this approach to audio spectral analysis using sinusoidal modeling, which is a well-known model with practical applications and for which theoretical bounds have been calculated. This work aims at uncovering new approaches for audio quality-based applications. It provides a solution for challenging problems like active listening of music, source separation, and realistic sound transformations.
NASA Technical Reports Server (NTRS)
Schmidlin, F. J.; Northam, E. T.; Michel, W. R.
1985-01-01
The inflatable sphere technique represents a relatively inexpensive approach for obtaining density and wind data between 30 and 90 km. The procedure in its current form is adequate for operational rocket network type application. However, detailed information is lost because of oversmoothing. The present study had the objective to determine whether more detailed wind profiles could be obtained using the inflatable falling sphere and Hirobin. Hirobin is the name for the sphere reduction program used at NASA Wallops Island, VA. In connection with the aim of the study, information had to be obtained regarding the precision of the radar used to track the sphere. For this purpose, data from three C-band radars, each with a different tracking precision, were simulated. On the basis of the results of the investigation, it is concluded that, given a radar with a known precision and a perfectly performing sphere, the Hirobin filters can be adjusted to provide small-scale wind information to about 70 km.
Klonoff, David C; Price, W Nicholson
2017-03-01
Privacy is an important concern for the Precision Medicine Initiative (PMI) because success of this initiative will require the public to be willing to participate by contributing large amounts of genetic/genomic information and sensor data. This sensitive personal information is intended to be used only for specified research purposes. Public willingness to participate will depend on the public's level of trust that their information will be protected and kept private. Medical devices may constantly provide information. Therefore, assuring privacy for device-generated information may be essential for broad participation in the PMI. Privacy standards for devices should be an important early step in the development of the PMI.
Pharmacogenomic Biomarkers: an FDA Perspective on Utilization in Biological Product Labeling.
Schuck, Robert N; Grillo, Joseph A
2016-05-01
Precision medicine promises to improve both the efficacy and safety of therapeutic products by better informing why some patients respond well to a drug, and some experience adverse reactions, while others do not. Pharmacogenomics is a key component of precision medicine and can be utilized to select optimal doses for patients, more precisely identify individuals who will respond to a treatment and avoid serious drug-related toxicities. Since pharmacogenomic biomarker information can help inform drug dosing, efficacy, and safety, pharmacogenomic data are critically reviewed by FDA staff to ensure effective use of pharmacogenomic strategies in drug development and appropriate incorporation into product labels. Pharmacogenomic information may be provided in drug or biological product labeling to inform health care providers about the impact of genotype on response to a drug through description of relevant genomic markers, functional effects of genomic variants, dosing recommendations based on genotype, and other applicable genomic information. The format and content of labeling for biologic drugs will generally follow that of small molecule drugs; however, there are notable differences in pharmacogenomic information that might be considered useful for biologic drugs in comparison to small molecule drugs. Furthermore, the rapid entry of biologic drugs for treatment of rare genetic diseases and molecularly defined subsets of common diseases will likely lead to increased use of pharmacogenomic information in biologic drug labels in the near future. In this review, we outline the general principles of therapeutic product labeling and discuss the utilization of pharmacogenomic information in biologic drug labels.
ERIC Educational Resources Information Center
Texas State Technical Coll., Waco.
This volume developed by the Machine Tool Advanced Skill Technology (MAST) program contains key administrative documents and provides additional sources for machine tool and precision manufacturing information and important points of contact in the industry. The document contains the following sections: a foreword; grant award letter; timeline for…
Data and Time Transfer Using SONET Radio
NASA Technical Reports Server (NTRS)
Graceffo, Gary M.
1996-01-01
The need for precise knowledge of time and frequency has become ubiquitous throughout our society. Astronomy, navigation, and high-speed wide-area networks are among the many consumers of this type of information. The Global Positioning System (GPS) has the potential to be the most comprehensive source of precise timing information developed to date; however, the introduction of selective availability has made it difficult for many users to recover this information from the GPS system with the precision required for today's systems. The system described in this paper is a Synchronous Optical Network (SONET) Radio Data and Time Transfer System. The objective of this system is to provide precise time and frequency information to a variety of end users using a two-way data and time-transfer system. Although time and frequency transfers have been done for many years, this system is unique in that time and frequency information are embedded into existing communications traffic. This eliminates the need to make the transfer of time and frequency information a dedicated function of the communications system. For this system, SONET has been selected as the transport format from which precise time is derived. SONET has been selected because of its high data rates and its increasing acceptance throughout the industry. This paper details a proof-of-concept initiative to perform embedded time and frequency transfers using SONET Radio.
What can neuromorphic event-driven precise timing add to spike-based pattern recognition?
Akolkar, Himanshu; Meyer, Cedric; Clady, Xavier; Marre, Olivier; Bartolozzi, Chiara; Panzeri, Stefano; Benosman, Ryad
2015-03-01
This letter introduces a study to precisely measure what an increase in spike timing precision can add to spike-driven pattern recognition algorithms. The concept of generating spikes from images by converting gray levels into spike timings is currently at the basis of almost every spike-based model of biological visual systems. The use of images naturally leads to generating incorrect artificial and redundant spike timings and, more important, also contradicts biological findings indicating that visual processing is massively parallel, asynchronous, and of high temporal resolution. A new concept for acquiring visual information through pixel-individual asynchronous level-crossing sampling has been proposed in a recent generation of asynchronous neuromorphic visual sensors. Unlike conventional cameras, these sensors acquire data not at fixed points in time for the entire array but at fixed amplitude changes of their input, producing output that is optimally sparse in space and time: pixel-individual and precisely timed only when new (previously unknown) information is available (event based). This letter uses the high temporal resolution spiking output of neuromorphic event-based visual sensors to show that lowering time precision degrades performance on several recognition tasks, specifically when reaching the conventional range of machine vision acquisition frequencies (30-60 Hz). The use of information theory to characterize separability between classes for each temporal resolution shows that high temporal acquisition provides up to 70% more information than conventional spikes generated from frame-based acquisition as used in standard artificial vision, thus drastically increasing the separability between classes of objects. Experiments on real data show that the amount of information loss is correlated with temporal precision. Our information-theoretic study highlights the potential of neuromorphic asynchronous visual sensors for both practical applications and theoretical investigations. Moreover, it suggests that representing visual information as a precise sequence of spike times, as reported in the retina, offers considerable advantages for neuro-inspired visual computations.
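The letter's central point, that coarsening temporal resolution destroys class information, can be reproduced in a toy mutual-information calculation. The two-class spike-timing model, the 1 ms jitter, and the bin widths below are invented for illustration, not the letter's data:

```python
import math
import random
from collections import Counter

random.seed(1)

def mutual_info(pairs):
    """Plug-in estimate of I(class; bin) in bits from (class, bin) samples."""
    n = len(pairs)
    pc, pb, pcb = Counter(), Counter(), Counter(pairs)
    for c, b in pairs:
        pc[c] += 1
        pb[b] += 1
    mi = 0.0
    for (c, b), ncb in pcb.items():
        p = ncb / n
        mi += p * math.log2(p * n * n / (pc[c] * pb[b]))
    return mi

# Two stimulus classes distinguished only by spike timing (in ms), with 1 ms jitter.
def spike_time(cls):
    return (5.0 if cls == 0 else 12.0) + random.gauss(0, 1.0)

trials = [(c, spike_time(c)) for c in [0, 1] * 2000]

# Discretize spike times at several temporal resolutions and measure how much
# class information survives; 33 ms is roughly one frame of a 30 Hz camera.
info = {}
for dt in (1.0, 4.0, 33.0):
    info[dt] = mutual_info([(c, int(t // dt)) for c, t in trials])
print({dt: round(v, 2) for dt, v in info.items()})
```

At millisecond resolution the two classes are nearly fully separable (close to 1 bit), while at frame-rate resolution both spike times fall in the same bin and the class information collapses toward zero, mirroring the degradation reported at conventional acquisition frequencies.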
Geostatistics, remote sensing and precision farming.
Mulla, D J
1997-01-01
Precision farming is possible today because of advances in farming technology, procedures for mapping and interpolating spatial patterns, and geographic information systems for overlaying and interpreting several soil, landscape and crop attributes. The key component of precision farming is the map showing spatial patterns in field characteristics. Obtaining information for this map is often achieved by soil sampling. This approach, however, can be cost-prohibitive for grain crops. Soil sampling strategies can be simplified by use of auxiliary data provided by satellite or aerial photo imagery. This paper describes geostatistical methods for estimating spatial patterns in soil organic matter, soil test phosphorus and wheat grain yield from a combination of Thematic Mapper imaging and soil sampling.
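The interpolation step behind such field maps can be sketched with inverse-distance weighting, a simpler cousin of the geostatistical (kriging) methods the paper describes. The sample locations and soil-test phosphorus values below are hypothetical:

```python
# Hypothetical soil-test phosphorus samples: (x, y, value), coordinates in metres.
samples = [(0, 0, 12.0), (100, 0, 18.0), (0, 100, 15.0), (100, 100, 25.0), (50, 80, 20.0)]

def idw(px, py, power=2.0):
    """Inverse-distance-weighted estimate of the attribute at grid point (px, py)."""
    num = den = 0.0
    for sx, sy, v in samples:
        d2 = (px - sx) ** 2 + (py - sy) ** 2
        if d2 == 0:
            return v                     # exact interpolation at a sample point
        w = 1.0 / d2 ** (power / 2.0)    # weight = 1 / distance^power
        num += w * v
        den += w
    return num / den

# Estimate values on a coarse grid, as a stand-in for a field map layer.
grid = [[round(idw(px, py), 1) for px in range(0, 101, 50)] for py in range(0, 101, 50)]
print(grid)
```

Kriging replaces these fixed distance weights with weights derived from a fitted variogram, which is what lets it also report the estimation variance at each grid point; the sketch above only conveys the shared idea of distance-based spatial interpolation from sparse samples.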
Precision diagnostics: moving towards protein biomarker signatures of clinical utility in cancer.
Borrebaeck, Carl A K
2017-03-01
Interest in precision diagnostics has been fuelled by the concept that early detection of cancer would benefit patients; that is, if detected early, more tumours should be resectable and treatment more efficacious. Serum contains massive amounts of potentially diagnostic information, and affinity proteomics has risen as an accurate approach to decipher this, to generate actionable information that should result in more precise and evidence-based options to manage cancer. To achieve this, we need to move from single to multiplex biomarkers, a so-called signature, that can provide significantly increased diagnostic accuracy. This Opinion article focuses on the progress being made in identifying protein biomarker signatures of clinical utility, using blood-based proteomics.
-Omic and Electronic Health Record Big Data Analytics for Precision Medicine.
Wu, Po-Yen; Cheng, Chih-Wen; Kaddi, Chanchala D; Venugopalan, Janani; Hoffman, Ryan; Wang, May D
2017-02-01
Rapid advances of high-throughput technologies and wide adoption of electronic health records (EHRs) have led to fast accumulation of -omic and EHR data. These voluminous complex data contain abundant information for precision medicine, and big data analytics can extract such knowledge to improve the quality of healthcare. In this paper, we present -omic and EHR data characteristics, associated challenges, and data analytics including data preprocessing, mining, and modeling. To demonstrate how big data analytics enables precision medicine, we provide two case studies, including identifying disease biomarkers from multi-omic data and incorporating -omic information into EHR. Big data analytics is able to address -omic and EHR data challenges for a paradigm shift toward precision medicine. Big data analytics makes sense of -omic and EHR data to improve healthcare outcomes. It has a long-lasting societal impact.
Zheng, Y.
2013-01-01
Temporal sound cues are essential for sound recognition, pitch, rhythm, and timbre perception, yet how auditory neurons encode such cues is the subject of ongoing debate. Rate coding theories propose that temporal sound features are represented by rate-tuned modulation filters. However, overwhelming evidence also suggests that precise spike timing is an essential attribute of the neural code. Here we demonstrate that single neurons in the auditory midbrain employ a proportional code in which spike-timing precision and firing reliability covary with the sound envelope cues to provide an efficient representation of the stimulus. Spike-timing precision varied systematically with the timescale and shape of the sound envelope and yet was largely independent of the sound modulation frequency, a prominent cue for pitch. In contrast, spike-count reliability was strongly affected by the modulation frequency. Spike-timing precision extends from sub-millisecond for brief transient sounds up to tens of milliseconds for sounds with slowly varying envelopes. Information-theoretic analysis further confirms that spike-timing precision depends strongly on the sound envelope shape, while firing reliability was strongly affected by the sound modulation frequency. Both the information efficiency and total information were limited by the firing reliability and spike-timing precision in a manner that reflected the sound structure. This result supports a temporal coding strategy in the auditory midbrain where proportional changes in spike-timing precision and firing reliability can efficiently signal shape and periodicity temporal cues. PMID:23636724
NASA Astrophysics Data System (ADS)
Tu, Rui; Zhang, Rui; Zhang, Pengfei; Liu, Jinhai; Lu, Xiaochun
2018-07-01
This study proposes an approach to facilitate real-time fast point positioning of the BeiDou Navigation Satellite System (BDS) based on regional augmentation information. We term this the precise positioning based on augmentation information (BPP) approach. The coordinates of the reference stations were tightly constrained to extract the augmentation information, which contained not only the satellite orbit and clock errors correlated with the satellite running state, but also the atmosphere error and unmodeled error, which are correlated with the spatial and temporal states. Based on these mixed augmentation corrections, a precise point positioning (PPP) model could be used to estimate the coordinates of the user stations, and the float ambiguity could be easily fixed for the single-difference between satellites. Thus, this technique provided a quick and high-precision positioning service. Three datasets with small, medium, and large baselines (0.6 km, 30 km and 136 km) were used to validate the feasibility and effectiveness of the proposed BPP method. The validations showed that, using the BPP model, a 1–2 cm positioning service can be provided over a 100 km wide area after just 2 s of initialization. Thus, as the proposed approach capitalizes on the strengths of both PPP and RTK while remaining consistent in application, it can be used for area augmentation positioning.
Taylor, Seth; Carroll, Adam; Lord, Jessi
2016-07-01
Amplion, Inc. (OR, USA) is focused on progressing the primary drivers of precision medicine. Focused on enabling the front end of the healthcare value chain, pharmaceutical developers and diagnostic test developers, Amplion zeros in on the research and market components that will make precision medicine a reality. With BiomarkerBase™, Amplion's flagship product, Amplion provides evidence-based biomarker information that supports the key strategic decisions pharmaceutical and diagnostic developers need to make to be successful in the emerging world of precision medicine. A passion for saving lives and improving patient outcomes using precision medicine inspires Amplion's product BiomarkerBase™. A unique combination of hard science and data science positions Amplion to surface the relationships between biomarkers and clinical evidence that give pharmaceutical and diagnostic companies unique insight into the technical realities and market opportunities provided by biomarkers.
Tassé, Marc J; Schalock, Robert L; Thissen, David; Balboni, Giulia; Bersani, Henry Hank; Borthwick-Duffy, Sharon A; Spreat, Scott; Widaman, Keith F; Zhang, Dalun; Navas, Patricia
2016-03-01
The Diagnostic Adaptive Behavior Scale (DABS) was developed using item response theory (IRT) methods and was constructed to provide the most precise and valid adaptive behavior information at or near the cutoff point of making a decision regarding a diagnosis of intellectual disability. The DABS initial item pool consisted of 260 items. Using IRT modeling and a nationally representative standardization sample, the item set was reduced to 75 items that provide the most precise adaptive behavior information at the cutoff area determining the presence or not of significant adaptive behavior deficits across conceptual, social, and practical skills. The standardization of the DABS is described and discussed.
The Chronotron: A Neuron That Learns to Fire Temporally Precise Spike Patterns
Florian, Răzvan V.
2012-01-01
In many cases, neurons process information carried by the precise timings of spikes. Here we show how neurons can learn to generate specific temporally precise output spikes in response to input patterns of spikes having precise timings, thus processing and memorizing information that is entirely temporally coded, both as input and as output. We introduce two new supervised learning rules for spiking neurons with temporal coding of information (chronotrons), one that provides high memory capacity (E-learning), and one that has a higher biological plausibility (I-learning). With I-learning, the neuron learns to fire the target spike trains through synaptic changes that are proportional to the synaptic currents at the timings of real and target output spikes. We study these learning rules in computer simulations where we train integrate-and-fire neurons. Both learning rules allow neurons to fire at the desired timings, with sub-millisecond precision. We show how chronotrons can learn to classify their inputs, by firing identical, temporally precise spike trains for different inputs belonging to the same class. When the input is noisy, the classification also leads to noise reduction. We compute lower bounds for the memory capacity of chronotrons and explore the influence of various parameters on chronotrons' performance. The chronotrons can model neurons that encode information in the time of the first spike relative to the onset of salient stimuli or neurons in oscillatory networks that encode information in the phases of spikes relative to the background oscillation. Our results show that firing one spike per cycle optimizes memory capacity in neurons encoding information in the phase of firing relative to a background rhythm. PMID:22879876
C2 of Space: The Key to Full Spectrum Dominance
1999-01-01
created the Air Force Research Laboratory in 1997, AFRL/IF was tasked to provide Information Dominance technologies to the warfighter. These critical...allowing the future Battle Manager’s control of the battlespace. The first five ITTPs come under AFRL’s Information Dominance thrust area...time Sensor-to-Shooter, falls under the Precision Strike thrust area. This paper provides a brief background regarding Information Dominance and
Johnson, Mark B.; Voas, Robert B.; Kelley-Baker, Tara; Furr-Holden, C. Debra M.
2009-01-01
Objective: We examined the effect of providing drinkers with blood alcohol concentration (BAC) information on subjective assessments of alcohol impairment and drunk-driving risk. Method: We sampled 959 drinking participants from a natural drinking environment and asked them to self-administer a personal saliva-based alcohol test. Participants then were asked to rate their alcohol impairment and to indicate whether they could drive legally under one of four BAC feedback conditions (assigned at random): (1) control condition (no BAC feedback provided before the ratings); (2) categorical BAC information (low, high, and highest risk) from the saliva test; (3) categorical BAC information corroborated by a calibrated police breath alcohol analyzer; and (4) precise (three-digit) BAC information from the breath alcohol analyzer. Results: Both control participants and participants who received precise BAC feedback gave subjective impairment ratings that correlated with actual BACs. For participants who received categorical BAC information from the saliva test, subjective impairment did not correlate with the actual BAC. Providing drinkers with BAC information, however, did help them predict more accurately if their BAC was higher than the legal BAC driving limit. Conclusions: Although BAC information can influence drinkers' assessments of alcohol impairment and drunk-driving risk, there is no strong evidence that personal saliva-based alcohol tests are particularly useful. PMID:18612570
Klonoff, David C.; Price, W. Nicholson
2017-01-01
Privacy is an important concern for the Precision Medicine Initiative (PMI) because success of this initiative will require the public to be willing to participate by contributing large amounts of genetic/genomic information and sensor data. This sensitive personal information is intended to be used only for specified research purposes. Public willingness to participate will depend on the public’s level of trust that their information will be protected and kept private. Medical devices may constantly provide information. Therefore, assuring privacy for device-generated information may be essential for broad participation in the PMI. Privacy standards for devices should be an important early step in the development of the PMI. PMID:27920271
1983-05-01
serve as interfaces to provide a compatible hook-up. Recent advances in microprocessors are leading to the development of new techniques known as...killed. This sort of threat is new to warfare. Because of the lack of experience against such a threat, effective tactics have not been developed...Recent developments have been on satellite-to-earth communications links, antijam capabilities, adaptive array antennas, and new equipment to exploit the
An integrated clinical and genomic information system for cancer precision medicine.
Jang, Yeongjun; Choi, Taekjin; Kim, Jongho; Park, Jisub; Seo, Jihae; Kim, Sangok; Kwon, Yeajee; Lee, Seungjae; Lee, Sanghyuk
2018-04-20
Increasing affordability of next-generation sequencing (NGS) has created an opportunity for realizing genomically informed personalized cancer therapy as a path to precision oncology. However, the complex nature of genomic information presents a huge challenge for clinicians in interpreting the patient's genomic alterations and selecting the optimum approved or investigational therapy. An elaborate and practical information system is urgently needed to support clinical decisions as well as to test clinical hypotheses quickly. Here, we present an integrated clinical and genomic information system (CGIS) based on NGS data analyses. Major components include modules for handling clinical data, NGS data processing, variant annotation and prioritization, drug-target-pathway analysis, and population cohort explorer. We built a comprehensive knowledgebase of genes, variants, and drugs by collecting annotated information from public and in-house resources. Structured reports for molecular pathology are generated using standardized terminology in order to help clinicians interpret genomic variants and utilize them for targeted cancer therapy. We also implemented many features useful for testing hypotheses to develop prognostic markers from mutation and gene expression data. Our CGIS software is an attempt to provide useful information for both clinicians and scientists who want to explore genomic information for precision oncology.
-Omic and Electronic Health Records Big Data Analytics for Precision Medicine
Wu, Po-Yen; Cheng, Chih-Wen; Kaddi, Chanchala D.; Venugopalan, Janani; Hoffman, Ryan; Wang, May D.
2017-01-01
Objective: Rapid advances of high-throughput technologies and wide adoption of electronic health records (EHRs) have led to fast accumulation of -omic and EHR data. These voluminous complex data contain abundant information for precision medicine, and big data analytics can extract such knowledge to improve the quality of health care. Methods: In this article, we present -omic and EHR data characteristics, associated challenges, and data analytics including data pre-processing, mining, and modeling. Results: To demonstrate how big data analytics enables precision medicine, we provide two case studies, including identifying disease biomarkers from multi-omic data and incorporating -omic information into EHRs. Conclusion: Big data analytics is able to address -omic and EHR data challenges for a paradigm shift towards precision medicine. Significance: Big data analytics makes sense of -omic and EHR data to improve healthcare outcomes. It has long-lasting societal impact. PMID:27740470
The emerging potential for network analysis to inform precision cancer medicine.
Ozturk, Kivilcim; Dow, Michelle; Carlin, Daniel E; Bejar, Rafael; Carter, Hannah
2018-06-14
Precision cancer medicine promises to tailor clinical decisions to patients using genomic information. Indeed, successes of drugs targeting genetic alterations in tumors, such as imatinib that targets BCR-ABL in chronic myelogenous leukemia, have demonstrated the power of this approach. However, biological systems are complex, and patients may differ not only by the specific genetic alterations in their tumor, but by more subtle interactions among such alterations. Systems biology, and more specifically network analysis, provides a framework for advancing precision medicine beyond clinical actionability of individual mutations. Here we discuss applications of network analysis to study tumor biology, early methods for N-of-1 tumor genome analysis, and the path for such tools to the clinic.
The GLAS Algorithm Theoretical Basis Document for Precision Orbit Determination (POD)
NASA Technical Reports Server (NTRS)
Rim, Hyung Jin; Yoon, S. P.; Schutz, Bob E.
2013-01-01
The Geoscience Laser Altimeter System (GLAS) was the sole instrument for NASA's Ice, Cloud and land Elevation Satellite (ICESat) laser altimetry mission. The primary purpose of the ICESat mission was to make ice sheet elevation measurements of the polar regions. Additional goals were to measure the global distribution of clouds and aerosols and to map sea ice, land topography and vegetation. ICESat was the benchmark Earth Observing System (EOS) mission to be used to determine the mass balance of the ice sheets, as well as for providing cloud property information, especially for stratospheric clouds common over polar areas. The GLAS instrument operated from 2003 to 2009 and provided multi-year elevation data needed to determine changes in sea ice freeboard, land topography and vegetation around the globe, in addition to elevation changes of the Greenland and Antarctic ice sheets. This document describes the Precision Orbit Determination (POD) algorithm for the ICESat mission. The problem of determining an accurate ephemeris for an orbiting satellite involves estimating the position and velocity of the satellite from a sequence of observations. The ICESat/GLAS elevation measurements must be very accurately geolocated, combining precise orbit information with precision pointing information. The ICESat mission POD requirement states that the position of the instrument should be determined with an accuracy of 5 and 20 cm (1-sigma) in radial and horizontal components, respectively, to meet the science requirements for determining elevation change.
Modeling and Implementation of Multi-Position Non-Continuous Rotation Gyroscope North Finder.
Luo, Jun; Wang, Zhiqian; Shen, Chengwu; Kuijper, Arjan; Wen, Zhuoman; Liu, Shaojin
2016-09-20
Even when the Global Positioning System (GPS) signal is blocked, a rate gyroscope (gyro) north finder is capable of providing the required azimuth reference information to a certain extent. In order to measure the azimuth between the observer and the north direction very accurately, we propose a multi-position non-continuous rotation gyro north finding scheme. Our generalized mathematical model analyzes the elements that affect azimuth measurement precision and can thus provide high-precision azimuth reference information. Based on the gyro's principle of detecting a projection of the earth rotation rate on its sensitive axis and the proposed north finding scheme, we derive an accurate mathematical model of the gyro outputs as a function of azimuth, accounting for gyro and shaft misalignments. Combining the gyro output model with the theory of propagation of uncertainty, we provide several approaches to optimize north finding, including reducing the gyro bias error, constraining the gyro random error, increasing the number of rotation points, improving rotation angle measurement precision, and decreasing the gyro and shaft misalignment angles. Following these approaches, a north finder setup was built and an azimuth uncertainty of 18" was obtained. This paper provides systematic theory for analyzing the details of the gyro north finder scheme from simulation to implementation. The proposed theory can guide both applied researchers in academia and advanced practitioners in industry in designing high-precision, robust north finders based on different types of rate gyroscopes.
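The measurement principle in the abstract above, a rate gyro sensing the projection of the earth rotation rate on its sensitive axis at several rotation positions, lends itself to a short numerical sketch. Everything below (latitude, 16 positions, bias and noise levels) is an illustrative assumption, not taken from the paper. For positions equally spaced over a full turn, the least-squares fit of w = a*cos(t) + b*sin(t) + c reduces to two Fourier-type sums, and the constant gyro bias cancels exactly.

```python
import math
import random

# Assumed illustrative parameters (not from the paper).
OMEGA_E = 7.292115e-5          # earth rotation rate, rad/s
LAT = math.radians(45.0)       # observer latitude
K = OMEGA_E * math.cos(LAT)    # horizontal earth-rate component seen by the gyro

def simulate_outputs(azimuth, thetas, bias=1e-6, noise=2e-8, seed=0):
    """Gyro output at each rotation position: projection of the earth rate
    on the sensitive axis, plus a constant bias and white noise (rad/s)."""
    rng = random.Random(seed)
    return [K * math.cos(azimuth + t) + bias + rng.gauss(0, noise) for t in thetas]

def estimate_azimuth(thetas, outputs):
    """For positions equally spaced over a full turn, these sums solve the
    least-squares fit w = a*cos(t) + b*sin(t) + c exactly, with
    a = K*cos(A) and b = -K*sin(A); the bias term c drops out."""
    n = len(thetas)
    a = 2.0 / n * sum(w * math.cos(t) for w, t in zip(outputs, thetas))
    b = 2.0 / n * sum(w * math.sin(t) for w, t in zip(outputs, thetas))
    return math.atan2(-b, a)

thetas = [2 * math.pi * i / 16 for i in range(16)]   # 16 rotation positions
true_az = math.radians(30.0)
est = estimate_azimuth(thetas, simulate_outputs(true_az, thetas))
print(round(math.degrees(est), 3))   # prints a value close to 30.0
```

The sketch also shows why more rotation points help, one of the optimizations the abstract lists: the noise contribution to the two sums shrinks roughly as the square root of the number of positions.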
NASA Technical Reports Server (NTRS)
1975-01-01
The Proceedings contain the papers presented at the Seventh Annual Precise Time and Time Interval (PTTI) Applications and Planning Meeting and the edited record of the discussion period following each paper. This meeting provided a forum to promote more effective, efficient, economical and skillful applications of PTTI technology to the many problem areas to which PTTI offers solutions. Specifically the purpose of the meeting is to: disseminate, coordinate, and exchange practical information associated with precise time and frequency; acquaint systems engineers, technicians and managers with precise time and frequency technology and its applications; and review present and future requirements for PTTI.
Master-slave micromanipulator apparatus
Morimoto, A.K.; Kozlowski, D.M.; Charles, S.T.; Spalding, J.A.
1999-08-31
An apparatus is disclosed based on precision X-Y stages that are stacked. Attached to arms projecting from each X-Y stage is a set of two-axis gimbals. Attached to the gimbals is a rod, which provides motion along its axis and rotation around it. The dual-planar apparatus provides six degrees of freedom of motion, precise to within microns. Precision linear stages, along with precision linear motors, encoders, and controls, provide a robotics system. The motors can be positioned in a remote location by incorporating a set of bellows and can be connected through a computer controller that allows one to be a master and the other to be a slave. Position information from the master can be used to control the slave. Forces of interaction of the slave with its environment can be reflected back to the motor control of the master to provide a sense of the force sensed by the slave. Forces imparted onto the master by the operator can be fed back into the control of the slave to reduce the forces required to move it. 12 figs.
Master-slave micromanipulator method
Morimoto, Alan K.; Kozlowski, David M.; Charles, Steven T.; Spalding, James A.
1999-01-01
A method based on precision X-Y stages that are stacked. Attached to arms projecting from each X-Y stage is a set of two-axis gimbals. Attached to the gimbals is a rod, which provides motion along its axis and rotation around it. The dual-planar apparatus provides six degrees of freedom of motion, precise to within microns. Precision linear stages, along with precision linear motors, encoders, and controls, provide a robotics system. The motors can be operated remotely by incorporating a set of bellows and can be connected through a computer controller that allows one to be a master and the other to be a slave. Position information from the master can be used to control the slave. Forces of interaction of the slave with its environment can be reflected back to the motor control of the master to provide a sense of the force sensed by the slave. Forces imparted onto the master by the operator can be fed back into the control of the slave to reduce the forces required to move it.
Wang, Shun-Yi; Chen, Xian-Xia; Li, Yi; Zhang, Yu-Ying
2016-12-20
The arrival of the precision medicine plan brings new opportunities and challenges for patients undergoing precision diagnosis and treatment of malignant tumors. With the development of medical imaging, information from different modality imaging can be integrated and comprehensively analyzed by imaging fusion systems. This review aimed to update the application of multimodality imaging fusion technology in the precise diagnosis and treatment of malignant tumors under the precision medicine plan. We introduced several multimodality imaging fusion technologies and their application to the diagnosis and treatment of malignant tumors in clinical practice. The data cited in this review were obtained mainly from the PubMed database from 1996 to 2016, using the keywords of "precision medicine", "fusion imaging", "multimodality", and "tumor diagnosis and treatment". Original articles, clinical practice, reviews, and other relevant literatures published in English were reviewed. Papers focusing on precision medicine, fusion imaging, multimodality, and tumor diagnosis and treatment were selected. Duplicated papers were excluded. Multimodality imaging fusion technology plays an important role in tumor diagnosis and treatment under the precision medicine plan, such as accurate location, qualitative diagnosis, tumor staging, treatment plan design, and real-time intraoperative monitoring. Multimodality imaging fusion systems could provide more imaging information of tumors from different dimensions and angles, thereby offering strong technical support for the implementation of precision oncology. Under the precision medicine plan, personalized treatment of tumors is a distinct possibility. We believe that multimodality imaging fusion technology will find an increasingly wide application in clinical practice.
Flight Test Performance of a High Precision Navigation Doppler Lidar
NASA Technical Reports Server (NTRS)
Pierrottet, Diego; Amzajerdian, Farzin; Petway, Larry; Barnes, Bruce; Lockard, George
2009-01-01
A navigation Doppler Lidar (DL) was developed at NASA Langley Research Center (LaRC) for high precision velocity measurements from a lunar or planetary landing vehicle in support of the Autonomous Landing and Hazard Avoidance Technology (ALHAT) project. A unique feature of this DL is that it has the capability to provide a precision velocity vector which can be easily separated into horizontal and vertical velocity components and high accuracy line of sight (LOS) range measurements. This dual mode of operation can provide useful information, such as vehicle orientation relative to the direction of travel, and vehicle attitude relative to the sensor footprint on the ground. System performance was evaluated in a series of helicopter flight tests over the California desert. This paper provides a description of the DL system and presents results obtained from these flight tests.
[Precision Nursing: Individual-Based Knowledge Translation].
Chiang, Li-Chi; Yeh, Mei-Ling; Su, Sui-Lung
2016-12-01
U.S. President Obama announced a new era of precision medicine in the Precision Medicine Initiative (PMI). This initiative aims to accelerate the progress of personalized medicine in light of individual requirements for prevention and treatment in order to improve the state of individual and public health. The recent and dramatic development of large-scale biologic databases (such as the human genome sequence), powerful methods for characterizing patients (such as genomics, microbiome, diverse biomarkers, and even pharmacogenomics), and computational tools for analyzing big data are maximizing the potential benefits of precision medicine. Nursing science should follow and keep pace with this trend in order to develop empirical knowledge and expertise in the area of personalized nursing care. Nursing scientists must encourage, examine, and put into practice innovative research on precision nursing in order to provide evidence-based guidance to clinical practice. The applications in personalized precision nursing care include: explanations of personalized information such as the results of genetic testing; patient advocacy and support; anticipation of results and treatment; ongoing chronic monitoring; and support for shared decision-making throughout the disease trajectory. Further, attention must focus on the family and the ethical implications of taking a personalized approach to care. Nurses will need to embrace the paradigm shift to precision nursing and work collaboratively across disciplines to provide the optimal personalized care to patients. If realized, the full potential of precision nursing will provide the best chance for good health for all.
Field Demonstrations of Active Laser Ranging with Sub-mm Precision
NASA Technical Reports Server (NTRS)
Chen, Yijiang; Birnbaum, Kevin M.; Hemmati, Hamid
2011-01-01
Precision ranging between planets will provide valuable information for scientific studies of the solar system and fundamental physics. Current passive ranging techniques using retro-reflectors are limited to the Earth-Moon distance due to the 1/R^4 losses. We report on a laboratory realization and field implementation of active laser ranging in real-time with two terminals, emulating interplanetary distance. Sub-millimeter accuracy is demonstrated.
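The distance scaling behind the abstract's limitation is simple link-budget arithmetic: a passive retro-reflector link attenuates as 1/R^2 on the outbound leg and again on the return, for 1/R^4 overall, while an active terminal that regenerates the signal pays only 1/R^2 per one-way leg. A rough illustration (the distances below are generic round numbers, not values from the paper):

```python
# Why passive retro-reflector ranging stops working far beyond the Moon:
# the two-way passive link loses signal as 1/R**4 (1/R**2 out, 1/R**2 back),
# while an active transponder that regenerates the signal loses only
# 1/R**2 per one-way leg.
R_MOON = 3.84e8    # metres, approximate Earth-Moon distance
R_MARS = 2.25e11   # metres, a representative Earth-Mars distance

ratio = R_MARS / R_MOON
passive_penalty = ratio ** 4   # extra loss factor for the passive link
active_penalty = ratio ** 2    # extra loss factor per active one-way leg

print(f"distance ratio: {ratio:.0f}x")
print(f"passive 1/R^4 penalty: {passive_penalty:.1e}")
print(f"active  1/R^2 penalty: {active_penalty:.1e}")
```

The passive penalty is the square of the active one, which is why active two-terminal ranging is the natural route to interplanetary distances.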
Happel, Max F K; Jeschke, Marcus; Ohl, Frank W
2010-08-18
Primary sensory cortex integrates sensory information from afferent feedforward thalamocortical projection systems and convergent intracortical microcircuits. Both input systems have been demonstrated to provide different aspects of sensory information. Here we have used high-density recordings of laminar current source density (CSD) distributions in primary auditory cortex of Mongolian gerbils in combination with pharmacological silencing of cortical activity and analysis of the residual CSD, to dissociate the feedforward thalamocortical contribution and the intracortical contribution to spectral integration. We found a temporally highly precise integration of both types of inputs when the stimulation frequency was in close spectral neighborhood of the best frequency of the measurement site, in which the overlap between both inputs is maximal. Local intracortical connections provide both directly feedforward excitatory and modulatory input from adjacent cortical sites, which determine how concurrent afferent inputs are integrated. Through separate excitatory horizontal projections, terminating in cortical layers II/III, information about stimulus energy in greater spectral distance is provided even over long cortical distances. These projections effectively broaden spectral tuning width. Based on these data, we suggest a mechanism of spectral integration in primary auditory cortex that is based on temporally precise interactions of afferent thalamocortical inputs and different short- and long-range intracortical networks. The proposed conceptual framework allows integration of different and partly controversial anatomical and physiological models of spectral integration in the literature.
42 CFR 85.7 - Conduct of investigations.
Code of Federal Regulations, 2010 CFR
2010-10-01
... investigation, the employer should precisely identify information which can be obtained in the workplace or workplaces to be inspected as trade secrets. If the NIOSH officer has no clear reason to question such... provide additional information in support of the trade secret designation. The Director, NIOSH, shall...
The least channel capacity for chaos synchronization.
Wang, Mogei; Wang, Xingyuan; Liu, Zhenzhen; Zhang, Huaguang
2011-03-01
Recently, researchers have found that a channel with capacity exceeding the Kolmogorov-Sinai entropy of the drive system (h(KS)) is theoretically necessary and sufficient to sustain unidirectional synchronization to arbitrarily high precision. In this study, we use symbolic dynamics and the automaton reset sequence to distinguish the information that is required to identify the current drive word and obtain synchronization. Then, we show that the least channel capacity that is sufficient to transmit the distinguished information and attain synchronization of arbitrarily high precision is h(KS). Numerical simulations provide support for our conclusions.
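The bound above can be made concrete for a one-dimensional chaotic map, where the Kolmogorov-Sinai entropy equals the positive Lyapunov exponent (Pesin's identity). A hedged sketch, not from the paper: for the fully chaotic logistic map the exact value is ln 2 nats, i.e. one bit per iteration, so a synchronization channel would need at least one bit per map step. The seed and iteration counts are arbitrary choices.

```python
import math

def lyapunov_logistic(r=4.0, x0=0.2, n=200_000, transient=1_000):
    """Estimate the Lyapunov exponent of x -> r*x*(1-x) by averaging
    log|f'(x)| = log|r*(1-2x)| along a trajectory.  For r = 4 the exact
    value is ln 2, which by Pesin's identity equals the KS entropy h(KS)."""
    x = x0
    for _ in range(transient):          # discard the initial transient
        x = r * x * (1 - x)
    acc = 0.0
    for _ in range(n):
        acc += math.log(abs(r * (1 - 2 * x)))
        x = r * x * (1 - x)
    return acc / n

h_ks = lyapunov_logistic()
print(h_ks, math.log(2))   # estimate vs the exact ln 2 = 0.6931...
```

In bits per iteration the estimate is h_ks / ln 2, approximately 1, which is the minimum channel capacity the abstract's result implies for this drive system.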
Automatically identifying health outcome information in MEDLINE records.
Demner-Fushman, Dina; Few, Barbara; Hauser, Susan E; Thoma, George
2006-01-01
Understanding the effect of a given intervention on the patient's health outcome is one of the key elements in providing optimal patient care. This study presents a methodology for automatic identification of outcomes-related information in medical text and evaluates its potential in satisfying clinical information needs related to health care outcomes. An annotation scheme based on an evidence-based medicine model for critical appraisal of evidence was developed and used to annotate 633 MEDLINE citations. Textual, structural, and meta-information features essential to outcome identification were learned from the created collection and used to develop an automatic system. Accuracy of automatic outcome identification was assessed in an intrinsic evaluation and in an extrinsic evaluation, in which ranking of MEDLINE search results obtained using PubMed Clinical Queries relied on identified outcome statements. The accuracy and positive predictive value of outcome identification were calculated. Effectiveness of the outcome-based ranking was measured using mean average precision and precision at rank 10. Automatic outcome identification achieved 88% to 93% accuracy. The positive predictive value of individual sentences identified as outcomes ranged from 30% to 37%. Outcome-based ranking improved retrieval accuracy, tripling mean average precision and achieving 389% improvement in precision at rank 10. Preliminary results in outcome-based document ranking show potential validity of the evidence-based medicine-model approach in timely delivery of information critical to clinical decision support at the point of service.
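The two ranking metrics reported in the abstract above, mean average precision and precision at rank 10, have compact definitions that a short sketch makes precise. The relevance judgments below are toy values, not the study's data, and average precision here is normalized by the number of relevant documents retrieved (one common convention).

```python
def average_precision(relevant_flags):
    """AP for one ranked list: mean of precision@k over the ranks k at
    which a relevant document appears (flags are 1/0 in rank order)."""
    hits, score = 0, 0.0
    for k, rel in enumerate(relevant_flags, start=1):
        if rel:
            hits += 1
            score += hits / k
    return score / hits if hits else 0.0

def precision_at(relevant_flags, k=10):
    """Fraction of the top-k retrieved documents that are relevant."""
    top = relevant_flags[:k]
    return sum(top) / len(top) if top else 0.0

# Toy judgments for two queries (1 = relevant), rank order left to right.
q1 = [1, 0, 1, 0, 0, 1, 0, 0, 0, 0]
q2 = [0, 1, 1, 0, 0, 0, 0, 0, 0, 0]
mean_ap = (average_precision(q1) + average_precision(q2)) / 2
print(round(mean_ap, 3), precision_at(q1, 10))   # prints 0.653 0.3
```

"Tripling mean average precision" in the abstract therefore means relevant outcome statements moved much higher in the ranked list, not merely that more of them were retrieved.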
Information Retrieval and the Philosophy of Language.
ERIC Educational Resources Information Center
Blair, David C.
2003-01-01
Provides an overview of some of the main ideas in the philosophy of language that have relevance to the issues of information retrieval, focusing on the description of the intellectual content. Highlights include retrieval problems; recall and precision; words and meanings; context; externalism and the philosophy of language; and scaffolding and…
Molecular Profiling of Liquid Biopsy Samples for Precision Medicine.
Campos, Camila D M; Jackson, Joshua M; Witek, Małgorzata A; Soper, Steven A
In the context of oncology, liquid biopsies consist of harvesting cancer biomarkers, such as circulating tumor cells, tumor-derived cell-free DNA, and extracellular vesicles, from bodily fluids. These biomarkers provide a source of clinically actionable molecular information that can enable precision medicine. Herein, we review technologies for the molecular profiling of liquid biopsy markers with special emphasis on the analysis of low abundant markers from mixed populations.
ERIC Educational Resources Information Center
Edge, Brittani; Velandia, Margarita; Lambert, Dayton M.; Roberts, Roland K.; Larson, James A.; English, Burton C.; Boyer, Christopher; Rejesus, Roderick; Mishra, Ashok
2017-01-01
Using information from precision farmer surveys conducted in the southern United States in 2005 and 2013, we evaluated changes in the use of precision farming information sources among cotton producers. Although Extension remains an important source for producers interested in precision farming information, the percentage of cotton producers using…
AVIRIS Spectrometer Maps Total Water Vapor Column
NASA Technical Reports Server (NTRS)
Conel, James E.; Green, Robert O.; Carrere, Veronique; Margolis, Jack S.; Alley, Ronald E.; Vane, Gregg A.; Bruegge, Carol J.; Gary, Bruce L.
1992-01-01
Airborne Visible/Infrared Imaging Spectrometer (AVIRIS) produces maps of vertical-column abundances of water vapor in the atmosphere with good precision and spatial resolution. Maps provide information for meteorology, climatology, and agriculture.
Quantitative optical metrology with CMOS cameras
NASA Astrophysics Data System (ADS)
Furlong, Cosme; Kolenovic, Ervin; Ferguson, Curtis F.
2004-08-01
Recent advances in laser technology, optical sensing, and computer processing of data have led to the development of advanced quantitative optical metrology techniques for high-accuracy measurements of absolute shapes and deformations of objects. These techniques provide noninvasive, remote, full-field-of-view information about the objects of interest. The information obtained relates to changes in shape and/or size of the objects, characterizes anomalies, and provides tools to enhance fabrication processes. Factors that influence selection and applicability of an optical technique include the required sensitivity, accuracy, and precision that are necessary for a particular application. In this paper, sensitivity, accuracy, and precision characteristics in quantitative optical metrology techniques, and specifically in optoelectronic holography (OEH) based on CMOS cameras, are discussed. Sensitivity, accuracy, and precision are investigated with the aid of National Institute of Standards and Technology (NIST) traceable gauges, demonstrating the applicability of CMOS cameras in quantitative optical metrology techniques. It is shown that the advanced nature of CMOS technology can be applied to challenging engineering applications, including the study of rapidly evolving phenomena occurring in MEMS and micromechatronics.
Towards precision medicine: from quantitative imaging to radiomics
Acharya, U. Rajendra; Hagiwara, Yuki; Sudarshan, Vidya K.; Chan, Wai Yee; Ng, Kwan Hoong
2018-01-01
Radiology (imaging) and imaging-guided interventions, which provide multi-parametric morphologic and functional information, are playing an increasingly significant role in precision medicine. Radiologists are trained to understand the imaging phenotypes, transcribe those observations (phenotypes) to correlate with underlying diseases and to characterize the images. However, in order to understand and characterize the molecular phenotype (to obtain genomic information) of solid heterogeneous tumours, the advanced sequencing of those tissues using biopsy is required. Thus, radiologists image the tissues from various views and angles in order to capture the complete image phenotypes, thereby acquiring a huge amount of data. Deriving meaningful details from all these radiological data becomes challenging and raises big data issues. Therefore, interest in the application of radiomics has been growing in recent years, as it has the potential to provide significant interpretive and predictive information for decision support. Radiomics is a combination of conventional computer-aided diagnosis, deep learning methods, and human skills, and thus can be used for quantitative characterization of tumour phenotypes. This paper discusses the overview of the radiomics workflow, the results of various radiomics-based studies conducted using radiological images such as computed tomography (CT), magnetic resonance imaging (MRI), and positron-emission tomography (PET), the challenges we are facing, and the potential contribution of radiomics towards precision medicine. PMID:29308604
U.S. Navy Interoperability with its High-End Allies
2000-10-01
Precision weapons require tremendous amounts of information from multiple sensors. Information is first used to plan missions. Then when the weapon is...programmed and launched, information must be continuously transmitted at very high rates of speed. The U.S. has developed systems capable of...liberal, on the assumption that advanced sensors can provide sufficient information to judge the severity of incoming threats U.S. allies develop
An economic analysis of five selected LANDSAT assisted information systems in Oregon
NASA Technical Reports Server (NTRS)
Solomon, S.; Maher, K. M.
1979-01-01
A comparative cost analysis was performed on five LANDSAT-based information systems. In all cases, the LANDSAT system was found to have cost advantages over its alternative. The information sets generated by LANDSAT and the alternative method are not identical but are comparable in terms of satisfying the needs of the sponsor. The information obtained from the LANDSAT system in some cases is said to lack precision and detail. On the other hand, it was found to be superior in terms of providing information on areas that are inaccessible and unobtainable through conventional means. There is therefore a trade-off between precision and detail, and considerations of cost. The projects examined were concerned with locating irrigation circles in Morrow County; monitoring tansy ragwort infestation; inventorying old-growth Douglas fir near Spotted Owl habitats; inventorying vegetation and resources in all state-owned lands; and determining land use for Columbia River water policies.
Effects of experimental design on calibration curve precision in routine analysis
Pimentel, Maria Fernanda; Neto, Benício de Barros; Saldanha, Teresa Cristina B.
1998-01-01
A computational program which compares the efficiencies of different experimental designs with those of maximum precision (D-optimized designs) is described. The program produces confidence interval plots for a calibration curve and provides information about the number of standard solutions, concentration levels, and suitable concentration ranges to achieve an optimum calibration. Some examples of the application of this novel computational program are given, using both simulated and real data. PMID:18924816
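The link between the placement of standard solutions and calibration precision follows from the standard confidence-band formula for a straight-line fit, whose half-width at concentration x is proportional to sqrt(1/n + (x - xbar)^2 / Sxx): designs that spread the standards increase Sxx and narrow the band. A minimal sketch of that comparison; the designs and noise level are illustrative assumptions, not output of the program described above.

```python
import math

def sxx(xs):
    """Sum of squared deviations of the design points from their mean."""
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs)

def band_halfwidth(xs, x, sigma=1.0, t=2.0):
    """Half-width of the confidence band for the fitted line at x:
    t * sigma * sqrt(1/n + (x - xbar)^2 / Sxx)."""
    n, m = len(xs), sum(xs) / len(xs)
    return t * sigma * math.sqrt(1 / n + (x - m) ** 2 / sxx(xs))

uniform = [0, 2, 4, 6, 8, 10]        # evenly spaced standards
extreme = [0, 0, 0, 10, 10, 10]      # replicated endpoints (D-optimal for a line)

print(sxx(uniform), sxx(extreme))                 # 70.0 vs 150.0
print(round(band_halfwidth(uniform, 10), 3),
      round(band_halfwidth(extreme, 10), 3))      # 1.448 vs 1.155
```

The endpoint design is D-optimal for a straight line but gives no way to check linearity, which is why a practical program, like the one described, also weighs the number of levels and the concentration range.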
Baynam, Gareth; Bowman, Faye; Lister, Karla; Walker, Caroline E; Pachter, Nicholas; Goldblatt, Jack; Boycott, Kym M; Gahl, William A; Kosaki, Kenjiro; Adachi, Takeya; Ishii, Ken; Mahede, Trinity; McKenzie, Fiona; Townshend, Sharron; Slee, Jennie; Kiraly-Borri, Cathy; Vasudevan, Anand; Hawkins, Anne; Broley, Stephanie; Schofield, Lyn; Verhoef, Hedwig; Groza, Tudor; Zankl, Andreas; Robinson, Peter N; Haendel, Melissa; Brudno, Michael; Mattick, John S; Dinger, Marcel E; Roscioli, Tony; Cowley, Mark J; Olry, Annie; Hanauer, Marc; Alkuraya, Fowzan S; Taruscio, Domenica; Posada de la Paz, Manuel; Lochmüller, Hanns; Bushby, Kate; Thompson, Rachel; Hedley, Victoria; Lasko, Paul; Mina, Kym; Beilby, John; Tifft, Cynthia; Davis, Mark; Laing, Nigel G; Julkowska, Daria; Le Cam, Yann; Terry, Sharon F; Kaufmann, Petra; Eerola, Iiro; Norstedt, Irene; Rath, Ana; Suematsu, Makoto; Groft, Stephen C; Austin, Christopher P; Draghia-Akli, Ruxandra; Weeramanthri, Tarun S; Molster, Caron; Dawkins, Hugh J S
2017-01-01
Public health relies on technologies to produce and analyse data, as well as to develop and implement policies and practices effectively. An example is the public health practice of epidemiology, which relies on computational technology to monitor the health status of populations, identify disadvantaged or at-risk population groups and thereby inform health policy and priority setting. Early diagnosis and best care are critical to achieving health improvements for the underserved population of people living with rare diseases. In the rare diseases field, the vast majority of diseases are caused by destructive but previously difficult-to-identify protein-coding gene mutations. The falling cost of genetic testing and advances in the clinical use of genome sequencing, data science and imaging are converging to provide more precise understandings of the 'person-time-place' triad, that is: who is affected (person); when the disease is occurring (time); and where the disease is occurring (place). Consequently, we are witnessing a paradigm shift in public health policy and practice towards 'precision public health'. Patient and stakeholder engagement has informed the need for a national public health policy framework for rare diseases. The engagement approach in different countries has produced highly comparable outcomes and objectives.
Knowledge and experience sharing across the international rare diseases networks and partnerships has informed the development of the Western Australian Rare Diseases Strategic Framework 2015-2018 (RD Framework) and Australian government health briefings on the need for a National plan. The RD Framework is guiding the translation of genomic and other technologies into the Western Australian health system, leading to greater precision in diagnostic pathways and care, and is an example of how a precision public health framework can improve health outcomes for the rare diseases population. Five vignettes are used to illustrate how policy decisions provide the scaffolding for translation of new genomics knowledge and catalyze transformative change in the delivery of clinical services. The vignettes presented here are from an Australian perspective and are not intended to be comprehensive, but rather to provide insights into how a new and emerging 'precision public health' paradigm can improve the experiences of patients living with rare diseases, their caregivers and families. The conclusion is that genomic public health is informed by individual and family needs and by the population health imperative of an early and accurate diagnosis, which is the portal to best-practice care. Knowledge sharing is critical for public health policy development and for improving the lives of people living with rare diseases.
Development and evaluation of a biomedical search engine using a predicate-based vector space model.
Kwak, Myungjae; Leroy, Gondy; Martinez, Jesse D; Harwell, Jeffrey
2013-10-01
Although biomedical information available in articles and patents is increasing exponentially, we continue to rely on the same information retrieval methods and use very few keywords to search millions of documents. We are developing a fundamentally different approach for finding much more precise and complete information with a single query, using predicates instead of keywords for both query and document representation. Predicates are triples that are more complex data structures than keywords and contain more structured information. To make optimal use of them, we developed a new predicate-based vector space model and query-document similarity function with adjusted tf-idf and boost function. Using a test bed of 107,367 PubMed abstracts, we evaluated the first essential function: retrieving information. Cancer researchers provided 20 realistic queries, for which the top 15 abstracts were retrieved using a predicate-based (new) and a keyword-based (baseline) approach. Each abstract was evaluated, double-blind, by cancer researchers on a 0-5 point scale to calculate precision (0 versus higher) and relevance (0-5 score). Precision was significantly higher (p<.001) for the predicate-based (80%) than for the keyword-based (71%) approach. Relevance was almost doubled with the predicate-based approach: 2.1 versus 1.6 without rank order adjustment (p<.001) and 1.34 versus 0.98 with rank order adjustment (p<.001) for the predicate- versus keyword-based approach, respectively. Predicates can support more precise searching than keywords, laying the foundation for rich and sophisticated information search. Copyright © 2013 Elsevier Inc. All rights reserved.
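The core retrieval idea lends itself to a compact sketch. The following is a minimal illustration assuming plain tf-idf weighting over predicate triples and cosine similarity; it does not reproduce the paper's adjusted tf-idf or boost function, and all triples shown are invented examples.

```python
import math
from collections import Counter

def tfidf_vectors(docs):
    """Build tf-idf vectors over predicate triples.

    docs: list of documents, each a list of (subject, predicate, object)
    triples. Returns one sparse vector (dict keyed by triple) per doc."""
    n = len(docs)
    df = Counter(t for d in docs for t in set(d))  # document frequency per triple
    vectors = []
    for d in docs:
        tf = Counter(d)  # term frequency of each triple in this document
        vectors.append({t: (1 + math.log(c)) * math.log(n / df[t])
                        for t, c in tf.items()})
    return vectors

def cosine(u, v):
    """Cosine similarity between two sparse predicate vectors."""
    dot = sum(u[k] * v[k] for k in u.keys() & v.keys())
    nu = math.sqrt(sum(x * x for x in u.values()))
    nv = math.sqrt(sum(x * x for x in v.values()))
    return dot / (nu * nv) if nu and nv else 0.0
```

A query is represented the same way as a document, so ranking reduces to sorting documents by cosine similarity against the query vector.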
Integrating Programming Language and Operating System Information Security Mechanisms
2016-08-31
This grant aims to improve the precision of security enforcement and to provide greater assurance of information security. It focuses on two key projects: language-based control of authority, and formal guarantees for the correctness of audit information.
van Heeswijk, Miriam M; Lambregts, Doenja M J; Maas, Monique; Lahaye, Max J; Ayas, Z; Slenter, Jos M G M; Beets, Geerard L; Bakers, Frans C H; Beets-Tan, Regina G H
2017-06-01
The apparent diffusion coefficient (ADC) is a potential prognostic imaging marker in rectal cancer. Typically, mean ADC values are used, derived from precise manual whole-volume tumor delineations by experts. The aim was, first, to explore whether non-precise circular delineation combined with histogram analysis can be a less cumbersome alternative for acquiring similar ADC measurements and, second, to explore whether histogram analyses provide additional prognostic information. Thirty-seven patients who underwent a primary staging MRI including diffusion-weighted imaging (DWI; b0, 25, 50, 100, 500, 1000; 1.5 T) were included. Volumes of interest (VOIs) were drawn on the b1000-DWI: (a) precise delineation, manually tracing tumor boundaries (2 expert readers), and (b) non-precise delineation, drawing circular VOIs with a wide margin around the tumor (2 non-experts). Mean ADC and histogram metrics (mean, min, max, median, SD, skewness, kurtosis, 5th-95th percentiles) were derived from the VOIs, and delineation time was recorded. Measurements were compared between the two methods and correlated with prognostic outcome parameters. Median delineation time was reduced from 47-165 s (precise) to 21-43 s (non-precise). The 45th percentile of the non-precise delineation showed the best correlation with the mean ADC from the precise delineation as the reference standard (ICC 0.71-0.75). None of the mean ADC or histogram parameters showed significant prognostic value; only the total tumor volume (VOI) was significantly larger in patients with positive clinical N stage and mesorectal fascia involvement. When performing non-precise tumor delineation, histogram analysis (specifically the 45th ADC percentile) may be used as an alternative to obtain ADC values similar to those from precise whole-tumor delineation. Histogram analyses are not beneficial for obtaining additional prognostic information.
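Why a mid-range percentile of a wide, non-precise VOI can stand in for the precise-VOI mean is easy to see in a toy simulation: the generous circle sweeps in higher-ADC surrounding tissue at the top of the histogram, which a sub-median percentile largely ignores. The numbers below are invented for illustration, not patient data.

```python
import numpy as np

# Toy model: a "non-precise" circular VOI mixes low-ADC tumor voxels with
# higher-ADC surrounding tissue (values in units of ~1e-3 mm^2/s).
rng = np.random.default_rng(0)
tumor = rng.normal(1.0, 0.1, 500)      # voxels a precise delineation would keep
surround = rng.normal(2.5, 0.2, 500)   # extra voxels swept up by the wide circle
wide_voi = np.concatenate([tumor, surround])

mean_precise = tumor.mean()             # reference: mean ADC from the precise VOI
mean_wide = wide_voi.mean()             # naive mean of the wide VOI (biased high)
p45_wide = np.percentile(wide_voi, 45)  # histogram surrogate from the wide VOI
```

With this 50/50 mix, the naive wide-VOI mean sits far above the tumor mean, while the 45th percentile falls inside the tumor's part of the histogram.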
In praise of vagueness: malleability of vague information as a performance booster.
Mishra, Himanshu; Mishra, Arul; Shiv, Baba
2011-06-01
Is the eternal quest for precise information always worthwhile? Our research suggests that, at times, vagueness has its merits. Previous research has demonstrated that people prefer precise information over vague information because it gives them a sense of security and makes their environments more predictable. However, we show that the fuzzy boundaries afforded by vague information can actually help individuals perform better than can precise information. We document these findings across two laboratory studies and one quasi-field study that involved different performance-related contexts (mental acuity, physical strength, and weight loss). We argue that the malleability of vague information allows people to interpret it in the manner they desire, so that they can generate positive response expectancies and, thereby, perform better. The rigidity of precise information discourages desirable interpretations. Hence, on certain occasions, precise information is not as helpful as vague information in boosting performance.
Noisy metrology: a saturable lower bound on quantum Fisher information
NASA Astrophysics Data System (ADS)
Yousefjani, R.; Salimi, S.; Khorashad, A. S.
2017-06-01
In order to provide a guaranteed precision and a more accurate judgement about the true value of the Cramér-Rao bound and its scaling behavior, an upper bound on the precision of estimation (equivalently, a lower bound on the quantum Fisher information) is introduced. Unlike the bounds previously introduced in the literature, this upper bound is saturable and yields a practical instruction for estimating the parameter by preparing the optimal initial state and performing the optimal measurement. The bound is based on the underlying dynamics, and its calculation is straightforward, requiring only the matrix representation of the quantum maps responsible for encoding the parameter. This allows us to apply the bound to open quantum systems whose dynamics are described by either semigroup or non-semigroup maps. The reliability and efficiency of the method in predicting the ultimate precision limit are demonstrated by three main examples.
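For context, the standard quantum Cramér-Rao bound referred to above reads

```latex
\Delta\theta \;\ge\; \frac{1}{\sqrt{\nu\, F_Q(\theta)}}
```

where $\nu$ is the number of repetitions and $F_Q(\theta)$ is the quantum Fisher information. A saturable lower bound $F_Q(\theta) \ge \mathcal{F}(\theta)$ therefore guarantees that the optimal initial state and measurement achieve an error no worse than $\Delta\theta = 1/\sqrt{\nu\,\mathcal{F}(\theta)}$.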
Federal Register 2010, 2011, 2012, 2013, 2014
2012-09-12
... approval for a new information collection to conduct a pilot study to test the Elwha River Dam Removal and... important gap in research on indirect and nonuse values provided by habitat restoration. A study of the... losses associated with the reservoir. The ability to link results of the study to precise measures of...
USDA-ARS?s Scientific Manuscript database
The development of sensors that provide geospatial information on crop and soil conditions has been a primary success for precision agriculture. However, further developments are needed to integrate geospatial data into computer algorithms that spatially optimize crop production while considering po...
Multitemporal spectroscopy for crop stress detection using band selection methods
NASA Astrophysics Data System (ADS)
Mewes, Thorsten; Franke, Jonas; Menz, Gunter
2008-08-01
A fast and precise sensor-based identification of pathogen infestations in wheat stands is essential for the implementation of site-specific fungicide applications. Several works have shown possibilities and limitations for the detection of plant stress using spectral sensor data. Hyperspectral data provide the opportunity to collect spectral reflectance in contiguous bands over a broad range of the electromagnetic spectrum. Individual phenomena, like the light absorption of leaf pigments, can be examined in detail. Precise knowledge of stress-dependent shifts in certain spectral wavelengths provides great advantages in detecting fungal infections. This study focuses on band selection techniques for hyperspectral data to identify relevant and redundant information in spectra regarding the detection of plant stress caused by pathogens. In a laboratory experiment, five 1 m² boxes of wheat were measured multitemporally with an ASD FieldSpec® 3 FR spectroradiometer. Two stands were inoculated with Blumeria graminis - the pathogen causing powdery mildew - and one stand was used to simulate the effect of water deficiency. Two stands were kept healthy as controls. Daily measurements of spectral reflectance were taken over a 14-day period. Three ASD Pro Lamps were used to illuminate the plots with constant light. By applying band selection techniques, the three different types of wheat vitality could be accurately differentiated at certain stages. Hyperspectral data can provide precise information about pathogen infestations, and reducing the spectral dimension of sensor data by means of band selection procedures is an appropriate way to speed up the data supply for precision agriculture.
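One common family of band selection criteria can be sketched as a per-band Fisher discriminant score: between-class variance of the band means over mean within-class variance. This is a generic illustration rather than the specific selection procedure used in the study, and the spectra in the example are invented.

```python
import numpy as np

def fisher_band_scores(spectra, labels):
    """Fisher discriminant score per spectral band.

    spectra: samples x bands reflectance matrix; labels: class per sample
    (e.g. healthy / mildew-inoculated / water-stressed). Higher score =
    band is more informative for separating the stress classes."""
    X = np.asarray(spectra, dtype=float)
    y = np.asarray(labels)
    classes = np.unique(y)
    means = np.array([X[y == c].mean(axis=0) for c in classes])
    within = np.array([X[y == c].var(axis=0) for c in classes]).mean(axis=0)
    between = means.var(axis=0)
    return between / (within + 1e-12)  # epsilon guards constant bands
```

High-scoring bands are retained; strongly correlated high-scoring neighbours can then be pruned to discard redundant information.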
Slater, Jim; Shields, Laura; Racette, Ray J; Juzwishin, Donald; Coppes, Max
2015-11-01
In the era of personalized and precision medicine, the approach to healthcare is quickly changing. Genetic and other molecular information are being increasingly demanded by clinicians and expected by patients for prevention, screening, diagnosis, prognosis, health promotion, and treatment of an increasing number of conditions. As a result of these developments, Canadian health leaders must understand and be prepared to lead the necessary changes associated with these disruptive technologies. This article focuses on precision therapeutics but also provides background on the concepts and terminology related to personalized and precision medicine and explores Canadian health leadership and system issues that may pose barriers to their implementation. The article is intended to inspire, educate, and mobilize Canadian health leaders to initiate dialogue around the transformative changes necessary to ready the healthcare system to realize the benefits of precision therapeutics. © 2015 Collège canadien des leaders en santé
Using confidence intervals to evaluate the focus alignment of spectrograph detector arrays.
Sawyer, Travis W; Hawkins, Kyle S; Damento, Michael
2017-06-20
High-resolution spectrographs extract detailed spectral information of a sample and are frequently used in astronomy, laser-induced breakdown spectroscopy, and Raman spectroscopy. These instruments employ dispersive elements such as prisms and diffraction gratings to spatially separate different wavelengths of light, which are then detected by a charge-coupled device (CCD) or complementary metal-oxide-semiconductor (CMOS) detector array. Precise alignment along the optical axis (focus position) of the detector array is critical to maximize the instrumental resolution; however, traditional approaches of scanning the detector through focus lack a quantitative measure of precision, limiting the repeatability and relying on one's experience. Here we propose a method to evaluate the focus alignment of spectrograph detector arrays by establishing confidence intervals to measure the alignment precision. We show that propagation of uncertainty can be used to estimate the variance in an alignment, thus providing a quantitative and repeatable means to evaluate the precision and confidence of an alignment. We test the approach by aligning the detector array of a prototype miniature echelle spectrograph. The results indicate that the procedure effectively quantifies alignment precision, enabling one to objectively determine when an alignment has reached an acceptable level. This quantitative approach also provides a foundation for further optimization, including automated alignment. Furthermore, the procedure introduced here can be extended to other alignment techniques that rely on numerically fitting data to a model, providing a general framework for evaluating the precision of alignment methods.
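The idea of turning a through-focus scan into a confidence interval can be sketched as follows: fit a parabola to a focus metric (for instance, measured line width versus detector position), take the vertex as best focus, and propagate the fit covariance to the vertex position. This is a generic illustration of the propagation-of-uncertainty step, not the authors' exact procedure, and the scan data are simulated.

```python
import numpy as np

def focus_with_ci(z, metric, k=1.96):
    """Fit metric(z) = a z^2 + b z + c and return the best-focus position
    z* = -b/(2a) with an approximate 95% half-width k*sigma(z*).
    sigma(z*) comes from first-order propagation of the fit covariance
    through the gradient of z* with respect to (a, b, c)."""
    (a, b, c), cov = np.polyfit(z, metric, 2, cov=True)
    z_star = -b / (2.0 * a)
    grad = np.array([b / (2.0 * a**2), -1.0 / (2.0 * a), 0.0])
    sigma = float(np.sqrt(grad @ cov @ grad))
    return z_star, k * sigma

# Simulated through-focus scan: true best focus at z = 0.10 mm.
rng = np.random.default_rng(1)
z = np.linspace(-1.0, 1.0, 15)
width = 2.0 * (z - 0.10) ** 2 + 0.50 + rng.normal(0.0, 0.01, z.size)
z_best, half_width = focus_with_ci(z, width)
```

The half-width gives the quantitative stopping criterion the abstract describes: alignment can be declared acceptable once the interval shrinks below the instrument's depth-of-focus tolerance.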
Integrative methods for analyzing big data in precision medicine.
Gligorijević, Vladimir; Malod-Dognin, Noël; Pržulj, Nataša
2016-03-01
We provide an overview of recent developments in big data analyses in the context of precision medicine and health informatics. With the advance in technologies capturing molecular and medical data, we have entered the era of "Big Data" in biology and medicine. These data offer many opportunities to advance precision medicine. We outline key challenges in precision medicine and present recent advances in data integration-based methods to uncover personalized information from the big data produced by various omics studies. We survey recent integrative methods for disease subtyping, biomarker discovery, and drug repurposing, and list the tools that are available to domain scientists. Given the ever-growing nature of these big data, we highlight key issues that big data integration methods will face. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Area under precision-recall curves for weighted and unweighted data.
Keilwagen, Jens; Grosse, Ivo; Grau, Jan
2014-01-01
Precision-recall curves are highly informative about the performance of binary classifiers, and the area under these curves is a popular scalar performance measure for comparing different classifiers. However, for many applications class labels are not provided with absolute certainty, but with some degree of confidence, often reflected by weights or soft labels assigned to data points. Computing the area under the precision-recall curve requires interpolating between adjacent supporting points, but previous interpolation schemes are not directly applicable to weighted data. Hence, even in cases where weights were available, they had to be neglected for assessing classifiers using precision-recall curves. Here, we propose an interpolation for precision-recall curves that can also be used for weighted data, and we derive conditions for classification scores yielding the maximum and minimum area under the precision-recall curve. We investigate accordances and differences of the proposed interpolation and previous ones, and we demonstrate that taking into account existing weights of test data is important for the comparison of classifiers.
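The weighted setting can be sketched as follows: precision and recall are computed from cumulative weight sums rather than raw counts. Plain trapezoidal integration stands in here for the paper's dedicated interpolation scheme, so this is a simplified sketch, not the proposed method.

```python
import numpy as np

def weighted_pr_auc(scores, labels, weights):
    """Area under the precision-recall curve for weighted samples.

    Precision and recall come from cumulative weight sums over samples
    sorted by decreasing score; the area uses trapezoidal integration
    over recall."""
    order = np.argsort(-np.asarray(scores, dtype=float))
    y = np.asarray(labels, dtype=float)[order]
    w = np.asarray(weights, dtype=float)[order]

    tp = np.cumsum(w * y)          # cumulative weighted true positives
    fp = np.cumsum(w * (1.0 - y))  # cumulative weighted false positives
    total_pos = (w * y).sum()

    prec = tp / (tp + fp)
    rec = tp / total_pos
    prec = np.concatenate([[prec[0]], prec])  # anchor at recall = 0
    rec = np.concatenate([[0.0], rec])
    return float(np.sum(np.diff(rec) * (prec[1:] + prec[:-1]) / 2.0))
```

Note that duplicating a data point and doubling its weight yield the same cumulative sums but not, in general, the same trapezoidal area, which is exactly the interpolation subtlety the paper addresses.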
NASA Technical Reports Server (NTRS)
Knox, Charles E.
1993-01-01
A piloted simulation study was conducted to examine the requirements for using electromechanical flight instrumentation to provide situation information and flight guidance for manually controlled flight along curved precision approach paths to a landing. Six pilots were used as test subjects. The data from these tests indicated that flight director guidance is required for the manually controlled flight of a jet transport airplane on curved approach paths. Acceptable path tracking performance was attained with each of the three situation information algorithms tested. Approach paths with both multiple sequential turns and short final path segments were evaluated. Pilot comments indicated that all the approach paths tested could be used in normal airline operations.
Molecular Pathology: A Requirement for Precision Medicine in Cancer.
Dietel, Manfred
2016-01-01
The increasing importance of targeting drugs and check-point inhibitors in the treatment of several tumor entities (breast, colon, lung, malignant melanoma, lymphoma, etc.) and the necessity of a companion diagnostic (HER2, (pan)RAS, EGFR, ALK, BRAF, ROS1, MET, PD-L1, etc.) is leading to new challenges for surgical pathology. Since almost all the biomarkers to be specifically detected are tissue based, a precise and reliable diagnostic is absolutely crucial. To meet this challenge surgical pathology has adapted a number of molecular methods (semi-quantitative immunohistochemistry, fluorescence in situ hybridization, PCR and its multiple variants, (pyro/Sanger) sequencing, next generation sequencing (amplicon, whole exome, whole genome), DNA arrays, methylation analyses, etc.) to be applicable for formalin-fixed paraffin-embedded tissue. Reading a patient's tissue as 'deeply' as possible and obtaining information on the morphological, genetic, proteomic and epigenetic background are the tasks of pathologists and molecular biologists and provide the clinicians with information relevant for precision medicine. Intensified cooperation between clinicians and pathologists will provide the basis of improved clinical drug selection and guide development of new cancer gene therapies and molecularly targeted drugs by research units and the pharmaceutical industry. © 2016 S. Karger GmbH, Freiburg.
Faggion, Clovis Mariano; Aranda, Luisiana; Diaz, Karla Tatiana; Shih, Ming-Chieh; Tu, Yu-Kang; Alarcón, Marco Antonio
2016-01-01
Information on precision of treatment-effect estimates is pivotal for understanding research findings. In animal experiments, which provide important information for supporting clinical trials in implant dentistry, inaccurate information may lead to biased clinical trials. The aim of this methodological study was to determine whether sample size calculation, standard errors, and confidence intervals for treatment-effect estimates are reported accurately in publications describing animal experiments in implant dentistry. MEDLINE (via PubMed), Scopus, and SciELO databases were searched to identify reports involving animal experiments with dental implants published from September 2010 to March 2015. Data from publications were extracted into a standardized form with nine items related to precision of treatment estimates and experiment characteristics. Data selection and extraction were performed independently and in duplicate, with disagreements resolved by discussion-based consensus. The chi-square and Fisher exact tests were used to assess differences in reporting according to study sponsorship type and impact factor of the journal of publication. The sample comprised reports of 161 animal experiments. Sample size calculation was reported in five (2%) publications. P values and confidence intervals were reported in 152 (94%) and 13 (8%) of these publications, respectively. Standard errors were reported in 19 (12%) publications. Confidence intervals were better reported in publications describing industry-supported animal experiments (P = .03) and with a higher impact factor (P = .02). Information on precision of estimates is rarely reported in publications describing animal experiments in implant dentistry. This lack of information makes it difficult to evaluate whether the translation of animal research findings to clinical trials is adequate.
Isotopic Ratios of Samarium by TIMS for Nuclear Forensic Application
DOE Office of Scientific and Technical Information (OSTI.GOV)
Louis Jean, James; Inglis, Jeremy David
The isotopic ratios of Nd, Sm, and Gd can provide important information regarding fissile material (nuclear devices, reactors), the neutron environment, and device yield. These studies require precise measurement of Sm isotope ratios by either TIMS or MC-ICP-MS. There has been an increasing trend toward measuring smaller and smaller quantities of Sm-bearing samples; in nuclear forensics, 10-100 ng of Sm are needed for precise measurement. The goal of this work is to measure sub-ng Sm samples using TIMS for nuclear forensic analysis.
NASA Technical Reports Server (NTRS)
Schmidlin, F. J.; Michel, W. R.
1985-01-01
Analysis of inflatable sphere measurements obtained during the Energy Budget and MAP/WINE campaigns led to questions concerning the precision of the MPS-36 radar used for tracking the spheres; the compatibility of the sphere program with the MPS-36 radar tracking data; and the oversmoothing of derived parameters at high altitudes. Simulations, with winds having sinusoidal vertical wavelengths, were done with the sphere program (HIROBIN) to determine the resolving capability of various filters. It is concluded that given a precision radar and a perfectly performing sphere, the HIROBIN filters can be adjusted to provide small-scale perturbation information to 70 km (i.e., sinusoidal wavelengths of 2 km). It is recommended that the HIROBIN program be modified to enable it to use a variable length filter, that adjusts to fall velocity and accelerations to provide wind data with small perturbations.
Ultrathin conformal devices for precise and continuous thermal characterization of human skin
Webb, R. Chad; Bonifas, Andrew P.; Behnaz, Alex; Zhang, Yihui; Yu, Ki Jun; Cheng, Huanyu; Shi, Mingxing; Bian, Zuguang; Liu, Zhuangjian; Kim, Yun-Soung; Yeo, Woon-Hong; Park, Jae Suk; Song, Jizhou; Li, Yuhang; Huang, Yonggang; Gorbach, Alexander M.; Rogers, John A.
2013-01-01
Precision thermometry of the skin can, together with other measurements, provide clinically relevant information about cardiovascular health, cognitive state, malignancy and many other important aspects of human physiology. Here, we introduce an ultrathin, compliant skin-like sensor/actuator technology that can pliably laminate onto the epidermis to provide continuous, accurate thermal characterizations that are unavailable with other methods. Examples include non-invasive spatial mapping of skin temperature with millikelvin precision, and simultaneous quantitative assessment of tissue thermal conductivity. Such devices can also be implemented in ways that reveal the time-dynamic influence of blood flow and perfusion on these properties. Experimental and theoretical studies establish the underlying principles of operation, and define engineering guidelines for device design. Evaluation of subtle variations in skin temperature associated with mental activity, physical stimulation and vasoconstriction/dilation along with accurate determination of skin hydration through measurements of thermal conductivity represent some important operational examples. PMID:24037122
Automation of Precise Time Reference Stations (PTRS)
NASA Astrophysics Data System (ADS)
Wheeler, P. J.
1985-04-01
The U.S. Naval Observatory is presently engaged in a program of automating precise time stations (PTS) and precise time reference stations (PTRS) by using a versatile minicomputer-controlled data acquisition system (DAS). The data acquisition system is configured to monitor locally available PTTI signals such as LORAN-C, OMEGA, and/or the Global Positioning System. In addition, the DAS performs local standard intercomparison. Computer telephone communications provide automatic data transfer to the Naval Observatory. Subsequently, after analysis of the data, results and information can be sent back to the precise time reference station to provide automatic control of remote station timing. The DAS configuration is designed around state-of-the-art, high-reliability standard industrial modules. The system integration and software are standardized but allow considerable flexibility to satisfy special local requirements such as stability measurements, performance evaluation, and the printing of messages and certificates. The DAS operates completely independently and may be queried or controlled at any time with a computer or terminal device (control is protected for use by authorized personnel only). Such DAS-equipped PTS are operational in Hawaii, California, Texas and Florida.
A novel imaging method for photonic crystal fiber fusion splicer
NASA Astrophysics Data System (ADS)
Bi, Weihong; Fu, Guangwei; Guo, Xuan
2007-01-01
Because the structure of photonic crystal fiber (PCF) is very complex, it is difficult for a traditional fiber fusion splicer to obtain the optical axial information of the PCF; a new optical imaging method is therefore needed to acquire cross-section information of the fiber. Based on the complex character of PCF, a novel high-precision optical imaging system is presented in this article. The system uses a thinned electron-bombarded CCD (EBCCD) image sensor as the imaging element. The thinned EBCCD offers low-light-level performance superior to conventional intensifier-coupled CCD approaches; this high-performance device can provide high contrast and high resolution in low-light-level surveillance imaging. To achieve precise focusing of the image, an ultra-high-precision step motor is used to adjust the position of the imaging lens. In this way, clear cross-section information of the PCF can be obtained, and further analysis can be carried out by digital image processing technology. This cross-section information may be used to distinguish different sorts of PCF, to compute parameters such as the size of the PCF air holes and the cladding structure, and to provide the necessary analysis data for PCF fixation, adjustment, regulation, fusion and cutting systems.
Highly Siderophile Elements in Shocked and Unshocked Chondrites
NASA Technical Reports Server (NTRS)
Horan, M. F.; Walker, R. J.; Rubin, A. E.
2001-01-01
High-precision abundances of Re, Os, Pt, Ir, Ru, and Pd are combined with Re-Os isotopic data to demonstrate that the HSE provide a distinctive fingerprint for each of the chondrite groups. Additional information is contained in the original extended abstract.
ERIC Educational Resources Information Center
Dwight, Ernest
2008-01-01
Campus signage provides important navigational information, and it reflects an institution's image. Creating a signage system using strategically planned design methodology, precision fabrication, plus well-planned placement, enhances a campus environment. This helps create a positive experience for visitors and prospective students. An effective…
Precision Medicine: From Science To Value.
Ginsburg, Geoffrey S; Phillips, Kathryn A
2018-05-01
Precision medicine is making an impact on patients, health care delivery systems, and research participants in ways that were only imagined fifteen years ago when the human genome was first sequenced. Discovery of disease-causing and drug-response genetic variants has accelerated, while adoption into clinical medicine has lagged. We define precision medicine and the stakeholder community required to enable its integration into research and health care. We explore the intersection of data science, analytics, and precision medicine in the formation of health systems that carry out research in the context of clinical care and that optimize the tools and information used to deliver improved patient outcomes. We provide examples of real-world impact and conclude with a policy and economic agenda necessary for the adoption of this new paradigm of health care both in the United States and globally.
Nursing implications of personalized and precision medicine.
Vorderstrasse, Allison A; Hammer, Marilyn J; Dungan, Jennifer R
2014-05-01
Objectives: To identify and discuss the nursing implications of personalized and precision oncology care. Data sources: PubMed, CINAHL. The implications in personalized and precision cancer nursing care include interpretation and clinical use of novel and personalized information, including genetic testing; patient advocacy and support throughout testing, anticipation of results, and treatment; ongoing chronic monitoring; and support for patient decision-making. Attention must also be given to the family and ethical implications of a personalized approach to care. Nurses face increasing challenges and opportunities in communication, support, and advocacy for patients given the availability of advanced testing, care, and treatment in personalized and precision medicine. Nursing education and continuing education, clinical decision support, and health systems changes will be necessary to provide personalized multidisciplinary care to patients, in which nurses play a key role. Copyright © 2014 Elsevier Inc. All rights reserved.
Viewing geometry determines the contribution of binocular vision to the online control of grasping.
Keefe, Bruce D; Watt, Simon J
2017-12-01
Binocular vision is often assumed to make a specific, critical contribution to online visual control of grasping by providing precise information about the separation between digits and object. This account overlooks the 'viewing geometry' typically encountered in grasping, however. Separation of hand and object is rarely aligned precisely with the line of sight (the visual depth dimension), and analysis of the raw signals suggests that, for most other viewing angles, binocular feedback is less precise than monocular feedback. Thus, online grasp control relying selectively on binocular feedback would not be robust to natural changes in viewing geometry. Alternatively, sensory integration theory suggests that different signals contribute according to their relative precision, in which case the role of binocular feedback should depend on viewing geometry, rather than being 'hard-wired'. We manipulated viewing geometry, and assessed the role of binocular feedback by measuring the effects on grasping of occluding one eye at movement onset. Loss of binocular feedback resulted in a significantly less extended final slow-movement phase when hand and object were separated primarily in the frontoparallel plane (where binocular information is relatively imprecise), compared to when they were separated primarily along the line of sight (where binocular information is relatively precise). Consistent with sensory integration theory, this suggests the role of binocular (and monocular) vision in online grasp control is not a fixed, 'architectural' property of the visuo-motor system, but arises instead from the interaction of viewer and situation, allowing robust online control across natural variations in viewing geometry.
Eighth International Workshop on Laser Ranging Instrumentation
NASA Technical Reports Server (NTRS)
Degnan, John J. (Compiler)
1993-01-01
The Eighth International Workshop on Laser Ranging Instrumentation was held in Annapolis, Maryland, in May 1992, and was sponsored by the NASA Goddard Space Flight Center in Greenbelt, Maryland. The workshop is held once every 2 to 3 years under differing institutional sponsorship and provides a forum for participants to exchange information on the latest developments in satellite and lunar laser ranging hardware, software, science applications, and data analysis techniques. The satellite laser ranging (SLR) technique provides sub-centimeter-precision range measurements to artificial satellites and the Moon. The data have applications to a wide range of Earth and lunar science issues, including precise orbit determination, terrestrial reference frames, geodesy, geodynamics, oceanography, time transfer, lunar dynamics, gravity, and relativity.
Applications of RNA Indexes for Precision Oncology in Breast Cancer.
Ma, Liming; Liang, Zirui; Zhou, Hui; Qu, Lianghu
2018-05-09
Precision oncology aims to offer the most appropriate treatments to cancer patients mainly based on their individual genetic information. Genomics has provided numerous valuable data on driver mutations and risk loci; however, it remains a formidable challenge to transform these data into therapeutic agents. Transcriptomics describes the multifarious expression patterns of both mRNAs and non-coding RNAs (ncRNAs), which facilitates the deciphering of genomic codes. In this review, we take breast cancer as an example to demonstrate the applications of these rich RNA resources in precision medicine exploration. These include the use of mRNA profiles in triple-negative breast cancer (TNBC) subtyping to inform corresponding candidate targeted therapies; current advancements and achievements of high-throughput RNA interference (RNAi) screening technologies in breast cancer; and microRNAs as functional signatures for defining cell identities and regulating the biological activities of breast cancer cells. We summarize the benefits of transcriptomic analyses in breast cancer management and propose that unscrambling the core signaling networks of cancer may be an important task of multiple-omic data integration for precision oncology. Copyright © 2018 The Authors. Production and hosting by Elsevier B.V. All rights reserved.
Future paradigms for precision oncology
Klement, Giannoula Lakka; Arkun, Knarik; Valik, Dalibor; Roffidal, Tina; Hashemi, Ali; Klement, Christos; Carmassi, Paolo; Rietman, Edward; Slaby, Ondrej; Mazanek, Pavel; Mudry, Peter; Kovacs, Gabor; Kiss, Csongor; Norga, Koen; Konstantinov, Dobrin; André, Nicolas; Slavc, Irene; van Den Berg, Henk; Kolenova, Alexandra; Kren, Leos; Tuma, Jiri; Skotakova, Jarmila; Sterba, Jaroslav
2016-01-01
Research has exposed cancer to be a heterogeneous disease with a high degree of inter-tumoral and intra-tumoral variability. Individual tumors have unique profiles, and these molecular signatures make the use of traditional histology-based treatments problematic. The conventional diagnostic categories, while necessary for care, thwart the use of molecular information for treatment, as molecular characteristics cross tissue types. This is compounded by the struggle to keep abreast of the scientific advances made in all fields of science, and by the enormous challenge of organizing, cross-referencing, and applying molecular data for patient benefit. In order to supplement the site-specific, histology-driven diagnosis with genomic, proteomic, and metabolomic information, a paradigm shift in the diagnosis and treatment of patients is required. While most physicians are open and keen to use the emerging data for therapy, even those versed in molecular therapeutics are overwhelmed by the amount of available data. It is not surprising that even though The Human Genome Project was completed thirteen years ago, our patients have not benefited from the information. Physicians cannot, and should not be asked to, process the gigabytes of genomic and proteomic information on their own in order to provide patients with safe therapies. The following consensus summary identifies the need for practice changes, proposes potential solutions to the present crisis of informational overload, suggests ways of providing physicians with the tools necessary for interpreting patient-specific molecular profiles, and facilitates the implementation of quantitative precision medicine. It also provides two case studies where this approach has been used. PMID:27223079
Automated Text Markup for Information Retrieval from an Electronic Textbook of Infectious Disease
Berrios, Daniel C.; Kehler, Andrew; Kim, David K.; Yu, Victor L.; Fagan, Lawrence M.
1998-01-01
The information needs of practicing clinicians frequently require textbook or journal searches. Making these sources available in electronic form improves the speed of these searches, but precision (i.e., the fraction of relevant to total documents retrieved) remains low. Improving the traditional keyword search by transforming search terms into canonical concepts does not greatly improve search precision. Kim et al. have designed and built a prototype system (MYCIN II) for computer-based information retrieval from a forthcoming electronic textbook of infectious disease. The system requires manual indexing by experts in the form of complex text markup. However, this markup process is time-consuming (about 3 person-hours to generate, review, and transcribe the index for each of 218 chapters). We have designed and implemented a system to semiautomate the markup process. The system, information extraction for semiautomated indexing of documents (ISAID), uses query models and existing information-extraction tools to provide support for any user, including the author of the source material, to mark up tertiary information sources quickly and accurately.
DOT National Transportation Integrated Search
2017-03-24
The Pikalert System provides high precision road weather guidance. It assesses current weather and road conditions based on observations from connected vehicles, road weather information stations, radar, and weather model analysis fields. It also for...
Automated survey of pavement distress based on 2D and 3D laser images.
DOT National Transportation Integrated Search
2011-11-01
Despite numerous efforts in recent decades, currently most information on pavement surface distresses cannot be obtained automatically, at high-speed, and at acceptable precision and bias levels. This research provided seed funding to produce a funct...
Spatial interpolation quality assessments for soil sensor transect datasets
USDA-ARS?s Scientific Manuscript database
Near-ground geophysical soil sensors provide extremely valuable information for precision agriculture applications. Indeed, their readings can be used as proxy for many soil parameters. Typically, leave-one-out (loo) cross-validation (CV) of spatial interpolation of sensor data returns overly optimi...
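The leave-one-out (LOO) cross-validation mentioned above can be sketched as follows; this is an illustrative example using simple inverse-distance weighting on a one-dimensional sensor transect, not the manuscript's actual method, and all function names and data are hypothetical. With densely spaced, spatially autocorrelated readings, each held-out point has very close neighbours, which is one reason LOO scores can look overly optimistic.

```python
import numpy as np

def idw_interpolate(x_known, y_known, x_query, power=2.0):
    """Inverse-distance-weighted estimate at x_query from known (x, y) pairs."""
    d = np.abs(x_known - x_query)
    if np.any(d == 0):                      # exact hit: return the known value
        return y_known[np.argmin(d)]
    w = 1.0 / d**power
    return np.sum(w * y_known) / np.sum(w)

def loo_cv_rmse(x, y):
    """Leave-one-out CV: predict each sensor reading from all the others."""
    errors = []
    for i in range(len(x)):
        mask = np.arange(len(x)) != i       # hold out reading i
        pred = idw_interpolate(x[mask], y[mask], x[i])
        errors.append(pred - y[i])
    return float(np.sqrt(np.mean(np.square(errors))))

# Dense transect: smooth signal plus a little sensor noise (seeded for repeatability)
x = np.linspace(0.0, 100.0, 201)            # sensor positions along the transect
y = np.sin(x / 10.0) + 0.05 * np.random.default_rng(0).normal(size=x.size)
print(round(loo_cv_rmse(x, y), 4))          # small RMSE: neighbours are very close
```

Because the held-out point is surrounded by near-identical neighbours, this RMSE understates the error one would see when interpolating far from any sensor, which is the quality-assessment issue the abstract raises.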
Greenspan, Bennett S
2017-12-01
This article discusses the role of PET/CT in contributing to precision medicine in lung cancer, and provides the perspective of the Society of Nuclear Medicine and Molecular Imaging (SNMMI) on this process. The mission and vision of SNMMI are listed, along with the guidance provided by SNMMI to promote best practice in precision medicine. Basic principles of PET/CT are presented. An overview of the use of PET/CT imaging in lung cancer is discussed. In lung cancer patients, PET/CT is vitally important for optimal patient management. PET/CT is essential in determining staging and re-staging of disease, detecting recurrent or residual disease, evaluating response to therapy, and providing prognostic information. PET/CT is also critically important in radiation therapy planning by determining the extent of active disease, including an assessment of functional tumor volume. The current approach in tumor imaging is a significant advance over conventional imaging. However, recent advances suggest that therapeutic response criteria in the near future will be based on metabolic characteristics and will include the evaluation of biologic characteristics of tumors to further enhance the effectiveness of precision medicine in lung cancer, producing improved patient outcomes with less morbidity.
Proceedings of the 8th Precise Time and Time Interval (PTTI) Applications and Planning Meeting
NASA Technical Reports Server (NTRS)
1977-01-01
The Proceedings contain the papers presented at the Eighth Annual Precise Time and Time Interval (PTTI) Applications and Planning Meeting. The edited record of the discussions following the papers and the panel discussions is also included. This meeting provided a forum for the exchange of information on precise time and frequency technology among members of the scientific community and persons with program applications. The 282 registered attendees came from various U.S. Government agencies, private industry, and universities, and a number of foreign countries were represented. In this meeting, papers were presented that emphasized: (1) definitions and international regulations of precise time sources and users, (2) the scientific foundations of hydrogen maser standards, the current developments in this field, and the application experience, and (3) how to measure the stability performance properties of precise standards. As in the previous meetings, updated and new papers were presented on system applications with past, present, and future requirements identified.
Metabolomics enables precision medicine: "A White Paper, Community Perspective".
Beger, Richard D; Dunn, Warwick; Schmidt, Michael A; Gross, Steven S; Kirwan, Jennifer A; Cascante, Marta; Brennan, Lorraine; Wishart, David S; Oresic, Matej; Hankemeier, Thomas; Broadhurst, David I; Lane, Andrew N; Suhre, Karsten; Kastenmüller, Gabi; Sumner, Susan J; Thiele, Ines; Fiehn, Oliver; Kaddurah-Daouk, Rima
Metabolomics is the comprehensive study of the metabolome, the repertoire of biochemicals (or small molecules) present in cells, tissues, and body fluids. The study of metabolism at the global or "-omics" level is a rapidly growing field that has the potential to have a profound impact upon medical practice. At the center of metabolomics, is the concept that a person's metabolic state provides a close representation of that individual's overall health status. This metabolic state reflects what has been encoded by the genome, and modified by diet, environmental factors, and the gut microbiome. The metabolic profile provides a quantifiable readout of biochemical state from normal physiology to diverse pathophysiologies in a manner that is often not obvious from gene expression analyses. Today, clinicians capture only a very small part of the information contained in the metabolome, as they routinely measure only a narrow set of blood chemistry analytes to assess health and disease states. Examples include measuring glucose to monitor diabetes, measuring cholesterol and high density lipoprotein/low density lipoprotein ratio to assess cardiovascular health, BUN and creatinine for renal disorders, and measuring a panel of metabolites to diagnose potential inborn errors of metabolism in neonates. We anticipate that the narrow range of chemical analyses in current use by the medical community today will be replaced in the future by analyses that reveal a far more comprehensive metabolic signature. This signature is expected to describe global biochemical aberrations that reflect patterns of variance in states of wellness, more accurately describe specific diseases and their progression, and greatly aid in differential diagnosis. 
Such future metabolic signatures will: (1) provide predictive, prognostic, diagnostic, and surrogate markers of diverse disease states; (2) inform on underlying molecular mechanisms of diseases; (3) allow for sub-classification of diseases, and stratification of patients based on the metabolic pathways impacted; (4) reveal biomarkers for drug response phenotypes, providing an effective means to predict variation in a subject's response to treatment (pharmacometabolomics); (5) define a metabotype for each specific genotype, offering a functional read-out for genetic variants; (6) provide a means to monitor response and recurrence of diseases, such as cancers; (7) describe the molecular landscape in human performance applications and extreme environments. Importantly, sophisticated metabolomic analytical platforms and informatics tools have recently been developed that make it possible to measure thousands of metabolites in blood, other body fluids, and tissues. Such tools also enable more robust analysis of response to treatment. New insights have been gained about mechanisms of diseases, including neuropsychiatric disorders, cardiovascular disease, cancers, diabetes, and a range of pathologies. A series of groundbreaking studies supported by the National Institutes of Health (NIH) through the Pharmacometabolomics Research Network and its partnership with the Pharmacogenomics Research Network illustrate how a patient's metabotype at baseline, prior to treatment, during treatment, and post-treatment can inform about treatment outcomes and variations in responsiveness to drugs (e.g., statins, antidepressants, antihypertensives, and antiplatelet therapies). These studies, along with several others, also exemplify how metabolomics data can complement and inform genetic data in defining the ethnic, sex, and gender bases for variation in responses to treatment, which illustrates how pharmacometabolomics and pharmacogenomics are complementary and powerful tools for precision medicine.
Our metabolomics community believes that inclusion of metabolomics data in precision medicine initiatives is timely and will provide an extremely valuable layer of data that complements and informs other data obtained by these important initiatives. Our Metabolomics Society, through its "Precision Medicine and Pharmacometabolomics Task Group", with input from our metabolomics community at large, has developed this White Paper, in which we discuss the value of and approaches for including metabolomics data in large precision medicine initiatives. This White Paper offers recommendations for the selection of state-of-the-art metabolomics platforms and approaches that offer the widest biochemical coverage, and considers critical sample collection and preservation, as well as standardization of measurements, among other important topics. We anticipate that our metabolomics community will have representation in large precision medicine initiatives to provide input with regard to sample acquisition/preservation, selection of optimal omics technologies, and key issues regarding data collection, interpretation, and dissemination. We strongly recommend the collection and biobanking of samples for precision medicine initiatives that will take into consideration the needs of large-scale metabolic phenotyping studies.
Development of a UAV system for VNIR-TIR acquisitions in precision agriculture
NASA Astrophysics Data System (ADS)
Misopolinos, L.; Zalidis, Ch.; Liakopoulos, V.; Stavridou, D.; Katsigiannis, P.; Alexandridis, T. K.; Zalidis, G.
2015-06-01
Adoption of precision agriculture techniques requires the development of specialized tools that provide spatially distributed information. Both flying platforms and airborne sensors are continuously evolving to cover the needs of plant and soil sensing at affordable costs. Due to restrictions in payload, flying platforms are usually limited to carrying a single sensor on board. The aim of this work is to present the development of a vertical take-off and landing autonomous unmanned aerial vehicle (VTOL UAV) system for the simultaneous acquisition of high-resolution vertical images at the visible, near-infrared (VNIR) and thermal infrared (TIR) wavelengths. A system was developed that has the ability to trigger two cameras simultaneously with a fully automated process and no pilot intervention. A commercial hexacopter UAV platform was optimized to increase reliability, ease of operation, and automation. The designed system's communication platform is based on a reduced instruction set computing (RISC) processor running Linux OS with custom-developed drivers, in an efficient way that keeps cost and weight to a minimum. Special software was also developed for automated image capture, data processing, and on-board data and metadata storage. The system was tested over a kiwifruit field in northern Greece, at flying heights of 70 and 100 m above the ground. The acquired images were mosaicked and geo-corrected. Images from both flying heights were of good quality and revealed unprecedented detail within the field. The normalized difference vegetation index (NDVI) was calculated along with the thermal image in order to provide information on the accurate location of stressors and other parameters related to crop productivity. Compared to other available sources of data, this system can provide low-cost, high-resolution, and easily repeatable information to cover the requirements of precision agriculture.
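The NDVI mentioned above is a standard per-pixel ratio index computed from the near-infrared and red reflectance bands, NDVI = (NIR - Red) / (NIR + Red). A minimal sketch (the reflectance values are made up for illustration; healthy vegetation typically has high NIR and low red reflectance, giving NDVI near 1):

```python
import numpy as np

def ndvi(nir, red, eps=1e-9):
    """Normalized Difference Vegetation Index: (NIR - Red) / (NIR + Red).

    eps guards against division by zero on dark pixels.
    """
    nir = np.asarray(nir, dtype=float)
    red = np.asarray(red, dtype=float)
    return (nir - red) / (nir + red + eps)

# Toy 2x2 reflectance patch: top row vigorous canopy, bottom row stressed/bare
nir = np.array([[0.60, 0.55], [0.30, 0.10]])
red = np.array([[0.10, 0.12], [0.25, 0.09]])
print(np.round(ndvi(nir, red), 2))
# -> [[0.71 0.64]
#     [0.09 0.05]]
```

The same elementwise formula applies directly to full mosaicked band rasters, which is how an NDVI map of a field would be produced.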
Iconic Memories Die a Sudden Death.
Pratte, Michael S
2018-06-01
Iconic memory is characterized by its large storage capacity and brief storage duration, whereas visual working memory is characterized by its small storage capacity. The limited information stored in working memory is often modeled as an all-or-none process in which studied information is either successfully stored or lost completely. This view raises a simple question: If almost all viewed information is stored in iconic memory, yet one second later most of it is completely absent from working memory, what happened to it? Here, I characterized how the precision and capacity of iconic memory changed over time and observed a clear dissociation: Iconic memory suffered from a complete loss of visual items, while the precision of items retained in memory was only marginally affected by the passage of time. These results provide new evidence for the discrete-capacity view of working memory and a new characterization of iconic memory decay.
Fundamentals of precision medicine
Divaris, Kimon
2018-01-01
Imagine a world where clinicians make accurate diagnoses and provide targeted therapies to their patients according to well-defined, biologically-informed disease subtypes, accounting for individual differences in genetic make-up, behaviors, cultures, lifestyles and the environment. This is not as utopic as it may seem. Relatively recent advances in science and technology have led to an explosion of new information on what underlies health and what constitutes disease. These novel insights emanate from studies of the human genome and microbiome, their associated transcriptomes, proteomes and metabolomes, as well as epigenomics and exposomics—such ‘omics data can now be generated at unprecedented depth and scale, and at rapidly decreasing cost. Making sense and integrating these fundamental information domains to transform health care and improve health remains a challenge—an ambitious, laudable and high-yield goal. Precision dentistry is no longer a distant vision; it is becoming part of the rapidly evolving present. Insights from studies of the human genome and microbiome, their associated transcriptomes, proteomes and metabolomes, and epigenomics and exposomics have reached an unprecedented depth and scale. Much more needs to be done, however, for the realization of precision medicine in the oral health domain. PMID:29227115
The Data-Driven Approach to Spectroscopic Analyses
NASA Astrophysics Data System (ADS)
Ness, M.
2018-01-01
I review the data-driven approach to spectroscopy, The Cannon, a method for deriving precise chemical compositions and stellar ages, fundamental diagnostics of galaxy formation, across the many stellar surveys that are mapping the Milky Way. With The Cannon, the abundances and stellar parameters from the multitude of stellar surveys can be placed directly on the same scale, using stars in common between the surveys. Furthermore, the information that resides in the data can be fully extracted; this has resulted in higher-precision stellar parameters and abundances being delivered from spectroscopic data and has opened up new avenues in galactic archeology, for example, in the determination of ages for red giant stars across the Galactic disk. Coupled with Gaia distances, proper motions, and derived orbit families, the stellar age and individual abundance information delivered at the precision obtained with the data-driven approach provides very strong constraints on the evolution and birthplace of stars in the Milky Way. I also review the role of data-driven spectroscopy as we enter the era where we have both the data and the tools to build the ultimate conglomerate of galactic information, and highlight further applications of data-driven models in the coming decade.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pritychenko, B.
The precision of double-beta (ββ) decay experimental half-lives and their uncertainties is reanalyzed. The method of Benford's distributions has been applied to nuclear reaction, structure, and decay data sets. The first-digit distribution trend for ββ-decay two-neutrino half-lives, T1/2(2ν), is consistent with that of large nuclear reaction and structure data sets and provides validation of the experimental half-lives. A complementary analysis of the decay uncertainties indicates deficiencies due to the small size of statistical samples and incomplete collection of experimental information. Further experimental and theoretical efforts would lead toward more precise values of ββ-decay half-lives and nuclear matrix elements.
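Benford's first-digit law, which the analysis above applies to nuclear data sets, predicts that a leading digit d occurs with probability P(d) = log10(1 + 1/d). A minimal sketch of such a check is below; the sample half-lives are invented for illustration, and the chi-square statistic here is the generic Pearson form, not necessarily the paper's exact test.

```python
import math
from collections import Counter

def benford_expected():
    """Benford's law: P(d) = log10(1 + 1/d) for leading digit d = 1..9."""
    return {d: math.log10(1 + 1 / d) for d in range(1, 10)}

def first_digit(x):
    """Leading significant digit of a nonzero number."""
    s = f"{abs(x):.12e}"           # scientific notation: first char is the digit
    return int(s[0])

def chi_square_vs_benford(values):
    """Pearson chi-square statistic of observed first digits vs Benford."""
    counts = Counter(first_digit(v) for v in values if v != 0)
    n = sum(counts.values())
    exp = benford_expected()
    return sum((counts.get(d, 0) - n * exp[d]) ** 2 / (n * exp[d])
               for d in range(1, 10))

# Invented half-lives (in years) spanning many orders of magnitude, the regime
# in which data sets tend to follow Benford's law
sample = [2.3e19, 7.1e24, 1.9e21, 1.1e20, 3.5e25, 9.8e18, 1.4e22, 2.2e23]
print(round(chi_square_vs_benford(sample), 2))
```

A small statistic (relative to the chi-square critical value for 8 degrees of freedom) would indicate consistency with Benford's distribution; a large one flags a data set whose first digits deviate from the law.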
Precision of working memory for visual motion sequences and transparent motion surfaces
Zokaei, Nahid; Gorgoraptis, Nikos; Bahrami, Bahador; Bays, Paul M; Husain, Masud
2012-01-01
Recent studies investigating working memory for location, colour and orientation support a dynamic resource model. We examined whether this might also apply to motion, using random dot kinematograms (RDKs) presented sequentially or simultaneously. Mean precision for motion direction declined as sequence length increased, with precision being lower for earlier RDKs. Two alternative models of working memory were compared specifically to distinguish between the contributions of different sources of error that corrupt memory (Zhang & Luck (2008) vs. Bays et al (2009)). The latter provided a significantly better fit for the data, revealing that decrease in memory precision for earlier items is explained by an increase in interference from other items in a sequence, rather than random guessing or a temporal decay of information. Misbinding feature attributes is an important source of error in working memory. Precision of memory for motion direction decreased when two RDKs were presented simultaneously as transparent surfaces, compared to sequential RDKs. However, precision was enhanced when one motion surface was prioritized, demonstrating that selective attention can improve recall precision. These results are consistent with a resource model that can be used as a general conceptual framework for understanding working memory across a range of visual features. PMID:22135378
Precision medicine in cardiology.
Antman, Elliott M; Loscalzo, Joseph
2016-10-01
The cardiovascular research and clinical communities are ideally positioned to address the epidemic of noncommunicable causes of death, as well as advance our understanding of human health and disease, through the development and implementation of precision medicine. New tools will be needed for describing the cardiovascular health status of individuals and populations, including 'omic' data, exposome and social determinants of health, the microbiome, behaviours and motivations, patient-generated data, and the array of data in electronic medical records. Cardiovascular specialists can build on their experience and use precision medicine to facilitate discovery science and improve the efficiency of clinical research, with the goal of providing more precise information to improve the health of individuals and populations. Overcoming the barriers to implementing precision medicine will require addressing a range of technical and sociopolitical issues. Health care under precision medicine will become a more integrated, dynamic system, in which patients are no longer a passive entity on whom measurements are made, but instead are central stakeholders who contribute data and participate actively in shared decision-making. Many traditionally defined diseases have common mechanisms; therefore, elimination of a siloed approach to medicine will ultimately pave the path to the creation of a universal precision medicine environment.
Precise monitoring of global temperature trends from satellites
NASA Technical Reports Server (NTRS)
Spencer, Roy W.; Christy, John R.
1990-01-01
Passive microwave radiometry from satellites provides more precise atmospheric temperature information than that obtained from the relatively sparse distribution of thermometers over the earth's surface. Accurate global atmospheric temperature estimates are needed for detection of possible greenhouse warming, evaluation of computer models of climate change, and for understanding important factors in the climate system. Analysis of the first 10 years (1979 to 1988) of satellite measurements of lower atmospheric temperature changes reveals a monthly precision of 0.01 °C, large temperature variability on time scales from weeks to several years, but no obvious trend for the 10-year period. The warmest years, in descending order, were 1987, 1988, 1983, and 1980. The years 1984, 1985, and 1986 were the coolest.
Play-Based Neuropsychological Assessment of Toddlers
ERIC Educational Resources Information Center
Dykeman, Bruce F.
2008-01-01
Standardized psychological assessment provides a precise yet limited view of the neuropsychological status of preschool toddlers, whose brain functioning is only beginning to develop localized functioning. Yet, referrals for preschool evaluation of these early-age children often request a wide variety of information about brain-behavior…
The PDF4LHC report on PDFs and LHC data: Results from Run I and preparation for Run II
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rojo, Juan; Accardi, Alberto; Ball, Richard D.
2015-09-16
The accurate determination of Parton Distribution Functions (PDFs) of the proton is an essential ingredient of the Large Hadron Collider (LHC) program. PDF uncertainties impact a wide range of processes, from Higgs boson characterization and precision Standard Model measurements to New Physics searches. A major recent development in modern PDF analyses has been to exploit the wealth of new information contained in precision measurements from the LHC Run I, as well as progress in tools and methods to include these data in PDF fits. In this report we summarize the information that PDF-sensitive measurements at the LHC have provided so far, and review the prospects for further constraining PDFs with data from the recently started Run II. As a result, this document aims to provide useful input to the LHC collaborations to prioritize their PDF-sensitive measurements at Run II, as well as a comprehensive reference for the PDF-fitting collaborations.
Quantum metrology and estimation of Unruh effect
Wang, Jieci; Tian, Zehua; Jing, Jiliang; Fan, Heng
2014-01-01
We study quantum metrology for a pair of entangled Unruh-DeWitt detectors when one of them is accelerated and coupled to a massless scalar field. Compared with previous schemes, our model requires only local interaction and avoids the use of cavities in the probe state preparation process. We show that the probe state preparation and the interaction between the accelerated detector and the external field have significant effects on the value of the quantum Fisher information, and correspondingly set different ultimate limits on the precision of the estimation of the Unruh effect. We find that the precision of the estimation can be improved by a larger effective coupling strength and a longer interaction time. Alternatively, the energy gap of the detector has a range that can provide better precision. Thus we may adjust those parameters to attain higher precision in the estimation. We also find that an extremely high acceleration is not required in the quantum metrology process. PMID:25424772
PRECISE:PRivacy-prEserving Cloud-assisted quality Improvement Service in hEalthcare
Chen, Feng; Wang, Shuang; Mohammed, Noman; Cheng, Samuel; Jiang, Xiaoqian
2015-01-01
Quality improvement (QI) requires systematic and continuous efforts to enhance healthcare services. A healthcare provider might wish to compare local statistics with those from other institutions in order to identify problems and develop intervention to improve the quality of care. However, the sharing of institution information may be deterred by institutional privacy as publicizing such statistics could lead to embarrassment and even financial damage. In this article, we propose a PRivacy-prEserving Cloud-assisted quality Improvement Service in hEalthcare (PRECISE), which aims at enabling cross-institution comparison of healthcare statistics while protecting privacy. The proposed framework relies on a set of state-of-the-art cryptographic protocols including homomorphic encryption and Yao’s garbled circuit schemes. By securely pooling data from different institutions, PRECISE can rank the encrypted statistics to facilitate QI among participating institutes. We conducted experiments using MIMIC II database and demonstrated the feasibility of the proposed PRECISE framework. PMID:26146645
Precision Medicine: From Science to Value
Ginsburg, Geoffrey S; Phillips, Kathryn A
2018-01-01
Precision medicine is poised to have an impact on patients, health care delivery systems and research participants in ways that were only imagined 15 years ago when the human genome was first sequenced. While discovery using genome-based technologies has accelerated, these have only begun to be adopted into clinical medicine. Here we define precision medicine and the stakeholder ecosystem required to enable its integration into research and health care. We explore the intersection of data science, analytics and precision medicine in creating a learning health system that carries out research in the context of clinical care and at the same time optimizes the tools and information used to deliver improved patient outcomes. We provide examples of real-world impact, and conclude with a policy and economic agenda that will be necessary for the adoption of this new paradigm of health care both in the United States and globally. PMID:29733705
An Abstraction-Based Data Model for Information Retrieval
NASA Astrophysics Data System (ADS)
McAllister, Richard A.; Angryk, Rafal A.
Language ontologies provide an avenue for automated lexical analysis that may be used to supplement existing information retrieval methods. This paper presents a method of information retrieval that takes advantage of WordNet, a lexical database, to generate paths of abstraction, and uses them as the basis for an inverted index structure for the retrieval of documents from an indexed corpus. We present this method as an entrée to a line of research on using ontologies to perform word-sense disambiguation and improve the precision of existing information retrieval techniques.
Hybridization of Architectural Styles for Integrated Enterprise Information Systems
NASA Astrophysics Data System (ADS)
Bagusyte, Lina; Lupeikiene, Audrone
Current enterprise systems engineering theory does not yet provide adequate support for the development of information systems on demand; more precisely, the theory is still taking shape. This chapter proposes the main architectural decisions that underlie the design of integrated enterprise information systems. It argues for extending service-oriented architecture by merging it with the component-based paradigm at the design stage and using connectors of different architectural styles. The suitability of the general-purpose language SysML for modeling integrated enterprise information system architectures is described and supporting arguments are presented.
NASA Astrophysics Data System (ADS)
Karagiannis, Georgios
2017-03-01
This work led to a new method named 3D spectracoustic tomographic mapping imaging. Current and future work concerns the fabrication of a combined acoustic-microscopy transducer and infrared-illumination probe permitting the simultaneous acquisition of the spectroscopic and tomographic information. This probe provides high-fidelity, precisely registered information from the combined modalities, termed spectracoustic information.
Monte Carlo simulations of precise timekeeping in the Milstar communication satellite system
NASA Technical Reports Server (NTRS)
Camparo, James C.; Frueholz, R. P.
1995-01-01
The Milstar communications satellite system will provide secure antijam communication capabilities for DOD operations into the next century. In order to accomplish this task, the Milstar system will employ precise timekeeping on its satellites and at its ground control stations. The constellation will consist of four satellites in geosynchronous orbit, each carrying a set of four rubidium (Rb) atomic clocks. Several times a day, during normal operation, the Mission Control Element (MCE) will collect timing information from the constellation, and after several days use this information to update the time and frequency of the satellite clocks. The MCE will maintain precise time with a cesium (Cs) atomic clock, synchronized to UTC(USNO) via a GPS receiver. We have developed a Monte Carlo simulation of Milstar's space segment timekeeping. The simulation includes the effects of: uplink/downlink time transfer noise; satellite crosslink time transfer noise; satellite diurnal temperature variations; satellite and ground station atomic clock noise; and also quantization limits regarding satellite time and frequency corrections. The Monte Carlo simulation capability has proven to be an invaluable tool in assessing the performance characteristics of various timekeeping algorithms proposed for Milstar, and also in highlighting the timekeeping capabilities of the system. Here, we provide a brief overview of the basic Milstar timekeeping architecture as it is presently envisioned. We then describe the Monte Carlo simulation of space segment timekeeping, and provide examples of the simulation's efficacy in resolving timekeeping issues.
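The clock behavior described in this abstract can be illustrated with a minimal Monte Carlo sketch: a toy atomic-clock model with white phase noise and a random walk in fractional frequency. The function name and noise magnitudes below are hypothetical illustrations, not the actual Milstar simulation parameters.

```python
import random

def simulate_clock(steps, dt=1.0, white_sigma=1e-9, walk_sigma=1e-12, seed=0):
    """Toy Monte Carlo of an atomic clock's time error: white phase noise
    plus a random walk in fractional frequency (hypothetical noise levels)."""
    rng = random.Random(seed)
    phase, freq = 0.0, 0.0
    history = []
    for _ in range(steps):
        freq += rng.gauss(0.0, walk_sigma)               # frequency random walk
        phase += freq * dt + rng.gauss(0.0, white_sigma)  # accumulated time error
        history.append(phase)
    return history

# Periodic time/frequency corrections (analogous to the MCE updates in the
# abstract) would be applied to bound this accumulated error.
errors = simulate_clock(10_000)
```

Running many such realizations with different seeds, and applying candidate correction algorithms to each, is the essence of the Monte Carlo assessment described above.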
Receptive Fields and the Reconstruction of Visual Information.
1985-09-01
depending on the noise. Thus our model would suggest that the interpolation filters for deblurring are playing a role in hyperacuity. This is novel...of additional precision in the information can be obtained by a process of deblurring, which could be relevant to hyperacuity. It also provides an... impulse of heat diffuses into increasingly larger Gaussian distributions as time proceeds. Mathematically, let f(x) denote the initial temperature
Lossless compression of image data products on the FIFE CD-ROM series
NASA Technical Reports Server (NTRS)
Newcomer, Jeffrey A.; Strebel, Donald E.
1993-01-01
How do you store enough of the key data sets, from a total of 120 gigabytes of data collected for a scientific experiment, on a collection of CD-ROMs small enough to distribute to a broad scientific community? In such an application, where information loss is unacceptable, lossless compression algorithms are the only choice. Although lossy compression algorithms can provide an order-of-magnitude improvement in compression ratios over lossless algorithms, the information that is lost is often part of the key scientific precision of the data. Therefore, lossless compression algorithms are, and will continue to be, extremely important in minimizing archival storage requirements and distribution of large Earth and space science (ESS) data sets while preserving the essential scientific precision of the data.
Design Through Manufacturing: The Solid Model-Finite Element Analysis Interface
NASA Technical Reports Server (NTRS)
Rubin, Carol
2002-01-01
State-of-the-art computer aided design (CAD) presently affords engineers the opportunity to create solid models of machine parts reflecting every detail of the finished product. Ideally, in the aerospace industry, these models should fulfill two very important functions: (1) provide numerical control information for automated manufacturing of precision parts, and (2) enable analysts to easily evaluate the stress levels (using finite element analysis - FEA) for all structurally significant parts used in aircraft and space vehicles. Today's state-of-the-art CAD programs perform function (1) very well, providing an excellent model for precision manufacturing. But they do not provide a straightforward and simple means of automating the translation from CAD to FEA models, especially for aircraft-type structures. Presently, the process of preparing CAD models for FEA consumes a great deal of the analyst's time.
Hughes, Kevin S; Ambinder, Edward P; Hess, Gregory P; Yu, Peter Paul; Bernstam, Elmer V; Routbort, Mark J; Clemenceau, Jean Rene; Hamm, John T; Febbo, Phillip G; Domchek, Susan M; Chen, James L; Warner, Jeremy L
2017-09-20
At the ASCO Data Standards and Interoperability Summit held in May 2016, it was unanimously decided that four areas of current oncology clinical practice have serious, unmet health information technology needs. The following areas of need were identified: 1) omics and precision oncology, 2) advancing interoperability, 3) patient engagement, and 4) value-based oncology. To begin to address these issues, ASCO convened two complementary workshops: the Omics and Precision Oncology Workshop in October 2016 and the Advancing Interoperability Workshop in December 2016. A common goal was to address the complexity, enormity, and rapidly changing nature of genomic information, which existing electronic health records are ill equipped to manage. The subject matter experts invited to the Omics and Precision Oncology Workgroup were tasked with the responsibility of determining a specific, limited need that could be addressed by a software application (app) in the short-term future, using currently available genomic knowledge bases. Hence, the scope of this workshop was to determine the basic functionality of one app that could serve as a test case for app development. The goal of the second workshop, described separately, was to identify the specifications for such an app. This approach was chosen both to facilitate the development of a useful app and to help ASCO and oncologists better understand the mechanics, difficulties, and gaps in genomic clinical decision support tool development. In this article, we discuss the key challenges and recommendations identified by the workshop participants. Our hope is to narrow the gap between the practicing oncologist and ongoing national efforts to provide precision oncology and value-based care to cancer patients.
Fidelity of the ensemble code for visual motion in primate retina.
Frechette, E S; Sher, A; Grivich, M I; Petrusca, D; Litke, A M; Chichilnisky, E J
2005-07-01
Sensory experience typically depends on the ensemble activity of hundreds or thousands of neurons, but little is known about how populations of neurons faithfully encode behaviorally important sensory information. We examined how precisely speed of movement is encoded in the population activity of magnocellular-projecting parasol retinal ganglion cells (RGCs) in macaque monkey retina. Multi-electrode recordings were used to measure the activity of approximately 100 parasol RGCs simultaneously in isolated retinas stimulated with moving bars. To examine how faithfully the retina signals motion, stimulus speed was estimated directly from recorded RGC responses using an optimized algorithm that resembles models of motion sensing in the brain. RGC population activity encoded speed with a precision of approximately 1%. The elementary motion signal was conveyed in approximately 10 ms, comparable to the interspike interval. Temporal structure in spike trains provided more precise speed estimates than time-varying firing rates. Correlated activity between RGCs had little effect on speed estimates. The spatial dispersion of RGC receptive fields along the axis of motion influenced speed estimates more strongly than along the orthogonal direction, as predicted by a simple model based on RGC response time variability and optimal pooling. ON and OFF cells encoded speed with similar and statistically independent variability. Simulation of downstream speed estimation using populations of speed-tuned units showed that peak (winner take all) readout provided more precise speed estimates than centroid (vector average) readout. These findings reveal how faithfully the retinal population code conveys information about stimulus speed and the consequences for motion sensing in the brain.
Spatial analysis of NDVI readings with different sampling densities
USDA-ARS?s Scientific Manuscript database
Advanced remote sensing technologies provide research an innovative way of collecting spatial data for use in precision agriculture. Sensor information and spatial analysis together allow for a complete understanding of the spatial complexity of a field and its crop. The objective of the study was...
Working memory retrieval as a decision process
Pearson, Benjamin; Raškevičius, Julius; Bays, Paul M.; Pertzov, Yoni; Husain, Masud
2014-01-01
Working memory (WM) is a core cognitive process fundamental to human behavior, yet the mechanisms underlying it remain highly controversial. Here we provide a new framework for understanding retrieval of information from WM, conceptualizing it as a decision based on the quality of internal evidence. Recent findings have demonstrated that precision of WM decreases with memory load. If WM retrieval uses a decision process that depends on memory quality, systematic changes in response time distribution should occur as a function of WM precision. We asked participants to view sample arrays and, after a delay, report the direction of change in location or orientation of a probe. As WM precision deteriorated with increasing memory load, retrieval time increased systematically. Crucially, the shape of reaction time distributions was consistent with a linear accumulator decision process. Varying either task relevance of items or maintenance duration influenced memory precision, with corresponding shifts in retrieval time. These results provide strong support for a decision-making account of WM retrieval based on noisy storage of items. Furthermore, they show that encoding, maintenance, and retrieval in WM need not be considered as separate processes, but may instead be conceptually unified as operations on the same noise-limited, neural representation. PMID:24492597
Using simulation to improve wildlife surveys: Wintering mallards in Mississippi, USA
Pearse, A.T.; Reinecke, K.J.; Dinsmore, S.J.; Kaminski, R.M.
2009-01-01
Wildlife conservation plans generally require reliable data about population abundance and density. Aerial surveys often can provide these data; however, associated costs necessitate designing and conducting surveys efficiently. We developed methods to simulate population distributions of mallards (Anas platyrhynchos) wintering in western Mississippi, USA, by combining bird observations from three previous strip-transect surveys and habitat data from three sets of satellite images representing conditions when surveys were conducted. For each simulated population distribution, we compared 12 primary survey designs and two secondary design options by using coefficients of variation (CV) of population indices as the primary criterion for assessing survey performance. In all, 3 of the 12 primary designs provided the best precision (CV ≤ 11.7%) and performed equally well (difference ≤ 0.6%). Features of the designs that provided the largest gains in precision were optimal allocation of sample effort among strata and configuring the study area into five rather than four strata, to more precisely estimate mallard indices in areas of consistently high density. Of the two secondary design options, we found including a second observer to double the size of strip transects increased precision or decreased costs, whereas ratio estimation using auxiliary habitat data from satellite images did not increase precision appreciably. We recommend future surveys of mallard populations in our study area use the strata we developed, optimally allocate samples among strata, employ PPS or EPS sampling, and include two observers when qualified staff are available.
More generally, the methods we developed to simulate population distributions from prior survey data provide a cost-effective method to assess performance of alternative wildlife surveys critical to informing management decisions, and could be extended to account for effects of detectability on estimates of true abundance. © 2009 CSIRO.
The Paradox of Abstraction: Precision Versus Concreteness.
Iliev, Rumen; Axelrod, Robert
2017-06-01
We introduce a novel measure of abstractness based on the amount of information of a concept computed from its position in a semantic taxonomy. We refer to this measure as precision. We propose two alternative ways to measure precision, one based on the path length from a concept to the root of the taxonomic tree, and another based on the number of direct and indirect descendants. Since more information implies greater processing load, we hypothesize that nouns higher in precision will have a processing disadvantage in a lexical decision task. We contrast precision with concreteness, a common measure of abstractness based on the proportion of sensory-based information associated with a concept. Since concreteness facilitates cognitive processing, we predict that while both concreteness and precision are measures of abstractness, they will have opposite effects on performance. In two studies we found empirical support for our hypothesis. Precision and concreteness had opposite effects on latency and accuracy in a lexical decision task, and these opposite effects were observable while controlling for word length, word frequency, affective content and semantic diversity. Our results support the view that the organization of concepts includes amodal semantic structures that are independent of sensory information. They also suggest that we should distinguish between sensory-based and amount-of-information-based abstractness.
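The first variant of the precision measure, path length from a concept to the taxonomy root, can be sketched on a toy hand-made taxonomy. The dictionary and function below are illustrative assumptions, not the authors' data or implementation.

```python
# Toy taxonomy: each concept maps to its parent; the root maps to None.
taxonomy = {
    "entity": None,
    "object": "entity",
    "animal": "object",
    "dog": "animal",
    "poodle": "dog",
}

def precision_depth(concept, parents):
    """Path-length precision: number of edges from a concept to the root.
    Deeper (more specific) concepts carry more information."""
    depth = 0
    while parents[concept] is not None:
        concept = parents[concept]
        depth += 1
    return depth

print(precision_depth("poodle", taxonomy))  # → 4 (poodle→dog→animal→object→entity)
```

On this measure, "poodle" (depth 4) is higher in precision than "animal" (depth 2), matching the intuition that more specific nouns carry more information.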
Garrard, Georgia E; McCarthy, Michael A; Vesk, Peter A; Radford, James Q; Bennett, Andrew F
2012-01-01
1. Informative Bayesian priors can improve the precision of estimates in ecological studies or estimate parameters for which little or no information is available. While Bayesian analyses are becoming more popular in ecology, the use of strongly informative priors remains rare, perhaps because examples of informative priors are not readily available in the published literature. 2. Dispersal distance is an important ecological parameter, but is difficult to measure and estimates are scarce. General models that provide informative prior estimates of dispersal distances will therefore be valuable. 3. Using a world-wide data set on birds, we develop a predictive model of median natal dispersal distance that includes body mass, wingspan, sex and feeding guild. This model predicts median dispersal distance well when using the fitted data and an independent test data set, explaining up to 53% of the variation. 4. Using this model, we predict a priori estimates of median dispersal distance for 57 woodland-dependent bird species in northern Victoria, Australia. These estimates are then used to investigate the relationship between dispersal ability and vulnerability to landscape-scale changes in habitat cover and fragmentation. 5. We find evidence that woodland bird species with poor predicted dispersal ability are more vulnerable to habitat fragmentation than those species with longer predicted dispersal distances, thus improving the understanding of this important phenomenon. 6. The value of constructing informative priors from existing information is also demonstrated. When used as informative priors for four example species, predicted dispersal distances reduced the 95% credible intervals of posterior estimates of dispersal distance by 8-19%. 
Further, should we have wished to collect information on avian dispersal distances and relate it to species' responses to habitat loss and fragmentation, data from 221 individuals across 57 species would have been required to obtain estimates with the same precision as those provided by the general model. © 2011 The Authors. Journal of Animal Ecology © 2011 British Ecological Society.
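The core mechanism here, an informative prior shrinking the credible interval of a posterior estimate, can be shown with a minimal conjugate normal-normal sketch. The numbers are hypothetical and this is not the authors' dispersal model, only an illustration of why informative priors increase precision.

```python
import math

def posterior_normal(prior_mean, prior_sd, data_mean, data_sd, n):
    """Conjugate normal-normal update with known observation sd:
    posterior precision is the sum of prior and data precisions."""
    prior_prec = 1.0 / prior_sd ** 2
    data_prec = n / data_sd ** 2
    post_var = 1.0 / (prior_prec + data_prec)
    post_mean = post_var * (prior_prec * prior_mean + data_prec * data_mean)
    return post_mean, math.sqrt(post_var)

# Hypothetical numbers: same data, vague vs informative prior.
_, sd_vague = posterior_normal(0.0, 100.0, 5.0, 2.0, n=10)
_, sd_inform = posterior_normal(4.0, 1.0, 5.0, 2.0, n=10)
assert sd_inform < sd_vague  # informative prior -> narrower credible interval
```

The 95% credible interval scales with the posterior standard deviation, so the informative prior delivers the kind of 8-19% interval reduction reported in the abstract whenever the prior is consistent with the data.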
INFORMATION: THEORY, BRAIN, AND BEHAVIOR
Jensen, Greg; Ward, Ryan D.; Balsam, Peter D.
2016-01-01
In the 65 years since its formal specification, information theory has become an established statistical paradigm, providing powerful tools for quantifying probabilistic relationships. Behavior analysis has begun to adopt these tools as a novel means of measuring the interrelations between behavior, stimuli, and contingent outcomes. This approach holds great promise for making more precise determinations about the causes of behavior and the forms in which conditioning may be encoded by organisms. In addition to providing an introduction to the basics of information theory, we review some of the ways that information theory has informed the studies of Pavlovian conditioning, operant conditioning, and behavioral neuroscience. In addition to enriching each of these empirical domains, information theory has the potential to act as a common statistical framework by which results from different domains may be integrated, compared, and ultimately unified. PMID:24122456
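The central quantity in this paradigm, the mutual information between stimuli and responses, can be computed from raw observation counts. The toy data below are illustrative, not from the reviewed studies.

```python
import math
from collections import Counter

def mutual_information(pairs):
    """I(X;Y) in bits, estimated from a list of (x, y) observations."""
    n = len(pairs)
    pxy = Counter(pairs)
    px = Counter(x for x, _ in pairs)
    py = Counter(y for _, y in pairs)
    mi = 0.0
    for (x, y), c in pxy.items():
        p_xy = c / n
        # p_xy / (p_x * p_y) simplifies to c*n / (count_x * count_y)
        mi += p_xy * math.log2(c * n / (px[x] * py[y]))
    return mi

# A stimulus that perfectly predicts a binary response carries 1 bit:
print(mutual_information([("A", 1), ("B", 0)] * 50))  # → 1.0
```

Independent stimulus-response pairs yield 0 bits, giving a single scale on which Pavlovian, operant, and neural data can all be compared, as the review argues.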
Precision Manipulation with Cooperative Robots
NASA Technical Reports Server (NTRS)
Stroupe, Ashley; Huntsberger, Terry; Okon, Avi; Aghzarian, Hrand
2005-01-01
This work addresses several challenges of cooperative transport and precision manipulation. Precision manipulation requires a rigid grasp, which places a hard constraint on the relative rover formation that must be accommodated, even though the rovers cannot directly observe their relative poses. Additionally, rovers must jointly select appropriate actions based on all available sensor information. Lastly, rovers cannot act on independent sensor information, but must fuse information to move jointly; the methods for fusing information must be determined.
Thin-Slice Perception Develops Slowly
ERIC Educational Resources Information Center
Balas, Benjamin; Kanwisher, Nancy; Saxe, Rebecca
2012-01-01
Body language and facial gesture provide sufficient visual information to support high-level social inferences from "thin slices" of behavior. Given short movies of nonverbal behavior, adults make reliable judgments in a large number of tasks. Here we find that the high precision of adults' nonverbal social perception depends on the slow…
Performance Evaluation of the Honeywell GG1308 Miniature Ring Laser Gyroscope
1993-01-01
information. The final display line provides the current DSB configuration status. An external strobe was established between the Contraves motion...components and systems. The core of the facility is a Contraves-Goerz Model 57CD 2-axis motion simulator capable of highly precise position, rate and
3D Self-Localisation From Angle of Arrival Measurements
2009-04-01
systems can provide precise position information. However, there are situations where GPS is not adequate, such as indoor, underwater, extraterrestrial or...Transactions on Pattern Analysis and Machine Intelligence, Vol. 22, No. 6, June 2000, pp 610-622. 7. Torrieri, D.J., "Statistical Theory of Passive Location
NASA's global differential GPS system and the TDRSS augmentation service for satellites
NASA Technical Reports Server (NTRS)
Bar-Sever, Yoaz; Young, Larry; Stocklin, Frank; Rush, John
2004-01-01
NASA is planning to launch a new service for Earth satellites providing them with precise GPS differential corrections and other ancillary information enabling decimeter level orbit determination accuracy, and nanosecond time-transfer accuracy, onboard, in real-time. The TDRSS Augmentation Service for Satellites (TASS) will broadcast its message on the S-band multiple access channel of NASA's Tracking and Data Relay Satellite System (TDRSS). The satellite's phase array antenna has been configured to provide a wide beam, extending coverage up to 1000 km altitude over the poles. Global coverage will be ensured with broadcast from three or more TDRSS satellites. The GPS differential corrections are provided by the NASA Global Differential GPS (GDGPS) System, developed and operated by NASA's Jet Propulsion Laboratory. The GDGPS System employs a global ground network of more than 70 GPS receivers to monitor the GPS constellation in real time. The system provides real-time estimates of the GPS satellite states, as well as many other real-time products such as differential corrections, global ionospheric maps, and integrity monitoring. The unique multiply redundant architecture of the GDGPS System ensures very high reliability, with 99.999% demonstrated since the inception of the system in Early 2000. The estimated real time GPS orbit and clock states provided by the GDGPS system are accurate to better than 20 cm 3D RMS, and have been demonstrated to support sub-decimeter real time positioning and orbit determination for a variety of terrestrial, airborne, and spaceborne applications. In addition to the GPS differential corrections, TASS will provide real-time Earth orientation and solar flux information that enable precise onboard knowledge of the Earth-fixed position of the spacecraft, and precise orbit prediction and planning capabilities. 
TASS will also provide 5-second alarms for GPS integrity failures based on the unique GPS integrity monitoring service of the GDGPS System.
Electroweak precision data and gravitino dark matter
NASA Astrophysics Data System (ADS)
Heinemeyer, S.
2007-11-01
Electroweak precision measurements can provide indirect information about the possible scale of supersymmetry already at the present level of accuracy. We review present-day sensitivities of precision data in mSUGRA-type models with the gravitino as the lightest supersymmetric particle (LSP). The χ² fit is based on MW, sin²θeff, (g-2)μ, BR(b → sγ) and the lightest MSSM Higgs boson mass, Mh. We find indications for relatively light soft supersymmetry-breaking masses, offering good prospects for the LHC and the ILC, and in some cases also for the Tevatron.
Ferguson, Michael A.D.; Messier, François
1997-01-01
Aboriginal peoples want their ecological knowledge used in the management of wildlife populations. To accomplish this, management agencies will need regional summaries of aboriginal knowledge about long-term changes in the distribution and abundance of wildlife populations and ecological factors that influence those changes. Between 1983 and 1994, we developed a method for collecting Inuit knowledge about historical changes in a caribou (Rangifer tarandus) population on southern Baffin Island from c. 1900 to 1994. Advice from Inuit allowed us to collect and interpret their oral knowledge in culturally appropriate ways. Local Hunters and Trappers Associations (HTAs) and other Inuit identified potential informants to maximize the spatial and temporal scope of the study. In the final interview protocol, each informant (i) established his biographical map and time line, (ii) described changes in caribou distribution and density during his life, and (iii) discussed ecological factors that may have caused changes in caribou populations. Personal and parental observations of caribou distribution and abundance were reliable and precise. Inuit who had hunted caribou during periods of scarcity provided more extensive information than those hunters who had hunted mainly ringed seals (Phoca hispida); nevertheless, seal hunters provided information about coastal areas where caribou densities were insufficient for the needs of caribou hunters. The wording of our questions influenced the reliability of informants' answers; leading questions were especially problematic. We used only information that we considered reliable after analyzing the wording of both questions and answers from translated transcripts. This analysis may have excluded some reliable information because informants tended to understate certainty in their recollections. 
We tried to retain the accuracy and precision inherent in Inuit oral traditions; comparisons of information from several informants and comparisons with published and archival historical reports indicate that we retained these qualities of Inuit knowledge.
Capacity and precision in an animal model of visual short-term memory.
Lara, Antonio H; Wallis, Jonathan D
2012-03-14
Temporary storage of information in visual short-term memory (VSTM) is a key component of many complex cognitive abilities. However, it is highly limited in capacity. Understanding the neurophysiological nature of this capacity limit will require a valid animal model of VSTM. We used a multiple-item color change detection task to measure macaque monkeys' VSTM capacity. Subjects' performance deteriorated and reaction times increased as a function of the number of items in memory. Additionally, we measured the precision of the memory representations by varying the distance between sample and test colors. In trials with similar sample and test colors, subjects made more errors compared to trials with highly discriminable colors. We modeled the error distribution as a Gaussian function and used this to estimate the precision of VSTM representations. We found that as the number of items in memory increases the precision of the representations decreases dramatically. Additionally, we found that focusing attention on one of the objects increases the precision with which that object is stored and degrades the precision of the remaining. These results are in line with recent findings in human psychophysics and provide a solid foundation for understanding the neurophysiological nature of the capacity limit of VSTM.
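The Gaussian error model used here to estimate memory precision can be sketched as follows. Taking precision as the reciprocal of the standard deviation of report errors is a simplification of the fitting procedure described in the abstract, and the simulated data are hypothetical.

```python
import math
import random

def memory_precision(errors):
    """Estimate precision as 1/sd of report errors under a Gaussian
    error model (simplified sketch of the fitting approach)."""
    n = len(errors)
    mean = sum(errors) / n
    var = sum((e - mean) ** 2 for e in errors) / n
    return 1.0 / math.sqrt(var)

# Simulated reports (hypothetical): more items in memory -> noisier
# representations -> broader error distribution -> lower precision.
rng = random.Random(1)
one_item = [rng.gauss(0, 5) for _ in range(2000)]     # sd ~5 deg of hue angle
four_items = [rng.gauss(0, 15) for _ in range(2000)]  # sd ~15 deg
assert memory_precision(one_item) > memory_precision(four_items)
```

In the actual task, the error is the angular distance between the sample and the reported test color, and the fitted Gaussian width tracks how precision collapses as set size grows.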
Search guidance is proportional to the categorical specificity of a target cue.
Schmidt, Joseph; Zelinsky, Gregory J
2009-10-01
Visual search studies typically assume the availability of precise target information to guide search, often a picture of the exact target. However, search targets in the real world are often defined categorically and with varying degrees of visual specificity. In five target preview conditions we manipulated the availability of target visual information in a search task for common real-world objects. Previews were: a picture of the target, an abstract textual description of the target, a precise textual description, an abstract + colour textual description, or a precise + colour textual description. Guidance generally increased as information was added to the target preview. We conclude that the information used for search guidance need not be limited to a picture of the target. Although generally less precise, to the extent that visual information can be extracted from a target label and loaded into working memory, this information too can be used to guide search.
Precision of working memory for visual motion sequences and transparent motion surfaces.
Zokaei, Nahid; Gorgoraptis, Nikos; Bahrami, Bahador; Bays, Paul M; Husain, Masud
2011-12-01
Recent studies investigating working memory for location, color, and orientation support a dynamic resource model. We examined whether this might also apply to motion, using random dot kinematograms (RDKs) presented sequentially or simultaneously. Mean precision for motion direction declined as sequence length increased, with precision being lower for earlier RDKs. Two alternative models of working memory were compared specifically to distinguish between the contributions of different sources of error that corrupt memory (W. Zhang & S. J. Luck, 2008 vs. P. M. Bays, R. F. G. Catalao, & M. Husain, 2009). The latter provided a significantly better fit for the data, revealing that decrease in memory precision for earlier items is explained by an increase in interference from other items in a sequence rather than random guessing or a temporal decay of information. Misbinding feature attributes is an important source of error in working memory. Precision of memory for motion direction decreased when two RDKs were presented simultaneously as transparent surfaces, compared to sequential RDKs. However, precision was enhanced when one motion surface was prioritized, demonstrating that selective attention can improve recall precision. These results are consistent with a resource model that can be used as a general conceptual framework for understanding working memory across a range of visual features.
Electronics design of the RPC system for the OPERA muon spectrometer
NASA Astrophysics Data System (ADS)
Acquafredda, R.; Ambrosio, M.; Balsamo, E.; Barichello, G.; Bergnoli, A.; Consiglio, L.; Corradi, G.; dal Corso, F.; Felici, G.; Manea, C.; Masone, V.; Parascandolo, P.; Sorrentino, G.
2004-09-01
The present document describes the front-end electronics of the RPC system that instruments the magnet muon spectrometer of the OPERA experiment. The main task of the OPERA spectrometer is to provide particle tracking information for muon identification and to simplify the matching between the Precision Trackers. As no trigger has been foreseen for the experiment, the spectrometer electronics must be self-triggered with single-plane readout capability. Moreover, precision time information must be added within each event frame for off-line reconstruction. The readout electronics consists of three stages: the Front-End Board (FEB) system, the Controller Board (CB) system, and the Trigger Board (TB) system. The FEB system provides discrimination of the incoming strip signals; a FAST-OR output of the input signals is also available for trigger plane signal generation. FEB signals are acquired by the CB system, which provides the zero suppression and manages communication with the DAQ and Slow Control. A Trigger Board allows operation in either self-trigger mode (the FEBs' FAST-OR signal starts the plane acquisition) or external-trigger mode (different conditions can be set on the FAST-OR signals generated by different planes).
Influence of item distribution pattern and abundance on efficiency of benthic core sampling
Behney, Adam C.; O'Shaughnessy, Ryan; Eichholz, Michael W.; Stafford, Joshua D.
2014-01-01
Core sampling is a commonly used method to estimate benthic item density, but little information exists about factors influencing the accuracy and time-efficiency of this method. We simulated core sampling in a Geographic Information System framework by generating points (benthic items) and polygons (core samplers) to assess how sample size (number of core samples), core sampler size (cm2), distribution of benthic items, and item density affected the bias and precision of estimates of density, the detection probability of items, and the time costs. When items were distributed randomly versus clumped, bias decreased and precision increased with increasing sample size and increased slightly with increasing core sampler size. Bias and precision were only affected by benthic item density at very low values (500–1,000 items/m2). Detection probability (the probability of capturing ≥ 1 item in a core sample if it is available for sampling) was substantially greater when items were distributed randomly as opposed to clumped. Taking more small diameter core samples was always more time-efficient than taking fewer large diameter samples. We are unable to present a single, optimal sample size, but provide information for researchers and managers to derive optimal sample sizes dependent on their research goals and environmental conditions.
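The points-and-polygons simulation described above can be sketched as a simple Monte Carlo; the density, core size, and uniform placement below are illustrative assumptions, not the study's GIS implementation:

```python
import numpy as np

def core_density_estimate(true_density, n_cores, core_area_cm2, rng):
    """One simulated sampling session: items placed uniformly at random
    in a 1 m^2 plot, each circular core capturing the items inside its
    footprint. Returns the estimated density (items/m^2)."""
    n_items = rng.poisson(true_density)
    xy = rng.uniform(0.0, 1.0, size=(n_items, 2))    # item positions (m)
    r = np.sqrt(core_area_cm2 / 1e4 / np.pi)         # core radius (m)
    counts = []
    for _ in range(n_cores):
        cx, cy = rng.uniform(r, 1 - r, size=2)       # core centre
        d2 = (xy[:, 0] - cx) ** 2 + (xy[:, 1] - cy) ** 2
        counts.append(np.sum(d2 <= r ** 2))
    return np.mean(counts) / (np.pi * r ** 2)

rng = np.random.default_rng(1)
# Repeat sessions to examine bias and precision: 30 cores of 50 cm^2
# at a true density of 2,000 items/m^2 (all values illustrative).
estimates = [core_density_estimate(2000, 30, 50, rng) for _ in range(200)]
```

Repeating such sessions across sample sizes, core sizes, and clumped versus random point patterns reproduces the kind of bias/precision comparison the study reports.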
A general natural-language text processor for clinical radiology.
Friedman, C; Alderson, P O; Austin, J H; Cimino, J J; Johnson, S B
1994-01-01
OBJECTIVE: Development of a general natural-language processor that identifies clinical information in narrative reports and maps that information into a structured representation containing clinical terms. DESIGN: The natural-language processor provides three phases of processing, all of which are driven by different knowledge sources. The first phase performs the parsing. It identifies the structure of the text through use of a grammar that defines semantic patterns and a target form. The second phase, regularization, standardizes the terms in the initial target structure via a compositional mapping of multi-word phrases. The third phase, encoding, maps the terms to a controlled vocabulary. Radiology is the test domain for the processor and the target structure is a formal model for representing clinical information in that domain. MEASUREMENTS: The impression sections of 230 radiology reports were encoded by the processor. Results of an automated query of the resultant database for the occurrences of four diseases were compared with the analysis of a panel of three physicians to determine recall and precision. RESULTS: Without training specific to the four diseases, recall and precision of the system (combined effect of the processor and query generator) were 70% and 87%. Training of the query component increased recall to 85% without changing precision. PMID:7719797
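The recall and precision measures used in the evaluation follow the standard information-retrieval definitions; the report identifiers below are toy data chosen only to approximately mirror the reported 70%/87% operating point:

```python
def recall_precision(retrieved, relevant):
    """Recall: fraction of reference-standard occurrences retrieved.
    Precision: fraction of retrieved occurrences that are correct."""
    true_pos = len(retrieved & relevant)
    return true_pos / len(relevant), true_pos / len(retrieved)

# Toy report identifiers: 10 reports judged positive by the physician
# panel, 8 retrieved by the query, 7 of them correct.
relevant = {1, 2, 3, 4, 5, 6, 7, 8, 9, 10}
retrieved = {1, 2, 3, 4, 5, 6, 7, 11}
recall, precision = recall_precision(retrieved, relevant)  # 0.7, 0.875
```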
Holman, B W B; Alvarenga, T I R C; van de Ven, R J; Hopkins, D L
2015-07-01
The Warner-Bratzler shear force (WBSF) of 335 lamb m. longissimus lumborum (LL) caudal and cranial ends was measured to examine and simulate the effect of replicate number (r: 1-8) on the precision of mean WBSF estimates and to compare LL caudal and cranial end WBSF means. All LL were sourced from two experimental flocks as part of the Information Nucleus slaughter programme (CRC for Sheep Industry Innovation) and analysed using a Lloyd Texture analyser with a Warner-Bratzler blade attachment. WBSF data were natural logarithm (ln) transformed before statistical analysis. Mean ln(WBSF) precision improved as r increased; however, the practical implications support r = 6, as precision improves only marginally with additional replicates. Increasing LL sample replication yields better ln(WBSF) precision than increasing r, provided that sample replicates are taken from the same LL end. Cranial end mean WBSF was 11.2 ± 1.3% higher than the caudal end. Crown Copyright © 2015. Published by Elsevier Ltd. All rights reserved.
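The diminishing-returns argument for r = 6 follows from the 1/√r behavior of the standard error of a mean; a short sketch, using a hypothetical within-sample SD of ln(WBSF) rather than the study's fitted variance components:

```python
import math

# The standard error of a mean falls as 1/sqrt(r) with replicate
# number r, so each added replicate buys less precision than the last.
# sigma is a hypothetical within-sample SD, not the study's value.
sigma = 0.25
se = {r: sigma / math.sqrt(r) for r in range(1, 9)}

gain_5_to_6 = se[5] - se[6]   # precision gained adding a 6th replicate
gain_6_to_7 = se[6] - se[7]   # smaller gain from a 7th replicate
assert gain_6_to_7 < gain_5_to_6
```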
Hillyer, Grace Clarke; Schmitt, Karen M; Lizardo, Maria; Reyes, Andria; Bazan, Mercedes; Alvarez, Maria C; Sandoval, Rossy; Abdul, Kazeem; Orjuela, Manuela A
2017-04-01
Understanding key health concepts is crucial to participation in Precision Medicine initiatives. To assess methods for developing and disseminating a curriculum to educate community members in Northern Manhattan about Precision Medicine, clients of a local community-based organization were interviewed during 2014-2015. Health literacy, acculturation, use of Internet, email, and text messaging, and health information sources were assessed. Associations between age and outcomes were evaluated; multivariable analysis was used to examine the relationship between participant characteristics and sources of health information. Of 497 interviewed, 29.4 % had inadequate health literacy; 53.6 % had access to the Internet, 43.9 % to email, and 45.3 % to text messaging. Having adequate health literacy was associated with seeking information from a healthcare professional (OR 2.59, 95 % CI 1.54-4.35) and from the Internet (OR 3.15, 95 % CI 1.97-5.04); those with ≤ grade school education (OR 2.61, 95 % CI 1.32-5.17) also preferred information from their provider; persons >45 years (OR 0.29, 95 % CI 0.18-0.47) were less likely to use the Internet for health information and preferred printed media (OR 1.64, 95 % CI 1.07-2.50). Overall, electronic communication channel use was low and varied significantly by age, with those ≤45 years more likely to utilize electronic channels. Preferred sources of health information also varied by age, health literacy, and educational level. This study demonstrates that to effectively communicate key Precision Medicine concepts, curriculum development for Latino community members of Northern Manhattan will require attention to health literacy, language preference, and acculturation, and should incorporate more traditional communication channels for older community members.
Oulas, Anastasis; Minadakis, George; Zachariou, Margarita; Sokratous, Kleitos; Bourdakou, Marilena M; Spyrou, George M
2017-11-27
Systems Bioinformatics is a relatively new approach, which lies at the intersection of systems biology and classical bioinformatics. It integrates information across different levels, combining the bottom-up approach of systems biology with the data-driven, top-down approach of bioinformatics. The advent of omics technologies has provided the stepping-stone for the emergence of Systems Bioinformatics. These technologies provide a spectrum of information ranging from genomics, transcriptomics and proteomics to epigenomics, pharmacogenomics, metagenomics and metabolomics. Systems Bioinformatics is the framework in which systems approaches are applied to such data, setting the level of resolution as well as the boundary of the system of interest and studying the emerging properties of the system as a whole rather than the sum of the properties derived from the system's individual components. A key approach in Systems Bioinformatics is the construction of multiple networks representing each level of the omics spectrum and their integration in a layered network that exchanges information within and between layers. Here, we provide evidence on how Systems Bioinformatics enhances computational therapeutics and diagnostics, hence paving the way to precision medicine. The aim of this review is to familiarize the reader with the emerging field of Systems Bioinformatics and to provide a comprehensive overview of its current state-of-the-art methods and technologies. Moreover, we provide examples of success stories and case studies that utilize such methods and tools to significantly advance research in the fields of systems biology and systems medicine. © The Author 2017. Published by Oxford University Press.
Precision Neutron Time-of-Flight Detectors Provide Insight into NIF Implosion Dynamics
NASA Astrophysics Data System (ADS)
Schlossberg, David; Eckart, M. J.; Grim, G. P.; Hartouni, E. P.; Hatarik, R.; Moore, A. S.; Waltz, C. S.
2017-10-01
During inertial confinement fusion, higher-order moments of neutron time-of-flight (nToF) spectra can provide essential information for optimizing implosions. The nToF diagnostic suite at the National Ignition Facility (NIF) was recently upgraded to include novel quartz Cherenkov detectors. These detectors exploit the rapid Cherenkov radiation process, in contrast with conventional scintillator decay times, to provide high temporal-precision measurements that support higher-order moment analyses. Preliminary measurements have been made on the NIF during several implosions, and initial results are presented here. Measured line-of-sight asymmetries, for example in ion temperatures, will be discussed. Finally, further detector optimization is shown to broaden the accessible physics, with possibilities for energy discrimination, gamma-source identification, and further reduction in quartz response times. Work performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under contract DE-AC52-07NA27344.
Crew Office Evaluation of a Precision Lunar Landing System
NASA Technical Reports Server (NTRS)
Major, Laura M.; Duda, Kevin R.; Hirsh, Robert L.
2011-01-01
A representative Human System Interface for a precision lunar landing system, ALHAT, has been developed as a platform for prototype visualization and interaction concepts. This facilitates analysis of crew interaction with advanced sensors and AGNC systems. Human-in-the-loop evaluations with representatives from the Crew Office (i.e. astronauts) and Mission Operations Directorate (MOD) were performed to refine the crew role and information requirements during the final phases of landing. The results include a number of lessons learned from Shuttle that are applicable to the design of a human supervisory landing system and cockpit. Overall, the results provide a first order analysis of the tasks the crew will perform during lunar landing, an architecture for the Human System Interface based on these tasks, as well as details on the information needs to land safely.
NASA Astrophysics Data System (ADS)
Tamborini, D.; Portaluppi, D.; Villa, F.; Tisa, S.; Tosi, A.
2014-11-01
We present a Time-to-Digital Converter (TDC) card with a compact form factor, suitable for multichannel timing instruments or for integration into more complex systems. The TDC Card provides 10 ps timing resolution over the whole measurement range, which is selectable from 160 ns up to 10 μs, reaching 21 ps rms precision, 1.25% LSB rms differential nonlinearity, up to 3 Mconversion/s with 400 mW power consumption. The I/O edge card connector provides timing data readout through either a parallel bus or a 100 MHz serial interface and further measurement information like input signal rate and valid conversion rate (typically useful for time-correlated single-photon counting application) through an independent serial link.
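As a quick plausibility check on the stated specifications, 10 ps resolution over a range selectable up to 10 μs implies on the order of 2^20 distinct codes (illustrative arithmetic only; the card's internal converter architecture is not described in the abstract):

```python
import math

# Dynamic range implied by the stated TDC specifications.
resolution_s = 10e-12                 # 10 ps timing resolution
full_range_s = 10e-6                  # maximum selectable range, 10 us
codes = full_range_s / resolution_s   # ~1e6 distinct codes
bits = math.ceil(math.log2(codes))    # 20-bit timing word
assert bits == 20
```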
NHDPlusHR: A national geospatial framework for surface-water information
Viger, Roland; Rea, Alan H.; Simley, Jeffrey D.; Hanson, Karen M.
2016-01-01
The U.S. Geological Survey is developing a new geospatial hydrographic framework for the United States, called the National Hydrography Dataset Plus High Resolution (NHDPlusHR), that integrates a diversity of the best-available information, robustly supports ongoing dataset improvements, enables hydrographic generalization to derive alternate representations of the network while maintaining feature identity, and supports modern scientific computing and Internet accessibility needs. This framework is based on the High Resolution National Hydrography Dataset, the Watershed Boundaries Dataset, and elevation from the 3-D Elevation Program, and will provide an authoritative, high precision, and attribute-rich geospatial framework for surface-water information for the United States. Using this common geospatial framework will provide a consistent basis for indexing water information in the United States, eliminate redundancy, and harmonize access to, and exchange of water information.
Plantet, C; Meimon, S; Conan, J-M; Fusco, T
2015-11-02
Exoplanet direct imaging with large ground-based telescopes requires eXtreme Adaptive Optics that couples high-order adaptive optics and coronagraphy. A key element of such systems is the high-order wavefront sensor. We study here several high-order wavefront sensing approaches and, more precisely, compare their sensitivity to noise. Three techniques are considered: the classical Shack-Hartmann sensor, the pyramid sensor and the recently proposed LIFTed Shack-Hartmann sensor. They are compared in a unified framework based on precise diffractive models and on the Fisher information matrix, which conveys the information present in the data regardless of the estimation method. The diagonal elements of the inverse of the Fisher information matrix, which we use as a figure of merit, are similar to noise propagation coefficients. With these diagonal elements, so-called "Fisher coefficients", we show that the LIFTed Shack-Hartmann and pyramid sensors outperform the classical Shack-Hartmann sensor. In the photon-noise regime, the LIFTed Shack-Hartmann and modulated pyramid sensors show similar overall noise propagation. The LIFTed Shack-Hartmann sensor, however, provides attractive noise properties for the high orders.
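The figure of merit described above, the diagonal of the inverse Fisher information matrix, can be sketched for the additive-Gaussian-noise case, where F = JᵀJ/σ² with J the Jacobian of the measurement model; the random Jacobian below is a toy stand-in for the paper's diffractive sensor models:

```python
import numpy as np

def fisher_coefficients(jacobian, noise_var):
    """Diagonal of the inverse Fisher information matrix, read here as
    noise-propagation-like coefficients. Assumes additive Gaussian
    noise, for which F = J^T J / sigma^2 with J the Jacobian of the
    measurement model with respect to the estimated wavefront modes."""
    F = jacobian.T @ jacobian / noise_var
    return np.diag(np.linalg.inv(F))

# Toy random Jacobian standing in for a diffractive sensor model:
# 100 measurements, 5 estimated modes.
rng = np.random.default_rng(2)
J = rng.normal(size=(100, 5))
coeffs = fisher_coefficients(J, noise_var=0.1)
assert np.all(coeffs > 0)   # per-mode error variances are positive
```

Comparing such coefficients mode by mode across sensor models is the kind of comparison the paper performs with its "Fisher coefficients".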
Quantum technologies with hybrid systems
Kurizki, Gershon; Bertet, Patrice; Kubo, Yuimaru; Mølmer, Klaus; Petrosyan, David; Rabl, Peter; Schmiedmayer, Jörg
2015-01-01
An extensively pursued current direction of research in physics aims at the development of practical technologies that exploit the effects of quantum mechanics. As part of this ongoing effort, devices for quantum information processing, secure communication, and high-precision sensing are being implemented with diverse systems, ranging from photons, atoms, and spins to mesoscopic superconducting and nanomechanical structures. Their physical properties make some of these systems better suited than others for specific tasks; thus, photons are well suited for transmitting quantum information, weakly interacting spins can serve as long-lived quantum memories, and superconducting elements can rapidly process information encoded in their quantum states. A central goal of the envisaged quantum technologies is to develop devices that can simultaneously perform several of these tasks, namely, reliably store, process, and transmit quantum information. Hybrid quantum systems composed of different physical components with complementary functionalities may provide precisely such multitasking capabilities. This article reviews some of the driving theoretical ideas and first experimental realizations of hybrid quantum systems and the opportunities and challenges they present and offers a glance at the near- and long-term perspectives of this fascinating and rapidly expanding field. PMID:25737558
Quantum technologies with hybrid systems.
Kurizki, Gershon; Bertet, Patrice; Kubo, Yuimaru; Mølmer, Klaus; Petrosyan, David; Rabl, Peter; Schmiedmayer, Jörg
2015-03-31
An extensively pursued current direction of research in physics aims at the development of practical technologies that exploit the effects of quantum mechanics. As part of this ongoing effort, devices for quantum information processing, secure communication, and high-precision sensing are being implemented with diverse systems, ranging from photons, atoms, and spins to mesoscopic superconducting and nanomechanical structures. Their physical properties make some of these systems better suited than others for specific tasks; thus, photons are well suited for transmitting quantum information, weakly interacting spins can serve as long-lived quantum memories, and superconducting elements can rapidly process information encoded in their quantum states. A central goal of the envisaged quantum technologies is to develop devices that can simultaneously perform several of these tasks, namely, reliably store, process, and transmit quantum information. Hybrid quantum systems composed of different physical components with complementary functionalities may provide precisely such multitasking capabilities. This article reviews some of the driving theoretical ideas and first experimental realizations of hybrid quantum systems and the opportunities and challenges they present and offers a glance at the near- and long-term perspectives of this fascinating and rapidly expanding field.
Quantum technologies with hybrid systems
NASA Astrophysics Data System (ADS)
Kurizki, Gershon; Bertet, Patrice; Kubo, Yuimaru; Mølmer, Klaus; Petrosyan, David; Rabl, Peter; Schmiedmayer, Jörg
2015-03-01
An extensively pursued current direction of research in physics aims at the development of practical technologies that exploit the effects of quantum mechanics. As part of this ongoing effort, devices for quantum information processing, secure communication, and high-precision sensing are being implemented with diverse systems, ranging from photons, atoms, and spins to mesoscopic superconducting and nanomechanical structures. Their physical properties make some of these systems better suited than others for specific tasks; thus, photons are well suited for transmitting quantum information, weakly interacting spins can serve as long-lived quantum memories, and superconducting elements can rapidly process information encoded in their quantum states. A central goal of the envisaged quantum technologies is to develop devices that can simultaneously perform several of these tasks, namely, reliably store, process, and transmit quantum information. Hybrid quantum systems composed of different physical components with complementary functionalities may provide precisely such multitasking capabilities. This article reviews some of the driving theoretical ideas and first experimental realizations of hybrid quantum systems and the opportunities and challenges they present and offers a glance at the near- and long-term perspectives of this fascinating and rapidly expanding field.
The String Stability of a Trajectory-Based Interval Management Algorithm in the Midterm Airspace
NASA Technical Reports Server (NTRS)
Swieringa, Kurt A.
2015-01-01
NASA's first Air Traffic Management (ATM) Technology Demonstration (ATD-1) was created to facilitate the transition of mature ATM technologies from the laboratory to operational use. The technologies selected for demonstration are the Traffic Management Advisor with Terminal Metering (TMA-TM), which provides precise time-based scheduling in the terminal airspace; Controller Managed Spacing (CMS), which provides terminal controllers with decision support tools enabling precise schedule conformance; and Interval Management (IM), which consists of flight deck automation that enables aircraft to achieve or maintain a precise spacing interval behind a target aircraft. As the percentage of IM-equipped aircraft increases, controllers may provide IM clearances to sequences, or strings, of IM-equipped aircraft. It is important for these strings to maintain stable performance. This paper describes an analytical study of the string stability of the latest version of NASA's IM algorithm and a fast-time simulation designed to characterize the string performance of the IM algorithm. The analytical study showed that the spacing algorithm has stable poles, indicating that a spacing error perturbation will be reduced as a function of string position. The fast-time simulation investigated IM operations at two airports using constraints associated with the midterm airspace, including limited information about the target aircraft's intended speed profile and limited information about the wind forecast on the target aircraft's route. The results of the fast-time simulation demonstrated that the performance of the spacing algorithm is acceptable for strings of moderate length; however, there is some degradation in IM performance as a function of string position.
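The pole-stability argument can be illustrated with a toy discrete-time model in which each follower filters its predecessor's spacing error through the same stable first-order system; the pole and gain values are hypothetical, not those of NASA's IM algorithm:

```python
import numpy as np

# Each follower filters its predecessor's spacing error through the
# same stable first-order system (pole at 0.6, inside the unit circle).
a, b = 0.6, 0.3

def propagate(error_in):
    """Spacing error of the next aircraft in the string, given the
    predecessor's error sequence."""
    x, out = 0.0, []
    for e in error_in:
        x = a * x + b * e   # stable pole: past errors decay
        out.append(x)
    return np.array(out)

impulse = np.zeros(50)
impulse[0] = 1.0                 # perturbation of the lead aircraft
e1 = propagate(impulse)          # error of the first follower
e2 = propagate(e1)               # error further down the string
assert e2.max() < e1.max()       # perturbation shrinks with position
```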
Glacier and Ice Shelves Studies Using Satellite SAR Interferometry
NASA Technical Reports Server (NTRS)
Rignot, Eric
1999-01-01
Satellite radar interferometry is a powerful technique to measure the surface velocity and topography of glacier ice. On ice shelves, a quadruple difference technique separates tidal motion from the steady creep flow deformation of ice. The results provide a wealth of information about glacier grounding lines, mass fluxes, stability, elastic properties of ice, and tidal regime. The grounding line, which is where the glacier detaches from its bed and becomes afloat, is detected with a precision of a few tens of meters. Combining this information with satellite radar altimetry makes it possible to measure glacier discharge into the ocean and state of mass balance with greater precision than ever before, and in turn provide a significant revision of past estimates of mass balance of the Greenland and Antarctic Ice Sheets. Analysis of creep rates on floating ice permits an estimation of basal melting at the ice shelf underside. The results reveal that the action of ocean water in sub-ice-shelf cavities has been largely underestimated by oceanographic models and is the dominant mode of mass release to the ocean from an ice shelf. Precise mapping of grounding line positions also permits the detection of grounding line migration, which is a fine indicator of glacier change, independent of our knowledge of snow accumulation and ice melting. This technique has been successfully used to detect the rapid retreat of Pine Island Glacier, the largest ice stream in West Antarctica. Finally, tidal motion of ice shelves measured interferometrically provides a modern, synoptic view of the physical processes which govern the formation of tabular icebergs in the Antarctic.
Motion compensation using origin ensembles in awake small animal positron emission tomography
NASA Astrophysics Data System (ADS)
Gillam, John E.; Angelis, Georgios I.; Kyme, Andre Z.; Meikle, Steven R.
2017-02-01
In emission tomographic imaging, the stochastic origin ensembles algorithm provides unique information regarding the detected counts given the measured data. Precision in both voxel and region-wise parameters may be determined for a single data set based on the posterior distribution of the count density allowing uncertainty estimates to be allocated to quantitative measures. Uncertainty estimates are of particular importance in awake animal neurological and behavioral studies for which head motion, unique for each acquired data set, perturbs the measured data. Motion compensation can be conducted when rigid head pose is measured during the scan. However, errors in pose measurements used for compensation can degrade the data and hence quantitative outcomes. In this investigation motion compensation and detector resolution models were incorporated into the basic origin ensembles algorithm and an efficient approach to computation was developed. The approach was validated against maximum-likelihood expectation maximisation and tested using simulated data. The resultant algorithm was then used to analyse quantitative uncertainty in regional activity estimates arising from changes in pose measurement precision. Finally, the posterior covariance acquired from a single data set was used to describe correlations between regions of interest providing information about pose measurement precision that may be useful in system analysis and design. The investigation demonstrates the use of origin ensembles as a powerful framework for evaluating statistical uncertainty of voxel and regional estimates. While in this investigation rigid motion was considered in the context of awake animal PET, the extension to arbitrary motion may provide clinical utility where respiratory or cardiac motion perturbs the measured data.
Rothschild, Adam S.; Lehmann, Harold P.
2005-01-01
Objective: The aim of this study was to preliminarily determine the feasibility of probabilistically generating problem-specific computerized provider order entry (CPOE) pick-lists from a database of explicitly linked orders and problems from actual clinical cases. Design: In a pilot retrospective validation, physicians reviewed internal medicine cases consisting of the admission history and physical examination and orders placed using CPOE during the first 24 hours after admission. They created coded problem lists and linked orders from individual cases to the problem for which they were most indicated. Problem-specific order pick-lists were generated by including a given order in a pick-list if the probability of linkage of order and problem (PLOP) equaled or exceeded a specified threshold. PLOP for a given linked order-problem pair was computed as its prevalence among the other cases in the experiment with the given problem. The orders that the reviewer linked to a given problem instance served as the reference standard to evaluate its system-generated pick-list. Measurements: Recall, precision, and length of the pick-lists. Results: Average recall reached a maximum of .67 with a precision of .17 and pick-list length of 31.22 at a PLOP threshold of 0. Average precision reached a maximum of .73 with a recall of .09 and pick-list length of .42 at a PLOP threshold of .9. Recall varied inversely with precision in classic information retrieval behavior. Conclusion: We preliminarily conclude that it is feasible to generate problem-specific CPOE pick-lists probabilistically from a database of explicitly linked orders and problems. Further research is necessary to determine the usefulness of this approach in real-world settings. PMID:15684134
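The PLOP thresholding rule described above is straightforward to sketch; the toy order/problem data and flat data model below are illustrative, not the study's schema:

```python
from collections import Counter

def picklist(problem_cases, threshold):
    """Build a CPOE pick-list: include an order if its PLOP (its
    prevalence among cases sharing the problem) meets the threshold.
    `problem_cases` maps case id -> set of orders linked to the
    problem in that case (a toy data model, not the study's schema)."""
    n = len(problem_cases)
    counts = Counter(o for orders in problem_cases.values() for o in orders)
    return {o for o, c in counts.items() if c / n >= threshold}

# Hypothetical cases sharing one problem, with their linked orders.
cases = {1: {"cbc", "chest xray"}, 2: {"cbc"}, 3: {"cbc", "blood cultures"}}
assert picklist(cases, 0.9) == {"cbc"}         # high threshold: short list
assert "chest xray" in picklist(cases, 0.3)    # low threshold: longer list
```

Sweeping the threshold from 0 to 0.9 trades recall against precision and pick-list length, which is the behavior the study reports.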
Li, Xiang-Yao; Wang, Ning; Wang, Yong-Jie; Zuo, Zhen-Xing; Koga, Kohei; Luo, Fei
2014-01-01
Temporal properties of spike firing in the central nervous system (CNS) are critical for neuronal coding and the precision of information storage. Chronic pain has been reported to affect cognitive and emotional functions, in addition to triggering long-term plasticity in sensory synapses and behavioral sensitization. Less is known about possible changes in the temporal precision of cortical neurons in chronic pain conditions. In the present study, we investigated the temporal precision of action potential firing in the anterior cingulate cortex (ACC) by using both in vivo and in vitro electrophysiological approaches. We found that peripheral inflammation caused by complete Freund's adjuvant (CFA) increased the standard deviation (SD) of spike latency (also called jitter) in ∼51% of recorded neurons in the ACC of adult rats in vivo. Similar increases in jitter were found in ACC neurons using in vitro brain slices from adult mice with peripheral inflammation or nerve injury. Bath application of the glutamate receptor antagonists CNQX and AP5 abolished the enhancement of jitter induced by CFA injection or nerve injury, suggesting that the increased jitter depends on glutamatergic synaptic transmission. Activation of adenylyl cyclases (ACs) by bath application of forskolin increased jitter, whereas genetic deletion of AC1 abolished the change in jitter caused by CFA inflammation. Our study provides strong evidence for long-term changes in the temporal precision of information coding in cortical neurons after peripheral injury and suggests a neuronal mechanism for the cognitive and emotional impairment caused by chronic pain. PMID:25100600
Computational advances towards linking BOLD and behavior.
Serences, John T; Saproo, Sameer
2012-03-01
Traditionally, fMRI studies have focused on analyzing the mean response amplitude within a cortical area. However, the mean response is blind to many important patterns of cortical modulation, which severely limits the formulation and evaluation of linking hypotheses between neural activity, BOLD responses, and behavior. More recently, multivariate pattern classification analysis (MVPA) has been applied to fMRI data to evaluate the information content of spatially distributed activation patterns. This approach has been remarkably successful at detecting the presence of specific information in targeted brain regions, and provides an extremely flexible means of extracting that information without a precise generative model for the underlying neural activity. However, this flexibility comes at a cost: since MVPA relies on pooling information across voxels that are selective for many different stimulus attributes, it is difficult to infer how specific subsets of tuned neurons are modulated by an experimental manipulation. In contrast, recently developed encoding models can produce more precise estimates of feature-selective tuning functions, and can support the creation of explicit linking hypotheses between neural activity and behavior. Although these encoding models depend on strong, and often untested, assumptions about the response properties of underlying neural generators, they also provide a unique opportunity to evaluate population-level computational theories of perception and cognition that have previously been difficult to assess using either single-unit recording or conventional neuroimaging techniques. Copyright © 2011. Published by Elsevier Ltd.
Wang, Jianming; Ke, Chunlei; Yu, Zhinuan; Fu, Lei; Dornseif, Bruce
2016-05-01
For clinical trials with time-to-event endpoints, predicting the accrual of the events of interest with precision is critical in determining the timing of interim and final analyses. For example, overall survival (OS) is often chosen as the primary efficacy endpoint in oncology studies, with planned interim and final analyses at a pre-specified number of deaths. Often, correlated surrogate information, such as time-to-progression (TTP) and progression-free survival, is also collected as secondary efficacy endpoints. It would be appealing to borrow strength from the surrogate information to improve the precision of the analysis time prediction. Currently available methods in the literature for predicting analysis timings do not consider utilizing the surrogate information. In this article, using OS and TTP as an example, a general parametric model for OS and TTP is proposed, with the assumption that disease progression could change the course of the overall survival. Progression-free survival, related both to OS and TTP, will be handled separately, as it can be derived from OS and TTP. The authors seek to develop a prediction procedure using a Bayesian method and provide detailed implementation strategies under certain assumptions. Simulations are performed to evaluate the performance of the proposed method. An application to a real study is also provided. Copyright © 2015 John Wiley & Sons, Ltd.
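The core structural assumption here (progression changes the subsequent death hazard) can be illustrated with a plain Monte Carlo sketch of event-accrual prediction. All hazard values, sample sizes, and helper names below are invented, and the paper's actual procedure is Bayesian rather than this simple simulation:

```python
import random
import statistics

random.seed(4)

# Hypothetical illness-death model (per-month hazards): pre-progression death
# hazard H1, progression hazard HP, and an elevated post-progression hazard H2.
H1, HP, H2 = 0.01, 0.05, 0.04

def simulate_os():
    """One overall-survival time (months) where progression changes the OS course."""
    t_death_pre = random.expovariate(H1)
    t_prog = random.expovariate(HP)
    if t_death_pre <= t_prog:
        return t_death_pre                     # died before progressing
    return t_prog + random.expovariate(H2)     # post-progression survival

def predict_analysis_time(n_patients=300, target_deaths=100,
                          accrual_months=12, n_sims=200):
    """Monte Carlo prediction of the calendar time when the target death occurs."""
    times = []
    for _ in range(n_sims):
        # uniform staggered entry plus simulated OS gives calendar death times
        deaths = sorted(random.uniform(0, accrual_months) + simulate_os()
                        for _ in range(n_patients))
        times.append(deaths[target_deaths - 1])
    return statistics.median(times)

t_pred = predict_analysis_time()  # predicted month of the 100th death
```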
A Police and Insurance Joint Management System Based on High Precision BDS/GPS Positioning
Zuo, Wenwei; Guo, Chi; Liu, Jingnan; Peng, Xuan; Yang, Min
2018-01-01
Car ownership in China reached 194 million vehicles at the end of 2016. The traffic congestion index (TCI) exceeds 2.0 during rush hour in some cities. Inefficient processing for minor traffic accidents is considered to be one of the leading causes for road traffic jams. Meanwhile, the process after an accident is quite troublesome. The main reason is that it is almost always impossible to get the complete chain of evidence when the accident happens. Accordingly, a police and insurance joint management system is developed which is based on high precision BeiDou Navigation Satellite System (BDS)/Global Positioning System (GPS) positioning to process traffic accidents. First of all, an intelligent vehicle rearview mirror terminal is developed. The terminal applies a commonly used consumer electronic device with single frequency navigation. Based on the high precision BDS/GPS positioning algorithm, its accuracy can reach sub-meter level in the urban areas. More specifically, a kernel driver is built to realize the high precision positioning algorithm in an Android HAL layer. Thus the third-party application developers can call the general location Application Programming Interface (API) of the original standard Global Navigation Satellite System (GNSS) to get high precision positioning results. Therefore, the terminal can provide lane level positioning service for car users. Next, a remote traffic accident processing platform is built to provide big data analysis and management. According to the big data analysis of information collected by BDS high precision intelligent sense service, vehicle behaviors can be obtained. The platform can also automatically match and screen the data uploaded after an accident to achieve accurate reproduction of the scene. Thus, it helps traffic police and insurance personnel to complete remote responsibility identification and survey for the accident.
Thirdly, a rapid processing flow is established in this article to meet the requirements to quickly handle traffic accidents. The traffic police can remotely identify accident responsibility and the insurance personnel can remotely survey an accident. Moreover, the police and insurance joint management system has been carried out in Wuhan, Central China’s Hubei Province, and Wuxi, Eastern China’s Jiangsu Province. In short, a system is developed to obtain and analyze multisource data including precise positioning and visual information, and a solution is proposed for efficient processing of traffic accidents. PMID:29320406
Naive Beliefs in Baseball: Systematic Distortion in Perceived Time of Apex for Fly Balls
ERIC Educational Resources Information Center
Shaffer, Dennis M.; McBeath, Michael K.
2005-01-01
When fielders catch fly balls they use geometric properties to optically maintain control over the ball. The strategy provides ongoing guidance without indicating precise positional information concerning where the ball is located in space. Here, the authors show that observers have striking misconceptions about what the motion of projectiles…
Commentary: Can This Evaluation Be Saved?
ERIC Educational Resources Information Center
Ginsberg, Pauline E.
2004-01-01
Can this evaluation be saved? More precisely, can this evaluation be saved in such a way that both evaluator and client feel satisfied that their points of view were respected and both agree that the evaluation itself provides valid information obtained in a principled manner? Because the scenario describes a preliminary discussion and no contract…
Applying Bootstrap Resampling to Compute Confidence Intervals for Various Statistics with R
ERIC Educational Resources Information Center
Dogan, C. Deha
2017-01-01
Background: Most of the studies in academic journals use p values to represent statistical significance. However, this is not a good indicator of practical significance. Although confidence intervals provide information about the precision of point estimation, they are, unfortunately, rarely used. The infrequent use of confidence intervals might…
Reporting Confidence Intervals and Effect Sizes: Collecting the Evidence
ERIC Educational Resources Information Center
Zientek, Linda Reichwein; Ozel, Z. Ebrar Yetkiner; Ozel, Serkan; Allen, Jeff
2012-01-01
Confidence intervals (CIs) and effect sizes are essential to encourage meta-analytic thinking and to accumulate research findings. CIs provide a range of plausible values for population parameters with a degree of confidence that the parameter is in that particular interval. CIs also give information about how precise the estimates are. Comparison…
The Use of Foreign Languages in Tourism: Research Needs.
ERIC Educational Resources Information Center
Watts, Noel
1994-01-01
Examines the research needs relative to the use of foreign languages in tourism activities in Australia and New Zealand. Findings indicate a lack of precise information on the ways in which the tourism industry in these countries provides appropriate language assistance to non-English speaking inbound visitors. Suggestions for future research are…
Estimating Standardized Linear Contrasts of Means with Desired Precision
ERIC Educational Resources Information Center
Bonett, Douglas G.
2009-01-01
L. Wilkinson and the Task Force on Statistical Inference (1999) recommended reporting confidence intervals for measures of effect sizes. If the sample size is too small, the confidence interval may be too wide to provide meaningful information. Recently, K. Kelley and J. R. Rausch (2006) used an iterative approach to computer-generate tables of…
49 CFR 395.16 - Electronic on-board recording devices.
Code of Federal Regulations, 2011 CFR
2011-10-01
... “sufficiently precise,” for purposes of this paragraph means the nearest city, town or village. (3) When the CMV... driving, and where released from work), the name of the nearest city, town, or village, with State... password) that identifies the driver or to provide other information (such as smart cards, biometrics) that...
Confidence Intervals for Effect Sizes: Applying Bootstrap Resampling
ERIC Educational Resources Information Center
Banjanovic, Erin S.; Osborne, Jason W.
2016-01-01
Confidence intervals for effect sizes (CIES) provide readers with an estimate of the strength of a reported statistic as well as the relative precision of the point estimate. These statistics offer more information and context than null hypothesis statistic testing. Although confidence intervals have been recommended by scholars for many years,…
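A percentile bootstrap confidence interval for a standardized effect size (Cohen's d), the kind of CIES this record describes, can be sketched in a few lines. The data below are invented, and the article itself works in R; this Python sketch is only an analogue:

```python
import random
import statistics

def cohens_d(a, b):
    """Standardized mean difference with a pooled standard deviation."""
    na, nb = len(a), len(b)
    pooled_var = ((na - 1) * statistics.variance(a)
                  + (nb - 1) * statistics.variance(b)) / (na + nb - 2)
    return (statistics.mean(a) - statistics.mean(b)) / pooled_var ** 0.5

def bootstrap_ci(a, b, stat=cohens_d, n_boot=2000, alpha=0.05, seed=1):
    """Percentile bootstrap CI: resample each group with replacement."""
    rng = random.Random(seed)
    boots = []
    for _ in range(n_boot):
        ra = [rng.choice(a) for _ in a]
        rb = [rng.choice(b) for _ in b]
        boots.append(stat(ra, rb))
    boots.sort()
    lo = boots[int((alpha / 2) * n_boot)]
    hi = boots[int((1 - alpha / 2) * n_boot) - 1]
    return lo, hi

# Hypothetical two-group data with a clear difference
group1 = [5.1, 6.2, 5.8, 7.0, 6.5, 5.9, 6.8, 6.1]
group2 = [4.2, 5.0, 4.8, 5.5, 4.6, 5.1, 4.9, 4.4]
lo, hi = bootstrap_ci(group1, group2)
```

The width of `(lo, hi)` is what conveys the precision of the point estimate that a bare p value hides.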
Confidence Intervals for Weighted Composite Scores under the Compound Binomial Error Model
ERIC Educational Resources Information Center
Kim, Kyung Yong; Lee, Won-Chan
2018-01-01
Reporting confidence intervals with test scores helps test users make important decisions about examinees by providing information about the precision of test scores. Although a variety of estimation procedures based on the binomial error model are available for computing intervals for test scores, these procedures assume that items are randomly…
Emerging Themes in Image Informatics and Molecular Analysis for Digital Pathology.
Bhargava, Rohit; Madabhushi, Anant
2016-07-11
Pathology is essential for research in disease and development, as well as for clinical decision making. For more than 100 years, pathology practice has involved analyzing images of stained, thin tissue sections by a trained human using an optical microscope. Technological advances are now driving major changes in this paradigm toward digital pathology (DP). The digital transformation of pathology goes beyond recording, archiving, and retrieving images, providing new computational tools to inform better decision making for precision medicine. First, we discuss some emerging innovations in both computational image analytics and imaging instrumentation in DP. Second, we discuss molecular contrast in pathology. Molecular DP has traditionally been an extension of pathology with molecularly specific dyes. Label-free, spectroscopic images are rapidly emerging as another important information source, and we describe the benefits and potential of this evolution. Third, we describe multimodal DP, which is enabled by computational algorithms and combines the best characteristics of structural and molecular pathology. Finally, we provide examples of application areas in telepathology, education, and precision medicine. We conclude by discussing challenges and emerging opportunities in this area.
Technology-enabled Airborne Spacing and Merging
NASA Technical Reports Server (NTRS)
Hull, James; Barmore, Bryan; Abbott, Terence
2005-01-01
Over the last several decades, advances in airborne and groundside technologies have allowed the Air Traffic Service Provider (ATSP) to give safer and more efficient service, reduce workload and frequency congestion, and help accommodate a critically escalating traffic volume. These new technologies have included advanced radar displays, and data and communication automation to name a few. In step with such advances, NASA Langley is developing a precision spacing concept designed to increase runway throughput by enabling the flight crews to manage their inter-arrival spacing from TRACON entry to the runway threshold. This concept is being developed as part of NASA's Distributed Air/Ground Traffic Management (DAG-TM) project under the Advanced Air Transportation Technologies Program. Precision spacing is enabled by Automatic Dependent Surveillance-Broadcast (ADS-B), which provides air-to-air data exchange including position and velocity reports, real-time wind information and other necessary data. On the flight deck, a research prototype system called Airborne Merging and Spacing for Terminal Arrivals (AMSTAR) processes this information and provides speed guidance to the flight crew to achieve the desired inter-arrival spacing. AMSTAR is designed to support current ATC operations, provide operationally acceptable system-wide increases in approach spacing performance and increase runway throughput through system stability, predictability and precision spacing. This paper describes problems and costs associated with an imprecise arrival flow. It also discusses methods by which Air Traffic Controllers achieve and maintain an optimum inter-arrival interval, and explores means by which AMSTAR can assist in this pursuit. AMSTAR is an extension of NASA's previous work on in-trail spacing that was successfully demonstrated in a flight evaluation at Chicago O'Hare International Airport in September 2002.
In addition to providing for precision inter-arrival spacing, AMSTAR provides speed guidance for aircraft on converging routes to safely and smoothly merge onto a common approach. Much consideration has been given to working with operational conditions such as imperfect ADS-B data, wind prediction errors, changing winds, differing aircraft types and wake vortex separation requirements. A series of Monte Carlo simulations are planned for the spring and summer of 2004 at NASA Langley to further study the system behavior and performance under more operationally extreme and varying conditions. This will coincide with a human-in-the-loop study to investigate the flight crew interface, workload and acceptability.
Emulating JWST Exoplanet Transit Observations in a Testbed laboratory experiment
NASA Astrophysics Data System (ADS)
Touli, D.; Beichman, C. A.; Vasisht, G.; Smith, R.; Krist, J. E.
2014-12-01
The transit technique is used for the detection and characterization of exoplanets. The combination of transit and radial velocity (RV) measurements gives information about a planet's radius and mass, respectively, leading to an estimate of the planet's density (Borucki et al. 2011) and therefore to its composition and evolutionary history. Transit spectroscopy can provide information on atmospheric composition and structure (Fortney et al. 2013). Spectroscopic observations of individual planets have revealed atomic and molecular species such as H2O, CO2 and CH4 in atmospheres of planets orbiting bright stars, e.g. Deming et al. (2013). The transit observations require extremely precise photometry. For instance, a Jupiter transit results in a 1% brightness decrease of a solar-type star, while the Earth causes only a 0.0084% decrease (84 ppm). Spectroscopic measurements require still greater precision (<30 ppm). The Precision Projector Laboratory (PPL) is a collaboration between the Jet Propulsion Laboratory (JPL) and the California Institute of Technology (Caltech) to characterize and validate detectors through emulation of science images. At PPL we have developed a testbed to project simulated spectra and other images onto a HgCdTe array in order to assess precision photometry for transits, weak lensing, etc. for mission concepts like JWST, WFIRST, and EUCLID. In our controlled laboratory experiment, the goal is to demonstrate the ability to extract weak transit spectra as expected for NIRCam, NIRISS and NIRSpec. Two lamps of variable intensity, along with spectral line and photometric simulation masks, emulate the signals from a star only, from a planet only and, finally, from a combination of a planet + star. Three masks have been used to simulate spectra in monochromatic light. These masks, which are fabricated at JPL, have a length of 1000 pixels and widths of 2 pixels, 10 pixels and 1 pixel to correspond respectively to the JWST instruments noted above.
From many-hour long observing sequences, we obtain time series photometry with deliberate offsets introduced to test sensitivity to pointing jitter and other effects. We can modify the star-planet brightness contrast by factors up to 10^4:1. With cross-correlation techniques we calculate positional shifts which are then used to decorrelate the effects of vertical and lateral offsets due to turbulence and instrumental vibrations on the photometry. Using Principal Component Analysis (PCA), we reject correlated temporal noise to achieve a precision lower than 50 ppm (Clanton et al. 2012). In our current work, after decorrelation of vertical and lateral offsets along with PCA, we achieve a precision of ∼20 ppm. To assess the photometric precision we use the Allan variance (Allan 1987). This statistical method is used to characterize noise and stability, as it indicates shot-noise-limited performance. Testbed experiments are ongoing to provide quantitative information on the achievable spectroscopic precision using realistic exoplanet spectra, with the goal of defining optimized data acquisition sequences for use, for example, with the James Webb Space Telescope.
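The Allan variance used above to assess photometric stability has a compact non-overlapping form: bin the series into blocks of m samples and take half the mean squared difference of successive bin means. For pure white (shot-limited) noise the Allan deviation should fall as 1/√m. A sketch on synthetic data (the series length and noise level are invented):

```python
import numpy as np

def allan_deviation(series, m):
    """Non-overlapping Allan deviation for bin size m samples."""
    n = len(series) // m
    bins = series[: n * m].reshape(n, m).mean(axis=1)       # bin averages
    return np.sqrt(0.5 * np.mean(np.diff(bins) ** 2))       # half mean sq. successive diff

rng = np.random.default_rng(2)
flux = 1.0 + 1e-4 * rng.standard_normal(100_000)  # white noise, ~100 ppm per sample

# For white noise, averaging 100 samples should cut the Allan deviation ~10x
ad1 = allan_deviation(flux, 1)
ad100 = allan_deviation(flux, 100)
```

A flattening of the Allan deviation at large m, instead of the 1/√m decline seen here, would signal correlated noise of the kind the PCA step is meant to remove.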
Evaluation of the Archaeological Data Base, Coralville Lake, Iowa.
1987-05-01
of Ms. Debby Zieglowsky, often provided the precise information necessary to facilitate field relocation. In addition, review of the site forms and...prehistory. Again, Ms. Debby Zieglowsky, Dr. Joseph Tiffany, and Dr. Duane Anderson were quite helpful in securing access to these collections, providing...drained. Vegetation is deciduous forest of oak, hickory, birch, a few maple and several large cedar trees. The area sampled was well above (50’, 15m
D'Avolio, Leonard W; Nguyen, Thien M; Goryachev, Sergey; Fiore, Louis D
2011-01-01
Despite at least 40 years of promising empirical performance, very few clinical natural language processing (NLP) or information extraction systems currently contribute to medical science or care. The authors address this gap by reducing the need for custom software and rules development with a graphical user interface-driven, highly generalizable approach to concept-level retrieval. A 'learn by example' approach combines features derived from open-source NLP pipelines with open-source machine learning classifiers to automatically and iteratively evaluate top-performing configurations. The Fourth i2b2/VA Shared Task Challenge's concept extraction task provided the data sets and metrics used to evaluate performance. Top F-measure scores for each of the tasks were medical problems (0.83), treatments (0.82), and tests (0.83). Recall lagged precision in all experiments. Precision was near or above 0.90 in all tasks. With no customization for the tasks and less than 5 min of end-user time to configure and launch each experiment, the average F-measure was 0.83, one point behind the mean F-measure of the 22 entrants in the competition. Strong precision scores indicate the potential of applying the approach for more specific clinical information extraction tasks. There was not one best configuration, supporting an iterative approach to model creation. Acceptable levels of performance can be achieved using fully automated and generalizable approaches to concept-level information extraction. The described implementation and related documentation are available for download.
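The reported pattern, precision near 0.90 with recall lagging behind, follows directly from the standard precision/recall/F-measure definitions. A minimal sketch with invented true-positive, false-positive, and false-negative counts (not the study's actual tallies):

```python
def prf(tp, fp, fn):
    """Precision, recall, and F1 from extraction counts."""
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    f1 = (2 * precision * recall / (precision + recall)
          if precision + recall else 0.0)
    return precision, recall, f1

# Hypothetical counts echoing the abstract's pattern: high precision, lower recall
p, r, f1 = prf(tp=830, fp=75, fn=215)
```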
NASA Technical Reports Server (NTRS)
Joiner, J.; Guanter, L.; Lindstrot, R.; Voigt, M.; Vasilkov, A. P.; Middleton, E. M.; Huemmrich, K. F.; Yoshida, Y.; Frankenberg, C.
2013-01-01
Globally mapped terrestrial chlorophyll fluorescence retrievals are of high interest because they can provide information on the functional status of vegetation including light-use efficiency and global primary productivity that can be used for global carbon cycle modeling and agricultural applications. Previous satellite retrievals of fluorescence have relied solely upon the filling-in of solar Fraunhofer lines that are not significantly affected by atmospheric absorption. Although these measurements provide near-global coverage on a monthly basis, they suffer from relatively low precision and sparse spatial sampling. Here, we describe a new methodology to retrieve global far-red fluorescence information; we use hyperspectral data with a simplified radiative transfer model to disentangle the spectral signatures of three basic components: atmospheric absorption, surface reflectance, and fluorescence radiance. An empirically based principal component analysis approach is employed, primarily using cloudy data over ocean, to model and solve for the atmospheric absorption. Through detailed simulations, we demonstrate the feasibility of the approach and show that moderate-spectral-resolution measurements with a relatively high signal-to-noise ratio can be used to retrieve far-red fluorescence information with good precision and accuracy. The method is then applied to data from the Global Ozone Monitoring Instrument 2 (GOME-2). The GOME-2 fluorescence retrievals display similar spatial structure as compared with those from a simpler technique applied to the Greenhouse gases Observing SATellite (GOSAT). GOME-2 enables global mapping of far-red fluorescence with higher precision over smaller spatial and temporal scales than is possible with GOSAT. Near-global coverage is provided within a few days. We are able to show clearly for the first time physically plausible variations in fluorescence over the course of a single month at a spatial resolution of 0.5 deg × 0.5 deg. 
We also show some significant differences between fluorescence and coincident normalized difference vegetation indices (NDVI) retrievals.
Precision Attitude Control for the BETTII Balloon-Borne Interferometer
NASA Technical Reports Server (NTRS)
Benford, Dominic J.; Fixsen, Dale J.; Rinehart, Stephen
2012-01-01
The Balloon Experimental Twin Telescope for Infrared Interferometry (BETTII) is an 8-meter baseline far-infrared interferometer to fly on a high-altitude balloon. Operating at wavelengths of 30-90 microns, BETTII will obtain spatial and spectral information on science targets at angular resolutions down to less than half an arcsecond, a capability unmatched by other far-infrared facilities. This requires attitude control at a level of less than a tenth of an arcsecond, a great challenge for a lightweight balloon-borne system. We have designed a precision attitude determination system to provide gondola attitude knowledge at a level of 2 milliarcseconds at rates up to 100 Hz, with accurate absolute attitude determination at the half-arcsecond level at rates of up to 10 Hz. A multi-stage control system involving rigid body motion and tip-tilt-piston correction provides precision pointing stability at the level required for the far-infrared instrument to perform its spatial/spectral interferometry in open-loop control. We present key aspects of the design of the attitude determination and control system and its development status.
Madison, Guy
2014-03-01
Timing performance becomes less precise for longer intervals, which makes it difficult to achieve simultaneity in synchronisation with a rhythm. The metrical structure of music, characterised by hierarchical levels of binary or ternary subdivisions of time, may function to increase precision by providing additional timing information when the subdivisions are explicit. This hypothesis was tested by comparing synchronisation performance across different numbers of metrical levels conveyed by loudness of sounds, such that the slowest level was loudest and the fastest was softest. Fifteen participants moved their hand with one of 9 inter-beat intervals (IBIs) ranging from 524 to 3,125 ms in 4 metrical level (ML) conditions ranging from 1 (one movement for each sound) to 4 (one movement for every 8th sound). The lowest relative variability (SD/IBI < 1.5%) was obtained for the 3 longest IBIs (1,600-3,125 ms) and MLs 3-4, significantly less than the smallest value (4-5% at 524-1,024 ms) for any ML 1 condition in which all sounds are identical. Asynchronies were also more negative with higher ML. In conclusion, metrical subdivision provides information that facilitates temporal performance, which suggests an underlying neural multi-level mechanism capable of integrating information across levels. © 2013.
Coupled Integration of CSAC, MIMU, and GNSS for Improved PNT Performance
Ma, Lin; You, Zheng; Liu, Tianyi; Shi, Shuai
2016-01-01
Positioning, navigation, and timing (PNT) is a strategic key technology widely used in military and civilian applications. Global navigation satellite systems (GNSS) are the most important PNT techniques. However, the vulnerability of GNSS threatens PNT service quality, and integrations with other information are necessary. A chip scale atomic clock (CSAC) provides high-precision frequency and high-accuracy time information in a short time. A micro inertial measurement unit (MIMU) provides a strap-down inertial navigation system (SINS) with rich navigation information, better real-time feed, anti-jamming, and error accumulation. This study explores the coupled integration of CSAC, MIMU, and GNSS to enhance PNT performance. The architecture of coupled integration is designed and degraded when any subsystem fails. A mathematical model for a precise time aiding navigation filter is derived rigorously. The CSAC aids positioning by weighted linear optimization when the visible satellite number is four or larger. By contrast, CSAC converts the GNSS observations to range measurements by “clock coasting” when the visible satellite number is less than four, thereby constraining the error divergence of micro inertial navigation and improving the availability of GNSS signals and the positioning accuracy of the integration. Field vehicle experiments, both in open-sky area and in a harsh environment, show that the integration can improve the positioning probability and accuracy. PMID:27187399
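The "clock coasting" idea above, where a CSAC-coasted receiver clock bias turns pseudoranges into geometric ranges so that a fix needs only three satellites instead of four, can be sketched with a Gauss-Newton solver. The geometry, units, and function names below are invented for illustration, not the paper's implementation:

```python
import numpy as np

def position_fix(sats, pseudoranges, clock_bias_m, iters=10):
    """Gauss-Newton position fix when the receiver clock bias is already known
    (e.g. coasted by a CSAC): only the 3 position unknowns remain."""
    ranges = np.asarray(pseudoranges) - clock_bias_m  # remove the known bias
    x = np.zeros(3)                                   # initial guess at origin
    for _ in range(iters):
        diff = x - sats                               # (n, 3) satellite-to-receiver vectors
        pred = np.linalg.norm(diff, axis=1)           # predicted geometric ranges
        H = diff / pred[:, None]                      # Jacobian: unit line-of-sight vectors
        dx, *_ = np.linalg.lstsq(H, ranges - pred, rcond=None)
        x = x + dx
    return x

# Hypothetical geometry (metres): three satellites, a true receiver position,
# and a 100 m clock bias folded into the pseudoranges
sats = np.array([[2.0e7, 0.0, 0.0],
                 [0.0, 2.0e7, 0.0],
                 [0.0, 0.0, 2.0e7]])
truth = np.array([1.0e5, 2.0e5, 3.0e5])
bias = 100.0
pr = np.linalg.norm(truth - sats, axis=1) + bias
est = position_fix(sats, pr, bias)
```

With four or more satellites the same solver would instead estimate the bias as a fourth unknown, which is where the weighted linear optimization mentioned in the abstract comes in.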
NASA Astrophysics Data System (ADS)
Morgenthaler, George; Khatib, Nader; Kim, Byoungsoo
with information to improve their crop's vigor has been a major topic of interest. With world population growing exponentially, arable land being consumed by urbanization, and an unfavorable farm economy, the efficiency of farming must increase to meet future food requirements and to make farming a sustainable occupation for the farmer. "Precision Agriculture" refers to a farming methodology that applies nutrients and moisture only where and when they are needed in the field. The goal is to increase farm revenue by increasing crop yield and decreasing applications of costly chemical and water treatments. In addition, this methodology will decrease the environmental costs of farming, i.e., reduce air, soil, and water pollution. Remote Sensing/Precision Agriculture has not grown as rapidly as early advocates envisioned. Technology for a successful Remote Sensing/Precision Agriculture system is now available. Commercial satellite systems can image (multi-spectral) the Earth with a resolution of approximately 2.5 m. Variable precision dispensing systems using GPS are available and affordable. Crop models that predict yield as a function of soil, chemical, and irrigation parameter levels have been formulated. Personal computers and internet access are in place in most farm homes and can provide a mechanism to periodically disseminate, e.g., bi-weekly, advice on what quantities of water and chemicals are needed in individual regions of the field. What is missing is a model that fuses the disparate sources of information on the current states of the crop and soil, and the remaining resource levels available, with the decisions farmers are required to make. This must be a product that is easy for the farmer to understand and to implement. A "Constrained Optimization Feed-back Control Model" to fill this void will be presented.
The objective function of the model will be used to maximize the farmer's profit by increasing yields while decreasing environmental costs and decreasing application of costly treatments. This model will incorporate information from remote sensing, in-situ weather sources, soil measurements, crop models, and tacit farmer knowledge of the relative productivity of the selected control regions of the farm to provide incremental advice throughout the growing season on water and chemical treatments. Genetic and meta-heuristic algorithms will be used to solve the constrained optimization problem, which possesses complex constraints and a non-linear objective function.
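The genetic-algorithm approach to constrained profit maximization described in this abstract can be illustrated with a minimal sketch. Everything below is hypothetical: the profit function, the resource-budget constraint, and all parameter values are toy stand-ins for the crop models and treatment costs the authors describe, and the penalty method shown is one common way of folding constraints into a genetic search.

```python
import random

def profit(w, n):
    # Toy revenue-minus-cost model in water w and nutrient n (hypothetical
    # stand-in for a calibrated crop-yield model).
    return 10 * w ** 0.5 + 8 * n ** 0.5 - (w + 2 * n)

def penalized_fitness(ind, budget=50.0):
    w, n = ind
    violation = max(0.0, w + n - budget)      # resource-budget constraint
    return profit(w, n) - 100.0 * violation   # penalty makes the search unconstrained

def genetic_search(pop_size=60, generations=200, seed=1):
    rng = random.Random(seed)
    pop = [(rng.uniform(0, 50), rng.uniform(0, 50)) for _ in range(pop_size)]
    for _ in range(generations):
        def pick():  # binary tournament selection
            a, b = rng.sample(pop, 2)
            return a if penalized_fitness(a) >= penalized_fitness(b) else b
        nxt = []
        for _ in range(pop_size):
            (w1, n1), (w2, n2) = pick(), pick()
            child = [(w1 + w2) / 2, (n1 + n2) / 2]   # arithmetic crossover
            if rng.random() < 0.3:                   # clipped Gaussian mutation
                i = rng.randrange(2)
                child[i] = min(50.0, max(0.0, child[i] + rng.gauss(0, 2)))
            nxt.append(tuple(child))
        pop = nxt
    return max(pop, key=penalized_fitness)

best = genetic_search()
```

With the toy model above, the unconstrained optimum sits at roughly w = 25, n = 4, which is feasible, so a converged run should return a nearby point.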
Medical Device for Automated Prick Test Reading.
Justo, Xabier; Diaz, Inaki; Gil, Jorge Juan; Gastaminza, Gabriel
2018-05-01
Allergy tests are routinely performed in most hospitals every day. However, measuring the outcomes of these tests is still a very laborious manual task. Current methods and systems lack precision and repeatability. This paper presents a novel mechatronic system that is able to scan a patient's entire arm and provide allergists with precise measures of wheals for diagnosis. The device is based on 3-D laser technology, and specific algorithms have been developed to process the information gathered. This system aims to automate the reading of skin prick tests and make gains in speed, accuracy, and reliability. Several experiments have been performed to evaluate the performance of the system.
Evaluation of SAPHIRE: an automated approach to indexing and retrieving medical literature.
Hersh, W.; Hickam, D. H.; Haynes, R. B.; McKibbon, K. A.
1991-01-01
An analysis of SAPHIRE, an experimental information retrieval system featuring automated indexing and natural language retrieval, was performed on MEDLINE references using data previously generated for a MEDLINE evaluation. Compared with searches performed by novice and expert physicians using MEDLINE, SAPHIRE achieved comparable recall and precision. While its combined recall and precision performance did not equal the level of librarians, SAPHIRE did achieve a significantly higher level of absolute recall. SAPHIRE has other potential advantages over existing MEDLINE systems. Its natural language interface does not require knowledge of MeSH, and it provides relevance ranking of retrieved references. PMID:1807718
Commissioning of the ATLAS pixel detector
DOE Office of Scientific and Technical Information (OSTI.GOV)
ATLAS Collaboration; Golling, Tobias
2008-09-01
The ATLAS pixel detector is a high precision silicon tracking device located closest to the LHC interaction point. It belongs to the first generation of its kind in a hadron collider experiment. It will provide crucial pattern recognition information and will largely determine the ability of ATLAS to precisely track particle trajectories and find secondary vertices. It was the last detector to be installed in ATLAS in June 2007, has been fully connected and tested in-situ during spring and summer 2008, and is ready for the imminent LHC turn-on. The highlights of the past and future commissioning activities of the ATLAS pixel system are presented.
NASA Technical Reports Server (NTRS)
Devivar, Rodrigo
2014-01-01
The performance of a material is greatly influenced by its thermal and chemical properties. Analytical pyrolysis, when coupled to a GC-MS system, is a powerful technique that can unlock the thermal and chemical properties of almost any substance and provide vital information. At NASA, we depend on precise thermal analysis instrumentation for understanding aerospace travel. Our analytical techniques allow us to test materials in the laboratory prior to an actual field test; whether the field test is miles up in the sky or miles underground, the properties of any involved material must be fully studied and understood in the laboratory.
The research of radar target tracking observed information linear filter method
NASA Astrophysics Data System (ADS)
Chen, Zheng; Zhao, Xuanzhi; Zhang, Wen
2018-05-01
To address the low precision, and even divergence, caused by the nonlinear observation equation in radar target tracking, a new filtering algorithm is proposed in this paper. In this algorithm, local linearization is carried out on the observed distance and angle data separately. The Kalman filter is then performed on the linearized data. After the data are filtered, a mapping operation provides the posterior estimate of the target state. A large number of simulation results show that this algorithm solves the above problems effectively, and its performance is better than that of traditional filtering algorithms for nonlinear dynamic systems.
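The two-stage scheme described (linearize the polar observations, run a standard Kalman filter, then recover a state estimate) can be sketched as below. This is a generic illustration, not the authors' algorithm: the constant-velocity motion model, the noise levels, and the polar-to-Cartesian pseudo-measurement conversion are all assumptions.

```python
import math
import random

def kf_axis(zs, dt=1.0, q=0.01, r=25.0):
    """Constant-velocity Kalman filter for one Cartesian axis.
    State: (position, velocity); zs are linearized position measurements."""
    x, v = zs[0], 0.0
    P = [[r, 0.0], [0.0, 10.0]]
    estimates = []
    for z in zs[1:]:
        # Predict: x' = F x, P' = F P F^T + Q, with F = [[1, dt], [0, 1]]
        x, v = x + dt * v, v
        p00 = P[0][0] + dt * (P[0][1] + P[1][0]) + dt * dt * P[1][1] + q
        p01 = P[0][1] + dt * P[1][1]
        p10 = P[1][0] + dt * P[1][1]
        p11 = P[1][1] + q
        # Update with measurement matrix H = [1, 0]
        s = p00 + r
        k0, k1 = p00 / s, p10 / s
        innov = z - x
        x, v = x + k0 * innov, v + k1 * innov
        P = [[(1 - k0) * p00, (1 - k0) * p01],
             [p10 - k1 * p00, p11 - k1 * p01]]
        estimates.append(x)
    return estimates

rng = random.Random(0)
truth = [(1000.0 + 15.0 * t, 500.0 + 5.0 * t) for t in range(60)]  # straight-line target
meas_xy = []
for tx, ty in truth:
    r_obs = math.hypot(tx, ty) + rng.gauss(0.0, 5.0)     # noisy range
    a_obs = math.atan2(ty, tx) + rng.gauss(0.0, 0.002)   # noisy bearing
    # Local linearization: polar measurement -> Cartesian pseudo-measurement
    meas_xy.append((r_obs * math.cos(a_obs), r_obs * math.sin(a_obs)))

est_x = kf_axis([m[0] for m in meas_xy])
est_y = kf_axis([m[1] for m in meas_xy])
```

Filtering each Cartesian axis independently keeps the filter linear; a full implementation would track the cross-axis measurement correlation that the polar-to-Cartesian conversion introduces.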
Zbýň, Š; Krššák, M; Memarsadeghi, M; Gholami, B; Haitel, A; Weber, M; Helbich, T H; Trattnig, S; Moser, E; Gruber, S
2014-07-01
The presented evaluation of the relative uncertainty (δ'CCC) of the (choline + creatine)/citrate (CC/C) ratios can provide objective information about the quality and diagnostic value of prostate MR spectroscopic imaging data. This information can be combined with the numeric values of CC/C ratios and provides metabolic-quality maps enabling accurate cancer detection and user-independent data evaluation. In addition, the prostate areas suffering most from the low precision of CC/C ratios (e.g., prostate base) were identified. © Georg Thieme Verlag KG Stuttgart · New York.
Robust adhesive precision bonding in automated assembly cells
NASA Astrophysics Data System (ADS)
Müller, Tobias; Haag, Sebastian; Bastuck, Thomas; Gisler, Thomas; Moser, Hansruedi; Uusimaa, Petteri; Axt, Christoph; Brecher, Christian
2014-03-01
Diode lasers are gaining importance, making their way to higher output powers along with improved BPP. The assembly of micro-optics for diode laser systems entails the highest requirements regarding assembly precision. Assembly costs for micro-optics are driven by the requirements regarding alignment in the submicron range and the corresponding challenges induced by adhesive bonding. For micro-optic assembly tasks, a major challenge in adhesive bonding at the highest precision level is the fact that the bonding process is irreversible. Accordingly, the first bonding attempt needs to be successful. Today's UV-curing adhesives exhibit shrinkage effects that are critical for the submicron tolerances of, e.g., FACs. The impact of the shrinkage effects can be tackled by a suitable bonding area design, such as minimal adhesive gaps and an adapted shrinkage offset value for the specific assembly parameters. Compensating shrinkage effects is difficult, as the shrinkage of UV-curing adhesives is not constant between two different lots and varies over the storage period even under ideal circumstances, as first test results indicate. An up-to-date characterization of the adhesive appears necessary for maximum precision in optics assembly to reach the highest output yields, minimal tolerances, and ideal beam-shaping results. Therefore, a measurement setup to precisely determine the up-to-date level of shrinkage has been set up. The goal is to provide the necessary information on current shrinkage to the operator or assembly cell so that the compensation offset can be adjusted on a daily basis. The expected impacts of this information are an improved beam-shaping result and a first-time-right production.
Shi, Longxiang; Li, Shijian; Yang, Xiaoran; Qi, Jiaheng; Pan, Gang; Zhou, Binbin
2017-01-01
With the explosion of healthcare information, there has been a tremendous amount of heterogeneous textual medical knowledge (TMK), which plays an essential role in healthcare information systems. Existing works for integrating and utilizing the TMK mainly focus on establishing straightforward connections and pay less attention to making computers interpret and retrieve knowledge correctly and quickly. In this paper, we explore a novel model to organize and integrate the TMK into conceptual graphs. We then employ a framework to automatically retrieve knowledge in knowledge graphs with high precision. In order to perform reasonable inference on knowledge graphs, we propose a contextual inference pruning algorithm to achieve efficient chain inference. Our algorithm achieves a better inference result, with precision and recall of 92% and 96%, respectively, and can avoid most of the meaningless inferences. In addition, we implement two prototypes and provide services, and the results show our approach is practical and effective.
PMID:28299322
NASA Astrophysics Data System (ADS)
Elliott, Thomas J.; Gu, Mile
2018-03-01
Continuous-time stochastic processes pervade everyday experience, and the simulation of models of these processes is of great utility. Classical models of systems operating in continuous-time must typically track an unbounded amount of information about past behaviour, even for relatively simple models, enforcing limits on precision due to the finite memory of the machine. However, quantum machines can require less information about the past than even their optimal classical counterparts to simulate the future of discrete-time processes, and we demonstrate that this advantage extends to the continuous-time regime. Moreover, we show that this reduction in the memory requirement can be unboundedly large, allowing for arbitrary precision even with a finite quantum memory. We provide a systematic method for finding superior quantum constructions, and a protocol for analogue simulation of continuous-time renewal processes with a quantum machine.
Video-rate or high-precision: a flexible range imaging camera
NASA Astrophysics Data System (ADS)
Dorrington, Adrian A.; Cree, Michael J.; Carnegie, Dale A.; Payne, Andrew D.; Conroy, Richard M.; Godbaz, John P.; Jongenelen, Adrian P. P.
2008-02-01
A range imaging camera produces an output similar to a digital photograph, but every pixel in the image contains distance information as well as intensity. This is useful for measuring the shape, size, and location of objects in a scene, hence is well suited to certain machine vision applications. Previously we demonstrated a heterodyne range imaging system operating in a relatively high-resolution (512-by-512 pixels) and high-precision (0.4 mm best case) configuration, but with a slow measurement rate (one measurement every 10 s). Although this high precision range imaging is useful for some applications, the low acquisition speed is limiting in many situations. The system's frame rate and length of acquisition are fully configurable in software, which means the measurement rate can be increased by compromising precision and image resolution. In this paper we demonstrate the flexibility of our range imaging system by showing examples of high precision ranging at slow acquisition speeds and video-rate ranging with reduced ranging precision and image resolution. We also show that the heterodyne approach and the use of more than four samples per beat cycle provide better linearity than the traditional homodyne quadrature detection approach. Finally, we comment on practical issues of frame rate and beat signal frequency selection.
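The phase-based ranging principle behind such cameras, sampling the beat signal more than four times per cycle, can be sketched as a single-bin DFT phase estimate. The modulation frequency, sample count, and distance below are illustrative numbers, not the parameters of the camera described in the abstract.

```python
import cmath
import math

C = 299792458.0   # speed of light, m/s
F_MOD = 30.0e6    # illustrative modulation frequency (unambiguous range = c / 2f = ~5 m)

def phase_from_samples(samples):
    """Phase of the fundamental of one beat cycle via a single-bin DFT."""
    n = len(samples)
    bin1 = sum(s * cmath.exp(-2j * math.pi * k / n) for k, s in enumerate(samples))
    return cmath.phase(bin1) % (2.0 * math.pi)

def distance_from_phase(phi):
    # Round-trip time-of-flight phase: phi = 4 * pi * f * d / c
    return C * phi / (4.0 * math.pi * F_MOD)

# Simulate n = 5 samples per beat cycle for a target 2 m away.
d_true = 2.0
phi_true = (4.0 * math.pi * F_MOD * d_true / C) % (2.0 * math.pi)
samples = [math.cos(2.0 * math.pi * k / 5 + phi_true) for k in range(5)]

d_est = distance_from_phase(phase_from_samples(samples))
```

Using five or more samples per cycle, as the authors advocate, lets harmonics of the beat signal fall outside the measurement bin, which is one source of the improved linearity over four-sample quadrature detection.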
An Assessment of Imaging Informatics for Precision Medicine in Cancer.
Chennubhotla, C; Clarke, L P; Fedorov, A; Foran, D; Harris, G; Helton, E; Nordstrom, R; Prior, F; Rubin, D; Saltz, J H; Shalley, E; Sharma, A
2017-08-01
Objectives: Precision medicine requires the measurement, quantification, and cataloging of medical characteristics to identify the most effective medical intervention. However, the amount of available data exceeds our current capacity to extract meaningful information. We examine the informatics needs to achieve precision medicine from the perspective of quantitative imaging and oncology. Methods: The National Cancer Institute (NCI) organized several workshops on the topic of medical imaging and precision medicine. The observations and recommendations are summarized herein. Results: Recommendations include: use of standards in data collection and clinical correlates to promote interoperability; data sharing and validation of imaging tools; clinician's feedback in all phases of research and development; use of open-source architecture to encourage reproducibility and reusability; use of challenges which simulate real-world situations to incentivize innovation; partnership with industry to facilitate commercialization; and education in academic communities regarding the challenges involved with translation of technology from the research domain to clinical utility and the benefits of doing so. Conclusions: This article provides a survey of the role and priorities for imaging informatics to help advance quantitative imaging in the era of precision medicine. While these recommendations were drawn from oncology, they are relevant and applicable to other clinical domains where imaging aids precision medicine. Georg Thieme Verlag KG Stuttgart.
Capacity and precision in an animal model of visual short-term memory
Lara, Antonio H.; Wallis, Jonathan D.
2013-01-01
Temporary storage of information in visual short-term memory (VSTM) is a key component of many complex cognitive abilities. However, it is highly limited in capacity. Understanding the neurophysiological nature of this capacity limit will require a valid animal model of VSTM. We used a multiple-item color change detection task to measure macaque monkeys' VSTM capacity. Subjects' performance deteriorated and reaction times increased as a function of the number of items in memory. Additionally, we measured the precision of the memory representations by varying the distance between sample and test colors. In trials with similar sample and test colors, subjects made more errors compared to trials with highly discriminable colors. We modeled the error distribution as a Gaussian function and used this to estimate the precision of VSTM representations. We found that as the number of items in memory increases, the precision of the representations decreases dramatically. Additionally, we found that focusing attention on one of the objects increases the precision with which that object is stored and degrades the precision of the remaining objects. These results are in line with recent findings in human psychophysics and provide a solid foundation for understanding the neurophysiological nature of the capacity limit of VSTM. PMID:22419756
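The Gaussian error-model step in such studies can be sketched with synthetic data. The degradation law used here (noise growing with the square root of set size) and all numeric values are assumptions for illustration, not the monkeys' data; precision is taken, as is common, to be the reciprocal of the fitted standard deviation.

```python
import math
import random

def gaussian_sigma(errors):
    # Maximum-likelihood sigma of a zero-mean Gaussian fitted to response errors.
    return math.sqrt(sum(e * e for e in errors) / len(errors))

rng = random.Random(42)
set_sizes = [1, 2, 4]
sigmas = {}
for n in set_sizes:
    true_sigma = 10.0 * math.sqrt(n)   # assumed degradation with memory load
    responses = [rng.gauss(0.0, true_sigma) for _ in range(5000)]
    sigmas[n] = gaussian_sigma(responses)

precision = {n: 1.0 / sigmas[n] for n in set_sizes}
```

With this synthetic generator, the fitted sigma grows (and precision falls) with the number of items, mirroring the qualitative pattern the abstract reports.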
IGS Network Coordinator Report - 2002
NASA Technical Reports Server (NTRS)
Moore, Angelyn
2004-01-01
The IGS network is a set of permanent, continuously-operating, dual-frequency GPS stations operated by over 100 worldwide agencies. The dataset is pooled at IGS Data Centers for routine use by IGS Analysis Centers in creating precise IGS products, as well as free access by other analysts around the world. The IGS Central Bureau hosts the IGS Network Coordinator, who assures adherence to standards and provides information regarding the IGS network via the Central Bureau Information System website at http://igscb.jpl.nasa.gov.
Beckmann, Jacques S; Lew, Daniel
2016-12-19
This era of groundbreaking scientific developments in high-resolution, high-throughput technologies is allowing the cost-effective collection and analysis of huge, disparate datasets on individual health. Proper data mining and translation of the vast datasets into clinically actionable knowledge will require the application of clinical bioinformatics. These developments have triggered multiple national initiatives in precision medicine: a data-driven approach centering on the individual. However, clinical implementation of precision medicine poses numerous challenges. Foremost, precision medicine needs to be contrasted with the powerful and widely used practice of evidence-based medicine, which is informed by meta-analyses or group-centered studies from which mean recommendations are derived. This "one size fits all" approach can provide inadequate solutions for outliers. Such outliers, which are far from an oddity as all of us fall into this category for some traits, can be better managed using precision medicine. Here, we argue that it is necessary and possible to bridge between precision medicine and evidence-based medicine. This will require worldwide and responsible data sharing, as well as regularly updated training programs. We also discuss the challenges and opportunities for achieving clinical utility in precision medicine. We project that, through collection, analyses and sharing of standardized medically relevant data globally, evidence-based precision medicine will shift progressively from therapy to prevention, thus leading eventually to improved, clinician-to-patient communication, citizen-centered healthcare and sustained well-being.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tamborini, D., E-mail: davide.tamborini@polimi.it; Portaluppi, D.; Villa, F.
We present a Time-to-Digital Converter (TDC) card with a compact form factor, suitable for multichannel timing instruments or for integration into more complex systems. The TDC card provides 10 ps timing resolution over the whole measurement range, which is selectable from 160 ns up to 10 μs, reaching 21 ps rms precision, 1.25% LSB rms differential nonlinearity, and up to 3 Mconversions/s with 400 mW power consumption. The I/O edge card connector provides timing data readout through either a parallel bus or a 100 MHz serial interface, and further measurement information such as input signal rate and valid conversion rate (typically useful for time-correlated single-photon counting applications) through an independent serial link.
NASA Astrophysics Data System (ADS)
Chaves-Montero, Jonás; Angulo, Raúl E.; Hernández-Monteagudo, Carlos
2018-07-01
In the upcoming era of high-precision galaxy surveys, it becomes necessary to understand the impact of redshift uncertainties on cosmological observables. In this paper we explore the effect of sub-percent photometric redshift errors (photo-z errors) on galaxy clustering and baryonic acoustic oscillations (BAOs). Using analytic expressions and results from 1000 N-body simulations, we show how photo-z errors modify the amplitude of moments of the 2D power spectrum, their variances, the amplitude of BAOs, and the cosmological information in them. We find that (a) photo-z errors suppress the clustering on small scales, increasing the relative importance of shot noise, and thus reducing the interval of scales available for BAO analyses; (b) photo-z errors decrease the smearing of BAOs due to non-linear redshift-space distortions (RSDs) by giving less weight to line-of-sight modes; and (c) photo-z errors (and small-scale RSD) induce a scale dependence on the information encoded in the BAO scale, and that reduces the constraining power on the Hubble parameter. Using these findings, we propose a template that extracts unbiased cosmological information from samples with photo-z errors with respect to cases without them. Finally, we provide analytic expressions to forecast the precision in measuring the BAO scale, showing that spectro-photometric surveys will measure the expansion history of the Universe with a precision competitive to that of spectroscopic surveys.
1983-01-01
This standard establishes uniform formats for geographic point location data. Geographic point location refers to the use of a coordinate system to define the position of a point that may be on, above, or below the Earth's surface. It provides a means for representing these data in digital form for the purpose of interchanging information among data systems and improving clarity and accuracy of interpersonal communications. This document is an expansion and clarification of National Bureau of Standards FIPS PUB 70, issued October 24, 1980. There are minor editorial changes, plus the following additions and modifications: (1) The representation of latitude and longitude using radian measure was added. (2) Alternate 2 for Representation of Hemispheric Information was deleted. (3) Use of the maximum precision for all numerical values was emphasized. The Alternate Representation of Precision was deleted. (4) The length of the zone representation for the State Plane Coordinate System was standardized. (5) The term altitude was substituted for elevation throughout to conform with international usage. (6) Section 3, Specifications for Altitude Data, was expanded and upgraded significantly to the same level of detail as for the horizontal values. (7) A table delineating the coverage of Universal Transverse Mercator zones and the longitudes of the Central Meridians was added and the other tables renumbered. (8) The total length of the representation of point location data at maximum precision was standardized.
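A minimal sketch of how such point locations might be serialized follows, assuming signed decimal degrees plus the radian alternative mentioned in item (1). The field widths and delimiter chosen here are hypothetical; the exact FIPS field layouts and precision rules are not reproduced.

```python
import math

def format_point(lat_deg, lon_deg, alt_m=None):
    """Pack a geographic point as signed, zero-padded decimal degrees
    (a simplified interchange form, not the exact FIPS record layout)."""
    if not (-90.0 <= lat_deg <= 90.0 and -180.0 <= lon_deg <= 180.0):
        raise ValueError("coordinates out of range")
    fields = ["%+010.6f" % lat_deg, "%+011.6f" % lon_deg]
    if alt_m is not None:
        fields.append("%+08.1f" % alt_m)  # altitude in metres, signed (below/above datum)
    return ",".join(fields)

def to_radians(lat_deg, lon_deg):
    # Radian representation of the same point
    return math.radians(lat_deg), math.radians(lon_deg)

record = format_point(38.8895, -77.0353, alt_m=-12.5)
```

Fixed widths and an explicit sign keep every record the same length, which is the property that makes such formats easy to interchange between data systems.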
NASA Astrophysics Data System (ADS)
Chaves-Montero, Jonás; Angulo, Raúl E.; Hernández-Monteagudo, Carlos
2018-04-01
In the upcoming era of high-precision galaxy surveys, it becomes necessary to understand the impact of redshift uncertainties on cosmological observables. In this paper we explore the effect of sub-percent photometric redshift errors (photo-z errors) on galaxy clustering and baryonic acoustic oscillations (BAO). Using analytic expressions and results from 1 000 N-body simulations, we show how photo-z errors modify the amplitude of moments of the 2D power spectrum, their variances, the amplitude of BAO, and the cosmological information in them. We find that: a) photo-z errors suppress the clustering on small scales, increasing the relative importance of shot noise, and thus reducing the interval of scales available for BAO analyses; b) photo-z errors decrease the smearing of BAO due to non-linear redshift-space distortions (RSD) by giving less weight to line-of-sight modes; and c) photo-z errors (and small-scale RSD) induce a scale dependence on the information encoded in the BAO scale, and that reduces the constraining power on the Hubble parameter. Using these findings, we propose a template that extracts unbiased cosmological information from samples with photo-z errors with respect to cases without them. Finally, we provide analytic expressions to forecast the precision in measuring the BAO scale, showing that spectro-photometric surveys will measure the expansion history of the Universe with a precision competitive to that of spectroscopic surveys.
Closed-Loop Control System for Friction Stir Welding Retractable Pin Tool
NASA Technical Reports Server (NTRS)
Ding, R. Jeffrey; Romine, Peter L.; Munafo, Paul M. (Technical Monitor)
2001-01-01
NASA invention disclosure, NASA Case No. MFS-31413, entitled "System for Controlling the Stirring Pin of a Friction Stir Welding Apparatus" (Patent Pending), authored by Jeff Ding, Dr. Peter Romine, and Pete Oelgoetz, addresses the precision control of the friction stir welding process. The closed-loop control system automatically adjusts the spinning welding pin in real time to maintain a precise penetration ligament (i.e., the distance between the pin tip and the weld panel backside surface). A specific pin length can be maintained while welding constant-thickness or tapered-thickness weld panels. The closed-loop control system provides operator data and information relative to the exact position of the welding pin inside the weld joint. This paper presents the closed-loop RPT control system that operates using the auto-feedback of force signals sensed by the tip and shoulder of the welding pin. Significance: The FSW process can be successfully used in a production environment only if there is a method or technique that informs the FSW operator of the precise location of the welding pin inside the weld joint. This is essential for applications in aerospace, automotive, pressure vessel, commercial aircraft, and other industries.
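As a generic illustration of force-feedback pin control (not the disclosed NASA controller, whose gains, sensors, and tool dynamics are not given in the abstract), a proportional-integral loop driving pin extension against a toy linear plant might look like:

```python
def run_loop(target_force=800.0, steps=600, kp=0.02, ki=0.005, dt=0.05):
    """Drive pin extension so the sensed pin-tip force tracks a setpoint.
    The plant model (force rising linearly with extension) is a toy
    stand-in; real tool/workpiece dynamics would be far richer."""
    extension, integral = 0.0, 0.0
    for _ in range(steps):
        force = 400.0 * extension             # toy plant: 400 N per mm of extension
        error = target_force - force          # force error stands in for ligament error
        integral += error * dt
        extension += (kp * error + ki * integral) * dt  # PI update, mm
    return extension, 400.0 * extension

ext, force = run_loop()
```

With these assumed gains the loop is heavily overdamped, so the sensed force settles onto the setpoint without sustained oscillation, the qualitative behavior a penetration-ligament controller needs.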
Real-World Evidence In Support Of Precision Medicine: Clinico-Genomic Cancer Data As A Case Study.
Agarwala, Vineeta; Khozin, Sean; Singal, Gaurav; O'Connell, Claire; Kuk, Deborah; Li, Gerald; Gossai, Anala; Miller, Vincent; Abernethy, Amy P
2018-05-01
The majority of US adult cancer patients today are diagnosed and treated outside the context of any clinical trial (that is, in the real world). Although these patients are not part of a research study, their clinical data are still recorded. Indeed, data captured in electronic health records form an ever-growing, rich digital repository of longitudinal patient experiences, treatments, and outcomes. Likewise, genomic data from tumor molecular profiling are increasingly guiding oncology care. Linking real-world clinical and genomic data, as well as information from other co-occurring data sets, could create study populations that provide generalizable evidence for precision medicine interventions. However, the infrastructure required to link, ensure quality, and rapidly learn from such composite data is complex. We outline the challenges and describe a novel approach to building a real-world clinico-genomic database of patients with cancer. This work represents a case study in how data collected during routine patient care can inform precision medicine efforts for the population at large. We suggest that health policies can promote innovation by defining appropriate uses of real-world evidence, establishing data standards, and incentivizing data sharing.
NASA Technical Reports Server (NTRS)
Dutra, Jayne E.; Smith, Lisa
2006-01-01
The goal of this plan is to briefly describe new technologies available to us in the arena of information discovery and to discuss the strategic value they have for the NASA enterprise, with some considerations and suggestions for near-term implementations using the NASA Engineering Network (NEN) as a delivery venue.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jackson, Paul J.; Hill, Karen K.
2009-11-09
The results outlined in this report provide the information needed to apply a SNP-based forensic analysis to diverse ricin preparations. The same methods could be useful in castor breeding programs that seek to reduce or eliminate ricin in oil-producing R. communis cultivars.
Potential Energy Surface Database of Group II Dimer
National Institute of Standards and Technology Data Gateway
SRD 143 NIST Potential Energy Surface Database of Group II Dimer (Web, free access) This database provides critical atomic and molecular data needed in order to evaluate the feasibility of using laser cooled and trapped Group II atomic species (Mg, Ca, Sr, and Ba) for ultra-precise optical clocks or quantum information processing devices.
Usability-driven pruning of large ontologies: the case of SNOMED CT.
López-García, Pablo; Boeker, Martin; Illarramendi, Arantza; Schulz, Stefan
2012-06-01
To study ontology modularization techniques when applied to SNOMED CT in a scenario in which no previous corpus of information exists and to examine if frequency-based filtering using MEDLINE can reduce subset size without discarding relevant concepts. Subsets were first extracted using four graph-traversal heuristics and one logic-based technique, and were subsequently filtered with frequency information from MEDLINE. Twenty manually coded discharge summaries from cardiology patients were used as signatures and test sets. The coverage, size, and precision of extracted subsets were measured. Graph-traversal heuristics provided high coverage (71-96% of terms in the test sets of discharge summaries) at the expense of subset size (17-51% of the size of SNOMED CT). Pre-computed subsets and logic-based techniques extracted small subsets (1%), but coverage was limited (24-55%). Filtering reduced the size of large subsets to 10% while still providing 80% coverage. Extracting subsets to annotate discharge summaries is challenging when no previous corpus exists. Ontology modularization provides valuable techniques, but the resulting modules grow as signatures spread across subhierarchies, yielding a very low precision. Graph-traversal strategies and frequency data from an authoritative source can prune large biomedical ontologies and produce useful subsets that still exhibit acceptable coverage. However, a clinical corpus closer to the specific use case is preferred when available.
Jiang, Taoran; Zhu, Ming; Zan, Tao; Gu, Bin; Li, Qingfeng
2017-08-01
In perforator flap transplantation, dissection of the perforator is an important but difficult procedure because of the high variability in vascular anatomy. Preoperative imaging techniques can provide substantial information about vascular anatomy; however, they cannot provide direct guidance for surgeons during the operation. In this study, a navigation system (NS) was established to overlay a vascular map on the surgical site and thereby provide a direct guide for perforator flap transplantation. The NS was established based on computed tomographic angiography and augmented reality techniques. A virtual vascular map was reconstructed from computed tomographic angiography data and projected onto real patient images using ARToolKit software. Additionally, a screw-fixation marker holder was created to facilitate registration. Using a tracking and display system, we evaluated the NS on an animal model and measured the system error on a rapid prototyping model. The NS assistance allowed for correct identification, as well as safe and precise dissection, of the perforator. The mean system error was 3.474 ± 1.546 mm. The augmented reality-based NS can provide precise navigation information by directly displaying a 3-dimensional, individual anatomical virtual model onto the operative field in real time. It will allow rapid identification and safe dissection of a perforator in free flap transplantation surgery.
Using language models to identify relevant new information in inpatient clinical notes.
Zhang, Rui; Pakhomov, Serguei V; Lee, Janet T; Melton, Genevieve B
2014-01-01
Redundant information in clinical notes within electronic health record (EHR) systems is ubiquitous and may negatively impact the use of these notes by clinicians, and, potentially, the efficiency of patient care delivery. Automated methods to identify redundant versus relevant new information may provide a valuable tool for clinicians to better synthesize patient information and navigate to clinically important details. In this study, we investigated the use of language models for identification of new information in inpatient notes, and evaluated our methods using expert-derived reference standards. The best method achieved precision of 0.743, recall of 0.832 and F1-measure of 0.784. The average proportion of redundant information was similar between inpatient and outpatient progress notes (76.6% (SD=17.3%) and 76.7% (SD=14.0%), respectively). Advanced practice providers tended to have higher rates of redundancy in their notes compared to physicians. Future investigation includes the addition of semantic components and visualization of new information.
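The evaluation above reports precision, recall, and F1-measure for detecting new information. A minimal sketch of both ideas, assuming a crude word n-gram-overlap stand-in for the language-model scoring actually used in the study:

```python
def new_sentences(note, prior_notes, n=3):
    """Flag sentences as 'new' when they share no word n-gram with prior
    notes; a toy stand-in for language-model-based novelty scoring."""
    def ngrams(text):
        toks = text.lower().split()
        return {tuple(toks[i:i + n]) for i in range(len(toks) - n + 1)}
    seen = set()
    for p in prior_notes:
        seen |= ngrams(p)
    return [s for s in note if not (ngrams(s) & seen)]

def precision_recall_f1(predicted, relevant):
    """Standard set-based precision, recall, and F1 against a reference."""
    predicted, relevant = set(predicted), set(relevant)
    tp = len(predicted & relevant)
    p = tp / len(predicted) if predicted else 0.0
    r = tp / len(relevant) if relevant else 0.0
    f1 = 2 * p * r / (p + r) if p + r else 0.0
    return p, r, f1
```

Plugging the paper's reported precision 0.743 and recall 0.832 into the F1 formula reproduces a value near the reported 0.784.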
Using Language Models to Identify Relevant New Information in Inpatient Clinical Notes
Zhang, Rui; Pakhomov, Serguei V.; Lee, Janet T.; Melton, Genevieve B.
2014-01-01
Redundant information in clinical notes within electronic health record (EHR) systems is ubiquitous and may negatively impact the use of these notes by clinicians, and, potentially, the efficiency of patient care delivery. Automated methods to identify redundant versus relevant new information may provide a valuable tool for clinicians to better synthesize patient information and navigate to clinically important details. In this study, we investigated the use of language models for identification of new information in inpatient notes, and evaluated our methods using expert-derived reference standards. The best method achieved precision of 0.743, recall of 0.832 and F1-measure of 0.784. The average proportion of redundant information was similar between inpatient and outpatient progress notes (76.6% (SD=17.3%) and 76.7% (SD=14.0%), respectively). Advanced practice providers tended to have higher rates of redundancy in their notes compared to physicians. Future investigation includes the addition of semantic components and visualization of new information. PMID:25954438
Hayes, D.B.; Baylis, J.R.; Carl, L.M.; Dodd, H.R.; Goldstein, J.D.; McLaughlin, R.L.; Noakes, D.L.G.; Porto, L.M.
2003-01-01
Four sampling designs for quantifying the effect of low-head sea lamprey (Petromyzon marinus) barriers on fish communities were evaluated, and the contribution of process-oriented research to the overall confidence of results obtained was discussed. The designs include: (1) sample barrier streams post-construction; (2) sample barrier and reference streams post-construction; (3) sample barrier streams pre- and post-construction; and (4) sample barrier and reference streams pre- and post-construction. In the statistical literature, the principal basis for comparison of sampling designs is generally the precision achieved by each design. In addition to precision, designs should be compared based on the interpretability of results and on the scale to which the results apply. Using data collected in a broad survey of streams with and without sea lamprey barriers, some of the tradeoffs that occur among precision, scale, and interpretability are illustrated. Although circumstances such as funding and availability of pre-construction data may limit which design can be implemented, a pre/post-construction design including barrier and reference streams provides the most meaningful information for use in barrier management decisions. Where it is not feasible to obtain pre-construction data, a design including reference streams is important to maintain the interpretability of results. Regardless of the design used, process-oriented research provides a framework for interpreting results obtained in broad surveys. As such, information from both extensive surveys and intensive process-oriented research provides the best basis for fishery management actions, and gives researchers and managers the most confidence in the conclusions reached regarding the effects of sea lamprey barriers.
Konokhova, Anastasiya I; Chernova, Darya N; Moskalensky, Alexander E; Strokotov, Dmitry I; Yurkin, Maxim A; Chernyshev, Andrei V; Maltsev, Valeri P
2016-02-01
Importance of microparticles (MPs), also regarded as extracellular vesicles, in many physiological processes and clinical conditions motivates one to use the most informative and precise methods for their characterization. Methods based on individual particle analysis provide statistically reliable distributions of MP population over characteristics. Although flow cytometry is one of the most powerful technologies of this type, the standard forward-versus-side-scattering plots of MPs and platelets (PLTs) overlap considerably because of similarity of their morphological characteristics. Moreover, ordinary flow cytometry is not capable of measurement of size and refractive index (RI) of MPs. In this study, we 1) employed the potential of the scanning flow cytometer (SFC) for identification and characterization of MPs from light scattering; 2) suggested the reference method to characterize MP morphology (size and RI) with high precision; and 3) determined the lowest size of a MP that can be characterized from light scattering with the SFC. We equipped the SFC with 405 and 488 nm lasers to measure the light-scattering profiles and side scattering from MPs, respectively. The developed two-stage method allowed accurate separation of PLTs and MPs in platelet-rich plasma. We used two optical models for MPs, a sphere and a bisphere, in the solution of the inverse light-scattering problem. This solution provides unprecedented precision in determination of size and RI of individual spherical MPs-median uncertainties (standard deviations) were 6 nm and 0.003, respectively. The developed method provides instrument-independent quantitative information on MPs, which can be used in studies of various factors affecting MP population. © 2015 International Society for Advancement of Cytometry.
Yanq, Xuming; Ye, Yijun; Xia, Yong; Wei, Xuanzhong; Wang, Zheyu; Ni, Hongmei; Zhu, Ying; Xu, Lingyu
2015-02-01
To develop a more precise and accurate method for locating acupoints, and to identify a procedure for verifying whether an acupoint has been correctly located. Using facial acupoint locations recorded by different acupuncture experts, we obtained the most precise and accurate location values with a consistency-based information fusion algorithm, applied through a virtual simulation of a facial orientation coordinate system. Because each expert's original data contained inconsistencies, systematic error had to be accounted for in the weight calculation. First, we corrected the systematic error in each expert's acupoint locations to obtain a rational quantification of the degree to which each expert's locations were consistently supported by the others, yielding pointwise variable-precision fusion results. Then, we made more effective use of the measured characteristics of each expert's acupoint locations to improve the efficiency of information utilization and the precision and accuracy of acupoint location. By applying the consistency-matrix pointwise fusion method to the experts' acupoint location values, each expert's location information could be evaluated and the most precise and accurate location values obtained.
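A generic sketch of consistency-weighted fusion of several experts' one-dimensional location measurements: each expert's weight is its total support from the others, with support decaying as pairwise disagreement grows. The Gaussian kernel and `sigma` are assumptions, not the authors' exact consistency-matrix algorithm.

```python
import math

def consistency_fusion(values, sigma=1.0):
    """Fuse scalar measurements from several experts. Support between two
    experts decays with the squared difference of their values; each weight
    is the expert's summed support from the others, normalised."""
    n = len(values)
    support = [[math.exp(-((values[i] - values[j]) ** 2) / (2 * sigma ** 2))
                for j in range(n)] for i in range(n)]
    raw = [sum(support[i][j] for j in range(n) if j != i) for i in range(n)]
    total = sum(raw)
    weights = [w / total for w in raw]
    fused = sum(w * v for w, v in zip(weights, values))
    return fused, weights
```

An outlying expert receives a near-zero weight, so the fused value tracks the consistent majority rather than the plain mean.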
Technological advances in precision medicine and drug development.
Maggi, Elaine; Patterson, Nicole E; Montagna, Cristina
New technologies are rapidly becoming available to expand the arsenal of tools accessible for precision medicine and to support the development of new therapeutics. Advances in liquid biopsies, which analyze cells, DNA, RNA, proteins, or vesicles isolated from the blood, have gained particular interest for their use in acquiring information reflecting the biology of tumors and metastatic tissues. Through advancements in DNA sequencing that have merged unprecedented accuracy with affordable cost, personalized treatments based on genetic variations are becoming a real possibility. Extraordinary progress has been achieved in the development of biological therapies aimed at advancing personalized treatments even further. We provide a summary of current and future applications of blood-based liquid biopsies and of how new technologies are utilized for the development of biological therapeutic treatments. We discuss current and future sequencing methods with an emphasis on how technological advances will support progress in the field of precision medicine.
Pushing the limits of Monte Carlo simulations for the three-dimensional Ising model
NASA Astrophysics Data System (ADS)
Ferrenberg, Alan M.; Xu, Jiahao; Landau, David P.
2018-04-01
While the three-dimensional Ising model has defied analytic solution, various numerical methods like Monte Carlo, Monte Carlo renormalization group, and series expansion have provided precise information about the phase transition. Using Monte Carlo simulation that employs the Wolff cluster flipping algorithm with both 32-bit and 53-bit random number generators and data analysis with histogram reweighting and quadruple precision arithmetic, we have investigated the critical behavior of the simple cubic Ising model, with lattice sizes ranging from 16³ to 1024³. By analyzing data with cross correlations between various thermodynamic quantities obtained from the same data pool, e.g., logarithmic derivatives of magnetization and derivatives of magnetization cumulants, we have obtained the critical inverse temperature Kc=0.221 654 626 (5 ) and the critical exponent of the correlation length ν =0.629 912 (86 ) with precision that exceeds all previous Monte Carlo estimates.
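The Wolff cluster-flipping update named above can be sketched on a toy simple-cubic lattice. This is a bare sketch only: the study's lattices up to 1024³, its 32-bit and 53-bit generators, and its histogram-reweighting analysis are not reproduced here.

```python
import math, random

def wolff_step(spins, L, K, rng):
    """One Wolff cluster update on an L x L x L periodic Ising lattice.
    Aligned neighbours join the cluster with probability 1 - exp(-2K);
    the whole cluster is then flipped. Returns the cluster size."""
    p_add = 1.0 - math.exp(-2.0 * K)
    seed = (rng.randrange(L), rng.randrange(L), rng.randrange(L))
    s0 = spins[seed]
    cluster = {seed}
    stack = [seed]
    while stack:
        x, y, z = stack.pop()
        for dx, dy, dz in ((1, 0, 0), (-1, 0, 0), (0, 1, 0),
                           (0, -1, 0), (0, 0, 1), (0, 0, -1)):
            nb = ((x + dx) % L, (y + dy) % L, (z + dz) % L)
            if nb not in cluster and spins[nb] == s0 and rng.random() < p_add:
                cluster.add(nb)
                stack.append(nb)
    for site in cluster:
        spins[site] = -s0       # flip the entire cluster at once
    return len(cluster)

def magnetization(spins):
    """Magnetisation per site."""
    return sum(spins.values()) / len(spins)
```

Near the critical coupling quoted above (K ≈ 0.2217), cluster updates decorrelate configurations far faster than single-spin flips, which is why the algorithm scales to such large lattices.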
Validating spatiotemporal predictions of an important pest of small grains.
Merrill, Scott C; Holtzer, Thomas O; Peairs, Frank B; Lester, Philip J
2015-01-01
Arthropod pests are typically managed using tactics applied uniformly to the whole field. Precision pest management applies tactics under the assumption that within-field pest pressure differences exist. This approach allows for more precise and judicious use of scouting resources and management tactics. For example, a portion of a field delineated as attractive to pests may be selected to receive extra monitoring attention. Likely because of the high variability in pest dynamics, little attention has been given to developing precision pest prediction models. Here, multimodel synthesis was used to develop a spatiotemporal model predicting the density of a key pest of wheat, the Russian wheat aphid, Diuraphis noxia (Kurdjumov). Spatially implicit and spatially explicit models were synthesized to generate spatiotemporal pest pressure predictions. Cross-validation and field validation were used to confirm model efficacy. A strong within-field signal depicting aphid density was confirmed with low prediction errors. Results show that the within-field model predictions will provide higher-quality information than would be provided by traditional field scouting. With improvements to the broad-scale model component, the model synthesis approach and resulting tool could improve pest management strategy and provide a template for the development of spatially explicit pest pressure models. © 2014 Society of Chemical Industry.
Context-aware recommender system based on ontology for recommending tourist destinations at Bandung
NASA Astrophysics Data System (ADS)
Rizaldy Hafid Arigi, L.; Abdurahman Baizal, Z. K.; Herdiani, Anisa
2018-03-01
A recommender system is software that provides personalized recommendations suited to users' needs. Recommender systems have been widely implemented in various domains, including tourism. One approach to more personalized recommendations is the use of contextual information. This paper proposes an ontology-based, context-aware recommender system for the tourism domain. The system is capable of recommending tourist destinations by using user preferences for categories of tourism together with contextual information such as user location, weather around tourist destinations, and closing times of destinations. Based on the evaluation, the system has an accuracy of 0.94 (item-recommendation precision evaluated by experts) and 0.58 (evaluated implicitly from system-end-user interaction). Based on the evaluation of user satisfaction, the system provides a satisfaction level of more than 0.7 (scale 0 to 1) for speed in providing liked recommendations (PE), informativeness of recommendation descriptions (INF), and user trust (TR).
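The contextual-filtering idea (category preference adjusted by weather and opening-hours context) can be sketched as below. The destination fields, weights, and the rain penalty are invented for illustration and are not the paper's ontology reasoning.

```python
def recommend(destinations, prefs, context):
    """Rank destinations by category preference, demoting outdoor spots in
    rain and excluding destinations that are closed at the current hour."""
    def score(d):
        s = prefs.get(d['category'], 0.0)
        if d['outdoor'] and context['weather'] == 'rain':
            s *= 0.3                      # demote outdoor spots in rain
        if not (d['open'] <= context['hour'] < d['close']):
            return 0.0                    # closed destinations are excluded
        return s
    ranked = sorted(destinations, key=score, reverse=True)
    return [d['name'] for d in ranked if score(d) > 0]
```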
QTLTableMiner++: semantic mining of QTL tables in scientific articles.
Singh, Gurnoor; Kuzniar, Arnold; van Mulligen, Erik M; Gavai, Anand; Bachem, Christian W; Visser, Richard G F; Finkers, Richard
2018-05-25
A quantitative trait locus (QTL) is a genomic region that correlates with a phenotype. Most of the experimental information about QTL mapping studies is described in tables of scientific publications. Traditional text mining techniques aim to extract information from unstructured text rather than from tables. We present QTLTableMiner++ (QTM), a table mining tool that extracts and semantically annotates QTL information buried in (heterogeneous) tables of plant science literature. QTM is a command-line tool written in the Java programming language. It takes scientific articles from the Europe PMC repository as input and extracts QTL tables using keyword matching and ontology-based concept identification. The tables are further normalized using rules derived from table properties such as captions, column headers and table footers. Furthermore, table columns are classified into three categories, namely column descriptors, properties and values, based on column headers and the data types of cell entries. Abbreviations found in the tables are expanded using the Schwartz and Hearst algorithm. Finally, the content of QTL tables is semantically enriched with domain-specific ontologies (e.g. Crop Ontology, Plant Ontology and Trait Ontology) using the Apache Solr search platform, and the results are stored in a relational database and a text file. The performance of the QTM tool was assessed by precision and recall based on the information retrieved from two manually annotated corpora of open access articles, i.e. QTL mapping studies in tomato (Solanum lycopersicum) and in potato (S. tuberosum). In summary, QTM detected QTL statements in tomato with 74.53% precision and 92.56% recall, and in potato with 82.82% precision and 98.94% recall. QTM is a unique tool that aids in providing QTL information in machine-readable and semantically interoperable formats.
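QTM expands abbreviations with the Schwartz and Hearst algorithm, whose core step matches a parenthesised short form right-to-left against the preceding text. A simplified sketch of that matching (the candidate-window size follows the published heuristic; the surrounding regex and handling of edge cases are simplified assumptions):

```python
import re

def find_abbreviations(text):
    """Pair each parenthesised short form with the preceding long form,
    using a simplified Schwartz-Hearst matching step."""
    pairs = {}
    for m in re.finditer(r'\(([A-Za-z][A-Za-z0-9-]*)\)', text):
        sf = m.group(1)
        words = text[:m.start()].split()
        # candidate window: min(|SF| + 5, 2 * |SF|) preceding words
        window = words[-min(len(sf) + 5, len(sf) * 2):]
        lf = best_long_form(sf, ' '.join(window))
        if lf:
            pairs[sf] = lf
    return pairs

def best_long_form(sf, candidate):
    """Match short-form characters right-to-left in the candidate; the
    first character must start a word."""
    s, l = len(sf) - 1, len(candidate) - 1
    while s >= 0:
        c = sf[s].lower()
        if not c.isalnum():
            s -= 1
            continue
        while l >= 0 and (candidate[l].lower() != c
                          or (s == 0 and l > 0 and candidate[l - 1].isalnum())):
            l -= 1
        if l < 0:
            return None
        s -= 1
        l -= 1
    return candidate[l + 1:]
```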
Animal research as a basis for clinical trials.
Faggion, Clovis M
2015-04-01
Animal experiments are critical for the development of new human therapeutics because they provide mechanistic information, as well as important information on efficacy and safety. Some evidence suggests that authors of animal research in dentistry do not observe important methodological issues when planning animal experiments, for example sample-size calculation. Low-quality animal research directly interferes with development of the research process in which multiple levels of research are interconnected. For example, high-quality animal experiments generate sound information for the further planning and development of randomized controlled trials in humans. These randomized controlled trials are the main source for the development of systematic reviews and meta-analyses, which will generate the best evidence for the development of clinical guidelines. Therefore, adequate planning of animal research is a sine qua non condition for increasing efficacy and efficiency in research. Ethical concerns arise when animal research is not performed with high standards. This Focus article presents the latest information on the standards of animal research in dentistry, more precisely in the field of implant dentistry. Issues on precision and risk of bias are discussed, and strategies to reduce risk of bias in animal research are reported. © 2015 Eur J Oral Sci.
The Multi-energy High precision Data Processor Based on AD7606
NASA Astrophysics Data System (ADS)
Zhao, Chen; Zhang, Yanchi; Xie, Da
2017-11-01
This paper designs an information collector based on the AD7606 to realize high-precision simultaneous acquisition of multi-source information from multi-energy systems, forming the information platform of the energy Internet at Laogang, where electricity is the major energy source. Combined with information fusion technologies, the collected data are analyzed to improve the overall energy-system scheduling capability and reliability.
Quantitative Determination of Isotope Ratios from Experimental Isotopic Distributions
Kaur, Parminder; O’Connor, Peter B.
2008-01-01
Isotope variability due to natural processes provides important information for studying a variety of complex natural phenomena from the origins of a particular sample to the traces of biochemical reaction mechanisms. These measurements require high-precision determination of isotope ratios of a particular element involved. Isotope Ratio Mass Spectrometers (IRMS) are widely employed tools for such a high-precision analysis, which have some limitations. This work aims at overcoming the limitations inherent to IRMS by estimating the elemental isotopic abundance from the experimental isotopic distribution. In particular, a computational method has been derived which allows the calculation of 13C/12C ratios from the whole isotopic distributions, given certain caveats, and these calculations are applied to several cases to demonstrate their utility. The limitations of the method in terms of the required number of ions and S/N ratio are discussed. For high-precision estimates of the isotope ratios, this method requires very precise measurement of the experimental isotopic distribution abundances, free from any artifacts introduced by noise, sample heterogeneity, or other experimental sources. PMID:17263354
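For the carbon-only, noise-free case, the monoisotopic (M) and first isotopic (M+1) peak intensities of an n-carbon molecule follow a binomial model, which yields a closed-form estimate of the ¹³C abundance. This one-liner is a sketch of that idea, not the paper's full procedure (which addresses noise, sample heterogeneity, and S/N limits):

```python
def carbon13_abundance(i0, i1, n_carbons):
    """Estimate the 13C abundance p from M and M+1 peak intensities of a
    carbon-only molecule under a binomial isotope model:
        I1/I0 = n*p/(1-p)  =>  p = r/(n + r)  with  r = I1/I0."""
    r = i1 / i0
    return r / (n_carbons + r)
```

For a synthetic 20-carbon molecule the estimator recovers the abundance used to generate the peaks exactly, which is why precise experimental peak abundances are the limiting factor in practice.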
Liu, Hongfang; Maxwell, Kara N.; Pathak, Jyotishman; Zhang, Rui
2018-01-01
Abstract Precision medicine is at the forefront of biomedical research. Cancer registries provide rich perspectives and electronic health records (EHRs) are commonly utilized to gather additional clinical data elements needed for translational research. However, manual annotation is resource‐intense and not readily scalable. Informatics‐based phenotyping presents an ideal solution, but perspectives obtained can be impacted by both data source and algorithm selection. We derived breast cancer (BC) receptor status phenotypes from structured and unstructured EHR data using rule‐based algorithms, including natural language processing (NLP). Overall, the use of NLP increased BC receptor status coverage by 39.2% from 69.1% with structured medication information alone. Using all available EHR data, estrogen receptor‐positive BC cases were ascertained with high precision (P = 0.976) and recall (R = 0.987) compared with gold standard chart‐reviewed patients. However, status negation (R = 0.591) decreased 40.2% when relying on structured medications alone. Using multiple EHR data types (and thorough understanding of the perspectives offered) are necessary to derive robust EHR‐based precision medicine phenotypes. PMID:29084368
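A minimal sketch of rule-based receptor-status extraction with a crude negation-first ordering; the regex patterns are invented for illustration and are far simpler than the structured-data rules and NLP algorithms used in the study:

```python
import re

# Hypothetical patterns for estrogen-receptor (ER) status in free text.
POS = re.compile(r'\bER[\s-]*positive', re.IGNORECASE)
NEG = re.compile(r'\bER[\s-]*negative', re.IGNORECASE)

def er_status(note):
    """Return 'positive', 'negative', or None for a clinical note,
    checking negated mentions first."""
    if NEG.search(note):
        return 'negative'
    if POS.search(note):
        return 'positive'
    return None
```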
Li, Yuanfang; Zhou, Zhiwei
2016-02-01
Precision medicine is a new medical concept and model based on personalized medicine, the rapid progress of genome sequencing technology, and the cross-application of bioinformatics and big-data science. Precision medicine improves the diagnosis and treatment of gastric cancer through deeper analyses of its characteristics, pathogenesis and other core issues. Clinical cancer databases are important for promoting the development of precision medicine, so close attention must be paid to their construction and management. The clinical database of the Sun Yat-sen University Cancer Center is composed of a medical record database, a blood specimen bank, a tissue bank and a medical imaging database. To ensure good data quality, the design and management of the database should follow a strict standard operating procedure (SOP) model. Data sharing is an important way to improve medical research in the era of medical big data; the construction and management of clinical databases must therefore also be strengthened and innovated.
NASA Astrophysics Data System (ADS)
Moreenthaler, George W.; Khatib, Nader; Kim, Byoungsoo
2003-08-01
For two decades now, the use of Remote Sensing/Precision Agriculture to improve farm yields while reducing the use of polluting chemicals and the limited water supply has been a major goal. With world population growing exponentially, arable land being consumed by urbanization, and an unfavorable farm economy, farm efficiency must increase to meet future food requirements and to make farming a sustainable, profitable occupation. "Precision Agriculture" refers to a farming methodology that applies nutrients and moisture only where and when they are needed in the field. The real goal is to increase farm profitability by identifying the additional treatments of chemicals and water that increase revenues more than they increase costs and do not exceed pollution standards (constrained optimization). Even though the economic and environmental benefits appear to be great, Remote Sensing/Precision Agriculture has not grown as rapidly as early advocates envisioned. Technology for a successful Remote Sensing/Precision Agriculture system is now in place, but other needed factors have been missing. Commercial satellite systems can now image the Earth (multi-spectrally) with a resolution as fine as 2.5 m. Precision variable dispensing systems using GPS are now available and affordable. Crop models that predict yield as a function of soil, chemical, and irrigation parameter levels have been developed. Personal computers and internet access are now in place in most farm homes and can provide a mechanism for periodically disseminating advice on what quantities of water and chemicals are needed in specific regions of each field. Several processes have been selected that fuse the disparate sources of information on the current and historic states of the crop and soil, and the remaining resource levels available, with the critical decisions that farmers are required to make. These are done in a way that is easy for the farmer to understand and profitable to implement.
A "Constrained Optimization Algorithm" to further improve these processes will be presented. The objective function of the model will be used to maximize the farmer's profit by increasing yields while decreasing environmental damage and reducing applications of costly treatments. This model will incorporate information from Remote Sensing, from in-situ weather sources, from soil history, and from tacit farmer knowledge of the relative productivity of selected "Management Zones" of the farm, to provide incremental advice throughout the growing season on the optimum usage of water and chemical treatments.
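The constrained-optimization idea can be sketched as a grid search over a treatment rate with an assumed diminishing-returns yield response and a pollution cap on the rate; the response function and every parameter below are illustrative assumptions, not the paper's model.

```python
import math

def best_application_rate(price, a, b, unit_cost, x_max, steps=10000):
    """Grid-search sketch: choose a treatment rate x (e.g. kg/ha) maximising
        profit(x) = price * yield(x) - unit_cost * x
    with an assumed diminishing-returns yield curve
        yield(x) = a * (1 - exp(-b * x))
    and the pollution standard expressed as the cap x <= x_max."""
    best_x, best_profit = 0.0, float('-inf')
    for i in range(steps + 1):
        x = x_max * i / steps
        profit = price * a * (1.0 - math.exp(-b * x)) - unit_cost * x
        if profit > best_profit:
            best_x, best_profit = x, profit
    return best_x, best_profit
```

With this concave response the analytic optimum is x* = ln(price·a·b / unit_cost) / b, so the grid search can be checked against a closed form.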
NASA Astrophysics Data System (ADS)
Wang, Juan; Wang, Jian; Li, Lijuan; Zhou, Kun
2014-08-01
To address information fusion, process integration, and collaborative design and manufacturing for ultra-precision optical elements within life-cycle management, this paper presents a digital management platform based on product data and business processes that adopts modern manufacturing, information and management techniques. The architecture and system integration of the digital management platform are discussed. The platform can realize information sharing and interaction among the information flow, control flow and value stream across the life-cycle, from user needs to product offline, and it can also enhance process control, collaborative research and the service capability of ultra-precision optical elements.
Semiempirical studies of atomic structure. Progress report, 1 July 1984-1 January 1985
DOE Office of Scientific and Technical Information (OSTI.GOV)
Curtis, L.J.
1985-01-01
Through the acquisition and systematization of empirical data, remarkably precise methods for predicting excitation energies, transition wavelengths, transition probabilities, level lifetimes, ionization potentials, core polarizabilities, and core penetrabilities have been and are being developed and applied. Although the data base for heavy, highly ionized atoms is still sparse, much new information has become available since this program was begun in 1980. The purpose of the project is to perform needed measurements and to utilize the available data through parametrized extrapolations and interpolations along isoelectronic, homologous, and Rydberg sequences to provide predictions for large classes of quantities with a precision that is sharpened by subsequent measurements.
Aguilar, M; Aisa, D; Alpat, B; Alvino, A; Ambrosi, G; Andeen, K; Arruda, L; Attig, N; Azzarello, P; Bachlechner, A; Barao, F; Barrau, A; Barrin, L; Bartoloni, A; Basara, L; Battarbee, M; Battiston, R; Bazo, J; Becker, U; Behlmann, M; Beischer, B; Berdugo, J; Bertucci, B; Bigongiari, G; Bindi, V; Bizzaglia, S; Bizzarri, M; Boella, G; de Boer, W; Bollweg, K; Bonnivard, V; Borgia, B; Borsini, S; Boschini, M J; Bourquin, M; Burger, J; Cadoux, F; Cai, X D; Capell, M; Caroff, S; Casaus, J; Cascioli, V; Castellini, G; Cernuda, I; Cervelli, F; Chae, M J; Chang, Y H; Chen, A I; Chen, H; Cheng, G M; Chen, H S; Cheng, L; Chikanian, A; Chou, H Y; Choumilov, E; Choutko, V; Chung, C H; Clark, C; Clavero, R; Coignet, G; Consolandi, C; Contin, A; Corti, C; Coste, B; Crispoltoni, M; Cui, Z; Dai, M; Delgado, C; Della Torre, S; Demirköz, M B; Derome, L; Di Falco, S; Di Masso, L; Dimiccoli, F; Díaz, C; von Doetinchem, P; Donnini, F; Du, W J; Duranti, M; D'Urso, D; Eline, A; Eppling, F J; Eronen, T; Fan, Y Y; Farnesini, L; Feng, J; Fiandrini, E; Fiasson, A; Finch, E; Fisher, P; Galaktionov, Y; Gallucci, G; García, B; García-López, R; Gargiulo, C; Gast, H; Gebauer, I; Gervasi, M; Ghelfi, A; Gillard, W; Giovacchini, F; Goglov, P; Gong, J; Goy, C; Grabski, V; Grandi, D; Graziani, M; Guandalini, C; Guerri, I; Guo, K H; Habiby, M; Haino, S; Han, K C; He, Z H; Heil, M; Hoffman, J; Hsieh, T H; Huang, Z C; Huh, C; Incagli, M; Ionica, M; Jang, W Y; Jinchi, H; Kanishev, K; Kim, G N; Kim, K S; Kirn, Th; Kossakowski, R; Kounina, O; Kounine, A; Koutsenko, V; Krafczyk, M S; Kunz, S; La Vacca, G; Laudi, E; Laurenti, G; Lazzizzera, I; Lebedev, A; Lee, H T; Lee, S C; Leluc, C; Li, H L; Li, J Q; Li, Q; Li, Q; Li, T X; Li, W; Li, Y; Li, Z H; Li, Z Y; Lim, S; Lin, C H; Lipari, P; Lippert, T; Liu, D; Liu, H; Lomtadze, T; Lu, M J; Lu, Y S; Luebelsmeyer, K; Luo, F; Luo, J Z; Lv, S S; Majka, R; Malinin, A; Mañá, C; Marín, J; Martin, T; Martínez, G; Masi, N; Maurin, D; Menchaca-Rocha, A; Meng, Q; Mo, D C; 
Morescalchi, L; Mott, P; Müller, M; Ni, J Q; Nikonov, N; Nozzoli, F; Nunes, P; Obermeier, A; Oliva, A; Orcinha, M; Palmonari, F; Palomares, C; Paniccia, M; Papi, A; Pauluzzi, M; Pedreschi, E; Pensotti, S; Pereira, R; Pilo, F; Piluso, A; Pizzolotto, C; Plyaskin, V; Pohl, M; Poireau, V; Postaci, E; Putze, A; Quadrani, L; Qi, X M; Räihä, T; Rancoita, P G; Rapin, D; Ricol, J S; Rodríguez, I; Rosier-Lees, S; Rozhkov, A; Rozza, D; Sagdeev, R; Sandweiss, J; Saouter, P; Sbarra, C; Schael, S; Schmidt, S M; Schuckardt, D; Schulz von Dratzig, A; Schwering, G; Scolieri, G; Seo, E S; Shan, B S; Shan, Y H; Shi, J Y; Shi, X Y; Shi, Y M; Siedenburg, T; Son, D; Spada, F; Spinella, F; Sun, W; Sun, W H; Tacconi, M; Tang, C P; Tang, X W; Tang, Z C; Tao, L; Tescaro, D; Ting, Samuel C C; Ting, S M; Tomassetti, N; Torsti, J; Türkoğlu, C; Urban, T; Vagelli, V; Valente, E; Vannini, C; Valtonen, E; Vaurynovich, S; Vecchi, M; Velasco, M; Vialle, J P; Wang, L Q; Wang, Q L; Wang, R S; Wang, X; Wang, Z X; Weng, Z L; Whitman, K; Wienkenhöver, J; Wu, H; Xia, X; Xie, M; Xie, S; Xiong, R Q; Xin, G M; Xu, N S; Xu, W; Yan, Q; Yang, J; Yang, M; Ye, Q H; Yi, H; Yu, Y J; Yu, Z Q; Zeissler, S; Zhang, J H; Zhang, M T; Zhang, X B; Zhang, Z; Zheng, Z M; Zhuang, H L; Zhukov, V; Zichichi, A; Zimmermann, N; Zuccon, P; Zurbach, C
2014-11-28
We present a measurement of the cosmic ray (e^{+}+e^{-}) flux in the range 0.5 GeV to 1 TeV based on the analysis of 10.6 million (e^{+}+e^{-}) events collected by AMS. The statistics and the resolution of AMS provide a precision measurement of the flux. The flux is smooth and reveals new and distinct information. Above 30.2 GeV, the flux can be described by a single power law with a spectral index γ=-3.170±0.008(stat+syst)±0.008(energy scale).
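Above 30.2 GeV the flux follows a single power law, so the ratio of fluxes at two energies depends only on the reported spectral index; a one-line sketch of that scaling:

```python
def flux_ratio(e1, e2, gamma=-3.170):
    """Ratio Phi(e2)/Phi(e1) for a single power law Phi(E) ~ E**gamma,
    using the spectral index reported by AMS above 30.2 GeV."""
    return (e2 / e1) ** gamma
```

Doubling the energy thus reduces the flux by a factor of 2**3.170, roughly ninefold.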
DOE Office of Scientific and Technical Information (OSTI.GOV)
Aguilar, M.; Aisa, D.; Alpat, B.
2014-11-26
We present a measurement of the cosmic ray (e^{+}+e^{-}) flux in the range 0.5 GeV to 1 TeV based on the analysis of 10.6 million (e^{+}+e^{-}) events collected by AMS. The statistics and the resolution of AMS provide a precision measurement of the flux. The flux is smooth and reveals new and distinct information. Above 30.2 GeV, the flux can be described by a single power law with a spectral index γ = -3.170 ± 0.008 (stat+syst) ± 0.008 (energy scale).
NASA Astrophysics Data System (ADS)
Aguilar, M.; Aisa, D.; Alpat, B.; Alvino, A.; Ambrosi, G.; Andeen, K.; Arruda, L.; Attig, N.; Azzarello, P.; Bachlechner, A.; Barao, F.; Barrau, A.; Barrin, L.; Bartoloni, A.; Basara, L.; Battarbee, M.; Battiston, R.; Bazo, J.; Becker, U.; Behlmann, M.; Beischer, B.; Berdugo, J.; Bertucci, B.; Bigongiari, G.; Bindi, V.; Bizzaglia, S.; Bizzarri, M.; Boella, G.; de Boer, W.; Bollweg, K.; Bonnivard, V.; Borgia, B.; Borsini, S.; Boschini, M. J.; Bourquin, M.; Burger, J.; Cadoux, F.; Cai, X. D.; Capell, M.; Caroff, S.; Casaus, J.; Cascioli, V.; Castellini, G.; Cernuda, I.; Cervelli, F.; Chae, M. J.; Chang, Y. H.; Chen, A. I.; Chen, H.; Cheng, G. M.; Chen, H. S.; Cheng, L.; Chikanian, A.; Chou, H. Y.; Choumilov, E.; Choutko, V.; Chung, C. H.; Clark, C.; Clavero, R.; Coignet, G.; Consolandi, C.; Contin, A.; Corti, C.; Coste, B.; Crispoltoni, M.; Cui, Z.; Dai, M.; Delgado, C.; Della Torre, S.; Demirköz, M. B.; Derome, L.; Di Falco, S.; Di Masso, L.; Dimiccoli, F.; Díaz, C.; von Doetinchem, P.; Donnini, F.; Du, W. J.; Duranti, M.; D'Urso, D.; Eline, A.; Eppling, F. J.; Eronen, T.; Fan, Y. Y.; Farnesini, L.; Feng, J.; Fiandrini, E.; Fiasson, A.; Finch, E.; Fisher, P.; Galaktionov, Y.; Gallucci, G.; García, B.; García-López, R.; Gargiulo, C.; Gast, H.; Gebauer, I.; Gervasi, M.; Ghelfi, A.; Gillard, W.; Giovacchini, F.; Goglov, P.; Gong, J.; Goy, C.; Grabski, V.; Grandi, D.; Graziani, M.; Guandalini, C.; Guerri, I.; Guo, K. H.; Habiby, M.; Haino, S.; Han, K. C.; He, Z. H.; Heil, M.; Hoffman, J.; Hsieh, T. H.; Huang, Z. C.; Huh, C.; Incagli, M.; Ionica, M.; Jang, W. Y.; Jinchi, H.; Kanishev, K.; Kim, G. N.; Kim, K. S.; Kirn, Th.; Kossakowski, R.; Kounina, O.; Kounine, A.; Koutsenko, V.; Krafczyk, M. S.; Kunz, S.; La Vacca, G.; Laudi, E.; Laurenti, G.; Lazzizzera, I.; Lebedev, A.; Lee, H. T.; Lee, S. C.; Leluc, C.; Li, H. L.; Li, J. Q.; Li, Q.; Li, Q.; Li, T. X.; Li, W.; Li, Y.; Li, Z. H.; Li, Z. Y.; Lim, S.; Lin, C. 
H.; Lipari, P.; Lippert, T.; Liu, D.; Liu, H.; Lomtadze, T.; Lu, M. J.; Lu, Y. S.; Luebelsmeyer, K.; Luo, F.; Luo, J. Z.; Lv, S. S.; Majka, R.; Malinin, A.; Mañá, C.; Marín, J.; Martin, T.; Martínez, G.; Masi, N.; Maurin, D.; Menchaca-Rocha, A.; Meng, Q.; Mo, D. C.; Morescalchi, L.; Mott, P.; Müller, M.; Ni, J. Q.; Nikonov, N.; Nozzoli, F.; Nunes, P.; Obermeier, A.; Oliva, A.; Orcinha, M.; Palmonari, F.; Palomares, C.; Paniccia, M.; Papi, A.; Pauluzzi, M.; Pedreschi, E.; Pensotti, S.; Pereira, R.; Pilo, F.; Piluso, A.; Pizzolotto, C.; Plyaskin, V.; Pohl, M.; Poireau, V.; Postaci, E.; Putze, A.; Quadrani, L.; Qi, X. M.; Räihä, T.; Rancoita, P. G.; Rapin, D.; Ricol, J. S.; Rodríguez, I.; Rosier-Lees, S.; Rozhkov, A.; Rozza, D.; Sagdeev, R.; Sandweiss, J.; Saouter, P.; Sbarra, C.; Schael, S.; Schmidt, S. M.; Schuckardt, D.; Schulz von Dratzig, A.; Schwering, G.; Scolieri, G.; Seo, E. S.; Shan, B. S.; Shan, Y. H.; Shi, J. Y.; Shi, X. Y.; Shi, Y. M.; Siedenburg, T.; Son, D.; Spada, F.; Spinella, F.; Sun, W.; Sun, W. H.; Tacconi, M.; Tang, C. P.; Tang, X. W.; Tang, Z. C.; Tao, L.; Tescaro, D.; Ting, Samuel C. C.; Ting, S. M.; Tomassetti, N.; Torsti, J.; Türkoǧlu, C.; Urban, T.; Vagelli, V.; Valente, E.; Vannini, C.; Valtonen, E.; Vaurynovich, S.; Vecchi, M.; Velasco, M.; Vialle, J. P.; Wang, L. Q.; Wang, Q. L.; Wang, R. S.; Wang, X.; Wang, Z. X.; Weng, Z. L.; Whitman, K.; Wienkenhöver, J.; Wu, H.; Xia, X.; Xie, M.; Xie, S.; Xiong, R. Q.; Xin, G. M.; Xu, N. S.; Xu, W.; Yan, Q.; Yang, J.; Yang, M.; Ye, Q. H.; Yi, H.; Yu, Y. J.; Yu, Z. Q.; Zeissler, S.; Zhang, J. H.; Zhang, M. T.; Zhang, X. B.; Zhang, Z.; Zheng, Z. M.; Zhuang, H. L.; Zhukov, V.; Zichichi, A.; Zimmermann, N.; Zuccon, P.; Zurbach, C.; AMS Collaboration
2014-11-01
We present a measurement of the cosmic ray (e⁺ + e⁻) flux in the range 0.5 GeV to 1 TeV based on the analysis of 10.6 million (e⁺ + e⁻) events collected by AMS. The statistics and the resolution of AMS provide a precision measurement of the flux. The flux is smooth and reveals new and distinct information. Above 30.2 GeV, the flux can be described by a single power law with a spectral index γ = −3.170 ± 0.008 (stat + syst) ± 0.008 (energy scale).
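A single power law means the flux scales as Φ(E) = C·E^γ, so the spectral index can be recovered as the slope of a straight-line fit in log-log space. A minimal sketch with synthetic numbers (not AMS data; constants chosen only for illustration):

```python
import math

# Synthetic flux points following Phi(E) = C * E**gamma (illustrative values).
gamma_true, C = -3.170, 1.0e4
energies = [50.0, 100.0, 200.0, 400.0, 800.0]  # GeV
flux = [C * E ** gamma_true for E in energies]

# In log-log space the power law is linear: log(Phi) = log(C) + gamma * log(E),
# so an ordinary least-squares slope recovers the spectral index.
xs = [math.log(E) for E in energies]
ys = [math.log(F) for F in flux]
n = len(xs)
mx, my = sum(xs) / n, sum(ys) / n
gamma_fit = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
assert abs(gamma_fit - gamma_true) < 1e-9  # exact data, so the fit is exact
```

Real spectral fits weight each point by its statistical and systematic uncertainty; the unweighted fit above only illustrates the log-log linearity.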
Quality Assurance handbook for air pollution measurement systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1990-08-01
The purpose of this volume of the QA Handbook is to provide information and guidance for both the meteorologist and the non-meteorologist who must make judgments about the validity of data and the accuracy of measurement systems. Care has been taken to provide definitions that help those making these judgments communicate without ambiguity. The handbook describes methods that objectively define the quality of measurements, so that the non-meteorologist can communicate with the meteorologist, environmental scientist, or engineer with precision of meaning.
F. Mauro; Vicente J. Monleon; H. Temesgen; L.A. Ruiz
2017-01-01
Accounting for spatial correlation of LiDAR model errors can improve the precision of model-based estimators. To estimate spatial correlation, sample designs that provide close observations are needed, but their implementation might be prohibitively expensive. To quantify the gains obtained by accounting for the spatial correlation of model errors, we examined (
ERIC Educational Resources Information Center
Stolzberg, Richard J.
1986-01-01
Background information and experimental procedures are provided for an experiment in which three samples of saccharin (a nickel plating solution, a dilute cola drink, and a more concentrated cola drink) are analyzed and the data interpreted using five methods. Precision and accuracy are evaluated and the best method is selected. (JN)
NASA Astrophysics Data System (ADS)
Powolny, F.; Auffray, E.; Brunner, S. E.; Garutti, E.; Goettlich, M.; Hillemanns, H.; Jarron, P.; Lecoq, P.; Meyer, T.; Schultz-Coulon, H. C.; Shen, W.; Williams, M. C. S.
2011-06-01
Time of flight (TOF) measurements in positron emission tomography (PET) are very challenging in terms of timing performance, and should ideally achieve less than 100 ps FWHM precision. We present a time-based differential technique to read out silicon photomultipliers (SiPMs) which has less than 20 ps FWHM electronic jitter. The novel readout is a fast front-end circuit (NINO) based on a first-stage differential current-mode amplifier with 20 Ω input resistance. Therefore the amplifier inputs are connected differentially to the SiPM's anode and cathode ports. The leading edge of the output signal provides the time information, while the trailing edge provides the energy information. Based on a Monte Carlo photon-generation model, HSPICE simulations were run with a 3 × 3 mm² SiPM model, read out with a differential current amplifier. The results of these simulations are presented here and compared with experimental data obtained with a 3 × 3 × 15 mm³ LSO crystal coupled to a SiPM. The measured time coincidence precision and the limitations in the overall timing accuracy are interpreted using Monte Carlo/SPICE simulation, Poisson statistics, and geometric effects of the crystal.
A new field-laboratory methodology for assessing human response to noise
NASA Technical Reports Server (NTRS)
Borsky, P. N.
1973-01-01
Gross measures of community annoyance with intrusive noises have been made in a number of real-environment surveys, which indicate that aircraft noise may have to be reduced by 30-40 EPNdB before it will generally be considered acceptable. Interview studies, however, cannot provide the precise information on variable human response to different types and degrees of noise exposure that noise abatement engineers need. A new methodological field-survey approach has been developed to provide such information. The integrated attitudes and experiences of a random sample of subjects in the real environment are obtained by a prior field survey. Then these subjects record their more precise responses to controlled noise exposures in a new realistic laboratory. The laboratory is a sound chamber furnished as a typical living room (18 ft x 14 ft), and subjects watch a color TV program while they judge simulated aircraft flyovers that occur at controlled levels and intervals. Methodological experiments indicate that subjects in the laboratory have the sensation that the airplanes are actually moving overhead across the ceiling of the chamber. It was also determined that annoyance judgments in the laboratory stabilize after three flyovers are heard prior to a judgment of annoyance.
Pina, Athanasia; Begou, Olga; Kanelis, Dimitris; Gika, Helen; Kalogiannis, Stavros; Tananaki, Chrysoula; Theodoridis, Georgios; Zotou, Anastasia
2018-01-05
In the present work, a Hydrophilic Interaction Liquid Chromatography-tandem Mass Spectrometry (HILIC-MS/MS) method was developed for the efficient separation and quantification of a large number of small polar bioactive molecules in royal jelly. The method was validated and provided satisfactory detection sensitivity for 88 components. Quantification was proven to be precise for 64 components, exhibiting good linearity, recoveries >90% for the majority of analytes, and intra- and inter-day precision from 0.14 to 20% RSD. Analysis of 125 fresh royal jelly samples of Greek origin provided useful information on royal jelly's hydrophilic bioactive components, revealing lysine, ribose, proline, melezitose and glutamic acid to be in high abundance. In addition, 18 hydrophilic nutrients not previously reported as royal jelly constituents are shown to occur.
Circulating tumor DNA as a liquid biopsy target for detection of pancreatic cancer
Takai, Erina; Yachida, Shinichi
2016-01-01
Most pancreatic cancer patients present with advanced metastatic disease, resulting in extremely poor 5-year survival, mainly because of the lack of a reliable modality for early detection and limited therapeutic options for advanced disease. Therefore, there is a need for minimally-invasive diagnostic tools for detecting pancreatic cancer at an early stage, when curative surgery and also novel therapeutic approaches including precision medicine may be feasible. The “liquid biopsy” addresses these unmet clinical needs based on the concept that simple peripheral blood sampling and detection of circulating tumor DNA (ctDNA) could provide diagnostic information. In this review, we provide an overview of the current status of blood-based tests for diagnosis of pancreatic cancer and the potential utility of ctDNA for precision medicine. We also discuss challenges that remain to be addressed in developing practical ctDNA-based liquid biopsy approaches for early diagnosis of pancreatic cancer. PMID:27784960
Precision measurement of transition matrix elements via light shift cancellation.
Herold, C D; Vaidya, V D; Li, X; Rolston, S L; Porto, J V; Safronova, M S
2012-12-14
We present a method for accurate determination of atomic transition matrix elements at the 10⁻³ level. Measurements of the ac Stark (light) shift around "magic-zero" wavelengths, where the light shift vanishes, provide precise constraints on the matrix elements. We make the first measurement of the 5s-6p matrix elements in rubidium by measuring the light shift around the 421 and 423 nm zeros through diffraction of a condensate off a sequence of standing-wave pulses. In conjunction with existing theoretical and experimental data, we find 0.3235(9) ea₀ and 0.5230(8) ea₀ for the 5s-6p₁/₂ and 5s-6p₃/₂ elements, respectively, an order of magnitude more accurate than the best theoretical values. This technique can provide needed, accurate matrix elements for many atoms, including those used in atomic clocks, tests of fundamental symmetries, and quantum information.
NASA Astrophysics Data System (ADS)
Rennick, Chris; Bausi, Francesco; Arnold, Tim
2017-04-01
On the global scale, methane (CH4) concentrations have more than doubled over the last 150 years, and the contribution to the enhanced greenhouse effect is almost half of that due to the increase in carbon dioxide (CO2) over the same period. Microbial, fossil fuel, biomass burning and landfill are dominant methane sources with differing annual variabilities; however, in the UK for example, mixing ratio measurements from a tall tower network and regional-scale inversion modelling have thus far been unable to disaggregate emissions from specific source categories with any significant certainty. Measurement of the methane isotopologue ratios will provide the additional information needed for more robust sector attribution, which will be important for directing policy action. Here we explore the potential for isotope ratio measurements to improve the interpretation of atmospheric mixing ratios beyond calculation of total UK emissions, and describe current analytical work at the National Physical Laboratory that will realise deployment of such measurements. We simulate isotopic variations at the four UK greenhouse gas tall tower network sites to understand where deployment of the first isotope analyser would be best situated. We calculate the levels of precision needed in both δ-13C and δ-D in order to detect particular scenarios of emissions. Spectroscopic measurement in the infrared by quantum cascade laser (QCL) absorption is a well-established technique to quantify the mixing ratios of trace species in atmospheric samples and, as was demonstrated in 2016, high-precision measurements are possible when it is coupled to a suitable preconcentrator. The current preconcentration system under development at NPL is designed to make the highest precision measurements yet of the standard isotope ratios via a new large-volume cryogenic trap design and controlled thermal desorption into a QCL spectrometer.
Finally we explore the potential for the measurement of clumped isotopes at high frequency and precision. The doubly-substituted 13CH3D isotopologue is a tracer for methane formed at geological temperatures, and will provide additional information for identification of these sources.
NASA Astrophysics Data System (ADS)
Huang, Y. W.; Berman, E. S.; Owano, T. G.; Verfaillie, J. G.; Oikawa, P. Y.; Baldocchi, D. D.; Still, C. J.; Gardner, A.; Baer, D. S.; Rastogi, B.
2015-12-01
Stable CO2 isotopes provide information on biogeochemical processes that occur at the soil-plant-atmosphere interface. While δ13C measurement can provide information on the sources of the CO2, be it photosynthesis, natural gas combustion, other fossil fuel sources, landfills or other sources, δ18O and δ17O are thought to be determined by the hydrological cycling of the CO2. Though researchers have called for analytical tools for CO2 isotope measurements that are reliable and field-deployable, developing such an instrument remains a challenge. The carbon dioxide isotope analyzer developed by Los Gatos Research (LGR) uses LGR's patented Off-Axis ICOS (Integrated Cavity Output Spectroscopy) technology and incorporates proprietary internal thermal control for high sensitivity and optimal instrument stability. This new and improved analyzer measures CO2 concentration as well as δ13C, δ18O, and δ17O from CO2 at natural abundance (150-2500 ppm). The laboratory precision is ±200 ppb (1σ) in CO2 at 1 s, with a long-term (2 min) precision of ±20 ppb. The 1-second precision for both δ13C and δ18O is 0.7 ‰, and for δ17O is 1.8 ‰. The long-term (2 min) precision for both δ13C and δ18O is 0.08 ‰, and for δ17O is 0.18 ‰. The instrument has improved precision, stability and user interface over previous LGR CO2 isotope instruments and can be easily programmed for periodic referencing and sampling from different sources when coupled with LGR's multiport inlet unit (MIU). We have deployed two of these instruments at two different field sites, one at Twitchell Island in Sacramento County, CA to monitor the CO2 isotopic fluxes from an alfalfa field from 6/29/2015-7/13/2015, and the other at the Wind River Experimental Forest in Washington to monitor primarily the oxygen isotopes of CO2 within the canopy from 8/4/2015 through mid-November 2015. Methodology, laboratory development and testing, and field performance are presented.
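The δ-values quoted in these abstracts follow standard isotope delta notation, in which a sample's isotope ratio is expressed relative to a reference ratio in per mil. A minimal sketch (the VPDB ¹³C/¹²C reference ratio below is the commonly quoted standard value, included here only for illustration):

```python
# Commonly quoted 13C/12C ratio of the VPDB reference standard.
R_VPDB = 0.011180

def delta13C_permil(r_sample: float, r_reference: float = R_VPDB) -> float:
    """Standard delta notation, in per mil (parts per thousand):
    delta = (R_sample / R_reference - 1) * 1000."""
    return (r_sample / r_reference - 1.0) * 1000.0

# A sample with the reference ratio has delta = 0 per mil by definition.
assert delta13C_permil(R_VPDB) == 0.0
# A ratio below the reference gives a negative delta (typical of atmospheric CO2).
assert delta13C_permil(0.011090) < 0.0
```

The reported analyzer precisions (e.g. 0.08 ‰ for δ13C) are uncertainties on this per-mil quantity, not on the raw ratio.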
Development of Smart Precision Forest in Conifer Plantation in Japan Using Laser Scanning Data
NASA Astrophysics Data System (ADS)
Katoh, M.; Deng, S.; Takenaka, Y.; Cheung, K.; Oono, K.; Horisawa, M.; Hyyppä, J.; Yu, X.; Liang, X.; Wang, Y.
2017-10-01
Currently, the authors are planning to launch a consortium effort toward Japan's first smart precision forestry project using laser data and to develop this technology throughout the country. Smart precision forestry based on the Nagano model (laser scanning from aircraft, drone, and backpack) is being developed to improve the sophistication of forest information, reduce labor-intensive work, maintain sustainable timber productivity, and facilitate supply chain management through laser sensing information, in collaboration with industry, academia, and government. In this paper, we outline the research project and the state of technical development of unmanned aerial vehicle laser scanning.
High-rate RTK and PPP multi-GNSS positioning for small-scale dynamic displacements monitoring
NASA Astrophysics Data System (ADS)
Paziewski, Jacek; Sieradzki, Rafał; Baryła, Radosław; Wielgosz, Pawel
2017-04-01
The monitoring of dynamic displacements and deformations of engineering structures such as buildings, towers and bridges is of great interest for several practical and theoretical reasons, the most important being to provide the information required for safe maintenance of the constructions. The high temporal resolution and precision of GNSS observations predestine this technology for the most demanding applications in terms of accuracy, availability and reliability. The GNSS technique, supported by appropriate processing methodology, may meet the specific demands and requirements of ground and structure monitoring. Thus, high-rate multi-GNSS signals may be used as a reliable source of information on dynamic displacements of the ground and engineering structures, also in real-time applications. In this study we present initial results of applying precise relative GNSS positioning to the detection of small-scale (cm-level), high-temporal-resolution dynamic displacements. We also describe the methodology and algorithms implemented in self-developed software for relative positioning using high-rate dual-frequency phase and pseudorange GPS+Galileo observations, and additionally apply the Precise Point Positioning technique to the same task. The experiment used observations from high-rate (20 Hz) geodetic receivers, with dynamic displacements simulated by a specially constructed device that moved the GNSS antenna with a prescribed amplitude and frequency. The results indicate that dynamic displacements of the GNSS antenna can be detected even at the level of a few millimetres using both the relative and Precise Point Positioning techniques after suitable signal processing.
Obligatory encoding of task-irrelevant features depletes working memory resources.
Marshall, Louise; Bays, Paul M
2013-02-18
Selective attention is often considered the "gateway" to visual working memory (VWM). However, the extent to which we can voluntarily control which of an object's features enter memory remains subject to debate. Recent research has converged on the concept of VWM as a limited commodity distributed between elements of a visual scene. Consequently, as memory load increases, the fidelity with which each visual feature is stored decreases. Here we used changes in recall precision to probe whether task-irrelevant features were encoded into VWM when individuals were asked to store specific feature dimensions. Recall precision for both color and orientation was significantly enhanced when task-irrelevant features were removed, but knowledge of which features would be probed provided no advantage over having to memorize both features of all items. Next, we assessed the effect an interpolated orientation-or color-matching task had on the resolution with which orientations in a memory array were stored. We found that the presence of orientation information in the second array disrupted memory of the first array. The cost to recall precision was identical whether the interfering features had to be remembered, attended to, or could be ignored. Therefore, it appears that storing, or merely attending to, one feature of an object is sufficient to promote automatic encoding of all its features, depleting VWM resources. However, the precision cost was abolished when the match task preceded the memory array. So, while encoding is automatic, maintenance is voluntary, allowing resources to be reallocated to store new visual information.
Towards precision medicine; a new biomedical cosmology.
Vegter, M W
2018-02-10
Precision Medicine has become a common label for data-intensive and patient-driven biomedical research. Its intended future is reflected in endeavours such as the Precision Medicine Initiative in the USA. This article addresses the question whether it is possible to discern a new 'medical cosmology' in Precision Medicine, a concept that was developed by Nicholas Jewson to describe comprehensive transformations involving various dimensions of biomedical knowledge and practice, such as vocabularies, the roles of patients and physicians and the conceptualisation of disease. Subsequently, I will elaborate my assessment of the features of Precision Medicine with the help of Michel Foucault, by exploring how precision medicine involves a transformation along three axes: the axis of biomedical knowledge, of biomedical power and of the patient as a self. Patients are encouraged to become the managers of their own health status, while the medical domain is reframed as a data-sharing community, characterised by changing power relationships between providers and patients, producers and consumers. While the emerging Precision Medicine cosmology may surpass existing knowledge frameworks; it obscures previous traditions and reduces research-subjects to mere data. This in turn, means that the individual is both subjected to the neoliberal demand to share personal information, and at the same time has acquired the positive 'right' to become a member of the data-sharing community. The subject has to constantly negotiate the meaning of his or her data, which can either enable self-expression, or function as a commanding Superego.
Dental implant imaging: TeraRecon's Dental 3D Cone Beam Computed Tomography System.
Garg, Arun K
2007-06-01
Early in the development of implant technology, conventional dental imaging techniques were limited for evaluating the patient for implant surgery. During the treatment-planning phase, the recipient bed is routinely assessed by visual examination and palpation, as well as by periapical and panoramic radiology. These two imaging modalities provide a two-dimensional image of the mesiodistal and occlusoapical dimensions of the edentulous regions where the implants might be placed. When adequate occlusoapical bone height is available for endosteal implants, the buccolingual width and angulation of the available bone are the most important criteria for implant selection and success. However, neither buccolingual width nor angulation can be visualized on most traditional radiographs. Although clinical examination and traditional radiographs may be adequate for patients with wide residual ridges that exhibit sufficient bone crestal to the mandibular nerve and maxillary sinus, these methods do not allow for the precise measurement of the buccolingual dimension of the bone or assessment of the location of unanticipated undercuts. Because of these concerns, it is necessary to view the recipient site in a plane through the arch of the maxilla or mandible in the region of the proposed implants. Implant surgeons soon recognized that, for the optimum placement of implants, cross-sectional views of the maxilla and mandible are the ideal means for providing necessary preoperative information. For complex cases where multiple implants are required or where anatomical measurements are crucial, but also increasingly for more routine cases, more and more clinicians are recommending CT scan imaging procedure such as that offered by TeraRecon's Dental CBCT system. Because of its ability to reconstruct a fully three-dimensional model of the maxilla and mandible, CBCT provides a highly sophisticated format for precisely defining the jaw structure and locating critical anatomic structures. 
CBCT scans, in conjunction with software that renders immediate treatment plans using the most real and accurate information, provide the most precise radiographic modality currently available for the evaluation of patients for oral implants.
An extensive review of commercial product labels: the good, bad and ugly.
Mrvos, R; Dean, B S; Krenzelok, E P
1986-02-01
Cautions and warnings on consumer products play an important role in the prevention and treatment of poison exposures. Frequently those exposed will follow the directions before calling the poison center, physician or emergency room. An extensive label review of 200 commercial products was conducted to determine if medical treatment advice was correct, if the general public was able to comprehend warning statements, and if warnings were adequate. We conclude there are products available that provide precise, correct information. However, there are many that contain incorrect, misleading, and often dangerous information to an unsuspecting public. Various examples of both types are given to make the poison information specialist aware of what information is presented.
Maximizing the Biochemical Resolving Power of Fluorescence Microscopy
Esposito, Alessandro; Popleteeva, Marina; Venkitaraman, Ashok R.
2013-01-01
Most recent advances in fluorescence microscopy have focused on achieving spatial resolutions below the diffraction limit. However, the inherent capability of fluorescence microscopy to non-invasively resolve different biochemical or physical environments in biological samples has not yet been formally described, because an adequate and general theoretical framework is lacking. Here, we develop a mathematical characterization of the biochemical resolution in fluorescence detection with Fisher information analysis. To improve the precision and the resolution of quantitative imaging methods, we demonstrate strategies for the optimization of fluorescence lifetime, fluorescence anisotropy and hyperspectral detection, as well as different multi-dimensional techniques. We describe optimized imaging protocols, provide optimization algorithms and describe precision and resolving power in biochemical imaging thanks to the analysis of the general properties of Fisher information in fluorescence detection. These strategies enable the optimal use of the information content available within the limited photon-budget typically available in fluorescence microscopy. This theoretical foundation leads to a generalized strategy for the optimization of multi-dimensional optical detection, and demonstrates how the parallel detection of all properties of fluorescence can maximize the biochemical resolving power of fluorescence microscopy, an approach we term Hyper Dimensional Imaging Microscopy (HDIM). Our work provides a theoretical framework for the description of the biochemical resolution in fluorescence microscopy, irrespective of spatial resolution, and for the development of a new class of microscopes that exploit multi-parametric detection systems. PMID:24204821
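The precision limits formalized in the abstract above rest on the Cramér-Rao bound: the Fisher information of the photon data caps how precisely a parameter can be estimated from a limited photon budget. A generic, minimal sketch for mono-exponential lifetime estimation (illustrative numbers; this is not the paper's HDIM method):

```python
import math
import random

def crlb_sigma(tau: float, n: int) -> float:
    """Cramer-Rao lower bound on the std. dev. of a lifetime estimate
    from n photon arrival times drawn from an exponential decay.
    Per-photon Fisher information for tau is 1/tau**2, so the bound
    on an unbiased estimator is tau / sqrt(n)."""
    return tau / math.sqrt(n)

def mle_lifetime(samples):
    """Maximum-likelihood lifetime estimate for exponential data:
    simply the sample mean of the arrival times."""
    return sum(samples) / len(samples)

rng = random.Random(42)  # deterministic for reproducibility
tau_true, n = 2.5, 20000  # lifetime (e.g. ns) and photon budget
photons = [rng.expovariate(1.0 / tau_true) for _ in range(n)]
tau_hat = mle_lifetime(photons)
bound = crlb_sigma(tau_true, n)  # about 0.018 for these numbers
# The MLE is efficient here, so the estimate lands within a few bounds.
assert abs(tau_hat - tau_true) < 5 * bound
```

Doubling the photon budget shrinks the bound only by √2, which is why the paper's multi-dimensional detection strategies, extracting more information per photon, matter.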
NASA Astrophysics Data System (ADS)
Uijt de Haag, Maarten; Campbell, Jacob; van Graas, Frank
2005-05-01
Synthetic Vision Systems (SVS) provide pilots with a virtual visual depiction of the external environment. When using SVS for aircraft precision approach guidance, accurate positioning relative to the runway with a high level of integrity is required. Precision approach guidance systems in use today require ground-based electronic navigation components with at least one installation at each airport, and in many cases multiple installations to service approaches to all qualifying runways. A terrain-referenced approach guidance system is envisioned to provide precision guidance to an aircraft without the use of ground-based electronic navigation components installed at the airport. This autonomy makes it a good candidate for integration with an SVS. At the Ohio University Avionics Engineering Center (AEC), work has been underway in the development of such a terrain-referenced navigation system. When used in conjunction with an Inertial Measurement Unit (IMU) and a high accuracy/resolution terrain database, this terrain-referenced navigation system can provide navigation and guidance information to the pilot on an SVS or conventional instruments. The terrain-referenced navigation system under development at AEC operates on principles similar to other terrain navigation systems: a ground-sensing sensor (in this case an airborne laser scanner) gathers range measurements to the terrain; these data are then matched with an onboard terrain database to find the most likely position solution and used to update an inertial sensor-based navigator. AEC's system design differs from today's common terrain navigators in its use of a high-resolution terrain database (~1 meter post spacing) in conjunction with an airborne laser scanner capable of providing tens of thousands of independent terrain elevation measurements per second with centimeter-level accuracies.
When combined with data from an inertial navigator, the high-resolution terrain database and laser scanner system is capable of providing near meter-level horizontal and vertical position estimates. Furthermore, the system under development capitalizes on (1) the position and integrity benefits provided by the Wide Area Augmentation System (WAAS), which reduce the initial search space size, and (2) the availability of high accuracy/resolution databases. This paper presents results from flight tests in which the terrain-referenced navigator is used to provide guidance cues for a precision approach.
Precision Therapy of Head and Neck Squamous Cell Carcinoma.
Polverini, P J; D'Silva, N J; Lei, Y L
2018-06-01
Precision medicine is an approach to disease prevention and treatment that takes into account genetic variability and environmental and lifestyle influences that are unique to each patient. It facilitates stratification of patient populations that vary in their susceptibility to disease and response to therapy. Shared databases and the implementation of new technology systems designed to advance the integration of this information will enable health care providers to more accurately predict and customize prevention and treatment strategies for patients. Although precision medicine has had a limited impact in most areas of medicine, it has been shown to be an increasingly successful approach to cancer therapy. Despite early promising results targeting aberrant signaling pathways or inhibitors designed to block tumor-driven processes such as angiogenesis, limited success emphasizes the need to discover new biomarkers and treatment targets that are more reliable in predicting response to therapy and result in better health outcomes. Recent successes in the use of immunity-inducing antibodies have stimulated increased interest in the use of precision immunotherapy of head and neck squamous cell carcinoma. Using next-generation sequencing, the precise profiling of tumor-infiltrating lymphocytes has great promise to identify hypoimmunogenic cancer that would benefit from a rationally designed combinatorial approach. Continued interrogation of tumors will reveal new actionable targets with increasing therapeutic efficacy and fulfill the promise of precision therapy of head and neck cancer.
Chen, Yixi; Guzauskas, Gregory F; Gu, Chengming; Wang, Bruce C M; Furnback, Wesley E; Xie, Guotong; Dong, Peng; Garrison, Louis P
2016-11-02
The "big data" era represents an exciting opportunity to utilize powerful new sources of information to reduce clinical and health economic uncertainty on an individual patient level. In turn, health economic outcomes research (HEOR) practices will need to evolve to accommodate individual patient-level HEOR analyses. We propose the concept of "precision HEOR", which utilizes a combination of costs and outcomes derived from big data to inform healthcare decision-making that is tailored to highly specific patient clusters or individuals. To explore this concept, we discuss the current and future roles of HEOR in health sector decision-making, big data and predictive analytics, and several key HEOR contexts in which big data and predictive analytics might transform traditional HEOR into precision HEOR. The guidance document addresses issues related to the transition from traditional to precision HEOR practices, the evaluation of patient similarity analysis and its appropriateness for precision HEOR analysis, and future challenges to precision HEOR adoption. Precision HEOR should make precision medicine more realizable by aiding and adapting healthcare resource allocation. The combined hopes for precision medicine and precision HEOR are that individual patients receive the best possible medical care while overall healthcare costs remain manageable or become more cost-efficient.
IoT for Real-Time Measurement of High-Throughput Liquid Dispensing in Laboratory Environments.
Shumate, Justin; Baillargeon, Pierre; Spicer, Timothy P; Scampavia, Louis
2018-04-01
Critical to maintaining quality control in high-throughput screening is the need for constant monitoring of liquid-dispensing fidelity. Traditional methods involve operator intervention with gravimetric analysis to monitor the gross accuracy of full plate dispenses, visual verification of contents, or dedicated weigh stations on screening platforms that introduce potential bottlenecks and increase the plate-processing cycle time. We present a unique solution using open-source hardware, software, and 3D printing to automate dispenser accuracy determination by providing real-time dispense weight measurements via a network-connected precision balance. This system uses an Arduino microcontroller to connect a precision balance to a local network. By integrating the precision balance as an Internet of Things (IoT) device, it gains the ability to provide real-time gravimetric summaries of dispensing, generate timely alerts when problems are detected, and capture historical dispensing data for future analysis. All collected data can then be accessed via a web interface for reviewing alerts and dispensing information in real time or remotely for timely intervention of dispense errors. The development of this system also leveraged 3D printing to rapidly prototype sensor brackets, mounting solutions, and component enclosures.
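The gravimetric check such a system performs can be sketched in a few lines. This is a hypothetical illustration, not the authors' implementation: `check_dispenses`, the tolerance value, and the alert format are all invented for the example.

```python
# Hypothetical sketch of the gravimetric QC check an IoT balance service
# might run: flag dispenses whose measured weight deviates from the target
# by more than a percentage tolerance, and report the running mean.

def check_dispenses(weights_mg, target_mg, tol_pct=5.0):
    """Return (mean weight, list of (index, weight) alerts outside tolerance)."""
    alerts = [(i, w) for i, w in enumerate(weights_mg)
              if abs(w - target_mg) / target_mg * 100.0 > tol_pct]
    mean = sum(weights_mg) / len(weights_mg)
    return mean, alerts

# Fourth dispense is 11% light, so it alone trips the 5% tolerance.
mean, alerts = check_dispenses([10.1, 9.9, 10.0, 8.9], target_mg=10.0)
```

In the real system this logic would run server-side against weights streamed from the Arduino, with alerts pushed to the web interface.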
2016-06-03
12th US-Japan Seminar: Many Body Quantum Systems from Quantum Gases to Metrology and Information Processing. Support was provided for the 12th US-Japan Seminar on many-body quantum systems, held in Madison, Wisconsin, from September 20 to 24, 2015, at the Monona Terrace Convention Center. Agenda topics included an ultracold atoms session (Zelevinsky, Ye, Inouye), high-precision spectroscopy with two-body quantum systems, and a low-entropy quantum gas of polar molecules.
Integration of Temporal and Ordinal Information During Serial Interception Sequence Learning
Gobel, Eric W.; Sanchez, Daniel J.; Reber, Paul J.
2011-01-01
The expression of expert motor skills typically involves learning to perform a precisely timed sequence of movements (e.g., language production, music performance, athletic skills). Research examining incidental sequence learning has previously relied on a perceptually-cued task that gives participants exposure to repeating motor sequences but does not require timing of responses for accuracy. Using a novel perceptual-motor sequence learning task, learning a precisely timed cued sequence of motor actions is shown to occur without explicit instruction. Participants learned a repeating sequence through practice and showed sequence-specific knowledge via a performance decrement when switched to an unfamiliar sequence. In a second experiment, the integration of representation of action order and timing sequence knowledge was examined. When either action order or timing sequence information was selectively disrupted, performance was reduced to levels similar to completely novel sequences. Unlike prior sequence-learning research that has found timing information to be secondary to learning action sequences, when the task demands require accurate action and timing information, an integrated representation of these types of information is acquired. These results provide the first evidence for incidental learning of fully integrated action and timing sequence information in the absence of an independent representation of action order, and suggest that this integrative mechanism may play a material role in the acquisition of complex motor skills. PMID:21417511
Direction information in multiple object tracking is limited by a graded resource.
Horowitz, Todd S; Cohen, Michael A
2010-10-01
Is multiple object tracking (MOT) limited by a fixed set of structures (slots), a limited but divisible resource, or both? Here, we answer this question by measuring the precision of the direction representation for tracked targets. The signature of a limited resource is a decrease in precision as the square root of the tracking load. The signature of fixed slots is a fixed precision. Hybrid models predict a rapid decrease to asymptotic precision. In two experiments, observers tracked moving disks and reported target motion direction by adjusting a probe arrow. We derived the precision of representation of correctly tracked targets using a mixture distribution analysis. Precision declined with target load according to the square-root law up to six targets. This finding is inconsistent with both pure and hybrid slot models. Instead, directional information in MOT appears to be limited by a continuously divisible resource.
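The three competing accounts predict different precision-versus-load curves, which is what makes them empirically separable. A minimal sketch of those predictions (scaling constants and the slot count are illustrative, not the paper's fitted values):

```python
import math

# Predicted direction-report precision at tracking load n, scaled from
# single-target precision p1, under the three competing models.

def resource_model(p1, n):
    # Limited divisible resource: precision falls as the square root of load.
    return p1 / math.sqrt(n)

def slot_model(p1, n):
    # Fixed slots: precision is constant regardless of load.
    return p1

def hybrid_model(p1, n, k=4):
    # Hybrid: resource-like decline up to k slots, then asymptotic.
    return p1 / math.sqrt(min(n, k))
```

The reported result, a square-root decline out to six targets, matches `resource_model` and rules out the flat or asymptoting curves.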
Systems and precision medicine approaches to diabetes heterogeneity: a Big Data perspective.
Capobianco, Enrico
2017-12-01
Big Data, and in particular Electronic Health Records, provide the medical community with a great opportunity to analyze multiple pathological conditions at an unprecedented depth for many complex diseases, including diabetes. How can we infer on diabetes from large heterogeneous datasets? A possible solution is provided by invoking next-generation computational methods and data analytics tools within systems medicine approaches. By deciphering the multi-faceted complexity of biological systems, the potential of emerging diagnostic tools and therapeutic functions can be ultimately revealed. In diabetes, a multidimensional approach to data analysis is needed to better understand the disease conditions, trajectories and the associated comorbidities. Elucidation of multidimensionality comes from the analysis of factors such as disease phenotypes, marker types, and biological motifs while seeking to make use of multiple levels of information including genetics, omics, clinical data, and environmental and lifestyle factors. Examining the synergy between multiple dimensions represents a challenge. In such regard, the role of Big Data fuels the rise of Precision Medicine by allowing an increasing number of descriptions to be captured from individuals. Thus, data curations and analyses should be designed to deliver highly accurate predicted risk profiles and treatment recommendations. It is important to establish linkages between systems and precision medicine in order to translate their principles into clinical practice. Equivalently, to realize their full potential, the involved multiple dimensions must be able to process information ensuring inter-exchange, reducing ambiguities and redundancies, and ultimately improving health care solutions by introducing clinical decision support systems focused on reclassified phenotypes (or digital biomarkers) and community-driven patient stratifications.
Use of generalized linear models and digital data in a forest inventory of Northern Utah
Moisen, Gretchen G.; Edwards, Thomas C.
1999-01-01
Forest inventories, like those conducted by the Forest Service's Forest Inventory and Analysis Program (FIA) in the Rocky Mountain Region, are under increased pressure to produce better information at reduced costs. Here we describe our efforts in Utah to merge satellite-based information with forest inventory data for the purposes of reducing the costs of estimates of forest population totals and providing spatial depiction of forest resources. We illustrate how generalized linear models can be used to construct approximately unbiased and efficient estimates of population totals while providing a mechanism for prediction in space for mapping of forest structure. We model forest type and timber volume of five tree species groups as functions of a variety of predictor variables in the northern Utah mountains. Predictor variables include elevation, aspect, slope, geographic coordinates, as well as vegetation cover types based on satellite data from both the Advanced Very High Resolution Radiometer (AVHRR) and Thematic Mapper (TM) platforms. We examine the relative precision of estimates of area by forest type and mean cubic-foot volumes under six different models, including the traditional double sampling for stratification strategy. Only very small gains in precision were realized through the use of expensive photointerpreted or TM-based data for stratification, while models based on topography and spatial coordinates alone were competitive. We also compare the predictive capability of the models through various map accuracy measures. The models including the TM-based vegetation performed best overall, while topography and spatial coordinates alone provided substantial information at very low cost.
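The core idea of model-assisted estimation of a population total can be illustrated compactly: predict the response for every population unit from cheap auxiliary data (e.g., topography, satellite cover type), then correct the summed predictions with the residuals observed on the field sample. This sketch is a generic difference estimator, not the paper's exact GLM formulation; all values are made up.

```python
# Illustrative model-assisted (difference) estimator of a population total:
# sum model predictions over all N units, then add N times the mean
# residual (observed minus predicted) from the field sample.

def assisted_total(pred_all, sample_pred, sample_obs):
    n = len(sample_obs)
    mean_resid = sum(o - p for o, p in zip(sample_obs, sample_pred)) / n
    return sum(pred_all) + mean_resid * len(pred_all)

# Four population units with predicted volumes; two were field-sampled.
total = assisted_total(pred_all=[2.0, 3.0, 5.0, 4.0],
                       sample_pred=[3.0, 5.0], sample_obs=[3.5, 5.1])
```

The better the auxiliary model, the smaller the residual correction, which is why precision gains track the predictive value of the satellite covariates.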
Libertus, Melissa E.; Feigenson, Lisa; Halberda, Justin
2013-01-01
Previous research has found a relationship between individual differences in children’s precision when nonverbally approximating quantities and their school mathematics performance. School mathematics performance emerges from both informal (e.g., counting) and formal (e.g., knowledge of mathematics facts) abilities. It remains unknown whether approximation precision relates to both of these types of mathematics abilities. In the present study we assessed the precision of numerical approximation in 85 3- to 7-year-old children four times over a span of two years. Additionally, at the last time point, we tested children’s informal and formal mathematics abilities using the Test of Early Mathematics Ability (TEMA-3; Ginsburg & Baroody, 2003). We found that children’s numerical approximation precision correlated with and predicted their informal, but not formal, mathematics abilities when controlling for age and IQ. These results add to our growing understanding of the relationship between an unlearned, non-symbolic system of quantity representation and the system of mathematical reasoning that children come to master through instruction. PMID:24076381
Su, Gui-yang; Li, Jian-hua; Ma, Ying-hua; Li, Sheng-hong
2004-09-01
With the flooding of pornographic information on the Internet, how to keep people away from that offensive information is becoming one of the most important research areas in network information security. Some applications which can block or filter such information are used. Approaches in those systems can be roughly classified into two kinds: metadata based and content based. With the development of distributed technologies, content based filtering technologies will play a more and more important role in filtering systems. Keyword matching is a content based method used widely in harmful text filtering. Experiments to evaluate the recall and precision of the method showed that the precision of the method is not satisfactory, though the recall of the method is rather high. According to the results, a new pornographic text filtering model based on reconfirming is put forward. Experiments showed that the model is practical, has less loss of recall than the single keyword matching method, and has higher precision.
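The two-stage structure described, high-recall keyword matching followed by a reconfirming step to recover precision, can be sketched as below. The abstract does not specify the reconfirming criterion, so keyword density is used here purely as a stand-in; the function name and threshold are invented.

```python
# Hedged two-stage sketch: stage 1 flags any keyword hit (high recall);
# stage 2 "reconfirms" by requiring a minimum keyword density before a
# block decision, trading little recall for higher precision.
# The density criterion is an assumed stand-in for the paper's model.

def filter_text(text, keywords, density=0.2):
    words = text.lower().split()
    hits = sum(1 for w in words if w in keywords)
    flagged = hits > 0                                     # stage 1
    confirmed = flagged and hits / len(words) >= density   # stage 2
    return flagged, confirmed

kw = {"bad", "worse"}
flagged, confirmed = filter_text("bad bad word here word", kw)
```

A single stray keyword in a long innocuous text is flagged by stage 1 but not confirmed by stage 2, which is exactly the precision gain the reconfirming model targets.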
Libertus, Melissa E; Feigenson, Lisa; Halberda, Justin
2013-12-01
Previous research has found a relationship between individual differences in children's precision when nonverbally approximating quantities and their school mathematics performance. School mathematics performance emerges from both informal (e.g., counting) and formal (e.g., knowledge of mathematics facts) abilities. It remains unknown whether approximation precision relates to both of these types of mathematics abilities. In the current study, we assessed the precision of numerical approximation in 85 3- to 7-year-old children four times over a span of 2 years. In addition, at the final time point, we tested children's informal and formal mathematics abilities using the Test of Early Mathematics Ability (TEMA-3). We found that children's numerical approximation precision correlated with and predicted their informal, but not formal, mathematics abilities when controlling for age and IQ. These results add to our growing understanding of the relationship between an unlearned nonsymbolic system of quantity representation and the system of mathematical reasoning that children come to master through instruction. Copyright © 2013 Elsevier Inc. All rights reserved.
A new patent-based approach for technology mapping in the pharmaceutical domain.
Russo, Davide; Montecchi, Tiziano; Carrara, Paolo
2013-09-01
The key factor in decision-making is the quality of information collected and processed in the problem analysis. In most cases, patents represent a very important source of information. The main problem is how to extract such information from the huge corpus of documents with a high recall and precision, and in a short time. This article demonstrates a patent search and classification method, called Knowledge Organizing Module, which consists of creating, almost automatically, a pool of patents based on polysemy expansion and homonymy disambiguation. Since the pool is done, an automatic patent technology landscaping is provided for fixing the state of the art of our product, and exploring competing alternative treatments and/or possible technological opportunities. An exemplary case study is provided, it deals with a patent analysis in the field of verruca treatments.
Franzen, Delwen L; Gleiss, Sarah A; Berger, Christina; Kümpfbeck, Franziska S; Ammer, Julian J; Felmy, Felix
2015-01-15
Passive and active membrane properties determine the voltage responses of neurons. Within the auditory brain stem, refinements in these intrinsic properties during late postnatal development usually generate short integration times and precise action-potential generation. This developmentally acquired temporal precision is crucial for auditory signal processing. How the interactions of these intrinsic properties develop in concert to enable auditory neurons to transfer information with high temporal precision has not yet been elucidated in detail. Here, we show how the developmental interaction of intrinsic membrane parameters generates high firing precision. We performed in vitro recordings from neurons of postnatal days 9-28 in the ventral nucleus of the lateral lemniscus of Mongolian gerbils, an auditory brain stem structure that converts excitatory to inhibitory information with high temporal precision. During this developmental period, the input resistance and capacitance decrease, and action potentials acquire faster kinetics and enhanced precision. Depending on the stimulation time course, the input resistance and capacitance contribute differentially to action-potential thresholds. The decrease in input resistance, however, is sufficient to explain the enhanced action-potential precision. Alterations in passive membrane properties also interact with a developmental change in potassium currents to generate the emergence of the mature firing pattern, characteristic of coincidence-detector neurons. Cholinergic receptor-mediated depolarizations further modulate this intrinsic excitability profile by eliciting changes in the threshold and firing pattern, irrespective of the developmental stage. Thus our findings reveal how intrinsic membrane properties interact developmentally to promote temporally precise information processing. Copyright © 2015 the American Physiological Society.
Multiscale Documentation and Monitoring of L'aquila Historical Centre Using Uav Photogrammetry
NASA Astrophysics Data System (ADS)
Dominici, D.; Alicandro, M.; Rosciano, E.; Massimi, V.
2017-05-01
Nowadays geomatic techniques can guarantee not only a precise and accurate survey for the documentation of our historical heritage but also a solution to monitor its behaviour over time after, for example, a catastrophic event (earthquakes, landslides, etc.). Europe is trying to move towards harmonized actions to store information on cultural heritage (MIBAC with the ICCS forms, English Heritage with the MIDAS scheme, etc.), but it would be important to provide standardized methods in order to perform measuring operations to collect certified metric data. The final result could be a database to support the entire management of the cultural heritage and also a checklist of "what to do" and "when to do it". The wide range of geomatic techniques provides many solutions to acquire, to organize and to manage data at a multiscale level: high resolution satellite images can provide information in a short time during the "early emergency", while UAV photogrammetry and laser scanning can provide digital high resolution 3D models of buildings, orthophotos of roofs and facades, and so on. This paper presents some multiscale survey case studies using UAV photogrammetry: from a minor historical village (Aielli) to the centre of L'Aquila (Santa Maria di Collemaggio Church), from the post-emergency to now. This choice has been taken not only to present how geomatics is an effective science for modelling but also to present a complete and reliable way to perform conservation and/or restoration through precise monitoring techniques, as shown in the third case study.
Target tracking and pointing for arrays of phase-locked lasers
NASA Astrophysics Data System (ADS)
Macasaet, Van P.; Hughes, Gary B.; Lubin, Philip; Madajian, Jonathan; Zhang, Qicheng; Griswold, Janelle; Kulkarni, Neeraj; Cohen, Alexander; Brashears, Travis
2016-09-01
Arrays of phase-locked lasers are envisioned for planetary defense and exploration systems. High-energy beams focused on a threatening asteroid evaporate surface material, creating a reactionary thrust that alters the asteroid's orbit. The same system could be used to probe an asteroid's composition, to search for unknown asteroids, and to propel interplanetary and interstellar spacecraft. Phased-array designs are capable of producing high beam intensity, and allow beam steering and beam profile manipulation. Modular designs allow ongoing addition of emitter elements to a growing array. This paper discusses pointing control for extensible laser arrays. Rough pointing is determined by spacecraft attitude control. Lateral movement of the laser emitter tips behind the optical elements provides intermediate pointing adjustment for individual array elements and beam steering. Precision beam steering and beam formation is accomplished by coordinated phase modulation across the array. Added cells are incorporated into the phase control scheme by precise alignment to local mechanical datums using fast, optical relative position sensors. Infrared target sensors are also positioned within the datum scheme, and provide information about the target vector relative to datum coordinates at each emitter. Multiple target sensors allow refined determination of the target normal plane, providing information to the phase controller for each emitter. As emitters and sensors are added, local position data allows accurate prediction of the relative global position of emitters across the array, providing additional constraints to the phase controllers. Mechanical design and associated phase control that is scalable for target distance and number of emitters is presented.
Spatio-temporal filtering techniques for the detection of disaster-related communication.
Fitzhugh, Sean M; Ben Gibson, C; Spiro, Emma S; Butts, Carter T
2016-09-01
Individuals predominantly exchange information with one another through informal, interpersonal channels. During disasters and other disrupted settings, information spread through informal channels regularly outpaces official information provided by public officials and the press. Social scientists have long examined this kind of informal communication in the rumoring literature, but studying rumoring in disrupted settings has posed numerous methodological challenges. Measuring features of informal communication (timing, content, location) with any degree of precision has historically been extremely challenging in small studies and infeasible at large scales. We address this challenge by using online, informal communication from a popular microblogging website and for which we have precise spatial and temporal metadata. While the online environment provides a new means for observing rumoring, the abundance of data poses challenges for parsing hazard-related rumoring from countless other topics in numerous streams of communication. Rumoring about disaster events is typically temporally and spatially constrained to places where that event is salient. Accordingly, we use spatial and temporal subsampling to increase the resolution of our detection techniques. By filtering out data from known sources of error (per rumor theories), we greatly enhance the signal of disaster-related rumoring activity. We use these spatio-temporal filtering techniques to detect rumoring during a variety of disaster events, from high-casualty events in major population centers to minimally destructive events in remote areas. We consistently find three phases of response: anticipatory excitation where warnings and alerts are issued ahead of an event, primary excitation in and around the impacted area, and secondary excitation which frequently brings a convergence of attention from distant locales onto locations impacted by the event.
Our results demonstrate the promise of spatio-temporal filtering techniques for "tuning" measurement of hazard-related rumoring to enable observation of rumoring at scales that have long been infeasible. Copyright © 2016 Elsevier Inc. All rights reserved.
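The core subsampling step, restricting a message stream to an event's time window and a bounding box around the impacted area, can be sketched as follows. The tuple layout and coordinate values are assumptions for illustration, not the study's actual data schema.

```python
# Sketch of spatio-temporal subsampling: keep only messages that fall
# inside an event's time window [t0, t1] and a lat/lon bounding box
# around the impact area. Messages are assumed to be tuples of
# (timestamp, lat, lon, text); all values here are illustrative.

def spatiotemporal_filter(msgs, t0, t1, bbox):
    lat_min, lat_max, lon_min, lon_max = bbox
    return [m for m in msgs
            if t0 <= m[0] <= t1
            and lat_min <= m[1] <= lat_max
            and lon_min <= m[2] <= lon_max]

msgs = [(100, 34.0, -118.2, "shaking!"),   # inside window and box
        (500, 40.7, -74.0, "unrelated")]   # wrong time and place
kept = spatiotemporal_filter(msgs, 0, 200, (33.0, 35.0, -119.0, -117.0))
```

Widening or narrowing the window and box is the "tuning" the abstract describes: tight bounds isolate primary excitation, while relaxed bounds capture the anticipatory and secondary phases.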
Usability-driven pruning of large ontologies: the case of SNOMED CT
Boeker, Martin; Illarramendi, Arantza; Schulz, Stefan
2012-01-01
Objectives To study ontology modularization techniques when applied to SNOMED CT in a scenario in which no previous corpus of information exists and to examine if frequency-based filtering using MEDLINE can reduce subset size without discarding relevant concepts. Materials and Methods Subsets were first extracted using four graph-traversal heuristics and one logic-based technique, and were subsequently filtered with frequency information from MEDLINE. Twenty manually coded discharge summaries from cardiology patients were used as signatures and test sets. The coverage, size, and precision of extracted subsets were measured. Results Graph-traversal heuristics provided high coverage (71–96% of terms in the test sets of discharge summaries) at the expense of subset size (17–51% of the size of SNOMED CT). Pre-computed subsets and logic-based techniques extracted small subsets (1%), but coverage was limited (24–55%). Filtering reduced the size of large subsets to 10% while still providing 80% coverage. Discussion Extracting subsets to annotate discharge summaries is challenging when no previous corpus exists. Ontology modularization provides valuable techniques, but the resulting modules grow as signatures spread across subhierarchies, yielding a very low precision. Conclusion Graph-traversal strategies and frequency data from an authoritative source can prune large biomedical ontologies and produce useful subsets that still exhibit acceptable coverage. However, a clinical corpus closer to the specific use case is preferred when available. PMID:22268217
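One of the graph-traversal heuristics plus the frequency-based pruning step can be sketched generically: collect the signature concepts and all their is-a ancestors, then drop ancestors whose corpus frequency falls below a threshold. The data structures and concept names below are invented; SNOMED CT's actual relationship model is richer.

```python
# Sketch of a graph-traversal module extraction (signature concepts plus
# all is-a ancestors) followed by frequency-based pruning, as in
# filtering a large subset with MEDLINE counts. Concepts, hierarchy, and
# counts are hypothetical.

def extract_module(signature, parents, freq, min_freq=1):
    subset, stack = set(), list(signature)
    while stack:                      # upward closure over is-a links
        c = stack.pop()
        if c not in subset:
            subset.add(c)
            stack.extend(parents.get(c, []))
    # keep signature concepts unconditionally; prune rare ancestors
    return {c for c in subset
            if c in signature or freq.get(c, 0) >= min_freq}

parents = {"MI": ["HeartDisease"], "HeartDisease": ["Disease"]}
module = extract_module({"MI"}, parents,
                        {"HeartDisease": 5, "Disease": 0}, min_freq=1)
```

This mirrors the paper's finding: the upward closure alone balloons the subset, and the frequency filter is what trims it while preserving coverage of the signature.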
LANDSAT-D investigations in snow hydrology
NASA Technical Reports Server (NTRS)
Dozier, J. (Principal Investigator)
1984-01-01
Two-stream methods provide rapid approximate calculations of radiative transfer in scattering and absorbing media. Although they provide information on fluxes only, and not on intensities, their speed makes them attractive alternatives to more precise methods. This work provides a comprehensive, unified review of the methods for a homogeneous layer and solves the equations for reflectance and transmittance of a homogeneous layer over a nonreflecting surface. Any of the basic kernels for a single layer can be extended to a vertically inhomogeneous medium over a surface whose reflectance properties vary with illumination angle, as long as the medium can be subdivided into homogeneous layers.
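The layer-combination step used to extend single-layer kernels to an inhomogeneous medium can be sketched with the standard adding relations, which sum the multiple reflections between two stacked layers as a geometric series. Symbols and values here are illustrative (symmetric layers, diffuse fluxes only):

```python
# Standard adding relations for stacking two homogeneous layers in a
# two-stream model: multiple inter-layer reflections sum geometrically.
#   R = R1 + T1^2 * R2 / (1 - R1*R2)
#   T = T1 * T2 / (1 - R1*R2)
# Illustrative values; symmetric layers are assumed.

def add_layers(R1, T1, R2, T2):
    denom = 1.0 - R1 * R2
    return R1 + T1 * T1 * R2 / denom, T1 * T2 / denom

R, T = add_layers(0.2, 0.8, 0.5, 0.5)
```

Applying `add_layers` repeatedly, bottom layer upward, is how a vertically inhomogeneous snowpack subdivided into homogeneous layers is handled.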
Mars Transportation Environment Definition Document
NASA Technical Reports Server (NTRS)
Alexander, M. (Editor)
2001-01-01
This document provides a compilation of environments knowledge about the planet Mars. Information is divided into three categories: (1) interplanetary space environments (environments required by the technical community to travel to and from Mars); (2) atmospheric environments (environments needed to aerocapture, aerobrake, or use aeroassist for precision trajectories down to the surface); and (3) surface environments (environments needed to have robots or explorers survive and work on the surface).
F-16 Instructional Sequencing Plan Report.
1981-03-01
information). 2. Interference (learning of some tasks interferes with the learning of other tasks when they possess similar but confusing differences) ... profound effect on the total training expense. This increases the desirability of systematic, precise methods of syllabus generation. Inherent in a given ... the expensive-to-acquire resource. The syllabus must make maximum use of the resource. Least cost: select sequences which provide a least total cost method of
ERIC Educational Resources Information Center
Ho, Tsung-Han
2010-01-01
Computerized adaptive testing (CAT) provides a highly efficient alternative to the paper-and-pencil test. By selecting items that match examinees' ability levels, CAT not only can shorten test length and administration time but it can also increase measurement precision and reduce measurement error. In CAT, maximum information (MI) is the most…
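The maximum-information criterion the truncated abstract refers to is conventionally computed from the item information function; under a 2PL IRT model this is I(θ) = a²·P(θ)·(1 − P(θ)). A sketch with hypothetical item parameters (the specific items and ability value are invented):

```python
import math

# Maximum-information item selection under a 2PL IRT model.
# Each item is an (a, b) pair: discrimination a, difficulty b.
# P(theta) = 1 / (1 + exp(-a * (theta - b)));  I(theta) = a^2 * P * (1 - P)

def item_info(item, theta):
    a, b = item
    p = 1.0 / (1.0 + math.exp(-a * (theta - b)))
    return a * a * p * (1.0 - p)

def pick_item(items, theta):
    # MI selection: administer the item most informative at current theta.
    return max(items, key=lambda it: item_info(it, theta))

# For an examinee at theta = 0, the well-targeted, discriminating item wins.
best = pick_item([(1.0, -2.0), (1.5, 0.1), (0.8, 2.0)], theta=0.0)
```

Because information peaks where difficulty matches ability, MI selection steers the test toward items near the examinee's current ability estimate, which is the source of CAT's precision gains.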
ERIC Educational Resources Information Center
Tassé, Marc J.; Schalock, Robert L.; Thissen, David; Balboni, Giulia; Bersani, Henry, Jr.; Borthwick-Duffy, Sharon A.; Spreat, Scott; Widaman, Keith F.; Zhang, Dalun; Navas, Patricia
2016-01-01
The Diagnostic Adaptive Behavior Scale (DABS) was developed using item response theory (IRT) methods and was constructed to provide the most precise and valid adaptive behavior information at or near the cutoff point of making a decision regarding a diagnosis of intellectual disability. The DABS initial item pool consisted of 260 items. Using IRT…
AMS with light nuclei at small accelerators
NASA Astrophysics Data System (ADS)
Stan-Sion, C.; Enachescu, M.
2017-06-01
AMS applications with lighter nuclei are presented. It will be shown how Carbon-14, Boron-10, Beryllium-10, and Tritium can be used to provide valuable information in forensic science, environmental physics, nuclear pollution, and materials science, and for diagnosis of plasma confinement in fusion reactors. Small accelerators are reliable and efficient and possess the highest ion beam transmissions, which confer high precision in measurements.
Micron Accuracy Deployment Experiment (MADE), phase A. Volume 1
NASA Technical Reports Server (NTRS)
Peterson, Lee D.; Lake, Mark S.
1995-01-01
This report documents a Phase A In-STEP flight experiment development effort. The objective of the experiment is to deploy a portion of a segmented reflector on the Shuttle and study its micron-level mechanics. Ground test data are presented which projects that the on-orbit precision of the test article should be approximately 5 microns. Extensive hardware configuration development information is also provided.
Resource selection by elk at two spatial scales in the Black Hills, South Dakota
Mark A. Rumble; R. Scott Gamo
2011-01-01
Understanding resource selection by elk (Cervus elaphus) at multiple spatial scales may provide information that will help resolve the increasing number of resource conflicts involving elk. We quantified vegetation at 412 sites where the precise location of elk was known by direct observation and 509 random sites in the Black Hills of South Dakota during 1998-2001. We...
The NASA Meter Class Autonomous Telescope: Ascension Island
2013-09-01
understand the debris environment by providing high fidelity data in a timely manner to protect satellites and spacecraft in orbit around the Earth...gigabytes of image data nightly. With fainter detection limits, precision detection, acquisition and tracking of targets, multi-color photometry ...
Cui, Licong; Xu, Rong; Luo, Zhihui; Wentz, Susan; Scarberry, Kyle; Zhang, Guo-Qiang
2014-08-03
Finding quality consumer health information online can effectively bring important public health benefits to the general population. It can empower people with timely and current knowledge for managing their health and promoting wellbeing. Despite a popular belief that search engines such as Google can solve all information access problems, recent studies show that using search engines and simple search terms is not sufficient. Our objective is to provide an approach to organizing consumer health information for navigational exploration, complementing keyword-based direct search. Multi-topic assignment to health information, such as online questions, is a fundamental step for navigational exploration. We introduce a new multi-topic assignment method combining semantic annotation using UMLS concepts (CUIs) and Formal Concept Analysis (FCA). Each question was tagged with CUIs identified by MetaMap. The CUIs were filtered with term-frequency and a new term-strength index to construct a CUI-question context. The CUI-question context and a topic-subject context were used for multi-topic assignment, resulting in a topic-question context. The topic-question context was then directly used for constructing a prototype navigational exploration interface. Experimental evaluation was performed on the task of automatic multi-topic assignment of 99 predefined topics for about 60,000 consumer health questions from NetWellness. Using example-based metrics, suitable for multi-topic assignment problems, our method achieved a precision of 0.849, recall of 0.774, and F₁ measure of 0.782, using a reference standard of 278 questions with manually assigned topics. Compared to NetWellness' original topic assignment, a 36.5% increase in recall is achieved with virtually no sacrifice in precision. Enhancing the recall of multi-topic assignment without sacrificing precision is a prerequisite for achieving the benefits of navigational exploration. 
Our new multi-topic assignment method, combining term-strength, FCA, and information retrieval techniques, significantly improved recall and performed well according to example-based metrics.
2014-01-01
Background Finding quality consumer health information online can effectively bring important public health benefits to the general population. It can empower people with timely and current knowledge for managing their health and promoting wellbeing. Despite a popular belief that search engines such as Google can solve all information access problems, recent studies show that using search engines and simple search terms is not sufficient. Our objective is to provide an approach to organizing consumer health information for navigational exploration, complementing keyword-based direct search. Multi-topic assignment to health information, such as online questions, is a fundamental step for navigational exploration. Methods We introduce a new multi-topic assignment method combining semantic annotation using UMLS concepts (CUIs) and Formal Concept Analysis (FCA). Each question was tagged with CUIs identified by MetaMap. The CUIs were filtered with term-frequency and a new term-strength index to construct a CUI-question context. The CUI-question context and a topic-subject context were used for multi-topic assignment, resulting in a topic-question context. The topic-question context was then directly used for constructing a prototype navigational exploration interface. Results Experimental evaluation was performed on the task of automatic multi-topic assignment of 99 predefined topics for about 60,000 consumer health questions from NetWellness. Using example-based metrics, suitable for multi-topic assignment problems, our method achieved a precision of 0.849, recall of 0.774, and F1 measure of 0.782, using a reference standard of 278 questions with manually assigned topics. Compared to NetWellness’ original topic assignment, a 36.5% increase in recall is achieved with virtually no sacrifice in precision. Conclusion Enhancing the recall of multi-topic assignment without sacrificing precision is a prerequisite for achieving the benefits of navigational exploration. 
Our new multi-topic assignment method, combining term-strength, FCA, and information retrieval techniques, significantly improved recall and performed well according to example-based metrics. PMID:25086916
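The example-based metrics reported above (per-question precision, recall, and F1, averaged over questions) can be sketched as follows. The data and topic names are hypothetical, and the paper's CUI filtering and FCA steps are not reproduced:

```python
def example_based_metrics(true_topics, predicted_topics):
    """Example-based precision, recall, and F1 for multi-label topic
    assignment: metrics are computed per question, then averaged.

    true_topics, predicted_topics: lists of sets, one set of topic
    labels per question (illustrative data; not the paper's
    278-question reference standard).
    """
    precisions, recalls, f1s = [], [], []
    for y, z in zip(true_topics, predicted_topics):
        overlap = len(y & z)
        precisions.append(overlap / len(z) if z else 0.0)
        recalls.append(overlap / len(y) if y else 0.0)
        f1s.append(2 * overlap / (len(y) + len(z)) if (y or z) else 0.0)
    n = len(true_topics)
    return sum(precisions) / n, sum(recalls) / n, sum(f1s) / n

# Toy usage: two questions with manually assigned vs. predicted topics.
truth = [{"Allergies", "Asthma"}, {"Diabetes"}]
pred = [{"Allergies"}, {"Diabetes", "Nutrition"}]
p, r, f = example_based_metrics(truth, pred)
```

Averaging per-example scores, rather than pooling label counts, is what makes these "example-based" rather than micro- or macro-averaged metrics.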
Research on the high-precision non-contact optical detection technology for banknotes
NASA Astrophysics Data System (ADS)
Jin, Xiaofeng; Liang, Tiancai; Luo, Pengfeng; Sun, Jianfeng
2015-09-01
The technology of high-precision laser interferometry was introduced for optical measurement of banknotes in this paper. Taking advantage of the laser's short wavelength and high sensitivity, information about adhesive tape and cavities on banknotes could be detected efficiently. Compared with current measurement devices, including mechanical wheel, infrared, and ultrasonic measurement devices, laser interferometry offers higher precision and reliability. This will improve the ability of financial electronic equipment to detect banknote feature information.
Dhawan, Atam P
2016-01-01
Recent advances in biosensors, medical instrumentation, and information processing and communication technologies (ICT) have enabled significant improvements in healthcare. However, these technologies have been mainly applied in clinical environments, such as hospitals and healthcare facilities, under managed care by well-trained and specialized individuals. The global challenge of providing quality healthcare at affordable cost leads to the proposed paradigm of Preventive, Personalized, and Precision Medicine that requires a seamless use of technology and infrastructure support for patients and healthcare providers at point-of-care (POC) locations including homes, semi- or pre-clinical facilities, and hospitals. The complexity of the global healthcare challenge necessitates strong collaborative interdisciplinary synergies involving all stakeholder groups including academia, federal research institutions, industry, regulatory agencies, and clinical communities. It is critical to evolve with collaborative efforts on the translation of research to technology development toward clinical validation and potential healthcare applications. This special issue is focused on technology innovation and translational research for POC applications with potential impact in improving global healthcare in the respective areas. Some of these papers were presented at the NIH-IEEE Strategic Conference on Healthcare Innovations and POC Technologies for Precision Medicine (HI-POCT) held at the NIH on November 9-10, 2015. The papers included in the Special Issue provide a spectrum of critical issues and collaborative resources on translational research of advanced POC devices and ICT into the global healthcare environment.
Muris, Peter; van Zwol, Lisanne; Huijding, Jorg; Mayer, Birgit
2010-04-01
This study investigated whether fear beliefs can be installed in children after parents had received negatively tinted information about a novel stimulus. Parents of children aged 8-13 years (N = 88) were presented with negative, positive, or ambiguous information about an unknown animal and then given a number of open-ended vignettes describing confrontations with the animal with the instruction to tell their children what would happen in these situations. Results indicated that children's fear beliefs were influenced by the information that was provided to the parent. That is, parents who had received negative information provided more threatening narratives about the animal and hence installed higher levels of fear beliefs in their children than parents who had received positive information. In the case of ambiguous information, the transmission of fear was dependent on parents' trait anxiety levels. More precisely, high trait anxious parents told more negative stories about the unknown animal, which produced higher fear levels in children. 2009 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Cooper, Matthew J.; Martin, Randall V.; Lyapustin, Alexei I.; McLinden, Chris A.
2018-05-01
Accurate representation of surface reflectivity is essential to tropospheric trace gas retrievals from solar backscatter observations. Surface snow cover presents a significant challenge due to its variability and thus snow-covered scenes are often omitted from retrieval data sets; however, the high reflectance of snow is potentially advantageous for trace gas retrievals. We first examine the implications of surface snow on retrievals from the upcoming TEMPO geostationary instrument for North America. We use a radiative transfer model to examine how an increase in surface reflectivity due to snow cover changes the sensitivity of satellite retrievals to NO2 in the lower troposphere. We find that a substantial fraction (> 50 %) of the TEMPO field of regard can be snow covered in January and that the average sensitivity to the tropospheric NO2 column substantially increases (doubles) when the surface is snow covered. We then evaluate seven existing satellite-derived or reanalysis snow extent products against ground station observations over North America to assess their capability of informing surface conditions for TEMPO retrievals. The Interactive Multisensor Snow and Ice Mapping System (IMS) had the best agreement with ground observations (accuracy of 93 %, precision of 87 %, recall of 83 %). Multiangle Implementation of Atmospheric Correction (MAIAC) retrievals of MODIS-observed radiances had high precision (90 % for Aqua and Terra), but underestimated the presence of snow (recall of 74 % for Aqua, 75 % for Terra). MAIAC generally outperforms the standard MODIS products (precision of 51 %, recall of 43 % for Aqua; precision of 69 %, recall of 45 % for Terra). The Near-real-time Ice and Snow Extent (NISE) product had good precision (83 %) but missed a significant number of snow-covered pixels (recall of 45 %).
The Canadian Meteorological Centre (CMC) Daily Snow Depth Analysis Data set had strong performance metrics (accuracy of 91 %, precision of 79 %, recall of 82 %). We use the F-score, which balances precision and recall, to determine overall product performance (F = 85 %, 82 (82) %, 81 %, 58 %, 46 (54) % for IMS, MAIAC Aqua (Terra), CMC, NISE, MODIS Aqua (Terra), respectively) for providing snow cover information for TEMPO retrievals from solar backscatter observations. We find that using IMS to identify snow cover and enable inclusion of snow-covered scenes in clear-sky conditions across North America in January can increase both the number of observations by a factor of 2.1 and the average sensitivity to the tropospheric NO2 column by a factor of 2.7.
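The accuracy, precision, recall, and F-score used to rank these snow products follow from a standard binary confusion matrix. A minimal sketch, with illustrative counts chosen to roughly echo the IMS figures, not taken from the study:

```python
def snow_detection_scores(tp, fp, fn, tn):
    """Accuracy, precision, recall, and F-score for a binary snow-cover
    product evaluated against ground-station observations.
    tp: product and station both report snow; fp: product reports snow,
    station does not; fn: product misses station-reported snow;
    tn: both report no snow. (Counts are illustrative.)"""
    accuracy = (tp + tn) / (tp + fp + fn + tn)
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    f_score = 2 * precision * recall / (precision + recall)
    return accuracy, precision, recall, f_score

# Toy usage with hypothetical counts (precision ~87 %, recall 83 %).
acc, prec, rec, f = snow_detection_scores(830, 120, 170, 880)
```

The F-score (the harmonic mean of precision and recall) penalizes a product that is strong on one metric but weak on the other, which is why NISE's good precision does not compensate for its low recall in the ranking above.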
Atmospheric neutrino oscillations for Earth tomography
NASA Astrophysics Data System (ADS)
Winter, Walter
2016-07-01
Modern proposed atmospheric neutrino oscillation experiments, such as PINGU in the Antarctic ice or ORCA in Mediterranean sea water, aim for precision measurements of the oscillation parameters including the ordering of the neutrino masses. They can, however, go far beyond that: Since neutrino oscillations are affected by the coherent forward scattering with matter, neutrinos can provide a new view on the interior of the earth. We show that the proposed atmospheric oscillation experiments can measure the lower mantle density of the earth with a precision at the level of a few percent, including the uncertainties of the oscillation parameters and correlations among different density layers. While the earth's core is, in principle, accessible by the angular resolution, new technology would be required to extract degeneracy-free information.
[Galen's "On bones for beginners" translation from the Greek text and discussion].
Sakai, Tatsuo; Ikeda, Reitaro; Sawai, Tadashi
2007-09-01
Galen's article "On bones for beginners" was translated literally from the Greek text (Kühn's edition, vol. 2, pp. 732-778) into Japanese, applying the knowledge of modern anatomy. The previous Latin and English translations were utilized as references for the present translation. The present study has revealed that many of the current basic vocabularies for the bones and junctions were established already in Galen's treatises, but have changed their meanings and usages considerably. It has become also apparent that, for the skull, Galen did not observe individual bones but distinguished them by precise observations on the sutures of the skull in monkeys. The precise understanding of Galenic anatomy provides essential information to understand the origin of current anatomy.
MOLA: The Future of Mars Global Cartography
NASA Technical Reports Server (NTRS)
Duxbury, T. C.; Smith, D. E.; Zuber, M. T.; Frey, H. V.; Garvin, J. B.; Head, J. W.; Muhleman, D. O.; Pettengill, G. H.; Phillips, R. J.; Solomon, S. C.
1999-01-01
The MGS Orbiter is carrying the high-precision Mars Orbiter Laser Altimeter (MOLA) which, when combined with precision reconstructed orbital data and telemetered attitude data, provides a tie between inertial space and Mars-fixed coordinates to an accuracy of 100 m in latitude / longitude and 10 m in radius (1 sigma), orders of magnitude more accurate than previous global geodetic/ cartographic control data. Over the 2 year MGS mission lifetime, it is expected that over 30,000 MOLA Global Cartographic Control Points will be produced to form the basis for new and re-derived map and geodetic products, key to the analysis of existing and evolving MGS data as well as future Mars exploration. Additional information is contained in the original extended abstract.
Precision diamond grinding of ceramics and glass
DOE Office of Scientific and Technical Information (OSTI.GOV)
Smith, S.; Paul, H.; Scattergood, R.O.
A new research initiative will be undertaken to investigate the effect of machine parameters and material properties on precision diamond grinding of ceramics and glass. The critical grinding depth to initiate the plastic flow-to-brittle fracture regime will be directly measured using plunge-grind tests. This information will be correlated with machine parameters such as wheel bonding and diamond grain size. Multiaxis grinding tests will then be made to provide data more closely coupled with production technology. One important aspect of the material property studies involves measuring fracture toughness at the very short crack sizes commensurate with grinding damage. Short-crack toughness values can be much less than the long-crack toughness values measured in conventional fracture tests.
Laser-ranging long-baseline differential atom interferometers for space
NASA Astrophysics Data System (ADS)
Chiow, Sheng-wey; Williams, Jason; Yu, Nan
2015-12-01
High-sensitivity differential atom interferometers (AIs) are promising for precision measurements in science frontiers in space, including gravity-field mapping for Earth science studies and gravitational wave detection. Difficulties associated with implementing long-baseline differential AIs have previously included the need for a high optical power, large differential Doppler shifts, and narrow dynamic range. We propose a configuration of twin AIs connected by a laser-ranging interferometer (LRI-AI) to provide precise information of the displacements between the two AI reference mirrors and also to phase-lock the two independent interferometer lasers over long distances, thereby drastically improving the practical feasibility of long-baseline differential AI measurements. We show that a properly implemented LRI-AI can achieve equivalent functionality to the conventional differential AI measurement configuration.
NASA Astrophysics Data System (ADS)
Peng, Dong; Du, Yang; Shi, Yiwen; Mao, Duo; Jia, Xiaohua; Li, Hui; Zhu, Yukun; Wang, Kun; Tian, Jie
2016-07-01
Photoacoustic imaging and fluorescence molecular imaging are emerging as important research tools for biomedical studies. Photoacoustic imaging offers both strong optical absorption contrast and high ultrasonic resolution, and fluorescence molecular imaging provides excellent superficial resolution, high sensitivity, high throughput, and the ability for real-time imaging. Therefore, combining the imaging information of both modalities can provide comprehensive in vivo physiological and pathological information. However, currently there are limited probes available that can realize both fluorescence and photoacoustic imaging, and advanced biomedical applications for applying this dual-modality imaging approach remain underexplored. In this study, we developed a dual-modality photoacoustic-fluorescence imaging nanoprobe, ICG-loaded Au@SiO2, which was uniquely designed, consisting of gold nanorod cores and indocyanine green with silica shell spacer layers to overcome fluorophore quenching. This nanoprobe was examined by both PAI and FMI for in vivo imaging on tumor and ischemia mouse models. Our results demonstrated that the nanoparticles can specifically accumulate at the tumor and ischemic areas and be detected by both imaging modalities. Moreover, this dual-modality imaging strategy exhibited superior advantages for a precise diagnosis in different scenarios. The new nanoprobe with the dual-modality imaging approach holds great potential for diagnosis and stage classification of tumor and ischemia related diseases. Electronic supplementary information (ESI) available. See DOI: 10.1039/c6nr03809c
Cannon, William F.; Woodruff, Laurel G.
2003-01-01
This data set consists of nine files of geochemical information on various types of surficial deposits in northwestern Wisconsin and immediately adjacent parts of Michigan and Minnesota. The files are presented in two formats: as dbase files in dbaseIV form and Microsoft Excel form. The data present multi-element chemical analyses of soils, stream sediments, and lake sediments. Latitude and longitude values are provided in each file so that the dbf files can be readily imported to GIS applications. Metadata files are provided in outline form, question and answer form and text form. The metadata includes information on procedures for sample collection, sample preparation, and chemical analyses including sensitivity and precision.
Preface: High-rate GNSS: Theory, methods and engineering/geophysical applications
NASA Astrophysics Data System (ADS)
Xu, Peiliang
2017-06-01
Global Navigation Satellite Systems (GNSS) have revolutionized the science and engineering of positioning, timing and navigation and have become an indispensable means to rapidly obtain precise positioning-related information, profoundly affecting our daily life and infrastructure. With GNSS, the position of an object, either stationary or moving, can be determined anywhere, anytime and under any weather condition. In addition to providing a positioning and timing information service, GNSS are now also used to reconstruct physical properties of media through which GNSS signals travel. The utilization of additional GNSS systems such as the European Galileo and the Chinese Beidou (both expected to complete their final global constellations in 2020) will contribute to positioning/navigation science and engineering, provide more industrial opportunities and surely open more challenges.
NASA Astrophysics Data System (ADS)
Jackson, R. J.; Jeffries, R. D.; Lewis, J.; Koposov, S. E.; Sacco, G. G.; Randich, S.; Gilmore, G.; Asplund, M.; Binney, J.; Bonifacio, P.; Drew, J. E.; Feltzing, S.; Ferguson, A. M. N.; Micela, G.; Neguerela, I.; Prusti, T.; Rix, H.-W.; Vallenari, A.; Alfaro, E. J.; Allende Prieto, C.; Babusiaux, C.; Bensby, T.; Blomme, R.; Bragaglia, A.; Flaccomio, E.; Francois, P.; Hambly, N.; Irwin, M.; Korn, A. J.; Lanzafame, A. C.; Pancino, E.; Recio-Blanco, A.; Smiljanic, R.; Van Eck, S.; Walton, N.; Bayo, A.; Bergemann, M.; Carraro, G.; Costado, M. T.; Damiani, F.; Edvardsson, B.; Franciosini, E.; Frasca, A.; Heiter, U.; Hill, V.; Hourihane, A.; Jofré, P.; Lardo, C.; de Laverny, P.; Lind, K.; Magrini, L.; Marconi, G.; Martayan, C.; Masseron, T.; Monaco, L.; Morbidelli, L.; Prisinzano, L.; Sbordone, L.; Sousa, S. G.; Worley, C. C.; Zaggia, S.
2015-08-01
Context. The Gaia-ESO Survey (GES) is a large public spectroscopic survey at the European Southern Observatory Very Large Telescope. Aims: A key aim is to provide precise radial velocities (RVs) and projected equatorial velocities (vsini) for representative samples of Galactic stars, which will complement information obtained by the Gaia astrometry satellite. Methods: We present an analysis to empirically quantify the size and distribution of uncertainties in RV and vsini using spectra from repeated exposures of the same stars. Results: We show that the uncertainties vary as simple scaling functions of signal-to-noise ratio (S/N) and vsini, that the uncertainties become larger with increasing photospheric temperature, but that the dependence on stellar gravity, metallicity and age is weak. The underlying uncertainty distributions have extended tails that are better represented by Student's t-distributions than by normal distributions. Conclusions: Parametrised results are provided, which enable estimates of the RV precision for almost all GES measurements, and estimates of the vsini precision for stars in young clusters, as a function of S/N, vsini and stellar temperature. The precision of individual high S/N GES RV measurements is 0.22-0.26 km s-1, dependent on instrumental configuration. Based on observations collected with the FLAMES spectrograph at the VLT/UT2 telescope (Paranal Observatory, ESO, Chile), for the Gaia-ESO Large Public Survey (188.B-3002). Full Table 2 is only available at the CDS via anonymous ftp to http://cdsarc.u-strasbg.fr (ftp://130.79.128.5) or via http://cdsarc.u-strasbg.fr/viz-bin/qcat?J/A+A/580/A75
NASA Astrophysics Data System (ADS)
Tugores, M. Pilar; Iglesias, Magdalena; Oñate, Dolores; Miquel, Joan
2016-02-01
In the Mediterranean Sea, the European anchovy (Engraulis encrasicolus) plays a key role in ecological and economic terms. Ensuring stock sustainability requires the provision of crucial information, such as species spatial distribution or unbiased abundance and precision estimates, so that management strategies can be defined (e.g. fishing quotas, temporal closure areas or marine protected areas, MPAs). Furthermore, the estimation of the precision of global abundance at different sampling intensities can be used for survey design optimisation. Geostatistics provide a priori unbiased estimations of the spatial structure, global abundance and precision for autocorrelated data. However, their application to non-Gaussian data introduces difficulties in the analysis and can reduce robustness or unbiasedness. The present study applied intrinsic geostatistics in two dimensions in order to (i) analyse the spatial distribution of anchovy in Spanish Western Mediterranean waters during the species' recruitment season, (ii) produce distribution maps, (iii) estimate global abundance and its precision, (iv) analyse the effect of changing the sampling intensity on the precision of global abundance estimates and, (v) evaluate the effects of several methodological options on the robustness of all the analysed parameters. The results suggested that while the spatial structure was usually non-robust to the tested methodological options when working with the original dataset, it became more robust for the transformed datasets (especially for the log-backtransformed dataset). The global abundance was always highly robust and the global precision was highly or moderately robust to most of the methodological options, except for data transformation.
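As a rough illustration of the first step of such an intrinsic geostatistical analysis, the classical (Matheron) empirical semivariogram estimates the spatial structure from half the mean squared difference between pairs of observations at each distance lag. This is a minimal 2-D sketch with illustrative names; the study's actual transformations and estimation procedure are not reproduced:

```python
import numpy as np

def empirical_variogram(coords, values, lag_width, n_lags):
    """Classical empirical semivariogram for 2-D data.
    coords: (n, 2) array of sample positions; values: (n,) array
    (e.g. log-transformed acoustic backscatter). Returns lag centers
    and semivariances gamma(h) = 0.5 * mean[(z_i - z_j)^2] over pairs
    whose separation falls in each lag bin."""
    coords = np.asarray(coords, dtype=float)
    values = np.asarray(values, dtype=float)
    n = len(values)
    # Pairwise separation distances and squared value differences.
    d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
    sq = (values[:, None] - values[None, :]) ** 2
    iu = np.triu_indices(n, k=1)          # count each pair once
    d, sq = d[iu], sq[iu]
    lags, gammas = [], []
    for k in range(n_lags):
        mask = (d >= k * lag_width) & (d < (k + 1) * lag_width)
        if mask.any():
            lags.append((k + 0.5) * lag_width)
            gammas.append(0.5 * sq[mask].mean())
    return np.array(lags), np.array(gammas)

# Toy usage: three collinear samples with linearly increasing values.
lags, gammas = empirical_variogram(
    [[0.0, 0.0], [1.0, 0.0], [2.0, 0.0]],
    [0.0, 1.0, 2.0],
    lag_width=1.5, n_lags=2)
```

A model (e.g. spherical or exponential) fitted to this empirical variogram then drives the abundance mapping and the global precision estimates discussed above.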
Blasimme, Alessandro; Vayena, Effy
2016-11-04
Precision medicine promises to develop diagnoses and treatments that take individual variability into account. According to most specialists, turning this promise into reality will require adapting the established framework of clinical research ethics, and paying more attention to participants' attitudes towards sharing genotypic, phenotypic, lifestyle data and health records, and ultimately to their desire to be engaged as active partners in medical research. Notions such as participation, engagement and partnership have been introduced in bioethics debates concerning genetics and large-scale biobanking to broaden the focus of discussion beyond individual choice and individuals' moral interests. The uptake of those concepts in precision medicine is to be welcomed. However, as data and medical information from research participants in precision medicine cohorts will be collected on an individual basis, translating a participatory approach in this emerging area may prove cumbersome. Therefore, drawing on Joseph Raz's perfectionism, we propose a principle of respect for autonomous agents that, we reckon, can address many of the concerns driving recent scholarship on partnership and public participation, while avoiding some of the limitations these concepts have in the context of precision medicine. Our approach offers a normative clarification of how becoming partners in precision medicine is compatible with retaining autonomy. Realigning the value of autonomy with ideals of direct engagement, we show, can provide adequate normative orientation to precision medicine; it can do justice to the idea of moral pluralism by stressing the value of moral self-determination; and, finally, it can reconcile the notion of autonomy with other more communitarian values such as participation and solidarity.
Influence of sectioning location on age estimates from common carp dorsal spines
Watkins, Carson J.; Klein, Zachary B.; Terrazas, Marc M.; Quist, Michael C.
2015-01-01
Dorsal spines have been shown to provide precise age estimates for Common Carp (Cyprinus carpio) and are commonly used by management agencies to gain information on Common Carp populations. However, no previous studies have evaluated variation in the precision of age estimates obtained from different sectioning locations along Common Carp dorsal spines. We evaluated the precision, relative readability, and distribution of age estimates obtained from various sectioning locations along Common Carp dorsal spines. Dorsal spines from 192 Common Carp were sectioned at the base (section 1), immediately distal to the basal section (section 2), and at 25% (section 3), 50% (section 4), and 75% (section 5) of the total length of the dorsal spine. The exact agreement and within-1-year agreement among readers was highest and the coefficient of variation lowest for section 2. In general, age estimates derived from sections 2 and 3 had similar age distributions and displayed the highest concordance in age estimates with section 1. Our results indicate that sections taken at ≤ 25% of the total length of the dorsal spine can be easily interpreted and provide precise estimates of Common Carp age. The greater consistency in age estimates obtained from section 2 indicates that by using a standard sectioning location, fisheries scientists can expect age-based estimates of population metrics to be more comparable and thus more useful for understanding Common Carp population dynamics.
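The precision measures used in such ageing studies (percent exact agreement, within-1-year agreement, and the coefficient of variation among readers) can be sketched as follows; the function name and toy data are illustrative, not the study's 192-fish data set:

```python
from statistics import mean, stdev

def ageing_precision(reader_ages):
    """Percent exact agreement, percent agreement within 1 year, and
    mean coefficient of variation (CV, %) for age estimates made by
    multiple readers. reader_ages: one tuple of ages per fish, one age
    per reader (illustrative data)."""
    n = len(reader_ages)
    exact = sum(1 for ages in reader_ages if len(set(ages)) == 1)
    within1 = sum(1 for ages in reader_ages
                  if max(ages) - min(ages) <= 1)
    # Per-fish CV = 100 * sd / mean, averaged over fish.
    cvs = [100 * stdev(ages) / mean(ages)
           for ages in reader_ages if mean(ages) > 0]
    return 100 * exact / n, 100 * within1 / n, mean(cvs)

# Toy usage: three fish aged independently by two readers.
ages = [(4, 4), (6, 7), (9, 9)]
exact_pct, within1_pct, mean_cv = ageing_precision(ages)
```

Lower mean CV and higher agreement are what distinguished section 2 from the more distal sectioning locations in the study above.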
Safety and Certification Considerations for Expanding the Use of UAS in Precision Agriculture
NASA Technical Reports Server (NTRS)
Hayhurst, Kelly J.; Maddalon, Jeffrey M.; Neogi, Natasha A.; Vertstynen, Harry A.
2016-01-01
The agricultural community is actively engaged in adopting new technologies such as unmanned aircraft systems (UAS) to help assess the condition of crops and develop appropriate treatment plans. In the United States, agricultural use of UAS has largely been limited to small UAS, generally weighing less than 55 lb and operating within the line of sight of a remote pilot. A variety of small UAS are being used to monitor and map crops, while only a few are being used to apply agricultural inputs based on the results of remote sensing. Larger UAS with substantial payload capacity could provide an option for site-specific application of agricultural inputs in a timely fashion, without substantive damage to the crops or soil. A recent study by the National Aeronautics and Space Administration (NASA) investigated certification requirements needed to enable the use of larger UAS to support the precision agriculture industry. This paper provides a brief introduction to aircraft certification relevant to agricultural UAS, an overview of and results from the NASA study, and a discussion of how those results might affect the precision agriculture community. Specific topics of interest include business model considerations for unmanned aerial applicators and a comparison with current means of variable rate application. The intent of the paper is to inform the precision agriculture community of evolving technologies that will enable broader use of unmanned vehicles to reduce costs, reduce environmental impacts, and enhance yield, especially for specialty crops that are grown on small to medium size farms.
Differential porosimetry and permeametry for random porous media.
Hilfer, R; Lemmer, A
2015-07-01
Accurate determination of geometrical and physical properties of natural porous materials is notoriously difficult. Continuum multiscale modeling has provided carefully calibrated realistic microstructure models of reservoir rocks with floating point accuracy. Previous measurements using synthetic microcomputed tomography (μ-CT) were based on extrapolation of resolution-dependent properties for discrete digitized approximations of the continuum microstructure. This paper reports continuum measurements of volume and specific surface with full floating point precision. It also corrects an incomplete description of rotations in earlier publications. More importantly, the methods of differential permeametry and differential porosimetry are introduced as precision tools. The continuum microstructure chosen to exemplify the methods is a homogeneous, carefully calibrated and characterized model for Fontainebleau sandstone. The sample has been publicly available since 2010 on the worldwide web as a benchmark for methodical studies of correlated random media. High-precision porosimetry gives the volume and internal surface area of the sample with floating point accuracy. Continuum results with floating point precision are compared to discrete approximations. Differential porosities and differential surface area densities allow geometrical fluctuations to be discriminated from discretization effects and numerical noise. Differential porosimetry and Fourier analysis reveal subtle periodic correlations. The findings uncover small oscillatory correlations with a period of roughly 850μm, thus implying that the sample is not strictly stationary. The correlations are attributed to the deposition algorithm that was used to ensure the grain overlap constraint. Differential permeabilities are introduced and studied. Differential porosities and permeabilities provide scale-dependent information on geometry fluctuations, thereby allowing quantitative error estimates.
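As a loose discrete analogue of the differential porosimetry described above, the porosity profile along one axis of a voxelized pore-space image exposes spatial fluctuations such as the periodic correlations reported. This sketch assumes a boolean voxel image (True = pore); the paper itself works with a continuum microstructure at floating point precision, which this discretized version does not reproduce:

```python
import numpy as np

def slicewise_porosity(pore_voxels, axis=0):
    """Porosity of each slice perpendicular to `axis` for a voxelized
    pore-space image (True = pore voxel). Fluctuations, e.g. periodic
    oscillations, show up directly in the returned profile; a Fourier
    transform of the profile would reveal their period."""
    pore = np.asarray(pore_voxels, dtype=bool)
    other_axes = tuple(i for i in range(pore.ndim) if i != axis)
    return pore.mean(axis=other_axes)

# Toy usage: a 4x2x2 image whose first two slices are half pore
# and whose last two slices are fully solid.
img = np.zeros((4, 2, 2), dtype=bool)
img[:2, 0, :] = True
profile = slicewise_porosity(img, axis=0)
```
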
Drug-Induced Rhabdomyolysis Atlas (DIRA) for idiosyncratic adverse drug reaction management.
Wen, Zhining; Liang, Yu; Hao, Yingyi; Delavan, Brian; Huang, Ruili; Mikailov, Mike; Tong, Weida; Li, Menglong; Liu, Zhichao
2018-06-11
Drug-induced rhabdomyolysis (DIR) is an idiosyncratic and fatal adverse drug reaction (ADR) characterized by severe muscle injuries accompanied by multiple-organ failure. Limited knowledge regarding the pathophysiology of rhabdomyolysis is the main obstacle to developing early biomarkers and prevention strategies. Given the lack of a centralized data resource to curate, organize, and standardize widespread DIR information, here we present a Drug-Induced Rhabdomyolysis Atlas (DIRA) that provides DIR-related information, including: a classification scheme for DIR based on drug labeling information; postmarketing surveillance data of DIR; and DIR drug property information. To elucidate the utility of DIRA, we used precision dosing, concomitant use of DIR drugs, and predictive modeling development to exemplify strategies for idiosyncratic ADR (IADR) management. Published by Elsevier Ltd.
Zhang, Chi; Zhang, Ge; Chen, Ke-ji; Lu, Ai-ping
2016-04-01
The development of an effective classification method for human health conditions is essential for precise diagnosis and delivery of tailored therapy to individuals. Contemporary classification of disease systems has properties that limit its information content and usability. Chinese medicine pattern classification has been incorporated with disease classification, and this integrated classification method became more precise because of the increased understanding of the molecular mechanisms. However, we are still facing the complexity of diseases and patterns in the classification of health conditions. With continuing advances in omics methodologies and instrumentation, we are proposing a new classification approach: molecular module classification, which is applying molecular modules to classifying human health status. The initiative would be precisely defining the health status, providing accurate diagnoses, optimizing the therapeutics and improving new drug discovery strategy. Therefore, there would be no current disease diagnosis, no disease pattern classification, and in the future, a new medicine based on this classification, molecular module medicine, could redefine health statuses and reshape the clinical practice.
Human genomics projects and precision medicine.
Carrasco-Ramiro, F; Peiró-Pastor, R; Aguado, B
2017-09-01
The completion of the Human Genome Project (HGP) in 2001 opened the floodgates to a deeper understanding of medicine. There are dozens of HGP-like projects currently in progress, involving from a few tens to several million genomes and ranging from specialized goals to more general approaches. However, data generation, storage, management and analysis in public and private cloud computing platforms have raised concerns about privacy and security. The knowledge gained from further research has changed the field of genomics and is now slowly permeating into clinical medicine. The new precision (personalized) medicine, where genome sequencing and data analysis are essential components, allows tailored diagnosis and treatment according to the information from the patient's own genome and specific environmental factors. P4 (predictive, preventive, personalized and participatory) medicine is introducing new concepts, challenges and opportunities. This review summarizes current sequencing technologies, concentrates on ongoing human genomics projects, and provides some examples in which precision medicine has already demonstrated clinical impact in diagnosis and/or treatment.
Current status and future trends of precision agricultural aviation technologies
USDA-ARS?s Scientific Manuscript database
Modern technologies and information tools can be used to maximize agricultural aviation productivity allowing for precision application of agrochemical products. This paper reviews and summarizes the state-of-the-art in precision agricultural aviation technology highlighting remote sensing, aerial s...
[Medical imaging in tumor precision medicine: opportunities and challenges].
Xu, Jingjing; Tan, Yanbin; Zhang, Minming
2017-05-25
Tumor precision medicine is an emerging approach to tumor diagnosis, treatment and prevention that takes into account individual variability in environment, lifestyle and genetic information. It builds on the medical imaging innovations of the past decades, including new hardware, new imaging agents, standardized protocols, image analysis and multimodal imaging fusion technology. The development of automated and reproducible analysis algorithms has also allowed large amounts of information to be extracted from image-based features. With the continuous development and mining of tumor clinical and imaging databases, radiogenomics, radiomics and artificial intelligence have been flourishing. These new technological advances therefore bring new opportunities and challenges to the application of imaging in tumor precision medicine.
Error analysis of high-rate GNSS precise point positioning for seismic wave measurement
NASA Astrophysics Data System (ADS)
Shu, Yuanming; Shi, Yun; Xu, Peiliang; Niu, Xiaoji; Liu, Jingnan
2017-06-01
High-rate GNSS precise point positioning (PPP) plays an increasingly important role in providing precise positioning information in fast time-varying environments. Although kinematic PPP is commonly known to have a precision of a few centimeters, recent experiments have reported that, over short periods of time, high-rate PPP can reach a precision of a few millimeters in the horizontal components and sub-centimeter in the vertical component when measuring seismic motion, several times better than conventional kinematic PPP practice. To fully understand the mechanism behind this seemingly surprising short-term performance, we carried out a theoretical error analysis of PPP and conducted the corresponding short-period simulations. The theoretical analysis clearly indicates that high-rate PPP errors consist of two types: residual systematic errors at the starting epoch, which affect high-rate PPP through the change of satellite geometry, and time-varying systematic errors between the starting epoch and the current epoch. Both the theoretical error analysis and the simulated results are fully consistent with, and thus unambiguously confirm, the reported high precision of high-rate PPP, which is further affirmed here by real-data experiments: high-rate PPP can indeed achieve millimeter-level precision in the horizontal components and sub-centimeter-level precision in the vertical component when measuring motion over a short period of time. The simulation results clearly show that the random noise of carrier phases and higher-order ionospheric errors are the two major factors affecting the precision of high-rate PPP over short periods.
The experiments with real data have also indicated that the precision of PPP solutions can degrade to the cm level in both the horizontal and vertical components, if the geometry of satellites is rather poor with a large DOP value.
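The first error type identified in this analysis, a residual systematic error fixed at the starting epoch, cancels when displacements are referenced to that epoch. The toy sketch below illustrates this under purely illustrative assumptions (a constant 5 cm bias, 3 mm carrier-phase noise, and no geometry change over the window); it is not the paper's error model, just a demonstration of the differencing effect:

```python
import random

random.seed(1)

def simulate_ppp_series(n_epochs, bias_m, noise_std_m):
    """1-D PPP position series: the true position is zero, contaminated by
    a constant residual systematic bias plus random carrier-phase noise."""
    return [bias_m + random.gauss(0.0, noise_std_m) for _ in range(n_epochs)]

# Illustrative values only: 5 cm residual bias, 3 mm random noise, 600 epochs.
series = simulate_ppp_series(600, bias_m=0.05, noise_std_m=0.003)

# Absolute positions carry the full ~5 cm systematic error ...
abs_error = sum(series) / len(series)

# ... but referencing every epoch to the starting epoch removes the constant
# bias, so the relative displacement retains only the mm-level random part.
relative = [x - series[0] for x in series]
rel_mean = sum(relative) / len(relative)
```

Over a short window the relative series is limited only by the random noise, consistent with the millimeter-level precision reported above; over longer windows the changing satellite geometry re-introduces the systematic part.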
Thalamic nuclei convey diverse contextual information to layer 1 of visual cortex
Imhof, Fabia; Martini, Francisco J.; Hofer, Sonja B.
2017-01-01
Sensory perception depends on the context within which a stimulus occurs. Prevailing models emphasize cortical feedback as the source of contextual modulation. However, higher-order thalamic nuclei, such as the pulvinar, interconnect with many cortical and subcortical areas, suggesting a role for the thalamus in providing sensory and behavioral context – yet the nature of the signals conveyed to cortex by higher-order thalamus remains poorly understood. Here we use axonal calcium imaging to measure information provided to visual cortex by the pulvinar equivalent in mice, the lateral posterior nucleus (LP), as well as the dorsolateral geniculate nucleus (dLGN). We found that dLGN conveys retinotopically precise visual signals, while LP provides distributed information from the visual scene. Both LP and dLGN projections carry locomotion signals. However, while dLGN inputs often respond to positive combinations of running and visual flow speed, LP signals discrepancies between self-generated and external visual motion. This higher-order thalamic nucleus therefore conveys diverse contextual signals that inform visual cortex about visual scene changes not predicted by the animal’s own actions. PMID:26691828
Day surgery nurses' selection of patient preoperative information.
Mitchell, Mark
2017-01-01
To determine selection and delivery of preoperative verbal information deemed important by nurses to relay to patients immediately prior to day surgery. Elective day-case surgery is expanding, patient turnover is high and nurse-patient contact limited. In the brief time-frame available, nurses must select and precisely deliver information to patients, provide answers to questions and gain compliance to ensure a sustained, co-ordinated patient throughput. Concise information selection is therefore necessary especially given continued day surgery expansion. Electronic questionnaire. A survey investigating nurses' choice of patient information prior to surgery was distributed throughout the UK via email addresses listed on the British Association of Day Surgery member's website (January 2015-April 2015). Participants were requested to undertake the survey within 2-3 weeks, with 137 participants completing the survey giving a 44% response rate. Verbal information deemed most important by nurses preoperatively was checking fasting time, information about procedure/operation, checking medication, ensuring presence of medical records/test results and concluding medical investigations checks. To a lesser extent was theatre environment information, procedure/operation start time and possible time to discharge. Significant differences were established between perceived importance of information and information delivery concerning the procedure/operation and anaesthesia details. Nurses working with competing demands and frequent interruptions, prioritised patient safety information. Although providing technical details during time-limited encounters, efforts were made to individualise provision. A more formal plan of verbal information provision could help ease nurses' cognitive workload and enhance patient satisfaction. This study provides evidence that verbal information provided immediately prior to day surgery may vary with experience. 
Nurse educators and managers may need to provide greater guidance for such complex care settings as delivery of increasingly technical details during brief encounters is gaining increasing priority. © 2016 John Wiley & Sons Ltd.
NASA Technical Reports Server (NTRS)
Collis, R. T. H.
1969-01-01
Lidar is an optical radar technique employing laser energy. Variations in signal intensity as a function of range provide information on atmospheric constituents, even when these are too tenuous to be normally visible. The theoretical and technical basis of the technique is described and typical values of the atmospheric optical parameters given. The significance of these parameters to atmospheric and meteorological problems is discussed. While the basic technique can provide valuable information about clouds and other material in the atmosphere, it is not possible to determine particle size and number concentrations precisely. There are also inherent difficulties in evaluating lidar observations. Nevertheless, lidar can provide much useful information as is shown by illustrations. These include lidar observations of: cirrus cloud, showing mountain wave motions; stratification in clear air due to the thermal profile near the ground; determinations of low cloud and visibility along an air-field approach path; and finally the motion and internal structure of clouds of tracer materials (insecticide spray and explosion-caused dust) which demonstrate the use of lidar for studying transport and diffusion processes.
Neural timing signal for precise tactile timing judgments
Watanabe, Junji; Nishida, Shin'ya
2016-01-01
The brain can precisely encode the temporal relationship between tactile inputs. While behavioural studies have demonstrated precise interfinger temporal judgments, the underlying neural mechanism remains unknown. Computationally, two kinds of neural responses can act as the information source. One is the phase-locked response to the phase of relatively slow inputs, and the other is the response to the amplitude change of relatively fast inputs. To isolate the contributions of these components, we measured performance on a synchrony judgment task for sine wave and amplitude-modulated (AM) wave stimuli. The sine wave stimulus was a low-frequency sinusoid, with the phase shifted in the asynchronous stimulus. The AM wave stimulus was a low-frequency sinusoidal AM of a 250-Hz carrier, with only the envelope shifted in the asynchronous stimulus. In the experiment, three stimulus pairs, two synchronous and one asynchronous, were sequentially presented to neighboring fingers, and participants were asked to report which was the asynchronous pair. We found that the asynchrony of AM waves could be detected as precisely as a single impulse pair, with the threshold asynchrony being ∼20 ms. On the other hand, the asynchrony of sine waves could not be detected at all in the range from 5 to 30 Hz. Our results suggest that the timing signal for tactile judgments is provided not by the stimulus phase information but by the envelope of the response of the high-frequency-sensitive Pacini channel (PC), although they do not exclude a possible contribution of the envelope of non-PCs. PMID:26843600
Zhang, Pei-feng; Hu, Yuan-man; He, Hong-shi
2010-05-01
The demand for accurate and up-to-date spatial information on urban buildings is becoming more and more important for urban planning, environmental protection, and other applications. Today's commercial high-resolution satellite imagery offers the potential to extract three-dimensional information about urban buildings. This paper extracted the three-dimensional information of urban buildings from QuickBird imagery and validated the precision of the extraction using Barista software. The extraction of three-dimensional building information from high-resolution satellite imagery with Barista software was shown to have the advantages of a low demand for specialist expertise, broad applicability, simple operation, and high precision. Point positioning and height determination accuracy at the one-pixel level could be achieved if the digital elevation model (DEM) and the sensor orientation model were of sufficiently high precision and the off-nadir view angle was favorable.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1990-08-01
The purpose of this volume of the QA Handbook is to provide information and guidance for both the meteorologist and the non-meteorologist who must make judgments about the validity of data and accuracy of measurement systems. Care has been taken to provide definitions to help those making these judgments to communicate without ambiguity. Methods are described in the handbook which will objectively define the quality of measurements so the non-meteorologist can communicate with the meteorologist or environmental scientist or engineer with precision of meaning.
Spatial attention improves the quality of population codes in human visual cortex.
Saproo, Sameer; Serences, John T
2010-08-01
Selective attention enables sensory input from behaviorally relevant stimuli to be processed in greater detail, so that these stimuli can more accurately influence thoughts, actions, and future goals. Attention has been shown to modulate the spiking activity of single feature-selective neurons that encode basic stimulus properties (color, orientation, etc.). However, the combined output from many such neurons is required to form stable representations of relevant objects and little empirical work has formally investigated the relationship between attentional modulations on population responses and improvements in encoding precision. Here, we used functional MRI and voxel-based feature tuning functions to show that spatial attention induces a multiplicative scaling in orientation-selective population response profiles in early visual cortex. In turn, this multiplicative scaling correlates with an improvement in encoding precision, as evidenced by a concurrent increase in the mutual information between population responses and the orientation of attended stimuli. These data therefore demonstrate how multiplicative scaling of neural responses provides at least one mechanism by which spatial attention may improve the encoding precision of population codes. Increased encoding precision in early visual areas may then enhance the speed and accuracy of perceptual decisions computed by higher-order neural mechanisms.
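As a hedged illustration of why multiplicative scaling can improve encoding precision, the following sketch decodes one of two orientations from a noisy, cosine-tuned population. The tuning shape, noise level, and gain values are illustrative assumptions, not the voxel-based analysis used in the study:

```python
import math, random

random.seed(0)

N_UNITS = 16
THETAS = [0.0, math.pi / 8]                               # two orientations
PREFS = [i * math.pi / N_UNITS for i in range(N_UNITS)]  # preferred orientations

def tuning(theta, pref):
    # Orientation tuning is pi-periodic; a simple raised-cosine profile.
    return 0.5 * (1.0 + math.cos(2.0 * (theta - pref)))

def population_response(theta, gain, noise_sd=0.3):
    # Attention is modelled as a multiplicative gain on every unit's response.
    return [gain * tuning(theta, p) + random.gauss(0.0, noise_sd) for p in PREFS]

def decode(resp):
    # Template matching: pick the orientation whose noiseless tuning profile
    # best matches the response (template norms are equal across orientations
    # for uniformly spaced preferences, so a dot product suffices).
    return max(THETAS, key=lambda th: sum(r * tuning(th, p)
                                          for r, p in zip(resp, PREFS)))

def accuracy(gain, trials=400):
    hits = 0
    for _ in range(trials):
        theta = random.choice(THETAS)
        hits += decode(population_response(theta, gain)) == theta
    return hits / trials

acc_unattended = accuracy(gain=1.0)
acc_attended = accuracy(gain=2.0)   # multiplicative attentional scaling
```

With fixed additive noise, scaling the signal raises the signal-to-noise ratio of the population code, so the attended condition decodes more reliably, the same logic that links multiplicative scaling to higher mutual information in the study.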
NASA Technical Reports Server (NTRS)
Agnes, Gregory S.; Waldman, Jeff; Hughes, Richard; Peterson, Lee D.
2015-01-01
NASA's proposed Surface Water Ocean Topography (SWOT) mission, scheduled to launch in 2020, would provide critical information about Earth's oceans, ocean circulation, fresh water storage, and river discharge. The mission concept calls for a dual-antenna Ka-band radar interferometer instrument, known as KaRIn, that would map the height of water globally along two 50 km wide swaths. The KaRIn antennas, which would be separated by 10 meters on either side of the spacecraft, would need to be precisely deployable in order to meet demanding pointing requirements. Consequently, an effort was undertaken to design, build, and prototype a precision deployable mast for the KaRIn instrument. Each mast was 4.5 m long with a required dilatation stability of 2.5 microns over 3 minutes and a minimum first mode of 7 Hz. Deployment repeatability was required to be within +/- 7 arcsec in all three rotation directions, and overall mass could not exceed 41.5 kg including actuators and thermal blanketing. This set of requirements meant the boom had to be three times lighter and two orders of magnitude more precise than the existing state of the art for deployable booms.
Budget impact and cost-effectiveness: can we afford precision medicine in oncology?
Doble, Brett
2016-01-01
Over the past decade there have been remarkable advancements in the understanding of the molecular underpinnings of malignancy. Methods of testing capable of elucidating patients' molecular profiles are now readily available and there is an increased desire to incorporate the information derived from such tests into treatment selection for cancer patients. This has led to more appropriate application of existing treatments as well as the development of a number of innovative and highly effective treatments or what is known collectively as precision medicine. The impact that precision medicine will have on health outcomes is uncertain, as are the costs it will incur. There is, therefore, a need to develop economic evidence and appropriate methods of evaluation to support its implementation to ensure the resources allocated to these approaches are affordable and offer value for money. The market for precision medicine in oncology continues to rapidly expand, placing an increased pressure on reimbursement decision-makers to consider the value and opportunity cost of funding such approaches to care. The benefits of molecular testing can be complex and difficult to evaluate given currently available economic methods, potentially causing a distorted appreciation of their value. Funding decisions of precision medicine will also have far-reaching implications, requiring the consideration of both patient and public perspectives in decision-making. Recommendations to improve the value proposition of precision medicine are, therefore, provided with the hopes of facilitating a better understanding of its impact on outcomes and the overall health budget.
Measurement of latent cognitive abilities involved in concept identification learning.
Thomas, Michael L; Brown, Gregory G; Gur, Ruben C; Moore, Tyler M; Patt, Virginie M; Nock, Matthew K; Naifeh, James A; Heeringa, Steven; Ursano, Robert J; Stein, Murray B
2015-01-01
We used cognitive and psychometric modeling techniques to evaluate the construct validity and measurement precision of latent cognitive abilities measured by a test of concept identification learning: the Penn Conditional Exclusion Test (PCET). Item response theory parameters were embedded within classic associative- and hypothesis-based Markov learning models and were fitted to 35,553 Army soldiers' PCET data from the Army Study to Assess Risk and Resilience in Servicemembers (Army STARRS). Data were consistent with a hypothesis-testing model with two latent abilities: abstraction and set shifting. Latent abstraction ability was positively correlated with the number of concepts learned, and latent set-shifting ability was negatively correlated with the number of perseverative errors, supporting the construct validity of the two parameters. Abstraction was most precisely assessed for participants with abilities ranging from 1.5 standard deviations below the mean to the mean itself. Measurement of set shifting was acceptably precise only for participants making a high number of perseverative errors. The PCET precisely measures latent abstraction ability in the Army STARRS sample, especially within the range of mildly impaired to average ability. This precision pattern is ideal for a test developed to measure cognitive impairment as opposed to cognitive strength. The PCET also measures latent set-shifting ability, but reliable assessment is limited to the impaired range of ability, reflecting the fact that perseverative errors are rare among cognitively healthy adults. Integrating cognitive and psychometric models can provide information about construct validity and measurement precision within a single analytical framework.
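The precision pattern described here follows directly from the item information function of IRT models. Under the Rasch (1PL) model, sketched below, an item is most informative where the probability of success is 0.5, i.e. for examinees whose ability matches the item difficulty; the difficulty value used is hypothetical:

```python
import math

def rasch_p(theta, b):
    """Probability of a correct response under the Rasch (1PL) model."""
    return 1.0 / (1.0 + math.exp(-(theta - b)))

def item_information(theta, b):
    """Fisher information of a Rasch item: p * (1 - p), maximal where
    ability theta equals item difficulty b (i.e. where p = 0.5)."""
    p = rasch_p(theta, b)
    return p * (1.0 - p)

# A hypothetical item with difficulty one SD below the mean measures most
# precisely around theta = -1 and contributes little in the upper tail,
# mirroring a test tuned to detect impairment rather than strength.
b = -1.0
info_at_match = item_information(-1.0, b)   # maximum possible: 0.25
info_in_tail = item_information(2.0, b)
```

Summing such information curves across a test's items shows at which ability levels the test as a whole measures precisely, which is how statements like "most precise from 1.5 SD below the mean to the mean" are derived.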
Supervised Learning in Spiking Neural Networks for Precise Temporal Encoding
Gardner, Brian; Grüning, André
2016-01-01
Precise spike timing as a means to encode information in neural networks is biologically supported, and is advantageous over frequency-based codes by processing input features on a much shorter time-scale. For these reasons, much recent attention has been focused on the development of supervised learning rules for spiking neural networks that utilise a temporal coding scheme. However, despite significant progress in this area, rules that have a theoretical basis and yet can be considered biologically relevant are still lacking. Here we examine the general conditions under which synaptic plasticity most effectively takes place to support the supervised learning of a precise temporal code. As part of our analysis we examine two spike-based learning methods: one relying on an instantaneous error signal to modify synaptic weights in a network (INST rule), and the other relying on a filtered error signal for smoother synaptic weight modifications (FILT rule). We test the accuracy of the solutions provided by each rule with respect to their temporal encoding precision, and then measure the maximum number of input patterns they can learn to memorise using the precise timings of individual spikes as an indication of their storage capacity. Our results demonstrate the high performance of the FILT rule in most cases, underpinned by the rule's error-filtering mechanism, which is predicted to provide smooth convergence towards a desired solution during learning. We also find the FILT rule to be most efficient at performing input pattern memorisations, most noticeably when patterns are identified using spikes with sub-millisecond temporal precision. In comparison with existing work, we determine the performance of the FILT rule to be consistent with that of the highly efficient E-learning Chronotron rule, but with the distinct advantage that our FILT rule is also implementable as an online method for increased biological realism. PMID:27532262
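A minimal sketch of the contrast between an instantaneous and a filtered error signal is given below. It is not the paper's exact formulation, just an exponentially filtered target-minus-actual spike-train error with illustrative time constants:

```python
import math

def filtered_error(target_spikes, actual_spikes, t_max, dt=1.0, tau=10.0):
    """Exponentially filtered spike-train error (target minus actual).
    An instantaneous (INST-like) signal would instead be nonzero only at
    the exact spike times, giving abrupt weight changes."""
    err, e = [], 0.0
    for i in range(int(t_max / dt)):
        t = i * dt
        e *= math.exp(-dt / tau)   # decay the running error trace
        if t in target_spikes:
            e += 1.0               # a desired spike was due: positive error
        if t in actual_spikes:
            e -= 1.0               # an emitted spike: negative error
        err.append(e)
    return err

# Target spike at 20 ms, actual spike at 23 ms: the filtered traces largely
# cancel, so the error (and any weight update it drives) stays small and
# decays smoothly instead of jumping at each spike.
err_near_miss = filtered_error({20.0}, {23.0}, t_max=100.0)
err_exact = filtered_error({20.0}, {20.0}, t_max=100.0)   # cancels entirely
```

A weight update proportional to this trace (gated by presynaptic activity) penalises timing mismatches gradually, which is the intuition behind the smoother convergence attributed to the FILT rule.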
LaHaye, Nicole L.; Kurian, Jose; Diwakar, Prasoon K.; ...
2015-08-19
An accurate and routinely available method for stoichiometric analysis of thin films is a desideratum of modern materials science, where a material's properties depend sensitively on elemental composition. We thoroughly investigated femtosecond laser ablation-inductively coupled plasma-mass spectrometry (fs-LA-ICP-MS) as an analytical technique for determining the stoichiometry of thin films down to the nanometer scale. Femtosecond laser ablation allows precise removal of material with high spatial and depth resolution, and can be coupled to an ICP-MS to obtain elemental and isotopic information. We used molecular beam epitaxy-grown thin films of LaPdxSb2 and T′-La2CuO4 to demonstrate the capacity of fs-LA-ICP-MS for stoichiometric analysis and the spatial and depth resolution of the technique. Here we demonstrate that the stoichiometric information of thin films with a thickness of ~10 nm or lower can be determined. Furthermore, our results indicate that fs-LA-ICP-MS provides precise information on the thin film-substrate interface and is able to detect the interdiffusion of cations.
Algorithmic and heuristic processing of information by the nervous system.
Restian, A
1980-01-01
Starting from the fact that the nervous system must discover the information it needs, the author describes how it decodes received messages. The logical circuits of the nervous system, which submit received signals to a process that uncovers the information they carry step by step, participate in decoding the message. Received signals can be processed algorithmically or heuristically. Algorithmic processing follows precise rules that must be applied step by step; through it, somatic and vegetative reflexes such as the control of blood pressure, heart rate and water metabolism are developed. When no precise processing rules are available, or when algorithmic processing would take too long, the nervous system must resort to heuristic processing. This is what differentiates the human brain from the electronic computer, which can work only according to extremely precise rules. The human brain can work with less precise rules because it can resort to trial-and-error operations and because it works according to a form of logic. Working with higher-order signals that represent the class of all the lower-order signals from which they derive, the brain need not perform every operation it would otherwise have to perform on the lower-order signals themselves; it therefore tries to subject received signals to as much of this "superization" as possible. All information processing, and especially heuristic processing, is accompanied by a certain affective color, without which the brain cannot operate. Emotions, passions and sentiments usually compensate for the imprecision of heuristic programmes. Finally, the author shows that the study of informational, and especially heuristic, processes can contribute to a better understanding of the transition from neurological to psychological activity.
SLATE: scanning laser automatic threat extraction
NASA Astrophysics Data System (ADS)
Clark, David J.; Prickett, Shaun L.; Napier, Ashley A.; Mellor, Matthew P.
2016-10-01
SLATE is an Autonomous Sensor Module (ASM) designed to work with the SAPIENT system, providing accurate location tracking and classification of targets that pass through its field of view. The concept behind the SLATE ASM is to produce a sensor module that provides a view of the world complementary to the camera-based systems usually used for wide-area surveillance. Cameras provide a high-fidelity, human-understandable view of the world with which tracking and identification algorithms can be used. Unfortunately, positioning and tracking in a 3D environment is difficult to implement robustly, making location-based threat assessment challenging. SLATE uses a Scanning Laser Rangefinder (SLR) that provides precise (<1 cm) positions, sizes, shapes and velocities of targets within its field of view (FoV). In this paper we discuss the development of the SLATE ASM, including the techniques used to track and classify detections that move through the sensor's field of view, providing accurate tracking information to the SAPIENT system. SLATE's ability to locate targets precisely allows subtle boundary-crossing judgements, e.g. on which side of a chain-link fence a target is. SLATE's ability to track targets in 3D throughout its FoV enables behavior classification, such as running versus walking, which can provide an indication of intent and help reduce false alarm rates.
Phyllis C. Adams; Glenn A. Christensen
2012-01-01
A rigorous quality assurance (QA) process assures that the data and information provided by the Forest Inventory and Analysis (FIA) program meet the highest possible standards of precision, completeness, representativeness, comparability, and accuracy. FIA relies on its analysts to check the final data quality prior to release of a State's data to the national FIA...
XUV Frequency Comb Development for Precision Spectroscopy and Ultrafast Science
2015-07-28
Morphology and force probing of primary murine liver sinusoidal endothelial cells.
Zapotoczny, B; Owczarczyk, K; Szafranska, K; Kus, E; Chlopicki, S; Szymonski, M
2017-07-01
Liver sinusoidal endothelial cells (LSECs) represent a unique type of endothelial cell with a characteristic morphology, i.e., the lack of a basement membrane and the presence of fenestrations: transmembrane pores acting as a dynamic filter between the vascular space and the liver parenchyma. The delicate structure of the LSEC membrane, combined with the submicron size of fenestrations, hinders their visualization in live cells. In this work, we apply atomic force microscopy contact mode to characterize fenestrations in LSECs. We reveal the structure of fenestrations in live LSECs. Moreover, we show that high-resolution imaging of fenestrations is possible for glutaraldehyde-fixed LSECs. Finally, thorough information about the morphology of LSECs, including strong contrast in the visualization of sieve plates and fenestrations, is provided using Force Modulation mode. We also show the ability to precisely localize cell nuclei in fixed LSECs, which can help provide a more precise description of the nanomechanical properties of cell nuclei using atomic force microscopy. The presented methodology, combining high-quality imaging of fixed cells with additional nanomechanical information on both live and fixed LSECs, provides a unique approach to studying LSEC morphology and nanomechanics that could foster understanding of the role of LSECs in maintaining liver homeostasis. Copyright © 2017 John Wiley & Sons, Ltd.
Item response theory and the measurement of motor behavior.
Safrit, M J; Cohen, A S; Costa, M G
1989-12-01
Item response theory (IRT) has been the focus of intense research and development activity in educational and psychological measurement during the past decade. Because this theory can provide more precise information about test items than other theories usually used in measuring motor behavior, the application of IRT in physical education and exercise science merits investigation. In IRT, the difficulty level of each item (e.g., trial or task) can be estimated and placed on the same scale as the ability of the examinee. Using this information, the test developer can determine the ability levels at which the test functions best. Equating the scores of individuals on two or more items or tests can be handled efficiently by applying IRT. The precision of the identification of performance standards in a mastery test context can be enhanced, as can adaptive testing procedures. In this tutorial, several potential benefits of applying IRT to the measurement of motor behavior were described. An example is provided using bowling data and applying the graded-response form of the Rasch IRT model. The data were calibrated and the goodness of fit was examined. This analysis is described in a step-by-step approach. Limitations to using an IRT model with a test consisting of repeated measures were noted.
Comparative study of aircraft approach and landing performance using ILS, MLS and GLS
NASA Astrophysics Data System (ADS)
Ferdous, Mahbuba; Rashid, Mohsina; China, Mst Mowsumie Akhter; Hossam-E-Haider, Md
2017-12-01
Aircraft landing is one of the most challenging stages of a flight. At this stage, the risk of an aircraft drifting away from the runway or colliding with other aircraft is very high, so great accuracy is required to guide the aircraft precisely to the runway touchdown point. Precision approaches are enabled by appropriate ground and airborne systems such as the Instrument Landing System (ILS) and the Microwave Landing System (MLS); satellite-based systems such as the Global Positioning System (GPS) can also be used, augmented by information supplied by ground-based augmentation systems (GBAS). This paper provides an overall review of aircraft performance with the different landing aids available for executing a safe landing. It covers the performance of the different landing systems with respect to the azimuth and elevation information provided to the pilot, as well as the errors each system encounters. The paper also shows that, in addition to eliminating the errors of ground-based systems (ILS or MLS), augmented GPS (GBAS) is able to fulfill the ICAO aircraft landing category requirements from CAT I to CAT IIIB. CAT IIIC standards, which require landing with no visibility and no runway visual range, are still not in use anywhere in the world.
Accurate object tracking system by integrating texture and depth cues
NASA Astrophysics Data System (ADS)
Chen, Ju-Chin; Lin, Yu-Hang
2016-03-01
A robust object tracking system that is invariant to object appearance variations and background clutter is proposed. Multiple instance learning with a boosting algorithm is applied to select discriminative texture information that separates the object from the background. Additionally, depth information, which is important for distinguishing the object from a complicated background, is integrated. We propose two depth-based models that complement the texture information to cope with both appearance variations and background clutter. Moreover, to reduce the risk of drift, which increases with textureless depth templates, an update mechanism is proposed that selects more precise tracking results and avoids incorrect model updates. In the experiments, the robustness of the proposed system is evaluated and quantitative results are provided for performance analysis. Experimental results show that the proposed system achieves the best success rate and tracks more accurately than other well-known algorithms.
Rough set soft computing cancer classification and network: one stone, two birds.
Zhang, Yue
2010-07-15
Gene expression profiling provides tremendous information to help unravel the complexity of cancer. The selection of the most informative genes from huge noise for cancer classification has taken centre stage, along with predicting the function of such identified genes and the construction of direct gene regulatory networks at different system levels with a tuneable parameter. A new study by Wang and Gotoh described a novel Variable Precision Rough Sets-rooted robust soft computing method to successfully address these problems and has yielded some new insights. The significance of this progress and its perspectives will be discussed in this article.
Estimation of the Parameters in a Two-State System Coupled to a Squeezed Bath
NASA Astrophysics Data System (ADS)
Hu, Yao-Hua; Yang, Hai-Feng; Tan, Yong-Gang; Tao, Ya-Ping
2018-04-01
The estimation of the phase and weight parameters of a two-state system in a squeezed bath is investigated by calculating the quantum Fisher information. The results show that, for both phase estimation and weight estimation, the quantum Fisher information always decays with time and changes periodically with the phases. The estimation precision can be enhanced by choosing proper values of the phases and the squeezing parameter. These results can serve as a reference for practical applications of parameter estimation in a squeezed bath.
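The link between quantum Fisher information and estimation precision invoked above is the quantum Cramér-Rao bound. The abstract does not state the formulas; the standard definitions (for a parameter θ and, in the pure-state case, a state |ψ_θ⟩) are:

```latex
% Quantum Cramér-Rao bound: for any unbiased estimator of \theta
% obtained from \nu independent runs,
\operatorname{Var}(\hat{\theta}) \;\ge\; \frac{1}{\nu\, F_Q(\theta)},
% so larger quantum Fisher information F_Q means higher achievable precision.
% For a pure state |\psi_\theta\rangle the quantum Fisher information reduces to
F_Q(\theta) \;=\; 4\left( \langle \partial_\theta \psi_\theta \,|\, \partial_\theta \psi_\theta \rangle
 \;-\; \bigl| \langle \psi_\theta \,|\, \partial_\theta \psi_\theta \rangle \bigr|^2 \right).
```

A decaying F_Q(t), as reported in the abstract, therefore translates directly into a growing lower bound on the estimator variance.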
Grundmeier, Robert W; Masino, Aaron J; Casper, T Charles; Dean, Jonathan M; Bell, Jamie; Enriquez, Rene; Deakyne, Sara; Chamberlain, James M; Alpern, Elizabeth R
2016-11-09
Important information to support healthcare quality improvement is often recorded in free-text documents such as radiology reports. Natural language processing (NLP) methods may help extract this information, but these methods have rarely been applied outside the research laboratories where they were developed. Our objective was to implement and validate NLP tools that identify long bone fractures, in support of pediatric emergency medicine quality improvement. Using freely available statistical software packages, we implemented NLP methods to identify long bone fractures from radiology reports. A sample of 1,000 radiology reports was used to construct three candidate classification models, and a test set of 500 reports was used to validate model performance. Blinded manual review of the radiology reports by two independent physicians provided the reference standard. Each radiology report was segmented, and word-stem and bigram features were constructed. Common English "stop words" and rare features were excluded. We used 10-fold cross-validation to select optimal configuration parameters for each model. Accuracy, recall, precision, and the F1 score were calculated. The final model was compared to the use of diagnosis codes for the identification of patients with long bone fractures. There were 329 unique word stems and 344 bigrams in the training documents. A support vector machine classifier with Gaussian kernel performed best on the test set, with accuracy=0.958, recall=0.969, precision=0.940, and F1 score=0.954. Optimal parameters for this model were cost=4 and gamma=0.005. All three classification models performed better than diagnosis codes in terms of accuracy, precision, and F1 score (diagnosis codes: accuracy=0.932, recall=0.960, precision=0.896, F1 score=0.927). NLP methods using a corpus of 1,000 training documents accurately identified acute long bone fractures from radiology reports. Strategic use of straightforward NLP methods, implemented with freely available software, offers quality improvement teams new opportunities to extract information from narrative documents.
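The reported F1 scores can be checked directly from the reported precision and recall, since F1 is their harmonic mean. A quick sanity check using the numbers in the abstract (the helper function name is mine):

```python
def f1_score(precision, recall):
    """F1 is the harmonic mean of precision and recall."""
    return 2 * precision * recall / (precision + recall)

# Reported test-set metrics for the SVM with Gaussian kernel:
svm_f1 = f1_score(precision=0.940, recall=0.969)
print(round(svm_f1, 3))   # 0.954, matching the reported F1

# Reported metrics for the diagnosis-code baseline:
code_f1 = f1_score(precision=0.896, recall=0.960)
print(round(code_f1, 3))  # 0.927, matching the reported F1
```

Both reported F1 values are internally consistent with the reported precision and recall.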
COBALT Flight Demonstrations Fuse Technologies
2017-06-07
This 5-minute, 50-second video shows how the CoOperative Blending of Autonomous Landing Technologies (COBALT) system pairs new landing sensor technologies that promise to yield the highest-precision navigation solution ever tested for NASA space landing applications. The technologies included a navigation doppler lidar (NDL), which provides ultra-precise velocity and line-of-sight range measurements, and the Lander Vision System (LVS), which provides terrain-relative navigation. Through flight campaigns conducted in March and April 2017 aboard Masten Space Systems' Xodiac, a rocket-powered vertical takeoff, vertical landing (VTVL) platform, the COBALT system was flight tested to collect sensor performance data for NDL and LVS and to check the integration and communication between COBALT and the rocket. The flight tests provided excellent performance data for both sensors, as well as valuable information on the integrated performance with the rocket that will be used for subsequent COBALT modifications prior to follow-on flight tests. Based at NASA's Armstrong Flight Research Center in Edwards, CA, the Flight Opportunities program funds technology development flight tests with commercial suborbital space providers; Masten is one such vendor. The program previously tested the LVS on the Masten rocket and validated the technology for the Mars 2020 rover.
Using normalization 3D model for automatic clinical brain quantitative analysis and evaluation
NASA Astrophysics Data System (ADS)
Lin, Hong-Dun; Yao, Wei-Jen; Hwang, Wen-Ju; Chung, Being-Tau; Lin, Kang-Ping
2003-05-01
Functional medical imaging, such as PET or SPECT, is capable of revealing physiological functions of the brain and has been broadly used in diagnosing brain disorders by clinical quantitative analysis for many years. In routine procedures, physicians manually select desired ROIs from structural MR images and then obtain physiological information from the corresponding functional PET or SPECT images. The accuracy of quantitative analysis thus relies on that of the subjectively selected ROIs. Therefore, standardizing the analysis procedure is fundamental and important in improving the analysis outcome. In this paper, we propose and evaluate a normalization procedure with a standard 3D brain model to achieve precise quantitative analysis. In the normalization process, a mutual information registration technique was applied to realign functional medical images to standard structural medical images. Then, the standard 3D brain model, which shows well-defined brain regions, was used in place of manual ROIs in the objective clinical analysis. To validate the performance, twenty cases of I-123 IBZM SPECT images were used in a practical clinical evaluation. The results show that the quantitative analysis outcomes obtained from this automated method agree with the clinical diagnosis evaluation score, with less than 3% error on average. In summary, the method obtains precise VOI information automatically from the well-defined standard 3D brain model, sparing the traditional procedure of manually drawing ROIs slice by slice on structural medical images. That is, the method not only provides precise analysis results but also improves the processing rate for large volumes of medical images in clinical practice.
Aboud, Maurice J; Gassmann, Marcus; McCord, Bruce
2015-09-01
There are situations in which it is important to quickly and positively identify an individual. Examples include suspects detained in the neighborhood of a bombing or terrorist incident, individuals detained attempting to enter or leave the country, and victims of mass disasters. Systems utilized for these purposes must be fast, portable, and easy to maintain. DNA typing methods provide the best biometric information, yielding identity, kinship, and geographical origin, but they are neither portable nor rapid. This study details the development of a portable short-channel microfluidic device based on a modified Agilent 2100 bioanalyzer for applications in forensic genomics. The system utilizes a denaturing polymer matrix with dual-channel laser-induced fluorescence and is capable of producing a genotype in 80 sec. The device was tested for precision and resolution using an allelic ladder created from 6 short tandem repeat (STR) loci and a sex marker (amelogenin). The results demonstrated a precision of 0.09-0.21 bp over the entire size range and resolution values from 2.5 to 4.1 bp. Overall, the results demonstrate that the chip provides a portable, rapid, and precise method for screening amplified short tandem repeats for human identification. © 2015 American Academy of Forensic Sciences.
Hung, Man; Baumhauer, Judith F; Latt, L Daniel; Saltzman, Charles L; SooHoo, Nelson F; Hunt, Kenneth J
2013-11-01
In 2012, the American Orthopaedic Foot & Ankle Society(®) established a national network for collecting and sharing data on treatment outcomes and improving patient care. One of the network's initiatives is to explore the use of computerized adaptive tests (CATs) for patient-level outcome reporting. We determined whether the CAT from the NIH Patient Reported Outcome Measurement Information System(®) (PROMIS(®)) Physical Function (PF) item bank provides efficient, reliable, valid, precise, and adequately covered point estimates of patients' physical function. After informed consent, 288 patients with a mean age of 51 years (range, 18-81 years) undergoing surgery for common foot and ankle problems completed a web-based questionnaire. Efficiency was determined by time for test administration. Reliability was assessed with person and item reliability estimates. Validity evaluation included content validity from expert review and construct validity measured against the PROMIS(®) Pain CAT and patient responses based on tradeoff perceptions. Precision was assessed by standard error of measurement (SEM) across patients' physical function levels. Instrument coverage was based on a person-item map. Average time of test administration was 47 seconds. Reliability was 0.96 for person and 0.99 for item. Construct validity against the Pain CAT had an r value of -0.657 (p < 0.001). Precision had an SEM of less than 3.3 (equivalent to a Cronbach's alpha of ≥ 0.90) across a broad range of function. Concerning coverage, the ceiling effect was 0.32% and there was no floor effect. The PROMIS(®) PF CAT appears to be an excellent method for measuring outcomes for patients with foot and ankle surgery. Further validation of the PROMIS(®) item banks may ultimately provide a valid and reliable tool for measuring patient-reported outcomes after injuries and treatment.
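The abstract's equivalence between an SEM below 3.3 and a Cronbach's alpha of at least 0.90 follows from the classical test theory relation SEM = SD·√(1 − reliability). A quick sketch, assuming the PROMIS T-score metric with SD = 10 (the usual PROMIS convention, not stated in the abstract):

```python
def reliability_from_sem(sem, sd=10.0):
    """Classical test theory: SEM = SD * sqrt(1 - reliability),
    so reliability = 1 - (SEM / SD)**2. The default sd=10 assumes
    the PROMIS T-score metric."""
    return 1.0 - (sem / sd) ** 2

# An SEM of 3.3 on the T-score metric corresponds to a reliability of
# about 0.89, i.e. roughly the alpha >= 0.90 equivalence cited above.
print(round(reliability_from_sem(3.3), 2))  # 0.89
```

Smaller SEM values map to higher reliability; an SEM of 2.2, for instance, would correspond to a reliability of about 0.95.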
McCormick, Joshua L.; Quist, Michael C.; Schill, Daniel J.
2012-01-01
Roving–roving and roving–access creel surveys are the primary techniques used to obtain information on harvest of Chinook salmon Oncorhynchus tshawytscha in Idaho sport fisheries. Once interviews are conducted using roving–roving or roving–access survey designs, mean catch rate can be estimated with the ratio-of-means (ROM) estimator, the mean-of-ratios (MOR) estimator, or the MOR estimator with exclusion of short-duration (≤0.5 h) trips. Our objective was to examine the relative bias and precision of total catch estimates obtained from use of the two survey designs and three catch rate estimators for Idaho Chinook salmon fisheries. Information on angling populations was obtained by direct visual observation of portions of Chinook salmon fisheries in three Idaho river systems over an 18-d period. Based on data from the angling populations, Monte Carlo simulations were performed to evaluate the properties of the catch rate estimators and survey designs. Among the three estimators, the ROM estimator provided the most accurate and precise estimates of mean catch rate and total catch for both roving–roving and roving–access surveys. On average, the root mean square error of simulated total catch estimates was 1.42 times greater and relative bias was 160.13 times greater for roving–roving surveys than for roving–access surveys. Length-of-stay bias and nonstationary catch rates in roving–roving surveys both appeared to affect catch rate and total catch estimates. Our results suggest that use of the ROM estimator in combination with an estimate of angler effort provided the least biased and most precise estimates of total catch for both survey designs. However, roving–access surveys were more accurate than roving–roving surveys for Chinook salmon fisheries in Idaho.
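The two catch-rate estimators compared above differ only in where the averaging happens. A minimal sketch with hypothetical interview data (the paper's Monte Carlo machinery is not reproduced here):

```python
def ratio_of_means(catches, hours):
    """ROM estimator: total catch divided by total effort,
    i.e. a single ratio of the summed quantities."""
    return sum(catches) / sum(hours)

def mean_of_ratios(catches, hours, min_hours=0.0):
    """MOR estimator: the average of per-trip catch rates. Setting
    min_hours=0.5 reproduces the variant that excludes
    short-duration (<= 0.5 h) trips."""
    rates = [c / h for c, h in zip(catches, hours) if h > min_hours]
    return sum(rates) / len(rates)

# Hypothetical angler interviews: per-trip catch and hours fished.
catches = [0, 1, 2, 0, 3]
hours = [0.5, 2.0, 4.0, 1.0, 6.0]

print(ratio_of_means(catches, hours))       # 6 / 13.5, effort-weighted
print(mean_of_ratios(catches, hours))       # unweighted mean of trip rates
print(mean_of_ratios(catches, hours, 0.5))  # 0.5 h trip excluded
```

ROM implicitly weights each trip by its effort, which is why it behaves differently from MOR under length-of-stay bias in roving-roving surveys.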
Hori, Kenta; Kuroda, Tomohiro; Oyama, Hiroshi; Ozaki, Yasuhiko; Nakamura, Takehiko; Takahashi, Takashi
2005-12-01
For faultless collaboration among the surgeon, surgical staff, and surgical robots in telesurgery, communication must include environmental information from the remote operating room, such as the behavior of robots and staff and the vital signs of the patient (termed supporting information), in addition to the view of the surgical field. The "Surgical Cockpit System," a telesurgery support system developed by the authors, is mainly focused on exchanging supporting information between remote sites. Live video presentation is an important technology for the Surgical Cockpit System, and a visualization method that conveys the precise location and posture of surgical instruments is indispensable for accurate control and faultless operation. In this paper, the authors propose a three-side-view presentation method for precise location/posture control of surgical instruments in telesurgery. The experimental results show that the proposed method improved accurate positioning of a telemanipulator.
Opportunities and Challenges for Personal Heat Exposure Research
Kuras, Evan R.; Richardson, Molly B.; Calkins, Miriam M.; Ebi, Kristie L.; Hess, Jeremy J.; Kintziger, Kristina W.; Jagger, Meredith A.; Middel, Ariane; Scott, Anna A.; Spector, June T.; Uejio, Christopher K.; Vanos, Jennifer K.; Zaitchik, Benjamin F.; Gohlke, Julia M.
2017-01-01
Background: Environmental heat exposure is a public health concern. The impacts of environmental heat on mortality and morbidity at the population scale are well documented, but little is known about specific exposures that individuals experience. Objectives: The first objective of this work was to catalyze discussion of the role of personal heat exposure information in research and risk assessment. The second objective was to provide guidance regarding the operationalization of personal heat exposure research methods. Discussion: We define personal heat exposure as realized contact between a person and an indoor or outdoor environment that poses a risk of increases in body core temperature and/or perceived discomfort. Personal heat exposure can be measured directly with wearable monitors or estimated indirectly through the combination of time–activity and meteorological data sets. Complementary information to understand individual-scale drivers of behavior, susceptibility, and health and comfort outcomes can be collected from additional monitors, surveys, interviews, ethnographic approaches, and additional social and health data sets. Personal exposure research can help reveal the extent of exposure misclassification that occurs when individual exposure to heat is estimated using ambient temperature measured at fixed sites and can provide insights for epidemiological risk assessment concerning extreme heat. Conclusions: Personal heat exposure research provides more valid and precise insights into how often people encounter heat conditions and when, where, to whom, and why these encounters occur. Published literature on personal heat exposure is limited to date, but existing studies point to opportunities to inform public health practice regarding extreme heat, particularly where fine-scale precision is needed to reduce health consequences of heat exposure. https://doi.org/10.1289/EHP556 PMID:28796630
Achterberg, Peter
2014-05-01
This research note studies experimentally how the public translates information about hydrogen technology into evaluations of that technology. It does so by means of a nationally representative factorial survey in the Netherlands (n = 1,012), in which respondents were given seven randomly selected pieces of (negative, positive, and/or neutral) information about the technology. Findings are consistent with framing theory. For those with high trust in science and technology, positive information increases support, while negative information detracts from it. For those with low trust in science and technology, however, information provision has no effect at all on the evaluation of hydrogen technology. Thus, precisely among the most likely targets of science communication, i.e., those without much trust in science and technology, providing positive information fails to evoke a more favorable evaluation.
A Concept for Airborne Precision Spacing for Dependent Parallel Approaches
NASA Technical Reports Server (NTRS)
Barmore, Bryan E.; Baxley, Brian T.; Abbott, Terence S.; Capron, William R.; Smith, Colin L.; Shay, Richard F.; Hubbs, Clay
2012-01-01
The Airborne Precision Spacing concept of operations has been previously developed to support the precise delivery of aircraft landing successively on the same runway. The high-precision and consistent delivery of inter-aircraft spacing allows for increased runway throughput and the use of energy-efficient arrivals routes such as Continuous Descent Arrivals and Optimized Profile Descents. This paper describes an extension to the Airborne Precision Spacing concept to enable dependent parallel approach operations where the spacing aircraft must manage their in-trail spacing from a leading aircraft on approach to the same runway and spacing from an aircraft on approach to a parallel runway. Functionality for supporting automation is discussed as well as procedures for pilots and controllers. An analysis is performed to identify the required information and a new ADS-B report is proposed to support these information needs. Finally, several scenarios are described in detail.
Layered compression for high-precision depth data.
Miao, Dan; Fu, Jingjing; Lu, Yan; Li, Shipeng; Chen, Chang Wen
2015-12-01
With the development of depth data acquisition technologies, access to high-precision depth with more than 8-bit precision has become much easier, and determining how to efficiently represent and compress high-precision depth is essential for practical depth storage and transmission systems. In this paper, we propose a layered high-precision depth compression framework based on an 8-bit image/video encoder to achieve efficient compression with low complexity. Within this framework, considering the characteristics of high-precision depth, a depth map is partitioned into two layers: 1) the most significant bits (MSBs) layer and 2) the least significant bits (LSBs) layer. The MSBs layer provides the rough depth value distribution, while the LSBs layer records the details of the depth value variation. For the MSBs layer, an error-controllable pixel-domain encoding scheme is proposed to exploit the data correlation of the general depth information with sharp edges and to guarantee that the LSBs layer remains in 8-bit format after absorbing the quantization error from the MSBs layer. For the LSBs layer, a standard 8-bit image/video codec is leveraged to perform the compression. The experimental results demonstrate that the proposed coding scheme can achieve real-time depth compression with satisfactory reconstruction quality. Moreover, the compressed depth data generated by this scheme achieve better performance in view synthesis and gesture recognition applications than conventional coding schemes because of the error control algorithm.
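The core layering idea can be sketched in a few lines for the 16-bit case. This is only the bit-partitioning step; the paper's error-controlled MSB coding and 8-bit codec stages are not shown:

```python
def split_depth(depth16):
    """Partition a 16-bit depth value into two 8-bit layers:
    the MSBs layer (rough depth distribution) and the
    LSBs layer (fine depth variation)."""
    msb = (depth16 >> 8) & 0xFF
    lsb = depth16 & 0xFF
    return msb, lsb

def merge_depth(msb, lsb):
    """Lossless reconstruction from the two 8-bit layers."""
    return (msb << 8) | lsb

depth = 0x2F7A  # a hypothetical 16-bit depth sample
msb, lsb = split_depth(depth)
print(msb, lsb)                        # 47 122
print(merge_depth(msb, lsb) == depth)  # True
```

Each layer fits the 8-bit sample format that standard image/video encoders expect, which is what makes the framework low-complexity.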
Heijtel, D F R; Mutsaerts, H J M M; Bakker, E; Schober, P; Stevens, M F; Petersen, E T; van Berckel, B N M; Majoie, C B L M; Booij, J; van Osch, M J P; Vanbavel, E; Boellaard, R; Lammertsma, A A; Nederveen, A J
2014-05-15
Measurements of the cerebral blood flow (CBF) and cerebrovascular reactivity (CVR) provide useful information about cerebrovascular condition and regional metabolism. Pseudo-continuous arterial spin labeling (pCASL) is a promising non-invasive MRI technique to quantitatively measure the CBF, whereas additional hypercapnic pCASL measurements are currently showing great promise to quantitatively assess the CVR. However, the introduction of pCASL at a larger scale awaits further evaluation of the exact accuracy and precision compared to the gold standard. ¹⁵O-H₂O positron emission tomography (PET) is currently regarded as the most accurate and precise method to quantitatively measure both CBF and CVR, though it is one of the more invasive methods as well. In this study we therefore assessed the accuracy and precision of quantitative pCASL-based CBF and CVR measurements by performing a head-to-head comparison with ¹⁵O-H₂O PET, based on quantitative CBF measurements during baseline and hypercapnia. We demonstrate that pCASL CBF imaging is accurate during both baseline and hypercapnia with respect to ¹⁵O-H₂O PET with a comparable precision. These results pave the way for quantitative usage of pCASL MRI in both clinical and research settings. Copyright © 2014 Elsevier Inc. All rights reserved.
Toward malaysian sustainable agriculture in 21st century
NASA Astrophysics Data System (ADS)
Khorramnia, K.; Shariff, A. R. M.; Rahim, A. Abdul; Mansor, S.
2014-02-01
Sustainable agriculture should be able to meet various social goals and objectives so that it can be maintained for an indefinite period without significant negative impacts on the environment and natural resources. A wide variety of agricultural activities are carried out in Malaysia. Maintaining high quality of agricultural products with lower environmental impacts, together with economic viability and the life satisfaction of farmers and the community, are important factors in achieving sustainable agriculture. Human resources play a key role in directing the community toward sustainable development; since 2000, the trend of improvement in Malaysia's human development index has been higher than the averages for East Asia and the Pacific, for high human development countries, and for the world. Precision agriculture provides strong tools for achieving sustainable agriculture. Different types of sensors, positioning and navigation systems, GIS, software, and variable rate technology are well-known components of precision agriculture. Drones and robots are promising tools that enable farmers and managers to collect information or perform particular actions in remote areas or tough conditions. According to a survey, forestry and timber, rubber production, and oil palm estates are the three main agricultural divisions in which precision agriculture may improve productivity with respect to area of cropland per worker. The main factors affecting the adoption of precision agriculture in Malaysia are: a) political and legal support, b) decision support systems and user interfaces, c) experienced research teams, d) national education policy, and e) success in commercializing precision agriculture systems.
Gimpel, Charlotte; Kain, Renate; Laurinavicius, Arvydas; Bueno, Gloria; Zeng, Caihong; Liu, Zhihong; Schaefer, Franz; Kretzler, Matthias; Holzman, Lawrence B.; Hewitt, Stephen M.
2017-01-01
The introduction of digital pathology to nephrology provides a platform for the development of new methodologies and protocols for visual, morphometric and computer-aided assessment of renal biopsies. Application of digital imaging to pathology made substantial progress over the past decade; it is now in use for education, clinical trials and translational research. Digital pathology evolved as a valuable tool to generate comprehensive structural information in digital form, a key prerequisite for achieving precision pathology for computational biology. The application of this new technology on an international scale is driving novel methods for collaborations, providing unique opportunities but also challenges. Standardization of methods needs to be rigorously evaluated and applied at each step, from specimen processing to scanning, uploading into digital repositories, morphologic, morphometric and computer-aided assessment, data collection and analysis. In this review, we discuss the status and opportunities created by the application of digital imaging to precision nephropathology, and present a vision for the near future. PMID:28584625
Computational Modeling for Language Acquisition: A Tutorial With Syntactic Islands.
Pearl, Lisa S; Sprouse, Jon
2015-06-01
Given the growing prominence of computational modeling in the acquisition research community, we present a tutorial on how to use computational modeling to investigate learning strategies that underlie the acquisition process. This is useful for understanding both typical and atypical linguistic development. We provide a general overview of why modeling can be a particularly informative tool and some general considerations when creating a computational acquisition model. We then review a concrete example of a computational acquisition model for complex structural knowledge referred to as syntactic islands. This includes an overview of syntactic islands knowledge, a precise definition of the acquisition task being modeled, the modeling results, and how to meaningfully interpret those results in a way that is relevant for questions about knowledge representation and the learning process. Computational modeling is a powerful tool that can be used to understand linguistic development. The general approach presented here can be used to investigate any acquisition task and any learning strategy, provided both are precisely defined.
Zhang, Peng; Lee, Seungah; Yu, Hyunung; ...
2015-06-15
Super-resolution imaging of fluorescence-free plasmonic nanoparticles (NPs) was achieved using enhanced dark-field (EDF) illumination based on wavelength-modulation. Indistinguishable adjacent EDF images of 103-nm gold nanoparticles (GNPs), 40-nm gold nanorods (GNRs), and 80-nm silver nanoparticles (SNPs) were modulated at their wavelengths of specific localized surface plasmon scattering. The coordinates (x, y) of each NP were resolved by fitting their point spread functions with a two-dimensional Gaussian. The measured localization precisions of GNPs, GNRs, and SNPs were 2.5 nm, 5.0 nm, and 2.9 nm, respectively. From the resolved coordinates of NPs and the corresponding localization precisions, super-resolution images were reconstructed. Depending on the spontaneous polarization of GNR scattering, the orientation angle of GNRs in two dimensions was resolved and provided more elaborate localization information. This novel fluorescence-free super-resolution method was applied to live HeLa cells to resolve NPs and provided remarkable subdiffraction-limit images.
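The localization step above fits each point spread function with a two-dimensional Gaussian. As a simplified stand-in for that least-squares fit, an intensity-weighted centroid recovers the center exactly for a symmetric, noise-free spot; the grid size and width below are hypothetical:

```python
import math

def gaussian_spot(cx, cy, sigma, size):
    """Synthesize a noise-free 2-D Gaussian spot, mimicking the
    point spread function of a single scattering nanoparticle."""
    return [[math.exp(-((x - cx) ** 2 + (y - cy) ** 2) / (2 * sigma ** 2))
             for x in range(size)] for y in range(size)]

def centroid(image):
    """Intensity-weighted centroid: a simple substitute for the full
    2-D Gaussian least-squares fit used in the study. It returns
    sub-pixel (x, y) coordinates."""
    total = sx = sy = 0.0
    for y, row in enumerate(image):
        for x, v in enumerate(row):
            total += v
            sx += v * x
            sy += v * y
    return sx / total, sy / total

img = gaussian_spot(cx=7.25, cy=6.50, sigma=1.5, size=15)
x, y = centroid(img)
print(round(x, 2), round(y, 2))  # recovers the sub-pixel center (7.25, 6.5)
```

The sub-pixel result is the point: localization precision can be far finer than the pixel grid, which is what enables subdiffraction-limit reconstruction.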
Barisoni, Laura; Gimpel, Charlotte; Kain, Renate; Laurinavicius, Arvydas; Bueno, Gloria; Zeng, Caihong; Liu, Zhihong; Schaefer, Franz; Kretzler, Matthias; Holzman, Lawrence B; Hewitt, Stephen M
2017-04-01
The introduction of digital pathology to nephrology provides a platform for the development of new methodologies and protocols for visual, morphometric and computer-aided assessment of renal biopsies. Application of digital imaging to pathology made substantial progress over the past decade; it is now in use for education, clinical trials and translational research. Digital pathology evolved as a valuable tool to generate comprehensive structural information in digital form, a key prerequisite for achieving precision pathology for computational biology. The application of this new technology on an international scale is driving novel methods for collaborations, providing unique opportunities but also challenges. Standardization of methods needs to be rigorously evaluated and applied at each step, from specimen processing to scanning, uploading into digital repositories, morphologic, morphometric and computer-aided assessment, data collection and analysis. In this review, we discuss the status and opportunities created by the application of digital imaging to precision nephropathology, and present a vision for the near future.
Optimization of the MINERVA Exoplanet Search Strategy via Simulations
NASA Astrophysics Data System (ADS)
Nava, Chantell; Johnson, Samson; McCrady, Nate; Minerva
2015-01-01
Detection of low-mass exoplanets requires high spectroscopic precision and high observational cadence. MINERVA is a dedicated observatory capable of sub meter-per-second radial velocity precision. As a dedicated observatory, MINERVA can observe with every-clear-night cadence that is essential for low-mass exoplanet detection. However, this cadence complicates the determination of an optimal observing strategy. We simulate MINERVA observations to optimize our observing strategy and maximize exoplanet detections. A dispatch scheduling algorithm provides observations of MINERVA targets every day over a three-year observing campaign. An exoplanet population with a distribution informed by Kepler statistics is assigned to the targets, and radial velocity curves induced by the planets are constructed. We apply a correlated noise model that realistically simulates stellar astrophysical noise sources. The simulated radial velocity data is fed to the MINERVA planet detection code and the expected exoplanet yield is calculated. The full simulation provides a tool to test different strategies for scheduling observations of our targets and optimizing the MINERVA exoplanet search strategy.
Direct-Y: Fast Acquisition of the GPS PPS Signal
NASA Technical Reports Server (NTRS)
Namoos, Omar M.; DiEsposti, Raymond S.
1996-01-01
The NAVSTAR Global Positioning System (GPS) provides positioning and time information to military users via the Precise Positioning Service (PPS), which typically allows users a significant margin of precision over the commercially available Standard Positioning Service (SPS). Military sets that rely on first acquiring the SPS Coarse Acquisition (C/A) code read from the data message the handover word (HOW), which provides the time of signal transmission needed to acquire and lock onto the PPS Y-code. Under extreme battlefield conditions, the use of GPS would be denied to a warfighter who cannot pick up the un-encrypted C/A code. Studies are underway at the GPS Joint Program Office (JPO) at the Space and Missile Center, Los Angeles Air Force Base, aimed at developing the capability to directly acquire the Y-code without first acquiring the C/A code. This paper briefly outlines efforts to develop 'direct-Y' acquisition and various approaches to solving this problem. The potential ramifications of direct-Y for military users are also discussed.
Probing Black Holes With Gravitational Radiation
NASA Astrophysics Data System (ADS)
Cornish, Neil J.
2006-09-01
Gravitational radiation can provide unique insights into the dynamics and evolution of black holes. Gravitational waveforms encode detailed information about the spacetime geometry, much as the sounds made by a musical instrument reflect the geometry of the instrument. The LISA gravitational wave observatory will be able to record black holes colliding out to the edge of the visible Universe, with an expected event rate of tens to thousands per year. LISA has unmatched capabilities for studying the role of black holes in galactic evolution, in particular, by studying the mergers of seed black holes at very high redshift, z > 5. Merger events at lower redshift will be detected at extremely high signal-to-noise, allowing for precision tests of the black hole paradigm. Below z=1 LISA will be able to record stellar remnants falling into supermassive black holes. These extreme mass ratio inspiral events will yield insights into the dynamics of galactic cusps, and the brighter events will provide incredibly precise tests of strong field, dynamical gravity.
Method to make accurate concentration and isotopic measurements for small gas samples
NASA Astrophysics Data System (ADS)
Palmer, M. R.; Wahl, E.; Cunningham, K. L.
2013-12-01
Carbon isotopic ratio measurements of CO2 and CH4 provide valuable insight into carbon cycle processes. However, many of these studies, such as soil gas, soil flux, and water headspace experiments, yield very small gas sample volumes, too small for direct measurement by current constant-flow Cavity Ring-Down Spectroscopy (CRDS) isotopic analyzers. Previously, we addressed this issue by developing a sample introduction module that enabled isotopic ratio measurement of samples of 40 ml or smaller. However, the system, called the Small Sample Isotope Module (SSIM), does dilute the sample during delivery with inert carrier gas, causing a ~5% reduction in concentration. The isotopic ratio measurements are not affected by this small dilution, but researchers are naturally interested in accurate concentration measurements. We present the accuracy and precision of a new method of using this delivery module, which we call 'double injection': two portions of the 40 ml sample (20 ml each) are introduced to the analyzer; the first injection flushes out the diluting gas and the second is measured. The accuracy of this new method is demonstrated by comparing the concentration and isotopic ratio measurements for a gas sampled directly with those of the same gas measured through the SSIM. The data show that the CO2 concentration measurements were the same within instrument precision. The isotopic ratio precision (1σ) of repeated measurements was 0.16 permil for CO2 and 1.15 permil for CH4 at ambient concentrations. This new method provides a significant enhancement in the information provided by small samples.
State Space Model with hidden variables for reconstruction of gene regulatory networks.
Wu, Xi; Li, Peng; Wang, Nan; Gong, Ping; Perkins, Edward J; Deng, Youping; Zhang, Chaoyang
2011-01-01
State Space Model (SSM) is a relatively new approach to inferring gene regulatory networks. It requires less computational time than Dynamic Bayesian Networks (DBN). There are two types of variables in the linear SSM: observed variables and hidden variables. SSM uses an iterative method, namely Expectation-Maximization, to infer regulatory relationships from microarray datasets. The hidden variables cannot be directly observed from experiments, and how their number is determined has a significant impact on the accuracy of network inference. In this study, we used SSM to infer gene regulatory networks (GRNs) from synthetic time series datasets, investigated Bayesian Information Criterion (BIC) and Principal Component Analysis (PCA) approaches to determining the number of hidden variables in SSM, and evaluated the performance of SSM in comparison with DBN. True GRNs and synthetic gene expression datasets were generated using GeneNetWeaver. Both DBN and linear SSM were used to infer GRNs from the synthetic datasets, and the inferred networks were compared with the true networks. Our results show that inference precision varied with the number of hidden variables. For some regulatory networks the inference precision of DBN was higher, but SSM performed better in other cases. Although the overall performance of the two approaches is comparable, SSM is much faster and capable of inferring much larger networks than DBN. This study provides useful information on handling the hidden variables and improving inference precision.
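The PCA route to choosing the number of hidden variables can be sketched as follows: take the number of principal components needed to explain most of the variance in the expression matrix as the hidden-variable count. This is a minimal illustration, not the study's actual pipeline; the 95% variance threshold and the synthetic three-factor dataset are assumptions for the example.

```python
import numpy as np

def n_hidden_by_pca(expr, var_threshold=0.95):
    """Estimate the number of hidden variables as the number of principal
    components needed to explain `var_threshold` of the variance in a
    (time points x genes) expression matrix."""
    centered = expr - expr.mean(axis=0)
    # Squared singular values of the centered data are the component variances.
    s = np.linalg.svd(centered, compute_uv=False)
    var_share = s**2 / np.sum(s**2)
    return int(np.searchsorted(np.cumsum(var_share), var_threshold) + 1)

# Synthetic example: 50 time points, 20 genes driven by 3 latent factors.
rng = np.random.default_rng(0)
latent = rng.normal(size=(50, 3))
loadings = rng.normal(size=(3, 20))
expr = latent @ loadings + 0.05 * rng.normal(size=(50, 20))
n_hidden = n_hidden_by_pca(expr)
```

Because the noise level is small relative to the three latent factors, the cumulative variance crosses the threshold at the third component, recovering the true latent dimensionality.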
Albano, Maria Grazia; Jourdain, Patrick; De Andrade, Vincent; Domenke, Aukse; Desnos, Michel; d'Ivernois, Jean-François
2014-05-01
Therapeutic patient education programmes on heart failure have been widely proposed for many years for heart failure patients, but their efficiency remains questionable, partly because most articles lack a precise programme description, which makes comparative analysis of the studies difficult. To analyse the degree of precision in describing therapeutic patient education programmes in recent randomized controlled trials. Three major recent recommendations on therapeutic patient education in heart failure inspired us to compile a list of 23 relevant items that an 'ideal' description of a therapeutic patient education programme should contain. To discover the extent to which recent studies into therapeutic patient education in heart failure included these items, we analysed 19 randomized controlled trials among 448 articles published in this field from 2005 to 2012. The major elements required to describe a therapeutic patient education programme were present, but some other very important pieces of information were missing in most of the studies we analysed: the patient's educational needs, health literacy, projects, expectations regarding therapeutic patient education and psychosocial status; the educational methodology used; outcomes evaluation; and follow-up strategies. Research into how therapeutic patient education can help heart failure patients will be improved if more precise descriptions of patients, educational methodology and evaluation protocols are given by authors, ideally in a standardized format. Copyright © 2014 Elsevier Masson SAS. All rights reserved.
Kohlmeier, Martin; De Caterina, Raffaele; Ferguson, Lynnette R; Görman, Ulf; Allayee, Hooman; Prasad, Chandan; Kang, Jing X; Nicoletti, Carolina Ferreira; Martinez, J Alfredo
2016-01-01
Nutrigenetics considers the influence of individual genetic variation on differences in response to dietary components, nutrient requirements and predisposition to disease. Nutrigenomics involves the study of interactions between the genome and diet, including how nutrients affect the transcription and translation process plus subsequent proteomic and metabolomic changes, and also differences in response to dietary factors based on the individual genetic makeup. Personalized characteristics such as age, gender, physical activity, physiological state and social status, and special conditions such as pregnancy and risk of disease can inform dietary advice that more closely meets individual needs. Precision nutrition has a promising future in treating the individual according to their phenotype and genetic characteristics, aimed at both the treatment and prevention of disease. However, many aspects are still in progress and remain as challenges for the future of nutrition. The integration of the human genotype and microbiome needs to be better understood. Further advances in data interpretation tools are also necessary, so that information obtained through newer tests and technologies can be properly transferred to consumers. Indeed, precision nutrition will integrate genetic data with phenotypical, social, cultural and personal preferences and lifestyles matters to provide a more individual nutrition, but considering public health perspectives, where ethical, legal and policy aspects need to be defined and implemented. © 2016 S. Karger AG, Basel.
Precision medicine in chronic disease management: the MS BioScreen
Gourraud, Pierre-Antoine; Henry, Roland; Cree, Bruce AC; Crane, Jason C; Lizee, Antoine; Olson, Marram P; Santaniello, Adam V.; Datta, Esha; Zhu, Alyssa H.; Bevan, Carolyn J.; Gelfand, Jeffrey M.; Graves, Jennifer A.; Goodin, Douglas E.; Green, Ari; von Büdingen, H.-Christian; Waubant, Emmanuelle; Zamvil, Scott S.; Crabtree-Hartman, Elizabeth; Nelson, Sarah; Baranzini, Sergio E.; Hauser, Stephen L.
2014-01-01
We present a precision medicine application developed for multiple sclerosis (MS): the MS BioScreen. This new tool addresses the challenges of dynamic management of a complex chronic disease; the interaction of clinicians and patients with such a tool illustrates the extent to which translational digital medicine (i.e. the application of information technology to medicine) has the potential to radically transform medical practice. We introduce three key evolutionary phases in displaying data to health care providers, patients, and researchers: visualization (accessing data), contextualization (understanding the data), and actionable interpretation (real-time use of the data to assist decision-making). Together these form the stepping-stones that are expected to accelerate standardization of data across platforms, promote evidence-based medicine, support shared decision-making, and ultimately lead to improved outcomes. PMID:25263997
Precision Departure Release Capability (PDRC) Overview and Results: NASA to FAA Research Transition
NASA Technical Reports Server (NTRS)
Engelland, Shawn; Davis, Tom.
2013-01-01
NASA researchers developed the Precision Departure Release Capability (PDRC) concept to improve the tactical departure scheduling process. The PDRC system comprises: 1) a surface automation system that computes ready-time predictions and departure runway assignments; 2) an en route scheduling automation tool that uses this information to estimate ascent trajectories to the merge point and compute release times; and 3) an interface that provides two-way communication between the two systems. To minimize technology transfer issues and facilitate adoption by TMCs and Frontline Managers (FLMs), NASA developed the PDRC prototype using the Surface Decision Support System (SDSS) as the tower surface automation tool, a research version of the FAA TMA (RTMA) as the en route automation tool, and a digital interface between the two DSTs to facilitate coordination.
A topological multilayer model of the human body.
Barbeito, Antonio; Painho, Marco; Cabral, Pedro; O'Neill, João
2015-11-04
Geographical information systems deal with spatial databases in which topological models are described with alphanumeric information. Their graphical interfaces implement the multilayer concept and provide powerful interaction tools. In this study, we apply these concepts to the human body, creating a representation that allows interactive, precise, and detailed anatomical study. A vector surface component of the human body is built using a three-dimensional (3-D) reconstruction methodology. The multilayer concept is implemented by associating raster components with the corresponding vector surfaces, which include neighbourhood topology enabling spatial analysis. A root mean square error of 0.18 mm validated the three-dimensional reconstruction technique for internal anatomical structures. The expanded identification function and the new neighbourhood analysis function are the tools provided by this model.
NASA Astrophysics Data System (ADS)
Nair, S. P.; Righetti, R.
2015-05-01
Recent elastography techniques focus on imaging the properties of materials that can be modeled as viscoelastic or poroelastic. These techniques often require fitting temporal strain data, acquired from either a creep or a stress-relaxation experiment, to a mathematical model using least square error (LSE) parameter estimation. It is known that the strain versus time relationships for tissues undergoing creep compression are non-linear. In non-linear cases, devising a measure of estimate reliability can be challenging. In this article, we have developed and tested a method to provide non-linear LSE parameter estimate reliability, which we call Resimulation of Noise (RoN). RoN provides a measure of reliability by estimating the spread of parameter estimates from a single experiment realization. We have tested RoN specifically for the case of axial strain time constant parameter estimation in poroelastic media. Our tests show that the RoN-estimated precision has a linear relationship to the actual precision of the LSE estimator. We have also compared results from the RoN-derived measure of reliability against a commonly used reliability measure: the correlation coefficient (CorrCoeff). Our results show that CorrCoeff is a poor measure of estimate reliability for non-linear LSE parameter estimation. While RoN is specifically tested only for axial strain time constant imaging, a general algorithm is provided for use in all LSE parameter estimation.
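The RoN idea, as described, can be sketched in a few lines: fit the model once, estimate the noise level from the residuals, resimulate many noisy copies of the fitted curve, refit each, and report the spread of the refitted parameter. The rising-exponential creep model, its parameter values, and the use of SciPy's `curve_fit` below are illustrative assumptions, not the authors' exact formulation.

```python
import numpy as np
from scipy.optimize import curve_fit

def creep_model(t, eps0, tau):
    """Axial strain during creep, modeled as a rising exponential (assumed form)."""
    return eps0 * (1.0 - np.exp(-t / tau))

def ron_reliability(t, strain, p0, n_resim=200, seed=0):
    """Resimulation of Noise: estimate the spread of the time-constant
    estimate from a single experiment realization."""
    popt, _ = curve_fit(creep_model, t, strain, p0=p0)
    noise_sd = np.std(strain - creep_model(t, *popt))  # residual noise level
    rng = np.random.default_rng(seed)
    taus = []
    for _ in range(n_resim):
        # Resimulate a noisy copy of the fitted curve and refit it.
        resim = creep_model(t, *popt) + rng.normal(0.0, noise_sd, t.size)
        p_hat, _ = curve_fit(creep_model, t, resim, p0=popt)
        taus.append(p_hat[1])
    return popt[1], np.std(taus)  # point estimate and its RoN spread

# Synthetic creep experiment with known tau = 2.5 (hypothetical values).
t = np.linspace(0.1, 10, 100)
rng = np.random.default_rng(1)
data = creep_model(t, 0.02, 2.5) + rng.normal(0.0, 0.001, t.size)
tau_hat, tau_spread = ron_reliability(t, data, p0=(0.01, 1.0))
```

The returned spread plays the role of the reliability measure: a tight spread of refitted time constants indicates a trustworthy single-realization estimate.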
2016-01-01
Recent advances in biosensors, medical instrumentation, and information processing and communication technologies (ICT) have enabled significant improvements in healthcare. However, these technologies have been mainly applied in clinical environments, such as hospitals and healthcare facilities, under managed care by well-trained and specialized individuals. The global challenge of providing quality healthcare at affordable cost leads to the proposed paradigm of Preventive, Personalized, and Precision Medicine, which requires a seamless use of technology and infrastructure support for patients and healthcare providers at point-of-care (POC) locations including homes, semi- or pre-clinical facilities, and hospitals. The complexity of the global healthcare challenge necessitates strong collaborative interdisciplinary synergies involving all stakeholder groups, including academia, federal research institutions, industry, regulatory agencies, and clinical communities. It is critical to evolve with collaborative efforts on the translation of research to technology development toward clinical validation and potential healthcare applications. This special issue is focused on technology innovation and translational research for POC applications with potential impact in improving global healthcare in the respective areas. Some of these papers were presented at the NIH-IEEE Strategic Conference on Healthcare Innovations and POC Technologies for Precision Medicine (HI-POCT) held at the NIH on November 9–10, 2015. The papers included in the Special Issue provide a spectrum of critical issues and collaborative resources on translational research of advanced POC devices and ICT into the global healthcare environment. PMID:28560119
Fiore, Stephen M.; Wiltshire, Travis J.
2016-01-01
In this paper we advance team theory by describing how cognition occurs across the distribution of members and the artifacts and technology that support their efforts. We draw from complementary theorizing coming out of cognitive engineering and cognitive science that views forms of cognition as external and extended and integrate this with theorizing on macrocognition in teams. Two frameworks are described that provide the groundwork for advancing theory and aid in the development of more precise measures for understanding team cognition via focus on artifacts and the technologies supporting their development and use. This includes distinctions between teamwork and taskwork and the notion of general and specific competencies from the organizational sciences along with the concepts of offloading and scaffolding from the cognitive sciences. This paper contributes to the team cognition literature along multiple lines. First, it aids theory development by synthesizing a broad set of perspectives on the varied forms of cognition emerging in complex collaborative contexts. Second, it supports research by providing diagnostic guidelines to study how artifacts are related to team cognition. Finally, it supports information systems designers by more precisely describing how to conceptualize team-supporting technology and artifacts. As such, it provides a means to more richly understand process and performance as it occurs within sociotechnical systems. Our overarching objective is to show how team cognition can both be more clearly conceptualized and more precisely measured by integrating theory from cognitive engineering and the cognitive and organizational sciences. PMID:27774074
Relative receiver autonomous integrity monitoring for future GNSS-based aircraft navigation
NASA Astrophysics Data System (ADS)
Gratton, Livio Rafael
The Global Positioning System (GPS) has enabled reliable, safe, and practical aircraft positioning for en-route and non-precision phases of flight for more than a decade. Intense research is currently devoted to extending the use of Global Navigation Satellite Systems (GNSS), including GPS, to precision approach and landing operations. In this context, this work is focused on the development, analysis, and verification of the concept of Relative Receiver Autonomous Integrity Monitoring (RRAIM) and its potential applications to precision approach navigation. RRAIM fault detection algorithms are developed, and associated mathematical bounds on position error are derived. These are investigated as possible solutions to some current key challenges in precision approach navigation, discussed below. Augmentation systems serving continent-size areas (like the Wide Area Augmentation System, or WAAS) allow certain precision approach operations within the covered region. More and better satellites, with dual-frequency capabilities, are expected to be in orbit in the mid-term future, which will potentially allow WAAS-like capabilities worldwide with a sparse ground station network. Two main challenges in achieving this goal are (1) ensuring that navigation fault detection functions are fast enough to alert worldwide users of hazardously misleading information, and (2) minimizing situations in which navigation is unavailable because the user's local satellite geometry is insufficient for safe position estimation. Local augmentation systems (implemented at individual airports, like the Local Area Augmentation System, or LAAS) have the potential to allow precision approach and landing operations by providing precise corrections to user-satellite range measurements. An exception to these capabilities arises during ionospheric storms (caused by solar activity), when hazardous situations can exist with residual range errors several orders of magnitude higher than nominal.
Until dual-frequency civil GPS signals are available, providing integrity during ionospheric storms without excessive loss of availability is a major challenge. For all users, with or without augmentation, some situations cause short-duration losses of satellites in view. Two examples are aircraft banking during turns and ionospheric scintillation. The loss of range signals can translate into gaps in good satellite geometry, and the resulting challenge is to ensure navigation continuity by bridging these gaps while simultaneously maintaining high integrity. It is shown that the RRAIM methods developed in this research can be applied to mitigate each of these obstacles to safe and reliable precision aircraft navigation.
An efficient multilevel optimization method for engineering design
NASA Technical Reports Server (NTRS)
Vanderplaats, G. N.; Yang, Y. J.; Kim, D. S.
1988-01-01
An efficient multilevel design optimization technique is presented. The proposed method is based on the concept of providing linearized information between the system-level and subsystem-level optimization tasks. The advantages of the method are that it does not require optimum sensitivities, nonlinear equality constraints are not needed, and the method is relatively easy to use. The disadvantage is that the coupling between subsystems is not dealt with in a precise mathematical manner.
Neuroscience-Enabled Complex Visual Scene Understanding
2012-04-12
some cases, it is hard to precisely say where or what we are looking at since a complex task governs eye fixations, for example in driving. While in...another objects ( say a door) can be resolved using the prior information about the scene. This knowledge can be provided from gist models, such as one...separation and combination of class-dependent features for handwriting recognition,” IEEE Trans. Pattern Anal. Mach. Intell., vol. 21, no. 10, pp. 1089
1991-06-05
information would provide more precise control of the vehicle. To this extent, research has been ongoing at the Biological Acoustics Section of AAMRL... researching questions of neurobiology, particularly neurochemistry and neuroanatomy. Furthermore, I am strongly interested in the effects of ionizing and non ...administered to the animal intraperitoneally. Control animals received an injection of saline in an equivalent volume. When the colonic temperature returned to
Combining path integration and remembered landmarks when navigating without vision.
Kalia, Amy A; Schrater, Paul R; Legge, Gordon E
2013-01-01
This study investigated the interaction between remembered landmark and path integration strategies for estimating current location when walking in an environment without vision. We asked whether observers navigating without vision only rely on path integration information to judge their location, or whether remembered landmarks also influence judgments. Participants estimated their location in a hallway after viewing a target (remembered landmark cue) and then walking blindfolded to the same or a conflicting location (path integration cue). We found that participants averaged remembered landmark and path integration information when they judged that both sources provided congruent information about location, which resulted in more precise estimates compared to estimates made with only path integration. In conclusion, humans integrate remembered landmarks and path integration in a gated fashion, dependent on the congruency of the information. Humans can flexibly combine information about remembered landmarks with path integration cues while navigating without visual information.
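The averaging of congruent cues described above matches the standard reliability-weighted (inverse-variance) cue-combination rule, under which the combined estimate is more precise than either cue alone. A minimal sketch follows; the numeric cue means and variances are hypothetical, not data from the study.

```python
def combine_cues(mu_landmark, var_landmark, mu_path, var_path):
    """Inverse-variance weighted average of two location cues.
    The more reliable (lower-variance) cue gets more weight, and the
    combined variance is smaller than either cue's variance."""
    w = (1.0 / var_landmark) / (1.0 / var_landmark + 1.0 / var_path)
    mu = w * mu_landmark + (1.0 - w) * mu_path
    var = 1.0 / (1.0 / var_landmark + 1.0 / var_path)
    return mu, var

# Hypothetical example: remembered landmark says 4.0 m (var 0.25),
# path integration says 5.0 m (var 1.0).
mu, var = combine_cues(4.0, 0.25, 5.0, 1.0)
```

Here the landmark cue is four times more reliable, so the combined estimate (4.2 m) lands closer to it, and the combined variance (0.2) is below both input variances, mirroring the precision gain reported when both cues are judged congruent.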
Information and informatics literacy: skills, timing, and estimates of competence.
Scott, C S; Schaad, D C; Mandel, L S; Brock, D M; Kim, S
2000-01-01
Computing and biomedical informatics technologies are providing almost instantaneous access to vast amounts of possibly relevant information. Although students are entering medical school with increasingly sophisticated basic technological skills, medical educators must determine what curricular enhancements are needed to prepare learners for the world of electronic information. The purpose was to examine opinions of academic affairs and informatics administrators, curriculum deans and recently matriculated medical students about prematriculation competence and medical education learning expectations. Two surveys were administered: an Information Literacy Survey for curriculum/informatics deans and a Computing Skills Survey for entering medical students. Results highlight differences of opinion about entering competencies. They also indicate that medical school administrators believe that most basic information skills fall within the domain of undergraduate medical education. Further investigations are needed to determine precise entry-level skills and whether information literacy will increase as a result of rising levels of technical competence.
The progress on time & frequency during the past 5 decades
NASA Astrophysics Data System (ADS)
Wang, Zheng-Ming
2002-06-01
The number and variety of applications using precise timing are astounding and increasing along with new technology in communication, computer science, space science, and other fields. The world has evolved into the information age, and precise timing is at the heart of managing the flow of that information, which in turn drives rapid progress in precise timing itself. This paper describes the development of time scales, UT1 determination, frequency standards, time transfer, and time dissemination over the past half century, worldwide and in China. Expectations for this field are also discussed.
Reliability and precision of stress sonography of the ulnar collateral ligament.
Bica, David; Armen, Joseph; Kulas, Anthony S; Youngs, Kevin; Womack, Zachary
2015-03-01
Musculoskeletal sonography has emerged as an additional diagnostic tool for assessing medial elbow pain and laxity in overhead throwers. It provides a dynamic, rapid, and noninvasive modality for evaluating ligamentous structural integrity. Many studies have demonstrated the utility of dynamic sonography for assessing medial elbow and ulnar collateral ligament (UCL) integrity. However, evaluating the reliability and precision of these measurements is critical if sonography is ultimately to be used as a clinical diagnostic tool. The purpose of this study was to evaluate the reliability and precision of stress sonography applied to the medial elbow. We conducted a cross-sectional study during the 2011 baseball off-season. Eighteen National Collegiate Athletic Association Division I pitchers were enrolled, and 36 elbows were studied. Using sonography, the medial elbow was assessed, and measurements of UCL length and ulnohumeral joint gapping were performed twice under two conditions (unloaded and loaded) and bilaterally. Intraclass correlation coefficients (0.72-0.94) and standard errors of measurement (0.3-0.9 mm) for UCL length and ulnohumeral joint gapping were good to excellent. Mean differences between unloaded and loaded conditions for the dominant arms were 1.3 mm (gapping; P < .001) and 1.4 mm (UCL length; P < .001). Medial elbow stress sonography is a reliable and precise method for detecting changes in ulnohumeral joint gapping and UCL lengthening. Ultimately, this method may provide clinicians valuable information regarding the medial elbow's response to valgus loading and may help guide treatment options. © 2015 by the American Institute of Ultrasound in Medicine.
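The two reliability statistics reported above, the intraclass correlation coefficient and the standard error of measurement, can be computed from a subjects-by-repetitions matrix. The sketch below uses the consistency form ICC(3,1) from a two-way ANOVA decomposition and SEM = SD * sqrt(1 - ICC); the six-elbow data matrix is hypothetical, for illustration only.

```python
import numpy as np

def icc_3_1(X):
    """Single-measure consistency ICC(3,1) from a two-way ANOVA
    decomposition of an (n subjects x k repeated measurements) matrix."""
    n, k = X.shape
    grand = X.mean()
    ss_total = np.sum((X - grand) ** 2)
    ss_rows = k * np.sum((X.mean(axis=1) - grand) ** 2)  # between subjects
    ss_cols = n * np.sum((X.mean(axis=0) - grand) ** 2)  # between sessions
    ms_rows = ss_rows / (n - 1)
    ms_err = (ss_total - ss_rows - ss_cols) / ((n - 1) * (k - 1))
    return (ms_rows - ms_err) / (ms_rows + (k - 1) * ms_err)

# Hypothetical repeated UCL-length measurements (mm) for six elbows.
X = np.array([[5.1, 5.2], [6.0, 5.9], [4.8, 4.9],
              [5.5, 5.6], [6.2, 6.1], [4.9, 5.0]])
icc = icc_3_1(X)
sem = X.std(ddof=1) * np.sqrt(1.0 - icc)  # standard error of measurement
```

With repeat measurements this close, the ICC is high and the SEM is a small fraction of the between-subject spread, the pattern the study's 0.72-0.94 ICCs and 0.3-0.9 mm SEMs reflect.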
Pretorius, Etheresia; Bester, Janette
2016-08-09
Type 2 diabetes (T2D) patients have a considerably higher cardiovascular risk, which is closely associated with systemic inflammation and an accompanying pathologic coagulation system. Due to the complexity of the diabetic profile, we suggest that each patient needs to be looked at individually, and particularly at his or her clotting profile, as the healthiness of the coagulation system gives us an indication of the success of clinical intervention. Our sample consisted of 90 poorly controlled T2D patients and 71 healthy individuals. We investigated the medication use and HbA1c levels of the T2D patients and used thromboelastography (TEG) and scanning electron microscopy (SEM) to study their clot formation. T2D coagulability varied markedly, although there was no clear difference in medication use or HbA1c levels. The latest NIH guidelines suggest that clinical medicine should focus on precision medicine, and the current broad understanding is that precision medicine may in future provide personalized targets for preventative and therapeutic interventions. Here we suggest a practical example where TEG can be used as an easily accessible point-of-care tool to establish a comprehensive clotting profile analysis for T2D patients; additionally, it may provide valuable information for use in the envisaged precision medicine approach. Only by closely following each individual patient's progress and healthiness, and thereby managing systemic inflammation, will we be able to reduce this pandemic.
Precise FIA plot registration using field and dense LIDAR data
Demetrios Gatziolis
2009-01-01
Precise registration of forest inventory and analysis (FIA) plots is a prerequisite for an effective fusion of field data with ancillary spatial information, which is an approach commonly employed in the mapping of various forest parameters. Although the adoption of Global Positioning System technology has improved the precision of plot coordinates obtained during...
Opening plenary speaker: Human genomics, precision medicine, and advancing human health.
Green, Eric D
2016-08-01
Starting with the launch of the Human Genome Project in 1990, the past quarter-century has brought spectacular achievements in genomics that dramatically empower the study of human biology and disease. The human genomics enterprise is now in the midst of an important transition, as the growing foundation of genomic knowledge is being used by researchers and clinicians to tackle increasingly complex problems in biomedicine. Of particular prominence is the use of revolutionary new DNA sequencing technologies for generating prodigious amounts of DNA sequence data to elucidate the complexities of genome structure, function, and evolution, as well as to unravel the genomic bases of rare and common diseases. Together, these developments are ushering in the era of genomic medicine. Augmenting the advances in human genomics have been innovations in technologies for measuring environmental and lifestyle information, electronic health records, and data science; together, these provide opportunities of unprecedented scale and scope for investigating the underpinnings of health and disease. To capitalize on these opportunities, U.S. President Barack Obama recently announced a major new research endeavor - the U.S. Precision Medicine Initiative. This bold effort will be framed around several key aims, which include accelerating the use of genomically informed approaches to cancer care, making important policy and regulatory changes, and establishing a large research cohort of >1 million volunteers to facilitate precision medicine research. The latter will include making the partnership with all participants a centerpiece feature in the cohort's design and development. The Precision Medicine Initiative represents a broad-based research program that will allow new approaches for individualized medical care to be rigorously tested, so as to establish a new evidence base for advancing clinical practice and, eventually, human health.
Optimal structure of metaplasticity for adaptive learning
2017-01-01
Learning from reward feedback in a changing environment requires a high degree of adaptability, yet the precise estimation of reward information demands slow updates. In the framework of estimating reward probability, here we investigated how this tradeoff between adaptability and precision can be mitigated via metaplasticity, i.e., synaptic changes that do not always alter synaptic efficacy. Using mean-field analysis and Monte Carlo simulations, we identified ‘superior’ metaplastic models that can substantially overcome the adaptability-precision tradeoff. These models can achieve both adaptability and precision by forming two separate sets of meta-states: reservoirs and buffers. Synapses in reservoir meta-states do not change their efficacy upon reward feedback, whereas those in buffer meta-states can change their efficacy. Rapid changes in efficacy are limited to synapses occupying buffers, creating a bottleneck that reduces noise without significantly decreasing adaptability. In contrast, more-populated reservoirs can generate a strong signal without manifesting any observable plasticity. By comparing the behavior of our model and a few competing models during a dynamic probability estimation task, we found that superior metaplastic models perform close to optimally for a wider range of model parameters. Finally, we found that metaplastic models are robust to changes in model parameters and that metaplastic transitions are crucial for adaptive learning, since replacing them with graded plastic transitions (transitions that change synaptic efficacy) reduces the ability to overcome the adaptability-precision tradeoff. Overall, our results suggest that the ubiquitous unreliability of synaptic changes evinces metaplasticity, which can provide a robust mechanism for mitigating the tradeoff between adaptability and precision and thus enabling adaptive learning. PMID:28658247
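The adaptability-precision tradeoff itself is easy to demonstrate with a plain delta-rule estimator; the sketch below is a toy stand-in (not the authors' metaplastic synapse model), and the reward schedule and learning rates are arbitrary. A large learning rate tracks the switch quickly but fluctuates while the environment is stable; a small one is precise but slow to adapt.

```python
import random
import statistics

def track(p_schedule, lr, seed=0):
    """Delta-rule estimate of a drifting reward probability."""
    rng = random.Random(seed)
    est, trace = 0.5, []
    for p in p_schedule:
        reward = 1.0 if rng.random() < p else 0.0
        est += lr * (reward - est)  # incremental update toward each outcome
        trace.append(est)
    return trace

schedule = [0.8] * 500 + [0.2] * 500  # environment switches halfway through
fast = track(schedule, lr=0.3)        # adaptable but imprecise
slow = track(schedule, lr=0.01)       # precise but slow to adapt

# Precision: estimate variance while the environment is stable.
print("variance while stable: fast =",
      round(statistics.pvariance(fast[300:500]), 4),
      "slow =", round(statistics.pvariance(slow[300:500]), 4))
# Adaptability: mean estimate 20-60 trials after the switch (true rate 0.2).
print("after switch: fast =", round(statistics.mean(fast[520:560]), 2),
      "slow =", round(statistics.mean(slow[520:560]), 2))
```

A single learning rate must pick a point on this tradeoff; the abstract's metaplastic reservoir/buffer structure is one way to get both properties at once.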
Schlienger, M. Eric; Schmale, David T.; Oliver, Michael S.
2001-07-10
A new class of precision powder feeders is disclosed. These feeders provide a precision flow of a wide range of powdered materials, while remaining robust against jamming or damage. These feeders can be precisely controlled by feedback mechanisms.
Accommodating Uncertainty in Prior Distributions
DOE Office of Scientific and Technical Information (OSTI.GOV)
Picard, Richard Roy; Vander Wiel, Scott Alan
2017-01-19
A fundamental premise of Bayesian methodology is that a priori information is accurately summarized by a single, precisely defined prior distribution. In many cases, especially involving informative priors, this premise is false, and the (mis)application of Bayes methods produces posterior quantities whose apparent precisions are highly misleading. We examine the implications of uncertainty in prior distributions, and present graphical methods for dealing with them.
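The hazard described here can be seen in a few lines of conjugate Bayesian updating. The sketch below is our illustration, not the authors' analysis: it contrasts a vague Beta(1, 1) prior with a confidently misplaced informative Beta(30, 20) prior on the same invented data, showing that the informative prior yields a tighter (apparently more precise) posterior even though its estimate is pulled away from the data.

```python
def posterior_params(a, b, y, n):
    """Conjugate Beta-Binomial update: y successes in n trials."""
    return a + y, b + n - y

def beta_mean_sd(a, b):
    """Mean and standard deviation of a Beta(a, b) distribution."""
    mean = a / (a + b)
    var = a * b / ((a + b) ** 2 * (a + b + 1))
    return mean, var ** 0.5

y, n = 3, 20  # made-up data: 3 successes in 20 trials (rate near 0.15)

priors = {"vague Beta(1,1)": (1, 1), "informative Beta(30,20)": (30, 20)}
for label, (a, b) in priors.items():
    mean, sd = beta_mean_sd(*posterior_params(a, b, y, n))
    print(f"{label}: posterior mean {mean:.3f}, sd {sd:.3f}")
```

The informative prior's smaller posterior sd is exactly the "apparent precision" the abstract warns can mislead when the prior itself is uncertain.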
Energy-Efficient Wireless Sensor Networks for Precision Agriculture: A Review.
Jawad, Haider Mahmood; Nordin, Rosdiadee; Gharghan, Sadik Kamel; Jawad, Aqeel Mahmood; Ismail, Mahamod
2017-08-03
Wireless sensor networks (WSNs) can be used in agriculture to provide farmers with a large amount of information. Precision agriculture (PA) is a management strategy that employs information technology to improve quality and production. Utilizing wireless sensor technologies and management tools can lead to a highly effective, green agriculture. With PA management, applying the same routine to a crop regardless of site conditions can be avoided. Field management can improve PA from several perspectives, including the provision of adequate nutrients for crops and the reduction of pesticide wastage through the effective control of weeds, pests, and diseases. This review outlines the recent applications of WSNs in agriculture research; classifies and compares various wireless communication protocols; presents a taxonomy of energy-efficient and energy-harvesting techniques for WSNs that can be used in agricultural monitoring systems; and compares early research works on agriculture-based WSNs. The challenges and limitations of WSNs in the agricultural domain are explored, and several power reduction and agricultural management techniques for long-term monitoring are highlighted. These approaches may also increase the number of opportunities for processing Internet of Things (IoT) data.
NASA Astrophysics Data System (ADS)
Zhang, Xian; Zhou, Binquan; Li, Hong; Zhao, Xinghua; Mu, Weiwei; Wu, Wenfeng
2017-10-01
Navigation technology is crucial to national defense and the military; it enables the measurement of orientation, position, attitude, and speed for a moving object. Inertial navigation is autonomous, real-time, continuous, hidden, and undisturbed, and it is not limited by time or environment. The gyroscope is the core component of an inertial navigation system, and its precision and size are the bottleneck of overall performance. The nuclear magnetic resonance gyroscope, however, offers the advantages of high precision and small size, and can meet the urgent needs of a new generation of high-tech weapons and equipment. This paper presents a photoelectric signal processing system for a nuclear magnetic resonance gyroscope based on an FPGA, which processes and controls the detection-laser information. The photoelectric signal with a high-frequency carrier is demodulated by the in-phase and quadrature demodulation method. Finally, the photoelectric signal processing system can compensate for the residual magnetism of the shielding barrel and provide the angular velocity information of the nuclear magnetic resonance gyroscope.
Flat-Lens Focusing of Electron Beams in Graphene
Tang, Yang; Cao, Xiyuan; Guo, Ran; Zhang, Yanyan; Che, Zhiyuan; Yannick, Fouodji T.; Zhang, Weiping; Du, Junjie
2016-01-01
Coupling electron beams carrying information into electronic units is fundamental in microelectronics. This requires precision manipulation of electron beams through a coupler with a good focusing ability. In graphene, the focusing of wide electron beams has been successfully demonstrated by a circular p-n junction. However, it is not favorable for information coupling, since the focal length is so small that the focal spot is located inside the circular gated region rather than in the background region. Here, we demonstrate that an array of gate-defined quantum dots, with gradually changing lattice spacing in the direction transverse to propagation, can focus electrons outside itself, providing a possibility to make a coupler in graphene. The focusing effect can be understood as due to the gradient change of effective refractive indices, which are defined by the local energy band in a periodic potential. Strong focusing can be achieved by suitably choosing the lattice gradient and the number of layers in the incident direction, offering an effective solution to precision manipulation of electron beams over a wide electron energy range and with high angular tolerance. PMID:27628099
Precisely Tracking Childhood Death.
Farag, Tamer H; Koplan, Jeffrey P; Breiman, Robert F; Madhi, Shabir A; Heaton, Penny M; Mundel, Trevor; Ordi, Jaume; Bassat, Quique; Menendez, Clara; Dowell, Scott F
2017-07-01
Little is known about the specific causes of neonatal and under-five childhood death in high-mortality geographic regions, due to a lack of primary data and dependence on inaccurate tools such as verbal autopsy. To meet the ambitious new Sustainable Development Goal 3.2 to eliminate preventable child mortality in every country, better approaches are needed to precisely determine specific causes of death so that prevention and treatment interventions can be strengthened and focused. Minimally invasive tissue sampling (MITS) is a technique that uses needle-based postmortem sampling, followed by advanced histopathology and microbiology, to definitively determine cause of death. The Bill & Melinda Gates Foundation is supporting a new surveillance system called the Child Health and Mortality Prevention Surveillance network, which will determine cause of death using MITS in combination with other information, and yield cause-specific population-based mortality rates, eventually in up to 12-15 sites in sub-Saharan Africa and south Asia. However, the Gates Foundation funding alone is not enough. We call on governments, other funders, and international stakeholders to expand the use of pathology-based cause-of-death determination to provide the information needed to end preventable childhood mortality.
Active inference, evidence accumulation and the urn task
FitzGerald, Thomas HB; Schwartenbeck, Philipp; Moutoussis, Michael; Dolan, Raymond J; Friston, Karl
2015-01-01
Deciding how much evidence to accumulate before making a decision is a problem we and other animals often face, but one which is not completely understood. This issue is particularly important because a tendency to sample less information (often known as reflection impulsivity) is a feature in several psychopathologies, such as psychosis. A formal understanding of information sampling may therefore clarify the computational anatomy of psychopathology. In this theoretical paper, we consider evidence accumulation in terms of active (Bayesian) inference using a generic model of Markov decision processes. Here, agents are equipped with beliefs about their own behaviour – in this case, that they will make informed decisions. Normative decision-making is then modelled using variational Bayes to minimise surprise about choice outcomes. Under this scheme, different facets of belief updating map naturally onto the functional anatomy of the brain (at least at a heuristic level). Of particular interest is the key role played by the expected precision of beliefs about control, which we have previously suggested may be encoded by dopaminergic neurons in the midbrain. We show that manipulating expected precision strongly affects how much information an agent characteristically samples, and thus provides a possible link between impulsivity and dopaminergic dysfunction. Our study therefore represents a step towards understanding evidence accumulation in terms of neurobiologically plausible Bayesian inference, and may cast light on why this process is disordered in psychopathology. PMID:25514108
Mohan, Shalini V; Chang, Anne Lynn S
2014-06-01
Precision medicine and precision therapeutics is currently in its infancy with tremendous potential to improve patient care by better identifying individuals at risk for skin cancer and predict tumor responses to treatment. This review focuses on the Hedgehog signaling pathway, its critical role in the pathogenesis of basal cell carcinoma, and the emergence of targeted treatments for advanced basal cell carcinoma. Opportunities to utilize precision medicine are outlined, such as molecular profiling to predict basal cell carcinoma response to targeted therapy and to inform therapeutic decisions.
Measurement of magnetic field gradients using Raman spectroscopy in a fountain
NASA Astrophysics Data System (ADS)
Srinivasan, Arvind; Zimmermann, Matthias; Efremov, Maxim A.; Davis, Jon P.; Narducci, Frank A.
2017-02-01
In many experiments involving cold atoms, it is crucial to know the strength of the magnetic field and/or the magnetic field gradient at the precise location of a measurement. While auxiliary sensors can provide some of this information, the sensors are usually not perfectly co-located with the atoms and so can only provide an approximation to the magnetic field strength. In this article, we describe a technique to measure the magnetic field, based on Raman spectroscopy, using the same atomic fountain source that will be used in future magnetically sensitive measurements.
Study of Air Pollution from Space Using TOMS: Challenges and Promises for Future Missions
NASA Technical Reports Server (NTRS)
Bhartia, Pawan K.
2002-01-01
A series of TOMS instruments built by NASA has flown on US, Russian, and Japanese satellites in the last 24 years. These instruments are well known for producing spectacular maps of the ozone hole that forms over Antarctica each spring. However, it is less well known that these instruments also provided first evidence that space-based measurements in UV of sufficiently high precision and accuracy can provide valuable information to study global air quality. We will use the TOMS experience to highlight the promises and challenges of future space-based missions designed specifically for air quality studies.
Mann, G; Birkmann, C; Schmidt, T; Schaeffler, V
1999-01-01
Introduction: Present solutions for the representation and retrieval of medical information from online sources are not very satisfying. Either the retrieval process lacks precision and completeness, or the representation does not support the update and maintenance of the represented information. Most efforts are currently put into improving the combination of search engines and HTML-based documents. However, due to the current shortcomings of methods for natural language understanding, there are clear limitations to this approach. Furthermore, this approach does not solve the maintenance problem. At least, medical information exceeding a certain complexity seems to call for approaches that rely on structured knowledge representation and corresponding retrieval mechanisms. Methods: Knowledge-based information systems are based on the following fundamental ideas. The representation of information is based on ontologies that define the structure of the domain's concepts and their relations. Views on domain models are defined and represented as retrieval schemata. Retrieval schemata can be interpreted as canonical query types focusing on specific aspects of the provided information (e.g., diagnosis- or therapy-centred views). Based on these retrieval schemata, it can be decided which parts of the information in the domain model must be represented explicitly and formalised to support the retrieval process. Propositional logic is used as the representation language. All other information can be represented in a structured but informal way using text, images, etc. Layout schemata are used to assign layout information to retrieved domain concepts. Depending on the target environment, HTML or XML can be used. Results: Based on this approach, two knowledge-based information systems have been developed.
The 'Ophthalmologic Knowledge-based Information System for Diabetic Retinopathy' (OKIS-DR) provides information on diagnoses, findings, examinations, guidelines, and reference images related to diabetic retinopathy. OKIS-DR uses combinations of findings to specify the information that must be retrieved. The second system focuses on nutrition related allergies and intolerances. Information on allergies and intolerances of a patient are used to retrieve general information on the specified combination of allergies and intolerances. As a special feature the system generates tables showing food types and products that are tolerated or not tolerated by patients. Evaluation by external experts and user groups showed that the described approach of knowledge-based information systems increases the precision and completeness of knowledge retrieval. Due to the structured and non-redundant representation of information the maintenance and update of the information can be simplified. Both systems are available as WWW based online knowledge bases and CD-ROMs (cf. http://mta.gsf.de topic: products).
Better Higgs-CP tests through information geometry
NASA Astrophysics Data System (ADS)
Brehmer, Johann; Kling, Felix; Plehn, Tilman; Tait, Tim M. P.
2018-05-01
Measuring the CP symmetry in the Higgs sector is one of the key tasks of the LHC and a crucial ingredient for precision studies, for example in the language of effective Lagrangians. We systematically analyze which LHC signatures offer dedicated CP measurements in the Higgs-gauge sector and discuss the nature of the information they provide. Based on the Fisher information measure, we compare the maximal reach for CP-violating effects in weak boson fusion, associated ZH production, and Higgs decays into four leptons. We find a subtle balance between more theory-independent approaches and more powerful analysis channels, indicating that rigorous evidence for CP violation in the Higgs-gauge sector will likely require a multistep process.
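The Fisher-information comparison of channels can be illustrated generically. This sketch is not the paper's analysis: for a counting experiment whose Poisson mean mu(g) depends on a coupling g, the Fisher information is I(g) = (dmu/dg)^2 / mu, informations add across independent channels, and 1/sqrt(I) is the Cramer-Rao bound on the achievable precision on g. The channel names follow the abstract, but all counts and derivatives are invented.

```python
import math

def fisher_info_poisson(mu, dmu_dg):
    """Fisher information of one Poisson count with mean mu(g)."""
    return dmu_dg ** 2 / mu

# Hypothetical expected counts and their sensitivities d(mu)/d(g).
channels = {
    "weak boson fusion": (200.0, 80.0),
    "associated ZH":     (50.0, 30.0),
    "H -> 4 leptons":    (20.0, 10.0),
}

info = sum(fisher_info_poisson(mu, d) for mu, d in channels.values())
bound = 1 / math.sqrt(info)  # Cramer-Rao lower bound on sigma_g
print(f"combined information {info:.0f}, best achievable sigma_g {bound:.3f}")
```

Ranking channels by their contribution to I(g) is one simple way to formalize "maximal reach" comparisons of the kind described above.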
Cloud Absorption Radiometer Autonomous Navigation System - CANS
NASA Technical Reports Server (NTRS)
Kahle, Duncan; Gatebe, Charles; McCune, Bill; Hellwig, Dustan
2013-01-01
CAR (cloud absorption radiometer) acquires spatial reference data from host aircraft navigation systems. This poses various problems during CAR data reduction, including navigation data format, accuracy of position data, accuracy of airframe inertial data, and navigation data rate. Incorporating its own navigation system, which included GPS (Global Positioning System), roll axis inertia and rates, and three axis acceleration, CANS expedites data reduction and increases the accuracy of the CAR end data product. CANS provides a self-contained navigation system for the CAR, using inertial reference and GPS positional information. The intent of the software application was to correct the sensor with respect to aircraft roll in real time based upon inputs from a precision navigation sensor. In addition, the navigation information (including GPS position), attitude data, and sensor position details are all streamed to a remote system for recording and later analysis. CANS comprises a commercially available inertial navigation system with integral GPS capability (Attitude Heading Reference System AHRS) integrated into the CAR support structure and data system. The unit is attached to the bottom of the tripod support structure. The related GPS antenna is located on the P-3 radome immediately above the CAR. The AHRS unit provides a RS-232 data stream containing global position and inertial attitude and velocity data to the CAR, which is recorded concurrently with the CAR data. This independence from aircraft navigation input provides for position and inertial state data that accounts for very small changes in aircraft attitude and position, sensed at the CAR location as opposed to aircraft state sensors typically installed close to the aircraft center of gravity. More accurate positional data enables quicker CAR data reduction with better resolution. The CANS software operates in two modes: initialization/calibration and operational. 
In the initialization/calibration mode, the software aligns the precision navigation sensors and initializes the communications interfaces with the sensor and the remote computing system. It also monitors the navigation data state for quality and ensures that the system maintains the required fidelity for attitude and positional information. In the operational mode, the software runs at 12.5 Hz and gathers the required navigation/attitude data, computes the required sensor correction values, and then commands the sensor to the required roll correction. In this manner, the sensor will stay very near to vertical at all times, greatly improving the resulting collected data and imagery. CANS greatly improves quality of resulting imagery and data collected. In addition, the software component of the system outputs a concisely formatted, high-speed data stream that can be used for further science data processing. This precision, time-stamped data also can benefit other instruments on the same aircraft platform by providing extra information from the mission flight.
Otin, Sofia; Fuertes, Maria I.; Vilades, Elisa; Gracia, Hector; Ara, Jose R.; Alarcia, Raquel; Polo, Vicente; Larrosa, Jose M.; Pablo, Luis E.
2016-01-01
Neurodegenerative diseases present a current challenge for accurate diagnosis and for providing precise prognostic information. Developing imaging biomarkers for multiple sclerosis (MS), Parkinson disease (PD), and Alzheimer's disease (AD) will improve the clinical management of these patients and may be useful for monitoring treatment effectiveness. Recent research using optical coherence tomography (OCT) has demonstrated that parameters provided by this technology may be used as potential biomarkers for MS, PD, and AD. Retinal thinning has been observed in these patients, and new segmentation software for the analysis of the different retinal layers may provide accurate information on disease progression and prognosis. In this review we analyze the application of retinal evaluation using OCT technology to provide better understanding of the possible role of retinal layer thickness as a biomarker for the detection of these neurodegenerative pathologies. Current OCT analysis of the retinal nerve fiber layer and, especially, the ganglion cell layer thickness may be considered a good biomarker for disease diagnosis, severity, and progression. PMID:27840739
A laser spectrometer and wavemeter for pulsed lasers
NASA Technical Reports Server (NTRS)
Mckay, J. A.; Laufer, P. M.; Cotnoir, L. J.
1989-01-01
The design, construction, calibration, and evaluation of a pulsed laser wavemeter and spectral analyzer are described. This instrument, called the Laserscope for its oscilloscope-like display of laser spectral structure, was delivered to NASA Langley Research Center as a prototype of a laboratory instrument. The key component is a multibeam Fizeau wedge interferometer, providing high (0.2 pm) spectral resolution and a linear dispersion of spectral information, ideally suited to linear array photodiode detectors. Even operating alone, with the classic order-number ambiguity of interferometers unresolved, this optical element will provide a fast, real-time display of the spectral structure of a laser output. If precise wavelength information is also desired then additional stages must be provided to obtain a wavelength measurement within the order-number uncertainty, i.e., within the free spectral range of the Fizeau wedge interferometer. A Snyder (single-beam Fizeau) wedge is included to provide this initial wavelength measurement. Difficulties in achieving the required wide-spectrum calibration limit the usefulness of this function.
Mourning dove population trend estimates from Call-Count and North American Breeding Bird Surveys
Sauer, J.R.; Dolton, D.D.; Droege, S.
1994-01-01
The mourning dove (Zenaida macroura) Call-Count Survey and the North American Breeding Bird Survey provide information on population trends of mourning doves throughout the continental United States. Because surveys are an integral part of the development of hunting regulations, a need exists to determine which survey provides more precise information. We estimated population trends from 1966 to 1988 by state and dove management unit, and assessed the relative efficiency of each survey. Estimates of population trend differ (P < 0.05) between surveys in 11 of 48 states; 9 of the 11 states with divergent results occur in the Eastern Management Unit. Differences were probably a consequence of smaller sample sizes in the Call-Count Survey. The Breeding Bird Survey generally provided trend estimates with smaller variances than did the Call-Count Survey. Although the Call-Count Survey probably provides more within-route accuracy because of survey methods and timing, the Breeding Bird Survey has a larger sample size of survey routes and greater consistency of coverage in the Eastern Unit.
Deep Space Network-Wide Portal Development: Planning Service Pilot Project
NASA Technical Reports Server (NTRS)
Doneva, Silviya
2011-01-01
The Deep Space Network (DSN) is an international network of antennas that supports interplanetary spacecraft missions and radio and radar astronomy observations for the exploration of the solar system and the universe. DSN provides the vital two-way communications link that guides and controls planetary explorers, and brings back the images and new scientific information they collect. In an attempt to streamline operations and improve overall services provided by the Deep Space Network a DSN-wide portal is under development. The project is one step in a larger effort to centralize the data collected from current missions including user input parameters for spacecraft to be tracked. This information will be placed into a principal repository where all operations related to the DSN are stored. Furthermore, providing statistical characterization of data volumes will help identify technically feasible tracking opportunities and more precise mission planning by providing upfront scheduling proposals. Business intelligence tools are to be incorporated in the output to deliver data visualization.
Abdulla, Ahmed AbdoAziz Ahmed; Lin, Hongfei; Xu, Bo; Banbhrani, Santosh Kumar
2016-07-25
Biomedical literature retrieval is becoming increasingly complex, and there is a fundamental need for advanced information retrieval systems. Information Retrieval (IR) programs scour unstructured materials such as text documents in large reserves of data that are usually stored on computers. IR is concerned with the representation, storage, and organization of information items, as well as with access to them. One of the main problems in IR is to determine which documents are relevant to the user's needs and which are not. Under the current regime, users cannot construct queries precisely enough to retrieve particular pieces of data from large reserves of data, and basic information retrieval systems produce low-quality search results. In this paper we present a new technique to refine Information Retrieval searches to better represent the user's information need, enhancing retrieval performance by using different query expansion techniques and applying linear combinations between them, where each combination linearly merges two expansion results at a time. Query expansions expand the search query, for example, by finding synonyms and reweighting original terms. They provide significantly more focused, particularized search results than do basic search queries. Retrieval performance is measured by some variants of MAP (Mean Average Precision). According to our experimental results, the combination of the best query expansion results enhances the retrieved documents and outperforms our baseline by 21.06 %; it even outperforms a previous study by 7.12 %. We propose several query expansion techniques and their linear combinations to make user queries more cognizable to search engines and to produce higher-quality search results.
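The linear-combination idea can be sketched in a few lines. This is a rough illustration, not the authors' implementation: two query-expansion runs are merged with a weight alpha, and the fused ranking is scored with Average Precision (the per-query quantity behind MAP). The document IDs, scores, and relevance judgments are fabricated.

```python
def fuse(scores_a, scores_b, alpha=0.5):
    """Linear combination of two {doc_id: score} retrieval runs."""
    docs = set(scores_a) | set(scores_b)
    return {d: alpha * scores_a.get(d, 0.0) + (1 - alpha) * scores_b.get(d, 0.0)
            for d in docs}

def average_precision(ranking, relevant):
    """AP of a ranked list against a set of relevant doc ids."""
    hits, total = 0, 0.0
    for i, doc in enumerate(ranking, start=1):
        if doc in relevant:
            hits += 1
            total += hits / i
    return total / max(len(relevant), 1)

run_synonyms = {"d1": 0.9, "d2": 0.4, "d3": 0.7}   # e.g. a synonym expansion
run_reweight = {"d1": 0.2, "d2": 0.8, "d4": 0.6}   # e.g. a term-reweighting run

fused = fuse(run_synonyms, run_reweight, alpha=0.5)
ranking = sorted(fused, key=fused.get, reverse=True)
print(ranking, average_precision(ranking, {"d1", "d2"}))
```

Sweeping alpha over a validation set and averaging AP across queries (MAP) would reproduce the kind of pairwise-combination tuning the abstract describes.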
Investigating Temporal and Spatial Variations in Near Surface Water Content using GPR
NASA Astrophysics Data System (ADS)
Hubbard, S. S.; Grote, K.; Kowalsky, M. B.; Rubin, Y.
2001-12-01
Using only conventional point or well logging measurements, it is difficult to obtain information about water content with sufficient spatial resolution and coverage to be useful for near surface applications such as for input to vadose zone predictive models or for assisting with precision crop management. Prompted by successful results of a controlled ground penetrating radar (GPR) pilot study, we are investigating the applicability of GPR methods to estimate near surface water content at a study site within the Robert Mondavi vineyards in Napa County, California. Detailed information about soil variability and water content within vineyards could assist in estimation of plantable acreage, in the design of vineyard layout and in the design of an efficient irrigation/agrochemical application procedure. Our research at the winery study site involves investigation of optimal GPR acquisition and processing techniques, modeling of GPR attributes, and inversion of the attributes for water content information over space and time. A secondary goal of our project is to compare water content information obtained from the GPR data with information available from other types of measurements that are being used to assist in precision crop management. This talk will focus on point and spatial correlation estimation of water content obtained using GPR groundwave information only, and comparison of those estimates with information obtained from analysis of soils, TDR, neutron probe and remote sensing data sets. This comparison will enable us to 1) understand the potential of GPR for providing water content information in the very shallow subsurface, and to 2) investigate the interrelationships between the different types of measurements (and associated measurement scales) that are being utilized to characterize the shallow subsurface water content over space and time.
Field precision machining technology of target chamber in ICF lasers
NASA Astrophysics Data System (ADS)
Xu, Yuanli; Wu, Wenkai; Shi, Sucun; Duan, Lin; Chen, Gang; Wang, Baoxu; Song, Yugang; Liu, Huilin; Zhu, Mingzhi
2016-10-01
In ICF lasers, many independent laser beams are required to be positioned on target with a very high degree of accuracy during a shot. The target chamber provides a precision platform and datum reference for the final optics assembly and the target collimation and location system. The target chamber consists of a shell with welded flanges, a reinforced concrete pedestal, and a lateral support structure. The field precision machining technology of the target chamber in ICF lasers has been developed based on ShenGuangIII (SGIII). The same center of the target chamber is adopted in the process of design, fabrication, and alignment. The technologies of beam collimation and datum reference transformation were developed for the fabrication, positioning, and adjustment of the target chamber. A supporting and rotating mechanism and a special drilling machine were developed to bore the holes of the ports. An adjustment mechanism is designed to accurately position the target chamber. In order to ensure the collimation requirements of beam leading and focusing and of target positioning, custom-machined spacers are used to accurately correct the alignment error of the ports. Finally, this paper describes the chamber center, orientation, and centering alignment error measurements of SGIII. The measurements show that the field precision machining of the SGIII target chamber meets its design requirements. This information can be used on similar systems.
Lombaert, Herve; Grady, Leo; Polimeni, Jonathan R.; Cheriet, Farida
2013-01-01
Existing methods for surface matching are limited by the trade-off between precision and computational efficiency. Here we present an improved algorithm for dense vertex-to-vertex correspondence that uses direct matching of features defined on a surface and improves it by using spectral correspondence as a regularization. This algorithm has the speed of both feature matching and spectral matching while exhibiting greatly improved precision (distance errors of 1.4%). The method, FOCUSR, incorporates implicitly such additional features to calculate the correspondence and relies on the smoothness of the lowest-frequency harmonics of a graph Laplacian to spatially regularize the features. In its simplest form, FOCUSR is an improved spectral correspondence method that nonrigidly deforms spectral embeddings. We provide here a full realization of spectral correspondence where virtually any feature can be used as additional information using weights on graph edges, but also on graph nodes and as extra embedded coordinates. As an example, the full power of FOCUSR is demonstrated in a real case scenario with the challenging task of brain surface matching across several individuals. Our results show that combining features and regularizing them in a spectral embedding greatly improves the matching precision (to a sub-millimeter level) while performing at much greater speed than existing methods. PMID:23868776
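The spectral-regularization idea behind this approach can be sketched on a toy graph. The example below is our simplification, not the FOCUSR code: it computes the lowest-frequency non-trivial eigenvector of the graph Laplacian (the Fiedler vector) for a small path graph standing in for a surface mesh; because this mode varies smoothly over the graph, sorting vertices by it recovers their order along the path, which is the property that makes low-frequency harmonics useful for spatially regularizing correspondences.

```python
import numpy as np

def laplacian_embedding(adj, k=1):
    """Lowest k non-trivial eigenvectors of the graph Laplacian L = D - A."""
    lap = np.diag(adj.sum(axis=1)) - adj
    _, vecs = np.linalg.eigh(lap)   # eigenvalues returned in ascending order
    return vecs[:, 1:k + 1]         # drop the constant (zero-frequency) mode

# A 4-vertex path graph 0-1-2-3 stands in for a surface mesh.
adj = np.array([[0, 1, 0, 0],
                [1, 0, 1, 0],
                [0, 1, 0, 1],
                [0, 0, 1, 0]], dtype=float)

fiedler = laplacian_embedding(adj)[:, 0]
# Sorting by the smoothest mode recovers the path order (up to a sign flip).
print([int(i) for i in np.argsort(fiedler)])
```

In a full pipeline, vertices of two meshes would be embedded this way (with feature-derived edge weights) and matched by nearest neighbour in the shared spectral space.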
Cosmic ray measurements with LOPES: Status and recent results
NASA Astrophysics Data System (ADS)
Schröder, F. G.; Apel, W. D.; Arteaga-Velázquez, J. C.; Bähren, L.; Bekk, K.; Bertaina, M.; Biermann, P. L.; Blümer, J.; Bozdog, H.; Brancus, I. M.; Chiavassa, A.; Daumiller, K.; de Souza, V.; Di Pierro, F.; Doll, P.; Engel, R.; Falcke, H.; Fuchs, B.; Fuhrmann, D.; Gemmeke, H.; Grupen, C.; Haungs, A.; Heck, D.; Hörandel, J. R.; Horneffer, A.; Huber, D.; Huege, T.; Isar, P. G.; Kampert, K.-H.; Kang, D.; Krömer, O.; Kuijpers, J.; Link, K.; Łuczak, P.; Ludwig, M.; Mathes, H. J.; Melissas, M.; Morello, C.; Oehlschläger, J.; Palmieri, N.; Pierog, T.; Rautenberg, J.; Rebel, H.; Roth, M.; Rühle, C.; Saftoiu, A.; Schieler, H.; Schmidt, A.; Sima, O.; Toma, G.; Trinchero, G. C.; Weindl, A.; Wochele, J.; Zabierowski, J.; Zensus, J. A.
2013-05-01
LOPES is a digital antenna array at the Karlsruhe Institute of Technology, Germany, for cosmic-ray air-shower measurements. Triggered by the co-located KASCADE-Grande air-shower array, LOPES detects the radio emission of air showers via digital radio interferometry. We summarize the status of LOPES and recent results. In particular, we present an update on the reconstruction of the primary-particle properties based on almost 500 events above 100 PeV. With LOPES, the arrival direction can be reconstructed with a precision of at least 0.65°, and the energy with a precision of at least 20%, which, however, does not include systematic uncertainties on the absolute energy scale. For many particle and astrophysics questions the reconstruction of the atmospheric depth of the shower maximum, Xmax, is important, since it yields information on the type of the primary particle and its interaction with the atmosphere. Recently, we found experimental evidence that the slope of the radio lateral distribution is indeed sensitive to the longitudinal development of the air shower, but unfortunately, the Xmax precision at LOPES is limited by the high level of anthropogenic radio background. Nevertheless, the developed methods can be transferred to next-generation experiments with lower background, which should provide an Xmax precision competitive with other detection technologies.
Using conceptual work products of health care to design health IT.
Berry, Andrew B L; Butler, Keith A; Harrington, Craig; Braxton, Melissa O; Walker, Amy J; Pete, Nikki; Johnson, Trevor; Oberle, Mark W; Haselkorn, Jodie; Paul Nichol, W; Haselkorn, Mark
2016-02-01
This paper introduces a new, model-based design method for interactive health information technology (IT) systems. This method extends workflow models with models of conceptual work products. When the health care work being modeled is substantially cognitive, tacit, and complex in nature, graphical workflow models can become too complex to be useful to designers. Conceptual models complement and simplify workflows by providing an explicit specification for the information product they must produce. We illustrate how conceptual work products can be modeled using standard software modeling language, which allows them to provide fundamental requirements for what the workflow must accomplish and the information that a new system should provide. Developers can use these specifications to envision how health IT could enable an effective cognitive strategy as a workflow with precise information requirements. We illustrate the new method with a study conducted in an outpatient multiple sclerosis (MS) clinic. This study shows specifically how the different phases of the method can be carried out, how the method allows for iteration across phases, and how the method generated a health IT design for case management of MS that is efficient and easy to use. Copyright © 2015 Elsevier Inc. All rights reserved.
New Evidence on Employment Effects of Informal Care Provision in Europe.
Kolodziej, Ingo W K; Reichert, Arndt R; Schmitz, Hendrik
2018-02-22
To estimate how labor force participation is affected when adult children provide informal care to their parents. Survey of Health, Ageing and Retirement in Europe from 2004 to 2013. To offset the problem of endogeneity, we exploit the availability of other potential caregivers within the family as predictors of the probability to provide care for a dependent parent. Contrary to most previous studies, the dataset covers the whole working-age population in the majority of European countries. Individuals explicitly had to opt for or against the provision of care to their care-dependent parents, which allows us to more precisely estimate the effect of caregiving on labor force participation. Results reveal a negative causal effect that indicates that informal care provision reduces labor force participation by 14.0 percentage points (95 percent CI: -0.307, 0.026). Point estimates suggest that the effect is larger for men; however, this gender difference is not significantly different from zero at conventional levels. Results apply to individuals whose consideration in long-term care policy is highly relevant, that is, children whose willingness to provide informal care to their parents is altered by available alternatives of family caregivers. © Health Research and Educational Trust.
Adams, Richard; Tabernero, Josep; Seufferlein, Thomas; Taieb, Julien; Moiseyenko, Vladimir; Ma, Brigette; Lopez, Gustavo; Vansteenkiste, Johan F.; Esser, Regina; Tejpar, Sabine
2016-01-01
Background. Two separate multinational surveys of oncologists and patients with cancer were conducted to assess the awareness and use of biomarkers in clinical practice. These data explore the self-reported and physician-assessed levels of patient cancer literacy and factors affecting physicians’ choice to use biomarkers in treatment decisions. Patients and Methods. Interviews were conducted via telephone with patients and online with physicians. Physicians had 3–35 years of experience; were treating more than 15 patients/month; and specialized in breast, lung, or colorectal cancer. Patients had received treatment for breast, lung, or colorectal cancer within the previous 5 years. Results. Interviews with 895 physicians and 811 patients were completed. Most patients and physicians reported that patients understood that a tumor could be tested to determine what treatment would be most effective (78% and 73%, respectively) and that patients would be willing to participate in a personalized treatment plan. Whereas 85% of patients felt that they understood their treatment when it was explained to them, only 23% of doctors felt that their patients were always fully informed. Most physicians (90%) reported using biomarkers; among the 10% not performing biomarker analysis, the most cited obstacles were local availability, speed of obtaining results, and cost. Conclusion. These data demonstrate wide global use of biomarker testing but with regional variations reflecting cultural and local practice. Self-reported and physician-assessed cancer literacy, although generally high, highlighted important regional variations and the need to provide patients with additional information. Implications for Practice: Two surveys were conducted to evaluate the global use of biomarkers in clinical practice and the largely unreported patient experience of precision medicine. 
These findings are especially relevant because they address both self-reported and physician-assessed levels of patients’ “cancer literacy.” This unique opportunity allowed for identification of areas where patients and physicians are communicating effectively, and also where there is a teachable gap in patient education. Furthermore, surveying physicians about the advantages and roadblocks they experience with biomarker testing provided valuable information on ways to improve the delivery of precision medicine to provide personalized care and ultimately enhance patient care. PMID:26888693
Rough Set Soft Computing Cancer Classification and Network: One Stone, Two Birds
Zhang, Yue
2010-01-01
Gene expression profiling provides tremendous information to help unravel the complexity of cancer. The selection of the most informative genes from huge noise for cancer classification has taken centre stage, along with predicting the function of such identified genes and the construction of direct gene regulatory networks at different system levels with a tuneable parameter. A new study by Wang and Gotoh described a novel Variable Precision Rough Sets-rooted robust soft computing method to successfully address these problems and has yielded some new insights. The significance of this progress and its perspectives will be discussed in this article. PMID:20706619
Opto-mechanical system design of test system for near-infrared and visible target
NASA Astrophysics Data System (ADS)
Wang, Chunyan; Zhu, Guodong; Wang, Yuchao
2014-12-01
Guidance precision is a key index of guided-weapon shooting. The factors affecting guidance precision include information-processing precision, control-system accuracy, and laser-irradiation accuracy, among which laser-irradiation precision is an important factor. Aiming at the demand for precision testing of laser irradiators, this paper develops a laser precision test system. The system consists of a modified Cassegrain telescope, a wide-range CCD camera, a tracking turntable, and an industrial PC, and images visible-light and near-infrared targets simultaneously with a near-IR camera. Analysis of the design results shows that, for a target at 1000 meters, the system's measurement precision is 43 mm, fully meeting the needs of laser precision testing.
Garaizar, Pablo; Vadillo, Miguel A.; López-de-Ipiña, Diego; Matute, Helena
2014-01-01
Because of the features provided by an abundance of specialized experimental software packages, personal computers have become prominent and powerful tools in cognitive research. Most of these programs have mechanisms to control the precision and accuracy with which visual stimuli are presented as well as the response times. However, external factors, often related to the technology used to display the visual information, can have a noticeable impact on the actual performance and may be easily overlooked by researchers. The aim of this study is to measure the precision and accuracy of the timing mechanisms of some of the most popular software packages used in a typical laboratory scenario in order to assess whether presentation times configured by researchers do not differ from measured times more than what is expected due to the hardware limitations. Despite the apparent precision and accuracy of the results, important issues related to timing setups in the presentation of visual stimuli were found, and they should be taken into account by researchers in their experiments. PMID:24409318
Ultra-precision process of CaF2 single crystal
NASA Astrophysics Data System (ADS)
Yin, Guoju; Li, Shengyi; Xie, Xuhui; Zhou, Lin
2014-08-01
This paper proposes a new chemical mechanical polishing (CMP) process for CaF2 single crystal to obtain an ultraprecision surface. The CMP process uses an improved polishing pad and alkaline SiO2 polishing slurry in two phases, at pH 8 and pH 11, respectively, and achieves a roughness of 0.181 nm Rq (10 μm × 10 μm). Because CMP alone cannot achieve a high surface figure, ion beam figuring (IBF) is used to obtain it; however, IBF by itself does little to improve the CaF2 surface roughness. We therefore optimize the IBF process to improve the surface figure while preserving good surface roughness. Varying the IBF incident ion energy from 400 eV to 800 eV does not noticeably affect the surface roughness, but it does change the depth of material removal. With IBF, CaF2 single crystal can reach a high-precision surface figure (RMS = 2.251 nm) while retaining an ultra-smooth surface (Rq = 0.207 nm) when the removal depth is less than 200 nm. These results provide important information for realizing ultra-precision manufacture of CaF2 single crystal.
Achieving quantum precision limit in adaptive qubit state tomography
NASA Astrophysics Data System (ADS)
Hou, Zhibo; Zhu, Huangjun; Xiang, Guo-Yong; Li, Chuan-Feng; Guo, Guang-Can
2016-02-01
The precision limit in quantum state tomography is of great interest not only to practical applications but also to foundational studies. However, little is known about this subject in the multiparameter setting even theoretically due to the subtle information trade-off among incompatible observables. In the case of a qubit, the theoretic precision limit was determined by Hayashi as well as Gill and Massar, but attaining the precision limit in experiments has remained a challenging task. Here we report the first experiment that achieves this precision limit in adaptive quantum state tomography on optical polarisation qubits. The two-step adaptive strategy used in our experiment is very easy to implement in practice. Yet it is surprisingly powerful in optimising most figures of merit of practical interest. Our study may have significant implications for multiparameter quantum estimation problems, such as quantum metrology. Meanwhile, it may promote our understanding about the complementarity principle and uncertainty relations from the information theoretic perspective.
NASA Astrophysics Data System (ADS)
Berrada, K.; Eleuch, H.
2017-09-01
Various schemes have been proposed to improve parameter-estimation precision. In the present work, we suggest an alternative method to preserve the estimation precision by considering a model that closely describes a realistic experimental scenario. We explore this active way to control and enhance the measurement precision for a two-level quantum system interacting with a classical electromagnetic field using ultra-short strong pulses, with an exact analytical solution, i.e. beyond the rotating wave approximation. In particular, we investigate the variation of the precision with a few-cycle pulse and a smooth phase jump over a finite time interval. We show that by acting on the shape of the phase transient and other parameters of the considered system, the amount of information can be increased and made to decay more slowly at long times. These features make two-level systems driven by ultra-short, off-resonant pulses with gradually changing phase good candidates for implementing schemes for quantum computation and coherent information processing.
Dunham, Kylee; Grand, James B.
2016-01-01
We examined the effects of complexity and priors on the accuracy of models used to estimate ecological and observational processes, and to make predictions regarding population size and structure. State-space models are useful for estimating complex, unobservable population processes and making predictions about future populations based on limited data. To better understand the utility of state space models in evaluating population dynamics, we used them in a Bayesian framework and compared the accuracy of models with differing complexity, with and without informative priors using sequential importance sampling/resampling (SISR). Count data were simulated for 25 years using known parameters and observation process for each model. We used kernel smoothing to reduce the effect of particle depletion, which is common when estimating both states and parameters with SISR. Models using informative priors estimated parameter values and population size with greater accuracy than their non-informative counterparts. While the estimates of population size and trend did not suffer greatly in models using non-informative priors, the algorithm was unable to accurately estimate demographic parameters. This model framework provides reasonable estimates of population size when little to no information is available; however, when information on some vital rates is available, SISR can be used to obtain more precise estimates of population size and process. Incorporating model complexity such as that required by structured populations with stage-specific vital rates affects precision and accuracy when estimating latent population variables and predicting population dynamics. These results are important to consider when designing monitoring programs and conservation efforts requiring management of specific population segments.
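The sequential importance sampling/resampling scheme the authors describe can be sketched as a minimal bootstrap particle filter for a scalar population count (all model parameters below are illustrative assumptions, not the study's):

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulate a simple population: N_t = growth * N_{t-1} + process noise,
# observed as counts y_t = N_t + observation noise.
T, growth = 25, 1.02
true_N = np.empty(T)
true_N[0] = 100.0
for t in range(1, T):
    true_N[t] = growth * true_N[t - 1] + rng.normal(0, 2.0)
y = true_N + rng.normal(0, 5.0, T)

# Bootstrap SISR: propagate particles through the process model, weight by
# the Gaussian observation likelihood, then resample to combat degeneracy.
P = 2000
particles = rng.normal(100.0, 20.0, P)          # prior on the initial state
estimates = np.empty(T)
for t in range(T):
    if t > 0:
        particles = growth * particles + rng.normal(0, 2.0, P)
    w = np.exp(-0.5 * ((y[t] - particles) / 5.0) ** 2)
    w /= w.sum()
    estimates[t] = np.sum(w * particles)        # filtered population estimate
    particles = rng.choice(particles, size=P, p=w)  # multinomial resampling
```

Kernel smoothing of the resampled particles, which the study uses to reduce particle depletion when states and parameters are estimated jointly, would be added at the resampling step.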
The Paradox of Abstraction: Precision Versus Concreteness
ERIC Educational Resources Information Center
Iliev, Rumen; Axelrod, Robert
2017-01-01
We introduce a novel measure of abstractness based on the amount of information of a concept computed from its position in a semantic taxonomy. We refer to this measure as "precision". We propose two alternative ways to measure precision, one based on the path length from a concept to the root of the taxonomic tree, and another one based…
A Window Into Clinical Next-Generation Sequencing-Based Oncology Testing Practices.
Nagarajan, Rakesh; Bartley, Angela N; Bridge, Julia A; Jennings, Lawrence J; Kamel-Reid, Suzanne; Kim, Annette; Lazar, Alexander J; Lindeman, Neal I; Moncur, Joel; Rai, Alex J; Routbort, Mark J; Vasalos, Patricia; Merker, Jason D
2017-12-01
Detection of acquired variants in cancer is a paradigm of precision medicine, yet little has been reported about clinical laboratory practices across a broad range of laboratories. To use College of American Pathologists proficiency testing survey results to report on the results from surveys on next-generation sequencing-based oncology testing practices. College of American Pathologists proficiency testing survey results from more than 250 laboratories currently performing molecular oncology testing were used to determine laboratory trends in next-generation sequencing-based oncology testing. These presented data provide key information about the number of laboratories that currently offer or are planning to offer next-generation sequencing-based oncology testing. Furthermore, we present data from 60 laboratories performing next-generation sequencing-based oncology testing regarding specimen requirements and assay characteristics. The findings indicate that most laboratories are performing tumor-only targeted sequencing to detect single-nucleotide variants and small insertions and deletions, using desktop sequencers and predesigned commercial kits. Despite these trends, a diversity of approaches to testing exists. This information should be useful to further inform a variety of topics, including national discussions involving clinical laboratory quality systems, regulation and oversight of next-generation sequencing-based oncology testing, and precision oncology efforts in a data-driven manner.
Precise definition of anonymization in genetic polymorphism studies.
Hamajima, Nobuyuki; Atsuta, Yoshiko; Niwa, Yoshimitsu; Nishio, Kazuko; Tanaka, Daisuke; Yamamoto, Kazuhito; Tamakoshi, Akiko
2004-01-01
Anonymization is an essential tool to protect the privacy of participants in epidemiological studies. This paper classifies types of anonymization in genetic polymorphism studies, providing precise definitions. They are: 1) unlinkable anonymization at enrollment without a participant list; 2) unlinkable anonymization before genotyping with a participant list; 3) linkable anonymization; 4) unlinkable anonymization for outsiders; and 5) linkable anonymization for outsiders. The classification in view of accessibility to a table combining genotype data with directly identifiable data such as names is important; if such tables exist, staff may obtain genotype information about participants. The first three modes are defined here as anonymization in which research staff cannot access genotype data linked to directly identifiable information. Anonymization with a key code held by participants is possible under any of the above modes, by which participants can access their own genotypes by telephone or the internet. A guideline issued on March 29, 2001 with the collaboration of three Ministries in Japan defines "anonymization in a linkable fashion" and "anonymization in an unlinkable fashion", "for the purpose of preventing the personal information from being divulged externally in violation of law, the present guidelines or a research protocol", but the contents are not clear in practice. The proposed definitions will be useful when we describe and discuss the preferable mode of anonymization for a given polymorphism study.
Sciorati, Clara; Esposito, Antonio; Campana, Lara; Canu, Tamara; Monno, Antonella; Palmisano, Anna; De Cobelli, Francesco; Del Maschio, Alessandro; Ascheman, Dana P.; Manfredi, Angelo A.; Rovere-Querini, Patrizia
2014-01-01
Inflammatory myopathies comprise heterogeneous disorders. Their etiopathogenesis is poorly understood, because of the paucity of informative experimental models and of approaches for the noninvasive study of inflamed tissues. Magnetic resonance imaging (MRI) provides information about the state of the skeletal muscle that reflects various facets of inflammation and remodeling. This technique has been scarcely used in experimental models of inflammatory myopathies. We characterized the performance of MRI in a well-established mouse model of myositis and the antisynthetase syndrome, based on the immunization of wild-type mice with the amino-terminal fragment of histidyl-tRNA synthetase (HisRS). Over an eight-week period following myositis induction, MRI enabled precise identification of pathological events taking place in muscle tissue. Areas of edema and of active inflammation identified by histopathology paralleled muscle modifications detected noninvasively by MRI. Muscle changes were chronologically associated with the establishment of autoimmunity, as reflected by the development of anti-HisRS antibodies in the blood of immunized mice. MR imaging readily revealed muscle damage and remodeling even when actual disruption of myofiber integrity (as assessed by serum concentrations of creatine phosphokinase) was limited. Thus, MR imaging represents an informative and noninvasive analytical tool for studying in vivo immune-mediated muscle involvement. PMID:24895622
Bray, Molly S; Loos, Ruth J F; McCaffery, Jeanne M; Ling, Charlotte; Franks, Paul W; Weinstock, George M; Snyder, Michael P; Vassy, Jason L; Agurs-Collins, Tanya
2016-01-01
Precision medicine utilizes genomic and other data to optimize and personalize treatment. Although more than 2,500 genetic tests are currently available, largely for extreme and/or rare phenotypes, the question remains whether this approach can be used for the treatment of common, complex conditions like obesity, inflammation, and insulin resistance, which underlie a host of metabolic diseases. This review, developed from a Trans-NIH Conference titled "Genes, Behaviors, and Response to Weight Loss Interventions," provides an overview of the state of genetic and genomic research in the area of weight change and identifies key areas for future research. Although many loci have been identified that are associated with cross-sectional measures of obesity/body size, relatively little is known regarding the genes/loci that influence dynamic measures of weight change over time. Although successful short-term weight loss has been achieved using many different strategies, sustainable weight loss has proven elusive for many, and there are important gaps in our understanding of energy balance regulation. Elucidating the molecular basis of variability in weight change has the potential to improve treatment outcomes and inform innovative approaches that can simultaneously take into account information from genomic and other sources in devising individualized treatment plans. © 2015 The Obesity Society.
Bray, Molly S; Loos, Ruth JF; McCaffery, Jeanne M; Ling, Charlotte; Franks, Paul W; Weinstock, George M; Snyder, Michael P; Vassy, Jason L; Agurs-Collins, Tanya
2016-01-01
Objective Precision medicine utilizes genomic and other data to optimize and personalize treatment. Although more than 2,500 genetic tests are currently available, largely for extreme and/or rare phenotypes, the question remains whether this approach can be used for the treatment of common, complex conditions like obesity, inflammation, and insulin resistance, which underlie a host of metabolic diseases. Methods This review, developed from a Trans-NIH Conference titled “Genes, Behaviors, and Response to Weight Loss Interventions,” provides an overview of the state of genetic and genomic research in the area of weight change and identifies key areas for future research. Results Although many loci have been identified that are associated with cross-sectional measures of obesity/body size, relatively little is known regarding the genes/loci that influence dynamic measures of weight change over time. Although successful short-term weight loss has been achieved using many different strategies, sustainable weight loss has proven elusive for many, and there are important gaps in our understanding of energy balance regulation. Conclusions Elucidating the molecular basis of variability in weight change has the potential to improve treatment outcomes and inform innovative approaches that can simultaneously take into account information from genomic and other sources in devising individualized treatment plans. PMID:26692578
Space Weather Needs of an Evolving Customer Base (Invited)
NASA Astrophysics Data System (ADS)
Rutledge, B.; Viereck, R. A.; Onsager, T. G.
2013-12-01
Great progress has been made in raising the global awareness of space weather and the associated impacts on Earth and our technological systems. However, significant gaps still exist in providing comprehensive and easily understood space weather information, products, and services to the diverse and growing customer base. As technologies, such as Global Navigation Satellite Systems (GNSS), have become more ingrained in applications and fields of work that previously did not rely on systems sensitive to space weather, the customer base has grown substantially. Furthermore, the causes and effects of space weather can be difficult to interpret without a detailed understanding of the scientific underpinnings. In response to this change, space weather service providers must address this evolution by both improving services and by representing space weather information and impacts in ways that are meaningful to each facet of this diverse customer base. The NOAA Space Weather Prediction Center (SWPC) must work with users, spanning precision agriculture, emergency management, power grid operators and beyond, to both identify unmet space weather service requirements and to ensure information and decision support services are provided in meaningful and more easily understood forms.
NASA Astrophysics Data System (ADS)
Wang, C.; Rubin, Y.
2014-12-01
The spatial distribution of the compression modulus Es, an important geotechnical parameter, contributes considerably to the understanding of the underlying geological processes and to adequate assessment of the mechanical effects of Es on the differential settlement of large continuous structure foundations. These analyses should be derived using an assimilating approach that combines in-situ static cone penetration tests (CPT) with borehole experiments. To achieve this, the Es distribution of a silty-clay stratum in region A of the China Expo Center (Shanghai) is studied using the Bayesian maximum entropy method. This method rigorously and efficiently integrates geotechnical investigations of differing precision and sources of uncertainty. Single CPT samplings were modeled as rational probability density curves using maximum entropy theory. A spatial prior multivariate probability density function (PDF) and the likelihood PDF of the CPT positions were built from borehole experiments and the potential value of the prediction point; then, by numerical integration over the CPT probability density curves, the posterior probability density curve of the prediction point was calculated within the Bayesian reverse interpolation framework. The results were compared between Gaussian sequential stochastic simulation and the Bayesian method. The differences between single CPT samplings under a normal-distribution assumption and the simulated probability density curves based on maximum entropy theory were also discussed. It is shown that the study of Es spatial distributions can be improved by properly incorporating CPT sampling variation into the interpolation process, and that more informative estimates are generated by considering CPT uncertainty at the estimation points. The calculation illustrates the significance of stochastic Es characterization in a stratum and identifies limitations associated with inadequate geostatistical interpolation techniques.
These characterization results will provide a multi-precision information-assimilation method for other geotechnical parameters.
Dube, Blaire; Emrich, Stephen M; Al-Aidroos, Naseem
2017-10-01
Across 2 experiments we revisited the filter account of how feature-based attention regulates visual working memory (VWM). Originally drawing from discrete-capacity ("slot") models, the filter account proposes that attention operates like the "bouncer in the brain," preventing distracting information from being encoded so that VWM resources are reserved for relevant information. Given recent challenges to the assumptions of discrete-capacity models, we investigated whether feature-based attention plays a broader role in regulating memory. Both experiments used partial report tasks in which participants memorized the colors of circle and square stimuli, and we provided a feature-based goal by manipulating the likelihood that 1 shape would be probed over the other across a range of probabilities. By decomposing participants' responses using mixture and variable-precision models, we estimated the contributions of guesses, nontarget responses, and imprecise memory representations to their errors. Consistent with the filter account, participants were less likely to guess when the probed memory item matched the feature-based goal. Interestingly, this effect varied with goal strength, even across high probabilities where goal-matching information should always be prioritized, demonstrating strategic control over filter strength. Beyond this effect of attention on which stimuli were encoded, we also observed effects on how they were encoded: Estimates of both memory precision and nontarget errors varied continuously with feature-based attention. The results offer support for an extension to the filter account, where feature-based attention dynamically regulates the distribution of resources within working memory so that the most relevant items are encoded with the greatest precision. (PsycINFO Database Record (c) 2017 APA, all rights reserved).
Quantifying the Precision of Single-Molecule Torque and Twist Measurements Using Allan Variance.
van Oene, Maarten M; Ha, Seungkyu; Jager, Tessa; Lee, Mina; Pedaci, Francesco; Lipfert, Jan; Dekker, Nynke H
2018-04-24
Single-molecule manipulation techniques have provided unprecedented insights into the structure, function, interactions, and mechanical properties of biological macromolecules. Recently, the single-molecule toolbox has been expanded by techniques that enable measurements of rotation and torque, such as the optical torque wrench (OTW) and several different implementations of magnetic (torque) tweezers. Although systematic analyses of the position and force precision of single-molecule techniques have attracted considerable attention, their angle and torque precision have been treated in much less detail. Here, we propose Allan deviation as a tool to systematically quantitate angle and torque precision in single-molecule measurements. We apply the Allan variance method to experimental data from our implementations of (electro)magnetic torque tweezers and an OTW and find that both approaches can achieve a torque precision better than 1 pN · nm. The OTW, capable of measuring torque on (sub)millisecond timescales, provides the best torque precision for measurement times ≲10 s, after which drift becomes a limiting factor. For longer measurement times, magnetic torque tweezers with their superior stability provide the best torque precision. Use of the Allan deviation enables critical assessments of the torque precision as a function of measurement time across different measurement modalities and provides a tool to optimize measurement protocols for a given instrument and application. Copyright © 2018 Biophysical Society. Published by Elsevier Inc. All rights reserved.
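A minimal sketch of the non-overlapping Allan deviation of a sampled signal (e.g. a torque trace), following the standard definition; this is an illustration of the statistic, not the authors' analysis code:

```python
import numpy as np

def allan_deviation(x, m):
    """Non-overlapping Allan deviation at averaging factor m.

    Averages consecutive blocks of m samples, then returns
    sqrt(0.5 * mean of squared differences of successive block means).
    """
    n_blocks = len(x) // m
    means = x[:n_blocks * m].reshape(n_blocks, m).mean(axis=1)
    return np.sqrt(0.5 * np.mean(np.diff(means) ** 2))

# For white noise the Allan deviation falls as 1/sqrt(m): longer
# averaging times yield smaller uncertainty until drift takes over.
rng = np.random.default_rng(0)
noise = rng.normal(0.0, 1.0, 200_000)
```

Plotted against averaging time, the curve for a real torque trace turns upward once drift dominates, which is how the optimal measurement time is read off for each instrument.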
NASA Astrophysics Data System (ADS)
Perera, Dimuthu
Diffusion weighted (DW) imaging is a non-invasive MR technique that provides information about tissue microstructure using the diffusion of water molecules. The diffusion is generally characterized by the apparent diffusion coefficient (ADC) parametric map. The purpose of this study is to investigate in silico how the calculation of ADC is affected by image SNR, b-values, and the true tissue ADC; to provide the optimal parameter combination based on percentage accuracy and precision for prostate peripheral-zone cancer applications; and to suggest parameter choices for any tissue type, together with the expected accuracy and precision. DW images were generated assuming a mono-exponential signal model at two different b-values and for known true ADC values. Rician noise of different levels was added to the DW images to adjust the image SNR. Using the two DW images, ADC was calculated with a mono-exponential model for each set of b-values, SNR, and true ADC. For each parameter setting, 40,000 ADC samples were collected to determine the mean and standard deviation of the calculated ADC, as well as the percentage accuracy and precision with respect to the true ADC. Accuracy was calculated from the difference between the known and calculated ADC; precision was calculated from the standard deviation of the calculated ADC. The optimal parameters for a specific study were those that minimized both the percentage accuracy and precision errors. In our study, we simulated two true ADCs (1.02 x 10-3 mm2/s for tumor and 1.80 x 10-3 mm2/s for normal prostate peripheral-zone tissue). Image SNR was varied from 2 to 100 and b-values from 0 to 2000 s/mm2. The results show that the percentage accuracy and precision errors decreased with increasing image SNR. To increase SNR, 10 signal averages (NEX) were used, considering the limitation on total scan time.
The optimal NEX combination for tumor and normal tissue in the prostate peripheral zone was 1:9. The minimum percentage accuracy and precision errors were obtained with a low b-value of 0 and a high b-value of 800 s/mm2 for normal tissue and 1400 s/mm2 for tumor tissue. Results also showed that for tissues with 1.0 x 10-3 < ADC < 2.1 x 10-3 mm2/s, the parameter combination SNR = 20, b-value pair (0, 800 s/mm2), and NEX = 1:9 can calculate ADC with a percentage accuracy of less than 2% and a percentage precision of 6-8%. Likewise, for tissues with 0.6 x 10-3 < ADC < 1.25 x 10-3 mm2/s, the combination SNR = 20, b-value pair (0, 1400 s/mm2), and NEX = 1:9 achieves a percentage accuracy of less than 2% and a percentage precision of 6-8%.
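The two-point mono-exponential ADC estimate and the Monte Carlo procedure described above can be sketched as follows (a simplified illustration assuming s0 = 1 and no NEX averaging; the numbers are not expected to reproduce the study's exact values):

```python
import numpy as np

rng = np.random.default_rng(1)

def simulate_adc(true_adc, b_low, b_high, snr, n_trials=40_000, s0=1.0):
    """Monte Carlo two-point ADC estimate under Rician noise.
    b-values in s/mm^2, ADC in mm^2/s; sigma is set from the b=0 SNR."""
    sigma = s0 / snr
    signals = []
    for b in (b_low, b_high):
        s_true = s0 * np.exp(-b * true_adc)
        # Rician noise: magnitude of a complex signal with Gaussian noise
        noisy = np.sqrt((s_true + rng.normal(0, sigma, n_trials)) ** 2
                        + rng.normal(0, sigma, n_trials) ** 2)
        signals.append(noisy)
    # mono-exponential model: ADC = ln(S_low / S_high) / (b_high - b_low)
    adc = np.log(signals[0] / signals[1]) / (b_high - b_low)
    pct_accuracy = 100 * (adc.mean() - true_adc) / true_adc
    pct_precision = 100 * adc.std() / true_adc
    return pct_accuracy, pct_precision

acc, prec = simulate_adc(true_adc=1.02e-3, b_low=0, b_high=1400, snr=20)
```

Sweeping SNR and the b-value pair over a grid of true ADCs with this kind of simulation reproduces the trade-off the study quantifies: precision degrades when the high-b signal falls toward the noise floor.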
GIS as a vital tool for Environmental Impact Assessment and Mitigation
NASA Astrophysics Data System (ADS)
Gharehbaghi, Koorosh; Scott-Young, Christina
2018-03-01
Environmental Impact Assessment (EIA) is a process that provides information to stakeholders, such as planners and relevant authorities, about a planned development and its subsequent effects on the environment and immediate surroundings. EIA and mitigation together comprise the process of collecting and analyzing information and determining an application for development or construction approval, which can be accessed by the communities and organizations concerned. Although EIA and mitigation regulations vary across jurisdictions, they are very precise and need to be integrated with specific geographical data. A Geographical Information System (GIS) is software intended to encapsulate and present all types of physical, biological, environmental, ecological, and geological information. GIS integrates statistical analysis and information technology, and can be further broken down into two categories: topological modelling and map overlay. GIS provides the decisive apparatus for making EIA and mitigation responsive to such geographical data. Using GIS not only improves the overall EIA and mitigation process but also provides valuable mapping strategies, including a holistic environmental-systems approach. Accordingly, the main objective of this paper is to discuss the importance of integrating GIS and environmental data to further enhance the overall EIA and mitigation processes.
Last Glacial Maximum Salinity Reconstruction
NASA Astrophysics Data System (ADS)
Homola, K.; Spivack, A. J.
2016-12-01
It has been previously demonstrated that salinity can be reconstructed from sediment porewater. The goal of our study is to reconstruct high precision salinity during the Last Glacial Maximum (LGM). Salinity is usually determined at high precision via conductivity, which requires a larger volume of water than can be extracted from a sediment core, or via chloride titration, which yields lower than ideal precision. It has been demonstrated for water column samples that high precision density measurements can be used to determine salinity at the precision of a conductivity measurement using the equation of state of seawater. However, water column seawater has a relatively constant composition, in contrast to porewater, where variations from standard seawater composition occur. These deviations, which affect the equation of state, must be corrected for through precise measurements of each ion's concentration and knowledge of its apparent partial molar density in seawater. We have developed a density-based method for determining porewater salinity that requires only 5 mL of sample, achieving density precisions of 10-6 g/mL. We have applied this method to porewater samples extracted from long cores collected along a N-S transect across the western North Atlantic (R/V Knorr cruise KN223). Density was determined to a precision of 2.3x10-6 g/mL, which translates to a salinity uncertainty of 0.002 g/kg if the effect of differences in composition is well constrained. Concentrations of anions (Cl- and SO4-2) and cations (Na+, Mg+2, Ca+2, and K+) were measured. To correct salinities at the precision required to unravel LGM Meridional Overturning Circulation, our ion precisions must be better than 0.1% for SO4-2/Cl- and Mg+2/Na+, and 0.4% for Ca+2/Na+ and K+/Na+. Alkalinity, pH, and dissolved inorganic carbon of the porewater were determined to precisions better than 4% when ratioed to Cl-, and used to calculate HCO3- and CO3-2.
Apparent partial molar densities in seawater were determined experimentally. We compare the high precision salinity profiles determined using our new method to profiles determined from the traditional chloride titrations of parallel samples. Our technique provides a more accurate reconstruction of past salinity, informing questions of water mass composition and distribution during the LGM.
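As a rough consistency check on the quoted salinity uncertainty, the density precision can be propagated through a linearized equation of state. The reference density and haline contraction coefficient below are textbook-level approximations for standard seawater, not values from the study:

```python
# Propagate a density measurement uncertainty into a salinity uncertainty
# using a linearized equation of state near standard seawater conditions.
rho = 1025.0           # kg/m^3, reference seawater density (assumption)
beta = 7.6e-4          # 1/(g/kg), haline contraction coefficient (approx.)
drho_dS = rho * beta   # ~0.78 (kg/m^3) per (g/kg) of salinity

sigma_rho = 2.3e-6 * 1000.0    # density precision: 2.3e-6 g/mL -> kg/m^3
sigma_S = sigma_rho / drho_dS  # resulting salinity uncertainty, g/kg
```

This gives a salinity uncertainty of order 0.003 g/kg, the same order of magnitude as the 0.002 g/kg quoted in the abstract; the exact figure depends on the full equation of state and the composition corrections the authors apply.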
2012-02-01
make it more difficult for veterans with PTSD to seek or maintain treatment. VHA provides treatment for PTSD at VHA hospitals, outpatient clinics ...measured in days of inpatient hospital care and outpatient clinic visits. A veteran may have had several outpatient visits on a single day, each...reproduce the same results precisely. The DSS system takes clinical and financial information from other VHA databases and uses algorithms that merge
Investigation of Space Interferometer Control Using Imaging Sensor Output Feedback
NASA Technical Reports Server (NTRS)
Leitner, Jesse A.; Cheng, Victor H. L.
2003-01-01
Numerous space interferometry missions are planned for the next decade to verify different enabling technologies towards very-long-baseline interferometry to achieve high-resolution imaging and high-precision measurements. These objectives will require coordinated formations of spacecraft separately carrying optical elements comprising the interferometer. High-precision sensing and control of the spacecraft and the interferometer-component payloads are necessary to deliver sub-wavelength accuracy to achieve the scientific objectives. For these missions, the primary scientific product of interferometer measurements may be the only source of data available at the precision required to maintain the spacecraft and interferometer-component formation. A concept is studied for detecting the interferometer's optical configuration errors based on information extracted from the interferometer sensor output. It enables precision control of the optical components, and, in cases of space interferometers requiring formation flight of spacecraft that comprise the elements of a distributed instrument, it enables the control of the formation-flying vehicles because independent navigation or ranging sensors cannot deliver the high-precision metrology over the entire required geometry. Since the concept can act on the quality of the interferometer output directly, it can detect errors outside the capability of traditional metrology instruments, and provide the means needed to augment the traditional instrumentation to enable enhanced performance. Specific analyses performed in this study include the application of signal-processing and image-processing techniques to solve the problems of interferometer aperture baseline control, interferometer pointing, and orientation of multiple interferometer aperture pairs.
Modal-Power-Based Haptic Motion Recognition
NASA Astrophysics Data System (ADS)
Kasahara, Yusuke; Shimono, Tomoyuki; Kuwahara, Hiroaki; Sato, Masataka; Ohnishi, Kouhei
Motion recognition based on sensory information is important for robots that provide assistance to humans. Several studies have addressed motion recognition based on image information. However, contact between a human and an object cannot be evaluated precisely by image-based recognition alone, because force information is essential for describing contact motion. In this paper, modal-power-based haptic motion recognition is proposed; modal power reveals information on both position and force, and is considered one of the defining features of human motion. A motion recognition algorithm based on linear discriminant analysis is proposed to distinguish between similar motions. Haptic information is extracted using a bilateral master-slave system, and the observed motion is decomposed into primitive functions in a modal space. The experimental results show the effectiveness of the proposed method.
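A two-class linear discriminant of the kind applied here to modal-power features can be sketched with synthetic data (the feature dimension, class means, and noise level below are invented for illustration, not taken from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "modal power" feature vectors for two similar motions,
# each summarized by power in three assumed haptic modes.
n = 200
X0 = rng.normal([1.0, 0.2, 0.5], 0.3, size=(n, 3))
X1 = rng.normal([0.8, 0.6, 0.4], 0.3, size=(n, 3))

# Two-class Fisher LDA: project onto w = Sw^{-1} (mu1 - mu0)
# and threshold at the midpoint of the projected class means.
mu0, mu1 = X0.mean(axis=0), X1.mean(axis=0)
Sw = np.cov(X0, rowvar=False) + np.cov(X1, rowvar=False)
w = np.linalg.solve(Sw, mu1 - mu0)
threshold = w @ (mu0 + mu1) / 2.0

def classify(x):
    return int(w @ x > threshold)

acc = np.mean([classify(x) == 0 for x in X0] +
              [classify(x) == 1 for x in X1])
```

With overlapping classes like these, the discriminant direction weights the modes by how well they separate the motions relative to their within-class scatter, which is why LDA suits the "similar motions" problem the paper targets.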
Information recall using relative spike timing in a spiking neural network.
Sterne, Philip
2012-08-01
We present a neural network that is capable of completing and correcting a spiking pattern given only a partial, noisy version. It operates in continuous time and represents information using the relative timing of individual spikes. The network is capable of correcting and recalling multiple patterns simultaneously. We analyze the network's performance in terms of information recall. We explore two measures of the capacity of the network: one that values the accurate recall of individual spike times and another that values only the presence or absence of complete patterns. Both measures of information are found to scale linearly in both the number of neurons and the period of the patterns, suggesting these are natural measures of network information. We show a smooth transition from encodings that provide precise spike times to flexible encodings that can encode many scenes. This makes it plausible that many diverse tasks could be learned with such an encoding.
High spatial precision nano-imaging of polarization-sensitive plasmonic particles
NASA Astrophysics Data System (ADS)
Liu, Yunbo; Wang, Yipei; Lee, Somin Eunice
2018-02-01
Precise polarimetric imaging of polarization-sensitive nanoparticles is essential for resolving their accurate spatial positions beyond the diffraction limit. However, conventional technologies currently suffer from beam deviation errors which cannot be corrected beyond the diffraction limit. To overcome this issue, we experimentally demonstrate a spatially stable nano-imaging system for polarization-sensitive nanoparticles. In this study, we show that by integrating a voltage-tunable imaging variable polarizer with optical microscopy, we are able to suppress beam deviation errors. We expect that this nano-imaging system should allow for acquisition of accurate positional and polarization information from individual nanoparticles in applications where real-time, high precision spatial information is required.
A multicolor imaging pyrometer
NASA Technical Reports Server (NTRS)
Frish, Michael B.; Frank, Jonathan H.
1989-01-01
A multicolor imaging pyrometer was designed for accurately and precisely measuring the temperature distribution histories of small moving samples. The device projects six different color images of the sample onto a single charge coupled device array that provides an RS-170 video signal to a computerized frame grabber. The computer automatically selects which one of the six images provides useful data, and converts that information to a temperature map. By measuring the temperature of molten aluminum heated in a kiln, a breadboard version of the device was shown to provide high accuracy in difficult measurement situations. It is expected that this pyrometer will ultimately find application in measuring the temperature of materials undergoing radiant heating in a microgravity acoustic levitation furnace.
A multicolor imaging pyrometer
NASA Astrophysics Data System (ADS)
Frish, Michael B.; Frank, Jonathan H.
1989-06-01
A multicolor imaging pyrometer was designed for accurately and precisely measuring the temperature distribution histories of small moving samples. The device projects six different color images of the sample onto a single charge coupled device array that provides an RS-170 video signal to a computerized frame grabber. The computer automatically selects which one of the six images provides useful data, and converts that information to a temperature map. By measuring the temperature of molten aluminum heated in a kiln, a breadboard version of the device was shown to provide high accuracy in difficult measurement situations. It is expected that this pyrometer will ultimately find application in measuring the temperature of materials undergoing radiant heating in a microgravity acoustic levitation furnace.
Precision Medicine in Head and Neck Cancer: Myth or Reality?
Malone, Eoghan; Siu, Lillian L
2018-01-01
Standard treatment for head and neck squamous cell carcinoma (HNSCC) is currently limited, with decisions made primarily on the basis of tumor location, histology, and stage. The role of the human papillomavirus in risk stratification is actively under clinical trial evaluation. The molecular complexity and intratumoral heterogeneity of the disease are not actively integrated into management decisions for HNSCC, despite a growing body of knowledge in these areas. The advent of the genomic era has delivered vast amounts of information regarding different cancer subtypes and is providing new therapeutic targets, which can potentially be elucidated using next-generation sequencing and other modern technologies. The task ahead is to expand the existing armamentarium by exploring beyond the genome and performing integrative analyses using innovative systems-biology methods, with the goal of delivering effective precision medicine-based theragnostic options in HNSCC.
The integration of FPGA TDC inside White Rabbit node
NASA Astrophysics Data System (ADS)
Li, H.; Xue, T.; Gong, G.; Li, J.
2017-04-01
White Rabbit technology is capable of delivering sub-nanosecond accuracy and picosecond-level precision of synchronization, along with normal data packets, over a fiber network. The carry-chain structure in an FPGA is a popular way to build a TDC, and RMS resolutions of tens of picoseconds have been achieved. Integrating WR technology with an FPGA TDC can enhance and simplify the TDC in many respects, including providing a low-jitter clock for the TDC, a synchronized absolute UTC/TAI timestamp for the coarse counter, a convenient way to calibrate the carry-chain DNL, and an easy-to-use Ethernet link for data and control information transmission. This paper presents an FPGA TDC implemented inside a normal White Rabbit node with sub-nanosecond measurement precision. The measured standard deviation reaches 50 ps between two distributed TDCs. Possible applications of this distributed TDC are also discussed.
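The carry-chain DNL calibration mentioned above is commonly performed by code-density (statistical) calibration: hits uncorrelated with the clock populate each delay bin in proportion to its real width. A sketch with a simulated 64-bin chain (the bin count and clock period are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate a carry chain whose 64 bins have unequal widths (DNL)
# but together tile one clock period exactly.
T_clk = 8.0  # ns, clock period (assumption)
true_widths = rng.uniform(0.5, 1.5, size=64)
true_widths *= T_clk / true_widths.sum()

# Digitize many random hit times into bin codes.
edges = np.concatenate(([0.0], np.cumsum(true_widths)))
hits = rng.uniform(0.0, T_clk, 1_000_000)
codes = np.clip(np.searchsorted(edges, hits, side="right") - 1, 0, 63)

# Code-density calibration: the hit histogram estimates each bin's
# width; the cumulative sum gives a code -> time lookup table.
counts = np.bincount(codes, minlength=64)
est_widths = counts / counts.sum() * T_clk
lut = np.cumsum(est_widths) - est_widths / 2.0   # bin centers, ns
```

In a White Rabbit node, the synchronized clock makes it easy to gather such statistics and to align the resulting fine timestamps with the absolute UTC/TAI coarse counter.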
Precision medicine in chronic disease management: The multiple sclerosis BioScreen.
Gourraud, Pierre-Antoine; Henry, Roland G; Cree, Bruce A C; Crane, Jason C; Lizee, Antoine; Olson, Marram P; Santaniello, Adam V; Datta, Esha; Zhu, Alyssa H; Bevan, Carolyn J; Gelfand, Jeffrey M; Graves, Jennifer S; Goodin, Douglas S; Green, Ari J; von Büdingen, H-Christian; Waubant, Emmanuelle; Zamvil, Scott S; Crabtree-Hartman, Elizabeth; Nelson, Sarah; Baranzini, Sergio E; Hauser, Stephen L
2014-11-01
We present a precision medicine application developed for multiple sclerosis (MS): the MS BioScreen. This new tool addresses the challenges of dynamic management of a complex chronic disease; the interaction of clinicians and patients with such a tool illustrates the extent to which translational digital medicine-that is, the application of information technology to medicine-has the potential to radically transform medical practice. We introduce 3 key evolutionary phases in displaying data to health care providers, patients, and researchers: visualization (accessing data), contextualization (understanding the data), and actionable interpretation (real-time use of the data to assist decision making). Together, these form the stepping stones that are expected to accelerate standardization of data across platforms, promote evidence-based medicine, support shared decision making, and ultimately lead to improved outcomes. © 2014 American Neurological Association.
The Case for Personalized Medicine
Abrahams, Edward; Silver, Mike
2009-01-01
Personalized medicine may be considered an extension of traditional approaches to understanding and treating disease, but with greater precision. Physicians may now use a patient's genetic variation or expression profile as well as protein and metabolic markers to guide the selection of certain drugs or treatments. In many cases, the information provided by molecular markers predicts susceptibility to conditions. The added precision introduces the possibility of a more preventive, effective approach to clinical care and reductions in the duration and cost of clinical trials. Here, we make the case, through real-world examples, that personalized medicine is delivering significant value to individuals, to industry, and to the health care system overall and that it will continue to grow in importance if we can lift the barriers that impede its adoption and build incentives to encourage its practice. PMID:20144313
Evaluating single-pass catch as a tool for identifying spatial pattern in fish distribution
Bateman, Douglas S.; Gresswell, Robert E.; Torgersen, Christian E.
2005-01-01
We evaluate the efficacy of single-pass electrofishing without blocknets as a tool for collecting spatially continuous fish distribution data in headwater streams. We compare spatial patterns in abundance, sampling effort, and length-frequency distributions from single-pass sampling of coastal cutthroat trout (Oncorhynchus clarki clarki) to data obtained from a more precise multiple-pass removal electrofishing method in two mid-sized (500–1000 ha) forested watersheds in western Oregon. Abundance estimates from single- and multiple-pass removal electrofishing were positively correlated in both watersheds, r = 0.99 and 0.86. There were no significant trends in capture probabilities at the watershed scale (P > 0.05). Moreover, among-sample variation in fish abundance was higher than within-sample error in both streams indicating that increased precision of unit-scale abundance estimates would provide less information on patterns of abundance than increasing the fraction of habitat units sampled. In the two watersheds, respectively, single-pass electrofishing captured 78 and 74% of the estimated population of cutthroat trout with 7 and 10% of the effort. At the scale of intermediate-sized watersheds, single-pass electrofishing exhibited a sufficient level of precision to be effective in detecting spatial patterns of cutthroat trout abundance and may be a useful tool for providing the context for investigating fish-habitat relationships at multiple scales.
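The multiple-pass removal method used as the benchmark above has a simple closed form in the two-pass case (the standard Seber-Le Cren estimator, shown here as a sketch; the catch numbers are hypothetical):

```python
# Two-pass removal estimator: assuming a constant capture probability p
# per pass, catches C1 and C2 give
#   p_hat = 1 - C2/C1   and   N_hat = C1^2 / (C1 - C2).
def two_pass_removal(c1, c2):
    if c2 >= c1:
        raise ValueError("estimator requires a declining catch (C2 < C1)")
    p_hat = 1.0 - c2 / c1
    n_hat = c1 ** 2 / (c1 - c2)
    return n_hat, p_hat

# e.g. 60 fish on the first pass, 20 on the second
n_hat, p_hat = two_pass_removal(60, 20)
```

The single-pass approach evaluated in the paper trades this per-unit abundance correction for spatial coverage: when capture probability shows no trend across the watershed, raw first-pass counts track relative abundance well enough to map spatial pattern.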
eRAM: encyclopedia of rare disease annotations for precision medicine.
Jia, Jinmeng; An, Zhongxin; Ming, Yue; Guo, Yongli; Li, Wei; Liang, Yunxiang; Guo, Dongming; Li, Xin; Tai, Jun; Chen, Geng; Jin, Yaqiong; Liu, Zhimei; Ni, Xin; Shi, Tieliu
2018-01-04
Rare diseases affect over a hundred million people worldwide; most of these patients are not accurately diagnosed or effectively treated. The limited knowledge of rare diseases is the biggest obstacle to improving their treatment. Detailed clinical phenotyping is considered a keystone of deciphering genes and realizing precision medicine for rare diseases. Here, we present a standardized annotation system for various types of rare diseases, called the encyclopedia of Rare disease Annotations for Precision Medicine (eRAM). eRAM was built by text-mining nearly 10 million scientific publications and electronic medical records, and by integrating data from existing recognized databases (such as the Unified Medical Language System (UMLS), Human Phenotype Ontology, Orphanet, OMIM, and GWAS). eRAM systematically incorporates currently available data on the clinical manifestations and molecular mechanisms of rare diseases and uncovers many novel associations among diseases. eRAM provides enriched annotations for 15,942 rare diseases, yielding 6,147 human disease-related phenotype terms, 31,661 mammalian phenotype terms, 10,202 symptoms from UMLS, 18,815 genes, and 92,580 genotypes. eRAM can not only provide information about rare disease mechanisms but also help clinicians make accurate diagnostic and therapeutic decisions for rare diseases. eRAM can be freely accessed at http://www.unimd.org/eram/. © The Author(s) 2017. Published by Oxford University Press on behalf of Nucleic Acids Research.
Windowed R-PDLF recoupling: a flexible and reliable tool to characterize molecular dynamics.
Gansmüller, Axel; Simorre, Jean-Pierre; Hediger, Sabine
2013-09-01
This work focuses on the improvement of the R-PDLF heteronuclear recoupling scheme, a method that allows quantification of molecular dynamics up to the microsecond timescale in heterogeneous materials. We show how the stability of the sequence towards rf-imperfections, one of the main sources of error of this technique, can be improved by the insertion of windows without irradiation into the basic elements of the symmetry-based recoupling sequence. The impact of this modification on the overall performance of the sequence in terms of scaling factor and homonuclear decoupling efficiency is evaluated. This study indicates the experimental conditions for which precise and reliable measurement of dipolar couplings can be obtained using the popular R18(1)(7) recoupling sequence, as well as alternative symmetry-based R sequences suited for fast MAS conditions. An analytical expression for the recoupled dipolar modulation has been derived that applies to a whole class of sequences with similar recoupling properties as R18(1)(7). This analytical expression provides an efficient and precise way to extract dipolar couplings from the experimental dipolar modulation curves. We hereby provide helpful tools and information for tailoring R-PDLF recoupling schemes to specific sample properties and hardware capabilities. This approach is particularly well suited for the study of materials with strong and heterogeneous molecular dynamics where a precise measurement of dipolar couplings is crucial. Copyright © 2013 Elsevier Inc. All rights reserved.
Bled, Florent; Belant, Jerrold L; Van Daele, Lawrence J; Svoboda, Nathan; Gustine, David; Hilderbrand, Grant; Barnes, Victor G
2017-11-01
Current management of large carnivores is informed using a variety of parameters, methods, and metrics; however, these data are typically considered independently. Sharing information among data types based on the underlying ecological processes, and recognizing observation biases, can improve estimation of individual and global parameters. We present a general integrated population model (IPM), specifically designed for brown bears (Ursus arctos), using three common data types for bear (U. spp.) populations: repeated counts, capture-mark-recapture, and litter size. We considered factors affecting ecological and observation processes for these data. We assessed the practicality of this approach on a simulated population and compared estimates from our model to values used for simulation and results from count data only. We then present a practical application of this general approach adapted to the constraints of a case study using historical data available for brown bears on Kodiak Island, Alaska, USA. The IPM provided more accurate and precise estimates than models accounting for repeated count data only, with credible intervals including the true population 94% and 5% of the time, respectively. For the Kodiak population, we estimated annual average litter size (within one year after birth) to vary between 0.45 [95% credible interval: 0.43; 0.55] and 1.59 [1.55; 1.82]. We detected a positive relationship between salmon availability and adult survival, with survival probabilities greater for females than males. Survival probabilities increased from cubs to yearlings to dependent young ≥2 years old and decreased with litter size. Linking multiple information sources based on ecological and observation mechanisms can provide more accurate and precise estimates, to better inform management. IPMs can also reduce data collection efforts by sharing information among agencies and management units.
Our approach responds to an increasing need in bear populations' management and can be readily adapted to other large carnivores.
Bled, Florent; Belant, Jerrold L.; Van Daele, Lawrence J.; Svoboda, Nathan; Gustine, David D.; Hilderbrand, Grant V.; Barnes, Victor G.
2017-01-01
Current management of large carnivores is informed using a variety of parameters, methods, and metrics; however, these data are typically considered independently. Sharing information among data types based on the underlying ecological processes, and recognizing observation biases, can improve estimation of individual and global parameters. We present a general integrated population model (IPM), specifically designed for brown bears (Ursus arctos), using three common data types for bear (U. spp.) populations: repeated counts, capture–mark–recapture, and litter size. We considered factors affecting ecological and observation processes for these data. We assessed the practicality of this approach on a simulated population and compared estimates from our model to values used for simulation and results from count data only. We then present a practical application of this general approach adapted to the constraints of a case study using historical data available for brown bears on Kodiak Island, Alaska, USA. The IPM provided more accurate and precise estimates than models accounting for repeated count data only, with credible intervals including the true population 94% and 5% of the time, respectively. For the Kodiak population, we estimated annual average litter size (within one year after birth) to vary between 0.45 [95% credible interval: 0.43; 0.55] and 1.59 [1.55; 1.82]. We detected a positive relationship between salmon availability and adult survival, with survival probabilities greater for females than males. Survival probabilities increased from cubs to yearlings to dependent young ≥2 years old and decreased with litter size. Linking multiple information sources based on ecological and observation mechanisms can provide more accurate and precise estimates, to better inform management. IPMs can also reduce data collection efforts by sharing information among agencies and management units.
Our approach responds to an increasing need in bear populations’ management and can be readily adapted to other large carnivores.
Position measurement of the direct drive motor of Large Aperture Telescope
NASA Astrophysics Data System (ADS)
Li, Ying; Wang, Daxing
2010-07-01
Along with the development of space and astronomy science, large and very large aperture telescopes will become the trend. Direct drive technology, with a unified electromechanical and magnetic design, is one method of achieving the precise drive of a large aperture telescope. A direct-drive precision rotary table with a diameter of 2.5 meters that we have developed is a typical example of mechatronic integration. This paper mainly introduces the position measurement and control system of the direct drive motor. In this motor design, the position measurement system requires high resolution; it must precisely measure the position of the rotor shaft and convert the position information into the commutation information corresponding to the motor's pole count. The system uses a high-precision metal-band encoder and an absolute encoder; their outputs are processed in software by a 32-bit RISC CPU to obtain a high-resolution composite encoder. The paper gives relevant laboratory test results at the end, indicating that the position measurement can be applied to a large aperture telescope control system. This project is subsidized by the Chinese National Natural Science Funds (10833004).
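The conversion from an absolute encoder reading to the pole-commutation information mentioned above can be sketched as follows; the counts-per-revolution and pole-pair values are illustrative assumptions, not the instrument's actual parameters:

```python
import math

# Map an absolute encoder reading to the electrical (commutation) angle
# of a multipole direct-drive motor: the electrical angle advances
# pole-pair times faster than the mechanical angle.
COUNTS_PER_REV = 2 ** 23   # high-resolution composite encoder (assumption)
POLE_PAIRS = 60            # large-diameter torque motor (assumption)

def electrical_angle(counts):
    mech = (counts % COUNTS_PER_REV) / COUNTS_PER_REV * 2.0 * math.pi
    return (mech * POLE_PAIRS) % (2.0 * math.pi)

# half an electrical cycle into the first pole pair
theta = electrical_angle(COUNTS_PER_REV // (2 * POLE_PAIRS))
```

Because the electrical angle repeats every 1/POLE_PAIRS of a mechanical revolution, an absolute encoder of sufficient resolution lets the drive commutate correctly from power-up without a separate alignment procedure.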
Burke, Danielle L; Ensor, Joie; Snell, Kym I E; van der Windt, Danielle; Riley, Richard D
2018-06-01
Percentage study weights in meta-analysis reveal the contribution of each study toward the overall summary results and are especially important when some studies are considered outliers or at high risk of bias. In meta-analyses of test accuracy, such as a bivariate meta-analysis of sensitivity and specificity, percentage study weights are not currently derived. Rather, the focus is on representing the precision of study estimates on receiver operating characteristic plots by scaling the points relative to the study sample size or to their standard error. In this article, we recommend that researchers should also provide the percentage study weights directly, and we propose a method to derive them based on a decomposition of the Fisher information matrix. This method also generalises to a bivariate meta-regression, so that percentage study weights can also be derived for estimates of study-level modifiers of test accuracy. Application is made to two meta-analyses examining test accuracy: one of ear temperature for diagnosis of fever in children and the other of positron emission tomography for diagnosis of Alzheimer's disease. These highlight that the percentage study weights provide important information that is otherwise hidden if the presentation only focuses on precision based on sample size or standard errors. Software code is provided for Stata, and we suggest that our proposed percentage weights should be routinely added on forest and receiver operating characteristic plots for sensitivity and specificity, to provide transparency of the contribution of each study toward the results. This has implications for the PRISMA diagnostic test accuracy guidelines that are currently being produced. Copyright © 2017 John Wiley & Sons, Ltd.
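The paper derives weights from a decomposition of the bivariate Fisher information matrix; in the simple univariate fixed-effect case this reduces to the familiar inverse-variance percentage weights, sketched here (the study variances are made-up illustrative numbers):

```python
# Percentage study weights for a univariate fixed-effect meta-analysis:
# each study's weight is proportional to its Fisher information, which
# for a normal estimate is the inverse of its variance.
def percentage_weights(variances):
    info = [1.0 / v for v in variances]
    total = sum(info)
    return [100.0 * w / total for w in info]

# three hypothetical studies with standard errors 0.2, 0.32, 0.5
weights = percentage_weights([0.04, 0.10, 0.25])
```

The bivariate case is more involved because sensitivity and specificity are estimated jointly with between-study correlation, but the principle is the same: each study's percentage weight is its share of the total information about the summary estimate.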
Towards a compact and precise sample holder for macromolecular crystallography.
Papp, Gergely; Rossi, Christopher; Janocha, Robert; Sorez, Clement; Lopez-Marrero, Marcos; Astruc, Anthony; McCarthy, Andrew; Belrhali, Hassan; Bowler, Matthew W; Cipriani, Florent
2017-10-01
Most of the sample holders currently used in macromolecular crystallography offer limited storage density and poor initial crystal-positioning precision upon mounting on a goniometer. This has now become a limiting factor at high-throughput beamlines, where data collection can be performed in a matter of seconds. Furthermore, this lack of precision limits the potential benefits emerging from automated harvesting systems that could provide crystal-position information which would further enhance alignment at beamlines. This situation provided the motivation for the development of a compact and precise sample holder with corresponding pucks, handling tools and robotic transfer protocols. The development process included four main phases: design, prototype manufacture, testing with a robotic sample changer and validation under real conditions on a beamline. Two sample-holder designs are proposed: NewPin and miniSPINE. They share the same robot gripper and allow the storage of 36 sample holders in uni-puck footprint-style pucks, which represents 252 samples in a dry-shipping dewar commonly used in the field. The pucks are identified with human- and machine-readable codes, as well as with radio-frequency identification (RFID) tags. NewPin offers a crystal-repositioning precision of up to 10 µm but requires a specific goniometer socket. The storage density could reach 64 samples using a special puck designed for fully robotic handling. miniSPINE is less precise but uses a goniometer mount compatible with the current SPINE standard. miniSPINE is proposed for the first implementation of the new standard, since it is easier to integrate at beamlines. An upgraded version of the SPINE sample holder with a corresponding puck named SPINEplus is also proposed in order to offer a homogeneous and interoperable system. The project involved several European synchrotrons and industrial companies in the fields of consumables and sample-changer robotics. 
Manual handling of miniSPINE was tested at different institutes using evaluation kits, and pilot beamlines are being equipped with compatible robotics for large-scale evaluation. A companion paper describes a new sample changer FlexED8 (Papp et al., 2017, Acta Cryst., D73, 841-851).
Bereiter, Bernhard; Kawamura, Kenji; Severinghaus, Jeffrey P
2018-05-30
The global ocean constitutes the largest heat buffer in the global climate system, but little is known about its past changes. The isotopic and elemental ratios of heavy noble gases (krypton and xenon), together with argon and nitrogen in trapped air from ice cores, can be used to reconstruct past mean ocean temperatures (MOTs). Here we introduce two successively developed methods to measure these parameters with a sufficient precision to provide new constraints on past changes in MOT. The air from an 800-g ice sample - containing roughly 80 mL STP air - is extracted and processed to be analyzed on two independent dual-inlet isotope ratio mass spectrometers. The primary isotope ratios (δ15N, δ40Ar and δ86Kr values) are obtained with precisions in the range of 1 per meg (0.001‰) per mass unit. The three elemental ratio values δKr/N2, δXe/N2 and δXe/Kr are obtained using sequential (non-simultaneous) peak-jumping, reaching precisions in the range of 0.1-0.3‰. The latest version of the method achieves a 30% to 50% better precision on the elemental ratios and a twofold better sample throughput than the previous one. The method development uncovered an unexpected source of artefactual gas fractionation in a closed system that is caused by adiabatic cooling and warming of gases (termed adiabatic fractionation) - a potential source of measurement artifacts in other methods. The precisions of the three elemental ratios δKr/N2, δXe/N2 and δXe/Kr - which all contain the same MOT information - suggest smaller uncertainties for reconstructed MOTs (±0.3-0.1°C) than previous studies have attained. Due to different sensitivities of the noble gases to changes in MOT, δXe/N2 provides the best constraints on the MOT under the given precisions, followed by δXe/Kr and δKr/N2; however, using all of them helps to detect methodological artifacts and issues with ice quality. Copyright © 2018 John Wiley & Sons, Ltd.
Comparison of 3D point clouds produced by LIDAR and UAV photoscan in the Rochefort cave (Belgium)
NASA Astrophysics Data System (ADS)
Watlet, Arnaud; Triantafyllou, Antoine; Kaufmann, Olivier; Le Mouelic, Stéphane
2016-04-01
Amongst today's techniques that are able to produce 3D point clouds, LIDAR and UAV (Unmanned Aerial Vehicle) photogrammetry are probably the most commonly used. Both methods have their own advantages and limitations. LIDAR scans create high resolution and high precision 3D point clouds, but such methods are generally costly, especially for sporadic surveys. Compared to LIDAR, UAVs (e.g. drones) are cheap and flexible to use in different kinds of environments. Moreover, the photogrammetric processing workflow for digital images taken with UAVs is becoming easier with the rise of many affordable software packages (e.g. Agisoft, PhotoModeler3D, VisualSFM). We present here a challenging study made at the Rochefort Cave Laboratory (South Belgium) comprising surface and underground surveys. The site is located in the Belgian Variscan fold-and-thrust belt, a region that shows many karstic networks within Devonian limestone units. A LIDAR scan was acquired in the main chamber of the cave (~ 15000 m³) to produce a 3D point cloud of its inner walls and infer geological beds and structures. Even though the LIDAR instrument was not easy to use in such a caving environment, the collected data showed remarkable precision when checked against the geometry of a few control points. We also decided to perform another challenging survey of the same cave chamber by modelling a 3D point cloud using photogrammetry on a set of DSLR camera pictures taken from the ground and UAV pictures. The aim was to compare both techniques in terms of (i) implementation of data acquisition and processing, (ii) quality of the resulting 3D point clouds (point density, field vs cloud recovery and point precision), and (iii) their application for geological purposes. Through the Rochefort case study, the main conclusions are that the LIDAR technique provides higher density point clouds with slightly higher precision than the photogrammetry method. 
However, 3D data modeled by photogrammetry provide visible light spectral information for each modeled voxel and interpolated vertex, which can be useful attributes for clustering during data processing. We thus illustrate such applications to the Rochefort cave by using both sources of 3D information to quantify the orientation of inaccessible geological structures (e.g. faults, tectonic and gravitational joints, and sediment bedding), cluster these structures using color information gathered from the UAV's 3D point cloud, and compare these data to structural data surveyed in the field. An additional drone photoscan was also conducted in the surface sinkhole giving access to the surveyed underground cavity, to investigate connections between geological bodies.
A comparison of Boolean-based retrieval to the WAIS system for retrieval of aeronautical information
NASA Technical Reports Server (NTRS)
Marchionini, Gary; Barlow, Diane
1994-01-01
An evaluation was conducted of an information retrieval system using a Boolean-based retrieval engine with an inverted-file architecture and of WAIS, which uses a vector-based engine. Four research questions in aeronautical engineering were used to retrieve sets of citations from the NASA Aerospace Database, which was mounted on a WAIS server and also available through Dialog File 108, which served as the Boolean-based system (BBS). High-recall and high-precision searches were done in the BBS, and terse and verbose queries were used in the WAIS condition. Precision values for the WAIS searches were consistently above the precision values for high-recall BBS searches and consistently below the precision values for high-precision BBS searches. Terse WAIS queries gave somewhat better precision performance than verbose WAIS queries. In every case, a small number of relevant documents retrieved by one system were not retrieved by the other, indicating the incomplete nature of the results from either retrieval system. Relevant documents in the WAIS searches were found to be randomly distributed in the retrieved sets rather than distributed by ranks. Advantages and limitations of both types of systems are discussed.
Hierarchy of Information Processing in the Brain: A Novel 'Intrinsic Ignition' Framework.
Deco, Gustavo; Kringelbach, Morten L
2017-06-07
A general theory of brain function has to be able to explain local and non-local network computations over space and time. We propose a new framework to capture the key principles of how local activity influences global computation, i.e., describing the propagation of information and thus the broadness of communication driven by local activity. More specifically, we consider the diversity in space (nodes or brain regions) over time using the concept of intrinsic ignition, which are naturally occurring intrinsic perturbations reflecting the capability of a given brain area to propagate neuronal activity to other regions in a given brain state. Characterizing the profile of intrinsic ignition for a given brain state provides insight into the precise nature of hierarchical information processing. Combining this data-driven method with a causal whole-brain computational model can provide novel insights into the imbalance of brain states found in neuropsychiatric disorders. Copyright © 2017 Elsevier Inc. All rights reserved.
Rogojerov, Marin; Keresztury, Gábor; Kamenova-Nacheva, Mariana; Sundius, Tom
2012-12-01
A new analytical approach for improving the precision in determination of vibrational transition moment directions of low symmetry molecules (lacking orthogonal axes) is discussed in this paper. The target molecules are partially uniaxially oriented in nematic liquid crystalline solvent and are studied by IR absorption spectroscopy using polarized light. The fundamental problem addressed is that IR linear dichroism measurements of low symmetry molecules alone cannot provide sufficient information on molecular orientation and transition moment directions. It is shown that computational prediction of these quantities can supply relevant complementary data, helping to reveal the hidden information content and achieve a more meaningful and more precise interpretation of the measured dichroic ratios. The combined experimental and theoretical/computational method proposed by us recently for determination of the average orientation of molecules with C(s) symmetry has now been replaced by a more precise analytical approach. The new method introduced and discussed in full detail here uses a mathematically evaluated angle between two vibrational transition moment vectors as a reference. The discussion also deals with error analysis and estimation of uncertainties of the orientational parameters. The proposed procedure has been tested in an analysis of the infrared linear dichroism (IR-LD) spectra of 1-D- and 2-D-naphthalene complemented with DFT calculations using the scaled quantum mechanical force field (SQM FF) method. Copyright © 2012 Elsevier B.V. All rights reserved.
Solano-Román, Antonio; Alfaro-Arias, Verónica; Cruz-Castillo, Carlos; Orozco-Solano, Allan
2018-03-15
VizGVar was designed to meet the growing need of the research community for improved genomic and proteomic data viewers that benefit from better information visualization. We implemented a new information architecture and applied user centered design principles to provide a new improved way of visualizing genetic information and protein data related to human disease. VizGVar connects the entire database of Ensembl protein motifs, domains, genes and exons with annotated SNPs and somatic variations from PharmGKB and COSMIC. VizGVar precisely represents genetic variations and their respective location by colored curves to designate different types of variations. The structured hierarchy of biological data is reflected in aggregated patterns through different levels, integrating several layers of information at once. VizGVar provides a new interactive, web-based JavaScript visualization of somatic mutations and protein variation, enabling fast and easy discovery of clinically relevant variation patterns. VizGVar is accessible at http://vizport.io/vizgvar; http://vizport.io/vizgvar/doc/. asolano@broadinstitute.org or allan.orozcosolano@ucr.ac.cr.
Uciteli, Alexandr; Groß, Silvia; Kireyev, Sergej; Herre, Heinrich
2011-08-09
This paper presents an ontologically founded basic architecture for information systems, which are intended to capture, represent, and maintain metadata for various domains of clinical and epidemiological research. Clinical trials exhibit an important basis for clinical research, and the accurate specification of metadata and their documentation and application in clinical and epidemiological study projects represents a significant expense in the project preparation and has a relevant impact on the value and quality of these studies. An ontological foundation of an information system provides a semantic framework for the precise specification of those entities which are presented in this system. This semantic framework should be grounded, according to our approach, on a suitable top-level ontology. Such an ontological foundation leads to a deeper understanding of the entities of the domain under consideration, and provides a common unifying semantic basis, which supports the integration of data and the interoperability between different information systems. The intended information systems will be applied to the field of clinical and epidemiological research and will provide, depending on the application context, a variety of functionalities. In the present paper, we focus on a basic architecture which might be common to all such information systems. The research set forth in this paper is included in a broader framework of clinical research and continues the work of the IMISE on these topics.
[Study on Information Extraction of Clinic Expert Information from Hospital Portals].
Zhang, Yuanpeng; Dong, Jiancheng; Qian, Danmin; Geng, Xingyun; Wu, Huiqun; Wang, Li
2015-12-01
Clinic expert information provides important references for residents in need of hospital care. Usually, such information is hidden in the deep web and cannot be directly indexed by search engines. To extract clinic expert information from the deep web, the first challenge is to make a judgment on forms. This paper proposes a novel method based on a domain model, which is a tree structure constructed by the attributes of search interfaces. With this model, search interfaces can be classified to a domain and filled in with domain keywords. Another challenge is to extract information from the returned web pages indexed by search interfaces. To filter the noise information on a web page, a block importance model is proposed. The experiment results indicated that the domain model yielded a precision 10.83% higher than that of the rule-based method, whereas the block importance model yielded an F₁ measure 10.5% higher than that of the XPath method.
A simulator study on information requirements for precision hovering
NASA Technical Reports Server (NTRS)
Lemons, J. L.; Dukes, T. A.
1975-01-01
A fixed-base simulator study of an advanced helicopter instrument display utilizing translational acceleration, velocity, and position information is reported. The simulation involved piloting a heavy helicopter using the Integrated Trajectory Error Display (ITED) in a precision hover task. The test series explored two basic areas. The effect on hover accuracy of adding acceleration information was of primary concern. Also of interest was the operators' ability to use degraded information derived from less sophisticated sources. The addition of translational acceleration to a display containing velocity and position information did not appear to improve the hover performance significantly. However, displayed acceleration information seemed to increase the damping of the man-machine system. Finally, the pilots could use translational information synthesized from attitude and angular acceleration as effectively as perfect acceleration information.
McInroy, John E.
2005-01-18
A precision positioning device is provided. The precision positioning device comprises a precision measuring/vibration isolation mechanism. A first plate is provided, with the precision measuring means secured to the first plate. A second plate is secured to the first plate. A third plate is secured to the second plate, with the first plate positioned between the second plate and the third plate. A fourth plate is secured to the third plate, with the second plate positioned between the third plate and the fourth plate. An adjusting mechanism adjusts the positions of the first plate, the second plate, the third plate, and the fourth plate relative to one another.
Parks, David R.; Khettabi, Faysal El; Chase, Eric; Hoffman, Robert A.; Perfetto, Stephen P.; Spidlen, Josef; Wood, James C.S.; Moore, Wayne A.; Brinkman, Ryan R.
2017-01-01
We developed a fully automated procedure for analyzing data from LED pulses and multi-level bead sets to evaluate backgrounds and photoelectron scales of cytometer fluorescence channels. The method improves on previous formulations by fitting a full quadratic model with appropriate weighting and by providing standard errors and peak residuals as well as the fitted parameters themselves. Here we describe the details of the methods and procedures involved and present a set of illustrations and test cases that demonstrate the consistency and reliability of the results. The automated analysis and fitting procedure is generally quite successful in providing good estimates of the Spe (statistical photoelectron) scales and backgrounds for all of the fluorescence channels on instruments with good linearity. The precision of the results obtained from LED data is almost always better than for multi-level bead data, but the bead procedure is easy to carry out and provides results good enough for most purposes. Including standard errors on the fitted parameters is important for understanding the uncertainty in the values of interest. The weighted residuals give information about how well the data fits the model, and particularly high residuals indicate bad data points. Known photoelectron scales and measurement channel backgrounds make it possible to estimate the precision of measurements at different signal levels and the effects of compensated spectral overlap on measurement quality. Combining this information with measurements of standard samples carrying dyes of biological interest, we can make accurate comparisons of dye sensitivity among different instruments. Our method is freely available through the R/Bioconductor package flowQB. PMID:28160404
Poutiainen, Pekka; Jaronen, Merja; Quintana, Francisco J.; Brownell, Anna-Liisa
2016-01-01
Non-invasive molecular imaging techniques can enhance diagnosis to achieve successful treatment, as well as reveal underlying pathogenic mechanisms in disorders such as multiple sclerosis (MS). The cooperation of advanced multimodal imaging techniques and increased knowledge of the MS disease mechanism allows both monitoring of the neuronal network and therapeutic outcome, as well as the tools to discover novel therapeutic targets. Diverse imaging modalities provide reliable diagnostic and prognostic platforms to better achieve precision medicine. Traditionally, magnetic resonance imaging (MRI) has been considered the gold standard in MS research and diagnosis. However, positron emission tomography (PET) imaging can provide functional information on molecular biology in detail even prior to anatomic changes, allowing close follow-up of disease progression and treatment response. The recent findings support three major neuroinflammation components in MS: astrogliosis, cytokine elevation, and significant changes in specific proteins, which offer a great variety of specific targets for imaging purposes. Although imaging of astrocyte function is still a young field in need of suitable imaging ligands, recent studies have shown that inflammation and astrocyte activation are related to progression of MS. MS is a complex disease, which requires understanding of disease mechanisms for successful treatment. PET is a precise non-invasive imaging method for biochemical functions and has potential to enhance early and accurate diagnosis for precision therapy of MS. In this review we focus on modulation of different receptor systems and the inflammatory aspect of MS, especially on activation of glial cells, and summarize the recent findings of PET imaging in MS and present the most potent targets for new biomarkers, with the main focus on experimental MS research. PMID:27695400
Astrophysical masers - Inverse methods, precision, resolution and uniqueness
NASA Astrophysics Data System (ADS)
Lerche, I.
1986-07-01
The paper provides exact analytic solutions to the two-level, steady-state maser problem in parametric form, with the emergent intensities expressed in terms of the incident intensities and with the maser length also given in terms of an integral over the intensities. It is shown that some assumption must be made on the emergent intensity on the nonobservable side of the astrophysical maser in order to obtain any inversion of the equations. The incident intensities can then be expressed in terms of the emergent, observable flux. It is also shown that the inversion is nonunique unless a homogeneous linear integral equation has only a null solution. Constraints imposed by knowledge of the physical length of the maser are felt in a nonlinear manner by the parametric variable and do not appear to provide any substantive additional information to reduce the degree of nonuniqueness of the inverse solutions. It is concluded that the questions of precision, resolution and uniqueness for solutions to astrophysical maser problems will remain more of an emotional art than a logical science for some time to come.
Using sensors to measure activity in people with stroke.
Fulk, George D; Sazonov, Edward
2011-01-01
The purpose of this study was to determine the ability of a novel shoe-based sensor that uses accelerometers, pressure sensors, and pattern recognition with a support vector machine (SVM) to accurately identify sitting, standing, and walking postures in people with stroke. Subjects with stroke wore the shoe-based sensor while randomly assuming 3 main postures: sitting, standing, and walking. A SVM classifier was used to train and validate the data to develop individual and group models, which were tested for accuracy, recall, and precision. Eight subjects participated. Both individual and group models were able to accurately identify the different postures (99.1% to 100% individual models and 76.9% to 100% group models). Recall and precision were also high for both individual (0.99 to 1.00) and group (0.82 to 0.99) models. The unique combination of accelerometer and pressure sensors built into the shoe was able to accurately identify postures. This shoe sensor could be used to provide accurate information on community performance of activities in people with stroke as well as provide behavioral enhancing feedback as part of a telerehabilitation intervention.
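The recall and precision figures reported for the posture models can be computed per class from labeled predictions. A minimal sketch, using hypothetical posture labels rather than the study's data:

```python
def precision_recall(y_true, y_pred, positive):
    """Per-class precision and recall for one target posture label."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == positive and p == positive)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t != positive and p == positive)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == positive and p != positive)
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    return precision, recall

# Hypothetical ground-truth and classifier output for six time windows
y_true = ["sit", "sit", "stand", "walk", "walk", "walk"]
y_pred = ["sit", "stand", "stand", "walk", "walk", "sit"]
p, r = precision_recall(y_true, y_pred, "walk")  # p = 1.0, r = 2/3
```

For a full evaluation one would compute these per posture class and average, which is essentially what the individual and group model figures in the abstract summarize.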
LASSO, two-way, and GPS time comparisons: A (very) preliminary status report
NASA Technical Reports Server (NTRS)
Veillet, Christian J. L.; Feraudy, D.; Torre, J. M.; Mangin, J. F.; Grudler, P.; Baumont, Francoise S.; Gaignebet, Jean C.; Hatat, J. L.; Hanson, Wayne; Clements, A.
1990-01-01
The first results are presented on the time transfer experiments between TUG (Graz, Austria) and OCA (Grasse, France) using common-view Global Positioning System (GPS) and two-way stations at both sites. The present data, providing rms clock offsets of 2 to 3 nanoseconds over a three-month period, have to be further analyzed before any conclusions on the respective precision and accuracy of these techniques can be drawn. Two years after its start, the Laser Synchronization from Stationary Orbit (LASSO) experiment is finally giving its first results at TUG and OCA. The first analysis of three common sessions permitted researchers to conclude that the LASSO package on board Meteosat P2 is working satisfactorily, and that time transfer using this method should provide clock offsets at better than 1 nanosecond precision, and clock rates at better than 10^-12 s/s in a 5- to 10-minute session. A new method for extracting this information from the raw data sent by LASSO should enhance the performance of this experiment, exploiting the stability of the on-board oscillator.
Masino, Aaron J.; Casper, T. Charles; Dean, Jonathan M.; Bell, Jamie; Enriquez, Rene; Deakyne, Sara; Chamberlain, James M.; Alpern, Elizabeth R.
2016-01-01
Background: Important information to support healthcare quality improvement is often recorded in free text documents such as radiology reports. Natural language processing (NLP) methods may help extract this information, but these methods have rarely been applied outside the research laboratories where they were developed. Objective: To implement and validate NLP tools to identify long bone fractures for pediatric emergency medicine quality improvement. Methods: Using freely available statistical software packages, we implemented NLP methods to identify long bone fractures from radiology reports. A sample of 1,000 radiology reports was used to construct three candidate classification models. A test set of 500 reports was used to validate the model performance. Blinded manual review of radiology reports by two independent physicians provided the reference standard. Each radiology report was segmented, and word stem and bigram features were constructed. Common English "stop words" and rare features were excluded. We used 10-fold cross-validation to select optimal configuration parameters for each model. Accuracy, recall, precision and the F1 score were calculated. The final model was compared to the use of diagnosis codes for the identification of patients with long bone fractures. Results: There were 329 unique word stems and 344 bigrams in the training documents. A support vector machine classifier with Gaussian kernel performed best on the test set with accuracy=0.958, recall=0.969, precision=0.940, and F1 score=0.954. Optimal parameters for this model were cost=4 and gamma=0.005. The three classification models that we tested all performed better than diagnosis codes in terms of accuracy, precision, and F1 score (diagnosis code accuracy=0.932, recall=0.960, precision=0.896, and F1 score=0.927). Conclusions: NLP methods using a corpus of 1,000 training documents accurately identified acute long bone fractures from radiology reports. 
Strategic use of straightforward NLP methods, implemented with freely available software, offers quality improvement teams new opportunities to extract information from narrative documents. PMID:27826610
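The reported F1 scores are simply the harmonic mean of the reported precision and recall, which makes the abstract's numbers easy to check:

```python
def f1(precision, recall):
    """F1 score: harmonic mean of precision and recall."""
    return 2 * precision * recall / (precision + recall)

svm_f1 = f1(0.940, 0.969)  # ≈ 0.954, matching the reported SVM result
icd_f1 = f1(0.896, 0.960)  # ≈ 0.927, matching the diagnosis-code result
```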
Motion and gravity effects in the precision of quantum clocks.
Lindkvist, Joel; Sabín, Carlos; Johansson, Göran; Fuentes, Ivette
2015-05-19
We show that motion and gravity affect the precision of quantum clocks. We consider a localised quantum field as a fundamental model of a quantum clock moving in spacetime and show that its state is modified due to changes in acceleration. By computing the quantum Fisher information we determine how relativistic motion modifies the ultimate bound in the precision of the measurement of time. While in the absence of motion the squeezed vacuum is the ideal state for time estimation, we find that it is highly sensitive to the motion-induced degradation of the quantum Fisher information. We show that coherent states are generally more resilient to this degradation and that in the case of very low initial number of photons, the optimal precision can be even increased by motion. These results can be tested with current technology by using superconducting resonators with tunable boundary conditions.
Systems biology for nursing in the era of big data and precision health.
Founds, Sandra
2017-12-02
The systems biology framework was previously synthesized with the person-environment-health-nursing metaparadigm. The purpose of this paper is to present a nursing discipline-specific perspective of the association of systems biology with big data and precision health. The fields of systems biology, big data, and precision health are now overviewed, from origins through expansions, with examples of what is being done by nurses in each area of science. Technological advances continue to expand omics and other varieties of big data that inform the person's phenotype and health outcomes for precision care. Meanwhile, millions of participants in the United States are being recruited for health-care research initiatives aimed at building the information commons of digital health data. Implications and opportunities abound via conceptualizing the integration of these fields through the nursing metaparadigm. Copyright © 2017 Elsevier Inc. All rights reserved.
Motion and gravity effects in the precision of quantum clocks
Lindkvist, Joel; Sabín, Carlos; Johansson, Göran; Fuentes, Ivette
2015-01-01
We show that motion and gravity affect the precision of quantum clocks. We consider a localised quantum field as a fundamental model of a quantum clock moving in spacetime and show that its state is modified due to changes in acceleration. By computing the quantum Fisher information we determine how relativistic motion modifies the ultimate bound in the precision of the measurement of time. While in the absence of motion the squeezed vacuum is the ideal state for time estimation, we find that it is highly sensitive to the motion-induced degradation of the quantum Fisher information. We show that coherent states are generally more resilient to this degradation and that in the case of very low initial number of photons, the optimal precision can be even increased by motion. These results can be tested with current technology by using superconducting resonators with tunable boundary conditions. PMID:25988238
Mehrabi, Saeed; Krishnan, Anand; Roch, Alexandra M; Schmidt, Heidi; Li, DingCheng; Kesterson, Joe; Beesley, Chris; Dexter, Paul; Schmidt, Max; Palakal, Mathew; Liu, Hongfang
2015-01-01
In this study we developed a rule-based natural language processing (NLP) system to identify patients with a family history of pancreatic cancer. The algorithm was developed in an Unstructured Information Management Architecture (UIMA) framework and consisted of section segmentation, relation discovery, and negation detection. The system was evaluated on data from two institutions. The family history identification precision was consistent across the institutions, shifting from 88.9% on the Indiana University (IU) dataset to 87.8% on the Mayo Clinic dataset. Customizing the algorithm on the Mayo Clinic data increased its precision to 88.1%. The family member relation discovery achieved precision, recall, and F-measure of 75.3%, 91.6% and 82.6%, respectively. Negation detection resulted in a precision of 99.1%. The results show that rule-based NLP approaches for specific information extraction tasks are portable across institutions; however, customization of the algorithm on the new dataset improves its performance.
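A rule-based negation step of the kind this abstract describes can be illustrated with a crude NegEx-style window check. The trigger list and window size below are illustrative assumptions, not the paper's actual rules:

```python
import re

# Hypothetical negation triggers; real systems use much larger curated lists
NEG_TRIGGERS = ["no", "denies", "negative for", "without", "not"]

def is_negated(sentence, concept):
    """Return True if a negation trigger precedes the concept within a
    short character window. A toy sketch only; the paper's UIMA pipeline
    is far more elaborate."""
    s = sentence.lower()
    idx = s.find(concept.lower())
    if idx < 0:
        return False
    window = s[max(0, idx - 40):idx]
    return any(re.search(r"\b" + re.escape(t) + r"\b", window) for t in NEG_TRIGGERS)

is_negated("Patient denies family history of pancreatic cancer.", "pancreatic cancer")  # True
is_negated("Mother had pancreatic cancer.", "pancreatic cancer")  # False
```

The very high negation precision reported (99.1%) is typical for this family of methods: negation cues in clinical text are few and highly conventionalized.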
Du, Han; Zhang, Xingwang; Chen, Guoqiang; Deng, Jie; Chau, Fook Siong; Zhou, Guangya
2016-01-01
Photonic molecules have a range of promising applications including quantum information processing, where precise control of coupling strength is critical. Here, by laterally shifting the center-to-center offset of coupled photonic crystal nanobeam cavities, we demonstrate a method to precisely and dynamically control the coupling strength of photonic molecules through integrated nanoelectromechanical systems with a precision of a few GHz over a range of several THz without modifying the nature of their constituent resonators. Furthermore, the coupling strength can be tuned continuously from negative (strong coupling regime) to zero (weak coupling regime) and further to positive (strong coupling regime) and vice versa. Our work opens a door to the optimization of the coupling strength of photonic molecules in situ for the study of cavity quantum electrodynamics and the development of efficient quantum information devices. PMID:27097883
Analysis of Distribution of Vector-Borne Diseases Using Geographic Information Systems.
Nihei, Naoko
2017-01-01
The distribution of vector-borne diseases is changing on a global scale owing to issues involving natural environments, socioeconomic conditions, and border disputes, among others. Geographic information systems (GIS) provide an important method of establishing a prompt and precise understanding of local data on disease outbreaks, from which disease eradication programs can be established. Having first defined GIS as a combination of GPS, remote sensing (RS), and GIS proper, we showed the processes through which these technologies were being introduced into our research. GIS-derived geographical information attributes were interpreted in terms of point, area, line, spatial epidemiology, risk, and development for generating the vector dynamic models associated with the spread of the disease. The need for interdisciplinary scientific and administrative collaboration in the use of GIS to control infectious diseases is highly warranted.
Optimal firing rate estimation
NASA Technical Reports Server (NTRS)
Paulin, M. G.; Hoffman, L. F.
2001-01-01
We define a measure for evaluating the quality of a predictive model of the behavior of a spiking neuron. This measure, information gain per spike (Is), indicates how much more information is provided by the model than if the prediction were made by specifying the neuron's average firing rate over the same time period. We apply a maximum Is criterion to optimize the performance of Gaussian smoothing filters for estimating neural firing rates. With data from bullfrog vestibular semicircular canal neurons and data from simulated integrate-and-fire neurons, the optimal bandwidth for firing rate estimation is typically similar to the average firing rate. Precise timing and average rate models are limiting cases that perform poorly. We estimate that bullfrog semicircular canal sensory neurons transmit in the order of 1 bit of stimulus-related information per spike.
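The Gaussian-smoothing rate estimate described above amounts to convolving the spike train with a Gaussian kernel whose bandwidth is chosen near the inverse of the mean firing rate. A self-contained sketch under illustrative parameters (the spike times and sigma below are invented for the example, not taken from the study):

```python
import numpy as np

def firing_rate_estimate(spike_times, t_grid, sigma):
    """Estimate the instantaneous firing rate (spikes/s) by Gaussian smoothing.

    Each spike contributes a Gaussian bump of bandwidth sigma (seconds);
    each bump integrates to one spike, so the estimate has units of Hz.
    """
    rate = np.zeros_like(t_grid)
    for s in spike_times:
        rate += np.exp(-0.5 * ((t_grid - s) / sigma) ** 2)
    return rate / (sigma * np.sqrt(2.0 * np.pi))

# A regular 10 Hz spike train; the optimal-bandwidth result quoted above
# suggests sigma comparable to the inverse mean rate, i.e. ~0.1 s here.
spikes = np.arange(0.05, 1.0, 0.1)
t = np.linspace(0.0, 1.0, 1001)
rate = firing_rate_estimate(spikes, t, sigma=0.1)
# Mid-interval, away from edge effects, the estimate recovers the 10 Hz rate.
```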
Photon Statistics of Propagating Thermal Microwaves.
Goetz, J; Pogorzalek, S; Deppe, F; Fedorov, K G; Eder, P; Fischer, M; Wulschner, F; Xie, E; Marx, A; Gross, R
2017-03-10
In experiments with superconducting quantum circuits, characterizing the photon statistics of propagating microwave fields is a fundamental task. We quantify the n^{2}+n photon number variance of thermal microwave photons emitted from a blackbody radiator for mean photon numbers, 0.05≲n≲1.5. We probe the fields using either correlation measurements or a transmon qubit coupled to a microwave resonator. Our experiments provide a precise quantitative characterization of weak microwave states and information on the noise emitted by a Josephson parametric amplifier.
Photon Statistics of Propagating Thermal Microwaves
NASA Astrophysics Data System (ADS)
Goetz, J.; Pogorzalek, S.; Deppe, F.; Fedorov, K. G.; Eder, P.; Fischer, M.; Wulschner, F.; Xie, E.; Marx, A.; Gross, R.
2017-03-01
In experiments with superconducting quantum circuits, characterizing the photon statistics of propagating microwave fields is a fundamental task. We quantify the n^2+n photon number variance of thermal microwave photons emitted from a blackbody radiator for mean photon numbers 0.05 ≲ n ≲ 1.5. We probe the fields using either correlation measurements or a transmon qubit coupled to a microwave resonator. Our experiments provide a precise quantitative characterization of weak microwave states and information on the noise emitted by a Josephson parametric amplifier.
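The n^2+n variance quoted in both records is the Bose-Einstein (thermal) result Var(n) = n̄² + n̄ for mean photon number n̄. A quick numerical check over the measured range of mean photon numbers (the truncation cutoff is an assumption of the sketch, chosen large enough that the tail is negligible):

```python
def thermal_pn(nbar, n):
    """Bose-Einstein (thermal) photon-number distribution with mean nbar."""
    return nbar**n / (1.0 + nbar) ** (n + 1)

def photon_number_variance(nbar, n_max=200):
    """Var(n) computed directly from the truncated distribution."""
    mean = sum(n * thermal_pn(nbar, n) for n in range(n_max))
    return sum((n - mean) ** 2 * thermal_pn(nbar, n) for n in range(n_max))

# Over the measured range 0.05 <= nbar <= 1.5, Var(n) equals nbar**2 + nbar.
for nbar in (0.05, 0.5, 1.5):
    assert abs(photon_number_variance(nbar) - (nbar**2 + nbar)) < 1e-9
```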
NASA Technical Reports Server (NTRS)
Bennett, A.
1973-01-01
A guidance algorithm that provides precise rendezvous in the deterministic case while requiring only relative state information is developed. A navigation scheme employing only onboard relative measurements is built around a Kalman filter set in measurement coordinates. The overall guidance and navigation procedure is evaluated in the face of measurement errors by a detailed numerical simulation. Results indicate that onboard guidance and navigation for the terminal phase of rendezvous is possible with reasonable limits on measurement errors.
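The measurement-coordinate Kalman filter mentioned above recursively fuses each noisy relative measurement with the running estimate. This scalar sketch illustrates only the generic update step and does not reproduce the 1973 formulation (the measurement values and noise variance are invented):

```python
def kalman_update(x, P, z, R):
    """One scalar Kalman measurement update.

    x, P : prior estimate and its variance
    z, R : measurement and its noise variance
    """
    K = P / (P + R)            # Kalman gain: weight on the new measurement
    x_new = x + K * (z - x)    # corrected estimate
    P_new = (1.0 - K) * P      # variance shrinks after each update
    return x_new, P_new

# Fusing repeated noisy range measurements of a true 10 km separation.
x, P = 0.0, 100.0              # diffuse prior
for z in (9.8, 10.3, 10.1):
    x, P = kalman_update(x, P, z, R=1.0)
# After three updates the estimate sits near 10 and P has shrunk well below R.
```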
The quantitative control and matching of an optical false color composite imaging system
NASA Astrophysics Data System (ADS)
Zhou, Chengxian; Dai, Zixin; Pan, Xizhe; Li, Yinxi
1993-10-01
Design of an imaging system for optical false color composite (OFCC) capable of high-precision density-exposure time control and color balance is presented. The system provides high quality FCC image data that can be analyzed using a quantitative calculation method. The quality requirements for each part of the image generation system are defined, and the distribution of satellite remote sensing image information is analyzed. The proposed technology makes it possible to present the remote sensing image data more effectively and accurately.
Novel EO/IR sensor technologies
NASA Astrophysics Data System (ADS)
Lewis, Keith
2011-10-01
The requirements for advanced EO/IR sensor technologies are discussed in the context of evolving military operations, with significant emphasis on the development of new sensing technologies to meet the challenges posed by asymmetric threats. The Electro-Magnetic Remote Sensing Defence Technology Centre (EMRS DTC) was established in 2003 to provide a centre of excellence in sensor research and development, supporting new capabilities in key military areas such as precision attack, battlespace manoeuvre and information superiority. In the area of advanced electro-optic technology, the DTC has supported work on discriminative imaging, advanced detectors, laser components/technologies, and novel optical techniques. This paper provides a summary of some of the EO/IR technologies explored by the DTC.
Flexible processing and the design of grammar.
Sag, Ivan A; Wasow, Thomas
2015-02-01
We explore the consequences of letting the incremental and integrative nature of language processing inform the design of competence grammar. What emerges is a view of grammar as a system of local monotonic constraints that provide a direct characterization of the signs (the form-meaning correspondences) of a given language. This "sign-based" conception of grammar has provided precise solutions to the key problems long thought to motivate movement-based analyses, has supported three decades of computational research developing large-scale grammar implementations, and is now beginning to play a role in computational psycholinguistics research that explores the use of underspecification in the incremental computation of partial meanings.
Discovering gene annotations in biomedical text databases
Cakmak, Ali; Ozsoyoglu, Gultekin
2008-01-01
Background: Genes and gene products are frequently annotated with Gene Ontology concepts based on the evidence provided in genomics articles. Manually locating and curating information about a genomic entity from the biomedical literature requires vast amounts of human effort. Hence, there is clearly a need for automated computational tools to annotate genes and gene products with Gene Ontology concepts by computationally capturing the related knowledge embedded in textual data. Results: In this article, we present an automated genomic entity annotation system, GEANN, which extracts information about the characteristics of genes and gene products in article abstracts from PubMed, and translates the discovered knowledge into Gene Ontology (GO) concepts, a widely-used standardized vocabulary of genomic traits. GEANN utilizes textual "extraction patterns" and a semantic matching framework to locate phrases matching a pattern and produce Gene Ontology annotations for genes and gene products. In our experiments, GEANN reached a precision of 78% at a recall of 61%. On a select set of Gene Ontology concepts, GEANN either outperforms or is comparable to two other automated annotation studies. Use of WordNet for semantic pattern matching improves the precision and recall by 24% and 15%, respectively, and the improvement due to semantic pattern matching becomes more apparent as the Gene Ontology terms become more general. Conclusion: GEANN is useful for two distinct purposes: (i) automating the annotation of genomic entities with Gene Ontology concepts, and (ii) providing existing annotations with additional "evidence articles" from the literature. The use of textual extraction patterns constructed from the existing annotations achieves high precision.
The semantic pattern matching framework provides a more flexible pattern matching scheme than "exact matching", with the advantage of locating approximate pattern occurrences with similar semantics. The relatively low recall of our pattern-based approach may be enhanced either by employing a probabilistic annotation framework based on the annotation neighbourhoods in textual data, or by adjusting the statistical enrichment threshold to lower values for applications that place more value on achieving higher recall. PMID:18325104
High Precision Sunphotometer using Wide Dynamic Range (WDR) Camera Tracking
NASA Astrophysics Data System (ADS)
Liss, J.; Dunagan, S. E.; Johnson, R. R.; Chang, C. S.; LeBlanc, S. E.; Shinozuka, Y.; Redemann, J.; Flynn, C. J.; Segal-Rosenhaimer, M.; Pistone, K.; Kacenelenbogen, M. S.; Fahey, L.
2016-12-01
The NASA Ames Sun-photometer-Satellite Group, the DOE PNNL Atmospheric Sciences and Global Change Division, and NASA Goddard's AERONET (AErosol RObotic NETwork) team recently collaborated on the development of a new airborne sunphotometry instrument that provides information on gases and aerosols extending far beyond what can be derived from discrete-channel direct-beam measurements, while preserving or enhancing many of the desirable AATS features (e.g., compactness, versatility, automation, reliability). The enhanced instrument combines the sun-tracking ability of the current 14-channel NASA Ames AATS-14 with the sky-scanning ability of the ground-based AERONET Sun/sky photometers, while extending both AATS-14 and AERONET capabilities by providing full spectral information from the UV (350 nm) to the SWIR (1,700 nm). Strengths of this measurement approach include many more wavelengths (isolated from gas absorption features) that may be used to characterize aerosols, and detailed (oversampled) measurements of the absorption features of specific gas constituents. The Sky Scanning Sun Tracking Airborne Radiometer (3STAR) replicates the radiometer functionality of the AATS-14 instrument but incorporates modern COTS technologies for all instrument subsystems. A 19-channel radiometer bundle design is borrowed from a commercial water column radiance instrument manufactured by Biospherical Instruments of San Diego, California (Morrow and Hooker) and developed using NASA funds under the Small Business Innovative Research (SBIR) program. The 3STAR design also incorporates the latest in robotic motor technology, embodied in rotary actuators from Oriental Motor Corp. with better than 15 arc seconds of positioning accuracy. The control system was designed, tested, and simulated using a hybrid-dynamical modeling methodology.
The design also replaces the classic quadrant detector tracking sensor with a wide dynamic range camera that provides a high precision solar position tracking signal as well as an image of the sky in the 45° field of view around the solar axis, which can be of great assistance in flagging data for cloud effects or other factors that might impact data quality.
Discovering gene annotations in biomedical text databases.
Cakmak, Ali; Ozsoyoglu, Gultekin
2008-03-06
Genes and gene products are frequently annotated with Gene Ontology concepts based on the evidence provided in genomics articles. Manually locating and curating information about a genomic entity from the biomedical literature requires vast amounts of human effort. Hence, there is clearly a need for automated computational tools to annotate genes and gene products with Gene Ontology concepts by computationally capturing the related knowledge embedded in textual data. In this article, we present an automated genomic entity annotation system, GEANN, which extracts information about the characteristics of genes and gene products in article abstracts from PubMed, and translates the discovered knowledge into Gene Ontology (GO) concepts, a widely-used standardized vocabulary of genomic traits. GEANN utilizes textual "extraction patterns" and a semantic matching framework to locate phrases matching a pattern and produce Gene Ontology annotations for genes and gene products. In our experiments, GEANN reached a precision of 78% at a recall of 61%. On a select set of Gene Ontology concepts, GEANN either outperforms or is comparable to two other automated annotation studies. Use of WordNet for semantic pattern matching improves the precision and recall by 24% and 15%, respectively, and the improvement due to semantic pattern matching becomes more apparent as the Gene Ontology terms become more general. GEANN is useful for two distinct purposes: (i) automating the annotation of genomic entities with Gene Ontology concepts, and (ii) providing existing annotations with additional "evidence articles" from the literature. The use of textual extraction patterns constructed from the existing annotations achieves high precision. The semantic pattern matching framework provides a more flexible pattern matching scheme than "exact matching", with the advantage of locating approximate pattern occurrences with similar semantics.
Relatively low recall performance of our pattern-based approach may be enhanced either by employing a probabilistic annotation framework based on the annotation neighbourhoods in textual data, or, alternatively, the statistical enrichment threshold may be adjusted to lower values for applications that put more value on achieving higher recall values.
The use and limits of scientific names in biological informatics.
Remsen, David
2016-01-01
Scientific names serve to label biodiversity information: information related to species. Names, and their underlying taxonomic definitions, however, are unstable and ambiguous. This negatively impacts the utility of names as identifiers and as effective indexing tools in biological informatics where names are commonly utilized for searching, retrieving and integrating information about species. Semiotics provides a general model for describing the relationship between taxon names and taxon concepts. It distinguishes syntactics, which governs relationships among names, from semantics, which represents the relations between those labels and the taxa to which they refer. In the semiotic context, changes in semantics (i.e., taxonomic circumscription) do not consistently result in a corresponding and reflective change in syntax. Further, when syntactic changes do occur, they may be in response to semantic changes or in response to syntactic rules. This lack of consistency in the cardinal relationship between names and taxa places limits on how scientific names may be used in biological informatics in initially anchoring, and in the subsequent retrieval and integration, of relevant biodiversity information. Precision and recall are two measures of relevance. In biological taxonomy, recall is negatively impacted by changes or ambiguity in syntax while precision is negatively impacted when there are changes or ambiguity in semantics. Because changes in syntax are not correlated with changes in semantics, scientific names may be used, singly or conflated into synonymous sets, to improve recall in pattern recognition or search and retrieval. Names cannot be used, however, to improve precision. This is because changes in syntax do not uniquely identify changes in circumscription. These observations place limits on the utility of scientific names within biological informatics applications that rely on names as identifiers for taxa. 
Taxonomic systems and services used to organize and integrate information about taxa must accommodate the inherent semantic ambiguity of scientific names. The capture and articulation of circumscription differences (i.e., multiple taxon concepts) within such systems must be accompanied with distinct concept identifiers that can be employed in association with, or in replacement of, traditional scientific names.
Doe, John E.; Lander, Deborah R.; Doerrer, Nancy G.; Heard, Nina; Hines, Ronald N.; Lowit, Anna B.; Pastoor, Timothy; Phillips, Richard D.; Sargent, Dana; Sherman, James H.; Young Tanir, Jennifer; Embry, Michelle R.
2016-01-01
The HESI-coordinated RISK21 roadmap and matrix are tools that provide a transparent method to compare exposure and toxicity information and assess whether additional refinement is required to obtain the necessary precision level for a decision regarding safety. A case study of the use of a pyrethroid, “pseudomethrin,” in bed netting to control malaria is presented to demonstrate the application of the roadmap and matrix. The evaluation began with a problem formulation step. The first assessment utilized existing information pertaining to the use and the class of chemistry. At each stage of the step-wise approach, the precision of the toxicity and exposure estimates were refined as necessary by obtaining key data which enabled a decision on safety to be made efficiently and with confidence. The evaluation demonstrated the concept of using existing information within the RISK21 matrix to drive the generation of additional data using a value-of-information approach. The use of the matrix highlighted whether exposure or toxicity required further investigation and emphasized the need to address the default uncertainty factor of 100 at the highest tier of the evaluation. It also showed how new methodology such as the use of in vitro studies and assays could be used to answer the specific questions which arise through the use of the matrix. The matrix also serves as a useful means to communicate progress to stakeholders during an assessment of chemical use. PMID:26517449
Visual long-term memory has the same limit on fidelity as visual working memory.
Brady, Timothy F; Konkle, Talia; Gill, Jonathan; Oliva, Aude; Alvarez, George A
2013-06-01
Visual long-term memory can store thousands of objects with surprising visual detail, but just how detailed are these representations, and how can one quantify this fidelity? Using the property of color as a case study, we estimated the precision of visual information in long-term memory, and compared this with the precision of the same information in working memory. Observers were shown real-world objects in random colors and were asked to recall the colors after a delay. We quantified two parameters of performance: the variability of internal representations of color (fidelity) and the probability of forgetting an object's color altogether. Surprisingly, the fidelity of color information in long-term memory was comparable to the asymptotic precision of working memory. These results suggest that long-term memory and working memory may be constrained by a common limit, such as a bound on the fidelity required to retrieve a memory representation.
Distinguishing Provenance Equivalence of Earth Science Data
NASA Technical Reports Server (NTRS)
Tilmes, Curt; Yesha, Ye; Halem, M.
2010-01-01
Reproducibility of scientific research relies on accurate and precise citation of data and the provenance of that data. Earth science data are often the result of applying complex data transformation and analysis workflows to vast quantities of data. Provenance information of data processing is used for a variety of purposes, including understanding the process and auditing as well as reproducibility. Certain provenance information is essential for producing scientifically equivalent data. Capturing and representing that provenance information and assigning identifiers suitable for precisely distinguishing data granules and datasets is needed for accurate comparisons. This paper discusses scientific equivalence and essential provenance for scientific reproducibility. We use the example of an operational earth science data processing system to illustrate the application of the technique of cascading digital signatures or hash chains to precisely identify sets of granules and as provenance equivalence identifiers to distinguish data made in an equivalent manner.
Functional precision cancer medicine-moving beyond pure genomics.
Letai, Anthony
2017-09-08
The essential job of precision medicine is to match the right drugs to the right patients. In cancer, precision medicine has been nearly synonymous with genomics. However, sobering recent studies have generally shown that most patients with cancer who receive genomic testing do not benefit from a genomic precision medicine strategy. Although some call the entire project of precision cancer medicine into question, I suggest instead that the tools employed must be broadened. Instead of relying exclusively on big data measurements of initial conditions, we should also acquire highly actionable functional information by perturbing viable primary tumor cells from patients with cancer, for example with cancer therapies.
Precision medicine at the crossroads.
Olson, Maynard V
2017-10-11
There are bioethical, institutional, economic, legal, and cultural obstacles to creating the robust, precompetitive data resource that will be required to advance the vision of "precision medicine," the ability to use molecular data to target therapies to patients for whom they offer the most benefit at the least risk. Creation of such an "information commons" was the central recommendation of the 2011 report Toward Precision Medicine issued by a committee of the National Research Council of the USA (Committee on a Framework for Development of a New Taxonomy of Disease; National Research Council. Toward precision medicine: building a knowledge network for biomedical research and a new taxonomy of disease. 2011). In this commentary, I review the rationale for creating an information commons and the obstacles to doing so; then, I endorse a path forward based on the dynamic consent of research subjects interacting with researchers through trusted mediators. I assert that the advantages of the proposed system overwhelm alternative ways of handling data on the phenotypes, genotypes, and environmental exposures of individual humans; hence, I argue that its creation should be the central policy objective of early efforts to make precision medicine a reality.
A grid for a precise analysis of daily activities.
Wojtasik, V; Olivier, C; Lekeu, F; Quittre, A; Adam, S; Salmon, E
2010-01-01
Assessment of daily living activities is essential in patients with Alzheimer's disease. Most current tools quantitatively assess overall ability but provide little qualitative information on individual difficulties. Only a few tools allow therapists to evaluate stereotyped activities and record different types of errors. We capitalised on the Kitchen Activity Assessment to design a widely applicable analysis grid that provides both qualitative and quantitative data on activity performance. A cooking activity was videotaped in 15 patients with dementia and assessed according to the different steps in the execution of the task. The evaluations obtained with our grid showed good correlations between raters, between versions of the grid and between sessions. Moreover, the degree of independence obtained with our analysis of the task correlated with the Kitchen Activity Assessment score and with a global score of cognitive functioning. We conclude that assessment of a daily living activity with this analysis grid is reproducible and relatively independent of the therapist, and thus provides quantitative and qualitative information useful for both evaluating and caring for demented patients.
Visual tracking strategies for intelligent vehicle highway systems
NASA Astrophysics Data System (ADS)
Smith, Christopher E.; Papanikolopoulos, Nikolaos P.; Brandt, Scott A.; Richards, Charles
1995-01-01
The complexity and congestion of current transportation systems often produce traffic situations that jeopardize the safety of the people involved. These situations vary from maintaining a safe distance behind a leading vehicle to safely allowing a pedestrian to cross a busy street. Environmental sensing plays a critical role in virtually all of these situations. Of the sensors available, vision sensors provide information that is richer and more complete than other sensors, making them a logical choice for a multisensor transportation system. In this paper we present robust techniques for intelligent vehicle-highway applications where computer vision plays a crucial role. In particular, we demonstrate that the controlled active vision framework can be utilized to provide a visual sensing modality to a traffic advisory system in order to increase the overall safety margin in a variety of common traffic situations. We have selected two application examples, vehicle tracking and pedestrian tracking, to demonstrate that the framework can provide precisely the type of information required to effectively manage the given situation.
Influence of local topography on precision irrigation management
USDA-ARS?s Scientific Manuscript database
Precision irrigation management is currently accomplished using spatial information about soil properties through soil series maps or electrical conductivity (EC) measurements. Crop yield, however, is consistently influenced by local topography, both in rain-fed and irrigated environments. Utilizing ...
Precision aerial application for site-specific rice crop management
USDA-ARS?s Scientific Manuscript database
Precision agriculture includes different technologies that allow agricultural professionals to use information management tools to optimize agricultural production. The new technologies allow aerial applicators to improve application accuracy and efficiency, which saves time and money for...
Biased normalized cuts for target detection in hyperspectral imagery
NASA Astrophysics Data System (ADS)
Zhang, Xuewen; Dorado-Munoz, Leidy P.; Messinger, David W.; Cahill, Nathan D.
2016-05-01
The Biased Normalized Cuts (BNC) algorithm is a useful technique for detecting targets or objects in RGB imagery. In this paper, we propose modifying BNC for the purpose of target detection in hyperspectral imagery. As opposed to other target detection algorithms that typically encode target information prior to dimensionality reduction, our proposed algorithm encodes target information after dimensionality reduction, enabling a user to detect different targets in interactive mode. To assess the proposed BNC algorithm, we utilize hyperspectral imagery (HSI) from the SHARE 2012 data campaign, and we explore the relationship between the number and the position of expert-provided target labels and the precision/recall of the remaining targets in the scene.
On the use of orientation filters for 3D reconstruction in event-driven stereo vision
Camuñas-Mesa, Luis A.; Serrano-Gotarredona, Teresa; Ieng, Sio H.; Benosman, Ryad B.; Linares-Barranco, Bernabe
2014-01-01
The recently developed Dynamic Vision Sensors (DVS) sense visual information asynchronously and code it into trains of events with sub-microsecond temporal resolution. This high temporal precision makes the output of these sensors especially suited for dynamic 3D visual reconstruction, by matching corresponding events generated by two different sensors in a stereo setup. This paper explores the use of Gabor filters to extract information about the orientation of the object edges that produce the events, thereby increasing the number of constraints applied to the matching algorithm. This strategy provides more reliably matched pairs of events, improving the final 3D reconstruction. PMID:24744694
[Myocardial perfusion scintigraphy - short form of the German guideline].
Lindner, O; Burchert, W; Hacker, M; Schaefer, W; Schmidt, M; Schober, O; Schwaiger, M; vom Dahl, J; Zimmermann, R; Schäfers, M
2013-01-01
This guideline is a short summary of the guideline for myocardial perfusion scintigraphy published by the Association of the Scientific Medical Societies in Germany (AWMF). The purpose of this guideline is to provide practical assistance for indication and examination procedures as well as image analysis, and to present the state of the art of myocardial perfusion scintigraphy. After a short introduction on the fundamentals of imaging, precise and detailed information is given on the indications, patient preparation, stress testing, radiopharmaceuticals, examination protocols and techniques, radiation exposure, and data reconstruction, as well as information on visual and quantitative image analysis and interpretation. In addition, possible pitfalls, artefacts and key elements of reporting are described.
Haemophilia A: carrier detection and prenatal diagnosis by linkage analysis using DNA polymorphism.
Tuddenham, E G; Goldman, E; McGraw, A; Kernoff, P B
1987-01-01
Restriction fragment length polymorphisms (RFLPs) within or close to the factor VIII locus are very useful for genetic linkage analysis. Such RFLPs allow a mutant allele to be tracked in a family segregating haemophilia A even when, as is usually the case, the precise mutation causing failure to synthesise factor VIII is unknown. To date two markers tightly linked to the factor VIII locus have been described, one of which is highly polymorphic and therefore informative in most kindreds. A significant crossover rate, however, does not make diagnosis absolute. Three intragenic RFLPs have been defined, which, taken together, are informative in about 70% of women, providing virtually deterministic genetic diagnosis. PMID:2889753
U-Th-Pb, Sm-Nd, Rb-Sr, and Lu-Hf systematics of returned Mars samples
NASA Technical Reports Server (NTRS)
Tatsumoto, M.; Premo, W. R.
1988-01-01
The advantage of studying returned planetary samples cannot be overstated. A wider range of analytical techniques with higher sensitivities and accuracies can be applied to returned samples. Measurement of U-Th-Pb, Sm-Nd, Rb-Sr, and Lu-Hf isotopic systematics for chronology and isotopic tracer studies of planetary specimens cannot be done in situ with desirable precision. Returned Mars samples will be examined using all the physical, chemical, and geologic methods necessary to gain information on the origin and evolution of Mars. A returned Martian sample would provide ample information regarding the accretionary and evolutionary history of the Martian planetary body and possibly other planets of our solar system.
Constraining the mass and radius of neutron star by future observations
NASA Astrophysics Data System (ADS)
Kwak, Kyujin; Lee, Chang-Hwan; Kim, Myungkuk; Kim, Young-Min
2018-04-01
The mass and radius of neutron star (NS) in the low mass X-ray binary (LMXB) can be measured simultaneously from the evolving spectra of the photospheric radius expansion (PRE) X-ray bursts (XRBs). Precise measurements require the distance to the target, information on the radiating surface, and the composition of accreted material. Future observations with large ground-based telescopes such as Giant Magellan Telescope (GMT) and Thirty Meter Telescope (TMT) may reduce the uncertainties in the estimation of the mass and radius of NS because they could provide information on the composition of accreted material by identifying the companion stars in LMXBs. We investigate these possibilities and present our results for selected targets.
Semipermanent GPS (SPGPS) as a volcano monitoring tool: Rationale, method, and applications
Dzurisin, Daniel; Lisowski, Michael; Wicks, Charles W.
2017-01-01
Semipermanent GPS (SPGPS) is an alternative to conventional campaign or survey-mode GPS (SGPS) and to continuous GPS (CGPS) that offers several advantages for monitoring ground deformation. Unlike CGPS installations, SPGPS stations can be deployed quickly in response to changing volcanic conditions or earthquake activity such as a swarm or aftershock sequence. SPGPS networks can be more focused or more extensive than CGPS installations, because SPGPS equipment can be moved from station to station quickly to increase the total number of stations observed in a given time period. SPGPS networks are less intrusive on the landscape than CGPS installations, which makes it easier to satisfy land-use restrictions in ecologically sensitive areas. SPGPS observations are preferred over SGPS measurements because they provide better precision with only a modest increase in the amount of time, equipment, and personnel required in the field. We describe three applications of the SPGPS method that demonstrate its utility and flexibility. At the Yellowstone caldera, Wyoming, a 9-station SPGPS network serves to densify larger preexisting networks of CGPS and SGPS stations. At the Three Sisters volcanic center, Oregon, a 14-station SPGPS network complements an SGPS network and extends the geographic coverage provided by 3 CGPS stations permitted under wilderness land-use restrictions. In the Basin and Range province in northwest Nevada, a 6-station SPGPS network has been established in response to a prolonged earthquake swarm in an area with only sparse preexisting geodetic coverage. At Three Sisters, the estimated uncertainty of station velocities based on annual ~3-month summertime SPGPS occupations from 2009 to 2015 is approximately half that for nearby CGPS stations. Conversely, SPGPS-derived station velocities are about twice as precise as those based on annual ~1-week SGPS measurements.
After 5 years of SPGPS observations at Three Sisters, the precision of velocity determinations is estimated to be 0.5 mm/yr in longitude, 0.6 mm/yr in latitude, and 0.8 mm/yr in height. We conclude that an optimal approach to monitoring volcano deformation includes complementary CGPS and SPGPS networks, periodic InSAR observations, and measurements from in situ borehole sensors such as tiltmeters or strainmeters. This comprehensive approach provides the spatial and temporal detail necessary to adequately characterize a complex and evolving deformation pattern. Such information is essential to multi-parameter models of magmatic or tectonic processes that can help to guide research efforts, and also to inform hazards assessments and land-use planning decisions.
Semipermanent GPS (SPGPS) as a volcano monitoring tool: Rationale, method, and applications
NASA Astrophysics Data System (ADS)
Dzurisin, Daniel; Lisowski, Michael; Wicks, Charles W.
2017-09-01
Semipermanent GPS (SPGPS) is an alternative to conventional campaign or survey-mode GPS (SGPS) and to continuous GPS (CGPS) that offers several advantages for monitoring ground deformation. Unlike CGPS installations, SPGPS stations can be deployed quickly in response to changing volcanic conditions or earthquake activity such as a swarm or aftershock sequence. SPGPS networks can be more focused or more extensive than CGPS installations, because SPGPS equipment can be moved from station to station quickly to increase the total number of stations observed in a given time period. SPGPS networks are less intrusive on the landscape than CGPS installations, which makes it easier to satisfy land-use restrictions in ecologically sensitive areas. SPGPS observations are preferred over SGPS measurements because they provide better precision with only a modest increase in the amount of time, equipment, and personnel required in the field. We describe three applications of the SPGPS method that demonstrate its utility and flexibility. At the Yellowstone caldera, Wyoming, a 9-station SPGPS network serves to densify larger preexisting networks of CGPS and SGPS stations. At the Three Sisters volcanic center, Oregon, a 14-station SPGPS network complements an SGPS network and extends the geographic coverage provided by 3 CGPS stations permitted under wilderness land-use restrictions. In the Basin and Range province in northwest Nevada, a 6-station SPGPS network has been established in response to a prolonged earthquake swarm in an area with only sparse preexisting geodetic coverage. At Three Sisters, the estimated uncertainty of station velocities based on annual 3-month summertime SPGPS occupations from 2009 to 2015 is approximately half that for nearby CGPS stations. Conversely, SPGPS-derived station velocities are about twice as precise as those based on annual 1-week SGPS measurements.
After 5 years of SPGPS observations at Three Sisters, the precision of velocity determinations is estimated to be 0.5 mm/yr in longitude, 0.6 mm/yr in latitude, and 0.8 mm/yr in height. We conclude that an optimal approach to monitoring volcano deformation includes complementary CGPS and SPGPS networks, periodic InSAR observations, and measurements from in situ borehole sensors such as tiltmeters or strainmeters. This comprehensive approach provides the spatial and temporal detail necessary to adequately characterize a complex and evolving deformation pattern. Such information is essential to multi-parameter models of magmatic or tectonic processes that can help to guide research efforts, and also to inform hazards assessments and land-use planning decisions.